PORTFOLIO
Welcome! Thank you for taking the time to look through my portfolio. I am constantly adding new projects, features, and insights to this site as I learn and grow as a developer.
My name is David Olamide. As a programmer, I enjoy working closely with teammates to build robust game systems that enrich the player experience, such as movement systems, AI systems, UI systems, and programmatic effects. I have shipped games built in Unreal Engine and have done extensive work in C++; game systems programming is currently the area that excites me most. I was responsible for the game systems and UI systems in Legends of Orisha and the game systems in the Avalon project. For my undergraduate thesis, I built a physical checkers-playing AI system that can play against a human opponent.
Below, I have listed some of my work. You can find my profile on LinkedIn.
LEGENDS OF ORISHA: BLOOD AND WATER
This is a 3D African action-adventure role-playing game by Dimension 11 Game Studio, currently in development. I handled all the programming for the player, the vehicle (a horse), and the AI systems, as well as the animations in the game. It will be launching on PC, Xbox, and PlayStation.
The game features a third-person main character who explores the game world through level progression and unravels the story as the game progresses. The player wields primitive weapons that historically depict the precolonial era of South-Western Nigeria, and can pick up items along the way that offer more versatility and more interesting gameplay. The gameplay engineers worked to bring the ideas conceptualized for the game to life and to create a balanced game world that includes exploration and puzzle solving. The game is estimated to offer 8+ hours of gameplay at release.
Legends of Orisha, while still in development, won an award in the “One to Watch” category of the Games Industry Africa Awards (GIAA) 2022.
Contributions
Lead Responsibilities
Breaking down deliverables for the various gameplay system implementations in the game
Design and implementation of the player’s combat system
Design and implementation of some AI combat systems
Design and implementation of the game’s UI systems using Unreal Motion Graphics (UMG)
Rapid prototyping during the early phase of development
Cross-discipline communication
Technology design documentation
Code review, debugging, and profiling
The Player’s Combat
Igumbor Ewere and I were responsible for the player’s combat, which includes melee attacks (fist attacks and cutlass attacks). I focused on the non-cinematic aspects of the player’s combat (attacks, damage and hit reactions, death, and effects). We worked closely with the gameplay animators to get the appropriate attack animations and their equivalent reactions. I built the design spreadsheets that detailed the different weapons, attacks, damage values, and reactions the player would have. I also added input events to toggle between the player’s different combat states, which served as a placeholder until the functionality was tied into the game’s UI.
How is the player's attack handled?
I implemented the player’s combat functionality as a component attached to the player character class; it communicates directly with the player class. When the player presses the attack input, the character calls a function on the actor component, which processes the request and determines whether an attack can be made. I implemented an array that holds the player’s attacks, an incrementing function that plays the next attack on a successful combo, and a boolean control variable that prevents attack spamming. On the attack montage, I implemented a hit-trace anim notify class that tracks which AI was hit during the attack so that the appropriate hit reaction can be played on that AI. When an attack lands, the attack animation is sent as a message to the enemy that was hit, and the enemy uses this information to apply the appropriate damage to itself.
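Since the actual code is under NDA, the C++ sketch below only illustrates the shape of such a component; the class name UPlayerCombatComponent, the function names, and the montage handling are hypothetical, not the shipped implementation.

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Animation/AnimMontage.h"
#include "GameFramework/Character.h"
#include "PlayerCombatComponent.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UPlayerCombatComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Ordered list of attack montages; the combo index walks through it.
    UPROPERTY(EditDefaultsOnly, Category = "Combat")
    TArray<UAnimMontage*> AttackMontages;

    // Called from the character's attack input.
    void TryAttack()
    {
        if (bAttackInProgress || AttackMontages.Num() == 0)
        {
            return; // Blocks attack spamming while a montage is still playing.
        }

        ACharacter* OwnerCharacter = Cast<ACharacter>(GetOwner());
        if (!OwnerCharacter)
        {
            return;
        }

        bAttackInProgress = true;
        OwnerCharacter->PlayAnimMontage(AttackMontages[ComboIndex]);

        // Advance to the next attack in the combo chain.
        ComboIndex = (ComboIndex + 1) % AttackMontages.Num();
    }

    // Called when the montage ends or the combo window closes.
    void OnAttackFinished(bool bResetCombo)
    {
        bAttackInProgress = false;
        if (bResetCombo)
        {
            ComboIndex = 0;
        }
    }

private:
    int32 ComboIndex = 0;
    bool bAttackInProgress = false;
};

Keeping the combo index and the anti-spam flag inside the component keeps the character class thin: the input binding only forwards the button press to TryAttack.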
How is the player's hit reaction handled?
To handle hit reactions on the player from damage received externally (from the world or AI), I created a map data structure that maps each incoming attack to the hit reaction montage the player should play. This approach was performant and uncomplicated. The functionality deliberately allows spamming, because the desired behavior is for the player to respond immediately to new hits even while still reacting to earlier ones.
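A minimal sketch of that lookup, continuing the hypothetical UPlayerCombatComponent above (HitReactionMap and PlayHitReaction are illustrative names, not the shipped code):

// Assumes the component also declares:
//   UPROPERTY(EditDefaultsOnly, Category = "Combat")
//   TMap<FName, UAnimMontage*> HitReactionMap; // incoming attack id -> reaction montage
void UPlayerCombatComponent::PlayHitReaction(FName IncomingAttackId)
{
    ACharacter* OwnerCharacter = Cast<ACharacter>(GetOwner());
    UAnimMontage** Reaction = HitReactionMap.Find(IncomingAttackId);
    if (OwnerCharacter && Reaction && *Reaction)
    {
        // No "attack in progress" guard here: a new hit interrupts the current
        // reaction so the player always responds to the latest hit.
        OwnerCharacter->PlayAnimMontage(*Reaction);
    }
}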
The AI’s Combat
Igumbor Ewere and I were also responsible for the AI’s combat, which varied across the different enemy archetypes in the game. For the minion AIs, I implemented a positioning system that utilized a ticketing aggression system my teammate implemented. I also implemented the tracking system, which helps the player select an AI target and actively engage with that target in real time. I implemented a state tree that handles the AI behaviors.
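As a rough illustration of the tracking idea (the shipped system is under NDA), a target-selection pass could score candidates by how closely the camera is facing them and how near they are; SelectTarget and its weights below are hypothetical.

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

// Illustrative target selection: prefer enemies the camera is facing,
// with a small penalty for distance. Names and weights are hypothetical.
AActor* SelectTarget(const TArray<AActor*>& CandidateEnemies,
                     const FVector& PlayerLocation,
                     const FVector& CameraForward)
{
    AActor* Best = nullptr;
    float BestScore = -1.0e9f;

    for (AActor* Enemy : CandidateEnemies)
    {
        if (!Enemy) { continue; }
        const FVector ToEnemy = Enemy->GetActorLocation() - PlayerLocation;
        const float Facing = FVector::DotProduct(CameraForward, ToEnemy.GetSafeNormal());
        const float Score = Facing - ToEnemy.Size() * 0.001f; // distance penalty is illustrative
        if (Score > BestScore)
        {
            BestScore = Score;
            Best = Enemy;
        }
    }
    return Best;
}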
How is the minion AI’s positioning handled?
I implemented the minion AI’s positioning system on top of the ticketing aggression system, meaning only enemies in an aggressive state could be assigned positions around the player. Using the Unreal Engine spline tool, I procedurally generated positions around the player based on the following conditions: the AI’s distance to the player, its angle from the camera, its stun/hit state, and its recent aggression status. The system updates only when the aggression system updates, since it runs a series of mathematical computations to arrive at the most optimal positions for the AIs.
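The exact spline sampling and scoring math are under NDA; the sketch below only illustrates the kind of slot scoring such a system can use, with hypothetical names (FMinionInfo, ScoreSlot) and illustrative weights.

#include "CoreMinimal.h"

// Hypothetical per-minion data used to score a candidate slot around the player.
struct FMinionInfo
{
    FVector Location = FVector::ZeroVector;
    bool bStunnedOrHit = false;
    bool bRecentlyAggressive = false;
};

// Scores one candidate position (e.g. a point sampled from a spline ring
// around the player). Higher is better. Weights are illustrative only.
float ScoreSlot(const FVector& Candidate,
                const FMinionInfo& Minion,
                const FVector& PlayerLocation,
                const FVector& CameraForward)
{
    const float TravelCost = FVector::Dist(Minion.Location, Candidate);

    // Prefer slots in front of the camera so engaged enemies stay on screen.
    const FVector ToCandidate = (Candidate - PlayerLocation).GetSafeNormal();
    const float CameraFacing = FVector::DotProduct(CameraForward, ToCandidate);

    float Score = CameraFacing * 300.0f - TravelCost;
    if (Minion.bStunnedOrHit)       { Score -= 500.0f; } // reacting AIs keep low slot priority
    if (Minion.bRecentlyAggressive) { Score += 200.0f; } // aggressive AIs claim slots first
    return Score;
}

Because the scoring runs only when the aggression system updates, this cost stays off the per-frame tick.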
How is the boss AI’s positioning handled?
The boss AI uses a simple distance check on the player to decide whether to attack or to move closer to or farther from the player. The boss also has a state that is set to idle when multiple minion AIs are attacking the player; this ensures the player does not get overwhelmed in combat.
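A minimal sketch of that decision rule, with hypothetical names (EBossDecision, DecideBossAction) and illustrative thresholds:

#include "CoreMinimal.h"

enum class EBossDecision { Attack, MoveCloser, MoveAway, Idle };

EBossDecision DecideBossAction(const FVector& BossLocation,
                               const FVector& PlayerLocation,
                               int32 NumMinionsAttacking,
                               float AttackRange = 300.0f,
                               float RetreatRange = 150.0f)
{
    // Stay idle while several minions are already pressuring the player,
    // so the player is not overwhelmed.
    if (NumMinionsAttacking >= 2)
    {
        return EBossDecision::Idle;
    }

    const float Distance = FVector::Dist(BossLocation, PlayerLocation);
    if (Distance < RetreatRange) { return EBossDecision::MoveAway; }
    if (Distance <= AttackRange) { return EBossDecision::Attack; }
    return EBossDecision::MoveCloser;
}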
How are the minion AI’s behaviors handled?
To automate the behavior of the minion AIs, I implemented a state tree in Unreal Engine. The state tree was the preferred choice because of its advantage over the behavior tree: the entire tree is not re-evaluated when a state changes. I implemented all the states, transitions, and preconditions for the enemy AIs. The states include idle, movement, attack, hit reaction, stun, and death, and each state forms its own hierarchy with child states inside it; death, for example, contains both a normal death and a cinematic death.
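As a conceptual illustration only (the actual state tree is authored as an Unreal Engine asset and is under NDA), the hierarchy can be pictured like this, with hypothetical names:

// Flattened view of the hypothetical minion state hierarchy; in the real
// project these are states in an Unreal Engine state tree asset.
enum class EMinionState
{
    Idle,
    Movement,
    Attack,
    HitReaction,
    Stun,
    Death_Normal,    // child state of Death
    Death_Cinematic  // child state of Death
};

// Only the conditions of the state being entered are evaluated on a
// transition, which is the advantage over re-evaluating a whole behavior tree.
EMinionState SelectDeathState(bool bCinematicKillTriggered)
{
    return bCinematicKillTriggered ? EMinionState::Death_Cinematic
                                   : EMinionState::Death_Normal;
}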
NB: Code and Blueprint graphs are under NDA and cannot be shown here.
POST MORTEM
What went well?
Adopting newer Unreal Engine tools and features, like the state tree and gameplay tags, helped me avoid pitfalls I would otherwise have run into with the engine.
Adding test code and debug menus to features made bugs easy to fix throughout development.
Demos and gameplay sync meetings were helpful, especially for status updates, and they made it easy to transfer responsibilities across the gameplay team.
The diverse skill sets across the team came in handy, as problems that weren’t directly related to code could still be fixed easily by the engineering team.
What went wrong?
During alpha, handling stress was challenging; I doubted myself a lot during this phase of development.
Implementing new features required a sync meeting within the engineering team, and scheduling one was a hassle when calendars were crammed with meetings.
Feature creep set in, alongside frequent ongoing changes to gameplay. During this time, playtesting the game was given less priority than it deserved.
What did I learn?
This project has taught me team management. I had to lead and manage a team with diverse skill sets and ensure that everyone was on the same page. This was no easy feat.
Saying no in the right places is very important, as is following the pipeline, especially when confused.
Generating weekly builds, playtesting them, and submitting bug reports was one quality-of-life process we adopted, and it came with a lot of benefits: we were able to detect bugs early.