
Original Concept / Solo Project

PROXIMA


Software used:

Unreal Engine, Blender, zBrush, Substance Designer, Substance Painter, Ableton Live, Photoshop

Project last updated 2025

Proxima is a single-player, first-person shooter that blends combat, exploration, and puzzle-solving. Set aboard a retro-futuristic research station orbiting Jupiter’s moon Europa, you play as a scientist reassigned to study a mysterious alien lifeform in a classified facility. When containment fails and the station descends into chaos, you’re left as the sole survivor, forced to fight, hide, and think your way through escape.

The station is designed for immersion: filled with interactable objects, detailed physics where appropriate, and densely layered environmental storytelling that rewards curiosity with a rich, unfolding backstory.

This game is a work in progress. The first version of the game, titled "Subject 11," won Gnomon's Fall 2022 Best of Term for Gameplay.


First-person animations created by Kyle Martin. All other aspects created by me, including:
 

  • Gameplay design & systems

  • Character models & design

  • Rigging & animation

  • Environment art & design

  • Prop art & design

  • UI art, design & functionality

  • SFX & music


GAMEPLAY SYSTEMS & SCRIPTING

To make Proxima playable, I had to design and implement a wide range of gameplay systems, many of them complex, all of them interconnected. For a highly interactive, single-player first-person shooter focused on exploration and puzzle-solving, these systems were essential:

  • A responsive, immersive first-person controller with a custom player model and animations

  • A flexible, expandable inventory system with item inspection, supporting logs, documents, and keepsakes that deepen the game’s story

  • A simple, data-driven NPC and dialogue system

  • A cutscene framework that lets scenes play out around the player without taking away control, inspired by games like Half-Life and Elder Scrolls

  • Basic but varied enemy AI, with custom models, animations, and behavior logic

  • A hidden quest/state manager to track player progress, story triggers, goals, and requirements

  • A wide variety of interactable world objects (vending machines, fully usable terminals, email systems, healing stations, even a functioning rail transit system) designed to make the station feel alive and reactive

  • Physics-based interactions, allowing the player to pick up, carry, throw, and use objects to solve puzzles and affect the environment

It’s a long list, and there’s still more work ahead. But building these systems, and ensuring they all work together intuitively, has been key to shaping a game that feels immersive, reactive, and memorable.

Below, you can find breakdowns of some of the different systems, along with my thought process while designing them.

INTERACTION

One of the most essential gameplay features in the game is being able to look at any interactable object, press the interact button, and have something happen, whether that's opening a door, using a computer, or just turning a light on and off. I needed a system that could send an interact input from the player to any relevant object, play the correct first-person interaction animation if needed, and produce a per-object result in the game.

This ended up being a pretty standard line trace from the player view forward to catch any actors in the level that implemented an interaction Blueprint Interface I created. To simplify this, I made a couple master "Interactable" actor classes that most interactables in the level could be a child of.
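The trace-then-interface pattern can be sketched in plain C++ outside the engine. This is a simplified stand-in, not the project's Blueprint code: in-engine, the trace would be `UWorld::LineTraceSingleByChannel` and the interface a Blueprint Interface, and all type and function names here are illustrative.

```cpp
#include <optional>
#include <string>
#include <vector>

// Every interactable implements this interface, mirroring the
// interaction Blueprint Interface described above.
struct IInteractable {
    virtual ~IInteractable() = default;
    virtual std::string Interact() = 0;   // per-object result
};

// Example interactable: a door that toggles open/closed.
struct Door : IInteractable {
    bool open = false;
    std::string Interact() override {
        open = !open;
        return open ? "door_opened" : "door_closed";
    }
};

// Stand-in for the line trace from the player view: returns the first
// interactable hit, but only if it lies within interaction range.
IInteractable* TraceForInteractable(const std::vector<IInteractable*>& hits,
                                    float maxRange, float hitDistance) {
    if (hitDistance <= maxRange && !hits.empty())
        return hits.front();
    return nullptr;
}

// Called when the player presses the interact button.
std::optional<std::string> TryInteract(IInteractable* target) {
    if (!target) return std::nullopt;
    return target->Interact();
}
```

The master "Interactable" classes described next play the role of `Door` here: shared base behavior, with each child overriding the per-object result.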

These master actors implemented the correct interaction BPI and included a range of base functions and variables for different interactable types: objects you can inspect and add to your inventory, doors you can open with corresponding lock/unlock functionality, furniture, computers, etc. A number of these master classes, as well as some unique ones, include extra shared functionality with the player character to easily play custom first-person animations along with the interaction for further immersion: the player visibly sitting down in furniture, picking up objects and drinking from them, placing their hand on a button, etc.

With all of this functionality based around the BPI and master actor classes, I was able to quickly build out a large number of interactions throughout the world that felt natural and often added a lot of random fun.

PHYSICS

With pre-scripted interactions in place, I moved on to implementing physics-based interactions. Any object that should realistically have physics had it enabled, and was assigned appropriate weight and properties. I also introduced a new mechanic: holding the interact button briefly allows the player to pick up, carry, place, or throw physics-enabled objects. This system not only added tactile immersion, but also opened the door to creative puzzle design.

Most of the physics interaction functionality was built into the player controller using Unreal’s built-in Physics Handle component. Like the main interaction system, a line trace checks for interactable objects, this time also looking for a “Physics” tag. If an object has the tag, the player can hold down the interact button to begin a timed pickup action. Once the progress bar fills, and if the player is still targeting the same object, the item is attached to the physics handle at a customizable distance in front of the camera.

While held, the object follows the player with motion damping for a more natural, weighty feel, and uses soft constraints to prevent clipping through geometry. Releasing the input drops the object; pressing the “throw” input (same as the attack button) launches it forward with force.
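The hold-to-pick-up flow above reduces to two small pieces of logic: a timer that only completes while the same object stays targeted, and a damped follow toward a goal point in front of the camera. Here is a minimal sketch under those assumptions (names and the 0.5s hold duration are hypothetical; in-engine the held object is driven by a `UPhysicsHandleComponent`):

```cpp
#include <cmath>

// Timed pickup: the progress bar must fill while the SAME object
// remains under the crosshair, otherwise the pickup is cancelled.
struct PickupTimer {
    float holdTime = 0.f;
    const float requiredHold = 0.5f;   // assumed hold duration, seconds
    const void* target = nullptr;      // object targeted when the hold began

    void Begin(const void* object) { target = object; holdTime = 0.f; }

    // Returns true once the hold completes on the original target.
    bool Tick(float dt, const void* currentTarget) {
        if (currentTarget != target) { target = nullptr; return false; }
        holdTime += dt;
        return holdTime >= requiredHold;
    }

    float Progress() const { return std::fmin(holdTime / requiredHold, 1.f); }
};

// While held, the object eases toward its goal position each frame;
// the damping factor produces the lagged, weighty feel described above.
float DampedFollow(float current, float goal, float damping, float dt) {
    return current + (goal - current) * std::fmin(damping * dt, 1.f);
}
```

Tuning `damping` is where most of the "feel" lives: too high and the object snaps rigidly to the camera, too low and it drags behind on fast turns.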

This system took extensive fine-tuning to feel right and avoid common physics bugs, but it's become one of the most satisfying elements of the game. While initially built for immersion, it’s now forming the foundation for upcoming physics-based puzzles. I’ve also added reactive world details, like objects responding to being shot: fire extinguishers that explode and fly around, screens that shatter and power off, and vending machines that glitch and spill physics-enabled soda cans. All physics objects naturally collide with the player, NPCs, enemies, and each other, adding to the dynamic feel of the environment.

INVENTORY

Beyond scripted and physics-based interactions, I also wanted players to store and revisit important items through an inventory system. With the world filled with collectible lore objects like audio logs, documents, and keepsakes, it was important to give players an easy way to access and inspect them later. The inventory also serves as a quick reference for weapons, ammo, and other resources gathered throughout the journey on the Proxima.

The inventory system in Proxima is built around Unreal’s DataTables. I created a generic inventory Struct and used it to drive a central “Items” DataTable, which stores all possible inventory items. Each entry includes key variables like item ID, display name, description, icon, category (e.g., Key, Ammo, Readable), whether it’s readable, and any associated text transcripts.
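The row layout behind the "Items" DataTable can be sketched as a plain struct plus a lookup by item ID. Field names here are illustrative, mirroring the description above; in-engine the struct would derive from `FTableRowBase` and the lookup would be `UDataTable::FindRow`, with a map standing in here:

```cpp
#include <string>
#include <unordered_map>

// Sketch of one row of the central "Items" DataTable.
struct ItemRow {
    std::string id;           // item ID (row name)
    std::string displayName;
    std::string description;
    std::string category;     // e.g. "Key", "Ammo", "Readable"
    bool readable = false;
    std::string transcript;   // shown only when readable
};

// Stand-in for the DataTable itself: item ID -> row.
using ItemTable = std::unordered_map<std::string, ItemRow>;

// Returns the row for an item ID, or nullptr if no such item exists.
const ItemRow* FindItem(const ItemTable& table, const std::string& id) {
    auto it = table.find(id);
    return it == table.end() ? nullptr : &it->second;
}
```

Keeping every item's data in one table like this means adding a new collectible is just a new row, with no new Blueprint logic.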
 

As described earlier, interactable items in the world are based on “Master” actor classes. Inspectable inventory items use one of these, with exposed variables that let me define the correct item ID and associated data. When a player interacts with the item in-game, the system checks the DataTable for the actor's defined item ID and displays an “Inspect” UI. This interface shows a 3D model of the object, the name and description, a transcript (if readable), and options to pick it up or exit.
 

When an item is picked up, it's removed from the world and its item ID is sent to an "Inventory Manager" blueprint, which tracks all currently held items in an array. This data is passed to the Player State for saving, and is used to populate the inventory UI when opened. Players can view each item’s 3D model, name, description, transcript, and quantity. Planned features include free rotation of 3D models, item dropping, and sorting by category, thanks to the category tags stored in each item’s data.
 

The Inventory Manager can also be queried at any point during gameplay for quest progression, puzzle logic, conditional dialogue, or other systems that need to reference what the player has collected.
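A minimal sketch of that query surface, in plain C++ with illustrative names (the real manager is a Blueprint), might look like this: other systems only ever ask "how many of X does the player hold?"

```cpp
#include <map>
#include <string>

// Stand-in for the "Inventory Manager" blueprint: tracks held item IDs
// with quantities, and answers queries from quest logic, puzzles, and
// conditional dialogue.
class InventoryManager {
public:
    void AddItem(const std::string& id, int count = 1) { held_[id] += count; }

    int Count(const std::string& id) const {
        auto it = held_.find(id);
        return it == held_.end() ? 0 : it->second;
    }

    // e.g. a locked door asks: does the player hold the right keycard?
    bool HasItem(const std::string& id, int minCount = 1) const {
        return Count(id) >= minCount;
    }

private:
    std::map<std::string, int> held_;  // item ID -> quantity
};
```

Because everything funnels through one manager, a quest trigger and a dialogue condition check the same state and can never disagree about what the player is carrying.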

COMBAT

To support Proxima’s first-person shooter gameplay, I built a core combat system with all the essentials: picking up weapons, collecting ammo, shooting, and reloading. The gunplay is designed to feel snappy and responsive, inspired by classics like Half-Life and Fallout. Weapons feature a diegetic HUD that displays ammo directly on the gun, along with aim-down-sights functionality for more precise targeting. 

Currently, guns are the only available weapon type in Proxima. I built the weapon system using Unreal Engine’s base FPS template as a starting point, borrowing the idea of a modular Weapon Component. When a weapon is picked up and equipped, this component attaches to the player, enabling an alternate set of first-person combat animations. It also references the equipped weapon’s item ID to apply the correct model, animations, damage, ammo type, reload speed, and other attributes.

The Weapon Component handles core functionality like shooting, aiming down sights, and reloading, while tracking ammo with a flexible stock system. It also supports cosmetic skin swaps, which I'm experimenting with as hidden cheat code unlocks.

When firing a weapon, damage is applied appropriately to enemies, NPCs, or destructible environment objects, and physical force is applied to nearby physics-enabled assets for added realism.

Ammo data is managed through the Inventory Manager, which the Weapon Component references to enable or restrict shooting. This data is also displayed diegetically through a 3D widget on the back of each weapon, showing both the current clip and remaining stock.
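The clip-plus-stock split described above reduces to a small piece of state. This sketch is illustrative (field names and the 12-round clip size are assumptions, and the real stock count lives in the Inventory Manager rather than on the weapon):

```cpp
#include <algorithm>

// Simplified model of the flexible ammo stock: the clip lives on the
// weapon, the reserve stock mirrors the Inventory Manager's ammo count.
struct AmmoState {
    int clip = 0;
    int clipSize = 12;   // assumed per-weapon clip capacity
    int stock = 0;       // reserve rounds held in the inventory

    bool CanFire() const { return clip > 0; }

    // One shot: spends a round from the clip if any remain.
    bool Fire() {
        if (!CanFire()) return false;
        --clip;
        return true;
    }

    // Moves rounds from stock into the clip, never exceeding either.
    void Reload() {
        int moved = std::min(clipSize - clip, stock);
        clip += moved;
        stock -= moved;
    }
};
```

Both numbers feed the 3D widget on the back of the weapon: `clip` is the large readout, `stock` the smaller reserve count.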

ENEMIES

With combat in place, the next step was building enemies to fight. I aimed to create enemies with unique models, custom animations, and layered AI behavior, including health and death logic, pathfinding, and multiple behavior states.

Each enemy can shift between three main states: passive, where they roam freely; investigative, where they respond to distant sounds, damage, or visual cues; and aggressive, where they actively chase and attack the player or other threats.

Each enemy type in Proxima has its own master pawn Blueprint to handle its unique behaviors and attack patterns. While core systems like health, damage, and stamina are handled through a shared enemy BPI, individual enemies use their own Blueprints, Behavior Trees, and Blackboards to control state changes, animation logic, movement, and attacks.

I designed custom models, textures, and rigs for each enemy, and brought them into Unreal to create unique Animation Blueprints. These animations work in sync with the AI systems, giving enemies lifelike behavior as they move through the environment. Every area in Proxima includes a full navmesh, allowing enemies to intelligently navigate around obstacles and track the player.

In more detail:

  • Passive: Enemies wander randomly, occasionally stopping at points of interest or triggering ambient animations for added immersion.

  • Investigative: Triggered by sounds, damage, or spotting something suspicious. Enemies alert others nearby and head toward the source of the disturbance. Each in-game sound has an assigned loudness and range to determine how far away it can be heard. If nothing is found, enemies return to passive mode after a short delay.

  • Aggressive: If the player or an NPC is spotted directly or identified during investigation, the enemy becomes aggressive, executing unique attacks and chasing the target. If the target escapes and breaks line of sight, the enemy returns to investigative mode.
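The transitions in the three bullets above form a small state machine, and the hearing rule can be modeled from each sound's loudness and range. The following is a plain C++ sketch of that logic, not the project's Behavior Tree code; the audibility formula (`loudness * range` as the effective hearing radius) is an assumed model for illustration:

```cpp
enum class EnemyState { Passive, Investigative, Aggressive };

// Each in-game sound carries a loudness and range that determine how
// far away it can be heard (this combination rule is illustrative).
struct GameSound { float loudness; float range; };

bool CanHear(const GameSound& s, float distance) {
    return distance <= s.loudness * s.range;
}

// State transitions as described above; in-engine this logic lives in
// each enemy's Behavior Tree and Blackboard.
EnemyState UpdateState(EnemyState state, bool seesTarget,
                       bool heardSound, bool investigationTimedOut) {
    switch (state) {
    case EnemyState::Passive:
        if (seesTarget) return EnemyState::Aggressive;
        if (heardSound) return EnemyState::Investigative;
        return state;
    case EnemyState::Investigative:
        if (seesTarget) return EnemyState::Aggressive;
        if (investigationTimedOut) return EnemyState::Passive; // found nothing
        return state;
    case EnemyState::Aggressive:
        if (!seesTarget) return EnemyState::Investigative; // lost line of sight
        return state;
    }
    return state;
}
```

Note the asymmetry: an aggressive enemy that loses its target drops back only to investigative, not passive, so breaking line of sight buys the player time rather than an instant reset.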

Each enemy has its own health value, and when that hits zero, all AI logic is disabled and the enemy enters a ragdoll state. The body becomes fully physics-enabled, reacting naturally to the environment and allowing the player to pick up, move, or throw it.

ASSETS & ART CREATION

All art assets for this project were created by me, using Blender, Adobe Substance Designer & Painter, zBrush, Photoshop, and assembled in Unreal Engine 5. This includes all environments, props, character models, rigs, animations, shaders, and VFX. Below, you can find a look at assets and the creation process.

ENVIRONMENT

In the game, you explore a post-apocalyptic frozen landscape, dotted with abandoned buildings and structures. For this, I made a collection of assets to fill out the environment. Asset creation followed a standard games pipeline: high-poly sculpts in zBrush, retopologized down to low-poly assets in Blender, with high-poly details baked onto the assets in Substance Painter, and materials created in Substance Painter and Substance Designer.

ASSET SHOWCASE

In the gallery below, you can look through different showcases of various assets around the game environment. You can expand the current slide's image by clicking it.

GALLERY

Click the thumbnails below to view high res in-game screenshots.


© 2025 Caleb Moore
