The year is 2028. The survivors of nuclear Armageddon cling to existence in the subways beneath Moscow. Experience a new story-driven adventure built exclusively for VR that blends atmospheric exploration, stealth and combat in the most immersive Metro experience yet. Put on your gas mask and brave the crippling radiation and deadly threats of the Metro in this chilling, supernatural story from Dmitry Glukhovsky.
The development and release of Metro: Awakening marked a turning point for the studio. As we transitioned from co-development and smaller work-for-hire titles to tackling a flagship AAA IP, our pipelines, team structure and design philosophies had to mature. This scale-up required me to shift from a generalist into a Principal role, defining the core gameplay vision and implementing it from prototype to final polish.
Translating Metro’s signature, unforgiving tactility into an accessible VR experience was an incredibly rewarding challenge, forcing us to confront hardware limits, tech debt and tight scope. While overcoming these constraints, I was able to successfully execute the design vision I had spent years forming, ultimately resulting in a fantastic award-winning launch.
In Virtual Reality, immersion comes easily yet is incredibly fragile. Because players feel physically present, they bring their real-world mental models into the game. If a hand twists unnaturally or an interaction is unexpectedly difficult, the illusion shatters. In a franchise like Metro, which is defined by its tactile realism, taking shortcuts around this problem was unacceptable.
While we had an established codebase for hands and interactions, it needed an overhaul and clear design ownership for a title of this quality. To tackle this, I authored a vision document for the hand systems and interactions, establishing three strict pillars for player hands: Presence, Intuition and Control.
For presence and control, players must feel true ownership over their virtual hands. This relies not only on an accurate visual model, but above all on 1:1 agency: the virtual hands must reliably reflect physical input.
Our studio’s legacy systems relied heavily on physics. While intuitive on paper, the transition from Unreal Engine 4 (PhysX) to UE5 (Chaos) exacerbated issues like hands lagging, drifting or behaving unpredictably. To protect the Control pillar, we made the pragmatic call to invest in a kinematic solution for hand movement. This guaranteed 1:1 parity with the player’s controllers; the virtual hands would only break parity when we deliberately designed them to do so. While this made implementing interactions more complex, it was a necessary investment to eliminate unpredictable physics jank and maintain absolute control.
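The kinematic approach can be sketched as follows. This is a minimal illustration with hypothetical names, not the shipped code: the hand copies the tracked controller pose every frame, and parity breaks only through an explicit, designer-authored override.

```cpp
#include <cassert>

// Minimal sketch of a kinematic hand (hypothetical names, not the shipped
// code): instead of driving the hand through physics forces, the hand pose
// is set directly from the tracked controller each frame, guaranteeing 1:1
// parity. Parity is only broken when a system explicitly requests an
// override pose, e.g. snapping the hand onto a grab point.
struct Pose { float x, y, z; }; // position only, for brevity

struct KinematicHand {
    Pose pose{};
    bool overrideActive = false; // set by deliberate design, never by jank
    Pose overridePose{};

    // Called once per frame with the latest tracked controller pose.
    void Update(const Pose& controller) {
        // Default: exact 1:1 parity with the physical controller.
        // Deliberate exception: a designer-authored override wins.
        pose = overrideActive ? overridePose : controller;
    }
};
```

Because the pose is assigned rather than simulated, there is nothing to lag, drift or explode; any deviation from the controller is an authored decision.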
For intuition, players should never have to actively think about how to move their hands to perform a real-world action. The hurdle is making interactions feel intuitive without actual physical resistance. Pure 1:1 simulations fail in VR because the controllers cannot provide the tactile signals (like resistance on a door handle) that our subconscious relies on.
To solve this, we focus on simulating the experience of reality, not reality itself. We leverage the malleability of human proprioception – the brain’s ability to sense body position – by intentionally breaking our kinematic 1:1 parity at specific points:
The core systems for hand interaction were created by tech to fit my design, primarily utilizing “Interaction Sockets”. These components could be placed in any blueprint, defining grab points with transforms and animation poses. Each socket had tweakable rules determining the details of the interaction. While incredibly flexible, building it on legacy code during rapid expansion meant it became a cumbersome beast by the end of development. But tech debt is a given reality under a tight scope.
I expanded this system directly to improve UX with grab styles. Initially, grabbing was purely radius-based. I extended the logic to support alternatives, like volume-overlap checks, parent-primitive collision and more, allowing us to build more complex, custom interaction rules.
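A minimal sketch of how per-socket grab styles might look. The names and the axis-aligned volume check are my own simplifications for illustration, not the shipped API:

```cpp
#include <cmath>

// Illustrative sketch (hypothetical names, not the shipped Interaction
// Socket API): each socket declares how grab eligibility is tested, so a
// designer can pick the rule that fits the object instead of relying on a
// single radius check everywhere.
enum class GrabStyle { Radius, VolumeOverlap };

struct Vec3 { float x, y, z; };

struct InteractionSocket {
    GrabStyle style;
    Vec3 centre;
    float radius;        // used by GrabStyle::Radius
    Vec3 boxHalfExtents; // used by GrabStyle::VolumeOverlap (axis-aligned)

    bool CanGrab(const Vec3& hand) const {
        switch (style) {
        case GrabStyle::Radius: {
            // Classic sphere check: hand within radius of the grab point.
            const float dx = hand.x - centre.x;
            const float dy = hand.y - centre.y;
            const float dz = hand.z - centre.z;
            return dx * dx + dy * dy + dz * dz <= radius * radius;
        }
        case GrabStyle::VolumeOverlap:
            // Box check: better for long objects like rails or bolts.
            return std::fabs(hand.x - centre.x) <= boxHalfExtents.x &&
                   std::fabs(hand.y - centre.y) <= boxHalfExtents.y &&
                   std::fabs(hand.z - centre.z) <= boxHalfExtents.z;
        }
        return false;
    }
};
```

The value of the extension is that the rule lives on the socket itself, so dense objects can mix styles without touching the shared grab logic.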
We also introduced an input buffer that allowed players to press the grab button slightly too early or too late but still successfully execute the interaction. This was vital for smoothing out high-speed stressful situations like catching an object or frantically reloading a weapon.
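The early-press half of such a buffer could be sketched like this; the window length and names are illustrative, not the shipped values:

```cpp
// Sketch of an early-press input buffer (hypothetical numbers and names):
// a grab press is remembered for a short window, so if a valid target
// appears just after the press, the grab still fires instead of silently
// failing during a frantic reload.
struct GrabInputBuffer {
    static constexpr double kBufferWindow = 0.15; // seconds; illustrative
    double pressTime = -1.0;                      // -1 = no pending press

    void OnGrabPressed(double now) { pressTime = now; }

    // Called when a grab target becomes valid; returns true if a buffered
    // press should be consumed to execute the grab.
    bool ConsumeBufferedPress(double now) {
        if (pressTime >= 0.0 && now - pressTime <= kBufferWindow) {
            pressTime = -1.0; // consume so one press grabs only once
            return true;
        }
        return false;
    }
};
```

The late-press case mirrors this in the other direction: a target that was valid a moment ago still accepts a slightly delayed press.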
While our kinematic hand system guaranteed 1:1 hand control, the post-apocalyptic world of Metro is heavy and rusted. If a player grabs a massive steel door and it swings open effortlessly, the atmosphere is lost. To bridge the gap between the controller’s lack of resistance and the heavy virtual world, we established rules for Virtual Friction.
These rules built on my interaction vision documentation and the early prototyping efforts of several design interns. Based on their work, I refactored the interaction systems and distilled the rules into a custom component: the “Friction Interaction Constraint”.
Inheriting from Unreal’s default physics constraints, the FIC was designed to seamlessly handle interactions in two states:
The leading logic was kinematic: while held, depending on the linear or angular limit set, the FIC translated the frame-to-frame change in the relative hand transform(s) into movement for the interaction. This delta was passed through several tweakable systems to calculate the final interaction alpha:
To ensure the object felt identical regardless of how the player interacted with it, the tweakable values determining virtual friction also drove underlying physics settings, like linear and angular damping.
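Reduced to one dimension, the held-state update might look like this. It is my own simplification of the described behaviour, not the actual constraint code:

```cpp
#include <algorithm>

// Rough sketch of virtual friction along a single constrained axis
// (hypothetical names, not the shipped Friction Interaction Constraint):
// the per-frame hand delta is scaled by a tweakable friction factor and
// accumulated into a clamped interaction alpha. A heavy, rusted object
// uses a low factor, so the hand must travel further than the object
// moves -- which the brain reads as resistance.
struct FrictionConstraint1D {
    float frictionFactor; // 1.0 = frictionless, lower = heavier
    float alpha = 0.0f;   // 0 = closed, 1 = fully open

    // handDelta: this frame's hand movement along the limit axis,
    // normalised so that 1.0 would fully traverse the limit.
    void Tick(float handDelta) {
        alpha = std::clamp(alpha + handDelta * frictionFactor, 0.0f, 1.0f);
    }
};
```

As the text notes, the same tweakables would also drive the physics-side damping, so the object behaves consistently whether held or shoved.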
This core framework was further expanded throughout development by both myself and the tech team, and I collaborated with the broader design team to implement the constraint across our interactions. While it was easy to over-tweak the values and break hand parity, the balance we achieved after heavy iteration successfully tricked the player’s brain into feeling friction.
Weapon reloading in the Metro universe demands a high degree of fidelity and realism. Players had to physically execute every step of the reloading process: removing the magazine, inserting a fresh one and chambering the round, with minimal “button-press” automation. However, unforgiving realism risks immersion-breaking frustration when a player panics during a firefight.
I spearheaded the effort to create a weapon system that felt realistic but could be executed through subconscious muscle memory. The goal was to make the players feel like skilled survivors, even without real-world firearms experience.
To maintain the illusion of skill without the frustration of simulation, I prototyped and implemented several invisible assists:
Supporting multiple interactions on a single weapon made each weapon dense with interaction points (grips, bolts, magazines). Relying purely on our interaction system’s base proximity algorithms caused frustrating false positives, like accidentally pulling a magazine when reaching for the weapon’s bolt.
Specific weapons presented unique UX hurdles. The Shambler, a revolver-style shotgun, allowed players to load rounds into any of its six slots, often leaving gaps that caused the weapon to click empty during combat. To address this, I engineered the chamber to automatically spin to the next loaded round when the bolt was pulled. This made the weapon cooler, resolved the issue and let the weapon clearly communicate when the last round was fired.
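The auto-spin rule can be sketched as follows; this is hypothetical code for a six-slot cylinder as described, not the shipped implementation:

```cpp
#include <array>

// Sketch of the Shambler auto-spin rule (hypothetical code): when the
// bolt is pulled, the chamber rotates forward to the next loaded slot,
// skipping gaps so the weapon never clicks empty while rounds remain.
struct ShamblerCylinder {
    std::array<bool, 6> loaded{}; // true = slot holds a round
    int chamber = 0;              // slot currently aligned with the barrel

    // Returns true if a loaded slot was found and chambered; false means
    // the cylinder is empty, cleanly signalling "last round fired".
    bool OnBoltPulled() {
        for (int step = 0; step < 6; ++step) {
            const int slot = (chamber + step) % 6;
            if (loaded[slot]) {
                chamber = slot;
                return true;
            }
        }
        return false;
    }

    void Fire() { loaded[chamber] = false; }
};
```

Because the search wraps around the cylinder, players can load rounds into any slots in any order and the weapon still always fires the next available one.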
Yet the Shambler also highlights the difficulty of invisible design. Late in development, we shifted the default ammo pouch position on the player’s body. Consequently, the path to grab new shotgun ammo closely overlapped with the path to pull the bolt, leading to false positives.
I would love to say we completely resolved this, but we ran out of time before release. While I stand behind the choice to auto-grab the bolt, the issue could have been caught during playtesting. Because these systems are meant to be unnoticeable, diagnosing failures requires close observation of both the player’s gameplay footage and their physical body to verify the assists perform as intended.
While we leaned on invisible assists for handling, I refused to compromise on mechanical reality. We actively omitted aim assist, relying entirely on perfectly tweaked iron sights. I dedicated significant research to ensure mechanical details, like visible rounds entering the chamber and accurately operating pistol hammers, functioned exactly as they should, and maintained this standard rigorously throughout development.
After hand interactions and gunplay reached a solid state, I was asked to pivot and support the AI team with human enemies. The team had built an impressive foundation using a custom HTN (Hierarchical Task Network) plugin. I had to learn this system quickly and refine the combat and stealth states to ensure the experience hit the Metro quality bar, shifting the focus from logical simulation to curated player experience.
While I am proud of the end results for our human characters, it was a massive team effort where I played a supportive, refinement role.
The initial stealth simulation was accurate but lacked the right pacing. I reworked the investigation and alarmed states to be more ‘game-y’ to drive tension:
Near release, I took full ownership of designing the spider mutants. Under a tight deadline, I prototyped and fully implemented two variations in just a few months using state machines, collaborating with tech for animation and system support and level design for encounters.
Despite a relatively small budget and some clear improvement points, these enemy types became one of the most memorable encounters in Awakening. I’m quite happy to be lovingly hated by reviewers and players for their addition.
Oversaw the creation of prototypes, core systems, and interactions through discussions, reviews, and feedback with the gameplay programmers and designers, ensuring we met quality levels within scope.
Mentored three design interns over the course of the project, with one of them continuing as a junior under my management after their internship finished.
Promoted to Principal Gameplay Designer at the start of 2024 in recognition of my chosen specialization and the skill shown during the early years of development.