Resident Evil 4 is one of the most beloved games in the medium's history. Since its original release on the GameCube in 2005, it's been ported to nearly every major console. Capcom knows how fans feel about its iconic third-person shooter. So when it decided to bring the classic game to virtual reality, it entrusted the job to Armature Studio, a developer intimately familiar with what does and doesn't work in VR.
When adapting Resident Evil 4 to the new technology, Armature Studio had one mantra: "Is it fun?" Instead of opting for an experience that stayed painstakingly similar to the original GameCube title, Armature based every decision on whether it would make for an engaging experience in VR. That meant ditching many of RE4's tried-and-true mechanics that have worked so well on controllers for nearly two decades, including stopping to shoot, navigating to an inventory menu to equip weapons and items, and many other small intricacies that have been burned into players' minds since 2005. Guided by its "Is it fun?" mantra, Armature Studio effectively adapted RE4's gameplay into an experience that not only evokes the original but also stands as an immensely enjoyable VR adventure.
We spoke to Senior Producer Tom Ivey about how Armature Studio recreated Resident Evil 4 in Unreal Engine, from recreating the original game's assets entirely to fine-tuning every aspect of the gameplay and striking the perfect balance between being faithful to the classic and making an experience that excels in VR.
Can you speak to how the project of porting Resident Evil to VR came about?
Tom Ivey, senior producer: Around the time Armature was finishing up our last project with Oculus Studios, Sports Scramble for the Quest, we began discussing what the team should work on next. We had such a great working relationship with the Oculus Studios team over the course of our last two projects that both teams were eager to work together again, with a focus on finding a larger project for our next collaboration. The decision to bring Resident Evil 4 to the Quest 2 just evolved from there—it really seemed like the perfect fit for a translation to VR: a great mix of action-heavy gameplay, a dramatic story, and memorable environments.
The timing just worked out kind of perfectly, to be honest, and we were honored to be able to work on a franchise that was so well respected and held so many great gaming memories for the team.
What can you tell us about Armature Studio?
Ivey: Armature Studio was founded in 2008 by three of the key developers of the Metroid Prime franchise and is made up of many developers with a long history in the gaming industry. Over the course of the last 13 years, we’ve had the privilege to work on a wide variety of projects, from ports across a vast array of platforms to co-development with some of the industry’s biggest publishers, like Epic and Riot Games, as well as creating multiple original titles in house, such as ReCore for Xbox and Windows PC, and Fail Factory and Sports Scramble for the Oculus Quest. Many of our senior team members have a long history of working with Japanese publishers and developers over the years, which helped establish a great working relationship with Capcom, with an eye on keeping true to Resident Evil 4’s innovative gameplay while carefully adapting it for VR.
This isn't Armature's first VR game, but it's still quite different from your previous projects, Fail Factory and Sports Scramble. Were there things you’ve learned from those experiences that you’ve built on for Resident Evil 4 VR?
Ivey: We absolutely had a leg up at the start of Resident Evil 4 for VR because the team had worked on those two previous projects. I think a lot of the hand manipulation and gesture-based approaches—say, picking up an item and the feel of it in your hand or throwing an item and how it should play out in the world—were grounded in a lot of work we had done before, though, of course, we really pushed it even further with Resident Evil 4 for VR.
I think where we did the most learning and systems-building was on the movement aspects of the game. Both Fail Factory and Sports Scramble are single-location-based games—with Fail Factory, you can play it all in one position, and with Sports Scramble, it’s more about physical movement in the room-scale environment. For Resident Evil 4, we had to find the best solutions for navigating a large-scale world. We wanted to make sure we made the game playable for as wide an audience as possible in terms of comfort settings and accessibility within that world.
Can you talk about what it was like being tasked to recreate such an iconic game in VR?
Ivey: I mean, the immediate answer is that it’s an honor. You get to work on Resident Evil 4! But it’s also terrifying in its own way. So many of us on the team had a burnt-in memory of playing this game for the first time on the GameCube—this amazing experience, right? You remember your first time in the village being chased by a chainsaw-wielding Ganado and just how supremely well done it was, and it’s like you do not want to mess this up!
Thankfully, we had amazing partners in Oculus and Capcom—they were really both so supportive.
How did you approach modernizing the experience for VR while keeping the core of what fans love about the game intact?
Ivey: I remember early on we all realized that we would have to make changes to the game for it to work in VR and that the most important thing to keep in mind—rather than some esoteric rule or magic number or specific timing—was that it should be fun. Always fun. The question brought up with any change wasn’t “Is this the same exact value here?” It was always “Is this fun? Yes? Then OK.”
With that in mind, we approached every situation with the question, “How can we take this aspect of the game and make it even more fun through the power of the VR presentation and the Oculus Touch controllers?” The gameplay, the characters, the world, they’re already solid—you have this amazing foundation—so how do you just amp it up? We tried to “VR-ize” everything we could, from the core weapons to every single button or lever you interacted with in the original. The goal was to make the player feel immersed in that amazing world as much as possible and to make them feel as cool as Leon: make it easy to do these great moves with your guns, pulling your knife out as you take perfect aim, throwing one weapon to your other hand so you can grab a grenade and throw it into a pack of enemies.
That said, we also had lots of discussions early on with Capcom and Oculus, when establishing the movement and camera systems, about keeping the presentation of Leon in the world. It was emphasized early on that you aren’t some faceless secret agent here, you’re Leon Kennedy, so keeping in some of the iconic moves he does—his kick, or suplex, or even just showing him jump out of a second-story window—we intentionally kept some of these actions in the third-person presentation, while reducing many of the more straightforward actions to first-person to reduce friction in the gameplay. It was a delicate balance, and I hope we found the right side of it.
Why was Unreal Engine a good fit for Resident Evil 4 VR?
Ivey: A few years back, we made the shift at the studio to focus as much of our development on Unreal Engine as possible, so we could build our knowledge base and improve development times and quality with each project, rather than starting from scratch with a new engine each time. The ability to do extremely rapid prototyping in the Unreal Engine was definitely a benefit to the team, as well as allowing developers across disciplines to get their work into the game easily. With each VR project, we’re also gaining more experience and insight into how best to use Unreal on an Android-based VR platform and learning critical lessons on both performance and integration. We’re excited to see how Unreal Engine further develops for these types of platforms and games.
Considering the game was not originally built on Unreal Engine, can you walk us through what it was like porting it to UE?
Ivey: Our first prototypes for movement and shooting were 100% Unreal—rapidly standing up examples where we imitated the rough behavior of the enemies with dummy Unreal assets and grey-boxed the village geometry, just to get a feel for the best presentation of the core player systems. Once we had locked that down with Oculus and Capcom, we started bringing over the original game code and assets and building on top of that.
At a fundamental level, we’re running the original game code, with our VR additions stacked on top in a mixture of C++ and Blueprints, and then the visual layer, all coming from Unreal. This means things like enemy behavior, movement, collision checks, and puzzles are all running from the original source, but we’ve made changes here and there and then stacked on our systems for driving Leon’s movement, the first-person hands and weapons, pickups, interactable objects, and more on top of that.
So, the shotgun on the wall might be expressed as an object in the original code, but once you pick it up, you’re running our representation of the hands and the loading of that weapon, and its physical presence in the world: the model is our up-resed and re-rigged animated model in Unreal. Then you fire, and the bullet and damage and interaction with the enemy and how they respond, that’s back to the original game code. We’re also running all their original animation data directly and pushing the results into Unreal poseable meshes.
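As a rough editorial sketch of what that kind of bridge can look like in Unreal C++ (all class and function names below are our own invention, not Armature's code, and UCLASS reflection boilerplate is omitted): the legacy simulation advances on its own each frame, and the bone poses it produces are pushed into a UPoseableMeshComponent for rendering.

```cpp
// Hypothetical bridge (not Armature's actual code): the original game logic
// computes an enemy's bone poses, and we copy them into Unreal's poseable
// mesh that actually gets rendered. UCLASS/GENERATED_BODY boilerplate omitted.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/PoseableMeshComponent.h"

struct FLegacyBonePose            // assumed wrapper around the original animation data
{
    FName      BoneName;
    FTransform ComponentSpaceTransform;
};

class ALegacyEnemyActor : public AActor
{
public:
    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // The original game code advances its own simulation (AI, movement,
        // animation) and hands back the resulting bone poses.
        const TArray<FLegacyBonePose> Poses = UpdateLegacySimulation(DeltaSeconds);

        // Copy each bone transform into the up-resed, re-rigged Unreal mesh.
        for (const FLegacyBonePose& Bone : Poses)
        {
            PoseableMesh->SetBoneTransformByName(
                Bone.BoneName,
                Bone.ComponentSpaceTransform,
                EBoneSpaces::ComponentSpace);
        }
    }

private:
    TArray<FLegacyBonePose> UpdateLegacySimulation(float DeltaSeconds); // wraps the original code
    UPoseableMeshComponent* PoseableMesh = nullptr;                     // created in the constructor
};
```

The same pattern extends to weapons and props: the original code stays the source of truth for state, while Unreal components only mirror it visually.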
Resident Evil 4 VR is the best-looking version of the game yet. Can you walk us through how the team revamped the game’s visuals?
Ivey: We made many changes to the game in order to make sure the visuals held up under the close scrutiny of VR, but always with the mantra that it should look how the player remembered the original game. That aspect was key—it should hold up to their memories, even if, in reality, it was being rendered very differently from the original game. We talked with Oculus and Capcom at length about this during our initial pass at arting the first two rooms of the game.
From a pure production standpoint, we ended up recreating almost every texture in the game from scratch, at 4x, 6x, 8x, or sometimes even 10x the original resolution. For every example, we looked to the original GameCube textures for reference. We also modified the level geometry, but this was mostly to smooth out stark edges, create blend textures where there were harsh seams, or create 3D models for objects like candelabras or other props that were previously just a set of 2D planes. Because you can stick the camera in so many new places with the freedom of VR, we also had to patch a huge amount of holes and awkward joins that just weren’t visible in the original game.
We then started to tackle recreating the lighting and fog in Unreal. This was where we first started really pushing the color scape of the game. The original game has a fairly muted palette, mixed with heavy fog created through rendering thousands of semi-transparent 2D planes. Using the planar fog doesn’t really hold up in VR, and the muted palette had the effect of making the game presentation in VR feel very flat. We began to push the saturation and depth of color through textures and lighting to provide more subtle visual variety to the landscape, accentuating what was already there—finding more red tones in the brown areas of the early game or greens or blues in the later areas. These were usually subtle shifts overall—again, when the player looks at the game, the goal was that they would remember the scene as being just like the original—but these changes did a lot to give depth and variation to the landscape while still feeling like the same overall “muted” style of the original game.
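The interview doesn't specify which Unreal fog features replaced those stacked planes, but a common way to get depth-based, volumetric-feeling fog in Unreal is an exponential height fog actor tuned per area. The sketch below is purely illustrative, with invented values:

```cpp
// Purely illustrative (not Armature's shipped setup): replacing stacked
// translucent fog planes with Unreal's exponential height fog, tuned per area.
#include "Engine/World.h"
#include "Engine/ExponentialHeightFog.h"
#include "Components/ExponentialHeightFogComponent.h"

void SetupVillageFog(UWorld* World)
{
    AExponentialHeightFog* Fog = World->SpawnActor<AExponentialHeightFog>();
    UExponentialHeightFogComponent* FogComp = Fog->GetComponent();

    // Density and falloff would be chosen per area; these values are invented.
    FogComp->SetFogDensity(0.035f);
    FogComp->SetFogHeightFalloff(0.2f);
    FogComp->SetStartDistance(500.0f);

    // Slightly warmer, more saturated inscattering than the flat grey of
    // layered planes, in the spirit of the subtle color pushes described above.
    FogComp->SetFogInscatteringColor(FLinearColor(0.45f, 0.38f, 0.30f));
}
```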
We also put a ton of effort into the new weapon and hands for VR, not just in creating the detailed models and textures, but in the animation rigs: ensuring that turning, clenching, and twisting elements of the fingers, palms, and wrists responded as seamlessly and realistically as possible.
Can you talk about the ways you optimized the game for the Oculus Quest 2’s mobile architecture?
Ivey: The original game on GameCube ran at 30 FPS and at TV resolution. For the Quest 2, you need to run at 72 FPS at least, and our early experiments with recreating the initial rooms of the game showed that to look good, we would need to run at almost the native resolution of the display (around 1900x1900 pixels per eye) and enable antialiasing. So, despite the game being from 2005, it was still going to be a challenge because we were going to be drawing 200 times as many pixels per second on a mobile platform.
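For a rough sense of those numbers, here's our own back-of-the-envelope math (assuming 640x480 output on GameCube and counting 4x MSAA samples on Quest 2; neither assumption comes from the interview):

```cpp
// Back-of-the-envelope fill-rate comparison; the resolutions and the MSAA
// factor are our assumptions, not figures from Armature or Capcom.
constexpr double GameCubePixelsPerSec = 640.0 * 480.0 * 30.0;          // ~9.2 million
constexpr double Quest2SamplesPerSec  =
    1900.0 * 1900.0 * 2.0 /* eyes */ * 72.0 /* Hz */ * 4.0 /* MSAA */; // ~2.1 billion
constexpr double Ratio = Quest2SamplesPerSec / GameCubePixelsPerSec;   // ~226x
```

Counting raw pixels alone the ratio is closer to 56x, so the 200x figure presumably folds in the cost of antialiasing.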
Early in development, we realized we would need to draw many more objects per frame than in our previous games, and that led us to change from the OpenGL graphics driver to Vulkan, despite it being, at the time, relatively new and untested on Quest.
We also made a set of modifications to Unreal to improve performance on mobile hardware. We modified Unreal to allow for a far clipping plane. We tweaked Unreal's precomputed visibility to make it work better, and we dove deep into the engine’s software occlusion culling to ensure we were making the most of it. We also had the engineering team working closely with the artists to divide up the room geometry to allow for better clipping and culling, and created a controlled set of materials for usage throughout the game that were optimized for their specific use cases.
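The far-plane support mentioned above was an engine modification, so stock Unreal doesn't expose it directly; a standard, unmodified lever in the same spirit is capping per-primitive draw distance so distant room geometry drops out of the frame. A minimal sketch:

```cpp
// Not Armature's engine modification: just the stock per-primitive lever for
// distance culling that an unmodified Unreal project can use.
#include "GameFramework/Actor.h"
#include "Components/PrimitiveComponent.h"

void CapDrawDistanceForRoomProps(AActor* RoomActor, float MaxDrawDistance)
{
    // Gather every primitive component on the room actor.
    TInlineComponentArray<UPrimitiveComponent*> Primitives(RoomActor);

    for (UPrimitiveComponent* Prim : Primitives)
    {
        // Anything farther than MaxDrawDistance (in Unreal units) from the
        // camera is skipped entirely during rendering.
        Prim->SetCachedMaxDrawDistance(MaxDrawDistance);
    }
}
```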
In addition to targeting rendering times, we also looked through the original source code for places that we could improve algorithmically, with the benefit of sixteen years of hindsight. For example, we sped up some collision routines by a factor of ten.
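The interview doesn't say which routines were rewritten, but as a generic illustration of the kind of algorithmic win hindsight makes easy, a cheap bounding-box rejection in front of an expensive exact test removes most of the work in a typical frame:

```cpp
// Generic illustration only; the specific routines Armature optimized aren't
// described in the interview. FHitVolume and ExactIntersectionTest stand in
// for the original game's collision data and narrow-phase test.
#include "CoreMinimal.h"

struct FAabb { FVector Min; FVector Max; };

struct FHitVolume
{
    FAabb Bounds;   // cheap, conservative bounds kept up to date
    // ... original collision representation would live here ...
};

bool ExactIntersectionTest(const FHitVolume& A, const FHitVolume& B); // expensive legacy routine

static bool AabbsOverlap(const FAabb& A, const FAabb& B)
{
    return A.Min.X <= B.Max.X && A.Max.X >= B.Min.X &&
           A.Min.Y <= B.Max.Y && A.Max.Y >= B.Min.Y &&
           A.Min.Z <= B.Max.Z && A.Max.Z >= B.Min.Z;
}

bool Collides(const FHitVolume& A, const FHitVolume& B)
{
    if (!AabbsOverlap(A.Bounds, B.Bounds))
    {
        return false;                      // early out: by far the common case
    }
    return ExactIntersectionTest(A, B);    // only pay for the exact test when close
}
```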
As part of our optimization efforts, it was very important for us to be able to track the state of the game’s performance at any time—both in response to targeted changes but also to identify any changes made by the team as a whole that might have unintentionally affected performance. To that end, we created a tool that would jump to various locations in every room in the game and sample the performance, and then dump the resulting data into tracking sheets we could use to analyze shifts in the overall health of the game. The Village, Salazar’s Tower, and the Militant Base were three areas that kept coming back to haunt us.
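A minimal sketch of that kind of sweep tool (our own approximation; Armature's actual tool isn't public): visit each sample point, record a frame time, and dump the results as CSV for tracking over time.

```cpp
// Minimal sketch of an automated performance sweep (not Armature's tool):
// visit a list of sample points, record frame time at each, write a CSV.
#include "GameFramework/Pawn.h"
#include "Misc/App.h"
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"

void RunPerfSweep(APawn* Pawn, const TArray<FVector>& SamplePoints)
{
    FString Csv = TEXT("X,Y,Z,FrameMs\n");

    for (const FVector& Point : SamplePoints)
    {
        Pawn->SetActorLocation(Point);

        // A real tool would wait several frames here and average the results;
        // FApp::GetDeltaTime() only reports the most recent frame's delta.
        const double FrameMs = FApp::GetDeltaTime() * 1000.0;
        Csv += FString::Printf(TEXT("%.0f,%.0f,%.0f,%.2f\n"),
                               Point.X, Point.Y, Point.Z, FrameMs);
    }

    FFileHelper::SaveStringToFile(Csv, *(FPaths::ProjectSavedDir() / TEXT("PerfSweep.csv")));
}
```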
The game now features manual reloading, dual wielding, and body holsters, and allows players to move while shooting. Can you talk about how you approached tailoring these combat systems for VR?
Ivey: A lot of this work was done very early in the prototyping phase before we loaded the first original game asset. We worked closely with Oculus and Capcom to come to an agreement on the best presentation of Leon and the player in the game, creating a host of options as well as defining what we thought of as the optimal presentation. This way, the teams across the different parts of the world could get a feel for the direction we were going but also be able to test out various “what ifs” and subtle changes without having to wait for an entirely new build.
This allowed us to come to an agreement fairly quickly as to what the primary player representation in the game would feel like, with the pistol and the shotgun as two examples of different types of weapon holding, reloading, and firing. We also established many other more subtle aspects of the systems at this time—all with the desire to keep the player in the moment as much as possible, to remove the “fiddly” nature of some VR interfaces while still making it feel engaged and “real.” This included small things like not allowing the player to drop items back into the environment. When you’re hastily switching between weapons, we didn’t want you to lose your gun on the ground—that doesn’t add to the fun of being Leon; Leon is too cool for that.
We wanted you to feel free to toss your gun in the air, then toss the clip after it, grab the gun in midair and see the clip connect, and immediately fire off a shot. We wanted everyone, not just expert players, to try wild moves in the heat of combat because they weren’t worried about barely missing a grab—so the “smooth snap” ranges of weapon grabbing, clip loading, and such are a little looser. This also led to simplifications like having the shotgun reload fully from a single shell: it was important that the game wasn’t too much of a “real world” simulator; it was more of a “Leon, the super agent” simulator.
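To make that concrete, here's an illustrative (entirely invented) version of a forgiving grab query: a sphere overlap around the hand with a radius wider than a strictly realistic reach, returning the nearest grabbable object.

```cpp
// Illustrative only (channel, tag, and radius invented): a deliberately
// generous grab check so near-misses in the heat of combat still snap the
// item into your hand.
#include "Engine/World.h"
#include "Engine/EngineTypes.h"
#include "CollisionShape.h"

AActor* FindGrabbable(UWorld* World, const FVector& HandLocation)
{
    const float GrabRadius = 18.0f;   // a little looser than a "realistic" reach

    TArray<FOverlapResult> Overlaps;
    World->OverlapMultiByChannel(
        Overlaps, HandLocation, FQuat::Identity,
        ECC_PhysicsBody, FCollisionShape::MakeSphere(GrabRadius));

    AActor* Best = nullptr;
    float BestDistSq = TNumericLimits<float>::Max();
    for (const FOverlapResult& Hit : Overlaps)
    {
        AActor* Candidate = Hit.GetActor();
        if (!Candidate || !Candidate->ActorHasTag(TEXT("Grabbable")))
        {
            continue;
        }
        const float DistSq = FVector::DistSquared(Candidate->GetActorLocation(), HandLocation);
        if (DistSq < BestDistSq)
        {
            BestDistSq = DistSq;
            Best = Candidate;
        }
    }
    return Best;   // nearest grabbable inside the forgiving radius, if any
}
```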
This desire to keep the player “in the moment” as much as possible also extended to mapping the various weapons and consumables directly to your body. While the original game had you switching to the inventory screen very often to switch weapons or use healing items, we wanted the player to be in the action, doing these things live, feeling the thrill of balancing all these different systems in the heat of the moment. However, it was very important to the team that the iconic “Inventory Tetris,” as we called it, remain intact. This is a memorable part of the Resident Evil experience for many players, so we still needed to maintain that presence and the need to interact with it.
So while the player might reload their weapon live and grab grenades or consumables directly from different areas of their body, they still return to the inventory menu to map these quick select slots, combine herbs, make room for more items, or connect bonus attachments to their weapons. The balance just came naturally.
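One common way to implement body-mapped slots like these is to express the controller's position in a torso-relative frame approximated from the headset, then test it against fixed holster zones. The sketch below is our own illustration; the slot names, offsets, and radii are invented:

```cpp
// Illustrative sketch (zones and offsets invented): map a controller position
// into a torso-relative space and pick the holster slot it falls inside.
#include "CoreMinimal.h"

enum class EHolsterSlot { None, Chest, LeftHip, RightHip, Shoulder };

EHolsterSlot QueryHolster(const FTransform& HmdWorldTransform, const FVector& HandWorldLocation)
{
    // Approximate the torso frame from the headset: keep only the yaw, and sit
    // it a little below the head so leaning doesn't drag the holsters around.
    FTransform Torso = HmdWorldTransform;
    Torso.SetRotation(FRotator(0.f, HmdWorldTransform.Rotator().Yaw, 0.f).Quaternion());
    Torso.AddToTranslation(FVector(0.f, 0.f, -30.f));

    const FVector Local = Torso.InverseTransformPosition(HandWorldLocation);

    // Invented zone centers (cm) in torso space: +X forward, +Y right, +Z up.
    if (FVector::Dist(Local, FVector(15.f, 0.f, 0.f))    < 15.f) return EHolsterSlot::Chest;
    if (FVector::Dist(Local, FVector(0.f, -20.f, -25.f)) < 15.f) return EHolsterSlot::LeftHip;
    if (FVector::Dist(Local, FVector(0.f, 20.f, -25.f))  < 15.f) return EHolsterSlot::RightHip;
    if (FVector::Dist(Local, FVector(-5.f, 10.f, 20.f))  < 15.f) return EHolsterSlot::Shoulder;
    return EHolsterSlot::None;
}
```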
With these new ways to fight, did the team have to rebalance any of the enemies or specific combat encounters?
Ivey: This is another area where the overall balance just evolved from some early choices we established, as well as the guiding principle that we talked about earlier: “Is it fun? Then okay.” We had many discussions initially about whether the freedom of movement and improved aiming would make the game too easy or remove all the thrill. But we quickly found that managing the physical reloading and tracking of ammo, and the more limited field of view than the original game’s third-person camera, sort of subtly balanced against the gains made elsewhere. When physically placed in the environment in VR, players also just naturally tended to play more aggressively in spots, rather than leaning back on the tactical “move as far away as possible, turn to face enemy, fire until they reach midrange, move as far away as possible” system employed in the original game. You’re more likely to stick around in the middle of a crowd, rapidly turning to try to get a headshot on every enemy, twisting, spinning, and leaning around.
That said, we did make changes to the enemy AI: improving pathfinding in various cases, adjusting how they target Leon vs. Ashley to prefer Leon more often, letting them throw weapons at the player from more positions, things like that. We also made small changes to the spawn rates of enemies in a few of the more grueling sections of the game, as we found that the length of the encounters could be overwhelming and less than fun in VR. That said, we generally kept the professional setting identical to the original setup for the hardcore players.
I think the biggest challenge we had with rebalancing the game came from the fact that the game is structured in a specific way, and we had no intention of changing the overall game flow—we’re not going to change where you first get the shotgun, for example, or the timing of when you first fight against a lot of enemies. But we didn’t want to have a large unrelated training level before you started into Leon’s story. The problem is, the original game doesn’t measure out clean beats for training the player—it relies heavily on the player reading instruction manuals given out in certain parts, as well as the fact that it used relatively few controls overall, and all weapons controlled the same. For the VR version, there were more things we needed to teach the player regarding physically handling weapons and objects, but we didn’t want to heavily interrupt the game flow once you really started up. Our solution was to start with basic tutorials, then use “slow time vignettes” and leader-line prompts for some of our live tutorial experiences, and then create a robust set of player-manual videos as a supplement to that basic training. We also put a lot of effort into just making physical interactions as natural and stumble-proof as possible.
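As one small, speculative example of how a “slow time vignette” can be wired up in Unreal (not necessarily how Armature did it), global time dilation slows the world while the prompt is on screen and is restored afterwards:

```cpp
// Speculative sketch: slow the whole world while a tutorial prompt is up,
// then restore normal speed. World timers advance in dilated game time, so
// the delay is scaled by the dilation factor to hold for real seconds.
#include "Engine/World.h"
#include "Kismet/GameplayStatics.h"
#include "TimerManager.h"

void BeginSlowTimeVignette(UWorld* World, float RealSecondsToHold)
{
    const float Dilation = 0.1f;                              // world runs at 10% speed
    UGameplayStatics::SetGlobalTimeDilation(World, Dilation);

    FTimerHandle Handle;
    World->GetTimerManager().SetTimer(Handle, [World]()
    {
        UGameplayStatics::SetGlobalTimeDilation(World, 1.0f); // back to full speed
    }, RealSecondsToHold * Dilation, false);
}
```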
With added smashable boxes, levers that players need to physically crank, and puzzles redesigned with VR in mind, can you talk about the work that went into making a game not originally designed for VR feel more interactive and tactile?
Ivey: Early on, Armature Creative Director Mark Pacini made a list of the interactions he absolutely wanted to make sure we converted into fully physical VR versions. This included things you encounter a lot or that are integral to the gameplay, like picking up objects, breaking crates, or opening doors, but also really memorable moments that he thought would be worthwhile to emphasize for people who had played the original game, like opening the bear trap for the dog in the very first level.
We then made an expanded list that included every object in the game—buttons, levers, crank wheels, sliding puzzles, you name it—writing down the category of action (push, press, pull one way, pull two way, and such), how often it was used and where, what the physical version might be, and the complexity of the work required to convert it to a physical interaction. Our original hope was that we could convert a large portion of that list by the end of the project. I’m happy to report that we were able to convert every prop on the list to a physical representation and let the player really feel like they’re in the world interacting with the environment as a basic part of gameplay.
There were a few cases where we kept the original “fast-action button press” implementations or third-person representations because of gameplay and comfort considerations or the desire to keep Leon’s presence in the game. For example, we allow the player to quickly vault over fences and windows with a button press, rather than recreating this through an awkward motion. We also don’t have the player mimic climbing up a ladder or a cliff edge: this was done both because we felt that the sheer number of climbs (and, in the case of ladders, their height) might make this laborious rather than fun, and because the game had no internal state for AI or other systems to understand a player being “partially up a ladder” or “partially over a windowsill.” We looked at what adding physical climbing would grant the player from a gameplay and immersion perspective, balanced against the navigational friction, code complexity, and bugs it might introduce, and made the call. We went through the exact same process for allowing the player to physically open doors—that is, to partially open the door and peek through it or shoot through it—and we felt that the gameplay benefits, in that case, were worth the effort of rebuilding some fundamental aspects of the player and AI state to make it work. Even then, it took a lot of effort to get right—updating the AI to handle the state, as well as making it feel easy and responsive to grab and swing open the door or to push the door with your body, so it doesn’t feel like we’ve added a burden to navigating the world.
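For a sense of how a grabbable, peek-able door can be driven without full physics, here's a simplified, invented sketch: the hand's position is converted into the door frame's local space and turned into a clamped hinge angle that AI and peeking logic can also read back.

```cpp
// Illustrative only: drive a hinged door from the grabbing hand's position,
// so the player can crack it open, peek through, or shove it wide.
#include "GameFramework/Actor.h"
#include "Components/SceneComponent.h"

void UpdateGrabbedDoor(AActor* DoorFrame, USceneComponent* DoorPanel, const FVector& HandWorldLocation)
{
    // Hand position expressed in the door frame's local space, so the yaw we
    // compute is relative to the panel's closed position.
    const FVector LocalHand =
        DoorFrame->GetActorTransform().InverseTransformPosition(HandWorldLocation);

    // Angle of the hand around the hinge (frame origin), flattened to the floor.
    const float DesiredYaw = FMath::RadiansToDegrees(FMath::Atan2(LocalHand.Y, LocalHand.X));

    // Clamp to the physical swing range; AI and peeking logic can read this
    // angle back as "how open" the door currently is.
    const float ClampedYaw = FMath::Clamp(DesiredYaw, 0.f, 110.f);
    DoorPanel->SetRelativeRotation(FRotator(0.f, ClampedYaw, 0.f));
}
```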
There are a lot of customizable options to ensure players are comfortable while playing. How did Armature approach this?
Ivey: It was super important to us to provide as many comfort options as possible, so that players from all walks of life could experience the world of Resident Evil 4 in VR. We spent a lot of time on this topic for the game. As with every option, we wanted to make sure it was implemented in such a way that it felt just as fun and immersive to play the game as the default standing, immersive-mode options. For example, we worked really hard on building out the teleport-movement mode, to the extent that it’s actually a super engaging and unique way to play the game (and something that we’re very curious to see speed-runners handle!).
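A bare-bones illustration of teleport locomotion in Unreal (again our own sketch, not Armature's system): project the requested point onto the navmesh, fade the view, and move the pawn. In a shipping system you would sequence the fade and the move across frames rather than back to back.

```cpp
// Rough sketch of teleport locomotion (not Armature's implementation).
#include "NavigationSystem.h"
#include "GameFramework/Pawn.h"
#include "GameFramework/PlayerController.h"
#include "Camera/PlayerCameraManager.h"

bool TryTeleport(APawn* Pawn, const FVector& RequestedPoint)
{
    // Only accept destinations that project onto walkable navmesh.
    UNavigationSystemV1* NavSys =
        FNavigationSystem::GetCurrent<UNavigationSystemV1>(Pawn->GetWorld());
    FNavLocation Projected;
    if (!NavSys || !NavSys->ProjectPointToNavigation(
            RequestedPoint, Projected, FVector(50.f, 50.f, 200.f)))
    {
        return false;
    }

    APlayerController* PC = Cast<APlayerController>(Pawn->GetController());
    if (PC)
    {
        // A quick fade hides the snap, which is what keeps teleporting
        // comfortable; a real system would wait for the fade before moving.
        PC->PlayerCameraManager->StartCameraFade(0.f, 1.f, 0.1f, FLinearColor::Black);
    }

    // 90 cm is an assumed capsule half-height offset above the navmesh point.
    Pawn->SetActorLocation(Projected.Location + FVector(0.f, 0.f, 90.f));

    if (PC)
    {
        PC->PlayerCameraManager->StartCameraFade(1.f, 0.f, 0.2f, FLinearColor::Black);
    }
    return true;
}
```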
We also had a lot of testing across a wide variety of users, both external white-paper testers as well as our internal and external QA teams. Unfortunately, with the advent of COVID, our white-paper testing wasn’t as extensive as we would normally have liked, but across the development teams, we were still able to include a diverse set of users of various sizes and statures, genders, and physical accessibility needs.
Even with all that testing and consideration, we still ended up missing some options that players have let us know would be greatly appreciated, so we’re adding even more options to the game that we hope to release in the future. Oculus has also recently expanded on its accessibility options at a system level, which we hope will help developers of all sizes make their games more accessible.
As we work on more VR products of increasing complexity, there are always new considerations we’re coming to understand, and it’s important to us that we keep learning and adding to our knowledge base as we develop in the VR landscape. We also want to create more exacting standards and practices across all our products to make sure we cover an ever-increasing variety of accessibility options and have those options baked into the game as early as possible, so they can be carefully considered and integrated as seamlessly as possible.
What would you say has been the most challenging aspect of porting the game to VR, and how did you overcome it?
Ivey: From the technical side of things, over the course of a two-year project, a lot of challenges come and go: getting an understanding of all the original code and data formats (for packaging, audio, animation, cutscenes, cameras, pathfinding, textures, triggers, messages, and so on), reimplementing original particle effects systems, making everything run at frame rate, having the hands grab 100 different objects smoothly and accurately, even something as seemingly straightforward as recreating the system of character shadows in cutscenes. There are a ton of unique challenges when porting a game, digging into decades-old code with comments in a language foreign to most of your development team, and then, for an adaptive port like this, you add on the challenges of all the new content you’ve created, and how it interacts with those old systems, and they sort of all magnify each other.
But at the end of the day, you’re still working on Resident Evil 4, so it’s hard to complain! You just dig into the problem, try to find the best and most robust solution, and then move on to the next challenge.
The game has received widespread critical and community acclaim. How does this make the team feel?
Ivey: The whole team is ecstatic to see the positive response to the project. We really care a lot about this game and have a ton of respect for the original version, so we wanted to make sure we did our best to represent the quality of Resident Evil 4 when we brought it to VR. To watch people pulling off these incredible moves and seeing their personality shine through with how they play, it’s just great. We’re all having a blast sharing clips of people playing the game in all these unique ways!
It’s also great to hear that for many players, this is their first experience with Resident Evil—we’re happy we were a part of introducing them to this amazing franchise!
Thank you for your time. Where can we learn more about Armature Studio and Resident Evil 4 VR?
Ivey: You can find more information about Armature Studio at www.armature.com and @ArmatureStudio, and Resident Evil 4 VR at www.oculus.com/resident-evil-4/.