
Nothing has given gamers a more intrepid sense of war and bloody carnage than the First Person Shooter. Generations of gamers can, scarily enough, name and identify weapons by model, modification and, in some cases, manufacturer. M-16? Assault rifle of choice for the US Army. AK-47? Infidel remover for terrorists and freedom fighters the world over.

With the epic evolution of realism that has merged into mainstream gaming, weapons are now more than a mush of pixels in the bottom middle of the screen. Everything from iron sights to silencers is now recreated in minute detail. Weapons jam and misfire, slam your avatar’s shoulder with immense recoil, and ricochet shrapnel across the room you’ve decided to unload a personal arsenal in.

As games merge closer with the battlefields they emulate, I decided to take a look behind the scenes and find out exactly how your RPG-7 gets from the range to your monitor.

Arguably, one of the most popular representations of battle comes from the America’s Army series. First released in 2002, Version 1.0 was nicknamed “Recon“, designed to be the first in a series of PC games to give potential recruits a taste of Army action. Rather than simply dropping players into an arbitrary battle from the get-go, hopeful soldiers must complete tutorials that gauge their abilities and introduce them to relatively complex weaponry and tactics.

It’s easy to see the impressive amount of detail that has gone into the latest incarnation of the game, America’s Army 3, which utilises the latest version of the Unreal Engine: a better UI, a ranking system that bears similarities to COD4’s, and the ability to import profiles from the previous games. The training portions are fascinating; being taught the correct way to fire weapons and then being tested on them is a challenging and fun chance to excel.

It’s also a great chance to take a close look at the particular pieces of kit that you’re given to use. All of the equipment, vehicles and weapons in AA are taken directly from the materiel departments of the US Army. The weapons, from the way they fire to the sound they make and the actions your avatar uses to reload and prep sights, are exactly as any soldier would be trained to use them.

And that’s because the same guys who model, texture and program them also hold and fire them on live fire ranges, Chief Engineer of the AA program, Michael Barnett, tells me. “We operate the vehicles and fire the weapons on army ranges,” he notes. “I have personally fired many small arms weapons and shot the M2 .50 cal machine gun, Mark 19 40mm grenade launcher and the TOW ITAS anti-tank missile.” No lab coats and ballistic monitoring programs for these designers, it seems.

In addition to providing the nitty-gritty on how to make everything as realistic for the player as possible, Michael is the technical lead for all projects relating to the AA platform, which includes government training, simulation applications and other elements of real soldier-VR contact. In other words, Michael helps create the virtual battlefields that the real fighters learn in before they head over to the dusty battlegrounds of Iraq and Afghanistan.

Impressive. But I wanted details. How does a weapon go from your hands on a range at Ft. Benning to an imposing force of power in the hands of your average Gamer?

Michael was more than happy to elaborate. And, being an engineer, he got technical. Very technical.

The 3D modelling of weapons and other systems for which we build trainers is done using very high resolution CAD models that are decimated for real time use. The internal mechanical workings, textures, materials, and sounds are all produced from data collection of the actual systems to make the most authentic experience possible.

Physical measurements not documented are made with 3D laser scanners, weapon sounds are recorded using high end equipment on the actual ranges, and avatar animations for movement are captured using a Vicon mocap system performed by actual Special Forces soldiers so every detail is captured.

In creating a simulation of any Army equipment or weapon system, we begin with a TM (technical manual) and an OMM (operations and maintenance manual), which describe the basic mechanical workings of the system and detail all human interactions with the system, from proper operations to maintenance.

Because the documentation lags behind development of the actual system, we next get (our) hands on the actual tactical system or weapon in the field to press every button, turn every switch and put the system in every possible state. From this we create a system specification outlining the states and state transitions. We use this as a master document for our software design, giving us a level of detail not possible from any other means of data collection.

As we operate the actual tactical systems, taking them through every possible operational state, we naturally become resident experts on the system, going so far as to model even any system software bugs that we may find in the tactical system, so that our trainer is exactly what a soldier would experience using the real system.

So, basically, it’s a detailed and intense process. Weapon firing specifics like accuracy, jam rate and recoil are all taken from official Army ballistic tables, which are themselves derived from range, lab and battlefield firing. Jam rates in the game are deliberately tuned to occur more often than in reality, to give the user the experience of a jam, rather than a directly proportional representation of how often a rifle would actually jam.
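That tuning could be sketched as a simple probability check per trigger pull. The rates and multiplier below are my own illustrative assumptions, not anything from the Army tables the developers actually use:

```python
import random

# Hypothetical numbers for illustration; real rates come from Army ballistic tables.
FIELD_JAM_RATE = 1 / 5000      # roughly one stoppage per 5,000 rounds (assumed)
TRAINING_MULTIPLIER = 50       # inflated so players actually encounter a jam

def shot_jams(rng: random.Random, training_mode: bool = True) -> bool:
    """Decide whether this trigger pull results in a stoppage."""
    rate = FIELD_JAM_RATE * (TRAINING_MULTIPLIER if training_mode else 1)
    return rng.random() < rate

rng = random.Random(42)
jams = sum(shot_jams(rng) for _ in range(10_000))
print(f"jams in 10,000 simulated rounds: {jams}")
```

With the inflated rate, a player sees roughly one jam per hundred rounds instead of one per five thousand; the point is the experience, not the statistics.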

The same goes for calculating damage. AA, like most titles, features a class system that includes a medic. But unlike most titles, damage taken in AA is area-specific. Take a shot to the heart and you die. Take a bullet in the leg and you limp, and possibly bleed out. I asked Michael how he took hitboxes into account.

For the game, a player’s avatar is divided into over 16 parts, and injuries are recorded by which part is hit and the physical 3D location on that part. The AA3 game uses a simple calculation for the injured section based on round or striking material.

Other Govt applications will use the physical location per part along with more extensive data on the type of injury in order to train medical personnel. Medical data approved by the Army’s school is used for the more detailed training and simulated effects.
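The per-part injury recording Michael describes might look something like this in code. A minimal sketch only: the part names, the severity values and the `Injury` structure are my own illustrative assumptions, not Army-approved medical data:

```python
from dataclasses import dataclass

# Illustrative part list and lethality factors; the real AA3 model divides the
# avatar into 16+ parts and draws on Army-approved medical data.
LETHALITY = {"head": 1.0, "torso": 0.6, "upper_arm": 0.2, "thigh": 0.3, "foot": 0.1}

@dataclass
class Injury:
    part: str                              # which body part was struck
    location: tuple[float, float, float]   # physical 3D hit point on that part
    severity: float                        # simple per-part lethality factor

def record_hit(part: str, location: tuple[float, float, float]) -> Injury:
    """Record a hit by part and 3D location, with a simple severity lookup."""
    return Injury(part, location, LETHALITY.get(part, 0.1))

hit = record_hit("thigh", (0.1, -0.3, 0.9))
print(hit.severity)   # 0.3: a limp and possible bleed-out, not instant death
```

Keeping the exact 3D location alongside the part name is what lets the more detailed government applications layer richer injury models on top of the same hit data.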

Some of the more in-depth projects that Michael works on involve much more specific problems. A medical application might involve realistic wounds and timed scenarios, where battlefield medicine needs to be done in real time, utilising a plethora of bandages, equipment and drugs. Wouldn’t that make ‘pressing “e” to heal’ seem a hell of a lot more redundant?

There are reams of data behind everything you do in a modern FPS. Every bullet you fire is calculated in real time, to the ends of the earth, depending on where you are, where you aim and what weapon you are using. Does it burn out before hitting its target? Do wind, barometric pressure or atmospheric temperature affect its trajectory? Does it slice through a wall and into your enemy’s head, or simply hit the wall and stay stuck forever?

Everything from recoil to a player’s stance and steadiness matters for an accurate shot. Real soldiers are modelled in different positions while firing different weapons, and tables are developed, incorporating simulated injury, slight movements and breathing, to create the most realistic environment possible. Holding your breath to take a sniper shot isn’t just a gimmick; you’ll find every soldier sucks in a breath after lining up a target between the sights.
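A sketch of how stance and breathing might feed into aim sway. The sway amplitudes and the figure-eight motion are my own illustrative assumptions, not values from any Army table:

```python
import math

# Assumed sway amplitudes per stance; a steadier stance means a tighter aim circle.
STANCE_SWAY = {"standing": 3.0, "crouched": 1.5, "prone": 0.5}

def aim_offset(stance: str, t: float, holding_breath: bool) -> tuple[float, float]:
    """Sway offset at time t: a slow figure-eight scaled by stance and breathing."""
    breath = 0.2 if holding_breath else 1.0    # a held breath damps the oscillation
    amp = STANCE_SWAY[stance] * breath
    return (amp * math.sin(t * 1.3), amp * math.sin(t * 2.6) * 0.5)

dx, dy = aim_offset("prone", 1.0, holding_breath=True)
print(abs(dx) < 0.2 and abs(dy) < 0.2)   # prone plus held breath: very small drift
```

Stack the modifiers (standing, wounded, winded) and the reticle wanders; go prone and hold your breath and it settles, which is exactly the behaviour the training tables are built to reproduce.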

Nothing, and I mean nothing, is taken for granted. The US considers America’s Army more of a training or recruitment tool than a “game”, requiring it to meet the same standard as any other official document or training manual. As a result, the project has a high priority in regards to funding, reflected in the devs’ dedication to the community and the extraordinary number of patches and upgrades the title has received since inception.

So the next time you drop into a server, having completed your ranger and advanced marksmanship training, have a think about all of the science that has gone into your virtual experience. Every shot counts.

  1. jonondaspot

    If you want to have a realistic article about virtualizing weaponry, AA should be at the bottom of the list. They had their clock cleaned at the trials to be the official US Army simulation. That was won by VBS2 VTK, which is also the official sim of the US Marines, the British MOD and NATO.

    http://virtualbattlespace.vbs2.com/

  2. @jonondaspot
    We were lucky enough to get an interview with them :D . We’d love to interview the people responsible for VTK as well.

  3. @jonondaspot

    I was really impressed by Michael’s knowledge and technical scope. I didn’t really get into the nitty gritty for the article’s sake, but there was a hell of a lot of detail that I omitted to make it more accessible.

    I’m hoping to follow it up by talking to developers (and Michael) of other military simulations that focus more on government application – including VTK and also the sim that the Australian Army use as well.

  4. RSamples

    America’s Army and VBS2 are both used by the Army but for different purposes. VBS2 was chosen for some of the Army’s training due to it being a broad commercial application that had a great deal of ready-made features for doing mission rehearsal and its compatibility with legacy Army sims using an Army standard of network communication called HLA. But America’s Army, based on Unreal Engine 3, and other technologies like the CRY engine are definitely better choices for producing more highly detailed system simulations requiring realistic ballistics, physics, control dynamics and visual realism like light and shadows, environmental effects, human movement and expression modeling, sensors such as night vision and infrared, and soldier training for target ID and friendly identification. We have worked with both AA and VBS2 and each has its strengths. There is no one that does it all. Besides, in 1 or 2 years both will be obsolete, and I understand the Army has deep development capabilities with AA whereas VBS2 is a box set. Good Article! Keep em coming.

  5. BahDog

    Oh dear.
    As soon as I read the introductory paragraph about how ‘scary’ it is that people can name guns by model I thought to myself.. “This guy is probably from Australia.”
    So people shouldn’t be observant?
    Australia is also known as the land of banning things unnecessarily and being generally pussified. They make some pretty good wines though.

    It’s also fitting to note that America’s Army is, like someone said, pitifully low on the level of realism.
    Operation Flashpoint, ArmA and ArmA:2 are military simulators…
    America’s Army is just a shitty FPS.

  6. jonondaspot

    RSamples. No one in the Army is looking at AA anymore. It was used some in ATL but that has pretty much died off. Crytek isn’t being investigated either. Not by PEOSTRI and not by NSC, nor the newly christened TCM Gaming under TRADOC. The only people looking at that right now (Crytek) are the SF guys. Real World from DARPA is under consideration but USSOCOM has pretty much written that program off too though a few people there are going to sink their careers trying to prop up that debacle.

    The US Army just bought VBS2 and are still in the Phase 1 fielding stage. They aren’t looking at anything else right now and certainly they have written AA off completely. The weakness of AA along with most other products comes down to one fact. Terrain. You have to be able to do terrain conversion of DTED and SAT data. AA can’t and even worse it does not use real World Relative lighting because such a feature was never built into the Unreal engine. It isn’t going to happen either because that would involve yanking out the entire lighting engine of Unreal 3 and unlike other engines with U3 it would be cheaper and more common sense to write an engine from scratch. Crytek just opened a studio in Orlando to court DoD but so far DoD isn’t listening because Crytek, despite using the Harrington Group, has zero understanding of how the US Military works.

    VBS2 Does HLA compliance thanks to Calytrix. Interfacing with OneSAF and JCATS as well as C2 – BFT – has been very smooth.

    Bohemia plans to infuse VBS2 with the Arma 2 engine as well. You should be aware that right now the ballistics model in VBS2 is far superior to AA. As regards Crytek, their big fault is their inability to render large terrain areas. They are trying to develop streaming technology but so far they are still on the drawing board. The TRADOC commander wanted the 15 sims gone and for the US Army to focus on a single product. That is what they are doing and the reason why locations such as Ft Hood run all of their convoy simulators provided by Lasershot with VBS2 as the enclosed sim.

    VBS2 isn’t a super comprehensive solution but you are going to see the Army use it more exclusively for the next several years. It beats anything else out there because of its RTE and AAR capability; that alone completely buries Crysis and Unreal. And as I said, forget Unreal because its design inherently causes major issues with real world/real time lighting in terrain conversion. Organizations such as TCOIC are in full swing production using VBS2.

  7. @BahDog

    I wasn’t saying that it was really a bad thing, I was just saying that most people wouldn’t have had that same knowledge years ago.

    And really, if it bothered me that much, would I have written a feature on the subject? :)

  8. This is a very insightful piece. Awesome article James.

  9. MBarnett

    jonondaspot, not sure where you are getting your info but, with all due respect sir, your assertions are just incorrect. As an engineer for the Army, I can tell you we are currently using AA for numerous government applications, so the Army is definitely still looking at AA, VBS2, Real World, Crytek, Evolution, and other 3D engines as well. You mention the ATL (Adaptive Thinking and Leadership) application. This was one of the very first AA applications developed back in 2002 or 2003 and an excellent success story for training non-kinetic soft skills at the JFK Special Warfare Center. Since then, AA has been used for numerous projects for education, simulation, training, virtual equipment & weapons prototyping, and outreach. We have also successfully imported geospecific terrain from sources like DEM, DTED, GeoTiff and LADAR data for AA and even highly modified the lighting model to address dynamic light sources, time of day and night vision. This was all possible because the Army has the full source code for the Unreal Engine, and numerous commercial companies specializing in various disciplines have developed middleware technologies for Unreal in Artificial Intelligence, lighting, UI, physics, outdoor environments, terrain paging, inverse kinematics, human & vehicle simulation and others. We are able to readily acquire such technologies from the experts and later swap them out as something better comes along. Even the render engine is swappable. We are also developing with VBS2, which has a good set of out-of-the-box features and editing capability but not full source code access yet. Hopefully that will change in the near future so we can harness the full potential. I’m glad to be using both and look forward to what the industry will yield in the near future. The Army will continue to be flexible and use the right solution for the right task. Hope this helps. Thanks to James for the article and to everyone for your comments.
