Military contractor Engineering & Computer Simulations (ECS) has received a grant from the federal Small Business Innovation Research program—likely ranging from $500,000 to $1.5 million—to pilot a virtual reality training program for U.S. Army medics.

Founded in 1997, ECS is a Florida-based contractor that builds digital training and other tech solutions for the U.S. military.

The company today announced that it has received a ‘Phase II’ grant from the federal Small Business Innovation Research (SBIR) program, an annual fund of more than $3 billion which aims to support private companies in developing new technologies that both fit federal needs and show potential for commercialization.

The grant will support the development of a virtual reality training program for U.S. Army medics under the umbrella of the Army’s ‘Synthetic Training Environment’ (STE) program, which encompasses digital training of all sorts.


While ECS didn’t announce the amount of the grant, SBIR documents say that most Phase II grants range from $500,000 to $1.5 million. Phase I grants, which the company skipped, are smaller and focus on initial concept work, while Phase II focuses on building a functional prototype.

ECS says its VR medic training program is designed to integrate with the Army’s existing Tactical Combat Casualty Care (TCCC) procedures.

The VR training program will include “multi-player integration, instructor dashboard and analytics, STE integration, and a training effectiveness evaluation,” the company says.

Image courtesy ECS

ECS plans to partner with the Mayo Clinic to guide the medical aspects of the system and with HaptX to use the company’s advanced haptic gloves to increase the realism of the VR training program.

HaptX—which makes perhaps the most advanced haptic gloves presently available—has been positioning its product toward virtual reality training and other non-consumer applications. In late 2019 the company announced that it raised $12 million in funding to continue development of its gloves.

Image courtesy HaptX

Bulky but impressive, the HaptX gloves offer both force-feedback and detailed haptics. Together these allow the gloves to lock the wearer’s hand in a position that simulates the feedback of a physical object and to create convincing sensations of touch with an array of micro-pneumatic actuators across the palm and fingers. We most recently got to try the HaptX gloves in 2018.


ECS hopes that using the gloves will make its VR training program more effective by enhancing the realism of the simulation and allowing trainees to feel the sensations of holding virtual tools and interacting with patients.

For now the project is in the pilot phase, but if proven effective it could be rolled out widely for Army medic training in the future.


  • Ad

    XR firms need to divest from militaries, police, and intelligence services. We should be really concerned by this. This is just medics, but even a single step further would be unacceptable. This is getting close to the point where ignoring ethics and lines will get people killed.

    • Andrew Jakobs

      Why? Just because you don’t agree with it doesn’t mean they shouldn’t. Oh, you mean they are ignoring what YOU think good ethics are? Ethics are just in the eye of the beholder. Personally I’m all for it, especially in law enforcement, if it will make their job much more efficient and actually get the job done.
      So what YOU think is unacceptable, I don’t think it is.

      • Ad

        You’re being political, just for the other side. Ethics aren’t in the eye of the beholder when this technology is being wielded by states and armies. What does making policing more efficient even mean? It’s well documented by the Justice Department that newer tools in policing have been consistently used to extract illegitimate and often illegal fines from the populace, or to radically violate their privacy and rights. This isn’t you stating your preference, it’s you sanctioning the state to use new technology to harm because you either don’t know or don’t care what the consequences are.

        • Andrew Jakobs

          That’s exactly what ethics are, political. Ethics ARE in the eye of the beholder. States and armies are still part of our society; it’s not like they are something completely separate.
          New technology will ALWAYS be misused by people, that’s just the nature of people. State = people. And one example of VR making policing more efficient is its use in crime scene investigation. And I’m all for using stuff like facial recognition, as it will make it much more efficient to search for people in a large database, because facial recognition is nothing more than automating the manual labor of having to sift through all those photos/videos; the problem is how people deal with the result of such a search, and THAT’S what needs to be addressed.
          And as I said, ethics ARE in the eye of the beholder, as ethics are just some rules made up by some people who THINK they know what is right and what is wrong, and mostly those ‘ethics’ are created by a small number of people, not the masses (because you know, that small number of people think the masses are too dumb to actually be able to make decisions themselves).
          Just because I have a different opinion of how technology should be used doesn’t mean I’m unethical simply because you don’t agree with the way it’s being used. Maybe you should come down to earth, start being realistic, and see the world and people for what they really are, and not always only focus on the negative sides.

          • Jistuce

            While I disagree with Ad’s apparent argument that military purchase of video game controllers causes police impropriety, the law enforcement implementation of facial recognition seems to have blown past “automating existing database searches” and on towards “constructing an extensive automated tracking system for all citizens using those cameras we hung on the traffic lights”.

            Personally, I’m not comfortable living in a country where we look to the worst excesses of the old Soviet Union and say “hold my beer”. I don’t mind the police having facial recognition, but I want the cameras gone.

          • Ad

            Facial recognition is basically one of the best examples of how technology needs to be kept specifically out of the hands of police and militaries. As another example, doctors refuse to take part in executions, and across the board states have been completely unable to find doctors to help them kill people. It is weird to see you so confidently saying something that is completely untenable. And your contrast between the ethical elite and the majority who don’t care about ethics is a lie fostered by industry types.

    • asdf

      This is a tool that can increase the quality and speed of teaching others in almost anything. It has nothing to do with ethics or what it’s being used to train. It’s like saying rulers or calculators should be banned from the military because they use them as tools. Grow up, man.

      • Ad

        Or facial recognition and AI? Both of which countless campaigners and software engineers are calling to keep away from armies? Google just pulled its AI work with the Pentagon because its workers staged a massive walkout. Do you know better than them? Grow up, these aren’t toys.

        • Jistuce

          This isn’t a surveillance tool or a weapon. It is a training aid. What is the downside of VR training? In point of fact, it seems more likely to SAVE lives by getting soldiers better weapons training before they start handling real weapons than it is to get people killed by… I dunno, you didn’t specify how.

          Unless your entire argument is “teh army is evilz and can’t haz anythings”

          Also, I don’t think Google employees are actually the be-all and end-all of ethics debate, so “Google employees staged a massive walkout over military AI contracts” doesn’t really prove anything other than Google’s hiring process creates a company with fairly homogeneous political opinions (which is concerning for a company with that much control over communication).

          • Ad

            Training to do what? If there were a training aid that trained police to violate people’s rights in interrogations without getting caught, would that be okay?

            Also I’m sorry, but combat training does get people killed, that’s kind of the point? Tech does not have to and should not participate in that. And spare me the myths about Silicon Valley being liberal; isn’t this the company that hired Damore and only fired him after massive public backlash to his irrational manifesto and his inability to stop causing chaos at Google?

          • Jistuce

            You’re assuming that without VR equipment, this training won’t happen. The ability to purchase VR equipment has no impact on whether training will occur.

            I fail to see how “training with VR” is going to get people killed over the existing situation of training without VR.

            I’m with asdf. This is a training aid, it isn’t enabling any grand new evil.

            You seem to actually mean “all military and law enforcement purchases are inherently unethical”, in which case maybe you should be worrying about something more serious than video game controllers. Like the automated monitoring systems being implemented across the country that put the Soviet KGB to shame.

            If that isn’t what you mean, then please explain why training with VR is worse than the same training without VR.

          • Ad

            As another example, doctors refuse to take part in executions, and across the board states have been completely unable to find doctors to help them kill people, even though those executions come after due process (in theory). The XR industry should take a similar approach, rather than catering to and profiting off of war. The Air Force wants to make extensive use of XR, from training to controlling drones to visualizing the battlefield in real time. We have to say as a community and industry that we will not take part in that, and it starts now.

            Your standard of “it will happen anyway” also makes no sense. If it makes it much easier and cheaper, it will change and go in new directions, in addition to XR being used to train combatants to kill other human beings.

          • Jistuce

            So you have no argument for “VR military training is bad”, just an unfounded assertion that if training with VR works it will automatically mutate into something worse. And you are going to continue to distract from your lack of meat with irrelevant tangents. Gotcha.

            My standard is simple: There is no difference between training WITH fancy video game controllers and training WITHOUT fancy video game controllers, aside from an additional intermediate step before they start training with actual weapons and ammo. The potential effects I see here are a reduction in training accidents, and little more than that.
            Therefore, why NOT sell them fancy video game controllers?

          • Ad

            That isn’t how ethics works, and the doctors’ case was absolutely relevant. The executions happen anyway, so why don’t the doctors take part? You’re looking at this in a one-dimensional way. There is nothing unfounded about saying that VR training will proceed into something else. The Air Force is using VR to train mechanics, and it has specifically said that this is a “trajectory” that will lead into VR being used for command, control, and combat.

            Would you be fine with Chinese troops being trained with VR on how to subjugate Hong Kong, attack civilians without being caught, or identify democracy leaders on the ground and neutralize them? There could easily, like in the past, be serious issues with the training, like priming soldiers with realistic-looking scenarios to be hostile to populations on the ground. That is one difference, where training shows scary-looking brown people who are secretly insurgents, priming soldiers to overreact. That is not something XR companies can guarantee won’t happen. IBM is suspending its facial recognition research over similar concerns that it cannot control what is done with it and how dangerous it is. I chatted on Twitter with a VR dev working with the Army who said that he was focused on ethics and saving money only; then he started to become increasingly unhinged and said that nature is violent and ethics are irrelevant, and that he wants to do whatever is necessary to defeat the CCP. Training is not neutral, nor will this end there.

            One AR company already discussed how AR tech could be used to have soldiers surveil civilians and identify suspicious individuals much more efficiently, which would allow it to be scaled up. You don’t seem to understand that nothing remains the same if it is made easier, cheaper, or more efficient, and doing so to war has a very obvious result. Your standard doesn’t actually make sense in reality because it’s one-dimensional in a three-dimensional world.

          • CXF

            First, you’re operating completely on fear without fully grasping that this train has left the station. There is no stopping it. I’ve personally participated in helping to design a more efficient, more lethal, futuristic XR system for a specific military branch intended for the year 2030. Between now and then, many versions will be constructed, tested, and iterated upon. The military is so far ahead of where you think it is, it’s laughable. You don’t know how ignorant you are, which is by design. It’s none of our collective business where the state of the art is with regard to military defense (and offense). Our national safety depends on it. Everyone is entitled to their opinion. Yours, shared here, is so clearly based firmly in multi-faceted ignorance and blinding fear.

          • Ad

            That’s disgusting? Our national security is dependent on a technological advantage over “peer competitors” who spend a fraction of what we do? We have an army that has been built for peer-competitor wars for the last seven decades, and it’s resulted in us losing in Vietnam and Afghanistan, limping out of Iraq, using it in illegal and dangerous ways with plenty of blowback, instigating chaos around the world, and causing a breakdown in international cooperation. Not to mention the military is sapping the blood of this country with its absolutely absurd budget. This isn’t blind fear; it’s the fact that this industry needs to swear off participating in the machinery of death, actually learn the lessons of the tech industry’s many mistakes here and in China, and not be seduced by lazy arguments about national security that you for some reason find convincing. The next generation will live in the chaos you help make because of your irrational fear of some enemy hiding in the shadows.

    • kontis

      I hope you are not a hypocrite, so when your country is invaded you will happily accept invaders with a peace sign and love.

      • Ad

        That’s like saying I should support rebuilding our nuclear program because I don’t want to be murdered. Also, no nuclear-armed country can be invaded on a large scale.