In early tests of its augmented reality training software, a Texas-based company helped first responders perform faster and more accurately.
When it comes to emergency response, organizations want training sessions to mimic real-life scenarios and take place far more often than the crises themselves.
But at emergency management agencies today, first responders receive much of their training in the classroom, and when they do test their skills in the field, the simulations are usually expensive, rehearsed and infrequent.
The government of Austin, Texas, recently found that virtual and augmented reality could offer an inexpensive way to increase the number of training sessions available to first responders and expose them to a wider array of situations they may face in the line of duty. Early tests also suggest the tech could improve responders' speed and accuracy when real-world disaster strikes.
The city last year partnered with Augmented Training Systems to build an augmented reality model of its AmBus, an oversized ambulance created for particularly devastating emergencies, and allowed responders to virtually explore the vehicle. Participants were later asked to locate particular items and rehearse various scenarios in the real AmBus.
People who trained on the digital model were 45 percent more accurate and nearly 30 percent faster in performing the real-life tasks than those who only received traditional classroom training, according to Texas State University Associate Professor Scott Smith, who also serves as Augmented Training Systems' chief executive. Today, Austin officials are finalizing a five-year contract with the company to annually train some 200 new responders and investing some $100,000 in developing similar programs for triage and hazmat scenarios, Smith told Nextgov.
The AmBus program would come at roughly half the price Austin pays annually for traditional training, he said.
According to Smith, the potential benefits of virtual reality training go far beyond cost savings.
“I think that's the basic premise of it—providing a context where people can perform … in a safe space where you can manipulate things, offer a number of variations of learning, offer a number of feedback options,” said Smith. “[Disaster] events are not going to slow down, and we really need to become more prepared for them. A PowerPoint presentation can't cover that.”
Beyond its work with Austin, Augmented Training Systems is developing digital training software for construction zones and active shooter situations, but Smith said the tech could potentially be used to train for wildfires, hurricanes and other disaster scenarios.
On the federal level, he said the Homeland Security Department could use the tech to bring a more standardized, data-driven approach to its myriad emergency response efforts.
Today, he said, the agency’s strategies are largely informed by data collected through after-action reports, but that information can often be skewed by the biases that come with self-reported data. Through simulated trainings, officials can explore how people are most prone to respond in certain situations and create new best practices and strategies accordingly.
“Let's say over 50 times, people respond in a certain way,” he said. You can “create procedures around best practices in that space … whereas now we're waiting for a [live] event to occur to make policy.”
Still, the tech wouldn’t entirely replace more theory-heavy training programs, according to Smith. There are still benefits to learning in a classroom environment and putting those skills into practice in real-world demonstrations, he said.
“I believe we can equip and understand more about what a first responder needs in those spaces with this type of technology,” he said. “I think VR, AR and real-time data processing [are] only meant to enhance current functioning.”
On March 10, Smith will present the tech to a panel of mayors in a $10,000 pitch competition at SXSW.
Jack Corrigan is a Staff Correspondent at Nextgov, which originally published this article.