Artificial Intelligence Can Free Imagery Analysts to Focus More on the Unknown
Analysts at the National Geospatial-Intelligence Agency may spend as much as half their time poring over satellite imagery of activities they're already familiar with, taking place in locations where everybody is already looking.
Their time could be better spent doing other things, said Susan Kalweit, director of analysis at NGA.
During an artificial intelligence-themed panel discussion here Thursday, Kalweit said human analysts could be freed up by AI to spend more time doing the cognitively difficult work of identifying and determining the significance of unfamiliar activities that take place in unfamiliar locations.
"That's the discovery piece," Kalweit said. "That's where you want to anticipate where, when and how will Russia go into central Europe. That's a key question around anticipatory intelligence. And unfortunately, because we spend so much of our time at the (other places) we have less than ten percent of our time to spend on those really key questions, the unknown/unknown, and the black swans -- trying to anticipate what's going to happen."
Kalweit divided analyst time into four groupings, based on known or unknown locations and known or unknown activities, and offered a Punnett square-like model as a way to visualize them. More than 90 percent of analyst time is spent in three of those quadrants, she said, while less than 10 percent goes to the "top-right" square, analyzing the unknown/unknown -- and that's the type of work best done by human analysts.
"[That's] where really you want the human brain to spend most of its time," she said. "And where machines augment that is in the hypothesis analysis: the highly cognitive analysis, providing alternatives of what this activity might be, being able to put together multiple signatures and looking at trends from multiple years in multiple locations over very different sorts of scenarios and giving us hypotheses on what might happen or might be happening."
Kalweit said her analyst workforce remains positive about machine augmentation in their work, "especially for monitoring the mundane" and "work that now takes time and is not cognitively challenging." That might include change detection or object identification, for instance.
Friction points, however, might arise in the future if AI encroaches on the work analysts value most and see themselves as best at.
What It Means
"Contextualization," she said. "What does it mean? The idea that a machine would be able to spit out 'this is what it means' is really where that friction lies. They want help with analysis of alternatives, to expand their thinking, to provide hypotheses that they can work with other analysts and their counterparts in testing, but not for the machine to spit that out."
Ironically, it may be the analysts themselves who help AI systems eventually supplant them. Kalweit has top-level visibility into the development of AI systems used by NGA and said the greatest advancements come when developers work side by side with those who use the systems.
"Where we have had absolute success in a very consistent way is when our industry partners are paired with our image scientists or our analysts and are doing the development in real time, together," she said. "So the true DevOps, true pair programming, has resulted in the greatest successes that we have had in this new world."