The US military needs more advanced generative AI tools to keep pace with Russia and China in the realm of online influence and information warfare, a Pentagon-backed study has revealed.
Generative AI could give US forces a critical edge in influence campaigns, an area where rivals are already operating at a scale the Pentagon struggles to match, according to research published by California-based nonprofit policy think tank RAND Corporation.
It emphasized that America could fall even further behind if it fails to adopt the tech at scale.
To assess current efforts, RAND consulted a small group of subject-matter experts, industry leaders, and other government researchers. It also hosted a workshop with influence-focused units to identify their operational and tactical needs for AI.
Funding, Coordination Needed
The study found that the Pentagon must overcome a serious lack of investment and coordination to stay competitive. Stronger collaboration among stakeholders could bolster tech procurement and long-term sustainment, it said.
Moreover, the paper said buying and fielding generative AI will require a smarter, more flexible approach, along with a plan to keep those tools running across joint and mission-specific teams.
“Already, multiple organizations are acquiring duplicative tools, leading to redundancies in investments,” RAND said.
“The cost of sustainment activities over the life cycle (routine maintenance, upgrades for improved capability, changes for interoperability) makes this need for coordination even more imperative to meet.”
A Medium, Not an Answer
The study highlighted that generative AI is a tool, not a standalone solution, for addressing challenges in influence operations, from planning and analysis to measuring impact.
While influence ops are often associated with multimedia messaging, RAND said AI’s true potential lies in supporting campaign planning, decision-making, and real-time assessments.
But there’s still no clear, department-wide strategy for how generative AI should be used in influence ops and information warfare, or what risks and opportunities it brings, according to the study.
“Limited guidance on differentiating between the need for highly customized generative AI solutions and more broadly applicable commercial off-the-shelf alternatives confounds both vendors and [Department of Defense] acquisition officials,” RAND said.
“There are no standardized reassessment criteria for generative AI tools being used in influence. Users note that vendors are not incentivized to continue developing generative AI products after initial acquisition.”
What Comes Next
The authors urged leaders in influence ops to press forward. That includes defining what they need, increasing funding, streamlining workflows, and teaming up with Pentagon-wide AI offices to tap into shared systems and resources.
They also recommended selecting the right teams to put these ideas into action.
Finally, the Office of Information Operations Policy and the Principal Information Operations Advisor are encouraged to take the lead in establishing standards, offering training, and setting rules for using AI-generated content responsibly in influence operations.