Last month, during the Military AI Literacy Seminar for the US Department of Defense, Microsoft and OpenAI demonstrated their offerings, intensifying the debate over what military applications AI technology is likely to have. The event, held in October 2023, featured product demonstrations focused on potential Pentagon applications, including OpenAI's image generator DALL-E.
Microsoft’s proposal for military integration
The centerpiece of Microsoft's presentation was OpenAI's image generator DALL-E, pitched as a tool that could be integrated into next-generation battle management systems. These comprehensive systems, critical for military planning, give commanders an overview of the entire battlefield from a single position, helping them coordinate and control movements, select targets for artillery and airstrikes, and more. Microsoft's proposal envisioned using DALL-E to improve command-and-control software and make it more efficient and effective.
OpenAI, on the other hand, distanced itself from the presentation, in contrast to Microsoft. The organization stated that it did not take part in the seminar and did not pitch any product to the Department of Defense. OpenAI spokesperson Liz Bourgeois told the press that the company's policies prohibit the use of its tools to develop or deploy weapons or to destroy property or harm people. Even after Microsoft's pitch came to light, Bourgeois maintained that OpenAI had not entered into any collaboration with defense agencies to apply its technology for military purposes.
Clarifications from Microsoft
One of Microsoft's presentation materials suggested that the Pentagon could use DALL-E to make military training and simulation software more realistic. The company has since downplayed this idea, stressing that the plan was never put into practice. Microsoft explained that the demonstrations were hypothetical use cases drawn from conversations with its customers about how they might apply the new technology. The company added that any Pentagon use of OpenAI tools provided through Microsoft would require clearly defined rules of use and controls agreed upon by both parties.
A recent update to OpenAI's usage policies removed the explicit prohibition on using its technology for "military and warfare" purposes. The change further muddied the ethical waters around AI in military applications. OpenAI remains firm in its objection to using its tools for harmful activities, but the removal of the explicit ban leaves open the question of what restrictions and conditions now govern the application of its AI to defense.
Ongoing discussions and ethical considerations
The disclosure of Microsoft's pitch, together with OpenAI's explanation of its ethical standards, has reshaped the conversation in the tech industry and among policymakers, who are now grappling with where the ethical boundaries of AI technology should lie. Integrating AI into military operations raises morally complex questions as well as legal ones, and demands more stringent procedures for the development and deployment of AI systems.
As AI capabilities advance, the ethical stakes rise with them, especially in sectors critical to national security. The Microsoft and OpenAI presentations at the US Department of Defense seminar underscore the growing push to adopt AI technology in support of military operations. While there are potential benefits, there are also serious concerns about the consequences of these applications and the ethical issues this integration raises. These concerns deserve continued scrutiny as AI is incorporated into military operations.
This article originally appeared on Yahoo.