Pentagon developing artificial intelligence that can make crucial life-or-death decisions in battle
03/31/2022 // Arsenio Toledo

The Department of Defense is trying to create an artificial intelligence program that can make life-or-death decisions on the battlefield.

The new program, called In The Moment, is being led by the Defense Advanced Research Projects Agency (DARPA), the innovation arm of the military. Its goal is to let an AI program make the difficult decisions that military strategists are forced to grapple with regularly. (Related: US military experimenting with artificial intelligence that can predict the future.)

DARPA wants In The Moment to make quick decisions in stressful situations using algorithms and data. Proponents of the program believe that removing human biases may save lives.

In the hypothetical situations where DARPA believes In The Moment could be used, military commanders might disagree over what to do. One scenario involves deciding who gets priority care following a mass casualty event like a terrorist attack.

"Two seasoned military leaders facing the same scenario on the battlefield, for example, may make different tactical decisions when faced with difficult options," wrote DARPA in a press release. "As AI systems become more advanced in teaming with humans, building appropriate human trust in the AI's abilities to make sound decisions is vital."

DARPA explained that for In The Moment to be trusted to make good decisions, it must capture the key characteristics of expert human decision-making and represent them in an algorithm.

During the development of In The Moment, DARPA also wants to devise decision-maker characterization techniques and create a quantitative alignment score between an algorithm and a human decision-maker.

"We're going to collect the decisions, the responses from each of those decision-makers and present those in a blinded fashion to multiple triage professionals," said program manager Matt Turek. "Those triage professionals won't know whether the response comes from an aligned algorithm or a baseline algorithm or from a human."

Questions remain about whether AI should be given power over life and death

The In The Moment program is still in its early stages of development, but its creation comes as many other countries are figuring out how to use modern technology like artificial intelligence to update antiquated military systems, such as the centuries-old practice of battlefield triage, which is prone to human error.

But ethicists and AI experts are concerned about leaving such life-or-death decisions in the hands of a machine, and they question whether AI should be involved in these kinds of situations at all.

"AI is great at counting things," said Sally Applin, an anthropologist, researcher and expert on the intersection between algorithms, AI and ethics. "But I think it could set a precedent by which the decision for someone's life is put in the hands of a machine."

Other ethicists question how the In The Moment program would function in practice. In the case of a program that handles triage, they ask how it would choose which soldiers get prioritized for care over others, and who would be to blame if somebody who could have been saved died anyway.

Peter Asaro, a philosopher specializing in robotics and artificial intelligence at The New School, commented that military officials will need to decide how much responsibility to give to an algorithm that makes triage decisions. They must also figure out how to handle ethical questions, such as who gets priority care.

"That's a values call," said Asaro. "That's something you can tell the machine to prioritize in certain ways, but the machine isn't gonna figure that out."

Applin also called for additional testing to make sure DARPA's new program does not perpetuate biased decision-making in its algorithm, which could, for example, decide that White patients should be prioritized for care over Black ones.

"We know there's bias in AI. We know that programmers can't foresee every situation. We know that AI is not social. We know that AI is not cultural," said Applin. "It can't think about this stuff" but people might unconsciously make these decisions for the program when developing its algorithm.

MilitaryTechnology.news has more articles regarding the latest innovations in military technology and artificial intelligence.

Watch this video from "Anastasi in Tech," in which host Anastasiia talks about the future of AI.

This video is from the High Hopes channel on Brighteon.com.

More related stories:

China deploys KILLER ROBOTS to its contested border with India.

Humans are no match for artificial intelligence – "It's not even close," says Nobel laureate.

EU proposing legislation to restrict facial recognition tech and "high-risk" artificial intelligence applications.

French Army testing Boston Dynamics' robot dog in combat scenarios.

Next-gen warfare: DARPA tests "drone swarms" that will be operated by artificial intelligence, not human beings.

Sources include:

WND.com

GreenwichTime.com

TheDefensePost.com

Brighteon.com


