According to a June 14 report by the English-language Chinese paper Global Times, the AI pilots were deployed to help human pilots hone their decision-making and combat skills. It added that beyond pilot training, AI will become an integral part of future Chinese aircraft, helping pilots make combat decisions.
The AI pilots also learn from each engagement – and have already outperformed their human opponents. In one instance, PLA Air Force flight brigade team leader and fighter ace Fang Guoyu was shot down by one of the AI pilots, according to the Chinese military's official newspaper, the PLA Daily.
Fang showed off his flying skills and won the first round of combat, but the AI turned the tables in the next round, using the ace's own technique against him to win.
The ace pilot explained that while the AI-piloted aircraft was easy to defeat in the early stages of training, it eventually learned from its human opponent. "The move with which you defeated it today will be in its hands tomorrow," he said. Fang compared the AI pilots to the extraordinary human pilots who win the Golden Helmet air combat contests in China. He said: "It's like a digital 'Golden Helmet' pilot that excels at learning, assimilating, reviewing and researching."
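The PLA Daily does not describe how the AI pilot is actually trained, but the behavior Fang describes – losing early, then reusing the maneuver that beat it – is characteristic of reinforcement-style learning from experience. The toy Python sketch below is purely illustrative: the maneuver names, the scoring rules and the "BEATS" table are all invented, and it only shows the basic loop of reinforcing moves that worked and copying the opponent's move after a loss.

```python
# Purely illustrative sketch of a "learning adversary": it reinforces its own
# successful maneuvers and copies the opponent's move whenever it loses.
# Maneuver names, reward values and the BEATS table are invented; this is not
# the PLA's actual training algorithm.
import random
from collections import defaultdict

MANEUVERS = ["high_yo_yo", "barrel_roll", "split_s", "scissors"]
# Toy rule set: BEATS[x] is the maneuver that x defeats.
BEATS = {"high_yo_yo": "split_s", "barrel_roll": "high_yo_yo",
         "split_s": "scissors", "scissors": "barrel_roll"}

class LearningAdversary:
    def __init__(self):
        # Running score for each maneuver, updated from experience.
        self.value = defaultdict(float)

    def choose(self):
        # Explore early, then favor maneuvers that have worked before.
        if random.random() < 0.2 or not self.value:
            return random.choice(MANEUVERS)
        return max(self.value, key=self.value.get)

    def learn(self, own_move, opponent_move, won):
        # Reinforce our own successful moves...
        self.value[own_move] += 1.0 if won else -0.5
        # ...and copy the opponent's move when it beat us:
        # "the move with which you defeated it today will be in its hands tomorrow."
        if not won:
            self.value[opponent_move] += 1.0

ai = LearningAdversary()
human_favorite = "high_yo_yo"          # the human ace's signature maneuver (assumed)
for round_no in range(1, 6):
    ai_move = ai.choose()
    ai_won = BEATS.get(ai_move) == human_favorite
    ai.learn(ai_move, human_favorite, ai_won)
    print(f"round {round_no}: AI flew {ai_move}, {'AI' if ai_won else 'human'} won")
```

Under these toy rules the agent quickly converges on whatever counters the human's favorite move – a crude stand-in for the "learning, assimilating, reviewing and researching" Fang describes.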
Meanwhile, brigade commander Du Jianfeng told the PLA Daily that AI systems are increasingly being integrated into training, and praised the AI's flying and tactical decision-making. "The AI has shown adept flight control skills and errorless tactical decisions, making it a valuable opponent to hone our capabilities," Du said.
The brigade commander added that the AI pilots serve to "sharpen the sword" of Chinese fighter pilots, as they force them to become more innovative with their techniques. (Related: Elon Musk warns: 95 percent chance artificial intelligence exterminates humanity.)
Not to be outdone, the U.S. has also delved into the use of AI for military purposes. In January 2020, the Defense Advanced Research Projects Agency (DARPA) released footage of its experimental Offensive Swarm-Enabled Tactics (OFFSET) program, which used large swarms of drones to find targets and gather intelligence in urban raid missions.
The OFFSET test was conducted at the Camp Shelby Joint Forces Training Center near Hattiesburg, Mississippi. It involved a coordinated group of 250 autonomous air and ground vehicles tasked with finding certain QR codes in the training facility, which was designed to approximate a city block. (Related: Next-gen warfare: DARPA tests "drone swarms" that will be operated by artificial intelligence, not human beings.)
The swarm of air and ground vehicles divided itself into several groups, each with a different tactical assignment. Some entered buildings to find the QR codes, while others stationed themselves at strategic points to watch for threats. Still others patrolled the terrain to create a full three-dimensional map of the environment.
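DARPA has not published the task-allocation code behind OFFSET, but the division of labor described above – searchers, overwatch and mappers – can be illustrated with a minimal role-assignment sketch. Everything in it (the Vehicle class, the role names and the one-in-four overwatch split) is an assumption made for illustration, not DARPA's actual software.

```python
# Minimal, illustrative role-partitioning sketch for a mixed air/ground swarm.
# Role names, split ratios and data structures are assumptions, not OFFSET code.
from dataclasses import dataclass
from typing import List

@dataclass
class Vehicle:
    vehicle_id: int
    kind: str            # "air" or "ground"
    role: str = "idle"

def assign_roles(swarm: List[Vehicle]) -> None:
    """Partition the swarm: ground vehicles map the terrain, a share of the
    air vehicles holds overwatch positions, and the rest search for QR codes."""
    air = [v for v in swarm if v.kind == "air"]
    ground = [v for v in swarm if v.kind == "ground"]
    for v in ground:
        v.role = "map_terrain"
    overwatch_count = max(1, len(air) // 4)   # assumed 1:3 overwatch/search split
    for v in air[:overwatch_count]:
        v.role = "overwatch"
    for v in air[overwatch_count:]:
        v.role = "search_qr"

# Example: a small 12-vehicle swarm (the actual test used about 250 vehicles).
swarm = [Vehicle(i, "air" if i % 3 else "ground") for i in range(12)]
assign_roles(swarm)
for v in swarm:
    print(v.vehicle_id, v.kind, v.role)
```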
While the drones operated autonomously, they fed live footage to human operators, who observed them via a laptop interface. The operators could also use an augmented reality headset to interact with a live digital map of the environment built from the data the drones collected.
The January 2020 OFFSET test was the third of six planned tests meant to ensure the reliability of both the drones and the tracking software. Defense contractors Northrop Grumman and Raytheon, alongside Case Western Reserve University and Northwestern University, developed the drones and the systems used to operate them in the OFFSET program.
Visit MilitaryTech.news to read more about countries utilizing AI for military purposes.
Sources include: