China and Russia are exchanging ideas on the military use of artificial intelligence (AI) amid a global race among several nations to develop the technology.

According to a South China Morning Post report, the two nations compared assessments of military use cases for AI, with a focus on safety and international best practices. The discussions covered ethical implications and the bearing of existing international conventions on deploying AI for military purposes.

Chinese and Russian negotiators pledged to abide by the guidelines of the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), convened under the Convention on Certain Conventional Weapons, sometimes called the Inhumane Weapons Convention.

By the end of the deliberations, both countries aim to be on the same path in developing domestic AI regulations. Guided by similar ethics, top Russian and Chinese military brass have begun incorporating AI into surveillance and other decision-support systems employed by their armed forces.

“The meeting confirmed the closeness of the Russian and Chinese approaches to this issue,” said a Russian spokesperson. “It was noted that there is a need for further close cooperation in this area both in the bilateral format and in the relevant multilateral platforms, primarily within the framework of the GGE on LAWS.”

Despite the similarities in ethical direction, China has taken a hard stance against AI-based autonomous weapons systems, while Russia and the U.S. have opted to double their investments in the field.

Common denominators can be gleaned from the Chinese and Russian approaches toward AI use for military purposes. Both parties are acutely aware of the risks posed by AI systems, pledging to adopt “a prudent and responsible attitude” toward their military deployment and integration with other technologies.

Rather than develop military applications in silos, both China and Russia are keen to onboard more nations to their Global AI Governance Initiative, though current geopolitical tensions may slow the negotiation process.

AI risks

In 2023, the United Nations (UN) Security Council described the risks posed by AI as akin to the threat of nuclear warheads, urging state actors to impose proper guardrails for their development.

The UN warned that AI misuse can potentially spark regional conflicts via misinformation from deepfakes. Despite the grim warnings, several nations, including Russia, China, the United Kingdom, the United Arab Emirates, and Saudi Arabia, are bolstering their GPU supplies amid the so-called AI race.

For artificial intelligence (AI) to operate within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership—keeping data safe while also guaranteeing its immutability. Check out CoinGeek’s coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.


Thank you for engaging with us at SmartLedger through 'China, Russia collaborate to explore safe military use of AI' - https://smartledger.solutions/china-russia-collaborate-to-explore-safe-military-use-of-ai/. We hope you found the insights valuable.

