Lawmakers in the U.S. state of Oregon have turned their sights on political actors abusing
artificial intelligence (AI), pledging to roll out stringent legislation in the coming weeks.

The lawmakers are focused on AI misuse in political campaigns, warning that voters may be swayed by false AI-generated content. Led by Sen. Aaron Woods and Rep. Daniel Nguyen, they argue that increasing AI adoption muddies the terrain for fair political campaigns.

Lawmakers are deliberating over two bills to regulate AI use in the political sphere. The first, Senate Bill 1571, requires all political campaigns to disclose AI-generated content in advertisements and other communications.

The bill does not require disclosure when AI is used for simple photo edits and other benign purposes. However, failing to disclose AI-generated content deemed capable of swaying voters may cost violators fines of up to $10,000, in addition to other penalties.

“It’s increasingly difficult to identify what’s real and what is generated by AI,” said Nguyen. “You can be voice duplicated, and then what’s being said is not really what you said.”

The second bill, submitted before the House, backs the creation of a task force to explore the long-term impacts of AI in Oregon. If the bill becomes law, the 14-member task force will lead Oregon’s push into AI, measuring the emerging technology’s effects on education, manufacturing, and mass media.

Both bills have cleared the legislative process and require Gov. Tina Kotek’s signature to become law.

As the country marches toward the 2024 elections, the U.S. Federal Election Commission (FEC) is tightening the screws on AI use in campaigns via “full rulemaking.” The FEC’s rules are expected to lay the foundation for state legislators seeking to introduce AI rules for political actors in their local jurisdictions.

“The technology will almost certainly create the opportunity for political actors to deploy it to deceive voters[,] in ways that extend well beyond any First Amendment protections for political expression, opinion or satire,” read one petition to the FEC in mid-2023.

Bracing for widespread change

Alongside the FEC’s efforts, AI developers are introducing guardrails to prevent misuse of the technology in the run-up to the general elections. Meta (NASDAQ: META) has gone a step further, imposing a blanket ban on political advertisers using its generative AI tools in a bid to curb misinformation.

Google’s new rules require advertisers to clearly label AI-generated content in political ads unless the AI use is inconsequential. U.S. authorities appear confident in rolling out comprehensive AI rules for political campaigns, much as they did with their well-known regulations on digital asset donations.


Originally published by SmartLedger: https://smartledger.solutions/oregon-lawmakers-crack-down-on-ai-misuse-in-political-campaigns/

