Washington, D.C. To combat the growing abuse of AI-generated deepfake technology, Senators Maria Cantwell (D-WA), Marsha Blackburn (R-TN), and Martin Heinrich (D-NM) have introduced new legislation. The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) aims to establish a consensus standard for watermarking AI-generated content.
The COPIED Act seeks to give creators such as musicians, journalists, and artists greater control over their digital works by making the origin of that content transparent. As concerns grow about AI's potential to alter and exploit content without giving credit where it is due, Senator Cantwell said the bill will address those concerns directly.
“This legislation will ensure much-needed transparency into AI-generated content and restore control to creators,” she said.
Regulatory and Ethical Implications of the COPIED Act
If the COPIED Act becomes law, AI service providers such as OpenAI would have to let users attach provenance details to the content they create with AI. That provenance information would be recorded in a machine-readable format and could not be removed or altered, a step intended to make those who generate content more accountable.
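The core technical idea here is content provenance: binding a machine-readable record of origin to a piece of content so that removal or tampering is detectable. The Python sketch below illustrates that general pattern only; the COPIED Act leaves the actual format to standards bodies, and the manifest fields, the attach_provenance and verify_provenance helpers, and the HMAC signing key are all hypothetical.

```python
# Illustrative sketch of machine-readable provenance, NOT the format the
# COPIED Act mandates or any existing standard such as C2PA.
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key-for-illustration-only"  # hypothetical shared secret

def attach_provenance(content: bytes, tool: str, creator: str) -> dict:
    """Build a provenance manifest bound to the content by a hash,
    then sign it so later edits or removal are detectable."""
    manifest = {
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "generator": tool,      # e.g., the AI service that produced the work
        "creator": creator,     # whoever is credited with the work
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SIGNING_KEY, payload, "sha256").hexdigest()
    return manifest

def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Return True only if the content matches the manifest and the
    manifest itself has not been tampered with."""
    claimed_sig = manifest.get("signature", "")
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, "sha256").hexdigest()
    if not hmac.compare_digest(claimed_sig, expected):
        return False  # manifest was altered
    return unsigned["content_sha256"] == hashlib.sha256(content).hexdigest()

if __name__ == "__main__":
    work = b"AI-generated image bytes..."
    record = attach_provenance(work, tool="SomeAIService", creator="Jane Doe")
    print(verify_provenance(work, record))          # True: record intact
    print(verify_provenance(work + b"x", record))   # False: content changed
```

In a real provenance scheme, the signature would come from a public-key credential tied to the tool or creator rather than a shared secret, so anyone could verify the record without being able to forge it.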
The Federal Trade Commission (FTC) would be responsible for enforcing the COPIED Act, treating violations as unfair or deceptive business practices under its existing rules. Deepfake technology is reported to have made scams and fraud easier, with the number of incidents growing by 245% in recent years, and this federal oversight is meant to curb that trend.
Senator Blackburn stressed that the bill needs to pass urgently because malicious actors have been impersonating public figures to deceive and defraud victims, one example of how deepfakes are being used for fraud.
“Artificial intelligence has enabled malicious actors to create deepfakes with alarming accuracy, exploiting the likeness of individuals without their consent,” said Blackburn.
Around the time the bill was introduced, reports surfaced of people losing large sums to deepfake scams, with losses projected to exceed $10 billion by 2025. Recent events in the cryptocurrency world show how damaging the technology can be: con artists have impersonated prominent figures in the space to pull off major scams.
Some observers question whether the safeguards the tech industry has in place today are adequate to address these problems. Michael Marcotte, who heads the National Cybersecurity Center (NCC), argues that Google does not do enough to protect against crypto-focused deepfake attacks.
Still, views differ on the ethical implications of AI, and lawmakers weighing the COPIED Act will likely debate whether it does enough to keep digital content and its users safe. Even so, the bill marks a significant step toward protecting creators' rights in an increasingly digital world and toward reining in the misuse of deepfake technology.