The DEEPFAKES Accountability Act

July 23, 2019   Robert Gmeiner

  • Property, Markets & Trade
  • Blog

Representative Yvette Clarke (D-N.Y.) recently introduced the Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act to address the much-feared problem of deepfakes: videos that realistically show someone saying or doing things that were never said or done.

The bill, if it becomes law, will require anyone who produces a deepfake to disclose that it is not real by using “irremovable digital watermarks, as well as textual descriptions.” Failing to do so will be a crime. Beyond criminal penalties, victims will have a right to sue to “vindicate their reputations.” The bill also brings deepfakes within the scope of existing unlawful impersonation statutes. Obviously, those who create deepfakes with malicious intent will probably not follow the rules, and as deepfakes become harder to detect, it will become harder for victims to “vindicate their reputations.”

TechCrunch has pointed out that the bill still serves a purpose because it at least defines this behavior as a crime. If it becomes law, victims will not need to argue that malicious deepfakes are outlawed by some other statute.

This bill provokes thought about property in oneself. If one’s likeness and reputation are property rights, why should it be permissible to produce a deepfake at all, even a watermarked one? Even when something is clearly a deepfake, whether because it is watermarked or of such bad quality that it cannot possibly be real, could it still damage a reputation? Nancy Pelosi did not much like the recent fake video in which she appeared drunk with slurred speech, and Facebook claimed it did not violate the platform’s rules. Would posting that video be a problem under this bill? And what if the watermark is faint enough that it cannot be seen in a thumbnail-sized playback on a website? Could the written disclaimer say something like “Image for this video is based on ____ event and altered or edited for editorial purposes,” or something even murkier?

It is good that this bill defines illegal behavior in statute, so we need not wait for court decisions that make rules for everyone based on a single case with no legislative input. The problem is that it ignores the fact that a deepfake can inflict damage even when its creator complies with the law. If your reputation is yours, then the real question is whether deepfakes should be legal in the first place. The far more pressing problem for now, though, is perfecting the ability to detect deepfakes and making that capability at least as widespread as the capability to produce them.