Kevin Frazier on Lawfare talks about a new right – the right to reality:
New technologies pose new risks that require new rights. The right to privacy emerged when the camera made private affairs public. The right to be forgotten took root when data shared online for a specific purpose for a finite time became a permanent part of social history. Now, with the spread and evolution of artificial intelligence (AI), there is a need for a right to reality—broadly, a right to unaltered or “organic” content.
He sees social media content as categorizable, from
Class 1: Content would have no or negligible risks of having been created, altered, or informed (that is, based on AI-led research) by AI tools. This class would constitute “organic” content, which is written by humans based on research conducted by humans.
to
Class 4: Content would have a high risk of having been created or altered by AI tools. Both classes 3 and 4 would qualify as “artificial” content.
and that social media platforms should clearly label all entries using one of these categories.
Social media platforms and certain publishers will protest this proposal if it gains traction, but it’s worthwhile because we’re not likely to ever see an article with a byline of “AI Content Generator.” That is, bylines, or authors, are part of the message of any article: they provide context to the attentive reader concerning biases (stated and unstated), assumptions, ideologies, and several other factors.
It’ll be interesting to see whether this right gains recognition over the next five years.