
Owning your own information revisited

February 18, 2024

In 2020 I suggested an amendment to the U.S. Constitution that would declare that each individual owns his or her own image and information. Anyone wanting to use that image or information would need to get permission and perhaps even pay for it. I pointed out that it’s not such a radical idea since celebrities already do own their own identities and people who use them without permission and/or payment are subject to lawsuit.

Fast forward to today. The U.S. Federal Trade Commission (FTC) has now proposed a regulation that would hold accountable those using technology to impersonate others. The FTC said in a news release:

The Commission is also seeking comment on whether the revised rule should declare it unlawful for a firm, such as an AI [artificial intelligence] platform that creates images, video, or text, to provide goods or services that they know or have reason to know is being used to harm consumers through impersonation.

The FTC just approved a rule that protects businesses and government agencies from such impersonation harms. The proposed regulation would extend these protections to individuals.

A recent robocall to New Hampshire voters before that state's presidential primary used an AI-generated voice impersonating President Joe Biden and telling people NOT to vote in the primary. The case is being investigated for possible election law violations.

At the time I wrote my previous piece, the idea of AI-generated fake voices wasn’t in the air. It seems that in such a world it would be wise not only to assert ownership of our own images and information, but also of our own voices.

The FTC rule is actually only a small step forward in protecting the identity and reputation of each of us. That's because our identity is multifaceted. It's not just our physical attributes of body, face and voice. It is the whole tapestry of our personal history including our relationships with others, our work experience, our places of residence and even our trips.

It is reasonable that those we work with and those we buy from (using means other than cash) need to have access to certain information, largely to determine if we are who we say we are. An impenetrable wall between the world and our information and image is impractical.

But, we ought to have much more say about who gets access to our information and image and what they can and cannot do with them. One of the best ways to monitor that would be to own our own information and make it available with permission or even sell it. We know that such data is extremely valuable to marketers who regularly gather it and also buy it from information brokers. Why shouldn't we get a share of that money if we so choose? Or decide to forgo remuneration if in a particular case we value our privacy more?

A new tool called Nightshade allows artists to "add invisible changes to the pixels in their art before they upload it online so that if it's scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways." Artists have been seeking such protection since it became apparent that AI platforms were scraping their art to train machine models to draw and paint in their styles, enabling users of those platforms to create images in the same style (without actually drawing or painting) and then claim authorship.
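To make the idea of "invisible changes to the pixels" concrete, here is a minimal Python sketch of pixel-level perturbation. This is emphatically not Nightshade's actual method, which crafts targeted, model-aware perturbations; it only illustrates the underlying principle that an image can be altered within a bound too small for the eye to notice while its raw pixel data changes everywhere. All function and variable names here are illustrative.

```python
import numpy as np

def perturb_image(pixels: np.ndarray, epsilon: float = 4.0, seed: int = 0) -> np.ndarray:
    """Add small, visually negligible noise to an 8-bit RGB image.

    Illustrative only: Nightshade uses carefully optimized, model-aware
    perturbations, not random noise. This sketch just shows that pixels
    can change (by at most `epsilon` levels out of 255) without the
    image looking any different to a human viewer.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=pixels.shape)
    # Clip to the valid 0-255 range before converting back to 8-bit.
    perturbed = np.clip(pixels.astype(np.float64) + noise, 0, 255)
    return perturbed.astype(np.uint8)

# A dummy 64x64 mid-gray "artwork" standing in for a real image file.
art = np.full((64, 64, 3), 128, dtype=np.uint8)
shaded = perturb_image(art)

# Every pixel moved by at most epsilon levels, far below what the eye notices,
# yet the raw data an AI scraper would ingest is no longer the original.
max_diff = np.abs(shaded.astype(int) - art.astype(int)).max()
```

The gap between this toy and the real tool is the whole trick: random noise like this is easy for a training pipeline to shrug off, whereas Nightshade's perturbations are optimized to mislead the model about what the image depicts.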

Writers face a similar dilemma but no similar solution.

What the rest of us need is some way to poison our personal data so that it becomes unusable to data brokers and their customers, all without undermining our ability to seek employment, make purchases and communicate freely with others. That would be a tall order. But I can't be the first person to think of this, which means I'm guessing there are people already working on it. The sooner they figure it out, the better.

Kurt Cobb

Kurt Cobb is a freelance writer and communications consultant who writes frequently about energy and environment. His work has appeared in The Christian Science Monitor, Common Dreams, Le Monde Diplomatique, Oilprice.com, OilVoice, TalkMarkets, Investing.com, Business Insider and many other places. He is the author of an oil-themed novel entitled Prelude and has a widely followed blog called Resource Insights. He is currently a fellow of the Arthur Morgan Institute for Community Solutions.

Tags: artificial intelligence, identity theft