Opinion: OpenAI’s DALL-E 2 is the big tech equivalent of ‘Soylent Green’

This article contains spoilers for the 1973 film “Soylent Green.”

It’s a hot AI summer out here for everyone who has even the slightest interest in putting the “art” in artificial intelligence. I’m talking about DALL-E 2 and OpenAI’s announcement that its incredible text-to-art generator would be entering a closed beta.

Most exciting of all: an additional one million people will gain access to DALL-E 2. Woohoo! Let’s do a cartwheel.

Up front: There would be no cartwheels in the Neural offices at TNW that day.

DALL-E 2 is, in this humble editor’s opinion, a scam. But that’s nothing new in the technology world. Facebook is a scam. Google is a scam. Microsoft is a scam. They’re all profiting off of something that has nothing to do with what they’re telling us they’re selling.

If you lay bricks, you’re a bricklayer. And if you write programs, you’re a programmer. But if you sell ads in 2022, you’re probably a search giant or a social media company. It doesn’t have to make sense so long as it makes profits.

For the most part, big tech makes its profits by convincing you to be the product. Why try to sell everyone on Earth a TV when it’s easier and more profitable to find a TV manufacturer who’ll pay you for advertising access to just about everyone on Earth?

The data we generate powers the products these companies sell. The agreement we make with them is pretty much this: you can use our data to make the product we’re logging into better. In practice, that means we’re cool with Meta using our surfing data to help advertisers target us, as long as we can keep playing Candy Crush for free.

Background: DALL-E 2 builds on the same transformer technology behind OpenAI’s GPT-3 language model. It takes text prompts and turns them into pictures. It’s mind-bogglingly good and will absolutely 100% revolutionize the world of content creation.
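
To make the “text prompt in, picture out” loop concrete, here’s a minimal sketch using OpenAI’s Python client. The beta I’m describing runs through a web interface, so take the model name, the client interface, and the idea of calling it programmatically as illustrative assumptions rather than the exact product in question.

```python
# Minimal sketch: ask a text-to-image model for a picture from a prompt.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set;
# the "dall-e-2" model name and API access are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.images.generate(
    model="dall-e-2",
    prompt="a newsroom full of editors doing cartwheels, 1970s film poster style",
    n=1,              # how many images to generate
    size="512x512",   # output resolution
)

# The API returns a URL for each generated image.
print(response.data[0].url)
```

Every call like this burns credits, which is exactly the monetization I take issue with below.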

Let me be super clear here: I love it. I think it’s amazing. It’s a technological triumph and the whole world deserves access to it (with the giant caveat that OpenAI has to make it safe).

However, selling access to it is dangerous. It’s not just unethical. It could represent a significant shift from the current level of exploitation big tech’s able to get away with, to a paradigm in which big tech essentially turns us into the digital equivalent of Soylent Green.

Spoiler alert: Soylent Green is an old movie but it’s a classic. I’m going to spoil the giant twist in this film right now, so you’d best surf away if that’s going to be a problem.

Per the movie, in a distant future humankind is on the brink of starvation. To survive, the government gives them nutrition bars and that’s all most people ever eat. One day, they come out with a new flavor called “Soylent Green.” Everybody loves it and, luckily, it’s produced in abundance. Unfortunately, it turns out that Soylent Green is made of humans. To solve the starvation problem, the government started processing dead people and feeding them to us.

So, two things:

  1. As I recall, Soylent Green was actually free. I haven’t watched it in a while, so I’m sure a reader will correct me if I’m wrong.
  2. It would be impossible to eat a Soylent Green bar made out of your own dead body.

I bring all that up because it explains my problem with OpenAI selling access to DALL-E 2.

My take: Giving people access to DALL-E 2, with certain protections in place, creates an environment in which anyone can peruse what I’d like to call the library of human imagery as interpreted through a transformer model.

But the moment OpenAI turned DALL-E 2 into a monetized service (OpenAI allows you to purchase credits for use beyond your beta limit), it turned our data into a product and began selling it back to us.

It went from being a library of art that we could all (potentially) enjoy to a for-profit endeavor. OpenAI even makes it explicit (in the above-linked post) that those who have access to the beta are legally entitled to ownership of any artwork the model produces at their request.

Who gave OpenAI the right to sell ownership to images our data helped create?

It’s easy to make the comparison to a writer who reads a bunch of books and then uses that inspiration to create a new book.

But that’s not what DALL-E 2 does. Don’t get hung up on the “art” aspect of it. If we strip the model down to its essentials, DALL-E 2 takes a picture, holds on to it, and then hands it to you when you ask for it.

If it has eight pictures of the ocean, and you tell it to hand you one, it will. But since DALL-E 2 is an agent that only exists in a digital world, it can do things that wouldn’t be practical in the physical realm.

It can smash those eight pictures together and make them one picture. And it can even be fine-tuned to decide whether eight is the right number of pictures to smash together or if six would do. Maybe ten?

Eventually, OpenAI gets to the point where its model is smashing millions or billions of images together. It’s still the same trick.
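
To make that metaphor concrete, here’s a toy sketch that literally smashes a handful of pictures together by averaging their pixels. To be clear, this is the mental model behind my argument, not how DALL-E 2 actually generates images, and the “ocean_pics” folder is a hypothetical stand-in for whatever images you feed it.

```python
# Toy illustration of the "smash pictures together" metaphor.
# It simply averages a few images pixel by pixel; it is NOT how DALL-E 2 works.
from pathlib import Path

import numpy as np
from PIL import Image


def smash_together(image_paths, size=(512, 512)):
    """Average a list of images into a single composite image."""
    stack = []
    for path in image_paths:
        img = Image.open(path).convert("RGB").resize(size)
        stack.append(np.asarray(img, dtype=np.float32))
    composite = np.mean(stack, axis=0).astype(np.uint8)
    return Image.fromarray(composite)


if __name__ == "__main__":
    # Hypothetical folder of ocean pictures; swap in your own source images.
    ocean_pics = sorted(Path("ocean_pics").glob("*.jpg"))[:8]
    smash_together(ocean_pics).save("one_ocean_picture.jpg")
```

Swap eight images for a few hundred million, add a model that decides which ones to blend and how, and you have the picture (pun intended) of what I mean by “the same trick.”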

Wrapping it up: Data is data and output is output. Doesn’t matter if it’s art, text, or advertising profiles. When Facebook takes your data, it sells that data to advertisers so you can play more Candy Crush.

When OpenAI took your data, it never had your permission. You never ticked a box or logged into a website that expressly informed you that any images you put online would be used to train OpenAI’s for-profit AI system.

And that means that, whether the end product is super fun like Candy Crush or super convenient like facial recognition, OpenAI is exploiting you and your data every time someone purchases access to the DALL-E 2 model.

The company took your data and trained a model so well that you don’t have to be a futurist to know that, eventually, DALL-E will be as crucial to design as Photoshop is now.

The difference between Adobe and OpenAI, however, is that Adobe gets your permission to train its AI on your data.

Nobody ever asked me if I wanted my face in Clearview AI’s database or OpenAI’s, and I feel the same way about both of them monetizing it.

This sets a dark and dangerous precedent for the use of scraped data going forward.
