Some Thoughts On The Impact Of Generative AI

I was momentarily distracted by this twitter post today. It included an image as a funny meme, and my impression is that the image is most likely generated by AI. And I made an off-hand comment that it felt bad to me.

Thanos ready for his NYTimes rehabilitation profile like

- from twitter user @alexisohanian


What followed was a reasonable question that I've gotten fairly often since I started commenting on AI. Why does it matter? Why am I against it? I felt this deserved a considered response. So here it is, reproduced from the twitter thread.

Okay. I'm actively procrastinating right now, so I guess it's a good time to answer this common question in more depth. Probably long thread incoming.

Curious why you think it's bad that (twisting your words a bit here) you can generate these cool images really easily. Feels like we should celebrate that.

I mean, there are definitely issues with AI image generation, but I don't think ease of access/use makes my list

- from twitter user @kzeillmann

I take issue with my actual feelings about this being boiled down to "it's bad". One of the reasons we have a hard time finding nuance is that our instinct is to try to boil everything down to a binary. And "good" vs "bad" is our very favorite.

My perspective on generative AI tech is still developing. But I'll try to address multiple different facets of what I've been thinking about.

I think the important place to start is that it's happening. I'm not a person who argues to put the genie back in the bottle so to speak.

Once a new technology is released and gains traction, our energy has to turn towards engaging with the impact of it. We can still talk about our feelings. But if that's all we do, we aren't doing the hard work of actually trying to influence the world we live in.

What I said here was "it feels bad". What I meant by that is it makes me sad when I think about all of the implications around Ohanian's tweet. It makes me think of how the world is gonna change in ways I personally am not looking forward to.

That's not a statement about whether the tech is "good" or "bad". It's a statement about my own feelings and what I wish for the society in which I exist. And it doesn't mean I'm against any kind of progress. It means I looked at this one and I'm not sure it works for me.

That said, we can ask some practical questions. Should humans take something that used to require a lot of labor and creativity and give ourselves the capability to do it instantly with no effort? In the absence of context, the answer is probably yes.

I've been talking a lot about abundance. I believe in it deeply as a way forward for humanity. But learning how to live in a society that produces abundance is not easy. And it's pretty clear to me that we are not culturally ready to do so.

Why is abundance bad in this case? There are a few different answers to this. The first one is also very practical. It's going to destroy industries that currently keep a lot of people gainfully employed. People will suffer because of this tech.

I know most of us have heard this argument before. And because it comes up so often, I think people have decided it's not interesting or important. I disagree. It is the most important thing. Progress has a high cost. Even if we all agree that the eventual outcome is "better".

The reason it bothers me so much is that a lot of people get real comfortable saying that the eventual benefits outweigh the impact. But they don't actually want to talk about how we might actually reduce or mitigate that impact. We're just supposed to let it happen.

Just speaking for America, I don't think we will do all that much to help the people who are going to be devastated by this "advancement". It's gonna be a story of how we perpetuated another generation of misery and struggle that turns into strife and grievance in the future.

But that's not the only reason this makes me sad. We can also look at this a bit more philosophically. Why does it matter that a human expends effort and creativity to produce something? The answer is because we say it does. Because we want it to matter.

Part of this is subjective for me. Human effort has always mattered. That's why the real Mona Lisa is worth more than the millions of perfect reproductions that have been produced. Art matters because a human decided to make it. No other reason. And for me, that's good enough.

But not all of it is subjective. I believe there are huge impacts to devaluing the production of art. It destroys value from a labor perspective, but also from a consumer perspective. AI art can be striking. But it has no story, and is thus inherently less interesting.

We have already seen the precursor to this with social media. People's art gets stolen and shared all the time. Without the artist's consent. Without attribution. And without ever giving the viewer the opportunity to hear a story that might make it more compelling.

Stories matter to human culture more than just about anything else. We *know* this. And yet we willingly participate in destroying our stories with ever greater efficiency and finality. And it makes me sad. Our culture is worse overall when this happens.

And as I keep repeating, what we do with computers accelerates everything. Whatever impact we are having, when we enable a computer to do it, we are increasing that impact exponentially. That's what's about to happen to visual and written art.

Because again, whatever this technology does, it will do it a billion times at the speed of light. There is going to be tremendous consequence and impact. And as soon as we decide it’s “thinking”, then humans will no longer be accountable for what happens.

- from me

So let me try to round this out with some final thoughts.

First off, if we are going to create abundance, then we have to start envisioning a world where humans aren't forced to earn their existence through labor.

I don't think anybody is willing to have that conversation. It's way more complicated than it seems, even at first blush. For those who I know are itching to bring it up, UBI is potentially a piece of the puzzle, but is nowhere near a complete solution.

Second, if we're gonna keep devaluing culture, we're going to have to figure out what else matters to us. My experience is that when humans don't have enough interesting distractions to keep us occupied, we tend to try to destroy each other.

All of this context is included when you ask me about generative AI and I say "it feels bad".

Am I doing too much right now? Yeah, probably. Does that mean I'm overthinking it? Absolutely not. There is no magic. Either we choose to try to avoid suffering or we don't.