Donald Glover and Wyclef Jean have been promoting Google’s AI initiative. (Quartz) Apparently using artists to promote it is meant to get ahead of accusations that AI is bad for artists. This deal doesn’t make me feel like AI is more acceptable; it makes me like those artists less. Normally I have lots of Childish Gambino on my writing playlists, and since reading this article, I’ve just been going “ugh” and skipping past it when he comes up.
I make no secret of my anti-AI stance. There are just so many reasons to be opposed to it.
To the best of my knowledge, none of these AI tools were trained on datasets where the included data came from consenting people. Companies tend to wrap their data usage in foggy language that makes it hard to know whether your stuff is getting used, and it usually *is* getting used, as with Slack. (Ars Technica)
Even Adobe, which has been offering people money for video clips, is mostly working off material that may be legally indefensible. (Hollywood Reporter)
> But behind closed doors, companies are warning that the way most AI systems are built might be illegal. “We may not prevail in any ongoing or future litigation,” states a securities filing issued in June by Adobe. It cites intellectual property disputes that could “subject us to significant liabilities, require us to enter into royalty and licensing agreements on unfavorable terms” and possibly impose “injunctions restricting our sale of products or services.” In March, Adobe unveiled AI image and text generator Firefly. Though the first model is only trained on stock images, it said that future versions will “leverage a variety of assets, technology and training data from Adobe and others.”
Beyond that, it’s so, so bad for the planet. As I’ve previously shared, a ChatGPT prompt uses 15x the energy of a traditional web search, (Quartz) and some experts say it’s even more complicated than that. (Bluesky)
I also just don’t understand why people want to use this stuff to replace human-made artwork. The act of creation is a solid 90% of the reason why stuff is good! Most of the magic happens between artist and art. Whatever your favorite painting is, I guarantee the painter has a far deeper relationship with the work than you do just by looking at it. There are the skills it took to get there, the techniques, the place where they made it, the sensory experience of creation, the thoughts and feelings of making things come together.
Frankly, everything AI produces is either bad, or hews closely enough to stolen material to be good but arguably plagiaristic. It’s *not good* at what it does. The best work these companies promote has, at most, a sort of loopy dream logic that doesn’t stand up to any scrutiny longer than a quick scroll past it on your feed.
Nicole at Thoughts Stained With Ink has a more comprehensive, publishing-specific post about her anti-AI stance that I appreciated reading. Like she says, there are plenty of good uses for AI that we can benefit from. AI assistants can be really helpful for all sorts of things. You could say the anti-motion-sickness tech Apple is trialing is a kind of AI. (Jalopnik via Quartz) We should also be using AI to tackle complicated scientific problems that are too difficult or labor-intensive for humans to work through on their own. That stuff is good. The art stuff is bad.
Companies aren’t going to behave well of their own volition. Unfortunately, politics in America is extremely not ready to help with this. (Engadget)
> “It’s very hard to do regulations because AI is changing too quickly,” Schumer said in an interview published by The New York Times. Yet, in March, the European Parliament approved wide-ranging legislation for regulating AI that manages the obligations of AI applications based on what risks and effects they could bring. The European Union said it hopes to “protect fundamental rights, democracy, the rule of law and environmental sustainability from high-risk AI, while boosting innovation and establishing Europe as a leader in the field.”
Schumer seems to disagree with finding that balance, instead stating in the interview that investment into AI research and development “is sort of the American way — we are more entrepreneurial.”
And in the meantime, Google is doing more to bury actual search results under AI nonsense, (Ars Technica) but Childish Gambino is okay with that, I guess.
Not every company and well-heeled musician is all-in on this, though. Sony Music now prohibits AI developers from using its catalogue for training data. (Ars Technica)
I would love for there to stop being AI-related news so I can stop writing posts like these.