The Woke Agenda. The Problem with AI Art.

Some people may take the saying “never judge a book by its cover” literally, but I’m not one of them. For the last few months, I’ve been getting review requests for books with AI covers. While some may look good at first glance, a deeper inspection reveals their nature. But the way they look not-quite-right isn’t the biggest problem with this so-called “AI art”; the bigger problem is the way these generative AI models work and the consequences of their use.

I cannot consider an AI-generated cover professional (nor an author who knowingly uses one), because it undermines the work of real professionals: visual artists who hone their skills for years to create amazing results. Which is why I cannot in good conscience promote a book with an AI cover.

Just last year, SPFBO’s cover contest became the centerpiece of an AI controversy, which led to the cancellation of this fun addition to the main contest and sowed distrust between authors and their cover artists.

In the literary industry, people denounce the production of full books using these technologies, and publishers complain about being flooded with AI submissions, but the same contempt isn’t extended to AI images. That’s why I’ll be focusing on the ethics of their use.


Ethical Considerations

It’s Biased

I’ve talked about biases in my various posts for The Woke Agenda, and AI models are not exempt, as they collect information from the real world. Because these models are built on real-world data, they inherit its prejudices.

And because the internet is overflowing with images of naked or barely dressed women, and pictures reflecting sexist, racist stereotypes, the data set is also skewed toward these kinds of images.

(Heikkilä, 2022)

A couple of years ago, people had fun using the Lensa app (which used the open-source AI art generator Stable Diffusion) to create avatars from their selfies. Melissa Heikkilä, senior reporter at MIT Technology Review, tried the app and found that her generated portraits reduced her to her Asian heritage, sometimes producing images of generic Asian women; more often than not, the portraits were hypersexualized. This is a reflection of the real-world fetishization of East Asian women.

AI Is Theft

Most generative models claim to use public domain images, but in reality, they scrape information from across the internet. Everyone’s personal pictures posted online are gathered for this purpose, then tagged and categorized.

(…) we find highly questionable semiotic assumptions, echoes of nineteenth-century phrenology, and the representational harm of classifying images of people without their consent or participation.

(Crawford & Paglen, 2019)

This scraping includes copyrighted images and works from artists who have not consented to this usage, so these databases work on pirated intellectual property. In addition, people can use artists’ names as prompts to get a result that imitates said artist’s style by using their online works.

Just a month after the launch of Stable Diffusion, illustrator Greg Rutkowski‘s name became a popular prompt for the text-to-image generator. The model took advantage of Rutkowski’s use of alt text to gather and use his works.

Some of these models offer opt-out options for artists to request that their works not be used, but this doesn’t change the fact that they should have been asked for permission in the first place.

It Takes Away Jobs

The easy access to AI models and the quick results they offer have made them attractive to companies, which have been firing in-house artists and dropping freelancers. These companies generate images built on copyrighted artworks for commercial purposes.

(…) people in their droves are using it to create images for everything from company logos to picture books. It’s already been used by one major publisher: sci-fi imprint Tor discovered that a cover it had created had used a licensed image created by AI, but decided to go ahead anyway “due to production constraints”.

(Shaffi, 2023)

There’s been a surge of “AI artists” who task themselves with using certain strings of words to simulate works by real artists. Ironically, these same prompters complain that their prompts have been “stolen” when someone copies them.

Production of Illegal Content

The easy access to this technology and its rapid development gives some malicious actors the tools to generate realistic images of people without their consent.

Journalist Sarah Shaffi reports that Kaloyan Chernev, CTO of DeepDream Generator, admitted the tool had been abused when people tried to “generate images of nude children, despite the fact that no such images were present in the training dataset”; later studies determined that child sexual abuse material (CSAM) has in fact been used to train AI models. These systems make it easier to produce explicit imagery of both real and fake minors.

The competitiveness of the field has made these models bigger and better, and in consequence, more difficult to control and monitor.


Conclusion

It’s exhausting and frustrating to be wary of every artwork you find online. Not everyone is trained to find the inconsistencies in images that reveal they were AI-generated, and the technologies keep evolving, making the discrepancies harder to spot.

The unregulated use of AI “artwork” has poisoned different sectors of the visual arts industry: illustration software trained on artists’ work, art portfolio websites allowing the promotion of AI images, freelance platforms where providers fail to tag AI work and deceive unaware clients, and deepfakes.

(…) artists take pride in their craft. They don’t think of it as just a job. They think of it as like an expression of their individuality and their style and who they are as people.

(Piper, 2023)

The non-human nature of these systems makes their results not only passionless, but lacking in intuition and creativity. People are not creating AI images; they are taking away livelihoods for the benefit of big companies with limitless greed.

Stay tuned for more “woke” content from this Social Justice Arcanist.


References

