In this episode of the Arts Management and Technology Lab, Hannah Brainard and Anuja Uppuluri explore two significant threats to artists and cultural institutions: the ethical and legal implications of Artificial Intelligence (AI), specifically its pre-training practices, and the impact of climate change on art and cultural heritage. They discuss how AI models often exploit artists' work without consent or compensation, leading to calls for dynamic consent systems, robust attribution, and fair compensation models. Concurrently, they examine how climate change amplifies natural disasters, posing existential threats to museums and collections, particularly impacting under-resourced and historically marginalized communities. The conversation emphasizes the urgent need for systemic change and equitable allocation of resources to ensure the long-term sustainability and preservation of creative works and cultural heritage in both domains.
Show Notes
-
Gelt, Jessica. “Inside the Dash to Save the Getty Villa from the Palisades Fire: A Timeline.” Los Angeles Times, January 9, 2025. https://www.latimes.com/entertainment-arts/story/2025-01-09/inside-the-dash-to-save-the-getty-villa-from-the-palisades-fire.
Howell, Junia, and James R. Elliott. “Damages Done: The Longitudinal Impacts of Natural Hazards on Wealth Inequality in the United States.” Social Problems 66, no. 3 (August 1, 2019): 448–67. https://doi.org/10.1093/socpro/spy016.
LAist. “The Bunny Museum Update.” Accessed April 26, 2025.
“Redlining in Los Angeles County.” Accessed April 10, 2025. https://www.laalmanac.com/history/hi727.php.
Stevens, Matt. “Palisades Fire Could Test Getty Center’s Efforts to Protect Its Art Collection.” New York Times, January 11, 2025. https://www.nytimes.com/2025/01/11/us/getty-center-museum-pacific-palisades-fire.html.
The Art Newspaper. “Deadly Wildfires Destroy Los Angeles Art Spaces as Museums and Galleries Close,” January 9, 2025. https://www.theartnewspaper.com/2025/01/09/los-angeles-wildfires-destroy-art-spaces-museums-galleries-close.
“The Eaton Fire and the Cultural Loss of Black Art: Remembering John Outterbridge During Black History Month.” Fox 4 News, February 21, 2025. https://fox4kc.com/business/press-releases/ein-presswire/788045123/the-eaton-fire-and-the-cultural-loss-of-black-art-remembering-john-outterbridge-during-black-history-month.
-
Gilbertson, Annie, and Alex Reisner. “Apple, Nvidia, Anthropic Used Thousands of Swiped YouTube Videos to Train AI.” Proof News, July 16, 2024. https://www.proofnews.org/apple-nvidia-anthropic-used-thousands-of-swiped-youtube-videos-to-train-ai/.
Perez, Sarah. “YouTube Will Now Let Creators Opt In to Third-Party AI Training.” TechCrunch, December 16, 2024. https://techcrunch.com/2024/12/16/youtube-will-let-creators-opt-out-into-third-party-ai-training/.
Coster, Helen. “Global News Publisher Axel Springer Partners with OpenAI in Landmark Deal.” Reuters, December 13, 2023. https://www.reuters.com/business/media-telecom/global-news-publisher-axel-springer-partners-with-openai-landmark-deal-2023-12-13/.
Brittain, Blake. “AI Companies Lose Bid to Dismiss Parts of Visual Artists’ Copyright Case.” Reuters, August 13, 2024. https://www.reuters.com/legal/litigation/ai-companies-lose-bid-dismiss-parts-visual-artists-copyright-case-2024-08-13/.
Shutterstock. “Shutterstock Expands Partnership with OpenAI, Signs New Six-Year Agreement to Provide High-Quality Training Data.” Press release, July 11, 2023. https://investor.shutterstock.com/news-releases/news-release-details/shutterstock-expands-partnership-openai-signs-new-six-year.
European Union. Artificial Intelligence Act (Regulation (EU) 2024/1689). Official Journal of the European Union, July 12, 2024. https://artificialintelligenceact.eu/the-act/.
Transcript
Anuja Uppuluri
My name is Anuja Uppuluri and I'm a graduating senior at Carnegie Mellon, where I've been studying information systems, artificial intelligence, and discrete math.
Hannah Brainard
And I'm Hannah Brainard, a graduating master of arts management student at Carnegie Mellon, focused on the intersections of culture and climate.
Today we're talking about two very different threats to artists and their work: AI and climate change. Specifically, pre-training practices in AI models and support systems for climate adaptation and mitigation efforts. When these systems were created, their creators didn't plan for long-term sustainability, and now we're looking to policy to prevent further damage.
First, I want to talk a little bit about these threats and their potential harm. Starting with you Anuja, could you share a little bit more about how AI models are trained and how these might be a threat to artists?
Anuja Uppuluri
Sure, Hannah. So modern foundation models like GPT-4, Claude, and Midjourney gain their capabilities from a resource-intensive process called pretraining.
And during pretraining, these AI systems are exposed to enormous collections of data, so this could be text from websites, digitized books, billions of images, audio clips, source code, that kind of thing. And by predicting the next element in sequences trillions of times, they're internalizing statistical patterns, and that's enabling all kinds of applications without retraining the entire network. And obviously, since that's a scalable practice for training, it's what AI companies keep doing.
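To make "predicting the next element in sequences" concrete, here is a deliberately tiny sketch: a bigram model that counts which token follows which. This is only a loose analogy for illustration; real foundation models learn these statistics with neural networks over trillions of tokens, not frequency tables, and the corpus here is invented.

```python
from collections import Counter, defaultdict

def train_bigram_model(corpus):
    """Count how often each token follows each other token --
    a toy stand-in for the next-token prediction objective."""
    counts = defaultdict(Counter)
    for text in corpus:
        tokens = text.split()
        for current, nxt in zip(tokens, tokens[1:]):
            counts[current][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return the most likely next token seen during 'pretraining'."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

# Hypothetical two-sentence "training set"
corpus = [
    "the museum preserves the collection",
    "the museum protects the art",
]
model = train_bigram_model(corpus)
print(predict_next(model, "the"))  # "museum" -- the pattern seen most often
```

The point of the analogy: everything the model "knows" is a statistical echo of whatever text it was fed, which is exactly why the provenance of that text matters.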
But what people don't realize is that pretraining often involves using artists' work without their knowledge or consent. In 2024, just recently, an investigation revealed that transcripts from 871 Crash Course educational videos were included in AI training datasets without permission from Hank Green or his production company at all. He's talked about how it's just "scummy," quote unquote, that he was never even asked.
For another example, Margaret Atwood has also filed lawsuits against AI companies for using her writing without authorization. And in general, there is a huge and pretty fundamental disconnect between the creators producing the content and the AI companies who find that content valuable enough to pretrain on.
And no matter how much good AI companies are doing, in the end, they are profiting from using content that is the work of other people in their systems. And I think in general, because artists are only discovering that their work was used through leaks or investigations, there's a general feeling of exploitation.
Hannah Brainard
Yeah. I can understand why artists would be upset about the use of their work in that way. Could you share some specific examples about how artists discovered that their work was being used in training AI models? And maybe what legal actions they've taken in response to that?
Anuja Uppuluri
Sure. So there have been artists, like Karla Ortiz, who typically have learned that their work was used without permission when they notice that AI systems can generate nearly identical copies of their style just by using their name as a prompt. And some evidence of this systematic collection came out when researchers discovered databases and spreadsheets listing thousands of artists whose styles AI systems were specifically trained to replicate.
And I think it's one thing when it's something ubiquitous, like "generate something in the style of Monet," and something else entirely when it's a more independent artist selling work that has value precisely because it's their specific work. In response, some artists have taken legal action, with Ortiz joining Sarah Andersen and Kelly McKernan in a pretty landmark class action lawsuit against Stability AI and Midjourney for using billions of images without consent or compensation.
There was another artist, Kristina Kashtanova, who faced a different kind of challenge when the US Copyright Office revoked protection for the AI-generated images in their comic book, Zarya of the Dawn. They were the first person to have AI art copyright protection withdrawn.
And in general, I think the AI industry has just been developed on the backs of creators, without established consent frameworks, attribution systems, or compensation models. And without guardrails like these, we really risk undermining the creative ecosystem that makes AI valuable in the first place.
Hannah Brainard
Yeah, absolutely.
Anuja Uppuluri
Let's also talk about climate.
Hannah Brainard
Yeah.
Anuja Uppuluri
What kinds of threats do climate change effects pose to artists and art institutions?
Hannah Brainard
Yeah, that's a great question. So, just to start off, it's very well documented that climate change is amplifying the impacts of natural disasters. Rising sea levels not only increase flooding, but they also intensify the frequency and magnitude of storms. Rising temperatures, again, are a threat on their own, but they also extend drought periods and dry seasons, which fuels wildfires, and they increase evaporation, again fueling storms. The list goes on and on about how climate change can really amplify natural disasters.
And this is a concern across the board, you know, for the way that our cities are planned, the way that we construct housing and other facilities. And, within the arts and cultural sector, this is of particular concern for museums. They're collections based. They're tangible. Museums are really tasked with preserving the cultural artifacts that we value. And these collections are also extremely sensitive to various environmental factors like temperature, light, humidity.
So even a small change can cause significant damage, let alone something at the scale of a natural disaster. So this is a growing concern, a growing question in the museum world about how we can both understand this risk and prepare for it.
Anuja Uppuluri
That makes sense. I guess in general, who do you think faces the greatest threat?
Hannah Brainard
Yeah. So, really it comes down to funding. Who has the resources to properly adapt to climate change? And this has long term implications for whose stories we protect and whose stories we value. In a century, which artifacts do we still have to tell the stories about?
So at its core that's really a climate justice question. Historic redlining is really one of the biggest factors here; it's created long-lasting disparities in our communities. So in the 1930s, the Home Owners' Loan Corporation, or HOLC, created maps to evaluate risk levels in different communities, and one of the biggest factors was race. This has led to long-term housing discrimination and divestment from BIPOC communities, creating dense, poorly maintained infrastructure that's more vulnerable to natural disasters.
Not only are these communities more vulnerable structurally to harm, but recovery benefits disproportionately fail to reach them. So, after a storm, the resources that become available are more valuable to property owners, who can actually grow their wealth after a natural disaster, versus people who don't own homes, who most often lose wealth. And these same dynamics can apply to cultural institutions in these communities.
Anuja Uppuluri
I guess then if we're thinking about things like housing discrimination, when you talk about these ideas applying to, like, cultural institutions, what does that look like and what kind of examples I guess can we think about?
Hannah Brainard
Yeah, so maybe the best and most recent example is the wildfires in Los Angeles. They're sort of an interesting case. One of the fires that we heard about was the Palisades fire, in the northwest end of the county, which primarily impacted white, affluent communities.
And sort of within this area is the Getty Villa, one of the two museum campuses operated by the J. Paul Getty Trust, one of the wealthiest arts institutions in the world. You likely saw headlines with images of the Getty Villa sign seemingly engulfed in flames; they really were in the path of the fire.
But the collection was safe following the fire, largely because the institution had made significant investments in protective infrastructure: fire-resistant landscaping, irrigation systems, flame-retardant materials, double-walled galleries, things that would keep the collection safe regardless of a natural disaster like a wildfire.
Conversely, the Eaton fire in the San Gabriel Valley disproportionately impacted Black communities and lower-income communities. One example from the Eaton fire: the collection of Black artist and activist John Outterbridge, whose artifacts his daughter had kept, was completely destroyed in a residential fire. Altadena, where it was housed, is a historically redlined community. So these smaller, niche collections tend to be in the most harm's way.
Another example that's quite sad actually is the Bunny Museum in Pasadena, also in the Eaton Fire. It was a collection of a husband and wife who exchanged, like, bunny toys every day for years, and eventually they decided to turn it into a museum. It's kind of sweet, but the entire museum was destroyed in the fire. So these sort of more niche, and specific spaces were lost most often. And the same kind of question applies to culturally specific museums.
So that was really the focus of my recent research of how culturally specific museums might be affected disproportionately by natural disaster and in Los Angeles specifically, it showed that while the threat of a natural disaster isn't substantially different, the social vulnerability level is. So, culturally specific museums are most often located in more socially vulnerable communities that would be less resilient to an event, like a natural disaster.
So, all that said, I wanna pivot back to you and talk a little bit more about this with AI. So, how are artists being disproportionately impacted by this?
Anuja Uppuluri
Before we talk about that, do you know what happened to the husband and wife from the bunny museum?
Hannah Brainard
Yeah. Oh, the husband and wife are okay.
Anuja Uppuluri
Okay.
Hannah Brainard
But the bunnies, unfortunately, are not. They did have actual live rabbits on site, though, and those were safe.
Anuja Uppuluri
Okay.
Hannah Brainard
They were taken, but the, you know, toys and trinkets, and things like that, were lost.
Anuja Uppuluri
That's so sad. But yeah, to talk about how artists are impacted differently as a result of pretraining, or I guess the procedures around it: there's a really clear inequality in how AI is impacting different types of artists because, as we touched on before, independent creators and those in marginalized communities are disproportionately affected.
Major publishing houses and established media organizations have also been able to negotiate licensing deals with AI companies. We've seen OpenAI partner with Axel Springer and Condé Nast, and Microsoft has created partnerships with USA Today's publisher. But individual artists, especially those without legal resources or industry connections, kind of don't have a seat at the table.
I think one thing also is that visual artists from diverse backgrounds are particularly vulnerable. Their work is scraped into training datasets without their knowledge, AI systems can generate imitations of their distinctive styles, and those systems are effectively appropriating cultural expressions and aesthetics those artists have spent years developing. And when those artists don't have the resources to "fight," quote unquote, or to advocate for themselves, there's just a huge disparity that keeps growing and growing. The difference between this and the notion of AI simply democratizing creation, letting their work be used to create other beautiful things, is that creators would actually have to get paid for the use of their work. And I think this gap is something that big companies aren't addressing, and it's just not sustainable.
Hannah Brainard
Yeah, that's a really important point. We often hear AI companies talk about democratizing creativity, but it seems like they're building this democratization on the unpaid labor of actual creators.
Anuja Uppuluri
Yeah. And I think this also brings us back to the fact that both of the problems that we were talking about today, like climate vulnerability and appropriation in AI. They have some similar patterns of unsustainable systems that were just built without considering long term impacts on creators or communities.
And I think in terms of the AI industry currently, pretraining is definitely necessary; it's just indispensable for AI advancement. It's literally how these systems develop capabilities like reasoning and creativity: they need information from which they can internalize patterns, and these massive datasets really help with that.
But current implementation is just increasingly unsustainable from legal and ethical and practical perspectives, and we just have to start revamping the system if there's any path forward, I feel like.
Hannah Brainard
Yeah. So how do we do it? How do we revamp the system? What are the solutions and how can we build equity into these to address harm?
Anuja Uppuluri
I think one super good thing is that there are already some pretty promising signals of change. YouTube, back in December, rolled out an opt-in toggle for third-party AI training, with the default set to off, and media companies are negotiating licensing deals with AI developers.
I guess in terms of a framework, what I think could make AI development more sustainable and equitable for creators starts with some kind of dynamic consent management system. The biggest thing here is that creators have been feeling like they lose control over their work once it enters training datasets. So consent shouldn't be treated as a one-time event; instead, envision an interface where creators can grant, modify, or revoke permissions as circumstances change. Another thing is a system of robust attribution: chains that track data lineage throughout an AI system's lifecycle. There's a real crisis in data transparency, which means creators often don't even know if their work was used. Proper attribution chains would ensure that when an AI system learns from someone's work, that connection remains traceable and acknowledged.
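As a minimal sketch of what "dynamic consent" could mean in practice, here is a hypothetical record type where permissions can be granted and revoked over time, with an audit trail standing in for the attribution chain. All the names here (`ConsentRecord`, `"ai_training"`, the example creator) are invented for illustration; no real system's API is implied.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One creator's permissions for one work -- revocable at any
    time, rather than a one-time, irreversible grant."""
    creator: str
    work_id: str
    allowed_uses: set = field(default_factory=set)  # e.g. {"ai_training"}
    history: list = field(default_factory=list)     # audit trail of changes

    def _log(self, action, use):
        self.history.append(
            (datetime.now(timezone.utc).isoformat(), action, use)
        )

    def grant(self, use):
        self.allowed_uses.add(use)
        self._log("grant", use)

    def revoke(self, use):
        self.allowed_uses.discard(use)
        self._log("revoke", use)

    def permits(self, use):
        return use in self.allowed_uses

record = ConsentRecord(creator="Jane Artist", work_id="work-001")
record.grant("ai_training")
record.revoke("ai_training")
print(record.permits("ai_training"))  # False -- consent was withdrawn
print(len(record.history))            # 2 -- every change stays traceable
```

The design choice worth noting is that revocation never erases history: the audit trail is what would let a creator later prove their work entered (or left) a training pipeline.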
And of course, fair compensation mechanisms that scale to different contexts are important. Companies like Shutterstock have been pioneering approaches where contributors receive ongoing royalties when AI systems are trained on their work and when those systems generate new content from it. That kind of transforms the relationship from extraction to partnership, and I think that's, at the end of the day, what's most important.
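One simple way such a royalty pool could work, purely as an assumption on my part since contributor-fund formulas like Shutterstock's aren't public, is a pro-rata split by how much of each contributor's work was used:

```python
def split_royalties(revenue, training_usage):
    """Split a revenue pool pro rata by how much each contributor's
    work was used in training (item counts, bytes, any usage metric)."""
    total = sum(training_usage.values())
    if total == 0:
        # Nothing was used: nobody is owed anything from this pool.
        return {creator: 0.0 for creator in training_usage}
    return {
        creator: revenue * used / total
        for creator, used in training_usage.items()
    }

# Hypothetical pool: $1,000 of AI revenue, usage measured in images scraped
payouts = split_royalties(1000.0, {"photographer_a": 300, "illustrator_b": 100})
print(payouts)  # {'photographer_a': 750.0, 'illustrator_b': 250.0}
```

A scheme like this only works if the attribution chains described above exist, since the usage counts have to come from somewhere auditable.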
But I think one thing also to note is that these frameworks really aren't just theoretical; they're already being implemented in various ways. The EU passed an AI Act that requires transparency around training data, and researchers are developing technical tools that trace data lineage in AI datasets as well. So what's promising is how these approaches can scale across disciplines: the same principles that protect a visual artist's style could also preserve a composer's musical signature or an author's narrative voice. And building these sustainability mechanisms into AI development from the ground up is integral to creating systems that advance technology while also honoring the human creativity they were built upon.
I think going back to your topic and environment, how can we also just establish more equitable systems so that we're addressing these sorts of harms?
Hannah Brainard
Yeah, you know, I think kind of similar systems can apply regarding climate change and how museums adapt, because really what it comes down to is the equitable allocation of funding for adaptation and resilience efforts. And specifically here, how culturally specific identity can be considered as a factor for prioritizing the allocation of funds.
Anuja Uppuluri
For sure.
Hannah Brainard
With that, I do want to add a caveat, though. So, you know, mapping risk is a huge tool in understanding how to allocate these funds, but we have to be cautious not to reinforce harm like the HOLC maps have done. Understanding marginalized communities, where they are and how they're impacted by climate change, means putting more resources into the museums in those communities, not avoiding them. So that's, sort of, an interesting line to walk: understanding risk for the purpose of providing resources for that space.
Anuja Uppuluri
For sure. I think from what I'm hearing, both of our challenges are showing us that there's a need for systemic change rather than just band-aid solutions.
Hannah Brainard
Yeah.
Anuja Uppuluri
And I think it's small things, like the reason the Getty survived, or at least the materials in there survived, is because of measures that were put in place long before, which, you know, could be harder for other organizations to put in place.
Hannah Brainard
Yes, definitely.
Anuja Uppuluri
Yeah, so I guess in general, designing systems that respect creators' rights or address historical vulnerabilities could set a precedent for better foundations, creating an environment where technology and cultural heritage can flourish together.
Hannah Brainard
Yeah. Absolutely. Well, thank you, Anuja. This conversation has been great.
Anuja Uppuluri
Thank you.
Hannah Brainard
Thank you for listening to this episode of the Arts and Society podcast. If you enjoyed this episode or learned something new, don't forget to share it with a friend. To see the latest, follow us on Facebook and Instagram at Arts and Society Pod. Until next time!

