Getty Images CEO: Respecting fair use rules won’t prevent AI from curing cancer
There is considerable debate over whether freely training artificial intelligence (AI) models on copyrighted materials is permissible under the law. That debate has escalated into court cases filed across multiple jurisdictions, with rights holders challenging companies that train their generative AI models by scraping copyrighted materials without permission.
The debate is largely framed as an all-or-nothing proposition. AI model makers argue that all training is clearly permissible under the law. Some even assert that there is no copyright protection for online content, a position highlighted by Mustafa Suleyman, the CEO of Microsoft AI, in remarks earlier this year. “I think that with respect to content that’s already on the open web, the social contract of that content since the nineties has been that it is fair use. Anyone can copy it, recreate with it, reproduce with it. That has been freeware, if you like, that’s been the understanding,” Suleyman said.
As the CEO of Getty Images, I am responsible for an organization that employs over 1,700 individuals and represents the work of more than 600,000 journalists and creators worldwide. Copyright is at the very core of our business and the livelihoods of those we employ and represent. As I am sure you can imagine, I vigorously disagree with the sweeping position outlined by Mr. Suleyman and others. I also suspect Microsoft would vigorously disagree if the same logic were applied to its software and gaming titles.
The creative community, which represents a sizeable share of the global economy, has mobilized against the unauthorized use of their collective text, music, photography, and video. Most recently, this mobilization manifested itself in over 30,000 artists signing a statement that “the unlicensed use of creative works for training generative AI is a major, unjust threat to the livelihoods of the people behind those works, and must not be permitted.” I vigorously agree with this statement. A world where artists cannot invest in and be rewarded for their work is a world with less creativity and fewer people able to make a living in their trade. That is not a compelling future.
This disagreement underscores why we are litigating against Stability AI in the U.S. and the U.K. We did not grant Stability AI permission to use millions of images owned and/or represented by Getty Images to train its Stable Diffusion model, which was made commercially available starting in August 2022. (Editor’s note: Stability AI denies all claims.) The action against Stability AI alone represents a multi-million-dollar investment and is expected to take years to resolve. However, identifying unauthorized use, securing evidence, and pursuing litigation against each and every company that has trained on our content without permission is prohibitively expensive, even for a company of Getty Images’ scale.
As litigation slowly advances, AI companies argue that there will be no AI absent the ability to freely scrape content for training, and that respecting copyright would therefore rob us of AI’s promise to cure cancer, mitigate global climate change, and eradicate global hunger. Note that the companies investing in and building AI spend billions of dollars on talent, GPUs, and the power required to train and run these models, yet remarkably claim that compensating content owners is an insurmountable challenge.
My focus is to achieve a world where creativity is celebrated and rewarded AND a world without cancer, climate change, and global hunger. I want to have my cake and eat it, too. I suspect most of us want the same.
We need to be open to a more nuanced debate on AI and copyright. The reality is that there is an established path here: Fair use (and similar concepts around the globe) allows third parties to draw on copyrighted works absent a license, depending on the following criteria:
The purpose and character of the use, including whether such use is of a commercial nature or is in the public interest.
The nature of the copyrighted work.
The amount and substantiality of the portion used in relation to the copyrighted work as a whole.
The effect of the use upon the potential market for or value of the copyrighted work.
Fair use is applied on a case-by-case basis. So let’s look at AI not as one monolithic case, but as what it really is: a wide range of models, capabilities, and potential applications.
Does curing cancer impact the value of Kevin Bacon’s performances? Clearly no. Does solving climate change impact the value of Billie Eilish’s music? Clearly no. Does solving global hunger impact the value of Stephen King’s writing? Again, clearly no. Not only does such use not harm the value of their work, but these artists would likely never challenge it if it could advance those aims, even if the use were commercial in nature. As the CEO of Getty Images, I can say we would never debate or challenge these applications, and that we would wholeheartedly offer whatever support we could toward them.
But let us now look at a more limited area of the AI universe: content generation models. These are models that generate music, photos, and videos based on text or other inputs. These models are being trained on artists’ content without their permission, commercialized by companies targeting the very end markets from which those artists make their living, and clearly running afoul of the first and fourth criteria of fair use. Importantly, these models do not hold the potential to elevate our societal outcomes. It’s pure theft from one group for the financial benefit of another.
Fair use is the path to having our cake and eating it, too. It is long established, and we need to reject calls to ignore or override it. And it doesn’t mean that content generation models cannot exist. Quite the contrary.
Just as the licensed models of Spotify and Apple Music evolved from the original, infringing Napster, there are AI models developed with permission and built on business models that reward creators for their contributions. Like Apple Music and Spotify, they will cost a bit more, but they can thrive and be broadly adopted if we create a fair playing field by holding to account the companies that choose to “move fast and break things,” in this case breaking established copyright law.
There is a fair path that both rewards creativity and delivers on the promise of AI. Let’s stop the rhetoric that all unpermissioned AI training is legal and that any requirement to respect the rights of creators comes at the expense of AI as a technology.
The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.