AI lab TL;DR | Carys J. Craig - The Copyright Trap and AI Policy
🔍 In this TL;DR episode, Carys J Craig (Osgoode Hall Law School) explains the "copyright trap" in AI regulation: relying on copyright ends up favoring corporate interests over creativity. She challenges common misconceptions about copying and property rights, shows how a copyright-centered approach harms innovation and access, and offers alternative ways to protect human creativity without falling into the trap.
📌 TL;DR Highlights
⏲️[00:00] Intro
⏲️[00:46] Q1-What is the "Copyright Trap," and why could it harm AI and creativity?
⏲️[10:05] Q2-Can you explain the three routes that lead into the copyright trap and their relevance to AI?
⏲️[22:08] Q3-What alternatives should policymakers consider to protect creators and manage AI?
⏲️[28:45] Wrap-up & Outro
💭 Q1 - What is the "Copyright Trap," and why could it harm AI and creativity?
🗣️ "To turn to copyright law is to turn to really a false friend. The idea that copyright is going to be our friend, is going to help us in this situation, (...) it's likely to do more harm than good."
🗣️ "We are imagining increasingly in these policy debates that copyright and protection of copyright owners will be a kind of counterweight to corporate power and to the sort of extractive logics of Big Tech and AI development. I think that that is misguided. And in fact, we're playing into the interests of both the entertainment industries and big tech."
🗣️ "When we run into the copyright trap, this sort of conviction that copyright is going to be the right regulatory tool, we are sort of defining how this technology is going to evolve in a way that I think will backfire and will actually undermine the political objectives of those who are pointing to the inequities and the unfairness behind the technology and the way that it's being developed."
🗣️ "AI industry, big tech industry and the creative industry stakeholders are all, I think, perfectly happy to approach these larger policy questions through the sort of logic of copyright, sort of proprietary logic of ownership, control, exchange in the free market, licencing structures that we're already seeing taking hold."
🗣️ "What we're going to see, I think, if we run into the copyright trap is that certainly smaller developers, but really everyone will be training the technology on incomplete data sets, the data sets that reflect the sort of big packaged data products that have been exchanged for value between the main market actors. So that's going to lessen the quality really of what's going in generally by making it more exclusive and less inclusive."
💭 Q2 - Can you explain the three routes that lead into the copyright trap and their relevance to AI?
🗣️ ""The first route that I identify is what's sometimes called the if-value-then-right fallacy. So that's the assumption that if something has value, then there should be or must be some right over it.“
🗣️ "Because something has value, whether economic or social, doesn't mean we should turn it into property that can be owned and controlled through these exclusive rights that we find in copyright law."
🗣️ "The second route that I identify is a sort of obsession with copying and the idea that copying is inherently just a wrongful activity. (...) The reality is that there's nothing inherently wrongful about copying. And in fact, this is how we learn. This is how we create.
🗣️ "One of the clearest routes into the copyright trap is saying, well, you know, you have to make copies of texts in order to train AI. So of course, copyright is implicated. And of course, we have to prevent that from happening without permission.. (...) But our obsession with the individuated sort of discrete copies of works behind the scenes is now an anachronism that we really need to let go.”
🗣️ "Using the figure of the artist as a reason to expand copyright control, and assuming that that's going to magically turn into lining the pockets of artists and creators seems to me to be a fallacy and a route into the copyright trap."
💭 Q3 - What alternatives should policymakers consider to protect creators and manage AI?
🗣️ "The health of our cultural environment (..) [should be] the biggest concern and not simply or only protecting creators as a separate class of sort of professional actors."
🗣️ "I think what we could do is shift our copyright policy focus to protecting and encouraging human authorship by refusing to protect AI generated outputs.
🗣️ "If the outputs of generative AI are substantially similar to works on which the AI was trained, then those are infringing outputs and copyright law will apply to them such that to distribute those infringing copies would produce liability under the system as it currently exists.“
🗣️ "There are privacy experts who might be much better placed to say how should we curate or ensure that we regulate the data on which the machines are trained and I would be supportive of those kinds of interventions at the input stage.
🗣️ “Copyright seems like a tempting way to do it but that's not what it does. And so maybe rather than some of the big collective licencing solutions that are being imagined in this context, we'd be better off thinking about tax solutions, where we properly tax big tech and then we use that tax in a way that actually supports the things that we as a society care about, including funding culture and the arts."
📌 About Our Guest
🎙️ Carys J Craig | Osgoode Hall Law School
🌐 Article | The AI-Copyright Trap
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4905118
🌐 Carys J Craig
https://www.osgoode.yorku.ca/faculty-and-staff/craig-carys-j/
Carys is the Academic Director of the Osgoode Professional Development LLM Program in Intellectual Property Law, and recently served as Osgoode’s Associate Dean. A recipient of multiple teaching awards, Carys researches and publishes widely on intellectual property law and policy, with an emphasis on authorship, users’ rights and the public domain.
#AI #ArtificialIntelligence #GenerativeAI