The Legal Implications of AI-Generated Content in Copyright Law

Highlights

The increasing use of Artificial Intelligence (AI) in creative industries raises copyright concerns, particularly over whether AI-generated art can be protected under copyright law. The U.S. Copyright Office has taken the position that works created by non-human entities, including machines, are not eligible for copyright protection.

The influence of Artificial Intelligence (AI) is expanding across diverse domains, as seen in natural language processing tools like GPT-3, image recognition software such as Google Lens, and product recommendation engines such as Amazon’s product suggestion system. AI is also gaining traction in the art world, exemplified by the sale of “Edmond de Belamy,” an AI-generated portrait, for an unprecedented $432,500 at auction. Nonetheless, the increasing involvement of AI in creative pursuits raises copyright concerns.

When it comes to training AI models, the use of copyrighted materials sits in a legal grey area. As it stands, copyright law does not protect any creation that is wholly generated by AI, even if it stemmed from a human-crafted text prompt. While the fair use doctrine permits the use of copyrighted material under certain conditions without the owner’s permission, ongoing legal disputes could disrupt this status quo and bring uncertainty to the future of AI model training.

Undoubtedly, the advent of generative AI has transformed how we live, work, and create within a matter of months. In turn, the flood of AI-generated writing, images, and music, along with the mechanisms through which they were created, has raised a host of intricate legal questions. These challenge our understanding of ownership, fairness, and the very foundations of innovation.

Can AI-generated art be copyrighted?

Whether AI-generated art can be protected under copyright law has been a contentious topic, drawing a wide range of opinions. The U.S. Copyright Office has taken the position that works created by non-human entities, including machines, are not eligible for copyright protection. Consequently, the output of a generative AI model cannot be considered copyrightable.

The fundamental challenge lies in the way generative AI systems operate. These models learn by identifying and replicating patterns found in data. Thus, the AI system must first learn from human creations to produce output such as written text or images. For example, if an AI-generated image resembles the art of Japanese artist Yokoyama Taikan, it would have been trained using actual pieces of art created by the human artist. Similarly, to generate written content in the style of J. K. Rowling, the AI system would need to be trained with words written by J. K. Rowling.
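
To make that dependence on training data concrete, here is a minimal, purely illustrative Python sketch: a toy Markov-chain “generator” trained on a few sentences. It is nothing like a modern generative model in scale, but it shares the property at issue: every phrase it produces is assembled from patterns that exist only because of the human-written text it was trained on.

import random
from collections import defaultdict

def train_markov(corpus, order=2):
    # Map each sequence of `order` words to the words observed to follow it.
    words = corpus.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        model[tuple(words[i:i + order])].append(words[i + order])
    return model

def generate(model, order=2, length=20, seed=7):
    # "Generate" text by replaying word transitions learned from the corpus.
    rng = random.Random(seed)
    output = list(rng.choice(list(model.keys())))
    for _ in range(length):
        followers = model.get(tuple(output[-order:]))
        if not followers:  # no learned continuation, so stop
            break
        output.append(rng.choice(followers))
    return " ".join(output)

training_text = (
    "the wizard walked into the castle and the castle was dark "
    "and the wizard lit a lamp and walked into the great hall"
)
print(generate(train_markov(training_text)))  # every phrase traces back to the human-written corpus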

However, under current U.S. copyright law, these AI systems – which include image and music generators as well as chatbots like ChatGPT – cannot be regarded as the authors of the content they produce. Instead, their outputs are the culmination of human-generated work, much of which is copyrighted in some form and sourced from the internet. This does not mean that AI-generated works necessarily fall into the public domain. For example, if a company uses AI to generate content, it may still hold other proprietary rights in that content, such as trade secret or patent protection.

This raises a perplexing question: how can the rapidly evolving artificial intelligence industry be harmonized with the intricate details of U.S. copyright law? This is a question that creative professionals, companies, courts, and the U.S. government are all grappling with as they navigate the complexities and nuances of AI-generated content and intellectual property laws.

Will copyright issues get tougher when humans and AI do the work together?

The issue of copyright protection for creative works resulting from collaboration between humans and machines is complex. According to the Copyright Office, if a human selects or arranges AI-generated material in a sufficiently creative way, or modifies it creatively, copyright protection applies only to the human-authored components of the work, not to the AI-generated material itself. Protection for works created jointly by humans and machines is less clear, and registration applications must name all joint authors.

The use of generative AI to create artistic works can also raise infringement concerns when the output resembles pre-existing works on the internet. Because these models are typically trained on existing works found online, their output may closely resemble earlier works, and it can be difficult to ascertain whether a given output is a derivative work or infringes the rights of prior authors. The requirement to name all joint authors in registration applications adds further uncertainty, since it remains unsettled whether, or how, an AI system should be named when a machine has contributed to the work.

Lawsuits

Getty Images has taken legal action against Stability AI, accusing the company of unlawfully copying more than 12 million photos from Getty Images’ collection and using them in generative AI systems without permission or licensing. Stability AI is not alone in facing lawsuits related to generative AI. With companies such as Microsoft, OpenAI, and GitHub launching generative AI products, creative industries have begun filing lawsuits over the co-opting of copyrighted work by AI. In addition to Getty’s case, a group of artists has sued Stability AI, Midjourney, and DeviantArt for alleged mass copyright infringement through the use of their work in generative AI systems. These lawsuits are bringing to light the legal implications of using generative AI, a practice that is becoming increasingly common.

A class-action lawsuit was also filed against GitHub, Microsoft, and OpenAI. The complaint alleged that the AI-powered coding assistant GitHub Copilot infringed copyright by generating code derived from publicly accessible code released under open-source licenses. Copilot suggests new code to programmers in real time based on the code they are writing. According to the complaint, Copilot’s code-generating model was trained on copyrighted code without the necessary authorization, and the program produces new code that is similar or identical to the original work. This was the first major lawsuit brought over generative AI. The case seeks class-action status, and if it prevails, it could affect the entire AI industry and how it uses publicly available code to train models.

Microsoft, GitHub, and OpenAI have filed a motion to dismiss the lawsuit, arguing that Copilot produces unique code rather than mere copies of the data used for training.
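
The dispute largely turns on how closely generated code tracks the code it was trained on. As a rough, hypothetical illustration only, and not a method used by any party in these cases, the short Python sketch below uses the standard-library difflib to score how near-verbatim a generated snippet is to a licensed source snippet.

from difflib import SequenceMatcher

def overlap_ratio(generated, licensed):
    # Similarity score between 0.0 (no overlap) and 1.0 (identical text).
    return SequenceMatcher(None, generated, licensed).ratio()

licensed_code = "def is_even(n):\n    return n % 2 == 0\n"
generated_code = "def is_even(num):\n    return num % 2 == 0\n"

score = overlap_ratio(generated_code, licensed_code)
print(f"similarity: {score:.2f}")
if score > 0.8:  # threshold chosen arbitrarily for illustration
    print("near-verbatim overlap: the original license terms may be implicated")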

These are only some of the lawsuits recently filed over generative AI. How they will be resolved, and how they will influence the AI industry, remains to be seen.

Ending Remarks

Copyright law is fundamental to protecting intellectual property and encouraging creativity. It gives creators the right to control how their work is used, distributed, and adapted, and it encourages them to create more by granting them exclusive rights. Creative Commons licenses give creators even more options to choose the level of protection they want for their work.

As AI technology advances, it becomes increasingly involved in the creative process. With AI’s ability to generate original content and collaborate with humans, there is a growing need for a legal framework that addresses the copyright protection of collaborative works involving AI. It is crucial to strike a delicate balance between safeguarding the rights of creators and nurturing innovation and originality. It is difficult to predict the exact trajectory of copyright law as it pertains to AI-generated works. Still, it is undeniable that as AI technology becomes increasingly integrated into the creative process, the legal framework governing copyright protection will undergo significant and ongoing transformation.

Source: https://indiaai.gov.in/article/the-legal-implications-of-ai-generated-content-in-copyright-law

Anndy Lian is an early blockchain adopter and experienced serial entrepreneur known for his work in the government sector. He is the best-selling author of “NFT: From Zero to Hero” and “Blockchain Revolution 2030”.

Currently, he serves as Chief Digital Advisor at the Mongolia Productivity Organization, championing national digitization. Prior to his current appointments, he was Chairman of BigONE Exchange, a crypto spot exchange ranked in the global top 30, and an Advisory Board Member for Hyundai DAC, the blockchain arm of South Korea’s largest car manufacturer, Hyundai Motor Group. Lian also played a pivotal role as Blockchain Advisor for the Asian Productivity Organisation (APO), an intergovernmental organization committed to improving productivity in the Asia-Pacific region.

An avid supporter of incubating start-ups, Anndy has also been a private investor for the past eight years. With a growth-investment mindset, he applies it strategically in the companies he chooses to be involved with. He believes that his current work with blockchain technology will revolutionise and redefine traditional businesses, and that the blockchain industry has to be “redecentralised”.


SEC chair Gensler confirms “everything other than Bitcoin” is a security: Implications and analysis

SEC Chair Gary Gensler has reiterated that Bitcoin is not a security but a commodity under the purview of the Commodity Futures Trading Commission (CFTC). He also stated that “everything else other than bitcoin” is a security, a view with significant implications for how cryptocurrencies and digital assets are regulated in the United States.

Gensler’s statement reflects the SEC’s long-held view that many cryptocurrencies and digital assets are securities under U.S. law. The SEC’s definition of a security is broad — it includes any investment contract in which an individual invests money in a common enterprise with the expectation of profits solely from the efforts of others. In other words, if an asset is sold as an investment with the expectation of profit based on the efforts of others, it is likely to be considered a security.

Gensler’s comments have sparked debate in the cryptocurrency community. Some argue that his view is overly broad and that many digital assets do not fit the SEC’s definition of a security. Others argue that the SEC’s approach is necessary to protect investors from fraudulent or manipulative activities in the cryptocurrency market.

One key implication of Gensler’s comments is that many digital assets may be subject to SEC regulation. This could include initial coin offerings (ICOs), a form of crowdfunding in which investors purchase digital tokens in exchange for cryptocurrencies like Bitcoin or Ethereum. Many ICOs have been criticized for their lack of transparency and accountability, and the SEC has taken enforcement action against several ICO issuers in recent years.

Another implication is that exchanges that trade digital assets may be subject to SEC oversight. Under U.S. law, exchanges facilitating securities trading must register with the SEC and comply with various regulations. If the SEC views many digital assets as securities, then exchanges that trade those assets may also be required to register with the SEC and comply with its regulations.

His comments suggest that the SEC may take a more aggressive approach to regulating the cryptocurrency market. This could include increased enforcement actions against issuers of digital assets considered securities and against exchanges that facilitate trading those assets. It could also lead to new regulations to increase transparency and accountability in the cryptocurrency market.

The SEC’s approach to regulating cryptocurrency has been debated for several years. Some argue that its current approach is overly cautious and is stifling innovation in the cryptocurrency space. Others argue that increased regulation is necessary to protect investors from fraud and manipulation.

Gensler’s comments suggest that the SEC is likely to take a more assertive approach to regulating the cryptocurrency market in the coming years. This could include increased enforcement actions, new regulations, and closer scrutiny of digital assets and of the exchanges that operate in the U.S.

Maybe we can take a step back and look at a few things. First, it is important to understand the context of Gensler’s statement. As mentioned earlier, Gensler reiterated the SEC’s stance in an interview with CNBC in July 2022 that Bitcoin is not a security but a commodity that falls under the Commodity Futures Trading Commission’s jurisdiction. He did not label other digital assets, avoiding a direct answer to the question. However, a tweet by Jake Chervinsky in February 2023 suggested that Gensler may have prejudged that every digital asset aside from Bitcoin is a security.

Then my question is: what exactly is a security? In the US, the Securities Act of 1933 defines a security to include any investment contract, note, stock, or other investment in a common enterprise made with the expectation of profits solely from the efforts of others. In simpler terms, a security is an asset representing an ownership interest or a right to receive future profits or cash flows from a third party.
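
As a rough mental model only, simplified far beyond how courts actually apply this test and not intended as legal advice, the definition can be read as a checklist in which every prong must be met. The short Python sketch below encodes that simplification.

from dataclasses import dataclass

@dataclass
class AssetFacts:
    investment_of_money: bool     # buyers put money or other assets in
    common_enterprise: bool       # funds are pooled in a shared venture
    expectation_of_profit: bool   # buyers expect a return
    efforts_of_others: bool       # returns depend on a promoter or third party

def looks_like_security(facts):
    # Under this simplified model, all four prongs must be satisfied.
    return all([
        facts.investment_of_money,
        facts.common_enterprise,
        facts.expectation_of_profit,
        facts.efforts_of_others,
    ])

# Example: a token sold to fund a team that promises future returns to buyers
token = AssetFacts(True, True, True, True)
print(looks_like_security(token))  # True under this simplified model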

Suppose we take Gensler’s statement that everything other than Bitcoin is a security at face value. In that case, most digital assets, such as Ethereum, XRP, and other cryptocurrencies, would be considered securities under US law and would therefore be subject to SEC regulation and oversight. It is worth noting that this is not a new position for the SEC: for years, it has warned cryptocurrency companies that their tokens could be classified as securities if they meet certain criteria.

The implications of this classification are significant. If a digital asset is classified as a security, the issuer must comply with SEC regulations, including registration and disclosure requirements. It would also have to follow strict trading, reporting, and investor protection rules. Additionally, investors would be protected under federal securities laws, which could increase their confidence in the digital asset market. However, it could also lead to additional costs and regulatory burdens for the companies issuing digital assets.

My opinion on this matter is that while Gensler’s statement may have been perceived as a blanket statement, the SEC’s approach to regulating cryptocurrencies is nuanced and fact-specific. The SEC has been clear that it will evaluate each token on a case-by-case basis to determine whether it meets the legal definition of a security. In other words, just because a digital asset is not Bitcoin does not automatically mean it’s a security.

Furthermore, regulatory oversight is necessary for the cryptocurrency market to mature and gain mainstream adoption. The lack of clear regulations has been a major roadblock for institutional investors, who are hesitant to invest in a market perceived as unregulated and risky. Clear regulations would also protect retail investors who may not have the knowledge or resources to navigate the complex world of cryptocurrencies.

To conclude, while Gensler’s statement that “everything other than Bitcoin” is a security may have caused some alarm in the cryptocurrency community, we believe that it’s important to view it in the context of the SEC’s broader approach to regulating digital assets. The SEC’s focus on investor protection and market integrity is crucial for the long-term success of the cryptocurrency market.

As the market continues to evolve, we expect the SEC’s approach to develop along with it, and we look forward to seeing how it unfolds. In the meantime, I hope the SEC can be more precise and take a more responsible stance when putting statements out to the market.

 

Source: https://cryptoslate.com/sec-chair-gensler-confirms-everything-other-than-bitcoin-is-a-security-implications-and-analysis/

Anndy Lian is an early blockchain adopter and experienced serial entrepreneur known for his work in the government sector. He is the best-selling author of “NFT: From Zero to Hero” and “Blockchain Revolution 2030”.

Currently, he serves as Chief Digital Advisor at the Mongolia Productivity Organization, championing national digitization. Prior to his current appointments, he was Chairman of BigONE Exchange, a crypto spot exchange ranked in the global top 30, and an Advisory Board Member for Hyundai DAC, the blockchain arm of South Korea’s largest car manufacturer, Hyundai Motor Group. Lian also played a pivotal role as Blockchain Advisor for the Asian Productivity Organisation (APO), an intergovernmental organization committed to improving productivity in the Asia-Pacific region.

An avid supporter of incubating start-ups, Anndy has also been a private investor for the past eight years. With a growth-investment mindset, he applies it strategically in the companies he chooses to be involved with. He believes that his current work with blockchain technology will revolutionise and redefine traditional businesses, and that the blockchain industry has to be “redecentralised”.
