Artificial intelligence has led to some incredible developments and businesses, but the technology has also raised challenges and concerns. One such challenge involves AI generation tools like ChatGPT using artists' works as source material, a practice that has angered many people and led some of them to file lawsuits against the company behind the viral software.
A law firm in California filed a class action lawsuit seeking court-ordered cease and desist orders against the maker of a generative chatbot, alleging violations of several laws, including the California Privacy Act and the Digital Millennium Copyright Act, as well as misappropriation of private and copyrighted data belonging to users of apps, platforms, programs, or services that integrate ChatGPT. The data cited includes image data from Snapchat, financial details stored with Stripe, music preferences managed by Spotify, private conversation analysis on Slack and Microsoft Teams, and patient portals managed by MyChart, among many more.
The defendants have allegedly abandoned their original principle of developing AI to benefit humanity as a whole, instead focusing on winning an arms race for personal gain. According to the plaintiffs' accusations in the lawsuit, this continued misappropriation has caused irreparable damage.
Problems began this week when computational journalist Francesco Marconi claimed his own work was being used, without compensation, to train the ChatGPT tool. When he tweeted a prompt asking the program for a list of news sources it used for training purposes, ChatGPT named 20 outlets as training material.
Marconi's claim is just the latest in a long series of accusations against the company regarding its use of data in its products. Last March, Marconi joined hundreds of tech experts in publishing an open letter calling for a six-month pause on developing AI to study its potential impacts on humanity. The letter was a response to an alarmist report asserting that AI technology may pose risks comparable to nuclear weapons or pandemics.
CNN reports that this week's lawsuit, filed against a software developer for breach of privacy laws, marks a historic first in the US. The suit alleges the companies involved used unapproved data sources to train AI systems, and it seeks damages for lost income and royalties. It is unclear whether the company will settle or fight the lawsuit, and more media outlets have expressed displeasure with how ChatGPT uses their content. At stake is whether the company can demonstrate it holds valid licensing agreements for all the sources it used; if it cannot, an extensive legal battle may ensue, with wide repercussions across the industry.