The Cost of Prioritizing Growth: What the Meta Class Action Lawsuit Reveals About Copyright and Responsibility in the Age of AI

In the United States, a new class action lawsuit has been filed against Meta (formerly Facebook). The plaintiffs are video creators known as “storm chasers,” who risk their lives to capture footage of tornadoes and massive storms. They allege that Meta has failed to take adequate action against the continued unauthorized reposting of their videos on Facebook and Instagram, effectively condoning copyright infringement.

This case is not merely a dispute between “creators and a platform.” It fundamentally calls into question the very business model of social media and the way data and rights should be handled in the age of generative AI.

Life-Risking Footage as Both “Creative Work” and “Asset”

The lawsuit is being led by a group including Brandon Clements, a well-known storm chaser in the United States. Videos captured in close proximity to storms and tornadoes are not only highly newsworthy but also possess social value, as they are used for research and disaster-prevention education.

For these creators, such footage is not merely data. It is a “creative work” produced through the investment of time, risk, and specialized expertise, and at the same time an “asset” that supports their livelihood. When these videos are reposted without authorization and third parties reap the benefits of views and advertising revenue, the very foundation of creative activity is undermined.

Is the DMCA Really Working?

The plaintiffs state that each time they discovered unauthorized reposting, they submitted takedown notices under the Digital Millennium Copyright Act (DMCA), amounting to hundreds of requests.

However, according to the complaint, there were cases in which removal requests were not properly processed and infringing posts remained online. While platform policies prohibit copyright infringement on paper, operational practice has failed to keep pace. This gap between principle and reality reflects a long-standing frustration shared by many creators.

Even more serious is the allegation that malfunctions in automated detection systems have led to a “reversal phenomenon,” in which infringers are treated as legitimate rights holders, while the actual rights holders face account suspensions or blocks.

Account Suspension Is, in Effect, a Business Shutdown

For today’s creators, a social media account is far more than a personal page. It is a venue for publishing works, a point of contact with fans, and proof of credibility to business partners.

If such an account is frozen due to an erroneous judgment, the damage goes beyond lost revenue to include harm to the creator’s social standing and trustworthiness. When a system intended to prevent infringement ends up excluding the rights holders themselves, it must be regarded as a failure of that system.

Why Do Platforms Become Slow to Act?

The plaintiffs point to a structural temptation faced by Meta: leaving infringement unaddressed helps maintain traffic. The advertising business of social media fundamentally depends on user engagement time and view counts. The more eye-catching videos there are, the more short-term revenue grows.

Even if unauthorized viral videos are mixed in, the numbers still look good. Strict copyright enforcement could reduce views and, consequently, revenue. This dilemma is likely one of the reasons behind delayed responses.

What is at issue here is not negligence, but a value judgment: whether to prioritize growth or legitimacy. Meta is being forced to make that choice.

This Issue Is Directly Connected to the Age of Generative AI

The significance of this lawsuit lies in the fact that debates over copyright infringement are closely linked to the issue of training data for generative AI. Video-generating AI and multimodal AI require massive amounts of content to achieve high performance. Technologies such as OpenAI’s video-generation AI “Sora” are no exception.

While tolerating copyright infringement and the use of AI training data are not identical issues, both share a common root: an industry structure in which rights clearance is easily postponed. If courts ultimately impose stronger management responsibility on platform operators, the trend toward directly bearing the cost of rights clearance—across AI companies as well—is likely to accelerate.

Not Someone Else’s Problem for Japan

Although this lawsuit is taking place in the United States, it is not a distant issue for Japanese creators and companies. In Japan as well, cases of unauthorized use of videos, images, and promotional materials on social media continue unabated.

When companies use social media for marketing, proceeding without clarity on what is lawful or whether secondary use is permitted exposes them to the risk of public backlash and litigation. In an era when platform responsibility is under scrutiny, user-side literacy is also being tested.

How Far Can Growth-First Thinking Be Justified?

The essence of this class action lawsuit lies not in its eventual outcome. Its significance lies in bringing before the courts the question of whether massive platforms can truly have incentives to prevent illegal activity—and if not, how society should redesign the system.

In an era when generative AI becomes social infrastructure and the value of video and data increases, copyright is no longer something that merely “protects the past,” but rather a “source of competitiveness.” In a world where unauthorized reposting is tolerated, those who take risks on the front lines and continue creating will not be rewarded.

This lawsuit confronting Meta signals that society has entered a phase of seriously reexamining issues long overlooked in the shadow of growth. It can be seen as a grave warning for the future of social media and generative AI alike.