The US Copyright Office denied protection for AI images. Prompt engineers discovered they were actually data entry clerks.
Silicon Valley says AI is creative. The US Supreme Court and the Copyright Office say it's just automation. Inside the legal collapse of the 'AI artist' myth.
For the better part of three years, the venture-backed marketing machine has attempted to sell the public on a new class of "creative" genius. We were told that typing "cyberpunk cat in the style of Rembrandt" into a text box was an act of high-concept authorship, a digital brushstroke for the 21st century. Billions of dollars in valuation were staked on the idea that these machines weren't just tools, but collaborators—entities capable of generating intellectual property that could be owned, licensed, and protected.
The reality, as documented in the quiet, sterile hallways of the US legal system, is significantly less glamorous. Current US legal standards, cemented by the Thaler v. Perlmutter ruling and subsequent USCO policy updates, have systematically established that AI-generated outputs lack the human authorship required for protection. That makes the marketing of "AI creativity" a legal fiction: such content enters the public domain the moment it is created. Despite the confident proclamations of "prompt engineers," the law has increasingly come to view their labor as little more than highly descriptive data entry.
What happened: The wall of human authorship

The collapse of the AI creativity myth was not a single event, but a methodical dismantling by federal regulators and judges who remained unimpressed by Silicon Valley’s semantic gymnastics. The primary obstacle has been the statutory requirement for Human Authorship, defined as the requirement under the US Copyright Act that a work must be the product of human creative labor to receive protection.
The first major defeat arrived with Stephen Thaler and his "Creativity Machine." Thaler attempted to register an image titled "A Recent Entrance to Paradise," explicitly naming his AI system as the author. The US Copyright Office (USCO) rejected the application, a decision that was upheld in Thaler v. Perlmutter on August 18, 2023. US District Judge Beryl Howell was blunt in her assessment, writing that human authorship is a "bedrock requirement" of the legal framework.
This was followed by the case of Zarya of the Dawn, a comic book produced by Kris Kashtanova. While the USCO initially granted a registration, it partially revoked the copyright on February 21, 2023, after discovering the images were generated by Midjourney. The Office ruled that while the human-authored text and the selection/arrangement of the images were protected, the images themselves—the "art"—were not. They were simply products of an automated process, not creative expression.
The final nail in the coffin of the "AI artist" legal identity arrived in March 2026. The US Supreme Court refused to hear Stephen Thaler’s final appeal, effectively cementing the requirement for human authorship for the foreseeable future. There is no longer a plausible legal path for a work generated solely by a machine to receive federal protection in the United States.
The USCO’s March 16, 2023 Policy Statement now requires all applicants to explicitly disclose the use of AI. Failure to do so can result in the cancellation of the registration, effectively making "stealth AI" a liability for commercial publishers.
Why it matters: The Creative Double Bind
The legal failure of AI creativity exposes what researchers have termed the Creative Double Bind. This is the logical contradiction where AI users claim authorship based on "creative intent" while relying on an automated system to produce the actual expression they cannot produce themselves.
The user wants the credit for the result, but they want the machine to do the heavy lifting. In a legal sense, this makes the user a "commissioning agent" rather than an "artist." If you tell a painter to "make me a picture of a sad dog," you do not own the copyright to the specific brushstrokes or the emotional weight of the image; the painter does. When the "painter" is a Stochastic Parrot—a system that predicts and stitches together data based on probability rather than understanding or intent—there is no artist at all. Consequently, the work falls directly into the public domain.
| Human Labor | AI Output | Legal Status |
|---|---|---|
| Typing a prompt | Image Generation | Public Domain |
| Writing a script | Text Generation | Public Domain |
| Selecting AI images | Layout/Sequencing | Human Copyright (Limited) |
| Editing AI pixels | Modified Image | Human Copyright (Derivative only) |
According to an analysis by the Harvard Law Review, AI generation is technically a form of pattern reproduction rather than an expression of intent. Because the user does not control the specific expression of every pixel or word—the "traditional elements of authorship"—they cannot claim to be the source. This creates a zero-value asset problem for corporations: if an asset cannot be copyrighted, it cannot be licensed with exclusivity. A movie studio cannot prevent a competitor from using its AI-generated "lead actor" if that actor’s likeness was never protectable to begin with.
The 'Transformative Tool' counter-argument
Defenders of generative AI, including Stability AI and Midjourney, argue that their systems are "transformative tools" and that the iterative process of prompting constitutes sufficient creative control. They suggest that the "human spark" exists in the refinement of the prompt, the selection of seeds, and the "painting out" of artifacts.
However, the US Copyright Office has explicitly rejected this framing. In its March 2023 guidance, the Office compared prompting to commissioning an artist rather than being the artist. The ruling in Thaler v. Perlmutter clarified that because the user does not control the specific expression of every pixel, the machine is the generator, not the tool. A tool—like a camera or a brush—requires the human to make the expressive choices. An AI system makes those choices for the human based on a statistical average of its training data. Confidently calling yourself an "artist" because you described a scene to a computer is, in the eyes of the law, a misunderstanding of what art is.
What's next: The coming training data reckoning

The debate is shifting from "Can AI own things?" to "Did AI steal things?" While the courts have settled the authorship question, the liability question is just beginning. The class-action lawsuit Andersen v. Stability AI, filed in January 2023, remains the bellwether for this conflict.
The core of the issue is the Training Data Heist. These systems are trained on millions of copyrighted works without permission or compensation. If the "creativity" of the machine is actually just the compressed, uncompensated labor of human artists, then the entire industry is built on a foundation of infringement.
- Fair Use vs. Transformative Use: Companies argue that training is "transformative." Artists argue it is a "derivative machine" that directly competes with the people it robbed.
- Hybrid Authorship: Future litigation will likely focus on defining the "human spark" percentage. How much "Photoshopping" of an AI image is required before it becomes a human work?
- The Risk of Illegal Training: If the courts find that the training process itself is a copyright violation, even "public domain" AI outputs could be deemed the fruit of a poisonous tree.
The myth of the AI artist is legally dead
The evidence presented over the last three years supports the thesis that "AI creativity" is a marketing buzzword rather than a legal category. The Thaler ruling and the Kashtanova decision were not outliers; they were the application of a century of precedent that requires a human mind to be the "originating cause" of a work.
The marketing of these tools as "creative agents" was a documented attempt to bypass the costs of human labor. But by removing the human from the process, the companies also removed the legal value of the product. The Silicon Valley vision of an automated creator economy has run headlong into the bedrock of human authorship. For now, if a human didn't make it, the law doesn't care who prompted it. The "AI artist" is, and remains, a data entry clerk working for a machine that produces public property.