Sci-Fi Writers and Comic-Con Just Slammed the Door on AI


According to TechCrunch, two major creative institutions recently took a hard line against generative AI. In December 2025, the Science Fiction and Fantasy Writers Association (SFWA) updated its rules for the prestigious Nebula Awards, deciding that works created “either wholly or partially” by large language models are ineligible. This came after an initial, more permissive rule requiring only disclosure sparked immediate backlash from members, forcing the board to apologize and revise. Separately, this month San Diego Comic-Con quietly changed the rules for its annual art show after artists complained, moving from allowing AI art to be displayed (but not sold) to a full ban on material created “partially or wholly” by AI. In both cases, the organizations reversed course after hearing from their communities, a sign that a firm stance is becoming the norm.


Why the backlash is so intense

Look, this isn’t just about being old-fashioned. The core issue, as writer Jason Sanford pointed out in his Genre Grapevine newsletter, is that these tools are built on what many see as theft: the mass scraping of copyrighted creative work without permission or compensation. So when SFWA first floated a “just disclose it” policy, it felt like a betrayal. It implied the organization was okay with legitimizing a process that undermines its own members’ livelihoods. The swift member revolt that forced a total ban shows how deep that sentiment runs. It’s a moral and economic stance, not just an artistic one.

The impossible line to draw

Here’s the thing, though. Sanford also nailed the huge problem with a total ban: where do you actually draw the line? He noted that if you use any modern online tool for research or word processing, you’re probably interacting with an LLM component somewhere. Does using Grammarly or a search engine’s AI summary feature disqualify you? The SFWA’s final rule is clear in intent but potentially messy in practice. How do you prove a writer used an LLM “in its creation”? It sets up a potential future of suspicion and unprovable accusations, which is its own kind of toxic environment for a creative community.

A deterrent in search of a crime

Comic-Con’s situation is maybe even more telling. The art show head, Glen Wooten, told artists the old “can display, can’t sell” rule had been a successful deterrent for years—nobody had even tried to submit AI art. But the very *possibility* that someone *could* was enough to spark outrage. That tells you how charged this issue is. The community demanded a symbolic “NO!” as reported by ComicsBeat, not just a commercial restriction. It’s about defining the space as human-only, full stop. The ban is a cultural flag being planted, more than a response to an actual flood of submissions.

Where does this leave us?

So what’s the endgame? These bans are important declarations of value. They say, “What we do here is human, for now.” But they also feel a bit like building a wall on a shifting sand dune. As AI gets baked into every software tool, that bright line will blur. And these are voluntary awards and shows—they can set any rules they want. But can the entire creative industry operate as a gated community? Probably not. The real fight is happening upstream, in courts and legislatures, about the data these models are trained on. These bans by SFWA and Comic-Con are a powerful shot across the bow in that larger war. They’re saying the creative class isn’t going to quietly accept this new “normal” being forced on them.
