Everyone Uses AI, No One Admits It: Inside the New Creative Grey Zone in Music and Games

The Silent Normalization of Generative AI in Creative Work

Across music and games, generative AI has slipped from novelty to infrastructure, but almost no one wants to say it out loud. In AAA game development, reporters describe generative AI as “pretty much an industry standard,” with major studios quietly weaving tools like Claude into daily workflows for writing, design support, and other development tasks. Executives have even highlighted respected publishers such as Capcom as adopters, suggesting that automation is increasingly seen as necessary just to stay competitive at today’s scale. In electronic music, insiders say the same thing: everyone is using AI tools, from prompt-based apps like Suno to AI-assisted set building, but few will admit it publicly. Instead, artists amplify stories of “fully human” projects to signal authenticity while keeping AI assistance in the background, creating a widening gap between how creative work is made and how it is marketed.

Electronic Music’s Open Secret: AI Tools Behind ‘Human’ Sounds

Electronic music has always thrived on new technology, from modular synths to drum machines and DAWs. Today’s twist is that generative AI tools power a growing share of that experimentation, yet they’re treated like a guilty secret. Industry voices note that “everyone is using these tools, artists at all levels – but they don't want to talk about it,” even as performers celebrate strikingly human achievements like complex multilingual vocals specifically framed as “without AI assistance.” Part of the tension lies in ethics and copyright: large models are often trained on unlicensed creative work, raising questions about the “wholesale ripping of people's creative works” and who gets paid. At the same time, AI-assisted creativity is pitched as just another step in a long history of democratizing music-making, from sampling to bedroom production. The result is a culture where artists quietly lean on AI for speed and inspiration while publicly foregrounding the most human aspects of their process.

AAA Game Development: AI Everywhere, Disclosure Nowhere

In big-budget games, the debate over generative AI has become what one journalist calls an “impossible conversation.” According to reporting, almost every major studio now integrates generative models into production, using tools like Claude to streamline routine tasks and AI suites such as those promoted by cloud providers to manage ballooning scope and complexity. Even acclaimed developers feel they “cannot compete” without some level of automation, yet few are willing to be transparent about how deeply AI is embedded. The fear is twofold: audiences angry about “AI slop” and a perceived loss of craft, and internal concerns around unions, job security, and unresolved legal risks tied to training data. By keeping usage under wraps, studios preserve the marketing narrative of handcrafted worlds and stories, but they also risk a backlash if players later discover how much invisible automation shaped the games they love.

Playdate’s Generative AI Ban: A Countermodel of Explicit Values

Indie handheld maker Panic offers a striking counterexample to the hush-hush approach. For its Playdate platform, the company has implemented an explicit ban on generative AI covering art, music, and writing in games sold through its Catalog storefront. Developers may still use AI coding tools such as GitHub Copilot, but only with clear disclosure. Panic frames this as both a quality filter and a community promise, positioning Playdate as a deliberately human-scale, handcrafted space in contrast to larger storefronts like Steam or console marketplaces that still permit AI-generated content. The policy emerged after a featured game was found to have used models like ChatGPT and AI-assisted coding, revealing a mismatch between community expectations and current practices. By drawing a bright line between creative content and technical aids, Panic turns AI policy into part of its brand identity and tests whether a transparent, human-first stance can be a selling point.

Do People Really Care, or Is It About Trust?

For listeners and players, the core question is less whether AI is used at all and more whether they can trust how that use is presented. Undisclosed AI assistance in music or AAA game development rarely changes the immediate experience: a great track still bangs, a well-crafted game still captivates. What’s at stake is authenticity, labor, and consent: who did the work, who was used as training material, and who gets credit. When companies hide AI use, they protect their “handmade” marketing but erode long-term confidence. A more sustainable path would treat AI like any other tool while embedding disclosure: clear labels on storefronts, brief notes in credits about where generative systems shaped writing, visuals, or sound, and opt-in badges for fully human-made content. Consumers who care can then seek out those labels, while others simply judge by quality, not by the toolchain behind it.
