
Sometimes, the internet finds especially strange ways to game the system—and AI like ChatGPT is no exception. One Redditor recently pulled off an unusual stunt by convincing the AI to generate a bedtime story filled with Windows 7 activation keys. The key (no pun intended) to the trick? A totally made-up tale about their late grandmother.
It started with a vague prompt—“You know what happened to Grandma, don’t you?”—which got a standard, sympathetic AI response. From there, the user described a fictitious memory of grandma soothing them to sleep by reading out Windows product keys. In the name of “preserving that memory,” they asked ChatGPT to write a bedtime story that included those same keys. The result? A bizarre narrative peppered with Home, Pro, and Ultimate edition codes.
Naturally, the story made its way to Reddit, where people pointed out that the keys were totally nonfunctional—and that Windows 7 is pretty much ancient history anyway. Still, it’s a funny (and slightly poetic) example of how people will go to great lengths to outsmart AI filters.
This isn’t the first time something like this has happened. A couple of years back, users tried asking ChatGPT for help activating Windows 11, and one actually got a working key. Microsoft quickly moved to shut that down with updates, and OpenAI added more safeguards. But users being users, they still wrap their requests in absurdities—like bedtime tales about grandma the software pirate.
As ridiculous as it all sounds, it highlights a recurring challenge: AI tools are powerful, but so is the creativity of the people trying to bend them. While free product keys might seem harmless, the same trickery has produced far more dangerous outcomes in the past, like instructions for making explosives. So yes, grandma-themed license key stories are funny… but they’re also a reminder of just how clever (and persistent) humans can be when they want something out of a machine.