As someone who’s been on the hiring side, there are some legal constraints on what you can answer here, but I’ve always made a point of telling people who asked why they were rejected. However, I’m not in HR, so lots of people might get filtered out before I even got a chance to interview them.
Also, we asked candidates to do a take-home assignment and discussed their solution during the interview, so most people got a good understanding of why they were rejected. A couple of times, though, people asked afterwards, and I replied with the reasons we felt they weren’t at the level we were looking for, adding that we’d keep them in consideration for a more junior role if there was ever an opening.
Until there’s an AGI, that won’t happen in any meaningful way. Why? Because here’s something that matches your criteria:
You get a text-based game where everything you try ends with you dead, killed by a corporation, unless you discover that if you look at the ground where you start there’s a penny from the year the murderer is from, and then you have to deduce who the murderer is (it changes every time) from that alone, because that’s the sort of thing Sherlock Holmes would do. No, it’s not fun; it’s frustrating, and essentially luck. If that’s fun to you, I have an infinitely replayable game for you: flip a coin and see how many times you can get heads in a row. If you get to 16, you win.
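To make the point concrete, here’s a minimal sketch of that coin-flip “game” (the function name and structure are my own illustration, not from any real game):

```python
import random

def coin_game(target=16):
    """You 'win' by flipping `target` heads in a row; any tail ends the run."""
    streak = 0
    while streak < target:
        if random.random() < 0.5:  # heads
            streak += 1
        else:                      # tails: run over, you lose
            return False
    return True

# The odds of winning a single run are 1 / 2**16, roughly 1 in 65,536 —
# pure luck, with no skill or decision-making involved.
print(1 / 2 ** 16)
```

That 1-in-65,536 chance is the whole “game”: infinitely replayable, and exactly as hollow as a puzzle whose only solution is a lucky guess.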
The thing is, LLMs don’t understand “fun”; they’re just auto-completes, so they’ll produce boring or unfair stuff. And you’d need to go so deep into the specifics of your game that you’re essentially programming it, so at the end of the day it’s not something an end user would use.
That’s not to say there aren’t interesting uses for LLMs inside games, but the moment you can prompt an AI into producing an entire game that’s actually fun to play, that same AI will be able to replace almost every job in the world.