Also, "AI" has been in gaming, especially mobile gaming, for a literal decade already.
Household name game studios have had custom AI art asset tooling for a long time that can create art quickly, using their specific style.
AI is a tool and as Steve Jobs said, you can hold it wrong. It's like plastic surgery, you only notice the bad ones and object to them. An expert might detect the better jobs, but the regular folk don't know and for the most part don't care unless someone else tells them to care.
Another example is upscaled texture mods, which were a trend for a long while before 'large language models' took off. Mods to improve a game's textures are definitely not new, and that probably includes pulling from other sources, but the ability to automate/industrialize the process (and, presumably, a lot of available training material) meant there was a big wave of that mod category a few years back. My impression is that gamers will overlook a lot so long as it's 'free', or at least they are very anti-business (even if the industry they enjoy relies upon it); the moment money is involved they suddenly care a lot about the whole fabric being handmade and need verification that everyone involved was handsomely rewarded.
The issue isn't objective quality or realism, it's sticking to a specific style consistently.
_Everyone_ (and their grandmother) can instantly tell a ChatGPT-generated image; it has a very distinct style, and in my experience no amount of prompting will make it go away. Same for Grok, and to a lesser degree Google's stuff.
What the industry needs (and uses) is something they can feed a, say, wall texture into and the AI workflow will produce a summer, winter and fall variant of that - in the exact style the specific game is using.
And stablediffusion-web-ui before that and others, yes.
When googling, txt2img and img2img (or txt2video, img2video, etc. for video) are useful terms, since they encapsulate the usage in a few words. One could search for img2video comfyui workflows, for example.
I thought it would be useful for the conversation to provide these terms, not mentioned before in the thread.
I think that's a different category, though. Those backgrounds are actual video recordings of real places, not 3D environments modeled from scratch. It looks 'real' because the background actually exists.
Your case would have been stronger if you had used Mad Max: Fury Road, or even Titanic, as an example, rather than a mediocre TV show nobody remembers. Ugly Betty used green screens to make production cheaper; that did not improve the show (although it may have improved the profit margins). Mad Max: Fury Road, on the other hand, used CGI to significantly improve the visual experience. The added CGI probably increased the cost of the production, and it is one of the greatest, most awesome movies ever made.
Actually, if you look at the scene from Grey's Anatomy [0:54], you can see where CGI is used to improve the scene (rather than to cut costs), and you get this amazing scene of the Washington State Ferry crash.
I think you can see the parallels here. When people say they hate AI, they are generally referring to the sloppy stuff it generates. It has enabled a proliferation of cheap slop, and with few exceptions it seems like generating cheap slop is all it does (those exceptions being specialized tools, e.g. in image processing software).
Being an award-winning show or movie does not exclude being a forgettable cash grab.
However, my counterexamples included Grey's Anatomy, Mad Max, and Titanic. None of these is exactly considered high literature (and all of them are award-winning as well).
And then they go around labeling EVERYTHING as AI.