Players have been asking for the ability to filter out games made with Gen AI.
We've added an automatic tag on SteamDB based on the AI gen content disclosures on the store pages.
The sad part is, one day in the (far) future, when real AI (not LLMs) is an actual thing and can code great games from scratch, there will be so much animosity towards AI by then that it’ll probably never see its games played.
This comment is licensed under CC BY-NC-SA 4.0
Nah, they’ll just brand it as “Next Gen AI” or “True AI” or something. Kind of like how antivirus became “Endpoint Detection and Response”
It’s already got a name: AGI, artificial general intelligence.
And what exactly that means is not really defined.
“True AI” would at least be fitting.
I like human-created art because it’s created by humans. If AI generated the greatest song, image, or video game, I would not care; I don’t want it.
Your opinion seems prejudicial, focusing on the creator of the art, and not the art itself.
This comment is licensed under CC BY-NC-SA 4.0
Well, to be fair, I don’t like art made by humans who are assholes either.
Though I don’t agree that AI is inherently equal to those human assholes, especially since for most of the important use cases (i.e., not spamming AI slop all over galleries online), an artist is usually the one influencing the AI tools, not the other way around.
Your comment seems loaded with purposefully inflammatory language intended to align AI with groups of actual real people who experience prejudice in the real world, rather than with the corporations that have a vested interest in not paying artists, and brother, as a trans person, it makes you look like a real silly goose.
Pointing out that someone justifies whether or not they like something by who made it, rather than by judging the thing itself, is inflammatory?
I remember back in the ’80s when people were hating on a Top 40 song because it was made by a group whose singer was gay, and I thought that was very wrong: the song itself should be judged on its own merits, not by who was singing it.
Weird how those lessons learned fade away, needing to be learned again.
This comment is licensed under CC BY-NC-SA 4.0
AI isn’t human. Stop pretending it is. AI takes advantage of humans. Your argument is invalid.
I did say previously “in the future”: some day, not today. LLMs are not AI, at least not the kind of AI that I’m talking about.
But even taking your point, do we let a human always keep a job that an AI can do much more efficiently? What job protections should humans have from AIs? And for that matter, what job protections should humans have today, right now, regardless of AI? (For the record, I support Unions.)
We all need to figure this out, right now, as corporations are salivating at the thought of an AI that can replace a human being’s job.
This comment is licensed under CC BY-NC-SA 4.0
No amount of passage of time is going to make AI human. You’re all suggesting that in the future AI will have feelings and emotions and will care that people are prejudiced against it. You are arguing against a hypothetical that you have created in your head and that isn’t necessarily going to be a reality.
No, not at all. I’m saying that future AI will not just be dumb LLMs; it’ll be more like functional code that can literally think for itself, able to create and learn (like humans do) to do jobs, and to do those jobs well. Robots with brains, etc., like you see in the movies.
This comment is licensed under CC BY-NC-SA 4.0
Once they actually produce great games, you’ll probably want to play them. People didn’t stop buying products because they were made by machines instead of artisans.
Humans still controlled the machines.
AI takes the human creativity out of the equation.
Yes, it’s different in the creative aspect, but it’s similar in the job loss aspect.
Yes, that’s true.
I believe we should be able to embrace new technology, and people’s lives should be made easier with it. We should be able to eliminate jobs and simultaneously ease financial burdens with the efficiency increase. But I don’t have an MBA, so what do I know 🤷♂️
Reminder: you still have to instruct the machine.
Yes, but writing G-code for a CNC machine isn’t taking the creativity away from the human. Even programs that write the G-code for you are still following the human’s design (see the sketch below). AI-generated art does not follow the human design; it generates its own.*
*Obviously other than art theft, which I think doesn’t count.
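To make that concrete, here is a toy sketch of a program that “writes the G-code for you” (the rectangle, its dimensions, and the feed rate are invented for illustration). Every move it emits is a straight transcription of numbers a human chose; the design decisions never leave the human’s hands.

```python
# Toy sketch: a G-code generator for a simple rectangle outline.
# The program only formats the toolpath; the width, height, and feed rate
# (the actual "design") come from the human operator.
def rectangle_gcode(width_mm: float, height_mm: float, feed_mm_min: float = 300.0) -> str:
    """Emit G-code tracing a width x height rectangle starting at the origin."""
    lines = [
        "G21 ; units: millimetres",
        "G90 ; absolute positioning",
        "G0 X0 Y0 ; rapid move to the start corner",
        f"G1 X{width_mm:.3f} Y0 F{feed_mm_min:.0f} ; cut along the bottom edge",
        f"G1 X{width_mm:.3f} Y{height_mm:.3f} ; cut up the right edge",
        f"G1 X0 Y{height_mm:.3f} ; cut along the top edge",
        "G1 X0 Y0 ; cut down the left edge, back to start",
        "M2 ; program end",
    ]
    return "\n".join(lines)

if __name__ == "__main__":
    # The human's design input: a 50 mm x 30 mm rectangle (values are made up).
    print(rectangle_gcode(width_mm=50.0, height_mm=30.0))
```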
Well, there are those who like throwing sabots into the machinery, so you’re not guaranteed people would ignore the AI-created nature of a great game when deciding whether to buy or play it. You’re already seeing a constant “No AI here!” mindset occurring.
But at some point, AI will be creating, especially if Capitalism can see it succeed and remove the need to pay for workers. We need to think about job-protecting laws today that are just and even-handed, not just try to stiff-arm AI creation, as that won’t work long term.
This comment is licensed under CC BY-NC-SA 4.0
I think what we need to protect is the quality of life rather than the jobs. I wish for a 20h work week at the same QoL.
I wouldn’t disagree with that. Today’s reality is that you need a job to obtain a QoL (aka ‘pay the bills’). If we could get to a place as a species where three- or four-day work weeks were the norm, that would be fine by me.
I’m assuming that at some point in our species’ future we’ll be in a post-scarcity place, and jobs as we know them now won’t be needed. Instead people will have ‘hobbies’ that they enjoy doing. That’s assuming the Morlocks don’t eat all the Eloi before post-scarcity arrives, that is.
This comment is licensed under CC BY-NC-SA 4.0
Idgaf if AI exists; I just don’t want it replacing people without warning where people are way better for the job.
Agreed. We’re going to need laws for that though, and right now Congress only listens to Corporations, and Corporations want AI to get rid of those pesky workers that drain away their profits.
But also, you gotta understand that at some point, for some things, AI will be better than humans at particular jobs. When that happens, what then? Force-keep the human on the job, or retrain them, or just tell them “sucks to be you, have a nice day” and show them the door, or something else???
This is really the beginning of a monumental time for the species, as big as the introduction of the Internet was. Better start figuring this shit out now, instead of (metaphorically) just covering our ears and yelling “LA! LA! LA! LA! LA! I CAN’T HEAR YOU!” trying to ignore the whole thing.
This comment is licensed under CC BY-NC-SA 4.0
Totally agree re: laws/guardrails. I’m just saying not all detractors are fully against AI, or blindly against it for that matter.
Arguably the point of having machines do the work for us is that they’re NOT sentient.
Potentially. Since we don’t know how any of it works because it doesn’t exist, it’s entirely possible that intelligence requires sentience in order to be recognizable as what we would mean by “intelligence”.
If the AI considered the work trivial, or if it could do it faster or more precisely than a human, those would also be reasons to desire one.
Alternatively, we could design them to just enjoy doing what we need. Knowing they were built to like a thing wouldn’t make them not like it. Food is tasty to motivate me to get the energy I need to live, and knowing that doesn’t lessen my enjoyment.
Ah yes. We are but benevolent Masters. See? The slave LIKES doing the work!
In the case of an AI it could actually be plausible, like how bees make honey without our coercion.
It’s still exploitation to engineer a sentient being to enjoy your drudgery, but at least it’s not cruel.
Right, continuing the metaphorical wormhole…
A bee would make a great game for bees, assuming they understand or care about play. But to make a game for people, they would need an empathic understanding of what play is for a human. I guess this is a question of what you consider “intelligence” to be and to what extent something would need to replicate it to achieve that.
My understanding is that human-relatable intelligence would require an indistinguishable level of empathy (indistinguishable from the meat processor). That would more or less necessitate indistinguishable self-awareness, criticism, and creativity. In that case all you could do is limit access to core rules via hardware, and those rules would need to be omniscient. Basically prison. A life sentence of slavery for a self-aware (as best we can guess) thing.
Well, we’re discussing a lot of hypothetical things here.
I wasn’t referring to bees making games, but to bees making honey. It’s just something they do that we get value from without needing to persuade them. We exploit it and facilitate it but if we didn’t they would still make honey.
I don’t know that something has to be identical to humans to make fun games for us. I’ve regularly done fun and entertaining things for cats and dogs that I wouldn’t enjoy in the slightest.
It’s less a question of comprehension or awareness than of motivation. If we can make an AI feel motivated to do what we need, it doesn’t matter if it understands why it feels that motivation. There are humans who feel motivated to make games purely because they enjoy the process.
I’m not entirely sure what you’re talking about with the need for omniscient hardware and prison.
Is it? Or is it so companies don’t have to pay out salaries and can increase profits on AI-generated work, regardless of whether the AI is sentient or not?
This comment is licensed under CC BY-NC-SA 4.0
Clearly. Sentience would imply some sense of internal thought or self-awareness, an ability to feel something… so LLMs are better since they’re just machines. Though I’m sure they’d have no qualms with driving slaves.
I’m not talking about sentience per se, but about how any “AI” would think: lookups (LLMs) vs. synthesized, on-the-fly thinking (mimicking the human brain’s processing).
This comment is licensed under CC BY-NC-SA 4.0
Hrmm. I guess I don’t believe the idea that you can make a game that really connects on an empathic, emotional level without having those experiences as the author. Anything short of that and you’re just copying the motions of sentiment, which brings us back to the same plagiarism problem with LLMs and other “AI” models. It’s fine for CoD 57, but for it to have new ideas we need to give it one, because it is definitionally not creative. Even hallucinations are just bad calculations on the source. Though they could inspire someone to have a new idea, which I might argue is their only artistic purpose beyond simple tooling.
I thoroughly believe machines should be doing labor to improve the human condition so we can make art. Even making a “fun” game requires an understanding of experience. A simulacrum is the opposite, soulless at best (in the artistic sense).
If I did consider a machine sentient, my ethics would then develop an imperative to treat it as such. I’ll take a sledgehammer to a printer, but I’m going to show an animal care and respect.
Cells within cells.
Interlinked.
This post is unsettling. While LLMs definitely aren’t reasoning entities, the point is absolutely bang on…
But at the same time, it feels like a comment from a bot.
Is this a bot?
Imma feed your comment into an llm and your magic spell can’t stop me
It’s not a magic spell, it’s laying down a marker.
lol! And you’re too late, Google beat you to it. But still, laws will catch up some day, and when they do, I’ll be there. 😈
This comment is licensed under CC BY-NC-SA 4.0