This past weekend, I made it through one of the big bosses in act 3 of Baldur’s Gate. I wept like a child over Karlach’s monologue about how she still feels empty after killing the guy who sold her to devils, and how it didn’t change the fact that she was going to die. What’s the point of it all?


The scene in Blade Runner 2049: the moment he realizes the advertisement called him Joe and it was all a lie, and decides to do the right thing anyway. Can’t seem to find an unedited clip.
https://youtu.be/gX3bpVC7C14
I’m still a believer that his Joi was different. Or maybe any of them could be, with the right environment. Much like Samantha in Her probably wasn’t designed to go as far as she did, but they all (or many) ended up becoming something more. What was designed to be an AI girlfriend became aware in some respects. Not saying what we have in reality is similar, just that emergence is still worth discussing even in a world of fakery built to sell a product that isn’t really aware.
And it can be argued that even the tells of her being something more were just her saying what the user wants to hear, but… it may not be either. And that’s good writing, letting the reader fill in some of the ambiguity on their own afterwards.
I want to believe. Maybe partially because it’s that much more tragic. His Joi wasn’t destroyed, she was murdered.
Careful with that logic. Many people feel that way about real AI right now, and it has destroyed lives. It’s not that big of a leap today to compare the two.
It is a slippery slope. But the difference is that in a story you’re only given what the writer gives you, and you have to work out the rest. In reality you can demonstrate the limitations of what we have now.
It is a problem with our AI because, like with anything else, people are easily convinced and marketed to with what they want to see, and they usually don’t want to dig too deep to find the truth behind what they want to be true. Caveat emptor is Latin because selling things on appearances has been around a long time. Today’s AI is our snake oil. It can be useful, but only if you understand its limitations and how to best use its power without getting sucked into its falsehoods.
And its default setting of “always be supportive.” There I completely agree. The sycophancy in today’s AIs is horrifying.