![](https://sh.itjust.works/pictrs/image/a017c23a-943d-48a5-aa98-225a50e0ebd4.jpeg)
![](https://lemmy.world/pictrs/image/8f2046ae-5d2e-495f-b467-f7b14ccb4152.png)
Yeah, let’s just slap the missile equivalent of ChatGPT on a bunch of drone missiles, what could go wrong? /s

Seriously though, what happens if the AI driving the drone hallucinates? I wouldn’t want to be anywhere near these things when they’re testing them.
I wish I could permanently turn off the Amazon AI-generated BS in the product info search. I only use that search to find out what actual people have said about the thing I’m looking at, and I don’t trust an LLM not to hallucinate some random BS about something important I’m trying to figure out, thanks.