Israeli intelligence sources reveal use of ‘Lavender’ system in Gaza war and claim permission given to kill civilians in pursuit of low-ranking militants
LLMs hallucinate all the time. The hallucination is the feature. Depending on how you design the neural network, you can get an AI that doesn't hallucinate. LLMs have to do it, because they're mimicking human speech patterns and predicting one of many possible responses.
A model that tries to predict locations of people likely wouldn’t work like that.
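A toy sketch of that contrast (purely illustrative Python, with made-up vocabulary and probabilities, not any real system): a next-token model's whole job is to emit a fluent continuation, so it always answers whether or not the answer is grounded in anything.

```python
import random

# Made-up distribution standing in for a language model's next-token
# probabilities. Nothing here checks facts; the model only knows which
# continuation "sounds" most plausible.
NEXT_TOKEN = {
    "The capital of Australia is": {"Canberra": 0.6, "Sydney": 0.4},
}

def continue_text(prompt):
    """Always returns *some* fluent continuation; there is no 'I don't know' branch."""
    dist = NEXT_TOKEN[prompt]
    tokens, weights = zip(*dist.items())
    return prompt + " " + random.choices(tokens, weights=weights)[0]

# Roughly 40% of the time this prints a confident, fluent, wrong answer.
print(continue_text("The capital of Australia is"))
```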
Maybe don’t use something that is rarely discussed without using the word “hallucination” in your plans to FUCKING KILL PEOPLE?
This AI isn't an LLM.
This AI isn't even an AI.
I mean, it probably has a neural network component.
Doesn’t mean that it won’t hallucinate. Or whatever you call an AI making up crap.
A model that tries to predict locations of people likely wouldn’t work like that.
“likely.”
Other AI systems can have hallucinations too.
The primary feature of LLMs is the hallucination.
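And a matching sketch for the other side of the argument (a hypothetical scoring model, not a description of Lavender or any real targeting system): a classifier can abstain when its score is low, but above the threshold a false positive is delivered with exactly the same confidence as a true one, which is the non-LLM version of "making up crap."

```python
import random

def score_target(features):
    """Stand-in for a trained scoring model; returns a probability-like value."""
    noisy = sum(features) / len(features) + random.gauss(0, 0.1)
    return min(1.0, max(0.0, noisy))

def classify(features, threshold=0.9):
    """Unlike a text generator, this can say "not sure" -- but a confidently
    wrong score still sails straight past the threshold."""
    score = score_target(features)
    if score < threshold:
        return "abstain (score=%.2f)" % score
    return "flagged (score=%.2f)" % score  # could still be a false positive

print(classify([0.95, 0.93, 0.97]))  # flagged, whether or not the label is right
```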