@tomtrottel @noelreports AFAIK hallucinations apply only to generative AI like LLMs or image generators. Image misclassification (mistake) is not hallucination (making things up).
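To make the distinction concrete, here is a toy sketch (all names and the random scoring are invented for illustration, not any real model): a closed-set classifier can only ever return one of its known labels, so its worst failure is a wrong label, whereas a generative model composes output freely and can produce fluent but unfounded content.

```python
import random

# Toy closed-set "classifier": it can only answer with one of its known
# labels, so its failure mode is picking the WRONG label
# (misclassification), never inventing a new one.
LABELS = ["tank", "truck", "car", "tractor"]

def classify(image_features: list[float]) -> str:
    # Stand-in for a real model: argmax over fake per-label scores.
    scores = {label: random.random() for label in LABELS}
    return max(scores, key=scores.get)

# Toy "generative" model: it composes output token by token, so it can
# emit fluent text with no grounding at all (hallucination).
TOKENS = ["The", "target", "is", "a", "red", "tank", "near", "the", "bridge"]

def generate(prompt: str, length: int = 8) -> str:
    return " ".join(random.choice(TOKENS) for _ in range(length))

if __name__ == "__main__":
    print(classify([0.1, 0.9]))             # always one of LABELS, right or wrong
    print(generate("Describe the scene"))   # free-form, possibly ungrounded
```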
Anyway, as the post says: "weapons that can track moving targets from 1km away". So they will be piloted to within 1 km or less of the target, and only then does image recognition take over.
@elgregor thanks for that, I will update my knowledge of object recognition techniques, since I am not sure right now, but I clearly saw a lot of image misclassification errors, though that was a while ago. Your second point is also valid; my morning brain seems not to have taken it into consideration.