top | item 39929905


koutetsu | 1 year ago

Looking at this from a machine learning perspective, the risk of bias is even higher in these cases because of data drift (members could change sites, start dressing differently, etc.) and class imbalance in the dataset (far fewer Hamas members than civilians in Gaza).
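The class-imbalance point is easy to illustrate with a toy sketch (the numbers are made up for illustration, nothing here is from the actual system): under a heavy imbalance, a model that simply labels everyone as the majority class scores high accuracy while detecting nothing, which is exactly why raw accuracy hides the cost of false positives on the rare class.

```python
# Toy illustration of class imbalance (hypothetical numbers):
# 10 rare positives vs. 990 negatives, and a "model" that always
# predicts the majority class.
def accuracy(preds, labels):
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

labels = [1] * 10 + [0] * 990          # 1% positive class
always_negative = [0] * len(labels)    # trivial majority-class predictor

print(accuracy(always_negative, labels))  # 0.99 accuracy, yet 0% recall
```

Metrics like precision and recall on the minority class are the usual way around this, but they require knowing the true labels, which is precisely what is uncertain here.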

Additionally, judging from the amount of data such models would have to go through in order to make predictions (social media, camera footage, etc.), I would assume they are using neural networks. This type of model performs best on raw, unprocessed data, e.g. raw camera footage rather than hand-engineered features like "wears a scarf" or "carrying a weapon". Neural networks are also well known to be black boxes whose predictions cannot really be explained [0].

We can still comment on these topics based on assumptions and previous experience. I don't have experience working in the military field, but I do have experience working in the AI field, and these are strong assumptions I am making.

[0] https://arxiv.org/abs/1811.10154
