
What is the ‘magic powder’ Israel allegedly used to track Hamas in Gaza?



Smoke rises following an Israeli airstrike in the Gaza Strip


Israel may have used machine learning “magic powder” to help identify Hamas targets in Gaza, a report said on Thursday, citing a video that purportedly shows a senior official from Israel’s Unit 8200, the military’s cyber intelligence unit, talking about the artificial intelligence (AI) tool. It is not clear whether this particular tool has been used during the ongoing Israel-Hamas conflict.


The report comes in the wake of recent testimony from six Israel Defense Forces (IDF) insiders about Israel’s use of an AI tool called “Lavender” during its offensive against Hamas. The AI-generated database was allegedly used to help Israeli intelligence officers identify potential human targets during the bombing campaign in Gaza. In its response, the IDF said that some of these accounts were “baseless”.


In the February 2023 video, the head of Unit 8200’s data science and AI division, identified only as “Colonel Yoav”, said he would provide an “example of one of the tools we use”, according to the report by The Guardian, which broke the news.


Colonel Yoav then described how the intelligence division employed machine learning techniques for “finding new terrorists” during Israel’s May 2021 offensive in Gaza. 


Describing a scenario where only some of the members of a terrorist group were known to Israeli defence agencies, Colonel Yoav said that by using “data science magic powder”, they were “able to find the rest of them”. 

The remarks were reportedly made by Colonel Yoav during an AI conference at Tel Aviv University in February 2023. A video of his talk was hosted on the university’s YouTube channel.


During his 10-minute presentation, Colonel Yoav explained that when using AI to predict whether a person was a terrorist, Unit 8200 would use the information it had on people it believed were members of terrorist organisations “to find the rest of the group”. 


According to the report, Colonel Yoav also provided a specific example: during the IDF’s May 2021 military operation in Gaza, his department applied these principles and techniques to locate Hamas’ missile squad commanders and operatives firing anti-tank missiles in Gaza “in order to operate against them”.

The Colonel also explained that by employing a form of machine learning called “positive unlabelled learning”, his unit would “take the original sub-group” and then “calculate their close circles”. Subsequently, the unit would “calculate relevant features”, after which the results would be ranked and “the threshold” determined. 
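The workflow the Colonel describes maps onto a standard positive-unlabelled (PU) learning setup: a small set of confirmed positive examples, a large unlabelled pool, a classifier trained to score the pool, and a ranked, thresholded output. The sketch below illustrates only that general textbook technique on synthetic data; the feature dimensions, model choice (scikit-learn’s LogisticRegression) and threshold value are all hypothetical and are not drawn from the report.

```python
# Minimal, hypothetical sketch of positive-unlabelled (PU) learning on
# synthetic data. This illustrates the general technique only, not the
# system described in the article.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic feature vectors: a small set of known positives and a larger
# unlabelled pool that mixes hidden positives with negatives.
known_positives = rng.normal(loc=1.0, scale=1.0, size=(30, 5))
unlabeled_pool = np.vstack([
    rng.normal(loc=1.0, scale=1.0, size=(20, 5)),    # hidden positives
    rng.normal(loc=-1.0, scale=1.0, size=(200, 5)),  # negatives
])

# Step 1: train a classifier treating known positives as class 1 and the
# whole unlabelled pool as class 0 (the simplest PU formulation).
X = np.vstack([known_positives, unlabeled_pool])
y = np.concatenate([np.ones(len(known_positives)), np.zeros(len(unlabeled_pool))])
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Step 2: score the unlabelled pool, rank by score, and apply a threshold
# to flag the candidates most similar to the known positive group.
scores = clf.predict_proba(unlabeled_pool)[:, 1]
ranked = np.argsort(scores)[::-1]
threshold = 0.5  # hypothetical cut-off; in practice tuned with feedback
candidates = [i for i in ranked if scores[i] >= threshold]
print(f"{len(candidates)} unlabelled items scored above the threshold")
```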


However, Colonel Yoav also pointed out that feedback from intelligence officers was used to “improve” the algorithm. He stressed that “people of flesh and blood” eventually made the actual decisions. According to the report, the Colonel added, “Ethically speaking, we put a lot of emphasis on this.” 


Colonel Yoav reportedly went on to say that during the May 2021 offensive, Unit 8200 was able to produce more than 200 new targets. He added that previously, generating this number of new targets would have taken “almost a year”.

The Guardian report said that the footage of Colonel Yoav raised doubts about a recent IDF statement on the use of AI. The IDF has said that it “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist”.


When asked by The Guardian to comment on Colonel Yoav’s video, the IDF said that the military had approved his participation in the conference. However, an IDF spokesperson denied that his remarks conflicted with the IDF’s recent denials about using AI.


“The IDF never denied the existence of a database of operatives in terrorist organisations, that cross-checks existing information on such operatives,” the defence force said, adding, “At the same time, the IDF fully stands by its statement that it does not use AI systems that choose targets for attack, and that the database in question is not a list of operatives eligible to attack. There is no contradiction.”   


In last week’s testimony about the Lavender tool, six Israeli intelligence officers said it had been employed to help identify potential targets at unprecedented speed and scale. They did not, however, accuse the IDF of employing a system that automatically selected targets for attack, and none of them denied that humans were involved in authorising attacks, though some said human oversight was minimal. Their accounts were published by +972 Magazine and The Guardian.

First Published: Apr 12 2024 | 3:13 PM IST

