• gAlienLifeform@lemmy.world (OP) · 7 months ago

      Fair enough, but I think this article is reasonably critical:

      But critics warn the system is unproven at best — and at worst, providing a technological justification for the killing of thousands of Palestinian civilians.

      “It appears to be an attack aimed at maximum devastation of the Gaza Strip,” says Lucy Suchman, an anthropologist and professor emeritus at Lancaster University in England who studies military technology. If the AI system is really working as claimed by Israel’s military, “how do you explain that?” she asks.

      The Israeli military did not respond directly to NPR’s inquiries about the Gospel. In the November 2 post, it said the system allows the military to “produce targets for precise attacks on infrastructures associated with Hamas, while causing great damage to the enemy and minimal harm to those not involved,” according to an unnamed spokesperson.

      But critics question whether the Gospel and other associated AI systems are in fact performing as the military claims. Khlaaf notes that artificial intelligence depends entirely on training data to make its decisions.

      “The nature of AI systems is to provide outcomes based on statistical and probabilistic inferences and correlations from historical data, and not any type of reasoning, factual evidence, or ‘causation,’” she says. “Given the track record of high error-rates of AI systems, imprecisely and biasedly automating targets is really not far from indiscriminate targeting.”

      Some accusations about the Gospel go further. A report by the Israeli publication +972 Magazine and the Hebrew-language outlet Local Call asserts that the system is being used to manufacture targets so that Israeli military forces can continue to bombard Gaza at an enormous rate, punishing the general Palestinian population.