I have editorialized the date into the title to provide context for yesterday's/today's news about another employee protesting Microsoft's involvement with the Israeli military. If requested I can remove it from the title.
Gun manufacturer sells guns for war but denies they were used to harm people in war.
“We call them Love Bombs. Like Bombs. But like they love you for it.”
“that makes no sense.”
“no no, it’s a bomb that dispenses propaganda!”
“Still makes no sense.”
“Well, you’d love the person that dropped TP if you needed TP, too.”
Make Bombs Poop Again
Ok, here’s a question for you:
Let’s say Microsoft counters with “Our terms and conditions were very clear that this software should not be used in war/genocides/holocausts”
Is that good enough? Whose responsibility is it to check? Are vendors always supposed to keep tabs on their customers, or is it a good enough excuse to say “naughty customer, we told you not to do that”?
I’m not defending any actions here, but I always wonder whether people want platforms that monitor and police everything, or whether they value privacy and trust more.
If you are selling military systems, “privacy” does not apply. In the same way, if you are selling systems that help companies illegally dump waste, you can hardly claim that this is their “private” matter. It is common in many countries that selling certain chemicals known to be usable for producing illicit drugs or explosives requires the buyer to provide a credible declaration of intended use for a legitimate purpose.
In criminal law, if you give tools to commit a crime to someone who you know will commit a crime or is known to commit crimes, you are complicit and you will be charged. Again, this has nothing to do with privacy. But even where it is not legally required: if someone comes into a gun shop, meets all the legal requirements, and asks the seller how many children can be killed with this or that weapon in 5 minutes, the seller absolutely will be complicit if he still sells this person a weapon.
Now back to Israel. There are extensive reports about these tools being used to target people in Gaza for killing. There are reports about these tools being used to maximize the number of targets marked as “legitimate”, to tweak the numbers of “accepted casualties” to allow for more bombing, and, last but not least, with “Where’s Daddy” to deliberately massacre a target’s entire family instead of killing only that target. This is just blatant murder.
Microsoft knows about this. They therefore have a duty to end their complicity. To get back to the school shooter example: it would not be enough for the school shooter to sign a paper in the weapons shop promising not to shoot up a school. He has made his intent clear, and it is obvious he will not be dissuaded by the prospect of breaking laws.
Is that good enough? Whose responsibility is it to check? Are vendors always supposed to keep tabs on their customers, or is it a good enough excuse to say “naughty customer, we told you not to do that”?
If you create a rule, be it a law, a clause in a license agreement, or parents telling kids how to behave, it needs to be enforceable. Which means there needs to be some mechanism for identifying people who violate it.
The obligation is on the vendor to ensure their ToS is complied with, and to have mechanisms in place to validate that. Just saying “well, that’s against our ToS” and expecting everyone to follow it is kinda like making a law that says “you’re not allowed to think about the color blue.”
And by the way, if you think Win 11 isn’t telling MS everything you do, I got news for you. Now, if Win 11 or whatever tool they’re using is reporting on what the [defense contractor with classified secrets] is doing, that’s a different matter. But it’s probably not actually a rule, and that’s probably some spin doc spinning up the bullshit machine.
I get it - Fuck Israel, Fuck Microsoft.
But if MS says “Israel lied on their ToS and said that this AI was only for educational purposes”, you’d prefer that they violate every student’s privacy in case they were secretly working for Mossad?
No?
We’re not talking about Israeli educational institutions.
If you’re going to have a rule, you need to have a way to enforce the rule. Which is why MS wouldn’t put it in the ToS. That, and Israel wouldn’t sign that contract anyway. Every contract is unique.
saying “oh it’s against our ToS” is pure bullshit.
Israel lied on their ToS -> we remove access, as they have been violating the contract. Again, this doesn’t need to involve anyone’s privacy.
These contracts aren’t with some university students, though. They are directly with the Israeli military. And if they were with any universities in Israel, well, those have close ties with the military, including running logistics and IT infrastructure.
In a broader sense, these organizations are in violation of various laws and engaged in serious crimes. There are enough plausible indications that the Microsoft tools are also used for that. So by a simple logic of harm reduction, Microsoft must terminate these contracts immediately. It does not require establishing evidence in either direction: the other party to the contract engaging in war crimes is all the evidence needed to terminate business with them.
The strat here is to simply not provide services to the Israeli military. It’s that simple; I mean, it’s literally the IDF, nothing good could come of helping them with anything.