A Microsoft employee disrupted the company’s 50th anniversary event to protest its use of AI.

“Shame on you,” said Microsoft employee Ibtihal Aboussad, speaking directly to Microsoft AI CEO Mustafa Suleyman. “You are a war profiteer. Stop using AI for genocide. Stop using AI for genocide in our region. You have blood on your hands. All of Microsoft has blood on its hands. How dare you all celebrate when Microsoft is killing children. Shame on you all.”

Sources at Microsoft tell The Verge that shortly after Aboussad was ushered out of Microsoft’s event, she sent an email to a number of email distribution lists that contain hundreds or thousands of Microsoft employees. Here is Aboussad’s email in full:

archive.today link

  • @[email protected]
    link
    fedilink
    English
    52 months ago

    Microsoft will sell US citizens out in a second when the government tells them to. They will use their AI to round us up without batting an eye. They can not be trusted anymore.

    Microsoft is now a threat to democracy and human existence. They are already working against us with governments. This is the tipping point that no one will hear about.

    There is too much at stake and they are too big to fail. The government will viciously take down anyone spreading the truth. Continuing to support Microsoft is now a death sentence to democracy.

  • @[email protected]
    link
    fedilink
    English
    212 months ago

    It takes massive courage to give up a cozy job at Microsoft and potentially damage your entire career to stand up for your values this way. Props to her!

  • @[email protected]
    link
    fedilink
    English
    132 months ago

    Reading some comments here, I want to leave a gentle reminder to my fellow redditfugees: the block user option is your friend. Curate your feed or get fed.

    When you see an aggressively oppositional account dropping shittastic hot takes, of course you can always engage and Have The Conversation if you want. You know what happens after you reply: the person likely leaves a bot to mess with your good intentions, raise your blood pressure, make you depressed and waste your time. Or maybe you successfully Prove Them Wrong and they change the goalposts, or wander off to needle someone else.

    We know by now, the more we engage, the more online space they get to fill with accelerationist Content.

    So just click the account name, then click the block button, and you’ll never see their viral brainrot again. Nobody needs to know; no need to announce it. If your freezepeach philosophy prevents that, maybe just upvote one of the replies you agree with and move on. If you’re on mobile, you can tag the account through Voyager etc instead of blocking, if you prefer.

    However you manage it, removing doomscroller ragebait from your Feed is worth doing.

    • @[email protected]
      link
      fedilink
      English
      1
      edit-2
      2 months ago

      Copying their post over (with minimal formatting, unfortunately) for anyone that doesn’t care to go to that site (and to make sure it doesn’t randomly disappear)

      r/self · u/walkandtalkk · 5 mo. ago
      You’re being targeted by disinformation networks that are vastly more effective than you realize. And they’re making you more hateful and depressed.

      (I wrote this post in March and posted it on r/GenZ. However, a few people messaged me to say that the r/GenZ moderators took it down last week, though I’m not sure why. Given the flood of divisive, gender-war posts we’ve seen in the past five days, and the demonstrated use of gender-war propaganda by several countries to fuel political division, I felt it was important to repost this. This post was written for a U.S. audience, but the implications are increasingly global.)

      TL;DR: You know that Russia and other governments try to manipulate people online. But you almost certainly don’t know just how effectively orchestrated influence networks are using social media platforms to make you – individually – angry, depressed, and hateful toward each other. Those networks’ goal is simple: to cause Americans and other Westerners – especially young ones – to give up on social cohesion and to give up on learning the truth, so that Western countries lack the will to stand up to authoritarians and extremists.

      And you probably don’t realize how well it’s working on you.

      This is a long post, but I wrote it because this problem is real, and it’s much scarier than you think.

      How Russian networks fuel racial and gender wars to make Americans fight one another

      In September 2018, a video went viral after being posted by In the Now, a social media news channel. It featured a feminist activist pouring bleach on a male subway passenger for manspreading. It got instant attention, with millions of views and wide social media outrage. Reddit users wrote that it had turned them against feminism.

      There was one problem: The video was staged. And In the Now, which publicized it, is a subsidiary of RT, formerly Russia Today, the Kremlin TV channel aimed at foreign, English-speaking audiences.

      As an MIT study found in 2019, Russia’s online influence networks reached 140 million Americans every month – the majority of U.S. social media users.

      Russia began using troll farms a decade ago to incite gender and racial divisions in the United States

      In 2013, Yevgeny Prigozhin, a confidant of Vladimir Putin, founded the Internet Research Agency (the IRA) in St. Petersburg. It was the Russian government’s first coordinated facility to disrupt U.S. society and politics through social media.

      Here’s what Prigozhin had to say about the IRA’s efforts to disrupt the 2022 election:

      Gentlemen, we interfered, we interfere and we will interfere. Carefully, precisely, surgically and in our own way, as we know how. During our pinpoint operations, we will remove both kidneys and the liver at once.

      In 2014, the IRA and other Russian networks began establishing fake U.S. activist groups on social media. By 2015, hundreds of English-speaking young Russians worked at the IRA. Their assignment was to use those false social-media accounts, especially on Facebook and Twitter – but also on Reddit, Tumblr, 9gag, and other platforms – to aggressively spread conspiracy theories and mocking, ad hominem arguments that incite American users.

      In 2017, U.S. intelligence found that Blacktivist, a Facebook and Twitter group with more followers than the official Black Lives Matter movement, was operated by Russia. Blacktivist regularly attacked America as racist and urged black users to reject major candidates. On November 2, 2016, just before the 2016 election, Blacktivist’s Twitter urged Black Americans: “Choose peace and vote for Jill Stein. Trust me, it’s not a wasted vote.”

      Russia plays both sides – on gender, race, and religion

      The brilliance of the Russian influence campaign is that it convinces Americans to attack each other, worsening both misandry and misogyny, mutual racial hatred, and extreme antisemitism and Islamophobia. In short, it’s not just an effort to boost the right wing; it’s an effort to radicalize everybody.

      Russia uses its trolling networks to aggressively attack men. According to MIT, in 2019, the most popular Black-oriented Facebook page was the charmingly named “My Baby Daddy Aint Shit.” It regularly posts memes attacking Black men and government welfare workers. It serves two purposes: Make poor black women hate men, and goad black men into flame wars.

      MIT found that My Baby Daddy is run by a large troll network in Eastern Europe likely financed by Russia.

      But Russian influence networks are also aggressively misogynistic and aggressively anti-LGBT.

      On January 23, 2017, just after the first Women’s March, the New York Times found that the Internet Research Agency began a coordinated attack on the movement. Per the Times:

      More than 4,000 miles away, organizations linked to the Russian government had assigned teams to the Women’s March. At desks in bland offices in St. Petersburg, using models derived from advertising and public relations, copywriters were testing out social media messages critical of the Women’s March movement, adopting the personas of fictional Americans.

      They posted as Black women critical of white feminism, conservative women who felt excluded, and men who mocked participants as hairy-legged whiners.

      But the Russian PR teams realized that one attack worked better than the rest: They accused the march’s co-founder, Arab American Linda Sarsour, of being an antisemite. Over the next 18 months, at least 152 Russian accounts regularly attacked Sarsour. That may not seem like many accounts, but it worked: They drove the Women’s March movement into disarray and eventually crippled the organization.

      Russia doesn’t need a million accounts, or even that many likes or upvotes. It just needs to get enough attention that actual Western users begin amplifying its content.

      A former federal prosecutor who investigated the Russian disinformation effort summarized it like this:

      It wasn’t exclusively about Trump and Clinton anymore. It was deeper and more sinister and more diffuse in its focus on exploiting divisions within society on any number of different levels.

      As the New York Times reported in 2022,

      There was a routine: Arriving for a shift, [Russian disinformation] workers would scan news outlets on the ideological fringes, far left and far right, mining for extreme content that they could publish and amplify on the platforms, feeding extreme views into mainstream conversations.

      (Splitting into two pieces)

      • @[email protected]
        link
        fedilink
        English
        12 months ago

        (continued)

        China is joining in with AI

        Last month, the New York Times reported on a new disinformation campaign. “Spamouflage” is an effort by China to divide Americans by combining AI with real images of the United States to exacerbate political and social tensions in the U.S. The goal appears to be to cause Americans to lose hope, by promoting exaggerated stories with fabricated photos about homeless violence and the risk of civil war.

        As Ladislav Bittman, a former Czechoslovakian secret police operative, explained about Soviet disinformation, the strategy is not to invent something totally fake. Rather, it is to act like an evil doctor who expertly diagnoses the patient’s vulnerabilities and exploits them, “prolongs his illness and speeds him to an early grave instead of curing him.”

        The influence networks are vastly more effective than platforms admit

        Russia now runs its most sophisticated online influence efforts through a network called Fabrika. Fabrika’s operators have bragged that social media platforms catch only 1% of their fake accounts across YouTube, Twitter, TikTok, Telegram, and other platforms.

        But how effective are these efforts? By 2020, Facebook’s most popular pages for Christian and Black American content were run by Eastern European troll farms tied to the Kremlin. And Russia doesn’t just target angry Boomers on Facebook. Russian trolls are enormously active on Twitter. And, even, on Reddit.

        It’s not just false facts

        The term “disinformation” undersells the problem. Because much of Russia’s social media activity is not trying to spread fake news. Instead, the goal is to divide and conquer by making Western audiences depressed and extreme.

        Sometimes, through brigading and trolling. Other times, by posting hyper-negative or extremist posts or opinions about the U.S. and the West over and over, until readers assume that’s how most people feel. And sometimes, by using trolls to disrupt threads that advance Western unity.

        As the RAND think tank explained, the Russian strategy is volume and repetition, from numerous accounts, to overwhelm real social media users and create the appearance that everyone disagrees with, or even hates, them. And it’s not just low-quality bots. Per RAND,

        Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. … According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.

        What this means for you

        You are being targeted by a sophisticated PR campaign meant to make you more resentful, bitter, and depressed. It’s not just disinformation; it’s also real-life human writers and advanced bot networks working hard to shift the conversation to the most negative and divisive topics and opinions.

        It’s why some topics seem to go from non-issues to constant controversy and discussion, with no clear reason, across social media platforms. And a lot of those trolls are actual, “professional” writers whose job is to sound real.

        So what can you do? To quote WarGames: The only winning move is not to play. The reality is that you cannot distinguish disinformation accounts from real social media users. Unless you know whom you’re talking to, there is a genuine chance that the post, tweet, or comment you are reading is an attempt to manipulate you – politically or emotionally.

        Here are some thoughts:

        • Don’t accept facts from social media accounts you don’t know. Russian, Chinese, and other manipulation efforts are not uniform. Some will make deranged claims, but others will tell half-truths. Or they’ll spin facts about a complicated subject, be it the war in Ukraine or loneliness in young men, to give you a warped view of reality and spread division in the West.

        • Resist groupthink. A key element of manipulation networks is volume. People are naturally inclined to believe statements that have broad support. When a post gets 5,000 upvotes, it’s easy to think the crowd is right. But “the crowd” could be fake accounts, and even if they’re not, the brilliance of government manipulation campaigns is that they say things people are already predisposed to think. They’ll tell conservative audiences something misleading about a Democrat, or make up a lie about Republicans that catches fire on a liberal server or subreddit.

        • Don’t let social media warp your view of society. This is harder than it seems, but you need to accept that the facts – and the opinions – you see across social media are not reliable. If you want the news, do what everyone online says not to: look at serious, mainstream media. It is not always right. Sometimes, it screws up. But social media narratives are heavily manipulated by networks whose job is to ensure you are deceived, angry, and divided.

  • @[email protected]
    link
    fedilink
    English
    58
    edit-2
    2 months ago

    The bravery of this woman to speak up against injustice! No doubt she is going to “face consequences” for this disruption. Let’s wish her the best.

    Salam Alaykum Ibtihal Aboussad!

  • @[email protected]
    link
    fedilink
    English
    332 months ago

    “I hear your protest, thank you.” He was trained with words of acknowledgement. Such useless words.

  • @[email protected]
    link
    fedilink
    English
    1912 months ago

    Brave as fuck. It’s a call for all of us to do a little more. What’s happening in our world shouldn’t be the norm anymore.

  • @[email protected]
    link
    fedilink
    English
    192
    edit-2
    2 months ago

    Thank you, Lemmy. I can’t find this on the technology subreddit. Reddit is complacent.

    I attempted to post the link on the technology subreddit. It said that it has been posted too many times. I went and sorted by new, and it’s not visible.

    Not too many years ago this would have been at the very top of the technology subreddit.

    Suspect for sure.

  • @[email protected]
    link
    fedilink
    English
    252 months ago

    TL;DR: she works on the speech-to-text AI product on Azure, which is used by Israel for some of their operations.

    Look, I understand her desire to stop the fighting in Palestine, but by that logic, we should also be protesting every software and computer manufacturer.

    • @[email protected]
      link
      fedilink
      English
      822 months ago

      By that logic, if it’s not practical to protest every single injustice in the world, we just shouldn’t bother.

      I’d say genocide is a good place to start, wouldn’t you?

      • @[email protected]
        link
        fedilink
        English
        52 months ago

        Protest the Linux kernel because they use it in North Korea, and probably in weapon systems around the world too?

        • @[email protected]
          link
          fedilink
          English
          312 months ago

          Nobody is profiting from that, so that’s not her point. The Linux Foundation isn’t directly selling something used to detect and blow up kids.

          • @[email protected]
            link
            fedilink
            English
            6
            edit-2
            2 months ago

            They are giving it away for free, which according to you is morally better than selling it? So if I make the AI, make it open source and somebody uses it to blow up kids I’m good. But if I sell it I’m evil?

            • @[email protected]
              link
              fedilink
              English
              252 months ago

              I get your point. But Microsoft knows exactly who is using their cloud and why (ID proof and industry), vs. something for everyone to grab (the LF doesn’t require ID proof and industry). Microsoft is knowingly serving child murderers, and it knows its tech is used to do exactly that.

              You cannot ban kitchen knives because there was a mass stabbing, but you enforce strict background checks so you don’t sell rifles to school shooters.

              In other words, it’s impossible to enforce a Linux ban without removing the open-source aspect of it and affecting good people, but it’s totally possible to just not serve somebody committing a genocide.

              • @[email protected]
                link
                fedilink
                English
                12 months ago

                I mean they sell to tons of military folks I assume. I know people didn’t sign up for it, but the difference between genocide and military operation is purely what side you’re on. Also, where do we draw the line? Windows use? Or just AI, because of the mass surveillance capabilities? I can’t see that genie going back in the bottle.

                Nothing wrong with the protest, and good on them for getting the word out because the scale is crazy, but we saw this same shit 20 years ago, and nobody has the appetite to politically challenge the status quo as far as the intelligence community and scummy uses of technology go.

                • @[email protected]
                  link
                  fedilink
                  English
                  82 months ago

                  Genocide vs. military op depends on side? I am on neither side, an observer from outside, and I can see one fully armed side holocausting another side for the past 70 years and making sure, through lobbying, that nobody from anywhere in the world complains about it.

                • @[email protected]
                  link
                  fedilink
                  English
                  22 months ago

                  the difference between genocide and military operation is purely what side you’re on.

                  The difference between rape and a good time is also purely what side you’re on.

              • @[email protected]
                link
                fedilink
                English
                52 months ago

                This is an important distinction, and one I wasn’t aware of: Microsoft knows exactly who is using the AI, where, when, and what exactly it is being used for, namely the genocide of Palestinians. And who knows what other genocides will follow.

            • @[email protected]
              link
              fedilink
              English
              5
              edit-2
              2 months ago

              The protest is also about computational resources and tech support hours, not just source code. Anyone could download an open-source state-of-the-art multi-petabyte model given enough bandwidth and time, but not just anyone can run it.
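
              (Rough, purely illustrative numbers; the model sizes and link speeds below are assumptions for the sake of the example, not figures from the article. They show why “enough bandwidth and time” is doing a lot of work here, even before you get to the GPUs needed to actually serve such a model.)

              # Back-of-the-envelope only; sizes and link speeds are assumptions, not real figures.
              def download_hours(size_gb: float, bandwidth_mbps: float) -> float:
                  """Hours to transfer size_gb gigabytes over a bandwidth_mbps link."""
                  return (size_gb * 8000) / bandwidth_mbps / 3600  # 1 GB ≈ 8000 megabits

              for size_gb, label in [(350, "large open-weights model"),
                                     (1_000_000, "hypothetical petabyte-scale blob")]:
                  for mbps in (100, 1000):  # home fiber vs. a fat data-center link
                      print(f"{label}: {size_gb:,} GB at {mbps} Mbps ≈ {download_hours(size_gb, mbps):,.0f} h")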

              Although the documents do not specify how the different army units use these cloud storage and AI tools, they do indicate that about a third of the purchases were intended for “air-gapped” systems that are isolated from the internet and public networks, strengthening the possibility that the tools have been used for operational purposes — such as combat and intelligence — as opposed to simply logistical or bureaucratic functions. Indeed, two sources in Unit 8200 confirmed that the Military Intelligence Directorate purchased storage and AI services from Microsoft Azure for intelligence-gathering activities, and three other sources in the unit confirmed that similar services were purchased from Amazon’s cloud computing platform, AWS.

              The documents further show that Microsoft personnel work closely with units in the Israeli army to develop products and systems. Dozens of units have purchased “extended engineering services” from Microsoft, in which, according to the company’s website, “Microsoft experts become an integral part of the [customer’s] team.”

              The documents describe, for example, that in recent years the Military Intelligence Directorate has purchased private development meetings and professional workshops, which Microsoft’s experts have given to soldiers at a cost of millions of dollars. Between October 2023 and June 2024 alone, the Israeli Defense Ministry spent $10 million to purchase 19,000 hours of engineering support from Microsoft.
              An intelligence officer who served in a technological role in Unit 8200 in recent years, and worked directly with Microsoft Azure employees before October 7 to develop a surveillance system used to monitor Palestinians, told +972 and Local Call that the company’s developers became so embedded that he referred to them as “people who are already working with the unit,” as if they were soldiers.

              Source: https://www.972mag.com/microsoft-azure-openai-israeli-army-cloud/

  • @[email protected]
    link
    fedilink
    English
    72 months ago

    This is what happens when you read and watch Hamas’s war propaganda online.

    There are photos and videos of Russian soldiers dying. Does that make Ukraine responsible for this war?

    • @[email protected]
      link
      fedilink
      English
      42 months ago

      Which particular set of Israeli policies are you supporting here?

      The 10-15 civilian deaths per strike on a possible low level Hamas supporter?

      The continued “accidental” killing of medical staff, including Red Cross and Red Crescent workers, and journalists?

      The idea of turning off water and electricity from all of Gaza as a valid response?

      The escalation of illegal settlement and theft of Palestinian property and land in the West Bank, with IDF support no less?

      Or the use of the Hannibal Protocol against Israeli Civilians on October 8th by IDF forces?

      Or do you just not think it is incumbent on the one with greater power and strength to offer the olive branch first, if peace is truly desired? Or would you in fact rather the Palestinian people be smeared across an enlarged Israel?

    • @[email protected]
      link
      fedilink
      English
      47
      edit-2
      2 months ago

      Children are not soldiers. I don’t think anyone is criticising Israel for killing Hamas leaders or soldiers, especially given that Hamas is a terrorist organization and Israel was provoked; but bombing hospitals, killing children and journalists, and destroying UN shelters are not okay.

      Whether a terrorist attack justifies a full-scale invasion might be debatable, but sabotaging humanitarian efforts and killing children and non-combatants is not.

        • @[email protected]
          link
          fedilink
          English
          15
          edit-2
          2 months ago

          Yes, and the journalist soldiers, the UN shelter soldiers, and the humanitarian aid workers from World Central Kitchen from the U.S., U.K., and Australia are, believe it or not, also Hamas soldiers.

          • @[email protected]
            link
            fedilink
            English
            12 months ago

            The UN Palestine workers (or at least some of them) were proven to be associated with Hamas.

            For the rest I need sources.

            • @[email protected]
              link
              fedilink
              English
              52 months ago

              ‘Source for thee, but none for me’ is an interesting rhetorical strategy. Let’s see how it works out.

              • @[email protected]
                link
                fedilink
                English
                12 months ago

                Just google that shit bruv

                The UN even admitted to it and kicked out some of their workers

                • @[email protected]
                  link
                  fedilink
                  English
                  22 months ago

                  What the fuck are you on about? I’m commenting on your demand for sources while also providing none for numerous claims. Interesting strategy. You must have studied Plato.

    • @[email protected]
      link
      fedilink
      English
      282 months ago

      This is what happens when you read Israeli war propaganda online.

      Genocide apologists are the absolute worst scum.

      • @[email protected]
        link
        fedilink
        English
        1
        edit-2
        2 months ago

        I would try to write up well-worded arguments backed up with valid sources, but there’s no point. No matter how well I try to back up my arguments with valid sources, all I get is emotionally biased hate with no content so I’ve given up on that because it’s not worth the effort. Feel free to give me articles from unbiased sources.

        I do admit that Israel has done the odd shitty thing, but it by far isn’t as bad as pro-Palestinian brainwashed people make it out to be.

        I also want to ask you to propose an alternative response to what Hamas is doing, after reading up on how they have acted in the past.

    • @[email protected]
      link
      fedilink
      English
      252 months ago

      You see an article about a major tech company collaborating with a major world government, and you think the terrorist resistance cells living in dirt are the ones more capable of disseminating propaganda?

      What if you’re the one who’s consumed the propaganda? Where’s the money, Larry?

        • @[email protected]
          link
          fedilink
          English
          12
          edit-2
          2 months ago

          lol your other comment also just says “i wont engage”. you say the conversation is emotional but your replies have nothing but grandstanding. ive heard a lot of crazy unsubstantiated claims around Israel-Palestine, but the idea that accusations of innocents being murdered is hamas propaganda is possibly the craziest. what channels are the propagandists going through? whose support does hamas have to disseminate this propaganda? are we done with russian bots now we’re onto saudi bots?

          i will gladly examine the sources of information and challenge my biases, but you’re gonna have to give me more to work with. we have a LOT of info on childrens hospitals being bombed, aid workers being executed, major US news outlets softballing their coverage, etc. What basis do we have to believe your claims? cuz i got news for you: saying that it was okay to bomb an entire children’s hospital because it was full of terrorists, that is the propaganda. hamas using civilians as a shield, israel blowing up civilians anyway, how can you point fingers at one and say “that’s fine”?

          • @[email protected]
            link
            fedilink
            English
            12 months ago

            Hamas literally rely on people from other countries to support their terrorism and prevent Israel from acting. Hamas themselves are too weak to win this. All the sad social media posts and shit - all intentional, all manipulative. Why don’t you go to Palestine and bring justice yourself instead of complaining on the internet?

            All the while, they themselves are engaging in a genocide against Israelis. Read up on that. Hundreds of experts around the planet agree on that.

            Why should israel play fair when hamas doesn’t? Hamas hides in hospitals, undeniably.

  • Lady Butterfly · 38 points · 2 months ago

    Can anyone ELI5 how they’re using AI for genocide? I have awful IT skills so I don’t understand AI

    • @[email protected]
      link
      fedilink
      English
      102 months ago

      Here are her words on it:

      When I moved to AI Platform, I was excited to contribute to cutting-edge AI technology and its applications for the good of humanity: accessibility products, translation services, and tools to “empower every human and organization to achieve more.” I was not informed that Microsoft would sell my work to the Israeli military and government, with the purpose of spying on and murdering journalists, doctors, aid workers, and entire civilian families. If I knew my work on transcription scenarios would help spy on and transcribe phone calls to better target Palestinians (source), I would not have joined this organization and contributed to genocide. I did not sign up to write code that violates human rights.

    • @[email protected]
      link
      fedilink
      English
      5
      edit-2
      2 months ago

      There are also many higher-education professors working on smart drones. I was at one conference where they were showing off drones flying by themselves, and they showed how the drones performed in different weather conditions and everything.

    • @[email protected]
      link
      fedilink
      English
      17
      edit-2
      2 months ago

      Stop believing in the veneer of smartness and superiority around these genocidal fuckers.

      There is no non-ELI5 explanation here of how they’re using AI for genocide, because the truth is horrible, stupid, and brutal.

      They are using AI because it is the best tool bullshitters have currently to offload blame for things they, individual human beings, chose to do onto obscure abstract entities like corporations, AI decisions and other bullshit.

      There is nothing more to it than that, I promise you, it is all just layers of bullshit that is attempting to obscure culpability for participating in a genocide, and honestly it is the perfect technology for that.

    • Realitätsverlust · 34 points · 2 months ago

      AI is being pushed into war machines big time. America and China are both working on it. With Ukraine showing how incredibly effective drones are in warfare, just imagine the damage and destruction a swarm of drones controlled by an AI could cause.

      • @[email protected]
        link
        fedilink
        English
        202 months ago

        But critics warn the [AI] system is unproven at best — and at worst, providing a technological justification for the killing of thousands of Palestinian civilians.

        This 2023 article didn’t age well.

    • @[email protected]
      link
      fedilink
      English
      332 months ago

      From the article:

      The Israeli military uses Microsoft Azure to compile information gathered through mass surveillance, which it transcribes and translates, including phone calls, texts and audio messages, according to an Israeli intelligence officer who works with the systems.

      From my understanding, they use AI to automate the processing of text, audio, and video data collected by the intelligence services.
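
      (For a concrete sense of what automated “transcribe and translate” looks like in general, here is a minimal, hypothetical sketch using the open-source Whisper speech model. It is not Microsoft’s system, and the file names are made up for illustration.)

      # Illustrative sketch only: batch speech-to-text plus English translation,
      # using the open-source "openai-whisper" package. Not Microsoft's pipeline.
      import whisper

      model = whisper.load_model("small")  # small open-source speech model

      for path in ["recording_001.wav", "recording_002.wav"]:  # hypothetical files
          # task="translate" asks Whisper for an English translation of the speech
          result = model.transcribe(path, task="translate")
          print(path, "->", result["text"])

      The point is only that once audio sits in cloud storage, batch transcription and translation like this is a few lines of orchestration, which is what lets it scale.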