Per one tech forum this week: “Google has quietly installed an app on all Android devices called ‘Android System SafetyCore’. It claims to be a ‘security’ application, but whilst running in the background, it collects call logs, contacts, location, your microphone, and much more making this application ‘spyware’ and a HUGE privacy concern. It is strongly advised to uninstall this program if you can. To do this, navigate to 'Settings’ > 'Apps’, then delete the application.”

  • @[email protected]
    link
    fedilink
    English
    114 months ago

    It didn’t appear in my apps list so I thought it wasn’t installed. But when I searched for the app name it appears. So be aware.

        • @[email protected]
          link
          fedilink
          English
          3
          edit-2
          4 months ago

          Even worse, i found this comment in the app store and it did the same on my device :

          Installed automatically without my knowledge, no notification, only found it because of a friend’s post, and even then, you only see it through a link, it doesn’t come up in your app list or a search of the Google play store. I thought it felt like my battery was draining a little quicker too, which is apparently also something noticed in connection to having this app. Uninstalling.

          The app can be found here:
          https://play.google.com/store/apps/details?id=com.google.android.safetycore

          • @[email protected]
            link
            fedilink
            English
            24 months ago

            i was able to find it on my oneplus, and i also noticed, why is my oneplus 12r draining so fast?

          • @[email protected]
            link
            fedilink
            English
            24 months ago

            Oh right, maybe I noticed because of Storage Isolation, that’s an app which allows you to restrict folder access of other apps, and it prompts me to select actions for every newly installed app. So it casually prompts me whenever google pushes a new, hidden installation.

  • @[email protected]
    link
    fedilink
    English
    54 months ago

    The countdown to Android’s slow and painful death is already ticking for a while.

    It has become over-engineered and no longer appealing from a developer’s viewpoint.

    I still write code for Android because my customers need it - will be needing for a while - but I’ve stopped writng code for Apple’s i-things and I research alternatives for Android. Rolling my own environment with FOSS components on top of Raspbian looks feasible already. On robots and automation, I already use it.

  • @[email protected]
    link
    fedilink
    English
    134 months ago

    Kind of weird that they are installing this dependency whether you will enable those planned scanning features or not. Here is an article mentioning that future feature Sensitive Content Warnings. It does sound kind of cool, less chance to accidentally send your dick pic to someone I guess.

    Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing, and then prompts with a “speed bump” that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares.

    All of this happens on-device to protect your privacy and keep end-to-end encrypted message content private to only sender and recipient. Sensitive Content Warnings doesn’t allow Google access to the contents of your images, nor does Google know that nudity may have been detected. This feature is opt-in for adults, managed via Android Settings, and is opt-out for users under 18 years of age.

    • @[email protected]
      link
      fedilink
      English
      194 months ago

      Looks like more of a chance of false positives happening and getting the police to raid your home to confiscate your devices. I don’t care what the article says I know Google is getting access to that data because that’s who they are.

      • @[email protected]
        link
        fedilink
        English
        94 months ago

        Please, read the links. They are the security and privacy experts when it comes to Android. That’s their explanation of what this Android System SafetyCore actually is.

    • @[email protected]
      link
      fedilink
      English
      284 months ago

      To quote the most salient post

      The app doesn’t provide client-side scanning used to report things to Google or anyone else. It provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users.

      Which is a sorely needed feature to tackle problems like SMS scams
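
The model described in that quote - local classification that labels content without anything leaving the device - can be sketched roughly like this. Every name below is invented for illustration, and a trivial keyword heuristic stands in for the bundled ML models the real service ships:

```python
# Hypothetical sketch of an on-device content classifier, in the spirit of the
# quoted post: apps pass content in, get a label back, and no network call is
# ever made. The names and the "model" here are made up for illustration.

from dataclasses import dataclass

@dataclass
class Classification:
    label: str        # e.g. "ok", "spam", "scam", "nudity"
    confidence: float

def classify_locally(text: str) -> Classification:
    # Stand-in for a bundled ML model: a trivial keyword heuristic.
    scam_markers = ("verify your account", "gift card", "wire transfer")
    hits = sum(marker in text.lower() for marker in scam_markers)
    if hits:
        return Classification("scam", min(1.0, 0.5 + 0.25 * hits))
    return Classification("ok", 0.9)

# The calling app decides what to do with the label (blur, warn, nothing);
# the classifier itself reports nothing to any server.
result = classify_locally("Please verify your account with a gift card")
print(result.label)  # prints "scam"
```

The point of the design is that the messaging app never has to share the content with a service to get a verdict.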

      • desktop_user [they/them] · 6 points · 4 months ago

        If the cellular carriers were forced to verify that caller ID (or the SMS equivalent) was accurate, SMS scams would disappear (or at least be weakened). Google shouldn’t have to do the job of the carriers, and if they wanted to implement this anyway they should let the user choose which service performs the task, similar to how they let the user choose which “Android System WebView” should be used.
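
The carrier-side fix proposed here is essentially sender attestation, roughly what STIR/SHAKEN already does for voice calls. A toy sketch under a shared-key assumption (real deployments use certificates and PKI, and all names here are invented):

```python
import hmac
import hashlib

# Toy sketch of carrier-attested sender IDs: a forged header fails
# verification. Illustrative only - real systems sign with certificates,
# not a shared secret.

CARRIER_KEY = b"demo-shared-secret"  # hypothetical; real deployments use PKI

def attest(sender_id: str, message: str) -> str:
    """Carrier-side: sign the (sender, message) pair."""
    payload = f"{sender_id}|{message}".encode()
    return hmac.new(CARRIER_KEY, payload, hashlib.sha256).hexdigest()

def verify(sender_id: str, message: str, tag: str) -> bool:
    """Receiver-side: reject messages whose sender ID or body was forged."""
    return hmac.compare_digest(attest(sender_id, message), tag)

tag = attest("MyBank", "Your code is 1234")
assert verify("MyBank", "Your code is 1234", tag)        # genuine
assert not verify("MyBank", "Click this link", tag)      # forged content
assert not verify("ScamBank", "Your code is 1234", tag)  # forged sender
```

With attestation at the network edge, spoofed "MyBank" messages never reach the handset, so no on-device scanning is needed for that class of scam.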

        • @[email protected]
          link
          fedilink
          English
          34 months ago

          Carriers don’t care. They are selling you data. They don’t care how it’s used. Google is selling you a phone. Apple held down the market for a long time for being the phone that has some of the best security. As an android user that makes me want to switch phones. Not carriers.

      • @[email protected]
        link
        fedilink
        English
        114 months ago

        Why do you need machine learning for detecting scams?

        Is someone in 2025 trying to help you out of the goodness of their heart? No. Move on.

        • @[email protected]
          link
          fedilink
          English
          44 months ago

          If you want to talk money then it is in businesses best interest that money from their users is being used on their products, not being scammed through the use of their products.

          Secondly machine learning or algorithms can detect patterns in ways a human can’t. In some circles I’ve read that the programmers themselves can’t decipher in the code how the end result is spat out, just that the inputs will guide it. Besides the fact that scammers can circumvent any carefully laid down antispam, antiscam, anti-virus through traditional software, a learning algorithm will be magnitudes harder to bypass. Or easier. Depends on the algorithm
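
The "learned patterns versus hand-written rules" distinction can be made concrete with a toy naive-Bayes scorer over made-up training messages. This is purely illustrative - real spam models are vastly larger - but it shows weights coming from data rather than from a programmer's rule list:

```python
import math
from collections import Counter

# Minimal naive-Bayes spam scorer: the "rules" are learned word statistics,
# not hand-written patterns. Toy data with add-one smoothing; illustrative only.

spam = ["win cash now", "claim your prize now", "free cash prize"]
ham = ["see you at lunch", "meeting moved to monday", "cash the check at lunch"]

def train(docs):
    counts = Counter()
    for d in docs:
        counts.update(d.split())
    return counts

spam_counts, ham_counts = train(spam), train(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_odds(text: str) -> float:
    """Positive => more spam-like, negative => more ham-like."""
    score = 0.0
    for w in text.split():
        ps = (spam_counts[w] + 1) / (sum(spam_counts.values()) + len(vocab))
        ph = (ham_counts[w] + 1) / (sum(ham_counts.values()) + len(vocab))
        score += math.log(ps / ph)
    return score

assert log_odds("claim your free prize") > 0    # spam-like
assert log_odds("lunch meeting on monday") < 0  # ham-like
```

Notice that nothing in `log_odds` mentions "prize" or "lunch" explicitly - the scores fall out of the counts, which is also why such models are hard to reverse-engineer and, as the reply below argues, why their false positives are hard to reason about.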

          • @[email protected]
            link
            fedilink
            English
            44 months ago

            I don’t know the point of the first paragraph…scams are bad? Yes? Does anyone not agree? (I guess scammers)

            For the second we are talking in the wild abstract, so I feel comfortable pointing out that every automated system humanity has come up with so far has pulled in our own biases and since ai models are trained by us, this should be no different. Second, if the models are fallible, you cannot talk about success without talking false positives. I don’t care if it blocks every scammer out there if it also blocks a message from my doctor. Until we have data on consensus between these new algorithms and desired outcomes, it’s pointless to claim they are better at X.

      • @[email protected]
        link
        fedilink
        English
        24 months ago

        You don’t need advanced scanning technology running on every device with access to every single bit of data you ever seen to detect scam. You need telco operator to stop forwarding forged messages headers and… that’s it. Cheap, efficient, zero risk related to invasion of privacy through a piece of software you did not need but was put there “for your own good”.

    • @[email protected]
      link
      fedilink
      English
      44 months ago

      So is this really just a local AI model? Or is it something bigger? My S25 Ultra has the app but it hasn’t used any battery or data.

    • Spaniard · 8 points · 4 months ago

      If the app did what OP is claiming, then the EU would have a field day fining Google.

    • @[email protected]
      link
      fedilink
      English
      3
      edit-2
      4 months ago

      graphene folks have a real love for the word misinformation (and FUD, and brigading). That’s not you under there👻, Daniel, is it?

      After 5 years of his antics hateful bullshit lies, I think I can genuinely say that word triggers me.

  • DigitalDilemma · 11 points · 4 months ago

    More information: It’s been rolling out to Android 9+ users since November 2024 as a high priority update. Some users are reporting it installs when on battery and off wifi, unlike most apps.

    App description on Play store: SafetyCore is a Google system service for Android 9+ devices. It provides the underlying technology for features like the upcoming Sensitive Content Warnings feature in Google Messages that helps users protect themselves when receiving potentially unwanted content. While SafetyCore started rolling out last year, the Sensitive Content Warnings feature in Google Messages is a separate, optional feature and will begin its gradual rollout in 2025. The processing for the Sensitive Content Warnings feature is done on-device and all of the images or specific results and warnings are private to the user.

    Description by Google: Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing, and then prompts with a “speed bump” that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares. - https://9to5google.com/android-safetycore-app-what-is-it/

    So it looks like something that sends pictures from your messages (at least initially) to Google for an AI to check whether they’re “sensitive”. The app is 44 MB - too small to contain a useful AI, and I don’t think this could happen on-phone - so it must require sending your on-phone data to Google?

  • @[email protected]
    link
    fedilink
    English
    71
    edit-2
    4 months ago

    For people who have not read the article:

    Forbes states that there is no indication that this app can or will “phone home”.

    Its stated use is to let other apps scan an image they have access to, to find out what kind of thing it is (known as “classification”). For example, to find out if the picture you’ve been sent is a dick pic so the app can blur it.

    My understanding is that, if this is implemented correctly (a big ‘if’) this can be completely safe.

    Apps requesting classification could be limited to only classifying files that they already have access to. Remember that Android nowadays has a concept of “scoped storage” that lets you restrict folder access. If this is the case, well, it’s no less safe than not having SafetyCore at all. It just saves you space, as companies like Signal, WhatsApp etc. no longer need to train and ship their own machine learning models inside their apps - it becomes a common library / API any app can use.

    It could, of course, if implemented incorrectly, allow apps to snoop without asking for file access. I don’t know enough to say.

    Besides, you think that Google isn’t already scanning for things like CSAM? It’s been confirmed to be done on platforms like Google Photos well before SafetyCore was introduced, though I’ve not seen anything about it being done on devices yet (correct me if I’m wrong).

    • @[email protected]
      link
      fedilink
      English
      10
      edit-2
      4 months ago

      This is EXACTLY what Apple tried to do with their on-device CSAM detection, it had a ridiculous amount of safeties to protect people’s privacy and still it got shouted down

      I’m interested in seeing what happens when Holy Google, for which most nerds have a blind spot, does the exact same thing

      EDIT: from looking at the downvotes, it really seems that Google can do no wrong 😆 And Apple is always the bad guy in lemmy

      • @[email protected]
        link
        fedilink
        English
        24 months ago

        Overall, I think this needs to be done by a neutral 3rd party. I just have no idea how such a 3rd party could stay neutral. Some with social media content moderation.

      • Noxy · 20 points · 4 months ago

        it had a ridiculous amount of safeties to protect people’s privacy

        The hell it did, that shit was gonna snitch on its users to law enforcement.

        • @[email protected]
          link
          fedilink
          English
          14 months ago

          Nope.

          A human checker would get a reduced-quality copy after multiple CSAM matches. No police were to be called if the human checker didn’t verify a positive match.

          Your idea of flooding someone with fake matches that are actually cat pics wouldn’t have worked

          • Noxy · 4 points · 4 months ago

            That’s a fucking wiretap, yo

      • Natanael · 16 points · edited · 4 months ago

        Apple had it report suspected matches, rather than warning locally

        It got canceled because the fuzzy hashing algorithms turned out to be so insecure it’s unfixable (easy to plant false positives)
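
To illustrate why fuzzy (perceptual) hashes are easy to attack: a toy "average hash" records only which pixels sit above the image mean, so visually different inputs can be crafted to collide. Real perceptual hashes - including the NeuralHash scheme Apple proposed - are far more elaborate, but suffered analogous collision attacks:

```python
# Toy average hash: the digest is just the above/below-mean bit pattern.
# Any image preserving that pattern collides, however different it looks.
# Illustrative only; real perceptual hashes are more complex but were still
# shown vulnerable to crafted collisions.

def average_hash(pixels):
    """pixels: 2D list of grayscale values -> tuple of above-mean bits."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

original = [[200, 200, 10, 10],
            [200, 200, 10, 10]]

# A visually different image: any values keeping the same above/below-mean
# pattern produce the identical hash - a planted false positive.
forged = [[130, 255, 90, 0],
          [121, 140, 33, 77]]

assert average_hash(original) == average_hash(forged)
```

The trade-off is inherent: a hash robust to re-encoding and resizing must ignore detail, and ignored detail is exactly where an attacker hides a collision.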

        • @[email protected]
          link
          fedilink
          English
          24 months ago

          The official reason they dropped it is because there were security concerns. The more likely reason was the massive outcry that occurs when Apple does these questionable things. Crickets when it’s Google.

          The feature was re-added as a child safety feature called “Communication Safety” that is optional on child accounts and will automatically block nudity sent to children.

        • @[email protected]
          link
          fedilink
          English
          14 months ago

          They were not “suspected” they had to be matches to actual CSAM.

          And after that a reduced quality copy was shown to an actual human, not an AI like in Googles case.

          So the false positive would slightly inconvenience a human checker for 15 seconds, not get you Swatted or your account closed

          • Natanael · 2 points · edited · 4 months ago

            Yeah so here’s the next problem - downscaling attacks exists against those algorithms too.

            https://scaling-attacks.net/

            Also, even if those attacks were prevented they’re still going to look through basically your whole album if you trigger the alert
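
A toy version of the scaling-attack idea documented at that link: nearest-neighbour downscaling keeps only every k-th pixel, so an attacker can plant a second image at exactly the sampled positions. The grayscale values here are arbitrary; real attacks target the kernels of real scaler libraries:

```python
# Toy image-scaling attack: the full-size image looks mostly "innocent"
# (bright), but the pixels a nearest-neighbour downscaler samples are all
# attacker-chosen (dark). Illustrative only.

def downscale(pixels, k):
    """Keep every k-th pixel in each dimension (nearest-neighbour style)."""
    return [row[::k] for row in pixels[::k]]

# Full-size 6x6 image: all bright pixels (255)...
big = [[255] * 6 for _ in range(6)]
# ...except attacker-planted dark pixels at the sampled grid points.
for y in range(0, 6, 3):
    for x in range(0, 6, 3):
        big[y][x] = 0

# A viewer of the full image sees 32 of 36 pixels bright; the downscaled
# copy a reviewer sees is entirely dark.
assert sum(p == 255 for row in big for p in row) == 32
assert downscale(big, 3) == [[0, 0], [0, 0]]
```

This is why the image you see and the image a reviewer's pipeline sees can genuinely differ, which undermines the "a human double-checks" safeguard.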

            • @[email protected]
              link
              fedilink
              English
              14 months ago

              And you’ll again inconvenience a human slightly as they look at a pixelated copy of a picture of a cat or some noise.

              No cops are called, no accounts closed

              • Natanael · 1 point · 4 months ago

                The scaling attack specifically can make a photo sent to you look innocent to you and malicious to the reviewer, see the link above

      • Lka1988 · 15 points · edited · 4 months ago

        I have 5 kids. I’m almost certain my photo library of 15 years has a few completely innocent pictures where a naked infant/toddler might be present. I do not have the time to search 10,000+ pics for material that could be taken completely out of context and reported to authorities without my knowledge. Plus, I have quite a few “intimate” photos of my wife in there as well.

        I refuse to consent to a corporation searching through my device on the basis of “well just in case”, as the ramifications of false positives can absolutely destroy someone’s life. The unfortunate truth is that “for your security” is a farce, and people who are actually stupid enough to intentionally create that kind of material are gonna find ways to do it regardless of what the law says.

        Scanning everyone’s devices is a gross overreach and, given the way I’ve seen Google and other large corporations handle reports of actually-offensive material (i.e. they do fuck-all), I have serious doubts over the effectiveness of this program.

    • Ulrich · 33 points · 4 months ago

      Forbes states that there is no indication that this app can or will “phone home”.

      That doesn’t mean that it doesn’t. If it were open source, we could verify it. As is, it should not be trusted.

        • @[email protected]
          link
          fedilink
          English
          44 months ago

          The Graphene devs say it’s a local only service.

          Open source would be better (and I can easily see open source alternatives being made if you’re not locked into a Google Android-based phone), but the idea is sound and I can deny network privileges to the app with Graphene so it doesn’t matter if it does decide to one day try to phone home… so I’ll give it a shot.

          • @[email protected]
            link
            fedilink
            English
            84 months ago

            God I wish I could completely deny internet access to some of my apps on stock android. It’s obvious why they don’t allow it though.

            • @[email protected]
              link
              fedilink
              English
              34 months ago

              Check out Netguard. It’s an app that pretends to be a VPN client so most of your traffic has to go through it - and then you can deny/allow internet access per app. Even works without root.
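
The mechanism behind that trick is roughly a default-deny, per-app rule table enforced at a local VPN interface. A minimal sketch - the package names and rule table here are invented, and the real app keys its rules off each connection's UID:

```python
# Sketch of the per-app default-deny model a local-VPN firewall implements:
# all traffic is routed through the VPN interface, and only connections from
# explicitly allowed packages are forwarded. Names are invented.

RULES = {"org.example.messenger": True}  # explicitly allowed apps

def allow_connection(package: str) -> bool:
    """Default-deny: only explicitly allowed packages get network access."""
    return RULES.get(package, False)

assert allow_connection("org.example.messenger")
assert not allow_connection("com.example.adsdk")  # unknown apps are blocked
```

Default-deny is the important design choice: a newly installed (or silently pushed) app gets no network access until you opt it in.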

    • @[email protected]
      link
      fedilink
      English
      164 months ago

      Doing the scanning on-device doesn’t mean that the findings cannot be reported further. I don’t want others going thru my private stuff without asking - not even machine learning.

    • @[email protected]
      link
      fedilink
      English
      104 months ago

      Issue is, a certain cult (christian dominionists), with the help of many billionaires (including Muskrat) have installed a fucking dictator in the USA, who are doing their vow to “save every soul on Earth from hell”. If you get a porn ban, it’ll phone not only home, but directly to the FBI’s new “moral police” unit.

  • @[email protected]
    link
    fedilink
    English
    194 months ago

    Thanks for bringing this up, first I’ve heard of it. Not present on my GrapheneOS pixel, present on stock.

    I suppose I should encourage Pixel owners to switch from stock to Graphene - I know which device I’d rather spend time using. The GrapheneOS one, of course.

    • @[email protected]
      link
      fedilink
      English
      34 months ago

      I’ve looked into it.l briefly. Did you have any issues switching? I’m concerned about how some apps I need would function.

      • @[email protected]
        link
        fedilink
        English
        34 months ago

        I did a fair amount of research before the switch to find alternatives to Google services, some I’ve replaced, others I felt were too much of a hassle for my phone usage.

        I’ve kept my original pixel stock, the hardest part about switching this one over was plugging it in and following the instructions.

        I’m hoping to get rid of my stock OS pixel soon; it would appear my bank hasn’t blocked its app on Graphene, unlike Uber.

        For the rest, I’ll buy a cheap af shitbox to use purely for banking and Uber (if it comes to that).

        If you’ve any other questions I’m happy to help find the answers with you, feel free to DM me.

      • @[email protected]
        link
        fedilink
        English
        24 months ago

        I switched from a Samsung to a Pixel a couple years ago. I instantly installed GrapheneOS and have loved it ever since. It generally works perfectly normally with the huge background benefit of security and privacy. The only issues I have had is one of my banking apps doesn’t work (but the others work fine) and lack of RCS (but I’m sure it’s coming). In short, highly highly recommend. I will be sticking with GOS for the long term!

    • @[email protected]
      link
      fedilink
      English
      34 months ago

      I’ve got a Pixel 8 Pro and I’m currently using the stock OS. Anything in particular that you miss with Graphene OS?

      • @[email protected]
        link
        fedilink
        English
        14 months ago

        I switched from a Samsung to a Pixel a couple years ago. I instantly installed GrapheneOS and have loved it ever since. It generally works perfectly normally with the huge background benefit of security and privacy. The only issues I have had is one of my banking apps doesn’t work (but the others work fine) and lack of RCS (but I’m sure it’s coming). In short, highly highly recommend. I will be sticking with GOS for the long term!

    • @[email protected]
      link
      fedilink
      English
      34 months ago

      I’m traumatized by trying to use banking apps on lineage… don’t think I’ll risk it until I get a backup phone

    • @[email protected]
      link
      fedilink
      English
      24 months ago

      People can go further than that and install a ROM for their phone that doesn’t have any Google apps on it. People can even use applications that normally require Google Play Services by using microG, which spoofs things. You can also root your phone with Magisk and use apps to block anything leaking anything else.

  • @[email protected]
    link
    fedilink
    English
    184 months ago

    I just un-installed it

    Anyone know what Android System Intelligence does? Should that be un-installed as well?

    • @[email protected]
      link
      fedilink
      English
      84 months ago

      Jesus Christ they’re like bed bugs

      Is it too much to ask that my phone only contain the shit that makes it work, and not anything else?

      • @[email protected]
        link
        fedilink
        English
        4
        edit-2
        4 months ago

        Its a classic example of using “BUT THE CHILDREN” to be invasive dickheads.

        And it immediately reminds me of the story of the guy whose kid had a rash in the diaper area during covid, and the pediatrician requested pictures to remotely diagnose and treat, which google flagged as child pornography and called the cops on him, and banned/locked him out of everything (phone number, emails, pictures, etc etc) because he had everything on google.

        and no amount of the police, or even doctor, insisting the pictures were medical necessity and not child pornography would convince google to restore his acount or even let him recover his number/email/pictures/etc.

        • @[email protected]
          link
          fedilink
          English
          14 months ago

          The fact that Google refused to restore his account even after the police that they called said there was no child porn pisses me off to no end. They are officially allowed to close your account for no reason other than they don’t like you.

          • @[email protected]
            link
            fedilink
            English
            1
            edit-2
            4 months ago

            not only refused to restore the account, but still insisted he was a pedophile producing child pornography despite the cops and doctors and every other authority involved insisting he wasnt, and that the images were medically necessary, and refuse to even give/let him get a backup of all his family pictures, emails, etc.

            and theres gonna be a lot more of it once this stupid invasive spyware rolls out and gets going.

            If our parents and grandparents photos were digitized, they’d all probably be labled child porn producers, because almost every parent/grandparent/etc has some picture of their newborn getting a sink bath or some other completely harmless, and otherwise normal photo.

            and I think its so they can artificially inflate their numbers. They arent doing shit to stop actual child exploitation, so they hammer hard on this shit so they can make a big show of “cracking down and stopping” it.

    • Kilgore Trout · 3 points · 4 months ago

      You can safely uninstall System Intelligence if you don’t need it. My phone has worked fine without it in the past year.

  • @[email protected]
    link
    fedilink
    English
    84 months ago

    Google says that SafetyCore “provides on-device infrastructure for securely and privately performing classification to help users detect unwanted content

    Cheers Google but I’m a capable adult, and able to do this myself.

    • TheWaterGod · 4 points · 4 months ago

      I’m curious about this. I’ve got a Pixel 6 and noticed that the battery started going to shit about a month or so ago? I couldn’t find an install date for SafetyCore, but it was listed in my apps. I’ve uninstalled it now. It’ll be interesting to see if that was causing it.

        • TheWaterGod · 2 points · 4 months ago

          It doesn’t show in the app drawer, but I found it via the all apps in Settings.

          Go to the Settings App > Apps > “See all XX apps”. It’s called Android System SafetyCore, so it should be close to the top of the list. Tap on it and select Uninstall.

          • @[email protected]
            link
            fedilink
            English
            14 months ago

            Well it looks like I don’t have it. Which is good, unless its hidden and unremovable. My battery app reports up to like 60% of power usage but nothing else. That means that some stupid app in the background is running down my battery for no good reason.

    • @[email protected]
      link
      fedilink
      English
      4
      edit-2
      4 months ago

      same here, i was wondering why my Op12r was draining like super fast, for a phone touthing 2+days battery(and im not even playing games or videos on it), yet it was draining as fast as an old pixel phone.

  • MochiGoesMeow · 14 points · 4 months ago

    Is there any indication that Apple is truly more secure and privacy-conscious than Android? I’m kinda tired of Google and their oversteps.

    • @[email protected]
      link
      fedilink
      English
      304 months ago

      For true privacy you’ll want something like GrapheneOS on a Pixel, with no Google apps or anything. Some other ROM with no gApps as a second choice.

      Other than that, Apple SEEMS to be mildly better. I’ll give you an example: Apple pulls encryption feature from UK over government spying demands

      While it’s a bad thing that they pull the encryption feature, it’s a good sign - they either aren’t willing or able to add a backdoor for the UK security services. Then there was this case. If the article is to be believed, they started working on security as of iOS 8 so they could no longer comply with government requests. Today we’re on iOS 18.

      Apple claims their advertising ID is anonymized so third party apps don’t know who you are. That said, they still have the advertising ID service so Apple themselves do know a whoooooole lot about you - but this is the same with Google.

      Then regarding photo scanning - Apple received a LOT of backlash for their proposed photo scanning feature. But it was going to be only on-device scans on photos that were going to be uploaded to iCloud (so disabling iCloud would disable it too) and it was only going to report you if you had a LOT of child pornography on your phone - otherwise it was, supposedly, going to do absolutely nothing about the photos. It wasn’t even supposed to be a categorization model, just a “Does this match known CSAM?” filter. Google and Microsoft had already implemented something similar, except they didn’t scan your shit on-device.

      At the end of the day, Apple might be a bit more private, but it’s a wash. It’s not transparent and neither is Google. I like using their devices. Sometimes I miss the freedom of custom ROMs, but my damn banking apps stopped working on Lineage and I couldn’t be arsed to start using the banks’ mobile websites again like I’d done in the past. So I moved to iOS, as Oneplus had completely botched their Android experience in the meantime while I’d been using Lineage so I was kinda pissed at what I had considered one of the last remaining decent Android manufacturers (Sonys are overpriced and I will never own a Samsung, I hate them, I didn’t like my Huawei or Xiaomi much either).

      So if you want to run custom ROMs, get a Pixel or something. If not, Apple is as good a choice as Android. A couple of years ago it was the better choice even, as you’d get longer software support, but now the others have started catching up due to all the consumer outrage.
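
The "does this match known CSAM?" filter described above can be sketched with an exact cryptographic hash standing in for the perceptual hash real systems use. Exact hashes break on any re-encoding; perceptual hashes trade that robustness for the collision problems discussed elsewhere in this thread. The "known bad" bytes here are obviously stand-ins:

```python
import hashlib

# Sketch of a known-material filter: hash the content, check membership in a
# database of known hashes. A SHA-256 stands in for the perceptual hash real
# systems use; the "known" bytes are placeholders for illustration.

KNOWN_HASHES = {hashlib.sha256(b"known-bad-file-bytes").hexdigest()}

def matches_known(content: bytes) -> bool:
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES

assert matches_known(b"known-bad-file-bytes")
assert not matches_known(b"family photo bytes")
# With an exact hash, any change at all breaks the match - hence the pull
# toward fuzzy matching, and the attacks that came with it:
assert not matches_known(b"known-bad-file-bytes ")
```

Note the key property being claimed for such filters: it is a membership test against known material, not a classifier that judges new photos - which is exactly the distinction the comment above draws.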

    • @[email protected]
      link
      fedilink
      English
      154 months ago

      The short answer is: Apple collects much of the same data as any other modern tech composite, but their “walled garden” strategy means that for the most part only THEY have access to that info.

      It’s technically lower risk since fewer parties have access to the data, but philosophically just about equally as bad because they aren’t doing this out of any real love for privacy (despite what their marketing department might claim)