Per one tech forum this week: “Google has quietly installed an app on all Android devices called ‘Android System SafetyCore’. It claims to be a ‘security’ application, but whilst running in the background, it collects call logs, contacts, location, your microphone, and much more, making this application ‘spyware’ and a HUGE privacy concern. It is strongly advised to uninstall this program if you can. To do this, navigate to ‘Settings’ > ‘Apps’, then delete the application.”

  • @[email protected]
    71 · edited · 2 months ago

    For people who have not read the article:

    Forbes states that there is no indication that this app can or will “phone home”.

    Its stated use is for other apps to scan an image they already have access to, to find out what kind of thing it is (known as “classification”). For example, to find out whether the picture you’ve been sent is a dick pic so the app can blur it.

    My understanding is that, if this is implemented correctly (a big ‘if’), this can be completely safe.

    Apps requesting classification could be limited to only classifying files that they already have access to. Remember that Android nowadays has a concept of “scoped storage” that lets you restrict folder access. If this is the case, well, it’s no less safe than not having SafetyCore at all. It just saves you space, as companies like Signal, WhatsApp etc. no longer need to train and ship their own machine learning models inside their apps; it becomes a common library / API any app can use.
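    A toy sketch of that permission model (illustrative Python; the paths, the `classify` function, and the dummy label are all made up here, not SafetyCore’s real interface):

```python
from pathlib import Path

# Folders this hypothetical app was actually granted under scoped storage.
GRANTED_SCOPES = [Path("/sdcard/Pictures/Messenger")]

def in_scope(path: Path) -> bool:
    """True only if the file sits inside a folder the app can already read."""
    return any(scope in path.parents for scope in GRANTED_SCOPES)

def classify(path: Path) -> str:
    """Stand-in for a shared on-device classifier: it refuses anything the
    calling app could not already read on its own, so offering the service
    grants no new access."""
    if not in_scope(path):
        raise PermissionError(f"{path} is outside the app's granted scope")
    # A real service would run a local ML model here; we return a dummy label.
    return "safe"

assert classify(Path("/sdcard/Pictures/Messenger/photo.jpg")) == "safe"
```

    Under that assumption, the shared classifier is gated by the same storage grants the app already has, which is the “no less safe than not having SafetyCore” argument above.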

    It could, of course, if implemented incorrectly, allow apps to snoop without asking for file access. I don’t know enough to say.

    Besides, you think that Google isn’t already scanning for things like CSAM? It’s been confirmed to be done on platforms like Google Photos well before SafetyCore was introduced, though I’ve not seen anything about it being done on devices yet (correct me if I’m wrong).

    • Ulrich
      33 · 2 months ago

      Forbes states that there is no indication that this app can or will “phone home”.

      That doesn’t mean that it doesn’t. If it were open source, we could verify it. As is, it should not be trusted.

        • @[email protected]
          4 · 2 months ago

          The Graphene devs say it’s a local-only service.

          Open source would be better (and I can easily see open source alternatives being made if you’re not locked into a Google Android-based phone), but the idea is sound and I can deny network privileges to the app with Graphene so it doesn’t matter if it does decide to one day try to phone home… so I’ll give it a shot.

          • @[email protected]
            8 · 2 months ago

            God I wish I could completely deny internet access to some of my apps on stock Android. It’s obvious why they don’t allow it, though.

            • @[email protected]
              3 · 2 months ago

              Check out NetGuard. It’s an app that pretends to be a VPN client so most of your traffic has to go through it, and then you can deny/allow internet access per app. It even works without root.
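              Conceptually, that per-app firewall is just a rule table consulted before forwarding traffic. A minimal sketch (illustrative Python; the app IDs are invented, and the real app of course filters actual packets via Android’s VpnService rather than a dictionary):

```python
# Toy model of a per-app firewall in the spirit of a VPN-based filter:
# the "VPN" sees each outgoing connection attributed to an app and
# consults a user-editable rule table before forwarding it upstream.
RULES = {
    "com.example.flashlight": False,  # user denied internet access
    "com.example.browser": True,      # user allowed internet access
}
DEFAULT_ALLOW = False  # block apps with no explicit rule (default-deny)

def should_forward(app_id: str) -> bool:
    """Decide whether this app's traffic gets forwarded to the network."""
    return RULES.get(app_id, DEFAULT_ALLOW)

assert should_forward("com.example.browser")
assert not should_forward("com.example.flashlight")
```

              Default-deny is the interesting design choice: a newly installed app gets no network until the user says otherwise.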

    • @[email protected]
      10 · 2 months ago

      Issue is, a certain cult (Christian dominionists), with the help of many billionaires (including Muskrat), has installed a fucking dictator in the USA, and they are making good on their vow to “save every soul on Earth from hell”. If you get a porn ban, it’ll phone not only home, but directly to the FBI’s new “moral police” unit.

    • @[email protected]
      10 · edited · 2 months ago

      This is EXACTLY what Apple tried to do with their on-device CSAM detection. It had a ridiculous number of safeties to protect people’s privacy, and still it got shouted down.

      I’m interested in seeing what happens when Holy Google, for which most nerds have a blind spot, does the exact same thing

      EDIT: from looking at the downvotes, it really seems that Google can do no wrong 😆 And Apple is always the bad guy in lemmy

      • Noxy
        20 · 2 months ago

        it had a ridiculous amount of safeties to protect people’s privacy

        The hell it did, that shit was gonna snitch on its users to law enforcement.

        • @[email protected]
          1 · 2 months ago

          Nope.

          A human checker would get a reduced-quality copy after multiple CSAM matches. No police were to be called if the human checker didn’t verify a positive match.

          Your idea of flooding someone with fake matches that are actually cat pics wouldn’t have worked

          • Noxy
            4 · 2 months ago

            That’s a fucking wiretap, yo

      • @[email protected]
        2 · 2 months ago

        Overall, I think this needs to be done by a neutral 3rd party. I just have no idea how such a 3rd party could stay neutral. Same with social media content moderation.

      • Lka1988
        15 · edited · 2 months ago

        I have 5 kids. I’m almost certain my photo library of 15 years has a few completely innocent pictures where a naked infant/toddler might be present. I do not have the time to search 10,000+ pics for material that could be taken completely out of context and reported to authorities without my knowledge. Plus, I have quite a few “intimate” photos of my wife in there as well.

        I refuse to consent to a corporation searching through my device on the basis of “well just in case”, as the ramifications of false positives can absolutely destroy someone’s life. The unfortunate truth is that “for your security” is a farce, and people who are actually stupid enough to intentionally create that kind of material are gonna find ways to do it regardless of what the law says.

        Scanning everyone’s devices is a gross overreach and, given the way I’ve seen Google and other large corporations handle reports of actually-offensive material (i.e. they do fuck-all), I have serious doubts over the effectiveness of this program.

      • Natanael
        16 · edited · 2 months ago

        Apple had it report suspected matches, rather than warning locally

        It got canceled because the fuzzy hashing algorithms turned out to be so insecure it’s unfixable (easy to plant false positives)

        • @[email protected]
          1 · 2 months ago

          They were not “suspected”; they had to be matches to actual CSAM.

          And after that a reduced-quality copy was shown to an actual human, not an AI like in Google’s case.

          So the false positive would slightly inconvenience a human checker for 15 seconds, not get you Swatted or your account closed

          • Natanael
            2 · edited · 2 months ago

            Yeah, so here’s the next problem: downscaling attacks exist against those algorithms too.

            https://scaling-attacks.net/

            Also, even if those attacks were prevented, they’re still going to look through basically your whole album if you trigger the alert.
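            For intuition, here is a toy version of a scaling attack (illustrative Python with numpy; real attacks target production downscalers such as bilinear filters, per the link above): by planting a few pixels exactly where a naive downscaler samples, the small copy the scanner sees becomes a completely different image from the full-size one a human sees.

```python
import numpy as np

def naive_downscale(img, factor):
    # Toy downscaler: keep every `factor`-th pixel, no anti-aliasing filter.
    return img[::factor, ::factor]

# An 8x8 "decoy" pattern we want the scanner to see (a checkerboard here).
decoy = np.zeros((8, 8), dtype=np.uint8)
decoy[::2, ::2] = 255

# Hide the decoy inside a 64x64 "innocent" image: place the decoy's pixels
# exactly where the downscaler will sample, fill everything else with gray.
attack = np.full((64, 64), 128, dtype=np.uint8)
attack[::8, ::8] = decoy

# Only 64 of 4096 pixels differ from the gray image a human glances at,
# yet the downscaled copy is exactly the decoy.
assert np.array_equal(naive_downscale(attack, 8), decoy)
```

            Real downscalers average neighborhoods instead of point-sampling, which makes the attack harder but, per the research linked above, still practical.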

            • @[email protected]
              1 · 2 months ago

              And you’ll again inconvenience a human slightly as they look at a pixelated copy of a picture of a cat or some noise.

              No cops are called, no accounts closed

              • Natanael
                1 · 2 months ago

                The scaling attack specifically can make a photo sent to you look innocent to you and malicious to the reviewer, see the link above

        • @[email protected]
          2 · 2 months ago

          The official reason they dropped it is because there were security concerns. The more likely reason was the massive outcry that occurs when Apple does these questionable things. Crickets when it’s Google.

          The feature was re-added as a child safety feature called “Communication Safety” that is optional on child accounts and automatically blocks nudity sent to children.

    • @[email protected]
      16 · 2 months ago

      Doing the scanning on-device doesn’t mean that the findings cannot be reported further. I don’t want others going through my private stuff without asking - not even machine learning.

    • @[email protected]OP
      41 · 2 months ago

      If there was something that could run Android apps virtualized, I’d switch in a heartbeat.

        • @[email protected]OP
          1 · 2 months ago

          Not necessarily… I mean, if they run under the same VM, I’d be fine with that as well… but having a sandboxed wrapper would for sure be nice.

        • @[email protected]
          12 · 2 months ago

          I gave it a run on Ubuntu Touch with a Fairphone like 8 months ago… It was still pretty rough then.

          • @[email protected]
            2 · 2 months ago

            I remember reading recently that it’s gotten better (haven’t tried myself so don’t hold me to it). I can say that Wayland in general has come a long way since I switched to Linux ~2 years ago

      • @[email protected]
        1 · 2 months ago

        I have used Waydroid, mainly with FOSS apps, and although it has some rough edges, it often works when you just need the functionality of one or two Android apps.

        Linux on mobile as a whole isn’t daily-driver ready yet, in my opinion. I’ve only tried pmOS on an OP6, but that seems to be a leading project on a well-supported phone (compared to the rest).

      • @[email protected]
        5 · 2 months ago

        Every one of them can, AFAIK. I have a second cheap used phone I picked up to play with Ubuntu Touch, and it has a system called Waydroid for this. Not quite seamless, and you’ll want to use native apps when possible, but it does work.

        SailfishOS, PostmarketOS, Mobian, etc all also can use Waydroid or a similar thing

      • Refurbished Refurbisher
        9 · edited · 2 months ago

        There are two solutions for that. One is Waydroid, which is basically what you’re describing. Another is android_translation_layer, which is closer to WINE in that it translates API calls to more native Linux ones, although that project is still in the alpha stages.

        You can try both on desktop Linux if you’d like. Just don’t expect to run apps that require passing SafetyNet, like many banking apps.

        • @[email protected]OP
          4 · 2 months ago

          I know about WayDroid, but never heard of ATL.

          So yeah, while we have the fundamentals, we still don’t have an OS that’s stable enough as a daily driver on phones.

          And this isn’t a Linux issue. It’s mostly because of proprietary drivers. GrapheneOS already has the issue that it only works on Pixel phones.

          I can imagine bringing a Linux-only mobile OS to life is even harder. I wish Android phones were designed so that there is a driver layer and an OS layer, with standardized APIs to simply swap the OS layer for any Unix-like system.

          • Refurbished Refurbisher
            1 · 2 months ago

            Halium is basically what you’re talking about. It uses the Android HAL to run Linux.

            The thing is, that also uses the Android kernel, meaning that there will essentially never be a kernel update since the kernel patches by Qualcomm have a ton of technical debt. The people working on porting mainline Linux to SoCs are essentially rewriting everything from scratch.

    • @[email protected]
      29 · edited · 2 months ago

      The Firefox Phone should’ve been a real contender. I just want a browser in my pocket that takes good pictures and plays podcasts.

      • @[email protected]
        4 · 2 months ago

        Too bad Firefox is going the same way as Google; they are updating their privacy terms of use.

        • @[email protected]
          4 · 2 months ago

          Yep. I’m furious at Mozilla right now. But when the Firefox Phone was in development, they were one of the web’s heroes.

          • @[email protected]
            2 · 2 months ago

            It says it’s only for LLMs? As long as they don’t try to expand the “privacy” terms. In any case, I download alternative browsers anyway.

            • @[email protected]
              1 · 2 months ago

              I’m mostly just frustrated that the best option has now become merely the lesser evil.

      • @[email protected]
        20 · 2 months ago

        Unfortunately, Mozilla is going the enshittification route more and more. So perhaps it’s good, in this case, that the Firefox Phone did not take off.

    • @[email protected]
      1 · 2 months ago

      I just gave up and pre-ordered the Light Phone 3. Anytime I truly need a mobile app, I can just use an old iPhone and a WiFi connection.

  • @[email protected]
    1 · 2 months ago

    Great, it’ll have to plow through ~30GB of 1080p recordings of darkness and my upstairs neighbors living it up in the AMs. And nothing else.

  • @[email protected]
    14 · edited · 2 months ago

    This is the stupidest shit, moral panic levels of miscomprehension. I mean, I was miffed and promptly removed safetycore because I don’t mind seeing sex organs and don’t want shit using battery for no reason, but wow Forbes.

    Edit: ok, the article is not so bad, just the shitty blurb from some forum reproduced here on Lemmy.

  • @[email protected]
    13 · 2 months ago

    Kind of weird that they are installing this dependency whether or not you will enable those planned scanning features. Here is an article mentioning that future feature, Sensitive Content Warnings. It does sound kind of cool; less chance to accidentally send your dick pic to someone, I guess.

    Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing, and then prompts with a “speed bump” that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares.

    All of this happens on-device to protect your privacy and keep end-to-end encrypted message content private to only sender and recipient. Sensitive Content Warnings doesn’t allow Google access to the contents of your images, nor does Google know that nudity may have been detected. This feature is opt-in for adults, managed via Android Settings, and is opt-out for users under 18 years of age.

    • @[email protected]
      19 · 2 months ago

      Looks like more of a chance of false positives happening and getting the police to raid your home to confiscate your devices. I don’t care what the article says; I know Google is getting access to that data, because that’s who they are.

  • Kraiden
    5 · 2 months ago

    Huh. My device seems to have been skipped? I don’t do anything special, I’m using Play Store and Play Services, and I’m up to date, but it’s not showing up in my settings app list

    • falseprophet
      1 · 2 months ago

      Sometimes it uses a different name, I have noticed; try to see if something with a similar name is listed.

      • @[email protected]
        2 · edited · 2 months ago

        I haven’t had it yet either. The only suspicious thing I notice is some Android System Intelligence, but that has been there for a while now. I haven’t dared to uninstall/deactivate it yet, since I don’t know if anything critical depends on it. I haven’t noticed any suspicious network activity either in Rethink, beyond the usual bullshit like some uninstalled application still trying to connect to Google as “unknown”.

        • falseprophet
          4 · 2 months ago

          Maybe they’re experimenting with installing it on some phones. I had it, but under a different name. I couldn’t find it in my app list, but when someone posted a direct link to its Play Store page, it showed as installed.

          • @[email protected]
            2 · 2 months ago

            Hmm, I looked it up myself and it doesn’t seem to say it’s installed for me there. I can’t find it by searching on my phone, only on my PC through a search engine. But someone in the comments there brought up a good point: their old phone basically bricked because of this, due to it being incompatible.

            I also have a Fairphone, though I’m not sure if that really is the reason. Maybe they are indeed gradually rolling it out, then.

  • @[email protected]
    118 · 2 months ago

    Google says that SafetyCore “provides on-device infrastructure for securely and privately performing classification to help users detect unwanted content. Users control SafetyCore, and SafetyCore only classifies specific content when an app requests it through an optionally enabled feature.”

    GrapheneOS — an Android security developer — provides some comfort: SafetyCore “doesn’t provide client-side scanning used to report things to Google or anyone else. It provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users.”

    But GrapheneOS also points out that “it’s unfortunate that it’s not open source and released as part of the Android Open Source Project and the models also aren’t open let alone open source… We’d have no problem with having local neural network features for users, but they’d have to be open source.” Which gets to transparency again.

    • @[email protected]
      10 · 2 months ago

      Graphene could easily allow open-source solutions to emulate the SafetyCore interface, like how it handles Google’s location services.

      There are plenty of open-source libraries and models for running local AI; this seems like something that could easily be replicated in the FOSS world.

    • @[email protected]
      2 · 2 months ago

      Most people don’t really know what that actually means, and they don’t feel they have anything to hide from some nebulous corporate entity.

    • @[email protected]
      8 · 2 months ago

      Why, what do you recommend?

      I mean, you have just dismissed the whole Android ecosystem, and the only other alternative is Apple, which is questionably better.
      And this would even have applied to my Fairphone!
      Would have, if I hadn’t gotten rid of Google services the day I got it.

      • Wren
        3 · edited · 2 months ago

        I don’t have to recommend anything just because I’m asking why people are buying spyware tech.

        Just like I may not know the proper way to safely jump out of an airplane, but I do know a parachute is involved.

        A person asking why people do a thing that seems stupid isn’t obligated to solve the problem.

      • @[email protected]
        9 · 2 months ago

        Please, read the links. They are the security and privacy experts when it comes to Android. That’s their explanation of what this Android System SafetyCore actually is.

    • @[email protected]
      3 · edited · 2 months ago

      Graphene folks have a real love for the word “misinformation” (and “FUD”, and “brigading”). That’s not you under there 👻, Daniel, is it?

      After 5 years of his antics, hateful bullshit, and lies, I think I can genuinely say that word triggers me.

    • @[email protected]
      4 · 2 months ago

      So is this really just a local AI model? Or is it something bigger? My S25 Ultra has the app but it hasn’t used any battery or data.

    • Spaniard
      8 · 2 months ago

      If the app did what OP is claiming, then the EU would have a field day fining Google.

    • @[email protected]
      28 · 2 months ago

      To quote the most salient post:

      The app doesn’t provide client-side scanning used to report things to Google or anyone else. It provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users.

      Which is a sorely needed feature to tackle problems like SMS scams

      • @[email protected]
        6 · 2 months ago

        If the cellular carriers were forced to verify that caller ID (or the SMS equivalent) was accurate, SMS scams would disappear (or at least be weakened). Google shouldn’t have to do the job of the carriers, and if they wanted to implement this anyway, they should let the user choose which service performs the task, similar to how they let the user choose which “Android System WebView” is used.

        • @[email protected]
          3 · 2 months ago

          Carriers don’t care. They are selling you data; they don’t care how it’s used. Google is selling you a phone. Apple held down the market for a long time by being the phone with some of the best security. As an Android user, that makes me want to switch phones, not carriers.

      • @[email protected]
        2 · 2 months ago

        You don’t need advanced scanning technology running on every device, with access to every single bit of data you’ve ever seen, to detect scams. You need telco operators to stop forwarding messages with forged headers and… that’s it. Cheap, efficient, and zero risk of invading privacy through a piece of software you did not need but that was put there “for your own good”.

      • @[email protected]
        11 · 2 months ago

        Why do you need machine learning for detecting scams?

        Is someone in 2025 trying to help you out of the goodness of their heart? No. Move on.

        • @[email protected]
          4 · 2 months ago

          If you want to talk money, then it is in businesses’ best interest that money from their users is spent on their products, not scammed away through the use of their products.

          Secondly, machine learning algorithms can detect patterns in ways a human can’t. In some circles I’ve read that the programmers themselves can’t decipher from the code how the end result is produced, just that the inputs guide it. And while scammers can circumvent any carefully laid-down antispam, antiscam, or antivirus rule in traditional software, a learning algorithm will be magnitudes harder to bypass. Or easier. Depends on the algorithm.
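          As a toy illustration of that pattern-matching idea, here is a minimal naive Bayes spam scorer (illustrative Python; the training messages are invented and this bears no relation to SafetyCore’s actual models):

```python
from collections import Counter
import math

# Tiny made-up training sets of spam and legitimate ("ham") messages.
spam = ["claim your free prize now", "free money click now"]
ham = ["are we still meeting tomorrow", "see you at lunch"]

def train(docs):
    counts = Counter()
    for d in docs:
        counts.update(d.split())
    return counts

spam_counts, ham_counts = train(spam), train(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_likelihood(msg, counts):
    # Laplace-smoothed log-probability of the message under one class.
    total = sum(counts.values())
    return sum(math.log((counts[w] + 1) / (total + len(vocab)))
               for w in msg.split())

def is_spam(msg):
    return log_likelihood(msg, spam_counts) > log_likelihood(msg, ham_counts)

assert is_spam("free prize now")
assert not is_spam("meeting at lunch")
```

          Even this trivially simple model scores whole word patterns rather than matching fixed strings, which is the advantage over hand-written rules; the false-positive concern raised below applies to it just the same.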

          • @[email protected]
            4 · 2 months ago

            I don’t know the point of the first paragraph… scams are bad? Yes? Does anyone not agree? (I guess scammers do.)

            For the second, we are talking in the wild abstract, so I feel comfortable pointing out that every automated system humanity has come up with so far has pulled in our own biases, and since AI models are trained by us, this should be no different. Second, if the models are fallible, you cannot talk about success without talking about false positives. I don’t care if it blocks every scammer out there if it also blocks a message from my doctor. Until we have data on consensus between these new algorithms and desired outcomes, it’s pointless to claim they are better at X.

  • Sudomeapizza
    10 · 2 months ago

    For those having issues on Samsung devices: see here if you’re getting the “App not installed as package conflicts with an existing package” error:

    If you have a Samsung device, uninstall the app from the Knox Secure Folder as well: enter Secure Folder > Settings > Apps.