One chestnut from my history in lottery game development:

While our security staff was extremely rigorous and did a generally good job, their levels of paranoia were often off the charts.

Once they went around hot-gluing shut all of the “unnecessary” USB ports on our PCs under the premise of mitigating data theft via thumb drive, while ignoring that we were all Internet-connected, that VPNs are a thing, and that every machine had a rewritable optical drive.

  • AstralWeekends

    Made me write SQL updates that had to be run by someone in a different state with pretty much no knowledge of SQL.

  • _haha_oh_wow_

    I used to work with a guy who glued the USB ports shut on his lab machines. I asked him why he didn’t just turn them off in the BIOS and then lock the BIOS behind a password, and he just kinda shrugged. He wasn’t security, but it’s kinda related to your story.

    ¯\_(ツ)_/¯

    Security where I work is pretty decent really, I don’t recall them ever doing any dumb crazy stuff. There were some things that were unpopular with some people but they had good reasons that far outweighed any complaints.

    • @[email protected]

      I just wrote a script that let me know if USB devices changed and emailed me. It was kinda funny the one time someone unplugged a USB hub to run a vacuum. I came running as like 20 messages popped up at once.
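
      Roughly that sort of watcher, as a minimal sketch (assuming a Linux box, polling /sys/bus/usb/devices, and an internal SMTP relay; the host and address are placeholders, not the original script):

      ```python
      import os
      import time
      import smtplib
      from email.message import EmailMessage

      SMTP_HOST = "smtp.example.internal"   # placeholder relay, not from the original script
      ALERT_ADDR = "alerts@example.com"     # placeholder mailbox

      def usb_devices():
          # Each entry under /sys/bus/usb/devices is a connected USB device or port.
          return set(os.listdir("/sys/bus/usb/devices"))

      def send_alert(entry):
          msg = EmailMessage()
          msg["Subject"] = f"USB change: {entry}"
          msg["From"] = ALERT_ADDR
          msg["To"] = ALERT_ADDR
          msg.set_content(f"USB device added or removed: {entry}")
          with smtplib.SMTP(SMTP_HOST) as smtp:
              smtp.send_message(msg)

      def watch(interval=10):
          seen = usb_devices()
          while True:
              time.sleep(interval)
              current = usb_devices()
              for entry in sorted(current ^ seen):
                  # One mail per added/removed entry; yanking a hub drops many
                  # devices at once, hence the flood of messages.
                  send_alert(entry)
              seen = current

      if __name__ == "__main__":
          watch()
      ```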

    • KrudlerOP

      I completely hear you.

      When they did this for the stated reason of preventing data theft via thumb drive, the mice & keyboards were still plugged into their respective USB ports, and if I really wanted I could just unplug my keyboard and pop in a thumb drive. Drag, drop, data theft, done.

      Further to this madness, within a week half of the staff had USB hubs attached to their machines, purchased at dollar stores. Like…?

      At any time, if I had wanted to steal data I could have just zipped it and uploaded it to a sharing site. Or transferred it to my home PC through a virtual machine and VPN. Or burned it using the optical drive. Or come up with 50 other ways to do it under their noses and not be caught.

      Basically it was just a bunch of dingbat IT guys in a contest to see who could find a threat behind every bush. IT policy via SlashDot articles. And the assumption that the very employees who have physical access to the computers… are the enemy.

      Okay, I’ll concede that SOMEWHERE in the world there exists a condition where somebody has to prevent the insertion of an unauthorized thumb drive; where they don’t have access to the BIOS, don’t have the password, or that model doesn’t allow disabling the ports; where no other necessary devices are plugged in by USB; where policy isn’t or can’t be set to prevent new USB devices from being added to the system; and where this whole enchilada sits in a high-traffic area with no physical security and plenty of unknown actors.

      Right.

  • @[email protected]

    Ours is terrible for making security policy in a vacuum, with a few select higher-level IT folks, when the policy will impact technical solution options and no one has sorted out the process for using the new “secure” way first. You end up finding that something you thought would be a day-or-two task becomes a weeks-long odyssey to define new processes and technical approaches. Or sometimes you just outright abandon the work because the headache isn’t worth it.

    • @[email protected]

      Ours does this too. Except they stick to their guns and we end up having to just work around the new impediment they’ve created for months until it happens to inconvenience someone with enough pull to make them change it.

  • @[email protected]

    One IT security team insisted we have separate source code repositories for production and development environments.

    I’m honestly not sure how they thought that would work.

    • mesa

      Yep, doing that now. Not sustainable in the slightest. I’m glad I’m not in charge of that system.

    • Tar_Alcaran

      I’m honestly not sure how they thought that would work.

      Just manually copy-paste everything. That never goes wrong, right?

      • @[email protected]

        I mean, it’s what the Security guys do, right? Just copy+paste everything, mandate that everyone else does it too, Management won’t argue because it’s for “security” reasons.

        Then the Security guys will sit around jerking each other off about how much more secure they made the system

    • @[email protected]

      Could work if dev was upstream from prod. But honestly there would be no difference between that and branches.

        • @[email protected]

          Yeah…assuming that the policy was written “from blood” (meaning someone did something stupid).

          But even then you can put other checks and balances in place to make sure that kind of thing doesn’t happen.

          This is such an extreme reaction though. Or the policy was made by someone dumb.

    • @[email protected]

      That’s fucking bananas.

      In my job, the only difference between prod and dev is a single environment file. Two repositories would literally serve no purpose and, if anything, would double the chances of the source code being stolen.

      • @[email protected]

        That was the only difference for us as well. The CI/CD process built container images. Only difference between dev, test, and prod was the environment variables passed to the container.
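
        As a rough illustration of that setup (the variable names are made up, not taken from the actual pipeline): the same image reads its configuration from the environment at startup, so dev, test, and prod run identical code.

        ```python
        import os

        # Identical code in every environment; only these values change per deployment.
        # Names are illustrative, not taken from the pipeline described above.
        DATABASE_URL = os.environ["DATABASE_URL"]
        LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")

        if __name__ == "__main__":
            # dev:  DATABASE_URL=postgres://dev-db/app  LOG_LEVEL=DEBUG
            # prod: DATABASE_URL injected by the orchestrator, LOG_LEVEL=WARNING
            print(f"db={DATABASE_URL} log_level={LOG_LEVEL}")
        ```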

        At first I asked the clueless security analyst to explain how that improves security, which he couldn’t. Then I asked him how testing against one repository and deploying from another wouldn’t invalidate the results of the testing done by the QA team, but he kept insisting we needed it to check some box. I asked about the source of the policy and still no explanation, at least not one that made any sense.

        Security analyst escalated it to his (thankfully not clueless) boss who promptly gave our process a pass and pointed out to Mr security analyst that literally nobody does that.

  • dual_sport_dork 🐧🗡️

    Not my IT department (I am my IT department): One of the manufacturers for a brand of equipment we sell has a “Dealer Resource Center,” which consists solely of a web page where you can download the official product photography and user’s manuals, etc. for their products. This is to enable you to list their products on your e-commerce web site, or whatever.

    Apparently whoever they subcontracted this to got their hands on a copy of Front End Dev For Dummies, and in order to use this you must create a mandatory account with minimum password complexity requirements, and solve a CAPTCHA every time you log in. They also require you to change your password every 60 days, and if you don’t they lock your account and you have to call their tech support.

    Three major problems with this:

    1. There is no verification check that you are actually an authorized dealer of this brand of product, so any fool who finds this on Google and comes up with an email address can just create an account and away you go downloading whatever you want. If you’ve been locked out of your account and don’t feel like picking up the telephone – no problem! Just create a new one.

    2. There is no personalized content on this service. Everyone sees the same content, and it’s not like there’s a way to purchase anything on here anyway. Your “account” stores no identifying information about you or your dealership other than your email address and whatever else you feel like giving it. You are free to fill it out with a fake name if you like; no one checks. You could create an account using [email protected] and no one would notice.

    3. Every single scrap of content on this site is identical to the images and .pdf downloads already available on the manufacturer’s public web site. There is no privileged or secure content hosted in this “Resource Center” whatsoever. The pictures aren’t higher res or anything. Even the file names are the same. It’s obviously hooked up to the same backend as the manufacturer’s public web site. So if there were such a thing as a “bad actor” who wanted to obtain a complete library of glamor shots of durable goods, for some reason, there’s nothing stopping them from scraping the public web site and coming up with literally exactly the same thing.

    It’s baffling.

  • Jeena

    There was a server I inherited from colleagues who resigned, mostly serving static HTML. I would occasionally do an apt update && apt upgrade to keep nginx and friends updated, and I installed certbot because IT told me that this static HTML site should be served via HTTPS, fair enough.

    Then I went on parental leave and someone blocked all outgoing internet access from the server. Now certbot can’t renew the certificate and I can’t run apt. Then I got a ticket to update nginx and they told me to use SSH to copy the files needed.

    • @[email protected]

      They are sort of right but have implemented it terribly. Serving out a static webpage is pretty low on the list of “things that are exploitable”, but it’s still an entry point into the network (unless this is all internal, in which case this gets a bit silly). What you need to do is get IT to set up a proxy and run apt/certbot through that proxy. It defends against some basic reverse shell techniques and gives you better control over the web host’s traffic. Even better would be to put a WAF and a basic load balancer in front of the web host, AND proxy external communications.

      Blocking updates/security services is dogshit though and is usually done by people that are a bit slow on the uptake. Basically they have completely missed the point of blocking external comms and created a way more massive risk in the process… They either need to be politely corrected or shamed mercilessly if that doesn’t work.

      Good luck though! I’m just glad I’m not the one that has to deal with it.

  • @[email protected]

    They forbid us from adding our SSH keys to some server machines, and force us to log in to these servers with a non-personal admin account, with a password that is super easy to guess and hasn’t been changed in 5 years.

  • TechyDad

    ZScaler. It’s supposedly a security tool meant to keep me from going to bad websites. The problem is that I’m a developer and the “bad website” definition is overly broad.

    For example, they’ve been threatening to block PHP.Net for being malicious in some way. (They refuse to say how.) Now, I know a lot of people like to joke about PHP, but if you need to develop with it, PHP.Net is a great resource to see what function does what. They’re planning on blocking the reference part as well as the software downloads.

    I’ve also been learning Spring Boot for development as it’s our standard tool. Except I can’t build a new application. Why not? Doing so requires VSCode to download some resources and - you guessed it - ZScaler blocks this!

    They’ve “increased security” so much that I can’t do my job unless ZScaler is temporarily disabled.

    • @[email protected]

      Oh man, our security team is trialing Zscaler and Netskope right now. I’ve been sitting in the meetings and it seems like it’s just cloud-based GlobalProtect. GP was really solid, so this worries me.

    • @[email protected]

      It’s been ages since I had to deal with the daily random road blocks of ZScaler, but I do think of it from time to time.

      Then I play Since U Been Gone by Kelly Clarkson.

    • @[email protected]

      Yeah. Zscaler was once blocking me from accessing the Cherwell ticket system, which meant I couldn’t even write a ticket saying that Zscaler had blocked my access to Cherwell.

      Took me a while to get an IT guy to fix it without a ticket.

    • AggressivelyPassive

      Also, zScaler breaks SSL. Every single piece of network traffic is open for them to read. Anyone who introduces zscaler should be fired and/or shot on sight. It’s garbage at best and extremely dangerous at worst.
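
      A quick way to see that interception for yourself, as a hedged sketch (nothing product-specific; example.com stands in for any site): if the issuer printed below is your employer’s proxy CA rather than a public CA, your “end-to-end” TLS terminates at the middlebox.

      ```python
      import socket
      import ssl

      def issuer_of(hostname, port=443):
          # Connect using the system trust store and report who issued the
          # certificate this machine was actually handed.
          ctx = ssl.create_default_context()
          with socket.create_connection((hostname, port), timeout=5) as sock:
              with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
                  cert = tls.getpeercert()
          return dict(pair[0] for pair in cert["issuer"]).get("organizationName")

      if __name__ == "__main__":
          # Behind a TLS-intercepting proxy this prints the corporate CA's name,
          # not the public CA that actually issued the site's certificate.
          print(issuer_of("example.com"))
      ```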

        • AggressivelyPassive

          And it’s a horrible point. You’re opening up your entire external network traffic to a third party, whose infrastructure isn’t even deployed or controllable in any form by you.

          • @[email protected]

            The idea being that it’s similar to using other enterprise solutions, many of which do the same things now.

            Zscaler does have lesser settings too; at its most basic it can do split tunneling for internal services at an enterprise level and easy user management, which is a huge plus.

            I’d also like to point out that the entire Internet is a third party you have no control over, which you open your external traffic to every day.

            The bigger deal would be the internal network, which is also a valid argument.

            • AggressivelyPassive

              I’d also like to point out that the entire Internet is a third party you have no control over, which you open your external traffic to every day.

              Not really. Proper TLS enables relatively secure E2E encryption; not perfect, but pretty good. Adding Zscaler means that my entire outgoing traffic runs over one point. So one single incident at one single provider basically opens up all of my communication. And given that so many large orgs are customers of ZScaler, this company pretty much has a target on its back.

              Additionally: I’m in Germany. My company does a lot of contracting and communication with local, state and federal entities, a large part of which is not super secret, but definitely not public either. And now suddenly an American company, which is legally required to hand over all data to the NSA, CIA, FBI, etc., has access to (again) all of my external communication. That’s a disaster. And quite possibly pretty illegal.

    • @[email protected]

      It has the same problem as any kind of TLS interception/ traffic monitoring tool.

      It just breaks everything and causes a lot of lost time and productivity: first, trying to configure everything to trust a new cert (plenty of apps refuse to use the system cert store), and second, opening tickets with IT just to get to any useful site on the internet.
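
      A concrete example of the “refuse to use the system cert store” problem, as a sketch (the CA path is hypothetical): Python’s requests library ships its own CA bundle, so behind an intercepting proxy every call fails certificate validation until you point it at the corporate root CA explicitly.

      ```python
      import requests

      # Hypothetical path to the corporate proxy's root CA, exported as PEM.
      # Without it, requests validates against its bundled CAs and rejects the
      # certificate re-signed by the interception proxy.
      CORP_CA = "/etc/ssl/certs/corp-proxy-ca.pem"

      resp = requests.get("https://pypi.org/simple/", verify=CORP_CA, timeout=10)
      print(resp.status_code)
      ```

      Setting the REQUESTS_CA_BUNDLE environment variable does the same thing without code changes; every other tool has its own knob, which is where the lost time goes.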

      Thankfully, at least in my case, it’s trivial to disable so it’s the first thing I do when my computer restarts.

      Security doesn’t seem to do any checks on what processes are actually running, so they think they’ve done a good job and I can continue to do my job.

  • @[email protected]

    A long time ago in a galaxy far away (before the internet was a normal thing to have) I provided over-the-phone support for a large and complex piece of software.

    So, people would call up and you had to describe how they could do the thing they needed to do, and if that failed they would have to wait a few days until you went to the site to sort it in person.

    The software we supported was not on the approved list for the company I worked for, so you couldn’t use it within the building where the phones were being answered.

    • @[email protected]

      I’m absolutely shocked that a company had a software whitelist before the widespread adoption of the internet. Ahead of their time in implementing, and fucking up, software whitelisting!

      • @[email protected]

        It was for government-owned computers; they didn’t want any pirated or virus-infected stuff, and at that point there was no way to lock down such a mish-mash of systems.

        The software company (who also do things like run prisons these days) had given permission for us to run the software and given a set of fake data so we could go through the motions when talking people through things, but apparently that wasn’t enough to get it on the list.

  • @[email protected]

    Removed admin access for all developers without warning and without a means for us to install software. We got access back in the form of a secondary admin account a few days later, it was just annoying until then.

    • Brkdncr

      Local admin on your interactive account is just bad though.

    • @[email protected]

      I had the same problem once. Every time I needed to be an admin, I had to send an email to an outsourced guy in another country, and wait one hour for an answer with a temporary password.

      With WSL and Linux, I needed to be admin 3 or 4 times per day. I CCed my boss for every request. When he saw that I was waiting and doing nothing for 4 hours every day, he sent them an angry email and I got my admin account back.

      The stupid restriction was meant for managers and sales people who didn’t need an admin account. It was annoying for developers.

      • mesa

        I worked at a big-name health insurance company that did the same. You would have to send them an email, wait a week, then give them a call to get them to do anything. You could not install anything yourself; it was always a person remoting into your computer. After a month, I still didn’t have Visual Studio installed when they wanted me to work on some .NET. Then they installed the wrong version of Visual Studio, so the whole process had to be restarted.

        I got a new job within 3 months and just noped out.

    • @[email protected]

      As a security guy - as soon as I can get federal auditors to agree, I’m getting rid of password expiration.

      The main problem is they don’t audit with logic. It’s a script and a feeling. No password expiration FEELS less secure. Never mind the literal years of data and research. Drives me nuts.

      • @[email protected]

        Cite NIST SP 800-63B.

        Verifiers SHOULD NOT impose other composition rules (e.g., requiring mixtures of different character types or prohibiting consecutively repeated characters) for memorized secrets. Verifiers SHOULD NOT require memorized secrets to be changed arbitrarily (e.g., periodically). However, verifiers SHALL force a change if there is evidence of compromise of the authenticator.

        https://pages.nist.gov/800-63-3/sp800-63b.html

        I’ve successfully used it to tell auditors to fuck off about password rotation in the healthcare space.

        Now, to be in compliance with NIST guidelines, you do also need to require MFA. This document is what federal guidelines are based on, which is why you’re starting to see Federal gov websites require MFA for access.

        Either way, I’d highly encourage everyone to give the full document a read through. Not enough people are aware of it and this revision was shockingly reasonable when it came out a year or two ago.

      • @[email protected]

        It’s counterintuitive. It drives people to use less secure passwords that they’re likely to reuse or just increment: Password1, Password2, etc.

  • Illecors

    Hasn’t made life hell, but the general dumb following of compliance has left me baffled:

    • users must not be able to have a crontab. Crontab for users disabled.
    • compliance says nothing about systemd timers, so these work just fine 🤦

    I’ve raised it with security and they just shrugged it off. Wankers.

    • mesa

      That’s really funny. Made my day, thanks.

      Are they super old school and don’t know about systemd? Or are they doing something out of compliance that they may hate too? I have so many questions.

      • Illecors

        I actually think they’re new school enough where Linux to them means a lot less than it does to us. And so they don’t feel at home on a Linux machine and, unfortunately, don’t care to learn.

        I could totally be wrong, though. Maybe I’m the moron.

        • mesa

          I don’t think you’re the moron. That’s super strange. I can only think it might be some sort of standard that they had to comply with… or whatever.

  • @[email protected]

    I had to run experiments that generate a lot of data (think hundreds of megabytes per minute). Our laptops had very little internal storage. I wasn’t allowed to use an external drive, or my own NAS, or the company share - instead they said “can’t you just delete the older experiments?”… Sure, why would I need the experiment data I’m generating? Might as well /dev/null it!

    • KrudlerOP

      Less the Lady Gaga obfuscation.

      We had 40,000 blank discs laying around at all times… because they were a regular part of sending art/data proofs to customers.

      o_O

  • @[email protected]

    We have a largish number of systems that IT declared categorically could not connect directly to the Internet for any reason.

    So guess what systems weren’t getting updates. Also guess what systems got overwhelmed by ransomware exploiting what would have been a patched vulnerability, which came in through someone’s laptop that was allowed to connect to the Internet.

    My department was fine, because we broke the rules to get updates.

    So did the network team admit the flaw in their strategy? No, they declared that a USB key must have been the culprit, and they literally went into every room, confiscated all USB keys, and threw them away, with quarterly audits to make sure no USB keys appear. The systems are still not being updated, and laptops with Internet connections are still indirectly bridging them.

    • irotsoma

      Wait, why don’t they use patch management software? If they allow computers with Internet access to connect to them, why not a patch management server?

      • @[email protected]

        They do. In fact, they mandate that IT assets have three competing patch management tools on them. They mandate disabling any auto-updates because they have to vet them first. My official laptop hasn’t been pushed an update in 8 months.

          • @[email protected]

            Ironically, we actually have a segment of our business that provides IT for other companies, and they do a decent job, but they aren’t allowed to manage our own IT. Best guess is that they are too expensive to waste on our own IT needs. If an IT staff member accidentally shows competence, they are probably moved to the billable group.

    • @[email protected]

      Also, I keep a “rogue” laptop to self-administer, alongside my official IT laptop to show I am in compliance. Updates are disabled and are only allowed to be done by IT. I just checked and they haven’t pushed any updates for about 8 months.