40 comments

  • ccurrens 1356 days ago
    > If you find password protected zips in the release the password is probably either "Intel123" or "intel123". This was not set by me or my source, this is how it was aquired from Intel.

    Can't say I'm surprised, people are lazy.

    Another large tech company I used to work for commonly used an only-slightly more complex password. But it was never changed, so people who had left the team could still have access to things if they knew the password. It was more of an entry point into the system than the company's Red team ever was.
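
    For anyone poking at the archives, a minimal sketch of feeding those passwords to an encrypted zip (Python stdlib only; "dump.zip" and the output directory are placeholder names, and the stdlib reader only handles the legacy ZipCrypto scheme, not AES):

      import zipfile

      with zipfile.ZipFile("dump.zip") as zf:
          try:
              zf.extractall(path="extracted", pwd=b"Intel123")
          except RuntimeError:
              # a wrong password raises RuntimeError, so retry with the other variant
              zf.extractall(path="extracted", pwd=b"intel123")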

    • schmichael 1356 days ago
      Password protection may have been used to bypass antivirus and other filters. While you should treat dumps like this with a lot of suspicion, treat password protected zips with a heaping dose of care as they may have been used to evade automated defenses.
      • pjc50 1356 days ago
        Yes - though not for hostile purposes, but because your own company's antivirus won't let you mail an executable to a colleague.
        • marcosdumay 1356 days ago
          Usually this. Or in my workplace, an image.

          Antivirus software is some crazy shit that may trigger on any random action, and it teaches people to follow the most unsafe procedures without questioning, just so they can get anything done.

          • myself248 1356 days ago
            I've heard it put this way: If you force users to trade convenience for security, they will find a way to obtain convenience at the expense of security.
            • akira2501 1356 days ago
              > If you force users to trade convenience for security

              I _wish_ it was better security they were making the trade for. It often isn't though. These programs are large, expensive, and don't do much most of the time. I feel there's a perverse incentive for developers to make their AV products as noisy as is possible to justify their own existence.

              And yet.. even with full AV rollouts locked down at the highest level, bad actors still get into networks and exploit them. So, to me it feels like our users are trading away their convenience for our misguided CYA policies.

              • NegativeLatency 1355 days ago
                There was that one AV with a JS interpreter running as root

                https://news.ycombinator.com/item?id=22544554

              • lmilcin 1355 days ago
                The truth is, you don't need much in the way of AV software if you are willing to outright block certain types of files.

                In most large corporations you are basically not allowed to send anything that could even potentially hide a virus, except for maybe Office files (nobody has yet built a compelling alternative to PowerPoint and Excel).

                Typical rules already block all executable binaries, scripts and password protected archives (because they could hold binaries or scripts), etc. As a Java developer I have recently discovered my company started blocking *.java files.

              • majewsky 1355 days ago
                My guess/fear is that most AV software gets deployed because some insurance policy requires you to tick that box.
                • fakecigar 1355 days ago
                  A lot of this stuff (AV software) is getting deployed at all different layers of the environment. Firewalls are getting better at dynamic file analysis and file blocking, and the endpoints are loaded with user behavior analytics, AV, and DLP tools. AV is so omnipresent because it's baked into a decent amount of the netsec appliances these companies stand up.
            • hinkley 1356 days ago
              If you make it harder for people to do the right thing than the wrong thing, they will choose the wrong thing.

              This has been brought up a million times in the context of DRM, but it is true in the general case as well.

              • tombert 1356 days ago
                I could be mistaken on this, but wasn't this basically the sales pitch for Spotify? Basically saying "you'll never get rid of piracy, but you can compete with it".
                • danudey 1356 days ago
                  This was the sales pitch for iTunes and the iTunes store:

                  "We approached it as 'Hey, we all love music.' Talk to the senior guys in the record companies and they all love music, too. … We love music, and there's a problem. And it's not just their problem. Stealing things is everybody's problem. We own a lot of intellectual property, and we don't like when people steal it. So people are stealing stuff and we're optimists. We believe that 80 percent of the people stealing stuff don't want to be; there’s just no legal alternative. So we said, Let's create a legal alternative to this. Everybody wins. Music companies win. The artists win. Apple wins. And the user wins because he gets a better service and doesn't have to be a thief."

                  https://www.esquire.com/news-politics/a11177/steve-jobs-esqu...

                  Another point of reference: because they had no legal ground to stand on, HBO targeted Canadian torrenters of Game of Thrones with an e-mail saying, among other things, "It's never been easier to [watch Game of Thrones legally]!"

                  This was true, it had never been easier. It had also never been harder. For the entire time that Game of Thrones was being aired, the only legal way for Canadians to watch it was to pay about a hundred dollars per month for cable and the cable packages that would give them HBO. You could buy it on iTunes, but only as a season, after the season was over.

                  So yeah, I kept torrenting it, everyone I know kept torrenting it, and everyone hated (or laughed at, or both) HBO the whole time.

                  • MaxBarraclough 1355 days ago
                    Interesting that it depends so much on region.

                    Here in the UK, Sky offer a cheap 'over-the-top' streaming alternative to their satellite offerings, [0] so you could watch Game of Thrones for £8/month, provided you didn't mind the inferior video quality.

                    [0] https://en.wikipedia.org/wiki/Now_TV_(Sky)

                    • Nursie 1355 days ago
                      They have a "topup" now which allows you to get real, full-fat 1080p.

                      Woohoo!

                      I did actually add that to my subscription, and during lockdown have used it to re-watch Game of Thrones :)

                      • MaxBarraclough 1355 days ago
                        I gave that a go but wasn't impressed by the 1080P quality. I suspect they're using a low bitrate.
                        • Nursie 1355 days ago
                          Most likely. You can get the bitrate to display (when the video controls are up maybe?) if you wanted to take a look.

                          Between that and whatever magic my OLED tv was doing, it looked pretty good to me.

                          Just a shame they haven't released it all in 4K/UHD yet...

                          • MaxBarraclough 1355 days ago
                            I doubt they'll offer 4K. They want to push people toward their expensive satellite packages for that.
                            • Nursie 1355 days ago
                              I meant HBO! I think GoT season 1 is the only season that's had a release at that res so far.

                              I was really hoping to get an HDR version of the "The long night", to address some of the banding and other visibility problems present in the episode, and maybe see a bit more of what went on. But there isn't one yet. So I watched it with the lights out so that my eyes adjusted :)

                              But yeah, you're probably right, NowTv has massive potential to undercut their main offering.

                • floatboth 1356 days ago
                  This was also a sales pitch for Steam – especially in developing countries where the whole concept of paying for non-physical things was a hard sell.

                  (Though in this case it wasn't just competition – access to official servers in online games was something that was often not pirateable.)

                • setr 1356 days ago
                  Not sure about Spotify, but I know Gabe Newell famously made basically this argument with regard to Steam's success.
            • TeMPOraL 1356 days ago
              It's true, and often it's not laziness - corporate security measures are often focused only on denying access, and they're so overbearing that, were they followed to the letter, they could easily shut the company down. It's through workarounds that actual work gets done.
              • Nasrudith 1356 days ago
                Sounds like a large organizational incentive-integration failure, where sub-pieces are at odds: each cares more about dodging blame, and anything outside its own domain isn't its problem. "Not My Fault/Not My Problem" as a toxic approach that makes balancing decisions worse.
              • vmilner 1353 days ago
                I remember having issues with a corporate email system where base64/uuencoded data would fail to get through with a very rough dependency on size - large files had a smaller chance of getting through but it was clear that there wasn't a hard size limit. Eventually someone twigged that the problem was a "rude word" scanner, and that beyond a certain size you would hit the "scunthorpe" problem, and forbidden words would appear in the ASCII text randomly.
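
                A back-of-the-envelope sketch of why size mattered (Python, purely illustrative; the word and sizes are made up): in random base64 text a given lowercase word of length k turns up roughly once per 64^k characters, so short forbidden words become near-certain once the attachment is big enough.

                  import base64, os

                  word = "ass"  # hypothetical short entry on the word filter's list
                  blob = base64.b64encode(os.urandom(2_000_000)).decode()
                  expected = len(blob) / 64 ** len(word)
                  print(f"expected hits: {expected:.1f}, actual hits: {blob.count(word)}")
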
            • rodgerd 1355 days ago
              The thing is, usability is security. People will do anything to be able to do their job (because people like being able to, you know, eat and stuff). Things that stop you doing your job are bad for security.

              I wish more of the security industry would get their frigging heads around this. PGP did less for messaging security over decades of availability than iMessage and Signal did in a few weeks of availability.

          • sjg007 1356 days ago
            Antiviruses will quarantine compiler output...
            • silverdemon 1355 days ago
              This 100%. I recall many a fun night at $BIGCORP burning the midnight oil, receiving the warning emails that my "unauthorised software" had been reported to my manager, and that it had been quarantined away for my own safety and convenience. Given that $BIGCORP was a tech firm my manager would be intensely delighted that they would receive regular midnight notifications that I was doing my job. Whatever that damn thing cost it would have been cheaper to let the malware do its thing.
              • nine_k 1355 days ago
                Windows development seems to be fun these days. I haven't touched it for a couple of decades.

                Sometimes I think that modern Windows is a nice platform already, even comfortable. (Like, you know, C++17 is very unlike C++98.) But then I'm reminded of the necessity to run an antivirus in front of it in a corporate environment.

                • quotemstr 1355 days ago
                  I intensely dislike corporate "security product" culture. For whatever reason, every IT department thinks that you have to ruin Windows with tons of invasive antivirus and monitoring software. I've seen zero evidence that these performance-killing tools are necessary. It's all theater. Microsoft itself doesn't do this shit to Windows, and neither should anyone else.
                • astura 1355 days ago
                  We have to have antivirus on our Linux computers for compliance.

                  Yes such a thing exists... https://www.mcafee.com/enterprise/en-us/products/virusscan-e...

                  • majewsky 1355 days ago
                    There was a discussion in our IT Security department about how to install McAfee on CoreOS servers. (For the uninitiated, CoreOS is a Linux distribution that comes without a package manager. It's intended as a base to run containers on, so you would deploy all software via container images.)

                    I remember someone suggesting to put McAfee into a fully isolated container that only exposes the port where it reports compliance, allowing it to scan itself to death all day long.

                  • roydivision 1355 days ago
                    There are legitimate use cases for anti virus on Linux, for instance when running mail or file servers.
                    • pantalaimon 1355 days ago
                      Aren't those scanning for Windows Viruses?
                      • nine_k 1355 days ago
                        Some can be cross-platform JS exploits.
            • AlotOfReading 1356 days ago
              At one company, Symantec would also quarantine the compiler and build system. It certainly made builds exciting to have the antivirus playing Russian roulette with the entire toolchain.
              • mcdevilkiller 1355 days ago
                Every time I went to configure a toolchain in JetBrains' CLion, CMake would create some test files and compile them. Windows Defender deleted every file and even the embedded toolchain. Fun :)
              • Spooky23 1355 days ago
                Of course many places have replaced dopey AV with creepier advanced tools like ATP or CrowdStrike.
            • pixl97 1355 days ago
              Ugh, welcome to my life.

              "You must exclude our program sub directory because temporary files are created containing interpreted code and your antivirus will ether block it outright, or lock the file so long you get application time outs"

          • TheSpiceIsLife 1356 days ago
            Let’s call a spade a spade.

            Antivirus software is malware.

        • danudey 1356 days ago
          In February, I e-mailed a python script to one of our developers to help debug an issue with their SSL configuration.

          Two days ago, I needed the script again but couldn't find it. Went to our e-mail thread and it said "the following potentially malicious attachments were blocked", showing mine, but... even from my outgoing mailbox? That seems ridiculous and problematic, considering that it sent fine at the time.

          I know that e-mail shouldn't be used as a replacement for Sharepoint or Dropbox or whatever, and I should have a local copy of what I need, but it just seems annoying and arbitrary.

          Anyway, I just logged into Outlook Web and downloaded it from the message there. Problem solved.

          • majewsky 1355 days ago
            If I had to deploy AV for mail, I would absolutely scan outgoing mail as well. Imagine if some compromised mail account in my org sends malware to accounts in other companies. These companies could then sue my company for negligence if they can show that we did not scan our mail for viruses on outbound (which could potentially be done by examining mail headers).

            (I am not a lawyer.)

          • dillonmckay 1356 days ago
            This has happened to me with gmail. Zipfiles I had sent in the past are no longer allowed to be downloaded from my sent items folder through the standard interface.
        • stefan_ 1356 days ago
          Your company's antivirus, or GMail. A binary? A zip with a binary? Nuh-uh.
          • Delk 1356 days ago
            To be fair, emailing binaries (apart from known types such as images, PDFs, etc.) is a rare enough use case for legitimate purposes and an easy enough way of spamming malware to clueless random people that it's probably a reasonable default for gmail.

            Having an option to allow them might be okay though. (I barely use gmail so I don't know if it has one or not.)

            • sjg007 1356 days ago
              Ah you must be young...
              • totetsu 1356 days ago
                for not using gmail? They hooked me in school.
                • Nasrudith 1356 days ago
                  For not sending binaries by email - there is no shame in being young in this case, as it means never developing the bad habits.

                  Before Dropbox and similar services it was far more the norm, and file-sharing systems like SharePoint may wind up not actually being used. Non-technical people in companies still do it all the time, practically using email as an ersatz version control system, to the cringe of IT.

                • Alekhine 1356 days ago
                  He means there used to be a time when people would mail binaries to each other more often, before they got too big and DRM'ed for that.
                  • bawolff 1355 days ago
                    There was also a time when alt.binaries was a thing (technically not email, but usenet is pretty similar)
          • scruffyherder 1355 days ago
            I use vmdk’s

            Seriously I don’t know how long it’ll last but a zip file into a fat32 disk image in a vmdk got through just fine.

            The bonus is that 7zip can extract from vmdk.

        • hnick 1356 days ago
          We just rename our files with .novirus on the end. I assume the main point is to stop executables from outside running with a click, or internal forwards of the same by compromised users which is why it's so easy to bypass.
        • swiley 1355 days ago
          Shouldn’t you put it in either eg artifactory or a code repo?
      • somehnguy 1356 days ago
        Yes. Whenever I email or transfer a zip via any method really I always put a basic password on it.

        I've been bitten way too many times by dumb filters that pick some file out of the zip and declare that it is malicious. I also don't trust messenger apps to not pull my files out and do who knows what with them. A basic password prevents this junk 99% of the time for almost no effort.

        It won't stop a determined system from cracking the password. But that isn't what I'm trying to defend against.
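
        A minimal sketch of what I mean, assuming the third-party pyzipper package (the stdlib zipfile module can read password-protected archives but can't create them); the file names and password here are placeholders:

          import pyzipper

          # AES-encrypted zip with a throwaway password, purely to keep content scanners out
          with pyzipper.AESZipFile("files.zip", "w",
                                   compression=pyzipper.ZIP_DEFLATED,
                                   encryption=pyzipper.WZ_AES) as zf:
              zf.setpassword(b"hello123")
              zf.write("debug_script.py")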

        • saagarjha 1356 days ago
          Gmail doesn't seem to like archives it can't open :/
          • neltnerb 1356 days ago
            Ah, the halcyon days of merely changing the file extension from .exe to .txt...
            • jon-wood 1356 days ago
              This brings back happy memories of a college (senior high for the Americans in the audience) computing teacher finding a friend and I had been writing irritating malware instead of doing actual work, and his only comment being “if you’re going to email that to yourself change the extension so it doesn’t get flagged for IT support”.
          • blue52 1356 days ago
            Lol wonder why?
      • j1elo 1356 days ago
        Gmail won't even let you send a JAR file, or a zip you made out of a project where it happens to be a .jar file somewhere deep in some random subdirectory.
        • Spooky23 1355 days ago
          IIRC, you can do it by embedding the content into an Office file, which is a zip file.
      • lmilcin 1355 days ago
        I left Intel a couple of years ago, and that's exactly what the passwords were used for. It was pretty annoying to try to send files, and putting them in an encrypted archive was the most convenient method.

        It was not just for binaries but for scripts, html, etc.

      • ccurrens 1356 days ago
        That's an excellent point I wouldn't have considered. I have no intention of looking at the dump anyway, but thanks for the warning.
      • 1-6 1356 days ago
        I think the proper term is Honeypotting.
    • MrStonedOne 1356 days ago
      Commonly password protected zips are used to bypass security systems that block all zips with exes in them.

      I doubt the encryption was believed to be a security barrier.

    • at-fates-hands 1356 days ago
      I was an admin for a medium sized company and handled their websites. Almost all of them (about a dozen or so) were hosted on Go Daddy. Plus they had about two dozen reserved domains they were sitting on like www.yourcompanysucks.com and others.

      I left the company 5 years ago. Just checked the login to see if it still worked.

      Yeap.

      Any disgruntled employee could change the password, lock them out of all of their sites (including several e-commerce sites that account for a large chunk of revenue) and then, if they really wanted to, delete all of them.

      I remember talking to the main network guy about backups when a lot of the ransomware stuff was making the rounds. The big, really big stuff on their network (mostly ERP stuff) was backed up in two or three places. Their web stuff? Yeah... NOPE.

      Pretty scary how lazy people are about stuff like that.

      • netsharc 1356 days ago
        I wonder if malware should just grep for "pw:" or "password:" and then try the string it finds against anything encrypted. Or forward it to the control center.

        Also the contents of files like password[s].txt

    • TeeMassive 1356 days ago
      I worked for a company that made servers. In the on board management system's source code I remember seeing "base64 encryption". I think they removed it by the time I left, but still.
    • dandare 1356 days ago
      A company I know insists on rotating passwords fairly often. Everybody just increases the number at the end of their favourite password, e.g. intel1255.
      • kps 1355 days ago
        I once worked at a place that required passwords to be changed every month and contain at least one upper and lower case letter, digit, and punctuation, and not match any previous password.

        So the password for August, 2020 would be “August, 2020”.

        • SturgeonsLaw 1355 days ago
          This is super common, to the point where Microsoft used a similar password scheme as an example when talking about password spraying attacks at an RSA conference presentation

          https://www.zdnet.com/article/microsoft-99-9-of-compromised-...

          It's why I'm advocating within my organisation to get rid of password expiration and enforce 2FA for clients, but there's a lot of inertia to push against with some of them. At least uptake of 2FA is consistently increasing.

          • kube-system 1355 days ago
            If you need backup, NIST standards agree with you.

            Scheduled password expiration weakens security by encouraging users to make predictable passwords, and by entrenching password resets as a routine and unscrutinized process.

        • CapricornNoble 1354 days ago
          Many DoD websites are the same. It's so annoying. I use a password manager at home but at work I don't have that luxury (installable software is tightly controlled and very limited).
      • Sylamore 1355 days ago
        Where I work they use a password filter to stop you from doing that...

        But it doesn't stop you from spelling out the numbers instead, plus that makes your PW longer

      • benhurmarcel 1355 days ago
        In my experience this is pretty standard across the industry.
      • astura 1355 days ago
        I use the month and year instead
    • dleslie 1356 days ago
      Also, the passwords are listed in docs that appear to be alongside the encrypted files. That's a bit like leaving the keys to your house _on top_ of your front doormat.
      • danudey 1356 days ago
        It's kinda like hiring a security guard for insurance purposes, even though they have strict instructions to never do anything, under any circumstances, other than call emergency services.
        • Nasrudith 1356 days ago
          To be fair having someone aware and around to watch and phone emergency services has a use.
        • reaperducer 1355 days ago
          > It's kinda like hiring a security guard for insurance purposes, even though they have strict instructions to never do anything, under any circumstances, other than call emergency services.

          I see you've worked in retail.

    • cbanek 1356 days ago
      The shared stupid passwords like this that I've seen/had to use in my career would utterly shock you. Like hunter2 levels of shock.
      • david_draco 1356 days ago

          > Like ******* levels of shock.
        
        What do you mean with 7 star levels?
        • astura 1355 days ago
          This joke never gets old
          • yvdriess 1355 days ago
            The people that get bash.org jokes in contrast... :)
    • dasb 1356 days ago
      No one who knows what they're doing uses zip passwords as security. The passwords are probably there for other reasons.
    • de6u99er 1352 days ago
      Another password is "I accept" (based on the leaker's Twitter messages).
    • loktarogar 1356 days ago
      at my first job they used a similar password as their go-to "temporary" password for users etc. I found later when I got to work with the users that they rarely changed this password even when "forced" to, and in many cases had it up on post-its next to their monitor.
      • reaperducer 1355 days ago
        > and in many cases had it up on post-its next to their monitor.

        These days a post it is probably the best way to secure your password.

        99.9999999% of password hacks come over the wire now, from people in other cities, states, or nations. If someone is in your building, in front of the computer, even without the post-it, you're probably toast.

        • loktarogar 1355 days ago
          A post-it is not a good way to secure your office's generic temporary password.
    • reaperducer 1355 days ago
      > Another large tech company I used to work for commonly used an only-slightly more complex password

      I know a brand-name healthcare company that uses Passw0rd for its internal WiFi, which is easily reachable from an interstate rest area.

    • tomrod 1356 days ago
      I knew one company that used the same password for the BIOS as for the WiFi.
    • jyriand 1355 days ago
      Some people/companies think that if you are behind VPN you can use simple and obvious passwords.
  • orisho 1356 days ago
    At a previous workplace we had a few places in the code which used the word backdoor. It was not an actual backdoor though, but merely a debugging server that could be enabled and allowed you to inspect internal state during runtime. At some point I removed the word backdoor, fearing it would get to a customer or during an audit someone would misunderstand. :|
    • skissane 1356 days ago
      Once I got a complaint from a security auditor that some code was using MD5. It wasn’t being used for any security purpose, just to check whether an autogenerated file had been manually edited. We decided it was easier to do what they wanted than argue with them, so we replaced it with CRC32C. That would have been faster than MD5, but nobody cares about saving a few milliseconds off reading a configuration file at startup. It would have made the manual edit check somewhat less reliable, but probably not by much in practice. But the security auditor was happy we’d stopped using MD5.
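
      (For the curious, a rough sketch of both checks using only Python's stdlib; zlib's plain CRC-32 stands in here for CRC32C, which needs a third-party package, and the file name is made up.)

        import hashlib, zlib

        def md5_of(path):
            with open(path, "rb") as f:
                return hashlib.md5(f.read()).hexdigest()

        def crc32_of(path):
            with open(path, "rb") as f:
                return format(zlib.crc32(f.read()) & 0xFFFFFFFF, "08x")

        # Record the checksum when the file is generated, recompute it at startup;
        # a mismatch just means "someone edited this by hand" - nothing security-related.
        print(md5_of("generated.conf"), crc32_of("generated.conf"))
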
      • heavenlyblue 1356 days ago
        You don’t actually need to listen to auditors. People like you (who can’t be bothered to argue because it’s apparently too hard) are the reason that smartass is still selling their services.
        • Delk 1356 days ago
          You either have way more grit at arguing than most people or you haven't worked at a large and cumbersome organization.

          I know most people at those kinds of organizations just don't have the grit to fight every one of those battles all over again, and choose to do the things they can affect with reasonable effort instead.

          I'm not saying that grit would be a bad thing to have. I appreciate the people who do it. But you really can't know what kinds of situations the parent commenter was in, and sometimes you can't really expect everyone to want to fight it.

          • waheoo 1355 days ago
            I agree with your sentiment in general, but this is telling a dumbass where to go.

            It's not a hard argument to win. MD5 here is fine; it's not a security check.

            • kbenson 1355 days ago
              Sometimes the point isn't technical, but social. So MD5 isn't used for security purposes right now. At some point someone will want some hashing function, and they'll probably look at what the code already uses. The last thing you want is someone a bit clueless going "it was good enough there, it's good enough here" and using MD5 where they shouldn't. Removing it from a codebase helps with that problem.

              The problem here is that people assume they know every possible reason why the auditor might ask for something, when they don't. If the auditor is asking for it, and it costs almost nothing to do, maybe just do it instead of wasting everyone's time by acting like you know the totality on the subject, and everyone will probably go home happier at the end of the day.

              • andreareina 1355 days ago
                Isn't that what code review is for? To me that sounds like arguing against string formatting because someone could think it's ok for SQL queries.

                An auditor's job doesn't end at saying what things should be changed, it should include why as well (granted, we don't know the full content of the auditor's report here, maybe they did say why).

                • kadoban 1355 days ago
                  Code reviews are good checks. Making it more difficult for dumb ideas to show up in a code review and possibly be missed is also good.

                  If using md5 had any real benefit I'd say leave it, but what are you gaining?

                  • hashhar 1355 days ago
                    because CRC is actually worse for checking file content collisions (not that MD5 is perfect either).
                    • Reelin 1355 days ago
                      > because CRC is actually worse for checking file content collisions

                      So use SHA-1 or SHA-2 or SHA-3 or if you really hate NIST standards for some reason then CubeHash or Skein or Blake2 or ...

                      • skissane 1355 days ago
                        The reason why CRC32C was chosen as a replacement instead of SHA-2 or whatever - what happens if in a few more years, SHA-2 isn’t considered secure any more and some future security audit demands it be changed again? Whereas, a CRC algorithm isn’t usually used for security purposes, so a security audit is far less likely to pay any attention to it. The whole issue started because a security-related technology was used for a non-security purpose.
                        • Reelin 1355 days ago
                          > what happens if in a few more years, SHA-2 isn’t considered secure any more and some future security audit demands it be changed again

                          Then change it again? If you use the most recent available NIST standard it should hopefully be a very long time before meaningful (let alone practical) attacks materialize (if ever). If you end up needing to worry about that in a security audit, consider it a badge of success that your software is still in active use after so many years.

                          Using an insecure hashing algorithm without a clear and direct need is a bad idea. It introduces the potential for future security problems if the function or resultant hash value is ever used in some unforeseen way by someone who doesn't know better or doesn't think to check. Unless the efficiency gains are truly warranted (ex a hash map implementation, high throughput integrity checking, etc) it's just not worth it.

                          > a security-related technology was used for a non-security purpose

                          I would suggest treating all integrity checks as security-related by default since they have a tendency to end up being used that way. (Plus crypto libraries are readily available, free, well tested, generally prioritize stability, and are often highly optimized for the intended domain. Why would you want to avoid such code?)

                      • waheoo 1355 days ago
                        Are you seriously suggesting sha-1 as a good replacement to md5... for security reasons?
                        • Reelin 1355 days ago
                          Ahh poop, looks like I was out of date. Apparently a practical attack with complexity ~2^60 was recently demonstrated against legacy GPG (the v1.4 defaults) for less than $50k USD. [1] That being said, it looks like it still required ~2 months and ~900 GPUs, versus MD5 at 2^18 (less than a second on a single commodity desktop processor).

                          So yeah, I agree, add SHA-1 to the list of algorithms to reflexively avoid for any and all purposes unless you have a _really_ good reason to use it.

                          [1] https://www.schneier.com/blog/archives/2020/01/new_sha-1_att...

                • Buge 1355 days ago
                  If someone is dumb enough to add it, someone is dumb enough to let it through code review.
                • woofie11 1355 days ago
                  Code reviews miss things.

                  (1) Code is in part a communication medium. This says "We use MD5"

                  (2) Code changes. If some sees something cryptohashed, they may use it differently in 5 years.

              • raverbashing 1355 days ago
                The reason they ask is that they have to fill in a checkbox that says "no MD5", and of course they don't know that CRC32 is worse.

                And to be very fair, a lot of security issues would be caught with basic checkbox ticking. Are you using a salted password hashing function instead of storing passwords in plaintext? Are you using a firewall? Do you follow the principles of least privilege?

                • waheoo 1355 days ago
                  Except this is not for password hashing.

                  Why is this so difficult to grasp.

              • waheoo 1355 days ago
                What is this weird incessant need to play devils advocate.

                Sometimes people are just right.

                • kbenson 1355 days ago
                  Because most times people aren't "just right", they're just unwilling to widen their point of view, and/or they turn the issue into a way to assert their own importance and intellect over someone else at the expense of those they work with.

                  I don't need some coworker getting into some drawn out battle about how MD5 is fine to use when we can just use SHA (or CRC32C as that person did, which is more obviously non-useful for security contexts) and be done in 30 minutes. The auditor is there to do their job, and if what they request is not extremely invasive or problematic for the project, implementing those suggestions is your job, and arguing over pointless things in your job is not a sign of something I want in a coworker or someone I manage.

                  • waheoo 1354 days ago
                    > they turn the issue into a way to assert their own importance and intellect over someone else at the expense of those they work with.

                    This is exactly what the auditor is doing.

                    How can you not see the irony here?

                    > I don't need some coworker getting into some drawn out battle

                    This isn't a drawn-out battle. This is a really fast one: MD5 is fine here, you didn't check the context of its use, that's fine, what's the next item on your list?

                    What's fucking hard about that?

                    Is this some kind of weird cultural thing with American schooling teaching kids they can't question authority?

                    • kbenson 1354 days ago
                      > This is exactly what the auditor is doing.

                      The auditor was asked to do it and is being paid to do it. Presumably, the people arguing are paid to implement the will of those that pay them. At some point people need to stop arguing and do what they're paid to do or quit. Doing this over wanting to use MD5 seems a pretty poor choice of a hill to die on.

                      > This is a really fast one: MD5 is fine here, you didn't check the context of its use, that's fine, what's the next item on your list?

                      There are items like this all throughout life. Sure, you can be trusted to drive above the speed limit on this road, and maybe the speed limit is set a little low. But we have laws for a reason, and at some point you letting the officials know that the speed limit is too low and they really don't need to make it that low goes from helpful to annoying everyone around you.

                      > What's fucking hard about that?

                      Indeed, what is so hard about just accepting that while you're technically correct that MD5 isn't a problem, you're making yourself a problem when you fight stupid battles nobody but you cares about, but everyone has to deal with?

                      > Is this some kind of weird cultural thing with American schooling teaching kids they can't question authority?

                      Hardly. Pompous blowhards exist in every culture. Also, that's hilarious. You're talking about a culture that rebels against authority just because they think that's what they're supposed to do, even if it's for stupid reasons and makes no sense. See the tens of millions of us that refuse to wear masks because it "infringes on our freedom".

                      • waheoo 1353 days ago
                        > do what they're paid to do or quit.

                        I'm paid to tell idiots where to go. My boss doesn't pay me 6 figures to toe the line and fill in boxes. She pays me to use my judgement to move the company forward. I'm not wasting my time and her money on this sort of garbage, and if they can't see the difference between casual use and secure use then we need to rethink our relationship with this company, or they need to send us someone new.

                        > You're talking about a culture that rebels against authority

                        You just used the line "do what you're told or quit".

                        The cognitive dissonance here is unreal.

                        • kbenson 1353 days ago
                          > I'm not wasting my time and her money

                          I've very specifically couched all my recommendations for this for when it's trivial to do. Arguing about this with someone instead of doing it, when doing it may have some benefits but really only costs a few minutes instead of just doing so is definitely wasting her time and money.

                          > You just used the line "do what you're told or quit".

                          I noted what I wished people would do in very specific cases where they're wasting way too much time and effort to win a stupid argument rather than make a small change of dubious, but possibly not zero, positive security impact.

                          I don't see anything weird about acknowledging some of the extreme traits of the culture I live in while also wishing they would change, at least in specific cases where I think they do more harm than good.

                          Honestly, I'm confused why you would even make some cognitive leap that since I live in an area with a specific culture I must act in the manner I described that culture, especially when I did it in a denigrating way. I guess you think all Americans must be the same? That doesn't seem a useful way to interact with people.

            • Delk 1355 days ago
              As a technical choice, that's true. So the argument shouldn't be hard to win, assuming you're dealing with reasonable people, who are also answering to reasonable people. Those people (e.g. the leadership) also need to care enough about that detail to just not dismiss your argument because making the change is not a problem for them. And they need to not be so security-oriented (in a naive way) as to consider a "safer" choice always a better one regardless of whether there's a reasonable argument for it or not.

              That's more assumptions than it is sometimes reasonable to make.

              "You don’t actually need to listen to auditors" is decidedly not true for a lot of people in a lot of situations, and arguing even for technically valid or reasonable things is an endurance sport in some organizations.

              I mean, I even kind of want to agree with heavenlyblue's argument that you should fight that fight for the exact reason they're saying, and can see myself arguing the same thing years ago, but at least in case of some organizations, blaming people for taking skissane's stance would be disproportionate.

              • waheoo 1355 days ago
                Oh sorry, I thought we were discussing working with rational people.

                If you're working with irrational people you're going to have to do irrational things, but that's kind of a given isn't it? We don't really need to discuss that.

            • andreareina 1355 days ago
              Not hard to win if everyone is being reasonable. Given an auditor that thinks all uses of MD5 are proscribed, what would you put the odds of them being reasonable at?

              ETA: per 'kbenson it's not hard to conceive of a situation where proscribing MD5 is reasonable. Taking 'skissane's account at face value is probably reasonable, but my implicit assumption that the auditor would not explain if pressed isn't being charitable.

              • reallydontask 1355 days ago
                indeed

                Especially with the audit/pen test theatre where they have to put something in the report, otherwise why are they getting paid £20K for two days' work?

                So most people choose the path of least resistance when it doesn't matter much, so that you can fight where it does.

                • waheoo 1355 days ago
                  Picking your battles is something we all need to learn to do.

                  I for one like to pick the easy wins, like this.

            • benhurmarcel 1355 days ago
              In a large company you have to choose your battles.
          • BrandoElFollito 1354 days ago
            I do it for the sake of educating our management.

            For 10 years now, I have refused to acknowledge the finding of the consulting company which flags the password scheme I use (passphrases), because the norm they follow (a national one) talks about caps, symbols, etc.

            I refuse to sign off and note that our company is a scientific one and, unlike the auditors, we understand math taught to 16-year-old children.

            This goes to the board, who gets back to me; I still refuse on ethical grounds and we finally pass.

            It is sad that some auditors are stupid while some others are fantastic, and that you depend on which one you get assigned.

            A good read: https://serverfault.com/q/293217/78319

        • skissane 1356 days ago
          Sometimes customers demand security audits as part of sales contracts. If it is a high enough value deal, the company may decide it is in their business best interest to say yes. In that scenario, not listening to the security auditor is not a viable option. You need to keep them onside to keep the customer onside.

          Similarly, sometimes in order to sell products to government agencies you need to get security audits done. In that scenario, you have to listen to the security auditor and keep them onside, because if you don't keep them happy your ability to sell the product to the government is impeded.

          • Polylactic_acid 1356 days ago
            I have a feeling that these auditor people just make up bullshit when they can't find something real. The last few we have had have come up with total non-issues marked as severe because they are easy to "exploit".

            Meanwhile I have been finding and fixing real security issues regularly. To be fair it would be extremely difficult for an external person to find issues in the limited time they have so the audit comes down to someone running through a list of premade checks to see if they find anything.

            • Sylamore 1355 days ago
              One thing I learned when I worked in internal IT security when dealing with auditors was that they will boil the ocean to find an issue, so never be perfect and leave a few relatively easy but not obvious to spot issues for them to write up that don't actually affect the security of your environment. If you don't leave them this bait, they will spend weeks to find a trivial issue (like using MD5 to check for config file changes vs password hashing) and turn it into a massive issue they won't budge on.

              The other issue is that if you make it seem too easy to answer their questions or provide reports, they will only ask more questions or demand more reports so even if its just dumping a list of users into a CSV file for them to review, make it seem like way more effort than it actually is otherwise you might find you've been forced into a massive amount of busy work while they continue to boil the ocean.

              • blablablub 1355 days ago
                Smart auditors ask for all items at the beginning of the audit. Smart IT people give them all items at the end of the audit. Auditors have only a limited time budget. The later they get answers, the less time for them is left for follow-up questions.
              • data_ders 1355 days ago
                3D chess! I agree sometimes it feels as if the security review questions are just set-ups for follow-ups that they didn’t include in the initial form (for whatever reason)
            • JaggedNZ 1356 days ago
              I've had audits like that, many are just for CYA and I'm often the dev patching obscure (or not so obscure) security issues.

              Honestly, I'm quite happy to have an auditor nitpick a few non-issues if the alternative is risking releasing an app that has a basic SQL injection vulnerability that wiggled past code review due to code complexity.

              I've also had an external audit that found an unreported security issue in a new part of a widely used framework, so there are auditors out there that do a good job of finding legitimate things.

            • rommel917 1355 days ago
              Some years ago I worked at $BIGBANK, and an auditor from $GOVERNMENT told us to change the street name property from a text field to a dropdown (for all countries) to help them with fraud detection, and to remove all diacritic characters from client names because their new software doesn't like them.

              I told my manager that they were idiots and I wouldn't listen to them. He was like 'OK, as I expected', never did anything about it, and the next auditors didn't mention it.

              • hakfoo 1354 days ago
                This makes me wonder about the reliability of address verification technology.

                There are plenty of addresses where the official version in databases is slightly off from what people actually write on their mail. If I got a credit card transaction with the "official" version, that would be a significant fraud signal, that they were sourcing bogus data from somewhere.

        • restingrobot 1356 days ago
          So much this. My company just got done shelling out a ton of money for some asshat to tell me that we can't use http on a dev server. <head smashes through desk>
          • scoot_718 1356 days ago
            I mean, I mandate https in dev, but it sure isn't for security. It's so that auth works in dev and no changes are required to push to prod.
            • james_s_tayler 1355 days ago
              or someone doesn't accidentally just entirely overlook it and wind up with just http in prod.
          • dx034 1355 days ago
            I actually think that's valid. Sure, http on a dev machine isn't a security risk. But there is a tail risk that it ends up somewhere on a system that sends data between machines. Also, using http on dev and https on prod can lead to unexpected bugs. Banning http is not unreasonable.

            Same with the md5 complaint. That use of md5 wasn't a problem but there's a perfectly fine alternative and if you can ensure by automated tests that md5 is used nowhere, you also can guarantee that it's never used in a security relevant context.

            • skissane 1354 days ago
              > and if you can ensure by automated tests that md5 is used nowhere

              You can automatically check for the string "md5" in identifiers, but you can't reliably automatically check for implementations of the MD5 algorithm. All it takes is for someone to copy-paste an implementation of MD5 and rename it to "MyChecksumAlgorithm" and suddenly very few (if any) security scanning tools are going to be smart enough to find it.

              (Foolproof detection of what algorithms a program contains is equivalent to the halting problem and hence undecidable, although as with every other undecidable problem, there can exist fallible algorithms capable of solving some instances but not others.)

          • greedo 1356 days ago
            It's worse when the asshat convinces your manager that every internal site, whether dev or not needs https. Certs everywhere. Our team spends a decent % of our time generating and managing certs...
            • xenophonf 1356 days ago
              That’s me. I’m that asshat. It’s called defense in depth. I recommend automating certificate issuance and renewal. It’s totally worth it.
              • triangleman 1355 days ago
                Book or tutorial recommendations please.
                • jasonladuke0311 1355 days ago
                  • souprock 1355 days ago
                    How could that work? For security, an internal site lacks a connection to the internet.
                    • skissane 1355 days ago
                      Are you talking about a fully internal site, with not even indirect Internet access? For those kinds of airgapped applications, you should maintain your own CA infrastructure, and update all clients/browsers to trust its certificates.

                      For the more common scenario of internal sites/services which are not accessible from the public Internet, but not fully isolated from it either:

                      You don't need the internal site exposed to the Internet. If you use DNS-01 ACME challenge, you just need to be able to inject TXT records into your DNS. Some DNS providers have a REST API which can make this easier.

                      Another option – to use HTTP-01 ACME challenge, you do need the internal host name to be publicly accessible over HTTP, but that doesn't mean the real internal service has to be. You could simply have your load balancer/DNS set up so external traffic to STAR.internal.example.com:80 gets sent to certservice.example.com which serves up the HTTP-01 challenge for that name. Whereas, internal users going to STAR.internal.example.com talk to the real internal service. (There are various ways to implement this – split horizon DNS, some places have separate external and internal load balancers that can be configured differently, etc)

                      Yet another option is to use ACME with wildcard certs (which needs DNS-01 challenge). Get a cert via ACME for STAR.internal.example.com and then all internal services use that. That is potentially less secure, in that lots of internal services may all end up using the same private key. One approach is that the public wildcard cert is on a load balancer, and then that load balancer talks to internal services – end-to-end TLS can be provided by an internal CA, and you have to put the internal CA cert in the trust store of your various components, but at least you don't have the added hassle of having to put it in your internal user's browser/OS trust stores.

                      (In above, for STAR read an asterisk – HN wants to interpret asterisks as formatting and I don't know how to escape them.)
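
                      A tiny sketch of the DNS-01 moving part, assuming the third-party dnspython package and placeholder names: the ACME client hands you a token, you publish it as a TXT record under _acme-challenge.<name> via whatever API your DNS provider offers, and then you confirm it is visible before letting validation proceed.

                        import dns.resolver  # third-party "dnspython" package

                        def challenge_visible(domain, token):
                            """True once the ACME DNS-01 TXT record has propagated."""
                            name = f"_acme-challenge.{domain}"
                            try:
                                answers = dns.resolver.resolve(name, "TXT")
                            except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
                                return False
                            return any(token in rdata.to_text() for rdata in answers)

                        # e.g. challenge_visible("internal.example.com", "<token from the ACME client>")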

                    • mschuster91 1355 days ago
                      Internal sites? Set up your own CA infrastructure.
                      • speedgoose 1355 days ago
                        I rather spend my limited time working on other security issues.
                        • CamperBob2 1355 days ago
                          Or (gasp) shippable features.
            • dx034 1355 days ago
              Which means if someone gets access to the internal network, they can read all traffic. And even dev systems can send confidential data. With letsencrypt and easy to generate certificates, https everywhere is very reasonable.
            • jogjayr 1355 days ago
              Not quite so crazy now that everyone's working from home, right? Unless you also use a VPN?
              • dx034 1355 days ago
                Even with VPN. I don't want any person on the vpn to be potentially able to read traffic between internal services. I think that would fail many audits.
            • quotemstr 1355 days ago
              It does though. There's no excuse for unencrypted traffic. Google doesn't have some VPN with squishy unencrypted traffic inside. Everything is just HTTPS. If they can do it, so can you. It's just not that hard to manage a PKI.
            • doliveira 1355 days ago
              Does your organization disable the "Non-secure" prompt in the browser as well? If not, I'd say that it does seem like a security risk to train your users to ignore browser warnings like that.
            • searchableguy 1355 days ago
              I am confused. Isn't that easily automated?
              • souprock 1355 days ago
                It's not easily automated. Somehow, you have to safely get a certificate across the air gap to the internal network.

                So I guess an internet-connected system grabs the certificates, then they get burned to DVD-R, then... a robot moves the DVD-R to the internal network? It's not easy. It's all much worse if the networks aren't physically adjacent. One could be behind a bunch of armed guards and interlocking doors.

                • skissane 1355 days ago
                  An airgapped network can include its own internal CA, and all the airgapped clients can have that internal CA's certificate injected into their trust stores, and all the services on the airgapped network can automatically request certificates from the internal CA – which can even be done using the same protocol which Let's Encrypt uses, ACME, just running it over a private airgapped network instead of over the public Internet.
                • sarakayakomzin 1355 days ago
                  > you have to safely get a certificate across the air gap to the internal network.

                  You can't trust a server on the internal network as a root of certificate trust? Sounds like a scary situation.

              • mrweasel 1355 days ago
                We have a ton of internal stuff, and most of it doesn’t even have external DNS. We use long-lived certs signed with our own CA. We’d prefer an automated solution, using a “real” CA, but none seems to be available.
        • maccard 1355 days ago
          I was on the receiving end of a security audit issue. I closed the bug as won't-fix and my lead approved it, but when the team who paid the security auditor found out, they demanded I fix it. I had to argue with IT, infosec, and the auditor. Nobody really cared what I did, they just wanted to follow the rules. After a month of weekly hour-long meetings I relented and changed the code.

          You're often not arguing with the auditor, you're arguing with the person who paid the security auditor in the first place, who is likely not even technical. That's a battle you will likely never win.

          • ants_a 1355 days ago
            To add to this, often the primary goal of the person who paid the security auditor is not to actually increase security. It is to get to claim that they did their due diligence when something does happen. Any arguments with the auditor, no matter how well founded, will weaken that claim.
        • protomyth 1355 days ago
          Depends I suppose. When your CFO tells you to fix it so you're in compliance, your opinion doesn't matter a whole lot. Never mind if it is a government auditor or their fun social counterpart the site visitor.

          I once got cited for having too many off-site backups. They were all physically secure (fire proof safes or bank lock box), but the site visitor thought onsite was fine for a research program. The site visitor's home site lost all its data in a flood.

        • reaperducer 1355 days ago
          > You don’t actually need to listen to auditors.

          At my company, that's a one-way ticket to the unemployment line.

        • Spooky23 1355 days ago
          Sometimes an inexperienced auditor will flag a minor finding that is a sign of a bigger issue. For example, if Windows is in FIPS mode, some MD5 functions will be disabled.

          If you need to be operating in FIPS 140 mode, that may be a problem of some consequence.

        • manmal 1355 days ago
          In some companies, you do. Medical certifications require regular audits, and failing an audit is _not_ good.
          • dx034 1355 days ago
            And it's good! Code reviews can't surface all issues. Independent audits should be welcomed by developers as a way to find more bugs and potential security risks (even though I'm a bigger fan of penetration tests than of audits).
      • ponker 1355 days ago
        When you're trying to keep a company of 100,000 employees secure, you can't have an approach that says "let's figure out where we need to remove MD5 and remove it." You have to set an easy-to-understand, consistent guideline -- "tear out the MD5" -- so that there won't be any doubt as to whether it's done, so teams won't complain that they shouldn't have to change it because some other team didn't have to, and so on. Otherwise, every security audit will surface the same thing and cause more pointless discussion.

        In isolation it looks like wasted work but in terms of organizational behavior it is actually the easiest way.

      • strictfp 1355 days ago
        Happened to me as well. I was writing an authentication service. We thought we were paying for an actual security audit; it turns out we paid for a simple word scan of our codebase. The review didn't find any of the canaries we left in the codebase, and we could never argue back with them. Big waste of money.
        • WhyNotHugo 1355 days ago
          Huh. I'm thinking it'd be fun to write code with known issues (of varying degrees of obviousness) and hire a bunch of different "auditing companies" to see which ones pick up on them.

          Publish the result for market comparison's sake.

          Then again, that requires plenty of money and I can't see how to monetize that in any way.

      • GuB-42 1355 days ago
        Not only that, but MD5 still doesn't have an effective preimage attack, so it is still good enough for things like hashing passwords or checking that someone else didn't tamper with your files.

        Still, when it comes to security:

        - MD5 is actually too fast for hashing passwords, but there is still no better way than brute force if you want to crack MD5-hashed, salted passwords. (See the sketch below.)

        - Even if there is no effective preimage attack now, it is still not a good idea to use an algorithm with known weaknesses, especially if something better is available.

        What MD5 is useless for is digital signatures: anyone can produce two different documents with the same MD5 hash.
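
        As a rough illustration of the "too fast" point, a minimal sketch (the password and iteration count are made up) contrasting a bare MD5 with a deliberately slow, salted KDF from the Python standard library:

            import hashlib, os

            password = b"hunter2"

            # One MD5 invocation: fast, unsalted, brute-forceable at billions of guesses/sec.
            fast = hashlib.md5(password).hexdigest()

            # PBKDF2 with a random salt: each guess costs the attacker 600,000 iterations.
            salt = os.urandom(16)
            slow = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)

            print(fast)
            print(slow.hex())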

      • gmac 1355 days ago
        Funny: we had the exact same thing from a pen tester. I think we replaced it with SHA256, though.
      • ccktlmazeltov 1356 days ago
        this is hilarious
        • jopsen 1356 days ago
          Makes perfect sense.

          Defense in depth: if you can grep the source code and not find any references to md5, then you have quickly verified that the code probably doesn't use md5.

          You can easily verify this again later; you can even make a test for it (sketch below) :)

          Even if removing md5 usage has no impact in practice, it will make it harder to accidentally reintroduce it in the future.
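
          A minimal sketch of such a test, assuming a pytest-style suite and a hypothetical src/ directory of C/C++ sources:

              import pathlib

              def test_no_md5_references():
                  # Fail the build if any source file mentions md5; crude, but easy to re-run in CI.
                  sources = [p for p in pathlib.Path("src").rglob("*")
                             if p.suffix in {".c", ".h", ".cpp", ".hpp"}]
                  offenders = [str(p) for p in sources
                               if "md5" in p.read_text(errors="ignore").lower()]
                  assert not offenders, f"md5 still referenced in: {offenders}"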

          • the8472 1356 days ago
            The issue is not md5. The issue one wants to detect is weak hash functions used in cases where they're not appropriate. The fact that crc32 passed means that any obscure hash function would have passed too, even if it had been used in a context where it isn't appropriate.

            All it means is that the audit is superficial and doesn't catch the error category, just famous examples within that category. That kind of superficial scanning may be worth something when unleashed on security-naive developers, or even as optional input for more experienced ones. But "hard compliance rules" and "superficial scans" combine to create a lot of busywork, which makes people less motivated to work with auditors instead of against them.

            • theon144 1356 days ago
              Both perspectives are somewhat correct, I feel; the requirement to remove any usage of md5 is beneficial, but the fact that crc32 passed shows the audit's motivation was misplaced.

              The resulting situation might of course not be a net benefit though :/

            • jopsen 1350 days ago
              > But "hard compliance rules" and "superficial scans" combine to create a lot of busywork which makes people less motivated to work with auditors instead of against them.

              Absolutely :)

              The fact is that if you have experienced engineers, a security audit is rarely able to find anything. You would basically have to do code reviews, and those are hard/expensive, and even then rarely fruitful.

              So, superficial scans, hardening, checking for obvious mistakes is really all you can do. Making hard rules is unproductive, but then again, migrating from md5 to crc32 hopefully isn't very expensive.

              IMO, crc32 is a better choice for testing for changes, and it has the benefit of removing any doubt about whether the hash is supposed to have security properties.
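
              A minimal sketch of that non-cryptographic use, purely to notice that a (hypothetical) config file changed:

                  import zlib

                  def config_checksum(path):
                      # CRC32 is fine here: we only want to notice accidental changes,
                      # not defend against an attacker deliberately crafting collisions.
                      with open(path, "rb") as f:
                          return zlib.crc32(f.read())

                  baseline = config_checksum("app.conf")
                  # ... later ...
                  if config_checksum("app.conf") != baseline:
                      print("config file changed, reloading")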

            • meshaneian 1356 days ago
              Next up: Replace MD5 with BASE64+ROT13. Significantly worse functionality AND performance, but sounds more secure (to a layman) and doesn't trigger the "MD5" alert...
              • cortesoft 1355 days ago
                You joke, but an ex-security guy at my company literally told me “this file can’t be in plain text on disk. Base64 encode it”
                • skissane 1355 days ago
                  Base64 encoding does protect somewhat against "looking over your shoulder" attacks

                  (Unless the person looking over your shoulder has a really good memory and can remember the Base64, or decode it in their head. Or they have a camera.)

                • mschuster91 1355 days ago
                  Helps against attackers grepping the whole disk (or any folder named "conf" or similar) for "username", "user", "password", "pass", "key" and friends.

                  It's game over anyway if someone has a shell on your server but at least it complicates their life a bit.
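
                  To be clear about how thin that layer is, a minimal sketch (the secret is made up):

                      import base64

                      secret = b"password=hunter2"

                      encoded = base64.b64encode(secret)   # defeats a naive grep for "password"
                      print(encoded)                       # b'cGFzc3dvcmQ9aHVudGVyMg=='

                      # ...but anyone who finds the file gets the plaintext back in one call.
                      print(base64.b64decode(encoded))     # b'password=hunter2'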

            • dx034 1355 days ago
              But way more people would use md5 for password hashing than crc32. Of course someone could circumvent these tests, but the risk of someone copying an old tutorial where md5 is used for password hashing can be mitigated.
          • therein 1356 days ago

                // We use MD5 to check if config files are changed. This is not used anywhere else.
                typedef DigestMD5 ConfigFileHasher;
            • dyingkneepad 1356 days ago
              Until someone repurposes that thing to do something that is security-sensitive and forgets to remove the comment, misleading the next auditors.

              I always assume that people from the future who are going to touch my code are really dumb people, so I try to have as few traps as possible for them.

              • asddubs 1355 days ago
                i know for a fact the person who's gonna touch my code in the future is really dumb, because it's me
          • asddubs 1356 days ago
            yeah I can see that, what if someone ends up being smart and re-using the verification procedure for a file that does have security impact, DRY right?
        • Macha 1356 days ago
          I've seen similar rigidity from security audits. Stuff like "version 10.5.2 (released last week) of this software introduced a security bug that was fixed in 11.0 (released today), we need you to update from 10.5.1 (released last week + 1 day) to 11.0 now because our audit tool says so".
          • gorkish 1356 days ago
            Ah yes and also the vendor helpfully changed the API and did a complete rewrite in v11.0. Think about all the neat new things you will get to learn!
    • yjftsjthsd-h 1356 days ago
      It seems like a thin line between a debugging feature and a backdoor; "merely a debugging server that could be enabled and allowed you to inspect internal state during runtime" seems like a backdoor to me, doubly so if it's network-accessible. If Intel has, say, an undocumented way to trigger a debug mode that lets you read memory and bypass restrictions (ex. read kernel memory from user mode, or read SGX memory), is that not a backdoor? Or is the name based on intent?
      • jdminhbg 1356 days ago
        I think the difference is whether it's something that's always enabled. You could presumably make it available or not at compile time, so the software shipped to a customer wouldn't have it, but maybe if they were having issues, you could ship them a version with the debug server with their permission.
        • yjftsjthsd-h 1356 days ago
          I can agree with that, with the caveat that enabling it has to be something only the user can do. If it requires that the customer intentionally run a debug build, that's fine; if it can be toggled on without their knowledge, then it's a problem.
          • orisho 1356 days ago
            It was disabled by default, and could only be enabled using environment variables. Even when enabled, the whole thing ran in Docker and the socket was bound to loopback, so you could only connect to it from within the container.

            When the intention is a debugging server, making it exposed to the world is a mistake and a security vulnerability. At that point it is effectively a backdoor, but the difference between a high-severity vulnerability such as this and a backdoor is developer intent.
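
            A minimal sketch of that pattern (the environment variable name and port are made up): the server simply never starts unless explicitly asked for, and even then it only listens on loopback.

                import os
                from http.server import BaseHTTPRequestHandler, HTTPServer

                class DebugHandler(BaseHTTPRequestHandler):
                    def do_GET(self):
                        # Expose some internal state for debugging; intentionally read-only.
                        self.send_response(200)
                        self.end_headers()
                        self.wfile.write(b"state: ok\n")

                if os.environ.get("ENABLE_DEBUG_SERVER") == "1":
                    # Loopback only: nothing outside the container can ever connect.
                    HTTPServer(("127.0.0.1", 9999), DebugHandler).serve_forever()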

            • tantalor 1356 days ago
              That doesn't sound very safe.
              • lukevp 1356 days ago
                What sounds unsafe about having a locally bound port inside a container that only binds with an env variable getting set?
                • still_grokking 1356 days ago
                  For example, that someone finds out about that backdoor and activates it to spy on users. Forwarding a port in Docker is not magic…
                  • orisho 1356 days ago
                    Sure, it's simple. But you would have to be able to modify the container settings anyway. For all practical uses, and certainly in my case, you could just make it run a different image at that point. Or copy another executable into the container and run it. You're already privileged. Requiring you to be privileged to access the debug server means it's secure.
                    • still_grokking 1356 days ago
                      Until things around change and what was previously "a secure backdoor" becomes a "less secure backdoor". ;-)

                      One can read every second week about cases where some backdoor that was meant to be used "only for debugging" landed in the end product and became a security problem.

                      Actually I usually suspect malice when something like that is found once again, as "who the hell could be so stupid to deliver a product with a glaring backdoor". But maybe there is something to Hanlon's razor… :-D

              • orisho 1356 days ago
                If you're already running another process in the container, you could do whatever you want anyway.
      • fit2rule 1356 days ago
        Was it a backdoor, or a hidden door, or .. a utility panel?

        The difference, in my opinion, is in the documentation and frequency of use. Is it overt? Does the customer really know it's there, and what it's for?

        Perfectly fine to have an access panel that gives you access to the bus .. if the pilot knows you're doing it.

        But if it's some random entrance in the back of an alley, only 2 or 3 users in the universe know what it is and how to use it ..

        • close04 1356 days ago
          Imagine the reaction if the same thing had come from China. Nobody would ask the question.
          • fit2rule 1355 days ago
            Nationality has nothing to do with it. I also don't trust Americans with such devices.
            • close04 1355 days ago
              I'm talking about the general sentiment. You can see this on every* site, HN included. The litmus test is that even pointing out something objectively true will get criticism (downvotes) rather than critical thinking. In the current atmosphere nobody asks the question when it comes to China/Russia/NK/Iran, but they will when it comes to the US, despite the known history of hacking/spying on everyone else.

              *Recently a reputable tech site wrote an article introducing DJI (ostensibly a company needing no introduction) as "Chinese-made drone app in Google Play spooks security researchers". One day later the same author wrote an article "Hackers actively exploit high-severity networking vulnerabilities" when referring to Cisco and F5. The difference in approach is quite staggering especially considering that Cisco is known to have been involved, even unwittingly, in the NSA exploits leaked in the past.

              This highlights the sentiment mentioned above: people ask the question only when they feel comfortable that the answer reinforces their opinion.

    • akira2501 1356 days ago
      A manufacturer wanted to upgrade one of their equipment lines to be more modern. The developers of the original product, both hardware and software, were no longer with the company.

      Since they just wanted to add some new features on top and present a better rack-based interface to the user, they decided to build a bigger box, put one of the old devices inside the box, then put a modern PC in there, and just link the two devices together with ethernet through an internal hub also connected to the backpanel port and call it a day.

      The problem is, if you do an update, you need both the "front end" and the "back end" to coordinate their reboot. The vendor decided to fix this by adding a simple URL to the "backend" named: /backdoor/<product>Reboot?UUID=<fixed uuid>

      Their sales team was not happy when I showed them an automated tool in a few lines of ruby that scans the network for backend devices and then just constantly reboots them.

      They still sell this product today. We did not buy one.
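
      For a sense of how little that fixed UUID protects anything, a hypothetical sketch (host, product name, and UUID are all made up) of the kind of one-liner involved:

          import urllib.request

          # Anyone who can reach the backend can reboot it: no auth, just a fixed "secret" in the URL.
          REBOOT_URL = ("http://192.0.2.10/backdoor/WidgetReboot"
                        "?UUID=00000000-0000-0000-0000-000000000000")

          urllib.request.urlopen(REBOOT_URL)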

      • kps 1355 days ago
        Reminds me a little of a place I worked.

        They sold very expensive devices that were actually an off-the-shelf 1U PC with custom software (which provided the real value). The problem — and this dates it — was that the PCs had a game port¹, which gave away that this custom hardware was really just a regular consumer PC. So they had some fancy plastic panels made to clip on the front and hide the game port.

        ¹ https://en.wikipedia.org/wiki/Game_port

        • Spooky23 1355 days ago
          I remember early in my career I came across a Unisys “mainframe”, which was literally a Dell box with a custom bezel, clustered with a few other nodes with a Netgear switch.
          • skissane 1355 days ago
            Many non-IBM mainframe vendors switched to software emulation on more mainstream platforms; nowadays mainly Linux or Windows on x86, but in the past SPARC and Itanium were also common choices. What you saw may have been an instance of that. A software emulator can often run legacy mainframe applications much faster than the hardware they were originally written for did.

            (With Unisys specifically, at one point they still made physical CPUs for high end models, but low end models were software emulation on x86; I’m not sure what they are doing right now.)

            • Spooky23 1355 days ago
              I don't know the details (~20 years ago), but pretty sure you hit the nail on the head. I think one of the boxes I saw was a hybrid -- Xeons with some sort of custom memory controller.

              It was my first exposure to this sort of thing, and I was taken aback by the costs of this stuff, which made the Sun gear I worked with look extremely cheap :)

              • skissane 1355 days ago
                > I was taken aback by the costs of this stuff, which made the Sun gear I worked with look extremely cheap :)

                Given the shrinking market share of mainframes, the only way for vendors to continue to make money is to increase prices on those customers who remain – which, of course, gives them greater encouragement to migrate away, but for some customers the migration costs are going to be so high that it is still cheaper to pay megabucks to the mainframe vendor than do that migration. With emulated systems like the ones you saw, the high costs are not really for the hardware, they are for the mainframe emulation software, mainframe operating system, etc, but it is all sold together as a package.

                At least IBM mainframes have a big enough history of popularity that there are a lot of tools out there (and entire consulting businesses) to assist with porting IBM mainframe applications to more mainstream platforms. For the remaining non-IBM mainframe platforms (Unisys, Bull, Fujitsu, etc.), far fewer tools and skilled warm bodies are available, which I imagine could make these platforms more expensive to migrate away from than IBM's.

      • ImaCake 1356 days ago
        This can't be real... are you serious? It sounds like one of those silly business parables!
        • lotyrin 1356 days ago
          Even if this poster made it up, I'm certain it is also true at least once over, having remediated a near-identical problem in one of my employers' products at one point, and having talked developers out of implementing it at least once at a different employer.
        • hnick 1356 days ago
          The older I get, the less I care if individual stories like this are true. The fact that they could be is concerning enough :) And they are educational nonetheless.
          • ImaCake 1355 days ago
            A good perspective! People find fiction novels to be enriching and filled with learning despite the fact they are just entertaining lies.
        • lloeki 1356 days ago
          It sounds like Dell’s iDRAC somehow. (Not that it is, but iDRAC had me scared more often than not)
          • icefo 1356 days ago
            The time iDRAC annoyed me the most was when I bricked a server trying to update it.

            I made the terrible mistake of jumping too far between versions, and the update broke iDRAC and thus the server. There was no warning on Dell's website, nor any when I applied the update. I only found out what happened after some googling, where I found the upgrade path I should have taken.

            This is just terrible quality control and software engineering.

    • awalton 1356 days ago
      At my previous employer our code was littered with references to a backdoor. It was a channel for tools running in guest operating systems to talk to the host hypervisor through a magic I/O port.

      It's even openly called "backdoor" in open source code directly related to it: https://github.com/vmware/open-vm-tools/blob/master/open-vm-...

      • blasdel 1355 days ago
        More reasonable VMMs use the word "hypercall" for these paravirtualized interfaces
    • SEJeff 1356 days ago
      We use the term “manhole” for those sorts of things
      • shoo 1356 days ago
        readers may tangentially enjoy yosefk's "cardinal programming jokes" -- the first one featuring a manhole: https://yosefk.com/blog/the-cardinal-programming-jokes.html

        > I must warn you about those jokes. Firstly, they are translated from Russian and Hebrew by yours truly, which may cause them to lose some of their charm. Secondly, I'm not sure they came with that much charm to begin with, because my taste in jokes (or otherwise) can be politely characterized as "lowbrow". In particular, all 3 jokes are based on the sewer/plumber metaphor. I didn't consciously collect them based on this criterion, it just turns out that I can't think of a better metaphor for programming.

      • bogidon 1356 days ago
        Ew
        • SEJeff 1356 days ago
          As in a manhole cover in a street for maintenance.
          • bogidon 1356 days ago
            Yes, yes. Seems like an outdated term. Downvotes accepted.
            • SECProto 1356 days ago
              > Seems like an outdated term. Downvotes accepted.

              Manhole is, indeed, an outdated term. Generally the preferred term is "Maintenance Hole". Still abbreviated MH, and people in the field use all three interchangeably (much like metric/imperial).

              Source: I work with storm/sanitary/electrical maintenance holes.

            • throwaway45349 1356 days ago
              This reads like the people who want to use "womxn" because the normal version is a superstring of "men".
            • dogma1138 1356 days ago
              Fine personhole it is.
              • eli 1356 days ago
                "Maintenance Hole" actually, which is better because it's both more descriptive and not gendered.
                • dyingkneepad 1356 days ago
                  Until someone starts using the hole for a purpose that's not maintenance and we start arguing again :).
                  • akimball 1355 days ago
                    Or for something that is gendered.
                • oneplane 1356 days ago
                  But what if manhole is just mankind hole? (It probably isn't, I didn't look it up). Man doesn't always mean male, or does it?
                  • deathanatos 1356 days ago
                    > Man doesn't always mean male, or does it?

                    Not necessarily, but see: https://en.wikipedia.org/wiki/Gender_neutrality_in_English#D...

                    The link is about the debate as it is, but I would also encourage the use of good faith in interpreting any speaker: that is, assuming a person referring to "mankind" likely means all humans without exclusion based on gender or sex, and requiring some other material evidence before presuming bias.

                    I also wonder what these discussions are like in languages where most nouns are gendered, e.g., in French.

                    • still_grokking 1356 days ago
                      No clue about French, but in German they started to use both versions at the same time, glued together in made-up "special" forms. It's like using "he/she" for every noun. This makes texts completely unreadable, and you even need browser extensions[1] to not go crazy with all that gendered BS language!

                      OK, I exaggerate, there are still people that don't try to be "politically correct" and still use proper language, and know that there is such a thing called "Generisches Maskulinum (English: generic masculine)"[2]. But in more "official" writings or in the media the brain dead double-forms are used up until the point you can't read such texts any more: Those double-forms (which are not correct German) cause constant knots in the head when trying to read a text that was fucked up this way.

                      (Sorry for the strong words but one just can't formulate it differently. As the existence of that browser extensions shows clearly I'm not alone when it comes to going mad about that rape of language. Also often whole comment sections don't discuss a topic at hand but instead most people complain about the usage of broken "gendered" pseudo-politically-correct BS language. That noun-gendering is like a disease!)

                      [1] https://addons.mozilla.org/en-US/firefox/addon/binnen-i-be-g... [2] https://de.wikipedia.org/wiki/Generisches_Maskulinum

                      • jack1243star 1355 days ago
                        Thanks for sharing. As a German learner this is quite fascinating to know.
                    • lloeki 1356 days ago
                      Believe it or not, we introduced a variant of bash brace expansion (except with implicit braces and dots instead of commas) in our grammar, named it “écriture inclusive”, and called it a day.

                      The way it kicks to the curb words that previously carried neutrality but happened to share a spelling with the gendered form, and entrenches a two-gender paradigm, boggles the mind as to how it flies in the face of any form of inclusivity.

                      That and I still don’t know how to read “le.a fermi.er.ère” aloud. It’s just as ridiculous as “cédérom” because Astérix puts up a show at standing against the invader.

                  • bogidon 1356 days ago
                    Viewpoints can be encoded in language https://en.wikipedia.org/wiki/Male_as_norm
                    • nyolfen 1356 days ago
                      > In practice, grammatical gender exhibits a systematic structural bias that has made masculine forms the default for generic, non-gender-specific contexts.

                      many instances of this are simply an artifact of 'man' previously being an un-gendered term. but that fact is much harder to build group cohesion around than grievance.

              • iforgotpassword 1355 days ago
                Perchildhole.
            • dlivingston 1356 days ago
              Pick your battles. This isn’t a hill worth dying on (or even a hill worth getting slightly bruised on).
              • brongondwana 1355 days ago
                I have learned that flat out telling people that a hill isn't worth dying on tends to cause a bunch of corpses to collect up - if you don't want a molehill covered in bodies you need to persuade them to go die somewhere else.
              • bogidon 1356 days ago
                Yeah, agreed. And I think we could agree that replying "Ew" and losing a little bit of HN karma does not constitute more than bruising.

                EDIT: didn't see the "or even" there. Disagree. I think the analogy can be drawn out a bit, so I'll say that a bruise can heal pretty quick, and one would adapt better to climbing "hills" if they exercised regularly. Plus maybe smaller hills should be climbed too.

    • marricks 1356 days ago
      I'm really not at all interested in people explaining to me how finding mentions of back doors in technology used in millions of computers is probably OK because it may mean something else.

      Given that the US security apparatus clearly values and desires these back doors and has the necessary power to coerce companies into making them, generalizing the use of "back door" as a term for debugging or whatever seems almost expected.

      Even if they are for debugging, "oops, it's on in production!" is a great cover, because none of these companies will EVER admit back doors were required by the government.

    • yelloworangefog 1356 days ago
      Frankly I don't think Intel's track record affords them the privilege of having good faith be assumed with something like this.
      • Traster 1356 days ago
        Intel employs 100,000 people, and most of them aren't even aware of most of Intel's transgressions, let alone approve of them.
        • heavenlyblue 1356 days ago
          It’s akin to defending Nazis because there were some Nazis who were forced to be Nazis because they couldn’t find a better job.
      • newacct583 1356 days ago
        Intel doesn't have a track record of shipping back doors, or even "bad faith" software really.
        • switchbak 1356 days ago
          Isn't their whole management engine essentially one big (poorly secured) backdoor?
          • monitmitra 1355 days ago
            It's essentially secured through obscurity, which I'm sure, with this leak, will lead to several CVEs over time...
    • OnACoffeeBreak 1356 days ago
      I worked at a place where IT had an admin user on every machine named "Backdoor". I opened a ticket when I noticed it, which was promptly closed explaining that it was normal.

      The same place had a boot script on every computer that wrote to a network-mounted file. Everyone had read permissions to it (and probably write, but I didn't test) and the file contained user names, machine names, and date-times of every login after boot for everyone on the domain going back 5 years. I opened a ticket for that, which was never addressed.

      • dx034 1355 days ago
        You could've probably reported this. Logging login times of everyone for all employees to see likely violates employees' privacy.
    • mzs 1355 days ago
      indeed it literally was the author's suggestion to search for the word 'backdoor':

      >This code, to us, appears to involve the handling of memory error detection and correction rather than a "backdoor" in the security sense. The IOH SR 17 probably refers to scratchpad register 17 in the I/O hub, part of Intel's chipsets, that is used by firmware code.

      https://news.ycombinator.com/item?id=24084977

    • coronadisaster 1355 days ago
      That's nice of you, but Intel's hardware has actual known backdoors.
    • bhouston 1356 days ago
      Yeah, I think that this is likely the case here from the screenshot.
      • bonzini 1356 days ago
        Judging from the current, in all likelihood it is the opcode that APEI (a part of ACPI) tables write to port 0xB2 in order to invoke firmware services that run in system management mode.
        • bonzini 1356 days ago
          Comment, not current. :)
    • akerro 1356 days ago
      >merely a debugging server that could be enabled and allowed you to inspect internal state during runtime

      When we're talking about a CPU, it's bad enough. Imagine that your program has input and output streams that most of the app's data goes through, and I can attach a debugger and listen in on the data.

      I would not be very happy about it and would still consider it a backdoor.

      • dathinab 1356 days ago
        But do we know if that part ever ended up in any CPU sold by Intel, as opposed to, e.g., engineering samples?
  • svnpenn 1356 days ago
    Someone have a mirror? Seems the actual files are here:

    https://t.me/exconfidential/590

    Edit: files are here

    https://mega.nz/folder/CV91XLBZ#CPSDW-8EWetV7hGhgGd8GQ

    or

    magnet:?xt=urn:btih:38f947ceadf06e6d3ffc2b37b807d7ef80b57f21

    • ColanR 1356 days ago
      The countries of origin of the peers downloading that torrent are pretty cool to see. A fairly broad cross-section of the world.
      • hawkoy 1356 days ago
        Not reliable. Most people torrenting this are hopefully using a VPN.
        • totetsu 1356 days ago
          so.. I shouldn't have clicked that link on my office network?
          • biddlesby 1355 days ago
            All good, just make sure to restart your computer at the next available opportunity
            • faeyanpiraat 1355 days ago
              How would that help?
              • akimball 1355 days ago
                Often a boot cycle gives the rootkit a chance to hook boot code to bootstrap into hypervisor.
        • jamesponddotco 1356 days ago
          Or a SeedBox.
        • gspr 1356 days ago
          Why?
          • crdrost 1356 days ago
            Because until this thing gets diffused and dissected by everyone and their mothers, the law is likely to view it as publication of confidential trade secrets, and people who can be confirmed to be spreading such things can get federal time, e.g. [1] for example. Using a VPN is the barest of mechanisms to try to obscure your identity to avoid this sort of punishment.

            [1] https://www.wsj.com/articles/SB10001424052970204409004577158...

            • Aaronstotle 1356 days ago
              I think there's a big difference between selling chemical secrets to a hostile government and this torrent. Namely, that no one is selling this information, it's available to anyone who can grab a magnet file.
              • dgellow 1355 days ago
                Here is the real thing: are you confident enough in your position to argue it when confronted by your government (or whatever the concerned body is here)? If yes, then feel free to do whatever you want with your free time and bandwidth, but otherwise you're better off staying as far as possible from this data.
                • getoffmyawn 1355 days ago
                  well in any case thanks for FUDge-packing this discussion and sharing your opinion which is based on nothing. i'll make sure to credit you as my partner-in-crime after I get my door kicked in for downloading files on the internet.
                  • mafriese 1355 days ago
                    I notice that you have no one in your circle of acquaintances who has illegally downloaded movies via torrents and gotten caught. I don't know how it is in other countries, but here in Germany friendly people ring your doorbell and take everything that is connected to electricity :). And if there is any data in there that is very damaging to Intel, then I think they will take the trouble to look for these people (at least in certain countries).
                    • getoffmyawn 1355 days ago
                      i'm sorry you live in a hellhole country and your friends don't understand how bittorrent works. maybe one day you can immigrate to a second-world country and grow some cojones, but until then you should continue living in fear and scaring your peers from downloading leaks early when there aren't fed trackers.
            • fastball 1356 days ago
              Right but if you just download without seeding, no crime is being committed, yes?

              So seems like the barest you can do is "disable seeding", not "use a VPN".

              • crdrost 1356 days ago
                IANAL and you should probably contact yours about such things but a straightforward reading suggests that because you knew you were downloading something likely illegally gotten, you are in fact on the hook for downloading it.

                    “Misappropriation” means: 
                
                      (i) acquisition of a trade secret of another by a person who knows or has
                        reason to know that the trade secret was acquired by improper means; or
                      (ii) disclosure or use of a trade secret of another without express or
                        implied consent by a person who 
                        (A) used improper means to acquire knowledge of the trade secret; or 
                        (B) at the time of disclosure or use knew or had reason to know that
                          his knowledge of the trade secret was 
                          (I) derived from or through a person who has utilized improper means 
                            to acquire it; 
                          (II) acquired under circumstances giving rise to a duty to maintain
                            its secrecy or limit its use; or 
                          (III) derived from or through a person who owed a duty to the person
                            seeking relief to maintain its secrecy or limit its use; or 
                        (C) before a material change of his position, knew or had reason to
                          know that it was a trade secret and that knowledge of it had been
                          acquired by accident or mistake.
              • nvr219 1356 days ago
                Ahem. Not seeding is a crime.
              • theon144 1356 days ago
                Depends heavily on the jurisdiction, I am afraid. This exact case was used as a precedent where I'm from (Czech Republic) that no, merely downloading over BitTorrent still constitutes "sharing copyrighted material".
                • jhgb 1355 days ago
                  Presumably that was because BitTorrent sends data even before receiving 100% of it? But I assume that downloading these files would not be allowed in this case anyway as per Zákon č. 121/2000 Sb. §29 (2) since this is not a published work.
              • dgellow 1355 days ago
                The only place I know of where that would be the case is Switzerland, where downloading copyrighted material isn't illegal (and companies aren't allowed to track the IPs of people downloading files via torrent), but sharing is. In the context of a data leak of confidential trade secrets, though, that's likely to be a completely different situation.
          • augusto-moura 1355 days ago
            Torrenting is not the most private way of downloading things, because if you share what you've already downloaded you are posting your IP to a tracker as a leecher or seeder. You can actually see live which torrents (at least the most popular ones) someone is downloading[1]; that site only tracks the most popular hashes, but it would be easy for some entity to track this Intel hash specifically.

            [1] https://iknowwhatyoudownload.com/en/peer/

    • sdflhasjd 1356 days ago
      The t.me page is a Telegram comment containing a mega.co.nz download link

      I think the tg:// link is just the site trying to open up in the Telegram app.

    • tartrate 1356 days ago
      I'd assume spreading this is not legal?
      • willis936 1356 days ago
        I’d assume it isn’t and just not talk about it loudly.
      • chii 1355 days ago
        spreading this is copyright infringement. Intel has to sue you for copyright infringement in court.
    • markysee 1349 days ago
      I think we will see a new "Edward Snowden" soon!... Cool
    • anaphor 1356 days ago
      "Invalid magnet URI" from rtorrent
      • somehnguy 1355 days ago
        Works fine in Transmission
    • leyo1022 1355 days ago
      utorrent works.
    • johnklos 1356 days ago
      You can't download from mega.nz unless you have their "downloader" app or an account, or if you have Firefox or Safari. It's useless.

      The torrent works.

      • crazypython 1356 days ago
        A web app that only works in non-Chromium browsers isn't useless.
        • bscphil 1356 days ago
          I've downloaded stuff from MEGA in the last week on a Chromium based browser, so I'm not sure what the problem is supposed to be here.
        • weinzierl 1355 days ago
          I think johnklos meant it only works in Chrome and not in Firefox or Safari. You can download the whole archive as zip with Chrome - no problem. Firefox, on the other hand, doesn't allow you to store that much data locally in the browser, so it doesn't work out of the box. You can download the two top level directories separately though and this works even in Firefox.
        • samtheprogram 1356 days ago
          Golly, it doesn't work in Chrome? What's the technical limitation here? Or did they just choose not to support it?

          Asking as someone who uses FF as their daily driver and is surprised something is supported in it that isn't in Chrome...

      • the8472 1356 days ago
        > firefox works

        > it's useless

        does not compute

      • userbinator 1356 days ago
        Or if you have OpenSSL and curl...

        (At least the last time I had to download from MEGA, I RE'd what it does and it was somewhat clever - AES128 in counter mode, key is in the hash part of the URL.)
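
        For the curious, a generic sketch of that primitive only: AES-128 in counter mode with key material carried in the URL fragment. (MEGA's real scheme derives the key and nonce from the fragment in a more involved way; this is not a working client, just an illustration of the idea, using the third-party `cryptography` package, with made-up values.)

            import base64
            from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

            # Hypothetical link fragment: everything after '#' never reaches the server.
            fragment = "q8Cok9Qp3kF2bXJhbmRvbWtleQ=="          # made-up base64 key material
            key = base64.urlsafe_b64decode(fragment)[:16]      # AES-128 => 16-byte key

            nonce = b"\x00" * 16                               # real scheme: derived per file
            ciphertext = b"..."                                # bytes fetched with curl

            decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
            plaintext = decryptor.update(ciphertext) + decryptor.finalize()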

      • saagarjha 1356 days ago
        I thought Mega didn't work in Safari, because it wouldn't have enough cache or whatever in-memory thing it does?
      • throwaway889900 1356 days ago
        JDownloader 2 works fine and goes around their limitations to boot.
      • Legogris 1355 days ago
        It's the other way around re ff/chrome
  • Traster 1356 days ago
    This is more embarrassing than harmful. Having worked at companies like Intel, it's not really that damaging to leak some of this IP - the worst that happens is some open source project gets slightly better or you have a few more bugs (not that Intel is lacking in that area). The second we see internal marketing, pricing & roadmap slides - that's when you know they're in real trouble.
    • kps 1355 days ago
      > Having worked at companies like intel

      I've worked at a company very much like Intel¹ and the really closely guarded secret — the one where two vetted people turn the launch key at the same time — was the microcode patching keys.

      ¹ I'm not saying it was Intel, but it was Intel

    • dathinab 1356 days ago
      No, the worst case is if there is some dirt in there and people find it, like:

      - copyright infringement

      - patent infringement

      - actual backdoors (the word "backdoor" does appear, but there are many ways it can show up without there being a backdoor intended for spying, including engineers' questionable naming choices and code used only during prototyping)

    • dgellow 1355 days ago
      > the worst that happens is some open source project gets slightly better

      If anyone reading this is working on an open source project that would benefit from what has been leaked: stay far away from such a leak. The last thing you want is for your open source project to be accused of copyright or patent infringement.

  • beervirus 1356 days ago
    Fingers crossed that this will enable some smart person to completely disable the management engine.
    • wmf 1356 days ago
      AFAIK the ME is required to initialize the processor so it can never be completely disabled. The best you could do is remove any code beyond necessary initialization which has mostly already been done by me_cleaner.
      • noja 1356 days ago
        How easy is it to use me_cleaner? Last time I looked it required some wiring and a Raspberry Pi.
        • amiga-workbench 1356 days ago
          Quite straightforward, I used a ch341A SPI programmer. Just make sure you take multiple copies of your original ROM image and compare the hashes of them to make sure there was no screwup.

          It took me about 10 minutes to do my ThinkPad. All I lost was some enhanced integrated GPU power management and integrated thermal management, but I use a userland fan control program anyhow.
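
          A minimal sketch of that sanity check, assuming three dumps named dump1.bin to dump3.bin:

              import hashlib

              def sha256(path):
                  with open(path, "rb") as f:
                      return hashlib.sha256(f.read()).hexdigest()

              digests = {name: sha256(name) for name in ("dump1.bin", "dump2.bin", "dump3.bin")}
              print(digests)

              # All reads must agree before flashing anything back.
              assert len(set(digests.values())) == 1, "ROM dumps disagree - read the chip again!"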

          • WanderPanda 1356 days ago
            How much did it cost? Everything
        • nullc 1356 days ago
          On some devices it's fairly easy: pop the chip out (or attach to it in-circuit with a clip), drop it in a programmer, run a tool.. run me_cleaner.. run a tool again.

          On other devices you just can't read the chip or you can and me cleaner can't make any sense of it.

      • netsec_burn 1356 days ago
        Or buy a laptop from a manufacturer with the ME inoperable.
        • spijdar 1356 days ago
          The entire ME can't "technically" be disabled on modern Intel silicon. It's essentially the processor that "bootstraps" the whole CPU. Without (cryptographically signed) code running on the ME, the system can never boot.

          All the non-necessary bits can be disabled out of the box, however.

      • rasz 1355 days ago
        AFAIK that still leaves the Computrace backdoor in the BIOS.
        • wmf 1355 days ago
          UEFI can also be "cleaned" in most cases.
  • pdevr 1356 days ago
    • technonerd 1356 days ago
      https://del.dog/sourcestatements.txt

      source: They have a server hosted online by Akami CDN that wasn't properly secure. After an internet wide nmap scan I found my target port open and went through a list of 370 possible servers based on details that nmap provided with an NSE script.

      source: I used a python script I made to probe different aspects of the server including username defaults and unsecure file/folder access.

      source: The folders were just lying open if you could guess the name of one. Then when you were in the folder you could go back to root and just click into the other folders that you didn't know the name of.

      deletescape: holy shit that's incredibly funny

      source: Best of all, due to another misconfiguration, I could masqurade as any of their employees or make my own user.

      deletescape: LOL

      source: Another funny thing is that on the zip files you may find password protected. Most of them use the password Intel123 or a lowercase intel123

      source: Security at it's finest.

    • dleslie 1356 days ago
      ... They're claiming it came from an NDA'd source of IP that's shared with customers.

      Given that it _appears_ that there are backdoors in this firmware code, we can conclude that if there are such backdoors, they were shared with numerous customers.

      That really doesn't improve the optics of the breach.

      • moonchild 1356 days ago
        Alternately, as others have noted, it could be overloaded nomenclature and doesn't actually indicate a backdoor. Which would be an excellent reason for them to feel comfortable sharing said 'backdoors' with their customers.
        • dathinab 1356 days ago
          Maybe it is actually a backdoor but only gets put on prototypes/engineering samples or similar.

          Or maybe some well-documented Intel management features need to backdoor their own security mechanisms to work.

          Or ...

          Well the point is it's a starting point for someone dissecting the data but not much more.

      • jpxw 1356 days ago
        Imagine what they aren’t sharing
    • jug 1356 days ago
      A guy on reddit claiming to be ex-Intel thinks it looks like material shared with OEMs, and thus a breach of something like a motherboard manufacturer rather than Intel itself.
    • japgolly 1355 days ago
      Yeah, but the data is still a legit leak, even though the means by which it was obtained wasn't hacking.
    • johnnyfaehell 1356 days ago
      It wasn't hacked... But these files came into the hands of an unauthorised user... That sounds a lot like something of theirs got hacked...
  • gabcoh 1356 days ago
    Is releasing this legal? It seems like this person isn't really disguising their identity or concerned about breaking the law. In their profile they even seem to brag about leaking company's code.
    • dleslie 1356 days ago
      Of course it's not legal. This is exfiltrated intellectual property being shared without license.
      • gabcoh 1356 days ago
        Is the person publishing this liable or just their source? Because this seems to be a hobby for the person publishing it and yet they also aren't concealing their identity. They list their former employer on their website.
        • kmeisthax 1356 days ago
          Misappropriating trade secrets for financial gain is a punishable offense, and this data would qualify as a trade secret, at least for as long as it's not general knowledge to everyone or it has yet to be reverse-engineered. Aside from that, much of the data in these files has standard copyright and patent concerns.
        • 21eleven 1356 days ago
          Is the person publishing it or are they linking people to a place where it is published?
          • jpxw 1356 days ago
            It seems that they are the ones publishing it, from the wording of the tweets.
        • whydoyoucare 1356 days ago
          More likely the person publishing this isn't aware of the far reaching consequences.
          • jki275 1356 days ago
            You should look up Till Kottman's other work. Very aware.
      • dosshell 1356 days ago
        Which country's laws apply? Is it really illegal to share this in the whole world? I'm not so sure about that.
        • msbarnett 1356 days ago
          >Which country's laws apply?

          At the very least, Intel owns the copyright on this material, so sharing it is a copyright violation in any country that is a signatory to the Berne Convention or the TRIPS Agreement, which is effectively almost the entire planet.

          Then you have to add Trade Secret laws on top of that, which will have slightly narrower jurisdiction but still impact a lot of countries. There are very few places on earth where you would not be facing any legal trouble whatsoever for releasing this material.

          • mindfulhack 1355 days ago
            Interesting info!

            Here's a good list of potential candidate countries which are neither in Berne Convention nor TRIPS Agreement:

            https://en.wikipedia.org/wiki/Berne_Convention#List_of_count...

            It may take effort and money to be protected, e.g. setting up a legal entity in that country which takes full responsibility and from which you cannot legally be forcibly unmasked as the proprietor; other local laws may also need to be fully checked out to explore other risks.

            Nothing is a guarantee and perfection doesn't exist but it's fun to explore these legal layers.

          • dathinab 1356 days ago
            But that only matters if the given country also prosecutes you effectively.

            You can probably find quite a bunch of countries where it's illegal but not prosecuted, though I only know of Russia.

          • drudru11 1355 days ago
            Hi msbarnett: sorry unrelated to this thread. In an older thread you mention an acronym TFA. The thread was a discussion on sparse files and removing bytes from the front of a file. What is TFA?
        • dathinab 1356 days ago
          For example, in Russia you are basically guaranteed not to be prosecuted for all kinds of cybercrime as long as you don't target Russia, Russian companies, or Russian citizens.
        • dleslie 1356 days ago
          Intel is an American company; so wherever there is an extradition treaty with the USA and where there are also similar laws.
          • johnnyfaehell 1356 days ago
            Wouldn't the publishing part just be standard copyright infringement and thus a civil matter?
          • aflag 1356 days ago
            No country will extradite their own citizens, though. If Intel wants the person punished, they have to sue them in the country they live in.
            • dleslie 1356 days ago
              > No country will extradite their own citizens, though.

              This is demonstrably false. While not all of this list are Canadian citizens, some are:

              https://en.wikipedia.org/wiki/Category:People_extradited_fro...

              • refurb 1356 days ago
                Just had to remind of Charles Ng, huh? If you have a weak stomach, don’t dig further.
            • deviarte 1356 days ago
              Not entirely correct. A lot of Latin American countries do it all the time.
        • mensetmanusman 1356 days ago
          Most western countries agree that the concept of ‘intellectual property’ is a good thing and afford protection, or else society disincentivizes innovation due to game theoretic tragedy of the commons-type reasons.
          • akimball 1355 days ago
            And thus there is a sell-by date on capitalism.
        • olliej 1356 days ago
          Copyright law is fairly universal, so while stealing data/info from another company may not be illegal (depending on jurisdiction), sharing it will still be copyright infringement almost everywhere.
      • ldiracdelta 1356 days ago
        For this code it might be export-controlled as well. Many things in Intel are export controlled.
      • dcow 1356 days ago
        I don't believe this is accurate or in any way obvious even if this is the stance the courts would ultimately take. These files were downloaded from a publicly available CDN server discovered while browsing the internet. No authorization mechanisms were bypassed, no computer systems were hacked. These files are the result of a GET request to an Akamai server that happened to be hosting the files. Despite how this will be spun in pop culture, Intel did not secure access to these files. I'm not sure how you would prosecute someone for re-sharing a file they were given, under no legal contract, when they asked for it.
        • MauranKilom 1356 days ago
          You have a lot of faith in how technically versed the law and courts are on these topics - because they sure haven't kept up with the times. And even if they were willing to split hairs over these technical details:

          No civilian will agree with you that just because technically you could slip through several doors that happened to be not locked and got helpful advice from a neighbor, it doesn't mean that whatever you found behind those doors was "public" just because you didn't have to pick locks. Or that the photos you took of private company documents by social engineering your way inside must clearly be unsecured and publicly distributable because "they were given to me when I asked for them".

          • dcow 1356 days ago
            This isn't slipping through various open doors. There were no doors. This is literally a public server on the public internet serving files publicly. Intel is grossly negligent in securing their assets if they're hosting what they consider to be confidential trade secrets on public CDN servers.

            The analog would be if I posted a flyer on a telephone pole with what I considered confidential information and someone else took a picture of it. There's no way you could argue that I had a reasonable expectation that only people for whom the flyer was intended would be able to view the flyer.

            If someone had deliberately bypassed computer security measures to acquire this information, I'd agree. But you don't get a free pass to be negligent just because you're a big company. I suspect the EFF would support my viewpoint, as they supported weev's appeal of a much more contentious and ethically gray scenario (the acquisition of personal information from a server that was negligently "secured" and required someone to imitate the calls an iPad would make).

            • dleslie 1355 days ago
              > This is literally a public server on the public internet serving files publicly.

              The flyer analogy does not work because the services were not broadcasting or otherwise advertising their presence.

              Following the house analogy, the thief tested all the front doors on the street and opened those which were not locked.

              • dcow 1355 days ago
                Then search engines must not be legal. They crawl the public internet and index what they find.

                What you’re effectively saying is that the flyer is unknowable unless a Street-view car drove past and snapped a picture of it and its owner engaged in SEO to make sure it landed near the top of search results.

                There is no “house” in this analogy (which you might call a corporate/private network secured or otherwise). No private network was accessed. This stuff was on the street, in the free pamphlet section of the newspaper stand.

        • zeroimpl 1356 days ago
          That’s a very weak argument. If I’m walking down the street at night and somebody comes up to me and says “GET /money”, I may respond with an HTTP 200, but that doesn’t mean the person didn’t just steal from me.
          • dcow 1356 days ago
            You just gave them money. They didn't coerce you. It would be different if they flashed a weapon.
            • zeroimpl 1355 days ago
              It's a fictitious example. I didn't say there was no weapon, nor said it was definitely theft. The point is that submitting a GET request in a public setting does not mean no crime.

              Coercion can be the difference between asking for money and theft. In the case of this intel data, it was clearly coerced from a server - it's not like it was linked on Google, they had to specially craft URLs to coerce the data out.

              • ddingus 1355 days ago
                For there to be theft, a property owner has to lose their property and/or the use of said property.

                There is zero theft in this discussion.

                One could argue there was infringement, and that argument is very difficult to make without breaking the Internet for everyone just because someone with deep pockets failed.

                Coercion involves, at its core, an act of agency performed according to the intent of someone other than the agent/actor.

                Bad practice does not support coercion at all.

                I wonder whether that word even applies to entities lacking agency.

                Servers are automatons. They do not make value judgements and or creative acts of agency of any kind.

                We need these things as a basis for coercion.

                There are lots of things not indexed by Google and it is dangerous to imply people are somehow wrong when data is accessed sans a Google index.

                Security by obscurity does not make sense. This mess is part of why.

              • dcow 1355 days ago
                Again, I don't believe it's accurate or honest to call this coercion. These files were obtained from a content delivery network by visiting a URL in a browser. Nothing deceptive, cunning, crafty, or coercive about it. Let me ask you: what files am I allowed to access on a public network? Must I ask an owner's permission before visiting their websites? Must I be able to find it with a search engine? What constitutes a file which anyone is allowed to view?
                • zeroimpl 1355 days ago
                  Were the files listed when going to http://server.com? No, the user had to:

                  1. Find the server via nmap

                  2. Guess at some URLs until the server finally responded with some hidden data.

                  While neither of those would require being an expert in the field, this is well beyond the realm of browsing public websites.
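
                  To be concrete about step 2: "guessing at some URLs" is nothing more exotic than issuing ordinary GET requests against candidate paths and seeing which ones answer. A minimal Python sketch of the idea (the hostname and paths here are made up):

                      # Plain HTTP GETs against guessed paths; host and paths are hypothetical.
                      import urllib.request
                      import urllib.error

                      HOST = "https://cdn.example.com"
                      CANDIDATES = ["/", "/files/", "/internal/", "/firmware/"]

                      for path in CANDIDATES:
                          try:
                              with urllib.request.urlopen(HOST + path, timeout=5) as resp:
                                  print(resp.status, path)        # 200: the server answered
                          except urllib.error.HTTPError as e:
                              print(e.code, path)                 # e.g. 403/404 for misses
                          except urllib.error.URLError as e:
                              print("no response:", path, e.reason)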

                  • dcow 1355 days ago
                    This is incorrect. Actually, yes, the files were in fact browseable, and Akamai servers typically front with DNS names that presumably resolve to their anycast addresses, where they use SNI to select the content bucket, so there would have been a "friendly" name involved. Going to https://server.com/folder displayed a list of folders and files, all hyper-linked and connected as is common on the internet. The fact that the server was initially discovered by way of a crawler, a scan, is irrelevant (this is actually how search engines discover content, btw). The fact that a browser could browse these files suggests that it is not "well beyond the realm of browsing public websites".
                  • japgolly 1355 days ago
                    Exploration of public areas isn't illegal. There's no law saying that viewing a website through a browser is legal while any other means is not. Techies legitimately access websites in all kinds of programmatic ways. Intel made their data publicly available. That it wasn't intentional doesn't change that.
                    • dleslie 1355 days ago
                      Opening unlocked doors, entering and removing property is generally considered to be theft.

                      The key here is that these services were not advertising their presence.

                      • ddingus 1355 days ago
                        Theft requires property owners to be denied their property or the use of it.

                        That did not happen here. Theft is not part of the discussion.

                        Infringement could be, and is at least the right language for the discussion.

                      • ddingus 1355 days ago
                        Where are the locks?

                        There are often unadvertised ways onto beaches and other spaces that are otherwise OK to use.

                        Sure looks to me like a potential landmine for people. Bad practice with deep pockets should still just be bad practice, with the same consequences for all who don't bother with better practices.

                        There was no lock on this at all.

        • ummonk 1355 days ago
          If someone leaves their door unlocked and open it doesn't mean I have a right to walk in and take what I want.
          • ddingus 1355 days ago
            This is more like leaving it on the street, mixed in with a lot of other free stuff.

            Physical analogies break down.

            Truth is, this info was out there for anyone to copy, and whoever did that is definitely guilty of something and/or liable.

            It will be tough to make passers by into criminals here. They aren't.

            Keeping secrets is hard. Should be.

            • ddingus 1355 days ago
              I wrote this poorly. I meant that whoever left this data in the clear is guilty of something and/or liable.
              • ummonk 1354 days ago
                It would be like keeping a dropped wallet - the person who keeps it is guilty, not the person who accidentally dropped it.
                • dcow 1354 days ago
                  No it wouldn't. It would be like taking a picture of a dropped wallet on the street.

                  Regardless, under what moral framework are we operating such that obvious guilt is ascribed to anyone who might pick up a wallet in the street, anyway? Of what crime are they guilty?

        • zerotolerance 1356 days ago
          Verizon and US courts would disagree with you.
          • dcow 1356 days ago
            Links? I'm curious about existing case law.
            • meowface 1356 days ago
              Here's one example: https://www.wired.com/2013/03/att-hacker-gets-3-years/

              >The two essentially wrote a program to send GET requests to [publicly available pages on] the web site.

              • dcow 1356 days ago
                Ruling was appealed and deemed incorrect: https://www.wired.com/2014/04/att-hacker-conviction-vacated/
                • dleslie 1355 days ago
                  Not quite, it was vacated on the grounds of improper venue. It wasn't reversed or similar; to be vacated is to be voided, as though the case never occurred.
                  • dcow 1355 days ago
                      I mean, yes. I wish it had actually been reversed on the grounds that the ruling couldn't stand; that was the intention of the appeal. Dismissing it on improper venue is simply tactical. This is the legal system's way of saying, "there was enough contention in this case that we don't feel comfortable with the whole thing in the first place, so we'll throw it out on a technicality and avoid inventing any case law here".
                    • meowface 1355 days ago
                      I'm not a lawyer, but I'm not 100% sure that's the best interpretation. It being thrown out on a technicality doesn't necessarily imply anything about their feelings regarding the facts of the case.

                      Basically, I would not be surprised if the exact same case happened today and the defendants still got jail time.

        • rvnx 1356 days ago
          Not sure about the US, but in France it's considered quite a serious crime.
    • hedora 1356 days ago
      Stealing it is probably illegal, and there’s a copyright and export regulations argument to be made around copying it.

      However, my understanding of the law is that, once secrets are made public, further distributing the secrets is not illegal.

      So, republishing it is probably not more illegal than running a torrent of a Hollywood movie and an Ubuntu ISO (which can run afoul of export regulations).

      Note: I’m not a lawyer, and if what I said was true in practice, Julian Assange / Wikileaks would have nothing to fear from the law.

      • Nemo_bis 1355 days ago
        Also, there are currently hundreds of seeders (according to a popular torrent indexer) and there were probably thousands of snatches. Good luck prosecuting that many people.
    • elmo2you 1356 days ago
      In addition to what other people have already mentioned (how it is illegal), it may depend on jurisdiction - also on whether the country of origin of the source has an extradition treaty with the USA, or whether the USA can otherwise (e.g. via extrajudicial kidnapping) get the culprit to stand trial in the USA.

      EDIT: While it may be a relatively clear-cut case according to US law, other countries (may) have different laws. There are also all kinds of potential diplomatic and political obstacles when this was done by somebody outside of the USA. For instance, good luck if this was a Russian or Chinese citizen.

    • draw_down 1356 days ago
      Sometimes people break the law!
  • bubblethink 1356 days ago
    May be a good time for Intel to open-source FSP anyway. They've been dilly-dallying around it for a while now. There were some Phoronix articles about it a year ago.
  • saagarjha 1356 days ago
    • wonderlg 1356 days ago
      I’m completely ignorant about this but is it possible that they’re referring to a “debugging backdoor”?
      • bonzini 1356 days ago
        It's most likely a callback from OS to firmware, or at least this is what I can guess based on the single comment present in the screenshot and what I saw in the past in the APEI tables of Intel-based servers.

        APEI tables are a part of ACPI that tell the OS how to write an error record persistently in the machine log, inject a memory error for debugging purposes, and stuff like that that's tied to the RAS (Reliability/Availability/Serviceability) features of a server. The tables contain a list of instructions like "write a value to memory" or "write a value to an I/O port"; the way they work in practice is that, by following these instructions, the OS causes the processor to enter system management mode (that's the "backdoor" into the firmware) where the firmware services the APEI request.

        Since the tweet mentions SMM and RAS in the two lines it shows, my guess is that it's related to that functionality.
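
        To make the "list of instructions" part concrete, here's a minimal Python sketch that decodes one ERST/EINJ-style instruction entry the way I remember the layout from the ACPI spec (1-byte action, 1-byte instruction, 1-byte flags, a reserved byte, a 12-byte Generic Address Structure, then an 8-byte value and mask). The field sizes and the example values are from memory / made up, so treat it as approximate:

            # Rough sketch: decode one APEI (ERST/EINJ-style) instruction entry.
            # Layout as recalled from the ACPI spec; treat as approximate.
            import struct

            def decode_entry(raw: bytes):
                (action, instruction, flags, _reserved,
                 space_id, bit_width, bit_offset, access_size, address,
                 value, mask) = struct.unpack("<8BQQQ", raw[:32])
                space = {0: "system memory", 1: "system I/O port"}.get(space_id, "other")
                return {
                    "action": action,            # e.g. "begin operation", "end operation"
                    "instruction": instruction,  # e.g. "write register value"
                    "target": f"{space} @ {address:#x}",
                    "value": value,
                    "mask": mask,
                }

            # Hypothetical entry: "write a value to an I/O port", the kind of step
            # that drops the CPU into SMM so the firmware can service the request.
            example = struct.pack("<8BQQQ", 1, 3, 0, 0, 1, 8, 0, 1, 0xB2, 0x9A, 0xFF)
            print(decode_entry(example))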

  • Jonnax 1356 days ago
    Who should we follow on twitter / blogs to read analysis of whether this is impactful or interesting?
  • unix_fan 1356 days ago
    The bad news just doesn't end for this company, does it?
    • Alupis 1356 days ago
      Or karma finally doing its thing?
    • eNTi 1356 days ago
      At this point it's just more 2020...
    • abvdasker 1355 days ago
      I get the sense over the last year that the wheels have fully come off the bus there
    • randomsearch 1356 days ago
      Hmmm indeed
  • james412 1356 days ago
    This kid has been posting these for fame (it's the same guy that posted the Daimler leak). I guess it's all fun and games until he finds himself in prison
    • elmo2you 1356 days ago
      That is indeed often the case with young narcissists (I don't know if it applies to this person, don't know him/her).

      That said, I remember the shocking arrogance and total disregard (for anything but their own ego) of a few young privileged "hackers" who were involved in DDoS services for hire, and also in some very nasty IoT botnet (if I recall correctly).

      Krebs wrote about them quite a bit. I think they even got caught because of that, but not sure. They did loads of real damage, that much is certain. But instead of going to jail, they got community service. Apparently with intervention of the US government, for which they now work. Go figure.

      • antihero 1355 days ago
        IDK if wanting a bit of fame and validation makes you a narcissist per se.
        • elmo2you 1353 days ago
          Not per se, indeed. But if that urge for validation is for something that's fundamentally wrong and/or only supports a person's failure to critically assess their own actions, then it usually is narcissism.
    • mirekrusin 1356 days ago
      I'm not sure you can even consider it "breaking in"; it's more like tweeting that at the intel.com/super-secret URL you can see some internal, secret documents.
      • james412 1355 days ago
        The definition of "unauthorized access" is intentionally very broad, and ultimately depends on the kind of lawyer you can afford. Publicly taking a piss in the face of Intel and Daimler in exchange for a little lame publicity seems an incredibly dumb tradeoff
  • jarym 1355 days ago
    With stuff like this being exfiltrated (let's admit, if hackers got this they probably could have a whole ton of fab secrets), it won't be long until America's IP is all in the hands of China/Russia/Europe.

    We will have confirmation when China launches a 'Xi Lake' x86-compatible CPU...

    • Anarch157a 1355 days ago
      They already have it, thanks to AMD, before Trump blocked further cooperation.
  • greyface- 1356 days ago
    Any IP lawyers in the house willing to speculate on how this is going to go down? Intel surely isn't going to let this stand, and the (Swiss) leaker is being completely open about their identity. What's the legal action going to look like?
  • Keyframe 1356 days ago
    At this rate, I fully expect Nvidia to make a merger bid for Intel.
    • mhh__ 1355 days ago
      Bigger market cap but something like 10 times less revenue; strange world. (As Intel gets hammered in the media, their revenue remains in a different league to AMD's - which I suspect is partly because AMD can walk the walk after dropping their trousers, but there is no foreplay [i.e. sales and software].)
  • kr99x 1355 days ago
    The three "biggest deals" here are all... a lot less important than they look. Clarifying info on all three:

    "Did Intel get hacked?" I can't confirm the exact mechanism by which these files got out, but I do know that these files are things which get shared externally already with Intel's customers under NDA. If security in general is lax, that's one thing and future hacks of more sensitive stuff could be expected. If security in general is fine, but for some NDA customer sharing channel is lax, don't expect to see anything juicier.

    "Intel123 is an awful password." Yes it is, but... it's not for security. Intel123 is the password used to bypass executable/script filtering systems that overzealous IT put in place to "protect" employees. Employee A wants to share a zip with employee B. There are many channels they can use to do this, because the contents of the zip are not encrypted or restricted. None of these channels require encryption, but either A or B doesn't like/understand them, so they agree on email. Whoops, the filter says that executable could be harmful and out it goes. Zip-via-email doesn't work. Unless... well, if they put a password on it, the filter doesn't catch it. Good. Problem solved. This is so common that the convention Intel123 arose and solidified for exactly this purpose.

    "I see the word 'backdoor' in there!" Sure. Bad name choice. That's not the kind of backdoor you're thinking. There are a lot of things in the firmware that take this exact same form and don't use the word backdoor. It's a signal the low level firmware is keeping an eye out for, and if received, it will trigger some other piece of firmware to do some task in SMM. If that other piece of code takes input parameters and fails to verify them, then you may have a vulnerability on your hands - in fact, this was a very common kind of vulnerability before. Intel has fixed a lot of these over the years. Odds are they're mostly gone by now. If input parameters are verified (or none taken), the worst you could do is maybe a DoS by spamming that signal to keep the CPU clogged/stuck in SMM.

  • phendrenad2 1356 days ago
    This is more of an advertisement for all the cool stuff you can work on if you go work for Intel than anything.
  • intelleak 1356 days ago
    In what ways can an end user of an Intel processor expect to benefit from this? I'm guessing none, since every consumer interface is already a standard ... Can anybody chime in?
    • molticrystal 1356 days ago
      While it doesn't mean it will happen, depending on what is leaked now and in the future, possibilities include:

      1. Verify that debug features that are remotely exploitable are actually disabled in consumer releases of their hardware.

      2. Re-implement proprietary parts of the boot sequence, such as activating memory controllers, in an open and public manner that can be more easily looked over for flaws, security and otherwise.

      3. Modify parameters and tweak hardware for additional stability or performance enhancements, especially undocumented or disabled (on lower-graded chips of the same architecture) aspects of the hardware that may be present.

      On the other hand, barriers include legal issues (depending on what country the people working on these originate from), ethical issues, and even being barred from the industry - and this is not exhaustive. Consumers, especially consumers in countries not concerned about the legal aspects, will likely gain the most advantages, if any are present.

    • dleslie 1356 days ago
      Optimistically, the exposure of backdoors in the firmware may cause Intel to patch and close them.

      Realistically, Intel will patch the firmware and replace the backdoors with new ones.

      • intelleak 1356 days ago
        Besides the backdoors I mean, I was thinking about performance or usability improvements...
        • eitland 1356 days ago
          Might finally get certain big PC vendors to consider using AMD which will increase competition and make sure we get better hardware in the future..?
        • mhh__ 1356 days ago
          Other than disabling security features, Intel already publish extensive optimization manuals, so I don't think there's anything to go on here.
    • wmf 1356 days ago
      It would be illegal, but some of that code might help Coreboot development.
      • outworlder 1356 days ago
        > but some of that code might help Coreboot development

        Unlikely.

        Most projects won't come anywhere near this sort of thing. There may be a possibility of doing clean room implementation, but writing the spec based on stolen IP is the problematic step.

        Then again, there is a high chance that none of this will be useful.

        • ATsch 1356 days ago
          It will be more or less impossible to prove or disprove that anyone obtained some crucial information from there. The info will always somehow make its way into the places it's needed eventually.
          • dogma1138 1356 days ago
            It doesn’t matter if it’s provable or not, most developers won’t risk it especially if they want to keep their jobs or be hireable.

            If you review the content and publish, say, a blog post, even without legal repercussions it can impact your ability to be hired in the future, since everything you do from that point can be tainted.

            So if you do look, you should keep it quiet or publish it under a pen name that you can't ever take credit for.

            • ATsch 1355 days ago
              My point was that even without anyone taking that risk, the information will spread.

              Someone reads the code, mentions it to a friend, who adds it to a blog post, which gets cited in a wiki, which gets read by a developer unaware of the source. If the information is useful, it will end up getting spread.

            • johnnyfaehell 1356 days ago
              Take the Microsoft Windows code that got leaked - was anyone blacklisted for that?

              Also, I would assume processor companies hire people from other processor companies and everyone wants the best, so most of the basic knowledge would have already made its way to AMD and other companies.

              • dogma1138 1356 days ago
                But that isn't basic knowledge. If you work in firmware development, embedded, SoC design etc. and your employer or future employers might be competing against Intel in some market segment (which, given the sheer amount of products Intel has, isn't an unlikely scenario), I would be very careful about admitting to reading - not to mention publishing - content based on this leak.

                If you work in a completely unrelated field then you don’t need to care as much.

                • johnnyfaehell 1356 days ago
                  But if you've worked at Intel, that knowledge is part of the job for a majority of people, and when they leave, that knowledge goes too.
            • gsich 1356 days ago
              You can always say you reverse engineered it yourself. Hard to disprove.
          • Zpalmtree 1356 days ago
            Maybe, but are you sure you want to end up in a very expensive court battle with Intel?
  • privacyonsec 1355 days ago
    Intel leaked sources contain "backdoor" keyword

    https://twitter.com/deletescape/status/1291419918685147149?s...

    • wingerlang 1355 days ago
      I've only seen people writing "backdoor!" without actually saying what kind, for whom, to what, and so on. Seems pretty disingenuous to me. Could easily be something trivial.
  • xyst 1356 days ago
    I’m not touching the binaries or executables, but I’m interested in the source code. Has anybody found anything interesting?

    Will download from the mega link and explore in a VM later.

  • als0 1355 days ago
    The FSP source code is supposedly leaked as part of this, which is used to initialise the memory controller. Are we closer to (modern) blob-free Intel platforms?
    • kr99x 1355 days ago
      Closer? Yes. Close? No. For one, memory init code differs from product gen to product gen and pulls in platform/board-specific libraries and inputs to set up some parameters. The bigger problem, though, is just how big and messy the memory init code is. It would take a substantial number of people a substantial amount of time to unwind and understand what's going on, let alone do a sane and/or clean-room implementation of it all.
  • pepemon 1356 days ago
    Intel went open source today.
  • RSO 1356 days ago
    Anybody else wondering at the rate of bad news around Intel at this moment? Like, is someone after them or is this just bad luck?
    • Nasrudith 1356 days ago
      Personally it seems more like complacency and cultural rot has caught up to them than any bad actor - excluding their own management chasing ego gratification or short term profits. Falling behind AMD in so many metrics when they were previously often a second-best rival screams that they need to get their shit together.
      • ddingus 1355 days ago
        This. Inertia, tech debt, a less nimble culture all add up.

        Intel needs to kick off a skunk works that basically gets funded well enough to find a new way.

        If they do it now, some space could get really interesting again.

  • danw1979 1356 days ago
    The advice to try a password of “Intel123” on any protected files says it all.

    This organisation genuinely deserves whatever is coming for them.

    • dathinab 1356 days ago
      In my experience password protected files are often password protected for obscure reasons which have nothing to do with the intent of keeping them secret, like:

      - prevent anti virus from messing with it

      - comply with some obscure regulations w.r.t. contracts or law, where it is enough if you can argue the data was encrypted.

      • mensetmanusman 1356 days ago
        If you make zip files in a company, there is never a repository of passwords, because that would be insecure; ergo, zip files with non-standard passwords usually are not easy to unzip after 5-10 years, when the owner is dead/gone and the passwords are lost.
    • Chyzwar 1356 days ago
      This type of password is used in almost all big corporations. People are asked to encrypt things, but without password managers or key-management tools.
      • jychang 1356 days ago
        Yeah, I know of plenty of other companies with similar multi-time-use passwords. The data in that zip file probably isn't that confidential.
      • yjftsjthsd-h 1356 days ago
        > People are being asked to encrypt things but without password managers or keys management tools.

        That doesn't really make me think better of the company; if the company fails to support secure workflows, it's still on them when people fail to use secure workflows.

    • zenexer 1356 days ago
      It’s mentioned in the Twitter thread that at least some of the files have a password of “I accept” instead. That leads me to believe that the primary purpose might just be to indicate agreement to an NDA.
    • kr99x 1355 days ago
      The number one (by a wide margin) reason for Intel123 is that somebody is trying to email a zip to somebody else, but a mandatory filter notices "bad files" (oh no, executables!) inside the zip and removes it to keep people "safe". So the zip gets a common, known password, and the recipient gets their files in peace. It's not a security measure at all. It's a workaround for braindead IT "solutions" hindering day-to-day operations. The files can be shared via any number of non-encrypted channels just fine, but the particular employees trying to share happen to be most familiar with email, and the filter doesn't know or care if there are secrets - there are EXECUTABLES! Those are dangerous, don't you know?
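
      On the receiving end, the "password" is just a known string, not a secret. A minimal Python sketch of what opening one of these looks like (the filename is hypothetical; this assumes legacy ZipCrypto, since the stdlib can't decrypt AES-encrypted zips; "I accept" is the other conventional password mentioned elsewhere in the thread):

          # The password only exists to get the zip past the mail filter; trying the
          # handful of conventional values is all a recipient ever does. Filename is
          # hypothetical; assumes legacy ZipCrypto (stdlib can't handle AES zips).
          import zipfile

          CONVENTIONAL_PASSWORDS = [b"Intel123", b"intel123", b"I accept"]

          with zipfile.ZipFile("shared_tools.zip") as zf:
              for pwd in CONVENTIONAL_PASSWORDS:
                  try:
                      zf.extractall(path="extracted", pwd=pwd)
                      print("extracted with", pwd.decode())
                      break
                  except RuntimeError:        # wrong password raises RuntimeError
                      continue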
    • jnwatson 1356 days ago
      The primary use case for such passwords is not confidentiality. It is to bypass overzealous email virus checkers.
    • dade_ 1356 days ago
      For the life of me, I can't understand why people insist on making passwords with the name of the company in them. It's so absolutely stupid, but common.
      • lez 1355 days ago
        Internal marketing
  • ggm 1355 days ago
    I'd love an IPR lawyer to explain legal paths to a clean-room spec of the bits of this which could be useful, like the ME or the parts coreboot depends on.

    I see comments which say "stay clear, they will", but I would like to know how, if at all, this could be done legally on the receiving side of the functional spec from a clean room.

  • inthewings 1356 days ago
    That's just another internal Meltdown from a Spectre employee!

    For once they don't leak our data.

  • chasd00 1356 days ago
    If you're at a company that can be considered an Intel competitor, I would avoid this like the plague. Weren't there problems for people working on Linux after merely viewing source code from other operating systems?
  • WhyNotHugo 1352 days ago
    I'm very curious about this. What's the legality of just reading this out of mere curiosity (e.g. I don't work for AMD, don't write drivers, etc.)?
  • dmix 1356 days ago
    Binaries unique to SpaceX, maybe related to a server or contract with SpaceX?
    • theon144 1356 days ago
      I've heard that it's apparently SpaceX camera drivers?
      • dmix 1355 days ago
        That's what it says in the docs
  • spicyramen 1355 days ago
    Cisco123
  • akayoshi1 1356 days ago
    The next Edward Snowden.
    • mhh__ 1356 days ago
      Unless they've actually found a real smoking gun, probably not even close. Besides, even if there isn't a backdoor in Intel CPUs, they've definitely tried.
  • iphone_elegance 1356 days ago
    Someone in China is smiling
  • alsdkfjkqjwer 1356 days ago
    Why would they include stuff from proprietary releases?

    I understand exposing backdoors and all, but who cares about camera firmware for an airgapped system?

    I wonder if some of the clients for those devices are involved, and the goal of this is that those clients got fed up with the NDAs and wanted all this in the "public domain"?