Why did software become worse lately?

I've seen basic functions split across apps, broken cloud services, and even big-budget banking apps that are painful to use. Reliability and security often feel lacking too.

I have a few theories why this happens: Are we all too distracted to do focused work? Does the industry focus too much on the newest trends rather than building things right the first time? Have easy coding tools led to devs who don't grasp the fundamentals?

Plus, what does the rise of AI mean for software quality? Could things get a LOT worse before they get better?

What are the worst examples of bad software that drive you crazy? Are there shining examples of exceptional quality that give you hope?

19 points | by headsnap_io 13 days ago

20 comments

  • gregjor 13 days ago
    You will have to define what you mean by “worse” and “lately.” I have over four decades of experience in software development and I don’t think we ever had high-quality code, reliability, or good security.

    We have some examples from the old days that survived and have some of those traits, but I see that as survivor bias — the really bad code gets replaced or forgotten.

    Programmers often fail to understand that companies pay to develop software to make money, to solve business problems. Software doesn’t get written because it has intrinsic value, it gets written as a tool used to make money. In a profit-driven environment with short-term priorities, the attributes programmers call “quality” may not matter.

    • mattpallissard 12 days ago
      > I don’t think we ever had high-quality code, reliability, or good security.

      I agree with this. I haven't been at it for 4 decades, but have dealt with a lot of legacy code. Shitty code makes the world go 'round.

  • linguae 13 days ago
    Just a few of my opinions regarding why there’s so much low-quality software:

    1. No culture of craftsmanship in software outside of niches such as native Mac software development. “Move fast and break things” was supposed to mean embracing iterative development practices and not being afraid to make changes when necessary, but this philosophy has been stripped of its original context, which has resulted in recklessness.

    2. Unrealistic deadlines that favor slapdash, “get it out the door as fast as possible” solutions instead of taking the time to deliver quality software.

    3. UI/UX designs that serve business priorities over users’ needs. Egregious examples include dark patterns and annoying ads. Less egregious examples include design for branding instead of design based on the platform’s UI guidelines. Unfortunately the Web lacks UI guidelines, and so each website has its own UI/UX.

    4. What’s the incentive for quality? Market dynamics such as network effects and lock-in that prevent competition from gaining a foothold or even from existing in the first place. Why would a company improve its software if people are required to use that product and there are no alternatives? A lot of people, for example, complain about Microsoft Teams, but if their job requires it, they can’t use an alternative platform.

    Some of my favorite customer-facing software includes The Omni Group’s Mac software such as OmniGraffle and OmniOutliner, ClarisWorks, the classic Mac OS (less for its underpinnings, which weren’t the most reliable, and more for its interface), Mac OS X (especially the Tiger, Leopard, and Snow Leopard era), and late 1990s and early 2000s Microsoft software (Windows 2000, Office 97 and 2000, Visual Studio 6, Visio).

    • chrisjj 13 days ago
      > "Move fast and break things” was supposed to mean ... not being afraid to make changes when necessary

      I wonder how much of today's world of broken apps can be attributed to that unfortunate wording.

      Necessity of change should be no impediment to healthy fear of the consequences.

    • lambdaba 12 days ago
      I keep seeing this about OS X, what was special about those versions? And what are the problems with recent versions?
      • karmakaze 12 days ago
        Snow Leopard was the pinnacle of Mac OS that was "for the user". After that, macOS was a means to Apple's end. e.g. discontinuation of Front Row with no replacement.
        • saurik 12 days ago
          Oh wow, I'd forgotten about Front Row... presumably named after how if you sat with a Mac in the front row of an auditorium, the people behind you could pull out any Apple remote and use it to make your laptop fade to black and enter some crazy UI you didn't recognize ;P. (It took them way too long to come up with any kind of security model for that feature...)
      • linguae 12 days ago
        This is all opinion, but for me, it’s not that modern macOS has severely regressed from Snow Leopard, though I don’t like notarization, and I also don’t like the flat design and iOS influences that have made their way to macOS in the past decade. However, it seems to me that macOS hasn’t improved substantially since Snow Leopard, at least from my point of view. This isn’t necessarily a bad thing; even with the iOS UI elements that have made their way to macOS, somebody from 2009 using Snow Leopard could easily navigate a 2024 Mac running Sonoma; the macOS interface hasn’t changed substantially since 2001. Other than integration with the rest of Apple’s ecosystem, most of the macOS features I use, such as Spotlight, were present all the way back in Tiger.

        Another issue I have with modern Apple software is the removal of features (I still miss iPhoto; Photos.app is inferior) and the loss of the perfectionism that once defined Apple; I feel Apple’s software has gotten buggier. Of course the Mac today is much more stable than it was in the classic era (no memory protection and cooperative multitasking made for an unstable environment that meant needing to reboot in the middle of getting work done), but it seems like a regression from the 2000s when Steve Jobs had tight control.

        There’s also the context of when Tiger, Leopard, and Snow Leopard were released. I believe these releases of macOS were far better than contemporary Windows releases (XP, Vista, 7) and than the KDE and GNOME of the time. In fact, if somebody made a Linux desktop environment today that was a clone of Tiger, and if enough software were written conforming to Apple’s human interface guidelines of that era, it would be head and shoulders better than current versions of GNOME and KDE — but this is my opinion and everyone has different tastes.

  • tacostakohashi 12 days ago
    > even big-budget banking apps that are painful to use

    What do you mean by a "big-budget banking app"? Do you mean the website/online banking or mobile app that the bank provides to consumers for free?

    For those, the purpose of the software is:

    * Advertise the bank's latest credit card / mortgage products.

    * Keep all your data in their app, so you're locked in and can't get to your information without seeing the ads first.

    * Make it hard to export data, switch banks, etc.

    Being a convenient way to manage finances is not a goal of the software, because that doesn't generate revenue.

  • eimrine 13 days ago
    Aren't you confusing services with software? My unconnected laptop works perfectly for me.

    As for examples, I'm always mad when my old devices stop being functional even though nothing has changed in their electric circuits. ATI video cards have no free drivers, BlackBerry devices can no longer render webpages due to an HTTPS requirement that appeared out of thin air, and some parts of one Lenovo laptop don't fit another due to a lock in the BIOS.

    BTW, look at modern laptops; they have also become worse. I can't upgrade a laptop CPU after Intel's 4th generation, and nowadays laptops just don't have many ports or a durable case.

    > I have a few theories why this happens: Are we all too distracted to do focused work? Does the industry focus too much on the newest trends rather than building things right the first time? Have easy coding tools led to devs who don't grasp the fundamentals?

    These questions don't resonate with me; try reformulating them.

    • chrisjj 13 days ago
      > Blackberry devices can not render webpages due to https requirement out of thin air

      They can still render the pages they used to render, right?

      The problem is that generally such pages are no longer published.

  • SlightlyLeftPad 13 days ago
    Not to sound overly reductive, but there seems to be a growing contrast/rift between really good software and stagnant software. In many ways, it seems like large parts of the world are still operating on one-to-two-decade-old frameworks and systems.

    Although imperfect, Linux, Apple and Microsoft seem to stand out ahead as high quality OS frameworks. Amazon’s AWS is of general high quality in cloud as well. Google lately has been a counter-example.

    In my world, some of the worst software I see comes from the gaming industry, and I see it as an example of what happens when you have too many cooks in the kitchen, no efficient way to communicate, and a lust for money. So what I think we’re seeing now is the result of a massive consolidation of talent onto increasingly complex products, with a goal of squeezing in an ever-increasing number of features to ultimately extract the maximum amount of money.

  • michelsedgh 13 days ago
    I wonder why everything became worse lately. The software, food, services, healthcare, cars, Amazon, social media, Google, politics… literally everything that was good is now worse. Thinking it over, I honestly couldn’t name anything that became better. Please prove me wrong so I’ll be a bit happier tomorrow.
    • linguae 13 days ago
      Speaking from a US-centric point of view (I don’t know the situation abroad), making a living and keeping up with inflating costs feels like being on a treadmill that goes faster every year, with more and more demands for productivity from employers. I fear not being able to keep up. These employers face heavy demand from their shareholders to grow at all costs to keep those stock values increasing. It’s all about a relentless pursuit for growth. This is why there’s been a proliferation of subscription services; there’s pressure to monetize things as much as possible.
      • michelsedgh 13 days ago
        I am in the US, and what I notice is that it must be worse everywhere else, because the US exports its inflation to other countries by virtue of the dollar being the reserve currency and everything else being priced in dollars.
    • ted_bunny 12 days ago
      They say that's what it's gonna be like when the US/West collapses. Not a big crash, but things getting shittier generation by generation.
    • eimrine 13 days ago
      > Please prove me wrong so I’ll be a bit happier tomorrow.

      Recreational drugs are great nowadays; there are new stimulants on the market and new ways to consume them, such as a battery-powered, pen-shaped little something.

      • michelsedgh 13 days ago
        So ur telling me I should drug my brains to feel happy?! I can’t say I understand what u mean by this comment.
    • rudasn 13 days ago
      Well, you can be a little bit happier tomorrow if you let go of that train of thought for a little while and do something fun for yourself.
      • michelsedgh 13 days ago
        I’m generally happy, but proving me wrong in this would make me a little happiER ;) Still no examples of what’s better tho. You can argue AI is better, but I have yet to see actual use cases in daily life for it. The algorithmication of everything in my opinion is what made a lot of stuff worse… Hacker News isn’t worse tho, you can say that.
    • cuu508 13 days ago
      Small new cars are safer and have lower emissions than older models from the same class. They are worse in some other aspects, though.
      • eimrine 13 days ago
        They are safer indeed, but leaning too hard into safety in automotive design produces too many displays and also surveillance.
      • michelsedgh 13 days ago
        Yeah, but the build quality is much worse than it was just a few years ago. They have many issues straight out of the factory. There are even statistics and articles about this.
  • yen223 12 days ago
    90% of it is rose-tinted glasses. We humans are very good at forgetting the inconveniences of the past once we no longer experience them.
  • piezoelectric 12 days ago
    It's sort of a "bad software devs make hard times, hard times make good software devs, good software devs make easy times, easy times make bad software devs" cycle — not to mention the megacorps' fists holding everything.
  • rawgabbit 12 days ago
    My personal opinion is that we bull-shitted ourselves into believing the slapdash tech stack we created for internet-facing applications is the greatest thing since sliced bread. In the past, we developed software meant to be used by one person on one PC that connected to one server, and there were maybe three software vendors involved: the client OS, the software tool itself, and the server OS.

    With internet-facing applications, it is not uncommon to deal with open-source software (with no vendor responsible for maintaining it), several APIs (each of which may or may not have a vendor who can provide support), and security by duct tape (we slap a security module on it and hope it stops most attacks). That is, the complexity of modern applications has increased exponentially. To deal with this complexity, instead of creating a mechanism to pay/fund engineers to improve our tech stack, we magically hand-wave these concerns away. My opinion is that those who can afford it and demand security will go full "zero trust" by using flashed EPROMs. Software will no longer be soft. Instead, software will be flashed onto EPROM chips and deployed into production only after thorough testing against hacks. When your company buys cyberinsurance, instead of checking off a list of security best practices, your company will attest that its software stack consists of EPROM chips certified by CISA and the like.

  • proc0 12 days ago
    It's because the engineers that build the software are also testing their own features, and are also generally tasked with handling the analytics as well, not to mention planning, tasking, and managing expectations. In other words, multiple engineering roles on the product development side have collapsed into a single role, and I believe this has impacted the industry significantly.

    I think in part this happened because of how tools evolved and have become more automated, but on the other hand, there is probably some incentive from the business side to make teams less specialized and more easily replaceable. The cost is probably worth it for business because nobody demands quality software, which is the root of the problem. The end user has not fought back against endless updates to patch buggy software that somehow still has responsiveness problems with extremely powerful hardware. I guess not enough people care, and so companies structure businesses accordingly. On the flip side, there might soon be opportunities for software companies to provide a better experience via good engineering, and reverse the trend a little.

  • shenene 12 days ago
    #1 reason imho is the rapid expansion of the field, in terms of both technologies and required knowledge, combined with an influx of below-par software 'engineers'.

    Also, IT as a field attracted way too many non-IT personnel who, because they lack technical knowledge, always seem to move up into management functions because they make great PowerPoints.

    Mediocrity promotes mediocrity.

  • gtirloni 12 days ago
  • cainxinth 12 days ago
    I’ve been reading that headline in various forms for thirty years now.
  • bradley13 13 days ago
    Since the Internet really took off, there has been a huge and unrelenting demand for software. Everything from basic websites to cloud services to the stuff running in the cloud.

    Lots of people have taken up the role of developer, but most of them are not very good. Lots of sweatshop development, with zero knowledge of, or attention to, either security or quality. People use frameworks and libraries they don't really understand, to get product out the door fast.

    • Workaccount2 12 days ago
      I know a few people who never struck me as the technical type, or as particularly interested in solving engineering problems, who did a boot camp to crash-course a new career in software and ended up with cushy six-figure work-from-home gigs.

      I know one got laid off recently though, so maybe things are changing.

  • mmh0000 12 days ago
    This is a complaint going back to the beginning of computers.

    It's easy to summarize: "All software sucks, but no one has made anything better."

    This guy[1] has been complaining about people not using pure assembly and making "bloated" applications since the 1990s.

    [1] https://www.grc.com/smgassembly.htm

  • gardenhedge 12 days ago
    Banking apps may or may not have a big budget. They're created in a highly political environment. Engineers might not be making the decisions. E.g. do we spend time fixing X? Some business person might decide that since it affects only 10% of the user base they would rather have the tech team work on something that increases their bonus.
  • aristofun 12 days ago
    It is part of the general trend of unprofessionalism and ignorance.

    Everything started being taken for granted -> people got lazy and stopped doing a good job (at all levels, from dishwashers to presidents) -> everything is slowly going downhill.

    But there is good news.

    Crises are there for a reason. They fix these things.

    Then we get another 5-10 years of growth and prosperity.

    • malicka 12 days ago
      This seems like a weirdly wishy-washy reason to go with, when we could pick out a more tangible one instead: For some types of software, being a bit painful to use is (counter-intuitively) the more profitable route. These user-hostile patterns are copied by other teams and apps because they are seen as “what’s professional” or “modern.” And then those get copied by others, so on and so forth… until you get the modern landscape of proprietary software.
      • aristofun 12 days ago
        Would you call someone who just copies, without putting in their own thinking and hard work, a professional?
        • linguae 12 days ago
          "Nobody ever got fired for buying IBM" is a very old saying. The sentiment is similar to how many companies have blindly adopted FAANG-style Leetcode interviews even when their work is completely different from what the FAANGs do, as well as the adoption of other fads such as stack ranking. It's easier to imitate the market leaders than it is to innovate or invent, and innovation/invention requires taking risks.
  • watwut 12 days ago
    > and even big-budget banking apps that are painful to use

    Big budget banking apps were always painful to use. That is not a change in anything.

    > Reliability and security often feel lacking too.

    If anything, there is more focus on those now than in the past.

  • zilti 13 days ago
    It is getting worse and worse due to CV-driven development, hypetrains, and moving further and further away from standards. It is horrible and frustrating.
  • kingspact 12 days ago
    [flagged]