I have a few theories why this happens: Are we all too distracted to do focused work? Does the industry focus too much on the newest trends rather than building things right the first time? Have easy coding tools led to devs who don't grasp the fundamentals?
Plus, what does the rise of AI mean for software quality? Could things get a LOT worse before they get better?
What are the worst examples of bad software that drive you crazy? Are there shining examples of exceptional quality that give you hope?
We have some examples from the old days that survived and have some of those traits, but I see that as survivor bias — the really bad code gets replaced or forgotten.
Programmers often fail to understand that companies pay to develop software to make money, to solve business problems. Software doesn’t get written because it has intrinsic value; it gets written as a tool to make money. In a profit-driven environment with short-term priorities, the attributes programmers call “quality” may not matter.
I agree with this. I haven't been at it for 4 decades, but have dealt with a lot of legacy code. Shitty code makes the world go 'round.
1. No culture of craftsmanship in software outside of niches such as native Mac software development. “Move fast and break things” was supposed to mean embracing iterative development practices and not being afraid to make changes when necessary, but this philosophy has been stripped of its original context, which has resulted in recklessness.
2. Unrealistic deadlines that favor slapdash, “get it out the door as fast as possible” solutions instead of taking the time to deliver quality software.
3. UI/UX designs that serve business priorities over users’ needs. Egregious examples include dark patterns and annoying ads. Less egregious examples include design for branding instead of design based on the platform’s UI guidelines. Unfortunately the Web lacks UI guidelines, and so each website has its own UI/UX.
4. What’s the incentive for quality? Often there isn’t one: market dynamics such as network effects and lock-in prevent competition from gaining a foothold or even from existing in the first place. Why would a company improve its software if people are required to use that product and there are no alternatives? A lot of people, for example, complain about Microsoft Teams, but if their job requires it, they can’t switch to an alternative platform.
Some of my favorite customer-facing software includes The Omni Group’s Mac software such as OmniGraffle and OmniOutliner, ClarisWorks, the classic Mac OS (less for its underpinnings, which weren’t the most reliable, and more for its interface), Mac OS X (especially the Tiger, Leopard, and Snow Leopard era), and late 1990s and early 2000s Microsoft software (Windows 2000, Office 97 and 2000, Visual Studio 6, Visio).
I wonder how much of today's world of broken apps can be attributed to that unfortunate wording.
Necessity of change should be no impediment to healthy fear of the consequences.
Another issue I have with modern Apple software is the removal of features (I still miss iPhoto; Photos.app is inferior) and the loss of the perfectionism that once defined Apple; I feel Apple’s software has gotten buggier. Of course the Mac today is much more stable than it was in the classic era (no memory protection and cooperative multitasking made for an unstable environment that meant needing to reboot in the middle of getting work done), but it seems like a regression from the 2000s when Steve Jobs had tight control.
There’s also the context of when Tiger, Leopard, and Snow Leopard were released. I believe these releases of macOS were far better than contemporary Windows releases (XP, Vista, 7) and than the KDE and GNOME of the time. In fact, if somebody made a Linux desktop environment today that was a clone of Tiger, and if enough software were written to conform to Apple’s human interface guidelines of the era, it would be head and shoulders better than current versions of GNOME and KDE. But that’s my opinion, and everyone has different tastes.
What do you mean by a "big-budget banking app"? Do you mean the website/online banking or mobile app that the bank provides to consumers for free?
For those, the purpose of the software is:
* Advertise the bank's latest credit card / mortgage products.
* Keep all your data in their app, so you're locked in and can't get to your information without seeing the ads first.
* Make it hard to export data, switch banks, etc.
Being a convenient way to manage finances is not a goal of the software, because that doesn't generate revenue.
As for examples: it always makes me mad when my old devices stop working even though nothing in their electrical circuits has changed. ATI video cards have no free drivers, BlackBerry devices can no longer render web pages because of an HTTPS requirement that appeared out of thin air, and some Lenovo laptop parts don't fit another model due to lock-in in the BIOS.
BTW, look at modern laptops; they have also gotten worse. I can't upgrade a laptop CPU after Intel's 4th generation, and nowadays laptops just don't have many ports or a durable case.
> I have a few theories why this happens: Are we all too distracted to do focused work? Does the industry focus too much on the newest trends rather than building things right the first time? Have easy coding tools led to devs who don't grasp the fundamentals?
These questions don't resonate with me; try formulating them again.
They can still render the pages they used to render, right?
The problem is that generally such pages are no longer published.
Although imperfect, Linux, Apple, and Microsoft seem to stand out as high-quality OS frameworks. Amazon’s AWS is of generally high quality in the cloud as well. Google lately has been a counterexample.
In my world, some of the worst software I see comes from the gaming industry, and I see it as an example of what happens when you have too many cooks in the kitchen, no efficient way to communicate, and a lust for money. So what I think we’re seeing now is the result of a massive consolidation of talent onto increasingly complex products, with a goal of squeezing in an ever-increasing number of features to ultimately extract the maximum amount of money from players.
Recreational drugs are great nowadays; there are new stimulants on the market and new ways to consume them, such as a battery-powered, pen-shaped little something.
With internet-facing applications, it is not uncommon to deal with open-source software (with no vendor responsible for maintaining it), several APIs (each of which may or may not have a vendor who can provide support), and security by duct tape (we slap this security module on it and hope it stops most attacks). That is, the complexity of modern applications has increased exponentially. Instead of a mechanism to pay/fund engineers to improve our tech stack and deal with this complexity, we magically hand-wave these concerns away. My opinion is that those who can afford it and demand security will go full "zero trust" by using flashed EPROMs. Software will no longer be soft. Instead, software will be flashed onto EPROM chips, and only after it goes through thorough testing against hacks will it be deployed into production. When your company buys cyber insurance, instead of completing a checklist of security best practices, your company will attest that its software stack consists of EPROM chips that have been certified by CISA and the like.
I think in part this happened because of how tools evolved and have become more automated, but on the other hand, there is probably some incentive from the business side to make teams less specialized and more easily replaceable. The cost is probably worth it for business because nobody demands quality software, which is the root of the problem. The end user has not fought back against endless updates to patch buggy software that somehow still has responsiveness problems with extremely powerful hardware. I guess not enough people care, and so companies structure businesses accordingly. On the flip side, there might soon be opportunities for software companies to provide a better experience via good engineering, and reverse the trend a little.
Also, IT as a field has attracted way too many non-IT personnel who, because they lack technical knowledge, always seem to move up into management roles because they make great PowerPoints.
Mediocrity promotes mediocrity.
https://news.ycombinator.com/item?id=40087369
https://news.ycombinator.com/item?id=40090341
https://news.ycombinator.com/item?id=38764427
https://news.ycombinator.com/item?id=35537264
https://news.ycombinator.com/item?id=36844866
https://news.ycombinator.com/item?id=24716199
https://news.ycombinator.com/item?id=18421421
Lots of people have taken up the role of developers, but most of those people are not very good. Lots of sweatshop development, with zero knowledge of, or attention to either security or quality. People using frameworks and libraries they don't really understand, to get product out the door fast.
I know one got laid off recently though, so maybe things are changing.
It's much easier to summarize: "All software sucks, but no one has made anything better."
This guy[1] has been complaining about people not using pure assembly and making "bloated" applications since the 1990s.
[1] https://www.grc.com/smgassembly.htm
Everything started being taken for granted -> people got lazy and stopped doing a good job (at all levels, from dishwashers to presidents) -> everything slowly goes downhill.
But there is good news.
Crises happen for a reason: they fix these things.
Then we get another 5-10 years of growth and prosperity.
Big budget banking apps were always painful to use. That is not a change in anything.
> Reliability and security often feel lacking too.
If anything, there is more focus on those than in the past.