It was the best of times, it was the worst of times :-)
I really, really liked the Amiga (and was a fairly regular contributor to the comp.sys.amiga newsgroup). I also interviewed for a job at Commodore HQ in Frankfurt (the actual headquarters of the tax shelter known as Commodore :-)). One of the things that became painfully obvious was that Commodore was not set up to make the Amiga successful; it had neither the organizational structure nor the core corporate values that would make that possible.
At the time I was working at Sun, which I had joined as a reasonably early engineer, and I watched Sun grow from a scrappy "start-up with great pretensions" into something DEC and IBM actually started losing market share to. I saw the Amiga's market as the low-priced workstation, not the bargain-basement home PC. But it could be both, and for a while that path was making progress with the A500 and the A2000.
It was hard to do, though: the workstation market really needed a high-resolution, flicker-free display, while the home market needed to look good on TVs. The architecture didn't have the display bifurcation line that was built into the PC or other workstations.
Without executive air cover to make the investments engineering needed, Commodore reverted to its roots of making things as cheaply as possible and adding margin that way. "High-end" systems that would have had small markets were shelved, and even the commercial systems they were selling into the video post-production market were starting to get a reputation for being cheaply made.
It took me a long time to get past my feelings of loss when we saw the future slip away.
> It took me a long time to get past my feelings of loss when we saw the future slip away.
I think that many Amiga users have never gotten over their feelings of loss. Among retrocomputing enthusiasts, Amiga users are an extra-odd bunch -- often still making and selling commercial shrink-wrapped software for years.
While it's not entirely unusual for people to put effort into a special game or whatnot for their favorite retrocomputer, and charge people for a nostalgia-filled limited run, Amiga users seem strangely tied to trying to make creating Amiga stuff a viable commercial venture -- it's as quaint as it is bizarre these days.
The Amiga broke my heart. In some sense I will never get over it. At ages 12 to 18 I was a programmer on the Amiga, a music composer, digital image artist, I scoured BBS's, and so much more. I got together with groups of other people (from all over, and of all ages) doing the same things.
When it could no longer be denied that Commodore and the Amiga were dead, cold and buried, and it seemed the only alternative was a clunky, artless PC, I got so down on the whole thing that I left the field of computers entirely. I even went to university in a completely unrelated field.
After many years I finally broke, went back, got a computer science degree, and have now been a programmer and systems architect for many years. But I have never forgotten my first love.
You wouldn't say that about an aftermarket for, say, vintage car parts, would you?
To be specific, if you buy yourself an original VW bus, you won't be on your own when you try to service it.
Amiga just might be the VW bus of retro computers (probably Commodore deserves this title better, but the analogy works the same way). It'll be alive for as long as the feelings associated with it are understood.
I think what really sticks out to me, more than in other retrocomputer scenes, is not only that software which should long ago have been given away or open-sourced is still being sold, but that the business model for the software is all very 1990s. "Buy here, $199.99 and download" seems anachronistic compared to where most other scenes have gotten to.
I mean, these companies must move what, tens of licenses per year? I would bet that many of them would do better just to give the software away for free (as in beer) and put a Patreon link on the home page.
I honestly admire the Amiga scene's incredible tenacity, but I also can't help finding it just a hair amusing.
Less like a VW bus and more like a 1920s Lincoln (the 1980s is relatively ancient when it comes to personal computing), and not only do you have the community of vintage enthusiasts to help you restore it and get it running, there's some company selling aftermarket entertainment systems, and they offer kits for the modern Accord and Camry as well as 1920s Lincolns...
Fuji Instax is still pretty niche and survived largely because of Fujifilm's extremely diverse portfolio. Polaroid didn't so much "mess around with its product line" as "go bankrupt twice in a decade". The revival of Polaroid instant film was difficult, painful and wouldn't have happened at all if it weren't for the fanatical devotion of a number of key players outside of Polaroid Corporation.
Polaroid arguably discontinued their instant film business because they bet on Zink.
Zink is a Polaroid technology that powers many instant cameras. I own a Polaroid-branded one (Polaroid SNAP), and it's awesome: it never ceases to amaze people. And when I tell them that yes, they can keep it, and it's also a sticker, their eyes light up!
I've made friends with this new digital iteration of the technology. But when the Impossible Project was started, that tech was still very far from being mature; and many people still prefer the look and feel of the instant film. They are different products.
Still, it wouldn't be fair to say that Polaroid didn't try to stay relevant in instant photography. Ultimately, Zink is a success - although it's a separate company now.
I can commiserate; I had the same feelings of loss when Acorn closed down.
When I was a kid they held the same affection for me as some kind of "nerd football" team that you'd support beyond all reason. I used to think "if they'd just done this, then they would have won!". With hindsight, as an adult, that seems so sweetly, childishly naive; I was a funny kid. :D
It convinced me though that you shouldn't tie your dreams to someone else's commercial organisation unless you're a major shareholder.
The happy ending for me was that Gnu/Linux and Free software nicely filled the void in my heart. No one can take it from you and it's never "finished". Cheers Mr Stallman.
(That's so naive of me; I must be a funny adult too. :D )
"The happy ending for me was that Gnu/Linux and Free software nicely filled the void in my heart."
How ironic, considering that exact same thing is the bane of my existence: for someone like me who skipped the primitive, clunky, derided PC and jumped straight onto Suns and SGIs, Linux and GNU are a terrible, utterly depressing regression compared to HP-UX, Solaris and IRIX. I lost all will to work on computers because of GNU and Linux. It's that terrible when I compare it to AmigaOS, HP-UX, Solaris or IRIX.
Much has improved, and illumos keeps getting better and faster: since 2005, no code that caused a performance regression was allowed into the Solaris codebase. Each committed speedup raised the bar, and nobody was allowed to commit code that slowed performance thereafter. Any case where GNU/Linux was faster was treated as a priority 1 bug. Yes, a bug.
Implementing tools within other tools for convenience is stupid, as it kills modularity, which is the UNIX® philosophy: I don't need that kind of "convenience" because I know UNIX® and therefore which commands to pipe together to get the same effect. grep -r replaces find + xargs + grep, and therefore goes against "do one thing and do it well" as well as "design tools to interface with other tools". For example, tar is a tape archiver, so it has no business implementing compression - that's what dedicated compressors like bzip2, xz or 7z are for; they know best how to (de)compress and how to handle their own formats. The GNU approach of baked-in convenience is just stupid.
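To make the pipeline argument concrete, here is a rough sketch (the file names are invented for the demo): a recursive grep composed from single-purpose tools producing the same result as the all-in-one form, and tar staying a pure archiver while a separate compressor handles compression.

```shell
# Build a tiny tree to search
mkdir -p demo/sub
echo "needle here" > demo/a.txt
echo "nothing" > demo/sub/b.txt

# The all-in-one convenience form:
grep -rl "needle" demo | sort > gnu.out

# The composed, single-purpose-tools form:
find demo -type f -print0 | xargs -0 grep -l "needle" | sort > classic.out

# Both find the same files
diff gnu.out classic.out && echo "same result"

# tar stays an archiver; xz does the compression via a pipe:
tar cf - demo | xz > demo.tar.xz
xz -dc demo.tar.xz | tar tf -
```

The `-print0`/`-0` pairing is what makes the composed form safe for file names containing spaces, which is the usual objection to the pipeline version.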
As a former Amiga owner, I don't think it is unique to the Amiga; rather, it was the end of an era, and nothing burned as brightly as the Amiga at the end of that era. I call it the tinkerer era, when home computers were not just a box but something to be tinkered with.
Resources were limited and hacks were abundant to make the machines perform miracles (in their day). Tinkering gave way to homogenization, and the whole landscape got sterile. I felt the same loss as I did with the Amiga when Symbolics started to fade from the landscape, then DEC, SGI, Atari, etc., etc. The era almost took Apple and NeXT with it. Some burned out longer than others, but they all started to die then and there. It really was an epoch shift in computing, and I think that is what we all truly miss. We pin it on our favorite tech of the time, but in the end, it was just a cooler time to be in computing.
> I think that for many Amiga users, they've never gotten over their feelings of loss.
Man, you are so right. I reluctantly moved to the PC, hoping till the very end that Commodore/Amiga would somehow survive the late 90s. That was a really bitter era, when one had invested so much time and joy in the machine and its OS.
Well, it has one advantage: it's a privacy-focused computer disconnected from the wild internet and probably less likely to be backdoored. I'd be happier using it as a crypto wallet or for personal finance spreadsheets than a modern PC.
To be fair to the Amiga enthusiasts, the Acorn / RISC OS group has a similar kind of -- well, I'll call it momentum.
Commercial software is still available (though not as much of it as was available in the past), and sometimes the prices are downright silly given the size of the market and the lack of maintenance on said software.
"I think that for many Amiga users, they've never gotten over their feelings of loss."
That is correct, and my pain born of that loss has increased over the years. First Sun and then SGI provided some alleviation, but when they both were lost the pain came back in full force, since each ended up reopening an old wound that had never fully healed.
It's not about the nostalgia. The nostalgia is present in lots of retrocomputing fandoms that aren't anywhere near as vulnerable to scams as the Amigans. RISC OS fans are nostalgic. NES fans are nostalgic. People insisting on re-building a Vax in their basement are nostalgic.
Nostalgia drives things like this: some guy puts together an order for Amiga logo keycaps and sells them for barely more than cost, and maybe a couple of people feel the quality isn't as good as they expected. That's not a scam.
Ten years ago, one of the Amiga companies told potential customers it would ship them a new multi-core PowerPC Amiga within the year, with a new OS version that supported using multiple cores. A huge breakthrough. It didn't ship that year, and when it did ship, it came with a "preview" that lacked support for the multi-core processor. You will not be surprised to learn that almost a decade later those customers are still waiting for their "final" version with working multi-core support.
But in many ways, being heavily downvoted shows exactly why this works. Amigans are sure that if they insist loudly enough that it's not a scam, then it won't be. They just need to have faith, click that downvote, tell people they're wrong -- if I scream loudly enough that I can fly, surely gravity will just have to believe me...
"Amigans are sure that if they insist loudly enough that it's not a scam, then it won't be. They just need to have faith, click that downvote, tell people they're wrong -- if I scream loudly enough that I can fly, surely gravity will just have to believe me..."
You just described a stereotypical GNU/Linux user here on "Hacker" "News", are you aware of that?
The Amiga Unix licensing failure was an impressively well aimed shot to the foot.
Mehdi Ali was responsible for gutting Commodore engineering and replacing it with mostly cheaper PC people, which led to the cost-cutting A600 that actually turned out more expensive to produce than the A500 it was supposed to replace. Dave Haynie had a lot to say about that era, all of it free of compliments.
Ali and Gould were drawing more than the CEO of IBM in Commodore's declining years.
From what I gleaned from a few documentaries here and there long ago, Mehdi Ali was responsible for driving other companies into the ground too, and for escaping those ships richer than when he boarded them.
How does this happen, and why is it allowed?
Are these kinds of CEOs the designated cleanup people who get sent in after it's been decided that a company will be killed off, there to [insert butcher analogue for stripping the leftover parts of a corpse]?
> I was working at Sun, had joined as a reasonably early engineer, and watched how Sun had grown from a scrappy "start up with great pretensions" to something DEC and IBM started actually losing market share to.
If one wanted to read observations about specific things Sun did effectively in this process, from either an organizational, strategic, or execution perspective, where would one look?
Just from chats I've had with people who were there, they claim it was simply being better engineers, having more focus, and having less internal rot. IBM is full of people who are good at keeping a job at IBM, ditto any company that lasts long enough. Take any organization and let it last long enough, and eventually it will be full of people whose only skill is keeping a job at that organization.
Apple was yet another company that was almost destroyed by its management, post-Jobs. If it weren't for Steve's return, we would have come scarily close to a soul-sappingly gray, homogeneous Microsoft/PC-only world. Thanks to the Mac's success we got iOS, and then, because of that, Android.
I wonder if there's any chance of a new major architecture becoming successful today.
The history of disruptive innovations shows that dominant new software platforms tend to emerge around new hardware form factors. So there's probably no room in the market for a new OS for smartphones or laptops. But once AR goggles or quantum computers or something become technically feasible then someone will create a new OS that disrupts the market for Windows / Unix / Android / iOS.
Probably not, as long as those two have some of the most profitable companies in the world behind them. We might see one of them push a replacement (e.g. Fuchsia), but competing against either of them, given the resources they currently have at their disposal, seems destined for failure.
At best, we might see an open source contender gain some traction, but there's not really the same lineage to draw on there either. Linux at least had a few decades of Unix software and UI (including text consoles) going for it. An open source competitor to iOS or Android will either be copying what they do, or making it up as it goes along, which is unlikely to yield major advantages they haven't already taken advantage of, IMO.
That's a good point about AR. Then again, Google and Apple have been pretty active in pursuing these areas, going as far as to now include dedicated hardware to help with much of the AI/ML stuff required to do that well, and they both have enough money to buy almost any company that comes out with a great showing in that area.
If we got some dark horse contender that wasn't taken seriously for long enough to let it gain an advantage, and/or was established or large enough that there wouldn't be a lot of pressure to sell, maybe. Nintendo/Sony/Microsoft are the current contenders for that in my mind based on those criteria (with Microsoft actually aggressively pursuing it). Any other company that makes a good showing is likely to become someone else's next meal.
I know this article was about how Commodore screwed up their marketing of the Amiga, but all those references to software, publications, and developers just took me on a nostalgia trip back to my 15-year-old self pining for an Amiga as I worked in a farmer's fields all summer long to be able to buy one. After I finally got one, I spent many late nights teaching myself to code up some homebrew BBS software for it.
You're not alone. I got mine around age 12-13, I think. I remember how cool and modern it looked compared to the C64 I had before. There were all the great games, but also so much to learn programming-wise. We had rather limited resources available at that time (especially in a small European town): mostly computer magazines and older friends. It was a really different approach to learning.
I built hardware mods on my Amiga 1000 and 500, and for members of my local Amiga club.
What got me was how easy it was to do, yet a large company like Commodore was not doing what I did at home.
On top of the 500 KB of original memory: 1.5 MB of memory piggy-backed onto the chip memory, 1.8 MB of high-speed static memory I designed myself, an installed CPU speed doubler, a gray composite adapter (using the analog colour signals) to drive a high-persistence monochrome monitor, a Zorro-bus-to-IBM-ISA-bus adapter, and an external triple floppy drive array.
All this was very easy to make; I don't know why Commodore could not do the same things.
Commodore was still making good revenue up to 1993 despite peddling seven-year-old designs. Firing all the good engineers, cutting costs to the bone, and selling outdated but cheap-to-manufacture technology was a fantastic short-term strategy. Surprisingly, 1991 was the second-best year by revenue, yet Commodore was already a walking corpse by then.
I was an Amiga user from 1988 to 1994 or so. 1991 felt like the peak of the Amiga. AmigaOS 2.0 was finally released, which was a huge upgrade. The A3000 was out... their first "professional" Amiga. There were lots of 3rd party vendors supporting it.
Same here, but I was younger and lucky that my dad was a complete Amiga nut. We had like four A500s around the house; he used an A2000, an A3000, then finally an A4000... I was really young (8-9) and mostly used them for gaming (Sid Meier's Pirates! was my all-time favorite). Good times.
I'm a bit sad that the Amiga never caught on much in the US... at least not in the home market. Looking back at computer history, it's almost surreal to see how much better it was than virtually all the competition (with the possible exception of NeXT), and how it still managed to lose the war to Windows.
I've played with the Amiga a lot via emulation, and it's still impressive to me: a home operating system with preemptive multitasking in the 80s?! With something like that, whoever was in charge had to work pretty hard to ruin it.
The Amiga in 1985 had amazing custom graphics and sound chips, a preemptive multitasking OS, and double the RAM of the Mac 128K at half the price (the Mac 128K's MSRP is about $6,000 in 2018 dollars).
It’s really sad how much public hagiography is made over the Mac when almost no middle class family I knew of could afford it, certainly not with a LaserWriter.
The Commodore 64 was way more affordable and got a legion of kids interested in computing and coding, who later went on to adopt Amigas.
Even today if you look at the home brew, hacker, and demo scenes, Commodore dominates. Hardly anyone is doing stuff on old Apple 2s or Macs.
Commodore gets short shrift in the Twitterati retelling of the personal computer evolution, and today's millennials are completely fixated on Jobs and Apple, ignoring most of what was really happening in the 80s with home users.
> how much public hagiography is made over the Mac when almost no middle class family I knew of could afford it,
Part of the late '80s/early '90s revenue strategy for Apple was to sell into the educational market. The people who fondly remember the Apple computers of this period do so not because they had one at home, but because many of them were young children at this time, playing games on those Apples in school computer labs.
Well before there were Twitterati, the Silicon Valley glitterati fell in love with the Mac as a concept: a computer designed from the ground up to be easy to use, present a single, consistent interface, and be an appliance with minimal cognitive engagement from the user. Like a Yoko Ono piece, the vision was the product -- even if the actual hardware was lacking and expensive. And in the Mac conceptual world of the time, generating interest in computers and programming was seen as an anti-feature. You shouldn't have to be interested in those things to leverage the full power of the Mac, and if large numbers of people were getting interested in those things, the wrong things were being optimized for. Programming is just a job, and computers are just a tool to enable you to do your real work. Such was the thinking of the day.
The Amiga was an entirely different class of machine, designed by and for engineers, and it was a bit rougher around the edges UI-wise but it did far more in terms of real, concrete advancements in the state of the art.
AmigaOS was tremendously powerful, but Macintosh System (as it was called then) had much more UI polish and could be operated with one mouse button (this was important!). The official programmer's reference manual, Inside Macintosh, contained strict rules for how an application should look and behave. By contrast, on the Amiga, some great UI frameworks existed but they looked rougher and a lot of people seemed to roll their own UI and play by their own rules anyway. To me this was part of the Amiga charm, but it was inimical to the vision of computing Apple was selling.
I didn't realize just how weak the Mac APIs were for building actual applications until I tried writing one. You NEEDED a framework like PowerPlant to contend with the very primitive primitives Apple supplied. And even then, you didn't get nice things like preemptive multitasking.
Applications on the Amiga all had different interfaces because intuition.library allowed infinite possibilities, since it only implemented graphic primitives on top of graphics.library.
I'm an Apple guy now, but I will never get used to a one-button mouse - it's too limiting, especially when one comes from UNIX, where three buttons are phenomenally useful: I love marking with the mouse and pasting with the middle button - it's the best. I cannot figure out why others haven't implemented that - it's so natural and intuitive - and I love that I don't have to explicitly cut and paste; marking is enough.
The Apple ][ was the school computer and the upper-middle-class computer. Commodore, Atari, Sinclair, Radio Shack, and TI (once the TI-99/4A went on sale) were the computers that introduced computing to a whole generation of families without the income to afford Apple. It's telling that both Atari and Commodore outsold Apple for a lot of years.
It's a shame that both Commodore and Atari forgot what their niche was.
You nailed the description. At my school they had one Apple ][ and this infuriating rule that only kids who had an Apple computer at home could use it. Us poor kids had to use TRS-80 model 3s. I carried an anti-Apple grudge for years and years because of that policy.
But they were expensive, and the teachers didn't know how to fix them. At my school it was a similar setup: access to the computer lab was guarded like the crown jewels, and the one guy who could program never said a word to me. I got a 486 the year after and at that point couldn't care less about the old boxes we had at school. Pity they didn't even touch on programming; it was all word processing and bad computer art (easy to mark; the winner got to print theirs out in colour!).
I get the feeling my school age self would have been filled with a bit of rage at the teacher and the students who got to use the Apple computers. The TRS-80 Model 3 wasn't bad, but it wasn't exactly the funnest machine.
Well, I would guess that would teach a lesson, but probably not the one they wanted.
I'm also saddened when people omit mention of Sir Clive Sinclair's ZX Spectrum, another affordable little wonder that was responsible for bringing the other half of the world into computing. :)
That's because the Sinclair ZX Spectrum's only claim to fame is its rock-bottom pricing: it did not have any revolutionary hardware or operating system like the Amiga. It couldn't do things the competition wasn't capable of. The only thing the Spectrum competed on was price.
I actually had a hand-me-down C64 as a kid, so I couldn't agree more. It's a bit bizarre; the C64 and Apple II were around at about the same time, but the C64 had more of, well, everything, for a cheaper price. It sort of baffles me that Apple even stood a chance against them.
Really, I guess the Apple ][ was a platform rather than just a single machine.
So you've got the original Apple ][, which the C64 handily bettered with the benefit of five years and super-aggressive cost cutting. But you've also got the Apple IIgs in that family.
I think I'd see that as a much better machine than the C64 (but obviously much more expensive too).
Eh, your memory isn't as wrong as you may think. The C=64 was released during the Apple II+/Apple IIe era, and frankly, both of those machines were barely improvements. To give you an idea, the biggest improvement offered by the IIe was... support for lower-case characters.
The Commodores genuinely were something special for the era.
Keep in mind that the Amiga in 1985 was hard to get, due to production problems.
Reading the specs doesn't give you much of the story. As the article discusses, around 1986 software availability became the prime driver of platform sales. That meant taking a different approach to designing hardware. It takes a long time to figure out how to work with custom chips well, which is why the Amiga is still popular in the demoscene, but it was also a contributing factor in why there was not as much software available when it came out.
Macs and PCs are pretty boring by comparison. A CPU, some memory, interrupt controllers. But they had software. Macs had educational discounts back then (and still do, but not quite as dramatic).
> Commodore gets the short shrift in the Twitterati retelling of the personal computer evolution
Isn't that just because they had no actual impact on the industry? The Apple II was first as a mass consumer microcomputer (they even showed a prototype of it to Commodore years before the PET, when Tramiel still thought calculators were the thing). The Mac inspired every popular GUI that came after it (they all look more like the Mac than Xerox).
You could say that Commodore inspired a generation since their machines were so cheap and everyone had one, but you could also say that about Dell and Gateway 2000. The Amiga had amazing hardware, but so did the Sharp X68000. Still, the whole concept of proprietary chips that software had to be written exclusively for has never had a long-lasting impact on the PC space.
Of the PCs and smartphones we're using today - how much of it can be traced to Commodore? The price? The method of vertical integration maybe?
As Apple fans like to say, being first isn't always important, is it? The Apple II may have been first, but the Commodore 64 was both more affordable and more widely adopted.
Apple computers weren't personal computers, because precious few people owned them at home; they were more like time-share systems you got to use at school labs or the office.
What impact did Commodore have? An entire generation of engineers who went on to work on graphics software and hardware you use today were hacking C64s and Amigas as kids.
Do you think Linus Torvalds got started on Commodore hardware or Apple? You can directly trace a lot of the modern hacker ethos of the internet to the kids who grew up in the 80s on non-Apple hardware.
If you want to credit long-lasting inventions that are part of software and hardware today, you can look at Alan Kay and thank him for Smalltalk (which Brad Cox based Objective-C on).
Or you can look at Kernighan, Ritchie, and Pike.
Much of what makes modern PCs and smartphones what they are is invisible. There would be no iPhone without those inventions; no matter how important and revolutionary you think capacitive touch interfaces are, they stand at the top of a deep, deep pyramid of inventions and innovations that did not come from Apple and, annoyingly, often aren't credited in the historical retelling and hagiography.
A general purpose operating system. As a business graphical OS it was lovely to develop for. It made sense with a delightful conciseness that Windows and X could only dream of. Executables of equivalent capability could, quite literally, be an order of magnitude smaller.
There were few warts: AmigaDOS, because of its BCPL roots (it was an extremely last-minute addition when the planned CAOS never materialised), and icons, which were a pain to work with. Thirty years of working on other things and I still believe they got 95% just so. The plug-and-play of Win 95 was pathetic compared to AutoConfig (IIRC that's what it was called) in Zorro 1. MFM and IDE hard drives compared poorly to SCSI on its own DMA channel, etc., etc. Stuff that took decades to arrive on Windows.
As for a reboot: I can imagine an Amiga-like OS experience on several platforms, but hardware? I find it difficult to even imagine how something could be the quantum leap that the Amiga was compared to everything else on the market under $50k.
It's always fun when my old comments get dug up...
Actually, recently I've started thinking about what it would take to create a toolchain to do the minimum needed to provide the pieces of AROS (for the uninitiated: an AmigaOS "reimplementation", though it goes beyond the original in some respects) that might make sense on Linux, and to provide a compatibility layer to make it work.
AROS itself can run hosted on Linux, but it doesn't integrate well with Linux apps. Quite a lot of AROS relies on relatively small subsets of the AmigaOS API, though, and it'd be a fun experiment to e.g. bring datatypes to Linux, possibly combined with a FUSE filesystem as a fallback to do on-the-fly conversions for apps with no support.
I'd love to see if some of those things could gain traction if suitably modernized.
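The core of the datatypes idea can be pictured as a registry that identifies content by its magic bytes and routes it to the right decoder, independently of any one application. A toy Python sketch of that dispatch step (the table and names here are invented for illustration, not the real AROS API):

```python
import struct

# A hypothetical magic-byte registry in the spirit of AmigaOS datatypes:
# identify content by what it *is*, not by its file extension.
DATATYPES = [
    (b"FORM", "iff"),        # EA IFF 85 container (ILBM images, 8SVX audio, ...)
    (b"\x89PNG", "png"),
    (b"GIF8", "gif"),
]

def identify(data: bytes) -> str:
    """Return the datatype name for a blob, or 'unknown'."""
    for magic, name in DATATYPES:
        if data.startswith(magic):
            return name
    return "unknown"

print(identify(b"\x89PNG\r\n\x1a\n..."))   # -> png
print(identify(b"FORMxxxxILBM"))           # -> iff
```

A real implementation would hang a decoder function off each entry; the FUSE-fallback idea above would then call that decoder on read to hand legacy apps a format they understand.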
I am thinking about what would make something revolutionary today. The only, but very important, thing I can think of is that software today has no time constraints: all interfaces feel sluggish at some point.
What if we could make an OS with constraints, or an app store with a vetting process, or both complementing each other, to the effect that:
- A widget pressed or touched or interacted with could always be trusted to respond in time - or fail in an understandable manner.
- No launching screens on touch interfaces suddenly being sluggish.
- No waiting on apps that are downloading and installing and can't be used during that time. (Solved by installing updates quietly in the background.)
- No stutter or slowdowns, ever, no audio lags, ever.
The main thing is that whatever you are interacting with must never feel sluggish, any more than the water flowing out of a faucet starts to lag or freeze and unfreeze suddenly. The interface should feel so solid and "real" that if it stuttered you would be as shocked as if a thrown ball stuttered in mid-air.
- Give GUI code very high priority. This will have to involve putting some intelligence into GUI code, or the interface will appear unresponsive or do strange things when the underlying IO or network is slow.
- Focus on determinism and time budgets, not raw performance throughput.
- Vetting of applications.
- Constrain apps to hard RAM budgets.
- IO budgets for apps?
- Have apps allocate and bid for network performance and available bandwidth.
I have a feeling much of this would not need a ground up rewrite. Probably Android or Linux could be used as a basis for such a system.
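A minimal sketch of the "time budgets, not throughput" point: give each UI frame a fixed deadline and flag anything that overruns it. The 60 Hz budget and the helper name are assumptions chosen for illustration, not part of any existing system.

```python
import time

# Hypothetical per-frame deadline: at 60 Hz each frame gets ~16.7 ms.
FRAME_BUDGET = 1 / 60

def within_budget(work, budget=FRAME_BUDGET):
    """Run one frame's worth of work; report whether it met its deadline."""
    start = time.monotonic()
    work()
    return time.monotonic() - start <= budget

print(within_budget(lambda: None))             # a trivial frame fits easily
print(within_budget(lambda: time.sleep(0.05))) # a 50 ms stall blows the budget
```

A real system would act on the overrun (degrade rendering, preempt the offender) rather than just report it, but the measurement is the same: a deadline per interaction, not an average over many of them.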
The big lesson from AmigaOS here is to thread everything and make message passing a cheap and easy mechanism, but also to make developers develop on, or at least test on, very low-end hardware.
Even a simple button involves half a dozen threads in AmigaOS between the drivers, the handlers "baking" raw events from the drivers into something higher level, the Intuition thread processing the button presses into events related to a specific button etc.. It affects total throughout but increases responsiveness.
I think that if the OS provides responsiveness, and some key apps do, people will demand it. That is what happened with AmigaOS. You didn't get away with making a system sluggish because you'd get compared to apps that were extremely responsive.
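Exec's message passing was essentially a pointer handoff: link a message onto the receiving port's list and signal the owning task, with no copying anywhere. Here is a rough, portable sketch of the event pipeline described above, with Python threads and queues standing in for Exec tasks and ports (all the names here are hypothetical, not the real Exec API):

```python
import threading
import queue

# A "message" is just an object; putting it on a port hands over the
# reference. Nothing is copied, much like Exec's PutMsg()/GetMsg().
class Message:
    def __init__(self, payload):
        self.payload = payload
        self.reply_port = queue.Queue(maxsize=1)

def input_handler(raw_port, intuition_port):
    # "Bake" raw events into higher-level ones, then pass them on.
    while True:
        msg = raw_port.get()
        if msg is None:                  # shutdown sentinel
            intuition_port.put(None)
            return
        msg.payload = ("BUTTON_PRESS", msg.payload)
        intuition_port.put(msg)          # same object, no copy

def intuition(intuition_port):
    while True:
        msg = intuition_port.get()
        if msg is None:
            return
        msg.reply_port.put("handled")    # ReplyMsg() equivalent

raw_port, intuition_port = queue.Queue(), queue.Queue()
threads = [
    threading.Thread(target=input_handler, args=(raw_port, intuition_port)),
    threading.Thread(target=intuition, args=(intuition_port,)),
]
for t in threads:
    t.start()

msg = Message((12, 34))                  # raw mouse coordinates
raw_port.put(msg)
print(msg.reply_port.get())              # prints "handled"
raw_port.put(None)                       # shut the pipeline down
for t in threads:
    t.join()
```

The key property is that `msg` is the same object end to end; that pointer-handoff cheapness is what made it viable to put half a dozen threads between a mouse click and the screen.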
IFF standards. In today's world it would be an unthinkably open approach, taken only by open source. Even more surprising, it came from a joint venture between Commodore-Amiga and EA! Every single graphics program understood IFF graphics, and via datatypes understood them in a standard way: saving, processing or reading. They were so prevalent and expected that you would be hurting your chances releasing something with a proprietary-only format. The same went for sound and no end of other things. Had the Amiga thrived, there'd no doubt be an IFF form in place of many of today's multimedia formats, with a standard OS-level library call to decode them, etc.
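The format itself is simple enough to walk in a few lines: a FORM header, then a sequence of chunks, each a 4-byte ID plus a big-endian 32-bit length, with bodies padded to an even size. A minimal reader sketch (BMHD and BODY are real ILBM chunk names; the file here is built synthetically):

```python
import struct
from io import BytesIO

def iff_chunks(data):
    """Yield (chunk_id, body) pairs from an IFF FORM."""
    f = BytesIO(data)
    form = f.read(4)
    (size,) = struct.unpack(">I", f.read(4))   # lengths are big-endian
    form_type = f.read(4)                      # e.g. b"ILBM"
    assert form == b"FORM"
    end = 8 + size
    while f.tell() < end:
        cid = f.read(4)
        (clen,) = struct.unpack(">I", f.read(4))
        body = f.read(clen)
        if clen % 2:                           # chunks pad to even length
            f.read(1)
        yield cid, body

def chunk(cid, body):
    pad = b"\x00" if len(body) % 2 else b""
    return cid + struct.pack(">I", len(body)) + body + pad

# Build a toy ILBM: a 20-byte BMHD (bitmap header) and a BODY chunk.
bmhd = struct.pack(">HHhhBBBBHBBhh", 320, 200, 0, 0, 5, 0, 1, 0, 0, 10, 11, 320, 200)
data = chunk(b"FORM", b"ILBM" + chunk(b"BMHD", bmhd) + chunk(b"BODY", b"\x00" * 5))

for cid, body in iff_chunks(data):
    print(cid.decode(), len(body))             # BMHD 20, then BODY 5
```

A RIFF reader looks nearly identical with `<I` in place of `>I`, which is part of why so much IFF tooling and experience translated directly.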
The conciseness of approach, necessary in a system providing proper multitasking in 256K, meant all the services other platforms placed at least partially in the .exe were usually in the OS. There were system libraries you could rely on, without the absurd version-dependent DLL hell of Windows. I'm sure had the Amiga persisted there'd be some version annoyance, but I can't imagine it reaching the stupidity of now.
Windows had far more than glue code in the exe. If you needed to use a file requester, accept messages, have a window that could be resized, etc., there was tons of OS-related code unique to each exe for all of that. Update the OS, and unless you update the source and rebuild, the exe will clearly and obviously look like the previous Windows release. Or, as was so often the case, buy the latest Office and the look is clearly of the next, unreleased Windows. On the Amiga, nearly all that rubbish was external: set up your structures, call the API, get woken up when there's something you need to care about. If you updated the OS, all your window chrome, file requesters, etc. would be of the new OS. No ridiculous dependencies on v3.2.152 of MSVC.dll, with 3.3.x not being acceptable, meaning you end up with 12 different installed versions.
Only apps doing something clever went beyond the OS - like CygnusEd, with its hand-written assembler scrolling that remains, 30 years later, my benchmark for "fast and smooth enough" editor scrolling. Essentially nothing has yet matched it, though Sublime is probably closest, just without smooth scrolling. It was really difficult to come to accept - in some sense I still haven't - that I had to do so much of this OS housekeeping myself, each and every time, in every application, on other platforms. I often used to wonder what Windows was, in fact, adding, as it always seemed like I was doing everything myself. I gave up completely on Windows programming pretty quickly as a result. :)
AmigaDOS may have been a bit of a last minute, ugly addon, but in use it felt like a lightweight single user *nix. Proper filenames, priorities, comments, proper scripting and ARexx if you needed additional integration. Sure, it was far happier on a HDD, but what aside from DOS - more a program loader than OS - wasn't? :)
IFF actually lives on thanks to Microsoft and IBM in large part. RIFF is basically a little-endian IFF, and used for AVI, WAV and WebP among others.
What hasn't lived on, of course, is a concerted push for an ecosystem of tools for working with the underlying container format instead of the specific formats. This is what made the biggest difference on the Amiga: to a great extent, when coming up with a storage format, the question was increasingly "which IFF chunk types are suitable" rather than a question of designing a format from scratch.
PNG is a very similar format to IFF, though for some reason, despite having essentially the same needs and despite the PNG working group being aware of IFF, they chose to be incompatible with it.
Nerdier trivia: Erlang's BEAM VM emits compiled bytecode files in an IFF format. (Which is a strange choice, honestly, since they could have easily chosen to use a purpose-made executable-binary container format like ELF, which would have made .beam files more amenable to analysis using standard compiler toolchain tools.)
Didn't know that about Erlang. Reason is probably that ELF wasn't that widespread until the late 90s. When I started using Linux around 94, a.out was still common. It took several more years for ELF to become dominant.
Well, "improvement" is a bit of a loaded word, but AmigaOS's Exec kernel was a microkernel, and one of the few to pull it off without many problems. Compared to Linux's monolithic kernel, or the hybrid designs of Windows NT and Apple's Mach-derived XNU, it's actually something that's still a bit uncommon.
It's still experimental, but Redox OS is really the only newish OS I know of that uses a microkernel design.
I would argue that a major defining characteristic of AmigaOS was that it ran in the same flat address space and privilege level as the applications. As a result, message passing is lightning fast (just pass a pointer) and applications can easily obtain direct access to hardware. This has obvious downsides as well: an unstable app takes down the whole system, and there's no security whatsoever.
Hardware memory protection only came into being with, IIRC, the 68030 - edit: though it was available as the separate 68851 coprocessor chip for the '020. Early Windows was no different; limited memory protection first came with the 286, wasn't it? BSODs just as often, instead of Guru Meditations - at least a Guru let you into a remote debugger. :)
Without that, it wasn't hard or unexpected for an unstable app to take down the system. It used to happen reasonably frequently on 68000 Unix systems. Certainly for every time that happened you might expect a couple of caught core dumps, but before hardware protection it was still a wing and a prayer...
Indeed it was. And it took until Windows NT and the various PC Unixes to properly utilize it. Windows 95/98/ME were ostensibly running in 32-bit protected mode, but apparently could switch back to the old non-protected 16-bit mode to run old applications and drivers, compromising stability of the entire system.
Whatever Windows did, it was not good enough, and it was way too easy to crash 16-bit Windows from within applications. Yes, I remember now I heard about 286 OS/2 - hardly common, even though cool. I was thinking of MINIX, which IIRC could use memory separation on the 286 (but not on the 8088/8086). Still, you could only use 64-kbyte segments, limiting your data sets a lot. You could not use the "large" model of up to half a meg or so that you could in DOS.
Versions of MS-Windows before Windows NT used "cooperative multitasking", in which it was the responsibility of each process to yield CPU time to the next process in the task queue. Compare this with the "pre-emptive multitasking" employed by UNIX, OS/2 and AmigaOS, in which an interrupt causes the OS to save registers, stack pointer, etc. and transfer control to another process (if needed) after each quantum.
If a Windows 3.1 process failed to yield, it could result in a nonresponsive OS. On Linux, an abusive process would have to try a bit harder to take down the system (fork bomb, hog a bunch of ram, etc). On AmigaOS, a process could just overwrite part of another process or the OS itself to cause a crash.
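The cooperative model is easy to caricature with Python generators standing in for Windows 3.x tasks (a toy sketch, not how Windows actually dispatched): the scheduler can only regain control when a task chooses to yield.

```python
def polite(name):
    while True:
        yield name            # do a short burst of work, then yield

def scheduler(tasks, steps):
    """Round-robin, cooperative: control only returns when a task
    yields. The scheduler has no timer interrupt to pre-empt anyone."""
    log = []
    for i in range(steps):
        log.append(next(tasks[i % len(tasks)]))
    return log

print(scheduler([polite("editor"), polite("clock")], 6))
# -> ['editor', 'clock', 'editor', 'clock', 'editor', 'clock']
```

If `polite()` ever entered a loop without yielding, `next()` would simply never return and every other "task" would starve - exactly the Windows 3.1 hang - whereas a pre-emptive kernel regains control on a timer tick regardless of what the running task does.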
RISC OS (Acorn) had its infamous "Abort on data transfer" (or "Abort on instruction fetch" if you branched instead of LDR/STR'd). And if you were especially naughty and chased a null pointer, you got "ofla" -- which was the contents of the first four bytes of memory!
I remember telling my computer science teacher how the floppy disk file system worked (a directory was a linked list of sectors, each one of which represented the head of a file, IIRC) and he refused to believe anyone would implement it like that due to the obvious perf issues.
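Taking that description at face value (OFS actually hashes names into a table of chained file-header blocks, so real chains were shorter, but each link is still a separate block), a toy model shows the cost: every hop down the chain is another floppy seek and sector read.

```python
# Hypothetical model of a directory kept as a linked chain of
# file-header blocks: finding a file costs one block read per hop.
class Block:
    def __init__(self, name, next_block=None):
        self.name, self.next_block = name, next_block

def reads_to_find(head, target):
    reads, node = 0, head
    while node is not None:
        reads += 1                 # each hop = one seek + sector read
        if node.name == target:
            return reads
        node = node.next_block
    return reads                   # not found: walked the whole chain

# Chain 20 file headers together.
head = None
for i in reversed(range(20)):
    head = Block("file%02d" % i, head)

print(reads_to_find(head, "file00"), reads_to_find(head, "file19"))
# -> 1 20
```

On a floppy managing only a handful of random sector reads per second, paying up to one read per file just to list a directory is exactly the perf issue the teacher refused to believe anyone would ship.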
The ease with which you could drop in new filesystem drivers is another one of those things that was great. Aside from the official FFS there's been a number of other filesystems even long after Commodore went under.
A terrible idea because they were backed into a corner. Had the third party producing CAOS delivered, there would have been no need for an insane timescale port of Tripos to become AmigaDOS. Then floppies wouldn't have got OFS which was a HDD filesystem hacked to fit in as little time as possible. dos.library would have escaped the horrible BCPL mucking about with BSTRs and BPTRs too.
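The BCPL residue was very concrete: a BPTR is a byte address shifted right by two (BCPL addressed 32-bit words), and a BSTR is a length-prefixed rather than NUL-terminated string. A sketch of the conversions against a simulated memory (the bytearray and helper names are invented; the shift and length-prefix rules are the real ones):

```python
# Simulated flat memory; BCPL pointers address 32-bit words, so a
# BPTR is the byte address divided by four (helper names invented).
mem = bytearray(256)

def to_bptr(byte_addr):
    assert byte_addr % 4 == 0      # BPTRs can only refer to
    return byte_addr >> 2          # longword-aligned memory

def from_bptr(bptr):
    return bptr << 2

def write_bstr(byte_addr, s):
    data = s.encode("ascii")
    assert len(data) <= 255        # length must fit the prefix byte
    mem[byte_addr] = len(data)
    mem[byte_addr + 1 : byte_addr + 1 + len(data)] = data
    return to_bptr(byte_addr)

def read_bstr(bptr):
    addr = from_bptr(bptr)
    n = mem[addr]                  # length prefix, no NUL terminator
    return mem[addr + 1 : addr + 1 + n].decode("ascii")

b = write_bstr(64, "DF0:S/Startup-Sequence")
print(b, read_bstr(b))             # -> 16 DF0:S/Startup-Sequence
```

This is why C code talking to dos.library was peppered with BADDR()-style shifts, and why strings coming back from the DOS side needed converting before any str* function could touch them.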
It's insane. All these 80s computers have silly fast response times, which put all modern machines to shame.
But, since arguably the Amiga is the only computer with a modern GUI and being super responsive, it really points out the absurdity of everything modern when you feel it.
It can't be gleaned from youtube videos, either. You must hold that damn old mouse in your hand and click something or drag a window. To the brain, there's zero latency. NOTHING. You ARE the computer. (I think that is one reason why it's so addictive, it's one of the truly cybernetic devices. My modern Mac comes close, but not quite. Scrolling on some phone apps come close.)
Having had (and loved) a BBC B, part of me wishes I had replaced it with either an Amiga or an Archimedes, rather than the 386 I ultimately got. They both seemed very much in the spirit of the Beeb, whereas the "IBM PC" did not.
A lot of history of the Amiga I was not aware of. I did not know they struggled so much in the US with the 500. I thought it was a success, and the struggles only came later after the 600, 1200, 4000, etc lost out to generic PCs.
Where I grew up in Norway, around 1990, everyone had or wanted an Amiga 500. Most of my mates had an Amiga, a few had a Nintendo NES. I did not know anyone who had an Atari ST. Nor a Spectrum which I think was more popular in the UK.
Ah, the memories of "acquiring" a bunch of games, going to a mate's house and hammering through them. The Secret of Monkey Island, Kick Off 2 and their ilk filled my early teenage years.
Later on, I progressed to the Amiga 1200 and started to use it more as a desktop, my first real ventures into programming and messing around with BBSes. Before I defected to a 486 PC...
Without the Amiga, I would not be the computer person I am today. With happy childhood memories.
I could have written the same comment. I bought a 486 PC after my Amiga 1200 but I've never really enjoyed the PC. Basically, I stopped playing video games and programming when I got the PC with Windows 95. What a boring machine compared to the Amiga. No joystick, no good programming environment (that I knew of), even the processor was hard to program compared to the 68000. Eventually I installed Linux and it got interesting again.
Edit: I also remember that so many demos and cracked games came from Norway and Sweden. I thought these guys were serious hackers, maybe because of the bad weather there!
For me it was kind of the opposite. I was a die-hard Amiga kid until around 1994 I think. Then I started to see all the cool games getting released on the PC. I would wish for games like Commander Keen and Duke Nukem. The VGA Sierra games. That was even before CD-ROMs took off.
It was a shame; the Amiga was so far behind by that point. It was so far behind that the PC could beat it using mostly software! Accelerator cards were crazily expensive for the Amiga ($1,500.00 AUD+ if I remember). The A1200 was too little, too late for me.
Getting my 486 sparked off probably one of the most exciting gaming times of my life.
Still insanely fond memories - I wrote a chunk of an adventure game engine in AMOS on the Amiga. It Came From The Desert II is one of my most fondly-remembered games.
Speaking of adventure games - the early Sierra games had their best engine implementations on the Amiga. AGI on the Amiga used the Tandy 3-voice version of the game music and I believe also supported a custom game palette (so for example Leisure Suit Larry 1 uses a better skin-tone colour on the Amiga than on the PC version). SCI0 games (think King's Quest 4, Police Quest 2) used instrument samples (which I believe were from the MT-32 version of the music for the game) to play probably the best music outside of the MT-32 PC versions.
SCI1 on the other hand, was a total mess. Slow to the point of barely playable and -- worst of all -- horrible graphics. I heard somewhere (can't remember where exactly) that SCI1 games did not take advantage of the Amiga's palette capability - the ENTIRE game was reduced to one colour palette shared across all rooms. The Amiga could do so much better, even if it couldn't keep up with the PC at that point; a good example being the port of King's Quest 6, which was done by Revolution Software and not Sierra. That port was praised. LucasArts games were also great - Monkey Island II was great on the Amiga (it's actually where I first played it).
It was frustrating. I loved the Amiga, but when I saw Wing Commander featured in ACE magazine in 1990 I realised that the PC was where the innovative games were coming from. And I wanted to be a game dev, so I had to follow where the industry was going. I hoped that Commodore would pull something out of the bag, but the A1200 wasn't enough. If they had done something like a PlayStation 1 but with a keyboard and OS for £500-£600, there might have been a chance.
What few people mention is how the price of computing suddenly shot up at the end of the home computer era. In my neck of the woods, the Amiga 500 was the "expensive one" - I had friends with £50 used ZX Spectrums. Suddenly we needed £1000 PCs if we wanted to stay relevant (and we absolutely needed to - "self-taught coder" was my and my friends' route out of the rural working class).
It was a very rough time for poorer nerds trying to make something of themselves. It coincided with the end of the "bedroom programmer" era; game developers started seeing themselves as media companies and demanding degrees. Then the web came along and swashbuckling expertise counted for something again, though I would have given anything to be an Amiga-era gamedev.
I had saved up 30000 Belgian Francs, just enough for an Amiga 500, when my father offered to add another 20000 on the condition that I'd buy an 8088-based PC with a whopping 30MB hard drive so that he could run LaTeX at home.(x)
It wasn't all marketing. The Amiga was expensive for the day. I had a C64 and wanted one badly, but as a poor college student I could only upgrade to the C128. The Amiga also competed against the cheaper Atari ST.
Businesses buy computers for the software and the Amiga did not have Lotus 123, MS Word, Multiplan, or Excel.
Microsoft after a certain point stopped supporting niche platforms. I had Flight Simulator and Multiplan on the C64, for example. The Amiga did not have either.
The Amiga did eventually find a niche where it was known to be the best solution but it wasn't a large enough market.
The original Amiga 1000 was certainly expensive. The later Amiga 500 was a lot more affordable, and that's what my parents got for me when I somehow managed to convince them it was the best choice.
My own impression for why the Amiga and every other machine of the era lost out to the IBM PC was that the PC was what virtually everyone used at work, so that's what they got at home.
Apple managed to grab the art/music creation and education markets, and stayed alive that way, while the rest of those early computers never really found their niche outside of games, for which the Amiga was arguably best suited. The Amiga was an early power gaming machine and had the potential to grab the art/music-making market from Apple, but few people other than techies really appreciated it or would make their computer purchasing decisions purely on that.
Working in many places during that era, the non-technical belief and perception was that Amiga was good at games therefore couldn't be good at business. Of course the non-techies were much more disproportionately the majority then. Trying to explain that didn't get you far.
You could spec up an A2000, released at the same time as the 500, with a SCSI HDD and the lovely long-persistence paper-white hi-res monitor for less than a clone. In the UK all the marketing was for the 500, and offices filled with Tandons, Dells, Amstrads and so on.
Same problem happened again with CDTV (and Philips CD-I) inventing multimedia before anyone had the first idea what that was supposed to achieve.
About the only thing MS did for the Amiga was Amiga Basic, which was bundled with AmigaDOS until v2 and was eventually surpassed by AMOS, GFA Basic and Blitz Basic.
There was a lot of good productivity software for the Amiga: Protext, Wordworth, Final Writer, Final Calc, Superbase, Professional Page (Gold Disk did a lot of good software, I wonder what happened to them?) but there was a kind of inferiority complex about being ignored by mainstream business. The Amiga magazines would often have to explain to their readers why they didn't use Amiga software to produce their magazine for example.
The C128 was a crazy Frankenstein's monster of a machine, though, with two CPUs (and the disk drive adds another), and all kinds of other expensive additions to support the CP/M mode. It even boots the Z80 first and checks for certain conditions before switching to the 8502, and in CP/M mode some BIOS calls switch between the Z80 and 8502, I believe to save ROM space by reusing IO code etc. across them.
It's a fascinating but totally bizarre machine, trying to satisfy several contradictory goals at once (be a business computer; be a better C64; be 100 percent C64-compatible) and in doing so doing none of them well enough, other than perhaps the C64 compatibility (which wasn't perfect, but was very close), and at too high a cost.
Every time an Amiga article comes up, I say the same thing. I want to put together a dream team and reboot it. :D Imagine if the Ferrari brand died and nobody used it ever again...that is how I feel about the Amiga.
[Edit: to be clear, I am aware of the existing stuff...what I would like to see is the Amiga used to reimagine what a computer is - new hardware, new OS, new "web browser" (that ditches current conventions) ... I am aware this is a pipe dream :) ]
Ah, I think about this often. Although I moved from the Amiga to PC in 1996, I've never felt at home on other platforms. I consider myself an Amiga exile :)
So far, I've settled on a few principles for my new Amiga:
- No x86. The x86 platform is like a boring, ugly dude who has taken steroids and growth hormone for 20 years. He's bigger and stronger than everyone else, but he's still boring and ugly. The new Amiga should be as fun to code in asm as the 68000 was.
- Multimedia as first class citizen. None of this '70s character-mode fetishism you get in the Unix world :) On the Amiga, everything knew you had a graphics chip, hardware sprites and stereo sound.
- Good hardware integration. Imagine that your GPU was as accessible as your CPU, that you didn't need to install a ton of crap from Nvidia just to program it. I used to experiment with the Amiga gfx hardware in assembly language, from Basic, in a couple of pages of code. I miss how accessible the full power of my computer was.
I'd have to spend some more time thinking about the OS. On one hand, it would need modernization with regard to security, networking, Unicode, USB etc. On the other hand it was a lot more ergonomic than Linux, with hardly any historical cruft, and I'd never want to lose that.
The closest thing I've found to the "Amiga feeling" was when I was experimenting with a Playstation 2 emulator, and spent some time reading the hardware docs. It had a similar setup of exotic graphics chips hanging off a fat DMA system. However, the Amiga was far more than a straight games console, and its custom hardware was more abstract and flexible than you'd find in most consoles. (Compare the Amiga to the supremely powerful, but rigid Sharp X68000 for example)
There is a new Amiga being developed. It is an accelerator board with a brand new 68080 processor. The thing is, being FPGA-based, it doesn't actually need to be hosted in a real Amiga any more. Therefore they are planning to make a standalone, essentially new Amiga. The Apollo 68080 Accelerator - New Amiga: http://www.apollo-core.com/index.htm?page=products
That is not a new "Amiga", that's just a nostalgic upgrade to an old Amiga computer.
A new "Amiga" would be something completely different than the old Amiga (which has long been surpassed by PCs with high-end graphics cards) or the current software/hardware model of a PC (which could be improved in many ways if you decided not to be a slave to current hardware and software standards).
I'm not sure what that means, really, because creating something new and useful is a hard (but not impossible) task. It really does take commitment in these days of software that is POSIX everywhere and (graphics) hardware that is sometimes interesting and high performance, but very buggy and effectively, probably intentionally, undocumented.
I mean, Amiga is still around...kinda? You can buy a copy of AmigaOS and run it on PowerPC hardware. There's also AROS, which isn't "true" Amiga, but is mostly source-compatible and definitely "feels" very Amiga-ey.
Honestly, though, now would probably be about the time for Amiga to make a bit of a comeback; with the advent of really awesome web applications, the actual operating system is becoming increasingly irrelevant to most users. If we could get the two big browsers out there (Firefox and Chrome) to make decent ports to AmigaOS/AROS, then I could conceivably see some of my non-tech friends using it without even fully realizing it.
Sadly, though, I think that's too little too late; non-technical people seem happy enough with their Chromebooks, and technical people will be hampered by the lack of software available for Amiga on current computers.
Indeed, Rebol was groundbreaking in so many ways. Hard for some to understand, but for those whose brains have receptors for it there is nothing quite like it. Working in other langs can be hard once you're used to Rebol, because some things seem so obvious that you can't see why other langs don't do it.
I'm part of Team Red, working on Rebol's successor (https://www.red-lang.org/). We have a number of old Rebolers, but also a lot of new faces, and we're having a great time carrying the torch forward. We still have a lot of work to do, but by the end of the year we should have feature parity with R2, not to mention all the new features Red has, like a compiler, reactive system, Android support, and native GUIs.
I remember really enjoying playing around with Rebol back in the day. Sadly, I think it was only open sourced after development ceased. The bug tracker that was set up looks abandoned, with nobody assigning tickets.
Amiga's greatest flaw was not having a deinterlaced video mode out of the box, prohibiting serious use by professionals (can you imagine spending all day looking at a flickering monitor?). Skimping on an MMU in the A2000 was also a disaster.
Note: I owned an A500, an A1200, even AmigaAnywhere in 2001.
VGA was still not common until years after it was introduced, though. My dad wrote and sold office software in that timeframe and most of his clients were still using machines without VGA and monochrome only.
I remember PC users still showing me EGA games into '89, because of how ridiculous it seemed to me that they had paid silly money for a machine like that.
DragonFly BSD's founder is Matt Dillon of Amiga fame. He took many concepts from his Amiga developer days and built a modern BSD using them, like its innovative approach to message handling and SMP.
Having enjoyed both the Amiga 500 and the Atari 520ST back in the day, I'm left wondering: how come that there never was a comeback of that form factor? Mechanical keyboards and small-factor computers (Mini ITX, NUC etc) are both popular, if someone launched a mechanical keyboard with integrated PC case on the back I think it may have a shot at being successful.
There are cases made in this style, and they are compatible either with "Amiga Reloaded" and similar Amiga-compatible motherboards, or with SBCs like the Raspberry Pi, but I couldn't find a similar product that can use standard PC components (except for the Kickstarter mentioned below).
There was one case with an integrated keyboard made by a designer in the UK and launched on Kickstarter a few years ago; this one could accept PC components, but it had laptop-style chiclet keys rather than mechanical switches.
Later, the same designer tried to launch a similar case, this time with mechanical keys, but the Kickstarter unfortunately did not reach its goal.
Oh wow, I didn't know that the composer of that intro (Richard Joseph) also composed the in-game music for Chaos Engine, and the title music for Sensible Soccer and Speedball 2, all of which are in my top 5 favorite Amiga games.
DOOM was just the final nail. Commodore had shot itself so many times in all sorts of feet by that time, DOOM was just a very obvious symptom - something new is coming on the horizon, and it's the Commodity PC.
Akiko wasn't that fast. Should have been direct chunky support, not a conversion chip. Oh, and faster CPU with plenty of Fast RAM.
Or perhaps a blitter capable of rasterizing triangles with affine texture mapping. A bit like PS1. Low silicon requirements as just addition is required. Nothing complicated like multiplication or divides needed. Of course the price is loss of perspective correctness.
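The "just addition" point is the whole trick: across a scanline, affine u/v coordinates change by a constant step, so after one divide per span for setup the inner loop is pure increments, whereas perspective-correct mapping needs a divide per pixel. A sketch of one textured span in 16.16 fixed point (the texture layout and names are illustrative, not any real chip's interface):

```python
# Interpolate texture coordinates across one horizontal span using
# 16.16 fixed point: one add per pixel per coordinate, no multiplies
# or divides in the inner loop -- the hallmark of affine mapping.
def affine_span(x0, x1, u0, v0, u1, v1, texture, tex_w):
    n = x1 - x0
    # Per-pixel steps, computed once per span; this is the only
    # divide (PS1-style hardware did similar setup per primitive).
    du = (u1 - u0) // n
    dv = (v1 - v0) // n
    u, v = u0, v0
    out = []
    for _ in range(n):
        texel = texture[(v >> 16) * tex_w + (u >> 16)]
        out.append(texel)
        u += du                        # addition only
        v += dv
    return out

# 4x4 checkerboard texture, values 0/1.
tex = [(x ^ y) & 1 for y in range(4) for x in range(4)]
# Span of 8 pixels mapping u from 0..4, v fixed at 0 (16.16 fixed pt).
print(affine_span(0, 8, 0, 0, 4 << 16, 0, tex, 4))
# -> [0, 0, 1, 1, 0, 0, 1, 1]
```

The PS1 did exactly this and wore the resulting texture warping as a trademark; the point above is that a late-era Amiga blitter doing the same would have been similarly cheap in silicon.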
Absolutely. But it was started way too late because it was a rushed stopgap rather than the original plan. Both AGA and Akiko were basically results of Commodore's multi-year failure to manage engineering properly and aim for manageable iterations. AGA would have been great if it had come a couple of years earlier. Instead, lots of resources were put into next-gen chips that never surfaced because they were way too ambitious.
One of the weird things that happened with the Atari ST and STE was the built-in MIDI interface. It got really popular in the music world for quite some time.
But no matter what, the component architecture of the PC was bulldozing its way to killing the Ataris and Amigas. Only Apple really survived that challenge, I think mainly because it had the education market somewhat cornered, but even then it would likely have died if it wasn't for the iPod.
Both Commodore and Atari might ( especially Atari ) have had a good chance in the home market of game consoles.... but no.
I was a major fan of the Amiga. It was a great community of users too. I remember meeting at a local user group that met twice a month at a library. Fred Fish disks were exchanged, art projects shown, music played.
But the overall feeling was that Commodore was pretty much ignoring their fans, and not promoting the computer at all. Even at an Amiga World show in Chicago, Commodore was a no-show. WTF?
I have Commodore: The Final Years right next to me right now, as vacation reading, and Dale Luck's attempts at getting the Commodore board to listen to criticism from engineering is an important thread in it. I wish he'd succeeded in getting more influence.
I still have my A1200, with a VGA adapter so I can connect it to my LED monitor, a 68040 accelerator card, a PCMCIA network card, and a collection of internal IDE spinning-rust HDDs I could fit into it if one of them still works. Powered by an adapted ATX PC PSU.
Haven't powered it up for a few years, and the internal floppy drive, which was an upgrade to a PC format (and I can't remember the make or model of the upgrade), is busted.
Still, I have the AmigaOS 3.9 CD, I might get the old girl running again sometime.
The Amiga will always be an unfulfilled fantasy for me. Growing up, I lusted after its screenshots and specs in magazines, the forbidden fruit we could not afford, until something entirely different, and arguably better, came along (the PC.)
Although I can get a taste of it through emulators, I still wish I could have lived the experience of unboxing an Amiga when it was the hottest thing around. :)
As far as I know, Freescale/NXP still makes "real" 68K chips in the form of MC68SEC000 (3.3v-capable static CMOS version). Those will probably be discontinued within the next few years since they've been "not recommended for new designs" for a while. The full-fat 68HC000 (with vectored interrupts and 6800 bus compatibility, needed for compatibility with a few older systems) was discontinued in 2012 because Freescale shut down the only fab that still made it.
The bouncing ball is from the "Boing" demo, written in 1985 by two of the people who wrote much of the Amiga's OS. GLBoing is a clone of it from much later - at least 1992, what with that being the year the first version of OpenGL came out.
The first 28 lines of your first link are a comment block saying just that.
The team that created the Amiga was a lot of the team who created the Atari 8-bit computers, and the same can be said for those machines. (Atari did go on to make the 5200, not unlike how Amiga begat the CDTV and CD32.)
That feels a bit like saying that the Tesla Roadster should have been a British style electric milk float. They both have electric power trains but it misses the point of what made the Roadster so interesting.
As far as I understand it, the chipset was originally designed to support a game console and they only pivoted because of the video game crash.
In a sense it's more like saying that what was originally designed to be an electric milk float should have been an electric milk float. That's also not entirely fair, because I agree that the Amiga makes more sense as a computer. As a game machine it was soon beaten by more specialized hardware that did exactly what action games of the time demanded really well, and nothing else.
Walked into my hometown's local book store. Played Marble Madness on the Amiga until they almost threw me out of the store. Was sold on the Amiga and got one. Also sold on Deluxe Paint and the computer image of Tutankhamun.
Side story: John Draper, the legendary phone phreaker mentioned in the article, is known as Captain Crunch. Draper found that if you covered a hole in the toy whistle included in packages of Cap'n Crunch cereal, it would produce 2600 Hz - hence the nickname. 2600 Hz was the in-band signal telecom companies used to indicate that a phone call had hung up and billing should stop. Thus one could call for free by emitting 2600 Hz from a device called a blue box, back when long distance calls were very expensive.
Another famous use of blue boxes: their sales helped fund another computer company founded by phreakers - Apple. Steve Jobs and Wozniak made their first money for building Apple computers by selling blue boxes. Or at least so the legend goes.