Instead of typing a sequence of commands on the computer keyboard, the user merely points to tiny "icons" or commands on the screen by sliding the "mouse" (a plastic control box the size of a cigarette pack) on the desktop beside the computer. As the mouse rolls, an arrow called a cursor moves across the screen. To erase obsolete information, for example, the user moves the mouse to point first at whatever is to be thrown away, and then at an icon in the shape of a tiny trash can...
Sapphire is too brittle to be of much use in a phone screen. It would increase scratch resistance but be more prone to breaking than chemically strengthened glass. It works for watches because they can make it thicker to compensate.
I remember trying to teach someone to use a mouse, and having to move their arm back down onto the table. See, you can't hold it in mid-air; you need to slide it across the mousepad. (And in this case it was specifically required - I believe it was a Mouse Systems mouse, with an LED and a patterned metal mousepad.)
Those doing customer support for consumers have some of the best stories. In the '80s, I heard one story about a bleeding-edge grandmother who was having troubles operating her newly purchased PC. One of her troubles turned out to be that she had placed the mouse on the floor and was trying to operate it with her foot. Why would she do that?! The mouse looked a bit like the foot control pedal on her Singer sewing machine :o
I feel like this has somehow passed into the realm of urban legend as I've since heard a few similar stories repeated back to me but I used to work in phone support quite a ways back and we had a recorded call that we used to pass around to all new employees. It was a new computer user trying to figure out how to change the background of a document (I believe it was a web page he was trying to build in FrontPage but it could have been a PowerPoint).
The call went something like this:
[Call center introduction and customer response]
Employee: "So what seems to be the problem, sir?"
Customer: "I can't find where to change the background of my page. I've looked at all the buttons but I don't see a 'background' or 'color' button."
Employee: "I completely understand. Sometimes these types of things seem like they're hidden but it's pretty easy."
Customer: "It can't be that easy. I've been looking for over an hour."
Employee: "Well, it's in a context menu so they're kind of hidden. You can access those by putting the cursor anywhere on an empty area of the screen and then just right click on the mouse. It should pop up a little menu where the cursor is."
Customer: "Ok. Just give me one second. I want to make sure I do this right. I'm going to go get a pencil."
Employee: "Of course, sir. I'll repeat it again when we've done it once so you can make sure you write it down correctly."
Customer: "Thanks." [A few moments of silence] "Ok... so I move the cursor to a spot on the screen... [shoulder brushes receiver as he writes] ...C-L-I-C-K and then the computer will understand that as a command and give me a menu? Is that right?"
Employee: "That's correct. After you right click, you should see a little box pop up next to the cursor with a list of items. One of them will be 'Properties'."
Customer: "Do I have to wait? I did that and nothing has come up yet. How long do I need to wait for this menu to come up?"
Employee: "It should come up right away after you right click. Did you right click on the mouse already?"
Customer: "Yes, I did. I mean... my handwriting is not the greatest but hopefully the computer can still understand it."
The customer had used the pencil to literally write the word "CLICK" on the mouse. He didn't know that the left and right mouse buttons did something different. He thought they were for left-handed or right-handed use.
The Lisa was a good machine, but there was one big problem. Motorola had come out with the M68000, but not the MMU for it. 680x0 MMUs were years late, and the first one was terrible. The Lisa had a real OS, and needed an MMU. That had to be built out of smaller parts, which increased the cost enormously.
There was also a major bug in the M68000 - instruction backout didn't work. That was fixed in the 68010. On the 68000, a page fault couldn't be handled properly if the faulting instruction used an auto-increment addressing mode, because the register had already been bumped. So the Lisa compiler had to be dumbed down to not use that feature, slowing down execution somewhat.
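Not from the thread - a minimal Python sketch (all names invented, not real 68000 emulation) of why an auto-increment instruction like `MOVE.B (A0)+,(A1)+` can't simply be re-run after a page fault on the original 68000: the source register's side effect is committed before the destination write faults.

```python
# Hypothetical illustration of the 68000 instruction-restart problem.

class PageFault(Exception):
    pass

def move_autoinc(regs, mem, valid_pages, page_size=256):
    """Emulate a MOVE with post-increment: the source register is
    bumped *before* the destination access can fault."""
    src = regs["A0"]
    value = mem[src]
    regs["A0"] = src + 1           # side effect committed early
    dst = regs["A1"]
    if dst // page_size not in valid_pages:
        raise PageFault(dst)       # faults after A0 already changed
    mem[dst] = value
    regs["A1"] = dst + 1

regs = {"A0": 0, "A1": 512}
mem = {0: 42}
try:
    move_autoinc(regs, mem, valid_pages={0})  # destination page not mapped
except PageFault:
    pass
# Re-running the instruction after paging in the destination would now
# read from the wrong source address, because A0 was never rolled back:
assert regs["A0"] == 1
```

The 68010 fixed this by saving enough internal state on a fault to back out or resume the instruction; on the 68000 the only safe option was what the Lisa compiler did - avoid the addressing mode entirely.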
If Motorola had fixed those problems sooner, the history of personal computing might have been very different. Intel's x86 machines, with their 16 bit address spaces, might have gone nowhere on the desktop.
Hence the cost-reduced Macintosh - no MMU, no memory protection, no CPU dispatcher. Also no hard drive. The original 128K Mac was a flop commercially. Not until memory cost came down and Apple got a hard drive into the product did it sell successfully. The IBM PC had a hard drive earlier, which got them going in business use. The floppy-only Macs were incredibly slow.
Certainly in the UK (special case?), that's not the case. We certainly had Apple ][s at school and 6th form in the UK as the main machines - later replaced by BBC Model Bs - and back then there were summer camps doing computing for enthusiastic youngsters. Apple was again the computer of choice in the early days, to be replaced later by the ZX Spectrum.
Really? Throughout the 80s, it was all Sinclairs, Commodores and Amstrads (certainly in the home) and even before the ZX Spectrum was the ZX81 which at £100 (or thereabouts) was the first home computer of choice for many, many people. I knew a number of people with CBM VIC-20s, a handful of people with Acorn Electrons.
Of course these were after the Apple 2 so maybe you're correct but I don't think even I saw my first Apple computer until 1989, and that wasn't even in the UK (I emigrated to the US for a short while that year).
My secondary school had a bunch of Research Machines, until they were replaced by BBC micros.
Not sure I agree. Just my anecdotal data of course, but I don't remember seeing an Apple II in UK schools in the 80s, but definitely BBC Micro/Master and a few Archimedes later. This was in 2-3 North West state schools, so there might have been regional differences.
Edit - oh yes, and about the ZX Spectrum... I can't imagine anyone willingly changing from something else to that rubber keyboard! Yuck.
> oh yes, and about the ZX Spectrum... I can't imagine anyone willingly changing from something else to that rubber keyboard! Yuck.
The ZX Spectrum had tons of games (many of them quite creative, many weird), all kinds of software (limited, of course, but even a Lisp) and several magazines dedicated to it. Many of my friends owned one, so it was much more fun than owning a more obscure computer, no matter how much better its keyboard was. Remember, there was no internet.
I still have nostalgic conversations with friends from that time about how magic the ZX Spectrum world was.
Likewise here too. In fact all through the 80s and 90s I only knew one place that owned an Apple computer. Everyone else was running BBC / Acorn, Sinclair, Amstrad, Commodore 64, Atari, Amiga, Dragon, and, much later, IBM-compatible PCs.
The 380Z was the overall 'king of the hill' pre-BBC. Yes, some schools had Apple IIs, some even IIes, but everywhere had at least one 380Z.
Had the 'PC' not won, I suspect some of the DOS compatibles that the UK produced would have been more popular - specifically, Apricots were doing fairly well with the F1 and XEN in the 83-87 era, when the PC hadn't quite dominated yet.
Of course this is always going to be anecdotal, but I never saw them in the north-east in the 80s either. The predecessor to the BBC Micro that I remember was the RML 380Z. I was messing with computers at school and elsewhere throughout the 80s, and to this day I've never actually seen a working Apple II.
I do remember reading about the Lisa and thinking it looked amazing.
Again anecdotal, but my experience was that state schools had BBC Micros but private schools and many technology-focused colleges had Apple IIs. Apples were very expensive. I remember helping out at a technology college night class where people were being taught Logo programming (and later Pascal). It was the first time I'd seen more than one Apple II in a room at a time (this class had a dozen or so, two people to a machine).
I could have typed that better for sure. I had meant to say that Apples were not common anywhere in the UK but the places I had seen some were in tech colleges and private schools which it seems would fit with your experience too.
Hmmm, this would have been before the BBC launch. This was a North London comp which received its first computer, an Apple II the month I finished my GCEs, so that would have been summer 81. When I went to a separate 6th Form college they also had Apple IIs with BBC Bs following soon after.
The BBC Micro didn't really get into mass production until 1982.
Our school had an Apple II in 1982 but they were quite expensive, so people mostly got Sinclair machines or Commodore 64s for home use. When I went to 6th form they had a few BBCs and an old Commodore PET. I don't think the Apple IIs were common, but they were certainly around.
The way it's achieved may not matter much with a 4 GHz multi-core CPU running a multitasking OS, but having to deal with 16-bit pointers and segmented memory in a 4.77 MHz 8086/8 was a huge pain I felt in the flesh.
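For anyone who never had to feel that pain: a minimal sketch (not from the thread; illustrative names) of 8086 real-mode segmented addressing, where a "pointer" is a 16-bit segment plus a 16-bit offset combined into a 20-bit physical address.

```python
# Illustrative model of 8086 real-mode address arithmetic.

def physical_address(segment: int, offset: int) -> int:
    # physical = segment * 16 + offset, wrapped to the 20-bit address bus
    return ((segment << 4) + (offset & 0xFFFF)) & 0xFFFFF

# Many different segment:offset pairs alias the same physical byte...
assert physical_address(0x1234, 0x0010) == physical_address(0x1235, 0x0000)

# ...and a single 16-bit offset can only span 64 KiB, which is why any
# data structure bigger than one segment forced "far pointer" juggling.
assert physical_address(0x2000, 0xFFFF) - physical_address(0x2000, 0x0000) == 0xFFFF
```

Because pointer comparisons and arithmetic had to account for both halves, compilers of the era grew near/far/huge pointer qualifiers, and crossing a 64 KiB boundary mid-array was a routine source of bugs.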
You would also need to update the compiler to use the MMU and instruction backout properly, and recompile all the software to take advantage of it. It would also probably require some modifications to the OS and application code to get much benefit.
The 68000 and 68010 were 32bit CPUs, released in 1979 and 1982 respectively. Intel didn't have a 32bit x86 CPU until 1985 when it released the 386.
At the time though, the 68K was regarded as the best general purpose CPU you could get and was sort of a "default choice" for building any reasonably high performance systems. Unix workstation manufacturers like Sun and SGI originally built their systems on the M68K platforms before their respective RISC architectures developed and matured.
With that said, it's totally reasonable to make the hindsight argument that Apple should have developed the IIgs line further, with its 16-bit 65C816, and then reevaluated the CPU landscape in the mid 1980s, where you had more mature 68K chips, 32bit x86 CPUs, and a plethora of RISC options. In that world, I could very easily see the Macintosh make its debut on a RISC or 386/486 platform.
Speaking of the IIgs: It's rumored that Steve Jobs requested they downclock the CPU (it only ran at 2.8 MHz) to prevent it from seeming faster than the Macintosh.
I worked on a IIgs at the very beginning of my career, porting a word processor that had originally been developed for the Amiga and subsequently ported to DOS and Mac. Having worked on the Mac previously, I was a bit surprised that the IIgs OS was actually more advanced in some ways. Color, for one, but hierarchical menus are another that sticks in my mind. It was slow and weird, but the IIgs was a remarkably modern update to the already-venerable Apple II series.
Because the 8088/8086/80286 had their own bugs, and 68K was a better architecture than x86. No segmentation, orthogonal registers and instruction sets, etc. This superiority probably peaked around the 68020, which started losing out to the 80386 the next year, and after that Motorola never quite recovered. The Pentium, which I imagine most here think of as a starting point, was really more of an endpoint in a complex battle between Intel, Motorola, NatSemi, and a whole bunch of RISC alternatives.
My dad and I found one at a garage sale when I was a kid. I bought, repaired, and resold mostly 8 bit machines, but found a big bunch of Apple Lisa stuff for not a lot of money, maybe $150. I didn't really know much about it, but it was interesting and I figured I could get my money back out of it. It had the computer, three external hard disks (2x5MB and 1x10MB), and some software and other accessories.
This would have been around 1988-1989, and the Mac had been out for a while; this seemed like a quaint old computer at the time. Since the Lisa wasn't quite Mac compatible, there wasn't really anything you could do with it...the software it had (early variants of stuff like MacWrite and MacPaint and such, called, I think, LisaWrite, etc.) was what you got.
When I unboxed everything I found the receipts from when the original owner bought it all new. He'd spent something like $20,000+ on the whole setup. Computer was $10k, and each of the hard disks was several thousand dollars.
I sold the whole setup for about twice what I paid for it (I seem to recall about $300, but it's been a long time, and it wasn't super memorable...I bought and sold a lot of weird old computers back then), after cleaning it up and testing everything and tinkering with it until I was bored (I was a Commodore kid with a C128D and saving up for my first Amiga...Apple stuff was just a curiosity, not anything I wanted for myself).
Though I wasn't super into it at the time, it's one of the few things I kinda wish I still had all these years later. It has real historical significance that I didn't really appreciate at the time.
I think the difference is the productivity gain. In those early days, you bought a computer or you had to use a calculator and paper. It's the difference between automating a dentist's or realtor's office and not automating at all.
Now, buying a new computer results in marginal productivity gains because likely it's just replacing another computer and maybe improving things or maybe just adding a touch-bar.
Bill Machrone, long-time editor of PC Magazine, coined a "law" that the computer you wanted always cost $5,000. Which held reasonably true for a rather long time. It broke down maybe about 10 years ago although, as with the latest Mac, you can still get to $5K without too many gymnastics for pro video work or high end gaming.
You can configure a desktop for way cheaper than that and it can run 90% of modern games. Just don't get the latest generation top of the line Nvidia/AMD card, either get the previous top of the line one or a decent mid-range one (check benchmarks when choosing).
Don't get the most expensive RAM or CPU, you lose 10% real performance but you halve your budget.
When I was in college in the mid 2000s my senior project needed a single board computer. We spent $500 on a small PC/104 board with a 100MHz National Semiconductor Geode (x86 compatible) processor. That didn't include any RAM or storage. I really wish the Raspberry Pi had been available back then - especially the Raspberry Pi ecosystem, which makes interfacing with stuff so much easier. We interfaced a small LCD I found at a surplus shop and it took us weeks to get it working. Now you can buy a touch screen LCD for $30 and there are libraries to get it working in about 20 minutes.
> Computers have become a great deal cheaper in the ensuing decades.
By the way, it turns out people think "automation is replacing jobs in manufacturing" because they've misread the data showing them just how much cheaper computers have gotten since then, and therefore how much more computer one factory employee can make.
Most people couldn't. And businesses which could afford them wouldn't. Which is why Lisa bombed.
But Lisa's pricing made the Mac more appealing. It wasn't exactly affordable, but you were getting maybe a third of a Lisa - including the new GUI, which was obviously the future [tm] compared to the Apple II - at less than a third of the price.
Even at the prices being charged, the Mac and PC were competitive with previous standards, and affordable enough to have a sizeable market among affluent middle class users.
They offered more than the old S-100 boxes did, for the same or less money. And they were much cheaper and more "personal" than industrial minis like the PDP-11 and the VAX.
Fun fact: Lisa sales were much better than the projections in the marketing requirements document (MRD). However, the MRD was using a much lower price ("end user price under $5000") and now the Lisa had to bring in exploding development costs. What really bombed (or flexed) was the Apple III motherboard, resulting in Apple having to reduce its line of products – and Lisa was the first to go.
Regarding the price, mind that the Lisa came with an integrated office suite (long before the success of MS Office) and was targeted at offices and professionals (think dentists). Which rendered it a somewhat curious "workstation for office work", at least from todays perspective, where eventually office machines became the epitome of cheap, bare-bones boxes.
The MRD mentions as potential users secretaries, managers, and executives (of Fortune 1500 businesses) as well as bookkeepers in general.
I bought a TS1000 in 1981 or 82 for $99, and a C64 in probably spring 1983 for $299 at Circuit City. I was in 9th grade and used paper route money to get it (parents thought they were glorified Atari 2600's). Hehe, remember newspapers and paperboys?
There seem to be a few C64s, including breadbin and C64C models, on eBay most days that I look. Granted some may need recapping or other work done, but there are plenty of others in working condition or only needing minimal repairs. I think you could get one for not much more than $US200 plus shipping.
As others have noted, individuals mostly didn't own PC clones or Macs in the early 80s and they weren't even all that common in businesses.
I did buy a dual floppy PC clone in about 1983. I don't remember how much it was--wish I still had my receipts from that far back--but it was a big purchase for me at the time. [ADDED: I probably dithered over it for something like a year, during which time it became obvious that PC clones were the future rather than the S100 etc. systems running CP/M.] Based on ads from the time, it was probably about $2500 but a printer and software would have added to that.
And when I went to business school about a year later, I was one of very few people in my class who had their own computer. (There was a small computer lab in the school--that actually had a Lisa among other things as I recall. At some point when I was there they added a bigger lab with a bunch of IBM AT clones (80286s)).
My first Mac (512K) and Imagewriter cost north of $5,000. I still remember paying for it with cash from anything I could sell, a cheque with the money I'd saved for months, and the rest was on my newly acquired VISA card.
When I think about that investment as a ratio to my disposable income at the time, I was crazy! But, as a retired IT Exec, things did work out in-the-end.
EGA (640×350) came out in late 1984 and was the first reasonably decent color display on the IBM PC. Prior to that you mostly had a choice between mono text, really crappy CGA color graphics, or something proprietary like the Hercules mono graphics.
My dad paid $6k+ for a IIci in 89. I had no idea at 9 years old and really no appreciation for what kind of money that was. In 2000 I bought a Power Mac G4 for around $2k. Being young with lots of disposable income, I didn't appreciate then either what kind of money that was. It's only recently that I've started to understand what a computer costs as I've bought computers while paying living expenses.
In about the same year, my dad brought home a Macintosh SE for the family. I was very young and remember only a few things: it ran the game Dark Castle in b&w from a hard drive, and my father had pulled the entire system out of a dumpster at work(!). The machine had been used in some way on a defense or NRO contract, and when the project was cancelled, the entire thing was simply discarded. If I recall, one of the back corners of the case was slightly dented from its unceremonious toss into the dumpster.
As a kid I took it all for granted, but we never would have been able to afford such hardware on my dad's engineering salary at the time. The price for a new one adjusted for inflation today is jaw dropping.
According to the article Lisa was a $50 million gamble.
I just read somewhere about the 2000 engineers Google took on for the Pixel phone from HTC. Keeping the lights on for the buildings they are in is a $50 million gamble, that's without paying them or allowing for inflation. But you get the idea, $50 million was cheap for the product compared to what hardware tech costs to develop today, particularly if it has an operating system to write from the ground up.
Incidentally, regarding the threat from IBM mentioned in the article: Peanut turned out to be the ill-fated PCjr, which had good graphics and a bad price point for the home market. Popcorn turned out to be the first PC-based luggable computer from IBM.
Trying to understand Steve Jobs is an effort in futility I know, but I've never understood what he was thinking around this time. He refused to accept responsibility for his daughter when she was born in 1978, and even 5 years later, he was still questioning the accuracy of DNA paternity tests in Time Magazine and despite being a multi-millionaire, only grudgingly paying $500 a month for child support. But then he goes and names this computer after her, yet claims for years it wasn't named for her. Sure, he was pretty young (28) but that's hardly an excuse. Jobs wasn't just a pathological narcissist, he was just plain weird.
It was out of his control, foisted upon him. He didn't want the responsibility at the time, and didn't want to focus his time on it. So, as an immature twenty-something, he took the route of deny, deny, deny to evade it. Lisa had noted with emphasis previously his pattern of having less than no patience for things that took his time against his wishes.
Combined with his emotional issues, rage, and temper, it resulted in a manic back and forth. When he wanted to be magnanimous or kind, he could be, but it always had to be strictly on his terms and only if he didn't feel forced into it in any manner (at the far end of that spectrum, he'd do it only when it was least expected, as an amplification device). If anything attempted to force his hand, or he perceived such, he very aggressively rebelled against it in all cases (you see that pattern over and over again throughout his history with people and business). It had to be on his terms, or there would be no terms at all.

When he could reorient the context of Lisa (his daughter) to his terms, as and when he saw fit, then it became acceptable. You can see some of that behavior in action in Lisa's description of what it was like to live with Steve when she was younger (a my-way-or-the-highway atmosphere; he had to feel in control).

He seemingly struggled to control his emotions for most of his life, which must have been wildly frustrating for someone like him. That lack of personal emotional control might explain an over-compensation directed at other things in his life, the need to control everything else (and perhaps a belief that if he felt in control of everything else around him, he could keep other things from setting off the emotions he couldn't control properly).
"It was out of his control, foisted upon him. He didn't want the responsibility at the time, didn't want to focus his time on it. So for an immature 20 something, he took the route of deny deny deny to evade it."
'20 something' is well into adultland.
You want a good job? You want to vote? You want to be treated as an adult?
Then there's no such thing as 'out of his control', really.
For someone who's ostensibly responsible enough to run an entire company, being responsible for one's children is well within reason.
Everyone has challenges in their personal lives, it's 'never a cakewalk' - ok - but there are really no excuses for Jobs here. Point blank.
You misunderstood what adventured wrote, it isn't that he (adventured) believes that it was out of Jobs' control, but that Jobs himself saw something happening (his daughter's birth) that he didn't control. Jobs wanted to be in control.
This is stuff Lisa herself has written about.
(also 20somethings can act very immature, as can 30somethings, 40somethings, etc and the opposite where teenagers act more mature than expected - it is that "expected" part that sometimes fails with people, not everyone is the same)
Contrast this to the IBM PC where people could easily get started programming in BASIC (included in ROM!) or Asm (MS-DOS DEBUG), for which many magazines of the time had listings. Of course not every user did, but certainly a lot of them started and eventually helped greatly grow the amount of software available.
The PC had a learning curve but the user was in full control, whereas the Lisa didn't have much of one but had many impediments that prevented users from becoming developers. This attitude persists in Apple today.
Spirit. The Lisa was a MUCH more powerful computer than the first Mac - which is why you needed a Lisa in the first place. The Mac wasn't able to run much of anything developed for the Lisa. Mac software was written in machine language; as I recall, the Lisa made more use of higher-level languages, so even the toolchain would be problematic.
Additionally, the Lisa Desktop Library contains many similarly named routines to the Mac Toolbox. The Lisa Desktop Library was never directly exposed to third-party applications though. (The Lisa Toolkit exposed it indirectly.)
According to the wikipedia page for the Lisa, Jobs went to the team developing the mac, and changed the focus from a command-line machine to be a cheaper Lisa competitor, releasing the Mac 1 year after the first Lisa. How could Jobs get away with trying to cannibalize the sales from their flagship machine? It seems traitorous if true.
I really, really, really wanted one of these when they first came out and my stepdad laughed in my face when I asked to borrow that much money for a computer "for college" when we had a perfectly functional Apple IIe at home.
It’s interesting to me how well-reported this seems, or maybe how it seems informed by ensuing decades of literature rather than being reported in the moment. Apple’s mythos, Jobs’s showmanship, and the importance of Xerox PARC and Apple’s deal with Xerox were all established lore very early on. I guess it’s just bias on my part that it seems like things would have been less clear or known at the time.
> To erase obsolete information, for example, the user moves the mouse to point first at whatever is to be thrown away, and then at an icon in the shape of a tiny trash can; at the press of a button on the mouse, the information vanishes
I'd imagine most readers of a 1983 Newsweek magazine would read this and be turned off from computers for life.
“Apple Inc. employee Jef Raskin named the Macintosh line of personal computers after the McIntosh. He deliberately misspelled the name to avoid conflict with the hi-fi equipment manufacturer McIntosh Laboratory. Apple's attempt in 1982 to trademark the name Macintosh was nevertheless denied due to the phonetic similarity between Apple's product and the name of the hi-fi manufacturer. Apple licensed the rights to the name in 1983, and bought the trademark in 1986.”
Watching early Apple presentations and modern presentations shows the company prioritizing sales over innovation. Now we see a phone that is marginally better than the previous year, with taglines like "the best iPhone ever". The future is AR and gestures in my opinion, if keyboard input speed is ever an issue then a glove Swype interface would work. Many smaller companies have already set the stage nicely, such as the currently monocular Vuzix Blade (https://www.vuzix.com/products/blade-smart-glasses).
Edit: Guess HN thinks we're going to be carrying screens in our pockets forever.
Funny, I've felt the same thing about VR. With AR, I can think of a million practical applications. With any kind of VR that is short of full sensory immersion, I can't think of a single practical use.
> With any kind of VR that is short of full sensory immersion, I can't think of a single practical use.
If you've ever had to use a hardware "simulator" to train on Big Hardware like a plane or submarine, using VR as a replacement is an "obvious win." Rather than dedicated rooms with all sorts of custom fake hardware that only a few people can use at a time, you can buy one classroom full of VR equipment, and then every student can do their simulator runs in parallel, allowing each student far more total simulator-time. As well, to switch to a simulation of newer-model hardware, you just need a new piece of VR software, rather than entire new rooms full of molded plastic and slapdash wiring.
The only games that really appealed to me as a use case for VR are "Keep Talking and Nobody Explodes" and using it as a virtual cockpit for mech/dog fighting sims, otherwise it just seems like a thing I have to have on my head for not much gain.
I think it is used for pornography a fair bit though
Every recent iphone (6s and newer) is an AR device. You can load up an Ikea app, and check how the furniture you're interested in looks in your room, or if that end table will actually fit between the wall and your couch. There was an estimate 2 years ago that there were 700 million iphones in active use. Let's assume that the current number isn't any larger and that 6/7 are recent enough to use ARKit, so let's say 600 million AR devices. I'm willing to bet that is more than all other AR companies combined.
The idea of having to wear a glove to type is horrid.
As with the iPod, iPad, and iPhone, I fully expect Apple to wait for things like smart glasses to evolve a bit before jumping into the fray with a mass-market application. Being one of the first-to-market has never been their thing.
They are pretty far from first to market. I'd say Google was first to market with Glass, along with several competitors including Vuzix. Apple developed ARKit, which would work nicely with glasses technology. I agree a glove is a little too complex, but an eye tracking Swype would be a useful innovation.