"Finally, of all the people who have contributed to the development of EMACS, and the TECO behind it, special mention and appreciation go to Richard M. Stallman. He not only gave TECO the power and generality it has, but brought together the good ideas of many different Teco-function packages, added a tremendous amount of new ideas and environment, and created EMACS. Personally, one of the joys of my avocational life has been writing Teco/EMACS functions; what makes this fun and not painful is the rich set of tools to work with, all but a few of which have an "RMS" chiseled somewhere on them."
I think RMS is old school, a true hacker. For most things happening in the tech industry today, if you dig deep down, there is an "RMS" chiseled somewhere on something. I wish we could have him on HN; he's a very interesting person.
If you missed it, the tiny section on ITS has some gems:
When you connected to one of the ITS machines (even over the net) and wanted to log in, ECC suggests checking to see if anyone is logged in with the name you wanted and, if not, logging in with that name. Not creating an account. "Logging in" in those days mainly set your home directory. No accounts, no passwords.
The command interpreter (shell in modern parlance) was the debugger, DDT. If any program crashed you could debug it in situ (with all connections open, files open, etc.) rather than sifting through a core dump. It's as if /bin/sh was gdb.
Those were great days in a more civilised age. You can see here where the mutability of the Lisp Machines came from.
> If you missed it, the tiny section on ITS has some gems:
I was there a few years later in the early 80s, and I wouldn't romanticize ITS. There were reasons it didn't live on.
However, one of the more hilarious inventions in the system: It used to be a pastime to try and crash a system. This was of course a huge PITA for anyone else on the system, but for whatever reason of target fixation, you couldn't stop people from trying.
So ITS had an explicit command to crash the system (KILL SYSTEM iirc), and it was well publicized. By removing the challenge, the system stayed up. There's some deep statement about hacker psychology in there somewhere.
Edit: Just saw your username. You weren't by any chance the same gumby from the 80s?
> Yes, I am the same gumby at the MIT AI lab from late 70s to mid 80s.
Apparently you have a pretty memorable username, because I don't think we ever interacted. I was a mere UROP. (I worked on a robot named TARDIS (tea and ravioli delivery service) for Jon Taft (tfat) and John [can't remember his last name], ultimately under tk. Nobody would remember me, being just a peon and then leaving to do hardware, but it was a great environment to cut your teeth, and a real privilege to have delved into LMs at that time.)
Hope things went well for you and the rest of that bunch, it was an incredible place.
This platform would also need to be useful in order to reach more than a very restricted audience. I find Plan 9 and Squeak/Pharo fascinating tools, but I need a "normal" machine to browse the web, deploy the software I write for a living, run Slack...
I always think it's a shame the only OS options we use these days are flavors of UNIX, a distant descendant of MVS, and the bastard child of VMS with a confusing GUI bolted on top of it.
This is where we have a real opportunity that didn't exist in the 90s. Back then, we had more diverse (and exotic) personal computing environments, but they largely weren't online, and they had one other problem: incompatibility.
Today we don't worry as much about incompatibility, largely because we have widely accepted and used standards that we lacked back then. One could build a whole new personal computing machine and/or environment and as long as it implemented many of these standards it would be entirely usable (things like JSON, XML, etc).
In fact, the only "application" one would need in such a system is a web browser (no small task, I know). That goes a long way for "bringing people in." There is lots of room for creativity at present but unfortunately little in the way of imagination or will.
I don't agree. There is a place for something like end user programming. It was a big deal in the 90s and has largely fallen out of favor as developers go deeper into their own universes. Certainly something like Hypercard wasn't "only for developers" and there are many lessons one can draw from that.
There is no good reason that personal computing systems shouldn't or can't be, by and large, end-user-programmable systems.
Yes, there is a place for end user programming, among the end users that want to be programmers. History has shown time and time again that's a minuscule part of end users. Even Excel formulas are out of reach for most end users.
To what history are you referring? There are scholars who study this kind of thing (check out Bonnie Nardi, for example). Good systems that integrate end user programming are popular and used by non programmers when they are available. If they are a minuscule part of contemporary personal computing it is because developers have stopped making these types of systems, not because it is somehow above the abilities of regular people.
Sometimes new IT users, born in the mouse/X graphics era, do not understand why so many things exist for navigating through text.
Many of those tricks no longer have a purpose, but many still do, and not only for terminal usage but also for speed and comfort.
The bad part IMO is the dichotomy between "younger" and "older" users: the former would benefit a lot from many "semi-forgotten" technologies, but the latter need to modernize and explain things a bit, or at least advertise them with proper demos.
In the Emacs world many things are happening in this direction, with newly precooked configs, new packages, etc.; the same happens in shell land with zsh/fish and the recent "epidemic" of new tools that mostly add sugar to classic Unix utilities. That's good, but we need more and more.
I dream of a GuixSD with Emacs as the login shell on the terminal and as the default desktop with EXWM, a good-looking default theme, and a small org-based vimtutor-style intro at first startup, perhaps with a few video demos; many, many more users might be enlightened :-)
> Sometimes new IT users, born in the mouse/X graphics era, do not understand why so many things exist for navigating through text.
You’re right, but the way I like to put it is: I often use text rather than moving a mouse & clicking for the same reason that human beings often use speech rather than pointing & grunting (indeed, using a mouse is a lot like pointing & grunting at one’s computer).
Using a smartphone or tablet really drives home how powerful it is to have a full-featured input device (i.e., a real keyboard), rather than just a mouse-driven UI.
Not everywhere, since Plan 9 is really mouse-centric :-)
However, I think anyone who has used both a keyboard-centric environment, at any level, and "modern mouse usage" can say that the mouse is sometimes needed (like for image editing), but for plain UI it is far less necessary and often not really comfortable or efficient at all...
My only consideration is that our knowledge is mostly text; we have added video/images to the game, but text is still by far predominant, and mostly without any viable alternative... So having a text-centric environment, without, of course, dropping graphics &c. capabilities, is the most logical way of working...
Ex-hardcore vimmer for more than a decade here :-) there is no comparison...
Vim is a powerful and lightweight editor; Emacs is an operating system that people who do not know it call an editor. My demos are not about coding but about MUA, slides (live in Emacs & exported to reveal.js/PDF with Beamer), file management, etc., so things (n)vim does not do...
No comparison? Well, I'd say there certainly is a comparison to be made. If you look at the plugins/packages available for Vim and Emacs you'll find that they're surprisingly comparable. For some niches Emacs has better packages, for some Vim has better plugins. Yes, Emacs has a better extension language. Yet somehow excellent plugins are still written in VimScript. And Vim has its own advantages.
I say this as a former Vimmer who now mostly uses Emacs, mostly because of Org-mode. (Though I wouldn't use Emacs without Evil.) I did a lot of work on an Org-mode clone in Vim many years ago, which copied quite a few features and worked surprisingly well. Yes, it was somewhat hindered by limitations of Vim and VimScript. But the main problem was Org-mode had many years of head-start, and already had a large and active community of users, developers, and continual development, and there was simply no way for a resource-scarce, single-person project to catch up.
Oh, I do not intend to say that Vim is "inferior" or "bad", no intention to start a religious flame war, only that Vim was born and evolved as an editor, and does its job pretty well. Emacs, on the contrary, was born as a human-machine interface, like a DE (the ITS DE), which we can consider something like a modern Lisp machine, or what we still have of them.
We can compare the editing capabilities of both, and I think they are essentially on par, but we can't compare a car with a speedometer...
It is quite false to suggest comparison between Emacs and Vim is like comparing a "car to a speedometer". Vim is not just an editor, it is an "extensible-editor" in much the same way as Emacs is.
I don't care if Emacs was born as a lisp machine, the fact is that the modern Vim-garden has as many add-on plugins and apps written for Vim as exist in Emacs' garden, possibly more. And in many cases the Vim extensions are better, despite being hobbled by an inferior extension language.
My English perhaps led me to express the concept in a somewhat convoluted, misleading, or less effective way... What I mean is simply that Vim is a powerful editor and Emacs is a sort of OS. How can you compare them outside the editor part?
An example: Emacs is my WM with EXWM (and a few people use it even as a terminal login shell): can Vim act as a WM? It's also my MUA, and I do not know of any Vim MUA, only a few hacks mostly for composing a message (while some MUAs and many different apps offer Vim-like keybindings, being effective and comfortable). It's also my personal finance manager and my PDF reader; it features a built-in CAS, not at the level of Maxima (supported by iMaxima as a wrapper UI) but not marginal either, that can do symbolic and numeric computation, times, unit conversions, astronomical calculations; it has a shell that's a built-in REPL, ...
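For the curious, the WM part really is just a package: a minimal, illustrative init fragment (assuming the `exwm` package is installed and Emacs is started as the X session's window manager) is enough for Emacs itself to manage X windows:

```elisp
;; Minimal EXWM sketch: Emacs becomes the X window manager.
;; Assumes the `exwm' package is installed, and that Emacs is
;; launched as the session WM (e.g. `exec emacs' in ~/.xinitrc).
(require 'exwm)
(exwm-enable)  ; hook EXWM into Emacs' startup sequence
```

From there every X application window shows up as just another Emacs buffer, switchable with the usual buffer commands.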
Without the intention of fueling a potential flame war: Emacs can even be a Vim emulator with Evil &c., and Evil's existence and popularity are clearly a tribute to Vim as an editor.
As editors, for me they are substantially on par; nvim also offers a few nice non-editor things, like a terminal emulator that works better than sane-term/ansi-term in Emacs, and its startup time is super quick compared to Emacs, but the common ground ends there. Of course Emacs is a text-centric UI, so you may say that "being an editor" can also mean being a full-featured operating environment; that's true, but no Vim devs nor users I know have tried to do so with Vim, while many do it with Emacs, not only today but even many, many years ago. One of the nicest old posts I found is http://www.sxemacs.org/ from the early 2000s, just as an example.
Perhaps sometime in the future Vim or a Vim fork will try to do an even better job than Emacs, but that's only a hypothesis, not something that can be compared now.
You are being obtuse. There really is no comparison between Vim and Emacs.
Vim is just an editor.
Emacs is an environment, a Lisp Machine. I'm running a web browser, mail client, irc client, news client, twitter client, RSS reader and aggregator, file manager, music player, ebook reader on Emacs. 24/7. At the same time. These are all applications written in Emacs Lisp. Do people use Vim in this way? No.
Look at some Symbolics Genera videos on Youtube. Emacs is the closest (but still inferior) application we have to that.
Sure, but in a professional setting, on your own work, it is completely unacceptable.
Imagine if musicians did the same as computer scientists: all the keys, strings and holes in all the instruments would be labelled with the note that they play. Difficult instruments like the clarinet would be rejected and deemed "obsolete" and non user-friendly. Young musicians would rally against the old musical notation, and would switch to a colored animated markup that only works inside new versions of iPads. That would be laughable, wouldn't it? Don't try to do the same with computers, please. Learn the tools of your trade and master them. Do not fight against powerful tools because they need some training to use. Do not require "discoverability" for professional tools.
You’re assuming that you’re always hunched over the computer with your hands on the keyboard.
Often I’m in a relaxed position looking at the screen. We all agree that we spend more time reading code than writing it? So, do you really want to be sitting at the computer with your hands on the keyboard at all times?
You’re also assuming that you have a physical keyboard. In the 21st century billions of people use tablets and phones.
Finally, I didn’t say you had to give up the keyboard. I said I would like to have the other option. You can still use AceJump.
Well, for me a computer (desktop) is only marginally an entertainment device, so I normally sit in my "comfort" chair (Varier Balans) with my keyboard ready to operate...
I have a smartphone, so I casually use touch input, but mostly as a personal navigation device, an on-the-go mail reading device (I do not normally write mails on the go), a mobile phone for actually making or answering calls over GSM or VoIP, and a limited org-mode display (Orgzly) for "to buy" grocery lists. Essentially no more. People who try to use such devices for more, like banking etc., I simply call poor, unfortunate, and ignorant slaves.
I do not see such devices as an innovation but as an involution and a failure: an involution because of their absurd limits, a failure because we are still unable to have a "computer on the go" when needed...
It's not a matter of being a "geek" but a matter of controlling our personal data and environments and being able to produce content, not only consume it. With mobile crap we can consume easily, but we can produce essentially nothing and we can't control our data. So they are absurd devices.
BTW yes, I can still use my physical keyboard; unfortunately keyboards are getting worse and worse, and the only acceptable ones remaining are super expensive and still bad... I would love to have again my beloved Sun Type 6, or older ones with many keys I could use as single-key bindings in Emacs. I would love programmable keyboards that are not immense piles of crap demanding super-buggy software just to register custom keys or macros, and that even with such software can't really be customized.
> It's not a matter of being a "geek" but a matter of controlling our personal data and environments and being able to produce content, not only consume it. With mobile crap we can consume easily, but we can produce essentially nothing and we can't control our data. So they are absurd devices.
Have you thought about it this way: maybe you're the one that can't create on these new devices? Just like my parents can't create things on a desktop, because they haven't mastered keyboard and mouse usage and desktop UI conventions. Kids can create quite impressive things on mobile devices. Maybe not things you'd like or prefer, but mobile devices can definitely be used for creating things. For some things they're even better than desktops, because of the more intuitive touch based interface.
Yes, mobile devices are more limited from certain points of view, but as photographers say: "the best camera is the one you have with you". Availability trumps flexibility and power for most people.
And regarding the consumption/creation dichotomy: it's not entirely false, but it's not very important either. Most people in most moments of their lives are not creators. I make things but I'm also passive at least 50% of my time. On top of that, many people don't even have a desire to be "creators". Maybe they want to help other people as doctors, lawyers, teachers, etc., and digital content creation doesn't interest them.
Well, on a mobile device I can take photos or videos, but they are not my creation: even if I construct a scene, I only create the scene, not the video/shot. The phone has done it, outside my real control. More importantly, the recorded bits are not mine, nor under my control. They are tied to the device or to someone else's computer.
You may state that even on my computer I act only through software, but there is a difference: first of all, it is software I can write, read, and deploy myself (try developing on Android, for Android, and deploying the code), so I can know and program my own environment from within itself. Second, it's something I can examine with simple tools: no need for JTAG, soldering, or potentially destructive operations to see my bits; I can even open my raw disk with an editor.
Of course not all people want to create stuff, but for me, having designed devices that are more powerful than many ancient computers and yet can do far, far, far less, only adding things that we can already do with dedicated devices, is a failure.
We talk about recycling to reduce the damage we do to our environment, and we design expensive devices that can barely last a few years (two or three at most, on average), that are super hard to repair, and a repaired device is normally not "as new" the way my own desktop is "as new" when I change the mobo or a disk. We talk about being flexible in a changing world, and we create devices that are substantially not upgradable nor modifiable by the user, and often not even by the OEM. We say it's forbidden for a landlord to enter his rented apartment, and then we allow systems that, after being formally bought, are routinely configured and modified/monitored by tons of different subjects. I could go on far longer...
It is fun comparing the key sequences to a modern emacs.
I looked through the section for "Basic Buffer Editing Info" and every one of the key commands mentioned there would work in a modern emacs!
You can now use fancy cursor keys rather than CTRL-P, CTRL-N but if an emacs user from 1978 was stuck in front of an emacs today then they could get straight to work :-)
The meta keys look a bit different though. You type two alt-modes (whatever they are) to get into the mini-buffer and then enter "MM-commands". In modern emacs you type Meta-x (Alt-x on most keyboards) to get into the same mode.
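You can check this continuity from within a modern Emacs itself; a minimal sketch (evaluate each form, e.g. with M-:) showing that the commands behind the 1978 key sequences still resolve to real functions today:

```elisp
;; Default bindings in a modern Emacs, matching the 1978 manual:
(key-binding (kbd "C-p"))  ; => previous-line
(key-binding (kbd "C-n"))  ; => next-line
(key-binding (kbd "C-f"))  ; => forward-char
;; The old "two alt-modes, then MM command" corresponds to:
(key-binding (kbd "M-x"))  ; => execute-extended-command
```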
This is one of the reasons I prefer using old, established, principled FLOSS software as opposed to Enterprise software. I don't have to change my habits because a middle manager decided to change the product to get a promotion.
That was a room on the 8th floor of Tech Square (almost the entire 9th floor, the top of that building, was the machine room).
Tech Square was an office complex built in anticipation of the space program. Then JFK was assassinated and LBJ moved mission control and all that stuff to Texas. NE43 was the name of 545 Main St. MIT had a bunch of floors: first 8 & 9, then 3 (the CIA had the other half for a long time), then 7. I think Honeywell had an office there too, perhaps in one of the other buildings, because of the Multics project.
As a George, my name did some g(lobal) operation. It was an editor more talked about than used. SOS line-mode editing on TOPS-10 was just simpler on a DECwriter or VT52. I do recall TECO being quite like ed: you really have to be comfortable with an amount of text carried in your head. So... it's an editor for people with a coherent mental model. I was more incoherent.
The impact of the GNU Manifesto was pretty big. We talked about it, wondering when he'd get a compiler better than PCC. I think we (Leeds Uni, 1980s) said years off, but actually GCC 0.9(?) came out far faster than I expected.
Roll forward 10-15 years. Former compiler writers out of work. GCC and LLVM killed the market. On the other hand we have a really good competing duopoly of compilers, lots of upper-level language support, and diverse languages. So... win some, lose some. And it's not like Rust and Go and Java have to use LLVM and GCC. Rust does. Go doesn't always.
Commercial compilers are still produced: IBM with its XL C compiler, for example, and Intel's icc. Both are good. Intel's icc often produces slightly faster code than GCC. And going a little back in time, SGI's MIPSPro compilers were extremely good. I'm sure there are a couple more contemporary compilers out there, but the IBM and Intel compilers are something we use at work, at least.
I have no idea what you mean by 'no support for modern standards'. It's a modern compiler for modern C (and the newest versions are compatible with code written for modern gcc, with a flag). The only thing about it is it has its own syntax for flags, which aren't exactly difficult, and they're well documented.
It produces better code than gcc in many cases. Which is exactly what you would expect for a compiler written specifically for IBM by IBM. Lots of fine-tuning options for the Power architectures.
You say you have a few AIX machines - which one, and which OS version, and which compiler versions? I'm on Power7 and Power8, with AIX 7.1
Don't look at AIX 4/5.
I was, but unfortunately at 3 years old I wasn't much interested in computers, and I didn't have the means to get MIT papers (or any recent papers, or the knowledge that such papers existed, for that matter) even when I developed an interest in computers.
But, yeah, being born in the right time and place would be wonderful.
Nowadays one cannot even fully comprehend how places like MIT, the Stanford AI lab, Xerox PARC, and, of course, Bell Labs were islands of intelligence and sanity compared to the modern day's ocean of screaming bullshit.
"Dark liquid with negative mass" sort of papers is what it all came to.
MIT, the Stanford AI lab, Xerox PARC, and Bell Labs had their share of bullshit in 1978 too. But today it is normally forgotten, because survivorship bias lets us remember only the successful stuff, which BTW would include the bullshit if it were so spectacularly bad that it became memorable.
These were places where old-school people (in the classical understanding, if we follow Robert M. Pirsig) tried to reduce concepts grounded in reality to implementations as optimal as humanly possible, as opposed to the modern tendency to pile up more and more crap.