25 comments

  • nicklaf 2105 days ago
    It would be interesting to ask whether or not Ted Nelson's vision for personal computing has been achieved. In the podcast, personal computing was to be the means of liberation from a centralized priesthood on the one hand, and from ignorance (i.e., access to means of learning) on the other.

    Personal computers are now ubiquitous, and we've been liberated from central priesthoods which had restricted access to (expensive) computers in the past. But new forms of centralization emerge, whether they arise in the form of bloated, draconian corporate software foisted on workers within a company, or as public products such as Google and Facebook, which have been gobbling up the open web and replacing it with means of control and manipulation. [1]

    In fact, this latter concern of manipulation begs the question of what people really need computers for, and brings into question Ted Nelson's presumed answer to it: i.e., that people have an inherent need to use computers to facilitate their own creativity. But what about the possibility that the majority of people prefer to be passive consumers? Neil Postman warned that we would become a trivial society, rife with distraction, like in Huxley's Brave New World. [2]

    Perhaps computers have succeeded in universally capturing our imagination, but corporations (centralization!) have once again captured computing, and by extension our imagination as well, as Tim Wu laments [1]: am I more likely to open up my personal computer at odd moments of the day to check Facebook or Hacker News to momentarily capture my imagination? Or have I instead acquired the habit of pulling out a tablet to accumulate further progress in a creative work, which perhaps only requires internet connectivity for the purpose of reference? And if the answer is the former, is the reason because we don't have actual Dynabooks [3], but instead a more asymmetric device skewed toward consumption and away from creative expression? Or is it because the majority of people in fact prefer passive consumption anyway?

    [1] http://www.timwu.org/AttentionMerchants.html

    [2] https://en.wikipedia.org/wiki/Amusing_Ourselves_to_Death

    [3] https://en.wikipedia.org/wiki/Dynabook

    • jdietrich 2104 days ago
      >But what about the possibility that the majority of people prefer to be passive consumers?

      By any reasonable measure, we're in a renaissance of creativity. Most people in the creative industries are complaining of a total oversaturation of talent - too many bands, too many authors, too many artists, too many indie game developers, too many stand-up comedians.

      Maker culture and 3D printing have created a consumer market for CAD/CAM software, which no-one in the industry foresaw even a decade ago. By some measures, musical instruments and recording equipment are a bigger market than the recording industry. It has never been cheaper or easier to turn an idea into reality; there's abundant evidence that people are grabbing that opportunity with both hands.

      YouTube is full of crap, but it has also revealed a huge amount of demand for deep and meaningful content. My YouTube subs box is full of artisans, mathematicians, poets and monks. Some of them have seven-figure subscriber counts and five-figure Patreon revenue, some of them upload simply for the joy of sharing something that they love. It's easy to be cynical, but YouTube is a place of boundless magic and wonder.

      The internet isn't perfect, but it's not all doom and gloom either. The internet is full of assholes calling each other nazis, but it's also full of people sharing skills and making meaningful connections. Sturgeon's Law isn't going away, but we really have removed the stultifying layer of gatekeepers that controlled access to the media. There's an uncomfortable degree of consolidation and centralisation on the modern internet, but MySpace and Snapchat stand as testaments to how fickle and fragile that power is.

      To quote Ted Nelson in this interview: "The computer is a projective device, it is a Rorschach test. Anyone will see in it that which is of the most concern to him." If we choose to spend our time on the internet decrying the negative rather than contributing to the positive, we will get exactly what we deserve.

      • KineticLensman 2104 days ago
        > By any reasonable measure, we're in a renaissance of creativity

        There is a really interesting historical pattern here: new communications technologies create an initial splurge of independent creators, but after some time the technology is consolidated into a smaller number of platforms (by regulation or monopolisation, or sometimes both) that provide the main route for content to reach its consumers. These platforms tend to stifle peer competitors and are only replaced by disruptive rivals. Creativity on a platform depends on how much it stifles the content; the classic example is the Hollywood film industry of the 1940s-1960s, which was highly vertically integrated and thus became an excellent vehicle for enforcing the so-called Hays Code [2], which defined morally acceptable content and attitudes.

        This idea is explored really well in the 2010 book 'The Master Switch' by Tim Wu [1], who shows how this pattern played out with Radio, Films, Television and latterly the Internet. It would be fascinating to see an update to this excellent book reflecting the consolidation of Google, Amazon, Facebook, etc. since it was published.

        My own concern is that because the internet is perhaps more pervasive than the prior media, any lockdown could have more severe chilling effects. As an extreme, consider future cloud-based compute appliances that don’t allow side-loaded software and whose content creation tools delete or report inappropriate language.

        [Edit - just noticed a similar comment from Jedd below. Next time I should read a bit further before commenting!]

        [1] https://en.wikipedia.org/wiki/Tim_Wu#The_Master_Switch

        [2] https://en.wikipedia.org/wiki/Motion_Picture_Production_Code

      • WJW 2104 days ago
        Interestingly, both problems could be happening _at the same time_! If before (say) 1% of all people got a chance to publish their work and the internet has increased that tenfold, there is still room for 90% of the population to be passive consumers. That 10% might still be 'too much' for some types of content, of course (how many bands, standup comedians or indie game developers do you need before you hit diminishing returns anyway?).

        In fact, I suspect that most people fall into both groups: They might be 'creators' for some types of content and only 'passively consume' some different types. For example, I sometimes contribute to OSS but have never uploaded a Youtube video in my life.

      • ataturk 2104 days ago
        These are all good and interesting observations, particularly the ones about Youtube. On the other hand, I still feel like something is very wrong with the Web and the Internet in general. I think it is the incessant propagandizing and ad-mongering. I can't even visit most websites these days without "shields up" on my Brave browser. The NY Times and Washington Post websites are full of slant, required by their owners. Journalistic ethics are completely out the window. Independent news sites are totally unreliable for different reasons, some only because of lack of experience(?). It's an unruly mess.

        You mentioned CAD/CAM software for regular people -- the prices suggest these companies haven't gotten the idea yet.

        These days, basic free-market, small-government, libertarian-minded people are referred to as Nazis. It is not at all comforting that such a label can be so badly misapplied while we witness the US Government dropping a bomb every 12 minutes in multiple undeclared wars, mostly for the sake of corporate profits. And yet people like me are the bad guys? I don't get that. It is all completely counter-productive.

    • miguelrochefort 2105 days ago
      The problem is that the computer doesn't help me figure out what to do next. It only provides a grid of icons, through which I cycle repeatedly until the real world grabs my attention.

      What we need is an OS designed to help people get things done. The best such framework is probably Getting Things Done [1]. Build the 5 steps (Capture, Clarify, Organize, Reflect, Engage) right into the system, and you'll have a winner.

      I've been repeating this idea for the past decade. Perhaps I should implement it myself.

      [1] https://en.wikipedia.org/wiki/Getting_Things_Done
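
      To make that concrete: here's a minimal sketch, in Python, of what the five phases might look like as a system-level service. Every name in it (Bucket, Item, the decide callback) is hypothetical -- mine, not anything from actual GTD tooling:

        from dataclasses import dataclass, field
        from enum import Enum, auto

        class Bucket(Enum):
            TRASH = auto()
            SOMEDAY = auto()
            REFERENCE = auto()
            NEXT_ACTION = auto()

        @dataclass
        class Item:
            text: str
            bucket: Bucket
            context: str = ""     # e.g. "@home", "@errands"

        @dataclass
        class Gtd:
            inbox: list = field(default_factory=list)
            organized: list = field(default_factory=list)

            def capture(self, text):
                # 1. Capture: dump every incoming commitment in one place
                self.inbox.append(text)

            def clarify_and_organize(self, decide):
                # 2-3. Clarify what each item is, then file it in a bucket;
                # `decide` is the human judgment the OS can't automate
                while self.inbox:
                    self.organized.append(decide(self.inbox.pop()))

            def reflect(self):
                # 4. Reflect: review everything, grouped by bucket
                return sorted(self.organized, key=lambda i: i.bucket.value)

            def engage(self, context):
                # 5. Engage: surface one next action for where you are now
                for item in self.organized:
                    if item.bucket is Bucket.NEXT_ACTION and item.context == context:
                        return item
                return None

      The point of baking this into the OS rather than an app would be that capture() is reachable from anywhere, and engage() replaces the grid of icons.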

      • wool_gather 2104 days ago
        Interesting thought. Home assistant devices (Echo/Google Home) might be a first step in this direction, since they're entirely task-oriented. Not quite the same as a GTD system, of course: they just do a thing immediately. But the non-visual interface means that you can't idly flip through Facebook posts even if you tried.

        On the other hand, they don't have any support for in-depth tasks. If the things you want to get done are writing a letter or editing a photo, you're not going to get far without a screen.

        • mwcampbell 2104 days ago
          > you're not going to get far without a screen.

          Tell that to blind people. Not for editing a photo, of course, but they can certainly do other in-depth tasks without a screen.

          • wool_gather 2103 days ago
            Sure, but for a sighted person used to interacting with things visually, this is learning some entirely new mode of control. One of the strengths of computer GUIs is that they heavily leverage humans' most favored sense (for those of us that can use it).
    • 8bitsrule 2105 days ago
      > whether or not Ted Nelson's vision for personal computing has been achieved.

      If you listen to one of Ted's recent videos, you'll find the answer in the weeping and gnashing of teeth. (The dumbing down of what could-have-been.)

      • bborud 2104 days ago
        The impression I got is that he is more of a technology philosopher than an actual technologist. He dreams up things, but he doesn't realize them and he hasn't managed to convince others to implement his vision either. In terms of software there is precious little to show for 50 years of thinking about this.

        And it isn't like this is some recent brainwave. By the time the web rolled around he'd had 30 years to do something. And as the web came into being he had an excellent chance to make it a vehicle for his ideas, but he didn't.

        One thing I remember from a (longish) chat I had with him is that some elements of his ideas exist today, but perhaps not where you would think about them. For instance in non-destructive media editing software you have concepts that remind me of some of his ideas.

        I think the one key takeaway I got from his talk and the chat afterwards is that only what you do really counts. Ideas really are worth less though not worthless.

        • AndrewKemendo 2104 days ago
          > only what you do really counts. Ideas really are worth less though not worthless.

          It's a matter of degree right?

          So for example Erdos didn't actually do anything - in the sense that we're talking about - such as creating products or works of physical art. His mathematics are ideas, which people then used; critically, though, they referenced Erdos as their originator.

          From what I can tell of Ted Nelson, the design and cybernetics ideas were obvious enough that many other people thought of them also, or were not so obscure as to be uniquely his. So like many ideas without fathers, it's whoever can create a product around them who gets the credit.

          So for example "Steve Jobs invented the iPhone." Anyone with even a shallow historical understanding of handheld computing knows that this statement isn't true in any concrete sense, but because Apple, under Jobs' management, popularized the smartphone the attribution goes as such. Many people thought of, and wrote about or made in science-fiction the smart phone - so it was an obvious enough idea (by then) that execution is what mattered.

          Interestingly, if you push all the way back to writers like HG Wells, they predicted much of the technology we see today, including smartphones, but they aren't credited with "inventing" it or even with being important in its development. So the right time, right place, right idea is still the combination needed to stand out.

          • bborud 2101 days ago
            > So for example Erdos didn't actually do anything

            He most certainly did. In his field, doing the work which is then published in papers _is_ doing something. However, just having an idea for how to work out a mathematical proof, but not doing it would represent, well, not doing it.

            I don't see how this would be unclear.

          • hindsightbias 2104 days ago
            > So the right time, right place, right idea is still the combination needed

            And the right people.

    • delbel 2104 days ago
      > But new forms of centralization emerge, whether they arise in the form of bloated, draconian corporate software foisted on workers within a company, or as public products such as Google and Facebook, which have been gobbling up the open web and replacing it with means of control and manipulation. [1]

      This reminds me of a book that came out in 1996 called Silicon Snake Oil: Second Thoughts on the Information Highway by Clifford Stoll. In it, he predicted that the internet would all turn into crap. At the time it was the most contrarian viewpoint ever; I can't help but think I need to find a copy now, in light of the negative effect Facebook and mobile phones have had on society.

      Before that he wrote an amazing book called The Cuckoo's Egg, which basically everyone in Silicon Valley has probably read and can relate to if you're from that era.

      • KC8ZKF 2104 days ago
        I remember reading that book in 1996 and finding it incredibly elitist. His alternatives to the internet were visiting libraries, book stores, art film theaters, going to concerts, strolling through parks... But, he was living in Berkeley!

        The internet was a godsend to most of the population, who didn't live in the intellectual centers of the world. Still is.

      • nicklaf 2104 days ago
        > This reminds me of a book that came out in 1996 called Silicon Snake Oil: Second Thoughts on the Information Highway by Clifford Stoll. In it, he predicted that the internet would all turn into crap. At the time it was the most contrarian viewpoint ever; I can't help but think I need to find a copy now, in light of the negative effect Facebook and mobile phones have had on society.

        Very interesting. The book certainly sounds like something I ought to read. In fact, perhaps while simultaneously reading David Gelernter's 1992 book, Mirror Worlds: or the Day Software Puts the Universe in a Shoebox...How It Will Happen and What It Will Mean, whose optimistic vision of an immersive virtual reality out of networked computers would make for a rather stark contrast!

        • ataturk 2104 days ago
          David Brin was pretty on the mark with "The Transparent Society" as well. Brin had more optimism than what came to be, but he nailed the "cameras everywhere" status quo where people whip out smartphones to capture footage of an event now. It's just that he nailed the Orwellian side of it, too.
    • Jedd 2105 days ago
      It invites or raises the question rather than begs it, but your point is understood.

      You mention Tim Wu -- his book 'The Master Switch' focuses on the history of technologies as a cycle of invention --> egalitarian use & access --> centralised control and ownership by the elite.

      My feeling is that things like the trends away from net neutrality globally, and towards greater government surveillance / suppression of encryption, and corporate interest in encouraging consumption via technology rather than (as per your example) 'creative work', are fulfilling that prediction in both obvious and subtle ways.

      • noblethrasher 2105 days ago
        > It invites or raises the question rather than begs it, but your point is understood.

        By my reading, 'nicklaf used the phrase in the technically correct way.

        But, as a recovering philosophy major, I'll just point out that the phrase "begging the question" is just a bad 16th century English translation of the Latin phrase "petitio principii" that literally meant "requesting a postulate", and which itself was a medieval translation of the original Greek phrases "τὸ ἐν ἀρχῇ αἰτεῖσθαι" and "τὸ ἐν ἀρχῇ λαμβάνειν", which meant, respectively, "asking the original point" and "assuming the original point."[1]

        We say that the English translation is "bad" because the translators were jumping through hoops to avoid using the fancy Latin in favor of the vernacular (or vulgar) English.

        I caucus with the folks who say that we should just use the Latin phrase, "petitio principii", as we do with other well-known fallacies.

        [1] http://languagelog.ldc.upenn.edu/nll/?p=2290

      • nicklaf 2105 days ago
        > It invites or raises the question rather than begs it, but your point is understood.

        Hmm. I admit it's tenuously implied, but my thinking was that it does beg the question! Because what we're talking about is the ostensible resolution of Ted Nelson's dilemma of wanting 'generalized paper', but then observing that what people actually use it for in practice might lead us to question some of his original assumptions about what people need computers for after all. But OK, maybe this isn't quite correct usage of the term.

        Thanks a lot for pointing that out about The Master Switch, which I should probably read.

        • redwood 2105 days ago
          I'm impressed that you're both familiar with the rhetorical roots of the term.
      • Obi_Juan_Kenobi 2104 days ago
        > trends away from net neutrality globally, and towards greater government surveillance / suppression of encryption ...

        Is that a trend?

        It strikes me that these are ongoing points of conflict, particularly around encryption. Whether it's illegal numbers, Wassenaar, or the UK sitting on a version of RSA for over two decades, keeping it top secret. Or the enduring controversy over TOR, a government-funded anonymity tool.

        I think the only thing that's changed is the degree and immediacy with which policy related to these issues affects the public.

        More generally, for all the 'consumption' on Facebook and Instagram, you also have Arduino and 'maker' culture, Soundcloud rappers, Youtubers of all stripes (from the old dude in his garage machine shop, inexplicably addressing an audience of tens of thousands a few times a week, to the flavor-of-the-month pop-drama garbage), an endless list of self-directed learning resources, wikis, etc.

        I certainly see the merit in the 'Wu' observation, but it strikes me how robust and resistant the internet is -- has been -- to this erosion.

        • Jedd 2103 days ago
          > Is that a trend?

          It feels like it - but I don't have a scatter plot that I can provide to prove it.

          The Snowden revelations felt like the start of the wave of awareness, at least amongst those with even just a modicum of interest and savvy.

          In AU we tend to follow the trends (no matter how bad they are) set by other regimes. Two recent examples spring to mind. In 2017 a fairly horrific data retention law was passed that requires all ISPs to record and retain (for two years) all user metadata. In 2016 the AU Bureau of Statistics ran the latest census - there was a significant policy change regarding the long-term retention of these PII data, cynically announced a few days before Christmas the previous year (a profoundly quiet time in AU, as most take their summer holidays around then).

          Watching news from the UK and the US - but less so the EU, at least - is discouraging in terms of the continued erosion of privacy and freedom. Separating large institutions such as FB / Google / Microsoft from nation states misses the point - their interests more often align with each other than with their citizenry / customers (products).

          > More generally, for all the 'consumption' on Facebook and Instagram, you also have Arduino and 'maker' culture ...

          I love the Arduino, but I don't think it's going to save us from the activities of bad actors.

          > I certainly see the merit in the 'Wu' observation, but it strikes me how robust and resistant the internet is -- has been -- to this erosion.

          Read the book. There's a pattern of people thinking 'this is how it will always be' and discovering later, too late, that that's not the case.

    • deegles 2104 days ago
      > computers have succeeded in universally capturing our attention

      This seems so ominous to me. It’s the end goal of the “attention economy,” right?

    • twic 2103 days ago
      > It would be interesting to ask whether or not Ted Nelson's vision for personal computing has been achieved.

      Well:

      I DON'T BUY IN

      The Web isn't hypertext, it's DECORATED DIRECTORIES!

      What we have instead is the vacuous victory of typesetters over authors, and the most trivial form of hypertext that could have been imagined.

      http://essaysfromexodus.scripting.com/tedNelsonWebHypertext

    • KirinDave 2104 days ago
      > In fact, this latter concern of manipulation begs the question of what people really need computers for, and brings into question Ted Nelson's presumed answer to it: i.e., that people have an inherent need to use computers to facilitate their own creativity. But what about the possibility that the majority of people prefer to be passive consumers? Neil Postman warned that we would become a trivial society, rife with distraction, like in Huxley's Brave New World.

      We're in a novel time to be introspective about the use of computers because in a very real sense, accessible information systems that treat all information equally have brought about an information overload apocalypse. With accessibility and creativity everywhere, we're starting to see what happens when any group of people can assemble social campaigns (bot networks spreading lies in social media frameworks) and achieve parity with the former "information priesthood" of centralized media.

      It's so easy to make and share information that people are inundated with the material of ethos (with the lies it cherishes and the truths followers cling to), language, and creed. While other culture-shock events have occurred in history, the fundamental difference in the recent decade has been the lack of any central actor for legacy power systems to coerce or stomp out. And because of this, we're having to find ways to cope at the individual level with an excess of information, including unwanted information that we don't even realize we're consuming.

      I'm not sure Nelson or even Huxley truly foresaw this. To someone who has not grown up in a sea of information, with algorithms helpfully (and even desperately) shoving information into our faces to try and satisfy our needs before we realize we have them, it's difficult to describe the struggle not to be manipulated by subtle bias. Prior to the information saturation we enjoy now, attempts to totally control information tended to look a lot more like North Korea's press. Aspects of this even exist in less centralized media, such as when and where the western press decides to "tell both sides of the story" as opposed to simply reporting a position as fact.

      Counter to Huxley's rather puritan vision of us medicating and fornicating into irrelevance (which we've always done, let's be real), the idea that a minority viewpoint (e.g., "America should be a racially motivated literal monarchy") can be given the optics of parity with ideas actually held by tens of millions of people seems to have energized people. Folks see a venue for increased social importance and control, and they're seizing it.

  • Jedd 2105 days ago
    While the host - Max Allen - is certainly a bit obtuse and doesn't share the same technical insights & visions as Ted Nelson (hardly surprising, given they had significantly different experience and careers), he's respectful of his guest (doesn't interrupt or talk over him), concedes various points willingly, appears to genuinely want to understand, and actually enjoys the discussion.

    The nostalgia sensation isn't just around Ted's prescience.

    • taberiand 2105 days ago
      His viewpoint is also rather reasonable considering the context of the interview, at the dawn of the information age. It's a little depressing, however, that it's not uncommon to still encounter that point of view today.
    • andyidsinga 2105 days ago
      I really appreciate that too - the interview / discussion is very interesting; draws a lot of good stuff out of Ted Nelson.
    • EGreg 2105 days ago
      That’s how interviews used to be.

      FOX News pioneered the interview where the interviewer spends most of the time talking, asking leading questions to the respected person they invited, then cutting them off, shouting them down and attacking them for being socialist or whatever. And on to the next.

      It’s more like watching gladiator fights than really learning people’s answers to questions.

  • DonHopkins 2105 days ago
    I love his criticism of IBM (and the literary adjective that perfectly describes IBM that I learned today):

    Q: Tell me about IBM?

    A: What would you like to know? OMINOUS LAUGH

    Q: Well, I'd like to know what's wrong with them, to start with, since that's what you want to talk about.

    A: Ok. IBM is first and foremost a very slick sales organization, which was created in the image of Thomas J. Watson, a supreme despot, and very imaginative salesman who managed to create an organization with less fat in -- pardon me -- less local fat than any other corporation that ever happened.

    And one that has over the years learned to devote teams to getting things done. That's the positive side. Get these things done with dispatch and with earnestness.

    Now whether they're done the way one would like to see them done if one contemplated the real nature of a problem, this is an entirely different matter. And critics of such things as IBM's 360 and 370 computers would say that they were Brobdingnagian, clumsy and surrounded by unnecessary difficulties.

    And this of course is why many people, like the kids who showed me around the University of Waterloo this morning, far prefer to use systems like Digital Equipment machines, which are much more accessible to the sophisticated user.

    Anyway, the question is why has IBM prevailed in its way, and the answer is that they have a sort of monopoly. And one which obviously has a political side and a technical side. And the problem is now that as they are swinging their new communication system into place, it seems increasingly likely that this communication system is more built to maintain the monopoly than it is built to satisfy the needs of what people ought to have.

    Q: What communication system?

    A: Oh there's something called Satellite Business Systems, which IBM and Aetna Life Insurance and I think a few other partners have created in a joint venture.

    • dluan 2105 days ago
      The following short segment is great.

      "What's swinging into place Max, is that we have great communication networks now coming about for the transmission of digital information.

      Now by digital we just mean symbols. There's this mistake that digital means in numbers. That's wrong. The two commonest programming languages are musical notation and knitting instructions.

      That's not fu... why are you laughing."

      Knitting was really big back then.
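
      He's not wrong that knitting instructions are a program, either; the standard notation is regular enough that a few lines of Python (a toy, obviously) can execute a row of it:

        import re

        def expand(row):
            # "k2, p2" = knit two stitches, then purl two: each token is
            # an operation plus an optional repeat count
            stitches = []
            for op, count in re.findall(r"([kp])(\d*)", row.lower()):
                stitches.append(op.upper() * int(count or 1))
            return "".join(stitches)

        print(expand("k2, p2, k2, p2"))   # KKPPKKPP: a row of 2x2 ribbing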

      • fapjacks 2105 days ago
        Knitting is really big still. One of the biggest lobbies for copyright and IP legislation is (no kidding) the industry that creates patterns for sewing and knitting. Put out feelers on social media and see how many of your associates knit without your having known about it. It was surprising to me for some reason to find out how pervasive it is.
        • tomatotomato37 2105 days ago
          Wait, by sewing pattern industry, do you mean the fashion industry?
          • fapjacks 2104 days ago
            Haha, that's a great question, but no, there is actually a different set of companies selling patterns for sewing and knitting (traditionally to their target demographic of older ladies). These are companies that typically sell patterns to places like Hobby Lobby or Michaels, and not companies like Old Navy and Forever21 and the like.

            In that industry, patterns are sort of like patents in a way, and these companies have amassed huge libraries of patterns and aggressively defend them with litigation. I'll dig a bit to see if I can find information about the specific events I'm thinking of, but the gist of it was that alongside Napster, I believe, one of the biggest and earliest instances of copyright/IP litigation was against little old ladies who were using P2P to fileshare sewing and knitting patterns. This was before the RIAA started suing customers, and I believe it actually partly served as inspiration for the RIAA stupidly choosing to go that route. It's been some years since I've read the story, so I'm likely not remembering some details correctly.

            Edit: I can't find a link to the story I'm thinking of, but you can actually find some side effects of this litigation, for example this "copyright notice" (especially read the actual PDF, as this notice was pretty clearly issued to protect the interests of these pattern companies I'm talking about): https://www.gov.uk/government/publications/copyright-notice-...
      • EGreg 2105 days ago
        He forgot recipes :)
    • thechao 2105 days ago
      Brobdingnagian is the opposite of Lilliputian -- from Gulliver's Travels. It means stupendously gargantuan.
    • JetSpiegel 2105 days ago
      Brobdingnag is the "Gulliver's Travels"[1] country where everyone is a giant. Famous for having a breastfeeding woman horrifying the narrator, and him sword fighting a rat.

      [1]: https://en.wikipedia.org/wiki/Gulliver%27s_Travels#Part_II:_...

      • DonHopkins 2105 days ago
        whoosh -- That's the second literary reference that went over my head when transcribing an interesting Ted Nelson video. I originally guessed he was saying that "Franklin Brouwer Zeus" invented the internet, when he actually said people thought "it sprang from the brow of Zeus". Here is the corrected version: ;)

        https://www.youtube.com/watch?v=edZgkNoLdAM

        When we look at the past of computerdom, it's through a lens that's peculiar, because things have changed so much so fast.

        That to me the 50 years since I've been in the computer field have gone so quickly that the past seems ever-present.

        In the 60's and 70's, a lot of young people started communes. And it was a combination of free love, which is a term you don't hear any more because it's taken for granted, and pot, and LSD, and idealism, and hope for a new kind of economy.

        And that spirit of that age leaked into the computer world. There was a sense of possibility at the beginning that is different because we thought computing would be artisanal.

        We did not imagine great monopolies. We thought the citizen programmer would be the leader.

        When I say we I mean I, but of course I had a sense that I was sharing this with a lot of people.

        We had visions of democratization, of citizen participation, to create vistas of possibility for artistic expression, and artistic expression in software. And software is an art form, though not generally recognized as such.

        And because of Moore's Law, which has been stated to me not as Moore's Law, but just as a general principle: things will get faster and cheaper. We will be able to afford it. Right now a computer with a screen is $35,000. Tomorrow, who knows. It will be $100 some day.

        Now is the time to start thinking about what will be the documents of the future.

        As I would abstract it now, the two concepts were:

        We can have parallel connections between visible documents. So you can have two pages with a connection saying "this sentence is connected to that paragraph" and see it as a visible strap or bridge.

        And you can't do that yet. So that was one of my hypertext concepts.

        And the other hypertext concept was being able to click on something and jump to it.

        So as the hypertext concept developed and deteriorated over the years, only the jump link became popular in the hypertext systems of the 60's and the 70's, and then Tim Berners Lee created the World Wide Web, which was the sixth or seventh hypertext system on the internet.

        People think it sprang from the brow of Zeus, in fact it was just a clean job that had the clout of CERN behind it. How to see the possibilities when there are so many things around you that are a certain way?

        I don't know. The future is an unknown place. There are a lot of scary things about it.

        What aspects you are going to approach? Are you going to go on thinking about leisure, or about the terrible problems that confront the world?

        All I can say is: "Close your eyes, and think what might me."

        My first software designs were largely done with my eyes closed. Thinking: "Now if I hit that key, what should happen? If I hit this key, what should happen?"

        I was able to imagine -- they say this can't be done, but when my interfaces were built, they always felt the way I knew they would.

        And the people at Xerox PARC said "That's never possible. You never know how it's going to be." But I did.
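
        (As an aside, the two link concepts he distinguishes are small enough to model directly. A toy sketch in Python; all of the names are mine, not Xanadu's:)

          from dataclasses import dataclass

          @dataclass
          class Span:
              doc: str         # document id
              start: int       # character offsets into that document
              end: int

          @dataclass
          class JumpLink:      # what the web gives us: one-way, from a
              source: Span     # spot in one page...
              target_doc: str  # ...to a whole other page

          @dataclass
          class Bridge:        # the parallel connection: span to span,
              left: Span       # both ends first-class, so a viewer can
              right: Span      # draw the visible "strap" between pages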

        • TheTedNelson 2101 days ago
          Thanks for the transcription, but please change one word--

          >All I can say is: "Close your eyes, and think what might me."

          That last word should be "be".

          (Apparently I also created a Ycombinator account under the name "Ted Nelson", but they won't tell me the password.)

  • pmoriarty 2105 days ago
    This reminds me of trying to convince a FidoNet BBS sysop in the late 80's to try the Internet. He adamantly refused, saying FidoNet would be all he'd ever need.

    It also reminds me of arguing in the 80's with an IBM PC user that the Amiga's 4096 colors were desirable. He insisted that 16 colors were all he'd ever need.

    I also tried to turn my dad on to the Internet and Usenet newsgroups in the late 80's. That failed too. He just wasn't interested.

    I guess I'm just not a very good salesman.

  • unimpressive 2105 days ago
    The funny thing is that early on in this interview the host tells Nelson that he's the only one saying anything about using computers to store and retrieve text. But that's simply not true, certainly not in 1979. For example, John McCarthy had been describing what largely came to be our digital literary future as early as 1970:

    https://news.ycombinator.com/item?id=10370990

    Even within the contemporary environment, a significant fraction of computer use was not 'number crunching' but the sorting, retrieval, and printing of essentially textual information (i.e., records). Records are tiny documents, which to me makes the early portion, where Ted accepts the host's point without argument, a large missed opportunity to get the point across.

    https://www.youtube.com/watch?v=HMYiktO0D64

    • 8bitsrule 2105 days ago
      > 1979

      Even 10 years later. In the mid-80s (by which time BYTE magazine was as big as a Sears catalog) I wrote a piece on the approach of the PC era and offered it to a smaller (100K) town's newspaper editor. His response: 'That's kind of a nitch product isn't it?'

      The bigger newspapers were already moving completely to computers, but such facts hadn't really reached guys like him.

  • shagie 2105 days ago
    Started listening, ok, sounds like an interesting person. Check the Wikipedia page on him...

    > Theodor Holm "Ted" Nelson (born June 17, 1937) is an American pioneer of information technology, philosopher, and sociologist. He coined the terms hypertext and hypermedia in 1963 and published them in 1965.[1] Nelson coined the terms transclusion,[1] virtuality,[2] and intertwingularity (in Literary Machines), and teledildonics[3].

    tele... is that what I think it is? Click link. Yep.

    The interview is really interesting, as is how much he got right about the past 30 years.

  • miguelrochefort 2105 days ago
    This is how I feel about computers today. They're a mess. The fragmentation of hardware, operating systems, apps, websites, programming languages and frameworks is horrible.

    I frequently ask people how they feel about the current state of software. Most people think it's fine, they don't have a problem with it. They can't imagine how else it could be. This drives me mad.

    I have 100 apps on my phone. None of them talk to each other. Every new appliance, restaurant and event has its own app. These 100 apps will quickly turn into 1000. Clearly, the application paradigm doesn't scale. Where is it going to end?

    We're still emulating the physical world in software. File systems are still trees. Programming is still done with text. Paragraphs are still copy-pasted, rather than dynamically embedded. We've barely made any progress since that radio interview. Xanadu is still vaporware.

    What will it take for a software revolution to take place?
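
    Dynamic embedding in particular (Nelson called it transclusion) is conceptually tiny. A toy sketch in Python, with invented names: documents hold references into a shared pool instead of copies, so an edit to the original shows up everywhere it's quoted:

      pool = {
          "sn1": "Everything is deeply intertwingled.",
          "sn2": "There are no subjects at all; there is only all knowledge.",
      }

      doc_a = ["sn1", "sn2"]   # an essay assembled from pooled passages
      doc_b = ["sn2"]          # a second document transcluding one of them

      def render(doc):
          # a document is rendered by pulling its passages from the pool
          return " ".join(pool[ref] for ref in doc)

      pool["sn2"] = pool["sn2"].upper()   # edit the shared passage once...
      print(render(doc_a))                # ...and every document that
      print(render(doc_b))                # transcludes it sees the change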

    • bachbach 2105 days ago
      Something like Urbit probably.

      Those sorts of ideas sound improbable - it's hard to say whether it takes 10 years or even a different civilization. The Romans never managed to pass land reforms.

      The Chinese tech sector has gotten around some of it with the assistance of the government - there are glimpses of a different possible future, but without a full-stack revolution it's hard to imagine new types of institutions being created.

      I'm really not sure Silicon Valley is able to pivot out of the tar pit it has itself created.

      The closest sign of real progress would be if somebody did something that destroyed a ton of Silicon Valley jobs.

      I know what people are thinking here: AI. I don't think so - as of yet I haven't seen any evidence for autonomous executive complex coordination. It's possible for a machine to do it - that I believe - but it doesn't follow that computers are going to be the machines that do it.

    • mrkstu 2105 days ago
      OLE embedding and OpenDoc utterly failing has been one of my great frustrations in the computer field. I should be able to have a universal document format with intelligent encapsulated data that I can manipulate in flexible ways.

      Excel is almost there as a standalone island, but I should be able to embed Excel's intelligence in all my documents.

      • rsync 2105 days ago
        "OLE embedding and OpenDoc utterly failing has been one of my great frustrations in the computer field. I should be able to have a universal document format with intelligent encapsulated data that I can manipulate in flexible ways."

        I think we have had this in the form of the UNIX shell environment and various command line primitives like cat/awk/sed/grep/wc/strings.

        It isn't sexy and it certainly isn't a WIMP[1] paradigm like Windows (which I associate exclusively with OLE) but if you can work with ASCII text (and you should...) then I think you have that environment and those abilities.

        [1] https://en.wikipedia.org/wiki/WIMP_(computing)
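
        Concretely, what plain text buys you is free composition: any filter's output is any other filter's input. A toy version of that shell discipline in Python (the filters are stand-ins, not any real API):

          from functools import reduce

          def grep(pattern):          # like grep: keep matching lines
              return lambda lines: [l for l in lines if pattern in l]

          def count(lines):           # like wc -l: reduce to a line count
              return [str(len(lines))]

          def pipeline(*stages):      # like the shell's |
              return lambda lines: reduce(lambda acc, f: f(acc), stages, lines)

          doc = ["alpha one", "beta two", "alpha three"]
          print(pipeline(grep("alpha"), count)(doc))   # ['2']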

        • perl4ever 2105 days ago
          I'm not saying that paradigm isn't useful, but it seems like you're saying 70s technology is still good enough. But shouldn't we have come up with some improvement in the last 40 years? Really, all we've done is regained the abilities of big computers from long ago in modern microcomputers.

          I was just playing around with Selenium, and it was neat what you could do with it, but it was kind of depressing that it is basically a project to do for one application (or group of applications) what people used to have a vision for doing with all applications back in the 1990s (with AppleScript for instance).

          We keep redoing things we've done before in slightly different ways, scaling down the grand visions and retrying parts of them.

    • sverige 2105 days ago
      AI is going to invent that revolutionary software, but unfortunately you and I won't understand it or be able to manipulate it. Be careful what you wish for.
      • calebh 2104 days ago
        Imo gradient descent is not going to get us to general AI. I'm a programming languages theory person, and I also know a fair bit about machine learning. The current problem with neural networks is that they suck at processing variable-length data, and they have problems remembering the past. Programs can be represented by trees, graphs, or text. Deep learning isn't great at dealing with graphs or trees, and it's not that good at text either.

        The other issue is that deep learning works great for recognizing common patterns, but it sucks when faced with novel situations. So I don't think that we're going to have programs that program anytime soon. The first applications of AI to programming will probably be with programming assistants or with AI guided proof solvers. Programmer productivity will improve, but we're not going to see everyone losing their jobs.

        • nmca 2104 days ago
          The latter of those two points is much more valid than the former. Your comment about generalisation difficulty is pretty accurate, but as for "sucks at variable-length data", I think neural machine translation [0] and the fact that schemes including RNNs just won M4 [1] indicate that this is incorrect. Your point about remembering the past is true (it's hard), but people are actively working on it. Unitary neural nets and the fast/slow weight paradigm are very different angles that seem promising.

          As for handling trees and graphs, this actually works very well. Thomas Kipf is pushing this area forward; GATs [2] are a nice random example of how dominant differentiable programming (e.g. NNs) can be here. Unfortunately these graph approaches don't parallelise as nicely on GPUs as CNNs.

          Your predictions (assistants) seem likely to me.

          [0] https://arxiv.org/abs/1804.09849

          [1] https://www.m4.unic.ac.cy

          [2] https://arxiv.org/abs/1710.10903

      • miguelrochefort 2105 days ago
        Is it coming soon? I'm not sure I can wait 5 more years.
        • sverige 2105 days ago
          Oh, well, probably not 5 years. More like 20 years away for human-level AI, according to everyone I've read since the 80s. Fusion power will be coming online right around then, too, so that's good. And the true value of the blockchain will finally be evident in about 20 years, too. Which is really good, because that's around the time we are likely to discover the secrets of immortality.
  • andyidsinga 2105 days ago
    this! Interviewer: "I don't understand what the necessity of an electronic device called a computer, or anything else like that - a computer screen to use your word - is in making people smarter. I don't think another gadget is going to have any effect at all"

    ... I can't tell you how many times I've heard this gadget/toy statement since I first got a computer when I was around 10.

    On the flip side - there are so many people with open minds willing to give things a shot (and often a not inexpensive one at that). Thanks for giving me those computers, mom - and for enjoying watching what happened :)

    (edit : typos)

  • thsowers 2105 days ago
    This is a great listen. The interviewer doesn't seem to understand that there are hard limits to humans' ability to search, compute, organize, notice patterns, etc.
  • samueloph 2105 days ago
    Ok, so I didn't know Ted Nelson. From his Wikipedia page: "Theodor Holm "Ted" Nelson (born June 17, 1937) is an American pioneer of information technology, philosopher, and sociologist. He coined the terms hypertext and hypermedia in 1963 and published them in 1965."

    I'm impressed.

  • Rapzid 2104 days ago
    Really great interview. Ideas should be challenged, and the value we, the audience, get from the responses to those challenges is great. There's not enough of this in the mainstream today.
  • EthanHeilman 2104 days ago
    Ted Nelson is continuing a thread of argument first put forth by Ada Lovelace:

    "Again, it might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine. Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent."

    Sketch of The Analytical Engine Invented by Charles Babbage By L. F. MENABREA of Turin, Officer of the Military Engineers With notes upon the Memoir by the Translator ADA AUGUSTA, COUNTESS OF LOVELACE https://www.fourmilab.ch/babbage/sketch.html

  • cm2187 2104 days ago
    To be honest I wouldn't be convinced myself that I would need a personal computer to better index my pieces of text at home.

    Searching through un-indexed text would have been a far more powerful argument. I remember reading the memoir of a DGSE analyst (the DGSE is the French equivalent of the CIA). As always, the reality of secret services is far less sexy than their popular image, and the book has an entire chapter about the importance of good (pre-computer era) archives about everything: every conversation any analyst had with anyone, any book or article written by any public figure, any time someone is mentioned in any anecdote, etc. And how the essence of being a good analyst is to know these archives inside out and how to quickly browse through them when looking for what is known about someone or a topic. On a personal computer that would have been a simple CTRL-F.

    • Doxin 2103 days ago
      To be fair, that "un-indexed text" is still indexed. The indexing process is just hidden from the user and fully automated.
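
      Right -- and the reason it can be hidden is that so little is involved. A toy inverted index in Python (illustrative names only):

        from collections import defaultdict

        def build_index(docs):
            # map each word to the set of documents containing it
            index = defaultdict(set)
            for doc_id, text in docs.items():
                for word in text.lower().split():
                    index[word].add(doc_id)
            return index

        archive = {
            "memo-1": "conversation with an analyst about the archives",
            "memo-2": "a public figure mentioned in an anecdote",
        }
        index = build_index(archive)
        print(index["analyst"])   # {'memo-1'}: one dict lookup, not a scan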
  • KirinDave 2104 days ago
    Wow, it's interesting that Ted says they used a computer "during the war" for trajectory calculations. I wonder if he believes that, or if he's simply avoiding a very complex conversation about information theory and cryptography that in the 70s was both less well-understood and less popular.

    I wonder if instead he's referring to some other specific event that in the subsequent 50 years we've decided is insignificant to the formation of computers.

    • ipsin 2104 days ago
      They used computers during WW II for rangekeeping [1]. They were complicated and heavy electromechanical beasts.

      [1] https://en.wikipedia.org/wiki/Mark_I_Fire_Control_Computer

      • KirinDave 2104 days ago
        Right, but this was NOT the most historically significant application of computers in WW2, by a long shot.

        That's why I'm wondering if this is a visible example of how history tends to redefine significance and refocus on different events over time.

        • stan_rogers 2104 days ago
          Calculating trajectory tables for artillery was ENIAC's primary job as well.
    • hchasestevens 2104 days ago
      One consideration is that much of Bletchley Park's work was kept hidden from the public until 1974 [1] - it's possible that in 1979 the computer's role in breaking German cyphers was still not widespread knowledge.

      [1] https://en.wikipedia.org/wiki/Ultra#Postwar_disclosures

    • TheTedNelson 2101 days ago
      Bletchley Park was publicly unknown at the time of the interview. Whereas Eckert and Mauchly, with ENIAC, had been computing those tables. (That was a secret too, but it was no longer a secret when I knew him in his last years.)
  • miguelrochefort 2105 days ago
    We're still very far from the connected and unified future Ted Nelson had in mind, yet I can't seem to find many people interested in solving that.

    Where are these people?

    • dustingetz 2105 days ago
      I'm in that space, link in my profile (not that it is recognizable as a hyperdata system yet, nonetheless it is). Why do you ask – are you?
  • xamuel 2104 days ago
    Interesting how they mention the coming war between the "computer centers" and the personal computer. Looking back, we see that's a war that's been fought time and time again, and now we're fighting it all over again with the whole Cloud thing. Next week Cloud'll be out and desktops back in.
  • braindead_in 2104 days ago
    Here's a transcript, if you want to skim through it.

    https://scribie.com/transcript/290d2d5d863e4353b3843816aade9...

  • Kaius 2102 days ago
    "It is possible to insist that every changes is merely a small change in degree, rather than a change in kind."

    Good quote from Ted.

  • jonahx 2105 days ago
    The vicarious frustration in this listen was deep.

    It's not only a fascinating time-capsule, but a palpable reminder of how revolutionary ideas can be received. The clarity of Ted's thought and vision here is striking even by today's standards, and despite his patience and articulate explanations he might as well be talking to a brick wall.

    The Swift quote came to mind:

    > When a true genius appears in the world, you may know him by this sign, that the dunces are all in confederacy against him.

    • booleandilemma 2105 days ago
      “Don't worry about people stealing your ideas. If your ideas are any good, you'll have to ram them down people's throats”

      - Howard H. Aiken

    • abecedarius 2105 days ago
      • rocky1138 2105 days ago
        > When people ask me about my life’s ambitions, I often joke that my goal is to become independently wealthy so that I can afford to get some work done. Mainly that’s about being able to do things without having to explain them first, so that the finished product can be the explanation. I think this will be a major labor saving improvement.

        Honestly this is so good. I feel exactly the same.

      • miguelrochefort 2105 days ago
        I spent the last 10 years trying to tell people my vision. Nobody is getting it.

        Now I understand why. I wish someone told me this 10 years ago.

        Thank you.

    • DanBC 2104 days ago
      It's a radio interview, so he may be talking to a brick wall in the studio, but it's likely that a bunch of people listening suddenly "got it" from this interview: computers aren't about manipulating numbers, but are general-purpose machines that manipulate symbols.
  • ehnto 2104 days ago
    That title had me thinking he was trying to unthink an epiphany the radio presenter had enlightened him with.
  • andyidsinga 2104 days ago
    I'm listening to this for the second time - I think there will be more.
  • redwood 2105 days ago
    Shadow IT goes back a long way!
  • BenjiWiebe 2105 days ago
    "Remarkable that humanity survived without [a computer] all this time." -sarcastic

    "Yes it is!" -serious

    • dqpb 2105 days ago
      I found it striking that the majority of Max Allen's questions/arguments were along the lines of "But isn't everything fine the way it is?"
      • sverige 2105 days ago
        He wasn't wrong, either. Having lived the first half of my life without a computer at home and the second half with more and more of them, the only real improvement I can point to is that I can transfer funds from my account to my wife's at any time. The rest isn't that great. Probably hard to understand if you never lived without them.
        • mwcampbell 2104 days ago
          That's overly pessimistic. Isn't it awesome that computers have enabled worldwide communities like this one? Asynchronous communication through text, enabled by computers, is also a great equalizer for people with disabilities and other differences. Blind people, deaf people, quadriplegics, people with speech impediments (e.g. stuttering), non-native speakers of the language, etc. can all communicate and work together, without even being aware of each other's difficulties.

          To be fair, all of that was enabled by the earliest BBSes, and I guess we haven't made as much progress in the meantime as some people (like Engelbart and Nelson) hoped.

          • sverige 2104 days ago
            You are correct about asynchronous communication, of course. That is a great thing, and I should have included IRC / texting on the plus side.

            The worldwide communities? I enjoy them, but it seems to me that they are one of the fundamental causes of the increasing fragmentation of society. Hannah Arendt described this as an 'atomized' society. Totalitarian regimes have flourished in such environments.

            In the old days, if you didn't fit in, you had to make an effort to find common ground with others physically near you to enjoy a social life. This led to a lot of serendipity and built social cohesion. Nowadays, it's a lot easier to hate your neighbor who disagrees with you politically or culturally, because there's always a virtual means of socializing just a click away.

            Nelson was a visionary to coin 'virtuality', but I think he and many others have been far too optimistic about the end results of that ongoing experiment. In a society that has become 'atomized', it is far easier than commonly believed to manipulate people into committing acts of physical violence against their neighbors, and virtual communities are a terrific means of creating an atomized society.

  • BenjiWiebe 2105 days ago
    Hilarious. The poor interviewer just can't comprehend at all what a computer could be used for. Also, Ted Nelson was pretty good at predicting that a computer would be used as a writing medium, a filing cabinet, a musical instrument, and a few more things I don't remember specifically. As I was listening I was just silently agreeing that yep, we have that, and that, and that already.
    • hindsightbias 2104 days ago
      I wonder if HN readers would have gotten it in the context of 1979.

      It took nearly three years for someone to wonder what would happen if you connected something like HyperCard to a network. And it wasn't Bill Atkinson.

    • hobls 2105 days ago
      Even hit on what we could now call “smart home” devices. He managed to come up with a lot of examples to try to illustrate to the host what computers would be used for!
  • putlake 2105 days ago
    Bitcoin today is where computers were in 1979. I'm a little embarrassed to admit it, but I feel like the short-sighted interviewer who fails to see the applications of the technology.
    • pavs 2105 days ago
      Are you saying that bitcoin has the potential to revolutionize the world economy and create trillions of dollars worth of industry across many fields? Create hundreds of millions of jobs? Create new forms of art, new ways of communicating, help find cures for illnesses, help advance the human condition in hundreds of ways that I can't even comprehend at 5 in the morning, before my coffee?

      No, I don't think bitcoin has any such potential. Yes, some people made quick bucks at the expense of other gullible people losing quick bucks. In the realm of human history, bitcoin/altcoins will be considered a mildly interesting idea: a lot of noise but eventually a major dud. No sane and major government will give up central control of currency. And if bitcoin eventually ends up being centrally controlled (if not banned first), that defeats the whole point of having a cryptocurrency.

      It's an extremely environmentally inefficient way to screw people over, so please stop propping up cryptocurrency. Let it die.

      Bitcoin isn't rocket science; a lot of people understand what it is and its potential benefits. It's just that people are confused about why the hell we need this in the first place.

    • kevinpet 2105 days ago
      If you looked back to 1979 you'd find a dozen other technologies that didn't pan out.
    • inteleng 2105 days ago
      Just because something new-ish exists doesn't mean it's the root of something society-changing. Bitcoin might be a steam engine.
      • davidgay 2105 days ago
        > Bitcoin might be a steam engine.

        I think it's fairly safe to say that steam engines revolutionised society (the industrial revolution and all that), arguably more so than the engines that replaced them.

        • inteleng 2104 days ago
          Sure. But they never worked on airplanes, and they're only a distant ancestor to the jet and rocket engine. You don't see people investing in steam engines these days.
      • shoo 2105 days ago
        > Just because something new-ish exists doesn't mean it's the root of something society-changing. Bitcoin might be a steam engine.

        this is a great comment; I can interpret it in at least two entirely different ways.

        by "might be a steam engine" do you mean that bitcoin might be one of the key drivers of a major process of economic/industrial revolution?

        • inteleng 2104 days ago
          I mean it might be completely obsolete in five years. It has caused a great deal of innovation, even if much of that doesn't come to fruition.
      • rinze 2105 days ago
        Bitcoins are tulips.
        • pinewurst 2105 days ago
          Or Cabbage Patch dolls
          • inteleng 2104 days ago
            Cabbage Patch Kids and tulips wear out and disintegrate like any other organic matter. We will be lucky if bitcoin wallets from a decade ago are able to be cracked a thousand years from now.
        • inteleng 2104 days ago
          How original. I imagine you can show how the tulip craze fomented great innovation in the floristry field, and spawned a few actually good products despite being inherently misguided?