Clojure – Fascinated, Disappointed, Astonished (2016)

(lewandowski.io)

138 points | by systems 1552 days ago

8 comments

  • dwohnitmok 1552 days ago
    Just so people don't get the wrong idea, AFAICT STM isn't used that much in production Clojure (unlike, say, Haskell). This I chalk up to the lack of a couple of crucial operators in Clojure's STM. Instead, the usual method for dealing with concurrency is atoms. Note that once you get familiar with how to use atoms to structure concurrency, the lesson is widely applicable. You can, for example, write essentially the same code in Java with an AtomicReference (which is how atoms are implemented anyway), as long as you take care to only put immutable data structures in the ref.
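
    For illustration, a rough sketch of the atom pattern (names like app-state and add-user are made up):

      (def app-state (atom {:users #{}}))   ; immutable map held in an atom

      (defn add-user [username]
        ;; swap! applies the function atomically via compare-and-swap,
        ;; retrying if another thread updated the atom in the meantime
        (swap! app-state update :users conj username))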

    I would personally sum up Clojure's main impact on my own programming in two ways.

    1. 90% of the time you just need immutable maps, sets, vectors, and primitives. Fancy custom classes and custom higher order functions usually aren't needed.

    2. Wishing more things had on-the-fly editing of a running program.

    • arohner 1551 days ago
      In my experience, STM isn't used because it's very rare to have situations where you need STM but don't need durability. In most production situations, you need to record state changes in a DB or an event log. In that case, the source of truth is now the DB, and your in-memory values are immutable snapshots.

      The other reason is that Clojure code is usually well-factored, and most clojurians have listened to Rich and simplified their programs. Not many problems require STM transactions across multiple refs. If you don't have multiple refs changing transactionally, you can use atoms instead for a performance benefit.

  • bcrosby95 1552 days ago
    It's missing my favorite concurrency construct: immutable, persistent data structures. I was initially interested in Clojure for things like STM, but found most of my concurrency problems are amenable to immutable data. You can certainly use this technique in other languages, but it tends to be higher effort.
    • Scarbutt 1552 days ago
      Once javascript gets immutable types and better handling of objects, the gap between it and clojure will be very small for most daily tasks.

      https://github.com/tc39/proposal-record-tuple

      https://github.com/tc39/proposal-object-iteration

      • adamkl 1552 days ago
        The issue that I find with JavaScript (which I use on a daily basis) is that it has become a "kitchen-sink" language. Every time it gets new additions to "close the gap" with some other language, it gains more ways to do the same things.

        A big thing with Clojure is that it tries to provide a single, idiomatic way of doing things (though overuse of macros may cloud this picture at times). Generally, you can hop into any Clojure codebase and very quickly understand what's what.

        Since JavaScript can do just about anything (functional, imperative, object-oriented, prototypical), there's no one "true" way of writing code, so while it's possible to use the "good parts" of JavaScript, everyone seems to have a different idea of what those are.

        Flexibility can be a virtue, but I find myself preferring Clojure due to the constraints it provides.

        • kace91 1552 days ago
          As you say, there are no objective "good parts" of js; what exists is basically a large language that can act as a framework for you to create a language you like, by leaving only what you want in the subset allowed by your linters.

          That's arguably a lot of work, but the payoff is that you get to use your new hand tailored "language" anywhere js can run, which is not the case if you ignore js and just pick a language you like from the get go instead.

          • adamkl 1551 days ago
            If you are flying solo, then sure, I can see the appeal, but for most(?) of us, that hand-tailored language needs to be learned by everyone you work with, and anyone who might need to work on your codebase in the future. And it’s unlikely that those future coders will feel the same about your hand-tailored language as you do.

            (editing to mention the fact that Clojure targets both servers (via JVM/CLR) and browsers (via ClojureScript) so you lose none of the reach you mention with JavaScript)

        • hinkley 1552 days ago
          Does anything illustrate this more clearly than the picture of the first edition of 'Javascript, the Good Parts' sat next to the O'Reilly Javascript book?
          • zachrose 1551 days ago
            That image was from a decade ago. The O'Reilly book must be three or four times thicker by now.
      • yogthos 1552 days ago
        That is absolutely not the case. There's a huge difference between having immutable constructs available in a language and immutability being the default.

        Clojure is built around the idea of immutability from the ground up. It makes it natural to work with immutable data, and it guides you towards doing the right thing. An example of this would be transients, which allow you to do local mutation within a function, but you will get an error if you try to leak mutable state. All of this greatly reduces mental overhead for the developer because the vast majority of code is referentially transparent. You can reason about functions in isolation without having to know anything about the rest of the application.
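
        As a rough sketch of the transient idiom (illustrative only; add-items is a made-up function):

          (defn add-items [v items]
            ;; transient allows fast, local, in-place mutation; persistent!
            ;; returns an immutable vector, and touching the transient after
            ;; that throws an exception, so the mutation cannot leak
            (persistent!
              (reduce conj! (transient v) items)))

          (add-items [1 2] [3 4])   ; => [1 2 3 4]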

        Meanwhile, simply bolting immutable constructs onto JavaScript puts the burden of using them correctly squarely on the developer. It's still perfectly possible to stick a mutable reference into your immutable record, and then pass it around and modify it. The language does nothing to help you there. Furthermore, you have to consider the culture and the ecosystem around the language. Pretty much all Js libraries are written in imperative style, and passing mutable data is the accepted practice.

        And immutability is just one aspect, there are plenty of other differences that are very important for many daily tasks. One major difference is that Clojure is a much smaller and consistent language. Clojure code tends to follow a few accepted patterns that the community settled on. Js is a giant language that's grown organically and acquired tons of quirks and gotchas over the years. It's a minefield compared to Clojure.

        Another day to day benefit of using Clojure is its tooling. You get reliable hot loading, which is pretty much impossible with Js, you have sane library management, minification, code pruning to function level, and code splitting out of the box. These features are especially important when shipping code across the network to the browser.

        You also get a REPL where you can connect your editor to your running application, and interact with it directly from there. Any time you write a function, you're able to run it immediately and see what it does. There's no waiting for your code to compile, or the page to reload, or having to rebuild the state. It's an incredibly tight feedback loop.

        In short, there's no comparison between ClojureScript and Js regardless of what constructs get bolted onto Js in the future.

        • agumonkey 1552 days ago
          that's how I felt using immutable libs in python, it did help but the language still has open doors and I never really stopped worrying about potential side effects
      • fnordsensei 1552 days ago
        It’s not only about having immutable constructs available, but having immutability as default, and pervasive throughout the ecosystem.
      • slifin 1552 days ago
        ClojureScript is a superset of JavaScript, something as simple as writing a loop would very quickly illustrate there is a huge gap between the languages
        • blueberrytea 1552 days ago
          How so? Not disagreeing with the languages being different, but loops are very similar
    • kubanczyk 1552 days ago
      No. Immutable structures still require locking.

      If you are spending 100 ms calculating the new version of the immutable structure, you want to make sure that the nearby thread isn't making its own version. You don't want either of the two results silently overwritten. You want the nearby thread to be sequential - to wait for you and build on your result.

      • dpratt71 1552 days ago
        This is not how persistent data structures are used in practice. There is no "the" new version of a persistent structure. The best analogy I can think of is to compare it to a VCS (e.g. git). There is no need to lock any existing commit in order to create a new commit (which together with prior commits, represents a new version of the code).
        • majormajor 1552 days ago
          I'm not familiar with Clojure, but you can hit conflicts in the Git world, which seems to be what the parent is concerned about. Two of us could be creating some new data based on the last data we had, at time T, and then the other person submits theirs at time T+5, and I submit mine at time T+10. In that case, my change hasn't taken theirs into account.
          • escherize 1552 days ago
            If you can "submit" your change back to the original datastructure then the original datastructure is not immutable, right? Here's a nice explaination about how the persistant immutable datastructures work: https://hypirion.com/musings/understanding-persistent-vector...
            • setr 1552 days ago
              I believe the question is that, if two threads take the same immutable vector, and both make a change to it independently, they'll end up with two new vectors (eg two branches in git); a vector that reflects thread1's change, and a second vector that reflects thread2's change. So now you have a conflict, which requires resolution; git has a human intervene.

              e.g.

                x = [1,2,3]
                Thread1 -> x + [4] => [1,2,3,4]
                Thread2 -> x + [5] => [1,2,3,5]

              But you were expecting [1,2,3,4,5].

              Reality was that you wanted an order to your events, normally enforced by locking, which the immutable vector doesn't seem to help you with; they were both able to update independently, but you actually wanted them to update dependently.

              If you try to use immutable datastructures to avoid locking, then how is conflict resolution handled?

              I think the answer would be that it doesn't help you avoid locking; either you lock & share a single reference to the latest version of your immutable vector, to enforce ordered events, or you define a resolution strategy separately. The immutability aspect just stops you from not having a resolution strategy -- which would always be incorrect

              And if I understand correctly, the ideal scenario for immutable datastructures in concurrent scenarios is when you can define such a merge strategy (and safely give threads their own copy of the datastructure to muddle with, without actually having to copy the entire datastructure)

              • dpratt71 1552 days ago
                You could, as per your example, use locking as part of a resolution/merge strategy to combine the results of two separate computations running on two separate threads. Or you could use some strategy that does not involve locking. Either way, it does not support the original claim I disputed that "Immutable structures still require locking".
                • setr 1552 days ago
                  >Either way, it does not support the original claim I disputed that "Immutable structures still require locking".

                  It does, if you believe locking is the main strategy for handling serialization (in which case, mutable or immutable, you still need to lock), and so... you still need locking. Serialization was the main scenario GP gave.

                  Your original answer didn't resolve the problem either -- fine, you didn't need to lock when adding elements to your immutable structure, but you still haven't reached serialization; you've just pushed the problem back another step.

                  The answer that I believe GP needs to correct his understanding (and, much more importantly, the answer that I'm interested in :-) is: what serialization strategies do immutable data structures enable, if not locking?

                  The other correction GP seems to require is whether serialization is actually that important in general, and whether functional programmers tend to experience otherwise... But I don't care about that answer :-)

                  • Scarbutt 1551 days ago
                    > the answer that I'm interested in :-) is what serialization strategies do immutable data structures enable, if not locking?

                    Depending on your performance goals, Compare-and-Set with retries a la clojure's atom reference construct.

                    https://clojure.org/reference/atoms
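
                    A rough sketch of that, reusing the vector example from upthread (illustrative only):

                      (def x (atom [1 2 3]))

                      ;; swap! retries on conflict, so the two conj calls are
                      ;; serialized and neither update is silently lost
                      (future (swap! x conj 4))
                      (future (swap! x conj 5))

                      ;; once both futures finish, @x is [1 2 3 4 5] or [1 2 3 5 4]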

                  • dpratt71 1551 days ago
                    Oh goodness :) OP has conceded the point, but you're still down to argue on the basis of what a person may or may not believe is the main strategy to handle serialization. I give up. You win, I guess.

                    Regarding your other question(s), I will just add that I answered many similar questions for myself (as well as disabusing myself of a lot of misconceptions) by undertaking to get a basic understanding of Haskell.

              • jfim 1551 days ago
                You're assuming that x+[5] mutates the list instead of returning a new list.

                If you're planning to have multiple threads append items to a list, the immutable way to do it is to have each thread return a list of items to append to the main list, then fold those items into the list.
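
                A minimal sketch of that approach (illustrative; names are made up):

                  ;; each worker computes and returns its own items...
                  (def per-thread-results
                    (pmap (fn [n] [n (* n n)]) (range 4)))

                  ;; ...and a single fold combines them into one vector
                  (reduce into [] per-thread-results)
                  ;; => [0 0 1 1 2 4 3 9]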

                • reitzensteinm 1551 days ago
                  No, parent post is definitely not assuming that.
          • dpratt71 1552 days ago
            In brief, these sorts of conflicts simply do not arise in a fully persistent data structure. You may have a situation where you have a persistent data structure together with one or more mutable references, each to some version of the data structure. Yes, a modification to one of these mutable references would need to be synchronized, but they are separate from the persistent data structure itself.

            Again using git as an example, there is the persistent data structure, aka the commit graph, as well as mutable references to commits, aka branches. A change to what commit a branch references needs to be synchronized.

            • kubanczyk 1552 days ago
              I concede. If you have an algorithm to do the `git merge` equivalent without human help, I guess no locking or STM is needed. Although that's very costly to implement.

              It's great to have git as a mental model in this discussion, really useful.

              • dpratt71 1552 days ago
                Whether the "merge" implementation is costly or complicated very much depends on exactly what it has to do. The git example is pretty much a worst case example in this regard.

                An easier example could be that you have a tree structure that represents a mathematical expression. The evaluation of every node could proceed on its own lightweight thread. The merge strategy would be to simply perform the appropriate operation on the results produced by the threads evaluating the child nodes.
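
                A toy sketch of that idea (my own illustration, assuming expressions shaped like [op & args]):

                  (defn eval-node [node]
                    (if (number? node)
                      node
                      (let [[op & args] node
                            ;; evaluate the children on their own threads
                            ;; (futures stand in for lightweight threads here)
                            children (mapv #(future (eval-node %)) args)]
                        ;; the "merge" step: apply the operator to the child results
                        (apply (case op :+ + :* *) (map deref children)))))

                  (eval-node [:+ 1 [:* 2 3]])   ; => 7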

                • kubanczyk 1551 days ago
                  I don't get the math expressions example at all. One thread modifies one value, so there is no (mutability) problem to solve.

                  You have customers' orders to buy items. One last item remains at your store. You accept one order and update HEAD. You accept another order in parallel and follow to merge. "Merge" here means that you need to return money to the customer and send out an apology e-mail.

                  More cumbersome than locking, isn't it? But possible, yes.

                  • dpratt71 1551 days ago
                    Yeah, you're right, I muddied the waters a bit with that example. I was just trying to think of examples of persistent data structures being used in a concurrent context, but you had in mind a situation where multiple agents could be updating the same data structure independently.

                    Your example is much better. I also was thinking of maintaining an account balance with a log (vs. synchronising updates to a stored amount), but it's not much different from your example.

                    And of course this "eventually consistent" strategy is generally how things happen "at scale", persistent data structures or not.

          • bjoli 1551 days ago
            That would be problematic with mutable data structures as well: you might end up with incorrect ordering or even nonsensical data if you don't use locking.

            Having used immutable data structures and concurrency in non-Clojure languages, I mostly resort to something like Concurrent ML for concurrency. Message passing lets you solve situations like that in more elegant ways.

      • reitzensteinm 1552 days ago
        Clojure generally uses CAS semantics, although it also has transactions that can update multiple refs atomically. So nothing is ever "silently overwritten" and there's no locking required, it's all optimistic concurrency.

        However, to agree with part of your point, an atom being updated with CAS can only ever progress linearly; if you bang 8 threads in a loop trying to update it, you might as well use a lock, because 7/8 of the work is going to be thrown out.

        They should only be used for cases where on average one thread or less is working on updating it; anything more, and it needs to be split, the problem rethought, or some mechanism for not throwing out wasted work implemented.

        It's not a silver bullet.

      • Scarbutt 1552 days ago
        If he's talking about multithreading he probably wanted to say immutable data structures in combination with clojure's atom (CAS semantics).
      • samatman 1552 days ago
        Please read the first paragraph of this page and see if your objection still stands:

        https://clojure.org/about/concurrent_programming

  • nickik 1552 days ago
    A couple of things to point out for new Clojure readers.

    The 'def' form should not be used inside of 'defn'. Local bindings should use 'let'.

    When using the STM, 'ref-set' should not be used most of the time.

      (def names (ref []))

      (defn add-name [name]
        (dosync
          (alter names conj name)))

    The 'alter' function will call 'conj' and the return value will be the new value of 'names'.
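
    As for the first point, a tiny sketch of 'let' for local bindings (full-name is a made-up example):

      (defn full-name [first-name last-name]
        (let [sep " "]                      ; local binding, visible only inside the let
          (str first-name sep last-name)))

      ;; using 'def' here instead would create a namespace-level var as a side effect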

    • bjoli 1551 days ago
      Why should local bindings use let? In scheme internal (define ...) is defined in terms of letrec, which can add some overhead when letrec is unoptimized, but for schemes with optimized letrec (guile 3.0, chez, racket...) (define ...) is just as fast as let or let* unless you start doing cyclic references.

      The only real reason not to use 'def' would be if the bound symbol escapes the scope, but that would be ridiculous for a modern language.

      • nickik 1551 days ago
        In Clojure 'def' creates a 'var' and those are global in the namespace.

        > The only real reason to not use Def would be if the bound symbol escapes the scope, but that would be ridiculous for a modern language.

        It's not 'ridiculous' to have a consistent definition of what something does.

      • pepper_sauce 1549 days ago
        It does exactly what the documentation says it does:

        "Creates and interns or locates a global var with the name of symbol and a namespace of the value of the current namespace (ns)."

    • Scarbutt 1552 days ago
      When using STM, delete that code and start again without STM.
    • holtalanm 1552 days ago
      I was thinking that, too, while reading this. It has been a while since I wrote any Clojure, but I didn't remember ever using `ref-set` with STM.

      Also, local bindings with `let` were the de facto standard.

  • fiddlerwoaroof 1552 days ago
    This post is pretty old and many of the things in the "disappointed" section have been improved quite a bit.
    • goostavos 1552 days ago
      They're also, in classic blog post form, disappointments after a whole two weeks with the language. So the fact that you have to know things like `let`, which means knowing the language, is strangely listed as a disappointment.
    • akavel 1552 days ago
      Ehhh; OP or mods, please add (2016) suffix...
    • vimota 1552 days ago
      Curious about this! Could you elaborate on which?
      • fiddlerwoaroof 1552 days ago
        Many of the points are error-message related and Clojure 1.9 and 1.10 have improved errors quite a bit
      • Plugawy 1552 days ago
        Almost nobody uses refs in production code ;-)
        • AlexCoventry 1552 days ago
          That was the case even before the blog post was written.
  • kotutku 1552 days ago
    > I have to admit, that at the beginning using brackets was not easy for me. Once I’ve realized that the brackets are just on the other side of the function name, everything was simple and I could code very fast.

    A question to fellow Lispers: is there a good reason why LISPs do not put parentheses the way we know from math? Why `(f foo bar)` and not `f(foo bar)`?

    • pdonis 1552 days ago
      > is there a good reason why LISPs do not put parentheses the way we know from math? Why `(f foo bar)` and not `f(foo bar)`?

      Because the first is a list and the second isn't. The whole point of LISP is to represent code as data, and the fundamental data structure it uses is a list.

      • MadWombat 1552 days ago
        well... (1 2 3) is syntactic sugar for (cons 1 (cons 2 (cons 3 nil))). What prevents someone from adding syntactic sugar f (x y) for (f x y)?
        • samatman 1552 days ago
          Absolutely nothing.

          What you're describing is called in Lisp circles "m-expressions".

          s-expressions were supposed to be an implementation detail, and programmers were expected to use m-expressions in the day-to-day.

          No one ever got around to implementing them, though, and if you spend enough time working with Lisp, you'll most likely understand why.

        • pdonis 1552 days ago
          > What prevents someone from adding syntactic sugar f (x y) for (f x y)?

          Because (cons 1 (cons 2 (cons 3 nil))) is a list and f (x y) isn't. You could put parens around the latter to make it (f (x y)), but then you might as well save a pair of parens and just write (f x y).

        • Sharlin 1552 days ago
          Because `f (a b)` already means something else: two separate expressions that just happen to be adjacent.
          • pdonis 1552 days ago
            > `f (a b)` already means something else

            If it's inside a list already, yes. But if it's just an expression all by itself, it's not even well-formed in LISP, since it's not a list.

            • klez 1551 days ago
              Depending on what `a` is, it certainly could mean something. Those are two expressions, where `f` by itself does nothing and, if `a` is a function, it will be applied to `b`. If `a` is a macro it will be transformed, and then depends on what it will be transformed into. If `a` is neither, then it's an error.
        • dunefox 1552 days ago
          The fact that one of these is a list of symbols and the other one isn't. They mean different things.
          • MadWombat 1552 days ago
            > "They mean different things"

            For fuck's sake, these things don't mean anything by themselves. The only reason they mean different things is because the syntax is defined this way. So back to the original question, why is the syntax defined this way?

            • gsk22 1552 days ago
              Your proposed syntax becomes ambiguous when you are more than one level deep. What would be the meaning of an expression like f (g (x y)) ?

              Is it (f (g x y)) or (f g (x y))?

              • MadWombat 1552 days ago
                Hmm... I see. (x y) is a list, g (x y) is a function call, f (...) is another function call, so f (g (x y)) is (f (g x y)) in regular LISP syntax. The syntax is parsed from innermost to outermost, so where is the ambiguity?
                • filoeleven 1551 days ago
                  (x y) is also a function call, because lisp always interprets the first position of a list as such unless you quote it:

                    f (g '(x y))
                  
                  But that changes the meaning of g from “a fn with two args” to “a fn with one arg that is a list.”

                  If your meaning is instead that function-outside notation cannot be mixed with s-exps, I.e. you have to choose one mode or the other, then I think you’re right that there’s no ambiguity here (and no need to quote the args above). I suspect you would lose the power of macros, and may not be able to use existing ones, but I don’t write them so I’m not sure.

                  Below are some Clojure expressions with standard and then function-outside versions. I quickly found that I wanted to go back to attached() parens() for those, because that seems more readable to my eyes than having () them () spaced out. I’d never choose this option though, it took all of an afternoon to get used to lisp syntax, and I find it to be more readable now than c-style languages.

                    (let [a 1 b 2] (+ a b))
                    let ([a 1 b 2] + (a b))
                    let( [a 1 b 2] +(a b))
                  
                    (apply + (range 10))
                    apply(+ range(10))
                  
                    (defn my-fn [x] (* x x))
                    defn (my-fn [x] *(x x))
                  • MadWombat 1549 days ago
                    Eh... you are still stuck with LISP. We are talking about an alternate syntax where (x y) is not a function call.
            • lispm 1549 days ago
              > why is the syntax defined this way?

              Because Lisp has two syntax levels: one of s-expressions and one for the programming language on top of s-expressions.

              S-expression examples

                (berlin madrid london)
              
                (john (age 21) (weight 65))
              
              Lisp code examples:

                (if (> a b) a b)
              
              Non Lisp code, but a valid S-expression:

                (if (> a b) then a else b)
              
              You can define a new syntax for Lisp - as has been done before - where you use traditional syntax to define a programming language. You then just need a grammar and a parser.

              Originally s-expressions were only thought for data and programs would have a different syntax.

              A conditional might be written like this:

                 cond a > b -> a ;
                      t     -> b
              
              Lisp code with lists might then look like:

                 append[ (a,b,c) , car[list] ]
              
              Where (a,b,c) is a literal list of three symbols.

              But the Lisp system was internally defined using lists, and the compiler and interpreter were working with Lisp code as lists.

              Thus it seemed more practical to just work with s-expressions (nested lists, ...) and defer the question of syntax to a later time.

              Later, multiple attempts were made to define a Lisp with a more traditional syntax: the Lisp 2 effort in the 60s, various syntactic front ends, RLISP, LOGO, ML, Dylan, ... . But for Lisp programmers it was more practical to keep the s-expression syntax, because the internal machinery of Lisp uses s-expressions anyway.

          • int_19h 1552 days ago
            We're talking about syntax, not abstractions. There's no reason why f(x y) can't be a representation of a list that is represented by (f x y) in most Lisps.
        • lispm 1549 days ago
          > (1 2 3) is syntactic sugar for (cons 1 (cons 2 (cons 3 nil)))

          Not really.

          (1 2 3) is a list in Lisp.

          (cons 1 (cons 2 (cons 3 nil))) is code to produce a list.

          if we have

            (append '(1 2 3)
                    (cons 1 (cons 2 (cons 3 nil))))
          
          The first is a literal list and the second one is code, which produces a list at some point in time - maybe at runtime.
        • kazinator 1551 days ago
          No it isn't; and your definition is infinitely regressive because it further implies that (cons 1 (cons 2 ...)) is syntactic sugar for (cons 'cons 1 (cons 'cons 2 ...)) and so on.
    • bencw 1552 days ago
      I find this statement from Paul Graham's Revenge of the Nerds enlightening:

      > Lisp looks strange not so much because it has a strange syntax as because it has no syntax; you express programs directly in the parse trees that get built behind the scenes when other languages are parsed, and these trees are made of lists, which are Lisp data structures.

    • grayclhn 1552 days ago
      In my (very limited) experience, the benefit's not from representing functions this way, it's from representing everything this way -- including things that would be special syntax in other languages. It makes it easier to write and debug things like macros if the input is already represented as essentially an AST and the output doesn't need to be converted back into special syntax before it can be run.
    • abjKT26nO8 1552 days ago
      If you wrote function applications as `f(foo bar)`, then spaces between identifiers and parentheses would be semantically-significant. Putting a space between `f` and the first parenthesis in the aforementioned expression would produce an invalid expression. Similarly, doing the same with `g(baz f(foo bar))` would result in `g(baz f (foo bar))` which is an application of g to 3 arguments, second of which is the procedure bound to f, whereas in the original expression it was an application of g to 2 arguments, second of which was the result of application of f to two arguments. I don't think such sensitivity to whitespace would result in comfortable code editing when you move code around.
    • mdparker89 1552 days ago
      Code is data. `(f foo bar)` is both a list of symbols and a function call.
    • int_19h 1552 days ago
      Contrary to many of the responses you've got, the real answer is that it's just a convention with no particular reason other than historical cultural preference. For evidence, I present R: it looks like C syntactically, but all constructs actually desugar into function calls (even assignments!). And function calls themselves are also C-style - f(x, y) - but once it's parsed, the result is a "pairlist", which is exactly what it sounds like to any Lisper. So any R program is represented as nested lists, with constants and symbols as leaf nodes, exactly as in Lisp.
    • TeMPOraL 1551 days ago
      'pdonis gave you the correct answer, but another way of thinking about it: the Lisp way is more consistent. It also makes it more powerful.

      The math notation we commonly use is mostly a legacy notation, invented before we had computers and could do (or conceive of) symbolic processing. We write `f(foo, bar)` instead of `(f foo bar)` not because the former is better, but because it was established before we invented trees (data structures) and figured out why they're important.

    • edflsafoiewq 1552 days ago

          def(fact(n acc)
              if(=(n 0)
                  acc
                  fact(-(n 1) *(acc n))))
      
      I'm not a fan.
    • agumonkey 1552 days ago
      there are UIs hacking around the idea

      sweet expression .. I forgot the name, made the top level forms parens-less

          define fact(n)
            (if ... )
      
      some other hacks include back-shifting any parens by one symbol to turn <fn>(...) into (<fn> ...)

      honestly I'm 'astonished' that the sexp debate never dies, no lisper cringes at sexp for more than a week (especially with a structural editing mode)

    • tpfour 1552 days ago
      You can modify the reader to do that but then you can also just write Python.
  • Rapzid 1552 days ago
    Been a while since I touched Clojure but at the time nothing beat light table's insta-REPL. Anything replace that, or any insta-REPL come close to replicating that experience for any language?
    • BoiledCabbage 1552 days ago
      I never used LightTable, but based on the videos I've watched I believe the "Calva" plugin for VS Code has the same Clojure insta-REPL functionality.

      https://github.com/BetterThanTomorrow/calva

    • ledgerdev 1552 days ago
      The integrated repl and inline results in lighttable were the best! Sad that it was abandoned.

      The best choice today for a similar experience is Chlorine (https://atom.io/packages/chlorine), which is built on top of Atom. It also uses the built-in Clojure socket REPL, not the more complex nREPL.

      Calva does its best to show some data inline, but unfortunately VS Code doesn't support displaying inline results https://github.com/microsoft/vscode/issues/3220

      • BoiledCabbage 1552 days ago
        > Calva does its best to show some data inline, but unfortunately VS Code doesn't support displaying inline results

        I've got no attachment to the Calva project whatsoever, but the below link pretty clearly contradicts what you're claiming.

        https://calva.readthedocs.io/en/latest/eval-tips.html#evalua...

        Not to mention I've done it.

        • ledgerdev 1551 days ago
          Sorry, that wasn't worded well, and I'm in no way claiming Calva doesn't show inline information. I have used Calva extensively, though not in the past year. It should have read something like this:

          > does its best to show some inline information ... but doesn't support displaying inline results nearly as well as LightTable or Chlorine.

          So VS Code/Calva just adds text-only formatted information at the end of a line. This works fine if the result is short, like "hello world". But what happens if it's a large result, say a list of 100 items? In that case it just prints out some portion of the result, or perhaps even the whole result? I can't even remember exactly. And if you throw an exception, it's printed out on one very, very long line of text (or was the last time I used it) at the end of the line. I recall scrolling horizontally through thousands and thousands of columns, trying to interpret the long, already hard-to-decipher Clojure errors.

          What Chlorine and LightTable do/did much better is show the large/long results in a formatted block below the line, and with large data structures allow formatted expansion, because atom isn't limited to only line based decorations.

          Here even is the author of calva asking for more than just simple :before and :after line decorators in vscode. https://github.com/microsoft/vscode/issues/3220#issuecomment...

          Has Calva gotten better at handling inline errors and large results? If so, I need to try it out again.

          • Rapzid 1551 days ago
            It's an area that isn't "solved" IMHO. I love F# as well, but the typical fsi select to execute doesn't come close. Light table provided the ultimate scratch pad.

            Scala sheets are also meh IMHO

          • BoiledCabbage 1551 days ago
            No it doesn't have multi line results as far as I know.

            Thanks for clarifying.

      • lkschubert8 1551 days ago
        Calva definitely does show the results of evaluating a form inline.
    • giancarlostoro 1552 days ago
      This is still my favorite way to do Clojure. I gotta say it's such a nice editor.
    • nravic 1552 days ago
      Julia's REPL is beautiful
  • capableweb 1552 days ago
    > I have to admit, that at the beginning using brackets was not easy for me. Once I’ve realized that the brackets are just on the other side of the function name, everything was simple and I could code very fast.

    Yeah, adopting automated insertion of brackets/parentheses is necessary, otherwise you find yourself annoyed very quickly. Once you've adapted to automatic insertion, it's hard to go back to a language where you don't have that...

    • Plugawy 1552 days ago
      I'm not sure to be honest, I've been working with Clojure for 5 years now, and the only "help" I get from the editor is "highlight matching pair". I never got used to automatic paren insertion or things like paredit.
      • eigenhombre 1551 days ago
        To build on what yogthos writes here, Paredit was an absolute game changer for me. The ability to edit your code as a tree structure rather than just a sequence of characters lets you do many kinds of edits far faster than is possible in non-parentheses-laden languages.

        The following are all a single key combination in most editors that support Paredit or equivalents:

        - Kill an entire loop or data structure to the right of the cursor

        - Swap two deeply nested expressions to the left and right of the cursor

        - Absorb ("slurp") or expel ("barf") expressions to either side of your current expression, into that expression

        - Delete the entirety of the expression enclosing where your cursor is, leaving everything to the right intact but raised to the enclosing level (this one is amazingly common but you would never think to use it until someone shows it to you, after which you use it every day)

        It takes a couple of days to get these and many other shortcuts into your fingers, after which it's hard to live without them. http://emacsrocks.com/e14.html is a good introduction.
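
        A rough before/after sketch of slurp and barf (cursor shown as |; my own illustration):

          ;; slurp: pull the next form into the current expression
          ;; before: (foo (bar |baz) qux)
          ;; after:  (foo (bar |baz qux))

          ;; barf: push the last form back out of the current expression
          ;; before: (foo (bar |baz qux))
          ;; after:  (foo (bar |baz) qux)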

        Structural editing is the first thing I miss going back to Python, Ruby, JavaScript, etc. That, and, of course, instant REPL evaluation of the expression under my cursor.

      • yogthos 1552 days ago
        I really recommend just forcing yourself to use it for a week or so. You just have to switch your mindset to the opening paren being a control character for starting an expression. I just don't look at the parens at all when I work with Clojure.
    • agumonkey 1552 days ago
      true, as much as I like lisp, without automated pairs (not even asking for paredit) I go insane fast.

      that said paredit should have a statue on the moon or something

    • draw_down 1552 days ago
      That’s just for writing though, not reading. I still find lisp code very difficult to read, perhaps not due to parens specifically but more because there is no syntax.

      Like just the very simple test assertion in this article is not particularly easy to parse. Obviously I realize that it calls a function and asserts the return value is equal to the expected value because that’s how tests work, but it’s so much less readable than what I’m used to.

      • User23 1552 days ago
        Lispers read code by indentation, not by scanning parenthesis. Properly formatted Lisp, which is trivially automated thanks to the code being so easily parsed and manipulated, reads pretty much the same as Python. A good editor will shuffle things through the parenthesis based on indentation too, so the experience of writing it is keystroke for keystroke pretty close to Python as well if that's what you want.
      • mnm1 1551 days ago
        If you're learning Chinese, it'll be much less readable than your native language too. This is simply not an argument at all. Once you get used to it, reading clojure is easy. I'd say it's easier than any other language by far. Not only that, once you've worked with clojure you may wonder why all the other languages even bother with their superfluous syntax. At this point, if I had a choice, I would program mostly in clojure. The unnecessary, superfluous, verbose, and often boilerplate-heavy code of other languages always makes me question why I would use them, write four times the amount of code, and still not get nearly the experience and software quality clojure can provide. I highly encourage you not to judge a language, any language, before actually learning and using it.
      • iLemming 1551 days ago
        > That’s just for writing though, not reading. I still find lisp code very difficult to read, perhaps not due to parens specifically but more because there is no syntax.

        Most of us are too familiar with infix notation. It is deeply ingrained at the subconscious level of our assumptions about programming languages. It is indeed tricky to quickly parse a snippet written in a Lisp with an untrained eye.

        Many software developers with prior experience in programming, at their very first encounter with a Lisp dialect, instinctively try to reject it. It looks unnatural and doesn't feel right.

        In comparison - most people who never wrote a single line of code ever before, usually don't struggle with Clojure or Racket.

        However, if one ignores the instinct, persists, and gives it some more time, they may find an incredible world of possibilities.

        There is a way to speed up the process of adapting to reading and writing Lisp code. The trick of learning to read Lisp code efficiently, is simply refactoring code on the go. Don't try to learn a Lisp dialect by merely staring at code - in books, on web pages, etc.

        Pick up an IDE or editor that supports Structural Editing. Structural Editing is a way to navigate and refactor your code without breaking parenthesis or other characters that define the structure of your code. Learn basic structural editing commands - slurp, barf, raise, transpose. And modify the code, pick up functions, and try to dissect them. With a connected REPL, Lisp allows you to evaluate any expression without any prior ceremony (a feature that most other languages don't have). You can try every single piece, break up functions into smaller ones, or compose multiple small functions into a bigger one. It doesn't take too long until you see the benefits of this approach. Sadly, for some people going back to normal life, after discovering powers of Lisp becomes like going back to drugs after detoxification.

        Almost every discussion on HN about a Lisp dialect results in at least one comment from someone claiming that they've tried writing Lisp for many years but never "discovered" its powers, or that even after years it still feels "unreadable" to them. Most of the time, those comments are disingenuous. Lisp does not require anyone to have a "differently wired brain." Nobody comes into this world with the innate ability to read and understand Mandarin better than English, or with the ability to learn Python and Javascript but not Clojure.

        Most people don't realize that very often, the path to simplicity lies through the thorns of breaking existing habits. They choose an "easy" way. The "way of familiarity." And without even realizing it, they usually create more work for themselves.

    • Swizec 1552 days ago
      This is what drove me away from lisp.

      If everyone says you have to use a plugin to auto-insert parentheses, that means computers know where they go. So why are we writing them??

      • wtetzner 1552 days ago
        The computer only knows where they go because the rest of the code already has the parens.

        Having the parens has advantages, like everything being explicitly delimited, which makes it really unlikely that you'll insert some code that you thought was part of a block, but really wasn't (think one-armed if without braces in C-like languages). It also makes it easy to have your editor help you navigate your code. Being able to jump around expressions conveniently can really speed things up.

      • ken 1552 days ago
        Lisp doesn't have more grouping characters than other languages. The primary difference is that it consistently uses prefix notation, so the cursor always immediately goes inside the parens, rather than needing to type one part of an expression (the function name) outside of it first. That's what modes like paredit help with.

        Programmers in languages like Ruby (optional parens for method calls) and Tcl (no parens for proc calls) might agree with you! Why does virtually every language require parens, when they're not needed? At least with Lisp, they serve a necessary purpose.

        Another one: almost every language uses commas to separate items in a list, including function arguments. In the vast majority of cases, the commas are useless. Lisp doesn't use commas to separate elements (though Clojure allows them if that's your druthers). Why can't I say f(x y) or [3 4 5] in, say, Python? There's nothing else it could possibly mean.

      • Skunkleton 1552 days ago
        The computer doesn’t know. It just knows that brackets need to go somewhere and makes it easier to edit them as a pair.
      • tpfour 1552 days ago
        Not everyone uses plugins. I don't have autocomplete but my parens flash when they're paired. I like it.

        You write the parentheses because they have meaning. I can understand your frustration but I don't think you gave lisp a fair chance if you are asking "why parens".

            > '(this is a list)
            > this is a list
        
        These don't have the same meaning. The computer can't know what meaning you want to give to the code beforehand.
      • mschaef 1552 days ago
        They make the extent of syntactic expressions more obvious to both people and tools.
      • capableweb 1552 days ago
        Because they are not perfect and you need control of where they go. They don't always get it right and you might have to correct them.

        At least none of the solutions I've found been 100% perfect. So I much rather have the computer do as much as it can, then I correct if it gets it wrong. Collaboration with the machine is better in this case.

      • azhenley 1552 days ago
        Try Forth :)
      • lukifer 1552 days ago
        I've never spent any serious time with Clojure/Lisp, and the nested parens have been a pretty major impediment. I'm curious if anyone's ever experimented with dialects that replace brackets with significant whitespace (newline+tabs), similar to Python? I could also see a drag-and-drop coding GUI being pretty powerful.

        I'm sure once one spends enough time with it, one starts to think in Lisp, and the desire to solve the "problem" disappears. :)

        • tpfour 1552 days ago
          You never spent much time with it and the parens are a major impediment? If you cannot overcome the "parens" (non) issue, I suppose the language is not for you. It really is not an issue, _at all_.

          I don't say this to be demeaning! I would say there are far more chances you really enjoy the language than not. Give it a try and stop worrying about the parenthesis and start thinking in terms of an AST.

              (let ((x 0)) (+ x 1))
          
          If you want Python that's fine but there is definitely no problem with parens in Lisp/Scheme :)
          • lukifer 1552 days ago
            Ha, just thinking aloud! More of my own psychological barrier than a knock on Lisp. I do like it quite a lot, it's just completely alien compared to a life of experience with C-like languages.
            • filoeleven 1551 days ago
              I found its alien nature to be kind of helpful in learning Clojure, actually. The paren position change really does melt away quickly, and in the meantime it was a good reminder that I was “not allowed” to do procedural code or mutate variables; I had to use a different mindset. Most people who check out Clojure have been coding for 10+ years, so you’re in good company as far as ingrained habits go.

              As others have said, the best way to check it out is to get a REPL running in a decent IDE and play with the language. There’s something quite intuitive in its workflow: knowing that you want to do something, though you’re not quite sure what it is yet, and typing a single ( gives you all the power and space you need to figure it out no matter how deep inside a code block you are.

        • ken 1551 days ago
          Yes, wanting to "solve" Lisp syntax has been a common initial response to Lisp for at least several decades. In 1993, Gabriel and Steele wrote:

          > we expect future generations of Lisp programmers to continue to reinvent Algol-style syntax for Lisp, over and over and over again, and we are equally confident that they will continue, after an initial period of infatuation, to reject it. (Perhaps this process should be regarded as a rite of passage for Lisp hackers.)

        • samatman 1552 days ago
          Yes, Schemers have been working on sweet-expressions:

          https://srfi.schemers.org/srfi-110/srfi-110.html

          They've been implemented for (at least) Racket:

          https://docs.racket-lang.org/sweet/index.html

        • vnorilo 1552 days ago
          Indeed. I came from C++, but I quite like homoiconic syntax by now. There have been alternative syntaxes [1] but their lack of popularity seems to align with your hypothesis.

          1: https://dwheeler.com/readable/

        • zdragnar 1552 days ago
          Significant whitespace typically denotes blocks of statements or expressions. How do you use whitespace to represent a data structure? After all, the whole point is that literally everything, code included, is a list. (Edit: if all you want is homoiconicity, grab prolog, rebol or julia)

          There are plenty of non-paren functional and OO languages (f#, haskell, python, java) to replace functional and OO lisps depending on the itch you want to scratch. JavaScript has babel / babylon to generate a full AST to rewrite source code, though you need to either fork babylon or use sweet.js if you want your own macros.

          • lvass 1552 days ago
            >How do you use whitespace to represent a data structure?

            SRFI-49[0] describes I-expressions, which can be mixed with S-expressions. I've tried it while learning Scheme, because parens didn't look good at first. My experience was that parens are an issue only until you get used to them, afterwards they're a boon.

            [0] https://srfi.schemers.org/srfi-49/srfi-49.html

  • the-alchemist 1551 days ago
    I came from a strict C++ -> Java/Python world. My college class on programming languages included a Lisp, I think, but that was so long ago and was so poorly taught that I graduated without ever hearing the word "REPL".

    So grokking, let alone writing, Clojure code was daunting but extremely worthwhile. All good things come to those who wait, and sure enough, I see it now. Massive gain in productivity. Like, probably more than going from C++ to Java, or Java to Python (for me).

    P.S. This article is definitely a little dated in parts, but most of the warts still stand.