Turning the mathematics of vector calculus into simple pictures

(technologyreview.com)

139 points | by respinal 1625 days ago

5 comments

  • mikhailfranco 1624 days ago
    These methods were invented decades ago by Penrose (1971) [2], as a way of visualizing and improving on Einstein's summation convention [1]. Similar "string diagrams" were popularized for various categories by John Baez in "This Week's Finds" (TWF) [3], with extensive applications to QM by Coecke [4]. There are other modern examples all over the interweb [5] ...

    [1] https://en.wikipedia.org/wiki/Einstein_notation

    [2] https://en.wikipedia.org/wiki/Penrose_graphical_notation

    [3] http://math.ucr.edu/home/baez/week79.html ff

    [4] https://www.cs.ox.ac.uk/ss2014/programme/Bob.pdf

    [5] https://graphicallinearalgebra.net/
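
    To make the correspondence concrete: a contracted index in Einstein notation is a wire joining two tensors in a Penrose diagram, and both are exactly what numpy's einsum expresses. A minimal sketch (my own, not from any of the links above):

    ```python
    import numpy as np

    A = np.random.rand(3, 3)
    B = np.random.rand(3, 3)
    v = np.random.rand(3)

    # w_i = A_ij v_j: the repeated index j is summed; in a diagram it is
    # a single wire joining A's second leg to v's only leg.
    w = np.einsum("ij,j->i", A, v)
    assert np.allclose(w, A @ v)

    # s = A_ij B_ji: two wires, forming a closed loop -- the trace of AB.
    s = np.einsum("ij,ji->", A, B)
    assert np.isclose(s, np.trace(A @ B))
    ```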

    • kmill 1624 days ago
      The conclusions section has a nice short history of some graphical notations, noting "graphical notations of tensor algebra have a history spanning over a century."

      Equation 18 in their paper is in Penrose's original paper ("Applications of negative dimensional tensors"). Fun fact: if you take a planar graph all of whose vertices have degree 3, and interpret each vertex as a triple product/determinant/cross product, then the absolute value of the resulting scalar is proportional to the number of proper 4-colorings of the regions complementary to the graph. A reiteration of this is in Bar-Natan's paper "Lie algebras and the four color theorem."
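
      (A toy check of that fun fact for the smallest case, the theta graph: two degree-3 vertices joined by three edges. Interpreting each vertex as the Levi-Civita tensor and contracting along the edges gives 6, the number of proper edge 3-colorings, and the three pairwise-adjacent regions admit 24 = 4 × 6 proper 4-colorings. This is my own sketch, not from the paper:)

      ```python
      import numpy as np
      from itertools import product

      # Levi-Civita tensor eps_ijk over 3 "colors"
      eps = np.zeros((3, 3, 3))
      eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1
      eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1

      # Theta graph: two vertices, three shared edges carrying indices i, j, k.
      # Diagram value = eps_ijk eps_ijk (each vertex sees all three edges).
      diagram_value = np.einsum("ijk,ijk->", eps, eps)

      # Proper edge 3-colorings: all three edges distinctly colored.
      edge_colorings = sum(1 for c in product(range(3), repeat=3)
                           if len(set(c)) == 3)

      # Proper 4-colorings of the three pairwise-adjacent regions.
      face_colorings = sum(1 for c in product(range(4), repeat=3)
                           if len(set(c)) == 3)

      assert diagram_value == edge_colorings == 6
      assert face_colorings == 4 * edge_colorings  # 24
      ```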

      However, their contribution seems to be notation for dealing with 3-dimensional derivatives (gradient, curl, and divergence), which are special due to Hodge duality and the existence of a triple product. Equation 30, which implies that the curl of a gradient and the divergence of a curl both vanish, is pretty nice.
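
      (Those two identities are easy to spot-check symbolically; a small sympy sketch with example fields of my own choosing:)

      ```python
      import sympy as sp
      from sympy.vector import CoordSys3D, Vector, gradient, curl, divergence

      N = CoordSys3D("N")

      # example scalar and vector fields (arbitrary choices)
      f = N.x**2 * sp.sin(N.y) * sp.exp(N.z)
      F = N.x * N.y * N.i + N.y * N.z * N.j + sp.cos(N.x) * N.k

      assert curl(gradient(f)) == Vector.zero       # curl of a gradient is 0
      assert sp.simplify(divergence(curl(F))) == 0  # divergence of a curl is 0
      ```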

      I think the authors are correct that the specialization of the notation for 3D vector calculus had not been written up yet.

      • mikhailfranco 1624 days ago
        Yes, your fun fact is in Baez's TWF week 92 (1996):

        http://math.ucr.edu/home/baez/week92.html

        and the core proposition goes back to Penrose.

        • kmill 1624 days ago
          I see I was rather oblique, but I brought up the fun fact because that equation is how Penrose proved it (to account for signs). By the way, the TWF article is citing the Bar-Natan paper.
    • fantispug 1624 days ago
      They do mention Penrose notation in the linked arxiv paper[1], as well as Cvitanović's Birdtracks[2] which let you do calculations on representations of groups.

      They claim they're making it more accessible with pedagogical examples (and innovations in representing differential vector calculus).

      I still prefer index notation. Their notation looks pretty intricate and I think it would be really hard to do long calculations in it. It's also not clear to me how you'd generalise it to non-commutative vector algebras like in quantum mechanics.

      [1] https://arxiv.org/pdf/1911.00892.pdf

      [2] http://birdtracks.eu/

  • knzhou 1624 days ago
    I’ve seen a lot of people enamored with graphical notations like this one, but nobody ever using them for anything nontrivial. In general, people just use them to laboriously reproduce a few results from undergrad courses, nothing more.

    If ordinary notation is like C, these graphical notations are like code golf languages, a neat gimmick that performs excellently at the few things you hardcoded it to do, and not much else.

    • wodenokoto 1624 days ago
      A Nobel Prize winner seems to disagree with you.

      From the article:

      > the American physicist Frank Wilczek, who worked with Feynman in the 1980s, once wrote: “The calculations that eventually got me a Nobel Prize in 2004 would have been literally unthinkable without Feynman diagrams.”

      • improbable22 1624 days ago
        Sure, but that's an odd reading of GP's "graphical notations like this one".

        Feynman diagrams are very useful, and lent respectability to lots of other graphical notations. There's a nice book on this:

        https://www.press.uchicago.edu/ucp/books/book/chicago/D/bo35...

        But diagrams as a replacement for writing out tensor indices, like in TFA, which was what I presume GP meant... these have not proven widely useful. I guess people doing tensor network things use them.

      • knzhou 1624 days ago
        Diagrammatic intuition is good. I use Feynman diagrams every day. But nobody computes with Feynman diagrams; they use them as an intuition aid to tell them what to compute, and that computation is done using standard notation.
        • wodenokoto 1624 days ago
          > ... and that computation is done using standard notation.

          I really thought you would have ended that sentence with "Matlab" or "python", not "standard notation"

    • joppy 1624 days ago
      I don't think that analogy holds at all. A better analogy is comparing some boolean function written purely in terms of AND, OR, XOR, etc with its corresponding circuit diagram. From the diagram, the information flow is much more obvious, and common constructions (building adders, as a random example) can be visualised. The diagrammatics give a different intuition on the problem than purely symbolic algebra.
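
      (To make the adder example concrete, here is a one-bit full adder written purely in terms of gates; in a circuit diagram the shared subterm a XOR b is drawn once and fanned out, which the symbolic form hides. My own sketch:)

      ```python
      def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
          """One-bit full adder built only from XOR/AND/OR gates."""
          s1 = a ^ b                     # shared subterm, one node in the diagram
          total = s1 ^ cin               # sum bit
          carry = (a & b) | (cin & s1)   # carry-out bit
          return total, carry

      # exhaustive check against ordinary integer addition
      for a in (0, 1):
          for b in (0, 1):
              for cin in (0, 1):
                  total, carry = full_adder(a, b, cin)
                  assert 2 * carry + total == a + b + cin
      ```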

      Of course, both the formula and the diagram encode exactly the same information. However when attempting to prove something about a formula, it can be very helpful to translate the formula into a diagram and use the diagram to guide algebraic manipulations (manipulations which perhaps we would not have known to make, if it weren't for the diagram).

      I know that Feynman diagrams are a quite famous example - even if results do not use them directly, the intuition they give on a problem is indispensable. There are other examples in pure mathematics where new proofs have been given not in terms of diagrams, but where the development of suitable diagrams has enabled people to come up with a proof.

      • knzhou 1624 days ago
        There’s nothing wrong with using graphical intuition, my point is that nobody ever uses graphical notations to compute. For example, Feynman diagrams are a great way to tell you what to compute, but then the computation itself is always done in standard notation.
  • foxes 1624 days ago
    Here is the original paper: https://arxiv.org/abs/1911.00892. The biggest issue for me is that they don't have proper upper/lower indices. A lower index is a leg pointing down, while an upper index is a leg pointing up. \delta should be \delta_i^j. I like to think of upper/lower as extra type information: you can pair covectors and vectors. The other issue is with the dot product, of course; I would write a_i b^i.
    • kmill 1624 days ago
      They're sneaking the Euclidean metric tensor in everywhere, which means there's no distinction between upper and lower indices. I agree it's confusing, and I'd rather they kept tensor orientations intact.

      The dot product is double contraction with the Euclidean metric; your notation is contraction of a vector and a dual vector.
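
      (A numpy sketch of the distinction, my own illustration: with the Euclidean metric the two computations coincide, which is why the paper can get away with dropping the upper/lower distinction.)

      ```python
      import numpy as np

      g = np.eye(3)                   # Euclidean metric g_ij
      a = np.array([1.0, 2.0, 3.0])   # vector a^i
      b = np.array([4.0, 5.0, 6.0])   # vector b^j

      # dot product as double contraction with the metric: g_ij a^i b^j
      dot = np.einsum("ij,i,j->", g, a, b)

      # lower an index first, then pair dual vector with vector: a_i b^i
      a_lower = np.einsum("ij,j->i", g, a)
      pairing = np.einsum("i,i->", a_lower, b)

      assert np.isclose(dot, pairing) and np.isclose(dot, a @ b)
      ```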

  • slantaclaus 1624 days ago
    Not enough pictures in this article
  • madengr 1624 days ago
    This is all really confusing. Why not just represent divergence and curl for what they are? Sources/sinks, and circulations. As someone who had lots of courses in electromagnetics, I see no basis for physical reality in these diagrams.
    • bollu 1624 days ago
      While this is true, note that this special relationship between gradient, divergence, and curl only exists in 3D, due to the fact that 2 + 1 = 3 (a 2-form is dual to a 1-form in three dimensions), and the presence of the [Hodge star](https://en.wikipedia.org/wiki/Hodge_star_operator) operator that allows us to convert "areas" into "normal vectors to areas". A curl is really an "infinitesimal area", which we then choose to interpret as "a normal vector".

      This is captured by the [De Rham cohomology](https://en.wikipedia.org/wiki/De_Rham_cohomology), whose calculations are sometimes simplified by using the [Penrose notation](https://en.wikipedia.org/wiki/Penrose_graphical_notation) as outlined above.
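
      (A quick numerical check of the "areas into normal vectors" round trip, assuming the usual Levi-Civita conventions; a sketch of mine, not from the paper:)

      ```python
      import numpy as np

      # Levi-Civita tensor
      eps = np.zeros((3, 3, 3))
      eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1
      eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1

      w = np.array([1.0, 2.0, 3.0])   # a "normal vector"

      # the corresponding antisymmetric matrix ("infinitesimal area"):
      # A_jk = eps_jki w_i
      A = np.einsum("jki,i->jk", eps, w)
      assert np.allclose(A, -A.T)

      # the Hodge star in 3D recovers the vector: w_i = (1/2) eps_ijk A_jk
      w_back = 0.5 * np.einsum("ijk,jk->i", eps, A)
      assert np.allclose(w, w_back)
      ```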

      Having used the notation, I don't think it helps for vector calculus. And I 100% agree with you that we should teach the intuition for div, grad, and curl first, but I wanted to show anyone who has never seen the "fully general" form of these objects what it looks like.

    • kmill 1624 days ago
      The paper doesn't really explain everything you need to be able to read the notation, but, being a tensor diagram aficionado, I have to say I found the way they encoded divergence and curl very exciting: it is all consistent and helpful for doing calculations.

      The loop-around-a-tensor notation is calculating the covariant derivative. For example, if you have a vector field, the covariant derivative is a "matrix field," where at every point in space you have a matrix that can take in a vector there and spit out the component-wise directional derivative. The divergence graphical notation (connecting the two wires together) is equivalent to taking a trace of this, meaning you create a scalar field where at every point you take the trace of the matrix from the covariant derivative.
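
      (A finite-difference sketch of that "divergence = trace of the covariant derivative" statement, in flat Euclidean coordinates where the covariant derivative is just the Jacobian; the example field is my own choice:)

      ```python
      import numpy as np

      def v(x):
          """Example vector field on R^3: v = (x*y, y**2, z*x)."""
          return np.array([x[0] * x[1], x[1] ** 2, x[2] * x[0]])

      def jacobian(f, x, h=1e-6):
          """Central-difference Jacobian: J[i, j] = d f_i / d x_j."""
          n = len(x)
          J = np.zeros((n, n))
          for j in range(n):
              e = np.zeros(n)
              e[j] = h
              J[:, j] = (f(x + e) - f(x - e)) / (2 * h)
          return J

      x = np.array([1.0, 2.0, 3.0])
      J = jacobian(v, x)

      # divergence = closing the loop in the diagram = trace of the matrix
      div_from_trace = np.trace(J)
      div_analytic = x[1] + 2 * x[1] + x[0]   # y + 2y + x
      assert np.isclose(div_from_trace, div_analytic, atol=1e-4)
      ```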

      I guess all I'm saying is that the notations for divergence and curl are consequences of a couple of building blocks that have more generality. I'll also add that, like any language, it takes some getting used to.

      > Why not just represent divergence and curl for what they are?

      I'm curious how you would represent them for what they are. Maybe you have something in mind that would make calculations easier?

    • AnthonBerg 1624 days ago
      I have come to believe the theory that people have profoundly different styles of cognition :)

      Alternate representations are incredibly useful to get as many styles of cognition on board as possible. I found I could grasp the equation immediately with help of the diagrams, and translate that to an internal abstraction representing the world. For me the equations themselves are a much more inert thing to approach and take much more effort to untangle.

      We all think differently, and it seems especially so the more abstract the thought is: as we approach each individual's limit of abstraction, we start to diverge further and further in how we think. I personally am a very visual thinker, and find Einstein's description of his thinking to match my experience of cognition: “Words or the language, as they are written or spoken, do not seem to play any role in my mechanism of thought. The psychical entities which seem to serve as elements in thought are certain signs and more or less clear images which can be “voluntarily” reproduced and combined…but taken from a psychological viewpoint, this combinatory play seems to be the essential feature in productive thought — before there is any connection with logical construction in words or other kinds of signs which can be communicated to others.”

      Einstein was smarter than me though.

      I'm a computer scientist. I know some mathematicians, and was talking about matters of cognition with them. One of the mathematicians – a mathematician's mathematician who is easily able to learn math from maths books – told me that his style of thinking is effectively the same as what is printed in the books. My mind was blown. For me, reading the same books involved "translating" into half-abstract visual/sensory elements. Once translated into my native representation of cognition, the acquisition is complete and I can think in the new visual-sensory-abstract symbols. My friend, on the other hand, basically thinks in the words and math symbols found in the books.