We struggled with this, especially for collaborative projects, and it was one of the motivations for WriteLaTeX (now Overleaf). Having more eyes on the problem (on a shared doc), and especially the ability to loop in LaTeX experts, usually resolved many of these kinds of issues.
My colleagues and I use Overleaf for writing journal articles -- it's a really great tool. Thank you so much for writing it. I just really wish I could get my university [Oxford] to pay for it!
The bean counters don't have a problem paying Microsoft the amount of a small African nation's GDP per year, but I pay for my Overleaf pro account out of my own back pocket, and can't even claim it back on expenses...
As far as I can tell, in my department Overleaf isn't liked because it's (a) a personal service, (b) a subscription, and (c) from a company that isn't listed in a labyrinthine, horrific, Oracle-provided, IE5-"preferred" online purchasing system [to which adding a supplier typically takes ~8 months]. Of these, (b) is really the biggest bugbear, because my research fellowship has a fixed budget for a fixed term, and monthly subscriptions are basically anathema to the way they do accounting.
I have one piece of commercial FEM software that is only ever "licensed, not sold" [annually] -- an annual license is ~£1k; a perpetual one ~£100k. It's quite clear that the company wants to go in that direction. I hate it. I don't like subscriptions, and I get riled by central university finance departments every single time...
Ironically, laptops and hard drives etc are quite easy to purchase: they count as 'consumables' and I have never worked in a department that actually audits their lifecycle properly -- much like lab reagents, once they're bought, the university [or my interactions with it at any rate] doesn't seem to even care if the first thing I do is throw it in a bin. This is a Good Thing™ as far as I'm concerned. Having decent computing wherever I am is absolutely key to getting my job done.
Slightly more on topic, the other thing I would say is that quantitative departments are very good at teaching their students LaTeX, but not necessarily at teaching it well. I won't exactly say that learning TeX made me a better physicist, but it definitely helped me communicate like a professional one. I interact a lot with doctors and bioscientists -- I basically work in medical imaging -- and trying to cross that divide is very hard; we forget that doing a physics degree gives you lots of transferable skills. Overleaf is excellent at providing a "user-friendly face" for projects that I can share with doctors -- they don't need to understand the code, nor have the distribution installed locally; they can just contribute to a paper in progress quite easily, and in particular its "track changes" feature is something that they like. My team and I tend to use git and % comments, but the value of a web UI is definitely there.
Debugging problems like those alluded to in the original article is definitely easier locally, however.
Love your product/service; I am a paying customer (I had been using the free version for a while; the availability of full document history convinced me to convert). I often have to access a LaTeX document from multiple laptops, and Overleaf is a massive improvement over my previous workflow of syncing source from GitHub for a local TeX editor.
I'm currently working on my thesis (alone), which I also sync between devices using GitHub. I work on it locally in VSCode and wrote a small makefile to compile/clean up/etc.
I get the appeal of Overleaf for collaborating (I've never used it myself, but it seems like a great platform!), but what is it that you consider a massive improvement in Overleaf over git + local compiling?
You might like latexmk, which automates running the right tools in the right order. It's pretty neat: it detects whether to run bibtex or biber (and when), and whether another run of (pdf|lua|xe)latex is required to resolve references/citations/etc.
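For anyone curious, a minimal latexmkrc sketch (the `main.tex` file name is just an assumption; latexmk reads this from `~/.latexmkrc` or a project-local `latexmkrc`):

```
# Build PDFs with pdflatex, rerunning tools as needed
$pdf_mode = 1;                  # 1 = pdflatex; 4/5 = lualatex/xelatex
$bibtex_use = 2;                # run bibtex/biber when sources change
@default_files = ('main.tex');  # hypothetical main file
```

With that in place, plain `latexmk` builds everything and `latexmk -C` cleans up, which can replace most of a hand-rolled makefile.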
Fundamentally, because of "reduction of steps", although the workflows might be otherwise functionally equivalent:
(1) The cognitive overhead of git push and pull: while this might have seemed small when I was using that workflow, I don't have these steps anymore, which makes them seem unnecessary in retrospect. Fewer steps are better in general, IMHO: the same reason I might use something like Google Docs even though a somewhat equivalent workflow exists via GitHub.
(2) Ensuring all local environments are similar or identical. This is especially a problem when installing new packages: say I install one on device 1; when compiling on device 2, I either need to remember to do the same by noting it somewhere (e.g. modifying the makefile), or I am reminded of it by a compilation failure.
Small gains perhaps, but they add up, especially if you are working on multiple documents at the same time (I typically have 2-3 active documents).
If it matters, I need to switch between 3 devices running Windows, Ubuntu, and elementary OS.
I wrote my thesis with git + vim + latex. The main issue with collaborating like that is that you require other people to be able to use, say, git + vim + latex.† That may not be a problem for you and your colleagues. If it is however, Overleaf basically reduces it to "please open a web browser and have a look".
The other major argument is, of course, that you get an off-site backup "for free" for something as life-changing as your doctoral thesis, too.
†Of course, the other thing about having lots of local commits is that you can easily graph, say, words as quantified by texcount vs time and include the resulting diagram in the final copy of the document...
You're answering a different question from the one the GP asked: what's the advantage of using Overleaf over git + $EDITOR for a document with one author? Regarding off-site backups, that is already provided by your git host (GitHub/GitLab/...).
If you have a lot of files or some large files, it can slow things down a lot. The git server is something that's due for a revamp, but as it stands the best thing you can do for now is separate your LaTeX content and other project assets into different repositories. (We usually see these kinds of problems when people have a bunch of non-document stuff in their Overleaf project.)
> The git server is something that's due for a revamp,
Thanks for the suggestion! Indeed, the largest repos are the slowest to update, even when we only change a single character in a small text file. This behavior is surprising because it is not what happens in git proper; some other part of the system must be introducing the delay.
I look forward to a normalization of Overleaf's git interface, most notably the ability to have regular files, symbolic links and so on.
For personal projects I prefer running things directly on my computer, but I love Overleaf for collaborating. In the last year of my undergrad, there was a big switch in my year/discipline from using Word to LaTeX.
Typesetting is hard. I think the bulk of the problems may come from an impedance mismatch between the expectations and the reality of typesetting.
How many people actually look at the output file texput.log? What about using the family of \tracing* commands? Heck, most TeX engines drop you into a REPL as soon as they hit an error! How often do people make use of this, even minimally by using \show and \showthe to investigate state?
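For the unfamiliar, a tiny sketch of that kind of interactive poking (`\greet` and `\mylen` are made-up names):

```latex
\documentclass{article}
\newcommand{\greet}{Hello}            % a macro to inspect
\newlength{\mylen}\setlength{\mylen}{2cm}
\begin{document}
\show\greet      % writes the macro's definition to the terminal and log
\showthe\mylen   % writes the current value, in points, to the terminal
Some body text.
\end{document}
```

In interactive mode, each \show pauses at the `?` prompt; press Enter to continue, and the same information lands in the log file for later reading.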
In my experience it seems like people (reasonably) come to TeX/LaTeX viewing it as a sort of advanced word processor. This is unfortunate, because if you approach it more like a full-fledged programming environment, then it ends up feeling a lot more friendly, if a bit archaic, IMHO.
When you're on a deadline, the frustration is real though. I got fed up enough that I bought a hard copy of the TeXbook and spent a week-long deep dive just trying to grok plain TeX. By the end, I ended up liking it enough that I re-wrote my master's thesis in plain TeX, with nice hand-crafted macros for section and equation numbering, etc. This actually made troubleshooting interesting and tractable for me.
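To give a flavour of what those hand-crafted macros can look like, a minimal plain TeX sketch (`\section` here is my own definition, not a built-in):

```tex
\newcount\sectionno
\def\section#1{%
  \advance\sectionno by 1
  \bigskip\noindent{\bf\the\sectionno.\ #1}\par\nobreak\medskip}

\section{Introduction}
This paragraph appears under a heading typeset as ``1.\ Introduction''.
\section{Conclusion}
\bye
```

The appeal is that every piece of behaviour is a few lines you wrote yourself, so when something breaks there is no package stack to dig through.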
If I were still in academia, I would probably advocate strongly for having a TeX/typesetting course for graduate students, if nothing else than to give us a good excuse to actually study and become closely acquainted with this tool that we rely so heavily on.
I love plain TeX and want more people to see the joy as well.
I got into LaTeX (not plain TeX) many years before I even wrote my first line of "normal" code (MATLAB). A couple more years passed before I really got into proper software engineering.
LaTeX being understood as a drop-in Word replacement is precisely part of the problem. It is just too arcane and different. If, like me back in the day, you approached LaTeX as an "alternate word processor" and from there delved into its backend/plain TeX, you are in for a bad time. Understanding it the other way around (some TeX, general ideas and patterns, then the rest [macros etc.] just falls into place) is a much healthier, holistic and sustainable approach that just makes sense. But in our case here, that is infeasible. It would require young engineers to learn a completely alien skill and cease being productive for many months. I reckon CS students/grads have the edge here, given their background and different approach.
DocBook is a markup format, in either XML or SGML, which was designed in part by O'Reilly for the publication of their books (although I do not believe it has been used for this purpose for many years). The XML variant is far more popular today, but that's not saying a lot, as DocBook itself has become rather obscure.
It doesn't really take on the problem of equations, but it does handle structured documents in a way which is much more flexible and arguably more powerful than LaTeX, since you can use the full power of XML tooling, including e.g. XSLT, to define document components.
That said, I think it's quite simply too verbose to be popular, not to mention that the tooling around it tends towards the archaic. I had a friend who was an extremely heavy DocBook user but had configured his Emacs to insert most tags by key-chord, making it much more efficient to write. It's very hard to deal with if you don't do that.
This is the kind of side project I should have seen 100 times on HN, but I don't remember ever seeing one. Why aren't there alternatives to LaTeX out there?  It's OK to have respect for Donald Knuth but still accept that language design has come a long way since 1984.
Of course there are plenty of alternatives to LaTeX. Microsoft Word is just one. But I mean a language that compiles to TeX.
Let's try to scope out the magnitude of the project, and you will understand.
* Because the problems with LaTeX are mostly due to the limitations of TeX syntax. So you really have to design a new, improved TeX-like language.
* While you can borrow most of the syntax of LaTeX for the semantic language newLaTeX built on top of newTeX, you will have to make a lot of improvements there too. In principle you don't have to, but in practice whoever does it will.
* Then you have to write a compiler that works on multiple platforms.
* The compiler outputs PDF, a terrible format to work with anyway.
* A compiler targeting HTML will be demanded too, or the project will fail. So you have to write that too.
* What about all the LaTeX packages? There are hundreds of packages that will need to be incorporated somehow. Or the project will fail.
* Who is going to do all of this? Language design requires a really small team to make an excellent product, but the rest of the project is so large that you need a much bigger team for it. So now you need a larger team that will just accept what the language design committee creates.
* Once you have created something, you have to convince scientists, some of whom have been using LaTeX for decades, to switch. But scientists will only switch if journals switch, and journals don't want to maintain templates in another language.
In short, there are a whole host of technical and political problems that nobody really wants to tackle. The project is simply not prestigious enough.
But if Y Combinator wants to help science, and they can fund a team to do this, it would probably contribute more to the advancement of science than almost anything else costing a few million dollars.
> Of course there are plenty of alternatives to LaTeX. Microsoft Word is just one.
Only to the extent modelling clay is an alternative to CAD/CAM software and CNC tools.
There might be a GUI replacement for LaTeX, but it would have to be a graphical way to manipulate structure, not a WYSIWYG system. And anyway, WYSIWYG is hardly ever WYSIWYG, or WYSIWTG. (More like WYGIWYD, or YAFIYGI.)
I can count like half a dozen TeX/LaTeX alternatives (SILE, Patoline, Platypus, Lout, some are more dead than the others) without even looking them up. It is easy to make a TeX/LaTeX alternative, it is hard to get people to use it.
I stand by my comment. Replicating TeX's (and LaTeX's) core features is easy (text layout, hyphenation, line breaking, math typesetting, etc. are all solved issues, often with many independent implementations that are as good as TeX's, if not better); the issues that hold most people back are packages and decades of TeX/LaTeX legacy.
In ConTeXt, many people mainly use Lua for scripting. Not strongly typed, but still an improvement. After the initial struggle, ConTeXt also feels more integrated than just a system of TeX macros, but that's just my opinion.
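For anyone who hasn't seen it, a minimal sketch of that integration in ConTeXt MkIV (the arithmetic is just a placeholder):

```tex
\starttext
\startluacode
  -- Lua runs during the typesetting pass; context() feeds text back to TeX
  local total = 0
  for i = 1, 10 do total = total + i end
  context("The sum of 1 to 10 is " .. total .. ".")
\stopluacode
\stoptext
```

Being able to reach for real loops, tables, and string handling instead of macro-expansion tricks is a big part of why it feels more integrated.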
At one point Patoline had a promising future (OCaml is nice), but nowadays it's neither widely used nor under active development.
Many people, including me, use Pandoc. The markup is simpler (extended Markdown), but, more importantly, you can compile it to a variety of other markups, so it lets you repurpose your documents. And if you need more detailed control than Pandoc allows, you can embed raw LaTeX (or html). You can also extend Pandoc by writing your own “filters,”¹ which means you can sort of create your own markup language. Is something like that what you had in mind?
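To give a flavour, a made-up fragment of Pandoc Markdown with raw LaTeX embedded:

```markdown
# Results

The closed form is $\sum_{k=1}^{n} k = n(n+1)/2$, as *expected*.

\begin{center}
Raw LaTeX like this survives when the target is LaTeX/PDF.
\end{center}
```

Something like `pandoc notes.md -o notes.pdf` keeps the raw block; with `-o notes.html` Pandoc drops raw LaTeX by default, which is exactly where filters come in.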
It was a third-hand file with a lot of packages, and the content was quite different, so perhaps there was a good reason in the original file that was lost in the minimization -- or perhaps the problem is that sometimes people prefer to reuse a few environments for all possible constructions.
About the first cell: IIUC, you can determine the color by row, by column, or per individual cell, so the color is recalculated in each cell. For some reason, it fails in the first cell. I'm still curious.