I love LaTeX, but it (or more specifically TeX) is showing its age. It's perfect for writing a paper given a good template, but writing and debugging those templates (.cls, .sty) is unnecessarily hard. For my job (in academia) I have to update such LaTeX templates on a monthly basis, and I always end up looking at the current state of LaTeX improvements. This task involves not 'writing' in LaTeX but 'programming' in LaTeX. If you are used to modern programming languages, TeX feels stubborn and hard, so I always have the feeling there is room for improvement (Python, JavaScript, C++, ... are all easier to debug). Although people are working hard on this and doing interesting work (e.g., LuaLaTeX, LaTeX3), I feel the underlying TeX language has had its time and a more drastic change might be necessary. Remember that TeX was designed for computers with orders of magnitude fewer resources.
An increasingly interesting LaTeX replacement is, maybe surprisingly, HTML in combination with CSS and JavaScript. With every browser update the inspection and debugging tools become more powerful, and each time I can track down layout and programming bugs a little faster. With the addition of more properties targeting paged media in CSS3, it is now becoming possible to create nice-looking PDFs starting from HTML. Prince (http://www.princexml.com/), for example, is ahead of the browsers in its support for CSS paged-media properties and outputs a PDF file directly. Typical features for which we praise LaTeX are also becoming available: mathematical formulae, bibliographies, and advanced hyphenation.
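As a concrete illustration, the CSS paged-media properties let you describe page geometry, running page numbers, and print-oriented break rules directly in a stylesheet. This is only a minimal sketch; which properties actually work varies by engine (Prince supports more of this than most browsers' print paths):

```css
/* Minimal sketch of CSS paged media; engine support varies. */
@page {
  size: A4;                  /* physical page size */
  margin: 25mm 20mm;         /* page margins */
  @bottom-center {
    content: counter(page);  /* running page number in the footer */
  }
}
h2 {
  break-after: avoid;        /* keep headings with the following text */
}
p {
  orphans: 2;                /* min lines left at the bottom of a page */
  widows: 2;                 /* min lines carried to the next page */
  hyphens: auto;             /* engine-driven hyphenation */
}
```

The margin boxes (like `@bottom-center`) come from the CSS Paged Media module and are the piece Prince implements well ahead of the browsers.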
Most people only use the basic commands and don't care about the underlying engine. In that sense, LaTeX as a 'writing' language is not too attached to the TeX 'programming' language. Pandoc (http://johnmacfarlane.net/pandoc/) could be enough to translate such LaTeX code to another markup format and use another engine. (Academics tend to use advanced LaTeX macros only when in need of space ;-))
I don't really mind TeX's age, since the results of typesetting with it are very nice and there is practically no alternative, at least in academia. I find it a bit hard to imagine that HTML can do the same job. The minor (or major) inconsistencies between HTML engines can lead to many different renderings of the same source, and I don't think you want that for your texts. I agree, though, that typography in browsers has come a long way since the early 00s and is definitely on the right track. The tools you mention are nice, but setting them up does not seem to be any less complicated than TeX and friends (not to mention that the hyphenated result [1] from Hyphenator looks so bad; it begs for margin protrusion, but that's a different issue).
In my opinion LuaTeX (and its LaTeX counterpart) is the future of TeX. It builds upon the strong base of TeX and combines it with Lua which is a fine language for this purpose.
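The appeal is easy to show: LuaTeX exposes a real scripting language inside the document through the `\directlua` primitive, so logic that would be painful as TeX macros becomes ordinary Lua. A minimal sketch (compile with lualatex):

```latex
% Minimal LuaLaTeX sketch: Lua does the computation,
% TeX typesets the result via the \directlua primitive.
\documentclass{article}
\begin{document}
The answer is \directlua{tex.print(6 * 7)}.% typesets 42
\end{document}
```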
Due to its age, TeX/LaTeX has a massive codebase; there is literally a package for everything. It would be a massive undertaking to replace the TeX system with something that isn't backwards compatible. And since the vast majority of LaTeX users never see much of the underlying mess, the pressure to do so is rather small.
I'm curious. Why do you need to update templates so frequently? I would have expected academia to have a fairly static, albeit perhaps large, set of templates. E.g., at most one for each journal in your field, one for each conference, one for each publisher, with any of these only changing rarely.
\rant{I think it's funny that you say this, because I was just remarking to someone the other day that the primary way I've seen TeX and its little forest friends age is that it's become rather unwieldy IN SPITE OF the fact that the hardware it's running on now is far more than an order of magnitude faster.
In college, I was typesetting my work in PlainTeX (I never did like LaTeX, but obviously I had it available) on a 14.77MHz 68000-based Amiga 2000, and the TeX distro came on floppies. I had a whopping 40Mb hard drive, and all the heavy lifting lived there - Metafont, dvips, tex itself. But they fit comfortably on 800K low-density floppies and ran from them, if you needed to. The other floppies were all fonts, and since the prevailing format for fonts in the rest of the world was Type 3 Postscript (yucky bitmap) and comparable TrueType, my work looked rockin.
So to review, I had it running largely off floppies on a machine a couple of orders of magnitude slower (and a couple of orders of magnitude less memory and storage) than my iPhone. And I frequently taught freshman English majors who wouldn't own their own computer for another 3 years how to use it, down to font rendering and selecting an output format for the target printer which was rarely Postscript back then.
Riddle me this then: why are current TeX distros completely indecipherable to me now? I mean, kpathsea was always a bit of a beast, but I understood it pretty much at a glance. How is it that, although I've used the platform on and off for two decades now, in the last 5 years I've had to call the Psychic Friends Network every time I tried to call a package that I thought I had installed correctly? Oh, and why is a whole install now larger than the sum total of all the storage I had at my disposal - every floppy, hard drive, mainframe quota, and gettemp limit - when I last used the system on a daily basis?
As far as I can tell, the last update to the core product was in 2008, and everything that's been added to the main engine since 1992 has been incremental support for things like modern font formats. So it should have grown linearly, not exponentially. But there it is. Big as life and twice as ugly.
This is actually the second question in five days I've seen in two different fora about, "How is TeX holding on?" And to look at the sample output that was produced by Lout, obviously the answer is, "Because no one ever came up with a replacement that produced better output." You don't have to ask Don Knuth to figure that one out. It's not that Lout hasn't surpassed TeX yet. It's that it hasn't beaten troff yet. The 70s called, and they're looking for their DEC LP01, man.
But I don't think people are actually voicing the question in their heads. I think the question they're actually asking is, "Who let this godawful piece of Frankencode run through the village terrorizing the children, and why won't someone please scrape it all into a pile and teach it how to sing Puttin' on the Ritz like it did 20 years ago?" Or, "If you got this thing back into shape, why wouldn't it be the rendering engine for ebooks, because if it's set up right, it can render a whole book from source live on an iPad which is 100x more powerful than its original compile target?" I can think of 20 questions like this. All the questions ultimately boil down to a wonderment that one of the best pieces of software ever written for making readable output is cared for so shoddily. It's like some laboratory experiment gone amuck on how layering bad abstractions on things makes even awesome things awful. }
And now for my next trick, I'm going to go integrate XeTeX into my current product to generate custom typeset results for customers. No, seriously, I am. I see 20 more years of this platform in my future...
There are a lot of alternatives for the easy stuff. TeX makes the harder stuff like bibliographies, cross references, figure and equation numbering, indices and aligning equations relatively easy. As a mathematician, I would need to see the equivalent of "Math into LaTeX" before I would even consider switching.
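For reference, the "harder stuff" really is only a few lines in LaTeX: numbering, alignment, and symbolic cross-references come for free. A standard-LaTeX sketch using only amsmath (the labels are made up for illustration):

```latex
% Sketch of what TeX/LaTeX makes easy: numbered, aligned equations
% and symbolic cross-references that renumber automatically.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align}
  (a+b)^2 &= a^2 + 2ab + b^2 \label{eq:square} \\
  (a-b)^2 &= a^2 - 2ab + b^2 \label{eq:square-neg}
\end{align}
Equation~\eqref{eq:square} is referenced by number automatically,
even if other equations are later inserted before it.
\end{document}
```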
I played around with Lout for a good while during my brief fling with alternative typesetters; this included tracking down some of the (quite scarce at the time and still not easy to find) troff macro documentation.
It's cute, and I liked that the language was generally sane, but ultimately the typesetting quality was noticeably inferior to TeX even for block text; for setting mathematics, Lout manages to make almost every single decision subtly wrong. Lout's text quality is about comparable to troff in this way. If I'd never seen TeX it would probably look fine, or at least acceptable, but I can't stand to look at it today.
I love (La)TeX, but I'd also love to see viable alternatives in place. However, I've never heard a success story (for example, "I wrote a book in it and it was pleasant"—or even "… a paper … painless"!). In fact, this is the first story I've heard of anyone (other than Kingston & co. himself) even seriously trying it.
Is there a Lout / Nonpareil showcase out there somewhere?
In the world of "here's a preprocessor before LaTeX", pandoc tends to stand out as you can do Markdown or similar with inline LaTeX or HTML: http://johnmacfarlane.net/pandoc/
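For example, a Markdown source with inline LaTeX math can be turned into either a PDF (via LaTeX) or standalone HTML. A small sketch; the filename and exact invocations are illustrative:

```markdown
<!-- paper.md: Markdown body with inline LaTeX math.
     Convert with e.g.:  pandoc paper.md -o paper.pdf
                    or:  pandoc -s paper.md -o paper.html  -->
# A Small Result

We show that the sum $\sum_{k=1}^{n} k = \frac{n(n+1)}{2}$ holds
for all natural numbers $n$.
```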
I think inventing typesetting languages that compile to TeX might partially solve the problem and make the whole typesetting experience more pleasant. TeX is already a capable typesetting system with a rich ecosystem, and TeX engines can already produce high-quality documents; the hard part is writing TeX documents.
So what Haml is to HTML, or say CoffeeScript is to JavaScript, a new language could be to TeX.
So inventing new, easier-to-use typesetting languages that use TeX as the underlying engine might be a good solution.
I keep revisiting Lout every couple of years, and they keep not supporting UTF-8, which makes it unusable for me. It's a shame, because it looks really nice.
I have been using Lout occasionally. It is a perfectly capable typesetting system, with both advantages and disadvantages compared to TeX/LaTeX/ConTeXt/etc. The average user will probably want to stick with TeX and friends, but there are situations where Lout is a good alternative.
The primary advantages are its very small size and its tight integration with PostScript. The small size means that it is very easily deployed (such as when you need something to generate documents); the tight integration with PostScript means that it is relatively easy (assuming you know a bit of PostScript) to augment the output with graphics (I have even, for example, sometimes used Lout to generate EPS diagrams to embed in LaTeX documents). Much easier than using Metafun, for example.
The main disadvantages are that it cannot do everything that the TeX engine does and that it has relatively few contributors.
For example, there's no way (other than manually inserting column breaks) to automatically generate balanced columns. And while non-rectangular paragraphs are possible in principle (such as for drop capitals or flowing text around images), this generally requires sacrificing hyphenation (and can be non-trivial for the non-expert user).
Lout also has fewer people contributing to/using it, so it does not have the rich ecosystem of TeX and friends (e.g., while you can use it to generate slides, there are more, and more powerful, LaTeX styles available for that). Integration with bibliography tools and sites (which generally expect/produce BibTeX format) can be another matter.
With respect to typesetting quality, it uses the same line-breaking algorithm as standard TeX, but it does not have the microtypography features of pdfTeX. For typesetting anything mathematical (beyond basic math), TeX is generally superior (which should not be surprising, given the effort Knuth put into getting that right).
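The line-breaking point is worth unpacking: unlike a greedy word-wrapper, TeX's Knuth-Plass algorithm chooses breakpoints for the whole paragraph at once, minimizing a total "badness". Below is a much-simplified sketch of that global optimization, assuming fixed-width characters and squared leftover space as the badness measure (the real algorithm adds stretchable glue, hyphenation penalties, and demerits):

```python
# Toy version of global paragraph breaking in the spirit of Knuth-Plass:
# pick line breaks minimizing the total badness of all lines, where a
# line's badness is the squared leftover space.

def break_paragraph(words, width):
    """Break `words` into lines of at most `width` characters, minimizing
    the sum of squared leftover space. Assumes each word fits on a line."""
    n = len(words)
    INF = float("inf")

    def badness(i, j):
        # Badness of setting words[i:j] as one line (each space counts as 1).
        length = sum(len(w) for w in words[i:j]) + (j - i - 1)
        return INF if length > width else (width - length) ** 2

    best = [0.0] + [INF] * n  # best[j]: minimal total badness for words[:j]
    back = [0] * (n + 1)      # back[j]: start index of the last line
    for j in range(1, n + 1):
        for i in range(j):
            cost = best[i] + badness(i, j)
            if cost < best[j]:
                best[j] = cost
                back[j] = i

    # Walk the back-pointers to reconstruct the lines.
    lines, j = [], n
    while j > 0:
        lines.append(" ".join(words[back[j]:j]))
        j = back[j]
    return lines[::-1]

print(break_paragraph("aaa bb cc ddddd".split(), 6))  # → ['aaa', 'bb cc', 'ddddd']
```

On this toy input a greedy wrapper would emit `['aaa bb', 'cc', 'ddddd']` with total badness 17; the global optimum above costs 11. TeX makes the same whole-paragraph trade-off, which is a large part of why its justified text looks so even.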
Also, while modifying/creating LaTeX packages these days is not exactly child's play, Lout poses some challenges of its own; its functional typesetting language (used both to describe the content and how that content is laid out via so-called galleys) can be difficult to grasp for a novice.
Yeah, I find Lout pretty pleasant for non-scientific stuff, but when it comes to a bunch of equations and references you can't beat LaTeX.
One thing the blog post didn't mention is that it is really easy to machine-generate Lout (indeed, that was one of its design goals), so it provides a good option for quickly adding PS and PDF output to a project.
These days I do most of my LaTeXing as an export from org-mode in emacs. Org-mode allows me to use convenient markup notation, and uses sensible defaults when exporting to LaTeX.
It keeps me happy, especially when exporting slides to Beamer, the LaTeX-based presentation tool.
Off-topic: generating slides from an outliner like org-mode produces those ugly bullet points en masse, doesn't it? The upside, of course, is that they are easy and fast to generate. Why do you need that?
I wonder if Sphinx[1] or something like it could be a good replacement for LaTeX? It would need a better typesetting engine for PDFs, but the authoring experience with it is pretty nice.
I'd be curious to know if there any CSS/HTML experiments going on with typesetting academic papers. LaTeX may be beautiful and powerful but it may also be too hard.
And it really doesn't produce very good HTML output -- aren't we supposed to be on the internet already?
The output of Latex is so annoyingly good that I would use it to make me a cup of tea if I could ("there's a package for that", comes the faint cry). I use it for essays, reports, and even our Ukulele Society's songbook. Versatile it is.
But the problem is basically this:
You have to close your eyes, cover your ears, and hum loudly when installing and compiling, because if you thought about the error messages, the distro size, and all the binaries involved, you'd probably wish you were dead. No output quality, however good, justifies a piece of software like this; it's like the elephant in the filesystem. I have nothing but admiration for the people who put this together, because I couldn't fit a mental model of the TeX system into two of my brains.
wannesm | 14 years ago:
- Mathematical formulae: http://www.mathjax.org/
- Bibliographies: http://citationstyles.org/
- Advanced hyphenation: http://code.google.com/p/hyphenator/
spystath | 14 years ago:
[1] http://hyphenator.googlecode.com/svn/tags/Version%204.0.0/Wo...
beza1e1 | 14 years ago:
One big TODO is the copyright notice in the bottom-left column of the first page. Two-column-wide figures are also an issue.
Thanks for citationstyles.org. When I have time, I'll try to integrate it.
[0] http://beza1e1.tuxen.de/acm_html/test.html
adaszko | 14 years ago:
* http://www.qtrac.eu/pyqtbook.html
* http://www.qtrac.eu/gobook.html
dfc | 14 years ago:
Understatement of the year. The closest preprocessor to pandoc that I know of is org-mode, and the output is not nearly as nice; it's also Emacs-specific.
stewbrew | 14 years ago:
Do they really still require users to explicitly mark paragraphs?
cschmidt | 14 years ago:
http://www.amazon.com/More-Math-Into-LaTeX-Edition/dp/038732...
nyar | 14 years ago:
I have a 500GB hard drive.
vilya | 14 years ago:
[1] http://sphinx.pocoo.org/