> Yes, we are aware of typst. I think it’s cool, but C++ hasn’t replaced C, Rust hasn’t replaced C++, Typst is unlikely to replace LaTeX. Likewise, many are aware of LuaTeX, but, again, the entrenching of a 40-odd year system is not to be underestimated. I am rooting for typst, anyway, and hope it finds its place.
Well here's the process I went through in the last few years:
I found out about LuaTeX, saw it was supposed to replace pdfTeX and thought the future of TeX was bright.
Then I saw the continued efforts in LaTeX3 and thought that was weird and wasteful: code now looks even worse with all the \ExplSyntaxOn ... \ExplSyntaxOff sections and the new command syntax like \exp_args:Ne. If you're going to have a mix of two languages anyway, it makes much more sense for the second language to be a minimal but real programming language like Lua.
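For a concrete (simplified, made-up) comparison of the two approaches, here is the same toy macro written in expl3 and then delegated to Lua under LuaLaTeX; `\mytwice` and `\luatwice` are invented names:

```latex
% expl3 (the LaTeX3 programming layer; preloaded in modern LaTeX kernels)
\ExplSyntaxOn
\cs_new:Npn \mytwice #1 { #1 ~ #1 }  % typeset the argument twice
\ExplSyntaxOff

% LuaLaTeX: the same idea, with the logic in Lua via \directlua
\newcommand{\luatwice}[1]{\directlua{tex.sprint("#1 #1")}}
```

Both define a command that repeats its argument; the expl3 version requires learning the `:Npn`-style signature conventions, while the Lua version is ordinary Lua behind a thin TeX wrapper.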
Then the LuaTeX devs moved their efforts to LuaMetaTeX and I found myself scratching my head.
Then I spent some time with typst. Now I don't care what happens in TeX land... The experience with typst is incomparably better, and the pace of development is high in both the core language and the ecosystem. Features that took a decade to be fleshed out in LaTeX are sprouting like mushrooms in typst. It's not a fair fight.
The author is a PhD student who has been using LaTeX heavily for 10 years. But what should a new student use, and why? When the only reason to choose LaTeX is old colleagues and gatekeeping publishers, I know it's a matter of time.
> The author is a PhD student who has been using LaTeX heavily for 10 years. But what should a new student use, and why? When the only reason to choose LaTeX is old colleagues and gatekeeping publishers, I know it's a matter of time.
Sadly it's more than that. Will we be able to compile a Typst file made today in 10 years? I have to do that regularly with LaTeX. Will everybody one collaborates with also use Typst? Very unlikely. A new PhD student may find it beneficial to write papers with someone who only uses LaTeX. Then why bother with Typst? (And I really want Typst to win.)
> Then the LuaTeX devs moved their efforts to LuaMetaTeX and I found myself scratching my head.
This makes sense: they're essentially “done” with LuaTeX; it works fine and is distributed with TeX Live; it's up to others to use it. Many people are in environments (submission to journals that insist on pdfTeX, etc.) where they cannot use LuaTeX; the LuaTeX developers cannot change that. (It has minor differences from TeX/pdfTeX, but they don't seem keen to fix them.) Meanwhile, for those who are willing to use a new system, the developers might as well simplify (remove the backward-compatibility requirements) and make a better typesetting system more suited to the needs of ConTeXt.
In other words, users can be divided into:
- those who insist/need to use pdfTeX + LaTeX
- those who are willing to try something different, for which there's ConTeXt (with lmtx) or (further afield) Typst, etc.
> The author is a PhD student who has been using LaTeX heavily for 10 years. But what should a new student use, and why? When the only reason to choose LaTeX is old colleagues and gatekeeping publishers, I know it's a matter of time.
From what I have seen, it's Overleaf.
Do newer students know or care what flavour of TeX Overleaf uses in the background? Not as far as I have seen.
Typst does look very nice based on the brief look I took at it, but besides the questions of adoption and entrenchment that have been raised in parallel comments, the choice of Rust as the implementation language is also concerning to me. I think that Rust and the community that surrounds it are associated with a newer, simultaneously somewhat trend-chasing and decidedly paternalistic culture of software engineering that does not mesh well with the longtermist demands of science and scientific publishing. Concretely,
* Where LaTeX evidently favours doing whatever it takes to achieve a desired result (exhibit 1 being the article we are discussing), Rust itself and the culture that begot it are clearly on the side of decreeing a Right Way from high above and treating the possibility of deviating from it as a bug. In the light of the discussion in Footnote 7, I could for example imagine a Rust-minded typesetting system designer decreeing that unnumbered "displaystyle" math will not be supported.
* Cultural acceptance of mandatory automatic updates means that backwards compatibility may actually be considered an anti-goal.
* Cultural acceptance of ideology/politics in software engineering brings the danger of invasive conditions. What if, by way of an aggressively interpreted CoC, {receiving funding from military/police-aligned agencies, working with Russian collaborators, working with Iranian collaborators} becomes grounds for being excluded from issue discussions or package repositories? (I do take note that Typst does not currently show signs of doing anything like this, but the tone of the wider Rust community does have to be taken into account.)
Of course all these concerns are speculative, but scientific papers can be a nightmare scenario of maintenance (half a year's worth of work, a one-digit number of people in the world qualified to write it, a two-digit number of people who will bother to read it). Under those constraints, some measure of paranoia feels appropriate.
Typst is exciting, but I wish their roadmap would prioritize features in common use in LaTeX like microtype rather than niche new features like style revocation.
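For reference, the LaTeX side of this is a single preamble line (a sketch of typical usage, not taken from any particular document):

```latex
% microtype: character protrusion and, on pdfTeX/LuaTeX, font expansion
\usepackage{microtype}
```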
Typst has nowhere near the control and reproducibility of LuaTeX. Its core syntax is quite weak and will be a bottleneck in the future. It is especially weak for structuring huge documents, and for mathematics.
LuaTeX is stupid, but it still has needed features which none of the other markup languages possess.
Many of the equations and syntax I used in my PhD work can't be written in Typst. For instance, Young tableaux, or commutative diagrams. Or circuits generated from inline code. I am not even sure if it has been coded in a way to support such extensions.
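For context, both are well covered on the LaTeX side by dedicated packages; a minimal sketch (the package names are real, the diagram content is invented):

```latex
\documentclass{article}
\usepackage{tikz-cd}   % commutative diagrams
\usepackage{ytableau}  % Young tableaux
\begin{document}
% a commutative square with labelled arrows
\begin{tikzcd}
A \arrow[r, "f"] \arrow[d, "g"'] & B \arrow[d, "h"] \\
C \arrow[r, "k"']                & D
\end{tikzcd}

\ydiagram{3,2,1} % a Young diagram of shape (3,2,1)
\end{document}
```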
So I would not recommend a new PhD student (actually undergrad student) to learn Typst just yet.
> Yes, it’s $YEAR and we’re still producing PDFs. Again, historical reasons plus the fact that most people doing maths are not necessarily very interested in computers.
I generally find PDFs to be a very agreeable way to read the sort of content LaTeX is typically used to write. And in writing it myself I don't need to think about what weird layout issues someone else might encounter when viewing my content. There are certainly accessibility issues with PDFs, but also ways to mitigate that[0].
[0] https://libguides.lib.msu.edu/c.php?g=995742&p=8207771
> Yes, we are aware of typst. I think it’s cool, but C++ hasn’t replaced C, Rust hasn’t replaced C++, Typst is unlikely to replace LaTeX. Likewise, many are aware of LuaTeX, but, again, the entrenching of a 40-odd year system is not to be underestimated. I am rooting for typst, anyway, and hope it finds its place. A good place to start would be to provide a compilation toolchain from typst to TeX, if they really want to replace TeX.
Pandoc[0] can convert Typst to LaTeX.
IMO, if you are able to write in Typst, write in Typst; it's so much better and more readable. Your final LaTeX3 macro is hard to read and difficult to parse with the eyes... Also, Typst is easier to learn.
[0] https://pandoc.org/try/
Pandoc is seriously under-powered for the kinds of things that LaTeX and Typst can do. Much of the information in Typst/LaTeX source code would simply be ignored during the conversion. It is fine for simple documents, but cannot handle much beyond that.
A nice intro to making macros, which is one of the most powerful parts of LaTeX indeed.
Autoref itself seems a fine way of messing up your references and making your source code less readable. The beauty of naming is that you have the context at hand. Moving around blocks of text, or adding and removing text, happens throughout the process. With autoref, you now have to remember to _sometimes_ update the refs or get subtly different references. I wouldn't trust myself to get that right.
Markdown does not specify how to add labels and cross-references for figures, equations, tables, etc. Many moons ago, I asked about them on the CommonMark forum[1] and described a syntax that was general and internationalizable. Given that CommonMark has frozen the Markdown specification, I implemented a consistent label and cross-reference syntax for my editor[2], KeenWrite[3]. These labels and cross-references are translated to XHTML, then transformed from XHTML into ConTeXt macros that are subsequently typeset.
[1]: https://talk.commonmark.org/t/cross-references-and-citations...
[2]: https://gitlab.com/DaveJarvis/KeenWrite/-/blob/main/docs/ref...
[3]: https://keenwrite.com/
I still wonder why anyone would create such awful grammar for a programming language. Considering LaTeX's initial release was 40 years ago, there were certainly other programming languages from which to draw inspiration.
And I certainly don't believe that the LaTeX DSL was the most suitable solution for solving typesetting problems.
LaTeX's grammar makes much more sense when you spend some time using raw TeX with no macros at all - it's surprisingly capable on its own, and LaTeX then reads much more obviously as a set of "slight improvements".
Many things people think of as LaTeX are actually just TeX.
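A minimal plain-TeX file (no LaTeX involved) illustrates how much already works at that level; this is a made-up example, compiled directly with `pdftex`:

```tex
% plain TeX: no \documentclass, no packages
\def\heading#1{\medskip\centerline{\bf #1}\smallskip}
\heading{A tiny document}
Inline math works out of the box: $e^{i\pi} + 1 = 0$,
and so does display math:
$$\int_0^1 x^2 \, dx = {1\over 3}.$$
\bye
```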
I would say that the TeX language was designed for the final user to add the "last mile", not for piling layers of macro substitution on top of something akin to lambda calculus. As amazing a feat of engineering as LaTeX is, it has abused the TeX language beyond its natural limits. The price paid in complexity for abstraction was high. But the TeX language itself is a tiny, elegant language.
Having studied a lot of the TeX source code and re-implemented portions of it, my theory is that it's the result of the TeX language evolving organically in a software development environment in which large scale refactoring is impossible. (Knuth didn't have source control, or unit tests, and the language he wrote TeX in has little-to-no support for meaningful abstractions. All of these make refactoring safely hard or impossible.) If you can't refactor, but still want to add a new feature, your only option is to implement things in a hacky way on top of existing features. This then bleeds into the TeX language itself.
Nice in theory; in practice you have LaTeX tools with SyncTeX, command, environment, and reference autocompletion, live math preview, proper syntax highlighting, jump-to-error-line, etc. Nothing like that is available for Pandoc Markdown AFAIK, except perhaps for Quarto, which may have its uses but is too slow for small/medium-sized documents, and its tooling is not that capable anyway. Besides, it adds yet another complex layer on top of an already way too layered stack.
Am I missing something, or is this useless when you want to refer to an equation from "the past"? I find myself doing that often. For example, a methods section might pose an optimization problem, then the experiments section a page or two later says "we solve the optimization `\eqref{main-problem}` using Solver X".
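For comparison, the plain `\label`/`\eqref` workflow handles such backward references with no extra machinery (a sketch assuming amsmath; the label and the problem are invented):

```latex
% requires \usepackage{amsmath} for \eqref and \text
\begin{equation}\label{main-problem}
  \min_{x}\; f(x) \quad \text{s.t.}\quad g(x) \le 0
\end{equation}
% ... a page or two later ...
We solve the optimization~\eqref{main-problem} using Solver X.
```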
Nice piece, and his comment on Typst is spot on. I would love for Typst to displace LaTeX. I hate LaTeX and use it every day, and deep inside know it will never go away, unless a much better programmer than I writes a LaTeX -> Typst converter that covers all the corner cases. One can always hope.
> It’s weird I even have to say this, but don’t stalk me and email me at my personal address.
I am not sure what happened, and it must have been unpleasant, but if someone goes up your website hierarchy, they reach https://commutative.xyz/~miguelmurca/, click the only link, and find that you personally list your profiles there, including email, GitHub and Instagram. It is OK if someone contacts you, for whatever reason, at addresses and profiles that you explicitly shared.
> If you email me anywhere else, I will not respond. I also cannot force you to follow basic etiquette if you do write, but it would be appreciated.
This is rude, it looks bad in the article, and you are the one who doesn't follow basic netiquette.
I'm sorry if you felt that addendum was aggressive. However, I still feel like I'm justified in making it absolutely clear how (and if) I want to be contacted. I am in a complicated position where I'm speaking to a niche -- not an imaginary niche, by any means -- but actually addressing every reader. In the face of this, my option was to clearly state my boundaries, regardless.
I list an email address at the end of every article, for the purposes of discussing the content of the article. I add a "+ext" to every email I list (including the one on my personal home page). I had people ignore the email I stated in the article, find a different email, strip it of the + tags, and email me there in a foreign language and opening with "I assume you speak X" (presumably because of my name?).
I disagree with you that it's fair game to do the above. OSINT is, well, legal, and I'm not trying to hide my identity, clearly. But I would still be upset if, for example, someone wrote to my university email (which is not hard to find out, by your own procedure) regarding this post.
Nonetheless, again, it is not my goal to sound rude, but simply to set boundaries and expectations. I will remove the second sentence towards this goal, but stand by its objective meaning, and will keep the rest as is.
> Pandoc[0] can convert Typst to LaTeX.

Pandoc can "convert" HTML to things too, but you wouldn't use it as a web browser.
> I still wonder why anyone would create such awful grammar for a programming language.

The problem with LaTeX was always the lack of a proper underlying programming language and data model. The syntax has always been excellent.
Real LaTeX users don't use LaTeX to write documents.
Really? That's a bold claim. Got any source or stats for that?
I assure you I am a real LaTeX user and I use LaTeX to write documents. I see everyone around me doing the same.
> C++ hasn’t replaced C, Rust hasn’t replaced C++, Typst is unlikely to replace LaTeX.

Weird conclusion, because LaTeX has mostly replaced TeX.
There is a nice symmetry here:
C -> C++ -> Rust ~ Typst <- LaTeX <- TeX