top | item 8130542

spacemanaki | 11 years ago

It's unlikely to occur as the type is exponential in size and thus quite unwieldy to do anything with! I haven't worked with generated ML or Haskell code, but I still think it's unlikely to be an issue in "real" code.
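
For intuition, a minimal sketch of that blowup (in Python purely for illustration — `pair` mimics an ML function like `fun pair x = (x, x)`, whose inferred type doubles in size on every application):

```python
# Illustrative sketch: each application of `pair` doubles the size of
# the result, and in a Hindley-Milner system the inferred type grows
# the same way -- n nested applications give a type with 2^n leaves.

def pair(x):
    return (x, x)

def count_leaves(t):
    """Count the leaves of a nested pair, i.e. the size of its 'type'."""
    return count_leaves(t[0]) + count_leaves(t[1]) if isinstance(t, tuple) else 1

v = 0
for _ in range(4):          # four nested applications of pair...
    v = pair(v)

print(count_leaves(v))      # ...yield 2^4 = 16 leaves
```

So a program only a handful of lines long can have a type too large to print, let alone read — which is why it rarely bites anyone in practice.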

nostrademons | 11 years ago

It's rarely a compile-time performance issue because O(2^N) algorithms perform well on modern hardware when N ~= 3. You could nest the `double` example from the article 6 times and it would still only be a 64x slowdown.
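
Concretely (a sketch in Python rather than ML; `double` here just applies its argument function twice, as in the article): six nestings apply the underlying function 2^6 = 64 times.

```python
# `double` applies its argument function twice, so nesting it n times
# applies the base function 2^n times -- the 64x slowdown mentioned above.

def double(f):
    return lambda x: f(f(x))

inc = lambda x: x + 1
nested = double(double(double(double(double(double(inc))))))

print(nested(0))   # inc applied 2^6 = 64 times
```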

Where it does show up, annoyingly, is in error messages. You actually see this all the time with GCC's STL error messages (C++ templates provide a parameterized type system very similar to Haskell's or ML's, but without the inference, at least until C++11). STL containers are usually parameterized not only over the key and value, but over the comparator and allocator as well. Nest a couple of them and you suddenly have a type with several dozen parameters. GCC expands the whole type out, without typedefs, when printing error messages, which means you will sometimes get compile errors that are 10 pages long.

Clang fixes this by preserving the typedefs, without which you wouldn't want to write such a type in the first place. Haskell has the same problem, except worse because it infers types, but GHC has gone to great pains to format error messages so that they are at least readable, if verbose.

ubernostrum | 11 years ago

XML/SGML people know this sort of thing (exponential resource use due to expanding references) as the "billion laughs" attack. There are real-world consequences to this sort of stuff.
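
For scale, a rough sketch of the arithmetic behind the classic attack (assuming the usual layout of nine entity levels, each expanding to ten copies of the one below; this only computes the expanded size rather than feeding anything to an XML parser):

```python
# Each entity level expands to ten copies of the previous one, so nine
# levels above a 3-byte base string ("lol") yield 3 * 10^9 bytes of
# output from a document of under a kilobyte.

def expanded_size(levels, copies_per_level=10, base="lol"):
    size = len(base)
    for _ in range(levels):
        size *= copies_per_level
    return size

print(expanded_size(9))   # 3000000000 bytes, i.e. roughly 3 GB
```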