wall_words's comments

wall_words | 10 years ago | on: Leaf: Machine learning framework in Rust

The performance graph is deceptive for two reasons: (1) Leaf with cuDNN v3 is slightly slower than Torch with cuDNN v3, yet the bar for Leaf is positioned to the left of the one for Torch, and (2) there's a bar for Leaf with cuDNN v4, but no corresponding bar for Torch with cuDNN v4.

It's good to see alternatives to Torch, Theano, and TensorFlow, but it's important to be honest with the benchmarks so that people can make informed decisions about which framework to use.

wall_words | 10 years ago | on: Benchmarks for Blaze, A high-performance C++ math library

I've used Blaze for machine learning applications, where I've relied on the performance of elementwise operations and dense matrix multiplication on a single machine (the results advertised in the benchmark). Eigen has more functionality, but in my experience is not always optimized as well as Blaze. Neither has support for distributed computing, but I believe this is a problem that HPX is trying to address: https://github.com/STEllAR-GROUP/hpx

wall_words | 10 years ago | on: Benchmarks for Blaze, A high-performance C++ math library

I've had great success with Blaze, despite the fact that it has received little publicity compared to alternatives like Eigen, Armadillo, etc. Blaze is consistently the leader of the pack in benchmarks, and even outperforms Intel MKL on the Xeon E5-2660 (the CPU for which the benchmark results are shown).

wall_words | 11 years ago | on: Show HN: Paperman – LaTeX editor with a Markdown feel

If you want to generate LaTeX from Markdown, you can use Pandoc. Pandoc has various extensions to regular Markdown (including inline math, tables, etc.), so this gives you some flexibility when producing more complicated types of documents. In fact, Pandoc converts from Markdown to LaTeX to PDF when you choose PDF as the output format.

wall_words | 11 years ago | on: A visual proof that neural nets can compute any function

This is an important statement and should be upvoted more. Case in point: "the Weierstrass approximation theorem states that every continuous function defined on a closed interval [a, b] can be uniformly approximated as closely as desired by a polynomial function."

wall_words | 11 years ago | on: A visual proof that neural nets can compute any function

> but a migration of research interest away from neural nets seemed increasingly promising, and today, the migration seems largely complete.

What are you talking about? Deep learning is one of the hottest areas of research today, and a lot of it has to do with neural networks. NN's are the state of the art in several domains. Case in point: http://image-net.org/challenges/LSVRC/2014/results. All of the top entries use convolutional networks; in fact, almost all of the entries do.

The fact that neural networks' loss functions can be highly nonconvex is part of what makes them so effective in the domains where they are used. See this presentation by Yann LeCun for more info: http://www.cs.nyu.edu/~yann/talks/lecun-20071207-nonconvex.p...

"ML theory has essentially never moved beyond convex models, the same way control theory has not really moved beyond linear systems. Often, the price we pay for insisting on convexity is an unbearable increase in the size of the model, or the scaling properties of the optimization algorithm ... This is not by choice: non-convex models simply work better. Have you tried acoustic modeling in speech with a convex loss? ... To learn hierarchical representations (low-level features, mid-level representations, high-level concepts....), we need “deep architectures”. These inevitably lead to non-convex loss functions."

This isn't to say that NN's are going to solve all our problems, but to say that there has been a shift in interest away from NN's is absurd.

wall_words | 11 years ago | on: Conrod – A Rust GUI Library

Interest in the development of C++ has never been more vigorous. Representatives from more companies than ever before are getting involved in C++ standardization and giving talks: see http://cppcon.org for details. There are now numerous study groups involved in standardization, each focusing on a specific feature targeted for C++17. Scroll down to the bottom of this page for details: https://isocpp.org/std/the-committee.

C++11 and C++14 have done a lot to enable users to write clean, efficient code. I probably would not be using C++ today if I were forced to write in C++03 style. I am still not convinced that C++ has gotten any easier to learn over time. But for those who know how to use it well, no language is better at letting you write clean, efficient, and portable code.
