npp's comments
npp | 10 years ago | on: Marvin Minsky dies at 88
npp | 10 years ago | on: How Apple Influenced The Labels To Shut Down My Music Streaming Startup
This business seems to have been run, and to have played out, horribly, for exactly the reasons that Caldwell and everyone else with experience in that area is all too familiar with.
Is there anything legitimately interesting to the "Apple" and "Steve Jobs" parts of this story other than the usual clickbait?
npp | 12 years ago | on: Why Wikipedia's A/B testing is wrong
This is discussed in many places, but here are a few examples:
http://www.johndcook.com/blog/2010/01/19/dont-invert-that-ma...
http://en.wikipedia.org/wiki/Linear_least_squares_(mathemati...
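The point of the first link can be sketched in a few lines of NumPy (the data here is made up for illustration): solve a least-squares problem with a dedicated solver instead of explicitly forming a matrix inverse, which is slower and numerically less stable.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.standard_normal(100)

# Bad habit: explicitly inverting the normal-equations matrix.
x_inv = np.linalg.inv(A.T @ A) @ (A.T @ b)

# Better: a QR/SVD-based least-squares solver, as the linked post advises.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_inv, x_lstsq, atol=1e-6))
```

On well-conditioned problems the two answers agree, but the solver route is the one that stays accurate as conditioning degrades.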
npp | 13 years ago | on: A First Course in Linear Algebra
I would suggest trying these videos: http://www.stanford.edu/~boyd/ee263/videos.html. The prerequisites are very low and a main focus is on interpreting the abstract concepts in applications.
npp | 14 years ago | on: More Stanford Online Classes by January
npp | 15 years ago | on: Show HN: Readability-like API Using Machine Learning
In any case, if feature extraction is taking too much time, one common approach is to dynamically select which features to extract for a test example, based both on each feature's expected predictive value (e.g. via mutual information or some other feature selection method) and on the time it takes to actually compute it. The cost can be measured by, say, average computation time per feature on the training set. This can speed things up a fair bit, since you only bother computing the features you really need, and are biased toward the ones that are quick to compute. It may not translate to your particular application, though; if I remember correctly, I've seen it used a while back for image spam classification.
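A minimal sketch of that idea, where all feature names and numbers are purely illustrative (in practice the mutual-information scores and per-feature costs would be measured on your training set):

```python
# Hypothetical per-feature scores and extraction costs, measured offline.
mutual_info = {"link_density": 0.30, "text_length": 0.25, "dom_depth": 0.10}
avg_cost = {"link_density": 0.002, "text_length": 0.001, "dom_depth": 0.050}  # seconds

def select_features(budget_seconds):
    """Greedily pick features with the best information-per-second ratio
    until the extraction-time budget is exhausted."""
    ranked = sorted(mutual_info, key=lambda f: mutual_info[f] / avg_cost[f],
                    reverse=True)
    chosen, spent = [], 0.0
    for f in ranked:
        if spent + avg_cost[f] <= budget_seconds:
            chosen.append(f)
            spent += avg_cost[f]
    return chosen

print(select_features(0.004))  # -> ['text_length', 'link_density']
```

At test time you then extract only the selected features; the expensive, low-value ones (here, `dom_depth`) never get computed.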
npp | 15 years ago | on: Ask HN: High-level math, useful?
It helps to know what a topology is, but not much more, and you would learn enough "on the way" in learning analysis properly. It helps to know what groups are, because they do show up in practical things, but you don't really need to know full-up "group theory". (They show up because they capture the idea of symmetries, and it is useful in certain practical situations to talk about something being symmetric with respect to various transformations, e.g. under permutations or rotations or whatever. But in this case you don't tend to do much analysis actually using group theory beyond this.) A whole course on abstract algebra is not necessary unless you're interested. It may help in some indirect way of "helping you think better", it may not.
See, say, http://junction.stanford.edu/~lall/engr207c/ as an example of an EE course that does a fair amount of math.
(Also, above, I don't mean 'applicable' in the very indirect sense of "helping you think better" -- I mean people use it to do real stuff. Whether you want to do that stuff is another story -- there are certainly good things in EE/CS that don't require this kind of math.)
npp | 15 years ago | on: Ask HN: So what's new in the world of A.I.?
1. Foundations and Trends in Machine Learning -- this is a journal aimed at publishing a very small number of well-written survey papers on various trends in ML. This is easier to follow than an entire conference (much lower traffic, higher signal/noise), and should be readable for a wider audience (assuming they are math-inclined).
2. Conferences like Algorithms for Modern Massive Datasets are practically-oriented, well attended by a lot of industry, and involve a lot of AI: http://www.stanford.edu/group/mmds/. Look through the speakers and topics. This is one example, there are others.
3. A lot of important tech companies have teams that do AI and AI-type things, at least using the modern definition of AI (Google, Facebook, Twitter, LinkedIn, Netflix, Amazon, Microsoft, eBay, even Apple with its Siri acquisition; there are others). This is not to mention people using this stuff in other areas, like finance and bioinformatics. These groups sometimes talk about what they're working on, so you can check this out.
npp | 15 years ago | on: Ask HN: Learning advanced math
Miscellaneous comments:
- Reading pure abstract algebra (e.g. Dummit & Foote) isn't a good use of time if you intend to go into statistics, since it only shows up in a few very special subareas. If you decide to go into one of these areas, you can learn this later.
- More advanced books on linear algebra usually emphasize the abstract study of vector spaces and linear transformations. This is fine, but you also need to learn about matrix algebra (some of which is in that Horn & Johnson book) and basic matrix calculus, since in statistics, you'll frequently be manipulating matrix equations. The vector space stuff generally does not help with this, and this material isn't in standard linear algebra books. (Similarly, you should learn the basics of numerical linear algebra and optimization -- convex optimization in particular shows up a lot in statistics.)
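As one concrete instance of the kind of matrix calculus meant above (a standard identity, checked numerically here on made-up data): for the least-squares objective f(beta) = ||y - X beta||^2, the gradient is -2 X^T (y - X beta).

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 4))
y = rng.standard_normal(20)
beta = rng.standard_normal(4)

# Analytic gradient from the matrix-calculus identity.
analytic = -2 * X.T @ (y - X @ beta)

# Central finite-difference approximation of the same gradient.
eps = 1e-6
numeric = np.empty(4)
for i in range(4):
    e = np.zeros(4)
    e[i] = eps
    f_plus = np.sum((y - X @ (beta + e)) ** 2)
    f_minus = np.sum((y - X @ (beta - e)) ** 2)
    numeric[i] = (f_plus - f_minus) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-4))
```

Being able to derive and trust identities like this one, rather than checking them numerically, is exactly what the matrix-calculus background buys you.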
- People have different opinions on books like Rudin, but you need to learn to read material like this if you're going into an area like probability. It's also more or less a de facto standard, so it is worth reading partly for that reason as well. So read Rudin/Royden (or an equivalent; there are a small handful of others), but supplement them with other books if you need to (e.g. 'The Way of Analysis' is the complete opposite of Rudin in writing style). It helps to read a few different books on the same topic simultaneously, anyway.
- Two books on measure-theoretic probability theory that are more readable than many of the usual suspects are "Probability with Martingales" by Williams and "A User's Guide to Measure-Theoretic Probability" by Pollard. There is also a nice book called "Probability through Problems" that develops the theory through a series of exercises.
npp | 15 years ago | on: Ask HN: Is a Ph.D. in CS worth it?
This has a number of benefits: worthwhile non-academic experience, better sense of whether you really want to do a PhD or whatever else, usually more focus when you do go back because you have had time to reflect on what exactly you want to do and get out of it, some general maturity that comes from working rather than just being in school, less pressure in making a big decision right now, and so on. Since you aren't hell-bent on becoming a professor, it is good to see both some academia (your undergrad) and industry before jumping into a long-term thing like a PhD. It's also more comfortable applying to grad school from a job you already have rather than as an undergrad, since if you don't get in anywhere you like, you can simply stay at your job and even try again the following year. (This also all applies if you decide you just want to do an MS.)
Basically, you have to make your own decision about this, and this is a fairly simple (and productive) way to make the decision easier.
npp | 15 years ago | on: GraphLab: A New Parallel Framework for Machine Learning
npp | 15 years ago | on: Ask HN: Four months since launch: how are you finding the iPad?
npp | 15 years ago | on: Ask HN: machine learning success stories?
There are a number of examples that are not consumer-facing, like credit card fraud detection, snail mail routing, quantitative trading, market segmentation analysis, demand prediction for inventory control, and other things. It is also used for scientific data analysis in several areas, with bioinformatics being the really big one. There are other examples.
There are also applications that are not considered machine learning but use the same ideas for different purposes. An example would be modern error-correcting codes, which are used for things like compression and satellite communications, and are based on the same graphical models pervasive in machine learning.
There is hype, and some applications need only a little bit, but it is at least used in some real stuff.
npp | 15 years ago | on: Ask HN: Best book(s) to learn about the basics of economics?
Another option that is a bit more expensive but may be a lot more enjoyable is getting some DVDs from the Teaching Company. They get what are usually very good lecturers to give their courses, and
http://www.teach12.com/ttcx/coursedesclong2.aspx?cid=550
http://www.teach12.com/ttcx/coursedesclong2.aspx?cid=5610
seem to be the kinds of things you're looking for. I haven't watched these, but maybe worth a shot.
(Regarding some of the other suggestions: blogs are ok, but the more econ-heavy ones will assume some degree of familiarity with economic concepts, and will post on random topics in no particular order, so they don't really solve this problem. Non-mainstream books or famous monographs are also ok at some point, but as with all such things, it's generally best to at least understand the basic mainstream stuff first.)
npp | 16 years ago | on: Ask HN: So What Universities Are Good?
This at least avoids the "Java/C++ problem", but of course, there are many other very important factors to consider in choosing a school, and in particular, you should go with the best all-round school you have as an option, not the one that has a slightly "better" CS program but not much else.
npp | 16 years ago | on: John Gruber jumps the shark
npp | 16 years ago | on: IPad: $1 Billion Later, What Do You Think of It Now?
npp | 16 years ago | on: Mark Zuckerberg’s Non-Apology: Facebook Screwed up with Privacy. But Keep Sharing
If they had started the service today advertising the settings they have now, I think they would have trouble getting users.
npp | 16 years ago | on: The Hilbert Hotel
...He also said that he thought McCarthy was an idiot. :)