tiger10guy's comments
tiger10guy | 10 years ago | on: The future of UI is text
tiger10guy | 11 years ago | on: The Church of TED
Dan Dennett has said similar things.
tiger10guy | 11 years ago | on: Isaac Asimov Mulls “How Do People Get New Ideas?” (1959)
... but which variables?
The number of combinations grows exponentially as you add variables, so you can't consider too many.
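The blow-up alluded to here is easy to demonstrate; a minimal Python sketch (the variable names are illustrative, not from the comment):

```python
from itertools import product

# With k binary variables there are 2**k joint settings,
# so exhaustively trying them becomes infeasible quickly.
for k in (2, 4, 8):
    settings = list(product([0, 1], repeat=k))
    print(f"{k} variables -> {len(settings)} combinations")
```

Doubling the variable count squares the search space, which is why picking *which* variables to vary matters.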
tiger10guy | 11 years ago | on: Do Deep Nets Really Need to be Deep? [pdf]
tiger10guy | 11 years ago | on: On the foolishness of “natural language programming”
I might ask my friend to move the report he's working on to a shared network location so I can load it into my computer and read it: "Hey Joe, can you move the report to the share?"
Joe might ask the computer to do the same thing: "cp /home/joe/reports/cool_report.pdf /network/share/reports/cool_report.pdf"
The actual ideas that are communicated are very similar, but not the same. English is good for communicating one idea while bash/GNU is good for communicating the other.
Just because English has some established formalism doesn't mean it's good at communicating the ideas we want to communicate to computers.
BTW, I don't care which field you put the issue under; it's the same issue and anyone who cares about it might contribute to the discussion.
tiger10guy | 11 years ago | on: Deep Learning Image Classifier
This is the implementation: http://torontodeeplearning.github.io/convnet/
tiger10guy | 11 years ago | on: Numenta Platform for Intelligent Computing
tiger10guy | 11 years ago | on: Numenta Platform for Intelligent Computing
http://research.microsoft.com/apps/pubs/default.aspx?id=2093...
It's worth noting that most people doing Deep Learning aren't trying to replicate the brain, but just want to do a better job at Machine Learning (ML) and Artificial Intelligence. Here's how I see it as someone working on Deep Learning; someone correct me if I'm wrong.
Deep Learning:
- Trying to do ML: yes
- Trying to replicate the brain: no (for the most part)

Numenta (HTM/CLA):
- Trying to do ML: yes (not sure how much they succeed)
- Trying to replicate the brain: yes, but (i) we don't know exactly how the brain works, and (ii) they make approximations

Projects like Nengo (http://nengo.ca/):
- Trying to do ML: no
- Trying to replicate the brain: yes
I'm not very familiar with Nengo.
Edit: formatting
tiger10guy | 12 years ago | on: Untapped opportunities in AI
tiger10guy | 12 years ago | on: The worst response to a great idea
tiger10guy | 12 years ago | on: Parallelising Python with Threading and Multiprocessing
Parallel Python (PP) seems to have a clunkier API, but also more functionality. I think the biggest advantage is that it can distribute jobs over a cluster instead of just different cores on the same machine. I might look into PP if I need to do things on a cluster, but I think I'll still stick with joblib when I'm on one machine.
That's just my first impression. I'd be interested to read your blog post.
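For the single-machine case discussed above, the standard-library `multiprocessing` route (the subject of the linked article) looks roughly like this; a minimal sketch, with `square` standing in for real CPU-bound work:

```python
from multiprocessing import Pool

def square(x):
    # Stand-in for a CPU-bound task worth distributing across cores.
    return x * x

# Pool.map splits the iterable across worker processes and
# reassembles the results in order.
with Pool(processes=2) as pool:
    results = pool.map(square, range(5))
print(results)  # [0, 1, 4, 9, 16]
```

Distributing over a cluster rather than local cores is where tools like Parallel Python (or joblib with a suitable backend) come in.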
tiger10guy | 12 years ago | on: Parallelising Python with Threading and Multiprocessing
tiger10guy | 12 years ago | on: HN Plays 2048 (democracy)
tiger10guy | 12 years ago | on: 2048 in 4D
tiger10guy | 12 years ago | on: 2048 in 4D
I think my intuitions for the discrete and continuous are similar for 2D and 3D. Are they fundamentally similar? How do they differ, even if only a little? It seems that discrete 4D intuition should somehow help with continuous 4D intuition.
tiger10guy | 12 years ago | on: 2048 in 4D
Does that mean I've formed 4D intuition? I think this just happens to be in a class of 4D mechanics that's isomorphic to 2D variants, so the answer would be no. If so, how many such 2D variants are there?
For those interested, a 4D Rubik's Cube:
tiger10guy | 12 years ago | on: 2048 in 4D
tiger10guy | 12 years ago | on: 2048 in 4D
(in response to https://news.ycombinator.com/item?id=7417294)
tiger10guy | 12 years ago | on: 2048 in 3D
tiger10guy | 12 years ago | on: Xenia - An Xbox 360 emulator