tiger10guy's comments

tiger10guy | 11 years ago | on: On the foolishness of “natural language programming”

The ideas that I, as a programmer in the traditional sense, want to communicate to computers are not the ideas I want to communicate to humans.

I might ask my friend to move the report he's working on to a shared network location so I can load it into my computer and read it: "Hey Joe, can you move the report to the share?"

Joe might ask the computer to do the same thing: "cp /home/joe/reports/cool_report.pdf /network/share/reports/cool_report.pdf"

The actual ideas that are communicated are very similar, but not the same. English is good for communicating one idea while bash/GNU is good for communicating the other.

Just because English has some established formalism doesn't mean it's good at communicating the ideas we want to communicate to computers.

BTW, I don't care which field you put the issue under; it's the same issue and anyone who cares about it might contribute to the discussion.

tiger10guy | 11 years ago | on: Numenta Platform for Intelligent Computing

In the March 2014 version of Li Deng and Dong Yu's book on Deep Learning, they briefly relate Hierarchical Temporal Memory (HTM) to the Convolutional Neural Networks that are popular in Deep Learning.

http://research.microsoft.com/apps/pubs/default.aspx?id=2093...

It's worth noting that most people doing Deep Learning aren't trying to replicate the brain, but just want to do a better job at Machine Learning (ML) and Artificial Intelligence. Here's how I see it as someone working on Deep Learning; someone correct me if I'm wrong.

Deep Learning:
Trying to do ML - yes
Trying to replicate brain - no (for the most part)

Numenta (HTM/CLA):
Trying to do ML - yes (not sure how much they succeed)
Trying to replicate brain - yes, but (i) we don't know exactly how the brain works and (ii) they make approximations

Projects like Nengo (http://nengo.ca/):
Trying to do ML - no
Trying to replicate brain - yes

I'm not very familiar with Nengo.

Edit: formatting

tiger10guy | 12 years ago | on: Parallelising Python with Threading and Multiprocessing

I haven't used that, but it looks interesting. After a brief look it seems like they both submit jobs to Python interpreters started up in other processes.

Parallel Python (PP) seems to have a clunkier API, but also more functionality. I think the biggest advantage is that it can distribute jobs over a cluster instead of just different cores on the same machine. I might look into PP if I need to do things on a cluster, but I think I'll still stick with joblib when I'm on one machine.
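Both joblib and PP ultimately farm jobs out to worker processes, much like the standard library's multiprocessing module that the article covers. As a minimal sketch of that shared idea (the function and numbers are just an illustration):

```python
from multiprocessing import Pool

def square(x):
    """A toy job to run in worker processes."""
    return x * x

if __name__ == "__main__":
    # Distribute the jobs over 2 worker processes on the same machine.
    with Pool(processes=2) as pool:
        results = pool.map(square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

joblib's Parallel/delayed and PP's job submission both wrap this pattern; PP's extra machinery is what lets it push the jobs to other machines in a cluster rather than just other cores.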

That's just my first impression. I'd be interested to read your blog post.

tiger10guy | 12 years ago | on: 2048 in 4D

You can forget about dimensions... yet it's still in 4 dimensions. That's what's so cool about it!

tiger10guy | 12 years ago | on: 2048 in 4D

That makes sense.

I think my intuitions for the discrete and continuous cases are similar in 2D and 3D. Are they fundamentally similar? How do they differ, even if only a little? It seems that discrete 4D intuition should somehow help with continuous 4D intuition.

tiger10guy | 12 years ago | on: 2048 in 4D

It's so hard to get intuitions about higher dimensions and this seems to do an incredible job. The game is simple enough and the cardinality of the dimensions low enough (2x2x2x2) that I can actually play it smoothly (having played the original and with a bit of practice at the 4D version).

Does that mean I've formed 4D intuition? I think this just happens to be in a class of 4D mechanics that's isomorphic to 2D variants, so the answer would be no. If so, how many such 2D variants are there?

For those interested, the 4D Rubik's Cube:

http://www.superliminal.com/cube/cube.htm

tiger10guy | 12 years ago | on: 2048 in 4D

It seemed like too short a time. I figured someone had thought of it first, but I still couldn't help saying something. Nice work!