superfx
|
5 years ago
|
on: Pfizer submits Covid vaccine to FDA for approval, to distribute in December
Anyone know why China is not on the list of countries in which Pfizer is seeking regulatory approval?
superfx
|
5 years ago
|
on: Ask HN: Mind bending books to read and never be the same as before?
The Glass Bead Game is one of the best books I've read, and I think palpably changed how I think about some things even though I read it as an adult. I even wrote a blog post about it:
https://moalquraishi.wordpress.com/2013/05/05/the-glass-bead...
Another book I'd recommend (in a very different category) is Gödel, Escher, Bach by Douglas Hofstadter.
superfx
|
5 years ago
|
on: Why are Soviet math textbooks so hardcore in comparison to US textbooks? (2017)
Would you mind linking to the 21 Jump Street scene? I'm very curious. As an American nerd in my late 30s, I definitely recall being relentlessly teased as a kid. I don't have kids of my own, so I don't really know whether and how things have changed, but if true this would be a very positive development!
superfx
|
6 years ago
|
on: The coronavirus pandemic in five powerful charts
Doesn't that mean the effective R0 value for COVID-19 should be higher, then?
superfx
|
6 years ago
|
on: HK's extradition law: Not just HK people have reason to fear Chinese “justice”
This strikes me as the quintessential problem with autocracies. Sometimes one gets an extremely efficient government in the short term, when the autocrat is competent and not entirely corrupt, but in the long run whatever short-term gains were made are squandered by corruption and greed. Democracy is inefficient in the short term but efficient in the long run.
superfx
|
6 years ago
|
on: Free Wolfram Engine for Developers
superfx
|
6 years ago
|
on: Biological Function Emerges from Unsupervised Learning on 250M Protein Sequences
superfx
|
7 years ago
|
on: DeepMind StarCraft II Demonstration [video]
superfx
|
7 years ago
|
on: DeepMind StarCraft II Demonstration [video]
> Their chess AI beating Stockfish involved some at least a bit questionable setup.
I believe their eventual Science paper addressed these concerns.
superfx
|
7 years ago
|
on: The West Coast is beating the East Coast on transportation?
"E-scooters aren't a reliable way to get anywhere yet, and who knows if they'll ever be, not to mention that they are not for everyone. My grandmother is not going to ride one -- nor my wife, for that matter, nor should the kids. But the Subway is a common denominator."
I used to think so, but some European cities really do offer counterexamples. I'm thinking of places like Munich, Vienna, and Copenhagen. It's not uncommon to see people there who, by American stereotypes, wouldn't be expected to ride scooters: moms with kids, men in suits, etc. Perhaps the urban cultural gap is so vast that what you're saying is indeed true of the US, but I wouldn't take it as a given.
superfx
|
7 years ago
|
on: Learning Dexterity
It looks that way because they're moving rapidly from one face configuration to another. But there's no way that's happening at random. I would guess that even just holding the cube steady in a dynamic grip is quite difficult.
superfx
|
8 years ago
|
on: How 4000 Physicists Gave a Vegas Casino Its Worst Week (2015)
I think the exact same thing happened with NIPS 2011 in Reno.
superfx
|
8 years ago
|
on: End-to-end differentiable learning of protein structure
I would say the biggest thing is obviously the architecture: coupling LSTMs with the geometric units that spit out the actual 3D structure, which can then be directly optimized via the dRMSD loss function. That's the biggest point of distinction from everything else out there (no contact map prediction, etc.). So it really is about end-to-end differentiability IMO, which hasn't been done before.
As for why it took so long, it is and it is not fine-tuning. Getting RGNs to train _at all_ was a rather difficult process, and required a lot of fiddling around. But since I got them working, I haven't actually spent all that much time fine-tuning them, so I expect there to be a lot of low-hanging fruit in terms of optimizing performance (starting from the baseline I found).
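For anyone curious what the dRMSD computation actually looks like, here's a minimal NumPy sketch (my own illustration, not the actual RGN code): it compares all unique pairwise distances between predicted and true coordinates, which makes the loss invariant to global rotation and translation.

```python
import numpy as np

def drmsd(pred, true):
    """dRMSD between two (N, 3) coordinate arrays: the RMS difference
    of all unique pairwise distances. Because only internal distances
    are compared, the value is unchanged by rigid-body motion."""
    def pairwise(x):
        diff = x[:, None, :] - x[None, :, :]      # (N, N, 3) displacement tensor
        return np.sqrt((diff ** 2).sum(axis=-1))  # (N, N) distance matrix
    i, j = np.triu_indices(len(pred), k=1)        # unique pairs i < j
    dp, dt = pairwise(pred)[i, j], pairwise(true)[i, j]
    return float(np.sqrt(np.mean((dp - dt) ** 2)))
```

In the model itself the same computation is expressed in an autodiff framework, so gradients flow from the loss all the way back through the geometric units to the LSTM parameters.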
superfx
|
8 years ago
|
on: End-to-end differentiable learning of protein structure
Yes! Certainly on the source code, and hopefully on CASP13 too.
superfx
|
8 years ago
|
on: End-to-end differentiable learning of protein structure
Re drug discovery: oftentimes in “rational” drug design, medicinal chemists try to make small molecules that bind snugly into a binding pocket on the protein. Having the structure of the protein aids greatly in that process.
superfx
|
8 years ago
|
on: End-to-end differentiable learning of protein structure
I do think, however, that protein folding is very much understudied in the ML community relative to, say, the big three of vision, NLP, and speech. The lack of standardized data sets and benchmarks, not to mention the need for domain knowledge, has made it difficult to get into the field.
superfx
|
8 years ago
|
on: End-to-end differentiable learning of protein structure
Hi! I’m the author of the paper. Not sure why you say Rosetta isn’t mentioned? It’s extensively referenced throughout the paper, discussed in the discussion section, and is one of the top 5 CASP servers compared to in the results section.
Also, as for how it’s different from what’s described in the paper, that’s the topic of the paper’s introduction. Rosetta uses both fragment assembly and co-evolution methods.
superfx
|
8 years ago
|
on: Progressive Growing of GANs for Improved Quality, Stability, Variation [video]
But they seem to have picked nearest neighbors in pixel space instead of z-space, which is not the best idea, no?
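To spell out the distinction, here's a toy sketch (my own; `embed` is a hypothetical stand-in for whatever feature extractor one would use): the nearest-neighbor check can run on flattened raw pixels or on embeddings, and the two can retrieve very different "closest" training images.

```python
import numpy as np

def nearest_index(query, bank):
    """Index of the row in `bank` closest to `query` (Euclidean distance)."""
    return int(np.argmin(np.linalg.norm(bank - query, axis=1)))

def nearest_in_feature_space(query_img, train_imgs, embed):
    """Same search, but after mapping everything through an encoder `embed`
    (hypothetical), so similarity is judged in feature space, not pixel space."""
    q = embed(query_img)
    bank = np.stack([embed(x) for x in train_imgs])
    return nearest_index(q, bank)
```

Pixel-space neighbors reward per-pixel agreement (lighting, pose, background), while feature-space neighbors can surface semantically similar samples, which matters when checking whether a GAN has memorized its training set.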
superfx
|
8 years ago
|
on: Andrew Ng is raising a $150M AI Fund
Oh yes my bad. I meant to say I took Andrew's CS229, and Daphne's CS228. I never took CS221.
superfx
|
8 years ago
|
on: Andrew Ng is raising a $150M AI Fund
I took CS221 from Andrew in 2006 (or was it 2007?). Even more has changed since then ;-) It was my second ML course, after taking Daphne Koller's punishing CS229. Right then, though, I knew ML would sweep the world pretty soon.