bendyBus's comments

bendyBus | 11 years ago | on: Use ReactJS to Build Native Apps

"learn once, write everywhere" - I like the sound of that! But I guess in the spirit of React this is still only the "V" in MVC right? I don't suppose they are planning to broaden the scope to anything other than rendering views. So if you want to use this to build native apps, I guess you are going to have to integrate another cross-platform solution for storage/sensor APIs?

bendyBus | 11 years ago | on: Show HN: MonkeyLearn – Text Mining Made Easy

But really - and I'm sure this is your 'secret sauce' so you don't want to give away too much - how do you get hyperparameters which just work out of the box? Was this some kind of meta-regression on the hypers? Or did you do a Bayesian optimisation?
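For context, here's roughly what the Bayesian optimisation option looks like - a toy Python sketch with a hand-rolled GP surrogate and a UCB acquisition rule, where the `objective` function is a made-up stand-in for a cross-validation score over one hyperparameter (no claim this resembles what MonkeyLearn actually does):

```python
import numpy as np

def rbf_kernel(a, b, length=0.3):
    # squared-exponential kernel on 1-D hyperparameter values
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_obs, y_obs, x_cand, noise=1e-6):
    # standard GP regression: posterior mean and variance at candidate points
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_kernel(x_obs, x_cand)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ y_obs
    var = np.diag(rbf_kernel(x_cand, x_cand) - Ks.T @ Kinv @ Ks)
    return mu, np.maximum(var, 1e-12)

def objective(h):
    # pretend cross-validation score, peaked at h = 0.7
    return -(h - 0.7) ** 2

def bayes_opt(n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    x_obs = rng.uniform(0, 1, size=3)            # a few random initial evaluations
    y_obs = np.array([objective(h) for h in x_obs])
    cand = np.linspace(0, 1, 201)                # candidate hyperparameter grid
    for _ in range(n_iter):
        mu, var = gp_posterior(x_obs, y_obs, cand)
        ucb = mu + 2.0 * np.sqrt(var)            # explore where the surrogate is uncertain
        h_next = cand[np.argmax(ucb)]
        x_obs = np.append(x_obs, h_next)
        y_obs = np.append(y_obs, objective(h_next))
    return x_obs[np.argmax(y_obs)]               # best hyperparameter found

best = bayes_opt()
```

The point being: each "evaluation" would be a full model train + validation, so the surrogate lets you find good hypers in tens rather than thousands of runs.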

bendyBus | 11 years ago | on: Learning Quantum Mechanics: Machines vs. Humans [video]

Yes, I'm part of the GAP research group (with Albert Bartok Partay and Gabor Csanyi). Representing environments is definitely still an ongoing research project. Other things I've played with in the past include identifying crystal structures at finite temperature (i.e. a classification rather than regression task), or differentiating between amorphous phases (since e.g. water has two amorphous solid phases with a first-order transition in between, but there is no way in hell you would be able to tell one from the other visually).

We're currently working on a really ambitious new way to represent environments, but it's really preliminary at the moment.

Regarding your ion issue, what about the angular components? The radial functions really only tell you so much...

But more fundamentally, what do you mean by ion energy levels? I'm presuming you mean a metallic nucleus+core electrons, in a condensed phase at finite temperature. But of course that 'atomic energy' - insofar as it exists - is a continuous function of position and not quantised, so I'm unsure what you mean by energy levels in this context.

bendyBus | 11 years ago | on: Show HN: MonkeyLearn – Text Mining Made Easy

Have been following these guys for a while and had a lot of fun playing with their API when they were still in the alpha stage. Really impressive stuff being able to do NLP in a black-box fashion. I'm sure it took a lot of time getting the default machine learning params to work well in so many cases. Admittedly I haven't tried any competitors' products yet, so I would be keen to hear from those who have how it compares.

bendyBus | 11 years ago | on: Are you a right-brained programmer?

Sanity disclaimer: I'm sure right/left brained isn't the best or most current model of personality/cognition, but it's widely known and helps to frame this question, which I find interesting.

Is there a personality type which makes for better programmers?

Character of the archetypical left-brained person:

- fastidious
- strong logical/reasoning skills
- thinks in terms of structures

whereas the right-brained person:

- finds creative/un-obvious solutions to problems
- is good at thinking laterally
- thinks in analogies, better at spotting similarities than differences

Now there are many different ‘modes of thought’ a programmer encounters: coming up with the organising principles of a framework, chasing down a Heisenbug, finding a (slightly dirty) solution which saves having to re-write masses of code; these require very different cognitive skills.

Do different parts of a company's dev community tend to be populated by one personality type? Should even small teams contain a mix?

Is it possible to be a very right-brained, very productive programmer? And if so, is it at all clear to non-programmers that a career in software development is possible if you’re the “creative type”?

bendyBus | 11 years ago | on: Learning Quantum Mechanics: Machines vs. Humans [video]

I'm not sure if I would agree with categorising the field as a race. I absolutely agree that the number of people using Machine Learning within the atomistic simulation community is skyrocketing. But everyone is just exploring the range of possibilities and trying to see what the essential elements are for it to be successful. I think having more people working on it is a great thing!

Regarding LAMMPS, actually the GAP code is now also easily usable there with this plugin: https://github.com/libAtoms/QUIPforLAMMPS

The bispectrum is indeed a very powerful tool, but is not the ideal feature vector for representing the atomic environment. You should have a read of Bartok's more recent paper on this: http://journals.aps.org/prb/abstract/10.1103/PhysRevB.87.184... . One of the issues is that the bispectrum starts with an approximation of the neighbourhood atomic density as a sum of delta functions. Trying to represent such sharp features in a basis set expansion is actually very slowly converging. So the idea behind SOAP is to build a covariance kernel by directly comparing a smooth measure of the similarity of environments, which is also invariant to all physically relevant symmetry operations.
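To make the "smooth measure of similarity" idea concrete, here is a deliberately stripped-down Python sketch: each environment is a sum of Gaussians on the neighbour positions, and the kernel is the normalised overlap integral of the two densities (which has a closed form for Gaussians). Real SOAP additionally expands the density in spherical harmonics and integrates the overlap over all rotations to get rotational invariance; this toy version skips that step.

```python
import numpy as np

def density_overlap(env_a, env_b, sigma=0.5):
    # overlap integral of two atomic densities, each a sum of Gaussians of
    # width sigma on the neighbour positions; a pair of Gaussians at
    # distance d contributes exp(-d^2 / (4 sigma^2))
    d2 = np.sum((env_a[:, None, :] - env_b[None, :, :]) ** 2, axis=-1)
    return np.sum(np.exp(-d2 / (4.0 * sigma ** 2)))

def soap_like_kernel(env_a, env_b, sigma=0.5):
    # normalised similarity: exactly 1 when the two environments coincide,
    # decaying smoothly as atoms move apart (no delta-function sharpness
    # to struggle to converge in a basis set)
    k_ab = density_overlap(env_a, env_b, sigma)
    k_aa = density_overlap(env_a, env_a, sigma)
    k_bb = density_overlap(env_b, env_b, sigma)
    return k_ab / np.sqrt(k_aa * k_bb)
```

The normalisation is what makes it behave like a covariance kernel: identical environments score 1, and any perturbation of the neighbour positions lowers the score continuously rather than abruptly.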

I would also like to add that in addition to GAP and SNAP, there are people like Jörg Behler doing this with Neural Networks and Francesco Paesani/Greg Medders with different regression schemes. But in addition to making potential energy surfaces there are people like Paul Popelier 'learning' atomic charges for building force fields and people in Vijay Pande's group doing machine learning on MD trajectories, which is something that excites me a great deal and I would love to understand in more detail.
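The common thread in all these potential-fitting schemes is regressing energies against atomic configurations so you can skip the expensive quantum calculation at prediction time. As a toy illustration (not the actual GAP kernel or descriptors), here is a Gaussian-process fit of a 1-D Lennard-Jones curve, where `lj` stands in for an expensive QM energy evaluation:

```python
import numpy as np

def lj(r):
    # 1-D Lennard-Jones energy, standing in for an expensive QM calculation
    return 4.0 * (r ** -12 - r ** -6)

def gp_fit_predict(r_train, e_train, r_test, length=0.15, noise=1e-6):
    # plain Gaussian-process regression with a squared-exponential kernel
    def k(a, b):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)
    K = k(r_train, r_train) + noise * np.eye(len(r_train))
    return k(r_test, r_train) @ np.linalg.solve(K, e_train)

r_train = np.linspace(0.95, 2.0, 30)   # "expensive" calculations at 30 geometries
r_test = np.linspace(1.05, 1.9, 40)    # cheap GP predictions in between
e_pred = gp_fit_predict(r_train, lj(r_train), r_test)
```

The hard part in the real problem is not the regression itself but the input representation - which is exactly why descriptors like the bispectrum and SOAP matter.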

It's a very exciting time to be in this field!

bendyBus | 11 years ago | on: Learning Quantum Mechanics: Machines vs. Humans [video]

There is a lot of hype surrounding Deep Neural Networks which seem to be able to solve extremely challenging machine learning problems with almost no human tuning of parameters.

This is an informal talk at the big-O meetup in London discussing the application of machine learning to quantum mechanical simulations of atoms and molecules. This is a particularly demanding application of machine learning techniques. The requirements for regression accuracy are very high, and in addition a number of physical laws need to be obeyed by the learning algorithm. The point of the talk is to invite discussion about the relative merits of domain expertise versus general-purpose algorithms for high-performance machine learning.

bendyBus | 11 years ago | on: What is going to happen in 2015

8/ A new generation of apps redefining work habits is well overdue. The obvious driver I see is that people's expectations for software usability/design are set by consumer apps. Also millennials are entering the workforce, and they have never had to take a course to learn how to use a piece of software. Any tool which requires training is arcane in their eyes, and probably rightly so. Add zero downtime, access-and-share from any device, auto sync & backup, and beautiful UI to the list of requirements. Some of these are being addressed by Office 365 and the like, but unbundled web-based productivity software looks ripe to dethrone Microsoft as THE enterprise productivity software vendor.