top | item 24070904

meow1032|5 years ago

I don't disagree completely with this, but just want to point out that it's kind of a bad smell to have computational biologists who are - as someone in the article puts it - computationally illiterate. I have met lots of these types over the years, and usually their methods are kind of a gong show. If you can't properly sanitize your data inputs on your column headers, why should I trust that you've treated the rest of your data properly?

acidburnNSA|5 years ago

I have a strong feeling that, if people really put an effort into reading and replicating more papers, we would find that a lot of what's being published is simply meaningless.

In grad school I had a subletting roommate for a while who was writing code to match some experimental data with a model. He showed me his model. It was quite literally making random combinations of various trigonometric functions, absolute value, logarithms, polynomials, exponents, etc. into equations that were like a whole page long and just wiggling them around. He was convinced that he was on a path to a revolution in understanding the functional form of his (biological) data, and I believe his research PI was onboard.

I guess "overfitted" never made it into the curriculum.
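The failure mode is easy to reproduce. A minimal Python sketch (the data and the grab-bag of basis functions below are invented for illustration): with more free parameters than training points, a random pile of functions fits the training data almost perfectly and falls apart on held-out points.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 20 noisy samples of a simple linear trend.
x = np.linspace(0, 1, 20)
y = 2 * x + rng.normal(scale=0.3, size=x.size)

# A grab-bag of arbitrary basis functions: sines, kinks, polynomials.
def random_features(x):
    feats = [np.sin(3 * k * x) for k in range(1, 6)]
    feats += [np.abs(x - c) for c in np.linspace(0, 1, 5)]
    feats += [x ** p for p in range(1, 6)]
    return np.column_stack(feats)  # 15 features per sample

X = random_features(x)
train, test = slice(0, 10), slice(10, 20)

# Least-squares fit using only the first 10 points (15 unknowns, 10 equations).
coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)

train_err = np.mean((X[train] @ coef - y[train]) ** 2)
test_err = np.mean((X[test] @ coef - y[test]) ** 2)
# train_err is essentially zero -- the model memorized the noise --
# while test_err on the held-out points is far larger.
```

The "whole page long" equations in the anecdote are this, scaled up: enough knobs to wiggle any training set into agreement.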

zozbot234|5 years ago

> It was quite literally making random combinations of various trigonometric functions, absolute value, logarithms, polynomials, exponents, etc. into equations that were like a whole page long and just wiggling them around.

Technically, we call that a "neural network". Or "AI".

fock|5 years ago

I work in computational materials science (where ML brings funding), and a funny paper of this kind is here: https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.11... - they are literally trying out hundreds of thousands of possible combinations by brute force to build a "physical model".

Then they go to conferences and brag about it, because they have to (off the record, they know it's BS). Datasets are so-so (have a look at QM9...), and for more specialized things people generally don't bother trying to benchmark or compare their results on a common reference. It's just something new...

And with all that: even setting aside fancy statistical methods applied without really understanding them, your theoretical computations might not make much sense (at least not in the sheer numbers being pumped out and published)...

hannob|5 years ago

> I have a strong feeling that, if people really put an effort into reading and replicating more papers, we would find that a lot of what's being published is simply meaningless.

People figured that out long ago [1] (I know the author of that paper has lately turned somewhat controversial, but that doesn't change his findings). It's not widely known among the general public. But if you understand some basic issues like p-hacking and publication bias, and combine that with the knowledge that most scientific fields don't do anything about these issues, there can hardly be any doubt that a lot of research is rubbish.

[1] https://journals.plos.org/plosmedicine/article?id=10.1371/jo...
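For readers unfamiliar with the mechanism, the core of it fits in a few lines of Python (study count and sample sizes here are arbitrary): simulate studies in which no real effect exists, and roughly 5% still come out "significant". If only those get written up, the literature fills with phantom effects.

```python
import numpy as np

rng = np.random.default_rng(42)

n_studies, n = 1000, 30
false_positives = 0
for _ in range(n_studies):
    # Two groups drawn from the SAME distribution: the null is true.
    a = rng.normal(size=n)
    b = rng.normal(size=n)
    # Welch t statistic, computed by hand to stay dependency-free.
    t = (a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / n + b.var(ddof=1) / n)
    if abs(t) > 2.0:  # roughly the two-sided p < 0.05 cutoff here
        false_positives += 1

rate = false_positives / n_studies  # lands near 0.05
```

Publication bias is the filter on top: nobody sees the ~950 null results, only the ~50 "discoveries".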

gameswithgo|5 years ago

Isn't the saying that 90% of everything is crap? (Sturgeon's law)

throwaway_pdp09|5 years ago

Not my area at all but isn't this genetic algorithm type stuff?

Balgair|5 years ago

In grad school I had a friend who was doing olfactory (smell) research on rats with tetrode drives (wires in their brains). He was looking at the neuronal response to the smells they gave the rats and had a few signals to match up: the signal from the Arduino running the scent gates, the amps that reported the weak neuronal currents, the nose lasers that gated the Arduino, etc. He was having a hard time getting through all the data in his MATLAB code, and I offered to help for some beer.

After the 11th nested 'if' statement, I upped the request to a case of beer. I'm not certain he ever got the code working.

To the larger point, scientists are not programmers. They got into their programs to do research. What keeps them going is not the joy of programming but the thrill of discovery. Programming is nothing but a means to an end, one they will do the bare minimum to get working. Asking hyper-stressed-out grad students to also become expert coders isn't reasonable.

And yes, that means that the code is suspect at best. If you load the code on to another computer, make sure you can defenestrate that computer with ease, do not use your home device.

whatisthiseven|5 years ago

I keep seeing this sentiment when it comes to those in the natural sciences, but it makes no sense.

I could replace "programming" in your above little bit with "mathematics" and it would be just as weird.

Our modern world runs on computers and programs, just as modern science built itself on mathematics and required many to use it. So the new world of science may require everyone to know how to program, just as they know the chemical composition of smells, or the particulars of differential equations, etc.

And I know your argument isn't "they shouldn't learn programming", but honestly since I keep seeing this same line of reasoning, I can't help but feel that is ultimately the real reasoning being espoused.

Science is getting harder, and the requirements to competently "find the exciting things" raise the bar each time. I don't see this as a bad thing. On the contrary, it means we are getting to more and more interesting and in-depth discoveries that require more than one discipline and specialty, which ultimately means more cross-functional science with larger and deeper impacts.

meow1032|5 years ago

> To the larger point, scientists are not programmers. They got into their programs to do research.

I would say most research, to an ever-growing degree, is so heavily dependent on software that it's tough to make that claim anymore. It makes no sense to me. It's like saying Zillow doesn't need software engineers because they are in the real estate business, not the software business.

panda-giddiness|5 years ago

> To the larger point, scientists are not programmers.

I mean, sort of. Some research is essentially just programming; other research can get by with nothing but Excel. Regardless, it's unreasonable to ask most scientists to be expert programmers -- most aren't building libraries that need to be maintained for years. If they do code, they're usually just writing one-shot programs to solve a single problem, and nobody else is likely to look at that code anyway.

epistasis|5 years ago

It's not usually computational biologists who are using Excel.

What if you want to share data with a wetlab biologist who wants to explore their favorite list of genes on their own?

meow1032|5 years ago

There are lots of great computational biologists, but being a computational biologist doesn't necessitate being good with computers. Plenty of PIs rely pretty much exclusively on grad students and post-docs to run all their analyses.

Not that I'm saying using Excel is bad either. I use Excel plenty to look at data. But scientists need to know how to use the tools that they have.

baddox|5 years ago

If people are just looking at the spreadsheets, then wouldn't the cells interpreted as dates be harmless? It seems like it would only be a problem if you're doing computation on the cells.

Myrmornis|5 years ago

It's also my experience of research in the biological sciences that there's a widespread belief (or fact) that to get published in a top journal, the analysis methods must be "fancy", for example involving sophisticated statistical techniques. I worked on computational statistical methods, so I'm not against that per se, but the problem is that if you have the training to contribute at the research front of an area of biology, you rarely have the training to understand the statistics. Some would say that the collaborative publication model is the solution to that, but in many cases the end result isn't what one would hope for. I do think that less emphasis on "fancy" statistics, and more emphasis on simple data visualizations and common-sense analyses, would be a good thing.

FrojoS|5 years ago

Agreed. And to add to that: the fancier the statistics have to be, the less robust the results.

dekhn|5 years ago

I'm an ex-computational biologist who did most of his work in Python but periodically had to interop with Excel.

The basic assumption I have is that when I input data into a system, it will not translate things, especially according to ad-hoc rules from another domain, unless I explicitly ask it to do so.

It's not clear what data input sanitization would even mean in this case; date support like this is deeply embedded in Excel, and nobody reads Excel's documentation to learn how it works.
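For contrast, that assumption is exactly how Python's standard csv module behaves, as in this small sketch (gene names chosen as the classic Excel casualties): nothing is translated unless you ask for it.

```python
import csv
import io

# Gene symbols Excel silently rewrites as dates (SEPT2 -> 2-Sep, MARCH1 -> 1-Mar).
data = "gene,count\nSEPT2,12\nMARCH1,7\n"

# csv.DictReader returns every field as verbatim text...
rows = list(csv.DictReader(io.StringIO(data)))
genes = [r["gene"] for r in rows]

# ...and conversion happens only where you explicitly request it.
counts = [int(r["count"]) for r in rows]
```

The type decision sits with the person who knows the data, not with a heuristic built for another domain.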

gameswithgo|5 years ago

It would be nice if everyone were an expert at everything, but they can't be. It would be nice if they hired experts, but money doesn't grow on trees. We often insist on a degree of excellence we refuse to pay for.

meow1032|5 years ago

It's not about being an expert at everything or hiring more people. These aren't particularly hard problems; it's not difficult to find biologists who are incredibly adept at using Python, R, or C. It's about thinking about how science gets funded and how it gets implemented. I've written here before about the difference between "grant work" and "grunt work", and how too much computer touching tends to get looked down upon at a certain level.

If you're deciding who gets a large-scale computational biology grant, and you're choosing between a senior researcher with 5000 publications and a broad scope, and a more junior researcher with 500 publications and a more computationally focused scope, most committees choose the senior researcher. However, the senior researcher might not know anything about computers, or they may have been trained in the '70s or '80s, when the problems of computing were fundamentally different.

So you get someone leading a multi-million dollar project who fundamentally knows nothing about the methods of that project. They don't know how to scope things, how to get past roadblocks, who to hire, etc.

jnwatson|5 years ago

It doesn't take being an expert at Excel to understand how Excel autoformats. It takes a few days of actually working with data or an introductory class that's today taught in American primary schools.

sincerely|5 years ago

Sorry for asking but are you familiar with how MS Excel aggressively converts data to dates? There's no way to "sanitize" it (without resorting to hacky solutions like including extra characters) and even if you fix the data, it will re-change them to dates the next time you open the file.

jaywalk|5 years ago

You're simply incorrect. If you set the column format to Text it will never convert data to dates, including when you open the file.
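If the file is generated programmatically, that fix can be baked in at write time. A sketch using the third-party openpyxl library (assuming it is available; "@" is Excel's built-in Text format code):

```python
from openpyxl import Workbook

wb = Workbook()
ws = wb.active
ws.append(["gene", "count"])

for gene, count in [("SEPT2", 12), ("MARCH1", 7)]:
    row = ws.max_row + 1
    cell = ws.cell(row=row, column=1, value=gene)
    cell.number_format = "@"  # "@" = Text: Excel will not reinterpret the value
    ws.cell(row=row, column=2, value=count)

wb.save("genes.xlsx")
```

Opening the resulting file in Excel should leave SEPT2 and MARCH1 untouched, since those cells carry an explicit Text format rather than General.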

jhbadger|5 years ago

For the most part we aren't talking about computational biologists but experimentalists using Excel. People at the bench need to collect their data somehow, and using Excel for tabular data and Word for text data is just what they know. Typically they then pass these files over to computational biologists for analysis. Yes, it would be nice if they would use more appropriate tools, but I know from experience that the typical result of trying to teach them better tools is the experimentalists just rolling their eyes and saying that they don't have time to learn some nerdy program because they have experiments to run.

fortran77|5 years ago

Excel is a wonderful tool and a type of programming that is very accessible to many people. I use it all the time.

t34saves|5 years ago

Considering that Perl was chosen as computational biologists' lingua franca in the 1990s and 2000s because it was good at text manipulation (genes are represented as text), I would say they don't have a history of making good choices.
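The "genes are text" point is fair, though: the bread-and-butter operations really are string manipulation, as in this small Python example (pure illustration) of computing a DNA strand's reverse complement.

```python
# Map each base to its complement; the gene really is just a string here.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    """Complement every base, then reverse to get the opposite strand."""
    return seq.translate(COMPLEMENT)[::-1]

print(reverse_complement("ATGC"))  # GCAT
```

Any language with decent string handling covers this niche, which is why the choice of Perl specifically says more about the era than about the problem.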

s800|5 years ago

Why?