The one potential problem with going down this road is that even fewer people will have a broad understanding of what is going on in complicated experiments as labs are increasingly encouraged to specialize in a technique.
Without some incentive for such people to exist, it seems likely that "obvious" connections between different fields may be missed for extended periods of time as people focus on executing incremental experiments.
I'm more worried about the lack of vertical integration than about cross-field collaboration. The more you treat parts of your experiment as black boxes run by someone else, the more likely you are to misunderstand some detail of what is actually happening and draw the wrong conclusions. It also requires the abstraction boundaries to be very good, so that nothing gets changed at one level that affects another level's results without the two sides communicating. If it's the same people, or at least people who see each other daily in the same lab, they're much more likely to notice that a "minor" change in a detail is actually experimentally relevant, whether because of how the result will be used somewhere else, or because of assumptions about the process that were baked into an equation, or something along those lines.
Even in a field like CS this happens a lot; tons of benchmarks are screwed up, or at least misleadingly reported, because someone didn't fully understand the entire stack their part was running on, and what its implications were for their results.
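A toy illustration of how this bites (a hypothetical example of mine, not from the thread): timing code without accounting for warm-up and measurement noise in the layers underneath it.

```python
import timeit

def work(n=1000):
    # Some repeatable computation to benchmark.
    return sum(i * i for i in range(n))

# Naive single-shot timing: the result includes interpreter warm-up,
# cache effects, and scheduler noise from the stack below your code,
# so one run can be wildly unrepresentative.
single = timeit.timeit(work, number=1)

# Repeated timing amortizes those stack-level effects into a
# steadier per-call estimate.
per_call = timeit.timeit(work, number=10_000) / 10_000

# If the single-shot figure differs noticeably from the steady-state
# cost, reporting it as "the" runtime is exactly the kind of
# misleading benchmark described above.
print(f"single-shot: {single:.2e}s, steady-state: {per_call:.2e}s")
```

The point isn't the specific numbers, just that the layer you didn't write (here, the interpreter and OS) is part of your measurement whether you understand it or not.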
Actually, I think this has the potential to increase the connections between different fields, because it will incentivize collaboration through a genuine market mechanism rather than the bartering system that currently exists (i.e., I'll trade you running this experiment for authorship).
The way I envision it working is that the person with the grand idea obtains a grant and then breaks it up into microgrants, which are outsourced to specialists who conduct the experiments efficiently. The results are then combined to obtain a broad understanding of the particular question being asked by the grant holder, and the broad results can be communicated to all participants.
What this author considers a novel, future direction for science is something that I call "going to work in the morning." The US Department of Energy operates many user facilities distributed around the country, including nanotechnology centers, electron microscopy centers, neutron facilities, and X-ray facilities. (I run an instrument at one of the X-ray facilities.) Access to these facilities is free to the user and available through a peer-reviewed proposal system. In many cases, these facilities provide access to equipment and experimental techniques that don't exist at individual institutions. In some cases (notably at the nano-centers) capabilities are provided that might be available at a university or a company, but the nano-center provides access to a very broad suite of instrumentation as well as access to expert staff. While there might be a concern that the proposal system would cater to insiders (as one commenter discussing CERN suggests below), that is not my experience. I regularly have new folks coming to use my instrument.
My caveat here is that my field is in the physical sciences, while the author works in the life sciences. The user facility concept is a specialty of the DoE, and while DoE user facilities do serve many life science users, it grew out of a physical sciences funding agency. Still, the user facility concept works well and is a model that could address the sort of distributed yet collaborative science the author describes.
It does seem odd. What you're talking about is not even particular to big user facilities. If I don't have the (physics) facilities to do something, I just track down someone who does and ask them (which is not hard using google or looking at the relevant literature). The majority of the time they're happy to collaborate on the basis that they're on the paper. I think I'm missing some detail about this.
Some great quotes here:
"Outsourcing revolutionized the IT industry in the 1990s and 2000s and I believe outsourcing has the potential to revolutionize scientific research in the same way."
"Just think of how many more discoveries can be made when scientists are able to easily tap into the best resources. That’s what gets me excited!"
This is pretty radical stuff!
It makes sense when you look at other industries, from car manufacturing to IT. In the West, people moved to more and more specialized, cutting-edge work (Tesla, developing new databases) while the bread-and-butter work moved to places where it was more economical. It makes sense that the trend would continue in science as well.
But the thing with science is that a scientist has historically been considered not just someone competent enough to run a given experiment, but also a "seeker after truth". And this does mean something: it means a scientist is someone you can trust, to some degree, to run experiments honestly. Sure, that trust has been broken at regular intervals, and it gets verified by repetition. But that basic "integrity" might just be something that is hard to outsource in the fashion of IT outsourcing. Then again, losing it might also be a cost that various companies are happy to accept.
Hi Jess - Science Exchange allows scientists to outsource individual experiments to specialist labs and facilities. For example, gene expression profiling can be outsourced to specialist microarray providers. This is becoming a bigger and bigger issue as technology advances and scientists are forced to become more specialized.
Outsourcing is already reasonably well established in biology. There are a number of companies, notably from South Korea, that offer sequencing, probe library creation, and a host of other services that take the grunt work out of a lot of lab work. The services are cheap and the results are very reliable. The real benefit of all this is that it opens up the field to a lot more people who could not otherwise afford the infrastructure in equipment and lab staff.
From my limited view, not really. I've worked at multiple large insurance companies (I'm a consultant) over the past six years. For many, outsourcing has created a large set of communication disruptions that are hard to overcome. Also, many outsourcing companies have poor quality control mechanisms. I've been at multiple clients where an Indian firm "delivered" code that wouldn't compile.
Outsourced IT can be helpful for some processes, but it seems to be mostly grind work, like using humans to automate testing. Rather than investing in a real automated testing tool, scripts are created and handed to a team in the outsourcing destination. They sit there clicking on links and buttons until something fails. Then, if you're lucky, they send you an email detailing what buttons and data were modified before the failure. More often than not, you just get an email that says X didn't work for account Y. But better them than customers.
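The click-until-it-breaks workflow described above is the kind of thing a short script can replace. A minimal sketch (hypothetical, using only the standard library; a real team would use a proper testing tool):

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def check_links(urls):
    """Fetch each URL and collect (url, reason) pairs for failures,
    instead of having humans click links until something breaks."""
    failures = []
    for url in urls:
        try:
            with urlopen(url, timeout=10) as resp:
                # urlopen raises HTTPError for most >=400 responses;
                # this check is a belt-and-suspenders fallback.
                if resp.status >= 400:
                    failures.append((url, f"HTTP {resp.status}"))
        except (HTTPError, URLError, ValueError, OSError) as exc:
            failures.append((url, str(exc)))
    return failures

# Usage: check_links(["https://example.com/page1", ...]) returns an
# empty list when everything responds, or the failing URLs with the
# reason each one failed.
```

Unlike the emailed "X didn't work for account Y", the output says exactly which URL failed and why, and the script can run on every build.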
That hasn't been reported on as widely in the mainstream press as the wave of moving work abroad a decade ago. It will take time for the message to settle in among the broader public.
It did? I thought it was generally accepted that outsourcing "IT" didn't really work out well.
Notice that GE, GM, BT, Verizon, etc. don't have huge IT teams anymore? That's outsourcing, and it certainly worked.
For many basic IT services even offshoring works well enough. The jury is still out on offshored product development, though.