The attention economy corrupts everything it touches: not just science, but journalism, politics, and even childhood.
Being famous used to be rather difficult. Of course there were exceptions (writing To Kill a Mockingbird, being the guy who dove into the river to save a drowning child, for example), but for the most part, you were going to live your life known only to the few hundred or thousand people you met personally.
Even though you could pick up a phone and dial anyone in the world who owned a phone, you wouldn't, and if you did, they'd hang up on you. Now you can force your ideas, idiotic or great, onto the screens of millions of people you'll never meet.
Is that a good or a bad thing? It's certainly bad in some ways, and this is one of them.
The problem seems to be more fundamental. Attention is not inherently bad; the issue is how, and what kind of, attention is rewarded. Many platforms reward attention-seeking behaviour, good and especially bad, because it easily evokes primal emotions in the audience. And so there are incentives for content creators to keep peddling shitty content.
The thing that scares me the most is that, unlike reading a novel or pensively writing a letter in the 19th century, we now rarely focus on one thing for long periods. Look at HN, but especially at TikTok and YouTube "shorts": the dopamine pinball game going off in our brains and the constant change of focus are robbing us of a skill, and I fear the loss will have unforeseen consequences at the scale of our global society.
Pretty sure being famous is still pretty damn hard. For every virally famous TikTok influencer there are thousands of wannabes that nobody cares about.
> Now you can force your ideas, idiotic or great, onto the screens of millions of people you'll never meet.
I'm sorry, did a hand reach out of your phone and force you to install and view the app of the week?
> Now you can force your ideas, idiotic or great, onto the screens of millions of people you'll never meet.
No you can’t. Attention is scarce. That’s why it’s called “attention economy”. In fact, it was far easier to force idiotic ideas onto the screens of millions of people before the internet came along. That’s what TV commercials were.

Not so much as you'd think. Try it. You'd be surprised.
It is well-established that the human animal evolved to live in small groups. When people come together in large numbers, it is a special occurrence, limited in time and space. The idea of living "as if" one is always in this situation is unnatural and arguably unhealthy. It is not something we should be promoting or even allowing. We should be promoting small groups.
If I were asked to "regulate" this problem of so-called "altmetrics", and the "attention economy" in general, here is how I would do it.
Twitter used to be based on SMS, but since 2020 it has been just a gigantic website like Facebook. These two mega-sized websites are the primary sources of "altmetrics". If we take away the right to create these gigantic outsourced websites, what would happen?
I would place limits on websites comprised of so-called user-generated content. For example, if someone wants to run a website with millions of pages, they are free to do so (if they could actually produce enough content to justify so many pages). However, they are not free to have millions of different people author the pages. A website could not be a mass-scale middleman for people who wish to publish on the www. A mega-website doing no work to produce content, financed solely by selling people out to advertisers, could not, for example, supplant an organisation that employs journalists and editors to produce news.
By regulating the creation of these mega-websites, we could reduce the incentive for advertising. The mega-websites would lose their traffic and disappear. They would be replaced by normal-sized websites that cater to specific audiences.
Allowing a few websites to grow to enormous size while not having to do any work to produce content has been a mistake. Of course they can make billions in advertising revenue. It also allows any notion of egalitarianism in the www's design to be compromised in favour of a sharecropper model, so-called "platforms".
Without oversized websites no one would be able to publish their content to every website in existence. No website would be able to do zero work to create content and yet act as a middleman drawing massive traffic that can be monetised via advertising. That is what mega-websites like Twitter and Facebook do. They sit between people who create content and people who consume it, and sell both sides out to advertisers.
The cost of publishing data/information to the www will continue to fall. The technology to make it easy will continue to advance. We do need to be able to communicate in small groups, as we have always done. That is possible. We do not need to collectively use mega-websites with billions of pages run by a handful of third parties in order to do it. The follow-on effects of millions of people communicating via these third-party websites are obviously harmful.
Came here to say this. There is nothing meaningful in our lives that escapes the cultural destruction of the attention economy.
Humans ... us people things, etc ... are prone to our most immediate concerns. It used to be that predators were our concern; now it's social media. We react, we make noises, we continue. The unfortunate thing is that there are no lions to take out the weak anymore.
Having spent over 10 years in a university and been a professor, I'd say the problem isn't attention-seeking behavior but a lack of accountability. For example, you can literally make up any data you want in a grant proposal, and so long as it sounds right, no one can or will double-check it. The foundation of academia is rotting, but maybe it's always been like this.
It has not. And "a lack of accountability" is a band-aid on the real problem: bad gatekeeping. People getting into science, not for the search for truth, but in search of respectability, green card, money, or whatever else. Trying to whip them into real scientists through transparency and accountability is like trying to achieve security in your home by flinging the gates and doors wide open but slapping cameras and motion detectors everywhere. Either they win, or you get fatigued.
I think this also applies at all levels. I'd argue it is the big reason people feel very frustrated with reviewing, especially in hyped areas (e.g. ML). There are plenty of incentives to reject papers (low acceptance rates mean "higher quality" publications, you can advantage yourself by rejecting or being overly critical of your competitors, you get no clout for reviewing, and no one will get upset if you take all of 5 minutes to review), but very few incentives (I can't even name one) to accept papers. It is fairly easy to dismiss papers as not novel, because we all build off the shoulders of giants and things are substantially more obvious post hoc. Metareviewers and Area Chairs will 99/100 times stand with reviewers even if they are in the wrong and can be proven so (I had a reviewer give me a strong reject claiming I should compare to another work, which was actually our main comparator and which we compared to in 5+ tables and 5+ graphs). I can't see these issues being resolved until we all agree that there needs to be some incentive to write high-quality reviews. The worst of it is that the people it hurts most are the grad students and junior researchers (it delays graduation and career advancement). I'm not saying we have to accept papers, but I am saying we need to ensure that we are providing high-quality reviews. Rejections suck, but rejections that don't provide valuable feedback are worse.

If the publication system is a noisy signal, then we need to fix it AND recognize it as such. There have been plenty of studies showing that this process is highly noisy (see the sketch below), but we're all acting like publications are all that matters.

This is all before we even talk about advantages linked to connections even in double-blind reviews, collusion rings, or citation hacking. I feel we can't even get the first step right.
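To make "noisy signal" concrete, here is a toy Monte Carlo sketch in the spirit of the NIPS 2014 consistency experiment, where the same pool of papers went to two independent committees. Everything here is an assumption made up for illustration: the submission count, the 20% acceptance rate, and reviewer noise as large as the spread in true paper quality.

    import random

    random.seed(0)
    N = 1000            # hypothetical number of submissions (assumption)
    ACCEPT_RATE = 0.2   # hypothetical acceptance rate (assumption)
    NOISE = 1.0         # reviewer noise comparable to quality spread (assumption)

    # latent "true quality" of each paper
    quality = [random.gauss(0, 1) for _ in range(N)]

    def committee_decisions(quality):
        """Score each paper with independent reviewer noise, accept the top fraction."""
        scores = [q + random.gauss(0, NOISE) for q in quality]
        threshold = sorted(scores, reverse=True)[int(len(scores) * ACCEPT_RATE) - 1]
        return [s >= threshold for s in scores]

    # two independent committees review the exact same pool
    a = committee_decisions(quality)
    b = committee_decisions(quality)

    overlap = sum(x and y for x, y in zip(a, b))
    print(f"A accepted {sum(a)}, B accepted {sum(b)}, both accepted only {overlap}")

Under these assumptions the two committees agree on far fewer papers than either accepted on its own; selecting near a sharp cutoff amplifies reviewer noise into near-coin-flip decisions for borderline papers.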
The biggest newspapers are writing stories with no proof or facts, and no one cares anymore.
No one cares. When you can watch 5 movies in a day and feel all sorts of emotions people used to feel only a few times a year, watch porn, play video games, and read about anything you want, you have little time or neurochemistry left to be mad at scientists, journalists, or politicians lying to you.
I have read a lot about the history of science in Britain. Even before the creation of the Royal Society, people doing science generally all knew each other and communicated extensively, so I think there was a high degree of accountability. Even in the 19th century, scientific sub-communities were small. I suspect that the problem of accountability arose in the 20th century with the expansion of institutional science.

The older I get, the more I believe this is the truth. For most institutions we've been taught to hold in high regard.
As far as I can tell, there's way more accountability in science now, just like there was more accountability in a Soviet factory than an American one in the Cold War. It just doesn't always achieve its intended results.
What most people fail to realize is that, fundamentally, most principal investigators, the people who actually run the research world, are primarily fundraisers. Their day-to-day job is a mix of grant and proposal writing, relationship-building and organizational meetings, and checking in on their postdocs, candidates, and lab techs.
Your main goal, as a PI, is to keep your lab running, and thus research flowing, by any means possible. For some PIs, this means milking every drop of available talent and time out of their doctoral candidates, which is the most common cause of the horror stories you hear about people leaving academia. At the other end of the extreme, they can embed themselves so deeply in fundraising with private or public capital that their lab staff don't see them for more than 15-30 minutes a week, because they are essentially living their lives hopping from sales meeting to sales meeting.
This wouldn't be a problem if the job of the PI were explicitly meant to be that of a salesman, but the actual role of a PI is to define the research being done. They draft the hypotheses, the expected impact, etc., because that is their intended role, but in reality these will always be constructed in a way that makes it easier for the PI to solicit funding.
It's impossible for the attention economy not to play into the research funding loop, then, because every set of eyeballs is another potential revenue source for future research, or a tool to justify growing the footprint of your lab. I wouldn't go so far as to call the superstructure corrupting of science, though, not in those words. I'd say it forces science to be mission-focused, where the mission is a subtle negotiation between the people funding the research and the people performing it, and often the party with the capital lands much closer to their ideal.
Just speaking from personal experience, what you describe gets murky really fast in a couple of ways.
The model you're implicitly describing is an idealistic "executive-mentor" model of the PI: the PI has the ideas, and the postdocs or doctoral students just implement them. Basically, the scientist wants to produce, so they need help and outsource the work to others.
In my experience, though, this is often not at all what happens. Ideas come from those doctoral students or postdocs or whoever; the PI takes them and then gets credit for those ideas. I've seen PIs who don't fundamentally understand the research areas, who are just "black holes" for the credit due to the ideas and work of others around them, and who, because they're more senior, end up getting that credit. Sometimes this process seems intentional, in that the PI cultivates a false impression of what's going on, and sometimes it just happens because of the nature of the attention economy.
So although the "executive-mentor" model is a good one, what's closer to reality in many cases (although not all) is more of a "public liaison-mascot" system, or some kind of hybrid.
Because of this mismatch between reality and the assumed schema, the attention economy then incentivizes abuse and corruption.
This isn't even getting into how chasing grants as a fundamental scientific endeavor distorts what is researched. Even if you have a pure executive-mentor PI who is just trying to get their own independent ideas researched with funding, you then have to ask: what is rewarded? Good, rigorous science, or what is popular?
The problem I think is that what garners attention is not what is rigorous or innovative. Sometimes those things overlap, and maybe they're correlated, but they're not the same.
Maybe this isn't unique to science, but that doesn't make it okay, and it seems like changing it to prevent these problems is necessary.
Attention is not the problem; it's the lack of accountability. Social platforms care about engagement, not quality of content (there's virtually no mechanism to incentivize content to meet any standard of quality other than what can be measured in the moment).
Why go off on such a long tangent, when you could make a solid case about the legacy grant distribution system[1] corrupting science for decades? It is as close to funding and career success as it gets.

[1] https://newscience.org/nih/
Sometimes I am left to wonder about the widespread criticism of science. Slow progress, broken publishing and career progression, terrible working conditions, prestige farming, poor mental health and exploitation of students, fake data, statistical war crimes, bullying, sexism, racism, elitism, harassment...
The question is, does any of this matter on a historical scale? Is Science doomed to fail? In 200 years, our descendants will probably look back at us with the same mix of condescension and slightly horrified fascination with which we view our 19th century counterparts. Our stupid scientific publishing system will be viewed similarly to the plumbing in London in the 1850s. The chimney sweep and the graduate student suffer similar plights. It is terrible, immoral, we should do better, but then this is always the case.
On one level, our future descendants should be grateful: we eradicated smallpox (that alone would be enough, really; an unprecedented gift to all future humanity, a boundless alleviation of suffering), discovered antibiotics, greatly reduced child mortality, invented quantum mechanics and relativity, drastically increased our computational capacity, left the planet for the first time, and connected almost everyone in the world. All of this under conditions considerably less ideal than they are today, which are themselves still far from desirable. Maybe that's all that matters?
Science being Science, it will most likely sort itself out. We should mainly try to reduce the human suffering involved, and allow greater diversity in how and where research is done.
My main concern is that perhaps there is an "attention threshold" on fundamental breakthrough results in physics and mathematics, and that we're diminishing the chances of ever getting such results again because theorists don't have enough attention to spare. If you spend all day writing grant proposals and trivial papers (just for the sake of publishing), then go home and veg out on Netflix and Twitter, when are you supposed to have that quality "sitting and thinking" time?
" Scientists list media exposure counts on résumés, and many PhD theses now include the number of times a candidate’s work has appeared in the popular science press."
This is a mandatory requirement for an EB-1A green card. Maybe the government can do something from its side to reduce the fluff.
"The attention a scientist’s work gains from the public now plays into its perceived value. Scientists list media exposure counts on résumés, and many PhD theses now include the number of times a candidate’s work has appeared in the popular science press. Science has succumbed to the attention economy."
Having sat on a tenure and promotion committee at an R1 university, I can say this type of stuff is just as likely to torpedo you as it is to boost you.
I agree. I’ve almost never seen this, and I read a lot of scientific vitae. (The one time I did see it, it was because a major documentary by a major scientific org had been made from the results.) I think they mostly made this observation up. (There’s no citation.) That calls the whole thing into question, and the thesis sounded pretty simplistic to begin with.
The question is when. Surely, with the passage of time, it's more likely scams will be revealed. But if the professor has already retired, the damage and lost effort are irreversible.
> "Science and scientists are part of society. Neither sit on a lofty perch that makes them impervious to societal shifts."
Ironically, this is not the general public's perception. Even more ironic is the fact that celebrity "scientists" like Bill Nye and Neil deGrasse Tyson are fond of pushing the "science is better than the rest of us" narrative.
I disagree with the premise of the article; interconnectedness is a great benefit to science. Yes, there are inequities, but it also means more collaboration and more scrutiny. Science is deeply corrupt, but for other reasons, such as the amount of money in certain areas of interest, pressure from governments, and an ivory-tower mentality.
It's nice that this has finally come to the public's awareness. For a few years, I was afraid that the attention economy would be an 'invisible hand' which would have a significant impact on everything but which nobody would notice.
People used to be uncomfortable discussing 'attention' or 'the media', I suspect because it was too abstract or not relevant enough to them. Now that many people are struggling to get any attention for their work, attention and the media seem more relevant.
When someone publishes the best work of their career and it receives less recognition than their early work, it sometimes makes them wonder what has changed.
Well, yes, but what else can we do? Certainly we shouldn't hand out Nobel Prizes based on the number of likes on TikTok, but at some point the most influential science is the science that influences the most people. Sure, it's possible that someone has written a great paper that will be super influential in three or four hundred years, but we have no way to measure or accurately predict that. So we're stuck with citation counts and votes for best paper at conferences. It's all we've got.
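And "citation counts" often means something as blunt as the h-index. A minimal sketch of how it's computed; the two careers below are invented purely for illustration:

    def h_index(citations):
        """Largest h such that at least h papers have h or more citations each."""
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # two very different hypothetical careers, same score
    print(h_index([100, 50, 6, 5, 5, 1]))  # one landmark paper plus a tail -> 5
    print(h_index([5, 5, 5, 5, 5, 0]))     # uniformly modest output -> 5

A landmark-heavy career and a uniformly modest one collapse to the same number, which is exactly the lossiness being lamented here.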
> Twenty-five years ago, it was projected that, in an ever-more interconnected world, money would no longer be the prime currency, attention would be.
While it does make for a good opening line, leaning on a single prediction from 25 years ago that arguably worked out to some extent (probably not everyone prefers likes over cash), without giving any context on how many similar predictions failed, just feels a bit odd.
> “get your science the attention it deserves.” (On Google, that search term garners nearly 500 million hits.)
So 500 million pages have one or more words similar to those. That's pretty flimsy evidence. Putting quotes around it I get 5 results (not 5 million, 5), one of which is this article. I don't doubt the overall conclusions of the article, but I do wonder how well supported some of it is.
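That gap is essentially the difference between loose word matching and exact-phrase matching. A toy sketch of the two behaviors; the "documents" are invented for illustration:

    docs = [
        "get your science the attention it deserves",
        "science news deserves your attention",
        "attention is all you need",
    ]
    query = "get your science the attention it deserves"

    # unquoted-style match: sharing any word with the query counts as a hit
    loose_hits = [d for d in docs if set(query.split()) & set(d.split())]
    # quoted-style match: the exact phrase must appear verbatim
    exact_hits = [d for d in docs if query in d]

    print(len(loose_hits), "loose hits,", len(exact_hits), "exact hit")

Real search engines are vastly more sophisticated than this, and their hit counts are rough estimates besides, so a 500-million figure for an unquoted query says almost nothing about how common the exact phrase is.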
Having multiple family members working actively as scientists and academics, I've been pretty blackpilled about what "science" actually is for the most part.
Of course there are heaps of interesting papers and progress out there, but at least 90% of money and time seems to be spent on politics, careerism, and working actively for some disproportionately funded but "profitable" niche.
It's about getting ahead in the game, "earning money for investors", or furthering some industry-astroturfed cause.
Also, a lot of PhDs use their degrees to grift like cheap salesmen these days, unfortunately.
Probably has something to do with the corporate incentive structures that have emerged.
I'm curious whether you think there is more of this in science than in other places. Or is it that we want to think of science and academia as better and more idealistic than other fields?
This comment is similar to comments I've heard about nonprofits, government work, and quite a few large businesses. For a long time I thought nonprofits were generally good; in college I learned more about what nonprofit actually means, and how that knee-jerk reaction of mine could provide cover for a huge range of behaviors.
"money and time seems to be spent on politics, careerism and working actively for some disproportionally funded but "profitable" niche"
Science and technology provide the most important examples of surrogate activities. Some scientists claim that they are motivated by “curiosity” or by a desire to “benefit humanity.” But it is easy to see that neither of these can be the principal motive of most scientists. As for “curiosity,” that notion is simply absurd. Most scientists work on highly specialized problems that are not the object of any normal curiosity. For example, is an astronomer, a mathematician or an entomologist curious about the properties of isopropyltrimethylmethane? Of course not. Only a chemist is curious about such a thing, and he is curious about it only because chemistry is his surrogate activity. Is the chemist curious about the appropriate classification of a new species of beetle? No. That question is of interest only to the entomologist, and he is interested in it only because entomology is his surrogate activity. If the chemist and the entomologist had to exert themselves seriously to obtain the physical necessities, and if that effort exercised their abilities in an interesting way but in some nonscientific pursuit, then they wouldn’t give a damn about isopropyltrimethylmethane or the classification of beetles. Suppose that lack of funds for postgraduate education had led the chemist to become an insurance broker instead of a chemist. In that case he would have been very interested in insurance matters but would have cared nothing about isopropyltrimethylmethane. In any case it is not normal to put into the satisfaction of mere curiosity the amount of time and effort that scientists put into their work. The “curiosity” explanation for the scientists’ motive just doesn’t stand up.
The “benefit of humanity” explanation doesn’t work any better. Some scientific work has no conceivable relation to the welfare of the human race—most of archaeology or comparative linguistics for example. Some other areas of science present obviously dangerous possibilities. Yet scientists in these areas are just as enthusiastic about their work as those who develop vaccines or study air pollution. Consider the case of Dr. Edward Teller, who had an obvious emotional involvement in promoting nuclear power plants. Did this involvement stem from a desire to benefit humanity? If so, then why didn’t Dr. Teller get emotional about other “humanitarian” causes? If he was such a humanitarian then why did he help to develop the H-bomb? As with many other scientific achievements, it is very much open to question whether nuclear power plants actually do benefit humanity. Does the cheap electricity outweigh the accumulating waste and the risk of accidents? Dr. Teller saw only one side of the question. Clearly his emotional involvement with nuclear power arose not from a desire to “benefit humanity” but from a personal fulfillment he got from his work and from seeing it put to practical use.
The same is true of scientists generally. With possible rare exceptions, their motive is neither curiosity nor a desire to benefit humanity but the need to go through the power process: to have a goal (a scientific problem to solve), to make an effort (research) and to attain the goal (solution of the problem.) Science is a surrogate activity because scientists work mainly for the fulfillment they get out of the work itself.
I agree with some of the problems listed (over-hyping minor results), though personally I think the link to the attention economy feels a bit contrived. There are much greater forces driving these problems, notably the emphasis on metrics for scientific work (as mentioned) and politicization. This didn't convince me that the attention-economy lens adds anything.
While this article feels true, I wish it provided more of its own researched evidence to prove the point. I went through several links and possibly the most relevant ones are "Why are medical journals full of fashionable nonsense?"[1] and the book "Science Fictions"[2].
Most of the other links are about tangentially related topics. The subheading "How the attention economy corrupts science", which should contain the meat, actually has little to no research of its own that can convince me of the title (unless I'm willing to read the book "Science Fictions"). I read the article "Why are medical journals full of fashionable nonsense?" and found it to have a similar vibe, although it had more concrete evidence. Still, I feel it's fair to ask for more than a few examples. Basically, I don't find the evidence to fully support this original article's claim that the "attention economy" is corrupting science.
Overall, I think this is quite an ironic state. The article holds on to an idea that is socially popular, something we all suspect, and presents it as true with evidence that is either vague or indirect (look, a book exists on the topic, therefore it is true). It also fails to clearly distinguish between "working hard to get attention for one's science" and "the pursuit of attention corrupting science". I'm overall unconvinced that this article does much to support its premise.

[1] - https://bigthink.com/health/medical-journals-fashionable-non...

[2] - https://us.macmillan.com/books/9781250222688/sciencefictions