"At baseline, time watching and socializing were negatively correlated with intelligence, while gaming did not correlate. After two years, gaming positively impacted intelligence, but socializing had no effect. This is consistent with cognitive benefits documented in experimental studies on video gaming."
The fact that they "controlled" for genetics using polygenic scores already is a strong sign of low quality research. Polygenic scores are powerful, but they contain very large amounts of noise compared to the true genetic effects. Controlling for genetics using them is like controlling for income by asking whether the respondents own a Porsche.
Also, be aware that Scientific Reports is, if not quite a predatory journal, a very low bar. They publish tens of thousands of articles every year, while charging vast fees.
In general, these guys have correlations, not causation. Children's IQ - and gaming habits etc. - develop as they age, so controlling for baseline IQ is not enough to make a correlation with later IQ and gaming causal. It seems much more likely that smarter kids game more, e.g. because they live in richer households. (No, controlling for SES isn't enough to rule this out, for much the same reasons of measurement error as for the genetics.)
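To see why controlling for a noisy SES measure fails, here's a minimal toy simulation (all effect sizes invented for illustration): household income drives both gaming time and IQ, gaming has zero true effect, and the measured SES is income plus error.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

income = rng.normal(size=n)                      # unobserved confounder
gaming = 0.6 * income + rng.normal(size=n)       # richer kids game more (assumed)
iq = 0.6 * income + rng.normal(size=n)           # income drives IQ; gaming has ZERO true effect
ses = income + rng.normal(scale=1.5, size=n)     # noisy SES measurement

# OLS of IQ on gaming, "controlling" for the noisy SES proxy.
X_noisy = np.column_stack([np.ones(n), gaming, ses])
b_noisy = np.linalg.lstsq(X_noisy, iq, rcond=None)[0]

# Same regression with the true confounder, for comparison.
X_true = np.column_stack([np.ones(n), gaming, income])
b_true = np.linalg.lstsq(X_true, iq, rcond=None)[0]

print(b_noisy[1], b_true[1])  # ~0.2 with the noisy control, ~0 with the true confounder
```

Even though gaming has no effect by construction, the regression with the noisy control leaves a clearly positive "effect" of gaming on IQ; measurement error in the control means the confound is only partially removed.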
If you wanna believe that your hours on COD have made you a genius, go ahead, I won't stop you. Just don't imagine that this research proves it.
If the noise in polygenic scores is random, all that will do is reduce correlations. Random measurement error always reduces the ability to observe relationships. To be clear, I do not know enough about polygenic scores to judge one way or another how the noise "works" vis-a-vis the entire analysis.
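The attenuation effect is easy to demonstrate; here is a minimal sketch with arbitrary numbers, where a measurement reliability of 0.2 shrinks the observed correlation by a factor of about sqrt(0.2):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

trait = rng.normal(size=n)
outcome = 0.5 * trait + rng.normal(size=n)           # true relationship
noisy_trait = trait + rng.normal(scale=2.0, size=n)  # heavy measurement error

r_true = np.corrcoef(trait, outcome)[0, 1]
r_noisy = np.corrcoef(noisy_trait, outcome)[0, 1]

# Classical attenuation: r_noisy ~= r_true * sqrt(reliability), with
# reliability = var(trait) / var(noisy_trait) = 1 / (1 + 4) = 0.2.
print(r_true, r_noisy)  # ~0.45 vs ~0.20
```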
The Porsche comment is snide, but actually exposes a similar error in your critique. Sure, a tax return-derived measure of income would be superior to measuring if someone owned a luxury car. But, if you found yourself in a situation where all you had to go on for measuring economic wellbeing was (luxury) car ownership, your analysis is likely to improve by including it rather than excluding it, unless the measure itself had serious other issues with its accuracy.
Likewise, for SES, it is an imperfect measure, but it is the best we have for measuring social position in a concise way.
Having worked in research and universities for a while, the type of critique presented in this post is one you often see of new graduate students. They are able to tear down problems with research very well, but tend to overlook whether the study itself was still informative, or whether the opposite finding is likely to be true.
For example, suppose we wanted to know if video games or watching videos on the internet are making you dumber. A study like this may not convince you it's making you smarter, but it presents decent evidence they're not making you dumber. You can point out how the measures aren't perfect, but that is far from saying the opposite is true or the observed trends are completely spurious.
> If you wanna believe that your hours on COD have made you a genius, go ahead, I won't stop you. Just don't imagine that this research proves it.
Video games, like all things, should not all be treated equal. I could certainly see problem solving skills developing from world building or highly complex games (Civ, PoE, etc.). In fact, most (but not all) highly successful games have depth, which requires time investment and problem solving. The difference in games can be as varied as comparing a marketing pamphlet to Asimov's novels.
I don't dispute your take on the quality of the research though. I would even go further and speculate it would be really really hard to come up with meaningful tests due to game variance. So most anything on the subject is likely fluff.
If you are going to critique the methodology, please provide a reference showing this is not a robust method. You may be right, but how can I tell without some references? A Google Scholar search for "controlling for genetics using polygenic scores" brings up many recent papers about this methodology, and the arguments made in those seem stronger to me than this one comment. IMO, on the internet, where people can easily misinterpret the science, it's important to be as clear as possible, especially when we're tearing things down.
As far as Scientific Reports goes, it's a fine journal; it's run by Nature. It's not on the same planet as the predatory journals that spam inboxes. I worry that people will read your comment, assume you speak from authority, and discount any work they see coming from that journal, when we both know that good science can be found in Scientific Reports, and that impact factor is more strongly correlated with "sexy" or expensive science than with good science anyhow.
Your response seemed reasonable, and I was nodding along until the last paragraph. I get the impression you had your mind made up about video games and intelligence long before this study was published.
Pattern recognition, spatial rotation, quick responses to stimuli, repetitive tasks, short-term memory: several IQ components test exactly these.
Kids might have higher IQ because of video games, but be poorer at everything else that matters.
”Scientific Reports is an open access journal. To publish in Scientific Reports, all authors are required to pay an article-processing charge (APC) of $1,495.”
Note though that their data comes from the ABCD study, so at least the journal's reputability is not relevant to quality of data collection and most of the experimental design. Edit: of course, this is not a rebuttal of your specific criticisms.
I don't really agree with your qualification of Scientific Reports as "not quite predatory" or "a very low bar". Though it doesn't come close to comparing to Nature or Science, it's a perfectly respectable journal and it publishes quality research.
The worst things you can say about Scientific Reports are that it doesn't have the same novelty/impact requirements as many journals* and that people like to over-emphasize the "Nature" part of the name in a pointless grab for journal prestige. Journals in general can be pretty shady on a case-by-case basis. See "Nano Chopsticks" published in Nano Letters for a good materials science example of straight garbage (laughably obvious image manipulation) getting published in a fancy journal.
*Not really a problem if you correctly believe science should be more than publishing sexy results.
>If you wanna believe that your hours on COD have made you a genius, go ahead, I won't stop you. Just don't imagine that this research proves it.
You know there are more video games styles than FPS right? Strategy games teach patience and discipline, EVE online teaches economics, even the much dismissed 'mindless' fps teach teamwork.
I think it's likely that at least some games do increase intelligence relative to other activities (e.g. mindlessly watching TV), but less so than others like reading.
Anecdata but one thing I've noticed over time between those who grew up on video games and those who didn't is that those who were gamers deal better with change, especially WRT learning new software in the workplace, starting new projects, etc...
Obviously though the benefits aren't there if people just mindlessly play the same game all day.
Is there a reason to consider the use of a noisy signal as a strong sign of a low-quality study, without addressing whether the noise is correlated with the outcome?
Do we think this study would be improved if it did not control for genetics at all?
If this study has you scratching your head about the impact of digital media, do have a read of "Four arguments for the elimination of television" by Jerry Mander.
The arguments included in it are no different than many of these new studies coming out. If you grew up to be a latchkey kid and had your fair share of TV/video games/etc, it might give you perspective into things you never thought about.
One of the things he mentions is that TV is passive, and puts you in an alpha state where your brain stops trying to respond because there’s no point in responding.
My partner and I do watch a few hours of TV every night now. But we don’t do this “alpha” thing, at least not exclusively. We pause frequently to comment on or joke about what we’re seeing. To the point that I think sometimes a 30 minute show will take us an hour to get through.
The way it works is one of us holds the remote and pauses whenever they want to, and if the other wants to pause they just say "pause!"
I wonder how that changes Mander's analysis. For us it makes TV a pretty fun interactive experience.
And this way of watching was largely impossible when Mander did his work, because you simply couldn’t pause TV. Although you could pause a VCR or DVD.
I’m curious how widespread pausing is. I certainly feel that even solo TV watching is a more interactive experience than TV watching was when I was a kid. Alone, I’ll pause to Wikipedia things or to go find related media.
"A latchkey kid, or latchkey child, is a child who returns to an empty home after school or a child who is often left at home with no supervision because their parents are away at work."
The only program I let my kids use is KidPix on the iPad. It's a (rather remarkable) drawing and animation program and I'm constantly amazed at what they make with it. And sometimes horrified. But the 5 yo enjoys creating scenes with sound effects (that she recorded) and animation (that she also recorded), while the 3 yo enjoys making intricate drawings and then methodically coloring everything green. It's his favorite color, and he enjoys this. Not affiliated at all, but KidPix is a great piece of software, and I wouldn't have known about it except it was on a public library computer.
My point is there is (at least) another important category of program that the researchers missed: creator software. I've also made simple songs with them with GarageBand, but the UI is still rather difficult for them to use on their own. I was inspired to take this approach because my first introduction to computers was Logo on an Apple IIe, and Seymour Papert's beautiful work left a lasting impression.
Tux Paint is available for iOS. There's another one that my kids loved, which was a puppet show theater app, which easily let you animate characters and add voices. Unfortunately, I don't recall the name of it.
It's amazing the researchers forgot to include categories like apps. The apps my 3 year old son uses are Pokpok, YouTube, and an iOS app I develop specifically for him since he's on the non-verbal ASD spectrum.
When I was a kid I spent too much time on computers but it was mostly reading Wikipedia and googling programming questions, that kind of stuff.
Where I grew up there wasn't anyone around I could ask those kinds of questions. I know that's not the Netflix/iPad world that the study is talking about or that necessarily exists today. But I suspect that bifurcation still exists.
My entry was scripting in Counter-Strike (Quake CFG scripts). This can't happen on consoles, and apps on pads/smartphones entirely prevent it.
Some people try to bring these modification-friendly things back, with the BBC micro:bit, RasPi and so on. But in the end you need to be motivated - and playing better was a huge motivation for me!
My early days were DOS mode-x experiments mainly in Turbo Pascal. I had a manual I printed out, and some kind of BBS usenet-like forum thing.
The docs were so sparse and the communities so small that it really was a much different experience than today. I have fond memories of it, but that might just be me looking back with rose colored glasses.
I would have killed for Stack Overflow though! But there is a sense of self-directed mastery that you don't get when you're so much more aware of how vast the bodies of knowledge are.
The closest I get to that these days is trying to hack code on a plane :)
"Digital media" is a gigantic umbrella and there are so many variations and confounding factors that I doubt anything useful could come from a study like this. As with anything that has the potential for addiction or maladaptive behavior, the difference between healthy and unhealthy consumption comes down to moderation and mindful consumption.
Interesting points mentioned here about genetic influence and the heritability of intelligence.
"We believe that studies with genetic data could clarify causal claims and correct for the typically unaccounted role of genetic predispositions. Here, we estimated the impact of different types of screen time (watching, socializing, or gaming) on children’s intelligence while controlling for the confounding effects of genetic differences in cognition and socioeconomic status."
"The contradictions among studies on screen time and cognition are likely due to limitations of cross-sectional designs, relatively small sample sizes, and, most critically, failures to control for genetic predispositions and socio-economic context. Although studies account for some confounding effects, very few have accounted for socioeconomic status and none have accounted for genetic effects. This matters because intelligence, educational attainment, and other cognitive abilities are all highly heritable."
I’ve tried to encourage my wife to choose video games over Disney+ when she needs to drop the kids in front of something, but she still has a strong resistance to the idea that games are better.
To me, it’s pretty obvious. The kids problem solve when gaming, and are obviously engaged. When watching TV, they look like zombies.
I think my wife’s biggest hang up with games is that she was always told that they rot your brain. Also, our kids talk about games, but never about TV which she interprets as games being more addictive. I interpret it as games being more interesting and engaging.
I recall that when I was a kid, games were extremely addictive for me and I couldn't just quit playing unless I absolutely had to. I had an emotionally very similar experience when quitting smoking: the feeling that if I can just have this one more smoke I will be satisfied and live happily ever after is very similar to the desire to have one more round.
So I wouldn't call games "not addictive". If anything, watching something on TV is often less addictive because you are told a story with its introduction, climax and ending (up until Netflix ruined everything with its endless shovelware).
IMHO, the key is moderation. A day with diverse activities is a day well spent, kid or adult.
Today, if I play Sid Meier's Civilization, a day or two would be completely gone, I would be disconnected from reality, and I would need to re-adapt to the real world. I suspect excessive gaming's primary risk is developing an unhealthy understanding of the world in the areas where the game simulates the real thing.
Nintendo games on a console that don’t need an internet connection or include advertising. These have been the best combination for our kids where they have fun but aren’t addicted and can easily put them down.
The moment games include advertising they optimize for all the wrong things. I won’t let my kids get free games on iPad, etc for this reason alone.
Games are designed to be addictive, so restraint is a good option. Television is too, though, so limiting both is a good policy. I would guess that games' interactivity and personalization lead to many more opportunities for deeper addiction, though (see e.g. loot boxes).
> To me, it’s pretty obvious. The kids problem solve when gaming, and are obviously engaged. When watching TV, they look like zombies.
Are you not just seeing what you want to see? Maybe from your wife's perspective, the kids are carefully observing and learning a wide range of human emotions, social dynamics, new idioms and music from Disney+, whereas in their games they're learning a few tricks that they repeat ad nauseam to get some trivial rewards from their digital Skinner boxes?
> I think my wife’s biggest hang up with games is that she was always told that they rot your brain
People used to say that about TV too.
I always feel a bit personally attacked when people claim videogames are just bad for you full stop. I'm really passionate about games, I grew up playing NES, and just never stopped. I almost always have a new game waiting to play for when I'm finished with what I'm currently playing.
Sample size of one, but I don't think my brain is rotten. I have a pretty successful career in software, I have a close partner, I have a social life. I have other hobbies too, but it's my main one.
Don't get me wrong, I know my gaming takes time away from other stuff I could/should be doing, like building side projects or getting enough exercise. But TV does the same, so in a choice between the two I pick gaming any day
As others have said, highly dependent on the game. My kids playing minecraft together on the TV? I have no trouble at all with that. Them sitting, like zombies, playing shovelware android games? I am likely to shout at them to go out and play when that happens.
Games and watching videos are two different dopamine reward systems, but both are still dopamine reward systems. I encourage my kid toward games more than watching YouTube or doing anything less challenging. However, I still worry about the stubbornness that appears when it's time to stop playing.
One rule I made for myself is simply not to encourage anything that's just a dopamine reward. I try to mix things with effort, contemplation, or interaction, with a specific rule like this: no more than 1h playing Minecraft alone. If you want to keep playing, ask Dad, Mom, or your cousins to play together; the same for watching videos.
In what may be somewhat more of a hot take, I would argue that there is an incredible amount of educational value to be found even within the most meritless garbage games, as they are still fundamentally systems to be dissected and solved, and learning the maximally efficient way to do something worthless is a skillset that transfers quite handily to the valuable things in life. Also, unlike TV, games have a lot of potential to be immensely collaborative [or competitive] and social. Some of my fondest childhood memories were going to my friend's house and co-oping Diablo with one of us controlling the mouse and the other controlling the keyboard. There are many far less ad-hoc ways for kids to share games, and even singleplayer is a highly rewarding shared experience.
Very, very honestly, my experience with people who watch Disney too much is much, much better than with excessive gamers. Only one of those groups yells, swears and vents their anger at the people around them when they lose a match.
For the record, my kids do play games; I never did a complete ban. But the gaming does not seem to be superior: it does not zombify them less, nor does it lead to more inventive play after the session is finished.
Why are you trying to fight over what is the best for your kids?
Both have obvious benefits: Grimm Brothers tales are still culturally relevant for a reason, even if you can probably find useless brain fodder on Disney+; and video games can totally teach some stuff to your kids.
Reasoning through combat strategy, even in the age-inappropriate context of gun battles, exercises higher-level thinking that clearly remains off when watching Ryan's World.
To be fair, the article does say watching videos had a positive effect as well.
I don't think the problem is that polygenic scores are noisy. (You can choose to make them less noisy by restricting to significant SNPs, for example.) And noise doesn't require directional bias. But to me there are 2 problems:
(A) Polygenic scores for behavioral traits may be estimated in GWAS where the null assumptions (e.g., that mating is not conditioned on the trait being estimated) may not be valid[1]. That is on top of the issues that we usually face for other phenotypes (e.g., more routine population stratification due to geographic history).
(B) The authors did not describe the (genetic) ancestral background of the children being studied. Current techniques are biased across ancestries, for most traits, when using polygenic scores[2]. Certainly adjusting for 20 PCs in the final model, as the authors seemingly did, would not be expected to make the scores comparable unless all of the children are from a close ancestral group.
With these sources of stratification, the polygenic score represents more (and less) than the trait that you're hoping it estimates; it also encodes population stratification.
As such, I hardly think this study can be interpreted.
> You can choose to make them less noisy by restricting to significant SNPs, for example.
That makes them more noisy, not less. PGS predictive power for EDU/IQ is always maximized at use of all SNPs. Restricting to the arbitrary subset of genome-wide statistically-significant SNPs in Lee would drive it from the 7% or so they have to <1%, IIRC.
Also, neither of your two problems are the problem here, as the biases there would not be expected to drive a correlation between video game playing & IQ (what sort of within-ethnic interaction would you need for that and why is it plausible?), and would mostly serve to simply not control for intelligence (and quantitatively, because the PGS here is a small fraction of the variance, even gross biases which somehow did manage to drive correlations between those two variables, would still be unable to meaningfully affect the estimates).
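The thresholding point can be checked with a toy polygenic simulation (all parameters invented for the sketch): when every SNP carries a tiny true effect, a score built from all noisily estimated effects predicts better than one restricted to "significant" SNPs.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, n_gwas = 2_000, 5_000, 20_000

# Highly polygenic trait: every SNP has a tiny true effect.
true_beta = rng.normal(scale=0.02, size=m)
geno = rng.binomial(2, 0.5, size=(n, m)).astype(float)
trait = geno @ true_beta + rng.normal(size=n)

# GWAS effect estimates from an independent sample: true effect + sampling noise.
se = 1.0 / np.sqrt(n_gwas)
beta_hat = true_beta + rng.normal(scale=se, size=m)

pgs_all = geno @ beta_hat                 # score from all SNPs
sig = np.abs(beta_hat / se) > 5.45        # keep "genome-wide significant" only
pgs_sig = geno @ (beta_hat * sig)

def r2(score):
    return np.corrcoef(score, trait)[0, 1] ** 2

print(r2(pgs_all), r2(pgs_sig))  # the thresholded score predicts worse
```

Thresholding discards the many small true effects along with the noise, so the restricted score captures only a fraction of the heritable signal.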
They say that some of their main conclusions change if they control for parental education level, which they did not do. So that makes it all sound questionable. Plus I've never heard of the journal; don't be misled into thinking it's a high-profile Nature journal. Someone else here says it's a low-quality journal.
Measuring "intelligence" always seems to result in some questionable science.
That being said I do wonder about how I would have turned out had my childhood not been spent hiding under the covers at night with a small light devouring novel after novel and instead been inundated with social media and other distractions.
I would love to know what "watching videos" means here. There's a big difference between educational YouTube (Kurzgesagt, Physics Girl, Veritasium, etc.) and TV.
I think it is better to focus on getting kids to do creative projects. For example, parents should get them to write and illustrate a story once a month, or learn to play a music instrument, etc. If kids are doing interesting projects that change yearly and they work to improve their skills then the amount of time they spend on digital media will decrease and will not be a concern.
Having kids do projects is super helpful.
1) It builds their confidence in their ability
2) It shows the world (e.g. college admission boards) how they are valuable
3) It can become a way for them to be their own boss
4) It allows them to figure out what they want to be
5) It keeps them busy and out of trouble
The "get them to" part is the killer. I was easily frustrated as a kid and quit a lot of things in hindsight that I wish I stuck with. You never realize that time now is an investment for later, or that learning curves are different for everyone as a kid. I think that environment where you are in a cohort as a kid can be challenging too. If you see a few people in your class who are really good at something and you are struggling, that kills your confidence, and you will probably quit before actually putting in the work you need to get yourself to that same level and just assume it will always be impossible for you. As an adult now I see a lot of my peers still with this mentality and they drop new hobbies fast. Life is a slow march though and you shouldn't hold yourself to the standards of others, but that's a lifelong lesson that even 35 year olds are learning, and not something you can easily tell a 5 year old.
Of course, if you push too hard the other way, your kid may just hate the skill and drop it. I knew a lot of people who were forced into piano lessons, and got very good at it too. Many quit over the years due to that resentment once their parents gave them the choice, and today never play any music in adulthood. Such a shame.
TL;DR - the complexity of media has been increasing over the past decades, which means that children spending time with digital media are benefiting from it relative to past generations.
As an example, playing a modern AAA video game is much more mentally stimulating than playing Pong. But also, watching an hour of a modern TV show, or even a modern reality show, is more challenging mentally than watching classic TV from the 60s and 70s—there are many, many more plots and relationship dynamics to track and speculate about.
A counter argument (made by Alan Kay and others) is that the language complexity of what people read is reducing which is leading to a return to an oral culture rather than literate culture (to our collective detriment). My anecdotal experience as a high school English teacher leads me to say this feels true.
I wonder what Johnson thinks about the decline of reading going on in parallel to the rise of TV viewership? It's true that old shows or movies might have had simpler and more tidy plots (not all of them, of course), but I think even the most complicated modern shows aren't nearly so complicated as a basic novel. Case in point, anything adapted from print to TV or film has a lot of plot left out, because there simply isn't enough time to convey that information given the lower information density of these forms of media vs the printed page. Before serialized TV there were serialized stories that could have very complicated subplots and other things going on. Then you have the newspaper itself, with all of its complicated real-world subplots; a lot more content, certainly, than what the talking heads on CNN can cover even with 24-hour news.
It's interesting with the internet too: even though there should be a lot more stories in the zeitgeist at once given its wide reach, sometimes, due to virality, one storyline is able to dominate everything and suck the air out of the room. Did we really need a dozen articles about Will Smith slapping Chris Rock in everything from Reuters to The Atlantic for what seemed like two whole days? If you only got news from Twitter, that might consume your entire feed. If you got your news from the newspaper, that would at most consume one or two articles out of many pages of printed material. It would also be limited to probably one of the many different sections of the paper, each covering different news topics.
Can't say whether they're addictive or not. My personal opinion based on my own history is that the reward system in games (levelling up etc.) made me feel like I'd achieved something while in reality, I just entertained myself.
Long hours of gaming and imbibing this reward system translated into the real world, where it became harder for me to put in effort because I needed to see rewards accumulate as a score somewhere, and that wasn't happening.
Looking back on my childhood, I clearly exhibited signs of addiction.
Walking into the arcade filled me with joy. Trying to decide which game to play. Imagining if I was going to finally beat Wonderboy on one coin. Playing Kabal with my brother.
I was a good kid, but I would STEAL money from Mom's purse occasionally to satisfy my arcade craving.
Is the effect of media on intelligence really the primary concern?
Ad absurdum: if watching the Kardashians for two hours a day doubled your intelligence (however you decide to measure that) would you do it? Would you have your kids do it?
I am very out of touch w/ the current world of gaming, but in my past experience Nintendo has always been the 'healthiest' platform. The games are full of beautiful animation and great music - truly works of art, and aren't violent and/or trying to get them to buy things.
I worry about my future children's relationship with technology, especially social media.
But I'm not really concerned about general intelligence. As a parent I feel I have some input into their intellectual growth (to the extent environment allows). I'm far more concerned about the impact of social media on their emotional wellbeing. How they interact with others, and how they view themselves.
In Australia the Government is trying to regulate social media companies. For example, last year it introduced an "anti-trolling" bill, which would require companies to reveal the identities of anonymous trolls. And this is only the beginning of what in my opinion is heavy-handed Government overreach that will not improve the online experience of young people.
Despite being a fairly libertarian person, I'm open to a discussion on banning social media for people under a certain age (16? 18?). And then getting rid of all/most regulations on content.
Not saying this is something we should do right now, or that it's definitely a good idea. I'm just saying I think it's a discussion worth having.
mdrzn|3 years ago
So it's confirmed? Gaming > Socializing? /s
LewisVerstappen|3 years ago
They state
> Socializing via social media, text, and video chat
Does "socializing via social media" mean talking in a chat? Or does scrolling your Instagram feed and liking your friends' posts count as socializing?
Or scrolling Twitter and liking/replying to your friend's tweets? That could be considered socializing too?
Aeolun|3 years ago
Intelligence does not directly correlate with either success in life or happiness (I believe intelligence is negatively correlated with happiness).
zivkovicp|3 years ago
RGamma|3 years ago
dash2|3 years ago
Also, be aware that Scientific Reports is, if not quite a predatory journal, a very low bar. They publish tens of thousands of articles every year, while charging vast fees.
In general, these guys have correlations, not causation. Children's IQ - and gaming habits etc. - develop as they age, so controlling for baseline IQ is not enough to make a correlation with later IQ and gaming causal. It seems much more likely that smarter kids game more, e.g. because they live in richer households. (No, controlling for SES isn't enough to rule this out, for much the same reasons of measurement error as for the genetics.)
If you wanna believe that your hours on COD have made you a genius, go ahead, I won't stop you. Just don't imagine that this research proves it.
pocketsand|3 years ago
The Porsche comment is snide, but actually exposes a similar error in your critique. Sure, a tax return-derived measure of income would be superior to measuring if someone owned a luxury car. But, if you found yourself in a situation where all you had to go on for measuring economic wellbeing was (luxury) car ownership, your analysis is likely to improve by including it rather than excluding it, unless the measure itself had serious other issues with its accuracy.
Likewise, for SES, it is an imperfect measure, but it is the best we have for measuring social position in a concise way.
Having worked in research and universities for a while, the type of critique presented in this post is one you often see of new graduate students. They are able to tear down problems with research very well, but tend to overlook whether the study itself was still informative, or whether the opposite finding is likely to be true.
For example, suppose we wanted to know if video games or watching videos on the internet are making you dumber. A study like this may not convince you it's making you smarter, but it presents decent evidence they're not making you dumber. You can point out how the measures aren't perfect, but that is far from saying the opposite is true or the observed trends are completely spurious.
rapind|3 years ago
Video games, like all things, should not all be treated equal. I could certainly see problem solving skills developing from world building or highly complex games (Civ, PoE, etc.). In fact, most (but not all) highly successful games have depth, which requires time investment and problem solving. The difference in games can be as varied as comparing a marketing pamphlet to Asimov's novels.
I don't dispute your take on the quality of the research though. I would even go further and speculate it would be really really hard to come up with meaningful tests due to game variance. So most anything on the subject is likely fluff.
asdff|3 years ago
As far as Scientific Reports goes, it's a fine journal; it's run by Nature. It's not on the same planet as the predatory journals that spam inboxes. I worry that people will read your comment, assume you speak from authority, and discount any work they might see coming from that journal, when we both know that good science can be found in Scientific Reports, and that impact factor is more strongly correlated with "sexy" or expensive science than good science anyhow.
throwingrocks|3 years ago
323|3 years ago
Foobar8568|3 years ago
nicce|3 years ago
”Scientific Reports is an open access journal. To publish in Scientific Reports, all authors are required to pay an article-processing charge (APC) of $1,495.”
versteegen|3 years ago
epgui|3 years ago
_d2qh|3 years ago
*Not really a problem if you correctly believe science should be more than publishing sexy results.
wing-_-nuts|3 years ago
You know there are more video game styles than FPS, right? Strategy games teach patience and discipline, EVE Online teaches economics, and even the much-dismissed 'mindless' FPS teaches teamwork.
I think it's likely that at least some games do increase intelligence relative to other activities (i.e. mindlessly watching tv) but less so than others like reading.
jcranberry|3 years ago
jgalt212|3 years ago
You can say that for almost every human study that's not drug based, or very short-term.
Mikeb85|3 years ago
Obviously though the benefits aren't there if people just mindlessly play the same game all day.
boloust|3 years ago
Do we think this study would be improved if it did not control for genetics at all?
DeathArrow|3 years ago
lelanthran|3 years ago
Maybe it's the games you play (CoD) that make you imagine that game-playing develops no reasoning skills.
Play something else (starcraft, for example).
barbecue_sauce|3 years ago
thenerdhead|3 years ago
The arguments included in it are no different than many of these new studies coming out. If you grew up to be a latchkey kid and had your fair share of TV/video games/etc, it might give you perspective into things you never thought about.
https://www.youtube.com/watch?v=m3NBEurnIqY
erikpukinskis|3 years ago
One of the things he mentions is that TV is passive, and puts you in an alpha state where your brain stops trying to respond because there’s no point in responding.
My partner and I do watch a few hours of TV every night now. But we don’t do this “alpha” thing, at least not exclusively. We pause frequently to comment on or joke about what we’re seeing. To the point that I think sometimes a 30 minute show will take us an hour to get through.
The way it works is one of us holds the remote and pauses whenever they want to, and if the other wants to pause they just say "pause!"
I wonder how that changes Mander's analysis. For us it makes TV a pretty fun interactive experience.
And this way of watching was largely impossible when Mander did his work, because you simply couldn’t pause TV. Although you could pause a VCR or DVD.
I’m curious how widespread pausing is. I certainly feel that even solo TV watching is a more interactive experience than TV watching was when I was a kid. Alone, I’ll pause to Wikipedia things or to go find related media.
npteljes|3 years ago
"A latchkey kid, or latchkey child, is a child who returns to an empty home after school or a child who is often left at home with no supervision because their parents are away at work."
javajosh|3 years ago
My point is there is (at least) another important category of program that the researchers missed: creator software. I've also made simple songs with them in GarageBand, but the UI is still rather difficult for them to use on their own. I was inspired to take this approach because my first introduction to computers was Logo on an Apple IIe, and Seymour Papert's beautiful work left a lasting impression.
fullstop|3 years ago
jrussino|3 years ago
lawgimenez|3 years ago
I believe this research is severely lacking.
f0e4c2f7|3 years ago
Where I grew up there wasn't anyone around I could ask those kinds of questions of. I know that's not the Netflix / iPad world that the study is talking about or necessarily exists today. But I suspect that bifurcation still exists.
ho_schi|3 years ago
Some people try to bring these modification-friendly things back, with the BBC micro:bit, RasPi and so on. But in the end you need to be motivated - and playing better was a huge motivation for me!
switchbak|3 years ago
The docs were so sparse and the communities so small that it really was a much different experience than today. I have fond memories of it, but that might just be me looking back with rose colored glasses.
I would have killed for Stack Overflow though! But there is a sense of self-directed mastery that you don't get when you are so much more aware of how vast the bodies of knowledge are.
The closest I get to that these days is trying to hack code on a plane :)
conqueso|3 years ago
PKop|3 years ago
"We believe that studies with genetic data could clarify causal claims and correct for the typically unaccounted role of genetic predispositions. Here, we estimated the impact of different types of screen time (watching, socializing, or gaming) on children’s intelligence while controlling for the confounding effects of genetic differences in cognition and socioeconomic status."
"The contradictions among studies on screen time and cognition are likely due to limitations of cross-sectional designs, relatively small sample sizes, and, most critically, failures to control for genetic predispositions and socio-economic context. Although studies account for some confounding effects, very few have accounted for socioeconomic status and none have accounted for genetic effects. This matters because intelligence, educational attainment, and other cognitive abilities are all highly heritable."
svnt|3 years ago
christophilus|3 years ago
To me, it’s pretty obvious. The kids problem solve when gaming, and are obviously engaged. When watching TV, they look like zombies.
I think my wife’s biggest hang up with games is that she was always told that they rot your brain. Also, our kids talk about games, but never about TV which she interprets as games being more addictive. I interpret it as games being more interesting and engaging.
mrtksn|3 years ago
So I wouldn't call games "not addictive". If anything, watching something on TV is often less addictive because you are told a story with its introduction, climax and ending (up until Netflix ruined everything with its endless shovelware).
IMHO, the key is moderation. A day with diverse activities is a day well spent, kid or adult.
Today, if I play Sid Meier's Civilization, a day or two would be completely gone, I would be disconnected from reality, and I would need to re-adapt to the real world. I suspect excessive gaming's primary risk is developing an unhealthy understanding of the world in the areas where the game simulates the real thing.
brightball|3 years ago
The moment games include advertising they optimize for all the wrong things. I won’t let my kids get free games on iPad, etc for this reason alone.
trelane|3 years ago
em500|3 years ago
Are you not just seeing what you want to see? Maybe from your wife's perspective, the kids are carefully observing and learning a wide range of human emotions, social dynamics, new idioms and music from Disney+, whereas in their games they're learning a few tricks that they repeat ad nauseam to get some trivial rewards from their digital Skinner boxes?
idontwantthis|3 years ago
bluefirebrand|3 years ago
People used to say that about TV too.
I always feel a bit personally attacked when people claim videogames are just bad for you full stop. I'm really passionate about games, I grew up playing NES, and just never stopped. I almost always have a new game waiting to play for when I'm finished with what I'm currently playing.
Sample size of one, but I don't think my brain is rotten. I have a pretty successful career in software, I have a close partner, I have a social life. I have other hobbies too, but it's my main one.
Don't get me wrong, I know my gaming takes time away from other stuff I could/should be doing, like building side projects or getting enough exercise. But TV does the same, so in a choice between the two I pick gaming any day
sandos|3 years ago
mariocesar|3 years ago
One rule I made for myself is simply don't encourage anything just for dopamine rewards. I try to mix things with effort, contemplation, or interaction, with rules like this specific one: no more than 1h playing Minecraft alone. If you want to keep playing, ask Dad, Mom, or your cousins to play together; the same for watching videos.
odessacubbage|3 years ago
in what may be somewhat more of a hot take, i would argue that there is an incredible amount of educational value to be found even within the most meritless garbage games as they are still fundamentally systems to be dissected and solved and learning the maximally efficient way to do something worthless is a skillset that transfers quite handily to the valuable things in life. also unlike tv, games have a lot of potential to be immensely collaborative [or competitive] and social. some of my fondest childhood memories were going to my friend's house and co-oping diablo with one of us controlling the mouse and the other controlling the keyboard. there are many far less ad-hoc ways for kids to share games and even singleplayer is a highly rewarding shared experience.
watwut|3 years ago
For the record, my kids do play games; I never did a complete ban. But the gaming does not seem to be superior: it does not zombify them less, nor lead to more inventive play after a session finishes.
erfgh|3 years ago
legitster|3 years ago
lemmiwinks|3 years ago
Both have obvious benefits: Grimm Brothers tales are still culturally relevant for a reason, even if you can probably find useless brain fodder on Disney+; and video games can totally teach some stuff to your kids.
Just let them choose what they want I guess?
tiahura|3 years ago
Reasoning through combat strategy, even in the age-inappropriate context of gun battles, exercises higher-level thinking that clearly remains off when watching Ryan's World.
To be fair, the article does say watching videos had a positive effect as well.
SnowHill9902|3 years ago
carbocation|3 years ago
(A) Polygenic scores for behavioral traits may be estimated in GWAS where the null assumptions (e.g., that mating is not conditioned on the trait being estimated) may not be valid[1]. That is on top of the issues that we usually face for other phenotypes (e.g., more routine population stratification due to geographic history).
(B) The authors did not describe the (genetic) ancestral background of the children being studied. Current techniques are biased across ancestries, for most traits, when using polygenic scores[2]. Certainly adjusting for 20 PCs (principal components) in the final model, as the authors seemingly did, would not be expected to make the scores comparable unless all of the children are from a close ancestral group.
With these sources of stratification, the polygenic score represents more (and less) than the trait that you're hoping it estimates; it also encodes population stratification.
As such, I hardly think this study can be interpreted.
1 = https://www.nature.com/articles/s41467-022-28294-9
2 = https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6563838/
gwern|3 years ago
That makes them more noisy, not less. PGS predictive power for EDU/IQ is always maximized at use of all SNPs. Restricting to the arbitrary subset of genome-wide statistically-significant SNPs in Lee would drive it from the 7% or so they have to <1%, IIRC.
Also, neither of your two problems are the problem here, as the biases there would not be expected to drive a correlation between video game playing & IQ (what sort of within-ethnic interaction would you need for that and why is it plausible?), and would mostly serve to simply not control for intelligence (and quantitatively, because the PGS here is a small fraction of the variance, even gross biases which somehow did manage to drive correlations between those two variables, would still be unable to meaningfully affect the estimates).
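The all-SNPs-versus-significant-SNPs point is easy to demonstrate with a toy polygenic trait. Everything here is simulated (the sample sizes, SNP count, and heritability are arbitrary assumptions, not the values in the actual GWAS): a trait is driven by many tiny effects, a "GWAS" estimates per-SNP slopes in a training sample, and the score built from all SNPs predicts far better out of sample than one restricted to genome-wide-significant hits.

```python
import numpy as np

rng = np.random.default_rng(1)
n_train, n_test, m = 10_000, 4_000, 1_000

# Many tiny true effects that jointly explain 50% of trait variance.
beta_true = rng.normal(scale=np.sqrt(0.5 / m), size=m)

def simulate(n):
    g = rng.normal(size=(n, m))    # standardized, independent genotypes (toy)
    y = g @ beta_true + rng.normal(scale=np.sqrt(0.5), size=n)
    return g, y

g_tr, y_tr = simulate(n_train)
g_te, y_te = simulate(n_test)

# Marginal per-SNP "GWAS" in the training sample: slope and z-score.
beta_hat = g_tr.T @ y_tr / n_train
z = beta_hat * np.sqrt(n_train)    # SE of each slope is ~ 1/sqrt(n_train)

def pgs_r2(mask):
    """Out-of-sample R^2 of a polygenic score built from the masked SNPs."""
    if not mask.any():
        return 0.0
    score = g_te @ (beta_hat * mask)
    return np.corrcoef(score, y_te)[0, 1] ** 2

r2_all = pgs_r2(np.ones(m, dtype=bool))   # score from all SNPs
r2_sig = pgs_r2(np.abs(z) > 5.45)         # ~ genome-wide p < 5e-8 threshold
```

With these (assumed) parameters, `r2_all` comes out several times larger than `r2_sig`: most of the predictive signal sits in SNPs that never cross the significance threshold, which is why all-SNP scores are used despite their noise.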
da39a3ee|3 years ago
Melatonic|3 years ago
That being said I do wonder about how I would have turned out had my childhood not been spent hiding under the covers at night with a small light devouring novel after novel and instead been inundated with social media and other distractions.
woliveirajr|3 years ago
Today it's hiding and using the "smart"phone
raptortech|3 years ago
I would love to know what "watching videos" means here. There's a big difference between educational YouTube (Kurzgesagt, Physics Girl, Veritasium, etc.) and TV.
swframe2|3 years ago
Having kids do projects is super helpful. 1) It builds their confidence in their ability 2) It shows the world (e.g. college admission boards) how they are valuable 3) It can become a way for them to be their own boss 4) It allows them to figure out what they want to be 5) It keeps them busy and out of trouble
asdff|3 years ago
Of course, if you push too hard the other way, your kid may just hate the skill and drop it. I knew a lot of people who were forced into piano lessons, and got very good at it too. Many quit over the years due to that resentment once their parents gave them the choice, and today never play any music in adulthood. Such a shame.
npilk|3 years ago
TL;DR - the complexity of media has been increasing over the past decades, which means that children spending time with digital media are benefiting from it relative to past generations.
As an example, playing a modern AAA video game is much more mentally stimulating than playing Pong. But also, watching an hour of a modern TV show, or even a modern reality show, is more challenging mentally than watching classic TV from the 60s and 70s—there are many, many more plots and relationship dynamics to track and speculate about.
germinalphrase|3 years ago
asdff|3 years ago
It's interesting with the internet too: even though there should be a lot more stories in the zeitgeist at once given its wide reach, sometimes due to virality one storyline is able to dominate everything at once and suck the air out of the room. Did we really need a dozen articles about Will Smith slapping Chris Rock in everything from Reuters to The Atlantic for seemingly two whole days? If you only got news from Twitter, that might consume your entire feed. If you got your news from the newspaper, it would at most consume an article or two out of many pages of printed material. It would also be limited to probably one section of the many different sections of the paper, all covering different news topics.
noufalibrahim|3 years ago
Long hours of gaming and imbibing this translated over into the real world where it became harder for me to put in effort because I had to see rewards accumulate as a score somewhere and that wasn't happening.
pcurve|3 years ago
Walking into the arcade filled me with joy. Trying to decide which game to play. Imagining if I'm going to finally beat Wonder Boy on one coin. Playing Kabal with my brother.
I was a good kid, but I would STEAL money from Mom's purse occasionally to satisfy my arcade craving.
Wow!
coffeeblack|3 years ago
TameAntelope|3 years ago
aporetics|3 years ago
Ad absurdum: if watching the Kardashians for two hours a day doubled your intelligence (however you decide to measure that) would you do it? Would you have your kids do it?
objektif|3 years ago
conqueso|3 years ago
istorical|3 years ago
fassssst|3 years ago
tastysandwich|3 years ago
But I'm not really concerned about general intelligence. As a parent I feel I have some input into their intellectual growth (to the extent environment allows). I'm far more concerned about the impact of social media on their emotional wellbeing. How they interact with others, and how they view themselves.
In Australia the Government is trying to regulate social media companies. For example, last year it introduced an "anti-trolling" bill, which would require companies to reveal the identities of anonymous trolls. And this is only the beginning of what in my opinion is heavy-handed Government overreach that will not improve the online experience of young people.
Despite being a fairly libertarian person, I'm open to a discussion on banning social media for people under a certain age (16? 18?). And then getting rid of all/most regulations on content.
Not saying this is something we should do right now, or that it's definitely a good idea. I'm just saying I think it's a discussion worth having.
sidcool|3 years ago
aaron695|3 years ago
[deleted]
lesgobrandon|3 years ago
[deleted]