This is sweet news. I'm over 40. I enrolled at my local university in January and I'm studying (literally right now) for my linear algebra midterm [0], which is in 45 minutes! I'm on HN to calm my nerves.
I graduated high school in the early 2000s and graduated college with a major in computer science and a minor in math. My goal is 5-8 more classes for a second degree, this time a math major. Wish me luck!
[0] Study guide: https://course1.winona.edu/bperatt/M311S25/Tests/Test%202/te... Course: https://course1.winona.edu/bperatt/M311S25/Administrative/M3...
I'd like to see a study on how the acute stress of living in survival mode for a lifetime affects the brain by using it too much for the wrong tasks.
The last 25 years have been particularly painful for people like me who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel. When I look around at the sheer computing power available to us, I'm saddened that people with wealth, power and influence tend to point to their own success as reason to perpetuate the status quo, when we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation, so that we could focus on getting real work done in the sciences, for example, instead of just making rent.
I've been living like someone from movies like In Time and The Pursuit of Happyness for so many decades without a win that my subconscious no longer believes that the future will be better. I have to overcome tremendous spidey-sense warning signs from my gut in order to begin working each day. The starting friction is intense, to the point where I'm not sure how much longer I can continue doing this to myself, and I'm "only" in my mid-40s. After a lifetime of negative reinforcement, I'm not sure that I can adopt new innovations like AI into my workflows.
It's a hollow feeling to have so much experience in solving any problem, when problem solving itself will soon be solved/marginalized to the point that nobody wants to pay for it because AI can do it. I feel rather strongly that within 3 years, mass layoffs will start sweeping the world with no help coming from our elected officials or private industry. Nobody will be safe from being rendered obsolete, not even you the reader.
So I have my faculties, I have potential, but I've never felt dumber or more ineffectual than I do right now.
>I'd like to see a study on how the acute stress of living in survival mode for a lifetime affects the brain by using it too much for the wrong tasks. The last 25 years have been particularly painful for people like me who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel.
I suspected something very different based on the first sentence, like someone living in a high-crime area and trying not to get dragged into it, or constantly struggling with poverty, food insecurity, etc.
> When we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation.
I was inspired to get into programming by Star Trek in the early 2000s because I thought I could contribute to automation that would lead towards that kind of society; much like you've stated here. Some will say we're naive and unrealistic, but all the ingredients for having society function in this way are attainable with a bit of a cultural shift. I was fine with the idea that society could take baby steps towards it, but it seems the last 25 years have been a mixture of regressing and small incremental improvements to things that don't contribute towards that goal. Just like you, my expectations have been utterly destroyed and my outlook for the future is grim.
You're not the only one who has had those kinds of feelings, and I really relate to the movies you referenced.
Try to remember: AI is a tool, not a solution, and there will always be new problems. There's a strong case that, unlike every other time people said technology would kill all the jobs, this time it actually will. But a helpful framework comes from Clayton Christensen's Innovator's Solution (not the much more famous Innovator's Dilemma) - whereas a business has well-defined needs that can be satisfied by improving products, customers (i.e. people) have ever-evolving needs that will never be met. So while specific skills may lose value, there will always be demand for the ability to recognize and provide value and solutions.
AI didn't really mesh seamlessly with my work until I used Claude, I highly recommend it. If your current workflow involves googling, reading documentation and examples on github until you can put together a solution then AI should slot into your work nicely. It just does all those things but faster and can often surface what I want in 30 seconds instead of 30 minutes of research.
I wouldn't worry, though. If the last 4 years are any indicator, we will continue to see LLMs refined into better and better tools at a logarithmic rate, but I don't really see them making the jump to replacing engineers entirely unless some monumental leap happens. If AI ever gets that good, it will have replaced vast swathes of white-collar workers before us.
I am somewhat optimistic: tech adoption is only going to go up, and the number of students pouring into CS programs is cooling off now that there aren't $100k jobs waiting for anyone who can open an IDE. My ideal future is one where the people who really love tech are still here in 10 years, we have crazy output because the tooling is so good, and all the opportunistic money seekers have been shaken out.
I just want to say that, while I am about a decade your junior, I feel the same way.
It is weird to live in a world of shallow pursuits, wanting to learn and teach and build and seeing everyone going crazy about 'line goes up'. It also pains me the contortions that are required to afford to exist when we have so much wealth and knowledge and so many still have to suffer.
And the weird thing is, I see everything as learning. From fields learning to interact and persist 'particles', to ecosystems learning to dissipate energy, to humans learning to collaborate. And we are literally building machines capable of learning. In a deeper sense, software is machine learning: general computers are the first machines we built that are pure learning potential. A loom can only make fabric, but by making looms capable of learning different patterns, without the need for a human making every little decision, we sparked a fire that is now consuming everything.
I don't think LLMs will shortcut software building. But I do think that existence itself is about learning. Seeing it hijacked by people obsessed with grabbing more resources for the sake of it is truly sad.
But then again, that is the root of suffering. Maybe what pains me the most is knowing how much I still hold on to in my own way. Maybe the best lesson I can take from all this is that the more I let go of the more I can lessen my suffering and participate in the great universal journey of learning. As a singer I greatly admired sang: if your cup is already full it's bound to overflow.
I think (from personal experience) talking with a good mental health professional would really help with your current state of mind and the pressure you’re feeling.
Perhaps your life is on the easy setting? Hungry people work really hard. Fearing destroying an entire family by losing my job allows me to find strength and courage.
As a researcher who changed career paths to teaching at a community college, I empathize. Twenty years ago, when I graduated from high school, I was inspired by the stories I'd read about Bell Labs, Xerox PARC, and early Apple and Microsoft. I wanted to be a researcher, and I wanted to do interesting, impactful work.
Over the years I’ve become disappointed and disillusioned. We have nothing like the Bell Labs and Xerox PARC of old, where researchers were given the freedom to pursue their interests without having to worry about short-term results. Industrial research these days is not curiosity-driven, instead driven by finding immediate solutions to business problems. Life at research universities isn’t much better, with the constant “publish-or-perish” and fundraising pressures. Since the latter half of January this year, the funding situation for US scientists has gotten much worse, with disruptions to the NIH and NSF. If these disruptions are permanent, who is going to fund medium- and long-term research that cannot be monetized immediately?
I have resigned myself to the situation, and I now pursue research as a hobby instead of as a paid profession. My role is strictly a teaching one, with no research obligations. I do research during the summer months and whenever else I find spare time.
> I'm saddened that people with wealth, power and influence tend to point to their own success as reason to perpetuate the status quo. When we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation
What you stated is true, but my disappointing observation is that the people with wealth/power are only marginally smarter than the rest of us on the topic you mentioned. And I suspect that even if one had a rich benefactor, pulling that off would not be easy. It takes a threshold number of people with a holistic view of things to pull off what you mention, i.e., nearly free basics of life. Check my profile etc. - some of what I wrote may strike a chord with you.
Also, the proponents of Technocracy (Hubbert etc.) about 100 years back essentially touched on the subject you state. Note: the word technocracy today has a different connotation.
I'm very sympathetic to your experience and agree with most of what you say, but as someone who has spent half his life in academia and half outside, regarding "who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel": I must say that 'reinventing the wheel' is at least as prevalent in academia as it is in business.
> acute stress of living in survival mode for a lifetime
For some perspective, bone evidence of pre-Columbian Indians showed that they regularly suffered from famine. There was also the constant threat of warfare from neighboring tribes.
The American colonists didn't fare much better; their bone evidence is one of extreme overwork and malnutrition.
If I may be so bold as to refer to you as "my friend" (having never met you)...
My friend, I think I understand what you mean. I am about the same age too.
I would like to propose an idea to you, something I have been exploring very deeply myself lately. Maybe the thing we need to start spending our time on is exactly this meta problem. The meta problem is something like (not perfectly stated): we as humans have to decide what we value, such that we can continue to give our existence purpose in the future.
I don't think AI is going to be the be-all-end-all, but it is clearly a major shift that will keep transforming work and life.
I can't point yet at a specific job, or task - but I am spending real time on this meta problem and starting to come up with some ideas. Maybe we can be part of what gets the world, and humans, ready for the future - applying our problem solving skills to that next problem?
I mean all of the above in 100% seriousness and I am willing to chat sometime if interested to compare notes.
Maybe it's time for me (40+) to go back to college. I want to pick up Mathematics and Physics up to the point of General Relativity. Since it's "use it or lose it", I better start reading now.
But I don't really have any time. There are so many things to do, to learn. Younger people who happen to stumble upon this reply, please please prioritize financial freedom if you don't have a clear objective in mind -- and from my observation many people don't have a clear objective when they are in their 20s! If you can retire around 35-40, you have ample time to pursue any project you want for the rest of your life.
Putting in a plug for MIT OCW 8.962 [1]. I also had this itch, and was able to find time during the pandemic to work through the course (at about 1/2 speed). But true to what others are saying, life intruded for the last few lectures, so I still have some items on my todo list. I thought Scott Hughes laid out the math with terrific clarity, and with just the right amount of joviality. It is not for everyone, but if you have a suitable background it may turn scratching an itch into an obsession, as it has for me.
And to make the obligatory on-topic comment: I'm 61yo. Now get off my lawn.
I've always toyed with the idea of studying Computer Science since I taught myself how to code.
Hell of a lot more difficult now, when I need to work and don't really have the same amount of time to dedicate to studying. Hell of a lot easier when you're younger: your whole life basically revolves around education, and any job you have generally fits around your school life rather than the other way round.
I'm over 40 and even though I mostly manage/lead now I have time to do programming and plenty of math. I still see improvement mentally (not so much physically anymore), but also a lot of improvement in skills I neglected when I was younger like interpersonal skills and sales. I'm also learning a new language and read more than ever. Sometimes I feel like I'm less sharp, but I wonder if that's because I'm doing so much more.
My tricks, which I don't always follow, are to work out every day, get enough sleep, and stay off most short-form social media. I realized that when I was on short-form social media, it would sap a lot of time and kill any focus I had.
This advice could really backfire badly if taken literally by young people.
Optimizing for financial reward early in your career could be the surest way to end up in a dead end from a mission/purpose/domain/skills perspective.
20 years later, you realize you burned two precious decades accumulating money that, honestly, does not help you at all make sense or use of the next two.
I got excited to do this a couple years ago. (early 30s) Time and energy were a real killer.
Physics and math in a formal setting like school are rigorous, not fun, and I found it really hard to stay motivated. I don't know how I would practically use that knowledge; I would never contribute anything scientific, and it would take years of grinding through foundational math and physics to get there.
I often ponder if I have the energy to go back to school. I am employed by MIT at one of the labs, where I do research on embedded security. As a consequence, they offer free classes you can pick up. I have yet to actually take advantage of that, but your comment has me thinking the same thing. I turn 36 in a couple days!
Yay, do it! I'm in linear algebra right now (midterm in 40 minutes) and I'm over 40. I went back because I always regretted not taking more higher level math. It's been a lot of work, but very rewarding. My kids (age 7 and 5) think it's pretty cool to see dad working on his TI-89 and Notability on iPad.
I was running into the same issue: I wanted to get into deep learning but my math skills had atrophied. Go check out mathacademy.com. It's nowhere near the level of time investment that going back to college is, and you will learn a lot!
More proof that old boomers don't get what it's like to be a modern young adult. I was just texting with friends about this at the coffee shop this morning while making plans for this weekend. Boss is interrupting my goat-yoga mindfulness session, asking me to come into the office an hour this month. Who has time for this?
Echoing the sentiments of others here, this is why I firmly believe that public college should be free, for all, for life. Formal education just works better for some of us than video tutorials or self-paced learning, and ensuring everyone is able to learn new things and practice their skills in a consequence-free environment benefits society as a whole.
Think about the tech nerds (me) who never learned how to cook, and are in their thirties. Or lawyers and doctors who are sick and tired of feeling like they don’t understand how computers work, and want to learn. Or an accountant who loves maths, and wants to get into the scientific side of the field. Or the homemaker who wants to re-enter the workforce now that their kids are grown, and wants to pick up carpentry and welding to become a tradesperson.
If cognitive decline comes from failing to practice it regularly, then the cheapest solution is free education for life to encourage as many people as possible to keep learning new skills and remain cognitively engaged.
She still got Alzheimer's and died a couple of years later.
She had multiple incidents that she hid because she was too scared to find out, and too stubborn to lose her ability to drive. She could have had some treatment if she'd approached a doctor earlier.
Alzheimer's is utterly evil. Robbing people of their unique spark, killing the person before the body dies.
With 25 years of experience in software development, I’ve noticed that long coding sessions leave me feeling more fatigued than they used to. However, I’ve also become significantly more productive, as I spend far less time grappling with problems I’ve already solved. I’ve only just begun to explore AI-assisted coding, so that isn’t what’s driving my efficiency. Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
> Is it reasonable to assume that the natural decline in cognitive performance over time is offset by the gains in experience and expertise?
It depends on the task, but overall, for the work I do as a software developer, yes.
I would say I have less energy, but I need less energy, and I produce better results in the end. I'm better at anticipating where a line of work will go, and I'm quicker and better at adjusting course. There are a lot of multi-hour and multi-day mistakes that I made ten and twenty years ago that I don't make now.
The raw mental energy I had when I was younger allowed me to write things I couldn't write now, but everything I write now is something that other people can read and maintain, unlike twenty years ago. It's very rare that writing a large, clever, intricate mass of code is the right answer to anything. That used to frustrate me, because I was good at it. I used to fantasize about situations where other people would notice and appreciate my ability to do it. Now I'm glad it's not important, because my ability to do it has noticeably declined. In the rare cases where it's needed, there are always people around who can do it.
Another thing that is probably not normal, but not rare either, is that the energy I had when I was young supercharged my anxiety and caused me to avoid a lot of things that would have led to better outcomes, like talking to other people. I'm still not great (as in, not even average for an average human, maybe average for a software developer) but I'm a lot better than I used to be.
> skills decline at older ages only for those with below-average skill usage. White-collar and higher-educated workers with above-average usage show increasing skills even beyond their forties.
> Individuals with above-average skill usage at work and home on average never face a skill decline (at least until the limit of our data at age 65).
I wonder how much of the "age-related" decline is due to the brain functioning on autopilot. After over 5 decades, I have experienced most of the issues I'm going to experience in life. More often than not, I'm addressing issues with mental playbooks based on past experience.
As I get older (now in my 50s), I find myself reflecting on how many aspects of my life and decisions are operating on autopilot. I figure it's worse now with social media where people are constantly bombarded with dopamine hits, while boredom and idle thoughts have largely become things of the past.
Perhaps counterintuitively, I am trying to break this pattern and consciously engage with my experiences by asking a few basic questions, such as:
- What am I seeing here?
- What's going on?
- What am I missing?
- How can I approach this differently to achieve the same or better outcomes?
Additionally, I am making a concerted effort to notice more new details during routine tasks, like commuting or shopping. I can't count how many times I've discovered something new and interesting on my work commutes. Actually, I can: it's every time.
Edit: Also spending more time with long-form content over short-form, be it reading or watching videos. It forces me to consider a topic for a much longer period. Short form knowledge is a trap, unless you have some system that hits you with high rates of repetition (eg Anki).
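As a rough illustration of the "high rate of repetition" idea behind tools like Anki: successful recalls push the next review further out, while a lapse resets you. This is a toy sketch, not Anki's actual algorithm; the multiplier and reset rule here are illustrative assumptions.

```python
def next_interval(days, remembered, multiplier=2.5):
    """Grow the review interval when recall succeeds; reset it when it fails."""
    if not remembered:
        return 1  # start over with a review tomorrow
    return max(1, round(days * multiplier))

# A card reviewed successfully four times in a row:
interval = 1
schedule = []
for _ in range(4):
    interval = next_interval(interval, remembered=True)
    schedule.append(interval)
print(schedule)  # → [2, 5, 12, 30]
```

The exponential growth is the point: a handful of well-timed reviews can keep something in memory for months, which is why short-form knowledge without such a system tends to evaporate.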
Are there any guidelines for what exactly this would entail?
My short term memory is falling off a cliff. What do I need to do to prevent that from getting worse? Are there any other bases I need to cover that I don't know that I'm missing?
For those who don't feel like taking math courses in a formal setting, making games from scratch is a fun way to learn and apply linear algebra and calculus.
I never really needed determinants in my life until I tried moving a spaceship towards another object. Trying to render realistic computer graphics gets you into some deep topics like FFTs and the physics of light and materials, with some scary-looking math, but I can feel my mind sharpening with each turn of the page in the book.
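To make the spaceship example concrete, here is one small place a determinant earns its keep (a sketch with illustrative names, not code from any particular engine): the sign of the 2x2 determinant formed by the ship's forward vector and the direction to its target tells you which way to turn.

```python
def turn_direction(forward, to_target):
    """Decide which way a ship should rotate toward a target in 2D.

    The determinant det([[fx, tx], [fy, ty]]) = fx*ty - fy*tx is positive
    when to_target lies counterclockwise (left) of forward, negative when
    clockwise (right), and zero when the vectors are parallel.
    """
    fx, fy = forward
    tx, ty = to_target
    det = fx * ty - fy * tx
    if det > 0:
        return "left"
    if det < 0:
        return "right"
    return "aligned"

# Ship facing along +x, target up and to the left:
print(turn_direction((1, 0), (-1, 1)))  # → left
```

The same determinant-sign trick shows up all over game code: winding order of triangles, point-in-polygon tests, and deciding which side of a wall you're on.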
I write code, pretty much every single day, and also, solve problems, every single day (7 days a week).
I think solving problems is important. Not just rote coding, but being presented with a bug, or a need to achieve an outcome, without knowing the solution, up front, is what I like.
Basically, every single day, I'm presented with a dilemma, which, if not solved, will scrap the entire project that I'm working on.
I solve every one (but sometimes, by realizing it's a red herring, and trying alternate approaches).
> ”Cross-sectional age-skill profiles suggest that cognitive skills start declining by age 30 if not earlier.”
and
> ”Two main results emerge. First, average skills increase strongly into the forties before decreasing slightly in literacy and more strongly in numeracy”
Does this mean that this study contradicts the popular common understanding that cognitive skills decline after 30? Or am I missing something?
For me, personally, it feels like a more impactful finding than the “use it or lose it” one
Recreational travel is the only thing that routinely works for me in terms of slowing down time and fully engaging my brain. It's something I can incorporate into my life multiple times per year and it guarantees a massive amount of new stimulation (assuming travel to new and interesting places). Even the most rudimentary trip to Europe will have you grappling all day long with a different language and culture and environment in ways that are completely taken for granted in our day to day lives.
There's lots of things that can make an even bigger impact, like moving to a new place or starting a new career or school, or a new relationship. But those are things that sometimes only happen a handful of times in our entire lives.
Everything else I find eventually becomes routine, no matter how stimulating it can be at the start. Not that we shouldn't try! Some stimulation is a whole lot better than none, and I have a terrible feeling that many people get little-to-no stimulation for weeks and months at a time (beyond a new TV show or podcast or political drama).
If you are older, I think the trick is to watch (or remember!) what younger people do and follow (or revert to) that behavior, as much as you can.
Comparing cognitive abilities between older and younger people fails to control for the inputs - behavior, experience, etc. Try the same inputs (using some big generalities):
* Exploration: Younger people love to explore, even just for exploration's sake, and are also compelled to try things - and they also fail. Exploration is their mode, because so much of the world is new to them, because doing something new and innovative is socially admired, and especially because so many major changes happen - leave home, serious romantic relationships, first job, etc. A lot of that happens, ready or not.
* Learning: Similarly, younger people are compelled to learn lots of very challenging things, whether they want to or not; they are compelled to use cognitive skills that they are uncomfortable with. Their job is to learn, daily, for 12-16+ years. Remember school? Remember your early years at work, when you had little choice of what you did? Remember struggling with all those things?
* Playing: Young people love to play and are socially admired for playing better and more creatively.
What, you're past all that? Nobody is going to make you study things you're not interested in? Don't want to make any big changes? Dignity too big to play? Ego too big to explore and to fail? When you're older, you can say no, and 99.99% of people (I think that's about accurate) take advantage of that and refuse to do or even talk about things they aren't already comfortable with. Does all this sound too hard? Then don't complain about losing those skills.
I think a big part of the problem is the same that affects CEOs - there is nobody to hold them to account.
This makes so much sense. I've been programming every day since I was in my twenties and there are definitely some concepts that seem much easier for me to get my head around now (I'm in my 50's) than earlier.
Right now I'm reading through a college textbook on the biology of learning and memory with ease and good retention. Never got this deep into any subject in my school years.
I’d like to remind everyone that learning to play music, learning dancing or sports requiring complex coordination also count.
Oh, and a question at the back of my mind: wouldn’t using AI to minimize the time all of us spend in the struggling-to-figure-something-out zone lead to earlier decline on a massive scale?
As someone who plays a lot of board games — particularly heavier board games — and hopes to do even more of that in retirement, I’m wondering if/how that is helping/will help fight cognitive decline.
I can imagine at the very least it won’t hurt, and intuitively it makes sense. But I’m not sure studies have been done specifically to understand how board gaming — or the problems being solved with board gaming - helps with cognitive skills.
Curious if others that are closer to this field have thoughts.
[+] [-] semireg|1 year ago|reply
I graduated high school in the early 2000s and graduated college with major in computer science and a minor in math. My goal is 5-8 more classes for a second degree in math (major).
Wish me luck!
[0] Study guide: https://course1.winona.edu/bperatt/M311S25/Tests/Test%202/te... Course: https://course1.winona.edu/bperatt/M311S25/Administrative/M3...
[+] [-] zackmorris|1 year ago|reply
The last 25 years have been particularly painful for people like me who favor academia and pure research over profit-driven innovation that tends to reinvent the wheel. When I look around at the sheer computing power available to us, I'm saddened that people with wealth, power and influence tend to point to their own success as reason to perpetuate the status quo. When we could have had basic resources like energy, water, some staple foods and shelter provided for free (or nearly free) through automation. So that we could focus on getting real work done in the sciences for example, instead of just making rent.
I've been living like someone from movies like In Time and The Pursuit of Happyness for so many decades without a win that my subconscious no longer believes that the future will be better. I have to overcome tremendous spidey sense warning signs from my gut in order to begin working each day. The starting friction is intense. To the point where I'm not sure how much longer I can continue doing this to myself, and I'm "only" in my mid-40s. After a lifetime of negative reinforcement, I'm not sure that I can adopt new innovations like AI into my workflows.
It's a hollow feeling to have so much experience in solving any problem, when problem solving itself will soon be solved/marginalized to the point that nobody wants to pay for it because AI can do it. I feel rather strongly that within 3 years, mass-layoffs will start sweeping the world with no help coming from our elected officials or private industry. Nobody will be safe from being rendered obsolete, not even you the reader.
So I have my faculties, I have potential, but I've never felt dumber or more ineffectual than I do right now.
[+] [-] nonethewiser|1 year ago|reply
I suspected something very different based off the first sentence. Like someone living in a high crime area and trying not to get dragged into it. Or constantly struggling with poverty, food insecurity, etc.
[+] [-] y-c-o-m-b|1 year ago|reply
I was inspired to get into programming by Star Trek in the early 2000s because I thought I could contribute to automation that would lead towards that kind of society; much like you've stated here. Some will say we're naive and unrealistic, but all the ingredients for having society function in this way are attainable with a bit of a cultural shift. I was fine with the idea that society could take baby steps towards it, but it seems the last 25 years have been a mixture of regressing and small incremental improvements to things that don't contribute towards that goal. Just like you, my expectations have been utterly destroyed and my outlook for the future is grim.
[+] [-] pchristensen|1 year ago|reply
Try to remember, AI is a tool, not a solution, and there will always be new problems. There's a strong case that unlike every other time people said that technology will kill all the jobs, this time it actually will. But a helpful framework comes from Clayton Christensen's Innovator's Solution (not the much more famous Innovator's Dilemma) - whereas a business has well defined needs that can be satisfied by improving products, customers (i.e. people) have ever evolving needs that will never be met. So while specific skills may lose value, there will always be a demand for the ability to recognize and provide value and solutions.
[+] [-] ericmcer|1 year ago|reply
I wouldn't worry though, if the last 4 years are any indicator, we will continue to see LLMs refined as better and better tools at a logarithmic rate, but I don't really see them making the jump to replacing engineers entirely unless some monumental leap happens. If AI ever gets that good it will have replaced vast swathes of white collar workers before us.
I am somewhat optimistic, tech adoption is only going to go up, and the number of students pouring into CS programs is cooling off now that there aren't $100k jobs waiting for anyone who can open up an IDE. My ideal future is people who really love tech are still here in 10 years, and we will have crazy output because the tooling is so good, and all the opportunistic money seekers will have been shaken out.
[+] [-] namaria|1 year ago|reply
It is weird to live in a world of shallow pursuits, wanting to learn and teach and build and seeing everyone going crazy about 'line goes up'. It also pains me the contortions that are required to afford to exist when we have so much wealth and knowledge and so many still have to suffer.
And the weird thing is, I see everything as learning. From fields learning to interact and persist 'particles' to ecosystems learning to dissipate energy to humans learning to collaborate. And we are literally building machines capable of learning. In a deeper sense, software is machine learning: general computers are the first machines we built that are pure learning potential. A loom can only make fabric but by making them capable of learning different patterns without the need of a human making every little decision we sparked a fire that is now consuming everything.
I don't think LLMs will shortcut software building. But I do think that existence itself is about learning. Seeing it hijacked by people obsessed with grabbing more resources for the sake of it is truly sad.
But then again, that is the root of suffering. Maybe what pains me the most is knowing how much I still hold on to in my own way. Maybe the best lesson I can take from all this is that the more I let go of the more I can lessen my suffering and participate in the great universal journey of learning. As a singer I greatly admired sang: if your cup is already full it's bound to overflow.
linguae|1 year ago|reply
Over the years I’ve become disappointed and disillusioned. We have nothing like the Bell Labs and Xerox PARC of old, where researchers were given the freedom to pursue their interests without having to worry about short-term results. Industrial research these days is not curiosity-driven; instead it is driven by finding immediate solutions to business problems. Life at research universities isn’t much better, with the constant “publish-or-perish” and fundraising pressures. Since the latter half of January this year, the funding situation for US scientists has gotten much worse, with disruptions to the NIH and NSF. If these disruptions are permanent, who is going to fund medium- and long-term research that cannot be monetized immediately?
I have resigned myself to the situation, and I now pursue research as a hobby instead of as a paid profession. My role is strictly a teaching one, with no research obligations. I do research during the summer months and whenever else I find spare time.
dennis_jeeves2|1 year ago|reply
What you stated is true, but my disappointing observation is that the people with wealth/power are only marginally smarter than the rest of us on the topic you mentioned. And I suspect that even with a rich benefactor, pulling that off would not be easy. It takes a threshold number of people with a holistic view of things to pull off what you mention, i.e. nearly free basics of life. Check my profile etc. - some of what I wrote may strike a chord with you.
Also, the proponents of Technocracy (Hubbert etc.) about 100 years back essentially touched on the subject you raise. Note: the word "technocracy" has a different connotation today.
WalterBright|1 year ago|reply
For some perspective, bone evidence of pre-Columbian Indians showed that they regularly suffered from famine. There was also the constant threat of warfare from neighboring tribes.
The American colonists didn't fare much better; their bones show evidence of extreme overwork and malnutrition.
sixdimensional|1 year ago|reply
If I may be so bold as to refer to you as "my friend" (having never met you)...
My friend, I think I understand what you mean. I am about the same age too.
I would like to propose an idea to you - and it is something I have been exploring very deeply myself lately. Maybe the thing we need to start spending our time on is exactly this meta problem. The meta problem is something like (not perfectly stated): we as humans have to decide what we value, so that we can continue to give our existence purpose in the future.
I don't think AI is going to be the be-all-end-all, but it is clearly a major shift that will keep transforming work and life.
I can't point yet at a specific job, or task - but I am spending real time on this meta problem and starting to come up with some ideas. Maybe we can be part of what gets the world, and humans, ready for the future - applying our problem solving skills to that next problem?
I mean all of the above in 100% seriousness, and I am willing to chat sometime to compare notes, if you're interested.
ferguess_k|1 year ago|reply
But I don't really have any time. There are so many things to do, to learn. Younger people who happen to stumble upon this reply: please, please prioritize financial freedom if you don't have a clear objective in mind -- and from my observation, many people don't have a clear objective when they are in their 20s! If you can retire around 35-40, you have ample time to pursue any project you want for the rest of your life.
jpmattia|1 year ago|reply
Putting in a plug for MIT OCW 8.962 [1]. I also had this itch, and was able to find time during the pandemic to work through the course (at about 1/2 speed). But true to what others are saying, life intruded for the last few lectures, so I still have some items on my todo list. I thought Scott Hughes laid out the math with terrific clarity, with just the right amount of joviality. It is not for everyone, but if you have a suitable background it may turn "scratching an itch" into the obsession it has become for me.
And to make the obligatory on-topic comment: I'm 61yo. Now get off my lawn.
[1] https://ocw.mit.edu/courses/8-962-general-relativity-spring-...
ljm|1 year ago|reply
Hell of a lot more difficult now when I need to work and don't really have the same amount of time to dedicate to studying. Hell of a lot easier when you're younger, your whole life basically revolves around the education, and any job you have generally fits around your school life rather than the other way round.
matwood|1 year ago|reply
My tricks, which I don't always follow: work out every day, get enough sleep, and stay off most short-form social media. I realized that when I was on short-form social media, it would zap a lot of time and kill any focus I had.
whiplash451|1 year ago|reply
This advice could really backfire badly if taken literally by young people.
Optimizing for financial reward early in your career could be the surest way to end up in a dead end from a mission/purpose/domain/skills perspective.
20 years later, you realize you burned two precious decades accumulating money that, honestly, does not help you at all in making sense, or use, of the next two.
Luc|1 year ago|reply
Agreed on prioritizing financial freedom.
yojo|1 year ago|reply
Grinding is soul-sucking, and having someone at home was the only way I made it through the roughest patches.
I semi-retired in the 35-40 range, but if my choices were being retired and single or working but with my family, I’d 100% take the latter.
tayo42|1 year ago|reply
Physics and math in a formal setting like school are rigorous, not fun. I found it really hard to stay motivated. I don't know how I would practically use that knowledge; I would never contribute anything scientific. It would take years of grinding through foundational math and physics to get there.
kamaal|1 year ago|reply
This. To Infinity.
Please prioritise financial freedom. I missed a few steps, but as I get old, I realise this is the biggest blocker to almost anything.
Money == Free time.
Bloating|1 year ago|reply
You olds have all the money, all the time.
stego-tech|1 year ago|reply
Think about the tech nerds (me) who never learned how to cook, and are in their thirties. Or lawyers and doctors who are sick and tired of feeling like they don’t understand how computers work, and want to learn. Or an accountant who loves maths, and wants to get into the scientific side of the field. Or the homemaker who wants to re-enter the workforce now that their kids are grown, and wants to pick up carpentry and welding to become a tradesperson.
If cognitive decline comes from failing to exercise our minds regularly, then the cheapest solution is free education for life, to encourage as many people as possible to keep learning new skills and remain cognitively engaged.
bloopernova|1 year ago|reply
She still got Alzheimer's and died a couple of years later.
She had multiple incidents that she hid because she was too scared to find out, and too stubborn to lose her ability to drive. She could have had some treatment if she'd approached a doctor earlier.
Alzheimer's is utterly evil. Robbing people of their unique spark, killing the person before the body dies.
Sorry for the rant
dkarl|1 year ago|reply
It depends on the task, but overall, for the work I do as a software developer, yes.
I would say I have less energy, but I need less energy, and I produce better results in the end. I'm better at anticipating where a line of work will go, and I'm quicker and better at adjusting course. There are a lot of multi-hour and multi-day mistakes that I made ten and twenty years ago that I don't make now.
The raw mental energy I had when I was younger allowed me to write things I couldn't write now, but everything I write now is something that other people can read and maintain, unlike twenty years ago. It's very rare that writing a large, clever, intricate mass of code is the right answer to anything. That used to frustrate me, because I was good at it. I used to fantasize about situations where other people would notice and appreciate my ability to do it. Now I'm glad it's not important, because my ability to do it has noticeably declined. In the rare cases where it's needed, there are always people around who can do it.
Another thing that is probably not normal, but not rare either, is that the energy I had when I was young supercharged my anxiety and caused me to avoid a lot of things that would have led to better outcomes, like talking to other people. I'm still not great (as in, not even average for an average human, maybe average for a software developer) but I'm a lot better than I used to be.
lapcat|1 year ago|reply
> Individuals with above-average skill usage at work and home on average never face a skill decline (at least until the limit of our data at age 65).
runjake|1 year ago|reply
As I get older (now in my 50s), I find myself reflecting on how many aspects of my life and decisions are operating on autopilot. I figure it's worse now with social media where people are constantly bombarded with dopamine hits, while boredom and idle thoughts have largely become things of the past.
Perhaps counterintuitively, I am trying to break this pattern and consciously engage with my experiences by asking a few basic questions, such as:
- What am I seeing here?
- What's going on?
- What am I missing?
- How can I approach this differently to achieve the same or better outcomes?
Additionally, I am making a concerted effort to notice more new details during routine tasks, like commuting or shopping. I can't count how many times I've discovered something new and interesting on my work commutes. Actually, I can: it's every time.
Edit: Also spending more time with long-form content over short-form, be it reading or watching videos. It forces me to consider a topic for a much longer period. Short-form knowledge is a trap, unless you have some system that hits you with high rates of repetition (e.g. Anki).
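As a rough sketch of what "high rates of repetition" means in practice, here is a toy version of the interval scheduling behind tools like Anki (the function name and constants are illustrative, not Anki's actual algorithm): each successful review stretches the next review interval by an ease factor, and a lapse resets the card.

```python
def next_review(interval_days, ease, recalled):
    """Toy spaced-repetition step: success stretches the interval
    by the ease factor; a lapse starts the card over at one day."""
    if not recalled:
        return 1, max(1.3, ease - 0.2)   # reset, and mark the card 'harder'
    return max(1, int(interval_days * ease)), ease

interval, ease = 1, 2.5
for _ in range(3):                       # three successful reviews in a row
    interval, ease = next_review(interval, ease, recalled=True)
    print(interval)                      # intervals grow: 2, 5, 12
```

The point is the shape of the schedule, not the exact numbers: reviews get exponentially sparser as material sticks, which is what makes the repetition sustainable over years.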
Yxven|1 year ago|reply
My short-term memory is falling off a cliff. What do I need to do to prevent that from getting worse? Are there any other bases I need to cover that I don't know I'm missing?
optymizer|1 year ago|reply
I never really needed determinants in my life until I tried moving a spaceship towards another object. Trying to render realistic computer graphics gets you into some deep topics like FFTs and the physics of light and materials, with some scary-looking math, but I can feel my mind sharpening with each turn of the page in the book.
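To make the spaceship/determinant connection concrete, here is a minimal sketch (the function name and setup are my own, not from any particular engine): in 2D, the sign of the determinant of the matrix formed by the ship's heading and the direction to the target tells you which way to turn.

```python
def turn_direction(heading, to_target):
    """Sign of det([[hx, tx], [hy, ty]]) - the 2D cross product -
    says which way to rotate the ship toward the target."""
    det = heading[0] * to_target[1] - heading[1] * to_target[0]
    if det > 0:
        return "counter-clockwise"
    if det < 0:
        return "clockwise"
    return "aligned"

# Ship pointing along +x, target up and to the right:
print(turn_direction((1.0, 0.0), (1.0, 1.0)))  # counter-clockwise
```

The same determinant also gives the turning angle via its magnitude, which is why it keeps showing up once you start steering things around a plane.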
ChrisMarshallNY|1 year ago|reply
I write code, pretty much every single day, and also, solve problems, every single day (7 days a week).
I think solving problems is important. Not just rote coding, but being presented with a bug, or a need to achieve an outcome, without knowing the solution, up front, is what I like.
Basically, every single day, I'm presented with a dilemma, which, if not solved, will scrap the entire project that I'm working on.
I solve every one (but sometimes, by realizing it's a red herring, and trying alternate approaches).
soneca|1 year ago|reply
> ”Cross-sectional age-skill profiles suggest that cognitive skills start declining by age 30 if not earlier.”
and
> ”Two main results emerge. First, average skills increase strongly into the forties before decreasing slightly in literacy and more strongly in numeracy”
Does this mean that this study contradicts the popular common understanding that cognitive skills decline after 30? Or am I missing something?
For me, personally, it feels like a more impactful finding than the “use it or lose it” one.
standardUser|1 year ago|reply
There are lots of things that can make an even bigger impact, like moving to a new place, starting a new career or school, or a new relationship. But those are things that sometimes happen only a handful of times in our entire lives.
Everything else, I find, eventually becomes routine, no matter how stimulating it is at the start. Not that we shouldn't try! Some stimulation is a whole lot better than none, and I have a terrible feeling that many people get little to no stimulation for weeks and months at a time (beyond a new TV show or podcast or political drama).
mmooss|1 year ago|reply
Comparing cognitive abilities between older and younger people fails to control for the inputs - behavior, experience, etc. Try the same inputs (using some big generalities):
* Exploration: Younger people love to explore, even just for exploration's sake, and are also compelled to try things - and they also fail. Exploration is their mode, because so much of the world is new to them, because doing something new and innovative is socially admired, and especially because so many major changes happen - leaving home, serious romantic relationships, a first job, etc. A lot of that happens, ready or not.
* Learning: Similarly, younger people are compelled to learn lots of very challenging things, whether they want to or not; they are compelled to use cognitive skills that they are uncomfortable with. Their job is to learn, daily, for 12-16+ years. Remember school? Remember your early years at work, when you had little choice of what you did? Remember struggling with all those things?
* Playing: Young people love to play and are socially admired for playing better and more creatively.
What, you're past all that? Nobody is going to make you study things you're not interested in? Don't want to make any big changes? Dignity too big to play? Ego too big to explore and to fail? When you're older, you can say no, and 99.99% of people (I think that's about accurate) take advantage of that and refuse to do or even talk about things they aren't already comfortable with. Does all this sound too hard? Then don't complain about losing those skills.
I think a big part of the problem is the same that affects CEOs - there is nobody to hold them to account.
marstall|1 year ago|reply
Right now I'm reading through a college textbook on the biology of learning and memory with ease and good retention. Never got this deep into any subject in my school years.
andrei_says_|1 year ago|reply
Oh, and a question at the back of my mind: wouldn't using AI to minimize the time all of us spend in the struggling-to-figure-something-out zone lead to earlier decline on a massive scale?
jader201|1 year ago|reply
I can imagine at the very least it won't hurt, and intuitively it makes sense. But I'm not sure studies have been done specifically to understand how board gaming - or the problems being solved with board gaming - helps with cognitive skills.
Curious if others that are closer to this field have thoughts.
risyachka|1 year ago|reply
Because it literally speeds up your cognitive decline as your brain shuts off and offloads all the heavy lifting.