> The other thing I encountered for the first time at IBM was version control (CVS, unfortunately). Looking back, I find it a bit surprising that not only did I never use version control in any of my classes, but I’d never met any other students who were using version control. My IBM internship was between undergrad and grad school, so I managed to get a B.S. degree without ever using or seeing anyone use version control.
I love this bit. This is extremely true even today. Most students at my university, and all of my university's classes, neither use a VCS nor understand its benefits. This is crazy on a whole different level. I hate to say it, but it is in fact 2016, and it should not even be a question that git (or something like it) gets used on every project, no matter how small.
Lots of things surprised me when I first started working as a developer after university, including version control. For me, the biggest surprises were:
- programming is a team effort
- writing (non-code) is important
- software is never done
- few clever algorithms used in day-to-day work
- the complexity comes from aggregation of many simple things, not one complicated thing
I'd wager that most of the students you went to school with didn't teach themselves programming before they went to university, and probably didn't have a computer science class in high school.
It is difficult to understand the usefulness of a VCS when the longest project you have worked on amounted to 5 lines of code and you can't produce a fizzbuzz program, let alone understand the difference between a class and an object.
If you were self taught, or have been away from the newbies for too long, you don't always remember what it was like when everything was new. Wrapping your head around how to construct a program that does what you want it to do (or even figuring out what you want a program to do in the first place) is a difficult hurdle to jump. Every extra hurdle you throw in front of them (git, debuggers, even the compile step) makes it harder.
The earlier you require those hurdles to be jumped, the earlier you will filter out students. I believe most people can wrap their head around computational thinking and computer science and I would hate to lose them because of the tools.
You don't need a VCS to understand classes & objects. You don't need a VCS to understand Big-O, you don't need a VCS to understand Trees or Graphs or Maps or Stacks or Queues. You don't need a VCS to understand Computer Science.
And once you know what a Tree is... then I can explain to you how git works.
Things are slowly changing, but it's still heavily dependent on the university. I graduated in 2015, and we're now offering freshmen a "tooling class", which introduces students to some popular workflows.
The curriculum is changing pretty frequently, but git is covered in depth, along with keeping code on GitHub, pull requests, etc. Other VCSes, like svn, are discussed briefly. The class also goes over popular IDEs and editors (vim, emacs, Sublime...).
The class isn't led by a professor, but by a group of older students that have encountered these things in internships. Maybe that's what makes the difference?
I remember taking a few C++ classes in high school. It would have helped if our teacher had spent even a few minutes explaining what version control is. Their instructions never went beyond "You should make sure you save your work periodically." That brilliant piece of advice immediately yielded the following problems for us:
1) Sometimes after compiling under Visual Studio we would return to look at our source code and see that it had been replaced with assembly. Teacher's response? "I guess you'll have to write it again."
2) Saving periodically to a single file is about as useful as not saving at all when you have borked your program somehow. Now you get to spend the next few hours commenting out random lines and inserting `cout<<"Here"` lines into the code to see how far along it got.
We in the class eventually decided that the "safest" way to do it was to continually save the files under new names in order to create a poor man's VCS. But that in itself was difficult to remember after more than a few days of working on it, as you'd have to remember which version you wanted to work on: assignment.cpp, Copy of assignment.cpp, assignment.bkup.cpp, assignment(Last version that worked).cpp, etc.
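For comparison, the git workflow that replaces that renaming scheme fits in a handful of commands. This is a minimal sketch: the file name, file contents, and identity settings are all made up for illustration.

```shell
# One-time setup: start tracking the assignment.
mkdir assignment && cd assignment
git init -q
git config user.email "student@example.com"   # placeholder identity
git config user.name "Student"

# Save a known-good snapshot instead of "assignment(Last version that worked).cpp".
echo 'int main() { return 0; }' > assignment.cpp
git add assignment.cpp
git commit -qm "minimal working version"

# Break the file, then recover the last committed version.
echo 'oops, borked it' > assignment.cpp
git checkout -- assignment.cpp
```

After that, `git log` lists every snapshot with its message, so "which version did I want?" becomes a readable history instead of a pile of file names.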
I would be appalled these days if CS classes don't spend at least a single lecture discussing how to use Git.
I disagree, except maybe in a class specifically focused on real-world software engineering. Teaching somebody to use git takes an afternoon, teaching somebody to understand programming and computation takes years.
In 1985 I bought PVCS for my personal projects. It was the second software package I paid for; BRIEF (editor) was the first. One of my CS profs bought a copy of PVCS shortly after I showed him my workflow, and everyone else thought I was crazy for spending money on something so useless.
Same thing with me. I graduated without ever seeing version control. Then again, I recall a professor responding to "How do you debug?" with "I don't need to debug. I mathematically prove my code first, and then just write it once."
It was certainly true when I was at university. I encountered CVS (sadly the best free VCS at the time) through a friend, but pretty much nobody else used version control. A few students worked out for themselves that they needed something like it, and ended up with the chaotic system of renaming their files with a version number at the end.
It's sad that git (or its equivalents) doesn't feature more prominently these days. It's as important a tool as compilers, debuggers, and editors (although I don't remember being taught how to use those properly, either).
My university has several required CS classes where students must use some VCS (an SVN server is provided), as part of their grade. They're nominally software engineering classes, but really they're about finishing a project in a group setting, usually with a client that is not part of the class. But my school is also less science-oriented than many.
I disagree. VCS use in an early stage of a project can lead to less elegant and coherent systems, because you treat everything as a series of patches from day 1.
The greatest works of humanity were written without a VCS (novels, operas). Using a VCS is only mandatory for multi-person projects.
What you're saying is like being surprised that there were no smartphones 15 years ago. For me it is also unbelievable to work without a source control system, but just 10-15 years ago nobody except large companies cared about it. Times change and fly...
I once chatted with a professor whose class I had already taken, and he told me that for the current iteration of the class they were using git as a submission system for programming assignments. Apparently more than half of the students failed the first assignment because they couldn't figure out git basics.
"I was hesitant about taking the job because the interview was the easiest interview I had at any company, which made me wonder if they had a low hiring bar, and therefore relatively weak engineers. It turns out that, on average, that’s the best group of people I’ve ever worked with."
Maybe, or quite probably, they weren't filtering out a ton of false negatives.
The best people I've ever hired all looked the same:
1. Had a GitHub account with actual code in it.
2. The code was clean, lightly commented, and brief without being terse, regardless of the paradigm.
3. They breezed through the interview because the questions tested their ability to actually build the thing they said they could build. No silly games. A light question that touches on data structures and O time, but nothing you'd need to crack open your old college books over.
I feel bad for people starting out, but senior / intermediate devs are just such a better deal. You pay double and you get about four or five times the work.
I've only met a couple people from the old Centaur group, but they were both excellent engineers. I would love to learn how they actually hire. It's possible that the author found it the easiest at least partly because it was closest to his level of expertise; he was already more comfortable at the software/hardware border than most, judging by the experiences listed prior to that point.
Amazing little essay. I love his intellectual honesty, and his lucid view of his career path. While he claims to have no takeaway, I think one obvious one is that, barring disaster, you'll eventually end up where you belong no matter what path you take. You'll leave the bad jobs (maybe later than you should) and you'll get positive reinforcement for things you do well. Every time I've had "career envy", I've tried the alternate path only to discover why I'm really where I currently belong. But I'm glad for those alternate paths, as they gave me that reinforcement I needed.
Crazy also how similar our path to learning programming was. Looking back, all those hours of fiddling with BASIC in high school, or with Pascal at school really didn't teach us all that much. Like him, my internships mostly taught me meta-lessons rather than actually valuable skills. Like him, in my education I followed the path of least resistance, or rather the path of "most options left to explore since I have no clue what I want to do", and I feel I'm lucky I ended up where I did. Like him, I fell in love with math way late when I finally saw it wasn't about rote application of arbitrary techniques (abstract algebra is what opened my eyes) like we're taught in school.
This sentence, buried deep in the essay, is brilliant: "a common failure mode is that you’re given work that’s a bad fit, and then maybe you don’t do a great job because the work is a bad fit. If you ask for something that’s a better fit, that’s refused (why should you be rewarded with doing something you want when you’re not doing good work, instead you should be punished by having to do more of this thing you don’t like), which causes a spiral that ends in the person leaving or getting fired."
This is a wonderful essay that 1) shows its messy work, and 2) outlines the deep reward of metacognition.
I have a very smart friend who is actually a great writer. But if she doesn't nail the first draft, she gives up, saying, "I suck at writing". Writing is rewriting. Similarly, good learning is having a grasp of the engine at your disposal through metacognition.
> In retrospect, I should have taken the intro classes, but I didn’t, which left me with huge holes in my knowledge that I didn’t really fill in for nearly a decade
After joining industry immediately after high school, I dismissed the value of formal education and selected a major with little consideration (I was unaware that CS even existed). I've recently joined a team where all my colleagues, many of whom received their degrees from prestigious institutions, majored in CS, and imposter syndrome haunts me. Many, though, are surprised.
Although I have my B.S. in information systems, I'm debating whether to return to academia -- obtain a second B.S. in CS or a stretch for a masters in CS (self study the fundamentals, which I'm currently doing, prior to starting the program) -- or continue the self study route to fill the missing gaps in my knowledge.
You’ll get over it. However, you will forever be cursed with having to keep your whiz-kid credentials up to date, as you will never have that degree to help wedge the door open for your next opportunity. Ultimately, if programming is what you do (yes, I know CS is not just about coding), it doesn’t really matter, because the rubber meets the road at some point, and either you can code, or you can’t.
I'm also an EE starting my PhD doing research in hardware/software co-design and circuit design. I'm also heavily interested in CS in general, so I'm planning to take as many CS courses as possible.
One thing I'm trying to do is keep myself current when it comes to software development. I think it's good to have a backup skill that I can fall back on in case my PhD doesn't work out.
It's good to know that someone like Dan Luu went through ups and downs before getting to where he is now.
Very interesting read. Especially for someone like me who is on the verge of completing his undergraduate education. I have always feared that life might be too boring in the industry. This provides some fresh perspective. Also thank you for pointing out the downside of reading 50 Manning books. I wish wherever I end up working I manage to carve out time to continue reading and educating myself like you did.
I was reading the "bad stuff" section and I stumbled upon what he said about the Joel Test:
> The Joel Test is now considered to be obsolete because it awards points for things like "Do you have testers?" and "Do you fix bugs before writing new code?", which aren’t considered best practices by most devs today.
Can anyone explain why having testers isn't considered a best practice by most devs today?
My programming story: I started with BASIC on an old pocket computer. I typed in games from a manual without really understanding how the code worked. In eighth grade, I programmed a bit in QBASIC in an informal computer class. Then, for years, I didn't program much except for a formula on my TI-82. I never learned the TI-82 programming language well enough to do anything more complex.
Towards the end of high school, I got Internet access and learned HTML to build a web page. I took AP Computer Science (in C++) in my senior year and scored 5 on the exam. I majored in computer science at a top math/science university that isn't as well-known for CS as some other schools. I turned down some of the higher ranked CS schools, which was probably a mistake, especially as I wasn't even that interested in the other sciences. I'm more into languages, and I was interested in CS because of its creative and entrepreneurial potential.
Fast forward to now: I started programming early and majored in CS, and I can write classes, objects, and functions, but I still don't really "get" programming. My algorithms course in college was all math and proofs that I didn't really understand. I've since gone through algorithm MOOCs and implemented some algorithms, but I still can't really apply them. My work involves some programming, but more of it is Linux administration. (I also don't really get how to deal with hardware because of my problems with anything physical.)
> Tavish Armstrong has a great document where he describes
> how and when he learned the programming skills he has. I
> like this idea because I've found that the paths that
> people take to get into programming are much more varied
> than stereotypes give credit for, and I think it's useful
> to see that there are many possible paths into programming.
Okay. Yes that's useful. Let's see what path you took...
> Luckily, the internet was relatively young [...]
There ya go. Look no further!
Mine own story is a little different... The author had local peers, I did not. The author made no mention of an old hand-me-down C64 or Tandy 1000s in his kindergarten classroom...
But, getting online in the mid-nineties? Check. That's huge.
TCP/IP was explained to me by some random gamer in a chatroom, long before I ever thought to "google" it.
(Speaking of paths... Lycos --> Altavista --> Still Altavista for a long time as I resisted the change to Google --> Google --> DDG)
One way to learn to program is the urge to teach the machine some principle you might find nice to be realised by a machine. Some things are opposed to entropy, some things are there to create order (always with a net gain in entropy, but nonetheless). Therefore, the machines and us seem to be on the same side of nature. Even the whole ecosystem of the planet is a miraculous emergence of high level structure.
So you learn to program, if you want to program some. Print that on the next calendar.
TL;DR
Naturally talented, hardworking individual downplays everything he has ever done and tries to chalk it all up to chance. I found this all very abrasive.
> no one’s going to stop you from spending time reading at work or spending time learning
What? You've lived a truly blessed life, Dan Luu. I've observed the opposite, pretty consistently. I've been working as a programmer for 25 years and I've found, across nine separate employers (and lost-track-of-how-many different supervisors) that spending any appreciable time reading (even a book about Angular when you're supposed to be learning Angular) will become a management issue. Everywhere I've ever worked has expected learning to be on your own time. Don't believe me? Put "read three chapters of a book" on a status report and see how many levels of management show up at your desk to micromanage your time.
Weird. I have found this to basically never be the case. Even some of the worst managers I've had have supported learning on the clock, through reading materials, tutorials, or otherwise. I've even had some jobs where learning time was a regularly slated part of my work week, and in some cases was allowed to take classes which cut into work a bit. My current employer has a library of technical books and educational material that any of us are free to rent and learn as we go.
A standing directive that I've got is "get your work done; other than that, I don't care how you spend your time". I regularly put something like "researched current code-signing best practice" into my status reports (and have for the past 8 years, when appropriate).
My time at home is mine. If I'm reading a technology book, it's usually something I'm curious about and don't have a direct use for in my professional life.
I'll consider myself very lucky then. At my firm we're strongly encouraged to take 4 hours per week to dedicate to learning something new or bettering our existing skill set. You simply let your team lead know your current objectives for the month and we're given access to pluralsight/codeschool etc. I honestly didn't know how much fun work could be while still being work until I started here.
Does the same hold true if you replace "read three..." with "research"? Perhaps it's specific to my industry (data), but I spend a considerable amount of time reading about new techniques and approaches to make our codestack more efficient.
That's not my experience at least in a "DevOps" role. In fact a large portion of our time is spent reading/learning documentation, and random websites. Getting punished for learning on the job, would immediately lead to me looking for a new job.
I'm quite surprised by this, since most programming jobs involve solving problems you don't know how to solve on a regular basis. Reading, whether it's books, blogs, or Stack Overflow, is a way to help you solve those problems.
If you are a JavaScript programmer and spend the work day reading about Elixir; that is a problem. It likely doesn't apply to work so that should be on your own time.
Places I've worked have had their own mini-libraries. I have some kind of documentation loaded up on at least one of my monitors at pretty much all times.
I don't spend significant amounts of time doing nothing but research however. The problem might not be "read three chapters of a book" on a status report, it might be having only that on a status report.
> We tried BASIC, and could write some simple loops, use conditionals, and print to the screen, but never figured out how to do anything fun or useful.
Luckily my BASIC books had most of their examples starting with SCREEN 1. Drawing images programmatically happened to be fun and useful, I learned the hard way by retyping examples and then somehow began modifying those.
My own bummer was the Windows epoch. I could do VB, but I never otherwise grasped Windows programming, because any program would have so much IDE-generated boilerplate that was totally meaningless to me, and I could not work with that.
I could only resume when I learned proper WinAPI later in a university course, but by then I had switched to Linux, which is the best IDE there is.
I swear I used to have a flowchart I'd blindly follow, and god help me if there was a conflict record :/
Version control gives you VERSION CONTROL and a backup.
Try something like "reviewed industry publications for new paradigms and optimization techniques."
Can't say I've ever had a problem with that in my two decades in the industry. Learning is expected and mandatory.
At my first job (from high school) my first instruction was "go to company library get the book on fortran and learn it".