top | item 32261590

Ask HN: Should a professional software dev know how many bits go in a byte?

27 points| ssdspoimdsjvv | 3 years ago | reply

Would you say that if you program computers for a living you should always know there are (usually) 8 bits in a byte? Or is it perfectly fine not to know that / can it be considered specialized knowledge if you only program high level software, e.g. front-end web or CRUD business applications?

126 comments

[+] joe-collins|3 years ago|reply
Even if someone doesn't need to work at a sufficiently low level as to need that information regularly, not knowing that suggests a severe lack of fundamentals and hints at trouble on the horizon. Someone could play at chemist and work with raw materials and follow recipes without knowing about electron orbitals, but without that understanding they may make disastrous choices if given insufficient oversight and their capacity for problem solving is limited.
[+] popcorncowboy|3 years ago|reply
> not knowing that suggests a severe lack of fundamentals and hints at trouble on the horizon

Exactly this. Strictly speaking you really don't need to know how many bits are in a byte, but it'd be difficult not to have picked up this kind of information having spent any time in the field. The absence of this knowledge is a strong indicator of shallow expertise.

[+] woojoo666|3 years ago|reply
What is so fundamental about the fact that a byte is 8 bits? That's mostly for historical reasons. Likewise, an int is traditionally 32 bits, but Javascript and Python programmers don't need to worry about that. If a programmer doesn't know that a byte is 8 bits, they probably just forgot it since it wasn't necessary for them to know.
[+] dusted|3 years ago|reply
Emotionally, I want to say yes, they should know how many bits are in a byte, how many bytes are in a kilobyte (1000) and in a kibibyte (1024), what big- and little-endianness are, and at least some part of the computing history that has led us to where we are now..

But rationally, I know that a lot of programmers don't care about computers, or even about programming and to them it's just a job.. And I guess I have no right to judge that..
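To make the endianness point above concrete, here is a minimal Python sketch (using the built-in `int.to_bytes`) showing the same 16-bit value serialized in each byte order:

```python
# 0x0102 (decimal 258) fits in two bytes: 0x01 and 0x02.
value = 0x0102

big = value.to_bytes(2, byteorder="big")        # most significant byte first
little = value.to_bytes(2, byteorder="little")  # least significant byte first

print(big)     # b'\x01\x02'
print(little)  # b'\x02\x01'
```

Big-endian is the convention on the network wire; most desktop and mobile CPUs are little-endian, which is exactly where the confusion bites.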

[+] biztos|3 years ago|reply
For many of us a kilobyte will always be 1024 bytes despite the newfangled fashion for “kibi” this and “gibi” that.

Fortunately the people who adhere to the “ibi” cult are usually sticklers for compliant capitalization, so when I see “kB” I know it Must Be One Of Them.

Now get off my lawn, all 1000 of you!

[+] cnity|3 years ago|reply
For those wondering: big-endian means you crack an egg from the bigger end.
[+] mikewarot|3 years ago|reply
When I see this come up, I always associate it with hard drives sold by marketing folks who just couldn't let a Megabyte equal the traditional value of 1,048,576 bytes... they had to sell that extra 4% by changing the meaning of a word.
[+] hericium|3 years ago|reply
I wouldn't call someone who doesn't know the basics "a professional". And this is basic knowledge not only for programming but for networking, too.

This theoretical person from the title does use the internet and would like to know how many megabytes per second can be pushed through their n-megabit link, no?

[+] basilgohar|3 years ago|reply
What I have observed, as more and more people from a wider background enter into computer science and software engineering, is that it's not so much what they know or don't know that matters, but rather, what is their attitude towards learning something they don't already know.

If they embrace new knowledge and appreciate those who bring it to them, then I couldn't think of someone better for this art. If, on the other hand, they dismiss anything whose value they don't immediately see, even at the behest of someone more senior or experienced, then I would say that is the red flag.

[+] jb1991|3 years ago|reply
Absolutely they should know. This comes up so many times even in high-level programming that anyone who’s doing anything beyond very trivial student applications should know this stuff.
[+] brunooliv|3 years ago|reply
I think any person remotely interested in computers or programming can and should "know" that a byte is 8 bits.

But... does it matter?

Unless you are working with hardware or in embedded systems more directly, it ABSOLUTELY does not matter at all.

What matters is to have a critical eye for numbers when they matter and/or are "off". That is much more valuable for the day-to-day work.

"Damn, before we could easily hit this endpoint X times per second and after that refactor and library change it's down to X/5? We need to have a look".

Sizes of DB tables, latency of endpoints, and being able to properly read stacktraces or logs from a third-party service are much more valuable.

PS: people writing in the comments about not knowing how text encoding works, about sending data through sockets, etc... Folks, you know how to do these things because they are required of you at your day job; you probably had exposure to proper patterns and got help from experienced people within that particular niche at your company. That's all. Knowing 1 byte is 8 bits is absolutely meaningless knowledge on its own.

I needed to know these "standard sizes" for my CS degree, sure. Do they matter for my day-to-day work? Not in the slightest.

[+] Kim_Bruning|3 years ago|reply
Do you ever inexplicably encounter the number 255 (or 256)?
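(Those numbers fall directly out of the byte; a quick Python illustration:)

```python
# 8 bits give 2**8 = 256 distinct values, so an unsigned byte runs 0..255.
assert 2 ** 8 == 256
assert max(range(256)) == 255

# Which is why 255 keeps turning up: RGB channel maximums, IPv4 octets,
# VARCHAR(255) columns, and so on.
print((255, 255, 255))  # the familiar "white" in RGB
```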
[+] mjbeswick|3 years ago|reply
Absolutely they should, as without knowing what a byte is there is little chance someone would know how data is represented in computer systems.
[+] smcn|3 years ago|reply
This may be an outdated opinion based on the people I'm seeing leave bootcamps. I think that underlying architecture is so heavily abstracted these days that you do not need to really know _how_ a computer works in order to make something functional.
[+] TheCoelacanth|3 years ago|reply
That's like if a professional writer didn't know how many letters are in the alphabet.

It has no bearing on actually doing the job, but how on earth did you manage to learn the necessary skills without learning that?

[+] AnimalMuppet|3 years ago|reply
It rarely has bearing on actually doing the job. But all abstractions leak. Once in a while, the number of bits in a byte leaks out.
[+] seren|3 years ago|reply
It is important to understand how many megabytes/s are sent through a 100 Megabit/s Ethernet link, for example (or whatever communication link).
[+] praash|3 years ago|reply
This is the strongest point that comes to my mind. I feel that most consumers are completely unaware that their perception of transfer speeds and file size differ by almost an order of magnitude. I absolutely hate how marketing takes advantage of this. Imagine the frustration after thinking that your 10 Mbit/s connection can download a 6 GB game in just ten minutes.

Many internet operators advertise speeds as "1Gb/s" or even "100M". Add in the confusion between 5G and 5 GHz WiFi.
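The arithmetic behind the parent's frustration is a one-liner; a hedged Python sketch (the function name and the nominal link speed are made up for illustration):

```python
def download_seconds(size_gb: float, link_mbit_s: float) -> float:
    """Time to move size_gb gigabytes over a link_mbit_s megabit/s link."""
    bits = size_gb * 1e9 * 8          # gigabytes -> bytes -> bits
    return bits / (link_mbit_s * 1e6)  # bits / (bits per second)

# 6 GB over a "10 Mbit/s" connection:
print(download_seconds(6, 10) / 60)  # 80.0 minutes, not ten
```

Mistaking Mbit/s for MB/s makes the estimate exactly 8x too optimistic, before any protocol overhead.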

[+] pmontra|3 years ago|reply
Considering ACK traffic when doing back-of-the-envelope estimates, I divide by 10: 100 Mb/s = 10 MB/s (on Ethernet; Wi-Fi, who knows). But that introduces a new problem: how many developers have an idea of how networking works? A smart developer I worked with once told me that he only knows how to turn Wi-Fi on and off.
[+] joshka|3 years ago|reply
Yes, regardless of how much these are abstracted in modern software development. Size is an important aspect of many things in a computer. It impacts:

- storage capacity

- speed of communication

- speed of computation

- representation of numeric values

- encoding of strings

Bits and Bytes are the fundamental units of size and representation, and knowing them is critical to understanding most other things that build on top of them.

For high level software, you see this come up in:

- integer limitations (e.g. database identifiers)

- internationalization of user interfaces

- floating point numeric comparison and other mathematical operations

- communications over various transport layers

- speed of various abstractions that are used to make the higher level software fast / good
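The first item in that list, integer limitations on database identifiers, is easy to make concrete; a small Python sketch (assuming a signed 32-bit `INT` primary key, the common SQL default):

```python
# Bounds of a signed 32-bit integer: one bit for sign, 31 for magnitude.
INT32_MAX = 2 ** 31 - 1
INT32_MIN = -(2 ** 31)

print(INT32_MAX)  # 2147483647 -- roughly 2.1 billion rows
                  # before an auto-increment INT id overflows
print(INT32_MIN)  # -2147483648
```

Several well-known outages trace back to exactly this limit being hit in production tables.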

[+] vbezhenar|3 years ago|reply
This is basic knowledge. Of course one should know how many bits go in a byte. It's like asking if one should know how to replace a light bulb. Not just software developers; every person should know that. It's like knowing that there are 1024 bytes in a kibibyte or 1000 meters in a kilometer.

These days one should even know how exactly a double is represented in hardware, as it's the main numeric type in JS, and one should clearly understand the bounds of integer numbers which can be represented in a double without precision loss.
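That bound is 2^53, which follows from the 53-bit significand of an IEEE 754 double; a Python sketch (Python floats are the same doubles JS uses for all numbers):

```python
# JS's Number.MAX_SAFE_INTEGER is 2**53 - 1: the largest integer n such
# that n and n+1 are both exactly representable as a double.
MAX_SAFE = 2 ** 53 - 1  # 9007199254740991

assert float(MAX_SAFE) == MAX_SAFE           # still exact
assert float(2 ** 53 + 1) == float(2 ** 53)  # precision lost: the +1 vanishes
```

This is why sending 64-bit database ids to a JS frontend as bare numbers, rather than strings, silently corrupts them.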

[+] WelcomeShorty|3 years ago|reply
Should a programmer also know about FS block sizes? About Ethernet frame sizes? About multithreading? Memory access bandwidth? CPU architecture? Stored procedures in a DB? Current encryption schemes?

I would say it depends totally on the tasks of the coder. If you fiddle around with some ASP: NO. If you are squeezing every last bit for Netflix performance: YES.

Our domain is so incredibly huge, deep, wide & diverse that I do not expect regular coders to be familiar with most details.

[+] orwin|3 years ago|reply
Ideally yes?

The only thing I'm not opinionated on is block size, because while it is interesting, it isn't useful for debugging or optimizing. All the rest, any dev should have an idea of what it is and seek further knowledge when needed. Even if they're a frontend dev.

[+] blikdak|3 years ago|reply
Yes. Try adding 1 to a number which is already (in binary) all '1's. If you don't know the size of a byte, or an 'int' or a 'long int', etc. for the language you are using, you will be mystified why it is now 0, or maybe some other number your language decided was sensible to use (signed vs. unsigned, etc.). Then extrapolate that times 100 when dealing with floating point and presenting a sensible total in your shopping cart web app. If you don't know this stuff, learn it or stay away from programming.
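That wraparound can be simulated even in a language whose integers never overflow; a Python sketch (the helper name is made up, and the mask mimics a fixed-width unsigned byte):

```python
# Python ints are arbitrary-precision, so emulate an 8-bit unsigned
# register by masking off everything above the low 8 bits.
def add_u8(a: int, b: int) -> int:
    return (a + b) & 0xFF  # keep only 8 bits, like real uint8 hardware

print(add_u8(0b11111111, 1))  # 0 -- all ones plus one wraps to zero
print(add_u8(200, 100))       # 44 -- 300 mod 256
```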
[+] shafyy|3 years ago|reply
I don't think it's required to work as an average professional web dev. If you want to be a good developer and keep getting better, understanding the fundamentals of computer science is very important (and makes the job more interesting, in my mind).
[+] exyi|3 years ago|reply
People say you should know something about lower-level programming, and they are right IMHO.

Even as a computer user: internet speeds are regularly given in megabits per second. How do you estimate how long a 5 GB file will take to upload without knowing it's 8 b/B?

[+] dragonelite|3 years ago|reply
Of course people should know some of the basics of hardware and software development. Not knowing the basics, or the chain of steps a framework or a protocol takes, means you might overlook optimisation opportunities or pain points in a certain workflow.

But then again, knowing how many bits go in a byte isn't needed to be productive in 99% of cases.

Then again, when a professional software developer can't tell me how many bits go in a byte, or can't at least explain to me how to read/use a call stack/stack frame in an IDE or terminal, etc., I really start to wonder how experienced and professional that developer really is.

[+] alkonaut|3 years ago|reply
Yes. This comes up even in the highest level and most trivial work.

And not just “should be able to look it up”, but should know it in their sleep.

Should also know other esoterica, e.g. how a string could be zero-terminated in one case but length-prefixed in others, or how there could be one, two, four, or a variable number of bytes per character/glyph in a string, and so on. Not because one needs this every day, but a) because doing work adjacent to it means you eventually shoot yourself in the foot, and b) doing any amount of work will expose you to this, and not having picked up on it means you aren't picking up things.
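The variable-width point is easy to see for yourself; a Python sketch using UTF-8 (the characters are arbitrary examples of each width):

```python
# One "character" can be one, two, three, or four bytes in UTF-8.
for ch in ["A", "é", "€", "🙂"]:
    encoded = ch.encode("utf-8")
    print(ch, len(encoded), encoded)  # widths: 1, 2, 3, 4 bytes

# So len(text) and len(text.encode()) routinely disagree -- the classic
# trap when sizing buffers or database columns in bytes.
```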

[+] AlphaGeekZulu|3 years ago|reply
Yes, probably any software developer should know the trivial fact that a byte consists of 8 bits (nowadays), even if the developer is not involved with bit operations.

Much more important, in my eyes, and more sophisticated, is the concept of the computer "word": a professional software dev should know how many bytes go in a word, how word length corresponds with bus width, memory organisation and how types depend on word length.

And in my humble opinion, Unicode encoding is such a basic concept that every dev should understand it on a bit level.

[+] oxff|3 years ago|reply
If you can't do napkin math with bits and bytes to calculate capacities, bandwidths, flows etc. you are lacking one of the most essential skills there are, and need to pick up the slack ASAP.
[+] superchroma|3 years ago|reply
Probably not, it's something you can look up, and the 1000 vs 1024 naming distinction (e.g. kibi vs kilo) continues to be something that most developers I encounter aren't up to speed on. I use libraries with classes modelling such units in them if I need to do arithmetic with them, at the very least to offer security for others when they look at my code.

I think what matters is that you're reasoning about things and being thorough and careful. Broadly, I haven't yet had someone give me a pop quiz at my job.