
> At the risk of sounding stupid, I have no idea how bit-shifting and bitwise operators work, though I've worked in the field for over a decade and am now a Staff Engineer or whatever. There, I said it.

I mean no offense, but honestly I'm a bit flabbergasted that anyone who has worked in programming computers for over a decade wouldn't have acquired this exceptionally basic bit of computing knowledge.



20 years of application development and I've used bit shifting a couple of times; none of those uses were essential or much better than the alternatives. It's a bit condescending, and par for the course on HN, to be flabbergasted at someone's lack of knowledge of something, but honestly there are career paths where it truly does not matter.


Because it is something that flows naturally from understanding boolean logic and that computers store data in binary digits.

Being a "software engineer" who doesn't understand this seems to me like being an architect who doesn't understand trigonometry or something.


Believe it or not, there are many different kinds of software engineering, many of which don't require you to know anything about binary or bitwise arithmetic. I don't know that I've ever used these as a full stack web developer, but I did learn it in my Computer Science program; the only time I think about the bit position of something is when setting/viewing file permissions, or reading source code for a library which does deal with this (there are legitimate reasons to use bitwise operators in JS for things like hashing, cryptography, secure string generation, etc.)
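
For the permissions case, a rough sketch of what I mean (Python here; the filename is just a placeholder):

    import os
    import stat

    mode = os.stat("example.txt").st_mode        # "example.txt" is a placeholder path
    print(oct(mode & 0o777))                      # e.g. 0o644, the familiar rwx bits
    owner_can_write = bool(mode & stat.S_IWUSR)   # testing a single permission bit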

But really, there are people doing software engineering who have never thought about this, and while I agree that any blind spot in knowledge is detrimental (and everyone has some blind spots), it's condescending to suggest that this specific blind spot is noteworthy to the point of lacking foundational knowledge. Your software engineering isn't the only type of software engineering.


I must question the value of a "software engineer" who is working at a layer so abstracted from the reality of implementation that they don't have to understand binary.

> the only time I think about the bit position of something is when setting/viewing file permissions, or reading source code for a library which does deal with this (there are legitimate reasons to use bitwise operators in JS for things like hashing, cryptography, secure string generation, etc.)

People should understand how the libraries they use work. Not necessarily in detail, but they shouldn't be a magic box they're helpless to understand if it isn't working right.

> it's condescending to suggest that this specific blind spot is noteworthy to the point of lacking foundational knowledge.

Possibly. I'll admit that although it seems foundationally important to my own understanding of computing, it may not be foundational to understanding computing in a conceptual sense. However, it is so basic, and follows so naturally from boolean logic (which is foundational), that it is quite surprising it never came up for someone in this profession. I mean, if you already know how logic gates work, it is trivial to leverage that into understanding bitwise operations.


> People should understand how the libraries they use work. Not necessarily in detail, but they shouldn't be a magic box they're helpless to understand if it isn't working right.

What I'm saying is that 'should' is unfairly prescriptive. There are things I don't understand that would be massively beneficial to know, such as drivers and driver code, C (we didn't learn it to any meaningful degree in my program), and many other skills.

Fortunately, I've managed to take what I do know about computing, software, and programming, and find places I can apply that knowledge in a software engineering discipline that utilizes the knowledge I do have most effectively. Would I be able to write a driver if I was being paid to do it? Probably; there would be some learning required, maybe not even that much, but someone would be paying me to learn things outside of my general experience. I'd welcome it as well, but it just hasn't happened.

Similarly, there are people whose knowledge encompasses a subset of what I know (but possibly excluding binary and bitwise operations in totality) who are also very effective software engineers.

If you can think about browser state, reactivity, CSS specificity, the DOM, and git, you can be a very effective front-end engineer, and you never have to know much about how computers work. There are absolute wizards with 'front-of-front-end' who will be much more effective at animations, effects, device breakpoints, and visually pleasing layouts than I will ever be, who will never think about binary. And it will never be remotely relevant to them.


I think everyone, no matter where they are in the stack, should have at least a basic idea about how their programs eventually work. How they get turned into machine code, how that machine code gets fetched and executed, how registers work, memory accesses, RAM vs. ROM, what various caches are doing, how I/O devices work including disk, how interrupts work, how syscalls work, how memory allocation is done, what the stack is, typical segments of address space: code, data, stack, heap. I've been in the field for a few decades, but this is still kind of basic second-year compsci, right?

You don't need to be an expert at bitwise operations to grok these important concepts.


> How they get turned into machine code, how that machine code gets fetched and executed, how registers work, memory accesses, RAM vs. ROM, what various caches are doing, how I/O devices work including disk, how interrupts work, how syscalls work, how memory allocation is done, what the stack is, typical segments of address space: code, data, stack, heap

Yeah, I learned all of this stuff ~10 years ago (in compsci also) and have forgotten a lot of it due to never using it. Should I learn it again? To be honest, I'd like to, but I haven't taken the time either. I contemplated doing a deep dive the one time I had to interview recently, but then didn't end up needing it. I'm sure if I wanted to move to a FAANG I'd need to brush up, but in the meantime I'm bombarded with plenty of things I need to learn on a regular basis just to keep up with the actual technologies I work with.

> but this is still kind of basic second-year compsci, right?

To be honest, I find I have a very hard time really integrating knowledge of something until I've had ample opportunity to apply it in different ways over some extended period of time. I think this was why the knowledge never stuck when I learned it in CS. There were so many things I'd cram before the test and then never need to think about again.

If I don't have the opportunity to apply these concepts, trying to really 'learn' them is a Sisyphean task for me. With enough effort and repetition I'm sure I could, but I don't agree that I should. I think your attitude really overlooks the variance in learning styles and capabilities. Not everyone can remember things they don't apply for 10 years.


No, but you don't need to know them to do the front end work mentioned by OP.


> I must question the value of a "software engineer" who is working at a layer so abstracted from the reality of implementation that they don't have to understand binary.

To me it sounds like very productive work and a progression of technology. It's 2021: 99.99% of us don't have to implement our own binary protocols, and we work with computers rather than microcontrollers. If you're bit shifting, it's most likely extremely low-level work, which, as is the point of advancements in technology, is necessarily becoming rarer and rarer, or else some case of NIH.

I do work with binary protocols, so I'm very familiar with bit shifting, but I also realize this is all grunt work that I'm having to implement due to poor vendor support and a lack of usable libraries.


"Understanding binary" on a basic level is different than being comfortable using bitshifting / bitwise operators.


...not really? Bitwise operators can be explained in a couple minutes, tops, and follow naturally from another concept everyone working in computing should be familiar with: boolean logic.


> Bitwise operators ... follow naturally from another concept everyone working in computing should be familiar with: boolean logic.

I'd say to most people raised on "ordinary" maths they follow more naturally from quite a different concept: Moving the decimal point.


That only works for shifting, not the other bitwise operations.


Uh yes, I’m an example of one. I’ve understood all of the operators at one point or another. I get the basic ideas. I’ve just barely used them so I would have to look them up before using them. I also don’t find the concept of bit-shifting to be that intuitive. Boolean logic is easy and a very necessary fundamental concept for all programming — I’m completely comfortable with that…

For what it’s worth, I think most people consider me a competent software engineer. I have a degree in computational mathematics from a good school.


> I also don’t find the concept of bit-shifting to be that intuitive.

Can you clarify what you mean by this? Because the concept of shifting bits to the left or right is so simple that I don't see how it could possibly strike you as unintuitive unless you are completely unfamiliar with the idea of data being represented by a string of contiguous binary digits.

If your problem is merely that knowing what bit-shifting is does not present any obvious use cases to you, that isn't the same as finding the underlying concept unintuitive.


Hmmm. I'll try. I guess the main issue is that I rarely ever deal with data in binary format. The times that I've dealt with binary strings has probably been most often as binary numbers in some sort of programming or math puzzle. Thinking of it as "multiplying" or "dividing" by a power of 2 feels more natural to me than "shifting" bits left or right. Maybe this is because I was a math major instead of CS :p

If I had more experience with lower level languages and / or dealing with raw binary data formats, I probably would have internalized it better by now.


Then why don't you spend those few minutes explaining the thing to the person you initially replied to, or link them to a resource, instead of doing what you're currently doing?


Because several other people already gave perfectly adequate direct responses.

Mostly what I'm doing currently is defending my level of surprise.


That's fair. A point that I didn't see mentioned is that what seems really easy for you may not be for others. I know about boolean logic, and bits, but I'm not comfortable with bitwise operations and they certainly don't seem obvious to me. On the other hand, I have coded with other people and was really surprised that they didn't understand a certain thing (going from for loops to map, for example).

When you look at the number of programming languages and how different "idiomatic code" can be for the same task, I think it's proof that different people think in different ways. For some, low-level stuff will be very intuitive; for some it'll be functional programming; for others OO, etc.

As some other people said, CS/SWE is very wide, and it's hard to know everything at any point. Add to that the fact that for most people working with high-level software, knowing more about the business gives more leverage than learning more CS fundamentals, and you end up with people knowing widely different things.

I think that in general, what's important is having a basic idea about most things (known unknowns instead of unknown unknowns) so that you can look into stuff more deeply if you need it.


> That's fair. A point that I didn't see mentioned is that what seems really easy for you may not be for others. I know about boolean logic, and bits, but I'm not comfortable with bitwise operations and they certainly don't seem obvious to me.

....how in the world can that be? Bitwise operations are just the application of boolean logic to each pair of bits in the arguments. If you know what "exclusive or" means, XOR should be perfectly comfortable to you no matter how many bits are involved.
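
A couple of lines in a REPL show the whole thing (Python here, but any language with these operators will do); each output bit is just the boolean operator applied to the corresponding pair of input bits:

    a, b = 0b1100, 0b1010
    print(bin(a & b))   # 0b1000 - AND of each bit pair
    print(bin(a | b))   # 0b1110 - OR
    print(bin(a ^ b))   # 0b110  - XOR: 1 wherever the bits differ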


XOR is fine, I understand how it works, though I often don't think of it in some leetcode-type exercise where an XOR is the "clever" solution. Bit shifting is not really intuitive to me; for example, in the article I wouldn't know whether I had to shift left or right, and it would take a bit of time to look that up. As I said, different people think in different ways. I don't understand how people who can understand for loops don't understand map and filter, but I know that they exist.


That's just wild to me. If I held up a card with a decimal number on it and asked you to remove the X least significant digits, would you have to look up which way to shift them?


Yes, I would have to.


Would you have to consult a reference in order to be able to point to the least significant digit of a number written down in front of you? Maybe the "least significant/most significant" terminology is unfamiliar enough to you to present a stumbling block. Certainly the fact that you're communicating on a written forum means you're probably comfortable and experienced reading a string of digits and knowing how to understand it as representing a quantity, even if you've forgotten the formal terminology describing that process.


No, I wouldn't have to consult a reference. What's unfamiliar to me is how exactly bit shifting allows you to take the X most significant bits from a number, although now that I've talked a bit and thought a bit about this it starts to make more sense. The thing is, I've never spent much time thinking about this, so it's unfamiliar to me.
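
Writing it out for myself, I think it goes something like this (Python, just to convince myself):

    x = 0b10110111        # an 8-bit value, 183
    top3 = x >> 5         # shifting the low 5 bits off the end keeps the top 3: 0b101
    low5 = x & 0b11111    # a mask keeps the 5 least significant instead: 0b10111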


If you have a number base n, shift right/left k digits means, respectively, "divide/multiply by n^k".

That's the same whether it's decimal (n=10) or binary (n=2).
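
With concrete numbers (Python here, but the idea is the same in any language):

    n = 1234
    print(n // 10**2)   # 12  - dropping two decimal digits divides by 10^2
    m = 0b110100        # 52
    print(m >> 2)       # 13  - same thing in base 2: divide by 2^2
    print(m << 3)       # 416 - shift left: multiply by 2^3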


It's floor division with bit shifting though, right?
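
Checking in a Python REPL, it does look like floor division (rounding toward negative infinity) rather than truncation, at least for Python ints:

    print(7 >> 1, 7 // 2)      # 3 3
    print(-7 >> 1, -7 // 2)    # -4 -4  (floor: rounds toward negative infinity)
    print(int(-7 / 2))         # -3     (plain truncation gives a different answer)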


It is a clear sign that the person never had any genuine interest in computing, since that would have led to learning how a CPU works (it's the most important part of a computer!) and thus its instruction set, which would have of course led to learning about binary operators and shift operators.

Instead the person clearly was motivated only by some specific application of computing or by making money.


To me this reads like someone arguing that a champion race car driver must never have had a genuine interest in automobiles if they've never rebuilt an engine.


Clearly you do not have a genuine interest in computing because you do not have a complete theory of everything, unifying gravity and all the fundamental forces, so really you have no idea how computation even happens at a fundamental level.

Stop gatekeeping; there's going to be someone smarter than you.

People work at different levels.

The logic of how a huge program works can be abstracted above bit shifting. It is still computation.

Maybe try to be more understanding.


Talk about gatekeeping…

You don’t get to decree what genuine interest in computing looks like for all of humanity.


Everyone thinks their work/what they know is important, that's it. Why not study electrical engineering while we're at it? Even better, go even deeper and get us all some physics.

There's a reason Randall Munroe drew that butterflies xkcd comic.


> and that computers store data in binary digits

I know that computers store data in binary digits but it has absolutely no impact on how I do my job. I'm vaguely aware of how bitwise operators work but have never had to use them in my work, only in random hacky side projects.

> like being an architect who doesn't understand trigonometry or something

I'd equate it with an architect who doesn't know where the brick supplier they are working with sources their bricks from. It's absolutely essential to the project; nothing is going to get built without the bricks. But the architect can still plan the use of those bricks and the project can be entirely successful without them needing that knowledge.


I'd say it's a bit more like a truck driver who doesn't know how an engine or transmission works.

Sure, you can learn the habits just fine, and maybe go your entire career without knowing what goes on under the hood. But sooner or later you're going to get stuck on the side of the road waiting for someone else to come out and fix something trivial.


But then you can call AAA and get it fixed in under an hour. I would guess the number of truck drivers who know enough about their engine or transmission to fix a trivial issue is <10%. For Uber drivers I would be flabbergasted if it was over 1%.

I actually think this is a pretty good analogy. A truck driver does not need to know how to fix their truck to do their job. A web developer does not need to know how to bitshift. For mechanics or low level programmers it is a different story.


But this is like an architect trying to lay bricks without knowing how.


It really isn’t.


I mean, if you work as a Staff engineer in a department that only does web app development (say, working at Shopify on their rails codebase) I can easily see that you wouldn't come into any contact with bitwise operators at all. Rather, your time is probably better spent on scaling databases, refactoring codebases and other such higher-level things.


People can understand that without knowing about bit-shift operators in a given language.

I know about them because I had to. My first computer had 4K of RAM. When I hit the limits of Applesoft BASIC, which didn't take long, I had to learn 6502 assembler. Later, I've done things like writing protocol decoders, so I've also learned them in Python.

But I think it's perfectly reasonable to have a solid career and never use these. To forget that they ever existed, because one is solving problems where they're just not relevant anymore. Our field is too big to know everything, so I think it's fine that people focus on different aspects of the work.


Same here, I started on 6502 assembly and you had to know a lot of that stuff just because we didn’t have cool libraries like the STL and so on. You want to sort something, well you’d better write a sort routine.


The most common use for bitwise operators I have seen is to represent multiple boolean states in a single variable. It's not required for most people at all. A more descriptive multi-value hash/struct with each state explicitly listed as a boolean will make for more readable code. Unless there's a strong need for memory management or a binary protocol implementation, I don't see a need for it.
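
For reference, the flag pattern I mean looks roughly like this (flag names invented for the example):

    # pack several booleans into one integer using bit flags
    READ, WRITE, EXEC = 1 << 0, 1 << 1, 1 << 2

    perms = READ | EXEC              # set two flags
    can_write = bool(perms & WRITE)  # check a flag -> False
    perms |= WRITE                   # set a flag
    perms &= ~WRITE                  # clear it again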


It's the attitude that's the problem: rolling your own terrible solution without asking someone or just googling first.


It simply hasn't come up. I've written accounting and billing software that processes millions without using bit shifting, logistics solvers that do multithreaded depth-first search on lazy graphs without bit shifting, ticketing systems that sold millions of tickets correctly without bit shifting, etc.

I'm sure there are plenty of use cases out there, I just haven't worked on them.


Would you have recognised the opportunity to use it, not knowing how it works?

Sometimes when we learn a new technique we suddenly see all the places it can be used that we didn't know about before. There's the old (probably apocryphal) story about the landscaper who was able to fire one of his staff when a customer told him he could calculate the third side of a right-angled triangle without measuring all three sides. What he did worked for him; he just didn't know there was a better way.

Anyway I do agree in this case you are probably correct. I think most programmers these days use bitwise logical operators for flags and rarely if ever bit shift outside of system code.


I almost had to learn it once for flags, but Go has iota and constants and Java has enums, and stuffing flags into an integer and storing that in Postgres or Redis would have made the problem arguably worse. Just used a Set.


It's not that they haven't come up in their job that's strange, but that they were never exposed to this level of base computing principle at any point in their career.


The right algorithm beats out low-level optimization every time. Think of bitwise operators as a vector unit where the scalar is a bit. That's it.


I also have no idea how to properly do memory management; every language I've worked with has had garbage collection. I know the languages I write in use malloc or something to get more memory, but that's it. Never called it, don't know how it works. More flabberg-ammo for that gasty feeling.


I can understand how this situation happens, but honestly it really bothers me that it is the case. I see all this bloated slow software everywhere that pauses all the time and it is hard not to see "I've never done memory management" as part of the cause.


That might not be due to memory management - a lot of my work is in Go and Java, selling very high-demand inventory to millions of people (opening-night movie tickets across tens of thousands of screens and shows across the entire country), and the APIs are not slow and do not (noticeably) GC pause. But both Go and Java/the JVM let people do high-value work without worrying about memory management (other than leaks, of course): no manual allocation, no free/release, no pointer arithmetic. Even the concept of pointers in Go isn't about actual memory addresses so much as value/reference semantics. The only time I came close to having to manage memory was iOS apps, but then Objective-C added ARC and Swift came along, so nope.


It certainly feels like a GC pause, but it could be a lot of things. Regardless of the cause I feel like the fact that people don't think about how memory is managed is indicative of the real problem: people are programming computers without really understanding how they work. And while that is fine, even encouraged, for a lot of use cases I think anyone calling themselves a professional programmer should be held to a higher standard.


So how many levels or layers down or up should we expect someone to go? Say we take Go / Java as a starting point: L-1 would be C / C++, L-2 would be Assembly / SIMD / AVX processor instructions, L-3 would be chip design / architecture, L-4 would be diode chemistry and fabrication, L-5 is quantum mechanics?

And on the other side L+1 would be databases, L+2 would be system architecture / systems design, L+3 would be user experience and communication, L+4 organisation and team design, L+5 would be LISP?

How many levels would someone have to straddle to meet the higher standard? And does it matter which level their midpoint is?


I too despair when I remember the days of software instantly responding to keypresses. But I fear that no matter how much today's developers knew about computer architecture software would still be slow, because latency is like a gas: it expands to take up all available latency-toleration space in a program's audience. Compressing it back is expensive, ergo it's a race to the bottom where business leadership would not let those developers actually take the time to make software more responsive.


>>I see all this bloated slow software everywhere that pauses all the time and it is hard not to see "I've never done memory management" as part of the cause.

There's no way your ordinary programmer would have done memory management better than the GC options available with something like the JVM.

This is in fact why Java/Python/Perl exist in the first place.

Slow software is better than software that crashes often.


I started with C 30 years ago, wrote my PhD thesis in pain, tears and C, and then developed at a more or less amateur level until today (thankfully not in C anymore).

I read about bit-shifting operations but never, ever had the need to use them. So I did not push further and just know that they exist and that they look like <<.

I hope I did not make your flabbergastering worse.

Ah, I also wrote a (tiny) bit of the Linux kernel in ~1994 (what today would be modules).

And still survived without <<


I think programming has become so high-level that there's a lot of fundamental computer science that is largely ignored. I mean, I'm sure this person has no idea what an ISA, cache, CAM, or pipeline is either, but you don't need to know to write front-end JavaScript.

It does unnerve me as well, since it is so fundamental in my mind. But hey, maybe there is just so much to learn there's no time to teach computer architecture anymore?

If you want to feel humbled, read the book "Hacker's Delight". It is about 300 pages of mostly bitwise algorithms and operations. I thought I knew a thing or two from decades of coding, but this book goes so deep that my brain just shut off.

https://www.powells.com/book/hackers-delight-1st-edition-978...

EDIT: Upon further pondering, bitwise operations make certain assumptions about the machine word size in their context. It could be dangerous for higher-level languages to allow bitwise ops if the number of bits isn't known, as this class of instruction is generally tied to the hardware: the opposite of what high-level languages are about!
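
For example, Python ints have no fixed width at all, so you have to mask by hand if you want word-size behaviour (JavaScript goes the other way and coerces bitwise operands to 32-bit integers):

    x = (1 << 40) | 0xFF        # Python ints are arbitrary precision; no bits "fall off"
    print(hex(x))               # 0x100000000ff
    print(hex(x & 0xFFFFFFFF))  # 0xff - masking emulates a 32-bit word by hand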


Nor do companies care that you know this when applying for a web developer position. IMO they should care, since it shows you're capable of learning about different areas within software engineering.


We definitely care if you’re capable of learning other fields - we check for breadth in static and dynamic languages, SQL and NoSQL knowledge, UX and systems design, performance optimisation, a sense of aesthetics, writing and communication skills and more. We’d rather people spend time learning these than binary arithmetic or regex operators or SIMD instructions.


Yeah, not everybody has to work with networks or cryptography, but I would've imagined that people at least read libraries that they use (to figure out what's going on under the hood) and inspect/debug low-level stuff from time to time.


A lot of software development is just "executing recipes" and there's nothing wrong with that: https://youtu.be/Nb2tebYAaOA?t=1363


How do you simultaneously know what bit shifting is but then not know that you can easily avoid it completely for a career?


I know that it's the manipulation of binary numbers, and I do know what binary numbers are. But I've never operated on binary numbers professionally. I did do some AND OR NOR NOT XOR gate diagrams in high school and college, but haven't looked at binary numbers or thought in binary numbers since then.


Ok, so it isn't as though you weren't exposed to the concept at least. The way you phrased what you said made it sound like boolean logic itself might be alien to you. Still, there's such a tiny conceptual distance from the gates to bitwise operators that I'm still a bit surprised you didn't acquire it by osmosis.


The Boolean logic part is applicable everywhere, so that's actually unrelated to binary arithmetic or bit shifting. Stuff like De Morgan's theorem, etc., I practice more in the context of simplifying if-else statements than binary operations. So there hasn't been enough significant overlap to spread into.
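
The kind of rewrite I mean, as a trivial made-up example:

    # De Morgan: not (a and b) == (not a) or (not b)
    def needs_review(approved, tested):
        # return not (approved and tested)      # original condition
        return (not approved) or (not tested)   # equivalent, sometimes reads better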


Been writing code for pay since about 2000.

I think I've bit-shifted in anger, like... twice.

I 1000% could not do anything useful with them without consulting a reference, as with most anything else I don't do on an ~weekly basis.

[EDIT] rather, that's all bitwise operators. I know I've used them once (in a "write it once, never touch it again" kind of context) and maybe one other time but recollection's hazy. Just a basic bit-packing thing IIRC, for the one I recall.


Just wait until Copilot is widely used. There will be a whole group of cargo cult programmers.


I mean no offense, but honestly I'm not a bit flabbergasted at all that anyone has your view. I keep seeing people acting as if the bubble they are in is a good representation of the world as a whole, and every indication that maybe they are wrong is met with a big surprise.

If you are truly flabbergasted by this you need to try to find a way out of your bubble so you can understand reality better.



10 years of one year's experience, probably.


Heh, no, I'm happy to say I've worked in a wide range of industries, countries, programming languages, datastores, team compositions, roles, front & backend, side projects and own products. I'm just saying none of them required shifting bits.



