"My normal job is way up in the clouds," he said. "This is so much more down to earth; this is really a great change."
I'd rather put it this way:
"My normal job used to be in the trenches. Now it's in the clouds. This is really a great change. I'm never going back."
The more things change so that I can concentrate on the software, the more I like it. No more serial or parallel ports, no more soldering or crimping cables, much less resetting & rebooting... With USB, plug & play, & wireless, I almost forget how much it used to suck. Now if only I could "think" my code onto my monitor without a keyboard...
I would agree with you most of the time. However, I've found that because most of the people I work with have never learned a programming language that forces you to understand the hardware, they just don't understand some problems. Passing by reference versus passing by value, for instance. Unless you've used pointers before, you don't get what's going on.
However - I fail to see why a web developer finds soldering "enlightening" - that's a little too far removed!
As a programmer and avid electronics hobbyist, I have a theory which explains some of the appeal, at least to me personally.
Hardware, unlike modern software, actually behaves logically and rewards learning and the construction of consistent mental models. When a circuit malfunctions, there is always a solid physical reason for it. It is because you, the builder, screwed up. An understanding of the physics involved will be rewarded by reality cooperating and the lights blinking merrily. As opposed to software, where most of your work consists of routing around other people's idiocy, and where any results you achieve might spontaneously erase themselves any day as the upgrade cycles churn onward.
Unless you deal with very basic circuitry you have to, at some point, use someone else's "idiocy." Same with software: if all you do is build console Hello Worlds, well, that's pretty rock solid. As for the reward system, that's different for everybody; to each his own.
I totally agree with these guys that getting out the soldering iron and working on real hardware is satisfying as hell.
But ironically, many hardware engineers like me spend all day writing what's essentially software (Verilog code) and running it in a simulator (VCS). Eventually we send a binary file (GDSII) that represents the chip layout off to the fab, and though we do get real chips back from Taiwan, most of us never lay hands on them.
All software programmers should know, at some simplistic but fundamental level, how the high level code they write actually gets executed. That means familiarity with actual hardware, even if it is in an academic way (e.g., building J-K flip flops, learning a fake ISA, etc.)
I don't think it's all that useful. The simplistic view that, say, Java > bytecode > ASM is approximately true, but modern CPUs basically ignore the ASM and give you the correct result even if they didn't do it the way you suggested. Even simple assumptions about RAM break down once you start looking at how L1 cache actually works. And let's not get into what the video card does when rendering text.
At this point I think most people are far better served understanding the abstractions that sit above the HW than what's under them. Learning how stuff works is great fun, but the average programmer and the average carpenter have little reason to care about QM even if it impacts what they do at some level. :-)
Electronics was my first love. At school as a kid I said I wanted to be an electrician (I knew a few) - but I had good maths skills, so Elec Eng was recommended. So since I was 10 I knew what I wanted to study, at least - it was a nice feeling; I never understood "not knowing what to do".
Went to university, and about two years in I realised that I had seen and understood the state of the art in electronics, at least the core of it, and that all the interesting stuff was in VLSI/chips and so on (all of which was basically done in software). I even started designing my hobby circuitry in MicroCap and PSpice before I would build it, and then I realised I had almost completed the transition into software - when I realised I was "creating" things in software, it was liberating...
I think this appeals to the identity crisis that programmers can sometimes have: are we really engineers? We're the most removed from the physical of all engineers, with the exception of the ORIE/FE people, and nobody's quite sure what they are either. You generally believe the argument that being an engineer is more about the thought process than about working on physical structures, but there's always that nagging bit of insecurity, the itch you can't scratch. Working with your hands on electronics alleviates that nagging fear.
That's my theory, at least. The most I ever bother to do though is assemble my own computer from Newegg parts.