> > The truth is that programming isn't a passion or a talent, it is just a bunch of skills that can be learned.
This may be true if your definition of programming is the literal act of writing code / recalling syntax, and learning new tools. Of course, that's not really what programming is. That's the easy part.
Programming is about problem solving, code is simply the tool. Critical thinking/problem solving ability, the ability to think in abstractions, the ability to hold a large amount of information in your head all at once, etc. are all skills, and not skills that the majority of people are particularly good at.
We wouldn't judge a carpenter on e.g. his/her ability to use a saw, or an EE on their ability to use a scope. I never understood why so many put so much emphasis on the tools a software dev knows.
I do believe these skills can be learned, but it's the enjoyment I derive from these things that I believe is the difference between some people and me. When I was in junior high I would do various logic problems for fun. I asked for nothing but a TI-85 for my birthday one year. There probably are people who are better than me at coding who hate the work. But I've hated every job I had until someone started paying me to sit, put on headphones, and solve problems with code. I go home with a sense of satisfaction; whether it's a skill, a talent, or a passion is interesting but doesn't really matter to me.
After (as an adult) watching kids develop, there is a huge positive feedback loop for various abilities.
When you are slightly better or faster, or just find doing it (whatever "it" is) more rewarding than your peers do, it takes less effort to get the same reward. This means you are more likely to spend more time doing it. This means that after a while you have more practice than your peers, so the difference in ability is amplified. Now various grownups around you notice and compliment you on it, get you lessons to do it better, provide you with supplies, &c., and the difference grows.
By the time you are looking at teenagers, it's hard to tell how much of the difference stems from natural ability, and how much of it is this snowball effect.
Some people really do seem to have a much harder time with certain subjects than others.
I took pride in being a good teacher, but I once encountered a person who - despite being normal and able to hold a normal job - just couldn't grasp the basics of computers.
And by basics I mean:
Connect power cable, press power button, wait for Windows to load, type password and open Internet explorer.
It would stop somewhere in between there each time.
There is nothing wrong with that just like there is nothing wrong with me never being able to become a long haul truck driver (falls asleep), pilot (partially colour blind), or soccer star (hopeless).
> but it's the enjoyment I derive from these things that I believe is the difference between some people and me.
This is huge as well. Possibly also underappreciated.
I've had the exact same experience, but the other side of the coin was getting bullied and beaten up because I was more interested in electronics and programming my Vic-20 than in other 'conventional' interests for young boys (and being quite small physically at the time).
Programming is a form of expression for me, and I derive great satisfaction from solving problems and creating software which truly helps the users I work for. On the other hand, my lack of detachment between 'myself' and 'my work' can also cause some unnecessary pain at times, but it's a work in progress.
If anyone has achieved the balance I seek, I'd be glad to hear their story!
Saying there isn't talent involved in programming is like saying there isn't talent involved in sports or guitar or painting. Sure, anyone can improve, but some people just have a natural aptitude.
When I was an athlete in high school and college there were people who barely trained and ate whatever they wanted and were far better than people who worked their ass off and counted every calorie.
Programming is much the same. I've had lots of people want me to help them learn, some people are naturals and will make more progress in a few days than most people do in weeks or months.
I'm not sure why people get so defensive about it, nobody cares if you say someone short can't make it in the NBA, but say it about programming or some other mental endeavor people get upset.
In my view programming definitely involves talent, or at least a willingness to put up with things a lot of people wouldn't want to. When I explain to a lot of people in detail what I do during the day and the things I have to think about, most of them think this is a rather boring and tedious job they don't want to do.
It's the same for a lot of jobs. Intellectually I could do what pure business people or lawyers do, but I don't find it interesting, so I don't have passion for it and will never do well.
On the other hand the first time I saw programming stuff I got excited immediately and had passion for it. I think that's my real talent.
Do you mean to say that being able to think in abstractions or keep a lot of state knowledge in short-term memory are skills that cannot be learned? So where did we acquire these skills, if not by learning?
Your calling them "skills" prejudices the answer. My best understanding is that they aren't skills, they are hardwired abilities. My best guess based on research that I've seen on IQ is that about half of the hardwiring is genetic and half is a result of childhood upbringing, but by adulthood they are not readily changed.
That said, we build concrete skills around how to make the best use of those abilities. We build memory skills to be able to quickly bring things out of short term memory back in. We build abstractions and concepts that allow us to efficiently analyze complex situations. We learn to think better with the capacity that we have. However our capacity doesn't seem to be readily changed.
As a concrete anecdotal example, several who have taught CS for decades tell me that most students simply can't "get" pointer arithmetic or recursion. Their brains don't go that way no matter how much they try. People who do master it learn those things fairly quickly and then can move on. The specific abstractions can be taught, the ability to think through those types of abstractions seems innate.
None of this minimizes the effort needed to become a good programmer even if we have the capacity. However the fact that most humans can't become programmers is why there is a job that typically pays 6 figures where many who are successful in it are self-taught with no more than a high school degree or GED.
"Upbringing" can be a slightly misleading way of stating that summary, as most people think of upbringing as comprising environmental factors that parents can influence. But factors that parents can influence should be largely shared between siblings, and studies typically find that the effect of shared environment is quite small in adulthood. IQ researchers typically find that 60-80% of the variance in IQ is explained by genetics, and most of the remainder to "non-shared environment", a grab bag of factors not shared between siblings that we might as well call "luck".
The above is a slight simplification, but I think it's a fair summary of the current consensus among intelligence researchers, given the word count.
If I remember correctly, the last time I looked into this, there was some doubt about the validity of many of the twin and sibling based IQ studies, because the parents who consented tended to be wealthier.
The upshot was that genetics is probably more of an upper limit on IQ, and that for children raised in poor conditions, environment plays a much bigger role than genetics.
Restriction of range is a potential concern, but not all studies are subject to that criticism. Bruce Sacerdote's study of Korean adoptees randomly assigned to American families found, for example, that the adoptive families represented perhaps the middle 90% of the SES spectrum. Obviously you will rarely see children adopted into the lowest percentile, but you can compare children taken from the lowest percentile to children who are raised natively in that percentile, and you typically find small or nil differences for most outcome measures in adulthood.
Plenty of smart, dedicated people have been trying for decades to devise interventions to permanently boost IQ among children raised in poor (for the first world) conditions, and unfortunately not much has turned up. Instead, there is a familiar pattern of small pilot studies that show initial promise, but whose gains tend to wash out when the program scales up to a less than one-to-one child/child-psychologist ratio, or even when the children are followed up on in adulthood.
The gains from permanently boosting IQ among low-IQ children would be so great that it might still be worth banging our heads against that wall on the off chance that we find something that might work. But we should keep in mind the base success rate of past interventions.
Neural plasticity in adults may be limited but it is not altogether diminished. Also it can be increased - "the plastic potential of neural networks can be engaged late in life by acutely regulating 'functional' E/I transmitter release." http://www.jneurosci.org/content/30/45/14964
For those of us who didn't grow up coding at age 7, this is golden!!!! The number of terrible CS teachers out there would amaze you!! I mean "blow your mind away" amaze you. I almost gave up CS because of my absolutely terrible professors. I couldn't even understand simple classes and objects!!! It was that bad!!!
Luckily, since I really wanted to learn, I had to seek alternate resources. I was fortunate to stumble on Stanford's Cs106a taught by Mehran Sahami and it changed my life forever. That man is the gold standard for what a teacher should be. He broke down complex ideas and made them look so simple.
Since then, I use a combination of different resources to learn said "difficult concepts". Recursion, got it...pointers, got it too. I have used Stanford courses, Berkeley courses, Udemy, Udacity, Coursera and even YouTube to learn. These sites have some really good teachers.
I have since come to understand that CS is really not that difficult. All it takes is a willingness to learn and a good resource to learn from.
Maybe so, maybe not. However, I never had trouble when they tried to teach me things, and I have heard good things from others whom they taught.
One of the people who expressed this basic opinion to me is https://www.amazon.com/Randal-Schwartz/e/B000APA744. He is also someone whose expository abilities are widely recognized as being excellent. And he is also an example of a self-taught programmer with no degrees beyond graduating from high school.
To be fair, "bad teachers" is the simplest explanation. The alternative is to go chasing intricate definitions of exceptional intelligence and long-tailed aptitude distributions that take a hella lot of explaining.
>> My best understanding is that they aren't skills, they are hardwired abilities.
And isn't it the most amazing coincidence that those who are thus exceptionally gifted are uniquely placed to benefit from the demand for those very abilities in an extravagantly lucrative profession?
I mean, imagine what would happen if everyone could program! Where would those six-figure salaries go then?
> And isn't it the most amazing coincidence that those who are thus exceptionally gifted are uniquely placed to benefit from the demand for those very abilities in an extravagantly lucrative profession?
> I mean, imagine what would happen if everyone could program! Where would those six-figure salaries go then?
That was the exact point that I was making.
There is no coincidence. It is economics 101. In a free market, the salary is set by the laws of supply and demand. If people who can program are in short supply and create lots of value, the supply/demand curves will meet at a high price point. That is good for programmers. If everyone could program, then the price point would be lower and programmers couldn't collect those salaries. Programming has low explicit barriers to entry AND high salaries. This could not continue without implicit barriers to entry of some sort.
How low are the explicit barriers? I personally know several programmers who were homeless teenagers, later received GED degrees, and are self-taught in programming. These are mostly highly paid professionals. (One that I know decided as an adult to go back and pursue university now that she could. She's only 30 - by the time she's my age I'm confident that she'll also be a highly paid professional.)
Let's continue with economics 101. When you move a person from a non-programming job to a programming job that pays more, it is obviously good for that person. It is also good for society as a whole - that person is doing something more productive. It isn't so good for other programmers, though, because it reduces their salary.
Speaking personally, I grew up in poverty. My life is much better than I feel I deserve, or than I need. I also like seeing the lives of others around me improve.
Therefore I have encouraged many to take up programming, and spent untold volunteer hours answering questions from people who want to become better programmers. I have zero interest in limiting who gets to program.
For a political example, I completely oppose H1B programs in their current form. I think that we should allow free immigration for competent people AND once they arrive we shouldn't restrict them from finding the best job that they can. The current quota system provides incentives for companies to swear up and down that they are paying a competitive wage while in fact paying under the market price, and then to trap the people they import in jobs that don't make full use of their talents.
I don't think it's nurture v nature, it's both. Anyone can learn, and anyone can get better at something. That does not mean anyone can be Einstein. I could study physics all day and all night and I would never be on his level or the level of his contemporaries.
They're built in, like most cognitive abilities. After all, you never learn how to take a 2 dimensional pattern of light hitting your retina and turn it into a 3 dimensional model of your environment. Ditto for locating the source of sounds. And, while you learn a specific language, none of us have to develop the general skill of learning a language from scratch as toddlers.
> We wouldn't judge a carpenter on e.g. his/her ability to use a saw
We absolutely should because it's a fundamental aspect of doing their job. The difference between me and a professional carpenter is literally the ability to use tools effectively to build an item. I know whether to use a dovetail or dowel joint as well as any professional, but I guarantee the pro will achieve a better finish in much less time.
It's like theoretical CS vs. real-world programming. It's good to know sorting strategies and whatnot, but knowing how to use the tools is what gets stuff done.
> We absolutely should because it's a fundamental aspect of doing their job.
The problem is that we don't pay engineers to crank out pre-designed classes in a specified language with a specific editor. We do pay day laborers to go dig a certain trench over there with these particular shovels and picks.
Software engineering is so much more open-ended. We're paying people to "go help the team make the yard more water efficient".
In other words, if we had more well-defined roles for "coder" versus "engineer" versus "architect", it might make sense to test on the ability to use tools, but in general people in this thread aren't making that kind of distinction because the tech culture in general doesn't make a clear distinction. Otherwise, we'd have "coder" job interviews that didn't include algorithms or open-ended problem solving questions.
>We absolutely should because it's a fundamental aspect of doing their job
A rather trivially learned part of their job. It has little bearing on their ability to build a beautiful cabinet.
>The difference between me and a professional carpenter is literally the ability to use tools effectively to build an item
It's much more than that. How about Michelangelo? Do you think the only difference between you and him is that he's had more practice with a chisel? Doubtful.
>It's like theoretical CS vs. real-world programming. It's good to know sorting strategies and whatnot, but knowing how to use the tools is what gets stuff done.
This is the line of thinking that leads hiring managers to build teams of mediocre devs with a lot of buzzwords in their CVs. I don't care so much that you know e.g. C#, I care that you are a great problem solver. If you have that part down, I'm fine with investing a little time into you so that you can learn C#, ASP.NET, whatever. The latter bit is easy.
Eh, I'm not convinced of that. I don't think you can take any person and teach them to think through problems logically. It seems simple enough, and you can certainly make them _better_ at it, but I really do believe we all have upper limits on our potential.
I agree that there are upper limits on a given individual's potential, but I think you are being too absolutist in language. I'm not sure that you can take ANY person and teach them any given thing. That's just silly. I think the powerful point is that there are and needs to be average programmers, and we can teach a subset of people to be that. There is evidence, even if anecdotally, that many are being dissuaded from the field due to a toxic belief that you are either born with some amazing ability to do the work and that it can't be taught, or you suck because you just "don't get it". Very damaging philosophy. I think we could teach certain people to do the work, maybe not everyone, but more than currently. There are a lot of mundane jobs in programming that do not need super high level abstraction and critical thinking skills.
> I think the powerful point is that there are and needs to be average programmers
Of course we do, but no one is saying only the best should be hired and the rest should go dig a ditch. Most programming tasks do not require top level talent to accomplish, but when I'm hiring I'm going after the best candidate I can get, and that has little to do with e.g. what web framework they are most familiar with.
Anyone can learn these skills, but they're sufficiently difficult that only people with the inclination and motivation to push through the hard parts make it anywhere. Desire and grit make more of a difference than talent or natural ability, which are impossible to measure.
>> I don't think you can take any person and teach them to think through problems logically.
That's actually the whole point of scientific training: teaching people to use the tools of science, including thinking logically and avoiding sources of bias and so on.
If you want to go back into antiquity, the ancient Greeks who started the whole Logic thing, Socrates, Plato, Aristotle et al, they never presented logic as some kind of innate ability of human beings, rather they set out to teach it as an instrument of thought that was far from innate. Because if it was innate, it wouldn't need all that work they put into it.
Is there any particular evidence that working programmers actually have either of those? Or are they just fuzzy-ass words we've made up to convince ourselves there's some hard barrier to the eventual reclassification of programming as "un-skilled labor"?
There are two meanings of learn that are at odds here. One is to be taught something, which I think many of us agree isn’t sufficient for programming. The other is to acquire the skill via lots of practice, which is much more appropriate in the context of programming.
One's ability to program is directly related to one's tenacity in practicing programming.
I put emphasis on tools because I've worked with a lot of recent graduates. There's the very real, practical problem of knowing what not to solve. I don't care if you're a genius or not, you're wasting time recreating (poorly) components that already exist in the standard library. Similarly, if someone is doing blatantly bad-practice things because "it's just implementation details" (stuff like hardcoded paths to /home/username), it causes a bunch of extra effort to clean up the monstrosity.
That's not to say problem solving is not extremely important, but the time suck created by bad tool use (or not using tools out of ignorance of their existence) ultimately reduces the productivity of a programmer, or in some cases, whole teams.
> Similarly, if someone is doing blatantly bad-practice things because "it's just implementation details" (stuff like hardcoded paths to /home/username), it causes a bunch of extra effort to clean up the monstrosity.
Agreed. There's definitely a balance point there - I remember when I got to the point where I not only recognized that this is a bad idea, but where such paths actively started bothering me even in one-off scripts. Something as simple as setting them as constants at the top of the script makes that feeling mostly go away and requires almost no extra effort when implementing.
There are many times when I think "This is really brittle. Surely there's a better way to do it...", which is usually quickly followed by "I'll figure out that better way later, for now, just isolate it and make a note to fix it when this is done."
+1 Code is simply a tool. And dev tools are even further removed. Git is not problem solving. I'm certainly not going to belittle that skill. I have Git expert envy. However, hiring a problem solver for their experience with nearly irrelevant tools is comical. A red flag, but still comical.
> However, hiring a problem solver for their experience with nearly irrelevant tools is comical
How far down the rabbithole does this go?
I have never programmed iOS, but I'm completely positive I could program for iOS.
10 years of occasional programming, from python, to school projects, to database development, to android apps, to full stack react-native development.
I'm confident that I can build an iOS app.
Programming and understanding how computers handle inputs are tools. But programming is more mindset than anything. The only benefit of knowing the specific language or framework would be skipping the month or so of learning syntax.
I'm a recreational programmer mostly, but at some point, you realize you can do full-stack and you realize that everything is possible with google and stack overflow.
Better also to not just grasp but live by concepts like opportunity cost and technical debt. You are not adding value making the wrong choices in these areas, never mind your ability to type code.
I think the point is that we don't judge on the tools, but the results. If he/she did a good job, why would you care? You wouldn't. Which is the parent's point. I think.
I used to be a "rock star" programmer. I could run rings around my coworkers. I wrote an embedded OS for running machine tools (I had help.) Back then, you read books and memorized everything. I knew x86 assembly and could use a logic analyzer to debug my code. This was in the 80's.
Now the amount of knowledge required to do even mundane tasks is an order of magnitude more. No one can memorize all the APIs and frameworks needed to do their jobs programming. I couldn't work without search. I don't memorize anything anymore. I work on dozens of different technologies.
Basically, I've become a mediocre programmer at a large number of different programming tasks where once I was expert at a very few. I'm the same person but the meaning of what is a programmer has changed over time. I'm OK with that.
Just a side note: I did some debugging of STMicroelectronics ST10 assembler with an oscilloscope the other week, and it was quite a refreshing experience that made me appreciate all the debug loveliness that is normally handed to us on a plate with even C. The one thing the loveliness can't really help with, and where the oscilloscope shines, is the proper timing of signals. It's great.
If you memorize it you don't have to context switch when you look it up. The benefits of memorization are enormous if it reduces the number of times you have to look up details to zero. It's similar to touch typing vs hunt-and-peck typing, with the following exceptions:
1. Context switches are much more expensive than looking at a keyboard to find the key to press.
2. Memorization enables you to have a more complete mental model, which greatly speeds up design, reduces bugs, and lets you find bugs faster.
3. You gain surprising insights if you can run the program in your head while falling asleep, in the shower, or walking around.
When I am doing cryptanalysis the first thing I do is commit the protocol to memory. Sadly if I'm not working on it 24-7 I quickly forget and have to rememorize.
If you didn't develop in that timeframe you might not get the difference between then and today. It took forever to look things up, so you ended up memorizing instead. And it wasn't a big deal, because the number of things needed to be a successful software developer wasn't that great. To be successful and finish projects more or less on time, memorization was necessary.
Edit: For example, for a different project I knew the hex codes for every 8051 instruction. This was necessary to be successful debugging using a logic analyzer. Looking up each op-code would be prohibitively slow.
It's a caching problem. If memory access is slow you want to cache more, if not then you don't bother. Things you use a lot will spend more time in the cache, things you don't you'll have to keep retrieving from memory.
We really aren't as different from computers as we sometimes like to believe.
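To make the analogy concrete, here is a toy sketch in C (my own illustration; Fibonacci is just a stand-in for any expensive lookup): naive recomputation is the slow trip back to "memory" every time, while memoization keeps frequently used values "in cache".

```c
/* Naive version: "retrieves from memory" on every call,
   recomputing the same subproblems over and over. */
static long fib_slow(int n) {
    return n < 2 ? n : fib_slow(n - 1) + fib_slow(n - 2);
}

/* Cached version: each value is computed once (a cache miss),
   then served from the memo table (cache hits) ever after. */
static long memo[64];  /* zero-initialized; 0 doubles as "not computed" */

static long fib_cached(int n) {
    if (n < 2) return n;
    if (memo[n] == 0)                        /* miss: do the slow work once */
        memo[n] = fib_cached(n - 1) + fib_cached(n - 2);
    return memo[n];                          /* hit thereafter */
}
```

When "memory access" (looking things up in a book) is slow, you cache aggressively; when it is fast (a search engine), you let the cache stay small.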
When you memorize all the details of an instruction set, there is a qualitative difference when you code in it.
It's like the difference between being a stammering tourist in a foreign country who is constantly flipping through a dictionary to make elementary utterances, and a native speaker.
I first realized this many years ago after writing an emulator for the Motorola MC68010. Unrelated to that project, I had the occasion to write some assembly code in the same instruction set. Just, wow ...! It was like, I .. know ... everything ! (cue sound of thunder, lightning effect.)
You move a lot faster by memory. I regularly switch between Ruby, JavaScript, Python, and Scala at work and it's quite insane, I spend way too much time realizing language Y doesn't have feature X I use in language Z and looking it up vs rare moments of flow when I've been using the same language for more than a week or two.
> Why did you have to memorize everything? I have also done x86 assembly. You remember what you can. You look up the rest. Same today.
Looking stuff up took a lot longer then. It's the difference between firing up a search engine and typing words versus getting in your car, driving to the library, finding parking, walking to the door, finding the card catalog, finding the right card in the catalog, finding the right stack, finding the book on the shelf, scanning the book index, then finding the right page in the book. Memorizing had a better payoff back then.
Or think of it another way: looking stuff up today is like accessing L2 memory, and back then it was like a cache miss, where you had to take a trip out to disk before you could continue useful work.
Having an accurate and large knowledge of relevant API's will help you program faster and more accurately, and make better design decisions earlier. (You avoid the "if I had known about these API's, I'd have done it differently" situation.)
It's hard not to conclude that this is one axis in the multi-dimensional "ability" space.
Knowing APIs cuts down the time you take to implement something, and also helps you understand what and how you can achieve a goal.
Between a guy who needs to spend 5 minutes googling for a class/module/function and someone who promptly writes it down, there is no question who is the more capable programmer.
While I do not deny that memorization of APIs makes for more efficient programming, I would argue that, long term, readability, maintainability, and extensibility are far more useful time savers overall. Architecting a codebase which meets standards like these requires a different set of experience that comes with time and can't be trivially looked up online.
If I were to measure a programmer's ability, i.e., what makes a "good programmer", perhaps in the context of an interview, I would therefore not test rote knowledge.
The 2 sides debating "talent" are always talking about 2 different things.
(#1) "talent" doesn't exist: knowledge and skills can be learned. No infant comes out of the womb knowing 5 languages or physics equations or the syntax of "printf("hello world")". Every single human who knows how to do something well at one point in their life did not know how to do it well. If one can improve by learning, it means talent doesn't exist. This aligns with Carol Dweck's "growth mindset", the 10000 hours meme, self-improvement, etc. This internal perspective compares oneself at time_before vs time_after.
(#2) "talent" is a subjective ranking of people's abilities: there are people who are noticeably better at expressing their skills. E.g. NBA basketball player Shaq O'Neal has spent more than 10,000 hours practicing free throws with a dozen different coaches to achieve 52% success, but there's a high school kid that can sink them at 80%. We can say the kid is "more talented" at free throws. To say that "free throws" is a "learnable skill" doesn't change anything about noticing the obvious difference in abilities. If Person A learns faster than Person B, that in itself is a talent. This external perspective compares people against other people.
People who hold meaning #1 vs people thinking of meaning #2 are having 2 different conversations. E.g. when companies say "they are looking to hire the most talented" (meaning #2), they are not talking about people who can self-improve (meaning #1).
In this essay, Jacob Kaplan-Moss is talking about meaning #1. Yes, you can read it and take all his advice to heart. However, you still have to understand meaning #2 to properly decode what others are talking about when sports teams, music record labels, Hollywood, venture capitalists, and startups all say they are "looking for the best talent".
Lately, I've started thinking of programming as an art form. If you substitute programming with music, how would you make all of these evaluations about the craft and of its practitioners?
You may be hiring for a guitarist and require at least 3 years experience playing the guitar. J.S. Bach applies, but has never played the guitar. As an industry, we typically wouldn't hire him due to lack of experience. Instead we might hire some mediocre kid with the 3+ years experience and a history of guitar lessons, even though after 100 days on the job, J.S. Bach would have been a mind-blowing, mesmerizing guitar player, even if he was still a little clumsy. And after a few years, he'd be more amazing still. Yet as an industry, we focus on the ability to be productive on day 10 instead of day 100 or 1000. For long-term positions. Why?
How would the software industry conduct an interview for a musician? We'd ask you about all of the songs and instruments you've ever played and have you talk about those experiences. We might ask a hypothetical question about what if you had to play a certain piece, how would you handle it. Maybe we'd have you answer some trick questions: "What are the frequencies of the first five harmonic multiples of B right below middle C?"
I find this to be utter madness.
I think the best way to find a good (or potentially good) developer is to have them do what we do in our jobs, which is, given some vague requirements and some undefined period of time -- a few days or so -- write code to solve a simple problem.
And I loved the part in the article about mediocrity. I've found the best developers to be humble enough to constantly work on improving their craft. So, don't reject a candidate because they appear to "lack confidence".
The reason may be similar to why artists and musicians typically don't make money: money/power isn't that interesting to them. So the people who end up becoming the gatekeepers are those who can't see real talent, while those with real talent think the ones in it for the money aren't "real" enough.
The good but not greats are often really good at specific areas not necessarily tied to business concerns. The "greats" spend so much time on it that they end up really good at a bunch of things.
From a business perspective, where output needs to be measurable/controllable, you want someone either malleable enough to work on any kind of crap, or someone so good they can be good at anything.
The really good CSS person, the SVG animator, the functional JS guy will all be passed over in favor of the malleable new grad or the industry expert. Their only way forward is to market themselves and create a new domain of expertise by giving talks or writing books. The value is appreciated by the community, but not by companies.
I think your ideas are valid, but there are many potential candidates who cannot do any of the above because they have full time jobs that prohibit open source contributions without prior approval.
While on the surface of it, sure, programming is a skill and not an innate talent, it would be foolish to compare it to other career fields in that way. You can go to school to learn how to be an architect, then go and get a job as an architect and be immediately productive. You can go to school to learn how to lay bricks, then go to a job site and immediately start building something.
You cannot go to school to learn how to be a programmer, and then expect immediately to be productive at any arbitrary programming job, unless you're a talented coder. The education gap is too great. It's like medicine in which you need a few years of residency before the medical field considers you fully competent.
If you go to code school and learn Rails and React, then you can expect to be somewhat immediately productive on a Rails / React project. You can't expect anything out of them if it's not a Rails / React project. They may be able to ramp up quickly, if they're talented. If they're not talented, then they won't be able to, and the employer has to take the hit.
We will need a revolution in software development pedagogy before individual aptitude stops mattering to employers. The state of education at the moment is so bad that unschooled yet talented individuals can be much much better at fulfilling business requirements than trained and educated professionals. Try that with medicine!
> You can go to school to learn how to be an architect, then go and get a job as an architect and be immediately productive. You can go to school to learn how to lay bricks, then go to a job site and immediately start building something.
Do you actually know anyone who's worked in these fields? Because that's not true. Wanna be an architect? Go to school + get 2 years of experience + take a professional certification THEN you can be employed as an architect. Wanna lay bricks? Get a job and you'll be taught there.
I've never laid brick, but I've been an electrician. Tradesmen are almost never working independently, and certainly not when they are working in a junior position. There's a lot of mentorship that goes on, and a lot of tribal knowledge that's shared - things that are simply not written about.
When I started as an electrician, I knew a lot about electricity already. I've always been a bit of a nerd and knew more about how electricity actually works than many journeymen I met. I knew almost nothing about how to actually put that into practice, though.
For example - when wiring an electrical outlet, you have to wrap the end of the wire around a small screw. It's obvious enough that if you do it clockwise tightening the screw holds the wire in place; if you do it counterclockwise, tightening the screw moves the wire and makes life difficult. What wasn't obvious was that instead of taking a pair of needle-nosed pliers and bending the wire to the right shape, there was a small hole in each side of a pair of wire strippers that was exactly the right size to do it: http://www.kleintools.com/sites/all/product_assets/catalog_i...
Had the person I was working for not shown me that, I would have probably never figured it out.
I think the role of a junior developer is very similar to that of an apprentice electrician, in that the dev knows at least enough to implement something on their own so long as they are given concrete tasks. A mid-level dev is similar to the journeyman; they take a partially-defined task, break it into well-defined chunks, and either implement or delegate. A senior dev is the master electrician - they decide what tools to use, make the larger-scope decisions on a project, etc.
This analogy breaks down once we start talking about software "engineers" or "architects". I can't really think of anything I saw in the trades that maps to it; it's more about identifying and anticipating important changes over time and mitigating their costs.
While I totally agree with the things you said, and talent does play a factor, I feel like playing devil's advocate here a little bit and pointing out that you can replace every instance of "talented" with "dedicated" and what you said still works.
That's outside of an employee's ability to decide to be. The business has deadlines and needs, and writes the checks, so it's up to them to decide how long it takes for an employee to become a net asset for the company.
Right, but I meant more along the lines of dedication in that the employee spends their free time outside of work, studying, experimenting, and learning. This can drastically reduce the ramp up time and has nothing to do with natural ability.
A talented employee might be able to spend less time outside of work, but a dedicated one can be just as competent in the same amount of time. If you have a developer who has both, then obviously that would reduce the ramp up time even more.
I dunno, I've seen people try to do this, and they either really really struggle, or don't and never get to that point. People who aren't talented really do need their hands held, self study just won't ever get them there.
Being able to switch from a table saw to a miter saw isn't talent.
It's learning how to separate alike from different, using what you know to use something different. It's also realizing that languages vary, but you can learn the underlying concepts and apply them.
Being able to take general programming knowledge and apply it to multiple languages and projects is something you learn.
Talent may help you to learn the concepts and be more versatile, but that doesn't take away from the article's message of the typical distribution curve. This focus on "general programming skills" is fine, but let's not pretend it's "talent", because that's the whole myth!
> The state of education at the moment is so bad that unschooled yet talented individuals can be much much better at fulfilling business requirements than trained and educated professionals.
I'll try that in medicine. Talented midwives were delivering babies in the 1900s. Since midwifery is still a thing, maybe medicine is not as far advanced as you imply.
Seems like talent still plays a big role in any field.
The midwives who worked with my wife are full nurses with an additional two years of med school. They are very quick to point out that their main job is to recognize the 10% of the time when there is going to be trouble and get a better trained doctor involved.
Note that you can find "midwives" with less skill than the 1900s midwife. There are a lot of quacks out there doing awful things and getting away with it, because ~80% of the time you don't need any medical intervention.
Did you ever consider that the employee might be a smidge less likely to walk out the door if the employer invested in him or her, recognized his or her achievements in a timely manner with responsibility and authority (promotion) and compensated him or her fairly?
Seems to me employees "are likely to walk out the door" mainly because employers view them as expendable expenses and not humans trading labor for something of value.
Sure, but that still doesn't make it an investment, it just makes it a less risky bet. And betting on people being high-minded over being selfish has always been a bad one.
Well, yeah, that's what it means, right? If you pay me $50k per year and invest $20k of company resources to train me, and the second my training's up, I jump ship to a company promising to pay me $60k, wouldn't you feel ripped off?
Nobody these days would turn down a $10k raise out of company loyalty. They might for other reasons, but loyalty is absolutely not one of them anymore.
The right thing to do is to start transitioning the $20k you were spending training the new guy into salary as he acquires the skills.
It's stupid to spend the $20k, figure it's "done", and then think you can just go on paying the $50k after the fact even though the employee is clearly now worth $70k.
You want a $70k talent, you can either find one straight up or make one with some mix of lower salary + training until you have one.
If you hire someone for $50k, then train them with $20k, then give them a raise after the training is over for the $20k it takes to bring their comp up to market, you're out $90k. Accounting-wise, you just gifted the employee $20k on top of his comp.
Or you can just go out and hire for $70k and just be out $70k. This is what's already being done, shifting the cost of training and education onto the ones receiving the benefits of that training.
I'm trying hard to not post a diatribe about what's wrong with this kind of thinking and just point out the errors in reasoning that come from it, but this game of whack-a-mole is frustrating. You can't demand your employer to shoulder all the risks and grant unto you all the rewards. It's unethical and leads to underhanded dealing. We want a more professional labor marketplace, not one governed by promises and half-truths.
The employer's reward in this scenario is more than $20k in value provided by the newly trained employee. The risk, of course, is the employee leaving before the employer can fully realize the reward. It's interesting that you don't see it that way, and that you think it's an error in reasoning.
I don't know why I let myself get sidetracked into this worthless ideological discussion.
Final answer, then I'm getting back to work. Because when you take risks, you should expect a reward. If your reward isn't worth the risk, but you still take it anyway, then you're the dumb one. When you're the one offering liquid compensation, you get to decide on the risk profile of the bought solution.
E.g. if a company invests in a building for their office, there's always a risk that building will incur excessive maintenance expenses, become unusable because of some calamity, or that the location around the building will become a bad neighborhood and make that building a lot less useful.
I think a company can invest in, say, teaching their employees a valuable skill and then make money off that investment, even if the employee leaves eventually.
You can insure against risks. You can't against bets. Good luck getting an insurance company to sell you a policy covering training cost recovery in the event of employee-initiated turnover.
I feel like we're playing word games: "risk" vs. "bet."
The distinction you raise is that there (apparently) is no such thing as an insurance policy that protects a company against a valuable employee's departure.
But that doesn't mean "invest" is the wrong word to use when a company spends money to educate an employee.
Also, there are strategies aside from a literal insurance policy that companies can and do use to increase employee retention rates (just like there are strategies other than insurance policies that companies can use to protect a building against damage or loss of value).
Self-insurance against the risk of employee defections costing them time and money is exactly what employers are already doing, by not taking on the risk of educating and training employees.
You can hedge your bets in many different ways, with different assumptions about likely outcomes. For example, if you think it likely that your employee won't hang around for very long, then you can act by treating them as disposable. You could also choose to mitigate the risk of them leaving, if you thought you could be successful at it. Both approaches are forms of self-insurance, neither is foolproof, and both can lead to self-fulfilling prophecy problems.
Yes, but the difference between an investment and a bet involves the amount and type of risk. Investments generally don't disappear. Bets do.
Also the expected rewards are higher too. If a company bets $20k in an employee, the best they can hope for is that the employee does a good job. Not that that bet doubles in value.
> If a company bets $20k in an employee, the best they can hope for is that the employee does a good job. Not that that bet doubles in value.
What do you think it means from the company's perspective to "do a good job"? A developer who does a good job will most likely double that $20k in a few months.
Apprenticeship has been a thing as long as professions have existed. A common way to justify training an apprentice is to indenture them - they agree to work for you for a set number of years at a given wage, in exchange for training. Then you get professional organizations who negotiate on behalf of trainees/apprentices to make sure that the training wages are fair and the conditions aren't too onerous. Membership in the professional organization is contingent on keeping your indentureship agreements. Programming is such a young profession that we don't have any of that, and I think we need to evolve it.
Pro athletes are a terrible example. There has been decades of litigation, lockouts, negotiations, etc. across almost all leagues to get to where we are now and there are many that continue to think the system is too restrictive for the players and too much in a league's favor.
I'm surprised that no employers even offer this as an option when hiring.
There might be some legal barriers to this, like recovering the investment when the employee breaks the contract? Seems like some kind of insurance would help there, though.
I asked my last employer about this and they said that no contract can bind an employee to a company. Any contract would be one-sided in favor of the employee. Seems obvious when you think about it. Indentured servitude is wrong.
Maybe knowledge work is also special. Arguably medicine is a form of knowledge work. And it's not exactly a small or especially niche sector of the economy.
How good is the documentation and process on your team?
Any place I work, it’s easier to train my replacement than it was to train me. It allows me to pick up new projects or leave for a more interesting opportunity without guilt.
Work yourself out of a job.
It definitely means I’m easier to lay off. But I’ve survived 2 rounds of layoffs before and buddy, if you can figure out how to pay your mortgage without a front row seat to watch a company crumble around you then I highly recommend it. (Also only the 1st round offers a decent severance package, and survivor guilt is no fun).
Personal finances help with that. As do maintaining relationships with people.
I think this is absolutely spot on. So many people put their programming expertise or productivity down to some innate quality (that is apparently overabundant in white men). To me it seems nearly every one of those rockstar ninjas has also spent an enormous amount of time learning their trade - usually a lot of time outside of formal school or a 9-5 job.
I think if there's one skill that might be innate and that helps people be better at programming, it's patience. Learning to program and learning programming tools is for most people a boring and extremely frustrating task. You need patience to stick with it after spending two days solid trying to debug some linker error or CSS bug or whatever esoteric programming issue you inevitably have to deal with.
> So many people put their programming expertise or productivity down to some innate quality
Yes. Some people (both as kids and adults) seem to be innately interested in different kinds of activities. And it is perhaps impossible to separate talent from passionate interest, since the latter leads to the former.
Take the kids from my tribe, extremely passionate about patterns and things and systems and numbers and coding from early childhood, _despite_ concern from parents and teachers, social ostracization, and efforts to limit computer time. We also tend to acquire talent.
Yeah, but people also tend to spend time on things they're good at. Is programming solely about innate skill? No, of course not. There's a reason why the distinction between senior and junior developers exists. But are you going to put the time in necessary to develop into a senior developer if programming feels like mental torture?
These are all skills you can learn. And I think focusing on learning technologies obscures a lot of these skills, since they're about communication and listening and project management.
For productivity in terms of technological knowledge, I'm starting to think that, since choosing the right tool matters more than knowing how to use it, a better way to improve is to study case studies. Instead of trying to learn "how can I use Mongo, how can I use PostgreSQL", you try to learn "when should I use Mongo, when should I use PostgreSQL".
I'd love to hear about good sources of case studies. http://www.aosabook.org/en/index.html is good, but doesn't cover much of the kind of programming one would do at a company.
This is exactly correct. In the past, I didn't consider myself an excellent programmer, because I struggled to solve esoteric number permutation problems and recursive graph search problems during whiteboard interviews. It seemed to me that all my peers could do it, and they were getting good jobs for it.
Over time I learned that the real value comes from solving the right problems, and recognizing wasted effort. It comes from architecting systems that are easily maintainable. That's what makes money in the business.
Solving narrow, intensely interesting algorithmic problems is a rare case that is not widely needed in the industry at large.
> The truth is that programming isn't a passion or a talent, it is just a bunch of skills that can be learned.
Aren't these all talents?
* learning many things
* learning difficult things
* recalling things you've learned when you need them
If something is a skill, one would expect all engineers to get better at it with practice. But there are many abilities in engineers that don't seem to work that way.
Learning is a skill you can get better at, so it's just a dependency. Learning is not an unchangeable talent, and there are whole books on how to learn better.
Whether a trait is mutable is different from the degree to which that trait is mutable. The distinction is important. Most breeds of dog can be trained to learn some simple commands, but good luck getting some mutt from the pound to herd sheep like a border collie.
You know, I just don't think it's realistic to expect everyone to be able to learn anything. Certainly an expert in a field can come from anywhere, but not anyone can be an expert.
I don't think you can go through those books, however excellent, grab a random name from the phonebook, and then teach them how to do differential equations. Or to debug corrupted stacks in a multithreaded C++ program.
Learning how to pass exams can certainly be done. Whether, say, the essay part of the SAT (producing a formulaic essay using the approved pattern) is actually much use in deciding if you should get one of the scarce places on a Tier 1 uni STEM course - not so much.
They're definitely learnable skills. The thing I find with engineering is that most skills remain untrained with experience.
If years of experience meant you'd get better at a skill, then older drivers would all be able to reverse park easily. But without deliberate training, most skills don't improve.
Does anyone else disagree with this article? There is talent for programming, and that talent is correlated with IQ. A person with average or below average IQ is very unlikely to excel as a programmer. There is a lot of academic literature to back this up.
I think innate intelligence (whether measured by IQ or more qualitatively) is a real thing. I think people implicitly live that way and organize society that way.
I also think it's considered a faux pas to actually say as much. People tie respect, identity, and self-worth into intelligence a lot. It's hard to say Kimberly is smarter than David without offending people, let alone to say engineers are generally smarter than elevator operators (to pick a mostly extinct job).
To some degree, all the interview hazing and discussion about "programming talent" is all dancing around the West's inability to discuss intelligence quantitatively and dispassionately.
To be fair, historically there has been a lot of sexism, racism, bigotry, and pseudoscience tied up in the science of intelligence. I'm not sure how to get society to re-approach the subject in a scientific, productive, and beneficial way.
The sentiments in this article are all very nice and progressive, but... have you ever actually tried to interview people for a programming job? The simple fact is that lots of people, perhaps most people who apply for jobs as programmers, are completely helpless. They couldn't code their way out of a paper bag. It's actually really depressing interviewing them, because you want them to be successful and yet most of them fail miserably at the most basic things.
"programming" is not one thing.. most comments here fail to make even a basic distinction between [ php forms developers; dev-ops Go pipeline builders; game developers; domain-specific problem solvers] or other. Include in any of those categories varying, more or less GUI design, more or less quality, more or less efficient execution, more or less language mastery, more or less elegant design.
Fish apparently are not good at describing water (!)
All I'm seeing here is that regardless of actual dictionary definition I still won't be hired because of some vague opinionated definition in someone's head.
a. There isn't really a gap. Employers are just slow to adjust to market rates for quality engineers.
b. If engineer salaries went up to the market clearing rate, certain business models that assume certain labor costs will become flawed
c. Who is going to pay for the training, especially in the case that all the training doesn't actually end up making the candidate any good at the job?
It's like drawing. Sure maybe with a lot of effort anybody could draw a close approximation of a good drawing but unless you are innately talented and driven you will never be at a good enough level to make a great career at it.
Why do computer science curricula yield such a commonly toxic mindset? How can they engage greater diversity, cultivate empathy and compassion, and give students the perspective to evaluate themselves?
I understand why it's important to emphasize that programming skills can be learned and that we aren't just a bunch of god-given talents. But how far are we going to take this? Surely we can admit that for any given set of skills some people are going to take to it more readily than others, and we can't account for the difference simply in terms of amount of practice or hard work. The top mathematicians or violin players or whatever in the world probably do work harder than others in their field, but that's not the same as saying that the difference between them and others is simply the amount of work/practice.
I think his point was that there's a bell curve and chasing this idea that there is a load of top talent is absurd. There obviously are really great, amazing developers out there, but there are magnitudes more 'mediocre' developers that can still accomplish some really great, amazing products.