I disagree with this article, to some extent at least. Perhaps it's specific to trendy companies and startups, most of which are just looking for the next shiny object, true.
However, when you look at companies that move the needle in different industries, companies that have repute, market share and profitability, they couldn't care less what is trending these days. They look for domain expertise and excellence.
I have friends who work at Renaissance, the hedge fund. The company couldn't care less about your grasp of the latest ML framework, Keras or whatever, as long as you know what you are doing and are exceptionally good at it.
Having worked full time at Microsoft, I'd say the same goes there and at Apple, Oracle, even Google for the most part. They don't care about what is trending; just prove your worth.
I think this conclusion was drawn from the companies that make the most noise but are actually not major players in industry. The same companies that are hot for a minute until they meet their eventual demise.
The most robust, relevant and profitable companies out there basically say, 'F* trends, show us you're worth your salt'.
It's the hippie companies that ruin it all yet dictate social media conversation...for the 2 minutes their company is hot, then it dies.
It’s funny you use Renaissance as an example. They won’t give me the time of day because they “don’t hire people from finance”. Which is just as much of a bullshit reason as the ones from the start of the article.
> They won’t give me the time of day because they “don’t hire people from finance”
RenTech is a weird firm. The following is a fictional account.
Out of school (I was a finance/engineering double major), I interviewed at RenTech. They told me this was my last chance: that if I worked a day on Wall Street they wouldn't want me, because they didn't want that culture (I think it's more about ensuring you have no industry connections outside the firm). They're geographically isolated, encourage employees to take mortgages to buy local homes, and frown on industry interactions outside the firm. If you try to leave, they will enforce their non-competes (which are legal in New York) and sue your new employer [1].
I was wary of that need for control then. Today, I think it's morally wrong. Few people can predict what will matter to them ten years down the line. If I worked at RenTech today, I'd be depressed knowing (a) my work went to enrich the likes of Robert Mercer [2], and (b) there is no exit.
Renaissance Technologies has been around for decades and is one of the biggest hedge funds in the world. I don't think it qualifies as an early-stage company.
Re: renaissance. Of course. I'm highlighting where the 'we don't hire finance' trope originates. Nothing to do with Renaissance.
For all of you too young to remember:
In the 'dot-com' era - many 'new companies' still hired 'top down'. They would hire an 'executive team' first, and then maybe developers.
Often, a CFO etc..
Think PetStore.com - two MBAs hiring 'others to do the work, as workers do'.
The notion of 'all hands on deck founders' etc. was still novel.
Many people still wore suits.
So - 2000-ish - those attitudes evolved - and the 'CFO' for a very early stage company became obsolete. CFO/finance types are generally not required at the most early stages of a company wherein the issues are 'money in, money out, money in the bank'.
So, the slightly aggressive hipster/startup trope of 'we don't hire finance types' I think evolved essentially out of this new understanding of how early stage companies work.
I'm not disagreeing with your point, but I think it's a bit out of context here. The parent comment was specifically talking about what they heard from Renaissance Technologies, not some hipster startup.
How does it help to take 'no finance' out of context and apply a much more general comment that in no way fits the very specific original context?
Sure they may lose some great candidates that way, but the reality is that they don't want people in finance because:
1. They have preconceived notions about how finance "should" work
2. People not from finance are driven by other things
This is not to say that every person from finance falls under this, but it's a good way to weed them out without inviting a billion applications from people in finance.
So then by your own admission they’re not hiring the best due to this heuristic.
One thing my experience in HFT has taught me is that technical ability is only loosely correlated with profitability. So I wouldn't conflate Renaissance's success with their hiring practices.
>So then by your own admission they’re not hiring the best due to this heuristic.
Picking up on your use of the word heuristic, "hiring the best" is very much an optimization problem. Specifically, there's a signal detection problem inherent to hiring in which you want to maximize your hits (selecting the best candidates) while minimizing your false alarms (hiring someone who turns out to be a bad candidate).
There's another layer to this optimization problem, which is that you want to minimize costs, both in time and in money.
As such, there are bound to be many heuristics that are close-to-optimal. I don't know whether excluding people with a background in finance is one of them (in fact, I suspect it's not), but the use of such a heuristic is not prima facie absurd.
In fact, using (good) heuristics in hiring is a feature, not a bug.
The problem here is that the heuristic is (I think) bad.
Two things:
1. What was equally or more important than technical ability for HFT programmers in increasing profitability for the company? That was an interesting statement: can you elaborate?
2. That fund might have these heuristics that GP talks of, and for them, the people they hire may well be the best for them. As in, motivated and attitudinally oriented towards what they want, and more able to work free from the assumptions of _knowing_ how it's done in finance. They may be wrong in your case, but that's how heuristics work. To me, that's less of a bullshit case than trendy languages etc.
Trading always has this trade-off between (dumb and fast) and (smart and slow). In the past few years, I think it's been more beneficial to be fast enough, but very smart. So, in this sense, technical ability only takes you so far, since "smarts" come from domain expertise or, plainly, just creativity. Team composition matters, since your technical guy needs to be complemented. The objective, of course, is making your team take your organization towards the holy grail of (smart and fast). I'm not sure if this directly answers your question, but perhaps it describes the scenario in which technical ability isn't everything.
I do not think assembling a team for HFT is easy at all, especially now. There are a lot of good reasons for being very peculiar and selective. I am not sure industry outsiders who are just applying understand the dynamic or, in many cases, why a seemingly great candidate is rejected.
I don’t know much about the company, but a firm with extraordinary long term returns with a penchant for secrecy, cult-like hiring practices, and all of the employees invested in their extraordinary investment vehicle sounds sketchy to me.
You have never worked with someone from Oracle. They do not try to hire remotely the best; they hire cheap: anyone who can churn out barely functional code, use their customers to functionally test it, and often take half a fiscal year to provide a bug fix. Oracle is now more of a technical holding company than a software company. Aside from their DB, they merely pick up or purchase new companies and wring them out for revenue. Anything Oracle writes themselves is usually terrible.
> You have never worked with someone from Oracle. They do not try to hire remotely the best; they hire cheap: anyone who can churn out barely functional code, use their customers to functionally test it, and often take half a fiscal year to provide a bug fix.
They have a huge services division and that's the same formula used by all the big players. It's a numbers game for billing and they shoot to make it work by having a few highly competent people cover up for the C- players.
> Oracle is now more of a technical holding company than a software company. Aside from their DB, they merely pick up or purchase new companies and wring them out for revenue. Anything Oracle writes themselves is usually terrible.
Spot on. The database is incredible, though the use cases I'd recommend it for over open-source alternatives have shrunk to nearly nothing. Outside of that, the Oracle app landscape (Financials, HR, ...) is hilariously terrible, especially factoring in the prices paid for it.
This is something I've heard over the years, but I'm not sure it still holds true. (I work with Oracle databases daily.)
Oracle DBMSes are very robust, to be sure, and you can expect them to run forever. But in my mind other databases seem to have caught up, and there really isn't a compelling reason to choose Oracle over other commercial databases anymore (also, Oracle pricing is a big deterrent).
In terms of performance, Oracle's licensing prohibits any benchmarking, but anecdotally I haven't found it to be particularly performant for most of my queries. Oracle used to be known for the innovative under-the-hood algorithmic improvements to the database, but lately I haven't seen anything too exciting, whereas SQL Server is getting better every year with new innovations (columnstore indices, in-memory features, Polybase, in-database Python/R computations, etc.).
> use cases I'd recommend it for over open-source alternatives have shrunk to nearly nothing.
I agree, though I would say for heavy transactional use cases, I would still choose a commercial database over an open-source one. However, among the commercial databases out there, Oracle would be my last choice. It has too much legacy crud that has to be worked around. The only reason Oracle is still around is because (1) in the enterprise, no one gets fired for buying Oracle, and (2) fungible expertise in their services organization ensures business continuity, albeit at a lowest-common-denominator level.
Yes. SQL Server and others used to have the same terms, but they were later removed.
The original intent was to prevent non-neutral third-party benchmarking by biased agents that sought to discredit their products through contrived setups (which admittedly can be a problem for any product), but to enforce it through licensing seems a bit heavy handed. Oracle does allow benchmarks that are favorable to them (see TPC-C benchmarks).
I noticed you said "nearly nothing", what are some exceptions? I'm curious why anyone would start using Oracle now. The only reason I would see to use it is because you are reasonably dependent on it already and cannot migrate easily.
> I noticed you said "nearly nothing", what are some exceptions? I'm curious why anyone would start using Oracle now. The only reason I would see to use it is because you are reasonably dependent on it already and cannot migrate easily.
Ultra high end OLTP system leveraging Oracle RAC. It's a very specific use case that goes beyond basic replication where you require ACID compliance, HA, multi master (in this case via shared disks and distributed locks), all atop an MVCC database. The MVCC implementation of Oracle does not require a VACUUM style operation which is another plus for a 24/7 environment.
I know of a couple financial services companies that have this type of setup though I've yet to find one that (IMHO) really justifies it. In all cases they've got boatloads of money to throw at a problem and the guys in charge of making the tech decisions don't mind having one of those boats sail off to Larry Ellison.
Interestingly even in this use case I believe PostgreSQL would be my choice, although as you said it is a silly setup. I had not taken into account particularly narrow implementations that rely on specific Oracle pieces, I could see some going that way if that's the case.
My guess - government or ultra-big-corporation orders for very large sums, excessive for the required task. Part of the sum goes straight into the pockets of the people in charge of ordering it. This won't work in all countries, but in some it certainly will. Same with SAP and other overpriced solutions.
Other idea - "prestige": "We are using top-end solutions, unlike the lowly startups; we are serious businessmen, see."
> From what I saw Hyperion FM got relatively better after the Oracle acquisition.
In my experience the opposite seems to be true. I have heard people describe Hyperion as "pretty good" prior to the acquisition (this was a while ago, before my time). Today, if I talk to anyone about it, sentiments range from negative to very negative.
Almost every Oracle acquisition has suffered the same fate.
Counterpoint: I currently work with someone from Oracle, and they are one of the best programmers I know. "Usually terrible" doesn't mean "always terrible", which is the point of the article.
Technically the same, just with different start dates. But there's a catch: suppose your fiscal year starts Oct 1st, and you request help on August 15th. The answer is often "we're at the end of the fiscal year, but we have approval to start work on that in October".
So if it takes half a fiscal year, depending on when you first say you need it, it can be much longer.
Except all of those companies put an unreasonable amount of weight on degrees. Good luck getting Google to talk to you if you didn't graduate from a top college, let alone if you don't have a degree at all.
The people I know who never got a CS education yet ended up programming got their break interviewing at Google. Google seems more comparatively willing to hire those without degrees since there's so much confidence in the interview process.
Not true. I've got a Ph.D. in theoretical physics, am very math heavy, created a number of algos for my work over the last 25+ years (in physics sim, bioinformatics, systems management/orchestration, etc.), run sessions at an ACM conference, yadda yadda yadda.
Two google interviews, and nothing. From what I hear from other people I consider way smarter than I, they also got nothing.
Google has a much-copied process, but as a creator of something of huge value once noted, mebbe their filter isn't quite as good as they think it is. Talking to a number of absolutely brilliant engineers who didn't get hired, it likely has nothing whatsoever to do with talent, algo knowledge, mathematics, etc. There are other factors.
Being a guy above 40 probably didn't help me; Google and others seem to have lots of trouble with ageism.
Dan's article was not specifically about being the "top", rather, what does the "top" mean in context, and how do people judge. What is the opportunity cost of doing this? As he points out, as I point out, it can be very high.
The smartest programmer I met in my first decade of work was a person with a high school diploma. No college degree. The guy was brilliant, personable, humble. He is quite successful now, and still doesn't have degrees. Chances are he doesn't have formal education around the math/algos, but he has picked up everything he knows along the way.
At the end of the day, hiring is something of a crap-shoot. Past performance is not a guarantee of future performance, either negative or positive. You are after passion, intelligence, fit, experience if it exists (re-inventing wheels can be time consuming/expensive if you are forced to do it, and getting a guide who has been down that path can save you making some mistakes/time/money).
I know people are telling themselves that google has a good process, but honestly, it looks like it enforces homogeneity more than it brings in needed talent. I am not sure this is a good thing. Poor replication of their processes is rampant throughout the industry. I am not convinced this leads to positive outcomes.
Just kiddin'. I think, inverting the binary tree probably means mirroring it. I had an interesting Google interview as well a few years ago where I aced the automated coding test but then the first human interviewer didn't get why I said that regular expressions run in linear time :) Our background was just very different.
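For what it's worth, "inverting" in the mirroring sense is a short recursive swap. A minimal sketch (the node shape and names here are my own, not from any actual interview):

```javascript
// Mirror a binary tree by recursively swapping left and right children.
function invert(node) {
  if (node === null) return null;
  const left = invert(node.left);
  const right = invert(node.right);
  node.left = right;
  node.right = left;
  return node;
}

// Tiny demo: 1 with children (2, 3) becomes 1 with children (3, 2).
const tree = {
  value: 1,
  left: { value: 2, left: null, right: null },
  right: { value: 3, left: null, right: null },
};
invert(tree);
console.log(tree.left.value, tree.right.value); // 3 2
```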
> I've got a Ph.D. in theoretical physics, am very math heavy, created a number of algos for my work over the last 25+ years (in physics sim, bioinformatics, systems management/orchestration, etc.), run sessions at an ACM conference, yadda yadda yadda.
> Two google interviews, and nothing. From what I hear from other people I consider way smarter than I, they also got nothing.
Then you and your friends weren't fluent enough with algorithms. That is the point, they don't care about all of your degrees, years of experience, conferences etc, they care about your fluency with maths and algorithms. This means that even a person with a shitty background can get hired at Google while a person with a stellar background gets rejected. Should you have gotten hired? Probably, but their system lets them find a lot of diamonds in the rough who wouldn't get hired anywhere else which is why they use it.
> Then you and your friends weren't fluent enough with algorithms. That is the point, they don't care about all of your degrees, years of experience, conferences etc, they care about your fluency with maths and algorithms.
I didn't fail those portions. Actually did quite well on them. So did my friends.
You are making a number of invalid assumptions, starting from the assumption that their processes are fundamentally accurate or correct. My supposition is from the viewpoint that all systems are fundamentally flawed, and the goal is to minimize risk associated with a flawed system.
I know it is generally hard to acknowledge that google does things wrong, but ... IMO (and I am fairly sure I am not alone here) ... they have a number of significant issues that they haven't quite moved past yet, and this is one of them. Remember, they started out with brain teasers, and school pedigree. The new system isn't demonstrably better IMO, but it helps them convince themselves that it is.
> I didn't fail those portions. Actually did quite well on them. So did my friends.
Then I don't see your point, what are you saying caused you to fail? I have a physics degree from an unknown school, learned to code in my thirties and got a job at Google by just doing well at their algorithms and maths questions so it is definitely possible to get in without ticking any of the hip boxes.
His point is that Google's and everyone else's hiring process is subject to large amounts of randomness and capriciousness. Google themselves have at various times mentioned how their hiring scores don't strongly correlate to performance.
Don't feel because you got in that you are some ordained snowflake. If you had interviewed on another day or with another group within Google you very possibly wouldn't have gotten in.
There are many variables at work when it comes to getting hired and hiring.
they care about your fluency with maths and algorithms
"Maths" is a red herring -- a physics PhD who's still active in academia will definitely be very fluent in maths.
It's all about "algorithms", but I think a lot of software people have tunnel vision about that. There's a lot of fancy terminology you pick up in a CS degree; requiring people to know it filters out a lot of potentially good candidates, unless they've studied CS in their own time.
That's fine if the special CS terminology is absolutely essential for all programmers. But is it really? Realistically, 90% or more of your time as a programmer is spent working on other stuff (automation, testing, designing friendly APIs, catching sneaky bugs, scripting, just generally plumbing stuff together). If you're on a team, does every single team member need to have a great understanding of data structures? Or is it just nice-to-have, specialized knowledge?
> That's fine if the special CS terminology is absolutely essential for all programmers. But is it really?
I'd argue that it matters more for Google more than most employers. The combo of their scale, combined with their large amount of custom infrastructure, combined with their desire to be able to retask engineers on a whim, means that individual engineers will have pretty good chance of touching code where the choice of Big O could make or break a product.
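A toy illustration of what "choice of Big O" means here (my own example, not from the thread): deduplicating a list with a nested scan is O(n²), while a Set does it in roughly O(n) - a difference that only bites at large scale.

```javascript
// O(n^2): for each item, scan everything collected so far.
function dedupQuadratic(items) {
  const out = [];
  for (const x of items) {
    if (!out.includes(x)) out.push(x); // includes() is itself a linear scan
  }
  return out;
}

// O(n): a Set gives amortized constant-time membership checks,
// and iterates in insertion order, so the result matches the scan version.
function dedupLinear(items) {
  return [...new Set(items)];
}

console.log(dedupLinear([1, 2, 2, 3, 1])); // [ 1, 2, 3 ]
```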
Mayyybe, I'm not so sure but you could be right. [Edit to add: even at Google, most programmers are not doing that kind of stuff most of the time.]
On the flip side, though, I think many programmers (even programmers who are up to date on their CS) are fairly weak at mathematics. We think we're good because we can, you know, invert a binary tree, but how about figuring out an appropriate filter to smooth some data, or verifying that some randomized process is unbiased?
For something like digital filters, if you have basic knowledge you can just look up the details on Wikipedia. But the same applies to data structures and big-O!
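To make the smoothing example concrete, even the simplest filter, a trailing moving average, is the kind of thing meant. A sketch (my own, under the assumption of a plain numeric array):

```javascript
// Trailing moving average: each output is the mean of the current sample
// and up to (window - 1) samples before it. Smooths out spikes in the data.
function movingAverage(data, window) {
  const out = [];
  let sum = 0;
  for (let i = 0; i < data.length; i++) {
    sum += data[i];
    if (i >= window) sum -= data[i - window]; // drop the sample leaving the window
    out.push(sum / Math.min(i + 1, window));
  }
  return out;
}

// A spike of 10 in flat data gets spread across the window.
console.log(movingAverage([1, 1, 10, 1, 1], 2)); // [ 1, 1, 5.5, 5.5, 1 ]
```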
A lot of companies (including Google) can benefit strongly from people with good maths or stats skills. Do those people also need to be strong in CS? Or if not, do they need to be siloed into a separate hiring process, and placed in separate departments?
I reckon CS, maths, stats and other specialized academic training should all be treated as nice-to-have skills, of varying importance depending on the team balance and project requirements.
When Microsoft hired me in 2006 they didn’t even ask about my lack of degree. I’ve also interviewed at Google and Amazon. Amazon didn’t ask either. Google probably asked, but it didn’t disqualify me.
Do you think the difference now is that the market is more saturated with developers since it started to become a little more of a hot-ticket, and carries less of the old 'nerd' stigma?
Also, I'm curious about how you managed to even get your foot in the door at such large organizations and not just get swept out by a filter at the gate. I'm assuming, of course, that such large companies employ some kind of applicant tracking system (an assumption which may be in error).
Large companies need a constant supply of new talent, so they develop multiple filters that work in parallel. Being cut by one filter doesn't mean you won't be grabbed by another.
I never perceived there to be any degree filter at the BigCos, or less so than at narrower software companies. In house recruiters from every big5 have contacted me regularly through LinkedIn for as long as I can remember, and still do.
I didn’t graduate from a top college and google has pinged me numerous times to interview. I always turn them down because I like my company and do not want to move. I may also fail the google interview, but they certainly talk to me.
I didn't graduate from college, period, yet recruiters talk to me. It's funny how quickly they backpedal their interview offers when I mention that, however.
Most of my SWE coworkers at Google did not graduate from a "top college". Not sure where you're getting this impression from. There simply aren't enough annual graduates of the top programs to fully populate all tech companies, and there are plenty of people from other schools (like me) who also do well.
It is, however, a filter that will be applied against applicants, and from what I've heard, it's more than enough to trigger the "we'd rather have false negatives than false positives" filter if you're not in the lucky minority.
Even if it's never used as a filter by Google, their hiring practices just don't apply to the rest of the industry - Google can afford to pass on 99.9% of the high quality talent that applies. Other firms who try this are only crippling themselves.
I strongly disagree. Most of my coworkers are not from "top colleges". Neither are most of the people I interview. Even for Google, it is tough getting interviewees who do very well. Given that, it'd be ludicrous to discard most interviewees out of the bat because they didn't come from a "top school". The interview questions and relevant job experience are the true hiring bar. Possession of a relevant degree, and the granting institution for said degree, is far down the list.
"Except all of those companies put an unreasonable amount of weight on degrees"
I don't think it's 'unreasonable' at all to put strong emphasis on education.
Surely, at the end of the day it's possible to be great without it, but having a good education is pretty strongly correlated with so many things.
'Being cool', I don't think, is correlated with much at all - unless it's super consumer-facing and inner culture is going to have to match outer culture on some level.
Touched my first computer almost exactly 40 years ago.
I can troubleshoot most hw/os issues with one ear and eyeball tied behind my back.
Can program decently in many general domains.
Never went past my associate's because the work was more compelling.
Knowing how many PhD's never work in their field, and how many Master's get earned and laid aside, I can't agree with you at all.
I think you might be imagining that correlation.
" I think you might be imagining that correlation."
If that were the case, the entire Silicon Valley would have missed this, because it's not 'my' correlation.
People who study CS are probably more likely to be better at CS than those who have not.
It doesn't prove or mean anything in general, and your personal situation is not relevant: of course there are tons of 'non-degreed' great techies out there. Nobody is denying that.
I've hired a lot of people, and there's no doubt that you get better luck with degreed candidates than non-degreed - and even school rankings matter.
In Canada, for example, you get consistently strong tech recruits out of U Waterloo. Impressively so. Not always but usually. Other schools in the region - much more hit and miss.
There are advantages to going to college over a boot camp or self learning. In fact, CS and SE degrees are ones that pay for themselves. If you're not willing to put in the work, that tells me you won't last long at Google.
I had to leave college early due to a family thing and start working to contribute. That was eighteen years ago, and I've put in so much work because I run into attitudes like yours.
Of course, in the past twenty-four hours I've created an app and potential side-business from a technology stack I had absolutely no experience with, just to prove to a potential employer that I can do 'front-end' stuff for them.
The average developer can't even CSS, don't even play with that 'degrees are the ultimate arbiter' mess.
What do you mean by that? How sophisticated is it to "do CSS"? When I've dealt with it, you basically look at the inspector and at your CSS libraries of preexisting classes, and basically do a combination of attaching classes and writing new ones with custom styles until you get it all to look right.
He means do it well. There's a lot to doing CSS well, and frankly I just don't have any interest in learning all the ins and outs. It's not a fun platform to work in, it's very fiddly, and it's not exactly programming either.
There's an art-form to doing amazing pixel perfect web designs, and I really respect the people who are good at it. That's not me though, and it's never going to be me.
I had run into a similar situation, though less time has passed than in your case.
I've had to do similar things to find work: (started young out of interest and put it aside for a few years, then) spent time volunteering (UNOV), freelanced, did voluntary freelance work for small businesses/local orgs, built pet projects, and just built things in general and studied.
Maybe it prepares one better as an engineer than a scientist in the field — the application of a science vs the theoretical and experimental work in developing the science.
---
To expand on the parent subject...
I've had interviews since where I've been able to lay out some of the things I've designed and built, both independently and while working for a major media company, and still had interviews last hours upon hours only to never receive a call notifying me they'd decided to go another route, all over some odd questions with no definite answer, like:
Implement a poly-fill for bind by extending the Function object
Or,
Explain what this CSS does ("if you don't know that's okay, but don't get it wrong – that's 'bad'."):
.box {
display: flex;
min-width: 1024px;
min-width: 52em;
margin: 0;
padding: 10px 15px 10px 15px;
}
.box--item {
flex-shrink: 2
}
.button {
appearance: none;
border-radius: 3px;
background-color: blue;
}
.button--green {
background-color: green;
}
In the case of the CSS, no DOM context was presented, and the class names I'm giving provide far greater context than the test I'm referring to did...
Only to have every answer given either go unremarked-upon or just be told "no", even though the answer could not have been definitively wrong. It was just not the preferred answer.
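For what it's worth, the bind question has a fairly standard shape. A minimal sketch (named `myBind` to make clear it's my own illustration; real polyfills also handle calling the bound function with `new`):

```javascript
// Minimal Function.prototype.bind-style polyfill sketch: fixes `this`
// and any preset arguments, returning a new function.
if (!Function.prototype.myBind) {
  Function.prototype.myBind = function (thisArg, ...preset) {
    const fn = this; // the function myBind was called on
    return function (...args) {
      return fn.apply(thisArg, [...preset, ...args]);
    };
  };
}

function greet(greeting, name) { return greeting + ', ' + name + '!'; }
const hello = greet.myBind(null, 'Hello'); // partially apply the greeting
console.log(hello('world')); // Hello, world!
```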
I was also informed you would be paid more just for having a Masters or PhD, regardless of your contribution. They had graphs and were happy to show me.
I'm talking smaller, trendier companies here though. Buzz words and egos seem to abound. It never made me bitter, but it just seemed so bizarre-o. Not really for me.
> Of course, in the past twenty-four hours I've created an app and potential side-business from a technology stack I had absolutely no experience with, just to prove to a potential employer that I can do 'front-end' stuff for them.
This tells me you're wasting your time and potential. There are other ways to get a job.
They didn't request it, I just didn't have any direct professional experience with React/Redux or the Python ecosystem, which are the lion's share of front-end positions in my area. I've gotten a lot of immediate 'passes' from recruiters unfortunately.
Thanks for your reply though, I agree with you but then I run into reality and the need to pay bills.
Where exactly did I say that was not the case? There are plenty of options for getting a college degree, and there are ways to get a job at these companies without one. However, if you're going to try to get one by sending in your resume, expect to get turned down without a degree.
Beyond that, software development is plagued by this mindset that college degrees are useless and that only self-taught people are worth looking into. I've worked with many college-educated people who can't write good code, and I've worked with many self-taught people who really don't understand what they're doing. The best have always been people who went to school and then continued learning after the fact.
>Beyond that, software development is plagued by this mindset that college degrees are useless and that only self taught people are worth looking into.
Ugh, yes. I've worked for a leading SIM and smart card manufacturer and they required a 4-year degree to even get in the door. I interviewed countless people, including some who were able to answer some quite technical questions that others couldn't. And many of those had to get disqualified since we found out during the interview process that they didn't have a degree. There were so many degree holders there that didn't know what the hell they were doing, and it was obvious.
One doesn't prove the other, that's my point. The only thing that can be said is that a degree doesn't prove if someone fits a position or not, it only proves they've finished an education.
I've also been in the field for many years and held several interviews. My takeaway, to put it a bit clumsily: "knowledge is easy to fix; mindset and experience, not as easy."
As I told someone over Twitter the other day: mindset isn't something that can be readily taught, which is why STEM is still so in-demand with a gross labor shortage. For most people it's something you either innately have or don't have.
But yeah. I was reinforcing your point, not being counter.
I was referring to "If you're not willing to put in the work...". Some people are more than willing to put in the work but there are things in life that might hinder them from doing so at the time. People take a break in their studies for (various) personal reasons and some end up working in the field despite never finishing to get a degree.
"...expect to get turned down without a degree" and
"...software development is plagued by this mindset that college degrees are useless ..."
My point is that there's a middle way. Only looking at degrees is stupid because it doesn't guarantee anything other than that they got a degree. With that in mind, it doesn't make sense to instantly disqualify people without a degree either.
I would like to meet the person who knows everything! I have met some 20-year-olds who think they do! But as the old saying goes, "The more I learn, the less I know."
I have two kids getting their CS degrees right now at top-50 colleges, and I am not impressed.
A friend of my son goes to UNFS (North Carolina State Film School). For four years he makes films. After the first year he has to choose between photography, directing, writing, etc. Then he spends 3 years fine-tuning his craft. Great program!
CS programs seem to be a little of this and a little of that, none of it coordinated. Quality of teaching varies: occasionally a great teacher or TA, much more frequently poor ones.
I remember, at 14 or 15, seeing some of my older cousin's course notes from his computing course at UMIST (one of the 4 good places to do computing then) and thinking it wasn't much different from what I was doing in my CSE class (and CSEs were the vocational track for those expected to leave high school at 16).
I'm also self-taught, but I don't see a degree as valueless. There's a lot of concentrated knowledge to be had in a hurry from a good (probably even a decent) program, and while I've yet to run into a case where I could not pick up what I needed on the fly, I certainly have many times felt it would be an enormous timesaver, and thus make me more efficiently able to do things I enjoy doing and that get me paid, to have had that knowledge preloaded.
On the other hand, I didn't start my adult life a few dozen grand into the red with student loan debt, which, from what I gather through long acquaintance with many who did, has very considerable advantages of its own. So I'd have to call it a tradeoff - but, then, that's my whole point: it's not accurate to say that either option is strictly preferable to the other.
I'm self taught (as in, I've taught myself a lot outside of school, in fact I left school early and only finished my degree after several years in the industry), and I still learned quite a bit from my computer science degree.
It forced me to study a wide variety of things I probably wouldn't have spent much time on if left on my own (e.g. OS programming, writing my own compiler, Prolog, finite automata, assembly programming, etc).
Also did A.I., 3d graphics, network programming, and various other things, but I probably would have learned that on my own to a certain point (I developed video games for awhile).
Pretty much. Managed to work on about eight games in a row that didn't make back their investment for three separate companies while I was in the game industry. They failed for various reasons, and a few of them really should have been successful, in my opinion, but oh well. Bad timing (releasing at the same time as heavy hitters), bad marketing, bad luck with reviewers, getting screwed by platform holders, aiming at the wrong audience, bad choice of difficulty, bad choice of which idea to pursue, technical issues that weren't in the testing environment and not discovered until release, overly restrictive and expensive update patching policies with platform holders, all sorts of fun reasons.
I assume you are talking about US undergraduate degree (if not, can you clarify). If so, most good colleges give you ample opportunities for additional learning -- go to grad classes, do research -- those avenues are usually very easy to open.
I probably agree that for each student at least one of those 4 years is usually wasted; but this is different from the whole program being a waste for a person.
Yes the US undergraduate degree. I have gotten into some research and got picked up for an exceptional internship which have been good. It doesn't change that I could have passed out of all the actual curriculum on day one though.
I guess "community taught" would be better than "self taught" since I learned from so many people online. But I had been programming for 10 years before school, and while I was in the military (over 6 years) I took advantage of many MOOCs from MIT, Stanford, UNSW, etc. and many books from Knuth to most of the No Starch Press library which I understand is not the case for traditional students. It is frustrating though, especially not being just out of high school.
If you're taking MOOCs from MIT, Stanford, UNSW and others can you really say that university CS curricula taught you nothing? They probably taught you quite a bit since, well, you took their courses.
You were just overprepared for an undergraduate degree by the time you got around to trying for it. For anyone else taking a similar path, specifically the military part: you can often knock out an associate's degree with that amount of military experience. Do it. You can skip most of the core curriculum, focus on the CS part, be done in 2-3 years (less if summer courses are available), and be in grad school by your 3rd or 4th year of full-time college education.
Yes the Navy "trained" me by saying "here are 20,000 pages of documentation on the E-2 Airborne Early Warning aircraft avionics, now go fix it".
That has nothing to do with my CS studies, though; I learned nothing about CS from college and could have CLEPed a BS degree on day one if that were possible, then moved on to a master's where I should have been placed initially.
You ever notice how most people never tell you what state they're from unless you ask them? It's almost like they don't believe their state of origin to be all that important to their personal identity. Some of those people might even be from Texas.
But the hombre from Texas is not one of those people. He will drive for 30 minutes, past dozens of ordinary, mundane restaurants, to reach the nearest Texas-style barbecue restaurant, run by a fellow expatriate Texan, who shoulders the burden of living outside of Texas for the sake of fellow Texans who tragically cannot be in the best state all the time, yet need to regularly consume bits of Texas in order to survive.
Ironically, Texas has so many people in it, all constantly surrounded on all sides by Texas, that you cannot easily identify that guy until he actually leaves the state.
Also, the thing that's wrong with Texas is football. That is easily 120% of my problem with Texas.
How many classes did you take in college? Because my first few CS classes I didn't learn a whole lot from either. It wasn't really until the 200 and 300 level classes that I started learning anything substantial that I hadn't picked up on my own beforehand.
I only have a couple classes left. I had already studied it all before starting, materials up to graduate level are mostly available openly online now. I did expect more from the 300-400 level classes and lost a lot of motivation when I realized they were not as advanced as I expected.
I did take some 400-level electives completely outside what I'm interested in and hadn't studied, which were great though. One professor in those classes asked at the beginning what each person expected from the course; I mentioned what I'm actually specializing in, and he consistently gave examples of applying the material to what I do, which was excellent.
If waiting for 2-3 years before you're able to learn anything substantial is considered normal - and we're not talking about monasteries and such - then I think there is something seriously wrong with the system.
I had already written various types of software, taken a class in high school, gone to summer camps, etc., before I reached freshman year of college. There were plenty of students who had never even touched a programming language before, and it was all brand new to them.
And I still learned things my first year (especially in ancillary classes, I only took three computer classes my freshman year), I just didn't get much out of CS classes until sophomore year. One of those classes was a gimme class I really should have tested out of ahead of time in hindsight, though (basically computers 101, I had quizzes on identifying what was the desktop, mouse and monitor).
You should read all the context before stating that this comment thread means the system is broken. Just because extremely self-motivated individuals now have readily available resources to learn almost anything on their own does not mean the system is broken for everyone else.
I have no college degree, or college classes at all, and have been doing systems engineering/software engineering for about 13 years now. All self taught/learning on the job.
While it is nice not to have had college loan debt, I know there are things I might have a better grasp of had I spent months learning a particular topic. In fact, I am jealous of people who actually got to spend time being taught computer science and learning interesting things. I am sure there are topics/approaches/patterns that would be very helpful. Not that I am complaining, things certainly have worked out very well for me; it is just something else that I know would "boost" what I already know.
On the flip side, I know of one job that was a very short interview, as they said they could only offer X (which was laughably less than what I was making currently) because I had no degree. What I knew meant very little (to the place, not the interviewers), having the piece of paper meant more. You could say "their loss" but on the flip side, I have no idea what opportunities I have missed because of it.
I think you're mostly right, though. I think the article is probably right that companies who _say_ "we only hire the best" don't necessarily even hire good programmers. I don't think the companies you mentioned really care if they have the best, especially since that's pretty hard to measure anyway. They want people that are going to do great job in the area they were hired to work in.
You seem to be not exactly disagreeing with the article, but maybe tightening the slack on what we would call a "trendy company." The author did not specify any successes of the "trendy companies" being discussed (I'm not including fundraising as a success of the company), but they did specify some failures, and left the options open for more. The author, like you, praised Google for not hiring on the basis of "trendiness." I think you're in more agreement than disagreement.
The article’s title/headline could be adjusted to more accurately represent the claims made in it.
I wanted to post a snarky response to your comment, and so I went to the Renaissance web page to look at their vacancies. And lo and behold, no Keras or Kafka in sight. I'm impressed. That's certainly not true for the Facebooks and the Apples of the world.
Also I think you mean 'hipster' instead of 'hippie'.
But - there are a lot of said 'hipster companies' in the Valley, moreover, the cult has spread beyond: it exists even in the copycat cities of Montreal, Van, NYC, Austin, Boulder, yada yada.
I'm well into my 30s, and the last few startups I've consulted with -- both in the Valley but not well known -- were undeniably trying to be too hip.
I'm as cool as a late-30-something can try to be without losing any dignity :) but I felt like Grampa Simpson (note my antiquated pop culture reference).
As far as I can tell it's spread across the whole country. At the very least, it's definitely present at startups in Chicago, where I'm at, and Chicago doesn't have that strong of a startup scene (it does have one, but it's pretty small compared to other metro areas).
Most SWE interviews at Google take place in either Java or C#. These aren't sexy, trendy languages. They're boring line-of-business languages that just work. And the frameworks you already know matters very little.
At my in-person interview at Google, they told me I could use whatever I felt most comfortable with, which at that point in time was Objective-C.
In retrospect it was a terrible idea. That language is stupidly verbose, and I ran out of whiteboard for every single question I had there. They were mainly interested in me for iOS development, though. They did not give me an offer, although I don't think it was for that reason (I was a little nervous, and two interviewers gave me some major head-scratchers).
It's definitely true that some languages are better than others for interviews, because the types of problems and the time limits are the same regardless of the language chosen. Lower-level languages tend to be worse because they take longer and there are more ways to trip yourself up.
Personally, I think that C# is the ideal language for interviewing in, as the .NET standard library is very powerful. SortedDictionary alone can easily polish off entire classes of interview questions, and the same can be said for pretty simple LINQ statements. I actually solved one problem so trivially using a combination of the right data structure and LINQ that the interviewer asked me to provide an alternate imperative implementation of the LINQ statement, just so he could see that I actually knew how to implement something like that rather than just use it.
You can approach similar levels of power with use of Java 8's Streams APIs and a good collections library like Guava, but Guava is only likely to be understood by a majority of interviewers at Google, whereas the C# standard library should be understood by a majority of C# interviewers everywhere.
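To make that concrete, here's a quick Java sketch of the kind of thing I mean, using only the Java 8 standard library (no Guava). The problem is a made-up but typical interview question I'm using for illustration: return the k most frequent words, ties broken alphabetically. With Streams and Collectors it collapses into a handful of library calls.

```java
import java.util.*;
import java.util.stream.*;

public class TopKWords {
    // Hypothetical example problem: k most frequent words,
    // ties broken alphabetically.
    static List<String> topK(List<String> words, int k) {
        // Count occurrences of each word.
        Map<String, Long> counts = words.stream()
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
        // Sort by count descending, then alphabetically, and take the first k.
        return counts.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue(Comparator.reverseOrder())
                        .thenComparing(Map.Entry.comparingByKey()))
                .limit(k)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> words = Arrays.asList("go", "java", "go", "c", "java", "go");
        System.out.println(topK(words, 2)); // [go, java]
    }
}
```

The imperative version the interviewer would ask for is maybe 25 lines of HashMap-plus-sorting code, which is exactly the point: knowing the library turns it into a two-statement answer.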
It's absolutely worth spending a few hours brushing up on a "better" interviewing language and then using that rather than using a less optimal language just because it happens to be all you've used recently.
I would probably use Python on a whiteboard nowadays, even though at my day job I use C# and only use Python periodically in my free time at the moment. Python is a lot more concise in general, although C# is smooth like butter when paired with Visual Studio, and I could use it if requested.
I think my views here might be biased since the only people I see interviewing in Python tend to be bootcamp graduates or new grads, and their coding performance in general tends to be not as good (which is fine, as the expectations for L3 SWE are lower).
I agree with you that the Python language syntax is nice, but it doesn't have quite the same level of built-in support for algorithmically useful data structures as C# or Java with Guava. To give you a concrete example, it's not unusual for interview questions to require the use of a self-balancing binary search tree or a similar data structure in order to reach optimal runtime complexity. Realistically no one is ever going to implement a self-balancing BST along with solving the actual problem inside of a 45 minute coding interview, but it's nicer if you're able to refer to an actual library implementation that exists and can be used versus hand-waving away the existence of one.
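For a sense of what "refer to a library implementation" buys you: Java's TreeMap is a red-black tree, so floorEntry/ceilingEntry give O(log n) predecessor/successor queries for free. A sketch, using the classic (assumed-for-illustration) calendar-booking question, where each booking must not overlap any existing one:

```java
import java.util.*;

public class MyCalendar {
    // Maps each booking's start time to its (exclusive) end time.
    private final TreeMap<Integer, Integer> bookings = new TreeMap<>();

    // Books [start, end) if it doesn't overlap an existing booking.
    public boolean book(int start, int end) {
        // Nearest booking starting at or before `start` must end by `start`.
        Map.Entry<Integer, Integer> prev = bookings.floorEntry(start);
        if (prev != null && prev.getValue() > start) return false;
        // Nearest booking starting at or after `start` must start at or after `end`.
        Map.Entry<Integer, Integer> next = bookings.ceilingEntry(start);
        if (next != null && next.getKey() < end) return false;
        bookings.put(start, end);
        return true;
    }

    public static void main(String[] args) {
        MyCalendar cal = new MyCalendar();
        System.out.println(cal.book(10, 20)); // true
        System.out.println(cal.book(15, 25)); // false: overlaps [10, 20)
        System.out.println(cal.book(20, 30)); // true: touching is fine
    }
}
```

In Python you'd either hand-wave the BST or fall back to something like the `bisect` module over a sorted list, which has O(n) inserts; being able to just write `TreeMap` and move on is a real interview advantage.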
> that the interviewer asked me to provide an alternate imperative implementation of the LINQ statement, just so he could see that I actually knew how to implement something like that rather than just use it
I wonder why interviewers look for these traits when it is very clear that you'll almost never implement a low level data structure.
JavaScript is another frequently used language. It, along with Python, is typically what I see new grads using. More experienced interviewees tend to use Java/C#/C++. I interviewed in C# despite most of my interviewers not knowing it (though any dev can figure out C# well enough to evaluate algorithm correctness). I've also interviewed one person in Go... despite me not really knowing Go. That's when it's really helpful to have written down every character during the interview for writing up the evaluation afterwards.
Long live domain expertise and exceptionality.