"Engineers" - ironically the term used in the software industry for people who never standardize anything, solve the same problem solved by other "engineers" over and over again (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?) while making the same mistakes and learning the hard way each time, also vehemently argue about software being "art" might like OSX, but even that is debatable. Meanwhile actual Engineers (the ones with the license) the people who need CAD and design tools for building bridges and running manufacturing plants stay far away from OSX.
I did EE in college, but we mostly just used Windows because the shitty semi-proprietary SPICE simulator we had to use (and stuff like that) only supported Windows. The company that makes your embedded processor might only support Windows (and begrudgingly at that).
I think engineers using software should not be seen as an endorsement. They seem to have an incredible tolerance for bad UI.
You seem to be suggesting that a chunk of the hundreds of millions of people who use a UI that you don't like, secretly hate it or are forced to tolerate it. Not a position I'd personally want to argue or defend, so I'll leave it at that.
What an oddly aggressive and hostile response to such a banal observation. Yes, millions of people use software they hate, all the time, that’s wildly uncontroversial.
Making up what? Go drop by a nearby shop.
My hair stylist constantly complains about the management software they use and the quality of the payment integration.
At work I constantly hear complaints about shitty, slow IDEs.
At the optician's, the guy has been complaining about their inventory system.
People hate software that they're forced to use. Professionals are better at tolerating crapware, because there's usually a sunk-cost fallacy involved.
This is not a reasonable way to infer the sentiment of hundreds of millions of people in different countries, different businesses, different situations, etc, etc.
Disguising it as an "observation" is even more ridiculous.
Indeed, I’m not ready to defend it; it is just an anecdote. I expected the experience of using crappy professional software to be so universal that I wouldn’t have to.
>They seem to have an incredible tolerance for bad UI.
Irrelevant.
Firstly, it's a tool, not a social media platform designed to sell ads and farm clicks. It needs to be utilitarian and that's it, like a power drill or a pickup truck; it doesn't need to look pretty, since it's not targeting consumers but solving a niche set of engineering problems.
Secondly, the engineers are not the ones paying for that software, so their individual tolerance is irrelevant: their company pays for the tools, and tolerating those tools is part of the job description and the pay.
Unless you run your own business, you're not gonna turn down lucrative employment because on site they provide BOSCH tools and GM trucks while you personally prefer the UX of Makita and Toyota. If those tools' UX slows down the process and makes the project take longer, that's not my problem; my job is to clock in at 9 and clock out at 5, that's it. It's the company's problem to provide the best possible tools for the job, if they can.
I meant it figuratively. Obviously everyone has different working hours/patterns depending on the job market, skill set, and personal situation.
But since you asked, Google is famous for low workloads. Or Microsoft. Or any other old and large slow moving company with lots of money, like IBM, Intel, SAP, ASML, Airbus, DHL, Siemens, manufacturing, aerospace, big pharma, transportation, etc. No bootstrapped "agile" start-ups and scale-ups, or failing companies that need to compete in a race to the bottom.
If you look at creative pros such as photographers and Hollywood ‘film’ editors, VFX artists, etc., you will see a lot of Windows and Linux, as people are more concerned about getting absolute power at a fair price and don’t care if it is big, ugly, etc.
Oh, I'm sure there are lots of creatives who use OSX, so I don't mean to suggest nobody does; I'll admit it was a bit in jest, poking fun at the stereotype. I'm definitely oldschool - but to me it's a bit cringe to hear "Oh, I'm an engineer.." or "As an engineer.." from people who sit at a coffee shop writing emails or doing the most basic s/w dev work. I truly think Silicon Valley people would benefit from talking to technical people who are building bridges and manufacturing plants and cars and hardware and chips and all this stuff on r/engineeringporn that everyone takes for granted. I transitioned from s/w to hardcore manufacturing 15 years ago, and it was eye-opening, and very humbling.
I’d assume a lot of this is because you can’t get the software on MacOS. Not a choice. Who is choosing to use Windows 10/11 where you get tabloid news in the OS by default? Or choosing to hide the button to create local user accounts?
Who is choosing to use macOS, where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?
I do. Because for all the issues it has, it is still much better than whatever Windows has to offer.
> where non-Apple monitors and other 3rd party hardware just stops working after minor updates and then starts working again after another update, without any official statement from Apple that there was a problem and a fix?
At least my WiFi doesn't turn off indefinitely during sleep until I power cycle the whole laptop because of a shitty driver.
So what, Windows does the same. Printers [1], WiFi [2], VPN [3], Bluetooth devices [4], audio [5] - and that's just stuff I found via auto-completing "windows update breaks" on Google in under 5 minutes.
The only problem is that Apple is even worse at communicating issues than Microsoft is.
The big difference is that Microsoft - at least usually - confirms and owns the issues.
With Apple, it's usually just crickets... nothing in the release notes, no official statements, nothing. It's just trial and error for the users to see if a particular update fixed the issue.
So the same software exists on multiple platforms, there are no legacy or hardware compatibility considerations, no interoperability considerations, no budget considerations, and the users have a choice in what they use?
I.e. the same functionality exists with no drawbacks, and money is no object.
More choice in hardware. More flexibility in hardware. UI preferences. You can't get a Mac 2 in 1 or a Mac foldable or a Mac gaming notebook or a Mac that weighs less than a kilogram. You can't get a Mac with an OLED screen or a numpad. Some people just prefer the Windows UI too. I usually use Linux but between MacOS and Windows, I prefer the latter.
We use the sales metrics and signals available to us.
I don't know what to say except: resign yourself to the fact that the world is fundamentally unfair, and you won't ever get to run the A/B experiment that you want. So yes, Windows it is!
You seem to have some romanticized notion of engineers and to be deeply offended by someone calling themselves an engineer. Why do you even care if someone sits at a coffee shop writing emails and calls themselves an engineer? Do you think it somehow dilutes the prestige of the word "engineer"? Makes it less elite, or what?
"deeply offended" - My default response to imposters is laughter. Call yourself Lord, King, President, Doctor, Lawyer whatever - doesn't matter to me. I'd suggest you to lighten up.
Not that the degree means much; I learnt 90% of what I know on the job. It certainly helped get my foot in the door through the university brand and alumni network.
You can call yourself anything you want: Doctor, Lawyer, Engineer. I have the freedom to think my own thoughts too.
I always likened "engineers"[1] to "people who are proficient in calculus"; and "computers"[1] to "people who are proficient at calculations".
There was a brief sidestep from the late 1980s to the early 2010s (~2012) when the term "software engineer" came into vogue and ran completely orthogonal to "proficiency in calculus". I mean, literally 99% of software engineers never learned calculus!
But it's nice to see that ever since ~2015 or so (and perhaps even going forward) proficiency in calculus is rising to the fore. We call those "software engineers" "ML Engineers" nowadays, ehh fine by me. And all those "computers" are not people anymore -- looks like carefully arranged sand (silicon) in metal took over.
I wonder if it's just a matter of time before the carefully-arranged-sand-in-metal form factor will take over the "engineer" role too. One of those Tesla/Figure robots becomes "proficient at calculus" and "proficient at calculations" better than "people".
It looks like ever since humankind learned calculus there was an enormous benefit to applying it in the engineering of rockets, aeroplanes, bridges, houses, and eventually "the careful arrangement of sand (silicon)". Literally every one of those jobs required learning calculus at school and applying calculus at work.
Why point out Calculus as opposed to just Math?
Might just be my Eastern European background, where it was all just "Math" and both equations (that's Algebra, I guess) and simpler functions/analysis (Calculus?) are taught in elementary school around age 14 or 15.
Maybe I'm missing/forgetting something - I think I used Calculus more during electrical engineering than for computer/software engineering.
At my Central European university we learned "Real Analysis", which was far more concerned with theorems and proofs than with "calculating" anything - if anything, actually calculating derivatives or integrals was a warmup problem before the meat of the subject.
Calculus, because all of engineering depends critically on the modeling of real world phenomena using ordinary or partial differential equations.
I don’t mean to disregard other branches of math — of course they’re useful — but calculus stands out in specific _applicability_ to engineering.
Literally every single branch of engineering. All of them, from petrochemical engineering to biotech. They all use calculus as a fundamental building block of study.
Discovering new drugs using Pk/Pd modeling is driven by modeling the drug<->pathogen relationship as cycles using Lotka models (see the toy sketch at the end of this comment).
I'm not saying engineers don't need to learn stats or arithmetic. IMO those are more fundamental to _all_ fields - janitors or physicians or any field, really. But calculus is fundamental to engineering alone.
Perhaps a begrudging exception I can make is its applications in Finance.
But every other field where people build rockets, cars, airplanes, drugs, or ai robots, you’d need proficiency in calculus just as much as you’d need proficiency in writing or proficiency in arithmetic.
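To make the Lotka claim concrete, here's a toy sketch of the kind of population cycle such a model produces (the parameters and the SciPy setup are mine, purely for illustration - not from any real Pk/Pd study):

    import numpy as np
    from scipy.integrate import solve_ivp

    # Classic Lotka-Volterra system of coupled ODEs:
    #   dx/dt = a*x - b*x*y
    #   dy/dt = c*x*y - d*y
    a, b, c, d = 1.0, 0.1, 0.075, 1.5

    def lotka_volterra(t, z):
        x, y = z
        return [a * x - b * x * y, c * x * y - d * y]

    # Integrate from t=0 to t=30 starting at x=10, y=5; the two
    # populations oscillate in cycles rather than settling down.
    sol = solve_ivp(lotka_volterra, (0, 30), [10.0, 5.0],
                    t_eval=np.linspace(0, 30, 300))
    print(sol.y[:, -1])

You can't even state that model, let alone solve or fit it, without derivatives - that's the sense in which calculus is load-bearing here.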
True, we learnt calculus before college in my home country, but it was just basic stuff. I learnt a lot more of it, including partial derivatives, in the first year of engineering college.
>I think I used Calculus more during electrical engineering than for computer/software engineering.
I think that was OP's point - most engineering disciplines teach it.
Yeah, computer science went through this weird offshoot for 30-40 years where calculus was taught simply out of tradition.
It was not really necessary through all of the app-developer eras. In fact, it’s so much the case that many software engineers graduating from 2000-2015 or so work as software engineers without a BS degree. Rather, they could drop the physics & calculus grind and opt for a BA in computer science. They then went on to become proficient software engineers in the industry.
It’s only after the recent advances in AI around 2012-2015 that proficiency in calculus became crucial to software engineering again.
I mean, there’s a whole rabbit hole of knowledge on why ML frameworks deal with calculating vector-Jacobian or Jacobian-vector products. Appreciating those, and their relation to the gradient, is necessary to design & debug frameworks like PyTorch or MLX (toy sketch below).
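To spell that out, here's a minimal sketch of the two products in PyTorch (the toy function and shapes are my own illustration; torch.autograd.functional is one stable API that exposes both):

    import torch

    # Toy function f: R^3 -> R^2, so its Jacobian J is a 2x3 matrix.
    def f(x):
        return torch.stack([x[0] * x[1], x[1].sin() + x[2]])

    x = torch.randn(3)
    v_in = torch.randn(3)   # tangent vector, shaped like the input
    v_out = torch.randn(2)  # cotangent vector, shaped like the output

    # Jacobian-vector product J @ v_in (forward mode).
    _, jvp_val = torch.autograd.functional.jvp(f, x, v_in)

    # vector-Jacobian product v_out @ J (reverse mode, i.e. backprop).
    _, vjp_val = torch.autograd.functional.vjp(f, x, v_out)

    # For a scalar loss, the VJP with v_out = 1 is exactly the gradient;
    # that's why reverse mode is the workhorse of training.

Both products are just the chain rule applied in different orders, which is the calculus hiding inside every backward() call.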
Sure, I will concede that a sans-calculus training (BA in Computer Science) can still be sufficiently useful to working as an ML engineer in data analytics, api/services/framework design, infrastructure, systems engineering, and perhaps even inference engineering. But I bet all those people will need to be proficient in calculus the more they have to deal with debugging models.
That 99% guess seems high considering calculus is generally a required subject when studying computer science (or software engineering) at most universities I know of.
You’re right it’s a total guess. It’s based on my experience in the field.
My strong “opinion” here comes from an observation that while calculus may have been a required subject of study in awarding engineering degrees, the reality is, people didn’t really study it. They just brushed through a couple of pages and wrote a few tests/exams.
In America there’s a plethora of expert software engineers who opted for a bachelor’s degree in computer science that is a BA, not a BS.
I think that’s a completely reasonable thing to do if you don’t want to grind out the physics and calculus courses. They are super hard, after all. And let’s face it: all of the _useful to humanity_ work in software hasn’t required expertise in physics or calculus, at least until now.
With AI going forward it’s hard to say. If more of the jobs shift over to model building, then yes, perhaps a back-to-basics approach of calculus proficiency could be required.
Most software engineering just doesn’t require calculus, though it does benefit from the understanding of functions and limit behaviors that higher math gives you. But if you look at a lot of meme dev jobs, they’ve transitioned heavily away from the crypto craze of the past 5 years towards “prompt engineering” and the like, exploiting LLMs in the same way that the “Uber for X” meme of 2012-2017 exploited surface-level knowledge of JS or API integration work. Fundamentally, the tech ecosystem desires low-skill employees; LLMs are a new frontier in doing a lot with a little in terms of deep technical knowledge.
Hmm, that is an interesting take. Calculus does seem like the uniting factor.
I've come to appreciate the fact that domain knowledge has a more dominant role in solving a problem than technical/programming knowledge. I often wonder how s/w could align with other engineering practices and approach design in a standardized way, so we could just churn out code w/o an excessive reliance on quality assurance. I'm really hoping visual programming is going to be the savior here. It might allow SMEs and domain experts to use a visual interface to implement their ideas.
It's interesting how Python dominated C/C++ in the case of the NumPy community. One would have assumed C/C++ to be a more natural fit for performance-oriented code. But domain knowledge overpowered technical knowledge, and eventually people started asking funny questions like why their hand-rolled C++ was slower than NumPy.
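A minimal timing sketch of that effect (numbers are machine-dependent; the point is just that the vectorized call runs its loop inside NumPy's compiled C core):

    import time
    import numpy as np

    n = 5_000_000
    a = np.random.rand(n)
    b = np.random.rand(n)

    # Interpreted loop: every element access goes through Python.
    t0 = time.perf_counter()
    total = 0.0
    for i in range(n):
        total += a[i] * b[i]
    t_loop = time.perf_counter() - t0

    # Vectorized: one call, the loop runs in compiled C.
    t0 = time.perf_counter()
    total_vec = np.dot(a, b)
    t_vec = time.perf_counter() - t0

    print(f"loop: {t_loop:.2f}s  vectorized: {t_vec:.4f}s")

So the domain experts write Python, and the performance-critical inner loops live in C underneath - arguably the best of both.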
there was some old commercial that had the tagline "performance is nothing without control". If you can't put the technology to work on your problems then the technology, no matter how incredible, is worthless to you.
This checks out. I'm a software developer who took math all through high school and my first three years of college. I barely scraped through my calculus exams, but I excelled at combinatorics, probability, matrix math, etc. (as long as it didn't veer into calculus for some reason).
I guess I just enjoy things more when I can count them.
For this kind of engineering, I think calculus is not the main proficiency enhancer you claim it to be. Linear algebra, combinatorics, probability, and number theory are more relevant.
Calculus was important during the world wars because it meant we could throw shells at the enemy army better, and that was an important problem during that period.
Nowadays, calculus is just a stepping stone to more relevant mathematics.
Today’s ML frameworks grapple with the problem of “Jacobian-vector products” and “vector-Jacobian products” as a consequence of the interplay between gradients and derivatives, and the application of the “chain rule”. All three of those concepts are fundamentally understood by being proficient in calculus.
While I’m being the hype-man for calculus, I don’t mean to say proficiency in linear algebra or statistics is in any way “less necessary” or “less useful” or “less challenging” or “less..” anything.
I’m merely stating that, historically, calculus has been the unique branch of study for engineering. Statistics has always found value in many fields — business, finance, government policy, etc.
Sure, linear algebra is one of those unique fields too — I kinda like to think of it as “algebra” in general, and perhaps its utility has flowed in tandem with calculus. Idk. I haven’t thought super hard about it.
From what I've heard (not an OSX user) Windows is the best operating system for multiple screens; OSX and Linux glitch way more. Most anyone doing 3D sculpture or graphics/art on a professional level will eventually move to working with 2-3 screens, and since there are no exclusively Mac design programs, OSX will be suboptimal.
There's little things too, like some people using gaming peripherals (multi-button MMO mice and left hand controllers, etc.) for editing, which might not be compatible with OSX.
And also, if you're mucking around with two 32 inch 4k monitors and a 16 inch Wacom it might start to feel a little ridiculous trying to save space with a Mac Pro.
Besides Windows having more drivers for USB adapters than Linux* (which is a reflection of the market), I find Linux has far fewer glitches using multiple screens.
Once it works, Linux is more reliable than Windows. And virtual desktops have always worked better on Linux than on Windows. So I disagree with you on that front.
* In my case, this means I had to get an Anker HDMI adapter, instead of any random brand.
I'd say a lot of engineers (bridges, circuit boards, injection mouldings) are kept far away from OSX (and Linux). Honestly, I'd just love an operating system that doesn't decide it's going to restart itself periodically!
Yes. I'm pretty sure my wife's 2014 MacBook Air has gone 6 months without a restart. My Windows 11 workstation, however, has never done a week. I power down daily now to avoid disappointment.
IETF RFCs will soon number over 10K; Java, Win32, and the Linux kernel syscall API are famous for backward compatibility,
not to mention the absurd success of the standard libraries of Python, Rust, and PHP, and of certain "standard" projects like Django, React, and ExpressJS
> (how many libraries do you need for arrays and vectors and guis and buttons and text boxes and binary trees and sorting, yada yada?)
considering the design space is enormous and the tradeoffs are not trivial ... it's good to have libraries that fundamentally solve a similar thing but in different, context-dependent ways
arguably we are using too many libraries and not enough problem-specific in-situ DSLs (see the results of Alan Kay's research in the STEPS project at VPRI - https://news.ycombinator.com/item?id=32966987 )
I'd argue almost all NEW library development is about politics and platform ownership. Every large company wants to be the dependency that other projects tie into. And if you don't want to hitch your wagon to google or facebook or whoever, you roll your own.
Many if not most computational problems are fundamentally about data and data transformation under constraints - throughput, memory, latency, etc. And for the situations where the tradeoffs are non-trivial, solving the problem is purely about domain knowledge regarding the nature of the data (video codec data, real-time sensor data, financial data, etc.), not about programming expertise.
The various ways to architect the overall high-level design (client/server, P2P, distributed vs. local, threading model) are, IME, not what I would call crazy complicated. There are standard ways of implementing the various variations of the overall design which, sadly, because of an overall roll-your-own mindset, most devs are reluctant to adopt. Part of that is that we don't have a framework of knowledge that lets us build a library of these designs in our heads, from which we can just pick the one that's right for our use case.
I don't agree with your characterization of the design space as "enormous". I'd say most programmers just need to know a handful of design types, because they're not working on high-performance, low-latency, multi-million-endpoint scalable projects where, as you say, things can get non-trivial.
I'll take a shot at an analogy (I'm hoping the nitpickers are out to lunch). The design space for door knobs is enormous because of the various hand shapes, disability constraints, door sizes, applications, security implications, etc. And yet we've standardized on a few door knob types for most homes, which you can go out and buy and install yourself. The special cases - bank vaults, prisons, and other domains - solve it their own way.
I challenge you to take those people who build bridges and have them build full software.
I'm not making a claim about whether software is engineering or not.
It is a fact, in terms of cost, that software and bridge building are, most of the time, very different activities with very different goals and cost-benefit ratios.
All those things count when making decisions about the level of standardization.
As for standards... there are also lots of them, widely used, from networking protocols to data transfer formats, with well-known strengths and limitations.
In my 30+ year career I can confidently say that Software Engineers look towards standardisation by default, as it makes their lives easier.
It feels to me that you're bitter or have had more than one bad experience. Perhaps you keep working with, or coming across, bad Engineers, because your generalising is inaccurate.
Maybe we need a new moniker "webgineer". The average HN/FAANG web programmer does appear to vastly overestimate the value of their contributions to the world.
When I started doing this "Internet stuff" we were called "webmasters", and the job would actually include what today we call:
- DevOps
- Server/Linux sysadmin
- DB admin
- Full stack (backend and frontend) engineer
1999 indeed! I haven't heard that term since around 1999 when I was hired as a "web engineer" and derisively referred to myself as a "webgineer". I almost asked if I could change my title to "sciencematician".
People who cobble together new printers or kettles overestimate the value of their contributions to the world too. The delineation isn't between JS devs and JPL or ASML engineers.
You can shit all you want on so-called "engineers", but they are the ones who make the CAD software you're talking about that "real engineers" use. So get off your high horse.
You're kidding yourself if you think that mechanical, structural, or any other engineers don't do the same thing. They do.
I worked for one of the UK's leading architecture/construction firms writing software, and I'm also an amateur mechanic.
You'd be amazed at how many gasket types, nuts, bolts, fasteners, unfasteners, glues, concretes, bonding agents and so on there are... all invented for edge preferences, and most of which could be used interchangeably.
Also standards? Hah. They're an absolute shitshow in any engineering effort.