I think the headline statement sounds stronger than intended. In context it reads like: professional coders (obviously) do not care about Replit, so we focus on a lower segment.
Given the context, it's pretty clear he is talking about their target market, not about their employees. Do you only work on things you're going to use yourself?
Edit:
Look, this is what the article says:
> In essence, Replit’s latest customer base is a new breed of coder: The ones who don’t know the first thing about code.
> “We don’t care about professional coders anymore,” Masad said.
Rather than the outrage-bait this was turned into, the quote was entirely about who their customers are. And it makes perfect sense. What kind of professional programmer would use their tools anyway?
> Masad said something that is going to stick with me for a while: “Finding software problems in your life is also a skill, looking at a problem and saying, ‘a piece of software could solve that’ is a skill.”
> Since I can remember, that is a skill that only software developers could capitalize on. Now, pretty much anyone can.
It's true, and even as a software engineer there are things I've thought up in the past but haven't worked on because I'm not well versed in frontend tech. I had an idea the other day for a simple web app, and after some brainstorming and scaffolding with an LLM there is something halfway decent running locally for me.
Feeling "unblocked" and empowered to get something working rather than burdened by all of the minutiae that need to be figured out to ship and iterate on an idea is a great feeling. I wouldn't trust any of the current models to take something to production, but for "personal" software that I can control and manage and spruce up, it's been fun getting to hack on things I haven't felt equipped to tackle solo before.
I originally thought the headline meant "we don't care about using professional coders at our company anymore and will rely fully on AI," but it actually seems to mean "we no longer want our product to be a tool for professional coders, but rather for non-coders." This makes more sense than I initially thought: it's the simple use cases that AIs can actually reliably deliver on.
Another interesting point: it seems these agentic capabilities are all possible without a proprietary model. So is Replit now essentially just an LLM API wrapper with no moat?
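For scale, the core of an agent over a hosted model really is just a loop. A minimal sketch in Python -- `call_llm` is a hypothetical stand-in for any vendor's chat-completion API, and this is obviously not Replit's actual implementation:

```python
import subprocess

def call_llm(messages):
    # Hypothetical stand-in: wire this to any hosted chat-completion API
    # (OpenAI, Anthropic, etc.) -- no proprietary model required.
    raise NotImplementedError

def run_agent(task, max_steps=10):
    # The whole "agentic" loop: ask the model for a shell command,
    # run it, feed the output back, repeat until the model says DONE.
    messages = [
        {"role": "system", "content": "Reply with one shell command per "
                                      "turn to accomplish the task, or DONE when finished."},
        {"role": "user", "content": task},
    ]
    for _ in range(max_steps):
        reply = call_llm(messages).strip()
        if reply == "DONE":
            break
        result = subprocess.run(reply, shell=True, capture_output=True, text=True)
        messages.append({"role": "assistant", "content": reply})
        messages.append({"role": "user", "content": result.stdout + result.stderr})
```

If that's roughly the shape of it, the moat has to live in the sandboxing, hosting, and UX around the loop, not in the loop itself.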
Last thing: the statement “Finding software problems in your life is also a skill, looking at a problem and saying, ‘a piece of software could solve that’ is a skill,” is remarkable. It sounds like Replit realized that most people really don't want to be bogged down writing software, and is framing that aversion as a problem in itself.
It's funny that he presents this as a new direction. I actually interviewed with Replit a couple of years back, and it was precisely their accessibility to non-professionals which interested me: the instant online editor/sandbox environment felt like it could be a good next step from Scratch, doing for software development something like what Arduino did for embedded projects. Their appeal for professionals seemed weaker, since we already have so many other options.
I strongly doubt that AI code generation will be any more of a silver bullet than any of the other "it'll be so easy anyone can do it" tools ever have been, but as an effectively infinite library of example starter projects to build on and improve, it could be quite useful.
> Instead, he says it’s time for non-coders to begin learning how to use AI tools to build software themselves
More code and applications to maintain in the long run. Being a software engineer will be in more demand than ever. I've noticed many individuals want to create an MVP before committing fully.
There is some thrill in writing code so boring that you still appreciate it when you come back to it a decade later, but all-in-all I don't care about coding either, and I suspect most of us don't. Problem solving is the business we are actually in.
> He said at one point it might not be possible this decade. Even as he set up an “agent task force” to develop the product last year, he wasn’t sure if it would work. What changed was a new model from Anthropic, Claude 3.5 Sonnet, which achieved a record score on a coding evaluation called SWE-bench in October.
If the entire fortunes of your company and product change overnight based on models released by other companies, how useful is your product, really?
Amjad has to be the worst hype man ever; none of the stuff he says even sounds convincing.
Let's say software really can be written completely autonomously by LLMs. I think it'd be important to put that leverage in the hands of individual software engineers, to build even more and faster, and avoid it being entirely in the hands of a few companies.
Ordinarily, I would say the current situation is great for innovation and economic growth. Only in the case of potentially soon creating very capable LLMs is it questionable. Which situation would bring more innovation and economic growth: a few companies having AGI or every engineer having it? I've been thinking about that a lot, and I'm not sure which it is.
Wasn’t it just a thing for students to get started with programming without having to set up an environment? What are they doing these days, apart from jumping on the AI grift bandwagon?
I feel -- I cannot substantiate it, but I feel -- that today looks like the days of "Model Driven Design", "Model Driven Architecture", UML code generators, and WYSIWYG IDEs.
Love that the author heavily editorializes but cordons off the editorial into its own section. I would be happy to see that become a standard practice elsewhere.
> In essence, Replit’s latest customer base is a new breed of coder: The ones who don’t know the first thing about code.
> “We don’t care about professional coders anymore,” Masad said.
> Instead, he says it’s time for non-coders to begin learning how to use AI tools to build software themselves. He is credited with a concept known as “Amjad’s Law” that says the return on learning some code doubles every six months.
These business types only learning enough to prompt an AI will eventually get tired of prodding the AI to fix things, unless Masad here is also selling an AI that will monitor and maintain your infrastructure for you too. They don't even have the patience to listen to developers talk about technical issues in meetings today, they're definitely not cut out for app maintenance work when the technical details start to matter.
He is doing the smart thing and selling shovels though, and this statement is part of that sell.
> Instead, he says it’s time for non-coders to begin learning how to use AI tools
Seems reasonable that more people will be able to utilise the programmability of computers.
People who already create elaborate spreadsheets with formulas might now also start using Python/Java/Go for their benefit (e.g. an accountant might start helping with accounting-system maintenance).
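To make that concrete, a made-up example in plain-stdlib Python (ledger.csv and its column names are invented): the jump from a spreadsheet column formula like =amount*vat_rate to a script is small.

```python
import csv

# Hypothetical ledger.csv with "amount" and "vat_rate" columns --
# the spreadsheet formula =amount*vat_rate, applied row by row.
with open("ledger.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    row["vat"] = float(row["amount"]) * float(row["vat_rate"])

print(sum(row["vat"] for row in rows))  # total VAT across the ledger
```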
OK, so this is the new version of no-code. The spin is a little different, because I can tell from personal anecdotes that LLMs do allow people with almost no experience in a given area to feel confident enough to take steps that only a few years ago would have stopped them rather quickly. In a sense, there is a real threat to current demand (and that is seen across the board already, as corporations either suspend hiring or proudly announce 2025 as the year of not hiring devs), but the agent is still only as good as its operator (as with most tools).
I am not saying the real breakthrough will not come -- what is being done already is impressive on its own, especially when you pair a person who knows what they are doing with an LLM -- but for now this iteration of no-code still has one major flaw: humans.
LLM-powered regular people won't be able to replace/undercut developers, because they don't know how to correctly gather requirements.
The day LLMs start insisting Joe Schmoe non-devs first plan DB schemas/hosting/auth (i.e. before responding with random HTML/CSS/JS page demos) is the day I start getting really scared.
"Yet it has grown its revenue five-fold over the past six months." Yeah, by shutting down Replit for Education and suddenly charging for stuff that used to be free. I guess repl.it is not a charity, but this is a blunt way of raising revenue that a "me too" AI offering may not compensate for.
You have to break people out of their current context of putting whatever problems they have into ChatGPT, get them to realize it's a coding problem, and switch to Replit -- all while not being somebody with the gut sense that that's something you should do.
I really do think it's basically over for most software engineers. There are obviously still challenges in writing code, but you can already get 80% of the way there, and with a senior engineer guiding a junior using genAI, most people's hard-won skills are completely commoditized. It will take time for the tooling to catch up and be implemented, but I am operating as if I won't have stable employment in 3 years.
Also, people are not looking at the math of it. If 40% of software engineering gets automated, that's a massive reduction in demand, and with the glut of engineers on the market, it will crater salaries. It doesn't all need to be automated for it to basically crush SWE as a good career.
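Back-of-envelope, with illustrative numbers and the (contestable) assumption that total demand for software work stays fixed:

```python
# Illustrative only; the load-bearing assumption is fixed demand.
current_headcount = 1.0      # normalize today's SWE workforce to 1
share_automated = 0.40       # fraction of the work LLMs absorb
print(current_headcount * (1 - share_automated))  # 0.6 -> ~40% fewer seats
```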
AI changes the type of labor required to produce working software, which will result in a downward shift in wages.
As software engineering gets automated the demand for it goes _up_ because the problem spaces that were previously uneconomic to tackle become economic. We're going to see a further proliferation of software, and that's going to require care and feeding. So, now you need software engineers _everywhere_.
You could say, no, that's not how it's going to work because now everyone can just talk to a computer and get the software they want. But there's so many non-technical reasons why that doesn't put software engineers out of work. People are lazy and don't want to write (even if just by talking) their own software. People can't clearly describe what they want, so they won't be able to put to words the software they want. Most people just accept things as they are and don't think to change things with software.
What is clear is, if you're a run-of-the-mill, take-a-ticket work-a-ticket software engineer, with no interface to the business other than that, then yeah that shit is ripe to get automated away. But if you get out into the real world, man, it's the best time ever to have software engineering skills. Opportunities are _everywhere_.
Hard to know. It may be that reducing the friction to get to 80% increases demand for the autists who know the system through and through and can productionize the MVP, as there are now 100x as many super-juniors producing glitchy code that works most of the time.
BTW, when GPT-4 came out I started a study group with a few friends who had never written code before. Stepping through Python and understanding what was happening in the code wasn't the hard part; the basics of the tooling, and debugging how Python was borked on that particular person's laptop, was the hard part. Maybe GPT-5 will be able to debug and run shell commands to fix that sort of mundane garbage that engineers deal with as a matter of course, but IMO it will take a whole new environment, possibly along the lines of what Replit is building, taking care of all the tooling for you and presenting you with a code window and a Play button.
Getting 80% of the way there wasn't hard to begin with. If you had an idea, you could outsource it to someone on Fiverr/Upwork/etc. for a few hundred bucks and get 80% of the way there. The challenge for 99% of software projects is getting the other 80% of the way there, which is maintenance and integrations.
There's definitely a small chance a simple hit app could be made by a non-technical "ideas guy" and an LLM -- something like Wordle would be a good example. But Wordle wasn't the one paying engineers 6-figure salaries to begin with.
I'm not talking about ideas guys. I'm talking about: if you spent 3 years on React, now I can just go use an LLM and start writing React as if I have 3 years' experience. Same with AWS CDK deploying to a Java Lambda. I can just start using the LLM and mostly get up to speed. All your hard-won knowledge is just not that valuable.
> I'm talking about: if you spent 3 years on React, now I can just go use an LLM and start writing React as if I have 3 years' experience.
er, no, you can't -- I don't know why you would think this unless you're junior and can't actually tell (which turns out to be one of the dangers of LLMs).
If your job is writing small snippets of code to solve clearly defined problems, AI can replace you today. If the main challenge to overcome in your job is figuring out how to get React to render a form, AI can replace you today. This is definitely true for some software engineers, even some with senior titles. But it's so far from what my job actually looks like that I can't begin to feel threatened.
> If your job is writing small snippets of code to solve clearly defined problems, AI can replace you today.
I strongly question the existence of these jobs; even the simplest "add some forms to this page" consultancy work in an app older than a few years tends to have requirements complexity around integrating with ancient/bespoke frameworks. If these jobs did exist, you'd assume they would have already been outsourced to SEA.
The best part is, if the software is easy to write through AI, someone could implement a similar system and put Replit out of business. Or ask the AI to write a clone of Oracle or Microsoft Windows, for example.
Yeah, the value of software as a whole seems like it is about to greatly decrease. You’d think tech companies and investors would be a lot more nervous about destroying the value of software, but I’m not sure if they’re thinking that far ahead.
Hooray, in a few years you can type a prompt and get a complete app you can sell to customers! But the customer can also type a prompt and get a complete app that meets their expectations even better than yours. Your generated app is worthless. Your entire company is just a middle man between the customer and ChatGPT, and the customer can write a prompt just as well as you.
Software as an entire industry is pretty much going to cease to exist if AI enthusiasts get their way.
We already see this with music. Have you ever had someone give you a link to a song they generated with AI? Do you really want to spend 4 minutes listening to a thing they made in 15 seconds? If I wanted an AI song about coffee then I’d generate one myself, not listen to yours. The faster and easier it gets to generate apps, the more we’re going to end up with people just generating their own. You won’t be able to sell your generated apps to anyone.
> But the customer can also type a prompt and get a complete app that meets their expectations even better than yours. Your generated app is worthless. Your entire company is just a middle man between the customer and ChatGPT, and the customer can write a prompt just as well as you.
What sort of apps do you work on where this is the case?
I just can’t fathom how a 10M-LOC financial engine that follows global regulations, has latency requirements, and contains tons of complex business logic that’s proprietary is going to be built via natural language.
Or how about flight computers for rockets?
Or sequences for MRI machines?
These might be obviously hard, but where’s the limit between viable and not viable?
AI absolutely can’t do the work that I do right now, but if AI keeps advancing then that gap is going to get smaller.
And at some point, even if the gap is still pretty big, the never-ending quest to increase profits will drive companies to start laying developers off even if the AI isn’t quite ready.
I work in a highly regulated industry. I hope that buys time. But I spend a lot of time worrying about how I’m going to take care of my kid if things keep moving at this rate.
> but you can already get 80% of the way there, and with a senior engineer guiding a junior using genai
Getting to a demo, an MVP, or an early-stage company are probably all doable with a skeleton crew of software engineers plus LLMs. But it is difficult to imagine scenarios where 1) customers are paying you real amounts of money and 2) the technical problems the company faces remain solvable "with a senior engineer guiding a junior using genai."
It feels like a recipe for disaster--when (not if) things go awry, there is no slack in the system, nothing to absorb the shock. It's brittle. Maybe we will see lots of companies in the next few years with successful exits who are using a model like this! But I doubt many are durable businesses.
Honestly, I highly, highly doubt this. There is nothing revolutionary in most engineering, and it's even easier when the product is already robust and just needs to be maintained.
This has been the argument for every CMS and no-code solution ever. 80% of apps are just CRUD. 90%, maybe! We can just anticipate all the needs and have plugins and drag-and-drop builders. Coding will be restricted to just a small set of senior engineers working on core framework development. Just watch! Any moment now...
And they were right, in a way. Frameworks excel in the early stages of application development. Drag and drop, click to create a database, it all works. But ask even small sized companies if they keep using it after 1 year, and the answer is often no. If it is yes, it’s usually all building custom things to work around the limitations of the framework. Or, it is a simple product/site with no further needs (but they are rare).
Anyway, AI for the full development cycle! Any moment now...
Sure, it’s different. But the point is that early-stage development is not the hard part. LLMs so far perform well within their training data, of which there is a ton for CRUD. We can’t say how they perform maintaining projects yet; it’s not clear that can be pattern-matched in the same way.
> It doesn't all need to be automated for it to basically crush SWE as a good career
Let's be real: even at $60k, the WFH makes it tolerable, but you'll need lifestyle changes like selling your house if your mortgage is too much. Plenty of people work hospital jobs making around $40k, and we don't have to deal with medical sharps and patients that bite. It could always be worse.
If companies don't see the value with human engineers, let them suffer the consequences. If they can run a business with only AI coding, there probably isn't a significant moat there, and then they'll have lots of competition as a lot of bored SWEs use AI as well and start companies to stay in the game.
But seriously, what’s a single good app out there that’s been written by AI with no developers involved? I don’t know of any. Not play apps, but real apps serving real users, with billing, subscriptions, some traffic, etc…
> What does 'commoditization of coding' literally look like?
Presumably something like the 'commoditization of elevator operating' that we once saw. The elevator operator never went away, but operating elevators was simplified to the point that anyone could do it. At which time everyone started doing it, commoditizing the job.
In other words, it looks like programming languages that become increasingly approachable to any old random person on the street who can jump in without any meaningful past experience or special knowledge. You no doubt already know what that looks like, because we're already using those programming languages (e.g. GPT) in narrow domains to great effect. They have yet to prove themselves as complete general-purpose programming languages, but they have made great strides towards that, and it is, by many accounts, now only a matter of time.
I can see it for simple toys, I just can't see it for more complex projects. I have no idea how you could use natural language to describe some complex business rules, and once you start having to write in a more logical language it's verging on the precision of code. And to then be able to audit the system, make minor changes and regression test, etc...
I can see it speeding engineers up; I can't see the man on the Clapham omnibus building a complex line-of-business app (or an operating system, flight computer, financial engine, MRI sequencer, etc...). And once it can't be your average Joe, and it's someone who requires training, suddenly we're back to it not being a commodity.
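An invented example of where the natural language runs out: "loyal customers get a discount" is one sentence in English, but code forces every ambiguity to be decided.

```python
from datetime import date, timedelta

def discount_rate(signed_up: date, orders_last_year: int, today: date) -> float:
    # "Loyal customers get a discount" -- loyal since when? How many orders?
    # Do tiers stack? Every branch below is a decision the sentence glossed over.
    tenure = today - signed_up
    if tenure >= timedelta(days=730) and orders_last_year >= 12:
        return 0.10  # long-tenure, frequent buyers
    if tenure >= timedelta(days=365) and orders_last_year >= 6:
        return 0.05  # moderately loyal
    return 0.0       # everyone else pays full price
```

Auditing, minor changes, and regression testing all hang off exactly these made-explicit decisions.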
While in recent memory the engineer and the coder have oft been the same person, that wasn't always the case (historically there was a clear divide between the engineers and coders) and isn't always the case. I see no mention of engineers around this discussion. Is there a reason for its introduction?
I introduce professionals because I believe the real benefit will be speeding them up, not replacing them with laymen who suddenly have commoditised access to programming skills
? "Professional" was set out right from the onset. The question was about engineers, which, while it is true that coding and engineering is sometimes done by the same person, there is no reason why it has to be that way and, especially historically, coding is a separate job. The context is specifically about the job of coding, not the job of engineering. There is nothing to suggest engineers are being commoditized and never was.
So be it, but the topic is explicitly about "professional coders". Not "engineer", not "software professional". It is literally stated in the title of the original link.
A coder and an engineer are not necessarily the same person. It has become more common to see people do both jobs, but in history it was unusual for the engineers and coders to be the same person, and the idea is that we're going back to that model – except this time the coder will be a machine instead of another human.
As in the old job of coder, short for "encoder". That does not appear to be the same thing you are talking about, but it was equally inefficient to have a second person doing the coding, especially as the tools improved. That is how we ended up with people splitting their roles as both engineer and coder. But the premise is that you won't need to worry about a human coder anymore, neither as a distinct individual nor as a job shared with other roles.
Commoditization of coding as a skill set happened decades ago. When was the last time you wrote any assembly language? Nobody writes any actual code anymore - machines do it all for us, according to the loosey-goosey high-level instructions we now call "coding". Somehow, we all still have more work to do than ever... but that's just the Jevons paradox in action. If AI makes programmers more efficient, demand for programming services will only increase.
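You can watch the machine do it, too: in Python, the stdlib `dis` module shows the instructions generated from your loosey-goosey high-level source.

```python
import dis

def double(x):
    return x * 2

# Nobody typed the instructions this prints; the machine wrote them
# from the high-level source. (Opcode names vary by Python version.)
dis.dis(double)
```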
> If 40% of software engineering gets automated, that's a massive reduction in demand
This is your daily reminder that the lump-of-labor premise is a fallacy. There is no fixed amount of demand, and software development has become more productive, by vastly more than 40%, since its inception as a profession.
It's about the type of person now required to write software. You don't need some ultra-math-brain autist to write code if the LLM can do it with assistance from someone of much less ability. Coding may see more demand, but the skill set itself sees a reduction in the base quality bar required of engineers, dropping wages and changing the profession. Of course there are exceptions, but not for most businesses.
Yes, the profession is going to change, as it always has, but this is still bad economics. Wages don't fall when productivity increases; otherwise you'd have been getting poorer every year since the industrial revolution. When software developers become more productive, their value goes up, not down. A web or game developer in 2015 was not an "ultra-math-brain autist" compared to a 1980s CS whiz kid writing video games in assembly, but was financially better off. Software developers, like everyone else, aren't paid for their calculus skills; they're paid for the market value of their output, entirely independent of how much brainpower that requires.
When Guido and Matz invented Python and Ruby, or when you got a really fancy debugger, did you think the wage apocalypse was upon us because you had a more productive tool and more people could write software?
> Instead, he says it’s time for non-coders to begin learning how to use AI tools to build software themselves. He is credited with a concept known as “Amjad’s Law” that says the return on learning some code doubles every six months.
No, that's not a law. That's stupid made-up bullshit that he made up on the spot.
I think most programmers will find their way, even if they stop programming. I've worked with non-programmers who have the attention span of a toddler and wouldn't be able to read a few-sentence email and respond accordingly. They would just respond to the first 1-2 sentences.
Corporate work is mostly pretend and look busy.
I'm really curious to explore code of an app written completely by AI.
I've written complete apps that run a business. The biggest hurdle? Figuring out what is actually going on, what the process is, why it is so poorly documented, and why it is stored in the heads of multiple people. I don't think gathering this information would be possible without understanding how software and databases work and what I need to know.
Wow, won't take long for that perspective to sound REALLY fckn stupid. I was actually keeping an eye on the company, but I'm pretty sure now it's not a place where I want to work.