gr4vityWall's comments | Hacker News

> which web shops demand governmental ID?

Basically all web shops in Brazil require you to give a government ID to buy anything (usually your CPF number).


Brazil has an insane number of 'illegal' immigrants as well as people living in favelas who essentially don't even recognize the state, so I'm curious how that works. I assume it's something like the US, where 10 illegals work under one social security number or a tax ID they've registered under the auspices of a foreign-controlled business.

> an insane number of 'illegal' immigrants

Immigrants can request a CPF (the 'national ID'). I don't think being in the country 'legally' is a requirement; it isn't enforced the way it is in the US.

> people living in favelas who essentially don't even recognize the state

Most people are assigned an ID at birth. And people who live in a favela often have to work outside it, and they interact with most companies and state services (other than utilities) the same as anyone else.

Utilities, OTOH, often get MITM'd by militias/narcos these days.

> I assume it's something like the US where 10 illegals work under one social security number or a tax ID

No need for anything fancy like that. The poorest people are willing to work based on verbal agreements, as the alternative is either starving or hoping the public social safety net has your back. And in case your employer requires one, that's a non-issue because, except in rare circumstances, everyone has one.

Digital banking, installment payments, and general smartphone usage are widespread, including in favelas.


It doesn't need to be written by a human only, but I think generating it once and distributing it with the source code is more efficient. Developers can correct errors in the generated documentation, which can then be used by humans and LLMs.

> maybe $250/month (...) which you can then use to go and earn 100x that.

$25k/month? Most people will never come close to earning that much. Most developers in the third world don't make that in a full year, yet they are still affected by rising PC part prices.

I agree with the general principle of having savings for emergencies. For a software engineer, that should probably include enough to buy a good computer, in case they need a new one. But the figures themselves seem skewed towards the reality of very well-paid SV engineers.


> Most developers in the third world don't make that in a full year

And many in the first world haha


$25k annually (before taxes) is about $12/hour with a 40-hour work week; how many software developers in the first world are working for that? There are probably some, but I’d be surprised if there were “many”.

> But the figures themselves seem skewed towards the reality of very well-paid SV engineers.

The soon-to-be-unemployed SV engineers, once LLMs mean anyone can design an app and backend with no coding knowledge.


And you can code from an RPi or cellphone and use a cloud computer to run it, so you don't really need an expensive PC at all.

> I'll admit I'm somewhat biased against Bun?

Why? Genuine question, sorry if it was said/implied in your original message and I missed it.


Good question, hard to say, but I think it's mainly because of Zig. At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.

> At its core Zig is marketed as a competitor to C, not C++/Rust/etc

What gives you this impression?

I directly created Zig to replace C++. I used C++ before I wrote Zig. I wrote Zig originally in C++. I recently ported Chromaprint from C++ to Zig, with nice performance results. I constantly talk about how batching is superior to RAII.

Everyone loves to parrot this "Zig is to C as Rust is to C++" nonsense. It's some kind of mind virus that spreads despite any factual basis.

I don't mean to disparage you in particular, this is like the 1000th time I've seen this.


You have pretty explicitly framed Zig as a C replacement yourself, e.g.: https://www.youtube.com/watch?v=Gv2I7qTux7g

More broadly, I think the observation tends to get repeated because C and Zig share a certain elegance and simplicity (even if C's elegance has dated). C++ is many things, but it's hardly elegant or simple.

I don't think anyone denies that Zig can be a C++ replacement, but that's hardly unusual, so can many other languages (Rust, Swift, etc). What's noteworthy here is that Zig is almost unique in having the potential to be a genuine C replacement. To its (and your) great credit, I might add.

>> At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.

@GP: This is not a great take. All four languages are oriented around manual memory management. C++ inherits all of the footguns of C, whereas Zig and Rust try to sand off the rough edges.

Manual memory management is and will always remain necessary. The only reason someone writing JS scripts doesn't need to worry about managing their memory is that someone has already done that work for them.


Well, if anything, take it as a compliment. As a C and C++ (and some Rust) developer who has lately been enjoying Zig, I think Zig is the only programming language positioned to convince die-hard C systems programmers to use another language, with simplicity and power baked in.

But I completely agree. It's a perfect replacement for C++ and, I would say, the natural spiritual successor to C.

I gave up using Rust for new projects after seeing its limitations for the kind of software I like to write, and have been using Zig instead, as it gives me the freedom I need without all the over-complication that languages like C++ and Rust bring to the table.

I think people should first experiment and see for themselves, and only then comment, as I see a lot of misinformation and beliefs based more on marketing than on reality.

Thank you very much for your wonderful work!


I've got to love that the author of the thing can show up and say, “Why?! I never said any of that!”

A lot of stuff related to older languages is lost in the sands of time, but the same thing isn’t true for current ones.


Rust is more of a competitor to C++ than C. Manual memory management is sometimes really helpful and necessary. Zig has a lot of safety features.

I mean, they said they looked at the source code and thought it was gross, so there’s a justification for their concern, at least.

That's fair, but the word 'biased' felt unusual to describe how they perceive the runtime.

I agree with your general idea. I'd add that it also looks very similar to what typical GNU/Linux distros have in practice: blessed packages from the distros' repositories, and third-party repos for those who want them.

Debian also has something 'in the middle' with additional repositories that aren't part of the main distribution and/or contain proprietary software.


The author would probably find joy in using Zig.

Personally, my biggest complaint about Rust is that I wish it were more readable. I've seen function signatures that seemed straight out of C++.


> Personally, my biggest complaint about Rust is that I wish it were more readable. I've seen function signatures that seemed straight out of C++.

There is always a trade-off. You really cannot provide enough information to the compiler without the current signatures. There is a point beyond which you cannot compress the information without losing some features.


It's always an option to create type aliases, but there's a bit of "robbing Peter to pay Paul" happening when you do that.

You make the signature shorter but also take away the ability for the programmer to quickly understand what code is doing.
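To sketch the trade-off (in TypeScript for brevity rather than Rust, and with invented names, but the same idea applies):

  // The raw signature spells out every detail, at the cost of length:
  function groupByKey(
    items: Array<{ key: string; value: number }>
  ): Map<string, Array<{ key: string; value: number }>> {
    const out = new Map<string, Array<{ key: string; value: number }>>();
    for (const item of items) {
      const bucket = out.get(item.key) ?? [];
      bucket.push(item);
      out.set(item.key, bucket);
    }
    return out;
  }

  // Aliases shorten the signature, but now the reader has to chase two
  // extra definitions to learn what the function actually takes:
  type Entry = { key: string; value: number };
  type Grouped = Map<string, Entry[]>;
  const groupByKeyShort = (items: Entry[]): Grouped => groupByKey(items);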


I fully believe you; I don't know what the solution would be either.


The Debian stable model of having a distro handle common dependencies with a full system upgrade every few years looks more and more sane as years pass.

It's a shame some ecosystems move waaay too fast, or don't have a good story for distro-specific packages. For example, I don't think there are Node.js libraries packaged for Debian that allow you to install them from apt and use them in your projects. I might be wrong.


Never mistake motion for action.

An ecosystem moving quickly, when it isn't being fundamentally changed, isn't a sign of a healthy ecosystem, but of a pathological one.

No one can claim that JS has progressed substantially in the last three years, yet building a three-year-old project that hasn't been updated is so hard that a rewrite is a reasonable solution.


> No one can claim that JS has progressed substantially in the last three years

Are we talking about the language, or the wider ecosystem?

If the latter, I think a lot of people would disagree. Bun is about three years old.

Other significant changes are Node.js being able to run TypeScript files without any flags, and being able to use require() on ES modules. I see positive changes in the ecosystem in recent years.
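To make those concrete, a minimal sketch (file names invented; as I understand the timeline, type stripping is on by default since Node 23.6, and unflagged require() of ES modules landed around Node 22.12):

  // greet.ts: recent Node runs this directly with `node greet.ts`,
  // no flags and no build step (the type annotations are stripped):
  function greet(name: string): string {
    return `Hello, ${name}!`;
  }
  console.log(greet("Node"));

  // And in a CommonJS file, require() can now load an ES module
  // synchronously:
  //   const { greet } = require("./greet.mjs");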


That is motion, not action.

The point of JavaScript is to display websites in the browser.

Ask yourself, in the last three years has there been a substantial improvement in the way you access websites? Or have they gotten even slower, buggier and more annoying to deal with?


No but the devs can push slower, buggier, more annoying websites to prod, faster!

And after all, aren’t developer velocity (and investor benefits) really the only things that matter???

/sssss


> The point of JavaScript is to display websites in the browser.

I don't follow. JavaScript is a dynamic general-purpose programming language. It is certainly not limited to displaying websites, nor is it a requirement for that. The improvements I mentioned in the previous post aren't things you'd benefit from inside a web browser.


> modern websites

You are comparing the JS ecosystem with bad project implementations/designs.

> Action vs motion

I think the main difference you mean is the motivation behind changes: is it a (re)action to achieve a measurable goal, a fix for a critical CVE, or just some dev having fun and pumping up the numbers?

GP mentioned the recent feature of executing TS, which is a reasonable goal IMO, with a lot of beneficial effects down the line, but in the present just another hassle to take care of. So is this pointless motion or worthy action? Both statements can be correct, depending on your goals.


> For example, I don't think there are Node.js libraries packaged for Debian that allow you to install them from apt and use them in your projects

A web search shows some: https://packages.debian.org/search?keywords=node&searchon=na... (but it also shows "for optimizing reasons some results might have been suppressed", so it might not be all of them).

It's probably different for other distros, though; Arch, for example, seems to have none.


Locally, you can do:

  apt-cache showpkg 'node-*' | grep ^Package:

which returns 4155 results, though 727 of them are type packages.

Using these in CommonJS code is trivial; they are automatically found by `require`. Unfortunately, system-installed packages are yet another casualty of the ESM transition ... there are ways to make it work, but it's not automatic like it used to be.
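For example (node-commander is a real Debian package, picked arbitrarily; as far as I know, Debian patches Node's resolution paths to include /usr/share/nodejs):

  // After `apt install node-commander`, a plain CommonJS script finds
  // the system-wide copy without any node_modules directory:
  const { Command } = require("commander");
  const program = new Command();
  program.option("--name <name>", "who to greet").parse(process.argv);
  console.log(program.opts());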


> there are ways to make it work but it's not automatic like it used to be

Out of curiosity, what would you recommend? And what would be needed to make them work automatically?


Some solutions include: adding a hook to node on the command line, faking a `node_modules`, trying to make an import map work, or just hard-coding the path ... most of these are quite intrusive, unlike the simple `require` path.

I really don't have a recommendation other than another hack. The JS world is hacks upon hacks upon hacks; there is no sanity to be found anywhere.
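For what it's worth, here's what two of those hacks look like sketched out (the absolute path and entry file are illustrative, not guaranteed for any given package):

  // Hack 1: hard-code the system path in an ESM import:
  import { Command } from "/usr/share/nodejs/commander/index.js";

  // Hack 2: recreate require() inside an ES module; it resolves
  // system-wide paths the way CommonJS always did:
  import { createRequire } from "node:module";
  const require = createRequire(import.meta.url);
  const { Command: Cmd } = require("commander");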


> Unfortunately, system-installed packages are yet another casualty of the ESM transition ...

A small price to pay for the abundant benefits ESM brings.


Honestly, I don't see the value gain, given how many other problems the JS tooling has (even ignoring the ecosystem).

In particular, the fact that TypeScript makes it very difficult to write a project that uses both browser-specific and Node-specific functionality is damning.


It is possible to work with Rust using Debian repositories as the only source.


The stable model usually implies that your app has to target both the old and the new distro version for a while. That is a bit too much to ask of some, unfortunately.


> Has the value improved in the last year and a half?

I'd say yes, my reasoning being:

1. DDR5 got more expensive;

2. 16-core AM4 CPUs are cheaper, partially due to the release of the 5900XT (a 16C/32T CPU, basically a rebranded 5950X with a 100 MHz lower single-core boost clock);

3. Lots of gamers are selling their used AM4 kits at good prices as they migrate to the 9800X3D.


Agreed. WebSocket works perfectly when someone wants a lightweight message-based protocol on top of TCP and doesn't want to implement one themselves, potentially in multiple languages. The fact that it was originally browser tech became a minor detail.
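A minimal sketch of that use case (the URL and message shape are invented; recent Node ships a browser-compatible WebSocket client as a global, so the same lines run in both environments):

  const ws = new WebSocket("wss://example.com/feed");
  ws.onopen = () => ws.send(JSON.stringify({ type: "subscribe", topic: "ticks" }));
  ws.onmessage = (ev) => console.log("message:", ev.data);
  ws.onclose = () => console.log("closed");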


Wouldn't it look stuttery if the user tried to drag windows or other widgets at a limited framerate?

> What you actually want is low latency

High frame rates directly contribute to lowering latency.

I guess I'm the polar opposite of you here; I heavily prefer low latency and high refresh rates over low CPU/GPU usage, as long as the machine is plugged into a power source.
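Back-of-the-envelope numbers for the latency point (a sketch that ignores input sampling and compositing, which add their own delay):

  // The worst-case wait for fresh input to appear on screen is
  // roughly one frame interval:
  const frameMs = (hz: number): number => 1000 / hz;
  console.log(frameMs(60).toFixed(1));  // "16.7" ms at 60 Hz
  console.log(frameMs(240).toFixed(1)); // "4.2" ms at 240 Hz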


Maybe it's car analogy time: this is a bit like saying, well, it's fine if my car is redlining 24/7, as long as I get the speed I want when I happen to be on the highway.

Nobody is saying they want to tolerate some horribly laggy interface for the sake of lower CPU usage. The point is simply about not wasting enormous amounts of power when it's not needed.

