
So to summarize: Fyne is a (cross-platform?) library for GUI apps, and you believe it's more productive than existing libraries, so you wanted a native window manager because ... exactly why? What exactly is the saving or advantage?

Because I tried working on existing ones and it was painful.

I wanted a modern alternative to exist that was approachable and fun. And once we have met existing systems we can start to exceed them ;).


If you only understood how white hot and stone cold a research field can be! Five years ago crypto was white hot and today it's almost stone cold. People get intoxicated by the money beams that get blasted at the white-hot subfields, but they don't realize that only 5-10% of PhDs are in those subfields, and that they simply got lucky five years ago when they picked them. All that attention hopefully lasts for 10 years, which is enough time to get tenure or succeed with a startup; if the field isn't white hot for 10 consecutive years, they will get fired at tenure time!

When last I looked there were three physics faculty in the United States for every two PhD students. That's because there just aren't any industry jobs for physics PhDs. Assuming a 40-year career, that means one job opening every 13 years for those two PhDs (one graduating every 3 years) to compete for - so a 1 in 4 chance of getting any physics job! Which is completely insane. It's a lucky thing there are trillions of computer science jobs for these unemployed physicists to take - and take them they sure do!

The oversupply has extended from PhDs down to bachelor's degrees. With 37% of Americans getting bachelor's degrees, the degree doesn't mean anything anymore and is wasteful overtraining. I'm no Trumper, but somebody has to stop the greedy life-wasting that academics have created by overfunding a lot of stupid duplicate research and excessive college educations for people who never have an impact...

Some would say you educate people to cultivate an engaged citizenry

Very few people say that. Overwhelmingly the rhetoric one hears is that the purpose of higher education is to get a better job.

Personally, I think it would be great if we educated people to cultivate an engaged citizenry. But if we're going to do that we have to be up-front about it and work on an economic model that supports it. So, for example, you can't have student loans that are predicated on being able to obtain a certain level of income on graduation, and you certainly can't make those loans impossible to discharge even in bankruptcy. If you lie about it, as we have been for decades now, it all unravels sooner or later.


> Very few people say that.

It's not very fashionable on HN because of the faux-tough utilitarian outlook, sure. In real life, there might be such a thing as over-education, but the US is certainly not there.


Thomas Jefferson said that a bunch

You need to keep the context in mind. He lived in a time when the overwhelming majority of society was self-employed and there was no formalized, let alone compulsory, educational system whatsoever. Looking up the exact history, the first compulsory education began in 1852 (Jefferson died in 1826 at the age of 83), when children aged 8 to 14 were required to spend at least 3 months a year in 'schooling', with at least 6 weeks of it being consecutive. [1]

And in the early 19th century nearly 100% of Americans lived in rural areas where access to centralized information was minimal. There was no internet, radio, or other means of centralized communication. For that matter, there wasn't even electricity. The closest they'd have had would have been local newspapers. So people without any education would have had very little idea about the world around them.

And obviously I don't mean what was happening half-way around the world, but what was happening in their own country, their own rights, and so on. Among the political elite there was a raging battle over federalism vs. confederalism, but that would have had very little meaning for the overwhelming majority of Americans. Jefferson won the presidency in 1800 with 45k votes against John Adams' 30k votes, when the country's population was 5,300,000!

[1] - https://www.ebsco.com/research-starters/history/history-publ...


Even into the 1950s and early '60s, my dad went to a one-room school, probably until he was around 14 years old. There was no running water or air conditioning, and in the colder months the job of the first student to arrive was to start a fire in the stove to heat up the room.

Had he been born a few years earlier, it would have been unlikely for him to even graduate. 1940 was the first year that the graduation rate hit 50%.


Absolutely, although by then electricity, radio, urbanization, and other such things had already radically reshaped the overall character of society into something much closer to today than to Jefferson's time.

Jefferson, in modern parlance, would probably be a 'pragmatic libertarian.' He envisioned independent, self-reliant people, and in fact (like many of the Founding Fathers) was somewhat opposed to letting 'economically dependent' people, including wage laborers, vote - for fear that their vote could be coerced too easily, and that they might otherwise be irresponsible. That's where things like property ownership as a voting requirement came from.

And a major part of self-reliance is an education that is both broad and fundamental, which is where the 'pragmatic' part comes in: I think fundamental libertarianism would view education as exclusively a thing of the private market, whereas Jefferson supported broad, public education precisely as part of this formula for independence.


Thomas Jefferson was one person, and he died over 200 years ago.

Nations with high GDP tend to be service economies. Service professions tend to require good reading and writing skills, and often a college-level specialization. (No need for PhDs, though, except for scientists.)

This is 100% true, but by the early 1990s computer science PhDs were ALSO in the shitter because of all the 1990 layoffs and the total shutdown of industrial basic research, so every industry CS PhD was trying to get a professorship so they could continue to do research after all the researchers in industry had been laid off!!! Science is a pyramid scheme, and a very shallow pyramid, with 80-90% cuts at every level!!!

Computer science has become the worst profession of all now, because all the OTHER scientists say, "It's okay if I can never have a career in a scientific field - I'll just switch and become a computer scientist!" And most of them will work for food ...

I graduated in 1993, when there were 20 US positions for the 1000 CS PhDs. "That's okay," you say? "How many were from top schools," you say? 200, that's how many PhDs were from top-10 schools ... So, 10:1, a 90% cut ...

I did get a tenure-track job (outside the US, in Canada) but could not live on the incredibly low pay in the most expensive city in North America (by income-to-housing-cost ratio).


Vancouver mentioned!

Just FYI Google runs their data centers at 85 degrees F (about 30 degrees C). I think Google probably knows more about how to run Intel CPUs for longest life and lowest cost per CPU cycle. After all they are the #5 computer maker on earth. What Intel is doing and what they are recommending is the act of a desperate corporation incapable of designing energy-efficient CPUs, incapable of progressing their performance in MIPS per Watt of power. This is a sign of a failed corporation.


Google runs datacenters hot because it's probably cheaper than over-cooling them with AC.

Chips are happy to run at high temperatures; that's not an issue. It's just a tradeoff of expense and performance.


> Just FYI Google runs their data centers at 85 degrees F (about 30 degrees C). I think Google probably knows more about how to run Intel CPUs for longest life and lowest cost per CPU cycle. After all they are the #5 computer maker on earth.

Servers and running things at scale are way different from consumer use cases and the cooling solutions you'll find in the typical desktop tower, esp. considering the average budget and tolerance for noise. Regardless, on a desktop chip, even if you hit tJMax, it shouldn't lead to instability as in the post above, nor should the chips fail.

If they do, then that value was chosen wrong by the manufacturer. The chips should also be clocking back to maintain safe operating temps. Essentially, squeeze out whatever performance is available with a given cooling solution, be it passive (I have some low-TDP AM4 chips with passive Alpine radiator blocks), air coolers, AIOs, or a custom liquid loop.
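If you want to actually watch that clocking-back behaviour, here's a minimal sketch (assuming a Linux box that exposes sensors through hwmon, e.g. coretemp on Intel or k10temp on AMD) that just polls the sysfs temperature files; the 95 C ceiling is an illustrative number, not any chip's real tJMax.

    # Minimal sketch: poll CPU temperatures via the Linux hwmon sysfs interface.
    # Sensor names (coretemp, k10temp, ...) and available files vary by platform.
    import glob
    import time

    CEILING_C = 95.0  # illustrative threshold only; real tJMax differs per chip

    def read_temps():
        temps = {}
        for name_file in glob.glob("/sys/class/hwmon/hwmon*/name"):
            hwmon_dir = name_file.rsplit("/", 1)[0]
            with open(name_file) as f:
                chip = f.read().strip()
            for temp_file in glob.glob(hwmon_dir + "/temp*_input"):
                with open(temp_file) as f:
                    millideg = int(f.read().strip())  # values are millidegrees C
                temps[chip + ":" + temp_file.rsplit("/", 1)[1]] = millideg / 1000.0
        return temps

    while True:
        for sensor, celsius in sorted(read_temps().items()):
            note = "  <-- near ceiling" if celsius >= CEILING_C - 5 else ""
            print(f"{sensor}: {celsius:.1f} C{note}")
        time.sleep(2)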

> What Intel is doing and what they are recommending is the act of a desperate corporation incapable of designing energy-efficient CPUs, incapable of progressing their performance in MIPS per Watt of power.

I don't disagree with this entirely, but the story is increasingly similar with AMD as well - most consumer chip manufacturers are pushing the chips harder and harder out of the factory so they can compete on benchmarks. That's why you hear about people limiting the power envelope to 80-90% of stock and dropping close to 10 degrees C in temperature, and similarly why you hear about the difficulty of pushing chips much past stock when overclocking: they're already pushed harder than prior generations were.

To sum up: Intel should be less delusional in how far they can push the silicon, take the L and compete against AMD on the pricing, instead of charging an arm and a leg for chips that will burn up. What they were doing with the Arc GPUs compared to the competitors was actually a step in the right direction.
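As a rough sketch of the "limit the power envelope" idea: on a Linux machine where Intel RAPL is exposed through powercap, you could scale the long-term package limit to ~85% of its current value along these lines. It requires root, the sysfs paths are an assumption that varies by kernel, and AMD desktop parts are normally capped through PPT in firmware or Ryzen Master rather than this interface.

    # Rough sketch: scale the RAPL long-term package power limit on Linux.
    # Needs root; powercap layout and zone names vary by kernel and vendor.
    import glob

    SCALE = 0.85  # e.g. cap at ~85% of the current long-term limit

    for zone in glob.glob("/sys/class/powercap/intel-rapl:*"):
        try:
            with open(zone + "/name") as f:
                name = f.read().strip()
            if not name.startswith("package"):
                continue  # skip sub-zones like core/uncore/dram
            limit_path = zone + "/constraint_0_power_limit_uw"  # microwatts
            with open(limit_path) as f:
                current_uw = int(f.read().strip())
            new_uw = int(current_uw * SCALE)
            with open(limit_path, "w") as f:
                f.write(str(new_uw))
            print(f"{name}: {current_uw / 1e6:.1f} W -> {new_uw / 1e6:.1f} W")
        except (OSError, ValueError) as err:
            print(f"skipping {zone}: {err}")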


"Stronghold" is a joke phrase, is it not? Intel had ZERO progress in integrated graphics from 2013-2020. ZERO. That's the reason why "it works so well" - because they NEVER improved the performance or architecture! Sure, they diddled with the number of CU's, but as far as graphics architecture, they never changed it, and it was POOR to begin with (couldn't do 1080p esports very well ...)


x86 is the CPU architecture. I don't believe GP was talking about Intel's iGPU solution at all.


Are x86 consoles a joke?


Consoles have very little lock-in on their architectural choices, since they only ever support a small set of hardware configurations in the first place. I guess some of the current generation are x86-based but it would be very easy to move to ARM for the next generation if they wanted to.


A badly optimized CPU will take excessive amounts of power. The "failure in choosing cooling solutions" excuse is just the pot calling the kettle black.


The 7000 series of CPUs is NOT known for running cool, unlike the AMD 5000 series (which are basically server CPUs repurposed for desktop usage). In the 7000 series, AMD decided to just increase the power of each CPU and that's where most of the performance gains are coming from - but power consumption is 40-50% higher than with similar 5000-series CPUs.


When you use Eco Mode with them you only lose ~5% performance, but are still ~30% ahead of the corresponding 5000-series CPU. You can reduce the PPT/TDP even further and still stay ahead.

https://www.computerbase.de/artikel/prozessoren/amd-ryzen-79...


I specifically singled out the 7800X3D though, it runs incredibly cool and at a very low power draw for the performance you get.


> You know, I'm something of a CPU engineer myself :D

Actually, almost nothing of what you wrote is true, and the commenter above already sent you some links.

7800X3D is the GOAT, very power efficient and cool.


The only reason the 7800X3D is power efficient is that it simply can't use much power, so it runs at a better spot on the efficiency curve. Most of these CPUs won't use more than ~88 W without manual overclocking (not PBO). Compare that to e.g. a 7600X, which has two fewer cores on the same architecture and will happily pull over 130 W.

And even if you could push it higher, they run very hot compared to other CPUs at the same power usage, due to a combination of AMD's very thick IHS, the compute chiplets being small and power-dense, and the 7000-series X3D cache sitting on top of the compute chiplet, unlike the 9000 series, which has it on the bottom.

The 9800X3D limited in the same way will both be mildly more power efficient (from faster cores) and run cooler (because of the cache location). The only reason it's hotter is that it's allowed to use significantly more power, usually up to 150 W stock, for which you'd have to remove the IHS on the 7800X3D if you didn't want to see magic smoke.
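To put some toy numbers behind the efficiency-curve point: assume dynamic power grows roughly with the cube of clock speed (a common rule of thumb, not measured data for any of these parts). Then performance per watt drops quickly as the power budget climbs, which is why a chip capped near 88 W can look unusually efficient next to one allowed 130-150 W.

    # Toy model only: perf ~ frequency ~ power^(1/3) under a cube-law assumption.
    # The 88 W reference mirrors the cap mentioned above; nothing here is
    # measured data for real parts.
    def relative_perf(power_w, ref_power_w=88.0):
        return (power_w / ref_power_w) ** (1.0 / 3.0)

    for watts in (60, 88, 130, 150):
        perf = relative_perf(watts)
        perf_per_watt = perf / (watts / 88.0)
        print(f"{watts:>4} W: perf ~{perf:.2f}x, perf/W ~{perf_per_watt:.2f}x")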


Multics: everything is a memory segment; just get out your soldering iron and wire cutters to begin porting Multics to your hardware today!

IBM: everything is a record type, we have 320 different modules to help you deal with all 320 system-standard record types, and we can even convert some of them to others. And we have 50,000 unfixable bugs because other pieces of code depend upon the bug working the way it does ...

UNIX: everything is an ASCII byte. Done

I started writing code in the 1970s on TOPS-10, TWENEX, and PLATO, and my undergrad thesis advisor helped to invent Multics. The benefits of UNIX are real, folks; the snake oil comes from the hardware vendors, who HATE how it levels the playing field ...
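A tiny sketch of what "everything is just bytes" buys you in practice: the same read() call works identically on a regular file, a device node, and a pipe (assuming a Unix-like system with /dev/urandom).

    # Same byte-stream interface, three very different underlying objects.
    import os

    def first_bytes(fd, n=8):
        return os.read(fd, n)

    fd_file = os.open(__file__, os.O_RDONLY)       # a regular file (this script)
    fd_dev = os.open("/dev/urandom", os.O_RDONLY)  # a device node
    read_end, write_end = os.pipe()                # an in-process pipe
    os.write(write_end, b"hello from a pipe")
    os.close(write_end)

    for label, fd in (("file", fd_file), ("device", fd_dev), ("pipe", read_end)):
        print(label, first_bytes(fd))
        os.close(fd)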

