I think this old quote can come around to being accurate in a way, if you consider that from the user's perspective every cloud service is like one system. AWS, Azure, Google Cloud... how many will there be when the dust settles? ;-)
Reading Uncle Bob's book was the single most harmful thing I ever did to myself as a software engineer. I wasted so much time writing worthless tests. TDD? Ugh, no way.
>The option up here really truly is "do we use fossil fuels, or do we use nuclear". Renewables do not help.
Hey now - renewables gave us electricity up here long before Einstein started thinking about atoms!
There are very few of us up here, so 250 MWh helps a lot, but if we have to chip in to build a nuclear plant we'll be broke before the project planning is done. ;-)
I think the fragmentation is good: it lets many talented people work in the same area at once and lets users choose what fits them best. The problem is that this leaves a lot of the polishing work to the users, and the defaults - which most stick with - are often the most boring, safe choice.
I think what DHH did with Omakub (and Omarchy) was a constructive solution here - use the myriad of options to pick out a set of components and configs that work well together. Polish the selections, hide and ignore the capabilities that don't fit in, and document how to use the resulting "bundle" in great detail.
>8K at jumbo TV size has relatively large pixels compared to an 8K desktop monitor. It’s easier to manufacture.
I don't think that's true.
I've been using an 8K 55" TV as my main monitor for years now. It was available for sub-800 USD before all such TVs vanished from the market. Smaller pixels were not more expensive even then; the 55" models were the cheapest.
4K monitors can be had for sub-200 USD, and selling 4x the area of the same panel tech should cost at most 4x that price. And it did, years ago.
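Back-of-the-envelope, with round numbers (the ~27.5" 4K and 55" 8K sizes are just convenient comparison points):

    # Rough comparison: an 8K 55" TV vs. the cheap 4K 27"-class monitors.
    pixels_4k = 3840 * 2160      # ~8.3 megapixels
    pixels_8k = 7680 * 4320      # ~33.2 megapixels
    print(pixels_8k / pixels_4k) # 4.0 - an 8K panel is four 4K panels' worth of pixels

    # For the same aspect ratio, screen area scales with the diagonal squared,
    # so a 55" panel is about four times the area of a 27.5" panel.
    print((55 / 27.5) ** 2)      # 4.0

    # If a 4K monitor goes for ~200 USD, 4x the area and 4x the pixels should
    # top out around 4x the price - roughly the sub-800 USD those 8K 55" TVs
    # actually sold for.
    print(4 * 200)               # 800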
So they were clearly not complicated or expensive to manufacture - but there was no compelling reason to have 8K on a TV, so they didn't sell. However, there IS a compelling reason to have 8K on a desktop monitor!
That such monitors sell for 8000 USD+ is IMO a very unfortunate situation, caused by weirdly incompetent market segmentation by the monitor makers.
I firmly believe they could sell 100x as many if they cut the price to 1/10th, which they clearly could do. The market that never appeared for TVs is there among the world's knowledge workers, for sure.
I've been using an 8k 65" TV as a monitor for four years now. When I bought it, you could buy the Samsung QN700B 55" 8k, but at the time it was 50% more than the 65" I bought (TCL).
I wish the 55" 8k TVs still existed (or that the announced 55" 8k monitors were ever shipped). I make do with 65", but it's just a tad too large. I would never switch back to 4k, however.
The average bitrate from anything that isn't a Blu-ray is not good even for HD, so you don't benefit from more pixels anyway. Sure, you are decompressing and displaying 8K worth of pixels, but the actual resolution of your content is more like 1080p, especially in the color channels.
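Rough numbers to illustrate (the bitrates are typical assumed values; actual streams vary by service):

    # Bits per pixel per frame for various sources (assumed, typical bitrates).
    def bits_per_pixel(bitrate_mbps, width, height, fps=24):
        return bitrate_mbps * 1e6 / (width * height * fps)

    print(bits_per_pixel(16, 3840, 2160))   # ~0.08 - typical "4K" streaming
    print(bits_per_pixel(80, 3840, 2160))   # ~0.40 - 4K Blu-ray territory
    print(bits_per_pixel(8, 1920, 1080))    # ~0.16 - ordinary 1080p streaming

    # And nearly all consumer video is 4:2:0 chroma subsampled, so the color
    # channels of a "4K" stream only carry a 1080p-sized grid of samples:
    print(3840 * 2160 // 4 == 1920 * 1080)  # True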
Normally, games are the place where arbitrarily high pixel counts could shine, because you could literally ensure that every pixel is calculated and make real use of it, but that's stupidly hard at 4K and above, so Nvidia just told people to eat smeary AI-upscaled garbage instead, throwing away the entire point of having a beefy GPU.
I was even skeptical of 1440p at higher refresh rates, but I bought a nice monitor with those specs anyway and was happily surprised by the improvement - though the diminishing returns are obvious.
This is exactly why 8K TVs failed in the market, but the point here is that your computer desktop is _great_ 8K content.
The TVs that sold for sub-1000 USD just a few years ago should be sold as monitors instead. Drop the TV tuners, app support, network hardware and such, and add a DisplayPort input.
Having a high-resolution desktop that basically covers your usable FOV is great, and is a far more compelling use case than watching TV in 8K ever was.
HDMI 2.1 is required, and the cables are not too expensive now.
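A rough sketch of why 2.1 is the cutoff (uncompressed bandwidth, ignoring blanking overhead):

    # Uncompressed video bandwidth in Gbit/s (ignores blanking intervals).
    def gbps(width, height, fps, bits_per_pixel):
        return width * height * fps * bits_per_pixel / 1e9

    print(gbps(3840, 2160, 60, 24))  # ~11.9 - fits HDMI 2.0 (18 Gbit/s)
    print(gbps(7680, 4320, 60, 24))  # ~47.8 - needs HDMI 2.1 (48 Gbit/s FRL),
                                     #   and in practice DSC or 4:2:0 on top
    print(gbps(7680, 4320, 60, 30))  # ~59.7 - 10-bit color only fits with DSC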
For newer GPUs (Nvidia 3000-series or equivalent) and high-end (or M4+) Macs, HDMI 2.1 works fine, but the Linux drivers have a licensing issue that makes HDMI 2.1 problematic.
It works with certain Nvidia drivers, but I ended up getting a DP-to-HDMI 8K cable, which was more reliable. I think it could work with AMD and Intel as well, but I haven't tried.
In my case I have a 55" and sit a normal monitor distance away. I made a "double floor" on my desk with a cutout for the monitor, so the monitor legs sit some 10 cm below the actual desk and the screen starts basically at the level of the desk surface. The gap between the desk panels is nice for keeping USB hubs, drives, headphone amps and such. And the Mac mini.
I usually have reference-material windows upper left and right, the coding project upper center, the code editor bottom center, and 2 or 4 terminals, Teams, Slack and mail on either side of the coding window. The center column is about twice as wide as the sides. I also have other layouts depending on the kind of work.
I use layout arrangers like FancyZones (from PowerToys) on Windows and a similar mechanism in KDE, and manual window management on the Mac.
I run 2x scaling, so I get basically a 4K desktop area but at retina(-ish) resolution. 55" is a bit too big, but since I run doubling I can still read things in the corners. A 50" 8K would be ideal.
Basically the biggest problem with this setup is that it spoils you, and it was only available several years ago. :(
I managed to grab a 55" 8K LG before 8K went out of fashion. I run it at 4K120 for games and 8K60 with doubling for productivity.
I've never had a better monitor, and if a better one exists it's not available in any store I know of. Monitors costing 2-3x what this TV did back then are worse. When it dies I will have to downgrade. :-/
Hmmm, I'm confused now: aren't 8K displays just becoming a thing? Your perspective sounds like they are a dying breed. In the meantime, for me, they are still prohibitively expensive.
>aren't 8K displays just becoming a thing? Your perspective sounds like they are a dying breed. In the meantime, for me, they are still prohibitively expensive.
I'd say yes and no - they are becoming a thing - again. And you're right that they are prohibitively expensive this time.
Some five years ago, 8K TVs were heavily marketed and displayed in many electronics stores, but consumers apparently didn't bite - there was basically no 8K content available, and for "normal" TV use you can barely see a difference between 4K and 8K anyway.
So these TVs were very cheap for a short while before they basically disappeared.
And they make for great PC monitors. At normal working distance from a monitor you definitely notice the difference between 4K and 8K.
The screen area is basically the same as a 2x2 grid of 27" 4K monitors, but in one screen. For productivity work it's absolutely glorious; text is super-crisp.
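The density math, with round numbers (assuming standard 16:9 panels):

    import math

    # Pixels per inch for a 16:9 panel, given its resolution and diagonal size.
    def ppi(width, height, diagonal_inches):
        return math.hypot(width, height) / diagonal_inches

    print(ppi(3840, 2160, 27))  # ~163 PPI - a typical 27" 4K monitor
    print(ppi(7680, 4320, 55))  # ~160 PPI - a 55" 8K TV: same density, 4x the area
    print(ppi(3840, 2160, 55))  # ~80 PPI  - a 55" 4K TV looks coarse at desk distance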
I think there is a market structure problem that blocks progress here. Most people who work all day at a monitor would love to have such a screen, but the people paying for screens buy what the producers are selling, based on price.
So we end up with dual or triple small-monitor setups even in the wealthiest companies. If a few of the FAAMGs started asking for a 50" 8K maybe something would happen, but it hasn't yet. :(
Storage is not needed. You can consume solar power as it's generated, and it is as useful then as if it came from oil, gas or coal.
When the sun goes down, you have saved tons of oil, gas and coal that didn't have to burn during the day. Which is very, very good. You don't have to "solve nighttime" before solar makes sense; it makes sense immediately.
Edit: And of course, nighttime is also being solved, in many ways, already.
Unless blackouts or brownouts are going to be allowed to become more frequent than the very rare events they are at present, sources of immediately dispatchable electricity are needed. Large-scale hydro plants and battery storage are expensive to have sitting around. Coal can take several hours to ramp up generation; gas can be much faster, a few hours from a cold start.
Nighttime is predictable. I'm specifically including bad weather in solar/wind generation, which can mean zero output - sometimes for days. Storage needs to be reserved for these events.