Hacker News

On open sourcing Wolfenstein, Doom, Quake: https://youtu.be/udlMSe5-zP8?t=591

Neuralink: https://youtu.be/udlMSe5-zP8?t=2523

Artificial General Intelligence: https://youtu.be/udlMSe5-zP8?t=2778

Quantum Computing: https://youtu.be/udlMSe5-zP8?t=3106

“Engineering is figuring out how to do what you want with what you’ve actually got”: https://youtu.be/udlMSe5-zP8?t=3190

End of Moore’s Law / On CPU architecture: https://youtu.be/udlMSe5-zP8?t=3860

5G and streaming (games & video): https://youtu.be/udlMSe5-zP8?t=4288

edit: as already mentioned, a lot of topics are covered, some for just a few sentences; the conversation flows well, worth watching the whole thing




Carmack briefly mentions 5G and streaming games. I think there is a good economic reason why 5G gaming is eventually coming (it may take some time): low latency enables it.

If you think about the price of a gaming PC or console, there is a huge discrepancy between the budget of a hardcore gaming enthusiast and that of a casual gamer. It would be nice to have a $5,000 gaming tower in every house, but you don't. Many casual gamers would rent $5,000 - $10,000 worth of gaming hardware for a few hours a week if it were simple. The only way to get bleeding-edge high-end gaming to the masses is to put the GPU and some other parts at the edge (or at least within the same city) and stream or partially stream the game to cheaper computing devices and screens.

Consider $5,000 worth of bleeding-edge hardware that costs $1/hour to run. If you rent it out for $8/hour, and it only sells for 5 hours per day for gaming, the hardware pays for itself in about 5 months. It could be rented out for other stuff in the meantime. Cloudflare, what do you think?
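The payback arithmetic above can be sketched quickly (all figures are the hypothetical ones from this comment, not real pricing):

```python
# Back-of-the-envelope payback for renting out a gaming rig.
# All numbers are hypothetical, taken from the comment above.
hw_cost = 5000            # upfront hardware cost, $
run_cost = 1.0            # operating cost, $/hour
rent_price = 8.0          # rental price, $/hour
hours_per_day = 5         # paid gaming hours per day

margin_per_day = (rent_price - run_cost) * hours_per_day  # $35/day
payback_days = hw_cost / margin_per_day                   # ~143 days, roughly 5 months
print(round(payback_days))  # -> 143
```

Idle hours rented out for other compute would only shorten the payback period.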

I could see a market emerging similar to old-school VHS/DVD/game rentals. There is a limited computational resource near you, and you can rent it for gaming. If it's all taken (weekend evenings), the 'shelves' are empty. On working/school days and hours you get the same thing cheaper.

It will probably happen in Japan, South Korea, and the Nordic countries first.


Do you really get significant cost savings? You can't really use hardware from other time zones / continents, because the latency would be very noticeable. So the hardware has to be reasonably close to the users.

Then I would imagine the problem is that everyone is playing video games at about the same time, i.e. in the evening or at certain times of the weekend. So you need to provision for peak usage.

So to me the only benefit to renting is if there are lots of occasional gamers who do not play every day / every week, including peaks where they would all suddenly play at the same time on certain days (Christmas, long weekends, the release of a new game, etc.).

And then of course because the hardware progresses so quickly you need to amortise the hardware pretty fast.

I haven't seen the actual economics but I am surprised it would be much cheaper for consumers.


You might be able to make it work if you can get a gaming tier on a cloud platform. EC2 already has GPU-focused instances.

I can see things working out, though still expensive, if the hardware is also being used for scientific computing and CGI rendering.

Although I'm not sure if gaming hardware can run 24/7.


I first heard about this a year ago, and was amazed that the numbers could work and that RDP-style lag wouldn't make it unviable. I can't find the Reddit post at the moment, but we had a few exchanges where it was explained that neither latency nor availability is an issue.

This is still a fairly niche market, and while there are certainly evening peak times, younger kids have a lot more free time than you might think.

How well the company hosting this is doing, I couldn't tell you, but the product itself seemed to work, and work well, so much so that I have had it on my to-do list to try and experience it for myself.


The speed of light is really fast, and Google has fiber connecting all of its data centers. I think the set of data centers viable for game streaming overlaps more than you'd expect; it's limited by how much bandwidth Google can spare between data centers.

The reason most services need to be located close to clients is because you want to avoid data transit over the open internet. You want as little open internet as possible between you and the client. Traffic that's internal to Google's networks doesn't have that problem. You can use compute / gpu in US-west and transit that data to you via US-east and the additional latency would be measured in nanoseconds.


With a straight shot, round-trip latency from one coast to the other is at least 50 ms [0]. A more typical route is on the order of 80 ms.

That's definitely noticeable for latency-sensitive actions. I recently switched a server from Oregon to New Mexico, and I notice the latency increase with mosh.

Moreover, there's not a lot of timezone difference between the east and west coasts of the US. Going someplace like Europe is more like 180 ms.

I've played games with a ping like that, but a lot of the ping was my wifi. Doubling the latency would not make the game a better experience.

This could work well for certain types of game, those that are a little less latency-sensitive. But in general the latency is still a big issue.

[0]: speed of light in fiber is ~2e8 m/s (about two-thirds of c); over the ~3000 miles between coasts that's ~24 ms one way, so ~50 ms round trip
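As a sanity check on the numbers in [0], here is the same estimate as a sketch, taking the signal speed in fiber as roughly two-thirds of c (~2e8 m/s):

```python
# Coast-to-coast latency over fiber: straight-line, best case,
# ignoring routing, switching, and serialization overhead.
C_FIBER = 2.0e8            # signal speed in fiber, m/s (~2/3 of c)
DISTANCE_M = 3000 * 1609   # ~3000 miles between US coasts, in meters

one_way_ms = DISTANCE_M / C_FIBER * 1000   # ~24 ms
round_trip_ms = 2 * one_way_ms             # ~48 ms

# At 60 fps a frame lasts ~16.67 ms, so a ~50 ms round trip
# already puts you roughly 3 frames behind.
frame_ms = 1000 / 60
frames_behind = round_trip_ms / frame_ms
```

Real routes add router hops and less-than-straight paths on top of this physical floor, which is how you get to the ~80 ms typical figure.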


Wow, I never realized my intuition was off by so much. Even at the speed of light in vacuum, the one-way trip between coasts takes about 16 ms, which is orders of magnitude worse than I thought. I thought most latency came from routers and switches, but a significant portion of the delay is indeed the speed of light.


Exactly. And 60 fps (which is not that high for many games) gives you 16.67 ms per frame. At a 50 ms delay you're already 3 frames behind.


Not all games are that latency dependent. If I am playing Civilization I probably won't care that much where the server is.


Neither will you need to rent an expensive gaming rig by the hour, though.


So.... Google Stadia? Except it's more like $10/month for 4k gaming.


There seems to be confusion about what hardcore gaming means: competitive gaming, a high-end user experience, or bragging-rights spec-based enthusiasm.

My benchmark would be the Xbox One X, which is perfectly capable of 4K 60 fps gaming with HDR and Atmos; try to get that running stably on any PC, good luck. That's $400 for the console and under $2,000 in total if you also need to get an OLED TV and an Atmos-enabled speaker set, which aren't exclusively gaming budget.

The only shortcomings there are the absence of fast current-generation NVMe mass storage and an RTX-enabled GPU.

Reasonably high-end PC gaming is possible well below $2000 or even $1000; even $500 will get you decent performance for casual competitive gameplay.

The priciness of gaming setups comes from the insane demand for getting an edge at 4K 120+ fps in FPS shooters. That bracket won't be won over by any streaming service, given the physical limitations, for the next 10 years or so at least.


The Xbox One X doesn't maintain those specs. The resolution drops during gameplay.

You actually need a machine north of $2k to support 4K at 60 fps.


> That's 400 for the console

In reality it is $400 for the console + $60/year for online gaming, which means it is about $640 for 4 years of gaming.

Others have mentioned that you can't get 60 fps @ 4K, but I have never played on an Xbox One X, so I can't attest to that.


You are not correct. The hardware in gaming PCs and consoles is more or less the same; the only advantage is that game devs can optimize for a specific hardware setup instead of all possible combinations like on PC.

If a console is running 4K@60Hz, the level of detail is probably somewhere around the low settings of the PC version. For that, you don't need a $2000 PC; something much cheaper would be enough. On top of that you can use it, well, as a PC. But yes, consoles will be cheaper, just not by that much, and their purpose is much, much narrower.


> My benchmark would be the XBox One X, which is perfectly capable of 4k 60 fps gaming

Going to disagree with you here. I don't think there is even one game that runs at full 4K (not CBR) and maintains 60fps. If there is one I would love to know so I can check it out but so far everything I play on my Xbox One X is either running at well below 4K in order to maintain 60fps or runs at 30fps with 4K CBR. Very few games are true 4K.


>try to get that stable running with any PC

Easy. It will obviously be much more expensive than a console, but it's perfectly doable if you have the budget for it.


I have the budget, but not the time. Unless by budget you mean paying your own household QA technician. Dolby Atmos over HDMI plus HDR does not rhyme with NVIDIA drivers, or only very few of them, which then again tends to break my VR.

https://www.reddit.com/r/nvidia/comments/ao4c0u/anyone_with_...


I’m not so sure you know what you’re talking about.

4K and >120 FPS are incompatible goals. If you are optimizing for frames in a competitive title the first thing you will do is turn total render target size down to the minimum. Conversely, if you want to render at max quality, those pixels look a lot nicer as resolution than as frames.

If you’re a real pro, you can spend five times as much money to hit 80% of cutting-edge performance on both metrics simultaneously. You hardcore gamer, you.

People who talk like that are usually kids spending their parents’ money.

FYI, genuine demand for higher gaming performance is mostly being driven by VR, where sentences like “8K at 144hz” aren’t just big numbers.


Me East German born orphan peasant gamer has of course no idea what words mean. ¯\_(ツ)_/¯


My kid plays 1-5 hours of Xbox every day. There’s no way I could pay $1/hour.

Renting game consoles only works for casual gamers or super-hardcore people who want $5k rigs. Casual gamers don't care, I think, and use their 5-year-old iPhone. And there aren't a ton of super-high-end gamers. And I hang out in gaming cafes where people pay $6/hour.


Thankfully, actual cloud gaming services are way cheaper than $1/hour. Stadia, for example, has no recurring monthly cost for 1080p gaming; you just have to pay for the games.


$1/hr at 5 hours per day is ~$150/month. I have no idea what you can afford but that’s basically the price of cable these days.


It's also the entire cost of a console these days. The current status quo is a better comparison than a different service (cable) where cutting it out entirely is a meme due to its ludicrous cost and subpar value.


It’s also $1800/year. I can buy an Xbox or PS4 for $500. I can buy a gaming PC for $2k.

Paying $1800/year per person forever is a bad deal.
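The consumer-side arithmetic behind these figures can be sketched (using the thread's hypothetical $1/hour rate and quoted hardware prices):

```python
# Yearly cost of renting at the thread's hypothetical rate
# vs. the one-off hardware prices quoted upthread.
rent_per_hour = 1.0    # $/hour, hypothetical
hours_per_day = 5      # heavy-use case from upthread

yearly_rent = rent_per_hour * hours_per_day * 365   # ~$1,825/year
console_price = 500    # quoted Xbox / PS4 price, one-off
gaming_pc_price = 2000 # quoted gaming PC price, one-off
print(round(yearly_rent))  # -> 1825
```

At this usage level a year of renting costs more than three consoles, which is the crux of the disagreement: the rental model only pencils out for people who play far fewer hours.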


Those countries already have fibre; are you saying 5G is better than fibre for streaming gaming?


Even in the wonderland of fiber, South Korea, fiber to the home has slowed down and fiber to the building is common. Korea Telecom still has lots of coaxial setups that it stretches into gigabit speeds using 1:N connections. Gigabit penetration in Seoul was still below 50% a few years back, and much lower elsewhere.

Btw, fiber has no inherent latency advantage. What you need is servers at the edge in both use cases.


Fiber has a latency advantage in real-world usage. In the common scenario of multiple computers and users sharing a single connection, if one of them does something latency-sensitive (gaming or video chat), they will be bothered a lot less by others' heavy bandwidth use on fiber than on ADSL or, to a lesser extent, cable.

That's because heavy bandwidth use (downloads, video streaming) will saturate a low-bandwidth link and packets will drop. Or bufferbloat will increase latency without dropping packets, which is just as bad for latency-sensitive usage.

With multiple video streams going on in the average household being a common case nowadays, it's nice to have enough bandwidth.


Here are these timestamps on one clickable video, feel free to add your own: https://use.mindstamp.io/video/XFnaNsKJ?notes=1


this is great!


I'd be very curious to see an Ask HN on the premise of “Engineering is figuring out how to do what you want with what you’ve actually got”, but in the context of understanding what you can actually manage to learn or build with your current skills and capacity to learn. It could be a very interesting take on impostor syndrome and effective ways to learn.


> “Engineering is figuring out how to do what you want with what you’ve actually got”

along the same lines as... _life is figuring out what to do with what you’ve got_.



