
>As a gamer, why would you want to spend a few hundred bucks on a gaming box, when it isn't able to play the biggest hits? Who would want to deliberately limit their ecosystem to indie games?

???

Look at the Steam top 100: sure, there are 2 or 3 games on there you won't be able to play, but the rest work just fine. And sure, there are popular games outside Steam, but even if none of them worked (which is not true), for most gamers it's a non-issue. (And Valve is probably not really concerned about them.)

The only games this limits are online competitive games (mostly FPS). There are plenty of gamers, myself included, who have zero interest in such games.

In short, even if zero online FPS games were playable on a Steam console (which is not true), there would still be tens of millions of gamers who wouldn't care.

As for why people wouldn't pick something that can play 100% of games: because they can't. Even the best PC can't play Nintendo games, not all PS games are on PC or Xbox, etc. There is always a trade-off. And plenty of people still buy PCs, Decks, PS5s and Switch consoles.

My guess is more people won't buy it because they want better specs, not because a few games won't work on it.

But that still leaves millions, potentially tens of millions of people.


It's not just porn blocking. That's just what makes the newspapers. Porn blocking is only a small part.

Essentially, you have to perform a risk assessment if your site contains any content inappropriate for children (which the new law defines rather vaguely), and then you have to age-verify all visitors from the UK or risk fines.

Since the service allows user uploads, their site could potentially qualify. And even if it doesn't, you need a lawyer to go through everything to make sure it doesn't. Sure, the chance their site gets targeted is small, but it's not zero.


It's not that I think UE5 is good for low-end hardware; it's not.

One of the reasons a lot of studios struggle with bad performance on UE5 is that they fired their most experienced devs and hired a bunch of cheaper new programmers, because they bought into the whole make-games-with-Blueprints idea. I have several friends (I know, just one data point) who had been in the games industry for 6 to 12 years and got fired, just so the studio could replace them with cheaper, more inexperienced devs.

Basically, UE5 overpromised how easy it was. You still get some great, well-performing games that use UE5, but those come from studios that have experienced devs.


It’s not terrible on low-end hardware. Fortnite has been able to run on phones for a long time now. It’s not as lightweight as Unity or Godot by any means, and they still remain the optimal choice for low-end platforms.

What you can’t do is hit compile out of the box and expect it to work well on those low-end platforms, because it will try to use all the high-end features if it thinks it’s allowed to.

I don’t think it exactly overpromises how easy it is, but unlike a lot of software it has a learning curve that seems gentle at first and then exponentially increases. It’s high-end AAA-grade development software aimed at professionals, it expects you to know what you’re doing.


Exactly, but a lot of studios thought they could get away without that, so now they are paying the price.


C is 50 years old or something like that, and it still doesn't have a standard hash map.

Sure, it's not impossible for C to get that, but at the same time they are trying to write git, not fix C.

My point is that hash maps and data structures like that are clearly not a priority for C, or they would **exist by now.

** By exist I mean either in the C standard, or at least a community consensus about which one you pick, unless you need something specific.


> or they would *exist by now.

See: https://news.ycombinator.com/item?id=45120171

Nobody needs to change a language standard for 9 lines of code. When you really want to use a hash map, it's likely that you care about performance, so you don't want to use a generic implementation anyway.

> or a at least a community consensus about which one you pick

There is a hash table API in POSIX:

    GNU libc:  https://sourceware.org/glibc/manual/latest/html_node/Hash-Search-Function.html
    Linux hsearch(3): https://man7.org/linux/man-pages/man3/hsearch.3.html
    hsearch(3posix): https://www.man7.org/linux/man-pages/man3/hcreate.3p.html
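For reference, a minimal sketch of using that hsearch(3) API from <search.h> (the table size and the key/value strings here are made up for illustration):

    #include <search.h>
    #include <stdio.h>

    int main(void) {
        if (hcreate(32) == 0) {              /* allocate a table sized for ~32 entries */
            perror("hcreate");
            return 1;
        }

        ENTRY item = { .key = "answer", .data = "42" };
        hsearch(item, ENTER);                /* insert */

        ENTRY query = { .key = "answer" };
        ENTRY *found = hsearch(query, FIND); /* look up */
        if (found)
            printf("%s -> %s\n", found->key, (char *)found->data);

        hdestroy();                          /* free the table */
        return 0;
    }

It only supports insert and lookup (no deletion or iteration) and uses one global table per process, which is part of why projects like git end up carrying their own hash map implementation anyway.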


Ursula is there because politicians (individual countries' leaders) want her there. She is a convenient scapegoat for things that would hurt politicians' image at home, so they let the Commission do the dirty work and can say it wasn't their fault.


We are getting fucked by the USA because we do not feel strong enough without them.

Everybody is blaming Ursula for the deal with Trump, like it was her decision. The individual countries' politicians made the call; she is just a convenient scapegoat (probably precisely what her position was created for: to push through things politicians know will be unpopular). Politicians (leaders of EU states) had and still have all the power to give her marching orders.

We are getting fucked on a lot of things because we feel the need to rely on the USA. With better/bigger militaries of our own, the USA would have less leverage.


If the USD stops being the global currency, there probably will not be just one currency that replaces it; there will be several competing ones (the USD probably still among them) for the foreseeable future, with countries hedging their bets.


Even if you released such a solution today, it would take months or years to build up knowledge, toolchains and best practices, and then to have trained developers able to use it.

> youre risking getting trapped in a local minimum.

Or you are risking years of searching for perfect when you already have good enough.


Because nowadays, more than ever, the content you need is in silos.

Your Facebooks/Twitters/Instagram/Stack Overflow/Reddit... They all have limited, expensive APIs and bulk-scraping detection. Sure, you can cobble together something that will work for a while, but you can't run a business on that.

Additionally, most paywalled sites (like news) explicitly whitelist Google and Bing, and whenever someone creates a new site, they do the same. As an upstart you would have to reach out to them to get them to whitelist you, and you would need to do it not only in the USA but globally.

Another problem is Cloudflare and other CDNs/web firewalls, so even trying to index a mom-and-pop blog could be problematic. And most mom-and-pop blogs are nowadays on some blogging platform that is just another silo.

Now that I think about it, Cloudflare might be in a good position to do it.

The AI hype and the scraping for content to feed the models have made it harder for anyone new to start a new index.


This is the best (and saddest) answer. LLMs break the social contract of the internet; we're in a feudalisation process.

The decentralized nature of the internet was amazing for businesses, and monopolization could ruin the space and slow innovation down significantly.


> LLMs break the social contract of the internet

The legal concept of fair use has been and is being challenged, and will be tested in court. Is the Golden Age of Fair Use over? Maybe [0].

[0] https://blog.mojeek.com/2024/05/is-the-golden-age-of-fair-us...


While LLMs have accelerated it, silos were already blocking non-Google and non-Bing crawlers before LLMs. LLMs have only made existing problems of the web worse; they were problems before LLMs too, and banning LLMs won't fix the core issues of silos and misinformation.


You're thinking too much by the rules. You can absolutely scrape them anyway. Probably the biggest relevant factor is CGNAT and other technologies that make you blend in with a crowd. If I run a scraper on my cellphone hotspot, the site can't block me without blocking a quarter of all cellphones in the country.

If the site is less aggressively blocking but only has a per-IP rate limit, buy a subscription to one of those VPNs (it doesn't matter if they're "actually secure" or not - you can borrow their IP addresses either way). If the site is extremely aggressive, you can outsource to the slightly grey market for residential proxy services - for fifty cents to several dollars per gigabyte, so make sure that fits in your business plan.

There's an upper bound to a website's aggressiveness in blocking, before they lose all their users, which tops out below how aggressive you can be in buying a whole bunch of SIM cards, pointing a directional antenna at McDonald's, or staying a night at every hotel in the area to learn their wi-fi passwords.


> You're thinking too much by the rules. You can absolutely scrape them anyway. Probably the biggest relevant factor is CGNAT and other technologies that make you blend in with a crowd. If I run a scraper on my cellphone hotspot, the site can't block me without blocking a quarter of all cellphones in the country.

I am familiar with most of that, and there is a BIG difference between finding a workaround for one site that you scrape occasionally and finding workarounds for all of the sites.

Big sites will definitely put entire ISPs behind annoying captchas designed to stop exactly this (if you ever wonder why you sometimes get captchas that seem slow to load, have long animations, or do other annoyingly slow things, that is why).

And once you start making enough money to employ all the people you need for doing that consistently, they will find a jurisdiction or 3 where they can sue you.

Also, good luck finding residential/mobile ISPs that will stand by and not try to throttle you after a while.

You can definitely get away with all of that for a while, but you absolutely can't build a sustainable business on it.


There are many rationalizations to not try.


And JavaScript/dynamic content. Entrenched search engines have had a long time to optimize scraping for complex sites.


> management's thirst for elimiating pesky problems that come with dealing with human bodies

But that's what 95% of management is for. If you don't have humans, you don't need the majority of managers.

And I know of plenty of asshole managers, who enjoy their job because they get to boss people around.

And another thing people are forgetting: end users, a.k.a. consumers, will be able to use similar tech as well. For something they used to hire a company for, they will just use AI, so you don't even need CEOs and financial managers in the end :)

Because if a software CEO can push a button to create an app he wants to sell, so can his end users.

