Hacker News | jerven's comments

RDFa/Microdata is more interesting for people who sell objects rather than content. E.g. marking up that a page is about a kitchen cabinet that is 60 cm wide and white might lead to more sales in the long run, as people who are looking for 60 cm wide cabinets might land on your page instead of one about a 36 inch wide one.
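For illustration, the kind of product markup in question, expressed as schema.org structured data and built in Python (the product name and values are made up; JSON-LD is the sibling of RDFa/Microdata that search engines commonly consume):

```python
import json

# Hypothetical product-page metadata using schema.org's Product vocabulary.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Kitchen cabinet",
    "color": "white",
    # "CMT" is the UN/CEFACT unit code for centimetre.
    "width": {"@type": "QuantitativeValue", "value": 60, "unitCode": "CMT"},
}

# This string would be embedded in a <script type="application/ld+json"> tag.
print(json.dumps(product, indent=2))
```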


That's an oddly specific search, and even Google doesn't have any kind of tools for queries like that. What is more likely is that you'll find companies specialized in selling cabinets, and they'll have a browser/search to restrict choices by given dimensions. There is not a lot of benefit for them in exposing all that data to various search engines; best case, they end up competing with a bunch of other brands on a generic search engine page where they have absolutely no control over how things are presented, etc...

And even before thinking about that, you can simply put the dimensions in the description, which some do (like Ikea), and Google is definitely able to pick up on that; no RDFa was ever needed. As far as I can tell, LLMs can work that out just fine as well.

The problem with the metadata discussion is that if the metadata is actually useful, there is no reason it isn't useful to humans as well, so instead of trying to make the human work for the machine it is much better to make the machine understand humans.


That doesn't sound relevant to zero click.


I am partial to the approach in https://www.expasy.org/about-chat ;) so yes, I think it can help. Mostly, though, the use case becomes interesting when you deal with multiple graph databases, e.g. UniProt + Wikidata etc.

If it is just to query one single dataset that is already in one tool it is less compelling.
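For the curious, the multi-database use case meant here looks roughly like a federated SPARQL query, sketched below in Python. The endpoint URLs are the public ones; the triple patterns are illustrative assumptions, not a tested production query.

```python
# A sketch of the federated use case: one SPARQL query sent to UniProt's
# endpoint that joins against Wikidata through a SERVICE clause.
UNIPROT_ENDPOINT = "https://sparql.uniprot.org/sparql"

query = """
PREFIX up: <http://purl.uniprot.org/core/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT ?protein ?item WHERE {
  ?protein a up:Protein .
  BIND(STRAFTER(STR(?protein), "uniprot/") AS ?uniprotId)
  SERVICE <https://query.wikidata.org/sparql> {
    ?item wdt:P352 ?uniprotId .  # P352: "UniProt protein ID" on Wikidata
  }
}
LIMIT 10
"""

# One would POST `query` to UNIPROT_ENDPOINT (e.g. with SPARQLWrapper);
# the network call is omitted to keep the sketch offline.
print(query)
```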


Yes, it is super sad :( a real pity that a cheap but useful resource gets taken out almost by accident.


I would like to point out that good bike lanes and trains also have induced demand. The Netherlands and Switzerland have demand for more of both (as well as more demand for car lanes).

It is just that trains and bikes are much more efficient in terms of land use.

The 3-lane road in front of my house is "good" for 16,000 cars a day. The 2-track train line a 5 minute walk from my house is "good" for 120,000 passengers a day. A train line can carry about 10x the traffic of a car lane (in practice) with similar ground usage.
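The arithmetic behind that "about 10x", as a quick sketch using the figures above. Note it compares passengers to cars (a car carries more than one person on average), so it's rough:

```python
# Back-of-the-envelope per-lane throughput from the figures in the comment.
road_per_day, road_lanes = 16_000, 3      # cars/day over 3 lanes
rail_per_day, rail_tracks = 120_000, 2    # passengers/day over 2 tracks

per_car_lane = road_per_day / road_lanes      # ~5,333 cars/day per lane
per_rail_track = rail_per_day / rail_tracks   # 60,000 passengers/day per track

ratio = per_rail_track / per_car_lane
print(f"one rail track carries ~{ratio:.0f}x one car lane")
```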

So when a train system sees more demand/use than expected (e.g. the Léman Express in the Geneva region) there are more options to increase throughput (in the Léman Express case, double-decker trains) that require less new infrastructure to be built.

When new infrastructure is required, space limitations mean that a 15-year period from plan to implementation is normal. This means infrastructure with more head-room is preferred over options that saturate quickly.

To add to that: adding one lane to the A1 for 18 km costs half the total of the Léman Express infrastructure, but brings significantly less benefit in total transit capacity.


> The 2-track train line a 5 minute walk from my house is "good" for 120,000 passengers a day.

But that's not true. Your chances of living within 5 minutes of a train station are slim, unless stations are spammed everywhere. And if stations are spammed everywhere, they become inefficient.

Meanwhile, cars are only mildly affected by an additional 400-500 meters of distance.

There's a great resource: https://www.geoapify.com/isoline-api/ - it shows isochrones for different commute methods.

> A train line can carry about 10x the traffic of a car lane (in practice) with similar ground usage.

In practice, a train line is effectively only slightly better than cars, unless you enshittify your city into a Manhattan-style dense hell.

Moreover, self-driving cars with mild carpooling (think 4-6 people per vehicle) blow ANY transit mode out of the water in speed and efficiency. It's not even close. A good approximation of this are airport pickup vans (the ones that you arrange in advance).

> To add to that: adding one lane to the A1 for 18 km costs half the total of the Léman Express infrastructure, but brings significantly less benefit in total transit capacity.

Yeah. Imagine that instead of wasting money on useless transit (see: Seattle ST3), we used it to incentivize companies to build more offices outside of dense city cores.

Then these lanes wouldn't even be necessary!


Or, you know, DO put offices in dense city cores, say within 5-15 minutes of a train station. (that or telecommute). https://www.youtube.com/watch?v=SDXB0CY2tSQ

And, a lot more stations are within say a 15 minute reach if you use a bike O:-) https://www.youtube.com/watch?v=1UxCbmT9elk


> Or, you know, DO put offices in dense city cores

And then what? How do you get there?

Your time budget is 30 minutes (the average commute in the US). Go on, try to play around.

> that or telecommute

Yes. But if you telecommute, then why bother with all those trains and dense offices?

> And, a lot more stations are within say a 15 minute reach if you use a bike O:-)

That's already too much for commuting and will result in commutes inferior to the current status quo in the US.


Not much different from the average commute in a lot of places, TBQH. Though definitely not in big-city traffic. If you've ever seen bumper-to-bumper queues in LA or Manhattan, you know there's no way those folks are getting anywhere in the next eternity or two. That kind of gridlock pushes up the average for everyone else.

Of course, I do have a slightly different set of requirements, since I've always lived out in the countryside. You trade a longer commute for more elbow-room at home.

The trains generally run on time, so that's what I often used to use if I needed to get into a dense town.

That was before COVID. Post-COVID, telecommuting has become available to more people. In my opinion, that's the best solution where possible.

At the very least, telecommuting and trains get the OTHER cars off the road when I need to physically be at factories, labs, or workshops.


> Not much different from average commute in a lot of places TBQH. Though definitely not in big-city-traffic.

The commutes in large cities (New York is a bit more nuanced) in the US are still faster than in _any_ large European city. Mostly because of cars.

> Of course I do have a slightly different set of requirements; since I've always lived out in the countryside. You trade in a longer commute for more elbow-room at home.

My favorite city from the urban design standpoint is Houston (I hate its climate and Texas that surrounds it). People there can have beautiful and spacious single-family houses with backyards, and yet still have short commutes because it doesn't have a well-defined city core.

So it lacks the obvious traffic magnets, and people tend to choose jobs near their housing. This is the model that needs to be promoted, and it can solve housing issues.


4 people for a short-term stay is about where it starts to make sense to ride-share. Long term, you would have a longer-term pass, vastly reducing the cost of a bus ride, and you would often travel in smaller groups. So in my experience there are times when bus/tram can be much faster and more convenient than a car. Of course there are many cases where it is the other way round (and outside cities that ratio changes dramatically in favor of the car). Good city design tends to tip the ratio in favor of public transport over cars.


Yes, in theory, but your process control would be terrible, thus your product would also be terrible. Unless you invent something very clever.


But that makes them a superb heat battery, which is underused at the moment for storing energy.


Because of the Carnot cycle, no? It just doesn't make much sense to store energy as heat.


A lot of energy is spent to heat things anyway.
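The Carnot objection only bites when you convert stored heat back into work; using the heat directly sidesteps it, which is the point here. A quick illustration, with round example temperatures that are not from the thread:

```python
# Carnot limit on heat-to-work conversion: eta_max = 1 - T_cold / T_hot,
# with temperatures in kelvin.
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    return 1.0 - t_cold_k / t_hot_k

# Heat stored at 600 K, rejected at 300 K: at most 50% comes back as work...
print(carnot_efficiency(600, 300))
# ...but if the stored heat is used directly for heating, this limit
# doesn't apply at all.
```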


Is it a current trend? My Mom does this, and she picked it up in the '70s on typewriters.


Working for an open-data project, I am starting to believe that the AI companies are basically criminal enterprises. If I did this kind of thing to them, they would call the cops and say I am a criminal for breaking the TOS and doing a DDoS; therefore they are likely criminal organizations and their CEOs should be in Alcatraz.


MillenniumDB is an interesting engine, as is QLever, mentioned in other comments. I think both are good candidates for making RDF graphs one or two orders of magnitude cheaper to host as SPARQL endpoints.

Both seem to have arrived at the stage of transitioning from research to production code.

Very exciting for those of us providing our data in RDF and exposing SPARQL.

AWS Neptune Analytics is also very interesting, allowing Cypher on RDF graphs. Even Oracle's built-in RDF+SPARQL support seems to have improved greatly in 23ai.


It seems like writing Cypher to query RDF would be hard.


There was a Microsoft prototype for more stack allocation in OpenJDK (https://archive.fosdem.org/2020/schedule/event/reducing_gc_t...). I recall that being put on hold because of how it would interact with Project Loom's fast stack copying, but I don't know the current status.

Go has a non-moving GC, and I understand that the cost of introducing a safe moving GC is considered high. If one has a moving GC, which the serious Java ones are, read/write barriers are already required, especially for concurrent collectors like ZGC, C4 or Shenandoah. ZGC, C4 and Shenandoah all started out as non-generational GC implementations and gained generations later, because in most cases generations increase performance/reduce overhead.

Valhalla makes objects denser and reduces the overhead of identity, which is great. It reduces the difference in memory layout between Java objects and nested Go structs.

Go with arenas reduces GC de-allocation costs. That is something the ZGC team is looking at in relation to Loom/virtual threads (but I can't find the reference for that right now).

