‘YouTube recommendations are toxic,’ says dev who worked on the algorithm (2019) (thenextweb.com)
423 points by 3131s on Dec 23, 2020 | 352 comments



For me YouTube recommendations are not toxic, they are so amazing I sometimes wonder how come the world around us is not heaven when everybody has free access to this.

That's because I always watch high-quality educational content and some beautiful comforting and inspiring music. So for me YouTube is a fountain of knowledge, visual and musical aesthetics etc. Every day I get inspiration, healthy kind of fun and learn something cool, useful and healthy.

But people who watch stupid and destructive stuff (even once) are doomed because they are going to get the same and worse kind of content recommended over and over. This is a particularly underestimated social problem indeed.

That's why I recommend avoiding signing in to YouTube. Better to watch anonymously so you can always clear the cookies and get out of the pit.


Same experience here. I have to actively curate them by removing the occasional one-off dumb video from my watch history, but my YouTube recommendations are extremely high quality. YouTube is the only service I know of that has actually achieved working content recommendations for me.

My Watch Later has hundreds of hours of content from things that I definitely want to watch but that I'm either not yet in the mood for or that are too long to get to right now.

Seriously, youtube for all its flaws is an incredible human achievement on par with wikipedia.


I don't use the "watch later" feature anymore. It's always a bit irritating when you want to catch up on the list only to find that a video has been removed in the meantime. So I switched to archiving with youtube-dl. It's also nice to have a few backup videos in case of an internet outage.


I rarely have that problem with Watch Later, but it does happen. When I want to ensure a video is available down the line, I send the URL to the Web Archive; they have a special archiver for YouTube videos.


What's the link?


Just prefix any URL with this exactly: https://web.archive.org/save/

Example: https://web.archive.org/save/https://www.youtube.com/watch?v...

If it's already saved, it'll do a new snapshot, but I believe the archive is smart enough not to duplicate saved youtube videos.
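A rough sketch of how you might script this in bulk (the file name "watchlist.txt" is just a placeholder for a plain-text list of URLs, and the /save/ endpoint may throttle or reject unauthenticated requests, so treat this as illustrative):

    # Push each URL in a list through the Wayback Machine's /save/ endpoint.
    import time
    import requests

    with open("watchlist.txt") as f:               # hypothetical file: one URL per line
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        resp = requests.get("https://web.archive.org/save/" + url, timeout=120)
        print(resp.status_code, url)
        time.sleep(30)                             # be polite; the archive throttles heavy savers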


I tried this and it keeps saying job failed.


Looks like the Archive is having some trouble today, but it's also getting throttled because the link is getting clicked a lot. The /save/ link will redirect you to a saved snapshot of the page; it basically tells the Archive "capture this page right now".


Same issue. Love the concept of saving through a GET link.


Just curious, do you download them one at a time or do you have a "system" for handling them?

I ask because this seems like a great idea. I've already written a script to read from a list and download, and now I'm wondering if I shouldn't just use that instead of "watch later" going forward. Curious whether you do anything beyond adding them one at a time that I should consider before changing my approach.


FYI youtube-dl can download playlists. I heavily suspect you can even use --continue to download missing updated bits in a playlist you previously downloaded


Good point on the playlists; it hadn't dawned on me that 'watch later' is a playlist. Haven't tried --continue. In the past I've created a 'downloaded.txt' file which gets updated as files download and then checked before the script runs again so it doesn't try to re-download them.


Yeah you don't need that downloaded.txt, youtube-dl can do that all on its own with the right flags :)
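For example, a rough sketch using youtube-dl's Python API; the download_archive option (the --download-archive flag on the CLI) is the built-in replacement for a hand-rolled downloaded.txt, and the playlist URL below is just a placeholder:

    # Download a playlist and let youtube-dl keep its own record of finished videos,
    # so re-running the script only fetches what's new.
    import youtube_dl  # pip install youtube-dl

    options = {
        "download_archive": "downloaded.txt",        # IDs of finished videos; skipped on rerun
        "outtmpl": "%(uploader)s/%(title)s.%(ext)s",  # one folder per channel
        "ignoreerrors": True,                         # skip deleted/private videos instead of aborting
    }

    with youtube_dl.YoutubeDL(options) as ydl:
        ydl.download(["https://www.youtube.com/playlist?list=PL_PLACEHOLDER"])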


This is conceivable to me, but I've never had this happen. What sort of videos would be removed that you actually wanted to watch?


Apart from what has already been mentioned, stuff also gets copyright-struck. Sometimes for ridiculously minor infractions. Sometimes for no real infraction at all.

Additionally, there are the vague and ever-changing terms of service.

Also, people sometimes take stuff down themselves. Take the JRE, for a recent and rather prominent example.


Happens to me often. I don't think it even tells you what videos are gone. I know I've seen channels say they removed a video themselves because it didn't do well with views so they wanted to re-tweak it in the hope it lands better.


Cody's Lab had a video about refining metal ores removed, as one example.


People remove and re-upload (or not) their videos sometimes, so if you could even find it that'd be an annoying extra step.


+1 for cleaning up the watch history. It's pretty nice that YouTube lets you edit that and doesn't use a "full history" to train the recommendation algorithm.

I try to be careful and always use private browsing when opening "suspicious" videos, but every now and then one slips through and taints my recommendations.

OTOH, with a clean history, I barely subscribe to channels, since new videos usually end up in the recommendations, anyway.


"OTOH" = On the other hand.

For those that, like me, hadn't seen this one before ^_^


How do you clean up your watch history?


Go to "History" in your sidebar: https://www.youtube.com/feed/history

You can individually remove videos by clicking on Delete (at least on desktop).

You can also pause your watch history (eg. for some guest browsing) or clear it entirely (if you want to start from scratch).


My big curation-related gripe has to do with YouTube playlists (the ones you can create with the +Save button on YouTube). The list of playlists can't be sorted in alphabetical order by playlist name. It seems to be ordered either by playlist creation date and/or by when I last added a video to the list (with the most recent playlists first). I'd say the playlist feature is next to useless to me, except that I keep using it, ever so masochistically.

This is a recurring gripe I have with Google products: poor UI and a seeming inability to receive and/or act upon feedback from power users.


This is intentional. They want you to just follow their AI more than curate your own world. You live in their world.


Thing is, YouTube has hardly any features for managing a playlist that big. I don't think it even lets you search within a playlist.


YouTube provides high quality for everyone, because it knows what your preferences are. For the poor dumb ones, a suggestion feed full of conspiracy videos is extremely high quality in their eyes too. The other thing is that we consider their videos crap.


But many years later, some videos in my Watch Later list are gone, and there seems to be no way to get them back.


> So for me YouTube is a fountain of knowledge, visual and musical aesthetics etc. Every day I get inspiration, healthy kind of fun and learn something cool, useful and healthy.

You must never be curious about things outside of your passions.

I play Fortnite with my kids casually. I looked up a howto a few weeks ago, and my recommendations are still 25% Fortnite related content, even though I only watched one video one time.

It's been happening for years. I looked up one old Norm Macdonald standup bit, and my recos changed immediately to Joe Rogan and adjacent talking heads. I watch one MKBHD video, and I'm overwhelmed with recos for unboxing and tech reviews.

The stuff I genuinely care about, music theory, is impossible to keep on my home page because YouTube doesn't value that engagement.


> I play Fortnite with my kids casually. I looked up a howto a few weeks ago, and my recommendations are still 25% Fortnite related content, even though I only watched one video one time.

In general I've noticed that YouTube favors your "newest outliers" (let's call them) over stuff you've historically looked at. Probably the rationale (optimization metric?) is trying to capture you into content that might add to your total watch time on the platform.


> That's why I recommend avoiding signing in to YouTube. Better to watch anonymously so you can always clear the cookies and get out of the pit.

There definitely needs to be a "reset" button on these services, or at least a better way for you to actively monitor your algorithmically-determined interests and give feedback (I swear, you watch one SNL video...).

Another poster in the thread said you can do this by editing your watch history, which frankly would have never occurred to me as a way to influence the algorithm. It should be a more explicit feature for any sort of recommendation engine.


The fundamental problem is that these desirable features are likely contrary to Google's interests. They have convinced themselves that their recommendation system maximizes engagement, so anything that pulls users away from that is a potential threat to the growth of attention directed toward ads. They will tolerate these power-user-style methods as long as they remain statistically unpopular among users. I seriously doubt they would willingly provide a button that negates the fruits of the targeting work at the core of their business model.


You can also click the three dots icon next to the video thumbnail of recommendations and select "Not interested"


Every few days if you notice the algo giving too many fast-food style videos, it's worth it to aggressively use this to unfollow all these topics. Miraculously, the next time you refresh your feed, all those old interesting topics you used to be into will come back.

Overall the algo is very very recency-influenced. You can watch videos from a niche interest for a year straight, but if you then spend a week watching nascar, you'll never see anything but nascar videos for months unless you say "not interested".

I imagine they have found that for the average user, having a longer history doesn't improve G's metrics.


I wish the algo would instead notice that I always watch every video from particular channels that only publish rarely. I also publish my own videos, and after a long gap without publishing my new videos get no views unless I pay for ads or promote them outside of YT, and even my family members who are only subscribed to my channel don't get my videos recommended.


This is useful but it's pretty opaque to the user as to what exactly this does.


It tells the algorithm that you’re “not interested” in videos like that one?


Yup, that's what I always do, and unsurprisingly my recommendations are great.


> Another poster in the thread said you can do this by editing your watch history, which frankly would have never occurred to me as a way to influence the algorithm.

Really, why? (I'm curious.)

> There definitely needs to be a "reset" button on these services

The reset button already exists in this context: Google and YouTube have options to automatically purge your various histories at the 1/3/6 month marks.

https://www.theverge.com/2019/10/2/20895183/youtube-history-...


I think we need more than a reset button. We need, at a minimum, a "blacklist" feature.

I would love to NOT have any video recommended to me that contains certain keywords in its metadata. I simply don't care what I would miss out on if all videos with the word "Trump" anywhere in their metadata, for example, would simply never be served up for my perusal.

Even better would be some type of "whitelist" feature which only shows me stuff from my subscribed channels and their associated channels.


I think a blocklist is prudent for sure, and more likely for YouTube to implement. An allowlist like you propose is already similar to the pure Subscriptions view, and restricting it to only channels related to your subscriptions would kill most of the reason behind recommendations in the first place (virality.) Not saying that's good or right, just that YouTube is less inclined to implement such a feature.


It doesn't seem that hard to get out of a pit, even with the account. It just takes a couple of days of browsing of the new type of content, along with maybe clicking "don't recommend this channel"/"not interested" 3-4 times, and the algorithm switches to the new things... The algorithm doesn't care, it just recommends.


I want Youtube to recommend educational videos, a la Extra History and 3B1B. That's most of what I watch on Youtube.

Youtube recommends stupid crap based on what other people like.

If I watch one stupid video not in incognito, ALL of the recommendations become stupid videos because those videos have better metrics. Youtube actively guides me away from educational content.

Even within educational videos, if I click on a click-baity channel once, that channel dominates things I've watched MANY more times. It seems to all be about click-through metrics.

I've never had Youtube recommend really upscale educational content like Zach Star before. It's quality stuff, but you gotta find it yourself.


> ALL of the recommendations become stupid videos because those videos have better metrics. Youtube actively guides me away from educational content.

Back when I was not particularly into YouTube and hadn't had many great recommendations, YouTube insisted I should watch Sapolsky's Stanford lectures - these probably had particularly good metrics. Once I started to watch them (and was very satisfied), more and more great recommendations started to appear.

> It's quality stuff, but you gotta find it yourself.

People should start sharing curated lists of high-quality, no-bullshit YouTube videos/channels. Kind of like those awesome-everything lists on GitHub.


Conversely, at some point I decided to watch Sapolsky lectures based on human recommenders, but this caused YouTube to recommend not only other science lectures but also sensational videos purporting to be about quantum mechanics, medicine, etc. but which were obviously total pseudoscience.


Not saying YouTube's recommendation algorithm is perfect but you could try to interact with the feedback popups. That seems to clear up stuff I don't need on my 1st page.

If all else fails, creating a new Google account for separate, exclusive use should do the trick.


> The algorithm doesn't care, it just recommends.

And isn't that terrifying, in the Parable of the Paperclip Maximizer kinda way?

We can make the well-its-your-individual-responsibility bootstraps argument, but that kinda works only if you believe in the supremacy of an individual's autonomy and that the mind can't be hacked or hijacked. Everything from addictive products to the marketing industry shows that isn't the case. The youtube paperclip maximizer is maximizing for your mind's engagement.

We can say "well, it's just feeding people what they want!" But you can say the same thing about airdropping crates of opium onto the streets.


It is only terrifying if you assume people are mindless spiritless bots.

In which case (if you truly believe that), we are all fucked anyway. Then who cares? Is youtube really your biggest problem?

> is maximizing for your mind's engagement.

So has the media been doing for a long time, as have writers and poets, etc. The mind's engagement is not the only meaning of life, and most people act on this deep knowledge. I don't see a problem.


Yes, it was called "yellow journalism" before it was called "clickbait". I agree, that's not new.

What's new and disturbing is 1) the degree of algorithmic personalization, and 2) that personalization goes to the highest bidder. Put together, we've built a marketplace for population-scale behavioral nudges, and the people who play that game have war-chests, not human-sized wallets.

The Facebook/Myanmar genocide is the Godwin's Law reference of the social media debate, but you don't need to look any further than the politicization of what should be neutral medical facts being fueled by recommendation-hole conspiracy theories to see the danger of all this.

I don't believe people are mindless spiritless bots, but I do believe our industry thrives on engineering ways to hack people's attention circuits. And what's terrifying is we've put a paperclip maximizing machine at the helm of all this.


I have this problem too occasionally; but if you spend 2 minutes hitting "do not recommend videos like this" + "do not recommend this channel", all the good content beneath will rise to the top.


The most simple way to adjust recommendations is to delete videos you didn’t like from your watch history.


I'm glad it works for you, but dedicated individual action by someone especially capable and focused on the topic isn't a solution to systemic problems like this. Indeed, hero stories are often used to diminish problems. Systemic problems need systemic solutions.


It's a mixed bag: I usually only watch very niche, low-view tech things, so it's easy to see which recommendations are truly based on what I watch and which ones are based on what YouTube wants me to watch (because they have millions of views and are from unrelated topics).

The ones based on my views are pretty good recommendations, the other type is usually extremist ragebait.


When a user becomes aware of the algorithm and purposely avoids clicking on videos that interest him slightly, because doing so can radically change his suggestions for days or weeks, it is a sign, IMHO, that the algorithm is not well designed.


Uh, I had done the same as you for years - no account, cleared cookies - and consistently found conspiracy videos in my recommendations. Less now than before, but all the same.

It's not from watching "stupid destructive stuff", not in the least. Plenty of full music albums often lead to it, and amateur stuff too.


Wow, I watch high-quality, beautiful, comforting and inspiring, knowledge, visual and musical aesthetics, cool, useful and healthy.

The others watch stupid, destructive, worse kind of content.

I guess everyone can claim the same, because all these properties of what someone watches are highly subjective and a matter of taste, preferences and pure opinion.


The problem is that most people don't understand the chain-reaction problem here.

Watching one calming video is unlikely to lead you to become addicted to calming videos. YouTube is optimized for the hook, and calming videos aren’t hooky.

The much more salient videos tend to be the sensational ones, and people who don’t have the understanding of that fact are much more easily hooked.

So while it can be a fountain of knowledge for one group of people, the insidious nature of the hooking algorithms still lurks beneath that surface.


> But people who watch stupid and destructive stuff (even once) are doomed because they are going to get the same and worse kind of content recommended over and over.

It's also important to note that a lot of users similar to you never make it to this paragraph. They just assume the reports about the dangers are overblown.


When I watch junk on YouTube I need to open the link using Chrome's incognito mode.

When are companies going to give us more control over what inputs we give their recommendation algorithms? There should be a button that says "hey, I'm going to watch this because I'm only human, but please, don't show this junk to me". Or, "hey, I understand this content creator has thousands of videos and uploads content daily, but I'm only interested in this one video, not in their daily vlog from today".

I literally watch 1/3 - 1/2 of all my videos now in incognito mode on YouTube, just because their algorithm is so aggressive when it comes to recommending junk / 'content creator' noise.


> When are companies going to give us more control over what inputs we give their recommendation algorithms?

My guess is that they'll never do this, because they want you to go bad and watch as much of that junk as possible.


I've had pretty good luck simply deleting the junk videos from my watch history. I can't recall ever seeing a similar junk video being recommended after religiously taking that action.


You can on YouTube. You can finally ban channels from your recommendations and tell it not to show you things like that anymore with the submenu by the video thumbnail.


> But people who watch stupid and destructive stuff (even once) are doomed because they are going to get the same and worse kind of content recommended over and over.

Not really my experience. I'm not sure, but I have the impression that the recommendation heuristic is very time-sensitive. Let's say I watch sailing videos for a few days; it tends to forget I was interested in guitars the previous week.


I've noticed the same, except that it occasionally remembers and pops something up that I haven't watched in a while. I think, "Oh, I haven't seen one of those in a while!" about once a week now.


It's also reasonably good at context sensitivity, so when I'm playing a backing track for my guitar 90% of the recommendations are for other backing tracks, with the rest being decent song and music theory channel recommendations rather than whatever else I've been looking at this week.

The flip side is, I somehow appear to have accidentally clicked subscribe on some insane Trumpist 'news' channel I probably stumbled on via a forum link, and it very quickly took over the push notifications on my phone. That might be subscribe functionality working as intended, but it took more effort to identify the source of nonsense spam and unsubscribe from it...


YouTube links and embedded videos will tilt your search experience too.

The problem is that the algorithm seems biased so that one or two visits to shit causes you to fall down into a spiral of shit. That's probably because statistically when averaged across the entire YouTube user-base, shit leads to more time on the site. Shit is usually more divisive, inflammatory, etc., and negative emotions are the easiest path to engagement.

Virtually everything wrong with social media can be summarized by "negative emotions are the easiest path to engagement." Want to keep people on your site? Anger, offend, or scare them.


>The problem is that the algorithm seems biased so that one or two visits to shit causes you to fall down into a spiral of shit.

After months of having a fairly normal YouTube experience, basically only playing music, I clicked on one Jordan Peterson video that was in a news article, and for the next few weeks all I got was alt-right content in my feed. I even tried to get rid of it by explicitly saying I don't want it recommended to me, which gets rid of the individual channels, but similar stuff just keeps popping up.

YouTube overfits to what people click on so badly. I honestly have no idea why this isn't fixable by just turning the recommendation system off and giving me random crap; even that would be an improvement.


Yeah, I've had that experience too. It's easy to see how YouTube perhaps almost single-handedly caused an explosion of fascism among passive media consumers. You hit one of those videos and it rabbit-holes you.


The problem I've found is that it takes some tuning to get it to stop recommending horrid clickbait, extremist recruiting fodder or just copyright fraud.

You have to actively click "Not interested" when you see something low quality.

This means that the average Joe will be spammed with low-quality shite, whereas the power user will get decent recommendations.

I do have a container tab just for YouTube, to avoid polluting my recommendations with embedded shite.


Yes, this is the problem, and the article nails it. I just did an experiment to prove it to myself too: open YouTube in a private tab, search for anything gaming related, click something random on the front page, and bam: in my recommendations, "Ronda Rousey destroys feminist". Of course once you click that, you're down a black hole of Ben Shapiro and Jordan Peterson.


Try opening youtube in private mode. You get a ton of garbage that it expects you to watch.

I binged Nilered once and felt pretty terrible, I can't imagine how I would feel after a day of "peppa pig plays minecraft."


First of all, the article is incorrect: for a long time now, YT has optimized recommendations not for watch time but for lifetime engagement. YT probably has the most advanced QT ML in the world.

Second, it's trivial to re-adjust recommendations: go to your history and delete everything you didn't like, and recommendations will instantly become much better.


> First of all, the article is incorrect: for a long time now, YT has optimized recommendations not for watch time but for lifetime engagement. YT probably has the most advanced QT ML in the world.

Lifetime engagement is a function of watch time, right?


No, it turns out that if you optimize for immediate watch time, people watch a lot at once and then don't return. It's more profitable if people watch a little bit every day for the rest of their life.


Every day I open a private browsing window in Firefox and keep YouTube and whatnot in it. Same for Amazon and other stuff. At the end of the day, I close the window and it's all gone.


The main advantage of using YouTube with an account is being able to see what's new from all your subscriptions.

To get rid of the annoying stuff, I added this to my uBlock Origin filters:

www.youtube.com###comments

www.youtube.com###related

www.youtube.com##.ytp-show-tiles.videowall-endscreen.ytp-player-content.html5-endscreen

This of course also eliminates ads.


You can follow channels through RSS.
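The per-channel feed URL looks like https://www.youtube.com/feeds/videos.xml?channel_id=<CHANNEL_ID> (the channel ID appears in /channel/UC... URLs). A tiny sketch of polling one feed, assuming the feedparser library and a placeholder channel ID:

    # Print the latest uploads of a channel from its RSS/Atom feed.
    import feedparser  # pip install feedparser

    FEED = "https://www.youtube.com/feeds/videos.xml?channel_id=CHANNEL_ID_HERE"  # placeholder ID
    for entry in feedparser.parse(FEED).entries:
        print(entry.published, "-", entry.title, "-", entry.link)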


It reminds me of the "Kill your TV" stickers in the 90s. Obviously plenty of people wasted huge amounts of time watching garbage. On the other hand you could watch NOVA and Ken Burns.

The more things change the more they stay the same.


I concur, but that's because I use YouTube for very narrow niche content that the algorithm helps me find. If I was open to YouTube helping me figure out what to get interested in, I have every reason to believe it would be as horrible as what Spotify tries to get me to listen to or the sponsored posts Instagram shoves down my feed.

For the record, YouTube is full of amazing German-language TV documentaries, and since many of them are posted by unofficial accounts (read: not the broadcaster's channel), it's the algorithm that surfaces them, not search.


> Better to watch anonymously so you can always clear the cookies and get out of the pit.

Eh, you can always remove a video from your history. I just don't watch certain things, or watch them in a throwaway firefox container. Being able to maintain playlists and subscriptions is worthwhile.

Like you, I get amazing music recommendations and educational content. It took concerted effort, but my recommendations are spot on for the most part.


>That's why I recommend avoiding signing in to YouTube. Better to watch anonymously so you can always clear the cookies and get out of the pit.

When you sign in, you can view your history and remove things you don't want the algo to work on.

Here's what to do instead: Sign in, and use your account to curate what you want to see. As you say, music, education, and for me, it's cookingtube, some non-toxic video game people for sim games and the like, etc.

And my algo recommendations are wonderful. The biggest issue is it forgetting about cool channels and hiding them, and I don't use the bell because I hate notifications, but generally speaking it does a great job.

But if you want to watch something that you DO NOT want the algo matching on: Open a private tab (or clean out your history).

It's actually not bad to keep it under control.

Oh, and when YouTube drops in a redpill propaganda video or something similar (it will always try to indoctrinate you into a radical ideology, no matter what), make sure to use the menu and tell it to fuck off and never show it again.

I've got it trained now. No politics! No radicalism! Just good, relaxing, well-made content.


I don't think people who say "For me YouTube recommendations are not toxic" actually get the point. Although I won't judge your opinions or preferences, the real issue with Youtube or any recommender system is the way it games your dopamine system.

Studies have shown how the prospect of an unknown reward riles up users of social media platforms. Is this a bad thing? It might depend on who you ask. For someone like me, who has been through multiple addictions (cigarettes, drugs), I know that abusing dopamine affects quality of life, productivity and happiness in general.

In that way, I think recommendation systems which optimize for higher engagement are toxic. I think there's value in having an AI do the heavy lifting of choosing the right content for us. But the AI's rewards should align with what I need, which is quality content tailored for me (whether or not it increases my engagement on that platform).


> But people who watch stupid and destructive stuff (even once) are doomed because they are going to get the same and worse kind of content recommended over and over. This is a particularly underestimated social problem indeed.

Indeed. It's a bad enough problem that one wonders whether the company is liable for its effects upon society.


Yeah, it is absurd. All of my youtube subscriptions (all 60+ of them) are educational or personal interest things. But let your reptilian brain click on one stupid video of a scantily clad woman and your recommendations will just go down the freaking drain...


This is why my YT feed is full of women's long jump and pole vault vids, and the comment sections on those "sports" vids are a bunch of guys joking about how no one is there because they follow athletics.

https://youtu.be/RQBIreEyCc4


Same. I had the good fortune to have my laptop (and youtube account) used as the communal music playing device for about a year in a household with really good music taste. Consequently my YouTube suggestions are excellent (much, much better than Spotify).


It's actually much easier. Some time ago YouTube added a new tab, 'Watch history'. You can open it and manually delete any 'junk' videos from your history with just one click. The algorithm changes accordingly; I've checked.


Alternative Option:

Turn off watch history, then only the videos you explicitly like will affect your recommendations.

Bonus Points: Use extension to turn off recommended panel on side of videos, then you only see recommendations when you go to home screen.


Exactly this. Play stupid games, win stupid prizes.

I don't think you should avoid signing in, when you see something you don't like just mark it as such and the algorithm seems to be pretty good at filtering in the future.


For me it's the opposite: for some reason practically all YouTube recommendations outside of my subscribed channels are completely useless and I would never watch them. There are only a few exceptions. It's not that the recommendations are toxic, they are just extremely bad.

The algorithm doesn't seem to be able to learn what interests me. My suspicion is that this is so because I have a lot of diverging interests that change from month to month. Still, this does not explain why Youtube keeps recommending channels and topics even when I never watch them.


I mostly agree with you except I find it’s easy to steer the recommendation algorithm which is part of why I like it. Sometimes I go on a political/toxic YouTube binge for an evening or two but feel I am able to steer my front page back to more constructive content pretty easily.


Your recommendations must work different to mine. Second video from the top for me is "Jordan Peterson explains why you should never lie." I've never willingly watched a Jordan Peterson video in my life. I mainly use YouTube to watch tutorials related to my industry. As a test I decided to scroll through my recommendations and count when the first tutorial specific to my niche appeared. It was #119.


YouTube recommended me Kate Bush's "Wuthering Heights". Treasure.


Check out Angra's cover of the same song. Great track.


Being logged out and in incognito doesn't help.

https://algotransparency.org/?date=23-12-2020


Same here.

Since I am not as good at ignoring stupid stuff, I just open things I don't want to see in my recommendations in a private window. Works great!


I do the same: things and topics that are a one-time watch and don't need related recommendations, I watch in incognito.


You definitely don't sound like an outlier.


"Just don't watch bad stuff"


Yeah, it really is an echo-chamber type of algorithm. I like it too, because if I see garbage I hit the back button; hopefully that trains it more. I generally watch DIY, history, video game reviews, and a few different genres of music, and YouTube has certainly helped me find more of the same. However, if you're watching conspiracy videos and right-wing propaganda you're gonna get more of that, and it becomes a self-fulfilling prophecy of dreck and dumpster-fire videos.


This is exactly how I feel about many things in our modern world. People complain about so many things, but don't realize that it's often their fault, or at least they often have more influence than they realize. I'm glad to hear you have such a great experience with youtube. I generally have the same experience: great videos that relate surprisingly well to my interests, the worst I ever see is ads if I'm on the mobile app since I block ads on browsers.


I know I'm old but dang for some reason I really dislike recommendations from almost any company.

For media I grew up in the 70s and had the TV Guide. I'd look through to see what I wanted to watch, circle some things and then watch. No spying what-so-ever.

So, I like for example HN where AFAIK I'm not tracked so much and I choose what I want from the top of the list.

Conversely I mostly hate Amazon, Netflix, Amazon Prime, Spotify, Youtube, Twitter, Instagram, and all the ad services and any other service that uses some algorithm to decide what to put in my face.

It's to the point that 3 out of 4 times I'll open videos in private windows so they don't get added to my profile and used for recommendations.

I don't mind going to a tech site and seeing ads for tech, or a car site and seeing ads for cars. I do mind visiting an apartment listing site and then going around the net and having every other site shove ads for apartments in my face.


I've said it many times now: people should be avoiding "the algorithm" as much as they can on streaming and social media sites.

On Facebook "avoiding the algorithm" entails creating a friends list of all your friends, then using the link to that list as a portal into facebook. Posts and shares are listed in chronological order without anything missing (which is absolutely NOT true for the normal news feed, even when "chronological order" is selected). No "so-and-so commented on this" or "so-and-so liked that". Just posts and shares. Scroll through the feed until you recognize something, and you're done.

On youtube, you really should be going straight to subscriptions first. If you want to look for recommendations that's perfectly fine, you can go to the home page... but don't go there by default.

On reddit, I recommend using /r/all as a portal with some of the more obnoxious subreddits blocked.

I'm aware that what you see is still influenced by the algorithm in these examples, but the impact is reduced by a lot.

Browse with intention. Make deliberate choices. Don't just let the algorithm lead you around.

Remember, "the algorithm"s on these sites aren't designed for your benefit. They're designed to increase the site's metrics. More clicks. More time on that website.


I think client apps can help a lot in all of these cases. Like newpipe for YouTube or infinity for reddit. They clear out tracking as much as possible and let you browse without logging in.

But the websites of all these services feel useless and slow to me. Recommendations were good on YT, but I think they stopped them during the election. So I defaulted to NewPipe and FreeTube.


In my experience, recommendations are still good. You just can't rely on them anymore because if you keep following them, you tend to go to whatever is popular rather than going deeper down the rabbit hole like you used to.

You have to mainly keep to your subscriptions, and use recommendations occasionally.


> So, I like for example HN where AFAIK I'm not tracked so much and I choose what I want from the top of the list.

The top of the list is the product of some recommendation algorithm. Clicking through https://news.ycombinator.com/newest, there were some 900 submissions in the past 24h; you probably chose from a small subset of that.


It's not personalized.


Adding on to this, the FAQ (https://news.ycombinator.com/newsfaq.html) has information on how stories are ranked.
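For the curious, a rough sketch of the commonly cited approximation of that ranking (taken from the old published Arc source; the live algorithm reportedly layers penalties and moderation tweaks on top, so treat this as illustrative only):

    # Approximate HN ranking: points decay with age under a "gravity" exponent.
    def hn_rank_score(points: int, age_hours: float, gravity: float = 1.8) -> float:
        return (points - 1) / ((age_hours + 2) ** gravity)

    print(hn_rank_score(100, 3))    # ~5.5: newer story with fewer points
    print(hn_rank_score(300, 12))   # ~2.6: older story with more points ranks lower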


It amazes me that advertising companies have been able to brainwash companies into paying more for "targeted ads". Usually I spend a few days doing research into whatever big purchase I plan to make, then for the next 6-12 months I get spammed with ads for that product on every platform. It's such a waste of their money and my time.


Just use uBlock Origin (Firefox). If on Apple, then Safari should be good too.


"Engagement" is toxic full stop.


> "The road is most utilized when all traffic comes to a full stop"

-One of my managers

This is what I hear every time someone optimizes engagement.


I open private tabs as well for some videos that interest me, but that I wouldn't like to keep receiving recommendations on similar videos. I thought I was alone :)


You can look into invidious BTW


>Netflix

Back when Netflix was a DVD delivery service, its personal recommendation system was fantastic. Rate things by stars, it would estimate how you'd rank movies based on users with similar tastes to you, so rather than trying to give the same ratings to all movies and push the same things to everyone, it gave me a ton of very useful recommendations. I had a 500 movie long queue on it for awhile, so each movie I got was like a surprise from my past self.

Then one day they failed to deliver a DVD to my place because my mailbox was full or something, and they closed my account and my queue was gone. I never bothered trying to open another one after that.


I've never been into recommendations and I find most are awful. TikTok is addicting but it's all rubbish content.

I feel like I am in the minority in how I use YouTube. If someone shows me a video I'll watch it, but I don't browse YouTube.


Many people nowadays use client apps and invidious for YouTube.


I much prefer a really good and feature full browse and search functionality rather than recommendations. Youtube has really poor search and even worse browsing ability.


The browse would be good if it wasn't location based


It's like the media company is taking the driver's seat of your mind and deciding you need to watch these videos over and over. Where have individualism and creativity gone?


The search function still exists, recommendations are really aimed at low effort engagement. YouTube recommendations are only aiming for the equivalent quality of channel surfing when most user generated content is really bad.

Sadly, it’s largely your own history that clogs the recommendations. The less you use a recommendation engine the more relevant it becomes.


> The search function still exists...

The search function recommends results that match your query. It's part of the same system!


How does one search for what one doesn't know exists?


If you want a glimpse of the average person's youtube experience, open the site in incognito and go to the "trending" tab.

It's a completely different site depending on what you initially search for when you start using it.

The problem is, 80% of the YouTube userbase never really typed a search query for anything in the first place, and as a result the stuff they get recommended and watch is pure shit.


We don't actually know what the "ingredients" are for recommendations.

I don't think it's possible to draw conclusions from one's own experience with youtube recommendations and extrapolate it to others or to any sort of judgement about the quality of the algorithms. We are simply not privy to what, exactly, youtube is doing.

I think it's wise to heed Guillaume Chaslot's warnings. In the same way that YouTube exploited (inadvertently or not) glitches in the brains of preschoolers (remember the "Spiderman and Heidi" videos? If not, don't look them up), there's reason to believe these things can fly under our radar and influence adults in ways that are not perceptible and that we would not approve of in advance.


The stuff is what the stuff is brother. [1]

[1] https://youtu.be/ajGX7odA87k?t=945


That's an amazing talk and I thank you for bringing it to my attention. Congratulations, you beat YouTube's recommendations for me.

BTW I was already aware of all the questionable things AI is used for, but Mickens' analysis is beautiful.


Holy shit, this talk is amazing. Thank you!!


Wow I just did this. It's like turning on cable TV at a motel. (I don't watch TV at home).


Are you telling me the first recommendation engine in the world, with a billion-dollar AI and 2 billion users, optimizing for addiction, returns the same results as cable channels whose offices look like a « Dunder Mifflin » environment, with only yearly feedback on whether customers trended upwards or downwards? Good job, cable guys.


> the average person

An average person, not the average person. They'll at the very least target the content by country, and if you have a fixed or rarely varying IP address (which I do at home, and many public wireless APs do in my experience) they can be more fine-grained than that and give recommendations based on what people in your more local area have searched for recently.

I had an Italian friend staying in my spare room for about half a year some time ago (hassles with previous landlord, took her and her fellow a while to save to the point where they could afford a place together). For many months afterwards I would get Italian language recommendations and adverts from youtube and others, even in incognito on a fresh device.


It wasn't that bad. Some influencer videos, music, prank videos, sports, a couple niche youtubers, some video games. I saw a couple educational videos in there too.

Sure the influencer stuff is pretty stupid, but whatever, it's not harmful.


Influencers are very often harmful.

1. Their aspirational lifestyles perpetuate feelings of inadequacy amongst their viewers, especially when their target market is often predominantly young people. Their audiences grow up with self doubt and anxiety by comparing their lives to that of the influencers they follow.

2. They are in the pockets of their advertisers and sponsors. Any recommendations they make are inherently disingenuous.

3. They’re often just really terrible people. Being the kind of attention-seeker who is able to command a large online following often seems to correlate with being a jerk. Just look at Jake and Logan Paul — two of the biggest stars of today, and both mired in controversy and bad choices.


I'm a fan of Reddit's AmITheAsshole and I'm always stunned by the stories of teens whose life plan is to become an influencer. As a sociological phenomenon, it's fascinating: celebrity distilled down to an implausibly pure form, like Paris Hilton rendered into uncut white powder. But as somebody who feels responsibility to the kids in my extended family, it's a worry. I'm not even sure how to explain to them that it might not be the best way to spend 40 or 50 years.


You know what's worse than aspirations to become a celebrity? No aspirations at all. I know people like that. They just seem to see the future as a regular life with nothing particularly to do in it except exist and perform normal regular practical activities to sustain it. I can't imagine how dull it must be, but they seem to cope, and get all their excitement from immediate events. I hope you can be happy that your relatives at least have ambition, even if it's not quite accurately targeted yet. Not everyone is that fortunate.


Just to be clear, I know of no relatives who want to be influencers. It's only a hypothetical problem for me.

I also have never met people with "no aspirations at all" and am not sure they exist. Although you seem concerned about diminishing people's ambitions, you also seem to denigrate people who want to have "a regular life", as if there's something wrong with people who want to settle down and raise kids and live a good life. That's never been my path, but I respect it a lot.


I don't know, I have quite a lot of respect for the fathers around me. I'd rank climbing Mount Lenin, being a CEO or hitchhiking from Paris to Sydney an order of magnitude easier than being a father ;)


Influencers are just people who have a sizable fan base that they earned fair and square. Now, what they choose to do with that fan base is another story. Just like a startup can lose its soul when bringing in VC money, an influencer can lose their soul when they try to scale their brand with marketing deals.

Mark Rober is a good example of an influencer who has stayed true to his core values while scaling his brand. He’s the Gen-Z version of Mythbusters.

Potato Jet is the opposite, his brand has become overrun with every cinematography company using him to peddle their wares. It’s hard to take anything he says as an honest review of the product.


If someone else having a nice life makes you feel your life is bad, your life may actually be bad. If you spend your days working and feeling unfulfilled and wish you could be the person restoring a ghost town or exploring the arctic... that's a signal.

Humans aren't farm animals meant to be kept nicely penned up.


> open the site in incognito and go to the "trending" tab.

I just tried that and all of the top trending videos were something called "My Best Friends" followed by a long scroll of NBA videos.


Yes - I had the exact same experience - lots of "best friends" content and then NBA video.

I watched one of the NBA videos so, touché, youtube.


Watch two or three e.g. flying saucer videos, then it's saucers recommended for the next three months.


I have quite similar results in incognito and normal mode. Are you sure that the trending tab is profiled?


Or they typed one thing 5 years ago and now YT thinks that's what you're all about


I believe trending is hand curated by youtube


I really hope they are not hand picking only My Best Friend and NBA videos.


I looked at trending once. Never again.

There is great content on YouTube, but what I consider great is completely different from the next person, which is why trending makes no sense...


I expect it's like how stamp collecting used to be the world's most popular hobby. It might not have been anybody's favorite activity, but it was commonly accessible to everyone. They call these things "lowest common denominator". If you have to pick something for everybody, that's probably a better choice than any niche group's favorite thing.


The Trending videos always disappoint me. It seems like it's always some fame-chasing YouTubers/people who don't deserve to be on the Trending list instead of actually cool videos and content. People are gaming the algo hardcore to make tens of millions for themselves: the entire James Charles & Charli D'amelio crew of like 10 people who are always on the trending list, Soundcloud rappers, etc. Then, on the flip side, you have the YouTube algo punishing content creators and shadowbanning videos of content YouTube doesn't like; a great example of this is any of the popular firearms channels. You'll never see any of those in the Trending list despite them having millions of views.

Are you an LGBTQ makeup artist rapper who gives money away and plays Minecraft? Straight to the #1 Trending video for you. Don't forget the clickbait title and misleading thumbnail image.

Are you producing original and awesome content? No one will ever find your video.


>It seems like it's always some fame-chasing YouTubers/people who don't deserve to be on the Trending list instead of actually cool videos and content.

What else should they be showing? How does the algorithm determine what are "cool videos and content" to someone who hasn't entered searches? It makes the most sense to put the top Youtube channels/stars on the front page, after all they are the most viewed.

>Are you producing original and awesome content? No one will ever find your video.

This is why TikTok has blown up; an absolute nobody can get a million views on their video overnight.


> Charli D'amelio

I have no idea who this is, but can confirm that they take up something like 50% of the screen real estate. The other half is a BBQ channel.


I find youtube recommendations crap because if I watch something on one subject it seems to assume the subject is now the love of my life and proceeds to offer a zillion on just that subject.

regarding what the article is talking about ... eh, it seems like they're saying it works fine but sometimes puts up videos whose politics they don't like. Short of going in and removing the stuff you don't like so that other people won't get to see it I don't see how you fix this. It's presumably the same algorithm that's working most of the time that's surfacing the crap.


>it seems to assume the subject is now the love of my life

Christ, no kidding. This was my first clue that Google has fundamentally changed as a company and it was time for me to leave.

I've been using Gmail since it came out and watching YouTube with a signed-in, non-adblocked experience for years. The algorithm doesn't give a SHIT. If I watch one "The Office" clip, that's suddenly 60-80% of my feed, despite the fact that I don't particularly like the show. I'm not sure what they've been doing with all the data they've been harvesting like Gollum, but it sure isn't being used for my benefit. (I don't have kids, no one else uses my account, I don't watch YouTube through a VPN, etc. etc.)

And don't get me started on the ads. I've never installed an anime girl fighting game, from Google Play or anywhere else. Stop it.


I suspect there's a simple explanation. Sometimes people go on massive subject binges and there's likely certain videos that are more likely to trigger this than others. If you watched one the youtube algorithm is basically asking if you'd like to binge the rest because a lot of other people did.

In terms of watch time and retention on site I imagine those binges are gold hence why they've ended up being optimised for.


They have no problem showing me ads for exploitative garbage like Raid Shadow Legends or some other shit I will never ever ever install or buy. And then they tell us they use the data for serving "more relevant" adverts? I call BS.


Oh, the ads. I cannot overstate the annoyance they have caused viewers. Seeing the numerous memes making fun of those asset-flip adware turned "games" makes me die a little inside.

Now I actively push my normie friends to install uBlock Origin on their devices. And Vanced or NewPipe when it worked.


The midrolls are nearly always in an awful spot in the video, and it's especially jarring when you're watching something on a serious topic.

At least most television ads are placed in a slot designed for ads to be there. Though I still find them jarring for shows that aren't upbeat.

To be honest, TV ads are the reason we don't watch a lot of TV anymore. The ad-to-content ratio is too high, and YouTube's is definitely as high or higher without an adblocker.


You could just pay for youtube premium?


Does youtube even advertise it? I would guess I would have to go into my profile and find it there, but I've never seen it mentioned by youtube.

Edit: Also, I think many are jaded by cable where you pay and still get ads. Hulu has an ad-free tier, but I forget the history and if that's existed from the start. I do remember there being a lot of outrage about paying and still getting ads at some point.


I pay for YouTube Premium in Rupees. It's £1.36 a month.

One of the weird things, though, is that I didn't see ads with an ad blocker. Is there some regional thing where ad blockers don't work in the USA but do work elsewhere?


No dark pattern there.


How else do you suggest youtube pays its costs? It probably uses more bandwidth and quite possibly more storage than just about any other website on the net.


I have no idea what you are referring to. This has never ended badly for consumers. Why, I remember when cable came out. For a small monthly fee we could get rid of all the advertising. That turned out ... just... oh wait. Okay, then Sirius Satellite! All those ads on the radio? Just get Sirius at a small monthly fee, and bam! All ads will... be... gone... forever... Well, uhmmm... Okay. Never mind. /s


Indeed, if you watch a video on repairing your dishwasher, usually a one-off task, YouTube will assume you have developed a fetish for watching white-goods repairs, and they will pop up in your recommendations for weeks.


Exactly. I've been on Tik Tok for a few months and I feel like I've seen a hundred times as many interesting and relevant videos that have actually improved my life and inspired purchases as I've ever seen on YouTube, mostly just on my For You Page (aka its recommendation algorithm). As a user, it doesn't feel like magical AI, it just feels like Tik Tok quickly grokked my most basic demographic information and frequently tries feeding me things that anyone would guess might be relevant to a 40 year old American dad who likes videogames (aka stuff Google knows). It's perplexing that Google, masters of A/B testing, can't figure out that parenting advice or retrogaming content might be relevant to me.

Granted the interaction model is very different, and who knows if I'd ever commit to watching even a 5 minute YouTube video on something I'm not primed to watch, but after Tik Tok, Youtube feels like a stale product of yesteryear.


A friend of mine has the same problem, but I don't. I almost think YouTube has him and you in some weird long-term A/B test. I sent him some French rapper as a joke one day and YouTube proceeded to send him crappy French rap for the next several weeks.


This is exactly why I browse entirely within private mode now. I always make sure I am not signed in to my Google account before clicking any YouTube link.

I do not need to be constantly reminded of a single weird video I have clicked months before.

YouTube recommendations haunted me for life.


Just edit your watch history, and recommendations will become instantly better.


Exactly this, they're not toxic, just sh*t.

search for a famous song, and the recommendations are for a zillion cover versions of the same song, or other music by the same artist.


Yes; in particular it would be nice if the algorithm mixed in "here's some videos on a topic you watched a bunch of six months or a year ago" sometimes. But it's so fixated on the very short term of your history that if a topic ever ages out of that it's gone forever, unless you personally remember it and go searching for it...


It seems to try to show me videos I've watched 6 months ago; not topics from before, but the actual videos I've already seen.

What's worse is that for videos I haven't watched, it recommends the same ones for weeks, even though I never watch them.

I feel like half of the recommendation slots are wasted in one of these ways.


It doesn't ever seem to be short term for me... It shows me the same uninteresting videos for months.

Oh, you watched a game stream? How about I show you three more of that game at the top of the page until you stop coming back.

The only thing that works to get rid of them is to downvote it. I don't want to give the downvote, but it's the only way to make the algorithm understand I don't care about what they are showing me.


It's kinda picky, though, in what it spams you with. For example, I watch tons of space engineering vids and will watch anything from Scott Manley as soon as it comes out, yet it often doesn't get recommended to me; I gotta hunt it down somehow.


> if I watch something on one subject it seems to assume the subject is now the love of my life and proceeds to offer a zillion on just that subject.

There was a distinct point in time a couple years ago where this started to happen. Before that, my YouTube feed was always a diverse selection of all the different topics I had ever watched. Now, if I watch one car review my feed is full of car channels. If I watch one political video it fills with politics. Nothing I do seems to affect YouTube's singular focus on whatever topic I've most recently viewed.


Use private browsing...it forgets the love of your life effect


Yeah, the recommendations are completely useless for me. They get stuck in some random topic I’m not even interested in just because I watched a video once.


This is really annoying. A couple days ago, my wife was having trouble sleeping and I selected a competitive Tetris match video for her as a joke. Now I can't stop getting recommendations for all Tetris-related stuff. I want my Tetrisless life back.


What happens if you start actively pressing to dislike on every video you didn't want to get recommended to you?


It’ll eventually get a little better but it takes a LOT of repetition. For example, after the gamer tantrum a few years ago anything remotely related (e.g. some designer’s talk - dry, non-provocative) would queue a bunch of anti-feminist “reaction” rants and it took many dozens of dislikes before that would stop after watching one video in the area. Science topics were similar - the angry reactionary side of the new atheism movement presumably has equal engagement numbers so the algorithm really wants you to settle in for some ad views.


This isn't my experience at all. On certain subjects, YouTube will start relentlessly pushing an agenda. If I watch some videos where people talk about sci-fi movies or TV, my recommendations become flooded with videos about how SJWs are ruining Star Wars, Star Trek, and Doctor Who, etc. It's gotten to the point that I now only watch such videos in Incognito Mode.


This might be because there is a lot of content like that.


I get the same, together with videos about how social media is censoring conservatives, while I am literally being recommended their videos.

It's quite something.


Spot on.


I find YT does a gradient ascent into more popular videos.

I start out listening to a song with 1000 views, I get recommended 10k view songs, then 100k, and before you know it it's back in the mainstream. There's no way to surf a niche, to hunt in a pile of bin ends.

I imagine it's similar for conspiracy stuff. It'll point you in the direction of the most convincing nonsense.


This is my biggest pet hate of all of YouTube.

I remember waaaaaay back in 2009-ish hunting for some new electronic music.

\begin{old_man_better_in_my_day}

First video I went on had about 1000 views. Every "related" video had around the same (sometimes less).

I spent 4 hours finding a bunch of new and relatively unknown artists because, rather than dragging me back to the 100k aggregate, I was allowed to explore the graph of "other people went on to watch this".

These days I'm lucky to see any sidebar suggestions that have fewer than 100k views.

\end{old_man_better_in_my_day}

I think the problem is the "one-size-fits-all" approach can't possibly capture how I would find it useful (music discovery via graph/tree traversal) vs. someone else (e.g. regular sky sports football highlights).


I'm big into discovering obscure music and unfortunately it's become very hard on YT, you're right. If I don't constantly search for stuff I end up on the mainstream, over-listened playlists, which I really don't want.


> I find YT does a gradient ascent into more popular videos.

Doesn't this totally contradict the claim that youtube pushes people to the extremes?


It's within a category. I mean I don't get recommended music videos when I'm watching an episodic video podcast, nor vice versa. But within those categories I get heavily pointed at the most popular for the category. I hypothesize it's similar for conspiracy content.


If you search in non-latin scripts, maybe it still ascends into a mainstream, but it probably won't be your mainstream.

(YT does much more cross-script indexing now than it did when I first started playing around with this, but it still tends to silo by writing system, and if it's relevant search works much better in the original script than relying upon auto-transliteration.)


If conspiracy stuff were the same way you describe music, you would eventually end up with the mainstream opinion, no?


Perhaps in that case the algorithm finds the local extremum, not the global.


You'd end up with the currently really popular conspiracy stuff, as opposed to the nuanced, analysed, subtle, thorough, and less excitingly paced stuff.

1) The CIA is interfering in US elections with evidence-free accusations of Russia helping candidates who might rein them in to follow the law. Conspiracy? Or something to it?

2) Russia is paying bounties to kill American soldiers. Conspiracy? Or something to it?

3) Hunter Biden might be a bit corrupt. Conspiracy? or something to it?

4) The President has been compromised by Russia. Conspiracy? or something to it?

There will be incendiary stuff on all of the above, absolutely full of utter nonsense. There will also be reasonable stuff assessing the evidence, which may not even come to a clear and definite conclusion one way or another. Which of these will the YT recommendation engine push you to? I don't know.


Reality is too boring to film.


The ultimate result of the YT algorithm for me has been that the videos are increasingly a jukebox for background noise. Sometimes the noise is music (as in the popular "lofi hiphop study beats", or old anime soundtracks, or 80's pop hits) and sometimes the noise is a man talking into a camera (as in tech reviewers, music theory teachers, and video essayists), but it's basically inoffensive, interchangeable background with an occasional moment that makes me pay attention or even comment.

By their metrics it probably looks great: lots of watch time! Interaction! But it's basically leaving the possibilities unfulfilled.

Now, I remember early days YT as well. It was much better at getting you circulating through a niche, but it did have the problem of getting stuck in a cycle of the same few videos. Modern YT just gradually moves you towards popular categories. Occasionally it does surprise, as with the months where "Steamed Hams" edits were recommended daily, or more recently, the train station ghost story song with no title. But that's mostly the exception.

Edit: And a big part of why I seem to have this feed is simply because I have "liked and subscribed" to channels in these categories. If you do that, the recommender biases heavily back towards them.


After realizing that I would sometimes spend hours caught in the YouTube recommendation trap--to the detriment of my sleep, my work, my hobbies, my family--I've taken steps to manage my borderline addiction to YouTube. Prior to this I'd tried blocking it entirely, but the problem is that there is sometimes valuable educational content that I benefit from watching (even if only to briefly unwind after a day of work). Ideally there would be an option in the YouTube settings to turn off all recommendations, or perhaps a browser extension to remove them from the page, but the solution I ended up with was quite simple: RSS!

I went to every channel with useful educational content that I like to watch, subscribed to its feed (or even individual playlists) with my RSS reader of choice (Feedly), and the videos get delivered and displayed inline in the RSS reader. No more YouTube binging, because I never find myself with the urge to visit the site, and so I'm never exposed to the recommendations. On the rare occasion when I want to watch an arbitrary video, I open a private browsing session so that the recommendations are uselessly generic.
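For anyone who wants something even lighter than a full RSS reader, here's a minimal sketch of the same idea in Python. It assumes the third-party feedparser package (pip install feedparser), and the channel/playlist IDs below are placeholders rather than real ones; YouTube exposes a public Atom feed per channel and per playlist at these URL patterns.

  # Minimal sketch: pull the latest uploads from a few channel/playlist feeds.
  # The IDs here are placeholders -- substitute your own.
  import feedparser

  FEEDS = [
      "https://www.youtube.com/feeds/videos.xml?channel_id=UCxxxxxxxxxxxxxxxxxxxxxx",
      "https://www.youtube.com/feeds/videos.xml?playlist_id=PLxxxxxxxxxxxxxxxxxxxxxx",
  ]

  for url in FEEDS:
      feed = feedparser.parse(url)
      for entry in feed.entries[:5]:  # latest few uploads per feed
          print(entry.published, "|", entry.title, "|", entry.link)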


I feel you. Can you share some of your educational channels?


Protip if this annoys you: go to myactivity.google.com, delete everything and pause all further collection. Now only your subscriptions and likes count (and maybe a few other things that are not considered "activity"). It's so much better! Did it once and now I don't want to go back.


It's too bad YouTube doesn't let you keep your history (so you can go back and find something you saw before) without it feeding into recommendations.


It's available with YouTube Premium. Complete ad removal and the ability to download videos is well worth the cost.


You can get those features from freeware browser extensions.


on your mobile device?


Just like how google maps won't store your search history or use your current location to help narrow search results if I don't let it store my location history indefinitely?


The scary thing is that I don’t have any type of Google account at all, yet they clearly still track me and amass a profile of my history and choices.


It doesn't take them long to figure out how to link a new device to their previous account-less profile, either.


Yeah, the EU tried to fix that and now we have to accept cookies everywhere and nothing really changed.

IMHO it's better to have an account and configure it. The GDPR-like approach is much better than the do-you-mind-cookies-banners approach.


Thanks for the tip -- I didn't realize this affected Youtube. Curious to see how it improves my experience.


I didn't even know that was a thing, thanks!


At the end of 2016, I was bored and decided to watch a flat earth video. I spent a few evenings going down that conspiracy rabbit-hole as it was completely fascinating. (I couldn't tell if these folks were expertly trolling, or if they were serious). Once I was back to reality (2017) I was done wanting to see anything conspiracy related. It took a year for that stuff to disappear from my right-hand column. I just kept telling myself, "you deserve this... you asked for it".


I watched some of one video whose content would be described by half of the US as political opinion and protected speech. It was mainly misogyny and incitement to racial hatred.

YT’s algorithm seemed to think that I needed to keep being exposed to this, so filled my recommendations with more.

Like the parent, it has taken over a year of me choosing ‘not interested’ for YT to stop trying to persuade me that I should hate most of mankind and actively seek to hurt people who are different from me.

I’m not sure how far from ‘Don't be evil’ it is possible to get, but this seems like the opposite to me.


Why didn't you click on "Not interested"?

It only takes a couple of days for the uninteresting stuff to disappear.


My luck with this has been pretty mixed. I tell YouTube that I'm really not interested in things like Jordan Peterson or ASMR by telling them to not recommend that video or that channel whenever they come up. If I'm lucky enough, those suggestions will die down a bit, sometimes for a few weeks it seems. But eventually, they will start popping back up.


I have said not interested in living on boats, making boats, repairing boats, or anything boat related and yet they still pump these videos at me. I wish there was a way to say “never show me a video about boats, in any context”


Youtube changed their recommendation algorithm sometime in 2017 and stopped recommending the more egregious conspiracy videos. Their recs are now rather bland and mainstream.


I’m repeating this comment here over and over - the trivial fix for this is to edit your watch history.


The recommendations are now almost completely inescapable if you want to continue using their service. They recently did away with email notifications for new content [1] and now the RSS export feature for subscriptions has been removed [2].

I refuse to interface with their algorithm, so I've begun hunting down alternative video hosts and the personal sites of the creators I can find and am preparing to leave their platform.

Alternatives are now very viable, with a platform such as BitChute soon to offer functionality such as live streaming [3].

[1] https://www.androidpolice.com/2020/08/15/youtube-will-end-em...

[2] https://support.google.com/youtube/thread/85275454?hl=en

[3] https://www.bitchute.com/funding/livestream/


I turn off YouTube watch history and use the "subscriptions" page exclusively. https://www.youtube.com/feed/subscriptions


You could use a greasemonkey-style script to modify youtube.


recommendations are entirely escapable via search. (but I never subscribe to anything, so maybe you want a different experience?)


Weird, my RSS feeds have been unaffected.


They are for now, but if they have removed the export button you can guarantee the RSS will be killed off soon.


Can we call this the Oppenheimer Syndrome? I see so many opinion pieces from software engineers lamenting how they've created this "horrible thing" by working on social media recommendation algorithm X, and it all fits neatly into this painfully predictable trope of how we should all live in a camper and use a Nokia from the 90s.

Yeah, I'm jaded and of course there's merit at the bottom of all this, but I really can't be arsed to read another high and mighty essay about how one person in the course of their duty led the entirety of humanity astray by recommending what people are engaged by online.


These pieces also bother me. We're supposed to praise these people for coming forward or something?

This guy spent years of his life making the world a worse place to live in, while taking home a fat salary, and now has somehow found his moral compass and wants to let us know. Yeah, no thanks.


I really want to write an article now where I say how my work as a web developer/software engineer has made the world a better place, and why we should be happy with modern technology and the internet as it is today.

Of course I've not worked at Facebook or Google, so maybe it'd be better if we saw a Facebook or Google engineer write that sort of article instead. Would be interesting to hear from someone who loves the company they work for, thinks their work improves humanity and considers their tech the coolest thing in the world.


The engagement engine doesn't end at YouTube. It's always active, everywhere, and this type of article exploits it.


A lot of the comments here are in the spirit of "just clear your history" or "just be careful about what you watch." If you have to go out of your way and walk on eggshells to get good recommendations, that indicates the problem lies with the designers and decision-makers instead.


Not only that, it's wrong. I can have zero "conspiracy" videos in my history/cookies and still sometimes see recommendations for it. Granted it was worse years ago.


I actually like conspiracy videos (for fun), but Youtube did something a few years back so that they are recommended with much less frequency.

Also to get a totally clean slate, which is still not perfect as you said, you have to delete subscriptions and likes as well as watch history. I've tried it a few times, but since I keep my subscriptions I always end up back on the same topics.


I just find them to be dumb. 50% of my recommendations are videos I've already watched.


I wouldn't mind this as much if YouTube were better about indicating which ones you've watched. In theory you can just look for the little red bar under the thumbnail but they seem to "forget" after a while.

This makes it tricky to go through the backlog of a channel's content without watching it all in order. Sure, at first any random video will be new to you, but eventually you will start seeing more and more repeats, forcing you to start choosing videos more methodically from the videos page. Even then, it's tricky when the ones you've seen are not always marked consistently.

It doesn't seem like too much data to store for a company that offers gigs of free email.

(And no, I don't have an auto-deletion policy set up, I just double-checked.)


I have this problem too. It's stuff I've watched or it's stuff I've scrolled past uninterested in a dozen times. Basically the only time I casually find something I want to watch is when a subscription uploads new content


It could really benefit from a simple "suggest things I've already seen y/n" UI toggle. Sometimes you do want it to do that, such as for music, but most of the time you don't.


This just changed about two weeks ago, right? The rate of suggested rewatches went up from like 5% to 20% for me. And surprisingly, it seems not to have tracked or not be displaying that I've seen it before (with the red progress bar on the thumbnail). I agree this is a bad change, especially deceiving me about my watch history.

This was around the time they changed the mobile progress bar behavior to require dragging to advance, rather than allowing you to jump to the point you select (i.e. away from the model used by the android brightness slider).


That's because there's not really that much great content on YouTube.


There is undoubtedly more great content than one could watch in a lifetime on youtube. The challenge is to identify and find it among the thousand lifetimes of trash.


Giant tip: look at https://www.youtube.com/feed/history and remove any videos you don't want included for recommendations - it really helps clean up your suggested videos.

Doing that, combined with liberally using incognito mode for one-off video watching, helps keep suggestions actually reasonable.


More people in these comments need to know about the history page. I personally recommend just clearing it from time to time. Obviously when you're getting bad recommendations. But even when you're getting good recommendations, it's easy to find yourself in a rut of watching the same stuff and reseeding the recommendations system can help you find new content.

I like Youtube's recommendations. They just seem to get too narrow too fast instead of experimenting with showing you videos that it's less confident you might like. Thus resetting history is necessary.


My youtube recommendations suck. I'm always looking for stuff to kill time with, and youtube recommends the same crap videos 500 times in a row. It takes a long time to find something I want to watch.


I've been loading the Youtube homepage every 15 minutes for the past week (I've been sick). It's the same videos over and over again. If I scrolled past it the first time, I'm still going to scroll past it the 100th time I see it.


That's actually what prompted me to post this.

I don't know how Youtube's algorithms work internally, but lately the front page is just so broken that it's baffling. The sidebar recommendations are another discussion -- those are possibly more pernicious but at least not as obviously dysfunctional.

How is it possible to have the largest repository of video content in the world, yet Youtube is only able to recommend me the same videos over and over again (half of which I've seen already or quickly clicked out of)? There is also a simplistic list of recommended categories, which I would change if I could, and those never change either.

If the front page were just completely random, it would be better than it is now. If I could actually tell Youtube to show me videos with a low-to-moderate view count and a high upvote ratio, then that would be fabulous and I probably would encounter new subjects and ideas rather than getting stuck watching the same garbage over and over. I'll never let my future children spend even a second on Youtube, because the stuff that it recommends to kids is so asinine that it just makes me mad.

I really believe that collectively we are all much dumber because of Youtube's pathetically bad algorithms.


You're probably right about the recommender but kid videos are not crap. In fact there's a Cambrian explosion of kid videos, some are amazing.


You know it's bad when one of the reasons for dismissing a video is "I've watched this before". If I didn't know any better, I'd say it's a separate sub-system(gasp, microservice) that doesn't integrate nicely with the service that stores/holds your previous viewing history.


Agreed. This tweak to their formula would improve it massively for me: show a video on my Home page just once. If I click reload, show me an entirely new set.

Someone may say, this would bother others, who saw something interesting but didn't click it this time. For me, it would train me to open those in a new tab or save them to my Watch Later list. Or just make it an opt-in feature, this aggressive renewal of the home page.


Get well soon. Maybe try watching some TED talks instead. :)


Another thing, don't click on a "Clickbait" video because you are curious, it will open the gates of hell and destroy your recommendations for a week.


Removing them from your watch history can often fix this.


The trick is to actively tell YouTube what you don’t like; it works quite well for me.


I did try this a few times, but usually it had little effect on my recommendations.


Yeah I do it, and you have to be relentless. Mark every video you don't want to see.

And god help you if you watch even one off-topic video - like another comment mentions, YouTube algorithm then concludes you want to see a million more. Reminds me of a Weird Al lyric about TiVo: "I watched Will and Grace one time, one day. Now TiVo thinks I'm gay."


You can thank toxic journalists who attacked youtube recommendations for being "toxic". You can also thank the toxic journalists for all the censorship on youtube.

Youtube recommendations 10 years ago were so amazing. Just a refresh or two and you'd get interesting stuff - obscure or popular, new or old. It didn't matter. Now, it's the same corporate, propaganda or popular vids over and over again. You bring up youtube and it's mostly the same exact recommendations as yesterday and the day before. They are literally cycling the same videos over and over again.

Why not make the default search and recommendations "PC" to appease the toxic journalists. But give the users the option to revert back to the "normal" search and recommendations.

Why can't they do it like they do search. "Safe", "Moderate", "The good old days" modes?


My uBlock Origin filters:

  www.youtube.com###dismissable:-abp-contains(Recommended for you)
  www.youtube.com###dismissable:-abp-contains(M views)
  www.youtube.com###dismissable:-abp-contains(LIVE NOW)

(edit: formatting)


I skimmed/read the entire article (dated June 2019). It does not contain anything new or surprising for most of the HN audience (not a critique of the article itself, which is aimed at a broad audience).

There are some very fundamental questions that I don't think we have really good answers to yet.

- Like all social media, Youtube's algo is aimed at maximizing aggregate view time, and people working there are directly rewarded for that, so it's difficult to change. You could bring up the obvious criticism that recommending crap will destroy the value/brand in the long term. But that's extremely hard to measure/prove, especially when all short-term KPIs point in the opposite direction. So you need the top leadership (who still have to answer to the Board/shareholders) to take the contrarian position on faith.

- The article/Chaslot claims that Youtube's recommendations are not about "what the viewer wants". The obvious question is then, how do you measure "what the viewer wants"? Youtube's / social media's main metric, engagement minutes, does not seem obviously flawed. Even if it were, I have not seen any practical alternative (either in this article, or in many other discussions). People often bring up analogies with nicotine and other addictive substances, saying that engagement is largely involuntary. But the main issue with nicotine is that it is clearly harmful to health. If it weren't, there probably wouldn't be much legislation around it, even given the addictiveness. The more apt analog may be soft drugs like cannabis, where the legislative momentum seems to move in the other direction.

- The fundamental issue is not that social media is not giving people "what they want". It's that people often want things that can be bad for themselves and/or for society. This is very difficult to resolve without taking a strongly authoritarian/paternalistic standpoint.


The article is just incorrect: a few years ago, YT realized that optimizing for watch time decreases engagement, which is obviously most critical for lifetime profits, and switched to optimizing for lifetime engagement. I.e. watch less today, but return tomorrow. It's a very tricky ML problem, so by now YT's ML is probably the most advanced in the world in this area.
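To make that trade-off concrete, here's a toy sketch (purely illustrative, not YouTube's actual objective; all numbers are invented) of how a discounted lifetime-engagement objective can prefer shorter sessions with a higher return rate over longer sessions that burn users out:

  # Toy comparison of two policies under a discounted lifetime-engagement objective.
  # All numbers are made up; this is not YouTube's actual system.
  GAMMA = 0.9  # discount factor applied to future days

  def lifetime_engagement(minutes_per_session, return_prob, horizon=365):
      """Expected discounted watch minutes over `horizon` days, where the user
      keeps coming back each day with probability `return_prob`."""
      total, p_active = 0.0, 1.0
      for day in range(horizon):
          total += (GAMMA ** day) * p_active * minutes_per_session
          p_active *= return_prob  # chance the user is still around tomorrow
      return total

  # Policy A: squeeze out long sessions today, but burn users out faster.
  # Policy B: shorter sessions, but users are likelier to return.
  print(lifetime_engagement(minutes_per_session=60, return_prob=0.85))  # ~255
  print(lifetime_engagement(minutes_per_session=40, return_prob=0.97))  # ~315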

PS. The trigger for the investigation was the event named "boobacalipse", when the recommender recommended videos with boobs on the first page for a couple of days.


Nit picky, but nicotine itself isn't really harmful unless you have blood pressure issues to start with. It's tobacco (specifically smoking it) that's the real danger.


The impact of such algorithms is devastating, and not only because they recommend engaging content that may lead to addiction and a waste of time, the most precious thing every human being has.

What is even worse, such algorithms force creators to become part of this harmful scheme. Knowledge is hurt as well. It should be out there. For decades people learned how to explain complex things in an easy way, because there is no other way we as people can evolve and learn.

But now these algorithms promote the opposite. People who have nothing to say, people who complicate simple stuff beyond imagination, are paid for doing so.

On the other hand, people who have something to say are bashed because they are not engaging enough; they don't make videos that fit the needs of video addicts. So they are not paid.

Such algorithms are the worst thing humankind has invented in recent years. Their devastating effect can be compared to drugs and gambling. They should be banned.


I feel that this is becoming true of much of the internet. The number of lame, banal blog posts that I come across is just insane. A lot of the tech blogs are just cheap ripoffs of original documentation/examples or just fluff. This is madness.


I have confused feelings on the "efficiency" of their recommendations. As many others attest to, their recommendations lean heavily on popularity, and any one video in your history can change dramatically the majority of videos you get recommended. Even if most of your history is on completely different topics, that one video will dominate the recommendations.

People tend to suggest this is intended behavior, my question is instead: Is the recommendation engine all that good to begin with, given how a lot of people find their recommendations confusing, lackluster, or downright bad?


I gave up on the recommendations completely. It has been nothing but videos I've already watched (about 90% of what I saw came from my playlists and subscriptions) and videos that have nothing to do with what I'm watching right now.

So I did the following:

- installed a plugin that redirects me from Home to the Subscriptions page, effectively disabling the Home page completely

- deleted all history from YouTube

- disabled all possible tracking and data collection

The recommended videos next to the video player are a lot better as a result. It sucks that these mostly focus on the author of the video, but scrolling down usually reveals new content. It's not as good as what it used to be in the late 2000s, but it's far better than the default settings.


What's the redirection plugin? I was considering this myself.


Terrible recommendations. One can start watching relatively harmless “list of Pokémon with illustrations” or “evolution of super Mario characters” videos and instantly it will start recommending videos of violent games or live-streaming gamers swearing - so it goes from something age-appropriate to something totally opposite.

Also don’t get me started on getting bombarded and suggested only “sound variations” videos which are noisy, rude and an attack on the senses - pure garbage content. Definitely a huge leap to assume I’ll find this interesting based on my previous choices.

Maybe instead of complaining here I should just go ahead and delete YouTube :)


I always thought the recommendations were fine, but today I was setting up an old Mac Mini I found and I saw the default recommendations because I wasn't logged in; there was a weird number of videos about Muslims and conspiracies for some reason.


The Paperclip Maximizer is a thought experiment that shows that if you apply a powerful AGI to something completely innocuous, things can go hilariously wrong over time.

https://www.lesswrong.com/tag/paperclip-maximizer

People are amused to read the story, and then go on their way.

But what happens if you apply a rather less powerful AI to something else even just a little bit less innocuous? (like optimizing to grab people's attention, as is happening here).

I think you can get spectacular results.

And of course even if the AI is low powered, it can still work as an effective amplifier for human activity, both good and bad: (See: Myanmar, Cambridge Analytica, etc)


I once contemplated a notion of optimization process risks where we should consider how narrow a target out of the space of all possible universes the process could pull ours into.

Unfortunately, I realized that even a simple phase-locked loop -- like what you use to condition an oscillator to be in sync with TAI using GPS -- can actually hit absurdly narrow targets... e.g. two oscillators a continent apart humming away within parts per billion of each other, yet that kind of process isn't particularly dangerous. Even bringing millions of oscillators into sync isn't going to cause some great harm.

I'm not sure how to reason about it, but I think people are much too concerned about AGI risks relative to dumb optimization process risks. It's similar to fretting about movie-plot villains when most actual evil in the world comes from indifference and bad incentives or people trying to "help".


Would fitness landscapes help for part of it?

https://en.wikipedia.org/wiki/Fitness_landscape

They're typically used for reasoning about other things, but there's a few useful intuitions you can draw.

Notably, you can't optimize to a point that is not on the fitness landscape in the first place... so the following comic can't happen: http://dresdencodak.com/2009/09/22/caveman-science-fiction/

Which doesn't mean there can't be any surprises of course!


I discovered so many wonderful documentaries thanks to youtube, some about things I had never heard of before. "It’s in YouTube’s interest to keep us watching for as long as possible", and the way youtube keeps me watching as long as possible is to recommend stuff that I want to watch. I don't understand the problem with this metric. Is it that some people watch things they don't really want to see, like a trainwreck? But why do these people who compulsively watch things that they don't really want to watch keep checking youtube at all? Why don't they hit the dislike button? Wouldn't that adjust the algorithm?


I’m quite enjoying it- I don’t watch anything political, and if anything comes up that’s a bit strange I just don’t watch it and eventually it goes away. It’s recommending me some great old footage of shows and some interesting new channels, mostly around engineering etc.

I suspect if you try and use it as a news source or anything other than entertainment it would be dangerous.


Yeah. Like literally everything else it's a nice place as long as politics aren't shitting everything up.


The YouTube algorithm's incentives seem to be responsible for so many videos being excessively long. Everything wants to waste your time because YouTube and the creator make more money that way.


Yep. I watch everything on 1.5 speed these days and skip around like crazy due to this. YT is going to give me ADD.


The one that really annoys me is when $famous_author is on a book tour and I watch a talk. After doing so, YT recommendations repeatedly include more videos of the same author giving the same talk on the same book at various other locations in the book tour.

The algorithm delivers far, far less novelty than I want in anything except music. I very,very rarely want to see a dozen nearly identical videos on the same topic.


YT algorithm is great for me. I learn a ton about: woodworking, violin technique, folk music, StarCraft, math, coding, home improvement, DIY projects, etc.

What is toxic is the viral ideology that infects the mind of many, many humans. We need education and social interaction to counterbalance the anti-truth conspiracy infection that social media is spreading.


> We need education and social interaction to counterbalance the anti-truth conspiracy infection that social media is spreading.

The problem is, this will not (ever) work. Most people are mentally and/or physically exhausted after a long day of work, they don't have the mental firewalls to protect themselves against "cheap and easy" proposals (like, for example, to "build a wall").

I see conspiracy crap like cigarettes: even though everyone should have known for decades now that smoking kills, the only way to effectively get rid of cigarettes is to ban their marketing and make them expensive. Going after the vendors is what must be done. And if that includes jailing the worst offenders, so be it.


Anything recommendation driven seems suspect these days.

I find if you seed it carefully though it can be OK. e.g. Mine is mostly full of tech and trance videos. Hard for that to get toxic.

Politics, news, opinion pieces, etc... and you quickly get sucked into one of the echo chambers.

I suppose a case could be made that trance is my preferred echo chamber though...


I hide the front page and recommendations with ublock rules and go directly to the 'subscriptions' page instead.

It's the only way to see new videos from creators I care about.


The problem I have with YouTube recommendations is that it keeps suggesting MOST of the big content providers MOST of the time. Sometimes I want to see a video from that indie creator I subbed to a few months back, even if just to catch up, not MKBHD or LTT on the front page all the time.


So how does it work so well on me? I find urges to check the feed all the time, but the content rarely engages me. I watch a couple of minutes then switch. What is going on here? Why do I gravitate towards it? Can I use the same process to my benefit? Like, can you build your own algorithm to get yourself addicted to healthy information?

Can anyone point me in the right direction?


I think being aware of what's happening to you is the right direction. The content rarely engages you, and that's part of the feature that's trapping you. Partial reinforcement is extremely addicting. If you gave somebody a pile of dimes and said "put these in this slot machine, pull the lever and get 11 cents every time" nobody would play that game. But the randomness of a slot machine means that when it hits, your brain lights up.

Youtube's "recommended for you" section is a slot machine for whatever will engage you. For me it's novelty, for a lot of people it's an irresistible clickbait title. Nobody is immune to it, although I feel the hacker news crowd might be suspect to clickbait much more specific than whatever youtube chums the waters with on their front page when you're not logged in and have no history (open Youtube in an incognito window to see what I'm talking about, it's terrible).


Define "healthy information". Also, I think too much information (good or bad, whatever that means) is always bad for anybody.


Don't forget you feed the algorithm. I use Youtube mostly for hobbies. A couple examples are cooking and small engine repair - I find it super valuable to watch multiple people do the same thing in different ways. Anecdotally, the algorithm does a really good job giving me more content/channels both narrowly related and slightly expanding the interests. Sure, it can be frustrating if you watch a clickbaity video that spawns a genre of clickbaity content recommendations - but it doesn't take long for those to stop showing up so long as you don't watch them.


I learned to like it over time. YouTube recommendations helped me find incredibly valuable content which I don't think I would be able to find without the algorithm being new to the topic and not quite knowing where to look. These days, if I find myself interested in a new topic, I intentionally make a few searches knowing that YT will bring something relevant in the future and it's generally going to be even better than the web search because for some topics, the barrier to making a good YT video is much higher than making a high ranking article or a web site.


I mean, the dumb algorithm keeps recommending videos I've watched a day ago. No wonder it's a disaster.


I don't even know anymore what the word "toxic" means or what people think it means.


Does anyone else find these toxic posts on HN to be toxic? Not even trying to make a cheap point here. At least on twitter and YT I can consistently ban things that play on cheap emotions.


Youtube prioritizing "watch time" is why Youtubers take 1 minute of content and stretch it out to 10 minutes. It's an enormous waste of everyone's time, and it's crazy to think about the enormous power and influence Google has over humanity's behavior.

I always wondered why Youtube never tried to get their algorithm to optimize for quality. For example, one way to do this would be to look at comment lengths and threads. I'd imagine that for more quality videos, you'd have people writing longer comments and responding to each other.
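Purely as a sketch of that heuristic (the data shape here is made up, not any real YouTube API), a crude "quality" score built from comment length and how often people reply to each other might look like this:

  # Toy sketch of the comment-based quality heuristic described above.
  # `comments` is a list of dicts with invented fields, not a real API response.
  from statistics import mean

  def quality_score(comments):
      if not comments:
          return 0.0
      avg_length = mean(len(c["text"]) for c in comments)  # longer comments
      reply_ratio = sum(c["is_reply"] for c in comments) / len(comments)  # people responding to each other
      return avg_length * (1.0 + reply_ratio)

  example = [
      {"text": "First!", "is_reply": False},
      {"text": "Great breakdown of the proof at 12:30, though step 3 needs the stronger lemma.", "is_reply": False},
      {"text": "Good catch, the author addresses that in the follow-up video.", "is_reply": True},
  ]
  print(quality_score(example))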

I'd be in favor of regulation enabling users to have some semblance of control over the recommendation algorithm given that it has such an enormous effect on their lives. Ideally you'd have competitors driving improvements like this, but Youtube is effectively a monopoly at this point.


A pure recommendation engine of the sort we learn about in class would simply aggregate user preferences based on whatever content-appropriate properties are available. A real-life recommendation engine is written and deployed as part of a business plan.
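For reference, a bare-bones version of that classroom-style engine (item-based collaborative filtering over a tiny invented ratings matrix, with no business objective anywhere in it) fits in a few lines:

  # Minimal item-based collaborative filtering sketch: recommend the video whose
  # rating pattern is most similar to what the user already watched/liked.
  # The ratings matrix is invented purely for illustration.
  import numpy as np

  videos = ["chess", "woodworking", "lofi", "flat_earth"]
  # rows = users, cols = videos, values = watch/like signal (0 = unseen)
  R = np.array([
      [5, 0, 4, 0],
      [4, 0, 5, 0],
      [0, 5, 0, 1],
      [0, 4, 1, 0],
  ], dtype=float)

  def cosine(a, b):
      denom = np.linalg.norm(a) * np.linalg.norm(b)
      return a @ b / denom if denom else 0.0

  # item-item similarity derived from co-viewing patterns
  sim = np.array([[cosine(R[:, i], R[:, j]) for j in range(R.shape[1])]
                  for i in range(R.shape[1])])

  def recommend(user_ratings):
      scores = sim @ user_ratings          # similarity-weighted score per video
      scores[user_ratings > 0] = -np.inf   # don't re-recommend what was watched
      return videos[int(np.argmax(scores))]

  print(recommend(np.array([5.0, 0.0, 0.0, 0.0])))  # a user who only watched "chess"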

I think it's fair to say that all recommendation engines deployed by for-profit companies have some element of recommending things that drive more revenue for the company. The question is how much the end user gives up, in terms of getting actually interesting recommendations versus being recommended something that is more profitable for the company, in exchange for getting things of interest to their attention.

I've seen enough to be convinced, anecdotally, that YouTube's recommendation engine is heavily skewed towards revenue-driving. For example, how much of what YouTube recommends is influenced by their big-money media customers? What users get is as marginally interesting as necessary to addict viewers.


I'm a bit dissatisfied that the '15 minute compilation of cats vomiting' video linked in the article is no longer available.


I primarily use YouTube on my AppleTV, and man, the UI is awful. The recommendations change every time the home screen comes up, and there's no easy way to flag things. So if I see 3 recommendations I need to watch, I have to try to remember what they were before I watch one, because when I finish with the first video, the others will be gone.

Netflix is even worse (and they all seem to suffer from this to some degree). When I start it, it's an ad for some random thing. To find what I was watching last I have to either remember whether that miniseries is considered a movie or a TV show or what.. or, I can scroll down to Home, and "continue watching.." will be buried 3 rows deep and 2 columns over. So incredibly strange and unintuitive.

I guess they want their algos to drive more engagement rather than just take you back to what you almost certainly opened the app to watch?


What I hate is, it's orders of magnitude more toxic for kids. Hyper-amplification of bad recommendations.

For sure, Youtube's (bad) impact on the current generation of children will be felt for many generations to come. At this point, it's literally technology gone rogue.


My wife and I recently made the decision to totally remove YT Kids from the kids' iPads (which they mainly use for long car rides). It's completely toxic in a way I didn't even realize it would be, and the videos aimed at the kids are the lowest quality garbage imaginable.

I had thought maybe YouTube Kids wouldn't be as bad as the InstaSnapFaceTwit, but it's just as bad. The content recommendations are so bad that frequently videos will be in Russian or Korean (our native language is English). And YouTube Kids encourages kids to have zero attention span by showing all sorts of recommendations all the time. My youngest can hardly get through a single ten minute YouTube video without clicking something else.

Our new rule is they can have some occasional screen time (though playing with real toys and playing outside are preferred), but when we do have screen time, we can do activities that are creative or imaginative or strategic (like games or making art or writing). If we're using tech as a consumption box, reading is best, followed by good movies that we can watch together.

I'm honestly happy now that YouTube Kids has been banned from my house. I'd commend every parent to do the same.


Found tons of good music via YouTube recommendations. As an adult I know enough not to blindly start watching flat earth videos should they be recommended and lose all sense of reality. Please don't destroy the good in a vain hope of protecting everyone from themselves.


On a related note, I'm surprised nobody mentioned the Clickbait Remover for YouTube addon. Highly recommended! It's on Mozilla addons (and probably Chrome, not sure):

https://addons.mozilla.org/en-US/firefox/addon/clickbait-rem...

I always use YouTube signed out anyways, but there's also this Privacy Redirect addon that redirects YouTube links to Invidious, Twitter to Nitter, etc:

https://addons.mozilla.org/en-US/firefox/addon/privacy-redir...


Google and YouTube video search are terrible. I find the result poorly matches the query I typed, the results are hard to browse, and of course favor videos on YouTube ignoring better videos that may be on other sites.

Bing video search is by far better. Assuming you use DuckDuckGo type your search and follow it with !bv. The result is an excellent browsable display of videos that match your search well, that can be previewed by hovering over them, and the results aren't biased to just YouTube. You can also add emphasis to words in your search with the +/- operators and quotes.

Also the "up next" YouTube video is rarely as good as going back to the Bing search and choosing another video.


I recently took the radical step of buying a youtube family subscription. I was ad-blocking anyway, but if we only consume free things, then we should expect to be exploited in some way by the websites we consume. The family subscription is $18/month, and I am also spending about $30/month on patreon support.

I'm hoping as a paying customer the company/providers will take my needs more seriously than the people who they have to milk for attention. It very likely won't make a bit of difference unless enough people do the same, and I have little hope of that happening. Most people would rather complain about free stuff than pay for anything.


This is yet another version of the argument that democracy is good BUT people can't be trusted to make good decisions on their own and should only be exposed to carefully curated content.

In other words, an implicit contradiction.


I have three rules for Youtube:

1. I have two or three accounts specifically for what i want to view. I have an account where I only see video editing tips/how-to's, and a separate account for real estate, a separate account for personal interests, etc.

2. When I invariably see something interesting outside of my normal account's spectrum, sure, I watch it - but I make sure to delete it from my history.

3. I aggressively click "don't recommend this again" for anything that falls outside of the account's focus.


Youtube, like all social media, tends to give you more of what you consume.

If you watch social commentary, or God forbid anything to do with race or gender, you'll be fed the worst parts of the internet.

Watch videos on the X86 instruction set, and you'll get more of that. I tend to stick to music and programming.

I did find a gem where someone animated the original Star Wars episode 9 script, but even that was a rare diamond in the rough. For that one video, you have hundreds of people whining about how Disney ruined everything.


I've drastically improved my YouTube experience by removing all recommendations and comments from my YouTube experience using the "Remove YouTube Recommended Videos, Comments" Chrome/Firefox extension. Combining this with an ad blocker makes YouTube a joy. It works for me, not the other way around.

Sure, I may miss out on watching a few good videos, but at most I'll miss out on a good piece of trivia; in exchange I am more focused and in control of my attention.


Let’s be honest - toxic here just means “people are accessing content that doesn’t agree with my personal politics”. This word doesn’t have meaning anymore, like “fascism”.


Here are some CSS selectors I block on YouTube to remove recommendations and other annoyances:

  div#secondary
  .videowall-endscreen
  .ytp-pause-overlay
  .iv-promo-video
  .ytp-cards-teaser
  .ytp-button

That first one is super-nice -- using that, I no longer need superhuman self-control to prevent my brain from falling into a YouTube black hole powered by millions of iterations of A/B testing targeted at my lizard brain. I just watch one video and then... close the tab.


Love to use the algorithm that brought billions and billions of views to videos titled like “OPENING $10,000,000 DARK WEB MYSTERY BOX” and other excellent content.


One at least poor (if not dumb) and unwanted kind of recommendation I keep getting is for videos I have very recently watched, and completely - and both the watching and then the recommendation happen in the same YouTube app on the same phone, often within minutes or seconds of each other. Unless there is some reason I don't know of, this seems useless, a waste of processing cycles, and potentially irritating to the user.


There are chrome extensions which disable the recommendations sidebar in YouTube. I recommend using one.

https://chrome.google.com/webstore/detail/remove-youtube-rec...

Not sure if this is the one I use, because I'm on mobile ATM. But it's a similar idea, at least.


A simpler solution is to just use your ad blocker:

  www.youtube.com###related
It's worth taking a moment to familiarize yourself with adding this kind of filter.


Thanks.


The author claims that RT is a "state-sponsored propaganda outlet." That paints an unfortunate and misleading image about RT programming. RT, for example, is the home of On Contact -- a Chris Hedges program, as well as former Minnesota governor Jesse Ventura, and "conservative comedian" Dennis Miller. To imply that these people are useful stooges in a foreign propaganda apparatus is rather insulting.


The only toxic thing with youtube is in settings where you cannot block the incessant ads. Twelve dollars is just ridiculous to watch user-created content ad-free. Worse, the ad placement feels wholly random: you can end up with multiple ads even in a short video, yet have none in the next.

The recommendations for me have been fine, if a bit repetitive; just because I watched one video on a subject doesn't mean I want all the rest.


In my experience YouTube's recommendations are pretty predictable. If you watch a specific type of content it will keep recommending videos from that type until you search for something else. Sure, it makes some mistakes but you can always customize it to not show some type of videos or channels. If you are getting toxic recommendation all the time there is a high chance that you watch a lot of toxic content.


Beware the filter bubbles: https://youtu.be/B8ofWFx525s

Using scientific unification as a cognitive guide to understand this holistically: https://www.amazon.com/dp/0375714499/

Developers! Startups! Cap tables! Unicorns! Disrupt YC and tune it out.


No one in this thread has mentioned that YouTube recommendations changed dramatically in 2017, from heavy conspiracy/sensational/lies videos to something much more "toned down" and IMO rather annoying. It's driving everyone to the mainstream, or perhaps into the mainstream of one of two camps. At least before, flat earth did not fit into the mainstream of anything.


That's okay, I've noticed google has started to make it completely useless in any case. I tried to watch a speed run of a game (~2 hours) recently and google saw fit to interrupt me with an ad every 3-5 minutes. Not even TV does that.

I just found something similar on twitch, which for some reason didn't give me any ads. I wonder how long that will last...


I really like my YouTube recommendations after a lot of time on the site and manually clicking "I'm not interested" on a good number of them. I've also wound up with several accounts over the years that all have slightly different but overlapping content - my most recent account gets more stuff like Rambalac for example.


On a slightly related note, I really am enjoying the browser extension "remove youtube suggestions". My home page is completely blank and any video that I watch has no recommendations on the side. I like it this way because I won't get sucked into a rabbit hole of youtube videos. I also have comments turned off.


Even as a guest, YouTube wants to run its recommendation algorithm on you. It's great, what can I say, besides that privacy is the last thing they're worried about anyway. They're almost like: you know you're being monitored, don't worry about it, we'll get you some great videos along the way.


This becomes very apparent when traveling and using hotel Internet, or the built-in YT app on the room's "smart" TV.

The YT recommendations are very targeted, not only to local businesses but to culture and aggregate viewing patterns. Go to Washington DC and check YT search patterns. Then go to Seattle, WA, Memphis, TN, or Austin, TX and compare these to each other.

It is a fascinating window into what YT thinks about the local population.


At some point I was listening to a lot of nursery rhyme songs with my child. When doing this I just let the recommendation algorithm choose the next song for me because I didn't really care what to watch next. After a few weeks I was stuck in a two-song loop where song N+1 was equal to song N-1...


Honestly it’s all I’m watching anymore during lockdown.

I’ve exhausted Netflix and Amazon Prime. I honestly use it so much I’d love like a Reroll button to show me completely fresh algorithm picks.

Yeah they’re toxic whatever don’t care, they’re excellent. Alcohol is bad for me too but I’m not gonna stop doing that either.


My YT recommendations are usually my already browsed songs, so it's a good thing for me. I also don't login on YT as I don't have a google account. Well, technically I do have one, I created one for my Android emulator but I don't care what's in that account.


Yes they are. They are the drug pushers of the modern era and the harder the drug the harder it pushes you

Sure, it works fine for "normal" content. That's fine. But for "divisive" stuff it's like having someone trying to push you off road at every moment


Yeah I wonder if in the future we won't see this as a terrible crime (pushing the next dopamine hit to a bunch of addicts) or if humanity needs to evolve to gain much much more self control.


I prefer the word dope peddlers, as in dopamine. It's not technically accurate but I like the impression it gives.


I actually miss the Ben Shapiro DESTROYS Feminists recommendations that I was getting 2 years ago. It was kinda fun. Now they have toned it down so much that it's hard to find engaging content. Their music recommendation is still good.


I like to have video game speed runs in the background every now and then. Now my suggestions are full of videos with outraged titles about how someone CHEATED in a mario speed run. EXPOSED. LIAR. PATHETIC CHEATER PRETENDS TO BE BLINDFOLDED.

ffs


I don't know how I was wasting my time watching skateboarding and far right videos...

I like skateboarding but I am 41 and I have not practiced skateboarding in years.

But I am on the left on politics.

I deleted the youtube app and put a URL blocker in place to stop this behavior.


Educate your non-tech friends to use adblockers and a VPN, and to clear browser data after each session. Actually, YouTube has decent search with filters. Don't rely on suggested videos; search, and only build a profile if you wish.


YT has terrible search. It totally fails to use the + operator.


The other day I watched some video from this bearded British Linux nerd about awk, and since then, YouTube just won't stop recommending me videos involving Arabic expats to English-speaking countries.

Thanks YouTube


Recommendation engines are the monkey's paw of social media companies.


That's why I'm never logged in. And I also clear my cookies everytime I close the browser. Youtube to me now is more like a blank page where I search for stuff instead of a weird aggregator.


> We’ve designed our systems to help ensure that content from more authoritative sources is surfaced prominently in search results

So Google also doesn't necessarily recommend what the user wants.


You have to fine tune the recommendations yourself, by manually removing and choosing the option "I don't like the video".

It works surprisingly well, and well worth the effort.


I like that the author doesn't even see the irony in the fact that his article is toxic and written in a sensationalist way just to get more clicks and more ad revenue.


Also, what's the story with Youtube deleting and disabling all comments for autogenerated music videos? That was a really messed up move.


Firefox add-on that can remove various addicting components of YouTube:

Search for "Remove YouTube Recommended Videos, Comments"


This website is literally unusable on mobile. The ads make it impossible to scroll. What a load of garbage.


This feels a bit like stating the obvious.

I felt the "pull" of the alt-right because of YouTube;

It begins innocuously enough, someone shares something that's a thoughtful piece on "googles culture" or something which is left leaning, then in the recommendations there is a response, usually a passionate argument against SJW's or whatever, once you watch one it's basically game over (at least this was the case 1-2 years ago); My entire feed was completely inundated with 'Ben Shapiro owns the libs" or "SJW owned compilation" usually featuring that very loud and aggressive feminist lady with the dyed red hair in the thumbnail.

It was only with concerted effort (i.e., clicking "Not interested"... a lot!) that I was able to eventually save my account, but I still get the odd Jordan Peterson video, and even right now I got a recommendation for "Power & Panic: the last days for Christian nationalism" by "The Thinking Atheist".

What's worse is YouTube used to require me to give a predefined reason to not recommend something; and "too political" was not one of those choices.

I'm absolutely certain that the same thing happens for others too towards the left. It's incredibly radicalising.

I just do not give a fuck about politics, my YouTube watch history is all music and conference talks.


Why don't you just delete the underlying videos from your youtube history? I've had history turned off for a while, so my recommendations are only based on likes, subscriptions, and the actual content I'm watching at that moment. It seems like cleaning/clearing the history would solve the problem?


How do you know that deleting your history does anything? I mean, yes, deleting your history hides what you've been looking at (from you), but does that restrict the recommendation algorithm? Is there a way to know for sure?


It's a pretty stark difference when you have one video in an otherwise blank youtube account vs having zero.


It could, but I’ve watched a lot of videos since then.

I was not aware that this was possible back then.

Good advice though, I would suspect it works, I would hope it would. :)


I literally only watch one type of video on YouTube (chess games).

The entirety of my recommendations are Jordan Peterson.

Wtf


And then if you watch one Jordan Peterson video, all of a sudden Youtube thinks you are a Proud Boy.

Another interesting example from the past: I watched a few science/nature focused videos on Antarctica. Then youtube decided I wanted to see Antarctica conspiracy videos about aliens, secret buried worlds, and Nazi holdouts.

That being said, I think the recs have become much cleaner lately. i.e. no more Russia Today videos in my feed.


Personally, I find the opposite. When I watch conservative content on YouTube, the recommended video is almost always very milquetoast - almost always an episode of “Uncommon Knowledge” by the Hoover Institute (despite me finding those extremely boring and therefore actively avoiding them), or a speech by Roger Scruton/Jordan Peterson/Douglas Murray - thinkers who are right-wing, but “safe”.


I have two accounts, one for work and one private. One of them is for following political right-leaning stuff, the other for left-leaning.

In my experience both accounts do not try to push me to extremes. They DO try to keep me in a bubble, and both are very happy to lead me away from politics in general.


The Matrix gets it wrong. There won't need to be a war.


More people should find out about this.


Relevant xkcd and TheOatmeal comics (the latter is a bit longer and about the whole internet, but I see the Mona Lisa part as a reference to YouTube):

https://theoatmeal.com/comics/making_things

https://xkcd.com/202/


you don't say


[flagged]


Could you please stop breaking the site guidelines? You've been doing it a lot, I'm afraid. They ask: "Please don't comment about the voting on comments. It never does any good, and it makes boring reading."

https://news.ycombinator.com/newsguidelines.html


COVID misinformation is a bit of an outlier in how the various social networks are treating content, though.


I agree, but it's still a good example of why the OP's statements are questionable.


Wait, they're claiming that Youtube recommends a lot of conspiracy bullshit, which is pretty well documented. You're pointing out that Youtube bans one specific narrow type of conspiracy bullshit, which is true but irrelevant. I don't see how that makes their statements questionable.


No, it's not. It's like saying smoking is good for you because there's this one 120 year old smoker.

COVID misinformation is being tackled with specific manual attention because the algorithm was prioritizing it, and because it's receiving fairly unique amounts of attention.

Google's (and Facebook's and Twitter's) hands were forced here. They'd much rather be leaving it alone like they do with other conspiracy theories.


Note that the article dates to June 2019, based on Chaslot's experiences from earlier years (which were probably still largely true when the article was written, but hard to check right now one way or another).


I fought with Instagram algorithm, and now my instagram explore feed delights me. I get cool new content that I actually like and enjoy.

I fought with Youtube algorithm, and it's really really lame. I don't get any new creators or quality content. Even the education content it tends to suggest is absolute garbage.



