Hacker News | mrob's comments

I trust software more if it's ugly. Desktop software interfaces were pretty much solved by the late 90s. The majority of changes since then have been aimed at either dumbing down the software to better appeal to people who are easily manipulated and exploited, or just change for the sake of change so designers can justify their jobs. And I choose to avoid mobile devices because every one of them is slow and frustrating compared to using a keyboard and mouse.

I believe modern UI designers provide me negative value on average. Ugly software is a good sign because it tells me no designer was there to ruin it.


Me too, but there is a hint of survivorship bias here. I trust most of the ugly software I am still using because it was designed well enough in the ugly era to survive until today.

I have largely forgotten all the bad ugly software.


> I trust software more if it's ugly

Interesting point, and I was considering whether I believe the same, but I can't completely agree. The reason is that there's a decent chunk of software outside of the "mainstream" app space where the UI is ugly and also broken, likely because the author isn't a great developer. I'm thinking of engineering and research type apps. They have the kind of bugs where clicks do nothing, user feedback is faulty, and the wrong sequence of clicks causes a crash, and that makes me instantly distrust the entire product.


You seem to be using the most negative expressions possible. Maybe I'm also less upset because most of the software that I use (Linux, FOSS, KDE) is less affected. I'd call it making the software look less intimidating / more approachable and more fashionable. They can both make some sense, though in many cases I also don't like it. Some fashions are objectively worse than others, like hiding scrollbars in non-touch interfaces and making interactive elements look like regular text.

IME, the ugliest software has not received much UX or design work, and so the UX often sucks too. Gitk comes to mind: it's very ugly, and its weird diff scrolling behavior regularly takes me where I don't want to go.


These are business decisions, not design choices.

> dumbing down the software

"The growth marketing team has identified an untapped demographic".

> change for the sake of change so designers can justify their jobs.

"We've decided to rebrand to focus on XYZ market."

> mobile devices because every one of them is slow

"If we can run the website on the user's device we'll save on developer time."

> Ugly software is a good sign because it tells me no designer was there to ruin it.

It just means they don't have the money to hire a good designer. Meaning no marketing department. For whatever reason designers don't do open source. Looking at you GIMP. Blender finally bit the bullet.


"The growth marketing team has identified an untapped demographic".

Non-experts are always a larger group than experts.

A design prioritizing non-experts does not benefit experts.

Moreover, experts know they are trying to solve hard problems and benefit from tools that recognize the nature of solving hard problems entails doing hard things.

Or to put it another way, though there is a large market of people who don't play the violin, a ukulele is not a violin.


Exactly. There is pressure to sell to as many people as possible, as cheaply as possible. This leads to "simplifying" the UI.

Generally this means power users now have to click 3 times instead of 1.

Of course there are alternatives, like having an advanced mode or keyboard shortcuts. But that takes time and effort for a typically smaller set of users.


Following 90s design is how you get the atrocious GIMP 2 floating window mess. Furthermore, the standards in the late 90s and early 00s were so low because of the lack of available software packages that programs crashing your kernel was just part of the normal work week. Memory corruption wasn't treated as a problem unless it corrupted data on disk, because it was considered normal that a program couldn't run more than a few hours before committing memory corruption suicide.

UI controls were standardized and programs stuck to them because they didn't have the CPU cycles to waste on styles, but that didn't prevent programs from using text areas as listboxes, abusing picture elements to stylize checkboxes, and picking whatever cursor the dev decided fit most (despite its name and intended use) when hovering over controls. The control schemes for anything 2D or 3D graphics were also designed by dice roll. Programs also regularly didn't fit on your screen unless you adjusted your resolution, because people often ran on lower resolutions to be able to use more colours. "This website works best in 1024x768" didn't just apply to websites.

The 90s software that stuck around until today is good because it survived decades of competition and reinvention or because people who don't want to learn new software just refused to try something else. The thousands upon thousands of expensive software packages that looked just like it found their deserved deaths years ago.


I don't drink cola myself, but it seems logical to me. The point of the expensive advert is showing everybody how rich Coca Cola is. That increases the trust people have in their products being safe and reliable because they know Coca Cola has something to lose. If they didn't advertise they'd be like those Chinese sellers named as random strings of uppercase letters. I definitely wouldn't buy cola from one of those.

I think paying for Youtube will increase the chances of my Google account getting banned. I've never heard of Google banning somebody for rejecting adverts. But if I pay them money, there's a chance there will be a problem with the payments, and that risks triggering false positives on automatic fraud detection. If that happens I assume I would be banned with no recourse and no human intervention. The safest thing to do is never change how you interact with Google in any way unless you absolutely have to.

I don't like depending on Google in this way but I've had a Gmail account for a very long time and changing to a different email address would be a major inconvenience.


> I think paying for Youtube will increase the chances of my Google account getting banned. [...] The safest thing to do is never change how you interact with Google in any way unless you absolutely have to. I don't like depending on Google in this way but I've had a Gmail account for a very long time and changing to a different email address would be a major inconvenience.

I recall that even logging into Youtube with your Google account could have that danger: if for some reason Google decided that your name isn't your real name, under its "real names" policy your whole account could get banned, even from other services like Gmail and Google Talk. It's for that reason that I've been very careful to never log into Youtube with my Gmail account, even though that account always used my real name, and even though Google+'s deep integration with YouTube is AFAIK no longer relevant.


I don't think this is true. The typical uneven LED spectrum causes poor color rendering accuracy, but human color perception is highly inconsistent anyway. Think of the blue/white dress people were arguing about ( https://en.wikipedia.org/wiki/The_dress )

See also:

https://en.wikipedia.org/wiki/Color_constancy


I've seen this repeated many times but never seen any evidence for it. At typical PWM frequencies the perceived brightness is just the average brightness of the wave. I believe this myth arose from people driving low-brightness indicator LEDs using PWM for increased efficiency when using simple current-limiting resistor circuits. People saw the energy savings from less waste heat in the resistor and somehow confused it with something happening in the eye.
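The averaging claim is easy to sketch with made-up numbers (the peak output and duty cycle below are purely illustrative):

```python
# Sketch with assumed numbers: at PWM frequencies well above the
# flicker fusion threshold, perceived brightness tracks the
# time-averaged output of the waveform.
peak_output = 800.0   # hypothetical peak luminous flux, lumens
duty_cycle = 0.25     # LED is on 25% of each PWM period

# Average over one period: peak output while on, zero while off.
average_output = peak_output * duty_cycle + 0.0 * (1 - duty_cycle)
print(average_output)  # 200.0
```

So a 25% duty cycle reads as a quarter of full brightness, not as some nonlinear eye effect.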

Linear regulators are in fact used for room lighting, and efficiency can be reasonably good. Typical design is AC input -> bridge rectifier -> passive low-pass filter -> long string of LEDs with a single linear regulator in series. Voltage drop across the regulator is much lower than across the string of LEDs so there's not a whole lot of heat generated.
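The efficiency of that topology can be sketched with hypothetical numbers (the bus voltage, per-LED forward voltage, and LED count below are all assumptions, not measurements from any real fixture):

```python
# Illustrative sketch, all numbers assumed: with a long LED string,
# the linear regulator only drops the small difference between the
# DC bus and the string voltage, so most power goes into the LEDs.
supply_v = 300.0        # filtered DC bus after rectification (assumed)
led_forward_v = 3.1     # forward voltage per LED (assumed)
n_leds = 90

string_v = n_leds * led_forward_v       # ~279 V across the LED string
regulator_drop_v = supply_v - string_v  # ~21 V dropped in the regulator
efficiency = string_v / supply_v        # fraction of power in the LEDs
print(f"{efficiency:.0%}")              # 93%
```

With the regulator dropping only a small fraction of the total voltage, its heat dissipation stays modest.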

They're never used for dimming, since that requires a voltage drop, which is the context here.

There doesn't need to be a health risk for it to be annoying. I personally dislike PWM and I'll continue to personally dislike it even if it's proven safe. Fortunately it's easy to find non-flickering LED lights.

If the article said "I find PWM annoying" I wouldn't have commented like I did.

You can just lower the current. Not everyone does because it generally requires more expensive components, e.g. inductors. There is a threshold voltage ("forward voltage") needed for LEDs to turn on but there's no threshold for minimum radiant flux. LEDs are actually more efficient at low current (although this might be counteracted by greater losses in the power supply).
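A toy comparison of the two dimming approaches (simplified and assuming linearity, which ignores the low-current efficiency gain mentioned above):

```python
# Toy sketch, assumed linear: analog current reduction and PWM yield
# the same average drive current, and hence roughly the same average
# light output, but analog dimming produces no flicker.
def analog_dim(full_current_ma: float, fraction: float) -> float:
    """Continuously lower the drive current."""
    return full_current_ma * fraction

def pwm_dim(full_current_ma: float, duty_cycle: float) -> float:
    """Time-average of switching the full current on and off."""
    return full_current_ma * duty_cycle

print(analog_dim(350.0, 0.2))  # 70.0
print(pwm_dim(350.0, 0.2))     # 70.0
```

Both reach the same average current; the difference is that PWM gets there by flickering between full current and zero.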

JPEG artifacts are less disturbing because they're so obviously artificial. WEBP and similar artifacts look more natural, which makes them harder to ignore.

I think I agree: low quality JPEGs give the impression of looking through slightly imperfect glass, while WEBP and AV1 look a bit more like bad AI.

Subpixel rendering isn't necessary in most languages. Bitmap fonts or hinted vector fonts without antialiasing give excellent readability. Only if the language uses characters with very intricate details such as Chinese or Japanese is subpixel rendering important.

Ah so only 20% of the global population? Nbd
