
The irony is that AI writing style is pretty off-putting, and the story itself was about people being put off by the author's AI project.

Discoverability is quite literally the textbook problem with CLIs, in that many textbooks on UI & human factors research over the last 50 years discuss the problem.

With 1B+ users, Apple isn't in a position to do the typical startup fast-and-loose order of operations. Apple has (rightly) given themselves the responsibility to protect people's privacy, and a lot of people rely on that. It'd be a really bad look if it turned out they made Siri really, really useful but then hostile governments all got access to the data and cracked down on a bunch of vulnerable people.

Yeah, it's a classic CLI v GUI blunder. If you don't know exactly what the commands are, the interface is not going to be particularly usable.

I've found I appreciate having Siri for a few things, but it's not good enough to make it something I reach for frequently. Once burned, twice shy.


I'm amazed more AI tools don't have reality checks as part of the command flow. If you take a UX-first perspective on AI - which Apple very much should - there's going to be x% failures to interpret correctly, causing some unintended and undesirable action. A reasonable way to handle these failure cases is to have a post-interpretation reality check.

This could be personalized, 'does this user do this kind of thing?' which checks history of user actions for anything similar. Or it could be generic, 'is this the type of thing a typical user does?'

In both cases, if the action is unfamiliar you have a few options: try to interpret it again (maybe with a better model), raise a prompt with the user ('do you want to do x?'), or, if it's highly unfamiliar, auto-cancel the command and say sorry.
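
For illustration, a minimal sketch of what that post-interpretation check could look like - every name and threshold here is hypothetical, not any real Siri or Apple API:

    import Foundation

    // Hypothetical sketch: after the model interprets a command into an
    // action, gauge how familiar that action is before executing it.
    enum RealityCheckOutcome {
        case execute         // familiar enough: just do it
        case confirm(String) // uncertain: raise a prompt with the user
        case cancel          // highly unfamiliar: apologize and stop
    }

    func realityCheck(action: String, userHistory: [String]) -> RealityCheckOutcome {
        // Personalized check: has this user done anything similar before?
        let similarCount = userHistory.filter {
            $0.localizedCaseInsensitiveContains(action)
        }.count

        switch similarCount {
        case 3...:  return .execute
        case 1...2: return .confirm("Do you want to \(action)?")
        default:    return .cancel // or retry with a better model first
        }
    }

The generic variant would swap userHistory for population-level stats about what typical users do.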


Was the failure really driven by privacy policy? Long term, a privacy play is the right move. But right now, Siri underwhelms even compared with a model that has no understanding of user context and is just interpreting commands.

Agreed. I often have to verbally battle with Siri to do the most basic interaction. Siri recognizes all my words but misinterprets my intent and does something I didn’t want.

Yeah and the fact that this basically hasn’t improved in a decade tells me that it’s likely that nobody actually works on Siri.

Not to mention the iOS keyboard has gotten so bad in the last year that it took me 3x longer to type this comment (I use the swipe keyboard). I had to fix at least a dozen typos.

Every now and then when they screw up, they’ll have a mea culpa with the press. They haven’t done that with Siri or the keyboard yet.


The new Alexa uses Claude under the hood, and it also misinterprets my intent, only with a two-second longer delay and a slightly more approachable tone.

My understanding is they are just stubborn. They once did it this way and now it is "the Apple way".

Recent example: Apple used to hide "search in page" in the share menu in mobile Safari. Far from obvious, but at some point one discovers it because there is no other place to look for it.

Now they have finally decided to add a standard fly-dropping overflow menu and hide the share button there. But interestingly, you still need to open the share menu from there to find the search button.

Meanwhile, other buttons that weren't as obviously misplaced in "share", like "Add to Bookmarks", are now at the top level together with the share button.

The same goes for the arguments against things like cut and paste in Finder: they didn't create it back in the day, and now there is a complete mythology about why cut and paste in Finder would actually be stupid and almost evil.


FYI, an easier way to search within a page in Safari is to just type what you want to search for in the address bar, and then at the bottom of the list of suggestions (you may need to scroll down) tap "On this page".

After 7 years I learned. Thank you!

How did you learn this hack?


Glad to have helped!

I just noticed it randomly many years ago. I don't remember the occasion, but I guess I was lazily scrolling through history trying to find a page and noticed it at the bottom.

It's an example that sums up feature discoverability (or the lack of it) on iPhones - there are so many things like this that are really useful to know, but the only way to find out about them is luck or a friend telling you. Occasionally the official Apple "Tips" app has useful stuff, but not much.

I actually have a thing in my family Signal chat of every few weeks sharing a new random iPhone tip, as I'm by far the nerdiest in the group. Maybe I should collate them all into a "hard to discover Tips" blog and share on HN...


It was driven by privacy and on-device compute.

Anything you ask an Android or Alexa device to do goes to their respective clouds to be 100% processed there.

Apple tried to make a small and focused interface that could do a limited set of things on device without going to the cloud to do it.

This was built around the idea of "Intents" and it only did the standard intents... and app developers were supposed to register and link into them.

https://developer.apple.com/documentation/intents
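
To give a sense of what "register and link into them" means on the developer side, here's a rough sketch of handling the standard "start workout" intent (the Intents extension target and registration plumbing are omitted):

    import Intents

    // Sketch of an Intents-extension handler for the standard
    // INStartWorkoutIntent, so Siri can route the request to the app.
    class WorkoutIntentHandler: NSObject, INStartWorkoutIntentHandling {
        func handle(intent: INStartWorkoutIntent,
                    completion: @escaping (INStartWorkoutIntentResponse) -> Void) {
            // Start the workout; .continueInApp hands control to the app.
            completion(INStartWorkoutIntentResponse(code: .continueInApp,
                                                    userActivity: nil))
        }
    }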

Some of the intents didn't really get fleshed out, some are "oh, that's in there?" (Restaurant reservations? Ride booking?), and the whole thing feels more like the half-baked MySQL interfaces in PHP.

However, as part of privacy, you can create a note (and dictate it) with Siri without a data connection. Your "start workout" command doesn't leave your device.

Part of that is privacy. Part of that is Apple trying to minimize its cloud spend (on GCP or AWS) by keeping as much of that activity on device as possible. It wasn't entirely on device, but a lot more of it was than on Android... and Alexa is a speaker and microphone hooked up to AWS.

This was ok - kind of meh, but ok - pre-ChatGPT. With ChatGPT, expectations changed, and the architecture Apple had was not something that could pivot to meet those expectations.

https://en.wikipedia.org/wiki/Apple_Intelligence

> Apple first implemented artificial intelligence features in its products with the release of Siri in the iPhone 4S in 2011.

> ...

> The rapid development of generative artificial intelligence and the release of ChatGPT in late 2022 reportedly blindsided Apple executives and forced the company to refocus its efforts on AI.

ChatGPT was as much a blindside to Apple as the iPhone was to BlackBerry.


I think all of these are true:

1. Apple is big enough that it needs to handle edge cases like being offline or having limited cell reception, which affect millions of users at any given moment.

2. Launching a major UI feature (Siri) that people will come to rely on requires offline operation for common tasks like basic device control and dictation. Major UI features shouldn't cease to function when they enter bad reception zones.

3. Apple builds devices with great CPUs, which allows them to pursue a strategy of using edge compute to reduce spend.

4. A consequence of building products with good offline support is they are more private.

5. Apple didn't even build a full set of intents for most of their apps, hence 'remind me at this location' doesn't even work. App developers haven't either, because ...

6. Siri (both the local version and remote service) isn't very good, and regularly misunderstands or fails at basic comprehension tasks that do not even require user data to be understood or relayed back to devices to execute.

I don't buy that privacy is somehow an impediment to #5 or #6. It's only an issue when user data is involved, and Apple has been investing in techniques like differential privacy to get around those limitations to some extent. But that's further downstream of #5 and #6.
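
For the unfamiliar, the core trick of differential privacy is adding calibrated noise before a value leaves the device. A toy sketch of the Laplace mechanism (illustrative parameters, not Apple's actual implementation):

    import Foundation

    // Toy Laplace mechanism: noise scaled to sensitivity/epsilon masks any
    // individual's contribution to an aggregate count.
    func laplaceNoise(scale: Double) -> Double {
        // Inverse-CDF sampling; u stays strictly inside (-0.5, 0.5).
        let u = Double.random(in: Double.ulpOfOne..<1.0) - 0.5
        return -scale * (u < 0 ? -1.0 : 1.0) * log(1 - 2 * abs(u))
    }

    func privatizedCount(_ trueCount: Int,
                         sensitivity: Double = 1,
                         epsilon: Double = 0.5) -> Double {
        Double(trueCount) + laplaceNoise(scale: sensitivity / epsilon)
    }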


Dragon NaturallySpeaking was way more accurate than Siri is now, and it ran on-device on ancient computers.

I don't care if I have to carefully say "bibbidy bobbity boo, set an alarm for two" - I just need it to be reliable.


Yeah, I'm not buying that either/or framing either.

Siri could've done better, but Apple is definitely taking big risks with their privacy play. They might just corner themselves.

I'm amazed 'set a reminder for x when I leave this location' still doesn't get the 'when I leave this location' part. It's clear that user expectations, created internally (by Siri marketing) and externally (by AI tools), have far outpaced capability.

Apple seems weird about that and I'm not sure why, maybe accuracy or creepiness factor?

A feature I would love is to toggle "answer calls on speakerphone" based on location, so that I can answer a call with my phone on the desk while I'm at home and not have my ear blasted off taking a call when I'm walking down the street.


Apple Reminders has a feature to remind you when you are leaving or arriving at a location. It's super useful! But it's not super low-friction to add to a reminder via the UI (it's buried at the bottom of the edit screen), so it's a feature ideally suited to a voice-based reminder. Nevertheless, nobody implemented the Siri support.
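
The plumbing is exposed to third-party apps too, via EventKit. A rough sketch with made-up details (title, coordinates), assuming reminders access has already been granted:

    import EventKit
    import CoreLocation

    let store = EKEventStore()
    let reminder = EKReminder(eventStore: store)
    reminder.title = "Grab the package"
    reminder.calendar = store.defaultCalendarForNewReminders()

    // Geofence around a location; the coordinates here are made up.
    let home = EKStructuredLocation(title: "Home")
    home.geoLocation = CLLocation(latitude: 37.33, longitude: -122.01)
    home.radius = 100 // meters

    let alarm = EKAlarm()
    alarm.structuredLocation = home
    alarm.proximity = .leave // fire on leaving; use .enter for arriving
    reminder.addAlarm(alarm)

    try store.save(reminder, commit: true)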

Hey, I made that!

It's a great feature! I was demoing it to my parents over Thanksgiving and forgot about the lack of Siri support, and of course it failed. Parents were excited when I mentioned it but now won't be using it. Ah well.

Super useful feature; thanks. I used it to help me remember the names of people I saw often at places I ate at or worked out of.

I can set location-based alerts manually. For me, or for those who voluntarily already share their location with me. No reason Siri can’t drive those same notifications.

Edit: to be clear, Siri doesn’t. Still no reason it shouldn’t be able to.


Ah, it's great you brought this up - it's timely, as my app is adding contacts syncing soon and I want to do it in a secure/private way. If you choose to go ahead with this, are there any plans to make it open source? ty!


Yeah, it will be


They want you to never unsubscribe, which requires your addiction.

Incentive to addict + Ability to addict = outcome


It doesn't require addiction though. It only requires an aversion to watching ads, or the more general aversion to being annoyed.


iOS .0 releases tend to be this way, even on brand-new devices. I noticed some big perf improvements in the 26.0.1 release. If I were you, I'd wait until 26.1 or 26.2 and reassess then. It still may not be optimal for a mini, though, for non-perf reasons, as iOS 26 assumes a larger average device size.

