In the bigger picture Apple aren't really documenting anything for their new toys, which is not the end of the world but is a huge step back for openness. There is - as far as I can see - no long-form documentation for M1 other than patents.
I already refuse to own Apple products so I don't really have any skin in the game, but consider that if Microsoft were like this, a blind eye would not be turned: buy our new product, everything you run is up to us, we won't document it, maybe we'll upstream the compiler, tough shit.
The fact that they can be like this is really proof that their market position is much stronger than most HNers seem to think it is.
On the other hand, Apple released their 64-bit ARM backend for LLVM[1] in 2014. They announced their intention to do so almost immediately after the first ARM64 phone, but the process took several months because it had to be merged with a duplicate backend developed by others.
I think it is likely that support for these instructions will be upstreamed to LLVM, and that the teams involved in this monumental transition (which was announced ~50 days ago) simply didn't put it at the top of the list of things developers need to know to get their apps working on Day 1.
A fair criticism is that some of the more technical documentation, such as that on the Mach-O ABI, has disappeared from Apple's developer site, or been relegated to an unsearchable archive[2], so it's very reasonable to be skeptical of ISA extensions ever being documented there.
The developer site has always sucked though, and it seems to me they've declared bankruptcy and are rebuilding it on machine-generated docs and WWDC content for app developers. Occam's razor.
I just started getting into macOS and iOS development and the documentation is HORRENDOUS.
Even the classroom educational material Apple put out is incomplete, outdated, and often completely wrong!
Additionally, it seems to be impossible to compile anything in Xcode other than a hello-world without getting undocumented warnings and (non-fatal) errors logged, which, if you can find a matching thread on their forums, usually results in a "just mark it down as log noise" response.
No, it wasn't always this bad. The speed at which Swift has changed has left the documentation behind. I've complained about this numerous times.
Simply put, if you want useful docs, don't look at the latest version of the language. They literally stripped the useful notes and examples, and it's now a straight API printout.
If you use Stack Overflow (sarcasm), be prepared to constantly run into older versions of Swift and translate them to the latest. Feel free to submit updates to them.
So it does help to understand what changed between the language versions. I think anything past Swift 3 is similar to today's. Xcode does help a little, telling you (to a degree) when what you pasted makes no sense.
Good documentation will come. Apple's current priority is converting macOS to a closed OS like iOS. Once this is done, Apple can advertise how "easy" it is to build apps for both iOS and macOS with one codebase. Apple will then tout "courage" as they force all Apple developers to switch to Swift if they want to develop for their ecosystem. You will of course have to pay them for this privilege, and that of course is the end goal.
Yes. I have recently released my first game on the App Store. It is a simple game, and so this is probably the 3rd rewrite of it.
The second rewrite was when I finally had an Apple device, so I thought "I will do everything the Apple way" - I used Swift and SpriteKit, and tried to do everything exactly how I thought the Apple people would like.
I then dropped that laptop, and was told it would cost more than the price of the laptop to repair, so I decided to do another rewrite - this time, using my linux laptop.
The experience of writing it in plain documented C, and using libSDL2 for most of the IO was far superior. Writing my own physics engine instead of using the SpriteKit one made the actual gameplay better (because it meant I could easily fudge physics a bit to make it more fun).
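For a sense of what that looks like, here's a minimal sketch of an SDL2 loop with hand-rolled, deliberately "fudged" physics - all names and constants are illustrative, not from the actual game:

    /* Build with: cc game.c $(sdl2-config --cflags --libs) */
    #include <SDL.h>

    int main(void) {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("demo", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480, 0);
        SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_PRESENTVSYNC);

        float x = 320, y = 100, vx = 90, vy = 0;  /* ball state */
        const float dt = 1.0f / 60.0f;            /* fixed timestep */

        for (int running = 1; running; ) {
            SDL_Event ev;
            while (SDL_PollEvent(&ev))
                if (ev.type == SDL_QUIT) running = 0;

            /* "Fudged" physics: gravity plus a bounce that gains a little
               energy, tuned for fun rather than realism. */
            vy += 500.0f * dt;
            x += vx * dt; y += vy * dt;
            if (y > 460) { y = 460; vy = -vy * 1.02f; }
            if (x < 10 || x > 630) vx = -vx;

            SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
            SDL_RenderClear(ren);
            SDL_SetRenderDrawColor(ren, 255, 255, 255, 255);
            SDL_Rect ball = { (int)x - 10, (int)y - 10, 20, 20 };
            SDL_RenderFillRect(ren, &ball);
            SDL_RenderPresent(ren);
        }
        SDL_Quit();
        return 0;
    }

The nice part is that the entire physics "engine" is a handful of lines in the middle, and making the game feel better is just editing a constant.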
Only at the very end of the project did I then go through the painful process of making it happy with Xcode, happy to build for iOS, and then finally happy with the Apple review process (including hacking my SDL build to remove Bluetooth peripheral support so I didn't need to request unwanted permissions).
I now have an M1 MacBook Air, and I've briefly looked again at doing things the "Apple way" (e.g. SwiftUI) and I'm coming to a similar conclusion.
There’s a lot you get for free from things like SwiftUI, such as accessibility, support for right to left locales, portability all the way from the Apple Watch to the TV, etc.
I am also currently writing an app using SDL on Linux that I previously wrote in SwiftUI. It's a lot of fun, but it's just not comparable.
My SwiftUI version runs on the watch, the iPhone, the TV, and even in widgets with barely any modification. (And with much better fonts!)
The Linux version is fun, but it’s just not comparable in terms of the work the platform is doing for me.
I just started working with Swift again after building a couple toy apps when Swift was brand new. I feel your pain with their documentation!
What bothers me most is how their documentation for anything is a massive scrollfest[0], followed by links everywhere, yet the actual documentation still feels sparse enough that I almost exclusively try to just read the source code and make guesses. Rails docs sure aren’t going to win any design awards, but at least I can access everything on a single page![1]
Why not pretend Swift doesn't exist and just use Objective-C instead? It's stable, it's well-understood, there's a ton of examples and docs everywhere and it hasn't changed much in a decade.
I'm mainly an Android developer, so I've never made anything serious for the Apple ecosystem (only some tiny Mac apps for myself), but I just hate moving targets with a burning passion. It's hilarious to look at all the "community" with their Kotlin and support libraries and Jetpack and other abstract-the-evil-OS-away stuff changing and breaking everything three times a month, while the only changes I see myself are those to the raw underlying system APIs whenever there's a major OS release.
My point about the docs isn't the day-to-day stuff, but the things that cannot be machine generated - to write a good compiler you need to know the details of how the CPU is put together underneath.
Intel will give you 800 pages on how to optimize for their current ISA (AMD will give you 45, but still)
As I understand it, Intel will also sell you a compiler that can optimize better than GCC because they know internal microarchitecture details. To me, that seems like a more clear-cut by-design conflict of interest between chip designer and user than these undocumented instructions.
Intel's compiler is basically free for plebs but not for datacentres.
LLVM and GCC are good enough (brilliant, even) for general purpose compilation, but for a (say) supercomputer which is guarded by guns having Intel on the phone can be worth it.
Clang and GCC regularly go toe-to-toe with or exceed ICC and have plenty of architectural cost models for modern systems, in part thanks to the extensive documentation and free tools from Intel themselves. ICC does have some gimmick features like GPU offload, and (last I saw) much better auto-vectorization. But as far as run of the mill code goes, you aren't going to get some magical double-digit performance uplift on average scalar code that can only be attained by ICC and/or Intel doing subterfuge.
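To make "run of the mill code" concrete: a trivial loop like the one below gets auto-vectorized by GCC, Clang, and ICC alike at -O3. A generic illustration, not a benchmark:

    #include <stddef.h>

    /* GCC, Clang, and ICC will all vectorize this at -O3; `restrict`
       promises no aliasing, so the compiler is free to emit SIMD. */
    void saxpy(size_t n, float a, const float *restrict x, float *restrict y) {
        for (size_t i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }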
Beyond that, making a claim of a conflict of interest requires more than just "It seems like these words fit and sound right, to me." Because "Microprocessor company writes well-optimized code for their microprocessors and sells it" hardly seems like a conflict of interest at all in this case; it literally seems like a company selling a product that goes with their existing product. Absolutely everybody does this, including AMD, IBM, Nvidia, Apple, etc. It is not a "conflict of interest" for Intel to both design a microprocessor and also sell code that works well on it. What might be a conflict of interest is if they said "We have an obligation to release these specifications, but also we can make lots of money off them if we keep them private despite that existing obligation, so we'll just keep them private." Which is a much stronger claim. (In fact even the infamous "cripple AMD" legal SNAFU over ICC wasn't a conflict of interest; it was brought as an antitrust suit, AFAIK.) It's not like Twilio has a "conflict of interest" when they implement new features in their product portfolio, build them on Twilio's servers, and integrate them with other Twilio products.
In general, Intel is ahead of almost every other vendor when it comes to publicly-available information regarding performance optimization (first and third party), by a landslide.
GCC and Clang both generally match ICC’s performance, and in my experience beat it more often than they lose to it. ICC’s only real advantage is MKL and other bundled libraries, which are generally better than what free software has to offer.
You've hit on an important point here. As soon as Apple documents one of their CPU extensions, they've put themselves in the position of having to support it long-term. Leaving it undocumented gives them the freedom to make breaking changes from one CPU generation to the next -- hopefully with the end goal of deciding on an implementation and documenting the end result.
Apple must support base LLVM functionality because it is the core of their embrace-extend-extinguish strategy. That says nothing about whether they will extend that support to their undocumented secret sauce, as you imply.
Don't be absurd. LLVM was a research project until Apple hired Chris Lattner and poured money into it to make a viable alternative to GCC. They've been among the project's top contributors (financially and in terms of code) for 15 years. That's quite the long con.
Apple creates plenty of proprietary technologies but I can't think of a time they've been credibly accused of crushing an open technology that already existed.
They did that, in part, because they were the subject of one of the first GPL enforcement actions, originally being unwilling to open-source their Objective-C GCC frontend. "They" being NeXT at the time, but there's not a huge distinction these days.
Their investment in LLVM is very much an attempt to keep reproprietization on the table.
I absolutely believe that the investment in LLVM was justified to executives as a way to avoid GPL-licensed software, which Apple has long been allergic to.
But it should be noted that the GPL violation you refer to was in 1989. Put in context, that was ~16 years before they started investing in LLVM, and they have done so continuously for the past ~15 years.
It should also be noted that GCC was not suited technically, by design, for what Apple wanted out of a compiler at the time.
The JIT stuff they were doing with image filters for Core Image and Core Video and the IDE integration they were doing in Xcode could not be done in GCC at the time regardless of licensing without making massive changes, which would not have been accepted upstream.
GCC at the time was intentionally designed and implemented to make it hard to use parts of it independently, such as just using the parser to make a syntax tree for your IDE to use for highlighting or suggestions, or using the code generator in a JIT.
There had been suggestions and patches to make GCC more modular and separable, but they were rejected because being more modular would make it easier for proprietary software to interact with GCC components.
It was only after LLVM and clang started getting widespread use that GCC became more open to the idea of it becoming more flexible like LLVM.
Ultimately, whatever the executives think the reason is, is the real reason, as we saw with Oracle reproprietizing Solaris.
And yes, it was a while back that lawyers literally had to get involved in their blatant GPL violation in the compiler space. We absolutely get to judge a company on its past actions.
I'd argue that the Apple of today is closer to NeXT than to the Apple of that same time frame. It was literally Steve Jobs making the decision at the time to violate the GPL. The joke for the longest time was that NeXT bought Apple for negative dollars because of how much senior leadership was carried over from NeXT.
It's not like it was some random startup they acquihired for a couple engineers.
The way you put it, non-copyleft licenses are just there for a bait-and-switch. I don't know of any case where that happened when there were outside contributions.
Oracle reproprietized all of Solaris, including the outside contributions up to that point.
And I assume that Apple has support for these instructions this whole thread is about in their toolchains, so if that's the same version that's in Xcode, reproprietization is already happening.
Leaving reproprietization on the table is the whole point of non-copyleft open source.
It's as old as OSX, so it perfectly matches the 'long con' time frame. I just don't know why you go about calling an amoral business strategy designed to benefit from open source without giving anything substantial back a "con".
There were alternatives to LLVM, and patches to GCC that achieved the same. Apple's choice to build LLVM up by investing heavily in it had more to do with licensing than with technical aspects.
Put another way: "Apple pays project leader Bay Area salary + pre-iPhone AAPL stock to support open source development for more than a decade." (There are certainly more projects Apple relies on that deserve this!)
Most of Apple's extinguished open source projects are ones they started themselves and then abandoned.
... and in the CUPS case I could see it more of a situation where they wanted good printing support, saw that unix printing was a horrible mess, but that CUPS had promise, decided to bring it in-house in order to make it work well on macOS, and then after that was done, stopped really caring about it all that much.
Because printing is just so unsexy in general, and tends to get neglected everywhere.
Or maybe different people have different needs and don’t care if every device is “open”.
The vast majority of people are not enamored with technology for technology's sake. These are tools. Does the tool do what I want with the least amount of friction? That's what they care about.
Also, how does Apple's stance on one product change the device you are using to post in any way? Are Apple thugs coming to your house destroying non-Apple tech? Of course not!
It always amuses me how Apple can simultaneously be irrelevant due to their market share and also be the greatest threat to open computing by their temerity at daring to have different computing models.
If you don’t like their computing model then don’t use it. Railing that they aren’t supporting your vision lockstep is pretty freaking arrogant.
Their market position is strong because they provide value that people desire. The market is not a zero-sum game; Apple's mere existence does not diminish your computing experience. Indeed, Apple's existence and continual pushing of computing into new areas has benefited everyone over the last 40 years - whether you use Apple or not. Just think where computing would be if the attitude that you were only a real computer user if you assembled it yourself - at the component-to-board level, not just bolting pre-made parts into a pre-made case - had persisted. People criticized Apple for selling complete computers with the Apple II.
> Or maybe different people have different needs and don’t care if every device is “open”
And that is exactly the point you miss.
You are happy and don't care if Apple is open or not. Good for you.
Some of us are not.
And if you do not care whether Apple is open or not, what does it really matter to you if some of us advocate for more openness and transparency from them? It's important to us.
A terrible analogy, they aren't hurting anybody, something which would be fundamentally wrong. A better analogy would be that Apple is baking bread in a way that you don't like. Which is only a problem if you are forced to eat it.
>If you don’t like their computing model then don’t use it. Railing that they aren’t supporting your vision lockstep is pretty freaking arrogant.
If you don't like/agree/see value in this persons comment then why are you engaging with them? That's how silly your position sounds.
Now back to the substance of your comment.
>Or maybe different people have different needs and don’t care if every device is “open”.
How could "not open" be a need of someone?
>These are tools. Does the tool do what I want with the least amount of friction? That’s what they care about.
How does an Apple engineer documenting the design add friction in any way to what you're doing?
> Indeed Apples existence and continually pushing computing in new areas has benefited everyone in the last 40 years - whether you use Apple or not.
Apple has helped popularize the smartphone as a consumer device to consume music/media, do light business on (emails, IM, documents, etc), and do a few minor creative tasks digitally such as drawing and music. They do get a lot of credit for that. They have also co-created an application platform for other businesses to sell software. But then they robbed 30% of those sales too - they get negative credit for being so greedy.
Your argument that they have "continually pushed computing" is a bit of a stretch. Give us your best argument/examples/facts.
>In the bigger picture Apple aren't really documenting anything for their new toys,
This should not come as a surprise, it has always been Apple's default stance.
Apple has been, is, and will very likely always be a closed ecosystem.
They might sometimes appear or pretend to be open, but it's essentially good PR/marketing (e.g. stash Unix underneath a closed-source stack, call yourself open source and proceed to lure all the open source devs to the platform).
I think it’s more a sign of the times than Apple being Apple that they don’t tell you what hardware you buy when you buy one of their systems or how it is connected. Television manuals used to contain hardware schematics, car manuals used to be way more detailed, etc.
Apple also had detailed info on their hardware. The Apple II reference manual had schematics, mapped it to the PCB, described what each part did, described 6502 assembly and had an assembly listing of the monitor ROM. The only thing missing to build a copy, I think, was an assembly listing of the Basic interpreter.
The phone book edition of Inside Macintosh also had info on the hardware, but a lot less of it.
The keyword being had, but Apple has always been on the more closed side --- Inside Macintosh was far more pretty but overall less informative than the corresponding IBM PC Technical Reference books.
The Apple II was useless on arrival unless you could program it yourself. The iPhone is useful for 99% of customers while providing no on-device programming. Weird analogy.
This really isn't new. It looks like instead of treating the "Neural Engine" as an I/O device like the GPU, it's accessed as a coprocessor. It makes sense for them to only expose their functionalities through libraries as the underlying hardware implementation will go through changes in the future. After all, does, say, Nvidia, document their GPU's ISA?
> consider that if microsoft were like this the blind eye would not be turned
Microsoft was taken to court over undocumented APIs in Win 3.x that gave their own software an unfair advantage. This is why the Win32 documentation exists today.
As someone who was actually thinking about getting the M1 based on the great performance and battery life, but without any actual use for it, I can see what you mean.
But the lack of openness is enough to stop those ideas, a new Surface laptop or next generation AMD mobile is much more useful.
I just hope that a battery life breakthrough is on the horizon from AMD or Intel.
>I just hope that a battery life breakthrough is on the horizon from AMD or Intel.
Intel has their Lakefield chips which use the same big.LITTLE configuration but their first large release chips will be Alder Lake next year.
My concern is less with Intel and AMD's ability to implement the cores as much as Microsoft's ability to implement a scheduler. They never came to grips with the weirdness of AMD's Bulldozer architecture.
Process node is a large part of it. Fab a Ryzen at 5nm and it will at least be in the ballpark, though X64 decode complexity makes it hard to completely close the gap.
There is also nothing stopping someone else from matching or beating M1 performance with another ARM64 core or RISC-V. Both of those have simple, easy-to-parallelize decoders. It will be done if there is demand. AMD, Marvell, Qualcomm, and Intel all have the expertise to do it. Probably TI too, if they wanted to reenter the CPU market.
I think you have to look beyond 'technical' ability and factor in commercial incentives: like who would be buying these CPUs and how many? Zero chance of an AMD or Intel ARM core like this in the near future. Marvell have pulled out of the Arm server market IIRC. Qualcomm a possibility but they have underwhelmed in the past.
Nvidia would be my favourite post the Arm acquisition but they probably have their eye on other markets.
From an Apple perspective, openness is a feature, not a goal in itself. You're worrying about this or that specific ARM instruction, but Apple doesn't even want you to care whether it's ARM, or Intel, or PowerPC, or whatever. It's an Apple system, part of the macOS ecosystem and supported by Apple dev tools and power-user features.
If that's not the world you are interested in, what are you even doing owning a Mac? Apple does not try to be all things to all people the way e.g. Microsoft has.
> If that's not the world you are interested in, what are you even doing owning a Mac?
I would like consumers to have at least a semblance of freedom. You have to consider that for most consumers, almost everything mentioned in this thread from top to bottom is complete Greek to the average person buying a computer.
This particular question (documenting stuff) is the cherry on top. The sundae itself is Apple's locked-in approach to their ecosystem. The result of the Epic squabble is much more important, for example; similarly, I don't think it's a good thing that Apple can force Stadia to be non-native (if you were a lifelong Apple customer, would you know there was any alternative?).
The world is only going to get more technological - thanks to things like the cloud, the Orwells of the future will be writing about technology.
As a final remark, I believe there should be broad but shallow regulation for the whole sector, Apple are just the most obvious example.
That's true, at some level of abstraction you stop caring practically about openness. That line has shifted, and it points to hardware being commoditized more than lock-in. Do you care who made your logic gates or if it's documented?
> Do you care who made your logic gates or if it's documented?
Generally no. But it's something that's worth an attempt, no matter how clueless the attempt is; something positive may come out of it. Here's an attempt to implement a free and open source standard cell library using the 1-micron process from the 80s.
The implication, I assume, is that there would be pushback on Microsoft for doing the same because of the possibility of nefarious undocumented changes. What possibly nefarious things could an undocumented ISA extension do?
Presumably the next iteration of the processor can remove them and use the opcodes for something else. Apple recompiles their system library, hence is unaffected. You can't do the same so peak M1 performance is unavailable for you.
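To make that concrete: apps that pick an implementation at library load time survive such a change, while apps with the opcodes inlined do not. A rough sketch of the dispatch pattern in C - the sysctl key here is made up, not a real macOS one:

    /* Load-time feature dispatch: the pattern that lets a vendor drop or
       renumber opcodes between CPU generations without breaking third-party
       apps. "hw.optional.example_feature" is a hypothetical key. */
    #include <sys/types.h>
    #include <sys/sysctl.h>
    #include <stddef.h>

    static void op_portable(float *v, size_t n) { (void)v; (void)n; /* plain-C fallback */ }
    static void op_special(float *v, size_t n)  { (void)v; (void)n; /* would use the special instructions */ }

    typedef void (*op_fn)(float *, size_t);

    op_fn select_op(void) {
        int has = 0;
        size_t len = sizeof has;
        if (sysctlbyname("hw.optional.example_feature", &has, &len, NULL, 0) == 0 && has)
            return op_special;   /* only on hardware that reports the feature */
        return op_portable;      /* everywhere else, including future chips */
    }

Apple's own libraries can do this (or simply be recompiled per chip); a third-party binary with raw opcodes baked in cannot.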
Well the point of the question is trying to establish if it’s even possible in the first place. If there is no credible way, then there is no need to fear there are nefarious reasons for implementing such a change for either Apple or Microsoft, outside of restricting third parties. If there is, then it’s as you say.
Apple can make tools that use the undocumented ISA, which they don't have to release to the public, potentially using it to develop applications with performance advantages that no one other than Apple can attain.
It's similar to how Microsoft doesn't make all APIs on Windows public and uses non-public ones for various reasons--except this is in hardware.
It's possible the undocumented ISA is used to speed up x86 emulation and might be kept secret due to intellectual property concerns or similar.
> Apple can make tools that use the undocumented ISA, which they don't have to release to the public, potentially using it to develop applications with performance advantages that no one other than Apple can attain.
Why would they? They sell hardware, not really software.
Hardware doesn't work without at least some sort of software.
I really wish they (or any one with modern fast CPU and motherboard designs) would sell hardware only bare-bones with no OS or even bootloader, but history shows a mass market won't support that.
But they make more money from the App Store and the other services, and they are always giving advantages to their own stuff on iOS, like using private APIs or bypassing security rules.
It's starting to grow, but on any given platform of theirs, the overwhelming majority of their income still comes from the actual hardware, and the services revenue for that platform is maybe 1/6th or 1/8th of what they make from the hardware for it:
(within this, you probably want to break out 5/6 of the services revenue and tack it onto iOS, and then treat the remaining 1/6 sliver as applying to the mac).
This is honestly one of the biggest peace-of-mind things for being an apple customer, and it's definitely a big cultural trope within their support network as well. You pay your "union dues" by buying the hardware, and then "you're in". There's no money to be made after that fact, so there are an enormous number of small interactions after that that are fully trustworthy - for example, unlike many internet-ad-revenue companies, you're not worried about being profiled or having your info sold to advertisers. You're also not worried about "product collapse based on shaky revenue streams" - one thing that's always been scary about using various consumer apps is there are a fair number of programs out there that just have really shaky revenue models, so it's scary to commit to them because a lot of them eventually have to pivot, or sell out, or otherwise do something brutal to stay alive. When using some of apple's hardware, you know if it's selling decently, the business is stable and certain rugs won't get pulled out of under you.
Only the software that directly competes with Apple's own software. So for example Spotify was denied access to the special APIs for a while, browsers can't use their own engine, and Apple apps can bypass firewall rules - but games with lootboxes are welcome because they don't compete with Apple and a 30% cut goes into Apple's pockets.
Ouch. It's exactly this mentality that enables malicious actors - "it could be worse". It could always be worse, but that doesn't make it sufficient justification.
Functional equivalent:
"It could always be even more anticompetitive. Let's be happy with what we got."
Eh, what? What malicious actors? What sufficient justification, and justification of what and to whom? What could be worse? What could be more anticompetitive, and competing with what exactly? I said nothing like this, your "equivalents" are from an entirely different world.
I said we should be happy Apple decided to open their platform - because they didn't have to, at all. It's entirely their right to not open anything, not write any documentation, or not even allow 3rd party software at all. If you don't like it, don't buy their overpriced premium products - it's not for you. I wouldn't want my own company to be forced to do/not do stuff, so of course I don't wish that upon Apple. There are enough entirely open computing platforms that are VERY cheap to obtain and easy to use, perhaps that would be a better choice for you.
Meh, nothing serious, just that the "it could be worse" line of thinking can be a little dangerous if broadly applied since it could always be at least marginally worse, that's all I meant.
Completely agree about forcing them to do or avoid strategies, though governing bodies may disagree!
Jokes aside, it seems that despite occasional marketing/rhetoric from Apple, their platform represents the minimum bound of "openness". It's only open enough to allow further business - e.g. allowing 3rd parties to develop apps in the store (from which they take a cut of proceeds) - rather than being open for the spirit of it, if that makes any sense.
They are a business so that's not exactly something one could fault them for, though.
Yes, what I am saying is that we can't fault them for being a business. We should be looking for our open computing platform elsewhere - I think there are more than enough options. If a supposedly "open" platform weren't open, that would be bad, but Apple is about something different: offering a premium vertically integrated product for people who don't even know what a CPU or OS is. If they were forced to open their platform, it is very much possible there wouldn't be any Apple platform at all, and I don't want that kind of world.
> though governing bodies may disagree!
That's why I am reacting so strongly. If people begin to apply your thinking too broadly (e.g. to businesses like Apple) it will lead to governing bodies behaving like that. Let's apply this kind of thinking to platforms that claim to be open.
> Apple aren't really documenting anything for their new toys
This seems an odd strategy for an accelerator. How are they expecting that people will use these new instructions? Only through Apple-supplied closed source compilers and libraries? (This is not meant to indicate you're wrong, only I'm confused about Apple's strategy)
A difference, of course, is that Apple can ship it with its OS, and does so.
I would guess they're not (or not yet) willing to commit to shipping these instructions in future hardware. Possibly - though this is more guesswork - because they have to work around bugs to use them; the NOPs this article is talking about might be there to work around those bugs.
> How are they expecting that people will use these new instructions? Only through Apple-supplied closed source compilers and libraries?
This sounds right. Forcing people to use Apple supplied and supported libraries allows Apple to make major changes in hardware without breaking compatibility (there are always edge cases, though).
If the compiler generates these instructions, won't old binaries break if Apple later make changes in the hardware? (For shared libraries it would be "ok" because the old binary could use the new library, as long as Apple didn't also break the ABI).
Accelerate.framework is a shared library distributed with the OS. Like most other system libraries, it’s not available as a static library, and though you could theoretically ship a copy of it in an app bundle, you’re not supposed to. Instead the ABI is kept backwards-compatible.
There’s no publicly available compiler, proprietary or not, that produces binaries that use the instructions.
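Concretely, the sanctioned route is to call something like vDSP and let the dylib shipped with the OS pick the implementation for the current chip. A minimal example using the real vDSP_vadd API - whether it touches the undocumented instructions under the hood is anyone's guess:

    /* Build on macOS with: cc add.c -framework Accelerate
       The OS-supplied dylib decides how vDSP_vadd is implemented, which is
       what keeps old binaries working across hardware changes. */
    #include <Accelerate/Accelerate.h>
    #include <stdio.h>

    int main(void) {
        float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, out[4];
        vDSP_vadd(a, 1, b, 1, out, 1, 4);  /* out[i] = a[i] + b[i], stride 1 */
        printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
        return 0;
    }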
My life is so much better being in the Apple ecosystem.
Some of it was unintentional, but now it's iPhone, MacBook Pro, Apple Watch, AirPods Pro, and IoT devices that use HomeKit. I have an iPad but I use it the least.
Pre-Intel, I used to make fun of "Apple fanboys". But aside from the nicheness and incompatibilities of the PowerPC architecture back then, that attitude turned out to be one of a lot of immaturities I had and was raised with.
I also used to use Android, but the "look how many advanced, poorly integrated and under maintained things I can do by myself" concept didn't stay appealing for very long.
This is a non-sequitur. There is nothing stopping Apple from doing things in an open manner - it's purely cultural. The fact that they don't is why I do not buy for a second that they can be trusted with privacy, whether they care or not.
The quality of Apple's ecosystem is because of the centralized effort put into it and them owning the stack, not because the end result is walled-in.
When someone tells me Apple really cares about user privacy, I point them to how Apple has been deliberately crippling their wares to actually spy better on their users. Some of the recent events include:
- Crippled macOS Mojave+ to let Apple know every app you open (in the name of malware protection, but sent unencrypted over the internet).
- Crippled macOS Big Sur to allow any Apple whitelisted software to bypass any user application firewall blocks and some VPN apps.
- Even iOS, which has a whole lot of "do you give this app permission" prompts for loads of stuff, leaves a glaring hole open: it doesn't let the user allow or block an app from connecting to the internet (on WiFi).
iOS devices have had a file system users can at least save and retrieve from for years now.
The mobile app landscape has changed from the gold rush early last decade, and I rarely install anybody's apps. I empathize with the developers still trying to make it there, but it's not a user's problem.
The protocols that Apple chooses to promulgate work really well.
Flagship games work really well.
The garden has gotten really big, I can't see the walls anymore but little bad gets in.
Computers do what I want them to, here. They don't do what I want them to do outside of here. For context, I jump into a Windows instance, or routinely put compute instances in the cloud.
If all that works for you, that's mostly fine, and for the most part I don't think you should feel like you have to defend yourself. But two things:
1) Not all of us are like that. Admittedly I'm very far in the other camp: I run Linux as a desktop OS, and have done so for most of the past 20 years. For much of that I even ran Linux on Apple hardware (because I really did love their hardware), but a few years ago finally gave up due to the increasing frequency of undocumented things that don't work. But there are many of us (less extreme than I am, who would otherwise be content to run macOS and iOS) who absolutely do not buy in to a world where they don't truly own their devices.
2) There's a pervasive worry around here that Apple's approach will catch on, the end result being that full control over your own hardware and software will be a near impossibility. (Or, you'll be able to have it, but then be denied access to things like streaming media, and government and financial apps, a trade off people shouldn't have to make.) I think that's a very real possibility in the longer term, and I think the end result of that is total surveillance and a world where corporations decide what we're allowed to do with technology. And I think for this reason there can be a bit of hostility toward people such as yourself who are happy in the Apple walled garden and seem unconcerned about the future implications of this kind of computing model.
As an aside, I also reject the idea that it's necessary that the user give up agency over their devices in order to be protected from malware. A walled garden -- no matter how far away the walls are -- is not a requirement to keep bad stuff out.
>The fact that they can be like this is really proof that their market position is much stronger than most HNers seem to think it is.
Elsewhere on the HN front page is a computer made with a Russian internal CPU. The CPU has a custom ISA based on x86, and isn't open. The documentation is poor, and nobody has the source code for the compiler. It's also about 5 years behind the state of the art in x86. How can a product like this exist? Easy. Russia needs them for military hardware because they're not going to put Intel in their defense radar systems. The manufacturer has a niche to sell to.
Instead of being 5 years behind, Apple's closed and undocumented ISA is 2 years ahead. If the Russian CPU can survive in the market, how much more easily can Apple's?
Seems clear that Russian CPU is targeted towards the Russian defense industry, where all that NIH and secrecy can arguably be leveraged into national security, as it's a less cost-sensitive application.
Of course, and Apple's ISA is targeted towards the People Who Buy Macs industry. By not making it available to datacenters, they are supporting other CPU vendors in the race to catch up. It will help AMD a lot in their bid to catch up that all investors can see that Apple won't be competing with them.
> By not making it available to datacenters, they are supporting other CPU vendors in the race to catch up.
This is by far the most charitable characterization of this monetization scheme I've seen so far.
Some would argue this is actually in place to ensure that purchases go directly through Apple each time (i.e. a one-to-one of customers to hardware purchases), rather than through a cloud provider who may purchase hardware once and allow thousands of customers to access it.
cf. Apple's restrictions on subletting use for no less than a single day per "machine".
"Apple is extracting value at monopoly-pricing levels" == "Apple is leaving tasty treats on the table to lure competitors." They're the same statement but phrased in different ways. Any price above the equilibrium lures competitors - that's how markets avoid monopoly pricing, in the absence of external factors that protect the monopoly.
It's not surviving in a market, because it's not in a market - it's being produced for a specific consumer, whose requirements are idiosyncratic enough[0] that the market refuses to service them.
0: a very low bar in this case, given the existence of things like Intel Management Engine, but it would be much the same if they insisted on something less justifiable like running all the CPU pins at 12-volt logic levels
The key word is market - the Elbrus is obviously propped up by the Russian government and its tentacles. It's not an invalid business model, but it's not a good way to measure technology.
It's actually not that different from Apple Silicon when you think about it.
Apple Silicon: you only get it when you buy an iPhone, iPad, ARM Mac, etc. in the Smartphone or PC market; and it enables features that make those devices competitive.
Russian Elbrus: you only get it when you buy some Russian military hardware in the defense market; and it enables features that make those devices competitive.
M1 does not compete with Intel or AMD. (M1)(Apple)(OSX) competes with (Intel | AMD)(choose any OEM)(Windows | Linux). All sales of the M1 chip are "guaranteed sales" - to Apple itself, building Macs.