It's funny how so many people are suddenly mischaracterizing the Apple II or the nature of early personal computers.
The very first personal computers, like the Apple II, were sold as development kits to developers/hobbyists because these were the only people who would even think to buy a personal computer in 1976 and 1977. These development kits contained all of the information one would need to create software and hardware for these machines. Is that so unusual?
Fast forward one short year to 1978 and Apple hires Jef Raskin to start the "McIntosh" project. His goal is to create a $500 computing appliance that the average person could own and use. Sound familiar?
Fast forward 32 years. There is more information published about the internals of Linux, Windows, OS X, and PC architecture than most developers would ever want to know. We can write everything from low-level drivers that run in the kernel to high-level scripting languages. We can design a custom hardware card to slap in a PC or Mac Pro; we can design custom hardware that connects to an iPhone or a laptop via USB. And the iPad is arguably the closest we've come to Jef Raskin's 1978 vision of a computing appliance.
> The very first personal computers, like the Apple II, were sold as development kits to developers/hobbyists because these were the only people who would even think to buy a personal computer in 1976 and 1977. These development kits contained all of the information one would need to create software and hardware for these machines. Is that so unusual?
Although I was very young, and it was a long time ago, I'm not sure I agree with you. My dad purchased an S-100 bus system from George Morrow's Thinker Toys in 1977. It came with CP/M, there were games, including Star Trek and Adventure, there was even a word processor of some kind (not WordStar, something else). It certainly wasn't used as a "development kit" -- outside of maybe knowing a little BASIC, my dad didn't know how to program and used the computer for writing letters, and for playing games (which drove my mother nuts :))
Also: you can't be a successful technology company, across many decades/scales/markets, running the exact same strategy. The terrain changes too fast.
Indeed, it is the overwhelming triumph of openness in many spheres -- the cheap standard components, open-source operating systems, open protocols -- which means other 'sealed appliance' strategies need to be tried, to explore those areas not ideally served by total openness.
The sealed iPhone/iPod isn't a rival to openness, but a necessary complement and outgrowth. Just look at the Settings > General > About > Legal page. You'll see 20+ pages of mostly open-source license/copyright declarations.
I almost entirely agree with your post, though I'll note that the Apple II was unique specifically because it came pre-assembled in a plastic case and was marketed to "regular people". Still, obviously most users were hobbyists, just less "hardcore" than those who assembled their own Altairs.
I thought the original Apple (Apple I) was the version that came as a kit.
Wikipedia agrees. The circuit board was complete, but the purchaser needed to "build the case".
> However, to make a working computer, users still had to add a case, power supply transformers, power switch, ASCII keyboard, and composite video display.
The Apple II was not a kit (though there was a board-only version available if you asked). You're thinking of the Apple I. The Apple II was a complete, assembled system in a box from the factory. You bought it, plugged it into your TV and tape deck, and started coding. Frankly, it was the most out-of-the-box-integrated programmable hardware available at the time.
They came with schematics because all expensive electronic equipment back then did. This was back in an era when electronics broke down frequently. Schematics were needed to fix things when they broke. If the owner didn't fix it, then an electronics repairman did.
You don't get schematics with anything today because you throw it out if it breaks rather than pay someone $100/hr (what auto mechanics charge) to fix it for you.
Intellectual property is also a bigger concern today than it was back then. By not giving you the schematics, they force you to do the tedious (but easy) task of reverse engineering their product.
Woz has always been about being open and modular and extensible. Jobs has always been about the opposite -- and for good reason; the early Macs were typically more stable and functioned better simply because of that lock-in.
But history has played this over and over again, and Woz always wins. Vendors work out standards; the user experience consolidates.
If you give people the freedom and trust them, they'll work it out. Or you can trust a benevolent dictator to work it out for you. Most Apple aficionados will say that most people don't want that freedom; I'm here to tell you that they do.
We don't have to debate. Let's just tune in and watch the sales of the iPad starting next month, after the faithful have all bought theirs. Woz will win. Woz always wins.
What's with this constant invoking of the 'faithful'? I work at a place that sells the iPad and I've had a lot of people ask me about it. The people who are buying the iPad aren't exactly fanboys, who have been slavering over anything that Apple would put out. They're normal folks. A lot of them don't know any real specifics about the iPad, but they have the feeling that it would really be more enjoyable to use than their netbook. The notion that these Apple products only have niche appeal (because normal people value functionality and openness over prettiness) is kind of bonkers in the face of the extremely strong mainstream popularity of the various iPods as well as the iPhone.
The iPhone is really open and functional compared to the phones that preceded it. Just downloading a ringtone for most prior phones was more expensive than full-fledged games on the iPhone. The app store is far from perfect, but it's a very open market compared to anything I've seen out of Nokia or any of the Japanese handset makers.
It's funny how easily people forget this - Apple created the first MARKET for mobile apps. That's openness. Before the iPhone the mobile game was one big scam, with someone reaching into your pocket at each and every step in a long chain.
> Apple created the first MARKET for mobile apps. That's openness.
I distinctly remember being able to install software I downloaded (or made) on my Sony Ericsson P800 a good couple of years before the iPhone was announced. In fact, digging back a little, I am sure you could install PalmOS apps on your Treos from the first day they launched.
And anyone could develop for both platforms (actually, there was Java ME too), anyone could host the file you had to download to install an app, and specify how, and if, you would pay for the download. In fact, many application stores appeared hosting zillions of smartphone apps you could download and install.
Apple's market is not open at all. Apple decides if you will be able to sell the fruits of your development work.
What Apple did is to corner both users and developers. And that's bad.
How many people did? Why so few? No market to speak of.
They made things one level free-er, which is the best they could have done given the constraints of the market and the technology. There are issues. They aren't villains for having issues.
A lot of carrier-branded phones have features like that disabled in favor of only allowing the user to purchase ringtones from the carrier's marketplace where you pay upwards of $3-$4 per ringtone (i.e. ringtones, at 30s or less, cost more than buying the entire song on iTunes). It is/was vendor lock-in at its worst: create a captive audience, and then price gouge them because they don't know any better (i.e. "My old phone didn't have ringtones, but now I can buy ringtones from Verizon! Yay!"; without realizing that the phone itself supports user-created ringtones that you don't have to pay for).
It's sort of the same for the iPhone now. People think it's great because they are used to phones with limited functionality, openness and usability. Compared to those phones, the iPhone is a breath of fresh air. Apple's just playing a game of, "what they don't know can't hurt them." People don't sit down and think about exactly what their ideal device is like. They see the iPhone/iPad and think it's neat and mold their expectations around what they see (especially if their expectations were lower to begin with).
> A lot of carrier-branded phones have features like that disabled in favor of only allowing the user to purchase ringtones from the carrier's marketplace where you pay upwards of $3-$4 per ringtone (i.e. ringtones, at 30s or less, cost more than buying the entire song on iTunes).
I would like to point out this is an American phenomenon. At least in India, the mobile market is completely open.
This. I think the iPhone was a complete boon to the US mobile market, which was suffering and locked down. In America, the iPhone is one of the most open handsets out there. However, everywhere else, where carriers have less of an ability to be quite as locked-down and evil, they're one of the least open.
So if someone pre-generates a strange file format for you, it's almost as simple as on my 5-year-old Nokia.
Or there's a 17-step process.
The way the iPhone handles its ringtones is a step back from how most normal phones did it. (Note to American readers: if your carrier blocks a function, it means you have a bad carrier, not a bad phone.)
The iPhone excelled in having a great browser. And looking gorgeous. As far as the phone parts go (crap reception/quality, struggles with group SMS, ringtones, etc.) it was a step backwards from most phones on the market. Now that might be an acceptable trade-off, but it still needs to be stated.
> Let's just tune in and watch the sales of the iPad starting next month
We already watched the sales of the iPhone, and we saw how that went. Same type of device, same type of complaints by openness advocates, and what was the result? Are you guys going to just keep on saying this stuff until some closed Apple product fails, then say that's proof that people want openness, ignoring the litany of preceding closed devices that succeeded? I'm here to tell you that this is not a very good argument.
In all likelihood both will continue to be part of the mobile ecosystem for some time, or until Google gets tired of throwing money at the problem.
But as I said to another responder, the iPhone is one device on one carrier per country, and Android is many devices on all carriers. They are hardly on equal footing for a purely numerical battle for superiority, but still iPhone is winning.
Google is making real money with Android, and even if it's a loss leader it's just another medium to expand their search business. They can afford to lose the millions in Android development and they still come out winning.
What money do you suppose they are making? It's probably cheaper to pay Apple to maintain Google as the default search engine on the phone than to make their own. It's unlikely that a whole lot more internet searches are happening because of Android that wouldn't have happened in any other arrangement of the ecosystem, so I still don't see how Google benefits.
Yes, that's an option. But consider this: the iPhone becomes the number one smartphone out there -> Mobile Safari has the highest mobile marketshare of any web browser -> Apple controls the mobile web. What Google ultimately wants is an open web so they can keep making money from AdSense/AdWords.
"They are hardly on equal footing for a purely numerical battle for superiority"
That's Apple's choice. They chose to lock in with AT&T; they could have not locked in and sold many, MANY more. So yes, it's equal. If anything it's less equal for Android because it came in later.
Considering that the Android product options prior to November were pretty weak, the momentum in Android is partially based on decent products finally being available to consumers. Unfortunately, I think Google's N1 distribution strategy isn't helping matters.
Granted, but irrelevant to the claim I was making. Also I'd like to point out that that's one device on (normally) one carrier per country, being pushed by only one company. The others are all being pushed by consortiums or multiple phone companies and are available normally on all carriers yet they still lag behind.
The consumer isn't choosing the iPhone or iPad because it is closed, they are choosing it because it is easy to use, sexy, and supported by lots of apps. You can still have those things and be open, and Android is proving it.
Android is neither as easy to use nor as sexy as the iPhone. The dismal sales of the Nexus One despite its superior spec sheet and androidness are proof of this.
But I do agree with you, most people buy things out of which they get the most utility, and ease of use and sexiness are a big part of that. But the person I was responding to made the claim that people won't buy the iPad because it is closed. It would only be possible to believe this if your head was buried deep in the sand.
> Android is neither as easy to use nor as sexy as the iPhone. The dismal sales of the Nexus One despite its superior spec sheet and androidness are proof of this.
True. But if you compared those numbers to the sales of the iPhone 3G or 3GS, they would look less rosy. What you've got there is a non-smartphone without an app store on a smaller carrier being compared with a smartphone with an app store on a larger carrier that's famous for not having many options in the phones you can select because of its CDMA network.
Anyway, the whole discussion of which is doing better is completely unrelated to my original objection to the top-level poster.
You should especially not say that since the comparison is not really fair, considering the iPhone OS was an unknown quantity, and iPhone lacked an app store, was on a smaller carrier, and was not launched during the holiday season.
Sales are not complete proof of sexiness, but they are evidence. It strikes me that if the iPhone on a single carrier can win out against an entire ecosystem of phones on many carriers, it probably has something they do not.
> The dismal sales of the Nexus One despite its superior spec sheet and androidness are proof of this.
In a carrier-locked-down market like the US, where the norm is to buy phones that are married to a given carrier, it only proves Apple has better marketing and carrier relations than other companies.
Also, Android is not a phone - it's a platform. The Nexus One is only one of many competing offerings that compete not only with the iPhone, but among themselves.
I think the effect is real, but of course it's not as direct as this. Now that the iPhone has a viable competitor that's more developer friendly, it seems likely that apps will start to appear first on Android, and then only later on the iPhone. I think that might affect consumers' perception of which is the sexier device.
Having a platform that's easier to develop, publish and distribute software for is only half of the "developer friendly" equation. The other half is having a user community that's willing to spend money on that software.
Are there many Android developers who've been able to quit their day jobs and make a living from app sales? (That's a serious question, not a snarky one -- I don't follow the market closely.)
Seems like if that were going to happen it would have happened already, or we'd at least have seen some sign of a sea change. Yet the ports still flow in the opposite direction (when developers even bother). Maybe because Android appeals to the thrifty, and iPhone to people more open with their wallets?
The other explanation is that iPhone is plenty open to developers who make the types of apps people want to buy, and closed only to the tinkerers who would not have been making salable apps anyway.
Is Android really more developer friendly? A very profitable application category on the iPhone is audio applications. That category is a nonstarter on Android because Java latency causes noticeable sound artifacts.
Keep in mind it's not 1976 or 1984. Things are different today. Technology is no longer reserved for the technical elite to enjoy or fetishize exclusively. If you want to draw a line in the sand to figure out when things changed, I'd say 2001 and the introduction of the iPod -- an incredibly closed but amazingly compelling and easy to use mainstream device that smashed its open competition with no mercy. This was about the same time we saw the rise of the cell phone in every pocket, which was similarly closed and restricted. Around the same time we saw the release of the PlayStation 2 which, again, is a closed and restricted platform that has gone on to be the best-selling gaming console in history. Three years later we see the release of the Nintendo DS -- one of the most successful mobile electronic devices in history. Closed & restricted.
I want both. People are getting anxious about the iPhone/iPad but overall, look at recent history. It's good to have several approaches going at once. Linux/OSX/Windows is a good example. Go back in time and remove any of those from the game and you get a worse present.
I think Apple is largely less capable than MS of holding a problematic monopoly for long (at least with their current m.o.). That would require being everything for every person to plug all potential competition holes. That comes with hairy compromises that Apple probably won't want to make.
So far, Apple hasn't done any net damage. They are getting things done and moving things along fast. It will take either a real slow down in their innovation or some serious damage to flip that balance.
"Wozniak's design was open and decentralized in ways that still define those concepts in the computing industries. The original Apple had a hood, and as with a car, the owner could open it up and get at the guts of the machine. Although it was a fully assembled device, not a kit like earlier PC products, Apple owners were encouraged to tinker with the innards of Wozniak's machine—to soup it up, make it faster, add features. There were slots to accommodate all sorts of peripheral devices, and it was built to run a variety of software. Wozniak's ethic of openness also extended to disclosing design specifications. In a 2006 talk at Columbia University, he put the point this way: "Everything we knew, you knew." To point out that this is no longer Apple's policy is to state the obvious."
I suspect the resurrection of this vision is what will begin the fightback against Apple's closed-universe vision. If I could get the same (or better) hardware, with roughly the same form factor as the iPad, with a lot of connectors and a completely hackable software stack for a decent price, that would be awesome. The only competition shaping up on the hardware front seems to be HP's Slate. Maybe I should buy one and install Linux (or something else) on it when it comes out.
(If I am wrong, correct me: what is a good tablet that competes with the iPad?) I would love to see something built around an ARM processor, for example, but building hardware is a lot tougher than in Woz's days. As a thought experiment: if just the hardware part of the iPad were available for, say, $350 or so, and it wasn't closed but completely open like the original Apple, so we could hack whatever we wanted onto it, how many of us would buy one? I would. If I had the hardware chops I'd build and sell this myself.
Linux is awesome, but there isn't a competing (with the iPad) hardware platform to run it on. Hopefully someone will have the cojones and talent to go up against Apple soon. Remember, Microsoft was once the unstoppable juggernaut on track to dominate all of computing.
I hope to live to see the day of the withering of Apple.
Always Innovating's Touch Book is probably close to what you want. The hardware specs are open, there are pins for serial on the board, and it can run a variety of ARM based Linux distros (Android, Ubuntu, etc.):
http://www.alwaysinnovating.com/touchbook/
That looks very nice, except for the abysmal 1024x600 resolution. I'll give Apple credit for having the iPad resist the inexplicable war on vertical pixels.
*I would love to see something built around an ARM processor, for example, but building hardware is a lot tougher than in Woz's days*
indeed, the days you could buy four microswitches and some wood and have a joystick that worked as well as one from the shops are long gone, yet people still compare the iPad to a >30 year old computer which came with schematic diagrams.
What would most computer hobbyists do with iPad schematics? Why would you buy an open iPad chassis? Did you buy an OpenMoko phone?
What if someone sold a 30-pin dock connector board with an Arduino on it?
Or what about an app that exposed the iPhone OS APIs through RPC to programs running on an Arduino board with Bluetooth? This kit might make for a nice product!
I found this quote from Wozniak in Founders at Work to be ironic.
"You'd go to the store and they'd just have all this stuff that you could buy to enhance the Apple II. So one of our big keys to success was that we were very open. There's a big world out there for other people to come and join us."
But all of those were things that were left out of the Apple ][ project for various reasons: either they didn't exist yet (Disk ][ being the prime example), or were too expensive to add to the motherboard.
What exactly is lacking from modern hardware today? Not enough communications capability? Screen isn't large or hi-res enough?
The expandability has moved to software. That's the App Store. People are customizing their machines and adding capability, but they're doing it through the SDK and not the schematics.
Any restrictions on modern hardware are likely to be strategic. For instance, the reason you can't make voice calls with the iPad and that it doesn't have a camera is most likely because that would mean you no longer need an iPhone.
The parts would have added another $5 to the total cost, if that. It also gives them a reason to sell you a new one a year or two down the line. Along with less anemic RAM, needed for this amazing new feature called multi-tasking.
From a practical consumer standpoint you can also buy lots of accessories to enhance your iPod/iPhone/iPad. Apps, cases, car integration kits, home players, even a little dongle you can stick in your shoe for running, a credit card swiper, etc. Obviously it's not open but it's the same practical effect in the end. The ecosystem around everything i____ is a big part of Apple's success today.
and a few other good accounts of the period, it really annoys me that Apple gets the credit for inventing the personal computer. Theirs was not even the most popular personal computer of the 80s (or 70s). I have a hard time finding anyone who owned an Apple II, but most of the people I work with were Commodore 64 owners.
The winners get to write the history -- and in the end, more people had more to do with Pagemaker, Illustrator and Photoshop (which were, in practical terms, Mac apps) than they did with the far-more-spectacular (for the time) Video Toaster and Caligari (Amiga). It's never been the machine; it's what people can do with it. If prosumer video was more accessible and lit more imaginations at the time, we'd be singing a different song today.
By these standards, hasn't Steve Jobs been notching up final victories over Woz for at least 25 years? In fact, I'm pretty sure I've read people claiming it repeatedly! As far back as the early 80s, the Mac marks the turning point where Apple became primarily focused on a self-contained, don't-touch-the-tech computing appliance. The introduction of the Mac, and the way it won out over the Apple II successors, also marks Woz's personal influence in Apple effectively coming to an end.
It's not as if this is a recent idea Apple's had. If anything, OS X's relative openness was the aberration, given their history. Despite how exciting we find the Apple II era, I tend to think of the Mac as a bigger part of their history, certainly in terms of the Mac era's influence on today's company.
No Mac ever prevented you from running and distributing whatever software you wanted. And while there wasn't a command line before OS X, there were plenty of opportunities for hacking via extensions, ResEdit, HyperCard, etc.
There are multiple 'axes' to this victory business, and this one - the money angle - is only a single way of interpreting all this.
If I could choose between the two of them as human beings I'd pick Wozniak for sure; he's one of the nicest 'well to do' people that ever came out of Silicon Valley.
Jobs is simply still in a pissing match with Gates, and it looks like he will be able to prove some day that Apple's view of the software market was even more closed than Microsoft's, and potentially far more profitable.
I bought a Mac for my son a couple of years ago, I'm beginning to regret that decision. At the time it was to open his eyes to the fact that not every computer on the planet runs the same operating system and that diversity is good. Now I'm not so sure that an Apple was the right way to express that (he already had an older linux computer to play with blender on, but it was definitely past its prime).
Money is a very convenient yardstick to measure success by, but it collapses a lot of data into a single number and it does not tell you the history of how it got there.
Clearly 'open' is never going to make as much money as 'closed', the RIAA and MPAA are all too aware of that, but longer term 'open' will always win because stuff that is special today is a commodity tomorrow.
Remember the times when each and every piece of electronics came with its own weird set of proprietary protocols and connectors? Now it's all IP and we're better off because of that, except for the lawyers.
Some sort of hacking sandbox is needed for the iPad. Something like HyperCard would be dandy. Then again, aren't web apps already available? Safari is a very good JavaScript platform. Maybe an iPad version of iWeb with a "local server" and some sort of app publishing/sharing via MobileMe?
Actually, the beauty of this approach is that a third party could come out with this quickly, using App Engine or Amazon as a back end. Combine that with an iPad app that opens a special-purpose browser/IDE to run them in, and I think you'd have a business.
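To make that concrete, here's a minimal sketch of the kind of thing I have in mind, assuming nothing more than Mobile Safari and a page with a single <div id="card"> on it. The stack format and the go/render helpers are made up for illustration, not any real API: cards are plain data, the "scripts" are snippets of JavaScript, and publishing a stack is just uploading that JSON somewhere.

    // A HyperCard-style "stack": cards are plain data, handlers are tiny scripts.
    var stack = {
      cards: [
        { id: "home",   text: "Welcome! Tap to flip.", onTap: "go('second')" },
        { id: "second", text: "Another card.",         onTap: "go('home')" }
      ]
    };

    var current = 0;

    function go(id) {
      // Find the card by id (old-school loop so it runs in 2010-era Mobile Safari).
      for (var i = 0; i < stack.cards.length; i++) {
        if (stack.cards[i].id === id) { current = i; break; }
      }
      render();
    }

    function render() {
      var card = stack.cards[current];
      var el = document.getElementById("card");        // the one element the page provides
      el.textContent = card.text;
      el.onclick = function () { eval(card.onTap); };  // HyperTalk-ish: the card carries its own script
    }

    render();

A hosted back end (App Engine, S3, whatever) would only need to store and serve those JSON stacks; the special-purpose browser/IDE app is mostly chrome around this loop.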
Personally I'm getting tired of hearing about how Javascript-enabled web apps are good enough. No, they aren't.
Yeah, it's exciting that Quake2 runs in Chrome/Safari, it really is. But in 95/96 I was participating in the demoscene, with graphics that weren't GPU-accelerated, in 386 real mode, and it was a lot easier and a lot more exciting (demos that got distributed by a local PC magazine reached more people than have actually tried that HTML5 Quake2 demo).
> Personally I'm getting tired of hearing about how Javascript-enabled web apps are good enough. No, they aren't.
To play on a level field with native apps? No. Wrong discussion! What about for the same sort of tinkerer/hobbyist who wrote HyperCard stacks? Seems to me, with a little hosting and a few tools, JavaScript would be dandy!
(And as for commercial apps on HyperCard -- hey, you're free to try. But I'm not talking about targeting those folks.)
Forgive me if I'm being dense, but I'm not certain what that has to do with the point being made. Could you explain it in greater detail?
My understanding is that there are entire classes of programs where "let's put it on a website" is not a viable method. You can write many types of apps that way, certainly, but you pretty much give up anything so much as pretending to be performant. Additionally, it prevents you from doing anything particularly low-level - if you were desperate enough, you could always set up a web interface to gcc or something, but that won't help you if you want to hack on a device driver or something. Systems programmers are people too, you know. ;)
> My understanding is that there are entire classes of programs where "let's put it on a website" is not a viable method.
But if you recall what HyperCard stacks were like, then putting it on a website that's specially tailored to make it easy, combined with a local app to make things more seamless, might actually be a money-making proposition.
Put the hacking open/closed politics aside. What do you think about the idea as a business?
The point that was made above was, "there has been no progress because demos from 15 years ago had graphics as good as what HTML5 games have today".
But the difference is that HTML5 games are games, and demos from 15 years ago were demos. Writing an entire app like you write a demo is quite difficult; hand-optimized assembly takes a lot longer to write than Java. The improvement over the last 15 years is in the programming language tools and runtimes, not the graphics that the programs produce.
Apple expressly forbids "interpreted code" on the iPhoneOS platform.
I will say that on Monday Adobe is releasing Flash CS5, which will build native Android/iPhone/iPad/Desktop/Web apps all from the same source (which will be compiled down, not interpreted). I for one would love for this to be the new HyperCard :OD
The Lua parts of Corona are compiled to bytecode, but they still run on a Lua VM on the iPhone. As far as Apple changing their minds goes, the last 24 hours have proven you right.
>MonoTouch is delivered as a static compiler that turns .NET executables and libraries into native applications. There is no JIT or interpreter shipped with your application, only native code.
While the idea of Jobs vs. Woz philosophies seems basically accurate, they diverged a bit later than the author says.
Apple had no control over the software and hardware used with the original Macintosh. The author is just flat wrong on this. The serial connectors were a bit weird for the time, but that was it. There was none of the patented-connector, crypto-signed-applications, brick-the-hacked-devices tactics we see with the i* line. The Mac OS didn't even have kernel mode. You could just rewrite all of RAM with your own code and jump to address zero if you felt like it, and people did.
The Apple //c was as "closed" as the Macintosh, and I'm pretty sure Woz had something to do with that machine (you could even get a signed limited edition).
Modern Macs are made with plenty of standard parts, and the worst that happens if you mess with them is that you void the warranty--which is exactly what happens if you mess with the insides of a Dell or HP computer.
The true divergence is that Jobs now has Apple producing consumer entertainment devices in addition to computers. Obviously, the tradeoffs and rules are different for consumer entertainment devices. Whether it's an iPhone, a PS3, or the radio in my car, the manufacturer isn't interested in supporting openness and arbitrary hacking--they are expected to make a functional, attractive product that "just works", which requires maintaining some control over what goes into it.
P.S. AT&T Wireless is the result of AT&T absorbing McCaw Cellular, and separated again from AT&T years ago, so while it's fun to talk about the irony of Jobs' blue-boxing, that wasn't really quite the same company.
This article is basically fluff. It is trying to turn Apple's evolving business strategies into some sort of personality conflict. One that guys like Woz probably wouldn't acknowledge is real.
From the moment they were selling real products and not kits, Apple has always wanted to control every aspect of the computing experience. For a while, at Apple, it was considered heresy to be an "Open Mac" supporter -- that is, you thought it was okay to allow third party companies to produce peripherals like disk drives or printers. It was only around '87 or so that the idea of an expandable Mac saw the light of day (the Macintosh II) and that product line slowly petered out in the 90s.
People on this thread are suggesting that the OS was hackable with a floppy disk out of some desire to be friendly to tinkerers. Don't be silly. It was that way because there was no other conceivable alternative for software distribution.
If we disregard Apple's mid-90s confusions it's been on a steady road towards the iPad since the beginning, in rhetoric if not always in reality.
At the time, you weren't allowed to hook arbitrary equipment up to the phone system -- everything had to be approved by AT&T. IIRC the hardware Woz used for his dial-a-joke setup was homebrew.
Apparently it was not homebrew; here are a couple articles saying that he rented the equipment from the phone company. So the dial-a-joke thing might have been entirely legit.
I'm guessing that when the article describes the dial-a-joke service as illegal, it's confusing it with his blue-box antics, which occurred around the same time. According to the following article (a fun read), the blue box preceded dial-a-joke; he got the inspiration for the service from a dial-a-joke line in New York that he called while demoing blue boxes to potential customers:
> It might overrun Sony and Microsoft in computer gaming
I don't understand.. how?
At best, I see it appealing to a small niche gaming market, but as it stands, I can't see how it would overrun Sony or Microsoft - for example, the iPad is hugely underpowered compared to today's popular gaming machines and you would definitely need to couple it with some other input devices, like keyboard & mouse or gamepads, since multitouch alone doesn't seem all that suitable for a lot of games IMHO.
iPhone, smartphone and iPad gaming isn't going to be what console or PC gamers expect. However, we're not talking about those types of people buying this device.
For example, Brickbreaker has been played by over 50 million people while COD: Modern Warfare 2 has only been played by 15 million.
These types of games (that hardcore gamers would consider silly and frivolous) are going to be the dominant games in the marketplace.
Hmm, I'm beginning to see why they used an Apple ][+ in LOST. Jobs as the Man in Black, against Woz's Jacob. ("You have no idea how much I want to kill you.")
I get this same vibe from a lot of older folks (or people who want to sound like them, all grizzled and whatnot). True, from a hardware standpoint, Apple devices are closed. What matters today, however, is that they are open devices from the perspective of a platform and web software developer.
Has everyone forgotten how cellphones were black boxes before the iPhone? And let's not forget that we make stuff for normal people here. Not just hackers like us. As it turns out, most folks don't want a hardware-hackable machine. They don't care enough for USB ports. The tablets being championed by MSFT in the previous decade were not much more moddable than the iPad, but I don't see anyone yelling at MSFT about that.
> Has everyone forgotten how cellphones were black boxes before the iPhone?
I must have forgotten that because it isn't true. On my P800 or Blackberry I could install any app, not just approved ones.
(Edit: I wonder if this meme about the "cell phone dark ages" comes from the millions of people who switched from dumbphones to iPhone and thus aren't personally familiar with the actual smartphone state of the art circa 2006.)
Smartphones have always been horrible. Have you ever tried to install a Java ME app that does something more useful than storing a high score on a phone? Or dealt with this Symbian signing nonsense more than a year or so ago (they made it slightly better with the signing web service since then, but it is still a hassle)? You always have to cope with some certificate-authority protection racket just to get at the interesting APIs.
The openness of the iPhone App Store is par for the course here...
I can't speak for where you are, but here in North America (well Canada & the USA) to go from [cell phone customized for service provider] to [cell phone customized for you with apps that you selected to use] was never a simple process - perhaps easier on a blackberry but not simple.
The device makers simply didn't have enough leverage to make the process simple.
I am in the USA and IIRC there was no "process" with my T-Mobile Blackberry; I could install any app I wanted directly from the Web browser. Maybe that's some kind of exception since T-Mobile is the least evil carrier.