1. Just Watch

    At the risk of sounding cliché, Apple has spent the last decade and a half (more, really) preparing for the release of a product like the Apple Watch.

    In 1998, Steve Jobs and Jony Ive introduced their first full-fledged collaboration: the iMac. The Bondi Blue bubble of CRT glass and molded plastic became the symbol for “Think Different” in practice rather than simple philosophy. As a computer, it made great strides in simplicity, in promoting the USB standard, and in proving that not all the magic had gone from Apple. But more than that, or any other trait it might be known for, the iMac was a style device.

    For years, PC companies tried and failed to replicate the iMac’s style with slapdash plastic inserts in fuchsia and teal and other fantastically ’90s colors. They checked the “flashy colors” box, and then stared stupidly as Apple became the company of trendy design while they sold millions of personality-free commodity boxes in plastic plate mail to businesses around the world. The whole process taught those who paid attention that Apple, the new Apple at the time, understood style in a way that other tech companies just plain did not.

    They rode that wave through the gorgeous but flawed Titanium PowerBook and G4 Cube, had a bit of mainstream success with the white iBook, and leveraged it like crazy on a new (for Apple, that is) device category called the iPod. But they didn’t stop there. The iPod took another step that further helped to shape the company: miniaturization.

    At the time, Apple was not the company making the parts that filled the little silver and white brick that could hold your entire music collection. But they learned from the companies that did, designing the interiors themselves and pressing that brick thinner and thinner until even the two-year-old iPod Touch seems almost impossibly thin.

    With the iPod, Apple became a household name in a way it simply hadn’t been before. Suddenly, it was “cool” to buy and use something from Apple. And not in the way the geeks (or some of us, anyway) had always thought Apple stuff was cool (remember the Newton?) but in the way that ordinary people think of cool. Apple products became a style symbol for everyone where before they had only been a design symbol for those who cared to look.

    You know where this is going.

    Then the iPhone dropped, and Apple clarified what the company could be even further. It was a facet they had shown in fits and starts through their history, but never in quite the same quantity and quality as with the first iPhone. And that facet was technology. Not just speed-bumped, “this is our latest and greatest but it’ll be laughably outdated in three months” technology. The iPhone brought real, “holy crap, I didn’t even know that was possible” tech. The kind that we’ve all gotten used to by now, as Apple fills phone after phone with high-quality cameras, touch-sensitive glass, fingerprint scanners, and systems on a chip.

    So think about that for a moment and allow me the indulgence of this next paragraph.

    Three things. Style, miniaturization, and technology. Each of these at a level that the competition shamelessly strives toward but rarely achieves. Style. Miniaturization. And technology.

    Fill in the blank. I know you can.

    Apple is uniquely prepared for wearables. And I’ve seen articles that decree it is their style that makes it possible, that no tech company gets style in a way that could translate over into fashion. No company except Apple. I’ve listened to podcasts that suggest it’s Apple’s ability to shrink things down (into a whole computer on a chip! Whatever that means) that makes them especially equipped for breaking open the wearable market in a way that transcends gadget geeks (of which I am unabashedly one). And some say that it’s their tenacious pursuit (and achievement) of technology that will propel them into a category that many think is unnecessary.

    On top of all that, Apple has spent the years since at least the introduction of the iPad working with leather and other materials in their cases, making mistakes, learning, and growing. They’ve introduced an OS so simple to use that the market demanded more functionality not because the system didn’t accomplish the task but because users desperately wanted to use it for more and more of their daily computing needs.

    What few commentators seem to be saying is that sometimes two plus two can equal five; that the sum of the parts is less than the value of the whole; that despite checking every conceivable box on every conceivable multi-column spreadsheet on every imaginable tech website, the competition will likely come up confused, defensive, and above all irate when the Apple Watch creates a category that they think they already started.

    Just watch.



  2. Great piece from Jared Sinclair on the state of electronic health care records and their relation to Apple’s WWDC HealthKit announcements. From the article:

    The reality of EHR usage is that – even as late as 2009 – fifty percent of US hospitals were only halfway electronic. Most just converted the easy stuff to electronic records, like lab results. Less than one percent (!) of them had completely moved beyond paper records. Many still had no electronic records at all.

    Yet another reason the spotlight of the public eye should be cast upon healthcare. How can we expect to have a high-quality modern healthcare system when the industry itself seems so resistant to something as seemingly simple (from the outside) as keeping records in digital form where other hospitals and physicians can see and update them quickly and easily? Isn’t this just the sort of serious, world-changing problem that technology is supposed to solve?



  3. Convergence and Philosophy

    June is always an interesting time. Before I started this site, June was interesting because I was excited to see the new Apple products that I’d likely not be able to buy. Then it became the time of year that I would learn about the new iPhone, whose subsidized pricing made it possible for me not only to own one, but to own a new one every time there was one. In its third iteration, June became the time of year I was most excited to write about, simply because so much was being said, and the allure of participating was incredibly strong.

    This year I find myself loving the keynote, excited about the products, and intrigued by the developer-related information. And yet, I’m unable to write about this June’s news the way that I’ve written in the past.

    For one thing, there’s the simple fact that I’m not a developer, and this is the most developer-centered WWDC I can remember. But it’s the second thing that makes me hesitant to comment this time around. There are just so many absolutely excellent writers covering Apple these days. Perhaps they’ve always been out there and, as the years go on, I just find more and more of them, but something tells me that it’s more than that. Apple, as a topic, has become crowded.

    Even so, amid the teeming thousands of responses to this year’s WWDC keynote, I keep coming back to this bit from Jim Dalrymple:

    Apple showed that it’s not just the data that is following the user through iCloud to a variety of devices, but it’s bigger than that—it’s a uniform experience that is following the user.

    Now obviously Jim isn’t a personality I had to dive very deep to get to, but sometimes the big names are as big as they are for a reason.

    Once again, I’m not a developer, and though announcements like Apple’s new Swift programming language make me wonder if I could ever learn, my propensity for juggling too many side projects makes it unlikely to ever come to pass. But Jim’s piece, and this paragraph in particular, helped me realize something about this year’s presentation that matters immensely to non-developers: Apple’s WWDC 2014 message is one of convergence and philosophy.
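
    Still, it’s easy to see why Swift tempts even a non-developer. As a toy illustration (my own sketch, not anything from Apple’s sample code), a few lines are enough to get something readable running:

        // Toy Swift (circa 2014) — type inference and string interpolation
        // make it read almost like plain English.
        func greet(name: String) -> String {
            return "Hello, \(name)!"
        }

        for device in ["Mac", "iPhone", "iPad"] {
            println(greet(device))    // println was Swift 1's print function
        }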

    For quite some time, the Apple community has speculated about the convergence of Mac OS X and iOS. And time after time, Apple has seemingly rebuffed this notion. But WWDC 2014 reveals to us that the two operating systems are indeed on a collision course, though not in the way the knee-jerk tech pundits predicted.

    Apple, as is its wont, is playing the long game. The short game says to be platform agnostic with browser-based apps like Google, or to build one OS to rule them all like Microsoft with Windows 8. But these strategies place too much focus (unsurprisingly) on the tech, specifically the services, and not enough on the average user. There is a company that does focus on the user, and that company is Apple. However, though this year looks to be about tech and services like its rivals, it’s really about devices and users.

    In 2014 users want cloud-connected, access-anywhere, high-utility computing on whichever device is handiest at the moment. It’s akin to the old camera adage that the best camera is the one you have with you. Users not only want that feeling in the current PC and Post PC market, they expect it. Google and Microsoft provide this by creating an entity that users interact with through their device. They seem to say “buy a Samsung or a Nokia and you can access Google or access Microsoft.” Whether it be Word documents or Gmail messages, someone (or something) has all of your stuff, and you can get to it if you buy a device and use the attendant software.

    Apple, on the other hand, has designed their system around the device. Think, “I am using my Mac,” or “I am using my iPhone.” Unlike Microsoft and Google, for whom the device is a layer of abstraction between the user and the primary product, which is the respective company’s services, Apple’s devices are zero layers of abstraction from their primary product: the device itself.

    Swift and extensions and widgets and all the others make a better Mac, and a better iPad, and a better iPhone, and a better (most likely) Apple TV or iWearable. Apple seems to see its customers saying “I love using my iPhone, but this feels like something I’d rather finish on my Mac or my iPad. Oh, look at that. I can just work on it there too,” which in turn makes the user love the Mac, iPad, and iPhone even more.
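
    That “work on it there too” moment is, as far as I understand it, what the new Continuity features are for. A minimal sketch of the Handoff piece (the activity type and userInfo here are hypothetical, just to show the shape of the thing):

        import Foundation

        // An app advertises what the user is doing right now as an NSUserActivity.
        // Nearby devices signed into the same iCloud account can offer to resume it.
        let activity = NSUserActivity(activityType: "com.example.editing-document")
        activity.title = "Editing my essay"
        activity.userInfo = ["documentID": "1234"]
        activity.becomeCurrent()    // broadcast this as the user's current activity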

    For so long now, we’ve become used to the idea of trashing iCloud as a second class citizen within Apple. “When will they get server-side design and engineering the way they do other parts of their business?” we say on our podcasts and blogs and Twitter streams. The thing is, all of Apple’s services, be it developer or cloud, OS or language, are second class citizens to the device itself.

    Apple wants you to love holding, using, and owning its devices. Everything they do supports that philosophy. To them, iCloud is only a problem if it makes people enjoy their devices less. A 4-inch iPhone screen is only a problem if people like their iPhones less because of it. Inter-app communication only matters when its absence starts to make people like using their iPads or iPhones less than another brand’s. And the same can be said for all of these in reverse. If it’s making the device worse to use, it will be marked for revision or death (be it slow and steady or quick and merciless).

    Without a single hardware announcement, Apple has done more with this keynote to reinforce its position as a device company than all the Surfaces and all the Nexuses and all the Glasses put together. Because for Microsoft and Google, each of those is just another dumb screen that can see their services. For Apple, the services are there to do just that: serve. They serve the needs of the hardware, which serves the needs of the user. The primary need? An enjoyable experience with the device.

    So how does this all relate to convergence? With one WWDC, Apple has taken a bigger step toward it than ever before. But it’s where they are converging that matters most. Apple and the developer community around it now have the ability to give us the one device that does it all. That device? The one that’s in your hand or your lap or on your desk right now.



  4. Surface the Third

    I’ve always been intrigued by the idea of Microsoft’s Surface. Sure, it seems a bit too divided, a bit too “I can just have everything, can’t I?” But despite its obvious conceptual shortcomings, a device that is all things to all people has an appeal.

    On more than one occasion I’ve talked about using the iPad for work, writing in particular. I see it as the future of computing and, if Apple plays their cards right, the primary non-smartphone computing platform.

    After the introduction of the Surface Pro 3, discussions arose around the announcement and a piece by Ben Thompson in which he calls for the end of the Surface line.

    Even in my clearly Apple-biased view, ending the line makes little sense from a product and marketing standpoint, even while it makes a great deal of sense on a profit-and-loss basis.

    Microsoft fell behind when the internet first showed signs of growing into what it has now become, and it bought and crushed its way back into the space. It fell behind again when Vista was a flop, but most spectacularly with the smartphone. It was caught flat-footed (as was everyone) but wasn’t able to force its way back to power. And though Windows Phone isn’t dead, it sure doesn’t look all that promising.

    With the Surface, Microsoft has the potential to be ahead of the curve. They’re taking a different tack with tablets, zigging while everyone else does a variation on Apple’s zag.

    If there is a future in which the primary work device is a tablet with a desktop operating system, then the Surface could give Microsoft the edge they’ve long needed. They’re thinking differently (yeah, I know) though we’ll have to wait and see whether it’s different good or different not so good.



  5. Linus Edwards:

    I don’t see an easy way to solve this problem. It seems users have this conflicted nature of wanting both simplicity and complexity, and app developers can’t keep both sides happy.

    A great article and idea from Linus, but I found myself shaking my head by the end, not because I think it’s a bad argument (I don’t, of course) but because I see this “churn” as a blessing rather than a problem.

    It reminds me of something Steve Jobs said in his famous Stanford address:

    Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new.

    As it is with life, so it is with software. The cruft of the old is swept away by the new and fresh, “unbruised” as Shakespeare might have said and unburdened by the cares of the past.

    But unlike life, at least unlike human life, software evolves at breakneck speed. The accruing complexity of an app like Castro or an operating system like iOS is near enough to the mistakes of its forebears to learn from them.

    Simplified writing apps (about which I know a little) tend to add complexity just like any other app does. But not even the most complicated and poorly managed will ever end up with the monstrosity that is Word’s Ribbon interface. And even though my guess would be that most of Microsoft’s designers hate the Ribbon just as much, they are stuck being the crufty and the old. Their customers would never allow even a mild simplification, let alone the steps necessary to make Word into a pleasant user experience. Look no further than Windows 8 (and furthermore 8.1) for your evidence.

    It took the compound complexity of Windows and OS X to spur the development of iOS (and Windows Phone). Designers must follow the branches of the complicated in order to create the essential versions that come later.