1. Great piece from Jared Sinclair on the state of electronic health care records and its relation to Apple’s WWDC HealthKit announcements. From the article:

    The reality of EHR usage is that – even as late as 2009 – fifty percent of US hospitals were only halfway electronic. Most just converted the easy stuff to electronic records, like lab results. Less than one percent (!) of them had completely moved beyond paper records. Many still had no electronic records at all.

    Yet another reason the spotlight of the public eye should be cast upon healthcare. How can we expect to have a high-quality modern healthcare system when the industry itself seems so resistant to something as seemingly simple (from the outside) as keeping records in digital form where other hospitals and physicians can see and update them quickly and easily? Isn’t this just the sort of serious, world-changing problem that technology is supposed to solve?


  2. Convergence and Philosophy

    June is always an interesting time. Before I started this site, June was interesting because I was excited to see the new Apple products that I’d likely not be able to buy. Then it became the time of year that I would learn about the new iPhone, whose subsidized pricing made it possible for me not only to own one, but to own a new one every time there was one. In its third iteration, June became the time of year I was most excited to write, simply because so much was being said, and the allure of participating was incredibly strong.

    This year I find myself loving the keynote, excited about the products, and intrigued by the developer-related information. And yet, I’m unable to write about this June’s news the way that I’ve written in the past.

    For one thing, there’s the simple fact that I’m not a developer, and this is the most developer-centered WWDC I can remember. But it’s the second thing that makes me hesitant to comment this time around. There are just so many absolutely excellent people writing about Apple these days. Perhaps they’ve always been out there and as the years go on, I just find more and more of them, but something tells me that it’s more than that. Apple, as a topic, has become crowded.

    Even so, amid the teeming thousands of responses to this year’s WWDC keynote, I keep coming back to this bit from Jim Dalrymple:

    Apple showed that it’s not just the data that is following the user through iCloud to a variety of devices, but it’s bigger than that—it’s a uniform experience that is following the user.

    Now obviously Jim isn’t a personality I had to dive very deep to get to, but sometimes the big names are as big as they are for a reason.

    Once again, I’m not a developer, and though announcements like Apple’s new Swift programming language make me wonder if I could ever learn, my propensity for juggling too many side projects makes it unlikely to ever come to pass. But Jim’s piece, and this paragraph in particular, helped me realize something about this year’s presentation that matters immensely to non-developers: Apple’s WWDC 2014 message is one of convergence and philosophy.

    For quite some time, the Apple community has speculated about the convergence of Mac OS X and iOS. And time after time, Apple has seemingly rebuffed this notion. But WWDC 2014 reveals to us that the two operating systems are indeed on a collision course, though not in the way the knee-jerk tech pundits predicted.

    Apple, as is its wont, is playing the long game. The short game says to be platform agnostic with browser-based apps like Google, or to build one OS to rule them all like Microsoft with Windows 8. But these strategies place too much focus (unsurprisingly) on the tech, specifically the services, and not enough on the average user. There is a company that does focus on the user, and that company is Apple. However, though this year looks to be about tech and services like its rivals, it’s really about devices and users.

    In 2014 users want cloud-connected, access-anywhere, high-utility computing on whichever device is handiest at the moment. It’s akin to the old camera saying that the best camera is the one you have with you. Users not only want that feeling in the current PC and Post PC market, they expect it. Google and Microsoft provide this by creating an entity that users interact with through their device. They seem to say, “buy a Samsung or a Nokia and you can access Google or access Microsoft.” These entities have all of your stuff; whether it be Word documents or Gmail messages, someone (or something) has your stuff, and you can get to it if you buy a device and use the attendant software.

    Apple, on the other hand, has designed their system around the device. Think, “I am using my Mac,” or “I am using my iPhone.” Unlike Microsoft and Google, for whom the device is a layer of abstraction between the user and the primary product, which is the respective company’s services, Apple’s devices are zero layers of abstraction from their primary product: the device itself.

    Swift and extensions and widgets and all the others make a better Mac, and a better iPad, and a better iPhone, and a better (most likely) Apple TV or iWearable. Apple seems to see its customers saying “I love using my iPhone, but this feels like something I’d rather finish on my Mac or my iPad. Oh, look at that. I can just work on it there too,” which in turn makes the user love the Mac, iPad, and iPhone even more.

    For so long now, we’ve become used to the idea of trashing iCloud as a second class citizen within Apple. “When will they get server-side design and engineering the way they do other parts of their business?” we say on our podcasts and blogs and Twitter streams. The thing is, all of Apple’s services, be it developer or cloud, OS or language, are second class citizens to the device itself.

    Apple wants you to love holding, using, and owning its devices. Everything they do supports that philosophy. To them, iCloud is only a problem if it makes people enjoy their devices less. A 4-inch iPhone screen is only a problem if people like their iPhones less because of it. Inter-app communication is only important when it starts to make people like using their iPads less or their phones less than another brand. And the same can be said for all of these in reverse. If it’s making the device worse to use, it will be marked for revision or death (be it slow and steady or quick and merciless).

    Without a single hardware announcement, Apple has done more with this keynote to reinforce its position as a device company than all the Surfaces and all the Nexuses and all the Glasses put together. Because for Microsoft and Google, each of those is just another dumb screen that can see their services. For Apple, the services are there to do just that: serve. They serve the needs of the hardware, which serves the needs of the user. The primary need? An enjoyable experience with the device.

    So how does this all relate to convergence? With one WWDC, Apple has taken a bigger step toward it than ever before. But it’s where they are converging that matters most. Apple and the developer community around it now have the ability to give us the one device that does it all. That device? The one that’s in your hand or your lap or on your desk right now.


  3. Surface the Third

    I’ve always been intrigued by the idea of Microsoft’s Surface. Sure, it seems a bit too divided, a bit too “I can just have everything, can’t I?” But despite its obvious conceptual shortcomings, a device that is all things to all people has an appeal.

    On more than one occasion I’ve talked about using the iPad for work, writing in particular. I see it as the future of computing and if Apple plays their cards right, the primary non-smartphone computing platform.

    After the introduction of the Surface Pro 3, discussions arose around the announcement and a piece by Ben Thompson in which he calls for the end of the Surface line.

    Even in my clearly Apple-biased view, this makes little sense from a product and marketing standpoint, even while it makes a great deal of sense on a profit-and-loss basis.

    Microsoft fell behind when the internet first showed signs of growing into what it has now become, and it bought and crushed its way back into the space. It fell behind again when Vista was a flop, but most spectacularly with the smartphone. They were caught flat-footed (as was everyone) but weren’t able to force their way back to power. And though Windows Phone isn’t dead, it sure doesn’t look all that promising.

    With the Surface, Microsoft has the potential to be ahead of the curve. They’re taking a different tack with tablets, zigging when everyone else is doing a variation on Apple’s zag.

    If there is a future in which the primary work device is a tablet with a desktop operating system, then the Surface could give Microsoft the edge they’ve long needed. They’re thinking differently (yeah, I know) though we’ll have to wait and see whether it’s different good or different not so good.


  4. Linus Edwards:

    I don’t see an easy way to solve this problem. It seems users have this conflicted nature of wanting both simplicity and complexity, and app developers can’t keep both sides happy.

    A great article and idea from Linus, but I found myself shaking my head by the end, not because I think it’s a bad argument (I don’t, of course) but because I see this “churn” as a blessing rather than a problem.

    It reminds me of something Steve Jobs said in his famous Stanford address:

    Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new.

    As it is with life, so it is with software. The cruft of the old is swept away by the new and fresh, “unbruised” as Shakespeare might have said and unburdened by the cares of the past.

    But unlike life, at least unlike human life, software evolves at breakneck speed. The accruing complexity of an app like Castro or an operating system like iOS is near enough to the mistakes of its forebears to learn from them.

    Simplified writing apps (about which I know a little) tend to add complexity just like any other app does. But not even the most complicated and poorly managed will ever end up with the monstrosity that is Word’s Ribbon Interface. And even though my guess would be that most of Microsoft’s designers hate the Ribbon just as much, they are stuck being the crufty and old. Their customers would never allow even a mild simplification, let alone the steps necessary to make Word into a pleasant user experience. Look no further than Windows 8 (and furthermore 8.1) for your evidence.

    It took the compound complexity of Windows and OS X to spur the development of iOS (and Windows Phone). Designers must follow the branches of the complicated in order to create the essential versions that come later.


  5. Just Another Guy

    At the University of Idaho, where I earned my undergraduate degree, there’s a room with two floor-to-ceiling glass-paneled sides. One of its three entrances connects to the Einstein Brothers bagel shop (which in my day was simply a campus cafe called Common Grounds). I spent many afternoons in the glass-paneled room; it has a name, but I always preferred to call it the Quiet Room. There was a no talking policy, no noise at all in fact, except for the baby grand piano in the corner, often played beautifully by an elderly man who stopped by for fifteen minutes or so every couple of days. I never asked his name.

    One day, ensconced in one of the faux-leather chairs, PowerBook in my lap, unfinished literary essay in a long since forgotten version of Word for OS X open behind my web browser, I paged idly through Slashdot or Digg or MacRumors. Glancing up from the screen, I saw a young man making his way across the room towards me.

    "So, you’re a Mac guy?" he asked, but it wasn’t really a question.

    Now, I am not what you might call comfortable in social situations, even less so with a stranger approaching me in my silent sanctuary.

    "Uh, yeah." I tried to smile in the way an affable, approachable Mac user might.

    "Nice, man." He flopped into the chair next to mine, rummaging through a messenger bag (a rarity at the time), finally producing a thin black USB hard drive, the cord wrapped in a snarl around its center. "I got into ‘em because of the music. What about you?"

    I said something about ease of use; at the time I couldn’t say what I know now—that I love hardware and software that pays an unusual amount of attention to details that average users rarely, if ever, notice.

    “We gotta stick together, you know? Check this out,” he continued.

    With a flick of his wrist, he popped open his computer (a white plastic iBook), plugged in the portable drive, and brought up the Finder. Inside were folders and folders of music, easily thousands of songs. He aimed the screen at me. “See anything you like?”

    I did.

    Now it’s important to remember that this was many years ago. My thoughts on pirated music or software have shifted greatly since then, mostly in response to my desire to become someone who makes a portion of their income from content creation. But on that day in the Quiet Room, I was a cheap, naive college student in a bizarrely high-pressure social situation.

    Whether I took the guy up on his offer or not is beside the point. We went on to talk a bit more about the Mac, until we started drawing looks from others in the room. Naturally, I wasn’t the only person who came to this place in order to get away from the noise. Eventually, he packed up his machine and headed out with a wave and a smile.

    To this day I wonder if he would’ve approached me had I been typing away on a Dell or a Sony or a (then IBM) ThinkPad.

    During the summer months I still take classes at the University of Idaho. It’s only a short walk from my house, and the course content still teaches me something new every time. And some days I sit in the Quiet Room in the same chair, wondering: if someone were to come through the glass doors with a MacBook under their arm, would they see the Apple logo on my iPad and strike up a conversation? Or, in light of Apple’s recent popularity, would they see not a kindred spirit—another person who gets it—but nothing at all, just a student (a little old for the university scene these days) hunched over a glowing screen, like everybody else?

    Isn’t that what 2003 me wanted? An Apple computer in virtually everyone’s hands, a diminished Microsoft, computers thin and light enough that you’d hardly notice them in a backpack, battery life that lasts all day. It was. But now, instead of being “the Mac guy,” I’m just another guy with a Mac.

    I suppose that’s better.1

    1. My inspiration for this piece comes from this post which appeared at 9to5 Mac.