We’ve all heard it, or at the very least thought it. On some frigid morning we’ve all put one foot out the door while still clinging to our summer shorts in hopes that it’ll warm up later in the day. Or, we’ve stepped into the spring sunlight in that favorite sweater and know within five minutes that it was definitely the wrong call. And yet we try. It seems like every year I try. Why is that?
Comfort. Convenience. Probably a great deal of sentiment. I love that sweater or that pair of jeans or that t-shirt. Even the forces of nature can’t change that. Now, in the world of clothing and fashion, I don’t have much of a choice but to surrender. One day it will simply be too hot outside to justify that favorite jacket. And so it goes back in the closet for another time.
My technology, however, doesn’t have to be this way. I can select a general-purpose computing device that accomplishes all (or nearly all) of my current and even possible goals. Up until a few years ago, the choice was even more clear-cut. But with tablets and smartphones closing the performance gap, there seem to be fewer and fewer reasons why our mobile devices should not be capable of completing the same tasks as a laptop or desktop.
In his article, “An iPad, a Computer, a Holy Grail,” Linus Edwards makes the case for simply settling for a coat closet full of tools that only come out for their specified purpose:
Anytime you consolidate devices, you are making compromises. If you decide to get rid of your camera and only use the iPhone to take photos, you are getting a lesser experience, as most stand-alone cameras can take better photos than the iPhone. If you get rid of your Nintendo 3DS and only play games on the iPhone, you are missing out on a more specialized gaming experience.
These would all be true if I were the kind of person who would carry each of these specialized tools. In reality, I’d say it’s more likely that most people would prefer to have one device fill the place of three. Before the iPhone, and even as an avid gamer, I never owned a handheld gaming device. Before the iPhone I never (or very rarely) kept a camera on me. I have countless images of my children, or just the scenery around me, that I’ve shot on the iPhone simply because it was there. In its absence, I wouldn’t have taken a better picture with a point-and-shoot or DSLR; I would have taken nothing. The moment would just be gone.
So there are obvious utilitarian reasons for going with a smartphone or tablet, but what about my earlier comparisons to fashion? That sweater I just want to keep wearing? That’s the iPad. It’s comfortable, predictable, fun, and pleasant to me in a way other devices aren’t. I want it to be the device I get to use, just like I want the weather to be right for that sweater. The difference comes in the application.
Weather is weather, but digital tools and digital tasks are not the same as the physical problems of hammering nails, driving screws, or staying cool on a hot day. At this point, the processing power of the A7 rivals that of laptops from only a few years ago, certainly not the pinnacle of computational performance but clearly enough to get most jobs done in a reasonable timeframe. Even so, we are forced to deal with workarounds and awkward situations to accomplish some of the same tasks on a tablet that would be much easier on a traditional desktop operating system.
The two comparisons just aren’t equivalent. Physical limitations, such as Edwards’ example of a Swiss Army Knife used to build an entire house, are virtually insurmountable. Building a house with a Swiss Army Knife is never going to make sense, but each year, building and operating a website using only an iPad makes more and more sense. The capability is there; the implementation just hasn’t quite made the leap. And if people like Federico Viticci and others who try to be exclusive iPad users don’t push the limits, who will?
The right tool for the right job is a nice sentiment if you want to carry around a sack full of toys like Santa Claus. But when the hardware is growing more and more capable every year, why not consolidate as much as possible? It’d be like getting to wear your favorite sweater every day of the year.
On this week’s Critically Speaking podcast: Where were you in 1998? Games were in a state bound to the rules of the tabletop RPG, and though on the surface things seem to have changed a great deal since then, many of gaming’s traditional roots still hold fast.
The same can be said with technology, as a person might realize the $1,000 MacBook Air is the superior product, but simply don’t care enough about computers to spend that money and are fine with the $300 Dell that lets them check their email and Facebook.
An interesting point, and he goes on to compare fashion (which some people care a great deal about while others do not) to technology:
How many geeks out there will insist on the top of the line Mac Pro, yet go outside wearing an old t-shirt and cheap pair of sneakers?
Edwards is right on both counts. Geeks often do place a high value on their areas of interest; it’s part of what makes us geeks. But the difference I see between the fashion example and technology is that the Mac Pro geek who spends $20 on the cheapest clothes he can find will rarely complain about how terrible his clothes are. The fashionista who ends up with a $300 Dell inevitably whines about how it’s slow or “broken” or “getting full of too much stuff.” If not, they simply ask about the Mac when they run into their techie friend who already owns one.
And then we have to answer, all the while keeping in mind that they don’t care the same way we do, yet often want the same experience (and at a fraction of the cost).
On this week’s Critically Speaking podcast: You may, in fact, only live once, but what happens when a game only lets you die once? Whether it’s an arcade shooter with a steadily increasing swarm of enemies or a quiet hiding place along the tree line in Day Z, permanent death in games is growing ever more popular.
On this week’s Critically Speaking podcast: Game after game after game, sequels in this and other entertainment industries seem never to end. Are the audiences the better for having another title in their favorite franchises, or are we just lining up for another helping of the same old rehash?
On this week’s Critically Speaking podcast: Mario and Michelangelo, Zelda and Zefferelli, either way discussions of games lead inevitably toward the mature analysis of other media. But how do we increase the range of intelligent criticism and discussion surrounding these interactive experiences? Scott and I discuss the staying power of the gaming classics, Path of Exile, and the upcoming console launches.
Another good article by Linus Edwards, but it leaves out a very important point. Battery life may be fairly stagnant over the past twenty years, and we certainly could use a technological leap ahead. However, when you look at the gains in device capability, the fact that such limited tech still gets us 10 hours or so is impressive.
Imagine how much more gas your car would burn if it had 10,000 horsepower. That’s the kind of power increase we’re dealing with in our computers and mobile devices since the ’80s.
On this week’s Critically Speaking podcast: Entertainment knows no age, right? Some might say that games, as a medium, lend themselves to a certain demographic. Increasingly, the population of gamers is growing older and older. With that age comes a shift in preference. Is the medium broad enough to support older gamers? And what if it’s not?
On this week’s Critically Speaking podcast: There comes a time when the number of times a person can play a single game takes a back seat to the number of games a person wants, and has the financial resources, to play. Scott and I discuss replayability, and a plethora of news from the XP mechanic in the upcoming Thief to the Nvidia G-sync announcement that claims to eliminate the bane of PC-Gaming: screen tearing and stutter.
On this week’s Critically Speaking podcast: Twenty years ago, the market for games was a very different place. Technology was different, more impenetrable, more complicated. Games were a difficult pastime in some ways, though sitting on the couch was always easier than going outside. In an age where the gaming market has grown by leaps and bounds, how big is the niche for titles that haven’t been “dumbed down”? And who decides what’s dumb?
Vision. We all have it. The ability to sense what is beyond the range of immediate sight. Perhaps you have a vision for how the next year will play out, or maybe you have a vision for that app you always wanted to program or the novel you always wanted to write. The unfortunate reality is that though we can all feel our sense of vision, the vast majority of us cannot or do not act upon it. Then, of the small fraction who do, only a sliver are able to bring it to fruition. How small is the number of those who can bring their vision to bear again and again?
Two years ago, the world lost one such visionary when Steve Jobs passed away. For many associated with tech, Jobs was and is larger than life. There are others, certainly, Hiroshi Yamauchi of Nintendo for instance, whose bold ideas and unyielding resolve have driven their industries to new heights. This week, DYHAMB? is examining what it means to be a visionary; I’d like to take a look at a few candidates for my part.
What can I say that I haven’t already said? Most people discuss Steve Jobs’ tenure at Apple, especially after his triumphant return in 1997, in terms of the products the company produced. The iMac, iPod, iPhone, and iPad were each shining examples of Steve’s vision. So too was the G4 Cube; a financial disaster, it perhaps exemplified his vision even more than the others. Meticulous detail, a focus on internal as well as external design, and the vicious refusal to retain legacy technologies all helped define a Jobs-driven product.
But there was much more to Jobs’ vision than the products his company produced. The company itself was the vision, and the products only a part. From the glass-walled retail impossibility that is an Apple Store, to the thousands strong crowds cheering his electric stage presentations, to the piece by piece construction of a company that could continue on in his absence, Jobs bent the world to his will. And when he couldn’t bend it, he ignored it, focusing instead on the next piece of his overall strategy.
Without the creative mind of Shigeru Miyamoto, the modern videogame landscape might have risen to the same economic heights, but the cultural gaming touchstones would certainly have been very different. Mario and Luigi, Bowser and Yoshi, armies of Pikmin, Donkey Kong, and for God’s sake Link and Zelda, all are the brainchildren of a man who for decades has emphasized player experience above all else.
And with Miyamoto, the vision doesn’t stop at simply inventing game characters, who, though iconic, are significantly less complex than your average film protagonist. His design ideas made their way into physical control devices, hardware specifications, and gameplay mechanics, all in service of making a Nintendo game feel and play better than its contemporaries.
The PC. So thoroughly has Bill Gates affected an industry that his company and its software became synonymous with the device itself. Gates’ great vision was not in products, an area where his rival Jobs excelled, nor was it in the corporate structure that Jobs made the capstone of his second tenure; it was his relentless pursuit of a computing world that accepted nothing less than Microsoft everywhere. And with that vision achieved, he left the driver’s seat to pursue one of the largest philanthropic efforts in history with the Gates Foundation.
Formerly of Lionhead Studios, now CEO of 22Cans, Peter Molyneux might seem out of place in a list with industry titans such as Jobs, Gates, and Miyamoto, who each defined their industries on nearly peerless levels. However, with his broad, shoot-for-the-moon design choices, and his incredible ability to hype a product only to see it so often fall short of the goal, Molyneux represents the visionary who continues to reach for his ideals in spite of endless technical limitations and financial constraints. Where the others worked within the bands of what was possible at the time (with the possible exception of Jobs, who on a few occasions surprised an audience with tech that appeared so impossible it was assumed to be faked), Molyneux works in the realm of the wide-open dream. His products reflect whatever the technology is capable of at the time, though his vision is always much, much bigger.
And All of Us
Above all, each of these visionaries was a creator, a builder of realities that, before their input, did not exist. Quality, originality, or perseverance may have helped each stand out amongst his peers, but creation ties them all together. That’s vision: to conceptualize a possible future and to distort today’s reality into your tomorrow.
On this week’s Critically Speaking podcast: There are few things more powerful in human history than the ability to see a possible future and to pull it into reality. But how do we define such an ability? Who possesses it, and how can we recognize it before it’s too late? This week, in a joint DYHAMB? effort, Scott and I discuss what it takes to be a visionary.
When I started this document, I meant only to give a few impressions of the new iPhone 5S. But as I used the phone every day, I found more and more that needed to be said. It’s not a comprehensive review, but I think it touches on some subjects that other big-name sites may have missed.
The first impression is absolute sorcery, though not quite to the level of “You had me at scrolling” or retina displays. Those additions changed not only the way I thought about using my phone, but what I thought was possible with technology. And though I see TouchID as a real, useful feature, it doesn’t—in its current state—change everything.
Upon further use, some drawbacks arise: the phone flat on a table, upside-down use, sliding the grip, retraining prints (which I actually should not have done as much as I did; it learns over time if you stick with it). Apple claims the reader works at 360 degrees, but the easiest way to confuse the sensor is to use it with your finger in a different rotational position. I’ve seen reviews that claim an 80–90 percent success rate. My daily use feels much closer to 60.
Overall, it’s amazing tech with a useful application but tenuous real world reliability.
3D games are simply incredible, especially in the stat that matters most: frame rate. Just as a phone UI is transcendent at 60fps, so too is gaming. Textures still leave something to be desired, but that may be more a function of limited storage. On this note, the 5S will be the last 16GB model that I buy. Hopefully by the iPhone 6 and beyond we’ll be looking at 32, 64, and 128GB models.
Burst mode is great, but it still fails the ultimate “regular person” camera test: no-flash pictures of the kids playing inside the house. Low light is worlds better than the 4S. I don’t shoot much video and have little interest in doing so for slo-mo, though the effect looked great with the [marching band] I work with.
The bigger screen is great to look at, especially for games. I still want a screen that’s wider relative to its height (in portrait orientation); the 16:9 aspect ratio just seems too skinny. And, to counter the usual argument about one-handed use, Apple already crossed that threshold with the 5/5S. I can’t reach the corners without shuffling my grip, and the tall shape of the phone shifts the balance such that I’m more nervous about dropping it. That said, now that I have the larger screen, I would not want to go back. I assume that the obvious reasons are further reinforced by how thin and light the newer 16:9 devices are.
Build and Design
I missed the iPhone 5 design discussion and have a few points. I’ve always considered the iPhone 4 to be the culmination of the original iPhone’s design. The 5/5S is something different. Many have said that the 5C is Ive’s hardware to match iOS 7. I’d argue that the 5S form factor matches just as well, if not better. Of all the previous designs, this one feels the most like you’re just holding a sheet of glass, upon which your apps come to life.
At this point, there’s little reason for me to try to recommend an iPhone to a new user. Now certainly that is not because the device is lackluster or deficient in some way. It’s quite the opposite, in fact. There is no phone on the market that I would more strongly advise anyone to buy. Apple’s build quality and polish of both hardware and software are still in a league of their own, even with iOS’s remaining growing pains. The problem with recommendations and the iPhone is that so many people have already made up their minds about the product, the company, or its users.
The real challenge for Apple with the 5S is not to make a great device; they’ve done that with flying fluorescent colors. Now they need people to look at it as new, and not just more of the same. With the media at the peak of Apple antipathy, that could prove quite the difficult task.
On this week’s Critically Speaking podcast: When is a console not a console or a PC not a PC? Valve’s announcements this week only further cloud the picture of living room, gaming, and hardware. And though they talk a lot of peace, love, and mods for everyone, questions still abound. Plus, Scott and I check in on DayZ standalone, the iPhone 5S, Real Racing 3, Mark of the Ninja, and Starcraft 2: Heart of the Swarm.
Few games from my formative years left a psychological footprint equal to Myst. Plenty of Zelda and Mario, later Final Fantasy, but Myst, its sequel, Riven, and the three accompanying novels (particularly The Book of Ti’ana) loom larger than any of those console giants. It may have been because of the emphasis on exploration and storytelling—there was in fact little else to the games besides these things. Or it might have been the incredible (for the time) music and sound. Whatever it was, much of my early thinking about books and the future of videogames began with that lonely island, with one spire of a tree lancing up into the sky, and a half sunken ship floating next to the dock where I awakened.
If you had a similar background with the Myst games, Emily Yoshida’s Grantland article is worth giving up a bit of your morning to read. I’m glad I did.
On this week’s Critically Speaking podcast: Can a medium define a century? That’s the question Scott and I attempt to unravel as we challenge the recent Manifesto claiming that games will be the 21st century’s dominant form. Plus, GTA V gets a release, and a boatload of cash, while Steam cooks up something to surprise us with next week.
The real trouble starts outside of Google. Which 64-bit processor? Intel’s (the company says it will add 64-bit “capabilities” to Android)? Samsung’s? Qualcomm’s?
Who writes and supports device drivers for custom SoC modules? This sounds a lot like Windows device driver complications, but the complexity is multiplied by Google’s significantly weaker control over hardware variants.
Gassée makes some great points about the industry reaction to Apple’s new A7, including this one concerning Android’s problem with the move to 64-bit.
From what I’m able to gather, the great improvement in performance is largely not because of the 64-bit architecture itself but from the myriad changes, optimizations, and upgrades that go along with the transition.
It may be splitting hairs, but the A7 going 64-bit is the source of the improvements. Once again we have nerds contesting Apple’s move forward based on purely technical and pedantic arguments. To a layman (read: most of the people who will purchase the phone over the course of its life), “A7 is 64-bit” connects to “A7 is twice as fast in many cases” just fine. It turns out there’s a technical argument to support this assertion as well.
The nine million iPhone units sold over the launch weekend nearly doubles the “over five million” units of the iPhone 5 sold in its opening weekend last year.
Think about that for a second. Imagine how long it would take a person to assemble an iPhone. Now multiply that times nine million. The sheer amount of work that goes into assembly for a launch like this is simply staggering.
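To put a rough number on that scale, here’s a back-of-the-envelope sketch. The hours-per-phone figure is purely an assumed value for illustration; actual assembly times aren’t public.

```python
# Back-of-the-envelope scale of a 9-million-unit launch weekend.
# HOURS_PER_PHONE is an assumption for illustration, not a reported figure.
UNITS_SOLD = 9_000_000
HOURS_PER_PHONE = 24          # assumed hands-on assembly time per unit
HOURS_PER_WORK_YEAR = 2_000   # roughly 40 hours/week for 50 weeks

total_hours = UNITS_SOLD * HOURS_PER_PHONE
person_years = total_hours / HOURS_PER_WORK_YEAR

print(f"{total_hours:,} person-hours, or about {person_years:,.0f} person-years")
```

Even if the real per-unit time were a tenth of that assumption, the total would still exceed ten thousand person-years of work.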
Koppaka’s argument is that the switch to 64-bit isn’t so much to benefit iPhones and iPads (not right now, anyway), but rather for laying the groundwork to usher in a new version of Apple TV, which already runs on a modified version of iOS.
I find this idea very intriguing, and I’d add this as well: if Apple chooses to make the set-top box an AAA-caliber console, they could afford to start with a performance disadvantage. Why? Because surely Apple will want to rev its set-top box far more frequently than Sony or Microsoft or Nintendo does.
Imagine the difference between, say, a PS4 and a revision 3 of this hypothetical Apple TV box. Even if they only upgrade every two years, Apple’s bound to pull ahead. And each time, at their low price, they’re much more likely to sell another box to a repeat customer. Maybe the previous one gets sold to Gazelle, or maybe it just gets put in another room with another TV. Either way, this could be a good way to increase the allure of a box that would probably focus on video delivery first and games second.
On this week’s Critically Speaking podcast: Suddenly, it seems, everyone’s gunning for the living room in earnest. Microsoft and Sony are already there with Nintendo wandering a bit lost. Apple looms on the outside, apparently waiting for the right moment to strike. What does the future of the console look like? Scott and I discuss.
In the App Store app, there’s a special tab. When it first appeared, I was delighted, though after a few minutes I found it slightly less useful than it could have been. That tab is the “Purchased” tab, a list that contains every app purchase I’ve made since the store opened. Some of the items there are still on my iPhone or iPad (albeit different versions than when I first picked them up), and some have long since been deleted and forgotten.
One of the earliest apps to appear there, and one that until today remained on my first home screen was Silvio Rizzi’s Reeder. I’ve written about Reeder before, and would still recommend it as one of the finest RSS feed readers available. But it’s no longer the best. That honor now belongs, perhaps unsurprisingly, to the brand new Reeder 2.
After a long series of anticipatory tweets and posts by its author, Reeder 2 is finally available on the App Store, and by all my measures, it’s great. At a glance, Reeder 2 looks and feels very much like the original Reeder sans the heavy iOS 6 styling and textures. I’d be happy if it only went that far, but the new version has some nice user experience improvements that bring it in line with iOS 7. Animations abound, as they did with Reeder, but version 2 takes that a step further with iOS 7-style physics that bring the sliding, bouncing panels to life.
Typography and readability are also improved, and in an app focused primarily on reading, their importance cannot be overstated. Everything looks clearer and breathes a little better, even on the now comparatively squat iPhone 4S.
Gesture navigation has also been expanded. Reeder famously used swipes and slides to mark items as read or favorite, to move between view types, and so on. With Reeder 2, all of that is supercharged. There are more gestures available, with better consistency and much better response. This app is blazing fast. Many apps under iOS 7, and indeed iOS 7 itself, can often feel a bit more languid than their predecessors. Reeder 2 has none of the gelatinous, long-duration animation that iOS 7 seems to revel in. Animations are quick, fluid, and bring the content into view right away; there’s no waiting.
The app, now universal, is $4.99 on the iTunes store. If you’ve been using the original Reeder even half as long as I have, you’ve probably gotten your money’s worth. I highly recommend that you support the developer and buy the new version, even at its App-Store-premium price point.
When Apple holds an event, my RSS feeds (unsurprisingly) fill up pretty quickly with hands-on impressions, commentary, and recaps. As I was working through them, and seeing similar opinions throughout, I came upon this, from Marco Arment:
The iPhone 5C isn’t the new low-end model: it’s the new mainstream iPhone. It’s the one Apple’s promoting more, marketing more, and making available for preorder. This is the new iPhone, and as customers and the press have repeatedly shown, a new external design is all that really matters when defining “new”.
What if now we have an alternating tick-tock, wherein the “c” model receives external design changes on the years opposite of the new numbered model? For instance, this year we had a new iPhone 5c. Next year we’ll have a new iPhone 6. So, when the top end model gets an external redesign, the midrange model upgrades internals. When the top end model upgrades internals, the midrange model gets a new external design.
Such a strategy would certainly alleviate the consistent complaining of tech writers and Wall Street every other year. Some will take note of the last-year’s-internals-in-a-colorful-case approach that seems to be the 5c’s philosophy, and those same folks will probably feel the same way about future c’s. At any rate, it’s an interesting position for Apple in the never-ending fight against an underwhelmed audience.
The real story, in my opinion, is the one worth a thousand words on each of those aforementioned tech blogs but not getting near that sort of coverage — the new camera in the iPhone 5S. The camera? Yes, the camera.
Great piece on the possible impact of the photography capabilities of the new iPhone 5s.
If you haven’t yet listened to my podcast with Scott Boren, now would be an excellent time to begin. It’s a good one. Our special guest Brendan Keogh has all the details about Press Select, the new publishing label for longform videogames writing. In addition to the twenty-questions routine by myself and Scott, the group takes on the topic of whether or not games should be measured by the metric of “fun” as we examine “Playing Outside” by Leigh Alexander.
Nintendo is what you see and hear on the screen and what you hold and feel in your hand.
I disagree with Ritchie on his following point that Nintendo never made good controllers. In fact, I’d argue that they were unparalleled up until the Xbox 360 controller, but I think John Siracusa’s covered that well enough. But man, a part of me really wants this to happen—for Nintendo to make controller peripherals for iOS (not only for their own games but for the whole ecosystem).
The combination of Apple’s device hardware and Nintendo’s peripheral design experience would be great. Each addresses the weakness of the other with a strength hopelessly out of reach of either company on their own.
Apple, though brilliant at beautiful device design, has never been especially adept at creating the bulbous ergonomic shapes that well-made controllers (or even a proper mouse) require. The iPhone works because it has a very simple interaction model—touch something onscreen and it produces a reaction.
The iOS touchscreen paradigm is only effective when that model remains simple. When challenged with even mildly more complex control schemes (such as the fairly basic Terraria) the magic fades as quickly as it appeared that first day Steve flicked a contact list and “had” us all “at scrolling.”
Nintendo, especially pre-Wiimote, has always been a company that designs for the hand and for the gameplay experience derived from it. Its CPUs and GPUs are generally considered inferior, even when they aren’t (as with the GameCube).
In addition to supporting its own business, Nintendo could plausibly benefit both itself and the iOS developer ecosystem by providing what would almost certainly become the standard for peripheral gaming hardware on iOS. If you’re a game developer, who better to base your game’s controls on than Nintendo—especially if their hardware had a significant portion of iOS market coverage?
I’m not sure how likely such an alliance would be, and Nintendo might have to fall quite a bit further before it would make such a move. As a gamer, though, I’d find it a welcome departure from the universally horrible virtual controls many games opt for.
In the lead up to Tuesday’s event, the internet is already alight with the sounds of preemptive disappointment. When these events come around, I always wonder what Apple would need to release for the audience to be satisfied. Now, certainly you can’t please everyone all of the time, but with more and more Android devices sold worldwide, the voices of apathy seem bit by bit more credible.
Don’t get me wrong; there’s little that Apple can do to defend against such a broad array of competing devices, but that’s not Apple’s M.O. It instead focuses on a product vision and sticks with it. That vision changes with time, of course, but in a gradual, well-considered way. Reactionary behavior isn’t in Apple’s DNA, and that is a recipe for success in almost any endeavor. And it’s important not to confuse aversion to reactionary decisions with hyper-conservatism.
Apple builds on a carefully planned idea, iteration by iteration. The iPhones at Tuesday’s event will most certainly be an extension of that strategy. Anything else would be cause for concern to anyone truly familiar with Apple.
Make two great games for iOS (iPhone-only if necessary, but universal iPhone/iPad if it works with the concept). Not ports of existing 3DS or Wii games, but two brand new games designed from the ground up with iOS’s touchscreen, accelerometer, (cameras?), and lack of D-pad/action buttons in mind. (“Mario Kart Touch” would be my suggestion; I’d buy that sight unseen.) Put the same amount of effort into these games that Nintendo does for their Wii and 3DS games. When they’re ready, promote the hell out of them. Steal Steve Jobs’s angle and position them not as in any way giving up on their own platforms but as some much-needed ice water for people in hell. Sell them for $14.99 or maybe even $19.99.
Though I don’t agree with Gruber’s overall strategy that Nintendo should release games for iOS, if they did, this would be my favored version. I am not a fan of Square Enix’s similar approach, but I honestly think that, these days, Nintendo is simply in a higher tier than Square Enix.
Gruber also suggests that Nintendo lower prices on these hypothetical titles when the next year’s version comes out. Another great point similar to the iPhone’s $199–$99–$0 with contract phase out concept.
Great piece on Polygon about Upper One Games Company, a developer focused specifically on the culture and mythology of northern Native American groups. This is exactly the kind of progress the industry needs; the more underrepresented groups that start their own development houses, the more likely that content reaches an audience.
On this week’s Critically Speaking podcast: Harassment is no joking matter, but those who are most often the perpetrators see it as just that: a joke. With an industry already under intense pressure from social and political forces, threats of violence from the inside only make matters worse. And since everyone’s in a critical mood, why not speak to the faults of Guild Wars 2, Saints Row 4, console release dates, and the Diablo expansion?
A lot of great points in this post from Ben Kuchera on short form games. It boils down to this:
This attitude that anyone who doesn’t buy shorter, higher-concept games are “part of the problem” is wrong, wrong, wrong. People buy and play video games for many reasons, and it’s not up to us to cast judgment on them.
I have to think, though, that the “sneering critic” from Kuchera’s post is probably a rare bird. The most common situation I see is a critic responding to an attack from commenters about how these short games aren’t “worth it,” by which they tend to mean “worthless.”
Short games are sometimes worth it, especially the high-concept critical darlings that spawn these debates. Long games are sometimes a part of the problem, especially when the hours added tread narrative water.
Did you know that many words don’t mean what they “traditionally” mean? And by “traditionally” I mean the definitions one might find in, say, a dictionary, as opposed to the natural evolution of language. Said evolution is in and of itself often quite eyeroll-inducing (see additions of “irregardless” or Internet acronyms like “lol” to the dictionary), but not so much as when people misuse words for their own purposes.
Their purpose, of course, being the desire to make their opinion on something an objective fact—it’s not enough simply to dislike something, after all. One’s dislike needs to be based on provable evidence, so anyone’s disagreement can therefore be said to be objectively incorrect. Because argument on the Internet is serious business.
Thus the misuse gets spread far and wide and becomes generally accepted as a new definition. No one questions it, lest they get accused of being a “grammar nazi”—which is as bad as being a real Nazi, I’m led to believe—and nobody wants that.
So, a quick primer on what some common words and phrases actually mean on the Internet. You’ll thank me later.
So, Ballmer retired. You’ve heard it already, are probably even exhausted by all of the writing and posting that’s been done on the subject. I’ll try to keep it short, then.
The writing has been on the wall for Ballmer’s end (at least according to tech pundits) for as long as I can remember, maybe even since his tenure as CEO began. But why has the tech press always had it out for this guy? Well, many a writer would mention things like Ballmer’s business focus, his ability as a steward, and his failure as an innovative or disruptive force (which translates to an inability to take successful risks). I’m going to call it right now: all of that is a smokescreen. The more I think about the Ballmer news, the more it seems to me that it was never truly about his business practices or technology leadership. Ballmer’s just not one of us.
Think about it. Ballmer, for all his time spent in the industry, is not a nerd, not a geek, not at all. When he got excited about “Developers, developers, developers,” it seemed insincere and out of place, despite his well-known physical gesticulations. When the iPhone came out, and all of the geeks (yes, even Windows fans and the tinkerers who would become Android stalwarts) went nuts about how this thing was like stepping out of a time machine, so far ahead was its design, Ballmer was simply head-in-the-sand clueless. I know little about Ballmer outside of his public role at Microsoft, but he doesn’t strike me as the guy basking in the monitor’s glow, cranking out a script to automate, well, anything.
That guy is Bill Gates: the technologist founder, the code wizard, bespectacled and stooping, frog-voiced and decidedly unathletic in appearance. Gates we could understand because he was like us. Now, that’s not to say that Gates is literally like us. Geeks, nerds, and technologists come in all shapes and sizes, of course, but most have tendencies at some level that remind us of Gates. For many, however, he represents the worst qualities of being a geek, the things about ourselves we would rather not see.
His superior logic and trademark voice make him grating and abrasive in an argument, where Ballmer simply booms. And when Gates is wrong, the world seems to point and laugh, just like middle school.
Really Bill? We’re all gonna be using tablets in ten years. Have you seen those things? Yeah, sure Bill.
And that brings us to Jobs. Every once in a while, geeks run into that guy, the one who is not only interested in the same things they are and as smart as they are, but also has an easy charisma and charm that disarms, even enraptures, non-tech types. The world seems to always go this person’s way. He (or she) can make the same argument that the introverted, glasses-and-pocket-protector type does, and all of a sudden it becomes a brilliant flash of light shed upon the world.
Wait, so you’re saying tablets are post-PC devices. They’re what’s next. Yeah Steve, we get it. Quick, buy millions of them and run this guy’s stock through the roof. He’s a genius!
Meanwhile, the Gates type called it first, but everyone was so busy poking fun at him that they lost the point. Now, of course, execution and product quality had a lot to do with how the tablet market turned out, and you can argue about who really won the debate, though the market winner is clear. But for our purposes, it doesn’t matter, because Ballmer wasn’t even involved in the discussion, not really.
Jobs could take risks and seemingly bend the world to his will. Gates was the smartest guy in the room but lacked the charisma to hypnotize an audience. Ballmer is from another planet. With him out of the picture, a CEO more in line with Jobs or Gates, or simply someone who appears to focus on technology through business (someone like Tim Cook) rather than business through business (Ballmer) would do a lot of good for Microsoft. Ballmer did what he could with the company; unfortunately, it seems tech was never really his passion, though Microsoft as a company clearly was and is.
Earlier this week, Shawn Trautman posted a question he’d received on Discovergames from an indie strategy game developer. The title is for iPad only, and Shawn doesn’t have access to one, so I thought I’d post my thoughts on the game.
Strategic Leap is a checkers- or chess-like game for the iPad in which the pieces have restricted movement and capturing abilities, and the player has a bank of spells that allow him or her to add or remove spaces on the board, access power-ups for increased scoring, and a plethora of other gameplay enhancements. The result is a lively and interesting take on an age-old board game premise.
Let’s start with the icon. The HD tag seems unnecessary when the only option is for iPad. Sure, it may denote Retina support, but at this point, Retina support is expected. Secondly, the icon frame will be a problem when iOS 7 hits, though I’m sure the developer is looking out for that bit already. On the plus side, the icon communicates the game’s personality well.
Next up is the tutorial. At first, I felt overwhelmed by the sheer number of items in the tutorial; however, the process is much quicker and more intuitive than it first appears. All the while, the tutorial text injects a light and fun voice into the proceedings, urging you along and congratulating you on completing the tutorial tasks.
When it’s time to actually play the game, there are three modes: puzzle, campaign, and multiplayer. I wasn’t able to check out multiplayer at this point, though it seems fairly straightforward. Based on the actions available in the tutorial, I began with puzzle mode.
Once inside the puzzle mode, the game presents yet another tutorial, this one about sliding pieces and scoring. The lesson was just short enough to avoid frustrating me with its string of friendly educational levels. Then the game really begins. Sure enough, the first section was easily completed, and easily mastered with three-star ratings on the first handful of levels. But just about the time I started to feel confident, the difficulty ramped up. Each task was still easily completed, but the simple three-star wins took a good deal of figuring to achieve. It was a pleasant ramp-up that felt natural and challenging, rather than abrupt or punishing.
In the campaign, the player is confronted with an opponent, complete with avatar artwork and a silly character description. The bios serve to reinforce the game’s lighthearted tone. The levity is a welcome touch in the face of so many puzzlers that resort to trance-techno laser lights and thumping soundtracks.
After the opponent selection, it’s off to battle. I selected Homer. He seemed an easy enough target. Sure enough, after tapping on his info card, I was presented with yet another tutorial. And, just as before, the instruction was simple and quick. On to Homer.
At this point, the game’s many variables finally came into focus. After a number of rounds dealing with Homer, I felt like I had a handle on the systems. Unfortunately, that’s about as far as I tend to go with games of this type. For puzzle fans, there’s quite a lot more to dig into with Strategic Leap. And for a group of friends, the multiplayer mode is ideal.
Best of all, the game is free, and blessedly free of any advertisements, up-sells, or in-game currency. If you have an iPad and any inclination toward puzzle games, it’s worth a look on the iTunes Store.
This time around, we should expect an iPhone 5S that comes with upgraded components but the same old design. Groundbreaking, the iPhone 5S will not be.
Can you feel it? It’s S season. You know, that time every other year when Apple releases a (by trend) really excellent update, and the tech press writes hundreds of nearly identical stories about how it isn’t groundbreaking.
Well, like I say every time S season rolls around, all these writers seem to care about is physical design. And here we have a prime example wrapped in the guise of a Mac Pro anticipation article. Let’s see what Reisinger has to say about the upcoming Mac Pro.
But the Mac Pro is groundbreaking in every sense of the word. The device, which is incredibly small, promises to deliver the kind of power and functionality that we’ve yet to see from Apple. And its design? Well, let’s just say that the new Mac Pro is arguably the best-looking desktop it’s ever launched.
Yes, Reisinger goes on to talk about ports and power again later, but this section sums up his argument as well as anything in the article. The Mac Pro has always been the most powerful Mac available, and is always in a class above all but the most custom PCs (except for the most recent, admittedly old/new update). So what power is the Mac Pro bringing that is thus far unheard of from Apple? Nothing. The Mac Pro will certainly be powerful, impressively so if the marketing turns out to be true, but Reisinger’s argument essentially boils the Mac Pro down to looking really different.
If the Mac Pro had been announced exactly as it is today but in a case that resembled the aluminum tower style, Reisinger would have little to fill his article with. I worry that the overall reaction to the Mac Pro and iPhone 5S will continue down a similar path. Sometimes I wonder if, even after all this time, tech pundits still see Apple’s only real strength as making pretty hardware. When they don’t have a new look, the press calls it boring, regardless of internals or features. When the look is new, the product is instantly more “innovative” than the last.
Certainly the Mac Pro is interesting, maybe the most interesting thing Apple has released in a while, simply because it is so different. However, the oncoming swarm of articles about how the iPhone 5S is uninteresting because it has the same outer casing as the 5 is not something I look forward to.
On this week’s Critically Speaking podcast: why do we need all these blinking lights, ammo readouts, and inscrutable meters plastered across our games’ content? Scott and I ponder the HUD and whether or not it’s just better to go HUDless. GTA Online gets some airtime, though I’m still dubious, and Scott unearths some intel on Everquest Next’s PC-first strategy. Oh, and Saints Row: The Third.
I write on the iPad, a lot. In fact, some of my most viewed articles revolve around the concept of writing on the iPad. Of course, there are some limitations to writing from this device that I may never overcome—typing iOad instead of iPad because my fingers can’t seem to find the right placement, even after several years—but the sheer convenience of the iPad as the ultimate carry-and-use writing computer is simply too tempting to ignore. Not to mention, I love testing out new apps and writing about them (often using them to create the very articles I’m writing, as I’m doing right now).
When I first read about Editorial on Macdrifter and then later on Macstories (where Viticci is in his element for about 25k words), I was intrigued but worried that this new writing app would be a little too x-callback URL complicated for my uses. What I found was that Editorial can be as complicated as you want (seriously, just look at the table of contents on Viticci’s article) or what appeared to me to be “just complicated enough.”
In some ways, I was right. Editorial is plenty complicated when you want it to be. There are text snippets built right into the UI. On OS X, TextExpander has been a must for my day job, in which I often spend several consecutive hours commenting on student essays. Snippets for reminders of common mistakes are a godsend. When writing on iOS, anything that speeds up the typing process, especially if it involves uncommon characters or formatting, can be absolutely essential. With Editorial, there’s a snippet for everything, and if there’s not, you are free to make one for yourself.
Above that, though, is the bonus keyboard row that has become a staple for iOS writing apps. In Editorial, its usefulness cannot be overstated. All of the common Markdown keys are present (Editorial has brilliant in-document Markdown support) and function as they do in Byword; pressing the open paren key also types the close paren and places the cursor in between. When selecting text, typing the open paren places the open and close on either side of the selected text. And oh, the text selection! The bonus keyboard row functions as a text selection area, as seen in this video from a while back. Now, this isn’t new. Many apps have tried to replicate the one-finger-for-cursor-placement, two-for-selection method, but in practice they’ve failed me spectacularly. Not Editorial. For the first time on iOS, text manipulation, copying, pasting, and moving feels complete. You’ll miss it in every app that doesn’t have it.
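For the curious, the paren-key behavior described above is easy to model in a few lines of Python. This is purely a sketch of the behavior for illustration, not Editorial’s actual implementation; the function name and signature are my own:

```python
def type_open_paren(text, sel_start, sel_end):
    """Model of the bracket-key behavior: with no selection, insert "()"
    and place the cursor between the parens; with a selection, wrap it.
    Returns the new text and the new cursor position."""
    if sel_start == sel_end:
        # No selection: insert a matched pair, cursor lands between them.
        new_text = text[:sel_start] + "()" + text[sel_start:]
        cursor = sel_start + 1
    else:
        # Selection: wrap it, cursor lands just past the closing paren.
        new_text = (text[:sel_start] + "(" +
                    text[sel_start:sel_end] + ")" + text[sel_end:])
        cursor = sel_end + 2
    return new_text, cursor
```

Calling `type_open_paren("link text", 0, 9)` wraps the whole string as `(link text)`, which is exactly the kind of shortcut that makes Markdown link syntax painless on a touchscreen.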
Another place where Editorial excels is in its use of sliding panels for the various views for research, file selection, and Python support (I’m gonna leave that section to the professionals). Other apps have incorporated these features, but combining them with the powerful text expansion and workflow pieces of Editorial takes research, linking, and information collection to a whole other level. In addition, the app is fast and fluid, never feels like it’s pushing the iPad’s limited resources, and just gets you where you need to be quickly and reliably with very little visual clutter (especially when you consider the expansive list of possibilities within the app).
By no means is this recommendation exhaustive. In fact, it may even do Editorial a disservice, considering how many features and abilities the developer has implemented. However, if you use your iPad for any sort of text creation beyond the occasional Twitter post or iMessage, or if you’ve been interested in pushing the envelope of what your tablet can (effectively and efficiently) do for serious work, Editorial is the app you’ve been waiting for, perhaps searching for. Apps like this are a big part of what makes iOS so exciting. It’s just brilliant.
Minor Spoilers for Bioshock Infinite’s opening section
Is this what we are? Is this what it all comes down to: a man with a hook/ratchet/winch pulling pieces of people’s faces apart? Of course, based on seemingly random patterns, the man might cave in an enemy skull, or simply puncture it with the hook, or fail in the first attempt only to cock back and swing again until the wonderfully rendered human face no longer returns to the frame. Regardless, the result is the same. The player-controlled protagonist bloodily rampages through an intricately designed world crushing, puncturing, and ripping until enemies cease entering his field of view.
Honestly, add wrench, chainsaw, rifle stock, fist, fist with brass knuckles, baseball bat, what have you, and soon enough you’ve got a description that fits a staggering number of AAA titles released in the last five to ten years. Essentially it’s the formula for the modern shooter: shoot to kill, melee if you can no longer effectively shoot. The closer the enemies get, the gorier the outcome.
Now, I’ve written about this before, about how NPCs (and enemies especially) can be viewed as little more than digital flies to be swatted in creative ways by the player. And this makes sense in games like Halo, Saints Row, or Dishonored. The player’s goal is to eliminate the enemy (or possibly incapacitate, in the case of Dishonored), to fight through a series of conflicts with a growing array of weapons and skills with which to dispatch the oncoming enemy forces.
But in my recent play-through of Bioshock Infinite, I found myself not only shocked but repulsed by the in-your-face gore and violence—nearly to the degree of shutting the game off entirely. Of course, a laundry list of reasons kept me playing in spite of my initial reaction, but it is the moment of shock that I’ve chosen to discuss here.
Infinite is a first person shooter. It says so right on the box, or in the game’s Steam description. Why should I then expect anything different from the countless other shooters that flood the industry almost daily? Why was I unfazed by Halo’s Covenant multitudes or the stream of Helghast in Killzone, and yet somehow deeply unsettled by the first Columbia police officer to come into contact with Booker’s sky-hook?
Contrast and expectation.
The opening sequence of Infinite is similar to its predecessor: a darkened sea, a mysterious lighthouse with various mantras and maxims scattered about. Even the rain and cultish leader echo each other. But whereas the original Bioshock sends the player straight into the demolished underwater world of Rapture, where crazed splicers fling themselves at the player from the very first moments, Infinite takes its time.
Booker enters Columbia while its citizens are still at peace. After an eerie baptismal initiation (that arises again cleverly near the game’s conclusion), Booker emerges among the floating platforms of 1800s-styled Columbia, a city in the sky. Columbia’s people mill about as they prepare for a seemingly exciting annual event: the Raffle. There’s a carnival that gives the player access to Infinite’s various mechanics as a series of carnival games. Booker fires guns at cardboard cutouts of the Vox Populi, who at the time seem to be some sort of rebel gang terrorizing Columbia’s citizens. He tests out tonics, which give him magical abilities and control of the elements. He even sees a late-game enemy on display as a carnival attraction. All the while, Columbia’s citizens chat and gossip, ask him questions and ignore him, eat cotton candy and peanuts, play with their children. It’s all very idyllic, with a clear undercurrent signaling some sort of coming upheaval, around which Booker’s story (and the player’s) will most certainly orbit.
Nothing too out-of-the-ordinary so far. We’ve got an intro, a tutorial section, and the makings of an inciting incident. It’s Gaming Plot Mechanics 101. However, unlike the first Bioshock, which introduces the player to combat through trial by fire (electrical shock, actually), Infinite continues with its raffle chatter and city gossip as Booker learns about vending machines, Columbia’s currency, health boosts and salts (mana) replenishers, turret possession, and the mysterious energy that powers the city’s clockwork horse carriages. It’s all very immersive, richly detailed, and for me, downright fascinating.
But all dreams end with the sleeper awakened. Columbia’s, Booker’s, and the player’s end when the tutorials and lore are exhausted, and the game’s (sometimes a little too obvious) god rays come shining down through the foliage around the Raffle stage. A crowd has gathered in anticipation, and Booker is just in time. He unsurprisingly wins the Raffle and is presented with a choice: throw a baseball at the interracial couple who have appeared on stage (the crowd’s expected raffle reward) or throw the ball at the announcer (the player’s rebellious option that implicitly aligns with the cultural norms of the game’s audience).
After the choice, Booker is instantly recognized as the False Shepherd of Columbia’s prophecies, and the audience turns on a dime against him. Here’s where the face-ripping comes in.
The problem, or brilliance depending on how you assess it, is that up to this point, the people of Columbia have seemed pleasant, polite, and good (if a bit snooty). Now they’re clearly racist, which on its own is horrible. But does their racism merit tearing through them like so many gazelles in a lion’s jaws? Though even that comparison would have a purpose; at least the lion eats the gazelles (not that Booker should resort to cannibalism).
It may occur to some that the racists have now become enemies and thus fit under my “flies to the swatter” doctrine. Not so. In my previous examples, the Covenant are not peaceful, polite, irritatingly aristocratic ladies and gentlemen. They’re bloodthirsty, though sometimes comical, alien invaders. Ditto for the Helghast, who have their own weak attempts at humanization. Something about Columbia’s denizens is different. Something about Infinite is different. What is it?
Contrast. In most games, the desire to get immediately into the action drives the player into a frenzy of shoot, reload, shoot, reload, rest before having a chance to do much of anything else. Even a game like Mass Effect has clear interaction segments and clear combat segments. In Infinite, a peaceful crowd becomes a gruesomely slaughtered string of corpses in seconds. It happens fast, and it happens to NPCs who have been humanized from the very first moment. Never are even the guards seen as threatening or aggressive until Booker decides to throw the ball. Any more human and Ken Levine and his team might have convinced me to not play their game at all. How’s that for “Would you kindly?”
Though contrast does a great deal of the work in setting up the strikingly violent moments after the Raffle, in my case, expectation did the rest. Quite a long time has passed, and a good many titles, since Bioshock (or even the forgettable Bioshock 2). Those games are both well-steeped in gore. Rapture, where the first two games are set, however, is already a bloody mess when the player character arrives. Compound those two details and the expectation is that the protagonist will be adding to the violence that preceded him. Columbia, on the other hand, is clean; we’re talking Disney clean. Everything glows or shines, and the clothes are indicative of upper-class trappings and financial excess. Not only that, but the opening includes no true combat tutorials (just the seemingly innocuous carnival games). Add to that the distance between titles, and a chance to forget just how gory the Bioshock games are, and you have the makings of a stomach-churning moment.
Bioshock’s Atlas reveal and the ensuing gameplay segment are powerful but ring of the M. Night Shyamalan school of pulling the rug out from under the audience. Infinite achieves something equally interesting, if not more so. Under the right circumstances the game evokes an emotional and visceral (as in, gut-wrenching) response early enough that the player might reject the experience altogether. And if not that, the observant player is forced to come to grips with Booker’s extremely aggressive tendency toward violence.
This week on the Critically Speaking podcast, Everquest Next is—apparently—going to change everything. How do you know when to trust the hype? Also, violence and Bioshock, high-detail gore, desensitization, and the power of contrast.
From Kyle Orland’s review of DuckTales Remastered for Ars Technica:
But over two decades after the original game was released on the Nintendo Entertainment System, I felt ill-equipped to review this game in its correct context. To help with this problem, I’ve invited along a special guest reviewer to help me out today: eight-year-old Kyle Orland. Say hi to the nice readers, Kyle.
I played the original DuckTales for NES (it appears Orland and I are of an age), and recall finding it really difficult but fun. The remaster doesn’t hold much interest for me beyond a curiosity, but this piece is simply the most enjoyable videogame review I have read in a very long time, perhaps years.
It isn’t racist for a TV show, videogame or movie to have more Caucasian characters than ethnic minorities. If it is set in a country or a time period where the majority of the population are white, then it is logical to use primarily white characters. It doesn’t equate to…
The fundamental flaw in this argument, and a possible reason that the author may see themselves as “balanced,” is not in the examples themselves but in their structure.
The entire series depends upon its use of the singular: a game, a TV show, a film. Indeed, it would be an unnoticeable blip if there were just one game that emphasized one race or gender based on some narrative stricture centered on time or place.
Unfortunately, the problem is not with a single game but with an entire industry of games that overemphasize one race and grossly marginalize women. DiscoverGames’ statistics are surprising even to me, but I’d love to see the number of female characters in games who are not hyper-sexualized, disempowered, or otherwise stereotyped. I’m guessing it’s lower than the stock 15%, a lot lower.
When an industry produces such a trend, it is important that its consumers question the status quo. No, it does not mean that a particular game developer (or all of them) is racist/misogynist—until relatively recently, they may have even been unaware; it means that the industry has become racist/misogynist somehow and that the trend needs to be addressed.
Without discussing these problems, without highlighting them, the individual studios would just go on making the same games and adding them to a market already saturated with similar titles.