How Not To Launch A Game

By now, you’re probably painfully aware of the problems plaguing the biggest of Nintendo’s mobile apps, Pokemon GO. A crossover of sorts with Niantic Labs’ augmented-reality sensation Ingress, GO tasks players with venturing through a world, capturing Pokemon and battling them against others. The twist is that the world being explored is our own– literally. GO uses the player’s smartphone location data to track which Pokemon are nearby. So you might encounter, say, a Drowzee at your favorite all-night diner, or a Zubat near the tunnel you take to work. It’s a fantastic concept, and based on what I’ve played so far, it is an amazing game.

I wish I were playing it right now.

The game’s launch, which began Tuesday night in Australia and New Zealand and is ongoing as I write, has been an utter, unmitigated disaster. This is the kind of failure you might expect from a new online developer, but not from Nintendo, and certainly not from Niantic. In short, what should have been the opening fanfare of the series’ 20th anniversary celebration is rapidly turning into a bloodletting that threatens the Pokemon brand as a whole. There are three core problems plaguing the game right now, and each one is directly traceable to Nintendo or Niantic not doing something they should have. These are sins of omission, the kind of mistakes an experienced developer faces only once before correcting them in the next project.

The first, and probably the one people are most upset about right now, is a login procedure flawed to the point of uselessness. Upon starting the game for the first time, the player is asked to log in with either a Google account or a Pokemon Trainer Club (PTC) account. The PTC option is the better choice, because it is (ostensibly) tied into all of your other online Pokemon services– your Global Link account for the games, your online card game account, your real-world card game ranking, and so on. Unfortunately, this requires the game to authenticate against the PTC, which is a massive bottleneck; the single-sign-on server has been reduced to rubble since noon today. If you log in via a Google account instead, you might be able to get in… but your data is tied to your account, not your device. So if you had managed a good start on the PTC account, you would have to start from scratch on the Google account.

The solution is simple, from the standpoint of any developer who’s ever had to build a login pass-through of any kind: store the upstream login as an encrypted token in your database, and issue the user a token that corresponds to their data on your system, not the authenticator’s token. That way you go out to the authenticator only when you absolutely need to, rather than re-authenticating constantly. In addition, by not tying data to a remote authentication token, you can allow a user to link multiple login methods to the same data, in case they don’t have (or don’t want to create or use) one of the offered login options. Going to the authenticator is a costly action, and as a developer it’s your responsibility to minimize that cost. This is basic stuff for any kind of distributed system, not just highly complex ones.
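To make the pass-through concrete, here’s a minimal sketch of the approach in Python. Everything here is illustrative– `SessionStore`, the fake authenticator, and the 24-hour lifetime are assumptions for the example, not Niantic’s actual implementation:

```python
import secrets
import time

TOKEN_TTL = 24 * 3600  # our own session lifetime, independent of the upstream token's

class SessionStore:
    """Maps tokens WE issue to user data plus the cached upstream token."""

    def __init__(self):
        self._sessions = {}

    def create(self, user_id, upstream_token):
        token = secrets.token_urlsafe(32)  # the token the client actually holds
        self._sessions[token] = {
            "user_id": user_id,
            "upstream_token": upstream_token,  # would be encrypted at rest
            "expires": time.time() + TOKEN_TTL,
        }
        return token

    def resolve(self, token):
        """Validate a session locally: no round-trip to the authenticator."""
        session = self._sessions.get(token)
        if session and session["expires"] > time.time():
            return session
        return None

def login(store, authenticate_upstream, username, password):
    # The ONLY point where we pay the cost of hitting the authenticator.
    upstream = authenticate_upstream(username, password)
    if upstream is None:
        return None
    return store.create(upstream["user_id"], upstream["token"])
```

The key point is that `resolve` never touches the upstream authenticator; only `login` does, and linking a second login method would just mean mapping another upstream identity to the same internal `user_id`.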

But more than that, the existing login scheme violates an emerging mobile-app design trope: it asks for a username and password on every launch from the title screen (and the game frequently crashes back to the title screen, trashing its login token in the process, for reasons unfathomable to man and Pokemon alike). I can’t think of any mobile game released in recent memory that has used a username/password prompt as its everyday login. Most rely on secure identification provided by the mobile operating system, and if the user requests additional protection, it’s secured by the phone’s local authentication (your unlock code, for example, or a thumbprint or other biometric key). Smartphones are the last single-user environment in consumer computing, and many apps have tossed aside the archaic, error-prone username/password setup in favor of acting as if nobody else in the world even exists. Even my banking applications only ask for my password in dire circumstances; the norm is simply to accept a thumbprint as proof of identity and move on.
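The “log in once, then trust the device” pattern described above might look something like this sketch, where `secure_storage` stands in for the OS keychain/keystore and `device_auth_passed` for a biometric or PIN prompt– all hypothetical names for illustration, not any real platform API:

```python
# Stand-in for hardware-backed secure storage (Keychain on iOS, Keystore on Android).
secure_storage = {}

def first_launch_login(authenticate, username, password):
    """The only time the user ever types a username/password."""
    token = authenticate(username, password)
    if token:
        secure_storage["refresh_token"] = token  # persisted across launches
    return token is not None

def resume_session(device_auth_passed=True):
    """Every later launch: no credential prompt at all. Only a sensitive
    action would trigger the local device-auth check."""
    token = secure_storage.get("refresh_token")
    if token is None:
        return None  # genuinely logged out: fall back to the full login flow
    return token if device_auth_passed else None
```

The app then behaves as if its one user is the only person in the world, which is exactly the single-user assumption the platform already makes.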

(As an aside: I find it hard to believe that Niantic coded the app that way on purpose. What I think happened is that Nintendo, in a misguided attempt to protect users’ privacy, forbade both token caching and account linking. Worse, the PTC login token may have a uselessly short lifespan, on the order of fifteen minutes or so. That would explain things, but at that point, both companies should have realized that using the existing consumer-facing PTC login process would be a Very Bad Thing, and developed a similar process specifically for the app’s communications. An N-to-N solution, so to speak.)

The second problem is how the app went live. After the game’s open beta ended last week, speculation ran rampant about when GO would have its public launch. You would expect a reference here to whoever had the right day in the proverbial office pool, but in fact there wasn’t even enough time to set one up. The game was suddenly released in Australia and New Zealand on Tuesday night at about 10:30p EDT. There was no fanfare, no announcement, nothing; Niantic didn’t even register their support Twitter account until Thursday morning. In the official silence that followed, North American users swarmed to create Australia- and New Zealand-regioned accounts on the App Store and Google Play, or resorted to sideloading the game (downloading the app from an unofficial source and installing it manually, something common on Android but difficult on iPhones).

Setting up a regional rollout is not difficult, and that alone is not the issue here. But it was botched badly in this instance, because an obvious and elegant solution presents itself by virtue of the game’s nature. GO requires the player’s accurate location, right? So why not release the app as a “pre-load” in all regions, and then grant access based on location to keep the servers from being overwhelmed? This would allow geographically large regions– the United States, in this case– to be divided into cascading rollout zones. The simplest zone distribution would be by time zone, but other factors could inform that decision. I don’t think this has been done before in a mobile app, but it’s certainly something to consider.
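As a rough illustration of that pre-load-then-unlock idea, here’s a sketch of a server-side gate keyed on the player’s reported longitude. The wave boundaries and unlock times are invented for the example:

```python
from datetime import datetime, timezone

# Unlock waves for the continental US, roughly by time zone (west to east).
# These longitude bands and UTC unlock times are purely hypothetical values.
ROLLOUT_WAVES = [
    # (min_longitude, max_longitude, unlock_time_utc)
    (-125.0, -114.0, datetime(2016, 7, 6, 16, tzinfo=timezone.utc)),  # Pacific
    (-114.0, -102.0, datetime(2016, 7, 6, 18, tzinfo=timezone.utc)),  # Mountain
    (-102.0,  -87.0, datetime(2016, 7, 6, 20, tzinfo=timezone.utc)),  # Central
    ( -87.0,  -67.0, datetime(2016, 7, 6, 22, tzinfo=timezone.utc)),  # Eastern
]

def is_unlocked(longitude, now):
    """Return True if the player's reported longitude falls in a wave whose
    unlock time has passed; the pre-loaded app stays gated otherwise."""
    for lon_min, lon_max, unlock_at in ROLLOUT_WAVES:
        if lon_min <= longitude < lon_max:
            return now >= unlock_at
    return False  # outside any defined wave: keep gated
```

A real rollout would use proper geofencing rather than raw longitude bands, but the principle is the same: the app ships everywhere, and the server decides when each zone goes live.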

More damning than the rollout timing, though, was the radio silence from Nintendo and Niantic throughout the whole affair. These are not plucky indie developers forced to choose between addressing public complaints and fixing their game. They are big enough that they ought to have halfway competent PR groups. (Though given Nintendo’s PR catastrophes this year, I think we can say they in fact do not.) It is absolutely unacceptable to meet consumer queries with indifference bordering on apathy. I would have preferred even a somewhat hostile response over nothing; at least with a venomous reply, you know they actually saw your question.

Finally, and this is probably the most distressing fault I’ll address: the game has virtually no tutorial or information on how to play. There is no online manual, there are no on-screen guides, nothing. I honestly thought my game had soft-locked when I went up a level, because nowhere did the all-encompassing celebration screen say “Swipe to dismiss.” Using a Poke Stop (a waypoint that, when used, gives valuable items like Poke Balls and Revives) was similarly opaque. There is a short explanation of Pokemon Gyms when you first try to use one, but nothing comprehensive on what the Gyms are or what they do. There is a daily login bonus that grants some of the game’s real-money currency, but nowhere is it explained that the amount depends on how many Gyms are under your direct control. Battling is a rough affair: the Pokemon do respect type advantages and disadvantages, but how to get a Pokemon to use its special move is never explained. And in the absence of this information, players are falling back to approaching the game like any other Pokemon game, when it is definitely not meant to be just another main-series title.

I’m going to be perfectly clear: Pokemon GO is a good game, but it is not a traditional Pokemon game. Battling and trading are the focus of the handheld games. GO is not about that. GO is instead about exploration and collecting, and indirectly about area control in the real world. If you go into this game looking for battling and saving the world, you are going to be disappointed. Instead, if you view the game as an incentive for physical activity, similar to critical darling Zombies, Run!, the game becomes much more engrossing. The real-world aspect of the game may feel gimmicky, but it is integral to the game’s design. 

So having players approach the game as a traditional Pokemon adventure does it a disservice. If it were more obvious that going out to Poke Stops to restock your supplies is more economical than buying them from the cash shop, players would do that. If it were stated that controlling Gyms grants free cash-shop currency, players would do that, too. Even basic information like attack type matchups would be helpful. Without that information, the player runs a high risk of frustration and boredom– two things lethal to any game. The game is too complex to approach solely on intuition. A tutorial should be the next top priority for Nintendo and Niantic.

Pokemon GO is, I still assert, a good game. But it has not had a good start. I mentioned at the top that the catastrophe of this launch could poison the Pokemon franchise as a whole. That was not hyperbole. Think about how Star Fox Adventures was the first harbinger that its series would never again reach its former glory, or how Xenosaga Episode 2 killed the hopes of that series reaching its full conclusion. One game, if it’s bad enough– or perceived as bad enough– can ruin a franchise beyond salvaging. I think Pokemon GO has a real chance of undeservedly being that game. And should that happen, I will weep for it, dry my eyes, and move on. I just think it’s a little early to start digging that grave.

Here It Goes Again (E3 2016 and WWDC 2016)

On Monday at 10a PDT (1p EDT), Apple’s Worldwide Developers Conference (WWDC) kicks off with the customary keynote speech by Apple CEO Tim Cook. It’s long been tradition that the early-summer conference reveals the upcoming upgrades to Apple’s iOS and Mac OS X systems, and this year adds watchOS to the lineup. As a dyed-in-the-wool Appleologist (hail our Eternal Leader, the Jobs), this has always been something for me to look forward to, and this year is no exception– but for a completely different reason. We’ll get to that in a moment.

On the flip side of the equation, though, the Electronic Entertainment Expo (E3) also starts on June 14th (Tuesday) this year, much as the two events converged last year. Even though E3 proper doesn’t start until Tuesday, two of the Big Three– Microsoft and Sony– typically hold their major announcement events the day before. Microsoft begins the coverage at 9:30a PDT (12:30p EDT), with Sony starting at 6p PDT (9p EDT). Nintendo isn’t doing a major event, but will instead run its Treehouse Live stream all day Tuesday starting at 9a PDT (12p EDT). E3 has been winding down as a major show since the “pause” it went through after 2006 (incidentally, the only one I ever attended– and yes, I do bring that up more often than I should), with more companies either front-loading the majority of their announcements in their own venues, or simply skipping the show altogether in favor of more “open” events such as PAX or Awesome Games Done Quick.

Still, it’s been a tradition for the years that I’ve been blogging to go over each company and make some predictions, assertions, and otherwise look like a total nerd. Who am I to argue with a tradition that I set myself up for? To save space, though, and to have them all in the same place, I’m going to go over both WWDC and E3 in this post. Buckle up, kids, this is gonna get geeky.

Apple: iOS hits a major milestone this year with the inevitable release of version 10. This year’s refresh of the force behind Apple’s outstanding post-iPhone growth is not expected to be the revolutionary leap forward that iOS 4 or 9 were; instead, Apple is focusing on usability and minor tweaks across the board. Siri– Apple’s long-parodied digital assistant, who often requires assistance herself– is slated to get an API for third-party developers, allowing users to command Siri to handle tasks in apps beyond the default ones. Honestly, being able to ask Siri when my next bus arrives will be a godsend, as right now I need to tap through an incredibly unresponsive watch interface to get that info without digging out my phone. Speaking of the Apple Watch, watchOS 3 is slated to become more independent of the iPhone– a few months back, Apple began mandating that watch apps be able to do something without requiring communication with the phone. This will be a blessing, particularly if it’s not limited to the next iteration of the hardware (but who am I kidding). Siri is also coming to the Mac, as Apple sunsets the clunky OS X name in favor of macOS– incidentally, close to what they called the operating system after System 7 but before OS X. Beyond that, I can’t really think of anything I’d want from Apple this year. Honestly, if the rumors come true that this year’s iPhone hardware will be a minimal improvement over the 6/6s– and I’m more than willing to believe them– I may end up breaking my every-two-year upgrade pattern and waiting for the 2017 device, which is supposedly going to be a significant departure. We shall see.

Sony: PlayStation VR, Sony’s answer to the Oculus Rift and suchlike, is scheduled to make its full debut next week. One of the major things that both Sony and Microsoft have been fending off is the rumor of a hardware refresh for the relatively young PS4 and Xbox One, respectively. In the PS4’s case, I can see that happening if only to incorporate the PSVR’s “booster box” (additional hardware that sits between the headset and the console) into the console as an all-in-one unit. I don’t think Sony will make a big deal out of it, but it would be interesting to see if they announce the new hardware alongside other titles at their Monday evening event. (By the by, I’m still salty that I didn’t get tickets for the Fathom Events-powered theater experience. I completed the registration, on time, twice, and got error pages. Kinda thinking Sony might want to consider a better way to distribute those tickets.) In terms of software, we’re going to see a lot of third-party stuff highlighted, but Sony might reveal a new Gran Turismo title that works with the VR headset. I would love nothing more, in terms of ludicrously out-there wishes, than for the PSP to be officially sunsetted and its software added to PS Now (their streaming rental service), along with a completion of the PSP’s digital catalog; it is criminal that some of the system’s best games (Brave Story: New Traveler, the Star Ocean remakes, Valkyrie Profile: Lenneth, Tactics Ogre…) are still physical-only.

Microsoft: And here’s where I kinda fall down, because I don’t yet have an Xbox One, and so far I have seen nothing to make me want one. Cuphead looks kinda cool, but I’m willing to bet that’s just a timed exclusive. Rock Band 4 was literally only on my short list because of the sunk-cost fallacy (read: all my DLC was on the 360). I could honestly not care less about the Halo games, and there are no other exclusives on the horizon that have me interested. Not even the system’s precipitous price drops over the last few months could sway me (even if I had the money). The cynic in me says that the price drops are due to a hardware refresh coming, but that makes little sense because unlike the PSVR, there’s no reason for the Xbox One to become more powerful than it already is. I think we’ve hit the wall of diminishing returns in terms of graphics, and that’s okay. What I want to see is MS embracing its “it can’t get worse” status at the moment and start taking risks with games and ideas that might not be conventional, but might be hits in hiding. Really, I want MS to become the company that they were in 2007, when I picked the 360 over the PS3.

(As a side note to Microsoft’s section, I want to mention one of the people who became a personal hero of mine while he was at Microsoft, Stephen Toulouse. His blog is full of incredible insights on the state of the video game industry from the perspective of one of its giants. Please check it out.)

Nintendo: Okay, first things first: we’re not gonna see the NX this year. Period. Not gonna happen. Whether that’s because Nintendo is adding VR to the system or just because it’s not entirely ready is up for speculation, but it is going to remain under wraps until next March at the earliest. For good or for ill, we’re stuck with the Wii U and 3DS for at least one more year. In my opinion, that’s in the “good” column. I don’t think we’ve seen the end of what either of those systems can do, particularly the New 3DS– though that particular machine is a victim of being too late for its own good. Nintendo is going to make Pokemon Sun and Moon the focus of its Treehouse Live show, along with Kirby Planet Robobot (releasing today). There have also been rumors that a new DLC pack is coming for Mario Kart 8. Beyond that, quite frankly, there is no telling what Nintendo will show. We might hear about some paid DLC/expansions for Splatoon and Super Mario Maker, and we may also see a few more indie darlings like Freedom Planet 2 and whatever Yacht Club Games is doing to follow up Shovel Knight. We might see a new, proper Metroid game. We might finally see Nintendo dig deep into its back catalog and reboot some series– the Wii’s preview slides back in 2005 teased a “Gumshoe” remake, which would probably be much cooler than it has any right to be. My pie-in-the-sky wish is that Nintendo buckles and finally remakes Gyromite with an augmented-reality ROB, possibly through the New 3DS. You can’t tell me that the thought of ROB coming back wouldn’t be cool. (Oh, and Mother 3, but that’s less of a silly hope now that it’s the only one left.)

Square-Enix: We already know about the HD remaster of Final Fantasy XII, titled The Zodiac Age. A comparison video released earlier shows off some of the graphical upgrades, but that’s not the reason I’m excited for the game. No, it’s the fact that it’s based on the improvements made in the International Zodiac Job System edition that has me excited. Beyond that, we’re probably going to see only a few minor things announced; Final Fantasy XV is nearing release, which is nice, but eh. (I know it sounds like sacrilege that I’m not excited about a mainline FF game, but… eh. It just looks so… run of the mill.) We might see a few more clips of the FF VII remake, which is slightly less eh; I’m interested in seeing how the game changes as it shifts towards a more episodic format. It really seems like Nintendo is getting the best of the Dragon Quest series, but it also seems like North America isn’t. SE is also thinking about the Mana series, which hasn’t been done justice in North America since Legend of Mana in 2000; it’s entirely possible we could see a compilation release, but I wouldn’t hold my breath. Kingdom Hearts 3 is also probably on the list. For a major surprise, we might see the first glimpse of the 4.0 expansion for Final Fantasy XIV– particularly now that the 3.3 patch landed last week– which may involve the liberation of Ala Mhigo, giving players their first opportunity to go on the offensive against the forces of darkness. Of course, if SE were to consider remaking Final Fantasy Fables: Chocobo Tales, too, I wouldn’t complain (best damn beginner RPG since Super Mario RPG).

Blizzard: Don’t expect a whole lot here. They just released Overwatch– which I should be playing instead of writing this– so they’re going on a bit of a break. I honestly don’t know what they have left beyond continuing World of Warcraft expansions. 

Atlus: Persona Persona Persona Persona. Social Links Social Links Social Links SOCIAL LINKS

Sega (and Atlus): I forgot Sega bought them. Seriously, outside of P5, Sega doesn’t have much on its slate that has me really excited, except maybe Sonic Boom (which I still believe in) and Dawn of War III. I’m going to keep insisting that Sega bring Puyo Puyo Tetris out in the West, but it never happens.

Valve: More Team Fortress hats. Still no Half-Life 3.

GungHo Online: More Puzzle & Dragons, hopefully announcing a localization of the new 3DS game. 

Bushiroad: give Cardfight Online pls

With that, I think we’re set on what’s coming next week. I’m probably going to be completely wrong on a lot of these, but that’s actually a good thing. I like surprises.

Less Is More

This week saw the release of Blizzard’s video game Overwatch, available on PC, PlayStation 4, and Xbox One. The game is advertised with a tremendous amount of flair and pomp, with Blizzard’s usual blitzkrieg of videos, previews, and so forth. But in actually describing what the game is about, Blizzard is being uncharacteristically minimalist. It’s a multiplayer, objective-oriented shooter. That doesn’t mean a whole lot to people who aren’t already steeped in the intricacies of video games, and even then it’s pretty vague. Fortunately, Overwatch is more than just a handful of words.

The primary conceit of the game is that two teams of six characters compete to control objectives and complete simple missions. Only four game types are available in the launch package: Assault, which tasks players with capturing or defending two control points on a map (one at a time); Escort, where players must lead a vehicle through dangerous territory– or stop the payload in its tracks; Control, where there is only one control point, and victory goes to the first team to hold the point for about two minutes (non-consecutively); and Hybrid, which combines the Assault and Escort types by requiring the attackers to capture a drop zone for the vehicle they are escorting before the vehicle can move out. Games typically last less than ten minutes each, but during escort missions the attacking players must reach checkpoints to extend the allotted time that defenders must hold out for. At launch, twenty-one characters are available, and more are anticipated.

Prior to the game’s launch, there were rumblings that Blizzard was trying to make Overwatch the next big game in eSports, much as StarCraft and StarCraft II have been massive draws in South Korea and the world over. Given the description of Overwatch as an objective-oriented shooter, one might think the game would be suited to the kind of long-form, contemplative gameplay that has characterized the majority of eSports’ broadcast output. Indeed, shortly after the game’s open beta period earlier this month, Reddit users on the League of Legends board were abuzz with the thought that this title might kill interest in League. I’m happy to say that’s not the case, and it’s for the same reasons why League is a strong game as well. Basically, the two games scratch different itches.

League of Legends is an extremely slow-paced game. It has moments of quick action, but predominantly there is a lot of 1000-foot-high strategic planning going on, which makes for a very different kind of tension. Players who rush in to get kills face an extremely steep death penalty: respawn timers range from twenty seconds to over a minute, and not only are you out of the action during that time, you’re not gaining the gold and character levels needed to stay on an even footing with your opponent. The game also has an order-of-magnitude larger roster of characters– over 140 to Overwatch‘s two dozen. There is an established level of tactical balance in League that involves knowing which characters are strong against others. But probably the most glaring difference, once a player has experience with both games, is that League of Legends is a much longer-term game than Overwatch.

When you pick a character in League, you’re committing to that character and that role for anywhere from twenty minutes to almost an hour and a half, depending on the game. The concept of “lane swapping”– changing the role a player executes dynamically, essentially breaking the established metagame to get an advantage over an unwary enemy– is relatively unorthodox in worldwide League play, which is fixated on a very rigid game structure. There are rules of the game and then there are conventions: guidelines which have become so ingrained into high-level competitive play that players can’t help but learn as they gain experience. In short, League is a very regimented game that differs only in its details, and cumulative errors and advantages build up to victories.

Overwatch chucks all of that. Games are fast– under ten minutes– and the action is relentless. Death comes quickly: Widowmaker, the arachnid-themed sniper, can one-shot several weak characters like Tracer and an unarmored D.Va. Fortunately, you’re only out of the game for about eight to ten seconds after death, and since characters don’t evolve at all during the match, there’s no progression to fall behind on. If you’re the short-range Mei finding yourself stopped by an enemy Reinhardt’s huge energy shield, you’re not stuck with her: players can change their active character during the match. There is a wide variety of maps and game modes, in contrast to League‘s trusty old Summoner’s Rift. Victory in Overwatch hinges on every moment, but errors are fleeting; an early mistake doesn’t hinder you twenty minutes later, or even twenty seconds later.

If it sounds like I’m overwhelmingly favoring Overwatch, I have to admit I am a bit happier with the new game right now than I am with League of Legends. But that’s not to say there’s a clear hierarchy between them. And, probably most telling, I vehemently disagree that Overwatch has a place at the eSports table. The game is too fast, too “blink and you miss it” to be an effective spectator sport. That failing comes from both the somewhat claustrophobic three-dimensional maps and the first-person perspective, which makes it difficult to get the good birds’-eye view of the action that is a hallmark of League‘s televised presentation.

But is Overwatch a bad game? Absolutely not! And is League of Legends officially obsolesced? Of course not! I love them both, and I’m thrilled to live in a time where both games are active and popular. Like I said earlier, the two games fulfill very different roles in how people play. Trying to say one replaces the other is like saying “Oh, you like Final Fantasy? Here, you’ll love Street Fighter!” If anything, I think it’s great that there is that variety of video games available. 

When I was in college, I picked up Pocket Fighter as a way to intentionally leave my comfort zone with the games I played; there had been too many samey RPGs out at that point and nothing else really appealed at the time. It rekindled a spark in me that I hadn’t realized had gone away. Tournament fighters had undergone the same kind of overload before I really discovered RPGs, and platformers before both of them. Whenever something becomes too commonplace, a shakeup can really help people discover what made them love the games in the first place.

Overwatch is an exceptionally strong game, and is probably the best objective-shooter on the market today. There are a few mechanical glitches with the game, but those are fleeting and probably going to be fixed in short order. Blizzard has made what could be an early contender for Game of the Year, and with any luck there’s more to come.

Yo-Kai Wha?

Tomorrow marks the North American debut of Yo-Kai Watch, a 3DS game wildly popular in Japan but only now being released here (more than two years after its original Japanese release; more on this in a bit). To people outside the so-called otakusphere the game is hardly a blip on the radar; many North American game players haven’t even heard of it. Within Japan, however, the game looks to unseat Pokemon as ruler of the “collect everything” style of game. It’s easy to take this at face value and write it off as a clone, but the truth is a bit more complex. It’s also telling of a trend that helps identify where video games are headed.

In Yo-Kai Watch, a young boy named Nate has an encounter with Whisper, a specter who owes Nate a favor after being released from an ancient capsule. Whisper follows Nate around as a sort of polter-butler, helping him identify the various other Yo-Kai (spirits) within the town. Some of these spirits are friendly or beneficial, like Komasan (a guardian dog); others are antagonistic or dangerous, such as Negatibuzz (a mosquito that appears to cause temporary depression). If a Yo-Kai is affecting a person, Nate might either befriend the spirit or be forced into battle. Along the way, of course, Nate will collect the various Yo-Kai and send them into battle to protect his town from being utterly overrun.

At first glance, the game appears to be yet another interpretation of Shinto’s animistic mythology: spirits are everywhere, affecting people or things whether those people know it or not. However, where Pokemon puts a stronger emphasis on the competitive aspect of the game’s battles, Yo-Kai Watch heavily favors simply making peace with the spirits rather than forcing them into battle. It’s a fantastic alternative to Pokemon’s often cutthroat nature. For younger players, or those looking for a less adversarial kind of game, I’d highly recommend it.

But it’s precisely that (if you’ll excuse the pun) spirit of nonviolence that is attracting me to the game so strongly. It’s a refreshing trend to see so many games being released where ruthless competition isn’t fostered for its own sake. Animal Crossing Happy Home Designer and Undertale are just two more examples. All of these are predicated on a strong message of peacemaking; in Undertale, taking a bloodthirsty approach to the game results in it getting progressively bleaker. In an era where video games as a whole are routinely blamed for acts of real-world violence, it’s encouraging to see so many games looking to buck that trend. 

Yo-Kai Watch is not perfect in this regard, because there are still times when violence is unavoidable. Self-defense is the order of the day, though, not relentless aggression, and so the game becomes a bag of mixed messages. Fighting is a conscious choice in the real world, and that is the stronger message that Undertale tries to stress. Yo-Kai Watch’s message of peace-bringing could be better served by adding the option to talk down opposing spirits rather than beating them down. Diplomacy has been a bit of a mixed bag in video games, though, and so it’s not surprising that Level-5 didn’t include it in this game. 

Then again, there’s always next time. The game that North America is getting tomorrow is merely the first in the series; the third game is being released in Japan soon. In a funny twist, Yo-Kai Watch 3 takes the setting out of Japan and into the United States, featuring appropriately American Yo-Kai modeled after astronauts and football players, for example. If the game catches on in the US– and considering the enormous media push that Nintendo and Hasbro are putting behind this release, it’d be a shock if it didn’t– we might see Whisper palling around with those spirits soon enough.

Silver Screen Sentiments

A few years before my Dad died, my sister gave me a rather interesting Christmas gift: a ticket stub scrapbook. I was already well into my habit of seeing movies regularly, but there was something more to the scrapbook than just a place to keep a record of what I’d seen. I put every ticket stub I had into it, which meant we went to see National Treasure 2 that weekend. From then on, I made it a point to keep a perfect record of my cinematic consumption.

I love movies, but more than just watching them, I love going to the movies. A theater is, for me, a place of refuge, where I can set aside the troubles of the real world for a couple of hours and watch someone else’s story unfold before me. There is peace to be found in even the most gleefully violent, turn-your-brain-off action cliché heap. And it’s all larger than life, larger than is possible to replicate at home. I won’t ever give up my huge flatscreen, but it’s nothing compared to a glorious DLP IMAX wall of film. So I go to the movies, because some movies just plain deserve to be seen that way.

In Cleveland– actually Macedonia– I would, almost like clockwork, go to the theater and see one or two movies each Saturday. It was a comfortable and peaceful routine. I kept it up when I moved to Pittsburgh, but as I got back into collecting games and anime series for my library, I found myself spending more and more of those Saturdays at home, marathoning a game or a TV box set. I had almost ended the practice. 

The scrapbook changed that. I started paying attention to release dates; if there was something I wanted to see, I tried to go on opening weekend. Funnily enough, this also meant seeing some big-ticket Ghibli films, like the theatrical runs of Ponyo and Arrietty. And, of course, the Marvel movies were starting up again. It was a good time to go catch some flicks.

And then depression hit. One of the first things that depression does is take things you once loved to do and make you feel bored by them. Actually, you don’t feel bored. You don’t feel enjoyment. You don’t feel pleasure. You just plain don’t feel anything. The technical term for this is anhedonia, from the same Greek root that gives us hedonist. Someone who is anhedonic is literally unable to feel pleasure or happiness. You can tickle them all you like, but the laughter will be a mere unconscious physiological response; there will be no sincere mirth in it. It’s kind of like baking a tray of cookies, only to open the oven and find absolutely nothing inside, not even the baking sheet the cookies were on. You wanted cookies. You got a puzzlingly intractable nothingness.

Movies, whether at home or in a theater, just didn’t do it for me anymore. It baffled me, because while I was in the theater I was laughing along with the punch lines, gasping at the villains, and generally appearing to enjoy myself. But it never sank in the way it used to. Of all the things that I could say about depression, that is the most frustrating topic to discuss, because there is literally no vocabulary in any language on earth that can accurately express the complete and total void within my mind. Even the word void doesn’t cut it: it implies a contrast with non-void. If it had been just a dead chunk of my brain– like I’d had a stroke or something– eventually neurons would reroute themselves within the healthy parts to restore proper order (or a reasonable facsimile thereof). Depression isn’t so much having to make detours in your brain’s highway system as it is waking up one morning to discover that every square millimeter of asphalt in said system has spontaneously become molten lava, and your car’s magnesium rims have just exploded, setting your garage on fire, and you have to be at work in ten minutes. Still not accurate. But close. The reality is worse.

The kicker of it is that it’s completely out of one’s conscious control. It’s all to do with neurotransmitter levels in the brain, the internal messaging system that allows four pounds of flesh to safely run the other hundred-odd pounds. The brain, already awash in a faulty mix of those signaling chemicals, overcorrects for their influence. Unfortunately, at this point the damage is done, because the overcorrection becomes the new “how to fix this” procedure. When next the sads hit, the brain goes overboard in the other direction, unbalancing itself unwittingly because it has difficulty telling whether or not a little grief is going to germinate into a full-blown crash.

I used to wonder why people drink to excess, or use drugs that they know are not good for them, or do other self-destructive things in the name of avoiding feeling bad. I don’t anymore. At a certain point, you become desperate to feel happy again– to feel anything again, even just for a little while. I hit that point. Hard. But I still count myself extremely lucky that I was lucid enough to know the dangers were far worse than the potential benefits. Not everyone does. Worse, not everyone who hits that point even cares.

But I got help. I am on medication now– it’s not a cure-all, but I’m not the zombie I feared I would be, either. I’m back in therapy. And, probably most importantly, I’m going to the movies again. Today was a double feature: Jurassic World and Inside Out. I lost the stub for the first one, which upsets me (but not that much– it was a pretty basic monster movie, saved only by Chris Pratt’s outstanding performance). But the second flick… it felt great to go back into the theater and sit down as the lights dimmed. 

For the first time in a very long time, I found peace in the darkness.

No Surprises Here

The horrendous tantrum that started last August rolls onward, which should surprise exactly none of you; these things tend to grow legs of their own accord, and sooner or later nobody can catch them. Unfortunately, that’s exactly what happened. The sense of sheer entitlement and exclusionism that started with “them damn feminazis tryin’ ta take away my video games” has blown up into a general maelstrom of the Defenders of True Geek Culture trying to force out the insidious forces of “progressives” and “feels”, to ensure that the things they love will remain theirs alone and theirs forever– even if those things were never intended for them in the first place. 

Perhaps, then, the fact that Joss Whedon deleted his Twitter feed on May 4th– mere days after the release of his most recent film, Avengers: Age of Ultron– is not so surprising in and of itself. Whedon has always been a polarizing figure in geek circles, with some people disliking that he tends to write the same “powerful” female characters in every work, and others just not caring for his over-reliance on under-intelligent banter. But up until yesterday, he was seen as “safe” from the criticism and havoc that literally anyone else saying those things would have to endure, owing far less to his previous success than to his birthright as a white dude. And I say that as a white quasi-dude. That he threw up his hands and walked away from Twitter should have been a wake-up call for those who sent anyone any kind of abuse. That it apparently hasn’t been is unsurprising, and the inherent unsurprisingness of that leads me to believe that I really shouldn’t expect to be surprised at the depth of human sickness anymore. (I’ll try to stop saying “surprise” now.)

What we have here is a rather unusual counter-stroke to what the Internet allowed to occur back in the 90s and early 00s in the first place: a sort of reactionary-revisionist faction seeking to isolate and disenfranchise people en masse. See, back in the early days of the Internet, it was a good thing that people with incredibly diverse interests could connect with each other regardless of their location. No matter what you were into, be it Star Wars, Final Fantasy, anthropomorphic animals, sex with furniture, whatever– the odds were that somewhere out there was someone else with the same interest who was probably dying for the chance to chat about it. (Although the sex with furniture thing is pretty weird. I’m not judging, just saying it’s weird and not my thing; but if it’s yours, you’re welcome to it.) For people who were of a certain mindset that wasn’t common in the era or area they were growing up in, the Internet was a godsend.

At some point, however, there started to be a backlash against the background weirdness of the universe being brought into the foreground. Like nebulae condensing into stars, the scattered pockets of weird on the Internet were coalescing into groups, organizations that could support their members as needed. Some people thought that these pockets of weird should not exist, that they were “convincing people that they were normal when, in fact, they weren’t”. People started highlighting these groups and shaming them, ostracizing them in much the same way that the individuals had been isolated in their everyday lives. The concept of “live and let live” was sorely lost on these people.

Then you had the parts of the Internet where there were no rules, where things could get shocking and horrendous without warning. At first there was an unstated rule that it was all done in satire, that the racist, xenophobic material being bandied about like cat pictures on Saturdays wasn’t at all representative of the users’ actual views. But it was unstated, and stayed unstated, and assertions that it was serious were taken not as the kayfabe they were proffered as, but at face value. Eventually even the unstated assertion fell away, and there were actual violent, psychotic monsters posting in full sincerity. In some sense it is Möbius’ Aristocrats: a setup so filthy that it turns inward upon itself, ever escalating, never reaching the punchline that renders what came before it benign and funny (if it ever could be considered so). 

It’s not exactly clear when these two groups got together and birthed the mindset that the Internet was a horrible place, filled with depravity and devoid of mercy. Certainly, the mainstream media did not help matters; scare stories about websites where Your Children were At Risk of Predators were a dime a dozen in those early days. They’ve calmed down a little since then, but those stories are still no more based in fact than the (similar in vintage, but thankfully now-extinct) Ripped From The Headlines TV movies about whatever the scandal of the week was. By calling out and highlighting the awful behavior of a small minority of parties online, the media painted the picture that the Internet was a lawless place, free from consequences and populated only by unfeeling avatars. It was like a TV news crew broadcasting the exact times and street corners where a drug dealer hung out, in the hopes that the people who made use of this information would be the police instead of the drug dealer’s customers. It attracted the people who would do these horrible things, and helped make them seem “normal”.

But nonetheless, the mindset that outright hostility and sociopathy were the baseline of behavior on the Internet became the “accepted” norm. I say “accepted” because, by and large, the only time anyone calls this out is when they are themselves under attack. “Everyone else is fair game; hurt my feelings, though, and you’ve crossed the line.” And in a sense, it was the “live and let live” attitude from the early days that allowed that mindset to assert itself as “the way it is”, simply because nobody wanted to tell those people they couldn’t do what they wanted. An abuse of logic allowed people to shoot down the argument of “you can’t bully anyone, they’re just doing their thing” by saying “well, bullying them is my thing, and by the same assertion you can’t tell me not to”. It again boils down to the unstated half of the axiom: “they’re doing their thing and not hurting anyone else“.

But getting back to Joss Whedon and the current state of affairs: the fight against the hostility now entrenched in the Internet that once brought people together is going… poorly. The immediacy of the medium means that you have to be there to defend yourself, and if enough people push you to a breaking point through death threats or other promises of violence, well, you either soldier on or you fold up and go home. There is a severe lack of equilibrium within what passes for conversation online today: many can group together to attack, but a defender always stands alone. Faced with a crush of humanity in all its bile and wrath, what choice is there but to flee? Quite frankly, it’s probably safe to say it’s not worth fighting.

Except it is.

We are facing a new era of society: where our intrinsic selves are exposed to the entirety of humanity at a moment’s notice. Socially, this has not happened in several thousand years. What we are seeing is the throes of evolution at work; raw aggression, this time in social interaction, is being selected for as those who cannot properly process the emotions of seven billion humans being thrown at them are being weeded out of the gene pool. Unlike the evolutionary crises which allowed us to start using tools, or grasp the greater mysteries of the universe through advanced mathematics, however, we have a tool greater than any formula: we can become aware of what we are sacrificing in order to succeed in this new era of humanity. Who knows what skills or abilities we gave up when the Great Engineer of the Universe pushed us to our current state. But we know exactly what we are losing now: traits like compassion, empathy, gentleness, compromise. We are losing our ability to do the things which brought us to this point in our history.

It’s not my place to say whether or not the ultimate fate of humanity some hundreds of years from now is to touch the stars with the better angels of our nature by our sides, or to grasp them from atop a tower of our enemies’ corpses. However we are destined to survive this evolutionary inflection point, we must as a species do so. I will continue to fight for equality, for a world where hostility is the exception and not the rule, for a world where everyone is free to choose as dictated by the desires of their truest self, for the people who believe to keep believing, for the people who don’t know to find their answers wherever they may lie. I will champion the cause of positivity and compassion for as long as I live.

Which, of course, shouldn’t surprise you.

Five Terrifying (But Thankfully Fictional) Computer Viruses

This morning I saw a news report that suggested keeping confidential information on a detachable drive, such as a flash drive, in order to avoid having the data stolen. I immediately thought of a way that advice could be circumvented, and at the risk of sounding like Buzzfeed, I thought up four more viruses that, as far as I know, are only the product of my own imagination. This was just a thought exercise, with no real goal beyond giving us something to think about.

1. Johnny Cache: Captures specific PDF files based on a likely size range from the infected computer and any external drives attached, then shares them in a “cloud” service to allow people to mine them for personal information.

2. The Beat-Alls: Deliberately issues hundreds of thousands of read/write cycles on all storage devices attached to the infected computer, counting on wearing the devices down faster, destroying data.

3. Gabba Gabba Hey: Snoops traffic on all networks the infected computer is connected to, and stores packets for later “echoing” back into the network. The idea is to flood a network with a denial of service attack using data that is indistinguishable from “real” traffic.

4. The Vapors: Installs various input method editors into the infected computer and randomly switches among them at set intervals, turning all typed information into total gibberish; worse, attempts to disable the IME trigger the infected computer to randomize the encoding of all text displayed as well as input.

5. London Calling: Disables, then takes direct control of, temperature controls on the infected computer, with the intention of destroying the hardware through overheating and/or overuse of the cooling mechanisms, ultimately creating a fire hazard.
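As a harmless aside, the text-garbling mechanism behind #4 is easy to demonstrate without any malicious machinery at all: the same bytes, read under the wrong encoding, turn into gibberish. A minimal Python sketch of my own (purely illustrative, not any of the above):

```python
# Illustration of encoding mismatch: bytes written as UTF-16-LE
# but decoded as Latin-1 come out as NUL-riddled gibberish.
text = "Hello, world"
data = text.encode("utf-16-le")   # each character becomes two bytes
garbled = data.decode("latin-1")  # decoded one byte per character

print(repr(garbled))  # 'H\x00e\x00l\x00l\x00o\x00,\x00 \x00w\x00o\x00r\x00l\x00d\x00'
```

Multiply that by every text field on a system and you get a sense of why randomized encodings would be so maddening.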

Welcome Home

I’m proud to announce the URL change from my old name to my current one. Thank you for your patience. I’m going to spend this weekend organizing my thoughts on the transition to date and from this point forward, and hopefully next week I’ll be able to begin presenting to you the story of how I became who I am now. Until then, please make yourself at home.

The Times They Are A-Changin’

About two months ago, I came out of the closet as a transgender woman. There has been a lot of discussion about all of this, in both closed and open arenas, but curiously I have stayed very quiet here. I didn’t even update the main header until just now, and the site URL still bears my old name. As soon as I can, I’m going to have to get that changed. But here we are, all about to embark on a new journey.

So to sum up: Yes, I am changing everything over to read “Phoebe” instead of “John”. This has been a couple of years in the making, and while outwardly it seems sudden, behind the scenes it’s been agonizing. I am planning on opening up 2015 with a full discussion of the whole process, but for the time being, please look forward to it.

Gamers Are Dead

Those three words, an invocation of a rather tired and banal journalistic cliché indicating that a fad is over, sparked a rather disgusting outpouring of hatred around the world over the past month. And I risk reigniting it with this article, but I do so knowingly and willingly, because, quite frankly, the phrase doesn’t go far enough. The concept of being a “gamer” is not only dead, but its rotted and decaying corpse is being paraded around on strings, made to dance for the whims of a handful of misanthropes who are desperate to cling to the only piece of identity they have left. Gamers aren’t just dead, they’re undead. And like the undead, we have to stop this zombie outbreak before it threatens the world.

The idea of being a “gamer” was originally created by advertisers trying to pigeonhole the customers who were buying their clients’ products into a cohesive demographic at a time when there wasn’t one. Nobody knew what magazines to advertise games and consoles in, outside of general computing magazines. Eventually, people like Larry Flynt picked up on the fact that there were people out there buying those computing magazines solely for the gaming coverage. Flynt then bankrolled the creation of Video Games and Computer Entertainment, one of the first post-Crash magazines to cover the newly resurrected industry exclusively. VGCE was also one of the first magazines to wrestle with a new problem: what else could be sold to people who bought video games?

Eventually, that question began to answer itself in a rather distressing form: “more video games”. I still remember with a certain twisted sense of fondness the overambitious and somewhat dodgy advertisements in the first few issues of VGCE, for things like arcade-style NES controllers and a mail-in trade-in store (which itself would be unsettlingly prescient a couple decades down the line). But by and large, advertisers in the magazine were restricted to games and gaming stuff. A huge part of this could also be understood to be reticence on the part of more mainstream advertisers; remember that the Crash had only been a few years prior, and that as of 1983, “video games were dead”. Unsold copies of E.T. stood as grave markers in Hills and Ames, memorializing the amount of time, energy, and space wasted on what was still going to be seen as a fad for another twenty years.

Faced with this mindshare ouroboros, consumers of video games began to self-radicalize. I note with some amount of wry amusement that I’m using the 2014 sense of the word “radical” to refer to people who would have been using the 1987 sense of the word. Obsession with games became a hallmark of the people who played them. This wasn’t because of any addictive quality in the games themselves (though let’s be honest, Tetris is a hell of a drug) but rather because players simply weren’t exposed to anything else. Rather than attempt to bring them into the circles of other interests– which would be counterproductive to the goal of making more money on video games– the publishers of the now-somewhat-established gaming journalist corps, consisting of magazines like GamePro, VGCE, and Electronic Gaming Monthly, began actively excluding outside interests from their pages. One of the irregular features of Nintendo’s in-house publication, Nintendo Power, had been to highlight a celebrity or other outside luminary who was a fan of their games, in the hopes of leveraging some of the rather devoted Nintendo fanbase toward that celebrity’s newest project. It failed dramatically, and by the third or fourth anniversary of the magazine it was a distant and frequently covered-up memory.

In the post-Crash landscape of the video game hobby, this singular focus on promoting games and only games was, arguably, necessary to ensure the survival of the hobby as a whole. Thus, when faced with the question of identity, consumers really only had the carefully-crafted and exclusive concept of “gamer” to fall back on. It was what they did for fun, and they didn’t really have that many other outside interests. It made sense, of a sort, to say that one was a “gamer” in the sense that one could also say they were a “fisher” or a “reader”. And at the time, there was nothing wrong with that, because the hobby was still small enough that a game that didn’t have widespread support would be too risky to release, and it was more efficient at the time to produce games that fit the demographic than it would have been to advertise to expand the demographic.

A couple years back, I wrote a somewhat meandering series of posts on the differences between being a “community” and being an “enclave”. When I wrote that, in early 2012, I was still thinking primarily of the rather heated backlash against efforts that had been made to expand the market for consumers of video games. The Wii, hallmark of what “gamers” considered to be everything wrong with those efforts, was five years old, and the Wii U was just around the corner. What was remarkable about the concept of the Wii, and by extension Nintendo’s “blue ocean” strategy, wasn’t that there were supposedly “weak” games being made, but rather that Nintendo– one of the companies most directly responsible for the recovery of the North American video game industry after the Crash– was recognizing that the approach of exclusive reinforcement was no longer viable. The industry was no longer in its crisis mode; it did not live and die on the success or failure of the Next Big Thing. It hadn’t since 1992, and Mortal Kombat.

Most people nowadays think of the original Mortal Kombat as a rather poorly-designed Street Fighter clone, notable only for its gore (tame by today’s standards). However, it was a huge risk for an industry that was slowly coming to terms with the fact that its consumer base was going through adolescence. Games like the Super Mario and Sonic series were perennial sellers; anyone could pick them up, and they appealed to kids of all ages. But the people who had bought the very first iterations of Mario and suchlike were now seen as “outgrowing” the idea of video games, and Midway took a huge gamble in creating a game that was more “mature”. For better or worse, the gamble paid off. MK became a smash hit, and while it wasn’t universally praised or even universally bought, it was successful enough not only to kickstart a new franchise, but also to open up the market to a new segment of players. It was the “blue ocean” strategy before it was called the “blue ocean” strategy.

And that’s where the wheels fell off. Mortal Kombat was a risk, and it had opened up the market to newer players who were then inculcated into the self-affirming and exclusive “gamer” demographic. Rather than learn the lesson that there were untapped markets out there waiting to expand the numbers of potential consumers, the industry as a collective whole decided to simply strip-mine the metaphorical “new challengers” for everything they were worth. It baffles me that nobody at the time was taking the long view, and realizing that the whole of the industry didn’t collapse just because one game wasn’t “for everyone”. What happened instead was a culture of immediacy: starting in the mid-90s, there was a “new mega-hit” every few months, which would seize the whole of the market for a time, then bow out in favor of the next one.

This paradigm, a steady rhythm of games that would explode onto the scene and then fade away, attracted the attention of people who had started out playing Super Mario Bros. in their elementary and middle school days but were now college graduates with degrees in computing. There was money to be made in being that Next Big Thing, even just once. This resulted in a population explosion in the development and production sphere, like a 32-bit Baby Boom. More than that, though, the rise of the Internet in the mid-to-late 90s made it possible for smaller developers not only to market their work directly to their customers for far less than it would cost through traditional channels, but also to come together in the first place and spot the underserved sections of the population who might want to play a game once in a while. The first cracks in the wall built around the idea of being a “gamer” were forming, in the shape of three gems in a row.

Originally a Flash game, Bejeweled, by the studio that would eventually become PopCap Games, was one of the first in a series of simple-to-play abstract puzzle games that would define the schism between so-called “hardcore” and “casual” games. Again, it was the wrong lesson being learned, this time not by the industry but by the traditional consumer base. Flash (and its predecessor, Shockwave) were technologies that made it easy for “ordinary” people to play games on their computers. There was no setup, no tinkering necessary; just put in a URL and start playing. The money in these games wasn’t in selling access, but rather in the advertisements surrounding them. Those ads, thanks to peeking in on what the user did on the computer when they weren’t playing the game, reflected a much better understanding of the user’s habits and preferences than Larry Flynt’s team had back in 1987. Advertisers started to realize that people who played games did more than just play games.

It went the other way, too. In the runup to the release of the Dreamcast, Sega embarked on a massive advertising campaign that rivaled any before it for a video game product. While Sony’s “U R NOT (red)E” campaign in the mid-90s had garnered some limited exposure, the “It’s Thinking” ads for the Dreamcast were everywhere. Television, magazines, billboards, you name it. It worked, to an extent; the Dreamcast enjoyed several months of success before Sega cut the legs out from under it. That’s not the point, though: it showed other developers that ads in “mainstream” media worked. Soon EA’s Madden NFL series started running ads in sports programming of all kinds, introducing a smaller secondary population boom into the consumer base– one which bought only sports games– continuing the culture of enclave-building and proving that there were distinct segments within the market.

It wouldn’t be until 2006, with the release of the Wii, that a company outright addressed this segmentation of the market with games for everyone. From the very beginning, the Wii was meant for an audience that was not already on board; Nintendo at the time asserted that “mature gamers” were already well-served, and that there was an entire generation of people who would be willing to play games if only they weren’t so complicated or insular. The traditional consumers reacted with disgust; the rest of the world reacted by opening their wallets. The Wii was a massive cash injection for Nintendo. The industry again learned the wrong lesson, leading to the flood of shovelware for the Wii and its contemporaries, all trying to cash in on the second coming of the “fad”. When the churned-out “simple” games failed to catch on– mostly because they looked and felt cheap, with no forethought put into them– motion controls were seized upon as the “reason” for the Wii’s success, and quickly imitated. These imitations were mostly failures, too, and even Nintendo began shying away from motion controls later on.

This left those people who had forged their identities around the concept of being “gamers” in a bind. For over twenty years, these people had had an entire industry at their beck and call; games had been made “for them exclusively”, and if a game was a smash hit, it had almost universal acclaim within the enclave. Now, though, there were huge schisms among “gamers”: between those who liked how the expansion was happening and those who felt betrayed. Anger multiplies faster than understanding, especially on the Internet, and soon the dominant meme in the so-called “community” was “Are ‘casual’ games destroying gaming?” In 2008, this was a pressing and worrying question. In 2014, we finally have our answer:

“Yes, they did. And doing so was a good thing.”

To be blunt, the idea of being a “gamer” as the sole indicator of one’s identity is an outmoded and dangerous concept, and it must be discarded. It was carefully crafted from 1986 to 2006 through a sweet and subtle brainwashing, the Kool-Aid of which I freely admit I drank deeply. The generation that saved an entire industry in North America should be proud of having undone what the Crash of 1983 did, but that in no way means they should rest on their laurels. Like a parent clutching the bike long after the child has proven they can balance without the training wheels or their support, the “gamers” are now doing more harm than good to the industry they love. The “Weekend At Bernie’s“-style manipulation of the remains of that identity has to stop.

Going through life bereft of any other identity, though, is dangerous, and the past few months have been a perfect crystallization of exactly why. Under the pretense of necromanticizing the “gamer” label, hatred and evil have crept in to fill the void. The ouroboros is now digesting itself; the circle is closing. People who once held the label of “gamer” are being used; having proven that they can be molded and manipulated as a whole, they are a useful tool for pushing an agenda of marginalization. It’s a conspiracy of the perceived majority: deluded into thinking that they speak for everyone, “gamers” are pushing back against the inevitable understanding that “gamer” and “person who plays games” are no longer exact synonyms.

The recurring theme throughout this entire story is that the video game industry has actively resisted very nearly every attempt to “grow up” that it has ever been backed into trying. It’s unsurprising, and it’s a little bit sad. But if you’re looking for the bright side in all of this, take this one: I personally feel that video games have given me so much more than I could have hoped to experience on my own. The dorky little eight-year-old me who played Kid Icarus went on to check out books on mythology from the library, lecturing my relatives on the Greek gods at every chance I got. The fourteen-year-old me who was enthralled by the concept of magical technology in Final Fantasy VI started writing (bad) stories about a world where magic replaced electricity. The twenty-one-year-old me parlayed a love of Pokémon into a job at a game store, where I could share the games that I loved with people who came in– where I could bring people the joy that games had brought me. Being a “gamer” has given me the seeds– and only the seeds– to most of the good that I’ve accomplished in my life. But it’s been the rest of my life that made those seeds germinate and bloom.

Gamers are dead. Long live those who survived being gamers.