No Surprises Here

The horrendous tantrum that started back last August rolls onward, which should surprise exactly none of you; these things tend to grow legs of their own accord, and sooner or later nobody can catch them. Unfortunately, that’s exactly what happened. The sense of sheer entitlement and exclusionism that started with “them damn feminazis tryin’ ta take away my video games” has blown up into a general maelstrom of the Defenders of True Geek Culture trying to force out the insidious forces of “progressives” and “feels”, to ensure that the things they love will remain theirs alone and theirs forever– even if those things were never intended for them in the first place.

Perhaps, then, the fact that Joss Whedon deleted his Twitter feed on May 4th– mere days after the release of his most recent film, Avengers: Age of Ultron– is not so surprising in and of itself. Whedon has always been a polarizing figure in geek circles anyway, with some people not liking that he tends to write the same “powerful” female characters in every work, and others just not liking his over-reliance on under-intelligent banter. But up until yesterday, he was seen as “safe” from the criticism and havoc that literally anyone else saying those things would have to endure, owing far less to his previous success than to his birthright as a white dude. And I say that as a white quasi-dude. That he threw up his hands and walked away from Twitter should have been a wake-up call for everyone who has ever sent that kind of abuse. That it apparently hasn’t been is unsurprising, the inherent unsurprisingness of which leads me to believe that I really shouldn’t expect to be surprised at the depth of human sickness anymore. (I’ll try to stop saying “surprise” now.)

What we have here is a rather unusual counter-stroke to what the Internet had allowed to occur back in the 90s and early 00s in the first place: a sort of reactionary-revisionist faction seeking to isolate and disenfranchise people en masse. See, back in the early days of the Internet, it was a good thing that people with incredibly diverse interests could connect with each other regardless of their location. No matter what you were into, be it Star Wars, Final Fantasy, anthropomorphic animals, sex with furniture, whatever– the odds were that somewhere out there was someone else with the same interest who was probably dying for the chance to chat about it. (Although the sex with furniture thing is pretty weird. I’m not judging, just saying it’s weird and not my thing, but if it’s yours, you’re welcome to it.) For people who were of a certain mindset that wasn’t common in the era or area that they were growing up in, the Internet was a godsend.

At some point, however, there started to be a backlash against the background weirdness of the universe being brought into the foreground. Like nebulae condensing into stars, the scattered pockets of weird in the Internet were coalescing into groups, organizations that could support their members as needed. Some people thought that these pockets of weird should not exist, that it was “convincing people that they were normal when in fact, they weren’t”. People started highlighting these groups and shaming them, ostracizing them in much the same way that the individuals had been isolated in their everyday life. The concept of “live and let live” was sorely lost on these people.

Then you had the parts of the Internet where there were no rules, where things could get shocking and horrendous without warning. At first there was an unstated rule saying that it was all done in satire, that the racist, xenophobic material being bandied about like cat pictures on Saturdays wasn’t at all representative of the users’ actual views. But it was unstated, and stayed unstated, and assertions that the material was serious were taken not as the kayfabe they were proffered as, but at face value. Eventually even the unstated assertion fell away, and there were actual violent psychotic monsters posting in full sincerity. In some sense it is Möbius’ Aristocrats: a setup so filthy that it turns inwards upon itself, ever escalating, never reaching the punchline that renders what came before it benign and funny (if it ever could be considered so).

It’s not exactly clear when these two groups got together and birthed the mindset that the Internet was a horrible place, filled with depravity and devoid of mercy. Certainly, the mainstream media did not help matters; scare stories about websites where Your Children were At Risk of Predators were a dime a dozen in those early days. They’ve calmed down a little since then, but are still no more based in fact than the Ripped From The Headlines TV-movies of similar vintage (thankfully now extinct) about whatever the scandal of the week was. By calling out and highlighting the awful behavior of a small minority of parties online, the coverage painted the picture that the Internet was a lawless place free from consequences and populated only by unfeeling avatars. It was like a TV news crew broadcasting the exact times and street corners where a drug dealer hung out, in the hopes that the people who would make use of this information would be the police instead of the drug dealer’s customers. It attracted the people who would do these horrible things, and made them seem “normal”.

But nonetheless, the mindset that outright hostility and sociopathic behavior were the baseline of behavior on the Internet became the “accepted” norm. I say “accepted” because by and large the only time anyone calls this out is when they are themselves under attack. “Everyone else is fair game; hurt my feelings, though, and you’ve crossed the line.” And in a sense, it was the “live and let live” attitude from the early days that allowed that mindset to assert itself as “the way it is”, simply because nobody wanted to tell those people they couldn’t do what they wanted. An abuse of logic allowed people to shoot down the argument of “you can’t bully anyone, they’re just doing their thing” by saying “well, bullying them is my thing, and by the same assertion you can’t tell me not to”. It again boils down to the unstated half of the axiom: “they’re doing their thing and not hurting anyone else”.

But getting back to Joss Whedon and the current state of affairs: the fight against the hostility which has now entrenched itself in the Internet that once brought people together is going… poorly. The immediacy of the medium means that you have to be there to defend yourself, and if enough people push you to a breaking point through death threats or other promises of violence, well, you either soldier on or you fold up and go home. There is a severe lack of equilibrium within what passes for conversation online today: many can group together to attack, but a defender always stands alone. Faced with a crush of humanity in all its bile and wrath, what choice is there but to flee? Quite frankly, it’s probably safe to say it’s not worth fighting.

Except it is.

We are facing a new era of society: one in which our intrinsic selves are exposed to the entirety of humanity at a moment’s notice. Socially, this has not happened in several thousand years. What we are seeing is the throes of evolution at work; raw aggression, this time in social interaction, is being selected for, as those who cannot properly process the emotions of seven billion humans being thrown at them are weeded out of the gene pool. Unlike in the evolutionary crises which allowed us to start using tools, or to grasp the greater mysteries of the universe through advanced mathematics, however, this time we have a tool greater than any formula: we can become aware of what we are sacrificing in order to succeed in this new era of humanity. Who knows what skills or abilities we gave up when the Great Engineer of the Universe pushed us to our current state. But we know exactly what we are losing now: traits like compassion, empathy, gentleness, compromise. We are losing our ability to do the things which brought us to this point in our history.

It’s not my place to say whether or not the ultimate fate of humanity some hundreds of years from now is to touch the stars with the better angels of our nature by our sides, or to grasp them from atop a tower of our enemies’ corpses. However we are destined to survive this evolutionary inflection point, we must as a species do so. I will continue to fight for equality, for a world where hostility is the exception and not the rule, for a world where everyone is free to choose as dictated by the desires of their truest self, for the people who believe to keep believing, for the people who don’t know to find their answers wherever they may lie. I will champion the cause of positivity and compassion for as long as I live.

Which, of course, shouldn’t surprise you.

Five Terrifying (But Thankfully Fictional) Computer Viruses

This morning I saw a news report that suggested keeping confidential information on a detachable drive, such as a flash drive, in order to avoid having the data stolen. I immediately thought of a way that advice could be circumvented, and at the risk of sounding like Buzzfeed, I thought up four more viruses that, as far as I know, are purely the product of my own imagination. This was just a thought exercise, with no real aim or goal– only something to think about.

1. Johnny Cache: Captures PDF files within a likely size range from the infected computer and any attached external drives, then shares them via a “cloud” service to allow people to mine them for personal information.

2. The Beat-Alls: Deliberately issues hundreds of thousands of read/write cycles on all storage devices attached to the infected computer, counting on wearing the devices out faster and destroying data.

3. Gabba Gabba Hey: Snoops traffic on all networks the infected computer is connected to, and stores packets for later “echoing” back into the network. The idea is to flood a network with a denial of service attack using data that is indistinguishable from “real” traffic.

4. The Vapors: Installs various input method editors into the infected computer and randomly switches among them at set intervals, turning all typed information into total gibberish; worse, attempts to disable the IME trigger the infected computer to randomize the encoding of all text displayed as well as input.

5. London Calling: Disables, then takes direct control of, temperature controls on the infected computer, with the intention of destroying the hardware through overheating and/or overuse of the cooling mechanisms, ultimately creating a fire hazard.

Welcome Home

I’m proud to announce the URL change from my old name to my current one. Thank you for your patience. I’m going to spend this weekend organizing my thoughts on the transition to date and from this point forward, and hopefully next week I’ll be able to begin presenting to you the story of how I became who I am now. Until then, please make yourself at home.

The Times They Are A-Changin’

About two months ago, I came out of the closet as a transgender woman. There has been a lot of discussion about all of this, both in private circles and out in the open, but curiously I have stayed very quiet here. I didn’t even update the main header until just now, and the site URL still bears my old name. As soon as I can, I’m probably going to have to get that changed. But here we are, all about to embark on a new journey.

So to sum up: Yes, I am changing everything over to read “Phoebe” instead of “John”. This has been a couple of years in the making, and while outwardly it seems sudden, behind the scenes it’s been agonizing. I am planning on opening up 2015 with a full discussion of the whole process, but for the time being, please look forward to it.

Gamers Are Dead

Those three words, an invocation of a rather tired and banal journalistic cliché indicating that a fad is over, sparked a rather disgusting outpouring of hatred over the past month. And I risk reigniting it with this article, but I do so knowingly and willingly, because, quite frankly, it doesn’t go far enough. The concept of being a “gamer” is not only dead; its rotted and decaying corpse is being paraded around on strings, made to dance for the whims of a handful of misanthropes who are desperate to cling to the only piece of identity that they have left. Gamers aren’t just dead, they’re undead. And as with any undead, we have to stop this zombie outbreak before it threatens the world.

The idea of being a “gamer” originally arose from advertisers trying to pigeonhole the customers who were buying their clients’ products into a cohesive demographic at a time when there wasn’t one. Nobody knew what magazines to advertise games and consoles in, outside of general computing magazines. Eventually, people like Larry Flynt picked up on the fact that there were people out there buying those computing magazines only for the gaming coverage. Flynt then bankrolled the creation of Video Games and Computer Entertainment, one of the first post-Crash magazines to cover the newly resurrected industry exclusively. VGCE was one of the first magazines that had to wrestle with a new problem: what else could we sell to people who bought video games?

Eventually, that question began to answer itself in a rather distressing form: “more video games”. I still remember with a certain twisted sense of fondness the overambitious and somewhat dodgy advertisements in the first few issues of VGCE, for things like arcade-style NES controllers and a mail-in trade-in store (which itself would prove unsettlingly prescient a couple of decades down the line). But by and large, advertisers in the magazine were restricted to games and gaming paraphernalia. A huge part of this could also be understood as reluctance on the part of more mainstream advertisers; remember that the Crash had been only a few years prior, and that as of 1983, “video games were dead”. Unsold copies of E.T. stood as grave markers in Hills and Ames, memorializing the amount of time, energy, and space wasted on what would still be seen as a fad for another twenty years.

Faced with this mindshare ouroboros, consumers of video games began to self-radicalize. I note with some amount of wry amusement that I’m using the 2014 sense of the word “radical” to refer to people who would have been using the 1987 sense of the word. Obsessiveness about games became a hallmark of the people who played them. This wasn’t because of any addictiveness of the games themselves (though let’s be honest, Tetris is a hell of a drug) but rather because those people simply weren’t exposed to anything else. Rather than attempt to bring them into the circles of other interests– which would be counterproductive to the goal of making more money on video games– the publishers of the now-somewhat-established gaming press, consisting of magazines like Gamepro, VGCE, and Electronic Gaming Monthly, began actively excluding outside interests from their pages. One of the irregular features of Nintendo’s in-house publication (Nintendo Power) had been to highlight a celebrity or other outside luminary who was a fan of their games, in the hopes of leveraging some of the rather devoted Nintendo fanbase toward that celebrity’s newest project. Obviously, it failed dramatically, and by the third or fourth anniversary of the magazine it was a distant and frequently covered-up memory.

In the post-Crash landscape of the video game hobby, this singular focus on promoting games and only games was, arguably, necessary to ensure the survival of the hobby as a whole. Thus, when faced with the question of identity, consumers really only had the carefully-crafted and exclusive concept of “gamer” to fall back on. It was what they did for fun, and they didn’t really have that many other outside interests. It made sense, of a sort, to say that one was a “gamer” in the same way that one could say one was a “fisher” or a “reader”. And at the time, there was nothing wrong with that, because the hobby was still small enough that a game without widespread support would be too risky to release, and it was more efficient to produce games that fit the demographic than it would have been to advertise to expand the demographic.

A couple of years back, I wrote a somewhat meandering series of posts on the differences between being a “community” and being an “enclave”. When I wrote that, in early 2012, I was still thinking primarily of the rather heated backlash against efforts that had been made to expand the market for consumers of video games. The Wii, hallmark of what “gamers” considered to be everything wrong with those efforts, was five years old, and the Wii U was just around the corner. What was remarkable about the concept of the Wii, and by extension Nintendo’s “blue ocean” strategy, wasn’t that there were supposedly “weak” games being made, but rather that Nintendo– one of the companies most directly responsible for the recovery of the North American video game industry after the Crash– was recognizing that the approach of exclusive reinforcement was no longer viable. The industry was no longer in its crisis mode; it did not live and die over the success or failure of the Next Big Thing. It hadn’t since 1992, and Mortal Kombat.

Most people nowadays think of the original Mortal Kombat as a rather poorly-designed Street Fighter clone that was notable only for its gore (which is tame by today’s standards). However, it was a huge risk for an industry that was slowly coming to terms with the fact that its consumer base was going through adolescence. Games like the Super Mario and Sonic series were perennial sellers; anyone could pick them up, and they appealed to kids of all ages. But the people who had bought the very first iterations of Mario and suchlike were now being seen as “outgrowing” the idea of video games, and Midway took a huge gamble in creating a game that was more “mature”. For better or worse, the gamble paid off. MK became a smash hit, and while it wasn’t universally praised or even universally bought, it was successful enough not only to kickstart a new franchise, but also to open up the market to a new section of players. It was the “blue ocean” strategy before it was called “the ‘blue ocean’ strategy”.

And that’s where the wheels fell off. Mortal Kombat was a risk, and it had opened up the market to newer players who were then inculcated into the self-affirming and exclusive “gamer” demographic. Rather than learn the lesson that there were untapped markets out there waiting to expand the numbers of potential consumers, the industry as a collective whole decided to simply strip-mine the metaphorical “new challengers” for everything they were worth. It baffles me that nobody at the time was taking the long view, and realizing that the whole of the industry didn’t collapse just because one game wasn’t “for everyone”. What happened instead was a culture of immediacy: starting in the mid-90s, there was a “new mega-hit” every few months, which would seize the whole of the market for a time, then bow out in favor of the next one.

This paradigm, a steady rhythm of games that would explode onto the scene and then fade away, attracted the attention of people who had started out playing Super Mario Bros. in their elementary and middle school days, but were now graduates of college with degrees in computing. There was money to be made in being that Next Big Thing, even just once. This resulted in a population explosion in the development and production sphere, like a 32-bit Baby Boom. More than that, though, the rise of the Internet in the mid-to-late 90s not only made it possible for smaller developers to market their work directly to their customers for far less than it would cost through traditional channels, it also enabled those developers to come together in the first place and see the underserved sections of the population who might want to play a game once in a while. The first cracks in the wall built around the idea of being a “gamer” were forming, in the shape of three gems in a row.

Originally a Flash game, Bejeweled, by the studio that would eventually become PopCap Games, was one of the first in a series of simple-to-play abstract puzzle games that would define the schism between so-called “hardcore” and “casual” games. Again, the wrong lesson was learned, this time not by the industry but by the traditional consumer base. Flash (like its predecessor, Shockwave) was a technology that made it easy for “ordinary” people to play games on their computers. There was no setup, no tinkering necessary; just put in a URL and start playing. The money in these games wasn’t in selling access, but rather in the advertisements surrounding them– ads which, thanks to peeking in on what the user did on the computer when they weren’t playing the game, had a much better understanding of the user’s habits and preferences than Larry Flynt’s team did back in 1987. Advertisers started to realize that people who played games did more than just play games.

It went the other way, too. In the runup to the release of the Dreamcast, Sega embarked on a massive advertising campaign that rivaled any before it for a video game product. While Sony’s “U R NOT (red)E” campaign in the mid-90s had garnered some limited exposure, the “It’s Thinking” ads for the Dreamcast were everywhere. Television, magazines, billboards, you name it. It worked, to an extent; the Dreamcast enjoyed several months of success before Sega cut the legs out from under it. That’s not the point, though: it showed other developers that ads in “mainstream” media worked. Soon EA’s Madden NFL series started having ads in sports programming of all kinds, introducing a smaller secondary population boom into the consumer base– one which bought only sports games, continuing the culture of enclave-building and proving that there were distinct segments within the market.

It wouldn’t be until 2006, with the release of the Wii, that a company outright addressed this segmentation of the market with games for all. From the very beginning, the Wii was meant for an audience that was not already on board; Nintendo at the time asserted that “mature gamers” were already well-served, and that there was an entire generation of people who would be willing to play games if only they weren’t so complicated or insular. The traditional consumers reacted with disgust; the rest of the world, however, reacted by opening their wallets. The Wii was a massive cash injection for Nintendo. The industry again learned the wrong lesson, leading to the flood of shovelware for the Wii and its contemporaries, all trying to cash in on the second coming of the “fad”. When the churned-out “simple” games failed to catch on, mostly because they looked and felt cheap and had no forethought put into them, motion controls were picked up on as the “reason” for the Wii’s success, and quickly imitated. Again, these were mostly failures, and even Nintendo began shying away from its motion controls later on.

This left those people who had forged their identities around the concept of being “gamers” in a bind. For over twenty years, these people had had an entire industry at their beck and call; games had been made “for them exclusively”, and if a game was a smash hit, it had almost universal acclaim within the enclave. Now, though, there were huge schisms among “gamers”: between those who liked how the expansion was happening and those who felt betrayed. Anger multiplies faster than understanding, especially on the Internet, and soon the dominant meme in the so-called “community” was “Are ‘casual’ games destroying gaming?” In 2008, this was a pressing and worrying question. In 2014, we finally have our answer:

“Yes, they did. And doing so was a good thing.”

To be blunt, the idea of being a “gamer” as a sole indicator of one’s identity is an outmoded and dangerous concept, and must be discarded. It was carefully crafted from 1986 to 2006 through a sweet and subtle brainwashing, the Kool-Aid of which I freely admit that I drank deeply. The generation that saved an entire industry in North America should be proud of having undone what the Crash of 1983 did, but that in no way means that they should rest on their laurels. But like a parent clutching the bike long after the child has proven they can balance without the training wheels or their support, the “gamers” are now doing more harm than good to the industry they love. The “Weekend At Bernie’s“-style manipulation of the remains of the identity has to stop.

Being bereft of any other identity, though, is a dangerous way to go through life, and the past few months have been a perfect crystallization of exactly why that is. Under the pretense of necromanticizing the “gamer” label, hatred and evil have crept in to fill the void. The ouroboros is now digesting itself; the circle is closing. People who once held the label of “gamer” are being used; having proven that they can be molded and manipulated as a whole, they are a useful tool for pushing an agenda of marginalization. It’s a conspiracy of the perceived majority: deluded into thinking that they speak for everyone, “gamers” are pushing back against the inevitable understanding that “gamer” and “person who plays games” are no longer exact synonyms.

The recurring theme throughout this entire story is that the video game industry has actively resisted very nearly every attempt to “grow up” that it has ever been backed into trying. It’s unsurprising, and it’s a little bit sad. But if you’re looking for the bright side in all of this, take this one: I personally feel that video games have given me so much more than I could have hoped to experience on my own. The dorky little eight-year-old me who played Kid Icarus went on to check out books on mythology from the library, lecturing my relatives on the Greek gods at every chance I got. The fourteen-year-old me who was enthralled by the concept of magical technology in Final Fantasy VI started writing (bad) stories about a world where magic replaced electricity. The twenty-one-year-old me parlayed a love of Pokémon into a job at a game store, where I could share the games that I loved with people who came in– where I could bring people the joy that games had brought me. Being a “gamer” has given me the seeds– and only the seeds– to most of the good that I’ve accomplished in my life. But it’s been the rest of my life that made those seeds germinate and bloom.

Gamers are dead. Long live those who survived being gamers.

The Gates

The last few months, as I’m sure you’re aware, have been pretty rough within the tech and video game industries. It started with a disgruntled post on a message board, claiming to be from an ex of a notable female video game developer. The post accused that developer of having had a brief romantic relationship with a writer for a game review website. This sparked a discussion of ethics in video game journalism and the interconnectedness of reviewers and developers, both metaphorical and literal.

The review website conducted its investigation, found no evidence of wrongdoing, and revised its policies to prevent the appearance of impropriety in the future. In any other industry, with any other individuals involved, and any other consumers raising the alarm, that would have been the end of it. Instead, it has since descended into a maddening maelstrom of abuse and hate, with invasions of privacy being perpetrated both for and against “the cause”. Rather than look at the currently-ongoing imbroglio, though, I think it’s important to step back and take a look both at how the metaphorical tropical depression escalated into a full-blown hurricane of hostility, and at how this is only a bellwether of what is going to happen in the future.

The first thing you need to be aware of is that the seeds of the current mess were sown with the hashtag “#gamergate”. A hashtag (in case you haven’t already been bludgeoned over the head with the word since Twitter hit critical mass in 2010) is a text marker, preceded by the “hash” or “pound” symbol, that is added to designate a tweet as being part of a larger conversation. For example, when television shows air, you’ll often find (either in place of or right above the ubiquitous “bug” station identifiers in the lower right-hand corner of the screen) a hashtag that is recommended for use when tweeting about the show; in my preferred case, “#PoI” would be how I would scream to the world that I was referring to the currently-broadcasting episode of Person of Interest. Hashtags run the gamut from terse to almost taking up the entirety of the 140-character limit for tweets, but the important thing to remember here is that they are a method of self-selection: the person writing the tweet consciously chooses to apply the label to the tweet– and by extension, themselves, if that hashtag is representative or symbolic of something greater.

So we have this hashtag of “#gamergate”, and we have some people who have chosen to use it when tweeting or writing about video games. Great. But the problem here is one that is unique to semi-anonymous electronic communications: anyone can use the hashtag on any tweet. There is no hierarchical leadership within Twitter that dictates exactly how the hashtag is to be used. This is by design, to an extent; the hashtag is a bit of metadata, as opposed to data in and of itself. (Metadata is, at its core, data about data. If you think of your car as a piece of data, the fact that it is blue is metadata.) Where things get interesting– and by “interesting” I mean “Oh God what happened I turned my back for like five seconds and it’s all on fire now”– is that there is also no authority structure to say that the tag is being misused.
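
Since the entire argument turns on how little machinery a hashtag actually involves, here’s a minimal sketch of the idea in Python. (The extract_hashtags helper and the simplified pattern are my own illustration, not Twitter’s actual parser or API.)

```python
import re

# A hashtag is just a "#" followed by word characters, embedded in the
# tweet's own text. Nothing validates it, and nothing can revoke it:
# whoever writes the tweet decides which conversations it claims to join.
HASHTAG = re.compile(r"#(\w+)")

def extract_hashtags(tweet_text):
    """Pull the self-selected metadata (hashtags) out of the data (text)."""
    return HASHTAG.findall(tweet_text)

print(extract_hashtags("Stronger ethics rules for games journalism. #gamergate"))
print(extract_hashtags("Any statement at all can wear the same label. #gamergate #PoI"))
# Both tweets are now "in" the same conversation, and no authority
# structure exists anywhere to rule either use legitimate or illegitimate.
```

That absence of any validation step is the whole design; everything that follows flows from it.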

Let’s step away from the Internet for a moment and take a look at a similar phenomenon, one that exists in the American political system: specifically, bill riders and earmarks. Within the two houses of the US Congress, laws begin their existence as bills to be proposed on the floor of the House of Representatives or the Senate. (The rules for what starts where are arcane and beyond the scope of this article.) Now, these bills have pretty straightforward aims at their genesis: let’s take a hypothetical example of a bill that says owners of blue cars get a 1% tax break for a year. (What? I like blue cars.) The bill has to reach a certain level of approval within the originating committee in the House or Senate before it can be presented to the legislature at large, where it has to reach yet another threshold before going to the other house, where– you guessed it– it has to be approved at a certain level. Now, say for a moment that the senator from Pennsylvania has an objection to the bill, because (probably due to cadmium deposits found in the old coal mines or something) blue cars don’t sell nearly as well in his state. The PA senator can ask for a change to the bill– a rider– that says that in Pennsylvania, the break is extended to red cars as well. These riders, it should be noted, don’t have to have anything to do with the original bill; a senator from Connecticut could ask for a rider approving several million dollars of federal funding for a bridge repair. You can repeat the process for however many legislators it takes to get past the threshold of approvals before bringing the bill to a vote. This creates a rather nasty dilemma for legislators, who have to choose between supporting a bill that has unpleasant riders, or voting against a bill that could do good because of the riders attached to it.
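
To make the all-or-nothing mechanics concrete, here’s a toy sketch in Python. (The Bill type and cast_vote helper are purely illustrative assumptions of mine– they model no real legislative system.)

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Bill:
    title: str
    provisions: List[str] = field(default_factory=list)

def cast_vote(bill: Bill, supports: Callable[[str], bool]) -> str:
    """A legislator may judge each provision, but casts only one vote."""
    liked = [p for p in bill.provisions if supports(p)]
    disliked = [p for p in bill.provisions if not supports(p)]
    # The bundle passes or fails as a unit: there is no way to approve
    # the tax break while rejecting the bridge money.
    return f"{len(liked)} provisions liked, {len(disliked)} disliked -> still one yea or nay"

bill = Bill("Blue Car Tax Relief Act")
bill.provisions += [
    "1% tax break for owners of blue cars",               # the original aim
    "extend the break to red cars in Pennsylvania",       # a germane rider
    "several million dollars for a Connecticut bridge",   # an unrelated rider
]

print(cast_vote(bill, supports=lambda p: "car" in p))  # 2 liked, 1 disliked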

How does that tie into video games? Remember that “#gamergate” is both anarchic (in that it has no authority structure) and self-selected. Anyone can attach the hashtag to tweets of all kinds, ranging from demands for stronger ethical conduct rules in video game journalism to detailed rape and death threats against developers and their families. In a historically traditional view of anarchy-advocacy (that is, people arguing for anarchy as a method of self-governance), the group should be policing itself and clamping down on the destructive behavior of the latter. It isn’t. If anything, the tag has been co-opted by those making the threats– tyrants, and I use the word “tyrant” in its vernacular sense of “violent autocrat”, not my usual tongue-in-cheek definition of “necessary minimally-exercised authority”. The people arguing for greater transparency in video game journalism are being drowned out by those who would see game developers driven from their homes simply for making games that question the status quo.

So this presents an interesting question: which is the real face of the “#gamergate” movement, terror or accountability? The answer is, frankly, both. Because, and this is an important distinction, it is possible to be correct without necessarily being right.

I’m not arguing that there are no problems with how video games are covered and how they are presented to the public. The gaming press absolutely is complicit in the current state of affairs where games receive massive amounts of hype prior to release only to be abject trainwrecks. I’m also not arguing that there doesn’t need to be a greater female and minority voice within the video game industry. The most interesting and engrossing games I’ve played over the last five years have been female-developed. But what is undeniable is that it’s now not possible to declare any of these points without being implicitly associated with sociopathic jerks. That perception is self-perpetuating: it drives away people who would be a moderating or mitigating voice and attracts, well, more sociopathic jerks.

Unfortunately, the “#gamergate” hashtag is beyond salvaging as it currently stands. It’s highly unlikely that the voice of reason could ever regain control of the narrative that’s attached to the tag. The movement has, as it was destined to do, moved on. And this is just the beginning of something we’re going to see a lot more of in the future.

What’s interesting to note with regards to the “#gamergate” phenomenon is that it achieved success in both of its goals. It started as an outcry against corruption in video game reporting, and it resulted in getting several sites to re-evaluate their policies. It then moved on to terrorizing women in video games, something that also gained swift results. The problem is that, logically, these should have been two completely different movements, and people who pushed hard for one goal found themselves, and their credibility, being swept into a tsunami of support for the other– an outcome they may never have wanted at all.

I’m reminded of a quote from H.P. Lovecraft’s The Case of Charles Dexter Ward: “Do not call up what you cannot put down.” The people who started fighting for the cause of transparency found easy and fast allies in misogynists and psychopaths, and didn’t stop to think that when the goal of transparency was achieved, it might be a little hard to dial back the frothing anger they had stirred up in their erstwhile allies. Now they find themselves in the back seat, desperately trying to reclaim the reins of their movement from the people they egged on just days– or even hours– before. This prompts the feeble cries of “we’re about transparency!” while the label is overwhelmingly being used to justify threatening and stalking women in the industry.

If there was a chance to separate out the two sentiments, it has long since passed. The beast, called up from the depths, can no longer be put down. All we can do now is close the gates, to prevent something far, far worse from emerging.

EDIT, 15 October 6:30a: Since the original publication of this post, an individual inspired by the “#gamergate” movement has sent a threat to Utah State University promising violence against its students if USU went ahead with plans to have Anita Sarkeesian as a speaker today. Sarkeesian herself cancelled the talk, citing insufficient security measures at USU. The fact that actual violence was threatened as a result of the movement’s momentum is troubling, and within back-channels organized for the movement the cancellation is being hailed as a success, which is even more troubling.

A post on the NeoGAF forums describes the inaccuracies and myths that lie at the core of the continued assertion that the “#gamergate” movement is still about specific ethical grievances. As I mentioned yesterday (above), the initial impetus for the movement was resolved within a few days of its revelation. What has continued has been an embarrassment and a shame upon a hobby that has brought so many people together. The threats of violence must stop. The dishonesty about what is going on must stop.

Finally, Kris Straub, author of the webcomic Chainsaw Suit, has posted “The Perfect Crime”, which captures, with far more brevity, this post’s ultimate point about why self-selected movements are going to be problematic in the future. Like it or not, “#gamergate” was successful in both of its goals– the short-term one (forcing the investigation of an ethics concern) and the long-term one (stalking and threatening women). It’s now a textbook case of deception in public relations. You can expect this sort of “we’re not saying we condone the actions of the extremists in our movement, but we’re going to accept and claim responsibility for the results that those extremists get us” play to start showing up in issues that really matter.

And it’s the promise of that kind of future, where discussion is intentionally obfuscated and civilized argument is impossible, that terrifies me just as much as any threat to my person.

Fight or Flight

Throughout the life of this blog, and its predecessor, anime conventions have been a big part of my social activity. I’m grateful both for the time that I spent participating as a patron of these conventions, and for the effort and work I put into them as a contributor. I made some very close friends through my time with the local convention, and there were other people with whom I was not as close but remained amicable for the sake of the show. But today, two years after I was dismissed from the service of that show, I cut off the last and most tenuous of those relationships. It was not out of malice, but rather a realization that, since there was no longer a business connection which mandated me to swallow discomfort at certain behaviors, I had no real reason to continue associating with those individuals. Of course, it was sparked by one incident and one individual in particular, but that was (as is usually the case) the straw that broke the camel’s back.

A lot of people don’t know this, but two years ago, I came very close to leaving Pittsburgh and all of my friends behind. I felt as if the world around me had come crashing down; that I had overstayed my welcome in this city, and that I needed to leave in order to be able to move on and recover. If I had done so, I might have managed to join up with another convention, might have made more friends, might have been in a completely different situation. But in truth, I knew deep in my heart that I wouldn’t. I would have left Pittsburgh and recommitted myself to my previous habit of isolation, of coming home night after night to an empty apartment, playing video games alone and never once reaching out to anyone again. After all, I’d been burned so badly once again; the experiment had ended in failure and disproven my tenuous hypothesis that I could be a social creature. Being miserable and alone would have appeared preferable to being happy with others, just long enough for them to leave.

I thought seriously about it. Of course, in the end, I decided to stay. And it has brought me pain nearly every day since then, as the circle of friends I had previously engaged with and found a place in continued to deteriorate. Some friends had their circumstances change; others deliberately cut others off; still others left Pittsburgh themselves. Through it all I tried to remain friends with as many of them as I could, fighting my instinct tooth and nail to keep sight of the fact that I didn’t have to run away again, that I still had people here who I cared about and who reciprocated that affection. Each day that passed when another friend dropped off the radar was another body-blow to that assertion.

It all came to a head about a month ago, when one individual tried to get me to reconnect with the convention organization. The management that had dismissed me was still in charge, and I harbored doubts that I would fare any better under them this time than I had when I still had goodwill and ambition for the convention. I attempted to make it clear that I didn’t want to be part of the organization again, but the end result was that the person who reached out to me got the wrong impression and, I think, took it personally. The dismissal had felt like a personal insult, but it was not that individual who had delivered it. Unfortunately, this misunderstanding culminated in a breakdown in communications today, and prompted both of us to terminate relations. That person and I had almost never seen eye to eye, so I’m sure that neither of us is too terribly broken up about it.

Afterwards, of course, I felt myself wondering what was keeping me in Pittsburgh. Obviously now there is a more urgent force keeping me here, specifically my continuing education, but there was still an extremely strong urge to consider disappearing again. It has been on my mind throughout 2014, especially considering that up until May I had no real attachments keeping me here. I could have left any time I wanted. I chose, however, to stay; to leverage the resources available here to bring myself closer to a greater amount of freedom if and when I choose to leave later on. That choice is still a few years off now, but it has been on my mind today.

My education plan includes, as a matter of necessity, an extended period of time spent overseas in order to more fully immerse myself in the language, culture, and idioms of Japan. I love travel, and it would be dishonest to say that I’m not looking forward to the trip. But the reason for the travel is not just for the educational opportunity it provides, and it is also not solely for the entertainment and excitement of international tourism. In a sense, crossing the Pacific Ocean is a real chance for a new start. I’ve said on more than one occasion that once I leave Pittsburgh, the odds are not good that I will return for very long, if at all; most of the work that I would be looking to do is centered on the West Coast, and if that doesn’t work out, I can freelance from pretty much anywhere on the planet. I wouldn’t mind an itinerant lifestyle.

Even with all of this on my mind, I have found myself not wanting to leave. I’m making new friends in my classes, socializing more and forging new connections once again. Each day that passes I find myself more and more unable to make the mental severance that I maintained throughout my time at Gannon: that the campus was merely a way-station, the origin point of my journey, but never more than first base. The Pitt campus feels more like a home that I will not want to leave. Though I’m only familiar with a few of its buildings right now, each day I discover more places that feel like mine, places where I belong. I am only a visitor here, but I increasingly don’t want to leave.

Last week, I did something uncharacteristic: I went to a Pitt Panthers game. Well, half of one, anyway; it was a blowout by the end of the first quarter, and I was in lousy seats in direct sunlight, so I left near the end of the half. Before the game started, though, the Alma Mater was sung, and I was struck by one of the lines in the song: “Over fate and foe victorious”. The past few years have been bad for me, in the professional sphere and in my personal life. I had a bad health scare, and my mental health hit a breaking point. A lot has happened, enough so that the phrase does little to really encapsulate the breadth of the challenges. But you know what? I’m still here. It hasn’t been easy. It isn’t going to get any easier. But I am still here. I’m here, and there are still people here who want me here. I am where I belong, at least for now.

I can live with that.

Above And Beyond

After having had all of my classes once each, and after doing homework for two of them, I’ve come to realize that being older and having had this level of curiosity about Japan prior to my formal attempts at learning has made me a bit too eager to move on to certain things. I pride myself on being a fast learner anyway, but being able to pick up on the fact that the first reading assignment in one particular class is meant to demonstrate what not to do, or being able to connect a vocabulary lesson to a catchphrase heard years ago… These are things that I don’t think too many of my classmates can do just yet. I have to balance out my natural inclination to go as far as I can and to exceed expectations with the desire not to stand out too much.

These are, of course, fundamentally incompatible, which is why it is so difficult and stressful.

Stand Up/The Vanguard

In a few short hours– less than half a day– I’ll be beginning my second college experience, and with it, my second career. Quite a bit has happened in the past ten months, much of which would destroy anyone who had not already endured it, but one thing remains true: I am a survivor. I cannot be broken so long as I can see a way forward. I have clawed my way out of hellish situations in the past, and this one– while still the worst challenge I’ve ever been set against– is no different.

I got to thinking about this a little this afternoon. I remember what I had to my name when I left Cleveland for the last time; I had sold off all but a handful of my most treasured possessions, and felt that there could be no recovering from such a disaster. That was the end of 2006, the conclusion of a dark chapter in my life, and the beginning of a rebirth of sorts. 2007 was not easy, but it was better. By 2009 I had considered the Reclamation Project complete, and was looking to improve my situation beyond where it had been before my retreat. I may have overextended my reach in some cases, but by and large I was on the right track– until I suffered an exceptionally severe advance of my depression in 2012. Life collapsed around me then, and while not all of it can be traced back to the disease which is my daily hell, the disease certainly didn’t help matters.

Tomorrow, though, starts the rebuilding phase again. It will not be easy. It will not be quick. I will have to sacrifice, to eliminate much from my everyday life, in order to recover even the slightest equilibrium, let alone advance. The next three to five years will be a true test of who and what I am. Some people never survive their first trip to college; they drop out, or find they can’t handle the pressure, or discover their true passion and talent elsewhere. This will be my second. And if a third, fourth, or ninth is required, then so be it.

The last week has been one where I have found myself doubting everything that has led me to this point. An unrelated setback also occurred which shook my confidence and left me truly doubtful as to whether or not I could manage any real improvement. I’ll freely admit that there have been nights where I have lain awake and on the verge of tears, wondering if I hadn’t just wasted every breath since last Thanksgiving. Some nights I crossed that border.

Tonight will not be one of those nights. I’m going to bed, and I expect that I will sleep peacefully, confident that everything will be okay for once. I am, for a change, aware that this is within my power, not just to influence, but to control. That’s the key for me, and what’s been a major factor in my emotional crashes: recognizing that what happens to me is not the same as what happens because of me. There are going to be a lot of unforeseen problems from here on out. Some of them are going to wreck my shit completely. But what I need to keep within me is this feeling– right now– that says that all of that would happen even if I did see it coming and simply couldn’t avoid it. I can let the world go to hell. As long as I keep doing my part to prevent it– by studying, and dedicating myself to the ideal that communication is the answer– then none can judge me unworthy.

If my life is too big to fix on my own, then the reverse is true as well: I’m not wholly responsible for it falling apart, either, and I don’t deserve to stay so low.

I believe that luck is cyclical. I had a bunch of good years in the beginning, and then fourteen bad ones. The wheel has to come back around sometime.

Good night, folks.

Faulty Motivator

To say that the last two months have been hectic and busy would be a gross understatement, the likes of which is unheard of in my usual idiom of communication. It’s taken this long for me to get back to something approaching a normal schedule, and despite the fact that I start classes this coming Monday, I’m still not at 100%. But, like I said, I have less than a week to go: the time to slack is running out.

I’ve spoken at length about depression here, and in other places, and it’s because of that fact that I feel like I really shouldn’t be relying on it as an excuse for why I have tended to nap for hours during the day and have been almost completely inactive on the weekends. But, like it or not, I still have depression, and like it or not, that still means I get wiped out a lot more easily than healthy people do. It’s not so much an excuse as it is a challenge, and it’s one I’m going to have to overcome relatively quickly if I’m to solve the majority of my problems.

Part of this is that I do need to muster up the motivation to do something extracurricular that poses an actual mental challenge. An acquaintance started translating old NES games for what I can only assume to be fun, and I’m thinking it might not be a bad idea to at least do the script work for some older titles as well. This is all predicated on me keeping up my studies; I refuse to accept anything less than a 3.0 on my report card, with a 3.5 being my ultimate goal. I will not fail, I will not falter.

I should probably also mention that I am getting very excited to get back into studying. I picked up the majority of my textbooks last week, and it’s been a bit of a struggle to keep myself from reading ahead through the novels assigned for one of the classes. I’ve also flipped through my language books, and at first glance they’re set up in a very interesting way, different from almost every other tutorial text I’ve seen on the language. It’s not about rote memorization of the kana, but about context; this echoes some of what I discovered about my own osmosis of the language through countless years of games and anime. It’s an extremely natural way to learn, and one which I’m sure will work for me.

As an aside, I tried taking the advice of several friends who told me to plow through a kanji dictionary a handful of pages at a time over the summer. I just couldn’t do it. I am fairly certain I need the interaction with other learners and actual speakers of the language in order to connect the mental dots. Which, coincidentally, brings me to my next point.

For me, college (the first time around) was as much about learning how to do certain things as it was about learning how I learn. Endless calculus drills and derivations have left me all but unable to balance my checkbook, let alone determine the volume of an irregular solid in fifth-dimensional space. Reading through white papers and experiment results was an excellent way to put me to sleep. I literally could not endure another mumbled lecture on how multiple inheritance works in C++. But put a task in front of me, and I learned everything I needed to. Have me write about what I got out of a reading assignment, and I could go to town on it. Ask me, and let me ask, and you’ll find that I get it a lot more easily than one might think. I learn by doing, by putting principles into action and experimenting with what I know (or think I know).

A few days ago, a friend posted a bunch of haiku to her blog, in written Japanese. I didn’t ask for a translation; I want to work it out for myself, and I know I will in time. But it’s that sense of going the extra mile, of wanting to fight through an assignment that piques my interest, that has me more excited than the prospect of ten-minute rampages across campus to get to class on time, or lectures that warp the fabric of reality and become inescapable temporal anomalies. It’s not about learning to do. It’s about what I can do with what I learn.

And that is plenty motivation enough for me.
