
Silver Screen Sentiments

A few years before my Dad died, my sister gave me a rather interesting Christmas gift: a ticket stub scrapbook. I was already well into my habit of seeing movies regularly, but there was something a bit more to the scrapbook than just a place to keep a record of what I’d seen. I put every ticket stub I had into it, which meant we went to see National Treasure 2 that weekend. From then on, though, I meant to keep a perfect record of my cinematic consumption.

I love movies, but more than just watching them I love going to the movies. A theater is, for me, a place of refuge, where I can set aside the troubles of the real world for a couple of hours and watch someone else’s story unfold before me. There is peace to be found in even the most gleefully violent turn-your-brain-off action cliche heap. And it’s all larger than life, larger than is possible to replicate at home. I won’t ever give up my huge flatscreen, but it’s nothing compared to a glorious DLP IMAX wall of film. So I go to the movies, because some movies just plain deserve to be seen that way.

In Cleveland– actually Macedonia– I would, almost like clockwork, go to the theater and see one or two movies each Saturday. It was a comfortable and peaceful routine. I kept it up when I moved to Pittsburgh, but as I started to get back to collecting games and anime series for my library, I found myself spending more and more of those Saturdays at home, marathoning a game or TV box set. I had almost ended the practice.

The scrapbook changed that. I started paying attention to release dates; if there was something I wanted to see, I tried to go on opening weekend. Funnily enough, this also meant seeing some big-ticket Ghibli films, like the theatrical runs of Ponyo and Arrietty. And, of course, the Marvel movies were starting up again. It was a good time to go catch some flicks.

And then depression hit. One of the first things that depression does is take things you once loved to do and make you feel bored by them. Actually, you don’t feel bored. You don’t feel enjoyment. You don’t feel pleasure. You just plain don’t feel anything. The technical term for this is anhedonia, from the same Greek root that gives us hedonist. Someone who is anhedonic is literally unable to feel pleasure or happiness. You can tickle them all you like, but the laughter will be a mere unconscious physiological response; there will be no sincere mirth in it. It’s kind of like baking a tray of cookies, only to open the oven and find absolutely nothing inside, not even the baking sheet the cookies were on. You wanted cookies. You got a puzzlingly intractable nothingness.

Movies, either at home or in a theater, just didn’t do it for me anymore. It baffled me, because while I was in the theater I was laughing along with the punch lines, gasping at the villains, and generally appearing to enjoy myself. But it never sank in the way it used to. Of all of the things that I could say about depression, that is the most frustrating topic to discuss, because there literally is no vocabulary in any language on earth to accurately express the complete and total void within my mind. Even the word void doesn’t cut it: it implies a contrast with non-void. If it had been just a dead chunk of my brain– like I’d had a stroke or something– eventually neurons would reroute themselves within the healthy parts to restore proper order (or a reasonable facsimile thereof). Depression isn’t so much having to make detours in your brain’s highway system as it is waking up one morning to discover that every square millimeter of asphalt in said system has spontaneously become molten lava, and your car’s magnesium rims have just exploded, setting your garage on fire, and you have to be at work in ten minutes. Still not accurate. But close. The reality is worse.

The kicker of it is that it’s completely out of one’s conscious control. It’s all to do with neurotransmitter levels in the brain, the internal messaging system that allows three pounds of flesh to safely run the other hundred-odd pounds. The brain, already awash in a faulty mix of those signaling chemicals, overcorrects for their influence. Unfortunately, at this point the damage is done, because the overcorrection becomes the new “how to fix this” procedure. When the sads next hit, the brain goes overboard in the other direction, unwittingly unbalancing itself because it has difficulty telling whether or not a little grief is going to germinate into a full-blown crash.

I used to wonder why people drink to excess, or use drugs that they know are not good for them, or do other self-destructive things in the name of avoiding feeling bad. I don’t anymore. At a certain point, you become desperate to feel happy again– to feel anything again, even just for a little while. I hit that point. Hard. But I still count myself extremely lucky that I was lucid enough to know that the dangers were far worse than the potential benefits. Not everyone does. Worse, not everyone who hits that point even cares.

But I got help. I am on medicine now– it’s not a cure-all, but I’m not the zombie I feared I would be, either. I’m back in therapy. And, probably most importantly, I’m going to the movies again. Today was a double-feature, Jurassic World and Inside Out. I lost the stub for the first one, which upsets me (but not that much– it was a pretty basic monster movie, saved only by Chris Pratt’s outstanding performance). But the second flick… it felt great to go back into the theater and sit down as the lights dimmed. 

For the first time in a very long time, I found peace in the darkness.

No Surprises Here

The horrendous tantrum that started back last August rolls onward, which should surprise exactly none of you; these things tend to grow legs of their own accord and sooner or later nobody can catch them. Unfortunately, that’s exactly what happened. The sense of sheer entitlement and exclusionism that started with “them damn feminazis tryin’ ta take away my video games” has blown up into a general maelstrom of the Defenders of True Geek Culture trying to force out the insidious forces of “progressives” and “feels”, to ensure that the things they love will remain theirs alone and theirs forever– even if they were never intended for them in the first place.

Perhaps, then, the fact that Joss Whedon deleted his Twitter account on May 4th– mere days after the release of his most recent film, Avengers: Age of Ultron– is not so surprising in and of itself. Whedon has always been a polarizing figure in geek circles anyway, with some people not liking that he tends to write the same “powerful” female characters in every work, and others just not really liking the over-reliance on under-intelligent banter. But up until yesterday, he was seen as “safe” from the criticisms and havoc that literally anyone else ever saying those things would have to endure, owing less to his previous success than to his birthright as a white dude. And I say that as a white quasi-dude. That he threw up his hands and walked away from Twitter should have been a wake-up call for those who sent anyone any kind of abuse. That it apparently hasn’t been is unsurprising, the inherent unsurprisingness of which leads me to believe that I really shouldn’t expect to be surprised at the depth of human sickness anymore. (I’ll try to stop saying “surprise” now.)

What we have here is a rather unusual counter-stroke to what the Internet had allowed to occur back in the 90s and early 00s in the first place: a sort of reactionary-revisionist faction seeking to isolate and disenfranchise people en masse. See, back in the early days of the Internet, it was a good thing that people with incredibly diverse interests could connect with each other regardless of their location. No matter what you were into, be it Star Wars, Final Fantasy, anthropomorphic animals, sex with furniture, whatever– the odds were that somewhere out there was someone else with the same interest who was probably dying for the chance to chat about it. (Although the sex with furniture thing is pretty weird. I’m not judging, just saying it’s weird and not my thing, but if it’s yours, you’re welcome to it.) For people who were of a certain mindset that wasn’t common in the era or area that they were growing up in, the Internet was a godsend.

At some point, however, there started to be a backlash against the background weirdness of the universe being brought into the foreground. Like nebulae condensing into stars, the scattered pockets of weird on the Internet were coalescing into groups, organizations that could support their members as needed. Some people thought that these pockets of weird should not exist, that they were “convincing people that they were normal when in fact, they weren’t”. People started highlighting these groups and shaming them, ostracizing them in much the same way that the individuals had been isolated in their everyday lives. The concept of “live and let live” was sorely lost on these people.

Then you had the parts of the Internet where there were no rules, where things could get shocking and horrendous without warning. At first there was an unstated rule saying that it was all done in satire, that the racist, xenophobic material being bandied about like cat pictures on Saturdays wasn’t at all representative of the users’ actual views. But it was unstated, and stayed unstated, and assertions that it was serious were not taken as the kayfabe they were proffered as, but instead at face value. Eventually even the unstated rule fell away, and there were actual violent psychotic monsters posting in full sincerity. In some sense it is the Aristocrats as a Möbius strip: a setup so filthy that it turns inwards upon itself, ever escalating, never reaching the punchline that renders what came before it benign and funny (if it ever could be considered so).

It’s not exactly clear when these two groups got together and birthed the mindset that the Internet was a horrible place, filled with depravity and devoid of mercy. Certainly, the mainstream media did not help matters; scare stories about websites where Your Children were At Risk of Predators were a dime a dozen in those early days. They’ve calmed down a little since then, but those stories are still no more grounded in fact than the Ripped From The Headlines TV-movies (of similar vintage, but thankfully now extinct) about whatever the scandal of the week happened to be. By calling out and highlighting the awful behavior of a small minority of participants online, the coverage painted the picture that the Internet was a lawless place free from consequences and populated only by unfeeling avatars. It was like a TV news crew broadcasting the exact times and street corners where a drug dealer hung out, in the hopes that the people who would make use of this information would be the police instead of the drug dealer’s customers. It attracted the people who would do these horrible things, and made them seem “normal”.

But nonetheless, the mindset that outright hostility and sociopathic behavior were the baseline of conduct on the internet became the “accepted” norm. I say “accepted” because by and large the only time anyone calls this out is when they are themselves under attack. “Everyone else is fair game; hurt my feelings, though, and you’ve crossed the line.” And in a sense, it was the “live and let live” attitude from the early days that allowed that mindset to assert itself as “the way it is”, simply because nobody wanted to tell those people they couldn’t do what they wanted. An abuse of logic allowed people to shoot down the argument of “you can’t bully anyone, they’re just doing their thing” by saying “well, bullying them is my thing, and by the same assertion you can’t tell me not to”. It again boils down to the unstated half of the axiom: “they’re doing their thing and not hurting anyone else”.

But getting back to Joss Whedon and the current state of affairs, the fight against the hostility which has now entrenched itself in the Internet that once brought people together is going… poorly. The immediacy of the medium means that you have to be there to defend yourself, and if enough people push you to a breaking point through death threats or other promises of violence, well, you either soldier on or you fold up and go home. There is a severe lack of equilibrium within what passes for conversation online today: many can group together to attack, but a defender always stands alone. Faced with a crush of humanity in all its bile and wrath, what choice is there but to flee? Quite frankly, it’s probably safe to say it’s not worth fighting.

Except it is.

We are facing a new era of society, one where our intrinsic selves are exposed to the entirety of humanity at a moment’s notice. Socially, this has not happened in several thousand years. What we are seeing is the throes of evolution at work; raw aggression, this time in social interaction, is being selected for, as those who cannot properly process the emotions of seven billion humans being thrown at them are weeded out of the gene pool. Unlike the evolutionary crises which allowed us to start using tools, or to grasp the greater mysteries of the universe through advanced mathematics, however, we have a tool greater than any formula: we can become aware of what we are sacrificing in order to succeed in this new era of humanity. Who knows what skills or abilities we gave up when the Great Engineer of the Universe pushed us to our current state? But we know exactly what we are losing now: traits like compassion, empathy, gentleness, compromise. We are losing our ability to do the things which brought us to this point in our history.

It’s not my place to say whether or not the ultimate fate of humanity some hundreds of years from now is to touch the stars with the better angels of our nature by our sides, or to grasp them from atop a tower of our enemies’ corpses. However we are destined to survive this evolutionary inflection point, we must as a species do so. I will continue to fight for equality, for a world where hostility is the exception and not the rule, for a world where everyone is free to choose as dictated by the desires of their truest self, for the people who believe to keep believing, for the people who don’t know to find their answers wherever they may lie. I will champion the cause of positivity and compassion for as long as I live.

Which, of course, shouldn’t surprise you.

Five Terrifying (But Thankfully Fictional) Computer Viruses

This morning I saw a news report that suggested keeping confidential information on a detachable drive, such as a flash drive, in order to avoid having the data stolen. I immediately thought of a way that could be circumvented, and at the risk of sounding like Buzzfeed, I thought up four more viruses that, as far as I know, are only the product of my own imagination. This was just a thought exercise, nothing serious or goal-driven, merely something to think about.

1. Johnny Cache: Captures PDF files within a likely size range from the infected computer and any attached external drives, then shares them to a “cloud” service to allow people to mine them for personal information.

2. The Beat-Alls: Deliberately issues hundreds of thousands of read/write cycles on all storage devices attached to the infected computer, counting on wearing the devices out faster and destroying data.

3. Gabba Gabba Hey: Snoops traffic on all networks the infected computer is connected to, and stores packets for later “echoing” back into the network. The idea is to flood a network with a denial of service attack using data that is indistinguishable from “real” traffic.

4. The Vapors: Installs various input method editors into the infected computer and randomly switches among them at set intervals, turning all typed information into total gibberish; worse, attempts to disable the IME trigger the infected computer to randomize the encoding of all text displayed as well as input.

5. London Calling: Disables, then takes direct control of, temperature controls on the infected computer, with the intention of destroying the hardware through overheating and/or overuse of the cooling mechanisms, ultimately creating a fire hazard.

The Gates

The last few months, as I’m sure you’re aware, have been pretty rough within the tech and video game industries. It started with a disgruntled post on a message board, claiming to be from an ex of a notable female video game developer. The post accused that developer of having had a brief romantic relationship with a writer for a game review website. This sparked a discussion of ethics in video game journalism and the interconnectedness of reviewers and developers, both metaphorical and literal.

The review website conducted its investigation, found no evidence of wrongdoing, and revised its policies to prevent the appearance of impropriety in the future. In any other industry, with any other individuals involved, and any other consumers raising the alarm, that would have been the end of it. Instead, it has since descended into a maddening maelstrom of abuse and hate, with invasions of privacy being perpetrated both for and against “the cause”. Rather than look at the currently-ongoing imbroglio, though, I think it’s important to step back and take a look at both how the metaphorical tropical depression escalated into a full-blown hurricane of hostility, and how this is only a harbinger of what is going to happen in the future.

The first thing you need to be aware of is that the seeds of the current mess were sown with the hashtag “#gamergate”. A hashtag (in case you haven’t already been bludgeoned over the head with the word since Twitter hit critical mass in 2010) is a text marker, preceded by the “hash” or “pound” symbol, that is added to designate a tweet as being part of a larger conversation. For example, when television shows air, you’ll often find (either in place of or right above the ubiquitous “bug” station identifiers in the lower right-hand corner of the screen) a hashtag that is recommended for use when tweeting about the show; in my preferred case, “#PoI” would be how I would scream to the world that I was referring to the currently-broadcasting episode of Person of Interest. Hashtags run the gamut from terse to almost taking up the entirety of the 140-character limit for tweets, but the important thing to remember here is that they are a method of self-selection: the person writing the tweet consciously chooses to apply the label to the tweet– and by extension, to themselves, if that hashtag is representative or symbolic of something greater.

So we have this hashtag of “#gamergate”, and we have some people who have chosen to use it when tweeting or writing about video games. Great. But the problem here is one that is unique to semi-anonymous electronic communications: anyone can use the hashtag on any tweet. There is no hierarchical leadership within Twitter that dictates exactly how the hashtag is to be used. This is by design, to an extent; the hashtag is a bit of metadata, as opposed to data in and of itself. (Metadata is, at its core, data about data. If you think of your car as a piece of data, the fact that it is blue is metadata.) Where things get interesting– and by “interesting” I mean “Oh God what happened I turned my back for like five seconds and it’s all on fire now”– is that there is also no authority structure to say that the tag is being misused.
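To make the “metadata, not data” point a little more concrete, here is a minimal sketch in Python of how a hashtag is nothing but a label pulled out of a tweet’s own text. This is my own toy illustration, not Twitter’s actual parsing rules; the function name, the simplified regex, and the sample tweets are all invented for this post.

```python
import re

# Simplified pattern: "#" followed by letters, digits, or underscores.
# (An approximation for illustration; Twitter's real tokenizer is fussier.)
HASHTAG_PATTERN = re.compile(r"#(\w+)")

def extract_hashtags(tweet_text: str) -> list[str]:
    """Return the hashtags embedded in a tweet, lowercased for comparison.

    Note what is *not* here: no check of who wrote the tweet, no check of
    what the tweet actually says, and no authority deciding whether the
    tag "belongs" on it. The tag is just metadata the author chose to attach.
    """
    return [tag.lower() for tag in HASHTAG_PATTERN.findall(tweet_text)]

# Two very different tweets, carrying identical metadata:
print(extract_hashtags("Disclose your relationships with developers. #gamergate"))
print(extract_hashtags("She deserves everything she gets. #gamergate"))
# Both print: ['gamergate']
```

Nothing about that second call looks any different from the first as far as the metadata is concerned, which is exactly the problem: the label is self-applied, and no structure exists to say it is being misused.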

Let’s step away from the internet for a moment and take a look at a similar phenomenon, one that exists in the American political system, specifically bill riders and earmarks. Within the two houses of the US Congress, laws begin their existence as bills to be proposed on the floor of the House of Representatives or the Senate. (The rules for what starts where are arcane and beyond the scope of this article.) Now, these bills have pretty straightforward aims at their genesis: let’s take a hypothetical example of a bill that says owners of blue cars get a 1% tax break for a year. (What? I like blue cars.) The bill has to reach a certain level of approval within the originating committee in the House or Senate before it can be presented to the legislature at large, where it has to reach yet another threshold before going to the other house, where– you guessed it– they have to approve it at a certain level. Now, say for a moment that the senator from Pennsylvania has an objection to the bill, because (probably due to cadmium deposits found in the old coal mines or something) blue cars don’t sell nearly as well in his state. The Pennsylvania senator can ask for a change to the bill– a rider– that says that in Pennsylvania, the break is extended to red cars as well. These riders, it should be noted, don’t have to have anything to do with the original bill; a senator from Connecticut could ask for a rider approving several million dollars of federal funding for a bridge repair. You can repeat the process for however many legislators it takes in order to get past the threshold of approvals before bringing the bill to a vote. This creates a rather difficult dilemma for legislators, who have to choose between supporting a bill that has unpleasant riders and voting against a bill that could do good, all because of the riders attached to it.

How does that tie into video games? Remember that “#gamergate” is both anarchic (as in it has no authority structure) and self-selected. Anyone can attach the hashtag to tweets of all kinds, ranging from demands for stronger ethical conduct rules in video game journalism to detailed rape and death threats against developers and their families. In a historically traditional view of anarchy-advocacy (that is, people arguing for anarchy as a method of self-governance), the group should be policing itself and clamping down on the destructive behavior of the latter. It isn’t. If anything, the tag has been co-opted by those making the threats, and by tyrants (and I use the word “tyrant” in its vernacular sense of “violent autocrat”, not my usual tongue-in-cheek definition of “necessary minimally-exercised authority”). The people arguing for greater transparency in video game journalism are being drowned out by those who would see game developers driven from their homes simply for making games that questioned the status quo.

So this presents an interesting question: which is the real face of the “#gamergate” movement, terror or accountability? The answer is, frankly, both. Because, and this is an important distinction, it is possible to be correct without necessarily being right.

I’m not arguing that there are no problems with how video games are covered and how they are presented to the public. The gaming press absolutely is complicit in the current state of affairs where games receive massive amounts of hype prior to release only to be abject trainwrecks. I’m also not arguing that there doesn’t need to be a greater female and minority voice within the video game industry. The most interesting and engrossing games I’ve played over the last five years have been female-developed. But what is undeniable is that it’s now not possible to declare any of these points without being implicitly associated with sociopathic jerks. That perception is self-perpetuating: it drives away people who would be a moderating or mitigating voice and attracts, well, more sociopathic jerks.

Unfortunately, the “#gamergate” hashtag is beyond salvaging as it currently stands. It’s highly unlikely that the voice of reason could ever regain control of the narrative that’s attached to the tag. The movement has, as it was destined to do, moved on. And this is just the beginning of something we’re going to see a lot more of in the future.

What’s interesting to note with regard to the “#gamergate” phenomenon is that it achieved success in both of its goals. It started as an outcry against corruption in video game reporting, and it resulted in getting several sites to re-evaluate their policies. It then moved on to terrorizing women in video games, something that also gained swift results. The problem is that, logically, these should have been two completely different movements, and people who pushed hard for one goal found themselves, and their credibility, washed into a tsunami of support for the other, one they may never have wanted to support at all.

I’m reminded of a quote from H.P. Lovecraft’s The Case of Charles Dexter Ward: “Do not call up what you cannot put down.” The people who started fighting for the cause of transparency found easy and fast allies in misogynists and psychopaths, and didn’t stop to think that when the goal of transparency was achieved, it might be a little hard to dial back the frothing anger stirred up in their erstwhile allies. Now they find themselves in the back-seat, desperately trying to reclaim the reins of their movement from the people they egged on just days– or even hours– before. This prompts the feeble cries of “we’re about transparency!” when the label is overwhelmingly being used to justify threatening and stalking women in the industry.

If there was a chance to separate out the two sentiments, it has long since passed. The beast, called up from the depths, can no longer be put down. All we can do now is close the gates, to prevent something far, far worse from emerging.

EDIT, 15 October 6:30a: Since the original publication of this post, an individual inspired by the “#gamergate” movement has sent a threat to Utah State University promising violence against its students if USU went ahead with plans to have Anita Sarkeesian as a speaker today. Sarkeesian herself cancelled the talk, citing insufficient security measures at USU. The fact that actual violence was threatened as a result of the movement’s momentum is troubling, and within back-channels organized for the movement the cancellation is being hailed as a success, which is even more troubling.

A post on the NeoGAF forums describes the inaccuracies and myths that lie at the core of the continued assertion that the “#gamergate” movement is still about specific ethical grievances. As I mentioned yesterday (above), the initial impetus for the movement was resolved within a few days of its revelation. What has continued has been an embarrassment and a shame upon a hobby that has brought so many people together. The threats of violence must stop. The dishonesty about what is going on must stop.

Finally, Kris Straub, author of the webcomic Chainsaw Suit, has posted “The Perfect Crime”, which makes, with far more brevity, the same ultimate point as this post regarding why self-selected movements are going to be problematic in the future. Like it or not, “#gamergate” was successful in both of its goals– the short-term one (of investigating the ethics concern) and the long-term one (stalking and threatening women). It’s now a model case, a textbook case, for deception in public relations. You can expect this sort of “we’re not saying we condone the actions of the extremists in our movement, but we’re going to accept and claim responsibility for the results that those extremists get us” play to start showing up in issues that really matter.

And it’s the promise of that kind of future, where discussion is intentionally obfuscated and civilized argument is impossible, that terrifies me just as much as any threat to my person.

Stand Up/The Vanguard

In a few short hours– less than half a day– I’ll be beginning my second college experience, and with it, my second career. Quite a bit has happened in the past ten months, much of which would destroy anyone who had not already endured it, but one thing remains true: I am a survivor. I cannot be broken so long as I can see a way forward. I have clawed my way out of hellish situations in the past, and this one– while still the worst challenge I’ve ever been set against– is no different.

I got to thinking about this a little this afternoon. I remember what I had to my name when I left Cleveland for the last time; I had sold off all but a handful of my most treasured possessions and felt that there could be no recovering from such a disaster. That was the end of 2006, the conclusion of a dark chapter in my life, and the beginning of a rebirth of sorts. 2007 was not easy, but it was better. By 2009 I had considered the Reclamation Project complete, and was looking to improve my situation further than I had been before my retreat. I may have overextended my reach in some cases, but by and large I was on the right track– until I suffered an exceptionally severe advance of my depression in 2012. Life collapsed around me then, and while not all of it can be traced back to the disease which is my daily hell, it certainly didn’t help matters.

Tomorrow, though, starts the rebuilding phase again. It will not be easy. It will not be quick. I will have to sacrifice, to eliminate much from my everyday life, in order to recover even the slightest equilibrium, let alone advance. The next three to five years will be a true test of who and what I am. Some people never survive their first trip to college; they drop out, or find they can’t handle the pressure, or discover their true passion and talent elsewhere. This will be my second. And if a third, fourth, or ninth is required, then so be it.

The last week has been one where I have found myself doubting everything that has led me to this point. An unrelated setback also occurred which shook my confidence and left me truly doubtful as to whether or not I could manage any real improvement. I’ll freely admit that there have been nights when I have lain awake on the verge of tears, wondering if I hadn’t just wasted every breath since last Thanksgiving. Some nights I crossed that border.

Tonight will not be one of those nights. I’m going to bed and I expect that I will sleep peacefully, confident that everything will be okay for once. I am, for a change, aware that this is within my power, not just to influence, but to control. That’s the key, for me, and what’s been a major point of my emotional crashes: that what happens to me is not what happens because of me. There are going to be a lot of unforeseen problems from here on out. Some of them are going to wreck my shit completely. But what I need to keep within me is this feeling– right now– that says that all of that would happen even if I did see it coming and simply couldn’t avoid it. I can let the world go to hell. As long as I keep doing my part to prevent it– by studying, and dedicating myself to the ideal that communications is the answer– then none can judge me unworthy.

If my life is too big to fix on my own, then the reverse is true as well: I’m not wholly responsible for it falling apart, either, and I don’t deserve to stay so low.

I believe that luck is cyclical. I had a bunch of good years in the beginning, and then fourteen bad ones. The wheel has to come back around sometime.

Good night, folks.

Faulty Motivator

To say that the last two months have been hectic and busy would be a gross understatement, the likes of which are unheard of from my usual idiom of communication. It’s taken this long for me to get back to something approaching a normal schedule, and despite the fact that I start classes this coming Monday, I’m still not entirely at 100%. But, like I said, I have less than a week to go: the time to slack is running out.

I’ve spoken at length about depression here, and in other places, and it’s because of that fact that I feel like I really shouldn’t be relying on it as an excuse for why I have tended to nap for hours during the day and have been almost completely inactive on the weekends. But, like it or not, I still have depression, and like it or not, that still means I get wiped out a lot more easily than healthy people do. It’s not so much an excuse as it is a challenge, and it’s one I’m going to have to overcome relatively quickly if I’m to solve the majority of my problems.

Part of this is that I do need to muster up motivation to do something extracurricular that poses an actual mental challenge. An acquaintance started translating old NES games for what I can only assume to be fun, and I’m thinking it might not be a bad idea to at least do the script work for some older titles as well. This is all predicated on me keeping up my studies; I refuse to accept anything less than a 3.0 on my report card, with a 3.5 being my ultimate goal. I will not fail, I will not falter.

I should probably also mention that I am getting very excited to get back into studying. I picked up the majority of my textbooks last week, and it’s been a bit of a struggle to prevent myself from reading through the novels assigned for one of the classes ahead of time. I’ve also flipped through my language books, and at first glance they’re set up in a very interesting way, different from almost every other tutorial text I’ve seen for the language. It’s not about rote memorization of the kana, but very contextual; this echoes some of what I discovered about my own osmosis of the language through countless years of games and anime. It’s an extremely natural way to learn, and one which I’m sure will work for me.

As an aside, I tried taking the advice of several friends who told me to plow through a kanji dictionary a handful of pages at a time over the summer. I just couldn’t do it. I am fairly certain I need the interaction with other learners and actual speakers of the language in order to connect the mental dots. Which, coincidentally, brings me to my next point.

For me, college (the first time around) was as much about learning how to do certain things as it was learning how I learn. Endless calculus drills and derivations have left me all but unable to balance my checkbook, let alone determine the volume of an irregular solid in fifth-dimensional space. Reading through white papers and experiment results was an excellent way to put me to sleep. I literally could not endure another mumbled lecture on how multiple inheritance works in C++. But put a task in front of me, and I learned everything I needed to. Have me write about what I got out of a reading assignment and I could go to town on it. Ask me, and let me ask, and you’ll find that I get it a lot more easily than one might think. I learn by doing, by putting principles into action and experimenting with what I know (or think I know).

A few days ago, a friend posted a bunch of haiku to her blog, in written Japanese. I didn’t ask for a translation; I want to work it out for myself, and I know I will in time. But it’s that sense of going the extra mile, of wanting to fight through an assignment that piques my interest, that has me more excited than the prospect of ten-minute rampages across campus to get to class on time, or lectures that warp the fabric of reality and become inescapable temporal anomalies. It’s not about learning to do. It’s about what I can do with what I learn.

And that is plenty motivation enough for me.

Filing Issues With Reality

Just before Tekkoshocon, Pez lent me his copy of Jane McGonigal’s Reality Is Broken: Why Games Make Us Better and How They Can Change the World. I regret to say, actually, that I didn’t get a chance to crack it open until well after the show was over. On the flip side, though, it only took me about twenty pages before I realized I needed a copy of the book myself.

McGonigal doesn’t waste any time in providing her argument. She starts off with a mythological story about how the ancient Lydians survived an eighteen-year famine through the effective use of games. (Which puts the other purchase I made at the bookstore that day– Suzanne Collins’ The Hunger Games– in a markedly less pleasant light.) She then breaks her argument into three parts, detailing, as she goes, eleven ways that reality could be improved through the strategic implementation of behaviors seen in gaming of all stripes.

While McGonigal focuses predominantly on electronic gaming, only breaking out of it for a handful of her examples, she presents an incredibly strong case that the mechanical aspects of gameplay in general deserve far more than the condescension that most people have for players of games. And, while fundamentally I agree with a lot of her points, I also have to take issue with her assertion that making everything a game will make life better. This is mostly because of the incredibly poor way that a lot of “gamification” efforts have been implemented in the past.

When dealing with incredibly boring or tedious tasks in my childhood, I was often told that I should make a game out of it. The problem is that fun cannot be “enforced” in that way. It didn’t matter how I compelled myself to accomplish whatever I was told to do, only that I did it. And if I wasn’t invested in the task to begin with, there was no way I was going to put in the mental effort to compel myself to do it. Now, if the game had developed organically among the other people I was working with at the time– or hell, even if there were other people to engage in the game– it would be a different story. But solo, it was just frustrating to be told that I should somehow force myself to enjoy something objectively boring.

That’s why I view games like Chore Wars or Fitocracy with a jaundiced eye: it’s great to compete with other people in these games, but for someone on their own, if they’re not already committed to the tasks, they sure as hell aren’t going to be motivated by a little number or avatar. It’s a bit like having a competition to see who can finish their homework first. There might be a reward at the end, and it might not be any more substantial than bragging rights; in the end, though, you’re still doing your homework, and if you just plain don’t want to do homework, no reward is going to be good enough. More to the point, games of any sort get boring after an extended period: when people ultimately get bored of Chore Wars, the dishes will start to pile up again.

Do I think that making reality more fun is going to make people happier? Absolutely. I love games, and I play them constantly; earlier this month I was introduced to Tiny Tower, and I’ve been using it alternately as a time-waster and as a productivity monitor (work for x minutes, check on the tower for one or two, then back to work for x more). But I have some issues with the assertion that gaming can become a force that will make kids do their homework or eat their broccoli, or make adults save for retirement or mow the lawn. Games are only so powerful, after all.

Nerdery: The Book, April Update

This past weekend I took some time to work on a rough outline for how Nerdery is going to be structured. A lot of it had to do with how much I wanted to pull from Jane McGonigal’s fantastic Reality Is Broken— which I realize I need to write a review of, too– but a bit more of it also had to do with the fact that, for as all-encompassing a topic as general nerdery is, I really only focus on a few major aspects of it.

The other part of it, though, is getting over the feeling that I’ve gone over this stuff before. After all, Nerdery is the culmination of well over ten years’ worth of essays, private and public, and so going over it all again is pretty much a necessity. For as much of it as I end up doing, I really dislike repeating myself. Ultimately, this means I have to just suck it up for the sake of making a greater point. The outline also helps me focus my thoughts so that I can approach each essay with at least the illusion of it being something new and unique to me.

In the end, I decided on structuring the book into four parts, not counting the inevitable introduction and conclusion chapters:
In Part One, I’m going to explore what it means to be a nerd. I’ll look at the origin of the word and concept, the history of how nerds are portrayed in media and culture, and see how it evolved into what it is today.
In Part Two, I’ll focus on the negative aspects of being a nerd. I’ll discuss the bias against intellect in society today, how being a nerd can be personally and collectively detrimental, and go over a couple of high-profile incidents where someone was targeted for being too smart.
In Part Three, I’ll flip the argument around and declare why being a nerd isn’t entirely a bad thing. I’ll focus primarily on why people choose to self-identify as nerds, what high intellect can do to help a community, and discuss how celebrities are embracing nerdery.
And in Part Four, I’m going to discuss what can be done to eliminate the stereotype of being a nerd. I’ll focus on why it was never actually relevant, why it’s constantly evolving, and how the world will be much better once there are no more “nerds”.

If it sounds like there’s a lot to go over, and if it sounds like people really aren’t going to like a lot of what I have to say (I imagine parts three and four are going to raise the most hackles), good. Ambition goes hand in hand with intellect and nerdery. And, despite what the essay at the beginning of this week would have you think, I have absolutely no qualms about failing quite publicly.

My next major goal is to have a draft of the book done by the end of summer; I’d like to shoot for October 1st as a draft deadline. This gives me time to crank out one long-form essay each weekend until then, while accounting for time for revisions and some mild editing. The essays are going to be written privately– that is, not shared with anyone just yet. However, I’m likely to share snippets of thoughts as blog posts now and again. If I get done with painting up my miniatures early, I may repurpose the Saturday morning disconnection time into a writing-only period, using my laptop with its Wi-Fi turned off.

It’s on, ladies and gentlemen. Let’s get down and nerdy.

Whuffielupagus (Part Three)

Were we wrong about Mass Effect? That’s not really the point. Personally, I think it was a better game than the reviewer said it was, but then again we almost always disagreed internally about the games we reviewed. Netjak always inhabited that quasi-professional level, where we weren’t getting paid to write about video games, but we all treated the work with a solemnity approaching the sepulchral. So, how exactly should we have been approached?

In his book “Down and Out in the Magic Kingdom”, Cory Doctorow introduced the concept of whuffie: a quantification of personal reputation that had replaced money as the way to gain non-essential luxuries (basics were freely available). People who did good things, like composing symphonies or letting people ahead of them in line, gained whuffie, while people who did bad things, like cutting people off in traffic, lost it. It is instantaneous and mostly subconscious, thanks to constant neural connections to the internet. In short, it puts a number on positive attention, while penalizing negative attention. While it sounds like the ultimate in egalitarianism, it sits poorly with me as something that should ever be implemented, even on the internet, simply because it’s instantaneous. Someone who just made a bad mistake will have a low whuffie score due to the snap judgments of those around them drowning out or undoing the good they’ve done in the past.
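To illustrate why the instantaneity bothers me, here is a deliberately crude toy model in Python. It is entirely my own construction (the function name, the recency weighting, and the numbers are invented for this post, and have nothing to do with how Doctorow actually describes whuffie working); it exists only to show how a score dominated by the present moment can erase the past.

```python
def instantaneous_whuffie(history: list[float], recency_weight: float = 0.9) -> float:
    """Toy reputation score that heavily favors the newest judgments.

    `history` is a chronological list of individual judgments (+1 for praise,
    -1 for scorn). Every older entry is discounted by `recency_weight` each
    time a new judgment arrives, so the most recent reactions dominate.
    This is a caricature chosen to show the failure mode, nothing more.
    """
    score = 0.0
    for judgment in history:
        score = score * recency_weight + judgment
    return score

# A decade of steady good deeds...
lifetime_of_goodwill = [+1.0] * 1000
# ...followed by one bad day and the pile-on that follows it.
one_public_mistake = [-1.0] * 20

print(instantaneous_whuffie(lifetime_of_goodwill))                      # roughly +10
print(instantaneous_whuffie(lifetime_of_goodwill + one_public_mistake)) # well below zero
```

Twenty moments of snap judgment wipe out a thousand acts of goodwill, which is precisely the objection: the past never gets a vote.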

If you try, but fail, you may have gained experience, but you have still failed. In private, the only one who ever sees it is you, and thus it’s easier to take. The problem is when your failures are public: if someone sees you fail, no matter how close you came, the memory of that failure is still there. On some level the witness will always have that knowledge about you, and there’s a good chance that it will in some way color their perceptions about you. When the failures are ephemeral– like bumping into a glass storefront because you thought it was an open door– the effect is minimal. When the failures are permanent and/or replayable– like a Youtube video of that very same collision– the effect is compounded.

With blogs and the internet, the temptation is to take everything as being instantaneous. Something posted years or decades ago is just as accessible as something posted right now, and whether or not it’s just as relevant today is, ironically, irrelevant. It doesn’t matter how long ago you posted that topless picture, or if it’s just one ill-conceived moment among thirty thousand exquisite photographs; it’s there forever and it’s all that matters. There is no statute of limitations on the internet.

It should, of course, go without saying that I think that’s a load of absolute bullshit. For my part, I try very hard not to let one mistake in the past color my perceptions of any site, professional or personal. The big names are capable of cranking out stinkers once in a while. But the little guys who don’t get any attention sometimes have the biggest ideas. The internet was supposed to let everyone’s voice be heard.

So remind me why we’re ignoring some of them, just because everyone else is?

Whuffielupagus (Part Two)

The question then becomes one of effort and consistency. If 99% of the posts on a blog are filler (cough), does that automatically condemn the 1% of posts that have genuinely good and interesting content? More important to the point at hand, does that 1% of posts command the readership’s attention for the uninteresting 99%? If it’s not a 1%/99% ratio, where exactly do you draw the line? How do you judge a site’s real worth?

The “fun” thing about all of this comes when you realize that some of the internet intelligentsia happen to have some rather cruel streaks in them as well (and I most certainly include myself in that categorization from time to time). The egalitarian nature of the internet just doesn’t sit well with some folk, and that gives rise to sites such as “Web Pages That Suck” and “Your Webcomic Is Bad And You Should Feel Bad”. This is to say nothing of the legions of commenters and forum-goers who pooh-pooh anything that isn’t a work of magnificent perfection.

Now, far be it from me to say that criticism isn’t warranted or desperately deserved in some cases– and I’ll be the first to own up to the many, many mistakes I’ve made in the past. But as I’ve always said, honest and constructive criticism will always beat just plain ol’ criticism. For everything that’s wrong, there should at least be something that was done right. This isn’t always the case, obviously, but the cases where it doesn’t hold true are so astonishingly rare as to be worthy of the ire and bile that are heaped upon– well, more or less everything.

It gets worse when you start trying to quantitatively and objectively assess the quality of something based on a relatively incidental number. Back when Netjak was still around– which was itself a remarkable example of a professional blog– we raised our fair share of hackles with certain of our reviews. The biggest offender here was with Mass Effect, which didn’t get the glowing praise from our reviewer that the rest of the gaming press was lavishing on the game. This prompted an individual to sign up for our forums (which we used in place of a comments system) and berate us for not falling in line. The individual went so far as to suggest that our small userbase on the forums (because we’d just gone through a dormancy period due to technical failures), as well as the low number of threads and posts (because the forum was set to auto-purge posts older than a certain threshold), “did not give [us] the right” to our opinion of the game. Never mind that Joystiq had linked to us repeatedly; we were small, therefore we didn’t count.