
How Not To Launch A Game

By now, you’re probably painfully aware of the problems plaguing the biggest of Nintendo’s mobile apps, Pokemon GO. A crossover of sorts with Niantic Labs’ augmented-reality sensation Ingress, GO tasks players with venturing through the world, capturing Pokemon and battling them against others. The twist is that the world being explored is our own– literally. GO uses the player’s smartphone location data to track which Pokemon are nearby. So you might encounter, say, a Drowzee at your favorite all-night diner, or a Zubat near the tunnel you take to work. It’s a fantastic concept, and based on what I’ve played so far, it is an amazing game. 

I wish I was playing it right now. 

The game’s launch, starting Tuesday night in Australia and New Zealand and ongoing as I write, has been an utter, unmitigated disaster. This is a failure that’s to be expected from new online developers, but not necessarily from Nintendo and certainly not from Niantic. In short, what should have been the opening fanfare on the series’ 20th anniversary celebration is rapidly turning into a bloodletting that threatens the Pokemon brand as a whole. There are three core problems plaguing the game right now, and each one is directly traceable back to either Nintendo or Niantic not doing something they should have. These are sins of omission, things that any experienced developer only ever faces once before correcting them in their next project. 

The first, and probably the one people are most upset about right now, is a login procedure that’s flawed to the point of uselessness. Upon starting the game for the first time, the player is asked to log in either through a Google account or their Pokemon Trainer Club (PTC) account. The PTC option is the best choice, because it is (ostensibly) tied into all of your other online Pokemon services– your Global Link account for the games, your online card game account, your real-world card game ranking, etc. Unfortunately, this requires the game to authenticate against the PTC, which is a massive bottleneck; the single-sign-on server has been reduced to rubble since noon today. Now, if you were to log in via the Google account, you might be able to get in… but your data is tied to your account, not your device. So if you had managed to get a good start on the PTC account, you would have to start from scratch on the Google account. 

The solution to this is simple, from the standpoint of any developer who’s ever had to build a login portal pass-through of any kind: store the authenticator’s login token, encrypted, in your database, and issue the user a token that corresponds to their data on your system, not the authenticator’s token. This way you only go out to the authenticator when you absolutely need to, instead of constantly re-authenticating. In addition, by not tying data to a remote authentication token, you can then allow a user to link multiple login methods to the same data, in case they don’t have (or want to create/use) one of the offered login options. Going to the authenticator is a costly action, and as a developer it’s your responsibility to minimize that cost. This is basic stuff for any kind of distributed system, not just highly-complex ones. 
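To make that concrete, here’s a minimal sketch of the pattern in Python. Everything specific to Niantic is invented: ptc_authenticate and ptc_refresh are hypothetical stand-ins for whatever the real PTC handshake looks like, and the in-memory dict stands in for a proper database table.

```python
import os
import secrets
import time

from cryptography.fernet import Fernet

fernet = Fernet(os.environ["TOKEN_KEY"])  # key for encrypting stored upstream tokens
SESSIONS = {}  # stand-in for a real sessions table

def log_in(username, password):
    """Hit the upstream authenticator exactly once, then issue our own token."""
    upstream = ptc_authenticate(username, password)  # hypothetical PTC call
    session_token = secrets.token_urlsafe(32)
    SESSIONS[session_token] = {
        "user_id": upstream["user_id"],            # game data keys off this...
        "ptc_token": fernet.encrypt(upstream["token"].encode()),
        "ptc_expires": upstream["expires_at"],
    }
    return session_token  # ...and the client only ever holds this

def upstream_token(session_token):
    """Only go back to the authenticator when the cached token has expired."""
    record = SESSIONS[session_token]
    if record["ptc_expires"] > time.time():
        return fernet.decrypt(record["ptc_token"]).decode()
    fresh = ptc_refresh(fernet.decrypt(record["ptc_token"]).decode())  # hypothetical
    record["ptc_token"] = fernet.encrypt(fresh["token"].encode())
    record["ptc_expires"] = fresh["expires_at"]
    return fresh["token"]
```

Because the game data hangs off the user ID instead of the PTC’s token, bolting a second login method (say, Google) onto the same record becomes a small addition rather than a data migration.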

But more than that, the existing login scheme violates an emerging mobile-app design trope: it asks for a username and password on every launch from the title screen (and the game frequently crashes back to the title screen, trashing its login token in the process, for reasons unfathomable to man and Pokemon alike). I don’t think any mobile game released in recent memory has used a username/password authentication setup as its everyday login. Most rely on secure identification information provided by the mobile operating system, and if the user requests additional protection, it’s secured by the phone’s local authentication (your unlock code, for example, or thumbprint/other biometric key). Smartphones are the last single-user environment in consumer computing. Many apps have tossed aside the archaic and error-prone username/password setup in favor of allowing the app to act as if nobody else in the world even exists. Even my banking applications only ask for my password in dire circumstances; the norm is simply to accept thumbprints as proof of identity and move on. 

(As an aside: I find it hard to believe that Niantic coded the app that way on purpose. What I think happened was that Nintendo, in a misguided attempt to protect users’ privacy, forbade both token caching and account linking. Worse, the PTC login token may have a uselessly short lifespan, on the order of fifteen minutes or so. It makes sense, but at that point, they both should have realized that using the existing consumer-facing PTC login process would be a Very Bad Thing, and developed a similar process specifically for communications related to the app. An N-to-N solution, so to speak.)
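If the fifteen-minute lifespan is real, the usual way to keep it from ever touching the player is a background refresher on the server side: renew the upstream token before it expires, so no user-facing request ever has to block on the authenticator. A hedged sketch, where the fetch callable stands in for whatever service-to-service handshake the two companies would actually agree on:

```python
import threading
import time

class TokenCache:
    """Keep a short-lived upstream token perpetually fresh."""

    def __init__(self, fetch, ttl_seconds):
        self._fetch = fetch              # hypothetical call to the auth service
        self._ttl = ttl_seconds
        self._token = fetch()            # prime the cache up front
        self._lock = threading.Lock()
        threading.Thread(target=self._refresh_loop, daemon=True).start()

    def _refresh_loop(self):
        while True:
            time.sleep(self._ttl * 0.8)  # renew *before* expiry, not after
            fresh = self._fetch()
            with self._lock:
                self._token = fresh

    def get(self):
        with self._lock:
            return self._token
```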

The second problem comes in the form of how the app went live. After the game’s open beta ended last week, speculation ran rampant on when GO would have its public launch. You would expect a reference here to whoever had the right day in the proverbial office pool, but in fact there wasn’t even enough time to set up such a pool. The game was suddenly released in the Australia and New Zealand regions on Tuesday night at about 10:30p EDT. There was no fanfare, no announcement, nothing. Niantic wouldn’t even register their support Twitter until Thursday morning. In the official silence that followed, North American users swarmed to create ANZAC-regioned accounts for the App Store and Google Play, or resorted to sideloading the game (downloading the game app from an unofficial source and installing it manually, something common on Android but difficult on iPhones). 

Setting up a regional rollout is not difficult, and certainly it’s not the issue here. But it was botched badly in this instance, because an obvious and elegant solution presents itself by virtue of the game’s nature. GO requires the player’s accurate location, right? So why not release the app as a “pre-load” in all regions, and then gate access based on location in order to keep the servers from being overwhelmed? This allows geographically large regions– the United States, in this case– to be divided into cascading rollout zones. The simplest zone distribution would be by time zone, but other factors could inform that decision. I don’t think this has been done before in a mobile app, but it’s certainly something to consider. 
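The server-side check is almost embarrassingly small. Here’s a toy sketch in Python; the longitude bands and go-live times are entirely invented for illustration, and a real rollout would use proper time zone geometry rather than straight meridians.

```python
from datetime import datetime, timezone

# Hypothetical cascading schedule for the US: players in each longitude
# band go live at a set UTC time, roughly one time zone at a time.
ROLLOUT = [
    (-82.0,  datetime(2016, 7, 7, 14, 0, tzinfo=timezone.utc)),  # Eastern
    (-97.0,  datetime(2016, 7, 7, 17, 0, tzinfo=timezone.utc)),  # Central
    (-112.0, datetime(2016, 7, 7, 20, 0, tzinfo=timezone.utc)),  # Mountain
    (-125.0, datetime(2016, 7, 7, 23, 0, tzinfo=timezone.utc)),  # Pacific
]

def playable(longitude, now=None):
    """The app is pre-loaded everywhere; on login the server just answers
    'is this player's zone live yet?'"""
    now = now or datetime.now(timezone.utc)
    for west_edge, go_live in ROLLOUT:
        if longitude >= west_edge:   # first band the player falls into
            return now >= go_live
    return False                     # outside the scheduled zones entirely
```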

More damning than the rollout timing, though, was the radio silence out of Nintendo and Niantic throughout the whole affair. These are not plucky indie developers who have to choose between addressing public complaints and fixing their game. They are big enough that they ought to have halfway competent PR groups. (Though given Nintendo’s PR catastrophes this year, I think we can say that they in fact do not have a PR staff that is even minimally competent.) It is absolutely unacceptable to meet consumer queries with indifference bordering on apathy. I would have preferred even a somewhat hostile response over nothing; at least with a venomous reply, you know they actually saw your question. 

Finally, and this is probably the most distressing fault that I’ll address, the game has virtually no tutorial or information on how to play. There is no online manual, no on-screen guides, nothing. I honestly thought my game had soft-locked when I went up a level, because nowhere did it say “Swipe to dismiss” on the all-encompassing celebration screen. Using a Poke Stop (a waypoint that, when used, gives valuable items like Poke Balls and Revives) was similarly opaque. There is a short explanation of Pokemon Gyms when you first try to use one, but nothing comprehensive on what the Gyms are or what they do. There is a daily login bonus that grants you some of the game’s real-money currency, but nowhere is it explained that the amount depends on how many Gyms are under your direct control. Battling is a rough affair: the Pokemon do respect type advantages and disadvantages, but how to get a Pokemon to use its special move is never explained at all. And in the absence of this information, players are falling back to approaching the game like any other Pokemon game, when it is definitely not meant to be just another main-series game.

I’m going to be perfectly clear: Pokemon GO is a good game, but it is not a traditional Pokemon game. Battling and trading are the focus of the handheld games. GO is not about that. GO is instead about exploration and collecting, and indirectly about area control in the real world. If you go into this game looking for battling and saving the world, you are going to be disappointed. Instead, if you view the game as an incentive for physical activity, similar to critical darling Zombies, Run!, the game becomes much more engrossing. The real-world aspect of the game may feel gimmicky, but it is integral to the game’s design. 

So having players approach the game as a traditional Pokemon adventure does it a disservice. If it were more obvious to the player that going out to Poke Stops to restock your Poke Balls is more economical than buying them from the cash shop, the player would do that. If it were stated that controlling Gyms grants the player free cash shop currency, the player would do that, too. Even basic information like type matchups would be helpful. Without that information, the player runs a high risk of frustration and boredom– two things lethal to any game. The game is too complex to approach solely on intuition. A tutorial should be the next top priority for Nintendo and Niantic. 

Pokemon GO is, I still assert, a good game. But it has not had a good start. I mentioned way at the top that the catastrophe that is this launch could poison the Pokemon franchise as a whole. That was not hyperbole. Think about how Star Fox Adventures was the first harbinger that the series would never again reach its former glory. Or how Xenosaga Episode 2 killed the hopes of that series reaching its full conclusion. One game, if it’s bad enough– or perceived as bad enough– can ruin a franchise beyond salvaging. I think Pokemon GO has a good chance of undeservedly being that game. And should that happen, I will weep for it, dry my eyes, and move on. I just think it’s a little early to start digging that grave.

No Surprises Here

The horrendous tantrum that started back last August rolls onward, which should surprise exactly none of you; these things tend to grow legs of their own accord, and sooner or later nobody can catch them. Unfortunately, that’s exactly what happened. The sense of sheer entitlement and exclusionism that started with “them damn feminazis tryin’ ta take away my video games” has blown up into a general maelstrom of the Defenders of True Geek Culture trying to force out the insidious forces of “progressives” and “feels”, to ensure that the things they love will remain theirs alone and theirs forever– even if they were never intended for them in the first place. 

Perhaps, then, the fact that Joss Whedon deleted his Twitter feed on May 4th– mere days after the release of his most recent film, Avengers: Age of Ultron– is not so surprising in and of itself. Whedon has always been a polarizing figure in geek circles anyway, with some people not liking that he tends to write the same “powerful” female characters in every work, and others just not really liking the over-reliance on under-intelligent banter. But up until yesterday, he was seen as “safe” from the criticisms and havoc that literally anyone else ever saying those things would have to endure, owing to his previous success much less than his birthright as a white dude. And I say that as a white quasi-dude. That he threw up his hands and walked away from Twitter should have been a wake-up call for those who sent anyone any kind of abuse. That it apparently hasn’t been is unsurprising, the inherent unsurprisingness of which leads me to believe that I really shouldn’t expect to be surprised at the depth of human sickness anymore. (I’ll try to stop saying “surprise” now.)

What we have here is a rather unusual counter-stroke to what the Internet had allowed to occur back in the 90s and early 00s in the first place: a sort of reactionary-revisionist faction seeking to isolate and disenfranchise people en masse. See, back in the early days of the Internet, it was a good thing that people with incredibly diverse interests could connect with each other regardless of their location. No matter what you were into, be it Star Wars, Final Fantasy, anthropomorphic animals, sex with furniture, whatever– the odds were that somewhere out there was someone else who shared the interest and was probably dying for the chance to chat about it. (Although the sex with furniture thing is pretty weird. I’m not judging, just saying it’s weird and not my thing, but if it’s yours, you’re welcome to it.) For people of a certain mindset that wasn’t common in the era or area they were growing up in, the Internet was a godsend.

At some point, however, there started to be a backlash against the background weirdness of the universe being brought into the foreground. Like nebulae condensing into stars, the scattered pockets of weird in the Internet were coalescing into groups, organizations that could support their members as needed. Some people thought that these pockets of weird should not exist, that it was “convincing people that they were normal when in fact, they weren’t”. People started highlighting these groups and shaming them, ostracizing them in much the same way that the individuals had been isolated in their everyday life. The concept of “live and let live” was sorely lost on these people.

Then you had the parts of the Internet where there were no rules, where things could get shocking and horrendous without warning. At first there was an unstated rule saying that it was all done in satire, that the racist, xenophobic material being bandied about like cat pictures on Saturdays wasn’t at all representative of the users’ actual views. But it was unstated, and stayed unstated, and assertions that it was serious were not taken as the kayfabe they were proffered as, but instead at face value. Eventually even the unstated assertion fell away, and there were actual violent psychotic monsters posting in full sincerity. In some sense it is Möbius’ Aristocrats: a setup so filthy that it turns inward upon itself, ever escalating, never reaching the punchline that renders what came before it benign and funny (if it ever could be considered so). 

It’s not exactly clear when these two groups got together and birthed the mindset that the Internet was a horrible place, filled with depravity and devoid of mercy. Certainly, the mainstream media did not help matters; scare stories about websites where Your Children were At Risk of Predators were a dime a dozen in those early days. They’ve calmed down a little since then, but are still no more based in fact than the (similar vintage, but thankfully now-extinct) Ripped From The Headlines TV-movies about whatever the scandal of the week was. By calling out and highlighting the awful behavior of certain minority parties online, the coverage painted the picture that the Internet was a lawless place free from consequences and populated only by unfeeling avatars. It was like a TV news crew broadcasting the exact times and street corners where a drug dealer hung out, in the hopes that the people who would make use of this information would be the police instead of the drug dealer’s customers. It attracted the people who would do these horrible things, and sought to make them “normal”.

But nonetheless, the mindset that outright hostility and sociopathic behavior were the baseline of behavior on the internet became the “accepted” norm. I say “accepted” because by and large the only time anyone calls this out is when they are themselves under attack. “Everyone else is fair game; hurt my feelings, though, and you’ve crossed the line.” And in a sense, it was the “live and let live” attitude from the early days that allowed that mindset to assert itself as “the way it is”, simply because nobody wanted to tell those people they couldn’t do what they wanted. An abuse of logic allowed people to shoot down the argument of “you can’t bully anyone, they’re just doing their thing” by saying “well, bullying them is my thing, and by the same assertion you can’t tell me not to”. It again boils down to the unstated half of the axiom: “they’re doing their thing and not hurting anyone else”.

But getting back to Joss Whedon and the current state of affairs, the fight against the hostility which has now entrenched itself in the Internet that once brought people together is going… Poorly. The immediacy of the medium means that you have to be there to defend yourself, and if enough people push you to a breaking point through death threats or other promises of violence, well, you either soldier on or you fold up and go home. There is a severe lack of equilibrium within what passes for conversation online today: many can group together to attack, but a defender always stands alone. Faced with a crush of humanity in all its bile and wrath, what choice is there but to flee? Quite frankly, it’s probably safe to say it’s not worth fighting.

Except it is.

We are facing a new era of society: where our intrinsic selves are exposed to the entirety of humanity at a moment’s notice. Socially, this has not happened in several thousand years. What we are seeing is the throes of evolution at work; raw aggression, this time in social interaction, is being selected for as those who cannot properly process the emotions of seven billion humans being thrown at them are being weeded out of the gene pool. Unlike the evolutionary crises which allowed us to start using tools, or grasp the greater mysteries of the universe through advanced mathematics, however, we have a tool greater than any formula: we can become aware of what we are sacrificing in order to succeed in this new era of humanity. Who knows what skills or abilities we gave up when the Great Engineer of the Universe pushed us to our current state. But we know exactly what we are losing now: traits like compassion, empathy, gentleness, compromise. We are losing our ability to do the things which brought us to this point in our history.

It’s not my place to say whether or not the ultimate fate of humanity some hundreds of years from now is to touch the stars with the better angels of our nature by our sides, or to grasp them from atop a tower of our enemies’ corpses. However we are destined to survive this evolutionary inflection point, we must as a species do so. I will continue to fight for equality, for a world where hostility is the exception and not the rule, for a world where everyone is free to choose as dictated by the desires of their truest self, for the people who believe to keep believing, for the people who don’t know to find their answers wherever they may lie. I will champion the cause of positivity and compassion for as long as I live.

Which, of course, shouldn’t surprise you.

Five Terrifying (But Thankfully Fictional) Computer Viruses

This morning I saw a news report that suggested keeping confidential information on a detachable drive, such as a flash drive, in order to avoid having the data stolen. I immediately thought of a way that could be circumvented, and at the risk of sounding like Buzzfeed, I thought up four more viruses that, as far as I know, are only the product of my own imagination. This was just a thought exercise, with no real goal behind it, but it’s something to think about.

1. Johnny Cache: Captures specific PDF files, identified by a likely size range, from the infected computer and any attached external drives, then shares them in a “cloud” service so that people can mine them for personal information.

2. The Beat-Alls: Deliberately issues hundreds of thousands of read/write cycles on all storage devices attached to the infected computer, counting on wearing the devices down faster, destroying data.

3. Gabba Gabba Hey: Snoops traffic on all networks the infected computer is connected to, and stores packets for later “echoing” back into the network. The idea is to flood a network with a denial of service attack using data that is indistinguishable from “real” traffic.

4. The Vapors: Installs various input method editors into the infected computer and randomly switches among them at set intervals, turning all typed information into total gibberish; worse, attempts to disable the IME trigger the infected computer to randomize the encoding of all text displayed as well as input.

5. London Calling: Disables, then takes direct control of, temperature controls on the infected computer, with the intention of destroying the hardware through overheating and/or overuse of the cooling mechanisms, ultimately creating a fire hazard.

Whuffielupagus (Part Three)

Were we wrong about Mass Effect? That’s not really the point. Personally, I think it was a better game than the reviewer said it was, but then again we almost always disagreed internally about the games we reviewed. Netjak always inhabited that quasi-professional level, where we weren’t getting paid to write about video games, but we all approached it with solemnity approaching the sepulchral. So, how exactly should we have been approached?

In his book “Down and Out in the Magic Kingdom”, Cory Doctorow introduced the concept of whuffie: a quantification of personal reputation that had replaced money as the way to gain non-essential luxuries (basics were freely available). People who did good things, like composing symphonies or letting people ahead of them in line, gained whuffie, while people who did bad things, like cutting people off in traffic, lost it. It is instantaneous and mostly subconscious, thanks to constant neural connections to the internet. In short, it puts a number on positive attention, while penalizing negative attention. While it sounds like the ultimate in egalitarianism, it sits poorly with me as something that should ever be implemented, even on the internet, simply because it’s instantaneous. Someone who just made a bad mistake will have a low whuffie score due to the snap judgments of those around them drowning out or undoing the good they’ve done in the past.
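Whuffie is fiction, so any code here is doubly hypothetical, but a toy sketch makes the instantaneity problem concrete. Compare a score built only from the latest snap judgments with one that lets old reputation decay gradually instead of vanishing:

```python
def instantaneous_whuffie(ratings):
    """Doctorow-style: whatever everyone around you thinks right now."""
    recent = ratings[-100:]           # only the snap judgments count
    return sum(recent) / len(recent)

def weighted_whuffie(ratings, half_life=1000):
    """A gentler variant: older reputation fades instead of disappearing."""
    score, weight = 0.0, 0.0
    for age, rating in enumerate(reversed(ratings)):
        w = 0.5 ** (age / half_life)  # exponential decay by age
        score += w * rating
        weight += w
    return score / weight

history = [1] * 10_000 + [-1] * 100   # years of good acts, then one bad day
print(instantaneous_whuffie(history)) # -1.0: the mistake is all that exists
print(weighted_whuffie(history))      # still solidly positive (about 0.87)
```

Under the first scheme, one bad day erases a decade; under the second, it merely dents it. As described above, Doctorow’s version is the first.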

If you try, but fail, you may have gained experience, but you have still failed. In private, the only one who ever sees it is you, and thus it’s easier to take. The problem is when your failures are public: if someone sees you fail, no matter how close you came, the memory of that failure is still there. On some level the witness will always have that knowledge about you, and there’s a good chance that it will in some way color their perceptions about you. When the failures are ephemeral– like bumping into a glass storefront because you thought it was an open door– the effect is minimal. When the failures are permanent and/or replayable– like a Youtube video of that very same collision– the effect is compounded.

With blogs and the internet, the temptation is to take everything as being instantaneous. Something posted years or decades ago is just as accessible as something posted right now, and whether or not it’s just as relevant today is, ironically, irrelevant. It doesn’t matter how long ago you posted that topless picture, or if it’s just one ill-conceived moment among thirty thousand exquisite photographs; it’s there forever and it’s all that matters. There is no statute of limitations on the internet.

It should, of course, go without saying that I think that’s a load of absolute bullshit. For my part, I try very hard not to let one mistake in the past color my perceptions of any site, professional or personal. The big names are capable of cranking out stinkers once in a while. But the little guys who don’t get any attention sometimes have the biggest ideas. The internet was supposed to let everyone’s voice be heard.

So remind me why we’re ignoring some of them, just because everyone else is?

Whuffielupagus (Part Two)

The question then becomes one of effort and consistency. If 99% of the posts on a blog are filler (cough), does that automatically condemn the 1% of the posts that have genuinely good and interesting content? More important to the point at hand, does that 1% of posts command the readership’s attention through the uninteresting 99%? If it’s not a 1%/99% ratio, where exactly do you draw the line? How do you judge a site’s real worth?

The “fun” thing about all of this comes when you realize that some of the internet intelligentsia happen to have some rather cruel streaks in them as well (and I most certainly include myself in that categorization from time to time). The egalitarian nature of the internet just doesn’t sit well with some folk, and that gives rise to sites such as “Web Pages That Suck” and “Your Webcomic Is Bad And You Should Feel Bad”. This is to say nothing of the legions of commenters and forum-goers who pooh-pooh anything that isn’t a work of magnificent perfection.

Now, far be it from me to say that criticism isn’t warranted or desperately deserved in some cases– and I’ll be the first to own up to the many, many mistakes I’ve made in the past. But as I’ve always said, honest and constructive criticism will always beat just plain ol’ criticism. For everything that’s wrong, there should at least be something that was done right. This isn’t always the case, obviously, but the cases where it doesn’t hold true are so astonishingly rare as to be worthy of the ire and bile that are heaped upon– well, more or less everything. 

It gets worse when you start trying to quantitatively and objectively assess the quality of something based on a relatively incidental number. Back when Netjak was still around– which was itself a remarkable example of a professional blog– we raised our fair share of hackles with certain of our reviews. The biggest offender here was with Mass Effect, which didn’t get the glowing praise from our reviewer that the rest of the gaming press was heaping onto the game. This prompted an individual to sign up for our forums (which we used in place of a comments system) and berate us for not falling in line. The individual went so far as to suggest that our small userbase on the forums (because we’d just gone through a dormancy period due to technical failures), as well as the low number of threads and posts (because the forum was set to auto-purge posts older than a certain threshold), “did not give [us] the right” to our opinion of the game. Never mind that Joystiq had linked to us repeatedly; we were small, therefore we didn’t count.

Whuffielupagus (Part One)

Last week, Pez mentioned (via Twitter) that he was in the bad habit of disregarding blog posts that don’t have comments. His argument– and it’s not an unfounded one, but I’ll get into that later– was that if nobody had bothered to respond to it, it wasn’t worth his time to read. He himself admitted that it’s a flawed reason to not read something, and when I made the snarky self-deprecating remark that “nobody must read my blog, then, if there’s no comments”, it was mostly as a joke. But then, I got to thinking about why he’d have that policy about feedback uber alles, and why its scarcity somehow indicates a lack of quality.

My introduction to the social internet was, as I’ve said often, Usenet. That was nothing but feedback. It was post upon post upon post, each one building off the rest, and a community emerging from nothing– not even a structure more concrete than “post about this here, and that there”, and even that was fluid to some degree. Occasionally one post or another would become a foundation for more discussion, either by virtue of its own content or by containing a reference or link to something else, like a World Wide Web page. It would be the “big topic” for a while, and then fade away as the next one came to the table.

A good friend of mine, whom I met during our shared time on Usenet, dislikes the idea of blogging in general. He feels it to be narcissistic and unnecessary, as the vast majority of blogs are people just endlessly talking about themselves into the void to make themselves feel important. I, of course, can’t argue with that, considering the fantastic amounts of pure crap that I put out here, but as the past month or so seems to have shown, I’m working on fixing that. But what I think that friend is missing is the rather important distinction between a personal blog and a professional blog, and a rather frightening blurring of the lines between hobbyists and professionals.

The coming of the World Wide Web made everybody rock stars, and elevated all content to the same plateau. Suddenly, personal blogs, not necessarily intended for widespread consumption, are being presented and styled as if they are. Structurally, there’s nothing different between my site and say, Joystiq, aside from the content and the tools used to produce them. Where it all breaks down is in content: Joystiq has it and I, to be quite frank about it, don’t. At least, I don’t regularly have any content nearly as compelling as Joystiq’s constant feed of gaming news. And that’s where things start to get hairy.

iWanna Be Seceded

Last week, Macworld’s Jason Snell argued against iTunes’ continued existence as a monolithic, do-everything application, and for the implementation of its features as a suite of interconnected applications. He’s right, of course, and Apple knows he’s right. But it’s still not going to happen.

Some background is in order here. iTunes, as an application, was first introduced in January of 2001, for Mac OS 9. The 1.0 version didn’t support the iPod, mostly because that wouldn’t be introduced until October of that year (alongside 2.0). It was a somewhat lackluster media player, and it wouldn’t come into prominence as the main iPod “driver” software until the 2.0 version was released. The Windows version wouldn’t be out until 2003.

Flash forward to today, where it runs on Windows and OS X, and it’s still a lackluster media player, with godawful playlist support and worthless organization and randomization tools. It also has a metric assload of useless crap bolted onto it, including videos, podcasts, Cover Flow, the mobile App Store, books, and Ping. I honestly couldn’t tell you who gives two flying farts about Ping; no Mac user I’ve ever talked to does. And Cover Flow is pointless when the vast majority of a user’s library isn’t sourced from the iTunes Store. iTunes does everything, and everything it does, it does abysmally. I vehemently hate iTunes.

Apple doesn’t have a whole hell of a lot of fondness for iTunes as an application, either. When the Mac App Store debuted, it was its own application. On iOS devices, the individual functions that iTunes serves are split out into individual apps (“Music”, “Videos”, “App Store”, “iTunes”– actually the music store– and “iBooks”). On newer devices, the trend has been toward secession from the monolithic app. Meanwhile, the very next day, rumors arose that iTunes 11 would include a dedicated “iCloud” panel of some sort, in an astonishingly backwards move.

Unfortunately, there’s a very good reason Apple simply can’t break up the iTunes racket, and it’s one that got forced on them as the iPod took off in popularity: Windows. When an average user buys an iPod or an iPhone, they’re not going to want to have to install a dozen different applications just to get the thing to work. Apple recognizes this and has made overtures towards steering users away from iTunes; iOS 5 allows you to buy an iPhone, iPad, or iPod Touch and get it going without it ever being connected to a traditional computer.

At the same time, though, iTunes remains the single point of entry for non-Apple-distributed music to get onto your device, even with its iTunes Match service. Many genuinely useful apps like ProPlayer or Avid Studio are hamstrung by iTunes’ de facto gatekeeper role for files. On Windows, it’s a crash-prone piece of bloatware that I wouldn’t install even at gunpoint. Probably worst of all, though, it’s just old technology, and I would be honestly shocked if the internal code libraries that iTunes is built on haven’t been completely obsoleted; Apple can and has done much better in the intervening ten years.

I still think that iOS and the devices running it are good, and the fact of the matter is, I’d probably have the same problem syncing stuff to an Android device, just in reverse (I’ll bet there aren’t many reliable Mac clients for those). But honestly, some of the sheen is starting to wear off of iTunes. It’s time to put the old dog down and work with our new technologies in new ways.

Decentralization

Over the last few years or so, I’ve come to rely on my smartphone for a great deal of my ability to stay connected, both socially and technologically. I’ve had it since 2008, and it wouldn’t be until ’09 that I really started to notice how dependent on it I had seemingly become; the few times that I did leave it home accidentally, I felt detached and uneasy. This isn’t anything new, really; the feeling of being tethered to a device really started when I got my first cell phone.

The Shutdown Days experiment at the beginning of the month helped to ground me back in reality, and I’m doing more to try to leave the phone in the car or tucked away in my bag when I don’t need or want to be disturbed by it. It’s easy, however, to see the constant connectivity as a bad thing: I’m at the beck and call of a device no bigger than a deck of cards. It’s not the case– not really, anyway. I can see how it would be taken to an unhealthy extreme, and I’m doing what I can, within reason, to curb that tendency.

Still, it’s not for nothing that these technologies were developed, and I dare say that they have even helped to make me more productive. Between the smartphone and the tablet, I’m able to turn unproductive cycles– like waiting for an order at a restaurant, or sitting in a theater before the lights dim– into opportunities for either accomplishing something or reducing stress. These are devices whose purpose is to do anything, anywhere, with any amount of time. It’s foolish not to use them for those purposes. 

That said, there is something rather wonderful to be said for leaving the phone in the car and simply waiting for my food, calmly drinking in the atmosphere of the place. We cannot constantly be doing many things. Every once in a while, it helps to simply do one thing, and put your whole being into doing that thing.

Decommissioned

Thoughts on the weekend past are coming on Thursday, as per what is becoming my usual idiom.

Way, way back in the early days of the World Wide Web, back before the WWW became synonymous with the internet as a whole, the concept of a “home page” was a necessary conceit that stemmed from the medium’s predominant usage in campus computer labs. Browsers did offer bookmarks, but since you weren’t guaranteed to get the same computer each time, there was little point in saving them. The home page began as a way for a user to store hyperlinks to their most favored sites, for easy access when they weren’t at “their” computer. My first few web sites were dedicated to this effort, and I made tremendous strides in learning HTML and Javascript so that I could have an easy, no-thinking-required morning link list.

Nowadays, nobody even uses the term “home page”; the idea itself is outmoded as browsers offer bookmark syncing capabilities and browsing session restoration. There are huge services dedicated to keeping your bookmarks completely accessible, thus obviating the need for you to learn HTML and have hosting space set up. Personal hosting sites are vanishingly rare since the collapse of GeoCities, their purposes having been melded into social networking sites like Facebook and Twitter; their most common use case even at the end– a weblog– is now handled far more elegantly by services like WordPress and Blogger.

I’ve maintained a links page on my old site for just shy of nine years now; it went live at 7p on Thursday, April 24th, 2003. At some point this month– likely by the time you read this– that will have ended, as I’m migrating my bookmark list to iCloud. The Links page on the old TFO.net site, which remained an open secret since the 2009 move to this domain, will be deleted, as it doesn’t serve any purpose anymore and is prohibitively difficult to maintain. I’ll talk a little bit tomorrow on why that is.

Link Wednesday: Dropbox

Important Service Note: In many cases, the links provided in Link Wednesday posts are referrals; by using them you provide a benefit to John, which will be disclosed for each service or product advocated during Link Wednesday. In this specific instance, using the referral link to sign up increases John’s storage space as well as your own. A “clean” URL is obtainable by hovering over the link.

We live in an increasingly decentralized society. Talk of “the cloud” permeates a lot of technology these days, and the increasing size of our data files means that devices which seemed roomy and unfillable a few years or even months ago now feel cramped and restrictive. Take my own portable-computing dilemma, for example: I have an iPhone with 32GB of space and an iPad with 16GB. As I’ve worked with both devices, I’ve found that I really need the inverse– I want more space on the tablet and need less space on the phone. In some cases, this is because I want the same files on both devices but don’t want the hassle of constantly syncing them.

Dropbox is a way around that, and one that I’ve been using for about two years now. It’s one of the oldest and most well-respected “cloud” file services in the business, and it’s also one of the most economical. The free service offers you 2GB of online space that can be synced to a mobile device (iOS or Android) or a desktop computer (Windows, Mac OS X, or Linux). Paid services bump that up to 50GB or 100GB depending on your outlay, and referrals allow you to expand your capacity– if you refer someone, you get 1GB added on permanently, while the person you referred gets half that. This even works for free accounts. 

On desktops, the process is completely transparent– Dropbox sets up a folder that’s automatically synced to its servers as long as you’re online. Mobile usage is where it really shines, though, and this is due in no small part to the fact that mobile devices are becoming far more robust. I can only speak to the iOS client, but even that has strong integration with other apps, such as the Elements text editor. Recent additions allow the app to accept files for upload, making it a great tool for collecting content on the go. Moreover, you can use your Dropbox account to share large files with friends (though certainly I would use some restraint in doing this for certain types of files). 
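Dropbox hasn’t published its client internals, so take this as nothing more than a minimal sketch of the watch-and-upload loop at the heart of any folder-sync tool, assuming Python’s watchdog library; the upload function here is a placeholder for the actual sync protocol.

```python
import time

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

def upload(path):
    print(f"would sync {path} to the server")  # placeholder for the real protocol

class SyncHandler(FileSystemEventHandler):
    """Queue any changed file for upload the moment it's written."""
    def on_modified(self, event):
        if not event.is_directory:
            upload(event.src_path)

observer = Observer()
observer.schedule(SyncHandler(), path="/home/john/Dropbox", recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)  # the observer does its work on background threads
finally:
    observer.stop()
    observer.join()
```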

Overall, I like it quite a bit over Apple’s iCloud service; even though iCloud support is “baked into” iOS and a number of apps are starting to embrace it, Dropbox is still a better deal in terms of flexibility and expandability. It might not replace having a USB stick on hand at all times– many workplaces clamp down on its use for security reasons– but it’s still well worth its price of free.