I read two articles today that at first don’t seem to have anything to do with one another but connected for me in a profound way that I’m going to attempt to capture in this post. I’ve been thinking a lot lately about how the web has changed since I first started participating in it at around the turn of the century.
This is the one I read first, and it’s a bit on the long side, but I recommend you read the entire thing. It is fantastic writing. It is about some of the realities of the current landscape, this multi-modal, always-listening web. It most intelligently describes this paradox of the modern web: What tear gas taught me about Twitter and the NSA.
This is why the state-of-the-art method for shaping ideas is not to coerce overtly but to seduce covertly, from a foundation of knowledge. These methods don’t produce a crude ad—they create an environment that nudges you imperceptibly. Last year, an article in Adweek noted that women feel less attractive on Mondays, and that this might be the best time to advertise make-up to them. “Women also listed feeling lonely, fat and depressed as sources of beauty vulnerability,” the article added. So why stop with Mondays? Big data analytics can identify exactly which women feel lonely or fat or depressed. Why not focus on them? And why stop at using known “beauty vulnerabilities”? It’s only a short jump from identifying vulnerabilities to figuring out how to create them. The actual selling of the make-up may be the tip of the iceberg.
Maybe it’s the NSA, maybe it’s the way every page now tracks every click in an attempt to better market to you1, maybe it’s how violated we feel when a company like Target is hacked, maybe it’s just the Facebook algorithm showing us pictures of our ex we’d rather not see; but something is causing us to feel more than a little anxious about the state of our technologies and perhaps a little nostalgic for a simpler time.2
Last night I read about Flickr’s recent redesign and 10 year anniversary, which has left many looking back to at least the start of “Web 2.0”. Flickr Turns 10: The Rise, Fall and Revival of a Photo-Sharing Community.
Back in 2004, the sort of rich online environment for social interaction that Flickr and other newcomers were inventing was so new that people started talking about “Web 2.0,” a term that started out sounding futuristic but soon became redundant, since its influence was everywhere. No Web 2.0 site was more important than Flickr; it debuted just six days after Mark Zuckerberg launched Facebook from his Harvard dorm room, and at first, it wasn’t clear that Butterfield and Fake’s photo-sharing site wasn’t the bigger deal. Even its name, with the missing final vowel, provided inspiration to countless other startups.
There it is, this undertone of disgust, a harking back to a time when we had all the promise and power of these new social technologies without all the yucky commodification of our lives. In an otherwise stating-the-obvious article, What’s Driving UX Innovation — User Experience or User Exploitation?, is this gem:
And this is the problem. Rather than focusing on making a great product, that people will time and time again choose to use, the approach seems to be wait until your users are locked in and squeeze them for as much revenue as possible by giving them a product that can just about be tolerated. If accountants had plotted a curve against a monetization/UX axis and knew the point where users will switch off, we’re getting closer and closer to this point every day.
And my favorite trip down memory lane, the second article I referenced in my opening paragraph, A Conversation with Andrew Smales, Founder of Diaryland.
There was a cool feeling at the time, even as the internet was starting to take off. That was when I looked back, a year or two into it, and thought, this felt a little more enclosed back then, like an actual culture or subculture.
And I could stop there and just pine for the good old days but Andrew immediately goes on to say something else equally true:
On the whole, I don’t really miss that time because everything on the internet is so much better now. You can get on it anywhere. For me, it’s not much of a tradeoff. I know what you’re talking about, and there was a neat feeling back then, of that little close thing, but what can you do? The internet is so much better overall. It makes up for losing that.
We’re back to Zeynep’s original paradox. Even as a maker of these technologies, I would not want to go back ten years technologically. It is so much simpler now than it was even five years ago to start up a company and do X with technology. Every space that has to do with providing technology services for other technology companies is packed with fantastic options. Even the basic code frameworks and tools we use are so much better—as we would expect!
Jason Fried and John Gruber wax nostalgic for much of this podcast, and at about 1:08:00 make some profound statements about aligning businesses’ goals with those of their customers.
This has all come at the cost of those qualities that vetted the original makers of the web. What I refer to as TechCrunch culture. Everyone’s an expert. “10 Ways to Drive More Hopeless Lackeys Into the Gaping Jowls of your Website.” Even the most cynical of web entrepreneurs these days are reading 5 paragraph summaries of ebooks that are simplified re-tellings of research that matured years ago and building entire businesses or products around these collections of aphorisms.
That’s not all. All this data they’re collecting about you is driving product design decisions, and why shouldn’t it? It is your behavior they3 want to influence. Why is there an annoying popup asking you to sign up for an email list on nearly every site you visit? It’s because those work, and spam works. It’s your fault. Congratulations, we’ve democratized web design and development to the point where we are getting what our depraved attentions deserve. The distance between a fart-noise soundboard app and Forbes has compressed almost to the point of indecipherability.
Without strong leadership steering these ships, we’re destined to continue down the path of deprivation. The pirates have taken over the fleet, and those with discipline and old-world skills are hidden away in enclaves, quietly existing on the resources available around them.4 The point is: We have a surplus of knowledge and a paucity of wisdom.5 Perhaps this is not a new problem.
In case you didn’t take the time to read Zeynep’s article, I’m going to include another long section that so beautifully captures her observations:
During a break, I cornered the chief scientist on Obama’s data analytics team, who in a previous job ran data analytics for supermarkets. I asked him if what he does now—marketing politicians the way grocery stores market products on their shelves—ever worried him. It’s not about Obama or Romney, I said. This technology won’t always be used by your team. In the long run, the advantage will go to the highest bidder, the richer campaign.
He shrugged, and retreated to the most common cliché used to deflect the impact of technology: “It’s just a tool,” he said. “You can use it for good; you can use it for bad.”
“It’s just a tool.” I had heard this many times before. It contains a modicum of truth, but buries technology’s impacts on our lives, which are never neutral. Often, I asked the person who said it if they thought nuclear weapons were “just a tool.” Humans have always fought, but few would say it doesn’t matter if we fight with sticks, knives, guns, or nuclear weapons.
This time, I sighed and let it go. I wanted to get back to Twitter. I wanted to get back to my hometown.
Postscript: I realized that this post is an expanded/updated version of Life Is Too Short to Make Shitty Software.