Monthly Archives: August 2009

I See the “Social,” But Where Is the “Media”?

In September of 2006, a few creative geeks in Boston hosted a small event called PodCamp.  It was designed as a peer-education and networking event for podcasters — that is, people who make audio and video content for the web.  They were expecting a few dozen attendees; they got over 300.

People from as far away as California, Florida and England converged on Boston that weekend to meet their peers, share their collective knowledge and build a real-world community around an online pastime.  The energy and insight on display that weekend were infectious, and so PodCamp founders Chris Brogan and Christopher Penn decreed that any of us who wanted to host a PodCamp in our own hometowns could do so with their blessing.

Since then, there have been dozens of PodCamps around the world, from Sweden to Hawaii.  There will be 20 hosted this year alone, including the 4th annual events in Boston and Pittsburgh.  Global attendance for PodCamps measures into the tens of thousands.  And yet, in all this growth, one subtle change seems to go mostly unnoticed:

PodCamp isn’t about podcasting anymore.

Instead, PodCamp has become a catch-all for blogging, social networking, personal branding and SEO (among other themes).  The number of actual podcasters has dwindled precipitously since 2006, while the number of casual social media aficionados has exploded.  (To wit: all four of Pittsburgh’s long-running, nationally recognized podcasts — Should I Drink That?, Something to Be Desired, The G Spod and The Wrestling Mayhem Show — existed prior to PodCamp.)  And it’s easy to see why:

  • Podcasting takes more time, energy and resources than text
  • Podcasting is frequently collaborative; blogging is not
  • Podcasting gets compared to the work of professionals; writing is personal
  • Podcasting is a multi-step process; text can happen anywhere

As we move toward ever-faster means of communication with fewer barriers to entry, the cheapest, fastest and easiest formats will always become the most ubiquitous, while anything more complex will be relegated to the Land of Niche.  But complexity is entirely relative.

When I first started producing Something to Be Desired in 2003, I thought we had to move fast because producing a web-based sitcom was so easy, everyone would be doing it.  I was only half-right; it turns out producing a continuing series only seems easy, while producing individual web-based videos is far easier.  Very few people have the time, interest, help, skill or ideas to sustain an ongoing show.  So although I expected the web to mount a serious challenge to TV and film paradigms through an explosion of independent talent, the web instead provided disruption through distribution, not production.

Who knew that the audience at the first PodCamp would be the anomaly rather than the norm?  Who could predict how easily we’d convert from an audience of makers to an audience of talkers?  And who could expect that the democratic web, which once seemed on the verge of detonating our expectations about what was possible artistically, would so quickly be co-opted as just another distribution tool by the existing media conglomerates — one which we’re content to analyze but not to utilize?

If actions speak louder than words, why are so many of us content to just keep talking?

Why Are Some Cities More Twitterific Than Others?

Last night, while I was helping answer someone’s question about which cities boast the most Twitter users (top city: London), a new question emerged: how does one city become more Twitterized than another?

London, like much of Europe, is a more mobile and text-based culture than the US, so it makes sense that it tops the list.  In fact, the top 30 results add up logically based solely on relative population size, with a few exceptions (Toronto, Austin and Seattle must all have disproportionately tech-savvy populations relative to their size).  But take a look at the surprises further down:

Nashville @ #30? This must come as a shock to every northern city that presumes technology catches on more slowly in the south.

Rio de Janeiro @ #33? Despite being the world’s 21st-most-populous metropolis, Rio lags behind Orlando and Minneapolis in Twitter adoption?

Columbus and Pittsburgh @ #35 & #36? How are two “rust belt” cities more tech-forward than San Jose, San Antonio, Berlin and Montreal?

Baltimore @ #45? Although many people consider Baltimore and Washington, D.C., to be one large population swirl, D.C. is well ahead in terms of Twitter adoption, clocking in at #13.

How did these anomalies happen?  Since Richard Florida and the Freakonomics team are understandably busy, allow me to posit a few theories…

It’s not about population, it’s about purpose. Nashville is smaller in population than Louisville, yet Louisville isn’t even on the list.  Maybe that’s because Nashville is more of a business and cultural hub than Louisville, which — like Detroit, another city that’s not in the top 50 — has more of a manufacturing-based economy.  (Also, like Austin, I suspect Nashville’s Twitter populace is at least partially music-driven; people need a way to keep track of which bands are playing where.)  Likewise, politicos in D.C. have a vested interest in staying perpetually connected — more so than their laid-back, blue-collar sister citizens in Baltimore.

It’s not about population, it’s about income. Rio may be a metropolis, but a huge chunk of its population lives in poverty.  Twitter is a tool for the techies, and being a techie necessitates having at least a median income.  (Ditto Detroit, whose economy is currently the nadir of American mismanagement.)

It’s not about where we’ve been, it’s about where we’re going. To outsiders, Pittsburgh and Columbus may seem to be trapped in the rust belt, but internally, the people in both cities are acutely aware of the pressing need to evolve from the industrial economies of their past toward a tech-based, globally connected future.  Both cities have successfully hosted PodCamps, and Pittsburgh was previously revealed to be the third-bloggiest city in the country.  These days, when change happens, it begins digitally.

Twitter isn’t perfect, and neither are the systems used to rate its influence.  But as a snapshot of urban identity, Twitter adoption may be a more relevant signifier than some of the questions we’ll see on next year’s census.  (And just in case you were wondering, yes, anti-census patron saint Michele Bachmann is on Twitter.)

The Fishbowl Is Killing Us: Why Social Media Must Evolve or Die

Lately, I’ve been lamenting the lack of true iconoclasts in social media.  But I’ve also admitted that I may be expecting too much. Just because the tools we use to communicate with each other have changed doesn’t mean that everyone suddenly has something revolutionary to say; it just means we now have more immediate ways to say the same mundane things.

In fact, the culture of social media itself might actually prevent innovation.

Of Course You’re Deaf; You’ve Been Yelling Inside the Echo Chamber for Years

In a recent blog post, fellow armchair sociologist Steve Spalding parallels my train of thought by detailing the hypocrisy surrounding independent thought: we all claim to revere it, yet few of us produce it, and most of us ostracize it when we do find it.

Given those habits, it’s no wonder social media has become one giant echo chamber that’s unable (or unwilling) to make a larger cultural impact. In its brief existence, the social media fishbowl has developed an internal hierarchy consisting of “A-List bloggers” and “key influencers” whom everyone else chases after, hoping to ensnare their attention just long enough to be deemed worthy of a larger audience themselves.

And yet the extent of this group’s influence is confined to the minutiae of the tools we all use; rarely does anyone considered to be a social media “thought leader” find traction beyond the twin borders of technology and marketing. In this sense, the entire field seems less a hotbed of disruptive rabble-rousing than a motley parade of carpenters, all passionately searching out new and innovative ways to use a hammer.

Which leaves those of us hoping for a true seismic shift in the way the world works waiting impatiently for a few brave social media practitioners to look past the safe, secluded confines of their bassinets and venture out in search of new and skeptical audiences who don’t simply smile and nod. That day is inevitable; the fishbowl can only hold so many of us before it cracks.

Bookmark This for Later

My car died back in May.  Ann’s car just died on Friday.  Suddenly we’ve gone from a two-car household to a couple in need of a bus pass.

Fortunately, Ann lives and works along a bus line, so our daily routine is barely disrupted.  She still leaves for work and comes home around the same time, I work from home and make dinner, and we both walk the dog in the evening.  Sure, our freedom for shopping or going out with friends is temporarily limited, but our world hasn’t imploded either.  We’re just biding our time, crunching some numbers and researching our options for buying a new car.  Our lives will be back to normal in a week or two.

And yet, if you’d told us back in April that both of our cars would be DOA by September, we’d have complained, bargained and panicked: Anything but that!  Anything but an unplanned change to our lives!  Anything but INCONVENIENCE!

Why is it that we perpetually forget how easy it is to adapt to just about anything?  Maybe if we all had a little reminder from time to time, we wouldn’t get so anxious when it comes time for those traditional harbingers of doom like car inspections, annual reviews, election season or the nightly news.

Do We Expect Too Much From Social Media?

Two recent news articles about Twitter — one in the Baltimore City Paper (for which I was interviewed) and one in the Pittsburgh Post-Gazette — have finally pushed the mainstream media’s reaction to social media beyond the standard “what does it all mean?” journalistic crutch and toward a more practical question: how do we maximize the potential value in this new form of communication?

Sure, microblogging may be “here to stay” (at least as much as any new technology can be said to be “permanent”), but how many of us are using it to create actual value and meaning for ourselves, much less to push the cultural envelope?

Don’t worry: we can’t.

Our Caveman Ancestors Would Be SO Pissed…

Until recently, technological innovation happened relatively slowly.  Due to cultural, theological, sociological and legal restrictions, society’s advancement was always incremental.  (When your days are spent scavenging for food, evading predators and fucking wildly so your species doesn’t become extinct, you don’t have much time to spend tinkering with that cotton gin.)  The upside?  Every new invention was allowed to simultaneously appear earth-shattering AND to stick around long enough to become fully integrated into modern society, its potential for change (and profit) being mined from every direction before the “next big thing” came along.

Now, technology has reached a tipping point where it evolves more rapidly than the average citizen can keep up with.  Gone are the days when a fork, globe or sundial was the talk of the town; now, by the time you finally figure out what half the buttons on your cell phone are for, it’s obsolete and you have to buy a new one because nobody services that dinosaur in your pocket.  Not that you couldn’t figure it out faster if you had more time, but since we spend the bulk of our days earning the income we need to afford last year’s innovations, our learning curves with new toys are distinctly mountainous.

With technology becoming ever faster, smarter and smaller, the sky’s the limit on what we can do.  The problem is, that doesn’t leave us much time to figure out what we want to do, much less what we should do.  It took the printing press the better part of a century to comfortably change the world, but the iPhone is barely two years old and its first incarnations are already comical fossils; can you blame a guy for not trying to innovate with a tool that was antiquated on the day it shipped?

So when the mainstream media asks how we can maximize the potential of these tools — or when I lament people’s inability to truly rebel with them — we’re missing the point.  Expecting society to do something amazing with Twitter is like expecting a caveman to artfully navigate the subtleties of a weedwhacker; let’s just be happy we can turn the damn thing on.