Reflections from Planningness 2014

I recently returned from this year’s Planningness event in Portland, Oregon. It was a great event, full of great speakers. And I owe a tremendous debt of gratitude to Baldwin& for sending me (by the way, the views shared here are not necessarily endorsed by them; they’re just the byproduct of my own senseless ramblings).

I’m sure there will be no shortage of Planningness recaps making their way around the interwebs in the coming days, but I felt as though I had a responsibility to at least share some of my initial thoughts. Instead of just recapping all of the sessions I attended, though, I decided to outline a few ways that this year’s event will impact my work moving forward. Because I’m selfish. And this is all about me.

Some of this may be blindingly obvious. Maybe not. Either way, I think there are some helpful distinctions in here that can improve our discipline of planning moving forward. Feel free to share your feedback or poke around some of the links I’ve provided below for more context and details.

KEY LEARNINGS

 

2014 PLANNINGNESS PRESENTATIONS

[I will continue to post links here as they become available]

Ian Fitzpatrick, “Low Fidelity Data Mining”

Jess Seilheimer, “How To Launch a Crowdfunded Product”

Jamie Davidson, “How To Raise Venture Capital”

James Brown, “How To Maximize Flow”

Nitin Khanna, “How To Grow Your Startup”

Robert Gallup, “How To Hack Electronics”

NOTABLE QUOTES

[not all of these are verbatim, but they’re as close as memory serves]

“We forget that you can capture someone’s attention for half a second without actually impacting them.” – Megan Averell

“When the consumer is diminished, so too are planners and our role.” – Megan Averell

“You have the ability to choose your response.” – Jeff George

“You are not a human doing. You are a human being.” – James Brown

“The saddest life is an irrelevant one.” – James Brown

“To be creative, one must be comfortable with their areas of ignorance or completely changing their opinions and perspectives.” – Lisa Aziz-Zadeh

“Derivation is an art.” – Ian Fitzpatrick

“Don’t fall in love with the numbers. Fall in love with the people that those numbers point you toward.” – Ian Fitzpatrick

“Whenever you use a tool in order to examine something, you are limited in what you see based on the capabilities of that tool.” – Alexandra Horowitz

RECOMMENDED READING LIST

Alexandra Horowitz, On Looking

Shunryu Suzuki, Zen Mind, Beginner’s Mind

Howard Gossage, The Book of Gossage

Stanley Pollitt, Pollitt on Planning

David Lynch, Catching the Big Fish

The Brutalization Will Be Televised

There’s been a disturbing thing happening with disturbing frequency lately. Take a look at just a few of the popular headlines from last month:

NASCAR driver Tony Stewart’s car struck and killed Kevin Ward Jr., who was on foot, during a sprint car race in upstate New York.

A nine-year-old girl accidentally killed an instructor after losing control of an Uzi at a gun range.

The St. Louis Police Department shot and killed 25-year-old Kajieme Powell for stealing a few drinks and powdered donuts from a nearby store.

Terrorist organization and international alliance of assholes, ISIS, savagely beheaded American journalist James Foley in Iraq.

Of course, the tragedy of human death is not news. Sadly. But what is a bit different today is the degree to which the deaths of others have become increasingly experiential. The headlines above were not just news stories — they were videos. Recorded, posted online, shared, ready for viewing. News networks linked to them. Social media networks spread them.

In a way, this demonstrates the duality of media and technology in modern culture. It can provide people with unprecedented access to new information, but it can also initiate people into new experiences. Turns out, coming face-to-face with real-world death is one of them. What used to be an experience limited to a few physically present bystanders is now available to anyone with an internet connection. Death has gone mainstream.

That’s an important distinction for the sake of historical context. After all, the idea of humans dying is as old as humans themselves. And death has been a central storytelling element across media for millennia (oral tradition, written text, sculpture, painting, etc.). But today, it’s more than just confronting the brutal realities of death — it’s the fact that we now have the capability to witness the moment of death itself. It’s a monstrous shift, in every sense of the word.

In 1969, Eddie Adams won a Pulitzer Prize for a photograph taken in the instant before the execution of a Viet Cong prisoner [1]. Richard Drew’s The Falling Man captured the horrific moments of September 11th. These photographs (and many others like them) have become some of the most enduring images of the past century. But there’s something even more grotesque about the filming of death on video. It’s more immediate, more undeniable. It transforms passive audiences into afflicted eyewitnesses.

There’s probably a great opportunity here to reassess the role of ethics in media today. Or we could examine the explosive popularity of video with regard to web traffic, media monetization and advertising initiatives. But the most important piece of this present-day phenomenon seems to be the realization that, plain and simple, our technological capabilities have eclipsed our emotional capacity to cope with them.

In the mid-1960s, Marshall McLuhan prophetically described the technologically connected world as a “global village,” one where “electronics and automation make mandatory that everybody adjust to the vast global environment as if it were his little home town” [2]. It’s a poignant metaphor for the present day. And it’s probably pretty accurate, too. It’s just that, now, it’s all starting to hit a little too close to home.

_______________________________

[1] The New York Times, 2011

[2] Marshall McLuhan, War and Peace in the Global Village, 2001

The Triumph of Soul

It’s always a peculiar thing when a beloved comedian dies too soon. To know that someone capable of making us feel so carefree, so joyful, had been living a life that was quite the opposite.

This scene from Dead Poets Society is one of my all-time favorites from Robin Williams’s illustrious career. It’s not a riotously funny scene, à la The Birdcage or Mrs. Doubtfire. But it’s an important one.

It teaches us that soul counts for something.

And it teaches us that despite our incessant human need to always understand the world, to prove, to substantiate, to reason, to quantify, to uncover incontrovertible empirical truths, to theorize, to validate, to justify, to know — sometimes it’s just as good to stop thinking, and to feel. Rather than always needing to explain things, can’t we just appreciate them sometimes, instead?

Soul. It’s hard to describe. It defies logic. It never needs a footnote. Yet it does something to us. It moves us. It changes us. And the world feels a little emptier without it.

Surely, that has never been more obvious than it is now.

photo credit: Peggy Sirota

When It Comes To Portrayals of Women, It’s Time For Mass Media & Culture To Grow Up

 

Jessica Lange for Marc Jacobs (courtesy of Marc Jacobs ©)

Lately, I’ve been thinking about older women.

Hold on. Let me explain.

This past month, Tom Junod wrote an article in Esquire magazine called “In Praise of 42-Year-Old Women.” As the title suggests, Junod pays tribute to a relatively newfound cultural phenomenon: the ascent of the middle-aged woman. He writes, “a few generations ago, a woman turning forty-two was expected to voluntarily accept the shackles of biology and convention; now it seems there is no one in our society quite so determined to be free.”

Bear in mind, it’s a men’s magazine, written largely for men, by men. So Junod’s praises & perspectives may be a bit… partisan. Okay. But what does seem undeniable is the mainstream momentum building behind today’s middle-aged woman:

Last year, People magazine crowned then-40-year-old Gwyneth Paltrow the “World’s Most Beautiful Woman,” atop a list that also included 38-year-old Drew Barrymore, 46-year-old Halle Berry and 75-year-old Jane Fonda.

Popular daytime television programs and online content portals, predominantly aimed at women, declare that 40 is the new 30, not necessarily a novel topic of discussion but one gaining in social clout.

Major brands are beginning to feature older female models: Marc Jacobs, for example, announced 64-year-old Jessica Lange as the new face of the brand, and 62-year-old Jacky O’Shaughnessy recently modeled a signature leotard for American Apparel.

Even privately: according to the research and data analytics division at one of the world’s highest-trafficked porn websites, the most popular performer of the year was a 42-year-old woman who garnered three times more searches and comments than anyone else on the list [1].

Once you stack up all of these media observations, an interesting inflection point starts to emerge: either our culture today isn’t nearly as fixated on youth as it historically has been, or it’s beginning to redefine the meaning of youth.

Part of this phenomenon may be explained by demographic shifts. We are an aging population. By 2050, the average age of Americans will increase from 35.3 years old to 41.7 years old [2]. And obviously, our definition of “middle age” shifts in proportion to increases in life expectancy.

Then again, part of it may also be explained by generational shifts in lifestyle. Young women today are more likely to be college-educated than young men, meaning they typically enter the workforce at a later age [3]. They’re also now getting married three years older, on average, than they were in 1980 [4]. And some experts point to the fact that more women are having children later in life, evinced in part by a 25% increase over the past decade in women having their first child between the ages of 35 and 39 [5]. In many ways, it’s becoming more common for major life milestones to be met later in life.

But another often overlooked factor may be, you guessed it, money. Women control $12 trillion of the overall $18.4 trillion in global consumer spending, which means there’s a significant financial opportunity for both media and brands [6]. Especially when it comes to categories around beauty and appearance. By some estimates, the market for anti-aging products was projected to surpass $4 billion in a single year [7]. And we all know, wherever money leads, focus follows.


So before we applaud this remarkable cultural resurgence of the modern middle-aged woman, perhaps we should consider the arc of the narrative, not just the momentum behind it.

And this is where the perspectives of real-life women can trump any trend analysis or quantitative market research or demonstrative datasets.

Huffington Post writer Susan Deily-Swearingen expressed her own exhaustion over the trend: “40 is not the new 30, just like Obama is not the new Kennedy, just like Michael Buble is not a new Rat Packer. 40 is simply 40. Why can’t things just be what they are? Why do things have to be the new something else? When do we give things a chance to just be themselves?” 

Elissa Strauss of The Week put it more strongly: “we have ladies feeling more pressure to be hot later than ever… men haven’t embraced actual mature, powerful women. Nope. All we have here is yet another fantasy. It’s just the same old song.”

The reality is, there seems to be an unfortunate tradeoff to this story. On one hand, it’s encouraging to see an older demographic of women finally receiving mainstream recognition. It’s about time. But on the other hand, it’s also entirely possible that the media frenzy around middle-aged women is yet another attempt to prolong the pressures, and reinforce the stereotypes, that media has traditionally reserved for younger women.

Is it possible that our newfound cultural obsession with the middle-aged woman stems solely from her ability to revert to a former self? To lose the baby weight. To eliminate the face lines. To look good in a bikini on the cover of Vogue. It’s the difference between wanting to compliment a middle-aged woman on how good she looks and feeling the need to congratulate her on it.

Time will tell.

But the most encouraging element of this cultural phenomenon is that we may be beginning to see a phase change. Finally. Women today, spearheaded in large part by this older group, may be starting to untether themselves from the stereotypes that mainstream media has historically tied them to. A shift in the story. After all, there is perhaps no one better poised to escape the guise of archetypes and to embrace genuineness in its many forms than she.

Just as the modern middle-aged woman has proven to us all that she, too, can project an air of undeniable desirability, she’s also here to remind us that she doesn’t always need to.

 

___________________________________________________

[1] Pornhub Insights, 2014

[2] Vienna Institute of Demography at the Austrian Academy of Sciences, 2006

[3] Pew Research Center, 2011

[4] WiseGeek.org, 2014

[5] Centers for Disease Control & Prevention, 2014

[6] Boston Consulting Group, 2014

[7] Focalyst & Millward Brown, 2007

The 5 Stages of Creative Development


1. Being able to discern a good idea from a bad idea.

Good ideas are hard to come by. This may explain why 65% of new television shows are cancelled after one season [1]. Or why less than 0.01% of mobile apps are considered financial successes [2]. Or why 90% of new product launches fail [3]. Being able to recognize a stellar idea among a surplus of bad ideas is no small skill.

2. Being able to generate a good idea.

Coming up with great ideas, especially with some consistency, can be a daunting process. Fail fast. Rapid prototyping. Iterate, iterate, iterate. That type of thing. Always aim for an overwhelming number of ideas, as Dustin Ballard demonstrates. It’s not a science, it’s a way of working. As the line often attributed to Aristotle goes, “we are what we repeatedly do. Excellence, then, is not an act, but a habit.” Keep going.

3. Being able to clearly articulate a good idea.

One-sentence strategies. Six-word stories. Setting creative hooks. There are a ton of helpful frameworks out there. The point is, powerful ideas need simple explanations. It doesn’t really matter how good an idea is if you can’t clearly communicate why it matters. Too often, great ideas fail not because of the idea itself but because of the way the idea was presented.

4. Being able to defend a good idea.

In 1876, Western Union declared that “the telephone has too many shortcomings to be seriously considered a means of communication.” In 1899, the U.S. Patent Commissioner stated “everything that can be invented has been invented.” In 1936, the New York Times said that “a rocket will never leave the Earth’s atmosphere.” Great ideas often face great resistance.

5. Being able to detach yourself from a good idea. 

At this stage of the process, two things can happen to great ideas: either (1) they find success or (2) they don’t. Don’t let the destruction of a great idea destroy you. Don’t let the success of a great idea satisfy you. Both are equally dangerous to the creative process and your creative development. As George Will once wrote, “any idea is dangerous if it’s a person’s only idea.” And that’s probably pretty accurate. Be thirsty.

______________________________________________

[1] Screen Rant, 2012

[2] Gartner Social Report, 2014

[3] Oliver Burkeman, The Antidote: Happiness for People Who Can’t Stand Positive Thinking, 2012

What Happens When Privacy Disappears & We Have Nothing Left to Hide?

photo: Donald Sterling (AP)

Some people feel like society has taken a turn for the worse recently. Particularly when it comes to the issue of privacy.

Take a look at the headlines from the past couple of years. NSA security probes. Digital espionage. Identity theft. Catfishing.

Rogue individuals, whether perceived as shameful traitors or courageous whistleblowers, have exposed omnipotent governments. Glenn Greenwald. Edward Snowden. Julian Assange.

Multi-national corporations have fallen victim to epic security breaches. From Bank of America to Apple to Target to Sony to Disney to LinkedIn to eHarmony.

Even major news organizations — like News of the World, which shuttered its doors after 168 years of operation due to an egregious phone hacking scandal — continue to erode our faith in this passé thing called ‘privacy’.

It may also explain how a service like Snapchat, which essentially offers the same functionality as traditional SMS but with a twist of temporariness, can attract some 100 million users around the world and a $3 billion buyout offer from Facebook. Then of course, there’s Pluto and Wickr and Cyber Dust and a slew of other self-destructing communication services. All for the sake of covering up our digital footprints.

We waltzed into the digital age like a child stumbling upon an unguarded cookie jar, with wide-eyed amazement and a seemingly endless appetite. It was exhilarating. Consume to your heart’s content. Now, the nausea is starting to set in. Slowly.

Horst Feistel wrote in Scientific American, “there is growing concern that computers now constitute, or will soon constitute, a dangerous threat to individual privacy.” The year was 1973. Yet that sentiment hasn’t shifted much since then. If anything, it has only intensified.

Today, 74% of internet users say they are more concerned about privacy this year than they were a year ago [1]. That number has grown nearly 52% in the past 5 years alone [2]. But what’s more interesting is how internet users have begun safeguarding some of their online behaviors. A recent survey from Pew Research found that 86% of internet users have actively taken steps online to remove or mask their digital footprints — anything from clearing cookies to encrypting email to protecting name displays on social networks [3].

As is the case with most things, a healthy dose of perspective is helpful. After all, privacy has always been a fervently defended issue. Puritan rule in the 1600s made keeping an eye on your neighbor a civic duty, and in many towns it was forbidden to live alone. The national census, as originally established by the U.S. Constitution, was regarded by many at the time as a flagrant infringement of privacy and personal information. And how could we forget the fury that bubbled over when law enforcement agencies began wiretapping early telephone networks in the 1890s? In other words, losing our collective shit over privacy is sort of an American pastime.

But perhaps there is something greater afoot here today. Something with much wider implications. Something that will not only change the way we interact with technology, but something that may change the way we interact with each other. On a human level.

As Aldous Huxley once said, “technological progress has merely provided us with more efficient means for going backwards.”

So here’s my optimism for an otherwise very troubling trend:

A few years ago, Carnegie Mellon University professor Jesse Schell gave a TED-like talk at the 2010 DICE Summit where he imagined a future in which new technology could have transformative effects on human behavior — namely, a future where the gamification of everyday things could potentially enable companies, governments and other institutions to reward people for completing or excelling at routine, everyday tasks.

For instance, he dreamt up technology that could intuitively know each time you take public transportation and provide you with instantaneous tax incentives. Or eye sensors that could track your completion rates of the novels you read so that Amazon could provide more accurate book reviews. Or mechanisms that could reward school children with incremental scholarship funding from the Arts Council each time they practiced the piano, with even more if they performed particularly well on a given day. It’s futuristic, but not far-fetched. A bit Big Brother-ish. But utterly astounding.

The most important point of Schell’s talk was the realization that in a not-so-distant future, when these types of technologies can track, watch and reward our everyday behaviors, wouldn’t we instinctively become more aware — and more conscious — of our behaviors? Wouldn’t we become a bit more sensitive to the fact that our every behavior carries consequence? Wouldn’t these gamified-track-reward-whatever-you-want-to-call-them systems hold up a better mirror to the way we live our lives? A real-time, unavoidable reflection of how we behave? As Schell concludes, “it could be that these systems are all just crass commercializations and they’re terrible. But it’s possible, that they’ll inspire us to be better people.”

Schell’s thesis about the gamification of objects may also bear some analogy to the issue of privacy. Despite all the outrage today — the increase in information gathering, the dissipation of personal boundaries, the countless spying scandals, the notion that nothing ever dies once it lives online — isn’t it possible that our evolving relationship with privacy may impact our evolving relationship with the world in which we live? Isn’t it at least possible that the evaporation of privacy could — in some unforeseen, roundabout, possibly tyrannical but potentially restorative way — actually contribute to a profoundly positive change in how we conduct our lives? In a world where it becomes harder and harder to hide things, wouldn’t we become less and less motivated to have things worth hiding?

It’s hard to say for certain. But what is certain is that we’re heading toward this future whether we like it or not. One where our secrets are more susceptible. One where our indiscretions are more exposable. One where our carefully constructed reputations, the images of ourselves that we want the world to see, are less a byproduct of painstaking production and more an honest representation of who we really are. Our benevolence and our wickedness. And everything in between.

We can continue to run around in outrage over privacy. And maybe we should. But maybe our best hope for the future is just as much about changing technology as it is about changing ourselves.

Let’s start by being good.

_________________________

[1] Harris Interactive, 2013

[2] Pew Research Center, 2013

[3] Pew Research Center, 2013

Finding Beauty In Unlikely Places

There was a powerful story printed in the Washington Post years ago about how Joshua Bell, one of the world’s premier virtuoso violinists, descended into a crowded Metro subway station to perform a few classical pieces on one of the world’s finest handcrafted violins. Despite having just performed to a sold-out crowd at one of the most prestigious concert halls in the country a mere three days earlier, Bell was now busking for spare change amidst a backdrop of early morning commuters. The most shocking thing about it was that almost nobody — save a curious child or the occasional passerby — stopped, or even spared a glance at Bell as he played some of the most beautiful compositions of music in the world.

The story, which became known as Pearls Before Breakfast, was an experiment on how context can alter perception. It’s an important story because it demonstrates just how prone human beings are to overlooking beauty in everyday situations.

Artist Brendan O’Connell created a collection of portraits highlighting the subtle beauty inside one of the least likely of places: Walmart Supercenters. His paintings feature store shelves stocked with Jif peanut butter and Utz potato chips, crowded check-out counters, boxes of farfalle pasta, a package of frozen Bubba Burgers. It’s a powerful meditation on minutiae. He says, “trying to find beauty in the least-likely environment is a kind of spiritual practice.”

German-born photographer Michael Wolf recently launched his latest photo project, capturing the living conditions in megacities throughout Asia. It’s what you might’ve expected: claustrophobic dwellings made of everything from concrete to cardboard. But Wolf’s photos capture it in a way that outlines an aesthetic behind it all. There’s a strange beauty to it, a hidden geometry, born not of intent or architectural design, but of the realization that humanity is capable of existing even within structures built to suppress it.

There is something soul-stirring about the emergence of beauty from unlikely places. It’s why I’ve always gravitated toward artists with more raw reflections of the world. Like when Tom Waits writes a love song from the perspective of a prostitute recently released from prison, or when the Coen brothers produce a film where the villain ultimately succeeds, or when Charles Bukowski, once referred to as ‘the Poet Laureate of American Lowlife,’ pens a poem about drug-induced fornication in a roach-infested hotel room. There’s an unshakable truth woven into these stories, untarnished by the pretense of perfection, a portrayal of human life so realistic that it’s a thing of beauty in and of itself. By disavowing our dictionary definitions of what beauty is, we are able to find beauty in places we never thought it could be.

Confucius once famously said, “everything has beauty, but not everyone can see it.” Or maybe we’re just not looking closely enough.