The Triumph of Soul

It’s always a peculiar thing when a beloved comedian dies too soon. To know that someone capable of making us feel so carefree, so joyful had been living a life that was anything but.

This scene from Dead Poets Society is one of my all-time favorites from Robin Williams’s illustrious career. It’s not a riotously funny scene, à la The Birdcage or Mrs. Doubtfire. But it’s an important one.

It teaches us that soul counts for something.

And it teaches us that despite our incessant human need to always understand the world, to prove, to substantiate, to reason, to quantify, to uncover incontrovertible empirical truths, to theorize, to validate, to justify, to know — sometimes it’s just as good to stop thinking, and to feel. Rather than always needing to explain things, can’t we just appreciate them sometimes, instead?

Soul. It’s hard to describe. It defies logic. It never needs a footnote. Yet it does something to us. It moves us. It changes us. And the world feels a little emptier without it.

Surely, this has never been more obvious than it is now.


When It Comes To Portrayals of Women, It’s Time For Mass Media & Culture To Grow Up

 


Lately, I’ve been thinking about older women.

Hold on. Let me explain.

This past month, Tom Junod wrote an article in Esquire magazine called “In Praise of the 42-Year-Old Woman.” As the title suggests, Junod pays tribute to a relatively newfound cultural phenomenon: the ascent of the middle-aged woman. He writes, “a few generations ago, a woman turning forty-two was expected to voluntarily accept the shackles of biology and convention; now it seems there is no one in our society quite so determined to be free.”

Bear in mind, it’s a men’s magazine, written largely for men, by men. So Junod’s praises & perspectives may be a bit… partisan. Okay. But what does seem undeniable is the mainstream momentum building behind today’s middle-aged woman:

Last year People magazine crowned then-40-year-old Gwyneth Paltrow the “World’s Most Beautiful Woman,” topping a list that also included 38-year-old Drew Barrymore, 46-year-old Halle Berry and 75-year-old Jane Fonda.

Popular daytime television programs and online content portals, predominantly aimed at women, declare that 40 is the new 30, a topic that isn’t necessarily novel but is steadily gaining social clout.

Major brands are beginning to feature older female models: Marc Jacobs, for example, announced 64-year-old Jessica Lange as the new face of the brand, and 62-year-old Jacky O’Shaughnessy recently modeled a signature leotard for American Apparel.

Even privately: according to the research and data analytics division at one of the world’s highest-trafficked porn websites, the most popular performer of the year was a 42-year-old woman who managed to garner 3 times more searches and comments than anyone else on the list [1].

Once you stack up all of these media observations, an interesting inflection point starts to emerge: either our culture today isn’t nearly as fixated on youth as it has historically been, or our culture is beginning to redefine the meaning of youth.

Part of this phenomenon may be explained by demographic shifts. We are an aging population. By 2050, the average age of Americans is projected to increase from 35.3 years old to 41.7 years old [2]. And naturally, our definition of “middle age” shifts in proportion to increases in life expectancy.

Then again, part of it may also be explained by generational shifts in lifestyle. Young women today are more likely to be college-educated than young men, meaning they typically enter the workforce at a later age, on average [3]. They’re also now getting married three years later, on average, than they were in 1980 [4]. And some experts point to the fact that more women are having children later in life, evinced in part by a 25% increase over the past decade in women having their first child between the ages of 35 and 39 [5]. In many ways, it’s becoming more common for major life milestones to be met later in life.

But another, often overlooked factor may be, you guessed it, money. Women control $12 trillion of the overall $18.4 trillion in global consumer spending, which means there’s a significant financial opportunity for both media and brands [6], especially when it comes to categories around beauty and appearance. By some estimates, the market for anti-aging products was projected to surpass $4 billion in a single year [7]. And we all know: wherever money leads, focus follows.


So before we applaud this remarkable cultural resurgence of the modern middle-aged woman, perhaps we should consider the arc of the narrative, not just the momentum behind it.

And this is where the perspectives of real-life women can trump any trend analysis or quantitative market research or demonstrative datasets.

Huffington Post writer Susan Deily-Swearingen expressed her own exhaustion over the trend: “40 is not the new 30, just like Obama is not the new Kennedy, just like Michael Buble is not a new Rat Packer. 40 is simply 40. Why can’t things just be what they are? Why do things have to be the new something else? When do we give things a chance to just be themselves?” 

Elissa Strauss of The Week put it more strongly: “we have ladies feeling more pressure to be hot later than ever… men haven’t embraced actual mature, powerful women. Nope. All we have here is yet another fantasy. It’s just the same old song.”

The reality is, there seems to be an unfortunate tradeoff to this story. On one hand, it’s encouraging to see an older demographic of women finally receiving mainstream recognition. It’s about time. But on the other hand, it’s also entirely possible that the media frenzy around middle-aged women is yet another attempt to prolong the pressure and reinforce the stereotypes that media has traditionally reserved for younger women.

Is it possible that our newfound cultural obsession with the middle-aged woman stems solely from her ability to revert to a former self? To lose the baby weight. To erase the face lines. To look good in a bikini on the cover of Vogue. It’s the difference between wanting to compliment a middle-aged woman on how good she looks and feeling the need to congratulate her on it.

Time will tell.

But the most encouraging element of this cultural phenomenon is that we may be beginning to see a phase change. Finally. Women today, spearheaded in large part by this older group of women, may be beginning to untether themselves from the stereotypes that mainstream media has historically tied them to. A shift in the story. After all, there is perhaps no one better poised to escape the guise of archetypes and to embrace genuineness in its many forms, than she.

Just as the modern middle-aged woman has proven to us all that she, too, can project an air of undeniable desirability, she’s also here to remind us that she doesn’t always need to.

 

___________________________________________________

[1] Pornhub Insights, 2014

[2] Vienna Institute of Demography at the Austrian Academy of Sciences, 2006

[3] Pew Research Center, 2011

[4] WiseGeek.org, 2014

[5] Centers for Disease Control & Prevention, 2014

[6] Boston Consulting Group, 2014

[7] Focalyst & Millward Brown, 2007

The 5 Stages of Creative Development


1. Being able to discern a good idea from a bad idea.

Good ideas are hard to come by. This may explain why 65% of new television shows are cancelled after one season [1]. Or why less than 0.01% of mobile apps are considered financial successes [2]. Or why 90% of new product launches fail [3]. Being able to recognize a stellar idea among a surplus of bad ideas is no small skill.

2. Being able to generate a good idea.

Coming up with great ideas, especially with any consistency, can be a daunting process. Fail fast. Rapid prototyping. Iterate, iterate, iterate. That type of thing. Always aim for an overwhelming number of ideas, as Dustin Ballard demonstrates. It’s not a science; it’s a way of working. Aristotle once said, “we are what we repeatedly do. Excellence, then, is not an act, but a habit.” Keep going.

3. Being able to clearly articulate a good idea.

One-sentence strategies. Six-word stories. Setting creative hooks. There are a ton of helpful frameworks out there. The point is, powerful ideas need simple explanations. It doesn’t really matter how good an idea is if you can’t clearly communicate why it matters. Too often, great ideas fail not because of the idea itself but because of the way in which the idea was presented.

4. Being able to defend a good idea.

In 1876, Western Union declared that “the telephone has too many shortcomings to be seriously considered a means of communication.” In 1899, the U.S. Patent Commissioner stated “everything that can be invented has been invented.” In 1936, the New York Times said that “a rocket will never leave the Earth’s atmosphere.” Great ideas often face great resistance.

5. Being able to detach yourself from a good idea. 

At this stage of the process, two things can happen to great ideas: either (1) they find success or (2) they don’t. Don’t let the destruction of a great idea destroy you. Don’t let the success of a great idea paralyze you. Both are equally dangerous to the creative process and to your creative development. As George Will once wrote, “any idea is dangerous if it’s a person’s only idea.” And that’s probably pretty accurate. Be thirsty.

 

______________________________________________

[1] Screen Rant, 2012

[2] Gartner Social Report, 2014

[3] The Antidote: Happiness For People Who Can’t Stand Positive Thinking, 2012

What Happens When Privacy Disappears & We Have Nothing Left to Hide?


Some people feel like society has taken a turn for the worse recently. Particularly when it comes to the issue of privacy.

Take a look at the headlines from the past couple of years. NSA security probes. Digital espionage. Identity theft. Catfishing.

Rogue individuals, whether perceived as shameful traitors or courageous whistleblowers, have exposed omnipotent governments. Glenn Greenwald. Edward Snowden. Julian Assange.

Multinational corporations have fallen victim to epic security breaches. From Bank of America to Apple to Target to Sony to Disney to LinkedIn to eHarmony.

Even major news organizations — like News of the World, which shuttered after 168 years of operation due to an egregious phone-hacking scandal — continue to erode our faith in this passé thing called ‘privacy’.

It may also explain how a service like Snapchat, which essentially offers the same functionality as traditional SMS but with a twist of temporariness, can see over 500 million snaps shared every day and attract a $3 billion buyout offer from Facebook. Then of course, there’s Pluto and Wickr and Cyber Dust and a slew of other self-destructing communication services. All for the sake of covering up our digital footprints.

We waltzed into the digital age like a child stumbling upon an unguarded cookie jar, with wide-eyed amazement and a seemingly endless appetite. It was exhilarating. Consume to your heart’s content. Now, the nausea is starting to set in. Slowly.

Horst Feistel wrote in Scientific American, “there is growing concern that computers now constitute, or will soon constitute, a dangerous threat to individual privacy.” The year was 1973. Yet that sentiment hasn’t shifted much since then. If anything, it has only intensified.

Today, 74% of internet users say they are more concerned about privacy this year than they were a year ago [1]. That number has grown nearly 52% in the past five years alone [2]. But what’s more interesting is how internet users have begun safeguarding some of their online behaviors. A recent survey from Pew Research found that 86% of internet users have actively taken steps online to remove or mask their digital footprints — anything from clearing cookies to encrypting email to protecting name displays on social networks [3].

As is the case with most things, a healthy dose of perspective is helpful. After all, privacy has always been a fervently defended issue. Puritan rule in the 1600s made keeping an eye on your neighbor a civic duty, and in many towns it was forbidden to live alone. The national census, as originally established by the U.S. Constitution, was regarded by many at the time as a flagrant infringement on privacy and personal information. And how could we forget the fury that bubbled over when law enforcement agencies began wiretapping early telephone networks in the 1890s? In other words, losing our collective shit over privacy is something of an American pastime.

But perhaps there is something greater afoot here today. Something with much wider implications. Something that will not only change the way we interact with technology, but something that may change the way we interact with each other. On a human level.

As Aldous Huxley once said, “technological progress has merely provided us with more efficient means for going backwards.”

So here’s my optimism for an otherwise very troubling trend:

A few years ago, Carnegie Mellon University professor Jesse Schell gave a TED-like talk at the 2010 DICE Summit where he imagined a future in which new technology could have transformative effects on human behavior — namely, a future in which the gamification of everything could potentially enable companies, governments and other institutions to reward people for completing or excelling at routine, everyday tasks.

For instance, he dreamt up technology that could intuitively know each time you take public transportation, providing people with instantaneous tax incentives. Or, eye sensors that could track the completion rates of the novels that you’ve read so that Amazon could provide more accurate book reviews. Or, mechanisms that could reward school children with incremental scholarship funding from the Art Council each time they practice the piano, even more if they performed particularly well on a given day. It’s futuristic, but not far-fetched. A bit Big Brother-ish. But utterly astounding.

The most important point of Schell’s talk was the realization that in a not-so-distant future, when these types of technologies can track, watch and reward our everyday behaviors, wouldn’t we instinctively become more aware — and more conscious — of our behaviors? Wouldn’t we become a bit more sensitive to the fact that our every behavior carries consequence? Wouldn’t these gamified-track-reward-whatever-you-want-to-call-them systems provide a better mirror into the way we live our lives? A real-time, unavoidable reflection of how we behave? As Schell concludes, “it could be that these systems are all just crass commercializations and they’re terrible. But it’s possible, that they’ll inspire us to be better people.”

Schell’s thesis for the gamification of objects may also bear some analogies to the issue of privacy. Despite all the outrage today — the increase of information gathering, the dissipation of personal boundaries, the countless spying scandals, the notion that nothing ever dies once it lives online — isn’t it possible that our evolving relationship with privacy may reshape our relationship with the world in which we live? Isn’t it at least possible that the evaporation of privacy could — in some unforeseen, roundabout, possibly tyrannical but potentially restorative way — actually contribute to a profoundly positive change in how we conduct our lives? In a world where it becomes harder and harder to hide things, wouldn’t we become less and less motivated to have things worth hiding?

It’s hard to say for certain. But what is certain is that we’re inevitably heading toward this future, whether we like it or not. One where our secrets are more susceptible. One where our indiscretions are more exposable. One where our carefully constructed reputations, the images of ourselves that we want the world to see, are less a byproduct of painstaking production and more an honest representation of who we really are. Our benevolence and our wickedness. And everything in between.

We can continue to run around in outrage over privacy. And maybe we should. But maybe our best hope for the future is just as much about changing technology as it is about changing ourselves.

Let’s start by being good.

_________________________

[1] Harris Interactive, 2013

[2] Pew Research Center, 2013

[3] Pew Research Center, 2013

Finding Beauty In Unlikely Places

There was a powerful story printed in the Washington Post years ago about how Joshua Bell, one of the world’s premier virtuoso violinists, descended into a crowded Metro subway station to perform a few classical pieces on one of the world’s finest handcrafted violins. Despite having performed to a sold-out crowd at one of the most prestigious concert halls in the country a mere three days earlier, Bell was now busking for spare change amid a backdrop of early-morning commuters. The most shocking thing about it was that almost nobody — save a curious child or the occasional passerby — stopped, or even glanced at Bell as he played some of the most beautiful compositions of music in the world.

The story, which became known as Pearls Before Breakfast, was an experiment in how context can alter perception. It’s an important story because it demonstrates just how prone human beings are to overlooking beauty in everyday situations.

Artist Brendan O’Connell created a collection of portraits highlighting the subtle beauty inside one of the least likely of places: Walmart Supercenters. His paintings feature store shelves stocked with JIF peanut butter and UTZ potato chips, crowded check-out counters, boxes of farfalle pasta, a package of frozen Bubba Burgers. It’s a powerful meditation on minutiae. He says, “trying to find beauty in the least-likely environment is a kind of spiritual practice.”

German-born photographer Michael Wolf recently launched his latest photo project, capturing the living conditions in mega-cities throughout Asia. It’s what you might’ve expected: claustrophobic dwellings made of everything from concrete to cardboard. But Wolf’s photos capture it in a way that outlines an aesthetic behind it all. There’s a strange beauty to it, a hidden geometry, not by intent or by architectural design, but by the realization that humanity is capable of existing even within structures built to suppress it.

There is something soul-stirring about the emergence of beauty from unlikely places. It’s why I’ve always gravitated toward artists with more raw reflections of the world. Like when Tom Waits writes a love song from the perspective of a prostitute recently released from prison or when the Coen Brothers produce a film where the villain ultimately succeeds or when Charles Bukowski, once referred to as ‘the Poet Laureate of American Lowlife,’ pens a poem about drug-induced fornication in a roach-infested hotel room. There’s an unshakable truth woven into these stories, untarnished by the pretense of perfection, a portrayal of human life so realistic that it’s a thing of beauty in and of itself. By disavowing our dictionary definitions of what beauty is, we are able to find beauty in places we never thought it could be.

Confucius once famously said, “everything has beauty, but not everyone can see it.” Or maybe we’re just not looking closely enough.

A Web of Deception


Early this year, Veritasium released a rather scathing YouTube video exposing the pervasiveness of Facebook ‘Likes’ coming from fake people. Click farms. Spambots. Duplicate accounts. That sort of thing. Even Facebook estimates that fake users represent somewhere between 5.5% and 11.2% of its total user base, which could account for as many as 137.76 million accounts across its network [1].
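
For scale, that upper bound is simple arithmetic. Here’s a minimal back-of-the-envelope sketch in Python, assuming a base of roughly 1.23 billion monthly active users (Facebook’s publicly reported figure from late 2013; an assumption on my part, not something stated in the report):

```python
# Back-of-the-envelope check on the fake-account estimate.
# ASSUMPTION: ~1.23 billion monthly active users (Facebook's
# reported figure for late 2013; not stated in the post above).
MONTHLY_ACTIVE_USERS = 1.23e9

low = 0.055 * MONTHLY_ACTIVE_USERS   # 5.5% of accounts are fake
high = 0.112 * MONTHLY_ACTIVE_USERS  # 11.2% of accounts are fake

print(f"Fake accounts: {low / 1e6:.2f}M to {high / 1e6:.2f}M")
# -> Fake accounts: 67.65M to 137.76M
```

Notice that the 11.2% case lands exactly on the 137.76 million figure cited above, which suggests that’s the user base the estimate was computed against.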

But this phenomenon isn’t exclusive to Facebook. The same could be (and has been) said of Twitter. And Instagram. And any other popular online social network. And even just the world wide web, in general. A recent report from Incapsula found that actual humans account for less than 40% of all web traffic [2].

This is no revelation. There have been a handful of similar reports with similar findings in recent years, all met with varying degrees of “really, are you kidding me?”. At the very least, it should prompt some contemplation from companies that advertise online, where metrics like page views, clicks, search and visitor flow impact the pricing, testing and optimizing of online ad units.

It’s sort of like the first time you realized that the laughter from your favorite television sitcom was actually coming from a pre-programmed soundboard in some studio somewhere. It doesn’t necessarily make the punchline any less funny. It just makes you reconsider the uproar behind it all.

 

______________________________

[1] The Next Web, 2014

[2] Incapsula, What Google Doesn’t Show You, 2014

The Lust For Loyalty


A recent study from the National Opinion Research Center found that almost 15% of married women and 21% of married men admitted to having extramarital affairs at some point throughout their marriages [1].

Let that sink in for a second.

Now, consider how most brands today believe that these same people are unwaveringly loyal when it comes to the brand of shampoo in their shower. Or the logo on their sneakers. Or even the insignia on their automobile. Point is, our assumptions about how loyalty works are often incongruent with the reality of what loyalty is.

In his book How Brands Grow: What Marketers Don’t Know, market researcher Byron Sharp builds on a wealth of empirical evidence to suggest that, despite all the pontification from marketing professionals, customer loyalty is largely a myth. His findings underline the important (and often overlooked) Duplication of Purchase Law, which states that brands within a category share their customer base with competing brands roughly in proportion to each competitor’s market penetration. This explains why 72% of Coke drinkers in the UK say they also drink Pepsi, for example [2]. It also explains why even the most ardent brand advocates aren’t always brand exclusive.
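
To make the law concrete, here’s a minimal sketch in Python. The brands, penetration figures and duplication coefficient below are illustrative assumptions, not Sharp’s actual data:

```python
# Duplication of Purchase Law, sketched with hypothetical numbers:
# the share of brand A's buyers who also buy brand B is roughly
# proportional to brand B's market penetration, scaled by a
# category-wide duplication coefficient D.

def predicted_overlap(competitor_penetration: float, d: float = 1.0) -> float:
    """Predicted fraction of a brand's buyers who also buy the competitor."""
    return min(1.0, d * competitor_penetration)

# Hypothetical cola category: brand -> share of category buyers who buy it.
penetration = {"Brand A": 0.60, "Brand B": 0.45, "Brand C": 0.15}

D = 1.2  # assumed duplication coefficient for this category

for brand, pen in penetration.items():
    print(f"~{predicted_overlap(pen, D):.0%} of any rival's buyers "
          f"are predicted to also buy {brand}")
```

Note how the biggest brand gets shared the most (the same pattern behind the Coke/Pepsi overlap above), not because buyers love it any less, but simply because more people buy it.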

Martin Weigel articulated this masterfully in his essay The Liberation of Magic:

“Loyalty is much more like an open marriage than one characterized by unwavering monogamy and devotion… Irrespective of the category we examine, we see that the vast majority of buyers are in fact not loyal to a single brand. Devoted loyalty – borne of the belief that other brands just aren’t as good, or just aren’t the same – does not exist. Instead, consumers are perfectly happy to buy from a repertoire of brands.”

There are plenty of marketers out there who have built bona fide businesses by throwing around language like fans, ambassadors, brand activists, and loyalty. And there are just as many B-school whitepapers saying things like: 1.5% of shoppers drive 80% of sales for the average new CPG product; or, the top brand buyers are almost six times more likely to try a brand’s new products than average shoppers [3]. And those aren’t entirely moot points.

But fandom is fool’s gold. As Karen Nelson-Field of the Ehrenberg-Bass Institute cautions, too many brands are “putting a disproportionate amount of effort into engagement and strategies to get people to talk about a brand, when [they] should be spending more time getting more light buyers.” True talk. The lust for loyalty is a siren song for companies that need to address actual business problems. Fans may give brands reassurance, but rarely do they increase revenue, market share, household penetration or any other real, bottom-line business metric.

* * *

The Institute of Practitioners in Advertising (IPA) has been mining marketing effectiveness measures from more than 1,000 brand case studies over the past 25 years. Their research shows that loyalty campaigns underperform on almost every business metric. They also found that only 9% of loyalty campaigns increased loyalty significantly — in fact, not much higher than the rate for non-loyalty campaigns [4]. And this actually makes quite a bit of sense. As Les Binet and Peter Field explain in Marketing In the Era of Accountability, talking to existing customers is fundamentally less rewarding because:
    1. there are usually fewer of them than non-customers, and
    2. they are typically more influenced by product experience than by communications.

Binet and Field use this IPA data to demonstrate that across nearly every category, “superfans” represent such a small segment of potential customers that it doesn’t even come close to reaching a critical mass for communications. You know, the majority of Nike owners don’t have a logo tattooed across their chests. And even among those select “passion brands” that seemingly have an unyielding consumer allegiance, communications are less likely to influence purchase behavior — which may explain why a brand like Apple has deliberately ignored ambassador outreach, centralized CRM or even social media engagement in general.

So while building brand loyalty has become one of the more popular marketing mantras of recent years, it’s important to at least acknowledge its limitations. It’s not quite the marketing panacea that keynote speakers and best-selling authors lead us to believe it is — at least not when it comes to delivering against hard business goals. Loyalty may be valuable as a marketing output but it tends to be pretty poor as a marketing objective.

David Ogilvy once famously said, “the consumer is not a moron… she’s your wife.” And for some, there may be an uncomfortable amount of truth to that now.


______________________________

[1] National Opinion Research Center, 2013

[2] How Brands Grow: What Marketers Don’t Know, 2010

[3] Catalina Marketing Corp., 2012

[4] Marketing In The Era of Accountability, 2007