Back to Ac: Returning to Academia, With a Twist

In 2009, I received my PhD from Royal Holloway, University of London. I thought it was the culmination, and the conclusion, of my academic career.

At the time, I had a full-time non-academic job as director of communications at a private school. I had decided to settle permanently in Toronto, my hometown. Since undergrad, I had worked in communications while pursuing degrees in English literature and teaching cinema and media studies; by settling in Toronto, I thought I had effectively chosen the former career, relegating my academic work to a fun, occasionally lucrative-ish hobby.

But I wasn't ready to give it up. I loved teaching, and I loved writing. And don't get me wrong: although I often project optimism (I am, in fact, a perennial optimist), it was painful not to be a full-time professor. On good days, it felt like a choice. On bad days, it felt as though I'd been abandoned by the academy. I would listen to Amy Winehouse's "Back to Black":

We only said goodbye with words
I died a hundred times
You go back to her
And I go back to...

So, over time, I continued to teach and publish. I also continued to work in PR and communications, specializing in social and digital media. I published my book. I started a boutique agency, which took off beyond anything I'd hoped for. I did a short stint in corporate management before leaving to consult again. I continued to teach: at the Schulich School of Business, McMaster, Ryerson - I even went viral for it.

Over those same years I applied very selectively for a couple of academic jobs. I was interviewed for two. One of them went to someone who fit the field specification much more closely. For the other, in a far-away city, I withdrew my application after deciding with my husband that we weren't willing to move for it. I kept teaching part-time and consulting full-time. My kids got older.

And then, near the end of last year, I was offered what I never thought I would have: a full-time, permanent academic job, as a professor in the Bachelor of Public Relations Management program at Centennial College here in Toronto.

The Unicorn Job

I joke that my current job is a "unicorn job": it's exactly right for me, but it wouldn't be right for most people. It requires a PhD, a fair bit of industry experience in PR/marketing/comms, a lot of teaching experience, and a solid record of research and publication. Most academics don't have much industry experience, and most industry folks don't have a doctorate.

The ironic thing for me, of course, is that if I'd taken the classic academic pathway of TT-job-while-ABD, I wouldn't have been qualified for this job. It's only because I left academia that I've been able to come back to it in this capacity: at a college (what Americans would think of as a "community college"), in a degree-level program there.

And what a job it is. Centennial is spectacular. I get to work alongside colleagues like PR czars Barry Waite and Donna Lindell, pedagogical champion Marilyn Herie, and change leader Ann Buller. It's a school that holds a huge Pride picnic and includes the First Nations Two-Spirit community by name. It's a school where I get to work directly with the Global Experience folks to help my students access international work placements.

I get to teach, I get to publish, and I get to present at conferences like CPRS Illuminate 2017. It's pretty much a dream job for someone like me who couldn't decide between academia and industry. I get to do both now.

always coming home

I want to make one thing clear: the academic job that I have now is a great job, but it is not the kind of academic job for which most PhD programs prepare their students. It is not a job at a research university. It is not a tenure-track job; although Ontario college professors have a lot of job security, it isn't a tenure system. It is a job where I teach a relatively heavy course load (4-5 courses per semester) and where my research, although it is highly valued, is done alongside my teaching.

It is not a job for everyone. But it is the perfect job for me, and for my colleagues, all of whom are exceptional teachers and mentors. I love research, but I love teaching more. The job security I have is enough for me. And I truly do have at least part of the summer off.

The point of all of this, though, is that I want to tell other academics who have left academia: leaving doesn't have to be forever.

In Ontario specifically, as colleges offer more university-level degree programs, there are likely to be more and more opportunities for people with PhDs and industry experience to help build programs like mine, which offer the best of both worlds: academic rigour and practical application.

But even if you're not in Ontario, there may be jobs like mine near you: jobs, probably at community colleges, that value both your terminal degree and your practical experience. In jobs like mine, "leaving" academia isn't leaving at all; it's broadening. It's not an either/or; it's a both/and.

One of the most important things I learned in graduate school, in fact, is the value of both/and thinking: holding in tension two things that seem to conflict but actually complement each other.

So I guess I didn't go "back to ac" after all. I'm not a prodigal academic, who left and then returned.

I didn't go home. I've always been home. I just didn't know it until now.

When you think it's the end, but it's not. Photo © Jessica Langer.

Fake News and the Nation as Imagined Community

One of the things I touched on during my portion of the "Champions of Truth" workshop at #CPRS2017, which I put together with Heather Pullen and Terry Flynn, was Benedict Anderson's idea of the nation as imagined community - and how these communities are built, in part, on the vernacular that our officials and representatives use.

I first encountered Anderson a dozen years ago in grad school, and his work has remained a touchstone for me, first during my time as a theorist of literature and media and now as a professor of communications, marketing and public relations. (Because ultimately, these fields are all related: they're about the stories that are told publicly, how and why they're told, and how they are received and re-told.)

The crux of Anderson's argument is this: nation-states are not inherent, naturally-arising structures, but are rather collaborative constructs upon which we all collectively agree, according to a version of a social contract (underpinned by a literal contract, i.e. laws), and maintained by various collective social actions and representative symbols: language, media, pledges, monuments, etc. These things can be, and are, wielded as the weapons of nationalism, but that's another post.

I also touched on Jason Stanley's definition of propaganda in his exceptionally good book How Propaganda Works: "the employment of a political ideal against itself" (viii).  And here's how they fit together:

The "fake news" phenomenon turns the democratic national ideal of a free press against itself, by claiming that the press is free while fomenting the conditions that limit press freedom, in particular, discrediting truthful reporting. This is done in concert with more brutal methods of suppressing press freedom, like denying access to mainstream media outlets, arresting journalists who are fulfilling their job duties, and supporting violence against journalists.

When a leader lies, and when a leader redefines words as their own opposite (for instance, by calling mainstream media outlets "fake news" while asserting that demonstrably false information is in fact true), this becomes part of the vernacular of the nation, by which it is governed and by which the nation is imagined by its citizens. An attack like this on the vernacular of the nation is an attack on the nation itself.

I would argue that this dynamic doesn't just apply to nations, though. The deleterious effect of a dishonest leader who seeks to redefine vernacular is the same in any organization. Think about the downfall of American Apparel, in large part due to the actions of its founder, Dov Charney, who allegedly sexually harassed female employees under the guise of an "unconventional" management style.

If the head of an organization - political, social, business or otherwise - actively creates and fosters a culture in which falsehood is acceptable and truth is unwelcome, the organization will become unstable. The vernacular itself becomes vertiginous.

And this is where literature becomes relevant: because Orwell knew this as well as Anderson and Stanley do. 

References:

Anderson, Benedict. Imagined Communities: Reflections on the Origin and Spread of Nationalism. London: Verso, 1983.

Stanley, Jason. How Propaganda Works. Princeton, NJ: Princeton UP, 2015.

Fake News, Word of Mouth and Social Media

In preparation for the workshop I'm presenting at CPRS 2017 alongside Terry Flynn and Heather Pullen, I've been doing a fair bit of reading lately about the "fake news" phenomenon (which is a new, Trump-issued name for a very old tradition of dis/misinformation). 

I think, though, that there is one central reason why the dissemination of "fake news" in our current media environment is so pernicious and difficult to counter - and it's similar to the reason influencer marketing delivers such strong ROI.

It's because information shared by social media "friends" functions primarily as word-of-mouth, and readers therefore trust it more than they would information that comes directly from a more 'official', authoritative source.

Word-of-mouth marketing has the highest "trust" factor. Let's think about the paid-earned-shared-owned (PESO) model developed by Gini Dietrich:

Image credit: Cision.

Paid influencer content sits in the "paid/shared" buckets but reads as if it's in the "earned" bucket, and most influencers are influencers because of their success in getting their own earned content seen. That's why influencers are often so hesitant to use the #ad or #paid hashtags to indicate where their content is an advertisement. But even when they do, most influencers are careful always to indicate that they only work with brands they personally like and trust; they're just lucky enough to be paid to represent them. (This is generally truthful, if only to keep things on-brand.)

The strength of influencer marketing is that it exists in the liminal space between paid and earned media: it uses the high trust factor of word-of-mouth recommendation to advertise products.

And the strength of fake news is that it exists in a similar liminal space between earned and shared media. 

There is a distinction between media producers who publish fake news in order to achieve a strategic objective, whether that be political, social, economic or otherwise, and social media users who share that fake news because they believe it is true. And there are many reasons for publishing fake news. A group of young Macedonians apparently published reams of it in order to make money from "outrage clicks"; any political objective was secondary to the economic impetus. (This is one of the things, by the way, that distinguishes modern "fake news" from many historical propaganda efforts.)

Fringe political groups with specific political agendas, on the other hand, publish the most outrageous stories they can find - or spin - and seed this content in highly partisan social media groups, so that the groups themselves, rather than the publisher, become the vector for dissemination. Counterculture groups, whatever their position, are particularly susceptible to this dynamic because they are configured in opposition to the mainstream: they see mainstream news outlets as "fake", and so are more likely to trust less mainstream sources. Word-of-mouth therefore becomes the only trusted source of information - whether or not that information is itself deliberate disinformation, which it often is.

This isn't just a fringe-group phenomenon, though. As our filter bubbles shrink, we tend to see only information from sources with which we already agree. This creates a feedback loop in social media: our perceptions of the world are continually confirmed wherever we go, and over time, we become more and more impervious to ideas outside our bubble. Within this bubble, we therefore become more susceptible to fake news that confirms what we already believe, no matter how unbelievable it might otherwise seem. (I would argue that this matters even more now, considering how unprecedented much of our political landscape is.)

Yet another argument, I think, for continuing to seek out information and ideas that are anathema to us - not to change our own minds, but to keep our fake-news-detectors working. 

The Last Lecture at Ryerson University: April 3, 2017

This year, in response to my letter to my Ryerson students that went viral, Ryerson invited me to give the Last Lecture to its graduating student body. The Last Lecture is a tradition that started with Randy Pausch, a professor who discovered that he had terminal cancer and chose to give one final talk, a culmination of all the wisdom and advice he thought most important. Since then, universities and colleges all over the world have instituted their own Last Lecture series for graduating students.

Here's what I said to Ryerson's graduating class of 2017.

***

Over and over again in my life, every so often, I’ve gone back to an article by Oliver Burkeman, published in the Guardian, called “Everyone is just totally winging it, all the time.”

I’ve also gone back, over and over, to a quote from one of my favourite films, The Princess Bride: "Life is pain, Highness. Anyone who says differently is selling something."

At first glance, these might seem like odd things to talk about during this sort of speech. “No one knows what they’re doing… oh, and also, life sucks.” Inspired yet?

But the thing is: it’s so important to leave room in our lives for doubt, for pain, for fear… because that’s the only way we can get to growth, change, understanding, knowledge, and accomplishment.

We have to wing it before we can fly. And we have to understand that life is pain, before we can appreciate that life is also joy.

***

Let’s talk about Burkeman’s article for a few minutes… about “winging it”.

Why does Burkeman think we need to know this?

I think it’s because we seem to be afflicted with a society-wide case of impostor syndrome.

When we look around us, it feels like everyone else has it all together. It seems as though the people around us have their lives all figured out, their goals all set and attainable… and they’re having a much better hair day than we are. 

This is sometimes made worse by social media. We check in on our friends on Instagram or Snapchat or Facebook… we see their vacation photos and new shoes and swanky office views.

But it’s not just about social media. We all present to the world the side of us we want others to see. And so we always see everyone else’s highlight reel… which feels especially lonely when we ourselves are struggling through parts of our lives that we wish could end up on the cutting room floor. (…to be all Creative Industries about it.) 

We all do it. But that doesn’t make it feel any less personal.

***

The thing about the "everyone is winging it" theory, of course, is that it's not quite true. It's not quite the case that no one knows what they're doing.

It’s just that no one starts out knowing what they’re doing.

One of the things I said to Tanya Chen from Buzzfeed (to cite myself; even in a speech, I want to ensure I’m abiding by Ryerson’s academic standards!)… is that no one is born knowing how to be. 

There isn’t some “How To Human” manual that they happened to run out of on the day you were born. You didn’t forget to register in first year for Person Class. (And if you did, it’s not your registrar’s fault.)

No… we all start out clueless, together. And the antidote to that cluelessness isn’t knowledge, really. It’s not wisdom, either, and it’s not diligence… though all three are certainly helpful.

The antidote is to give yourself the space to grow. To understand that it’s not just okay to have a lot to learn, but it’s necessary.

We must be kind and patient with ourselves, because even for those of us who learn quickly, true expertise takes many years to build. And we must be kind and patient with others, because they, like us, need encouragement, not derision.

It’s not enough, though, just to give yourself space to grow. You also need to learn how to grow. And luckily, my dear students, that is exactly the thing you’ve just successfully spent the last four years doing. Thinking critically. Solving problems. Learning how to learn.

Everyone is totally winging it. No one knows what they’re doing. But many people – like yourselves – are learning how to do what they’re doing.

You might be winging it, but you’ll be able to wing it better. And what is winging it, if not learning to fly? 

**

And now that we’re talking about flying, let’s also talk about falling.

I want to talk about the times when we are not patient, when we are not kind. I want to talk about the times when others are neither patient nor kind with us. And about the times when we give ourselves plenty of room to grow, we develop expertise, we work hard and get good marks, we network and build relationships, we do everything right… and we still don’t succeed.

There are lots of those times in your future. Because life is often unfair and unequal, and we live in a society that contains both structural barriers and individual injustices. There will be situations in your life that are so hard that you wonder how you could possibly live through them. There will be pain; because life is pain, and anyone who tells you differently is selling something.

And when these things happen to you, as they will, I want you to remember one simple thing:

You matter. 

Every one of you in this room matters.

That is a fundamental truth of your existence that could never and will never change, whatever else happens.

Knowing that you matter gives you community. You are a necessary, indispensable, inextricable part of a world where your presence brings goodness to those close to you.

That knowledge is the soil in which joy can grow… and I promise, it will.

***

After my letter to my students was published online, I heard from a lot of people. Many of those people were other professors. And almost all of them said the same thing: “I feel the same way about my students.”

Despite how it might feel sometimes – especially during exam week – we care about how you’re doing. We want to see you succeed and we want to help. You matter to us.

One great thing about graduating from Ryerson is that your built-in community doesn’t end when you graduate. As a Ryerson alum, you have a huge community of people who have your back. Your professors, your classmates, your fellow alumni: we are a web of influence of which you are a necessary part. And one day, it will be your turn to help a newly hatched graduate into the big wide world.

***

We’re almost at the end. But I want to say one more thing.

For this speech, I was asked to respond to the prompt: “If this were your last time to address a group of students, what would you say to them?”

But instead of thinking of this as an ending, I’d like to think of it as a beginning. Because this is about you, dear students. This is one of the last times you are going to be a student; and soon, it’s going to be the first time you enter the world as a university graduate…  and a young professional.

It’s time to wing it.

We’re kicking you out of the nest.

But we know you’re ready. You’re fully fledged.

It’s time to fly.

On Pokemon Go and "Consensus Reality"

A Spearow at the entrance to Powhatan State Park in Virginia. Credit: Virginia State Parks on Flickr.

Back when I was a science fiction scholar (okay, I sort of still am... yes, Gerry, I'll have my Cambridge chapter in on time!), I spent a lot of time thinking about different kinds of reality. Diegetic reality - the "reality" within a work of art or literature or media - versus non-diegetic reality, the world in which we all live. The future or alternate realities in science fiction, which Darko Suvin calls "cognitive estrangement" from our own reality, versus the world in which we all live. 

The world in which we all live. It's what Suvin calls "zero world", or what Kathryn Hume calls "consensus reality". I like that second one, consensus reality. It's a term that understands that we all create reality around us as we live: we create systems of government, theories of art, traditions and ways of living and being human. And at the bottom of it all, despite our disagreements, there is some kind of deep consensus among humans: that there is something called "reality" and we are all living in it.

(This is, by the way, one of the reasons why Western culture in particular is so anxious about what we call "delusion": people who believe that the TV is talking to them, or that they are the Chosen One, or somesuch. They haven't bought into the consensus about reality.)

Now, this consensus is a lot more fragile than we often think it is, "we" being those of us raised in generally-Western cultures ruled by principles of rationality; I want to acknowledge here that the concept of reality itself is much more open-ended in many non-Western cultures, though I don't want to belabour it in this post. For more on indigenous scientific literacies, for instance, see Grace Dillon's work or anything written by Nnedi Okorafor.

This consensus is also fragile because, with the increasing popularity of location-based social media as a method not just of communication but as augmentation of our daily lives, what we think of as "reality" is changing rapidly.

Why Do We Want a Pikachu?

Why do Pokemon matter? They're just bits of information on a server. Why do we care if we catch them all, or some, or any? 

This is not a new question, of course. The gaming-inclined among us have been derided for valuing in-game objects since there were in-game objects to value. But why do the proverbial "purpz" (for those of you who didn't play World of Warcraft, those are the hard-to-get purple-coded "epic" items) matter to us? Because there is a consensus around what they mean. They have become semiotic objects, signifiers of status and expertise in-game. They have also become, due only to consensus, economically significant objects in their own right, available for purchase or sale, and "farmed" by teams of bleary-eyed professional gamers in the same parts of the world to which we also outsource production of other things we buy and throw away.

They, like Pokemon, are just bits of information on a server, but they hold real value to those who participate in the consensus that they are valuable.

Pokemon Go is a little bit different, though: it represents, I would say, an advance in consensus-based gaming, a step towards merging "zero world" with the digital one. In World of Warcraft or KOTOR, even if you're playing in a team with others, the game-world is separate from the non-game world. You sit down at a computer and put on your headphones and you're in.

But with Pokemon Go, the game world is layered onto the real world in a Baudrillardian simulacrum of the planet. Pokestops and gyms, where you can get items and battle your Pokemon against others', are stationary and generally map onto places of significance; the Pokemon themselves are mobile. The "AR camera" allows Pokemon to layer themselves on top of whatever part of the world you're in: they're on your street, in your bedroom, bouncing on your dog's nose. You throw a ball and catch them. And they're everywhere. They're even in places like Auschwitz and the Holocaust Museum and a children's hospital.

Pokemon as Invasive Species

There's been a lot of ink spilled about the inappropriateness of Pokemon being in places like these; I wouldn't disagree. But the problem isn't so much that the game developers are insensitive; rather, in the game, locations have different meanings than in zero-world. Pokestops and gyms are the only locations of difference: that is, the only locations that have any material significance within the Pokemon Go layer of reality. This isn't so much insensitivity as what I might call imperfect semiotic mapping. These in-game spaces do often map onto places of significance in zero-world, like churches and schools; it's just that "significance" and "appropriateness" are two very different things. The consensus we have about the kind of significance these places hold in our sociocultural memory - the pillars of what Benedict Anderson called our "imagined communities", to borrow a concept - doesn't map very well onto the kind of significance they hold in Pokemon Go.

Pokemon themselves... well, they might even be called invasive, in a multivalent way: they have invaded both our sense of consensus reality and the spaces that are collectively important to us. Of course, you as an individual always have the choice not to play. But while Pokemon themselves don't interact directly with zero-world reality, the people who play the game do. And if the consensus is that there's a Pokemon in Central Park, then you end up with a crowd gathered around, abandoning their cars, to catch it. If the consensus is that there are rare Pokemon that live at Harbourfront in Toronto, then you end up with crowds of people milling around trying to catch them.

The Impact of Consensus

These consensuses (consenses?), like the consensus that WoW gold is worth real money, have material real-world effects, too. Business Insider reports that because of Pokemon Go, food carts have become permanent fixtures in Central Park, peddling rations to aspiring Ash Ketchums. The game has also been lauded as a way to improve mental health: people experiencing depression and anxiety find the game's incentives - to leave their homes and walk to hatch eggs, and to visit different biomes to catch different Pokemon - genuinely helpful.

Of course, as with most trends, there are naysayers: the Vancouver superintendent who encouraged players to "think about their life choices", for one. There will always be those who choose not to follow a trend, or who think it's silly. But why does that superintendent care in the first place? Because they, indirectly, also participate in the consensus-reality of Pokemon in our midst: they are directly affected by players of the game.

As a marketer, I understand that many kinds of value exist only by consensus. Status symbols, for instance, are only such because there is a social consensus about their significance. As a social/digital communications professional, I understand that a social network's functionality is only valuable inasmuch as it gives that network's users an optimal way to create, share and consume the kind of content that those users - and the companies that would market to them - find valuable. (This includes MMORPGs.) Even when there is little functional value, there is social value, because and only because of a social consensus to find value.

Pokemon Go is perhaps, then, the apotheosis of this concept. It is a layer of reality that has no inherent functional value, maps imperfectly onto our own, and is significant only because we have collectively decided it is so.

Pokemon Go is, in other words, the closest thing we've seen yet to a purely consensus-based reality.

A Few Notes on the State of Social

As 2015 draws to a close – and I’ve been so busy with work that I’ve let the blog lie mostly fallow – I thought I’d scribble down a few unscientific observations about the state of social as we head into the New Year.

Social is moving towards simple, easy and image-rich. There’s a reason Instagram and Snapchat are so popular with so many brands (and why Instagram CPMs are so much higher than Facebook’s or Twitter’s): they are easy, simple and image-rich. The reason? Users don’t want to hunt for content; they want content to come to them and be easily digestible, and they want to be able to hop in and out of it as their time fragments further. Instagram provides a quick hit of prettiness in the one-minute wait for the elevator; Snapchat lets you communicate quickly, easily and effectively with your friends – and with brands – with the click of a camera button rather than laborious typing.

For some examples of brands that are doing simple, pretty marketing really, really well, check out upstart skincare brand Sabbatical Beauty and jewelry maven Simple Studs on Instagram.

Gifs are the new words. Why say it with a phrase when you can say it with a 5-second subtitled snippet of your favourite television show? This is how Tumblr communicates, it’s how people communicate in the Jezebel and xoJane comment sections, and increasingly, it’s how people communicate on Facebook and elsewhere online.

And, like any developing language, some words are more common than others – and have become standard responses in particular situations. (This one, from Parks and Rec, is one of my favourite things on the Internet – it’s become a common response to anyone behaving badly anywhere.)

Brands: please understand what “Netflix and Chill” means. It doesn’t mean what you probably think it means, if you’re planning to use it or any version of it in a campaign. Unless you’re Trojan. Or OKCupid.

In fact, this goes for any trendy language that you see used. Like “bae” or “on fleek” or “throwing shade”. If you need further instructions, please see the @BrandsSayingBae Twitter account for what not to do.

What trends have you noticed?

3 Big Benefits of Continuous Improvement

I’ve been thinking a lot about the concept of continuous improvement. It’s so often pooh-poohed as a meaningless management buzzword or corporatespeak, or lambasted Dilbert-style as a euphemism for… well, no one really knows.

But when it’s done right and built into your business processes, continuous improvement can be one of the best things you can do for your business. Here’s why.

Steve Jobs quote/image via leanblog.org.

1. It’s a morale-booster.

Continuous improvement allows us to forgive ourselves for not being perfect the first time. And in doing so, it opens up a path forward into learning and finding better ways to do things the second time, and the third time, and the fourth. This mindset is excellent for morale, because it demonstrates that the company sees employees as people with the capacity, ability and desire to learn… and gives them the opportunity to do so.

In a workplace where management encourages continuous improvement, employees are supported through their mistakes and onto a path of learning. Providing staff with the opportunity to learn and grow in their work is one of the most important things that companies can do to hire and retain top-performing employees, and continuous improvement can help your company do just that.

2. It takes the anxiety out of mistakes.

In a workplace that sees mistakes as sheer liabilities, employees who make mistakes – and all employees will make them, because human beings are fallible – are anxious. They know that mistakes will hurt their relationship with management, even if the mistake was made in an attempt to do something different and perhaps better, and so they may take fewer positive risks.

This is not to say that the concept of continuous improvement should excuse mistakes. Nor should it be used to dismiss the potentially real damage a mistake can cause. Instead, continuous improvement should drive a shift in conventional thinking: instead of mistakes being seen as failures, they become learning opportunities – opportunities, that is, to improve.

3. It allows feedback to be helpful, not hurtful.

The art of constructive criticism is a delicate one. When you’re unhappy with someone’s work, it’s difficult to tell them what went wrong without bruising their ego – or potentially making them insecure about their work in general. The best managers find the balance between honesty and kindness, because ultimately, polite but firm honesty is the kindest way to manage.

In a workplace driven by continuous improvement, though, the purpose behind the criticism is different. Every mistake or problem is conceptualized as an opportunity to discuss the status quo and tweak it to improve the final product. Processes become learning experiences. Feedback turns from “here’s what you did wrong” into “here’s how we can do it better next time”. It’s an enormous, fundamental cultural shift, from negative to positive, from punishment to possibility.

I’m going to tell a personal story here, because stories are how we learn… and boy, did I learn from this one.

Not too long ago (but long enough ago for me to have learned from it!), I was recommended for a small contract by someone I like and trust, and who likes and trusts me and my work. It was sort of in my wheelhouse, but enough of a stretch that it was a challenge: a big challenge, in fact. It was for a major client.

I prepared for it. I worked hard on it. And, reader, I bombed it. I am used to doing excellent work and to having very, very happy clients and students. I didn’t have a happy client this time.

My first reaction was to worry that I’d permanently ruined my reputation and good name in the industry, and to be anxious that I’d let my colleague down. And although they were disappointed that the contract hadn’t gone better, they told me that the company is committed to continuous improvement, that I would receive very honest feedback about what went wrong, and that I would have another chance to do better on another contract.

That feedback was pretty painful to read. But I’m glad I set my ego aside and read it, and committed to my own continuous improvement, because that feedback allowed me to do two things: to find out about a few weak spots and set about fixing them, and to practice failing. I needed the opportunity to learn how to fail gracefully, and to use what I learned to do better next time.

I’m happy to report that, when the company gave me another chance to complete a similar contract, I did beautifully.

As a former academic, in particular, I have always been uncomfortable with failure; in an academic context, failure in one’s work is seen as tantamount to failing as a person. But in business, the concept of continuous improvement allows me to see myself and my work as a process, and allows me the joy of learning new things and getting better and better at the work I love.

Summer Schoolin’: 2014 Teaching Dates

Now that the Schulich semester is over, I’m focusing on my summer teaching, training and other fun entrepreneurial education projects. Some of the teaching I’m doing is executive education for companies you’ll definitely have heard of, but most of it is bloggable.

First up is the Canadian Institute Women’s Leadership Forum, where I’ll be representing the McMaster MCM program, where I teach. If you’ll be at the forum, come by and say hello!

Next is #socialmediaTO, where I’ll be representing the coolest Toronto education startup of all time, BrainStation, on a social media panel that includes luminaries from Shopify, HootSuite, Twitter and Tangerine. It’s May 21 at 7:30 PM at the Extreme Startups space, and tickets are FREE, so come on out and play with us!

As you know, in mid-June, I’ll be teaching a course on communications and new technologies in the MCM program. At the end of June, I’ll be heading all the way over to Helsinki, Finland to present my research on authenticity and luxury brand communities at the 2014 CCT annual conference. (I’m bringing the baby! By myself! It’s OK to present my poster with a Baby Bjorn strapped to me, right?)

And in July, I am very excited to announce that I’ll be teaching an incredible intensive six-week social media marketing crash course under the BrainStation banner. Details to come.

Speaking and Teaching, Spring 2014

It’s a busy spring around here!

This is the last week of my Marketing Management class at the Schulich School of Business; I’ve loved teaching undergrads for the first time in ages, and I’m going to be sorry to see those students go. They are, almost to a one, fantastic.

Coming up next week, I’ll be heading out to San Francisco and giving a short talk and hosting a discussion on social media and digital marketing for the Pediatric Device Consortium at UCSF. These folks are basically superheroes – their mandate is to develop surgical devices to make surgery on kids easier, less painful and more successful – and it’s really a privilege to be working with them.

In May, I’ll be giving a session on Creativity and Innovation through the Schulich Executive Education Centre for a major financial institution, and will be representing the McMaster-Syracuse MCM program, where I’m on faculty, at the Canadian Institute Women’s Leadership Forum.

In June, I’ll be teaching Communications and New Technologies, a course I’ve developed for the MCM, at this summer’s MCM residency (and online throughout the summer). I’ll also be heading off to Helsinki, Finland for the 2014 Consumer Culture Theory Conference, where I’ll be presenting my current research on authenticity, community and the luxury brand experience.

July’s reasonably quiet at the moment, and then in August, thanks to the generosity of some good friends and their lovely cat, I’ll be heading (at least for a day) to Loncon 3.

3 Things You Can't Do In Business (That You Used To Do in Academia)

Not a lot of preamble here: I’ve been thinking quite a lot about why former academics seem sometimes to struggle in the business world. Here are three of the biggest, most important things that post-academics or alt-academics looking to enter the business world need to understand before they make the leap.

(Since I’m not a scientist, I can’t tell you how much #3 applies to ex-scientists; my guess is that it applies quite differently to quantitative research than to qualitative or theoretical research. So please take that into account when you’re reading this!)

Anyway: don’t do these 3 things.

1. Ignore things (deadlines, emails, etc.) and hope they’ll go away.

I can’t tell you how many academics I know who submit pieces late, whether they’re books or chapters or papers or conference proposals or whatnot. And when I say late, I don’t mean on Monday when they promised it on Friday. I mean months late. Years late, in some cases.

And I have a confession to make: in my academic career, I’ve done this too. It’s the norm. It causes no end of hair-tearing frustration for the people who are on the receiving end of these pieces and proposals, but it’s part of academic culture, I suppose.

I also hear this quite a bit: “I was supposed to email my advisor back, but I can’t face it, so I marathoned Breaking Bad on Netflix.” “I haven’t done anything on my dissertation/book in months; I just can’t get into the right head space.” “I’m just not making progress. I don’t know why. I’m avoiding going to campus.”

This stuff flummoxes me.

In business, if you don’t go to work, you get fired. If you don’t make progress on a project for weeks, let alone months, you get fired. If you turn in an assignment six months late, you get fired. If you don’t create or add value, you get fired.

If you avoid your boss because you can’t face speaking to her, you get a reputation as unreliable. If you avoid a necessary task because you don’t want to do it, you get reprimanded and told to do it. If you cause your colleagues hair-tearing frustration because you just can’t be relied upon to do the things you’re supposed to do… guess what? You’re out.

I’m starting to think that this is the biggest cultural shift that ex-academics need to make in order to get into the business side of things. Deadlines aren’t suggestions; they’re deadlines. Assigned tasks aren’t optional, and they’re on specific timelines. You can’t avoid a client’s calls. You can’t avoid your boss’s emails.

In business, you don’t make excuses. You do your damn work, you do it on time, and you do it well.

2. Revise until something is perfect.

We all know the maxim that the perfect is the enemy of the good. And we all know that the best dissertation is a done dissertation. But the reason we need to keep those things in mind – heck, the reason so many academics have posters of sayings like these up on their office walls – is because often, in academia, there’s so much time spent honing and refining and critiquing an idea that it becomes a shadow of what it was.

There’s something else that’s more destructive, though, that goes along with this endless refinement and quest for perfection (a perfection that, I might add, is impossible): the fact that so much academic discourse is in the business not of building on others’ ideas but of tearing them down. The peer review that excoriates an argument for being flawed, even if there’s something substantive to it; the debate sessions in grad seminars that seem more a game of point-scoring than intellectual inquiry; the smirking associate professor at the back of the conference panel waiting to pounce on any chink in an argument’s armour (and there is always one).

Of course, criticism is necessary to chip art out of roughly hewn stone. But in business, at least in the business environments in which I’ve spent most of my time, there’s much more of a spirit of exploration – which is ironic, because academia is supposed to be about exploration. Failure isn’t personal, and it’s not even bad, really: it’s just the discovery of something that didn’t work, which frees up resources to find something that does work. I’ve never felt more intellectually constrained than in certain academic environments, and I’ve never felt more intellectually exhilarated than in entrepreneurship.

3. Suggest instead of asserting.

Oh, weasel words. You know: “It seems to me that”, “I believe there’s a possibility that”, “It might be the case that”, and all those meek little phrases that peep out ideas without committing to them.

I’ve used those phrases far more times than I care to mention in my academic work. Heck, I still use them in business… when they’re warranted. But while the surface function of weasel words is to bring an idea to the conversation, they’re often used to preserve plausible deniability. The line of thinking is: if all I do is suggest something, then I don’t have to commit to the idea! I can deny I ever meant it; I was just suggesting it, testing it, floating it as a possibility. I didn’t mean it.

Many academics, particularly graduate students and others who are in less secure academic positions, use these words because academic criticism can be vicious, mean and very personal. And by “personal”, I don’t mean that a scathing review of an article is likely to end with a “your mom” joke; I mean that academics tend to identify themselves – and their colleagues – largely by their work, so that a criticism of the work is a criticism of the self.

So why, in business, should you assert rather than suggest? It’s not because you’re more likely to be right. In fact, you’re fairly likely to be wrong. Business ideas change just as much as academic ideas.

It’s because in business, the stakes are different. You’re not trying to be right, necessarily; you’re trying to innovate and bring new ideas to the table that will help the business be more successful. And in business, the stakes aren’t personal; sure, it feels bad to be wrong, but instead of being a personal failing, it’s purely a professional one. And there’s psychological space to try again, in a way that there isn’t in academia.

A wrong idea in academia is denigrated, mocked, pushed aside or consigned to the dreaded dustbin of Outdated Theories. A wrong idea in business is shrugged over, set aside, and quickly forgotten in pursuit of a more successful one. No harm, no foul.

Can you guess which side I prefer?