By Philip Kotler, Bobby J. Calder, Edward C. Malthouse and Peter J. Korsten
The ideal role of marketing was articulated 60 years ago. How close to the ideal have we come by now?
THE GROWING NUMBER of chief marketing executives reflects the increasing importance companies attach to marketing. Yet the average tenure of a chief marketing officer (CMO) is three and a half years, well below that of the typical CEO. Both the prevalence of the CMO position and its precariousness give rise to the question: Has marketing realized the vision to which its adherents have long aspired? A recent global survey of CMOs reveals both how far marketing has come and where there is room to grow.
The Vision for Marketing
For more than 60 years, marketers have had a clear vision of the ideal role of marketing, which consists of two core ideas. One is the concept of the “marketing mix,” which dates to the late 1940s. Harvard’s Neil Borden, while president of the American Marketing Association, realized there was no set formula for successful marketing. Instead, the marketer must choose the best mix from the set of all possible mixes. Jerome McCarthy later codified the mix in the classic 4Ps of marketing — product, price, place and promotion. The task of the marketing executive is to have control of, or at least influence on, all of the 4Ps and blend them to produce the best value.
The second fundamental idea is that marketing decisions should be based on a solid understanding, supported by hard data, of target customers and other stakeholders. Anchoring decisions in data has become part of the bedrock vision of marketing. These two core components — control of the marketing mix and customer-oriented, data-based decision making — form the field’s shared vision of marketing. It has been over a half century since that vision, now clearly spelled out in marketing textbooks, took shape. So what is the status of the field relative to the vision?
HOW CHIEF MARKETING OFFICERS RATE THEIR INFLUENCE >>> Continue reading
By Randall S. Wright
Too many executives confuse what an innovation is with what an innovation would do for them if they had one. The solution? Think of innovation as an if-then argument.
ATTEND ALMOST ANY conference on innovation, and one will hear someone in the audience ask, “Yes, but how are you defining ‘innovation’?” Why is there no clear, shared meaning of “innovation”? I believe it is because most executives confuse what an innovation actually is with what an innovation would do for them if they had one. For example, most companies think of an “innovation” as something that wins a sale with a better solution, increases revenue or takes market share from a competitor. But those aren’t definitions of innovation. They’re outcomes executives would like to get from innovation.
The problem is a serious one, not least because companies send engineers, “technology entrepreneurs” and “technology scouts” in search of innovations when a shared understanding of what they are looking for may not exist across the organization’s people and functions or between “scouts” and managers. More significantly, to “innovate” means to “regenerate” — and most companies decline or fail because they fail to regenerate.
I propose that all true innovations are arguments. By this I mean that all innovations are composed of three elements: a proposition and a conclusion linked by an inference. I further propose that this is not merely a convenient or workable definition that covers most instances of innovation. Far from it: Stating that innovations are arguments is not just stating a definition — it is an identity, an equality. Innovation = Argument.
Let me explain. When the late Steve Jobs went to Xerox’s Palo Alto Research Center in December 1979 to kick around the lab to see what was up, he made an argument — an innovation. He stumbled on a proposition — the graphical user interface — and inferred that this interface would be the way that everyone would experience computing. Jobs later told Rolling Stone, “Within 10 minutes, it was obvious that every computer would work this way someday. You knew it with every bone in your body.” Jobs was an innovator because he could make inferences between technology propositions and conclusions about human experience. Continue reading
An annual event called “Silicon Valley Comes to Oxford,” which took place earlier this month, featured a debate at the Oxford Union on this motion:
“This house believes that the average worker is being left behind by advances in technology.”
The “Silicon Valley” in the event’s name was geographically loose, for helping make the argument were MIT Sloan’s Erik Brynjolfsson, director of the MIT Center for Digital Business, and Andrew McAfee, principal research scientist at the center.
It was logical that the two were invited: their new book, Race Against the Machine (Digital Frontier Press, 2011), is on exactly that theme. (Here’s our blog post about the book.)
McAfee’s opening statement, which he posted at his blog, includes this challenge for how we might rethink the meaning of “corporate responsibility”:
It’s also time to change our minds and broaden our definition of ‘social entrepreneurship.’ When we hear that term at present, we think of sustainability, or clean or green tech, or improving the lots and lives of people in the developing world. All of these are worthwhile and wonderful things to do. Here’s another one: create jobs for average workers. Because there aren’t enough of them right now. The greatest scarcity in our economies now is a scarcity not of resources or even of good new ideas, but of opportunity — of chances to let people realize the American Dream, and the English Dream, the Indian and Chinese and Mexican dream. Continue reading
New York Times reporter Matt Richtel says that too much digital consumption and a steady stream of distraction appear to change the brain and affect our ability to be creative.
The Neurological and Creative Toll of Digital Overload
If you’re like a lot of people, during your work day you might check 40 websites. You could be switching between programs such as Word and Excel and your email application 36 times an hour. You probably stop what you’re doing — or at least pause — when a text message buzzes or an email comes in or your cell phone rings.
Matt Richtel, technology reporter for the New York Times, says in an interview on the NPR program Fresh Air that for all the productivity upsides to digital consumption, there are huge downsides, too, including changes in the brain that seem to affect not just the ability to engage in conversation but the ability to be creative, too.
“Twenty years of glorifying all technology as if all computers were good and all use of it was good, I think science is beginning to embrace the idea that some technology is Twinkies and some technology is Brussels sprouts,” Richtel says. “If we consume too much technology, just like if we consume too much food, it can have ill effects. And that is the moment in time we find ourselves in . . . with the way we are digesting, if you will, technology all over the place.”
Richtel notices, he says, that he’s “not quite as engaged in my world when I’m constantly using devices as I am when I’m away from them.” Away from them, “I can give myself over to conversations a little bit differently.”
Awarded a Pulitzer Prize this year for his Times series “Driven to Distraction,” about the dangers of driving while multitasking with cell phones and other devices, Richtel says that the digital glut appears not just to increase distraction but to decrease creativity. Continue reading