Thursday, December 12, 2019

A disappointing progress report on diversity and inclusion

strategy+business, December 12, 2019

by Theodore Kinni


Illustration by Boris SV


Racial and ethnic minorities make up 38.8 percent of the population of the U.S. and a nearly equivalent share of its workforce. But minorities represent only 17 percent of full-time university professors and 16.6 percent of newsroom journalists. They are only 4.5 percent of Fortune 500 CEOs and 16 percent of Fortune 500 boardroom directors. They are 9 percent of law firm partners; 16 percent of museum curators, conservators, educators, and leaders; 13 percent of film directors; and 6 percent of the voting members of the Academy of Motion Picture Arts and Sciences.

These discrepancies haven’t gone unnoticed, but they also haven’t been effectively addressed. “During more than three decades of my professional life, diversity has been a national preoccupation,” writes journalist and New York University professor Pamela Newkirk in the second paragraph of the preface to her book Diversity, Inc. “Yet despite decades of handwringing, costly initiatives, and uncomfortable conversations, progress in most elite American institutions has been negligible.”

Newkirk devotes most of Diversity, Inc., which is heavily focused on racial inequality, and particularly on discrimination against African-Americans, to demonstrating this dismaying reality through a sometimes tangled mix of factoids and anecdotes drawn from the arenas of academia, media, and business. The bigger stories that emerge are all variations on the same theme: The lack of progress by minorities in America’s elite institutions is a function of a political and societal arc that has stretched across half a century. Read the rest here.

A Noble Purpose Alone Won’t Transform Your Company

Learned a lot lending an editorial hand here:

MIT Sloan Management Review, December 10, 2019

by Rob Cross, Amy Edmondson, and Wendy Murphy


Consider these two companies: The first is a retail chain with hundreds of locations globally — innovative, but basically a sales platform. The second is a hospital that treats the world’s most devastating cancers. Which do you think has a more engaged workforce?

If you chose the latter, in light of its quest to save lives, you wouldn’t be alone. Yet, when we spent time with both organizations, we discovered that the working environment in the hospital was rife with fear, workforce morale was low, and employee turnover was high. At the retail chain, on the other hand, there was a palpable spirit of camaraderie, employees were energetic and enthusiastic, and customers were very pleased with the service. The retailer had the more engaged workforce by a long shot.

It’s a common misconception, both in businesses and in management articles and books, that a sense of purpose is what matters most when it comes to engaging employees. Many leaders concerned with attracting and retaining top talent believe that nothing motivates people as much as the larger good they might be doing or the chance to change the world. Accordingly, they extol the higher virtues of their companies’ missions and the meaning of the work they offer.

But our work with more than 300 companies over the past 20 years, particularly our research using organizational network analysis (ONA) and our interviews with executives, reveals that purpose is only one contributing factor; the level and quality of interpersonal collaboration actually has the greatest impact on employee engagement. In this article, we’ll explore why collaboration has that effect and which behaviors you can adopt and practice to nurture it. Read the rest here.

Wednesday, November 27, 2019

Becoming your most charismatic self

strategy+business, November 27, 2019

by Theodore Kinni



Photograph by Klaus Vedfelt

Peter Drucker, my favorite managerial touchstone, didn’t think much of leadership charisma. You can almost hear him grinding his teeth as he describes, in his 1992 book, Managing for the Future, being asked to run a seminar on “how one acquires charisma” by a vice president of HR at a big bank.

It’s the prelude to a bit of a rant. “History knows no more charismatic leaders than [the 20th] century’s triad of Stalin, Hitler, and Mao — the misleaders who inflicted as much evil and suffering on humanity as have ever been recorded,” Drucker fumes. “But effective leadership doesn’t depend on charisma. Dwight Eisenhower, [former Secretary of State] George Marshall, and Harry Truman were singularly effective leaders, yet none possessed any more charisma than a dead mackerel.”

Drucker’s antipathy toward charisma is understandable. An Austrian working in Germany, he witnessed the rise of Adolf Hitler, and he was forced to flee to London a few months after Hitler was appointed chancellor in January 1933. But Drucker may have gotten this one wrong: He seems to be conflating the effects of charisma with the ends to which it is applied.

It appears, upon further reflection, that charisma does contribute to leadership effectiveness. “A meta-analysis of data spanning close to a quarter of a century has shown that charismatic leaders not only possess an ability to inspire their troops to ever higher levels of performance, but also simultaneously embed deeper levels of commitment in their psyche,” report academics Stephen Martin and Joseph Marks in their book, Messengers: Who We Listen To, Who We Don’t, and Why.

Sounds promising. But what if a leader indeed possesses no more charisma than a dead mackerel? Can it be cultivated? Read the rest here.

Friday, November 15, 2019

How to build a great experience

strategy+business, November 15, 2019

by Theodore Kinni



Illustration by Paula Daniëlse

In 2017, the Marriott School of Business at Brigham Young University announced that henceforth the Department of Recreational Management would be known as the Department of Experience Design and Management. The idea that immersive and engaging experiences produce value and deliver competitive advantage has come a long way in the 20 years since Joe Pine and Jim Gilmore welcomed us to something they called the “experience economy.”

Designing Experiences is the latest in a long line of books that have appeared on the subject. In it, J. Robert Rossman, a professor at Illinois State University, and Mathew Duerden, an associate professor in the aforementioned department at the Marriott School, touch on many of its predecessors (including one in which I had a hand, Be Our Guest) in a concise textbook that serves as both a theoretical foundation and a how-to guide for experience design.

The theoretical foundation, which appears mostly in the first two chapters, bogs down a bit in explaining what constitutes an experience. This murk stems from Pine and Gilmore’s positioning of experiences as an economic activity distinct from products and services. Rossman and Duerden carry this forward by arguing that experiences differ from products and services because the person on the receiving end of an experience must actively co-create it. “Experience demands conscious attention, engagement, and action — in a word, participation,” they write.

This distinction isn’t clear to me. Is there any product or service we can buy and consume that doesn’t require our participation in some form or other? And even if it were possible not to participate in the acquisition and use of certain products or services (say, buying groceries or cutting the lawn), mightn’t that count as a very good experience for some of us? Read the rest here.

Tuesday, November 5, 2019

Best Business Books 2019: Management

strategy+business, November 5, 2019

by Theodore Kinni



Illustration by Harry Campbell

Marcus Buckingham and Ashley Goodall
Nine Lies About Work: A Freethinking Leader’s Guide to the Real World (Harvard Business Review Press, 2019)

Stephen Martin and Joseph Marks
Messengers: Who We Listen To, Who We Don’t, and Why (PublicAffairs, 2019)

Roger Dooley
Friction: The Untapped Force That Can Be Your Most Powerful Advantage (McGraw-Hill, 2019)

In 1954, the discipline of management was neatly encapsulated by Peter Drucker in the pages of a single book, The Practice of Management. This year’s best business books on management reflect how much the discipline has changed in the past 65 years, and how fuzzy the boundaries separating fields have become.

Nine Lies About Work, by Marcus Buckingham and Ashley Goodall, the year’s best management book, challenges the assumptions that underlie contemporary managerial practices, many of which date back to Drucker’s day. In doing so, the book offers a glimpse of a new management paradigm that may prove to be better suited to the times. Messengers, by Stephen Martin and Joseph Marks, prompts us to see managers as a living, breathing communication medium — and it describes the traits that can ensure the messages they deliver will be heard. And Friction, by Roger Dooley, suggests that if managers turn their attention to simplifying anything customers and employees need to do, they’ll happily do more of it. Read the rest here.

Saturday, November 2, 2019

Lucky You!

strategy+business, November 1, 2019

by Theodore Kinni



Photograph by Elizabeth Fernandez

Recently, on a social media site for professionals, I suggested that luck plays a significant role in leadership and business success. This didn’t sit well with several commenters, who argued that successful people become that way largely by dint of merit — they work hard and use their brains and hone their ability to identify and exploit opportunities. People like Bill Gates and Warren Buffett make their own luck, I was told.

Hogwash. This is not to detract from the monumental business achievements of two of America’s wealthiest (and most philanthropic) men. But we should acknowledge that Gates and Buffett both drew winning tickets in the birth lottery.

Gates’s dad co-founded a law firm and served as president of the Washington State Bar Association; his mom, who came from a family of bankers, held prominent board positions. They sent their son to one of the best prep schools in the nation and then to Harvard, where, with their assent and support, he dropped out to start a computer software company. Mary Gates helped her son’s fledgling company get the IBM contract that led to MS-DOS. Warren Buffett was the son of a U.S. congressman — Howard Buffett represented Nebraska in the House of Representatives for four terms and founded a brokerage firm. Warren’s parents sent him to the Wharton School of the University of Pennsylvania, the University of Nebraska, and Columbia Business School, where he studied with and was mentored by Benjamin Graham, the father of value investing. Buffett’s first job was in his father’s firm; then he went to work for Graham. Nobody gave Gates or Buffett their billions, or even their first tens of millions. But when they pulled themselves up by their bootstraps, the climb wasn’t as far as it would be for most of us.

Once leaders attain positions of power, luck continues to play a powerful role in their success. Take Jack Welch, who was named “manager of the century” by Fortune in 1999. Read the rest here.

Tuesday, October 22, 2019

Past performance is no guarantee of future results

strategy+business, October 18, 2019

by Theodore Kinni



Photograph by aeduard

By 1905, when philosopher George Santayana wrote, “Those who cannot remember the past are condemned to repeat it,” humans had already been gleaning lessons from history for several millennia. Around 800 BC, in the Iliad, Homer used the principal players in the Trojan War to explore leadership strategies and styles. Nearly a thousand years later, at the start of the second century AD, Plutarch compared the character traits of historical leaders in Lives of the Noble Greeks and Romans. And of course, we are still at it today. The business bookshelves are sagging with leadership and strategy lessons drawn from the lives of yesterday’s inventors, tycoons, generals, politicians, and other leading lights.


Sometimes these lessons feel like too much of a stretch — not only because they tend to idealize their subjects, but also because they elevate ad hoc responses into generic rules. How much credence, for instance, should a new CEO put in creating a “team of rivals” à la Abraham Lincoln? Or, to hold my own feet to the fire, how much faith should a leader in a battle for market share put in the “hit ’em where they ain’t” military strategy of Douglas MacArthur?

Ben Laker, professor of leadership at Henley Business School and dean of education at the National Centre for Leadership and Management in the U.K., points to current Prime Minister Boris Johnson, who wrote a book about another U.K. prime minister, The Churchill Factor: How One Man Made History, to illustrate the difficulties of applying lessons from the past. “The Prime Minister knows how Winston Churchill created a sense of connection through a ‘backs against the wall’ mentality in 1941. He is basing his rhetoric, decisions, and actions on Churchill’s example. And many people do feel more connected to him because of it,” Laker says. “But as Johnson’s critics observe, Brexit is not a war and a wartime mentality is at odds with a situation that requires openness and collaboration to reach a feasible outcome.”

Clearly, context is a critical factor in applying history. “You can look at the past and ask yourself whether you would do the same thing in the same situation,” Laker told me in a recent conversation. “But the problem is you are not in the same situation. So, how relevant is history to your present situation?” Read the rest here.

Wednesday, October 2, 2019

Need to Work Differently? Learn Differently

Learned a lot lending an editorial hand here:

Boss Magazine, October 2019

by Michael Griffiths


Digitization requires a new set of skills and a new kind of training for employees


The next time your company holds an all-hands meeting, look around the room — or the arena — and consider this: It’s likely that more than half the people present will need reskilling or upskilling in the next three years.

This probably doesn’t come as a complete surprise to you. The forces of change are transforming every aspect of work, including what is done, who does it, and where it is done.

For example, emerging technologies — especially AI and machine learning — are among the most disruptive of these forces. In fact, 81 percent of respondents to Deloitte’s 2019 Global Human Capital Trends survey indicated they expect the use of AI to increase or increase significantly over the next three years. Unlike some, we don’t believe that AI will eliminate the need for a workforce. Instead, we anticipate the rise of hybrid jobs, which are enabled by digitization and technology, and the emergence of a new kind of job, which we call the superjob. A superjob combines work and responsibilities from multiple traditional jobs, using technology to both augment and broaden the scope of the work performed, and it involves a more complex set of digital, technical, and human skills.

Hybrid jobs and superjobs can enable your company to be more responsive to customers and adaptable to change. But doing so requires a more deliberate and agile approach to capability development. Already, many companies are responding to this need: Our research finds that 83 percent of organizations are increasing their investments in reskilling programs, and more than half (53 percent) increased their learning and development budgets by 6 percent or more in 2018.

But will more learning be enough at your company? It’s doubtful. To Work Differently, we think your company should first Learn Differently. Read the rest here.

Wednesday, August 14, 2019

The Greatest Showman on Earth

strategy+business, August 14, 2019

by Theodore Kinni

Phineas Taylor Barnum’s future was bright. He believed from the age of 4 that his grandfather, pleased to have his grandson as his namesake, had purchased the most valuable farm in Connecticut in Barnum's name. For years, the boy’s grandfather talked about the farm and his neighbors congratulated him on being the richest child in the town of Bethel. At the age of 12, Barnum was taken to see his farm. It was five worthless, inaccessible acres in a large swamp. Everyone had a great laugh.

Robert Wilson, editor of the American Scholar and author of Barnum, sees the roots of the 19th-century American showman’s outsized pecuniary drive in “this strangely cruel and astonishingly drawn-out joke.” But it’s hard to judge whether the story is true — the only citation Wilson offers is Barnum’s autobiography, which should give the reader pause, considering its author’s reputation for humbug and penchant for spinning his own life story.

If Barnum didn’t stretch the story (or invent it outright), it also may reveal the roots of his preternatural talent for hucksterism. Certainly, he elevated the joke to unprecedented heights with a series of frauds so entertaining to American and European audiences of every social class that instead of shunning him, they rewarded him with riches that beggared the promise of the farm that never was. He also provided an early and, sadly, enduring lesson in the use of brazen hype, shameless self-promotion, and fake news as the basis of a successful business. Read the rest here. 

Tuesday, August 13, 2019

Staying Ahead of Disruption with Workforce Sensing

Learned a lot lending an editorial hand here:

Workforce Magazine, August 2019

By Daniel Roddy and Chris Havrilla

Plug the word “disruption” into Google Trends and you’ll get a jagged line tracking 15 years of peaks and plunges in search frequency. But for all the short-term variation in the chart, the long-term trend is steadily rising: there are nearly three times as many “disruption” searches today as there were in 2004.

The steady rise in searches reflects a reality that won’t surprise most leaders. They face a host of disruptions—social, demographic, environmental, economic, technological, and geopolitical. Not only is it their job to make sure that their companies don’t get blindsided by these breakpoints in the status quo, but they also must be able to respond to them quickly and agilely in order to transform these disruptions into competitive advantage.

Sensing is the foundation on which an organization’s ability to identify, pace, and respond to disruption is built. In hindsight, disruptions seem obvious. By the mid-2000s, it was clear that streaming movies would decimate the video rental industry. But to have realized that a decade earlier, when the MP3 format first emerged for audio, and to have acted on it is another matter entirely.

The ability to sense disruptions in their nascent stages and predict how they are likely to affect a company and its stakeholders is crucial to success in business today. This is especially true when it comes to sensing disruptions in the workforce. Read the rest here.

Thursday, August 1, 2019

Work Should Generate Energy, Not Sap It

Learned a lot lending an editorial hand here:

Forbes, August 1, 2019

by Michael Gretczko


GETTY

It’s 5:45 a.m. There is a candle flickering in the room. A bass booms. I pump my legs. Left, right, left, right. My heart starts pounding. I suck in air. Soon, I’m pouring sweat.

Does this sound like a nightmare? It’s just the opposite.

I start most of my days at SoulCycle, a 45-minute, high-intensity spin class. It’s my “secular sanctuary,” as one of its founders describes it. The class grounds and focuses my mind, and resets and recharges my body with the energy I need for the day ahead. It enables me to bring my best self to my work. (I swear I haven't been paid for my comments — I’m just plain addicted.)

When I travel, I invite my colleagues to ride with me. We become a tribe at these classes. By the time we get to our post-workout coffees, we’re connected in a more intimate and intense way, high-fiving and sharing our sense of accomplishment.

As I reflect on what I love about cycling, I realize there are parallels between what it does and what great organizations strive to do. Both seek to maximize our human potential. Both are focused on enabling us to impact the world around us by unlocking our best capabilities and intentions.

There are three lessons from my spin class experience that align with how leaders of high-performing organizations unleash the energy of their workforces. Read the rest here.

Diversity, Inclusion, and the Alternative Workforce

Learned a lot lending an editorial hand here:

Boss Magazine, August 2019

by Kathi Enderes


The alternative workforce, including outsourced teams, contractors, consultants, freelancers, gig workers, and the crowd, is going mainstream. It’s the fastest-growing labor segment in the EU. By next year, the number of self-employed workers in the US is projected to reach 42 million people — nearly tripling in two years. Alternative workers account for over 10 percent of Australia’s labor pool.

Savvy leaders are well aware of the growth in the alternative workforce. In Deloitte’s 2019 Global Human Capital Trends survey, 41 percent of the almost 10,000 executive respondents said alternative workers are “important” or “very important” to their organizations. But only 28 percent said their organizations were “ready” or “very ready” to address the employment of alternative workers. A mere 8 percent said that they have the processes in place to manage and develop these workers. All this represents an opportunity and challenge for leaders everywhere.

A Wellspring of Talent

The opportunity in the alternative workforce is three-fold:

Filling the ‘skills gap’: The growing ranks of alternative workers offer a valuable pool of skills and capabilities in a time when it is becoming increasingly difficult to fill jobs. Last year, a global study by the Manpower Group reported that nearly half (45 percent) of employers studied were having trouble filling open positions; among companies with more than 250 employees, the percentage rose to 67 percent. That’s a major reason why the employment of alternative workers is spreading beyond IT into a host of other roles. Respondents in the 2019 Global Human Capital Trends survey indicated that they are using alternative workers extensively in operations (25 percent of respondents), customer service (17 percent), marketing (15 percent), and innovation/R&D (15 percent).

Positively impacting organizational performance: Alternative workers are often highly talented, experienced, and self-motivated, attracted by the freedom, flexibility, and variety provided by working in arrangements other than traditional employment. Respondents to our trends survey who measure the contribution of outsourced teams, freelancers, gig workers, and the crowd reported that these workers have a positive impact on organizational performance.

Increasing diversity: Alternative workers can be a valuable source of diversity. After all, they may be located anywhere in the world, and often they come from a variety of backgrounds and experiences. They can contribute unique perspectives and ideas. Smart leaders not only consider the traditional dimensions of diversity — race, gender, age, and physical ability — they also tap into the deeper value embedded in the hearts and minds of workers. In a complex, global business environment, bringing different hearts and minds together is more important than ever.

So how can your organization tap into the wellspring of alternative workers? Read the rest here.

Wednesday, July 31, 2019

All the healthcare you can afford

strategy+business, July 31, 2019

by Theodore Kinni


Illustration by adventtr

In 2014, a syllabus and sample lecture for a course entitled Introductory Korean Drama (pdf) surfaced at Princeton University. Written by the eminent healthcare economist Uwe Reinhardt, it began, “After the near‐collapse of the world’s financial system has shown that we economists really do not know how the world works, I am much too embarrassed to teach economics anymore, which I have done for many years. I will teach Modern Korean Drama instead.” It appears that some economics professors aren’t nearly as dismal as their science.

Reinhardt never taught the class, which he said began as an impromptu lecture at a dinner with a group of Korean and Taiwanese health insurance professionals. But his tongue-in-cheek analysis of Korean TV dramas offers a glimpse of his ability to get to the nub of a matter. So does Priced Out, Reinhardt’s final book, published earlier this year, two years after his death in 2017.

In the book, Reinhardt gets to the crux of the ongoing debate over the American healthcare system — in which solutions abound but relief is nowhere in sight — with just one question: “As a matter of national policy, and to the extent that a nation’s health system can make it possible, should the child of a poor American family have the same chance of avoiding preventable illness or of being cured from a given illness as does the child of a rich American family?”

This is the ethical issue hidden behind all the talk of free markets and government control, the political rhetoric about socialism and states’ rights, and the calculations of how much the people of the United States can or can’t afford to pay for healthcare. Clearly, it’s an uncomfortable one. When Reinhardt first posed the question more than 20 years ago, he was dismissed as a “socialist propagandist” for his temerity.

“And so,” he laments, “permanently reluctant ever to debate openly the distributive social ethic that should guide our healthcare system, with many Americans thoroughly confused on the issue, we shall muddle through health reform, as we always have in the past, and as we always shall for decades to come.” 

But muddle through we must, because of two long-term trends: the seemingly inexorable growth in healthcare spending and the increasing inequality in the distribution of income and wealth. These trends, Reinhardt argues, “already are pricing more and more American families in the lower part of the nation’s income distribution out of health insurance and healthcare as families in the upper half of the distribution know it.” In other words: No, currently, the child of a poor American family does not have the same healthcare prospects as the child of a rich American family. Read the rest here.

Thursday, July 25, 2019

Getting full value from external talent

strategy+business, July 25, 2019

by Theodore Kinni



Photograph by Hero Images

Many recent studies of talent include some version of the prescriptive advice in PwC’s Preparing for tomorrow’s workforce, today report: “Harness the potential of flexible talent and innovation.” The wellspring of flexible talent and innovation is the contingent or alternative workforce — these days, that includes the fast-growing ranks of freelancers, independent contractors, gig workers, and the crowds whose collective genius companies can tap to address a variety of challenges.

The problem, as the PwC study found, is that 92 percent of companies are not managing these contingent workers as effectively as they could. Even as companies rely on contingent workers in ever-greater numbers, they often make it difficult — if not impossible — for them to contribute in full measure. Leaders need to do better.

This didn’t matter much 30-something years ago when I became a full-time freelancer. Most industries had little use for contingent workers then, and most workers wanted “real” jobs on the payroll. By 2017, however, 57 million American workers identified themselves as freelancers — that’s 36 percent of the workforce and nearly 50 percent of millennials. And contingent workers are in demand in a host of industries for a host of reasons. These include (but are not limited to): the record low unemployment rate, shortages of talent in emerging capabilities arenas (like AI and robotics), and the growing numbers of business models and workforce strategies that depend on contingent workers.

Yesteryear, managing contingent workers was something of a contradiction in terms. It seemed like a major reason to hire independent contractors was that you didn’t have to bother managing them. If there was a problem, the relationship could easily be terminated with a minimum of cost or conflict. And regardless of how well contingent workers performed, it was the rare manager who thought it might be worth cultivating an ongoing relationship. The operative managerial mind-set was “here today, gone tomorrow.”

That mind-set has been transformed over the last decade, as contingent workers have become more central to more companies’ operations. Read the rest here.

A new view of the fortune at the bottom of the digital pyramid

strategy+business, July 24, 2019

by Theodore Kinni




Photograph by code6d

The benefits of digitization and Internet connections in developing nations — and the opportunities awaiting companies that can provide them — have been much lauded in the past couple of decades. But as Payal Arora, a professor at Erasmus University Rotterdam, clearly demonstrates in her new book, The Next Billion Users, the conventional storyline around the transformative effect of technology on people’s lives often doesn’t ring true.

Arora, who has been studying how the global poor outside the West use computers and the Internet for nearly 20 years, discovered this for herself during her first development project in a rural region of southern India. “The goal,” she explains, “was to infuse this town with new digital technologies to help the poorer members of the community leapfrog their way out of poverty.”

The project team set up computer kiosks and funded cybercafes. It sent computer-equipped vans to remote villages to promote Internet awareness. “We envisioned women seeking health information, farmers checking crop prices, and children teaching themselves English,” Arora writes. The reality was the polar opposite: The kiosks became Pac-Man gaming stations, social networking sites dominated computer usage in the cybercafes, and the free movies used to attract people to the vans became their primary draw.

“Many of the technology development projects I have worked with since have yielded similar results,” Arora writes. “Play dominates work, and leisure overtakes labor, defying the productivity goals set by development organizations.” (Imagine the sniffing among Western do-gooders.)

This is the source of what Arora defines as the third digital divide between the developed and developing worlds. The first digital divide is access to technology. The second divide is the ability to use the technology — to read and write, for instance. And the third divide, which Arora labels “the leisure divide,” is rooted in motivation. “The leisure divide is about understanding what the global poor want from their digital life and why it matters to them,” she writes. “It reminds us that fulfillment is not necessarily a matter of efficiency or economic benefit but can involve a more elusive, personal, and emotive drive.” Read the rest here.

Monday, July 15, 2019

Casting the Dark Web in a New Light

Learned a lot lending an editorial hand here:

MIT Sloan Management Review, July 15, 2019

by Keman Huang, Michael Siegel, Keri Pearlson, and Stuart Madnick


With cyberattacks increasingly threatening businesses, executives need new tools, techniques, and approaches to protect their organizations. Unfortunately, criminal innovation often outpaces their defensive efforts. In April 2019, the AV-Test Institute, a research organization that focuses on IT security, registered more than 350,000 new malware samples per day, and according to Symantec’s 2019 Internet Security Threat Report, cyberattacks targeting supply chain vulnerabilities increased by 78% in 2018.

Wide-scale attacks are becoming more common, too. In October 2016, a distributed denial-of-service (DDoS) attack that hit Dyn, a domain name system (DNS) provider, in turn brought down companies such as PayPal, Twitter, Reddit, Amazon, Netflix, and Spotify. In 2017, the WannaCry and NotPetya ransomware attacks affected health care, education, manufacturing, and other sectors around the world. A report from the Department of Health in the U.K. revealed that WannaCry cost it 92 million pounds. That same year, while the cyber-defense community was working out how to fight ransomware, cryptojacking — the hijacking of other people’s machines to mine cryptocurrency — arose as a threat. Cryptojacking attacks detected by Symantec increased by 8,500% during 2017. During 2018, the value of cryptocurrencies plunged 90%, yet Symantec still blocked four times as many cryptojacking attacks as the previous year.

Attackers always seem to be one or two steps ahead of the defenders. Are they more technically adept, or do they have a magical recipe for innovation that enables them to move more quickly? If, as is commonly believed, hackers operated mainly as isolated individuals, they would need to be incredibly skilled and fast to create hacks at the frequency we’ve seen. However, when we conducted research in dark web markets, surveyed the literature on cyberattacks, and interviewed cybersecurity professionals, we found that the prevalence of the “fringe hacker” is a misconception.

Through this work, we found a useful lens for examining how cybercriminals innovate and operate. The value chain model developed by Harvard Business School’s Michael E. Porter offers a process-based view of business. When applied to cybercrime, it reveals that the dark web — that part of the internet that has been intentionally hidden, is inaccessible through standard web browsers, and facilitates criminal activities — serves as what Porter called a value system. That system includes a comprehensive cyberattack supply chain, which enables hackers and other providers to develop and sell the products and services needed to mount attacks at scale. Understanding how it works gives companies, security service providers, and the defense community at large new, more effective avenues for combating attacks. Read the rest here.

Friday, July 12, 2019

Peter Drucker’s favorite leadership writer

strategy+business, July 12, 2019

by Theodore Kinni



Photograph by FXQuadro

Peter Drucker, the Austrian-American business author and consultant who defined management in the second half of the 20th century, wrote 39 books. Oddly, the word leadership doesn’t appear in any of their titles. In 1954, in his landmark The Practice of Management, Drucker suggested why: “The first systematic book on leadership: the Kyropaidaia of Xenophon — himself no mean leader of men — is still the best book on the subject.”

Kyropaidaia, or Cyropaedia, is the biography of Cyrus the Great, who used military conquest and enlightened governance to create the first Persian Empire around 540 BC. Xenophon the Athenian wrote the bio nearly 200 years later, and it became part of the leadership syllabus for centuries: In his 2001 book, Xenophon’s Prince: Republic and Empire in the Cyropaedia, Christopher Nadon, a professor at Claremont McKenna College (part of a consortium that includes the Drucker School of Management), writes that Alexander the Great and Julius Caesar read the Cyropaedia and that it was a strong influence on Machiavelli’s The Prince. Thomas Jefferson had two copies in his library.

So what do we know about Xenophon? Drucker’s description of him as “no mean leader” might be based on Xenophon’s own memoir. Titled Anabasis, it’s the story of a misbegotten military expedition, the emergence of a reluctant but talented leader, and a strategic, fighting retreat that saved an army of 10,000 mercenaries stranded deep in enemy territory.

Before he became a writer, Xenophon was embedded in this army, known as “the Ten Thousand.” Around 400 BC, Cyrus the Younger, a distant royal relation of Cyrus the Great, recruited the force as part of a military expedition. Cyrus was generous with favors and promises, but he didn’t bother to mention that his true purpose was to depose his brother, Artaxerxes II, who had inherited Persia’s throne.

Cyrus was killed in the first battle against Artaxerxes. The war lost, a group of generals and captains from the Ten Thousand tried to negotiate safe passage home — and they were betrayed by allies and slain. Thus, the Greek mercenaries found themselves leaderless and without provisions. “Separated from Hellas by more than a thousand miles, they had not even a guide to point the way,” reported Xenophon, who wrote Anabasis in the third person. “Impassable rivers lay athwart their homeward route, and hemmed them in. Betrayed even by the Asiatics, at whose side they had marched with Cyrus to the attack, they were left in isolation.” Read the rest here.

Friday, July 5, 2019

Cloud-based HCM systems should come without surprises

Lent an editorial hand on this guide to preparing a reality-based business case for HCM:

Deloitte's Capital H Blog, July 3, 2019

by Marty Marchetti

The business case for cloud-based human capital management (HCM) systems can sound pretty compelling. What CHRO wouldn’t want fast access to the latest advances in HCM technology at a lower overall cost? But my colleagues and I help companies make the move to cloud HCM, and we often get a firsthand view of the mismatch between expectations and reality that was revealed in Deloitte’s 2019 Global Human Capital Trends study.



It is important to have a comprehensive and accurate picture of the total cost of ownership for cloud HR before your company commits to it, during the implementation, and after it is in place.

“No surprises” should describe your move to the cloud, and the following 5 questions can help you reduce them. Read the rest here.

Friday, June 14, 2019

Conversational computing

strategy+business, June 13, 2019

by Theodore Kinni

Steve Jobs could be relentless when he wanted something. In early 2010, he wanted a small startup in San Jose, Calif. CEO Dag Kittlaus and his cofounders had just raised a second round of funding and didn’t want to sell. Jobs called Kittlaus for 37 days straight, until he wrangled and wheedled a deal to buy the two-year-old venture for Apple at a price reportedly between US$150 million and $200 million. The company was Siri Inc.

Wired contributor James Vlahos tells the story of how Siri took up permanent residence in the iPhone in his new book, Talk to Me. It’s the first nontechnical book on voice computing that I’ve seen and a must-read if you have any interest in the topic.

Vlahos spends the first third of Talk to Me describing the platform war currently raging in voice computing. He details the race among the big players, including Amazon, Google, and Apple, to embed AI-driven voices in as many different devices as possible, as they seek to dominate the emerging ecosystem. The fact that Amazon now has more than 10,000 employees working on Alexa provides a good sense of the dimensions of that race.

But voice computing is more than a platform play. It is likely to have ramifications and applications for every company, especially if Vlahos’s contention that “the advent of voice computing is a watershed moment in human history” turns out to be right.

“Voice is becoming the universal remote to reality, a means to control any and every piece of technology,” he writes. “Voice allows us to command an army of digital helpers — administrative assistants, concierges, housekeepers, butlers, advisors, babysitters, librarians, and entertainers.” Voice will disrupt the business models of powerful companies — and create new opportunities for upstarts — in part because it will put AI directly in the control of consumers, Vlahos argues. “And voice introduces the world to relationships long prophesied by science fiction — ones in which personified AIs become our helpers, watchdogs, oracles, and friends.” Read the rest here.

Transformation in energy, utilities and resources

Learned a lot lending an editorial hand here:





PwC, June 13, 2019


The world is at the midpoint of a massive energy-related transformation. By 2040, the global demand for all forms of fuel and power will be four times what it was in 1990. During the same 50 years, the issue of global climate change will have moved from the margins to the centre. Institutions everywhere will be striving to address climate-related problems by dramatically decreasing and mitigating carbon use.

In the energy, utilities and resources (EU&R) industries, the relationship between these two dynamics — the rise in demand and the recognition of carbon use as a climate threat — is already determining basic strategic choices. And it will continue to do so for years to come. This development will profoundly affect a wide range of companies: producers of all forms of energy; disseminators and sellers of electric power, gas and oil; energy-based process industries such as chemicals and steel; and producers of other extracted commodities. Leaders in all those businesses will need the acumen to make and execute decisions that combine growth with environmental sustainability, often in novel ways.

The ability to take this new approach to management, especially for companies that have been successful in the past, is not guaranteed. Thus, transformation — the ability to make fundamental shifts in strategy, operating model and day-to-day activity — is on the agenda for EU&R companies this year, with a stronger sense of urgency than before. Fortunately, because of the rise of digital technology, the growing use of interoperable platforms and an emerging consensus about the value of renewable energy, EU&R companies have more tools and opportunities than ever before for thriving through this disruption. 

The urgency became clear in the results of a number of surveys conducted recently by PwC — including those of chemical company CEOs, oil and gas company CEOs, and power and utilities companies — and it is especially pressing in the utilities sector. For instance, when we surveyed senior executives in Germany’s energy sector in 2018, 77% said that the bulk of their company’s revenues would continue to come from their core businesses over the next five years, yet 57% of them expected those revenues to fall over the same period. Likewise, in chemicals, according to our 22nd Annual Global CEO Survey trends series, the next decade is likely to see the sector come under increasing pressure on a range of sustainability measures. In short, although the demand for EU&R’s elemental commodities will grow and its essentially extractive, capital-intensive nature will not change, business as usual will not be a viable alternative for many companies. Read the rest here.

Tuesday, June 11, 2019

Using AI to Enhance Business Operations

Learned a lot lending an editorial hand here:

MIT Sloan Management Review, June 11, 2019

by Monideepa Tarafdar, Cynthia M. Beath, and Jeanne W. Ross



Artificial intelligence invariably conjures up visions of self-driving vehicles, obliging personal assistants, and intelligent robots. But AI’s effect on how companies operate is no less transformational than its impact on such products.

Enterprise cognitive computing — the use of AI to enhance business operations — involves embedding algorithms into applications that support organizational processes. ECC applications can automate repetitive, formulaic tasks and, in doing so, deliver orders-of-magnitude improvements in the speed of information analysis and in the reliability and accuracy of outputs. For example, ECC call center applications can answer customer calls within 5 seconds on a 24-7-365 basis, accurately address their issues on the first call 90% of the time, and transfer complex issues to employees, with less than half of the customers knowing that they are interacting with a machine. The power of ECC applications stems from their ability to reduce search time and process more data to inform decisions. That’s how they enhance productivity and free employees to perform higher-level work — specifically, work that requires human adaptability and creativity. Ultimately, ECC applications can enhance operational excellence, customer satisfaction, and employee experience.

ECC applications come in many flavors. For instance, in addition to call center applications, they include banking applications for processing loan requests and identifying potential fraud, legal applications for identifying relevant case precedents, investment applications for developing buy/sell predictions and recommendations, manufacturing applications for scheduling equipment maintenance, and pharmaceutical R&D applications for predicting the success of drugs under development.

Not surprisingly, most business and technology leaders are optimistic about ECC’s value-creating potential. In a 2017 survey of 3,000 senior executives across industries, company sizes, and countries, 63% said that ECC applications would have a large effect on their organization’s offerings within five years. However, the actual rate of adoption is low, and benefits have proved elusive for most organizations. In 2017, when we conducted our own survey of senior executives at 106 companies, half of the respondents reported that their company had no ECC applications in place. Moreover, only half of the respondents whose companies had applications believed they had produced measurable business outcomes. Other studies report similar results.

This suggests that generating value from ECC applications is not easy — and that reality has caught many business leaders off guard. Indeed, we found that some of the excitement around ECC resulted from unrealistic expectations about the powers of “intelligent machines.” In addition, we observed that many companies that hoped to benefit from ECC but failed to do so had not developed the necessary organizational capabilities. To help address that problem, we undertook a program of research aimed at identifying the foundations of ECC competence. We found five capabilities and four practices that companies need to splice the ECC gene into their organization’s DNA. Read the rest here.

Sunday, June 2, 2019

Managerial hubris brought down MacArthur

strategy+business, May 29, 2019

by Theodore Kinni



Photograph by Pictorial Press Ltd / Alamy

I find hubris to be a fascinating cognitive flaw. Perhaps the spectacle of arrogance leading to a fall from grace provides a socially acceptable outlet for my predilection for schadenfreude — another obnoxious personality glitch. But my flaws don’t matter all that much. I’m not a leader.

For leaders, the consequences of cognitive flaws like hubris are magnified. And nowhere is the danger of managerial hubris more evident than in the career of General Douglas MacArthur, whose life and career I studied for my book No Substitute for Victory: Lessons in Strategy and Leadership from General Douglas MacArthur. In June 1950, when President Harry Truman appointed him to head the United Nations Command at the start of the Korean War, MacArthur was already a prime candidate for hubris. He had served as commander of the U.S. Army Forces in the Pacific in WWII and was still, at age 70, serving as the de facto leader of postwar Japan and its more than 80 million citizens. He was, as biographer William Manchester put it, an “American Caesar.” It is unlikely that MacArthur would have objected to the characterization, had he been alive to hear it.

If MacArthur had an elevated sense of ego and invincibility by 1950, his initial success in prosecuting the Korean War surely reinforced the feeling. As the UN forces fought to hang on at Pusan, their last foothold on the Korean Peninsula, MacArthur mounted an audacious, large-scale amphibious attack well behind enemy lines at the port city of Inchon. The plan was risky, if not foolhardy: Inchon’s 30-foot tides are so extreme that the window for making the assault was limited to two days in September. Moreover, if the landing forces had been unable to take the port, they would have been trapped.

As it turned out, the Inchon invasion was a complete success. The North Korean Army reeled in surprise, and a day later, the UN forces at Pusan broke out. Within two weeks, the invaders had been expelled from South Korea and the UN forces crossed the 38th Parallel, heading north to the Chinese border. The stage was set for one of the 20th century’s most dramatic exhibitions of hubris. Read the rest here.

Friday, May 24, 2019

Are meaning & purpose missing for your workforce?

Learned a lot lending an editorial hand here:

Capital H Blog, May 24, 2019

by Matthew Deruntz and Christina Rasieleski


Organizations are increasingly offering lavish perks to attract and retain talent, and then tracking their success with annual engagement surveys. But what if they’re missing the point?

Despite a laser-like organizational focus on what is traditionally called employee engagement, most people remain less than satisfied with their jobs. Deloitte’s 2019 Global Human Capital Trends survey points to what may be really missing. Many workers lack autonomy and access to the tools and information they need; moreover, they aren’t satisfied with the design of their jobs or the day-to-day flow of work. In fact, most survey respondents rated their organizations only “somewhat effective” or “not effective” on a number of factors related to experience: positive work environment, meaningful work, growth opportunities, trust in leadership, and supportive management. These aren’t issues that organizations can address with free doggie daycare or on-site CrossFit. Instead, they need to reevaluate the fundamental human needs of their workforce.

For better or worse, work holds such a dominant place in many people’s lives that when it fails to meet their innate need for meaning and purpose, their entire lives can become less satisfying and fulfilling. To address this issue and recognize that everyone who contributes to the organization—whether as a full-time employee, contractor, or gig worker—is an individual with intrinsic human needs, organizations need to pivot from thinking about an “employee experience” to thinking about a “human experience” for their workforce. Read the rest here.

Tuesday, April 30, 2019

Bad meetings no more

strategy+business, April 30, 2019

by Theodore Kinni

Aldous Huxley had it wrong. Bad meetings — not mescaline — open the doors of perception. They lull me into a trance. I occasionally surface (did I snore?), murmur agreement (to who knows what), surreptitiously check my phone, and nod off again. If the deep breathing I often hear on conference calls is any clue, I’m not the only one who achieves spiritual transcendence in bad meetings.

The authors of how-to books about meetings never consider the salutary effects of bad ones. Instead, they typically start with an adrenaline-like shot of statistics. Steven Rogelberg, Chancellor’s Professor at the University of North Carolina at Charlotte, and author of The Surprising Science of Meetings, is no exception to the rule. He offers the usual litany of dismay. There are about 55 million meetings per day in the U.S. alone, and they cost US$1.4 trillion annually, not counting indirect costs such as employee frustration. “Too many meetings” is cited as the top time-waster by 47 percent of U.S. workers.

Nevertheless, Rogelberg doesn’t think that companies should eliminate meetings. “Was the great management guru Peter Drucker correct when he said, ‘Meetings are a symptom of bad organization. The fewer meetings, the better’?” Rogelberg asks. “The answer is an emphatic ‘no.’ Abolishing meetings is a false solution. Schedules with too few meetings are associated with substantial risks for employees, leaders, teams, and organizations.” Instead, the author advises breaking the cycle of bad meetings with the application of meeting science.

If anybody has a claim on the role of meeting scientist, it’s Rogelberg. He has been researching meetings using field surveys, laboratory studies, and experiments incorporating planted accomplices for 15 years. In this book, he weaves his findings and the research of others into an evidence-based approach to meetings that is sometimes eye-opening. Read the rest here.

Tuesday, April 23, 2019

Does your rewards strategy identify and address employee stressors?

Learned a lot lending an editorial hand here:

Inside HR, April 23, 2019

by Pete DeBellis




What is the basis for your company’s rewards offerings? For too many companies, it is purely benchmarks – that is, they make rewards decisions based on the rewards being offered by other companies with which they believe they compete for talent. The problem: companies that follow this approach are left guessing about the desires and stressors of their actual workforce rather than knowing definitively what those workers want or need. In fact, based on Deloitte’s 2019 Global Human Capital Trends report, nearly one-quarter (23 percent) of organisations do not feel they know what rewards their employees value.

There’s nothing wrong with benchmarking per se: You should know what rewards your competitors are offering their employees. But that’s only one piece of the rewards puzzle. To optimise a rewards offering, you need to know a lot more about your rewards customers, that is, your company’s employees. Our research at Bersin finds that companies with mature, high-performing rewards functions achieve this by adopting some version of the following 4-step process, which uses the same kinds of surveys that marketers use to understand customers. Read the rest here.

Thursday, April 18, 2019

In praise of the purposeless company

strategy+business, April 18, 2019

by Theodore Kinni



Photograph by Avalon_Studio


These days, my vote for the most misunderstood and misused management concept goes to “corporate purpose.” Back in 1973, the concept was crystal clear to Peter Drucker, who declared with admirable concision in Management: Tasks, Responsibilities, Practices: “There is only one valid definition of business purpose: to create a customer.” Since then, however, the definition of corporate purpose has mutated into pretty much any reason for being in business that isn’t explicitly connected to making money.

Business professors Sumantra Ghoshal and Christopher A. Bartlett unbottled this genie in a 1994 article in Harvard Business Review, in which they argued that strategy (“an amoral plan for exploiting commercial opportunity”) wasn’t enough: “A company today is more than just a business. As important repositories of resources and knowledge, companies shoulder a huge responsibility for generating wealth by continuously improving their productivity and competitiveness. Furthermore, their responsibility for defining, creating, and distributing value makes corporations one of society’s principal agents of social change. At the micro level, companies are important forums for social interaction and personal fulfillment.”

Why was a highfalutin corporate purpose seen as such a big deal? Ghoshal, who passed away in 2004, and Bartlett, who is now professor emeritus of business administration at Harvard Business School, concluded that companies had to transform themselves from economic entities to social institutions. They added that the “definition and articulation [of purpose] must be top management’s first responsibility.” Read the rest here.

Thursday, April 11, 2019

We Lead People, Not Cardboard Cutouts

Learned a lot lending an editorial hand here:

Forbes, April 11, 2019

by Michael Gretczko


GETTY

My wife and I just took our 5-year-old fraternal twins on a skiing vacation. Our daughter is caution incarnate. She likes to ski in a familial caravan — one parent ahead and one behind — and she wants constant feedback about her performance. Our son likes to get a rough idea of the conditions — icy here, snowboarders there — and push off. He doesn’t mind falling and doesn’t particularly care what we think of his performance. It’s astounding how different twins can be.

I’m constantly amazed at how my children can uncover insights that allow me to see my role as a leader in a new light. I’m always seeking new ways to create engaged, high-performing teams, and typically, that devolves to some type of employee segmentation, by generation, job description, or personality. We’re told that millennials often prefer to work this way, that programmers want to work that way, and that those with Driver and Pioneer Business Chemistry styles want to work yet another way. But if my twins respond best to radically different conditions and parenting styles, can any type of segmentation be granular enough to respond to the individual needs of employees?

I suspect that it can’t. To engage with people on a truly human level — that is, to get beyond the employees-as-interchangeable-assets mindset — we need to be far more responsive to employees as individuals. Read the rest here.

Large businesses don’t have to be lousy innovators

strategy+business, April 11, 2019

by Theodore Kinni



Photograph by Kanchisa Thitisukthanapong

Gary Pisano, author of Creative Construction: The DNA of Sustained Innovation, doesn’t buy the idea that large enterprises are inherently lousy innovators. Back in 2006, Pisano, the Harry E. Figgie Professor of Business Administration at Harvard Business School, traced the origin of every drug approved by the FDA over a 20-year period to either one of the world’s 20 largest pharmaceutical companies or one of the 250 smaller, supposedly more innovative biotechs. When he compared the two groups, he discovered a “statistical dead heat” — R&D productivity was no better in the smaller biotechs than in big pharma.

Pisano also points to anecdotal evidence to support his opposition to the conventional wisdom about innovation in large enterprises. For every big, established company that failed at transformational innovation (think Blockbuster, Kodak, and Polaroid), he points to another that has succeeded. In 1964, when IBM announced its revolutionary 360 mainframe computers, it was already the largest computer company in the world and ranked 18th on the Fortune 500. In 1982, when Monsanto scientists invented the foundational technology for GMOs (genetically modified organisms), the company was 81 years old and number 50 on the Fortune 500. And in 2007, when Apple launched the iPhone, it had sales of US$24 billion and already stood at 123rd on the Fortune 500.

Pisano says that the difference between a Blockbuster and an IBM is the ability of leaders to sustain and rejuvenate the innovation capacity of their companies. It’s an ability he calls “creative construction,” and he writes that it “requires a delicate balance of exploiting existing resources and capabilities without becoming imprisoned by them.”

Walking that tightrope is a challenge for large companies. It’s tough to move the needle with innovation when the needle’s scale is measured in billions of dollars. “For J&J [Johnson & Johnson] to maintain its historical rate of top-line growth,” reports Pisano, “it must generate about $3 billion–$4 billion of new revenue per year.” The complexity of managing innovation in large organizations can also be daunting. “When you get to be the scale of a J&J, you have a lot of moving parts,” he explains. “You now have a system with serious frictions. Friction impedes mobility. Lack of mobility means lack of innovation.”

But large companies also have some advantages that can give them a leg up in innovation. “Larger enterprises like J&J have massive financial resources to explore new opportunities,” says Pisano. They can hedge their bets, tap deep reservoirs of talent, navigate regulatory agencies, and use their huge distribution networks and strong brands to roll out new products to millions of existing customers. Read the rest here.

Wednesday, March 20, 2019

Leadership lessons from a guitar hero

strategy+business, March 19, 2019

by Theodore Kinni

About an hour into Still on the Run, a documentary exploring the career of rock guitar god Jeff Beck, Eric Clapton, another deity in the pantheon, pops up on screen and says, “I don’t even know how he’s doing it half the time.” Clapton is talking about Beck’s ability to give voice to the guitar. But his comment set me to wondering how Beck developed his unique style and what, if any, lessons his nearly 60-year career might offer those who aspire to reach the top of the business world.

With CEO tenure in large companies running five years or so, the fact that Beck has been numbered among the world’s best guitarists for about 10 times that long is worth exploring. Surely, innate talent and endless hours of practice count for a lot. But loads of guitarists have both, and haven’t had careers that lasted as long as the average CEO’s. Beck, however, has been an inveterate seeker of innovation in both technology and technique. And this habit has enabled the 74-year-old London-born musician to continuously expand his capabilities and transform his sound.

Jeff Beck performs at the Bataclan in Paris in 1973.

Photograph by Philippe Gras / Alamy


As biographer Martin Power tells it in Hot Wired Guitar: The Life of Jeff Beck (Omnibus Press, 2014), Beck’s parents valued musicianship, but not the electric guitar, which in the 1950s was associated with rockabilly and other disreputable musical genres. When his parents refused even to spring for new strings for a borrowed guitar, the teenager began building his own crude instruments. Unable to tune his early efforts, Beck learned to bend the strings to pitch while playing, a work-around that became a signature. The wannabe lead guitarist stole the pickup needed for his first electric guitar and built its amp in his school’s science department. Since then, Beck has continually explored and adopted technological advances in guitar effects and electronics — such as tape-delay units, fuzz boxes, and guitar synthesizers — to shape and extend his playing.

Beck’s eagerness to learn and incorporate techniques from far-flung places is another hallmark of his career. Like Clapton, he learned from American blues giants — and rode the wave of cultural appropriation that gave rise to rock and roll. According to Power’s biography, Beck says that the first time he heard Jimi Hendrix play, he thought, “Oh, Christ, all right, I’ll become a postman.” Then he followed Hendrix around to learn how he created his sound. Other inspirations include a women’s choir that recorded Bulgarian folk songs, operatic tenor Luciano Pavarotti, and electronic dance music. Beck describes his resulting style as “a form of insanity…. A bit of everything, really. Rockabilly licks, Jimi Hendrix, Ravi Shankar, all the people I’ve loved to listen to over the years. Cliff [Gallup], Les [Paul], Eastern and Arabic music, it’s all in there.” Read the rest here.

Saturday, March 2, 2019

Finding your company’s cultural sweet spot

strategy+business, March 1, 2019

by Theodore Kinni

“Culture eats strategy for breakfast” is a pretty tasty bon mot, though it’s doubtful that Peter Drucker, who usually gets credit for it, actually cooked it up. It can also cause severe indigestion when leaders ignore it. If you’re tempted to join them, read Rule Makers, Rule Breakers, by University of Maryland, College Park, psychology professor Michele Gelfand.

Gelfand has been studying culture and social norms for more than 20 years. Ever wonder why conjuring up a “burning platform” galvanizes workers instead of sending them scurrying for the exit doors? It’s the same reason that the heartbeats of firewalkers in the Spanish village of San Pedro Manrique, as well as those of the people watching them, become synchronized. It’s also why test subjects who eat fiery chili peppers together report a higher sense of bonding and work better together in subsequent economic games. “Social norms, like participating in rituals, can increase group cohesion and cooperation,” says Gelfand.

Social norms are the glue that holds groups together. “They give us our identity, and help us coordinate in unprecedented ways,” Gelfand writes. “Yet cultures vary in the strength of their social glue, with profound consequences for our worldviews, our environments, and our brains.” And for companies, too. Read the rest here.