Wednesday, August 14, 2019

The Greatest Showman on Earth

strategy+business, August 14, 2019

by Theodore Kinni

Phineas Taylor Barnum’s future was bright. He believed from the age of 4 that his grandfather, pleased to have his grandson as his namesake, had purchased the most valuable farm in Connecticut in Barnum's name. For years, the boy’s grandfather talked about the farm and his neighbors congratulated him on being the richest child in the town of Bethel. At the age of 12, Barnum was taken to see his farm. It was five worthless, inaccessible acres in a large swamp. Everyone had a great laugh.

Robert Wilson, editor of the American Scholar and author of Barnum, sees the roots of the 19th-century American showman’s outsized pecuniary drive in “this strangely cruel and astonishingly drawn-out joke.” But it’s hard to judge whether the story is true — the only citation Wilson offers is Barnum’s autobiography, which should give the reader pause, considering its author’s reputation for humbug and penchant for spinning his own life story.

If Barnum didn’t stretch the story (or invent it outright), it also may reveal the roots of his preternatural talent for hucksterism. Certainly, he elevated the joke to unprecedented heights with a series of frauds so entertaining to American and European audiences of every social class that instead of shunning him, they rewarded him with riches that beggared the promise of the farm that never was. He also provided an early and, sadly, enduring lesson in the use of brazen hype, shameless self-promotion, and fake news as the basis of a successful business. Read the rest here. 

Tuesday, August 13, 2019

Staying Ahead of Disruption with Workforce Sensing

Learned a lot lending an editorial hand here:

Workforce Magazine, August, 2019

By Daniel Roddy and Chris Havrilla

Plug the word “disruption” into Google Trends and you’ll get a jagged line tracking 15 years of peaks and plunges in search frequency. But for all the short-term variation in the chart, the long-term trend is steadily rising: there are nearly three times as many “disruption” searches today as there were in 2004. 
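The long-term trend beneath that short-term jitter is what you get by smoothing the series. A minimal, purely illustrative sketch in Python (the interest scores below are invented, not real Google Trends data):

```python
# Illustrative sketch: separating a long-term trend from short-term noise
# in a Google Trends-style series. The numbers are made up for demonstration;
# real data would come from trends.google.com.

def rolling_mean(series, window):
    """Trailing moving average; returns one value per full window."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

# Hypothetical monthly search-interest scores (0-100): jagged, but rising
interest = [20, 35, 18, 40, 25, 45, 30, 55, 38, 60, 42, 65]

smoothed = rolling_mean(interest, window=4)
print(smoothed[0], smoothed[-1])  # prints 28.25 51.25
```

A wider window flattens more of the short-term variation; Google Trends itself reports scores normalized to a 0–100 scale.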

The steady rise in searches reflects a reality that won’t surprise most leaders. They face a host of disruptions—social, demographic, environmental, economic, technological, and geopolitical. Not only is it their job to make sure that their companies don’t get blindsided by these breakpoints in the status quo, but they also must be able to respond to them with speed and agility in order to transform these disruptions into competitive advantage.

Sensing is the foundation on which an organization’s ability to identify, pace, and respond to disruption is built. In hindsight, disruptions seem obvious. By the mid-2000s, it was clear that streaming movies would decimate the video rental industry. But to have realized that a decade earlier, when the MP3 format first emerged for audio, and acted upon it is another matter entirely.

The ability to sense disruptions in their nascent stages and predict how they are likely to affect a company and its stakeholders is crucial to success in business today. This is especially true when it comes to sensing disruptions in the workforce. Read the rest here.

Thursday, August 1, 2019

Work Should Generate Energy, Not Sap It

Learned a lot lending an editorial hand here:

Forbes, August 1, 2019

by Michael Gretczko


It’s 5:45 a.m. There is a candle flickering in the room. A bass booms. I pump my legs. Left, right, left, right. My heart starts pounding. I suck in air. Soon, I’m pouring sweat.

Does this sound like a nightmare? It’s just the opposite.

I start most of my days at SoulCycle, a 45-minute, high-intensity spin class. It’s my “secular sanctuary,” as one of its founders describes it. The class grounds and focuses my mind, and resets and recharges my body with the energy I need for the day ahead. It enables me to bring my best self to my work. (I swear I haven't been paid for my comments — I’m just plain addicted.)

When I travel, I invite my colleagues to ride with me. We become a tribe at these classes. By the time we get to our post-workout coffees, we’re connected in a more intimate and intense way, high-fiving and sharing our sense of accomplishment.

As I reflect on what I love about cycling, I realize there are parallels between what it does and what great organizations strive to do. Both seek to maximize our human potential. Both are focused on enabling us to impact the world around us by unlocking our best capabilities and intentions.

There are three lessons from my spin class experience that align with how leaders of high-performing organizations unleash the energy of their workforces. Read the rest here.

Diversity, Inclusion, and the Alternative Workforce

Learned a lot lending an editorial hand here:

Boss Magazine, August 2019

by Kathi Enderes

The alternative workforce, including outsourced teams, contractors, consultants, freelancers, gig workers, and the crowd, is going mainstream. It’s the fastest-growing labor segment in the EU. By next year, the number of self-employed workers in the US is projected to reach 42 million people — nearly tripling in two years. Alternative workers account for over 10 percent of Australia’s labor pool.

Savvy leaders are well aware of the growth in the alternative workforce. In Deloitte’s 2019 Global Human Capital Trends survey, 41 percent of the almost 10,000 executive respondents said alternative workers are “important” or “very important” to their organizations. But only 28 percent said their organizations were “ready” or “very ready” to address the employment of alternative workers. A mere 8 percent said that they have the processes in place to manage and develop these workers. All this represents an opportunity and challenge for leaders everywhere.

A Wellspring of Talent

The opportunity in the alternative workforce is three-fold:

Filling the ‘skills gap’: The growing ranks of alternative workers offer a valuable pool of skills and capabilities in a time when it is becoming increasingly difficult to fill jobs. Last year, a global study by the Manpower Group reported that nearly half (45 percent) of employers studied were having trouble filling open positions; among companies with more than 250 employees, the percentage rose to 67 percent. That’s a major reason why the employment of alternative workers is spreading beyond IT into a host of other roles. Respondents in the 2019 Global Human Capital Trends survey indicated that they are using alternative workers extensively in operations (25 percent of respondents), customer service (17 percent), marketing (15 percent), and innovation/R&D (15 percent).

Positively impacting organizational performance: Alternative workers are often highly talented, experienced, and self-motivated, attracted by the freedom, flexibility, and variety provided by working in arrangements other than traditional employment. Respondents to our trends survey who measure the contribution of outsourced teams, freelancers, gig workers, and the crowd reported that these workers have a positive impact on organizational performance.

Increasing diversity: Alternative workers can be a valuable source of diversity. After all, they may be located anywhere in the world, and often they come from a variety of backgrounds and experiences. They can contribute unique perspectives and ideas. Smart leaders not only consider the traditional dimensions of diversity — race, gender, age, and physical ability — they also tap into the deeper value embedded in the hearts and minds of workers. In a complex, global business environment, bringing different hearts and minds together is more important than ever.

So how can your organization tap into the wellspring of alternative workers? Read the rest here.

Wednesday, July 31, 2019

All the healthcare you can afford

strategy+business, July 31, 2019

by Theodore Kinni

Illustration by adventtr

In 2014, a syllabus and sample lecture for a course entitled Introductory Korean Drama (pdf) surfaced at Princeton University. Written by the eminent healthcare economist Uwe Reinhardt, it began, “After the near‐collapse of the world’s financial system has shown that we economists really do not know how the world works, I am much too embarrassed to teach economics anymore, which I have done for many years. I will teach Modern Korean Drama instead.” It appears that some economics professors aren’t nearly as dismal as their science.

Reinhardt never taught the class, which he said began as an impromptu lecture at a dinner with a group of Korean and Taiwanese health insurance professionals. But his tongue-in-cheek analysis of Korean TV dramas offers a glimpse of his ability to get to the nub of a matter. So does Priced Out, Reinhardt’s final book, published earlier this year, two years after his death in 2017.

In the book, Reinhardt gets to the crux of the ongoing debate over the American healthcare system — in which solutions abound but relief is nowhere in sight — with just one question: “As a matter of national policy, and to the extent that a nation’s health system can make it possible, should the child of a poor American family have the same chance of avoiding preventable illness or of being cured from a given illness as does the child of a rich American family?”

This is the ethical issue hidden behind all the talk of free markets and government control, the political rhetoric about socialism and states’ rights, and the calculations of how much the people of the United States can or can’t afford to pay for healthcare. Clearly, it’s an uncomfortable one. When Reinhardt first posed the question more than 20 years ago, he was dismissed as a “socialist propagandist” for his temerity.

“And so,” he laments, “permanently reluctant ever to debate openly the distributive social ethic that should guide our healthcare system, with many Americans thoroughly confused on the issue, we shall muddle through health reform, as we always have in the past, and as we always shall for decades to come.” 

But muddle through we must, because of two long-term trends: the seemingly inexorable growth in healthcare spending and the increasing inequality in the distribution of income and wealth. These trends, Reinhardt argues, “already are pricing more and more American families in the lower part of the nation’s income distribution out of health insurance and healthcare as families in the upper half of the distribution know it.” In other words: No, currently, the child of a poor American family does not have the same healthcare prospects as the child of a rich American family. Read the rest here.

Thursday, July 25, 2019

Getting full value from external talent

strategy+business, July 25, 2019

by Theodore Kinni

Photograph by Hero Images

Many recent studies of talent include some version of the prescriptive advice in PwC’s Preparing for tomorrow’s workforce, today report: “Harness the potential of flexible talent and innovation.” The wellspring of flexible talent and innovation is the contingent or alternative workforce — these days, that includes the fast-growing ranks of freelancers, independent contractors, gig workers, and the crowds whose collective genius companies can tap to address a variety of challenges.

The problem, as the PwC study found, is that 92 percent of companies are not managing these contingent workers as effectively as they could. Even as companies rely on contingent workers in ever-greater numbers, they often make it difficult — if not impossible — for them to contribute in full measure. Leaders need to do better.

This didn’t matter much 30-something years ago when I became a full-time freelancer. Most industries had little use for contingent workers then, and most workers wanted “real” jobs on the payroll. By 2017, however, 57 million American workers identified themselves as freelancers — that’s 36 percent of the workforce and nearly 50 percent of millennials. And contingent workers are in demand in a host of industries for a host of reasons. These include (but are not limited to): the record low unemployment rate, shortages of talent in emerging capability areas (like AI and robotics), and the growing numbers of business models and workforce strategies that depend on contingent workers.

Yesteryear, managing contingent workers was something of a contradiction in terms. It seemed like a major reason to hire independent contractors was that you didn’t have to bother managing them. If there was a problem, the relationship could easily be terminated with a minimum of cost or conflict. And regardless of how well contingent workers performed, it was the rare manager who thought it might be worth cultivating an ongoing relationship. The operative managerial mind-set was “here today, gone tomorrow.”

That mind-set has been transformed over the last decade, as contingent workers have become more central to more companies’ operations. Read the rest here.

A new view of the fortune at the bottom of the digital pyramid

strategy+business, July 24, 2019

by Theodore Kinni

Photograph by code6d

The benefits of digitization and Internet connections in developing nations — and the opportunities awaiting companies that can provide them — have been much lauded in the past couple of decades. But as Payal Arora, a professor at Erasmus University Rotterdam, clearly demonstrates in her new book, The Next Billion Users, the conventional storyline around the transformative effect of technology on people’s lives often doesn’t ring true.

Arora, who has been studying how the global poor outside the West use computers and the Internet for nearly 20 years, discovered this for herself during her first development project in a rural region of southern India. “The goal,” she explains, “was to infuse this town with new digital technologies to help the poorer members of the community leapfrog their way out of poverty.”

The project team set up computer kiosks and funded cybercafes. It sent computer-equipped vans to remote villages to promote Internet awareness. “We envisioned women seeking health information, farmers checking crop prices, and children teaching themselves English,” Arora writes. The reality was the polar opposite: The kiosks became Pac-Man gaming stations, social networking sites dominated computer usage in the cybercafes, and the free movies used to attract people to the vans became their primary draw.

“Many of the technology development projects I have worked with since have yielded similar results,” Arora writes. “Play dominates work, and leisure overtakes labor, defying the productivity goals set by development organizations.” (Imagine the sniffing among Western do-gooders.)

This is the source of what Arora defines as the third digital divide between the developed and developing worlds. The first digital divide is access to technology. The second divide is the ability to use the technology — to read and write, for instance. And the third divide, which Arora labels “the leisure divide,” is rooted in motivation. “The leisure divide is about understanding what the global poor want from their digital life and why it matters to them,” she writes. “It reminds us that fulfillment is not necessarily a matter of efficiency or economic benefit but can involve a more elusive, personal, and emotive drive.” Read the rest here.

Monday, July 15, 2019

Casting the Dark Web in a New Light

Learned a lot lending an editorial hand here:

MIT Sloan Management Review, July 15, 2019

by Keman Huang, Michael Siegel, Keri Pearlson, and Stuart Madnick

With cyberattacks increasingly threatening businesses, executives need new tools, techniques, and approaches to protect their organizations. Unfortunately, criminal innovation often outpaces their defensive efforts. In April 2019, the AV-Test Institute, a research organization that focuses on IT security, registered more than 350,000 new malware samples per day, and according to Symantec’s 2019 Internet Security Threat Report, cyberattacks targeting supply chain vulnerabilities increased by 78% in 2018.

Wide-scale attacks are becoming more common, too. In October 2016, a distributed denial-of-service (DDoS) attack that hit Dyn, a domain name system (DNS) provider, in turn brought down companies such as PayPal, Twitter, Reddit, Amazon, Netflix, and Spotify. In 2017, the WannaCry and NotPetya ransomware attacks affected health care, education, manufacturing, and other sectors around the world. A report from the Department of Health in the U.K. revealed that WannaCry cost it 92 million pounds. That same year, while the cyber-defense community was working out how to fight ransomware, cryptojacking — the hijacking of other people’s machines to mine cryptocurrency — arose as a threat. Cryptojacking attacks detected by Symantec increased by 8,500% during 2017. During 2018, the value of cryptocurrencies plunged 90%, yet Symantec still blocked four times as many cryptojacking attacks as the previous year.

Attackers always seem to be one or two steps ahead of the defenders. Are they more technically adept, or do they have a magical recipe for innovation that enables them to move more quickly? If, as is commonly believed, hackers operated mainly as isolated individuals, they would need to be incredibly skilled and fast to create hacks at the frequency we’ve seen. However, when we conducted research in dark web markets, surveyed the literature on cyberattacks, and interviewed cybersecurity professionals, we found that the prevalence of the “fringe hacker” is a misconception.

Through this work, we found a useful lens for examining how cybercriminals innovate and operate. The value chain model developed by Harvard Business School’s Michael E. Porter offers a process-based view of business. When applied to cybercrime, it reveals that the dark web — that part of the internet that has been intentionally hidden, is inaccessible through standard web browsers, and facilitates criminal activities — serves as what Porter called a value system. That system includes a comprehensive cyberattack supply chain, which enables hackers and other providers to develop and sell the products and services needed to mount attacks at scale. Understanding how it works provides companies, security service providers, and the defense community at large with new, more effective avenues for combating attacks. Read the rest here.

Friday, July 12, 2019

Peter Drucker’s favorite leadership writer

strategy+business, July 12, 2019

by Theodore Kinni

Photograph by FXQuadro

Peter Drucker, the Austrian-American business author and consultant who defined management in the second half of the 20th century, wrote 39 books. Oddly, the word leadership doesn’t appear in any of their titles. In 1954, in his landmark The Practice of Management, Drucker suggested why: “The first systematic book on leadership: the Kyropaidaia of Xenophon — himself no mean leader of men — is still the best book on the subject.”

Kyropaidaia, or Cyropaedia, is the biography of Cyrus the Great, who used military conquest and enlightened governance to create the first Persian Empire around 540 BC. Xenophon the Athenian wrote the bio nearly 200 years later, and it became part of the leadership syllabus for centuries: In his 2001 book, Xenophon’s Prince: Republic and Empire in the Cyropaedia, Christopher Nadon, a professor at Claremont McKenna College (part of a consortium that includes the Drucker School of Management), writes that Alexander the Great and Julius Caesar read Kyropaidaia and it was a strong influence on Machiavelli’s The Prince. Thomas Jefferson had two copies in his library.

So what do we know about Xenophon? Drucker’s description of him as “no mean leader” might be based on Xenophon’s own memoir. Titled Anabasis, it’s the story of a misbegotten military expedition, the emergence of a reluctant but talented leader, and a strategic, fighting retreat that saved an army of 10,000 mercenaries stranded deep in enemy territory.

Before he became a writer, Xenophon was embedded in this army, known as “the Ten Thousand.” Around 400 BC, Cyrus the Younger, a distant royal relation of Cyrus the Great, recruited the force as part of a military expedition. Cyrus was generous with favors and promises, but he didn’t bother to mention that his true purpose was to depose his brother, Artaxerxes II, who had inherited Persia’s throne.

Cyrus was killed in the first battle against Artaxerxes. The war lost, a group of generals and captains from the Ten Thousand tried to negotiate safe passage home — and they were betrayed by allies and slain. Thus, the Greek mercenaries found themselves leaderless and without provisions. “Separated from Hellas by more than a thousand miles, they had not even a guide to point the way,” reported Xenophon, who wrote Anabasis in the third person. “Impassable rivers lay athwart their homeward route, and hemmed them in. Betrayed even by the Asiatics, at whose side they had marched with Cyrus to the attack, they were left in isolation.” Read the rest here.

Friday, July 5, 2019

Cloud-based HCM systems should come without surprises

Lent an editorial hand preparing this guide to preparing a reality-based business case for HCM:

Deloitte's Capital H Blog, July 3, 2019

by Marty Marchetti

The business case for cloud-based human capital management (HCM) systems can sound pretty compelling. What CHRO wouldn’t want fast access to the latest advances in HCM technology at a lower overall cost? But my colleagues and I help companies make the move to cloud HCM, and we often get a firsthand view of the mismatch between expectations and reality that was revealed in Deloitte’s 2019 Global Human Capital Trends study.

It is important to have a comprehensive and accurate picture of the total cost of ownership for cloud HR before your company commits to it, during the implementation, and after it is in place.

“No surprises” should describe your move to the cloud, and the following five questions can help you avoid them. Read the rest here.

Friday, June 14, 2019

Conversational computing

strategy+business, June 13, 2019

by Theodore Kinni

Steve Jobs could be relentless when he wanted something. In early 2010, he wanted a small startup in San Jose, Calif. CEO Dag Kittlaus and his cofounders had just raised a second round of funding and didn’t want to sell. Jobs called Kittlaus for 37 days straight, until he wrangled and wheedled a deal to buy the two-year-old venture for Apple at a price reportedly between US$150 million and $200 million. The company was Siri Inc.

Wired contributor James Vlahos tells the story of how Siri took up permanent residence in the iPhone in his new book, Talk to Me. It’s the first nontechnical book on voice computing that I’ve seen and a must-read if you have any interest in the topic.

Vlahos spends the first third of Talk to Me describing the platform war currently raging in voice computing. It details the race among the big players, including Amazon, Google, and Apple, to embed AI-driven voices in as many different devices as possible, as they seek to dominate the emerging ecosystem. The fact that Amazon now has more than 10,000 employees working on Alexa provides a good sense of the dimensions of that race.

But voice computing is more than a platform play. It is likely to have ramifications and applications for every company, especially if Vlahos’s contention that “the advent of voice computing is a watershed moment in human history” turns out to be right.

“Voice is becoming the universal remote to reality, a means to control any and every piece of technology,” he writes. “Voice allows us to command an army of digital helpers — administrative assistants, concierges, housekeepers, butlers, advisors, babysitters, librarians, and entertainers.” Voice will disrupt the business models of powerful companies — and create new opportunities for upstarts — in part because it will put AI directly in the control of consumers, Vlahos argues. “And voice introduces the world to relationships long prophesied by science fiction — ones in which personified AIs become our helpers, watchdogs, oracles, and friends.” Read the rest here.

Transformation in energy, utilities and resources

Learned a lot lending an editorial hand here:

PwC, June 13, 2019

The world is at the midpoint of a massive energy-related transformation. By 2040, the global demand for all forms of fuel and power will be four times what it was in 1990. During the same 50 years, the issue of global climate change will have moved from the margins to the centre. Institutions everywhere will be striving to address climate-related problems by dramatically decreasing and mitigating carbon use.

In the energy, utilities and resources (EU&R) industries, the relationship between these two dynamics — the rise in demand and the recognition of carbon use as a climate threat — is already determining basic strategic choices. And it will continue to do so for years to come. This development will profoundly affect a wide range of companies: producers of all forms of energy; disseminators and sellers of electric power, gas and oil; energy-based process industries such as chemicals and steel; and producers of other extracted commodities. Leaders in all those businesses will need the acumen to make and execute decisions that combine growth with environmental sustainability, often in novel ways.

The ability to take this new approach to management, especially for companies that have been successful in the past, is not guaranteed. Thus, transformation — the ability to make fundamental shifts in strategy, operating model and day-to-day activity — is on the agenda for EU&R companies this year, with a stronger sense of urgency than before. Fortunately, because of the rise of digital technology, the growing use of interoperable platforms and an emerging consensus about the value of renewable energy, EU&R companies have more tools and opportunities than ever before for thriving through this disruption. 

The urgency became clear in the results of a number of surveys conducted recently by PwC — including those of chemical company CEOs, oil and gas company CEOs, and power and utilities companies — and it is especially pressing in the utilities sector. For instance, when we surveyed senior executives in Germany’s energy sector in 2018, 77% said that the bulk of their company’s revenues would continue to come from their core businesses over the next five years, yet 57% of them expected those revenues to fall over the same period. Likewise, in chemicals, according to our 22nd Annual Global CEO Survey trends series, the next decade is likely to see the sector come under increasing pressure on a range of sustainability measures. In short, although the demand for EU&R’s elemental commodities will grow and its essentially extractive, capital-intensive nature will not change, business as usual will not be a viable alternative for many companies. Read the rest here.

Tuesday, June 11, 2019

Using AI to Enhance Business Operations

Learned a lot lending an editorial hand here:

MIT Sloan Management Review, June 11, 2019

by Monideepa Tarafdar, Cynthia M. Beath, and Jeanne W. Ross

Artificial intelligence invariably conjures up visions of self-driving vehicles, obliging personal assistants, and intelligent robots. But AI’s effect on how companies operate is no less transformational than its impact on such products.

Enterprise cognitive computing — the use of AI to enhance business operations — involves embedding algorithms into applications that support organizational processes. ECC applications can automate repetitive, formulaic tasks and, in doing so, deliver orders-of-magnitude improvements in the speed of information analysis and in the reliability and accuracy of outputs. For example, ECC call center applications can answer customer calls within 5 seconds on a 24-7-365 basis, accurately address their issues on the first call 90% of the time, and transfer complex issues to employees, with less than half of the customers knowing that they are interacting with a machine. The power of ECC applications stems from their ability to reduce search time and process more data to inform decisions. That’s how they enhance productivity and free employees to perform higher-level work — specifically, work that requires human adaptability and creativity. Ultimately, ECC applications can enhance operational excellence, customer satisfaction, and employee experience.

ECC applications come in many flavors. For instance, in addition to call center applications, they include banking applications for processing loan requests and identifying potential fraud, legal applications for identifying relevant case precedents, investment applications for developing buy/sell predictions and recommendations, manufacturing applications for scheduling equipment maintenance, and pharmaceutical R&D applications for predicting the success of drugs under development.

Not surprisingly, most business and technology leaders are optimistic about ECC’s value-creating potential. In a 2017 survey of 3,000 senior executives across industries, company sizes, and countries, 63% said that ECC applications would have a large effect on their organization’s offerings within five years. However, the actual rate of adoption is low, and benefits have proved elusive for most organizations. In 2017, when we conducted our own survey of senior executives at 106 companies, half of the respondents reported that their company had no ECC applications in place. Moreover, only half of the respondents whose companies had applications believed they had produced measurable business outcomes. Other studies report similar results.

This suggests that generating value from ECC applications is not easy — and that reality has caught many business leaders off guard. Indeed, we found that some of the excitement around ECC resulted from unrealistic expectations about the powers of “intelligent machines.” In addition, we observed that many companies that hoped to benefit from ECC but failed to do so had not developed the necessary organizational capabilities. To help address that problem, we undertook a program of research aimed at identifying the foundations of ECC competence. We found five capabilities and four practices that companies need to splice the ECC gene into their organization’s DNA. Read the rest here.

Sunday, June 2, 2019

Managerial hubris brought down MacArthur

strategy+business, May 29, 2019

by Theodore Kinni

Photograph by Pictorial Press Ltd / Alamy

I find hubris to be a fascinating cognitive flaw. Perhaps the spectacle of arrogance leading to a fall from grace provides a socially acceptable outlet for my predilection for schadenfreude — another obnoxious personality glitch. But my flaws don’t matter all that much. I’m not a leader.

For leaders, the consequences of cognitive flaws like hubris are magnified. And nowhere is the danger of managerial hubris more evident than in the career of General Douglas MacArthur, whose life and career I studied for my book No Substitute for Victory: Lessons in Strategy and Leadership from General Douglas MacArthur. In June 1950, when President Harry Truman appointed him to head the United Nations Command at the start of the Korean War, MacArthur was already a prime candidate for hubris. He had served as commander of the U.S. Army Forces in the Pacific in WWII and was still, at age 70, serving as the de facto leader of postwar Japan and its more than 80 million citizens. He was, as biographer William Manchester put it, an “American Caesar.” It is unlikely that MacArthur would have objected to the characterization, had he been alive to hear it.

If MacArthur had an elevated sense of ego and invincibility by 1950, his initial success in prosecuting the Korean War surely reinforced the feeling. As the UN forces fought to hang on at Pusan, their last foothold on the Korean Peninsula, MacArthur mounted an audacious, large-scale amphibious attack well behind enemy lines at the port city of Inchon. The plan was risky, if not foolhardy: Inchon’s 30-foot tides are so extreme that the window for making the assault was limited to two days in September. Moreover, if the landing forces had been unable to take the port, they would have been trapped.

As it turned out, the Inchon invasion was a complete success. The North Korean Army reeled in surprise, and a day later, the UN forces at Pusan broke out. Within two weeks, the invaders had been expelled from South Korea and the UN forces crossed the 38th Parallel, heading north to the Chinese border. The stage was set for one of the 20th century’s most dramatic exhibitions of hubris. Read the rest here.

Friday, May 24, 2019

Are meaning & purpose missing for your workforce?

Learned a lot lending an editorial hand here:

Capital H Blog, May 24, 2019

by Matthew Deruntz and Christina Rasieleski

Organizations are increasingly offering lavish perks to attract and retain talent, and then tracking their success with annual engagement surveys. But what if they’re missing the point?

Despite a laser-like organizational focus on what is traditionally called employee engagement, most people remain less than satisfied with their jobs. Deloitte’s 2019 Global Human Capital Trends survey points to what may be really missing. Many workers lack autonomy and access to the tools and information they need; moreover, they aren’t satisfied with the design of their jobs or the day-to-day flow of work. In fact, most survey respondents rated their organizations only “somewhat effective” or “not effective” on a number of factors related to experience: positive work environment, meaningful work, growth opportunities, trust in leadership, and supportive management. These aren’t issues that organizations can address with free doggie daycare or on-site CrossFit. Instead, they need to reevaluate the fundamental human needs of their workforce.

For better or worse, work holds such a dominant place in many people’s lives that when it fails to meet their innate need for meaning and purpose, their entire lives can become less satisfying and fulfilling. To address this issue and recognize that everyone who contributes to the organization—whether as a full-time employee, contractor, or gig worker—is an individual with intrinsic human needs, organizations need to pivot from thinking about an “employee experience” to thinking about a “human experience” for their workforce. Read the rest here.

Tuesday, April 30, 2019

Bad meetings no more

strategy+business, April 30, 2019

by Theodore Kinni

Aldous Huxley had it wrong. Bad meetings — not mescaline — open the doors of perception. They lull me into a trance. I occasionally surface (did I snore?), murmur agreement (to who knows what), surreptitiously check my phone, and nod off again. If the deep breathing I often hear on conference calls is any clue, I’m not the only one who achieves spiritual transcendence in bad meetings.

The authors of how-to books about meetings never consider the salutary effects of bad ones. Instead, they typically start with an adrenaline-like shot of statistics. Steven Rogelberg, Chancellor’s Professor at the University of North Carolina at Charlotte and author of The Surprising Science of Meetings, is no exception to the rule. He offers the usual litany of dismay: there are about 55 million meetings per day in the U.S. alone, and they cost US$1.4 trillion annually, not counting indirect costs such as employee frustration. “Too many meetings” is cited as the top time-waster by 47 percent of U.S. workers.

Nevertheless, Rogelberg doesn’t think that companies should eliminate meetings. “Was the great management guru Peter Drucker correct when he said, ‘Meetings are a symptom of bad organization. The fewer meetings, the better’?” Rogelberg asks. “The answer is an emphatic ‘no.’ Abolishing meetings is a false solution. Schedules with too few meetings are associated with substantial risks for employees, leaders, teams, and organizations.” Instead, the author advises breaking the cycle of bad meetings with the application of meeting science.

If anybody has a claim on the role of meeting scientist, it’s Rogelberg. He has been researching meetings using field surveys, laboratory studies, and experiments incorporating planted accomplices for 15 years. In this book, he weaves his findings and the research of others into an evidence-based approach to meetings that is sometimes eye-opening. Read the rest here.

Tuesday, April 23, 2019

Does your rewards strategy identify and address employee stressors?

Learned a lot lending an editorial hand here:

Inside HR, April 23, 2019

by Pete DeBellis

What is the basis for your company’s rewards offerings? For too many companies, it is purely benchmarks – that is, they make rewards decisions based on the rewards offered by other companies with which they believe they compete for talent. The problem: companies that follow this approach are left guessing about the desires and stressors of their actual workforce rather than knowing definitively what their people want or need. In fact, according to Deloitte’s 2019 Global Human Capital Trends report, nearly one-quarter (23 percent) of organisations do not feel they know what rewards their employees value.

There’s nothing wrong with benchmarking per se: You should know what rewards your competitors are offering their employees. But that’s only one piece of the rewards puzzle. To optimise a rewards offering, you need to know a lot more about your rewards customers, that is, your company’s employees. Our research at Bersin finds that companies with mature, high-performing rewards functions achieve this by adopting some version of the following 4-step process, which uses the same kinds of surveys that marketers use to understand customers. Read the rest here.

Thursday, April 18, 2019

In praise of the purposeless company

strategy+business, April 18, 2019

by Theodore Kinni

Photograph by Avalon_Studio

These days, my vote for the most misunderstood and misused management concept goes to “corporate purpose.” Back in 1973, the concept was crystal clear to Peter Drucker, who declared with admirable concision in Management: Tasks, Responsibilities, Practices: “There is only one valid definition of business purpose: to create a customer.” Since then, however, the definition of corporate purpose has mutated into pretty much any reason for being in business that isn’t explicitly connected to making money.

Business professors Sumantra Ghoshal and Christopher A. Bartlett unbottled this genie in a 1994 article in Harvard Business Review, in which they argued that strategy (“an amoral plan for exploiting commercial opportunity”) wasn’t enough: “A company today is more than just a business. As important repositories of resources and knowledge, companies shoulder a huge responsibility for generating wealth by continuously improving their productivity and competitiveness. Furthermore, their responsibility for defining, creating, and distributing value makes corporations one of society’s principal agents of social change. At the micro level, companies are important forums for social interaction and personal fulfillment.”

Why was a highfalutin corporate purpose seen as such a big deal? Ghoshal, who passed away in 2004, and Bartlett, who is now professor emeritus of business administration at Harvard Business School, concluded that companies had to transform themselves from economic entities into social institutions. They added that the “definition and articulation [of purpose] must be top management’s first responsibility.” Read the rest here.

Thursday, April 11, 2019

We Lead People, Not Cardboard Cutouts

Learned a lot lending an editorial hand here:

Forbes, April 11, 2019

by Michael Gretczko


My wife and I just took our 5-year-old fraternal twins on a skiing vacation. Our daughter is caution incarnate. She likes to ski in a familial caravan — one parent ahead and one behind — and she wants constant feedback about her performance. Our son likes to get a rough idea of the conditions — icy here, snowboarders there — and push off. He doesn’t mind falling and doesn’t particularly care what we think of his performance. It’s astounding how different twins can be.

I’m constantly amazed at how my children can uncover insights that allow me to see my role as a leader in a new light. I’m always seeking new ways to create engaged, high-performing teams, and typically, that devolves to some type of employee segmentation, by generation, job description, or personality. We’re told that millennials often prefer to work this way, that programmers want to work that way, and that Driver and Pioneer Business Chemistry styles want to work yet another way. But if my twins respond best to radically different conditions and parenting styles, can any type of segmentation be granular enough to respond to the individual needs of employees?

I suspect that it can’t. To engage with people on a truly human level — that is, to get beyond the employees-as-interchangeable-assets mindset — we need to be far more responsive to employees as individuals. Read the rest here.

Large businesses don’t have to be lousy innovators

strategy+business, April 11, 2019

by Theodore Kinni

Photograph by Kanchisa Thitisukthanapong

Gary Pisano, author of Creative Construction: The DNA of Sustained Innovation, doesn’t buy the idea that large enterprises are inherently lousy innovators. Back in 2006, Pisano, the Harry E. Figgie Professor of Business Administration at Harvard Business School, traced the origin of every drug approved by the FDA over a 20-year period to either one of the world’s 20 largest pharmaceutical companies or one of the 250 smaller, supposedly more innovative biotechs. When he compared the two groups, he discovered a “statistical dead heat” — R&D productivity was no better in the smaller biotechs than in big pharma.

Pisano also points to anecdotal evidence to support his opposition to the conventional wisdom about innovation in large enterprises. For every big, established company that failed at transformational innovation (think Blockbuster, Kodak, and Polaroid), he points to another that has succeeded. In 1964, when IBM announced its revolutionary 360 mainframe computers, it was already the largest computer company in the world and ranked 18th on the Fortune 500. In 1982, when Monsanto scientists invented the foundational technology for GMOs (genetically modified organisms), the company was 81 years old and number 50 on the Fortune 500. And in 2007, when Apple launched the iPhone, it had sales of US$24 billion and already stood at 123rd on the Fortune 500.

Pisano says that the difference between a Blockbuster and an IBM is the ability of leaders to sustain and rejuvenate the innovation capacity of their companies. It’s an ability he calls “creative construction,” and he writes that it “requires a delicate balance of exploiting existing resources and capabilities without becoming imprisoned by them.”

Walking that tightrope is a challenge for large companies. It’s tough to move the needle with innovation when the needle’s scale is measured in billions of dollars. “For J&J [Johnson & Johnson] to maintain its historical rate of top-line growth,” reports Pisano, “it must generate about $3 billion–$4 billion of new revenue per year.” The complexity of managing innovation in large organizations can also be daunting. “When you get to be the scale of a J&J, you have a lot of moving parts,” he explains. “You now have a system with serious frictions. Friction impedes mobility. Lack of mobility means lack of innovation.”

But large companies also have some advantages that can give them a leg up in innovation. “Larger enterprises like J&J have massive financial resources to explore new opportunities,” says Pisano. They can hedge their bets, tap deep reservoirs of talent, navigate regulatory agencies, and use their huge distribution networks and strong brands to roll out new products to millions of existing customers. Read the rest here.