Friday, November 15, 2019

How to build a great experience

strategy+business, November 15, 2019

by Theodore Kinni



Illustration by Paula Daniëlse

In 2017, the Marriott School of Business at Brigham Young University announced that henceforth the Department of Recreational Management would be known as the Department of Experience Design and Management. The idea that immersive and engaging experiences produce value and deliver competitive advantage has come a long way in the 20 years since Joe Pine and Jim Gilmore welcomed us to something they called the “experience economy.”

Designing Experiences is the latest in a long line of books that have appeared on the subject. In it, J. Robert Rossman, a professor at Illinois State University, and Mathew Duerden, an associate professor in the aforementioned department at the Marriott School, touch on many of its predecessors (including one in which I had a hand, Be Our Guest) in a concise textbook that serves as both a theoretical foundation and a how-to guide for experience design.

The theoretical foundation, which appears mostly in the first two chapters, bogs down a bit in explaining what constitutes an experience. This murk stems from Pine and Gilmore’s positioning of experiences as an economic activity distinct from products and services. Rossman and Duerden carry this forward by arguing that experiences differ from products and services because the person on the receiving end of an experience must be actively co-creating it. “Experience demands conscious attention, engagement, and action — in a word, participation,” they write.

This distinction isn’t clear to me. Is there any product or service we can buy and consume that doesn’t require our participation in some form or other? And even if it were possible not to participate in the acquisition and use of certain products or services (say, buying groceries or cutting the lawn), mightn’t that count as a very good experience for some of us? Read the rest here.

Tuesday, November 5, 2019

Best Business Books 2019: Management

strategy+business, November 5, 2019

by Theodore Kinni



Illustration by Harry Campbell

Marcus Buckingham and Ashley Goodall
Nine Lies About Work: A Freethinking Leader’s Guide to the Real World (Harvard Business Review Press, 2019)

Stephen Martin and Joseph Marks
Messengers: Who We Listen To, Who We Don’t, and Why (PublicAffairs, 2019)

Roger Dooley
Friction: The Untapped Force That Can Be Your Most Powerful Advantage (McGraw-Hill, 2019)

In 1954, the discipline of management was neatly encapsulated by Peter Drucker in the pages of a single book, The Practice of Management. This year’s best business books on management reflect how much the discipline has changed in the past 65 years, and how fuzzy the boundaries separating fields have become.

Nine Lies About Work, by Marcus Buckingham and Ashley Goodall, the year’s best management book, challenges the assumptions that underlie contemporary managerial practices, many of which date back to Drucker’s day. In doing so, the book offers a glimpse of a new management paradigm that may prove to be better suited to the times. Messengers, by Stephen Martin and Joseph Marks, prompts us to see managers as a living, breathing communication medium — and it describes the traits that can ensure the messages they deliver will be heard. And Friction, by Roger Dooley, suggests that if managers turn their attention to simplifying anything customers and employees need to do, they’ll happily do more of it. Read the rest here.

Saturday, November 2, 2019

Lucky You!

strategy+business, November 1, 2019

by Theodore Kinni



Photograph by Elizabeth Fernandez

Recently, on a social media site for professionals, I suggested that luck plays a significant role in leadership and business success. This didn’t sit well with several commenters, who argued that successful people become that way largely by dint of merit — they work hard and use their brains and hone their ability to identify and exploit opportunities. People like Bill Gates and Warren Buffett make their own luck, I was told.

Hogwash. This is not to detract from the monumental business achievements of two of America’s wealthiest (and most philanthropic) men. But we should acknowledge that Gates and Buffett both drew winning tickets in the birth lottery.

Gates’s dad co-founded a law firm and served as president of the Washington State Bar Association; his mom, who came from a family of bankers, held prominent board positions. They sent their son to one of the best prep schools in the nation and then to Harvard, where, with their assent and support, he dropped out to start a computer software company. Mary Gates helped her son’s fledgling company get the IBM contract that led to MS-DOS. Warren Buffett was the son of a U.S. congressman — Howard Buffett represented Nebraska in the House of Representatives for four terms, and founded a brokerage firm. Warren’s parents sent him to the Wharton School of the University of Pennsylvania, the University of Nebraska, and Columbia Business School, where he studied with and was mentored by Benjamin Graham, the father of value investing. Buffett’s first job was in his father’s firm; then he went to work for Graham. Nobody gave Gates or Buffett their billions, or even their first tens of millions. But when they pulled themselves up by their bootstraps, the climb wasn’t as far as it would have been for most of us.

Once leaders attain positions of power, luck continues to play a powerful role in their success. Take Jack Welch, who was named “manager of the century” by Fortune in 1999. Read the rest here.

Tuesday, October 22, 2019

Past performance is no guarantee of future results

strategy+business, October 18, 2019

by Theodore Kinni



Photograph by aeduard

By 1905, when philosopher George Santayana wrote, “Those who cannot remember the past are condemned to repeat it,” humans had already been gleaning lessons from history for several millennia. Around 800 BC, in the Iliad, Homer used the principal players in the Trojan War to explore leadership strategies and styles. Nearly a thousand years later, at the start of the second century AD, Plutarch compared the character traits of historical leaders in Lives of the Noble Greeks and Romans. And of course, we are still at it today. The business bookshelves are sagging with leadership and strategy lessons drawn from the lives of yesterday’s inventors, tycoons, generals, politicians, and other leading lights.


Sometimes these lessons feel like too much of a stretch — not only because they tend to idealize their subjects, but also because they elevate ad hoc responses into generic rules. How much credence, for instance, should a new CEO put in creating a “team of rivals” à la Abraham Lincoln? Or, to hold my own feet to the fire, how much faith should a leader in a battle for market share put in the “hit ’em where they ain’t” military strategy of Douglas MacArthur?

Ben Laker, professor of leadership at Henley Business School and dean of education at the National Centre for Leadership and Management in the U.K., points to current Prime Minister Boris Johnson, who wrote a book about another U.K. prime minister, The Churchill Factor: How One Man Made History, to illustrate the difficulties of applying lessons from the past. “The Prime Minister knows how Winston Churchill created a sense of connection through a ‘backs against the wall’ mentality in 1941. He is basing his rhetoric, decisions, and actions on Churchill’s example. And many people do feel more connected to him because of it,” Laker says. “But as Johnson’s critics observe, Brexit is not a war and a wartime mentality is at odds with a situation that requires openness and collaboration to reach a feasible outcome.”

Clearly, context is a critical factor in applying history. “You can look at the past and ask yourself whether you would do the same thing in the same situation,” Laker told me in a recent conversation. “But the problem is you are not in the same situation. So, how relevant is history to your present situation?” Read the rest here.

Wednesday, October 2, 2019

Need to Work Differently? Learn Differently

Learned a lot lending an editorial hand here:

Boss Magazine, October 2019

by Michael Griffiths


Digitization requires a new set of skills and a new approach to training for employees


The next time your company holds an all-hands meeting, look around the room — or the arena — and consider this: It’s likely that more than half the people present will need reskilling or upskilling in the next three years.

This probably doesn’t come as a complete surprise to you. The forces of change are transforming every aspect of work, including what is done, who does it, and where it is done.

For example, emerging technologies — especially AI and machine learning — are among the most disruptive of these forces. In fact, 81 percent of respondents to Deloitte’s 2019 Global Human Capital Trends survey indicated they expect the use of AI to increase or increase significantly over the next three years. Unlike some, we don’t believe that AI will eliminate the need for a workforce. Instead, we anticipate the rise of hybrid jobs, which are enabled by digitization and technology, and the emergence of a new kind of job, which we call the superjob. A superjob combines work and responsibilities from multiple traditional jobs, using technology to both augment and broaden the scope of the work performed and involving a more complex set of digital, technical, and human skills.

Hybrid jobs and superjobs can enable your company to be more responsive to customers and adaptable to change. But they require a more deliberate and agile approach to capability development. Already, many companies are responding to this need: Our research finds that 83 percent of organizations are increasing their investments in reskilling programs, and more than half (53 percent) increased their learning and development budgets by 6 percent or more in 2018.

But will more learning be enough at your company? It’s doubtful. To Work Differently, we think your company should first Learn Differently. Read the rest here.

Wednesday, August 14, 2019

The Greatest Showman on Earth

strategy+business, August 14, 2019

by Theodore Kinni

Phineas Taylor Barnum’s future was bright. He believed from the age of 4 that his grandfather, pleased to have his grandson as his namesake, had purchased the most valuable farm in Connecticut in Barnum’s name. For years, the boy’s grandfather talked about the farm and his neighbors congratulated him on being the richest child in the town of Bethel. At the age of 12, Barnum was taken to see his farm. It was five worthless, inaccessible acres in a large swamp. Everyone had a great laugh.

Robert Wilson, editor of the American Scholar and author of Barnum, sees the roots of the 19th-century American showman’s outsized pecuniary drive in “this strangely cruel and astonishingly drawn-out joke.” But it’s hard to judge whether the story is true — the only citation Wilson offers is Barnum’s autobiography, which should give the reader pause, considering its author’s reputation for humbug and penchant for spinning his own life story.

If Barnum didn’t stretch the story (or invent it outright), it also may reveal the roots of his preternatural talent for hucksterism. Certainly, he elevated the joke to unprecedented heights with a series of frauds so entertaining to American and European audiences of every social class that instead of shunning him, they rewarded him with riches that beggared the promise of the farm that never was. He also provided an early and, sadly, enduring lesson in the use of brazen hype, shameless self-promotion, and fake news as the basis of a successful business. Read the rest here. 

Tuesday, August 13, 2019

Staying Ahead of Disruption with Workforce Sensing

Learned a lot lending an editorial hand here:

Workforce Magazine, August 2019

By Daniel Roddy and Chris Havrilla

Plug the word “disruption” into Google Trends and you’ll get a jagged line tracking 15 years of peaks and plunges in search frequency. But for all the short-term variation in the chart, the long-term trend is steadily rising: there are nearly three times as many “disruption” searches today as there were in 2004.

The steady rise in searches reflects a reality that won’t surprise most leaders. They face a host of disruptions—social, demographic, environmental, economic, technological, and geopolitical. Not only is it their job to make sure that their companies don’t get blindsided by these breakpoints in the status quo, but they also must be able to respond to them quickly and agilely in order to transform these disruptions into competitive advantage.

Sensing is the foundation on which an organization’s ability to identify, pace, and respond to disruption is built. In hindsight, disruptions seem obvious. By the mid-2000s, it was clear that streaming movies would decimate the video rental industry. But to have realized that a decade earlier, when the MP3 format first emerged for audio, and to have acted on it, is another matter entirely.

The ability to sense disruptions in their nascent stages and predict how they are likely to affect a company and its stakeholders is crucial to success in business today. This is especially true when it comes to sensing disruptions in the workforce. Read the rest here.

Thursday, August 1, 2019

Work Should Generate Energy, Not Sap It

Learned a lot lending an editorial hand here:

Forbes, August 1, 2019

by Michael Gretczko


GETTY

It’s 5:45 a.m. There is a candle flickering in the room. A bass booms. I pump my legs. Left, right, left, right. My heart starts pounding. I suck in air. Soon, I’m pouring sweat.

Does this sound like a nightmare? It’s just the opposite.

I start most of my days at SoulCycle, a 45-minute, high-intensity spin class. It’s my “secular sanctuary,” as one of its founders describes it. The class grounds and focuses my mind, resets and recharges my body with the energy I need for the day ahead. It enables me to bring my best self to my work. (I swear I haven’t been paid for my comments — I’m just plain addicted.)

When I travel, I invite my colleagues to ride with me. We become a tribe at these classes. By the time we get to our post-workout coffees, we’re connected in a more intimate and intense way, high-fiving and sharing our sense of accomplishment.

As I reflect on what I love about cycling, I realize there are parallels between what it does and what great organizations strive to do. Both seek to maximize our human potential. Both are focused on enabling us to impact the world around us by unlocking our best capabilities and intentions.

There are three lessons from my spin class experience that align with how leaders of high-performing organizations unleash the energy of their workforces. Read the rest here.

Diversity, Inclusion, and the Alternative Workforce

Learned a lot lending an editorial hand here:

Boss Magazine, August 2019

by Kathi Enderes


The alternative workforce, including outsourced teams, contractors, consultants, freelancers, gig workers, and the crowd, is going mainstream. It’s the fastest-growing labor segment in the EU. By next year, the number of self-employed workers in the U.S. is projected to reach 42 million people — nearly tripling in two years. Alternative workers account for over 10 percent of Australia’s labor pool.

Savvy leaders are well aware of the growth in the alternative workforce. In Deloitte’s 2019 Global Human Capital Trends survey, 41 percent of the almost 10,000 executive respondents said alternative workers are “important” or “very important” to their organizations. But only 28 percent said their organizations were “ready” or “very ready” to address the employment of alternative workers. A mere 8 percent said that they have the processes in place to manage and develop these workers. All this represents an opportunity and challenge for leaders everywhere.

A Wellspring of Talent

The opportunity in the alternative workforce is three-fold:

Filling the ‘skills gap’: The growing ranks of alternative workers offer a valuable pool of skills and capabilities in a time when it is becoming increasingly difficult to fill jobs. Last year, a global study by the Manpower Group reported that nearly half (45 percent) of employers studied were having trouble filling open positions; among companies with more than 250 employees, the percentage rose to 67 percent. That’s a major reason why the employment of alternative workers is spreading beyond IT into a host of other roles. Respondents in the 2019 Global Human Capital Trends survey indicated that they are using alternative workers extensively in operations (25 percent of respondents), customer service (17 percent), marketing (15 percent), and innovation/R&D (15 percent).

Positively impacting organizational performance: Alternative workers are often highly talented, experienced, and self-motivated, attracted by the freedom, flexibility, and variety provided by working in arrangements other than traditional employment. Respondents to our trends survey who measure the contribution of outsourced teams, freelancers, gig workers, and the crowd reported that these workers have a positive impact on organizational performance.

Increasing diversity: Alternative workers can be a valuable source of diversity. After all, they may be located anywhere in the world, and often they come from a variety of backgrounds and experiences. They can contribute unique perspectives and ideas. Smart leaders not only consider the traditional dimensions of diversity — race, gender, age, and physical ability — they also tap into the deeper value embedded in the hearts and minds of workers. In a complex, global business environment, bringing different hearts and minds together is more important than ever.

So how can your organization tap into the wellspring of alternative workers? Read the rest here.

Wednesday, July 31, 2019

All the healthcare you can afford

strategy+business, July 31, 2019

by Theodore Kinni


Illustration by adventtr

In 2014, a syllabus and sample lecture for a course entitled Introductory Korean Drama surfaced at Princeton University. Written by the eminent healthcare economist Uwe Reinhardt, it began, “After the near‐collapse of the world’s financial system has shown that we economists really do not know how the world works, I am much too embarrassed to teach economics anymore, which I have done for many years. I will teach Modern Korean Drama instead.” It appears that some economics professors aren’t nearly as dismal as their science.

Reinhardt never taught the class, which he said began as an impromptu lecture at a dinner with a group of Korean and Taiwanese health insurance professionals. But his tongue-in-cheek analysis of Korean TV dramas offers a glimpse of his ability to get to the nub of a matter. So does Priced Out, Reinhardt’s final book, published earlier this year, two years after his death in 2017.

In the book, Reinhardt gets to the crux of the ongoing debate over the American healthcare system — in which solutions abound but relief is nowhere in sight — with just one question: “As a matter of national policy, and to the extent that a nation’s health system can make it possible, should the child of a poor American family have the same chance of avoiding preventable illness or of being cured from a given illness as does the child of a rich American family?”

This is the ethical issue hidden behind all the talk of free markets and government control, the political rhetoric about socialism and states’ rights, and the calculations of how much the people of the United States can or can’t afford to pay for healthcare. Clearly, it’s an uncomfortable one. When Reinhardt first posed the question more than 20 years ago, he was dismissed as a “socialist propagandist” for his temerity.

“And so,” he laments, “permanently reluctant ever to debate openly the distributive social ethic that should guide our healthcare system, with many Americans thoroughly confused on the issue, we shall muddle through health reform, as we always have in the past, and as we always shall for decades to come.” 

But muddle through we must, because of two long-term trends: the seemingly inexorable growth in healthcare spending and the increasing inequality in the distribution of income and wealth. These trends, Reinhardt argues, “already are pricing more and more American families in the lower part of the nation’s income distribution out of health insurance and healthcare as families in the upper half of the distribution know it.” In other words: No, currently, the child of a poor American family does not have the same healthcare prospects as the child of a rich American family. Read the rest here.

Thursday, July 25, 2019

Getting full value from external talent

strategy+business, July 25, 2019

by Theodore Kinni



Photograph by Hero Images

Many recent studies of talent include some version of the prescriptive advice in PwC’s Preparing for tomorrow’s workforce, today report: “Harness the potential of flexible talent and innovation.” The wellspring of flexible talent and innovation is the contingent or alternative workforce — these days, that includes the fast-growing ranks of freelancers, independent contractors, gig workers, and the crowds whose collective genius companies can tap to address a variety of challenges.

The problem, as the PwC study found, is that 92 percent of companies are not managing these contingent workers as effectively as they could. Even as companies rely on contingent workers in ever-greater numbers, they often make it difficult — if not impossible — for them to contribute in full measure. Leaders need to do better.

This didn’t matter much 30-something years ago when I became a full-time freelancer. Most industries had little use for contingent workers then, and most workers wanted “real” jobs on the payroll. By 2017, however, 57 million American workers identified themselves as freelancers — that’s 36 percent of the workforce and nearly 50 percent of millennials. And contingent workers are in demand in a host of industries for a host of reasons. These include (but are not limited to): the record low unemployment rate, shortages of talent in emerging capability areas (like AI and robotics), and the growing numbers of business models and workforce strategies that depend on contingent workers.

Yesteryear, managing contingent workers was something of a contradiction in terms. It seemed like a major reason to hire independent contractors was that you didn’t have to bother managing them. If there was a problem, the relationship could easily be terminated with a minimum of cost or conflict. And regardless of how well contingent workers performed, it was the rare manager who thought it might be worth cultivating an ongoing relationship. The operative managerial mind-set was “here today, gone tomorrow.”

That mind-set has been transformed over the last decade, as contingent workers have become more central to more companies’ operations. Read the rest here.

A new view of the fortune at the bottom of the digital pyramid

strategy+business, July 24, 2019

by Theodore Kinni




Photograph by code6d

The benefits of digitization and Internet connections in developing nations — and the opportunities awaiting companies that can provide them — have been much lauded in the past couple of decades. But as Payal Arora, a professor at Erasmus University Rotterdam, clearly demonstrates in her new book, The Next Billion Users, the conventional storyline around the transformative effect of technology on people’s lives often doesn’t ring true.

Arora, who has been studying how the global poor outside the West use computers and the Internet for nearly 20 years, discovered this for herself during her first development project in a rural region of southern India. “The goal,” she explains, “was to infuse this town with new digital technologies to help the poorer members of the community leapfrog their way out of poverty.”

The project team set up computer kiosks and funded cybercafes. It sent computer-equipped vans to remote villages to promote Internet awareness. “We envisioned women seeking health information, farmers checking crop prices, and children teaching themselves English,” Arora writes. The reality was the polar opposite: The kiosks became Pac-Man gaming stations, social networking sites dominated computer usage in the cybercafes, and the free movies used to attract people to the vans became their primary draw.

“Many of the technology development projects I have worked with since have yielded similar results,” Arora writes. “Play dominates work, and leisure overtakes labor, defying the productivity goals set by development organizations.” (Imagine the sniffing among Western do-gooders.)

This is the source of what Arora defines as the third digital divide between the developed and developing worlds. The first digital divide is access to technology. The second divide is the ability to use the technology — to read and write, for instance. And the third divide, which Arora labels “the leisure divide,” is rooted in motivation. “The leisure divide is about understanding what the global poor want from their digital life and why it matters to them,” she writes. “It reminds us that fulfillment is not necessarily a matter of efficiency or economic benefit but can involve a more elusive, personal, and emotive drive.” Read the rest here.

Monday, July 15, 2019

Casting the Dark Web in a New Light

Learned a lot lending an editorial hand here:

MIT Sloan Management Review, July 15, 2019

by Keman Huang, Michael Siegel, Keri Pearlson, and Stuart Madnick


With cyberattacks increasingly threatening businesses, executives need new tools, techniques, and approaches to protect their organizations. Unfortunately, criminal innovation often outpaces their defensive efforts. In April 2019, the AV-Test Institute, a research organization that focuses on IT security, registered more than 350,000 new malware samples per day, and according to Symantec’s 2019 Internet Security Threat Report, cyberattacks targeting supply chain vulnerabilities increased by 78% in 2018.

Wide-scale attacks are becoming more common, too. In October 2016, a distributed denial-of-service (DDoS) attack that hit Dyn, a domain name system (DNS) provider, in turn brought down companies such as PayPal, Twitter, Reddit, Amazon, Netflix, and Spotify. In 2017, the WannaCry and NotPetya ransomware attacks affected health care, education, manufacturing, and other sectors around the world. A report from the Department of Health in the U.K. revealed that WannaCry cost it 92 million pounds. That same year, while the cyber-defense community was working out how to fight ransomware, cryptojacking — the hijacking of other people’s machines to mine cryptocurrency — arose as a threat. Cryptojacking attacks detected by Symantec increased by 8,500% during 2017. During 2018, the value of cryptocurrencies plunged 90%, yet Symantec still blocked four times as many cryptojacking attacks as the previous year.

Attackers always seem to be one or two steps ahead of the defenders. Are they more technically adept, or do they have a magical recipe for innovation that enables them to move more quickly? If, as is commonly believed, hackers operated mainly as isolated individuals, they would need to be incredibly skilled and fast to create hacks at the frequency we’ve seen. However, when we conducted research in dark web markets, surveyed the literature on cyberattacks, and interviewed cybersecurity professionals, we found that the prevalence of the “fringe hacker” is a misconception.

Through this work, we found a useful lens for examining how cybercriminals innovate and operate. The value chain model developed by Harvard Business School’s Michael E. Porter offers a process-based view of business. When applied to cybercrime, it reveals that the dark web — that part of the internet that has been intentionally hidden, is inaccessible through standard web browsers, and facilitates criminal activities — serves as what Porter called a value system. That system includes a comprehensive cyberattack supply chain, which enables hackers and other providers to develop and sell the products and services needed to mount attacks at scale. Understanding how it works gives companies, security service providers, and the defense community at large new, more effective avenues for combating attacks. Read the rest here.

Friday, July 12, 2019

Peter Drucker’s favorite leadership writer

strategy+business, July 12, 2019

by Theodore Kinni



Photograph by FXQuadro

Peter Drucker, the Austrian-American business author and consultant who defined management in the second half of the 20th century, wrote 39 books. Oddly, the word leadership doesn’t appear in any of their titles. In 1954, in his landmark The Practice of Management, Drucker suggested why: “The first systematic book on leadership: the Kyropaidaia of Xenophon — himself no mean leader of men — is still the best book on the subject.”

Kyropaidaia, or Cyropaedia, is the biography of Cyrus the Great, who used military conquest and enlightened governance to create the first Persian Empire around 540 BC. Xenophon the Athenian wrote the bio nearly 200 years later, and it became part of the leadership syllabus for centuries: In his 2001 book, Xenophon’s Prince: Republic and Empire in the Cyropaedia, Christopher Nadon, a professor at Claremont McKenna College (part of a consortium that includes the Drucker School of Management), writes that Alexander the Great and Julius Caesar read the Cyropaedia and that it was a strong influence on Machiavelli’s The Prince. Thomas Jefferson had two copies in his library.

So what do we know about Xenophon? Drucker’s description of him as “no mean leader” might be based on Xenophon’s own memoir. Titled Anabasis, it’s the story of a misbegotten military expedition, the emergence of a reluctant but talented leader, and a strategic, fighting retreat that saved an army of 10,000 mercenaries stranded deep in enemy territory.

Before he became a writer, Xenophon was embedded in this army, known as “the Ten Thousand.” Around 400 BC, Cyrus the Younger, a distant royal relation of Cyrus the Great, recruited the force as part of a military expedition. Cyrus was generous with favors and promises, but he didn’t bother to mention that his true purpose was to depose his brother, Artaxerxes II, who had inherited Persia’s throne.

Cyrus was killed in the first battle against Artaxerxes. The war lost, a group of generals and captains from the Ten Thousand tried to negotiate safe passage home — and they were betrayed by allies and slain. Thus, the Greek mercenaries found themselves leaderless and without provisions. “Separated from Hellas by more than a thousand miles, they had not even a guide to point the way,” reported Xenophon, who wrote Anabasis in the third person. “Impassable rivers lay athwart their homeward route, and hemmed them in. Betrayed even by the Asiatics, at whose side they had marched with Cyrus to the attack, they were left in isolation.” Read the rest here.

Friday, July 5, 2019

Cloud-based HCM systems should come without surprises

Lent an editorial hand on this guide to preparing a reality-based business case for HCM:

Deloitte's Capital H Blog, July 3, 2019

by Marty Marchetti

The business case for cloud-based human capital management (HCM) systems can sound pretty compelling. What CHRO wouldn’t want fast access to the latest advances in HCM technology at a lower overall cost? But my colleagues and I help companies make the move to cloud HCM, and we often get a firsthand view of the mismatch between expectations and reality that was revealed in Deloitte’s 2019 Global Human Capital Trends study.



It is important to have a comprehensive and accurate picture of the total cost of ownership for cloud HR before your company commits to it, during the implementation, and after the system is in place.

“No surprises” should describe your move to the cloud, and the following five questions can help you avoid them. Read the rest here.

Friday, June 14, 2019

Conversational computing

strategy+business, June 13, 2019

by Theodore Kinni

Steve Jobs could be relentless when he wanted something. In early 2010, he wanted a small startup in San Jose, Calif. Its CEO, Dag Kittlaus, and his cofounders had just raised a second round of funding and didn’t want to sell. Jobs called Kittlaus for 37 days straight, until he wrangled and wheedled a deal to buy the two-year-old venture for Apple at a price reportedly between US$150 million and $200 million. The company was Siri Inc.

Wired contributor James Vlahos tells the story of how Siri took up permanent residence in the iPhone in his new book, Talk to Me. It’s the first nontechnical book on voice computing that I’ve seen and a must-read if you have any interest in the topic.

Vlahos spends the first third of Talk to Me describing the platform war currently raging in voice computing. He details the race among the big players, including Amazon, Google, and Apple, to embed AI-driven voices in as many different devices as possible, as they seek to dominate the emerging ecosystem. The fact that Amazon now has more than 10,000 employees working on Alexa provides a good sense of the dimensions of that race.

But voice computing is more than a platform play. It is likely to have ramifications and applications for every company, especially if Vlahos’s contention that “the advent of voice computing is a watershed moment in human history” turns out to be right.

“Voice is becoming the universal remote to reality, a means to control any and every piece of technology,” he writes. “Voice allows us to command an army of digital helpers — administrative assistants, concierges, housekeepers, butlers, advisors, babysitters, librarians, and entertainers.” Voice will disrupt the business models of powerful companies — and create new opportunities for upstarts — in part because it will put AI directly in the control of consumers, Vlahos argues. “And voice introduces the world to relationships long prophesied by science fiction — ones in which personified AIs become our helpers, watchdogs, oracles, and friends.” Read the rest here.

Transformation in energy, utilities and resources

Learned a lot lending an editorial hand here:

PwC, June 13, 2019


The world is at the midpoint of a massive energy-related transformation. By 2040, the global demand for all forms of fuel and power will be four times what it was in 1990. During the same 50 years, the issue of global climate change will have moved from the margins to the centre. Institutions everywhere will be striving to address climate-related problems by dramatically decreasing and mitigating carbon use.

In the energy, utilities and resources (EU&R) industries, the relationship between these two dynamics — the rise in demand and the recognition of carbon use as a climate threat — is already determining basic strategic choices. And it will continue to do so for years to come. This development will profoundly affect a wide range of companies: producers of all forms of energy; disseminators and sellers of electric power, gas and oil; energy-based process industries such as chemicals and steel; and producers of other extracted commodities. Leaders in all those businesses will need the acumen to make and execute decisions that combine growth with environmental sustainability, often in novel ways.

The ability to take this new approach to management, especially for companies that have been successful in the past, is not guaranteed. Thus, transformation — the ability to make fundamental shifts in strategy, operating model and day-to-day activity — is on the agenda for EU&R companies this year, with a stronger sense of urgency than before. Fortunately, because of the rise of digital technology, the growing use of interoperable platforms and an emerging consensus about the value of renewable energy, EU&R companies have more tools and opportunities than ever before for thriving through this disruption. 

The urgency became clear in the results of a number of surveys conducted recently by PwC — including those of chemical company CEOs, oil and gas company CEOs, and power and utilities companies — and it is especially pressing in the utilities sector. For instance, when we surveyed senior executives in Germany’s energy sector in 2018, 77% said that the bulk of their company’s revenues would continue to come from their core businesses over the next five years, yet 57% of them expected those revenues to fall over the same period. Likewise, in chemicals, according to our 22nd Annual Global CEO Survey trends series, the next decade is likely to see the sector come under increasing pressure on a range of sustainability measures. In short, although the demand for EU&R’s elemental commodities will grow and its essentially extractive, capital-intensive nature will not change, business as usual will not be a viable alternative for many companies. Read the rest here.

Tuesday, June 11, 2019

Using AI to Enhance Business Operations

Learned a lot lending an editorial hand here:

MIT Sloan Management Review, June 11, 2019

by Monideepa Tarafdar, Cynthia M. Beath, and Jeanne W. Ross



Artificial intelligence invariably conjures up visions of self-driving vehicles, obliging personal assistants, and intelligent robots. But AI’s effect on how companies operate is no less transformational than its impact on such products.

Enterprise cognitive computing — the use of AI to enhance business operations — involves embedding algorithms into applications that support organizational processes. ECC applications can automate repetitive, formulaic tasks and, in doing so, deliver orders-of-magnitude improvements in the speed of information analysis and in the reliability and accuracy of outputs. For example, ECC call center applications can answer customer calls within 5 seconds on a 24-7-365 basis, accurately address their issues on the first call 90% of the time, and transfer complex issues to employees, with less than half of the customers knowing that they are interacting with a machine. The power of ECC applications stems from their ability to reduce search time and process more data to inform decisions. That’s how they enhance productivity and free employees to perform higher-level work — specifically, work that requires human adaptability and creativity. Ultimately, ECC applications can enhance operational excellence, customer satisfaction, and employee experience.

ECC applications come in many flavors. For instance, in addition to call center applications, they include banking applications for processing loan requests and identifying potential fraud, legal applications for identifying relevant case precedents, investment applications for developing buy/sell predictions and recommendations, manufacturing applications for scheduling equipment maintenance, and pharmaceutical R&D applications for predicting the success of drugs under development.

Not surprisingly, most business and technology leaders are optimistic about ECC’s value-creating potential. In a 2017 survey of 3,000 senior executives across industries, company sizes, and countries, 63% said that ECC applications would have a large effect on their organization’s offerings within five years. However, the actual rate of adoption is low, and benefits have proved elusive for most organizations. In 2017, when we conducted our own survey of senior executives at 106 companies, half of the respondents reported that their company had no ECC applications in place. Moreover, only half of the respondents whose companies had applications believed they had produced measurable business outcomes. Other studies report similar results.

This suggests that generating value from ECC applications is not easy — and that reality has caught many business leaders off guard. Indeed, we found that some of the excitement around ECC resulted from unrealistic expectations about the powers of “intelligent machines.” In addition, we observed that many companies that hoped to benefit from ECC but failed to do so had not developed the necessary organizational capabilities. To help address that problem, we undertook a program of research aimed at identifying the foundations of ECC competence. We found five capabilities and four practices that companies need to splice the ECC gene into their organization’s DNA. Read the rest here.

Sunday, June 2, 2019

Managerial hubris brought down MacArthur

strategy+business, May 29, 2019

by Theodore Kinni



Photograph by Pictorial Press Ltd / Alamy

I find hubris to be a fascinating cognitive flaw. Perhaps the spectacle of arrogance leading to a fall from grace provides a socially acceptable outlet for my predilection for schadenfreude — another obnoxious personality glitch. But my flaws don’t matter all that much. I’m not a leader.

For leaders, the consequences of cognitive flaws like hubris are magnified. And nowhere is the danger of managerial hubris more evident than in the career of General Douglas MacArthur, whose life and career I studied for my book No Substitute for Victory: Lessons in Strategy and Leadership from General Douglas MacArthur. In June 1950, when President Harry Truman appointed him to head the United Nations Command at the start of the Korean War, MacArthur was already a prime candidate for hubris. He had served as commander of the U.S. Army Forces in the Pacific in WWII and was still, at age 70, serving as the de facto leader of postwar Japan and its more than 80 million citizens. He was, as biographer William Manchester put it, an “American Caesar.” It is unlikely that MacArthur would have objected to the characterization, had he been alive to hear it.

If MacArthur had an elevated sense of ego and invincibility by 1950, his initial success in prosecuting the Korean War surely reinforced the feeling. As the UN forces fought to hang on at Pusan, their last foothold on the Korean Peninsula, MacArthur mounted an audacious, large-scale amphibious attack well behind enemy lines at the port city of Inchon. The plan was risky, if not foolhardy: Inchon’s 30-foot tides are so extreme that the window for making the assault was limited to two days in September. Moreover, if the landing forces had been unable to take the port, they would have been trapped.

As it turned out, the Inchon invasion was a complete success. The North Korean Army reeled in surprise, and a day later, the UN forces at Pusan broke out. Within two weeks, the invaders had been expelled from South Korea and the UN forces crossed the 38th Parallel, heading north to the Chinese border. The stage was set for one of the 20th century’s most dramatic exhibitions of hubris. Read the rest here.

Friday, May 24, 2019

Are meaning & purpose missing for your workforce?

Learned a lot lending an editorial hand here:

Deloitte's Capital H Blog, May 24, 2019

by Matthew Deruntz and Christina Rasieleski


Organizations are increasingly offering lavish perks to attract and retain talent, and then tracking their success with annual engagement surveys. But what if they’re missing the point?

Despite a laser-like organizational focus on what is traditionally called employee engagement, most people remain less than satisfied with their jobs. Deloitte’s 2019 Global Human Capital Trends survey points to what may be really missing. Many workers lack autonomy and access to the tools and information they need; moreover, they aren’t satisfied with the design of their jobs or the day-to-day flow of work. In fact, most survey respondents rated their organizations only “somewhat effective” or “not effective” on a number of factors related to experience: positive work environment, meaningful work, growth opportunities, trust in leadership, and supportive management. These aren’t issues that organizations can address with free doggie daycare or on-site CrossFit. Instead, they need to reevaluate the fundamental human needs of their workforce.

For better or worse, work holds such a dominant place in many people’s lives that when it fails to meet their innate need for meaning and purpose, their entire lives can become less satisfying and fulfilling. To address this issue and recognize that everyone who contributes to the organization—whether as a full-time employee, contractor, or gig worker—is an individual with intrinsic human needs, organizations need to pivot from thinking about an “employee experience” to thinking about a “human experience” for their workforce. Read the rest here.