Showing posts with label cybersecurity.

Thursday, March 12, 2020

The algorithmic trade-off between accuracy and ethics

strategy+business, March 12, 2020

by Theodore Kinni



Photograph by Yuichiro Chino

Strava, a San Francisco–based fitness website whose users upload data from their Fitbits and other devices to track their exercise routines and routes, didn’t set out to endanger U.S. military personnel. But in November 2017, when the company released a data visualization of the aggregate activity of its users, that’s what it did.

Strava’s idea was to provide its users with a map of the most popular running routes, wherever they happened to be located. As it turns out, the resulting visualization, which was composed from three trillion GPS coordinates, also showed routes in areas, such as Afghanistan’s Helmand Province, where the few Strava users were located almost exclusively on military bases. Their running routes inadvertently revealed the regular movements of soldiers in a hot zone of insurgency.
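To make the failure mode concrete, here is a minimal sketch (my own illustration, not Strava's actual pipeline and not an algorithm from the book) of one way to build a safeguard into the aggregation itself: bucket GPS points into grid cells, but publish a cell only when enough distinct users contributed to it, so sparsely used routes never reach the public map.

```python
# Illustrative only (not Strava's pipeline or the book's code): aggregate GPS
# points into grid cells and publish a cell only if at least k distinct users
# contributed to it, so routes used by a handful of people (e.g., runners on
# one remote base) are suppressed from the public heatmap.
from collections import defaultdict

def build_heatmap(points, k=25, cell_size=0.001):
    """points: iterable of (user_id, lat, lon) tuples; returns {cell: point_count}."""
    users_per_cell = defaultdict(set)
    counts_per_cell = defaultdict(int)
    for user_id, lat, lon in points:
        cell = (round(lat / cell_size), round(lon / cell_size))
        users_per_cell[cell].add(user_id)
        counts_per_cell[cell] += 1
    return {cell: counts_per_cell[cell]
            for cell, users in users_per_cell.items()
            if len(users) >= k}  # cells with too few distinct users are dropped

# Dummy data: two users on the same stretch of road is not enough to publish.
print(build_heatmap([("u1", 37.7749, -122.4194), ("u2", 37.7750, -122.4193)], k=25))  # {}
```

A threshold like this trades a little accuracy in sparsely traveled areas for a meaningful reduction in what the map reveals about individuals, which is the very trade-off named in the book's title.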

The problem, explain University of Pennsylvania computer and information scientists Michael Kearns and Aaron Roth, authors of The Ethical Algorithm: The Science of Socially Aware Algorithm Design, is “that blind, data-driven algorithmic optimization of a seemingly sensible objective can lead to unexpected and undesirable side effects.” The solution, which they explore for nontechnical leaders and other lay readers in this slim book, is embodied in the emerging science of ethical algorithm design.

“Instead of people regulating and monitoring algorithms from the outside,” the authors say, “the idea is to fix them from the inside.” To achieve this, companies need to consider the fairness, accuracy, transparency, and ethics — the so-called FATE — of algorithm design.

Kearns and Roth don’t deal with the FATE traits in a sequential manner. Instead, they describe the pitfalls associated with algorithms and discuss the ever-evolving set of solutions for avoiding them. Read the rest here.

Monday, July 15, 2019

Casting the Dark Web in a New Light

Learned a lot lending an editorial hand here:

MIT Sloan Management Review, July 15, 2019

by Keman Huang, Michael Siegel, Keri Pearlson, and Stuart Madnick


With cyberattacks increasingly threatening businesses, executives need new tools, techniques, and approaches to protect their organizations. Unfortunately, criminal innovation often outpaces their defensive efforts. In April 2019, the AV-Test Institute, a research organization that focuses on IT security, registered more than 350,000 new malware samples per day, and according to Symantec’s 2019 Internet Security Threat Report, cyberattacks targeting supply chain vulnerabilities increased by 78% in 2018.

Wide-scale attacks are becoming more common, too. In October 2016, a distributed denial-of-service (DDoS) attack on Dyn, a domain name system (DNS) provider, brought down companies such as PayPal, Twitter, Reddit, Amazon, Netflix, and Spotify. In 2017, the WannaCry and NotPetya ransomware attacks affected health care, education, manufacturing, and other sectors around the world. A report from the U.K. Department of Health revealed that WannaCry cost the National Health Service 92 million pounds. That same year, while the cyber-defense community was working out how to fight ransomware, cryptojacking — the hijacking of other people’s machines to mine cryptocurrency — arose as a threat. Cryptojacking attacks detected by Symantec increased by 8,500% during 2017. During 2018, the value of cryptocurrencies plunged 90%, yet Symantec still blocked four times as many cryptojacking attacks as the previous year.

Attackers always seem to be one or two steps ahead of the defenders. Are they more technically adept, or do they have a magical recipe for innovation that enables them to move more quickly? If, as is commonly believed, hackers operated mainly as isolated individuals, they would need to be incredibly skilled and fast to create hacks at the frequency we’ve seen. However, when we conducted research in dark web markets, surveyed the literature on cyberattacks, and interviewed cybersecurity professionals, we found that the prevalence of the “fringe hacker” is a misconception.

Through this work, we found a useful lens for examining how cybercriminals innovate and operate. The value chain model developed by Harvard Business School’s Michael E. Porter offers a process-based view of business. When applied to cybercrime, it reveals that the dark web — that part of the internet that has been intentionally hidden, is inaccessible through standard web browsers, and facilitates criminal activities — serves as what Porter called a value system. That system includes a comprehensive cyberattack supply chain, which enables hackers and other providers to develop and sell the products and services needed to mount attacks at scale. Understanding how it works gives companies, security service providers, and the defense community at large new, more effective avenues for combating attacks. Read the rest here.
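To make the value-system idea concrete, here is a toy sketch of an attack supply chain in which each stage can be bought as a product or a service. The stage names and offerings are my own illustration, not the authors' taxonomy.

```python
# Toy model (stage names and offerings are hypothetical, not the authors'
# taxonomy): on dark web markets, each stage of an attack can be bought as a
# product or a service, which is how attacks scale without every attacker
# being highly skilled.
from dataclasses import dataclass

@dataclass
class Offering:
    stage: str    # stage of the cyberattack value chain
    product: str  # what a vendor sells for that stage
    sold_as: str  # "product" or "service"

attack_value_chain = [
    Offering("discover", "zero-day exploit", "product"),
    Offering("develop", "ransomware builder kit", "product"),
    Offering("deliver", "phishing and botnet distribution", "service"),
    Offering("monetize", "money-mule and laundering network", "service"),
]

# A would-be attacker assembles an end-to-end attack by shopping the chain.
for offering in attack_value_chain:
    print(f"{offering.stage:>9}: {offering.product} ({offering.sold_as})")
```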

Monday, June 4, 2018

How to Become a Master of Disaster

strategy+business, June 4, 2018

by Theodore Kinni

If you like disaster stories, you’ll love Meltdown, by Chris Clearfield, a principal at risk consultancy System Logic, and András Tilcsik, an associate professor at the Rotman School of Management. The authors cover a gamut of catastrophes, from a ruined Thanksgiving dinner to the water crisis in Flint, Mich., and the multiple meltdowns at the Fukushima Daiichi Nuclear Power Plant caused by the Tōhoku earthquake and tsunami in 2011. The worst part of all these examples: According to the authors, they were preventable.

All the disasters recounted in Meltdown share characteristics first identified by sociologist Charles Perrow. Now in his nineties, Perrow earned the appellation “master of disaster” for his seminal study of a host of incidents in high-risk settings, starting with the Three Mile Island Nuclear Generating Station accident in 1979. “In Perrow’s view,” explain Clearfield and Tilcsik, “the accident was not a freak occurrence, but a fundamental feature of the nuclear power plant as a system.”

This system — indeed, each of the systems described in Meltdown’s disasters — is complex and tightly coupled: complex in that the systems are nonlinear, with parts sometimes interacting in hidden ways, and tightly coupled in that there is little slack in these systems. A failure in one part quickly, and often unexpectedly, affects other parts. Read the rest here.
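A toy simulation (my own illustration, not Perrow's model or anything from the book) helps show why tight coupling matters: the higher the chance that a failure propagates instantly along hidden dependencies, the less slack operators have to catch it.

```python
# Toy cascade simulation: coupling is modeled as the probability that a
# failure propagates immediately to each downstream, partly hidden dependency.
import random

dependencies = {          # hidden, nonlinear interactions between parts
    "pump": ["valve", "sensor"],
    "valve": ["reactor"],
    "sensor": ["alarm"],
    "reactor": [],
    "alarm": [],
}

def cascade(start, prob_propagate):
    """prob_propagate approximates coupling tightness (1.0 = no slack at all)."""
    failed, frontier = {start}, [start]
    while frontier:
        part = frontier.pop()
        for downstream in dependencies[part]:
            if downstream not in failed and random.random() < prob_propagate:
                failed.add(downstream)
                frontier.append(downstream)
    return failed

random.seed(1)
print(cascade("pump", prob_propagate=0.3))  # loose coupling: failure tends to stay local
print(cascade("pump", prob_propagate=1.0))  # tight coupling: the whole chain goes down
```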

Wednesday, May 23, 2018

Your Customers May Be the Weakest Link in Your Data Privacy Defenses

Learned a lot lending an editorial hand here:

MIT Sloan Management Review, May 22, 2018

by Bernadette Kamleitner, Vincent W. Mitchell, Andrew Stephen, and Ardi Kolah


Does your company have consumer data it isn’t legally authorized to possess?

Don’t be too quick to answer. Many ethical, lawfully managed businesses do have such data — and it comes from a surprising source: their customers, who inadvertently share the personal data of their family, friends, and colleagues.

The lack of awareness regarding peer-dependent privacy is one way that London-based Cambridge Analytica Ltd. was able to collect the personal information of more than 71 million Facebook users, even though only 270,000 of them agreed to take the now-bankrupt company’s app-based personality quiz. Cambridge Analytica reportedly knew what it was doing, but any company that accesses customer data, such as contacts, call logs, and files, can unknowingly breach peer privacy.

Blame apps. Virtually all large companies offer apps to their customers, and most of those apps access and collect customer data. Often, that includes peer data, collected even though the app’s owner may have no direct relationship with the user’s peers.

Consider a typical scenario: John installs a customer club membership app on his smartphone. During this process, the app requests permission to access core services on his device, including his contacts. John agrees. This opens a Pandora’s box of potential problems. John has given a third party — the company owning the app — permission to access not only his personal data, but also the personally identifiable information of the hundreds of contacts saved in his phone. None of those people, including Rachel, whose name, phone number, email address, photo, and date of birth are stored in John’s phone, agreed to share their information with the company. They have no idea that they have been caught up in a peer-dependent privacy breach.
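For illustration only (hypothetical app code, not from the article), here is a sketch of the kind of data minimization that reduces, though does not eliminate, this exposure: rather than uploading John's full address book, the app keeps only a salted hash of each phone number, so it can match contacts without Rachel's name, email, photo, or birthday ever leaving the device.

```python
# Hypothetical sketch (not from the article): minimize what leaves the device.
# Upload a salted hash of each contact's phone number instead of the contact
# record itself; this reduces, but does not eliminate, the peer data collected.
import hashlib

def minimize_contacts(contacts, salt):
    """contacts: list of dicts like {'name': ..., 'phone': ..., 'email': ...}."""
    minimized = []
    for contact in contacts:
        digest = hashlib.sha256((salt + contact["phone"]).encode()).hexdigest()
        minimized.append({"phone_hash": digest})  # no names, emails, photos, or birthdays
    return minimized

contacts = [{"name": "Rachel", "phone": "+15551234567", "email": "rachel@example.com"}]
print(minimize_contacts(contacts, salt="per-app-random-salt"))
```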

Company executives may be no more aware of the privacy breaches built into their apps than John and his contacts. Yet, it could cost them as dearly. Under the EU General Data Protection Regulation (GDPR), any company can incur fines of up to 4% of global annual revenue or 20 million euros, whichever is greater, for failing to respect the sovereignty of EU citizens over their personal data. Notably, these fines are not limited to customer data: As of May 25, 2018, the personal data of EU citizens, including data on other people’s devices, must be obtained lawfully, fairly, and transparently in accordance with the principles of the GDPR. This implies that the fully informed consent of peers is needed prior to taking possession of their personal data (barring some other legal basis). In most cases and subject to a balancing test, companies also need to provide peers with access to their personal data and, in some cases, delete that data on demand.

In short, peer-dependent privacy has become a significant exposure for companies that want to ensure the highest standards of data protection, privacy, and regulatory compliance. Read the rest here.

Thursday, March 2, 2017

RSA 2017: 5 Takeaways From the Biggest Cybersecurity Conference

Lent an editorial hand here:

WSJ.CustomStudios, March 2, 2017

by David B. Burg and Grant Waterfall, PwC

The annual RSA Conference acts like a microcosm of the global cybersecurity ecosystem: everyone’s there, and it’s as kinetic and chaotic as the industry itself. Yet the industry’s biggest cybersecurity conference also provides some valuable insights, as we recently found.

Since returning from RSA in mid-February, where PwC maintained a lively presence amid the hubbub, we’ve condensed our takeaways into five key points:

Efficiency: A record 43,000 information security professionals attended this year’s RSA, roaming 550 vendor booths and choosing among more than 500 educational sessions to attend. As the cybersecurity world continues to expand and grow in importance and relevance, this event continues to grow as well — just five years ago, only 17,000 information security professionals attended RSA, according to the event managers. So for anyone who wants to find out just about anything about cybersecurity, it’s all there. Someone new to the cybersecurity and privacy industry could theoretically cram months of research and learning into just a few days in San Francisco. Read the rest here.

Thursday, November 3, 2016

TechSavvy: How “Smart” Is Your R&D Spending?

MIT Sloan Management Review, November 3, 2016

by Theodore Kinni



Strategy&’s annual Global Innovation 1000 study, which examines the 1,000 public companies that spend the most on R&D (collectively 40% of the world’s total R&D spending), is always insightful. The most dismaying finding: In every one of the past 12 years, the study has found no statistically significant relationship between the financial performance of the Innovation 1000 companies and their R&D spending.

Assuming that fact doesn’t cause you to throw up your hands and use your company’s R&D budget for a massive beer bash, this year’s study, published in strategy+business, provided another insight that is well worth considering: A transformation in R&D spending is occurring.

“R&D is shifting more and more toward developing software and services,” write Strategy& principals Barry Jaruzelski, Volker Staack, and Aritomo Shinozaki. “Software increasingly carries the burden of enabling product differentiation and adaptability, and enhancing customer experiences and outcomes. Services, offered along with or separately from physical products, now focus more on new customer needs, providing enhanced value and improved usability.”

This shift, explain the authors, is driven by the ever-increasing capabilities of software, the embedding of software and sensors in products, the ability to connect products via IoT and the cloud, and, as always, customer demand. It’s manifesting in every kind of “smart” product and service.

Since 2010, the Global Innovation 1000 companies have increased their R&D spending on software offerings by 65% — to $142 billion. In addition, report the authors, “companies currently allocating 25% or more of their R&D budgets to software offerings reported that their revenues were growing significantly faster than those of key competitors with lower allocations.”
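As a quick back-of-envelope check (my arithmetic, not a figure reported in the study), a 65% rise to $142 billion implies a 2010 baseline of roughly $86 billion:

```python
# Back-of-envelope: implied 2010 software R&D spending, given a 65% rise to $142B.
current_spend = 142e9
implied_2010 = current_spend / 1.65
print(f"${implied_2010 / 1e9:.0f} billion")  # prints "$86 billion"
```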

What does your company spend its R&D budget on? Read the rest here.

Thursday, October 20, 2016

TechSavvy: Beware the Paradox of Automation

MIT Sloan Management Review, October 20, 2016

by Theodore Kinni

Earlier this year, Facebook exorcised those pesky human editors who were introducing political bias into its Trending news list and left the job to algorithms. Now, reports Caitlin Dewey in The Washington Post, the Trending news isn’t biased, but some of it is fake. Turns out the algorithms can’t tell a real news story from a hoax.

Facebook says it can improve its algorithms, but errors of judgment aren’t the only pitfall in transferring human tasks to machines. There’s also the paradox of automation. "It applies in a wide variety of contexts, from the operators of nuclear power stations to the crews of cruise ships, from the simple fact that we can no longer remember phone numbers because we have them all stored in our mobile phones, to the way we now struggle with mental arithmetic because we are surrounded by electronic calculators," says Tim Harford in an excerpt published by The Guardian from his new book, Messy: The Power of Disorder to Transform Our Lives. "The better the automatic systems, the more out-of-practice human operators will be, and the more extreme the situations they will have to face."

Harford borrows William Langewiesche’s harrowing description of the crash of Air France Flight 447 to illustrate three problems with automation: "First, automatic systems accommodate incompetence by being easy to operate and by automatically correcting mistakes. … Second, even if operators are expert, automatic systems erode their skills by removing the need for practice. Third, automatic systems tend to fail either in unusual situations or in ways that produce unusual situations, requiring a particularly skillful response."

The excerpt is worth a read — especially if it prompts you to ask if your company’s automation initiatives might entail similar risks. Read the rest here.

Thursday, September 22, 2016

TechSavvy: That Sound You Hear Is Your Enterprise’s AI Technology

MIT Sloan Management Review, September 22, 2016

by Theodore Kinni


Apple held its "Special Event" and, among other things, officially killed the iPhone’s 3.5-millimeter headphone jack, replacing it with $159 wireless AirPods. My first reaction: Meh. But then I read Mike Elgan’s paean to this development in Computerworld.

Elgan says that AirPods are actually artificial intelligence hardware. “The biggest thing going on here is the end of ‘dumb speaker’ earbuds, and the mainstreaming of hearables — actual computers that go in your ears,” he says. “Bigger still is that the interface for these tiny computers is a virtual assistant. When you double-tap on an AirPod, Siri wakes up, enabling you to control music play and get battery information with voice commands.”

What does this mean for your company? Soon every employee could have a supercomputer whispering in his or her ear. For instance, hearables startup Bragi and IBM just announced that they plan to combine Bragi’s Dash earbuds and IBM’s Watson IoT platform "to transform the way people interact, communicate, and collaborate in the workplace."

Earbud-sporting workers, according to the companies, will use the devices to “receive instructions, interact with co-workers, and enable management teams to keep track of the location, operating environment, well-being, and safety of workers.” Bragi and IBM have targeted six areas of initial focus: worker safety, guided instructions, smart employee notifications, team communications, workforce analysis and optimization, and biometric ID. Read the rest here.