
Open Data and Consistent Metrics May Lay Foundation for Cyber Risk Success

Experts believe that improved data transparency and a standardized set of cyber metrics will position both the public and private sectors to better handle data breaches and cybersecurity risks moving forward.

Policymakers and business leaders have often struggled to make well-guided decisions about their cyberrisk mitigation strategies because of how little usable data is available on the subject, said Josephine Wolff, a faculty associate at the Harvard Berkman Center for Internet and Society.

“Everyone has a pretty shaky grasp of what’s going on and is misinformed to some degree, but not because they aren’t paying attention,” Wolff said. “It’s because we don’t have very good data to work with. There’s not a lot of public information.”

One reason for the lack of useful cyber-related information is that companies rarely disclose cyber threats unless legally required to by the Securities and Exchange Commission. A vocal portion of the U.S. business community has fought back against proposals to require more comprehensive disclosures after attempted or successful attacks, arguing that such requirements would hurt profits and make companies even more vulnerable to hacking.

Although Wolff agreed that companies shouldn’t throw caution to the wind with sensitive cybersecurity disclosures, she said a wider sampling of data, when properly aggregated and synthesized, would lay the foundation for more robust risk mitigation policies in the future.

“We want to err on the side of as many measurements as possible and as much data collection as possible,” she said.

But massive data collection is not particularly useful unless it’s standardized and interpreted using consistent methodologies, said Trey Herr, senior research associate at George Washington University’s Cyber Security and Policy Research Institute. An objective cybersecurity metrics program, he argued, is the only way to ensure a comprehensive understanding of the cyberrisk landscape.

“One of the biggest issues we have is that there’s not a consistent set of performance metrics for how you design and implement information security either in the public or private sector,” Herr said. “We have, unfortunately, something of an ex post means of analyzing who’s messed up, but we don’t really have a way of looking into the future and assessing who’s doing well.”

A number of organizations have already laid out best practices for cyber data collection, including a widely circulated 2014 report from Marsh & McLennan and the Chertoff Group, as well as a foundational report from the SANS Institute, which argued that “by equipping management with objective measurements you are demonstrating the increased maturity of your security program and the likelihood of its success.” But for the most part, this literature is only the beginning, Herr said, and decision makers won’t be able to properly assess the cyber landscape until consistent metrics are more widely adopted.

Massive data collection on cyberrisk is not particularly useful unless it’s standardized and interpreted.

“Part of the problem with the conversation in terms of preparedness is that the focus we have is on the most visible event,” Herr said. “It’s the classic case of being more worried about crashing in a plane than dying in a car accident.”

This is true of the current focus on big retail data breaches, which has shed light on recent massive incidents but has drawn attention away from other, more pernicious cyberrisks. The public focus on big consumer breaches is a symptom of having incomplete data that illuminates one part of the puzzle without exposing the whole cyber picture, Wolff said.

“The things that we learn are very much the things that are reported in the headlines,” she said. “And those aren’t always the things that are most useful for thinking about how to do security better—in fact, those are rarely the things that are most useful.”

The New York Times recently echoed this sentiment by dismissing these sorts of data breaches as overblown risks. There are many other cyber threats to be aware of, the article said, and the heightened focus on data breaches spotlights only one portion of the threat landscape while exaggerating its effects on the public.

“Only a tiny number of people exposed by leaks end up paying any costs, and for the rare victims who do, the average cost has actually been falling steadily,” according to the New York Times article. “It’s not so different from the soap company that advertises how many different types of bacteria are on a subway pole without mentioning how unlikely it is that any of those bacteria would make you sick.”

But a number of identity theft experts, including Eva Velasquez, CEO and president of the Identity Theft Resource Center, are not as quick to dismiss the effects data breaches may have on consumers.

“If the thief is crafty enough, they can clear out your bank account,” Velasquez said.

She warned against trivializing the already well-reported data breaches that have plagued major U.S. and international retailers, but, like the other experts, agreed that comprehensive data disclosure that transcends organizational silos will be important in broader risk mitigation.

Peter Beshar, Executive Vice President and General Counsel at Marsh & McLennan Companies, testified to Congress earlier this year about encouraging wide data sharing across the public and private sectors in order to beef up our understanding of developing cyberrisks.

“Working in isolation, neither the private sector nor the public sector has the tools to protect our nation’s critical assets,” Beshar told Congress. “To accelerate the identification and detection of emerging threats, there needs to be greater trust and real-time threat information sharing between the private and public sectors.”