We’re All Going to Get Hacked
In November 2014, Sony Pictures suffered a massive, high-profile data breach, with hackers breaking in and stealing everything from confidential employee data to unreleased films. Not long after, on a Saturday morning, Ray Rothrock’s cell phone rang. Rothrock (MBA 1988) is the CEO of the cybersecurity firm RedSeal, and a higher-up at Sony was looking for his help. After the breach, he told Rothrock, the company had essentially hit factory reset on its entire network. The phones were down. They were doing payroll by handwritten checks. They had burned it all down. And now they needed someone to help them rebuild it.
In hindsight, Rothrock says, the answer to Sony’s break-in was to isolate and treat the affected area, not tear it all down. But he gets it—executives and managers still need some education on how to handle evolving cyber threats. Rothrock lays out his solutions in his latest book, Digital Resilience: Is Your Company Ready for the Next Cyber Threat?, and we talk to him here about what that looks like in real terms—and why the C-suite should treat hacking not as a probability but as an eventuality.
Dan Morrell: Tell me about RedSeal. What does the company do? What services does it provide?
Ray Rothrock: RedSeal is an enterprise cybersecurity-software company. We do two big things. One, we model your network—and I’ll explain what that is in a second—and then we measure and quantify your cyber capabilities in that network.
It’s very hard to understand a network these days. This was hard to understand 10 years ago, when the company was founded. But it’s immensely more complicated today because we have wires, we have WiFi, we have the cloud, we have IoT. We have all these things, and nobody understands how it’s all put together.
So my software goes in and analyzes every pathway, every port, every protocol, every application, every piece of data. And we build a software model, and just like any model, you can ask it questions or you can play war games with it or whatever you want to do.
That model is the basis for many use cases. Is my policy in compliance? What’s the pathway from here to there? Is it protected, is it encrypted, whatever it is? And then the measurement. This is something we innovated a couple years ago.
I was a VC for 25 years before I became CEO of RedSeal, and I did 15 cyber deals. Not a one of them ever came in and gave me a score! How’d you build it? Do you think it’s good? How do you know it’s good? What are you comparing it to?
So we’ve created this concept called a digital resilience score, and that score reflects how well you built your network, how well you maintain your network, and whether you know all of your network. Those are the three key elements to understanding whether you can protect yourself during an attack.
The concept of resilience is really what this is all about. Being resilient means being able to survive an impairment. It’s just like in your automobile: you have airbags and seat belts, so if you hit a tree today, you walk away from it. If you’d hit a tree 30 years ago, you wouldn’t. Networks were never built with resilience in mind. They were just built very quickly [over] the last 30 years, and a lot of tribal knowledge went away.
So RedSeal tries to quantify the architecture, quantify the equipment, quantify all the elements of the network, to tell you whether or not you can survive hitting a tree.
Morrell: What compelled you to write this book?
Rothrock: What really got me going on it was the Target attack in 2013. Target was a big Fortune 50 company. They had all the best products; they had good engineers, good policies, good practices. Yet they got hacked in a very big, public way. They lost 40 million records.
That was a tipping point for me. And then I asked myself, how could that happen? They have the best stuff, they have the best… It happened because the malware was already inside, and that’s the bit flip for me in today’s world.
So, cyber today, the bad stuff is already in our network. And it’s already there, so how do we deal with that? We don’t know where it comes from, we don’t know what it looks like, we don’t know how it’s going to manifest itself. That’s called being resilient: I’m ready for whatever comes at me. That’s what motivated me to write the book, because it’s a new way of thinking about our cyber universe.
Morrell: Yeah, I want to talk about the Target case, because that plays a pretty significant role in your book. And I wonder if we can talk about that breach and use it as a case study. How the attackers got in, what damage was done, and how digital resilience could have prevented something like that.
Rothrock: Absolutely. That’s great. I’ll stay out of the technical gobbledygook, as I like to call it.
Basically, this was a well-maintained company. They had all the proper protections and policies and what have you. And they have an HVAC system, and it needed to be serviced, and so the service guy comes in with his PC. He plugs into that HVAC system, which is pretty typical these days, and there was malware on his computer that had gotten to him via a phishing attack.
Phishing: 99% of all the problems start with you or me, the carbon pieces of this equation, making a mistake. The malware hopped from his PC to the HVAC system. It’s not manually driven. It’s looking for a particular pattern, probably a credit card pattern in this case, and didn’t find it.
So it keeps on, every time it finds a new pathway—it just goes down it and marches down to the next computer, to the next computer, and then on and on. The corporate system’s connected to the point-of-sale systems at the retail stores. They had some policies that were interesting. One of the policies is, they cleared the POS systems every night. They’d wipe the hard drives, just in case someone steals one of these cash registers, right?
Morrell: Yeah, right.
Rothrock: So the data’s not on the hard drives. But they didn’t turn the machines off, and that data was still sitting in the memory of the computer—the hard drives were wiped, but the memory wasn’t.
With that memory hot and live, the malware finds it and says, “Oh, this looks like a credit card.” It phones home, finds a way out of the network—that’s easy—calls the bad guys. They type their way in, and all of a sudden the data starts to fly.
I don’t know this for a fact, but I’ll bet you they had policies that said the HVAC network should not talk to the corporate network. They couldn’t test it. They couldn’t prove it. These incredibly complex networks—maybe at one point it was that way, but all it takes is one path and you’re toast. And that’s what happened.
Someone should have said, “Well, what if something does happen? Where’s the data? Where does it live? How do we protect the data?” That’s the first thing. So, in this case, they didn’t know what the threat looked like. They didn’t know where the threat comes from.
Part of being resilient is having a policy that says we shall block this or protect that. That’s an important thing, and most companies—almost all companies—have policies that say you can’t do this, you can’t do that. Had they implemented and proven that policy, that would have been resilient. Because, not knowing where the threat’s going to come from, that policy would have prevented it from happening. So that would have been a good thing.
They probably had a policy that said the POS systems out in the stores were supposed to be turned off at night, because that would have cleared the memory, right? But then somebody didn’t turn them off. This is the non-technological side of resilience. The network and the computers—that’s where everything is—but you have to have policies that protect against the malware getting in there.
That’s where resilience comes in. Resilience: it’s surviving impairment and not showing that you’ve been hurt or hit or anything. And that’s where resilience matters. So you don’t know where the threat is, you don’t know how it’s going to manifest itself, and if it did, how would you shut it down?
Morrell: Now, this concept of resilience—and you make this point in the book—is not necessarily a new concept. It manifests itself in biological systems, in ecosystems. Talk about where else you see that, maybe in the natural world.
Rothrock: Oh, my gosh. The natural—our human body is naturally resilient. We have this thing called skin; that’s like a firewall. But if you cut yourself and bad stuff gets in, what happens? The white blood cells attack it.
What’s that? That’s like intrusion detection, right? Ba-doop, doop! The alarm goes off, the white blood cells get it, but the red blood cells do the repair. So, the human immune system is a perfect example. It doesn’t know where the threat’s going to come from; it doesn’t know where your cut is. But it knows, when it gets that alarm, to attack it and shut it down before it kills you.
Morrell: You also talk about digital resilience as sort of a public health issue. Talk about that concept, and what that might look like, for somebody who’s working at an office. Is that a hygiene practice? What are they doing on the front lines?
Rothrock: Every city has a public health department that mandates washing your hands—that employees must wash their hands when they go to the bathroom. It’s a simple hygiene thing. Same idea here: don’t pick up a USB stick off the ground and plug it into your computer. It might be loaded with something that you don’t like.
We have all of these rules and regulations in our public health, in our medicine. And this building’s got sprinkler systems in it. Do we expect this room to catch on fire? No! Why do we put sprinklers in it? Just in case. And why do we put them in there? Because the law said you had to, because enough buildings had burned down.
One of the precepts in the book is that we’ll have enough cyber burn-downs, the law will come into play, and people will be forced to comply with something. And, of course, everyone bellyaches. Companies don’t want to do it, but let’s face it, folks: If you don’t fix it, you’re going to lose the trust. Imagine if you couldn’t go into a restaurant and trust that the food had been prepared right. You can trust it because there are laws and regulations in place. We see restaurants getting shut down all the time when the food inspector comes around.
We’ve got to do that in the cyber world. And it’s not hard. It’s not expensive. It’s just necessary.
Morrell: I want to return to that, because it is a big precept of the book: that these things are just inevitable.
Rothrock: Yeah!
Morrell: It’s just going to happen. But to the layperson, the question would be: Why can’t we just build the strongest walls possible? Why not? Why are these inevitable?
Rothrock: Because we’re human beings, and we make mistakes. We don’t build perfect walls. What is it—you build a 50-foot wall, someone has a 51-foot ladder. And software, in particular, is very human—frail. In fact, very few people graduate these days with any training in how to harden a software package. There are people who learn how to do that, and they write military systems, and even those aren’t perfect.
We’re in the early, early days of writing software that’s perfect. Encryption is a common technology that we use to protect very important information, but it’s a very expensive computation. It takes a lot of computing power, a lot of electricity, and it creates a lot of heat—computers aren’t there yet.
But if we could encrypt everything. If this, my phone right here, was totally encrypted, it’d be great. A lot of it is, but not all of it. Then we wouldn’t have to worry about … You could just leave the doors unlocked, if we had fully encrypted data. But we don’t. And it will be a very long time before we do.
Morrell: You have these 26 action items at the end of the book, and the last one of those is sort of counterintuitive. And it’s “share your pain.” A company goes through something like this—they’ve just lost money, it’s embarrassing. What is the value in sharing that story?
Rothrock: Well, good resilience comes from anticipation, thinking: What could go wrong here? In Sony’s case, the malware was in the routers. I’ve never met a single person in the cyber world who thought the malware would ever be in the routers. But it was. That’s the first time. What if Sony kept that a secret? Then Cisco and Juniper and all the other router makers would not have taken some actions to make their routers more cyber resilient.
But that’s a result of sharing. You don’t have to share every intimate detail. But if you don’t share the lessons learned, how are we going to get better? How is the next engineer, that next generation of equipment, going to be better?
So I think it’s important, and I don’t believe this is a hard technical problem. We have a lot of good technology in the world. We have too much technology, probably, and the world’s strategy has just been to buy the next thing and apply it to the network. I think those days are numbered, if they’re not over already.
In fact, every time you add something else, you’re creating new vulnerabilities. So we need to take a pause. We need to think about how we’re going to deploy our next network.
But, really, it’s going to take the guts of management to make the investment, with the money, to protect themselves. It’s only a matter of time until a big brand that we all know and trust—and use every day—gets hammered hard. Not just Supermicro, but what about Apple, Amazon? I mean, Facebook’s taking some heat right now for various reasons. A big brand we believe and trust. Tesla. General Motors.
We don’t have to wait for that. I worry about our democracy. We respond to crises. That’s the way we’re built. The farsighted management teams, CEOs, and boards—when they start thinking about that, they’ll see the value of making that investment to keep the trust of their stakeholders. Not just their shareholders and not just their customers.
I touched on that in the book in one place, but I think that’s the opportunity. And that’s why I’m doing what I’m doing at RedSeal.
Skydeck is produced by the External Relations department at Harvard Business School and edited by Craig McDonald. It is available on iTunes or wherever you get your favorite podcasts. For more information or to find archived episodes, visit alumni.hbs.edu/skydeck.