The Decipher Security podcast by Duo Security analyzes the news, explores the impact of the latest risks, and provides informative, educational material for listeners intent on understanding how security affects our world. Its mission is to amplify the voices of those who look at security through the prism of how it affects victims, and to seek out trusted, pragmatic voices that focus on security impact over hype. It isn't about the coolest exploit, the scariest vulnerability, or the largest breach. Decipher provides context, information, and analysis, not finger-pointing or blame.
On April 24, ForAllSecure CEO David Brumley joins Decipher host Dennis Fisher to talk about the importance of software security, as well as the need for better cooperation between developers and security teams.
Convinced that there had to be better software security advice out there than telling developers "don't make a mistake while you code" or "your code sucks," ForAllSecure CEO David Brumley set out to find a way to truly help developers write great applications. His passion stems from his belief that software can change the trajectory of our future possibilities. He shares a particular example: e-commerce couldn't have happened without fundamental crypto protocols such as TLS and SSL. Thanks to that cryptography, today's online economy is the size of Spain's GDP! Though security is often seen as a cost rather than a value, it has proven to be a basis for innovation. He posits that security, particularly secure software, is a value-add to businesses. In this podcast, Brumley shares his journey chasing his mission to help developers change the world, and how Mayhem, an autonomous, next-generation application security solution, came to be.
This 30-minute podcast is available for listening here. The full transcript is available below.
Dennis Fischer: Welcome to the Decipher podcast. My guest today is David Brumley, who has a whole bunch of titles. We'll start off with him being the CEO of ForAllSecure, a software security company; he's also a professor at Carnegie Mellon University and the former director of CyLab, which is part of CMU. He has published a whole bunch of very interesting work. So David, thanks for being here today.
David Brumley: I'm excited to join you today, Dennis.
Dennis Fischer: Yeah, I am too. I'm looking forward to this. Weirdly, software security is one of those things that I've been interested in for a long time, even though I think it's somehow not one of the sexiest topics in the security industry. People love to talk about zero days and APTs and all these cool attack techniques. But it takes development organizations and security professionals working together to make sure that the software we're all churning out is secure, and software security is at the base of all that. I don't feel it's always gotten as much attention as it should over the years.
David Brumley: I think it's a little bit misunderstood in its place in security. People see what happens after a system has been hacked and someone who's installed malware. As you said, software security is so fundamental to how they get there and also so fundamental to how you can stop them. It's worth everyone understanding its role.
Dennis Fischer: I think so too. I've talked to people about this. Gary McGraw is a good friend of mine and I've talked to him about software security for almost two decades now. One of the things that he's always mentioned to me is how little university students get taught about software security or even security in general. I think that's starting to change now. You'd have a much better idea about that than I do. What's your perspective on how much up-and-coming developers or folks who are just starting to get into the field are being taught about this?
David Brumley: I mean it's growing, just like everything in computer security. We didn't have enough computer security experts at universities to teach it. A lot of universities, their curriculum starts with: here's a whole bunch of different areas of security. There's network security, software security, crypto, and then they go through the list of problems people have encountered throughout history. Like in crypto: don't use an old Caesar cipher. In network security: make sure you have an IDS or you'll get compromised. Then there's software security. They're like, don't make a mistake programming. And that's the extent of it.
Dennis Fischer: That's helpful.
David Brumley: Yeah, super. So you're totally safe if you never ever make a mistake and you're safe from everything an attacker might do. It's that easy.
Dennis Fischer: That's worked out really well for us over the last 30 years.
David Brumley: You can just tell. Bugs are at the root of a lot of security problems. In your previous podcasts, you've talked about crypto bugs. A lot of crypto bugs are actually software implementation bugs. It's not like you're breaking the crypto. More often than not, there's a flaw in how it was implemented. Zero days, more often than not, are software exploitation techniques where the software was vulnerable and then it was installed on a million devices and now all of them are at risk.
Dennis Fischer: Right. And it's not until it gets installed at some massive scale like that, that somebody notices the bug that had already been there.
David Brumley: At least when I look at attackers, they fall into three buckets. There are the people who are doing social engineering attacks. We don't handle that. You need user education; you need to make sure that you recognize phishing and all that sort of stuff. Then there's network security. Did you just put a system on the internet that shouldn't be? Was there a misconfiguration? Did someone just forget to change the default password? The third type is really software security. It's when someone made a mistake in the app.
Dennis Fischer: I know that you teach an intro to computer security course and also a software security course. In that intro to computer security course, how much do you even get into software security, if at all? Or is it more like what you described about the general buckets you need to think about if you're interested in security?
David Brumley: Oh man, at CMU we do a ton of software security. So you're being a little bit polite. The way I teach it, at least how it's been described externally, is offense first. I start by looking at how someone could actually find a brand-new vulnerability, and then walk students through how to build original exploits. Instead of poking at particular flaws, I try to teach people how to think about how an attacker would analyze a system and how they would find zero days.
Dennis Fischer: Oh, interesting. That's a cool approach because one of the things I was going to ask you is how much you study or model attacker behavior.
David Brumley: We do a ton. We do a lot with offense. There are a bunch of hacking competitions throughout the year, and CMU has consistently been ranked the top United States team internationally and has won the DEF CON CTF, which is the Super Bowl of hacking. And if you look at places like the National Security Agency, we have a lot of depth in our graduates in that program. So we're unique in that way.
Dennis Fischer: Yeah, I wonder why there's not more of that. There certainly is in the security researcher community. Vulnerability researchers and red team folks certainly look at attacker behavior. But I haven't seen it all that much in the academic community. I think that's starting to change. I think you're right about that. But I wonder why that is. Do you think people just haven't gone down that path yet because there's been other problems to solve?
David Brumley: I think it actually is fear on the academic institution standpoint sometimes. For a long time, hacker and criminal were equated. We all know those aren't the same thing. When I tell people, "Hey, I teach people how to find flaws and the guy who found a flaw in a Tesla was one of my students." They're like, "Aren't you worried about them becoming criminals?" I'm like, "Wait, wait, wait. Why are you confusing an expert in computer security with a criminal?"
Dennis Fischer: A lot of that is just a function of the way the media has used the word hacker over the last two or three decades -- the way that people read these stories or see TV reports that are about some hacker who hacked a Tesla. That's a super cool story. But that does lead to people thinking, "Oh, so if I am driving my Tesla, the guy in the lane next to me can hack it."
David Brumley: It's mixing up Hollywood with real life. I mean, we don't tell people you can't study chemistry because you're going to become the new Breaking Bad, right?
Dennis Fischer: Right. Which might actually lead people into chemistry.
David Brumley: Well, I found Breaking Bad very motivating. Software security is really where it's at. There's a lot an IT person can do to prevent being hacked: they can make sure they use strong passwords, or they can make sure that they have proper network architecture. Software security is special. Traditionally only the developer could check it. Once that software is out, no one else could, other than professional red teamers. It's always been this special nugget where the users were victims. If the developer didn't check it, users were going to get exploited and they couldn't do anything about it. It's especially important to teach at the university level because we just need to create developers who don't just know not to make a flaw but actually know how an attacker might think about breaking into a program.
Dennis Fischer: That's a good point. From a software development standpoint, if you're just a developer who hasn't taken an intro to computer security course and you're developing professional applications, would you necessarily know what bugs would be exploitable or what a security flaw would look like in your code?
David Brumley: I don't think so. And that doesn't mean that they wouldn't recognize that there's a thing called a buffer overflow or it's possible to misuse crypto or something like that. They don't really grok it, if you will. It's not part of their day-to-day activity to think about it and how the code they're writing may become vulnerable.
Dennis Fischer: Yeah, that makes sense. If you're focused on one task and you're like, "Okay, I've got this one feature that I'm working on, or this one app I'm trying to build. I've got to get it done by this point." You're not necessarily thinking about all the ways things can go wrong. You're focused on a deadline and just getting your job done.
David Brumley: Exactly right. You're not incentivized as part of your job to look for security flaws. You're paid for features.
Dennis Fischer: Right? Yeah, exactly. So tell me a little bit about how you got to the point where you started ForAllSecure and what you guys are trying to accomplish.
David Brumley: It comes from my university experience. I was teaching these top-notch hackers for a long time, and there were two problems that I kept seeing happen again and again. The first was that there's a workforce shortage of talented hackers. There are just not enough of them. The second problem I saw was that not enough people have good tools to find security problems. A great hacker could one-off a tool for a particular program, but they need to be systematic about it. We really started on this research of how we could automate being the attacker to find new flaws. We spun off our company, ForAllSecure, and the name really comes from this idea that everyone should be able to check the security of the apps they use.
We were fortunate. At the same time, DARPA wanted to show fully autonomous cybersecurity is possible, and that was really in the application security domain. Can you automatically hack programs? Can you automatically patch them? And can you win in a full-spectrum battle? We entered our product, Mayhem, and we won. That really highlighted those two dimensions. We can do a lot more automatically. You don't necessarily have to ask developers to do more. You can just bring better tools, and we can find the flaws that your professional red teamer would find.
Dennis Fischer: If you're an enterprise, on an enterprise software development team, at what point would they come to you or start using your tool?
David Brumley: A typical DevOps shop would use it as part of its normal software development life cycle. As soon as they push code -- which typically triggers a build -- they'd push that build to Mayhem to find those flaws automatically. The paradigm shift for them is that the tool never stops. It's not a scan where you're just going to get a report at the end. It's actually going to try to continuously attack your program forever, until you tell it to stop.
Dennis Fischer: Okay. So you're not scanning the binary or the source code at the end of the development life cycle. It's throughout the entire process.
David Brumley: Absolutely not. I just don't understand why people think scanners are going to solve all their problems. You have these attackers who are going to spend hundreds of thousands of hours -- and they bring human creativity -- and then you think a five-minute scan is going to find all their problems. We just threw that out. We said the way to model computer security is to model the attacker. You have to make sure it's continuous. Certainly, you run it and at some point you're going to say it's good enough, but you don't stop.
Dennis Fischer: Yeah, you would have to keep going. Attack techniques change all the time. I would also think as you're continuing to add features or different pieces to whatever code base you're working on that, that has a bunch of dependencies in there. Things change. There might be different attack paths in there.
David Brumley: Yeah, absolutely. We're running the code. That's one of the unique aspects about Mayhem. We actually run the code and try to attack it. When we say that you have a flaw, we actually show you how to trigger it. We give you an input. We don't just say, you shouldn't have used this function here. We'll actually give you an input that says, "Hey, you have that string copy bug, this is how you can exploit it." That makes it so we have zero false positives. It makes it easier for developers to figure out what went wrong.
Dennis Fischer: Oh that's cool. Have there been any instances where you guys have come across a new bug class or something that you really hadn't seen before?
David Brumley: The set of bug classes? No. Those are pretty well understood from theory, but we do find really unexpected bugs. We've written about a lot of bugs that have been in the media, like an OpenWRT bug. The one that was actually most surprising to me was a bug in the trigonometric function sine. If you remember sine, cosine, and tangent and sohcahtoa, that code was written in 1995 by Sun Microsystems. Just earlier this month, we found a brand-new zero day in it. That was surprising. You can find a brand-new zero day in something that battle-tested, that ancient, and that fundamental.
Dennis Fischer: That's wild. Those are literally the four words that I remember from geometry.
David Brumley: Yeah, me too. When we found it, I had to go back and re-review. Sohcahtoa, what does that mean? And then, I figured it out. Right.
Dennis Fischer: I would have had to ask my daughter who's in high school. I'd be like, "What? What's this? Remind me."
David Brumley: It's a completely fundamental, basic library routine used by everything from rendering software to autos, planes, trains, and cyber-physical systems. It was really surprising to me, and it was just this very unusual place to find an important bug.
Dennis Fischer: That's pretty interesting. Twenty-five years later, you're finding a bug in that code.
David Brumley: It was a buffer overflow. Why would you need to use a buffer when calculating sine? I have no idea... but there it was!
Dennis Fischer: I love it and probably whoever wrote that code long retired and has no idea what they were even writing at the time.
David Brumley: The copyright is Sun Microsystems. That company doesn't exist anymore, right? It was bought by Oracle.
Dennis Fischer: Certainly not in that form. Right. That's funny. David, are there certain industries that you've found are more apt to use the tools that you guys are developing?
David Brumley: Yeah, I can tell you who's apt and who's actually not been terribly forthcoming about this. First, the DOD loves it. They have a lot of critical systems that they need to check and they've been using Mayhem throughout the DOD and to high success. We then started working with commercial enterprise, and we found really two sets of people. We found people that are in safety critical infrastructure places. That's why I mentioned cars and airplanes, where those systems have to work for 20, 30 years. They're really interested in Mayhem and the techniques because you don't want your car or your airplane crashing as you're driving or as you're flying. The other set of people who are using it are people who are following the DevOps model and these tend to be more mature organizations. They've recognized that we've got to get out of this idea of checking once and thinking that everything's done. We have to get into this point of continuously checking application because it models the DevOps mindset.
David Brumley: The set of people who haven't actually been that enthusiastic so far is IoT. From a cybersecurity perspective, I was kind of alarmed that a large number of them would say, in so many words, we don't care about the security of those end devices we sell customers. They should just buy a new one if they want to upgrade.
Dennis Fischer: That is essentially the attitude that I've seen from a lot of IoT vendors. They consider them disposable devices. Either they don't have a mechanism for pushing an update or they don't care to because it's too expensive for them to do it. They'd rather you just buy a new thermostat, light bulb, baby monitor or whatever the hell it happens to be.
David Brumley: We've even seen this go to medical device manufacturers where they're using it as an upgrade path: "I'm sorry you bought that expensive piece of hospital equipment that's now outdated. You need to buy the new one today because we've updated the software for a security vulnerability."
Dennis Fischer: Right. There's a lot of words you could use for that sales model, but it's pretty cynical.
David Brumley: Cynical, predatory. I don't know what you want to say. It's definitely not in the best interest of their users.
Dennis Fischer: Yeah, it's true. You would think that with all of the attention IoT security has gotten in the last, say, four or five years, especially since the Mirai botnet, and with the absurd vulnerabilities that have been discovered, particularly in medical equipment like you mentioned, the manufacturers would really want to get on top of that. There's a real opportunity for vendors in that space to say, "Okay, we're the ones that care about your security and privacy. None of these other vendors do."
David Brumley: I'd hope so. I'm sure there's companies out there where this is really part of their model and I'd love to meet them.
Dennis Fischer: If you're listening, please get in touch with David and help yourselves out.
David Brumley: Just make the world feel better about IoT. I think a telling sign is, if you ask people in enterprise where's your budget for securing all that IoT stuff you're using, they'll say "we don't have any". Then you ask, "Is it important?" They're like, "yeah, it's important". It's like, "Well then where's your budget?" They're like, "We have no budget". There's something to be said about following the money to find out what's really important to people.
Dennis Fischer: I wonder if that's a function of it being a relatively new technology, even though, honestly it's not. There's been internet connected "things" for a long time that weren't traditional computers. I wonder if it's a matter of people not knowing where to stick it in a budget.
David Brumley: It could be. Some organizations are compliance driven. They tend to be behind the curve and wait for someone else to solve the problem. It feels like IoT is in that bucket as opposed to being proactive in security.
Dennis Fischer: Yeah, it certainly does. All of this technology that people talk about as IoT just got thrown onto the internet within the last, say, 10 years. Literally nobody was thinking about the consequences of having your entire home connected to the internet with default passwords and no network security in front of it. We've seen what the results have been, and they're not good.
David Brumley: I don't know if it's true, but I had a theory on some of this. When you rip apart these IoT devices, it looks like they use the cheapest available developer -- the cheapest labor possible to write the code. We've even seen cases where it looks like the code is copied. Maybe manufacturer A hired a developer and then manufacturer B hired the same developer. The code is the same. You have this compounding effect for security.
Dennis Fischer: I've seen that in some of the vulnerability and security research reports I've read from guys who have done exactly that. They're comparing code bases and they're like, "Well, I've seen this giant block of code before. I know who wrote this and it's not good." I think you're right. People are going and grabbing what they can to get these devices out into the market as quickly as they can. That's not a recipe for success in terms of security.
David Brumley: No, it's not. A lot of what we're trying to solve at ForAllSecure, it's a long path. We're looking at it as our mission is to automatically check the world's software. We want to be able to take the software that you're going to run and allow you to check it so you don't just have to trust the developer. From my point of view, if we could put real pressure on companies like, "I'm not just going to make a buying decision on this. I'm actually going to evaluate it myself." Then, we can help improve security. Maybe I'm a cynic, but I think people say they want security, but they're not actually given the tools to determine whether or not there is security baked in (to the product they're buying). And you have to give people that before we can have a change.
Dennis Fischer: So you're thinking in terms of: I'm an enterprise CISO and I'm thinking about buying software package A before I buy it. Then, I'd have the chance to run your tool against it and see what it looks like, almost like a home inspection.
David Brumley: Yeah, absolutely. Like a home inspection. And it could be us, or it could be a third party, but someone who has done an independent assessment of it would go a long way.
Dennis Fischer: That's an interesting model. I think it's something that the industry could use because people just don't have that opportunity. They're usually buying black box software or hardware that they really don't know what's inside and they have no real chance to look at the source code or the inner workings of it. They essentially have to take the seller's word on its utility and its reliability.
David Brumley: Whether it's fit for purpose. What's interesting is, in the CGC, we found vulnerabilities and we were able to automatically patch them even without the developer. It turned out the automatic patching didn't really have a market opportunity. Not because of technology limitations; it was actually very robust. It was because of the lawyers and the EULAs, where the EULA would prevent people from doing those independent assessments, or would prevent you from patching your own software even if it had a vulnerability. Then it would become unsupported. I didn't think about that as an academic. It was surprising to me that it's baked into the legal agreements that, even if you wanted to, you couldn't.
Dennis Fischer: I mean there's some of that going on in the IoT world too, right now... this whole right-to-repair thing. Am I allowed to fix a problem I know that I have in this device? Or is that going to void my warranty, and then I'm screwed?
David Brumley: I'm completely on board with the right to repair, obviously. When you buy something, you should at least be able to independently test it and fix it. That's not taking rights away from the developer or the company. That's just giving you the rights you should have when you handed over your cash.
Dennis Fischer: If I buy a car and I know how to replace the brakes, replace the spark plugs, or do whatever it is, I've got the right to do that. It doesn't have any detrimental effects on my liability.
David Brumley: Yup. And that's all been independently tested, or you can test your brakes yourself. Why would it be different with the telematics units that control whether you can brake or accelerate? Why can't you test those yourself? Why can't you look for a Consumer Reports review on it? I think that's the world that we want to create.
Dennis Fischer: I like it. Are there any limits that you guys are bumping up against in terms of types of applications or anything like that that just don't kind of fit this model?
David Brumley: There are some, and it's just a matter of continuing R&D. I don't think there are fundamental barriers. The sorts of techniques we used, everyone used in the DARPA Cyber Grand Challenge. They really try to emulate what an attacker would do. The way you exploit Windows is different than the way you exploit Linux, which is different than the way you exploit an embedded operating system. We have to look at each operating system and port the product, or the technology, to it. At that point it's more of an engineering problem. You know how to do it. It's just a question of whether there are enough people out there who want it that we'll put the engineering effort into it.
Dennis Fischer: That's a good point. I wanted to ask you about what it was like when you were working at CyLab. How did you get into that job and what was the day-to-day work there like?
David Brumley: One of the things that happened by design at Carnegie Mellon is we started having all these security faculty in different departments, because we know security is cross-disciplinary. We had people in public policy, people in computer science, people in engineering, and people in business school interested in this. The university created an institute to house all those efforts instead of leaving them in all these different schools, uncoordinated. They created CyLab to be the institution where we could all work together. Day-to-day, a lot of it was trying to set up and maintain that idea of cross-disciplinary collaboration, and it was pretty interesting. I had a chance to talk to public policy experts in cybersecurity, which wasn't my background, and really understand what drove their policy, or talk to the computer scientists, which I love. Then I'd also talk to people in the business school: "Hey, what are the economic incentives that we can create to do A or B?"
Dennis Fischer: Yeah, CyLab. It may not be the first one like that, but it's the first one that I can remember having the kind of influence and impact that it has had over the years. The volume of talent and brain power that's gone through that and come out of there is really impressive.
David Brumley: I don't know if we're the biggest either, but the best compliment you can get is when someone emulates you. We've had a number of people say they set up their institution to be like CyLab.
Dennis Fischer: There's quite a few of them around the country now, and there's some others around the world too. I think it's the right approach because, as we've talked about this entire podcast, security is not just a technical problem. It's a human problem at its core. The humans developing the software need to think about things in one way, the business people need to think about things in another way, the economists need to think about it in another way, and the policymakers and so on down the line. If you have all these people working together in one place, thinking about ways to solve the same problems, you've got an advantage.
David Brumley: Absolutely. Like I said, what are the incentives? We've done research on everything from cyber deterrence. For example, what is the equivalent of mutually assured destruction in cyber? One of my students, Tiffany Bao, was amazing at this. She got a best paper award from the NSA and is now a professor at ASU. That drilled down into these questions about economic incentives: how can you make it so security is seen as a value and not as a cost? I saw a really interesting talk once that said, "Look, if you look at the impact of crypto, think TLS and SSL, that's created the cyber economy for purchases, which we're all feeling now. We buy everything online. If we hadn't had that security research and development, you wouldn't have that economy. Today, that online economy is the size of the GDP of Spain." That's an example of where security really enabled something. It wasn't just a cost.
Dennis Fischer: Unfortunately, some organizations still think about it as a cost center, the department of "no," or however you want to put it: telling you not to do things, that you're not allowed to do that, or that you're doing it the wrong way. It's certainly a lot more effective to show people the ways they can do something safely rather than saying, just don't do that at all.
David Brumley: You can do it safely. What follows from safety is robustness. When we go and talk to people about cybersecurity, a lot of times the most effective arguments are that you're going to have better uptime, you can handle more customers, and you'll have fewer unplanned incidents. It's not about "what if someone stole your information?" It's that you can do what you're doing, and you can do it better and more often.
Dennis Fischer: That's a good way to look at it, because usually what people respond to is the money part of it. You can make more money if you do things this way.
David Brumley: I truly believe it. This isn't just a pitch. It's security actually allows you to do what you want to do without having to worry.
Dennis Fischer: Right. That's the general idea. Unfortunately, that's not the way it's portrayed a lot of the time. Well, David, I really enjoyed this, man. Thanks so much for giving me so much time today. This was a fun conversation. We'll have to do it again sometime soon.
David Brumley: It was great talking to you, Dennis. Stay safe, man.