
Live Panel: Application Security Debt

Why security must become an integral part of preparing software

Jeff Martin, vice president of product for Mend, was recently interviewed by Michael Vizard from the Techstrong Group.

In a fascinating conversation on application security debt, the two shone a spotlight on the shortcomings of many companies’ current security posture and the budgetary pressures that may be influencing them. They also discussed how the growing complexity of components has increased the pressure on security, the role developers must now play in security considerations, and the tension between productivity/speed and security that affects how developers and DevOps teams safeguard their code.

Jeff outlined his view on why security must now be an integral part of good software, how far automation and remediation of vulnerabilities can currently go, and the importance of making security a vital part of the education of the next generation of developers. Finally, he touched upon the business case for more robust security practices, saying that the combination of regulatory issues and customer demand is escalating companies’ needs and requirements for better application security.

Full transcript below.

MV We’re here with Jeffrey Martin, who’s vice president of product for Mend, and we’ll be talking about what goes on with application security these days and all things vulnerabilities. Jeffrey, welcome to the show.

JM Nice to be here.

MV It feels like we kind of don’t talk about this enough but there’s a significant amount of technical debt when it comes to application security out there. A lot of folks will build and deploy applications. They kind of tell themselves that they’ll get to some sort of vulnerability that they know about eventually. A lot of times they don’t know about them and they don’t even know they exist and nothing gets done at all. So, as they used to say back in the vernacular, are we cruising for a bruising here, and are things going to get worse?

JM You know, it’s funny. I was talking about this repeatedly for the past month or so. Er, yeah, things are going to get worse. We have, I think, a bumpy time ahead, especially with some of the economic pressures, and companies that aren’t as mature with their AppSec may view security as a cost center. So, I think that not only is our general maturity in the application security discipline a little bit insufficient right now, I think in the next year or so we are going to have a lot of cost pressures that are really going to force people to prioritize what they’re spending money on.

MV So what is the fundamental problem? I mean I don’t think people get out of bed in the morning and think “Let’s go and build an insecure application.” So where does everything kind of break down?

JM So I think there are two big avenues for that. The first is increasing complexity. Our applications are becoming more and more complex. So, fundamentally, especially with the rise of things like cloud native, you end up really stitching together an application. It’s a dynamic configuration of a bunch of services and containers and deployments all running together. I’d liken it to, instead of building a car – putting together a bill of materials, putting it all together and then driving it down the road – nowadays the car’s parts are changing as you’re driving it. And that just creates this huge complexity problem. It’s hard to keep track of all the pieces, never mind the security of all the pieces. And then the other avenue is frankly it’s still a relatively – let’s compare it to quality – immature discipline that hasn’t been fully automated, hasn’t been scaled, in the same way that other kinds of application risk have been.

MV As we go along, does the relationship between the security teams and the developers need to change, or is it changing, or are these two cultures that are just never going to come together, and do we have to just throw everybody in a room and lock the door and see what happens?

JM Look, I think there’s traditionally been a little bit of a handover. Throw it over the wall. It gets thrown back. I think that’s changing. I think it’s changing for a few different reasons, but two of the big ones are these. One: cybersecurity professionals are becoming thin on the ground. There’s not enough of us, and so developers are having to take on some of that responsibility. So it’s getting democratized a little bit, just due to the scarcity of professionals who can focus on it. And then the other reason is: I think everybody hates that “throw it over the wall, throw it back” cycle. It’s kind of a lose-lose for everyone and, to be very blunt about it, it can turn into a lot of finger-pointing, or “they’re slowing me down.” So, both sides are actually coming together.

I do think in the long term the roles will start collapsing into each other: cybersecurity provides guidance and education and stops being a blocker, as the developers pick up the security burden themselves. And so security becomes an enabler, not a blocker.

MV Truth be told, a lot of developers use that excuse about productivity as a reason not to be so good at security, right? They’re basically saying, “this is somebody else’s problem and I’ve got to write all this code quick, and I’ve got to meet all these deadlines.” So, the question is: can we build applications at the rate we are on, maybe even faster, and build them securely?

JM It’s funny because developers are, after all, a kind of engineer, and there’s that old engineering maxim, “Do you want it fast, cheap, or good? Pick two.” And the fact of the matter is, that’s not sufficient anymore. I think they tend to view it that way: “Oh, I could spend a ton of money and do it quickly. I’m going to sacrifice ‘good’: quality and security.” I think slowly everybody is realizing that the definition of good software includes secure software. It’s right up there with, “Does it do what I mean it to? Does it function?” And the excuse of “It slows me down” is not a good excuse anymore. It stopped being a good excuse a while ago. In the same way, I can’t say, “Oh, you want the software to work? That’s going to slow me down.” It just doesn’t track. You know, the definition of ‘work’ includes being secure.

MV Developers would also sort of expect that a lot of these issues are going to be handled by the platforms and the CI/CD pipelines, and all this stuff is going to get automated, and it should just fix the code when it’s a low-level, trivial thing. So, you know, how automated are we getting, to the point where maybe we don’t have to trouble these developers with this pesky little security issue?

JM Yeah. I wish it worked that way. Even as a tool vendor myself, I will tell you tools can’t solve everything right now. It’s too hard a problem set. You can’t automate away all the security. You can’t basically hit an easy button and then everything you deploy is secure. It is a discipline and it has to be checked. That said, there is a lot going on, on the platform side. You know, when I deploy into Google Cloud or AWS or Microsoft Azure, there are a lot of security features present there, and it does create this odd hand-off of what’s my problem as a developer versus the problem I’m contracting out as part of my cloud services, at least for modern applications. But no, there’s no easy button, still. The fact of the matter is, the main thing we can automate right now, with a few exceptions, is the detection of issues: SAST findings, SCA findings, misconfigurations. And only recently have we been able to automate things like prioritization of these issues and remediations. And as somebody on the cutting edge of auto-remediation, I can tell you, it’s not a cure-all yet. We’re probably a good five years, ten years away from that.
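[Editor’s note: the automatable detection Jeff describes – SCA in particular – boils down to matching an application’s dependency inventory against a database of known advisories. The sketch below illustrates that core loop; the package names, versions, and advisory IDs are hypothetical examples, not real vulnerability data or any vendor’s actual API.]

```python
# Minimal sketch of the SCA-style detection step: compare pinned
# dependency versions against a (hypothetical) advisory database.

PINNED_DEPS = {
    "log-formatter": "2.14.0",
    "yaml-parser": "5.3.1",
    "http-client": "1.0.2",
}

# Hypothetical advisories: package -> list of (vulnerable_version, advisory_id)
ADVISORIES = {
    "log-formatter": [("2.14.0", "EXAMPLE-2021-0001")],
    "yaml-parser": [("5.1.0", "EXAMPLE-2020-0042")],
}

def scan(deps, advisories):
    """Return (package, version, advisory_id) tuples for known-bad pins."""
    findings = []
    for pkg, version in deps.items():
        for vuln_version, advisory_id in advisories.get(pkg, []):
            if version == vuln_version:
                findings.append((pkg, version, advisory_id))
    return findings

for pkg, version, advisory in scan(PINNED_DEPS, ADVISORIES):
    print(f"{pkg}=={version} matches {advisory}")
```

Real tools add version-range matching, transitive-dependency resolution, and – as Jeff notes – the harder, newer parts: prioritization and remediation.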

MV So, you don’t think AI’s going to come along and save us all from ourselves any time soon?

JM No. Kind of in general, period, but also specifically in the AppSec space. Computers are very good – AI is very, very good – at handling big data quickly and deriving insights from it, but the fact of the matter is, security is not quite that deterministic. Like every other form of risk management, it requires judgement calls about what you find to be acceptable risk. And fundamentally, the main people we’re protecting you against are other humans, not computers.

MV Do you think we need to retrain all the developers? And I ask the question because if you’re in college and you’re studying to become a developer, or wherever else you’re going, security is an elective. It’s not a requirement, and anything that’s an elective is pretty much not taken by people who are in a hurry to go to work, so you know, maybe the fundamental flaw is that we’ve just allowed security to become an elective for all these folks.

JM Yeah, it’s funny. I think the biggest problem, and I touched on it briefly, is that code quality isn’t an elective; it’s part of how you’re taught to code. Your code needs to work. Your code also needs to be secure. That needs to be part of the basic education of all developers, and I think a lot of companies are beginning to do that. The schooling systems and how we train developers prior to actually working in enterprises are not, and small or medium-sized businesses, you know, they’re just doing the best they can, without much of a focus on it. But at least the big enterprises are doing a lot of on-the-job training, and that’s why I think the role of a cybersecurity professional in those organizations is really turning into things like education. But, boy, it would be nice if everybody was as educated on secure code as they are on writing quality code, coming out of school.

MV Well speaking of that, we have all these wonderful DevSecOps processes that people are trying to build. There is this sermon that’s given. Everybody nods their head in agreement. Then they go off and do whatever it is they normally do. And I guess the question I’m trying to drive at a little bit is, do we need to find a way to kind of make that stick in a way that is a cultural change and what does it take to do that, besides standing up in front of everybody and saying, “Thou shalt”?

JM So I think it actually goes back to an earlier question you asked, which is: are things going to get worse or better, or where are we going? I think we’re probably still a couple of shocks to the system away – big breaches, big violations, cyber warfare – something that sort of wakes everybody up a little bit. But I actually see this already happening in the enterprises, the really big ones. They know that it’s just a matter of time before there’s a shock to the system, and they’re doing the best they can to prepare by creating decent practices and doing education. But the fact of the matter is, until the monetary and other kinds of risk they’re taking on by not being fully secure become very apparent to people, I think that, frankly, it’s going to continue along the way it is for a little bit. We’re probably a shock or two away. That said, there are a lot of people obviously advocating standing up and saying “Thou shalt.” Just from my experience, “Thou shalt” works better when it’s “Thou shalt, because look at this problem over here that’s happened. We don’t want it to happen to us.”

MV Alright. We’ve seen those so-called wake-up calls many times in the past, already, and each time they occur, everybody stands up and says, “Yep. This time for sure it’s the wake-up call.” And then about two weeks later everybody rolls over and hits the snooze button again. So, you know . . .

JM The Colonial Pipeline, for example.

MV So I’m not sure these shocks to the system are working in the way that we had hoped or intended. So, I don’t know, is there some other way to think about it?

JM I do think they’re having some results. You know, a lot of the work CISA is doing right now – some of that was driven by the Colonial Pipeline, of course, but a lot of it was in the works before the executive order on SBOMs for federal purchasing that went out about a year ago now. I think there’s some realization, especially at the federal level, that we need to get better, and I do think those are slowly driving us in that direction. You know, I’ve heard the word SBOM more in the last six months than I had in the previous five years. These are steps in the right direction that are driven by shocks, but it is slow progress, I agree. It’s a little frustrating, right? Nobody likes a Cassandra screaming that the sky is falling, but the fact of the matter is that we under-invest in cybersecurity, both in execution and in training.
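[Editor’s note: an SBOM (software bill of materials) is, at its core, a machine-readable inventory of an application’s components. The fragment below hand-builds a minimal document in the shape of the CycloneDX format’s top-level fields; the component listed is a hypothetical example, and real SBOMs are produced by build tooling, not by hand.]

```python
import json

# Minimal CycloneDX-style SBOM document: a format header plus a
# component inventory. The "yaml-parser" component is hypothetical.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "yaml-parser",
            "version": "5.3.1",
            # purl = package URL, a standard component identifier
            "purl": "pkg:pypi/yaml-parser@5.3.1",
        }
    ],
}

print(json.dumps(sbom, indent=2))
```

It is exactly this inventory that lets a purchaser (or a federal agency) ask “do any of my vendors ship the vulnerable component?” without access to the vendor’s source code.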

MV Most business executives when they hear the phrase ‘SBOM’ probably think it’s a new type of cocktail somewhere. So, the question then becomes what do we have to do to get the business side to kind of drive this harder because that’s the thing that the developer side will ultimately respond to?

JM So, as one of those business-side people sometimes myself, I will tell you we’re driven by a couple of things. One is requirements: standards and certifications. And I think a lot of those are going in this direction. We’re already there to some degree in highly regulated industries like medical, but that is slowly spreading out to generalized application development. The same goes for the requirements of the ecosystem you’re playing in. Mobile applications actually have some reasonable security standards, because if you want to be in the Apple store, you have to have a certain amount of security. But the other part is customer demand, and that’s the part that I see really ramping up right now. We all sell our applications to somebody, for the most part, whether it’s another business or sometimes consumers, and those stakeholders are beginning to demand security in, frankly, the same way that they demand software works. Whatever we can say about the lessons learned from SolarWinds, I think one of the big lessons was, “Hey, you’re dependent on your vendors for the applications they provide being secure, and there needs to be a minimal level of things like contractual agreements there.” So I think there is a lot of business ‘push’ towards this. The two people I listen to the most on the business side are compliance – the regulations that I have to comply with – and the people that are giving me money. And so as they demand more security, more transparent security, of course the business side is going to listen. But I do think that’s a journey that’s going to be taken over the next three to five years. It’s not going to happen tomorrow.

MV Do you think we’ve become our own worst enemy, because we embraced Agile programming and then we told ourselves that any time there’s an issue we’ll rapidly fix it, but we never do? And so then the problem becomes this, you know, continuous lifecycle of issues where vulnerabilities never get addressed, and this is becoming part of the problem. So, maybe we need to tell ourselves that we can’t just experiment on the users anymore.

JM It’s funny because it goes back to another question you had earlier, which was: are we kind of making things worse, especially if you think about Agile? As I pointed out, the complexity problem is getting worse. So, if you were going to try to make complexity a bigger issue, you would change things every single day, or multiple times a day. Agile development introduces complexity. Complexity introduces risk. It’s just sort of fundamental to how we’ve decided to approach application development. Rapid development. Lots of changes. Break it fast, fix it fast. The problem is, I don’t care what you’re investing in cybersecurity – if everything keeps changing every day or multiple times a day, you end up creating a problem that you can never catch up to. And I do think that some of the more critical pieces of applications – critical infrastructure, for example – they’ve begun slowing things down a little bit, because security is just too important to risk just to increase the speed of feature delivery. In general, I bet that pendulum swings back a little bit towards slower, more careful releases, until such a time as we can get the automation to a level to support it for real.

MV You heard it here. Unless everybody opens up their windows and starts screaming “I’m not going to take it any more,” tomorrow we’re going to have some serious issues coming forward. So, hold on to your hat. Jeffrey, thank you for being on the show.

Watch the video interview here.

Meet The Author

Adam Murray

Adam Murray is a content writer at Mend. He began his career in corporate communications and PR, in London and New York, before moving to Tel Aviv. He’s spent the last ten years working with tech companies like Amdocs, Gilat Satellite Systems, Allot Communications, and Sisense. He holds a Ph.D. in English Literature. When he’s not spending time with his wife and son, he’s preoccupied with his beloved football team, Tottenham Hotspur.
