Defensive Security Podcast Episode 272





jerry: All right. Here we go. Today is Sunday, July 7th, 2024, and this is episode 272 of the defensive security podcast. My name is Jerry Bell and joining me tonight as always is Mr. Andrew Kalat.

Andrew: Good evening, Jerry. This is a newly reestablished record twice in a week or

jerry: twice in a week. I can’t believe it.

Andrew: I know. Awesome. Yeah, you just had to quit that crappy job of yours that provided income for your family and pets and, you know, everything else. But now you’re an unemployed bum.

jerry: Yeah, I can podcast all I want, 24/7. I think I’m gonna become an influencer. I’m gonna just be live all the time now.

Andrew: You could. I look forward to you asking me to subscribe and hit that notify button.

jerry: That’s right. Hit that subscribe button

Andrew: Like leave a rating and a comment

jerry: Like and subscribe. All [00:01:00] right, getting with the program, we’re getting back into our normal rhythm. As per normal, we’ve got a couple of stories to talk about. The first one comes from Dark Reading, and the title is, A CISO’s Guide to Avoiding Jail After a Breach.

Andrew: Before we get there.

Andrew: I want to throw out the disclaimer that thoughts and opinions do not reflect any of our employers, past, present, or future.

jerry: That’s a great point. Or, my cats.

Andrew: Unlike you, I have to worry about getting fired.

jerry: I still have a boss. She can fire me.

Andrew: That’s called divorce, sir. But true.

jerry: Yeah.

Andrew: Anyway, back to your story.

jerry: Anyway, yeah. A CISO’s Guide to Avoiding Jail After a Breach. This is following on an upcoming talk, I think at Black Hat, about how CISOs can try to insulate themselves from the [00:02:00] potential legal perils that can arise as a result of their jobs. It’ll be interesting to see what’s actually in that talk, because the article itself, in my estimation, despite what the title says, doesn’t actually give you a lot of actionable information on how to avoid jail. They do quote Mr. Sullivan, who was the CISO for Uber.

jerry: And they give a little bit of background on how it’s interesting that he is now a convicted felon, although I think that’s still working its way through the appeals process. He previously was appointed to a cybersecurity board by President Obama.

jerry: And before that he was a federal prosecutor. In fact, as the article points out, he was the prosecutor who prosecuted the first DMCA case, which I thought was quite interesting. I didn’t know that about him. But what’s interesting is that this article is based a lot on [00:03:00] interviews with him, including recommendations on things like communicating with your board and your executive leadership team. I’m assuming that he had done that at Uber.

Andrew: Yeah, this is such a tough one for me. I think it makes a lot of good people, as they reference in the article, want to shy away from being a CISO if there’s this sort of potential personal liability. There are a lot of factors that come into play in why a company might be breached that aren’t always within the control of the CISO, whether it be budget, whether it be focus, whether it be company priorities, and you have an active adversary who is looking for any possible way to get into your environment.

Andrew: So what becomes the benchmark? What constitutes breach negligence up to the point of going to jail is the one that [00:04:00] I’ve struggled with so much. I think those who haven’t really worked in the field can very easily point to mistakes that were made, but they don’t necessarily understand the complexity of the chain of events and chain of decisions that led to that situation.

Andrew: Every job I’ve been in where we were making serious decisions about cybersecurity, it was a budgetary trade-off, a priority trade-off, an existential threat to the company if we don’t do X, Y, and Z. Five or six different organizations at the same time are coming up to that CFO or CEO, who has to make hard calls about where those resources and priorities go to keep people employed. And you pair that with a very hostile third party intentionally trying to breach you. It’s a tough situation, and I don’t think any of us knows what the rules look like at this point to keep yourself out of [00:05:00] trouble. You’ve been in this position; not the going-to-jail part, but this threat was much more meaningful to you in your last role than it is to me.

jerry: It is very uncomfortable. I’ll tell you, when the Uber CISO got charged and the CISO of SolarWinds got charged, it’s an uncomfortable feeling, an exposed feeling. In criminal law, there’s this concept of strict liability.

jerry: And strict liability basically means the thing happened, and because the thing happened and you are responsible for the thing, there are no mitigating factors. Your state of mind, your motivations, none of that matters in a strict liability case.

jerry: And to some extent, it feels like that in this instance. I don’t think it really is, although when you’re a CISO, sometimes that thought can cross your mind. Now, in the article, they actually point out that though the CISO is the [00:06:00] lightning rod when things go wrong, it is not just the CISO who is responsible for what went wrong.

jerry: As they describe it, it takes a community, and the results of that community are, as we’ve now seen, or as is alleged, being pinned on a particular individual. I know this from having read the Uber case; I’m not so familiar with the SolarWinds case, although I’m obviously familiar with what happened at SolarWinds. With Uber, it was a situation where they had, basically, a data breach, and the allegation was that the adversary was trying to hold the data for ransom. They successfully negotiated a payment through [00:07:00] the bug bounty program to the adversaries, and maybe adversaries isn’t the right word, who allegedly deleted the data, and because of that, they didn’t report the breach. At least this is my understanding of how the case went.

jerry: And so it was really the failure to report that breach which the government was coming after him for: basically being deceptive to investors. It’s not necessarily that he was malicious or what have you. My layman’s read is that he was defrauding

jerry: investors by withholding information about a breach that he was obligated to report. That’s a tough situation. And what concerns me is that this is somebody who was a federal prosecutor. As a CISO, I had plenty of competent legal counsel surrounding me.

jerry: And that was a good thing; it felt good. And I’m quite certain he did too. Further, he himself [00:08:00] was a prosecutor. So, and maybe it’s just very naive of me, I have a hard time accepting that he was actually trying to misrepresent things or hide things.

jerry: I guess that’s where I’m at on this one. It feels bad and the article points out that, because of this, one of the, one of the whispers as they describe it in the industry is that it’s forcing people who are qualified for the role and understand the perils that they face to shy away from taking that role.

jerry: And that then leads to people who are maybe not as qualified taking the role and then obviously not doing as good of a job. And therefore actually, the net effect is a weaker security posture.

Andrew: Yeah. If we try to get some advice out of this, the one thing they mentioned is, for lack of a better [00:09:00] term, to tie some other people in the organization to the same decision, right?

Andrew: Make sure that your board is aware and your executives are aware, and that you’re not the only one holding the risk bag at the end of the day. If you have to own the risk yourself, then you need to have formal control. Now, in this case, in theory, he got in trouble because he didn’t notify the SEC; it was a public company and a material breach.

Andrew: So it was more that stockholders weren’t informed than that he was negligent in his cybersecurity duties in terms of technical controls and audits and that sort of thing. However, that feels like the way things are going. We hear more and more calls to hold companies accountable directly and legally, with risk of jail for breaches.

Andrew: There’s a lot of nuance here, and that’s not exactly what happened in this case. But I find that very troubling, and [00:10:00] obviously I have a bias, because I’m in the industry and would potentially be at risk of that. I just don’t think it’s that simple. There’s no CISO who has so much control over an environment that they should be solely responsible for taking the fall if a breach were to happen. Although that does happen all the time, it’s one thing to lose your job; it’s another thing to go to jail.

jerry: Yeah. And I think the author here points out that, at least as Mr. Sullivan describes it, he feels like he was put forward by Uber as a sacrificial lamb. I guess what I don’t really understand is how much better it would have been for him if he had done a more effective job at creating what I’ll loosely call co-conspirators within the company.

jerry: I think what they’re trying to say is that you as a CISO should go to the board, to your CEO, to whoever, and articulate the risk, [00:11:00] not with the intention of them becoming co-conspirators, but of them saying: gosh, now I know about it, and I don’t want to go to jail, so I’m going to reallocate the money or do whatever is required to address the particular risk. Now, in this instance, it wasn’t a case of we have to go spend more money on security. It was more: hey, we had this issue, do we disclose it or not?

jerry: And I think that’s maybe a slightly different take. I would assume, by the way, just again having played in this pool, that he didn’t make that decision alone.

Andrew: Sure. Part of me, and maybe this is not exactly apples to apples, thinks about a lawyer advising an executive on the legality of something. That executive can take that advice or reject it. A CISO advising a company on the legality or outcome or [00:12:00] risk of a decision doesn’t always make the decision. They’re somewhat beholden to their leadership on which way the company wants to go.

jerry: There was an unwritten aspect to this that I wanted to discuss a bit. The subtext of all of this, I think, is going to create an adversarial relationship between the CISO and the CISO’s employer, because it feels to me like what the government would have preferred is for the CISO to run to the government and say: hey, my employer isn’t acting ethically.

jerry: I’m not necessarily saying that’s what happened in Uber’s case or any of these cases, but I think that’s what the government is trying to push. Now, granted, there’s a not-so-gray line beyond which you have an ethical duty to rat on your employer.

jerry: You can imagine all sorts of situations not [00:13:00] even in the realm of security where, you would be obligated to go and and report them. But it feels to me they’re trying to lower that bar.

Andrew: Yeah, I can see that. Unfortunately, this is probably going to be messy to sort out. It’s going to take a lot of case law and a lot of precedent, and that makes me nervous. If I were offered a CISO opportunity at a public company, I’d think real long and hard about passing on it, or about trying to assure some level of security to avoid this problem.

jerry: Our next story throws some sand in the gears there. This one comes from CSO Online, and the title is, US Supreme Court Ruling Will Likely Cause Cyber Regulation Chaos.

jerry: And so unless you’ve been living under a rock, or perhaps just not in the U.S., you’re probably aware that the Supreme Court, I guess it was last [00:14:00] week, overturned what has been referred to as the Chevron deference doctrine. The name comes from the oil company Chevron, and it stems from a 1984, so 40-year-old, ruling by the Supreme Court that I’ll summarize by saying that ambiguous laws passed by Congress can be interpreted by regulators like the FCC, the FDA, the SEC, and so on. In the U.S., at least, a lot of regulations are very high level. To pick a stupid example, a law might say: use strong authentication. And then it’ll be up to a regulator to say strong authentication means that you use multi-factor authentication

jerry: that isn’t SMS-based.

jerry: That initial ruling was intended to establish that courts aren’t experts [00:15:00] in all matters of law.

jerry: And by default, courts should be deferring to these regulators. That has stood the test of time for quite a long time, and now it was overturned in this session of the Supreme Court, whatever you want to think about the sensibility of it.

jerry: I think the challenge that we now have, well, I have made the joke on social media that right now the most promising career opportunity has got to be trial lawyer, because there are going to be all manner of court cases challenging different regulations which, in the past, were pretty well established as following interpretations set by the executive branch in the U.S. But now, as this article points out, that ranges from the SEC’s requirements around data breach [00:16:00] notifications to the Gramm-Leach-Bliley Act of 1999.

jerry: There’s a broad range of regulations in the security space which are likely to be challenged in court, because the text of those laws basically doesn’t cover the way they’re currently being enforced. So we should assume these will be challenged in court, and given the Supreme Court’s ruling, the established interpretation coming out of the executive branch is no longer to be deferred to.

jerry: And it’s unclear at this point, by the way, how courts are going to pick up their new mantle of responsibility in interpreting these things, because judges aren’t experts in security. So I think that’s why they’re calling it chaos right now: we don’t really know what’s going to happen. Over the long term, I think things will normalize.

Andrew: Yeah. Businesses hate uncertainty.

jerry: [00:17:00] Yes.

Andrew: And for good or ill, businesses can have a huge impact on government legislation. So I think this will get sorted out eventually. But you’re right: what we counted on, or at least tried to work out, with these regulatory agencies, and how we understood these rules, has now all changed.

Andrew: There’s probably going to be a ton of these rules that have the force of law being challenged now in court. And I think ultimately Congress has the reins to fix this if they want, but that’s another interesting problem. SCOTUS is saying: look, you regulatory agencies are taking the power of law into your own hands, and we don’t like that.

Andrew: The power of law comes from Congress and elected officials. So, Congress, you need to do a better job of defining these rules specifically. That presents its [00:18:00] own set of interesting challenges, because how well will they do that? We’ve seen a lot of well-intentioned laws, especially in very complex areas, have their own sets of problems because of all the trade-offs that go into legislative work in Congress.

Andrew: So it will be very interesting. This could have a lot of wide-ranging impacts. And again, to your point, I’m not getting anywhere near whether they should or shouldn’t have done this, but I think the intent was: you unelected regulators shouldn’t make law; Congress should make law. But that’s easier said than done.

jerry: Yeah. I think it’s that, plus the Constitution itself very directly says that it is up to the judicial branch to interpret laws passed by Congress, [00:19:00] not the executive branch. And if you read the majority opinion, to sum it up, that’s basically what they’re saying.

jerry: I think the challenge is that when the Constitution was written, it was a much, much simpler time.

Andrew: There are a lot of interesting arguments out there, and a lot of very passionate opinions on this. So I’m trying very hard to stay away from the political rhetoric around it. I concur that this throws a lot of accepted precedent around our industry into question.

jerry: But going back to the previous story, and again, I’m not an attorney, if I were Joe Sullivan, I would feel like I have a new avenue of appeal.

Andrew: Sure. Yeah. The SEC made this rule, in essence, would be his argument, and based on this particular ruling by SCOTUS, [00:20:00] that was an inappropriate rule, or an inappropriate law.

Andrew: And therefore, well, obviously I’m not a lawyer, because I’m not articulating this like a lawyer, but he could say that’s why I shouldn’t have been tried and convicted, and please politely pound sand.

jerry: I do think the opinion said something along the lines of: it doesn’t overturn previously held court cases, and people are due their day in court.

jerry: So if he has an avenue for appeal, that’s how the justice system works. This is hot off the presses; I think the echoes are still circling the earth. We’ll be seeing the outcome of this for a while, and I don’t think we know exactly what’s going to happen next. Stay tuned, and we’ll check in on this periodically.

jerry: Okay. The next one is actually two stories, one from Sansec and one from SecurityWeek. This is [00:21:00] regarding the polyfill.io issue. I’m hesitant to call it a supply chain attack, but I guess that’s what everybody’s calling it.

Andrew: Come on, get on the bandwagon.

jerry: I know, I know.

Andrew: If you want to be an influencer, man, you got to use the influencer language.

jerry: It makes me feel dirty to call it a supply chain attack.

Andrew: So what makes you so uncomfortable calling it a supply chain attack?

jerry: I don’t know. That’s a good question, and the answer is I don’t really know.

jerry: It just feels wrong.

Andrew: Did your mother talk to you a lot about supply chain attacks?

jerry: See that’s, maybe that’s the problem.

Andrew: Okay. Imagine you’re walking in a desert and you come across a supply chain attack upside down stuck on its back. Do you help it? But you’re not turning it over. Why aren’t you turning it over, Jerry?

jerry: I don’t even know where this is going.

Andrew: I had to lighten it up after the last two stories, man. You were being a downer.

jerry: Polyfill is [00:22:00] a JavaScript library that many organizations included in their own websites. Oversimplifying it, it enables some of the newer functions of modern web browsers to work in older versions of web browsers. I don’t fully understand the sanity behind this, and maybe this will start to cause some rethink of how this works, but this JavaScript library is called by reference. Rather than it being served up by your web server, you are referring to it as a remote document hosted, in this instance, on polyfill.io.

Andrew: So instead of the static code living in your HTML, you’re saying go get the code snippet from this host and serve it up.

jerry: Correct. It’s telling the web browser to go get the code directly. What happened [00:23:00] back in February, and I don’t fully understand what precipitated this, is that the polyfill.js library and the polyfill.io domain were sold to a Chinese company. That company then altered the JavaScript library to, depending on where you’re located and other factors, either serve you malware or serve you spam ads and so on.

Andrew: So you’re saying there are not hot singles in my area ready to meet me?

jerry: It’s surprising, but there probably are actually.

Andrew: carry on.

jerry: They can’t all be using polyfill. Anyhow, there were, depending on who you believe, somewhere from 100,000 [00:24:00] websites that were including this polyfill.io code, to tens of millions as purported by Cloudflare. At this point, by the way, the issue is somewhat mitigated.

jerry: I’ll come back to why I say somewhat mitigated. The polyfill.io domain, which was hosting the malicious code, has been taken down, and most of the big CDN providers are redirecting to their own local known-good copies. But again, they haven’t solved the underlying issue: sites are still pointing to JavaScript code that’s hosted by somebody else. Although presumably companies like Cloudflare and Akamai and Fastly are probably more trustworthy than Funnull in China.

Andrew: Yeah, because they actually came out and denied any malicious intent and cried foul on this whole thing too, which was interesting.

jerry: Yes. [00:25:00] But people have done a pretty good job. In fact, the Sansec report gives a pretty thorough examination of what was being served up, and you can very clearly see it’s serving up some domain lookalikes. I find it hilarious: googie-anaiytics.com, which is supposed to look like google-analytics.com, and I suppose if it were in all caps, it would probably look a lot more like it. The other interesting thing is that these researchers noticed that the same company also owns several other domains, some of which have also been serving up malware.

jerry: And those have also been taken down, but there are also others that haven’t been seen serving malware yet and are still active. So it’s probably worth having your threat intel teams take a look at this, because my guess would be that at some point in the future, the [00:26:00] other domains that this organization owns will probably likewise be used to serve up malware.
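The kind of check Jerry is handing to a threat intel team can be sketched in a few lines: pull the external `<script src>` hosts out of a page and compare them against a watchlist of domains tied to the same operator. The watchlist entries and the `flag_external_scripts` helper below are illustrative placeholders; a real list would come from the Sansec report or a maintained indicator feed.

```python
import re
from urllib.parse import urlparse

# Placeholder watchlist: substitute real indicators from your threat intel feed.
WATCHLIST = {"polyfill.io", "googie-anaiytics.com"}

SCRIPT_SRC = re.compile(r'<script[^>]+src=["\']([^"\']+)["\']', re.IGNORECASE)

def flag_external_scripts(html: str) -> list[str]:
    """Return the external script hosts in `html` that match the watchlist."""
    flagged = []
    for src in SCRIPT_SRC.findall(html):
        host = urlparse(src).hostname  # None for relative paths like /js/app.js
        if host and any(host == d or host.endswith("." + d) for d in WATCHLIST):
            flagged.append(host)
    return flagged

page = """
<script src="https://cdn.polyfill.io/v3/polyfill.min.js"></script>
<script src="/js/local-app.js"></script>
"""
print(flag_external_scripts(page))  # ['cdn.polyfill.io']
```

A real scanner would inspect rendered pages (script tags are often injected dynamically) rather than raw HTML, but the comparison logic is the same.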

Andrew: Bold of you to assume that all of us have threat Intel teams.

jerry: Fair enough. It just may be you.

Andrew: Correct. Me and Google.

jerry: Yes.

Andrew: And my RSS feed of handy blogs, but yes,

jerry: that’s right,

Andrew: but yeah, they seem to have, oh, a wee bit of a history of being up to no good.

Andrew: This particular Chinese developer.

jerry: Yes. Defending against this, I think, is pretty tough beyond what I said on the supply side. I think it’s a bad idea. Maybe I’m a purist; maybe I’m old school and should be put out to pasture. But I think it’s risky, as we’ve seen many times now.

jerry: This is far from the first time this has happened: including by reference things [00:27:00] hosted as part of some kind of open source program. I’m not necessarily picking on open source there; I think it happens less often with commercial software. But we’ve now seen it happen quite a few times with these open source programs, including things like browser extensions and whatnot.

jerry: Now, having said that, you can imagine a universe where this existed simply and solely as a GitHub repo, and companies, instead of referring to polyfill.io, were downloading the polyfill code to their own web servers. Most likely you would have between a hundred thousand and 10 million websites serving locally hosted copies, but then again, nobody updates

Andrew: Right? They wouldn’t be impacted, but they’d be running 28-year-old versions.

jerry: So maybe not.
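The self-hosting universe Jerry describes amounts to rewriting the reference so the browser fetches your own vetted copy instead of a third-party host. A rough sketch; the local path is a hypothetical location you would choose after downloading and reviewing the file:

```python
import re

def rehost_polyfill(html: str, local_path: str = "/js/polyfill.min.js") -> str:
    """Rewrite script tags that load from polyfill.io to a locally hosted copy.

    `local_path` is a hypothetical location; you would first download and
    review the library, then serve it from your own origin.
    """
    pattern = re.compile(
        r'(<script[^>]+src=["\'])https?://[^"\']*polyfill\.io/[^"\']*(["\'])',
        re.IGNORECASE,
    )
    return pattern.sub(lambda m: m.group(1) + local_path + m.group(2), html)

before = '<script src="https://cdn.polyfill.io/v3/polyfill.min.js"></script>'
print(rehost_polyfill(before))
# <script src="/js/polyfill.min.js"></script>
```

This only moves the trust problem, of course: you now own updating that copy, which is exactly the "nobody updates" trade-off the hosts are joking about.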

Andrew: Yeah, but boy, to your point, it gives me a little bit of the [00:28:00] heebie-jeebies to say that the website you’re responsible for is dynamically loading and serving content that you don’t have control over. But that’s perhaps very naive of me.

Andrew: I don’t do much website development, so I don’t know if that’s common, but as a security guy, that makes me go: ooh, that’s risky. We don’t control that at all; some third party does, and we’re serving it to our customers, or to visitors who come to our website, and we just have to trust it. That probably exists in many other aspects of a modern supply chain or a modern development environment, where you just have to trust it and hope that people are picking up any sort of malicious behavior and reporting it, as they did in this case, which is helpful.

Andrew: But then it causes everybody to scramble to find where they’re using this, which then goes to: hey, how good is your software bill of materials or software asset management program? How quickly can you identify that you’re using this?

Andrew: And there was a lot of confusion when this first came out, because there are different [00:29:00] instances of polyfill; some were impacted, some were not. What truly was at risk? The upside is that the domain was blackholed pretty quickly. Anyway, it seems so fragile, right? You’ve got this third-party code that you don’t control. You don’t know who’s on the other end. You’ve probably forgotten it’s even out there, especially since this is defunct code. And that’s a whole other area that drives me a little crazy at night: how do you know when open source software is no longer being maintained and has silently or quietly gone end of life, and you should be replacing it? I’ve contemplated things like: hey, if there hasn’t been an update within one year, do we call that no longer maintained?

Andrew: I don’t know. I don’t have a good answer. I play around with that idea with my developers, because we want to make sure that the third-party code we’re using is well maintained and up to date. We don’t want end-of-life code in general, but I don’t know what [00:30:00] constitutes end of life in open source anymore.
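Andrew's one-year idea is easy to operationalize once you have a last-activity timestamp per dependency (forges like GitHub expose last-commit and release dates through their APIs). A sketch of just the rule itself, with the cutoff as a parameter since a year is admittedly an arbitrary threshold:

```python
from datetime import datetime, timedelta, timezone

def looks_unmaintained(last_activity: datetime,
                       now: datetime,
                       threshold: timedelta = timedelta(days=365)) -> bool:
    """Flag a project whose last commit/release is older than the threshold.

    This is a heuristic, not a verdict: a stable project can be quiet for a
    year and still be perfectly fine, as Andrew notes.
    """
    return (now - last_activity) > threshold

now = datetime(2024, 7, 7, tzinfo=timezone.utc)
print(looks_unmaintained(datetime(2023, 1, 15, tzinfo=timezone.utc), now))  # True
print(looks_unmaintained(datetime(2024, 5, 1, tzinfo=timezone.utc), now))   # False
```

Combining last-activity with other signals (open issue response time, release cadence) would reduce false positives on slow-but-healthy projects.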

jerry: I think we will eventually see some sort of health rating for open source projects. And that health rating will be based on things like: where are the developers located in the world? How long on average does it take for reported vulnerabilities to get fixed? How frequently are commits and releases of code being made? And other things like that. But that doesn’t necessarily mean a whole lot. Look at what happened with, what was it, XZ.

Andrew: Yeah. Yeah.

jerry: That was arguably, well, I won’t call it healthy, right?

jerry: But it was an active project that had a malicious contributor who found ways of contributing malicious code that were difficult to discern. And then you look at what happened with OpenSSL and then OpenSSH, and [00:31:00] it’s not a guarantee, but I think

jerry: it would be good to know that, hey, you have code in your environment that is included by reference, and it was just bought by a company who’s known to be a malicious adversary. We don’t have any way of doing that today.
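If a health rating like the one Jerry predicts ever materializes, it would presumably fold metrics like the ones he lists into one number. Purely as an illustration: the weights, scaling, and thresholds below are invented, and as the XZ discussion shows, no such score would be a guarantee.

```python
def health_score(median_days_to_fix: float,
                 commits_per_month: float,
                 maintainer_count: int) -> float:
    """Toy 0-100 health score built from the kinds of metrics discussed.

    Weights and scaling are invented for illustration only; a real rating
    scheme would need far more care (and still could not catch an XZ-style
    malicious insider).
    """
    fix_part = max(0.0, 1.0 - median_days_to_fix / 90.0)   # faster fixes score higher
    activity_part = min(1.0, commits_per_month / 20.0)     # active repos score higher
    bus_part = min(1.0, maintainer_count / 5.0)            # more maintainers, less bus risk
    return round(100 * (0.5 * fix_part + 0.3 * activity_part + 0.2 * bus_part), 1)

print(health_score(median_days_to_fix=10, commits_per_month=40, maintainer_count=2))
# 82.4
```

Even a crude score like this only helps on the supply side; as Jerry says next, the consumer still has to know to go look at it.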

Andrew: So you want like a restaurant health inspector to just show up and be like, all right, show me your cleanliness.

jerry: They so I think that we will get there.

Andrew: You want a sign in the window: this restaurant-slash-GitHub repository earned a B-minus, but has great brisket.

jerry: Sometimes you just have to risk it; good brisket is good brisket. So I think that’s going to happen, but what that doesn’t solve is the demand side. Even with the supply side covered, you still have to know to go look for the health score.

Andrew: Or have some sort of tooling, some third-party [00:32:00] software security suite that scans your code and alerts you on these things, in theory. And I’m sure, by the way, there are probably vendors out there that think they do this today and would be happy to pitch us on their solution.

jerry: Oh, I feel quite certain that my LinkedIn DMs will be lit up with people wanting to come on the show to talk about their fancy AI-enabled source code analyzers.

Andrew: But it’s just one more thing devs now have to worry about, and security teams have to worry about. And it’s in competition with developing new features, new functionality, and fixing bugs. This is now just one more input, which competes for priorities, which is why it’s not that simple.

jerry: It’s very true. Way back when I was a CISO.

Andrew: You mean two weeks ago?

jerry: Way back. The way I had always characterized it is that using open source software is like adopting a puppy. You can’t ignore it; it needs to be cared for. You have to feed it and clean up after [00:33:00] it and walk it and whatnot. I don’t think that is a common approach. I think we typically consume it as a matter of convenience and assume that it will be good forever. I think we’re starting to get better about developing an inventory of what we have through SBOMs, and that of course will lead to better intelligence on what needs to be updated when it has a vulnerability, and that’s certainly goodness. But I think the end-to-end process in many organizations needs a lot of work.

Andrew: Yeah. I also think this is never going to go away for companies. Rightly or wrongly, we’ll always be reliant on third-party open source software now, so we’ve got to figure this out. And this is also a relatively rare event; given the hundreds or maybe thousands of open source projects that people use regularly,

Andrew: This doesn’t happen very [00:34:00] often.

jerry: It’s the shark attack syndrome: you hear about it every time it happens, so it seems like it happens often. But when it does happen, it can be spectacular.

Andrew: It’s interesting, because when these things hit a certain level of press awareness, it also drives third-party risk management engagement between vendors. Inevitably, at least in my experience, when we see something like this hit, if you’re a vendor to other businesses, you will see their third-party risk management teams spinning up questionnaires to their suppliers: hey, are you impacted by this, and what’s your plan?

Andrew: Which then drives another sense of urgency and reaction. That may be false urgency that’s taking your resources away from something more important, but you can’t really ignore it. The urgency goes up when customers are demanding a reaction; whether or not it’s truly the most important risk you’re working doesn’t matter.

jerry: Having come from a service provider, I [00:35:00] lived that pain, and I’m sure you do too. You have to deal with it both ways. You have your own customers who want you to answer their questions, and then you have your own suppliers, and if for no other reason than to be able to answer your customers’ questions with a straight face, you’ve got to go and ask them. I think one of the challenges with that is: where does it end? I’m a supplier to some other company, and I have suppliers, and they have suppliers, and they have suppliers; it’s turtles all the way down. Even assuming everybody acted responsibly and got their vendor questionnaires out right away, how long would it take to actually be able to authoritatively answer those questions?

jerry: I don’t know. I think there’s a lot of kabuki dance, if that’s an appropriate term.

Andrew: It’s executives saying, we have to do something, go do something. [00:36:00]

jerry: That’s true.

Andrew: And so the risk management folks, or the third party risk manager, or whoever, do something, and then they can point: hey, look, we did something.

Andrew: We’re waiting for responses back from Bob’s budget cloud provider.

jerry: There’s a lot of hand wringing that goes on. I will also say, having worked in certain contexts, you may end up with small suppliers who may not know they have to go do something.

jerry: And so your questionnaire may in fact be the thing that prompts them to take action, because their job is to deliver parts. They’re not a traditional service provider; they have some other business focus.

jerry: In those instances, it could very well be, like you said, that not everybody has a threat intel team, and you are in fact telling them they have something to worry about. It doesn’t make it any less annoying, though, especially if you have a more robust security program in place. Because I don’t [00:37:00] know, in my experience, that anything genuinely beneficial has come from those vendor questionnaires, other than potentially, like I said, the occasional case of telling a supplier who was otherwise unaware.

Andrew: I think it breeds a false sense of security that you’ve got a well managed supply chain and a well managed third party risk program.

Andrew: I question the effectiveness.

jerry: Yeah I can agree with that.

Andrew: So, not to be too cynical about it, but then I always wonder: what are you going to do? Okay, let’s play this out. How soon could you shift to another provider? Let’s say you ask me, and I’m running Bob’s Budget Cloud Provider.

Andrew: Do I have polyfill? And I say, I don’t know. What are you going to do? Are you going to cancel your contract? Maybe you’re going to choose to go someplace else, but that’s going to take time. Yeah, it could influence your decision to renew or continue [00:38:00] business or whatnot. But

jerry: I think what you’re trying to say, and I agree, is that it doesn’t change the facts for that particular situation.

Andrew: Right yeah. And do you want me to spend time answering your questions or go fixing the problem?

jerry: I want you to do both, dammit. That’s the customer’s view: what do I pay you for?

Andrew: I don’t know. It’s a tough spot. I don’t have a really warm fuzzy about these sorts of fire drills that get spun up around big media InfoSec events.

Andrew: I think it’s the shark attack again: do you have sharks in your lagoon? Maybe.

jerry: I feel like this whole area is very immature. It’s a veneer that, in most instances, I think is worse than useless, because it does create a false sense of security.

Andrew: Yeah, I agree. And how do you know I’m not lying to you when I fill out your little form?

jerry: That’s the concern. If they were lying and there was a breach, you as the [00:39:00] customer would crucify them in the media, or in a lawsuit.

Andrew: Yeah, at the end of the day it either becomes a breach of contract or, I don’t know, I’m not a lawyer. I haven’t fully articulated my thoughts on this yet, but there’s something I’ve just never really felt was very effective or useful about these sorts of questionnaires that go out around well publicized security events.

jerry: Yeah, I agree. I think there is likely something sensible in it as a consumer.

jerry: It is helpful to know the situation with your suppliers and how exposed you are, because your management wants to know: hey, what’s my level of exposure to this thing? And you don’t want to turn your pockets inside out and say, I don’t know. But at the same time, I’m not sure that the way we’re doing it today is really establishing that level of reliable intelligence. The last story comes from Tenable; the title is How the regreSSHion Vulnerability Could Impact Your Cloud Environment. The [00:40:00] regreSSHion is cutely spelled with the SSH capitalized. This regreSSHion vulnerability was a recently discovered and disclosed vulnerability in OpenSSH.

jerry: I think it affects versions released between 2021 and as recently as a couple of weeks or months ago, and it can, under certain circumstances, allow for remote code execution. So, kind of bad.

Andrew: Yeah, unauthenticated remote code execution against OpenSSH that’s open to the world.

Andrew: Correct, but it’s not that easy to pull off.

jerry: Correct. There are a lot of caveats, and it’s not necessarily the easiest thing to exploit. I think they say it takes about 10,000 authentication attempts, and even with that, you have to know the exact version of OpenSSH and information about the platform it’s running on, like [00:41:00] whether it’s 32 bit or 64 bit, et cetera.

Andrew: Yeah. And I think those tests were against 32 bit. It’s much tougher against 64 bit, because you’ve got to get the right address collision in memory, is my understanding. Take that with a little grain of salt, but that was my understanding.
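As a concrete aside (not from the episode or the Tenable post): a first-pass triage for a vulnerability like this is often just checking banner versions against the range public advisories list as affected. The sketch below assumes that range is "before 4.4, or 8.5 up to but not including 9.8"; treat that as an assumption to verify, and remember that distros frequently backport fixes without bumping the upstream version, so a banner check is a hint, not a verdict.

```python
# Rough sketch: flag OpenSSH banner versions falling in the range publicly
# reported as affected by regreSSHion (CVE-2024-6387). The range used here
# (< 4.4, or 8.5 <= v < 9.8) is an assumption taken from public advisories;
# vendor backports mean a "vulnerable" banner may already be patched.
import re

def parse_openssh_version(banner: str):
    """Extract (major, minor) from a banner like 'SSH-2.0-OpenSSH_9.6p1'."""
    m = re.search(r"OpenSSH_(\d+)\.(\d+)", banner)
    if not m:
        return None
    return int(m.group(1)), int(m.group(2))

def looks_vulnerable(banner: str) -> bool:
    v = parse_openssh_version(banner)
    if v is None:
        return False  # unknown banner: can't say either way
    return v < (4, 4) or (8, 5) <= v < (9, 8)

print(looks_vulnerable("SSH-2.0-OpenSSH_9.6p1"))  # True
print(looks_vulnerable("SSH-2.0-OpenSSH_9.8p1"))  # False
```

A scanner would grab the banner off port 22 and feed it through something like this; the point is only that version and platform knowledge is the attacker's first hurdle too.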

jerry: But not impossible. And the point of this post is that OpenSSH is exposed everywhere.

jerry: Like, it’s everywhere. And they point back to cloud for two reasons, I think. Reason number one is that cloud incentivizes, or makes it really easy and in some instances preferable, to expose SSH as a way of managing your cloud systems. And in those instances it’s almost always going to be OpenSSH. Unless it’s RDP, then it’s all good.

Andrew: It’s much preferred.

jerry: RDP is way better.

Andrew: There’s a GUI. There’s pictures.

jerry: There’s pictures. That’s right.

Andrew: A mouse works.

jerry: How [00:42:00] much better could it get? And then the other reason they’re picking on cloud providers is that as a consumer, with most cloud providers, you’re provisioning your servers using images provided by the cloud provider. Those images may not be updated as frequently as they should be, so when you provision a system, it’s quite likely to come vulnerable right out of the gate, and you’ve got to get in there and patch it right away.

jerry: You’ve got to know that’s your responsibility, and that it’s not actually protected by the magic cloud security dust.

Andrew: At least not your cloud. Maybe Bob’s Budget Secure Cloud is. I don’t know, that joke didn’t work out. But you make an interesting point. I was talking to somebody about this, and I was trying to make the point that when we started doing this stuff pre-cloud, because we’re old, [00:43:00] the concept of something being exposed to the internet was a big deal. Everything was in a data center behind a firewall, typically. And if you wanted to expose something to the internet, like an SSH port or an HTTP or HTTPS port, there were usually a lot of steps to go through, and most companies would also make sure you were hardening it and that it really needed to be exposed.

Andrew: But with cloud, and I think you referenced this, it’s exposed by default. Most of the time there’s not this concept of a thick firewall where only the most important, well vetted, well secured things get exposed to the internet. There is no more quote unquote perimeter. Everything’s just open to the internet. And that’s the way the paradigm is taught now with a lot of cloud providers: there isn’t necessarily a concept of private stuff in the cloud versus public stuff. It’s just stuff. And yeah, they talk about limited ACLs and only opening the ports you have to, that sort of thing.

Andrew: But I think it’s super easy and super simple for people to just build something: I’ve got to [00:44:00] get to it, so open SSH, or literally RDP, and do what they’ve got to do. And to your point, most of these images are not hand rolled. It’s some sort of image you grab off a catalog and spin up, and it probably has a bunch of vulnerable stuff in it.

Andrew: But SSH we think of as safe-ish; even security folks say, only have SSH open. This, to me, speaks more and more to the fact that your attack surface still matters. You still shouldn’t be exposing stuff to the internet that doesn’t need to be exposed, because you never know when something like this is going to come along, even on your quote unquote safe protocols.

Andrew: So the less you have exposed, the less you have to worry about. Now, I’m not saying the only stuff that gets attacked is what’s open to the internet; we know that’s not true. But it’s one more [00:45:00] hurdle the bad guy has to get through, and again, it buys you more time to manage stuff if it’s not directly exposed as an attack surface to some random guy coming from China.

jerry: So the recommendations coming out of this are a couple. First, obviously, make sure you patch this vulnerability. Second, when you’re using cloud services and provisioning systems with a cloud provided image, make sure you keep them patched; even newly provisioned systems are probably missing patches, and they need to be patched posthaste. Then, limiting access: they talk about least privilege, and they talk about it on two axes.

jerry: The first axis is network access to SSH: not everything should have access to SSH. It’s not a bad practice to go back to the bastion host approach, a relatively untrusted system that you use as a jumping off point to get deeper into the network, so that you don’t have every one of your systems’ SSH exposed to the internet. It gives you [00:46:00] one place to patch, and a lot more ability to focus your monitoring and whatnot. The other axis they point out is that in the context of cloud providers, you can assign access privileges to systems. If your system is compromised, the attacker inherits all the access you’ve given that system through your cloud provider. That could be access to S3 storage buckets or other cloud resources that aren’t directly on the compromised system; because that system was delegated access to other resources, it provides basically seamless access for an adversary to get to them. And that’s another, in my view, benefit of the relatively untrusted bastion host concept: it doesn’t have any of those privileges associated with it.
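The "limit network access to SSH" axis is easy to audit mechanically. Here is a minimal sketch, not from the episode, using made-up rule dicts that only loosely resemble what a cloud API (for example EC2's describe_security_groups) returns; the field names are illustrative assumptions, not a real provider schema.

```python
# Minimal sketch: flag any firewall/security-group rule that exposes
# port 22 to the whole internet. Rule shape (group, from_port, to_port,
# cidrs) is an illustrative assumption, not a real cloud API schema.
def ssh_open_to_world(rules):
    flagged = []
    for rule in rules:
        covers_22 = rule["from_port"] <= 22 <= rule["to_port"]
        open_to_world = "0.0.0.0/0" in rule.get("cidrs", [])
        if covers_22 and open_to_world:
            flagged.append(rule["group"])
    return flagged

rules = [
    {"group": "web", "from_port": 443, "to_port": 443, "cidrs": ["0.0.0.0/0"]},
    {"group": "app", "from_port": 22, "to_port": 22, "cidrs": ["0.0.0.0/0"]},
    {"group": "bastion", "from_port": 22, "to_port": 22, "cidrs": ["203.0.113.0/24"]},
]
print(ssh_open_to_world(rules))  # ['app']
```

In the bastion model described above, only the bastion's group would accept port 22, and only from known management ranges; everything this check flags is a candidate for moving behind the bastion.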

Andrew: Yeah, it’s a tough sell. I don’t think most cloud [00:47:00] architects think about it that way at all.

jerry: You are absolutely right. They don’t think about that until they’ve been breached. And then they do. And I can authoritatively say that, given where I came from.

Andrew: That’s fair. And part of the goal of this show is to try to take lessons. So you don’t have to learn the hard way.

jerry: There is a better way. And no, it’s not as convenient. But not everything we used to do back in the old days, when we rode around on dinosaurs, was a bad idea. Certain things are probably still apt even in today’s cloud based world.

jerry: I think one of the challenges I’ve seen is, how best to describe it, the bastardized embracing of zero trust. In concept it’s a great idea. But it’s like the whole NIST password guidance that came out a couple of years ago, where people looked at it and said: oh, NIST says I don’t need to change my [00:48:00] passwords anymore. It does actually say that, but in the context of several other things that need to be in place. Zero trust likewise presumes certain other things. I think where zero trust starts to break down is when you have vulnerabilities that allow the bypassing of those trust enforcement points.

Andrew: Yeah. If you can’t trust the actual authentication and authorization technology involved, zero trust is dependent upon it. I think the takeaway for me is that you can never get to zero risk, but you never know when you might have to rapidly patch something really critical.

Andrew: And are you built to respond quickly? Can you identify quickly? Can you find it quickly? And can you patch it quickly? That’s the question.

jerry: And you can make it harder or easier on yourself. Design choices you make can make that harder or easier.

Andrew: Yeah, as well as how you run your teams. One thing I’ve often tried to instill in [00:49:00] the teams I work with is: I can’t tell you what vulnerabilities are going to show up next quarter, but I know something will. So you should plan for 10 to 20 percent of your cycles to be unplanned, interrupt driven work driven by security.

Andrew: If you’re committing all of your time to things that aren’t security, when I show up it’s a fire drill. But I know I’m going to show up, and I know I’m going to have asks, so plan for them. Even if I can’t tell you what they are, a smart team will reserve that time as an insurance policy. But that’s a tough sell.

Andrew: Yeah. They don’t always buy into it, but that’s my theory. I try at least to explain it and get them to buy in. Sometimes it works, sometimes it doesn’t.

jerry: All right. I think I think with that, we’ll call it a show.

Andrew: Given the weather gods are fighting us today.

jerry: Yeah, I see [00:50:00] that it’s starting to move into my area, so it’ll probably be here as well. So thank you to everybody for joining us again. Hopefully you found this interesting and helpful. If you did, tell a friend, and subscribe.

Andrew: And buy something from our sponsor today, sponsored by Jerry’s llamas,

jerry: The best llamas there are. All right.

Andrew: I feel like all the podcasts need that. Use our code at Jerry’s big llama box dot com.

Andrew: I’m just going to stop before this goes completely off the rails.

jerry: That happened about 45 minutes ago.

jerry: So just a reminder, you can follow the podcast on our website at defensivesecurity.org. You can follow Lerg at

Andrew: Lerg, L-E-R-G, on both X/Twitter and infosec.exchange/Mastodon.

jerry: And you can follow me on infosec.exchange at Jerry. And [00:51:00] with that, we will talk again next week. Thank you.

Andrew: Have a great week, everybody.

Andrew: Bye bye.



Defensive Security Podcast Episode 268



jerry: [00:00:00] All right, here we go. Today is Sunday, July 17th, 2022, and this is episode 268 of the Defensive Security Podcast. My name is Jerry Bell, and joining me tonight as always is Mr. Andrew Kalat.

Andy: Hello, Jerry. How are you, sir?

jerry: great. How are you doing?

Andy: I’m doing good. Nobody else can see it, but I see this amazing background that you’ve done with your studio, and all sorts of cool pictures. Did you take those?

jerry: I did not take those. They are straight off Amazon, actually.

jerry: I’ll have to post a picture at some [00:01:00] point, but the pictures are actually sound absorbing panels.

Andy: Wow. There are jokes; I’m not going to make them. But anyway, I’m doing great. Good to see ya.

jerry: Awesome. Just a reminder that the thoughts and opinions we express on the show are ours and do not represent those of our employers. But, as you are apt to point out, they could be, for the right price.

Andy: That’s true. And, by the way, what that really means is you’re not going to change our opinions; you’re just going to hire them.

jerry: Correct. Right, sponsor our existing opinions.

Andy: Someday that’ll work.

jerry: All right. So we have some interesting stories today. The first one comes from SCMagazine.com. The title is Why SolarWinds Just Might Be One of the Most Secure Software Companies in the Tech Universe.

Andy: It’s a pretty interesting one. I went into this a little [00:02:00] cynical, but there’s a lot of really interesting stuff in here.

jerry: Yeah, there is. I found a couple of things interesting. One is very obvious: this is a planted attempt to get back into the good graces of the IT world. But at the same time, it is very clear that they have made some pretty significant improvements in their security posture, and I think for that it deserves a discussion.

Andy: Yeah, and not only improvements: they also have this strong appearance of transparency and sharing lessons learned, which we appreciate.

jerry: Correct. The one thing, and we’ll get into it a little bit, is that they still don’t really tell you how the thing happened.

Andy: Aliens.

jerry: Obviously it was aliens. They did tell you what happened, though. In the [00:03:00] article, the CISO of SolarWinds describes that the attack didn’t actually change their code base. The attack wasn’t against their code repository; it was actually against one of their build systems.

jerry: And so the adversary here was injecting code at build time, basically. So it wasn’t something they could detect through code reviews; it was being added as part of the build process. And by inference, they had pretty good control, or at least they assert they had good control, over their

jerry: source code, but they did not have good control over the build process. In the article they go through the security uplifts they’ve made to their build process, which are quite interesting. I would describe it as three parallel build channels run by three different teams.

jerry: And at the [00:04:00] end of each of those, there’s a comparison, and if the outputs don’t match, they know something is wrong. They call it a deterministic build. Their security team does one build, a DevOps team does another, and the QA team does a third, all building

jerry: the same set of code, so they should end up with the same final product. Each of the environments is self contained; they don’t commingle, and they don’t have access to each other. So there should be a very low opportunity for an adversary to get access to all three

jerry: environments and do the same thing they did before without being detected at the end, when the comparison between the three builds is done. It’s a novel approach; I hadn’t thought about it before.

jerry: My first blush was that it seemed excessive, but the more I think about it, [00:05:00] it’s probably not a huge amount of resources, so maybe it makes sense.
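The comparison step at the end of the three channels can be sketched in a few lines. This is a toy illustration of the idea, not SolarWinds' actual tooling: each independently produced artifact is hashed, and the build is trusted only if all three digests agree. Real reproducible builds also require pinning toolchains, timestamps, and build paths, which this skips.

```python
# Toy sketch of a three-way "deterministic build" check: hash the artifact
# from each independent build channel and require all digests to agree.
import hashlib

def digest(artifact: bytes) -> str:
    return hashlib.sha256(artifact).hexdigest()

def builds_match(security_build: bytes, devops_build: bytes, qa_build: bytes) -> bool:
    # A set collapses identical digests; exactly one element means all agree.
    digests = {digest(security_build), digest(devops_build), digest(qa_build)}
    return len(digests) == 1

clean = b"\x7fELF...product-v1.0"
tampered = clean + b"<injected>"
print(builds_match(clean, clean, clean))     # True
print(builds_match(clean, tampered, clean))  # False: one channel was altered
```

The security property is the one discussed above: an attacker who controls only one build environment changes one digest and trips the comparison, so compromise has to span all three isolated channels at once.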

Andy: Yeah.

Andy: And also, they mentioned that three different people are in charge of it. So to corrupt it, or somehow inject into all three, would take corrupting three different individuals, somehow, some way.

jerry: Yeah, the three teams would have to collude.

Andy: Yeah. Which is difficult.

jerry: Yeah. Yep, absolutely.

jerry: So I haven’t looked into it, but they say they’ve actually open sourced their approach to this, what I’ll just call the multi channel build. I thought that was interesting.

jerry: It’s a good read. They talk about how they changed from their prior model of one centralized SOC under the company CISO to three different SOCs that monitor different aspects of the environment. They went from having a kind of part time

jerry: red team to a [00:06:00] dedicated red team focused on the build environment. I will say the one reservation I have is that this kind of feels like they’re fighting the last war. All the stuff they’re describing is very focused on addressing the thing that failed last time.

jerry: And are they making equal improvements in other areas?

Andy: Could be. I would say they’re stuck in a bit of a pickle here, where they need to address the common question: how do you stop this from happening again? That is what most people are going to ask them, it’s what the government is asking them, and it’s what customers are asking them. So they’re somewhat forced to deal with that problem right there, whether or not that’s the most efficient use of resources; they have no choice. But I also feel like a lot of the changes they made to their build process would catch [00:07:00] a great many other supply chain type attack outcomes.

Andy: It seems to me, anyway.

jerry: Fair. Fair enough.

Andy: It’s also interesting because a lot of these things are easy to explain at a high level, but I bet there’s a lot of devil in the details they had to figure out. They mentioned that they halted all development of new features for seven months and turned all attention to security.

jerry: Yeah. It sounded like they moved from an on-prem dev and build environment to one up in AWS, so that they could dynamically create and destroy them as needed.

Andy: Yeah, it’s interesting. The fundamental concept this article is raising is: hey, once you’ve been breached

Andy: and you secure yourself, do you have a lower likelihood of being breached in the future? Are you like Dell? You have the board’s attention now, you have the budget now, you have the people, and you now have the mandate to secure the company.

Andy: And is that true?

jerry: I think it is situational. [00:08:00] I’m drawing a blank; I think it’s one of the hotel chains, and I don’t want to say the wrong name, but I believe there are also readily available instances where the contrary is true: they just keep getting hacked over and over.

Andy: And I sometimes wonder if that has to do with the complexity of their environment and the legacy stuff in it. I don’t know anything about SolarWinds’ internals, but I’m guessing they have a fairly modern IT footprint that may be somewhat easy to retrofit, as opposed to a hotel chain,

Andy: which probably has some huge data centers that are incredibly archaic in their architecture and design.

jerry: That’s a good point. It’s a very good point. It’s a different, it’s very different business model, right?

Andy: And they talked about how they’re spending. They’ve got three different tiers of SOCs now, outsourcing two of them. They’re spending a crap ton of money on security.

jerry: Yes.

Andy: With CrowdStrike watching all their endpoint [00:09:00] stuff, as they mentioned here; I’m sure CrowdStrike appreciated that. Their own tier three SOC. They’ve got a lot going on, and they also note that their customer retention rates are back up in the nineties, which is pretty good. So, I don’t know. Clearly this is a PR thing.

Andy: But at the same time, I really do appreciate a company that’s gone through this sharing as much as they’re sharing, because the rest of us can learn from it.

jerry: Yeah, absolutely.

Andy: The other thing that’s interesting: I look at this because I work for a software company now. It’s a small company, nothing the size of these guys, and we don’t have the resources they have. But I think about how many points in our dev chain could probably be easily corrupted in a supply chain attack

Andy: that they’re stopping with their model. And I wonder: how much of this could you do on a budget? There’s a huge amount of people involved here, and a huge amount of [00:10:00] red tape and bureaucracy and checks and balances that must add tremendously to the cost.

Andy: It would probably slow things down a little, and you’d probably get pushback if you just showed up at your dev shop and said, hey, we’re doing this now, without having gone through this sort of event. So what I’m dancing around here is the concept of culture. Post breach, you now have a culture that is probably more willing to accept what could be perceived as a draconian security mandate over how they do things,

Andy: as opposed to pre breach.

jerry: Yeah. It probably doesn’t scale down very well.

Andy: Yeah.

jerry: With the overhead they’ve poured on, yeah. They also point out in the article that it remains to be seen how well SolarWinds carries on. But, like you said, it does seem like they’ve definitely taken this and learned from it. And not only learned from it, but, as we see in this article, they’re trying to [00:11:00] help the rest of the industry learn, which is, by the way, what we’re trying to do here on the show. Kudos to them for that.

Andy: Yeah. I also wonder how many other development shops will learn from this and adopt some of these practices, so they’re not the next supply chain attack. Because that’s really where the benefit comes.

jerry: Yeah. Yeah, absolutely.

Andy: Yeah.

jerry: All right. On to the next story, which comes from Computer Weekly. The title here is Log4Shell on its way to becoming endemic. So after President Joe Biden’s cyber executive order in, I think it was, 2021, the US government formed this cybersecurity, what is it called, the Cyber,

Andy: Safety Review Board.

jerry: Safety Review Board. I couldn’t remember the S.

Andy: Yeah.

jerry: Which I think was modeled after the [00:12:00] NTSB, or what have you. They released this report last week, which describes what happened, or at least their analysis of what happened, in the Log4j incident last year. And so I have mixed emotions

jerry: About this one. One of the key findings is that open source development doesn’t have the same level of maturity and resources that commercial software does. On the one hand, one of the promises of open source was that many eyes make

jerry: bugs shallow, which I think we’ve seen is not really holding water very well. But the other problem is that it asserts open source developers are uniquely making security mistakes in their development. [00:13:00] Last I checked, every single month for the past 20 plus years, Microsoft has released a

jerry: set of patches for security bugs in their software, and they are not open source. So what’s a little frustrating to me is that it feels like they didn’t address the elephant in the room, which was not necessarily that the open source developers here did

jerry: a bad job, or didn’t understand how to code securely. It’s self evident that they made some mistakes. But the bigger problem is the fact that it was rolled up into so freaking many other open source and non open source packages, and multi tiered, right? It’s

jerry: combined into a package that’s combined into another package, that’s combined into another package, that’s [00:14:00] combined into this commercial software. And the big challenge we had as an industry was figuring out where all that stuff was. And then, even after that, trying to beat on your vendors

jerry: to come to terms with the fact that they actually have Log4j in their environment, and then having to make these painful decisions: do we stop using, for instance, VMware, because we know they have Log4j and they haven’t released the patch? At the time, that is; they have since, by the way.

jerry: But that, I think, is the more concerning problem. Not just for Log4j: when you look across the industry, we have lots of things like Log4j that are managed by either a single person or a very small team on a best effort basis. They serve some kind of important function, and they just keep getting

jerry: consolidated. And I don’t [00:15:00] think there’s a real appreciation for how pervasively some of these things are being used. They do talk in the recommendations about creating a better bill of materials for software, which I think is good. But that’s still coming at it the wrong way,

jerry: It seems to me we need to be looking for hotspots and addressing those hotspots, and I’m just not seeing that. It’s concerning to me.

Andy: what do you mean by hotspots?

jerry: Hotspots in terms of potentially poorly managed, or, that’s not the right way to say it, less well managed open source packages that have become super ingrained

jerry: in the IT ecosystem, like Log4j, like OpenSSL has been, and some of the others: Bash, and so on.

jerry: We see this come and go. But at the end of the day, I don’t know that we have a good handle on where those things are. So we’re just going to continue to get [00:16:00] surprised when some enterprising researcher lifts up a rug nobody’s looked under before and realizes: oh gosh, there’s this piece of code that was managed by

jerry: a teenager in the proverbial basement, and they’ve since moved on to college and it’s not being maintained anymore, but it’s being used by everybody and their dog.

jerry: We don’t seem to be thinking about that problem, at least in that way.

Andy: Yeah. You said something early on about how open source is less rigorous in its controls than commercial software, but I think it’s very fair to say that the vast majority of commercial applications are reusing tons of open source in their code, right?

Andy: The kind of odd implication there is that commercial entities write everything from the ground up, when that’s not true. Now, here’s the flip side: if I’ve got a well known, mature, vetted [00:17:00] package

Andy: that does its job well, which I can include in my software package, I could potentially save myself a lot of bugs and vulnerabilities, because that package has been so well vetted. In theory, right?

jerry: A hundred percent. Yep.

Andy: It’s like writing your own encryption algorithm: bad idea. There’s a whole litany of people who’ve been ruined because they thought they knew better. And that’s a really hard problem to solve. So I think there’s value in having almost like engineering standards, the way there are standard strengths of concrete

Andy: that get reused because they’re a known quantity, as opposed to: hey, we’re just going to invent some new concrete and give it a whirl. I see it a little bit like that. But I agree with you. I also wonder how often

Andy: dev shops can spare someone whose whole job is to dig deep into the ecosystem of all the packages they pull in when they do their development, and know the life cycle of those [00:18:00] to the level we’re talking about, versus: hey, that’s a solved problem, I’ll just pull it off the shelf and move on.

jerry: I think that is the very issue as I see it. That is the problem, because I don’t think most companies have the ability to do that.

Andy: What are you thinking, like a curated market of open source tools that are well maintained?

jerry: I think we’re headed in that direction. I don’t love the idea, by any stretch; I don’t mean to imply that I do. But I don’t see a good alternative. And the reason is that, like you said, as the developer of an application, whether it’s open source or not,

jerry: You want to use? You don’t want to recreate something that’s already existing and you want to use something that’s reliable. I think that one of the problems is that. These smaller pieces of open source. Technology like I have a strong feeling that like when the, when log4j started out, they didn’t expect that they were [00:19:00] going to be in every fricking piece of commercial and open source software out there.

jerry: It just happened. It happened.

jerry: over time. And. And.

jerry: I just think there was little consideration on both sides of the equation for what was happening. It was just happening and nobody really was aware of it.

Andy: It’s not like the log4j team was like, come on, use me everywhere. And there’s a little bit of, hey, I wrote this, it’s up to you; if you want to use it, that’s on you.

jerry: Yeah. It’s there. Caveat emptor.

Andy: So, yeah, it’s a tough problem. I don’t know that the software bill of materials is your solve either. I know a lot of people are talking about it, and I know that it helps, but.

jerry: I think it helps insomuch as if you, as a manufacturer of software or even as a consumer, have an SBOM that goes all the way down, which by the way is itself pretty tricky, then when something like log4j hits [00:20:00] it becomes much easier to look across your environment and say, yep, I’ve got it there and there.

Andy: Yeah.

jerry: That’s what I have to go fix. You’re also dependent on your closed source commercial software providers doing a similar kind of job. So I think there’s a coming set of standards and processes that the industry is going to have to get to, because this problem isn’t going to go away. It’s going to continue to get worse.
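As a hedged illustration of what that look-across can become once you do have an SBOM: many build tools emit CycloneDX-style JSON, which at its core is just a list of components, so scanning for an affected package is a few lines of Python. The SBOM contents below are invented for the example; a real file would come from your build tooling.

```python
import json  # a real SBOM would be loaded with json.load(open(path))

def find_component(sbom: dict, name: str) -> list:
    """Return (name, version) pairs from a CycloneDX-style SBOM
    whose component name contains the given string."""
    hits = []
    for comp in sbom.get("components", []):
        if name.lower() in comp.get("name", "").lower():
            hits.append((comp.get("name"), comp.get("version")))
    return hits

# A toy SBOM standing in for one exported by a build tool.
sbom = {
    "bomFormat": "CycloneDX",
    "components": [
        {"name": "log4j-core", "version": "2.14.1"},
        {"name": "jackson-databind", "version": "2.13.0"},
    ],
}

print(find_component(sbom, "log4j"))  # → [('log4j-core', '2.14.1')]
```

Point the same loop at every SBOM you hold and the "where do I have log4j" question becomes a grep, instead of an email campaign to your vendors.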

jerry: And some enterprising government like Australia or India or the US is going to stuff a solution none of us would like down our throats, or we’re going to have to come up with something.

Andy: Yeah, you’re not wrong. It’ll be interesting to see how it plays out. Now that the genie’s out of the bottle, you’ve got to assume some of these big cybercrime syndicates, or whatever term you want to use, are attempting to replicate this.

jerry: Oh, a hundred percent. They’ve got to be looking around saying, what [00:21:00] open source components exist pervasively, and what would be easy-ish for me to take over slash compromise, so that I could roll up into as many environments as I can? That would be super convenient as an adversary.

jerry: So anyway, there’s lots more to come on that. I do think we’re going to see a lot of hyper-focus on source code supply chain and open source coming, and I fear that it’s going to be largely misguided, at least for a while.

Andy: Fair enough.

jerry: All right. The next story comes from Bleeping Computer, and this is a fascinating one. The title is Hackers Impersonate Cybersecurity Firms in Callback Phishing Attacks.

Andy: Clever people.

jerry: We have a story here about an adversary, or maybe multiple adversaries, who have become super enterprising and are [00:22:00] sending letters to unwitting employees at different companies. I don’t know how well targeted this is; there’s really not a lot of discussion about that. But in the example they cite, they have a letter.

jerry: I think it comes by way of email, on CrowdStrike letterhead. And it basically says, hey, CrowdStrike and your employer have this contract in place, we’ve seen some anomalous activity, you and your company are beholden to different regulatory requirements, and we have to move really fast, so we need you to call this phone number to schedule an assessment. And unlike a lot of these things, by the way, it’s pretty well written. I would like to think that if I got it, I would say, that’s BS. But it really is well written; it’s not full of grammatical errors, and it kind of makes sense.

jerry: And apparently if you follow the instructions, [00:23:00] the hypothesis is that it will lead, unsurprisingly, to a ransomware infection, because they’ll install a remote access trojan on your workstation and then use that as a beachhead to get into your company’s network.

Andy: Yeah. I hate to say it, but another good reason why you shouldn’t let your employees just randomly install software.

jerry: Yes.

Andy: And you have to assume there’ll be some. This is where I struggle, by the way, with social engineering training. I really do believe, and it’s not a moral failure, it’s not an intelligence failure, it’s a psychological weakness in how human beings’ brains work, that these bad guys are exploiting, and they will find some percentage of people in certain circumstances that will fall for these sorts of efforts. And you’ve got to be resilient against that. I don’t think you can train that risk away.

jerry: Yeah, I would say that it’s [00:24:00] perilous to think that you can train it away, because then you start to think that when it happens, it’s the failure of the person. And I actually think that’s the wrong way to think about it. Obviously, you want to do some level of training.

Andy: Sure.

jerry: If for no other reason, you’re obligated to do that by many regulations and whatnot. But also you want people to understand what to look for; it helps in the long run. At the end of the day, though, we have to design our environments to withstand that kind of issue, right?

Andy: Yeah.

jerry: If our security is predicated on someone recognizing that a well-written email on CrowdStrike letterhead is fake, we have problems.

Andy: Yeah. If you’re ever going to be taken down by one errant click from an employee, that I think is a problem you need to solve.

jerry: Yeah. And that’s a failure on [00:25:00] our IT and security side, not on the employee side.

Andy: Yeah.

jerry: So anyway, be on the lookout. I hadn’t heard of this before; it makes total sense in hindsight, but it’s something to be on the lookout for.

jerry: All right. The last story we have comes from Cybersecurity Dive, one of my new favorite websites, by the way; there’s good stuff on there. The title is Microsoft Rollback on Macro Blocking in Office Sows Confusion. So earlier in the year, Microsoft made a much-heralded announcement that they were going to block macros in Microsoft Office files that originated from the internet. And that was borne out, by the way: some researchers have said that as much as two thirds of [00:26:00] the attacks involving macros fell away. So a pretty effective control.

jerry: Microsoft announced last week that they were reversing course and re-enabling macros, I assume because CFOs everywhere were in full meltdown that their fancy spreadsheets were no longer working. And obviously we should assume that the attacks are going to be back on the upswing. Apparently this is a temporary reprieve; it’s a little unclear when Microsoft is going to re-enable the blocking. But I have a strong feeling that a lot of organizations have taken a breather on this front, because Microsoft solved it for us, and now we need to be back on the defensive.

Andy: Yeah, I’m really curious what the conversation was like that forced them to reverse course. What broke that was that big of a deal, that was so imperative? Because this has been a [00:27:00] problem for at least 15 years with Microsoft.

jerry: Yeah.

Andy: At least. This was a pretty big win, and now it’s kind of getting rolled back. So I was disappointed.

jerry: I think there are some links in here. You can actually go back and re-enable the blocking through group policy settings, if you’re so inclined; probably a really good idea. As an IT industry, I think we’re worse off for this change until they re-enable it.
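For reference, the group policy in question is “Block macros from running in Office files from the Internet,” and it ultimately writes a per-application registry value. Here is a sketch of what an equivalent .reg import might look like for Word and Excel; the paths assume Office 2016+ policy keys, so verify against Microsoft’s documentation for your Office version and deployment before relying on it.

```reg
Windows Registry Editor Version 5.00

; Enforce blocking of macros in files carrying the internet
; Mark of the Web, even after Microsoft's rollback of the default.
[HKEY_CURRENT_USER\Software\Policies\Microsoft\Office\16.0\Word\Security]
"blockcontentexecutionfrominternet"=dword:00000001

[HKEY_CURRENT_USER\Software\Policies\Microsoft\Office\16.0\Excel\Security]
"blockcontentexecutionfrominternet"=dword:00000001
```

The same value exists for the other Office applications, and in practice you would push this through the Office administrative templates in group policy rather than raw registry edits.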

Andy: Yeah. This is without knowing all the reasons behind it. This feels like such a pure example of productivity versus security sort of trade off and playing out in real time.

jerry: Yeah, I can almost guarantee you that’s what’s going on.

jerry: So yeah, that is a little concerning. Definitely be on the lookout.

Andy: Indeed. We’ll see what happens to be continued. Stay tuned.

jerry: To be continued. And that is [00:28:00] the story for tonight. Just one little bit of editorial. I spend a lot of time during the week reading different stories; I have all kinds of Google alerts set up for different security stories and whatnot to help pick what we talk about on these podcasts.

jerry: And it is amazing to me how many stories that are couched as news are actually basically marketing pieces.

Andy: Yeah.

jerry: I know that we’ve talked about this in the past, but it is alarming. I’ve actually gotten to the point now where I drop down to the end to see what they’re going to try to sell me before I get too invested in the article.

Andy: I look at who wrote it, and if they’re not a staff writer, if they’re a contributing writer, chief marketing officer from blah, blah, blah, I’m like, nope.

jerry: Yeah.

Andy: I very quickly just stop reading it if it’s something written by an employee of a vendor of some variety. [00:29:00] And I don’t mean to be that harsh about it. It’s just that there’s a bias there: they believe their own marketing and their own dog food, and they’re clearly pushing the problem they know how to solve.

jerry: Yeah, they’re characterizing the problem as something their offerings can solve.

Andy: Right.

jerry: And I think it’s certainly an understandable position, but I’m concerned that, as an industry, where do we go to get actual best practices? Because if everything you read is written by a security vendor, the best practices are install CrowdStrike, install Red Canary, install McAfee, install...

Andy: You bring up an interesting side point, which is that I’m seeing some movement in the cyber insurance industry where they’re basically saying, at the broadest level, for those that are less sophisticated: these are the three EDRs we want you [00:30:00] to have one of, and if it’s not one of these three, you don’t get premium pricing.

jerry: Oh, that’s interesting.

Andy: And you’re like, wow. Especially because it’s such a blanket statement, and so many environments are different. I’m not passing judgment on the efficacy of those three vendors, which is why I’m not naming them. It’s more that it feels like a very un-nuanced opinion, a very blunt instrument being applied there.

jerry: Yeah, and it also ignores a whole spectrum of other stuff that you should be doing.

Andy: That’s just their EDR table stakes, right? Which is all coming very much from ransomware. They’re just getting their asses kicked on ransomware payouts, and so they’re asking, what will stop ransomware?

jerry: Fair enough. That’s a fair. That’s a fair point.

Andy: Back to your point about so many marketing pieces masquerading as infosec news, I think that’s very true. And on that note, I want to thank today’s sponsor, Bob’s Budget Firewalls.

jerry: [00:31:00] We have proudly, I think, cleared 10 years of no vendor sponsorship. No sponsorship of any kind, other than donations.

Andy: Yes, which we appreciate.

jerry: All right, that is the show for this week. Happy to have done two weeks in a row now. Got to make a habit of this.

Andy: I know this is great. I appreciate it.

jerry: All...

Andy: All four listeners that we still have.

jerry: I moved to a commercial podcast hosting platform, and so we actually now get some metrics, and we have about 10,000-ish listeners.

Andy: Wow.

jerry: Or so.

Andy: Counting the inmates that are forced to listen as part of their correction?

jerry: No, see, I think actually, because that’s a one-to-many thing, there’s probably one stream forcing maybe 500 people to listen.

jerry: And then when they do crowd control, that could be thousands of people.

Andy: That is true.

Andy: I was quite [00:32:00] entertained, and really proud of you, when I found out that your voice was found to be one of the best tools to disperse crowds.

jerry: Hey, we all have to be good at something right.

Andy: It is up there with fire.

jerry: Yeah. Yeah.

Andy: Neck and neck. Better than tear gas. Are you aware of this?

jerry: I was not aware that I had overtaken tear gas.

Andy: It’s impressive, my friend. You should be proud.

jerry: I am. I’m going to go tell them.

Andy: All right.

jerry: Have a good one, everyone.

Andy: Alrighty. Bye.

jerry: Bye.

Defensive Security Podcast Episode 267


jerry: [00:00:00] Alright, here we go. Today is Sunday, July 10th, 2022. And this is episode 267 of the defensive security podcast. My name is Jerry Bell and joining me tonight as always is Mr. Andrew Kalat.

Andy: Good evening, Jerry, how are you? Good, sir.

jerry: I’m doing great. How are you doing?

Andy: I’m good, man. It’s hot and steamy in Atlanta, I’ll tell you that much.

jerry: Yeah. I’ve been back for a month from my beach place. And I think today’s the first day that we’ve not had a heat advisory. [00:01:00]

Andy: Yeah, that’s crazy.

jerry: which it has been brutally hot here.

Andy: Now, when you say beach place, you might have to be more specific, cause you’ve got one like seven beach houses now.

jerry: Well, the Southern most beach house. Yes.

Andy: Yeah. One is the Chateau. One’s technically a compound.

jerry: One’s an island,

Andy: That’s... We’re probably going to have to name them, because they’re tough to keep straight.

jerry: They definitely are. Yup.

Andy: But I, for one, appreciate your new land baron activities, and look forward to Jerrylandia being launched and seceding from the United States.

jerry: Hell. Yeah. That’s right.

Andy: I’ll start applying for citizenship whenever I can.

jerry: Good plan. All right, a reminder; we should probably have already said this, but the thoughts and opinions we express on the show are ours and do not represent those of our employers.

Andy: But for enough money, they could

jerry: Yeah, everything is negotiable. [00:02:00] All right. A couple of really interesting stories crossed my desk recently, and the first one comes from the US Department of Justice, of all places. The title here is Aerojet Rocketdyne Agrees to Pay $9 Million to Resolve False Claims Act Allegations of Cybersecurity Violations in Federal Government Contracts.

jerry: So the story here is that there’s this act, as you can probably tell by the title, called the False Claims Act, that permits an employee of a company that does business with the US government to sue the company under the act, claiming that the company is misrepresenting itself in the execution of its contracts. And if that [00:03:00] lawsuit is successful, the person making the allegation, basically it’s a whistleblower kind of arrangement, gets a cut of the settlement. In this particular case the whistleblower received $2.61 million of the $9 million.

Andy: Wow. So his company, in theory, was lying about their security controls, and he found out about it or knew about it, blew the whistle, and is getting $2.61 million.

jerry: Correct. Correct.

Andy: I have to go check everything in my company. I’ll be right back.

jerry: I’m guessing that his lawyers will probably take about $2 million of the $2.61 million, but hey, it’s still money, right?

Andy: That’s crazy. It reminds me, and probably a lot of our listeners are too young for this, of the days of the Business Software Alliance, turning in your employer for using pirated software. You could get a cut of that, but not in the, you [00:04:00] know, seven-figure range.

jerry: Yeah, this is really quite interesting. And what’s more interesting is that there is apparently some indication that the US government may expand the scope of this to include non-government contracts, perhaps even public companies under the jurisdiction of the Securities and Exchange Commission. I don’t think that’s codified yet.

jerry: It’s probably just hyperbole at this point, but holy moly, it really drives home the point that we need to do what we say and say what we do.

Andy: So what were the gaps, or what were the misses that they said they had?

jerry: I have done a little bit of searching around. I didn’t go through all of the details in the case, and because it was a settlement, there may not be actual details available, but I’ve not been able to find the specifics of what they were not doing.

Andy: Yeah, I did [00:05:00] go and, because I was very curious about this, did a bunch of searching and found some summaries of the case and some of the legal documentation. The best I was able to get to is that there was a matrix of 56 security controls, or something along those lines, don’t quote me on that, and the company only had satisfactory coverage of five to ten of them.

jerry: Oh, wow.

Andy: And there was another one where a third-party pen test got into the company in four hours. It looks like there were a bunch of unpatched vulnerabilities. It’s in legalese, right? So it’s a little tough to translate into our world at times.

Andy: But I’m actually quite curious, and I might want to do some more research to figure out exactly what the gaps were. I guess at the end of the day, they agreed to these things contractually and just didn’t do them.

jerry: Correct. That’s the net of it.

Andy: This is primarily if you’re doing business with the government, the US government?

jerry: Correct, if you have a government contract.

jerry: For now, anyway. And I do think that over time, like I said, my [00:06:00] understanding is that the scope of this may increase.

Andy: I really feel like this is huge. This could open the door. I mean, you and I both know how often those contractual obligations, and the way you answer those questions, are a little squishy.

jerry: Yeah. Optimistic, I think. I think optimistic might be the word.

Andy: That’s fair. But it’s also interesting trying to have federal judges navigate this very complex world. That’s a crazy story. We’ll see where it goes.

jerry: So anyway, it really highlights the point about being very honest and upfront with what we’re doing. And if we commit to doing something, we need to do it.

Andy: Yeah, it just gets fuzzy when there’s business deals on the back end of that answer.

jerry: No, I could completely agree.

jerry: All right. The next story, also pretty interesting, also comes from a US government agency. This one comes [00:07:00] from CISA, the Cybersecurity and Infrastructure Security Agency. I hate the name; I really wish they’d come up with a different one. It has the word security in it way too many times. Anyway, the title here is North Korea State-Sponsored Cyber Actors Use Maui Ransomware to Target the Healthcare and Public Health Sector.

jerry: From an actual threat actor standpoint, there’s not a ton of innovation here. They’re not doing anything super sophisticated that we don’t see in a lot of other campaigns. But what is most interesting is that the US government has attributed this particular campaign to North Korea. And North Korea is one of the most, perhaps the most, heavily sanctioned countries in the world by the US government. So if you, as an entity in the US, somehow support an [00:08:00] organization or person in North Korea, you can be subject to penalties from the US government.

jerry: And the point here is, if you are a victim of this ransomware campaign and you pay the ransom, you may run afoul of those sanctions. In addition to whatever penalties you might incur as a result of the breach, you may actually run into some pretty significant additional penalties as a result of supporting the North Korean government.

Andy: Well, that is an interesting little problem isn’t it?

jerry: Yes, it is. Yes, it is.

Andy: What you need is a shell company to run your ransomware payment through.

jerry: I have a feeling there is a lot of that going on in the world.

Andy: We saw some shenanigans with lawyers doing it as a proxy, using, in essence, [00:09:00] privileged communications to hide it, at least allegedly, in some previous stories we’ve covered. But that’s an interesting problem. I can see how that would be a challenge. Maybe if you only paid the ransom in, like, bulk wheat shipments.

jerry: A barter system.

Andy: Because we send them food.

jerry: That’s true.

Andy: That’s allowed.

jerry: So you recover your data by paying in humanitarian aid.

Andy: I think Twinkies for Data is a perfect campaign we should launch.

jerry: I don’t even know what to say.

Andy: Either pay three Bitcoin, which is now probably worth like 30 bucks, I don’t know, I haven’t checked lately, or two semis full of Twinkies.

jerry: But how are you going to get the Twinkies to them? That’s what I want to know.

Andy: They have ships. They make ships that they go on and they go across the sea and then they take them off the ships. Did you not read the books I gave you?

jerry: Oh, geez. Showing my ignorance. I will say that there are some recommendations down at the bottom. Some [00:10:00] of them are interesting, things that you haven’t seen recommended a lot before, but many of them are just the normal run-of-the-mill platitudes: only use secured networks, avoid using public wifi networks, consider installing and using a VPN.

jerry: I get so tired of the "you should consider doing X." Well, okay, I considered it. You should consider not using administrative rights for your users. Okay, I considered it.

Andy: Well, and the real problem here is that ransomware is not one threat. It is the outcome of...

jerry: Exactly.

Andy: Yeah. That’s why ransomware defense is an interesting problem. Unless you’re actually just trying to stop the pure encryption component of it, how that ransomware starts could be highly varied.

jerry: It’s the end link in the chain, right? Because as they talk about earlier in the [00:11:00] advisory here, this particular Maui ransomware is actually pretty manual. It apparently has to be launched by hand from the command line. So whoever the threat actor is, they found some way into the system. And, assuming CISA actually has that kind of insight, you can infer by reading through their recommendations how they think the North Koreans are getting in: using RDP that’s exposed to the internet, then moving laterally using credentials of users who have administrative rights, and so on. So you can infer, based on what they’re saying not to do, probably how it’s being propagated. But sometimes it’s a little difficult to understand with these kinds of recommendations how much is the result of actual observations versus, yeah, we have this list [00:12:00] of good hygiene practices and we think this is what you should be doing.

Andy: Yeah. I think the problem is that true ransomware defense is highly varied based on the individual company’s stance, platform, environment, and situation. It’s very difficult to roll that into a couple of paragraphs in a generic article.

jerry: Yeah. Yep. Absolutely.

jerry: So anyway, don’t get ransomwared, and if you do, don’t pay off the North Koreans, because you’re going to get a double whammy.

Andy: I still don’t quite understand how the North Koreans are launching these from their Commodore 64s. But maybe we’ll talk about that in another show.

jerry: It is a fair question how they’re coming into possession of them, but I would expect it’s coming in via countries like China and Russia, where they may not have those sanctions in place.

jerry: Which, by the way, I think answers the other question you might ask: well, how are they getting internet access? My understanding is it is [00:13:00] coming through China.

jerry: So the last story today comes from ZDNet, and the title here is These Are the Cybersecurity Threats of Tomorrow That You Should Be Thinking About Today.

Andy: Well, hold on. If I’m thinking about tomorrow’s threats today, when am I thinking about today’s threats?

jerry: Well, you were supposed to think about today’s threats yesterday.

Andy: Oh,

jerry: And you were supposed to think about yesterday’s threats last week.

Andy: I’m gonna have to start over.

jerry: Yeah, well, Look, this isn’t this career. It’s not meant for everybody.

Andy: Is this what they mean by thought leaders?

jerry: I think it is.

jerry: I think it is.

jerry: So the first threat of tomorrow that you need to worry about today is quantum threats. And that’s not like the James Bond Quantum, right? This is quantum computing hacking all your public key crypto, which by the way is probably going to be here sooner than anybody really wants to [00:14:00] admit.

jerry: The world of quantum computing is advancing quite rapidly. And I think the interesting thing to consider is that we are creating massive amounts of encrypted data on a daily basis today, and presumably it’s infeasible to decrypt almost all of it because of the complexity.

jerry: But in the near future that won’t be the case. The things that we’re creating today could be relatively easily decrypted using things like quantum computing. So presumably there are enterprising companies and state actors and whatnot squirreling away encrypted data today.

jerry: Data that will be decrypted somewhere down the line. So at some point, I think even before quantum attacks against cryptography become [00:15:00] technically feasible, we’re going to have to shift to quantum-resistant crypto. Which is going to be interesting, because you’re going out on a limb.

jerry: You’re saying the quantum-resistant crypto that we’re making actually will be quantum resistant, when we don’t have the kind of quantum computers that can verify that hypothesis.

Andy: In fact, NIST just started the process to solicit and evaluate candidate quantum-resistant public key cryptography algorithms.

jerry: Yeah.

Andy: Which, by the way, is not a quick process. I think the last time they updated it, it took three to five years of review. So it’s not a quick endeavor, typically. And I guess the theory behind this is that quantum computers just do math differently, so all of the time variables for how long it would take to break standard encryption today don’t apply, or apply very differently, to quantum computing than they do to our standard type of computing.

jerry: Yeah. [00:16:00] Correct. Conceivably, a sufficiently capable quantum computer could take contemporary public key crypto and break it in a very short time.

Andy: In essence by brute forcing it just much, much, much faster.

jerry: It’s not actually brute forcing. It’s just solving the math.

Andy: Well, I guess what I’m saying is there’s nothing inherently weak in the encryption algorithm; it’s the speed of the computing that’s changing.

jerry: It’s the approach.

Andy: You could break it. You could break today’s encryption as well. It just takes a very long time.

jerry: Well, the strength in the encryption today comes from the fact that we have to basically brute force it, you know, trying to factor numbers. But when you get to quantum computing, you don’t have to actually brute force the problem.

Andy: Hmm.

jerry: You can just solve it. You don’t even have to brute force it to solve it. [00:17:00]
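For the curious, the "solving the math" here is Shor’s algorithm: a quantum computer efficiently finds the period of a^x mod N, and then classical number theory turns that period into the factors. A minimal Python sketch of that classical part follows, with the period found by brute force purely for illustration; the quantum speedup is precisely in not finding it this way.

```python
from math import gcd

def period(a: int, n: int) -> int:
    """Smallest r > 0 with a^r ≡ 1 (mod n). A quantum computer finds
    r efficiently; we brute force it here only to show the arithmetic."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(a: int, n: int):
    """Turn a period of a mod n into factors of n. Works when r is
    even and a^(r/2) is not -1 mod n; otherwise pick a different a."""
    r = period(a, n)
    if r % 2:
        return None  # odd period: retry with another a
    half = pow(a, r // 2, n)  # modular exponentiation
    if half == n - 1:
        return None  # trivial square root: retry with another a
    return gcd(half - 1, n), gcd(half + 1, n)

# Factor 15 with a = 7: the period of 7 mod 15 is 4, 7^2 mod 15 = 4,
# and gcd(3, 15) = 3, gcd(5, 15) = 5 recover the factors.
print(shor_classical_part(7, 15))  # → (3, 5)
```

The quantum win is that period finding takes polynomial time on a quantum computer, while every known classical approach, like the loop above, blows up as N grows.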

Andy: I gotta be honest, I feel woefully ignorant and naive of these issues, and I clearly need to educate myself, because I did not understand that.

jerry: Well, it’s just very different. It’s a very different thing. You can’t take a binary computer and compare it to a quantum computer; it’s almost like an analog computer. Anyway, it’s very interesting. I actually think it’s more closely aligned to what has been described as DNA computers, where you can break segments of DNA up and have them assemble themselves into the answer to a complicated question that you would have a really hard time answering with a traditional computer. I think it’s closer to that than it is to an actual binary computer.

jerry: The concepts are difficult to translate, which, by the way, is the concern I have as we approach creating these [00:18:00] quantum-safe algorithms: we’re hypothesizing about how quantum computing is going to evolve. Anyway.

Andy: Yeah, before we move on, the last thing I’ll say is this feels like a magic black box that is a bit of a boogeyman, because a lot of people don’t understand it. But.

jerry: Totally. But it is a practical thing, and it is a responsible thing for us to go and create these quantum-safe algorithms and start migrating to them as soon as practical. I’m just concerned about how confident we can be that they’re actually quantum safe.

jerry: So the next future threat, which I don’t think is actually a future threat, I think it’s already here, is software supply chain attacks. This is obviously things like what happened with SolarWinds and Microsoft Exchange, and

jerry: many others, where the threat actors, I think, are finding it a lot easier to attack purveyors of [00:19:00] software and software-as-a-service companies and whatnot. Because if you do that, you don’t just attack one company; you can conceivably, with one attack, get access to many different organizations, as we saw with Kaseya, and SolarWinds as well. So yeah, I definitely think this is on the upswing. My fear is that, as an industry, our response to this threat is more spreadsheets.

Andy: And answering more questions.

jerry: Yeah.

Andy: No, we have to live off the land. Look, we need to go back to pioneer times and code all of our own software. That’s the only solution, Jerry.

jerry: It’s like the Hyundai version of...

Andy: Look.

jerry: We have to build it.

Andy: Yeah. And you can’t download anybody else’s code, because you don’t know what’s in it. You have to code it yourself. It’s the only option.

jerry: It’s true. And by the way, building your own encryption is still irresponsible. So try to figure that one out.

Andy: [00:20:00] I’m kidding, of course. Yeah, it’s a tough problem. There’s so much inherent trust that you establish. Look back at SolarWinds: how many times have you and I said, hey, upgrades and patches are important? And then that became the attack vector. Let’s just hope that becomes a rarity.

Andy: And let’s also hope that somebody else gets hit by it before you do, and you have time to react.

jerry: Yeah, we just have to find a more mature way, as an industry, of handling this threat. There are some approaches evolving, like SLSA and

Andy: Yeah.

jerry: And whatnot. But still, from a consumer side, it's little more than spreadsheetware: are you, or are you not, SLSA? Well, we already know from the first story that people are apt to lie.

Andy: Yeah. To defend yourself from this, I think you could do a lot of threat modeling, and I'll go back to the whole concept of least privilege as best you can. But some of these software [00:21:00] supply chain risks happen because you have no choice but to grant a massive amount of trust to some third-party software just to function.

jerry: Completely agree. Open source adds another level of complexity there. And by the way, I have no particular aversion to open source; the problem we have is that it's apt to be abandoned, it's easy for it to get handed off from a quote "good person" to a quote "bad person," it's conceivable that a quote "good person" goes bad, and many other permutations.

jerry: And they're so stacked on top of each other. Some of these open source packages feed into other open source applications, and even the commercial applications; there are tens of thousands of packages. How do you get your hands around that?
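One concrete way to start getting your hands around that, for what it's worth, is pinning every third-party artifact to a cryptographic hash in a lock file, so a swapped or tampered package fails verification before it ever installs. Here is a minimal sketch using only Python's standard library; the package name and contents are made up for illustration:

```python
import hashlib

# Hypothetical lock file: artifact name -> pinned sha256 digest.
# The name and contents here are illustrative, not a real package.
PINNED = {
    "example-pkg-1.0.tar.gz": hashlib.sha256(b"trusted contents").hexdigest(),
}

def verify_artifact(name: str, data: bytes) -> bool:
    """Accept an artifact only if its sha256 matches the pinned digest."""
    expected = PINNED.get(name)
    if expected is None:
        return False  # unknown artifact: reject rather than trust
    return hashlib.sha256(data).hexdigest() == expected

print(verify_artifact("example-pkg-1.0.tar.gz", b"trusted contents"))   # True
print(verify_artifact("example-pkg-1.0.tar.gz", b"tampered contents"))  # False
```

Tools like pip's `--require-hashes` mode apply the same idea at install time; the hard part, as the discussion notes, is doing it across tens of thousands of transitive packages.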

Andy: Yeah, that is crazy.

Andy: You brought up something that reminded me: we've even seen some very [00:22:00] popular, well-known

Andy: packages be turned into protestware

jerry: Yeah.

Andy: by the maintainers, for various purposes. It's rare, but we've seen it a couple of times.

jerry: Over the years we've seen browser plugins being sold by their

Andy: Yeah.

jerry: author and maintainer to malicious, or quasi-malicious, companies. So it happens. And it's a really difficult problem to solve, but we're going to have to reconcile how to solve it eventually.

jerry: The next one is the Internet of Things making us more vulnerable, the more devices you have on your network, and so on. I think it's a flavor of the same thing: you have these embedded devices, typically lower cost, whether it's a copier,

jerry: a thermometer, or whatever. They have crappy firmware, they get abandoned, they're still on your network, and they become a launching point. And then they point out in the [00:23:00] article the super famous, I guess infamous, story about the Las Vegas casino that was hacked through its

jerry: aquarium thermometer, which is, I don't know, something's wrong there if that can happen. But there are stories about people getting into wireless networks through smart light bulbs. There are a lot of stories like that, and

jerry: But I think it's a similar kind of thing: we have to figure out how to handle it. The one that makes me the most concerned, though, is deepfakes powering business email compromise attacks.

jerry: So as kind of an experiment, and I'm recording with this software now, the average person has access to technology that allows you to do deepfake-type stuff pretty easily. And if you think about that in the context of what we've seen [00:24:00] with

jerry: business email compromise, where somebody's masquerading as the CFO, asking to transfer money or to change the bank account information, this opens up a whole new world. And especially when you add the layer of video deepfakes, holy crap. Like having a WebEx with the person who you think is your boss, and by all appearances, it is.

jerry: And here you're talking face to face, virtually face to face, with who you think is your boss, giving you an instruction to do something. And it's not real.

Andy: Yeah. In fact, Jerry's not even here; I deepfaked Jerry's entire portion of this podcast.

jerry: That’s true. That’s true. I was never real by the way.

Andy: That's actually not true; Jerry has evolved into a llama. He's been replaced by a deepfake AI. I'm kidding. No, I don't mean to make light of this, because I think it's absolutely [00:25:00] legitimate. We, as humans, have evolved to trust our senses and identify people visually and audibly. We don't have any built-in skepticism; we just inherently trust it.

Andy: And this problem is playing on that psychological concept: we trust our senses when we identify somebody, because we're very good at identifying things. And so the fact that we have now moved to this digital environment and digital comms, and we can deepfake it successfully, is really powerful and dangerous.

Andy: And I can see this wreaking a lot of havoc. Absolutely.

jerry: Yeah, it seems a little scary to me, to be honest. And I think we're going to have to come up with better processes.

jerry: You're going to have to have multifactor. I think we're going to get to a point where you just can't trust [00:26:00] that. And by the way, that scares the crap out of me when you think about things like evidence submitted in court from surveillance cameras; your mind can go to lots of problematic places. But just narrowly, from a business perspective, resisting business email compromise type things really puts the focus back on having a robust process.

jerry: Where even if your boss, the CFO, whoever, calls you up on a WebEx, you still have to have some systematic way, one that requires authentication and whatnot, where the person has to prove who they are. We just have to do that.
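One shape that systematic authentication could take, purely as a sketch, is challenge-response on a pre-shared secret layered on top of the call: the requester answers a fresh random challenge with an HMAC that only someone holding the provisioned secret can compute. A minimal illustration using just Python's standard library; a real deployment would keep per-person secrets in a vault or HSM, not in code:

```python
import hashlib
import hmac
import secrets

# Pre-shared secret, provisioned out of band (illustrative only;
# in practice this would live in a vault, never in source).
SHARED_SECRET = b"provisioned-out-of-band"

def issue_challenge() -> str:
    """Verifier side: generate a fresh random challenge per request."""
    return secrets.token_hex(16)

def respond(challenge: str, secret: bytes = SHARED_SECRET) -> str:
    """Requester side: prove possession of the secret without revealing it."""
    return hmac.new(secret, challenge.encode(), hashlib.sha256).hexdigest()

def verify(challenge: str, response: str) -> bool:
    """Verifier side: constant-time comparison guards against timing attacks."""
    expected = respond(challenge)
    return hmac.compare_digest(expected, response)

challenge = issue_challenge()
print(verify(challenge, respond(challenge)))                     # True
print(verify(challenge, respond(challenge, b"attacker-guess")))  # False
```

The point of the sketch: even a WebEx with a perfect video of "the boss" fails this check, because the deepfake doesn't hold the secret. The process, not the face, is what authenticates.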

Andy: Yeah. No matter what, you can't violate the process that authenticates, at multiple levels, that this person is who they say they are and that they're authorized to do it. Which is difficult, especially for small companies; it's a lot [00:27:00] of discipline and bureaucratic red tape. But otherwise, it's going to get too trivially easy to fake a phone call from the CEO

Andy: with a perfect voice representation.

jerry: Let alone a perfect video.

Andy: Right.

jerry: Yeah. Anyway, that's something to keep you awake at night. Destructive malware attacks is next. Again, I think we've seen this quite a lot; we had WannaCry and NotPetya. And candidly, the scourge of the internet right now is ransomware. I think they're thinking of something more physically damaging, but candidly, ransomware is already in this category.

jerry: So I think we're already living in this one.

jerry: And then finally, the skills crisis. Although I guess I'll go back one: we've talked in the past about some of the forward-looking innovations in [00:28:00] malware, probably moving into firmware.

jerry: And that may be where we see this going next: it does get more destructive for the average organization, but by means of attacking firmware, where you can't recover your hardware. You can't just wipe a system and

Andy: Yeah, it just bricks the entire whatever.

jerry: Right.

Andy: At the motherboard level or hard drive level.

jerry: Or it's infected in a way that you just can't clean. You can never reestablish trust.

Andy: You mean like installing Windows?

jerry: For instance.

Andy: Sorry. I’m kidding. I’m kidding.

jerry: Yeah. Although, on that funny point: the author of systemd, for those of you who are Linux nerds like me, recently left Red Hat and moved to Microsoft.

Andy: Aye.

jerry: And systemd has been a pretty controversial thing, because, my view is, it's starting [00:29:00] to move Linux into kind of a Windows mode of operation. And so I think the next plan is like Order 66.

Andy: Oh,

jerry: Right. And systemd was like the clones.

jerry: And Order 66 is: they're going to rename systemd to be svchost.

Andy: So who are the Jedi in this example? Exactly.

jerry: I don’t know, everybody’s bad.

jerry: So there's no light side of the Force; there's just the dark and the darker side of

Andy: But with Order 66, they kill all the Jedi. So who are they going to, like...

jerry: Oh, that's the Linux people. The people who are using Linux.

Andy: I see.

jerry: Yeah.

Andy: Yeah.

jerry: The only thing left will be Windows. Yeah.

Andy: And my one lone copy of OS/2 Warp that I'm still running.

jerry: That is very true.

jerry: Yup.

Andy: It’s a dangerous world. My friend.

jerry: So the final frontier of risks that we will have to worry about [00:30:00] tomorrow is the skills crisis.

jerry: I will say it a little bit differently: I'm not sure it's a skills crisis so much as the ability to pay for the skills we need.

Andy: So you think there are plenty of people out there, and we're just not paying enough?

jerry: I think so. It's probably naive of me to say there are plenty of people out there, but there are people out there. My observation is that a lot of organizations have lots and lots of job openings, and they moan and complain about the lack of people, but it's a lack of people who are willing to take the job at the

jerry: rate that they're offering.

Andy: Okay, so is that a supply-and-demand problem as well? If there were more supply to meet that demand, it would drive down the pricing.

jerry: Yes, I think so. I think [00:31:00] that's what's happening. To be honest, one of the ways to look at this is that a lot of organizations are trying to dramatically increase the supply to get the cost to go down. The reality is that most organizations don't have the number of people, or the skill level of people, they need.

jerry: But at the same time, I think that's a symptom of a bigger problem: I don't think most companies invest the amount of money they need to invest in securing and operating their IT. We're overextended; we're over-leveraged in IT, and we're trying to figure out how to fix it without fixing that over-leveraged situation. That's just Jerry's macroeconomic baloney.

Andy: Yeah, I see some truth in there, though. We also have [00:32:00] probably a lot of bad practices, and failures to follow best practices for various reasons, that we're compensating for with other security controls, as opposed to just making things more inherently secure.

jerry: Oh, a hundred percent. A hundred percent.

Andy: Now, that's a complicated thing; there may be very good reasons why we do that, and I'm trying not to be dogmatic that there's only one way to do things. I also think a lot of legacy businesses have so much tech debt, and the cost to, say, rebuild with best practices is so high,

Andy: that they probably don't have much choice.

jerry: Absolutely.

Andy: Yeah, I don't know. It's an interesting problem.

Andy: I do wonder if it's going to stabilize, or if we're always going to be in this scenario.

jerry: It is. On the one hand, there's a lot of peril in thinking that the patent office is going to close. But on the other hand, how much more innovation, how many more features, does your word processor need?

Andy: [00:33:00] Apparently lots.

jerry: Well, I guess there will always be innovation, but I think the rate of innovation is going to start to flatten out a bit. And

Andy: We've been saying that for years, and

jerry: Yeah, well, it's

Andy: Moore's law has proved us wrong over and over again. But I hear you: word processors and spreadsheets are pretty mature, pretty commodity. How much more tech do you need in them? But heck, we're still debating whether or not macros should be on or off in Office by default.

jerry: Oh, gosh, that’s right. Yep.

Andy: No, I hear ya. I also feel like it's going to slow down soon, and we sound like two old guys telling people to get off their lawn.

jerry: Well, I think the complexity will probably shift around. To some extent, what might end up happening is that the devices employees have move away from being general-purpose computers to more specialized,

jerry: iPad-type [00:34:00] devices. Not like green-screen terminals, but less general computing and more specialized, which I think is easier to secure. That just shifts a lot of the complexity into other parts of the environment, your infrastructure.

Andy: Well, going specialized, isn't that more like an IoT problem, which we also can't seem to keep updated and secure?

jerry: Well, that's true. I guess we're pretty bad all the way around.

Andy: Yeah. Sorry, clearly there's no hope for any of us.

Andy: We’re doomed.

jerry: Yeah.

jerry: All right. Well, I guess we'll just keep milking the machine for a while, and then we all retire to our private island,

Andy: Jerry-landia.

jerry: Anyway, I thought this was a pretty interesting list of things to be thinking about. The most interesting one by far, I thought, was the deepfake threat.

jerry: I wanted to call that out. Anyhow, that is [00:35:00] the show for today. Thank you all for listening. Sorry it's been so long; life continues to get in the way of making podcasts. Every time I think it's going to level out and I will be less busy, something happens.

jerry: Hopefully. Fingers crossed.

Andy: Fair enough. But hey, we appreciate you guys sticking with us and hopefully still finding some value in the podcast, and we enjoy making them when we can.

jerry: Take care, everyone.

Andy: Have a great week.

Andy: Buh-bye.

jerry: Bye.
