
Episode 244: Avoiding Cyber Catastrophes

Lisa: Hi, I'm Lisa Hernandez.

Lizzy: And I'm Lizzy Ghedi-Ehrlich.

Lisa: And we are your hosts for Scholars Strategy Network's No Jargon. Each month we will discuss an American policy problem with one of the nation's top researchers without jargon. And this month we are talking about a topic that we've actually never discussed before...cybersecurity!

Lizzy: Happy New Year. I can't wait to learn about it.

Lisa: Yay. Happy new year to you too.

Lizzy: Yeah, it's 2023. People were using their credit cards and their Social Security numbers and all sorts of other things over the past few months. And of course, we're gearing up to do our taxes. And all of these are things that happen online and deal with a lot of, uh, sensitive information.

Lisa: Absolutely. I mean, we are here recording this through multiple forms of technology all at once. So I'm excited for us to really talk about, um, policies on a federal level that can really protect our individual cybersecurity, but also the government's and different agencies' security as well.

It's really important when we think about our safety, even though it isn't something that's always on our minds.

Lizzy: Well, yeah, we take a lot of this stuff for granted. Like there's this huge infrastructure and we all use it every day, and we're storing our passwords and our money and our health information. And definitely when I think about cybersecurity, I think about, like, you know, my mom and avoiding scams and the personal things I can do, uh, not about federal policy.

So definitely interested to hear that perspective.

Lisa: Yes, and we had a great conversation. So for this week's episode, I spoke to Jeremy Straub, an assistant professor of computer science, and the director of the Institute for Cybersecurity Education and Research at North Dakota State University. Straub's research focuses on technology development and technology policy.

He is the author of a recent SSN Brief titled Cybersecurity Incidents Can Be Unwelcome Wake-Up Calls for Unprepared Agencies. Here's our conversation.

Lisa: Hi, Professor Straub. Thanks for being on No Jargon.

Jeremy: Glad to be here.

Lisa: I am sure you're aware there's always something in the news about digital threats on a global scale, some kind of large company or organization being threatened to be hacked. I mean, a couple of weeks ago there were these fears of maybe Twitter becoming this big landmine for cybersecurity breaches. And you've been studying cybersecurity for some time with a focus on these digital threats that the government faces. So let's begin this conversation by talking about ways that cyber breaches can jeopardize our national security here in the United States. How serious have digital threats been in the past, and how serious can they continue to be?

Jeremy: Well, I think those are two really good questions and you know, really it's almost kind of the comparison between them. That's kind of the interesting answer. You know, we certainly have seen some large, uh, data breaches in the United States. You know, everything from like the Marriott breach where, uh, you know, tons of guest records were compromised.

Personal identity information was exposed. Breaches related to credit profiles, and even like the Office of Personnel Management, or OPM, breach, where lots of information related to security clearances was compromised. And, you know, each of these has got kind of a scope to it. You know, in some cases, like the OPM breach, the data that was compromised was, uh, you know, particularly sensitive, but the number of people that were impacted was a little bit lower.

Um, when we were talking about like the credit profile information, obviously that's very sensitive information as well. Not quite as much, or as sensitive, as what's in the, uh, background files. But again, a lot more people. And then the Marriott breach, uh, again, was a very large breach, but the information that was in that was probably not as sensitive compared to the other two that I just mentioned.

So there's been a lot, you know, again, a lot of breaches, a lot of, you know, people that have been impacted in some way. But we haven't really seen a catastrophic breach in the United States. We've seen some things that kind of suggest what, you know, a catastrophic breach could be like.

Uh, you know, there was an instance of a, uh, breach that resulted in chemicals in a water treatment plant being incorrectly mixed, um, or, or at least showing that they could be incorrectly mixed. I don't remember exactly how far, um, that one got right offhand here. But, you know, that, that type of thing could obviously harm lots and lots of people.

I distinctly remember that nobody was, you know, specifically injured from that. And, uh, you know, again, that kind of shows you what the, um, bad cases could be that are much worse than perhaps what we've seen with, you know, just a compromise of personal information in the past.

Lisa: Um, do you see the United States government being worried about these scenarios or taking some preemptive measures in order to protect their agencies from these types of security breaches?

Jeremy: Well, there certainly is a lot going on. You know, there are several agencies that have, uh, you know, kind of a shared responsibility, or responsibility perhaps more correctly, for different areas of the national cybersecurity defense. And there's a lot of different efforts going on to try to get agencies to, um, enhance their, uh, you know, their security posture, to be more aware about what they're doing, and to give them better resources and better standards. To try to, you know, again, get every agency, from the big agencies that you would obviously think would be very cognizant of cybersecurity issues on their own, down to smaller agencies that may not realize, you know, the level of threat that's out there, or even the sensitivity of some of the data that they hold.

So, you know, at the government level, at the federal level, there is a lot going on. The challenge though, is that it's not just government, it really is, you know, everybody in the ecosystem, which includes private enterprise, it includes individuals, you know, it even includes contractors that work for the government that might have access to government information.

If any one of these areas has a problem, has, you know, a lack of appropriate security, it's possible that information can be stolen from one place and then used to attack another. So, if you think about, like, a scenario where somebody might use certain personal information when they're coming up with a password, or as, you know, validation questions to recover their password, that information can be stolen from another website.

Or, you know, from somebody that's not governmental that they do business with, a bank, et cetera. You know, that could be a scenario where even though the government system isn't breached, somebody can still get into that individual's account, because they're able to, uh, you know, get the information necessary to reset their credentials from somewhere else. And then once they're into that system, they can use whatever permissions, whatever authorities the individual whose account they've compromised has, perhaps compromising even more accounts. So it's really, you know, a very difficult situation, because again, you know, you're relying upon so many things kind of working right in tandem, which isn't always gonna be the case.

Lisa: Right. And you mentioned contractors who work for the government, and immediately I think about Edward Snowden. My producer Mandana and I were talking about when the news broke about him in 2013, when he leaked what was highly classified information from the National Security Agency, where he worked at the time.

And I remember there was this debate about what government information needed to be private and what information the public had the right to know in the name of transparency. So I would love for you to talk about this. Are there some cyber breaches that are done in the name of doing some kind of service to the public?

Jeremy: Well, I mean, that's a very tricky question. I mean, I think there certainly can be, you know, actors that feel that they are trying to do something, in the public interest or in the public good. I think when we get to, you know, to security questions, you know, you have a variety of factors that you need to consider.

And, you know, it makes it a little bit difficult because of any particular actor, um, in the scenario. And, you know, I guess I'm kind of breaking the No Jargon, um, name here by not describing what I mean by the term actor. So we use the term actor in cybersecurity basically to just talk about somebody that has a role in, you know, a breach.

So we're not talking about like a Hollywood actor or something here. We're just talking about, you know, in this case, like the person that might be thinking of leaking data or testing a system or whatnot, um, we call them an actor. But, you know, anybody could be thinking, okay, maybe they're gonna attack a system to show that it's insecure, or they're going to attack a system to try to access a record that they feel needs to be public.

The challenge with that is that you're, you're dealing with people that may not have the full understanding of the implications of, you know, of the effect of what they're doing. So, you can envision a scenario where even if, you know, most people would agree that showing that the system is insecure is a good thing or that, you know, a particular piece of information maybe should be in the public domain.

They may attack a system and actually create bigger vulnerabilities. They may let somebody that has, you know, less societally beneficial goals also kind of piggyback on their attack by opening a door that can't be readily closed. It's also possible that by trying to access a system for these purposes, they inadvertently break the system or disable parts of its functionality. And, you know, all of those are things that probably would, you know, again, if we kind of assume that they have a public interest motivation, are probably not things that they intended to happen, but they are things that could happen kind of inadvertently.

We have a strong societal recognition of the need for people to, um, sometimes stand up against policies that they believe are unjust. But it becomes very difficult when, you know, there are so many concurrently moving or concurrently related pieces that you can't really figure out exactly what the implications of a particular action might be, reliably, before taking it.

Lisa: I do wanna talk a little bit about your brief that's on our SSN website, which will be linked in the podcast notes as well, but to focus on things that do result from these breaches.

And you briefly talk about something called the Snowden Effect. Can you explain what that is for listeners?

Jeremy: Sure. There's a few different aspects to it. And you know, that's one of those terms that I think, you know, we've kind of used in a few different ways since it was originally coined. But I think you're thinking about a few things when you're thinking about the impact of, you know, Edward Snowden's data, uh, removal from the NSA.

And so, you know, as an agency, you know, you'd be thinking about, well, what happens if somebody else gets the information that's, you know, in agency records? And maybe thinking a little bit more about what types of data, uh, you know, you would be collecting, what types of data you'd be storing, even, you know, how long you store it for.

On the flip side, you know, we can think about, uh, Snowden's impact from, kind of a societal perspective as well. And impacts that it has had on our perception as everyday citizens of, uh, you know, what types of data government collects, what type of data government holds.

Which again, you know, caused some people to change kind of their perception of, you know, what's going on, uh, there. And some people, you know, thought it was okay. Some people were very upset. Um, and there's, you know, a very large continuum of reactions to that.

So I think those are all kind of the different things that, you know, you can think of as, you know, Snowden's kind of impact, not only on agency action, but also on, you know, kind of the public perception of agencies. You know, all of those things kind of have a certain amount of interconnectedness, too, where obviously the actions an agency might decide to take in the future, the data they might decide to collect or store, is gonna be driven by kind of a perception of how, you know, members of the public react to finding out that the agency is storing or collecting that type of information. So you'd want to consider, particularly in the role of a, uh, leader of an agency, or an individual in an agency who's making policy decisions like that, trying to make sure that what you're collecting, what you are, uh, storing is something that, uh, you know, you're gonna be able to explain really easily to people.

And, you know, I think it's also important to realize that, you know, a Snowden-style breach is not the only way that this can happen. From a federal perspective, you know, um, Congress looking into something is another way that this type of question could be asked, uh, or, you know, could be brought to the forefront.

You know, even certain types of, uh, you know, reporting that the agency might do themselves might inadvertently, um, identify certain types of information that's being held or stored. So it really, you know, creates an impetus for agency leaders to just, you know, take a look through: what are we collecting, what is the rationale for what we're collecting, how long are we storing it for?

What are the conditions under which we're getting rid of the data or archiving it, or whatever other actions may be taken, to make sure that there really is, you know, good decision making occurring. And, you know, particularly in the case of storage, it's really easy to just inadvertently store data, you know, indefinitely, right? I mean, you have to usually take some sort of proactive action to remove it from storage.

And so if you don't have a specific plan for that, and specific assignments to do that, you know, data can be there, unfortunately, waiting to be breached, waiting for something else bad to happen, you know, indefinitely. And in a lot of cases, maybe it doesn't need to be around that long, and there doesn't need to be as much data there for a nefarious party to get access to if they do happen to breach something.
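To make that retention point concrete, here is a minimal sketch, in Python, of the kind of proactive cleanup being described: records whose retention window has passed get flagged for deletion, and anything without an explicit rule gets flagged for review rather than kept by default. The record structure, field names, and retention periods here are hypothetical illustrations, not any agency's actual policy.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention periods per data category (illustrative only).
RETENTION = {
    "transaction": timedelta(days=365),       # keep one year after completion
    "identity_document": timedelta(days=90),  # purge soon after verification
    "audit_log": timedelta(days=2555),        # ~7 years, e.g. a statutory hold
}

def records_to_purge(records, now=None):
    """Split records into those past their retention window and those
    with no retention rule at all (which need a human decision)."""
    now = now or datetime.now(timezone.utc)
    expired, needs_review = [], []
    for rec in records:
        rule = RETENTION.get(rec["category"])
        if rule is None:
            needs_review.append(rec)          # no rule: don't keep silently
        elif now - rec["completed_at"] > rule:
            expired.append(rec)               # past its window: purge candidate
    return expired, needs_review

# Example: a two-year-old identity document is well past its 90-day window.
sample = [{"category": "identity_document",
           "completed_at": datetime.now(timezone.utc) - timedelta(days=730)}]
print(records_to_purge(sample))
```

The detail worth noticing is the default: unless something like this runs on a schedule, the data simply sits there indefinitely, which is exactly the failure mode described above.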

Lisa: Do you have any advice for agency leaders? Things that they should focus on to protect our data?

Jeremy: It really depends on, you know, the individual agency. I think the most generic recommendation, uh, that I would have is really coming up with a good assessment plan, and, you know, starting with the question of, what are we collecting, doing an inventory of what data is being collected, and then, you know, doing kind of the same inventory of what type of data we are storing.

You know, even agencies that are no longer collecting certain types of data might have that data, you know, in their archives. Right? Um, so you really need to kind of start from the perspective of saying, hey, you know, what are we doing now? What information do we have now?

And, you know, trying to limit collection to the data that is, you know, that is regularly needed for processing a certain type of transaction, making sure that the data that the agency, you know, thinks that it needs for processing a certain type of transaction is actually used. You know, again, just because something is on a form doesn't necessarily mean that the agency needs it. And then, you know, doing kind of a similar but maybe a little bit different process in terms of retention of trying to figure out what type of data is needed beyond a transaction and how long it needs to be retained for. 

And, you know, some of this, uh, of course will be statutory, which may mean that, you know, an agency could actually figure out, well, we don't really need this, but we're required to collect it.

We're required to store it. Where they may require, you know, legislative action to, uh, remove or change, uh, one of those requirements. You know, a big thing, though, in regards to storage is perhaps the ability to reduce the amount of information that is stored after the transaction is complete, where some of the information you collect may be needed to process the transaction, but may not be needed for archival purposes.
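As a rough illustration of the "is it actually used?" check described here, this small sketch compares the fields a hypothetical intake form collects against the fields the processing step actually reads, carving out anything a statute requires. All of the field names and the statutory set are made up for the example.

```python
# Fields a hypothetical intake form collects (illustrative only).
COLLECTED_FIELDS = {
    "name", "date_of_birth", "ssn", "mothers_maiden_name",
    "mailing_address", "email",
}

# Fields the transaction-processing step actually reads (illustrative only).
FIELDS_USED_IN_PROCESSING = {"name", "date_of_birth", "mailing_address", "email"}

# Fields kept only because a (hypothetical) statute requires them.
STATUTORILY_REQUIRED = {"ssn"}

unused = COLLECTED_FIELDS - FIELDS_USED_IN_PROCESSING
candidates_to_drop = unused - STATUTORILY_REQUIRED

print("Collected but never used:", sorted(unused))
print("Candidates to stop collecting:", sorted(candidates_to_drop))
# 'mothers_maiden_name' surfaces as collected-but-unneeded, while 'ssn'
# stays on the form only because of the assumed statutory requirement.
```

The value of even a toy audit like this is that it separates "we ask for it out of habit" from "we are legally required to ask," which is the distinction the interview draws between agency discretion and legislative change.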

Lisa: Why haven't some of these proposals already been pursued more aggressively? As far as, um, addressing data collection and data storage, um, especially considering how catastrophic a major data breach could be for the United States.

Jeremy: Well, you know, there certainly have been efforts. And, you know, even before, you know, really thinking about this from like a cybersecurity perspective, the Office of Management and Budget, uh, OMB, um, actually has a role in looking at most of the forms that are generated by the federal government.

Um, I don't know if you've ever seen it, but any number of forms have like an OMB control number. You see a little block of text at the top. Um, and there are laws out there that, for most forms, for most agencies (there are some exceptions), require the form and the collection to be reviewed by OMB before it can be used, to make sure that it's not putting, you know, a, um, disproportionate or unnecessary burden on respondents.

 Um, of course, you know, OMB is looking at this from, you know, from their perspective – from their statutory requirement of looking at the burden that it places. But, you know, this is, this is a potential model that could be used, uh, to, you know, validate forms and other types of data collections to ascertain if there may be unnecessarily sensitive information being collected by the form. 

Certainly collecting information that is not needed, um, is definitely an unnecessary burden, right? You're making somebody do something, you know, in filling in a few additional questions or what have you, that is not required for the transaction.

That's, you know, kind of the definition of unnecessary burden as it comes to a form. So some of this kind of already falls within, um, you know, what OMB is doing, what OMB is supposed to be doing right now. But certainly, you know, there are, there are other things that could be added to it. And then there's also a very important role for the agencies themselves, who are the ones that understand their business processes the best.

They understand what they're doing with, uh, you know, with the data. They understand how the data is used. And so forth. Um, and, you know, there are some things that it's very difficult as somebody that's not involved in the, you know, daily operations of a particular process to realize, hey, we don't actually need that.

Or, you know, we used to use that, but we don't anymore. And so people that are, you know, involved in those processes are in a lot of cases the best place to get, uh, information about, well, we don't really need this, or, you know, we could figure out a way not to need this.

Lisa: I'm wondering about maybe not just agencies' rules, but overall policy making. And is there more to do to make cybersecurity a bigger priority for the United States government?

Jeremy: Well, I think the federal government, um, and certainly, you know, many state and local governments, realize how important cybersecurity is. You know, a lot of governments have had breaches. Perhaps in the case of municipalities in particular, they've had neighboring, uh, municipalities who've had data breaches, and they've seen kind of the fallout from that.

Here in North Dakota, we've had a really big push, um, at the state level, um, to try to, you know, think about cybersecurity as it relates to kind of all of the state agencies. Thinking about how it, you know, relates to the general workforce and the need to be aware of cybersecurity, whether they're working in a high tech industry or a low tech industry, and what that means, you know, in terms of kinda the educational offerings that need to be made available in the K-12 system and the higher ed system and, you know, vocational and other education systems.

Um, so I think there is a lot of, you know, a lot of focus on it. But you know, it's very difficult to deal with everything that needs to be done. Of course, you know, there's far more things that could have attention than ever get them. We see this a lot too, kind of in the realm of, you know, emergency response and emergency management, where, you know, after, say, a hurricane comes through, there's a lot of focus on, uh, you know, hurricane preparedness, because everybody is so acutely aware of, you know, what's just happened, and trying to prevent that from happening again.

And then as you get further and further and further from that event, you know, that awareness naturally goes away. It's replaced by awareness of other problems. Um, and so I think, you know, you have public agencies that really have, you know, an infinite number of, uh, you know, potential problems that they could potentially prepare for, and they're trying to figure out how to prioritize them.

And I think one of the big challenges is having, you know, policy makers, having those that are implementing policies, understand how catastrophic cyber breaches can be when they're not targeted at things that can be corrected over time. You know, being a victim of identity theft is a very difficult process for those that have to go through recovering from it. But, you know, I mean, it is something that, you know, in the vast majority of cases is recoverable from.

If somebody attacks a power plant and causes an explosion, or attacks a water facility and poisons people, you know, there may be people that get, you know, very, very sick, that have long term health implications from that. And if the breach is significant enough, you know, again, you know, the explosion, uh, at the power plant, uh, you know, poisoning of a certain level, you know, it could even cause deaths within a very short period of time afterwards. And that, again, there's no way of reversing.

You know, so those are the types of breaches that, uh, you know, I think we fortunately have not seen specific examples of in the United States, but there are very real vulnerabilities out there, because we know that there are older systems that have a direct or indirect connection to the internet.

We know that there are these critical processes that have these, uh, you know, human safety implications. And, you know, even if you don't know that a specific process at a specific facility has a specific vulnerability, you know, just by looking at kind of the worst case scenario, thinking, well, what if somebody, you know, manages to breach this facility? We won't worry right now about exactly how they do that. What could they do if they manage to get in, you know, the proverbial electronic front door? And, you know, again, in the case of a lot of facilities, those answers are pretty catastrophic.

You know, even disabling, you know, the power grid, or, uh, the flow of, uh, natural gas in certain areas of the country in winter, you know, that could have a really catastrophic effect, because it prevents people from getting the heating that they need. Um, so again, there's just so many things, uh, you know, where we're dependent upon the successful performance of, uh, you know, various technologies, that, you know, a breach of them, again, could have immediate, really dramatic, um, human health, human safety, human life impacts. And that's the thing I think, as you know, as a policy maker, you really want to be thinking about: you know, not just, uh, you know, what happens if the data we hold is stolen, but what happens if somebody gets into our system and they actually do something nefarious and they cause, uh, you know, injury or death?

How do we prevent that? You know, whether it's, you know, electronic safeguards, whether it's physical safeguards, whether it's disconnecting certain functions that we're not sure are secure, or can be secured, from having any sort of direct or, uh, indirect internet connection. You know, those are, again, I think really the critical things to think about, personally.

Lisa: Thank you so much for calling attention to that, you know, on our podcast, using all our various technologies in order to, um, get this information out there. And I do wanna know if you have any more thoughts that you wanna share on how cybersecurity can be further protected before we close up.

Jeremy: You know, there's a few different things, um, that are kind of, you know, again, kind of the simple things that people can think about. You know, for individuals, protecting your own identity information to, you know, the best you can is something that everybody can do.

You know, being very cognizant of who you provide information to, what information you provide. You know, making sure that you're, you know, just being a little bit skeptical, being, you know, kind of the healthy skeptic when somebody asks for information. Is this somebody that should be asking for the information? Is this person who they claim they are? How do I know that? And then, okay, they're saying they need this information. Do they really need this information? And one example is, uh, you know, if somebody calls up claiming to be from your bank, your insurance company, you know, your utility provider, just saying, hey, I'll call them back. Um, you know, I'm gonna get my bill out, or I'm gonna get out the phone book or look online on their website, look up a phone number, and then just have the person give you their extension where you can reach them back at.

And, you know, if they're answering the phone number that the company provides, you know, as kind of a central phone number, and you're just being transferred off to an extension, you have a much better idea that they work for the company. And, you know, it's a little bit annoying because you have to go a little bit out of your way to find the phone number and, you know, end the call and restart the call. And, you know, probably most of the time, for most people, the call was legitimate. But now you know before you're providing, you know, personal information. And, you know, our personal information in a lot of cases is, you know, the gateway to these commercial systems. You know, when somebody compromises an account, it's usually an individual's account, which means knowing information about the individual is a way to, you know, get into their business system.

So, you know, if you've made the password, say, your dog's name (and this is not a good password, by the way, for the listening audience), or your dog's name and your daughter's birthdate, and somebody can get those two pieces of information, you know, they can use that to guess the password to your, you know, to your business system.

Of course, one thing you can do to prevent that is coming up with stronger passwords that don't have these personally associated meanings, but there's a lot of people out there doing that. So it kind of shows you how information stolen from one place can be leveraged in another place. And, you know, again, an attacker that is well motivated, what do they do? Well, they, you know, they keep persisting, they keep pushing at, uh, their goal and trying to get, you know, different pathways to it. Whether it's people's personal information, whether it's a more technical attack against a vulnerability in a particular system, or a combination of those. So everything we can do to make it a little bit more difficult for people to be able to do that makes things more secure.
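To illustrate how quickly a personally derived password falls, here is a toy sketch of the attacker's side: it generates a handful of candidate passwords from two publicly discoverable details and checks whether a "dog's name plus year" password is among them. The details, the combination rules, and the password itself are all hypothetical examples.

```python
from itertools import product

def guesses_from_personal_info(details):
    """Yield the kinds of candidate passwords an attacker might try from
    a few publicly discoverable details (a deliberately tiny, toy rule set)."""
    for a, b in product(details, repeat=2):
        if a != b:
            yield a + b                  # coco2003
            yield a.capitalize() + b     # Coco2003
            yield f"{a}{b}!"             # coco2003!

known_details = ["coco", "2003"]         # e.g. a pet's name and a birth year
candidates = set(guesses_from_personal_info(known_details))

password = "Coco2003"                    # the kind of password to avoid
print(password in candidates)            # True: guessable in a handful of tries
```

A real guessing tool would try vastly more variations, plus lists of previously breached passwords, which is why passwords built from personal facts, however they are capitalized or punctuated, are best avoided.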

And I think that's, you know, that's kind of the real key thing here is to just, you know, again, look at as many processes, you know, and kind of flipping to the other side, the, uh, you know, the agency side or the commercial system operator side, looking at as many different processes and saying, you know, let's figure out how we can lock this down.

Let's look at this, let's make sure that we're managing information well, that we're managing access well. It really is just about looking, you know, looking at as many different things as you can and trying to make them as secure as you can, um, you know, through this analysis process, and taking an unrelenting view of this: not getting to the point of saying, okay, you know, we finished this analysis, we're done, and instead saying, you know, we finished this analysis, now it's time to go back to the beginning and see what's changed.

See what we missed. Kind of always looking, to the extent resources are available, to the extent time is available, to the extent personnel is available, to, you know, just kind of continuously improve. And of course, you know, the resources, the time, the personnel, that comes back to making sure decision makers understand, you know, what the implications are, so that they are providing those resources to the people, you know, in the technical roles that are able to use them to enhance security.

Lisa: Well, thank you for sharing how we as individuals can protect our own cybersecurity and, you know, how decision makers can think about, creating policy around it. And I will now go change my Coco 2003 exclamation point password, um, in order to protect my security. That's a joke. That's not actually my password.

Um, but thank you so much Professor Straub for giving us all of this fascinating information. I definitely look forward to continuing to pay attention to, um, cybersecurity, how I can protect myself, and also, um, seeing what kind of measures the government is taking in order to protect it. So thanks for coming on

No Jargon, and thanks for listening. For more on Professor Jeremy Straub's work, check out our show notes at scholars.org/nojargon. No Jargon is the podcast of the Scholars Strategy Network, a nationwide organization that connects journalists, policy makers, and civic leaders with America's top researchers to improve policy and strengthen democracy.

The producer of our show is Mandana Mohsenzadegan. If you like the show, please subscribe and rate us on Apple Podcasts or wherever you get your shows. You can give us feedback on Twitter, @nojargonpodcast, or at our email address, [email protected]

 

 
