
Ethics In Technology And Cybersecurity

How new technologies are used and built is really up to us. Regardless of your moral compass, it's important that you discuss the creation & use of these tools with your teams and larger community.


Watch this episode on YouTube.

Reasonably Accurate 馃馃 Transcript

Morning everybody. How are you doing today? It's Monday, and this is episode 70 of Mornings with Mark. Thank you again for your continued support. As always, this is a two-way discussion, especially with today's topic. Hit me up online @marknca, in the comments down below, or by email at me@markn.ca.

What I wanted to talk to you about today is ethics in technology, and specifically in cybersecurity. The reason this topic came up across my radar this morning is that Ramona Pringle from the Ryerson Media Zone wrote a phenomenal essay for CBC News covering Google's decision about Project Maven.

This was a contract they had with the Pentagon in the US, and a number of their AI experts expressed significant concerns with how their work was being used within the military. Internally, they had a long discussion.

It was up for debate, and Google came up with a set of principles and said that they're not going to go after that contract again when it comes up for renewal. And this raises the larger question of what's right and what's wrong.

Now, I don't have any answers for that. Of course, this is not gonna be one of those shows where it's like, hey, you should be doing this, this, this and this. What I wanted to do was highlight these challenges because, as you know, on the show we've covered a number of times how to get started in cybersecurity.

And I've covered a little bit about what working in cybersecurity is like. But this is not a topic that I've really had a chance to dive into and to share with you, and I think it's an absolutely critical one, not just for technology but for cybersecurity as well.

So we have that Google Project Maven and AI example. Another example is with AWS and their Rekognition video service, which lets you do video analysis and things like facial recognition. Well, one of the use cases right on the site was that you could use it for law enforcement.

Microsoft's Azure had a similar thing, where they're back-ending some government departments in the US, like ICE, the immigration enforcement agency. And there are a lot of questions around these use cases of technology. You'll hear a number of arguments, and some of them have merit.

Some of them don't. An interesting one that always comes up is the claim that technology is neutral, and it's how you use it that is different. And while I'm not sure I believe that 100%, I can give you numerous examples of it.

A lot of the controls and technologies that we use in cybersecurity can be used both in a malicious manner and in a positive, defensive manner. Now, we see that with malware, we see that with cybercrime, but you also see it on a bigger level: enforcing certain ideals on a community or population, or profiling or censoring a community or a population.

There are a lot of ways this stuff can go bad even though it can do good. So there's a lot of argument for that idea that technology is neutral and it's how you use it. Let me give you an example, a personal example that I've dealt with in my career at one point.

I was working for a large organization and we were implementing a web proxy. Doesn't sound like much; we were implementing a security proxy. This is a device that sits between your users and the internet, and the goal is to do a whole bunch of security scanning to make sure that bad content isn't going to users' desktops. By bad content in this context, I mean things like malware and malicious JavaScript (in today's world it would be crypto-mining JavaScript), viruses, malware implants, things like that. And then, vice versa, to make sure that no sensitive data was leaving the organization that shouldn't be.
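To make that a little more concrete, here's a minimal sketch of those two checks. It's purely illustrative Python, not from any real product; the signature lists, patterns, and helper names are all made up.

```python
# Illustrative sketch of the two jobs a security web proxy does.
# The signatures, patterns, and names below are hypothetical examples.

INBOUND_SIGNATURES = ["coinhive.min.js", "eval(atob("]       # e.g. crypto-mining or obfuscated JavaScript
OUTBOUND_PATTERNS = ["ssn=", "internal-only", "customer_db"]  # e.g. markers of sensitive data

def scan_inbound(response_body: str) -> bool:
    """Return True if content headed to a user's desktop looks malicious."""
    return any(sig in response_body for sig in INBOUND_SIGNATURES)

def scan_outbound(request_body: str) -> bool:
    """Return True if traffic leaving the organization looks like sensitive data."""
    return any(pattern in request_body for pattern in OUTBOUND_PATTERNS)

def handle(direction: str, payload: str) -> str:
    """Decide what the proxy does with a single request or response."""
    if direction == "inbound" and scan_inbound(payload):
        return "block"   # stop bad content before it reaches the desktop
    if direction == "outbound" and scan_outbound(payload):
        return "block"   # stop sensitive data from leaving
    return "allow"
```

The matching logic isn't the point. The point is that both checks take the full payload as input.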

Now, that's where things get a little bit trickier, because in order to do both of those actions, you need to look at all the traffic that is coming in and out of your organization. That also means personal information; that also means personal activities, because people use the internet.

You know, to check on recipes, to share and communicate with friends on social networks, to download games, to research new stuff, to look into health issues, to do their banking, any number of things. And then there's the larger question: what level of privacy can they expect when they're on a corporate network going out?

And I know there's the law, but we're not talking about the law; we're talking about what's moral or what's ethical, and that's different than what's legal. There's a very different line, and you'll see companies all the time claim the legal line, but there's a larger question that you need to wrestle with.

So when we were deploying this technology, the discussion came up because somebody from HR had mentioned, hey, wait a minute, if you can track what users are doing online, can we use that for HR investigations? And so a discussion ensued, and this is absolutely critical for any deployment that has an ethical twinge to it.

You need to discuss this stuff, you need to put sunlight and transparency on it in order to hash it out, because whatever you choose, you need to be explicit about what you're doing.

So the question came up of, OK, if you've got this web proxy in place that's filtering all this bad stuff, and it's also looking at all the web traffic, can we use it for HR? And if so, what are the boundaries around that? Even if you're not using it for HR investigations, the fact is that it's running and can say, hey, Mark is surfing Facebook three hours a day.

Mark went and checked out this site about sex, or this one about a health issue, or this, that or the other thing. Who is looking at it? Where are the boundaries? And this is one area where I find the cybersecurity teams I talk to fall down a lot: having an internal process for how you use these tools, and sharing that process with teams like HR, with legal, and with the larger user community.

Normally, as a cybersecurity worker on the team, you have the ability to do a lot of prying and a lot of invasive actions within the scope of your job, right? You need to do an investigation, you need to dig through things and figure it out.

You need to be able to say: here are the boundaries within which we're allowed to do that, here is where we won't step over the line, and if we do step over the line, here's how we police ourselves. There are a lot of really complicated issues here.

Another example. So we have that proxy issue, trying to discuss, OK, should we use it for HR, and what are the boundaries around the admin staff on the cybersecurity team? How can they look at traffic? When will they look at traffic?

What we had settled on, just to close that story out, was that the system would do everything pretty much automated and only raise a flag, at which point a human would log in to check things. So we could say the first sweep was being done automatically, based on these criteria.

When one of those criteria was triggered, then a human would do a first-pass investigation. And if there was something there, they'd bring in other team members from different teams, in order to make sure that more people were looking at it.
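As a rough illustration, the flow looked something like the sketch below. This is hypothetical code, not the actual system; the criteria, thresholds, and role names are made up.

```python
# Hypothetical sketch of that triage flow: a fully automated first sweep,
# a single human reviewer only when a criterion trips, and more reviewers
# from other teams only when that first reviewer confirms something is there.
# The criteria, thresholds, and role names are illustrative, not real ones.
from __future__ import annotations
from dataclasses import dataclass, field

AUTOMATED_CRITERIA = {
    "excessive_data_upload": lambda event: event.get("bytes_out", 0) > 500_000_000,
    "known_bad_domain": lambda event: event.get("domain") == "malware.example",
}

@dataclass
class Case:
    case_id: str
    triggered: list[str]
    reviewers: list[str] = field(default_factory=list)

def first_sweep(event: dict) -> list[str]:
    """Automated pass: nobody sees anything unless a criterion triggers."""
    return [name for name, check in AUTOMATED_CRITERIA.items() if check(event)]

def triage(event: dict, case_id: str, first_reviewer_confirms: bool) -> Case | None:
    """Escalate in stages so no single person is looking at traffic alone."""
    triggered = first_sweep(event)
    if not triggered:
        return None                                        # nothing flagged, nobody looks
    case = Case(case_id=case_id, triggered=triggered)
    case.reviewers.append("on-duty security analyst")      # first-pass human review
    if first_reviewer_confirms:
        case.reviewers.append("second analyst, different team")
        case.reviewers.append("HR or legal contact")       # more eyes before any action
    return case
```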

That way, one person couldn't go rogue. But not a ton of people either, because that bridges to my second issue: a lot of the time, especially if you're doing something like a forensic investigation, there's a temptation to brief everybody on the team or to brief your boss.

Do you have to? I conducted a number of investigations where all my boss knew was that I was working on an investigation that I gave a number to. So it's case number blah, blah, blah, I've dedicated X amount of hours, and it's going to impact the rest of my workload in the following ways.

I'm reporting to legal on this, I'm reporting to HR on this, and here are the contacts there. If you need information about the case, talk to them, because I'm not in a position to let you know. So I was working deeply on some cases and my boss was essentially blind to it.

And that's OK. That similarly goes to other issues around sharing information. There's this push to share everything and collaborate, which is amazing, but there sometimes need to be boundaries around sensitive information, and you should not be hesitant, you should not be reticent, to say, hey, wait a minute.

This information is sensitive, and I can only share it on a true need-to-know basis. Not a want to know. Not an "I feel like I'm one of the cool kids because I feel like I need to know." An actual need to know.

There's a lot of gray area here. In fact, it's all gray area. There is nothing but gray area when it comes down to this stuff, because in cybersecurity you will be dealing with technologies that have a wonderful upside for defense and a horrid, horrid downside in the hands of the wrong people.

And you are trusted within your organization to be working with these technologies, to be using them within some sort of ethical guidelines, within some sort of moral boundaries. You need to know what those are for the organization you're in, and you need to understand how to be transparent about them.

When you're up against those boundaries, or when you've gone over them, what happens? How does that get handled? Now, that's gonna be different for everybody. But as a cybersecurity professional, or as somebody thinking about getting into cybersecurity, you need to understand that this is going to be a part of your day.

This is going to be a part of the discussion. So we go back to that original Google example that Ramona had written about, and I'll link to that below and tweet it out as always. These scientists working in AI, not even in cybersecurity, in AI, were uncomfortable with the potential uses of their technologies based on the contract, right?

They had a contract with the military, and they were providing technologies that could be used in malicious ways. They weren't necessarily being used that way, but they could be, and all they had was somebody's word saying, oh, don't worry, we won't use that in a negative way.

That team and that company decided that they were no longer comfortable with that, and so they've taken steps. Now, their decision is on them; I'm not gonna pass a value judgment here. But I think it's important that you be prepared for these types of discussions throughout your career, because they're gonna come up again and again and again. And remember, there's a difference between what you can legally get away with

and what you should be doing, what you feel is right to be doing, and what your organization should be doing and feels right doing. These are all gray areas, but the only way you get through them is discussion: getting it out in the open and tackling it.

And as always, same with this show, that's how I like to approach it. Hit me up online @marknca, in the comments down below wherever you're seeing this, and as always by email at me@markn.ca. Let me know how you've tackled it.

Have you had a particularly hairy situation that you had to work your way through that the community could learn from? That's a critical way to share as well. Everybody's gonna have different values, different morals, different ethics, and that's fine.

We all live in different communities, but we all share the same challenge of upholding them, living up to them, and working through these situations when we're confronted with something that runs against them. So let's get that discussion going. It's a big, deep topic for a Monday.

But I know you're up for it. I hope you have a fantastic day. I'll talk to you online, and I'll see you on the show tomorrow.
