
Ethics And Action In Technology

Ethical questions and quandaries are tough enough to work through when they are theoretical. But when you're confronted with them in the real world, there are usually real-world consequences. This makes a hard situation even harder. What do you do? What can you do?


Watch this episode on YouTube.

Reasonably Accurate Transcript

Good morning everybody. How's it going today? I'm coming to you a little early today for episode 71. I have some scheduling challenges today and throughout the rest of the week, as I'll be in the EU talking to some partners and some customers. So stay tuned — I'm going to try to hit the normal times, but I may be off by a little bit.

What I want to talk about today is a follow-up to episode 70, where we talked about ethics in technology and ethics in cybersecurity, and how, as a cybersecurity professional, you're going to have to be prepared to have those conversations: around the use of the technologies we're deploying, around access to information, around privacy, and around keeping certain things compartmentalized in an age where the push is very much the opposite — to share everything, to collaborate, to have no barriers.

And I'm an advocate of some of that, as far as removing barriers and facilitating collaboration — obviously, that's how business moves forward. However, from a security and privacy perspective, you also need very clearly defined controls that are monitored. What I wanted to talk about in this episode is a follow-up to that, because a couple of things crossed the radar yesterday.

So I had referenced the Google AI team forcing an internal discussion, which resulted in Google deciding not to renew its contract with the military in the US. But two other things came to light. Microsoft had Immigration and Customs Enforcement as a customer on Azure, and they've since pulled down that customer reference case study given the current political situation with that department in the US.

And they're facing some internal backlash, with engineering teams and other internal teams within Azure saying, wait a minute, is this what we want? An adjacent story: AWS received a letter from a number of shareholders just before the weekend about the use of the Rekognition Video technology.

This is an AI model that does face detection as well as object detection within videos. The concern is its use in law enforcement, because there have been a few case studies with law enforcement where they've been able to use this technology to set up easy face recognition systems.

And AWS actively markets it to law enforcement and to government as well, so there's that extra step. Now, this is interesting because the backlash is coming in two forms: one, we're seeing internal teams say this is not what we're OK with; the other is from shareholders.

Now, shareholders have always had pressure over companies. That's just how the publicly traded company system works, so I don't need to dive into that. What I wanted to raise the flag on was internal teams. You need to be aware that this stuff is not free.

What I mean by that is that standing by your ethics and your moral position is not free; it's not without cost. Now, we're seeing these examples in AI, which is good, because AI engineers and AI researchers are in a position of strength right now.

They are so in demand that it's not even funny. It's a crazy hot field, and they're getting poached from other companies left, right, and center. None of these folks with an AI skill set are going to have trouble finding another role.

So they are in an excellent position to put their foot down and say: wait a minute, this is not what we are OK with. I am not OK with us crossing the line to building a face detection technology that could be racially biased, or with our technology being implemented in a way that helps the military acquire targets, things like that.

They are in a position where, if the company came down and said, no, this is what we're going to go ahead and do, they could walk away. The same is somewhat true in cybersecurity: cybersecurity is in hot demand, and if you have the right skill set and the right network, you can easily find another role after leaving one for moral or ethical reasons.

Now, that's not true all the time, and it's not true for everybody. So that raises the stakes even more. If you're asked to deploy a monitoring technology and you're seeing abuse with it, there is a cost, or at least a potential cost, if you come forward with that. Now, I'm not saying don't do it.

I'm just highlighting the challenge. You thought it was hard before, having a conversation where you say: hey, wait a minute, I'm not comfortable with our technology being used like this, or I'm not comfortable with us sharing information in the following way. You need to understand there are consequences to those actions if you take them: you could be fired, you could be let go, you could be ignored, or you could simply be told no.

You could be told: this is definitely the direction the company wants to go. And then you're faced with a really hard decision: is that something you want to associate yourself with? Now, that's an entirely personal decision — a decision within your family, within your community, within yourself. I'm not going to give you guidance either way.

What I wanted to highlight, though, were the circumstances behind the latest wave of news stories. These AI researchers are in a fantastic position to make a stand like this and to force change, because there is such a small pool of qualified AI researchers that the company can't just let them go.

Right? They're going to lose market share and market traction if they let these people go. That's an enviable position, a fantastic position to facilitate change from. Cybersecurity is not in quite as strong a position, but it's close — for now, with the right folks and the right skill set. You're not immune just because you're in cybersecurity or an AI researcher.

But I think this is really important, and I think it's something we need to discuss — we need to give people the tool set internally to have that conversation. I was quite fortunate that when I did my master's in information security, ethics was a course. It was a mandatory course.

I took a full semester on ethics: some interesting discussions in the classroom, some interesting essays we had to write to defend our positions under different structures and different schools of thought. And I don't think that's covered enough. I think it has to be an absolutely core component, because even if we walk away from some of the extreme examples we're seeing right now, look at Facebook, look at any social network out there, any ad tech network, where the goal of the technology is essentially to follow people around and try to convince them to buy something.

And more often than not, that happens without people's knowledge of the depth of the technology. There are a lot of questions there. But because we're not teaching this early, we're not getting people to ask these questions and have this discussion. We only get to it once things become egregious.

And so we're already at an extreme, at a precipice, and then we say, well, we can walk back a bit — as opposed to not starting down that path in the first place. No answers again today. I just wanted to highlight these challenges, and the special case where AI and cybersecurity are in a position of strength from an employee perspective.

But there are definitely consequences to this. Again, this is something where transparency and sunlight are absolutely critical: getting out in front of it, talking about it. In that spirit, hit me up online at @marknca, in the comments down below, or as always by email, me@markn.ca. I'm interested in hearing about challenges you've faced around ethical or moral lines.

Share your thoughts on this and how to approach it within your teams, within your company, and within our community. Let me know. I hope you're set up for a fantastic day. As I mentioned at the outset of this episode, the schedule will be a little up and down this week due to travel to the European Union.

But I hope to talk to you tomorrow and, as always, online.
