
Cybersecurity Motivations

Recently a video of mine was flagged by YouTube's automated Content ID system, which may or may not have been justified. Regardless, it got me thinking about the mismatch in motivations for builders investing in cybersecurity and privacy.


Watch this episode on YouTube.

Reasonably Accurate 🤖🤖 Transcript

Morning everybody. How are you doing today? In this episode of the show, we're gonna talk about the role of security and privacy in technology. Now bear with me as I go into a bit of a rant here. You may have noticed the last episode of Mornings with Mark got flagged under the automated Content ID system on YouTube.

Apple made a copyright claim against most videos, as far as I could tell, that were using snippets from the keynote, but not against the full compilations from major media outlets like Engadget or The Verge. Now, I believe that my video was within fair use in the US, or fair dealing here in Canada, because essentially the format was this: I was talking to the camera.

I would cut to a quick snippet of a key privacy or security point in the keynote for a few seconds, then come back, discuss that point, put it in the larger context, and refer to other research that supported it. And I think that falls under the educational as well as the research exemptions around fair dealing and fair use.

But again, I'm not a lawyer, so I can't dispute it on legal grounds, just based on my interpretation. And given the way the Content ID system really works on YouTube, there's no point in disputing it, because if Apple fires up their lawyers, I don't have the resources to fight it. So it's easier not to.

It wasn't a copyright strike, just a claim. So I simply re-uploaded the video with those parts removed, with placeholders referring people to the original video on Apple's website. A minor thing, but it reminded me of a few points that came together around the larger role of security and privacy within technology.

So that was a copyright claim around commercial works. Content ID as a system is very much designed to handle that, and it does it really well, perhaps too well, in that it steamrolls over a lot of fair use and fair dealing. And obviously it's not in YouTube's interest to support those exemptions; they want their advertisers' and commercial interests guarded first.

Very clear profit motivation there, and that's not necessarily a bad thing. But at the same time, two issues, with a lot of articles around them, popped up this week involving content on YouTube that doesn't have a clear profit motivation, and the platform is waffling on dealing with them. The first was a horrendous incident of continued harassment and hate speech against a particular activist and journalist.

I believe they are a journalist; anyway, I'll link to the article and you can read it yourselves. I don't want to call out those people here; that's not the point. But it was pretty clear to anyone who viewed the content, or the transcript of it, that it was hate speech, pure and simple. Just horrendous to see.

But YouTube, in a four-tweet explanation (don't ask me why it was four tweets), publicly explained that they had reviewed it, that it was a case of free speech, and that it didn't go against their community guidelines. Most people are up in arms about that, saying it's somewhat ridiculous: harassment is not allowed on the platform, hate speech and incitement aren't supposed to be allowed, yet you've reviewed this, which seems like a slam dunk, and you've allowed it to go on.

At the same time, there was also an article highlighting an issue around YouTube's algorithm and the promotion of videos, the rabbit hole effect, which YouTube continues to deny. But I think any of us who use the platform know that if you click on a video, all of a sudden you're bombarded by other videos of a similar stripe, and you're quickly led away from the original video.

It's sort of like the game of telephone: every hop makes the original message a little dirtier, a little noisier, so it's less and less clear. But in this case, it was innocent home videos, or activity videos, that families were sharing among themselves or marking as public.

Unfortunately, and again, there's a whole different issue around uploading videos or pictures of your kids to YouTube or social media at all, but either way, these families decided to share them, and the algorithm started recommending those videos to potential predators, to people who are sexualizing children. Just some really abhorrent and horrific stuff, and follow these links at your own risk.

They are within the context of journalism, so it's not that bad, but it is pretty tough stuff to read, knowing that these things go on on the platform. And again, YouTube is saying, well, no, the rabbit hole effect isn't there. They've taken one step to help prevent this, they won't let kids live stream, which is just a tiny, tiny step forward.

They need to do a lot more to protect kids and to inform parents and families about what's appropriate and what's not on the platform. But all of this ties back to motivation. You're probably thinking, how does this tie back to cybersecurity and privacy? Well, it's pretty clear that people act in rather predictable ways to defend their interests when they're incentivized to defend those interests.

Fernando Montenegro from 451 Research, a fantastically smart individual, has long been a proponent of applying economic modeling to cybersecurity issues, essentially looking at motivations versus outcomes. And I think this is an interesting case: YouTube has a very clear profit motivation, and their actions line up with it; everything else sits so far away from that clear profit motivation that it's hard for them to make a decision. And it's not just YouTube. What I mean here is that keeping users on the platform and happy has only an indirect line to their revenue stream.

Whereas keeping the people who are paying for ads happy is a far more direct line than the two- or three-hop path from keeping users on the platform. Now, the two are obviously connected, but it sets the priority. The same thing happens with Facebook, where we have scandal after scandal after scandal, and with Twitter, which continues to struggle to implement even the very basics of its policies consistently.

And again, it comes back to motivation: their motivation is advertisers and keeping people on the platform, and if people aren't leaving in droves, they are not going to take action. Unfortunately, it depends on the content: they obviously took down a huge amount of ISIS-related terrorist content years ago, but they continue to allow white nationalist and far-right extremist content, which, based on the statistics, is arguably responsible for far more terrorist acts within the continental US than ISIS. There's a real challenge here as far as motivation goes: the platforms are struggling to do what's right, and it really comes down to a technical aspect, objective versus subjective.

Content ID is objective. It takes the audio from something like the Apple keynote and matches it up against the audio in various videos, and it can objectively say "this is that." The other types of content are subjective, and that takes human judgment calls, clear policy, and practice.
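
To show just how mechanical that objective check is, here's a toy sketch in Python. To be clear, this is not YouTube's actual Content ID algorithm, which is proprietary; the window sizes, the raw-byte hashing, and the threshold here are all invented for illustration. A real system fingerprints robust audio features (like spectrogram peaks) so a match survives re-encoding and time shifts.

```python
import hashlib
import os

def fingerprints(samples: bytes, window: int = 4096, step: int = 2048) -> set:
    """Hash fixed-size windows of raw audio bytes into a set of fingerprints."""
    prints = set()
    for start in range(0, max(len(samples) - window, 0) + 1, step):
        prints.add(hashlib.sha1(samples[start:start + window]).hexdigest())
    return prints

def match_score(reference: bytes, upload: bytes) -> float:
    """Fraction of the upload's windows that also appear in the reference."""
    ref, up = fingerprints(reference), fingerprints(upload)
    return len(ref & up) / max(len(up), 1)

# Stand-ins for decoded audio; real inputs would be PCM samples, and real
# fingerprints would be invariant to offsets and compression artifacts.
keynote = os.urandom(16384)   # the "reference" the rights holder supplied
clip = keynote[8192:]         # an upload that reuses part of the keynote

# The decision is a threshold check, not a judgment call.
if match_score(keynote, clip) > 0.5:
    print("match: claim the video")   # hypothetical downstream action
```

The point is those last two lines: the outcome is a threshold comparison, yes or no, which is exactly why the platform can act on copyright claims so much more decisively than on hate speech or harassment.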

So what does this have to do with privacy? What does this have to do with security? Well, I'm glad you stayed with us this long, because I promise this all wraps up. At least I hope it all wraps up. I believe that security and privacy are fundamentally misunderstood because of the way they've been practiced for a really long time.

For the longest time, security has been bolted on, and privacy has been a sort of hard shell around the edge: we won't let information leak out to the public, but internally it's fair game. We covered that a while back when talking about information management and Facebook's new direction.

The problem is that people now have the perception that security is bulky, that security is problematic, and that privacy can't be built in: "I can't build privacy into my product because I won't have the ability to apply machine learning and big data analytics and take advantage of all this valuable data. I can't resell it. I can't do this, that, and the other thing." I don't think those claims hold up to the light of day; they don't pass the sniff test.

They're easy answers to push back on. People go, "Well, no, security is really challenging, you need the wizard on the hill to understand it." That's bull. Security is really straightforward: it's making sure that whatever system you build does what you intend, and only what you intend. And privacy is all about information management: not exposing information needlessly, controlling access, and monitoring that access, to ensure that (a) you know what you've got, and (b) you don't infringe on people's privacy, being clear and transparent about the information you're taking and holding yourself to that standard.
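
To make that "controlling and monitoring access" idea concrete, here's a minimal sketch in Python. Every name in it, the fields, the purposes, the requesting services, is hypothetical and invented for this illustration; the point is simply that every read of personal data goes through one gate that both checks the stated purpose and records the attempt.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(name)s: %(message)s")
audit = logging.getLogger("data-access")

# Which purposes may read which fields (all names hypothetical).
ALLOWED_PURPOSES = {
    "email": {"account-recovery", "billing"},
    "watch_history": {"recommendations"},
}

def read_field(record: dict, field: str, requester: str, purpose: str):
    """Return a field only if the stated purpose is allowed; audit every attempt."""
    allowed = purpose in ALLOWED_PURPOSES.get(field, set())
    audit.info("requester=%s field=%s purpose=%s allowed=%s",
               requester, field, purpose, allowed)
    if not allowed:
        raise PermissionError(f"{requester} may not read {field} for {purpose}")
    return record[field]

user = {"email": "a@example.com", "watch_history": ["v1", "v2"]}
read_field(user, "email", requester="billing-svc", purpose="billing")  # allowed, logged
# read_field(user, "email", requester="ads-svc", purpose="ad-targeting")  # raises, logged
```

That audit trail is what lets you answer "who saw what, and why?", which is the kind of transparency standard I'm talking about.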

And I think all of these things, privacy by design and security by design, are done better in practice by breaking down these barriers, by making sure that security teams and privacy-focused developers are working together to build a solution from the outset. But the problem is that most of our organizations are not set up like that.

Most of our organizations are set up to bolt this stuff on at the end: to do compliance audits, to do reviews before systems go live, as opposed to during the building stage. And where this ties back is motivation, because there's no positive incentive driving security or privacy by design, right?

The incentive is "I don't want to get slapped on the wrist" or "I don't want to end up on the front page, or all over the internet going viral, because of a privacy breach or a security breach." I don't think that's a proper motivator. We need to figure out some way to tie "this is the right thing to do, and the best way to build" to an economic motivator.

Now, the sad thing is, I don't have an answer for that. I don't know what that motivator is, but I know we need to find one, because look at YouTube: where they have a very clear profit motivation to keep copyrighted material off the platform, they do a great job. Meanwhile, all of the social networks are struggling with hate speech, with discrimination, with harassment, because at the end of the day there's no direct economic motivator to get them to deal with it.

It's a reputational issue. It's a "please don't leave our platform" issue. And we as users aren't doing our job by leaving; we're sticking around and saying, well, there's still good on these platforms, we'll put up with the rest. So there's no motivator there. It's the same thing when you're building technology: people aren't building security and privacy in because of this misperception that it's super hard, that it's super difficult, that it will stifle innovation and slow you down.

Not true. Where that perception comes from is the way we've implemented these things: with separate teams, with bolting it on at the end instead of building it in from the start. If you build it in from the start, these things are just another aspect of your design. Security is not going to slow you down. Privacy is not going to slow you down.

But what's the motivator? That's my question to you, my challenge to you: what is the motivator that will get people to start thinking that way? Yes, giving talks, doing vlogs, all that kind of stuff helps. But at the end of the day, there needs to be some sort of economic incentive or motivator that's not just compliance or regulation.

We haven't seen the full impact of GDPR yet, but I don't think that's the be-all and end-all; I think it's a lot more complex than that. And I'd love to hear your thoughts: hit me up online @marknca, in the comments down below, or, as always, by email at markn.ca.

What do you think that motivator is? Do you agree with my perception and my position on the role of security and privacy in design, that things should be secure by default and have privacy built in by default? How do we get there? Big questions. Hope you're set up for a fantastic day. I look forward to talking to you, and we'll see you on the next episode of the show.
