
Cybersecurity Motivations

Mornings With Mark no. 0188

Watch the episode on YouTube

Join the discussion on LinkedIn

Bad Robot Transcript

Morning, everybody. How are you doing today? In this episode of the show, we’re going to talk about the role of security and privacy in technology. Bear with me as I go into a bit of a rant here. You may have noticed that the last episode of Mornings with Mark got flagged under the automated Content ID system on YouTube.

Apple made a copyright claim against most videos that, as far as I could tell, were using snippets from the keynote, but not against the full compilations from major media outlets like Engadget. Now, I believe that my video was within fair use in the US, or fair dealing here in Canada.

The format was this: I was talking to the camera, I would cut to a quick snippet of a key privacy or security point in the keynote for a few seconds, then come back and discuss that point, put it in a larger context, and refer to other research that supported it. I think that falls under the educational and editorial, as well as the research, exemptions around fair dealing and fair use.

I’m not a lawyer, so I can’t dispute it further on legal grounds; that’s just my interpretation. And based on the way the Content ID system really works on YouTube, there’s really no point in disputing it, because if Apple fires up its lawyers, I don’t have the resources to fight.

It’s always easier to just move on. It’s not a copyright strike, it was just a claim, so I simply re-uploaded the video with those parts removed and with placeholders referring people to the original video on Apple’s website. A minor thing, but it reminded me of a few points that came together around the larger role of security and privacy within technology.

So that was a copyright claim around a commercial work. Content ID as a system is very much designed to handle that, and it does it really well, perhaps too well, in that it steamrolls over a lot of fair use and fair dealing. Obviously it’s not in YouTube’s interest to push back on that; they want their advertisers’ and commercial interests guarded first.

There’s a very clear profit motivation there, and that’s not necessarily a bad thing. But at the same time, two issues popped up this week, with a lot of articles around them, concerning content on YouTube that doesn’t have a clear profit motivation, and the platform is waffling on dealing with them.

The first was a horrendous incident of continued harassment and hate speech against a particular activist, a journalist I believe. I’ll link to the article so you can read it yourself; I don’t want to call out those people here, that’s not the point. But it was pretty clear from the content, or the transcript of the content, that it was hate speech, pure and simple, and just horrendous to see. Yet YouTube, in a four-tweet explanation (don’t ask me why it takes four tweets), publicly explained that they had reviewed it, that it was a case of free speech, and that it didn’t go against their community guidelines. Most people are up in arms about that, saying it’s somewhat ridiculous: harassment is not allowed on the platform, and hate speech and incitement aren’t supposed to be allowed.

Yet you’ve reviewed this, which seems like a slam dunk, and you’ve allowed it to stay up. At the same time, there was also reporting highlighting an issue around YouTube’s recommendation algorithm and the promotion of videos, the rabbit hole effect, which YouTube continues to deny. But I think any of us who use the platform know that if you click on a video, all of a sudden you’re bombarded by other videos of a similar stripe, and you quickly get led away from the original video. Like a game of telephone, at every stop the original message gets a little dirtier, a little noisier, so that it’s less and less clear.

But in this case it was innocent home videos, or activity videos, that families were sharing among themselves or, unfortunately, marking as public. That’s a whole different issue, uploading videos of your kids to YouTube or pictures of your kids to social media. But anyway, these families had decided to do that, and the algorithm started to recommend those videos to potential predators, to people who were sexualizing children inappropriately, which is, you know, awful.

In the end, follow these links at your own risk. They are within the context of journalism, so it’s not that bad, but it is pretty tough stuff to read, to know that these things go on on the platform. Again, YouTube is saying the rabbit hole effect isn’t there. They are taking one step to help prevent this, in that they won’t let kids live stream, which is just a tiny, tiny step; you could do a lot more to protect kids and to inform parents and families about what’s appropriate and what’s not appropriate on the platform.

But all of this ties back to motivation, and you’re probably thinking: how does this tie back to cybersecurity, to privacy? It’s pretty clear that people act in rather predictable ways to defend their interests, and that they’re incentivized to defend those interests.

Fernando Montenegro from 451 Research, a fantastically smart individual, has long been a proponent of applying economic modeling to cybersecurity issues, essentially looking at motivations versus outcomes. And I think the interesting point here is that YouTube has a very clear profit motivation, and their actions line up with it, while everything else is so far away from that clear profit motivation that it’s hard for them to make a decision. What I mean is that keeping users on the platform and happy has only an indirect line to their revenue stream.

Whereas keeping the people who are paying for ads happy is a far more direct line than the two or three hops from keeping users on the platform. The two are obviously connected, but it helps explain how they prioritize. The same thing happens with Facebook, with scandal after scandal after scandal, and Twitter recently continues to struggle to implement the very basics of their policies, or to do so consistently. Again, it comes down to motivation: their motivation is advertisers and keeping people on the platform, and if people aren’t leaving in droves, they’re not going to take action. Unfortunately, that shapes how they handle content. They obviously took down a huge amount of ISIS-related terrorist content years ago, but they do little about white nationalist and far-right extremist content, which, arguably, based on the statistics, is responsible for far more terrorist acts within the continental US than ISIS. The real challenge here, as far as motivation goes, is that the platforms are struggling to do what’s right, and it really comes down to a technical aspect: objective versus subjective.

Content ID is objective: it takes the audio from something like the Apple keynote and matches it up against the audio in various videos, and you can objectively say this clip is that content. The other types of content are subjective, and that takes humans; it takes judgment calls, it takes clear policy, and it takes practice.
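
To make the objective part concrete, here’s a toy sketch of the idea, purely my own illustration and not how Content ID actually works internally: fingerprint the reference audio, fingerprint the upload, and flag the upload if enough of its chunks match. The function names, chunk size, and threshold are all made up for the example.

```python
# Toy illustration only -- not YouTube's actual Content ID. Real systems use
# robust acoustic fingerprints that survive re-encoding and time offsets;
# hashing raw chunks just shows why the check is objective: the uploaded
# audio either matches the reference catalogue or it doesn't.
import hashlib
import random

def fingerprint(samples, chunk=1024):
    """Hash fixed-size chunks of audio samples into a set of fingerprints."""
    prints = set()
    for i in range(0, len(samples) - chunk + 1, chunk):
        prints.add(hashlib.sha1(bytes(samples[i:i + chunk])).hexdigest())
    return prints

def likely_match(reference, upload, threshold=0.2):
    """Flag the upload if enough of its chunks appear in the reference."""
    ref, up = fingerprint(reference), fingerprint(upload)
    return bool(up) and len(ref & up) / len(up) >= threshold

# Pretend audio: a "keynote" and an upload reusing a chunk-aligned slice of it.
rng = random.Random(42)
keynote = [rng.randrange(256) for _ in range(30 * 1024)]
upload = keynote[5 * 1024:9 * 1024] + [rng.randrange(256) for _ in range(4 * 1024)]
print(likely_match(keynote, upload))  # True -- the reused snippet is detected
```

The point is that this kind of question has a yes-or-no answer a machine can compute at scale; “is this harassment?” does not.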

So what does this have to do with privacy? What does this have to do with security? Well, I’m glad you stayed with me this long, because I promise this all wraps up, or at least I hope it does. I believe that security and privacy are fundamentally misunderstood because of the way they’ve been practiced for a really long time.

For the longest time, security has been bolted on, and privacy has been this hard shell around the edge: we won’t let information leak out to the public, but internally it’s fair game. We covered that a while back when we talked about information management and Facebook’s new direction. The problem is that people now have the perception that security is bulky, that security is problematic, and on privacy the thinking goes something like this:

“I can’t build privacy into my products because I won’t have the ability to apply machine learning and big data analytics and take advantage of all this valuable data. I can’t resell it. I can’t do this, that, and the other thing.” I don’t think those claims hold up in the light of day.

They don’t pass the sniff test; they’re easy answers to push back on, as is the claim that security is really challenging and you need to consult the wizard on the hill to understand it. Security is really straightforward: it’s trying to make sure that whatever system you build does what you intend, and only what you intend. And privacy is all about information management: not exposing information needlessly, controlling access, and monitoring that access to ensure that you know what you’ve got, but also that you don’t infringe on people’s privacy; being clear and transparent about the information you’re taking, and holding yourself to that standard. I think all of these things, privacy by design and security by design, are done better in practice by breaking down these barriers, by making sure that security teams and privacy-focused developers are working together to build a solution from the outset.
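
As a rough sketch of what controlling access and monitoring that access can look like in practice, here’s a minimal, hypothetical example; the structures and names (`AccessPolicy`, `audit_log`, `read_record`) are mine for illustration, not any particular product’s API.

```python
# Minimal sketch of "control access and monitor it" -- hypothetical names,
# not a real product's API. The policy says which roles may read which fields
# for which declared purposes, and every request is logged either way.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AccessPolicy:
    allowed: dict = field(default_factory=lambda: {
        ("support", "ticket_lookup"): {"name", "email"},
        ("analytics", "aggregate_reporting"): {"country"},
    })

audit_log = []  # in practice this would be an append-only store

def read_record(record, requester_role, purpose, fields, policy):
    """Return only the fields the policy allows, and log every attempt."""
    permitted = policy.allowed.get((requester_role, purpose), set())
    granted = {f: record[f] for f in fields if f in permitted and f in record}
    audit_log.append({
        "at": datetime.now(timezone.utc).isoformat(),
        "role": requester_role,
        "purpose": purpose,
        "requested": sorted(fields),
        "granted": sorted(granted),
    })
    return granted

record = {"name": "Ada", "email": "ada@example.com", "country": "CA", "ssn": "..."}
print(read_record(record, "support", "ticket_lookup", ["email", "ssn"], AccessPolicy()))
# -> {'email': 'ada@example.com'}; the denied "ssn" request still lands in audit_log
```

Every request leaves a trail whether or not it was granted, which is what lets you answer both “what do we hold?” and “who has been looking at it, and why?”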

But the problem is that most of our organizations are not set up like that. Most of our organizations are set up to bolt that stuff on at the end: to do compliance audits, to do reviews before things go live. And where is the motivation? Because we’re not seeing security or privacy by design, there’s no incentive, right? The incentive is…

How you doing today in this episode of the show? We’re going to talk about the role of security and privacy in technology. The bear with me as I kind of go into a bit of a rant here. You may have noticed the last episode of mornings with Mark got flagged under the content ID automated system on YouTube.

I’m Apple made a copyright claim against most videos that I could tell I that were using Snippets from the keynote, but not the full compilations from major media Outlets like Engadget average. Now, I believe that my video was within fair use in the US or Fair dealing here in Canada.

And because this format was this I was talking to the camera. I would cut to a quick snippet of a key privacy or security point in the keynote app for a few seconds then come back and then discuss that point put it in the larger contacts time refer to other research that support of that point and I think that falls under the educational and editorial as well as the research on exemptions around fair-dealing and fair use.

I’m not a lawyer so I can’t dispute it after on legal grounds. It just based on my interpretation in the way that Content ID. A system really works on YouTube. There’s really no point in disputing it because if Apple fires are lawyers up. I don’t have the resources to fight.

It’s always easier. It’s not a copyright strike. It was just a claim. So I just simply re-uploaded the video with those parts removed with placeholders referring people to the original video on Apple’s website so that minor thing there but it really kind of reminded me and triggered a few points coming together to talk about the larger role of security and privacy within technology.

So well, that was a copyright claim around a commercial Works Content ID as a system is very much designed to do that. And it does that really well perhaps too. Well in that it’s steamed rolls over a lot of fair use in fair-dealing and obviously it’s not in YouTubes interest to support that they want to have their advertisers and Commercial interests are guarded first.

So very clear profit motivation there and that’s not necessarily a bad thing. But at the same time to articles popped up our two issues with a lot of Articles popped up this week around Content on YouTube that don’t have a clear profit motivation. So the platform is waffling on dealing with them.

The first was this horrendous incident of continued harassment and hate speech against a particular activism journalist. Am I believe there are journalist. Anyway all I’ll link to the article you can read themselves that I want to call out those people here. That’s not the point at but it was a pretty clear the content or the transcript to the content that that was hate speech at pure and simple just horrendous to see but you tube in a for tweet explanation don’t ask me why times for tweets explain publicly that they reviewed it and it was a case of free speech and it didn’t go against their Community guidelines, which most people are in arms about saying it’s somewhat ridiculous harassment is not allowed on the platform and hate speech and you know incitement isn’t supposed to be allowed yet.

You’ve reviewed this which seems like a slam dunk to to nail it and you’ve allowed it to go on at the same time. There was also Highlighting at an issue around YouTube’s algorithm and the promotion of videos in the rabbit hole affect which YouTube continues to deny but I think any of us who use the platform know that if you click on a video all of a sudden you’re bombarded by other videos of similar stripes and you quickly sort of get away from the original video and by being lead a sore like a game of telephone a little bit every stop get the original message a little bit dirtier little noisier so that it’s less and less clear.

But in this case it was innocent home videos or activity videos that families were sharing time with themselves or marking is public. Unfortunately. I’m going to give it a whole different issue around uploading videos of your kids to YouTube’s or pictures of your kids to social media. But anyway, these families have decided to do that and the algorithm it started to recommend them to a potential Predators have two people who were sexualizing children and inappropriately, which is you always Stop in the end.

You don’t follow these Links at your own risk. They are within the context of Journalism. So it’s not that bad but it is pretty stuff Tuff Stuff to read to know that these things go on the platform Again YouTube saying what the rabbit hole effect, isn’t there. Are they taking one step to help prevent this and then it won’t let kids live stream, which is you don’t just a tiny tiny stuff for then you do a lot more to protect kids and I did inform parents and families about what’s appropriate.

What’s not appropriate platform, but all this ties back to motivation and you’re probably thinking what is a soul-tie back to cybersecurity to privacy. It’s pretty clear that people are in rather predictable ways to defend their interests and their incentivized to defend those interest. So I Fernanda Montenegro from 451 research Ave.

Fantastically smart individual has long been a proponent of using economic modeling to cyber security issues in the centrally looking at motivations vs. Outcomes. And I think this is an interesting one here is that YouTube has a very clear profit motivation and their actions line up with that and everything else is so far away from that clear profit motivation that it’s hard for them to make a decision and it’s not just you know, what I mean here is that keeping users on the platform and happy has an indirect lying to their revenue stream.

Where is keeping? The people who are paying for ads is a far more direct line then and now the two or three hot from keeping users on the platform now, they’re obviously combined and but it helps the prioritize and the same thing happens for Facebook with have scandal after scandal after scandal in Twitter recently continues to struggle to implement the the very basics of their policy to do it consistently and, you know again to motivation their motivation is advertisers and keeping people on the platform if people aren’t leaving in droves, they’re not going to take action and unfortunately based on that the content so they obviously took down a huge amount of Isis related to terrorist content years ago with a Danoodle owl white nationalist and far-right extremist content, which arguably based on the statistics is responsible for Farm or a terrorist acts within the Continental us and then Isis but real challenge here is as far as motivation is the platforms of struggling to do what’s right and it really comes down to a technical aspect of objective vs.

Subjective. So content content ID is objective. It’s taking the audio from something like the Apple keynote and matching it up to the audio in various videos and you can objectively say this is that the other types of content are subjective and that’s takes human in that takes judgment calls that takes clear policy and that takes practice.

So what does it have to do with privacy? What does this have to do with security while I’m glad you stayed with us this long because I promised this all wraps up at least. I hope it all wrapped up. I believe that security and privacy are fundamentally misunderstood because the way that they’ve been practiced for a really long time.

So for the longest time security is always been bolt on privacy has always been this hard shell around the edge of we won’t let information leaked out to the public. But internally, it’s fair game. We covered that a while back on information management and with Facebook’s new Direction. The problem is that people now have this perception that security is bulky security is problematic and privacy.

I can’t build privacy into my products because I won’t have the ability to apply machine learning and big data analytics and take advantage of all this valuable data can’t resell it. I can’t do this that the other thing and I don’t think those things hold up the light of day.

They don’t pass the sniff test easy easy answers to push back on on people go when they are Camp Hill Security’s really challenging and you need to know the wizard on the hill to understand security security is really straightforward. It’s trying to make sure that whatever system you build does what you intended only You intend and privacy is all about Information Management not exposing information needlessly and controlling access and monitoring that access to ensure that a you know, what you’ve got but also that you don’t infringe on people’s privacy being clear and transparent in the information you’re taking and holding up to that standard and I think all of these things privacy by Design security by Design and in practice is done better by breaking down these berries by making sure that security teams and privacy Focus developers are working together to build a solution from the outside.

But the problem is all of our organizations are not set up like that. Most of our organizations are set up the bolt that stuff on at the entrance to do compliance audits to do reviews before they go live and where is motivation because we’re not seeing security or privacy designs as there’s no incentive, right? The incentive is

More Content

Related Content