Watch this episode on YouTube.
Reasonably Accurate Transcript
How's everyone doing today? Good. I'm double-checking my feed; someday I'll have confidence in the stream. Hm, I have confidence in this stream today. OK. One of my favorite words of all time is "amok," as in "AI is running amok."
It's a word you never use in a positive context, because it means uncontrollable or disruptive behavior. And that's what we're seeing. So I talked a bit about AI in the context of the upcoming announcements this week from Microsoft Build and from Google I/O (not Google Next; Next is coming up in a few months), and some other examples. Now, interestingly enough (I'm just adjusting the microphone a little here, unsuccessfully of course... there we go, OK), the first wave of Google announcements was sort of hit or miss. I found it ironic that they called out the beer emoji.
That's the follow-up to the hamburger emoji they had made a mistake on; of course, they made mistakes on both the phone and the beer emoji. So they're correcting that, but they also made a whole bunch of new announcements yesterday: updates to ML Kit, and Android P, which has a whole bunch of new machine learning techniques to help you use your phone less. And importantly, they announced that the Google Assistant will be able to make phone calls to businesses on your behalf, autonomously, to set up appointments and things like that. "Run amok" was the first term that sprang to mind, and it continues to sit there and bubble a little, because of the questions we were asking yesterday about AI and ethics and what we want AIs to do, and the bigger discussion. This hits on another pet peeve of mine: I get asked quite often by journalists around the world.
"Was this a smart security decision?" "Was this a good decision for privacy?" Things like that. My response is almost always the same (at least, I hope it's relatively consistent): I don't have a problem with any given decision, as long as it's made explicitly. And this is the challenge we have with a lot of these Google announcements. They say, "Hey, look at all these great features in Android P to help reduce your addiction to your smartphone." But what they don't highlight is that to do that, they need to know you even more intimately, connect even more data to you, and build an even better profile.
"Look at all these cool advances we've made in Google Photos." And they are cool, don't get me wrong; these features are amazing. But alongside those cool new features comes: "Oh yeah, of course, to get this better awareness of your photos, we harvested even more of them and scanned everything we could to train our AIs better."
And then there's the Google Assistant stepping very much into the real world, interacting with other people without saying what it is. That's a bit of a red flag for a lot of people; it fired up Twitter a little earlier. There's a concern there. Is it legitimate? Are we cool with AIs having human interactions without the human knowing they're interacting with an AI directly? That's the big key here: directly. It's one thing for AIs to augment. In the case of the Welsh police that we talked about yesterday, the AIs were scanning through hundreds of thousands of faces to bubble up a few for humans to take to the next step.
This is AI taking it all the way. I can understand the Silicon Valley mindset: "Hey, what a cool thing we can do for users: we can get the AI to call places and book appointments for them." But what about the flip side, the people receiving those calls? You pick up, and it's this weird voice, and you're like, "I don't know how to handle this," and it turns out to be an AI that's poorly trained, or an AI that's behaving poorly, as opposed to a real human. So there are definitely some lines being crossed here. But my biggest problem, the underlying takeaway from Google I/O this week, was "we're going to build even better profiles about you." And Google has come through relatively unscathed by the spotlight that's hit Facebook recently.
But again, for most people these are implicit transactions, not explicit, and I don't like that. I like explicit transactions. I think that's where we find strong privacy, and that's where we find strong security: laying things out for people and saying, "Here's what we're doing, here's what you get back. Are you OK with that?" There's also a parallel discussion happening right now on Twitter with a few folks around Apple and their disclosure: when you say, "Hey, I want all the data you have on me," what do they send back, and is it actually complete or not?
I had raised the point that Apple was sending back what they had, but where they failed was in not being crystal clear: "Here's what we have. Here's what you have stored with us that we don't have access to, or only have access to under a legal request. And here's what we trace around you to build a profile." So there's a lot of work to be done here, and I think this is the tip of the iceberg. But this is my big takeaway, my big caution, for today: when companies come out with all the marketing spin and go, "Hey, look at all this cool new stuff."
I hear warning bells. Warning bells. I like explicit: "Hey, this is what we're doing; this is what we're giving you in return." A positive example: most security vendors, including the one I work for, do this. You either opt in or opt out (depending on the vendor) of sharing some anonymized or pseudo-anonymized security information. Here's what happens on the back end: the data gets analyzed to determine whether it's malicious, and everyone benefits from it. There's no tracking done, no tracing done, no profile being built up. It's not perfect, but at least there's an explicit agreement: here's what you give, by sharing some of this data that's been cleansed or at least filtered.
And here's what you get back in return: better protection. It's a direct trade, and it's not just you getting better protection; it's everybody. Facebook makes a very understandable trade-off, but it's implicit, not explicit: by sharing all of your activity on the network, they protect your user information, but information about the users is what they sell to advertisers, to the point where it could impact community norms and societal norms. And we're seeing that in a lot of places. There's an election coming up here in Ontario, Canada, and I'm seeing tons of ads on YouTube that are explicitly biased, and actually probably against Canadian campaign laws, but with no information about who's putting them out there and who's trying to influence the views of people in the province.
So, major issues, and I've gone off on a bit of a tangent here. But the takeaway is this: when it comes to user privacy and security, explicit agreements are far better than implicit ones. And AI is starting to creep into a lot of places, so we need to talk about it now, before it goes everywhere, so that we can make these explicit agreements and understand what we're trading off and what we're getting back in return. It's not that these things are necessarily bad; they just need to be out in the sunlight and transparent, so that you can choose consciously, as opposed to having things pushed on you. So please, by all means, hit me up online: marknca. I'm looking forward to talking to you about this issue; it's the only way we're gonna get through it. It's going to be a hot-button issue for the next few years, for sure.
The discussion needs to start now. Have a great day. I hope your day is as beautiful as it is out here; I decided to film outside today because it's already plus 24. Why wouldn't you? Take care. I'll talk to you all soon.