Archive · 4 min read

Facial Recognition Discussion Required

The CBC Radio segment has been archived and is only available from CBC by request.

This is the third radio column in my new series on CBC Radio’s Ottawa Morning. This time around, Robyn Bresnahan and I talk about facial recognition technology and how it’s being used in our communities.

What Is Facial Recognition?

Facial recognition is a technology that can identify people in an image or video. This is commonly done by identifying various facial features and reducing them to a unique set of attributes.

Identifying that part of the image or video is a face is step one. This type of facial recognition is widely deployed in photography. Cameras work hard to identify faces in order to adjust focus, lighting, colour space, etc.

The second step is to identify the face by measuring the distance between the eyes, the bridge of the nose, the angle of the jawline, etc. in order to create a unique model for that face. That model is then compared with an identity database in order to figure out who the face belongs to.
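To make those two steps concrete, here’s a minimal sketch using the open-source face_recognition Python library. This is just one way to do it (the column doesn’t prescribe a specific tool), and the image files and the one-person “identity database” are placeholders:

```python
# A minimal sketch of the two steps above, using the open-source Python
# "face_recognition" library. File names and the single known face are
# placeholders for illustration only.
import face_recognition

# Step 1: detection -- find where faces appear in the image.
image = face_recognition.load_image_file("crowd_photo.jpg")
face_locations = face_recognition.face_locations(image)
print(f"Found {len(face_locations)} face(s)")

# Step 2: identification -- reduce each face to a unique set of measurements
# (an "encoding") and compare it against known, labelled encodings.
known_image = face_recognition.load_image_file("known_person.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

for encoding in face_recognition.face_encodings(image, face_locations):
    is_match = face_recognition.compare_faces([known_encoding], encoding)[0]
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    print("Match" if is_match else "No match", f"(distance {distance:.2f})")
```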

In the past few years, we’ve seen a number of advances that have had a dramatic impact on facial recognition.

The first is an advancement in face detection itself using new computer vision techniques. Computers are better than ever at detecting a face regardless of the image or video being analyzed.

The second is a constant and significant reduction in the cost to analyze imagery. A lot of this has to do with advances in cloud technologies. AWS, Microsoft, and Google all offer these types of analysis services for roughly $0.10 per minute of video.
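As a rough illustration of how accessible these cloud services are, here’s a sketch of a single call to Amazon Rekognition through boto3, one of the providers mentioned above. The image path is a placeholder, and credentials and billing come from your own AWS account:

```python
# A rough sketch of calling one of these cloud analysis services (Amazon
# Rekognition via boto3). The image path is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("street_scene.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # also returns estimated age range, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    emotion = max(face["Emotions"], key=lambda e: e["Confidence"])["Type"]
    print(f"Estimated age {age['Low']}-{age['High']}, appears {emotion}")
```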

This means that facial recognition is more accurate than ever and more accessible than ever. Understandably, it’s popping up in more and more scenarios.

Positive Usage

Whether you know it or not, you’ve been using facial recognition technology almost daily.

Google Photos and Photos.app in the Mac ecosystem both use facial recognition to help sort your photos. They do this in the background in order to make your ever-growing library of photos manageable. There’s simply no way you could keep up trying to manually flag each person in each photo.

Similarly, smartphones with more than one camera use facial recognition to enable new shooting modes that rival, and may even surpass, traditional photographic methods.

Me with a swinging Snapchat filter applied

A fun application of this approach is the lenses and filters in Snapchat. The app uses facial recognition to map out your features and then layer fun or scary graphics on top.

At the 2018 royal wedding, Sky News deployed facial recognition to identify and flag celebrities in the audience to viewers around the world.

There have also been a number of apps published that attempt to detect your emotions or age from photos or video. Taking that a step further, Google researchers also published a fun app that attempts to match you to the fine artwork that looks most like you.

Potential Abuse

The challenges arise from other, increasingly common use cases. Recent reports show various law enforcement organizations using or trialing facial recognition technologies at scale.

We’ve seen this in varying forms in Maryland, Orlando, Cardiff, Calgary, and elsewhere.

Because there are no guidelines, how and where this technology is used varies greatly.

The most troubling public case is in Cardiff, where the experiments had a 92% false-positive rate. That means the system was identifying the wrong person roughly 9 out of 10 times.

The only bright side is that facial recognition was not the only determining factor for this use case. Manual review ensured that no one was falsely arrested after being misidentified by the system.

And it’s not just law enforcement or government use cases that are troubling. An app called “FindFace” used facial recognition to match any provided photo to a social network profile. Surprising no one, this was quickly used for stalking and for digging up profiles hidden through obscurity.

But what if it didn’t stop there?

Dystopia

In 2002’s sci-fi thriller, Minority Report, there’s a scene set in the near future where the hero—played by Tom Cruise—is walking through a public space. As he walks, he’s called out by name by advertisements pitching him personalized offers.

While it seemed outlandish in 2002, we’ve slowly but surely moved closer and closer to making that a possibility. We have the technology available now.

And it is starting to come true.

In China—where admittedly, the culture around personal privacy is markedly different from most of the rest of the world—they have a program called “Sharp Eyes”.

This program looks to integrate cameras nationwide under a single, massive data-sharing effort. All of that data will be analyzed, tracking citizens throughout their daily lives.

Discussion Needed

What is needed in other communities is a discussion around the use of facial recognition technologies.

Law enforcement has been tasked with a job and they will do that job to the best of their abilities, with the tools available, within the guardrails provided by the community and various legislation.

It is the community’s role to set those guardrails and regularly update them based on advances in technology and changes in cultural sensitivities.

And it’s not just law enforcement that needs guardrails. We need to ensure that any discussion around the use of this and other technologies that impact the community is a productive one.

This is the discussion that Microsoft has attempted to start in the US.

In a seemingly odd step, Microsoft has asked federal regulators to form a commission to examine the issue. As a purveyor of this technology as a service, Microsoft has a vested interest in keeping it on the market. However, the community will benefit from any discussion.

On its own, facial recognition is neither good nor bad. It’s the use of the tool that provides the context to make that type of value judgment.

As a community, we need to have an ongoing discussion to understand what uses we will accept and those we will not.
