Transcriptions
Note: this content has been automatically generated.
00:00:04
Good afternoon, good evening. I'm going to talk about a subject which is
00:00:15
quite different from the ones that we've been
00:00:20
hearing about in the last hour. Our
00:00:28
subject is biometrics, and in particular biometric security. What interests me in particular is
00:00:35
the security aspect; you'll see in a moment what I'm talking about.
00:00:41
Biometrics: maybe you haven't heard about it, or maybe you thought it was rather
00:00:47
abstract, but most of you have it in your pocket,
00:00:51
because nearly everybody has a mobile phone,
00:00:56
and there's a high chance that it will be able
00:00:58
to carry out facial recognition or recognise fingerprints.
00:01:03
You were probably exposed to this just a year ago,
00:01:10
when you were still able to travel freely and
00:01:15
to use an automatic gate
00:01:20
for passport control: when
00:01:24
your face was recognised, the gate opened automatically.
00:01:29
Most of you have not been exposed to biometrics in the area of crime.
00:01:36
In the area of criminality, biometrics are of course used by
00:01:40
criminal investigation offices to analyse
00:01:43
traces left behind by criminals, and in
00:01:47
taking samples at crime scenes biometrics is used as a tool.
00:01:52
So here I have schematised in simple terms what a typical biometric system is in general.
00:01:59
I'm going to take facial recognition as an example, because of course we all know what a face is.
00:02:08
When you do facial recognition, what does one do? Essentially comparisons: one compares one
00:02:14
face with another. The first thing you need is a sensor:
00:02:20
you have to have a camera which takes an image of the face.
00:02:25
Then this image is processed in a number of stages
00:02:33
so that it can be compared at the end
00:02:36
of the chain with a reference which is contained in a database.
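The comparison stage just described can be sketched in a few lines; this is a minimal illustration, not the actual system discussed in the talk, assuming each face has already been reduced by the processing stages to a fixed-length feature vector (an "embedding"), and using an arbitrary acceptance threshold.

```python
import math

def cosine_similarity(a, b):
    # Compare two face embeddings (feature vectors) by the angle between them.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def verify(probe, reference, threshold=0.8):
    # Accept the claimed identity only if similarity clears the threshold.
    return cosine_similarity(probe, reference) >= threshold

enrolled = [0.12, 0.98, 0.05, 0.40]      # reference template in the database
same_person = [0.10, 0.95, 0.07, 0.42]   # new capture of the same face
impostor = [0.90, 0.10, 0.60, 0.05]      # capture of a different face

print(verify(same_person, enrolled))  # True: close to the reference
print(verify(impostor, enrolled))     # False: too far from the reference
```

A real system would compute the embeddings with a trained neural network; only the final compare-against-reference step is shown here.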
00:02:43
For example, you have probably recorded an image of your face in your mobile phone. It could be a video
00:02:51
surveillance application, or it could be used for an identity-check
00:02:56
application when you are entering a secure building.
00:03:02
AI has completely replaced this module here, module number three,
00:03:11
and you now find it everywhere. Six years ago that
00:03:18
was not the case: six years ago you had a human being
00:03:21
carrying out this stage; now it's being done
00:03:25
automatically by computer, based on a large amount of data.
00:03:30
I'm not going to explain exactly how facial recognition is done, but what
00:03:34
one does to deceive facial recognition: this is one of our research subjects.
00:03:40
On the one hand we try to play the bad guys: we try to break the
00:03:44
facial recognition systems. That's a very fun part of the job. Once we succeed in doing that,
00:03:50
we try to find ways and means of detecting attacks,
00:03:56
so we try to ensure that the biometric system becomes more robust and resists attacks.
00:04:02
For example, take the facial recognition application on my mobile phone:
00:04:10
the iPhone 10 is equipped with that,
00:04:15
but now you find very high-end, or not-so-high-end,
00:04:21
ranges of phones with face recognition systems, but they can be deceived fairly easily.
00:04:28
You can do this at home. I'm not going to cite the names
00:04:31
of brands, but
00:04:35
at the lower end of the range you can deceive these
00:04:38
telephones very easily, whereas it's much more difficult to
00:04:43
deceive an iPhone. I'll explain more about this.
00:04:50
So, what are the attacks? The attacks are represented by all these coloured dots:
00:04:55
these are the places where we have identified vulnerabilities in biometric systems.
00:05:03
Now, the ones which interest us in particular are the ones I'm going
00:05:07
to be studying at the moment: attacks number one and two.
00:05:12
Attack number one
00:05:15
is the direct attack:
00:05:21
you try to copy the biometrics, you falsify someone's identity, you try
00:05:25
to usurp that person's identity, to pass yourself off as that person.
00:05:32
You print a photograph and you try to pass yourself
00:05:35
off as another person. The second attack consists of injecting data into the system,
00:05:41
because you have to have some hacking talents in order to be able to penetrate
00:05:45
the IT system to be able to do this injection; not just anyone can do this.
00:05:52
So you have to penetrate the IT system and inject a false
00:05:56
piece of data, and this is what we're going to talk about.
00:06:05
I'm going to show you some examples of the progress that's been
00:06:09
made since 2009, when we started to work on these attack problems.
00:06:16
This is the first and simplest attack, which was published in 2009
00:06:20
and was reproduced on a larger scale in 2010, where we
00:06:27
showed systematically what you need to deceive
00:06:34
a laptop with a photograph.
00:06:39
You have to quantify this for
00:06:42
a number of different facial recognition algorithms; thanks
00:06:46
to this we managed to develop some pretty basic countermeasure systems,
00:06:59
so that, of course, the system normally
00:07:05
knows the difference between a real face and a photograph on a sheet of paper.
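One toy way to illustrate such a basic countermeasure (my own sketch with an arbitrary threshold, not the countermeasure actually developed): a photo of a photo tends to be flatter and blurrier than a live face, which a simple sharpness measure such as Laplacian variance can expose.

```python
def laplacian_variance(img):
    # img: 2-D list of grayscale values. The Laplacian responds to local
    # contrast; a flat, blurred recapture yields a much lower variance
    # than a sharply textured live capture.
    responses = []
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def looks_live(img, threshold=50.0):
    # Crude presentation-attack check: enough texture to pass as live?
    return laplacian_variance(img) > threshold

# Toy 4x4 patches: high-contrast "live" skin texture vs a flat "printed" one.
live_patch = [[10, 200, 10, 200],
              [200, 10, 200, 10],
              [10, 200, 10, 200],
              [200, 10, 200, 10]]
flat_patch = [[120, 121, 122, 121],
              [121, 122, 121, 120],
              [122, 121, 120, 121],
              [121, 120, 121, 122]]

print(looks_live(live_patch))  # True: lots of local contrast
print(looks_live(flat_patch))  # False: almost no texture
```

Real countermeasures combine many such cues (texture, colour, depth, motion); a single sharpness score is easy to defeat, which is exactly the arms race described in the talk.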
00:07:11
You can go further than this,
00:07:14
and you can imagine going further. We've left out some of the stages; we've left
00:07:19
out some intermediate steps, but you can go even further: you can
00:07:24
make some ultra-realistic masks in three dimensions: full
00:07:28
sculpted masks, with holes for the eyes so the wearer can see through, made of different materials,
00:07:36
made of resin; we studied some in the past made
00:07:39
of resin. Here they are: ultra-realistic, photo-realistic
00:07:44
masks made of plastic or of silicone. If you came up to the second floor
00:07:49
here today, you will have seen some of these masks and the systems used to detect them.
00:07:56
we created these masks in my research project to see
00:07:59
whether you can break the systems with more elaborate attacks,
00:08:03
the aim of course being to create more effective countermeasure systems.
00:08:12
Another form of attack: these attacks are what
00:08:17
we call attacks based on artefacts, in other words
00:08:20
objects that resemble a human being, in order to see if the machine
00:08:28
can be fooled as well as the human. You can sometimes simply use makeup to change a person's appearance.
00:08:35
It was shown, not by us but by someone else, that it is
00:08:38
possible to change one's appearance in order to resemble someone else.
00:08:43
On the internet there are artists who change their appearance just by using
00:08:48
makeup, not by resorting to digital post-processing, so as to resemble celebrities.
00:08:55
You have to be very talented to do that. We
00:08:58
don't have those talents, but what you can do is use
00:09:02
makeup to change a person's appearance and make a person look older (it would be nicer to make them look younger, but anyway).
00:09:10
So we managed to show that: we managed to make the person look
00:09:15
older just by using makeup;
00:09:19
it was one and the same person, on the same day, just using makeup.
00:09:25
So the idea was to test the facial recognition systems to
00:09:28
see whether they are robust enough to be able to deal with this.
00:09:32
The aim was to see whether we could, with the system,
00:09:38
detect whether makeup was being used, and it's
00:09:43
very difficult at the moment, because it's a real human being.
00:09:49
To the face we applied pigments which are very subtle and very
00:09:54
difficult to analyse by computer vision.
00:10:01
Now, this is what the camera sees when you do an attack.
00:10:07
A presentation attack can be carried out in different ways:
00:10:10
you can present a photograph, a photograph of different
00:10:14
qualities, or a screen, and this is what the camera, the computer, the machine sees. Now
00:10:23
you see the same images as the computer: which one is real and which one is fake?
00:10:30
Do you think this one is fake?
00:10:36
And this one, do you think it's real or fake?
00:10:43
Is what I see an artefact or not?
00:10:48
Would you say it's real or fake? And the other one? This one here, real or fake? And this one?
00:11:02
Everything is fake. Everything's fake, in fact.
00:11:06
Can you tell me the difference, or the two differences? The first difference is that in the first column
00:11:12
the attack that is carried out uses a sheet of paper: the photograph was printed on
00:11:17
the sheet of paper, but the original image was captured in different ways.
00:11:26
the second attack was done with the screen of a mobile
00:11:28
phone, so the image is poor quality: out of focus, rather cluttered,
00:11:36
because the image on the
00:11:40
telephone screen is slightly out of focus;
00:11:51
you have to increase the distance to make the attack look realistic. The last attack was done on a big screen
00:11:57
where you display the photograph. So as you can see, even for you it was not easy to tell
00:12:03
real from fake. So the attacks are becoming more and more realistic,
00:12:07
but I've not yet got to the most realistic one; this was in the early days, ten years ago.
00:12:16
Now we're trying to detect these attacks, and we
00:12:20
very quickly bump up against a limit, which we try to show in our project.
00:12:25
There is another form of attack which is really problematic and which concerns identity
00:12:31
documents: it's called a morphing attack. In a number
00:12:36
of countries in the world, and in Europe as well,
00:12:40
where, to get a driver's licence or
00:12:44
a national identity card, you bring along your
00:12:49
photograph, you complete a form, you hand in your
00:12:53
form, you leave, and afterwards you receive your identity document.
00:12:59
Normally there is a digital chip in which the scanned photo is stored, but it so happens that
00:13:04
if you have touched up the photo a little yourself, for example by merging two different people,
00:13:13
you will end up with a totally valid identity document which will enable
00:13:18
each of those two different people to travel validly with that document. So
00:13:23
someone can find themselves in Europe with a
00:13:28
digital image which has been morphed from
00:13:33
two different people, so an accomplice can travel
00:13:37
on the passport of another person. This has been demonstrated;
00:13:41
it's been proven on state-of-the-art facial recognition systems, so
00:13:46
this is a vulnerability that does exist.
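The core trick behind a morph can be illustrated with a naive pixel blend. This is a deliberately simplified sketch: real morphing tools first align facial landmarks and warp the two images before blending.

```python
def morph(face_a, face_b, alpha=0.5):
    # Blend two aligned grayscale images pixel by pixel.
    # alpha = 0.5 weights both identities equally, so the result
    # can resemble either person closely enough to match both.
    return [[round(alpha * a + (1 - alpha) * b)
             for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(face_a, face_b)]

# Toy 2x2 "images" standing in for two aligned face photos.
person_a = [[100, 120], [140, 160]]
person_b = [[200, 180], [60, 40]]

morphed = morph(person_a, person_b)
print(morphed)  # [[150, 150], [100, 100]]: halfway between both faces
```

A morphed photo submitted with an application then ends up on the chip of a genuine document, which is why the attack is so hard to stop downstream.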
00:13:53
Don't worry, I will finish on a positive note, but before that I would like to say a few words about deepfakes.
00:14:02
Deepfakes.
00:14:11
So the idea is to transform the face of one person into the face of another person, to make
00:14:15
people believe, to make you believe, that this is reality.
00:14:19
On the right you recognise the person, you recognise the actor.
00:14:25
Who is it? Tom Cruise? It was not Tom Cruise.
00:14:29
You recognise Tom Cruise, but in fact this person is an actor
00:14:35
who was asked to mime and to talk a bit like Tom Cruise does, and then
00:14:39
deepfake software was used, based on AI, to
00:14:46
transform the face in real time into the face of Tom Cruise.
00:14:53
So anyone's face can be transformed into the
00:14:57
face of Tom Cruise, and that's very difficult to detect.
00:15:01
Now here are some images of deepfakes. All these people don't exist, don't exist at all.
00:15:08
These are totally synthetic images.
00:15:13
There is virtually no chance that these people exist on the planet. This is scary,
00:15:21
because if you create a synthetic image, what can it be used for? It
00:15:25
can be used to create false profiles on the internet, to do industrial or political espionage,
00:15:30
so it's very dangerous. You can also do real-time videos; that's starting to emerge,
00:15:38
and we're beginning to be able to combine the images with the voice,
00:15:43
so hyper-realistic audio,
00:15:49
imitating someone's way of speaking, can be produced.
00:15:53
This technology is really diabolical, but the technology
00:15:58
can also be used for good purposes, these deepfakes,
00:16:05
for example for data confidentiality purposes.
00:16:10
Normally you need a certain volume of images;
00:16:15
people have to agree to the use of their photograph, and
00:16:23
EU regulations will soon ban the use of facial recognition in a number of applications.
00:16:30
So it's getting very difficult to collect these
00:16:34
photographs with the person's consent; a lot of companies are using
00:16:37
photographs of people without their consent. So the question is:
00:16:44
how can I use images of groups of people if I am not allowed to do so?
00:16:48
The answer is maybe to use these
00:16:50
deepfake technologies to create synthetic images.
00:16:59
And since you can create the synthetic images, you
00:17:02
can also control the factors within them, for example the luminosity,
00:17:05
and change the gender, change a man into a woman, for example. All these images are synthetic,
00:17:13
and you can vary certain factors to change the expression,
00:17:20
the brightness,
00:17:24
all the factors which are necessary for facial recognition.
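Varying a single factor in a synthetic image can be illustrated with something as simple as a brightness transform; this is a toy sketch of the vary-one-factor idea, not the generative models being discussed.

```python
def adjust_brightness(img, factor):
    # Scale each grayscale pixel by `factor`, clamped to the 0..255 range.
    # In a generative pipeline, brightness would be one controllable
    # latent factor among expression, pose, gender, and so on.
    return [[min(255, max(0, round(p * factor))) for p in row]
            for row in img]

base = [[100, 150], [200, 250]]  # toy 2x2 "image"

darker = adjust_brightness(base, 0.5)    # [[50, 75], [100, 125]]
brighter = adjust_brightness(base, 1.5)  # [[150, 225], [255, 255]] (clamped)
print(darker, brighter)
```

Generating many such controlled variants of synthetic faces is one way to build training data for facial recognition without collecting real photographs.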