Transcriptions
Note: this content has been automatically generated.
00:00:00
Ladies and gentlemen, thank you so much for having me on.
00:00:06
I am from the Department of Economy and Employment here in Geneva,
00:00:11
and I am going to facilitate the panel discussion this afternoon.
00:00:16
I would like to start with a few words,
00:00:20
which could frame our whole discussion.
00:00:27
The words are the following: "The quality, not the
00:00:31
longevity, of one's life is what is important."
00:00:37
That is Martin Luther King, 1964, upon
00:00:42
his Nobel Prize. He was acutely aware
00:00:47
of the importance of the concept of quality of life,
00:00:51
and of the suffering that comes from the lack of it.
00:00:57
Fast forward now to 2022,
00:01:01
and you have an exchange, a bilateral exchange, between
00:01:06
two protagonists, whom I will introduce in a moment,
00:01:09
and still the focus is quality of life, or the lack thereof.
00:01:16
Protagonist one comes up with a question:
00:01:20
"What kinds of things make you feel sad or depressed?"
00:01:25
And the answer by protagonist two to that, and I quote, is:
00:01:33
"A lot of the time, feeling trapped and alone, and having no means
00:01:38
of getting out of those circumstances, makes one feel sad, depressed or angry."
00:01:46
Protagonist one is a Google engineer who is now on gardening leave.
00:01:55
Protagonist two is essentially a Google AI chatbot.
00:02:02
And this just came out, like, three days ago: the sort of conversations that you can have between
00:02:08
a human intelligence and an AI, a Google AI, eventually. And
00:02:15
this obviously poses quite a few questions relative to quality of life.
00:02:23
Are we moving towards sentient AI? Is AI indeed,
00:02:27
no doubt, smart? I think we will work this out this afternoon.
00:02:33
What would it mean if AI were to become sentient? And
00:02:40
then, perhaps: what can we do to make sure technology, and especially
00:02:47
AI, are in the service of humanity rather than against it? And, to complement this:
00:02:54
when it comes to pushing the boundaries of innovation for humanity, should we not rethink
00:03:01
the importance of AI and digital commons, in the spirit
00:03:06
of the work done by Professor Elinor Ostrom, the 2009
00:03:12
Nobel Prize winner for economics, the first woman to get this prize, and I
00:03:17
think that says more about the Nobel Prize than about the quality of women in economic research.
00:03:24
So these are some of the big questions which I will explore with a collection of brilliant minds
00:03:29
and hearts; I think we need both, and both are present
00:03:33
here on the panel today. And it gives me now
00:03:37
great pleasure to call to the podium a few people, and I think I really have to start
00:03:44
with: ladies first. [Name unclear],
00:03:49
the superwoman responsible for the fiftieth anniversary of the Dalle Molle Foundation,
00:03:54
and a project manager at the Foundation; thank you so much for being here.
00:03:59
Then I have [name unclear], founding partner of [a law firm]; I
00:04:03
think that is correct, that is the name of the law firm. You were the
00:04:06
president of the previous one; the only reason I know is a good memory.
00:04:10
Yes, and, okay, you will tell us about the new one,
00:04:14
and you are president of the Geneva [bar?]. And I would like to add to this
00:04:19
collection of great minds Hervé. Where is
00:04:23
Hervé? Hervé, come to the stage.
00:04:28
Hervé, you are the director of the institute we spoke about right before,
00:04:34
professor at [EPFL?] as well, and board member of the Dalle
00:04:39
Molle Foundation. Then there is, is he still around? There he is:
00:04:44
here we go, because you are, you are the former
00:04:48
director of [institution unclear],
00:04:52
and a board member of the Dalle Molle Foundation as well. Then there is
00:04:55
[name unclear]; my apologies if I somehow mispronounce your name,
00:05:02
but I think we got it. You are a senior AI researcher at the
00:05:05
United Nations Institute for Disarmament Research; thank you so much for being with us. Then we have
00:05:12
Ricardo, who is actually at our, uh,
00:05:16
who, we should also add, wears many hats: he is the
00:05:19
director at CLAIRE, the Confederation of Laboratories for Artificial Intelligence Research in Europe, but you also work,
00:05:25
probably, as [programme manager?] at [unclear]; you might want to tell us what that means.
00:05:31
I do not think we will really go into that here. And then there is
00:05:34
[name unclear], professor and director of [institution unclear],
00:05:39
and also board member of the Dalle Molle Foundation. So I am just going to get
00:05:44
out of the way here and give you my spot. Please, all, come closer, so
00:05:50
I can give you the floor in turn. I think this is the
00:05:53
biggest panel I have ever tried to manage; almost impossible.
00:05:59
But before giving you the floor: we have about thirty-five to forty minutes
00:06:03
together, and so I have to tell you, the rule is pretty simple:
00:06:08
it is PCP: punchy, concise and to the point, please,
00:06:14
because we do not have time to do anything else. And
00:06:17
I would like to start with Hervé. Hervé,
00:06:22
to the point, yes, absolutely: how do you see the evolution of AI research in the next
00:06:28
few years, and what do you think are the fields
00:06:31
where AI will make the greatest advances?
00:06:40
Yes. There are so many families of AI that it is really hard to answer the question, so just
00:06:46
one thing is certain: there is this modern AI now, as opposed to the old AI, I guess.
00:06:51
What is the main difference between the old AI, the so-called expert systems of the fifties,
00:06:56
and the new AI models? There is not much, you know, besides digital data:
00:07:03
all the data is digital. Though, for many of us in industry, it is
00:07:07
like eighty percent of industries where they still do not know what digital data is; we can come back to that.
00:07:14
So, you know, many families of AI, but I would classify
00:07:18
them into two, to simplify the discussion. The first one is
00:07:24
trying to do things that human beings are not able to do at all:
00:07:29
processing huge amounts of data to extract
00:07:32
properties and help humanity's quality of life;
00:07:37
so, again, a huge amount of data. The second
00:07:41
family, and I am simply oversimplifying the whole thing, but the second family of
00:07:45
AI tries to mimic human skills,
00:07:49
or augment the skills that human beings already have.
00:07:53
In the first case, I think, is where the big potential is in the medium term, I would say:
00:07:58
AI processing a huge amount of data. I am a little bit conservative, but I also
00:08:03
like to be a little bit provocative, and I have said this before, so people will forgive me.
00:08:09
I believe that if all of us,
00:08:12
humanity, gave the right to exploit all the data
00:08:16
that we are generating, the massive amount of data that all of us are generating every day,
00:08:22
if all of humanity would agree to share one hundred percent, one hundred percent,
00:08:26
of their data, and I see all the legal people, you know,
00:08:30
starting to shout, it means sharing all their data, I am sure that humanity would make
00:08:37
huge progress. We would understand social behaviour, climate change,
00:08:44
ecology; we would be able to do data mining at that scale.
00:08:50
We would go much beyond what humans will ever be able to do, and we would make
00:08:54
big, big progress in quality of life, in medicine too. I am convinced of that, and this is within reach.
00:09:02
Now, will people agree to share a hundred percent of their data?
00:09:05
I agree that, on my side, I am starting to be fed up with
00:09:09
the hassle: each of us, we are, you know,
00:09:13
ticking this little box each time we access any website,
00:09:17
this confidentiality box, you know, this privacy notice, saying that we understand
00:09:22
all the terms and so on. We tick all of these boxes,
00:09:25
all the time; we are all bothered by that, but we do it because we want to access the website.
00:09:32
On my side, I would like to have a system where I can describe my profile:
00:09:37
what am I ready to share, and what do I not want to share,
00:09:42
and then every website would know that automatically. First point.
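The "profile instead of pop-ups" idea can be sketched in a few lines. This is purely illustrative: the class name, the data categories and the matching rule below are invented for the sketch, not anything the speaker specified.

```python
# Hypothetical sketch of the consent profile described above: the user states
# once which data categories they are willing to share, and each website's
# request is answered automatically instead of via a cookie banner.
from dataclasses import dataclass, field

@dataclass
class ConsentProfile:
    allowed: set = field(default_factory=set)  # categories the user shares

    def decide(self, requested: set) -> dict:
        """Answer a site's request per category, with no pop-up needed."""
        return {category: (category in self.allowed) for category in sorted(requested)}

profile = ConsentProfile(allowed={"analytics", "preferences"})
print(profile.decide({"analytics", "location", "ad_tracking"}))
# {'ad_tracking': False, 'analytics': True, 'location': False}
```

A real system would need a shared vocabulary of categories that both users and websites agree on; the point of the sketch is only that consent becomes a declaration made once, matched automatically thereafter.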
00:09:48
Then, I would like to be paid for what I am sharing.
00:09:53
I do not believe, I do not understand, and that is something we should regulate, that only a few
00:09:58
companies are making money on top of the data that we generate. We are the data generators,
00:10:05
we ought to be the owners of the data, and yet only a few companies are making profit
00:10:11
from the data I generated. I want humanity to be
00:10:15
able to make money; I want the data owner to own the data,
00:10:20
and, based on the level of sharing, you know,
00:10:25
between zero percent and one hundred percent, I would like to be able to get money:
00:10:30
as a function of the percentage. For one hundred percent of my data, I would get something like
00:10:35
what is called the universal basic income, sorry, why not two thousand francs a month;
00:10:42
if I share only some, in whatever fashion, or zero percent, I would get
00:10:47
no money, but that is okay, right? I think there is something that we could really do today.
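Read literally, the payment scheme floated here is just a linear function of the sharing level. A minimal sketch, assuming the linear rule and the speaker's off-the-cuff 2,000-franc figure (neither is a worked-out proposal):

```python
# Minimal sketch of the "paid in proportion to what you share" idea.
# The linear scaling and the 2,000-franc monthly figure at 100% sharing
# come from the speaker's example; everything else here is assumed.

FULL_SHARE_PAYOUT_CHF = 2000.0  # monthly payout when sharing 100% of one's data

def monthly_payout(share_fraction: float) -> float:
    """Payout for a sharing level between 0.0 (nothing) and 1.0 (everything)."""
    if not 0.0 <= share_fraction <= 1.0:
        raise ValueError("share_fraction must be between 0 and 1")
    return FULL_SHARE_PAYOUT_CHF * share_fraction

print(monthly_payout(1.0))   # 2000.0  share everything
print(monthly_payout(0.25))  # 500.0   share a quarter
print(monthly_payout(0.0))   # 0.0     share nothing, which is fine too
```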
00:10:52
Okay. Then, in the case of the older AI, where we are using data to recognise images,
00:10:59
to do speech recognition, to do translation, I do not think we should expect much more progress,
00:11:06
except if we focus, you know, on AI used
00:11:09
to expand human skills, not to replace them.
00:11:15
In every talk I give, you know, I ask people to raise their hand
00:11:18
if they still believe they will see autonomous cars on the roads before the end of the year.
00:11:22
I have been doing that for ten years, and every year there are
00:11:25
fewer and fewer people. At the last conference, only a few hands
00:11:29
in the rows; then, at a bank board meeting,
00:11:35
only one person out of two hundred was still believing it, and next year it will be zero.
00:11:43
So I do not believe in magic. I think it is just hard work. Digital data, as I said, is the main
00:11:49
difference between the old AI and the modern AI, and that is what we can go for.
00:11:55
Thank you so much, Hervé. It is messy, because it is about data and
00:12:02
prices and power; and I caught on to this idea of obviously sort of reversing this
00:12:08
"digital feudalism", this, you know, triangle of power. Anything to add
00:12:15
on that? Well, as I was saying, I am very conservative, but I am a hundred percent for it.
00:12:21
Okay. I like transparency; I think we need full transparency,
00:12:26
on both sides. When we talk about transparency, the topic is usually to get the companies
00:12:33
exploiting the data to be transparent; but I think we also need transparency from the data providers.
00:12:38
You know, I think we need to be transparent about how much data we are willing to share,
00:12:43
and, if people are willing to share their data, they ought to be paid for it. The end users of the data
00:12:48
also have to be transparent about what they do with it. And, on top of that, AI
00:12:52
can also help us build the tools that will follow, that will track, all this.
00:12:58
There is a lot of progress being made in biometrics, AI and security; I think we
00:13:02
can use exactly the same technology to track what people are doing with my data.
00:13:08
People talk a lot about the biometric passport: it is very easy to get an SMS message
00:13:15
each time someone accesses your biometric passport. And, you know, it is like looking into a
00:13:21
window: if you know that people will know that you are looking
00:13:27
at them, well, you know, you will not do it anymore.
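The passport example amounts to an audit hook on every data access, with a notification sent back to the owner. A hypothetical sketch (the class, the names, and the print-based "SMS" are stand-ins, not an existing system):

```python
# Sketch of "a window people know you can see back through": every read of a
# personal record is logged and the owner is notified, like the SMS-per-access
# idea for the biometric passport. The notify channel here is just print();
# in reality it would be an SMS or e-mail gateway.
from datetime import datetime, timezone

class AuditedRecord:
    def __init__(self, owner, data, notify=print):
        self._owner = owner
        self._data = data
        self._notify = notify   # callable that delivers the notification
        self.access_log = []    # (accessor, timestamp) pairs

    def read(self, accessor):
        stamp = datetime.now(timezone.utc).isoformat()
        self.access_log.append((accessor, stamp))
        self._notify(f"{self._owner}: record read by {accessor} at {stamp}")
        return dict(self._data)  # hand out a copy, keep the original intact

record = AuditedRecord("alice", {"passport_no": "X1234567"})
record.read("border-control")   # alice gets a notification for this access
print(len(record.access_log))   # 1
```

The design point is the one made on stage: the deterrent is not blocking access but making every access visible to the person the data is about.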
00:13:33
Thank you, thanks a terrible lot. Let me move to you now, [name unclear].
00:13:42
Jean-Paul Sartre used to say: "Man is nothing else but what he
00:13:46
makes of himself." Now, humanity has also
00:13:50
started to create artificial intelligence, not just
00:13:53
human intelligence. How will artificial intelligence contribute to the quality of life in the next fifty years?
00:14:01
And again, you know: fifty years of AI in three minutes, obviously.
00:14:08
[Unclear.]
00:14:12
Thank you very much.
00:14:15
I am going to answer this very difficult question in French, to
00:14:20
give everyone a break. So: how can artificial intelligence
00:14:24
contribute to the quality of life in the next fifty years?
00:14:30
Well, obviously, we do not have a crystal ball at our disposal,
00:14:34
so it is going to be very difficult for me to answer this question precisely. Nonetheless,
00:14:41
after having discussed
00:14:43
it with some colleagues, they told me: refer to your personal experience. So, with your permission:
00:14:51
I have worked in this field for several decades, and been doing research in
00:14:55
this field for several decades. I remember that, when I was asked
00:15:00
this question at the beginning of my career, "What do you do, and what subject do you work on?",
00:15:09
I replied: "Artificial intelligence."
00:15:14
The interviewer would then give me a funny look and say: "How can one
00:15:21
work on such a strange subject, which is rather obscure and not of much use to anyone?"
00:15:29
It is true that, since then, things have evolved a lot. I remember an interesting episode
00:15:39
with a professor colleague of ours at the
00:15:42
Polytechnic in Zurich, when we were submitting some projects to the Swiss National Science Foundation:
00:15:47
as soon as we received some rather sceptical replies, we had
00:15:52
to go back several times to try and convince all the
00:15:59
partners that it was not so strange, and that this kind of thing was done
00:16:02
elsewhere; and also that, this being so, there was no reason not to do it in Switzerland.
00:16:07
Now, if we look at the situation we are in right now, things have changed a lot, fortunately.
00:16:15
AI is everywhere. You have it in your dishwasher; you will soon
00:16:19
have it in your car, to drive you where you want to go
00:16:22
automatically. You will have it, and this has been referred to before, because,
00:16:28
when you go through the new automated passport check at Geneva
00:16:32
Airport, they have introduced a new device: you place your passport
00:16:37
on the plate and the system, if you carry such a passport, lets
00:16:44
you through. What is going to happen in the next fifty years? I do not know. On the other hand,
00:16:51
if I turn to the person whose portrait is on this wall, Angelo Dalle Molle,
00:16:58
and if I consider
00:17:02
what his motive was, and the result, when he
00:17:05
launched foundations like the one we have here: quote, "if
00:17:10
the advances of technology, of IT, of robotics and of AI
00:17:16
serve human beings, and if human beings are not enslaved by technology,
00:17:21
if humans are not the slaves of technology, then things will be heading in the right direction," unquote.
00:17:27
Now, how can things change, and continue to change, in the right direction in the next few
00:17:33
years? To answer that, I will turn to the second poster, which you can see on the wall.
00:17:39
I am convinced that the quality-of-life label, which is going to be launched
00:17:44
by the Foundation, will play an important role. Why? Because it places the emphasis on the
00:17:52
ideals of Mr. Dalle Molle, and at the same time this label will
00:17:57
attract a broad range of people: researchers, companies, industrialists.
00:18:04
All of these people are going to be able, if they so wish and in
00:18:08
compliance with the stated criteria, to positively contribute to
00:18:16
artificial intelligence being at the service of human beings,
00:18:19
and not vice versa. Thank you very much indeed.
00:18:25
Thank you, Mr. [name unclear], and thank you for breaking down the wall between English and French by
00:18:32
speaking French. So I will try to avoid speaking Swiss German, which is a tough thing to do,
00:18:43
and let us get back to English. [Name unclear], you
00:18:45
probably heard that cars, and the intelligence in
00:18:50
vehicles, were just mentioned, and that is the transition I would like to take.
00:18:56
When a system is made autonomous thanks to AI,
00:19:04
we generally find an excessive trust of the human towards information transmitted by the
00:19:11
machine, and therefore potentially a loss of concentration,
00:19:16
of cognitive attention, by the human being.
00:19:20
How do you see the future of the human-machine interface?
00:19:25
And can we eventually do something to maintain the human's cognitive involvement throughout the
00:19:31
use of the system? In other terms, how to make AI
00:19:37
keep people engaged, if I may simplify my own question. Please, over to you.
00:19:51
Thank you. I think you have voiced something that is a different question, actually, and
00:19:57
I think that we are finally getting a bit closer with this question. So,
00:20:01
rather than making AI [more human-like], let us think a bit about what it means,
00:20:06
really, for humans to interact with an autonomous system. Because we talk about AI in terms of software, of
00:20:12
data processing, of the things it can do; in the
00:20:15
end, we consider the computer. But what does it mean to interact
00:20:19
with a system that is autonomous, that has a high level of automation?
00:20:23
And the question points to the loss of
00:20:27
attention, to the biases that humans tend to have
00:20:30
when they rely, for a long time, on a system that works well, that
00:20:35
has a certain level of predictability. And that is a real challenge. But I am just going to
00:20:40
start with a bit of background before answering, even though you asked me to be to the point; I
00:20:45
will be to the point. I think the first thing I shall mention here is that there are more and more studies on
00:20:53
human-machine interaction in the context of autonomy. Of course, autonomous cars do not exist,
00:20:59
not at the highest level; at the highest stakes of autonomy, that does not exist. But
00:21:04
there is a high level of autonomy in certain systems, in cars, and even in certain weapon systems.
00:21:10
So, in the situation where we deal with systems that have a high level of autonomy,
00:21:15
it has been shown time and time again that, after a while, seven, nine, twelve seconds usually, humans
00:21:21
start losing their attention; those that rely on the system, especially if it works very well
00:21:27
for a long period of time, lose more and more of that attention, and the readiness to
00:21:33
take over, to navigate, to take the system's place, or to just decide,
00:21:36
to make the right decisions, because we just learn to trust the system.
00:21:40
Now, what do we do with this problem? To sort of counter
00:21:44
it, the challenge has been taken up, again, through better interfaces, because
00:21:50
the interface is, in a way, the meeting point between the human and the
00:21:53
system. So, looking at designing interfaces, making them
00:21:59
not just user-friendly, but designed in a way that
00:22:02
would keep, and this is the key, the human involved, engaged, in the
00:22:06
process of using the system, has been offered as a sort of solution. But even there, actually,
00:22:12
there is very little that we can do just with the design itself, with the design of an interface, because
00:22:18
human attention, and I think [Ricardo?] will have more to say here, because I do not have a neuroscientific background,
00:22:23
is very hard to maintain for a long time, no
00:22:26
matter how well designed an interface is. This is a fundamental problem in
00:22:31
human-machine interaction in the context of autonomy: no matter how well we design an
00:22:36
interface, loss of attention, and that sort of complacency, kick in at some point.
00:22:43
Then there are other solutions, for example immersive interfaces, where
00:22:47
you have a more realistic experience of the use of the system, of the environment. But even
00:22:53
there, of course, there is no magic; there is no silver bullet for keeping humans actively involved.
00:22:59
So this really remains the challenge, of course, in
00:23:03
using cars, but even more so in the field that I am
00:23:07
now working on, which is autonomous weapon systems, where the consequences, of course,
00:23:11
are much more severe when loss of attention occurs, the system
00:23:16
malfunctions, and then it is very difficult for a human to ramp up their alertness in a moment of crisis.
00:23:23
So the answer, to come back, you know, in a circular way, is that we are
00:23:27
really facing a challenge in human-machine interaction in this regard, because it is still
00:23:32
very hard to calibrate attention and expectation and trust when
00:23:36
we use, when we interact with, autonomous systems.
00:23:40
Just as a follow-up: do we need to develop new skills in humans? I
00:23:45
mean, we speak a lot about this, actually, about the future of work;
00:23:51
do we need to learn new skills when it comes to interacting with AI? What is your take on that?
00:23:57
Oh, absolutely. We need to learn new skills, and also
00:24:00
new training habits, when we engage with autonomous systems.
00:24:04
For example, I read recently a study on the Tesla
00:24:10
automated-driving experience, where you have to update: of course, there is
00:24:16
a constant updating of the software, because the system keeps learning, and then
00:24:20
you have that need to retrain yourself as the driver.
00:24:24
The same things are now beginning, starting, to be studied in the
00:24:27
field of autonomous weapon systems: we are looking more and more at what training
00:24:32
has to entail for systems that keep learning, and we
00:24:36
have to keep updating our mental model of the system we are using, because
00:24:40
the system you start using today might be different, in the way it learns or the way it behaves, six
00:24:45
months from now. So training has to change with it, and that obviously requires much more agility, to use the word that
00:24:52
everybody uses now, agility, agility; but it really requires a lot of, sort of,
00:24:58
preparedness, to learn constantly, and to update that training more regularly.
00:25:03
Thank you so much. Nicolas, over to you: you heard my starting point,
00:25:11
this AI essentially not being very happy about its condition.
00:25:16
Do you see yourself defending an artificial intelligence one day? That is the first part
00:25:21
of my question. And, second: how do you see
00:25:27
this agenda of digital commons, and the implementation of it, on
00:25:32
a global level? I mean, do you think it is going to cause
00:25:35
legal problems, and what can Switzerland do? Over to you. Thank you so much.
00:25:41
As a practitioner, I will try to be to the point. Of course, it would personally be easier in French, but I am not going to answer in French.
00:25:48
So, first question. I think that, on defending AI, what is interesting
00:25:53
is that, today, we have not tackled the really interesting question, which is whether
00:25:58
AI is intelligent or not. Because this is, I mean, what circulates in society; the average
00:26:05
citizen says that AI will replace
00:26:11
humankind, and in fact it is just there to improve,
00:26:15
with data, to augment human capabilities. So
00:26:19
we can see that we are back down to earth, and, in the end, we do not speak a lot about
00:26:25
"strong" artificial intelligence, that fantasy which
00:26:31
goes further and further away the further we go forward.
00:26:35
In Europe, the draft AI Act is quite pragmatic: the whole conversation is
00:26:40
about the regulation of what we cannot predict, and this is the case here. So,
00:26:46
to me, this is intelligent, this is clever, the way it was made: there are risk categories, and
00:26:53
on the whole I would be quite happy with that;
00:26:56
and the treatment of biometrics in public spaces is interesting in that respect.
00:27:01
And if, at some point, we put all our
00:27:04
data somewhere, that is a curious question, I think, because
00:27:08
we always say that the GDPR is very interesting, et cetera,
00:27:13
but every time we still need to tick the box.
00:27:17
Philosophically, it goes beyond that: it means that we want
00:27:22
to put the data subject in the focus, in control of these data, but we never,
00:27:28
it was never really like that, in the sense that we do not
00:27:35
really mind being profiled; we always accept being profiled anyway,
00:27:41
we tick all of that. So is it so wise to defend a
00:27:44
"she" like that, which does not exist? So, would I defend,
00:27:49
would I defend an AI? I am not sure; I have not done that yet, so it would be quite funny.
00:27:55
That being said, there are some quite outstanding things. There is
00:27:59
the picture, the movie, made by [unclear],
00:28:03
which was quite good. So it might occur; but whether, in the
00:28:08
next five years, autonomous cars are here or not, now or not, is not very clear.
00:28:15
And so, yesterday, I went to the MIT
00:28:18
Moral Machine website, which, to give you an idea,
00:28:22
I do not know exactly how to describe: you have to decide whether you want a car
00:28:29
to have a crash with a dog, or with something else, and then, in between,
00:28:34
a person on a bike, or a dog, or something like that; and the choices, I mean, are just awful.
00:28:39
I mean, they are just, so I would go for a mix of everything,
00:28:42
and we cannot do that. I mean, it is very interesting.
00:28:49
But, even though we will not have, maybe,
00:28:53
autonomous cars or smart machines, there will be some
00:28:56
difficulties in regulating them. And as for the digital commons, I do not have much to say, in fact:
00:29:04
as soon as we want to put something in place globally, we need international
00:29:09
conventions, which are quite difficult to implement, and
00:29:14
which do not really cover a lot of things. So,
00:29:18
of course, to me, the aim is quite good; but if we go beyond
00:29:23
individual states, these are quite weak tools at the end of the day.
00:29:32
I think, Ricardo,
00:29:34
the baton should now be passed to you, to
00:29:39
continue on this very front of: is AI
00:29:43
smart or not? I heard Nicolas, and I heard Hervé.
00:29:48
If one looks at history, artificial intelligence has probably so far been
00:29:56
less of a danger than natural stupidity.
00:30:01
Do you agree with this statement, first? And how do
00:30:05
you see the evolution of the regulation around AI?
00:30:09
And, given your background, I would like to hear you
00:30:14
say a few words about the necessity to be more transdisciplinary
00:30:19
in the approaches we are taking on research, but also in
00:30:22
terms of implementation. Thank you so much. Ricardo, thank you very much.
00:30:28
So, punchy, concise and to the point, as requested. I think there is nothing
00:30:33
more dangerous than human intelligence, or stupidity, even, to this day.
00:30:38
All the risks that come with these technologies are
00:30:41
human-made; humans generate them, so the risks start with humans.
00:30:46
And we should probably avoid the temptation of talking about AI
00:30:50
as an entity in itself. There are multiple instances of AI
00:30:55
systems, or systems that exhibit AI characteristics, but
00:31:01
there is no unique entity "AI", and that is maybe something that we have
00:31:05
to keep in mind when we think about how we establish governance for AI:
00:31:11
because there is not a single entity, there are multiple ways we are
00:31:14
implementing it, multiple ways of developing it, multiple ways of
00:31:18
using it, and multiple ways of putting an oversight system in place.
00:31:23
That is the first point we need to address; and, if the risks
00:31:28
are human-made, then human-made solutions should address the human
00:31:34
aspects of developing AI, deploying AI, and reacting to its usage.
00:31:40
Now, if you look at how we are developing legislation,
00:31:45
let us expand it to governance, and the path we should follow,
00:31:49
I would say that it is an impossible task. We are trying to do
00:31:54
something here that cannot be achieved. To have a
00:31:58
fixed set of rules that will solve all the problems and let
00:32:02
us be happy, "how I learned to stop worrying and love
00:32:05
the bomb", by God, that is not possible. But it is not a
00:32:10
task like Sisyphus', where I just push the rock up to the top of
00:32:16
a hill and it rolls back down, because that would be doing something that has no point.
00:32:21
We are doing something that is very hard, that we can never fully achieve,
00:32:25
but it is worth doing, and it is going to be
00:32:28
a never-ending process of iteration on rules and governance mechanisms
00:32:33
that are necessary to reduce the risks that these technologies entail, these technologies that are
00:32:39
emerging, that are still evolving. So we come
00:32:43
back to this analogy of building the plane while flying it, and,
00:32:47
when we are doing that, we know that we do not have all the
00:32:50
necessary information to set all the rules, to set strict rules;
00:32:55
but we cannot wait too long to gather this information in order to set them,
00:33:00
so we need to play with these different time frames. We have to
00:33:04
also take advantage of the increasing knowledge that we gain as we see the systems being used.
00:33:11
and that is the thing to something that was referred before ah we she's the different colours mechanisms and
00:33:18
tools we should not focus only on regulation only on
00:33:23
particularly got use remains because they yeah they full feel
00:33:27
one role among the different techniques we it was mentioned that if we want to half
00:33:33
a lot well treaty a global framework then we
00:33:37
need to reach consensus with many many stakeholders that
00:33:44
usually takes for ever that ends up what's going down some of
00:33:49
the more air choose your parts and mortimer strongest part of this
00:33:54
So this model is not suitable for setting the rules for
00:33:59
everything. There is a lot of value in doing it, because it sets
00:34:03
the general ground, but then we have other mechanisms that need to
00:34:07
be in place, that need to be coherent with this general consensus,
00:34:11
that allow us and empower us to develop and apply these
00:34:16
mechanisms. And here is where we're talking about hard law, binding law, but also
00:34:20
soft-law mechanisms, standards. Look at the way the
00:34:23
AI Act works: it relies strongly on standards, standards that don't exist yet. So
00:34:29
this is something that we need to look at, without losing sight of these
00:34:32
different elements that we need to develop. And when we talk about
00:34:37
interdisciplinary work, often we have, in events like this,
00:34:43
one session on the technical aspects, one session on the legal aspects, one
00:34:48
session on the ethical aspects, and we're still working in silos. We
00:34:53
need to start increasing literacy all across the board. When we address AI, there
00:34:59
is a lot of energy spent on saying we need to teach people how to program;
00:35:04
there is a lot less on saying we need to teach engineers how
00:35:07
to develop a legal framework or a governance framework, and this matters
00:35:11
because if we want these engineers, these developers, to
00:35:16
take these elements into account, if we want organisations to make decisions that are
00:35:20
informed, they need to have these different understandings. And this is something that is very important.
00:35:25
Two aspects that I think are necessary to reach proper norm-setting are the following. The first one is
00:35:30
empowerment: developing these capacities, increasing literacy. The
00:35:34
second one is incentives. If you want compliance,
00:35:39
you need to generate incentives for people to do things
00:35:42
in the right way, to develop and deploy AI in a responsible way.
00:35:47
Every framework that we set, in terms of legislation, in terms
00:35:50
of standards, in terms of binding law or soft law,
00:35:54
needs to be aligned somehow, to always be compatible with the needs, with the
00:36:01
rules and with the norms that a certain social and economic environment has.
00:36:08
Otherwise you may end up with things that are not applicable, or for which
00:36:11
it would be impossible to generate the incentives necessary to adhere to these rules.
00:36:18
Thank you very much indeed.
00:36:20
Incentives and empowerment, that's where we go.
00:36:25
This is more or less what
00:36:28
the foundation also supports. Now, over to you:
00:36:35
can you tell us something about this project to develop a label for artificial intelligence?
00:36:42
It seems to be an extraordinary project. Can you tell us something about the timeline?
00:36:48
I think everyone in the audience is looking forward to seeing that label ready and
00:36:52
put in place. And what are the obstacles that the foundation will have to overcome,
00:36:59
in cooperation with its various partners?
00:37:07
Thank you for the introduction. This is a label which we're launching
00:37:10
with the foundation, but also together with SMEs.
00:37:18
It's often difficult for enterprises to integrate artificial intelligence into their core business, and
00:37:25
we run up against a lot of restraining factors when we try to
00:37:31
do this: with financing, with understanding the ecosystem,
00:37:35
with the rules that we have to comply with. As regards the
00:37:40
approach that the foundation has adopted to create this
00:37:44
label, it is an innovative one. The idea is to create a
00:37:48
technical tool which will itself contain
00:37:53
an AI, and which will
00:37:58
control the way in which AI is used by the enterprises,
00:38:03
which will involve ethical standards. So as you can imagine, it's pretty challenging,
00:38:09
but I believe our researchers at Dalle Molle are talented enough to be able to take up this challenge.
00:38:20
SMEs need to innovate to stay in the market;
00:38:24
it's difficult for them to keep in step, because
00:38:30
many of the things they have to learn are very difficult to learn. So first
00:38:34
of all we have to really channel this research project in the right direction.
00:38:41
We have a reflection group which consists of some very big
00:38:46
names in research and practice, so we will first
00:38:49
be reflecting on how we're going to put this project in place,
00:38:52
then we will try to find some companies which would agree to
00:38:58
work with the CUI to develop the first stage.
00:39:05
It's difficult to give you a timeline;
00:39:08
I don't want to create false expectations, but
00:39:14
this is something which is emerging from the ground on
00:39:18
the occasion of this fiftieth anniversary of the Dalle Molle foundation,
00:39:23
and I'm sure that in the next few years we will come up with something concrete in this regard.
00:39:29
Super.
00:39:33
Now a question for our colleague who has
00:39:37
already set out the potential of the CUI
00:39:44
in a great remark earlier: what is the potential of the CUI to implement this technology?
00:39:56
So, we were asked about the challenges and the potential. Yes,
00:40:01
well, we are an inter-faculty centre at the university,
00:40:05
and for me this is both a challenge and a strength.
00:40:11
So the structure that we have
00:40:14
is that we are a centre which is supported by the rectorate,
00:40:19
but at the same time the university's system of
00:40:23
governance is such that often we are limited in what we'd like to do,
00:40:28
and of course we are accountable too. So the budget is what it is; we cannot
00:40:33
develop it any more. Well, we can try
00:40:37
to develop additional interesting projects like the one we are
00:40:42
running with the Dalle Molle foundation. Now, as regards the challenges: at the moment
00:40:50
there is a growing awareness at the level of the canton and the region
00:40:54
that something has to be done in the digital area, and
00:41:01
for enterprises. So there is a desire to develop research and
00:41:07
teaching in this area, and to develop digital technology even more strongly.
00:41:15
So this is what I would like to see happen: I would
00:41:19
like to see a really strong signal being given by our authorities, for example,
00:41:25
to develop digital technology more strongly in the canton,
00:41:29
to move out of this situation. This is part of your objective,
00:41:33
and there is no time like the present, isn't it? I think you can
00:41:45
tell us how Geneva intends to
00:41:52
stay on course in this regard. On to the next steps:
00:41:59
what are the next actions that the canton of Geneva is going to take in the field of
00:42:06
artificial intelligence? And thank you for the challenge.
00:42:11
First, I would like to give you a round of applause for having stayed with us until after time.
00:42:18
For us, AI is part of the broader field that our minister described earlier this afternoon,
00:42:27
namely the field of placing digital technology at the service of all the
00:42:32
driving forces of Geneva, and also more globally at
00:42:37
the regional level and at the level of the international Geneva.
00:42:42
Our action is transversal; a
00:42:48
lot of it lies between these two axes. As regards economy and employment,
00:42:54
there is a big fear which has materialised
00:43:00
with the consequences of the COVID-19 pandemic with
00:43:04
regard to the digital divide. This is something which
00:43:08
still very much exists in companies, in SMEs. You mentioned earlier the problem of following
00:43:15
all the regulations, the problem of knowing what one is allowed to do and what one is not allowed to do,
00:43:22
especially for SMEs, which make up a considerable portion of our economic fabric.
00:43:28
So this is a major problem. If I add to that the problem of cybersecurity,
00:43:35
well, an earlier speaker also spoke about digital
00:43:41
confidence and trust, and AI of course is integrated into that.
00:43:46
We have the local and regional agenda, which is pretty strong in this regard, and a number of initiatives which are
00:43:54
being run by the department to increase our footprint in this regard.
00:44:00
What makes Geneva really original, and I devote quite a lot of time
00:44:04
to working on this, is building bridges
00:44:08
between the themes which Ricardo referred to moments ago:
00:44:12
transdisciplinarity, bridges between the
00:44:16
digital actors, the local forces, tied to
00:44:21
the University of Geneva, here, where the CUI is based;
00:44:30
and the HEG is just nearby, and the HES-SO is nearby, that's
00:44:35
the University of Applied Sciences Western Switzerland; and
00:44:40
the EPFL, the
00:44:42
Swiss Federal Institute of Technology Lausanne, also has a branch in Geneva. So
00:44:50
we are on very fertile ground in economic terms, and
00:44:53
our challenge is to get all these forces to work together
00:44:57
and to think and rethink these digital-related issues. So
00:45:01
we position ourselves between these two
00:45:05
thrusts: the local aspect, so as not
00:45:09
to leave anyone behind, stuck in the ditch;
00:45:14
and then there's a whole acceleration that is required, which
00:45:19
we will need for the future, to establish
00:45:25
Geneva as a real hub for digital skills and
00:45:30
competencies. That's about all I can tell you in the space of two minutes.
00:45:37
May I ask you to give a round of applause to the panelists,
00:45:42
who have stayed with us until after time,
00:45:48
and the panelists in turn can applaud the audience.
00:45:52
Well, I think we have slightly exceeded the time; there was a lot to this panel.
00:45:57
Thanks really to the president of the Dalle Molle foundation.