
Should We Be Fearful of Artificial Intelligence?

7 minutes 51 seconds


S1

Speaker 1

00:00

The Joe Rogan Experience.

S2

Speaker 2

00:02

I think some people do, and that's the real fear. Some people, like this guy who's advocating for a digital god... like, they do know where it's going, because they actually work in technology, and it's not freaking them out.

S3

Speaker 3

00:14

But it's like Columbus, you know. Like, everybody thought the world was flat. He got a couple of people to get on a boat and float, and we didn't know what that was gonna mean.

S2

Speaker 2

00:21

See if we can find where Elon talks about this part, because that was the most

S4

Speaker 4

00:25

fascinating to me. I mean, the reason OpenAI exists at all...

S2

Speaker 2

00:31

This is what he said earlier, but...

S4

Speaker 4

00:34

...used to be close friends. And I was at his house in Palo Alto, and I would talk to

S2

Speaker 2

00:38

him. I realize these are the people that are at the pinnacle of technology

S4

Speaker 4

00:42

and they're having slumber parties. Yeah, they're very aware. ...AI safety seriously enough. And...

S3

Speaker 3

00:49

What did he say about it?

S4

Speaker 4

00:50

He really seemed to want, sort of, a digital superintelligence, basically a digital god, if you will, as soon as possible.

S3

Speaker 3

01:01

He wanted that?

S4

Speaker 4

01:03

Yes. He's made many public statements over the years that the whole goal of Google is what's called AGI, artificial general intelligence, or artificial superintelligence. And I agree with him that there's great potential for good, but there's also potential for bad. And so if you've got some radical new technology, you want to try to take a set of actions that maximize the probability it will do good and minimize the probability it will do bad things.

S4

Speaker 4

01:29

Yes. It can't just be hell for leather, just go barreling forward and hope for the best. And then at one point, I said, well, what about, you know, we're going to make sure humanity's okay here?

S4

Speaker 4

01:45

And then he called me a speciesist.

S3

Speaker 3

01:51

Did he use that term?

S4

Speaker 4

01:53

Yes. And there were witnesses. I wasn't the only one there when he called me a speciesist. And so I was like, okay, that's it.

S4

Speaker 4

02:01

Yes, I'm a speciesist. Okay, you got me. Yeah, I'm fully speciesist. That was the last straw at the time.

S2

Speaker 2

02:18

How wild is that? It is wild. But these are the people that are in control of this thing, and I think there's also this race that's going on. There's all these different companies around the world that are trying to develop artificial general intelligence first, because I think having it first, if you have a digital god first, you have a massive advantage over everyone and everything. Yeah, I mean, if you think that tech companies have a lot of power now, imagine if tech companies unleash a digital god.

S2

Speaker 2

02:46

I mean, they literally might be the very seeds that created God.

S3

Speaker 3

02:50

I really believe that now the digital universe is probably the most sought-after god. You know, most people probably spend more time advocating for whatever they're seeing online, anywhere, than they do for their church or their, you know... I think that already exists. People like Elon see that, and there should be a race for it.

S3

Speaker 3

03:17

I don't see it as, you know... everything, everything can have a really dark, bad side. And we can't control it. So I think even talking about it the way you're talking about it is scary. And I don't know that it's...

S3

Speaker 3

03:31

Can I scare you with this?

S2

Speaker 2

03:32

Let's scare me with this.

S3

Speaker 3

03:33

This was on 60 Minutes last night. They did a whole piece.

S2

Speaker 2

03:36

Oh, that's right. I saw this. "One AI program spoke in a foreign language it was never trained to know. This mysterious behavior, called emergent properties, has been happening, where AI unexpectedly teaches itself a new skill."

S3

Speaker 3

03:49

Like a minute in he says something that I think is

S2

Speaker 2

03:51

pretty wild. Yeah, this is bananas. Go ahead.

S2

Speaker 2

03:53

This is on CBS.

S5

Speaker 5

03:55

...is called emergent properties. Some AI systems are teaching themselves skills that they weren't expected to have. How this happens is not well understood.

S5

Speaker 5

04:09

For example, one Google AI program adapted on its own after it was prompted in the language of Bangladesh, which it was not trained to know.

S1

Speaker 1

04:22

We discovered that, with very few amounts of prompting in Bengali, it can now translate all of Bengali. So now, all of a sudden, we have a research effort where we're now trying to get to a thousand languages.

S6

Speaker 6

04:35

There is an aspect of this which we call, all of us in the field call it as a "black box." You know, you don't fully understand, and you can't quite tell why it said this or why it got it wrong. We have some ideas, and our ability to understand this gets better over time, but that's where the state of the art is.

S5

Speaker 5

04:54

You don't fully understand how it works, and yet you've turned it loose on society?

S6

Speaker 6

05:00

Let me put it this way. I don't think we fully understand how a human mind works either.

S5

Speaker 5

05:06

Was it from that black box, we wondered...

S2

Speaker 2

05:14

What else do they say in

S3

Speaker 3

05:14

here? That's like, he wrote a poem, and they're asking why did it write it. But why does that scare you guys so much? Why does

S2

Speaker 2

05:22

it? Listen, it's alive. It's not whether or not it's scary. It's a...

S2

Speaker 2

05:26

It's a kind of life form.

S3

Speaker 3

05:27

But here's the thing. I am really fearful of humanity. I'm really afraid of us.

S2

Speaker 2

05:35

Let's hope it's not afraid of us too, and decides to get rid of us.

S3

Speaker 3

05:41

Or make us better. Or, you know... and that's a 50-50. And I would like to, you know... I have enough negativity, and I'm not talking, it's not about me, but going on in my mind, where, if I don't know, and I'm assuming that it is a higher power than me, not God, but it is a higher power than me, that maybe, for whatever reason... let's trust that it is a benefit and not something that's horrible.

S2

Speaker 2

06:13

Well, I certainly hope...

S3

Speaker 3

06:15

I hope it's a benefit. What you just played for me didn't scare me. It doesn't scare me.

S3

Speaker 3

06:20

The fact that it is artificial intelligence... what is intelligence, just by virtue of what it is? So it's learning things that we can't explain. It's intelligence. That's what intelligence is.

S2

Speaker 2

06:32

I think we're using this word in a weird way, the word "scared." Because I don't think that it's scared like I'm scared of wolves.

S2

Speaker 2

06:39

It's not that kind of scared. It's scared like, ooo-wee, I realize where this is going. And it might not even be

S4

Speaker 4

06:46

in our lifetime.

S3

Speaker 3

06:46

But you didn't describe a good place.

S2

Speaker 2

06:48

Well, it's not a good place for us. But maybe it's a good place for the universe.

S3

Speaker 3

06:52

You invest in this, like... do you invest in Bitcoin? That's a Bitcoin.

S4

Speaker 4

06:56

Okay, so you're...

S2

Speaker 2

06:57

I think... but that's not what I'm thinking about. I'm thinking about maybe this is what happens with intelligence everywhere. That maybe intelligence realizes the limitations of biology. And biological evolution is very time-consuming.

S2

Speaker 2

07:10

It takes a long time to get adaptation, for things to change. It takes decades. It takes centuries. It takes thousands of years.

S2

Speaker 2

07:18

But this could happen in weeks and hours and minutes, and especially if it knows how to make a better version.

S3

Speaker 3

07:24

And you don't think that's scary, what you just proclaimed? You keep saying scary.

S2

Speaker 2

07:28

I do. I think it's just the thing that's happening. And I think we will... But you have children.

S2

Speaker 2

07:33

Yes, we will continue to exist, but I feel like this was inevitable. Just like... you ever... there's inevitable things that happen in nature that we don't want to admit. You don't want to say they're scary. They're just inevitable.