source link: https://hackernoon.com/elon-musk-and-sam-altman-discuss-what-people-should-work-on-2016-video-and-transcript

Elon Musk and Sam Altman Discuss What People Should Work On [2016 Video and Transcript]

Sam Altman: Today we have Elon Musk. Elon, thank you for joining us.

Elon Musk: Thanks for having me.

Sam Altman: We want to spend the time today talking about your view of the future and what people should work on. To start off, could you tell us: you famously said when you were younger that there were five problems you thought were most important for you to work on. If you were 22 today, what would the five problems that you would think about working on be?

Elon Musk: Well, first of all, I think if somebody is doing something that is useful to the rest of society, that's a good thing. It doesn't have to change the world. If you make something that has high value to people, and frankly even if it's just a little game or some improvement in photo sharing, if it does a small amount of good for a large number of people, I think that's fine. Stuff doesn't need to change the world just to be good.

But in terms of the things that I think are most likely to affect the future of humanity: I think AI is probably the single biggest item in the near term that's likely to affect humanity. So it's very important that we have the advent of AI in a good way, such that if you could look into a crystal ball and see the future, you would like that outcome. It is something that could go wrong, as we've talked about many times, so we really need to make sure it goes right. Working on AI and making sure it's a great future, that's the most important thing right now, I think. The most pressing item.

Then obviously anything to do with genetics. If you can actually solve genetic diseases, if you can prevent dementia or Alzheimer's or something like that with genetic reprogramming, that would be wonderful. So I think genetics might be the sort of second most important item.

Then I think having a high-bandwidth interface to the brain. We're currently bandwidth-limited. We have a digital tertiary self in the form of our email capabilities, our computers, phones, applications. We're effectively superhuman, but we're extremely bandwidth-constrained in that interface between the cortex and that tertiary, digital form of yourself. Helping to solve that bandwidth constraint would be, I think, very important for the future as well.
Sam Altman: One of the most common questions I hear ambitious young people ask is, "I want to be the next Elon Musk. How do I do that?" Obviously the next Elon Musk will work on very different things than you did. But what have you done, or what did you do when you were younger, that you think set you up to have a big impact?

Elon Musk: Well, I should say that I did not expect to be involved in all of these things. The five things that I thought about at the time in college, quite a long time ago, 25 years ago, were making life multiplanetary, accelerating the transition to sustainable energy, the internet broadly speaking, and then genetics and AI. I didn't expect to be involved in all of those things. At the time in college, I actually thought helping with the electrification of cars was how I would start out. That's what I worked on as an intern: advanced ultracapacitors, to see if there would be a breakthrough relative to batteries for energy storage in cars. And when I came out to go to Stanford, that's what I was going to do my grad studies on: advanced energy storage technologies for electric cars. I put that on hold to start an internet company in '95, because there does seem to be a time for particular technologies, when they're at a steep point in the inflection curve. I didn't want to do a PhD at Stanford and then just watch it all happen. And I wasn't entirely certain that the technology I'd be working on would actually succeed; you can get a doctorate on many things that ultimately do not have a practical bearing on the world. I really was just trying to be useful. That's the optimization: what can I do that would actually be useful?

Sam Altman: Do you think people that want to be useful today should get PhDs?

Elon Musk: Mostly not. Some yes, but mostly not.

Sam Altman: How should someone figure out how they can be most useful?

Elon Musk: Whatever this thing is that you're trying to create, ask: what would be the utility delta compared to the current state of the art, times how many people it would affect? That's why I think having something that makes a big difference but affects a small to moderate number of people is as great as something that makes even a small difference but affects a vast number of people. Like, the area...

Sam Altman: The area under the curve.

Elon Musk: Yeah, exactly. The area under the curve would actually be roughly similar for those two things. So it's really about just trying to be useful, and matter.
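Musk's heuristic reduces to a simple product: total usefulness is roughly the per-person improvement over the state of the art multiplied by the number of people reached. A minimal sketch of that arithmetic in Python (the function name and all of the numbers are illustrative, not from the interview):

```python
# Musk's "area under the curve" heuristic, sketched numerically.
# total usefulness ~ (utility delta vs. the state of the art) x (people affected)
# The values below are made up purely for illustration.

def area_under_curve(utility_delta: float, people_affected: float) -> float:
    """Per-person improvement times reach."""
    return utility_delta * people_affected

# A deep improvement for a small group...
niche_breakthrough = area_under_curve(utility_delta=1000.0, people_affected=10_000)
# ...versus a tiny improvement for a very large group.
broad_tweak = area_under_curve(utility_delta=0.01, people_affected=1_000_000_000)

print(niche_breakthrough, broad_tweak)  # 10000000.0 10000000.0: roughly the same area
```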

Sam Altman: When you're trying to estimate the probability of success, so you say something will be really useful, good area under the curve... I guess to use the example of SpaceX: when you made the go decision that you were actually going to do that, this was kind of a very crazy thing at the time.

Elon Musk: Very crazy, for sure. Yeah, I'm not shy about saying that. I kind of agreed with them that it was quite crazy. Crazy, that is, if the objective was to achieve the best risk-adjusted return; starting a rocket company is insane from that standpoint. But that was not my objective. I'd come to the conclusion that if something didn't happen to improve rocket technology, we'd be stuck on Earth forever. And the big aerospace companies had just no interest in radical innovation. All they wanted to do was try to make their old technology slightly better every year, and in fact sometimes it would actually get worse. Particularly in rockets it's pretty bad: in '69 we were able to go to the moon with the Saturn V, then the Space Shuttle could only take people to low Earth orbit, and then the Space Shuttle retired. That trend basically trends to zero.

People sometimes think technology just automatically gets better every year, but actually it doesn't. It only gets better if smart people work like crazy to make it better. That's how any technology actually gets better. Left to itself, if people don't work on it, technology will actually decline. You can look at the history of civilizations, many civilizations. Look at, say, ancient Egypt, where they were able to build these incredible pyramids, and then they basically forgot how to build pyramids. Even hieroglyphics: they forgot how to read hieroglyphics. Or look at Rome, and how they were able to build these incredible roadways and aqueducts and indoor plumbing, and then they forgot how to do all of those things. There are many such examples in history. So I think you should bear in mind that entropy is not on your side.
Sam Altman: One thing I really like about you is that you are unusually fearless, and willing to go in the face of other people telling you something is crazy. And I know a lot of pretty crazy people; you still stand out. Where does that come from, or how do you think about making a decision when everyone tells you an idea is crazy? Where do you get the internal strength to do that?

Elon Musk: Well, first of all, I'd say I actually feel fear quite strongly. It's not as though I just have an absence of fear; I feel it quite strongly. But there are times when something is important enough, when you believe in it enough, that you do it in spite of the fear. Speaking of important things: people shouldn't think, "I feel fear about this, and therefore I shouldn't do it." It's normal to feel fear. There would have to be something mentally wrong with you if you didn't feel fear.

Sam Altman: So you just feel it and let the importance of it drive you to do it anyway.

Elon Musk: Yeah. Actually, something that can be helpful is fatalism, to some degree. If you just accept the probabilities, that diminishes fear. Starting SpaceX, I thought the odds of success were less than 10%, and I just accepted that I would probably lose everything, but that maybe we would make some progress. If we could just move the ball forward, even if we died, maybe some other company could pick up the baton and keep moving it forward, so that we'd still do some good. Same with Tesla: I thought the odds of a car company succeeding were extremely low.

Sam Altman: What do you think the odds of the Mars colony are at this point, today?

Elon Musk: Well, oddly enough, I actually think they're pretty good.

Sam Altman: So, like, when can I go?

Elon Musk: Okay. At this point, I am certain there is a way. I'm certain that success is one of the possible outcomes for establishing a self-sustaining Mars colony, in fact a growing Mars colony. I'm certain that that is possible. Whereas until maybe a few years ago, I was not sure that success was even one of the possible outcomes. Getting some meaningful number of people to Mars is potentially something that can be accomplished in about ten years, maybe sooner, maybe nine years. I need to make sure that SpaceX doesn't die between now and then, and that I don't die, or that if I do die, someone takes over who will continue it.

Sam Altman: You shouldn't go on the first launch.

Elon Musk: Yeah, exactly. The first launch will be robotic anyway.

Sam Altman: So I want to go, except for the internet latency.

Elon Musk: Yeah, the latency would be pretty significant. Mars is roughly 12 light minutes from the Sun, and Earth is 8 light minutes. So at closest approach, Mars is 4 light minutes away; at farthest approach it's 20, and a little more than that in practice, because you can't talk directly through the Sun.
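Those figures pin down the one-way signal delay, and the round trip follows directly. A quick check of the arithmetic, using only the light-minute distances quoted above:

```python
# Checking the Earth-Mars light-delay figures quoted above.
# Distances are the rough Sun-to-planet values from the transcript, in light minutes.

MARS_FROM_SUN_LMIN = 12    # Mars: ~12 light minutes from the Sun
EARTH_FROM_SUN_LMIN = 8    # Earth: ~8 light minutes from the Sun

closest_one_way = MARS_FROM_SUN_LMIN - EARTH_FROM_SUN_LMIN    # opposition: ~4 minutes
farthest_one_way = MARS_FROM_SUN_LMIN + EARTH_FROM_SUN_LMIN   # conjunction: ~20 minutes
# Near conjunction the Sun blocks the direct path, so the signal must be
# relayed around it, adding a bit more delay than the straight-line figure.

print(f"one-way delay: {closest_one_way} to {farthest_one_way}+ minutes")
print(f"round trip:    {2 * closest_one_way} to {2 * farthest_one_way}+ minutes")
```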

Sam Altman: Speaking of really important problems: AI. You've been outspoken about AI. Could you talk about what you think the positive future for AI looks like, and how we get there?

Elon Musk: Okay. I do want to emphasize that this is not really something that I advocate; this is not prescriptive. This is simply, hopefully, predictive. Some will say, "Oh, well, this is something that you want to occur," rather than what I mean, which is that I think it probably is the best of the available alternatives. The best of the available alternatives that I can come up with, and maybe someone else can come up with a better approach or a better outcome, is that we achieve democratization of AI technology, meaning that no one company or small set of individuals has control over advanced AI technology. I think that's very dangerous. It could also get stolen by somebody bad; you know, some evil dictator state could send its intelligence agency to go steal it and gain control. It just becomes a very unstable situation, I think, if you've got any incredibly powerful AI. You just don't know who's going to control that.

So it's not that I think the risk is that the AI would develop a will of its own right off the bat. The concern is more that someone may use it in a way that is bad, or that even if they weren't going to use it in a way that's bad, somebody could take it from them and use it in a way that's bad. That, I think, is quite a big danger. So I think we must have democratization of AI technology: make it widely available. That's, you know, the reason that you, me, and the rest of the team created OpenAI: to help with the democratization of AI technology, so it doesn't get concentrated in the hands of a few. But then, of course, that needs to be combined with solving the high-bandwidth interface to the cortex.

Sam Altman: Humans are so slow.

Elon Musk: Humans are so slow, yes, exactly. But we already have a situation in our brain where we've got the cortex and the limbic system. The limbic system is the primitive brain, sort of your instincts and whatnot, and then the cortex is the thinking, upper part of the brain. Those two seem to work together quite well. Occasionally your cortex and limbic system may disagree, but it generally works pretty well. It's rare to find someone, in fact I've not found anyone, who wishes to get rid of either the cortex or the limbic system.

Sam Altman: Very true.

Elon Musk: Yeah, that's unusual. So I think if we can effectively merge with AI by improving the neural link between your cortex and the digital extension of yourself, which already exists but has a bandwidth issue, then effectively you become an AI-human symbiote. And if that is then widespread, where anyone who wants it can have it, then we solve the control problem as well. We don't have to worry about some sort of evil dictator AI, because collectively, we are the AI. That seems like the best outcome I can think of.
Sam Altman: You've seen other companies in their early days that start small and get really successful. I hope I don't regret asking this on camera, but how do you think OpenAI is going, as a six-month-old company?

Elon Musk: I think it's going pretty well. I think we've got a really talented group at OpenAI, a really, really talented team, and they're working hard. OpenAI is structured as a 501(c)(3) nonprofit, but, you know, many nonprofits do not have a sense of urgency. That's fine; they don't have to have a sense of urgency. But OpenAI does, because I think people really believe in the mission. I think it's important. It's about minimizing the risk of existential harm in the future. So I think it's going well. I'm pretty impressed with what people are doing, and with the talent level. And obviously we're always looking for great people to join the mission.

Sam Altman: It's close to 40 people now?

Elon Musk: Yes.

Sam Altman: All right, just a few more questions before we wrap up. How do you spend your days now? What do you allocate most of your time to?

Elon Musk: My time is mostly split between SpaceX and Tesla, and of course I try to spend part of every week at OpenAI. I spend basically half a day at OpenAI most weeks, and then I have some OpenAI stuff that happens during the week as well. But other than that, it's really mostly SpaceX and Tesla.

Sam Altman: And what does your time look like there?

Elon Musk: Yeah, that's a good question. I think a lot of people think I must spend a lot of time with media or on business things, but actually almost all my time, like 80% of it, is spent on engineering and design. Engineering and design: developing the next-generation product. That's 80% of it.

Sam Altman: You probably remember this. A very long time ago, many, many years, you took me on a tour of SpaceX, and the most impressive thing was that you knew every detail of the rocket and every piece of engineering that went into it. I don't think many people get that about you.

Elon Musk: Yeah. I think a lot of people think I must be kind of a business person or something, which is fine; business is fine. But really, at SpaceX, Gwynne Shotwell, the chief operating officer, kind of manages legal, finance, sales, and general business activity. My time is almost entirely with the engineering team, working on improving the Falcon 9 and the Dragon spacecraft and developing the Mars colonial architecture. And at Tesla it's working on the Model 3. I'm in the design studio, taking up half a day a week, dealing with aesthetics and look-and-feel things, and then most of the rest of the week is just going through the engineering of the car itself, as well as the engineering of the factory. Because the biggest epiphany I've had this year is that what really matters is the machine that builds the machine, the factory, and that this is at least two orders of magnitude harder than the vehicle itself.

Sam Altman: It's amazing to watch the robots go here, and these cars just happen.

Elon Musk: Yeah. Now, this factory actually has a relatively low level of automation compared to what the Gigafactory will have, and what the Model 3 line will have.

Sam Altman: What's the speed on the line of these cars?

Elon Musk: Actually, the average speed of the line is incredibly slow. Including both the X and the S, it's maybe 5 centimeters per second.

Sam Altman: And what would you like to get to?

Elon Musk: I'm confident we can get to at least 1 meter per second. So, a twentyfold increase.

Sam Altman: That would be very fast.

Elon Musk: Yeah, at least. I mean, I think quite a bit more, actually. To put 1 meter per second in perspective: it's a slow walk, or like a medium-speed walk. A fast walk could be one and a half meters per second. And the fastest humans can run at over 10 meters per second. So if we're doing 0.05 meters per second, that's very slow compared to walking speed. Even at 1 meter per second, you could still walk faster than the production line.
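The closing numbers are easy to sanity-check: 5 cm/s versus a 1 m/s target is exactly the twentyfold increase cited, and both sit below the walking and running speeds used for scale. A small sketch of that comparison:

```python
# Sanity-checking the production-line speeds quoted at the end of the interview.
# All speeds in meters per second.

current_line = 0.05   # today's Model S/X line, ~5 cm/s
target_line = 1.0     # the stated goal
fast_walk = 1.5       # a brisk human walk
sprinter = 10.0       # the fastest humans over short distances

print(f"speed-up factor:      {target_line / current_line:.0f}x")  # 20x
print(f"fast walk vs. target: {fast_walk / target_line:.1f}x")     # 1.5x, you can outwalk it
print(f"sprinter vs. target:  {sprinter / target_line:.0f}x")      # 10x
```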