About this transcript: This is a full AI-generated transcript of Biohacker Bryan Johnson trusts AI will solve human aging, published May 3, 2026. The transcript contains 4,956 words with timestamps and was generated using Whisper AI.
[0:00] You've been emphatic that humans need to lean into AI to survive.
[0:04] So tell me the specifics, and I'd like specifics, of how superintelligence is going to overcome
[0:09] that and deliver on this imagined future, where AI cures all disease and allows us to
[0:13] live forever.
[0:14] Sundar announced yesterday, with Demis, that...
[0:18] Demis Hassabis and Sundar Pichai from Google.
[0:19] Yep.
[0:20] They created a new model that created a novel hypothesis about a way to turn a cold cancer
[0:27] into a hot cancer.
[0:28] So oftentimes, cancers sit in the body, undetected by the immune system.
[0:32] They made it hot.
[0:33] And so the immune system would say, like, there you are, or you can treat it.
[0:37] So a lot of people have criticized AI, like, oh, AI can't do anything novel.
[0:41] It just replicates human understanding.
[0:43] But yesterday was one of the first demonstrations of a genuinely novel insight.
[0:49] It thought it up itself.
[0:50] That you showed AI understands how cells communicate, understands the context in which they exist,
[0:58] and can talk back and say, what about this?
[1:00] And then it was replicated in an experiment that it was correct.
[1:04] So this is what I've been hoping for, for years, like, we're going to start making these
[1:09] little breakthroughs, then big breakthroughs, and then we're all going to say, huh.
[1:14] And so all of a sudden, this idea that we may be the first generation who won't die,
[1:19] people may say, like, maybe it's not crazy.
[1:22] Won't die, but won't die of that.
[1:24] Take that and multiply it by every disease out there.
[1:27] The study that just came out of China, if it replicates, they showed age reversal of
[1:34] equivalent of 9 to 15 years in humans across over 50% of human tissue, which is a gigantic
[1:42] demonstration.
[1:43] I have no doubt we were going to be living longer.
[1:45] It's not, I don't think, you know, maybe not today, but it's sort of like, probably back
[1:49] in when Orville Wright took off, I'm not going to be the person on the beach saying, he only
[1:54] went eight feet.
[1:55] Yeah.
[1:56] You go, oh my fucking god, he flopped.
[1:57] Exactly.
[1:58] That's the point.
[1:59] But again, you're saying, we won't die.
[2:01] We will die, we just will die later.
[2:03] If we travel back in time, and we're hanging out with Homo erectus, and I say, hey, Homo erectus,
[2:10] tell me about the future of our existence, what's going to happen?
[2:14] They will use their existing mental models and say, I will hunt, and I will forage,
[2:18] and I'll do this and that.
[2:19] They will not say that you're going to discover a microscopic world of atoms and molecules.
[2:24] You're going, right?
[2:25] They couldn't have imagined it.
[2:26] Exactly.
[2:27] And I'm saying that right now, relative to AI, it is possible we are as naive as Homo erectus.
[2:34] Certainly.
[2:35] But that's like the idea, I mean, you had a Dan Brown book here, I saw it, which he just
[2:39] wrote about in his recent book, that if Mozart came back and heard a speaker, he'd think there
[2:44] was a little orchestra in the little box, but he certainly would have soon understood
[2:49] it.
[2:50] We have a rate of adaptation as a species.
[2:52] For a new idea to be presented, for me to ingest it, for me to be like, I'm okay with
[2:56] it now.
[2:57] So we have a certain rate.
[2:58] Then AI has a certain rate.
[2:59] And if you start pairing those rates, you just have an open question and say, will we
[3:03] keep up to date with what's happening to AI or will it exceed us?
[3:06] And if it exceeds us, will it exceed in ways we can't quite comprehend?
[3:08] And so I'm not stating with any certainty, whether it's good or bad.
[3:12] All I'm saying is, this is a big moment.
[3:15] And what we don't want to do is walk into this moment as a war-faring, sick, addicted
[3:22] culture that kills ourselves for these points.
[3:26] I imagined being in the year 2500.
[3:29] And you're looking back at the early 21st century, our time and place, and you're wondering,
[3:33] what did Homo sapiens do in that moment, in that century?
[3:37] And I thought they might say two things.
[3:38] One is, this is when Homo sapiens gave birth to super intelligence, and this is when we
[3:43] figured out we would no longer die.
[3:45] It's solely a question, as a species, what are we capable of?
[3:50] And that's where I arrived at.
[3:51] It wasn't like, I want to live forever, therefore I'm going through this.
[3:54] It was that, trying to put a finger on what is the thing that can be done.
[3:58] So why do you phrase it like that?
[3:59] Because it is phrased and you market it, and it's quite marketed, I want to live forever.
[4:03] Oh, I don't.
[4:04] And I specifically do not say, I want to live forever.
[4:08] Mm-hmm.
[4:09] Okay.
[4:10] But you said you don't want to die.
[4:11] Don't die is very different.
[4:12] Okay.
[4:13] Explain that for me.
[4:14] Okay.
[4:15] Don't die is what you experience right now, and what I experience right now.
[4:17] Like if we felt the windows shatter, you and I would leave this room and seek protection,
[4:23] right?
[4:24] Like we'd go to a safe place.
[4:25] So don't die is our instinct in not wanting to die in this moment.
[4:30] And so this goes back to, this is going to be one more step for you.
[4:32] Okay.
[4:33] So in marrying the two concepts of giving birth to super intelligence and we are the first
[4:37] generation who won't die, my entire endeavor is not about health, it's about AI alignment.
[4:44] If you just have to create a practical map and say what things need to happen in order
[4:48] for us to solve aging.
[4:49] We know a jellyfish can be immortal.
[4:52] We know a hydra can be immortal.
[4:53] We know that immortality has been solved in biology.
[4:55] Mm-hmm.
[4:56] We know a lot in the human species.
[4:57] To do that, you probably need to have like a scientific-
[4:59] You mean the jellyfish at the bottom of the sea, those jellyfish?
[5:01] Yes, exactly.
[5:02] Okay.
[5:03] If you say, how could you practically take those learnings and apply them to human?
[5:06] You need a certain level of scientific understanding and tools to do that.
[5:10] We have those.
[5:11] Sure.
[5:12] But why not just study jellyfish instead of yourself?
[5:14] Measure everything.
[5:15] Because I think it comes from a mentality that a lot of people, and I'm not, I will lump you
[5:19] in with people, that everything is measurable.
[5:22] And every single thing is measurable, when in fact the human body is a very, I mean, or
[5:27] everything's like a computer.
[5:29] It's not like a computer.
[5:30] The human brain is not like a computer.
[5:32] If you look at the depths of it, I've become the most measured person in the world, offering
[5:36] a demonstration of what characterization can happen.
[5:39] And then I sift through the entire thing and say, I'm going to give it to you for free.
[5:44] I open source those and say, you can do your own thing.
[5:46] I will tell you exactly how to make it.
[5:48] And when you go out in the world right now, it's an absolute disaster.
[5:50] So how do you isolate the impact of each of these protocols, especially about short-term
[5:55] and long-term safety, and confidently endorse their effectiveness?
[5:59] Yeah.
[6:00] I mean, we do this.
[6:01] We just completed a study looking at hyperbaric oxygen therapy.
[6:03] We just did it with sauna, where we measured over a hundred biomarkers before I did hyperbaric
[6:08] oxygen therapy.
[6:09] I did the therapy.
[6:10] We isolated all my therapies during that time period, and we measured my biomarkers again.
[6:14] So yes, it's imperfect because there are variables we can't control in the world, but generally
[6:19] speaking, it's a fairly controlled experiment.
[6:21] Many of the studies around hyperbaric are mixed.
[6:24] Hyperbaric oxygen therapy has been the most efficacious therapy we've done.
[6:29] That you've done?
[6:30] Yeah.
[6:31] For your body?
[6:32] But we look at population level evidence.
[6:33] So it's not just mine.
[6:34] We say, what does all the evidence say of the people who have done this?
[6:38] What kind of effect size did they see in their brains, in the microbiome, in the telomeres?
[6:42] Then we say, what does mine show?
[6:45] Right.
[6:46] So we just have a comparative set.
[6:47] Yeah.
[6:48] So what I'm saying is, considering the wide variations in human biology, are they even comparable?
[6:52] That is possibly irresponsible to say one thing that happens to you is the thing that happens
[6:57] to another.
[6:58] Large scale, double blind, placebo controlled trials are the gold standard.
[7:02] Sure.
[7:03] We all want them.
[7:04] Yes.
[7:05] In the meantime.
[7:06] In the meantime.
[7:07] Everybody needs to make decisions on what they do in life.
[7:10] And so what I'm trying to do is I say, okay, look, here's the evidence base for hyperbaric
[7:14] oxygen therapy.
[7:15] Here's the population level data.
[7:17] I'm going to do this experiment, measure this, do the therapy, and share my results.
[7:22] So I share it.
[7:23] Then I can say, hey, everybody, if you want to repeat my experiment, you can do the following
[7:27] thing.
[7:28] The vector of criticizing me for experimenting and sharing results, it's such a narrow...
[7:33] Well, I think it's because you're an influencer.
[7:36] There's plenty of bad information.
[7:38] What is the alternative?
[7:39] So what do we do?
[7:40] Do the standard gold standard of double blind?
[7:42] Who's going to fund it?
[7:43] Well, you're rich.
[7:44] Yeah.
[7:45] How many studies could I fund?
[7:46] A lot.
[7:47] So could all of you.
[7:48] Yeah.
[7:49] But like for the number of variables, but the thing is like, this is the thing is it's
[7:52] not practical to get there and gripe and be like, don't run experiments on yourself
[7:56] and share your data.
[7:57] Just wait for everybody to do shit.
[7:59] Like, how do you get by in life?
[8:00] What do you do?
[8:01] Well, I think people think you should not be using your data to represent things and
[8:07] make decisions.
[8:08] But it isn't the gold standard of double blind.
[8:09] So nobody should experiment.
[8:10] No.
[8:11] I didn't say that.
[8:13] I said, I think you are under describing your influence, I would say.
[8:19] If you look at the range of the health advice out in the world, we're probably the safest
[8:24] vector of advice out there.
[8:27] You are.
[8:28] Yeah.
[8:29] Because?
[8:30] Because it happened to you?
[8:31] Nope.
[8:32] Because we follow the most robust scientific evidence in the world.
[8:36] You really do not worry if they're not gold standard studies.
[8:39] It's just, this is what happened to me, folks.
[8:42] Even in the gold standard studies, you're going to have limitations and every scientist
[8:46] that has written a paper knows that they have arranged that data in a certain way and
[8:52] made certain complicated decisions with that data.
[8:55] It is never clean, it is always complicated.
[8:58] And so this is the thing is, when scientists come at me and they somehow want to maintain
[9:03] some kind of higher position, that is not the case.
[9:05] Do you think that's just it?
[9:07] Maybe some of them are worried.
[9:08] Hey, look.
[9:09] Safety.
[9:10] I have seen this behind the scenes process now.
[9:12] I did not, I did not become educated as a scientist.
[9:15] I have sat in these rooms and had, I've been in these conversations where people are trying
[9:19] to figure out how to present a data set to the world in the paper they're writing.
[9:23] They have incentives to write papers that achieve certain outcomes to be in certain journals.
[9:28] It is not a sacred process.
[9:31] It is a vicious competitive process.
[9:33] So when they attack you, when these scientists attack you for being, you know, a con man,
[9:38] I've heard that many times.
[9:40] Yeah.
[9:41] I'm not, first, I'm not a con man.
[9:43] I'm just telling you what they're doing.
[9:44] I'm doing my experiments, I'm sharing my data, right, to the best of my ability.
[9:48] But when they come at this as somehow, I'm the person that needs to be ridiculed because
[9:52] this method is not good, I'm going to point the spotlight back and say, the public is
[9:59] not aware of what you do, I understand what you do.
[10:01] It does remind me a little bit of tech people who do this, where they go, I remember having
[10:06] a very elaborate conversation with Mark Zuckerberg about problems on social media, anti-Semitic
[10:11] stuff many years ago.
[10:12] This was a decade ago, and I said, down the line, this is going to be bad, like ultimately
[10:16] when the toxic stew gets down the line, it's going to cause problems.
[10:20] And he literally goes, but what about cable?
[10:22] I'm like, yeah, no, that's an industry that's declining, so I'm going to look at you, who's
[10:26] going on the up.
[10:27] And he goes, well, what about newspapers?
[10:28] And I'm like, again, it feels a little shifty.
[10:31] It's like, they're also corrupt, is your argument, or they're not corrupt?
[10:35] No, I'm saying that I am trying to solve a very practical problem that exists for billions
[10:42] of humans.
[10:43] What do I do?
[10:45] And on that stream of thought, it's like, who do I trust?
[10:48] How do I know what to do?
[10:49] How do I know what a process is?
[10:50] And I'm trying to say, here's a way to think about it.
[10:53] Here's how you understand scientific evidence.
[10:55] Here's how you understand measurement.
[10:56] I'm trying to take a basic class in scientific method and medical measurement of the human
[11:01] body.
[11:02] Then I become a focal point of criticism of how everyone disagrees with that methodology.
[11:07] Right.
[11:08] Which is fine.
[11:09] Like, that's what science is, what society is.
[11:10] That's why...
[11:11] So you think it's because they're priests and don't want to give up their priestly
[11:14] robes?
[11:15] That is the natural human response to any kind of situation.
[11:19] When every ounce of your life is measured, are you living?
[11:23] If you have a dashboard on your car, you're not going to be like, way too much information,
[11:27] really don't care about the oil.
[11:28] You know, the oil is different than all the rest of the nonsense.
[11:32] Or when a baby is born, you're like, honestly, too many panels for the baby, too excessive.
[11:37] Just give me half the measurements.
[11:38] So it's always the context on the frame.
[11:41] When LeBron James goes to bed on time and eats well and exercises and measures the metabolic
[11:45] activity, people are like, that's amazing, LeBron.
[11:48] When I do it, the reason why they criticize me is they don't know the category I belong to.
[11:53] Except LeBron James plays basketball then.
[11:55] What is your goal in that regard?
[11:57] I'm trying to demonstrate the science of aging.
[12:00] So a living experiment to actually quantify.
[12:03] Now this is useful because if somebody has a health issue, it becomes real, right?
[12:07] It's no longer like this.
[12:08] Oh, this dude is a motherfucker.
[12:09] It's like, how do I actually know what works?
[12:13] And that's what I'm trying to show them is there is a scientific process to measure, do
[12:19] something and then measure, which is weirdly absent from medicine.
[12:22] The point I'm trying to make is not that we've solved aging, not that we know everything.
[12:26] I'm trying to show that you can approach health with a scientific method.
[12:31] That's it.
[12:32] Which a lot of people do.
[12:33] I think you're not the first to do it.
[12:35] Just you.
[12:36] Sure, sure, sure.
[12:37] But I'm saying like, this is the point I'm trying to make though, right?
[12:39] Is that health generally in America right now does not follow the scientific method.
[12:45] Please elaborate.
[12:47] When you go into your doctor, right?
[12:49] And it's like, I have a blank thing going on.
[12:53] You get prescribed the drug, right?
[12:55] Sometimes.
[12:56] It's usually not like, let's do a comprehensive panel of markers.
[12:59] Let's assess where you're at now.
[13:01] Let's assess what the potential therapies could be.
[13:04] Let's then do that therapy and measure the markers again.
[13:07] Right.
[13:08] It's expensive.
[13:09] It doesn't fit into...
[13:10] You're talking about preventative care over the course of a lifetime, constant monitoring.
[13:12] Even robust care, if I have an issue.
[13:15] And it just, it doesn't operate enough in that capacity.
[13:19] Well, it's solving a problem in the immediate.
[13:21] That's what you're talking about.
[13:22] Yeah.
[13:23] Versus a systemic.
[13:24] You're talking about a systemic solution.
[13:26] Yeah, exactly.
[13:27] And I'm trying to basically say that like, this is a reasonable approach of how everybody
[13:30] should be managed to.
[13:31] Okay.
[13:32] You got a lot of attention for publicly saying you're no longer taking rapamycin after nearly
[13:36] five years of experimentation, off-label experimentation.
[13:39] It's still far from being proven to extend life in humans.
[13:44] But doing things to your body based on preliminary and cherry-picked data hasn't stopped people from
[13:49] doing it.
[13:50] So why make such a big stink about this change in protocol when in fact there is potential?
[13:57] Why did you do that?
[13:58] Do you worry that your public criticism might have real impact on funding for necessary research
[14:02] around something like this that might be helpful to people?
[14:05] I mean, as you said, the rapamycin is kind of like the golden child of the longevity
[14:11] community.
[14:12] It's one of very few things that actually gets a high praise from a wide range of scientists.
[14:16] Right.
[14:17] A little bit like Ozempic right now.
[14:19] It's a drug that suppresses the immune system.
[14:21] And so it's not free.
[14:22] You have to pay the consequence of a whole bunch of side effects like mouth sores, disrupted
[14:27] lipids, metabolic activity, et cetera.
[14:29] Yep.
[14:30] You said that.
[14:31] It also suppresses cancer surveillance cell activity.
[14:33] So it's suppressing an important part of your body which otherwise looks for cancer.
[14:37] And so we just said, we don't think this trade-off is worth it because we have risks that we're
[14:41] not quite certain what they are and how significant they are.
[14:44] And then a month later, there was a study that came out in Yale that showed that on 16 epigenetic
[14:49] clocks, methylation clocks, it increased the speed of aging.
[14:52] Right.
[14:53] You said that.
[14:54] And so, yeah, I mean, I think it's like one of these things where we're early, early
[14:57] days for biological aging.
[14:58] We don't know what's going to work and what's not going to work.
[15:00] It was a reasonable decision.
[15:02] Do you think you did damage to real research into something that might potentially be very important?
[15:07] for humans?
[15:08] Maybe.
[15:09] Because you have influence.
[15:10] Pretending you don't have influence is kind of silly.
[15:11] Maybe even increase it.
[15:12] Increase funding because they don't believe you?
[15:16] Because it now has a new hurdle to overcome.
[15:18] There's a new objective: to demonstrate that it's efficacious in certain ways.
[15:23] You're being a friend by being difficult.
[15:25] Yeah.
[15:26] What you're trying to do is you're trying to characterize the drug.
[15:29] You're trying to basically say, how can we see both the benefits and the side effects
[15:33] in the way those two?
[15:35] So we offered further characterization of it.
[15:37] How do you handle the criticism that you were doing it to take attention away from that New
[15:41] York Times piece about you?
[15:42] For example, that was one of the accusations.
[15:44] Oh, no.
[15:45] It's not related.
[15:46] Not related.
[15:47] Yeah.
[15:48] So you don't feel like this author is saying you're doing damage to real longevity research
[15:53] by saying it doesn't work, essentially, to your followers.
[15:57] Yeah.
[15:58] I mean, you could easily frame that and say I'm doing them a favor.
[16:01] Right?
[16:02] We identified it's increasing the speed of aging.
[16:04] We identified the side effects.
[16:06] The strongest proponent of rapamycin died of brain cancer.
[16:12] We don't know if that's why.
[16:14] We don't know if that's why.
[16:15] We don't know why.
[16:16] We don't know.
[16:17] We don't know why.
[16:18] Right?
[16:19] But he was on a very high dose.
[16:20] But I'm just saying it's potentially useful to call attention to something that's not.
[16:23] Yeah, but that's a little bit like the vaccine people.
[16:25] Like, ah, I found six cases.
[16:28] There's nothing wrong with looking at it.
[16:29] But it's certainly, you know, we have measles now after 20 years of not having measles.
[16:34] Yeah.
[16:35] I mean, it's a question.
[16:36] We're just asking questions sometimes.
[16:39] Or ignorant.
[16:40] Or trying to create ignorance.
[16:41] No?
[16:42] No.
[16:43] No.
[16:44] This is all about characterization.
[16:45] This is all about how do you make informed decisions with very complicated topics.
[16:48] I felt like I was in the room with RFK Jr. there for a second.
[16:51] Like, this, circumcision, autism.
[16:53] I'm like, what?
[16:54] Like, how did...
[16:55] And there were no studies there.
[16:56] There was nothing.
[16:57] And there are plenty of studies showing the other way.
[16:59] But you don't think there's a problem raising things like this?
[17:02] No, so I think that...
[17:04] So he...
[17:05] The gentleman who died of brain cancer actively raised this, right?
[17:09] To pose the question.
[17:10] Because one of the side effects is it lessens cancer surveillance activity in the body.
[17:15] Sure, but you know it before.
[17:16] We know that's the case.
[17:17] And so we were concerned in my body, is that a problem?
[17:20] So we know it lessens cancer surveillance.
[17:25] If it lessens cancer surveillance, it increases cancer risk potentially.
[17:29] That would be causative.
[17:30] Possibly.
[17:31] So we're curious.
[17:32] Yeah.
[17:33] We didn't say if it did.
[17:34] We just said, that's interesting.
[17:35] Mm-hmm.
[17:36] And if that's something that could happen, we definitely want to be aware of it.
[17:39] Right.
[17:40] But getting back to the complexity, it could have been, he had the gene.
[17:42] Entirely.
[17:43] He could have been walking near, too close to a cell tower.
[17:45] Who knows?
[17:46] Entirely.
[17:47] Someone could have hit him at some point when he was seven.
[17:48] Yep.
[17:49] You know, you just don't know.
[17:50] Yeah.
[17:51] Right?
[17:52] Yeah.
[17:53] So we're not...
[17:54] I'm not saying it's causative.
[17:55] I'm saying it's an open question that we should investigate.
[17:56] Yeah.
[17:57] But then you have people down in Texas not getting their measles shots and all these kids dying
[18:00] of measles.
[18:01] Like, it has repercussions.
[18:02] What?
[18:03] Yeah.
[18:04] Correct?
[18:05] Do you think you're responsible for the repercussions?
[18:06] No.
[18:07] Measles is different than rapamycin.
[18:08] Sure.
[18:09] I'm just using it as an example.
[18:10] Yeah, yeah, yeah.
[18:11] People just asking questions.
[18:12] Yeah, yeah.
[18:14] Yeah.
[18:15] The top line debates would be much better to have with, like, robust scientific data.
[18:20] Mm-hmm.
[18:22] Those are well-characterized diseases.
[18:23] Mm-hmm.
[18:24] It's well-characterized reactions.
[18:26] It's a different conversation.
[18:27] Right.
[18:28] Okay.
[18:29] One of the things that's a problem in our society, people can't get doctor's appointments.
[18:32] They can't afford to buy fresh produce, for example.
[18:35] Good things to put in your body.
[18:37] And you've earned your money.
[18:39] You're entitled to spend it any way you want to.
[18:41] Do you think there's anything tone-deaf about promoting expensive and time-consuming protocols
[18:46] for people, if that's the case?
[18:48] I wish there was somebody in the world that helped them understand how to figure out what
[18:52] things work for them and how to test foods.
[18:54] Mm-hmm.
[18:56] So you're doing it for them.
[18:57] This is the point I'm trying to make.
[18:59] I'm trying to say there is a deficiency in the world.
[19:02] Right.
[19:03] Of people knowing practically what to do.
[19:05] Mm-hmm.
[19:06] And then if I do something, then, you know, the rebuttal is, well, it's not a placebo-controlled
[19:09] double-blind study, so therefore it shouldn't exist.
[19:11] Meanwhile, what does the average person do?
[19:14] Mm-hmm.
[19:15] How do they think through, how do I figure out what is wrong with me?
[19:18] How do I actually address, is this therapy good for me?
[19:20] What do I measure?
[19:21] So you're moving into, you think you're moving into an empty space that they're not
[19:26] Entirely.
[19:27] I mean, no matter, Kara, no matter where I go, if the person's a billionaire or the person
[19:31] has no money, they are in almost the same space of lack of understanding of the most basic
[19:37] things on health.
[19:38] It is, that society does not know how to sleep.
[19:41] Mm-hmm.
[19:42] We have not ever-
[19:43] This is true.
[19:44] So it is like, we're at such a elementary level on every vector of health.
[19:48] Mm-hmm.
[19:49] Who's going to fill the void?
[19:50] And so if people out there are talking about their various things, what I would prefer is,
[19:54] they use scientific evidence and try to be robust with what they're saying.
[19:58] Because right now, it's very confusing.
[20:00] What about you do something, someone tries it because they believe in you, and bad things
[20:07] happen?
[20:08] Are you ready to take responsibility if something you say or advise hurts someone?
[20:13] It's the natural process.
[20:14] I mean, I could be talking about coffee, and somebody tries coffee and they have a negative
[20:18] reaction to coffee.
[20:19] Like, you know, am I responsible for that?
[20:21] I mean, that's just the natural part of people trying stuff.
[20:23] I mean, it's inevitable.
[20:24] Like, you can't have a controlled experiment with billions of humans.
[20:27] Like, we all try things for a day.
[20:28] Right.
[20:29] But if you say, stick it up your ass, and someone dies from it, yes, it is.
[20:32] This is the thing.
[20:33] Well, if you give a specific, you know.
[19:35] This is why I have not done coffee colonics, or coffee enemas, because the evidence is
[20:41] not great.
[20:42] Why not put your vast sums of money, and again, you can spend it any way you want, into something
[20:46] like gun control, big killer of people, or unprocessed foods in schools that would actually
[20:51] help people.
[20:52] Yeah.
[20:53] In this short term, I guess.
[20:55] Yep.
[20:57] I'm trying to find the fulcrum with the biggest leverage.
[21:01] Uh-huh.
[21:02] My formula has been go to the root of that problem and try to solve the foundational issues.
[21:07] Uh-huh.
[21:08] The guns and the other things, they live downstream from other things.
[21:13] And this is what I've been trying to tell, like, very basic, like, causative relationships.
[21:17] Uh-huh.
[21:18] If you don't get good sleep, your willpower...
[21:21] Certainly.
[21:22] ...falls off a cliff.
[21:23] Uh-huh.
[21:24] Your sex drive falls off a cliff.
[21:26] You know, sometimes I think, someone asked me what I thought about why I was so intriguing
[21:30] to people like you, and you're very typical of entrepreneurs I've met over the many years,
[21:34] and they all have a certain thing.
[21:35] And for some reason, I said, well, men can't have babies, and therefore this is their version
[21:40] of that.
[21:41] Uh-huh.
[21:42] Uh-huh.
[21:43] And, of course, it's narcissistic rather than community-based.
[21:45] How do you think about that?
[21:46] I mean, what's more community-based than the human race?
[21:49] Like, what more could be a communal endeavor?
[21:52] And you felt dying was what was happening to us, and you're trying to solve that problem?
[21:57] What problem are you solving for, exactly?
[21:59] Yep.
[22:00] So, the primary objective function of the world right now is power, wealth, and status.
[22:05] Uh-huh.
[22:06] Those are the biggest...
[22:07] Nothing new.
[22:08] Exactly.
[22:09] And so, if you take those objective functions, and you then take this powerful new intelligence,
[22:13] you point it at those ends, you question, is that the right goal?
[22:17] Uh-huh.
[22:18] Do you want to do that?
[22:19] Or, is there something else?
[22:20] And what I'm suggesting is, as a species, we are going to transition into a new era,
[22:27] Uh-huh.
[22:28] where existence itself is the highest virtue.
[22:31] So, you're in it for humanity, not for self-aggrandizement.
[22:35] Like, what else is there to play for?
[22:38] I mean, self-aggrandizement is like such a ridiculous notion.
[22:41] We're such a minuscule and insignificant part of this galaxy.
[22:44] Like, why would it even matter?
[22:46] I think the narcissist accusation is really lazy, actually.
[22:49] Okay.
[22:50] It's like, it doesn't understand the depth of what I'm trying to do.
[22:53] The headlines have painted this superficial, narcissistic story that people grab onto.
[22:59] Uh-huh.
[23:00] If you look at the depths of it, I've become the most measured person in the world, offering
[23:04] a demonstration of what characterization can happen.
[23:07] So, when you say narcissism is lazy, you also play into it with your photos of your chest,
[23:13] and, you know, holding this thing with your son, and talking about blame.
[23:17] You're doing it on purpose, presumably, correct?
[23:20] Uh-
[23:21] You know it's like catnip to media to do that.
[23:24] Absolutely.
[23:25] Right.
[23:26] So, why are you doing it that way?
[23:27] That the world is in a ferocious match of competition.
[23:30] Uh-huh.
[23:31] And you need your thing to be understood.
[23:33] And like when you play out there, you just play the game.
[23:36] Play the game.
[23:37] Yeah.
[23:38] Explain rejuvenation athlete.
[23:40] When you're an entrepreneur, the playbook is you martyr yourself in the pursuit of wealth,
[23:46] which means you don't sleep very much, you don't exercise.
[23:48] Right.
[23:49] Hustle porn.
[23:50] So, then you want to flex like I only get three or four hours of sleep a night and I'm
[23:54] pretty badass.
[23:55] It's the Elon thing.
[23:56] It is, right?
[23:57] So, that's the cultural norm.
[23:58] And so, if you can quantify that trade-off, your wealth may be increasing over the time
[24:03] duration, but you're destroying your health.
[24:06] Would you have characterized yourself as a hustle porn guy?
[24:10] 100%.
[24:11] How did you feel doing that?
[24:12] It was disastrous.
[24:13] I would have committed suicide had it not been for my kids.
[24:16] Most of society has a pretty serious mental health issue.
[24:19] And so, that's what I'm saying.
[24:20] When the people are criticizing me, I don't think it's about me.
[24:23] So, how do you want to die?
[24:26] I want to die in the most ridiculous way possible.
[24:29] Okay.
[24:30] Go for it.
[24:31] Because irony is the ultimate language of the universe.
[24:34] Yeah, it's going to be a great headline.
[24:36] Hit by a truck.
[24:37] 100%, right?
[24:38] Like, hit by a bus would be one like choking on broccoli, you know, an experiment that I
[24:42] accidentally give myself cancer with, you know, like, it's just, it's, I'm guaranteed
[24:47] to die in the most ridiculous way possible.
[24:50] It's just going to happen.
[24:51] Okay.
[24:52] So, say you see it coming.
[24:53] What's your last words?
[24:55] Oh, fuck.
[24:56] It would be something like, have fun with it.
[25:02] I thought it was it.