Can human civilization last 10,000 years?
random trip report
I was talking with Noah about SETI the other day, and I pointed out that if the average lifespan of technological civilizations is short (say 1,000 years), our chance of detecting ET at a particular moment in the 13-billion-year history of the universe could be very small.
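To make the intuition concrete, here's a toy back-of-the-envelope calculation (the numbers are the illustrative ones from the conversation, not measured values): if a civilization is "on the air" for L years, the fraction of cosmic history during which it could be detected is roughly L divided by the age of the universe.

```python
# Toy overlap estimate: a civilization broadcasting for L years is
# detectable during only a tiny sliver of the universe's history.
L = 1_000            # assumed average lifespan of a technological civilization, in years
T_universe = 13e9    # rough age of the universe, in years

overlap_fraction = L / T_universe
print(f"{overlap_fraction:.1e}")  # ~7.7e-08
```

Of course a real estimate (a la the Drake equation) would also fold in how many civilizations arise and when; this sketch only shows why a short lifespan alone makes overlap unlikely.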
This led me to three questions:
Humans have created great things:
These creations are unique in the universe. There may be other civilizations that have arts and philosophy, but ours will be different.
Perhaps I'm anthropocentric, but I think that human creations show a level of genius that may be much rarer than intelligence per se. I'm amazed, for example, that the works of Bach emerged from a mortal brain. It's possible that some creations of humans are the greatest in the entire breadth and lifetime of the universe.
Humans have also accomplished incredible things in math and science. We discovered relativity and quantum mechanics. We proved Fermat's last theorem and the independence of the continuum hypothesis. These are staggering feats of intellect. However, I view them separately from the humanities, because other civilizations are likely to have made the same (and probably greater) discoveries.
If humans become extinct today, our creations disappear in the sense that there will be no one to experience and appreciate them. In a cataclysmic extinction (e.g. nuclear war) they may be physically destroyed as well. Even if the physical artifacts survive, and other intelligent beings evolve on Earth, there is no guarantee that these beings will have the types of intelligence needed to understand our creations. So it's likely that if humans become extinct in the near future, our creations die with us.
But our creations could survive us: if at some point we discover civilizations outside Earth and are able to communicate with them, we'll be able to send them our creations via radio or optical transmissions.
Would aliens understand and appreciate our creations? I think it's likely. Chemistry offers only a few options for the level of complexity needed for life. Aliens probably see light and hear air vibrations. I don't think you can have technology without some type of permanently-recorded language. And they'll almost certainly have individuals, and social groups, and cooperation and competition at various levels. They'll run into the same philosophical questions: free will, the meaning of life, and so on.
If there is a network of communicating civilizations - and I think this is possible - then our creations could be shared among them. Our creative legacy would spread across the universe, and would survive until the heat death of the universe brings an end to all civilizations. I view this as the best possible outcome.
It's unlikely that we'll discover other civilizations with our current technology. The longer our civilization lasts and develops, the greater our chance of discovering other civilizations. If human civilization lasts 10,000 years, there's a good chance we'd find someone to share our creations with.
In summary: I want human civilization to survive so that our existing artistic and intellectual creations have maximal impact, and so that we're able to keep making new creations.
I consider the survival of civilization to be the most important challenge facing humans, by a wide margin. I'm shocked and discouraged by the fact that it's seldom discussed or studied, especially by the people (leaders and policy-makers) who could actually do something about it.
But not everyone shares my views. I was talking with my friend Ray about this. He told me he's concerned about what happens in his own lifetime: avoiding poverty, and minimizing the suffering of others. He's not greatly concerned with what happens after he dies. He thinks that 99% of humans share this view: they care about their own survival, and perhaps that of their offspring, but not that of their species.
Initially I was surprised to hear this, and somewhat disappointed: this value system is basically the same as that of a unicellular organism. But on reflection I'm less surprised; if my own future were uncertain I'd probably think the same way.
Another point of view is that humans have evolved in a way that makes us selfish and aggressive, and this prevents us from being responsible citizens of Earth, and our self-extinction is inevitable and should be welcomed. I have sympathy with this viewpoint. However I don't embrace it, for the above reasons.
Human civilization could end in various ways.
Here are some possible civilization-killers; there may be others.
The U.S. and Russia have about 5,000 hydrogen bombs each - enough, I believe, to vaporize almost all the humans on Earth. If they let the missiles fly, and China and Europe get involved too, the result could be a cataclysm that destroys pretty much everything.
What is the likelihood of a global nuclear war? It depends on the governments of the nuclear powers. The U.S. was run for four years by a malevolent crackpot (Donald Trump), who wanted to know why we can't use nukes, and who got into a missile-waving confrontation with another crackpot from North Korea. Fortunately, Russia and China are currently run by regimes that are stable and (although oppressive) are not crackpots. So global nuclear war is unlikely right now. However, this could change quickly (and environmental changes could accelerate this; see below).
Another seven countries have nuclear weapons, including Pakistan, North Korea, India, and Israel. Iran may join the club at some point. Several of these countries are run by theocracies, and religious zealots tend to be crackpots. So smaller nuclear wars - one or two explosions - are possible, perhaps likely. However, these wouldn't threaten civilization.
Currently, building missiles and high-yield bombs requires nation-scale resources. However, it may be possible for small groups to obtain fissile material and build small bombs. These might destroy a city or two, but they wouldn't threaten civilization.
Technology now exists for synthesizing cells with designed genes. It's probably possible to design a virus that is highly contagious and 100% lethal to humans. Such a virus could be created cheaply.
At some point someone could do this and release it, and 99% of humans would be dead within a year or so. A few people might survive in bunkers or remote places. They'd revert to a pre-civilization state after a while.
It's also possible that something could evolve naturally - e.g. a virus - that is able to kill all humans. This is hard to prevent; but our experience with COVID shows that improved forms of government (see below) would increase our chances of surviving such an event.
Many environmental disasters are in progress: plastic death of the oceans, poisoning of groundwater and soil, radioactive waste, and so on. Of course the biggie: climate change due to carbon emissions.
In the worst case (which is playing out as we speak), average temperature will rise 5-6C. Sea level will rise by 50 feet. Huge areas of low land will be submerged. 1/6 of species will become extinct. Fires will burn most forests. There will be severe drought everywhere.
I don't know if climate change will make the Earth uninhabitable to humans; experts say it might reduce the human population to 0.5 billion. But it's certain to exacerbate socioeconomic factors (see below), which will increase the chances of nuclear or bio disaster. There will be less of everything, therefore more conflict, therefore a greater chance of crackpots pulling triggers.
Nuclear and bio weapons are like guns we've created and pointed at ourselves. Can we somehow get rid of them? No. Blueprints for making them are public. We've opened Pandora's box. So we need to focus on potential trigger-pullers. In what situations, and for what reasons, would a person - or a nation - intentionally do something that could endanger civilization? How can we prevent these situations?
Human history is a never-ending battle between those with power and those without it. Insatiable greed - seemingly a human trait - causes the rich to push the poor to the point of starvation. Self-preservation is the most powerful human instinct; when people's existence or livelihood is threatened or uncertain, they are driven to pull triggers. To stop trigger-pulling, we need to understand the sources of conflict and remove them.
Capitalism (by which I mean unregulated free-market capitalism, a la Ayn Rand) is a threat to the survival of human civilization. It leads to an ever-widening wealth gap, creating deprivation, unfairness, and desperation. It leads to fascism, where the rich deprive the poor of their freedom as well as their hope. This also happens at a national level, e.g. U.S. economic colonialism.
Nationalism is a threat to the survival of human civilization. Powerful countries economically exploit or colonize weak countries. Colonial powers create borders (e.g. in the Middle East) that fracture existing countries and ethnic groups, causing perpetual war.
Organized religion is, for various reasons, a threat to the survival of human civilization:
The idea of electing representatives seems good on the surface, and it's an improvement over aristocracy, autocracy, etc. But it has a fatal flaw: voters don't respond to complex or long-term policy plans. They respond to selfish short-term promises, to divisive "identity politics" such as nationalism and racism, and to charismatic demagogues. So we end up with leaders who are potential trigger-pullers, and policies that exacerbate all the problems listed above.
In the U.S. we also have the problem that billionaire trigger-pullers like the Koch brothers can buy the media and can buy elections. It's possible that we can improve the campaign/election process a bit, e.g. by imposing campaign spending limits. But that won't solve the basic problem.
The world's major social structures, and the genetically-encoded nature of humans, seem to be squarely aligned against the survival of civilization. It seems hopeless, and it probably is. But we may as well try. In my case, this means laying out a plan, a roadmap for survival. I'll leave it to others to decide whether this plan is feasible, and if so, perhaps to take the first steps to realize it.
To start, we need a new way to make decisions. Current forms of government - including the American system - have failed to address global issues.
My plan involves something I've already written about: scientific government. The idea is that government policies are selected using experiments that show their effect on a "figure of merit" (a quantitative measure of public well-being). The organization of the government is modeled after that of the existing scientific community: a meritocracy based on peer review. People vote on how the figure of merit is defined, but they don't elect representatives.
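To illustrate the selection mechanism (and only that - this is a toy sketch, with made-up policy names and fabricated trial scores, not a real implementation of scientific government): each candidate policy is trialled, each trial yields a figure-of-merit score, and the policy with the best average score is adopted.

```python
# Hypothetical experiment results: each candidate policy was trialled
# in several regions, and each trial produced a figure-of-merit score
# (higher = greater public well-being). All names and numbers are
# illustrative, not real data.
trials = {
    "policy_A": [62.1, 64.3, 61.8, 63.0],
    "policy_B": [66.5, 65.2, 67.1, 66.0],
}

def mean(scores):
    return sum(scores) / len(scores)

# Adopt the policy whose trials produced the best average figure of merit.
adopted = max(trials, key=lambda p: mean(trials[p]))
print(adopted)  # policy_B
```

A real system would of course need controlled experimental design, uncertainty estimates, and a democratically chosen definition of the figure of merit; the point here is only the shape of the decision rule: measure, compare, adopt.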
The broad strokes are as follows:
When these things are done, human civilization will have at least a chance to survive, and to flourish - to continue our explosion of artistic creation, to continue advancing our understanding of science, and to keep searching for life and civilizations outside Earth.