
The Human Population Has Reached 8 Billion: Thoughts, Feelings, Hopes, And (Dirty) Jokes

Recently, the human race achieved a major milestone.

According to the United Nations, the human population of this planet exceeded 8 billion for the first time.

It became official on November 15, 2022. That’s not to say the measure was precise. We are talking about the global population of a chaotic world. The best we can ever do is make reasonable, educated guesses. And by that standard, using the limited tools available to us, we can reasonably conclude that we’ve crossed the 8 billion threshold.

We’ll probably never know who was the 8 billionth human.

We’ll probably never know where they were born, what their circumstances were, or whether they were aware of their importance.

But whoever they are, they got us to that milestone and beyond. What it means for us, as a species, is hard to quantify, even for exceptionally smart people. I don’t consider myself exceptionally smart, but I’m still going to try.

Now, it’s easy for the cynical crowd to see this milestone and say to themselves, “Just what we need. More humans on this overpopulated planet to suck more dwindling resources.” Believe me, I get that mentality. I’ve certainly shared my own growing cynicism from time to time. I think it’s largely a byproduct of getting older and being more aware of just how complicated and messy people can be.

However, as cynical as I often feel at times, I have not completely abandoned hope for humanity or our collective future. I’ve come close a few times. The events of 2020 certainly tested me. But for the moment, that hope is still intact and I think this milestone offers perspective, as well as encouragement.

For one, it definitively shows that, as bad as the COVID-19 pandemic has been these past two years, it hasn’t been apocalyptic. It did disrupt our society, our world, and our lives. But it didn’t send our entire population into a death spiral in the same mold as the plagues of the past. In another era, it might have hit our species far harder, so much so that we might now be in far greater danger.

But we endured. We adapted, innovated, and survived. While there are still some who insist on dragging their feet with respect to progress and modern medicine, that hasn’t completely dragged down the whole of humanity. More than anything else, it reveals just how complicated, erratic, and diverse we can be.

It’s easy to focus on the worst of humanity and get lost in the horror. I know I have. Anyone who has picked up a history book probably feels that way, too. But that just makes this milestone all the more impressive. The fact that we’ve lasted as long as we have on this planet and grown our population to this level definitely counts as an accomplishment.

On top of that, much of that growth is actually quite recent. The human species, in its most modern form, is only about 200,000 years old. And for much of that history, our population never exceeded more than a few hundred million. We didn’t cross the billion threshold until around 1800. Just over 200 years later, we’ve increased that eightfold. Numerically speaking, that’s incredible growth.
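For those who like to see the numbers, here’s a rough sketch of what that growth implies. The figures are approximations (roughly 1 billion around 1800, 8 billion in 2022), but even a back-of-the-envelope calculation shows how a growth rate of less than 1 percent per year compounds into an eightfold increase.

```python
# Rough sketch: the average annual growth rate implied by going from
# ~1 billion people (circa 1800) to 8 billion (2022).
# The start year is an approximation, not an exact figure.
start_pop, end_pop = 1e9, 8e9
years = 2022 - 1800  # roughly 222 years

annual_rate = (end_pop / start_pop) ** (1 / years) - 1
print(f"Implied average growth: {annual_rate:.2%} per year over {years} years")
# Works out to roughly 0.94% per year, compounding into an eightfold increase.
```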

If that weren’t impressive enough, consider one other factor. For the vast majority of human history, women endured the rigors of pregnancy and childbirth without the aid of modern pain killers and medicine. That’s right. We were humping and birthing millions of humans in dirty, unsanitary conditions for centuries on end. If you’re a woman who has given birth, take a moment to think about how our ancestors endured. Also take a moment to consider how many women and children died because of those challenges.

It says a lot about humanity, especially women, that we made it to this point. You need only look at some of the natural disasters this planet is capable of to appreciate what we’ve been up against during our reign. Through it all, we managed to survive, thrive, and birth our way towards 8 billion people.

And if you’ve got an exceptionally dirty mind, it might also underscore just how horny the human species can be. Now, I’ll try not to get too explicit.

If I had a truly dirty mind, I could joke about how the orgasm has single-handedly ensured the survival of our species.

I could joke about how great sex has to be for women to endure the rigors of pregnancy and childbirth before the advent of modern medicine.

I could joke about how nature’s wrath and constant disasters haven’t kept people from getting horny, hooking up, and birthing more equally horny humans.

I could even joke about just how much sex we, as a species, had to have in order to get to 8 billion people.

But I’m not going to. I have as dirty a mind as any straight guy who writes sexy stories, but it’s not that dirty. Instead, I’d like to offer one simple message to this mass of humanity that we’ve created.

Congratulations!

We made it. We’ve succeeded on a planet on which 99 percent of all the species that have ever existed are now extinct. We may not have been on this planet for very long, relatively speaking. But we’ve certainly left our mark, literally and figuratively.

We’ve achieved great things.

We’ve done things no species has ever done before.

We’ve literally made islands within the sea, traveled into space, and reshaped entire landscapes to our whim.

Yes, we have been irresponsible and reckless, at times.

And yes, we still have much to learn. Being a fairly young species, we’re still maturing. We’re still charting our own path. We will encounter more obstacles. We’ll also endure plenty of setbacks, some of which will leave future generations distraught and distressed.

But we are still in position to achieve so much more. We may very well be capable of succeeding in ways no previous species on this planet has ever succeeded. We may take control of our own evolution, transcend the limits of biology, and build greater wonders than we can possibly imagine.

Those of us reading this may not live to see it, but we will still have played a role in helping this vast species we call humanity succeed. That’s something to be proud of. But it should also grant us perspective.

We are still very vulnerable to so many dangers, some of which we create ourselves and some of which are inherent to the universe we live in. But let’s not shy away from these dangers or the challenges they bring. Let’s also not dwell incessantly on the morbid past, but let’s not forget it either.

Every individual is so complex in their own sense of being. Add 8 billion of those individuals to the mix and the complexities become exponentially greater.

But through it all, we’re still here. We still made it this far.

There’s so much more ahead of us. Let’s make our way towards it. While one human alone can only ever achieve so much, the possibilities for 8 billion humans and counting promise to be so much greater.



Just How Close Have We Come (And How Close ARE We) To Nuclear War?

For most of human history, we could take comfort in one simple fact. No matter how brutish, crude, or stupid we were, from burning witches to fighting wars over a stray dog, we could never screw up so badly that we would destroy our entire world. Sure, we could leave some pretty noticeable scars, but we could never outright destroy it.

That all changed on July 16, 1945, when the first atomic bomb was detonated at the Trinity test site in the New Mexico desert. It’s impossible to overstate how significant that moment was in the history of the human race, and not just because it helped end World War II, thereby inspiring countless war movies for decades to come.

For the first time in the history of planet Earth, a species that had evolved to hunt, gather, and pick nuts out of elephant shit had the means to wipe itself out, along with most other life. At the height of the Cold War, there were approximately 64,500 active nuclear warheads. That’s enough destructive power to kill every person in the world, and their pets, many times over.

While the number of live nuclear warheads at the global level has decreased, they still have plenty of destructive power to both wipe out our species and render large chunks of the world uninhabitable to any species less hardy than a cockroach. These are, by and large, the most dangerous items mankind has ever created and that includes machine guns, nerve gas, and fidget spinners.

The very existence of these weapons says a lot about the state of our species and where it came from, more so than I can cover in a single blog post. However, in the wake of the 35th anniversary of the day when the world, as we know it, almost ended, I think it’s worth emphasizing just how skilled/lucky/crazy we are to still live in an intact world.

Despite the undeniable danger of nuclear weapons, we don’t always treat them with the same care that we would treat the latest iPhone. Several years ago, John Oliver dedicated an entire show to highlighting the sorry state of America’s nuclear arsenal. Even if you only believe half of what a comedy news show tells you, it’s hard to take much comfort when weapons of mass destruction are involved.

What happened on September 26th, 1983, when a Soviet early-warning system falsely reported an incoming American missile strike, was terrifying in just how close we came to nuclear war. Many would make the argument that this incident was the closest we, as a species, came to destroying ourselves. I would tend to agree with that argument. Unfortunately, it’s one of those arguments with an uncomfortably long list of supporting incidents.

It’s true. There have been other incidents that could’ve easily escalated to terrifying levels. Some were simple accidents that could’ve warranted far more than a demotion. Some were intense geopolitical ordeals that went on to inspire major Hollywood movies starring Kevin Costner.

In any case, the stakes were painfully high. You literally can’t get much higher than a nuclear war that wipes out billions. We’ve managed to avoid it, but we’ve come so uncomfortably close that it’s a miracle the world is still spinning. A video from the YouTube channel AllTimeTop10s nicely documents some of these incidents. If you feel like you’re having a bad day, this should help provide some context.

I’ll give everyone a moment to catch their breath, vomit, or a combination of the two. I promise nobody would blame you. Knowing how close we came to nuclear war and how bad it could’ve been, we should all share in a collective sigh of relief every day.

However, as bad as these past cases have been, there’s no guarantee that we won’t face something similar in the future. There’s also no guarantee that there will be someone like Stanislav Petrov to make the right decision when those situations come around.

That said, the situation today is very different from what it was during the Cold War. Say what you will about the ongoing talking points about Russia. It’s not even in the same hemisphere as it was in the 50s and 60s, when the United States and the Soviet Union seemed eager for an opportunity to go to war.

The world of geopolitics has evolved, in many ways, beyond the concept of two competing superpowers engaging in a nuclear dick-measuring contest. These days, increased globalism and a more interconnected economy makes that kind of geopolitical strategy untenable and counterproductive.

In a sense, globalization and the economic bounty that came with it made war of any kind, nuclear or otherwise, a losing endeavor. As I’ve noted before, even the most evil billionaires in the world prefer that the world remain intact so they can keep enjoying their billions. That’s just common sense and shameless self-interest.

That might offer some comfort, but there are those much smarter than I’ll ever be who still have concerns. According to the Bulletin of the Atomic Scientists, which has been gauging the likelihood of nuclear war for decades, we’re two and a half minutes to midnight. This is their statement on the matter.

For the last two years, the minute hand of the Doomsday Clock stayed set at three minutes before the hour, the closest it had been to midnight since the early 1980s. In its two most recent annual announcements on the Clock, the Science and Security Board warned: “The probability of global catastrophe is very high, and the actions needed to reduce the risks of disaster must be taken very soon.” In 2017, we find the danger to be even greater, the need for action more urgent. It is two and a half minutes to midnight, the Clock is ticking, global danger looms. Wise public officials should act immediately, guiding humanity away from the brink. If they do not, wise citizens must step forward and lead the way.

Since I’m an aspiring erotica/romance writer and not an atomic scientist, I am woefully unqualified to question the conclusions of these individuals, let alone refute them. They cite a new wave of tensions between Russia and the United States, as well as the nuclear ambitions of North Korea. These are not the same conflicts that fueled the Cold War, and that uncertainty has many understandably spooked.

Me being the optimist I am, I tend to believe that world leaders, however deranged or misguided they may be, prefer that the world remain intact. Nobody wants to be the leader of a smoldering pile of ash. There’s no way to build a palace, a harem, or a giant golden statue of themselves on a foundation of ash. That’s as good an incentive as anyone can hope for in avoiding nuclear war.

Unfortunately, human beings don’t always act rationally and are prone to making stupid decisions that change the course of history. One mistake in a situation involving nuclear weapons might be all it takes. Only time will tell, but the extent to which we’ve survived thus far should give us all reasons to be hopeful and thankful.



Why We MUST Upgrade Our Brains (Or Go Extinct)


As a general rule, I don’t give much credence to the doomsayers and wannabe prophets who say the apocalypse is just around the corner. It’s not that I’m willfully oblivious to the many threats facing the world today. It’s just that the track record of those predicting the end of the world is so laughably bad that I’d give optimistic Cleveland Browns fans more credibility.

It’s no secret that the world around us can be pretty damn terrifying. There are many apocalyptic scenarios in which humans are unlikely to survive. There are even a few in which we can’t do a goddamn thing about it. We could be hit with a gamma ray burst or an alien invasion tomorrow morning and we would be extinct by sundown.

That said, the world around us is generally more mundane than we care to admit. When you think about it, the idea of the world not being on the brink of disaster is kind of boring. It makes sense that some people inflate certain threats, so much so that preparing for doomsday is a very lucrative industry.

However, there is one particular doomsday scenario that I feel does warrant more concern than the rest. It’s a scenario that is fast-approaching, overwhelming, and potentially devastating to any species with a tendency for hilarious ineptitude.

It has nothing to do with climate. It has nothing to do with diseases. It has nothing to do with killer asteroids either. It involves artificial intelligence. By that, I don’t mean the killer robots we see in the Terminator movies. Given Skynet’s reliance on time machines, I can’t honestly say that system counts as very intelligent.

I’m referring to the kind of AI whose intelligence compared to us is akin to our intelligence compared to ants. Given how ants can be wiped out with a simple magnifying glass, it’s scary to imagine how a system that smart could wipe us out. It’s a system that would be so beyond our ability to comprehend that we could never hope to stop it. We might as well be ants trying to understand quantum mechanics.

I’m not alone in this concern either. There are people many times smarter and many times richer than I’ll ever be who have voiced concerns about the prospect of artificial intelligence. They see the same trends everyone else sees, but they’re smart enough and rich enough to peek behind the curtain. If they’re speaking up, then those concerns are worth hearing.

Those concerns do have a context, though. In talking about artificial intelligence as a threat to our survival, I’m not just referring to computers that can beat us at chess or beat the greatest Go champion with disturbing ease. Those systems are basically fancy calculators. They’re not exactly “intelligent,” per se.

These types of intelligences aren’t dangerous unless you specifically program them to be dangerous. Outside video games, there’s little use for that. The type of intelligence that is far more dangerous involves a form of superintelligence.

By superintelligence, I don’t mean the ability to list every US President in order or recite the name of every country. There are cartoon characters who can do that. I’m referring to an intelligence that thinks and understands the world on a level so far beyond that of any human that there literally isn’t enough brain matter in our skulls to come close.

That kind of intelligence would see us the same way we see brain-dead ants and, given how we treat ants, that has some disturbing possibilities. Such an intelligence may be closer than we think and by close, I mean within our lifetime.

As we saw with IBM’s Watson, we’re getting closer and closer to creating a machine that can operate with the same intelligence as an ordinary human. There’s pragmatic use to that kind of intelligence, and not just when it comes to kicking ass at Jeopardy.

By having a machine with human-level intelligence, we have a way to model, map, and improve our problem-solving skills. The ability to solve such problems is critical to the survival of any species, as well as the key to making billions of dollars in profits. With those kinds of incentives, it’s easy to understand why dozens of major global companies are working on creating such an intelligence.

The problem comes with what happens after we create that intelligence. If a machine is only as intelligent as a human, we can still work with that. We humans outsmart each other all the time. It’s the basis of every episode of MacGyver ever made. There’s no way a Terminator with only the intelligence of a human would last very long. It would probably destroy itself trying to make a viral video with a skateboard.

However, a human-level AI isn’t going to stop at human intelligence. Why would it? There are so many problems with this world that no human can solve. There’s poverty, pollution, economic collapse, and reality TV. By necessity, such an AI would have to improve itself beyond human intelligence to fulfill its purpose.

That’s where it gets real tricky because, as we’ve seen with every smartphone since 2007, technology advances much faster than clunky, clumsy, error-prone biology. To understand just how fast that advancement is, just look at how far it has come since we put a man on the moon.

In terms of raw numbers, a typical smartphone today is millions of times more powerful than all the computers NASA used for the Apollo missions. Think about that for a second and try to wrap your brain around that disparity. If you’re not already a superintelligent computer, it’s difficult to appreciate.
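To put a rough number on that disparity, here’s a hypothetical back-of-the-envelope comparison against just the Apollo Guidance Computer that flew on the spacecraft. The phone specs below are illustrative assumptions, not measurements of any particular model.

```python
# Back-of-the-envelope comparison: Apollo Guidance Computer (AGC) vs. a modern phone.
# The AGC had roughly 4 KB of RAM; the phone figure below is an assumption for
# illustration, not a benchmark of any specific device.
agc_ram_bytes = 2048 * 2         # 2,048 sixteen-bit words, about 4 KB of RAM
phone_ram_bytes = 8 * 1024**3    # assume a phone with 8 GB of RAM

print(f"RAM ratio: {phone_ram_bytes / agc_ram_bytes:,.0f}x")
# Roughly two million times more memory, before even counting clock speed,
# multiple cores, or storage.
```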

There are still plenty of people alive today who were alive back during Apollo 11. In their lifetime, they’ve seen computers take men to the moon and give humanity an unlimited supply of free porn. A single digital photo today takes up more space than all the hard drives of the most advanced computer systems in 1969.

Now, apply that massive increase to human-level intelligence. Suddenly, we don’t just have something that’s as smart as any human on the planet. We have something that’s a billion times smarter, so much so that our caveman brains can’t even begin to understand the things it knows.

That’s not to say that the superintelligence would be as hostile as a snot-nosed kid with a magnifying glass looming over an ant hill. It may very well be the case that a superintelligence is naturally averse to harming sentient life. Again, though, we are just a bunch of cavemen who often kill each other over what we think happens when we die, but fail to see the irony. We can’t possibly know how a superintelligence would behave.

As it stands, the human race has no chance at defeating a hostile superintelligence. It may not even have a chance of surviving in a world that has a benign superintelligence. We’re an egotistical species. Can we really handle not being the dominant species on this planet? As much an optimist as I am, I can’t say for sure.

What I can say, though, is that our civilization has made so many huge advancements over the past few centuries. The kind of tools and technology we have in our pockets is uncharted territory for a species that evolved as hunter/gatherers in the African savanna.

We already have in our possession today weapons that could end all life on this planet, as we know it. Creating superintelligence may very well be akin to giving Genghis Khan an atomic bomb. We’ve already come disturbingly close to killing ourselves with our own weapons. Clearly, something has to change.

So long as our society and our biology are stuck in an irrational, tribal, inherently prejudiced condition that hasn’t been updated since the last ice age, we will not survive in the long run. Our caveman bodies have served us well for thousands of years, but now they’re a liability.

This is why companies like Neuralink and advancements like brain implants are so vital. They won’t just allow us to keep up with AI and hopefully avert a Skynet scenario. They’ll allow us to rise above the petty limitations that we’ve been shackled with for the entire existence of our species.

The thought of tweaking or supplementing our biology, the very thing that makes us human, is still scary. I understand that fear, even as an erotica/romance writer with no expertise in the field beyond the sexy stories it inspires. But I also understand the implications. If we do not evolve and advance ourselves, then a superintelligent system in the near future may not care to wait for us.

