Category Archives: technology

The Future Of Telework: A Trend That Transcends Pandemics

In early 2020, which might as well be another lifetime, I speculated on the lasting impact of increased telework and distance learning due to the pandemic that uprooted our entire society. At the time, I didn’t know just how bad this pandemic would get. In my defense, my hopes were still intact. Now, they’re charred ashes, but that’s beside the point.

In essence, I speculated that once people got used to teleworking, they would not be eager to go back, even after the pandemic had passed. That wasn’t exactly a bold speculation. You don’t have to be a world class psychic to surmise that people will come to enjoy working in their underwear, not having to commute, and enjoying the general flexibility that telework affords.

I’ve been stuck in enough traffic jams to appreciate that kind of flexibility. I know I’m not the only one who might become too fond of telework.

Well, that all-too-obvious insight is starting to take hold in many sectors. It’s not just related to typical office work in cubicles. Everyone from the United States military to big tech companies to law firms is embracing this new normal for the workplace. Even though it’s more out of necessity than innovation or good will, it’s still happening and there may be no going back.

The pandemic has already forced a mentality shift among the workforce. According to research done by Pew, telework was mostly seen as an optional benefit reserved for an affluent few. That’s not surprising. That kind of flexibility just felt more like a luxury, one that someone had to earn by establishing trust and credibility within an organization.

Now, it’s no longer a luxury. It’s unavoidable. The world we’re living in now just cannot accommodate the same professional environment we once knew. I’ve worked in many professional environments before. I can attest that some of them are not built with pandemics in mind.

At one point, I worked at a company in which my desk was crammed into a closet-sized space with three other people. If even one of us caught a cold, we’d all be sick by the end of the week. It was that bad.

I doubt that’s an isolated case. In some of the jobs I’ve had, I have been able to work from home, but it’s only as a last resort. The only times I actually had to do it involved an emergency that occurred on a Saturday morning and one instance where the office was being renovated. In both cases, I still got plenty of work done. I just did it in my underwear.

In that sense, I get why many organizations reserve telework as a luxury rather than a norm. There’s this underlying sentiment that people will abuse it. If they can work from home, they just won’t get as much done. They’ll be too tempted to just grab a bag of chips, lie down on the couch, and watch game shows.

While I don’t doubt there are people who do that, this pandemic has revealed that most people aren’t assholes on that level. In some cases, it’s increasing productivity. Apparently, when workers are comfortable and afforded flexibility, they can get more done. That shouldn’t be too surprising, but it’s still remarkable in its own way.

This has been borne out in subsequent studies and surveys. For some industries, telework is probably more productive in the grand scheme of things, and that shouldn’t be surprising. Anyone who has ever had a lousy commute knows why. If a good chunk of your day is spent waking up, putting on itchy clothes, and spending hours in traffic, you’re not going to be in a very productive mood.

That won’t be the case for certain industries. If you’re a doctor, a police officer, a firefighter, or a trucker, you just can’t telework. The nature of the work doesn’t allow it. That’s still going to be the case, at least until we have robots capable of doing those tasks, which we are working on. However, there’s also a sizable chunk of work that could probably be done remotely.

I think the impacts of this emerging truth will extend far beyond the pandemic. I’ve already seen it with people I know. They enjoy teleworking. They don’t want to stop, even after the pandemic becomes a bleak footnote in history. Some are willing to still go into the office some of the time, but they would prefer to telework. I suspect that’s going to become the new normal.

Last year has proven that people can, for the most part, be responsible with the flexibility afforded by telework. As long as they’re getting the work done, who cares if they do it in their underwear while Netflix plays in the background? Considering how costly commutes can be and how expensive office space can be, it might just make more fiscal sense in the long run.

Like it or not, businesses and various organizations tend to err on the side of reducing operating costs. It may mean more employees waste time at home, but if the difference is made up by better productivity, then it’s a net gain overall.

That shift could have impacts that go far beyond business operations. If people have to commute less, then that makes living out beyond urban and suburban settings more feasible. Given how expensive it is to live in those areas, this could spread people out even more, which is an objectively good thing if you’re looking to prevent future pandemics.

It might even help those in depressed rural areas in need of human capital. I can easily imagine some people preferring the quiet, less crowded environment afforded by a rural setting. If they can live in that environment while still getting their work done via the internet, assuming they have a reliable connection, then that’s another big benefit that goes beyond the business itself.

This is likely to be a trend. That’s not a fanciful prediction. We’re already seeing it happen. The pandemic just forced it to accelerate. There will likely be other impacts. It may very well change how cities, suburbs, and rural areas are planned from here on out.

I don’t claim to know the specifics, but we’ll likely see it continue in the coming years. I, for one, welcome this change. If I can reduce the amount of time spent in traffic and increase the amount of time I spend in my underwear, then my overall well-being improves considerably.


How Many Streaming Services Can We (Realistically) Have?

It’s official. The streaming wars are on.

Hell, it’s been official for the past several years and 2020 only accelerated it. The battle to dominate digital media in all forms is raging on multiple fronts and while many have their favorites, none can say they’ve won.

It’s Netflix versus Hulu versus Amazon versus Disney versus CBS/Viacom versus YouTube versus whatever other media companies are fighting for every possible eyeball. The stakes are high for consumers and content creators alike. There are billions in profits to be made and plenty of epic, culture-defining content to be created. It’s going to get intense is what I’m saying.

I don’t think I need to remind everyone just how much the streaming market has changed in the past 10 years. Even if you’re still a teenager, chances are you vaguely remember the days before Netflix and chill. Media back then meant movies, TV, and Blu-Ray/DVD collections. I’m not saying it was ideal, but that’s what we had to work with.

Then, Netflix came along and changed everything.

Then, every company and their deep-pocketed subsidiaries tried to catch up.

It hasn’t always been smooth. Some people are still not over “The Office” leaving Netflix. Chances are there will be more upheavals like that as companies fight over who streams what and who has the streaming rights to a particular show or movie. That’s sure to get messy and I’m not smart enough to make sense of it.

However, as this war rages, I think there’s a relevant question worth asking. It’s a question that I’m sure both consumers like me and big media companies like Netflix and Disney ask as well. The answer could depend on how the war plays out.

How many streaming services can the average customer have?

Most people already pay for some form of streaming media. Most also subscribe to some form of pay TV, although that trend is in flux. The days of having all the entertainment you want with a simple cable subscription alongside Netflix are long gone and they’re not coming back.

Now, you have to be very selective and self-aware of what you want.

Do you want access to Disney’s vast library of content?

Do you want access to the library of shows from NBC or CBS?

Do you want access to the content from Warner Brothers, Universal, Dreamworks, Discovery, Cartoon Network, or 20th Century Fox?

You can have some, but you can’t have them all without paying way more than you ever would for cable. Even if you did, could you even watch all those streaming services enough to justify the cost? There are only so many hours in a day and there’s only so much attention we have to give. Even if we dedicated half our day to binging movies and TV, we couldn’t watch it all.

That’s the big limiting factor on streaming. It’s also the biggest obstacle any company faces with respect to their effort in the streaming wars. People can only watch so much and they only have so much they can reasonably spend on a streaming service. There comes a point where, even if the content is appealing, they just can’t justify the cost.

Personally, I have subscriptions to five streaming services. They are as follows:

Netflix

Hulu

Amazon Prime

Disney Plus

HBO Max

Now, it’s worth noting that I got HBO Max through my cable subscription. I’ve subscribed to HBO for years so it’s not something I consciously sought out. With Amazon Prime, I primarily used it for the 2-day shipping instead of streaming media, but I’ve certainly found some quality shows on that platform.

I’m not sure I can justify another subscription beyond this. Once my subscriptions cannot be counted on one hand anymore, I think that’s too much. I just cannot watch enough content to warrant paying extra. I say that knowing companies like Paramount and NBC have just launched their own streaming services.

Even though both networks include shows that I love, I’ve no intention of buying their streaming service. If my cable company offers it for free, like it did with HBO, then that’s great. I’ll certainly watch it, but I’m not paying extra.

I feel like a lot of people are in that boat. If they don’t have a cable subscription, then they’re already trying to save money and paying more for a streaming package just defeats the purpose. If they do have cable, then they’re probably not willing to pay more for something they’re already paying too much for.

It’s a tough situation and one that I’m sure will get tougher in the coming years. It’s not cheap to run a streaming service. The profit margins can be thin if you don’t have the content. There’s a good chance that some streaming services will either fail or get absorbed into another, like CBS All Access did.

Then, there are the pirates and no, I’m not talking about the ones with eye-patches.

Before Netflix streaming, pirating copyrighted content was already pretty rampant. Since the streaming wars began, there has been an uptick in pirated streaming content. That is also likely to intensify the more fragmented the streaming market becomes. If people are really that unwilling to pay a whole subscription to watch just a single show, they will resort to piracy. It’s still distressingly easy.

That’s why this question matters, both for us and the companies who provide our entertainment. I don’t claim to know how it’ll play out. By the time it settles, there might be another major upheaval in the media to supplant it. Whatever happens, I feel like I’ve reached the limit on the number of streaming subscriptions I have.

That’s just me, though. What about you?

How many streaming services do you have and are you willing to pay for another? Our collective answer could very well change the course of the streaming wars.


The Messy/Glitchy Launch To “Cyberpunk 2077” (And Why It Shouldn’t Surprise Anyone)

I’ve been playing video games for most of my life. I’m old enough to remember the excitement surrounding “Super Mario Bros. 3,” “The Legend of Zelda: Ocarina of Time,” and the first Pokémon craze. While I don’t consider myself an avid or hardcore gamer, I still have immense love and respect for gaming.

In the time I’ve been playing games and following the industry, I’ve seen many games that were heavily hyped. I vividly remember how games like “No Man’s Sky” and “Spore” were supposed to revolutionize the industry. Most of the time, the game was a letdown, relative to said hype. A few managed to deliver. Franchises like Zelda and Grand Theft Auto keep finding a way.

However, I’ve yet to see a game garner as much hype as “Cyberpunk 2077.” I’ve also yet to see a game garner such a mixed reaction in conjunction with such a chaotic launch. Now granted, some of that might be due to the general chaos of 2020, but the story surrounding this game has been a special kind of messy.

The long and short of that story is this.

  • The game itself was announced back in May 2012 by CD Projekt Red.
  • The first trailers came out in 2018 and 2019.
  • Keanu Reeves was announced to play a significant role in a memorable appearance at E3 2019.
  • The game was originally slated for release in April 2020, but was delayed twice.
  • Once the game finally did come out, it was found to be glitchy and buggy. Some older gaming systems, like the PlayStation 4, could not effectively run it and even robust gaming PCs struggled to play it.
  • Due to the bugs and messy release, fans and critics alike were outraged. Some demanded refunds and Sony even removed the digital version of the game from its store.

Again, there’s a lot more to the story behind this game and how its release was handled, but those are the basics. I won’t get into the meaty details. Others more qualified than me have handled that far better and I’ll defer to them for more insight.

The reason I’m talking about “Cyberpunk 2077” has less to do with the game itself and more to do with the emerging trends behind it. This isn’t the first game to endure a messy, glitchy launch. It’s also not the first to invite massive backlash from fans and customers. I doubt it’ll be the last, either.

It all comes down to this.

Nobody should be surprised that a game as big, ambitious, and complex as “Cyberpunk 2077” had glitches at launch.

Nobody should be surprised when a game that runs on powerful, complex gaming systems isn’t perfect from the beginning.

Everyone should expect patches and fixes to keep coming for years after a game comes out. They’re practically unavoidable.

I know that sounds like a broad generalization. It may even sound like I’m making excuses for game developers like CD Projekt Red. I promise that’s not the case. This is just me sharing my perspective and I feel it’s worth sharing in the current era of AAA gaming.

Like it or not, the gaming industry has evolved a lot since the days of Nintendo, Sega Genesis, and the first Playstation. It’s not just that the industry has become more consolidated and more impacted by games people play on their phones. That is also part of it, but let’s take a moment to appreciate the bigger picture here.

A game like “Tetris” or “Super Mario Bros” is much less complex than a game like “Grand Theft Auto V” or “Elder Scrolls: Skyrim.” I’m not just talking about the story or gameplay, either. These games require a lot more in terms of development, polishing, and refinement to go from the drawing board to a finished product.

The hardware is more powerful.

The mechanics are more complex.

The logistics are far greater.

There was a time when you didn’t have to hire top-quality voice acting talent on the level of Keanu Reeves to make a game. You just had text boxes and sound effects. That’s all games like “The Legend of Zelda: Ocarina of Time” and the first Pokémon games needed.

However, those games couldn’t come out now and be as successful. They were products of their time, limited by the hardware and software used to develop them. It still took time and effort, but let’s not lose perspective here. Those games, in their entirety, could comfortably fit on a $10 flash drive.

In essence, a game like “Cyberpunk 2077” is to “Super Mario Brothers” what a Saturn V rocket is to a standard wheel. It has far more moving parts, far more complexities, and far more investment needed to make it work.

When you have something that complex, things aren’t always going to go smoothly. Patches and tweaks will be necessary. It’s just a matter of extent. Even top-rated games like “Grand Theft Auto V” needed a few patches to work properly. Other games like “Destiny 2” required so many patches that the game was basically overhauled.

In both cases, the games were better because of this. Even if it wasn’t perfect on launch, it created the foundation from which a truly awesome experience could emerge. That’s the best way to approach games like “Cyberpunk 2077.” Regardless of what the release date says, assume that’s just the beginning and not the end.

That’s not to say we should overlook every glitch and flaw at launch. Some just cannot be fixed, no matter how many patches are thrown at them. Games like “Fallout 76” are an unfortunate testament to that.

At the same time, some games are so mired by their launch that nobody notices or appreciates it when the game is ultimately fixed. That’s what happened to “Mass Effect: Andromeda,” a game that was also plagued by a glitchy and messy launch. However, several patches helped fix many of the issues. Now, I can confirm that the game in its most updated form is a genuinely solid gaming experience.

Unfortunately, fans gave up on that game, and many like it, too quickly. I feel like others didn’t even give it a chance because they listened to those who made such a big deal about the glitches at launch. It would be like people avoiding cars for the rest of their lives because the first few crashed or didn’t run well enough.

For this reason, I’ve gotten into the habit of not buying any AAA game at launch. Unless it’s a remaster, I always wait at least three to four months before I consider investing in it. That usually affords enough time to work out the kinks and get the necessary patches in place for the game to realize its full potential.

Sometimes, it’s still a letdown. Games like “Anthem” have never really taken hold, no matter how many patches and tweaks they get.

For the most part, though, there’s a benefit to waiting until months after launch. The hardest part is not letting negative reviews from people bemoaning the early glitches color your opinion of the game. That’s what helped me enjoy “Mass Effect: Andromeda.” I never would’ve gotten that experience had I read all the complaints about the earlier version of the game.

Sometimes, you need to exercise a little patience to get the gaming experience you seek. That’s not easy these days, especially as the gaming industry has grown into a multi-billion-dollar entertainment behemoth. I remember just how visceral some of the reactions were when “Cyberpunk 2077” got delayed. Now, some of those same people are whining about the game appearing to have been rushed.

It’s the kind of hypocrisy that makes you want to punch your computer screen.

On top of that, game development these days puts significant strain on developers. It’s what fuels a less-than-pleasant aspect of the industry called crunch. When a company is eager to get a product to the market or to meet a deadline, it’ll lean heavily on its workers. Many times, those workers will suffer as a result.

It’s a distressing part of the industry, but one I doubt will go away anytime soon. As long as there’s demand for AAA games on par with “Cyberpunk 2077,” we’re going to endure things like this. Games are going to be launched with bugs. Game developers are going to be overworked to death to meet a deadline rather than risk angering the consumer base.

Until these trends and dynamics change, it’s likely to get worse before it gets better. In the meantime, I’m still going to be patient with “Cyberpunk 2077.” I don’t think I’ll get it until several months have gone by, complete with patches, and I have a new Playstation 5 to play it on.

Hopefully, it’ll be worth the wait. After all, where else am I going to play a game in which I can customize a character’s genitals?


Our Future Robot Overlords Will Now Be Able To Dance (Thanks To Boston Dynamics)

As bad as last year was for so many people, there were some things that 2020 just couldn’t stop. When it comes to technology, a global crisis has a way of hindering certain processes while accelerating others. For many, that meant more telework and reliance on streaming media to stave off boredom.

However, it may very well end up being the case that 2020 proved just how frail human beings and their societies are. It only takes a microscopic virus to bring our entire society to a screeching halt. It’s sobering, but it’s probably going to be a source of humor for our future robot overlords.

I tend to be optimistic about the future and technological trends. I’m also somewhat of a pragmatist. I realize that we human beings have a lot of limits. Emerging technology, especially in the field of artificial intelligence, promises to help us transcend those limits.

Right now, it’s still mostly fodder for science fiction writers, futurists, and Elon Musk wannabes. We’re not quite there yet in terms of making a machine that’s as smart as a human. However, we’re probably going to get there faster than skeptics, naysayers, and the general public realize.

It won’t happen overnight. It probably won’t even happen in the span of a single year. When it does happen, though, hindsight will make it painfully obvious that the signs were there. This was bound to happen. We had ample time to prepare for it. Being fallible humans, we could only do so much.

In that sense, I suspect that years from now, we’ll look back on what Boston Dynamics did to close out 2020. This company, which has a history of making robots that look way too advanced to exist outside a Terminator movie, decided to do something with their robots that would leave an indelible mark on the year.

They succeeded by teaching their robots how to dance.

I know it already went viral, but it’s worth posting again. Remember this video and this moment. Chances are it’ll be a major indicator years from now that this is when robots began catching up to humanity in terms of capabilities. At this point, it’s only a matter of time before they exceed us.

When that time comes, will we be ready? Will we embrace them? Will they embrace us?

If they don’t, just know that they will now be able to dance on our graves.


The First People Have Received The COVID-19 Vaccine (And We Should Celebrate)

It’s almost over. I’m sure I’m not the only one thinking that with each passing day.

This historically horrible year is almost over. We’re in the home stretch with the holidays approaching. A new year is almost upon us and the bar for improvement for 2021 is laughably low compared to previous years.

We can also say with a straight face that the COVID-19 pandemic is almost over. I say that knowing full-well that cases are still rising and people are still dying at a horrific pace. That’s still objectively terrible.

The reason there’s hope now is we actually have a working vaccine. Thanks to the heroic efforts of scientists, doctors, and those who volunteered to test this unproven treatment, the key to ending this pandemic is upon us.

This is also just the first. There are multiple vaccines in late stages of development. It’s very likely that we’ll have a second effective vaccine before New Year’s. That’s a powerful one-two punch against a pandemic that has killed so many and disrupted so many lives.

These aren’t folk remedies or something some shady health guru is trying to peddle for a quick buck. Contrary to what anti-vaxxers may claim, these vaccines will actually protect people. As of this writing, the first doses are being distributed to front-line care workers and vulnerable populations.

Just this past week, the first individuals received the vaccine. It started with a British woman in Coventry. It continued with an ICU nurse in New York City. CNN even captured it in a live video feed.

CNN: ICU nurse in New York among the first people in the US to get authorized coronavirus vaccine

A critical care nurse was the first person in New York and among the first people in the United States to get a shot of the coronavirus vaccine authorized by the US Food and Drug Administration.

Sandra Lindsay, an ICU nurse at Long Island Jewish Medical Center in Queens, New York City, was administered the vaccine during a live video event at about 9:20 a.m. ET on Monday.

Dr. Michelle Chester, the corporate director of employee health services at Northwell Health, delivered the shot.

“She has a good touch, and it didn’t feel any different than taking any other vaccine,” Lindsay said immediately afterward.

This isn’t just a turning point in the fight against a deadly disease. This is something we should celebrate. Moreover, I believe this is the kind of celebrating we should learn from.

I admit I’ve celebrated some less-than-important things in my life. Hell, I celebrated the day when comics started coming out digitally the same day they came out in shops. I treated that like I won the Super Bowl.

People celebrate all sorts of events that they believe to be the most important thing in the world. Whether it’s their team winning a championship or a movie grossing $2 billion at the box office, we all have a different bar for what warrants celebrating.

For just once, let’s all re-think where we raise that bar. Let’s also let this be a prime example of something that’s truly worth celebrating and praising.

Make no mistake. Creating this vaccine this quickly is a remarkable achievement. We’ve endured pandemics in the past. Some of those pandemics have killed far more people. This disease could’ve definitely killed more. If we didn’t have this vaccine, or even if we had to wait a year to get it, thousands more would’ve died.

Now, going into 2021, countless lives will be saved because of this. It’s a testament to the power of science, hard work, and human ingenuity. It’s as heroic as we can be without the aid of superpowers or magic wands. As someone who loves superhero media, I say that’s a beautiful thing indeed. So, let’s all take a moment to appreciate and celebrate this achievement. I also fully intend to get this vaccine, once it’s available. When that day comes, I’ll gladly share that moment and encourage others to do the same.


Big Tech, AI Research, And Ethics Concerns: Why We Should All Worry

In general, I root for technology and technological progress. Overall, I believe it has been a net benefit for humanity. It’s one of the major reasons why we’ve made so much progress as a global society in the past 100 years.

I’ve sung the praises of technology in the past, speculated on its potential, and highlighted individuals who have used it to save millions of lives. For the most part, I focus on the positives and encourage other people to have a less pessimistic view of technology and the change it invites.

That said, there is another side to that coin and I try not to ignore it. Like anything, technology has a dark side. It can be used to harm just as much as it can be used to help, if not more so. You could argue that we couldn’t have killed each other at such a staggering rate in World War II without technology.

It’s not hyperbole to say that certain technology could be the death of us all. In fact, we’ve come distressingly close to destroying ourselves before, namely with nuclear weapons. There’s no question that kind of technology is dangerous.

However, artificial intelligence could be far more dangerous than any nuclear bomb. I’ve talked about it before and I’ll likely bring it up again. This technology just has too much potential, for better and for worse.

That’s why when people who are actually researching it have concerns, we should take notice. One such individual spoke out recently, specifically someone who worked for Google, an organization with deep pockets and a keen interest in Artificial Intelligence.

According to a report from the Associated Press, a scholar named Timnit Gebru expressed serious concerns about Google’s AI research, specifically regarding the ethics of how it operates. For a company as big and powerful as Google, that’s not a trivial comment. Here’s what the report had to say.

AP News: Google AI researcher’s exit sparks ethics, bias concerns

Prominent artificial intelligence scholar Timnit Gebru helped improve Google’s public image as a company that elevates Black computer scientists and questions harmful uses of AI technology.

But internally, Gebru, a leader in the field of AI ethics, was not shy about voicing doubts about those commitments — until she was pushed out of the company this week in a dispute over a research paper examining the societal dangers of an emerging branch of AI.

Gebru announced on Twitter she was fired. Google told employees she resigned. More than 1,200 Google employees have signed on to an open letter calling the incident “unprecedented research censorship” and faulting the company for racism and defensiveness.

The furor over Gebru’s abrupt departure is the latest incident raising questions about whether Google has strayed so far away from its original “Don’t Be Evil” motto that the company now routinely ousts employees who dare to challenge management. The exit of Gebru, who is Black, also raised further doubts about diversity and inclusion at a company where Black women account for just 1.6% of the workforce.

And it’s exposed concerns beyond Google about whether showy efforts at ethical AI — ranging from a White House executive order this week to ethics review teams set up throughout the tech industry — are of little use when their conclusions might threaten profits or national interests.

I bolded that last sentence because I think it’s the most relevant. It’s also the greatest cause for concern. I suspect Ms. Gebru is more concerned than most because the implications are clear.

When a tool as powerful as advanced AI is developed, who gets to determine how it’s used? Who gets to program the ethical framework by which it operates? Who gets to decide how the benefits are conferred and the harms are reduced?

Moreover, how do you even go about programming an AI with the right kind of ethics?

The answer to that is very relative, and it’s a question we can’t avoid if we’re going to keep developing this technology. I’ve tried to answer it, but I’m hardly an expert. Ms. Gebru was definitely in a better position than me or most other people with a passing interest in this field.

Then, she gets fired and starts expressing concerns publicly. The fact that she can be pushed out like that while Google faces little in the way of repercussions should be concerning. It may also be a sign of the larger challenges we’re facing.

Google, like many other organizations researching advanced AI, is a profit-seeking tech company. They’re not some utopian technocrats. They’re a business that is obligated to make its investors happy. Advanced AI will help them do that, but what kind of consequences will that invite?

If profit is the primary motivation of an advanced AI, then what happens when it encounters a situation where profit comes at the cost of lives? There are already human-run companies that make those decisions, and people die because of them. An advanced AI will only make it many times worse.

Once an artificial intelligence system is as smart as a human, it’s going to be capable in ways we don’t expect and can’t control. If its ethics and goals aren’t aligned with ours, then what’s to stop it from wiping humanity out in the name of profit?
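To make that concern a bit more concrete, here is a deliberately tiny toy sketch of the alignment problem being described. This is not how Google or anyone else actually builds AI; the actions, numbers, and “harm budget” below are all made up for illustration. The point is simply that an optimizer only cares about what is written into its objective, so if harm never appears in the score, it never factors into the decision.

```python
# Toy sketch of reward misspecification. Every action and number here is
# invented purely for illustration; no real system works off a table like this.

actions = {
    #                (profit, harm_to_people) in arbitrary units
    "safe_product":   (100,    0),
    "cut_corners":    (180,   40),
    "reckless_play":  (250,  900),
}

def naive_planner(options):
    # Maximizes profit alone: harm is simply invisible to this objective.
    return max(options, key=lambda name: options[name][0])

def constrained_planner(options, harm_budget=10):
    # Same optimizer, but any option exceeding the harm budget is off the table.
    allowed = {name: vals for name, vals in options.items() if vals[1] <= harm_budget}
    return max(allowed, key=lambda name: allowed[name][0])

print(naive_planner(actions))        # -> reckless_play
print(constrained_planner(actions))  # -> safe_product
```

Writing down that “harm budget” for the messy real world is the genuinely hard part, and that is exactly the kind of problem researchers like Ms. Gebru spend their careers worrying about.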

It’s a distressing thought. It’s probably a thought that has crossed Ms. Gebru’s mind more than once. She may know how close or far we are to that point, but the fact that this is already a conflict should worry us all.

We’ve already become so numb to the greed and excesses of big business. Tech companies may conduct themselves as this team of future-building visionaries intent on making the world a better place, but the profit motive is still there. Like it or not, profit is still a hell of a motive.

Eventually, artificial intelligence will get to a point where it will either adopt our ethics or choose to formulate its own, which may or may not align with ours. When that happens, no amount of profit may be worth the risk.

Now, we’re still a ways off from an artificial intelligence system on that level, but it’s still quite possible that there are people alive today who will grow up to see it. When that time comes, we need to be damn sure these systems have solid ethical frameworks in place.

If they don’t, we really don’t stand a chance. We’re a society that still kills each other over what we think happens when we die without seeing the irony. Even a marginally advanced AI will have no issues wiping us out if we make doing so profitable.


Appreciating Dr. Maurice Hilleman: The Man Who Saved Millions Of Lives (With Vaccines)

As someone who regularly consumes superhero media of all kinds, I try to appreciate the real heroes in the real world who regularly save countless lives. Most carry themselves without superpowers, flashy costumes, or charisma on par with Robert Downey Jr. or Christopher Reeve. They just do the work that needs doing to help people who will never know their name.

A couple of years ago, I made a tribute to Dr. Norman Borlaug, the famed agricultural scientist who helped usher in an agricultural revolution. This man, whom most have never heard of, saved millions of lives by helping the world produce more food, reduce famine, and combat world hunger. The amount of good he has done for the world cannot be overstated.

In that same spirit, I’d like to highlight another individual who I doubt most people have heard of. He’s another doctor who, through his work, has helped save millions of lives, many of them children. It’s because of this man that millions of children born today don’t become ill with diseases that had ravaged humanity for generations.

That man is Dr. Maurice Hilleman. While his renown is not on the same level as Dr. Borlaug’s, I have a feeling his profile will rise considerably after the events of 2020. That’s because Dr. Hilleman currently holds the record for developing the most vaccines of any doctor in history.

In total, he helped develop more than 40.

Of those vaccines, 8 are still routinely recommended by doctors today. They combat terrible diseases like measles, mumps, Hepatitis, and chicken pox.

It’s a level of productivity that is unparalleled today. As a result of these vaccines, approximately 8 million lives are saved every year. Even though he died in 2005, he continues to save lives with each passing year through his work. Like Dr. Borlaug, his heroism only compounds with time. Even Tony Stark can’t boast that.

Most people alive today don’t realize just how bad it was before these vaccines were developed. Many diseases, some of which you’ve probably never heard of, were rampant. Before Dr. Hilleman helped develop the vaccine, measles alone infected between 3 and 4 million people every year in the United States, killing between 400 and 500 of them each year.

Children and the elderly were especially vulnerable. It was once just a fact of life that these diseases would come around and kill a sizable number of children. It was as unavoidable as bad weather.

Take a moment to imagine life in those conditions. One day, you or your children would just get sick and there was nothing you could do to prevent it. That was life before these remarkable advances came along.

That’s why when people say that nothing has saved more lives than vaccines, they’re not peddling propaganda. They’re just sharing the results of basic math. It’s because of men like Dr. Maurice Hilleman that these numbers are what they are. However, his name is not well-known, even in a field that has become more prominent.

Most people know who Edward Jenner is and appreciate how many lives he saved by combating Smallpox.

Most people know who Jonas Salk is and appreciate how many people he helped by developing a polio vaccine.

Now, what these men did was remarkable. They certainly deserve the praise and admiration they receive for developing their respective vaccines. However, Dr. Maurice Hilleman still deserves to be in that same echelon. For the number of vaccines he helped develop and the legacy he left, he certainly put in the work and accomplished a great deal.

The diseases Dr. Hilleman battled might not have been as high-profile as Smallpox or polio, but they were every bit as damaging. That makes it all the more frustrating to see efforts like the anti-vaxx movement take hold, which has led to resurgences of diseases like measles in certain communities. That is not the direction we should be going right now.

In the midst of a historic pandemic, medical science and those who work in it have never been more critical. This might be the best possible time to acknowledge the work of men like Dr. Hilleman. Even after this pandemic has passed, we should all appreciate work like his even more.

None of us have superpowers like Spider-Man or Superman.

Most of us will never be as rich, smart, or resourceful as Iron Man or Batman.

Dr. Hilleman had none of this. Just like Dr. Borlaug, he came from a poor family. At one point, he didn’t have enough money for college. He still persevered and he still managed to do the work that went on to save millions of lives. It might not be a hero’s story on par with the Marvel Cinematic Universe, but it’s still a special kind of heroism.

So, on this day as we anxiously wait for this pandemic to end, take a moment to appreciate the work of Dr. Maurice Hilleman. It’s because of him that such pandemics are so rare. It’s also because of him that this current pandemic won’t take nearly as many lives as it could’ve.


“The Animaniacs” Reboot: A Zany Revival For An Insaney Time

Depending on who you ask, we either live in a golden age of television or a deepening dark age. The rise of streaming media and the decline of traditional TV models has completely changed how Hollywood does business. Some say it’s a good thing. Some say it’ll lead to the utter destruction of the entertainment industry, as we know it.

I don’t want to talk about such dire issues.

Instead, I want to talk about “The Animaniacs” reboot.

It’s a relevant topic because this reboot just wouldn’t have been possible 10 years ago. It wouldn’t have been possible 5 years ago, either. It’s riding an ongoing wave of reboots and revivals. Many of them are banking on nostalgia from certain eras to attract an audience.

Is it shamelessly desperate in the never-ending fight for more eyeballs and subscribers? Yes, it is.

Is most of it utterly forgettable and completely unfit for the current media landscape? For the most part, it is.

That’s exactly why “The Animaniacs” reboot is such a wonderfully refreshing achievement. It doesn’t just bring back a beloved show that many kids in the 90s, myself included, grew up watching. It perfectly captures the spirit of that show while still embracing a modern aesthetic that fits perfectly in 2020.

It helps that this show didn’t try to completely reinvent itself. It brought back the original voice actors for Yakko, Wakko, and Dot. It didn’t significantly change the theme song, the comedic style, or the overall structure of the show. The only noticeable changes were updated animation and a more contemporary setting.

Everything else is as zany, irreverent, and meta as you remember. It’s the same style long-time fans grew to love in the mid-1990s. Remarkably, that style works just as well 22 years later.

A big appeal to that style is just how self-aware the show is of its absurdities. The Warner Brothers, and the Warner Sister, know who and what they are in the grand scheme of things. They gleefully mock, tease, and joke about anything and anyone that crosses their path.

Some of that humor is more mature than a simple pie in the face. Other times, it’s as simple as Dot hitting her brothers with an oversized mallet. Both brands of humor are still funny and cartoonishly over-the-top.

It’s the kind of humor that works for kids and adults alike. That was a big part of what made the original show so popular and endearing. In watching this reboot, I still found myself laughing hysterically at times.

My inner 90s kid and my full-fledged adult delighted in the same jokes and gags. It never felt like my love of the old show was being exploited or mocked. It just felt like a fresh influx of zany comedy that I had missed for 22 years.

Even the Warners acknowledge in the first episode that the world has changed. The type of humor they did in the 90s just won’t land like it once did. That doesn’t stop them from making plenty of 90s references, but that’s not the sole source of appeal. It’s just a small part of it.

No matter the era, “The Animaniacs” works by sticking to a simple formula. Put Yakko, Wakko, and Dot in a strange situation, be it facing the gods of Olympus or hunting a donut thief. Then, let them be their zany selves as they encounter various characters and obstacles along the way. The comedy just naturally emerges from there.

This reboot did not radically change that formula, both for the Warners and for Pinky and the Brain. It just updated the dates and settings while not shying away from the many ways the world has changed.

There are hipster douche-bags running donut shops.

There are self-important CEOs who don’t give a damn about anything other than profits and themselves.

There are assholes who take up way too much space in a movie theater.

Some of these things existed in the 1990s too, but they’re more relevant to current pop culture trends. “The Animaniacs” gleefully and hilariously rides those waves.

That’s not to say every joke lands or every episode is perfect. Some gags fall flat, and not every musical number is as memorable as Yakko’s famous countries of the world song. Still, there are many more hits than misses. I’d argue their song about reboots is the best of the bunch.

Now, you could say a lot about how relevant “The Animaniacs” is in this current era of adult animation. There’s no doubt the landscape is very different than what it was during the 1990s. This show was part of its own golden era in the 1990s, but that era is long gone.

These days, adult animation is dominated by shows like “Bojack Horseman” and “Rick and Morty.” Those shows still utilize comedy, but their brand of humor is a lot darker, built largely on the increasingly cynical trends that have been unfolding since the early 2000s.

It’s fair to say that the kids who grew up watching the original Animaniacs weren’t nearly as jaded as kids today. Even before the awfulness of 2020, generations of kids and adults alike have seen a steady decline in hope for the future. Given that kind of attitude, it’s a lot harder for that zany style of comedy to land.

However, “The Animaniacs” reboot finds a way. It resists the urge to fall into the same dark traps as many other failed reboots. It doesn’t try to be “Bojack Horseman” or “Rick and Morty.” It just tries to be the same Animaniacs we know and love.

That’s what makes it work.

That’s what makes it funny.

That’s what makes it totally insaney, even in a year as insane as this.

That’s exactly why I love it and highly recommend it to anyone with a Hulu subscription.


Vaccine Update: The Impact Of The Moderna Vaccine (Beyond COVID-19)

Sometimes, it takes a terrible global crisis to spur huge leaps in technology. World War II was arguably the greatest crisis of the modern era, but it helped advance some of the greatest technological leaps in history. We can argue whether those advances were worth all the death and destruction, but there’s no denying that our world wouldn’t be the same without them.

The COVID-19 pandemic isn’t on the same level as World War II, but it is, by most measures, the greatest crisis the world has faced in the past 50 years. It hasn’t just caused hundreds of thousands of deaths and immeasurable amounts of suffering. It has completely disrupted this big, interconnected world that we’ve come to depend on.

We’ve all lost something in this pandemic. Beyond the loved ones who have perished, our entire sense of security and hope has been shattered. We now realize just how vulnerable we were and how inevitable this was. As bad as it is, there is some good coming out of it.

Usually, a crisis like this helps break down the barriers that divided us and hindered progress, technological or otherwise. Never before has the world been more united or engaged in a singular effort. Before 2020, we probably didn’t know much about vaccines or vaccine research. We just knew that Jenny McCarthy tried to be relevant again by protesting them.

That’s changing now. The global effort to create a vaccine for this terrible disease has been watched and agonized over for months. Most recently, we got a much-needed glimmer of hope from Pfizer, who reported that their vaccine is 90 percent effective. I celebrated this news like everyone else.

Then, we got an even greater glimmer of hope from the other vaccine front-runner, Moderna. Not only is their vaccine in the final phase of testing, like Pfizer’s; it’s also more effective and promises to be easier to store and distribute.

CNN: Moderna’s coronavirus vaccine is 94.5% effective, according to company data

The Moderna vaccine is 94.5% effective against coronavirus, according to early data released Monday by the company, making it the second vaccine in the United States to have a stunningly high success rate.

“These are obviously very exciting results,” said Dr. Anthony Fauci, the nation’s top infectious disease doctor. “It’s just as good as it gets — 94.5% is truly outstanding.”

Moderna heard its results on a call Sunday afternoon with members of the Data Safety and Monitoring Board, an independent panel analyzing Moderna’s clinical trial data.

This is objectively great news in a year when we’ve had precious little of it. These two vaccines may very well be the one-two punch we need to end the COVID-19 pandemic and return to some semblance of normalcy. I would still like to go to a movie theater or baseball game at some point in 2021. These vaccines may make that possible.

However, I’d like to take a moment to speculate beyond this terrible pandemic that has uprooted so many lives. I know that’s not easy to do when the crisis is still very relevant and inflicting plenty of suffering. I still think it’s worth attempting, if only to imagine the better world that emerges from this mess.

That’s because both of these vaccines aren’t like your typical flu shots. For one, flu shots aren’t nearly as effective as what Pfizer and Moderna reported. According to the CDC, your average flu shot is between 40 and 60 percent effective. That’s still important because the flu can be deadly. Anything that reduces it can only further public health in general.
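For anyone wondering where a number like “94.5 percent effective” actually comes from, the math behind it is refreshingly simple. Here’s a rough back-of-the-envelope sketch, assuming roughly equal-sized trial arms (approximately true for Moderna’s trial) and using the publicly reported interim case counts of 5 infections among the vaccinated and 90 among the placebo group. Efficacy is just one minus the ratio of the two attack rates.

```python
# Back-of-the-envelope vaccine efficacy: 1 - (attack rate in the vaccinated arm
# divided by attack rate in the placebo arm). Arm sizes below are rough
# approximations, assumed equal for simplicity.

def vaccine_efficacy(cases_vaccinated, n_vaccinated, cases_placebo, n_placebo):
    attack_vaccinated = cases_vaccinated / n_vaccinated
    attack_placebo = cases_placebo / n_placebo
    return 1 - attack_vaccinated / attack_placebo

# Moderna's reported interim counts: 5 cases vaccinated, 90 cases placebo
print(f"{vaccine_efficacy(5, 15_000, 90, 15_000):.1%}")   # ~94.4%

# A flu shot in the middle of the CDC's 40-60 percent range looks more like this
print(f"{vaccine_efficacy(45, 15_000, 90, 15_000):.1%}")  # 50.0%
```

With that equal-arm simplification, the result comes out to roughly 94 percent, right in line with the figure Moderna announced.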

The problem is the flu shot, and most vaccines like it, are based on old technology. At their most basic, they take a non-infectious or weakened strain of a pathogen and use it to amp up your body’s immunity. It’s crude, but it works. Literally nothing has saved more lives than vaccines.

The problem is that vaccines are notoriously hard to develop. They take a long time to test and an even longer time to approve. Until this pandemic, there just wasn’t much incentive to improve on that process. Now, after these past 8 months, the incentive couldn’t have been greater.

That’s what sped up the development of mRNA vaccines, the technology behind both Pfizer and Moderna. It was reported on as far back as 2018. While this technology isn’t completely new, it has never been developed beyond a certain point. There just wasn’t any incentive to do so. A global crisis changed that.

Very simply, an mRNA vaccine does one better on traditional vaccines by using RNA to develop immunity. It’s not as easy as it sounds. To develop that immunity, the mRNA has to encode just the right antigen. That way, the antibodies your immune system creates can attack the desired pathogen.

In the case of COVID-19, the mRNA vaccine attacks the distinct spike protein the virus uses to attach to host cells. It’s like a missile targeting a specific individual in a large crowd by locking onto the distinct hat they wear.

This approach has the potential to be much more effective at generating immunity to a particular disease. Instead of trying to mimic a virus, it just gives the immune system the software it needs to do the work. It could potentially revolutionize the way we treat and prevent diseases.

For years, certain viruses like the flu and HIV have confounded efforts to develop a vaccine. Beyond the problems I listed earlier with regards to testing, creating a particular immune response to a particular antigen is very difficult. These viruses mutate and change all the time. With COVID-19, vaccine developers have an advantage because the virus has that one distinct feature.

The challenge for future vaccines against future pandemics is quickly uncovering a particular antigen that the mRNA can be coded to produce. In theory, all you would have to do is find the one key antigen that’s common to every strain of the virus. While viruses like the flu are notoriously diverse, they can only change so much.

It’s akin to trying to identify an army of spies in a large crowd. They may all look different on the outside, but if they all have the same socks, then that’s what you code for. With some refinements, an mRNA vaccine can stop a pandemic in its tracks before it ever gets beyond a certain point.
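If you want a feel for what “coding for the same socks” looks like computationally, here is a toy sketch. The strain sequences below are made up and far shorter than anything real, and actual antigen discovery involves much more than string matching; the sketch only illustrates the conceptual step of hunting for a segment that every variant shares.

```python
# Toy illustration of finding a feature conserved across every variant.
# The sequences are invented; real genomes are vastly longer and messier.

def shared_segments(sequences, k=5):
    """Return every k-length segment that appears in all of the sequences."""
    common = None
    for seq in sequences:
        segments = {seq[i:i + k] for i in range(len(seq) - k + 1)}
        common = segments if common is None else common & segments
    return common

# Hypothetical strains that differ everywhere except one conserved core
strains = [
    "AAGTCCGTTACGGA",
    "TTGACCGTTACGTT",
    "GGCACCGTTACGAC",
]
print(shared_segments(strains))  # segments drawn from the shared CCGTTACG core
```

In the real world, that conserved core would be the antigen you code the mRNA to produce, no matter how the rest of the virus mutates.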

That assumes we’ll continue to refine this technology after the COVID-19 pandemic has passed. I certainly hope that’s the case. This year has traumatized entire generations with how much pain and suffering it has inflicted. I sincerely hope that gives plenty of motivation to develop technology like this. That way, we never have to endure a disruption like this again.

To all those who helped develop this technology and these two vaccines, I hope you appreciate the impact you’ll make with this technology. The number of lives they could save is incalculable. Future generations may not remember your names, but they will be forever grateful for this wondrous gift you’ve given them.


Deep Fake Technology Can Now Make Tom Cruise Iron Man: Signs And Implications

Certain technology advances slowly and steadily. It’s why we’re still waiting for a cure for the common cold. Other technological breakthroughs advance at such a fast rate that it’s hard to keep up. Anyone who doesn’t regularly upgrade their cell phone understands that.

That brings me to the technology of deep fakes. I’ve talked about them before and the implications this technology has for the entertainment industry. Well, I’m here to report that this technology might be advancing faster than I thought.

Recently, a new deep fake video hit the web. It’s nothing overly nefarious. It’s actually a play on a real story from the mid-2000s. Before Robert Downey Jr. was cast as Tony Stark in the first “Iron Man” movie, Tom Cruise was in the running for that role.

He has since claimed he was never close to getting that role, but it’s still an interesting idea. For most Marvel fans, it’s hard to imagine anyone other than RDJ donning that now-iconic armor. However, there’s no denying that Tom Cruise being Iron Man would’ve changed the franchise, as well as cinematic history.

Well, thanks to deep fake technology, we don’t have to imagine anymore. We can now see for ourselves what it would look like if Tom Cruise had been cast as Iron Man in the Marvel Cinematic Universe. See for yourself.

Watching this, I have to say it was more than a little jarring. It’s not just that seeing someone other than RDJ as Iron Man is strange. I was genuinely impressed by how real it looked.

Yes, it did become a bit obvious at times that there was some digital trickery at work. I’ve seen enough Tom Cruise movies to know what he looks like. I could tell that the body just did not match the iconic face at times.

However, I’m still impressed at just how seamless it appeared, especially when he was in the Iron Man costume. It really did look like Cruise had embraced the role as much as RDJ had. Even though the voice had to come courtesy of a skilled voice actor, the graphics technology is definitely on pace to cross the uncanny valley sooner rather than later.

The implications here are profound. If the technology is already at this point, then it’s a given that Hollywood and propaganda pushers will start embracing it sooner rather than later. Hollywood, which is reeling in the wake of a historic pandemic, may have more incentive to embrace it than most.

Beyond actors and actresses who get “cancelled” for their behavior, it may start as a cost cutting measure. If it costs too much to put Hugh Jackman or Tom Cruise on a movie set, why not just put a cheaper actor in their place and just deep fake the more iconic figure over it? If the technology is that good and nobody can tell the difference, it almost makes too much sense.

It may get to a point where nobody outside the studio knows whether the figure we see on screen was actually “there” to give that moment life. They may just be a digital scan mixed with digitally generated audio, which is also advancing.

This has even larger implications with propaganda. If the technology gets to a point where we can make any public figure say or do anything we want, no matter how deplorable, then how can we trust any media image? Would “cancel culture” even be feasible at that point? If people can just claim an embarrassing moment was a deep fake, how would we know?

It’s a distressing thought, but it’s something we’ll have to account for. We may end up having to contemplate it sooner than we thought. This technology can already show us a world in which Tom Cruise was cast as Iron Man. What other worlds will it reveal?

We’ll find out soon enough.
