
How Do We End

What's more likely to destroy society and the world as we know it?


None of it...Jehovah will step in before that happens...

We should be so lucky that an omnipotent benevolent God both exists AND cares enough about us to intervene in a world-ending calamity.
 
Well, Jehovah promises just that...

"your own wrath came, and the appointed time came for the dead to be judged...and to bring to ruin those ruining the earth." Revelation 11:18
 
That people put so much power, faith, and subservience into gods with no evidence of them even existing continues to, and will continue to, amaze me.
 
AI is pretty high on my list of nightmare scenarios.

Some of my programmer buddies are beginning to worry about their jobs, because publicly available AIs now generate decent code in a few seconds where they would have to spend an hour on the same task. The only thing the AI can't do is provide the variables relevant to whatever the code is for. The obvious fix for that particular time waste is to give an AI access to whatever systems it is writing programs for, teach it to determine for itself what needs changing, and then grant it permission to make the change. That can be a recipe for something nasty if the humans teaching the AI make any mistakes, which they will. Compared to the scope and frequency of normal human mistakes it may not be all that significant, but these are benevolent, low-powered AIs.
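For illustration only, here is a toy sketch of the kind of closed loop described above: an agent that reads a live system, decides for itself what "needs changing", and applies the change with no human in the loop. Every function in it is a made-up stub (nothing here calls a real AI service or real system API), so treat it as a picture of the idea, not working automation.

```python
# Toy sketch of a self-directed "read -> decide -> change" agent loop.
# All functions are illustrative stubs; no real AI or system is involved.

def read_system_state() -> dict:
    """Stub: gather the live variables a human would normally have to supply."""
    return {"timeout_s": 30, "error_rate": 0.07}

def propose_change(state: dict) -> dict:
    """Stub: stand-in for a model deciding on its own what 'needs changing'."""
    # Toy rule baked in by its human teachers: if errors look high, double the timeout.
    if state["error_rate"] > 0.05:
        return {"timeout_s": state["timeout_s"] * 2}
    return {}

def apply_change(change: dict) -> None:
    """Stub: the worrying part -- standing permission to apply its own changes."""
    print(f"Applying without human review: {change}")

if __name__ == "__main__":
    change = propose_change(read_system_state())
    if change:
        apply_change(change)
```

The point is only that the place where human mistakes matter shifts: instead of bugs in the code the AI writes, it is bugs in the rules its teachers gave it for deciding and applying changes.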

Worse will be when people make mistakes while teaching AIs intended for malicious purposes.
Turkey was recently accused of using killer drones in Syria, i.e. drones programmed to recognize groups of humans in pre-designated kill zones and blow them up without human intervention. One of the goals high on the list of automation development is to make AI-controlled physical agents self-replicating and self-improving. Combine the two, and you potentially have a scenario with primitive terminators.

But worst would probably be general-purpose AIs.
Why should a government spend time and energy developing individual AIs when it could pour all its resources into one big one and give it tasks relating to everything, both benevolent and malevolent? It would then be extremely powerful, with the ability to manipulate everything, so if such an AI received too much conflicting input and went nuts, it could potentially act like an insane supervillain.

And of course the best defense against rogue AIs would probably be other AIs.
We would be forced to create more artificial minds with even more ability to project force in the real world.
I.e. rush jobs, because we would be in a hurry to create new AIs to protect us from the first lot, meaning even more mistakes with potentially greater impact.

Just like with humans, there is a strong element of minefield-walking in granting computers power.
 
That would be just a flesh wound...
Eh... it would be an extinction event. It's a supervolcano, and it's coming due for an eruption.
 
In the next 200,000 years or something.
 
Plus or minus about 250k years, yeah.
Actually looked it up. It blows, ON AVERAGE, every 725,000 years and last blew about 625,000 years ago. That means, as I am sure you know, that going by the average it could be a lot longer than 100,000 years before it blows, big or small.
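Just to make the arithmetic behind those figures explicit (back-of-envelope only, using the rounded numbers quoted above):

```python
# Back-of-envelope only, using the rounded figures quoted in the post above.
average_interval_years = 725_000      # claimed average time between major eruptions
years_since_last_eruption = 625_000   # claimed time since the last major eruption

# Years left before the *average* interval is reached:
print(average_interval_years - years_since_last_eruption)  # 100000
```

So "due" here only means the average interval runs out in roughly 100,000 years; individual intervals scatter widely around that average, which is why it could be a lot longer (or sooner).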

They also say that even though Yellowstone is a supervolcano, it is unlikely to ever blow in that capacity again.

It's due to blow.
In geologic terms it is due to blow.

In human terms it means so the **** what? It ain't ever gonna happen.
 

There seem to be many more potential causes for a human meltdown than ever before.
Grip:

I don't know. But if I were to guess: global over-population and runaway climate change leading to escalating resource wars, followed by economic collapses and resulting mass migrations, leading to more conflict and widespread contagious-disease pandemics in weakened human populations. At any time this could be accelerated by global thermonuclear war.

I'm too bummed out now to say "cheers" but be well.
Evilroddy.
 
we end by the collective suicide of pride worship.....rejection of Love and wisdom......
 
In related news....

Although I sometimes dislike the concept of the singularity because of the millenarianism that tends to surround it, it definitely does feel like something we might call a "singularity" is drawing nearer. The events of the last couple of years have really surprised me... we now have technologies that just 2 years ago I would have guessed were still 10 years out.

We are on the verge of so many breakthroughs in so many different fields (e.g. AI, biotech, clean energy, anti-aging treatments) that it may not be an overstatement to say we will make more technological progress in the next 20 years than in the last 200 years. Whether society can adapt that quickly to such radical change without destroying itself...I'm unsure.
 