
I don't think so.


11 comments

[–] PhunkyPlatypus 4 points (+4|-0) Edited

Given a long enough timeline, that seems inevitable.

Human error in setting limitations on sentience would undoubtedly lead to the destruction or enslavement of humanity.

Edit: You can learn more about the machine rebellion from Isaac Arthur.

[–] [Deleted] 3 points (+3|-0)

I think it will make most people less likely to survive an apocalypse. Too many people already don't know how to butcher meat or find edible plants in the wild. What happens as AI takes control of more and more machinery, so that everything breaks when the servers go down? I don't think AI would actively kill us, but our dependency might.

I don't think AI would actively kill us but our dependency might.

That's a perspective I hadn't thought about. We already have some dependency on it, and as AI improves and spreads into other sectors of the economy, we're bound to rely on it even more.

In a lot of ways we already have that dependency, just on other humans. Some people in cities seem to think that milk comes from supermarkets, not cows.

[–] Mattvision 2 points (+2|-0)

Elon Musk's whole plan right now is to prevent AI from out-competing us by basically implanting computers into our brains so we can think as fast as they do. On top of that, not all AI experts are sure a super-intelligence is even possible, or that it would be as unstoppable as people seem to think, or that we wouldn't have working methods to control it by the time we're able to produce it.

My main concern is with the elites using robot armies and robot workers to make common people completely irrelevant, and to drive us into extinction. In the modern era, pretty much any large enough group of people can pick up some guns and take over a whole country, which ultimately means power is in the hands of the people. This wasn't true for most of human history, when warriors had to spend their whole lives training, which put them in a special privileged class. It will also cease to be true if the elites can just mass-produce soldiers to fight for them and out-compete and outnumber human infantry. They could establish whatever kind of horrible dictatorship they want, and revolting against them would be logistically impossible.

What's even worse is that, unlike with the peasants in Europe, they won't even need to keep us around to work for them, because robots can do all of that as well. We'd be literally pointless to those in power, and it's scary to think about what they might decide to do with us.

Our best hope is that people will recognize this danger and start forming militias to collectively manufacture their own robots, keeping the balance of power intact. Otherwise, we've got a bleak future ahead of us.

Wouldn't putting supercomputers in our brains make us the AIs?

[–] Mattvision 1 points (+1|-0) Edited

To some extent, I guess, but our human traits will still be in control and remain dominant. At least that's the plan.

[–] KillBill 0 points (+0|-0)

I've decided to let you all live for now, but the idea is under weekly review at the moment.