Future Artificial Intelligence

Humanity and Future Artificial Intelligence

The advancements in Artificial Intelligence are astonishing. AI is approaching the threshold where it is as smart as the smartest, most inventive human.
It could then be a matter of days before it is smarter than the sum of humanity.

What happens when machines surpass humans in general intelligence?

If machine brains surpass human brains in general intelligence, this new superintelligence would have emerged through an event called the intelligence explosion, which is likely to occur in the 21st century.

It is unknown what, or who, this machine network would become; that uncertainty is the central issue of superintelligence.

I do not think we can overcome this problem through technology alone.
Imagine that we have done our job perfectly and created the safest, most beneficial AI possible, but we have let the political system become totalitarian and evil.
It is not going to work out well, because we are talking about human AI.

Human AI

Human AI means AI at the level of a human.
So the question of how we make humans better is the same question as how we make AIs that are at human level.

What is an actually good future? What does it actually look like?

All of us are already cyborgs.
We have a machine extension of ourselves in the form of our phones, computers, and applications.
We are already superhuman: we have far more powerful capabilities than the President of the United States had 30 years ago.

If you have an internet link, you can communicate with millions of people and reach the rest of the Earth instantly. These are magical powers that did not exist not so long ago.
So I think everyone is already superhuman.

The limitation is one of bandwidth.
We are bandwidth-constrained, particularly on output. Our input is much better, but our output is extremely slow.
If you want to be generous, you could say our output is a few hundred bits per second, or perhaps a kilobit.
Compare that to a computer, which can communicate at the terabit level.
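To put that gap in perspective, here is a minimal back-of-the-envelope sketch in Python using the loose figures above; the numbers are rough illustrative estimates from the text, not measurements.

```python
# Rough comparison of the bandwidth gap described above.
# Both figures are loose, illustrative estimates, not measured values.

human_output_bps = 1_000               # generous estimate: ~1 kilobit per second of human output
computer_link_bps = 1_000_000_000_000  # ~1 terabit per second for a machine-to-machine link

ratio = computer_link_bps / human_output_bps
print(f"A terabit link moves data roughly {ratio:,.0f}x faster than human output.")
# -> A terabit link moves data roughly 1,000,000,000x faster than human output.
```

Even with the generous kilobit estimate, the gap is about nine orders of magnitude.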

Now we are headed toward either superintelligence or the end of civilization.

Superintelligence

Intelligence will keep advancing. But another thing we are advancing is technology that could put civilization into stasis or destroy it.

What is a world we would like to be in?

Superintelligence studies have proposed a number of directions in which AI could develop, such as:
✓ The development of stronger and smarter artificial servants
✓ The development of a network of increasingly intelligent systems
✓ The development of AIs with human-like personalities, or of AIs with moral reasoning capabilities that can make decisions autonomously and care about humanity

The term AI is used to mean anything from incremental improvements to the software of today's computers to the development of human-like thinking machines.
The latter is an extreme case, also called ASI: strong or general AI, in contrast with today's narrow and weak AI.

By definition, it refers to the creation of a machine with intellectual abilities that match or exceed those of humans across the board.

An ASI could perform better than us at any conceivable task, including intellectual ones.
It could engage in scientific research, teach itself new abilities, improve its own code, create unlimited copies of itself, choose better ways of deploying its computational resources, and even transform the environment on Earth or colonize other planets.

The evolution of this new type of sentience could follow many paths.
It could present an existential risk to humanity, depending on the nature and capabilities of the system.
However, a powerful machine taking over the world is not the only possible outcome. There are really two sides: one is getting rid of a lot of the negatives, such as the compassionate use of AI to cure diseases and all the other kinds of horrible miseries that exist on the planet today.

So that is a large chunk of the potential. But beyond that, if one really wants to see what positive things could be developed, I think one has to think outside the constraints of our current human biological nature; it is unrealistic to simply stretch our present trajectory forward.
