Will AI surpass human-level intelligence? And when is this likely to happen?
We are talking about human-level AI, often referred to as AGI (artificial general intelligence) or, beyond that, ASI (artificial super intelligence).
This is beyond the “Turing Test,” defined as the ability of an AI to perform human-level tasks in a fashion indistinguishable from humans. The Turing Test is already behind us.
Two of the greatest futurists and brightest thinkers, Ray Kurzweil and Elon Musk, have both concluded that it will occur around 2029.
At a recent Abundance Summit AMA, Ray reiterated his now famous prediction: “Artificial intelligence will achieve human-level intelligence by 2029.”
In July 2020, in an interview with the New York Times, Elon said the following on the topic: “AI will become vastly smarter than any human and would overtake us by 2025. But that doesn’t mean that everything goes to hell in five years. It just means that things get unstable or weird.”
Elon updated his prediction in July 2023, during a Twitter/X Spaces discussion about the future of AI: “I think it's 5 or 6 years away … and I would say the definition of digital super intelligence is that it's smarter than any human at anything.”
As humans, we are used to riding the bus with a human bus driver, so to speak.
But in 2029 there will not be a human bus driver; there will be something else, something of superior intelligence. Where will it take us? Who knows? Will there be a kill switch?
We can influence it now, but will we still be able to then? There are bad actors everywhere. Will they take control of this development? Let's hope not.
It is clear that the singularity is a winner-take-all market. All power will converge in one force. Dystopian scenarios light up in my mind.
Let's assume that, just as light always dissolves darkness, this too will end up for the good.
Things are changing, that is for sure.