Microsoft Suggests AI Is Becoming Human-Like

(ConservativeInsider.org) – Dr. Peter Lee, Corporate Vice President and head of research at Microsoft, has said that GPT-4 – the artificial intelligence (AI) model underlying ChatGPT – is “already showing signs of human reasoning” and could be developing at a speed that would make it uncontrollable by humans as early as 2045.

He was speaking at a press conference, where he described how the AI system correctly worked out how to stack a complex arrangement of objects while giving explanations for its decisions that the research team had not anticipated.

The pace of development has long been a cause for concern within the scientific community, with advances in technology bringing humanity ever closer to the “singularity” – the point at which machine intelligence would surpass the combined power of all human brains on the planet.

At that point, much like Skynet in the Terminator series of films, machines could decide that humans were no longer necessary, with unpredictable and possibly catastrophic results for the human race.

Many predictions have been made about the capabilities of AI, including the ability to grow the world economy, to predict and treat illness, to care for the elderly, and to solve the energy crisis. However, the potential to wipe out the human race could be seen as outweighing all of these benefits.

The new version of ChatGPT has already demonstrated that it can understand images that are presented to it, and digest and analyse prompts that may be tens of thousands of words long.

It can also outperform humans on legal examinations – qualifications that would normally require many years of study to attain – and has been shown to influence and persuade people who may have to make life-or-death decisions.

Sam Altman, CEO of OpenAI, the company behind ChatGPT, has already testified to Congress that AI has the ability to cause significant harm and that further development should be rigorously controlled.

Copyright 2023, ConservativeInsider.org