Artificial General Intelligence Is Possible and Deadly

The interesting kind of artificial intelligence is Artificial General Intelligence (AGI). Short of that crucial generality, “AI” amounts to “software.” We have lots of software. It is both useful and dangerous. It has transformed our world for both better and worse and will continue to do so. But whatever we have done with software, it all still relies on us. If our culture collapsed, so would our software stack.

The big implications of AGI begin with what it makes possible: fully autonomous, robotized industrial ecosystems that require no human oversight or labor. With only modest extrapolations of today’s industrial technology, AGI systems could design, run, and manage a closed loop of robotic workers that staff mines and factories and manufacture all of their own industrial components. Imagine relatively cheap AGI-controlled robots driving existing industrial equipment. Many industries already concentrate human labor on the tasks machines cannot yet handle, those requiring judgment, creativity, and flexibility. With AGI, those constraints disappear: industrial computer systems become capable of creative oversight and business strategy, and highly flexible general-purpose industrial robots become far more viable.

Given the potential profits, this kind of robotization may proceed with enthusiastic human help. Some aspects of industrial labor and management may be beyond first-generation AGI robotics, but the combined human-AGI industrial ecosystem will have a strong incentive to improve its designs and build more capable next-generation products. Given how readily computers scale and improve, and the greater degree of optimization that can be applied to artificial intelligence, this could lead to a “superintelligence” able to replace all human labor across the entire industrial ecosystem and outsmart any human.