Artificial Superintelligence, AI in a Box & Machine Consciousness With Nick Bostrom
Artificial Superintelligence, or digital superintelligence, would be capable of challenging human intelligence, which itself formed through a long evolutionary process. But if humanity survives the extinction-level threats posed by its own activities, experts anticipate a golden era ahead. Experts commonly believe that the transition from Artificial General Intelligence (AGI) to Artificial Superintelligence (ASI) will take less time than the current transition from narrow AI to AGI. We may have fewer chances than we think to get it right.
World-famous Oxford University professor and philosopher Nick Bostrom has genuine concerns about artificial superintelligence and believes ASI poses a serious existential threat to humanity. His concern centers on the possibility that superhuman AI could emerge very shortly after the first human-level AI, giving it rapid access to far greater resources for information, processing, and computation than humans possess. One proposed safeguard is a set of ethical AI principles grounded in human values. Current trends in intelligent technology raise many important questions, and in this video the creator tries to address them. Please watch the video to the end.