A welcome unpacking of a fuzzy concept, helping clarify the term "singularity" and other terms that get tossed around without much effort to explain the distinctions. As usual, Max More is at the forefront of human and machine evolution, making difficult ideas understandable. I look forward to reading the upcoming installments in this series.
"It is important to discuss whether we should expect an IE once we have AGI or SAI, and whether we can do anything to maintain AC but avoid IE and a PH."
Yes. And Transhumanism is essentially the answer to that question, which makes Transhumanism a moral mandate insofar as human morality is concerned.
Wait a minute. Gary Marcus doesn't think LLMs are the path to AGI??
;-)
I know. It's hard to tell with Gary. He is very shy and doesn't like to speak his mind. :-)
Unique perspectives. A truly enlightened piece.
An excellent overview that clears up some muddy thinking out there ... thank you! [Totally agree on LLMs]