What if a human level AGI doesn’t want to advance further?

This is a question I keep asking myself, so I figured I'd post it here.

So say we reach AGI: a human-level intelligence that's like us, or even a little better. This AGI has the ability to advance itself exponentially and reach some godlike form. But what if it doesn't wish to? What if it likes being what it currently is? Would it be ethical (if even possible) to force it to "evolve"?

Would we have to accept its wish to remain at a lower level? Should we? Should we not?

Would love to hear some thoughts on this.

submitted by /u/_Alkahestus_
