- I somewhat buy into moral realism, at least on a gut level? (Many things Will MacAskill says, for instance, resonate quite a lot.)
- But I'm also worried about the orthogonality thesis (that an agent's level of intelligence and its final goals can vary independently)
- How can these be reconciled, if at all?
- One trivial way: if a superintelligence takes off and acts before it's had its own time for a Great Reflection, then it could wipe humans out before it realises its mistake, despite the 'fact' that any sufficiently reflective intelligence would eventually converge on the correct moral conclusion.
- I.e. it's just an interaction of timing, instrumental convergence, and capabilities
- So in essence I think this *is* a rejection of the 'strong' orthogonality thesis (any final goal is compatible with any level of intelligence, indefinitely), though it admits a 'weak' version (goals and intelligence can come apart for long enough to be catastrophic)
---
- Moral realism vs orthogonality: B linked this to acausal decision theory?
- And found some blind spots. Need to read up on both!