Siri, write this down immediately. "A movie where robots are performing reverse-Turing tests on humans to see if we're smart enough to be considered sentient...
We have no empirical metrics for what consciousness even is. It's a completely emergent property, and it's a long-running philosophical debate. People also disagree on whether animals are actually conscious. So if people don't even agree that their dog is conscious, their ability to decide whether an algorithm is would be questionable.
The weirdest problem we're going to have is that AI could get really good at faking consciousness. It could merely be mimicking consciousness based on what it has learned consciousness should look like, while really just regurgitating information.
So how do we tell the difference between “real” consciousness and mimicry?