a program that opens its own executable file, randomly selects a single bit, and flips the bit's value |
I believe that's called a "carcinogen". You'd just be giving the executable cancer.
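For what it's worth, a minimal sketch of such a bit-flipper in Python (the use of sys.argv[0] to locate "its own executable" and the write-back-in-place are my assumptions; an OS may refuse to let a running native binary overwrite itself):

```python
import os
import random
import sys

def flip_random_bit(path):
    """Flip one randomly chosen bit in the file at `path`."""
    with open(path, "rb") as f:
        data = bytearray(f.read())
    if not data:
        return
    byte_index = random.randrange(len(data))   # pick a random byte
    bit_index = random.randrange(8)            # pick a random bit in it
    data[byte_index] ^= 1 << bit_index         # flip that bit
    with open(path, "wb") as f:
        f.write(data)

if __name__ == "__main__":
    # sys.argv[0] is this script's own path; a compiled binary might not be
    # writable while it's running, depending on the OS.
    flip_random_bit(os.path.abspath(sys.argv[0]))
```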
some thoughts can be the result of random interactions within the brain, and this constitutes free will, since it's not determined by the environment |
But the person isn't in control of those interactions, so how is that free will? And the brain is not an isolated system, anyway; it's also part of the environment. Those random interactions are at least partly influenced by external conditions.
Randomness isn't the same as free will. If you take an intelligent entity and add randomness, you haven't made it free, just unpredictable. Free will, assuming it exists, implies an unpredictability beyond pure chance. A random event is uncontrollable, but the "will" part of "free will" means that there is something in control -- the one with the will. It's unpredictable not because it's random, but because its behavior is not dictated by outside influences; hence, it's "free".
if we're ever able to simulate human intelligence, what's the difference between a "simulation" and the real thing? |
That question could be applied to any emulation, and the answer is that perfect emulation makes the distinction entirely artificial. From the inside, a program can't know for certain whether it's running on an emulator or on actual hardware. To an outside observer the difference is obvious: the behavior may be the same, but the origin of that behavior isn't. Is that distinction useful, or even relevant, though?
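To make the "can't know for certain" point concrete, here's a rough sketch of the kind of check a program could attempt (the timing loop, the threshold, and the looks_emulated name are all invented for illustration). A perfect emulator would fake the clock as faithfully as it fakes everything else, so the best such a test can ever produce is a guess:

```python
import time

# Purely illustrative: the threshold below is an assumed baseline, and a
# perfect emulator would fake timing too, so this can never be conclusive.
EXPECTED_UPPER_BOUND = 0.05  # seconds; assumed "fast enough for real hardware"

def looks_emulated() -> bool:
    """Time a fixed busy-loop and flag a suspiciously slow run."""
    start = time.perf_counter()
    total = 0
    for i in range(200_000):
        total += i * i
    elapsed = time.perf_counter() - start
    return elapsed > EXPECTED_UPPER_BOUND

if __name__ == "__main__":
    print("possibly emulated" if looks_emulated() else "no evidence of emulation")
```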
I don't think we'll ever be able to create a strong AI. Not because it's fundamentally impossible (if one hardware-software platform can do it, why can't another?), but because it's a) hard, and b) impractical. Can you imagine what a "universal declaration of synthetic life-form rights" would look like?