are ints faster than booleans

Why is GLboolean (in OpenGL) an unsigned char?

I thought booleans were supposed to be faster, since logically they take less memory. Anyway, I've talked to a friend, and he said the issue on x86 processors could be due to how the FPUs (the processor units that accelerate floating-point calculations) work. But are there any official answers on this?
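For what it's worth, a quick check of the sizes involved (all of these are implementation-defined except unsigned char, but the comments show typical values on x86/x86-64 compilers):

    #include <iostream>

    int main() {
        // Sizes are implementation-defined (except unsigned char,
        // which is 1 byte by definition); the comments show the
        // usual values on mainstream x86/x86-64 compilers.
        std::cout << "sizeof(bool)          = " << sizeof(bool) << '\n';           // typically 1
        std::cout << "sizeof(unsigned char) = " << sizeof(unsigned char) << '\n';  // always 1
        std::cout << "sizeof(int)           = " << sizeof(int) << '\n';            // typically 4
    }

So a bool usually takes no more memory than a GLboolean; the speed question is really about whether byte-sized accesses are slower than word-sized ones.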
My guess: OpenGL is a C library, meaning the bool data type is not available (C had no built-in boolean type until C99 added _Bool).
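That guess is consistent with the header itself; a typical gl.h contains declarations like these (the exact header varies by platform):

    typedef unsigned char GLboolean;

    #define GL_FALSE 0
    #define GL_TRUE  1

The OpenGL headers predate C99, so the smallest unsigned integer type stands in for a boolean; pinning the type to exactly one byte also gives GLboolean the same size under every compiler, which an implementation-defined bool would not guarantee.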