How many iterations does the algorithm need to complete for a given input, relative to the size of that input?
For example, counting the elements with even values in a vector of ints takes O(n) time: you iterate once per element, and accessing a vector element takes O(1) time (a property of vectors), so the total is 1 * n = n. Doing the same with a set of ints, where each element is reached through an O(log n) lookup instead of a plain traversal, takes O(n log n): still n iterations, but each access costs O(log n).
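Here is a minimal sketch of the two costs, assuming std::vector and std::set from the standard library; the helper names count_evens and count_present are just made up for illustration:

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <set>
#include <vector>

// O(n): one pass over the vector, O(1) per element access.
std::size_t count_evens(const std::vector<int>& v) {
    return std::count_if(v.begin(), v.end(),
                         [](int x) { return x % 2 == 0; });
}

// O(n log n): n queries, each answered by an O(log n) set lookup.
std::size_t count_present(const std::set<int>& s,
                          const std::vector<int>& queries) {
    std::size_t hits = 0;
    for (int q : queries) {
        if (s.count(q) != 0) {   // O(log n) per lookup in a tree-based set
            ++hits;
        }
    }
    return hits;
}

int main() {
    std::vector<int> v{1, 2, 3, 4, 5, 6};
    std::set<int>    s{2, 4, 6, 8};

    std::cout << count_evens(v) << '\n';       // 3
    std::cout << count_present(s, v) << '\n';  // 3 (2, 4 and 6 are in the set)
}
```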
Accessing a vector element takes O(1) because it always takes the same amount of time, no matter how big the vector is or how far the element you want is from the start: the element's address is computed directly from its index over the vector's contiguous storage.
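And a tiny sketch of that constant-time claim: indexing is just pointer arithmetic on the underlying array, so reading the first and the last element costs the same regardless of the vector's size.

```cpp
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v(1'000'000, 42);

    // Both accesses compute data() + index; neither one walks the other
    // elements, so the cost does not depend on the index or the vector size.
    std::cout << v[0] << ' ' << v[999'999] << '\n';
}
```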