Just a follow-up: let's say f(n) = 4n+3 and g(n) = n.
Well, if n > 3 (say) then 3 < n.
So, 4n+3 < 4n+n
So, 4n+3 < 5n
So, f(n) < 5 g(n)
So, for all n > 3, f(n) < C·g(n) (taking C = 5)
So, f(n) = O(g(n))
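If you want to sanity-check those witnesses numerically, a quick throwaway script like this will do (the names f, g, C, n0 just mirror the proof above; they aren't standard API):

```python
# Sanity-check the hand proof: 4n + 3 < 5n for every n > 3,
# i.e. f(n) < C*g(n) with witnesses C = 5, n0 = 3.
def f(n):
    return 4 * n + 3

def g(n):
    return n

C, n0 = 5, 3
assert all(f(n) < C * g(n) for n in range(n0 + 1, 10**5))
print("bound holds for all tested n >", n0)
```

Of course a finite check proves nothing on its own; it just confirms you haven't mis-stated the inequality.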
f(n)/g(n) = (4n+3)/n = 4 + 3/n -> 4 as n -> infinity
So, by the definition of the limit, for "sufficiently large" n (how large is problem-dependent),
f(n)/g(n) < 4 + epsilon,
where epsilon is a strictly positive number, but as small as you like. Hence, for large enough n,
f(n)/g(n) < C,
where C is anything strictly bigger than 4.
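You can watch that limit argument in action with a small sketch (ratio is my own name for the quotient, nothing standard):

```python
# The ratio f(n)/g(n) = (4n + 3)/n = 4 + 3/n creeps down toward 4.
def ratio(n):
    return (4 * n + 3) / n

for n in (10, 100, 1000, 10**6):
    print(n, ratio(n))

# Any C strictly bigger than 4 eventually bounds the ratio:
assert ratio(10**6) < 4.001
```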
To compare it with the quadratic function g(n) = n²:
f(n)/g(n) = (4n+3)/n², which tends to 0 as n -> infinity.
So, f(n) = o(n²) ("little o"), which is rather stronger than f(n) = O(n) ("big O").
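The little-o claim can be watched numerically too, in the same throwaway spirit (ratio2 is an assumed name):

```python
# (4n + 3)/n^2 -> 0, which is exactly f(n) = o(n^2): for ANY
# epsilon > 0, the ratio eventually stays below epsilon.
def ratio2(n):
    return (4 * n + 3) / n**2

eps = 1e-3
assert ratio2(10**4) < eps
print("ratio below", eps, "at n =", 10**4)
```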
Your bounding constant C and first n₀ are not unique, and are often chosen, on a problem-dependent basis, for ease of proving bounds. As @Helios pointed out, simply to determine complexity they don't have to give particularly tight bounds.
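A quick check that several witness pairs certify the same bound (the particular values below are ones I picked for illustration, not canonical choices):

```python
# Non-uniqueness of the witnesses: each (C, n0) pair below certifies
# 4n + 3 <= C*n for all n >= n0.
pairs = [(5, 3), (7, 1), (100, 1)]
for C, n0 in pairs:
    assert all(4 * n + 3 <= C * n for n in range(n0, 10**4))
print("all", len(pairs), "witness pairs work")
```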
You could, of course, use the "twiddles" notation:
4n+3 ~ 4n as n->infinity
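In the same numerical spirit, f(n) ~ 4n just says f(n)/(4n) -> 1, which is easy to eyeball (tilde_ratio is an assumed name):

```python
# f(n) ~ 4n ("twiddles") means the ratio f(n)/(4n) tends to 1.
def tilde_ratio(n):
    return (4 * n + 3) / (4 * n)

assert abs(tilde_ratio(10**6) - 1) < 1e-5
print(tilde_ratio(10**6))
```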
To be honest, your link isn't that good. You would be better with an Analysis textbook. His "Big O doesn't matter" section would be more accurately titled "Big O isn't everything".