If you want a rigorous proof, the following lemma is often handier than the definitions.
If $c = \lim_{n\to\infty} \frac{f(n)}{g(n)}$ exists, then
 $c = 0 \qquad\,\iff f \in o(g)$,
 $c \in (0,\infty) \iff f \in \Theta(g)$ and
 $c = \infty \quad\;\,\iff f \in \omega(g)$.
With this, you should be able to order most of the functions coming up in algorithm analysis¹. As an exercise, prove it!
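If you want to double-check such a classification mechanically, a computer algebra system can evaluate the limit for you. A minimal sketch using Python's sympy (the helper `compare` is mine, not a library function):

```python
import sympy as sp

n = sp.symbols("n", positive=True)

def compare(f, g):
    """Classify f against g via the limit lemma."""
    c = sp.limit(f / g, n, sp.oo)
    if c == 0:
        return "f in o(g)"
    if c == sp.oo:
        return "f in omega(g)"
    return "f in Theta(g)"  # limit exists in (0, oo)

print(compare(n * sp.log(n), n**2))  # n log n is dominated by n^2
print(compare(3 * n**2 + n, n**2))   # limit is 3, so same order
print(compare(2**n, n**10))          # exponential dominates polynomial
```

Of course this only applies when the limit exists; for oscillating quotients see the limsup/liminf characterisation below.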
Of course, you have to be able to calculate the respective limits. Some useful tricks for breaking complicated functions down into basic ones are:

Express both functions as $e^{\dots}$ and compare the exponents; if their ratio tends to $0$ or $\infty$, so does the original quotient.
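For instance, to compare $n^{\log n}$ with $2^n$ (my example; $\log$ denotes the natural logarithm here):
$\qquad \displaystyle n^{\log n} = e^{(\log n)^2}, \quad 2^n = e^{n \log 2}, \quad \frac{(\log n)^2}{n \log 2} \to 0$,
so $n^{\log n}/2^n \to 0$, i.e. $n^{\log n} \in o(2^n)$.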

More generally: if you have a convex, continuously differentiable and strictly increasing function $h$ so that you can rewrite your quotient as
$\qquad \displaystyle \frac{f(n)}{g(n)} = \frac{h(f^*(n))}{h(g^*(n))}$,
with $g^* \in \Omega(1)$ and
$\qquad \displaystyle \lim_{n \to \infty} \frac{f^*(n)}{g^*(n)} = \infty$,
then
$\qquad \displaystyle \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty$.
See here for a rigorous proof of this rule (in German).
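For illustration (my example), take $h(x) = 2^x$, which is convex, continuously differentiable and strictly increasing, and write $f(n) = 4^n = h(2n)$ and $g(n) = 2^{\sqrt{n}} = h(\sqrt{n})$. Then $g^* = \sqrt{n} \in \Omega(1)$ and
$\qquad \displaystyle \lim_{n \to \infty} \frac{2n}{\sqrt{n}} = \lim_{n \to \infty} 2\sqrt{n} = \infty$,
so the rule yields $4^n / 2^{\sqrt{n}} \to \infty$, i.e. $4^n \in \omega(2^{\sqrt{n}})$.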

Consider continuous extensions of your functions to the reals. You can then use L'Hôpital's rule; be mindful of its conditions²!
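As a sketch (my example): $\log x / \sqrt{x}$ is an $\frac{\infty}{\infty}$ form, so L'Hôpital's rule lets us pass to the quotient of derivatives, which sympy can compute symbolically:

```python
import sympy as sp

x = sp.symbols("x", positive=True)
f, g = sp.log(x), sp.sqrt(x)

# Both f and g tend to infinity, so L'Hopital's rule applies:
# the limit of f/g equals the limit of f'/g' (if the latter exists).
ratio = sp.simplify(sp.diff(f, x) / sp.diff(g, x))  # 2/sqrt(x)
print(ratio)
print(sp.limit(ratio, x, sp.oo))  # 0, hence log x in o(sqrt x)
```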

Have a look at the discrete equivalent, Stolz–Cesàro.
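For instance (my example), with $a_n = 1 + 2 + \dots + n$ and $b_n = n^2$ (strictly increasing and unbounded), Stolz–Cesàro gives
$\qquad \displaystyle \lim_{n \to \infty} \frac{a_{n+1} - a_n}{b_{n+1} - b_n} = \lim_{n \to \infty} \frac{n+1}{2n+1} = \frac{1}{2}$,
hence $a_n / n^2 \to 1/2$ and $1 + 2 + \dots + n \in \Theta(n^2)$.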

When factorials pop up, use Stirling’s formula:
$\qquad \displaystyle n! \sim \sqrt{2 \pi n} \left(\frac{n}{e}\right)^n.$
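A quick numeric check of the approximation (my script; the relative error of Stirling's formula is known to behave like $\frac{1}{12n}$):

```python
import math

def stirling(n: int) -> float:
    """Stirling's approximation to n!."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 20, 50):
    # The ratio n! / stirling(n) tends to 1 as n grows.
    print(n, math.factorial(n) / stirling(n))
```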
It is also useful to keep a pool of basic relations you prove once and use often, such as:

logarithms grow slower than polynomials, i.e.
$\qquad\displaystyle (\log n)^\alpha \in o(n^\beta)$ for all $\alpha, \beta > 0$.

order of polynomials:
$\qquad\displaystyle n^\alpha \in o(n^\beta)$ for all $\alpha < \beta$.

polynomials grow slower than exponentials:
$\qquad\displaystyle n^\alpha \in o(c^n)$ for all $\alpha$ and $c > 1$.
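These relations can be spot-checked for concrete parameters with sympy (spot checks only, of course, not a proof):

```python
import sympy as sp

n = sp.symbols("n", positive=True)

# (log n)^alpha in o(n^beta): try alpha = 3, beta = 1/2
print(sp.limit(sp.log(n)**3 / sp.sqrt(n), n, sp.oo))          # 0
# n^alpha in o(n^beta) for alpha < beta: try alpha = 2, beta = 3
print(sp.limit(n**2 / n**3, n, sp.oo))                        # 0
# n^alpha in o(c^n) for c > 1: try alpha = 100, c = 101/100
print(sp.limit(n**100 / sp.Rational(101, 100)**n, n, sp.oo))  # 0
```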
It can happen that the above lemma is not applicable because the limit does not exist (e.g. when the functions oscillate). In this case, consider the following characterisation of the Landau classes via limit superior/inferior:
With $c_s := \limsup_{n \to \infty} \frac{f(n)}{g(n)}$ we have
 $0 \leq c_s < \infty \iff f \in O(g)$ and
 $c_s = 0 \iff f \in o(g)$.
With $c_i := \liminf_{n \to \infty} \frac{f(n)}{g(n)}$ we have
 $0 < c_i \leq \infty \iff f \in \Omega(g)$ and
 $c_i = \infty \iff f \in \omega(g)$.
Furthermore,
 $0 < c_i, c_s < \infty \iff f \in \Theta(g) \iff g \in \Theta(f)$ and
 $c_i = c_s = 1 \iff f \sim g$.
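For example (my construction), take $f(n) = (2 + (-1)^n)\,n$ and $g(n) = n$. Then $f(n)/g(n) = 2 + (-1)^n$ oscillates, so the plain limit does not exist, but $c_i = 1$ and $c_s = 3$; since $0 < c_i, c_s < \infty$, the characterisation still yields $f \in \Theta(g)$.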
Check here and here if you are confused by my notation.
¹ Nota bene: My colleague wrote a Mathematica function that does this successfully for many functions, so the lemma really reduces the task to mechanical computation.
² See also here.