optimization – What would be the gradient function

Suppose we have the function $f(x) = \frac{1}{2}x^TQx - x^Tb$. What would be $\nabla f(x)$?

I think I am just getting confused about having both $x^T$ and $x$. My intuitive answer would be $\nabla f(x) = \frac{1}{2}Qx - b$, but I'm not sure whether that is correct. It seems like it should be, since $x^T$ and $x$ represent the same vector.
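If it helps, here is a quick finite-difference sanity check I sketched (my assumptions: $Q$ is symmetric, and the sign in $f$ is a minus). It compares my candidate gradient $\frac{1}{2}Qx - b$ against a numerical gradient of $f$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
Q = A + A.T                      # symmetric Q (assumption)
b = rng.standard_normal(n)
x = rng.standard_normal(n)

def f(x):
    # f(x) = (1/2) x^T Q x - x^T b
    return 0.5 * x @ Q @ x - x @ b

# central finite differences along each coordinate direction
eps = 1e-6
num_grad = np.array([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(n)
])

candidate = 0.5 * Q @ x - b      # my intuitive answer
print(np.allclose(num_grad, candidate, atol=1e-5))
```

This prints whether the candidate matches the numerical gradient, which should settle the question for a concrete example even before working out the algebra.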