complexity theory – Is it possible for the runtime and input size in an algorithm to be inversely related?


Well, an algorithm with runtime $O(0)$ fulfills the criterion: it basically does nothing. As soon as your algorithm performs at least one operation during execution, it has a runtime cost $t(n) > 0$.
Since $$t(n)\in O(1/n) \Leftrightarrow \exists c, n_0\ \forall n > n_0:\ t(n) \leq c\cdot\frac{1}{n},$$
an algorithm with non-zero constant runtime $t(n) = c' > 0$ doesn't have runtime $O(1/n)$: the bound $c\cdot\frac{1}{n}$ drops below $c'$ for all $n > c/c'$. This means that, for a runtime measure where every operation costs at least $1$, only the empty algorithm has runtime $O(1/n)$. But if you, for example, declare that an if-statement checking a condition has cost zero, you can build algorithms whose runtime cost is $0$ once a certain input size is reached, e.g.:

def algo(n):
  if n < 100:
    # do something very expensive, e.g. a huge brute-force loop
    sum(i * i for i in range(10**7))

If you declare condition checking a zero-cost operation, this algorithm has runtime $O(0)$, and thus also runtime $O(1/n)$, even though it may do a very expensive operation for the first hundred input values.
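
A minimal sketch of that idea, assuming (hypothetically) a cost model in which only the expensive branch is charged and the condition check is free:

def cost(n):
  # hypothetical cost model: the condition check costs 0,
  # the expensive branch is charged some large constant
  return 10**6 if n < 100 else 0

# cost(n) is 0 for every n >= 100, so cost is in O(0) and hence in O(1/n)
assert all(cost(n) == 0 for n in range(100, 1000))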

Generally, a decreasing runtime bound is rather pointless because, for runtimes that take non-negative integer values, you can always express it as either $O(1)$ or $O(0)$ (e.g. $O(1/n) = O(0)$ and $O(1/n+10) = O(1)$).
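
To spell out the first identity for non-negative integer-valued runtimes (a short derivation using the constants from the definition above):

$$t(n) \leq c\cdot\frac{1}{n} \text{ for all } n > n_0 \quad\Rightarrow\quad t(n) < 1 \text{ for all } n > \max(n_0, c) \quad\Rightarrow\quad t(n) = 0 \text{ for all } n > \max(n_0, c),$$

so $t \in O(1/n)$ implies $t \in O(0)$, and the converse is immediate. Similarly, $t(n) \leq c\cdot\left(\frac{1}{n}+10\right) \leq 11c$ for $n \geq 1$ shows $O(1/n+10) \subseteq O(1)$; the reverse inclusion holds because $\frac{1}{n}+10 \geq 1$.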