Let's talk about entropy, specifically the statistical-mechanics interpretation of entropy. Loosely speaking, entropy quantifies how chaotic a system is: the more micro-states attainable by the system (with appreciable probability), the higher the entropy of the system.

For example, rolling a fair die is much more `chaotic' than flipping a heavily weighted coin.

As far as I am aware, Boltzmann entropy and Shannon entropy are defined similarly (differing only by a multiplicative constant). The Shannon entropy of a system described by a random variable $X$ with mass function $p$ (resp. density $p$) is:

$$H(X):=-\sum_{i=1}^n p(x_i)\log(p(x_i)) \quad \text{or, in the continuous case,} \quad H(X):=-\int p(x)\log p(x)\,dx.$$
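For concreteness, here is a small sketch (my own illustrative code, nothing more) computing the discrete formula above for the die-vs-coin example:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i log p_i, using the natural log
    (Boltzmann entropy would just multiply this by the constant k_B)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

fair_die = [1/6] * 6            # six equally likely micro-states
weighted_coin = [0.99, 0.01]    # one outcome dominates

print(shannon_entropy(fair_die))       # log 6, about 1.79
print(shannon_entropy(weighted_coin))  # about 0.056
```

As expected, the fair die (many equally likely micro-states) has much higher entropy than the heavily weighted coin.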

For irreversible processes, the second law of thermodynamics says entropy always increases. Consider the dispersion of heat: let $W_t,~t\in(0,\infty)$ be a Brownian motion. The density $p_t$ of this stochastic process solves the heat equation

$$ \frac{\partial p_t}{\partial t}=\frac{1}{2}\Delta p_t $$

and the entropy of the system at time $t$ is defined as $-\int p_t(x) \log p_t(x)\, dx$.
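One can check this monotonicity numerically (again, just an illustrative sketch of mine): the density of $W_t$ is the $N(0,t)$ Gaussian, whose differential entropy has the closed form $\frac{1}{2}\log(2\pi e t)$, visibly increasing in $t$.

```python
import math

def gaussian_density(x, t):
    # density of W_t ~ N(0, t)
    return math.exp(-x * x / (2 * t)) / math.sqrt(2 * math.pi * t)

def differential_entropy(t, x_max=30.0, n=20001):
    # Riemann-sum approximation of -∫ p_t log p_t dx on [-x_max, x_max]
    dx = 2 * x_max / (n - 1)
    total = 0.0
    for i in range(n):
        p = gaussian_density(-x_max + i * dx, t)
        if p > 0:
            total += p * math.log(p) * dx
    return -total

for t in (1.0, 2.0, 4.0):
    # numerical value vs. the closed form 0.5 * log(2*pi*e*t)
    print(t, differential_entropy(t), 0.5 * math.log(2 * math.pi * math.e * t))
```

The numerical values agree with the closed form and increase with $t$, matching the second-law intuition for heat dispersion.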

My $\textbf{Question:}$ Consider a general stochastic process $Y_t$,

$$dY_t:= u(Y_t)dW_t$$

(say for a convex function $u$). What justification do people have for calling the following quantity the entropy of this process:

$$ -\int u(p_t)\,dx? $$

My thoughts: this so-called `entropy' has nothing to do with Boltzmann/Shannon entropy; it is just a name for a quantity of the system that should behave similarly to the usual definition of entropy (i.e. increase as time increases).
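To make my guess concrete, here is a quick numerical sketch (my own check, under the assumption that $p_t$ is the heat-flow density of a Brownian motion): for the convex choice $u(p)=p^2$, the quantity $-\int u(p_t)\,dx$ does indeed increase in $t$. Note also that $u(p)=p\log p$ is convex and recovers Shannon entropy exactly, which may be part of the justification for the name.

```python
import math

def gaussian_density(x, t):
    # density of W_t ~ N(0, t), the solution of the heat flow started at a point
    return math.exp(-x * x / (2 * t)) / math.sqrt(2 * math.pi * t)

def neg_u_integral(t, u, x_max=30.0, n=20001):
    # Riemann-sum approximation of -∫ u(p_t(x)) dx on [-x_max, x_max]
    dx = 2 * x_max / (n - 1)
    return -sum(u(gaussian_density(-x_max + i * dx, t)) * dx for i in range(n))

u = lambda p: p * p  # a convex "entropy" function; p log p would give Shannon

vals = [neg_u_integral(t, u) for t in (1.0, 2.0, 4.0)]
print(vals)  # increasing in t, consistent with entropy-like behaviour
```

(For $N(0,t)$ one can also compute $\int p_t^2\,dx = \frac{1}{2\sqrt{\pi t}}$ in closed form, which is decreasing in $t$, so its negative increases.)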