Does multiplication increase entropy?

The Shannon entropy of a number $ k $ in binary digits is defined as

$$ H = -\log\left(\frac{a}{l}\right)\cdot\frac{a}{l} - \log\left(1-\frac{a}{l}\right)\cdot\left(1-\frac{a}{l}\right) $$

where $ l = \lfloor \log_2(k) \rfloor + 1 $ is the number of binary digits of $ k $ and $ a $ is the number of $ 1 $s in the binary expansion of $ k $.
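In plain Python this entropy can be computed directly from the bit string of $k$ (a minimal sketch; the helper name `binary_entropy` is my own):

```python
from math import log2

def binary_entropy(k):
    """H = -p*log2(p) - (1-p)*log2(1-p), where p = a/l,
    l = number of binary digits of k, a = number of 1-bits."""
    bits = bin(k)[2:]          # binary expansion of k
    l = len(bits)              # number of binary digits
    a = bits.count("1")        # number of 1-bits
    p = a / l
    if p in (0.0, 1.0):        # degenerate cases: all bits equal, entropy 0
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(binary_entropy(0b1010))  # p = 1/2, so H = 1.0
```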

So let us regard the number $ k $ as a "random variable".

Suppose that $ n, m $ are chosen uniformly at random from the interval $ 1 \le n, m \le 2^{500} $.

Hypothesis 1):

$ H_{m \cdot n} $ is then "significantly" larger than $ H_n $.

Hypothesis 2):

$ H_{m + n} $ is then not "significantly" larger than $ H_n $.

Here is an empirical statistical test which suggests that multiplication increases the entropy, but addition does not:

```
from collections import Counter

def entropyOfCounter(c):
    # Total count S, then the empirical distribution over the keys.
    S = 0
    for k in c.keys():
        S += c[k]
    prob = []
    for k in c.keys():
        prob.append(c[k] / S)
    H = -sum([p * log(p, 2) for p in prob]).n()
    return H

def HH(l):
    return entropyOfCounter(Counter(l))

N = 10^4
MX = []
MP = []
for k in range(N):
    n = randint(1, 2^500)
    m = randint(1, 2^500)
    Hn = HH(Integer(n).digits(2))
    Hm = HH(Integer(m).digits(2))
    MX.append(HH(Integer(n * m).digits(2)) - Hn)
    MP.append(HH(Integer(n + m).digits(2)) - Hn)
tX = (mean(MX) / (sqrt(variance(MX)) / sqrt(N))).n()
tP = (mean(MP) / (sqrt(variance(MP)) / sqrt(N))).n()
print(tX, tP)

Output:
31.1839027855549 0.266357305397406
```

The first case (multiplication) significantly increases the entropy; the second case (addition) does not.
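The Sage snippet above does not run outside Sage; here is a plain-Python sketch of the same test (the helper names `H` and `t_statistic` are my own, `random.getrandbits` replaces Sage's `randint`, and a smaller sample size keeps it fast):

```python
import random
from math import log2, sqrt

def H(k):
    # Bit-distribution entropy of k, as defined above.
    bits = bin(k)[2:]
    p = bits.count("1") / len(bits)
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def t_statistic(diffs):
    # One-sample t-statistic for the hypothesis mean(diffs) == 0.
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / (sqrt(var) / sqrt(n))

random.seed(0)
N = 10**3                             # fewer samples than above, for speed
MX, MP = [], []
for _ in range(N):
    n = random.getrandbits(500) | 1   # ensure nonzero
    m = random.getrandbits(500) | 1
    MX.append(H(n * m) - H(n))
    MP.append(H(n + m) - H(n))
print(t_statistic(MX), t_statistic(MP))
```

With this smaller sample the multiplication statistic is still large while the addition statistic stays near zero, consistent with the Sage output above.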

Is there any way to give a heuristic explanation of why this holds in general (if it does), or is this empirical observation misleading?

Related:

https://physics.stackexchange.com/questions/487780/increase-in-entropy-and-integer-factorization-how-much-work-does-one-have-to-do