computability – Does the term “continuity” have a different meaning in maths and in CS?

I ask this question because of some statements in the question “What is the ‘continuity’ as a term in computable analysis?” making me suspicious.

I’m an engineer, not a computer scientist, so when I think about algebraic operations performed by devices, I have logic gates rather than Turing machines in mind.

I read the answer to the question “Why are computable functions continuous?” and understood it the following way:

Because the device’s input is of infinite length (a decimal number with an infinite number of digits after the decimal point), the device (e.g. Turing machine or computer) cannot read the entire number before writing the $n$-th digit of output.

Instead, the device can only have read $m(n)$ digits of the input when it writes the $n$-th digit of output.

If the first $n$ digits of the output of some function only depend on the first $m(n)$ digits of the input, the function is continuous.
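To make the dependence $m(n)$ concrete, here is a small Python sketch (my own illustration, not taken from the linked answers): long division by 2 on a stream of fraction digits, where output digit $n$ depends only on the first $n$ input digits, i.e. $m(n)=n$.

```python
def halve_stream(digits):
    # Long division by 2 on a stream of decimal fraction digits.
    # Output digit n depends only on input digits 1..n, i.e. m(n) = n,
    # so the device never has to see the whole (possibly infinite) input.
    carry = 0
    for d in digits:
        carry = carry * 10 + int(d)
        yield str(carry // 2)
        carry %= 2

# Fraction digits of 0.5 -> fraction digits of 0.25
print("".join(halve_stream("5000")))  # 2500
```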

However, if I understand this argument correctly, the word “continuous” in computability theory does not mean the same thing as “continuous” in mathematics:

Rounding towards zero would only require reading the input up to the decimal point (so $m(n)=\text{const.}$); however, the mathematical function being computed is not “continuous” according to the mathematical definition of that term.
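As a sketch (assuming the input arrives as a character stream of its decimal expansion), rounding toward zero only ever consumes the stream up to the decimal point:

```python
from itertools import takewhile

def round_toward_zero(stream):
    # stream yields characters of a decimal expansion, e.g. "-2", ".", "7", ...
    # Only the part before the decimal point is consumed: m(n) is constant,
    # yet the resulting function x -> trunc(x) is not continuous at the integers.
    return int("".join(takewhile(lambda c: c != ".", stream)))

print(round_toward_zero(iter("3.14159")))   # 3
print(round_toward_zero(iter("-2.71828")))  # -2
```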

We could also perform a digit-wise operation ($m(n)=n$) that exchanges certain digits after the decimal point; for example, replacing all 4s by 9s and all 9s by 4s. As far as I understand, the resulting function is not continuous on any interval of $\mathbb{R}$ (it would, however, be right-continuous on $(0,\infty)$ and left-continuous on $(-\infty,0)$).
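A sketch of that digit-wise map, operating on the decimal expansion as a character stream (the helper name is my own):

```python
def swap_4_9(stream):
    # Digit-wise map: output digit n depends only on input digit n, so m(n) = n.
    table = {"4": "9", "9": "4"}
    for c in stream:
        yield table.get(c, c)  # sign, decimal point and other digits pass through

print("".join(swap_4_9("0.4912")))  # 0.9412
```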

And, if I didn’t make a conceptual mistake: if we use a balanced numeral system (like a Russian computer in the 1960s did) instead of the decimal system, a similar algorithm (exchanging 0s and 1s instead of 4s and 9s) would even represent a mathematical function that is not even directionally continuous on any interval of $\mathbb{R}$.
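For the balanced case, here is a sketch using balanced ternary with the digits written as 'T' ($-1$), '0' and '1' (the notation and both helpers are my own assumptions, just to make the digit swap concrete):

```python
def bt_value(digits):
    # Value of the fractional balanced-ternary expansion 0.d1 d2 d3 ...
    val = {"T": -1, "0": 0, "1": 1}
    return sum(val[d] * 3 ** -(i + 1) for i, d in enumerate(digits))

def swap_0_1(digits):
    # Digit-wise map exchanging 0s and 1s ('T' passes through), m(n) = n.
    table = {"0": "1", "1": "0"}
    return "".join(table.get(d, d) for d in digits)

x = "10T1"  # 0.10T1 in balanced ternary
print(bt_value(x), "->", bt_value(swap_0_1(x)))
```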


Does computability depend on the numeral system being used (as the example with the balanced numeral system suggests), or does the term “computable” already assume a certain numeral system?

Is the observation correct that the term “continuous” does not have the same meaning in maths and CS?