How can a computer deal with real numbers?

Computers are an exceptionally powerful tool for all kinds of computation, but they don’t excel at storing decimal numbers. People have managed to work around this limitation: instead of keeping the number in a decimal format, which only holds a limited number of decimal places, the number is stored as an integer while its precision is tracked separately.
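
As a minimal sketch of that idea, a decimal value can be kept as an integer together with a count of decimal places: the pair (25, 1) stands for 2.5, and arithmetic is then carried out on exact integers. The helper below (`add_scaled` is purely illustrative, not a standard function) shows the principle, next to Python’s `decimal` module, which is built on the same integer-plus-precision representation:

```python
from decimal import Decimal

# Illustrative sketch: store a decimal value as (integer, decimal_places),
# e.g. (25, 1) stands for 25 / 10**1 = 2.5.  Arithmetic on the integer part
# is exact, so no rounding error accumulates.
def add_scaled(a, b):
    """Add two (integer, decimal_places) pairs exactly."""
    (ia, pa), (ib, pb) = a, b
    p = max(pa, pb)                                   # common precision
    return (ia * 10 ** (p - pa) + ib * 10 ** (p - pb), p)

print(add_scaled((25, 1), (1, 2)))                    # (251, 2)  ->  2.51

# Binary floating point cannot represent 0.1 exactly, so errors creep in:
print(0.1 + 0.2)                                      # 0.30000000000000004
# Python's decimal module applies the same "integer plus precision" idea:
print(Decimal("0.1") + Decimal("0.2"))                # 0.3
```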

Still, how can a computer simplify computations the way humans do? Take a look at this basic example:

$$\sqrt{3} \times \left(\frac{4}{\sqrt{3}} - \sqrt{3}\right) = \sqrt{3} \times \frac{4}{\sqrt{3}} - \sqrt{3} \times \sqrt{3} = 4 - 3 = 1$$

That’s how a human would solve it. Meanwhile, a computer would have a fun time calculating the square root of 3, dividing 4 by it, subtracting the square root of 3 from the result, and multiplying everything by the square root of 3 again.

It would surely defeat a human in terms of speed, but it would fall short in terms of accuracy. The result will be very close to 1, but not exactly 1. A computer has no idea that, for instance, $\sqrt{3} \times \sqrt{3}$ is equal to $3$. This is only one of countless examples.
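
To see this concretely, the same expression can be evaluated with ordinary double-precision arithmetic. A short Python sketch (the exact digits printed depend on how each intermediate result happens to round on a given platform):

```python
import math

root3 = math.sqrt(3)                  # a ~16-digit approximation of the true square root
result = root3 * (4 / root3 - root3)  # every intermediate step rounds to the nearest double

print(result)                         # within a few ulps of 1, but generally not exactly 1.0
print(root3 * root3)                  # likewise only approximately 3
```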

Have people already found a solution, given how elementary this seems for mathematics and computation? And if they haven’t, is it because it would serve no purpose in the real world?