How does time-dependent camera rotation with a mouse work?

The commonly used equation for camera rotation with a mouse does not involve time. This makes sense: at higher frame rates the per-frame change in mouse position is smaller, and vice versa, so it all evens out. If time slows down or speeds up, however, camera rotation from the mouse does not adjust accordingly. Just as movement is slower when time is slowed, logically I also want rotation to be slower.
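For reference, the kind of time-independent equation I mean looks roughly like this (a minimal sketch; `sensitivity` is an arbitrary tuning constant and the function names are my own):

```python
def rotate_camera(yaw, pitch, mouse_dx, mouse_dy, sensitivity=0.1):
    """Standard mouse look: the per-frame mouse delta already shrinks as
    the frame rate rises, so no delta-time term appears anywhere."""
    yaw += mouse_dx * sensitivity
    pitch += mouse_dy * sensitivity
    # Clamp pitch so the camera can't flip over the top.
    pitch = max(-89.0, min(89.0, pitch))
    return yaw, pitch

print(rotate_camera(0.0, 0.0, 10.0, 5.0))
```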

One option is to multiply the mouse's change in position by the same multiplier I'm using on time, but shouldn't it be possible to have change-in-rotation and change-in-time in the same equation, independent of frame rate?
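To illustrate the multiplier option, here is a rough sketch of what I have in mind (`time_scale` is a hypothetical name for the same factor applied to the game clock, e.g. 0.5 during slow motion):

```python
def rotate_camera_scaled(yaw, pitch, mouse_dx, mouse_dy,
                         sensitivity=0.1, time_scale=1.0):
    """Mouse look with the mouse delta scaled by the current time multiplier,
    so rotation slows down or speeds up together with game time."""
    yaw += mouse_dx * sensitivity * time_scale
    pitch += mouse_dy * sensitivity * time_scale
    # Clamp pitch so the camera can't flip over the top.
    pitch = max(-89.0, min(89.0, pitch))
    return yaw, pitch

# At half speed, the same mouse movement rotates half as far.
print(rotate_camera_scaled(0.0, 0.0, 10.0, 5.0, time_scale=0.5))
```

This works, but it treats the time multiplier as an extra input rather than deriving rotation from elapsed time itself, which is what my question is about.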