linux – Is there a technical reason why rebooting is required to use a DisplayPort-to-VGA adapter?

I manage servers and prefer to use VGA, since graphics quality is unimportant and VGA is ubiquitous on crash carts. I’d prefer to keep a single video cable on each cart, and I can think of no more common connector.

Occasionally I find a server with no VGA or DVI output, just DisplayPort, or perhaps DisplayPort + HDMI.

I have a couple of DisplayPort-to-VGA adapters, but they seem to work only if the computer is rebooted with the adapter already attached. That’s no good: half the time I connect a monitor to a server, it’s because something has gone wrong and I need to investigate the current state of the system. Rebooting erases that state and some of the evidence of the problem.

I suppose I could leave one of these adapters plugged into each server, which would let me walk up with a VGA monitor and plug in. But having a bunch of dongles hanging out of every machine is impractical as well.

Is there a technical reason a fresh reboot is required to get VGA output through a DisplayPort-to-VGA adapter?

The operating systems tend to be Linux distros. I’m wondering if the “thing” that the TTYs attach to simply does not exist until the DisplayPort-to-VGA adapter is inserted, and the TTYs start at boot time rather than dynamically. If so, that would tell me I need to write a udev rule or something to detect insertion of the DisplayPort adapter and spawn a TTY on it. Though that would probably fall short of displaying printk output and such, much like using a USB-to-VGA “adapter”.
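For what it’s worth, the udev approach could be sketched roughly as below. This is only an illustration, not a tested fix: the rule file path, the script name `dp-hotplug.sh`, and the connector name `card0-DP-1` are all assumptions (real connector names show up under `/sys/class/drm/`). DRM connector hotplug arrives as a `change` event on the `drm` subsystem, and a connector’s `status` attribute can be read (and, as root, written with `on`/`off`/`detect` to force its state).

```shell
#!/bin/sh
# Hypothetical /etc/udev/rules.d/99-dp-hotplug.rules entry that would
# invoke this script on DRM connector hotplug events:
#   SUBSYSTEM=="drm", ACTION=="change", RUN+="/usr/local/sbin/dp-hotplug.sh"

# List each DRM connector and its current status ("connected",
# "disconnected", or "unknown"). The sysfs root is a parameter so the
# function can be exercised against a fake tree.
list_drm_connectors() {
    root="${1:-/sys/class/drm}"
    for conn in "$root"/card*-*/status; do
        [ -e "$conn" ] || continue
        printf '%s: %s\n' "${conn%/status}" "$(cat "$conn")"
    done
}

# From here a handler could force a connector on even when hotplug
# detection failed, e.g. (as root, connector name is an assumption):
#   echo on > /sys/class/drm/card0-DP-1/status
```

Even if this works, it only gets a getty onto the display; as noted above, early boot messages and printk output would still be missing.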

But I didn’t think these little DP-to-VGA adapters had a frame buffer of their own, the way the USB ones do.

My ideal solution would be some sort of kernel parameter, set in GRUB, that would pre-initialize the DP port to serve as a VGA output. I tried this one to no avail: