Video cards do two things.
1) Sense monitor presence via impedance. If the electrical load
at the end of the line "glitches", the video card assumes the
monitor has been unplugged and disables the video output. When
it sees the load impedance again, it can re-enable the video
output (especially if that was the only monitor on the card).
A sketch of this detect loop follows the list.
2) The OS and video card query the monitor via the serial DDC bus
on the connector. That allows reading the monitor's EDID EEPROM
to see which resolutions it supports. Historically, the video
driver would not drive beyond a safe default resolution unless it
first obtained information saying the monitor actually supported
more. More than 20 years ago, you could damage a monitor by using
too high a resolution or refresh rate. A sketch of EDID parsing
also follows the list.
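
For the load sensing in 1), here is a minimal sketch of the detect
loop. The helper names (load_present, enable_output, disable_output)
are made up for illustration, and the "hardware" is simulated so the
sketch runs; a real card does this in firmware against its own
registers.

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical hardware hooks: illustrative stand-ins for whatever
 * the card actually uses to sense the monitor's termination load and
 * to gate the output. Simulated here: the "load" disappears for a
 * few polls, as it would during a KVM switch. */
static int tick = 0;

static bool load_present(void)
{
    tick++;
    return !(tick >= 5 && tick <= 8);   /* simulated glitch on polls 5..8 */
}

static void enable_output(void)  { printf("poll %2d: output ENABLED\n", tick); }
static void disable_output(void) { printf("poll %2d: output DISABLED\n", tick); }

int main(void)
{
    bool output_on = true;  /* assume a monitor was attached at start */

    /* Poll the load-sense line and gate the video output accordingly. */
    for (int i = 0; i < 15; i++) {
        bool present = load_present();

        if (output_on && !present) {
            disable_output();   /* load vanished: treat as unplugged */
            output_on = false;
        } else if (!output_on && present) {
            enable_output();    /* load returned: bring the output back */
            output_on = true;
        }
    }
    return 0;
}
```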
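
For the DDC/EDID query in 2), here is a sketch that decodes the
"standard timing" slots of an EDID 1.3 base block, which is how the
driver learns which modes are safe. The byte offsets follow the
published EDID layout; the sample block is hand-built for
illustration rather than read over a real DDC bus.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Decode the 8 "standard timing" slots of an EDID 1.3 base block
 * (bytes 0x26..0x35). Each slot is 2 bytes:
 *   byte0: horizontal active = (byte0 + 31) * 8 pixels
 *   byte1: bits 7..6 = aspect ratio, bits 5..0 = refresh - 60 Hz
 * An unused slot reads 0x01 0x01. */
static void print_standard_timings(const uint8_t edid[128])
{
    static const int aspect_w[4] = {16, 4, 5, 16};  /* 16:10 4:3 5:4 16:9 */
    static const int aspect_h[4] = {10, 3, 4, 9};

    for (int i = 0; i < 8; i++) {
        uint8_t b0 = edid[0x26 + 2 * i];
        uint8_t b1 = edid[0x26 + 2 * i + 1];
        if (b0 == 0x01 && b1 == 0x01)
            continue;                       /* slot unused */

        int h  = (b0 + 31) * 8;
        int a  = (b1 >> 6) & 0x3;
        int v  = h * aspect_h[a] / aspect_w[a];
        int hz = (b1 & 0x3F) + 60;
        printf("%dx%d @ %d Hz\n", h, v, hz);
    }
}

int main(void)
{
    /* Hand-made sample block (not from real hardware): header plus
     * two standard timings, 1280x1024 @ 60 and 1024x768 @ 75. */
    uint8_t edid[128];
    memset(edid, 0x01, sizeof edid);        /* mark all slots unused */
    static const uint8_t header[8] =
        {0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00};
    memcpy(edid, header, sizeof header);

    edid[0x26] = 1280 / 8 - 31;             /* 0x81 -> 1280 pixels wide */
    edid[0x27] = (2 << 6) | (60 - 60);      /* 5:4 aspect, 60 Hz */
    edid[0x28] = 1024 / 8 - 31;             /* 0x61 -> 1024 pixels wide */
    edid[0x29] = (1 << 6) | (75 - 60);      /* 4:3 aspect, 75 Hz */

    print_standard_timings(edid);
    return 0;
}
```

If the driver never manages to read a block like this (say, because a
KVM disconnected the DDC lines), it falls back to a conservative
default mode.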
While a KVM could be designed to be completely seamless, by faking
all the necessary responses (presenting a constant load and an
emulated EDID), it is simply cheaper not to do that: let the
connection glitch, and rely on the automated responses above to
return things to normal.