A video card sends out a video signal at some resolution and refresh rate.
A monitor can display images at various resolutions and refresh rates.
When the incoming signal's resolution does not match the panel's native resolution, the scaling is normally handled by the monitor's internal scaler.
Exceptions:
Video driver software on the computer can often query the monitor's supported resolutions and perform the scaling itself: a scaling algorithm in the driver converts the low-resolution image into the monitor's native resolution, and that native-resolution signal is then sent to the monitor.
Game software can often also determine the supported resolutions, and can scale its own image from its internal rendering resolution to one of the supported output resolutions. In this case the scaling algorithm runs inside the game, and both the driver and the monitor receive a native-resolution image. (To my knowledge, this is rarely done except on video game consoles, where many games render at a lower resolution and scale the result up to fill the full display resolution. PS2 games, in particular, were notorious for rendering at resolutions like 512x384 and then scaling up to full display size.)
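Whether it runs in the monitor, the driver, or the game, the core operation is the same: resample a source pixel grid to a larger target grid. A minimal sketch of the simplest such algorithm, nearest-neighbor scaling, is below (the function name and the flat row-major list representation are my own choices for illustration, not any particular driver's API):

```python
def scale_nearest(src, src_w, src_h, dst_w, dst_h):
    """Scale a row-major pixel buffer to a new size with
    nearest-neighbor sampling: each destination pixel copies
    the closest source pixel, with no interpolation."""
    dst = [0] * (dst_w * dst_h)
    for y in range(dst_h):
        sy = y * src_h // dst_h          # source row for this output row
        for x in range(dst_w):
            sx = x * src_w // dst_w      # source column for this output column
            dst[y * dst_w + x] = src[sy * src_w + sx]
    return dst

# Example: upscale a 2x2 image to 4x4.
# Each source pixel simply becomes a 2x2 block in the output.
src = [1, 2,
       3, 4]
out = scale_nearest(src, 2, 2, 4, 4)
# out == [1, 1, 2, 2,
#         1, 1, 2, 2,
#         3, 3, 4, 4,
#         3, 3, 4, 4]
```

Real scalers usually use bilinear or more sophisticated filtering to avoid the blocky look of nearest-neighbor, but the structure (map each output pixel back to a source coordinate, then sample) is the same.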