How we wound up with Linux's kernel mode setting ('KMS')

June 9, 2022

Once upon a time, not quite back when dinosaurs roamed the earth, computer displays were CRTs and fundamentally operated by scanning an electron beam or beams back and forth over the screen. Usually the computer sent the CRT an analog stream of either monochrome or red, green, and blue intensity signals, and the CRT did all the work of scanning the electron beam(s) across the display. Initially, computer CRTs tended to work only at a single combination of these signal frequencies, the horizontal and vertical scan rates. As the PC revolution got under way, good CRTs became multisync, meaning that they could work at a variety of scan rates. This was especially important for coping with the wide variety of resolutions and scan rates used and supported by various PC graphics systems.

In the PC world, VESA wound up defining some very basic modes that all VESA compatible monitors had to support, as part of the VESA BIOS Extensions. Both the graphics hardware in your PC and the CRT you had probably supported better modes with more resolution, but to use them you had to know card-specific information and generally you had to know what timings to use. Enter XFree86 modelines, which were a general way of specifying the resolution and scan timings you wanted to use. These modelines are where 'mode setting' gets its name from.
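
For illustration, here is what a modeline looks like. The numbers below are the standard VESA timings for 1024x768 at 60 Hz; treat this as a sketch, not a reference:

    #        name      clock  hdisp hss  hse  htot  vdisp vss vse vtot  flags
    Modeline "1024x768" 65.00  1024 1048 1184 1344   768  771 777 806   -hsync -vsync

The first number is the pixel clock in MHz, and the two groups of four numbers say where the horizontal and vertical sync pulses fall relative to the visible area; together these determine the horizontal and vertical scan frequencies (here roughly 48.4 kHz and 60 Hz).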

In the older days of Linux, the kernel didn't know very much about graphics (at least on PCs). Instead, setting up and handling graphics hardware was the domain of the X server; the kernel gave it access to PCI (or AGP) resources, and the X server read and wrote the hardware's registers directly. Part of what the X server did was set the graphics mode (ie, the resolution, depth, and scan frequencies of a modeline), initially from explicit modelines and then over time from EDID information and other things you didn't have to configure (which was great). This was user space mode setting. There were a variety of reasons to do this at the time, but it had various drawbacks, including requiring the X server to have significant privileges (cf Fedora removing them).
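
In configuration terms, a hand-written XF86Config or xorg.conf of that era tied a modeline to a screen something like the following sketch (the identifiers and the 24-bit depth are illustrative, not canonical):

    Section "Monitor"
        Identifier "MyCRT"
        Modeline   "1024x768" 65.00 1024 1048 1184 1344 768 771 777 806 -hsync -vsync
    EndSection

    Section "Screen"
        Identifier   "MyScreen"
        Monitor      "MyCRT"
        DefaultDepth 24
        SubSection "Display"
            Depth 24
            Modes "1024x768"
        EndSubSection
    EndSection

At startup the X server parsed this and programmed the graphics hardware itself, entirely in user space.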

(This is the era in which the X server had to switch out of its high resolution mode before you could switch to a text console.)

Over time, PC graphics cards became capable of doing more and more things that really had to be handled in the kernel. They might do DMA, generate interrupts, require extensive involvement with other kernel-level machinery, and so on. As a result the kernel developed increasingly complex (and large) drivers for GPUs, and those drivers became capable of more and more things (cf the kernel DRM system). For many drivers, one of the things that moved from the X server into the kernel was this job of setting the graphics mode, creating kernel mode setting. Once the kernel could set the graphics mode and get EDID information from the connected displays, it started doing so during boot, picking the best resolution out of the ones available.
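
You can poke at the kernel's side of this from user space through libdrm. Here is a minimal sketch, assuming your GPU is /dev/dri/card0 (the device number and your permissions will vary); it asks the kernel for its KMS resources and prints the modes the kernel has learned, mostly from EDID, for each connected output:

    /* kms-modes.c: list the modes the kernel knows for connected displays.
       Build with: cc kms-modes.c -o kms-modes $(pkg-config --cflags --libs libdrm) */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main(void)
    {
        /* the device path is an assumption; multi-GPU machines have card1, etc */
        int fd = open("/dev/dri/card0", O_RDWR);
        if (fd < 0) {
            perror("open /dev/dri/card0");
            return 1;
        }

        drmModeRes *res = drmModeGetResources(fd);
        if (!res) {
            fprintf(stderr, "no KMS resources; not a KMS device?\n");
            return 1;
        }

        for (int i = 0; i < res->count_connectors; i++) {
            drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
            if (!conn)
                continue;
            if (conn->connection == DRM_MODE_CONNECTED) {
                printf("connector %u:\n", (unsigned)conn->connector_id);
                /* modes[0] is normally the display's preferred (native) mode */
                for (int j = 0; j < conn->count_modes; j++) {
                    drmModeModeInfo *m = &conn->modes[j];
                    printf("  %s  %ux%u @ %u Hz\n", m->name,
                           (unsigned)m->hdisplay, (unsigned)m->vdisplay,
                           (unsigned)m->vrefresh);
                }
            }
            drmModeFreeConnector(conn);
        }
        drmModeFreeResources(res);
        close(fd);
        return 0;
    }

Actually changing the mode happens through drmModeSetCrtc() or the newer atomic API, and that requires being the 'DRM master', the privilege an X server or Wayland compositor holds; merely listing modes generally doesn't need it.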

Setting the 'correct' mode for your monitor is now more important than it used to be, because we've switched over from CRTs to LCDs. LCDs are fundamentally fixed resolution (and colour depth) displays, unlike CRTs (more or less), so if you actually want to read crisp things on your LCD, you want to drive it at its native resolution. Linux's kernel mode setting makes this happen as early as possible and in as many situations as possible, not just when the X server is running.
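
As a concrete illustration of steering this, KMS drivers honour the kernel's video= command line parameter, so you can force a particular mode on a particular output from the boot loader onwards; the connector name here (HDMI-A-1) is just an example and depends on your hardware:

    video=HDMI-A-1:1920x1080@60

This takes effect as soon as the driver initializes, long before any X server or Wayland compositor starts.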

Traditionally, the Linux text console on PCs really did pretty much use VGA text mode. In the modern KMS world, the 'text console' is actually a framebuffer console that draws text through graphics operations as part of the DRM system. It's possible that some sort of framebuffer console is still used today even if KMS has been specifically turned off, although I'm not sure.
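
If you're curious which flavour of framebuffer your console is using, one quick check is /proc/fb; the name depends on your driver (the output below is from a hypothetical amdgpu machine, and a simpledrm boot would show something different):

    $ cat /proc/fb
    0 amdgpudrmfb

Booting with nomodeset on the kernel command line is the usual way KMS gets 'specifically turned off'; you then fall back to whatever firmware-based framebuffer or VGA text support your kernel still has.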

(See also Wikipedia's "Mode setting" article, which also covers other Unixes.)

PS: As of Fedora 36, Fedora's kernel only includes DRM/KMS graphics drivers. See also the announcement of the new simpledrm driver, which has some discussion of how the PC 'firmware' (EFI or BIOS) sets up the initial graphics environment.

