Some say they see poetry in my paintings; I see only science.

This game actually appears to be one which the good guys won. I haven't noticed any wacky giant or tiny fonts when installing a new Fedora for a while now. Maybe some of these rants helped somewhere, maybe the guys who could do something actually tried to hook their computers to an HD television and finally realized why purist DPI fanaticism was a bad idea...

See the Rationale for why DPI is worth so much trouble.

Once upon a time, you could put:

        Xft.dpi: 96

in your ~/.Xdefaults file, and that resource would be used by almost everything to fix the display's DPI (dots per inch) at 96.

Now, no libraries seem to pay any attention to something as old fashioned as X resource files.

Once upon a time, you could run the gdmsetup utility and add the -dpi 96 option to the X server command line arguments.

Now, gdm has been improved beyond the point of offering you the ability to change the options. After all, you'd only hurt yourself if you tried. Better to prevent you from doing it at all; the GNOME nannies all know best.

Once upon a time you could edit the /etc/X11/xorg.conf file and lie about the DisplaySize in the Monitor section, indirectly specifying the dots per inch by providing a monitor size that makes the server calculate the DPI you want.
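The arithmetic behind the lie is simple: DisplaySize takes millimetres, and the server computes DPI as pixels divided by (millimetres / 25.4). So for a 1920x1080 screen and a target of 96 DPI, you'd claim 1920 * 25.4 / 96 = 508mm wide and 1080 * 25.4 / 96 ≈ 286mm tall, something like this (the Identifier is whatever your config already uses):

        Section "Monitor"
            Identifier   "Monitor0"
            # Not the real physical size - chosen so 1920 / (508/25.4) = 96 DPI
            DisplaySize  508 286
        EndSection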

Now X has gotten so smart, it doesn't even install an xorg.conf file. You can go to a lot of trouble to download and install the system-config-display tool (in Fedora at least), and use it to manufacture an xorg.conf file, but even if you change the DisplaySize, X is so proud of itself for finally paying attention to the monitor's EDID information (after ignoring it for 20 years), that it knows you are wrong, and refuses to pay any attention to your DisplaySize setting.

So how the devil do you set DPI these days?

You can run gnome-appearance-properties, go to the Fonts tab, dive into the advanced settings, and finally find a box where you can type an alternate DPI setting. Unfortunately this only works for GTK apps, and only if a dbus session and the gnome-settings-daemon are running. You'll have those if you run GNOME, but I don't know whether you get them in KDE. If, like me, you hate GNOME and KDE, you'll have to manually start up this junk even if you'd rather not clutter your system with it.

This actually records the settings in xml files stashed in ~/.gconf/desktop/gnome/font_rendering, but individual GTK programs won't look there, they will only listen to gnome-settings-daemon.
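For reference, the %gconf.xml file under ~/.gconf/desktop/gnome/font_rendering ends up looking something like this (the attribute details here are from memory, so treat this as a sketch, not gospel):

        <?xml version="1.0"?>
        <gconf>
            <entry name="dpi" type="float" value="96"/>
        </gconf>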

For KDE apps (at least with the new KDE 4.2 in Fedora), you can run the KDE settings tool and dive into Appearance, then Fonts, and find a similar setting to override DPI (though you are only given the choice of 96 and 120).

Naturally, this only works for KDE apps, not for GTK apps.

This actually stashes the information in ~/.kde/share/config/kcmfonts, and unlike the GTK apps, no special daemon seems to be needed for individual KDE apps to pay attention.
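For the curious, the relevant lines in ~/.kde/share/config/kcmfonts appear to be these (the key name is what my system wrote, so it may be version dependent):

        [General]
        forceFontDPI=96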

Now we have piecemeal settings for GTK and KDE, but apps written directly in a toolkit like Qt don't know about any of these settings. Perhaps there is a qt-config setting specific to Qt apps? Who knows; the point is that it is already getting totally ridiculous. A global setting is what you really want, and the gradual "improvement" of everything has seemingly eliminated that possibility.

It is also important to note that these are per-user settings. If the fonts are unreadably small or absurdly large due to the combination of your hardware and X11's DPI computation, then things like the GDM login screen may well be unusable unless you copy your personal ~/.gconf/desktop/gnome/font_rendering directory into the gdm user's corresponding directory (since the GDM login screen runs as user gdm).

And, of course, every time you add a new user, you have to run around tweaking DPI everywhere once again.

So is a global setting possible? The answer appears to be "Yes, but...".

If you have an nvidia graphics card, and if you are running the closed source nvidia driver, then you can add this to the "Device" section of your xorg.conf file (at least you will have an xorg.conf if you are using the nvidia driver):

        Option      "UseEDIDDpi" "False"
        Option      "DPI" "96 x 96"

The first option tells it to ignore the monitor's EDID DPI information. The second tells it to By God use 96 DPI. Finally, a global setting that works on everything without hunting down the individual tweaks needed for every dad-gum GUI toolkit.

Possibly there are similar options for other drivers. If you look you might be able to find them. It seems absurd that there is no driver independent way to do this.

Even this isn't really all that good, since it is a per-system setting rather than a per-user setting, and individual users may prefer different defaults. Also, if you have multiple monitors hooked up, different DPI settings for each monitor would be best.

Perhaps the dpi option on xrandr would work? I haven't tried it, but perhaps it would be a cross toolkit solution if it were invoked at login on a per user and screen basis. (Though the things I have tried to modify with xrandr have never worked :-).
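For anyone who wants to experiment, the invocation would presumably be something like this, run from a per-user X startup script (untried, as I said, so no guarantees):

        xrandr --dpi 96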

Actually if you change the file /etc/sysconfig/desktop to say DISPLAYMANAGER=KDE (creating the file if it isn't there), you can switch from GDM to KDM as the login manager. Once you are running KDM, you can edit the /etc/kde/kdm/kdmrc file and modify the ServerArgsLocal parameter to add the -dpi 96 option to the X server startup arguments. That does give you a driver independent way to set DPI, but you can't control the server args with GDM (it has been improved too much for that).
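Spelled out, the two edits look like this (paths are Fedora's; the kdmrc section name and the default server arguments may differ on other distributions):

        # /etc/sysconfig/desktop
        DISPLAYMANAGER=KDE

        # /etc/kde/kdm/kdmrc
        [X-:*-Core]
        ServerArgsLocal=-nolisten tcp -dpi 96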

So why is setting DPI important?

Let's consider the first monitor I needed this for: a large (42 inch) HD display. At 42 inches diagonal with 1920x1080 resolution, the accurate DPI calculated from the provided EDID information was 52 DPI. If you use that accurate DPI to request, for example, a 9 point font, you will find that you get a total of about 6 pixels to render 9 point characters (and most characters don't use the whole height, so you really get about 4 pixels). Most characters in most fonts need a lot more pixels than that to be rendered readably :-).
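The arithmetic is easy to check. A quick sketch in Python (the 42 inch and 1920x1080 numbers are the monitor above; a font "point" is 1/72 of an inch):

```python
import math

def dpi_from_diagonal(diag_inches, width_px, height_px):
    # Physical DPI = diagonal in pixels / diagonal in inches.
    diag_px = math.hypot(width_px, height_px)
    return diag_px / diag_inches

def font_height_px(points, dpi):
    # A point is 1/72 inch, so pixel height = points * dpi / 72.
    return points * dpi / 72.0

hw_dpi = dpi_from_diagonal(42, 1920, 1080)  # roughly 52 DPI
tiny = font_height_px(9, hw_dpi)            # roughly 6.5 pixels for a 9 point font
```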

The whole theory of fanatical dedication to the exact hardware DPI setting is ridiculous. As the example above shows, the accurate rendering of a perfectly fine font for a printed document is hopeless on the monitor. Also, the practical truth is that you aren't gonna be sitting as close to a monitor that size as you hold a printed page.

The actual DPI that a user finds optimal will be up to that user, but almost the worst possible choice is to default to the calculated DPI from the hardware information. For a lot of hardware, the calculated DPI will be perfectly fine, and will be very close to 96 DPI anyway. For vast amounts of other hardware the calculated DPI will be ridiculous, but setting it to 96 DPI will make it readable so the user will at least be able to make out what the menu items say while navigating to the dialog he can use to set it to his own personal preference. Clearly, a default to some value close to 96 would make infinitely more sense than defaulting to the actual hardware DPI.

For another example, I don't need to go any farther than my new monitor (the old one died). This time it is a 46 inch HD display, and the calculated DPI would be even worse if the EDID information described the dimensions accurately, but this set calls itself 160mm by 90mm - a value apparently chosen by Samsung merely to get the aspect ratio correct. The calculated DPI for these phony dimensions is 305 DPI, which makes all the fonts come out so gigantic that most dialog boxes won't fit on the screen, making it physically very difficult to modify the DPI setting.
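Running Samsung's phony numbers through the conversion (EDID sizes are in millimetres, and there are 25.4 millimetres to the inch):

```python
def dpi_from_edid(pixels, millimetres):
    # DPI = pixels / (millimetres / 25.4)
    return pixels * 25.4 / millimetres

bogus = dpi_from_edid(1920, 160)  # about 305 DPI from the phony 160mm width
```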

Then consider projectors. They don't even have a screen size. The actual size of the projected image depends on the distance from the projector to the screen. It can't make any sense to calculate DPI for such a beast, merely defaulting to a fixed value near 96 once again makes far more sense.

There are a few bugzillas out there on this topic that I've reported:
no way to control X server startup options
fonts suddenly become tiny
gnome needs limit on font DPI deduction
For God's sake, please ignore EDID DPI and just default to 96!

 
Page last modified Sat Feb 11 18:34:00 2012