Dying is easy. Comedy is hard.

This game is quite old, but still instructive. It documents a long struggle with X11 and mode lines which eventually turned out to be a hardware problem :-).

Introduction

So, for the last few years, Linux advocates have been getting all puffed up about how Linux is ready to take over the desktop because it is getting easier and easier to install.

To show just how easy, here is my tale of how I got X to talk to my Westinghouse LVM-42w2 hi-def 1080p monitor on different versions of Fedora.

Fedora Core 5

To begin with, installing Core 5 with this monitor was a real treat. Xorg Bug 7243 can be worked around after your system is up, but during install, you can't easily get control of the xorg.conf file to make it run at a higher resolution than 640x480, and at 640x480, you can't reach the buttons on the screen to tell it how to install. The trick is to remove the DVI cable from the video card and plug the monitor in via the VGA interface, then put back the DVI cable after getting Fedora installed and the xorg.conf work-arounds fixed up.

After getting Fedora installed, the first thing to figure out is that there are two different video drivers I could use for my ATI video card. The one that comes with Fedora Core 5 is the open source "radeon" driver, which is the one X is configured to use after installing. The radeon driver in FC5 doesn't support 3D graphics, but the proprietary drivers from ATI do, so I wanted to install the ATI drivers, named (God knows why) "fglrx". Fortunately, I can avoid ATI's complicated driver build and install process: rpm.livna.org has information on how to add the livna repositories to my system, and the pre-packaged fglrx drivers can be downloaded from there. [Note: livna and other 3rd party repositories have now mostly merged into rpmfusion, so that is the place to go for new fedora installs.]
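For anyone following along at home, the livna recipe boils down to something like the following (a sketch from memory: the release rpm name and the package names change over time, so check the livna instructions for the current details):

# rpm -ivh http://rpm.livna.org/livna-release-5.rpm
# yum install kmod-fglrx xorg-x11-drv-fglrx

After that, the fglrx packages update through yum like everything else.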

The 1920x1080 HDTV resolution, being a relatively new innovation, is naturally not included in the list of resolutions X "just knows" how to generate. This being the case, I have to come up with an X mode line. These are described (if you can call it that) in the xorg.conf man page. They basically consist of 9 random numbers you have to pull out of thin air (but God help you if you make the slightest mistake, because the nuclear device hidden in every monitor ever made will detonate and take out most of the civilized world if you provide a mode it doesn't like :-).

With all of civilization at stake, I want to be careful, so I do lots of web searches, and eventually find a remarkable post in a forum (which, unfortunately, I can no longer locate). This points out that the EDID information obtained from the monitor, while almost completely ignored by X, is in fact printed in the log file, and, even more amazing, it is printed in a form which can be turned into a mode line:

(II) fglrx(0): Supported Future Video Modes:
(II) fglrx(0): #0: hsize: 1152  vsize 864  refresh: 75  vid: 20337
(II) fglrx(0): #1: hsize: 1280  vsize 1024  refresh: 60  vid: 32897
(II) fglrx(0): Supported additional Video Mode:
(II) fglrx(0): clock: 138.5 MHz   Image Size:  930 x 520 mm
(II) fglrx(0): h_active: 1920  h_sync: 1968  h_sync_end 2000 h_blank_end 2080 h_border: 0
(II) fglrx(0): v_active: 1080  v_sync: 1082  v_sync_end 1087 v_blanking: 1111 v_border: 0
(II) fglrx(0):  Westinghouse
(II) fglrx(0): Ranges: V min: 50  V max: 75 Hz, H min: 30  H max: 80 kHz, PixClock max 150 MHz
(II) fglrx(0): Monitor name: LVM-42w2
(II) fglrx(0): End of Display1 EDID data --------------------

If you cut and paste the appropriate numbers in the appropriate order (the pixel clock first, then the four horizontal timings h_active, h_sync, h_sync_end, and h_blank_end, then the four vertical timings v_active, v_sync, v_sync_end, and v_blanking), you can generate a mode line with no cabalistic computations required:

ModeLine "1920x1080" 138.5 1920 1968 2000 2080 1080 1082 1087 1111

Of course if (like me) you started knowing absolutely nothing about X configuration, this doesn't help. Where the devil do you put the ModeLine in the xorg.conf file? What things refer to the mode? What good is having the bloody thing if you don't know what to do with it?

Fortunately, there are lots of sample xorg.conf files to be found on the web, and after staring at lots of them, they look less and less like characters fired from a shotgun, and your eyes start to focus again.

The ModeLine goes in the Monitor section of the xorg.conf file:

Section "Monitor"
        Identifier   "Monitor0"
        VendorName   "Westinghouse"
        ModelName    "LVM-42w2"
        HorizSync    31.5 - 70.0
        VertRefresh  50.0 - 70.0
        ModeLine     "1920x1080" 138.5 1920 1968 2000 2080 1080 1082 1087 1111
        Option       "dpms"
EndSection

The string "1920x1080" is just a name. It could as easily be "wombat" as "1920x1080", but "1920x1080" is a bit more descriptive. That mode name, in turn, needs to show up in the Screen section:

Section "Screen"
        Identifier "Screen0"
        Device     "Videocard0"
        DefaultDepth     24
        Monitor    "Monitor0"
        SubSection "Display"
                Viewport   0 0
                Depth     24
                Modes     "1920x1080"
        EndSubSection
EndSection

So, by the trivial process of transforming data in the X log file (which, by the way, is found at /var/log/Xorg.0.log) I was able to generate a mode line to run my monitor at native 1080p resolution. (Wait! Aren't computers the ones that are good at fiddling with data? Why am I the one doing this by hand? More on this topic later :-). This works fine with the latest version of the ATI drivers and the latest Fedora Core 5 kernel (at least on my hardware), but there were some versions of the driver which encountered ATI Bug 37 from the unofficial ATI bugzilla. It is no fun having your system lock up hard as a rock just because you logout or try to reboot :-).
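Speaking of computers fiddling with data, here is a little script that does the log-to-mode-line cut and paste for you. It is just a sketch which assumes the exact log format shown above (other drivers and X releases word their EDID dumps differently), so treat it as illustrative:

#!/usr/bin/env python
# edid2modeline.py - pull the EDID detailed timing numbers out of the
# X log and print them as a ModeLine. Assumes the log format shown
# above; other drivers word their EDID dumps differently.
import re, sys

log = open(sys.argv[1] if len(sys.argv) > 1 else "/var/log/Xorg.0.log").read()

clock = re.search(r"clock:\s*([\d.]+)\s*MHz", log)
h = re.search(r"h_active:\s*(\d+)\s+h_sync:\s*(\d+)\s+"
              r"h_sync_end\s*(\d+)\s+h_blank_end\s*(\d+)", log)
v = re.search(r"v_active:\s*(\d+)\s+v_sync:\s*(\d+)\s+"
              r"v_sync_end\s*(\d+)\s+v_blanking:\s*(\d+)", log)

if not (clock and h and v):
    sys.exit("no EDID detailed timing found in the log")

print('ModeLine "%sx%s" %s %s %s' % (h.group(1), v.group(1), clock.group(1),
                                     " ".join(h.groups()), " ".join(v.groups())))

Run against the log excerpt above, it prints exactly the mode line shown earlier.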

The fglrx drivers are also something of a pain because they usually lag behind the Fedora system updates, so you often find your video busted when you do updates, until livna catches up. This made me look forward to Fedora Core 6 where, rumor had it, the open source radeon driver would support 3D reasonably well.

Fedora Core 6

I was pleased to find that installing Fedora Core 6 was easier. The bug that forced it to 640x480 was apparently fixed in the newer radeon driver, so no cable swapping was required to do the install. It does run at 800x600, but apparently the install always runs at that resolution, so that's not a driver problem.

I was also pleased to find that 3D did indeed work once I got everything up. I could run silly games like neverputt and 3D screen savers, and the famously worthless glxgears program. All seemed rosy. Life was good.

Even more amazing, after 20 years, the new X 7.1 actually uses the EDID information from the monitor. It came up in 1920x1080 mode without any intervention from me at all.

But, evil forces would soon terrorize the system! No sooner did I start feeling good about FC6 than the screen just suddenly turned off, then back on, then off again. Sometimes it would stabilize in the on condition, and I could use it for a while, sometimes it would go off and stay off. Mostly it would just flicker on and off every half second or so, making it completely unusable. Lots more details on this appear in Xorg Bug 8790.

Since I observed that the monitor status info was reporting 61Hz instead of the 60Hz the mode line is supposed to generate, I decided the obvious thing to do was generate a new mode line for a slightly reduced refresh rate and see if that solved the problem.

So simple to say, so hard to do :-).

Now that I need the cabalistic computations, I have to be wary of the nuclear holocaust I might trigger. I go back to the web again, looking for clues. One thing I find is the concept of "reduced blanking" for LCD displays which don't need as much time to shift electron beams around as CRTs need. This eventually leads me to the VESA Coordinated Video Timing Generator spreadsheet. Unfortunately, it generates a completely different set of random gibberish numbers that don't match any of the gibberish in an X mode line, and I have no idea how to translate VESA video engineer speak into X11 mode line speak.

On the theory that someone must know how, I keep searching the web, and finally discover the "cvt" program, which is released as part of X and already incorporates the algorithm in the spreadsheet as well as the conversion to a mode line. My problem is solved! I already have a copy of cvt on my system! Let's go run it to generate the reduced blanking mode line I need to run slightly less than 60Hz:

# cvt -r 1920 1080 58

ERROR: 60Hz refresh rate required for reduced blanking.

DOH! The fershlugginer program won't let me generate a 58Hz mode line! I feel like Ralphie in A Christmas Story: "What do you want for Christmas little boy?", "A genuine Red Ryder 58Hz mode line!", "You'll shoot your eye out, kid!".
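Just to prove the program works when you color inside the lines, asking for exactly 60Hz is perfectly acceptable. Leaving out the comment line cvt also prints (and quoting its output from memory, so details may vary by version), it produces essentially the numbers from my monitor's EDID, give or take a line in the vertical sync position:

# cvt -r 1920 1080 60
Modeline "1920x1080R"  138.50  1920 1968 2000 2080 1080 1083 1088 1111 +hsync -vsync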

Fortunately, one of the things that makes Linux so easy is access to the source code. An rpm -qf command to find out which package owns the cvt binary tells me it is part of the main X server, so I download and install the source rpm for the X server.

This allows me to discover that the cvt program is written to use practically every internal typedef and data structure in the entire X server, but only one night of valiant code hacking was required to extract a version I could build without downloading all the dependencies and building the whole X server. Finally, I can remove the error exit and let it print the stupid warning but go ahead and print the mode line as well.
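For the curious, the reduced blanking arithmetic itself turns out to be pretty small. Here is a rough Python rendition (my own sketch, not the X source: constants per the CVT spec, the vertical sync width hardcoded for 16:9 modes, margins and interlace ignored). Fed 1920 1080 60, it reproduces the 138.5 MHz mode above; fed 58, it happily prints the forbidden mode line:

#!/usr/bin/env python
# cvt_rb.py - sketch of the VESA CVT reduced blanking computation.
# Not the X source: constants are from the CVT spec, the vertical
# sync width is hardcoded for 16:9 modes, margins and interlace
# are ignored.
import math, sys

def cvt_rb_modeline(hpix, vlines, refresh):
    H_BLANK = 160          # fixed RB horizontal blanking, in pixels
    H_FPORCH, H_SYNC = 48, 32
    MIN_VBLANK_US = 460.0  # minimum RB vertical blanking time
    V_FPORCH, V_SYNC, MIN_V_BPORCH = 3, 5, 6   # V_SYNC = 5 is 16:9
    CLOCK_STEP = 0.25      # pixel clock granularity, in MHz

    # estimate the horizontal period, then see how many blank lines
    # fit in the minimum vertical blanking interval
    h_period_est = (1e6 / refresh - MIN_VBLANK_US) / vlines
    vbi = max(int(MIN_VBLANK_US / h_period_est) + 1,
              V_FPORCH + V_SYNC + MIN_V_BPORCH)
    v_total = vlines + vbi
    h_total = hpix + H_BLANK
    clock = CLOCK_STEP * math.floor(refresh * v_total * h_total
                                    / 1e6 / CLOCK_STEP)
    return ('ModeLine "%dx%d_%g" %.2f %d %d %d %d %d %d %d %d '
            '+hsync -vsync'
            % (hpix, vlines, refresh, clock,
               hpix, hpix + H_FPORCH, hpix + H_FPORCH + H_SYNC, h_total,
               vlines, vlines + V_FPORCH, vlines + V_FPORCH + V_SYNC,
               v_total))

if __name__ == "__main__":
    print(cvt_rb_modeline(int(sys.argv[1]), int(sys.argv[2]),
                          float(sys.argv[3])))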

So, since I was plugging in my own mode line before, it should be easy to adapt my old Fedora Core 5 xorg.conf file and plug in my 58Hz mode line in the Fedora Core 6 xorg.conf file, right?

BZZZZT! Wrong! The new X release is so proud of finally using EDID, it wants to make up for the last 20 years by now utterly refusing to pay any attention to any mode line you might specify. Nope. No such thing as mode lines. You just imagined them. They never existed. X has always used the monitor EDID info.

Having encountered a fairly steep obstacle, I decide to detour around it. The fglrx drivers were working on Fedora Core 5, maybe they work on Core 6 as well. Maybe they don't have the flickering problem. I'll push the radeon mode line on the stack for a while and give fglrx a try.

I install the livna fglrx drivers for FC6, and apart from Livna Bug 1269, they seem to be working. All the 3D programs that were sure to trigger flickering before no longer trigger it. Letting the power saver come on and then bringing the monitor back online doesn't trigger it, so the flickering problems seem to be gone with fglrx.

Then I try to log off. Aargh! ATI Bug 37 is back again. Same 2.6.18 kernel on FC5 and FC6, same version of the fglrx drivers on FC5 and FC6, but something somewhere (Xorg 7.0 versus 7.1?) has made the dread system lockup come back. I try various things like downloading ATI's installer with a somewhat newer version of the drivers than livna has, but the lockup remains. Interestingly, with the i686 FC6 installed (instead of the x86_64 version), the fglrx drivers work fine. No system lockup on i686 (but I want to run in 64 bit mode).

Not having the source to the ATI drivers (or any idea where to begin debugging such a problem), I head back to the radeon drivers again. Perhaps I can figure out how to build my own version of X from the source which will allow me to specify a mode line.

I start looking for the code that generates mode lines in order to disable it. I find it in routines with "ddc" in their name. I also find a mysterious reference to the "NoDDC" option in the radeon man page. Hmmmm... This looks promising, perhaps I don't need to modify X, perhaps I just need to go on a quest for the proper mysterious gibberish to add to the xorg.conf file to make it take a mode line after all.

No documentation I can find says anything about where I might use this mysterious "NoDDC" option, so I grep all the X server source for "NoDDC", and find it is a module option in the "ddc" module (almost makes sense :-). I also find some other options in there: "NoDDC1" and "NoDDC2" - let's turn 'em all on! OK, how the devil do I specify an option to a Module?

Back to the xorg.conf man page, and I find I can use a SubSection entry in the Module section to specify options to a module. But wait! The new 7.1 X release is finding modules all by itself by magic. There is no Module section in the xorg.conf file, and if I put one in and only specify the "ddc" module, then it stops finding all the other modules it previously found, so nothing much works. Back to the original log file again. I discover that all the module names it finds automatically are listed in "LoadModule:" lines in the log file, so I can finally construct a Module section for the xorg.conf file, have it manually load all the modules I previously got automatically, and also add the "NoDDC" options to the "ddc" module.
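In case it saves someone else the grep, the result looks something like this (a sketch: the Load list below is abbreviated, and yours must match whatever "LoadModule:" lines show up in your own log; as I read the man page, the SubSection both loads the ddc module and hands it the options, so it gets no separate Load line):

Section "Module"
        # one Load line for each module named in a "LoadModule:"
        # line in /var/log/Xorg.0.log (this list is abbreviated)
        Load       "extmod"
        Load       "dbe"
        Load       "glx"
        Load       "freetype"
        SubSection "ddc"
                Option "NoDDC"
                Option "NoDDC1"
                Option "NoDDC2"
        EndSubSection
EndSection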

So, I should now be able to specify a mode line, right? Nope. I get errors in the log; X still doesn't like it. Let's try the "IgnoreEDID" option I find in the radeon man page. Maybe that is what makes it hate the mode line?

Nope, still hates me. The errors now claim my mode line appears to have timings that look like I'm trying to use reduced blanking, but now that it knows nothing about my monitor, it is afraid of triggering nuclear holocaust, so it won't let me use that mode line.

Off to the X server source code again! I find the error it is printing and see the flags it is checking before printing the error. Ah-HA! One of the flags can be set with the "ReducedBlanking" option in the monitor section. This time it will take my mode line for sure!

You didn't think it would be that easy, did you? Nope, the next error complains about the refresh rates being outside the expected range. Sigh. Back to the xorg.conf man page again. Add some "HorizSync" and "VertRefresh" ranges wide enough to drive a battleship through. (Actually, I can't remember for sure which of the reduced blanking and sync rate errors I encountered first, but they definitely both happened, even if not in this order.)
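For the record, the pieces of the final incantation fit together something like this (a sketch, not my saved config: the 58Hz numbers are what the reduced blanking arithmetic sketched earlier produces, and the sync ranges are simply wide):

Section "Device"
        Identifier "Videocard0"
        Driver     "radeon"
        Option     "IgnoreEDID" "true"
EndSection

Section "Monitor"
        Identifier   "Monitor0"
        HorizSync    30.0 - 90.0
        VertRefresh  50.0 - 75.0
        Option       "ReducedBlanking"
        ModeLine     "1920x1080_58" 133.75 1920 1968 2000 2080 1080 1083 1088 1110 +hsync -vsync
EndSection

The "1920x1080_58" name then replaces "1920x1080" in the Modes line of the Screen section shown back in the Fedora Core 5 story.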

Try starting X one more time. OH MY GOD! Is that a cursor on the screen? Did X actually start? Is it actually running at 1920x1080 at my lower refresh rate? Yes to all questions! The quest has ended! Victory is at hand!

You see how simple that all was?

Before I forget, I should probably mention that this did indeed make the monitor flicker rate go way down and almost disappear. (I was doing all this just to see if it would make a difference - I didn't know it would have any effect at all.) By cranking the refresh rate down 1 more Hz, I was able to (so far) entirely eliminate the flickering.

Fedora 7

Amazing! No new excitement in Fedora 7. The same hack from Fedora Core 6 worked in 7.

Fedora 8

Oh Joy! In Fedora 8 they have improved the X code so much that it now utterly ignores my mode line :-). I see there is already Xorg Bug 10205 out there with a patch. Maybe it will make it into the Fedora source soon, but I give up. I've caved in and ordered a new video card with an nvidia chipset, and when it gets here I'll take the hours needed to fiddle with the heatpipes in zooty (my ultra, ultra quiet computer) and be rid of the flaky radeon card forever (or until I decide the geforce card is even flakier :-). I suppose I could try rebuilding the radeon driver with Henry's patch, but I'm gonna conserve my energy for replacing the card instead.

Note that there is a new (undocumented in the man page, naturally) option for the radeon driver: DefaultTMDSPLL. The theory with this option is to ignore the bad timing info some card makers put in their BIOS and use built-in defaults in the driver instead, but turning it on seems to have no effect on the flicker problem.
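For reference, if you want to experiment with it anyway, it goes with the other driver options in the Device section (my best guess at the usage, based on reading the driver source):

        Option     "DefaultTMDSPLL" "true"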

Editorial Comments

According to Wikipedia, X originated in 1984, X11 appeared in 1987, and XFree86 showed up in the early 1990s. In all that time, until the very recent Xorg 7.1 release, the user of X was almost entirely at the mercy of the dreaded, almost totally undocumented, impossible-to-produce mode line. (OK, there were improvements from time to time, like building in common refresh rates and some mode line calculator web pages around the net.)

Meanwhile, the EDID standard first showed up in 1994, and since then, compliant monitors have been describing themselves to computers which cared to pay attention (but X never cared).

So, after more or less 20 years of forcing users to provide mode lines, release 7.1 comes along and almost totally refuses to allow you to specify a mode line except under the most obscure and undocumented conditions.

Don't you think you've already blown up every non-compliant monitor in the world? What's with the sudden switch from paying no attention to the monitor to paying no attention to the Linux user? Would the world end if you allowed the same people you've forced to generate mode lines for the last twenty years to keep doing it? After ignoring EDID for 20 years, where did the total certainty that you have it perfect come from? Wouldn't at least one release where you allowed manual intervention with less trouble than solving Fermat's last theorem have been a good idea?

My favorite saying still remains: "I hate helpful software!"

And why does the cvt program show up now? We could have used it 20 years ago, but now that it is almost impossible to use a mode line, I can finally generate one with a simple utility?

But wait, there's more! If the X source contains a utility for converting screen resolution and refresh rate into a mode line, shouldn't X have a new way to specify modes? One where all I have to tell it is the screen size and refresh rate (and I guess the reduced blanking flag as well)?

Perhaps in 20 more years this will happen :-).

Conclusions

I don't think any sane person would actually call all this "easy", so perhaps the "ready for desktop" thing is a bit premature. On the other hand it was a lot like playing a computer themed adventure/puzzle game, and unlike a game, I got an actual real world benefit by winning :-).

To be fair, however, having access to the source code did make it possible to work around my configuration problems. With the ATI drivers, there was nothing to look at once I encountered the system lockup. There may be a work-around for it as well, but I have no way to tell.

And I also admit most folks aren't trying to get a monitor like this to work (though there are more and more Linux home theatre folks out there, so the numbers are growing).

And there have been improvements as things went along. Actually using the EDID information is good when it works (but things can get carried away). The problem is really throwing up so many obstacles when the automatic stuff doesn't work.

The Truth Revealed!

With Fedora 8's absolute refusal to accept a mode line, I decided to go ahead and switch to an nvidia card. Imagine my shock when I found the same $@#! flickering happened. There were so many bug reports just like mine, with people having the same problem with ATI, that it must have been a video card problem - I was sure of it!

But no, it is indeed a hardware problem - not in the video card, but in the monitor. Additional experimentation shows that the DVI2 input on the LVM-42w2 monitor is where the flickering originates. If I switch the video to DVI1, or even get an adapter cable and use the HDMI input, there is no flickering. Why DVI1 and DVI2 are not absolutely identical circuitry, I have no idea, but now that I know the problems are on DVI2, I can just use DVI1 and put an end to this whole episode (of course, the lower refresh rate makes DVI2 usable, so it is still annoying that I can't do that in the newest version of X).

Of course, the HDMI input in the monitor has the world's lamest implementation of HDCP support, which makes the HDMI input close to useless (if I switch away, I can't switch back again later). I've ordered a DVI switch with fairly decent reviews, and I'll see how that goes when I try to put everything on DVI1.

Just an update: The IOGEAR DVI switch worked great, and everything was functioning fine for the longest time (I've made it to Fedora 10 now), but one day the poor old LVM-42w2 made the loudest SNAP! I've ever heard and magically turned itself into a giant paperweight right before my eyes (the only thing missing was the cloud of smoke). That gave me an excuse to upgrade to a Samsung LN46A950, which has been working well with the DVI switch connected to HDMI2 via a DVI to HDMI converter cable. The Samsung has no funky HDCP problems with the HDMI input, but it does need you to do one incredibly obscure thing to really work correctly: you have to rename the HDMI2 input to "PC". Once you do that, it seems to become an entirely different monitor which works very well as a computer display.

 