Red Hat Bugzilla – Bug 74866
doesn't take max pixel clock into account for available resolutions
Last modified: 2007-04-18 12:47:00 EDT
If the video card does not support a resolution, please do not show it in the
configurator. Theoretically, a 2MB card should support 1152x864 at 16bit, I think.
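A quick check of the reporter's arithmetic (a sketch; the 2MB figure and mode are taken from the comment above):

```python
# Framebuffer memory needed for 1152x864 at 16 bits per pixel.
width, height, bits_per_pixel = 1152, 864, 16
bytes_needed = width * height * bits_per_pixel // 8

print(bytes_needed)                      # 1990656 bytes
print(bytes_needed <= 2 * 1024 * 1024)   # fits in 2MB: True
```

So memory alone is not the constraint here, which is consistent with the pixel-clock explanation further down.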
Created attachment 78064
I fixed a bug where it ignored the video memory when calculating the available
resolutions.
Unfortunately in your case all 1152x864 are removed like this:
(II) TRIDENT(0): Not using default mode "1152x864" (bad mode
clock/interlace/doublescan)
I guess (since your monitor specs are pretty good) that this is due to the
graphics card's max clock. r-c-xfree can't read the max clock, though, so it
will be hard to fix this properly.
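For reference, the pixel clock a mode needs can be estimated from its total timings, including blanking. The figures below assume the standard VESA timings for 1152x864 at 75 Hz; they are illustrative, not read from the reporter's card:

```python
# Rough pixel-clock estimate for 1152x864 at 75 Hz.
# htotal/vtotal include the blanking intervals (standard VESA values
# assumed here: 1600x900 total for a 1152x864 visible area).
htotal, vtotal, refresh_hz = 1600, 900, 75
pixel_clock_hz = htotal * vtotal * refresh_hz

print(pixel_clock_hz / 1e6)  # 108.0 MHz
```

If the card's RAMDAC cannot reach that clock at 16bit depth, the server rejects the mode even though the framebuffer would fit in video memory.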
Yeah, unfortunately I don't see a way to fix this. Closing as 'wontfix'.
Could you (optionally) run X -probeonly to verify the configured resolutions?
The previous Xconfigurator did this.
I'd really rather not. I'm reluctant to complicate the code for what is a
pretty uncommon case. That's ultimately what made Xconfigurator unmaintainable
in the first place. I don't believe that 2MB cards are all that common these
days.
I will not reopen it a second time, but I clearly see a fundamental problem here:
the configurator should always ask the XFree86 driver for a list of supported
resolutions/color depths instead of trying to calculate the list by some obscure
heuristics.
Mike, do you have an opinion?
I believe that the configuration utilities should not present options to the
user that are unavailable on their hardware. That includes invalid video
resolutions. This is something that I believe should be fixed in our
config tool; however, I do not think it is a critical priority.
While 2MB video hardware is not really common these days on new computers,
it is very common on the integrated motherboards that we support, and I believe
it is common enough that this issue should be addressed in the future.
It need not be complicated or full of clutter to implement a proper solution,
however. All that is needed is to know how much video memory is on the
adaptor. From that, you can calculate what the largest video mode is for
each depth that the card can support. Also, some hardware has limitations
aside from video memory constraints, which dictate what modes are and are
not available. These things do not belong in special cased code in the
config tools however, they belong in a hardware database in some kind of
generic format that generic code in the config tool can read in.
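The memory check described above can be sketched briefly. This is only an illustration of the calculation, not the config tool's actual code; the candidate mode list is hypothetical:

```python
# Filter a candidate mode list by whether the framebuffer fits in
# video memory at a given depth, as the comment above describes.
MODES = [(640, 480), (800, 600), (1024, 768), (1152, 864), (1280, 1024)]

def usable_modes(video_mem_bytes, bits_per_pixel):
    """Return the candidate modes whose framebuffer fits in video memory."""
    bytes_per_pixel = bits_per_pixel // 8
    return [(w, h) for (w, h) in MODES
            if w * h * bytes_per_pixel <= video_mem_bytes]

# On a 2MB card at 16bit, 1280x1024 needs 2.5MB and is dropped,
# while 1152x864 (about 1.9MB) survives.
print(usable_modes(2 * 1024 * 1024, 16))
```

As the comment notes, other per-chip limits (such as the max pixel clock) would come from a hardware database rather than from special cases in code like this.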
I believe these things to be important for the future, but that it is too much
hassle and effort for us to implement for our next release. Future releases
will have a new video hardware database format, which is capable of storing
more information about video hardware, and that will simplify the addition
of features in the config tool.
Like Brent, I do not want to see the config tool code slopped up with
tonnes of hardware-specific hacks and kludges, or it will become another
Xconfigurator.
For now, it might be incorrect to display video modes that aren't actually
usable on a card, but I just consider it an annoyance rather than
a critical showstopper flaw.