Dear Lazyweb — how to get native screen resolution in F7?

For my birthday, I bought a nice new LCD monitor (it’s sooooo nice to not be a poor graduate student anymore…). A Samsung SyncMaster 204B. In addition to being a cool monitor, it also confirmed my slowly-formed theory about the source of the headaches I was frequently getting at night and on weekends. It’s nice to finally be headache-free again. However, there is a downside. Despite the extra cash I paid to get a monitor with 1600×1200 resolution and the extra time I had to spend to find such a monitor, Fedora 7 insists on running it at 1280×1024 or lower. ‘xrandr -q’ only lists resolutions at or below 1280×1024.

I went into system-config-display and selected “LCD Panel 1600x1200” and restarted X, but no luck. xrandr still doesn’t allow 1600×1200. I’m not that familiar or comfortable with xorg.conf, but I tried a couple of things (explicitly adding a Modes line with the entries "1600x1200", "1280x1024", etc.; modifying the HorizSync and VertRefresh to more closely match the specs I found online). No dice. I tried googling a bit and saw some Debian post about deleting one of the existing modelines, but my xorg.conf has no modelines, and I’ve heard scary things about editing them and what can happen if you get it wrong. I feel like such a noob.
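
For the curious, the kind of edit I was attempting looks roughly like this (the sync ranges shown are placeholders; use the ones from your monitor’s spec sheet):

Section "Monitor"
    Identifier  "Monitor0"
    HorizSync   30.0 - 81.0
    VertRefresh 56.0 - 75.0
EndSection

Section "Screen"
    ...
    SubSection "Display"
        Depth 24
        Modes "1600x1200" "1280x1024"
    EndSubSection
EndSection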

To Summarize: How do I get Fedora 7 to run my Samsung SyncMaster 204B at its native (1600×1200) resolution?

Thanks in advance for any pointers.

UPDATE: To answer some questions: I have an nvidia card (a Quadro FX 1400, as far as I can tell from lspci) and am using the nv driver. Also according to lspci, the video card seems to have 128MB of memory. I’m using the DVI connection between the monitor and video card. System RAM is 2GB. Also:

[root@Eenie ~]# cat /var/log/Xorg.0.log | grep 1600x1200
(II) NV(0): Modeline "1600x1200" 161.00 1600 1712 1880 2160 1200 1203 1207 1245 -hsync +vsync
(II) NV(0): Modeline "1600x1200" 162.00 1600 1664 1856 2160 1200 1201 1204 1250 +hsync +vsync
(II) NV(0): Not using default mode "1600x1200" (exceeds panel dimensions)
(II) NV(0): Not using default mode "1600x1200" (exceeds panel dimensions)
(II) NV(0): Not using default mode "1600x1200" (exceeds panel dimensions)
(II) NV(0): Not using default mode "1600x1200" (exceeds panel dimensions)
(II) NV(0): Not using default mode "1600x1200" (exceeds panel dimensions)
(II) NV(0): Not using driver mode "1600x1200" (exceeds panel dimensions)
(II) NV(0): Not using driver mode "1600x1200" (exceeds panel dimensions)
(II) NV(0): Not using mode "1600x1200" (no mode of this name)
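
Those “exceeds panel dimensions” lines suggest the driver detected a smaller panel than the monitor actually is; the size it settled on should show up with something like:

[root@Eenie ~]# grep -i panel /var/log/Xorg.0.log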

Anyway, the first handful of suggestions didn’t pan out, but I’ve got about a dozen more in the comments to this blog post, so I’m going to be trying some of them out. Thanks to everyone who has offered suggestions so far.

UPDATE: Thanks to everyone for the many additional suggestions. Apparently no amount of xorg.conf tweaking will fix this issue. As nona and ajax pointed out in the comments, this is a limitation of the nv driver (see freedesktop bug 3654 and bug 4314). The workarounds are to either use the proprietary nvidia drivers (yes, I do feel a bit icky right now, and worried: they haven’t been very stable for me in the past) or use the VGA connection instead of DVI (which, now that I’m aware of the option, I might switch to soon).
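
For anyone following along, the proprietary-driver route boils down to installing the nvidia packages and swapping the Driver line in the Device section of xorg.conf (the Identifier is whatever your config already uses; livna’s packages reportedly handle this for you):

Section "Device"
    Identifier "Videocard0"
    Driver     "nvidia"    # was "nv", the open driver with the panel-size bug
EndSection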

It is nice to have the native 1600×1200 resolution now.

25 thoughts on “Dear Lazyweb — how to get native screen resolution in F7?”

  1. In my case (a Dell 20″ flat panel), I had to add the following line to the Device section of xorg.conf:
    Option "NoDDC"

    Worth a try…
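
    In context (the Identifier is whatever your existing Device section uses; "Videocard0" is just an example):

    Section "Device"
    Identifier "Videocard0"
    Driver "nv"
    Option "NoDDC"
    EndSection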

  2. There might just not be enough video RAM configured for this resolution. Do the math!

    Lower the bit depth and check whether it works then, or go into your BIOS and increase the size of the video RAM (works only if you have a shared-memory graphics card, e.g. Intel).
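
    For reference, the math is roughly: 1600 × 1200 pixels × 4 bytes per pixel (32-bit depth) ≈ 7.3 MB per framebuffer, so a dedicated 128 MB card has room to spare; it is shared-memory setups with only 8 MB or so reserved where this bites.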

  3. Just a quick stab at this: try to disable xrandr? Disabling autodetection might force x.org to follow your xorg.conf file?!?

  4. What kind of graphics card do you use? If it’s an Intel card, you’ll need the 915resolution tool.
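
    A sketch of its use (the mode number 5c is only an example; list the BIOS modes first and pick one to overwrite):

    $ sudo 915resolution -l
    $ sudo 915resolution 5c 1600 1200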

  5. Raise the HorizSync and VertRefresh values by about 20 each; nowadays most monitors have out-of-range protection and will simply turn off if they cannot handle the resolution.

    On my cheap-ass AOC 19Klr this did the trick.

    What does /var/log/Xorg.0.log say? “Out of range” for the 1600×1200 modes?

    How’s your video card doing? How much memory does it have? Does it have funky limitations, as some older ATI circuits do?

    Godspeed

  6. Try adding a modeline to your xorg.conf under the Monitor section:

    Modeline "1600x1200@60" 162.0 1600 1664 1856 2160 1200 1201 1204 1250

    Then reference "1600x1200@60" in the Modes line of your Screen section and restart X.

  7. I’m afraid I don’t have any helpful advice for you, but I do find it interesting that such a prominent GNOME community member doesn’t feel comfortable mucking around with their xorg.conf. I guess it shows how far Linux has come for desktop users… when I started with Linux, mucking about with the X config file was the only way to get the thing to start up.

    Given the amount of hair I’ve lost over that, I think this is a good thing. If we can overcome the hurdles that are left, of course – like this one.

  8. Yes, blah could be correct on that one. We have a multitude of GeForce 4s at our house, and I convinced Dad to only buy 17-inch instead of 19-inch LCD monitors because GeForce 4s only support up to 1280×1024.

  9. It’s highly dependent on what card/driver you’re using. Also, some monitors just don’t do a good job of sending their info using the VESA EDID specification. So, like Claude said, you can try to get your driver to not use DDC to fetch these modelines. Look through /var/log/Xorg.0.log for Modeline and the preceding lines to see how DDC gathers this info from your monitor. For the older intel driver (don’t know what’s on FC7), you’d add Option "DDC" "no" to turn that off.

    If you end up creating a custom mode, the way you use it is to specify it with an identifier in the Monitor section:

    Section “Monitor”

    Modeline "my1600x1200" 162.00 1600 1664 1856 2160 1200 1201 1204 1250 +hsync +vsync
    EndSection

    and enable it in the Screen section:

    Section “Screen”

    SubSection “Display”

    Modes “my1600x1200” …
    EndSubSection
    EndSection

  10. Modern versions of Xorg have very impressive support for autoconfiguration. I’ve actually seen X perform better after deleting the xorg.conf. (Of course, you’d want to rename instead of delete, just to be safe.)
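
    Something like this, assuming the usual Fedora location for the file:

    # mv /etc/X11/xorg.conf /etc/X11/xorg.conf.backup

    Then restart X and see what it autodetects.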

  11. You can try read-edid. It reads the EDID information from your monitor and gives you back Modelines for your monitor, ready to be added to xorg.conf.

    It works like this:
    $ sudo get-edid | parse-edid

    Anyway, I thought xrandr >= 1.2 shouldn’t require that kind of magic.
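
    (With a RandR 1.2-capable driver the runtime equivalent would be something like the following; the modeline is the 1600x1200 one from the Xorg.0.log above, and "DVI-0" is a placeholder for whatever output name xrandr -q reports:)

    $ xrandr --newmode "1600x1200" 161.00 1600 1712 1880 2160 1200 1203 1207 1245 -hsync +vsync
    $ xrandr --addmode DVI-0 "1600x1200"
    $ xrandr --output DVI-0 --mode "1600x1200"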

    Good luck.

  12. I don’t think Modeline works in Fedora 7.

    At the end of the Display subsection, add a line:

    Modes "1600x1200" "some" "other" "resolutions"

    That worked for me (different monitor, though).

  13. Is it one of the ones with the buttons on the right-hand side of the screen? We have a lot of Samsung 20″ LCDs at work, and we’ve found that some of these models appear to be defective with recent nVidia drivers, giving exactly the same problem you are experiencing (the older drivers or the older model monitors both work fine; no fancy configuration required). Hilariously, if you start up the machine with a working monitor and then plug these monitors in, they will quite happily run at the correct resolution. I can’t remember if we ever properly solved the problem; I will have a look on Monday for you.

    I suspect the firmware on these monitors is a bit shitty, and more recent versions of the nVidia driver get confused by this shittiness.

  14. I seem to recall one of the nVidia drivers (proprietary or open) stopped correctly getting the refresh rates from the monitor at some point. I fixed it by temporarily switching to the other driver, noting down the refresh rates it detected, and then defining them explicitly in my X config file for the original driver.

  15. Hm, I have the same problem with a FSC P20-2, but only with the nv driver; nvidia works fine. I would really be interested in a solution. I haven’t found one myself and have been wondering why nobody else has this problem.

  16. I have a Samsung 204B on Debian sid, running from an onboard GeForce 6100 GPU (256MB reserved in BIOS). It was detected and ran at 1600×1200 with no preconfigured xorg.conf file (nv driver), but the display was offset 1.5 cm to the right. I tried the proprietary NVIDIA drivers, but these did not work with the Linux 2.6.21 kernel. I now run the following patched version at 1600×1200, which can be found with Google: NVIDIA-Linux-x86-100.14.11-pkg1-patched.run (the installer creates an xorg.conf file). Please email me for further info.
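
    (Installation is the usual NVIDIA installer routine, run from a text console with X shut down:)

    # sh NVIDIA-Linux-x86-100.14.11-pkg1-patched.run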

  17. Hi,

    I happen to run a Samsung 226BW at its native resolution of 1680×1050 with an nVidia (7600 GS) gfx card and Fedora 7 just fine. Connecting the monitor by DVI is definitely the preferred method, though, as the analog connection never got the display size right (though the resolution was correct).

    Maybe you should head over to livna.org and grab the “livna-config-display” and nvidia-driver packages (they work with kernels up to 2.6.22.2). No modeline hacks in xorg.conf, or anything of the sort, are necessary. For me, autodetection just worked right out of the box.

    If this does not solve the issue, your video card may be the culprit here, as was mentioned before. Have you tried “the other OS” and checked that it’s possible there to select and run the desired resolution? I am a bit lazy about math like that, but to me it appears 128 MB of RAM is not quite enough to run 1600×1200 in 32-bit mode.

    cheers,
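
    (For reference, once the livna repository is set up, the install was roughly the following; package names are from memory and may have changed:)

    # yum install livna-config-display kmod-nvidia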

  18. See nona’s comment above. This is a limitation in the nv driver. We’re using a panel setup frontend that’s limited by whatever the BIOS configured at boot time, and nv BIOSes have a habit of defaulting to 1280×1024.

    The usual workaround is to drive the panel over VGA instead, but we are planning to fix this properly someday.

  19. I think the point of the question was (at least for me) how to get the open-source nv driver working, not the closed-source nvidia driver.

  20. VGA isn’t so bad compared to DVI…I have a monitor with both VGA and DVI inputs and I’m constantly switching between both. Both inputs are running at 1920×1200, and honestly I couldn’t tell you which was VGA if you put a gun to my head.

  21. Turns out we sent ours back as defective. Got new ones that worked. Someone suggested that there might be a bad batch kicking around out there.
