I would recommend Matrox; I've heard good things about their 2D performance, though I know they're behind the times on 3D. And I'd consider nvidia to be reasonably OSS friendly.
2005-10-18 06:11 am (UTC)
Nvidia OSS friendly? Where've you been?
i have one of those at work... my honest recommendation is for 2 (or maybe 3) 21" or 19" monitors (a previous setup)... with a monitor that wide, if you're looking at stuff to the far right or far left of center, it's noticeably angled and farther away... with two monitors you can angle them so they're equidistant from your eyes (your mileage may vary with your own eyes, of course)... i also find the real-world boundary useful... ("maximize" also becomes useless with a monitor that big, whereas it still works with 2 monitors [not that i use it much])...
good for making co-workers jealous though...
2005-10-18 06:14 am (UTC)
But then I'd have to get angry about dual-head and Xinerama, and find that if I were to only use the binary-only ATI and nvidia drivers, everything would magically work perfectly.
Plus I remember doing some research on dual-head DVI and finding that almost nothing out there supports it with open drivers, and my motherboard only has one AGP slot.
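For anyone else fighting with this, the dual-head Xinerama setup looks roughly like the following in xorg.conf. This is only a sketch: all the identifiers are made up, whether the open radeon driver actually drives both heads depends on the card, and a real single-card config usually also needs matching BusID/Screen details.

```
# Sketch only: identifiers are invented; driver support varies by card.
Section "ServerLayout"
    Identifier "dualhead"
    Screen 0 "Left"  0 0
    Screen 1 "Right" RightOf "Left"
    Option "Xinerama" "on"
EndSection

Section "Device"
    Identifier "radeon-head0"
    Driver     "radeon"      # open driver
    Screen     0             # first head of a dual-head card
EndSection

Section "Device"
    Identifier "radeon-head1"
    Driver     "radeon"
    Screen     1             # second head
EndSection

Section "Screen"
    Identifier "Left"
    Device     "radeon-head0"
EndSection

Section "Screen"
    Identifier "Right"
    Device     "radeon-head1"
EndSection
```

With the binary nvidia driver you'd typically use its TwinView option instead of two Device sections, which is part of why "just use the closed driver" keeps winning.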
I would just give up and get nvidia, because their cards with closed source drivers *do* work better than ati with quasi-source drivers.
I am getting a bunch of the 24" Dells for office/lab/etc.; they're the best deal ever. You can get them for $800 or so sometimes, too.
I am mostly using a t43p with 1600x1200 as my primary machine these days though, which has a radeon x600.
2005-10-18 06:40 am (UTC)
From what I can tell, Intel is doing about the best in terms of providing open source video drivers for modern hardware. But I think it's pretty much limited to integrated chipsets, for laptops and the like.
Matrox used to be fairly good about open source drivers, but I've not followed them for quite a while (I've mainly used laptops with Intel chipsets in them for the last 3 years or so), so I'm not sure if this is the case for modern cards. Last I looked -- a couple of years back -- the open source drivers were about a generation behind, but otherwise good for 2D. The cards themselves didn't have the same extent of high-end 3D features as ATI or nVidia based cards.
The rest of the market seems to boil down to ATI OEMs and nVidia OEMs. ATI have somewhat-open-source drivers (there are open source drivers for approximately previous-generation cards, but not for a lot of the fancier features, plus some closed-source drivers with more features), and nVidia have no-cost-but-closed-source drivers. From all I've heard the nVidia support is very good (e.g., comparable with their Windows support in terms of currency and features), and the ATI support is not so good (at least compared with their Windows support).
Even pretty core Debian Developers were saying that they'd pick an nVidia chipset for Linux usage because the features/quality/etc. of the drivers were enough to make them put up with the closed-source nature of the drivers.
Basically I think the video card industry and open source community still haven't really figured out how to get on properly. So a heaping of pragmatism is probably still required to get anything useful done.
2005-10-18 06:57 am (UTC)
So a heaping of pragmatism is probably still required to get anything useful done.
That's what I suspected. I just try to gauge the landscape every couple years and hope things've changed.
Are you really gonna buy that big monitor and not watch HDTV on it? Just a thought :)
Yes, much hate for the video card industry.
The NYT article doesn't really give us information about how fair the comparison between 15" and 42" was. Resolution? Viewing angle?
There is, I dunno, 20- or 30- or 40-year-old HCI research comparing reading-related tasks done on a computer with the same tasks done with books; it found that users worked faster, made fewer mistakes, and were happier with the printed word. (Obviously we're not talking about search tasks here.)
However, they found the effect diminished with increasing screen resolution, that is, with increasing pixel-count of fonts. One assumes that with sufficient resolution the functionality advantages of screen would win.
Although this research seems to have been forgotten or ignored by the industry, I've been using 1600x1200 for as long as possible ('97 -ish?); I didn't buy a laptop until the first ones came out with 1600x1200.
I now use two side-by-side 21" Dell LCDs.
Unfortunately, I'm a Windows developer, and to get more pixels per character I have to set the conveniently provided 'large fonts' option in Windows. Changing it requires a reboot, so most applications never test it (if they even know about it), and many end up trying to print text in boxes too small to contain it. (I've encountered dialog boxes where the 'Cancel' and 'OK' buttons were actually unclickably off the edge of the dialog, presumably because they laid things out based on font metrics but didn't resize the dialog box overall.) It has been a painful 10-ish years.
2005-10-18 09:15 am (UTC)
FWIW, I had phenomenally better (yet still occasionally shitty) luck with the proprietary nVidia drivers than the utterly halfassed Matrox drivers. Fuck open sores.
it's not the size of the monitor that matters, it's the available screen area. you can be more productive if you can have more things up on the screen at once without having to flip between pages for relevant information, and you can get the same or better effect with multiple monitors on the same system
*two 21" 1600x1200 crt's, looking to add a third for a grand total of 4800x1200 screen resolution goodness*
2005-10-18 10:25 am (UTC)
Heh. Know what's scary? I've got that resolution (1920x1200) in my 15" LCD on my Inspiron.
Eye-busting good times.
can't help, as I'm running nvidia here with the closed-source drivers (I need dual head so I can have server monitors running off my main screen), but ramtops and I bought a brace of those Dells a month ago and they're necessary; don't wait, get (at least) one now. Never thought I'd ever say that about anything with a Dull badge, but I'm nothing if not pragmatic :)
No bad pixels spotted yet on either, either.
Just use VGA. The A/D converters in new monitors are so good there's really no difference between DVI and VGA.
2005-10-18 05:39 pm (UTC)
This is also my experience.
And DVI supports DRM (google HDCP) so if you're gonna go the Free route, you may as well go all the way.
I got the 2005FPW a couple months ago and I'm very pleased with it. Don't pay full price, though; there have been some good deals out there on these monitors. http://www.gotapex.com/deals.php?search=2405 has a coupon code that brings the 2405FPW down to $780, although it looks like it expires soon. At any rate, I recommend checking that site before buying.
2005-10-19 05:53 am (UTC)
11 minutes before it was supposed to expire and it says the coupon's expired. Must have hit the 5000 user max on it.
I have that monitor! It looks pretty nice in FreeBSD. ;)
I'm feeling peer pressure.
I agree with the suggestions that multiple small monitors are worth considering. I use dual 1280x1024 17" LCDs, and I'd sooner spend a grand on three or four more 1280x1024 LCDs than on one big monitor.
Ever noticed that the best LCD monitors seem to be from Korean manufacturers? Several months ago, I saw a display at Office Depot wherein about a dozen LCD monitors were all playing the same LOTR DVD. The Samsung and LG units both had strikingly better color depth than all of the other LCDs, including units of a certain very expensive Japanese brand.
Both of my LCDs have analog inputs only, but one of them is plugged into the DVI port of my video card via a DVI-to-VGA converter. Silly, but it works fine. Regardless, I don't think using DVI natively would offer any tangible benefits vs. what I have.