So, not really Vista related.
Lately I have been trying to learn the basics of digital photography, and I am busy developing a couple of thousand raw pictures taken over the summer. I got a Spyder3Elite to calibrate my monitor, and it turned out to be more of a manual piece of work than I expected. My Spyder is running version 3.1.1, and my monitor is a Dell 2407WFP-HC (wide gamut). My pictures will be shared with other people (not over the internet) and printed at home and in labs.
Anybody there with experience of monitor calibration?
When using the Spyder in ambient-light calibration mode, it suggests I calibrate my display to a white point of 5800 K, a gamma of 2.2 and a luminance of 125 cd/m2 (120 is the usual LCD default). 5800 K looks a bit yellow to me, so after studying various recommendations on the internet, I calibrated it to 6500 K instead. Better.
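As I understand it, "gamma 2.2" means the display's output luminance is roughly the input signal raised to the power 2.2 (with the signal normalised to 0..1). A tiny sketch of that relationship, using my 125 cd/m2 target as the white level (the function name and numbers are just my illustration, not anything from the Spyder software):

```python
def display_luminance(signal, gamma=2.2, max_cd_m2=125.0):
    """Approximate output luminance in cd/m2 for a normalised input signal 0..1,
    assuming a simple power-law (gamma) response and a 125 cd/m2 white point."""
    return max_cd_m2 * signal ** gamma

# A mid-grey signal of 0.5 comes out far darker than half luminance:
print(round(display_luminance(0.5), 1))  # prints 27.2, not 62.5
```

This is why a small change in the monitor's controls can shift the shadows a lot more than the highlights.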
The monitor has controls for brightness and contrast. The manual says that "Brightness adjusts the luminance of the backlight" and "The Contrast function adjusts the degree of difference between darkness and lightness" (well, surprise).
To set the luminance, I should adjust the brightness control first and then contrast until I reach my target.
To do this, I used a grey-scale test image and reduced the brightness control until I could barely see a difference between the two darkest boxes; then I adjusted contrast until the measured luminance was 125 cd/m2.
Basically, this method (first adjusting the brightness control and then the contrast) allows an endless number of possible settings, from brightness at 0.5% with contrast at the default 50% to whatever. I ended up with brightness at 26% and contrast at 42%, but I cannot say whether this is the best choice for developing my raws.
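In case it helps anyone reproduce what I did: a grey-scale step wedge like the one I used is easy to generate yourself. Here is a minimal sketch (my own throwaway script, not from any calibration package) that writes a 16-step wedge as a binary PGM file, which most image viewers can open; the step count and image size are arbitrary choices:

```python
def write_grey_wedge(path, steps=16, step_width=64, height=256):
    """Write a horizontal grey-scale step wedge as a binary PGM (P5) file.
    The two darkest steps are the ones I used to judge the brightness
    (black level) control."""
    width = steps * step_width
    # Build one row: each step's grey value is spread evenly over 0..255.
    row = bytearray()
    for s in range(steps):
        value = round(s * 255 / (steps - 1))
        row.extend([value] * step_width)
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (width, height))
        for _ in range(height):
            f.write(row)

write_grey_wedge("grey_wedge.pgm")
```

Display the resulting file full-screen, turn brightness down until the two leftmost (darkest) boxes are only just distinguishable, then raise contrast to hit the target luminance.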
Anyone available to give me guidance?
Is there a good grey scale anywhere on the net to find the best setting?
How much does it matter, what settings I choose?