shakey wrote: I could put G Stark in one corner and K Rockwell in the other for a punch out about this...but...
Ok, first of all, I have no experience (yet) with LED monitors.
So, let's talk about LCDs and CRTs just to give the discussion a firm basis, and then perhaps you can ask the question that you posed ...
Potoroo wrote: Well, if doing it once made no discernable difference - the 2408WFP is reputed to be very accurate out of the box - I would very much like some serious reasons why it would need to be recalibrated monthly. That just doesn't seem reasonable.
Let's look at this: CRTs can noticeably degrade over time. The electron guns age and the phosphors fade, and like us, they slow down over time, right?

So, they need regular adjustments to ensure that they stay within the standards that we need in order to produce consistent results.
And please take note: there are six words in the prior paragraph that are of supreme importance: "in order to produce consistent results". There is no other reason, ever, to calibrate your systems; after all, you want to see prints that look like what you see on the screen.
There's our baseline. LCDs are backlit, and the backlighting can, over time, fail and fade. Or break down completely. How good are the standards under which your monitor was made? Over time, does the manufacturer keep the same supplier for every component? As components move from one supplier to another, how does that affect the overall product quality between buyer A, who bought something from batch F, and buyer B, who bought something from batch P?
Again, your goal with monitor calibration is to produce consistent results. This applies over time, too. So if you pull up an image from three years ago, one you post-processed on a CRT, but you're now using an LCD, how do you know that what you're seeing today on that LCD can be compared with what you saw three years ago on that CRT, unless both of those devices were calibrated to a common reference point?
Now, let's return to your question for a moment ... there's one phrase that requires attention: "no discernable difference".
Under what conditions? When you - when I, or when anyone else - look at a monitor, the results that we perceive will be affected by the ambient lighting conditions under which we are viewing what is displayed by that monitor. Are we in a bright or a dark room? Is the monitor beside a window with bright sunlight streaming through? Perhaps the room is lit by fluoros? Or maybe just plain old garden variety bulbs? QI bulbs perhaps?
Any and all of these will have an impact upon how we perceive what we're seeing on the screen, but none of them would be any sort of basis for saying that what we are seeing is "correct". It is only through the use of calibration tools, actually measuring the output of the screen, that an accurate and reliable assessment of that output can be made.
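To put a number on "accurate and reliable": calibration tools report the difference between a measured patch and its target as a delta-E value, and the whole argument here is that this number drifts over time whether or not our eyes notice. Here's a minimal sketch, in Python, of the simple CIE76 delta-E calculation; the Lab values are made-up illustrative figures, not measurements from any particular monitor.

import math

def delta_e_cie76(lab1, lab2):
    # CIE76 colour difference: Euclidean distance in L*a*b* space.
    # Roughly, a delta-E below ~1 is invisible to the eye; 2-3 is the
    # sort of drift a panel can show without anyone noticing.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical target patch vs. what a probe might read some months later.
target   = (50.0, 0.0, 0.0)   # the mid grey the profile asks for
measured = (51.5, 0.8, -1.2)  # what the panel actually puts out

print("delta-E: %.2f" % delta_e_cie76(target, measured))

That works out to a delta-E of about 2: a drift a probe can measure long before anyone looking at the screen would swear something had changed.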
That we, as users, might not be able to discern any changes is irrelevant. Our eyes adjust and compensate for these sorts of things, and we know little of the compensation processes being undertaken. Our bodies are very adaptable, and thus are totally unreliable when it comes to something like a scientific assessment of the calibration status of any monitor.
Now, what was it that you were saying that was unreasonable?
