24" Widescreen CRT (FW900) From Ebay arrived,Comments.

This may be the case, but I'd like to understand the exact interdependencies before making that assumption.

Eh, I don't know enough specifically to make that call. All I'm reporting is what I've heard and read, and what I've observed through my own work with WinDAS and other monitor adjustment.

A question I have for you is - how do you know exactly that your DTP-94 is reading accurately?
 
I have to interrupt your discussion with my trivial problem ;)

I recently switched back from Windows 8.1 to 7. Before, I never had problems running 2304x1440 via the modified 64-bit driver I got from here and CRU, but after installing Windows 7 again and the newest Catalyst, I can't use the "Use extended Display Identification Data or driver defaults" option.

I use a BNC-DVI cable, so obviously I won't get EDID information, but I have the driver installed. When I select it, the screen goes blank for a second and switches back to 2048x1536 max resolution and 120 Hz max refresh rate; even when I lower it to, for example, 1920x1200, it goes back to 2048x1536 (in the CCC and in the Windows resolution setting).
Adding any resolution higher than 2048x1536 won't show up in Windows. I had a newer Catalyst beta driver installed, but even going back to an older one on which it worked didn't help.
I have no idea why it suddenly won't work anymore :confused:
 
A question I have for you is - how do you know exactly that your DTP-94 is reading accurately?

A couple of reasons:

1: I've tested it against my brand new i1 Pro (which is a spectroradiometer), and the readings are virtually identical.

2: The measurements I get with both instruments match up very well with readings that other researchers have obtained when measuring the FW900's primaries.
 
I use a BNC-DVI cable, so obviously I won't get EDID information, but I have the driver installed. When I select it, the screen goes blank for a second and switches back to 2048x1536 max resolution and 120 Hz max refresh rate; even when I lower it to, for example, 1920x1200, it goes back to 2048x1536 (in the CCC and in the Windows resolution setting).
Adding any resolution higher than 2048x1536 won't show up in Windows. I had a newer Catalyst beta driver installed, but even going back to an older one on which it worked didn't help.
I have no idea why it suddenly won't work anymore :confused:

Not sure how to help you here as I don't have much experience in this area, but did you say that it was defaulting to 2048x1536 @ 120 Hz???

That doesn't make much sense. How do you know it was actually running at that refresh rate?
 
It can't run @ 120 Hz at 2048x1536; that's just listed as the max refresh rate (which should be 160 Hz at 640x480, I think).

this is how my CCC looks:
[screenshot: Catalyst Control Center resolution and refresh rate settings]


But PowerStrip, for example, reads the data correctly, so I think it has something to do with the driver:
[screenshot: PowerStrip monitor information readout]
 
Not sure how to advise. I will, however, make the observation that the resolutions the Catalyst drivers seem to be recognizing are 4:3 aspect ratios rather than 16:10, but you probably already noticed this.
 
Not sure how to advise. I will, however, make the observation that the resolutions the Catalyst drivers seem to be recognizing are 4:3 aspect ratios rather than 16:10, but you probably already noticed this.

indeed :D
(Well, it does list widescreen resolutions up to 1920x1200.) The driver probably sees VGA = CRT = 4:3 (I know of maybe 3 widescreen CRTs). Without installing the modified 64-bit FW900 driver via Device Manager it only goes up to 1600x1200, but I can't tell the AMD driver to use driver defaults (from the .inf file), which should give the correct resolutions.
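
For anyone who wants to sanity-check whether a custom mode like 2304x1440 should even fit the FW900's limits (the commonly cited figures are roughly 121 kHz max horizontal scan and 160 Hz max vertical refresh; double-check against the manual), here's a rough Python sketch. The ~30% horizontal / ~5% vertical blanking overheads are my own ballpark assumptions for GTF-style CRT timings, not exact values:

```python
# Rough feasibility check for a custom CRT mode.
# Assumptions: ~30% horizontal and ~5% vertical blanking overhead,
# which is only a ballpark for GTF-style timings, not the real formula.

def mode_estimate(h_active, v_active, refresh_hz,
                  h_blank_frac=0.30, v_blank_frac=0.05):
    """Return an estimated (horizontal_freq_khz, pixel_clock_mhz)."""
    h_total = h_active * (1 + h_blank_frac)   # pixels per line incl. blanking
    v_total = v_active * (1 + v_blank_frac)   # lines per frame incl. blanking
    h_freq_hz = refresh_hz * v_total          # lines drawn per second
    pixel_clock_hz = h_freq_hz * h_total      # pixels drawn per second
    return h_freq_hz / 1e3, pixel_clock_hz / 1e6

if __name__ == "__main__":
    # 121 kHz is the commonly quoted FW900 limit - treat it as an assumption.
    H_LIMIT_KHZ = 121.0
    for w, h, hz in [(2304, 1440, 80), (1920, 1200, 96), (2048, 1536, 75)]:
        h_khz, clk_mhz = mode_estimate(w, h, hz)
        verdict = "fits" if h_khz <= H_LIMIT_KHZ else "exceeds the limit"
        print(f"{w}x{h}@{hz}Hz -> {h_khz:.1f} kHz, "
              f"{clk_mhz:.0f} MHz pixel clock ({verdict})")
```

If the custom mode sits comfortably inside those limits and Windows still snaps back to 2048x1536, the problem is more likely the driver ignoring the override than the monitor rejecting the timing.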
 
Hi guys.. my monitor has begun to develop a loud buzz near the power..

sigh.. flyback?

is this caused by old caps?
 
A couple of reasons:

1: I've tested it against my brand new i1 Pro (which is a spectroradiometer), and the readings are virtually identical.

2: The measurements I get with both instruments match up very well with readings that other researchers have obtained when measuring the FW900's primaries.

Ah okay. Cool - just wondering, because I've never used such devices and so I wouldn't have the slightest clue where to start. I'm afraid that anyone just getting a cheap DTP-94 on ebay though may not be as lucky, and would have to send their unit in for recalibration.
 
Ah okay. Cool - just wondering, because I've never used such devices and so I wouldn't have the slightest clue where to start. I'm afraid that anyone just getting a cheap DTP-94 on ebay though may not be as lucky, and would have to send their unit in for recalibration.

As I mentioned in an earlier post, you can measure your primaries (in native mode, i.e. not sRGB) with the DTP-94 and see whether the chromaticities match up to what they should be. We're lucky that a monitor like the FW900 seems to have pretty tight tolerances on the phosphor chromaticities, so we can use those as a reference to see whether calibration is actually needed.

Harder to tell whether luminance readings are accurate, but for calibrating white point balance, chromaticity is more important.
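
If anyone wants to run that comparison themselves, here's a minimal sketch of the idea; every (x, y) value in it is a placeholder, not real FW900 or meter data, so substitute the published phosphor chromaticities and your own readings:

```python
# Sketch: compare measured primary chromaticities against published reference
# values for the FW900's phosphors. All numbers below are placeholders
# (hypothetical), NOT actual measurements - fill in your own.

reference = {   # (x, y) from a published FW900 phosphor measurement
    "red":   (0.000, 0.000),
    "green": (0.000, 0.000),
    "blue":  (0.000, 0.000),
}

measured = {    # (x, y) read off your own tube with the DTP-94 or i1 Pro
    "red":   (0.000, 0.000),
    "green": (0.000, 0.000),
    "blue":  (0.000, 0.000),
}

for name, (rx, ry) in reference.items():
    mx, my = measured[name]
    # Plain Euclidean distance in xy; agreement on the order of 0.001-0.002
    # suggests the meter matches the published chromaticities well.
    dxy = ((mx - rx) ** 2 + (my - ry) ** 2) ** 0.5
    print(f"{name}: dxy = {dxy:.4f}")
```

Note that xy distances aren't perceptually uniform; converting to u'v' first would be a fairer comparison, but for a quick "is my meter sane" check, xy is enough.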
 
couple reasons:

1: I've tested it against my brand new i1 pro (which is a spectroradiometer), and the readings are virtually identical

2: The measurements I get with both instrument match up very well to readings that other researchers have obtained when measuring the FW900's primaries.

1) "I've tested it against my brand new i1 pro (which is a spectroradiometer)"... Unless I read different or missed something, according to the information made available to me, the X-Rite i1 is an spectrophotometer, NOT an spectroradiometer. I never heard that the X-Rite i1 Pro is an spectroradiometer... Please let us know where did you get that information...

2) "the readings are virtually identical?" Compared to which standard? What are you measuring against to make such an statement? Look at your Delta E's? Most of them are above 3... Do you know what that means?

3) "The measurements I get with both instrument match up very well to readings that other researchers have obtained when measuring the FW900's primaries." With those Delta E's you obtained? Oh Please....

Unkle Vito!
 
when it does develop, is it there constantly?

Yes.. it's quite loud.. Bzzzzzzz.. medium pitch.. not high pitch like coil whine.. seems to be on the right side near the back.

Now.. I've heard that if it were the flyback, then the picture would get all fuzzed out, but my picture remains consistent - no change on screen.. just the buzzz.

Is it possible that my monitor does not like the sync info from my gfx card? It's ATI, and I've heard ATI doesn't have as great CRT support as Nvidia.
 
Hi guys.. my monitor has begun to develop a loud buzz near the power..

sigh.. flyback?

is this caused by old caps?

Again, without performing a complete diagnosis on it, and based on what you are describing...

a) IF the monitor turns on, and the unit experiences a constant buzzing noise in the PSU area, the PSU is going bad.

b) IF the monitor DOES NOT turn on and makes a buzzing noise in the FBT area, followed by a flashing light lasting 1-2 seconds in two intervals, the tube may have a short.

Hope this helps...

Sincerely,

Unkle Vito!
 
Again, without performing a complete diagnosis on it, and based on what you are describing...

a) IF the monitor turns on, and the unit experiences a constant buzzing noise in the PSU area, the PSU is going bad.

b) IF the monitor DOES NOT turn on and makes a buzzing noise in the FBT area, followed by a flashing light lasting 1-2 seconds in two intervals, the tube may have a short.

Hope this helps...

Sincerely,

Unkle Vito!

Hi UV... thx for the -bites

You are on the West coast? I'm stuck in Penn
 
1) "I've tested it against my brand new i1 pro (which is a spectroradiometer)"... Unless I read different or missed something, according to the information made available to me, the X-Rite i1 is an spectrophotometer, NOT an spectroradiometer. I never heard that the X-Rite i1 Pro is an spectroradiometer... Please let us know where did you get that information...

A spectroradiometer measures the spectral power distribution of a light source. A spectrophotometer measures the spectral quantities of objects. The latter is typically done by illuminating the sample with a reference spectral source (which is generated by the spectrophotometer).

I am using the definitions provided in chapter five of Janos Schanda's Colorimetry: Understanding the CIE System. The chapter is authored by Yoshi Ohno, from the National Institute of Standards and Technology.

Given that the i1 Pro measures both emissive spectra and objects, the instrument is actually both a spectroradiometer and a spectrophotometer.


2) "the readings are virtually identical?" Compared to which standard? What are you measuring against to make such an statement? Look at your Delta E's? Most of them are above 3... Do you know what that means?

The delta E's that you saw have nothing to do with the accuracy or reliability of my instruments. They reflect deviation from Rec 709's D65 across my grayscale. They do not reflect deviations compared to a reference instrument. A display that hasn't had a grayscale calibration done can have huge delta Es even with a spectroradiometer that costs hundreds of thousands of dollars.

And yes, I know what delta E means, and I've asked you about this before, as you keep saying you get delta E's of under 0.05 percent. Do you realize that this is meaningless?

Delta E refers to the three dimensional geometric distance between two points in a perceptually uniform color space (this is a color space that has been transformed so that a given distance between any two points will result in the same perceptual difference, regardless of which two points are chosen). As far as I understand, a delta E of under 2.3 is perceptually indistinguishable (although the values 1.0 and 3.0 are often cited).
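
To make that concrete, here's a small sketch (mine, not taken from HCFR or any calibration package) of the plain CIE76 version of delta E, computed from XYZ relative to a reference white; the numbers in the example at the bottom are hypothetical:

```python
import math

def xyz_to_lab(X, Y, Z, white=(95.047, 100.0, 108.883)):
    """Convert CIE XYZ to L*a*b* relative to a reference white (default: D65)."""
    def f(t):
        d = 6 / 29
        return t ** (1 / 3) if t > d ** 3 else t / (3 * d ** 2) + 4 / 29
    fx, fy, fz = (f(v / w) for v, w in zip((X, Y, Z), white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(lab1, lab2):
    """CIE76 delta E: Euclidean distance between two L*a*b* points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

if __name__ == "__main__":
    # Hypothetical example: a white patch measured slightly off D65.
    measured = xyz_to_lab(94.0, 100.0, 105.0)
    target   = xyz_to_lab(95.047, 100.0, 108.883)   # D65 itself
    print(f"dE76 = {delta_e76(measured, target):.2f}")
```

Later formulas (CIE94, CIEDE2000) weight the terms differently, but the idea of a distance in a roughly perceptually uniform space is the same.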

As for your question of which standard I'm using, I've gone through the literature and found research that has actually measured the phosphors of the FW900 and reported the chromaticities. One of these studies was commissioned by the National Information Display Laboratory. The chromaticities that I obtained with my brand new i1 pro (factory certified) when I measured the phosphors of my FW900 match up precisely with those obtained by other researchers.

More relevant to this particular discussion, my DTP-94 matches precisely (within 0.001) what I obtain with my i1 pro.


3) "The measurements I get with both instrument match up very well to readings that other researchers have obtained when measuring the FW900's primaries." With those Delta E's you obtained? Oh Please....

Unkle Vito!

I hope you realize that your objection here is meaningless, based on my explanation earlier in this post.

So again, I will ask you the question:

What exactly are you referring to when you say your displays leave your lab with delta E's of under 0.05%? Are you referring to the delta E of each luminance level's white point balance?
 
So for WinDAS. Do I have to have a clean system to run it or would any system work? Because I'm still getting the ECS syntax error NG! NG! NG!; even after running a version I had from 2010.
 
So for WinDAS. Do I have to have a clean system to run it or would any system work? Because I'm still getting the ECS syntax error NG! NG! NG!; even after running a version I had from 2010.

Are you sure the version says 2010 or is it 2001? WinDAS didn't get updated (at least the WinDAS we know and love) to 2010.

EDIT: Kermie - you have a PM. :)
 
So for WinDAS. Do I have to have a clean system to run it or would any system work? Because I'm still getting the ECS syntax error NG! NG! NG!; even after running a version I had from 2010.

It might be your cable, but not sure. I remember reading that if you accidentally switch the gnd and +5v order, you can blow the cable. Might be worth buying a new one to test, if nothing else seems to work. Where do you live btw?
 
It might be your cable, but not sure. I remember reading that if you accidentally switch the gnd and +5v order, you can blow the cable. Might be worth buying a new one to test, if nothing else seems to work. Where do you live btw?

Corona, CA; I seem to be the only one that has an FW900.
 
If you have the proper instrumentation and know what you are doing, WinDAS adjusts white balance properly.

WinCATs adjusts both white balance and colors, and for that you will need a spectroradiometer. We have a Minolta CA-100 and a CA-210 for that. WinDAS and WinCATs, when they are acquired from Sony (legit licenses), both require dongles and registration to be activated. Otherwise, they will not work.

just found another sony bulletin, discussing WinCATs.

I've also gone through the WinCATs user manual.

From what I can gather, it's simply an automated white point balance procedure, that requires one of two color analyzers:

the Konica-Minolta CA-100, or the Minolta TV-2130.

I've gone through the user manuals of both these instruments, and they are not spectroradiometers, they are tristimulus devices (i.e. colorimeters).

So, from what I can gather, as long as you have WinDAS, and a colorimeter or spectroradiometer you trust, you can achieve exactly the same outcome as an automated WinCATs procedure, but it will take more time.
 
just found another sony bulletin, discussing WinCATs.

I've also gone through the WinCATs user manual.

From what I can gather, it's simply an automated white point balance procedure, that requires one of two color analyzers:

the Konica-Minolta CA-100, or the Minolta TV-2130.

I've gone through the user manuals of both these instruments, and they are not spectroradiometers, they are tristimulus devices (i.e. colorimeters).

So, from what I can gather, as long as you have WinDAS, and a colorimeter or spectroradiometer you trust, you can achieve exactly the same outcome as an automated WinCATs procedure, but it will take more time.


No... It is not automated, and you can run the program with the CA-100, the CA-210 and the CS-2000, which we have in the lab. The WinCATs program, like WinDAS, is not dependent on the measuring device. The measuring device will take a reading and, depending on the results, you need to make the adjustments in the parameter(s) the program (WinDAS/WinCATs) tells you to adjust to achieve targets. To perform the WBA accurately, you must run both processes: one to adjust/calibrate and the other to confirm the results of what you just did. That's the way we do it in our lab.

Hope this helps...

Unkle Vito!
 
No... It is not automated and you can run the program with the CA-100, the CA-210 and the CS-2000, which we have in the lab.

According to the manual, it does have an automatic adjustment protocol, and you can set the tolerances of x,y, and Y independently.

from the manual:

All adjustments and inspections are operated automatically except the process of manual adjustment as G2 adjustment.

Then again, I've never actually used the software, so I can't be sure.

The CS-2000 is a BEAUTIFUL instrument. I wish I could work with one :)


To perform the WBA accurately, you must run both processes: one to adjust/calibrate and the other to confirm the results of what you just did. That's the way we do it in our lab.

I like the WinDAS white point balance test - it allows you to slide the contrast level and see how the chromaticity changes, so it gives you an idea of how well the grayscale tracks.

One thing that many calibrators do when testing grayscale tracking (white point balance across the range of luminance) is a systematic test from 0% to 100% luminance, usually in 10 percent intervals, recording delta E's at each of those points. Due to display and instrument limitations, you'll rarely see low delta E's at the very low luminance levels. Even on a high-end Sony Trimaster OLED, the WPB doesn't track perfectly below 0.1 cd/m2 (see fig. 11 of this recent open-access paper from the Journal of Vision).

Here's an example of how grayscale tracking can be reported. This was a special calibration done by Scott Wilkinson and Robert Heron on a high-end OLED TV (Samsung KN55S9C, close to $10,000) owned by Leo Laporte.

Notice in this figure that the average delta E was 2.28, which is still considered very good (although their other instrument reported delta E's below 1).

Here's the video of the entire calibration process.

Keep in mind that those delta E's are simply delta E's relative to the chromaticity of D65.

Delta E is used in many contexts - when measuring the quality of an instrument, Delta E refers to the difference of that instrument relative to a lab-grade reference instrument. Delta E can also be used to refer to repeatability measurements within the same instrument, or between instruments of the same model.

When reporting Delta E's in the context of display calibration, it's done with reference to a particular standard. In the case of HD video, that standard is Rec 709.

With computer displays, it can be tricky to report these results, since the software that does the measurements typically runs in a Windows or OS X environment, and there may be weak links in the display pipeline that alter the final signal. With a signal generator, you can be certain that what you see reflects the monitor's output at its most basic operating level.

Once you introduce video cards, operating system, and test pattern software, there is the risk of all those components changing the original signal.

There are a couple of ways to deal with this, however. One is to really manage your operating system and software environment carefully, to ensure that there are no weak links in the pipeline. One can confirm whether this has been done successfully by measuring test patterns in a signal-generator environment and comparing the x,y,Y readings to those obtained when measuring the same test patterns in a Windows + software environment. If they are the same, then one can proceed with a degree of confidence. Vito, or anyone else: if you end up doing this experiment, I'd be very curious to learn the results!

The other way is to actually measure the x,y,Y values at different IRE's using the signal generator, and then record the values (from 0% to 100% or whatever you choose). You can then open up HCFR, and input those values into the grayscale measurement tab (check the box called "editable data"), and it will automatically calculate the Delta E's at each luminance level that you input.
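
A minimal sketch of that pipeline check might look like the following; every reading in it is a hypothetical placeholder, to be replaced with your own signal-generator and Windows-pipeline measurements of the same patterns:

```python
# Sketch: compare x,y,Y readings taken via a signal generator against readings
# of the same test patterns rendered through the GPU + OS + test-pattern
# software. All values below are hypothetical placeholders.

generator_readings = {   # pattern level (%) -> (x, y, Y in cd/m2)
    100: (0.313, 0.329, 85.0),
    50:  (0.313, 0.329, 18.0),
    20:  (0.313, 0.329, 3.0),
}

pipeline_readings = {    # same patterns through the Windows display pipeline
    100: (0.314, 0.329, 84.6),
    50:  (0.312, 0.330, 17.8),
    20:  (0.315, 0.328, 3.1),
}

for level, (gx, gy, gY) in sorted(generator_readings.items(), reverse=True):
    px, py, pY = pipeline_readings[level]
    print(f"{level:3d}%: dx={px - gx:+.3f}  dy={py - gy:+.3f}  "
          f"dY={100 * (pY - gY) / gY:+.1f}%")

# If the differences stay within the meter's own repeatability, the video
# card, OS and test-pattern software are probably not altering the signal.
```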
 

I'm more amazed by the black level of that OLED TV. 0.000. That is pitch black. Not even any CRT that I've ever seen (including the FW900) could display a perfectly black screen. They always had some glow in the dark. Of course, maybe with proper calibration they can reach that but none of mine ever did, unless you set the G2 so low that it ate up the lower range of the grayscale.
 
According to the manual, it does have an automatic adjustment protocol, and you can set the tolerances of x,y, and Y independently.

from the manual:



Then again, I've never actually used the software, so I can't be sure.

The CS-2000 is a BEAUTIFUL instrument. I wish I could work with one :)




I like the WinDAS white point balance test - it allows you to slide the contrast level and see how the chromaticity changes, so it gives you an idea of how well the grayscale tracks.

One thing that many calibrators do when testing grayscale tracking (white point balance across the range of luminance) is a systematic test from 0% to 100% luminance, usually in 10 percent intervals, recording delta E's at each of those points. Due to display and instrument limitations, you'll rarely see low delta E's at the very low luminance levels. Even on a high-end Sony Trimaster OLED, the WPB doesn't track perfectly below 0.1 cd/m2 (see fig. 11 of this recent open-access paper from the Journal of Vision).

Here's an example of how grayscale tracking can be reported. This was a special calibration done by Scott Wilkinson and Robert Heron on a high-end OLED TV (Samsung KN55S9C, close to $10,000) owned by Leo Laporte.

Notice in this figure that the average delta E was 2.28, which is still considered very good (although their other instrument reported delta E's below 1).

Here's the video of the entire calibration process.

Keep in mind that those delta E's are simply delta E's relative to the chromaticity of D65.

Delta E is used in many contexts - when measuring the quality of an instrument, Delta E refers to the difference of that instrument relative to a lab-grade reference instrument. Delta E can also be used to refer to repeatability measurements within the same instrument, or between instruments of the same model.

When reporting Delta E's in the context of display calibration, it's done with reference to a particular standard. In the case of HD video, that standard is Rec 709.

With computer displays, it can be tricky to report these results, since the software that does the measurements typically runs in a Windows or OS X environment, and there may be weak links in the display pipeline that alter the final signal. With a signal generator, you can be certain that what you see reflects the monitor's output at its most basic operating level.

Once you introduce video cards, operating system, and test pattern software, there is the risk of all those components changing the original signal.

There are a couple of ways to deal with this, however. One is to really manage your operating system and software environment carefully, to ensure that there are no weak links in the pipeline. One can confirm whether this has been done successfully by measuring test patterns in a signal-generator environment and comparing the x,y,Y readings to those obtained when measuring the same test patterns in a Windows + software environment. If they are the same, then one can proceed with a degree of confidence. Vito, or anyone else: if you end up doing this experiment, I'd be very curious to learn the results!

The other way is to actually measure the x,y,Y values at different IRE's using the signal generator, and then record the values (from 0% to 100% or whatever you choose). You can then open up HCFR, and input those values into the grayscale measurement tab (check the box called "editable data"), and it will automatically calculate the Delta E's at each luminance level that you input.


I've just gotten off the phone with the X-Rite Tech Support folks and have confirmed that the i1 instrument you have IS NOT a spectroradiometer but a spectrophotometer, as I indicated to you several times.

If you want to continue the discussion, please call Kevin at X-Rite and he will be more than glad to send you the specs on the instrument, and any other spectrophotometer they have.

Hope this helps...

Unkle Vito!
 
A spectroradiometer measures the spectral power distribution of a light source. A spectrophotometer measures the spectral quantities of objects. The latter is typically done by illuminating the sample with a reference spectral source (which is generated by the spectrophotometer).

I am using the definitions provided in chapter five of Janos Schanda's Colorimetry: Understanding the CIE System. The chapter is authored by Yoshi Ohno, from the National Institute of Standards and Technology.

Given that the i1 Pro measures both emissive spectra and objects, the instrument is actually both a spectroradiometer and a spectrophotometer.




The delta E's that you saw have nothing to do with the accuracy or reliability of my instruments. They reflect deviation from Rec 709's D65 across my grayscale. They do not reflect deviations compared to a reference instrument. A display that hasn't had a grayscale calibration done can have huge delta Es even with a spectroradiometer that costs hundreds of thousands of dollars.

And yes, I know what delta E means, and I've asked you about this before, as you keep saying you get delta E's of under 0.05 percent. Do you realize that this is meaningless?

Delta E refers to the three dimensional geometric distance between two points in a perceptually uniform color space (this is a color space that has been transformed so that a given distance between any two points will result in the same perceptual difference, regardless of which two points are chosen). As far as I understand, a delta E of under 2.3 is perceptually indistinguishable (although the values 1.0 and 3.0 are often cited).

As for your question of which standard I'm using, I've gone through the literature and found research that has actually measured the phosphors of the FW900 and reported the chromaticities. One of these studies was commissioned by the National Information Display Laboratory. The chromaticities that I obtained with my brand new i1 pro (factory certified) when I measured the phosphors of my FW900 match up precisely with those obtained by other researchers.

More relevant to this particular discussion, my DTP-94 matches precisely (within 0.001) what I obtain with my i1 pro.




I hope you realize that your objection here is meaningless, based on my explanation earlier in this post.

So again, I will ask you the question:

What exactly are you referring to when you say your displays leave your lab with delta E's of under 0.05%? Are you referring to the delta E of each luminance level's white point balance?


I've just gotten off the phone with the X-Rite Tech Support folks and have confirmed that the i1 instrument you have IS NOT a spectroradiometer but a spectrophotometer, as I indicated to you several times.

If you want to continue the discussion, please call Kevin at X-Rite and he will be more than glad to send you the specs on the instrument, and any other spectrophotometer they have.

Hope this helps...

Unkle Vito!
 
I've just gotten off the phone with the X-Rite Tech Support folks and have confirmed that the i1 instrument you have IS NOT a spectroradiometer but a spectrophotometer, as I indicated to you several times.

If you want to continue the discussion, please call Kevin at X-Rite and he will be more than glad to send you the specs on the instrument, and any other spectrophotometer they have.

Hope this helps...

Unkle Vito!


Please tell me what you think the difference between a spectrophotometer and a spectroradiometer is.
 
Here, this is written by Yoshi Ohno, whose expertise I trust over just about anyone.

[scanned excerpts: Ohno's definitions of "spectroradiometer" and "spectrophotometer" from Colorimetry: Understanding the CIE System]


Do you understand that the i1 Pro measures both emissive sources and object sources (referred to in the X-Rite terminology as display measurement and spot measurement), and is therefore, according to the authoritative definition I have provided, both a spectroradiometer and a spectrophotometer?

I understand it's confusing, as the terms radiometric and photometric are usually distinguished in that one involves radiometric quantities, such as radiant flux (in watts), while the other involves photometric quantities, such as luminous flux (in lumens), and in order to convert from one to the other you have to take into account the spectral luminous efficiency function.

So one might think that a spectroradiometer measures radiance, while a spectrophotometer measures luminance.

But this is not what separates the two. Converting from radiance to luminance is not a function of the optics or the hardware of the instrument, it is a simple mathematical function that is applied to the Spectral Power Distribution (SPD).
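
As a rough illustration of that mathematical step, here's a sketch that integrates a spectral radiance distribution against the CIE 1924 V(λ) function; it assumes you supply both the SPD samples and the V(λ) table yourself (neither is hard-coded here):

```python
import numpy as np

def luminance_from_spd(wavelengths_nm, spectral_radiance, v_lambda):
    """
    Lv = 683 * integral( L_e(lambda) * V(lambda) dlambda )

    `spectral_radiance` in W * sr^-1 * m^-2 * nm^-1, sampled at `wavelengths_nm`;
    `v_lambda` are the CIE 1924 V(lambda) values at the same wavelengths
    (load the table yourself - it is not included here).
    Returns luminance in cd/m^2.
    """
    integrand = np.asarray(spectral_radiance) * np.asarray(v_lambda)
    dl = np.diff(np.asarray(wavelengths_nm, dtype=float))
    # trapezoidal integration over wavelength
    return 683.0 * float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * dl))
```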

I can actually measure SPDs using my i1 Pro.

This is what it looks like (courtesy of Zoyd, who took the measurements in HCFR with his i1 Pro):

[plot: spectral power distribution of the FW900's phosphors, measured with an i1 Pro in HCFR]


Another point of confusion is that people often use the terms spectroradiometer and spectrophotometer interchangeably, but, according to the authoritative source, a spectroradiometer measures a test light directly, whereas a spectrophotometer uses a reference light to illuminate a test sample, then reads the reflected light off that sample. The combination of the reference illuminant and the spectral reflectance factor of the sample will determine the final reading.

In this sense, a spectrophotometer is actually more sophisticated than a spectroradiometer, as a spectroradiometer has to only measure incoming light, whereas a spectrophotometer has to both generate a reference illuminant, and read the reflected light.
 
Like I said, please continue the discussion with the professionals at X-Rite. They are the ones that designed and built the instruments and pretty much know more than you when it comes to the instruments. If you cannot understand the specs of the instruments, then I can't help you, but maybe the professionals at X-Rite may be able to further assist you with your discussion and concerns. Kevin at Tech Support may be able to help you understand the instrument: what it is, what it does, what it can do, what it can't do, and its uses...

This is the end of the discussion for me on this topic...

Sincerely,

Unkle Vito!
 
Here, this is written by Yoshi Ohno, whose expertise I trust over just about anyone.

[scanned excerpts: Ohno's definitions of "spectroradiometer" and "spectrophotometer" from Colorimetry: Understanding the CIE System]


Do you understand that the i1 Pro measures both emissive sources and object sources (referred to in the X-Rite terminology as display measurement and spot measurement), and is therefore, according to the authoritative definition I have provided, both a spectroradiometer and a spectrophotometer?

I understand it's confusing, as the terms radiometric and photometric are usually distinguished in that one involves radiometric quantities, such as radiant flux (in watts), while the other involves photometric quantities, such as luminous flux (in lumens), and in order to convert from one to the other you have to take into account the spectral luminous efficiency function.

So one might think that a spectroradiometer measures radiance, while a spectrophotometer measures luminance.

But this is not what separates the two. Converting from radiance to luminance is not a function of the optics or the hardware of the instrument, it is a simple mathematical function that is applied to the Spectral Power Distribution (SPD).

I can actually measure SPDs using my i1 Pro.

This is what it looks like (courtesy of Zoyd, who took the measurements in HCFR with his i1 Pro):

[plot: spectral power distribution of the FW900's phosphors, measured with an i1 Pro in HCFR]


Another point of confusion is that people often use the terms spectroradiometer and spectrophotometer interchangeably, but, according to the authoritative source, a spectroradiometer measures a test light directly, whereas a spectrophotometer uses a reference light to illuminate a test sample, then reads the reflected light off that sample. The combination of the reference illuminant and the spectral reflectance factor of the sample will determine the final reading.

In this sense, a spectrophotometer is actually more sophisticated than a spectroradiometer, as a spectroradiometer has to only measure incoming light, whereas a spectrophotometer has to both generate a reference illuminant, and read the reflected light.


Like I said, please continue the discussion with the professionals at X-Rite. They are the ones that designed and built the instruments and pretty much know more than you when it comes to the instruments. If you cannot understand the specs of the instruments, then I can't help you, but maybe the professionals at X-Rite may be able to further assist you with your discussion and concerns. Kevin at Tech Support may be able to help you understand the instrument: what it is, what it does, what it can do, what it can't do, and its uses...

This is the end of the discussion for me on this topic...

Sincerely,

Unkle Vito!
 