JPG wrote: I have to jump into this quagmire.
Resolution - the smallest detectable change.
Accuracy - no deviation from 'true'.
These two things are NOT interdependent, but rather independent of each other.
One would 'hope' that accuracy deviation is smaller than resolution. However that ain't necessarily so.
One can create something with discrete steps that establish the resolution.
If those steps are not placed more accurately than the step size itself, accuracy becomes the defining factor.
Not clear since a better description escapes me tonight.
An example is the Wixey type angular indicator. Its stated accuracy is coarser than its resolution: 0.1 degree resolution vs. 0.2 degree accuracy.
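To put numbers on JPG's Wixey example, here is a quick Python toy sketch. The 0.18 degree bias is a value I made up just to sit inside a hypothetical +/-0.2 degree accuracy spec; it has nothing to do with how the real indicator works internally. It only shows resolution (the display step) and accuracy (offset from true) acting as two separate knobs:

def indicated_angle(true_deg, bias_deg=0.18, step_deg=0.1):
    """What the display would show for a given true angle.

    bias_deg models the accuracy error (a fixed offset inside an assumed
    +/-0.2 deg spec); step_deg models the 0.1 deg display resolution.
    """
    measured = true_deg + bias_deg                # accuracy: distance from 'true'
    return round(measured / step_deg) * step_deg  # resolution: snap to display steps

print(indicated_angle(45.00))  # 45.2 -- resolves 0.1 deg steps, yet sits 0.2 deg from true

Change bias_deg and the displayed value moves even though the 0.1 step never changes, which is the "accuracy worse than resolution" case.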
I don't think I stated that accuracy and resolution were interdependent for this laser measure device, but if I did, that was not my intent. I don't know enough about the internals of this device to make that assumption, which is why I have been treating them as separate entities. In the general case, can we say that accuracy and resolution are always independent of each other?
Resolution depends on what type of detector/sensor is used, and if the sensor/system has multiple ranges, then the resolution depends on the range selected. Accuracy will often depend on the selected range as well, so in that respect accuracy and resolution are often "linked": if one changes, the other may change too. There is some shared causality between them. A perfect example is the spec sheet from the DMM sitting on my bench (see below). From that spec sheet, they don't look "independent" to me.
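To show what I mean by "linked", here is a little Python sketch using a generic +/-(% of reading + counts) style spec. The ranges, percentages, and counts are numbers I invented for illustration; they are not the actual figures off my bench meter's spec sheet:

FULL_SCALE_COUNTS = 20000          # assumed 4.5-digit display

# range (V) : (% of reading, counts) -- made-up spec values
SPEC = {
    2.0:   (0.05, 3),
    20.0:  (0.05, 3),
    200.0: (0.10, 5),
}

def resolution(rng):
    """Smallest display step on a given range: one count."""
    return rng / FULL_SCALE_COUNTS

def accuracy(reading, rng):
    """Worst-case error band for a reading taken on a given range."""
    pct, counts = SPEC[rng]
    return reading * pct / 100.0 + counts * resolution(rng)

for rng in SPEC:
    print(f"{rng:6.0f} V range: resolution {resolution(rng):.4f} V, "
          f"accuracy +/-{accuracy(1.5, rng):.4f} V for a 1.5 V reading")

Pick a different range and both numbers move together, which is the shared causality I see in the spec sheet; but nothing forces a spec to be written that way, so they are still not the same thing.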
So, is this a quagmire yet? Maybe if I had brought up Schrödinger's cat and asked whether it was dead or alive while I was trying to measure the length of its tail with my laser measure, then it would have become a quagmire...