by Halc on Wed Oct 29, 2003 8:56 am
Thank you RJW for that excellent update!
Let me comment on a few things.
These are my musings, based on my ongoing CD-R testing and research into DVD-R/+R standards and into testing gear and methodology.
My main arguments:
1) KProbe is not useless
KProbe does seem to measure reading-process-related bit errors with useful statistical repeatability FOR A PARTICULAR LITEON MODEL (excepting some models with known inaccuracies).
That is, the results using one disc and several same model drives correlate reasonably well with each other.
This means that for testing the relative count of reading-process-related bit errors, KProbe is indeed very useful.
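To make "statistical repeatability" concrete: if two drives of the same model scan the same disc and their per-zone error counts correlate strongly, the relative readings are usable even if the absolute numbers differ. Here is a toy sketch with fabricated numbers (these are NOT real KProbe figures, just an illustration of the idea):

```python
# Toy sketch: how one might quantify scan repeatability across two
# same-model drives reading the same disc. All numbers are fabricated.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

drive_a = [12, 18, 25, 40, 33, 21]  # error counts per disc zone, drive A (made up)
drive_b = [14, 17, 28, 44, 30, 24]  # same disc, second drive of same model (made up)

r = pearson(drive_a, drive_b)
print(f"correlation = {r:.2f}")  # a high r means relative readings are repeatable
```

A high correlation across same-model drives is exactly what makes KProbe useful as a relative measure; a low one would mean even same-model comparisons are suspect.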
2) KProbe results cannot be generalized to other drives or to other discs
KProbe results with disc A do not mean that disc A is generally good or bad. That depends on the writer and, to a lesser extent, the reader.
Low level results from a professional analyzer are needed (over a large sampling of same discs) to get a relative measure about the quality of a disc under varying writing conditions.
KProbe cannot offer these low-level results. Plextools offers some results like these for CD discs, and the generalisability of those results is still somewhat up in the air (yes, I've read your excellent report on the Plextools test earlier).
3) KProbe results cannot be used as an absolute measure
If one gets X amount of Y-type errors on disc A with KProbe and a certain LiteOn drive, does that mean the written disc is within specifications (for DVD and for DVD+R) or not?
This is impossible to deduce from the results.
Since low-level problems manifest themselves as reading-process-related bit errors, which vary from drive to drive, it is impossible to generalise absolute measures from one drive to another.
We can only say whether reading on that particular drive stayed within the specifications or not.
We cannot say for sure that reading in another drive (different model/maker) will produce a similar result.
4) Calibrated Audio Development CATS analyzers have limits to their testing capability, which can sometimes be exceeded by consumer drives (e.g. LiteOn and/or Plextor drives)
In my CD-R testing I have come across CD-R discs with such bad low-level problems that CATS plainly refused to return any low-level measurements for them; the testing process simply failed to start.
However, I was able to successfully measure some rough C1 and C2 error distributions with a Plextor Premium and with LiteOn 48246S and 52327S drives.
Furthermore, I was able to extract the data from a disc that CATS flatly refused even to test, let alone measure for reading-process-related bit-level errors.
Hence, the testing drive, even in calibrated CATS units, is just an example of what one particular type of drive can do at its best.
It may perform wonderfully on certain types of disc, yet utterly fail on others that remain readable in consumer drives.
Of course, CATS is under constant development for testing purposes.
That's why situations where it fails and consumer drives excel (and not vice versa) are probably statistically quite rare. Probably, because I don't have a large enough sampling to argue this in a believable manner, even to myself :)
5) There are no absolute single set of measurements for low level attributes and reading process related bit errors that are the final truth
From the above it is painfully apparent that there is no single set of measurements that is 'the truth, the whole truth and nothing but the truth'.
In measuring analog signals (that's how bits are encoded on discs, as analog variations), the measurements are always a combination of various factors:
- the measurement technique (what is measured and how)
- the measurement device (CATS, LiteOn, Premium, etc)
- the measurement situation variables (EMI, RFI, humidity, temp, vibration, etc)
- measurement signal processing (ASICs, software, etc)
However, after having seen CATS in action along with three other professional-quality analyzers, I can say that for overall professional management of disc quality, the professional testers are much, much more useful than something like KProbe or Plextools Pro.
They simply return so much more useful information, and you get better cross-referencing from one measurement to another, which enables you to deduce further things from the results.
Bit-level errors are for the most part reading-process-related, and it's debatable whether they are even on the discs themselves.
As such, the low-level measurements that precede bit-level errors are really, really useful in disc quality analysis.
6) What we need are rough relative measures of low level problems on discs, done using consumer gear (they don't need to be overly generalisable or absolutely correct)
We need something like Plextools Professional for DVD discs, with more parameters and more manufacturer drives supported than just Plextor.
I believe this is in our future if we press on and ask for these things from manufacturers repeatedly.
While getting there, I still believe that KProbe and UM Doctor Pro II can serve as useful relative measures for determining which DVD discs read more successfully in the drive on which they were tested (though this may not always give us a very accurate insight into writing quality or compatibility with the disc).
So, absolute measures or generalisations from test drive to other models are still merely a pipe dream, although we can sometimes speculate interesting and even useful things, when we have enough data from various models available.
In the end, please do not take this as automatically disagreeing with anybody.
I'm merely writing what I've learned and what I think is the most useful way of looking at these tests.
I think I agree with 99% of what RJW has written.
I only think that KProbe/UM Doctor Pro II are not 'useless' per se.
They can be useful, even if somewhat inaccurate, ambiguous in their labelling of various error levels, and even if the measurements cannot be inductively generalized to a larger population of drives.
friendly regards,
Halcyon
PS Yes, my CD-R test is coming along slowly. I've hit some bumps along the way and have other matters to tend to as well. More news to follow as I progress.