

CD-R media longevity / durability / error tolerance TEST


Postby Halc on Thu Jan 01, 1970 1:34 am

This idea was sparked by Nox's excellent post about DIY CD-R media testing.

I have decided to test the longevity, durability and resistance to error build-up of the best available CD-R media (excluding medical grade media). Nox has initially offered to help me out. We'll find out if that still stands after this message :)

This test would need more volunteers.

The idea is to test the discs' resistance to heat, humidity and radiation (UV, infrared, visible) by placing them outside in the sun for the duration of the summer.

The data contents (same on each disc, verified 1:1 before testing) would then be read and tested every second day to measure any build-up of read errors (and the time required to read the disc).

The test would run until the last disc fails to read without errors, or until the testers run out of steam, whichever comes first. I hope to be able to run the test for at least 80-90 days.

Preliminary results showing how this kind of testing can affect CD-R quality remarkably fast can be found at the address Nox gave in an earlier thread:

http://es.geocities.com/pruebacds/Cds/frcd.html


NOW, I need your help

1) Media to be tested
I need suggestions for HIGH QUALITY CDs to be tested from manufacturers/brands that are not already on the list:

10 x Mitsui Gold Ultra (Mitsui)
10 x Kodak Ultima Silver+Gold 80 24x (Kodak)
10 x Philips CD-R 80 Silverspeed 32x (Ritek)
10 x Fuji CDR-74 24x (Fuji)
10 x Ricoh CD-R 80 (Ricoh)
10 x Hi-Space CarbonSound 80 min (MPO)
10 x TDK Reflex Ultra 32x (Taiyo Yuden)
10 x Verbatim DataLifePlus Super Azo 32x (Mitsubishi)

Please, no suggestions for bulk media or manufacturers already included above. REPEAT: no bulk media will be tested. If you want to test bulk media, run your own test :)

I think eight is the absolute maximum number of different brands to test. More than that and it'll get really cumbersome. Five brands would be much less work, but it would also leave a lot of important brands out.

2) Methodology to be used

The initial test procedure would be as follows. Please give comments on how to improve the test. If you can only criticise without offering an improvement, I'm not interested in your comments :)

- X volunteers (hopefully 5) each use/test Y pieces of test media (hopefully 2) from each manufacturer (that means up to 16 discs per volunteer!)

- The same audio data is burned on each disc and verified to be 1:1 on each test disc before the test starts

- Image of the bit-perfect test data is written to hard-disk for comparison purposes

- Read speed (test for 1:1 accuracy) of each disc is written down before the test starts (each speed measured on each volunteer's respective setup)

- Discs are labelled clearly on the CLEAR inner ring of the disc. No stickers, no writing on the surface, no printing (we probably need to engrave codes on the ring, because felt tip pen ink might wear off during the test)

- All the discs are placed side by side in a sunny place outside where they are exposed to temp, humidity and radiation changes daily.

- If possible, one disc of each brand should be laid out label side up and another disc from the same brand label side down

- The test discs should be on a surface that prevents them from being moved / scratched / dropped / being stepped on.

- Rain can hit the surface of discs as far as I'm concerned (this would simulate really hard conditions / increased aging test)

- Every second day each disc is brought in, wiped clean radially with a damp lint-free cloth, and then dried.

- Each cleaned/dried disc is then tested for read errors (1:1 bit accuracy) and read speed (compared to the original speed); a verification sketch follows this list.

- All errors and read speeds are marked down

- If and when a disc becomes intolerably slow to read with a lot of read errors, or becomes completely unreadable, it is removed from the testing batch and its time of death is reported.

- Each test disc that fails is photographed with a (digi)cam to illustrate the changes in the physical appearance of the disc, both on the label side and the reflective side

- Testing continues until all discs have failed or the time available to the testers runs out (this needs to be agreed upon by the testers themselves)
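
For the verification step flagged in the list above, a plain hash comparison against the reference image on the hard disk would do. A minimal sketch in Python, assuming each volunteer keeps the original rip as reference.wav and saves each test-day rip under a name like rip_day14.wav (the file names and the choice of SHA-1 are illustrative, not part of the protocol):

    import hashlib
    import sys

    def file_digest(path, chunk_size=1 << 20):
        """Return the SHA-1 hex digest of a file, read in 1 MiB chunks."""
        h = hashlib.sha1()
        with open(path, "rb") as f:
            while True:
                block = f.read(chunk_size)
                if not block:
                    break
                h.update(block)
        return h.hexdigest()

    if __name__ == "__main__":
        # Usage: python verify.py reference.wav rip_day14.wav
        reference, rip = sys.argv[1], sys.argv[2]
        if file_digest(reference) == file_digest(rip):
            print("1:1 match - no read errors")
        else:
            print("MISMATCH - read errors have crept in")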


3) Test volunteers

I need people (I can provide the discs) who can take part in this gruelling but VERY useful test for as long as 3 months.

Summer is a good time, but I don't think anybody can promise to be there for testing every second day.

Therefore I think we need five volunteers (four in addition to myself) to make it less likely that all of the testers are away at the same time. At least we'll have some data every second day, if not from all volunteers.

I must warn you, this will be a very cumbersome and time-consuming test. I'm not even sure it's doable, and that's why I'm posting this idea here and soliciting comments.

If you have any improvements to the test or want to participate in the test (volunteers from within EU preferred, can't afford to send the discs to Asia/USA), please reply to this thread.

You're an ideal candidate if you're in an office summer job for most of the summer, use a computer with a CD-ROM drive in it, know how to use EAC and have a place to put the discs for testing.

What can I or other volunteers offer in return?

Fame and/or notoriety, that's all. No money involved I'm afraid. Not for me at least.

4) Publishing the results

The data will be gathered from all volunteers when the test ends:

- read errors and read speed for each disc from all participants for all the dates tested

- in addition the cd-burner model, cd-burning speed, cd-rom reader model and computer used will be catalogued

- the results will then be tabulated and the averages, medians and variances calculated (see the sketch after this list)

- a textual summary will be written to explain the test data

- images of each disc from the day they failed will be provided

- hopefully a winner or winners from the gruelling test will be revealed

- results will be posted here, on the most important forums and probably in a couple of computer magazines as well

- the test data will be sent to manufacturers (or rather, a URL of the test report will be given to them via e-mail) and their possible responses perhaps included in a test addendum
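
As an illustration of the tabulation step above, a minimal sketch; the brand names and survival figures are invented placeholders, not results:

    import statistics

    # Days survived per disc, per brand (placeholder values only).
    days_survived = {
        "Brand A": [12, 15, 11, 14],
        "Brand B": [20, 22, 19, 25],
    }

    for brand, days in days_survived.items():
        print(f"{brand}: mean={statistics.mean(days):.1f}, "
              f"median={statistics.median(days)}, "
              f"variance={statistics.variance(days):.1f}")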

FINALLY

Why am I doing this?

I'm going to start archiving some digital pictures and other stuff. I want these archival copies to last as long as possible, without having to transfer them to new media every year or so.

The last even semi-reliable durability test (in addition to the one Nox posted about) that I could find is from 1996.

A lot has happened on the cd-r scene since.

I just want to learn for myself which is the most error-resistant and durable HIGH QUALITY CD-R media right now, and publish the results for others.

So, anybody want to volunteer? :)

cheers,
Halcy

PS Upon reading my own message through, I think up to 10 volunteers (1 disc/brand = 8 discs/volunteer) and testing every three days would be more realistic. I'm not sure even that's doable, but let's hear your comments first...

Postby nox on Thu Jan 01, 1970 1:35 am

I suppose that I will join...

I'll try Verbatim SuperAZO 24x and HP/TY at least on my own, because they are what I can get here.


> - Image of the bit-perfect test data is written to hard-disk for comparison purposes

How do we compare discs?
How many minutes of audio? (I guess 74 min)
Do EAC's checksums depend on offsets?
Maybe it's better to check each track's checksum, to see whether errors are accumulated in one zone (it could show that the CD-R deteriorated in only one zone, maybe because of a manufacturing error?).
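
A minimal sketch of that per-track idea, assuming three extracted tracks for illustration: checksum each track separately, so a disc failing in only one zone shows up as a single bad track rather than one failed whole-disc comparison (the file names are placeholders, not an agreed convention):

    import zlib

    def track_crc32(path):
        """CRC-32 of one extracted track, read in 1 MiB chunks."""
        crc = 0
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(1 << 20), b""):
                crc = zlib.crc32(block, crc)
        return crc & 0xFFFFFFFF

    # Compare each track of today's rip against the reference rip.
    for n in (1, 2, 3):
        ref = track_crc32(f"ref_track{n:02d}.wav")
        cur = track_crc32(f"day_track{n:02d}.wav")
        print(f"track {n}: {'ok' if ref == cur else 'errors in this zone'}")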

My idea is/was to check C2 errors to see how they develop, until real errors appear that make my Plextor slow down, and then check with EAC.
I mean, with C2 errors it's easy to display "graphically" how the CD-R deteriorated: you have a curve / function before the first "real" error appears.
Probably once the CD-R is too deteriorated and the drive has to stop and restart, the number of C2 errors is no longer a good measure of quality, but for the first days I think it's interesting.
And I would also do this test every day for the first days or weeks...
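
A small bookkeeping sketch for that curve idea, assuming each volunteer notes the drive-reported C2 total per test day; the counts below are invented:

    # Test day -> total C2 errors reported for one disc (placeholder values).
    c2_by_day = {0: 0, 2: 3, 4: 12, 6: 55, 8: 240}

    days = sorted(c2_by_day)
    for earlier, later in zip(days, days[1:]):
        rate = (c2_by_day[later] - c2_by_day[earlier]) / (later - earlier)
        print(f"day {earlier}->{later}: +{rate:.1f} C2 errors per day")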

And yes, more people would be nice...

Postby meperidine on Thu Jan 01, 1970 1:37 am

No offense intended, but there are too many uncontrollable variables to draw any conclusions. Each geographic location is going to receive differing amounts of insolation, temperature and humidity. Trying to do a meta-analysis only works when the protocol is identical. For example, a cloudy day will have a greater proportion of UVA to UVB than a sunny day. Taking the mean survival length from different locations is less than useful. No two days will produce the same light conditions (not to mention temperature and humidity). An average will only be related to the test conditions, which are difficult to quantify. There's no point in combining data gathered under constantly varying conditions. If one brand lasts a mean of 10 days and another 20 days, that does not mean the second brand endured 2x the abuse, or even significantly more. Thus even if the experiment is performed at a single location, the results will be confounded.

If you're serious about doing this you will have to do it under controlled conditions. You will also have to separate the experimental variables in order to draw any conclusions. To test UV resistance you could get a light box with UV tubes and stick it in a closet. This is probably too expensive; you might try normal fluorescents, but those might not be enough for an accelerated test. Likewise, repeat the same testing for temperature and then for humidity. If you maintain a strict protocol you can spread out the tests so it won't become a full-time job.

You're quite right

Postby Halc on Thu Jan 01, 1970 1:37 am

Combining the results will of course be tricky statistically if the geographically dispersed test data is hugely variant.

Then again, I have a hunch that it won't be - and that there will be strong statistical correlation between each brand's failure rates - regardless of the test site.

But this is what the test would also test for.

If the data is so all over the map that no safe general conclusions can be made, then that's the general conclusion :)

I would like to do this in a real lab situation, but no such luck: no lab, no gear, no time, no expertise, no money available.

You are of course perfectly allowed to do such a test yourself :)

cheers,
Halc

Postby Pio2001 on Thu Jan 01, 1970 1:37 am

Hi,
It's a good idea.
Unfortunately, I won't be able to take part, because my two drives (burner/reader) are dying and in the process of being replaced, and I have no room to leave CD-Rs outside. On the windowsills, birds would drop guano on them.

I've read that silver can easily be attacked by the dye (I haven't got the link here, but I'll post it as soon as I get home if you're interested), and that this is why manufacturers used gold for pro CD-Rs, even though it's less reflective than silver (this info about reflectivity is disputed).

Two thirds of my Mitsui SG and Mitsui media, carefully stored for the last 4 years, are dying a slow death. Some have already become completely unreadable. I haven't got enough old CD-Rs of other brands to compare.

That's why I now use gold CD-Rs, and it would be interesting to compare gold and silver CD-Rs from exactly the same manufacturers.

HiSpace Metal/Gold and Mitsui Silver/Gold are possible. By the way, I've never been able to find any Mitsui Gold in France, while Mitsui silver is quite common (Teac, Philips and HP were Mitsui silver).

I find it strange that you chose HiSpace CarbonSound, which are audio CD-Rs for standalone burners; the equivalent for computers is the HiSpace CarbonCD.

Also, having studied how the copper in speaker cables oxidizes, I found that the main factor was fingerprints and other dust/fat: it oxidized in 7 days, while perfectly clean copper lasts several months.

In the same way, I wonder if my Mitsuis have had their silver attacked by my numerous fingerprints on the label side, through the lacquer (I've got oily skin).
I suggest testing CD-Rs with fingerprints all over their label side vs. CD-Rs cleaned with ethyl alcohol. Fingerprints can catalyze oxidation in an incredible way. But no cheating with food/oil, use your own skin fats!

Postby Halc on Thu Jan 01, 1970 1:38 am

Pio2001,

thanks for the great comments. Some further musings on the issue:

>Unfortunately, I won't be able to take part, because my two
>drives (burner/reader) are dying and in the process of being

Sorry to hear that. Can you tell me which drives you are considering as replacement drives?

>replaced, and I have no room to leave CD-Rs outside. On the windowsills, birds would drop guano on them.

Now, that would be a *real* stress test :)

>I've read that silver can easily be attacked by the dye (I haven't got the link here, but I'll post it as soon as I get home if

Chemically, silver is not as stable as gold; that's a known fact.

Also, as you stated, a lot of manufacturers claim better longevity for their gold-based discs.

>Two thirds of my Mitsui SG and Mitsui media, carefully stored for the last 4 years, are dying a slow death.

That's scary. I had the same happen to my Kodak GOLD coloured discs. The whole label side peeled off, completely removing the reflective layer. These were merely 2.5-3.5 year old discs (properly stored).

I haven't used Kodak since.

>That's why I now use gold CDRs, and it would be interesting
>to compare gold/silver CDRs from exactly the same
>manufacturers.

I tried to find more gold CDs, but finding even the discs I listed above was quite difficult.

I couldn't get real gold discs from Mitsui, Ricoh, Kodak (they still make them, but the minimum order is 100,000) or others.

>I find it strange that you chose HiSpace CarbonSound, which are audio CD-Rs for standalone burners; the equivalent for computers is the HiSpace CarbonCD.

I bought the HiSpace on a whim from the Feurio store. It's the only place on the net where I've found it for sale. They didn't have the CarbonCD (non-audio) when I made my order.

>Also, having studied how the copper in speaker cables oxidizes, I found that the main factor was fingerprints and other dust/fat: it oxidized in 7 days, while perfectly clean copper lasts several months.

Yes, fat (among other substances) can act as a catalyst for the oxidation process.

>I suggest testing CD-Rs with fingerprints all over their label side vs. CD-Rs cleaned with ethyl alcohol. Fingerprints can catalyze oxidation in an incredible way. But no cheating with food/oil, use your own skin fats!

That's a very good idea. If we ever get around to doing this test (still need more people), then I'm sure that would be a good addition.

cheers,
Halc

Postby nox on Thu Jan 01, 1970 1:38 am

I agree with Halc. I think there will be a reasonable correlation between different places.
It's a matter of explaining what the conditions were and relating the results to the location. Maybe we cannot produce a single ranking, but the results can probably be interpreted to draw some conclusions...

Also, if the results are homogeneous it's possible to merge values from different locations by doing something like a normalization.
For example, the "winner" of each location can be "awarded" 100 points and the rest of the brands awarded points accordingly.
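
A minimal sketch of that scoring idea, with invented numbers: the longest-lived brand at a site gets 100 points and the others are scaled in proportion:

    # Days survived at one test site (placeholder values, not results).
    site_results = {"Brand A": 14, "Brand B": 28, "Brand C": 21}

    best = max(site_results.values())
    points = {brand: round(100 * days / best)
              for brand, days in site_results.items()}
    print(points)  # {'Brand A': 50, 'Brand B': 100, 'Brand C': 75}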

Apart from the weather, the problem of different testing lengths could mostly be due to the different readers and to the moment at which a CD-R is declared dead.
I would try to avoid rainy weather. I don't think it's a good measure, because CD-Rs aren't usually in water... If they get wet it's no problem, but I'd try to avoid it. (The conditions have to be explained in the test.)

Halc, what's your opinion on testing for C2 errors, at least until slowdowns occur when extracting?
I think it's a good measure before unreadable errors appear...

My email is xpg@myrealbox.com

Postby Pio2001 on Thu Jan 01, 1970 1:38 am

Halc wrote:Can you tell me which drives you are considering as replacement drives?


A Sony DDU1621 as DVD-ROM, as your test advises, and I thought about a Yamaha CRW3200 as burner, for the Audio Master feature.

Halc wrote:I'll post it as soon as I get home


http://www.chipchapin.com/CDMedia/cdr5.php3#cdr-dyes

Homepage : http://www.chipchapin.com/CDMedia/

Halc wrote:I had the same happen to my Kodak GOLD coloured discs. The whole label side peeled off, completely removing the reflective layer. These were merely 2.5-3.5 year old discs (properly stored).


I heard about it from a friend too.
But my Mitsuis look perfect. Indistinguishable from brand new ones, except one, which had a strange little area that seemed a bit darker, with dots, and that made the track unreadable. CD-RW.ORG had such a spot too on one of his Mitsuis.

Halc wrote:I tried to find more gold CDs, but finding even the discs I listed above was quite difficult.

I couldn't get real gold discs from Mitsui, Ricoh, Kodak (they still make them, but the minimum order is 100,000) or others.

I bought the HiSpace on a whim from the Feurio store. It's the only place on the net where I've found it for sale. They didn't have the CarbonCD (non-audio) when I made my order.


I can get you as many HiSpace CD-Rs as you want: Metal (silver), Gold, and/or Carbon ones (silver + black UV-proof). I can get Mitsui Advanced Media Golden Dye Silver too. I could also try to order some Mitsui Gold from mitsuimedia.fr; they ship in France.

Halc wrote:(still need more people)


I'll try to find a way to hang the CD-Rs on something outside. In addition, it will scare pigeons :D

I wonder if sunlight is a good real-life test; after all, CD-Rs are stored out of the light. If I can take part, I'll put one out of direct sunlight as well. And an Audio Master one too.

Postby Halc on Thu Jan 01, 1970 1:39 am

nox wrote:Apart from the weather, the problem of different testing lengths could mostly be due to the different readers and to the moment at which a CD-R is declared dead.


Reader differences can be at least partially normalized by recording baseline read speed results for each disc on each participant's setup before the stress test begins.

But you are absolutely correct that drives' differences in handling error conditions will cause big differences in read speeds as the test progresses. Some drives just handle errors much faster than others.
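
A minimal sketch of that baseline idea, with placeholder numbers: each later reading is expressed as a fraction of the same disc's pre-exposure speed on the same drive, so every drive is compared against itself rather than against the other volunteers' drives:

    baseline_speed = 32.0  # x-rating measured before exposure (placeholder)
    today_speed = 11.5     # x-rating measured on a later test day (placeholder)

    relative = today_speed / baseline_speed
    print(f"disc now reads at {relative:.0%} of its original speed")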

nox wrote:I would try to avoid rainy weather. I don't think it's a good measure, because CD-Rs aren't usually in water...


I agree. We'd then need a semi-waterproof pouch for the discs that lets in humidity but keeps out rain. It would also need NOT to block UV light or much of the infrared spectrum.

Then again, if the discs can stand rain (and all the acids in it), then the bonding of the layers is probably very tight and the label side resistant enough to corrosion.

Relative humidity changes of 40-100% would be a good measure. Most accelerated aging tests use high relative humidity and higher temperatures.

nox wrote:Halc, what's your opinion on testing for C2 errors, at least until slowdowns occur when extracting?
I think it's a good measure before unreadable errors appear...


I think your proposal was good. I still don't know if we can use this, unless we all have TRULY C2-reporting drives that have been confirmed to report C2 correctly. Most don't do it properly, even if they claim to.

However, even if the measurement of errors cropping up is less than perfect, at least we have a measure of when the discs fail.

cheers,
Halc

Re: You're quite right

Postby meperidine on Thu Jan 01, 1970 1:40 am

Halc wrote:Combining the results will of course be tricky statistically if the geographically dispersed test data is hugely variant.

Then again, I have a hunch that it won't be - and that there will be strong statistical correlation between each brand's failure rates - regardless of the test site.

But this is what the test would also test for.



Test conditions will vary even in the exact same location; adding in different sites is just going to make it worse. Sorry, but the protocol is fundamentally flawed. I'm not trying to be a jerkass, but an experiment where the test conditions are not controlled and there are multiple variables being tested simultaneously is not going to produce halfway valid results.

nox wrote:I think that there will be a reasonable correlation between different places.
It's a matter of explaining what the conditions were and relating the results to the location. Maybe we cannot produce a single ranking, but the results can probably be interpreted to draw some conclusions...


That's the problem though: no matter the location, the conditions will be constantly changing (with multiple variables to boot), and it would be really difficult to quantify the test conditions. The issue isn't multiple places, it's the differences in test conditions from day to day. If one brand dies after 10 days of mostly sunshine, and another dies 10 days later under mostly cloudy conditions, what does that mean? Is the second brand twice as tough, or does UVA have virtually no effect and the brands are nearly identical?

nox wrote:Also, if the results are homogeneous it's possible to merge values from different locations by doing something like a normalization.
For example, the "winner" of each location can be "awarded" 100 points and the rest of the brands awarded points accordingly.


No way. The test conditions won't be the same, so combining the data is very misleading, as is trying to do a meta-analysis.

All I'm saying is that doing something quick and dirty isn't worth the effort if the results can't be trusted. Setting up a bank of fluorescent tubes wouldn't be expensive, though the test may take a long time to run.

This is not science - but a DIY test

Postby Halc on Thu Jan 01, 1970 1:40 am

Meperidine,

thank you for your comments. I think you point out one good thing: the day-to-day change in test variables after a sample disc has been removed from the sample pool.

This means no absolute data comparable across the discs can be measured.

However, it (alone) does not nullify the relative performance characteristics of the discs.

As such, I see no reason to go for a test setup that is several orders of magnitude more time-consuming and expensive, where each variable is measured one at a time under controlled conditions.

I can understand your concern, but I am personally unable to do what a single-variable-at-a-time test setup would require.

Perhaps you could do the UV lamp test yourself. I'm sure I wouldn't be the only one interested in the results.


meperidine wrote:Test conditions will vary even in the exact same location; adding in different sites is just going to make it worse.


Of course there will be variations (a cloud might shade one disc for 16 seconds longer than another disc), but as long as they are random, it's just statistical noise in the data - not a skewing of some discs only.

The only variation is day-to-day variation which is equal for all discs that are tested for the same period of time.

It should also be taken into account that the baseline for testing is reset every time a disc is taken out of the sample pool. This way there won't be absolute measured data for ALL the discs, only a relative ranking of all the discs, which is what we are after at any rate.

meperidine wrote:I'm not trying to be a jerkass, but an experiment where the test conditions are not controlled and there are multiple variables being tested simultaneously is not going to produce halfway valid results.


That's your hypothesis, nothing more. We have ours and we'll run a test to check it.

You should check yours with an empirical test too :)

meperidine wrote:That's the problem though: no matter the location, the conditions will be constantly changing (with multiple variables to boot)


In real life, discs will be in constantly changing conditions with multiple variables. Nothing new there.

What our multiple-variables-at-the-same-time test setup will NOT reveal is WHICH of the variables causes the deterioration (if any). It will just measure the tolerance to the various variables over a length of time and put the sample units in relative order of tolerance to these changes.

meperidine wrote:The issue isn't multiple places, it's the differences in test conditions from day to day. If one brand dies after 10 days of mostly sunshine, and another dies 10 days later under mostly cloudy conditions, what does that mean?


That the second brand outlasted the first one. That's all.

Drawing any kind of "x times better" conclusions is totally useless with our sample size at any rate - even if we did real lab tests with measured control variables.

What our test can bring forward is whether there's a correlation between quality and durability, i.e. whether the rate of deterioration is similar in all areas.

If it is, it should be reasonably useful to put the test candidates in relative order.

The absolute measures will not tell you much, due to the day-to-day condition changes after sample pool changes, just as you pointed out.

meperidine wrote:All I'm saying is that doing something quick and dirty isn't worth the effort if the results can't be trusted.


The results can't be trusted as per-variable absolute data, because we don't measure per-variable absolute data (e.g. UV radiation, relative humidity levels, absolute temperature, temperature delta, etc.).

They can be (if a strong enough correlation exists) a pretty good indication of relative longevity among the tested brands - especially if one compares this data to the only other data publicly available now (100% anecdotes).

I think the test would be a marked improvement in the quality of information about the relative longevity of various CD-R brands.

meperidine wrote:Setting up a bank of fluorescent tubes wouldn't be expensive, though it may take a long time to run.


That's a very nice idea for a single-variable controlled test.

Maybe you should run a test like that yourself.

Personally, I have neither the money, the time nor the interest to test merely a single control variable at a time. Nor am I convinced that UV radiation is the most important test variable for the setting in which I use CD-R discs.

Cheers,
Halcy

PS We won't try to publish our results in a peer-reviewed journal, if that makes you any happier. Better than anecdotes, though.

Postby Pio2001 on Thu Jan 01, 1970 1:41 am

According to accelerated tests, UV takes 100 years to damage phthalocyanine, so our Mitsuis and Kodaks (made with formazan or phthalo?) didn't die because of UV.
I think heat can be an important factor: CD-Rs are heated in the drives when they are read. I don't know if the laser beam can wear them (since it's also a laser beam that burns them). They are shaken a little bit too.

I've ordered two drives, and I should be able to hang the CD-Rs outside the window; they will be rain-proof (except during strong storms, but I can take them in then) and exposed to sunlight.

I'm new to these forums - are all those terrible popups normal? I get some windows that don't have contextual menus to close them (I must ctrl-alt-del them), and some that want to install software on my computer without asking me :evil: (fortunately, Windows Explorer asks by itself).

Postby eiki on Thu Jan 01, 1970 1:45 am

nm.

Postby eiki on Thu Jan 01, 1970 1:45 am

I wish I could help you but the only media in the list that I can get my hands on here in Iceland is Kodak.

Postby Halc on Thu Jan 01, 1970 1:45 am

Eiki,

the discs are no problem. I can send the discs on as they arrive to me (I've already ordered them).

The problem is time, and putting in the effort to test the discs.

That's what I can't compensate you for, but if you're interested, please keep reading this thread.

As soon as I get my discs in the mail I will start compiling a list of volunteers and we can start discussing the test details.

When we agree on how to test and so on, I can then mail out the test discs to everyone.

cheers,
Halc

Postby eiki on Thu Jan 01, 1970 1:45 am

One problem springs to mind. I live in Iceland, so lack of sunshine could pose a problem, if you want that type of test. It's called Iceland for a reason, you know. We have sun 24 hours a day during the summer, but it is rarely very strong.

If you still want me to test the CDs, I'm game. I won't have much to do this summer anyway.

Postby Halc on Thu Jan 01, 1970 1:49 am

Eiki,

Iceland is fine by me :)

The endless summer of us northerners will more than compensate for the lack of intense UV radiation.

I just received a message that my discs are in the mail; I should hopefully be receiving them next week.

Could everyone who's interested in participating list what CD-ROM readers and/or burners they have available for this test?

We need to check if we can use the Feurio C2 checking by ensuring all drive models report C2 data accurately.

cheers,
Halc

Postby nox on Thu Jan 01, 1970 1:49 am

I have a Plextor Ultraplex 40x (firmware 1.05) and a Teac CDR-55S (it caches audio data and gives no C2 error info).


I think I have a problem: when I enable "Use C2 error information for error correction" in EAC, the results are not reliable. I mean it says "Copy OK" when it really is not OK.
Almost every time there is a C2 error (when the 16 red lights of EAC shine), it does only one "pass" in places where it usually needs more passes when that option is disabled.

The truth is that EAC configures the Plextor correctly: Accurate Stream, no caching, drive is capable of reporting C2 errors, and don't "use C2 error info for correction"...

But why doesn't that option work?

Postby eiki on Thu Jan 01, 1970 1:49 am

I've got an ASUS 32x12x40 in an old P3 rig running Win2000.

Postby Pio2001 on Thu Jan 01, 1970 1:49 am

I've just finished testing the Sony DDU1621 that I received today.
Same as the Hitachi GD-7500, but faster: C2 errors are reported as soon as the CD-R becomes difficult to read, but once error correction starts, there's no way to get the same CRC twice, even when the copy is "OK". It seems to be common behaviour.

I've got the Yamaha CRW3200E too, but it doesn't support C2.

CD-R test media received

Postby Halc on Thu Jan 01, 1970 1:53 am

I have received the following test media for the DIY longevity test. Ten pieces of each disc:

1. Emtec Ceram Guard CD-R 74 24x
- Taiyo Yuden Company Limited (Type 1)

2. Mitsui Gold Ultra
- Mitsui Chemicals, Inc. (Type 6)

3. Kodak Ultima Silver+Gold 80 24x
- Kodak Japan Limited (Type 6)

4. Philips CD-R 80 Silverspeed 32x
- Ritek Co (Type 7)

5. Fujifilm CDR-74 24x
- Fuji Photo Film Co (Type 5)

6. Ricoh CD-R 80
- Ricoh Company Limited (Type 6)

7. Hi-Space CarbonSound 80 min
- MPO (Type 9)

8. TDK Reflex Ultra 32x
- TDK Corporation (Type 5)

9. Verbatim DataLifePlus Super Azo 24x
- Mitsubishi Chemical Corporation (Type 2)

That makes nine brands in all. Quite a lot of work, but all are supposedly "quality" discs, even though not all are from known "quality" factories.

It's time to start thinking about the test itself.

Everybody who wants to contribute their time, please list your CD-R reader and/or writer so I can compile a list of drives and their capabilities.

My example:

Burner: Yamaha CRW-3200E
- C2: NO
- Read CD-DA correctly: YES (Feurio)
- device can read audio data perfectly: YES (Feurio)

Reader: Sony DDU1621
- C2: YES (according to Feurio 1.65)
- Read CD-DA correctly: YES (Feurio)
- device can read audio data perfectly: YES (Feurio)

Are there any suggestions as to how to test for errors?

Nox has already mentioned testing for C2 errors with Feurio until the discs fail to read or the reading takes way too long.

Are there any other suggestions?

cheers,
Halcy

Postby Halc on Thu Jan 01, 1970 1:56 am

*BUMP*

Need volunteers to send me e-mail at:

halcyon at myrealbox dot com

cheers,
Halc

Postby nox on Thu Jan 01, 1970 1:57 am

Halc, I've just received your email.
I'd like to discuss one point.
If we use Feurio and C2 errors, we can't know the real number of errors unless we compare the WAVs with the originals (with EAC for example).
We can also argue that tracks extracted with EAC will probably have fewer errors than those from Feurio.
And comparing different WAVs is quite time-consuming...

So, I don't really know if it's better to use Feurio or EAC.
I mean, I'd like to use Feurio during the first days, before the errors appear, so we can have an estimate of how degraded the CD-R is: two CD-Rs can both have 0 uncorrectable errors, while the drive reports 2000 C2 errors on one and 0 C2 errors on the other. So the first one has already started to lose quality, and the other hasn't.

But maybe when the errors get too big, the number of C2 errors is so huge that it means little (maybe, I don't know). Also, maybe once the CD-ROM has to slow down and restart, the number of C2 errors varies quite randomly... (at least at high speed the Plextor stops quite sharply, and maybe that makes the number rise a lot)

Also, Feurio stops the extraction when an unreadable sector is reached.
It has a setting to change the number of retries; the default is maybe 10.
After those retries the extraction stops and it asks whether you want to retry or continue.
This is unusable when there are too many errors. With EAC you don't have to do anything while extracting.

Probably it's possible to change how Feurio behaves when it reaches unreadable sectors, but I don't know how.

Also, maybe CRC checksums are enough to check accuracy, so we can avoid running "Compare WAVs" while there are no errors.
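
If a full compare is ever needed, a cheap DIY version is possible. A minimal sketch, assuming both rips start at the same offset; the file names are placeholders, and a real script would skip the 44-byte WAV header if the rips came from different tools:

    def first_difference(path_a, path_b, chunk=1 << 20):
        """Byte offset of the first difference between two files, or None."""
        offset = 0
        with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
            while True:
                a, b = fa.read(chunk), fb.read(chunk)
                if a != b:
                    for i, (x, y) in enumerate(zip(a, b)):
                        if x != y:
                            return offset + i
                    return offset + min(len(a), len(b))  # one file ended early
                if not a:
                    return None  # reached EOF on both files, identical
                offset += len(a)

    diff = first_difference("reference.wav", "rip.wav")
    print("identical" if diff is None else f"first difference at byte {diff}")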
When do we finish? When there are too many differing samples? When the CD-R is not readable at all?

Why 4-minute pauses? I don't know, but I thought that in DAO mode at least the pauses are read too. Am I right or wrong?
Aren't pauses the silence inside index-0 zones?
What's the advantage?

Postby Pio2001 on Thu Jan 01, 1970 1:58 am

I suggest extracting only the first, middle and last tracks, in secure mode with C2, and noting the speed.
As long as we get 100% track quality and full speed, there's no need to perform further tests. It's OK.

Then, if speed decreases, it can be due to Windows 98 (or is it something else in my system?). We must reboot and try again. Usually I get full speed back.

The first symptom of a decaying CD is a lot of C2 errors. A single one shouldn't mean anything.
I don't know Feurio; if it can count the C2 errors, that could be interesting.
In EAC, we can watch the ripping speed and the number of rows of error correction. I don't think we should need 5 rows; it's slow and doesn't make much difference compared with 3 rows.
The next step in decay is the presence of sync errors, or read errors in secure mode (the three rows are not enough to correct the errors).
After that, the ripping gets slower and slower.

Postby Halc on Thu Jan 01, 1970 1:59 am

Nox and Pio2001,

thank you for your good comments.

To put it shortly, my opinions:

1. If everybody has a drive with working C2 reporting, we should consider using Feurio with C2 -> it's much faster than re-ripping in EAC and doing "Compare WAVs". Further, as Nox stated, it will also show signs of deterioration earlier than EAC will.

2. I don't think the EAC 100% quality figure is accurate. I wouldn't trust it. I *think* (I can't be 100% sure) my tests published in the EAC forum show this. If we combined rip speed with this quality figure, as Pio suggests, then perhaps it would be useful.

3. I really need ALL the drive models from all volunteers. Otherwise we can't check if we can use Feurio or not.

4. I didn't know that GAPs were read as part of the track (i.e. if you burn with a 4-minute gap in Nero or Feurio). That makes this more difficult. I will have to build a test disc with three tracks (one on the inner part, one in the middle and one on the outer part of the disc) and test it in Feurio/EAC; a rough geometry sketch for placing the tracks follows this list.
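
As a back-of-the-envelope aid for placing those three tracks, a sketch using the usual Red Book approximations (program area from roughly 25 mm to 58 mm radius, constant linear density, 74-minute disc); these figures are assumptions, not measured values:

    import math

    R0, R1 = 25.0, 58.0   # mm, approximate inner/outer program-area radii
    TOTAL_MIN = 74.0      # full 74-minute disc

    def radius_at(minute):
        """Approximate radius (mm) where a given minute of audio sits.
        With constant linear density, elapsed time scales with swept area."""
        frac = minute / TOTAL_MIN
        return math.sqrt(R0**2 + frac * (R1**2 - R0**2))

    for label, m in [("inner track", 1), ("middle track", 37), ("outer track", 73)]:
        print(f"{label}: starts near {radius_at(m):.1f} mm radius")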

So, please send the hardware data if you haven't already.

If you have any more ideas on what the test disc should be like, please post here or to all volunteers via e-mail.

cheers,
Halcyon
