No, don't apologize, it's a good discussion even if my/our level of skepticism is a bit high. It's just that when I see a huge long list of tweaks like that, without a demonstrable reason to do it in the first place or a demonstrable improvement afterwards, it sets off my "BS meter." That's not to say that I think it's necessarily wrong, just that I haven't (yet) been convinced that it's right.
I guess my big question about the whole clock/jitter thing is: if that is indeed a problem with this (or any S/PDIF) device, what is the result on the final audio stream? Is there a subtle degradation of the data - i.e. introduction of some random bit flips here and there - or does the stream fail completely? I tend to think of digital signals as "it either works or it doesn't," not "it works mostly." In other words, an analog signal is susceptible to noise from interference etc., but how does "noise" manifest itself in a digital signal? I would think the protocols would include some sort of bit check or something to prevent corruption... but I'm not any sort of expert in these things.
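For what it's worth, S/PDIF (IEC 60958) does carry a simple per-subframe parity bit: the parity is chosen so the payload bits contain an even number of ones, which lets a receiver detect (but not correct) any single bit flip, and there's no retransmission. Here's a toy sketch of that idea - not the actual wire format, just an illustration of what a parity check can and can't catch:

```python
# Toy illustration of an even-parity check, as used (per subframe)
# in S/PDIF. A single bit flip is detected; two flips cancel out
# and slip through undetected. There is no correction or resend.

def parity_bit(payload_bits):
    """Parity bit that makes the total number of ones even."""
    return sum(payload_bits) % 2

def check(payload_bits, parity):
    """True if the received data passes the even-parity check."""
    return (sum(payload_bits) + parity) % 2 == 0

payload = [1, 0, 1, 1, 0, 0, 1, 0]   # toy 8-bit payload for illustration
p = parity_bit(payload)
assert check(payload, p)              # clean transmission passes

corrupted = payload.copy()
corrupted[3] ^= 1                     # single bit flip...
assert not check(corrupted, p)        # ...is detected

double = corrupted.copy()
double[5] ^= 1                        # a second flip...
assert check(double, p)               # ...goes undetected
```

So outright data corruption is at least partly detectable; the usual claim about jitter, though, is that the bits arrive intact but their *timing* wobbles, which only matters at the moment the DAC converts them - a different failure mode from bit errors.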