As long as it can serve up the 'bits' to the DAC accurately, and the DAC can accurately clock the samples.
If you are attempting to assert that the above statement is the null hypothesis when evaluating digital transports ..... then you are absolutely right, congratulations !!
..... However, I believe you totally underestimate the sheer scale of the task you have identified.
Consider SPDIF. The SPDIF waveform occupies a bandwidth of roughly 30 MHz.
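To put a rough number on that, here is a back-of-envelope sketch. The 64-bits-per-sample frame and biphase-mark coding are real features of SPDIF, but the "keep odd harmonics up to the 5th" rule of thumb (and the function name) are illustrative assumptions, not from the spec:

```python
# Back-of-envelope estimate of the bandwidth needed to keep SPDIF edges
# reasonably square.  An SPDIF frame is 64 bits (two 32-bit subframes) per
# sample period, and biphase-mark coding toggles the line once per bit cell
# boundary plus once mid-cell for a '1', so an all-ones pattern produces a
# square wave whose fundamental equals the bit rate.

def spdif_bandwidth_mhz(sample_rate_hz, harmonic=5):
    """Rough bandwidth (MHz) to preserve odd harmonics up to `harmonic`
    of the fastest fundamental in the SPDIF stream (a rule of thumb)."""
    bit_rate = 64 * sample_rate_hz       # bits per second on the wire
    fastest_fundamental = bit_rate       # all-ones biphase-mark pattern
    return harmonic * fastest_fundamental / 1e6

for fs in (44_100, 96_000, 192_000):
    print(f"{fs / 1000:g} kHz audio -> ~{spdif_bandwidth_mhz(fs):.1f} MHz")
```

At 96 kHz this lands right around 30 MHz; at 44.1 kHz you need to count higher harmonics to get there, which is why the exact figure quoted varies with sample rate and how square you insist the edges be.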
Unless your transport is approaching perfect, the SPDIF waveform on a scope is nothing like square .... and that introduces jitter/distortion in the receiver and DAC chip.
It's all very well to run some kind of "bit perfect" test, using any of the many available methods .... and decide that "bits in = bits out" and everything is dandy ..... but such tests only tell you that the output data matches the source data while it's sitting still .... they say nothing about the time domain.
It is this difference in the time domain that introduces the distortion in the analog output of a DAC.
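A minimal numerical sketch of that point: the sample data below is identical in both cases (a bit-perfect test would pass), yet playing it out on a jittered clock yields a measurably different analog waveform. The 2 ns RMS figure and the simple Gaussian jitter model are illustrative assumptions, not measurements of any real transport:

```python
import math
import random

FS = 48_000          # sample rate, Hz (illustrative)
F_TONE = 1_000       # test tone, Hz
N = 4_096            # number of samples
JITTER_RMS = 2e-9    # 2 ns RMS of random clock jitter (assumed figure)

random.seed(0)

# The *data* is identical in both cases: the same list of sample values.
samples = [math.sin(2 * math.pi * F_TONE * n / FS) for n in range(N)]

# An ideal DAC emits sample n at exactly t = n/FS.  A jittery clock emits it
# at t = n/FS + e_n, which is equivalent to having sampled the tone at the
# wrong instant.  Model that by re-evaluating the tone at jittered times.
jittered = [
    math.sin(2 * math.pi * F_TONE * (n / FS + random.gauss(0, JITTER_RMS)))
    for n in range(N)
]

# Bits in = bits out ... but the analog waveforms differ:
err_rms = math.sqrt(sum((a - b) ** 2 for a, b in zip(samples, jittered)) / N)
sig_rms = math.sqrt(sum(a * a for a in samples) / N)
print(f"error relative to signal: {20 * math.log10(err_rms / sig_rms):.1f} dB")
```

For a sine wave the error scales as 2*pi*f*tau (here about -98 dB), so the same jitter does proportionally more damage to high-frequency content; whether -98 dB matters is exactly the audibility question below.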
As to whether this distortion is audible: that is going to be very system dependent, as it can easily be masked by other things in the playback system.
In short: the laws of physics are very clear that anything less than perfect timing of the bits will produce distortion. And there are plenty of oscilloscope measurements out there (google them) showing that even the best transports in existence do not produce 100% perfect square waves.
So rather than trying to arrange blind testing to prove the laws of physics wrong ... why not stick to a much simpler question: "is it audible in playback system X?"