Topic: PC clock vs. DAC

PC clock vs. DAC

Someone made the following claim in the context that using WASAPI event style produces superior sound output as compared to WASAPI exclusive.
Is there any truth behind this:
"...
That implies that USB transfer is regulated by the (dismal) PC clock, rather then the optimized clock from the USB DAC. And that difference can (very clearly) be heard on high level HiFi installation.
..."


PC clock vs. DAC

Reply #2
Someone made the following claim in the context that using WASAPI event style produces superior sound output as compared to WASAPI exclusive.
Is there any truth behind this:
"...
That implies that USB transfer is regulated by the (dismal) PC clock, rather then the optimized clock from the USB DAC. And that difference can (very clearly) be heard on high level HiFi installation.
..."


That alleged difference is usually just another one of those things that disappears when good listening-test methodologies are used.

It's been decades since I've seen a DAC clock upgrade article that referenced any kind of listening-test methodology beyond casual, sighted audiophile evaluations.

There are also several shadings of meaning here.

The USB device clock controls the DAC. Depending on the USB protocol being used, it is either cued by reference to the PC clock without being continually dependent on it, or it runs completely independently.

DAC perfectionists set very high standards for DAC clocks, so their standard of reference generally exceeds the limits of audibility by very large margins.

No, I don't know of any home PCs that use rubidium frequency standards or even TCXOs to control their system clocks, but wouldn't that be more than a little overkill?


PC clock vs. DAC

Reply #4
The only time I've heard of clock problems is when the clock is far enough off to cause pitch or timing problems.

For example, someone records a backing track with a good-quality USB "podcast" microphone. Then they play back the backing track through a regular cheap consumer soundcard (which has a less accurate clock) and there is a slight pitch shift. The pitch shift is usually not noticeable to those of us without perfect pitch, but when the musician tries to play along with the backing track, they may notice it. Or they may play or sing in tune with the off-pitch backing track without noticing it. Then, when the tracks are mixed, the pitches won't match.

Or the musician may be playing in time to a drum track. When the tracks are mixed, they may drift noticeably out of sync by the end of the song (or by the end of a long performance).

The problem doesn't show up unless you record with one device and play back on another device. Of course, if you didn't make the recording yourself, it was recorded on a different device...

I'm NOT saying that most consumer soundcards have bad clocks; most are very good. But if you do have a clock problem, it's usually the fault of a cheap soundcard or a cheap USB mic.

PC clock vs. DAC

Reply #5
WASAPI event style also works in WASAPI exclusive mode.

PC clock vs. DAC

Reply #6
The only time I've heard of clock problems is when the clock is far enough off to cause pitch or timing problems.

For example, someone records a backing track with a good-quality USB "podcast" microphone. Then they play back the backing track through a regular cheap consumer soundcard (which has a less accurate clock) and there is a slight pitch shift. The pitch shift is usually not noticeable to those of us without perfect pitch, but when the musician tries to play along with the backing track, they may notice it. Or they may play or sing in tune with the off-pitch backing track without noticing it. Then, when the tracks are mixed, the pitches won't match.

Or the musician may be playing in time to a drum track. When the tracks are mixed, they may drift noticeably out of sync by the end of the song (or by the end of a long performance).

The problem doesn't show up unless you record with one device and play back on another device. Of course, if you didn't make the recording yourself, it was recorded on a different device...

I'm NOT saying that most consumer soundcards have bad clocks; most are very good. But if you do have a clock problem, it's usually the fault of a cheap soundcard or a cheap USB mic.



Pitch problems are highly unlikely for good equipment operating properly. It takes about a 0.5% pitch difference to be audible, but the crystals and resonators used for clocks in digital gear are generally far better than that: 0.005% is not unusual.

Timing problems, however, are extremely likely whenever two clocks are involved, because it takes less than 10 ms of timing offset to be audible, and clocks with standard levels of accuracy will naturally drift that far apart within at most a few minutes.
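A rough back-of-the-envelope sketch of both thresholds, in Python. The 0.5%, 0.005%, and 10 ms figures come from the post above; the "100 ppm worst case" (two 50 ppm clocks erring in opposite directions) is an assumption for illustration:

```python
import math

def pitch_shift_cents(clock_error_fraction: float) -> float:
    """Pitch shift, in cents, caused by a relative clock-rate error."""
    return 1200 * math.log2(1 + clock_error_fraction)

def seconds_until_offset(ppm_apart: float, offset_ms: float) -> float:
    """Seconds until two clocks ppm_apart parts-per-million apart
    accumulate offset_ms milliseconds of timing offset."""
    return (offset_ms / 1000.0) / (ppm_apart * 1e-6)

# A 0.005% (50 ppm) clock error is well under a tenth of a cent of pitch shift:
print(f"{pitch_shift_cents(0.00005):.3f} cents")   # ~0.087 cents

# ...but two such clocks, worst case 100 ppm apart, reach an audible
# 10 ms offset in well under "a few minutes":
print(f"{seconds_until_offset(100, 10):.0f} s")    # 100 s
```

So the same clock tolerance that is hopeless to hear as pitch becomes audible as free-running timing drift in under two minutes, which is exactly the asymmetry described above.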

The general rule is that wherever possible separate clocks are to be avoided, which is one reason why digital setups used in pro audio often have a central clock that everything else that matters is synched to.

One may end up with this problem when recording video using both the cameras to record sound and also portable digital recorders. Manually syncing audio tracks is a canonical skill for video production people.

PC clock vs. DAC

Reply #7
Why not ask the people making these assertions for objective evidence?


Because those golden ears can hear jitter buried 100 dB or more down in the spectrum? 

I'm no EE, but I'm pretty sure that, as with cables, you have to intentionally fail big-time at basic engineering in order to even bring jitter to the point of audibility.

PC clock vs. DAC

Reply #8
Someone made the following claim in the context that using WASAPI event style produces superior sound output as compared to WASAPI exclusive.
Is there any truth behind this:
"...
That implies that USB transfer is regulated by the (dismal) PC clock, rather then the optimized clock from the USB DAC. And that difference can (very clearly) be heard on high level HiFi installation.
..."


The older version of WASAPI is called "push."  Both push and event can work in exclusive mode, which is the main reason to use WASAPI for listening in the first place.

PC clock vs. DAC

Reply #9
The older version of WASAPI is called "push."  Both push and event can work in exclusive mode, which is the main reason to use WASAPI for listening in the first place.

Yes, I know this, thanks. The claim being made is that event-style WASAPI produces superior sound output because the feeding of sound data is driven by the DAC clock, as opposed to push style, which is driven by the PC clock.
I don't know enough about the lower-level workings of WASAPI to know whether that's how event style actually works, but it sounds plausible. Assuming it is correct, it seems, based on the comments above, that the computer clock should be accurate enough for outputting sound data and should never produce any audible difference anyway.

PC clock vs. DAC

Reply #10
Someone made the following claim in the context that using WASAPI event style produces superior sound output as compared to WASAPI exclusive.
Is there any truth behind this:
"...
That implies that USB transfer is regulated by the (dismal) PC clock, rather then the optimized clock from the USB DAC. And that difference can (very clearly) be heard on high level HiFi installation.
..."


The older version of WASAPI is called "push."  Both push and event can work in exclusive mode, which is the main reason to use WASAPI for listening in the first place.



The general reason for preferring WASAPI of any kind is that it is a way to bypass a lot of Windows driver and protocol code that may not be doing the perfect thing, especially if you consider all versions of Windows and all versions of the protocols and drivers.

As of the current Windows 7 SP1, both the WASAPI path and the Windows protocols are reasonably well perfected.

Every time I test an interface both ways, I get essentially the same results. But it has not always been this way; at times in the past the Windows driver and protocol code has been far less than perfect.

WASAPI drivers can pretty much be counted on to be optimized for latency, or processing delay, which may not be true of other Windows drivers and protocols.

Many people report clearly audible differences that are actually illusions, artifacts of the primitive procedures they use to "test" with. Such reports are not credible and don't pass reasonable scrutiny.

PC clock vs. DAC

Reply #11
Someone made the following claim in the context that using WASAPI event style produces superior sound output as compared to WASAPI exclusive.
Is there any truth behind this:
"...
That implies that USB transfer is regulated by the (dismal) PC clock, rather then the optimized clock from the USB DAC. And that difference can (very clearly) be heard on high level HiFi installation.
..."


The older version of WASAPI is called "push."  Both push and event can work in exclusive mode, which is the main reason to use WASAPI for listening in the first place.



The general reason for preferring WASAPI of any kind is that it is a way to bypass a lot of Windows driver and protocol code that may not be doing the perfect thing, especially if you consider all versions of Windows and all versions of the protocols and drivers.

As of the current Windows 7 SP1, both the WASAPI path and the Windows protocols are reasonably well perfected.

Every time I test an interface both ways, I get essentially the same results. But it has not always been this way; at times in the past the Windows driver and protocol code has been far less than perfect.

WASAPI drivers can pretty much be counted on to be optimized for latency, or processing delay, which may not be true of other Windows drivers and protocols.

Many people report clearly audible differences that are actually illusions, artifacts of the primitive procedures they use to "test" with. Such reports are not credible and don't pass reasonable scrutiny.


I believe event style is supposed to have lower latency, if you have an application where that's important.

I mainly use it as a simple way of disabling other background noises, such as Windows chimes and web advertisements.

PC clock vs. DAC

Reply #12
Someone made the following claim in the context that using WASAPI event style produces superior sound output as compared to WASAPI exclusive.
Is there any truth behind this:
"...
That implies that USB transfer is regulated by the (dismal) PC clock, rather then the optimized clock from the USB DAC. And that difference can (very clearly) be heard on high level HiFi installation.
..."

It sounds like someone is mixing up a lot of things.
WASAPI can be used in a push mode and a pull mode (event style).
This is completely unrelated to shared or exclusive mode.

This also has nothing to do with the clocking of the USB transfer,
which can be adaptive or asynchronous.
TheWellTemperedComputer.com

PC clock vs. DAC

Reply #13
Yes, I know this, thanks. The claim being made is that event-style WASAPI produces superior sound output because the feeding of sound data is driven by the DAC clock, as opposed to push style, which is driven by the PC clock.


I have an implementation of WASAPI output in both shared and exclusive mode, using push mode (I don't have event mode because I don't care).

The difference between doing push and doing event is only who is responsible for knowing when the host has to send audio to the hardware.

Event based:

- The host tells the API that it wants to be informed when it is the appropriate moment to send audio.
- The host might prepare some audio in a separate thread so that it is ready when the API asks for it.
- The API asks the host for more audio.
- The host sends the prepared buffer if it was ready, or prepares the buffer then and sends it.

Push based:

- The host tells the API that it will ask when it is the appropriate moment to send the audio.
- The host prepares some audio so that it is ready when the API is ready.
- The host asks the API if it is ready.
- If it is not ready, the host waits some time and asks again.
- When the API replies that it is ready, the host sends the prepared buffer. It might also prepare the buffer at this time and send it.
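The two loops above can be sketched in plain Python, with a queue and an event standing in for the WASAPI buffer handoff. This is only an illustration of the control flow, not real audio code; all names here are invented for the sketch:

```python
import threading
import time
from queue import Queue

BUFFERS = 5  # how many buffers the "host" delivers in this sketch

def event_style_host(ready: threading.Event, out: Queue) -> None:
    """Event model: the host blocks until the API signals it wants audio."""
    for i in range(BUFFERS):
        ready.wait()            # sleep until the device asks; no polling
        ready.clear()
        out.put(f"buffer-{i}")  # hand over the buffer prepared in advance

def push_style_host(api_has_room, out: Queue) -> None:
    """Push model: the host asks the API if it is ready, sleeping between asks."""
    i = 0
    while i < BUFFERS:
        if api_has_room():          # "is the API ready for more audio?"
            out.put(f"buffer-{i}")  # send the prepared buffer
            i += 1
        else:
            time.sleep(0.001)       # wait some time and ask again (Sleep(1))

# Drive the event-style host, with this thread standing in for the audio API:
ready, delivered = threading.Event(), Queue()
host = threading.Thread(target=event_style_host, args=(ready, delivered))
host.start()
for _ in range(BUFFERS):
    ready.set()             # the "device" requests the next buffer...
    buf = delivered.get()   # ...and consumes what the host delivered
host.join()
print(buf)  # buffer-4

# Drive the push-style host with a gate that reports room every other ask:
asks = iter(range(100))
pushed = Queue()
push_style_host(lambda: next(asks) % 2 == 0, pushed)
print(pushed.qsize())  # 5
```

Note that both hosts deliver the same buffers; the only difference is whether the host is woken by the API or wakes itself up to poll.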


Things to know:
- One sample (4 bytes for 16-bit stereo) at 44.1 kHz is a wait time of 0.022676 milliseconds; 256 samples are 5.80 milliseconds. Audio is never sent sample by sample. (Only in the very old days of MS-DOS and earlier, with IRQ-driven hardware, might audio have been sent sample by sample; since the DMA days, audio is sent in packets.)
- The latency that is set in audio programs is directly related to the size of the buffers the program interchanges with the audio API: 64 samples, 512 samples, 2048 samples...
- When working with low-latency APIs (WASAPI exclusive, Kernel Streaming, ASIO...), the number of buffers is reduced to two: the one the API is reading and the one the host is writing.
- The way the host waits in push-based mode is generally with the sleep function, which works in milliseconds. Older (pre-NT) Windows versions had a minimum wait time of 15 milliseconds; now it is possible to wait just one millisecond. (The OS matters because sleep returns control to the OS, which stops assigning the program time to run, and starts assigning it time again after the wait.)
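The buffer-size figures in the list translate into milliseconds like this (a trivial check of the numbers above):

```python
RATE = 44100  # samples per second, as in the list above

def buffer_ms(samples: int, rate: int = RATE) -> float:
    """Duration of one audio buffer, in milliseconds."""
    return 1000.0 * samples / rate

for n in (1, 64, 256, 512, 2048):
    print(f"{n:5d} samples -> {buffer_ms(n):8.4f} ms")
# 1 sample is ~0.0227 ms and 256 samples are ~5.805 ms,
# matching the figures quoted in the list.
```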


With all that, at very low latencies (less than 5 milliseconds) there might be a possibility for event-based implementations to be less prone to audio skips than push-based implementations, given a sleep time of 1 millisecond. That is not the same as saying that push-based has more latency.
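One way to see why the 1 millisecond sleep granularity starts to matter at low latency is to count how many polling opportunities fit into one buffer period. This is a rough illustration, not a model of the Windows scheduler:

```python
RATE = 44100    # samples per second
SLEEP_MS = 1.0  # minimum sleep granularity assumed above

def polls_per_period(samples: int, rate: int = RATE,
                     sleep_ms: float = SLEEP_MS) -> float:
    """How many 1 ms polling chances fit into one buffer period."""
    period_ms = 1000.0 * samples / rate
    return period_ms / sleep_ms

for n in (64, 256, 512):
    print(f"{n} samples: {polls_per_period(n):.1f} polls per period")

# At 64 samples (~1.45 ms per buffer) a single late wakeup can already
# miss the deadline; at 512 samples (~11.6 ms) there is plenty of slack.
```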

And I guess it is clear that push-based implementations are not "based on the host clock". They just wait.

To put it even more strongly: clocks only matter at the hardware level, not at the software level. Software works with bytes, not with signals.

PC clock vs. DAC

Reply #14
Thank you [JAZ], that's exactly the information I needed.