There are two different clock "topologies" for a DAC:
1) Source is master, DAC is slave
2) Source is slave, DAC is master
What matters is the jitter at the DAC chip itself. The best way to minimize it is a fixed frequency clock inside the DAC, right next to the DAC chip. You can do this when the DAC is the master, i.e. it holds the "master clock". USB can do this in "asynchronous mode", which is what the BH DAC uses, so over USB the DAC is the master. In this mode the source (usually a computer) sends the data out, but the DAC can tell it to speed up or slow down so the average data rate matches the clock in the DAC.
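To make that feedback idea concrete, here is a toy C simulation of asynchronous feedback. The 10.14 samples-per-frame format is the standard feedback format for full-speed USB audio, but the FIFO model, gain, and rates are made-up illustration, not anything from the BH firmware:

```c
/*
 * Toy simulation of USB Audio asynchronous feedback. The DAC runs on its
 * own fixed clock and periodically reports how many samples it actually
 * consumes per 1 ms USB frame, so the host's average data rate tracks the
 * DAC clock. Full-speed USB audio encodes this as 10.14 fixed point.
 * All constants here are illustrative assumptions.
 */
#include <stdio.h>
#include <stdint.h>

#define FRAC_BITS   14           /* 10.14 fixed point (full-speed USB audio) */
#define FIFO_TARGET 256.0        /* assumed target fill level, in samples */

/* Build the 10.14 feedback word from a samples-per-frame value. */
static uint32_t feedback_word(double samples_per_frame)
{
    return (uint32_t)(samples_per_frame * (1 << FRAC_BITS) + 0.5);
}

int main(void)
{
    double dac_rate  = 44100.4;  /* DAC's fixed clock, slightly off nominal */
    double host_rate = 44100.0;  /* host starts at nominal 44.1 kHz */
    double fifo      = FIFO_TARGET;

    for (int frame = 0; frame < 10000; frame++) {   /* 1 ms USB frames */
        /* Samples arrive at the host's rate and leave at the DAC's rate. */
        fifo += host_rate / 1000.0 - dac_rate / 1000.0;

        /* The DAC reports its consumption rate, nudged by the FIFO error
           so the fill level settles back to the target. */
        double report = dac_rate / 1000.0 + 0.0005 * (FIFO_TARGET - fifo);
        uint32_t fb = feedback_word(report);

        /* A well-behaved host sets its per-frame output to the feedback. */
        host_rate = (double)fb / (1 << FRAC_BITS) * 1000.0;

        if (frame % 2000 == 0)
            printf("frame=%5d fifo=%8.3f host=%10.4f Hz fb=0x%06X\n",
                   frame, fifo, host_rate, (unsigned)fb);
    }
    return 0;
}
```

Note that the DAC's clock never moves; only the host's output rate does. That is the whole point of asynchronous mode.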
There is another USB mode called adaptive, in which the DAC is the slave. The BH DAC does not use this mode, but some other DACs do.
The S/PDIF inputs (coax and optical) only work with the source as master and the DAC as slave, so the DAC has to somehow synchronize its clock to the data rate from the source. This is traditionally done with a PLL (phase-locked loop), which is built into all the S/PDIF receiver chips. PLLs have much higher jitter than a good fixed frequency clock. The BH DAC does not do it this way. It cleans up the S/PDIF signal and sends it into an FPGA (field programmable gate array), which does the S/PDIF decoding. The special part is a digitally controlled ultra low jitter clock, almost as good as the best fixed frequency clocks. The FPGA tells this clock to speed up or slow down so it stays synchronized to the average data rate of the source.
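Here is a similar toy simulation of that second idea: a very slow control loop that steers a local low jitter clock to the average source rate via a FIFO. The loop gains, the 1 ms update interval, and the drifted source rate are arbitrary assumptions; the point is that the steering bandwidth is so low that short-term jitter comes from the local clock, not from the source:

```c
/*
 * Toy simulation of steering a digitally controlled oscillator (DCO) to
 * the average S/PDIF source rate, along the lines of the FPGA approach
 * described above. All numbers are illustrative, not the BH design.
 */
#include <stdio.h>

int main(void)
{
    double source_rate = 44100.7;  /* average incoming sample rate (drifted) */
    double dco_rate    = 44100.0;  /* local ultra-low-jitter clock, steerable */
    double fifo        = 256.0;    /* FIFO fill level, in samples */
    const double target = 256.0;
    const double kp = 0.0005;      /* small proportional gain: slow steering */
    const double kd = 2.0;         /* damping term so the loop settles */
    double prev_err = 0.0;

    for (int t = 0; t < 20000; t++) {
        /* Each tick is 1 ms: samples arrive at the source rate and are
           clocked out at the DCO rate. */
        fifo += (source_rate - dco_rate) / 1000.0;

        /* Steer the DCO very gently toward the average incoming rate.
           The loop bandwidth is far below audio frequencies, so the
           short-term jitter is set by the local clock, not the source. */
        double err = fifo - target;
        dco_rate += kp * err + kd * (err - prev_err);
        prev_err = err;

        if (t % 4000 == 0)
            printf("t=%5d ms  fifo=%8.3f  dco=%10.4f Hz\n", t, fifo, dco_rate);
    }
    return 0;
}
```

Run it and the DCO creeps up to 44100.7 Hz over several seconds while the FIFO stays near its target. A conventional PLL does the same job much faster, but that speed is exactly why it passes source jitter through to the DAC chip.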
The result is that both S/PDIF and USB deliver ultra low jitter to the DAC chip. This combination of ultra low jitter on BOTH S/PDIF and USB doesn't exist in any other DAC; on other DACs one input or the other will have significantly higher jitter.
John S.