S/PDIF vs. RCA
Posted: Mon Jan 10, 2005 9:25 pm
by Leaf
I got S/PDIF ins... I use the outs for my monitors. My bass player buddy wants to go in the S/PDIF ins, using an RCA cable. I am under the impression that this is a bad idea for some reason, and he's adamant that it doesn't matter. Anyone know why you aren't supposed to use RCA cables for S/PDIF inputs?
Re: S/PDIF vs. RCA
Posted: Mon Jan 10, 2005 9:39 pm
by jb
Leaf wrote:I got S/PDIF ins... I use the outs for my monitors. My bass player buddy wants to go in the S/PDIF ins, using an RCA cable. I am under the impression that this is a bad idea for some reason, and he's adamant that it doesn't matter. Anyone know why you aren't supposed to use RCA cables for S/PDIF inputs?
I use an RCA cable to go from my J-Station S/PDIF out to my Delta 66 S/PDIF in, and it works fine. Actually, I took one of those three-cable dealies, with the yellow, red, and white cable for audio/video, and I grabbed the yellow one and peeled it off the other two and just use that, so I don't have a useless cable dangling there getting all up in my business.
Posted: Tue Jan 11, 2005 8:26 pm
by fluffy
The whole point to S/PDIF is that, being a digital connection, as long as the signal gets through it doesn't matter how good the cable is. Basically, either you get a perfect signal or it'll be dead or garbled, without anything in between.
So, yeah, RCA cable is fine. It's what it's designed to use, what with having RCA plugs.
Though the way you phrased things makes it sound like your bass player buddy is planning on connecting his bass directly to the S/PDIF input. Hooking an analog device to an S/PDIF port won't work too well. :) (It wouldn't cause any damage, but an S/PDIF device seeing an analog signal will just do nothing, and an analog device seeing an S/PDIF signal will make a sound kinda-sorta like a dialup modem.)
Posted: Wed Jan 12, 2005 12:44 am
by Leaf
Yeah, I didn't mean it like that. He has an S/PDIF output on his bass amp, and I had heard from one person I trust and one I don't that "you weren't supposed to"... and for the life of me, I can't think of any reason why. I've tried imagining that it would have something to do with cable shielding, something reasonable like that... but I just can't remember why. The thing is, my buddy, bless his heart, often will do things because he thinks it's right, and he refuses to listen to any reason until he's discovered for himself, not through a conversation, that he was wrong. I really didn't see the need to use the output, since we had 8 free tracks, without getting some advice that would assure me that this was o.k. It seems reasonable, I mean, they look identical except for the price tag... it sounds like no one has had any equipment failures or damage as a result... maybe that should be the question... (implied here).
Posted: Wed Jan 12, 2005 5:06 am
by jb
Leaf wrote: it sounds like no one has had any equipment failures or damage as a result... maybe that should be the question... (implied here).
When you think about it, it's all just wire. So if you're going from one S/PDIF to another S/PDIF the worst that can happen is it won't work, since it's not like you're going from an electrical outlet into an S/PDIF input. In which case, if you achieved such a miracle of cabling, your computer would probably make very exciting noises before it stopped working, immediately prior to the smoke.
Posted: Wed Jan 12, 2005 7:04 am
by fluffy
Yeah, like, we're only talking like 12VDC and 100mA, if even that much.
"S/PDIF cables" are just RCA cables with a higher pricetag, sold because many audiophiles are idiots.
Posted: Mon Jan 31, 2005 11:55 pm
by Sleazy_D
The only reason the cable quality matters is if timing in the signal is critical... like if you're going from S/PDIF to analog anything, or directly to a PA where it's not so loud that you can still hear nuances, and those are only problems if your S/PDIF source is also the master digital clock.
Basically, audio RCA cables only have to have decent frequency response at audio frequencies. The square waves pushed out through S/PDIF need something much higher, and that usually means you want something with linear phase distortion up to a few MHz so that the clocking is stable.
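To put rough numbers on the "few MHz" part - these are my assumptions (44.1 kHz stereo, the usual 32-bit sub-frames, biphase-mark coding), so treat it as a back-of-the-envelope sketch, not gospel:

# Rough S/PDIF bandwidth estimate -- assumed figures: 44.1 kHz, 2 channels,
# 32 bits per sub-frame, biphase-mark coding.
sample_rate = 44100            # Hz
subframes_per_frame = 2        # left + right
bits_per_subframe = 32         # preamble + up to 24 audio bits + status/parity

bit_rate = sample_rate * subframes_per_frame * bits_per_subframe
print("line rate: %.4f Mbit/s" % (bit_rate / 1e6))          # ~2.82 Mbit/s

# Biphase-mark coding makes the shortest pulse half a bit period, so the
# highest fundamental is roughly the bit rate itself:
print("needs decent response up to roughly %.1f MHz" % (bit_rate / 1e6))

So the cable is being asked to carry a couple of MHz of square-ish wave, not 20 kHz of audio.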
If you're using it to record to another digital source, then it all gets sorted out when you play back to a stable clock signal, like your sound card.
Posted: Tue Feb 01, 2005 12:23 am
by Leaf
That was a very articulate answer... a little over my head in terms of what I know... but still... thanks... that's the thing I was waiting to read... a friend told me that the physical difference was in ohms?? Something to that effect, but thanks.
Posted: Tue Feb 01, 2005 8:21 am
by deshead
Leaf wrote:a friend told me that the physical difference was in ohms??
Here's a decent but technical article on the issues. Nutshell version: the S/PDIF standard specifies the use of a 75 ohm cable. So if you use the right cable, everything will work according to the spec. If you use the wrong cable, nothing's guaranteed.
You won't do damage, though. A cable with the wrong impedance might distort the signal, and since it's digital you'll get jitter, is all.
(Aside from the obvious hucksters who want to sell their over-priced cables, most of the loud voices on this issue are audio professionals. For guys mastering CDs, jitter in the stream is unacceptable, so the command "never use the wrong cable" carries some weight. Maybe your buddy talked to one of them.)
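If you want to see why that 75 ohm figure matters, the transmission-line math is short. A quick sketch - the 50 ohm value is just a made-up "wrong cable" number for illustration, not a measurement of any real RCA lead:

# Fraction of an incident edge that reflects at an impedance mismatch.
# The 50 ohm value below is a hypothetical "wrong" cable, purely for illustration.
def reflection_coefficient(z_load, z_line):
    return (z_load - z_line) / (z_load + z_line)

print(reflection_coefficient(75.0, 75.0))   # 0.0 -> matched, nothing bounces back
print(reflection_coefficient(75.0, 50.0))   # 0.2 -> 20% of each edge reflects

Whether a reflection that size ever matters at the receiver is exactly the part people argue about.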
Posted: Tue Feb 01, 2005 8:22 am
by fluffy
Sleazy_D wrote:The only reason the cable quality matters is if timing in the signal is critical... like if you're going from S/PDIF to analog anything, or directly to a PA where it's not so loud that you can still hear nuances, and those are only problems if your S/PDIF source is also the master digital clock.
Basically, audio RCA cables only have to have decent frequency response at audio frequencies. The square waves pushed out through S/PDIF need something much higher, and that usually means you want something with linear phase distortion up to a few MHz so that the clocking is stable.
Nice answer, but totally wrong. Yes, you need reallllly good cables to perfectly replicate the square wave on the other end, but the point to a digital signal is that you don't need perfect replication, and anyway it's not a perfect square wave coming out either. Timing is a non-issue because the signal always travels at the same speed no matter what the frequency is, and anyway, it moves close to the speed of light (or actually at the speed of light for TOSlink) - the cable would have to be several hundred miles long to introduce even a nanosecond of latency.
The "nuances" aren't affected by cable quality AT ALL. Nor is the quality of the magnetic surface on your hard drive a factor in how good your .wav file sounds. The only things which affect the sound quality in an S/PDIF setup are the quality of your recording and the quality of your DAC and clock source (and since the clock signal is part of the sound signal, this, too, is totally unaffected by cable quality as long as the signal gets through).
With a digital signal, either you get a perfect signal, or you get obvious errors. There is no in-between.
Oh, and jitter is malarkey. In any situation where jitter is an issue, it'd totally mess everything up. If bits arrive out-of-order, the sound will be completely messed up. And, again, the signal always travels at the same speed no matter what its frequency - so any out-of-orderness of the bits wouldn't be due to issues in the cable.
Posted: Tue Feb 01, 2005 9:46 am
by roymond
Is it safe to connect S/PDIF cables with the device power on? History suggests that you'll blow things up, but it may have been how my audio card's ports were configured (it blew the A/D converters on the AM III card).
Posted: Tue Feb 01, 2005 10:28 am
by deshead
fluffy wrote:And, again, the signal always travels at the same speed no matter what its frequency - so any out-of-orderness of the bits wouldn't be due to issues in the cable.
Read the article I linked. If all the components in the digital signal chain aren't impedance-matched, you'll get reflections. Depending on the return-loss value of the cable, the reflections may peak enough to trigger a transition in the DAC. How could this not cause jitter?
fluffy wrote:In any situation where jitter is an issue, it'd totally mess everything up. If bits arrive out-of-order, the sound will be completely messed up.
But only for that frame, right? Doesn't the encoding ensure the DAC can resync with the preamble in each sub-frame? So if bits are lost or scrambled, the DAC will right itself for the next sample. I'd expect this would just manifest as single-sample distortion, akin to digital clipping. (And either way, nothing that's going to damage Leaf's soundcard.)
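For what it's worth, here's roughly how I picture one sub-frame - the field names are mine and this is only a sketch from my reading of the consumer spec, not a reference implementation:

# Rough sketch of one S/PDIF sub-frame (32 time slots). Field names are mine.
from dataclasses import dataclass

@dataclass
class SubFrame:
    preamble: str          # slots 0-3: X/Y/Z sync pattern the receiver locks onto
    audio: int             # slots 4-27: up to a 24-bit sample
    validity: bool         # slot 28
    user_data: bool        # slot 29
    channel_status: bool   # slot 30
    parity: bool           # slot 31: even parity over the data slots

# Since every sub-frame carries its own preamble, a receiver that loses lock
# or sees a corrupted frame can resync on the next one, so the damage stays
# confined to that sample instead of wrecking the whole stream.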
Posted: Tue Feb 01, 2005 11:10 am
by Adam!
fluffy wrote:the cable would have to be several hundred miles long to introduce even a nanosecond of latency.
(Being a pedant here)
Rule of thumb: 1 foot of cable = 1 nanosecond of latency.
Yeah, I had some guy at The Sony Store try to bullshit me about jitter. He said that jitter comes from random errors in the bit pattern, and it sounds like quiet static, which means that only low-order bits get randomized / scrambled. I told him he must think I'm retarded if he expected me to believe that an audio cable would discriminate between low and high order bits, and that I certainly thought he was retarded, and then I walked out of the store.
Posted: Tue Feb 01, 2005 12:11 pm
by fluffy
Puce wrote:fluffy wrote:the cable would have to be several hundred miles long to introduce even a nanosecond of latency.
(Being a pedant here)
Rule of thumb: 1 foot of cable = 1 nanosecond of latency.
Oh, I didn't realize the signal propagated that slowly. Though that depends on the particular material of the wire, yes?
Anyway, it was just a half-assed order-of-magnitude thing. One nanosecond of latency won't hurt anything at all. Actually there is a very small amount of latency added in by the rising-edge trigger threshold on the receiving device, but that's more a function of the trigger generator on the sending device than anything else (and unless your wire is realllly crappy and extremely long - like, hundreds of feet long - the wire won't be a major contributing factor to that).
Yeah, I had some guy at The Sony Store try to bullshit me about jitter. He said that jitter comes from random errors in the bit pattern, and it sounds like quiet static, which means that only low-order bits get randomized / scrambled. I told him he must think I'm retarded if he expected me to believe that an audio cable would discriminate between low and high order bits, and that I certainly thought he was retarded, and then I walked out of the store.
Yeah. Single-bit swapping wouldn't be possible, and even if it were, it'd sound like WAY more than just "digital clipping" - it'd be a very major popping sound.
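Toy numbers, just to show the scale of it (the sample value is arbitrary and the bit flip is injected by hand - this isn't a model of any real failure mode):

# What one flipped bit does to a single 24-bit sample. The sample value is
# arbitrary and the error is injected by hand -- illustration only.
sample = 0x000123
msb_flipped = sample ^ (1 << 23)   # flip the most significant audio bit
lsb_flipped = sample ^ 1           # flip the least significant bit

full_scale = float(1 << 23)
print("MSB flip moves the sample by %.0f%% of full scale"
      % (100 * abs(msb_flipped - sample) / full_scale))
print("LSB flip moves it by %.8f of full scale"
      % (abs(lsb_flipped - sample) / full_scale))
# A flip in any of the top bits is a loud pop, which is why the
# "quiet static" story doesn't hold up.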
Posted: Tue Feb 01, 2005 1:12 pm
by c hack
I was always under the impression that you'd get dropped samples if you use a regular RCA cable (I think some guy at guitar ctr told me that once). Why not do a test? Try both, and see if there's any difference. I'd be curious (but not curious enough to try it myself).
Posted: Tue Feb 01, 2005 1:48 pm
by Adam!
fluffy wrote:Puce wrote:Rule of thumb: 1 foot of cable = 1 nanosecond of latency.
Oh, I didn't realize the signal propagated that slowly. Though that depends on the particular material of the wire, yes?
Actually you were right, it travels at the speed of light (times some coefficient).
c is about 300,000 km/sec, or 1 billion feet/sec, or 1 foot per nanosecond. But you're so right, a nanosecond doesn't really mean anything in comparison to the time it takes to get through the A/D converters.
Math is fun!
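If you want actual numbers for a given run, the arithmetic is trivial - I'm assuming a velocity factor of 0.66, which is just a typical coax figure, not anything measured:

# Propagation delay down a cable. The 0.66 velocity factor is a typical
# coax figure, assumed here just for illustration.
C = 299792458.0          # speed of light in vacuum, m/s
velocity_factor = 0.66

def delay_ns(length_m):
    return length_m / (C * velocity_factor) * 1e9

for metres in (1, 5, 100):
    print("%4d m -> %6.1f ns" % (metres, delay_ns(metres)))
# Even a 100 m run is only about 505 ns -- roughly 2% of one sample
# period at 44.1 kHz, so nothing your converters will ever notice.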
Posted: Tue Feb 01, 2005 2:17 pm
by c hack
I've noticed that the longer my speaker cable, the weaker (quieter) the signal (obviously analog). Does that make sense? More resistance?
Posted: Tue Feb 01, 2005 4:04 pm
by fluffy
puce wrote:Actually you were right, it travels at the speed of light (times some coefficient). c is about 300,000 km/sec, or 1 billion feet/sec, or 1 foot per nanosecond. But you're so right, a nanosecond doesn't really mean anything in comparison to the time it takes to get through the A/D converters.
Oh, right. So still just me being an order of magnitude off in my head.
I like how the speed of sound (in Earth atmosphere at STP) is (roughly) one foot per millisecond and the speed of light is (roughly) one foot per nanosecond. That makes it easier to keep track of stuff.
I guess the unit of time I meant to use was microseconds, anyway.
c hack wrote:I've noticed that the longer my speaker cable, the weaker (quieter) the signal (obviously analog). Does that make sense? More resistance?
Yeah, more resistance. Using a heavier-gauge (thicker) stranded wire will result in a louder signal (more paths for conductivity = less resistance; wiring two resistors in parallel gives you *less* ohmage than either resistor on its own). Also make sure your gauges match, and try to keep the lengths more or less the same.
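Quick back-of-the-envelope if you want to see how big the effect actually is - I'm assuming copper, roughly 16 AWG, and an 8 ohm speaker, so these are illustrative numbers only:

# Back-of-the-envelope speaker-cable loss. Copper resistivity, a ~16 AWG
# cross-section, and an 8 ohm load are the assumptions here.
import math

RHO_CU = 1.68e-8        # ohm*metre, copper
area_m2 = 1.31e-6       # cross-section of roughly 16 AWG wire
load_ohms = 8.0

def loss_db(run_metres):
    r_cable = RHO_CU * (2 * run_metres) / area_m2   # out and back
    return 20 * math.log10(load_ohms / (load_ohms + r_cable))

for metres in (3, 10, 30):
    print("%3d m run -> %.2f dB" % (metres, loss_db(metres)))
# Longer run = more series resistance = a slightly quieter speaker,
# which lines up with what you're hearing; thicker wire shrinks the loss.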
Oh, also, because I missed it earlier:
deshead wrote:
Read the article I linked. If all the components in the digital signal chain aren't impedance-matched, you'll get reflections. Depending on the return-loss value of the cable, the reflections may peak enough to trigger a transition in the DAC. How could this not cause jitter?
That'd have to be a pretty big impedance mismatch with a big tangled mess of cable. Articles hosted on audiophile sites usually aren't the best sources of physics or EE information. HTH, HAND.
Posted: Tue Feb 01, 2005 4:09 pm
by tonetripper
c hack wrote:I've noticed that the longer my speaker cable, the weaker (quieter) the signal (obviously analog). Does that make sense? More resistance?
Here is an informative link. I think it'll inspire some of you out there who are trying to think of ways of beefing up your DAW by buying monstrously priced cables. I make my own so I save a lot more dough in the long run. It's very easy and I'm sure there are cable construction links out there if you are curious about how to do it.
Audio Cables - Fact and Fiction Revealed
But coming back to the point: signal loss on a long speaker run is usually down to bad shielding, crap wire, the type of connector, and the level of signal being sent down the line compared to the rating of the cable.
The shielding is important as it keeps the electric impulse (whatever that may be) taking the path of least resistance, meaning that if your shielding is crap you'll probably have loss. In the case of speaker wire, most of that stuff is barely shielded, but refer to the part of the article where he talks about inductance in addition to resistance. There is no hard rule; usually the better the conductive material, combined with good shielding, the better the results, and that applies to anything where there's an electrical impulse coming down a line.
The more conductive the wiring, the better. You have to be careful, obviously, cuz if the wire is rated too high or too low for the signal and the connected gear you can run into problems, with the cable being wrongly rated for the signal... etc. The fundamental of all electrical concepts is the path of least resistance. Better conductivity wins in the end even without great shielding, which is what's informative about that article.
Also the connector. If the connectors are shit then you might have loss at the sources. A connector that mates well, with good solder points and a sensible impedance rating, is part of why a cable can go a greater distance or have better connectivity. Basically all three work off of each other to make a good cable.
So C. Hack, if you are using regular speaker cable, it usually isn't shielded all that well, so after a short distance it'll lose signal quick, primarily due to shitty shielding. With properly rated wiring, the better the shielding, in most cases, the better the signal transfer (of course this also goes back to the type of cable and connector). That applies to cable construction in general, in my experience.
In the world of S/PDIF, I use an RCA cable. I don't think it makes that huge a difference as long as you are not going a ridiculous distance, but if you have the dough, invest in a well-shielded S/PDIF or video RCA or whatever it might be called. The digital output signal is pretty hot and bad shielding is rarely going to become a major issue, but why take the chance, and compared to guitar cables or mic cable it's relatively cheap (that, and RCA connectors are hard to solder - well, I still do it now and again). This is of course my opinion. If you do buy one, try to buy one whose ends you can repair. The rubber ones are a hassle and usually if they break you just have to chuck them. A computer and/or stereo guy is going to tell you that you need to spend 30 dollars on a four-foot cable with gold connectors when you can spend 5 dollars at the local electronics store (not Radio Shack) and have a way better cable in the end. Cable construction is easy. You all should have a go at it if you haven't done so already.
Pablo
Posted: Tue Feb 01, 2005 6:07 pm
by roymond
I have a set of good (gold plated connectors, thick cable, etc.) cables I used to use between my DAT and sound card. Pretty long (I forget...20 feet?). I no longer need them and I don't anticipate needing them again. PM if you have a particularly good reason for me to ship them to you. Leaf gets first dibs since he started this excellent thread.
Posted: Wed Feb 02, 2005 12:15 am
by Leaf
Wow! This is getting awesome. Fluffy and Puce, you guys blow me away with your brains, and well, damn. I was hoping to read some kind of discourse like this. Anyway, based on what I've learnt from you all, I don't feel so scammed for buying a couple of S/PDIF cables for my monitors; since I use them constantly, I'll err on the side of caution! (I bought them two years ago for 30 bucks each, and that's slightly more expensive than a mic cable round here, so not so bad....)
Oh, thanks for the offer Roymond, but I'll have to pass and open the bidding to the masses.
Anyway, for recording the bass, the RCA has worked totally fine; and I've been able to route through my Roland VSR-880 and add two tracks through my Delta 10/10, so that's been an added bonus to this whole convo for me.
...so in short I offer nothing of value to this conversation other than thanking you all for sharing some super facts/opinions! Keep laying it down!
Posted: Thu Feb 10, 2005 4:23 am
by Sober
Ok, my Yamaha DG-Stomp has a "Digital Out" that says 'coaxial' under it. It looks like an RCA. Am I to assume that this will go right into my SPDIF input on my soundcard? Supposedly, this nifty piece of equipment will sound like it's a perfectly mic'ed amp if you use that output.
So, the questions are: Where the hell do I get a coaxial cable, does that matter, will an RCA do the job, and will it please please please not sound like crappy direct in?