Do You Really Need a Discrete Sound Card?

Posted by Soulskill
from the either-that-or-you-don't dept.
crookedvulture writes "Integrated audio has become a common freebie on motherboards, causing many to question whether there's any need to have a sound card. Tech Report took a closer look at the issue by testing the latest integrated Realtek codec against a couple of sound cards: Asus' $30 Xonar DG and its considerably more expensive $280 Xense cousin. Everything from gaming performance to signal quality is explored, and it's the blind listening tests that prove most revealing. The integrated solution is obviously flawed, and in a bit of a surprise, the cheaper Xonar is the one most preferred. Discrete sound cards certainly have their benefits, and you don't need to spend a lot to get something that sounds a lot better than the average motherboard."
This discussion has been archived. No new comments can be posted.

  • Well... (Score:5, Informative)

    by CSFFlame (761318) on Monday November 22, 2010 @05:04PM (#34310352) Homepage
    I would like to point out that a good card lets you receive certain inputs that a normal card would not, such as both coax and optical S/PDIF. I would also say that much of the audio quality comes from the DACs and the sample-rate conversion.
  • Educate yourselves (Score:1, Informative)

    by Anonymous Coward on Monday November 22, 2010 @05:06PM (#34310372)

    Asus'

    Asus's

    I like Mr. Jones.

    I am going to Mr. Jones's house to meet his wife.

    It's hard keeping up with the Joneses.

    The Joneses' house is quite pretty.

    Do you get it yet?

  • Sometimes Yes You Do (Score:2, Informative)

    by sanjacguy (908392) on Monday November 22, 2010 @05:09PM (#34310398)
    I have a motherboard with a built-in Realtek sound 'card'. I also run Windows XP 64, 'cause I'm crazy. The Realtek driver for XP 64 is notoriously unstable. When I played Champions Online, the game would disable the sound, because it could and would crash the program. Borderlands went the other route: you can run the program, but you will always crash when you hit level 10, thanks to the special level 'ding' sound for level 10. Solution? Get a sound card, or a new OS.
  • by Attila Dimedici (1036002) on Monday November 22, 2010 @05:15PM (#34310480)
    Actually, it is not necessary to use an additional "s" to form a possessive with words that end in an "s" sound. Several sources say that it is preferred; however, I find it more readable without the additional "s".
  • by HermMunster (972336) on Monday November 22, 2010 @05:19PM (#34310516)

    Sound cards used to be sold because they decoded sound on the card rather than having the CPU do it, which would slow down gaming performance (somewhat). I'm sure that sound cards also have other features not found in on-board chipsets, but most of those are for things like high-end gaming.

    About 7 years ago I remember getting an on-board NVIDIA chipset that had hardware decoding of MP3 files. On the system without hardware decoding, CPU utilization jumped to about 45% continuous while playing back an MP3 file. On the rig with the NVIDIA chipset's hardware decoding, the CPU utilization was nearly imperceptible. It became too expensive for NVIDIA to offer those for long, so they replaced them with generic sound chipsets.

  • by Cinder6 (894572) on Monday November 22, 2010 @05:19PM (#34310522)

    Actually, the Chicago Manual of Style allows " Asus' " as an alternative to "Asus's". Just make sure to be consistent.

  • Re:Yes (Score:3, Informative)

    by arivanov (12034) on Monday November 22, 2010 @05:32PM (#34310654) Homepage

    Depends which ones. I have tried making a media center out of nearly anything short of a dead badger. Based on my experience:

    Most VIA EPIAs have sound quality on par with discrete audio solutions. It is something you can hook to a proper amp and not be disgusted by what comes out the other end. Most VIA-based mini-ITXes can proudly play FLAC-encoded audio with proper hi-fi quality. The same goes for some of the older Crystal Audio chipsets found on really old high-end motherboards.

    Compared to that, most audio on Intel chipset motherboards I have had to deal with is utter tripe (with the notable exception of Asus). The most common problems are:

    1. Interference from the network hardware. As the network works, it "ticks" over the audio channel, making the PC totally unusable for music. This is more common on older kit, though I still see it here and there even today.
    2. IRQ interference problems on new hardware. I thought that shared-IRQ problems were something of the distant (circa 1998) past. Recently Fujitsu-Siemens and Intel proved me wrong. The Intel HD audio on the Scaleo E needs special IRQ tweaking on Linux in order not to skip: http://foswiki.sigsegv.cx/bin/view/Net/DebianScaleoE [sigsegv.cx]
    3. Distortion. Most onboard Intel HD audio has notable distortion in the high-frequency range. Examples: HP 6xxx-series laptops, Lenovo S10e.
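    For anyone chasing problem #2, a quick first check on Linux is whether the HD audio controller shares its interrupt line with another device. A rough sketch (the PCI address below is an example; check lspci for yours):

    ```shell
    # Look for the HD Audio controller in the interrupt table.
    # If snd_hda_intel shares a row with another driver, the IRQ is shared.
    grep -E 'snd|hda' /proc/interrupts

    # Show which IRQ the audio PCI function was assigned
    # (00:1b.0 is a typical address for Intel HD Audio on this era of chipset).
    lspci -v -s 00:1b.0 | grep -i irq
    ```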

    You get what you pay for. Vive le monopoly: the result is crap video, crap audio, and crap disk I/O, and the consumer blames it all on, guess what, the too-slow CPU, so they aim to get a bigger one for Xmas, which is in favour of guess who...

  • by jandrese (485) <kensama@vt.edu> on Monday November 22, 2010 @05:36PM (#34310698) Homepage Journal
    7 years ago a new system would have been built with a 1.8-2.3GHz Athlon XP or a 2.5-3.5GHz Pentium 4. If you managed to make MP3 decoding eat up 45% of the CPU with any of those chips, you were doing something horribly wrong.
  • Got relays, beyatch? (Score:5, Informative)

    by Cordath (581672) on Monday November 22, 2010 @05:49PM (#34310882)

    Sound quality matters, but sometimes small features that one might usually overlook matter even more.

    For example, say that you have a nice speaker setup and a good amp, but an aging pre-amp that can no longer decode the latest audio formats. If you run things with a PC, the pre-amp is basically a very expensive DAC. If you can find a sound card with good DACs on it, you can, in theory, just toss the old pre-amp and connect your computer directly to your amp.

    Problem! When a computer boots up, a large voltage spike goes through its various components including the audio card. With many audio cards or audio chipsets this spike goes right out the line to your amp, which dutifully amplifies it into a very large CRAWHOOMP!!! Besides causing your cat or dog to projectile defecate on whatever it happens to be near at the time, this can also damage your speakers and/or amp!

    How do other components like pre-amps get around this problem? Good audio components all have some way of electrically isolating their outputs from the rest of the device so that these power-up CRAWHOOMPs don't happen. This usually means electromechanical relays. This is why your expensive amp or receiver usually makes some clicking noises moments after being powered up. That's the relays clicking into place once voltage levels have normalised.

    Good audio cards, like the Asus Xonar series, also have these now. On-board chipsets usually do not, since it would add a few dollars to the price of the board and most people don't plug their computer's output directly into an expensive amp and speakers.

    Long story short, what audio components you hook up to your computer and how you hook them up both have a large impact on the features you need in your computer's audio card. For a long time, computers had zero chance of replacing pre-amps because almost all audio cards lacked the small features that good audio gear almost universally possesses. That's changing, and about time too!
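    The relay trick also has a software analogue: keep the output silent and ramp it up over a few milliseconds once levels have settled, so any startup DC step never reaches the amp at full scale. A toy Python sketch of the idea (illustrative only; real gear does this with relays, and the sample rate and ramp length here are arbitrary):

    ```python
    # Simulate a power-up transient: a full-scale DC step that would otherwise
    # hit the amp, then apply a short linear fade-in ramp to suppress it.

    SAMPLE_RATE = 48000          # samples per second (arbitrary for the demo)
    RAMP_MS = 10                 # length of the fade-in ramp in milliseconds
    ramp_len = SAMPLE_RATE * RAMP_MS // 1000  # 480 samples

    # Worst case: full-scale DC offset from the instant power is applied.
    raw = [1.0] * SAMPLE_RATE    # one second of "signal"

    def fade_in(samples, ramp):
        """Scale the first `ramp` samples linearly from 0 up to full level."""
        return [s * min(1.0, i / ramp) for i, s in enumerate(samples)]

    muted = fade_in(raw, ramp_len)

    # First sample is silent, midpoint of the ramp is at half amplitude,
    # and the signal is at full level once the ramp ends.
    print(muted[0], muted[ramp_len // 2], muted[ramp_len])
    ```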

  • by Anonymous Coward on Monday November 22, 2010 @05:51PM (#34310904)

    It sounds like you're talking about the nVidia nForce 2 chipset [wikipedia.org], which was great, and notable for Soundstorm's real-time Dolby Digital 5.1 encoding.

  • by Luckyo (1726890) on Monday November 22, 2010 @05:51PM (#34310910)

    My own machine still has the Audigy 2 I bought long, long ago, and that card has been in at least 2 other systems before. My parents, who usually get my hand-me-downs, use an ancient SB Live!

    Both machines have Realtek on-board audio, and even my father, who is not an audiophile by any stretch, noticed a difference after I put the SB Live! in, despite using some crappy 50€ speaker+mic set on that machine (his words were something along the lines of "whatever you did to our computer, it sounds different. I like it more that way"). On my machine, I use Logitech's Z-5500 (http://www.logitech.com/en-us/speakers-audio/home-pc-speakers/devices/224) and frankly, Realtek sounds utterly horrible even on MP3 playback.

    Playing games on Realtek was just painful. Literally painful: the subwoofer was outputting noticeable distortions even in WoW during the short period about a year ago when the game was bugging out with Creative cards, forcing me to roll over to the on-board audio.

    Case in point: buy a basic ~40€ sound card if you care about sound at all. You don't have to shell out any more unless you're an audiophile; pretty much all the bells and whistles are a waste.
    But a proper bulk sound card does make a magnificent difference even for a mid-end speaker setup, and in some cases even with basic headphones.
    I'm not sure it matters whether the card is from Creative or Asus at this point. Both seem to support EAX, and with DX10 onwards losing DirectSound completely, there's just not much of a point in hardware audio in a consumer PC beyond basic quality and post-processing.

  • Re:Ghost Recon (Score:4, Informative)

    by dunezone (899268) on Monday November 22, 2010 @05:53PM (#34310932) Journal
    Back in the day a decent sound card would have its own on-board processor. This processor would take over the work of processing sound, relieving the burden on the machine's CPU, which was needed for other critical activities; it's the same concept as a GPU. It sounds to me like the processing of the sound was different from single-player to multiplayer. Maybe there was extra overhead to work out where the sounds were coming from within the environment, and that extra overhead pushed your CPU over the edge, so of course disabling the sound helped your game play. But today, with multi-core processing and fast processors, this is less of a concern and doesn't create the bottleneck it used to. Heck, we might be seeing CPU/GPU combos on the same chip in the near future; I believe AMD and ATI were working on that?
  • by Anonymous Coward on Monday November 22, 2010 @06:08PM (#34311120)

    No, Creative Labs made a Soundblaster 16 with a SCSI2 port [stason.org].

  • DDLive/DTS Connect (Score:3, Informative)

    by otis wildflower (4889) on Monday November 22, 2010 @06:09PM (#34311128) Homepage

    If you want digital surround sound for an HTPC, you want Dolby Digital Live or DTS Connect to transcode the output into a DD/DTS bitstream for your HT receiver.

    AFAIK there are currently ZERO onboard sound chips that do this.

    Yes, you could run 6 cables from the back of the HTPC into the analog pre-amp inputs on your receiver (assuming it isn't a skinny modern HTPC-in-a-box that only has S/PDIF or HDMI in), but you'd likely also end up with hum and other strange sound artifacts from the chintzy DAC.

    These days, I'd _REALLY_ prefer a dump of 5.1 LPCM over HDMI; it's probably technically easier to do to boot, or at least less encumbered by licensing.

  • Re:Well... (Score:5, Informative)

    by Anonymous Coward on Monday November 22, 2010 @06:13PM (#34311160)

    OK, everyone's talking about the noise produced inside a PC, but what in a PC is going to have noise at audible frequencies?

    You don't need noise at audible frequencies, you only need noise that produces unwanted harmonics at audible frequencies.

    EMI, for one, introduces all kinds of artifacts. Have you ever held your mobile phone near a powered speaker? Did you hear the crackling/popping noise coming from it? Yet, your phone communicates at 900MHz or above, which by your reasoning should be inaudible. High-resolution DACs are very sensitive to electrical interference. Such interference usually does not mean co-resonance (where the device oscillates with the same frequency as the noise source), but more often "beating".

    In the same vein, there are plenty of devices inside a PC that impact the stability of the power supply voltage rails. Small wrinkles on the power rail might again cause DAC inaccuracy, but of more importance is its impact on signal timing: a power surge (or dip) will affect the slew rate of transistors, which can cause inaccuracies in the timing of signals.

    In how many ways this can affect music reproduction is up for debate. But usually the second form of interference (jitter) causes much more audible problems than the first.
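    That jitter argument is easy to check numerically: sample a sine wave at slightly perturbed instants and the timing error shows up as noise. A rough Python sketch (the jitter magnitude is deliberately exaggerated for illustration, not a measured figure for any real DAC):

    ```python
    import math
    import random

    SAMPLE_RATE = 48000
    FREQ = 1000.0       # 1 kHz test tone
    JITTER_S = 5e-7     # +/- 500 ns of timing jitter -- far worse than real clocks

    random.seed(0)      # make the sketch reproducible

    clean, jittered = [], []
    for n in range(SAMPLE_RATE):  # one second of audio
        t = n / SAMPLE_RATE
        clean.append(math.sin(2 * math.pi * FREQ * t))
        # Sample the same waveform at a slightly wrong instant.
        dt = random.uniform(-JITTER_S, JITTER_S)
        jittered.append(math.sin(2 * math.pi * FREQ * (t + dt)))

    # Error power relative to signal power, expressed as an SNR in dB.
    err = sum((a - b) ** 2 for a, b in zip(clean, jittered))
    sig = sum(a * a for a in clean)
    snr_db = 10 * math.log10(sig / err)
    print(f"SNR limited by jitter: {snr_db:.1f} dB")
    ```

    With these (exaggerated) numbers the noise floor lands somewhere in the 50-60 dB range, i.e. well above what a clean 16-bit chain should manage; halving the jitter buys back about 6 dB.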

  • Re:No (Score:3, Informative)

    by Physics Dude (549061) on Monday November 22, 2010 @07:38PM (#34312024) Homepage

    ... Gold is a poor conductor ...

    Gold is a superior conductor to aluminum and not much worse than copper. I'm guessing cost has more to do with aluminum winning out for low-end shielding. :)

  • Re:Yes (Score:4, Informative)

    by fyngyrz (762201) on Monday November 22, 2010 @08:06PM (#34312250) Homepage Journal

    No, baby, it's not about the length - it's about how you plug it in.

  • Re:Yes (Score:1, Informative)

    by Anonymous Coward on Monday November 22, 2010 @08:45PM (#34312578)

    It's actually quite easy to perceive the difference between quality HDMI cabling and the cheaply-made stuff. However, it only shows up when the run is longer. When all you need is a cable to connect devices a few feet from each other, any crappy cable will do. When you need to run more than 50 or so feet of cabling, quality cables will improve the timeliness of the signal and reduce the amount of data that arrives after its usefulness (the signal may be digital, but it's still real-time, and very small delays in signal transmission will result in visible artifacts). Beyond 100 feet, I've been told you really need to transition to fiber optic for the bulk of the run.

    So it entirely depends on the context. An expensive monster 6ft cable is, as you've pointed out, a bad joke. An expensive 50ft cable is often very useful.

  • Re:Well... (Score:3, Informative)

    by Woodmeister (7487) <woodford@jason.gmail@com> on Monday November 22, 2010 @11:09PM (#34313468)
    _Real-time_analog_circuitry_ has "NO LATENCY"[1].

    [1]The closest thing you have to latency in these circuits is slew rate, which is measured in volts per _nano_second. There are also the phase shifts/distortions that the GP mentions, but the truth is these are practically impossible for humans to perceive in any real sense, 'specially for audio frequencies and circuits that aren't garbage.
