24-bit 96kHz... a necessity?

TheGooch

Member
I use a condenser mic, an 8-channel mixer, and a 24-bit soundcard with connections for nearly any type of input/output.



- Will I see a large difference in sound quality when recording at 24-bit/96kHz as opposed to, say, 48kHz or even the old standard of 44.1kHz?

- I have been recording at 44.1kHz. Will the increased recording quality bring out a more true-to-life sound when recording vocals?

- Can a really good 48kHz mixdown equal a 96kHz mixdown? Will I be able to tell the difference?
 
Thanks, Carnage, for remembering that thread. It's prevented me from once again trying to explain the problems with 96kHz-to-44.1kHz conversion. Every time I try, I find my explanation worse than my previous attempts. (I recently found two of my long-winded, undoubtedly confusing "explanations" a page apart in the same thread...)

If I keep my wits about me, I can just refer folks to that thread. Unless you do it for me. :D
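The awkwardness theblue1 alludes to shows up in the arithmetic alone. This is just an illustrative sketch (not his explanation from the other thread): 96kHz-to-44.1kHz is not an integer ratio, so a resampler can't simply drop every Nth sample — it has to interpolate between sample positions.

```python
from fractions import Fraction

# Ratio between common source rates and the 44.1 kHz CD target.
# A non-trivial fraction means the converter must effectively
# upsample by the numerator and decimate by the denominator
# (or interpolate equivalently), rather than drop samples.
for src in (48_000, 88_200, 96_000):
    r = Fraction(src, 44_100)
    print(f"{src} -> 44100: ratio {r.numerator}/{r.denominator}")
```

Note that 88.2kHz reduces to a clean 2/1, which is why some engineers preferred it over 96kHz when the final target was CD.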
 
theblue1 said:
Thanks, Carnage, for remembering that thread. It's prevented me from once again trying to explain the problems with 96kHz-to-44.1kHz conversion. Every time I try, I find my explanation worse than my previous attempts. (I recently found two of my long-winded, undoubtedly confusing "explanations" a page apart in the same thread...)

If I keep my wits about me, I can just refer folks to that thread. Unless you do it for me. :D

:cheers:
 
No real reason to go 96k: there is nothing in that frequency range to record. Often 96k converters will sound better (Digi 888 vs. Digi 192), but that is because of better build quality and better jitter specs. 24-bit will always sound better. Everything I've worked on for the past three years has been 24-bit/48k, and I see no reason to go beyond that. If a client INSISTS on going 96k, then I'll do it, but at that resolution you usually get half the ins/outs on your hardware. Even on digital consoles that give you the full channel count at high resolution, you get LOW fader resolution.
 
When I moved from a meager 4 track to a DAW 4+ years ago, I was advised it is best to record at 24/44.1 if my goal was audio CD production (my gear is capable of 24/96). In four years I have not read anything yet to convince me otherwise.
 
And, in fact, Apogee (who I daresay we all would like to have converters from) ran a scientific listening test where they had their test groups listen to 48kHz/24 bit and 96kHz/16 bit (no one ever really uses 96/16 anymore... there were a few converters in the late 80s and early 90s that offered it but they never went anywhere).

No one preferred 96/16 over 48/24. Everyone with a preference liked 48/24.

It seems pretty clear that the improvement in bit depth is more important than the improvement in overall frequency range.

(Interestingly, there was a semi-scientific test done by either Mix or EQ, if I recall, where they tested a bunch of high-end amplifiers. Many of the people assembled preferred the amp that rolled off above 30 kHz to the several amps with response up to 100 kHz. There was some inconclusive conjecture about why that was... Of course, none of the participants were able to actually hear anything explicit above 18-22 kHz.)
 
theblue1 said:

No one preferred 96/16 over 48/24. Everyone with a preference liked 48/24.

It seems pretty clear that the improvement in bit depth is more important than the improvement in overall frequency range.


Blue, would you say this was dependent on the frequency make-up of the test files?
 
Well, I did some tests with people making claims about recording at 24/96, and of course nobody could tell the difference from 16/44.1, I mean NOBODY. But I actually record at 48/24 because of what I think is the real goal: digital processing — dynamics, plug-ins, and all that stuff we can now enjoy in DAWs. Obviously, FX and digital processing (prior to mixdown and dithering in mastering for CD) should work better and more accurately the more resolution we give them to work with. That is my point of view; sorry, guys, if I'm wrong.
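The processing-headroom point above can be illustrated with a toy round-trip. This is only a sketch with a made-up `quantize` helper — not how any particular DAW's engine works — showing that re-quantizing intermediate results at a lower bit depth accumulates more rounding error:

```python
# Toy model: snap a sample value to an N-bit grid in [-1.0, 1.0).
def quantize(x, bits):
    step = 1.0 / (2 ** (bits - 1))
    return round(x / step) * step

# Apply -12 dB of gain, store at 'bits' resolution, then restore
# the gain and store again — a crude stand-in for a processing chain.
def roundtrip(x, bits, db=12.0):
    g = 10 ** (-db / 20)
    y = quantize(x * g, bits)     # attenuated, quantized
    return quantize(y / g, bits)  # gain restored, quantized again

x = 0.123456789
for bits in (16, 24):
    err = abs(roundtrip(x, bits) - x)
    print(f"{bits}-bit round-trip error: {err:.2e}")
```

The 24-bit path comes back far closer to the original value, which is the intuition behind tracking and processing at 24-bit even when the final delivery format is 16/44.1 with dither.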
 
If you are recording mostly live gear (e.g. drums, piano, acoustic guitar), then you will hear a huge difference in quality. I haven't heard 24/192, but I have heard 24/96 and it's the closest thing to analog I've heard out of digital. But for what most of us do, the lesser rates work. I prefer 24/44 personally, just 'cause it works, and I've gotten into a rut of using it. Maybe I'll go 48 or even that 96 for some projects, but if it ain't really broke...
Peace
 
I attended a conference about that last night with Elliot Sheiner as a speaker. Anyway, wait until SACD becomes more mainstream (1-bit / 2.82 MHz).

I do agree with you all that bit depth is a lot more important than sampling frequency.
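For scale, here is a quick per-channel data-rate comparison of the SACD 1-bit stream mentioned above against common PCM formats (using the exact DSD rate of 64 × 44.1 kHz = 2.8224 MHz):

```python
# Per-channel data rates in bits per second: sample rate * bit depth.
formats = {
    "CD (44.1 kHz / 16-bit)":        44_100 * 16,
    "48 kHz / 24-bit":               48_000 * 24,
    "96 kHz / 24-bit":               96_000 * 24,
    "SACD DSD (2.8224 MHz / 1-bit)": 2_822_400 * 1,
}
for name, bps in formats.items():
    print(f"{name}: {bps:,} bit/s")
```

The 1-bit stream runs at 64 times the CD sample rate, so despite the single-bit depth it carries exactly four times the raw data of a CD channel — in the same ballpark as 96kHz/24-bit PCM.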
 
Carnage said:


Blue, would you say this was dependent on the frequency make-up of the test files?

It seems likely that would be a factor. But there wasn't a lot of info on the methodology on the Apogee site (unless there was a link to a white paper that I've forgotten)... but my thinking here is a bit on the sloppy, sycophantic side... if it's good enough for Apogee... :D


On the "couldn't hear the difference between 24/96 and 16/44.1" I'm thinking that may be a result of source material and equipment. So many of us are recording drum machines and synthesizers (often 16 bit themselves) or using lesser quality mics and possibly boards and interfaces -- not to mention monitors -- that it's easy to imagine a home comparison coming up with little noticeable difference.

But I think ebebebe is really on to something with his follow-up comment: he uses 24-bit resolution because he figures it will provide more finesse in processing and mixing, and that makes total sense to me.
 