• aleph · 6 points · 4 months ago

    But 24-bit audio is useless for playback. The difference is literally inaudible. In fact, the dynamic range compression applied during mixing and mastering has a far greater impact on perceptible audio quality than bit depth, sample rate, or bitrate does (the placebo effect notwithstanding).
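    To illustrate, here's a rough sketch (Python with numpy; the test signal and clip threshold are invented for illustration) of how heavy limiting flattens a signal's crest factor, the peak-to-average ratio that compression eats away:

    ```python
    import numpy as np

    def crest_factor_db(x):
        """Peak level relative to RMS level, in dB."""
        return 20 * np.log10(np.max(np.abs(x)) / np.sqrt(np.mean(x ** 2)))

    t = np.linspace(0, 1, 48000, endpoint=False)
    tone = np.sin(2 * np.pi * 440 * t) * np.exp(-5 * t)  # decaying hit
    crushed = np.clip(tone, -0.2, 0.2)  # crude "loudness war" limiting

    print(crest_factor_db(tone))     # ~13 dB of dynamics
    print(crest_factor_db(crushed))  # ~5 dB: audibly flattened
    ```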

    If you care about audio quality, seek out album masters and music that is well recorded and not dynamically crushed to oblivion. The bitrate isn't really all that important, in the grand scheme of things.

    • resetbypeer@lemmy.world · 7 points · 4 months ago

      I partially agree with you. Yes, mixing and mastering are far more important than bitrate. However, if I let my gf listen to the same song in both the standard 16-bit/44.1 kHz version and the 24-bit version, she can hear a difference. Is it night and day? Not always, but a subtle improvement can matter when enjoying music.

      • aleph · 7 points · 4 months ago (edited)

        Literally the only difference between 16-bit and 24-bit is that the latter has a lower noise floor, which is really only useful for sound production. It doesn't translate to any increase in meaningful detail or dynamic range during playback.

        16-bit was chosen as the de facto standard for CDs and digital music precisely because it provides more than enough dynamic range for human hearing.
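        To put numbers on that, a quick back-of-the-envelope sketch (Python, using the standard ideal-quantization approximation of roughly 6 dB of dynamic range per bit):

        ```python
        # Theoretical dynamic range of ideal linear PCM:
        # ~6.02 * bits + 1.76 dB (standard SQNR approximation).
        for bits in (16, 24):
            print(f"{bits}-bit: ~{6.02 * bits + 1.76:.0f} dB of dynamic range")
        # 16-bit: ~98 dB, 24-bit: ~146 dB. 98 dB already exceeds the gap
        # between a quiet room and painfully loud playback; the extra
        # 48 dB is only headroom for recording and mixing.
        ```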

        Any difference your gf hears is due to the placebo effect rather than any inherent difference in the actual audio.

    • datendefekt@lemmy.ml · 2 points · 4 months ago

      That writeup from Xiph is excellent. The comparison with adding ultraviolet and infrared to video makes so much sense. But you're dealing with audiophiles who seriously consider buying high-end power and Ethernet cables. I read somewhere about a listening test where the speakers were connected with coat-hanger wire, and the audiophiles couldn't tell.

      In the end, it's all physics. I could never hear a quality improvement beyond normal 16-bit, 320 kbps, no matter how demanding the music.

      • aleph · 2 points · 4 months ago

        As a recovering audiophile, I can safely say the hobby is heavily based around FOMO (the nagging doubt that something, somewhere in your audio chain is degrading the sound), and digital audio is no exception. Not only is 320 kbps more than enough, even with thousands of dollars' worth of equipment, but with codecs more efficient than MP3 (especially Opus), even 128 kbps can sound identical to lossless.
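        If you want to test that claim on your own ears, something like this sketch works (it assumes ffmpeg built with libopus is on your PATH; the filenames are placeholders). Encode a lossless file at 128 kbps, then ABX the pair:

        ```python
        # Transcode a lossless master to 128 kbps Opus for a self-test.
        import subprocess

        subprocess.run(
            ["ffmpeg", "-i", "master.flac",      # placeholder input
             "-c:a", "libopus", "-b:a", "128k",  # Opus at 128 kbps
             "test.opus"],
            check=True,
        )
        ```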

        If you have plenty of local storage then 16-bit FLAC is ideal, but if you are just streaming then you really don’t need a lossless service except to keep the FOMO at bay.

    • prole@sh.itjust.works · 1 point · 4 months ago

      Anyone who has ever heard a 128 kbps MP3 side by side with a 320 kbps version (or really anything above 192 kbps, in my experience) can tell you that bitrate definitely matters. The better the audio equipment you play it through, the more noticeable it is.

      It definitely becomes inaudible at a certain point, but back in my CD-ripping days, I'd scoff at anything below 192 kbps.

      • aleph · 1 point · 4 months ago

        Have you ever done an actual double-blind listening test? You'd be surprised. Even with good listening equipment it can be very challenging.

        Have a go on the 128 kbps AAC test on this page and see how you do:

        https://abx.digitalfeed.net/spotify.html
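        For context, this is a minimal sketch of the statistics behind an ABX test like that one (plain Python; the 16-trial scores are hypothetical examples):

        ```python
        # In ABX, X is randomly A or B each trial, so a listener who hears
        # no difference is coin-flipping. The one-sided binomial p-value
        # says how likely a score is under pure guessing.
        from math import comb

        def abx_p_value(correct: int, trials: int) -> float:
            return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

        print(abx_p_value(12, 16))  # ~0.038: better than luck at the 5% level
        print(abx_p_value(9, 16))   # ~0.40: indistinguishable from guessing
        ```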

          • aleph · 1 point · 4 months ago

            Presumably it was using an older or outdated encoder, then. With modern encoders, especially for codecs like Opus, Ogg Vorbis, and Apple's AAC, the vast majority of listeners find 128 kbps to be transparent, and certainly nowhere near night-and-day compared with lossless.

            Check out the results of this public listening test here:

            https://listening-test.coresv.net/results.htm