
HD/Atlanta

DuckBlue (Guest)
I have two HD units: a Dual car receiver that I bought early in the year and a Sony home tuner that I bought recently. They were both on sale (a $30 discount from the seller on each) with free shipping, and I got a $50 rebate from IBOC on each. It seems to me that they are almost giving the things away.

I bought the car unit originally to listen to NPR talk on the HD-3 channel of local public radio station WABE. Except for the annoying dropouts, it sounded great for a couple of months. Then the treble got boosted excessively, causing it to sound pretty bad. I can no longer listen to it. The same thing happened on their HD-1 channel, causing an interesting effect when the radio switches back and forth between analog and HD. Their HD-2 channel seems to be OK. I suspect that they are broadcasting a pre-emphasized signal on their HD-1 and HD-3 channels.

I notice no difference in the sound of commercial stations between analog and their HD-1 channels. The programming on their HD-2 and HD-3 channels is just more of the same - like switching channels on cable TV. Two of them have their AM talk signals on the HD.
 
DuckBlue said:
Except for the annoying dropouts, it sounded great for a couple of months. Then the treble got boosted excessively, causing it to sound pretty bad. I can no longer listen to it. The same thing happened on their HD-1 channel, causing an interesting effect when the radio switches back and forth between analog and HD. Their HD-2 channel seems to be OK. I suspect that they are broadcasting a pre-emphasized signal on their HD-1 and HD-3 channels.

In recent weeks, I've noticed the same kind of high-end boost on the digital signal of some stations. My guess is that iBiquity and/or the HD Alliance has been encouraging this in an artificial attempt to accentuate the supposed "high definition" of digital. In technical sessions I've attended, they've mentioned that the digital modulation level should be set slightly higher than analog, under the premise that "louder is better". You see, we have to give the listener a "WOW" effect when the receiver switches over.

Perhaps it's a sign of desperation, but I find it annoying, especially when the delay drifts out of sync, due to system design flaws.

DuckBlue said:
I notice no difference in the sound of commercial stations between analog and their HD-1 channels. The programming on their HD-2 and HD-3 channels is just more of the same - like switching channels on cable TV. Two of them have their AM talk signals on the HD.

57 Channels -- and nothin' on (as you've noted, Public Radio is the usual exception)
 
Ironically, Play Freebird, boosting highs with a lossy codec makes it far more likely that artifacts will become audible in the midrange, where the ear is most sensitive. The best way to minimize artifacts with a lossy codec is to limit HF extension. At low bitrates, limit highs to 12 kHz or so, and almost nobody will notice anything "missing," but the midrange artifacts that SCREAM "HERE I AM" drop dramatically in level.

One of the most striking things about good HD Radio is the dramatically open, extended high-frequency response. But artificially boost those highs, and WATCH OUT! There ARE limitations to ANY lossy codec!
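
If you want to hear the effect for yourself, here's a minimal Python sketch of the idea, assuming a WAV source on disk (the file names and the 8th-order filter choice are my own, not anything a station actually runs):

    # Band-limit audio before it hits a lossy codec, so the encoder spends
    # its bits on the midrange instead of the barely audible top octave.
    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import butter, sosfiltfilt

    rate, audio = wavfile.read("program.wav")   # hypothetical input file
    audio = audio.astype(np.float64)

    # 8th-order Butterworth low-pass at 12 kHz (the "12 kHz or so" above)
    sos = butter(8, 12000, btype="lowpass", fs=rate, output="sos")
    filtered = sosfiltfilt(sos, audio, axis=0)  # zero-phase filtering

    wavfile.write("band_limited.wav", rate, filtered.astype(np.int16))

Feed the band-limited file to your encoder of choice and compare it against a full-bandwidth encode at the same bitrate.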
 
Mike Walker said:
Ironically, Play Freebird, boosting highs with a lossy codec makes it far more likely that artifacts will become audible in the midrange, where the ear is most sensitive. The best way to minimize artifacts with a lossy codec is to limit HF extension. At low bitrates, limit highs to 12 kHz or so, and almost nobody will notice anything "missing," but the midrange artifacts that SCREAM "HERE I AM" drop dramatically in level.

You're absolutely correct. To reduce this problem, several of the latest audio processors (for example, the Omnia One Multicast with "Sensus") include an option to limit HF content ahead of lossy codecs, which does help to reduce artifacts. This is the digital parallel to FM "pre-emphasis limiting." Hold on a minute -- wasn't a "selling point" of HD Radio its supposed ability to handle an audio range extending up to 20 kHz? But if you actually try that, it sounds bad.

High Definition -- yeah, right.
 
A good point, Play Freebird, but as you know, the pre-emphasis curve on FM actually begins in the midrange and is HUGE by 10 kHz. This means that if a station is modulated to 100 percent at midrange frequencies, the highs must be greatly attenuated. Not bandwidth-limited, but attenuated in level. The highs are still there (to 15 kHz, anyway); they're just MUCH lower in level.
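
For anyone who wants numbers behind "HUGE by 10 kHz": the US 75 µs pre-emphasis network is a simple first-order shelf, so its boost at any frequency is easy to compute. A quick Python sketch (this is the standard curve, not any particular station's processing):

    # Boost of the 75 us US FM pre-emphasis shelf: sqrt(1 + (2*pi*f*tau)^2)
    import math

    TAU = 75e-6  # seconds (75 us in the US; 50 us in much of the world)

    def preemphasis_db(f_hz):
        """Pre-emphasis boost in dB at frequency f_hz."""
        w = 2 * math.pi * f_hz * TAU
        return 20 * math.log10(math.sqrt(1 + w * w))

    for f in (1000, 5000, 10000, 15000):
        print(f"{f} Hz: +{preemphasis_db(f):.1f} dB")
    # ~ +0.9 dB at 1 kHz, +8.2 dB at 5 kHz, +13.7 dB at 10 kHz, +17.1 dB at 15 kHz

So a 15 kHz component gets roughly 17 dB more deviation than midrange at the same source level, which is exactly why a densely modulated analog FM signal has to pull the highs way down.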

Since 10-20 kHz is the least audible octave, the effect I'm talking about (of "open" high frequencies) has less to do with extension than with BALANCE. A digital signal, even one with lots of lossy compression, can have a brighter overall BALANCE than densely modulated analog FM stereo. It's not only possible but demonstrable that an analog FM stereo signal that's densely modulated yet extended to 15 kHz will sound quite a bit "duller" than a digital signal without pre-emphasis that's extended only to 10 kHz. Add to this the fact that you CAN extend digital frequency response all the way to 20 kHz at higher bitrates, as long as you don't go silly doing things like boosting highs, and digital CAN sound a lot brighter and cleaner.

When we perceive sound as "brighter and cleaner," it seldom has anything to do with high-frequency extension. It's about BALANCE. On that score, digital can be both loud AND "bright," something analog FM stereo can't. Of course, making a signal with lossy digital encoding too loud can introduce and exaggerate midrange artifacts, because there are only so many "bits," and a wider dynamic range gives the codec more places to hide its tracks.

In my own experiments with AAC, given a bright source (music dripping with highs: cymbals, blatting trumpets, etc.) and highs extended to 20 kHz, if you encode at 48 kbps you REALLY begin to hear some artifacts vs., say, 96 kbps. However, if you band-limit the signal to 10 kHz before encoding, the artifacts largely fall away, but there is no corresponding "dullness" to the highs. I know this is unscientific, and I know that AAC isn't the same codec used on HD (but it's a cousin). Still, I believe that 48 kbps can sound just fine if the highs are bandwidth-limited to about 10 kHz and dynamic compression/limiting is kept moderate. By 64 kbps, it seems that extending response to 15 kHz has no real drawbacks. At 96 kbps, I'd go all the way to 20 kHz, but I wouldn't boost highs at all, EVER. Ironically, it seems to me that today's music, particularly pop/rock/country, is WAY too bright. The highs SIZZLE. They sting. They're, well, UNREAL! Boosting them not only produces artifacts in lossy codecs, IT FREAKIN' HURTS!
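
A rough way to rerun that comparison yourself, assuming ffmpeg is installed (its native AAC encoder stands in for the HD codec here, and the file names are placeholders):

    # Compare a full-bandwidth 48 kbps AAC encode against one that's
    # band-limited to 10 kHz first. Listen for midrange artifacts.
    import subprocess

    SRC = "bright_source.wav"  # hypothetical cymbal-heavy program material

    # Full bandwidth at 48 kbps -- expect audible midrange artifacts
    subprocess.run(["ffmpeg", "-y", "-i", SRC,
                    "-c:a", "aac", "-b:a", "48k", "full_bw_48k.m4a"],
                   check=True)

    # Low-pass at 10 kHz first, then encode at the same 48 kbps
    subprocess.run(["ffmpeg", "-y", "-i", SRC,
                    "-af", "lowpass=f=10000",
                    "-c:a", "aac", "-b:a", "48k", "lowpass_48k.m4a"],
                   check=True)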
 