Considering that devices showing signal strength have to sample the strength in dB and then convert it to the arbitrary "bar", why not just show the strength in dB? The Insignia portable HD radio shows strength in bars, which is nearly useless to me; I'd rather see the field strength in dBu.

I jailbroke my iPhone, and one of the things I changed was to display all signal strengths in dBm. A -50 dBm and a -90 dBm cell signal are both "5 bars", but there's a big difference in data reliability between them. Same thing with Wi-Fi: a -80 dBm signal is "1 bar" and usable, while a -90 dBm signal is also "1 bar" but unusable. A -40 dBm Wi-Fi signal can provide much faster speeds than a -65 dBm one, yet both show "3 bars" (see the sketch at the end of this comment for how that collapse happens). Having a dBm reading lets me find the spot where the signal is strongest (or, for radio, null out a strong signal).

Bars shouldn't measure battery life either; ideally the display would show the voltage or the remaining milliamp-hours, or at least a percentage.
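To make the quantization problem concrete, here's a minimal sketch in Python. The dbm_to_bars function and its cutoff values are my own invention for illustration, not any vendor's published mapping; the point is just that a wide top bucket makes very different readings display identically:

    def dbm_to_bars(dbm: float) -> int:
        """Quantize a received signal strength (dBm) into a 0-5 bar display.

        These cutoffs are assumed values for illustration; real devices
        use their own (usually undocumented) thresholds.
        """
        thresholds = [-113, -107, -103, -98, -91]  # one cutoff per bar
        return sum(dbm >= t for t in thresholds)

    for dbm in (-50, -90, -100, -110):
        print(f"{dbm} dBm -> {dbm_to_bars(dbm)} bars")
    # -50 dBm -> 5 bars
    # -90 dBm -> 5 bars   (same display, 10,000x less received power)
    # -100 dBm -> 3 bars
    # -110 dBm -> 1 bars

With this (hypothetical) mapping, everything from -91 dBm up shows full bars, so the display hides a 40 dB spread that the raw dBm number would make obvious.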
Bars should be places to get drunk, not measures of signal strength.