The ATSC standard allows an HD signal to be broadcast at a constant bit rate (CBR) of 19.4 Mbps. Within a broadcast facility, I imagine all HD signals are kept at this bit rate or higher. The question becomes: what bit rate is transmitted to the viewer, and what is the impact on picture quality?
First, let's consider over the air (OTA). Until recently, WCBS was the only NYC-area station without subchannels, allowing it to use the full 19.4 Mbps for its main HD signal. Now all stations have subchannels, leaving fewer bits available for the main HD signal. I assume the stations use statistical multiplexing (SM) to combine the main HD signal and all the subchannels into the 19.4-Mbps bit stream that is transmitted OTA, which results in variable bit rates (VBR) for the main and subs. Can someone confirm that and explain the various VBR rates being used?
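The stat-mux idea can be sketched in a few lines. This is a toy model only (proportional sharing of leftover bits with a fixed floor per service); real multiplexers use rate-distortion feedback from the encoders, and the complexity and floor figures below are made up for illustration:

```python
# Toy sketch of statistical multiplexing into a fixed ATSC pipe.
# Assumption: each service gets a guaranteed floor, and the leftover
# bits are shared in proportion to instantaneous scene complexity.
# Real stat-mux systems are far more sophisticated.

MUX_RATE = 19.4  # total ATSC transport rate, Mbps

def stat_mux(complexities, floors):
    """Split MUX_RATE among services: floor plus a complexity-
    proportional share of whatever is left over."""
    leftover = MUX_RATE - sum(floors)
    total_c = sum(complexities)
    return [f + leftover * c / total_c
            for f, c in zip(floors, complexities)]

# Main HD plus two SD subchannels; a complexity spike on the main
# feed (e.g. a fast sports scene) pulls bits away from the subs.
rates = stat_mux(complexities=[8.0, 1.0, 1.0], floors=[6.0, 1.5, 1.5])
print([round(r, 2) for r in rates])  # main gets the lion's share
```

The total always sums back to the 19.4-Mbps pipe; only the split among services moves from moment to moment, which is exactly why the main and subs each end up VBR.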
Given that the broadcasters are using less than 19.4 Mbps for their main HD OTA signal, does anyone know the bit rate (constant, or average VBR) that the broadcasters provide to Cablevision (or Verizon, DirecTV, etc.) for their HD signal?
Once Cablevision receives the broadcasters' feeds, can anyone explain what further signal processing (for example, transcoding and additional SM) Cablevision performs, and what the actual bit rate of the HD signal is when delivered to the consumer?
These questions stem from recent experiments viewing major sporting events and noticing that the OTA signal produced a sharper picture with less macroblocking and pixelation than the Cablevision signal. Have others observed the same?
CV receives the "OTA" stations from NYC via fiber-optic links, so the compression used there is not the same as the OTA transmission. Cable companies can choose how they distribute "OTA" content to subscribers' homes; the same applies to premium content and content available from satellite. In the real "old days" of analog TV, CV received the analog OTA stations with a giant VHF antenna on a tower. I think CV was called "Viacom" back then.
Not to go too off-topic here, but I believe they've always been called Cablevision, with Viacom being sold to them later on. If I recall, what is now CV of Hauppauge used to be owned by Viacom. I'm not certain, but I believe the Woodbury system was the original. And just about all cable systems used antennas to receive OTA signals, hence the term "CATV", which stood for "Community Antenna TV"
I seem to remember the old Ford/Chevy vans that said "Viacom" on the sides from the '70s, though. Back then, when channel 3 was just static, I decided to try the huge top-loader VCR I had: I connected it to the cable line and asked my friend a quarter block down to see what he saw on channel 3. It worked flawlessly, with a clear picture and sound, until that scrambled sports channel appeared on channel 3 (I forget the name of it). WWHT on UHF 67/68 was another interesting scrambled channel that was easy to "de-scramble." I don't want to go too far off topic, but the OP reminded me of the "old" days, so I had to chime in with a bit of what I remember. CV still has some of the "old" hard line on the poles; I could think of some uses for it if it's not being used for anything. Back in the '70s it was easy to modify an analog tuner (below ch 7, above ch 13) to receive the analog cable channels that were not scrambled, although there certainly were not many of them.
A guy in my college class built a cable descrambler as a class project. He demonstrated it in class by descrambling a WHT over-the-air broadcast (because the college didn't have cable). We got to see a little bit of Maud Adams in Tattoo.
Before it was WWHT, channel 68 was WBTB-TV. They were known for The Uncle Floyd Show (which was ultimately canceled by Cablevision management!). One day I tuned in and saw Voyage to the Bottom of the Sea being broadcast upside-down and backwards! Someone forgot to rewind the film reel.
I remember "Uncle Floyd" and the "U68" music videos. I liked it best when it was "U68" music videos. There's an interesting article about "U68" mentioning "Viacom" as the cable provider for LI & NJ: »www.cat-house.org/U68.html
It was always Cablevision on much of LI. The rest of their systems were a hodgepodge of acquisitions over many years, including TCI, UA Columbia, TKR, Sammons Communications, and several others.
I'd appreciate if we could leave the history lessons behind and concentrate on the original questions regarding Bit Rate versus Picture Quality. Thank you.
At a high level, once the stream is received from the content provider, CV then has to fit it into a 38-Mbps, QAM-modulated 6-MHz channel.
This is where the "not compressed" claim FiOS used a few years ago comes into play. They would allocate two HD streams per QAM, resulting in zero additional compression on the stream received from the content provider.
In general, cable companies do not have this channel-availability luxury, because analog channels still occupy much of their spectrum, so they typically stuff 3-4 HD channels per QAM. The allotment of bits among the channels inside a single QAM is also variable, so they may "beef up" FOX's recompression during football games while starving QVC HD (just an example).
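The per-QAM arithmetic is simple enough to spell out. Assuming the commonly cited ~38.8-Mbps payload of a 256-QAM cable channel (close to the 38-Mbps figure above), the average rate per HD stream falls off quickly as channels are packed in:

```python
# Average bit rate per HD stream when N streams share one 256-QAM
# cable channel. 38.8 Mbps is the usual payload figure for 256-QAM
# in a 6-MHz channel (an assumption here, not from the thread).
QAM_PAYLOAD = 38.8  # Mbps

for n in (2, 3, 4):
    print(f"{n} HD streams per QAM -> {QAM_PAYLOAD / n:.1f} Mbps each")
```

This shows why two-per-QAM lets a full 19.4-Mbps ATSC stream pass through untouched, while three or four per QAM forces recompression down to roughly 13 or 10 Mbps.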
From my experience on FiOS, most channels are in the 12-16Mbps range. Given FiOS's stance on recompression, it would seem that's the rate they receive from the content provider.
I also have Time Warner Cable in Los Angeles, where we still have some analog cable channels. They typically place 4 HD channels per QAM, and HD picture quality is downright horrible at times. Even at its best, it doesn't come close to FiOS. I have my FiOS and TWC TiVos linked, and transferring a Yankees game recorded from YES-HD at 16 Mbps to the TWC TiVo is like night and day; I forget how good HD can look. I imagine the effect is similar on CV until they kill off all the analog channels (which should be happening on TWC in my area soon).
It seems as though, even if the broadcasters have limited their OTA to less than 19.4 Mbps for their HD signals, they would want to provide the full 19.4 Mbps to FiOS or whoever could take advantage of it. Isn't the idea to get the best signal possible all the way to the viewer? Perhaps the links between the broadcasters and FiOS, Cablevision, and others are limited to 19.4 Mbps and have to carry the main HD plus the subs.
That's a nice idea, but there's only so much bandwidth a cable/satellite/telco has to work with.
I think stations do not have, or do not want, the equipment to create two separate redundant data streams. At the same time, a cableco may figure that if a stream is good enough for a station to broadcast OTA, it's good enough to send to cable customers.
I imagine cablecos are not too happy that they have to re-encode video (and maybe audio?). They know that it lowers picture quality. It costs them money in equipment. It introduces delay in "live" programming. It adds more possible points of failure.
With a proper viewing distance that lets you resolve 1080 lines (e.g., 6-8 feet from a 60-inch TV), you will be able to see a significant difference between ~19 Mbps and, say, a recompressed ~10 Mbps. I have been able to compare FiOS and CV right next to each other at my neighbor's, and there is a significant difference. (I don't have either TV service; the only TV I watch is Netflix Blu-rays, and their streaming is also not up to snuff. If I wanted TV, I would go back to FiOS at this point.)
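The viewing-distance claim can be sanity-checked with basic trigonometry. Assuming the standard ~1-arcminute figure for normal visual acuity (an assumption, not from the thread), here's the angle one pixel subtends on a 60-inch 1080p set at those distances:

```python
import math

# Rough check: can a viewer 6-8 ft from a 60" 1080p set resolve
# individual pixels? Assumes ~1 arcminute of visual acuity.
DIAG_IN, AR_W, AR_H, H_PIXELS = 60, 16, 9, 1920

# Screen width from the diagonal and 16:9 aspect ratio.
width_in = DIAG_IN * AR_W / math.hypot(AR_W, AR_H)
pixel_in = width_in / H_PIXELS  # horizontal pixel pitch, inches

for dist_ft in (6, 7, 8):
    arcmin = math.degrees(math.atan2(pixel_in, dist_ft * 12)) * 60
    print(f"{dist_ft} ft: one pixel subtends {arcmin:.2f} arcmin")
```

At 6-7 feet a pixel subtends at or above one arcminute, so full 1080 detail (and therefore compression artifacts) should be resolvable; by about 8 feet it drops just under the acuity limit, consistent with the 6-8-foot range quoted above.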
Clearly FiOS has the capability of delivering the full 19-Mbps HD signal all the way to the viewer. Are you saying that they are getting the full 19-Mbps HD signal from the broadcasters and that is what they are delivering? (This would be better than OTA, which is less than 19 Mbps.)