Stupid Question
Moderator: Moderators
- Gus Morris
- Rank 5
- Posts: 280
- Joined: Sat 07 Mar 2015 05:45
- Contact:
Stupid Question
How come I have problems trying to watch live TV on BBC iPlayer (via a VPN) yet I can download a 90-minute programme in 20 minutes? Sure, I can use Filmon, but the picture quality reminds me of the 1960s, before 625-line PAL was introduced.
Is it a question of outmoded data compression techniques? Can anybody enlighten me?
Gus
PS I download from BBC iPlayer at an indicated 640 mb/sec max. Not very quick here!
-
- Rank 5
- Posts: 1384
- Joined: Tue 01 Sep 2009 21:21
- Contact:
Re: Stupid Question
Gus Morris wrote:How come I have problems trying to watch live TV on BBC iPlayer (via a VPN) yet I can download a 90-minute programme in 20 minutes? Sure, I can use Filmon, but the picture quality reminds me of the 1960s, before 625-line PAL was introduced.
Is it a question of outmoded data compression techniques? Can anybody enlighten me?
Gus
PS I download from BBC iPlayer at an indicated 640 mb/sec max. Not very quick here!
Not that stupid. Firstly, live broadcasts use a different technology to downloads.
If you have periodic go-slows, you won't notice with a download, but you will with a live broadcast.
iPlayer isn't delivered from just one computer; there is a whole series of them in different locations, run by a number of content delivery partners. When you download, it could be from a completely different source than when you watch live.
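A rough way to see this from home: the hostname a player fetches from is resolved by DNS, and a CDN typically answers with different server addresses depending on where and when you ask. A minimal sketch in Python; the hostnames below are hypothetical stand-ins, since the real iPlayer delivery hostnames vary.
```python
# Resolve a content-delivery hostname and list the candidate servers.
# The hostnames below are hypothetical; CDNs typically return different
# addresses depending on where (and when) you ask.
import socket

def resolve_edges(hostname: str) -> set:
    """Return the set of IP addresses DNS offers for this hostname."""
    infos = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)
    return {info[4][0] for info in infos}

if __name__ == "__main__":
    for host in ["example-cdn-a.net", "example-cdn-b.net"]:  # hypothetical names
        try:
            print(host, "->", sorted(resolve_edges(host)))
        except socket.gaierror:
            print(host, "-> could not resolve (hypothetical name)")
```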
- Gus Morris
- Rank 5
- Posts: 280
- Joined: Sat 07 Mar 2015 05:45
- Contact:
Thanks Allan.
Are you saying that, in effect, BBC iPlayer is using BitTorrent technology, thus getting around some of the limitations of net parity by sending data packets from several sources simultaneously?
Gus
-
- Rank 5
- Posts: 1384
- Joined: Tue 01 Sep 2009 21:21
- Contact:
Gus Morris wrote:Thanks Allan.
Are you saying that, in effect, BBC iPlayer is using BitTorrent technology, thus getting around some of the limitations of net parity by sending data packets from several sources simultaneously?
Gus
No, absolutely not. I wonder how my last post led you to think that.
The packets do not come from multiple sources, but live broadcasts will use multicast while downloads are unicast; the system simply directs you to whatever it judges the most effective delivery mechanism.
Filmon works in a similar fashion, as do most TV services.
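For anyone who wants to see the unicast/multicast distinction at the nuts-and-bolts level: a unicast download is an ordinary one-to-one connection, while a multicast stream is something a client joins, with the network duplicating packets to every subscriber. A minimal receiving sketch in Python; the group address and port are hypothetical, since real IPTV services publish their own and multicast generally doesn't cross the open internet.
```python
# Join a multicast group and read a few datagrams.
# 239.0.0.1:5004 is a hypothetical group/port; real IPTV services
# publish their own, and multicast generally only works inside an
# ISP's network, not across the public internet.
import socket
import struct

GROUP, PORT = "239.0.0.1", 5004

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Tell the kernel (and upstream routers, via IGMP) that we want this group.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

for _ in range(3):
    data, sender = sock.recvfrom(2048)  # blocks until a packet arrives
    print(f"{len(data)} bytes from {sender}")
```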
- Santiago
- Rank 5
- Posts: 1290
- Joined: Tue 27 Dec 2005 12:19
- Contact:
Thanks Allan, nice simple explanation.
Domaine Treloar - Vineyard and Winery - www.domainetreloar.com - 04 68 95 02 29
- Gus Morris
- Rank 5
- Posts: 280
- Joined: Sat 07 Mar 2015 05:45
- Contact:
"iPlayer isn't just delivered from one computer, there is a whole series of them"
Isn't that the way torrenting works, multiple sources delivering to a single recipient? Of course, torrenting involves the sources running simultaneously.
Gus Morris
- malcolmcooper
- Rank 5
- Posts: 225
- Joined: Wed 09 Jul 2008 10:02
- Contact:
I have never understood why my Netflix never buffers whilst iPlayer often does. Does Netflix use a different delivery system?
-
- Rank 5
- Posts: 2086
- Joined: Sun 14 Apr 2013 14:37
malcolmcooper wrote:I have never understood why my Netflix never buffers whilst iPlayer often does. Does Netflix use a different delivery system?
Maybe your ISP anticipates very high volumes towards Netflix, and caters for them, and less towards iPlayer, which is just random traffic for them. Who knows?
-
- Rank 5
- Posts: 1384
- Joined: Tue 01 Sep 2009 21:21
- Contact:
Gus Morris wrote:
"iPlayer isn't just delivered from one computer, there is a whole series of them"
Isn't that the way torrenting works, multiple sources delivering to a single recipient? Of course, torrenting involves the sources running simultaneously.
Gus Morris
Perhaps I didn't express it clearly enough: iPlayer is delivered from multiple sources to multiple users, but each user is downloading from a single source allocated dynamically.
- Gus Morris
- Rank 5
- Posts: 280
- Joined: Sat 07 Mar 2015 05:45
- Contact:
What seems to be an undeniable fact is that the systems for delivering TV via the internet have steadily improved in my neck of the woods. I don't have a technical background but have been trying to get a grip on some of the basics.
Until fairly recently we watched TV via analogue technology. This meant that, in effect, the value of a component, say brightness, was virtually infinitely variable between its upper and lower limits. The image itself was repeated 25 times a second, but persistence of vision meant we saw it without flicker. We also discovered that we can make sense of sound even when it is chopped into small pieces, provided we hear them often enough. In the digital world things are different. Brightness, for example, has a set of graded steps between the upper and lower limits, each with an assigned value, and likewise for every other component of the signal, be it audio or visual. Unwanted information, like sounds we can't hear, is then removed; this is the basis of data compression. In general, the greater the degree of compression, the poorer the quality of the resulting display.
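To make those graded steps concrete, here is a small illustrative sketch, my own and not how any broadcaster actually does it, quantising a continuously variable brightness into the 256 levels an 8-bit system allows.
```python
# Quantise an analogue-style brightness value (0.0 - 1.0) into the
# discrete steps a digital system uses. With 8 bits there are 256
# levels; everything between two steps is rounded to the nearest one.
def quantise(brightness: float, bits: int = 8) -> int:
    levels = 2 ** bits
    return min(round(brightness * (levels - 1)), levels - 1)

def dequantise(level: int, bits: int = 8) -> float:
    return level / (2 ** bits - 1)

original = 0.7301
level = quantise(original)      # -> 186
restored = dequantise(level)    # -> 0.7294..., close but not identical
print(level, restored, abs(original - restored))
```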
In the digital world data is transmitted in packets, all conforming to the same specification. Imagine a flat belt carrying these packages: a narrow belt can only carry them in single file, so the number delivered depends on their spacing and is ultimately limited by the maximum speed of the belt. The only way to get more packets through is to widen the belt; hence increased bandwidth.
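Some rough arithmetic on that belt, using made-up but plausible numbers: if a 90-minute programme is encoded at 1.5 Mbit/s, the file is about 1 GB, and downloading it in 20 minutes needs a sustained 6.75 Mbit/s, i.e. four and a half times the playback rate.
```python
# Back-of-envelope: how fast must the "belt" run to download a
# programme faster than real time? All figures are illustrative.
ENCODE_RATE_MBIT = 1.5    # assumed video bitrate, Mbit/s
PROGRAMME_MIN = 90        # programme length, minutes
DOWNLOAD_MIN = 20         # target download time, minutes
PACKET_BYTES = 1500       # typical Ethernet payload size

file_mbit = ENCODE_RATE_MBIT * PROGRAMME_MIN * 60
needed_mbit_s = file_mbit / (DOWNLOAD_MIN * 60)
packets_per_s = needed_mbit_s * 1_000_000 / (PACKET_BYTES * 8)

print(f"File size: {file_mbit / 8 / 1000:.2f} GB (decimal)")
print(f"Needed throughput: {needed_mbit_s:.2f} Mbit/s")
print(f"That is about {packets_per_s:.0f} packets/s in single file on the belt")
```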
But the internet is designed to carry packets from many different sources, so the amount of data a client receives depends on the proportion of relevant packets in the incoming stream. To improve matters, some internet providers give packets from certain sources priority, so their delivery rate rises above the average. The more data gets through, the better the result.
Is this a fair description? All comments welcome.
Gus
-
- Rank 5
- Posts: 2086
- Joined: Sun 14 Apr 2013 14:37
Where is this meant to be leading?
Digital audio signals are transmitted "chopped up" but are reconstituted as an analogue signal by passing through a filter. The theory says that if the sampling rate is equal to or better than the "Nyquist" value (i.e. twice the frequency range you are seeking to reproduce) the result is indistinguishable from the original signal. "High-end" audio manufacturers and (particularly) hifi magazines make a living out of doubting that, but it is essentially woo.
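A quick way to see the Nyquist limit bite is aliasing: sample a sine at less than twice its frequency and the samples are indistinguishable from those of a lower-frequency sine. A small sketch with illustrative frequencies:
```python
# Aliasing demo: a 3 Hz sine sampled at only 4 Hz produces exactly
# the same samples as a phase-inverted 1 Hz sine (4 - 3 = 1), so the
# original is unrecoverable. Sampling above the Nyquist rate for a
# 3 Hz tone (more than 6 Hz) avoids this.
import math

SAMPLE_RATE = 4.0  # Hz, below Nyquist for a 3 Hz tone

for n in range(8):
    t = n / SAMPLE_RATE
    s3 = math.sin(2 * math.pi * 3 * t)    # the 3 Hz signal we sampled
    s1 = -math.sin(2 * math.pi * 1 * t)   # a phase-inverted 1 Hz impostor
    print(f"t={t:.2f}s  3Hz={s3:+.3f}  alias={s1:+.3f}")
```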
Theory also says that you can compress signals by throwing away the bits that you don't need (or won't very much miss). It very much depends on the nature of the original signal: if it contains exploitable regularities (sine waves in music; lots of blue sky in video) you can achieve high levels of compression which are strictly "lossless" and much higher levels which are subjectively OK.
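That point about exploitable regularities is easy to demonstrate with a general-purpose lossless compressor: repetitive "blue sky" shrinks to almost nothing, while random noise barely compresses at all. A minimal sketch using Python's zlib, a generic byte compressor rather than a video codec:
```python
# Lossless compression lives off regularity: a "blue sky" of identical
# pixels shrinks to almost nothing, while random noise is incompressible.
import os
import zlib

SIZE = 99_999
blue_sky = bytes([30, 80, 200]) * (SIZE // 3)   # one colour, repeated
noise = os.urandom(SIZE)                        # no structure to exploit

for name, data in [("blue sky", blue_sky), ("noise", noise)]:
    packed = zlib.compress(data, 9)
    print(f"{name:8s}: {len(data)} -> {len(packed)} bytes "
          f"({len(packed) / len(data):.1%})")
```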
TV has always been "digital" in a sense since it has always depended on exciting a discrete array of pixels (phosphors in the CRT days) at discrete intervals. There doesn't, happily, seem to be a TV equivalent to the hifi nonsense (the old analogue programmes on CR tubes were so much "warmer" and "involving").
The quality of a streamed signal will depend, within limits, on the bandwidth available from end to end. Some ISPs (including my own, Free) are said to have inadequate capacity at peak times towards some content providers like YouTube (because they would like a sub towards their infrastructure costs). It seems unlikely to be a major issue for the small minority of subscribers here who favour English-language material, but I can't claim to know. But if every teenaged boy in your neighbourhood is logged on to an online game, you may have problems anyway. Not because you are getting their stuff (as I understand you to think, perhaps wrongly), but because there is not enough capacity, "upstream", for your stuff and their stuff at the same time.