Discussion:
Is 720p (or i) _broadcast_?
J. P. Gilliver
2024-04-10 13:09:14 UTC
I was discussing quiz questions (the lack of science/tech ones in
particular on some prog.s), and it occurred to me that a fair one
might be how many lines HD has; you'd have to qualify it, something
like "On terrestrial FreeView television as broadcast in the UK, how
many picture lines does a broadcast described as HD have?"

It then occurred to me to wonder: is 720 used _for broadcast_ at all?
(Is there even much source material around in 720?)

Also, formulating this, I wondered: is interlacing used much, or at
all, these days? Presumably it is, at least for SD, at least for
material originally made in i. (Again, I'm talking about what's
_broadcast_, not what assorted sets _display_.) The original _reason_
for it - keep the flicker down while keeping the [vertical] resolution
up - has more or less gone with the separation of the light source from
the image display process (presumably progressive at 25 frames per
second needs about the same _bitrate_ as interlaced at 50 fields per
second).
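The hunch in that last parenthesis can be checked at the raw-sample level (illustrative Python arithmetic only; real broadcast bitrates depend on compression, and the variable names are mine, not from the thread):

```python
# Raw pixel-rate comparison for 576-line SD ("PAL land" raster):
# progressive at 25 full frames/s vs interlaced at 50 fields/s,
# where each field carries half the lines.
WIDTH, LINES = 720, 576

progressive_25p = WIDTH * LINES * 25          # 25 full frames per second
interlaced_50i = WIDTH * (LINES // 2) * 50    # 50 half-height fields per second

print(progressive_25p, interlaced_50i, progressive_25p == interlaced_50i)
```

Both come to the same raw pixel rate, which is why 25p and 50i need roughly the same bandwidth before compression.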
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"I'm a paranoid agnostic. I doubt the existence of God, but I'm sure there is
some force, somewhere, working against me." - Marc Maron
NY
2024-04-10 13:29:55 UTC
Post by J. P. Gilliver
I was discussing quiz questions (the lack of science/tech ones in
particular on some prog.s), and it occurred to me that a fair one might
be how many lines HD has; you'd have to qualify it, something like
"On terrestrial FreeView television as broadcast in the UK, how many
picture lines does a broadcast described as HD have?"
It then occurred to me to wonder: is 720 used _for broadcast_ at all? (Is
there even much source material around in 720?)
Also, formulating this, I wondered: is interlacing used much, or at
all, these days? Presumably it is, at least for SD, at least for material
originally made in i. (Again, I'm talking about what's _broadcast_, not
what assorted sets _display_.) The original _reason_ for it - keep the
flicker down while keeping the [vertical] resolution up - has more or less
gone with the separation of the light source from the image display
process (presumably progressive at 25 frames per second needs about the
same _bitrate_ as interlaced at 50 fields per second).
Since (in "PAL land") an SD picture is 704 or 720 x 576 and an HD picture is
1920 x 1080, material with 720 rows must either be down-scaled for SD or
upscaled for HD. I suppose upscaled 720 on HD is better than upscaled 576
from archive material before the master was HD.
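The scale factors involved make the point concrete (plain arithmetic; the variable names are mine):

```python
# Line-count ratios when fitting a 720-line master to either raster.
SD_LINES, SRC_LINES, HD_LINES = 576, 720, 1080

down_to_sd = SD_LINES / SRC_LINES    # discard a fifth of the lines
up_to_hd = HD_LINES / SRC_LINES      # invent half as many lines again
sd_up_to_hd = HD_LINES / SD_LINES    # upscaling an SD archive master is worse

print(down_to_sd, up_to_hd, sd_up_to_hd)
```

A 720-line source only needs a 1.5x upscale to 1080, against 1.875x for a 576-line archive master, which is the sense in which upscaled 720 beats upscaled 576.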

Is digital TV actually sent in interlaced format (i.e. odd-numbered rows,
pixel by pixel, then even-numbered rows) or is everything sent progressive,
with any interlacing generated at each TV? Indeed, are the pixels of a TV
lit in sequence, like the phosphor glow from a CRT, or do they all change
simultaneously, lit by a continuous backlight? A quick test on my PC (which
uses a 100 Hz refresh, not 120) when viewed through the camera on my phone
(which records at 30 fps) doesn't show the very obvious 10 Hz beating that
you used to get between a CRT and a camera if a "PAL" camera was pointed at
an "NTSC" TV. Having said that, the converse situation (120 Hz PC monitor
and 25 fps camcorder) did generate beating, which is why I changed the PC's
refresh rate from 120 to 100, for when I needed to show my PC screen in a
video I was shooting.
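For a pulsed display (CRT-style), the visible beat is roughly the gap between the refresh rate and the nearest harmonic of the camera's frame rate; a continuous backlight shows no such beat at all. A minimal sketch of that model (the `beat_hz` helper is my naming, not anything from the thread, and this ignores rolling-shutter effects):

```python
def beat_hz(refresh_hz: float, camera_fps: float, harmonics: int = 10) -> float:
    """Smallest gap between the display refresh rate and any
    multiple of the camera frame rate, i.e. the visible beat for
    a pulsed (CRT-like) display."""
    return min(abs(refresh_hz - k * camera_fps)
               for k in range(1, harmonics + 1))

print(beat_hz(60, 25))   # "PAL" camera pointed at an "NTSC" CRT
print(beat_hz(120, 25))  # 120 Hz monitor with a 25 fps camcorder
print(beat_hz(100, 25))  # 100 Hz is a clean multiple of 25: no beat
```

The 60/25 case gives the classic 10 Hz beat, 120/25 gives a 5 Hz beat (hence the switch to 100 Hz), and 100/25 gives none, consistent with the experience described above.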
J. P. Gilliver
2024-04-11 20:53:58 UTC
[]
Post by NY
Post by J. P. Gilliver
It then occurred to me to wonder: is 720 used _for broadcast_ at all?
(Is there even much source material around in 720?)
Also, formulating this, I wondered: is interlacing used much, or
at all, these days? Presumably it is, at least for SD, at least for
[]
Post by NY
Since (in "PAL land") an SD picture is 704 or 720 x 576 and an HD
picture is 1920 x 1080, material with 720 rows must either be
down-scaled for SD or upscaled for HD. I suppose upscaled 720 on HD is
better than upscaled 576 from archive material before the master was HD.
If the original is 576, I'm dubious about upscaling, but I suppose if
your display _is_ 720, you've got to do something to make it fill the
screen (vertically).
Post by NY
Is digital TV actually sent in interlaced format (i.e. odd-numbered rows,
pixel by pixel, then even-numbered rows) or is everything sent
progressive, with any interlacing generated at each TV? Indeed, are the
Interesting question.
Post by NY
pixels of a TV lit in sequence, like the phosphor glow from a CRT, or
do they all change simultaneously, lit by a continuous backlight. A
I'm pretty sure modern displays use a continuous _backlight_, no
question. (If you swing your eyeballs over a modern display, you don't
get the stream of images you used to get with a CRT display.) I've
always assumed that the actual pixels - now just variable-transmission
things - did still change sequentially.
Post by NY
quick test on my PC (which uses a 100 Hz refresh, not 120) when viewed
through the camera on my phone (which records at 30 fps) doesn't show
the very obvious 10 Hz beating that you used to get between a CRT and
a camera if a "PAL" camera was pointed at an "NTSC" TV. Having said
that, the converse situation (120 Hz PC monitor and 25 fps camcorder)
did generate beating, which is why I changed the PC's refresh rate from
120 to 100, for when I needed to show my PC screen in a video I was
shooting.
Was that a CRT PC monitor? Is the one you're now trying with your 'phone
a non-CRT one?
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"Look, if it'll help you to do what I tell you, baby, imagine that I've got a
blaster ray in my hand." "Uh - you _have_ got a blaster ray in your hand." "So
you shouldn't have to tax your imagination too hard." (Link episode)
Mark Carver
2024-04-12 15:27:27 UTC
Post by J. P. Gilliver
I was discussing quiz questions (the lack of science/tech ones in
particular on some prog.s), and it occurred to me that a fair one
might be how many lines HD has; you'd have to qualify it, something
like "On terrestrial FreeView television as broadcast in the UK, how
many picture lines does a broadcast described as HD have?"
It then occurred to me to wonder: is 720 used _for broadcast_ at all?
(Is there even much source material around in 720?)
It's all to do with the Kell factor. Read up about it.

Basically, 1080i50 gives similar subjective quality to 720p50.

The Holy Grail is 1080p50. 1080p50 is not transmitted in the UK,
because it takes (almost) double the bandwidth of 1080i50; however,
1080p50 is becoming standard within the studio environment now.

It's really easy to 'downscale' 1080p50 to 720p50 (because no temporal
conversion is required) and also really easy to upscale it to UHD (where
there is no interlaced mode (thank god)).
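The raw-sample arithmetic behind those bandwidth claims is easy to sketch (illustrative only; compression changes the absolute numbers, but the ratios hold at the raw level, and the helper name is mine):

```python
def samples_per_sec(width: int, lines: int, rate: int,
                    interlaced: bool = False) -> int:
    """Raw luma samples per second for a given raster and field/frame rate.
    An interlaced 'image' is a field carrying half the lines."""
    per_image = width * (lines // 2 if interlaced else lines)
    return per_image * rate

hd_1080i50 = samples_per_sec(1920, 1080, 50, interlaced=True)
hd_1080p50 = samples_per_sec(1920, 1080, 50)
hd_720p50 = samples_per_sec(1280, 720, 50)

print(hd_1080p50 / hd_1080i50)  # why 1080p50 isn't broadcast
print(hd_720p50 / hd_1080i50)   # 720p50 is in the same ballpark as 1080i50
```

1080p50 is exactly double 1080i50 in raw samples, and 720p50 comes out at about 0.89x of 1080i50, which fits the "similar subjective quality" point above.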
J. P. Gilliver
2024-04-12 18:48:26 UTC
In message <***@mid.individual.net> at Fri, 12 Apr 2024
16:27:27, Mark Carver <***@invalid.com> writes
[]
Post by Mark Carver
Basically, 1080i50 gives similar subjective quality to 720p50.
Interesting.

I was just wondering if anyone actually _broadcasts_ 720 (p or i). I
suspect not - SD will be broadcast at 576i since there's nothing to gain
from doing otherwise, and I presume HD is broadcast as 1080. I suspect
there is little (probably none, other than from amateur sources) source
material in 720.
Post by Mark Carver
The Holy Grail is 1080p50. 1080p50 is not transmitted in the UK,
because it takes (almost) double the bandwidth of 1080i50, however
So HD is broadcast i.
Post by Mark Carver
1080p50 is becoming standard within the studio environment now.
That makes sense - if you've got the storage, and 1080 sensors, you
might as well store at that for the future - even the conversion to i
doesn't require much (less than one field of storage, which is
negligible these days).
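That p-to-i conversion really is near-trivial: each output field just takes alternate lines from the corresponding progressive frame, so only a line or two ever needs buffering. A toy sketch under that assumption (frames modelled as lists of line labels; function and variable names are mine, and real converters are of course more involved):

```python
def p_to_i(frames):
    """Turn 50p frames into 50i fields: frame n contributes the field
    of parity n % 2 (odd lines, then even lines, alternating)."""
    fields = []
    for n, frame in enumerate(frames):
        start = n % 2  # which set of alternate lines this field carries
        fields.append(frame[start::2])
    return fields

# Two 4-line frames, lines labelled f<frame>l<line>.
frames = [[f"f{n}l{l}" for l in range(4)] for n in range(2)]
print(p_to_i(frames))  # [['f0l0', 'f0l2'], ['f1l1', 'f1l3']]
```

Each field is built line-by-line from the incoming frame, which is why far less than a field of storage suffices.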
Post by Mark Carver
It's really easy to 'downscale' 1080p50 to 720p50 (because no temporal
But who is doing it? The broadcasters are presumably storing at 1080p50
as you say, but broadcasting at 1080i50. So viewers with a 720 set will
receive 1080i50 or 576i50, and down- or up-scale from that (presumably
to 720i, though maybe 720p) - they won't have a source of 1080p.
Post by Mark Carver
conversion is required) and also really easy to upscale it to UHD
(where there is no interlaced mode (thank god))
Indeed. I don't think CRTs - the main reason for interlacing - were made
in UHD.

Interesting that you say the Holy Grail is 1080p50; presumably that's
only temporary, and something larger will come along eventually, with
the increasing size (resolution, really) of displays. Though IMO 50 -
once flicker is no longer relevant - is actually more than enough for
the majority of subjects, in fact overkill for most; for many types of
material 10 or 12 is in fact more than adequate. (Though I doubt _less_
than 50 will ever become common for general purpose studio production.)
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

Worst programme ever made? I was in hospital once having a knee operation and I
watched a whole episode of "EastEnders". Ugh! I suppose it's true to life. But
so is diarrhoea - and I don't want to see that on television. - Patrick Moore,
in Radio Times 12-18 May 2007.
Mark Carver
2024-04-13 16:26:32 UTC
Post by J. P. Gilliver
[]
Post by Mark Carver
Basically, 1080i50 gives similar subjective quality to 720p50.
Interesting.
I was just wondering if anyone actually _broadcasts_ 720 (p or i).
720i is not a supported mode. Some European and some US broadcasters
broadcast in 720p.
Post by J. P. Gilliver
Post by Mark Carver
It's really easy to 'downscale' 1080p50 to 720p50 (because no temporal
But who is doing it?
The broadcasters mentioned above. Production/studio standards are 1080;
720 is only really used for emission.
Post by J. P. Gilliver
Post by Mark Carver
conversion is required) and also really easy to upscale it to UHD
(where there is no interlaced mode (thank god))
Indeed. I don't think CRTs - the main reason for interlacing - were made
in UHD.
No. The primary reason for interlacing was to save bandwidth. 50% of it.
CRTs are (were) good for it, because the phosphor lag helped 'fill in
the gaps' on opposing fields.
J. P. Gilliver
2024-04-14 00:30:23 UTC
[]
Post by Mark Carver
Post by J. P. Gilliver
Post by Mark Carver
conversion is required) and also really easy to upscale it to UHD
(where there is no interlaced mode (thank god))
Indeed. I don't think CRTs - the main reason for interlacing - were
made in UHD.
No. The primary reason for interlacing was to save bandwidth. 50% of it.
Well, to save bandwidth _while preserving vertical resolution_
(otherwise they could have just done n/2 at 50) _and_ reducing flicker
(otherwise they could have just done n at 25, i.e. "p").
Post by Mark Carver
CRTs are (were) good for it, because the phosphor lag helped 'fill in
the gaps' on opposing fields
Yes.
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"Victory does not bring with it a sense of triumph - rather the dull numbness
of relief..." - Cecil Beaton quoted by Anthony Horowitz, RT 2015/1/3-9