Discussion:
How do they splice film when editing, so as to avoid the joins showing?
NY
2017-09-04 19:57:03 UTC
Permalink
With amateur 8 mm film, it was common for a splicing machine to use one of
three methods:

- cut the film with a straight cut along the frame boundary and then join it
with adhesive tape

- cut with an overlap which was shaved to half-thickness on both halves and
then cement the overlap

- cut with a zigzag cut along the frame boundary and cement them

All of these are very obvious.

So how is it done professionally in TV and cinema film, so the joint isn't
visible?


I know there's something about making A and B rolls, such that clear film is
used to join the odd-numbered *negative* shots on one roll, and to join the
even-numbered *negative* shots on the other roll, with the two rolls of film
being sandwiched together before being printed negative-to-positive. But why
doesn't that also show a boundary between the A shot and clear, and between
clear and B shot, at the tape or cement join?


I thought about this because I've just watched an episode of Freewheelers
(on Talking Pictures TV) from 1972, and every film edit was very noticeable
as a line at the bottom of one frame and the top of the next in consecutive
frames:

[Image: clean frame]

[Image: line at bottom of frame, with sideways displacement]

[Image: line at top of frame, part of previous frame]

[Image: clean frame]

I've never noticed this before.
John Williamson
2017-09-04 20:49:20 UTC
Post by NY
With amateur 8 mm film, it was common for a splicing machine to use one of
<snip>
Post by NY
So how is it done professionally in TV and cinema film, so the joint isn't
visible?
The professionals use negative film, and the negative is what's spliced.
Then they make an unspliced copy for distribution, which only gets
spliced if (a) it breaks, or (b) the cinema or broadcaster decide it's
too long. A lot of 1960s and 1970s programmes' films got hacked
mercilessly by the American networks as they sold more and more
advertising time and had to fit the programme into the same slot length.

Most editing was done using tape to join the ends together, with the
tape edge aligned to the interframe gaps. It's not hard when you use a
jig to cut the tape and film. Similar in principle to the one my dad
used to edit his 8mm epics, but bigger and more accurate.
Post by NY
I know there's something about making A and B rolls, such that clear
film is used to join the odd-numbered *negative* shots on one roll, and
to join the even-numbered *negative* shots on the other roll, with the
two rolls of film being sandwiched together before being printed
negative-to-positive. But why doesn't that also show a boundary between
the A shot and clear, and between clear and B shot, at the tape or
cement join?
If the boundary is between frames as it should be, it won't be seen, as
there is a period while the frame is pulled down during projection when
the light is blocked off. The most you might notice is a frame's worth
of about 10% reduction in brightness as the tape cuts the light down.
For projection in a cinema, there are usually three blockages per frame,
to increase the speed of the flicker, so making it less noticeable. TV
does the same trick by transmitting half the picture at a time at twice
the frame rate, and is helped by the persistence of the phosphors.
Post by NY
I thought about this because I've just watched an episode of Freewheelers
(on Talking Pictures TV) from 1972, and every film edit was very noticeable
as a line at the bottom of one frame and the top of the next in consecutive
The editor was in a bit of a hurry or not very good?

What you've seen may be an artifact of a badly adjusted scanner when it
was converted to video.

But as there is an edit in the film every time there is a new shot, are
you sure it was *every* edit that was faulty, or, given the age of the
film being projected, did you just notice the ones where the film had
been spliced to repair it?
--
Tciao for Now!

John.
NY
2017-09-05 09:23:15 UTC
Most editing was done using tape to join the ends together, with the tape
edge aligned to the interframe gaps. It's not hard when you use a jig to
cut the tape and film. Similar in principle to the one my dad used to edit
his 8mm epics, but bigger and more accurate.
I remember my dad had two jigs. One cut the film so there was a half-frame
overlap and there was a little plastic slider with a serrated edge
underneath to "file" off some of the film base from both sides where they
will overlap. These were then joined with cement. That might have been for
Standard 8. The other one, which was definitely for Super 8, made a zig-zag
cut on the frame boundary and then pulled the two pieces of film apart to
allow you to apply cement to the serrated bits before it pushed them back
together again.

Both produced very noticeable joins. Since dad had all the films transferred
to MPEG, I've been able to go through using VideoReDo and remove the two
adjacent frames that have the blemish, so the join is not obvious.
Post by NY
I know there's something about making A and B rolls, such that clear film
is used to join the odd-numbered *negative* shots on one roll, and to
join the even-numbered *negative* shots on the other roll, with the two
rolls of film being sandwiched together before being printed
negative-to-positive. But why doesn't that also show a boundary between
the A shot and clear, and between clear and B shot, at the tape or cement
join?
If the boundary is between frames as it should be, it won't be seen, as
there is a period while the frame is pulled down during projection when
the light is blocked off. The most you might notice is a frame's worth of
about 10% reduction in brightness as the tape cuts the light down.
Yes, if the tape covers exactly one frame either side of the join, it won't
be *too* obvious, though as you say there may be a reduction in brightness
and maybe slight blurring or, as in the case of the examples I attached
below, the tape may cause the film to move slightly in the gate, leading to
a bit of sideways shift.

I've managed to find a Wikipedia page about it and I'd got the A/B process
half-right (and so half-wrong). They use *black* leader between the shots
and I imagine they use an overlap join rather than tape, so as to leave the
last "good" frame totally clear of tape, and make any join where the film is
already black and won't be seen. They expose the print twice: once with just
the A roll (not with both sandwiched) and then with just the B roll. The
black leader on the B roll masks the parts of the print that had the
pictures on the A roll, and vice versa. https://en.wikipedia.org/wiki/B-roll
(section "16 mm film splices"). I'm intrigued by the statement "35 mm film
was wide enough to hide splices" - does that mean that they only applied
tape on the unseen perforations at either edge and left the visible part of
the negative unattached but still held in accurate register by the edges,
for feeding through the printer? Or does it mean that the gap between one
frame and the next is big enough to attach tape that only covers the border
and doesn't stray onto the visible frame?
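The double-exposure printing described above can be modelled frame by frame (a toy Python sketch; the shot labels are invented, and real black leader works photographically rather than as a list merge):

```python
# '.' marks black leader, which passes no light during printing, so a
# splice buried in the leader never reaches the print.

def make_rolls(shots):
    """Checkerboard a list of shots (each a list of frames) between rolls
    A and B, padding the other roll with black leader of equal length."""
    roll_a, roll_b = [], []
    for i, shot in enumerate(shots):
        if i % 2 == 0:                     # odd-numbered shots -> roll A
            roll_a += shot
            roll_b += ['.'] * len(shot)
        else:                              # even-numbered shots -> roll B
            roll_b += shot
            roll_a += ['.'] * len(shot)
    return roll_a, roll_b

def double_expose(roll_a, roll_b):
    """Expose the print once through each roll; leader contributes nothing."""
    return [a if a != '.' else b for a, b in zip(roll_a, roll_b)]

shots = [['A1', 'A1'], ['B2', 'B2', 'B2'], ['A3']]
a, b = make_rolls(shots)
print(double_expose(a, b))   # ['A1', 'A1', 'B2', 'B2', 'B2', 'A3']
```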

I imagine it is cheaper and quicker just to join all the negatives together
and make a single positive (or film on reversal film and join that, in the
case of fast-turnaround news footage), so maybe that was done with low-end
TV film where the budget didn't stretch to the extra hassle of making up two
rolls of interspersed shots and running them in turn through the printer
onto the same film.
Post by NY
I thought about this because I've just watched an episode of Freewheelers
(on Talking Pictures TV) from 1972, and every film edit was very noticeable
as a line at the bottom of one frame and the top of the next in consecutive
The editor was in a bit of a hurry or not very good?
But as there is an edit in the film every time there is a new shot, are
you sure it was *every* edit that was faulty, or, given the age of the
film being projected, did you just notice the ones where the film had been
spliced to repair it.?
Having noticed it once, I checked and found that *every* shot change had a
"mucky edit" like this. Obviously it only related to the filmed parts and
not the studio interiors.


I presume that if anything for TV is shot on film these days, the editing is
done electronically on a telecine copy, rather than using cut-film editing.
Even if they *do* go back to the negative and cut it, I imagine all the
rehearsal and trying-out is done electronically, so the film only needs to
be cut and assembled once when everyone is happy with the final cut.
Dave Liquorice
2017-09-05 10:26:02 UTC
Post by NY
I presume that if anything for TV is shot on film these days, the
editing is done electronically on a telecine copy, rather than using
cut-film editing. Even if they *do* go back to the negative and cut it,
I imagine all the rehearsal and trying-out is done electronically, so
the film only needs to be cut and assembled once when everyone is happy
with the final cut.
As was the case with real film. The neg would be kept very safe and
only cut once. All editing would be done with a positive "cutting
copy" and marked up with china graph pencil to indicate fades and
mixes. Cutting copies got bashed to hell with dirt, scratches, joints
etc. Remember the neg is a negative; I guess one would get used to
watching that, but proper colours are much nicer.
--
Cheers
Dave.
NY
2017-09-05 10:38:50 UTC
Post by Dave Liquorice
Post by NY
I presume that if anything for TV is shot on film these days, the
editing is done electronically on a telecine copy, rather than using
cut-film editing. Even if they *do* go back to the negative and cut it,
I imagine all the rehearsal and trying-out is done electronically, so
the film only needs to be cut and assembled once when everyone is happy
with the final cut.
As was the case with real film. The neg would be kept very safe and
only cut once. All editing would be done with a positive "cutting
copy" and marked up with china graph pencil to indicate fades and
mixes. Cutting copies got bashed to hell with dirt, scratches, joints
etc. Remember the neg is a negative; I guess one would get used to
watching that, but proper colours are much nicer.
Did cutting desks ever contain a low-tech telecine (or at least a camera
pointing at the film, even if there was no frame synchronisation) to convert
negative to positive for the benefit of the cutting editor?

How are frames in the negative identified? Does the manufacturer edge-print
a consecutive number in the perforation area of each frame which then shows
up in the cutting copy? Presumably when you load the negative you set the
desk to show a given frame and dial in the edge code so it can increment
that as the film is wound through to locate segments of film to extract and
join.
Dave Liquorice
2017-09-05 11:52:43 UTC
Post by NY
Did cutting desks ever contain a low-tech telecine (or at least a camera
pointing at the film, even if there was no frame synchronisation) to
convert negative to positive for the benefit of the cutting editor?
Only ever saw the desks in cutting rooms, purely optical. AFAIK the
neg stayed in the vaults at the labs and was only handled to produce
cutting copies(*) and the final neg cut.
Post by NY
How are frames in the negative identified? Does the manufacturer
edge-print a consecutive number in the perforation area of each frame
which then shows up in the cutting copy?
IIRC there were "rubber numbers" every 35 mm foot. You matched the
numbers and counted frames. I don't think there was any settable
counter but I have never seen the neg cutting process.
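That number-and-count scheme can be sketched as follows (illustrative Python; the 16-frames-per-foot constant assumes standard 4-perf 35 mm, and the footage values are invented):

```python
FRAMES_PER_FOOT_35MM = 16   # 4-perf 35 mm carries 16 frames per foot

def absolute_frame(feet, frame_offset):
    """Locate a frame from the nearest edge number (printed once per
    foot) plus a hand-counted frame offset, as a neg cutter would."""
    if not 0 <= frame_offset < FRAMES_PER_FOOT_35MM:
        raise ValueError("offset must be within one foot")
    return feet * FRAMES_PER_FOOT_35MM + frame_offset

# e.g. the edge number at 102 ft, then 5 frames counted past it:
print(absolute_frame(102, 5))   # 1637
```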

(*) Copies: ISTR you had to have a very good reason for more than
one. I don't think they used the rushes as the cutting copy, but I can't
see why they couldn't, assuming that the neg was correctly exposed and
developed.
--
Cheers
Dave.
Roger Wilmut
2017-09-05 16:04:28 UTC
On 2017-09-05 10:38:50 +0000, NY said:

Originally, joins were made by scraping a small area at the frame
line on each piece and cementing them. Of course this was on the
negative, so it only had to go through a printer (continuous motion),
not a projector (intermittent motion).

Silent films, 4 x 3 ratio, had the top of each frame directly against
the bottom of the next one, but in projection there would normally be
some masking of the edges of the picture.

When sound came in the soundtrack occupied part of the width of the
image, leaving a rather square frame. A number of films were shot like
this but then a thicker border between each frame - the rack-line - was
introduced to maintain the 4 x 3 ratio with a smaller frame. This made
the editing easier.

Amateur film on 16mm and 8mm didn't have much of a rack-line, so any
cement edits would be much more noticeable. Mostly amateurs used tape,
and this is also used nowadays professionally, because polyester film
(non-flammable) won't take cement.

Professional film has always had 'edge-numbers' along the edge, in a
sequence which only repeats after about ten years. The actual editing,
in the sense of compiling a film, is done by the editor using quickly made
prints, resulting in a 'work print' which is viewable but very messy
looking due to the handling, also with lines travelling across the
picture to indicate fades, etc.

When finalised, the actual negative is cut by special cutting staff,
handling the film very carefully (with gloves) and matching the work
print from the edge numbers.

Nowadays the assembly editing is usually done electronically, often on
the Avid system which you will sometimes see credited, using a video
copy with timecode burned into the picture. The negative is then
match-cut to the video.

For information on cutting and joining see a short article here:

https://en.wikipedia.org/wiki/Film_splicer

and a longer one here:

https://en.wikipedia.org/wiki/Film_editing
Angus Robertson - Magenta Systems Ltd
2017-09-05 17:30:00 UTC
Post by NY
I know there's something about making A and B rolls, such that
clear film is used to join the odd-numbered *negative* shots on
one roll, and to join the even-numbered *negative* shots on the
other roll, with the two rolls of film being sandwiched together
before being printed negative-to-positive.
This was mainly used for 16mm productions, to allow optical effects
between the two rolls, fades, wipes, etc.

35mm film can be edited invisibly for cuts, but optical effects have to be
produced separately.

For cheaper productions, such as the Carry On series in the sixties,
you can always tell when an optical effect is coming up, because the
picture quality suddenly drops as it becomes second generation, then
you see the effect and a cut back to the original negative quality.
Optical effects were charged per foot, so they kept them as short as
possible.

Angus
Roger Wilmut
2017-09-06 08:17:31 UTC
Post by Angus Robertson - Magenta Systems Ltd
Post by NY
I know there's something about making A and B rolls, such that
clear film is used to join the odd-numbered *negative* shots on
one roll, and to join the even-numbered *negative* shots on the
other roll, with the two rolls of film being sandwiched together
before being printed negative-to-positive.
This was mainly used for 16mm productions, to allow optical effects
between the two rolls, fades, wipes, etc.
35mm film can be edited invisibly for cuts, but optical effects have to be
produced separately.
For cheaper productions, such as the Carry On series in the sixties,
you can always tell when an optical effect is coming up, because the
picture quality suddenly drops as it becomes second generation, then
you see the effect and a cut back to the original negative quality.
Optical effects were charged per foot, so they kept them as short as
possible.
Angus
The reduction in quality just before an optical effect was widespread,
not just in cheap films, though if carefully done it wasn't too
noticeable.

A later development in 35mm printing was computer-aided. Suppose you
have two scenes linked by a 5-second crossfade. The negative would have
a 5-second blank spliced between the scenes. In the printing the
computer would instruct the printer to fade out the end of the first
scene, then roll the print backwards while still rolling the negative
forwards, then run the print in the forward direction and fade in the
second scene. This got round the cutting in of a dup copy with the
effect.
NY
2017-09-06 09:34:52 UTC
Post by Angus Robertson - Magenta Systems Ltd
35mm film can be edited invisibly for cuts, but optical effects have to be
produced separately.
Do you mean that there is sufficient vertical gap between frames that the
splicing tape doesn't need to overlap onto the frame? I would have thought
that all film would be roughly the same, apart from size, and that they
would space frames as closely as possible to use less film. The first
illustration in https://en.wikipedia.org/wiki/35_mm_film certainly shows a
very small inter-frame border. I suppose it depends on whether any of the
frame is masked-out during projection and is therefore safe for splicing
tape. I'm not sure why 35 mm is any better than 16 mm or 8 mm in terms of
visibility of splicing tape.
Post by Roger Wilmut
A later development in 35mm printing was computer-aided. Suppose you have
two scenes linked by a 5-second crossfade. The negative would have a
5-second blank spliced between the scenes. In the printing the computer
would instruct the printer to fade out the end of the first scene, then
roll the print backwards while still rolling the negative forwards, then
run the print in the forward direction and fade in the second scene. This
got round the cutting in of a dup copy with the effect.
When you say "instruct the printer to fade out the end of the first scene"
and "fade in the second scene" was that just by varying the intensity of the
printing light that is exposing the positive from the negative? And was it
done by varying the voltage on the bulb, or by varying an aperture? The
former might give very orange light at lower intensities, causing a blue
tint (complementary colour) in the positive.



Changing the subject slightly, are there any film formats which run the film
*horizontally* rather than *vertically* through the camera and projector?
I've seen archive newsreel footage which shows very clear horizontal rather
than vertical scratch lines, and they are in a very consistent position so
unlikely to be caused by successive layers of film on the reel scratching on
each other during winding.

By the way, is film for American TV shot at 30 fps (in the same way that
it's shot at 25 fps for UK TV) or is it shot at 24 and converted by 3:2
pulldown? Do they get pulldown motion artefacts even with made-for-TV film,
or just for made-for-cinema film?
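The 3:2 pulldown asked about here can be sketched like this (illustrative Python; the frame labels are invented):

```python
def three_two_pulldown(film_frames):
    """Map 24 fps film frames onto 60i fields: alternate frames are
    held for 2 fields then 3 fields, the '3:2' cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields += [frame] * (2 if i % 2 == 0 else 3)
    return fields

frames = ['A', 'B', 'C', 'D']
print(three_two_pulldown(frames))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
# 4 film frames -> 10 video fields = 5 interlaced frames, so 24 fps
# becomes 30 fps; the frames built from mixed fields cause the judder.
```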
Dave Liquorice
2017-09-06 10:04:19 UTC
Post by NY
Post by Roger Wilmut
A later development in 35mm printing was computer-aided. Suppose you
have two scenes linked by a 5-second crossfade. The negative would have
a 5-second blank spliced between the scenes. In the printing the
computer would instruct the printer to fade out the end of the first
scene, then roll the print backwards while still rolling the negative
forwards, then run the print in the forward direction and fade in the
second scene.
Nifty, I do like mechanical solutions.
Post by NY
When you say "instruct the printer to fade out the end of the first
scene" and "fade in the second scene" was that just by varying the
intensity of the printing light that is exposing the positive from the
negative? And was it done by varying the voltage on the bulb, or by
varying an aperture? The former might give very orange light at lower
intensities, causing a blue tint (complementary colour) in the positive.
Almost certainly an aperture for the reason you state. Lowering the
voltage assumes a light source that is continuously variable down to
zero without changing colour temperature; not many, if any, are.
Post by NY
Changing the subject slightly, are there any film formats which run the
film *horizontally* rather than *vertically* through the camera and
projector?
IIRC 70 mm IMAX does.
Post by NY
By the way, is film for American TV shot at 30 fps (in the same way that
it's shot at 25 fps for UK TV) or is it shot at 24 and converted by 3:2
pulldown? Do they get pulldown motion artefacts even with made-for-TV
film, or just for made-for-cinema film?
This always makes my brain explode. B-)
--
Cheers
Dave.
Laurence Taylor
2017-09-06 11:37:46 UTC
Post by NY
By the way, is film for American TV shot at 30 fps (in the same way that
it's shot at 25 fps for UK TV) or is it shot at 24 and converted by 3:2
pulldown? Do they get pulldown motion artefacts even with made-for-TV film,
or just for made-for-cinema film?
I believe it's shot at 24 (of course, most these days will be tape/solid
state); shooting at 30, except for special occasions, would be too expensive.

One "special occasion" that comes to mind is the TV series Babylon 5, where
many of the special effects were shot at 30fps (everything else being 24).
The higher quality is quite noticeable, even after the PAL conversion.
--
rgds
LAurence
<><
...
I've used this particular tagline 3 times.
NY
2017-09-06 21:29:05 UTC
Post by Laurence Taylor
Post by NY
By the way, is film for American TV shot at 30 fps (in the same way that
it's shot at 25 fps for UK TV) or is it shot at 24 and converted by 3:2
pulldown? Do they get pulldown motion artefacts even with made-for-TV film,
or just for made-for-cinema film?
I believe it's shot at 24 (Of course, most these days will be tape/solid
state); shooting at 30, except for special occasions, would be too expensive.
Ah, right. I'd assumed that all film-for-TV was shot at the appropriate
frame rate for the destination system (ie 25 for "PAL" or "SECAM" systems or
30 for NTSC), to avoid any motion artefacts. I hadn't realised that US
film-for-TV was at 24 and then converted to 30 by 3:2 pulldown.

When the UK bought filmed US programmes (eg Quincy, Kojak, Starsky and
Hutch), did they get the original 24 fps film and then play it at 25 fps?

We don't know how lucky we are in "PAL land" having reasonably smooth film
without motion judder and double-imaging on pans.
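The "PAL speedup" arithmetic behind this works out as follows (an illustrative sketch; the 50-minute runtime is just an example):

```python
import math

def pal_runtime(runtime_min, shot_fps=24, played_fps=25):
    """Runtime of film shot at shot_fps when played at played_fps."""
    return runtime_min * shot_fps / played_fps

# A 50-minute film played at 25 fps runs about 4% fast:
print(pal_runtime(50))   # 48.0 minutes

# The audio pitch rises by the same ratio, 25/24:
print(round(12 * math.log2(25 / 24), 2))   # 0.71 of a semitone
```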

The worst frame judder that I ever saw was footage on one of the US news
channels (maybe CNN) of Princess Diana's funeral. CNN were taking UK footage
from various UK broadcasters and converting it to 30, but then they were
converting all their output back to 25 for the version that was shown in
Europe. Back-to-back conversions, each of which introduces artefacts, looks
atrocious.
SimonM
2017-09-06 21:58:50 UTC
Post by NY
When the UK bought filmed US programmes (eg
Quincy, Kojak, Starsky and Hutch), did they get
the original 24 fps film and then play it at 25 fps?
Generally, yes.

Most US stuff was shot using 35mm stock, and
treated basically as 60min (55min) cinema productions.

My understanding is that film was the universal
medium because it was so easy to use it in all
countries. BBC Enterprises certainly used to sell
programmes abroad on film. Dave will remember many
"happy" hours after a final mix making
international M+E premixes for foreign sales (IIRC
that was music, FX, any synch speech, but no
commentary, mixed in such a way that the local
voiceover could be added without the need to dip
the track!).

VT copies travelling were unusual, and there was a
fuss I remember sometime around 1984-5 when a
small African country was found to be transmitting
evaluation low-band U-Matic copies of natural
history programmes to avoid paying for them
(complete with a large timecode display burned
into the picture). It's possible, of course that
said country didn't possess a TK channel anyway.

Bear in mind too that poly-prism TK systems and
other mechanical arrangements pre-dated flying
spot scanning (Rank Cintel). Poly-prisms need no
relationship between the film and video frame
rates. The image isn't very good, but you can show
24 fps films at 24fps on PAL-I if so desired.
NY
2017-09-07 09:37:45 UTC
Bear in mind too that poly-prism TK systems and other mechanical
arrangements pre-dated flying spot scanning (Rank Cintel). Poly-prisms
need no relationship between the film and video frame rates. The image
isn't very good, but you can show 24 fps films at 24fps on PAL-I if so
desired.
I'm surprised at the order in which different types of TK were produced. I'd
always thought that flying spot had been around for donkeys' years because
it was mechanically very simple - no need for a prism whose relationship
with the film frame had to be very accurately controlled (with a servo) to
prevent the sides of the picture "throbbing".

I thought that prism came later, when the servo issue was sorted out. I
hadn't realised that with prism you could run the film and prism at one
speed (eg 24) and scan at another (25). I wonder how they avoid 1 Hz
interference or frame boundary problems.

I presume all film nowadays is telecined using an n x 1 pixel sensor, which
is clocked at exactly the same speed as the film is running (as for prism,
not needing intermittent motion).

At what stage did broadcasters generally change over from telecineing a film
live to telecineing onto VT and playing the VT?
Angus Robertson - Magenta Systems Ltd
2017-09-07 13:25:00 UTC
Post by NY
I thought that prism came later when the servo issue was sorted
out. I hadn't realised that with prism you could run the film and
prism at one speed (eg 24) and scan at another (25). I wonder how
they avoid 1 Hz interference or frame boundary problems.
The poly-prism telecine was variable speed.

Back in the seventies the BBC used them to speed up USA imports like
Ironside to fit the 50 minute scheduled slot. But they could not get
away with that for music.

Nowadays they would have to run the programme very slowly to fill that
slot.
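The speed-up arithmetic can be sketched like this (illustrative Python; the 52-minute runtime is an invented example, not a figure from the thread):

```python
def required_film_speed(film_runtime_min, slot_min, native_fps=24):
    """Telecine running speed needed to squeeze a programme of the given
    runtime into a shorter (or longer) scheduled slot."""
    return native_fps * film_runtime_min / slot_min

# e.g. a 52-minute US episode into a 50-minute slot:
print(round(required_film_speed(52, 50), 2))   # 24.96 fps, ~4% fast
```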

Angus
NY
2017-09-07 18:56:05 UTC
Post by Angus Robertson - Magenta Systems Ltd
Post by NY
I thought that prism came later when the servo issue was sorted
out. I hadn't realised that with prism you could run the film and
prism at one speed (eg 24) and scan at another (25). I wonder how
they avoid 1 Hz interference or frame boundary problems.
The poly-prism telecine was variable speed.
Back in the seventies the BBC used them to speed up USA imports like
Ironside to fit the 50 minute scheduled slot. But they could not get
away with that for music.
When you say that poly prism was variable speed, I presume that you mean the
film and prism could run at variable speed against a fixed-speed 25 fps
raster. Why isn't that equally possible with any TK technology? No matter
what technology is used, you will have problems with beating if the film
speed is anything other than the raster speed. Of course certain speeds have
nice relationships - such as 24 and 30 which can show alternate film frames
for either 2 or 3 video fields, giving fewer motion artefacts than any other
pair of frequencies. Of course it's not really 24/30 but 23.976/29.97, given
the colossal fiddle that NTSC had to make when colour was introduced.

I imagine that with modern technology, there is nothing to stop them
transferring a film at a much higher speed than real-time 25 fps, as long as
the raster scans/clocks at the corresponding speed and you can record the
pixels fast enough. You then play them back out at the broadcast frequency.
Brian
2017-09-07 20:27:26 UTC
Post by NY
Post by Angus Robertson - Magenta Systems Ltd
Post by NY
I thought that prism came later when the servo issue was sorted
out. I hadn't realised that with prism you could run the film and
prism at one speed (eg 24) and scan at another (25). I wonder how
they avoid 1 Hz interference or frame boundary problems.
The poly-prism telecine was variable speed.
Back in the seventies the BBC used them to speed up USA imports like
Ironside to fit the 50 minute scheduled slot. But they could not get
away with that for music.
When you say that poly prism was variable speed, I presume that you mean the
film and prism could run at variable speed against a fixed-speed 25 fps
raster.
Yes. The prism had the effect of performing an optical dissolve
between adjacent film frames.

One advantage of a prism machine was that silent films (16 or 18 fps
or thereabouts if the camera had been hand-cranked) could be shown at
proper speed.

The big disadvantage (from a studio person's POV) was that the glass
prism introduced an optical loss, especially in the blue end, and
consequently prism TKs tended to have a poorer video s/n ratio than
twin-lens. This was a problem if you were trying to do CSO off TK, as
the key would be noisier than ideal. Therefore, on the occasions when
this was a requirement, Allocations would have to be asked to ensure a
twin-lens channel was booked.
Post by NY
Why isn't that equally possible with any TK technology?
Twin-lens TK relies on the film having moved exactly one film frame in
the time taken by one television field, hence the film speed must be
servo-locked to field syncs.

I believe that a hopping-patch TK can run the film at any speed. Don't
know about any of the other TK technologies. As you might have
gathered, I'm not a TK person.

<snip>
NY
2017-09-07 21:04:03 UTC
Permalink
Post by Brian
Post by NY
When you say that poly prism was variable speed, I presume that you mean the
film and prism could run at variable speed against a fixed-speed 25 fps
raster.
Yes. The prism had the effect of performing an optical dissolve
between adjacent film frames.
That's clever. So depending on the relative speeds you get varying
proportions of two adjacent film frames which makes the mismatch (in theory)
less noticeable. I suppose that it is the nearest that an optical system can
get to electronic blending of film frames at a non-standard rate.
Post by Brian
One advantage of a prism machine was that silent films (16 or 18 fps
or thereabouts if the camera had been hand-cranked) could be shown at
proper speed.
The big disadvantage (from a studio person's POV) was that the glass
prism introduced an optical loss, especially in the blue end, and
consequently prism TKs tended to have a poorer video s/n ratio than
twin-lens. This was a problem if you were trying to do CSO off TK, as
the key would be noisier than ideal. Therefore, on the occasions when
this was a requirement, Allocations would have to be asked to ensure a
twin-lens channel was booked.
I can see that this would be a problem. I don't think I've ever been aware
of a programme that has used CSO from a film: normally it's the converse
with a blue screen on a studio shot being able to key in a filmed exterior.

I suppose nowadays sensitivity is less of a problem, though optical
sharpness and glare in a large lump of glass could still be.

Stupid question: why is it that until recently, TV has usually used blue for
keying and film has usually used green, with green only beginning to be used
in TV? I'd have thought that in either case you want to use a colour that is
least likely to occur in the scene that is having something keyed into it -
which is why I think blue was used for TV. I believe that some films (eg
some of the Hitchcock films) used an intense yellow light produced by
special discharge lights shining on a plain white background rather than
white light shining on a coloured background.
Post by Brian
Post by NY
Why isn't that equally possible with any TK technology?
Twin-lens TK relies on the film having moved exactly one film frame in
the time taken by one television field, hence the film speed must be
servo-locked to field syncs.
I believe that a hopping-patch TK can run the film at any speed. Don't
know about any of the other TK technologies. As you might have
gathered, I'm not a TK person.
I'll have to see if I can find any info about twin lens and hopping patch.
The only technologies I've heard of are flying spot, rotating prism and
line-array CCD.
Roderick Stewart
2017-09-08 10:05:46 UTC
Permalink
Post by NY
Stupid question: why is it that until recently, TV has usually used blue for
keying and film has usually used green, with green only beginning to be used
in TV? I'd have thought that in either case you want to use a colour that is
least likely to occur in the scene that is having something keyed into it -
which is why I think blue was used for TV. I believe that some films (eg
some of the Hitchcock films) used an intense yellow light produced by
special discharge lights shining on a plain white background rather than
white light shining on a coloured background.
You can create an overlay key signal from anything that doesn't occur
in the foreground subject, even brightness, which is the way they did
it in television before colour.

I've also heard of a film technique using a retro-reflective
background (like road signs) that will look very much brighter than
the subject if it is illuminated along the same line as the camera's
viewing direction. This enables the use of a fairly low level of
illumination that doesn't dilute the normal theatrical lighting of the
subject, so that the subject looks properly lit and the background
looks very bright, but only along the viewing line of the camera
because the background light doesn't go everywhere else. This may be
the technique Hitchcock used.

For colour separation overlay, you ignore brightness but can use the
colour-difference signal corresponding to any hue that doesn't occur
in the foreground subject. Before CSO became a commonplace technique,
there was no special equipment for it, but the B-Y colour difference
signal was readily available from the camera circuitry because it had
to be produced anyway as part of the encoding process. The best
background colour to give good separation along this axis was a sort
of ultramarine blue. It was only feasible to use the B-Y signal
directly from the camera circuitry before the bandwidth reducing
filters required by all the analogue encoding systems, otherwise the
key signal wouldn't have enough detail to give a clean cutout shape.
For this reason the foreground subject of a CSO shot could only be a
live signal from a camera, not from a recording.
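The arithmetic described above - matrix RGB down to B-Y, then slice it against a clip level - can be sketched as follows. This is only an illustration, not any particular piece of BBC kit; the 0.299/0.587/0.114 luma weights are the standard PAL/NTSC matrix, and the clip level is an arbitrary value chosen for the example.

```python
# Deriving a CSO key by slicing the B-Y colour-difference signal.
# Values are normalised 0.0-1.0; the clip level is illustrative only.

def luma(r, g, b):
    """Y as matrixed in the encoder (standard PAL/NTSC weights)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def b_minus_y(r, g, b):
    """B-Y colour-difference signal, tapped off before encoding."""
    return b - luma(r, g, b)

def hard_key(r, g, b, clip=0.3):
    """Simple slice: 1 = show background, 0 = keep foreground."""
    return 1 if b_minus_y(r, g, b) > clip else 0

# An ultramarine-ish backing keys cleanly; bright white does not,
# because B-Y of any neutral (grey or white) is zero.
assert hard_key(0.1, 0.1, 0.9) == 1
assert hard_key(1.0, 1.0, 1.0) == 0
```

This also shows why B-Y behaves better than raw B for keying: white carries plenty of blue, but its colour-difference signal is nil.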

Despite this restriction it became such a popular technique that
eventually special equipment was made to enable the extraction of any
hue for keying purposes. Sometimes the controls for this were added to
mixing panels. It meant that any colour could be chosen for the
background, so foreground presenters need no longer avoid wearing
anything blue.

Rod.
NY
2017-09-08 12:04:55 UTC
Permalink
Post by Roderick Stewart
You can create an overlay key signal from anything that doesn't occur
in the foreground subject, even brightness, which is the way they did
it in television before colour.
I've also heard of a film technique using a retro-reflective
background (like road signs) that will look very much brighter than
the subject if it is illuminated along the same line as the camera's
viewing direction. This enables the use of a fairly low level of
illumination that doesn't dilute the normal theatrical lighting of the
subject, so that the subject looks properly lit and the background
looks very bright, but only along the viewing line of the camera
because the background light doesn't go everywhere else. This may be
the technique Hitchcock used.
I think the retro-reflective background technique is used by quite a lot of
news studios etc: they surround the lens of each camera by a ring of lights
(LEDs, probably, nowadays) which are reflected straight back into the lens.
It probably works best if the camera is reasonably at 90 degrees to the
reflective screen, though I imagine that any light that is close to the
camera will have a fair amount reflected back on that axis.

I imagine that it works best with modern CCD cameras rather than older tube
cameras because CCDs can handle any overexposure better from "brighter than
normal white" reflections. I imagine that comet-tail and lagging of tube
cameras were the biggest problems with CSO: you really *don't* want those on
the keyed edge :-)
Post by Roderick Stewart
For colour separation overlay, you ignore brightness but can use the
colour-difference signal corresponding to any hue that doesn't occur
in the foreground subject. Before CSO became a commonplace technique,
there was no special equipment for it, but the B-Y colour difference
signal was readily available from the camera circuitry because it had
to be produced anyway as part of the encoding process. The best
background colour to give good separation along this axis was a sort
of ultramarine blue. It was only feasible to use the B-Y signal
directly from the camera circuitry before the bandwidth reducing
filters required by all the analogue encoding systems, otherwise the
key signal wouldn't have enough detail to give a clean cutout shape.
For this reason the foreground subject of a CSO shot could only be a
live signal from a camera, not from a recording.
Despite this restriction it became such a popular technique that
eventually special equipment was made to enable the extraction of any
hue for keying purposes. Sometimes the controls for this were added to
mixing panels. It meant that any colour could be chosen for the
background, so foreground presenters need no longer avoid wearing
anything blue.
Ah, for PAL or NTSC, they already produced R-Y and B-Y (but not G-Y), so it
makes sense to use one of the two that *is* available - and R-Y might
produce too much false triggering from skin and reddish hair.

Where did signal mixing occur - at the camera or at the mixing desk? In
other words, did they have access to the raw R, G and B at the mixing desk
where they'd want to do the keying? Or had the R, G, and B already been
thrown away, with just R-Y, B-Y and Y by this point? Mathematically, RGB can
still be derived if needed, but it's extra circuitry. Is there any
advantage in producing colour diff signals at camera rather than further
downstream? What about other sources like caption generators and TK: does
the mixing (and maybe even band-limiting) occur at source or at the mixing
desk?

It's a good point about needing to do it before bandwidth limiting of PAL or
NTSC. One of the big giveaways of CSO (apart from mismatched lighting!) is
the sharp, sometimes slightly ragged boundary that you get. I imagine that
it is a lot easier to do in the digital domain because you blend or feather
the edge slightly so you get a mixture of pixels rather than an
all-or-nothing hard edge. Also you can look for pure colours - not just blue
or green or orange or whatever over a certain level but also blue (etc) over
a certain level and also all other colours below that level. Mind you, I
suppose using B-Y rather than B has the advantage that it prevents false
triggering on bright white which has blue over the threshold but also other
colours.
Brian
2017-09-08 13:15:41 UTC
Permalink
On Fri, 8 Sep 2017 13:04:55 +0100, "NY" <***@privacy.net> wrote:

<snip>
Post by NY
I imagine that it works best with modern CCD cameras rather than older tube
cameras because CCDs can handle any overexposure better from "brighter than
normal white" reflections. I imagine that comet-tail and lagging of tube
cameras were the biggest problems with CSO: you really *don't* want those on
the keyed edge :-)
We managed quite nicely with EMI 2001s.

If you have a comet's tail in the picture, it is because you have
something which is over-exposed. The key to good CSO with tubed
cameras was very careful lighting, correct adjustment of both the
operational and engineering controls of the channel providing the
foreground, and correct adjustment of the key clip.
Post by NY
Where did signal mixing occur - at the camera or at the mixing desk?
Depends (a) on what you were trying to achieve and (b) which studio
you were in.

"Easy" CSO was done at bank level in the vision mixer. The mixer
contained an overlay switch which operated on coded PAL signals under
the "instruction" of a key signal which was derived from the
foreground source before the PAL coder, so it had no PAL artifacts on
it. If the foreground source was local to the studio, the RGB signals
were obviously readily available. If the foreground source was a TK
machine, each studio had a dedicated, timed, TK key line down which
the key signal could be routed from TK Control.

Studios equipped with later generations of equipment had 6-axis
overlay processors in the RGB feeds to the PAL coders. These could
produce a key from any of the primary or secondary colours ( -Y) and
also suppressed the keying colour in the foreground source to the
level of the highest remaining colour(s). This did away with the
coloured fringes sometimes seen on early CSO - at any rate, it
rendered any fringes neutral, and therefore almost invisible.
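The suppression described above - pulling the keying colour in the foreground down to the level of the highest remaining colour - amounts to a simple clamp. A sketch for a blue backing (illustrative only, not the actual 6-axis processor logic):

```python
# Foreground "spill suppression" for a blue keying colour: clamp blue
# to the higher of red and green, so blue fringes come out neutral
# instead of coloured. Values are normalised 0.0-1.0.

def suppress_blue(r, g, b):
    return r, g, min(b, max(r, g))

assert suppress_blue(0.2, 0.3, 0.9) == (0.2, 0.3, 0.3)  # fringe neutralised
assert suppress_blue(0.5, 0.4, 0.3) == (0.5, 0.4, 0.3)  # normal colours untouched
```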

The problem with CSO in the mixer is that it put the Vision Mixer's
workload up and also tied up half the mixer. Studios were also
equipped with a Video Effects Desk whose output could be routed as a
source to the mixer. When more complex CSO was required, an
Electronics Effects Operator would be booked to come in and drive the
Effects Desk.

Simple Effects Desks did very little more than could be achieved by
the overlay switch in the mixer and worked in exactly the same way.
However, since the CSO effect was now presented to the vision mixer
complete, this allowed the workload to be shared between the two
operators, and it also made crossfades between the overlay and another
source very much easier to realise.

Larger studios, where it was anticipated that complex CSO productions
would be mounted, had a rather more complicated effects desk wherein
the overlay switches operated at RGB level, and the resulting overlay
composite was PAL coded in one or more effects coders. My
recollection is that the standard arrangement allowed for overlays to
be cascaded so that a maximum of four layers was possible. Needless
to say, one production managed to design an effect which required a
fifth level of overlay, which I gather was a nightmare to rig and
time.
Post by NY
One of the big giveaways of CSO (apart from mismatched lighting!) is
the sharp, sometimes slightly ragged boundary that you get.
The ragged edge is what you get if the key is noisy. The "cut out"
effect is because there is a very fast, hard switch between foreground
and background sources, which is not an effect the eye would observe
if presented in nature with the scene the overlay is seeking to
create. This can be overcome in linear keyers, where the key
processing is more sophisticated than a simple slice, and where the
overlay switch does what amounts to a fast crossfade between
foreground and background rather than a simple switch.
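The difference between the simple slice-and-switch and a linear keyer can be sketched numerically (the clip and ramp levels here are arbitrary illustrative values):

```python
# Hard overlay switch vs. linear keyer. A high key value means the
# backing colour is present, so the background source should show.

def hard_switch(key, fg, bg, clip=0.3):
    """All-or-nothing cut: the hard 'cut out' edge."""
    return bg if key > clip else fg

def linear_key(key, fg, bg, lo=0.2, hi=0.4):
    """Fast crossfade: mix foreground and background in proportion
    as the key passes through the ramp between lo and hi."""
    k = min(max((key - lo) / (hi - lo), 0.0), 1.0)
    return (1.0 - k) * fg + k * bg

assert linear_key(0.0, 10, 90) == 10   # fully foreground
assert linear_key(1.0, 10, 90) == 90   # fully background
# mid-ramp key values give a genuine mix rather than a hard edge
```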

The other problem with a lot of CSO, but one which is a production
rather than an engineering issue, is that far too often the resulting
image appears to have infinite depth of field.
Brian
2017-09-08 14:54:40 UTC
Permalink
On Fri, 08 Sep 2017 14:15:41 +0100, Brian
<***@2001.bjforster.force9.co.uk> wrote:

It's supposed to be bad form to follow up your own post, I know, but just for
completeness: some BBC mixers had a decoder in them which allowed, at
least in theory, a key to be derived from a coded source. The results
were truly horrible. Decoded CSO was to be avoided even more than the
plague.
Roderick Stewart
2017-09-08 14:32:41 UTC
Permalink
Post by NY
I think the retro-reflective background technique is used by quite a lot of
news studios etc: they surround the lens of each camera by a ring of lights
(LEDs, probably, nowadays) which are reflected straight back into the lens.
It probably works best if the camera is reasonably at 90 degrees to the
reflective screen, though I imagine that any light that is close to the
camera will have a fair amount reflected back on that axis.
I imagine that it works best with modern CCD cameras rather than older tube
cameras because CCDs can handle any overexposure better from "brighter than
normal white" reflections. I imagine that comet-tail and lagging of tube
cameras were the biggest problems with CSO: you really *don't* want those on
the keyed edge :-)
It shouldn't matter what angle a retro-reflective surface presents to
the camera and light source. That's the point. The entry and exit rays
are parallel to each other regardless of the angle of the reflector.
It's more difficult to visualise how this works in three dimensions
than two - for example a billiard ball bouncing into and out of a
corner - but it does. Look at a bicycle reflector under a magnifying
glass and you'll see that it consists of lots of little corners with
three surfaces mutually at 90 degrees, so I suppose the special paint
they use on road signs must be full of little reflective beads or
crystals with the same sort of arrangement of surfaces. You may also
have seen radar reflectors on boats, which are usually just three
sheets of aluminium intersecting at right angles and allowed to dangle
freely from the mast, so that a radio signal from any direction will
be reflected back in a parallel direction.
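This is easy to verify with a little vector arithmetic: reflecting a direction in three mutually perpendicular surfaces negates all three of its components, so the ray leaves anti-parallel to its arrival whatever the angle. A sketch:

```python
# A ray bounced off three mutually perpendicular mirrors (a corner
# reflector) returns anti-parallel, whatever its incoming direction.

def reflect(v, n):
    """Mirror direction vector v in a plane with unit normal n."""
    d = sum(vi * ni for vi, ni in zip(v, n))
    return tuple(vi - 2 * d * ni for vi, ni in zip(v, n))

def corner_reflect(v):
    # Normals of the three faces, mutually at 90 degrees.
    for n in [(1, 0, 0), (0, 1, 0), (0, 0, 1)]:
        v = reflect(v, n)
    return v

assert corner_reflect((0.3, -0.5, 0.8)) == (-0.3, 0.5, -0.8)
```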
[...]
Post by NY
Where did signal mixing occur - at the camera or at the mixing desk? In
other words, did they have access to the raw R, G and B at the mixing desk
where they'd want to do the keying? Or had the R, G, and B already been
thrown away, with just R-Y, B-Y and Y by this point?
In cameras of the vintage that I'm talking about, namely analogue PAL
studio cameras, the encoder would be a separate rack-mounted unit fed
with RGB from the camera, so the colour-difference signals would be
generated by matrixing at this point, the bandwidth limiting for
modulation onto the subcarrier being done here also, as part of the
encoding process. The encoded composite PAL video signals were then
fed to the mixer. RGB normally only got as far as the encoder, and in
the earliest installations was sometimes also available at technical
monitoring points for diagnostic purposes, though in later ones even
that was omitted.

Output signals from the camera control unit could be RGB, RGBY, or
Y,R-Y,B-Y, and if I remember correctly the BBC encoders could handle
any of these but it became standard to use RGB between camera and
encoder. If Y,R-Y,B-Y signals weren't available simultaneously with
RGB, it would be a simple matter to add them as a modification as it
would just be a matter of adding output buffer amplifiers for signals
already present in the circuitry. I can't remember exactly how they
did this with all cameras, but I think the EMI 2001 had just about
everything on the back panel, making it very versatile, and if full
bandwidth colour difference signals were needed for keying they would
be taken from here.

Rod.
Paul Ratcliffe
2017-09-08 15:30:26 UTC
Permalink
Post by NY
Ah, for PAL or NTSC, they already produced R-Y and B-Y (but not G-Y), so it
makes sense to use one of the two that *is* available - and R-Y might
produce too much false triggering from skin and reddish hair.
Exactly.
Post by NY
Where did signal mixing occur - at the camera or at the mixing desk?
In the PAL coders, usually separate units to the camera CCUs.
Post by NY
In other words, did they have access to the raw R, G and B at the mixing desk
where they'd want to do the keying?
No. We had overlay processors between the cameras and coders which could
derive the key.
Post by NY
Or had the R, G, and B already been thrown away, with just R-Y, B-Y and Y
by this point?
Yes, except for the key output of the overlay processor which was fed
to the mixer separately.
Post by NY
Mathematically, RGB can be still be derived if needed, but it's extra
circuitry. Is there any advantage in producing colour diff signals at
camera rather than further downstream?
Not really.
Post by NY
What about other sources like caption generators and TK: does the mixing
(and maybe even band-limiting) occur at source or at the mixing desk?
They had PAL coders and overlay processors at source in the same way as
the cameras.
SimonM
2017-09-08 21:20:52 UTC
Permalink
Post by Roderick Stewart
I've also heard of a film technique using a retro-reflective
background (like road signs) that will look very much brighter than
the subject if it is illuminated along the same line as the camera's
viewing direction. This enables the use of a fairly low level of
illumination that doesn't dilute the normal theatrical lighting of the
subject, so that the subject looks properly lit and the background
looks very bright, but only along the viewing line of the camera
because the background light doesn't go everywhere else. This may be
the technique Hitchcock used.
Ealing had a large one of those when the BBC owned
it, in the second film stage. I took a short cut
through the empty studio one summer day.

The screen was in shadow, on the far side by the
door I was walking towards. I was in bright
sunshine coming in through the open scenery door:
the effect was really weird: as I walked, the
appearance was that I was always perpendicular to
the screen - it looked like a mirror turning to
follow me.

The material was just a beaded screen, as you'd
find for slide projectors, just on a very large scale.

IIRC, HTV also had one at Bath Road for use in
dramas. They had it arranged so that it could be
used with a very long throw on the projector.

BBC R+D produced something using a very similar
approach in the early 2000s, combining a beaded
screen with CSO blue, in a ring light round the
camera lens (powerful blue LEDs). It avoids the
need for blue drapes and banging lots of light on
said blue drapes, and as with sodium, the blue
colour disappears on the foreground, as the studio
lights swamp it. They had a patent, and a
commercial partner, so I assume it's now commonplace.
SimonM
2017-09-08 21:07:43 UTC
Permalink
Post by NY
Stupid question: why is it that until recently, TV
has usually used blue for keying and film has
usually used green, with green only beginning to
be used in TV? I'd have thought that in either
case you want to use a colour that is least likely
to occur in the scene that is having something
keyed into it - which is why I think blue was used
for TV. I believe that some films (eg some of the
Hitchcock films) used an intense yellow light
produced by special discharge lights shining on a
plain white background rather than white light
shining on a coloured background.
The issues are rather different.

First off, proper film hasn't, AFAIK used green.

As you said, the first travelling matte system of
that type (IIRC, for Disney's Mary Poppins) used
sodium lamps. This is because of the almost
monochromatic light, allowing the filtered output
to be very crisp AND the fact that yellow spill on
the foreground doesn't show very much (don't
forget that like CSO/Chromakey, the matte is
automatically generated whilst the foreground
action is being filmed, although obviously the two
images are combined in an aerial image rostrum).

I'd argue that, until really quite recently in the
digital domain, the sodium light system with 35mm
or (better) 65mm film gave better matting -
cleaner with less fringing etc. - than chromakey
systems. That's because the near-monochromatic
sodium light gave it an 'unfair advantage'.

IIRC, Disney got a technical academy award for the
system, and patented it, so there ought to be a
lot of information on the web about it.
Angus Robertson - Magenta Systems Ltd
2017-09-08 07:51:00 UTC
Permalink
Post by NY
When you say that poly prism was variable speed, I presume that
you mean the film and prism could run at variable speed against a
fixed-speed 25 fps raster. Why isn't that equally possible with
any TK technology?
There is a lot of information about early BBC telecine and the
polygonal prism system at:

http://tech-ops.co.uk/next/2015/02/bbc-telecine-in-the-1960s/

Angus
NY
2017-09-08 10:23:39 UTC
Permalink
Post by Angus Robertson - Magenta Systems Ltd
Post by NY
When you say that poly prism was variable speed, I presume that
you mean the film and prism could run at variable speed against a
fixed-speed 25 fps raster. Why isn't that equally possible with
any TK technology?
There is a lot of information about early BBC telecine and the
http://tech-ops.co.uk/next/2015/02/bbc-telecine-in-the-1960s/
Very interesting article. It's fascinating to see how various mechanical
solutions (eg twin lens with shutter) had to be used to compensate for the
fact that the film was moving progressively whereas they wanted an
interlaced signal. Life would have been so much simpler if they had had
framestores so they could scan progressively but read out interlaced :-)

I hadn't realised that the comopt sound track was considered very poor (eg
quality, noise and frequency response). You can certainly hear a lot of
rustling and crackling on the soundtrack of old cinema films (1930s-1950s)
that are broadcast on TV - presumably due to dirt and scratches on the area
close to the perforations.

It's a shame that the article doesn't really explain how they corrected for
smear due to persistence of the CRT phosphor. I'm intrigued by how you could
improve the signal once the smear has already occurred - the only solution
would seem to be a shorter phosphor. What is odd about the example of the
smeared "10" is that it just looks out of focus, rather than having vertical
and horizontal smearing that decreases in intensity as the beam moved
further right of the highlight, giving asymmetrical blurring.

Interesting that he talked about film producing sharper pictures than TV
cameras, whereas my perception with anything I saw from about 1970 onwards
(I'm too young to remember before that!) is that TV camera pictures usually
looked much sharper - maybe even *too* sharp if any edge-enhancement was
used - even on black-and-white. I could distinguish at a glance between the
sharper studio segments and the slightly more blurred, flickery film inserts
on Blue Peter etc. Of course nowadays when you see archive material of
studio output (the dreaded Blue Peter and the incontinent elephant), a lot
is recorded on film by film-recording of the TV screen, which means you are
not seeing what cameras originally broadcast.

Things have moved on a bit since the days of having to spend an hour
aligning a machine before use: nowadays you'd expect any machine to stay
"good" more or less forever after initial setup. I wonder how much they had
to correct for different types of film or slightly different exposures,
especially if they had to intercut film from different sources with
different characteristics. I know how tedious it is getting good scans of 35
mm *still* negatives from different manufacturers.
Paul Ratcliffe
2017-09-08 11:05:41 UTC
Permalink
Post by NY
Of course nowadays when you see archive material of
studio output (the dreaded Blue Peter and the incontinent elephant), a lot
is recorded on film by film-recording of the TV screen, which means you are
not seeing what cameras originally broadcast.
And even when there is a decent copy available, they deliberately
degrade it to make it look "old". I hate that.
Brian
2017-09-07 12:45:43 UTC
Permalink
On Wed, 6 Sep 2017 22:58:50 +0100, SimonM
Post by SimonM
Post by NY
When the UK bought filmed US programmes (eg
Quincy, Kojak, Starsky and Hutch), did they get
the original 24 fps film and then play it at 25 fps.
Generally, yes.
Most US stuff was shot using 35mm stock, and
treated basically as 60min (55min) cinema productions.
There was a tremendous fuss made when the US series-of-the-moment -
memory suggests it was "Dallas", but it's a long time ago - switched
over from origination on 35mm to origination as video.

Before the change, the BBC had received a 35mm print of each episode,
which was then TK transferred for transmission from VT. After the
change, we got a PAL videotape which had been standards-converted in
the US, on God-knows-what sort of converter. Purchased Programmes had
been advised that this arrangement was likely to end in tears, but
went ahead and did it anyway. The drop in quality was such that it
was readily noticeable at home, and attracted adverse comment in the
press.

The Americans were persuaded to supply the following series on 5N
videotape, which was converted at the Centre using the ACE. The
results were still not quite as good as 35mm, but were a lot better
than previously and at least the difference was no longer apparent on
domestic kit.

The sequel to this is that, not long afterwards, one of the American
news networks signed up with a UK satellite broadcasters to carry
their stuff over here "as live". Clearly, for this purpose, a PAL
feed of their station out was required. Their Chief Engineer had
obviously been following the Dallas saga and noted the improvement in
quality. The result was that the Americans ordered an ACE from GEC,
who had the commercial licence for it, and shipped it to the States,
where I believe it worked for many years.


Remove 2001. to reply by email. I apologise for the inconvenience.
Laurence Taylor
2017-09-07 20:25:19 UTC
Permalink
Post by Brian
There was a tremendous fuss made when the US series-of-the-moment -
memory suggests it was "Dallas", but it's a long time ago - switched
over from origination on 35mm to origination as video.
Before the change, the BBC had received a 35mm print of each episode,
which was then TK transferred for transmission from VT. After the
change, we got a PAL videotape which had been standards-converted in
the US, on God-knows-what sort of converter. Purchased Programmes had
been advised that this arrangement was likely to end in tears, but
went ahead and did it anyway. The drop in quality was such that it
was readily noticeable at home, and attracted adverse comment in the
press.
The Americans took a long time to get standards conversion right,
presumably because most of their material was home-grown they had little
demand for it. I'm surprised that some awful material still turns up,
either fuzzy VHS quality or with a nasty duplicated frame every second.

As someone said earlier, CNN were particularly bad, especially when stuff
had been through several conversions.
--
rgds
LAurence
<><
...
Theft Proof Tagline <hehe>
Paul Ratcliffe
2017-09-08 11:02:17 UTC
Permalink
Post by Brian
There was a tremendous fuss made when the US series-of-the-moment -
memory suggests it was "Dallas", but it's a long time ago - switched
over from origination on 35mm to origination as video.
It was indeed Dallas.
Post by Brian
Purchased Programmes had been advised that this arrangement was likely
to end in tears, but went ahead and did it anyway.
Why does this not surprise me?
Post by Brian
The results were still not quite as good as 35mm, but were a lot better
than previously and at least the difference was no longer apparent on
domestic kit.
I could still tell. And that was way before I worked in the industry.


BTW, did you end up in recruitment, or am I thinking of someone else?
Brian
2017-09-08 12:28:24 UTC
Permalink
On Fri, 08 Sep 2017 11:02:17 GMT, Paul Ratcliffe
<***@orac12.clara34.co56.uk78> wrote:


<snip>
Post by Paul Ratcliffe
BTW, did you end up in recruitment, or am I thinking of someone else?
I had a short and entirely unsuccessful spell at WN, but apart from
that served my sentence in and around Studio Engineering at Television
Centre.

I was Assistant (Training) to EiC Television Studios for a bit, and
shared an office with the two Assistants (Training) Tel Rec, so some
of their knowledge, which was encyclopaedic, must have rubbed off on
me.
John Williamson
2017-09-06 22:31:27 UTC
Permalink
Post by NY
Post by Laurence Taylor
I believe it's shot at 24 (Of course, most these days will be tape/solid
state); shooting at 30, except for special occasionas, would be too expensive.
Ah, right. I'd assumed that all film-for-TV was shot at the appropriate
frame rate for the destination system (ie 25 for "PAL" or "SECAM"
systems or 30 for NTSC), to avoid any motion artefacts. I hadn't
realised that US film-for-TV was at 24 and then converted to 30 by 3:2
pulldown.
American stuff is all filmed at 24fps, and played back at 25fps for UK
and European broadcast distribution. They get very few complaints about
the pitch shift, from what I gather, and it also works the other way, as
European programmes made for TV are filmed at 25 fps, and played back at
24fps for the American market.
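The arithmetic behind this practice is simple: everything runs 25/24ths fast, so a programme shortens by 4% and uncorrected audio rises in pitch by about 0.7 of a semitone - small enough that few viewers notice. A sketch:

```python
import math

FILM_FPS, PAL_FPS = 24, 25
speedup = PAL_FPS / FILM_FPS          # ~1.0417, i.e. about 4% fast

# A nominal 50-minute film episode as transmitted at 25 fps:
runtime_min = 50 * FILM_FPS / PAL_FPS
assert runtime_min == 48.0            # two minutes shorter

# Pitch shift in semitones if the sound is left uncorrected:
semitones = 12 * math.log2(speedup)
assert 0.70 < semitones < 0.72        # well under a semitone
```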
Post by NY
When the UK bought filmed US programmes (eg Quincy, Kojak, Starsky and
Hutch), did they get the original 24 fps film and then play it at 25 fps.
Yes.
--
Tciao for Now!

John.
s***@googlemail.com
2017-09-07 19:46:45 UTC
Permalink
The 'zig-zag' cement splicer was a purely amateur thing, made by Eumig; I remember seeing them in the 1970s, for both types of 8 mm film. There is a tape splicer which cuts the film in a similar way, but then uses a narrow splicing tape, about 10 mm wide over the splice. This is used for IMAX film.

Conventional cement splicers, where the films overlap slightly, are used for negative cutting. Prints are spliced in cinemas with a tape splicer. For about twenty years prints have been made on polyester stock, which cannot be cement spliced, though there is an ultrasonic splicer which 'welds' the film ends together in a similar way. These were used mainly in film laboratories.

As somebody else has said, with 35 mm film there is normally a much wider frame line than on smaller gauges, so an invisible cement splice can be made. Remember that the projector aperture is always slightly smaller than the camera aperture to give an overlap, so there's actually slightly more area available to make the splice than you might think from just looking at a piece of 35 mm film. 'Cinemascope' type prints have a narrower frameline, which did sometimes cause a negative splice to be visible at the top or bottom of the screen, particularly if the projector framing was slightly out. The 'scope aspect ratio has been changed several times, most recently from 2.35:1 to 2.39:1, sometime in the '70s I think, in an attempt to avoid this problem.

With 16 mm film there isn't the space available to make an invisible cement splice. Sometimes this was just accepted, but for good quality 16 mm work there was a solution. The negative was cut in two rolls, the 'A' and 'B' rolls. The first shot would be assembled in the A roll and the second in the B roll, alternating between the two rolls. In the roll where each shot wasn't, if you see what I mean, there was an equal length of black spacing. A special splicer was used which allowed the overlap to be made either way, and the ends overlapped in the opposite direction when running from picture to black than they did when running from black to picture. The black spacing was always cut exactly on the frameline, and it was always the overlap area on the picture stock from which the emulsion was removed to make the splice. This meant that the splice area was always covered by the black spacing, and so couldn't be seen. When making a print or intermediate from a cut negative the 'A' roll is printed first, then the raw stock is rewound, and then the 'B' roll printed, so no splices can be seen in the print.
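The checkerboard assembly described above can be expressed as a simple rule - odd shots to the A roll, even shots to the B roll, with matching black spacing opposite each one (shot names and lengths here are invented for illustration):

```python
# Checkerboard a shot list into A and B rolls. Each roll alternates
# picture with black spacing of matching length, so every splice
# overlap lies under the other roll's black and never prints through.

def ab_rolls(shots):
    """shots: list of (name, length_in_frames) pairs."""
    roll_a, roll_b = [], []
    for i, (name, length) in enumerate(shots):
        if i % 2 == 0:
            roll_a.append((name, length))
            roll_b.append(("black", length))
        else:
            roll_a.append(("black", length))
            roll_b.append((name, length))
    return roll_a, roll_b

a, b = ab_rolls([("shot1", 120), ("shot2", 80), ("shot3", 200)])
assert a == [("shot1", 120), ("black", 80), ("shot3", 200)]
assert b == [("black", 120), ("shot2", 80), ("black", 200)]
```

Each roll is then printed in turn onto the same raw stock, as described, so the print never contains a visible join.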

A few films are still shot on 16 mm, 'Carol' was one fairly recent one, but these days the camera negative is scanned, edited digitally, and then a digital intermediate produced from which a digital cinema package master, or a film negative for printing, can be made.
NY
2017-09-07 21:06:50 UTC
Permalink
Post by s***@googlemail.com
With 16 mm film there isn't the space available to make an invisible
cement splice. Sometimes this was just accepted,
As I suspect was done for the stills from the low-budget TV series that I
posted.
Post by s***@googlemail.com
but for good quality 16 mm work there was a solution. The negative was
cut in two rolls, the 'A' and 'B' rolls. The first shot would be assembled
in the A roll and the second in the B roll, alternating between the two
rolls. In the roll where each shot wasn't, if you see what I mean, there
was an equal length of black spacing. A special splicer was used which
allowed the overlap to be made either way, and the ends overlapped in the
opposite direction when running from picture to black than they did when
running from black to picture. The black spacing was always cut exactly
on the frameline, and it was always the overlap area on the picture stock
from which the emulsion was removed to make the splice. This meant that
the splice area was always covered by the black spacing, and so couldn't
be seen. When making a print or intermediate from a cut negative the 'A'
roll is printed first, then the raw stock is rewound, and then the 'B'
roll printed, so no splices can be seen in the print.
Ah, so the crucial thing is to make sure that all the join is in the black
leader, with no overlap impinging on the picture.
J. P. Gilliver (John)
2017-09-11 00:37:46 UTC
Permalink
In message <uc_rB.1144419$***@fx43.am4>, SimonM
<***@large.in.the.world> writes:
[]
My understanding is that film was the universal medium because it was
so easy to use it in all countries. BBC Enterprises certainly used to
sell programmes abroad on film. Dave will remember many "happy" hours
[]
Yes. Sometimes "lost" episodes of something considered important are
rediscovered when someone finds the film in the cupboards of some TPLAC
TV organisation. (Sometimes it is monochrome film, as the destination
country didn't have colour; sometimes recently they've found that the
colour subcarrier suppression - if it was material originated
electronically but was transferred to film for export - in the to-film
equipment wasn't that great, and they were able to recover the colour by
getting the subcarrier from the film; previous to that, especially for
things like some Dr. Who episodes, they had some success by getting the
colour from somebody's private Betamax (or whatever) tape that was too
poor for broadcast use, and combining it with the quality images from
the export monochrome film.)
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"It ain't those parts of the Bible that I can't understand that bother me, it's
the part that I do understand." - Mark Twain
J. P. Gilliver (John)
2017-09-11 00:41:23 UTC
Permalink
In message <***@brightview.co.uk>, NY
<***@privacy.net> writes:
[]
Post by NY
I presume all film nowadays is telecined using an n x 1 pixel sensor,
which is clocked at exactly the same speed as the film is running (as
for prism, not needing intermittent motion).
I've always made the same assumption, other than wondering if there were
ever telecine machines with two n × 1 sensors half a frame apart for
interlacing purposes. (I've always assumed probably not - do it
electronically, using a framestore etc. - because of the difficulty of
the precision required, but wondered if it might have happened briefly
when framestores were still expensive.)
Post by NY
At what stage did broadcasters generally change over from telecineing a
film live to telecineing onto VT and playing the VT
Interesting question.

(I would guess they went to VT as soon as quality VT became good enough
and cheap enough - because once it's on VT, you don't need the telecine
machine - or the original film, for that matter - if you want to
broadcast it again. So as soon as VT became both cheaper than telecine,
and didn't wear out after a few playings, they'd have gone to it.)
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"It ain't those parts of the Bible that I can't understand that bother me, it's
the part that I do understand." - Mark Twain
Roderick Stewart
2017-09-11 09:56:20 UTC
Permalink
On Mon, 11 Sep 2017 01:41:23 +0100, "J. P. Gilliver (John)"
Post by J. P. Gilliver (John)
Post by NY
At what stage did broadcasters generally change over from telecineing a
film live to telecineing onto VT and playing the VT
Interesting question.
(I would guess they went to VT as soon as quality VT became good enough
and cheap enough - because once it's on VT, you don't need the telecine
machine - or the original film, for that matter - if you want to
broadcast it again. So as soon as VT became both cheaper than telecine,
and didn't wear out after a few playings, they'd have gone to it.)
I was told it was pretty much standard by the time I started working
at TC in 1968. Videotape was more reliable - it didn't snap as film
sometimes did - and the colour corrections only had to be done once,
during the transfer from film to tape.

Most cinema films of that era contained colour variations from shot to
shot that wouldn't have been noticeable in a cinema, but were
seriously offputting when shown on a CRT screen in a living room. The
electronic correction itself was done using a system called TARIF
(Television Apparatus for the Restoration of Indifferent Film), and it
could either be done on the fly with the shot changes triggered by
little pieces of foil added to the edge of the film, or it could be
done by means of electronic editing.

Rod.
Laurence Taylor
2017-09-11 14:50:44 UTC
Permalink
Post by Roderick Stewart
I was told it was pretty much standard by the time I started working
at TC in 1968. Videotape was more reliable - it didn't snap as film
sometimes did - and the colour corrections only had to be done once,
during the transfer from film to tape.
Running TK to air was still done in the mid 80s. I remember watching an
episode of "Star Trek" where the film snapped and the loose end was visible
for several seconds before someone ran to the desk and put up an ident. I
think I've still got it on tape somewhere!
--
rgds
LAurence
<><
...
Caution! The tagline you steal may be your own!
NY
2017-09-11 17:47:48 UTC
Permalink
Post by Laurence Taylor
Post by Roderick Stewart
I was told it was pretty much standard by the time I started working
at TC in 1968. Videotape was more reliable - it didn't snap as film
sometimes did - and the colour corrections only had to be done once,
during the transfer from film to tape.
Running TK to air was still done in the mid 80s. I remember watching an
episode of "Star Trek" where the film snapped and the loose end was visible
for several seconds before someone ran to the desk and put up an ident. I
think I've still got it on tape somewhere!
Did news studios normally run TK straight to air, or did they always
transfer to VT and then play that by the end of the time that they were
using film?

I remember a drama series (1980s?) about a British TV newsroom. In one
episode they gave a cheap Super 8 camera to an undercover journalist behind
the Iron Curtain because it was smaller and easier to use without the
authorities seeing it than domestic video cameras were in those days. Having
smuggled out the film cartridges and hastily developed them, they then had
to fire up the Super 8 capable TK which hadn't been used for years and keep
reloading the film every time it broke - as it was going out live to air.
Real seat of the pants stuff.
Brian
2017-09-11 19:47:02 UTC
Permalink
On Mon, 11 Sep 2017 18:47:48 +0100, "NY" <***@privacy.net> wrote:

<snip>
Post by NY
Did news studios normally run TK straight to air, or did they always
transfer to VT and then play that by the end of the time that they were
using film?
I only went into News TK once, and that was when I was being shown
round on the circus in 1979. My recollection is that they were
wet-gate TKs and the stock was usually scanned in the negative - saves
processing time.
Post by NY
I remember a drama series (1980s?) about a British TV newsroom. In one
episode they gave a cheap Super 8 camera to an undercover journalist behind
the Iron Curtain because it was smaller and easier to use without the
authorities seeing it than domestic video cameras were in those days. Having
smuggled out the film cartridges and hastily developed them, they then had
to fire up the Super 8 capable TK which hadn't been used for years and keep
reloading the film every time it broke - as it was going out live to air.
Real seat of the pants stuff.
If that was "Drop the Dead Donkey", I would be inclined to regard most
of the technical detail as having been taken great liberties with for
comedic effect.


Remove 2001. to reply by email. I apologise for the inconvenience.
NY
2017-09-11 20:37:25 UTC
Permalink
Post by Brian
<snip>
Post by NY
Did news studios normally run TK straight to air, or did they always
transfer to VT and then play that by the end of the time that they were
using film?
I only went into News TK once, and that was when I was being shown
round on the circus in 1979. My recollection is that they were
wet-gate TKs and the stock was usually scanned in the negative - saves
processing time.
I'd always thought that news crews used transparency film (eg Ektachrome)
rather than negative film, but thinking about it, negative would probably be
a faster process (no bleach/reversal or second light exposure and
development) so neg film sounds a lot more plausible.
Post by Brian
Post by NY
I remember a drama series (1980s?) about a British TV newsroom. In one
episode they gave a cheap Super 8 camera to an undercover journalist behind
the Iron Curtain because it was smaller and easier to use without the
authorities seeing it than domestic video cameras were in those days. Having
smuggled out the film cartridges and hastily developed them, they then had
to fire up the Super 8 capable TK which hadn't been used for years and keep
reloading the film every time it broke - as it was going out live to air.
Real seat of the pants stuff.
If that was "Drop the Dead Donkey", I would be inclined to regard most
of the technical detail as having been taken great liberties with for
comedic effect.
I don't think it was DTDD. I think it was a serious drama rather a sitcom.
Can't think what it was called now. I remember another episode featured a
scoop about the MOD, and the editor took a call about 10 seconds before the
film was about to be transmitted and learned that the MOD had issued a
D-notice so he pulled it, leaving everyone struggling to fill the dead air,
and he cursed that he had answered his phone so quickly rather than letting
the film run for a while before answering.
Angus Robertson - Magenta Systems Ltd
2017-09-12 07:40:00 UTC
Permalink
Post by NY
Did news studios normally run TK straight to air, or did they
always transfer to VT and then play that by the end of the time
that they were using film?
When I worked in BBC TV News in the seventies, there were only five
VTRs but nine 16mm photo conductive telecines (not twin lens). All
local film was played live, having been developed in the lab on the
floor below. Remote contributions would have been sent down the line
and come from VT.

Don't recall if they transmitted negative or positive, but it had to be
viewed by a producer and edited in minutes and I'm not sure how easy
that is in negative. I only ever saw the telecine output.

At the time, main block telecine had nine 35mm and seven 16mm twin lens,
and two 16mm polygons, but there were 35 2in VTRs in the basement.
There were also two 35mm polygons at Lime Grove.

Angus
Roderick Stewart
2017-09-12 08:44:46 UTC
Permalink
Post by Angus Robertson - Magenta Systems Ltd
When I worked in BBC TV News in the seventies, there were only five
VTRs but nine 16mm photo conductive telecines (not twin lens). All
local film was played live, having been developed in the lab on the
floor below. Remote contributions would have been sent down the line
and come from VT.
News film of that era was usually reversal, i.e. positive image so
easy to edit on a Steenbeck if that's the way they decided to do it.

Rod.
SimonM
2017-09-13 20:34:46 UTC
Permalink
Post by Roderick Stewart
News film of that era was usually reversal, i.e. positive image so
easy to edit on a Steenbeck if that's the way they decided to do it.
Always reversal, AFAIK. The lab took about 45 mins
to actually run the film through (it had to be taken
out of the mag, etc. too), or almost two hours if
you wanted it properly dry!

Our lab guy was an expert chemist (and an L.R.P.S.
photographer too IIRC), and had densitometers and
so on (continuous replenishment process, so had to
be carefully managed). The image quality was
limited by the speed the lab ran and often the
fact that the film had to be pushed too. But he
was kind enough occasionally to develop my own
35mm Ektachrome (stills, obviously!), and the
results were the best I ever had, until I started
processing myself (even then I struggled to come
close to what he managed). It was far better than
the local commercial labs we had, even those doing
professional work.

I don't remember TK running negatives ever
normally, as any show using a negative process had
answer prints to use (the last of which became the
show print, after usually only one or two
iterations). Even then the show print usually
needed careful TARIFing, overseen by the editor
and occasionally the director/producer too.

I have a feeling that Nat Hist (in Bristol; the
last users of film in the BBC) did do negative
transfers in the late 1980s. Their "new" TK suite
had a Pogle and was capable of amazing quality for
the time. That TK looked like a bit of a Star Wars
set...
J. P. Gilliver (John)
2017-09-13 22:46:43 UTC
Permalink
In message <HDguB.940208$***@fx24.am4>, SimonM
<***@large.in.the.world> writes:
[]
I have a feeling that Nat Hist (in Bristol; the last users of film in
the BBC) did do negative transfers in the late 1980s. Their "new" TK
suite had a Pogle and was capable of amazing quality for the time. That
TK looked like a bit of a Star Wars set...
Why were they the last users of film - something to do with very
high-speed and very low speed (time-lapse)?
--
J. P. Gilliver. UMRA: 1960/<1985 MB++G()AL-IS-Ch++(p)***@T+H+Sh0!:`)DNAf

"If you have ten thousand regulations you destroy all respect for the
law." - Winston Churchill.
SimonM
2017-09-14 05:56:29 UTC
Permalink
Post by J. P. Gilliver (John)
[]
Post by SimonM
I have a feeling that Nat Hist (in Bristol; the
last users of film in the BBC) did do negative
transfers in the late 1980s. Their "new" TK
suite had a Pogle and was capable of amazing
quality for the time. That TK looked like a bit
of a Star Wars set...
Why were they the last users of film - something
to do with very high-speed and very low speed
(time-lapse)?
Nothing to do with that per se: video cameras
didn't like rainforests, the Arctic, etc.

Most of that was down to using "rust"* as a
storage medium. I won't bore you with the detail.
Solid state recording has probably eliminated many
of those issues now, but in the late 1980s the
film industry manufacturers had field-proven
workarounds for most of the problems, for example
built-in heating elements for film magazines,
batteries and, IIRC, inside the cameras themselves
(Panaflex - 35mm). That could deal with the
Arctic, and could in some circumstances prevent
condensation in really soggy places (when you
couldn't completely waterproof the cameras for
other reasons).

Bristol also had truly excellent mechanical
engineers, and its own film maintenance
department, who not only looked after cameras and
editing equipment but made a lot of specialist
kit, such as unusual lens adaptors.

So, for most filming environments, someone else
had (usually literally) been there before, and
there was experience to draw on. Film cameras
could also be made extremely compact and still
deliver 'broadcast quality'. Bolexes and
ex-military gun cameras were used a lot (don't
forget a lot of NatHist was, and still is, shot
mute). IIRC, Simon King's first few films were
shot on a Bolex H16, when he was still a teenager
(his dad was a senior producer in Bristol and,
IIRC, Plymouth).

In the days when film was the standard medium for
news, we also had several "mute" cameramen for
Points West, who operated like stringers.

Regarding early uses of video for NatHist, Dave L.
(who worked on at least one series of Reef Watch
in the Red Sea) and Paul are better people than me
to comment. The Bristol engineering team did some
amazing things though, including a
remotely-controlled submersible, with one of the
very earliest (and hugely expensive) fibre-optic
video links. I'm not sure how much of that was
digital, if anything (you'd assume the fibre link
was, but I'm honestly not sure).

I do remember it had a beautifully made stainless
tube chassis and looked a bit like a small bright
yellow torpedo with an "eye" at one end. The team
took it for depth/pressure testing, I assume to
Lympstone or somewhere similar. In all the
concentration on the high tech's reliability, they
forgot to drill holes in the tubing... which was
evident when it came back. The casing and
electronics worked very well, however!



*yes, I know - metallised tape etc. It didn't
change the basic issues much.
Paul Ratcliffe
2017-09-14 21:27:22 UTC
Permalink
Post by SimonM
Regarding early uses of video for NatHist, Dave L.
(who worked on at least one series of Reef Watch
in the Red Sea) and Paul are better people than me
to comment.
Didn't have much to do with 'em in those days. The Really Wild Show
was studio based then, and I did only one day in VT on Supersense,
with whichever Downer it was, because AJM was unavailable and I'd
shown interest.
Post by SimonM
The Bristol engineering team did some
amazing things though, including a
remotely-controlled submersible, with one of the
very earliest (and hugely expensive) fibre-optic
video links. I'm not sure how much of that was
digital, if anything (you'd assume the fibre link
was, but I'm honestly not sure).
None of it IIRC. I used it twice. Once on Lundy for The RWS, and
then out on some wreck in the English Channel for South Today.
God knows how we got anything usable on the latter - it was
decidedly choppy 3 miles out of Littlehampton and once the boat
stopped I'd had it.
The kit was flying everywhere and my post-digested lunch ended up
over the side. I'm sure I thought about following it.
Post by SimonM
I do remember it had a beautifully made stainless
tube chassis and looked a bit like a small bright
yellow torpedo with an "eye" at one end.
Yes, I probably have a picture of it on Lundy somewhere. That
was a much more enjoyable OB.
SimonM
2017-09-13 20:18:27 UTC
Permalink
Post by NY
Did news studios normally run TK straight to air,
or did they always transfer to VT and then play
that by the end of the time that they were using
film?
Usually, yes.

Bristol had its own film processing lab on the
premises. News usually shot 160ASA
tungsten-balanced Ektachrome (occasionally pushed
up to two stops).

Cameras were usually Arriflex, but one freelance
had an Aaton towards the end of film. The Arris
(can't remember the exact model now) had a Wratten
80A available that could be quickly flipped in for
shooting out of doors - they lost 1 stop, IIRC.
Doing it that way round was far better than using
an 82B (blue) indoors. The balance was never right
under fluorescents!

The day started (probably still does) with an
editorial "offers meeting" at 0930h. Promising
stories for the day were discussed, and any
predicted special technical requirements brought
up. The day's editor would allocate film crews
(even earlier if long distances were involved).

Film shooting was done in the morning, processing
was done over lunch (as early as the film could be
delivered to the lab - dispatch riders were used),
and editing in the afternoon. Really late items
would even be cut with the film still wet. There
was sometimes an extra process involved, in that
there could be a need for a commag-sepmag transfer,
so interviews could be edited (see below).

The issue was usually that our regional news
studio had only a limited number of play-in
sources available: only one TK and a 2-machine
edit pair of 1" VTs (when I was first involved).
When Hi-Band came in, that was added to the
available sources. I'm fairly certain that the TK
operator also ran the Hi-Band machine.

In a news program, the items can be extremely
short - as little as 15 seconds. Both TK and VT
needed time to run up to speed. So you couldn't
have two stories featuring either (the same) VT or
film running back-to-back, as the machines
wouldn't have anything like enough time to cue up.

The art, for the director, was to choreograph use
of the two VTs and TK, so that she always had the
next machine cued up and ready. So we spent a lot
of time during the afternoon copying material from
TK to VT for play-in, and/or between VTs A and B,
so that no machine was caught out.

VTs could spool, TK couldn't - you were pretty
much limited to single speed forward and backwards
(without unlacing and rewinding). I remember one
very tightly-scripted show that went horribly
wrong. TK missed a cue early on (not Rex's
fault!), so the item was dropped and another story
substituted on-the-fly. But the dropped item
still had to be played through. So TK missed the
second cue, and then the next -- the whole lot in
fact, to the end of the show.

It was the producer's fault and he was somewhat
chastened. It's one reason why you'll always have
at least one standby item easily accessible.

Aside: news used special film leaders that counted
down in seconds. Those are the ones with a
vertical-ish white line going left-to-right across
the screen between the numbers. Network used
Academy leaders that counted in 35mm feet (16
frames or around 0.7sec). Sync "plop" on 3 for
news, 4 for network (black leader thereafter to
first frame).

To avoid complexity, news always ran off ten second
counts, so TK would cue up on -10 (ish). Film played
live into network ran on a different cue, but I can't
remember how long away (in seconds) the Academy sync
cross was from the start of picture.

Another difference was that most editors would lay
picture back 8-10 frames before the official start
of picture (and sound, too), so that any
mis-timing in the studio could be allowed for -
the director could cut-up the TK early if
necessary, and I had something to fade in under
the end of the studio intro. It wasn't always
possible though.

A note on commag for news: the sound-picture
offset, IIRC, was 40 frames (one linear foot of
16mm). Since it was almost always used for
interviews, there had to be a transfer to sepmag
to allow editing.
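The offset quoted above works out as follows (a sketch only, assuming the 25 fps UK television rate):

```python
# Commag sound-picture offset: 40 frames, i.e. one linear
# foot of 16 mm film between the gate and the sound head.
FRAMES_PER_FOOT_16MM = 40
fps = 25  # UK television frame rate

offset_seconds = FRAMES_PER_FOOT_16MM / fps
print(offset_seconds)  # 1.6 - seconds of sound lead on the print
```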

For stories without interviews, you just had to
put up with non-sync cuts. I spent a lot of time
playing in live sound effects (off disk, tape or
even cassettes) to cover that sort of issue - you
could swamp quieter backgrounds so the
discontinuity wasn't obvious.
NY
2017-09-13 22:01:50 UTC
Permalink
The balance was never right under fluorescents!
I've always wondered why transparency film (eg Kodachrome, Ektachrome, Agfa)
never manages to reproduce fluorescent tubes correctly, even with a
so-called warm-white fluorescent filter, whereas modern digital cameras can
auto-white-balance to any sort of weird non-black-body lighting (warm white
/ daylight fluorescent, LED) without any colour cast and just a slight
darkening of some shades of red. But that's probably off-topic.
VTs could spool, TK couldn't - you were pretty much limited to single
speed forward and backwards (without unlacing and rewinding).
I didn't know that. Were the film guides or motors etc not suited to film
going through at faster than real-time? I presume un/relacing takes longer
than running at normal speed for a typical news report.
I remember one very tightly-scripted show that went horribly wrong. TK
missed a cue early on (not Rex's fault!), so the item was dropped and
another story substituted on-the-fly . But the dropped item still had to
be played through. So TK missed the second cue, and then the next -- the
whole lot in fact, to the end of the show.
Ooops! :-)
A note on commag for news: the sound-picture offset, IIRC, was 40 frames
(one linear foot of 16mm). Since it was almost always used for interviews,
there had to be a transfer to sepmag to allow editing.
For stories without interviews, you just had to put up with non-sync cuts.
I spent a lot of time playing in live sound effects (off disk, tape or
even cassettes) to cover that sort of issue - you could swamp quieter
backgrounds so the discontinuity wasn't obvious.
Crafty.


Editors and directors of today wouldn't know that they are born, with
synchronous sound right from the moment of filming, and sources that can
start instantly without any advance cue. I admire directors who can predict
when the -10 seconds point in a newsreader's delivery is going to be, to get
a perfectly-timed cut. I know they work on about 3 words per second, but I'm
sure different people speak at different speeds and the same person may not
be absolutely consistent from day to day.
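The "about 3 words per second" rule of thumb is easy to play with — a sketch, with a hypothetical link script; as noted, real delivery varies by presenter and by day:

```python
def estimated_read_time(script, words_per_second=3.0):
    """Estimate how long a newsreader takes to deliver a link,
    using the ~3 words/second rule of thumb mentioned above."""
    return len(script.split()) / words_per_second

link = ("Police in Bristol say the fire at the docks "
        "is now under control")
print(round(estimated_read_time(link), 1))  # ~4.3 seconds of read time
```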
Paul Ratcliffe
2017-09-14 21:01:33 UTC
Permalink
Post by NY
I didn't know that. Were the film guides or motors etc not suited to film
going through at faster than real-time? I presume un/relacing takes longer
than running at normal speed for a typical news report.
In our case, I think it was just an ancient machine. The newer one in the
suite next door could spool, but it was reserved for 'proper' programmes
not lowly news.
Post by NY
Editors and directors of today wouldn't know that they are born, with
synchronous sound right from the moment of filming, and sources that can
start instantly without any advance cue.
And running orders that can be changed to suit their whims.
Post by NY
I admire directors who can predict
when the -10 seconds point in a newsreader's delivery is going to be, to get
a perfectly-timed cut. I know they work on about 3 words per second, but I'm
sure different people speak at different speeds and the same person may not
be absolutely consistent from day to day.
They had a person counting down in their ears so they could speed up or
slow down to some degree to suit.
Of course you soon learn who the capable presenters are and adjust to
suit.
Paul Ratcliffe
2017-09-14 20:55:52 UTC
Permalink
Post by SimonM
The day started (probably still does) with an
editorial "offers meeting" at 0930h.
Yes, although it's a bit earlier now - 0910 I think.
Post by SimonM
I'm fairly certain that the TK operator also ran the Hi-Band machine.
They did in my time. I did it a couple of times myself.
Post by SimonM
To avoid complexTK would cue up on -10.ity, news
I think someone's edited in a bit of extra film there.
Paul Ratcliffe
2017-09-14 20:50:10 UTC
Permalink
On Mon, 11 Sep 2017 10:56:20 +0100, Roderick Stewart
The electronic correction itself was done using a system called TARIF
(Television Apparatus for the Restoration of Indifferent Film)
FWIW, my memory says Technical Apparatus for the Rectification of Inferior
Film.
charles
2017-09-14 21:02:05 UTC
Permalink
Post by Paul Ratcliffe
On Mon, 11 Sep 2017 10:56:20 +0100, Roderick Stewart
The electronic correction itself was done using a system called TARIF
(Television Apparatus for the Restoration of Indifferent Film)
FWIW, my memory says Technical Apparatus for the Rectification of Inferior
Film.
I believe originally "Tony's Apparatus.........."
--
from KT24 in Surrey, England
Roderick Stewart
2017-09-15 06:09:40 UTC
Permalink
Post by charles
Post by Paul Ratcliffe
The electronic correction itself was done using a system called TARIF
(Television Apparatus for the Restoration of Indifferent Film)
FWIW, my memory says Technical Apparatus for the Rectification of Inferior
Film.
I believe originally "Tony's Apparatus.........."
I've seen all of these. Presumably Tony must have been involved in
some way with the invention of the equipment, but nobody was ever able
to tell me who he was.

Rod.

Brian
2017-09-11 10:46:25 UTC
Permalink
On Mon, 11 Sep 2017 01:41:23 +0100, "J. P. Gilliver (John)"
<snip>
Post by J. P. Gilliver (John)
Post by NY
At what stage did broadcasters generally change over from telecineing a
film live to telecineing onto VT and playing the VT
Interesting question.
(I would guess they went to VT as soon as quality VT became good enough
and cheap enough - because once it's on VT, you don't need the telecine
machine - or the original film, for that matter - if you want to
broadcast it again. So as soon as VT became both cheaper than telecine,
and didn't wear out after a few playings, they'd have gone to it.)
There's a bit more to it than that: the obvious issue is the
operational convenience of being able to go back and TARIF (grade) the
print again if a shot change is fluffed, and to be able to get the
reel-changeovers frame-perfect by VT editing.

Then there was the famous Christmas Eve incident involving reels
transmitted from TK in the wrong order late at night... :-)

As to when it happened: my guess, and it is no more than that, would
be early 80s, which makes me wonder whether there was a link with the
majority of VT machines being C-format rather than quad. A Tel Rec
person from the era would probably be able to give you
chapter-and-verse.


Remove 2001. to reply by email. I apologise for the inconvenience.
NY
2017-09-11 13:29:10 UTC
Permalink
Post by Brian
Post by J. P. Gilliver (John)
Post by NY
At what stage did broadcasters generally change over from telecineing a
film live to telecineing onto VT and playing the VT
Interesting question.
(I would guess they went to VT as soon as quality VT became good enough
and cheap enough - because once it's on VT, you don't need the telecine
machine - or the original film, for that matter - if you want to
broadcast it again. So as soon as VT became both cheaper than telecine,
and didn't wear out after a few playings, they'd have gone to it.)
There's a bit more to it than that: the obvious issue is the
operational convenience of being able to go back and TARIF (grade) the
print again if a shot change is fluffed, and to be able to get the
reel-changeovers frame-perfect by VT editing.
Then there was the famous Christmas Eve incident involving reels
transmitted from TK in the wrong order late at night... :-)
As to when it happened: my guess, and it is no more than that, would
be early 80s, which makes me wonder whether there was a link with the
majority of VT machines being C-format rather than quad. A Tel Rec
person from the era would probably be able to give you
chapter-and-verse.
Was quad still used as recently as the early 80s? I thought it was
superseded a good ten years before that - apart from for showing old
programmes that had originally been recorded on quad. But that was just a
wild guess.

Quad VTR always struck me as one of those technologies that was a miracle
that it worked so well: to be able to switch between one head and another
several times within each field, and get a flawless transition without
differences in gain or any timing glitches seemed to be PFM (pure f-ing
magic) :-) And the ability to do cut-tape edits and to line the two
pieces of tape up accurately enough not to get timing glitches is amazing. I
know they used iron oxide in a volatile solvent to make the tracks visible
so they could line them up by eye, but to do that to the accuracy required,
especially under time pressures (eg when compiling highlights of a live
event) - well, some people really *are* superhuman.
The Other John
2017-09-11 15:06:00 UTC
Permalink
Post by NY
Was quad still used as recently as the early 80s? I thought it was
superseded a good ten years before that - apart from for showing old
programmes that had originally been recorded on quad. But that was just
a wild guess.
I was using quad for archive retrieval up till '86 then the company
stopped offering it. When I joined another company in '96 they were still
using them for a/r but stopped soon after.
Post by NY
Quad VTR always struck me as one of those technologies that was a miracle
that it worked so well: to be able to switch between one head and another
several times within each field, and get a flawless transition without
differences in gain or any timing glitches seemed to be PFM (pure f-ing
magic)
The recording was FM, so there were no gain differences; there could,
however, be equalisation and differential-gain differences, which showed up
as 'head banding' if the machine was not carefully aligned. Also, if the
replay head's penetration into the tape was wrong you got 'tip errors', and
if the vacuum guide height was wrong, so that the centre of the head drum
did not match the centre of the tape tracks, you got 'velocity errors'; both
of these caused visible stripes at the boundary of each head's track. There
were electronic correctors for these, but they didn't always clean
everything up. Penetration was servo-controlled, either manually with a pot
or automatically using the tip-error signal. Height was controlled with a
knurled knob.
Post by NY
And the ability to do cut-tape edits, lining the
two pieces of tape up accurately enough not to get timing glitches, is
amazing. I know they used iron oxide in a volatile solvent to make the
tracks visible so they could line them up by eye, but to do that to the
accuracy required, especially under time pressure (eg when compiling
highlights of a live event) - well, some people really *are* superhuman.
On the early splicing blocks - Smiths & Smiths - usually called Smith &
Wessons! - a microscope was used to view the developed tracks in order to
cut in the right place and align the ends. Later there was an EMT
(German) splicer which had a rotating head under the control track, the
output of which was displayed on a small CRT 'scope, allowing you to locate
the frame ident pulse so you could cut in the right place. The tape ends
were held in position by a vacuum pump so they didn't move.
--
TOJ.
SimonM
2017-09-13 20:41:36 UTC
Permalink
Post by NY
Post by Brian
Post by J. P. Gilliver (John)
Post by NY
At what stage did broadcasters generally change over from telecineing a
film live to telecineing onto VT and playing the VT
Interesting question.
(I would guess they went to VT as soon as quality VT became good enough
and cheap enough - because once it's on VT, you don't need the telecine
machine - or the original film, for that matter - if you want to
broadcast it again. So as soon as VT became both cheaper than telecine,
and didn't wear out after a few playings, they'd have gone to it.)
There's a bit more to it than that: the obvious issue is the
operational convenience of being able to go back and TARIF (grade) the
print again if a shot change is fluffed, and to be able to get the
reel-changeovers frame-perfect by VT editing.
Then there was the famous Christmas Eve incident involving reels
transmitted from TK in the wrong order late at night... :-)
As to when it happened: my guess, and it is no more than that, would
be early 80s, which makes me wonder whether there was a link with the
majority of VT machines being C-format rather than quad. A Tel Rec
person from the era would probably be able to give you chapter-and-verse.
Was quad still used as recently as the early 80s?
Yes, only just. In Bristol, we had a single edit pair of Quads, which
we got rid of around 1983. I remember a lot of mucking about in Dubbing
trying to hide insert edit buzz on things that had been assembled on
Quad, such as the Antiques Roadshow (originated on 1").

But even after 1" came in, there was a need to be able to replay Quad
tapes. I never worked at TC, but I'm fairly certain they kept a quad
pair going for quite a while. And BBC Enterprises probably did too, at
Woodlands, for foreign sales of older shows and to handle requests for
archival material.
s***@googlemail.com
2017-09-12 19:06:25 UTC
Permalink
Sometime before I left school, in 1974, we were taken on a visit to the new London Weekend studios on the South Bank, which had recently opened. They were still transmitting from film then, and we were given a demonstration of a 35 mm machine using a reel which was due to be transmitted on the following Saturday morning. It's a long time to think back, but I think it was an episode of 'Tarzan', which would have been quite old even then.