Picture
ALL FACTORS THAT AFFECT PICTURE QUALITY

ALL Q&A'S BELOW ARE PRE-2004 UNTIL DATED

Dear Mr. Stapleton, I would love to hear your response to this question:
When looking at a film on television, how come it is possible to tell
whether the film was 'made for television' or for a theatrical release, just
by the quality of the picture?

Yours faithfully, John Rooksby

Now assuming you live in the USA, TV films are mostly made on 35mm so there
is no reason to spot them for technical reasons.  In the UK they are shot on
16mm so you might see a difference if you had a really good TV set. However,
this is not really the issue.  The “quality” that you are talking about is
actually a whole host of things from the script to the Directing, Editing,
Music etc., and, oh yes, the Cinematography! One of the things you might notice is
that TV directors like to shoot very close, and also use the zoom, as well
as a couple of cameras for speed. The combination of all these things
results in some rather ad hoc shooting.  The word “cover” used to be a dirty
word in the precise world of low budget film making.  You couldn’t afford
“cover” so you took the time and got the shot right.  That was the shot in
the movie.  TV films are shot incredibly quickly, so when you see really
good work like in the X-Files you have to remember that the work is being
done really well and really quickly which is quite remarkable. Shots for TV
films are lit in 5 to 10 minutes.  Shots for features take anything from 20
minutes to (famously) several days!  There is no substitute for time when it
comes to lighting: if a Cinematographer is given the time he or she needs to
light, the image will be the better for it. But if the Cinematographer takes
for ever and nothing looks any good, it’s time to call the other guy. There
are no Cinematographers trying to move out of feature films and work for TV:
There are plenty trying to go the other way!

 If making a low budget horror feature film, would you recommend
shooting on 35mm at 24 frames per second (like a real feature film) or super
16 at 25 frames per second as it is probably only destined to be released on
video or shown on tv. --Stephen

You've made one good decision already: to shoot on film! The next part is
easy - shoot Super 16mm as the quality difference for TV showing is minimal,
and a horror film tends to be high contrast and full of bloody close-ups, so
the one weakness of Super 16mm, wide landscape shots, won't be revealed.  And
if it turns out to be a masterpiece, Super 16mm blows up very nicely for
cinema release. (Check out Leaving Las Vegas.)

I see some films shot on 35mm and they look slightly grainier than others
shot with the same film stock. Is this due to lack of lens testing?
--Ruble

Grain is a function of the stock, how "fast" it is in the first place and how many times it has been copied.  Having said that, the human eye always looks for edges and detail, so a lens that is soft may result in an image that is softer than the grain around it, so the eye will see the grain instead of the image, as it searches for detail.  The degree of grain that is visible is one of the things a cinematographer will test until it is "right" for the film in question.

 First of all, thank you very much for your website and your
column. It has been a great comfort and help to me as an apprentice
filmmaker.

I am a student writer/director currently in early pre-production for a
medium-length film (about an hour). We have about $6000 or so at the moment
(hopefully some more down the line) and definitely want to shoot on film,
most likely 16mm. The project is going to attempt to emulate the look of
cinema from several different eras and, ultimately, be in both B/W and
color. I've heard that the best film stock for something like this would be
color reversal stock because it costs much less than negative film. I've
also heard that several music video cinematographers have used color
reversal film to create projects in which the final output is very sleek,
high contrast B/W. Do you think that this is the best way to go?

Thanks, Brian

Sounds like an interesting project.. First of all, I assume your finished
product is on tape, not film, as this makes quite a difference to your
approach.  When you say color reversal is cheaper than negative film, there
may be a confusion about the fact that there is not a print involved (which
adds cost).  However, you can telecine either pos or neg film: The cost of
the raw stock is about 10% more for negative, but labs charge 3 times more
to process reversal than negative so I think your information is misleading.
The main difference is that colour reversal has a much higher contrast than
negative, with more saturated colours and more grain (depending on the neg
stock you are comparing it to).  It probably is good for changing to b/w in
telecine, for this same reason.  My advice would be to use it for the "older
looks" in your film, both b/w and colour, and then use negative for the more
contemporary look.  Reversal stocks come in many forms, Ektachrome and
Kodachrome from 25 ASA to 400 ASA... you could also consider using Super 8mm
for more grain, and/or a hand-cranked 16mm camera for "patchy" exposure and
speed variation. The "scratching" can be added afterwards in post quite
effectively - a bit less hair raising than doing it in the camera! By the
bye - shoot in Super 16mm as it doesn't cost any extra and makes the
transfer to 16:9 TV compositionally perfect, as well as possible blow-up to
35mm better.
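The stock-versus-processing trade-off described above can be put into rough numbers. The per-foot prices below are invented placeholders, not real lab rates; only the ratios (negative stock roughly 10% dearer, reversal processing roughly 3x dearer) come from the answer.

```python
# Hypothetical per-foot costs (placeholder figures, not real lab rates):
# negative raw stock ~10% dearer, reversal processing ~3x dearer.
neg_stock, rev_stock = 0.55, 0.50      # $/ft raw stock
neg_process, rev_process = 0.15, 0.45  # $/ft processing

def shoot_cost(feet, stock, process):
    """Total raw-stock plus processing cost for a given footage."""
    return feet * (stock + process)

feet = 4000  # roughly 110 minutes of 16mm at 24 fps (36 ft/min)
print(round(shoot_cost(feet, neg_stock, neg_process), 2))  # negative: 2800.0
print(round(shoot_cost(feet, rev_stock, rev_process), 2))  # reversal: 3800.0
```

With any figures in those proportions, reversal's cheaper raw stock is swallowed by its processing bill, which is the point being made.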
 

Why is it that older television and movies have that almost grainy look?
A good comparison is the difference between the original 'Star Trek' and
the most recent 'Enterprise'. The newer show looks smoother, sort of.
They're both shooting on film, what gives?

-Evan Shearin

Film technology has come a long way in the last twenty years.  The stock itself is much finer grained than it was, especially the faster stocks.
Another factor is that the newer telecine machines are much sharper than the ones around twenty years ago, so a new transfer from an old print will render it sharper, but in doing so may also emphasise the grain in the stock.

Why is soft focus still used when filming female actors? Who requests that it be used - actor, director, cinematographer or others? I found it particularly noticeable in "Chocolat" when the shot cut to and fro between Binoche (soft) and Depp (hard) - so much so that instead of following the dialogue I found myself being aware of the focus.
Regards,
Paul Villa

Hi Paul,
Er.. you're not supposed to spot that kind of thing, so I suppose I'll have to call my friend Roger Pratt and tell him he messed up!
Any one of the people you mentioned might have decided to use the filter, though generally it's a decision made by the cinematographer.  I did some of that with Michelle Pfeiffer in One Fine Day.. Generally a "soft focus" or diffusion filter is used to disguise the imperfections of the female face, or because it's day 58, it's 11.30pm after a 16 hour day and she has a zit on her upper lip.. or something.
One of the little-considered facts of cinematography is that the audience looks at the face, no matter what size it is on the screen: that's why sometimes an image looks soft when a face is far away - your eye is searching for detail in the face which the film/lens cannot supply.  You spotted the filter because of the cross cutting - if Roger had used the filter on the opposite shot as well, you probably wouldn't have noticed it.  But sometimes you've shot the Depp angle first, and only decide you need the filter when you shoot the Binoche angle...
The other day I shot a very close shot of Julianne Moore (for Shipping News): I filtered it (slightly) because the shot was so close that the sharpness of the image would be simply too much at that size.  I have a special 'variable diffuser' in my filter box, which you can slide across the matte box as you track in to a BCU - the audience will not notice it as the face gets bigger and bigger, so the 'apparent sharpness' stays the same.

2005

Perhaps this is not exactly your field of expertise but you probably know the answer. As you will see, I am from Australia and I have paid quite a few dollars for high definition reception home equipment. Shows like "CSI" come through with absolutely breathtaking clarity and colour. My question concerns movies and Dolby Digital. It seems that Australian TV stations, when screening a HD movie, almost always do not provide a 5.1 soundtrack. Typically it is just 2.0, with the only exception (so far) being the later "Star Wars" movies. From a consumer's point of view, I can rent/buy a DVD of a movie from five or more years ago and 90% of those DVDs will be 5.1. So my question is - are there 5.1 versions available for TV stations to acquire, or do movie houses only produce stereo versions of the latest movies for TV station broadcasts? Hopefully your answer would be relevant to Australia. --Rick

Might be a better question to post elsewhere. However, I do know that the whole HD situation is in a period of tremendous change at the moment, with the USA finishing everything on HD and Europe still using Digi Beta (Digital Betacam). I would imagine there will be a slow transition to 5.1 soundtracks (and then the next big thing!), but in the meantime I would just enjoy what you have without being too concerned about surround sound. TV stations screening older movies will tend to use the format supplied, which might be anywhere from one to ten years old!

Why is it that a film (especially a colour film) shot in 16mm but projected in 35mm, looks so much better than something shot in 35mm but projected in 16mm? Aren't they passing through the same 16mm bottleneck? --Henry

Interesting question. I guess it's a question of where the bottleneck is in the process. One thing that 35mm projection has going for it is that the image brightness tends to be better (assuming a large image) and also the optics aren't quite as critical. 16mm projection also has sound problems with such a small track to play from. It has actually more or less disappeared because it is not that much better than DVD projection, which is so much more convenient and can have great sound. If one set up a very controlled study you'd probably find that you could achieve a more or less equivalent image via the two processes: in real life blowing up 16mm is a common and much used process, so the expertise is there to do it well. The reverse process used to be common 20 years ago but now must be rare except for use in out-of-the-way places.

2006

I was flipping through TV channels recently and I came across the film Le Pacte des loups or "Brotherhood of the Wolf" as it is called in North America. I saw it in the theatre and now have the DVD, but I was stunned to see this TV version because the motion was quite noticeably different from my DVD. This particular TV version was transferred from PAL at 25 frames per second, much like many of the imported UK shows I see on some channels, and yet the version I had on my DVD was clearly transferred at 24fps. I understand that many foreign films are shot at 25fps and later transferred to 24fps to play in our theatres, but I find the difference in motion quite interesting. 25 is more smooth than 24, but I've always associated dramatic film with 24fps and I often don't like what I see from PAL transferred footage because it begins to approach NTSC's 29.97 frame rate and look too video-like.
Have you shot any features in PAL or 25fps, as opposed to normal 24fps? Which frame rate do you prefer?
Jeff

I think the differences you noticed were not to do with the 24/25fps factor, but more the way that DVD's highly compressed information often makes the picture judder with certain kinds of motion. In the present climate of iDigital it is overlooked that DVD is actually quite a crap format as far as motion is concerned. I don't prefer either frame rate as the eye (at least my eye) can't detect the difference: ie shot and projected at 24fps or shot and projected at 25fps. However, 25fps is the de facto standard for PAL TV, based on the 50Hz cycle that goes with 240v AC. So if a film is shot for TV and released in the cinema, it used to just play slow: nowadays it can be corrected via a DI. It's best to shoot the frame rate that is going to be used. I am surprised about PAL transferred footage being too "video like" - I don't think the difference of 1fps is likely to do this? We used to shoot some Music Videos at 50fps in the UK and transfer them at 50fps but that I did find "video-like".. sort of.
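The PAL/cinema speed mismatch mentioned above is easy to quantify: every frame shot at one rate and shown at another changes the runtime by the ratio of the two rates, about 4% between 24 and 25 fps. A quick sketch:

```python
def projected_runtime(shot_minutes, shot_fps, projection_fps):
    """Runtime when every captured frame is shown at the projection rate."""
    return shot_minutes * shot_fps / projection_fps

# A 100-minute film shot at 25 fps for TV, projected at 24 fps, plays slow:
print(round(projected_runtime(100, 25, 24), 1))  # 104.2 minutes

# The reverse (24 fps material run at 25 fps for PAL TV) plays fast:
print(round(projected_runtime(100, 24, 25), 1))  # 96.0 minutes
```

The same 25/24 ratio also shifts the pitch of the soundtrack by about 4%, which is why such speed changes were audible as well as visible before DI correction.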

A film, TV, PAL or otherwise is transmitted over TV via a very high quality system: Beta or HD. This is far superior to DVD as it compresses the image a lot less, so you will always perceive a higher quality in a film on a decent HD TV or Satellite because DVD is a domestic "cheap" system.

Concerning the discussion of implications of raising the frame rate of films, what are some of the "aesthetic consequences" of doing so? The quality of the projected image would improve, but wouldn't this affect things such as motion blur, which is a strong visual cue for viewers?
Jason

There was some talk a while back of increasing projector frame rate to 48fps (Maxivision - http://www.maxivision48.com). Those who have seen tests say there is an almost 3D effect in the clarity of the image. The main reason it hasn't happened is cost: films would be twice as long physically and upgrades to the thousands of projectors all over the world would be very expensive (although a lot cheaper than Digital).

Since the current age is convinced that electronic projection is "around the corner", there is not much interest in an intermediate system that would cost a lot.

Motion blur would be half at double the frame rate, but the fact that each frame is sharper would be compensated for by the fact that you would see twice as many. If there were no economics involved I am sure many filmmakers would prefer a higher frame rate, but, let's face it, it ain't gonna happen. We are stuck/blessed with Imax and then regular theatres for the foreseeable future.
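The halving of motion blur follows directly from the shutter geometry: a conventional 180-degree rotary shutter exposes each frame for half the frame period, so doubling the frame rate halves the exposure time (and therefore the blur) per frame.

```python
def frame_exposure(fps, shutter_angle=180.0):
    """Exposure time per frame (seconds) for a rotary-shutter film camera."""
    return (shutter_angle / 360.0) / fps

print(1 / frame_exposure(24))  # 48.0 -> the classic 1/48 s at 24 fps
print(1 / frame_exposure(48))  # 96.0 -> 1/96 s, half the blur per frame
```

The shorter exposure also costs a stop of light at the same shutter angle, one more reason doubling the frame rate is not free.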

One of the things I find interesting about Imax is that for me it decreases the enjoyment of a dramatic film, and increases my enjoyment of, say, a wildlife film, or even a documentary. Imax has struggled to release films in the large format successfully because I think in some way that "too much" screen quality is not always such a good thing. In a sense, the more information you supply, the less work the brain has to do. If you look at Radio or Story Telling as "no pictures", then Theatre as a "live wide shot" and IMAX as MAX PICTURES, then from the point of view of Drama you can get some sense of the power. The power of drama does not necessarily get increased just because the pictures are a higher quality from an engineer's point of view. 3-D has had the same problem.

Why are 3D pictures limited? We enjoyed a 3-D film at Sea World, San Diego earlier this month. It was great!!! Everyone enjoyed it. Unfortunately, I don't see many 3-D films available.
Letty

It's been tried many times but I think that actually drama and 3-D don't really go together. In drama there is a need for the picture to "go away" in order to get truly immersed in the story. Think of lying in bed as a young child listening to Mum tell a story. It's an out of body experience that often sends one child to sleep while the other stays wide awake - living every moment. A true lover of drama is the second child that stays wide awake - imagination working full time. 3-D tries to replace imagination completely by making the picture "ultra real" and somehow most people like that as a roller coaster type of experience (sensory cinema), but a good story will be inhibited by pictures that don't leave anything to the imagination.

3-D is part of the theme park entertainment experience and I think it will continue to grow in that genre. For good drama I don't think it has much of a future.. but who knows. Perhaps it will have a place in bigger action films but I think the same rule might apply. It's been around for an awful long time with a lot of very clever people trying to make it work as drama and not really succeeding.

Having said all this, 3-D is one way to lure people away from their home cinemas and back into the theatres so there is a lot of research going on and many different ways of converting a film shot in 2-D into a "3-D" version and this might entice viewers to see their favorite films (especially action and sci-fi films) all over again in the theatre.

Flipping through the channels one night I became very interested in just how easy it is to tell within the first few shots I see the calibre of the movie, be it theatrical, movie of the week, or something like a student film. Is it the tangible things like budget, lenses, film stock, etc that make the differences so apparent, or in your opinion is it more due to the level of talent behind the camera? Very interested to hear what you think.
Danny

You are so right; within seconds you pretty much know whether the film has any class or is just another "product". I would say that it is 100% to do with the combined talent of those making the film and nothing to do with budget. It's possible to make "seminal" films on very small budgets (like Kieslowski) and dreadful looking films on huge budgets (er..take a look at most of what is in your local cinema!).

Films are made for many reasons and not a lot of them are made by artists - I use the word in a general way. Most films are chips on a roulette table and the producers and directors are not concerned about "look": they are concerned about "marketing". Sometimes they are concerned with both these things which is a much better situation. I have made it part of my aim in life to only be involved with films that I think are made by people who share my passion for making good films: this makes the choices quite restricted!

Another factor involved with judging how films look when you see them on "telly" is that the intentions of the cinematographer have sometimes been totally undermined either by a producer or director who likes "bright" or a lazy Telecine operator who's made a bad DVD transfer. I once (in the early '80's) called a TV channel as they were transmitting a film I had shot and it looked completely green. I got hold of the technician in charge of the transmission and when I complained he tweaked a few dials and asked me if it looked any better! I gave up shooting for TV after that..

Why has the industry become so obsessed with image sharpness in recent years? Today's features are so sharp that all glamour and mythos are lost, and I believe that the careers of stars (particularly female) are being shortened, because nobody over thirty-five can look really young when faced with today's laser-sharp focus. Makeup effects are often obvious when subjected to this kind of scrutiny, and the gulf between stills and the actual feature has widened to the point of ridiculousness. Stills have traditionally been retouched, but when one sees a Photoshopped still that makes someone's face smooth as a baby's bottom and then sees serious ruts and wrinkles in the feature (or TV show), it's getting a little silly.
Some things actually don't work any more because of today's clarity- a big-budget late-90's feature had one actor in a hairlace toupee, and the hairlace showed big-time in several shots when you watched the film in a theatre. Is this clarity being imposed on cinematographers, or do many feel that it's what people want, or do some have too great a fascination with available technology?
Sandy

It's a mix of all the factors in your last sentence. Throughout the history of film, scientists have worked to produce sharper "better" optics and film stocks and it is now getting to the point, as you so rightly perceive, where there is so much detail that the romance of the image is being replaced by "stark realism" as seen with 24p and the latest lens/film combinations. There certainly are a number of cinematographers - you only have to read the "blogs" to see this - that are continuously enamored with the next technological advance. This is most clearly seen in discussion of the latest and greatest digital stills cameras which are more or less in a state of continuous "improvement" pretty much like computers. No-one seems to want to make a judgment that what is available now is good enough for a few years, let alone decades! When I started a new film stock was quite an event: nowadays they put a new one out every few months it seems. Now we have advanced digital to add to the fun of it all and as you might have noticed, that is really detailed and er..really ugly.. except when treated heavily with various electronic processes to make it look "more like film" ie less sharp.

This roller coaster is not going to stop because our culture is enamored with New, Young, Technological and Expensive. An aesthetic that deliberately looks back and creates something beautiful is likely to be labeled retro or classic which somehow is an insult. Fast cutting, crazy angles, multiple cameras and "textures" (ie shooting MiniDV, Super 8, 16mm and 35mm all muddled up together), is what is cool, hip and happening.

There are some interesting "guidelines" on the BBC website: http://www.bbc.co.uk/commissioning/production/docs/hd_makeup.pdf. If you look at the many other "guideline" sites put out by the BBC it shows you just how many hundreds of ways there now are to produce images. This will only get worse as the manufacturers fight for the "next best thing". To some extent the victories are in the hands of marketing people, although in the end the program makers finally choose the technologies that most suit them.

What is perceived as beautiful has always been a cultural influence: perhaps you and I are enamoured of a world gone by. Seeing those lines and wrinkles - is that part of the new Realism and as such, a beautiful thing? Only time will tell.

This question applies more to techniques that were used decades ago (especially the 1930's) than to what you would be using today -- but it's one I've been wanting to ask for a long time, so here goes: How were those earlier filmmakers able to achieve such amazingly controlled lighting without modern light meters? When I look at some of these older films with their very complicated set-ups, and then look at the way the light is CONTROLLED, I just can't figure out how they did it. The lighting on faces, especially, is beautiful: sparkling highlights that are never blown, beautiful modeling, consistent lighting ratios, etc. It's as if they were lit by Ansel Adams.
Then there was the filmmakers' ability to maintain the same "look" in the lighting across dozens of shots so it all fits together seamlessly.
Again: how were they able to achieve this? Was it simply experience? Did they run test shots of each set up, have the film processed, and fine tune the lighting from there?
Or, were the negatives a mess -- but "fixed in post"?

I'm asking this basically as a still photographer who knows how tough it is to do lighting well. My first meter was a Weston Master V back in the mid-1960's, and I know that meters were infinitely more primitive 30 years before that. Heck, they didn't even have Polaroids. I just don't see how they were able to do it, so well, on such a massive scale, in such a short time, with all the pressures they had to deal with. It's mystified me for years.
Steve

Fantastic, wasn't it? Luckily I grew up at the end of this "primitive" period so I'm in a good position to help you understand. It's simple really: those guys were really, really, really fantastically Good.

First up: the Lighting. Before "soft light" came about, films were lit with Fresnel lights pointing directly at the subject, usually making them very very hot. In Black and White the ASA speed was under 100 for a long time and the lenses needed to be exposed at T4 or T5.6 to look any good, so there was a LOT of Light. The combination of the lenses and the film stock meant that the overall resolution of the image was not that high, so there was a natural softness to the image, which gave that whole glow you are talking about - especially in Black and White. As far as ratios and consistency were concerned, they used incident meters (like the Spectra) and the ratios were made consistent that way. Dougie Slocombe was known to never use a meter but just put his thumb up to make a shadow on his hand and then he'd say T4.5 and that was that!
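That "LOT of Light" can be put into numbers with the standard incident-meter relation N²/t = E·S/C, where N is the f-stop (or T-stop), t the exposure time in seconds, E the illuminance on the subject in lux, S the ASA speed and C the incident calibration constant. The value C = 250 below is a typical figure, taken as an assumption here.

```python
C = 250  # typical incident-meter calibration constant (an assumption)

def required_illuminance(f_stop, shutter_s, asa, c=C):
    """Lux needed on the subject for correct exposure: N^2/t = E*S/C."""
    return f_stop ** 2 * c / (shutter_s * asa)

t = 1.0 / 48  # 24 fps with a 180-degree shutter
print(round(required_illuminance(4.0, t, 100)))  # ~1920 lux for T4, 100 ASA
print(round(required_illuminance(5.6, t, 100)))  # ~3763 lux for T5.6
```

Thousands of lux of hard tungsten light on a face, compared with the few hundred lux of a modern interior, is why those sets were so hot.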

Instead of Polaroids there were "slop tests" where a small piece of negative was processed straight away and the result brought back to set. The real experts could just tell by looking at the negative what was going on. When the old-fashioned colour timers looked at a colour negative they could tell you the printing lights by just holding it over a light box!! That kind of skill has just disappeared, as it is no longer necessary. There was no such thing as "fixing in post" - it had to be right, otherwise it was a re-shoot.

Directors are shooting their own material now with HD cameras: the chemical/photographic skills are on their final chapter and a sad thing it is too. If you think of a musician with perfect pitch you can play them a note and they say "E Flat" and it's right. There were Cameramen who had "perfect pitch" for light - "5.6 please", "Thanks Guv, 5.6 on the lens." Marvellous.

2006

This is a question regarding deep focus photography. Why is it that in black and white movies like Citizen Kane, the deep focus is crystal clear, but when it is used in color (Star Trek: The Motion Picture, The Untouchables) there is a very apparent "fog" or "blur" around the object in the foreground? Is this fog present in b&w films too and I don't notice it? If so, then why is this blur always there?
Karl

"Atmos" wasn't used a great deal in the B&W days as a clear image was generally preferred and as you noticed, later colour films have a tendency to make the light "visible" by using smoke or atmos which is a less threatening way of saying it!

There are a couple of ways of making this "blur" as you call it. One is to use atmos and no filters on the lens, which means that the closer something is to the lens the clearer it will be. The background then becomes increasingly soft and lower in contrast, and window light will "bloom" around the edges of the windows as it hits the atmos, depending on how strong it is. There are also a number of soft filters that can be used on the lens, called various things like Soft FX, Fog, Promist etc., made mostly by Tiffen or Harrison. These might be used in conjunction with atmos or not, depending on what the DP is after.

Radical looks (like David Hamilton's kitsch/dreamy stills) can be made by using nets on the front or behind the lens: these might also be used with or without smoke. Lenses, film stocks and projection advances make today's images much sharper than the old technologies: but sharper is not necessarily appropriate to the film that you are trying to make, so there has been a massive increase in the use of these softening devices to give the picture a strong aesthetic. Elsewhere another reader was complaining about the mystery of the film stars being lost because of the kinds of sharp and critical close-ups being made today.

There seems to be a group of people saying images are getting sharper, less interesting and "samey", and yet today's technologies (mostly via the DI process) have never been more flexible in terms of making strong images that have a look of their own. I find the work that is starting to happen now much more interesting than what was going on 5 years ago when the photochemical process seemed to have hit its end stop. The present (temporary!) combination of film in the camera and the DI process seems ideal to me.

2007

In the last few years, Hi-Def seems to have engulfed the home entertainment industry. Sporting events, DVD's and HBO are all on Hi-Def. I just have a few questions.
There are such things as Hi-Def digital cameras. These must be responsible for the crystal clarity of sporting events and television broadcasts. So then, how can Star Wars be released as a Hi-Def DVD when it was shot on film? How come my channel 6 looks awful, but my channel 786 looks absolutely amazing? Does this Hi-Def craze have anything to do with filming techniques, or are the entertainment gods just putting out better signals?
Ryan

"Hi-Def" is a buzzword without any real meaning unless it is followed by some technical jargon about what kind of Hi-Def. Once upon a time it had a meaning but this has been rapidly lost in a world of Pro-Sumer video. I couldn't even begin to offer a cogent analysis of all the systems out there (ask on Cinematography.net if you really want the nuts and bolts).

In the world of selling TV's, Hi-Def means that the TV is capable of receiving what is called Hi-Def in the USA, and also called Hi-Def in the UK but means something slightly different. Unfortunately the nature of the transmitted signal is what really counts, and this is subject to a frenzy of marketing forces where, in principle, the more channels that are squeezed into the "paid for" Hi-Def space, the lower will be the end quality of the signal (like Hi-Speed internet being really Slo-Speed when there are a lot of users online).

What we will all be seeing with our new TV's is some great images and a lot of very poor ones which would have passed unnoticed on the older sets but suddenly look bad in the face of those ritzy sports images. In the sound world this transformation took place a while back and the CD had a label that showed things like ADD and DDD or AAD - the last one meaning that it was recorded Analogue (like Film), edited Analogue (like on a Steenbeck) and distributed on a CD (like a DVD). If DVD's had the same label system it might clear up some of the confusion!


Can you elaborate on what are the pros and cons of always having a zoom lens mounted on the camera, besides the fact that you avoid always having to change lenses when you want a different focal length (except for extreme situations)?

20 years ago the Zoom lens was regarded in a somewhat negative way by cinematographers. The reasons for this, amongst others, were the following:

1. The quality of a zoom lens was less than that of a prime lens.
2. Zoom lenses were slow (T3.1 or T4).
3. "Zooming" was regarded as a cheap TV thing. Features tracked, TV zoomed.

Nowadays things have changed. First of all the lenses are no longer inferior to prime lenses, some are just as fast (T2.2), and the "breathing" associated with zooms as you focussed has been designed out. As a result, many productions use only a zoom, just switching to prime lenses when there is a hand-held or Steadicam shot. This has also changed recently with the current new "short" zooms in the 15mm-40mm range, designed especially for Steadicam and hand-held purposes.

This has also made for a kind of laziness which suits certain kinds of films and not others. There is a precision involved with marking up a shot for a prime lens where the Cinematographer, Director and Actors get "nailed down" to particular positions on the floor. This encourages a discipline that suits precise lighting: the other approach suits more casual kinds of film-making where the actor can improvise and wander around the set more: the shot size can be changed instantly and the lighting more soft and overall. In general, modern cameras and lenses have enabled new ways of shooting to happen because of the ease and portability of the systems: unfortunately high-end Digital is currently so festooned with cables, gadgets and techie stuff that it's almost like going back to the 50's!
