
Camera Basics, Principles and Practices

Combined by M. Waqas (BS Mathematics), 0333-6535867

MCD 401
Camera Basics, Principles and Practices Course Code MCD 401

Whether you're hanging out with friends on the beach or reading about the history of the 1930s,
photography will likely make an appearance. The oldest known photograph dates back to 1826,
but the structure that would become the first camera was described by Aristotle. The process of
taking pictures became increasingly refined during the 19th century, transitioning from
heavy glass plates to light, gelatin-coated flexible film. Today, once-innovative film cameras
take a back seat to the convenience and ease of digital cameras.

Photography is a fun form of art, and many people are engaged in it because of the wide range
of artistry it allows: every shot is unique. People around the globe appreciate the results of
great photography. Do you know what makes these photos look so stunning? It is the composition
behind each one.

It would be futile to take shots without considering photography composition; without it, you
will not be able to give your images their full beauty. There are composition techniques anyone
can learn, but it is the photographer's touch and creativity that make each result unique. Why
does composition matter?

1. It creates more appealing photos.
2. It delivers a more convincing story.
3. It looks more professional.
4. It gives images balance.
5. It makes pictures unique.
6. It shows personality.
7. It adds more life to images.
8. It allows you to capture the essence of a scene.
9. It awes the viewers.

Virtual University of Pakistan



Macro photography is extreme close-up photography, usually of very small subjects, in which
the size of the subject in the photograph is greater than life size. By some definitions, a macro
photograph is one in which the size of the subject on the negative or image sensor is life size or
greater. However, in other uses, it refers to a finished photograph of a subject at greater than
life size.

The ratio of the subject size on the film plane (or sensor plane) to the actual subject size is
known as the reproduction ratio. Likewise, a macro lens is classically a lens capable of
reproduction ratios of at least 1:1, although the term is often applied to any lens with a large
reproduction ratio, even one that never quite reaches 1:1.

Apart from technical photography and film-based processes, where the size of the image on the
negative or image sensor is the subject of discussion, the finished print or on-screen image more
commonly lends a photograph its macro status. For example, when producing a 6×4 inch
(15×10 cm) print using 135 format film or sensor, a life-size result is possible with a lens having
only a 1:4 reproduction ratio.
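As a rough sketch of the arithmetic above (the function and variable names here are illustrative, not from any photographic library), the reproduction ratio and the effect of print enlargement can be computed like this:

```python
# Illustrative sketch of reproduction-ratio arithmetic; names are invented
# for this example and do not come from any camera or photo library.

def reproduction_ratio(image_size_mm: float, subject_size_mm: float) -> float:
    """Size of the subject on film/sensor divided by its real size."""
    return image_size_mm / subject_size_mm

# A 24 mm subject rendered 24 mm tall on the sensor is 1:1, i.e. "true" macro.
assert reproduction_ratio(24.0, 24.0) == 1.0

# The print example above: 135 film is 36 mm wide, and a 15 cm (150 mm)
# print enlarges it about 150/36 = 4.2x. A lens giving only a 1:4 ratio
# on film therefore yields a roughly life-size subject in the print.
on_film = reproduction_ratio(36.0 / 4, 36.0)     # 0.25, i.e. 1:4
in_print = on_film * (150.0 / 36.0)
print(round(in_print, 2))                        # about 1.04 -> life size
```

The point of the sketch is that "macro" can be judged either on film (the 1:1 convention) or in the finished print, where enlargement does part of the work.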

Reproduction ratios much greater than 1:1 are considered to be photomicrography, often
achieved with a digital microscope (photomicrography should not be confused with
microphotography, the art of making very small photographs, such as for microforms).

Due to advances in sensor technology, today’s small-sensor digital cameras can rival the macro
capabilities of a DSLR with a “true” macro lens, despite having a lower reproduction ratio,
making macro photography more widely accessible at a lower cost. In the digital age, a "true"
macro photograph can be more practically defined as a photograph with a vertical subject height
of 24 mm or less.

"Macro" lenses specifically designed for close-up work, with a long barrel for close focusing and
optimized for high reproduction ratios, are one of the most common tools for macro
photography. (Unlike most other lens makers, Nikon designates its macro lenses as "Micro"
because of their original use in making microforms.) Most modern macro lenses can also focus
continuously to infinity and can provide excellent optical quality for normal photography.


True macro lenses, such as the Canon MP-E 65 mm f/2.8 or Minolta AF 3x-1x 1.7-2.8 Macro,
can achieve magnification greater than life size, enabling photography of the structure of small
insect eyes, snowflakes, and other minuscule objects. Others, such as Infinity Photo-Optical's
TS-160, can achieve magnifications from 0-18x on the sensor, focusing from infinity down to
18 mm from the object.

Macro lenses of different focal lengths find different uses:

- Continuously variable focal length – suitable for virtually all macro subjects

- 45–65 mm – product photography, small objects that can be approached closely without
causing undesirable influence, and scenes requiring a natural background perspective

- 90–105 mm – insects, flowers, and small objects from a comfortable distance

- 150–200 mm – insects and other small animals where additional working distance is
required

Difference between macro, micro and close-up photography

Macro photography is photography taken with a dedicated macro lens. A real macro lens is
capable of at least 1:1 magnification. Just because a camera has the word macro written on it
doesn't make it a true macro lens.

Close-up photography is the act of photographing objects such as flowers or insects at close
range so that the subject fills the frame. In other words, it is the act of photographing subjects
close up. This is easily achievable with any lens, even a 300mm telephoto lens.

Macro photography is in essence close-up photography as well. However, close-up photography
is not always considered true macro photography. For example, if a lens that is not a real macro
lens offers a macro setting (as many do nowadays), the result is usually referred to as close-up
photography, not true macro.



Pinhole Cameras and Photography

The pinhole camera (also known as a camera obscura) was first envisioned around the 5th
century BCE. The camera obscura was a box with a small hole in it, through which light (and the
image carried by it) would travel and reflect against a mirror. The camera obscura was originally
used to observe solar events and to aid in drawing architecture, though it became something
entirely new in 1800. A young man named Thomas Wedgwood attempted to capture the image
portrayed in a camera obscura with silver nitrate, which is light-sensitive. Unfortunately, the
images didn't hold, and it wasn't until the French inventor Joseph Niépce attempted the same feat
with bitumen (a kind of tar) that the first photograph was produced.

Louis Daguerre and Modern Photography

Niépce, keen to refine his newly discovered process for taking pictures, partnered with artist
and designer Louis Daguerre. When Niépce died in 1833, Daguerre pressed onwards with the
project, experimenting with a polished silver plate, coated in silver iodide, which developed an
image courtesy of mercury fumes. While Niépce's camera had required multiple hours of light
exposure for a single image, Daguerre's innovation cut the time down to mere minutes. He made
his invention public in 1839. In 1841, a man named William Henry Fox Talbot further refined
the process by substituting paper for Daguerre's silver plate.

The Birth of the Negative: Wet Plate Negatives, Dry Plate Negatives

In 1848, sculptor Frederick Scott Archer became frustrated with the stark definition offered by
photographs at the time. He set out to create a process that would allow him to capture more
subtle variations in shade, since all photographs were, at this point, restricted to black and white.
For his wet plate process, he applied a collodion mixture of iodide or chloride to a glass plate.
The plate would be dipped into a solution of silver nitrate and used to take a photograph while
the coating was still wet. The photograph had to be developed almost immediately afterward, but
the negative that formed on the glass was capable of capturing immaculate levels of detail. The
one downside to this process was the time required to prep the glass plate, usually on-site, which
made it extremely impractical for news photographs and field reporting. A few years later, in
1864, W.B. Bolton and B.J. Sayce created a one-step emulsion fluid with silver iodide. This
process, which became known as the dry plate process, wasn't faster than the wet plate process,
but it did produce photographs of better overall consistency.


Flexible Film and Photographic Films

For the next 20 years, daguerreotype photography remained the most popular form of taking
pictures. However, as young George Eastman discovered when he took a trip to Santo Domingo,
taking pictures was an expensive and heavy process. He set to work, building on the chemical
finesse of the dry plate process, and gradually developed a flexible gelatin-paper film. In 1885,
he created and patented a device to hold a roll of his new film, and in 1888, he introduced his
first Kodak camera to the market.

Camera Advancements: Daguerreotype Cameras, Box Cameras, Flashbulbs, 35mm Cameras and Polaroids

The box camera was reinvented with Kodak's Brownie camera, which was released in 1900. The
Brownie camera cost a scant $1 and was marketed towards children, although it became a hit
among servicemen when World War I began. Color photography became possible with the
Autochrome plate in 1907, although it didn't take off until the release of Kodachrome film in
1936. Flashes of light, produced by burning magnesium, had long been used by photographers to
enhance the light of a scene, but in 1930, the General Electric Company began producing
flashbulbs specifically for use with cameras. The 35mm camera was created in 1913 by Oskar
Barnack, who used existing 35mm movie film to capture still images. The first 35mm camera
released was known as the Leica I, and once it hit shelves in 1925, the new compact camera
became the standard for spur-of-the-moment snapshots. In 1943, Edwin Land began work on the
Polaroid instant camera after being asked by his 3-year-old daughter why she couldn't instantly
see the picture he'd taken of her.

Digital Cameras

In 1975, the field of photography morphed yet again with the introduction of the digital camera.
Developed by Steven Sasson, a research engineer at the Eastman Kodak company, this
rudimentary prototype weighed eight pounds and was as large as a kitchen toaster. Pictures were
stored on a cassette tape, and capturing a photo could take up to 23 seconds. The first filmless
camera was created by Sony in 1981. Their creation, the Mavica, could store pictures on floppy
disks, which could then be viewed on a television monitor.

Smartphone Cameras and Technologies

Today, the latest incarnation of the camera may be no farther than your fingertips. In 2002, the
Nokia 7650 was released to the public. It was released at the same time as the movie
Minority Report, and demand for camera phones multiplied. In 2005, the Sony Ericsson K750i
introduced a memory card slot and an LED flash, paving the way for photo sharing. In 2013,
the Internet company Twitter introduced a service called Vine, allowing users to use their camera
phones to record and share 6 seconds of color- and audio-enabled video with their online
followers. Only time will tell how cameras develop from here, but if it's anything like the past
two hundred years, we're in for a pretty wild ride.


The history of photography has roots in remote antiquity with the discovery of the principle of
the camera obscura and the observation that some substances are visibly altered by exposure to
light. As far as is known, nobody thought of bringing these two phenomena together to capture
camera images in permanent form until around 1800, when Thomas Wedgwood made the first
reliably documented although unsuccessful attempt. In the mid-1820s, Nicéphore Niépce
succeeded, but several days of exposure in the camera were required and the earliest results were
very crude. Niépce's associate Louis Daguerre went on to develop the daguerreotype process, the
first publicly announced photographic process, which required only minutes of exposure in the
camera and produced clear, finely detailed results. It was commercially introduced in 1839, a
date generally accepted as the birth year of practical photography.

The metal-based daguerreotype process soon had some competition from the paper-based
calotype negative and salt print processes invented by Henry Fox Talbot. Subsequent innovations
reduced the required camera exposure time from minutes to seconds and eventually to a small
fraction of a second; introduced new photographic media which were more economical, sensitive
or convenient, including roll films for casual use by amateurs; and made it possible to take
pictures in natural color as well as in black-and-white.

The commercial introduction of computer-based electronic digital cameras in the 1990s soon
revolutionized photography. During the first decade of the 21st century, traditional film-based
photochemical methods were increasingly marginalized as the practical advantages of the new
technology became widely appreciated and the image quality of moderately priced digital
cameras was continually improved.

This art is the result of combining several different technical discoveries. Long before the first
photographs were made, the Chinese philosopher Mo Ti and the Greek mathematicians Aristotle
and Euclid described a pinhole camera in the 5th and 4th centuries BCE. In the 6th century CE,
the Byzantine mathematician Anthemius of Tralles used a type of camera obscura in his
experiments.

Ibn al-Haytham (Alhazen) (965 in Basra – c. 1040 in Cairo) studied the camera obscura and the
pinhole camera. Albertus Magnus (1193/1206–80) discovered silver nitrate, and Georges
Fabricius (1516–71) discovered silver chloride. Daniel Barbaro described a diaphragm in 1568.
Wilhelm Homberg described how light darkened some chemicals (the photochemical effect) in
1694. The novel Giphantie (by the Frenchman Tiphaigne de la Roche, 1729–74) described what
could be interpreted as photography.


Around the year 1800, Thomas Wedgwood made the first known attempt to capture the image in
a camera obscura by means of a light-sensitive substance. He used paper or white leather treated
with silver nitrate. Although he succeeded in capturing the shadows of objects placed on the
surface in direct sunlight, and even made shadow-copies of paintings on glass, it was reported in
1802 that "the images formed by means of a camera obscura have been found too faint to
produce, in any moderate time, an effect upon the nitrate of silver." The shadow images
eventually darkened all over because "no attempts that have been made to prevent the
uncoloured part of the copy or profile from being acted upon by light have as yet been
successful." Wedgwood may have prematurely abandoned his experiments due to frail and
failing health; he died aged 34 in 1805.

"Boulevard du Temple", a daguerreotype made by Louis Daguerre in 1838, is generally accepted
as the earliest photograph to include people. It is a view of a busy street, but because the
exposure time was at least ten minutes the moving traffic left no trace. Only the two men near
the bottom left corner, one apparently having his boots polished by the other, stayed in one place
long enough to be visible.

In 1816, Nicéphore Niépce, using paper coated with silver chloride, succeeded in photographing
the images formed in a small camera, but the photographs were negatives, darkest where the
camera image was lightest and vice versa, and they were not permanent in the sense of being
reasonably light-fast; like earlier experimenters, Niépce could find no way to prevent the coating
from darkening all over when it was exposed to light for viewing.

One of the oldest photographic portraits known was made by John Draper of New York, in 1839
or 1840, of his sister, Dorothy Catherine Draper.

The oldest surviving permanent photograph of the image formed in a camera was created by
Niépce in 1826 or 1827. It was made on a polished sheet of pewter and the light-sensitive
substance was a thin coating of bitumen, a naturally occurring petroleum tar, which was
dissolved in lavender oil, applied to the surface of the pewter and allowed to dry before use.
After a very long exposure in the camera (traditionally said to be eight hours, but in fact probably
several days), the bitumen was sufficiently hardened in proportion to its exposure to light that the
unhardened part could be removed with a solvent, leaving a positive image with the light regions
represented by hardened bitumen and the dark regions by bare pewter. To see the image plainly,
the plate had to be lit and viewed in such a way that the bare metal appeared dark and the
bitumen relatively light.

A new era in color photography began with the introduction of Kodachrome film, available for
16 mm home movies in 1935 and 35 mm slides in 1936. It captured the red, green and blue color
components in three layers of emulsion. A complex processing operation produced
complementary cyan, magenta and yellow dye images in those layers, resulting in a subtractive
color image. Maxwell's method of taking three separate filtered black-and-white photographs
continued to serve special purposes into the 1950s and beyond, and Polachrome, an "instant"
slide film that used the Autochrome's additive principle, was available until 2003, but the few
color print and slide films still being made in 2015 all use the multilayer emulsion approach
pioneered by Kodachrome.


A pinhole camera is a simple camera without a lens and with a single small aperture, a pinhole
– effectively a light-proof box with a small hole in one side. Light from a scene passes through
this single point and projects an inverted image on the opposite side of the box.
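The size of that inverted image follows from similar triangles through the pinhole; here is a minimal sketch of the geometry (the function name and figures are hypothetical, chosen only for illustration):

```python
# Pinhole projection by similar triangles: a ray from the top of the object
# crosses the pinhole and lands below the axis, so the image is inverted,
# and its height scales with the depth of the box.

def image_height_mm(object_height_mm: float,
                    object_distance_mm: float,
                    box_depth_mm: float) -> float:
    """Height of the inverted image on the back wall of the box."""
    return object_height_mm * box_depth_mm / object_distance_mm

# A 1.8 m (1800 mm) person standing 5 m from the pinhole of a 100 mm deep box:
print(image_height_mm(1800, 5000, 100))  # 36.0 (mm), upside down
```

Doubling the box depth doubles the image size, which is why longer pinhole cameras behave like longer focal-length lenses.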

The other sides of the box, including the side with the pinhole, are kept completely light-tight;
the interior is usually painted black, or a box made of black material is used. A thin translucent
screen, resembling a projection sheet, may be placed inside the box opposite the pinhole to catch
the image.

Up to a certain point, the smaller the hole, the sharper the image, but the dimmer the projected
image. Optimally, the size of the aperture should be 1/100 or less of the distance between it and
the projected image.

Because a pinhole camera requires a lengthy exposure, its shutter may be manually operated, as
with a flap made of light-proof material to cover and uncover the pinhole. Typical exposures
range from 5 seconds to several hours.

A common use of the pinhole camera is to capture the movement of the sun over a long period of
time; this type of photography is called solargraphy. The image may also be projected onto a
translucent screen for real-time viewing (popular for observing solar eclipses).

Pinhole devices provide safety for the eyes when viewing solar eclipses because the event is
observed indirectly, the diminished intensity of the pinhole image being harmless compared with
the full glare of the Sun itself.

The camera obscura was not so much an invention as a discovery and development. It works on
a naturally occurring phenomenon (the rectilinear propagation of light) that can, for example,
often be observed when sunlight filters through dense leaves. Over the centuries many people
contributed to the design of the camera obscura as we know it, but all designs are based on the
underlying optical laws that apply in nature.

In the 5th century BC, the Mohist philosopher Mozi in ancient China mentioned the effect of an
inverted image forming through a pinhole.[3] The image of an inverted Chinese pagoda is
mentioned in Duan Chengshi's (d. 863) book Miscellaneous Morsels from Youyang written
during the Tang Dynasty (618–907).[4] Along with experimenting with the pinhole camera and
the burning mirror of the ancient Mohists, the Song Dynasty (960–1279 CE) Chinese scientist
Shen Kuo (1031–1095) experimented with the camera obscura and was the first to establish
geometrical and quantitative attributes for it.

The Greek philosopher Aristotle observed the phenomenon in the fourth century BC. In his book
Problems, he wrote:


"Why is it that when the sun passes through quadrilaterals, as for instance in wickerwork, it
does not produce a figure rectangular in shape but circular?" and further, "Why is it that an
eclipse of the sun, if one looks at it through a sieve or through leaves, such as a plane-tree or
other broad-leaved tree, or if one joins the fingers of one hand over the fingers of the other, the
rays are crescent-shaped where they reach the earth? Is it for the same reason as that when light
shines through a rectangular peep-hole, it appears circular in the form of a cone?"

Principle of a pinhole camera: light rays from an object pass through a small hole to form an
inverted image.

Selection of pinhole size

Within limits, a smaller pinhole (through a thinner surface) will result in sharper image
resolution, because the projected circle of confusion at the image plane is practically the same
size as the pinhole. An extremely small hole, however, can produce significant diffraction effects
and a less clear image due to the wave properties of light. Additionally, vignetting occurs as the
diameter of the hole approaches the thickness of the material in which it is punched, because the
sides of the hole obstruct light entering at anything other than 90 degrees.

The best pinhole is perfectly round (since irregularities cause higher-order diffraction effects),
and in an extremely thin piece of material. Industrially produced pinholes benefit from laser
etching, but a hobbyist can still produce pinholes of sufficiently high quality for photographic
work.

One method is to start with a sheet of brass shim, metal reclaimed from an aluminium drinks
can, or tin/aluminium foil; use fine sandpaper to reduce the thickness of the centre of the
material to the minimum; then carefully create a pinhole with a suitably sized needle.

For standard black-and-white film, a wavelength of light corresponding to yellow-green (550
nm) should yield optimum results. For a pinhole-to-film distance of 1 inch (25 mm), this works
out to a pinhole 0.17 mm in diameter. For 5 cm, the appropriate diameter is 0.23 mm. The depth
of field is basically infinite, but this does not mean that no optical blurring occurs. The infinite
depth of field means that image blur depends not on object distance, but on other factors, such as
the distance from the aperture to the film plane, the aperture size, and the wavelength(s) of the
light source.
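Diameters like those quoted above come from a diffraction-optimum formula. One commonly cited version is Lord Rayleigh's d ≈ 1.9·√(f·λ); the sketch below uses it, with the caveat that different references use constants between roughly 1.4 and 2, which is why published tables (including the 0.17 mm and 0.23 mm figures above) vary somewhat:

```python
import math

# Diffraction-limited pinhole diameter, using the widely quoted Rayleigh
# form d = 1.9 * sqrt(f * wavelength). The constant 1.9 is a convention;
# other references use values between roughly 1.4 and 2, so tabulated
# optima (like the handout's figures) differ slightly.

def optimal_pinhole_mm(focal_length_mm: float, wavelength_nm: float = 550.0) -> float:
    wavelength_mm = wavelength_nm * 1e-6   # 550 nm = 5.5e-4 mm
    return 1.9 * math.sqrt(focal_length_mm * wavelength_mm)

for f in (25.0, 50.0):
    d = optimal_pinhole_mm(f)
    # Effective f-number is the pinhole-to-film distance over the diameter.
    print(f"{f:.0f} mm: pinhole {d:.2f} mm (about f/{f / d:.0f})")
```

Note how slowly the optimum grows: quadrupling the box depth only doubles the ideal hole size, so pinhole cameras always operate at very high effective f-numbers, hence the long exposures mentioned earlier.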


What is a DSLR (Digital SLR)?


DSLR stands for "Digital Single Lens Reflex". In simple language, a DSLR is a digital camera
that uses a mirror to direct light from the lens to the viewfinder, the window on the back of the
camera that you look through to see what you are taking a picture of.
1) What do DSLR cameras consist of?
Take a look at the following image of an SLR cross section (image courtesy of Wikipedia); the
numbered components are referenced in the next section:

1. Lens
2. Reflex mirror
3. Shutter
4. Image sensor
5. Matte focusing screen
6. Condenser lens
7. Pentaprism
8. Eyepiece/Viewfinder

2) How do DSLR cameras work?


When you look through the viewfinder on the back of the camera, whatever you see is exactly
what you are going to get in the photograph. Light from the scene you are photographing passes
through the lens onto a reflex mirror (#2) that sits at a 45-degree angle inside the camera
chamber, which forwards the light vertically to an optical element called a "pentaprism" (#7).
The pentaprism then converts the vertical light back to horizontal by redirecting it through two
separate mirrors, right into the viewfinder (#8).
When you take a picture, the reflex mirror (#2) swings upwards, blocking the vertical pathway
and letting the light pass straight through. Then the shutter (#3) opens and the light reaches the
image sensor (#4). The shutter (#3) remains open for as long as the image sensor (#4) needs to
record the image; then the shutter (#3) closes and the reflex mirror (#2) drops back to its 45-
degree angle to continue redirecting light into the viewfinder.
Obviously, the process doesn't stop there. Next, a lot of complicated image processing happens
on the camera. The camera processor takes the information from the image sensor, converts it
into an appropriate format, then writes it to a memory card. The whole process takes very little
time, and some professional DSLRs can do this 11 times in one second!
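The mirror-and-shutter choreography described above can be summarized as a simple ordered step list. This is purely an illustrative sketch in Python; no real camera exposes an interface like this:

```python
# Toy model of the DSLR exposure sequence described above. The step names
# are descriptive only and do not correspond to any real camera API.

def capture_sequence() -> list:
    return [
        "reflex mirror swings up (viewfinder goes dark)",
        "shutter opens",
        "image sensor records light for the exposure time",
        "shutter closes",
        "reflex mirror drops back to its 45-degree angle",
        "processor reads the sensor, encodes the image, writes it to the card",
    ]

for number, step in enumerate(capture_sequence(), start=1):
    print(number, step)
```

A burst-capable DSLR simply repeats steps 1–6 rapidly, which is what the "11 times in one second" figure above refers to.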

1558: Camera Obscura: The first optical device to project an image of its surroundings onto a
screen. Although some evidence of its existence dates back to 1000 AD, it was perfected in
1558 by Giambattista della Porta.
1836: Daguerreotype: Louis Daguerre invented a new camera to capture a permanent image on
a screen. It used a process of coating a copper plate with silver, which was then treated with
iodine vapor to make it light-sensitive. The projected image was developed with mercury vapor
and fixed with a solution of ordinary salt.
1841: Alexander Wolcott's Camera: This camera received the first US patent for photography.
It used a daguerreotype plate placed in front of a focusing system that used a concave mirror to
project images. A sliding shutter prevented further exposure after capture.
1861: Panoramic camera: The first wide-angle lens camera used a 76 mm lens made up of two
hollow glass hemispheres. The spherical lens was filled with water, which would project an
image onto a curved plate. A flap on the front had to be lifted to capture an image.
1888: Kodak: George Eastman pioneered photographic film in cameras. His first camera, called
the "Kodak", used a simple box with a fixed-focus lens and a single shutter speed. It held
enough film for around a hundred photographs; the detachable film could be taken out for
processing in a factory and a new film loaded into the camera.
1900: Brownie: This was the camera that revolutionized the photo industry for the public. It
was a basic cardboard box camera with a simple meniscus lens which captured images on a film
roll. It was priced at $1 and was extremely simple to use.
1913: Leica: Built by Oskar Barnack in 1913, the Leica was the first practical 35 mm camera to
use standard 35 mm cinema film. It transported the film horizontally, extending the frame size
to 24×36 mm with a 2:3 aspect ratio, which suited landscape-format photos.
1933: Exakta: The first single-lens reflex (SLR) camera for 127 roll film. Later models
pioneered the first built-in flash socket, activated by the shutter. In 1936, the first SLR for
35mm film was built.


1939: Argus C3: The best-selling 35mm camera in the world for three decades. It used a simple
diaphragm shutter built into the camera body, allowing it to use interchangeable lenses without
the need for a complex focal-plane shutter. The rangefinder was separate from the viewfinder
and was coupled to the lens through a series of gears located on the outside of the camera body.
1948: Polaroid: The world's first instant-picture camera. It used a patented chemical process to
produce finished positive prints from the exposed negatives in under a minute. In spite of the
high price, it remains one of the top-selling cameras of all time.
1949: Disposable camera: A company called Photo-Pac produced a cardboard camera,
beginning in 1949, which shot 8 exposures and was mailed in for processing.
1980: Sony Mavica: One of the first analog cameras. In essence it was a video movie camera
that recorded single frames: 50 per disk in field mode and 25 per disk in frame mode. The
image quality was considered equal to that of then-current televisions.
1988: Fuji DS-1P: The first true digital camera, recording images as a computerized file. It
recorded images to a 16 MB internal memory card, which required battery power to retain the
data.
1991: Kodak DCS 100: The first commercially available digital single-lens reflex (DSLR)
camera. Aimed at the photojournalism market, it worked well in the field; it was built on a
Nikon F3 body and released by Kodak in May 1991.
1999: Nikon D1: At 2.74 megapixels, this camera was the first digital SLR developed entirely
by a major manufacturer, and its affordable cost made it attractive to professional photographers
and high-end consumers.
Link: http://www.thewindowsclub.com/history-digital-camera-ppt-pdf


The view camera is a type of camera first developed in the era of the daguerreotype (1840s-
'50s) and still in use today, though with many refinements. It comprises a flexible bellows that
forms a light-tight seal between two adjustable standards, one of which holds a lens, and the
other a viewfinder or a photographic film holder.

The bellows is a flexible, accordion-pleated box. It encloses the space between the lens and film,
and flexes to accommodate the movements of the standards. The front standard is a board at the
front of the camera that holds the lens and, usually, a shutter.

At the other end of the bellows, the rear standard is a frame that holds a ground glass, used for
focusing and composing the image before exposure—and is replaced by a holder containing the
light-sensitive film, plate, or image sensor for exposure. The front and rear standards can move
in various ways relative to each other, unlike most other camera types. This provides control
over focus, depth of field, and perspective. The camera is usually used on a tripod or other
support.

Basic view camera terminology (diagram)

Types of view camera

Several types of view cameras are used for different purposes, and provide different degrees of
movement and portability. They include:


Monorail camera - This is the most common type of studio view camera, with front and rear
standards mounted to a single rail that is fixed to a camera support. This design gives the greatest
range of movements and flexibility, with both front and rear standards able to tilt, shift, rise, fall,
and swing in similar proportion. These are generally made of metal with leather or synthetic
bellows, and are difficult to pack for travel.

Field camera - These have the front and rear standard mounted on sliding rails fixed to a hinged
flat bed that is fixed to a camera support (tripod, etc.). These cameras are usually made of wood,
or sometimes lightweight and strong composites such as carbon fiber. With bellows fully
retracted, the flat bed folds up, reducing the camera to a relatively small, light, and portable box.
The price for this portability is that the standards are not as mobile or as adjustable as on a
monorail design. The rear standard in particular may be fixed and offer no movement. These
large format but transportable cameras are popular with landscape photographers. Tachihara and
Wisner are examples of modern field cameras at opposite ends of the price scale.

Studio and salon cameras are similar to field cameras, but do not fold up for portability.

Press and technical cameras are true view cameras, as almost all of them have a ground glass integral to the film-holder mechanism that allows critical focus and full use of the sometimes limited movements. More expensive examples had a wide array of movements, as well as focusing and composing aids such as rangefinders and viewfinders. They are most often made of metal and designed to fold up quickly for portability; they were used by press photographers before and during World War II.

Advantages

The ability to skew the plane of critical focus: In a camera without movements, the film plane is always parallel to the lens plane. A camera with tilts and swings lets the photographer skew the plane of focus away from the parallel in any direction, which in many cases can bring the image of a subject that is not parallel to the lens plane into near-to-far focus without stopping down the aperture excessively. Both standards can be tilted about the horizontal axis or swung about the vertical axis to change the plane of focus. Tilts and swings of the front standard alone do not alter or distort shapes or converging lines in the image; tilts and swings of the rear standard affect these things as well as the plane of focus. If the plane of focus must be skewed without altering shapes in the image, front movements alone must be used.

The ability to distort the shape of the image by skewing the film plane: This is most often to
reduce or eliminate, or deliberately exaggerate, convergence of lines that are parallel in the
subject. If a camera with parallel film and lens planes is pointed at an angle to a flat subject with parallel lines, such as the wall of a building, the lines appear to converge in the image, becoming closer to each other the further away from the camera they are. With a view camera the rear standard can be swung toward the wall to reduce this convergence. If the standard is parallel to the wall, convergence is entirely eliminated. Moving the rear standard this way skews the plane of focus, which can be corrected with a front swing in the same direction as the rear swing.

Improved image quality for a print of a given size: The larger a piece of film is, the less detail
is lost at a given print size because the larger film requires less enlargement for the same size
print. In other words, the same scene photographed on a large-format camera provides a better-
quality image and allows greater enlargement than the same image in a smaller format.
Additionally, the larger a piece of film is, the more subtle and varied the tonal palette and
gradations are at a given print size. A large film size also allows same-size contact printing.
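The enlargement arithmetic behind this point can be sketched briefly. In the hypothetical example below (our own illustration; the film dimensions are nominal values, not figures from the text), the linear enlargement factor is simply the print's long edge divided by the negative's long edge:

```python
# Enlargement factor = print dimension / film dimension (linear).
# Film long-edge figures below are nominal, for illustration only.

def enlargement_factor(print_long_mm, film_long_mm):
    """Linear magnification needed to fill a given print size."""
    return print_long_mm / film_long_mm

PRINT_LONG_MM = 254  # long edge of an 8 x 10 inch print, in mm

formats = {
    "35mm (24 x 36 mm)": 36,
    "4 x 5 inch sheet (approx. 96 x 120 mm)": 120,
}

for name, long_edge_mm in formats.items():
    f = enlargement_factor(PRINT_LONG_MM, long_edge_mm)
    print(f"{name}: about {f:.1f}x enlargement")
```

The 4 x 5 negative needs roughly a third of the enlargement the 35 mm frame does, which is why the larger format loses less detail at the same print size.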

Shallow depth of field: view cameras require longer focal length lenses than smaller format cameras, especially in the larger sizes. The resulting shallower depth of field lets the photographer isolate the subject.

Smaller apertures can be used: much smaller apertures can be used than with smaller format cameras before diffraction becomes significant for a given print size.

Disadvantages

Lack of automation: most view cameras are fully manual, requiring time, and allowing even
experienced photographers to make mistakes. Some cameras, such as Sinars, have some degree
of automation with self-cocking shutters and film-plane metering.

Steep learning curve: In addition to needing the knowledge required to operate a fully manual
camera, view camera operators must understand a large number of technical matters that are not
an issue to most small format photographers. They must understand, for example, view camera
movements, bellows factors, and reciprocity. A great amount of time and study is needed to
master those aspects of large format photography, so learning view camera operation requires a
high degree of dedication.

Large size and weight: monorail view cameras are unsuitable for handheld photography and are in most cases difficult to transport. By contrast, a folding-bed field camera such as a Linhof Technika with a lens-coupled rangefinder system even allows action photography.

Shallow depth of field: view cameras require longer focal length lenses than smaller format
cameras, especially for the larger sizes, with shallower depth of field.

Small maximum aperture: it is not feasible to make long focal length lenses with the wide
maximum apertures available with shorter focal lengths.

High cost: there is limited demand for view cameras, so there are no economies of scale, and they are much more expensive than mass-produced cameras. Some are handmade. Although the cost of sheet film and processing is much higher than that of rollfilm, fewer sheets of film are exposed, which partially offsets the cost.


Some of these disadvantages can be viewed as advantages. For example,

1. Slow setup and composure time allow the photographer to better visualize the image
before making an exposure.
2. The shallow depth of field can be used to emphasize certain details and deemphasize
others, especially combined with camera movements.
3. The high cost of film and processing encourages careful planning. Because view cameras
are rather difficult to set up and focus, the photographer must seek the best camera
position, perspective, etc. before exposing.
4. Beginning 35 mm photographers are sometimes even advised to use a tripod, specifically because it slows down the picture-taking process.


A digital camera (or digicam) is a camera that encodes images and videos digitally and stores them for later reproduction. Most cameras sold today are digital, and digital cameras are incorporated into many devices, ranging from PDAs and mobile phones (called camera phones) to vehicles.

Digital and film cameras share an optical system, typically using a lens with a variable
diaphragm to focus light onto an image pickup device. The diaphragm and shutter admit the
correct amount of light to the imager, just as with film but the image pickup device is electronic
rather than chemical. However, unlike film cameras, digital cameras can display images on a
screen immediately after being recorded, and store and delete images from memory. Many
digital cameras can also record moving videos with sound. Some digital cameras can crop and
stitch pictures and perform other elementary image editing.

Steven Sasson, an engineer at Eastman Kodak, invented and built the first electronic camera using a charge-coupled device image sensor in 1975. Earlier electronic cameras used a camera tube; later ones digitized the signal. Early uses were mainly military and scientific, followed by medical and news applications. In the mid to late 1990s digital cameras became common among consumers. By the mid-2000s digital cameras had largely replaced film cameras, and higher-end cell phones had an integrated digital camera. By the beginning of the 2010s almost all smartphones had an integrated digital camera.

The two major types of digital image sensor are CCD and CMOS. A CCD sensor has one
amplifier for all the pixels, while each pixel in a CMOS active-pixel sensor has its own amplifier.
Compared to CCDs, CMOS sensors use less power. Cameras with a small sensor use a back-
side-illuminated CMOS (BSI-CMOS) sensor. Overall final image quality is more dependent on
the image processing capability of the camera, than on sensor type.

Types of digital cameras

Digital cameras come in a wide range of sizes, prices and capabilities. In addition to general
purpose digital cameras, specialized cameras including multispectral imaging equipment and
astrographs are used for scientific, military, medical and other special purposes.

Compact cameras are intended to be portable and are particularly suitable for casual snapshots.

Many incorporate a retractable lens assembly that provides optical zoom. In most models, an auto-actuating lens cover protects the lens from the elements. Most ruggedized or water-resistant models do not retract, and most with superzoom capability do not retract fully.

Compact cameras are usually designed to be easy to use. Almost all include an automatic mode,
or "auto mode", which automatically makes all camera settings for the user. Some also have
manual controls. Compact digital cameras typically contain a small sensor which trades-off
picture quality for compactness and simplicity; images can usually only be stored using lossy compression (JPEG). Most have a built-in flash, usually of low power, sufficient for nearby subjects. A few high-end compact digital cameras have a hotshoe for connecting to an external flash. Live preview is almost always used to frame the photo on an integrated LCD display. In addition to being able to take still photographs, almost all compact cameras can record video.

Compacts often have macro capability and zoom lenses. The zoom range (up to 30x) is generally enough for candid photography, but less than is available on bridge cameras (more than 60x) or the interchangeable lenses of DSLR cameras, available at much higher cost. Autofocus systems in compact digital cameras are generally based on contrast detection, using the image data from the live preview feed of the main imager. Some compact digital cameras use a hybrid autofocus system similar to what is commonly available on DSLRs. Some high-end travel compact cameras combine 30x optical zoom with full manual control via a lens ring, an electronic viewfinder, hybrid optical image stabilization, built-in flash, Full HD 60p video, RAW capture, burst shooting up to 10 fps, and built-in Wi-Fi with NFC and GPS.

Typically, compact digital cameras incorporate a nearly silent leaf shutter into the lens but play a
simulated camera sound for skeuomorphic purposes.

For low cost and small size, these cameras typically use image sensor formats with a diagonal
between 6 and 11 mm, corresponding to a crop factor between 7 and 4. This gives them weaker
low-light performance, greater depth of field, generally closer focusing ability, and smaller
components than cameras using larger sensors. Some cameras use a larger sensor, including, at the high end, pricey full-frame compacts such as the Sony Cyber-shot DSC-RX1, which offer capability near that of a DSLR.
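The quoted crop-factor range can be checked with the usual definition, crop factor = full-frame diagonal / sensor diagonal, taking the full-frame (24 x 36 mm) diagonal as about 43.3 mm. This small sketch is our own illustration, not part of the original text:

```python
# Crop factor relative to the 24 x 36 mm "full frame" format,
# whose diagonal is about 43.3 mm.
FULL_FRAME_DIAGONAL_MM = 43.3

def crop_factor(sensor_diagonal_mm):
    return FULL_FRAME_DIAGONAL_MM / sensor_diagonal_mm

# The 6-11 mm diagonals cited in the text give roughly the 7-to-4 range quoted:
print(round(crop_factor(6), 1))   # ~7.2
print(round(crop_factor(11), 1))  # ~3.9
```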

A variety of additional features are available depending on the model of the camera. Such features include GPS, a compass, a barometer, and an altimeter for height above (or depth below) mean sea level, and some models are rugged and waterproof.

Starting in 2011, some compact digital cameras can take 3D still photos. These 3D compact stereo cameras can capture 3D panoramic photos, with dual lenses or even a single lens, for playback on a 3D TV.

In 2013, Sony released two add-on camera models without display, to be used with a smartphone
or tablet, controlled by a mobile application via WiFi.


How to Load Film into a 35mm Camera

This provides information on loading film and adjusting the main settings on a 35mm camera.

Steps:

1. Locate the Rewind Knob, on the left side of the camera, and pull up until the back of the
camera opens.
2. Cut a Leader for your film to load into the Take-Up Spool. The leader must be approximately 22mm to fit into the spool.
3. Place the film into the left side of the camera. Push the Rewind Knob down to create a
snug fit on the film after it is placed into the pocket.
4. Pull the leader of film over to the Take-Up Spool. Insert the narrow end of the film leader into the slot. Hold the spool steady with one hand and push the film in until it is held firmly inside the spool or emerges from the other side.
5. Make sure the sprocket teeth engage the perforations on both sides of the film. Use the rewind knob to take up any slack and tighten the film in the camera. Then close the back of the camera and use the film advance to wind the film.
6. Take 3 pictures. If the film has been loaded properly, the rewind knob will turn every time you crank the film advance lever. If not, make sure the slack has been taken out of the film by retightening it. Take at least 2 pictures to clear out the exposed film and start with fresh film. Once you see the number 1 in the window, you are ready to take pictures.
7. Now it’s time to set your ISO Film Speed. To set film speed, gently lift up the ISO speed
ring and turn it to your desired ISO speed shown in the window. (This should rotate the
numbers on the INSIDE window on the ring).
8. To set Shutter Speed, rotate the shutter speed ring to the desired shutter speed. The shutter controls the length of the exposure; on this camera, the higher the number, the shorter the exposure time. (This should rotate the numbers on the OUTSIDE of the ring).
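The shutter-speed markings in step 8 follow a simple convention. Assuming (as is common, though not stated for any specific camera here) that a marked number N denotes an exposure of 1/N second, higher numbers give shorter exposures:

```python
# Typical shutter-speed dial markings (example values, not from a real dial).
markings = [60, 125, 250, 500, 1000]

for n in markings:
    # A marking of N conventionally denotes an exposure of 1/N second.
    print(f"dial {n} -> {1 / n:.4f} s")
```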


UV filters are supposed to block UV light. So, for newcomers to photography, let's first look at what UV light is and why you would want to block it.

The "traditional" visible spectrum runs from red to violet. Red light has the longest wavelength and violet the shortest. Light with a longer wavelength than red is called infrared, and light with a shorter wavelength than violet is called ultraviolet or UV. The wavelength of light is measured in nanometers (abbreviated nm), and 1nm is a billionth of a meter (that's a US billion, or 1000 million, not a UK billion, which is a million million!). Light shorter in wavelength than about 400nm is called ultraviolet; light longer in wavelength than 700nm is called infrared.
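The bands just defined can be written down as a tiny classifier, using the approximate 400nm and 700nm cutoffs from the paragraph above (a sketch for illustration only):

```python
def band(wavelength_nm):
    """Classify light by wavelength, using the ~400/700 nm boundaries."""
    if wavelength_nm < 400:
        return "ultraviolet"
    elif wavelength_nm <= 700:
        return "visible"
    return "infrared"

print(band(365))  # ultraviolet
print(band(550))  # visible
print(band(850))  # infrared
```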

So now we know what UV light is, but why would we want to block it? The answer lies in the way that color film works. There are basically three color-sensitive layers: one sensitive to red light, one to green light, and one to blue light. The blue layer responds not only to blue light but also to UV light, so if there is a lot of UV around, the blue-sensitive layer gets extra exposure and the final image takes on a blue cast. Since film isn't normally sensitive to infrared, you don't need an infrared-blocking filter. Interestingly though, digital sensors are infrared sensitive, and most digital cameras have an infrared-blocking filter built in.

Now there isn't usually a huge amount of UV around at sea level. There is some (that's
what gives you a suntan or sunburn) but most of it is scattered by the atmosphere.
However as you gain altitude, for example by going up a mountain, the amount of UV
increases. Under these conditions a UV filter can prevent a blue cast in photographs.

Since UV filters look clear and neutral to the naked eye, some people also use them as a protective filter that they leave on their lens at all times. Some people think this is a good idea; others question the wisdom of placing a $20 filter in front of a $1000 lens and potentially affecting image quality. Both schools of thought have some valid points. It's your choice.

So if you buy a UV filter, you'd expect it to block UV, right? Well, sometimes you'd be wrong, as the results of this test show. I've looked at the range between 350nm and 400nm
for UV blocking since the glass used in almost all lenses will itself block any light with a
wavelength shorter than 350nm, so you don't need help from a filter there.


How to Clean Your Lens and Filters?

Avoid unnecessary cleaning of your lens

Glass is relatively hard and durable. However, when advanced coatings and other
chemicals are added to the lens, it becomes a surface that's more vulnerable to scratches
and damage from chemicals and contact. Because of this, we want to try to keep our
lenses and filters free of fingerprints and dirt, and avoid repeated physical interaction—
this includes touching the lenses and—yes—cleaning.
When stored in your camera bag or on your shelf, judicious use of front and rear lens
caps will help keep your optics clean. But, when you use your gear, it's going to get dirty.
This cannot be avoided. Your lenses will benefit from an occasional cleaning of your
camera bag innards, as dust and dirt will likely find a home inside your bag and attach
itself to the lens.
Dust happens

Dust is everywhere and everywhere is dust. It will get on and inside your lens. Lenses are
manufactured in extremely clean factories, where manufacturers go to great lengths to try
to eliminate dust from the environment. Even then, brand-new lenses may have dust
between the lens elements.

Dust, however, is not the main enemy. A lens that sits on a shelf in your home for years and collects a thick layer of dust will, obviously, produce image-quality issues. But a few specks of dust here and there on or inside the lens will have no effect on image quality. A few specks of dust on or inside the lens will have no effect on image quality. That statement was intentionally repeated.


Trying to keep your lenses dust free through continual cleaning may serve to shorten the
life of your lens, as you run the risk of scratching the lens surfaces every time you clean
the glass.

Beware of rear smudges

Oily fingerprints and smudges on the rear element will have the most dramatic impact on
image quality, because of the way that the light is focused narrowly through the back of
the lens.


The good news is that the rear element of the lens is less susceptible to dirt and oil
because, when mounted on the camera, it isn't subject to kids' sticky fingers, your sticky
fingers, or other environmental dangers.

Cleaning your optics is easy to do, even in the field

Here is a simple, three-step process for effective lens and filter cleaning:

1. Remove as much dust and dirt as possible from the lens with a blower or soft-
bristled brush.
2. Apply a few drops of lens cleaning solution to a lens tissue or cleaning cloth.
3. Using a circular motion, gently remove oil, fingerprints, and grime from the lens
surface, working from the center outward.


Handling the camera

Pictures that are shaky, bounce around, or lean over to one side are a pain to watch, so it is worth taking extra care to make sure that camera shots are steady and carefully controlled. There may be
times when the audience’s attention is so riveted to exciting action on the screen that they are
unconcerned if the picture does weave from side to side or move about. But don’t rely on it!
Particularly when there is little movement in the shot, an unsteady picture can be distracting and
irritating to watch. As a general rule, the camera should be held perfectly still, mounted to a
camera support, unless the camera operator is deliberately panning it (turning it to one side) or
tilting it (pointing it up or down) for a good reason. So what stops us from holding the camera
steady? There are a number of difficulties. Even “lightweight” cameras still grow heavier with
time. Muscles tire. Body movements (breathing, heartbeat) can cause camera movement. Wind
can buffet the camera. The camera operator may be shooting from an unsteady position, such as
a moving car or a rocking boat. On top of all that, if a telephoto lens is being used, any sort of
camera shake will be considerably exaggerated. To overcome or reduce this problem and provide
a stable base for the camera, several methods of camera support have been developed.


Keeping the handheld camera steady takes practice. Here are some techniques to handhold a
camera:
 Rest your back against a wall.
 Bracing the legs apart provides a better foundation for the camera.
 Kneel, with an elbow resting on one leg.
 Rest your body against a post.
 Lean the camera against something solid.
 Lean your side against a wall.
 Sit down, with your elbows on your knees.
 Rest your elbows on a low wall, fence, railings, car, or some other stationary object.
 Rest your elbows on the ground.
Supporting the camera
There are three basic ways to support a camera:
Use the camera operator’s body. With practice, cameras can be handheld successfully. Depending on the camera’s design, a handheld camera may be steadied against the camera operator’s head or shoulder while he or she looks through the viewfinder eyepiece.
Use some type of body support. A number of body supports are available for cameras of different sizes. They add a mechanical support of some type to give the camera added stability.
Attach it to a camera mount. The camera can be attached to a camera mount of some type (monopod, tripod) with a screw socket in its base. A quick-release plate may be fastened to the bottom of the camera, allowing it to be removed in a moment.


Handheld cameras

When the decision is made to have the operator hold the camera by hand, it is usually because
the camera has to be mobile, able to change positions quickly.
This method is most commonly used by news crews, for documentaries, at sports events, or for
shooting music videos. In all of these situations, the camera generally needs to move around to
follow the action.
Some of the more lightweight consumer and lower-end professional cameras can be held in one hand; they are not large enough to be shoulder mounted. A camera operator can maintain steadiness fairly easily for short periods of time. However, over longer periods, even lightweight cameras can become difficult to hold steady.
Larger cameras are designed to be shoulder mounted. The body of the camera rests on the
camera operator’s right shoulder. The operator places his or her right hand through a support
loop on the side of the lens. This way, the operator’s fingers are free to control the zoom rocker
(servo zoom) switch while the thumb presses the record/pause switch. The camera operator’s left
hand adjusts the manual zoom ring, the focusing ring, and the lens aperture.
The secret to good camera control with a hand-held camera is to adopt a comfortable, well-
balanced position, with legs apart and slightly bent and elbows tucked in on the sides. Grip
the camera firmly but not too tightly or your muscles will tire and cause camera shake.
Enhance steadiness by resting your elbows against your body or something really secure.
This may be a wall, a fence, or perhaps a nearby car.


The shoulder-mounted handheld camera is steadied by the right hand, positioned through the
strap on the zoom lens. That same right hand also operates the record button and the zoom rocker
(servo zoom) switch.
The comfort and success of handholding a camera depends largely on the camera operator’s
stamina and how long he or she will be using the camera. Standing with upraised arms
supporting a shoulder-mounted camera can be very tiring, so several body braces and shoulder
harnesses are available that help the camera operator to keep the camera steady when shooting
for long periods.
The monopod
The monopod is an easily carried, lightweight mounting. It consists of a collapsible metal tube of adjustable length that screws into the camera base. This extendable tube can be set to any convenient length. Braced against a knee, foot, or leg, the monopod can provide a firm support for the camera yet allow the operator to move it around rapidly for a new viewpoint. Its main disadvantage is that it is easy to accidentally lean the camera sideways and get sloping horizons. And, of course, the monopod is not self-supporting.
The pan head (panning head or tripod head)
If the camera were mounted straight onto any mount it would be rigid, unable to move around to
follow the action. Instead, it is better to use a tripod head. Not only does this enable the camera
operator to swivel the camera from side to side (pan), and tilt it up and down, but the freedom of
movement (friction, drag) can be adjusted as well. The tripod head can also lock in either or both
directions.
Although a camera can be controlled by holding it, it is usually much easier to control the pans, tilts, zooms, and focus by using the tripod arms (also known as a pan bar or panning handles) attached to the head.
Whenever the camera is tilted or panned, the camera operator needs to feel a certain amount of
resistance to control it properly. If there is too little resistance, the camera operator is likely to
overshoot the camera move at the end of a pan or tilt. It will also be difficult to follow the action
accurately. On the other hand, if the camera operator needs to exert too much effort, panning will
be bumpy and erratic. So the friction (drag) for both pan and tilt is generally adjustable. Tripod
heads for video cameras usually use either friction or fluid to dampen movements. The cheaper, simpler friction head has disadvantages: as pressure is gradually exerted to start a pan, the head may suddenly move with a jerk, and at the end of a slow pan it can stick unexpectedly. With a fluid head, though, all movements should be steady and controlled. Locking controls are part of
the tripod head. These controls prevent the head from panning or tilting. Whenever the camera is
left unattended, it should be locked off. Otherwise, the camera may suddenly tilt and not only jolt
its delicate mechanism but even tip the tripod over. Locking controls are useful when the camera
needs to be very steady (such as when shooting with a long telephoto lens).
ii. Using a tripod
A tripod offers a compact, convenient method of holding a camera steady, provided it is used
properly. It has three legs of independently adjustable length that are spread apart to provide a
stable base for the camera. However, tripods are certainly not foolproof. In fact, precautions need
to be taken in order to avoid possible disaster, so here are some useful tips:
■ Don’t leave the camera on its tripod unattended, particularly if people or animals are likely to knock against or trip over it. Take special care whenever the ground is slippery, sloping, or soft.
■ To prevent the feet from slipping, tripods normally have either rubber pads for smooth ground or spikes (screw-out or retractable) for rough surfaces. (Be sure, though, not to use spikes when they are likely to damage the floor surface.)
■ If the ground is uneven, such as on rocks or a staircase, the tripod legs can be adjusted to different lengths so that the camera itself remains level when panned around. Otherwise horizontals will tilt as the camera pans.
■ Many tripods are fitted with bubble levels to help level the camera.
■ Tripods fitted with a camera tend to be top heavy, so always make sure that the tripod’s legs
are fully spread and that it is resting on a firm surface.
■ There are several techniques for improving a tripod’s stability. The simplest is to add a central
weight, such as a sandbag, hung by rope or chain beneath the tripod’s center. The legs can be
tied to spikes in the ground. Or use a folding device known as a “spreader” to provide a portable
base.
The rolling tripod/tripod dolly
One practical disadvantage of the tripod is that the camera operator cannot move it around while
shooting. However, tripods can use a tripod dolly, a set of wheels that fits directly on the tripod, or a wheeled base (called a camera dolly) to become a rolling tripod. Although it sounds obvious,
before moving a rolling dolly, remember to check that there is no cable or obstruction in the camera’s path. It sometimes helps to give a slight push to align the casters in the appropriate
direction before the dolly begins. Otherwise the video image may bump a little as the tripod
starts to move. Although widely used in smaller studios, the rolling tripod dolly does not lend
itself to subtle camerawork. Camera moves tend to be approximate, and camera height is adjusted by resetting the heights of the tripod legs, so height changes while shooting are not practical unless a jib is attached to the dolly.

iii. The pedestal


For many years, the pedestal (or ped as it is widely known) has been the all-purpose camera
mounting used in TV studios throughout the world. It can support even the heaviest studio
cameras, yet it still allows a range of maneuvers on smooth, level surfaces. Basically, the
pedestal consists of a three-wheeled base, supporting a central column of adjustable height. A
concentric steering wheel is used to push and steer the mounting around and to alter the camera’s
height. Thanks to a compensatory pneumatic, spring, or counter-balance mechanism within the
column, its height can be adjusted easily and smoothly, even while on shot. There may be
occasions when a second operator’s assistance is needed to help push the pedestal and to look
after the camera cable.
iv. Jib arms
In the golden days of filmed musicals, large camera cranes came into their own: bird’s-eye shots of the action, swooping down to a group, sweeping along at floor level, shots of dancing feet, climbing to follow the action as dancers ascended staircases. In the right hands, such camerawork is very impressive.
Some television production companies still use small camera cranes (jibs), but they need skilled
handling and occupy a lot of floor space compared with a pedestal. If you want a wide variation
in camera heights, a much less costly and more convenient mounting is the jib (or jib arm).
The long jib arm (boom) is counterbalanced on a central column. This column is generally
supported on a tripod or camera pedestal. The video camera is fixed into a cradle at the far end of
the jib, remotely controlled by the camera operator who stands beside the mounting, watching an
attached picture monitor. There is a wide range of jib designs, from the lightweight mountings
used with smaller cameras with a maximum camera height of 10 feet to heavy-duty jibs that will
reach up to 40 feet.


The jib has a variety of operational advantages. It can reach over obstructions that would bring a
rolling tripod or pedestal to a halt. It can take level shots of action that occur high above the floor
while other mounts working on the floor level can only shoot the subject from a low angle.
However, as the jib is swung, the camera always moves in an arc, whether it is being raised,
lowered, or turned sideways. It cannot travel parallel with subjects moving across the action area.
Whether an operator can turn and raise/lower the jib while on shot and keep the moving subjects
in focus and in a well-composed picture will depend on the operator’s skills—and a bit of luck.
Specialty camera mounts
Several devices are available that can help camera operators cope with those awkward occasions
when the camera needs to be secured in unusual places.
Typical equipment that can prove handy for smaller production units includes the following:
■ Camera clamps. Metal brackets or clamps of various designs, which allow the camera to
be fastened to a wall, fence, rail, door, or other structure.
■ Car rig (or car mount). A mount that attaches to the car, inside or outside of the vehicle,
in order to capture images that would otherwise be difficult to obtain. Some mounts use suction
cups to fit onto the car, others fit over the side door, and some sit in a beanbag-type mount as
shown in Figure.
The Steadicam, Glidecam, and Fig Rig are just some of the special camera stabilizers that take
the shake and shudder out of wide camera movements. These systems allow the camera operator
to take smooth traveling shots while panning, tilting, walking, running, climbing, and so forth.
An LCD color screen (treated to reduce reflections) allows the operator to monitor the shots.
The main theme you will find running through this book is that it’s the result that matters, not
how you get it. The camera needs to be firmly supported. The audience does not know or care
whether the camera operator is using a tripod or resting the camera against a handy post.
A moving shot can be taken from a car, the back of a motorcycle, a hospital trolley, a wheelchair,
or even roller skates or skis. It is the result that counts, although some methods are a lot safer and
more convenient than others.
v. Handling care
It’s easy to endanger video equipment, especially when shooting on location. Although some
units are rugged and almost foolproof, others are easily damaged. A moment’s oversight can put
the equipment out of action altogether—a camera momentarily rested on the ground, a spilt
drink, the jolts, and vibration of a traveling vehicle. It takes just a moment to protect the camera
with a water-proof cover (rain hood) against wind-blown dust and grit, sea-spray, or rain.
Extreme cold, heat, or moisture can create problems too. A car’s trunk can become an oven in
the hot sun. High humidity can wreak havoc with videotape recorders.
Moving from a cold exterior to warm surroundings is liable to result in condensation (dew) in
VCRs and can cause tape or machine damage. Condensation can also cause major problems with
lens element misting, and care must be taken to protect internal elements if the lens is not sealed.
The newer memory card cameras are not as susceptible to these tape-related problems. Wrapping the
equipment (even in a plastic bag) may help fight the condensation.


A camera lens (also known as photographic lens or photographic objective) is an optical lens or
assembly of lenses used in conjunction with a camera body and mechanism to make images of
objects either on photographic film or on other media capable of storing an image chemically or
electronically.

There is no major difference in principle between a lens used for a still camera, a video camera, a
telescope, a microscope, or other apparatus, but the detailed design and construction are
different. A lens may be permanently fixed to a camera, or it may be interchangeable with lenses
of different focal lengths, apertures, and other properties.

Convex Lens

A convex lens is thicker at the center than at the edges.

Convex lenses are thicker at the middle. Rays of light that pass through the lens are brought
closer together (they converge). A convex lens is a converging lens. When parallel rays of light
pass through a convex lens the refracted rays converge at one point called the principal focus.
The distance between the principal focus and the center of the lens is called the focal length.


Use of Convex Lenses – The Camera

A camera consists of three main parts.

1. The body which is light tight and contains all the mechanical parts.
2. The lens, which is a convex (converging) lens.
3. The film, or a charge-coupled device in the case of a digital camera.

The rays of light from the person are converged by the convex lens, forming an image on the film
or charge-coupled device in the case of a digital camera. The angle at which the light enters the
lens depends on the distance of the object from the lens. If the object is close to the lens, the light
rays enter at a sharper angle. This results in the rays converging farther from the lens. As the lens
can only bend the light to a certain degree, the image needs to be focused in order to form on the
film. This is achieved by moving the lens away from the film.

Conversely, if the object is far from the lens, the rays arrive at a shallower angle (nearly
parallel), and they are brought to a focus closer to the lens. In this case the lens needs to be
positioned closer to the film to get a focused image.

Thus the real image of a closer object forms farther from the lens than the real image of a
distant object, and the action of focusing is moving the lens so that the real image falls on the
film. The image formed is said to be real because the rays of light from the object actually pass
through it, and it is inverted (upside down).
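The focusing behavior described above follows the thin-lens equation, 1/f = 1/do + 1/di, and can be sketched numerically. The 50 mm focal length and the two object distances below are illustrative values, not figures from this handout:

```python
# Thin-lens sketch: why the lens must move away from the film
# to focus a nearer object.

def image_distance(f_mm, object_mm):
    """Solve the thin-lens equation 1/f = 1/do + 1/di for di."""
    return 1.0 / (1.0 / f_mm - 1.0 / object_mm)

f = 50.0                                 # convex lens, 50 mm focal length
far_di = image_distance(f, 10_000.0)     # object 10 m away
near_di = image_distance(f, 500.0)       # object 0.5 m away

# The nearer object's image forms farther behind the lens, so the
# lens must be moved away from the film to bring it into focus.
print(f"far object:  image at {far_di:.2f} mm")
print(f"near object: image at {near_di:.2f} mm")
```

Running this shows the distant object's image forming almost at the focal plane (about 50.25 mm), while the near object's image forms noticeably farther back (about 55.56 mm).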

While in principle a simple convex lens will suffice, in practice a compound lens made up of a
number of optical lens elements is required to correct (as much as possible) the many optical
aberrations that arise. Some aberrations will be present in any lens system. It is the job of the lens
designer to balance these out and produce a design that is suitable for photographic use and
possibly mass production.


Concave Lens

A concave lens is thinner at the center than at the edges.

Concave lenses are thinner at the middle. Rays of light that pass through the lens are spread out
(they diverge). A concave lens is a diverging lens. When parallel rays of light pass through a
concave lens the refracted rays diverge so that they appear to come from one point called the
principal focus. The distance between the principal focus and the center of the lens is called the
focal length. The image formed is virtual and diminished (smaller).


A lens collects light from a point on an object and focuses it to a corresponding conjugate point on an
image. Under most conditions, the lens fails at this task because of some error in the precision with
which it focuses this light. Rather than a true point image, the lens produces a blur circle. It is the
function of the optical designer to assure that this blur circle is sufficiently small to allow the required
resolution or image quality. The inability of a lens to form a perfect image is caused by lens
aberrations. The following paragraphs will describe the seven aberrations and discuss some of the
salient points of each.

1. Spherical aberration
Spherical aberration is the imaging error found when a lens is focusing an axial bundle of
monochromatic light. In the presence of spherical aberration, each zone or annulus of the lens
aperture has a slightly different focal length.
The result can be seen in Figure 1. The enlarged view shows the actual intersection of focused rays
with the image surface. At the paraxial focus A, all rays close to the axis of the lens are focused
accurately. The rays from zones farther from the axis are focused short of the paraxial focus. The
farther the rays are from the axis, the greater is this error in focus. This lack of a common focus for all
zones of the lens is spherical aberration.

Figure 1. Spherical aberration in a planoconvex lens.


In Figure 1, there is a point, B, just short of the paraxial focus where the blur circle, or spot size
caused by spherical aberration, is minimized. Figure 2 shows the intensity spread function for the
paraxial and minimum spot size focus positions. Analysis of these spread functions reveals that at the
paraxial focus there is a bright spot about 0.02 mm in diameter, surrounded by a circle of flare about
0.08 mm in diameter.
In the case of focus for minimum spot size, the central spot is slightly greater, about 0.025 mm in
diameter, while the visible flare diameter has been reduced to less than 0.04 mm. In almost all
applications where spherical aberration is present, the overall image quality is best when the lens is
focused close to the point of minimum spot size.


Figure 2. Point spread functions for the lens in Figure 1 at the paraxial focus (a) and focus for
minimum spot size (b).
2. Coma
Coma is an aberration that afflicts off-axis light bundles in a manner quite similar to the way in which
spherical aberration affects axial bundles. As shown in Figure 3, when an off-axis bundle is incident on
a lens afflicted with coma, each annulus focuses onto the image plane at a slightly different height and
with a different spot size. The result is an overall spot that is comatic in shape, having a bright central
core with a triangular flare extending toward the optical axis of the lens.
For a pair of simple lenses arranged symmetrically as in a relay lens system, or a complex lens that has
some degree of symmetry, a significant reduction in the amount of coma is found. This important
characteristic is used in the design of many lenses and instruments such as borescopes and submarine
periscopes. The residual coma in a lens system is usually combined with other off-axis aberrations,
making its individual contribution to final image quality difficult to evaluate.

Figure 3. Illustration of the off-axis aberration, coma.


3. Field curvature
In most optical systems, the final image must be formed on a plane or flat surface. Unfortunately, most
optical systems tend to form that image on a curved surface. The nominal curvature (1/radius) of that
surface is referred to as the Petzval, or field curvature of the lens. For simple lenses this curvature is
equal to approximately 2/3 of the lens power. When the lens is free of other off-axis aberrations, the
image is formed on the Petzval surface. When astigmatism is present (which is most often the case) the
Petzval surface has no real significance as far as actual imaging of the lens system is concerned.
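The "curvature ≈ 2/3 of the lens power" figure quoted above can be turned into a rough worked example. The 100 mm focal length is an assumed value, and reading the power in diopters (so the curvature comes out in 1/m) is this sketch's interpretation, not a statement from the text:

```python
# Rough Petzval field-curvature estimate for a simple lens, using the
# "curvature is about 2/3 of the lens power" rule of thumb quoted above.

f_mm = 100.0                               # assumed focal length
power = 1000.0 / f_mm                      # lens power in diopters (10 D)
petzval_curvature = (2.0 / 3.0) * power    # ~6.67 per meter
petzval_radius_mm = 1000.0 / petzval_curvature

print(f"Petzval radius for a {f_mm:.0f} mm simple lens: ~{petzval_radius_mm:.0f} mm")
```

So a simple 100 mm lens would form its image on a surface curved with a radius of roughly 150 mm, which illustrates why flat-field imaging needs additional correction.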

Figure 4. Illustration of astigmatism.


4. Astigmatism
When astigmatism is present in a lens system, fans of rays of differing orientations at the lens aperture
tend to focus on differing curved surfaces. Figure 4 shows two fans of rays passing through a simple
lens and indicates how they are focused. The spot diagram in Figure 4 shows that the presence of
astigmatism causes the ideal circular point image to be blurred into an elliptical patch.

The field curves shown in Figure 5 represent another method of illustrating the aberrations of field
curvature and astigmatism. These curves represent a cross section of half the image surface from the
optical axis out to the edge of the field. Figure 5a shows a set of field curves for a lens afflicted with
both field curvature and astigmatism. If we think of the image as a spoked wheel centered on the
optical axis, the rim of the wheel is in focus at the tangential image surface, while the spokes are in
focus at the sagittal surface.


Figure 5a. Illustration of field curvature and astigmatism in a simple lens.


Astigmatism is, by definition, the difference between the tangential and sagittal field curves. If the
tangential and sagittal surfaces are coincident, then the lens is said to be free of astigmatism. In this
case, the image is formed on the Petzval surface. When astigmatism is present, the tangential field
departure from the Petzval surface is three times the departure of the sagittal field (Figure 5a). In
most cases it is not possible to correct field curvature and astigmatism to zero, but satisfactory image
quality usually can be achieved by balancing residual astigmatism with inherent field curvature as
illustrated in Figure 5b.

Figure 5b. Illustration of the introduction of negative astigmatism to balance field curvature.

5. Distortion
Distortion is a unique aberration in that it does not affect the quality of the image in terms of
sharpness or focus. Rather, distortion affects the shape of the image, causing it to depart from a true
scaled duplicate of the object. Figure 6b represents a lens system free of distortion that produces a
true reproduction of the checkerboard object. If the system suffers from positive distortion, then the
off-axis points are imaged at distances greater than nominal, creating the pincushion effect seen in 6a.
On the other hand, if the system exhibits negative distortion, the resulting image assumes a barrel
shape as seen in Figure 6c. With the exception of certain metrological systems, where critical
measurements are taken from the image, distortion errors in the 5 to 10 percent region usually are
deemed acceptable.
The five aberrations presented to this point have been monochromatic aberrations, generally computed
at the central wavelength for the lens system. If the lens is to be used over an extended spectral
bandwidth, the following two chromatic aberrations must also be considered.


Figure 6. Illustration of distortion: (a) approximately 15 percent positive (pincushion) distortion; (b)
zero distortion; and (c) approximately 10 percent negative (barrel) distortion.

6. Axial color
For all optical glasses, the index of refraction varies as a function of wavelength; the index is greater
for shorter (blue) wavelengths. Also, the rate at which the index changes is greater at the shorter
wavelengths. In a simple lens this causes each wavelength to focus at a different point along the
optical axis. This chromatic spreading of the light is known as dispersion.

Figure 7. Axial color in a simple lens (a) and in an achromat (b) of identical focal length and speed
(f/#).
Figure 7a illustrates a simple lens focusing a bundle of white light covering the spectral band from 450
to 650 nm. If the focus is set for the middle of the band, as shown, the blur circle consists of a green
central core with a halo of purple (red and blue) surrounding it. Except in very unusual cases, such as
laser systems or nearly monochromatic systems, axial color is an aberration that must be dealt with in
order to achieve usable image quality. This can be accomplished by converting the simple lens into an
achromatic doublet as shown in Figure 7b. The two glass types selected correct the primary axial color
by bringing the two extreme wavelengths to a common focus. In the lens illustrated, a reduction of 30
times in blur-circle size has been realized by the achromatization of this simple lens.


Figure 8. A simple lens with little chief ray refraction (a) has little lateral color. An eyepiece design
(b) where substantial nonsymmetrical refraction of the chief ray exists will be afflicted with lateral
color.
7. Lateral color
The second chromatic aberration (and the last of the seven primary lens aberrations) is lateral color.
For on-axis light bundles, the optical axis of the lens coincides with the central ray in that bundle. For
off-axis bundles, the corresponding central ray is called the chief ray, or principal ray. The height of
the chief ray at the image plane defines image size. If lateral color exists in the lens system, this chief
ray is dispersed, causing the differing wavelengths to be imaged at different heights on the image
plane. The result is a chromatic, radial blur for off-axis image points.
In the case of a simple lens with the chief ray passing through its center, there is little refraction of
that ray and, therefore, little lateral color. A system that is symmetrical about the point where the
chief ray crosses the optical axis (the aperture stop) has little or no lateral color because the
aberration tends to cancel itself as the chief ray traverses the symmetrical halves of the system.

The eyepiece is a classic example of a lens form that produces large amounts of chief ray refraction
that is not symmetrical about the aperture stop. As a result, in most eyepiece designs lateral color is a
major contributor to degradation of off-axis image quality. Figure 8 illustrates the chief ray path
through a simple lens 8a and an eyepiece 8b. The presence or lack of lateral color is shown in each

Virtual University of Pakistan


Camera Basics, Principles and Practices Course Code MCD 401

case.

Conclusion

This completes the review of the seven primary lens aberrations. The optical designer must evaluate
the potential contribution of each aberration to final system performance and adjust the configuration
of the optical system to achieve satisfactory performance.


Normal lens

In photography and cinematography, a normal lens is a lens that reproduces a field of view that
generally looks "natural" to a human observer under normal viewing conditions, as compared
with lenses with longer or shorter focal lengths which produce an expanded or contracted field of
view that distorts the perspective when viewed from a normal viewing distance. Lenses of
shorter focal length are called wide-angle lenses, while longer-focal-length lenses are referred to
as long-focus lenses (the most common of that type being telephoto lenses).

For still photography, a lens with a focal length about equal to the diagonal size of the film or
sensor format is considered to be a normal lens; its angle of view is similar to the angle
subtended by a large-enough print viewed at a typical viewing distance equal to the print
diagonal; this angle of view is about 53° diagonally. For cinematography, where the image is
normally viewed at a greater distance, a lens with a focal length of roughly double the film or
sensor diagonal is considered 'normal'.

The term normal lens can also be used as a synonym for rectilinear lens.

Typical normal focal lengths for different formats

Film still

Four "normal" lenses for the 35mm format.


Typical normal lenses for various film formats for photography are:

Film format                      Image dimensions            Image diagonal   Normal lens focal length
9.5 mm Minox                     8 × 11 mm                   13.6 mm          15 mm
Half-frame                       24 × 18 mm                  30 mm            30 mm
APS C                            16.7 × 25.1 mm              30.1 mm          28 mm, 30 mm
135, 35mm                        24 × 36 mm                  43.3 mm          40 mm, 50 mm, 55 mm
120/220, 6 × 4.5 (645)           56 × 42 mm                  71.8 mm          75 mm
120/220, 6 × 6                   56 × 56 mm                  79.2 mm          80 mm
120/220, 6 × 7                   56 × 68 mm                  88.1 mm          90 mm
120/220, 6 × 9                   56 × 84 mm                  101.0 mm         105 mm
120/220, 6 × 12                  56 × 112 mm                 125.0 mm         120 mm
large format 4 × 5 sheet film    93 × 118 mm (image area)    150.2 mm         150 mm
large format 5 × 7 sheet film    120 × 170 mm (image area)   208.0 mm         210 mm
large format 8 × 10 sheet film   194 × 245 mm (image area)   312.5 mm         300 mm
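The "normal lens ≈ format diagonal" rule behind the table, and the roughly 53° diagonal angle of view mentioned earlier, can be checked numerically. The format dimensions below are taken from the table; the short sketch is illustrative:

```python
# Verify that the format diagonal matches the table, and that a lens
# whose focal length equals the diagonal sees about 53 degrees diagonally.
import math

formats = {                          # (width mm, height mm) from the table
    "135 (35mm)":        (36.0, 24.0),
    "120/220, 6 x 6":    (56.0, 56.0),
    "4 x 5 sheet film":  (118.0, 93.0),
}

for name, (w_mm, h_mm) in formats.items():
    diagonal = math.hypot(w_mm, h_mm)     # sqrt(w^2 + h^2)
    focal = diagonal                      # "normal" lens: focal length = diagonal
    angle = math.degrees(2 * math.atan(diagonal / (2 * focal)))
    print(f"{name}: diagonal {diagonal:.1f} mm, angle of view {angle:.0f} deg")
```

The computed diagonals (43.3 mm, 79.2 mm, 150.2 mm) agree with the table, and the diagonal angle of view comes out at about 53° regardless of format, since the ratio of diagonal to focal length is fixed.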

Wide-angle lens

In photography and cinematography, a wide-angle lens refers to a lens whose focal length is
substantially smaller than the focal length of a normal lens for a given film plane. This type of
lens allows more of the scene to be included in the photograph, which is useful in architectural,
interior and landscape photography where the photographer may not be able to move farther
from the scene to photograph it.

Another use is where the photographer wishes to emphasise the difference in size or distance
between objects in the foreground and the background; nearby objects appear very large and
objects at a moderate distance appear small and far away.

This exaggeration of relative size can be used to make foreground objects more prominent and
striking, while capturing expansive backgrounds.

A wide angle lens is also one that projects a substantially larger image circle than would be
typical for a standard design lens of the same focal length. This large image circle enables either
large tilt & shift movements with a view camera, or a wide field of view. By convention, in still
photography, the normal lens for a particular format has a focal length approximately equal to
the length of the diagonal of the image frame or digital photo sensor. In cinematography, a lens
of roughly twice the diagonal is considered "normal".


Wide-angle lenses for 35 mm format

For a full-frame 35 mm camera with a 36 mm by 24 mm format, the diagonal measures 43.3 mm,
and by custom, the normal lens adopted by most manufacturers is 50 mm. Also by custom, a lens
of focal length 35 mm or less is considered wide-angle.

Ultra wide angle lenses have a focal length shorter than the short side of the film or sensor. In 35
mm, an ultra-wide-angle lens has a focal length shorter than 24 mm. Common wide-angle focal
lengths for a full-frame 35 mm camera are 35, 28, 24, 21, 20, 18 and 14 mm, the latter four being ultra-wide.
Many of the lenses in this range will produce a more or less rectilinear image at the film plane,
though some degree of barrel distortion is not uncommon here.

Ultra wide-angle lenses that do not produce a rectilinear image (i.e., exhibit barrel distortion) are
called fisheye lenses. Common focal lengths for these in a 35 mm camera are 6 to 8 mm (which
produce a circular image). Lenses with focal lengths of 8 to 16 mm may be either rectilinear or
fisheye designs. Wide-angle lenses come in both fixed-focal-length and zoom varieties. For 35
mm cameras, lenses producing rectilinear images can be found at focal lengths as short as 8 mm,
including zoom lenses with ranges of 2:1 that begin at 12 mm.
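The 35 mm conventions described above reduce to two thresholds: 35 mm or shorter counts as wide-angle, and anything shorter than the 24 mm short side of the frame counts as ultra-wide. A minimal sketch:

```python
# Classify focal lengths by the full-frame 35 mm conventions above.

def classify_35mm(focal_mm):
    if focal_mm < 24:
        return "ultra-wide"
    if focal_mm <= 35:
        return "wide-angle"
    return "normal or longer"

for f in (14, 21, 28, 35, 50):
    print(f"{f} mm: {classify_35mm(f)}")
```

This reproduces the list in the text: 14 mm and 21 mm are ultra-wide, 28 mm and 35 mm are wide-angle, and 50 mm falls into normal territory.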

Macro Lens

The official definition of a macro lens is that it should be able to reproduce a life-sized image of
an object on the recording medium – in this case the image sensor. True macro lenses offer a
magnification factor of 1.0x or 1:1 at its closest focus setting. Fit one of these lenses to a DSLR
like the Canon EOS 60D, and a standard UK postage stamp will fill the whole frame.

That might not sound particularly impressive, but when you consider that 18Mp sensors in
cameras like these enable very large prints, the potential for creating massive enlargements
from shots of tiny objects is really quite astonishing.
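Magnification is image size divided by object size, so at a true macro lens's 1:1 setting the captured field equals the sensor itself. The APS-C sensor dimensions below are the commonly quoted figures for cameras of the 60D class; they are an assumption for illustration, not values given in this handout:

```python
# Field of view at a given macro magnification.
# Sensor size is an assumed APS-C figure (22.3 x 14.9 mm).

SENSOR_W_MM, SENSOR_H_MM = 22.3, 14.9

def field_of_view_mm(magnification):
    """Width and height of the object area that fills the frame."""
    return SENSOR_W_MM / magnification, SENSOR_H_MM / magnification

print(field_of_view_mm(1.0))   # 1:1 — an object the size of the sensor fills the frame
print(field_of_view_mm(0.5))   # 1:2 — an object twice the sensor's size fills the frame
```

At 1:1 the frame covers only about 22 × 15 mm of the subject, which is why a postage stamp fills the whole picture.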

Micro Lens

A microlens is a small lens, generally with a diameter less than a millimetre (mm) and often as
small as 10 micrometres (µm). The small size of the lenses means that a simple design can give
good optical quality, but sometimes unwanted effects arise due to optical diffraction at the small
features.

Tilt shift lenses: perspective control

Tilt shift lenses enable photographers to transcend the normal restrictions of depth of field and
perspective. Many of the optical tricks these lenses permit could not otherwise be reproduced
digitally—making them a must for certain landscape, architectural and product photography. The
first part of this tutorial addresses the shift feature, and focuses on its use in digital SLR
cameras for perspective control and panoramas. The second part focuses on using tilt shift lenses
to control depth of field.

Tilt shift movements

Shift movements enable the photographer to shift the location of the lens's imaging circle relative
to the digital camera sensor. This means that the lens's center of perspective no longer
corresponds to the image's center of perspective, and produces an effect similar to using only a crop
from the side of a correspondingly wider angle lens.

Tilt movements enable the photographer to tilt the plane of sharpest focus so that it no longer lies
perpendicular to the lens axis. This produces a wedge-shaped depth of field whose width
increases further from the camera. The tilt effect therefore does not necessarily increase depth of
field—it just allows the photographer to customize its location to better suit their subject matter.


APERTURE
“Aperture is the size of the opening in the lens when a picture is taken”

Aperture either adds a dimension to a photograph by blurring the background, or magically
brings everything in focus.

Aperture is a hole within a lens, through which light travels into the camera body. It is easier to
understand the concept if you just think about our eyes. Every camera that we know of today is
designed like human eyes. The cornea in our eyes is like the front element of a lens – it gathers
all external light, then bends it and passes it to the iris. Depending on the amount of light, the iris
can either expand or shrink, controlling the size of the pupil, which is a hole that lets the light
pass further into the eye. The pupil is essentially what we refer to as aperture in photography.
The amount of light that enters the retina (which works just like the camera sensor) is limited to
the size of the pupil – the larger the pupil, the more light enters the retina. Aperture is 'the opening
in the lens.' When you hit the shutter release button of your camera, a hole opens up that allows your
camera's image sensor to catch a glimpse of the scene you want to capture. The aperture that you
set determines the size of that hole. The larger the hole, the more light gets in – the smaller the hole,
the less light.

Aperture is measured in 'f-stops',

for example f/2.8, f/4, f/5.6, f/8, f/22, etc. Moving from one f-stop to the next doubles or halves the
size of the opening in your lens (and the amount of light getting through). Keep in mind that a
change in shutter speed from one stop to the next also doubles or halves the amount of light that
gets in – this means if you increase one and decrease the other you let the same amount of light in –
very handy to keep in mind.

One thing that causes a lot of new photographers confusion is that large apertures (where lots of light
gets through) are given smaller f-stop numbers, and smaller apertures (where less light gets through)
have larger f-stop numbers. So f/2.8 is in fact a much larger aperture than f/22. It seems the wrong
way around when you first hear it, but you'll get the hang of it.
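The light a lens admits varies as 1/N² for f-number N, which is why multiplying N by about 1.414 (one standard stop) halves the light. A short sketch, using f/2.8 as an arbitrary reference:

```python
# Relative light admitted at various f-numbers, compared with f/2.8.
# Marked stops are rounded values, so ratios are only approximately 1/2 per stop.

def relative_light(n, reference=2.8):
    """Light through f/n relative to f/reference (proportional to 1/n^2)."""
    return (reference / n) ** 2

for n in (2.8, 4.0, 5.6, 8.0, 22.0):
    print(f"f/{n:g}: {relative_light(n):.3f} x the light of f/2.8")
```

Each step in the familiar sequence f/2.8, f/4, f/5.6, f/8 roughly halves the light (0.49, then 0.25, then about 0.12 of the f/2.8 value), and f/22 admits under 2 percent of it.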


Lens Apertures: Maximum and Minimum

Every lens has a limit on how large or how small the aperture can get. If you take a look at the
specifications of your lens, it should say what the maximum (lowest f-number) and minimum
apertures (highest f-number) of your lens are. The maximum aperture of the lens is much more
important than the minimum, because it shows the speed of the lens. A lens that has an aperture
of f/1.2 or f/1.4 as the maximum aperture is considered to be a fast lens, because it can pass
through more light than, for example, a lens with a maximum aperture of f/4.0. That's why
lenses with large apertures are better suited for low light photography.

The minimum aperture is not that important, because almost all modern lenses can provide at
least f/16 as the minimum aperture, which is typically more than enough for everyday
photography needs.

This 50mm lens has a max. aperture of f/1.4


There are two types of lenses: “fixed” (also known as “prime”) and “zoom”. While zoom lenses
give you the flexibility to zoom in and out (most point and shoot cameras have zoom lenses)
without having to move closer or away from the subject, fixed or prime lenses only have one
focal length. Due to the complexity of optical design for zoom lenses, many of the consumer
lenses have variable apertures. What this means is that when you are fully zoomed out, the
aperture is one number, while zooming in increases the f-number. For
example, the Nikon 18-200mm lens has a variable maximum aperture of f/3.5-f/5.6. When
zoomed fully out at 18mm, the lens has an aperture of f/3.5, while when fully zoomed in at
200mm, the lens has an aperture of f/5.6. The heavy, professional zoom lenses, on the other
hand, typically have fixed apertures. For example, the Nikon 70-200mm f/2.8 lens has the same
maximum aperture of f/2.8 at all focal lengths between 70mm and 200mm.
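The physical opening behind these numbers is the entrance pupil, whose diameter is roughly the focal length divided by the f-number. Applying that to the two zooms named above shows why a fixed-aperture professional lens is so much bigger and heavier:

```python
# Approximate entrance-pupil diameter: focal length / f-number.

def pupil_diameter_mm(focal_mm, f_number):
    return focal_mm / f_number

print(f"18mm at f/3.5:  {pupil_diameter_mm(18, 3.5):.1f} mm")   # variable zoom, wide end
print(f"200mm at f/5.6: {pupil_diameter_mm(200, 5.6):.1f} mm")  # variable zoom, long end
print(f"200mm at f/2.8: {pupil_diameter_mm(200, 2.8):.1f} mm")  # fixed f/2.8 pro zoom
```

Holding f/2.8 at 200 mm requires an opening of roughly 71 mm, about double the diameter (and far more glass) than the consumer zoom needs at f/5.6.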


Why is this important? Because larger maximum aperture means that the lens can pass through
more light, and hence, your camera can capture images faster in low-light situations. Having a
larger maximum aperture also means better ability to isolate subjects from the background.

Virtual University of Pakistan


Camera Basics, Principles and Practices Course Code MCD 401

APERTURE
“Aperture is the size of the opening in the lens when a picture is taken”

Aperture either adds a dimension to a photograph by blurring the background, or magically


brings everything in focus.

Aperture is a hole within a lens, through which light travels into the camera body. It is easier to
understand the concept if you just think about our eyes. Every camera that we know of today is
designed like human eyes. The cornea in our eyes is like the front element of a lens – it gathers
all external light, then bends it and passes it to the iris. Depending on the amount of light, the iris
can either expand or shrink, controlling the size of the pupil, which is a hole that lets the light
pass further into the eye. The pupil is essentially what we refer to as aperture in photography.
The amount of light that enters the retina (which works just like the camera sensor), is limited to
the size of the pupil – the larger the pupil, the more light enters the retina. Aperture is „the opening
in the lens.‟ When you hit the shutter release button of your camera a hole opens up that allows your
cameras image sensor to catch a glimpse of the scene you‟re wanting to capture. The aperture that you
set impacts the size of that hole. The larger the hole the more light that gets in – the smaller the hole
the less light.

Aperture is measured in “f-stops” – for example f/2.8, f/4, f/5.6, f/8, f/22 etc. Moving from one f-stop to the next doubles or halves the size of the opening in your lens (and the amount of light getting through). Keep in mind that a change in shutter speed from one stop to the next also doubles or halves the amount of light that gets in – this means if you increase one and decrease the other, you let the same amount of light in (very handy to keep in mind).

One thing that causes a lot of confusion for new photographers is that large apertures (where lots of light gets through) are given smaller f-stop numbers, and smaller apertures (where less light gets through) have larger f-stop numbers. So f/2.8 is in fact a much larger aperture than f/22. It seems the wrong way around when you first hear it, but you'll get the hang of it.
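This doubling/halving relationship follows from the fact that light transmitted is proportional to the aperture area, i.e. to 1/N². As a rough sketch (the function name is ours, chosen for illustration), the number of stops between two f-numbers can be computed like this:

```python
import math

def stops_between(f1, f2):
    """Exposure stops between two f-numbers.

    Light is proportional to the aperture area, i.e. to 1/N^2,
    so each full stop multiplies the f-number by sqrt(2).
    """
    return 2 * math.log2(f2 / f1)

# The standard full-stop scale: each step halves the light.
scale = [1.4, 2, 2.8, 4, 5.6, 8, 11, 16, 22]
print(round(stops_between(2.8, 22)))  # stopping down from f/2.8 to f/22 is 6 stops
```

Note that adjacent marked f-numbers (f/4 to f/5.6, say) come out very close to exactly one stop apart; the small discrepancy is because the marked values are rounded.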


Lens Apertures: Maximum and Minimum

Every lens has a limit on how large or how small the aperture can get. If you take a look at the
specifications of your lens, it should say what the maximum (lowest f-number) and minimum
apertures (highest f-number) of your lens are. The maximum aperture of the lens is much more
important than the minimum, because it shows the speed of the lens. A lens that has an aperture
of f/1.2 or f/1.4 as the maximum aperture is considered to be a fast lens, because it can pass
through more light than, for example, a lens with a maximum aperture of f/4.0. That's why
lenses with large apertures are better suited for low light photography.

The minimum aperture is not that important, because almost all modern lenses can provide at
least f/16 as the minimum aperture, which is typically more than enough for everyday
photography needs.

This 50mm lens has a max. aperture of f/1.4


There are two types of lenses: “fixed” (also known as “prime”) and “zoom”. While zoom lenses
give you the flexibility to zoom in and out (most point and shoot cameras have zoom lenses)
without having to move closer or away from the subject, fixed or prime lenses only have one
focal length. Due to the complexity of optical design for zoom lenses, many of the consumer
lenses have variable apertures. What it means, is that when you are fully zoomed out, the
aperture is one number, while zooming in will increase the f-number to a higher number. For
example, the Nikon 18-200mm lens has a variable maximum aperture of f/3.5-f/5.6. When
zoomed fully out at 18mm, the lens has an aperture of f/3.5, while when fully zoomed in at
200mm, the lens has an aperture of f/5.6. The heavy, professional zoom lenses, on the other
hand, typically have fixed apertures. For example, the Nikon 70-200mm f/2.8 lens has the same
maximum aperture of f/2.8 at all focal lengths between 70mm and 200mm.


Why is this important? Because larger maximum aperture means that the lens can pass through
more light, and hence, your camera can capture images faster in low-light situations. Having a
larger maximum aperture also means better ability to isolate subjects from the background.


SHUTTER SPEED

“Shutter Speed is the amount of time that the shutter is open”

Shutter Speed is one of the three pillars of photography, the other two being ISO and
Aperture. Shutter speed is where the other side of the magic happens – it is responsible for
creating dramatic effects by either freezing action or blurring motion.

What is a Camera Shutter?

A camera shutter is a curtain in front of the camera sensor that stays closed until the camera fires.
When the camera fires, the shutter opens and fully exposes the camera sensor to the light that
passes through the lens aperture. After the sensor is done collecting the light, the shutter closes
immediately, stopping the light from hitting the sensor. The button that fires the camera is also
called “shutter” or “shutter button”, because it triggers the shutter to open and close.

Introduction to Shutter Speed in Digital Photography

The three main areas that you can adjust are ISO, Aperture and Shutter speed.

In digital photography, shutter speed is the length of time that your image sensor “sees” the scene you're attempting to capture.

Shutter speed, also known as “exposure time”, stands for the length of time a camera shutter is open, exposing the camera sensor to light. If the shutter speed is fast, it can help to freeze action completely, as seen in the photo of the dolphin above. If the shutter speed is slow, it can create an effect called “motion blur”, where moving objects appear blurred along the direction of the motion. This effect is used quite a bit in advertisements of cars and motorbikes, where a sense of speed and motion is communicated to the viewer by intentionally blurring the moving wheels.

In short, high shutter speeds freeze action, while low shutter speeds create an effect of motion.

How shutter speed is measured

Shutter speeds are typically measured in fractions of a second, when they are under a second. For
example 1/4 means a quarter of a second, while 1/250 means one two-hundred-and-fiftieth of a
second or four milliseconds. Most modern DSLRs can handle shutter speeds of up to 1/4000th of
a second, while some can handle much higher speeds of 1/8000th of a second and faster. The
longest shutter speed on most DSLRs is typically 30 seconds (without using external remote
triggers).
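These fractional speeds are easier to compare when converted to absolute times. A trivial sketch (the function name is ours):

```python
def shutter_ms(denominator):
    """Exposure time in milliseconds for a shutter speed of 1/denominator seconds."""
    return 1000.0 / denominator

print(shutter_ms(250))   # 1/250s -> 4.0 ms, matching the figure above
print(shutter_ms(4000))  # 1/4000s -> 0.25 ms, a typical DSLR's fastest speed
```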

Fast, slow and long shutter speeds

Fast shutter speed is typically whatever it takes to freeze action. For me, it is typically above
1/500th of a second for general photography and above 1/1000th of a second for bird
photography.

Slow shutter speed is considered to be the slowest shutter speed that you can handle without
introducing camera shake. Some of the newer Nikon lenses such as the Nikon 70-200mm VR
II have special “vibration reduction” technologies within the lens that can handle shutter speeds
of up to 1/10th of a second (depending on the photographer's technique), hand-held!

Let me attempt to break down the topic of “Shutter Speed” into some bite sized pieces that should
help digital camera owners trying to get their head around shutter speed:

• Shutter speed is measured in seconds – or in most cases fractions of seconds. The bigger the denominator, the faster the speed (i.e. 1/1000 is much faster than 1/30).

• In most cases you'll probably be using shutter speeds of 1/60th of a second or faster. This is because anything slower than this is very difficult to use without getting camera shake. Camera shake is when your camera is moving while the shutter is open, and it results in blur in your photos.

Virtual University of Pakistan


Camera Basics, Principles and Practices Course Code MCD 401

• If you're using a slow shutter speed (anything slower than 1/60) you will need to either use a tripod or some type of image stabilization (more and more cameras come with this built in).

• Shutter speeds available to you on your camera will usually double (approximately) with each setting. As a result you'll usually have options for the following shutter speeds – 1/500, 1/250, 1/125, 1/60, 1/30, 1/15, 1/8 etc. This “doubling” is handy to keep in mind, as aperture settings also double the amount of light that is let in – as a result, increasing shutter speed by one stop and decreasing aperture by one stop should give you similar exposure levels.

• Some cameras also give you the option for very slow shutter speeds that are not fractions of seconds but are measured in seconds (for example 1 second, 10 seconds, 30 seconds etc). These are used in very low light situations, when you're going after special effects and/or when you're trying to capture a lot of movement in a shot. Some cameras also give you the option to shoot in “B” (or “Bulb”) mode. Bulb mode lets you keep the shutter open for as long as you hold it down.

• When considering what shutter speed to use in an image you should always ask yourself whether anything in your scene is moving and how you'd like to capture that movement. If there is movement in your scene you have the choice of either freezing the movement (so it looks still) or letting the moving object intentionally blur (giving it a sense of movement).

• To freeze movement in an image (like in the surfing shot above) you'll want to choose a faster shutter speed, and to let the movement blur you'll want to choose a slower shutter speed. The actual speeds you should choose will vary depending upon the speed of the subject in your shot and how much you want it to be blurred.

• Motion is not always bad. I spoke to one digital camera owner last week who told me that he always used fast shutter speeds and couldn't understand why anyone would want motion in their images. There are times when motion is good. For example, when you're taking a photo of a waterfall and want to show how fast the water is flowing, or when you're taking a shot of a racing car and want to give it a feeling of speed, or when you're taking a shot of a starscape and want to show how the stars move over a longer period of time. In all of these instances choosing a longer shutter speed will be the way to go. However, in all of these cases you need to use a tripod or you'll run the risk of ruining the shots by adding camera movement (a different type of blur than motion blur).
• Focal Length and Shutter Speed – another thing to consider when choosing shutter speed is the focal length of the lens you're using. Longer focal lengths will accentuate the amount of camera shake you have, so you'll need to choose a faster shutter speed (unless you have image stabilization in your lens or camera). The “rule” of thumb to use with focal length (in non-image-stabilized situations) is to choose a shutter speed with a denominator that is larger than the focal length of the lens. For example, if you have a 50mm lens, 1/60th is probably OK, but if you have a 200mm lens you'll probably want to shoot at around 1/250.
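The reciprocal rule above can be sketched in a few lines of Python (the function name and the stabilization parameter are our illustrative assumptions; this is a guideline, not a law – technique and subject motion matter too):

```python
def min_handheld_shutter_denominator(focal_length_mm, stabilization_stops=0):
    """Reciprocal rule: the shutter-speed denominator should be at least
    the focal length in mm; each stop of image stabilization halves it."""
    return focal_length_mm / (2 ** stabilization_stops)

print(f"50mm lens:  1/{min_handheld_shutter_denominator(50):g}s or faster")
print(f"200mm lens: 1/{min_handheld_shutter_denominator(200):g}s or faster")
```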


DEPTH OF FIELD

Depth of Field (DOF) is the amount of your shot that will be in focus. Depth of field refers to the range of distance that appears acceptably sharp. It varies depending on camera type, aperture and focusing distance, although print size and viewing distance can also influence our perception of depth of field.

Large depth of field means that most of your image will be in focus whether it’s close to your camera
or far away (like the picture to the left where both the foreground and background are largely in focus
– taken with an aperture of f/22).

Small (or shallow) depth of field means that only part of the image will be in focus and the rest will be fuzzy (like in the flower shot at the top of this post). You'll see in it that the tips of the yellow stems are in focus, but the petals, even though they are only 1cm or so behind them, are out of focus. This is a very shallow depth of field, and the shot was taken with an aperture of f/4.5.

What is Depth of Field with relevance to aperture?

Aperture has a big impact upon depth of field. One important thing to remember here: the size of the aperture has a direct impact on the depth of field, which is the area of the image that appears sharp. A large aperture (remember, a smaller f-number) will decrease depth of field, while a small aperture (larger f-numbers) will give you a larger depth of field.

The size of the circle represents the size of the lens aperture – the larger the f-number, the
smaller the aperture. A large f-number such as f/32, (which means a smaller aperture) will bring
all foreground and background objects in focus, while a small f-number such as f/1.4 will isolate
the foreground from the background by making the foreground objects sharp and the
background blurry.
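How much the in-focus zone shrinks as the aperture opens up can be estimated with the standard thin-lens depth-of-field formulas. The sketch below is illustrative only – the function name and the 0.03 mm circle-of-confusion value (a common full-frame convention) are our assumptions, not from this handout:

```python
def dof_limits(focal_mm, f_number, subject_m, coc_mm=0.03):
    """Approximate near/far limits of acceptable sharpness, in metres,
    using the thin-lens hyperfocal-distance model."""
    f = focal_mm
    s = subject_m * 1000                      # work in millimetres
    H = f * f / (f_number * coc_mm) + f       # hyperfocal distance
    near = H * s / (H + (s - f))
    far = H * s / (H - (s - f)) if s < H else float("inf")
    return near / 1000, far / 1000            # back to metres

# 50mm lens, subject at 3m: opening up from f/8 to f/2 shrinks the zone
print(dof_limits(50, 8, 3))  # roughly (2.34, 4.18)
print(dof_limits(50, 2, 3))  # roughly (2.80, 3.23) -- much shallower
```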


Image on left shot at f/2.8, Image on right shot at f/8.0


As you can see, just changing the aperture from f/2.8 to f/8.0 has a big effect on how much of
WALL-E is in focus and how visible the background gets. If I had used a much smaller aperture
such as f/32 in this shot, the background would be as visible as WALL-E. Another example:

Mailboxes - Aperture set to f/2.8


In the above example, due to the shallow depth of field, only the word “Cougar” appears sharp, while everything else in front of and behind that word is blurred.

You now know that focus is related to the distance between the subject and the lens. There will be one focus setting which is the best for a subject at any particular distance. But it's also true that on either side of this point (both closer and farther away) there's a certain range of distance within which focus is still acceptable. This range, from front to back, is known as the depth of field.
Lens Angle Affects Depth of Field
Wider lens angles give a greater depth of field. This means that when the camera is zoomed
out all the way, your subject will be able to move forward and backward across a considerable
range and still be in focus.


Narrower lens angles (especially telephoto) give a smaller depth of field. As you zoom in, the
acceptable focus range for your subject will decrease. When you’re zoomed in all the way on a
close-up shot, the depth of field will be smallest.

Iris Setting Also Affects Depth of Field


The wider the aperture (the more open the iris) the smaller the depth of field. This means focus
will be more problematic in low light conditions where the iris will need to be opened wide.
You’ll find that your subject won’t be able to move forward or backward very far without going
out of focus.

Telephoto Lens and Wide Iris Combined


The combination of telephoto lens (zoomed in all the way) and a wide aperture (big iris opening) gives you the smallest depth of field of all.

This is the hardest situation for shooting action over which you have no directorial control, because very small movements forward or backward will cause focus difficulties.

For example, if you're taping a singer in low light at a night-time outdoor concert, and you have the lens in telephoto to give you a close-up of her face on the screen, you'll find that if she sways only slightly forward or back with the feel of the music, she'll go in and out of focus. There's not much you can do. Your work will look awful and people won't be able to understand why you didn't just focus the camera.

To retrieve the situation, you can stay on a wider angle shot, and then move your camera in closer to the stage when that song ends. But it's times like this that make you wonder how you got into video in the first place.

For better depth of field in low-light conditions, you should try to either get in close to your subject so you can stay on the wide-angle lens, or add lights, so you can use a smaller aperture.

If I had used a larger aperture such as f/1.4 and focused on one of the letters, probably only that letter would have been sharp, while everything else would have been blurred out. The larger the aperture, the smaller the area in focus (depth of field).

Aperture Priority/ Shutter Priority

Some cameras are automated to the extent that you can decide what function you want most and set that one, and the camera will adjust all the other function settings accordingly. For example, if your main concern is to freeze the motion in the image because you're a physiotherapist or sports teacher and you're doing motion analysis, you can tell the camera to prioritise the high-speed shutter setting. The camera will then adjust the iris and other functions to suit the high-speed shutter setting you've selected. On the other hand, if you're mainly concerned with getting the greatest depth of field because you're videoing dancers who will be coming forward and backward in the frame and you need them to always be in focus, you can tell the camera to prioritise the aperture setting, and the camera will make all the adjustments needed to the other functions.


Exposure
“In photography, exposure is the amount of light per unit area (the image plane illuminance
times the exposure time) reaching a photographic film or electronic image sensor, as
determined by shutter speed, lens aperture and scene luminance.”
To be able to do any of the things that you wanted to do when you bought your digital camera,
you need to understand exposure. While you are able to take some decent pictures right out of
the box, once you have an understanding of exposure, you will find the pictures that you produce
surpass the questionable title of 'snapshots' and become photographs and memories.

Understand what "exposure of the image" is and how will it affect your photographs.

Exposure is an umbrella term that refers to two aspects of photography – it refers to how you control the lightness and the darkness of the image.

 The exposure is controlled by the camera's light meter. The light meter determines what the proper exposure is; it sets both the f-stop and shutter speed. The f-stop is a fraction; the f represents the focal length. The f-stop is determined by dividing the focal length by the aperture diameter. f/2.8 would be 1/2.8 versus f/16, which would be 1/16. If you look at it like slices of a pie, you would get a lot more pie with 1/2.8 than you would with 1/16.
 This can be very unnerving, but the camera balances f-stops and shutter speeds on every picture to get the light right – the lightness, the darkness and the exposure.
 A good way to understand it is to "think of a bucket of water with a hole in the bottom. If you
have a large hole in the bottom of the bucket (large aperture), water will drain out quickly (fast
shutter speed). Conversely, for the same amount of water, if you have a small hole in the bottom
of the bucket (small aperture), the water will drain out slowly (slow shutter speed)." [1]
 Exposure or lightness and darkness in the picture is a combination of the f-stop, which is the size
of the hole in the lens, and the shutter speed, which is the length of time that the shutter is open.
So, if you leave the shutter open longer, you're getting more light to the film or more light to the
digital sensor, and the picture gets brighter, or lighter. If you shorten the exposure (give less light
to the film or to the digital sensor), the exposure gets darker. Longer shutter speed: more
exposure, more light; shorter shutter speed: less exposure, less light.
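The trade-off described above – open the aperture one stop, halve the exposure time, same light – can be checked numerically with the textbook exposure-value formula EV = log2(N²/t). The code is an illustrative sketch; the function name is ours:

```python
import math

def exposure_value(f_number, shutter_s):
    """EV = log2(N^2 / t): settings with equal EV pass equal light."""
    return math.log2(f_number ** 2 / shutter_s)

# Opening the aperture one stop while halving the exposure time
# leaves EV (and hence the exposure) essentially unchanged:
print(exposure_value(5.6, 1/125))  # ~11.94
print(exposure_value(4, 1/250))    # ~11.97 (f/5.6 is a rounded value)
```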

Learn about the "f-stop". "F-stop" (also called "f-number") means fraction: the f-number expresses the actual opening in the lens as a fraction of the focal length of the lens. The aperture is the opening light passes through.

Try this example. Suppose that you have a lens with a focal length of 50mm and the f-number is f/1.8. The f-number is determined by focal length/aperture, so 50/x = 1.8, or x ≈ 28. The actual diameter where the light comes through the lens is 28mm across. If that lens had an f-stop of 1, for example, the aperture would be 50mm, because 50/1 = 50. That's what the f-stop actually means.
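The arithmetic in that example is just one division, which a short sketch makes explicit (the function name is ours, chosen for illustration):

```python
def aperture_diameter_mm(focal_length_mm, f_number):
    """Physical aperture diameter implied by an f-number:
    N = focal length / diameter, so diameter = focal length / N."""
    return focal_length_mm / f_number

print(round(aperture_diameter_mm(50, 1.8)))  # ~28 mm, as worked out above
print(aperture_diameter_mm(50, 1))           # 50.0 mm at f/1
```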

Study your digital camera's "manual exposure" mode. In the manual mode you can set both the f-stop and shutter speed. If you really want to control the light, the exposure, and how the picture works, you need to learn how to use the manual exposure mode; it's not just for the propeller heads and the guys who still shoot film! Manual mode is still viable today even with digital, because it's really how you control the look and feel of your picture.

Understand why you would want to change the exposure. The aperture is really important to
control the picture; it lets in the light, and the light is the most important thing for your picture.
Without light, you won't have a picture.

 Set the aperture to control both the light and the amount that is in focus, in other words, the depth
of field.
 Set a wide opening, like f/2 or 2.8, to blur the background and have your subject razor sharp.
Also, you'll probably want to use the largest aperture when shooting in low-light, in order to
prevent blur.
 Shoot a medium aperture, 5.6 or 8 so the subject is sharp and background is slightly out of focus
but still recognizable.
 Shoot at smaller apertures, like f/11 and possibly smaller, for a landscape picture when you want
the flowers in the foreground, the river, and the mountains all in focus. Depending on your
format, tiny apertures like f/16 and smaller will cause you to lose sharpness due to diffraction
effects.
 For many photographers, the aperture is far more important to achieving great pictures than the
shutter speed, because it controls the depth of field of the picture, whereas it's more difficult to
tell if a picture was shot at 1/250 or 1/1000 of a second.
Understand why you would want to change the ISO. You change the ISO on your digital
camera to control the camera's sensitivity to light. In bright light, we set the camera to be less
sensitive, to give us a picture with less noise since the shutter speed is fast enough at 100 ISO. In
low light where there's less ambient light, you need more sensitivity in the camera. Therefore,
raise the ISO from 100 to possibly 1600 or even 6400 if you have to, to get enough light in so
that the picture isn't blurry. Now, what's the payback? As you raise the ISO, you get more noise (the film equivalent being grain) in the picture and less color, so be sure to set the ISO as low as possible – but not so low that you end up with blurry pictures.
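Because exposure scales linearly with both ISO and shutter time (aperture unchanged), the ISO needed to reach a faster, blur-free shutter speed can be estimated directly. A minimal sketch, assuming a hypothetical function name of our own:

```python
def iso_for_target_shutter(metered_iso, metered_shutter_s, target_shutter_s):
    """ISO needed to keep the same exposure at a different shutter speed
    (aperture unchanged): exposure is proportional to ISO x time."""
    return metered_iso * (metered_shutter_s / target_shutter_s)

# Meter says 1/15s at ISO 100; to shoot handheld at 1/60s:
print(round(iso_for_target_shutter(100, 1/15, 1/60)))  # -> 400
```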


Determine what ISO is required for your shot. The ISO on your digital camera is just like it is
on film. You used to buy the film by the kind of light you were using. Today, you set the ISO on
your camera depending on the light.

 How do you set it? On some cameras there's a button right on the top of the camera that says
ISO. You press the button, turn the dial, and change it.
 Some cameras you have to go into the menu and find the ISO setting. Click on the ISO setting
and turn the dial and change it. That's how you set the ISO on your digital camera.
Stop action by changing the shutter speed on your camera. Change the shutter speed on your camera to affect its action-stopping ability. If you're shooting a picture with your camera hand-held, you will need a shutter speed that is as fast as or faster than the reciprocal of your focal length. In other words, if you were shooting on a 100mm lens, a shutter speed of 1/100 of a second would be optimal. Camera blur can be eliminated at these speeds.

If you are shooting moving subjects, change your shutter speed to a shutter speed that
ranges from 1/500 to 1/1000 to stop the moving subjects.

If shooting pictures in low light, where you need more light to come in through the shutter,
set the shutter speed to a thirtieth or a fifteenth of a second. When you do this, the action is
going to blur, so use thirty or fifteen when there's low light or when you want the action to blur.
 Medium shutter speed: 125 or 250 for most pictures.
 Fast shutter speed: 500 or 1000 for action.
 Thirtieth or a fifteenth of a second to blur action or under low light.
Learn how to change the shutter speed on your digital camera. You might have the option of
a dial, a button on your camera, or you may have to do it in-camera.

Always err on the side of underexposure. Of course, it goes without saying that you want
fantastic exposure, but if you can't get it quite right, err on the side of underexposure (let your
scene be a little dark). When a picture is over-exposed, all of the information is lost and cannot
be recovered. With underexposed pictures, you have a greater chance of recovering the picture
through post-processing. You can set your camera to underexpose by using EV compensation
(exposure value compensation).

Learn your camera's "program mode". The exposure modes on your camera allow you to control how you adjust the picture. The basic mode is the “P” mode (program mode); it allows you to manipulate either the shutter speed or the aperture setting, and it will adjust the other value accordingly so that the picture is exposed perfectly according to the light meter. The advantage of program mode is that you don't need to know anything. It's just a little bit above the green auto or “idiot proof” mode.


Get familiar with the "aperture priority" mode. On your digital camera you have the choice of “A-mode” or aperture priority. In the aperture priority mode (it's a way to determine the exposure), you, the photographer, pick the aperture or f-stop. The camera will choose the shutter speed for you. Aperture priority could be considered the more useful of the modes. So, you select the f-stop, whether it's f/2.8 to blur the background, f/8 for moderate depth of field, or f/16 to have everything in focus.

Investigate your camera's "shutter priority" mode. Have at least some familiarity with the shutter priority mode of your camera. The advantage of shutter priority is that you set the shutter speed that's most convenient or most comfortable to use. Then the camera will pick the other number, the f-stop. On your camera, shutter priority can be either S or TV mode, depending on your camera.
 In shutter priority mode, pick the shutter speed and the camera sets the f-stop.
 When in shutter priority, the camera will take the picture at the selected shutter speed regardless
of whether or not the picture will be exposed correctly.



FILM SPEED/ EXPOSURE METER

1. Film Speed

“Film speed is the measure of a photographic film's sensitivity to light, determined by sensitometry
and measured on various numerical scales, the most recent being the ISO system. A closely related
ISO system is used to measure the sensitivity of digital imaging systems.”

How important is film speed?

Many people today love using a digital camera to take pictures, but others still prefer the old-
school charm and control of traditional film. When we talk about film speed, we're referring to
the measure of a film's sensitivity to light. Each film speed is best suited for a different type of
photography.

The lower the speed, the longer an exposure to light is necessary to produce image density. If the
film speed is higher, it requires less exposure but generally has reduced quality in the form of
grain and noise. Noise and grain are the abnormalities in brightness and color in images; they
look similar to a layer of "snow" on a television set. They're measured using the ISO system from the International Organization for Standardization (thus "ISO", which is used as an abbreviation for both the group and the film speed) and are the giant numbers you'll typically see on a box of film. You'll also see the abbreviation ASA (American Standards Association) used in conjunction with film speed. ASA and ISO ratings are interchangeable.

The rating still applies to digital photography even though the cameras don't use film. ISO speed
is used in digital cameras to judge the relationship between the exposure rating and the sensor
data values. Most advanced cameras have an ISO setting available, which emulates the speed
rating of film. The basic rules of film speed apply equally to film and digital cameras.

Slow-speed films generally refer to film with 100-200 ISO ratings. These slower speeds are
excellent for outdoor landscape photography and inanimate objects. They can also be a great
choice if it's a particularly sunny day. Since the film takes longer to absorb light, it captures
detail more effectively. So if you plan on enlarging those pictures you'll want to shoot with the
lowest ISO possible.

Medium speed is 400 ISO. As can be expected, the medium speed is probably the best for
general-purpose use and can handle indoor lighting conditions, overcast days and any
combination of the two. Even so, it's not suited for action shots or very bright days.

Fast-speed film is usually rated at 800 ISO and above. It's best for moving subjects you might
see at a sporting event or concert, or when you plan on using a zoom lens or are shooting in a
dimly lit area. Unfortunately, if you plan on enlarging the photos, they'll likely turn out grainy.

Film speed is remarkably important and can make or break a photograph. There are exceptions to the above rules, and experimenting can certainly yield impressive and interesting results, but the fact remains that the film speed you choose will have a direct effect on the quality and density of the picture you take, regardless of whether you're shooting digital or on film.
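The slow/medium/fast groupings above can be summarised in a small sketch (the function name and the exact boundaries are our illustrative conventions, not hard rules):

```python
def film_speed_class(iso):
    """Rough speed classes used in the text above."""
    if iso < 400:
        return "slow"    # 100-200: landscapes, bright sun, big enlargements
    elif iso < 800:
        return "medium"  # 400: general purpose, overcast days, indoors
    else:
        return "fast"    # 800+: action, dim light, zoom lenses

print(film_speed_class(100), film_speed_class(400), film_speed_class(1600))
```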
Camera Basics, Principles and Practices Course Code MCD 401

FILM SPEED/ EXPOSURE METER

1. Film Speed

“Film speed is the measure of a photographic film's sensitivity to light, determined by sensitometry
and measured on various numerical scales, the most recent being the ISO system. A closely related
ISO system is used to measure the sensitivity of digital imaging systems.”

How important is film speed?

Many people today love using a digital camera to take pictures, but others still prefer the old-
school charm and control of traditional film. When we talk about film speed, we're referring to
the measure of a film's sensitivity to light. Each film speed is best suited for a different type of
photography.

The lower the speed, the longer an exposure to light is necessary to produce image density. If the
film speed is higher, it requires less exposure but generally has reduced quality in the form of
grain and noise. Noise and grain are the abnormalities in brightness and color in images; they
look similar to a layer of "snow" on a television set. They're measured using the ISO system
from the International Organization for Standardization (thus the ISO, which is used as an
abbreviation for the group and the film speed) and are the giant numbers you'll typically see on a
box of film. You'll also see the abbreviation ASA (American Standard Association) used in
conjunction with film speed. ASA and ISA are interchangeable.

The rating still applies to digital photography even though the cameras don't use film. ISO speed
is used in digital cameras to judge the relationship between the exposure rating and the sensor
data values. Most advanced cameras have an ISO setting available, which emulates the speed
rating of film. The basic rules of film speed apply equally to film and digital cameras.

Slow-speed films generally refer to film with 100-200 ISO ratings. These slower speeds are
excellent for outdoor landscape photography and inanimate objects. They can also be a great
choice if it's a particularly sunny day. Since the film takes longer to absorb light, it captures
detail more effectively. So if you plan on enlarging those pictures you'll want to shoot with the
lowest ISO possible.

Medium speed is 400 ISO. As can be expected, the medium speed is probably the best for
general-purpose use and can handle indoor lighting conditions, overcast days and any
combination of the two. Even so, it's not suited for action shots or very bright days.

Fast-speed film is usually rated at 800 ISO and above. It's best for moving subjects you might
see at a sporting event or concert, or when you plan on using a zoom lens or are shooting in a
dimly lit area. Unfortunately, if you plan on enlarging the photos, they'll likely turn out grainy


Film speed is remarkably important and can make or break a photograph. There are exceptions
to the above rules, and experimenting can certainly yield impressive and interesting results, but
the fact remains that the film speed you choose will have a direct effect on the quality and
density of the picture you take, regardless of whether you're shooting digital or on film.
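That trade-off can be sketched with a line of arithmetic. The function below is purely illustrative, not part of any camera software; it just encodes the reciprocal relationship between film speed and the exposure needed for the same image density.

```python
def exposure_time_for_iso(base_time, base_iso, new_iso):
    """Required exposure time scales inversely with film speed:
    doubling the ISO halves the exposure needed for the same density."""
    return base_time * (base_iso / new_iso)

# A scene needing 1/100 s at ISO 100 needs only 1/400 s at ISO 400.
print(exposure_time_for_iso(1/100, 100, 400))  # 0.0025
```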

2. Exposure meter and gray card

Exposure Meter:

In digital photography, an exposure meter is an instrument for measuring the amount of light falling on or being reflected by a subject, usually equipped to convert this measurement into usable information, such as the shutter speed and aperture size required to take a well-exposed photograph.

Link: http://www.webopedia.com/TERM/E/exposure_meter.html

Gray Card:

A gray card is a middle gray reference, typically used together with a reflective light meter, as a
way to produce consistent image exposure and/or color in film and photography. A gray card is a
flat object of a neutral gray color that derives from a flat reflectance spectrum.

Link: https://www.google.com.pk/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-
8#q=gray+card

3. Types of light meters

Choosing Hand Held Exposure Meters

Hand-held exposure meters measure light falling onto a light-sensitive cell and convert it into a reading that enables the correct shutter speed and/or lens aperture settings to be made. Hand-held
exposure meters come in many variations, each with specific benefits. By using the appropriate
meter for your specific needs, you can be assured of consistent professional results.

Incident vs. Reflected

There are two distinct techniques of measuring light:

1. Incident


2. Reflected

Each has its own advantages in different situations. Hand-held meters can give you both capabilities, along with features not found in even the most advanced cameras with built-in meters.

The 18% Neutral Gray Standard

Light meters are designed to measure light in a consistent way. They cannot see the subject and
interpret it as a photographer can. For example, a light meter cannot distinguish a black cat from
a white cat, a red balloon from a blue balloon, nor textured powdery white snow from a shiny
white auto paint finish. Given the same lighting situation, each of these objects would reflect a
different amount of light.

Reflected measurements would indicate different exposures for each object. Incident
measurements would indicate the same exposure for each object, to render a consistent exposure.
Light meters are calibrated to assume that all subjects are of average 18% reflectance, or neutral
gray. The use of the 18% neutral gray standard allows a reflected light meter to render correct
readings for “average” subjects in “average” lighting situations. (The value of 18% neutral gray
is also referred to as Zone V in the Zone System, an advanced black and white exposure
method.)
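As a rough sketch of what that calibration implies: a reflected reading of a single subject lands away from the incident (correct) exposure by the base-2 logarithm of the subject's reflectance over 18%. The reflectance values below are illustrative guesses, not measured figures.

```python
import math

MIDDLE_GRAY = 0.18  # reflected meters are calibrated to 18% reflectance

def reflected_meter_error_stops(reflectance):
    """How far (in stops) a reflected-only reading lands from the
    incident (correct) exposure for a subject of this reflectance.
    Positive means the meter underexposes the subject."""
    return math.log2(reflectance / MIDDLE_GRAY)

print(round(reflected_meter_error_stops(0.18), 1))   # 0.0  (gray card: spot on)
print(round(reflected_meter_error_stops(0.90), 1))   # 2.3  (white cat: over 2 stops under)
print(round(reflected_meter_error_stops(0.045), 1))  # -2.0 (black cat: 2 stops over)
```

This is why the black cat and the white cat both come out mid-gray from an uncorrected reflected reading, while an incident reading renders each at its true tone.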

Incident Metering

The incident meter is aimed at the light source and measures the light falling directly on a scene, and is not influenced by the reflectance of the subject being photographed. For more
precise control of the photograph, incident meters are also used to measure various levels of light
from multiple sources falling on separate parts of a scene.

Using Incident Meters

Incident metering measures the intensity of light falling on the subject and gives accurate and
consistent rendition of the tonality and contrast regardless of reflectance, background, color, and
shape. Subjects that appear lighter than gray will appear lighter. Subjects that are darker than
gray will appear darker. Colors will be rendered accurately. Highlight and shadow areas will fall
naturally into place.

NOTE: Most light meters allow for both reflected and incident light readings.

Advantages of Incident Measurement

Incident meters measure accurately and consistently and are not affected by variances in
reflectance of the subject or scene. Because of this, incident meters give the most accurate
exposure for the majority of situations and subjects.


Reflected Metering

Reflected metering reads the intensity of light reflecting off the subject and may vary according
to variances in tonality, color, contrast, background, surface, or shape. Meters are designed to
regard all subjects as 18% neutral gray reflectance. Reflected measurement of any single-toned area will result in a neutral gray rendition. Subjects that appear lighter than gray will reflect more light and result in an exposure that renders them darker. Subjects that are darker than gray will reflect less light and result in an exposure that renders them lighter.

Using Reflected Measurement

Hand-held reflected light meters and built-in camera meters read the intensity of light
reflecting off the subject and measurements are taken from the camera position. Generally,
reflected measurement of a wide subject area can include many different reflective surfaces
or colors that can bias the meter and result in inconsistent and erroneous readings. Accuracy
of a reflected measurement can be improved by reading an 18% neutral gray test card placed in front of the subject.

Light meters in cameras react to how intense the light is as seen from the camera. SLRs measure the
light (called metering) through the lens – TTL. They collect light that has actually passed through
the camera’s lens and measure its intensity. There are problems when the scene has parts that are
much brighter or darker than others, for example shadows on a sunny day. This can trick the light
meter into measuring the intensity of the light incorrectly, depending on which part of the scene was
illuminating the sensor.

Modern SLR cameras use multi-point light meters, meaning that several light meters are actually
scattered around the projected scene, each measuring the light intensity at that point. Very
sophisticated cameras may have dozens of metering points. How much the measured intensity of the
light at each point influences the final meter reading depends on the metering mode selected by the
photographer.

For a more detailed look at metering modes, you can read: Introduction to metering modes.


How to Use the Light Meter

As we now know, the correct exposure is created by juggling the three points of the exposure
triangle: aperture, shutter and ISO. The light meter is the tool that puts us in the right neighborhood
for how these should be set. If you are shooting on full auto, then when you meter the scene –
usually done at the same time as focusing, by half pressing the shutter – the light meter gives its best
guess for each of these variables.

If you want to take creative control of the photo, you can manually set each of the three variables
yourself. Typically ISO is left at the default, or previous setting, and you take control by choosing
aperture priority or shutter priority. On most DSLRs that’s done by turning the exposure mode dial.
If you set the dial to Av – aperture priority, the photographer chooses what the aperture will be, and
the light meter adjusts the shutter speed to maintain the correct exposure. The reverse is true for Tv – shutter priority.

When using these modes, it’s useful to refer to the exposure meter display on the camera.
The exposure meter (display) shows the result of the measurement taken by the light
meter (sensor). It will typically look something like this:

[Figures: exposure meter display on the LCD and in the viewfinder]


Each number represents a stop change in the light, as indicated, with the central mark being the
“correct” exposure, as determined by the light meter. Each pip between the numbers represents one
third of a stop. The arrow underneath indicates how close the current settings are to the correct
exposure. Usually in priority modes, the arrow will stay in the middle as the light meter will be able
to set the exposure correctly. However, if for example you set your shutter speed to 1/400sec in Tv (shutter priority mode) and the light meter indicated that you needed an aperture of f/4, but your lens was only capable of f/5.6, then the exposure meter will display one stop of underexposure. You will need to compensate for this by setting a longer shutter time, or increasing the ISO.

The juggling act becomes more complicated, and the light meter’s assistance more valuable, when
you go to full manual control of the exposure. Here the exposure meter simply displays whether the
current settings will under or over expose the image, according to the light meter. The photographer
can freely change any of the values on the exposure triangle, and see the change to the predicted
versus recommended exposure.
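The arithmetic behind that display can be sketched with the standard exposure-value formula. The function names here are this sketch's own, and because marked f-numbers are nominal, the result is only approximately a whole stop.

```python
import math

def ev100(aperture, shutter_s, iso):
    """Exposure value normalized to ISO 100: EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(aperture ** 2 / shutter_s) - math.log2(iso / 100)

def meter_display_stops(current, metered):
    """What the exposure meter shows for (aperture, shutter, ISO) tuples:
    negative = underexposed relative to the light meter's recommendation."""
    return ev100(*metered) - ev100(*current)

# The meter wants f/4 at 1/400 s, but the lens can only manage f/5.6:
print(round(meter_display_stops((5.6, 1/400, 100), (4.0, 1/400, 100))))  # -1
```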

Exposure compensation

Even though the light meter in your camera is pretty sophisticated, sometimes it can get it wrong,
especially with harsh contrasts, or highly reflective surfaces. Changing metering modes may help
this, but a more controlled approach is to use exposure compensation. Imagine you are
photographing a person against a large bright sky. The light meter thinks the sky is the most
important part, and exposes correctly for that, leaving the person a dark silhouette. By using
exposure compensation, you can tell the camera to take the metered exposure and make it
brighter by a chosen amount. This will then allow the photographer to correctly expose the
person. I’ll look at exposure compensation in more detail in a future post.

To show you how the different exposure modes might work in real world situations, here are
some scenarios. The settings given below are what they happened to be for the examples shown.
Settings for your own photo will be different.

Scenario 1 – Sports


 High speed is needed to freeze action


 Use Shutter Priority
 Set shutter speed to 1/800sec
 The light meter sets the aperture to f10
 If under exposed, change ISO to compensate – ISO400

Scenario 2 – Portrait

 An artistic narrow depth of field is desired


 Use Aperture Priority
 Set aperture to f5.6
 The light meter sets the shutter to 1/160sec
 If under exposed, change ISO to compensate – ISO100

Scenario 3 – Night scenery

 Ambient light is too low to accurately meter


 Use full Manual
 Set aperture to suit the scene, erring to narrower – f/11
 Set a long shutter speed to light meter’s best guess – 20sec
 Set ISO to lowest possible for correct exposure – ISO100
 Take a test shot and adjust settings if the light meter got it wrong


Scenario 4 – Off-camera manual flash

 On auto, meter the scene and note settings


 Set camera to one or two stops under exposed
 Set up flashes and tweak power to expose correctly
 Tweak the flash exposure by adjusting aperture
 Tweak the ambient light by adjusting shutter speed
 Settings for example shot: 1/160sec f/8 ISO125


Topic 22
Types of Light Meters

What is the Light Meter?

Light meters in cameras react to how intense the light is as seen from the camera. SLRs measure the
light (called metering) through the lens – TTL. They collect light that has actually passed through
the camera’s lens and measure its intensity. There are problems when the scene has parts that are
much brighter or darker than others, for example shadows on a sunny day. This can trick the light
meter into measuring the intensity of the light incorrectly, depending on which part of the scene was
illuminating the sensor.

Modern SLR cameras use multi-point light meters, meaning that several light meters are actually
scattered around the projected scene, each measuring the light intensity at that point. Very
sophisticated cameras may have dozens of metering points. How much the measured intensity of the
light at each point influences the final meter reading depends on the metering mode selected by the
photographer.

Types of Light Metering

There are two basic types of light metering:

1. Incident Metering
2. Reflected Metering

1. Incident Metering

The incident meter is aimed at the light source and measures the light falling directly on a scene, and is not influenced by the reflectance of the subject being photographed. For more
precise control of the photograph, incident meters are also used to measure various levels of light
from multiple sources falling on separate parts of a scene.

Using Incident Meters

Incident metering measures the intensity of light falling on the subject and gives accurate and
consistent rendition of the tonality and contrast regardless of reflectance, background, color, and
shape. Subjects that appear lighter than gray will appear lighter. Subjects that are darker than


gray will appear darker. Colors will be rendered accurately. Highlight and shadow areas will fall
naturally into place.

NOTE: Most light meters allow for both reflected and incident light readings.

Advantages of Incident Measurement

Incident meters measure accurately and consistently and are not affected by variances in
reflectance of the subject or scene. Because of this, incident meters give the most accurate
exposure for the majority of situations and subjects.

2. Reflected Metering

Reflected metering reads the intensity of light reflecting off the subject and may vary according
to variances in tonality, color, contrast, background, surface, or shape. Meters are designed to
regard all subjects as 18% neutral gray reflectance. Reflected measurement of any single-toned area will result in a neutral gray rendition. Subjects that appear lighter than gray will reflect more light and result in an exposure that renders them darker. Subjects that are darker than gray will reflect less light and result in an exposure that renders them lighter.

Using Reflected Measurement

Hand-held reflected light meters and built-in camera meters read the intensity of light reflecting
off the subject and measurements are taken from the camera position. Generally, reflected
measurement of a wide subject area can include many different reflective surfaces or colors that
can bias the meter and result in inconsistent and erroneous readings. Accuracy of a reflected
measurement can be improved by reading an 18% neutral gray test card placed in front of the subject.

How to Use the Light Meter


As we now know, the correct exposure is created by juggling the three points of the exposure
triangle: aperture, shutter and ISO. The light meter is the tool that puts us in the right neighborhood
for how these should be set. If you are shooting on full auto, then when you meter the scene –
usually done at the same time as focusing, by half pressing the shutter – the light meter gives its best
guess for each of these variables.

If you want to take creative control of the photo, you can manually set each of the three variables
yourself. Typically ISO is left at the default, or previous setting, and you take control by choosing
aperture priority or shutter priority. On most DSLRs that’s done by turning the exposure mode dial.
If you set the dial to Av – aperture priority, the photographer chooses what the aperture will be, and
the light meter adjusts the shutter speed to maintain the correct exposure. The reverse is true for Tv – shutter priority.

When using these modes, it’s useful to refer to the exposure meter display on the camera.
The exposure meter (display) shows the result of the measurement taken by the light
meter (sensor). It will typically look something like this:

[Figures: exposure meter display on the LCD and in the viewfinder]

Each number represents a stop change in the light, as indicated, with the central mark being the
“correct” exposure, as determined by the light meter. Each pip between the numbers represents one
third of a stop. The arrow underneath indicates how close the current settings are to the correct
exposure. Usually in priority modes, the arrow will stay in the middle as the light meter will be able
to set the exposure correctly. However, if for example you set your shutter speed to 1/400sec in Tv (shutter priority mode) and the light meter indicated that you needed an aperture of f/4, but your lens was only capable of f/5.6, then the exposure meter will display one stop of underexposure. You will need to compensate for this by setting a longer shutter time, or increasing the ISO.

The juggling act becomes more complicated, and the light meter’s assistance more valuable, when
you go to full manual control of the exposure. Here the exposure meter simply displays whether the
current settings will under or over expose the image, according to the light meter. The photographer
can freely change any of the values on the exposure triangle, and see the change to the predicted
versus recommended exposure.


Exposure compensation

Even though the light meter in your camera is pretty sophisticated, sometimes it can get it wrong,
especially with harsh contrasts, or highly reflective surfaces. Changing metering modes may help
this, but a more controlled approach is to use exposure compensation. Imagine you are
photographing a person against a large bright sky. The light meter thinks the sky is the most
important part, and exposes correctly for that, leaving the person a dark silhouette. By using
exposure compensation, you can tell the camera to take the metered exposure and make it
brighter by a chosen amount. This will then allow the photographer to correctly expose the
person. I’ll look at exposure compensation in more detail in a future post.

To show you how the different exposure modes might work in real world situations, here are
some scenarios. The settings given below are what they happened to be for the examples shown.
Settings for your own photo will be different.


Topic 023

Sunny f/16 Rule

What is the Sunny f/16 Rule?

The Sunny 16 Rule is a way to meter for correct exposure during daylight without using the
camera’s meter.

The basic rule of thumb states that if you have a clear, sunny day and your aperture is at f/16,
whatever ISO you are using, your shutter speed will be the reciprocal value of that ISO value
(ISO X = 1/X seconds shutter speed)

So for example, if your ISO is 200 at f/16, then your shutter speed will be 1/200 seconds. If your
ISO is 100, then your shutter speed will be 1/100 seconds.
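In code form, the basic rule is a one-liner (the function name is just for illustration):

```python
def sunny16_shutter(iso):
    """Sunny 16: on a clear day at f/16, shutter speed = 1/ISO seconds."""
    return 1.0 / iso

print(sunny16_shutter(200))  # 0.005 -> i.e. 1/200 s
print(sunny16_shutter(100))  # 0.01  -> i.e. 1/100 s
```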

Have you ever heard of the sunny 16 rule? It seems to have all but disappeared in most modern
discussions of photography. As a matter of fact, it’s one of many rules that photographers seem
to have forgotten. That’s a shame because the sunny 16 rule serves as a nice way to check your
current exposure settings. Let’s have a look at how it works.

Long before the time of digital cameras, photographers invented rules to help them navigate their
camera’s manual settings. Photographers had to carry light meters with them everywhere they
went, and they were virtually lost without them. But if you didn’t have time to take a light
reading, or you didn’t want to bother with carrying the equipment, you could resort to these
rules. They were the next best thing.
And it’s still pretty handy to know what manual mode settings to use with your camera.

How does the sunny 16 rule work?


The sunny 16 rule works like this:
 On a clear and sunny day, at an aperture of F/16, you will get a correct exposure if you use a
shutter speed that’s the inverse of the ISO speed you’re using.
The second part is probably the one that's confusing you (if any of it is). You have to know what ISO speed is in order to decipher what's going on. So allow me to explain.

The easiest way to explain is with an example. If it’s a sunny day, and have your aperture set to
F/16 and ISO set to 200, to correctly expose your image the shutter speed needs to be set to
1/200 (the inverse of the ISO number).
ISO speed is your camera’s sensitivity to light. A bigger ISO speed means a larger sensitivity. If
your camera is more sensitive to light, it takes less light to make a picture brighter. Most cameras
start out at an ISO speed of 100, and some models go as high as ISO 1600. That’s 16 times more


sensitive than the default, meaning you'd need to expose the camera to 16 times less light in order to get the same picture.
The rule says you need to use the inverse of the ISO speed. That’s interesting because as you
increase your ISO speed, you effectively have to increase your shutter speed to compensate. At
ISO 200, your camera is twice as sensitive to light, so you need to use a shutter speed of 1/200 of
a second to let in less light and balance it out.
Let’s use another example. Let’s say it’s a sunny day, and your camera is set to ISO 400.
According to the sunny 16 rule, if you use an aperture of F/16 and a shutter speed of 1/400 s, you
will have an evenly balanced image that is neither too bright nor too dark.
That’s interesting, but it seems like the rule can only help us out when it’s sunny. Luckily, there are variations for other conditions:
 The snowy/sandy F/22 rule.
 The overcast F/8 rule.
 The slightly overcast F/11 rule.
 The heavy overcast F/5.6 rule.
 The sunset F/4 rule.

Depending on the weather, you can use a different version of the sunny 16 rule to get an accurate
exposure.
Photo By Jason Rogers
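Those variant rules can be collected into a simple lookup table. The condition labels below are informal names of my own, and each aperture is only the suggested starting point, as the text stresses.

```python
# Starting apertures for the sunny-16 family of rules (labels are informal).
STARTING_APERTURE = {
    "snowy/sandy": 22,
    "sunny": 16,
    "slightly overcast": 11,
    "overcast": 8,
    "heavy overcast": 5.6,
    "sunset": 4,
}

def daylight_starting_exposure(iso, condition):
    """Return (aperture, shutter in seconds) as a metering starting point:
    the shutter stays at 1/ISO; only the aperture changes with the weather."""
    return STARTING_APERTURE[condition], 1.0 / iso

print(daylight_starting_exposure(100, "overcast"))  # (8, 0.01) -> f/8 at 1/100 s
```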

But wait. It gets even better than this. You don’t always have to use F/16 on a sunny day or F/8 on an overcast day. These are merely starting places. As long as you compensate by adjusting your shutter speed along with your aperture, you can use any aperture you want under any lighting condition.
So, let’s go back to the drawing board and imagine another situation. What if it’s a bright sunny
day, and there’s a landscape you want to photograph? You could use F/16, but you want to use
an aperture that really gets the entire depth of field in front of you. You really want to use F/22.
What can you do?

Start with a pair and move forward from there


Let’s also assume you’ve set your ISO to the minimum of 100. According to the sunny 16 rule,
we’ve got a pair. You know that F/16 at shutter speed 1/100s will work. Now we simply need to
find a similar pair by adjusting the aperture and balancing it out with the shutter speed.
Thankfully, apertures and shutter speeds work on a system of stops. Every time you adjust
your shutter speed up by one stop, your camera lets in exactly half as much light. The same
is true for apertures. Every time you adjust your aperture either up or down, your camera lets in
or blocks out half as much light.
Well, that’s not entirely true. Some cameras allow you to adjust the aperture and shutter speeds by half or third stops. But let’s ignore that for now.
So all we really need to do is keep moving our aperture up one stop and our shutter speed down
one stop until we get to F/22. Let’s give that a try.
On my Nikon D40x, the next aperture stop up is F18. The next shutter speed stop down is 1/80s.
It then goes to F/20, 1/60s
And finally, it arrives at F/22 shutter speed 1/50s. You’ll notice that the shutter speed is exactly
half of what it was at F/16. That makes sense because we’ve just closed the aperture by one half,
so we need twice as much light to take the same picture.
You’ll also notice that the D40x works on a system of third stops – the standard increments in photography – and the same rules apply. Every time you adjust the aperture up, you
need to adjust the shutter speed down. Every time you adjust the aperture down, you need to
increase the shutter speed by one stop. Keep doing this until you get to the aperture or shutter
speed you want to use.
Even with all this expensive gear, the sunny 16 rule can still come in handy. Sure, you can use
your camera’s light meter, but it isn’t always the most accurate. I like to do the sunny 16
calculations in my head so I at least know what the ballpark shutter speed and aperture values
will be. If I’m close to those numbers, I’m usually pretty happy.

The sunny 16 rule is just a starting point. Use it come up with any aperture and shutter speed
combination. In this case, the photographer picked a smaller aperture to capture a larger depth of
field.
Photo By Michael Kirwan


The sunny 16 rule isn’t the be-all and end-all. I wouldn’t allow it to supplant good old trial and error.
Keep checking your LCDs and histograms. It’s much more valuable than knowing the sunny 16
rule, the overcast F/8 rule, or whichever rule you need for a given day. There is no such thing as
a “correct” exposure, after all. It really does come down to your own aesthetic sense. This is just
a guide to help you get there.


Topic 024

Filters

1. UV Filter

Ultraviolet filters are transparent filters that block ultraviolet light, in order to reduce the haziness that is noticeably apparent in some daylight photography. UV filters don’t affect the majority of visible light, so they are a perfect form of lens protection and they will not alter your exposure. There are some “strong” UV filters that are more effective at cutting atmospheric haze and reducing the notorious purple fringing that sometimes shows up in digital photography. Purple fringing is a purple ghost that you see at the edges of a subject when it is slightly out of focus.

2. Polarizing Filter


A Polarizing filter can be used to darken overly light skies as it increases the contrast between clouds and the sky. Like the UV filter, the Polarizer reduces atmospheric haze, but also reduces reflected sunlight. The most typical function of a Polarizer is to remove reflections from water and glass. When angled (or spun) properly, the Polarizer eliminates the reflection when shooting through a glass window or into water; a handy trick to be sure! There are two types of polarizers:

 linear

 circular

Both types of polarizers produce a similar effect, except the circular polarizer
eliminates unwanted reflected light with the help of a quarter-wave plate. The resulting
image is free of reflected light, and transparent objects like glass are free of reflections.

3. Color Balancing Filter

As you know, visible light is made up of a multiple color spectrum. But in photography, you have to make a choice to capture images with the camera’s white balance set to record the whitish-blue light of daylight or set to record the reddish-orange tungsten (incandescent) light… with a few variations (i.e. sodium-vapor or fluorescent). This is what the white balance is used to control, and you use a color balancing filter to affect a change in your light sources. However, you can use a Color Balancing filter to compensate for the various differences in the photographed color of light (e.g. daylight is cooler and appears blue, whereas tungsten is warmer and appears reddish orange). The 85B (warm-up/orange filter) and the 80A (cool-down/blue filter) are the two standard


filters for color balance compensation. The 85B enables you to shoot in the daylight when the white balance/color temperature is set for tungsten. Without the 85B filter, your image will have a blue color cast to it. The 80A enables you to shoot under tungsten light when the color temperature/white balance is set for daylight. Without the 80A, your image will be abnormally warm/reddish orange. These filters have fallen out of use recently because this type of color temperature correction can easily be achieved with image processing software. Some photographers use them for various artistic effects.

4. Neutral Density Filter

Attaching a neutral density (ND) filter to your lens uniformly reduces the amount of light entering the lens. The ND filter is helpful when the contrast between the highlights and shadows is too great to get a quality exposure. The ND filter also can enable greater motion blurring and image detail by allowing a large aperture and/or a slow shutter speed to be used. A variant on the ND filter is the graduated ND, in which a gradient effects the reduction of light in a graduated, neutral level from 100% to 0% across the length of the filter. The Graduated ND is recommended for shooting landscapes and seascapes, because you can reduce the brightness of the sky (for better contrast) but still maintain an effective exposure of the land or water.

5. Soft Focus Filter


Soft focus filters do exactly that: they reduce the sharpness of an image, but only to an extent that is barely noticeable. They are useful in shooting close-up shots of people’s faces. With the help of a little diffusion, imperfect skin conditions are replaced by silky smooth skin. Remember you can use soft focus filters while photographing landscapes or monuments as well.

6. Filters for B&W Photography

There are specific filters for B&W photography that lighten similar colors and darken
opposite colors, thereby enhancing the monochromatic look. There are Red, Orange,
Yellow, Green and Blue filters for use in B&W photography.

Red filters are a favorite among landscape photographers and are often used to add
drama. In nature photography, a red filter will increase the contrast between red flowers
and green foliage. A red filter will deepen a blue sky and make white clouds pop out. It
can also decrease the effects of haze and fog. In some cases, depending on its strength, a
red filter could even turn the sky black.


Orange filters increase contrast between tones in textures such as tile or bricks, making
it a good choice for general use and urban or abstract photography. It also helps to
decrease haze and fog, but its effects on the sky and clouds are subtler than the red
filter.

Yellow filters are even subtler than orange filters, making it a ‘classic’ choice for
beginners just starting to explore using filters with black and white photography. It helps
to darken the clouds slightly, and it also separates light green foliage from the darker
shades of green.

Green filters lighten dark green foliage and boost light green foliage. They have a more specific use and are not as commonly used as the other filters, but green filters are extremely useful for the nature photographer. Green filters may lighten the sky, so landscape photographers should take note of this when using them.

Blue filters are not as commonly used in black and white photography because they
lighten the sky and darken highlights or colors that are seen as light. Blue filters can
draw attention to haze and fog, which can enhance the mood of the photo if needed. It’s
a good idea to experiment with this filter using the B&W setting, as opposed to shooting
in color and converting the image to B&W in an image processor.

Since a filter absorbs light, it necessitates an increase in exposure. Filter -makers will
usually suggest an amount of exposure compensation in the form of a “filter factor”. A
filter factor of 2X means that you should multiply the exposure by 2. A filter factor of
4X means that you should multiply your exposure by 4, and so on. If the filter factor is
2X and 4X, add 1 f/stop and 2 f/stops to your exposure respectively. Another alternative
is to divide your ISO by the filter factor. If the filter factor is 2X and your ISO is 200,
your new ISO is 100.
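The arithmetic above is easy to mechanize. A small sketch in Python (the helper names are mine, not part of any photographic software): the f/stop increase is simply the base-2 logarithm of the filter factor.

```python
import math

def filter_factor_to_stops(factor):
    # f/stop increase needed: a factor of 2 costs 1 stop, 4 costs 2 stops.
    return math.log2(factor)

def compensated_iso(iso, factor):
    # The alternative mentioned above: divide the ISO by the filter factor.
    return iso / factor

print(filter_factor_to_stops(2))   # 1.0
print(filter_factor_to_stops(4))   # 2.0
print(compensated_iso(200, 2))     # 100.0
```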

Filters Commonly Used in B&W Photography*

Medium Yellow (K2), #8 -- filter factor 2x, f/stop increase +1
Effects: Probably the most widely used. Offers an accurate tone range in compensating
for the blue sensitivity of panchromatic films. Will slightly darken sky and increase
contrast between blue sky and clouds. Also may help reduce haze.
Lightens: yellow, chartreuse, olive, red, pink, orange, lime green
Darkens: blue, violet, purple, lilacs

Deep Yellow (O), #15 -- filter factor 2.5x, f/stop increase +1 1/3
Effects: Stronger effect than medium yellow. May darken sky considerably.
Lightens: yellow, chartreuse, olive, red, pink, orange, lime green
Darkens: blue, violet, purple, lilacs

Red (A), #25 -- filter factor 8x, f/stop increase +3
Effects: Produces very dramatic skies. Effects may border on the surreal. Darkens
foliage. Reduces haze.
Lightens: reds, pinks, magentas, some browns, yellow, orange
Darkens: blues, greens, cyan

Green, #11*** -- filter factor 4x, f/stop increase +2
Effects: Lightens foliage and will darken skies somewhat. Sometimes used to produce
pleasing skin tones in portraits.
Lightens: yellow, yellow-green, green, olive
Darkens: blue, violet, magenta, red, maroon

UV -- filter factor 1x, f/stop increase 0
Effects: Absorbs UV radiation and will reduce distant haze or fogginess.

Polarizer -- filter factor 2.5x, f/stop increase +1 1/3
Effects: Helps remove reflections and glare. May cut pollution haze. Darkens sky.

Grad ND -- filter factor n/a
Effects: Reduces the amount of light reaching a part of the image -- usually used to
darken the sky.


Conclusion

Photographic filters are used to achieve image enhancement effects that can change the
tone and mood of your photographs. Filters inject slight, but noticeable alterations to
your image. You can achieve many of the same effects by extensive tweaking in
Photoshop (or another image manipulation software package), but when you use a filter
you can immediately see the difference to your image in the viewfinder. The effects of
filters are more pronounced when working in B&W, as the monochromatic tonal scale
reacts much differently, and also with greater dramatic effect. As with every new
photographic accessory, practice and experimentation are the keys to expanding the
application of your creative palette.


Topic 025

Filter Factors

Filters change the dynamics of the light entering the lens and usually require you to alter
your exposure to compensate for this fact. This is called the Filter Factor and each filter
has a specific filter factor, so read up on these to learn how to use them.

If you use filters on your camera, this can have an effect on the white balance, depending on the
filter type.

Polarizers are neutral – they don’t change the color balance, only the depth of color. Warm-up
or other color-adjusting filters will, of course, change the color of the light.

The thing to make sure of here is that you don’t leave the camera set to auto white balance,
because it will simply attempt to compensate for the changed light color.

Always choose an appropriate white balance preset before using a colored filter. One interesting
alternative to a ‘straight’ colored filter is to use a colored graduate. This will add a color to the
sky without changing the foreground colors.

A blue grad can add a sunny feel to an overcast day, while a yellow/orange grad can add drama
to a stormy sky. With these, it’s wise to take your meter reading before you fit the filter.

Purpose of Using Filters

Color filters allow the black and white photographer to exercise some selective control over tone
values. To this extent they can be an important tool in helping photographers realize their
creative vision; to put on film what they see in their minds' eye. At times the use of filters is
almost mandated by the limitations of the medium. For example, film users quickly learn that,
despite being called "panchromatic", film is extra sensitive to (i.e., overexposes) the blue and
ultraviolet (UV) light in skies resulting in the dreaded "white sky" effect. A filter may be needed
to compensate for this bias.


Another problem in using the grey scale pallet is that, with exception of the blue bias mentioned
above, objects of similar reflectance may have similar tone values in the resulting print. For
instance, if we were photographing an apple tree and found that the green foliage and red apples
have similar reflectance (similar reflected light meter readings) they might be almost
indistinguishable in tone in a B&W print. In this case the photographer might choose to use a red
filter to lighten the apples and darken the foliage or use a green filter to darken the apples and
lighten the foliage.

With the exception of color correction filters, color filters are pretty much exclusively used by
B&W photographers. However, we share some other filters with color photographers. These
filters might be used to reduce haze, reflections or glare. We might also use neutral density
filters to reduce the light reaching all or part of the film. These are discussed in more detail
below.

How Filters Affect Tone Values

In a nutshell, a colored filter used with B&W film will lighten similar colors and darken opposite
colors. The color wheel shown below provides a visual example of what is meant by similar and
opposite colors. A red filter, for example, will darken the other two primary colors (blue and
green) and will especially darken its complementary or opposite color (cyan) that is formed by
combining green and blue. On the other hand, it will lighten red objects and to a lesser extent
colors that contain red such as yellow, orange and magenta. Yellow filters will do a particularly
good job of darkening blue objects but tends to lighten red and green objects and so on.
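As a rough way to see this rule in action, the B&W conversion can be simulated by weighting the RGB channels toward the filter's own color, as in the apple-tree example above. The weights here are illustrative guesses, not measured filter transmission curves.

```python
def bw_with_filter(r, g, b, weights):
    # Weighted sum of channels approximates the B&W tone seen through a
    # colored filter (weights sum to 1; the filter's own color weighs most).
    wr, wg, wb = weights
    return wr * r + wg * g + wb * b

# Hypothetical filter weights -- illustrative only.
RED_FILTER = (0.9, 0.1, 0.0)
GREEN_FILTER = (0.1, 0.8, 0.1)

apple, foliage = (180, 40, 40), (60, 140, 50)
# Through a red filter the red apple renders lighter than the green foliage;
# through a green filter the relationship reverses.
print(bw_with_filter(*apple, RED_FILTER), bw_with_filter(*foliage, RED_FILTER))
print(bw_with_filter(*apple, GREEN_FILTER), bw_with_filter(*foliage, GREEN_FILTER))
```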


A filter lightens and darkens because it transmits some colors and absorbs (or filters) others.
Obviously, because it absorbs light, using a filter will necessitate an increase in exposure. (The
UV filter is an exception.) Logically, darker filters require more exposure compensation. Filter
makers will suggest an amount of exposure compensation and this is discussed below. However,
think of this as just a suggested starting point. The actual effect of a filter and the amount of
exposure compensation it needs will depend on:

o The film being used


o The color of the ambient light
o The predominant color of the subject.

For example, making an image using a yellow filter and filling the frame with yellow
sand dunes or "amber waves of grain" might require significantly less exposure
compensation since less light will be absorbed by the filter.

The amount of exposure compensation is often expressed as a "filter factor". A filter factor of
2X means that you should multiply the (unfiltered) exposure by 2, a 2.5 filter factor means that
you should multiply it by 2.5 and so on.
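A quick sketch of that multiplication (the function name is mine):

```python
def compensated_shutter(metered_time, filter_factor):
    # Multiply the unfiltered (metered) exposure time by the filter factor.
    return metered_time * filter_factor

# A metered 1/250 s with a 2.5x filter becomes 1/100 s.
print(compensated_shutter(1 / 250, 2.5))
```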


Topic 026

Filters Color Photography

1. UV/Skylight

These filters are almost clear (slightly amber) and reduce blue haze caused by UV light. They're
used mainly for protection - if you drop your lens, you might just damage the filter instead of the
lens. They also protect against dust, moisture and scratches.

Note: By using these filters you can save your lens. From my research on filters, Nikon and
B+W make the best Skylight and UV.

2. Polarizers

Most people only think about using a polarizer when there is a blue sky and they want to
make it a richer, deeper blue. This is a good enough reason to have this filter, but not the
only one. What this filter will not do is make a blue sky out of an overcast sky, no matter
how much you want it to be blue. The only way to do that is with a blue graduated filter.

Another use for this filter on sunny days is that it can be used as a neutral density filter in
order to give you longer shutter speeds, enabling you to pan, zoom or use any other
type of camera motion that you can think of.

3. Split Neutral Density and Split Graduated

The contrast of light in a scene either in the early morning or at sunset can be one of your
most difficult problems to overcome in exposing a successful landscape photograph.
Basically the film can record either the sky or the land properly - but not both! If you
expose properly for the land then the sky which appears colorful to your eye will be
colorless and washed-out in the final photograph. Graduated filters are useful for scenic
landscapes, when you want to combine a bright sky with a dark foreground. I use them
primarily at sunrise and sunset.

The top half of a split neutral density filter is neutral-density and the bottom half is
clear. If you look through a split ND filter, the top half is dark; it gradually turns lighter
and finally becomes clear from halfway down to the bottom. The reference to "neutral


density" indicates that the filtration neither adds nor subtracts from any of the naturally
occurring color. The only effect the filter has is in decreasing the amount of light that
passes through the ND portion of the filter.

These filters can be purchased in a variety of densities, but are generally found in one of
the following: 1-stop (ND.3), 2-stops (ND.6) and 3-stops (ND.9). The three different
densities provide various amounts of ND effect depending upon the strength of the filter
(the strongest being the 3-stops variety). It is difficult to say which one you will need at
any given time. During the day if you are photographing in open shade you will probably
do best with a 2 stop, in darker shade - 3 stop. When photographing a landscape that is in
bright sunlight try using the 1 stop filter on the sky, it will slightly darken the sky and
make the landscape really stand out. When photographing at sunrise or sunset if you use a
3-stop grad filter you will have complete balance between the sky and the foreground.
Sometimes you may want this or you may want to keep the sky a little brighter than the
foreground, then use a 2 stop.

The split-graduated filters work the same way, except that they are colored. Whatever
color they are, that color will be added to your photograph. Sometimes the color works
while at other times it makes your picture look totally unrealistic. The only filter in this
group that I use but not all that often is the 1-stop blue, which I use on blue skies days to
darken the blue sky and add a bit of blue at the horizon line.

4. Yellow/Blue Polarizer

This filter adds blue and/or yellow to the scene.

5. Color Enhancing (Didymium/Intensifier)

It creates brighter, more saturated reds, rust browns and oranges on film, with minimal
effect to the other colors. Since it increases the color saturation, particularly for red, it is
useful for sunrise/sunset, fall foliage, red barns, red-orange flowers, and the red soil of
Prince Edward Island.


6. 80 Blue

The "80" series of filters is designed for daylight film to be used with tungsten lighting.
The strength of these filters is backwards to other filters. The 80A has the strongest blue
(2-stops), 80B (1-2/3 stops), 80C (1-stop). 80B at twilight increases the blue in the scene
to a rich cobalt blue as well as converting any tungsten lighting in the scene from yellow
to white light. This filter is also used when you are photographing waterfalls or snow
scenes to make them appear slightly blue.

7. Warming Filters

There are 2 different series of warming filters, both as their name suggests add warmth to
a picture. The 81 series is the more popular of the two, it is available in varying strengths
- 81A, 81B and 81C. A is the lightest, B is medium, and C is the strongest. Their best use
is to remove the blue cast from your pictures on an overcast day.

Number 85 is the other series; these filters look orange in color and are much stronger in
intensity than the 81 series (which look amber). The #85 filters are normally used so that
tungsten film can be used in daylight - usually used for motion pictures. For the
landscape photographer they can be used at sunrise/sunset to create a very warm and
golden appearance.

Note: Out of the three filters named, I recommend starting with 81B. For the other two, I use
the 81A more often, particularly with spring and summer foliage as it enhances the green
of leaves and grass.

8. Neutral Density (ND)

Don't get these confused with split-graduated neutral density filters. These neutral density
filters have the same degree of light reducing effect across the entire picture. You would
use this filter when you want to reduce the overall light level in a scene, usually to obtain
a slower shutter speed or if you wanted to use a wider depth-of-field. This doesn't apply
so much with today's cameras as it did a few years ago, when cameras didn't have the
high/fast shutter speeds that they do now. Uses for this filter include wildflowers blowing
in the wind to create a blur of color or moving water that takes on a softer look with a


slower shutter speed. These filters are available in densities of .3 (1 stop), .6 (2 stop) and
.9 (3 stops).

9. FL-D

This is a magenta colored filter that is designed to correct the color of fluorescent lighting
for daylight film. Unfiltered fluorescent lighting has a blue-green color cast on daylight
film. Some photographers use it for dusk shots in cityscapes, to correct for the green of
the office lights it also adds pink-purple color to the sky. Some other photographers use it
as an enhancing filter at sunrise/sunset.

The only problem with using the FL-D filter to correct for fluorescent lighting in a
cityscape at twilight is that there is also tungsten lighting in the city. This must be
corrected with an 80A or 80B filter, which makes the sky go cobalt blue. Unfortunately
you can't use both filters at the same time to correct for the two lighting sources because
they negate each other, therefore you must choose between the two.

10. Soft and Diffusing

Soft/diffusing filters come in probably the greatest selection and variety of any filter type
on the market. The concept behind them is to soften the image by adding some blur so
that the image is no longer sharp or crisp. Out of all of my soft
looking images 99% were created by using a double exposure technique, where one
exposure is sharply in focus and the other is completely out of focus.

11. Color Correcting (Compensating)

What these filters really do is they hold back all of the other colors in favor of the color
of the filter that is being used. For example, a CC20B reduces all colors but blue by 20%.
These filters come as "gels" (thin pieces of gelatin) which are used with a gel holder or as
hard plastic filters, which can be used in the Cokin holders. Color Correction filters are
available in primary colors - green, red and blue - and printing colors - yellow, cyan and
magenta.
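The CC20B example can be caricatured in a few lines -- a loose illustration of "hold back every color except the filter's own", not a colorimetrically accurate model:

```python
def apply_cc_filter(r, g, b, keep, strength=0.2):
    # Attenuate every channel except `keep` by `strength` (CC20 ~ 20%).
    factors = {ch: 1.0 if ch == keep else 1.0 - strength
               for ch in ("r", "g", "b")}
    return r * factors["r"], g * factors["g"], b * factors["b"]

# A neutral gray pixel comes out blue-tinted: red and green are held back.
print(apply_cc_filter(100, 100, 100, "b"))
```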


12. Colored Polarizers

Sunset

This is a warm/orange-graduated filter that adds a light brown to the foreground and orange to
the sky, simulating or enhancing sunsets. I find that it just looks like you really filtered the shot
and not very well.

Sepia

This filter is used to give the image an old, weathered, brown look.

Star

These filters give a star effect from any bright - point light source.


Topic 027

Photography: An Art or Science?
The literal meaning of photography is “drawing with light”: photo means light and graphy
means drawing. Dividing up the world into “art” types and “science” types is a useful way to
look at things. So much of what we do falls neatly into one category or another. Playing
music is an art. Building a machine is science.
On the other hand, no one exists who lives wholly in the realm of the aesthetic, just as
technology alone cannot provide a full life.

• Art: this topic particularly addresses aesthetics
• Science: this topic is focused on technology

Camera Basics
There are three basic camera types:

1. Mechanical (M)
Older film cameras with most or all functions controlled mechanically or manually, with
rotating, mechanical settings indicators.

2. Electronic (E)
Newer film cameras with most or all functions controlled by buttons or electrical knobs,
with digital readout of the settings.

3. Digital (D)
Similar to Electronic, except an electronic sensor replaces the film.

There are three basic systems that operate in all cameras:


• Viewing System
The viewing system allows a human being to see, with varying degrees of accuracy, what
image will strike the film or sensor at the time of exposure.

• Light Gathering System


The light gathering system is composed of one or more pieces of glass which gather light
reflected from an image and focus that light on the film or sensor.

• Exposure System
The exposure system allows a precisely controlled quantity of light to strike the focal
plane, where it (M/E) causes chemical changes in dyes and silver compounds that
eventually result in a viewable image, or it (D) causes electrons to be stored in cells that
eventually result in pixels on an electronic display.


These basic camera types control their basic systems using:

 Shutter Release
The button you push to record an image. (E) It also advances the film to the next frame.
 Focus
(M) A ring on the lens, or (E/D) a button or lever, that changes how sharp or fuzzy an
image appears in the viewfinder.
 Aperture
(M) A ring on the lens, or (E/D) a button or knob, that changes how much light is
allowed to pass through the lens.

 Shutter Speed
A button or knob that changes how long light strikes the (M/E) film or (D) sensor during
exposure.

 Zoom
(M) A ring on the lens, or (E/D) a button or lever, that changes the focal length of the
lens.

 Film Speed
(M) A knob, or (E/D) knob or menu selection, that determines how sensitive the imaging
system (film or sensor) is to light.

 Exposure Compensation
Similar to film speed, a knob that changes imaging system sensitivity, typically used on a
per–image basis for unusual lighting.

Other controls perform supplementary functions:

 Self Timer
Allows you to be in the picture; also useful as a stability aid.

 DOF Preview
Allows you to see the effect of aperture on focus

 Flash Modes
Allows control of built–in flash.

 Exposure Modes
Allows different ways of measuring light, such as average, spot, matrix, etc

 Exposure Lock
Keeps exposure values from one shot to the next

 White Balance (D)


Compensates for different types of lighting (D) Menus, Previews, Resolution, and More!


Exposure:
There can be a lot of light, or there may be very little. “Exposure” is what we talk about
when we describe how the intensity of light is controlled to suit a particular film or sensor.
Art: overall exposure determines how light or dark an image is. Photographers call light images
“high key” and dark images “low key,” whereas artists refer to lightness and darkness in an
image as “value.”

Science: light is measured in terms of “exposure value,” which is a logarithmic absolute
scale. Each increase of 1 EV represents a doubling of light.
Exposure is determined by four variables:

 The amount of light illuminating the subject,


You often don’t have much control over this, but you may be able to move lights around,
or to move your subject from shadow to sunlight, or to use a reflector to move light onto
your subject.

 The sensitivity of the film or imaging sensor,


Film comes in different sensitivities, specified by their “ASA” or “DIN” numbers. Digital
cameras can have different sensitivity settings per image. In either case, greater sensitivity
means more “grain” or “noise,” less sensitivity means longer exposure times and blurring.

 The “focal ratio,” or aperture of the lens,


Generally miscalled the “aperture,” this is actually the ratio of the length it takes to focus
the image (focal length) to the effective width of the lens (focal width). It is often represented
by the symbol “ƒ”.

 And the length of time the film or sensor is exposed.


Also known as “shutter speed,” this is typically fractions of a second up to tens of
seconds, and is represented by the letter “t”.
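These four variables tie together through the exposure-value scale described above. A minimal sketch using the standard definition EV = log2(N²/t), where N is the focal ratio and t the shutter time (the function name is mine):

```python
import math

def exposure_value(f_number, shutter_time):
    # EV = log2(N^2 / t); one EV step is a factor of two in light.
    return math.log2(f_number ** 2 / shutter_time)

# f/8 at 1/125 s is about EV 13; stopping down to f/11 adds about 1 EV.
print(round(exposure_value(8, 1 / 125), 2))
print(round(exposure_value(11, 1 / 125), 2))
```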

Your camera’s exposure system gives you control over the latter two and your (M/E)
choice of film or (D) sensitivity setting control the second item.
For “normal” lighting situations, your camera’s exposure system wants to use settings
that result in mid tone gray. But this is not always what you want!
What if you are taking a picture of a polar bear in a snowstorm, or a raven in a coal
mine? In these situations, you have to trick the camera’s exposure system into keeping
white or black.

Exposure Compensation
Most cameras have an “exposure compensation” control to help with such subjects.
Neither film nor digital sensors come anywhere near capturing the wide range of light
that the human eye can perceive. The range of light to dark in images is
called the “scene contrast,” and you often have to choose to sacrifice shadow detail in


order to get highlight detail, or vice–versa.

Perspective art:
Distant objects appear to be smaller than nearby objects. Parallel lines, like the edges of
a long, straight road, converge in the distance. Looking up at a tall building makes it
look like it recedes into the distance. This is artistic perspective.
In photography, perspective is manipulated by two means:

 The focal length of the lens,


This is the distance from the lens’s rear nodal point and the film or sensor, and it
determines both the magnification of the imaged subject, and the angle of view of the
imaged scene.
 And the position of the lens relative to the film/sensor.
This is normally fixed in common 35mm and digital cameras, but is variable in large format
view cameras and with some specialized 35mm lenses.

Perspective is divided into three categories:

1. Wide Angle
These lenses enhance perspective — close objects appear much larger than normal, far off
objects appear much smaller than normal.

2. Normal
These lenses correspond to the perspective we are used to seeing with our own eyes.

3. Telephoto
These lenses reduce or compress perspective — close objects and far objects are closer in
size to each other than our eyes perceive them.

Perspective comes from the ratio of focal length to the diagonal measure of the film or
sensor. This is why 35mm cameras (with a diagonal measure of about 43mm, close to the
“normal” 50mm focal length) have a different focal length for a given perspective than
cameras with other sized film or sensors.
In particular, digital cameras generally have smaller sensors, so the perspective for any
given focal length is greater than it would be with most film cameras’ lenses of the same
focal length.
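This ratio is what the familiar “crop factor” expresses. A small sketch, using hypothetical APS-C sensor dimensions of 23.6 x 15.7 mm (actual sizes vary by manufacturer):

```python
import math

def equivalent_focal_length(focal_mm, sensor_w_mm, sensor_h_mm):
    # Crop factor: ratio of the 35mm frame diagonal (36 x 24 mm) to the
    # sensor diagonal; multiply to get the 35mm-equivalent focal length.
    crop = math.hypot(36, 24) / math.hypot(sensor_w_mm, sensor_h_mm)
    return focal_mm * crop

# A 50mm lens on this smaller sensor frames like a ~76mm lens on 35mm film.
print(round(equivalent_focal_length(50, 23.6, 15.7), 1))
```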
The focal length is the primary thing that is changed when you “zoom” a lens, so
most people are familiar with its operation. But most people use zoom for “lazy
composition,” rather than for purposeful manipulation of perspective.
If you want a subject to be closer, without changing the relationship between the subject
and its surroundings, get closer to the subject!
Practice using zoom strictly for manipulating perspective; telephoto settings reduce
perspective, wide–angle settings enhance perspective.
Science:
 Wide angle lenses correspond to human “circle of perception,” the angle at which
we can sense objects.

 Normal lenses correspond to human “circle of attention,” the angle of the fovea,
an area of the retina that has an expanded number of cones.
 Telephoto lenses correspond to human “circle of detail,” the area upon which we
concentrate when we examine tiny objects, such as text.

MOTION CONTROL:
Much of the time, you want your images to be nice and sharp, as though frozen in
time. You control this by having an appropriate shutter speed.
However, once you master sharp images, you may find it interesting to indulge in
purposeful, controlled blurring. If the desire is to impart a feeling of motion, purposeful
blurring is usually more effective than frozen action!
Science:
Human “Persistence of vision” is an effect by which quickly moving objects appear
blurred. It is how movies and television are able to create the illusion of continuous motion,
when they’re actually a sequence of quickly changing still images.
The eye cannot perceive changes that happen in less than about 1/10th to 1/15th of a
second, so if you want motion to appear as you see it, choose such a shutter speed.

Motion is controlled through a variety of ways:


• Shutter speed
Is how you control how long it takes to form a latent image on the film (M/E)
or how long photons are collected by a sensor (D).

• Camera motion control


Like a tripod, is how you keep camera motion from impacting the exposure as it is in
progress.

• Subject motion control


Like telling the child, “Sit still!” is how you keep subject motion from impacting the in–
progress exposure.

The first technique, shutter speed, is the first to come to mind, but it rarely can have
much impact without other considerations. It depends on several other items:
• Lighting
If possible, choose bright lighting to stop action, dim lighting to purposefully blur action. A
flash also stops action.

• Film sensitivity
Determines how fast a shutter speed you can use for a given lighting situation.

• Maximum aperture
Of a given lens also impacts shutter speed. A lens with a large maximum aperture (ƒ2 or
larger) is often called a “fast” lens, because it enables faster shutter speeds for a given
lighting situation than a “slow” lens (ƒ3.5 or smaller) does.


Many techniques enable camera motion control:


• Tripod
Is essential for serious photography! But don’t simply leave it in your closet
— a lightweight tripod may get used more than a more expensive, sturdy, heavy one!

• Cable release
If you use a tripod, you need a cable release! Pushing the shutter button, even when on
a tripod, may move the camera.

• Self–timer
Set your camera on a stable surface, compose your shot, and then use the self–timer to
capture the image. This can be used instead of a cable release.

• Careful hand–holding
Make yourself into a human tripod!
You often have more control over subject motion than you imagine:
• Plane of action
Shoot along the line of travel, so the subject moves toward or away from you rather than across the frame.

• Peak of action
Shoot when the subject has the least relative movement.

Light
This is it. This is what photography is all about. Without light, there would be no
photography.

Basic qualities of light:

Light has six basic qualities:

1. Intensity
(Amplitude, brightness, value) how bright or dim the light is

2. Color
(Frequency, spectrum, temperature) warm, cool

3. Direction
(Angle, vector) front, top, bottom, side, back

4. Contrast
(Size & shape) soft, harsh

5. Polarization
Invisible to the human eye, but manipulable for special effects


6. Number of sources
Multiple light sources, each of which will have their own set of the five characteristics
above

The intensity of light is largely negated by your camera’s exposure system, which
guides you in choosing shutter speed and focal ratio such that the average light reflected
from the subject will result in proper exposure.
But by manipulating intensity, we can indirectly control other factors:
• The expression of time, via motion–control techniques,
• The sharpness of objects, via DOF techniques.

The color of light has a lot to do with the emotions your images evoke in the viewer.
Warm light often conveys feelings of well–being, cool light can invoke tension or angst.
Directionality of light is perhaps the most taken for granted. Yet it is primarily
responsible for how “unusual” an image looks.
We see top light every day — it comes from the sky, ceiling fixtures, etc. Front light has
become popularized by camera– mounted flash.
Other directions lend drama and impact to images, whether via artificial lighting, or via
sunrise or sunset.

Contrast of light:
The most poorly understood quality of light is contrast.
1. A high contrast light source has a small angular size, such as the sun. It tends to
produce sharp, hard–edged shadows.
2. A low contrast light source has a large angular size, compared to the subject,
such as the entire sky on an overcast day. It tends to produce soft, fuzzy–edged
shadows.
Polarized light has all its waves lined up in the same direction. With polarizing filters,
you can selectively produce or view certain polarization angles, while filtering out others.
Rarely will there be exactly one light source! Multiple sources come not only from
multiple lights, but also from reflections from other objects and surfaces.


Topic 028

Things to remember

Following are the important things which you need to remember in the art of photography:

1. Light

Without light, there is no illumination. In a room without illumination, everything is pitch black.
You can’t see a thing. Taking a shot – assuming your camera allows you to – produces a solid
black photograph. You switch on a lamp, and you send light across the room, and everything is
illuminated. Now you can take a photograph and show something in the picture. You realize that
your eye and the camera both need light and illumination to work. Photography is about
capturing light and recording it, whether on paper, or more frequently now, in a digital format.
As a photographer, you control the amount, intensity and duration of light required to create the
picture. The apparatus used to draw with light is called the camera, which comes from camera
obscura, a box with a hole for light to pass through and strike the backwall of it. The name
“camera obscura” actually means dark chamber, and indeed, the word “camera” is still used in
some languages such as Italian to mean “room” or “chamber”. The camera and our eyes work in
pretty much the same way. The difference between the two is that our eyes are better able to
handle wide differences in light intensity. For example, if you take a photograph from inside a
room with an open window, you may get the room properly exposed but the window is too
bright, or the window looks right but the room too dark. Yet our eyes don’t have such a problem:
they can see everything inside the room and outside the window properly exposed. The reason is,
our eyes can compensate for the wide difference in light whereas the camera cannot. Secondly,
our eye is more sensitive to light than most of the amateur/prosumer cameras. In a dark
environment, such as inside a movie theatre, our eyes can still adjust to the lack of light and
allow us to see the rows of chairs and people. Most cameras would have difficulty focusing
under such a demanding condition.

2. Quantity of light passing through aperture

Aperture refers to the opening of a lens's diaphragm through which light passes. It is calibrated in
f/stops and is generally written as numbers such as 1.4, 2, 2.8, 4, 5.6, 8, 11 and 16. The lower


f/stops give more exposure because they represent the larger apertures, while the higher f/stops
give less exposure because they represent smaller apertures.
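The familiar f/stop sequence works because the light admitted scales as 1/f²: each standard step roughly halves the light. A quick sketch of just the arithmetic:

```python
def relative_light(f_number, reference=1.4):
    # Light admitted relative to f/1.4; aperture area scales as 1/f^2.
    return (reference / f_number) ** 2

# Each step down the sequence passes roughly half the light of the last.
for f in (1.4, 2, 2.8, 4, 5.6, 8, 11, 16):
    print(f"f/{f}: {relative_light(f):.3f}")
```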

3. Shutter speed

Shutter speed, also known as “exposure time”, stands for the length of time a camera shutter is
open to expose light into the camera sensor. If the shutter speed is fast, it can help to freeze
action completely, as seen in the above photo of the dolphin. If the shutter speed is slow, it can
create an effect called “motion blur”, where moving objects appear blurred along the direction of
the motion. This effect is used quite a bit in advertisements of cars and motorbikes, where a
sense of speed and motion is communicated to the viewer by intentionally blurring the moving
wheels.
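A quick back-of-the-envelope calculation shows why shutter speed controls whether motion is frozen or blurred: it is simply how far the subject travels while the shutter is open. This is a sketch; the helper name and the 60 km/h example speed are illustrative, not from the text.

```python
# How far does a moving subject travel while the shutter is open?
# A subject that moves a visible distance during the exposure is
# rendered with motion blur; freezing it needs a faster shutter speed.

def travel_mm(speed_kmh, shutter_s):
    """Distance (mm) a subject at speed_kmh covers in shutter_s seconds."""
    speed_mm_per_s = speed_kmh * 1_000_000 / 3600
    return speed_mm_per_s * shutter_s

# A car at 60 km/h during a 1/30 s exposure:
print(f"{travel_mm(60, 1/30):.0f} mm")    # over half a metre: visible blur
# The same car at 1/1000 s:
print(f"{travel_mm(60, 1/1000):.1f} mm")  # barely moves: action frozen
```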

4. Composition (retention of interest)

In essence, composition is all about putting together objects in your viewfinder in such a way as
to emphasize the bits you want to, making them stand out in just the right way. These objects
include anything in the foreground, the background, those that "frame" the picture, and most
importantly light.

The truth of the matter is that most successful photographic compositions are in fact quite
simple: there may be numerous objects, but there is never any doubt as to what the subject
actually is.

Another great truth is that no matter how expensive your camera equipment is, without a
knowledge of composition you'll never be able to "capture" the essence of the image you see.
Worse still, someone with a cheaper setup who knows about composition will more than
likely be producing better photographs.

Annoyingly, some people seem to have the knack for creating well composed images, while
others have to shoot and shoot whilst they hone their composition skills. The important thing to
remember is that wherever you start, you will only get better with practice.

With landscapes the challenge is to capture an image so that the subject is presented in the way
that you want it to be, which for many means learning new ways of seeing what is in front of
you.

Besides the subject, there is of course "light"; it can make all the difference between a fantastic
picture and a dull and uninteresting one. Make sure you note the light's direction, intensity and
color (yes, the color of light changes quite dramatically, e.g. at the beginning and end of the day).

Besides the color of the light, the position of the sun in the sky also has a major impact on a
photograph: when the sun is low it produces marvelous shadows that enhance the subject's
shape and give it that important 3D effect.

There is another benefit to taking photographs when the sun is low in the sky: because it is either
early or late in the day, its intensity is not at its height, which in turn means that the range of
contrast (the difference between the deep shadows and highlights) is low, something that really
helps in capturing the details of both (especially with digital cameras).

The viewpoint is the next vital ingredient, but just which is best? Only you can tell; you have to
move around the subject, looking at it from different angles and from different heights, not to
mention trying different focal lengths from the very wide to the telephoto. All of these will have
a dramatic impact on the image you capture, and there is no real formula to follow, except
perhaps one.

This composition "formula" centers on something called the "thirds". Take any image and divide
it into three equal parts both lengthwise and heightwise and you end up with 2 vertical and 2
horizontal lines; placing anything on these lines can really emphasize it, whilst placing anything
on the intersections can be really powerful. Bearing these "thirds" in mind when composing
your photograph could make all the difference.

To sum up, when taking your photographs, decide what your subject is, decide from which
viewpoint and angle it looks best, decide where to place it, and most importantly, make sure that
the light is right; after all, with landscapes there is normally another day.


5. Optics (lens)

The most important part of a camera is its lens since the quality of an image is so dependent
upon it. The most basic camera body fitted with a good lens can make a good picture, but
the best camera body in the world cannot make a good picture if its lens is of poor quality - if the
image or parts of it are not in focus or if an inadequate amount of light for proper exposure
reaches the film or the sensor in a digital camera.


Topic 029

Pinhole photography

Pinhole photography is lens less photography. A tiny hole replaces the lens. Light passes through
the hole; an image is formed in the camera.
Basically a pinhole camera is a box, with a tiny hole in one end and film or photographic paper
in the other.

Tom Baril, 1998
Pinhole cameras are used for fun, for art and for science.
Pinhole images are softer - less sharp - than pictures made with a lens. The images have nearly
infinite depth of field. Wide-angle images remain absolutely rectilinear. On the other hand,
pinhole images are softened by diffraction far more than pictures made with a simple lens,
and they tolerate little enlargement.

From the window series, Robert Mann


Paris, Ilan Wolf, 1998

THE HISTORY OF PINHOLE PHOTOGRAPHY

 Sir David Brewster, a Scottish scientist, was one of the first to make pinhole photographs,
in the 1850s. He also coined the very word "pinhole".
 By the late 1880s the Impressionist movement in painting was a certain influence on
photography. Different schools or tendencies developed in photography.
 The "old school" believed in sharp focus and good lenses; the "new school", the
"pictorialists", tried to achieve the atmospheric qualities of paintings. Some of the
pictorialists experimented with pinhole photography.

"Expeditions", Ruth Thorne-Thomsen, 1979


 Pinhole photography became popular in the 1890s.


 Commercial pinhole cameras were sold in Europe, the United States and in Japan. 4000
pinhole cameras ("Photomnibuses") were sold in London alone in 1892.

 The cameras seem to have had the same status as disposable cameras today - none of the
"Photomnibuses" have been preserved for posterity in camera collections.
 Mass production of cameras and "new realism" in the 20th century soon left little space
for pinhole photography
 By the 1930s the technique was hardly remembered, or only used in teaching

Eastman Kodak Pinhole Camera 1930

The Revival of Pinhole Photography


 In the mid-1960s several artists, unaware of each other, began experimenting with the
pinhole technique - Paolo Gioli in Italy, Gottfried Jager in Germany, David Lebe, Franco
Salmoiraghi, Wiley Sanderson and Eric Renner in the USA.

 In 1971 The Time-Life Books published The Art of Photography in the well-known Life
Library of Photography and included one of Eric Renner's panoramic pinhole images.

Lago Massiore, North Italy, Peter Olpe, 1978

 The June 1975 issue of Popular Photography published the article "Pinholes for the
People", based on Phil Simkin's month-long project with 15,000 hand-assembled and
preloaded pinhole cameras in the
Philadelphia Museum of Art. People came into the museum, picked up a camera, made
an exposure. The images, developed in a public darkroom in the museum, were
continually displayed in the museum.

Brooklyn Bridge, Dona McAdams, 1983

In the 1970s pinhole photography gained increasing popularity. A number of articles and some
books were
published, but the critics tended to ignore pinhole photography
in art.

Beth III, Mimbres Hot Springs Ranch, Nancy Spencer, 1995


Topic 30

Rule of Thirds

The Rule of Thirds is perhaps the most well-known principle of photographic composition. It is
also known as the "Rule of Golden Means".

Basic Principle:

The basic principle behind the rule of thirds is to imagine breaking an image down into thirds (both
horizontally and vertically) so that you have 9 parts. As follows:

As you take an image, you would do this in your mind through your viewfinder or in the LCD
display that you use to frame your shot.

With this grid in mind the ‘rule of thirds’ now identifies four important parts of the image that you
should consider placing points of interest in as you frame your image.

Not only this – but it also gives you four ‘lines’ that are also useful positions for elements in your
photo.


The theory is that if you place points of interest on the intersections or along the lines, your photo
becomes more balanced and a viewer of the image will interact with it more naturally.
Studies have shown that when viewing images, people's eyes usually go to one of the intersection
points rather than the center of the shot; using the rule of thirds works with this natural
way of viewing an image rather than against it.
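The grid described above is easy to compute for any frame size: the two vertical and two horizontal lines sit at one third and two thirds of the width and height, and their four crossings are the suggested points of interest. A minimal sketch (the helper name and the 1920x1080 frame are illustrative):

```python
# Rule-of-thirds guides for an image of a given pixel size.

def thirds_grid(width, height):
    """Return the gridline positions and their four intersections."""
    verticals = [round(width / 3), round(2 * width / 3)]
    horizontals = [round(height / 3), round(2 * height / 3)]
    intersections = [(x, y) for x in verticals for y in horizontals]
    return verticals, horizontals, intersections

v, h, points = thirds_grid(1920, 1080)
print(v)       # x positions of the two vertical lines
print(h)       # y positions of the two horizontal lines
print(points)  # the four "power points" to place subjects on
```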

In addition to the above picture of the bee, where the bee's eye becomes the point of focus, here
are some more examples:


NOTE: How to use the rule of thirds?

To use the rule of thirds in your photography, ask yourself the following questions before
capturing the photo:

 What are the points of interest in this shot?


 Where am I intentionally placing them?


Topic 31

Factors of Photography

There are six basic factors of photography.

1. Light

To make a picture you require light; that light may be natural (sunlight/moonlight) or artificial,
such as a light bulb or flash. Light that our own eyes are sensitive to is called visible light, but
other creatures and materials are sensitive to non-visible forms of light such as ultraviolet light,
infrared, or x-rays.

2. Medium

The medium of light is also very important in photography. The photographer has to check which
medium is being used, whether it is sunlight, bulb light, or any other indoor light. Photography
techniques vary according to the medium.

3. Aperture

Aperture refers to the opening of a lens's diaphragm through which light passes. It is calibrated in
f/stops and is generally written as numbers such as 1.4, 2, 2.8, 4, 5.6, 8, 11 and 16. The lower
f/stops give more exposure because they represent the larger apertures, while the higher f/stops
give less exposure because they represent smaller apertures. This may seem a little contradictory
at first but will become clearer as you take pictures at varying f/stops. Be sure to check your
manual first to learn how to set your camera for Aperture Priority, and then try experimenting to
get comfortable with changing the aperture and recognizing the effects different apertures will
have on the end-result image.

4. Shutter speed

Shutter speed, also known as “exposure time”, stands for the length of time a camera shutter is
open to expose light into the camera sensor. If the shutter speed is fast, it can help to freeze
action completely, as seen in the above photo of the dolphin. If the shutter speed is slow, it can
create an effect called “motion blur”, where moving objects appear blurred along the direction of
the motion. This effect is used quite a bit in advertisements of cars and motorbikes, where a
sense of speed and motion is communicated to the viewer by intentionally blurring the moving
wheels.

5. Composition

The main factor of composition is the rule of thirds, which can also be applied manually by
cropping.


6. Optics (lenses)

Optics or lenses are very important in photography. There are different kinds of lenses used in
photography like Standard/Normal lens, Wide angle lens, Telephoto lens, Zoom lens, Fish eye
lens, Macro lens, Tilt-shift lens and image stabilization lens.

The choice of lens depends on your type of photography. If you want to shoot outdoors you will
often need a wide-angle lens, but in indoor photography a wide range of lenses is used to cater to
different types of photography.

There are many possible lens choices, and each gives you a distinct and different image. Basically,
you choose your lens to get specific results in your picture. For example, a telephoto lens is used
to capture distant objects; you cannot get the same result with another lens.

All these fundamental principles of photography are very important to do photography.

Of the six essential requirements, two are important to learn first and fully understand:

 The aperture size


 The shutter speed

These are the main mechanical control functions of your SLR camera. If you learn to control
them manually, then all other areas of photography become easier to understand. If you only use
your digital SLR in automatic modes, you will never fully benefit from the creative
possibilities of your digital SLR camera.


Topic 32
Lighting for photography
Hard or Soft light

When you photograph indoors or out, the scene is illuminated by light that ranges from hard to
soft.

1. Hard Light

Hard light coming from a source that's small compared to the subject casts hard shadows and
has high contrast. Outdoors you see this light on a bright sunny day. The sun may be very
large but it's also far away and small in the sky so it casts hard light on subjects.

2. Soft Light

Soft light falling on the subject from a source that's large compared to the subject, wraps light
around the subject, filling shadows and lowering contrast. Outdoors you see this light on a
cloudy bright day when the entire layer of clouds is the light source.

Whether light falling on a subject is hard or soft depends on one thing, the relative sizes of the
light source and subject. A large source will wrap light around a small subject filling shadows
and lowering contrast. A small source will direct light onto a large subject creating hard shadows
and high contrast. To imagine this, think of the light falling on a landscape on a bright sunny
day. The sun is small compared to the landscape, so the light is hard. Pictures have black
shadows or burned out highlights. Now imagine a thin layer of clouds drifting across the sky
from horizon to horizon. The sun hits the cloud layer from above, and it retransmits the light
from all parts of the sky. The light source has gotten dramatically larger and its diffuse light
softens shadows and lowers contrast.

How to create Hard and soft light?

There are two ways to soften light indoors in addition to moving a light closer to the subject:
using reflectors and diffusers. To get harder light, move the light farther from the subject or use a
bare bulb or bare bulb flash. When a bulb is mounted in a reflector, it's really the larger reflector
that is the light source. A bare bulb has no reflector so the light source is much smaller. Since it's
more of a point source, it casts a hard light on the subject. Because it lacks a reflector to focus
the light, its range is shorter than other kinds of light.


Hard light is created when the light source is small relative to the subject

Soft light is created when the light source is large relative to the subject


Topic 33

Lighting for Photography

Studio lights

Studio lighting can seem a daunting task. However, most portrait photography lighting techniques
are not nearly as scary as most people think. By using a simple home photo studio kit with just a
couple of flash heads and a few basic accessories, you can get great results in no time at all. In
fact, it’s arguably easier to use a studio lighting setup than off-camera flash.

Here are some of the key tools you'll need to create the classic studio lighting techniques:

1. Light stands
Studio flash is all about positioning the light source away from the camera, so stands are
crucial. They support the flash heads, which means they can be positioned at the right
distance and angle to the subject.

2. Flash heads
Most kits have two flash heads. Along with a flash tube, there’s a modeling light. Most
have a switchable ‘slave’, enabling one flash to be triggered by another, so you only need
to have your camera connected to one of the heads.

3. Umbrella
A brolly is the most standard form of lighting accessory. The flash is directed into the
brolly so the light is reflected back onto the subject. They are available in different
reflective surfaces – typically white, silver or gold.

4. Soft box
Soft boxes are slightly more sophisticated than brollies and once you’ve worked out how
to assemble these tent-like devices, they create a softer and generally more flattering
light, with a more even illumination.


5. Snoot/honeycomb
Both of these tools help to concentrate or ‘focus’ the light. They’re ideally suited for use
as backlights or for isolating a particular part of an image.

6. Reflector
A simple reflector can be really useful in a studio lighting setup, especially if you’re only
using one light. You use it the same way you would with natural light – to bounce light
back onto your subject and fill in any hard shadow areas.

Steps of Studio Lighting

1. Rembrandt
This studio lighting technique is ideal for artistic shots with depth


Position one flash head with a silver brolly at a 45° angle to the model at about six feet high.
This creates a strong, hard, direct light from the side and above. This is called a key light.
To even the lighting, position a reflector on the other side of the model to bounce the light back
into the shadow side.
There should be a small triangle of light on the subject’s face – this is referred to as Rembrandt
lighting.

2. Clamshell
This studio lighting technique is used to capture every detail with even light


This studio lighting setup is great for beauty images as the lighting is flat and even.
It’s pretty easy to achieve this effect too – all you need to do is place two soft boxes on either
side of your subject at the same angle and at an equal distance.

Set the power so it’s the same from each light. Try using a reflector under the face – your model
should easily be able to hold this.
This will bounce light up and onto the face.

3. Backlight
This studio lighting technique is used to add depth and drama with rear lights


To add drama, use a honeycomb or snoot accessory on one of the lights. This will narrow the
beam of light.

We’re going to position this behind the model, pointing back towards the camera so that it lights
the back of her head.
This is a great way to add drama and depth to a photo, and it also creates a sense of separation
from the background.
Of course, you need to make sure the backlight isn't visible in the shot.

4. Rim lighting
This studio lighting technique is used to create an exciting style with good definition


Place both lights slightly behind the subject, pointing back towards the camera. This setup
requires some tweaking and can work really well with nudes as it helps define body shape.

You’ll need to watch out for lens flare, though, as the lights are pointing back towards the
camera. A set of ‘barn doors’, a lens hood or a shield can help prevent this.
An assistant who can hold a carefully positioned reflector is useful – this will help fill in those
areas of deep shadow.

Tips on Studio Lighting

Avoid the wrong shutter speed!


Tip 1
The shutter speed you choose is less significant in a studio setup but obviously needs to be fast
enough to avoid any camera shake. However, you also need to be careful not to set a shutter
speed faster than the camera’s specified sync speed – on most cameras this is usually either
1/200 sec or 1/250 sec. Go any faster and you’ll have horrible black stripes across your images.
Tip 2
The power of flash is measured in watt-seconds (Ws). Each of the heads we're using is 400Ws,
which approximates to a guide number of 64. This is fine for regular portrait work.
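Assuming the guide number works in the usual way (guide number = f-number x flash-to-subject distance, at a given ISO), the working aperture falls out directly. This is a sketch: the GN of 64 is taken from the tip above, while the feet/ISO 100 units and the example distances are illustrative assumptions.

```python
# Guide number arithmetic: aperture needed for a correct flash
# exposure is GN / distance (same distance units as the GN).

GUIDE_NUMBER = 64  # assumed to be in feet at ISO 100 (illustrative)

def aperture_for(distance_ft):
    """f-number giving a correct flash exposure at this distance."""
    return GUIDE_NUMBER / distance_ft

for d in (4, 8, 16):
    print(f"{d} ft -> f/{aperture_for(d):g}")
```

Note how doubling the distance opens the aperture by two full stops, which is the inverse-square law at work.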
Tip 3
Switch your camera to manual and use the histogram and LCD to assess the exposure and effect
of the lights. Use the dials to change the power of the lights and the aperture to alter the
exposure.
Tip 4
A sync cable or a wireless trigger is needed to connect your camera to the lights so that when
you press the shutter, the lights fire at the same time. Some wireless triggers are so cheap now
that they're the best option, especially as many popular DSLRs don't have the PC socket you
need in order to use a more traditional sync cable.


Topic 34
Lighting for photography
Transmitted & Reflected light
Transmitted light photography is the technique of photographing a translucent subject (such as
flowers, feathers, and leaves) by light that passes through it, rather than photographing the
subject by light that is reflected or emitted from the front surface, as we normally see a subject.
This type of lighting enhances the dramatic mood of the subject, produces a soft ethereal glow
from apparent interior lighting, and provides a visual perception of extended depth of field.
Using colored filters can reveal exciting internal features of your subjects.

Backlighting is a well-known photographic technique for creating exciting and unusual images,
such as silhouettes, rim-lighted, and transmitted light photographs. Some photographers,
however, avoid backlighting because they were told early in their careers that the lighting source
should always face the subject. For backlighting, the light source and the photographer face each
other with the subject in between.
Transmitted lighting is not a new or unique phenomenon; we view it almost daily through leaves
and other translucent subjects backlit by the sun. However, many photographers are unaware of
transmitted light's potential to produce unusual and radiant images.


The major difference between transmitted and reflected lighting is that transmitted light passes
through the entire subject’s cross section, whereas reflected lighting penetrates only a short
distance into a subject’s sub-surface. Many photographers assume the directional differences
between transmitted and reflected light will not have any consequence on the appearance of a
photograph. But, as I will show, it is just this difference in penetration depth that explains the
effects produced in transmitted light photographs.

Some photographers who show examples of transparent backlit subjects report their images have
strong photographic appeal because of the backlighting, but don’t speculate on the reasons for
the enhancement. Moreover, most backlit photographs are taken with a combination of lighting,
both transmitted and reflected; and, as I will suggest, the reflected component dilutes the effect.
However, for transmitted light photography, the backlighting source and its arrangement are
critical for achieving optimum results. That doesn't mean the process has to be complicated,
though: transmitted light photography requires no expensive equipment, can be conducted with
minimum space requirements, and suits a wide range of subjects.


Topic no. 35
Photography on Film

Photographic film is a chemically reactive material that records a fixed or still image when the
film is exposed to light. Typically, film is placed in a camera, and light from the image being
photographed is allowed to enter and is focused and sometimes made larger or smaller by
the camera lens. The film is exposed to the image by opening a shutter in the camera body, and
the combination of the speed of the shutter and the film speed (which is the chemical reactivity
of the film) controls the amount of light that strikes the film. The image is recorded on the film,
but it is a latent or invisible image. When the film is removed from the camera, it is developed by
chemical processes into a visible image. This visible image is negative or the reverse in
brightness of the way our eyes see light; the brightest parts of the photographed object appear the
darkest on the negative where the film received the most exposure to light. The negative image is
made positive, or as our eyes see it, by another type of processing whereby the negative is
printed on sensitive paper. Color-reversal films are positives and are used for making slides. All
of the elements of the process—the parts of the camera, the type and parts of the lens, the type of
film, including its chemistry, the developing process, the printing process, and the type of
paper—contribute to the sharpness or trueness of the finished photograph.

History

Film was "discovered" in a chemistry laboratory. In 1727, Johann Heinrich Schulze, a German
doctor, mixed chalk, silver, and nitric acid in a flask to make silver nitrate. When the solution
was exposed to sunlight, it changed color from white to purple. When Schulze pasted cutouts of
letters and numbers on the outside of a flask of freshly made solution and exposed it to the light,
the cutouts appeared to have been printed on the solution. Although the discovery marked the
birth of photography, it was not used for over 100 years. In 1839, Louis Daguerre, a French
painter, created a photographic process in which liquid iodine was placed on a silvered copper
plate, and the plate was exposed to light. The liquid iodine was the emulsion, or light-reactive
chemical, and the copper plate was the base for these photographs called "daguerreotypes." The
American inventor Samuel F.B. Morse learned the art of daguerreotypy and taught it to Matthew
Brady, who made images of the Civil War that are treasured both as historical records and
artistic landmarks in photography.

Daguerreotypy was cumbersome to use; the "wet plate" process was awkward, the box-type
cameras had to hold the large plates, and the finished photographs were the size of the plates.
While Daguerre was developing his process, William Henry Fox Talbot, an English
archaeologist, created his own process called "calotype," meaning "beautiful picture" in 1841.
Talbot coated a paper base with an emulsion of silver iodide and produced a negative by a
developing process. The calotype is more like today's film and photographic process, and the
intermediate step resulting in a negative permitted more than one print to be made.

The flexibility of photography was improved further in 1871 when R.L. Maddox invented the
"dry plate" process. Gelatin made from animal bones and hides was used to coat glass plates, and
silver iodide was precipitated inside the gelatin layer. The plates and their dried jelly could be
exposed, then the photograph could be developed later by rewetting the gelatin. The complicated
procedure of manufacturing the plate, exposing it, and processing it into the finished photograph
was broken into parts that made the photographer's work easier and made photography and photo
processing a manufacturing industry.

George Eastman combined the paper base of Talbot's calotype with the gelatinous silver nitrate
emulsion from Maddox's process to invent flexible roll film in 1884. Eastman quickly made the
transition to an emulsion-bearing plastic, transparent film by 1889, which was a year after his
company introduced the first Kodak camera. These developments made photography a simple,
compact, portable practice that is now the most popular hobby in the United States.


Raw Materials

A roll of film consists of the emulsion and base that compose the film itself, the
cassette or cartridge, and outer protective packaging. The materials used to make the emulsion
are silver, nitric acid, and gelatin. The base consists of cellulose and solvents that are mixed to
form a thick fluid called dope. Film that is packed in a cassette (35-millimeter film is typically
packed this way) requires a metal spool, the protective metal canister, and plastic strips at the
canister opening where the film emerges. Other sizes of film including Polaroid film are
protected from light and air by plastic cartridges or packs. Outer packaging, which varies among
film products, is made from foil-lined paper, plastic, and thin cardboard cartons. The outer
packaging is also insulating and protects the film from exposure to light, heat, and air.

The Manufacturing Process

Base

 1 For most films, the base to which the light-sensitive emulsion is fixed consists of
cellulose acetate, which is wood pulp or cotton linters (short cottonseed fibers) mixed
with acetate to form a syrup. Solid pellets of cellulose acetate precipitate or separate out
of the syrup and are washed and dried. The pellets are dissolved in solvents to form the
transparent, honey-like dope. The dope is spread in a thin, even sheet on a wheel that is
two stories in diameter. The wheel is plated with chromium for a smooth finish, and it
turns slowly. The solvents in the dope volatilize or evaporate as the wheel turns. The
process is much like the applying and drying of nail polish. The remaining base is a thin
sheet of plastic that is of a uniform thickness measured in ten-thousandths of an inch.
When it is dry, the base is removed from the wheel and wound on 54-inch (137 cm)
diameter reels.


Emulsion

 2 Silver is the main ingredient of the emulsion. Pure silver bullion is received at the
manufacturing plant in bars that are checked by weight and serial number. The bars are
dissolved in a strong solution of nitric acid, and the process releases heat. After the acid
has completely dissolved the silver, the solution is stirred constantly and cooled. Cooling
causes crystals of silver nitrate to grow, much like salt crystals in water. The crystals are
wet with water that also separates out. The crystals are removed from the solution and
whirled in centrifuges with sieve-like openings to remove the water and keep the crystals
pure. At this point in the process, the chemical solutions are light-sensitive, so further
manufacturing processes are completed in darkness.
 3 Meanwhile, gelatin has been made using distilled water and treated with chemicals
including potassium iodide and potassium bromide. The gelatin serves as a binding agent
to hold the silver nitrate crystals, and also to fix them to the base. The gelatin and
chemicals are mixed in cookers that are lined with silver so the emulsion remains pure.
As the mixture cools, silver halide salts (chemical combinations of the silver, iodide, and
bromide) form as fine crystals that remain suspended in the gelatin to make the emulsion.

Coating process

 4 The emulsion is pumped through a piping system to "coating alley," a huge work area
that may be 200 feet (61 m) wide and five stories high. The area must be immaculately
clean and dust-free, and the operations of the roll-coating machines are controlled by

emulsion in micro-thin layers on the wide strips of plastic base; a single, dried layer of
emulsion may be six one-hundred-thousandths of an inch thick. Successive layers of
three emulsions are applied to the base to make color film, and each emulsion layer has
its own color-forming chemicals called linked dyes. The three emulsion layers in color
film respond to blue, green, and red light, so each photograph is a triple latent image with
the sandwiched color range reproduced by processing. The strips of emulsion-coated base
(now film) are cut into progressively narrower widths, perforated so the film can be
advanced in the camera, and spooled, except for instant film and sheet film that are
packed flat.

Packaging

 5 Film is packed in cartridges, cassettes, rolls, instant packs, or sheets. Cartridges are
used in certain types of cameras and include a take-up spool that is built in so the exposed
film and cartridge are removed as a unit. Cassettes are made for cameras that use film in
the 35-millimeter format. They consist of a spool enclosed in a metal jacket. The tongue
of the film is drawn over the pressure plate at the back of the camera to a take-up spool
that is built into the camera. When the film is finished, it is rewound onto the spool in the
cassette, and the unit is removed. Roll films consist of paper-backed film that is packed
on a spool like the one in the camera. The film is wound onto the spool in the camera,
and that spool and film are removed. The spool on which the film was packed originally
can then be moved to the receiving side of the camera, and a new roll inserted. The packs
for instant cameras contain 8 to 12 sheets that are ejected individually after each shot.
Sheet film is used for specialized applications like x-ray film.

Plastic cartridges for cartridge-type film are made by injection molding, in which fluid-
like plastic is squirted mechanically into forms or molds. These are hardened, removed
from the molds, and trimmed and smoothed. The spooled film is then placed in the
cartridges and sealed. The metal canisters are printed on the outside, cut to shape and
size, trimmed and smoothed, and edged with protective plastic. The metal is shaped
around the spools of film. Plastic canisters and caps are also made for the film canisters,
as are other types of outer packaging such as foil-lined paper pouches, and the outer
cartons. The packaging is dated, shrink-wrapped in plastic in quantities appropriate for
sale, packed in cardboard containers for shipping, and stored in air-conditioned rooms to
await shipment.


Topic 36
Dark Room & Film Processing

A darkroom is a room that can be made completely dark to allow the processing of
light-sensitive photographic materials, including photographic film and photographic paper.
Darkrooms have been created and used since the inception of photography in the early
19th century. Darkrooms have taken many forms, from the elaborate space
used by Ansel Adams to a retooled ambulance wagon used by Timothy H. O'Sullivan.
From the initial development to the creation of prints, the darkroom process allows
complete control over the medium.

Due to the popularity of color photography and the complexity of processing color film and
printing color photographs, and also to the rise first of Polaroid technology and later of digital
photography, darkrooms are decreasing in popularity, though they are still commonplace on
college campuses, in schools, and in the studios of many professional photographers. Other
applications of darkrooms include their use in nondestructive testing, such as magnetic particle
inspection.

Darkroom equipment
In most darkrooms, printmaking is done with an enlarger, an optical apparatus similar to a slide
projector that projects the image of a negative onto a base and finely controls the focus, intensity,
and duration of light. A sheet of photographic paper is exposed to the enlarged image from the
negative.

When making black-and-white prints, a safelight is commonly used to illuminate the
work area. Since the majority of black-and-white papers are sensitive to only blue, or to
blue and green light, a red- or amber-colored light can be safely used without exposing
the paper. Color print paper, being sensitive to all parts of the visible spectrum, must be
kept in complete darkness until the prints are properly fixed.


Another use for a darkroom is to load film in and out of cameras, development spools, or
film holders, which requires complete darkness. Lacking a darkroom, a photographer can
make use of a changing bag, which is a small bag with sleeved arm holes specially
designed to be completely light proof and used to prepare film prior to exposure or
developing.

Print processing

During exposure, values in the image can be adjusted, most often by dodging or burning.
Dodging reduces the amount of light reaching a specific area of an image by selectively blocking
light to it for part or all of the exposure time; burning gives additional exposure to a specific area
by exposing only it while blocking light to the rest. Filters, usually thin pieces of colored plastic,
can be used to increase or decrease an image’s contrast (the difference between dark tones and
light tones). After exposure, the photographic printing paper is ready to be processed.

Photographers generally begin printing a roll of film by making a contact print of their
negatives to use as a quick reference to decide which images to enlarge.

The exposed paper is processed, first by immersion in a photographic developer, then by halting
development with a stop bath, and finally by fixing in a photographic fixer. The print is then
washed to remove the processing chemicals and dried. There are a variety of other, additional
steps a photographer may take, such as toning.



Topic 37

Printing & Enlarging

Since the 1960s, 35mm film has been the most popular film size. While the quality of 35mm film
cannot match that of larger film sizes, the ease of use and flexibility it offers is unmatched. As a
result, a large majority of film cameras, including single-lens reflex (SLR) cameras, support the
35mm film size. And because 35mm film has the most support in the photography industry, is
available at most retail outlets, and is typically much lower in price than the larger size film
equipment, 35mm film is often the starting point for beginner photographers.

Printing with 35mm Film

Because of the smaller size of 35mm film, photographers are somewhat limited in the quality of
print that can be achieved. Depending on the subject, film speed, lighting, and other factors,
35mm film can be enlarged up to 16×20 inches. Please note that 35mm film can be enlarged as
much as you would like, but most prints larger than 16×20 will show noticeable grain and suffer
from a lower quality look. However, the personal preference or desired look the photographer
wishes to achieve will play a role in how large the image can be printed. Some photographers
using 35mm film may not go beyond a 5×7 print, while others may try to push beyond the 16×20
size. Photographers looking to make large sized prints will likely want to move up to a medium
format or large format camera, which use negatives much larger than 35mm and allow for bigger
prints.
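The enlargement figures above can be checked with simple arithmetic. The sketch below assumes the nominal 24 × 36 mm frame of 35mm film; the linear magnification is the print dimension divided by the matching frame dimension.

```python
# Rough enlargement math for a 35mm negative (nominal frame: 24 x 36 mm).
# The 16x20 "noticeable grain" threshold in the text is a rule of thumb,
# not something this arithmetic can prove.

MM_PER_INCH = 25.4
FRAME_LONG_MM, FRAME_SHORT_MM = 36.0, 24.0

def magnification(print_a_in, print_b_in):
    """Linear enlargement needed to fill a print from a full 35mm frame,
    matching the print's long side to the frame's long side."""
    long_in, short_in = max(print_a_in, print_b_in), min(print_a_in, print_b_in)
    return max(long_in * MM_PER_INCH / FRAME_LONG_MM,
               short_in * MM_PER_INCH / FRAME_SHORT_MM)

for size in [(5, 7), (8, 10), (16, 20)]:
    print(f"{size[0]}x{size[1]} in print needs about {magnification(*size):.1f}x enlargement")
```

A 16×20 print works out to roughly a 17× linear enlargement, which is why grain that is invisible in a 5×7 print becomes obvious at that size.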


What Are Contact Sheets?

Contact sheets are useful for showing clients how their finished photographs will look. They are
sheets filled with all the photos from the shoot, they are easy to produce, and they will make your
photography business look much more professional.

A contact sheet is similar to a negative but in positive colors: it contains thumbnail images of all
the photos from the shoot, which makes it very easy for your clients to look at the photos you
have chosen and decide which ones they want to order. Contact sheets are also sometimes known
as index sheets.

Digital Contact Sheets

When using software like Photoshop it's actually very easy to create contact sheets. This is
simply a matter of selecting the images that you want to include and then allowing the wizard to
add them in place. These can then be printed out on any photo printer. The photo contact sheets
can be printed to a large sheet of paper or a much smaller 4"x6" photograph if you prefer.
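The grid arithmetic a contact-sheet wizard performs can be sketched as below. The paper size, thumbnail size, and gap are illustrative values, not Photoshop defaults.

```python
# A minimal sketch of contact-sheet layout: given paper, thumbnail, and gap
# sizes (all in inches here), compute the grid that fits and the top-left
# position of each thumbnail cell.

def contact_sheet_layout(paper_w, paper_h, thumb_w, thumb_h, gap=0.125):
    cols = int((paper_w + gap) // (thumb_w + gap))
    rows = int((paper_h + gap) // (thumb_h + gap))
    cells = [(c * (thumb_w + gap), r * (thumb_h + gap))
             for r in range(rows) for c in range(cols)]
    return cols, rows, cells

cols, rows, cells = contact_sheet_layout(8.0, 10.0, 1.5, 1.0)
print(f"{cols} x {rows} grid = {cols * rows} thumbnails per sheet")
```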

Deciding Which to Order

The best advantage of a contact sheet is that it gives your clients an opportunity to look at the
photos before they have ordered them. They can then decide exactly which photos they want to
order and have printed larger.


Topic 38

Flash Photography

The reason to use flash is of course that flash is very bright (and very fast), which makes camera
exposure easy. In comparison, the brightest light bulbs are dim for photography, nowhere near
sunlight bright. Without flash, even well-lighted rooms will suffer from long, slow shutter speeds,
or high ISO, or both. Light bulbs can be fine for still life photography, when a one-second shutter
is no problem, but that is unacceptable for pictures of people, who tend to move. Flash also
allows us to create the lighting we want: to place flash units wherever we want them, to make the
light as soft as we might desire, and so on.

Flash photography is many things. There is on-camera flash and off-camera flash, manual flash
and automatic TTL flash, and direct flash and bounce flash. There is fill flash in bright sun,
multiple flash units, studio and portrait and table top flash in umbrellas, high speed flash, and
more. Lighting is a big and fun subject, but before anyone can get much into "lighting", there are
a few more fundamental basics we need to know, about "light". In all of these cases, there are
basic differences between flash and existing continuous ambient light. Flash is not difficult; it is
just different from either sunlight or regular continuous room light, and we need to understand
flash too.

In short summary, the major points, the really big deal about flash, is:

 The intensity of any near light source falls off fast with distance. Therefore, flash can
achieve a correct exposure at only one distance. TTL automation can determine that
exposure, but we also need to know that relative to that subject distance, any distant
background is necessarily underexposed; any close foreground is necessarily
overexposed. Bounce flash can help to minimize this, but distance is a huge issue for
flash, with huge implications concerning our use. But in drastic contrast, sunlight is quite
unique, very special because the Sun is so distant that its intensity appears not to vary
with subject distance.


 Flash pictures involve two exposures, flash and ambient, with two different concepts of
rules.
 Flash is typically very fast, shorter duration than the shutter duration (the shutter merely
needs to be open when the flash occurs). Speedlights in particular can be very fast, easily
stopping extreme motion. But therefore, flash exposure is not affected by shutter speed.
But of course, continuously available ambient light (continuous light) is still affected by
shutter speed, like we always understood. Since flash exposure does not care about
shutter speed, but shutter speed does affect any continuous ambient light, then
specifically, we can use shutter speed as a tool to adjust the ratio between flash and
continuous light in our photos.
 Flash makes it convenient to modify the light itself as we desire; for example, large close
lights (umbrellas, say) give very soft light with vague diffused shadows, instead of the
harsh dark shadows from a small light source. Photography of course has other important
factors (composition, lighting, etc), but flash is simply about adding light, and flash
exposure is simply about adjusting the flash power level to deliver the right amount of
light to the specific distance of your subject.
 Flash imposes a few limits we work around: flash power capability and flash range are
limited. Also our maximum shutter speed has a limit (maximum shutter sync speed,
which varies with camera model, but usually in the ballpark of around 1/200 second
maximum shutter sync speed). And waiting for flash recycle time between pictures can
be a factor. But flash also allows us control over more things about the light in our photos
- direction, intensity, soft light from umbrellas, etc.
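The falloff in the first point above is the inverse-square law: illumination from a near light source varies as 1/d². A minimal sketch, with arbitrary example distances:

```python
# Inverse-square falloff: a background twice as far from the flash as the
# subject receives only a quarter of the light, i.e. it is two stops darker.
import math

def relative_illumination(subject_dist, other_dist):
    """Light reaching other_dist, relative to the correctly exposed subject."""
    return (subject_dist / other_dist) ** 2

def stops_difference(subject_dist, other_dist):
    return math.log2(relative_illumination(subject_dist, other_dist))

print(relative_illumination(3, 6))  # background at twice the distance: 0.25
print(stops_difference(3, 6))       # -2.0 stops (underexposed)
```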

Flash is just a light that we can aim. In one way, it is just another light source, but we can aim
flash where we want it (lighting), and we can turn its power up or down (exposure), to deliver the
lighting and exposure we want. It is not rocket science. Our picture shows everything that
happens. For Exposure, we simply adjust the flash intensity to give the result we want at the
subject. In manual flash modes, we simply adjust flash power level to do this. In TTL flash
modes, TTL automation gets it close, and then we simply adjust Flash Compensation to adjust
this level for our preference. Either way, manual or TTL, if it is too bright, then turn it down, etc.
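For manual flash, the power-versus-distance adjustment described above is commonly summarized by the guide number (GN) rule, which the text does not name: GN = distance × f-number at a given ISO, so the correct aperture is GN divided by subject distance. The GN below is hypothetical, not a specification of any real flash unit.

```python
# Guide-number arithmetic for manual flash exposure.
# GN = distance x f-number, so f-number = GN / distance.

def aperture_for(guide_number, distance):
    """f-number giving a correct manual-flash exposure at this distance."""
    return guide_number / distance

GN = 40  # hypothetical guide number, in meters at ISO 100, full power
for d in (2.5, 5, 10):
    print(f"subject at {d} m -> f/{aperture_for(GN, d):.0f}")
```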


Flash can be necessary, and it can be a big help. The simplest tips for universally better hot-shoe
speedlight snapshots are:

 Pictures indoors need flash of course, and specifically, bounce flash offers much better
lighting. Aiming the flash head up at the ceiling is simple, and while it is not always possible,
better lighting is almost automatic when it is. Direct flash is flat, uninteresting light, but
bounce comes from an off-camera angle, causing soft, graduated tonal shading that shows
shapes; the result is greatly improved, though bounce does need more flash power.
 Pictures of people outdoors in bright sun need fill flash, to lighten the dark harsh
shadows. Balanced flash mode is designed to do this, and even the little popup flash will
help, if the distance is not too great. Of course, finding some shade is always good (softer
light), but a little fill flash is still needed.
 The control of automatic TTL exposure is done with Flash Compensation. Simply adjust
Flash Compensation as seen needed. If TTL automation gives too much flash, turn it
down a little with -EV Flash Compensation, or vice versa.


Topic no 39

Comparative Imaging

Comparative imaging can be better understood with the perfect knowledge of lenses and their
function and usage. There are different types of lenses.

Types of lenses

1. Fixed Focal Lens/ Prime lens

Also referred to as a "prime lens," the fixed focal length lens (FFL) has a focal length that is
not adjustable. Photographers are unable to zoom in and out on a particular subject when
using a prime lens. Often used as a term opposite of zoom, prime lenses have only one focal
length, with fewer moving parts and a simpler lens formula. A fixed focal length lens is less
likely to produce images with chromatic aberrations (fringes of color along boundaries of dark
and light parts of an image). FFL lenses come in all focal lengths, from a wide-angle lens to
the longer telephoto lenses.

2. Zoom lens

Zoom lenses have variable focal lengths, and are extremely useful. Some can range
between a wide-angle and a telephoto (i.e. 24 to 300mm) so you have extensive versatility
for composition. The trade-off with zoom lenses is the aperture. Because of the number of
elements required in constructing these lenses, they have a limited ability to open up and
allow in light. So unless you’re prepared to outlay a lot of money, you will give up lens
speed.

3. Wide angle lens

A wide-angle has a shorter focal length (10 thru 42mm) when compared to a standard lens.
This enables you to capture a comparatively wider angle of view. A wide-angle lens is a
natural choice for capturing outdoor landscapes and group portraits. In fact, wide angle
can be the only way to capture the complete setting without omitting any important
elements in the image. In this manner, you can use wide-angle lenses to capture a deep
DOF.

4. Tele-photo lens

Telephoto lenses (100mm - 800mm) can provide you with a narrow field of view. These
long lenses enable you to compress a distance (and compress the sense of depth, as well)
and pick out specific objects from far off. They have a strong resolving power and an
inherent shallow DOF, where the slightest lateral movement can take a subject out of view.
Telephoto lenses are great for wildlife, portrait, sports, and documentary types of
photography. They enable you to capture subjects from hundreds of feet away.

5. Macro/Micro photography lens

Macro lenses are used for close-up or “macro” photography. They range in focal lengths
of between 50-200mm. These lenses obtain razor-sharp focus for subjects within the
macro focus distance, but lose their ability for sharp focus at other distances. These lenses
enable the photographer to obtain life-size or larger images of subjects like wasps,
butterflies, and flowers.

6. Process lens

Process lenses are lenses which are primarily designed for the graphics industry. They are flat
field lenses which are optimized for 1:1 reproduction. That being said, many of the process
lenses are excellent large format landscape lenses if stopped down. Normally a process lens will
be mounted in a barrel.
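The focal-length ranges quoted above map to angle of view through the standard formula AOV = 2·atan(d / 2f), where d is the sensor or film dimension. The sketch below assumes a full-frame 36 mm sensor width:

```python
# Angle of view vs focal length for a 36 mm wide (full-frame) sensor.
import math

def angle_of_view(focal_mm, sensor_mm=36.0):
    """Horizontal angle of view in degrees: AOV = 2 * atan(d / (2 * f))."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

for f in (24, 50, 300):
    print(f"{f} mm lens -> {angle_of_view(f):.1f} degrees horizontal")
```

A 24 mm wide-angle sees roughly 74 degrees across the frame, while a 300 mm telephoto sees under 7, which is the narrow field of view the telephoto section describes.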


Topic 40

Photographic Lab- Photoshop

How to Use Layers in Photoshop

If any of you have dabbled in Photoshop, you understand how frustrating it can be if you don’t
know how the layers work. This is by far the most important element of Photoshop, one of the
reasons many people throw their arms up in frustration -- but once you understand how they
work, they'll make your life much easier.

Think of layers as sheets of glass stacked on top of one another that you'll use to create a final
product. Each sheet can be modified individually without affecting the project as a whole; this
can save you tons of time when making edits to individual elements of your graphic. A layer can
be used for an image, text, brush strokes, background colors, patterns, and filters.

To add or delete a layer, you can either use the Layers tab from the top menu bar and select your
option, or you can locate the Layers module on the right hand side of your Photoshop application
and use the corresponding icons. Once you've located your Layers tool bar, you'll notice a
number of icons along the bottom of that module. The add and delete icons are indicated by the
screenshot below. You will also notice the little "eye" icons along the left side of each layer you
add; clicking these icons lets you toggle the visibility of those layers.
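The sheets-of-glass analogy corresponds to the standard "over" compositing operator that layer stacks use. This is a generic, Photoshop-independent sketch; the colors and opacity are arbitrary examples.

```python
# Alpha compositing with the "over" operator: each layer is an RGBA tuple
# (channels 0.0-1.0), and stacking blends the top layer over the bottom.

def over(top, bottom):
    tr, tg, tb, ta = top
    br, bg, bb, ba = bottom
    a = ta + ba * (1 - ta)  # combined opacity of the two layers
    if a == 0:
        return (0.0, 0.0, 0.0, 0.0)  # both layers fully transparent
    def blend(t, b):
        return (t * ta + b * ba * (1 - ta)) / a
    return (blend(tr, br), blend(tg, bg), blend(tb, bb), a)

background = (1.0, 1.0, 1.0, 1.0)  # opaque white base layer
red_wash = (1.0, 0.0, 0.0, 0.5)    # half-transparent red layer on top
print(over(red_wash, background))  # (1.0, 0.5, 0.5, 1.0): a 50% red tint
```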

How to Use the Color Tool in Photoshop

This may seem like a pretty self-explanatory element, but the Color tool in Photoshop has
powerful features that will keep your visual content vibrant and unify your color schemes. The
Color tool is located on the tool bar on the right, and lets you use, modify, copy, and save custom
colors for your content.

In the color module, there are a set of bars that you can adjust to create your own custom color --
like if you already have a brand style guide that defines the colors you should use. Or, you can
use the quick color selection bar located underneath the RGB bars to find the color that's right for
you. You'll notice that your selected color is indicated by a small box shown in the screenshot
below. If you would like to use a more expansive color picker, simply double click the
foreground color box, and you'll be presented with a much more advanced color creating tool to
work with. Within the Color Picker, you'll be able to create your custom color and save it to your
swatches for future use.

How to Use Fonts and the Text Tool in Photoshop

Fonts are a defining piece of any marketer’s visual content. Let’s face it; an image of a teddy
bear in a box is just an interesting image until you define it with some text. The great thing about
the Photoshop Text tool is that it’s easy to add custom fonts to your database, and it gives you
access to advanced font settings that give your text some serious style.

To access your font tools, click the icon indicated in the screenshot below. You'll immediately
notice all of the settings and font options you have available at your fingertips. These settings let
you change the font, font size, spacing between characters, height, width, color, and style.

The text tool works like any other text tool you’ve used. Click the "T" icon on the left side bar,
drag the text box over any particular area you want text to appear, and you’re set to go.
Whenever you create a text box, Photoshop will generate a layer for it. You can choose the
color, size, stroke, font style, and a variety of other options to switch things up.

How to Use Brushes in Photoshop

Brushes are a great way to add some visual accents to your content. Photoshop starts you off
with a nice selection of brush tips that you can use to clean up your graphics and create some
basic visual effects. Just as with fonts, you can add your own royalty-free custom brush tips.
With the brush settings, you can change the size, shape, and transparency of your brush strokes
to achieve a number of different visual effects. Check out the screenshot below for a breakdown
of the brush settings.


How to Use the Brush Tool in Photoshop

The brush tool, as mentioned earlier, is perfect for adding design accents to your marketing
content. When using the brush tool, I always suggest adding a new layer to work with so you
don’t paint over any of your other elements. You can choose colors from your library of
swatches, or use a custom color. In the screenshot below, I’m using a custom brush to add a
splash of color and character to this graphic. It’s a simple process that produces great results.

To access your Brush tool, locate the icon indicated in the screenshot (you'll also notice that the
brush icon is also located in the top left corner). The drop down will indicate your current brush
tip.

Changing the brush settings can give your brush a drastically different look and style. Don’t be
afraid to play around a bit with all of your custom brushes. To make changes to your brush tip,
locate the settings icon on the side bar on the right; a slide-out menu will appear, and you can
make your changes as you see fit.

Once you're satisfied with your settings, create a new layer over the elements you'd like to paint
and go to town!

How to Use the Select Tool in Photoshop

If you’ve used any photo editing software, you’re probably well aware of the Select tool. For
Photoshop, this tool is represented by the dotted-line square in your tool bar (you'll see this in the
screenshot below). This is one of the most basic, yet frustrating, tools to use in Photoshop.
When used correctly, this tool will let you select individual elements or entire graphics, and
determine what is copied, cut, and pasted into your graphics.

Some people also get hung up on how to select an image to insert in their graphic. To do this,
open the image you would like to use in Photoshop, and use the Select Tool to determine how
much of the image you would like to copy. Once you’ve selected the area of the image, simply
copy the area. Open the tab for your current project and paste it in as a new layer. Once you’ve
pasted the image, you can position and resize the image any way you’d like.


How to Use the Move Tool in Photoshop

This is a fairly basic tool that allows you to move individual elements of your graphic. The Move
tool works on individual layers, and on the graphic as a whole -- if (remember how to do this?)
you highlight all of your layers. This tool comes in handy when you’re trying to reposition
images, text, and other design elements. Click the Move Icon from the left hand menu bar and
simply drag the object(s) you would like to move. You can also right click the object for
additional options.

How to Use the Magnetic Lasso in Photoshop

Have you ever wished you could just select a particular shape, person, or object instead of having
to highlight the entire image? Well, the Magnetic Lasso will do the trick! You can access the
Magnetic Lasso from your tool bar. Simply click it, and start selecting your custom object by
guiding your cursor along the outside of the object -- the Magnetic Lasso will snap to it like,
well, a magnet. Use your zoom tool to get up close and personal with your object, too, for more
accurate lasso work. Once you’ve finished highlighting your object, right click and copy or cut it
out.

How to Use the Eraser in Photoshop

The eraser is one of the most useful tools in Photoshop. Yes, I understand it’s technically just an
eraser, but you’ve never used an eraser like this!

First off, we’ve got the basic eraser, which functions a lot like the brush tool. You can change the
size and hardness of the eraser tip to achieve a variety of effects, like blending and fades. To use
the Erase tool, locate the icon on the right-hand tool bar and select it. Once you've selected your
Eraser tool, you can change the size, hardness, and other aspects of that erase tool by clicking on
the drop down tab in the top menu bar, indicated by the screenshot. Like most tools in
Photoshop, the eraser works only on a specifically selected layer. Make sure you've got the layer
you want selected before you start erasing!


The Background eraser uses differences in color to help you erase unwanted background areas
from your images. This tool is a time-saving wonder! You can see how easily it eliminates
background colors from images. This is especially helpful if you need an object with a
transparent background.

To use the Background eraser, click and hold the eraser icon until the slide out menu appears.
Once you've located the Background eraser icon, pictured in the screenshot, click it. Now you're
ready to do some serious erasing! Adjust the size of the Background eraser, and simply click the
color you would like deleted from the selected layer.

How to Use the Crop Tool in Photoshop

This works like any crop tool you’ve ever encountered. Simply choose your area and crop it out!
You’ll find yourself using this just as often as any other tool in Photoshop, especially when
you’ve completed your graphic and need to clean up some of the free space around the edges.

To use the Crop tool, select the icon indicated in the screenshot from the side menu bar, and drag
the box over the area you would like to crop. To adjust the crop box, simply click and drag the
small anchor boxes on the sides and corners of the crop box.

How to Use the Paint Bucket in Photoshop

This tool is perfect for giving your marketing content some much needed color, and a nice
background off of which to build your graphics. There are three tools built into the Paint Bucket
that a marketer should learn how to use. The first is the basic Paint Bucket tool, which essentially
fills any solid area with the color of your choice. It’s great for solid backgrounds or coloring
large areas.

To use the Paint Bucket tool, simply select the icon from the side menu bar as indicated in the
screenshot, and find the color you would like to paint. Once you've found the color you'd like to
use, select the appropriate layer and apply the color by clicking the area you would like painted.


The Paint Bucket can also be used to apply patterns to your images. These patterns can be
manually created if you have the time and patience, or you can find a variety of royalty-free
patterns available for download through a basic Google search.

To use the Paint Bucket to apply a pattern, select the paint bucket icon from the side menu bar
and use the drop down tab from the top menu bar to change the foreground to "Pattern" as
indicated in the screenshot below. Once you've changed the tab over to "Pattern," you will notice
a new tab appears to the right of that "Pattern" tab. Select that new pattern box and choose your
pattern from the drop down box. You have now chosen a pattern, and are now ready to apply that
pattern to whichever layer you would like.

The third feature of the paint bucket is the Gradient tool. This can be used to create a nice faded
background effect of the color of your choice. It’s a simple tool that’s easy to use, and produces a
nice professional look for your marketing content.

To use the Gradient tool, click and hold the paint bucket icon until the slide out menu appears.
Select the Gradient tool and make sure that the Gradient tab in the top menu bar is set to a
gradual fade, as indicated in the screenshot above. Now, choose the color you would like to use,
place your cursor on the right side of the graphic, and drag to the left.
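Under the hood, a linear gradient is per-channel interpolation between two endpoint colors. The endpoints and width below are arbitrary examples:

```python
# A linear gradient as color interpolation: t = 0 gives the first color,
# t = 1 gives the second, and values in between fade smoothly.

def lerp_color(c0, c1, t):
    return tuple(round(a + (b - a) * t) for a, b in zip(c0, c1))

white, blue = (255, 255, 255), (0, 64, 192)
width = 5  # a 5-pixel-wide row, left to right
row = [lerp_color(white, blue, x / (width - 1)) for x in range(width)]
print(row)  # starts at white, ends at blue
```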

How to Use the Eyedropper in Photoshop

This handy little tool lets you simply extract and use any color from any image in Photoshop. It’s
perfect for keeping your marketing content uniform with company color schemes. To use the
Eyedropper tool, select the icon from the sidebar as indicated in the screenshot. Locate that color
you would like to extract, and simply click that area to clone the color. Once you've extracted the
color, you will see it indicated at the bottom of the left sidebar. You can double click that color
box to bring up the advanced color picker where you can adjust and save the color to a swatch
for future use.


How to Use Blending Options in Photoshop

The blending options are great for putting finishing touches on your marketing content. You can
access these options by right-clicking the layer you want to use and selecting “Blending Options”
from the menu. You can also double click any layer to bring up the options for that particular
layer. You’ll notice in the screen shot below that there are quite a number of features you can
use. You should take some time to play around with all the layer effects and find out which ones
tickle your fancy.


Topic 41

Digital Imaging Device

Key components in DSLR cameras

In the above figure, we can see that there are several key components in a DSLR camera, which
are listed as follows with their respective functions:

1. Matte focusing screen: A screen onto which the light passing through the lens is
projected.
2. Condensing lens: A lens that is used to concentrate the incoming light.
3. Pentaprism: To produce a correctly oriented and right side up image and project it to
the viewfinder eyepiece.
4. AF sensor: Its full name is autofocus sensor, which is used to accomplish correct
autofocus.
5. Viewfinder eyepiece: To allow us to see what will be recorded on the image sensor.
6. LCD screen: Its full name is liquid crystal display, which is used to display the
photos stored in its memory card, settings and also what will be recorded on the image
sensor in the live view mode.
7. Image sensor: A device that contains a large number of pixels for converting an
optical image into electrical signals. The commonly used types are charge-coupled
device (CCD) and complementary metal-oxide-semiconductor (CMOS).
8. AE sensor: Its full name is auto exposure sensor; it provides exposure information and adjusts the exposure settings, after calculation, to suit different situations.
9. Sub mirror: To reflect the light that passes through the semi-transparent area on the main mirror to the autofocus (AF) sensor.
10. Main mirror: To reflect incoming light into the viewfinder compartment. It must sit at an angle of exactly 45 degrees. There is a small semi-transparent area on it to facilitate autofocus.

Now that we have an overall idea of the internal structure of a DSLR camera and the functions of its components, the following figure illustrates its working principle.

The basic working principle of a DSLR camera


In the above figure, we can see that light from the outside world first passes through the
lens. After this, the light is projected on the matte focus screen through reflection by the
main mirror. The condensing lens and the pentaprism then project the image formed on
the matte focus screen to the viewfinder eyepiece by internal reflection. This explains
why we can see the image that will be taken by the camera through the viewfinder.

When we need to take a photo using autofocus, we can first press the shutter button half-
way down to trigger the process. During this process, the light is directed to the AF
sensor by the sub-mirror. The AF sensor then performs a series of calculations to achieve
correct focus. After focusing, the main mirror will flip up (towards the matte focus
plane). As a result, the light coming from the lens can reach the image sensor. A digital
image is formed after the light has been converted to electronic signals by the image
sensor.

Mirrorless Interchangeable Lens Cameras (MILCs)

Mirrorless interchangeable lens camera


The internal structure of a MILC

From the above figure, we can see that there are some common components with DSLR cameras
mentioned in the previous part, for instance image sensor and LCD screen.

However, as its name suggests, a MILC has no mirror, unlike a typical DSLR, and therefore no optical viewfinder. Also, instead of the phase-detect autofocus used in conventional DSLR cameras (achieved by dividing the incoming light into pairs of images and comparing them), MILCs use contrast-detect autofocus, achieved by measuring the contrast within the field of the image sensor. In addition, auto exposure in a MILC is handled by the image sensor in real time, rather than by a dedicated AE sensor as in conventional DSLR cameras. Furthermore, most MILCs do not have a viewfinder at all, only the LCD display. This can be a great disadvantage under strong sunlight, since reflections make it very difficult to see what the LCD display shows.


Topic 42
Digital Imaging Files

Pixel: In digital imaging, a pixel, pel, or picture element is a physical point in a raster image, or
the smallest addressable element in an all-points-addressable display device; so it is the smallest
controllable element of a picture represented on the screen. The address of a pixel corresponds to
its physical coordinates. LCD pixels are manufactured in a two-dimensional grid, and are often
represented using dots or squares, but CRT pixels correspond to their timing mechanisms and
sweep rates.

Each pixel is a sample of an original image; more samples typically provide more accurate
representations of the original. The intensity of each pixel is variable. In color image systems, a
color is typically represented by three or four component intensities such as red, green, and blue,
or cyan, magenta, yellow, and black.
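As a rough sketch in Python (the language is our choice, not the text's; the sample values and the Rec. 601 luma weights are illustrative assumptions), a color pixel is just a tuple of component intensities, and collapsing it to a single gray intensity weights those components:

```python
# A color pixel as three component intensities (0-255 per channel is an
# assumed 8-bit sampling; the weights are the standard Rec. 601 luma
# coefficients, used here purely as an illustration).
pixel = (200, 120, 40)  # red, green, blue

def to_gray(rgb):
    """Collapse the three component intensities into one gray intensity."""
    r, g, b = rgb
    return round(0.299 * r + 0.587 * g + 0.114 * b)

print(to_gray(pixel))  # → 135
```

The same idea extends to four-component systems such as CMYK, only with different weights.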

In some contexts (such as descriptions of camera sensors), the term pixel is used to refer to a
single scalar element of a multi-component representation (more precisely called a photosite in
the camera sensor context, although the neologism sensel is sometimes used to describe the
elements of a digital camera's sensor), while in others the term may refer to the entire set of such
component intensities for a spatial position. In color systems that use chroma subsampling, the
multi-component concept of a pixel can become difficult to apply, since the intensity measures
for the different color components correspond to different spatial areas in such a representation.

The word pixel is based on a contraction of pix (pictures) and el (element); similar formations
with el for “element” include the words voxel and texel.

Resolution of computer monitors: Computers can use pixels to display an image, often an
abstract image that represents a GUI. The resolution of this image is called the display resolution
and is determined by the video card of the computer. LCD monitors also use pixels to display an
image, and have a native resolution. Each pixel is made up of triads, with the number of these
triads determining the native resolution. On some CRT monitors, the beam sweep rate may be
fixed, resulting in a fixed native resolution. Most CRT monitors do not have a fixed beam sweep
rate, meaning they do not have a native resolution at all; instead they have a set of resolutions that are equally well supported. To produce the sharpest images possible on an LCD, the user must ensure the display resolution of the computer matches the native resolution of the monitor.
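To make the idea concrete, here is a minimal Python sketch (the resolutions are example figures, not taken from the text) that counts a display's pixels and checks whether the output matches the native resolution:

```python
# Total pixels in a display mode; 1920x1080 is an illustrative example.
def pixel_count(width, height):
    return width * height

native = (1920, 1080)    # the LCD panel's fixed native resolution
display = (1920, 1080)   # the resolution the video card is outputting

print(pixel_count(*native))  # → 2073600 pixels (about 2.1 megapixels)
print("sharpest output" if display == native else "scaled output")
```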

Bits per pixel: A bit is the basic unit of information in computing and digital communications. A bit can have only one of two values, and may therefore be physically implemented with a two-state device. These values are most commonly represented as either 0 or 1. The term bit is a contraction of binary digit.

The number of distinct colors that can be represented by a pixel depends on the number of bits
per pixel (bpp). A 1 bpp image uses 1-bit for each pixel, so each pixel can be either on or off.
Each additional bit doubles the number of colors available, so a 2 bpp image can have 4 colors,
and a 3 bpp image can have 8 colors:

• 1 bpp, 2^1 = 2 colors (monochrome)
• 2 bpp, 2^2 = 4 colors
• 3 bpp, 2^3 = 8 colors
• 8 bpp, 2^8 = 256 colors
• 16 bpp, 2^16 = 65,536 colors (Highcolor)
• 24 bpp, 2^24 = 16,777,216 colors (Truecolor)
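The doubling pattern above can be verified in a couple of lines of Python (the language is our choice, not the text's):

```python
# Each extra bit doubles the number of distinct colors: 2 ** bpp.
for bpp in (1, 2, 3, 8, 16, 24):
    print(bpp, "bpp ->", 2 ** bpp, "colors")
```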

The Byte: A byte is a unit of digital information in computing and telecommunications that most
commonly consists of eight bits. Historically, the byte was the number of bits used to encode a
single character of text in a computer and for this reason it is the smallest addressable unit
of memory in many computer architectures. The size of the byte has historically been hardware
dependent and no definitive standards existed that mandated the size. The de facto standard of
eight bits is a convenient power of two permitting the values 0 through 255 for one byte. The
international standard IEC 80000-13 codified this common meaning. Many types of applications
use information representable in eight or fewer bits and processor designers optimize for this
common usage. The popularity of major commercial computing architectures has aided in the
ubiquitous acceptance of the 8-bit size.
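A quick Python sketch of the same point (Python's bytes type is used here only as a convenient illustration of the 0 through 255 range):

```python
# Eight bits give 2 ** 8 = 256 values, so one byte holds 0 through 255.
assert 2 ** 8 == 256
b = bytes([0, 127, 255])      # all valid single-byte values
print(list(b))                # → [0, 127, 255]
try:
    bytes([256])              # one past the top of the range
except ValueError:
    print("256 does not fit in one byte")
```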


The unit octet was defined to denote a sequence of 8 bits explicitly, because of the ambiguity associated at the time with the byte.


Topic 43

Videography- Historical Background

With the invention of the kinetoscope by Thomas Edison in 1891, the trend towards what
we now call videography had begun. The kinetoscope led the way to the Kinetograph,
which was considered the first motion picture device. The Kinetograph consisted of still
photographs that were positioned in a setting of quick stop-and-go film movements and
gave the appearance of an actual movie.

In the 1920s and 1930s, people would flock to the movie “picture house” to watch silent
movies led by, of course, Charlie Chaplin. Then “talkies” (talking film movies) evolved
and the movie industry took off.
In 1932, 8mm movie film cameras and projectors were being sold to the public by the Kodak Company, one of the pioneers of film projection. Kodak later introduced "Super 8" film, a format that provided images appearing larger than normal 8mm film.

Whether 8mm or super 8mm film was used, people across the world would use these silent
movie cameras to film their families on vacation, their children, and weddings and other
affairs. When the movie was complete, the film would be sent to a film development company for processing. To view the movie, the film had to be threaded through a movie projector, a process that usually took between five and ten minutes.
The technology had its advancements, but in general, making home movies this way
continued right into the 1960s and 1970s. Then a breakthrough occurred. What was once
only seen in sci-fi movies – video – became a reality. No longer did people need to use the cumbersome 8mm film cameras and projectors. All that was needed was a video camera, a cartridge and a player on which to watch the movies.
In the early years of video, the cameras were large and needed a separate battery backpack
when making movies. As technology advanced in the 1980s, the bulky video camera was
reduced to a hand-held size, making it much easier for people to take home movies.


Professional photographers embraced this new technology and started offering professional
videos for weddings, bar mitzvahs and corporate events; hence, the beginning of the
professional videography industry.


Topic 44

Motion Picture Historical Background

The illusion of motion pictures is based on the optical phenomena known as persistence of
vision and the phi phenomenon. The first of these causes the brain to retain images cast upon the
retina of the eye for a fraction of a second beyond their disappearance from the field of sight,
while the latter creates apparent movement between images when they succeed one another
rapidly. Together these phenomena permit the succession of still frames on a motion-picture film
strip to represent continuous movement when projected at the proper speed (traditionally 16
frames per second for silent films and 24 frames per second for sound films).

Before the invention of photography, a variety of optical toys exploited this effect by mounting
successive phase drawings of things in motion on the face of a twirling disk (the phenakistoscope, c. 1832) or inside a rotating drum (the zoetrope, c. 1834). Then, in
1839, Louis-Jacques-Mandé Daguerre, a French painter, perfected the positive photographic
process known as daguerreotypy, and that same year the English scientist William Henry Fox
Talbot successfully demonstrated a negative photographic process that theoretically allowed
unlimited positive prints to be produced from each negative.

As photography was innovated and refined over the next few decades, it became possible to
replace the phase drawings in the early optical toys and devices with individually posed phase
photographs, a practice that was widely and popularly carried out.

There would be no true motion pictures, however, until live action could be photographed
spontaneously and simultaneously. This required a reduction in exposure time from the hour or
so necessary for the pioneer photographic processes to the one-hundredth (and, ultimately, one-
thousandth) of a second achieved in 1870. It also required the development of the technology of
series photography by the British American photographer Eadweard Muybridge between 1872
and 1877. During that time, Muybridge was employed by Gov. Leland Stanford of California, a zealous racehorse breeder, to prove that at some point in its gallop a running horse lifts all four
hooves off the ground at once. Conventions of 19th-century illustration suggested otherwise, and
the movement itself occurred too rapidly for perception by the naked eye, so Muybridge
experimented with multiple cameras to take successive photographs of horses in motion.

Finally, in 1877, he set up a battery of 12 cameras along a Sacramento racecourse with wires
stretched across the track to operate their shutters. As a horse strode down the track, its hooves
tripped each shutter individually to expose a successive photograph of the gallop, confirming
Stanford’s belief. When Muybridge later mounted these images on a rotating disk and projected
them on a screen through a magic lantern, they produced a “moving picture” of the horse at full
gallop as it had actually occurred in life.

The French physiologist Étienne-Jules Marey took the first series photographs with a single
instrument in 1882; once again the impetus was the analysis of motion too rapid for perception
by the human eye. Marey invented the chronophotographic gun, a camera shaped like a rifle that
recorded 12 successive photographs per second, in order to study the movement of birds in
flight. These images were imprinted on a rotating glass plate (later, paper roll film), and Marey
subsequently attempted to project them. Like Muybridge, however, Marey was interested in
deconstructing movement rather than synthesizing it, and he did not carry his experiments much
beyond the realm of high-speed, or instantaneous, series photography. Muybridge and Marey, in
fact, conducted their work in the spirit of scientific inquiry; they both extended and elaborated
existing technologies in order to probe and analyze events that occurred beyond the threshold of
human perception. Those who came after would return their discoveries to the realm of normal
human vision and exploit them for profit.

In 1887 in Newark, N.J., an Episcopalian minister named Hannibal Goodwin developed the idea
of using celluloid as a base for photographic emulsions. The inventor and industrialist George
Eastman, who had earlier experimented with sensitized paper rolls for still photography, began
manufacturing celluloid roll film in 1889 at his plant in Rochester, N.Y. This event was crucial
to the development of cinematography: series photography such as Marey’s chronophotography
could employ glass plates or paper strip film because it recorded events of short duration in a
relatively small number of images, but cinematography would inevitably find its subjects in longer, more complicated events, requiring thousands of images and therefore just the kind of
flexible but durable recording medium represented by celluloid. It remained for someone to
combine the principles embodied in the apparatuses of Muybridge and Marey with celluloid strip
film to arrive at a viable motion-picture camera—an innovation achieved by William Kennedy
Laurie Dickson in the West Orange, N.J., laboratories of the Edison Company.


Topic 45

Film Reel

Reel, in motion pictures, is a light circular frame with radial arms and a central axis, originally
designed to hold approximately 1,000 feet (300 m) of 35-millimetre motion-picture film. In the
early days of motion pictures, each reel ran about 10 minutes, and the length of a picture was
indicated by the number of its reels. A film was a “one-reeler,” a “two-reeler,” or longer.

The number of reels in a motion picture became a point of controversy in the United States when
the Motion Picture Patents Company (1909–17), a trust of major film producers and distributors
who attempted a monopoly of the industry from 1909 to 1912, limited the length of films to one
or two reels because the viewing audience was considered incapable of appreciating motion
pictures of greater duration. Multiple-reel films achieved widespread acceptance in 1912,
however, becoming known thereafter as “feature” films. The word reel has lost its original
meaning in terms of time, since a modern projector accommodates reels holding from 2,000 to
3,000 feet of 35-millimetre film, while the so-called mini-theatres often mount an entire movie
on a single reel.

It is traditional to discuss the length of theatrical motion pictures in terms of "reels". The
standard length of a 35 mm film reel is 1,000 feet (305 m), which runs approximately 11 minutes
for sound film (24 frames per second) and slightly longer at silent film speed (which may vary
from approximately 16 to 22 frames per second). Most films have visible cues which mark the
end of the reel. This allows projectionists running reel-to-reel to change over to the next reel on
the other projector.

A so-called "two-reeler" would have run about 20–24 minutes since the actual short film shipped
to a movie theater for exhibition may have had slightly less (but rarely more) than 1,000 ft
(305 m) on it. Most modern projectionists use the term "reel" when referring to a 2,000-foot
(610 m) "two-reeler", as modern films are rarely shipped by single 1,000-foot (305 m) reels. A
standard Hollywood movie averages about five 2000-foot reels in length.

The "reel" was established as a standard measurement because of considerations in printing motion picture film at a film laboratory, for shipping (especially the film case sizes) and for the size of the physical film magazine attached to the motion picture projector. Had it not been
standardized (at 1,000 ft or 305 m of 35 mm film) there would have been many difficulties in the
manufacture of the related equipment. A 16 mm "reel" is 400 feet (122 m). It runs, at sound
speed, approximately the same amount of time (11–12 minutes) as a 1,000-foot (305 m) 35 mm
reel.
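The running times quoted above follow from simple arithmetic. The frames-per-foot figures below (16 frames per foot for 35 mm film, 40 for 16 mm) are standard values assumed here rather than stated in the text:

```python
# Reel running time at sound speed (24 frames per second).
def runtime_minutes(feet, frames_per_foot, fps=24):
    frames = feet * frames_per_foot
    return frames / fps / 60

print(round(runtime_minutes(1000, 16), 1))  # 35 mm, 1,000 ft → 11.1
print(round(runtime_minutes(400, 40), 1))   # 16 mm, 400 ft → 11.1
```

Both reels work out to roughly 11 minutes, matching the figures in the text.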

A "split reel" is a motion picture film reel in two halves that, when assembled, hold a specific
length of motion picture film that has been wound on a plastic core. Using a split reel allows film
to be shipped or handled in a lighter and smaller form than it would on a "fixed" reel. In silent film terminology, a "split reel" meant two short films on one reel.

As digital cinema catches on, the physical reel is being replaced by a virtual format called Digital
Cinema Package, which can be distributed using any storage media (such as hard drives) or data
transfer medium (such as the Internet or satellite links) and projected using a digital projector
instead of a conventional movie projector.

Actors may submit a demo reel of their work to prospective employers, often in physical reel
format.


Topic no. 46

Film Projection

The History of 35mm Film

The first machine patented in the United States that showed animated pictures or movies was a
device called the "wheel of life" or "zoetrope". Patented in 1867 by William Lincoln, moving drawings or photographs were watched through a slit in the zoetrope. However,
this was a far cry from motion pictures as we know them today. Modern motion picture making
began with the invention of the motion picture camera.

The Frenchman Louis Lumiere is often credited as inventing the first motion picture camera in
1895. But in truth, several others had made similar inventions around the same time as Lumiere.
What Lumiere invented was a portable motion-picture camera, film processing unit and projector
called the Cinematographe, three functions covered in one invention.

The Cinematographe made motion pictures very popular, and it could well be said that Lumiere's invention began the motion picture era. In 1895, Lumiere and his brother were the first to present projected, moving, photographic pictures to a paying audience of more than one person.

The Lumiere brothers were not the first to project film. In 1891, the Edison company successfully demonstrated the Kinetoscope, which enabled one person at a time to view moving pictures. Later, in 1896, Edison showed his improved Vitascope projector, the first commercially successful projector in the U.S.


Topic no. 47

STILL CAMERA MECHANISM

Photography is undoubtedly one of the most important inventions in history -- it has truly
transformed how people conceive of the world. Now we can "see" all sorts of things that are
actually many miles -- and years -- away from us. Photography lets us capture moments in time
and preserve them for years to come.

The basic technology that makes all of this possible is fairly simple. A still film camera is made
of three basic elements: an optical element (the lens), a chemical element (the film) and a
mechanical element (the camera body itself). As we'll see, the only trick to photography is
calibrating and combining these elements in such a way that they record a crisp, recognizable
image.

There are many different ways of bringing everything together. In this chapter, we'll look at
a manual single-lens-reflex (SLR) camera. This is a camera where the photographer sees
exactly the same image that is exposed to the film and can adjust everything by turning dials and
clicking buttons. Since it doesn't need any electricity to take a picture, a manual SLR camera
provides an excellent illustration of the fundamental processes of photography.

The optical component of the camera is the lens. At its simplest, a lens is just a curved piece of
glass or plastic. Its job is to take the beams of light bouncing off of an object and redirect them
so they come together to form a real image -- an image that looks just like the scene in front of
the lens.

But how can a piece of glass do this? The process is actually very simple. As light travels from
one medium to another, it changes speed. Light travels more quickly through air than it does
through glass, so a lens slows it down.

When light waves enter a piece of glass at an angle, one part of the wave will reach the glass
before another and so will start slowing down first. This is something like pushing a shopping
cart from pavement to grass, at an angle. The right wheel hits the grass first and so slows down
while the left wheel is still on the pavement. Because the left wheel is briefly moving more
quickly than the right wheel, the shopping cart turns to the right as it moves onto the grass.


The effect on light is the same -- as it enters the glass at an angle, it bends in one direction. It
bends again when it exits the glass because parts of the light wave enter the air and speed up
before other parts of the wave. In a standard converging, or convex lens, one or both sides of the
glass curves out. This means rays of light passing through will bend toward the center of the lens
on entry. In a double convex lens, such as a magnifying glass, the light will bend when it exits
as well as when it enters.


This effectively reverses the path of light from an object. A light source -- say a candle -- emits
light in all directions. The rays of light all start at the same point -- the candle's flame -- and then
are constantly diverging. A converging lens takes those rays and redirects them so they are all
converging back to one point. At the point where the rays converge, you get a real image of the
candle. In the next couple of sections, we'll look at some of the variables that determine how this
real image is formed.
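One of those variables is where the converging rays meet. As a hedged aside, the standard thin-lens equation 1/f = 1/d_o + 1/d_i (a textbook result that is not derived in this text; the focal length and distances below are made-up numbers) locates that real image:

```python
# Distance behind a converging lens at which the real image forms,
# from the thin-lens equation 1/f = 1/d_o + 1/d_i.
def image_distance(focal_length, object_distance):
    return 1 / (1 / focal_length - 1 / object_distance)

# A candle 0.5 m in front of a 50 mm (0.05 m) converging lens:
print(round(image_distance(0.05, 0.5), 4))  # → 0.0556 (metres)
```

Moving the candle closer to the lens pushes the real image farther back, which is exactly what focusing a camera lens adjusts for.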


Topic no. 48

Film Camera Mechanism

Ordinary cameras are brilliant for taking snapshots of the world. The only trouble is, the world
simply won't "sit still"—there's always something moving around us, over our heads, or even
under our feet. Fortunately, movie cameras can capture moving images that better reflect the
changing nature of our world.

"Persistence of vision": How the eye fools the brain

Open up a movie camera or camcorder (a compact electronic video camera) and you'll find all
kinds of mechanical and electrical parts packed inside. But the basic science behind making
movies has nothing to do with lenses, gears, electric motors, or electronics—it's all about how
our eyes and brains work.

You've probably done that trick where you make a flick book (sometimes called a flip book) by
drawing little stick people on the corner of a pad of paper and flicking them with your fingers so
fast that they hop, skip, and jump. When your eye sees a series of still images (or "frames") in
quick succession, it holds each image for a little while after it disappears and even as the next
one starts to replace it. In other words, each picture leaks into the next one, so they blur together
to make a single moving image. This is known as the persistence of vision and it's the secret
behind every movie you've ever seen.

It's not just flick books that use persistence of vision. Before movie cameras and projectors were
invented, 19th-century toy makers were using the same idea to make relatively crude animated
films. A typical toy from this era was called the zoetrope. It was a large rotating drum with thin
vertical slits cut into its outer edge. Inside, you placed a long strip of paper with small colored
pictures drawn on to it. Then you rotated the drum to make the pictures blur together (just like a
flick book) and looked down through one of the slits to watch them. Here's a great photo of
a restored zoetrope by Andrew Dunn.

How Film Works

It’s a relatively small step from flip books and zoetropes to fully fledged movies. The theory of
making a movie is just as simple: you take thousands and thousands of still photographs one after
another. When you play them back at high speed, they blur into a single moving image—a
movie.

A famous American photographer called Eadweard Muybridge (1830–1904) was one of the
first people to show how one moving picture could be made from many still ones. Using multiple
cameras arranged in rows, he took series of photographs of galloping horses and vaulting
gymnasts.

Photo: Few things illustrate how movies work better than Muybridge's amazing photos. Here's a sequence he made called "The Horse in Motion." Photo courtesy of US Library of Congress.

How movie cameras work

A movie camera or camcorder simply automates what Muybridge did by hand. Classic movie
cameras are largely mechanical and capture images on moving plastic film; they're examples of
what we call analog technology (because they store pictures as pictures). Modern video cameras
and camcorders work more like digital cameras and webcams and capture images digitally
instead (storing pictures as numbers).


Photo: At first glance, this old-style Arriflex film camera looks quite like a modern camcorder—
but look closer. On top, you can see a big oval-shaped case where a huge reel of film is stored. If
you were standing next to this camera, you would also be able to hear a motor inside whirring
away as the film rattled through the mechanism.

Photo by Dave Maclean, courtesy of Defense Imagery

Classic movie cameras

A basic movie camera is like a standard film camera that takes a photograph on to plastic
film every time the shutter opens and closes. In a standard film camera, you have to wind the
film on so it advances to the next position to capture another photograph. But in a movie camera,
the film is constantly moving and the shutter is constantly opening and closing to take a
continuous series of photographs—about 24 times each second. Before modern camcorders were
invented, people used mechanical home movie cameras, which were very small versions of
professional movie cameras with all the parts (and the film itself) miniaturized. In these early
cameras, the film was moved past the lens by either a wind-up (clockwork) mechanism or a
small electric motor.

How a classic movie camera works

1. The unexposed movie film starts out on the large reel at the front. The film and its path
through the camera are shown by the black dotted line and the black arrows.
2. The film passes over guide rollers and spring-loaded pressure rollers that hold it firmly
against the central sprocket (a large wheel with teeth protruding from its edge, rather like
a gear wheel). The sprocket's teeth lock into the holes on the edge of the film and pull it
precisely and securely through the mechanism.
3. Light from the scene being filmed enters through the lens and passes into a prism (shown
by the yellow triangle), which splits it in half.
4. Some of the light continues on through the shutter (black line) and hits the film, exposing
a single frame (one individual still photo) of the movie.
5. The rest of the light takes the lower path, bouncing down into a mirror.
6. The shutter is like a mechanical eyelid that blinks open 24 times a second, allowing light
through when each frame of the film is securely in place and blocking the light when the
film is advancing from one frame to the next. The shutter is driven by the same
mechanism that turns the sprocket.
7. More pressure rollers hold the exposed film against the lower part of the central sprocket.
The teeth on the sprocket pull the exposed film back through the camera.


8. Light redirected by the mirror exits through a lens and viewfinder so the camera operator
can see what he or she is filming.
9. Guide rollers take the exposed film back up toward the upper reel.
10. The large upper reel at the back collects the exposed film.

VIDEO CAMERA


CAMCORDERS

When video recording was invented, photographic film was replaced by magnetic videotape,
which was simpler, cheaper, and needed no photographic developing before you could view the
things you'd recorded. Modern electronic camcorders use digital video. Instead of recording
photographic images, they use a light sensitive microchip called a charge-coupled
device (CCD) to convert what the lens sees into digital (numerical) format. In other words, each
frame is not stored as a photograph, but as a long string of numbers. So a movie recorded with a
digital camcorder is a series of frames, each stored in the form of numbers. In some camcorders,
the digital information is recorded onto videotape; in others, you record onto a DVD; and in still
others, you record onto a hard drive or flash memory. The advantage of storing movies in digital
format is that you can edit them on your computer, upload them onto web sites, and view them
on all kinds of different devices (from cellphones and MP3 players to
computers and televisions).
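The "frames as numbers" idea also makes it easy to estimate how much data raw digital video involves. This back-of-the-envelope Python sketch assumes an uncompressed 640 × 480 frame with three bytes (R, G, B) per pixel, figures chosen purely for illustration:

```python
# Raw size of digital video frames stored as plain numbers.
width, height, bytes_per_pixel, fps = 640, 480, 3, 24

frame_bytes = width * height * bytes_per_pixel
print(frame_bytes)        # → 921600 bytes per uncompressed frame
print(frame_bytes * fps)  # → 22118400 bytes per second of raw video
```

Numbers like these are why camcorders compress video before writing it to tape, DVD, hard drive or flash memory.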

How to make a movie with a digital camera

Most modern digital cameras allow you to capture video as well as still photos, so you don't need
a movie camera or camcorder to win an Oscar! If you've never tried home movie making with
your camera, why not give it a go? Here are a few basic tips:

 Preparing: Make sure your digital camera batteries are fully charged before you start.
Unlike still photography, which uses battery power only intermittently, making movies
means your camera is operating continually for perhaps a half-hour or more—easily
enough to drain your batteries at the most inconvenient moment. It helps to have fully
charged spare batteries standing by.
 Planning: Unless you're making a spontaneous home movie, decide what you want to
film before you start recording. You could even draw up a storyboard to help you plan what to film and when. That way, you can record all the outside shots together, all the
inside shots together, and so on—to save lots of moving around.
 Casting: Who's going to act in your movie? Friends and family? Or maybe you'll just talk
to the camera yourself in a kind of video diary?
 Filming: Just like a professional movie director, be sure to record much more footage
than you actually need. That way, you can edit down to a much higher quality end
product. If you're working with other actors, record multiple takes of key scenes and
choose the best ones when you watch them later.
 Editing: Explore your computer and see what video editing software (if any) is already
installed on it. If you're a Microsoft Windows user, see if you have a program called
Movie Maker installed. It lets you load files you've recorded with a digital camera or
webcam and then edit them frame by frame, adding text titles and all kinds of other visual
effects. The most recent version is called Windows Movie Maker and you can download
it for free if you've not got it already. On a Mac, you can do the same sort of thing
in iMovie.
 Publishing: Once you've recorded and edited your movie, decide what you'll do with it
next. How about burning it onto DVD and giving it out to your friends and family? Or
maybe you could upload it to YouTube or Facebook and become the next overnight
movie sensation?

Photo: It's easy to add special effects to your movie with software like Movie Maker, a program
packaged with most recent versions of Microsoft Windows. With a preview of your movie in the
right-hand pane, you can click through a whole list of visual effects on the left, including
transitions, fade-outs, and making your movie look old and crackly like a silent movie from the
1920s!


Topic no. 49

Film Exposure

Exposure in film photography is defined as the quantity of light that is allowed through the
camera lens and onto the photo film controlled by the intensity of light (through the aperture) and
length of time (determined by the shutter speed). For correct exposure in a film camera,
whether 35mm, medium format, or large format, it is essential that you correctly set both the
shutter speed and aperture. Film speed will also play a role in determining the correct exposure.
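As a rough illustration of how shutter speed and aperture combine, the standard exposure-value relation EV = log2(N²/t) (N the f-number, t the shutter time in seconds) reduces a setting to a single number: combinations with the same EV admit the same amount of light. This sketch and its function name are ours, for illustration only:

```python
import math

def exposure_value(f_number, shutter_s):
    """EV = log2(N^2 / t): settings with equal EV admit equal light."""
    return math.log2(f_number ** 2 / shutter_s)

# Opening the aperture one stop (f/16 -> f/11) while halving the time
# (1/125 s -> 1/250 s) leaves the exposure essentially unchanged:
print(exposure_value(16, 1 / 125))  # both come out near EV 15
print(exposure_value(11, 1 / 250))  # (nominal f-numbers are slightly rounded)
```

Because nominal f-numbers like f/11 are rounded from the exact √2 series, the two values match only approximately.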

Obtaining the Correct Film Exposure

Technically speaking, the correct film exposure does not exist as different photographers will
have a different idea of what is properly exposed and what is not. See the example below for
three different exposures of the same site. Depending on whether you want to see all the detail or
darken it for night will determine the right exposure for you. However, the way in which you get
any exposure remains the same.

Light Metering

Most modern 35mm single-lens reflex (SLR) film cameras have through-the-lens (TTL)
meters that measure the amount of light available to help you determine the best exposure. These
types of light meters are called Reflective Meters as they measure the light bouncing off your
subject and into the camera lens. For beginners using a camera without a light meter, it is highly
suggested that you purchase a handheld meter to use. Handheld meters can be Reflective or take


Incident Light Readings. Incident readings place the light meter in the light falling on the subject
to measure the light source itself instead of an overall average.
Although generally reliable, light meters are often fooled by scenes with great contrast. Because
contrasting images may contain white areas, mid-gray areas, and black areas, the light meter is
forced to take an average reading of middle gray. This is of prime importance, especially in
beach or winter photography, as the bright areas will be measured as mid-gray. A system was
developed to compensate for the mid-gray exposure by exposing for the darker shadow areas.

Film Exposure without Light Meters

The Sunny 16 Rule can be used when a light meter is not available. Instead of wasting film
trying to guess your exposure, you can use this rule. On a sunny day with no clouds overhead, set
your aperture to f/16. Your shutter speed is then set to the ISO film speed of the film you’re
using. This will leave you with a nice, even exposure.
It is rare that your shutter speed will match the ISO film speed perfectly. For instance, there is no
shutter speed 100 for 100 ISO film. The Sunny 16 rule then dictates to use the next highest
shutter speed above the ISO film speed. So for speed 100 you would use shutter speed 1/125. It’s
much easier to remember shutter speed equals ISO film speed, but always remember the shutter
speed will likely not match.

On a perfect, clear, sunny day, f/16 is to be used. Days that aren’t clear and sunny can still use
the Sunny 16 rule with some minor changes. Extremely bright days with distinct shadows use
f/22. Hazy sun and soft shadows use f/11, cloudy days with barely visible shadows use f/8, and
overcast days with no shadows use f/5.6. The same shutter speed rule applies. In our example of
using ISO film speed 100, the shutter speed will always be 1/125, regardless of the aperture used.
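The Sunny 16 adjustments above can be sketched as a small lookup, assuming the standard shutter-speed series; the condition names and the function itself are illustrative, not standard terms:

```python
# Sunny 16 sketch: aperture from sky condition, shutter speed ~ 1/ISO.
# The condition-to-aperture table follows the text; names are illustrative.

STANDARD_SPEEDS = [1, 2, 4, 8, 15, 30, 60, 125, 250, 500, 1000]  # denominators

APERTURE_BY_CONDITION = {
    "snow_or_sand": 22,   # extremely bright, distinct shadows
    "sunny": 16,          # clear sun, no clouds
    "hazy": 11,           # hazy sun, soft shadows
    "cloudy": 8,          # barely visible shadows
    "overcast": 5.6,      # no shadows
}

def sunny_16(iso, condition="sunny"):
    """Return (f_number, shutter_denominator) per the Sunny 16 rule."""
    f_number = APERTURE_BY_CONDITION[condition]
    # Use the next standard speed at or above the ISO number
    # (e.g. ISO 100 -> 1/125, since there is no 1/100 setting).
    denom = next(s for s in STANDARD_SPEEDS if s >= iso)
    return f_number, denom

print(sunny_16(100, "sunny"))     # (16, 125) -> f/16 at 1/125 s
print(sunny_16(400, "overcast"))  # (5.6, 500) -> f/5.6 at 1/500 s
```

Note how the shutter denominator depends only on the ISO, matching the text's point that for ISO 100 the speed stays 1/125 regardless of the aperture used.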

Perfect Exposure:

There are four steps to get a perfect exposure of a running movie film:

1. A shutter to blank off the aperture while the mechanism pulls the film through ready for the
next frame to be exposed.


2. A channel or gate in which to position the film accurately.

3. A device to pull the film down to its next position, usually a ‘claw’.

4. Loops of film at the top and bottom of the film gate to act as reservoirs during the pull-down period.


Topic no. 50

Camera Shutter and Claw

What is Shutter Speed?

Camera shutter speed works hand in hand with the aperture settings of a film camera lens to
achieve a good exposure. Any guide to photography will stress the importance of correct shutter
speed and aperture settings of your 35mm film camera, medium format camera, or large format
camera. Whereas the camera aperture sets the amount of light allowed through the lens, the shutter
speed dictates how fast the shutter opens and closes to allow more or less light in to expose the
film. And while the aperture will determine depth of field, the shutter speed will capture motion.
If your subject is moving, a slow shutter speed will result in a blurry subject. A fast shutter speed
will capture the subject nearly instantly to capture one frame of that movement.

How the Shutter Works

In single-lens reflex (SLR) cameras, the shutter works together with the mirror that
reflects the image from the lens into the viewfinder. When the photograph is taken, the mirror
flips up to allow a properly timed exposure of the film sitting behind the shutter. Because the mirror
flips up during exposure the viewfinder will go black. On other film cameras, such as the twin-
lens reflex (TLR) camera, the shutter works in a similar fashion but does not block the
viewfinder – allowing the photographer to always see the subject.

How to Measure Shutter Speed on a Film Camera

The majority of film cameras have multiple shutter speed settings on the shutter speed dial. They
will typically read, in order, 1-2-4-8-15-30-60-125-250-500-1000-2000-4000 and possibly more.
Be aware that these numbers do not equate to full seconds. 1 is equivalent to 1 second, but 2 is
actually 1/2 second, 4 is 1/4 second, etc. Therefore, a shutter speed of 1/125 will allow more
light than 1/500 and is a longer exposure. This is of prime importance because with a
slow shutter speed you may see some resulting blurriness from the subject moving or your hands
moving the camera slightly. A general rule of thumb is that anything slower than 1/60 requires
the use of a tripod to avoid blurriness.
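The reciprocal reading of the dial, together with the 1/60 tripod rule of thumb, can be sketched as follows (the function names are illustrative):

```python
def exposure_time(dial):
    """Convert a shutter-dial number to seconds: 1 means 1 s, 125 means 1/125 s."""
    return 1.0 if dial == 1 else 1.0 / dial

def needs_tripod(dial):
    """Rule of thumb from the text: anything slower than 1/60 s calls for a tripod."""
    return exposure_time(dial) > 1.0 / 60

print(exposure_time(125))  # 0.008 -> 1/125 s
print(needs_tripod(30))    # True: 1/30 s is a longer (slower) exposure than 1/60 s
print(needs_tripod(250))   # False
```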

Most cameras will also have a shutter speed labeled B, known as bulb mode. This B setting
allows the photographer to keep the shutter open for as long as they hold the shutter release. This
is especially useful for dark areas or night photography that require long exposures. Some newer
cameras will also have an Auto feature. By putting the shutter on Auto, the shutter speed is
automatically adjusted based on the camera’s aperture, referred to as aperture priority.
Of course, none of this matters if you do not set both the aperture and shutter speed to the correct
exposure.


Topic no. 51

Early Cameras

Photography is a word derived from the Greek words photos ("light") and graphein ("to draw").
The word was first used by the scientist Sir John F.W. Herschel in 1839. It is a method of
recording images by the action of light, or related radiation, on a sensitive material.

Pinhole Camera

Alhazen (Ibn Al-Haytham), a great authority on optics in the Middle Ages who lived around
1000 AD, invented the first pinhole camera (also called the camera obscura) and was able to
explain why the images were upside down. The first casual reference to the optic laws that made
pinhole cameras possible was observed and noted by Aristotle around 330 BC, who questioned
why the sun could make a circular image when it shone through a square hole.

The First Photograph

On a summer day in 1827, Joseph Nicephore Niepce made the first photographic image with
a camera obscura. Prior to Niepce people just used the camera obscura for viewing or drawing
purposes not for making photographs. Joseph Nicephore Niepce's heliographs or sun prints as
they were called were the prototype for the modern photograph, by letting light draw the picture.
Niepce placed an engraving onto a metal plate coated in bitumen, and then exposed it to light.
The shadowy areas of the engraving blocked light, but the whiter areas permitted light to react
with the chemicals on the plate. When Niepce placed the metal plate in a solvent, gradually an
image, until then invisible, appeared. However, Niepce's photograph required eight hours of light
exposure to create and after appearing would soon fade away.

Louis Daguerre

Fellow Frenchman, Louis Daguerre was also experimenting to find a way to capture an image,
but it would take him another dozen years before Daguerre was able to reduce exposure time to
less than 30 minutes and keep the image from disappearing afterwards.


The Birth of Modern Photography

Louis Daguerre was the inventor of the first practical process of photography. In 1829, he
formed a partnership with Joseph Nicephore Niepce to improve the process Niepce had
developed.
In 1839, after several years of experimentation and Niepce's death, Daguerre developed a more
convenient and effective method of photography, naming it after himself - the daguerreotype.
Daguerre's process 'fixed' the images onto a sheet of silver-plated copper. He polished the silver
and coated it in iodine, creating a surface that was sensitive to light. Then, he put the plate in a
camera and exposed it for a few minutes. After the image was painted by light, Daguerre bathed
the plate in a solution of silver chloride. This process created a lasting image, one that would not
change if exposed to light.

In 1839, Daguerre and Niepce's son sold the rights for the daguerreotype to the French
government and published a booklet describing the process. The daguerreotype gained popularity
quickly; by 1850, there were over seventy daguerreotype studios in New York City alone.

Negative to Positive Process

The inventor of the first negative from which multiple positive prints were made was Henry Fox
Talbot, an English botanist and mathematician and a contemporary of Daguerre.
Talbot sensitized paper to light with a silver salt solution. He then exposed the paper to light. The
background became black, and the subject was rendered in gradations of grey. This was a
negative image, and from the paper negative, Talbot made contact prints, reversing the light and
shadows to create a detailed picture. In 1841, he perfected this paper-negative process and called
it a calotype, Greek for beautiful picture.

Tintypes

Tintypes, patented in 1856 by Hamilton Smith, were another medium that heralded the birth of
photography. A thin sheet of iron was used to provide a base for light-sensitive material, yielding
a positive image.


Wet Plate Negatives

In 1851, Frederick Scott Archer, an English sculptor, invented the wet plate negative. Using a
viscous solution of collodion, he coated glass with light-sensitive silver salts. Because it was
glass and not paper, this wet plate created a more stable and detailed negative.
Photography advanced considerably when sensitized materials could be coated on plate glass.
However, wet plates had to be developed quickly before the emulsion dried. In the field this
meant carrying along a portable darkroom.

Dry Plate Negatives & Hand-held Cameras

In 1879, the dry plate was invented, a glass negative plate with a dried gelatin emulsion. Dry
plates could be stored for a period of time. Photographers no longer needed portable darkrooms
and could now hire technicians to develop their photographs. Dry plates absorbed light so
rapidly that the hand-held camera was now possible.

Flexible Roll Film

In 1889, George Eastman invented film with a base that was flexible, unbreakable, and could be
rolled. Emulsions coated on a cellulose nitrate film base, such as Eastman's, made the mass-
produced box camera a reality.

Color Photographs

In the early 1940s, commercially viable color films (except Kodachrome, introduced in 1935)
were brought to the market. These films used the modern technology of dye-coupled colors in
which a chemical process connects the three dye layers together to create an apparent color
image.


Topic no. 52

Film Speed

1. Film Speed

“Film speed is the measure of a photographic film's sensitivity to light, determined by sensitometry
and measured on various numerical scales, the most recent being the ISO system. A closely related
ISO system is used to measure the sensitivity of digital imaging systems.”

How important is film speed?

Many people today love using a digital camera to take pictures, but others still prefer the old-
school charm and control of traditional film. When we talk about film speed, we're referring to
the measure of a film's sensitivity to light. Each film speed is best suited for a different type of
photography.

The lower the speed, the longer an exposure to light is necessary to produce image density. If the
film speed is higher, it requires less exposure but generally has reduced quality in the form of
grain and noise. Noise and grain are the abnormalities in brightness and color in images; they
look similar to a layer of "snow" on a television set. Film speeds are measured using the ISO system
from the International Organization for Standardization (thus the ISO, which is used as an
abbreviation for the group and the film speed) and are the giant numbers you'll typically see on a
box of film. You'll also see the abbreviation ASA (American Standards Association) used in
conjunction with film speed. ASA and ISO ratings are interchangeable.

The rating still applies to digital photography even though the cameras don't use film. ISO speed
is used in digital cameras to judge the relationship between the exposure rating and the sensor
data values. Most advanced cameras have an ISO setting available, which emulates the speed
rating of film. The basic rules of film speed apply equally to film and digital cameras.

Slow-speed films generally refer to film with 100-200 ISO ratings. These slower speeds are
excellent for outdoor landscape photography and inanimate objects. They can also be a great
choice if it's a particularly sunny day. Since the film takes longer to absorb light, it captures
detail more effectively. So if you plan on enlarging those pictures you'll want to shoot with the
lowest ISO possible.

Medium speed is 400 ISO. As can be expected, the medium speed is probably the best for
general-purpose use and can handle indoor lighting conditions, overcast days and any
combination of the two. Even so, it's not suited for action shots or very bright days.


Fast-speed film is usually rated at 800 ISO and above. It's best for moving subjects you might
see at a sporting event or concert, or when you plan on using a zoom lens or are shooting in a
dimly lit area. Unfortunately, if you plan on enlarging the photos, they'll likely turn out grainy.

Film speed is remarkably important and can make or break a photograph. There are exceptions to
the above rules, and experimenting can certainly yield impressive and interesting results, but the
fact remains that the film speed you choose will have a direct effect on the quality and density of
the picture you take, regardless of whether you're shooting digital or on film.
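The trade-off described above moves in whole stops: each doubling of the ISO rating halves the exposure time needed at the same aperture. A minimal sketch, with illustrative function names:

```python
import math

def equivalent_shutter(time_s, iso_from, iso_to):
    """Shutter time giving the same exposure after an ISO change, aperture fixed.
    Doubling the ISO halves the required time (one stop)."""
    return time_s * iso_from / iso_to

# 1/60 s at ISO 100 is matched by 1/240 s at ISO 400 (two stops faster):
print(equivalent_shutter(1 / 60, 100, 400))  # ~0.004167 s, i.e. 1/240 s

def stops_between(iso_a, iso_b):
    """Stop difference between two ISO ratings."""
    return math.log2(iso_b / iso_a)

print(stops_between(100, 800))  # 3.0 -> ISO 800 needs 3 stops less exposure
```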


Topic no. 53

The Film People

Producers & Directors

Producer

The producer is generally responsible for a specific production. Usually the producer is
concerned with the business organization, budget, the choice of the staff and crew,
interdepartmental coordination, script acceptance, and production scheduling. The
producer may select or initiate the program concepts and work with writers. He or she
may assign the production’s director and is responsible for meeting deadlines,
production planning, location projects, rehearsals, production treatment, and other
duties. Producers may also become involved in specifics such as craft or union problems,
assessing postproduction treatment, and the final program format.

Assistant producer or associate producer (AP)


The assistant or associate producer is responsible for assisting the producer. These
responsibilities, as assigned by the producer, may include coordinating appointments
and production schedules, making sure contracts are completed, booking guests, creating
packages, and supervising postproduction. This person may be assigned some of the same
responsibilities of an associate director.

Director
Ultimately the director is the individual responsible for creatively visualizing the script or
event. This means that the director instructs the camera operators on the types of shots
wanted and selects the appropriate camera shots for the final production. Directors are people
who can effectively communicate their vision to the crew. They are also team builders who
move the crew toward that vision. This involves advising, guiding, and coordinating the


various members on the production team (scenic, lighting, sound, cameras, costume,
etc.) and approving their anticipated treatment. The director may choose and hire
performers/talent/actors (casting), envision and plan the camera treatment (shots
and camera movements) and editing, and direct/rehearse the performers during
pre-rehearsals (Figure 2.1).

Fig 2.1 The field director is reviewing the camera shot on an HD monitor (the monitor has a sun
shade on it to increase visibility). The multi-camera live production director must look at multiple
camera images and select the most appropriate shot. (Control room photo by Jon Greenhoe/
WOOD TV.)

Assistant director or associate director (AD)


The AD is responsible for assisting the director. Functions may include supervising
pre-rehearsals and location organization. The AD may also review storyboards, implement
the shooting schedule, and shield the director from interruptions, and he or she is
sometimes responsible for the cast. The AD may take the director’s notes on changes,
retakes, performance, and other factors. For multi-camera shoots, the AD may be
responsible for lining up shots, graphics, and tapes. He or she may also be responsible for
checking on special shots (such as Chroma key), giving routine cues (tape inserts), and
other duties while the director guides the actual performance and camera(s). The AD may
also check program timing and help the director with postproduction. This person may be
assigned some of the same responsibilities of an associate producer. This position may be
merged with the floor manager.


Topic no. 54

The Film People

The Director of Photography

A cinematographer or director of photography (sometimes shortened to DP or DOP) is the
chief over the camera and lighting crews working on a film, television production or other live
action piece and is responsible for achieving artistic and technical decisions related to the image.
Directors of Photography (DoPs) are key Heads of Department on film productions and theirs is
one of the major creative roles. They provide a film with its unique visual identity, or look.

DoPs must discover the photographic heart of a screenplay, using a variety of source material
including stills photography, painting, other films, etc.

They create the desired look using lighting, framing, camera movement, etc. DoPs collaborate
closely with the camera crew (Camera Operator, 1st and 2nd Assistant Camera, Camera Trainee
and Grips).
During filming, DoPs also work closely with the Gaffer (who runs the lighting team), the
Production Designer, Costume Designer, and the Hair and Make-up Department.

After reading the screenplay, DoPs meet with the Director to discuss the visual style of the film.
They conduct research and preparation including carrying out technical recces of locations. They
prepare a list of all required camera equipment, including lights, film stock, camera, cranes and
all accessories etc., for the production office to order.

During preparation DoPs also test special lenses, filters or film stocks, checking that the results
fit with the Director's vision for the film.

On each day of principal photography, DoPs and their camera crews arrive early on set to
prepare the equipment. During rehearsals, the Director and DoP block (decide the exact
movements of both actors and camera) the shots as the actors walk through their actions,
discussing any special camera moves or lighting requirements with the Camera Operator, Gaffer
and Grip.

Each shot is marked up for focus and framing by the 1st AC, and, while the actors finish make-
up and costume, the DoP oversees the lighting of the set for the first take.


On smaller films, DoPs often also operate the camera during the shoot. At the end of each
shooting day, DoPs prepare for the following day's work and check that all special requirements
(cranes, Steadicams, remote heads, long or wide lenses, etc.) have been ordered. They also
usually view the rushes (raw footage) with the Director.

During post production, DoPs attend the digital grading of the film, which may involve up to
three weeks of intensive work.

Most DoPs work on commercials and promos as well as on feature films. Although the hours are
long, and some foreign travel may be required, the work is highly creative and very rewarding.


Topic no. 55

Director of Photography

Job description

The director of photography, also known as the DP or the cinematographer, assists the film
director by establishing the visual look of the movie. As a DP, you’ll help tell the story through
the artistic and technical decisions you make regarding lighting, film stock, shot selection,
camera operation and other elements. A DP's duties and responsibilities include the work he does
before, during and after film production.

Visual Style
The director of photography works with production designers, art directors, set dressers and even
wardrobe crew and hairstylists to help establish the look of the film and its individual scenes.
The decisions made in this area should support the script and the director’s vision and result in
imagery that the camera can capture.

Film Stock Selection


The selection of film stock can dramatically influence the look of the film. The varying
concentration of light-sensitive emulsions on film stock determines the color tones and the
degree of graininess viewers see. The decisions made primarily in pre-production -- but also
including methods of printing the film in post-production -- help set the mood and advance the
film’s plot. For instance, an urban crime drama might use a grainy stock to reflect the gritty
setting and mood, while a more upbeat feature might be shot on a film stock that supports a
lighter, airier palette.

Lighting
In lighting the film, the director of photography might settle on an icy blue look to suggest a
physically or emotionally cold environment, or warm shades to set a nostalgic or heartwarming
tone. Gordon Willis, the Academy Award-winning cinematographer who worked on "The
Godfather" and its sequels, earned the nickname “The Prince of Darkness” for his dramatic and
starkly lit compositions.


Camera Operation
In rare cases, the director of photography actually operates the movie camera. More typically, he
oversees the camera crew and makes sure the director gets the film he envisioned through the
way it's shot. This involves choosing the number of cameras involved, and their placement and
movement. It also involves framing of the scene, overseeing the use of camera filters and
aperture settings, and selecting special equipment. For example, David Lean’s cinematographer
on his sprawling epics "Lawrence of Arabia" and "Doctor Zhivago" was Freddie Young, an early
British devotee of the wide-screen CinemaScope lens -- ideal for his director’s vast, panoramic
landscapes.

Three stages of production

Most programs go through three main stages:

1. Planning and preparation


The preliminaries, preparation, organization, and rehearsal before the production begins.
Ninety percent of the work on a production usually goes into the planning and
preparation phase.
2. Production
Actually shooting the production

3. Post-production
Post production includes Editing, additional treatment, and duplication of
production.

The nature of the subject will influence the amount of work needed at each stage. A
production that involves a series of straightforward “personality” interviews is generally
a lot easier to organize than one on Arctic exploration or a historical drama. But in the
end, a great deal depends on how the director decides to approach the subject.

Working at the highest quality, directors can create incredible programming by using simple
methods. Treatment does not have to be elaborate to make its point. If a woman in the

desert picks up her water bottle, finds it empty, and then the camera shows a patch of
damp sand where it rested, the shot has told us a great deal without any need for
elaboration. A single look or a gesture can often have a far stronger impact than lengthy
dialog that attempts to show how two people feel about each other. It is important to
understand the complexity of the production. Some ideas seem simple enough but can be
difficult or impossible to carry out. Others look very difficult or impracticable but are easily
achieved on the screen. For example:
“Hurry and arc the camera around the actor.” (Difficult: Movement shots are always time
consuming, making it almost impossible to do quickly.)
“Make her vanish!” (Simple: Keep the camera still, have the subject exit, and edit out the walk.)


Topic no. 56

Production stages

There are three stages of production:

1. PRE-PRODUCTION
This is the stage in which all the planning for the project takes place. During pre-
production, the production is broken down into individual scenes and all the locations,
props, cast members, costumes, special effects and visual effects are identified. The
script, if not already complete, is written at this stage. A detailed schedule is produced
and arrangements are made for the necessary elements or people to be available to the
film-makers at the appropriate times.
2. PRODUCTION
This is the stage at which all the filming is carried out. All scenes planned out in pre-
production are filmed at the relevant locations. Each scene is filmed as many times as the
director deems fit, to ensure the best quality scenes will be used to construct the film.
This is where the strength of the pre-production work is put to the test. Great care must be
taken to make sure that all the filming is done correctly and all necessary shots are taken,
as it is sometimes difficult or impossible to go back and repeat certain events if the
filming is incomplete when it comes to the post-production stage.
3. POST-PRODUCTION
This is the stage in which the film is assembled by the editor. The first job of the film
editor is to build a rough cut taken from sequences (or scenes) based on individual
"takes" (shots). The purpose of the rough cut is to select and order the best shots. The
next step is to create a fine cut by getting all the shots to flow smoothly in a seamless
story. Trimming - the process of shortening scenes by a few minutes, seconds, or even
frames - is done during this phase. As well as the editing of footage, all music, graphics,
menus etc are added in this stage. After the fine cut has been screened and approved by
the director and producer, the picture is "locked," meaning no further changes are made.


Topic no 57

Recce or Scouting

Remote surveys (recce)

Fundamentally, there are two types of shooting conditions: at your base and on location.
Your base is wherever you normally shoot. It may be a studio, theater, room, or even a
stadium. The base is where you know exactly what facilities are available (equipment,
supplies, and scenery), where things are, the amount of room available, and so on. If you
need to supplement what is there, you can usually do so easily.

The Remote Survey
The amount of detail needed about a location varies with the type and style of the production.
Information that may seem trivial at the time can prove valuable later in the production process.
Location sites can be interiors, covered exteriors, or open-air sites. Each has its own problems.

Sketches
 Prepare rough maps of route to site that can ultimately be distributed to the crew and
talent (includes distance, travel time).
 Prepare a rough layout of the site (room plan, etc.).
 Outline anticipated camera location(s).
 Designate parking locations for truck (if needed) and staff vehicles.
Contact & Schedule
 Get location contact information from primary and secondary location contacts, site
custodian, electrician, engineer, and security; this includes office and cell phones as well
as e-mail.
 If access credentials are required for the site, obtain the procedure and contact
information.
 Obtain the event schedule (if one exists), and find out if there are rehearsals that you can
attend.

Camera Locations


 Check around the location for the best camera angles.


 What type of camera mount will be required (tripod, Steadicam, etc.)?
 If a multi-camera production, cable runs must be measured to ensure that there is enough
camera cable available.
 What lens will be required on the camera at each location to obtain the needed shot?
 42
 Are there any obstructions or distractions (e.g., large signs, reflections)?
 Do you anticipate any obvious problems in shooting? Anything dangerous?
Lighting

 Will the production be shot in daylight? How will the light change throughout the day?
Does the daylight need to be augmented with reflectors or lights?
 Will the production be shot in artificial light? (If so, will you use theirs, yours, or a
combination of the two?) Will they be on at the time you are shooting?
 What are your estimates for the number of lamps, positions, power needed, supplies, and
cabling required?
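When estimating power for lighting, a quick sanity check is to convert the total lamp wattage into the number of circuits needed. The voltage, breaker rating, derating factor, and lamp list below are illustrative assumptions for the sketch, not values from this handout; check the actual supply at the location.

```python
import math

def circuits_needed(lamp_watts, volts=120, breaker_amps=20, safety=0.8):
    """Estimate how many electrical circuits a lighting package needs.

    Total current = total watts / volts; each breaker is derated by a
    safety factor (e.g. load a 20 A circuit to only 80%, i.e. 16 A).
    """
    total_amps = sum(lamp_watts) / volts
    usable_amps = breaker_amps * safety
    # Round up: a partially loaded circuit still occupies a whole breaker.
    return math.ceil(total_amps / usable_amps)

# Hypothetical package: two 1 kW fresnels, three 650 W lamps, one 2 kW soft light
print(circuits_needed([1000, 1000, 650, 650, 650, 2000]))  # 4
```

The same arithmetic works for any supply: swap in the local voltage and breaker rating found during the survey.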

Audio

 What type of microphones will be needed?


 Any potential problems with acoustics (such as a strong wind rumble)?
 Any extraneous sounds (elevators, phones, heating/air conditioning, machinery, children,
aircraft, birds, etc.)?
 Required microphone cable lengths must be determined.
Safety
 Are there any safety issues that you need to be aware of?
Power
 What level of power is available, and what type of power will you need? This will differ
greatly between single-camera and multi-camera production.
 What type of power connectors are required?


Communications

 Are radios needed? How many?

 How many cell phones are needed?


 If it is a multi-camera production, what type of intercom and how many headsets are
required?

Logistics

 Is there easy access to the location? At any time, or at certain times only? Are there any
traffic problems? What kind of transportation is needed for talent and crew?
 What kind of catering is needed? How many meals? How many people?
 Are accommodations needed (where, when, how many)?
 If the weather is bad, are there alternative positions/locations available?
 Has a phone number list been prepared for police, fire, doctor, hotel, and local
(delivery) restaurants?
 What kind of first-aid services need to be available? (Is a first-aid kit
sufficient, or does an ambulance need to be on-site?)
 Is location access restricted? Do you need to get permission (or keys) to enter
the site? From whom?
 What insurance is needed (against damage or injury)?

Security
 Are local police required to handle crowds or the general public?
 What arrangements need to be made for the security of personal items, equipment, props,
etc.?
 Do streets need to be blocked?


A location is anywhere away from your normal shooting site. It may just be outside the
building or way out in the country. It could be in a vehicle, down in a mine, or in
someone’s home. Your main concern when shooting away from your base is to find out
what you are going to deal with in advance. It is important to be prepared. The preliminary
visit to a location is generally called a remote survey, site survey, or location survey. It
can be anything from a quick look around to a detailed survey of the site. What you find
during the survey may influence the planned production treatment.


Topic no. 58

Preparing for Shoot

Preparing for a film production shoot can be pretty stressful. You’ve got budgets to manage,
cameramen and editors to hire, and a whole slew of locations, sets, props and pieces of
equipment to keep track of. If you’re not organized and on top of things, it can be easy to fall off
track.

Want to make sure your upcoming shoot goes as smoothly and seamlessly as possible? Then
make sure you follow these 5 simple preparation tips:

1. Make a shooting schedule


If you want your shoot to be on time, on budget and to go as planned, then a shooting schedule is
an absolute must. Sit down with your directors, producers, sound guys, location managers and
DPs, and break down your shoot into manageable sections. Outline what scenes you’ll shoot on
which days, and pay careful attention to what actors are needed and what locations/sets you’re
using. It’s usually most cost-effective to shoot all of one location’s shots in a single day or period
of days. The same goes with actors. You don’t want an actor just sitting on the sidelines for two
weeks getting paid. If you can organize your shoot so that their scenes are done all at once, you’ll
save money and time in the long run.
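The grouping logic described above can be sketched in a few lines: collect scenes by location so each location is shot in one block. The scene numbers and location names are hypothetical, purely for illustration.

```python
from collections import defaultdict

def group_by_location(scenes):
    """Group scene numbers by location so each location can be shot
    in one block, minimizing company moves and rental days."""
    groups = defaultdict(list)
    for scene in scenes:
        groups[scene["location"]].append(scene["number"])
    return dict(groups)

# Hypothetical scene breakdown for illustration
scenes = [
    {"number": 1, "location": "diner"},
    {"number": 4, "location": "street"},
    {"number": 2, "location": "diner"},
    {"number": 7, "location": "street"},
]
print(group_by_location(scenes))  # {'diner': [1, 2], 'street': [4, 7]}
```

The same approach works for grouping by actor availability: key the dictionary on cast members instead of locations.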

2. Create a shooting script


The next crucial step in ensuring a smooth film production shoot is to create shooting scripts.
Once you’ve created your schedule, use that to create daily scripts for all the actors, cameramen,
directors and other crew members. Make sure to include all the scenes you’ll be shooting for the
day – in order – and leave out any extraneous scenes or dialogue you don’t need. This makes it
easier for everyone to stay on track and in the right headspace.

3. Double check your locations


Sure, you might have secured that movie theater three months ago, but what if the theater owner
quit or was replaced since then? What if they got a new event management system, and your
reservation was lost? You don’t want to leave locations up to chance, so take time to check in
with each and every one you’ve arranged. Make sure to get confirmations that you can – and will
– be allowed to shoot on the property on the days you need.


4. Verify your equipment and props


Nothing throws off a shoot like realizing you don’t have the right lights or mics to complete a
scene. Check in with your crew members and make sure they have all the equipment they need to
do their jobs. Also be sure to turn on each piece of equipment, test it out and ensure it’s in proper
working order before your shoot rolls around. Equipment issues cost you time and money, and
they can put your shoot significantly behind schedule.

5. Keep in constant communication with your team


Have regular meetings with your crew, and keep a constant stream of calls, texts and emails
going. Keep them apprised of any changes, and be sure to send out shooting schedules and
scripts ahead of time, so they can all properly prepare. You want to make sure every single crew
member is on the exact same page before your shoot begins.


Topic no. 59

Filmmaking 01

SHOOTING INSTRUCTIONAL PRODUCTIONS

Typical instructional productions

A wide range of video programs can be described as “instruction.” Besides the
programming found on the networks, niche cable channels are filled with shows about
cooking, hunting, home improvement, medicine (surgery), and many other topics.
Most of these programs are instructional and are designed for specific markets, such as
educational, professional, specialist trades, and so on.

Approaches to instruction

There are some subjects that cannot really be demonstrated effectively in a video program.
Instead, directors have to rely on verbal descriptions of processes. Other subjects may be only
partly successful because of the limitations of the system, such as a demonstration of sound
quality.

One of the weakest instructional methods in a video program is an illustrated talk that
shows still photographs, diagrams, maps, and so on. To enhance the project, pan or tilt
the camera across the still photo or slowly zoom in and out of the still.

A cooking demonstration, on the other hand, can be extremely effective: it is full of
visual movement and change, a process that develops as we watch, even though the
audience cannot smell or taste whether it is as successful as it looks.

One of the most powerful forms of instruction is when the talent speaks to us
directly through the camera, pointing out each feature of the subject.


The “documentary format,” in which the host may be heard as a voiceover making
observations on the images, is much less personal. However, it can be more authoritative,
especially if the speaker’s visual presence is not particularly impressive on camera for any
reason. But the documentary approach is extremely greedy for imagery. Compared to the
“direct approach,” where directors can always take shots of the talent speaking to the
camera, the “documentary” method requires that the camera is continually looking at
the subject and its surroundings— which may not have enough interest to sustain the
viewer’s attention.

Advance planning
On the face of it, one might assume that instructional productions merely involve
presenting the item before the camera and pointing out its features.
Some very boring programs are made this way. The secrets of really successful
demonstrations are planning, preparation, and rehearsal, even for the most familiar subjects.
In fact, the more familiar it is, the more difficult it can be to make the program interesting
and hold the audience’s attention. Too easily, the director can assume that the viewers
“know all about that.”

A good instructional program is designed to fit its audience. If it is produced at too high
a level, the uninitiated become bewildered and embarrassed at not being able to
grasp the facts. They become confused. They lose their self-confidence. While sorting out
what has been said, they get lost.

A successful program encourages curiosity and intrigue, leaving the audience with a
sense of satisfaction and fulfillment. “What do you think will happen now?” is much more
involving than “When we do this, that happens.”

Creating the instructional program

Instructional programs can take anything from a few minutes to years to complete.
Directors often need to stretch or condense time in various ways to suit the occasion:


■ Shoot everything from start to finish. This ensures that the audience misses none of the
action and gets an accurate idea of how long everything takes. Clearly this is a good approach
for relatively brief productions, but it is unsuitable when producing a program such as
“How to build a house.”
■ The video can be recorded continuously, then the time-consuming, repetitious, or
boring parts edited out. This can produce a shorter, better-paced program and ensures
that the audience concentrates on the important stages.
■ Portions of a show can be arranged in prepared sections. For instance, when showing
how to bake a cake, the director may first display the ingredients, begin the method of
mixing, and then place the prepared mixture in an oven (or not, if one is not available).
The talent can then remove the cooked cake and begin to decorate it, and we end with a shot
of the fully decorated cake. Not only does this method save a great deal of time, but it
ensures that the results turn out right at each stage. However, directors have to guard against
leaving out important information when shortcutting time in this way. For instance, the
audience may inadvertently be left wondering how to tell when the food is fully cooked.
Preferably the host would both tell and show the audience how to test for doneness.
■ For demonstrations that take a long time to develop, such as a plant growing, the
simplest plan is to shoot short sequences at strategic moments (hours, days, or
weeks apart) and show them one after the other. The alternative is to use some type
of an automatic timer that takes a brief video shot at regular intervals. Computer
software is available that will trigger the camera and record the video images directly to a
hard drive. When the recording is played back at normal speed, time is compressed,
and these time-lapse sequences show the process highly speeded up. A plant can grow
from seed, flower, and die in a minute or two. But, of course, this method does tie up a
camera and other related equipment (VCR or computer) in a rigidly held position for the
duration of the shooting period.
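The interval for such time-lapse shooting follows from simple arithmetic: divide the event's duration by the number of frames the finished sequence needs. A minimal sketch (the plant-growth figures are illustrative):

```python
def capture_interval(event_seconds, playback_seconds, playback_fps=24):
    """Seconds between exposures so that an event lasting event_seconds
    plays back in playback_seconds at playback_fps."""
    frames_needed = playback_seconds * playback_fps
    return event_seconds / frames_needed

# A plant filmed over 14 days, compressed into a 60-second sequence at 24 fps
print(capture_interval(14 * 24 * 3600, 60))  # 840.0 -> one frame every 14 minutes
```

Reversing the formula also tells you how long a given trigger interval will make the finished sequence, which is useful when the camera must be committed for a fixed period.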
■ Sometimes a demonstration program has to be made to a prescribed length in
order to fit a scheduled time slot. You can achieve this goal by deliberately including
material that is interesting but can be trimmed or omitted as necessary.
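One simple way to model trimming optional material to hit a slot length is a greedy pick over the optional segments. This is a sketch only (a greedy choice is not guaranteed optimal), and the durations are hypothetical:

```python
def fit_to_slot(required, optional, slot):
    """Choose optional segment durations (longest first, greedily) that
    fill the slot alongside the required material without overrunning.
    All durations are in seconds."""
    remaining = slot - sum(required)
    chosen = []
    for seg in sorted(optional, reverse=True):
        if seg <= remaining:
            chosen.append(seg)
            remaining -= seg
    return chosen

# Two required 10-minute segments plus optional extras, fitted to a 28-minute slot
print(fit_to_slot([600, 600], [300, 240, 120, 90], 28 * 60))  # [300, 120]
```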


■ If the video program is to be used as part of a live presentation, such as a classroom


lecture where a teacher will also be speaking and answering questions, the production
can be designed to be extremely flexible. The program can be arranged as a series of
pre-timed, self-contained sections so that the teacher can use as much as is needed for a
specific lesson without the students feeling that they are being prevented from seeing the
entire program.


Topic no. 60

Filmmaking 02

Filmmaking (or in an academic context, film production) is the process of making a film.
Filmmaking involves a number of discrete stages including an initial story, idea, or commission,
through scriptwriting, casting, shooting, sound recording and reproduction, editing,
and screening the finished product before an audience that may result in a film release and
exhibition. Filmmaking takes place in many places around the world in a range
of economic, social, and political contexts, and using a variety of technologies and cinematic
techniques. Typically, it involves a large number of people, and can take from a few months to
several years to complete.

Parts

Film production consists of five major stages:

 Development — The first stage in which the ideas for the film are created, rights to
books/plays are bought etc., and the screenplay is written. Financing for the project has to be
sought, and the project must be green-lit.
 Pre-production—Preparations are made for the shoot, in which cast and film crew are hired,
locations are selected, and sets are built.
 Production—The raw elements for the film are recorded during the film shoot.
 Post-production—The images, sound, and visual effects of the recorded film are edited.
 Distribution—The finished film is distributed and screened in cinemas and/or released
to home video.

Development

In this stage, the project producer selects a story, which may come from a book, play, another
film, true story, video game, comic book, graphic novel, or an original idea, etc. After identifying
a theme or underlying message, the producer works with writers to prepare a synopsis. Next they
produce a step outline, which breaks the story down into one-paragraph scenes that concentrate
on dramatic structure. Then, they prepare a treatment, a 25-to-30-page description of the story,


its mood, and characters. This usually has little dialogue and stage direction, but often contains
drawings that help visualize key points. Another way is to produce a scriptment once a synopsis
is produced.

Next, a screenwriter writes a screenplay over a period of several months. The screenwriter may
rewrite it several times to improve dramatization, clarity, structure, characters, dialogue, and
overall style. However, producers often skip the previous steps and develop submitted
screenplays which investors, studios, and other interested parties assess through a process
called script coverage. A film distributor may be contacted at an early stage to assess the likely
market and potential financial success of the film. Hollywood distributors adopt a hard-headed
business approach and consider factors such as the film genre, the target audience, the historical
success of similar films, the actors who might appear in the film, and potential directors. All
these factors imply a certain appeal of the film to a possible audience. Not all films make a profit
from the theatrical release alone, so film companies take DVD sales and worldwide distribution
rights into account.

The producer and screenwriter prepare a film pitch, or treatment, and present it to potential
financiers. They will also pitch the film to actors and directors (especially so-called bankable
stars) in order to "attach" them to the project (that is, obtain a binding promise to work on the
film if financing is ever secured). Many projects fail to move beyond this stage and enter
so-called development hell. If a pitch succeeds, a film receives a "green light", meaning someone
offers financial backing: typically a major film studio, film council, or independent investor. The
parties involved negotiate a deal and sign contracts.

Once all parties have met and the deal has been set, the film may proceed into the pre-production
period. By this stage, the film should have a clearly defined marketing strategy and target
audience.

Development of animated films differs slightly in that it is the director who develops and pitches
a story to an executive producer on the basis of rough storyboards, and it is rare for a full-length
screenplay to already exist at that point in time. If the film is green-lighted for further
development and pre-production, then a screenwriter is later brought in to prepare the
screenplay.

Pre-production


In pre-production, every step of actually creating the film is carefully designed and planned.
The production company is created and a production office established. The film is pre-
visualized by the director, and may be storyboarded with the help of illustrators and concept
artists. A production budget is drawn up to plan expenditures for the film. For major
productions, insurance is procured to protect against accidents.

The producer hires a crew. The nature of the film, and the budget, determine the size and type of
crew used during filmmaking. Many Hollywood blockbusters employ a cast and crew of
hundreds, while a low-budget, independent film may be made by a skeleton crew of eight or nine
(or fewer). These are typical crew positions:

 Storyboard artist: creates visual images to help the director and production designer
communicate their ideas to the production team.
 Director: is primarily responsible for the storytelling, creative decisions and acting of the
film.
 Assistant director (AD): manages the shooting schedule and logistics of the production,
among other tasks. There are several types of AD, each with different responsibilities.
 Unit production manager: manages the production budget and production schedule. They
also report, on behalf of the production office, to the studio executives or financiers of the
film.
 Location manager: finds and manages film locations. Nearly all pictures feature
segments that are shot in the controllable environment of a studio sound stage, while
outdoor sequences call for filming on location.
 Production designer: creates the visual conception of the film, working with the art
director.
 Art director: manages the art department, which makes production sets
 Costume designer: creates the clothing for the characters in the film working closely with
the actors, as well as other departments.
 Makeup and hair designer: works closely with the costume designer to create a certain
look for a character.
 Casting director: finds actors to fill the parts in the script. This normally requires that actors
audition.


 Choreographer: creates and coordinates the movement and dance, typically for musicals.
Some films also credit a fight choreographer.
 Director of photography (DP): is the cinematographer who supervises the photography of the
entire film.
 Director of audiography (DA): is the audiographer who supervises the audiography of the
entire film. For productions in the Western world this role is also known as either sound
designer or supervising sound editor.
 Production sound mixer: is the head of the sound department during the production stage
of filmmaking. They record and mix the audio on set - dialogue, presence and sound
effects in mono and ambience in stereo. They work with the boom operator, Director,
DoA, DoP, and First AD.
 Sound designer: creates the aural conception of the film, working with the supervising sound
editor. On some productions the sound designer plays the role of a director of audiography.
 Composer: creates new music for the film. (usually not until post-production)

Production

[Photo: Steven Spielberg with Chandran Rutnam in Sri Lanka]


In production, the video production/film is created and shot. More crew will be recruited at this
stage, such as the property master, script supervisor, assistant directors,


stills photographer, picture editor, and sound editors. These are just the most common roles in
filmmaking; the production office will be free to create any unique blend of roles to suit the
various responsibilities possible during the production of a film.

A typical day's shooting begins with the crew arriving on the set/location by their call time.
Actors usually have their own separate call times. Since set
construction, dressing and lighting can take many hours or even days, they are often set up in
advance.
The grip, electric and production design crews are typically a step ahead of the camera and sound
departments: for efficiency's sake, while a scene is being filmed, they are already preparing the
next one.

While the crew prepares their equipment, the actors are wardrobed in their costumes and attend
the hair and make-up departments. The actors rehearse the script and blocking with the director
and the camera and sound crews rehearse with them and make final tweaks. Finally, the action is
shot in as many takes as the director wishes. Most American productions follow a specific
procedure:

The assistant director (AD) calls "picture is up!" to inform everyone that a take is about to be
recorded, and then "quiet, everyone!" Once everyone is ready to shoot, the AD calls "roll sound"
(if the take involves sound), and the production sound mixer will start their equipment, record a
verbal slate of the take's information, and announce "sound speed", or just "speed" when they are
ready. The AD follows with "roll camera", answered by "speed!" by the camera operator once
the camera is recording. The clapper, who is already in front of the camera with
the clapperboard, calls "marker!" and slaps it shut. If the take involves extras or background
action, the AD will cue them ("action background!"), and last is the director, telling the actors
"action!". The AD may echo "action" louder on large sets.

A take is over when the director calls "cut!", and camera and sound stop recording. The script
supervisor will note any continuity issues and the sound and camera teams log technical notes for
the take on their respective report sheets. If the director decides additional takes are required, the
whole process repeats. Once satisfied, the crew moves on to the next camera angle or "setup,"
until the whole scene is "covered." When shooting is finished for the scene, the assistant director
declares a "wrap" or "moving on," and the crew will "strike," or dismantle, the set for that scene.


At the end of the day, the director approves the next day's shooting schedule and a daily progress
report is sent to the production office. This includes the report sheets from continuity, sound, and
camera teams. Call sheets are distributed to the cast and crew to tell them when and where to
turn up the next shooting day. Later on, the director, producer, other department heads, and,
sometimes, the cast, may gather to watch that day's or the previous day's footage, called
dailies, and review their work.

With workdays often lasting 14 or 18 hours in remote locations, film production tends to create a
team spirit. When the entire film is in the can, at the completion of the production phase, it is
customary for the production office to arrange a wrap party, to thank all the cast and crew for
their efforts.

For the production phase on live-action films, synchronizing work schedules of key cast and
crew members is very important, since for many scenes, several cast members and most of the
crew must be physically present at the same place at the same time (and bankable stars may need
to rush from one project to another). Animated films have different workflow at the production
phase, in that voice talent can record their takes in the recording studio at different times and
may not see one another until the film's premiere, while most physical live-action tasks are either
unnecessary or are simulated by various types of animators.

Post-production

Here the video/film is assembled by the video/film editor. The shot film material is edited. The
production sound (dialogue) is also edited; music tracks and songs are composed and recorded if
a film is sought to have a score; sound effects are designed and recorded. Any computer-graphic
visual effects are digitally added. Finally, all sound elements are mixed into "stems", which are
then married to picture, and the film is fully completed ("locked").

Distribution

This is the final stage, where the film is released to cinemas or, occasionally, directly to
consumer media (DVD, VCD, VHS, Blu-ray) or direct download from a digital media provider.
The film is duplicated as required (either onto reels or hard disk drives) and distributed to
cinemas for exhibition (screening). Press kits, posters, and other advertising materials are
published, and the film is advertised and promoted. A B-roll clip may be released to the press


based on raw footage shot for a "making of" documentary, which may include making-of clips as
well as on-set interviews.

Film distributors usually release a film with a launch party, a red-carpet premiere, press
releases, interviews with the press, press preview screenings, and film festival screenings. Most
films are also promoted with their own special website separate from those of the production
company or distributor. For major films, key personnel are often contractually required to
participate in promotional tours in which they appear at premieres and festivals, and sit for
interviews with many TV, print, and online journalists. The largest productions may require
more than one promotional tour, in order to rejuvenate audience demand at each release window.

Since the advent of home video in the early 1980s, most major films have followed a pattern of
having several distinct release windows. A film may first be released to a few select cinemas, or
if it tests well enough, may go directly into wide release. Next, it is released, normally at
different times several weeks (or months) apart, into different market segments
like rental, retail, pay-per-view, in-flight entertainment, cable, satellite, and/or free-to-
air broadcast television. The distribution rights for the film are also usually sold for worldwide
distribution. The distributor and the production company share profits.

Independent Filmmaking

Filmmaking also takes place outside of the mainstream and is commonly called independent
filmmaking. Since the introduction of DV technology, the means of production have become
more democratized. Filmmakers can conceivably shoot and edit a film, create and edit the sound
and music, and mix the final cut on a home computer. However, while the means of production
may be democratized, financing, traditional distribution and marketing remain difficult to
accomplish outside the traditional system. In the past, most independent filmmakers have relied
on film festivals to get their films noticed and sold for distribution. However, the Internet has
allowed for relatively inexpensive distribution of independent films on websites such as
YouTube. As a result, several companies have emerged to assist filmmakers in getting
independent movies seen and sold via mainstream internet marketplaces, often adjacent to
popular Hollywood titles. With internet movie distribution, independent filmmakers who fail to
garner a traditional distribution deal now have the ability to reach global audiences.


Topic no. 61

Motion Picture Camera

Motion-picture camera, also called Movie Camera, any of various complex
photographic cameras that are designed to record a succession of images on a reel
of film that is repositioned after each exposure. Commonly, exposures are made at the
rate of 24 or 30 frames per second on film that is 8, 16, 35, or 70 mm in width.
A motion-picture camera essentially consists of a body, a film-transport system,
lenses, shutter, and a viewing-focusing system. The motor-driven transport system is
the chief element that differentiates motion-picture cameras from still cameras. Within
the camera, the unexposed film is housed in a totally dark chamber called the forward
magazine. One or both edges of the film are lined with regularly spaced perforations, or
sprocket holes. Sprocket-driven gears grip these perforations, feeding the film into an
enclosed exposure chamber. A mechanical claw pulls the film into position behind the
shutter, locking the film momentarily in place. The shutter opens, exposes an image
onto the film, and closes. Then the claw, with an automatic pulldown movement,
advances the film for the next exposure. Each frame of the film comes to a complete
stop for its exposure, and hence each exposure is a single still photograph, or frame. As
the film moves through the camera, the exposed sections are fed into the rear
magazine, which is another totally dark chamber.
Most cameras now use the reflex system for viewing and focusing; in this system a
mirror diverts to the viewfinder some of the light rays coming through the lens. Zoom
lenses are commonly used on many cameras, as are ordinary wide-angle and telephoto
lenses. The shutter is located behind the lens and in front of the film gate. It is usually
rotary, and consists of a half-circle that is pivoted around in synchronization with the
claw’s pull down of the film, so that the half-circle blocks out light from the lens when the
film is in transit and moves out of the way to let light through when the film frame is
motionless. Cameras used in sound filming contain internal insulation to dampen the
noise of their moving parts.
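The film-transport arithmetic implied here is straightforward: 35 mm stock carries 16 frames per foot, so at 24 frames per second the camera consumes 90 feet of film per minute. A small sketch of the calculation:

```python
def film_feet(runtime_seconds, fps=24, frames_per_foot=16):
    """Feet of 35 mm film consumed for a given runtime: 35 mm stock
    carries 16 frames per foot, so 24 fps uses 90 feet per minute."""
    return runtime_seconds * fps / frames_per_foot

print(film_feet(60))   # 90.0 feet for one minute at 24 fps
print(film_feet(600))  # 900.0 feet for a ten-minute take
```

Other gauges have different frame counts per foot, so the `frames_per_foot` parameter would change for 16 mm or 70 mm stock.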


Topic no. 62

Equipment technology

The development of motion picture complexity has been driven by a continuing
technological evolution, ignited and manipulated by human initiative and inventiveness, which
has afforded filmmakers the opportunity to practice a more complex craft to tell more complex
stories. In concert with societal attitudes and proximity, this evolution has driven the
development of distinct styles, movements, and methods that would have been impossible
without increasingly advanced apparatus. However, while this technological progression has
been linear, it has not necessarily coincided with a similar evolution of quality; the skill of a
filmmaker should not be judged by the technological complexity of the production, but by the
ability of the filmmaker to wield the technology of the time and of his or her choosing to
effectively and clearly convey a narrative, evoke an emotion, or make an impression. Although
the linear technological evolution of filmmaking has empowered filmmakers by offering a more
diverse catalogue of tools and techniques, it is the filmmaker’s ability to effectively and
discerningly utilize this technology within a temporal and societal context that truly drives
cinematic quality, of which there has been no clear linear progression.

As film history has progressed, so too has the sophistication of filmmaking technology, from
cameras, to sound recording, to editing. Technological advancements in these areas expand the
creative potential of the filmmaker. However, just because technology is more advanced does not
mean that it is necessarily superior in each given application. Rather, advanced technology is
advantageous in that it broadens the toolset available to the filmmaker from whom he or she can
discern which equipment and techniques are best suited to a given production. French film
theorist Louis Delluc would call these filmmaking techniques and methods cinematic formal
elements or those elements unique to film as an art form, such as editing and camera movement
(Jaramillo). As the evolution of film has progressed, the catalogue of cinematic formal elements
has grown, enabling filmmakers to, at their discretion, make more complex films. Even restricted
to the confines of what Tom Gunning calls “cinema of attractions,” the dominant paradigm
before 1908 (73), this is evident.


Gunning worries that adopting an evolutionary view of cinema will categorize pre-WWI film
and cinema of attractions, as he puts it, “as [a] primitive […] early stage in which later potentials
are sketched out but imperfectly realized”. However, Gunning’s definition of cinema of
attractions frees it from this imperfect characterization:

By its reference to the curiosity-arousing devices of the fairground, the term denoted early
cinema’s fascination with novelty and its foregrounding of the new act of display. Viewed from
this perspective, early cinema did not simply seek to neutrally record previously existing acts or
events. Rather, even the seemingly stylistically neutral film consisting of a single shot without
camera tricks involved a cinematic gesture of presenting for view, of displaying.

On this view, the early films of cinema’s pioneers would not have been improved by the
advanced technology of later generations, for their displays did not call for it. Further, they
cannot be seen as solely preparatory, for, like later narrative films, they presented a subject for
view in a uniquely cinematic way. The early films of Edison and Dickson were simple, short
glimpses of “well-known sports figures, excerpts from noted vaudeville acts, or performances by
dancers or acrobats” (Thompson & Bordwell 7). While it is true that primitive technology did
limit these small-scale productions, which, according to Thompson and Bordwell, “lasted only
twenty seconds or so – the longest run of film that the Kinetoscope could hold”, advanced
technology would not necessarily have improved them, for their simplistic nature did not call for
it. Regardless, filmmaking technology evolved with the Lumière brothers’ Cinématographe,
which freed filmmakers from the confines of the studio and allowed for on-location shooting
(Thompson & Bordwell 8-9). This, however, did not lead to better films, but only augmented the
possibilities for future films such as Workers Leaving the Factory (Lumière, 1895) and Arrival of
a Train (Lumière, 1896), the production of which would have been impossible within a studio.
With this advancement, the global toolset of filmmakers grew; from Edison and Dickson,
filmmakers got the option to shoot in a light-controlled studio and from the Lumière brothers the
ability to shoot on location. Neither of these options is universally better; each is simply better
suited to a given production, and each would, in itself, evolve over time.


Single-shot display films eventually gave way to films such as Georges Méliès’ A Trip to the
Moon (1902), composed of several single-shot scenes, and later films like Cecil B. DeMille’s The
Cheat (1915), which employed analytical editing, using multiple shots from varied distances in
the same scene to show detail and emotion (Thompson & Bordwell). The continuation of this
technological editing evolution is most evident in the Constructivist-influenced, state-sponsored
Soviet montage movement of the 1920s. According to Thompson and Bordwell, Montage films
“have a greater number of shots than does any other type of filmmaking of their era […and]
frequently broke individual actions down into two or more shots” (117). However, the more
complex editing techniques were not, in themselves, what drove the quality of montage films,
but instead the “more specific strategies of editing, involving temporal, spatial, and graphic
tensions” (Thompson & Bordwell 117). Thompson and Bordwell write that Montage filmmaker
Dziga Vertov, for instance, “emphasized that the filmmaker should calculate the differences
between shots – light versus dark, slow motion versus fast motion, and so on.

These differences, or ‘intervals,’ would be the basis of the film’s effect on the audience”. The
influence of Marxist dialecticism led Sergei Eisenstein, another Montage filmmaker, to theorize
that shots should clash with one another to create a new idea in the mind of the viewer
(Thompson & Bordwell 116). This practice is employed multiple times in Eisenstein’s
film October (1928), such as in the juxtaposition of Kerensky with a shot of Napoleon
(Thompson & Bordwell 120). With Vertov and Eisenstein as exemplars, it is clear that the
Montage filmmakers achieved success not solely because of the technological evolution, but
because they purposefully utilized the cinematic formal elements, in this case editing, born from
that evolution to create a distinct style. Their inventiveness catered the technology to their goals
and resulted in quality.

While Soviet Montage filmmakers focused on editing, they recognized the importance of a
striking composition within each individual shot (Thompson & Bordwell 121). So too did the
French Impressionists and German expressionists who used other of the cinematic formal
elements, such as camera work and mise-en-scene, respectively, to externalize characters’ inner
states (Thompson & Bordwell). For French Impressionists, such as Louis Delluc, filmmaking
was about photogénie, “that quality that distinguishes a film shot from the original object
photographed” (Thompson & Bordwell 77). According to Thompson and Bordwell, photogénie
“is created by the properties of the camera: framing isolates objects from their environment,
black-and-white film stock transforms their appearance, special optical effects further change
them, and so on” (77). This emphasis led the Impressionists to develop innovative camera
techniques to externalize characters’ subjectivity. They manipulated the components of the
presented technology, in this case the camera, to purposefully elicit a desired effect. Thompson
and Bordwell detail Impressionist uses of the camera’s optical devices:

Superimpositions may convey a character’s thoughts or memories. A filter placed over the lens
may function to suggest subjectivity. […] Throwing the lens out of focus could also convey
subjectivity, whether we see the characters or through their eyes. […] Impressionist films also
feature camera movements that convey subjectivity and enhance photogénie. (78-80)

The innovative use of these uniquely cinematic tools, not the tools themselves, enabled greater
narrative clarity and character relatability. This reflects a clear evolutionary step, driven by
human faculty, in the ability of the camera to tell a story. While the Impressionists used
camerawork to achieve this effect, the German Expressionists utilized mise-en-scene, what
Thompson and Bordwell define as “all the elements placed in front of the camera to be
photographed: the settings and props, lighting, costumes and makeup, and figure behavior”
(733). The goal of Expressionist film was to fuse these elements into a singular and distorted
composition expressing the inner state of the subject, most famously seen in The Cabinet of Dr.
Caligari (Wiene 1920). Technologically, lighting is of the most interest in this practice.
Thompson and Bordwell write, “For the most part, Expressionist films used simple lighting from
the front and sides, illuminating the scene flatly and evenly to stress the links between the figures
and the décor” (94). Expressionism then exemplifies that technological simplicity aimed at a
certain goal is more effective than complexity. When they did use more complex lighting, it was
purposefully to create shadows augmenting the overall distortion of the frame (Thompson &
Bordwell 94). Once again, however, a filmmaker’s ability to appropriately and discerningly
employ tools and techniques, such as lighting, not the arbitrary use of them, correlates with
quality. This becomes clear in the evolution of technologies such as sound and color.


The advent of synchronized sound, first seen in The Jazz Singer (Crosland 1927), in the late
twenties and early thirties was met with apprehension from “some critics and directors [who]
feared that extensive dialogue scenes in adapted plays would eliminate the flexible camera
movements and editing of the silent era” (Thompson & Bordwell 177). The adoption of sound
was a major step forward in the technological evolution of film, but in order for it to be gainfully
applied, the practice in itself had to go through a self-contained evolution. Sound in its early
stages did not necessarily equate to better films; for instance, according to Thompson and
Bordwell, “The microphones initially were insensitive, and hence studios often insisted that
actors take diction lessons and speak slowly and distinctly. Many early talkies move at a slow
pace and the performances seem stilted to modern ears” (182). Improvements in microphones,
multiple-track sound recording, and syncing methods gradually enabled filmmakers to employ
the once clumsy tool effectively (Thompson & Bordwell 201). Thompson & Bordwell note that
“most filmmakers soon realized [...] that sound, used imaginatively, offered a valuable new
stylistic resource” (177). The combination of improved sync-sound with picture opened up new
avenues of storytelling not previously possible. Fritz Lang’s M (1931), for instance, takes
advantage of the new possibility of audible dialogue. Importantly, though, Lang doesn’t rely
solely on dialogue to move the story forward, but retains the strong visual storytelling methods
of the silent era, reserving dialogue to relate information that can’t be explained visually.

Additionally, M is also an early example of sound as a motif in film; the murderer at the center of
the story whistles a haunting tune that is used at crucial plot points to drive the narrative forward.
However, M’s use of sound in itself is not what led to the film’s quality, but rather the
filmmaker’s ability to discerningly and skillfully use it. Additionally, the use of sound alone is
not enough to declare that it is better than films of the silent era. For comparison, the narrative
of The Cheat was clearly and convincingly conveyed visually and its story was not muddled by
the absence of synchronized sound. The makers of both films managed to successfully tell their
story by using the technology at hand. However, while The Cheat would not necessarily have
benefited from sound, M, as it is, would have been a difficult if not impossible story to tell
without it. This highlights the function of the technological evolution in allowing, but not
mandating, filmmakers to do what was not possible with more primitive technology.


Color in film went through a self-contained evolution much like sound. Many films of the silent
era, for instance, used processes such as tinting and toning to give an overall color to the frame
(Thompson & Bordwell 34). Thompson and Bordwell comment on the process that “color could
provide information about the narrative situation and hence make the story clearer to the
spectator” (34), much like the use of photogénie and mise-en-scene by the Impressionists and
Expressionists. Other films, such as The Great Train Robbery, employed stenciling to hand-color
portions of the frame after photography. Color began its mainstream ascent when Technicolor
introduced their three-strip coloring process in the 1930s (Thompson & Bordwell 203).
However, not every filmmaker immediately began producing color films, and those that did, did
so with reason. While this was greatly due to the fact that shooting in color increased budgets by
as much as thirty percent, Thompson and Bordwell reflect, “Today we regard color as a realistic
element in films, but in the 1930s and 1940s, it was often associated with fantasy and spectacle.
It could be used for exotic adventures like The Garden of Allah (1936), swashbucklers like The
Adventures of Robin Hood (1938), or musicals like Meet Me in St. Louis (1944)” (203). However,
despite this new technology, a film did not have to use color in order to be considered of quality.
Orson Welles’ 1941 Citizen Kane, for instance, was shot in black and white, despite the advent
of color film in the previous decade. While it is possible this decision was made for budgetary
reasons, the use of black and white dramatically accentuated the shadowy, mysterious tone of the
film. In this case, the decision not to use a tool born from the technological evolution actually
enhanced the end result. However, other technologies were meticulously chosen and skillfully
implemented to produce the complex film. Thompson and Bordwell write:

Stylistically, Kane was flamboyant, drawing extensively on RKO’s resources. For some scenes,
Welles used quiet, lengthy takes. Other passages, notably the newsreel and several montage
sequences, used quick cutting and abrupt changes in sound volume. To emphasize the vast
spaces of some of the sets, cinematographer Gregg Toland worked at achieving deep focus shots,
placing some elements close to the camera, others at a distance. (209)

Despite that Citizen Kane did not utilize Technicolor, it is clear that the film is still very much a
child of the technological evolution. The rhythmic use of editing and sound, for instance, is
reminiscent of the Soviet Montage movement. Even the tenets of this movement, specifically
Eisenstein’s dialectical montage, evolved with technology such as synchronized sound. In a
scene from Citizen Kane, for instance, a non-diegetic scream is heard after Kane strikes his wife.
This clashes with the diegetic sound to create a new idea in the mind of the viewer. It can also be
seen as a subjective tool, similar to those of French Impressionism and German Expressionism.
In the light of this convergence of styles and technical tools, Citizen Kane is a prime example of
the possibilities enabled by the technological evolution. However, it is most important to
remember that human inventiveness is responsible for the realization of these technologies in the
successful manner seen in Citizen Kane.

The evolution of film technology remains unpunctuated. New technologies are readily invented,
tested, and perfected. In recent years, the rise of digital cinema equipment and techniques has
begun encroaching on the arena once dominated solely by photographic film (Thompson &
Bordwell 713). As was true in previous evolutionary iterations, however, this technology only
serves as another option for filmmakers to choose and not a precondition of modern quality. This
is reflected by enthusiasm from some directors, such as George Lucas and Robert Rodriguez,
about digital technology, and apprehension from others. Thompson and Bordwell write, “Many
cinematographers, directors, designers, and other professionals were upset at the prospect of the
death of photographic film, as were many movie fans, but the rise of digital cinema seemed
inevitable” (713). This trend and the attitudes surrounding it harmonize with the patterns that
have characterized cinema history. However, fans of cinema need not fret, for neither adoption
nor disregard of this new technology can bring an end to cinematic quality. The power to do so
lies solely in the hands of the filmmaker, the quality of whose projects will ultimately depend
upon his or her ability to effectively wield the cinematic formal elements, whatever they may be
in the coming years, to clearly convey a story, emotion, impression, or idea.


Topic no. 63
Camera Lens Mechanism

The lens system

Engraved on the front of every lens are two important numbers:

 The lens’ focal length—or in the case of zoom lenses, its range of focal lengths. This
gives you a clue to the variations in shot sizes the lens will provide.
 The lens’ largest aperture or f-stop (e.g., f/2)—the smaller this f-stop number, the
larger the lens’ maximum aperture, so the better its performance under dim
lighting (low-light) conditions.

There are two fundamental types of lenses on video cameras:

 Prime lens (primary lens), which has a specific (unchangeable) focal length.

Prime lenses have become specialty items, primarily used by filmmakers, digital
filmmakers, and in special use situations such as security or scientific research.
 Zoom lens, which has a variable focal length. Zoom lenses are by far the most
popular lens on cameras because of their ability to move easily from wide-angle to
telephoto focal lengths.

Focal length and lens angle

The term focal length is simply an optical measurement—the distance between the optical
center of the lens and the image sensor (CCD or CMOS) when you are focused at a great
distance such as infinity. It is generally measured in millimeters (mm).

A lens designed to have a long focal length (long focus) behaves as a narrow angle or telephoto
system. The subject appears much closer than normal, but you can only see a smaller
part of the scene. Depth and distance can look unnaturally compressed in the shot.


When the lens has a short focal length (short focus) this wide-angle system takes in
correspondingly more of the scene. But now subjects will look much farther away; depth
and distance appear exaggerated. The exact coverage of any lens depends on its focal
length relative to the size of the camera CCD’s image sensor.
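The trade-off between focal length and coverage described above can be sketched numerically with the usual angle-of-view relation (a thin-lens approximation, not a formula from the text). The 36mm sensor width and the focal lengths below are assumed example values:

```python
import math

def angle_of_view(focal_length_mm, sensor_dim_mm):
    """Angle of view in degrees for a lens focused at (or near) infinity."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)))

# Assumed example format: a 36mm-wide (full-frame) sensor.
wide = angle_of_view(24, 36)    # short focal length -> wide angle
normal = angle_of_view(50, 36)
tele = angle_of_view(200, 36)   # long focal length -> narrow (telephoto) angle
print(round(wide, 1), round(normal, 1), round(tele, 1))
```

As the focal length grows, the computed angle shrinks, matching the narrow-angle/telephoto behaviour described above.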

The prime lens

The prime lens, or primary lens, is a fixed-focal-length lens (Figures 6.8 and 6.9). Only
the iris (diaphragm) within the lens barrel is adjustable. Changing its aperture (f-stop)
varies the lens’ image brightness, which controls the picture’s exposure. The focus ring
varies the entire lens system’s distance from the receiving chip.

FIGURE 6.7
The various shots that can be obtained by different lenses. The wide-angle and telephoto shots can be
taken while standing in one place and exchanging lenses.


FIGURE 6.8
A prime lens has few features: focus and aperture. (Photo courtesy of Zeiss and BandPro.)

If a video camera with a single prime lens is being used and a closer or more distant shot
of the subject is needed, the camera operator has to move the camera nearer or farther
from the subject. The alternative is to have a selection of prime lenses of various focal
lengths to choose from.

The zoom lens

A zoom lens is a variable focal length lens. It allows the camera operator to zoom in and
zoom out on a subject without moving the camera forward or backward. The zoom lens
enables the camera operator to select any coverage within its range. Most video and
television cameras come with optical zoom lenses. An optical zoom uses a lens to magnify
the image and send it to the chip. The optical zoom retains the original quality of the
camera’s chips (Figure 6.10).

An increasing number of consumer video cameras are fitted with a lens system that
combines both an optical zoom and a digital zoom. A camera might, for instance, have a 20×
optical zoom and a 100× digital zoom. Depending on the quality of its design, the optical
zoom system should give a consistently high-quality image throughout its zoom range;
the focus and picture clarity should remain optimal at all settings. In a digital system, the
impression of zooming in is achieved by progressively reading out a smaller and smaller
area of the same digitally constructed picture. Consequently, viewers are likely to see the
quality of the image progressively deteriorating as they zoom in because fewer of the
original picture’s pixels are being spread across the television screen.
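The pixel loss described here can be sketched as a simple calculation: a digital zoom factor of Z crops the readout to 1/Z of the frame’s width and height, leaving only 1/Z² of the pixels to fill the screen. The 1920 × 1080 sensor is an assumed example value:

```python
def effective_pixels(sensor_width_px, sensor_height_px, digital_zoom):
    """Pixels actually read out at a given digital zoom factor.

    Digital zoom crops the centre of the already-captured frame, so the
    readout area shrinks by the square of the zoom factor.
    """
    return (sensor_width_px // digital_zoom) * (sensor_height_px // digital_zoom)

# Assumed example sensor: 1920 x 1080 (about 2.07 million pixels).
full = effective_pixels(1920, 1080, 1)
zoom4 = effective_pixels(1920, 1080, 4)   # only 1/16 of the pixels remain
print(full, zoom4)
```

At a modest 4× digital zoom, fewer than a sixteenth of the original pixels remain, which is why the image visibly deteriorates.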

Lens design involves many technical compromises, particularly in small systems. The
problems with providing high performance from a lightweight, robust unit at a reasonable
cost have been challenging for manufacturers. So the optical quality of budget systems is
generally below that of an equivalent prime lens.

When the camera operator wants to get a “closer” shot of the subject or is trying to
avoid something at the edge of the picture coming into shot, it is obviously a lot easier to
zoom in than to move the camera, particularly when using a tripod. In fact, many people
simply stand wherever it is convenient and zoom in or out to vary the size of the shot.


However, the focal length of the lens does not just determine the image size. It also affects
the following factors:

 How much of the scene is sharp. The longer the telephoto used, the less depth
of field (the distance between the nearest and farthest objects in focus).
 How prominent the background is in closer shots. The background is magnified at the
same time as the foreground subject. Instead of zooming, if the camera were moved
closer to the subject, the background size would be different from the zoom shot
(see Figure 6.11).
 How hard it is to focus. The longer the telephoto, the smaller the depth of field.
 Camera shake. The longer the telephoto, the more the operator’s shake is magnified.
The wider the shot, the less apparent the shake.
 The accuracy of shapes (geometry). Lenses can easily distort shapes. For example,
when a very wide-angle lens is tilted up at a tall building, the building will distort,
looking as though it is going to fall.
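The depth-of-field behaviour in the first bullet can be sketched with the standard hyperfocal-distance approximation (not a formula from the text). The circle-of-confusion value, focal lengths, and aperture below are assumed example figures:

```python
def hyperfocal_mm(f_mm, f_number, coc_mm=0.03):
    """Hyperfocal distance in mm; coc_mm is an assumed circle of confusion."""
    return f_mm ** 2 / (f_number * coc_mm) + f_mm

def dof_limits_mm(f_mm, f_number, subject_mm, coc_mm=0.03):
    """Approximate near and far limits of acceptable sharpness (mm)."""
    h = hyperfocal_mm(f_mm, f_number, coc_mm)
    near = h * subject_mm / (h + (subject_mm - f_mm))
    if h <= subject_mm - f_mm:       # focused at or beyond the hyperfocal distance
        return near, float("inf")
    far = h * subject_mm / (h - (subject_mm - f_mm))
    return near, far

# Same subject distance (3 m) and aperture (f/4): the long lens has far less depth of field.
near_w, far_w = dof_limits_mm(28, 4, 3000)
near_t, far_t = dof_limits_mm(200, 4, 3000)
print(round(far_w - near_w), round(far_t - near_t))
```

With these example numbers the wide lens holds metres of the scene in acceptable focus while the 200mm telephoto holds only a few centimetres, which is why focusing a long lens is so critical.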

As you can see, the zoom lens needs to be used with care, although amateurs often ignore
such distortions and varying perspective. The zooming action, too, can be overused,
producing distracting and amateurish effects.

FIGURE
The background changes in size are due to the lens used. Note that the first photo was shot with
a telephoto lens and the last shot was taken with a wide-angle lens. The camera had to be moved closer
to the subject for each shot as a wider lens was attached so that the subject would stay the same
approximate size. (Photo by K. Brown.)

Zoom lens control

Zoom lens remote controls are an important tool for camera operators.


Standing close to the camera to manipulate the controls on the lens is uncomfortable if
required for a long period of time. Remote controls allow the camera operator to operate
the zoom lens while standing at the back of the camera (Figure 6.12).


Topic no. 64

Lens Factor

The Focal Length

A primary characteristic of a lens is the focal length. A lens’ focal length is defined as
the distance between the lens’ optical center and the camera’s image sensor (or film
plane) when focused at infinity. To understand this definition of focal length, we need
to define “optical center” as well. A lens’ optical center is the point (usually though not
always) within a lens, at which the rays of light from two different sources entering the
lens are assumed to cross. Shorter focal length lenses provide a wider field of view, but
offer less magnification. Conversely, longer focal lengths offer a shorter field of view,
but provide greater magnification. On DSLRs, the interchangeable lens’ focal length is
measured in millimeters. The focal length of a lens is usually displayed on the lens
barrel, along with the size of the adaptor ring.


The Lens Ratio

When you look at the front end of your lens barrel, you’ll see a ratio number (1:2.8,
1:2.8-4, 1:3.5-5.6, etc.), which is the maximum aperture of the lens. The aperture
determines how much light the lens transmits to the image sensor. A lower maximum
aperture value indicates a brighter, better quality lens. High quality zoom lenses deliver
a constant f-stop throughout the focal range (e.g., f/2.8 at 35mm and f/2.8 at 80mm),
whereas on a lower quality lens the f-stop varies as you travel up the focal range (e.g.,
f/3.5 at 28mm, but f/5.6 at 80mm); you are losing at least one stop of light as you zoom
up the focal length from wide angle to telephoto. A lens with a low f-number (wide
maximum aperture) is a better quality lens, and allows you to do more with it. For
example, such a lens is "brighter", allowing you to take photos in low ambient light
conditions, yet still register a quality exposure. In addition, these bright lenses allow
you to achieve a very shallow depth of field. It is to be noted that any lens that is f/2.8
or lower is considered to be a professional lens, and will have a correspondingly higher
price tag.
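The "at least one stop" claim can be checked directly: each stop halves the light, and f-numbers advance by a factor of √2 per stop, so the stop difference between two f-numbers is 2·log₂(N₂/N₁). A small sketch:

```python
import math

def stops_between(f_low, f_high):
    """Stops of light lost moving from a wider (lower) to a narrower (higher) f-number."""
    return 2 * math.log2(f_high / f_low)

# The variable-aperture zoom from the text: f/3.5 at 28mm but f/5.6 at 80mm.
loss = stops_between(3.5, 5.6)
print(round(loss, 2))   # a little over one full stop
```

Going from f/3.5 to f/5.6 loses about 1.36 stops, consistent with the text's "at least one stop" figure.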


Topic no. 65

Types of Lenses

1. Standard/Normal Lens

The standard lens has a fixed focal length (50mm, 85mm, 100mm), and reproduces
fairly accurately what the human eye sees – in terms of perspective and angle of view.
For a 35mm film camera or a full-frame DSLR, the 50mm lens is considered standard.
The higher focal lengths (85mm or 100mm) make ideal lenses for portraiture, because
when coupled with a wide aperture they thoroughly soften any background detail, thus
making it less likely to distract from the main subject.

2. Wide Angle Lens

A wide-angle lens has a shorter focal length (10 through 42mm) when compared to a
standard lens. This enables you to capture a comparatively wider angle of view. A
wide-angle lens is a natural choice for capturing outdoor landscapes and group
portraits. In fact, wide angle can be the only way to capture the complete setting
without omitting any important elements in the image. In this manner, you can use
wide-angle lenses to capture a deep DOF.

3. Telephoto Lens

Telephoto lenses (100mm - 800mm) provide a narrow field of view. These long lenses
enable you to compress a distance (and the sense of depth, as well) and pick out
specific objects from far off. They have strong resolving power and an inherently
shallow DOF, where the slightest lateral movement can take a subject out of view.
Telephoto lenses are great for wildlife, portrait, sports, and documentary types of
photography. They enable you to capture subjects from hundreds of feet away.

4. Zoom Lens


Zoom lenses have variable focal lengths, and are extremely useful. Some can range
between a wide-angle and a telephoto (i.e. 24 to 300mm), so you have extensive
versatility for composition. The trade-off with zoom lenses is the aperture. Because of
the number of elements required in constructing these lenses, they have a limited
ability to open up and allow in light. So unless you’re prepared to outlay a lot of
money, you will give up lens speed.

5. Fisheye Lens

A fisheye lens is a specialized, wide-angle lens that provides extremely wide images by
changing straight lines into curves. It can sometimes produce circular, convex, or oval
images by distorting the perspective and creating a 180° image. The range of focal
length varies between 7~16mm in a fish-eye lens.

6. Macro Lens


Macro lenses are used for close-up or “macro” photography. They range in focal
length from 50 to 200mm. These lenses obtain razor-sharp focus for subjects
within the macro focus distance, but lose their ability for sharp focus at other distances.
These lenses enable the photographer to obtain life-size or larger images of subjects
like wasps, butterflies, and flowers.

7. Tilt-Shift Lens

The Tilt-Shift lens enables you to manipulate the vanishing points, so when you’re
shooting buildings you can alter the perspective of an image so the parallel lines don’t
converge, thus eliminating the distorting quality of the lens. The tilt-shift lens also
enables you to selectively focus an image, so that only specific portions of the image
within the same plane are in focus.

8. Image-Stabilization Lens

These lenses contain small gyro stabilizer sensors and servo-actuated lens elements,
which purportedly correct for the camera shake that occurs with longer focal length
lenses or in low-light conditions when you need slower shutter speeds to achieve an
effective EV. It is claimed that these lenses enable the user to shoot handheld at 2 to 4
stop slower shutter speeds (exposures 4 to 16 times longer) than the minimum required
for a sharp image.
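The claimed 2-to-4-stop gain can be put in numbers. The 1/focal-length minimum-shutter rule used here is a common rule of thumb, assumed for illustration rather than stated in the text:

```python
def min_handheld_shutter_s(focal_length_mm, stabilization_stops=0):
    """Slowest usable handheld shutter speed in seconds, assuming the common
    1/focal-length rule of thumb; each stop of stabilization doubles the
    usable exposure time."""
    return (1 / focal_length_mm) * (2 ** stabilization_stops)

# A 200mm lens: about 1/200 s unstabilized, about 1/12.5 s with 4 stops of stabilization.
base = min_handheld_shutter_s(200)
stabilized = min_handheld_shutter_s(200, 4)
print(base, stabilized, stabilized / base)
```

Four stops corresponds to a 2⁴ = 16× longer exposure, matching the "4 to 16 times longer" range quoted above.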

Conclusion


There are many possible lens choices and all will give you a different and distinct
image. Part of the creativity of the photographer is in selecting the right lens to capture
the vision of the world the way she or he sees it, or wants to present it.


Topic no. 66

Characteristic of lenses

The lens forming an image in the camera is a converging lens, the simplest form of which is a
single biconvex (lentil-shaped) element. In theory such a lens makes a light beam of parallel rays
converge to a point (the focus) behind the lens. The distance of this focus from the lens itself is
the focal length, which depends on the curvature of the lens surfaces and the optical properties of
the lens glass. An object at a very long distance (optically regarded as at “infinity”) in front of
the lens forms an inverted image in a plane (the focal plane) going through the focus. Light rays
from nearer objects form an image in a plane behind the focal plane. The nearer the object, the
farther behind the lens the corresponding image plane is located—which is why a lens has to be
focused to get sharp images of objects at different distances.
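The focusing behaviour described above follows from the thin-lens equation 1/f = 1/u + 1/v. A minimal sketch; the 50mm focal length and the object distances are assumed example values:

```python
def image_distance_mm(focal_length_mm, object_distance_mm):
    """Image distance v behind the lens, from the thin-lens equation 1/f = 1/u + 1/v."""
    f, u = focal_length_mm, object_distance_mm
    if u <= f:
        raise ValueError("an object inside the focal length forms no real image")
    return f * u / (u - f)

# A 50mm lens: an object at (near) infinity images at the focal plane, 50mm behind
# the lens; a subject 1 m away pushes the image plane farther back, to about 52.6mm.
print(round(image_distance_mm(50, 10_000_000), 2))
print(round(image_distance_mm(50, 1000), 2))
```

The nearer the object, the larger the computed image distance, which is exactly why the lens must be refocused for subjects at different distances.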

FOCAL LENGTH AND IMAGE SCALE


The image scale, or scale of reproduction, is the ratio of the image size to the object size; it is
often quoted as a magnification. When the image is smaller than the object, the magnification of
the object is less than 1.0. If the image is 1/20 the size of the object, for example, the
magnification may be expressed either as 0.05 or as 1:20. For an object at a given distance, the
scale of the image depends on the focal length of the lens (Figure 4). A normal camera lens
usually has a focal length approximately equal to the diagonal of the picture format covered. A
lens of longer focal length gives a larger scale image but necessarily covers less of the scene in
front of the camera. Conversely, a lens of shorter focal length yields an image on a smaller scale
but—provided the angle of coverage is sufficient (see below)—takes in more of the scene. Many
cameras, therefore, can be fitted with interchangeable lenses of different focal lengths to allow
varying the image scale and field covered. The focal length of a lens in millimetres (sometimes
in inches) is generally engraved on the lens mount.
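The scale relationship can be sketched with the same thin-lens geometry, where the magnification is m = f/(u − f) for an object at distance u; the focal lengths and object distance below are assumed example values:

```python
def magnification(focal_length_mm, object_distance_mm):
    """Image scale m = f / (u - f) for a thin lens imaging an object at distance u."""
    f, u = focal_length_mm, object_distance_mm
    return f / (u - f)

# Object 2 m away: the 100mm lens renders it at roughly twice the scale of the 50mm lens.
m50 = magnification(50, 2000)     # about 1:39
m100 = magnification(100, 2000)   # about 1:19
print(round(m100 / m50, 2))
```

Doubling the focal length roughly doubles the image scale, at the cost of covering less of the scene, as described above.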

APERTURE

The aperture, or f-number, is the ratio of the focal length to the diameter of an incident light
beam as it reaches the lens. For instance, if the focal length is 50 millimetres and the diameter of
the incident light beam is 25 millimetres, the f-number is 2. This incident-beam diameter is often
roughly the lens-diaphragm diameter, but it may be appreciably larger or smaller. The maximum
aperture (f-number at the largest diaphragm opening) is also marked on the lens, usually in the
form f:2, f/2, or 1:2.
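The definition above is a one-line calculation; a minimal sketch using the text's own numbers:

```python
def f_number(focal_length_mm, beam_diameter_mm):
    """f-number = focal length / diameter of the incident light beam."""
    return focal_length_mm / beam_diameter_mm

# The example from the text: 50 mm focal length, 25 mm beam diameter.
f_number(50, 25)   # 2.0, marked on the lens as f:2, f/2, or 1:2
```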

ANGLE OF COVERAGE
A lens must cover the area of a camera’s film format to yield an image adequately sharp and with
reasonably even brightness from the centre to the corners of the film. A normal lens should cover
an angle of at least 60°. A wide-angle lens covers a greater angle—about 70° to 90° or more for
an ultra-wide-angle lens. A long-focus lens covers a smaller angle.

The angle of coverage depends on the lens design. Designations like “wide angle” or “narrow
angle” are not necessarily synonymous with “short focus” and “long focus,” as the latter terms
refer to the focal length of the lens relative to the picture format.
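The relationship between focal length and picture format can be made concrete with the diagonal angle of view, 2·atan(d/2f); the lens's angle of coverage must be at least this large for sharp, evenly bright corners. A sketch, assuming the ~43.3 mm diagonal of the full-frame 35 mm still format purely for illustration:

```python
import math

def angle_of_view(focal_length_mm, format_diagonal_mm):
    """Diagonal angle of view in degrees: 2 * atan(d / 2f)."""
    return math.degrees(2 * math.atan(format_diagonal_mm / (2 * focal_length_mm)))

# With focal length roughly equal to the format diagonal ("normal" lens):
normal = angle_of_view(50, 43.3)   # ~47 degrees
# A shorter focal length takes in much more of the scene:
wide = angle_of_view(24, 43.3)     # ~84 degrees
```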

Optical performance
A simple lens produces a very imperfect image, which is usually blurred away from the center.
The image may have colour fringes around object outlines, and straight lines may be distorted.
Such defects, called aberrations, can be reduced, though never eliminated completely, only by
replacing the single lens element with a group of elements of appropriate shape and separation.
Aberrations arising from some of the lens elements then counteract opposite aberrations
produced by other elements. The larger the maximum aperture, the greater the angle of coverage,
and the higher the degree of correction aimed at, the more complex camera lenses become. Lens
design for relative freedom from aberrations involves advanced computer programming to
calculate the geometric parameters of every lens element. Some aberrations can also be corrected
by making one or more of the surfaces of a lens system aspheric; i.e., with the variable curvature
of a paraboloid or other surface rather than the constant curvature of a spherical one.
Lenses usually consist of optical glass. Transparent plastics also have come into use, especially
as they can be molded into elements with aspheric surfaces. They are, however, more sensitive to
mechanical damage.

ABERRATIONS
There are a number of lens aberrations, each with its own characteristics. Chromatic aberration is
present when the lens forms images by different-colored light in different planes and at different

scales. Color-corrected lenses largely eliminate these faults. Spherical aberration is present when
the outer parts of a lens do not bring light rays into the same focus as the central part. Images
formed by the lens at large apertures are therefore unsharp but get sharper at smaller
apertures. Curvature of field is present when the sharpest image is formed not on a flat plane but
on a curved surface. Astigmatism occurs when the lens fails to focus image lines running in
different directions in the same plane; in a picture of a rail fence, for instance, the vertical posts
are sharp at a focus setting different from the horizontal rails. Another aberration, called coma,
makes image points near the edges of the film appear as irregular, unsharp shapes. Distortion is
present when straight lines running parallel with the picture edges appear to bow outward (barrel
distortion) or inward (pincushion distortion).


Topic no. 67

Camera Lens

Resolution & Contrast

Lens quality is more important now than ever, due to the ever-increasing number of megapixels
found in today's digital cameras. Frequently, the resolution of your digital photos is actually
limited by the camera's lens — and not by the resolution of the camera itself.

RESOLUTION & CONTRAST

Everyone is likely to be familiar with the concept of image resolution, but unfortunately, too
much emphasis is often placed on this single metric. Resolution only describes how much detail
a lens is capable of capturing — and not necessarily the quality of the detail that is captured.
Other factors therefore often contribute much more to our perception of the quality and sharpness
of a digital image.

To understand this, let's take a look at what happens to an image when it passes through a camera
lens and is recorded at the camera's sensor. To make things simple, we'll use images composed
of alternating black and white lines ("line pairs"). Beyond the resolution of your lens, these lines
are of course no longer distinguishable:

High-resolution black and white line pairs → camera lens → unresolved line pairs

Figure: Example of line pairs that are finer than the resolution of a camera lens.

However, something that's probably less well understood is what happens to other, thicker lines.
Even though they're still resolved, these progressively deteriorate in both contrast and edge
clarity (see sharpness: resolution and acutance) as they become finer:

Full-contrast black and white line pairs → camera lens → line pairs softened by the camera lens

Figure: Progressively finer lines → lens → progressively less contrast and edge definition.

For two lenses with the same resolution, the apparent quality of the image will therefore be
mostly determined by how well each lens preserves contrast as these lines become progressively
narrower. However, in order to make a fair comparison between lenses we need to establish a
way to quantify this loss in image quality.
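One standard way to quantify this loss is Michelson contrast, (max − min)/(max + min); plotting it against line-pair frequency gives the lens's MTF curve. A minimal sketch with illustrative intensity values:

```python
# Michelson contrast: 1.0 for perfect black/white line pairs,
# falling toward 0 as the lens softens finer lines.
# The sample intensity values below are assumptions for illustration.

def michelson_contrast(intensities):
    hi, lo = max(intensities), min(intensities)
    return (hi - lo) / (hi + lo)

perfect_pairs  = [0, 255, 0, 255]     # full-contrast line pairs
softened_pairs = [64, 192, 64, 192]   # the same pairs after lens softening

michelson_contrast(perfect_pairs)    # 1.0
michelson_contrast(softened_pairs)   # 0.5
```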

Topic no. 68

Film Laboratory

A film laboratory is a commercial service enterprise and technical facility for the film industry
where specialists develop, print, and conform film material for classical film production and
distribution. The work covers negative and positive stock, black and white and color, in a range
of film gauges.

Process

Exposed motion picture film is processed according to exact chemical prescriptions, at measured
temperatures and over measured times. After processing there is an original, the
camera or picture original, in most cases a negative. From it a first sample is exposed on
a motion-picture film printer. Again after processing there is a positive ready for inspection by
the production representatives, usually by projection in the dark just like one sees a movie in a
theatre.

The film lab thus needs a variety of apparatus, ranging from developing equipment and machines
through measuring tools, cutting and editing devices, and printers to different sorts of viewing
machinery, including classic projectors. There are sensitometers, densitometers, analyzers, and
an array of chemical laboratory items that help maintain a repeatable level of operations.
Auxiliary material is also found within a film laboratory, for example leader film (plain
plastic) used to keep a developing machine threaded up.


Topic no. 69

Film Stock

Film stock is the basic component of all motion pictures, allowing images to be captured and
reproduced through the use of a camera. Since the early experiments with celluloid film in the
late 19th century, the motion picture world has undergone constant revolution through the
development and improvement of film. Thanks to applied technical wizardry, film has moved
from the grainy black and white images of the original Kodak camera to the colorful marvels of
modern stock in just over a century.
Originally, film was built on a paper base, making the composition of moving pictures an
incredibly difficult process. Celluloid film stock, which was flexible and less delicate than paper,
became heavily marketed by several early film pioneers, including George Eastman and Thomas
Henry Blair. Despite the considerable advantages given by celluloid film, early film stock was
deficient in a few serious respects: it was insensitive to red light, and it had no standardized
size.

In the early days, film cameras were often unique to their creators, leading to all kinds of
variation in the size of film used. As equipment became more standardized, film stock began
being issued in a few typical sizes, most notably the 35, 16, and 8 millimeter widths. The matter
of film being rendered in realistic color was not addressed until the early 20th century, with the
invention of panchromatic film, which is sensitive to red, blue, and green light.

Today, modern film stock is a lot more complicated than it looks. Instead of a simple piece of dark
flexible material, a typical piece of film contains several different layers of emulsions and filters. On
top of a safety base, an anti-halation layer prevents fogging, followed by layers of red, green, and blue
emulsions each with a filter between them. The film stock also contains yellow, magenta and cyan
dyes that are released during processing to give a full spectrum of color.

In purchasing film stock for a motion picture, speed and resolution are two key qualities to consider.
The width of the film determines the resolution, or image sharpness, given by the film. 8 mm film
typically has the lowest resolution, while 35 mm film is the standard for almost all major motion
pictures. Film speed determines how sensitive the film is to light; if a lot of night scenes are planned,
higher film speed may be necessary. However, higher film speed may lower the resolution, so
filmmakers tend to look for a happy medium in terms of resolution and speed.


Film stock can be quite pricey, depending on the width of the film and length of the roll used. With 35
mm film, a 1000 ft (304.8 m) roll will result in approximately 10 minutes of usable film, and will
usually start at about $500 US Dollars (USD). Using lower-resolution film, such as 8 mm, will result
in more time per foot of film, and may be a wise solution for amateur or low-budget filmmakers.
Some enterprising independent filmmakers choose to avoid film stock altogether by shooting on
digital cameras, but film cameras are still considered the giant of the motion picture industry by most
experts.
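The running-time figure above is simple arithmetic: 35 mm film carries 16 frames per foot and normally runs at 24 frames per second (standard values, stated here as assumptions). A sketch:

```python
# Rough screen time for a roll of film.
# Defaults assume 35 mm 4-perf stock: 16 frames per foot, 24 fps.

def running_time_minutes(feet, frames_per_foot=16, fps=24):
    return feet * frames_per_foot / fps / 60

running_time_minutes(1000)   # ~11 minutes, close to the "10 minutes" quoted
```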


Topic no. 70
Film Camera
Demonstration

Camera operators need to know their camera so well that they don’t need to “think” about
it technically. That knowledge allows them to spend their time shooting creatively in a way that
effectively communicates.
A range of models
Video cameras today come in a wide variety of shapes and sizes that suit all kinds of different
situations. They range from units that fit in a pocket to cameras that are so heavy that they can
take a couple of people to lift them (Figure 6.1). Historically there were consumer, industrial,
and professional cameras. Many of those monikers have merged, with small, previously thought
of as “consumer” cameras now being used in the professional workplace. Traditionally, for a
multi-camera production, high-cost cameras were used that required camera control units.
Today’s multi-camera systems allow many types of cameras to be used in professional situations,
including low-cost cameras. The right camera depends on how the end production is going to be
used. What was considered a professional-quality camera 10 years ago has been surpassed by the
quality of small, low-cost, high-definition pocket-sized cameras available today. Television and
film competitions are being won by directors who are using cameras that cost less than $1,000.
That was unheard of in the 1990s. So now, no one can blame the lack of quality on his or her
camera gear because almost anyone can afford the equipment. For all of the cool technological
advancements, keep in mind that the important thing is to know how to visually communicate.

Most productions are created with a camera that is a stand-alone unit; they are known as single-
camera productions. Single-camera productions are generally edited together during
postproduction. The second major type of production is a multi-camera production, where two or
more cameras are used with a switcher selecting the image to be shown to the viewer.


Figure: Video cameras come in all different shapes and sizes. (Photos courtesy of Grass
Valley/Thomson, Sony, JVC, and Panasonic.)
Single cameras generally have a built-in recorder. These recorders may use a videotape, DVD,
flash card, and/or a hard drive (Figure 6.2). Note that some of the latest cameras have both
tape or flash card and a hard drive, allowing two recording options in one camera. Cameras may
also be combined (by wire or wirelessly) with other recorders such as a tape deck or a
portable hard drive. Some of the studio cameras or remote production (outside broadcast)
cameras are available without recorders because they are designed specifically for multi-camera
use.

Camera craft

Most video cameras are easy to operate at the basic level. Designers have gone to a lot of trouble
to make controls simple to use. The consumer-oriented cameras have been so automated that
all one needs to do to get a decent image is to point them at the subject and press the record
button. When shooting for fun, that’s fine. So why make camerawork any more complicated?


It really depends on whether the director plans to use the camera as a creative tool. The weakness
of automatic controls is that the camera is only designed to make technical judgments. Many
times these technical decisions are something of a compromise. The camera cannot make
artistic choices of any kind. Auto-circuitry can help the camera operator avoid poor-quality
video images, but it cannot be relied on to produce attractive and meaningful pictures.
Communicating visually will always depend on how you use the camera and the choices you
make.

Obviously, good production is much more than just getting the shot. It begins with the way the
camera is handled and controlled. It is not just a matter of getting a sharp image, but of selecting
which parts of the scene are to be sharp and which are to be presented in soft focus. It involves
carefully selecting the best angle and arranging the framing and composition for maximum
impact, as well as deciding what is to be included in the shot and what is to be left out. It is the
art of adjusting the image tones by careful exposure. Automatic camera circuitry can help,
particularly when shooting under difficult conditions, and can save the camera operator from
having to worry about technicalities. However, automatic circuitry cannot create meaningful
images. Consider the following camera elements (Figures 6.3 and 6.4).


FIGURE 6.3: Video camera designs vary, but these are some of the common parts found in a
video camera. (Photo courtesy of Panasonic.)

 The viewfinder, generally called an electronic news gathering (ENG) or electronic field
production (EFP) viewfinder, is a small monitor designed to be placed next to the camera
operator’s eye.
 The power switch turns the camera on/off.
 The manual zoom control lens ring allows the camera operator to zoom in and out manually.
 The power zoom rocker switch, located on the side of the lens, allows the camera operator to
electronically zoom the lens. The speed of the zoom may vary, depending on the switch pressure.
 The focus control ring on a lens allows the camera operator to turn the ring manually to obtain
the optimal focus.
 The lens aperture control ring allows the camera operator to adjust the lens iris manually to
control exposure.
 The white and black balance controls adjust the circuitry in the camera that uses a white or
black reference to balance the color settings of the camera.
 The filter wheel includes a number of filters that can be used to correct the color in daylight,
tungsten, and fluorescent lighting situations.
 Clip-on camera batteries allow the camera operator to carry multiple batteries.
 Although at this point it is not common, some cameras are equipped with a built-in wireless
microphone and antennas.
 On-camera shotgun microphones are useful for picking up natural sound but often pick up
camera and operator noises.
 Lens shades protect the lens elements from picking up light distortions from the sun or a bright
light.

STATIONARY/STUDIO CAMERA ELEMENTS


 The camera cable is a two-way cable that carries the video to a distant camera
control unit (CCU) and allows the video operator to adjust the camera from a
remote site (such as studio control or a remote truck).


 The viewfinder (VF) monitors the camera’s picture. This allows the camera operator to
focus, zoom, and frame the image.
 The quick-release mount is attached to the camera and fits into a corresponding
recessed plate attached to the tripod/pan head. This allows the camera operator to
quickly remove or attach the camera to the camera mount.
 The tripod head (panning head) enables the camera to tilt and pan smoothly.
 Variable friction controls (drag) steady these movements. The head can also be locked
off in a fixed position. Tilt balance adjustments position the camera horizontally to
assist in balancing the camera on the mount.
 One or two tripod arms (or panning bars/handles) attached to the pan head allow the
operator to accurately pan, tilt, and control the camera.

FIGURE 6.4
The television camera body is attached to a pan head. The zoom lens is then attached to the
camera body and the pan head. This type of system is generally used in studio or remote production
settings where the camera is stationary.
 The camera mount can take various forms, such as a tripod, pedestal, or jib.
 The zoom control (Servo zoom), focus control, and remote controls allow the camera
operator to zoom and focus the lens from behind the camera.


Topic no.71
Film Editing

Editing is where the material that has been shot is blended together to form a convincing,
persuasive presentation. However, editing has a much more subtle role to play than a simple
piecing-together process. It is the technique of selecting and arranging shots; choosing
their order, their duration, and the ways in which they are to be joined together. Editing
is where graphics, music, sound effects, and special effects are added to the footage shot
earlier. It has a significant influence on the viewers’ reactions to what they see and hear.
Skilled editing makes a major contribution to the effectiveness of any production. Poor
editing can leave the audience confused and bored. The mechanics of editing are simple
enough, but the subtle effects of the editor’s choices are a study apart.

Editing goals

Basically, editing or postproduction is the process of combining individual shots in a
specific order. It has several purposes:
 To assemble material in a sequential fashion; the shooting order may differ from the
running order (see Figure 15.1).
 To correct mistakes by editing them out or by covering them with other footage.
 To create, enhance, embellish, and bring to life images and events that were once
captured live. Tools such as visual effects, sound effects, and music can give the
story more drama, thus more impact on the audience.

While I am cutting a show, I am always trying to determine what was in the director’s mind. But if I
have been successful, I will be able to present the director with some unexpected surprises. My goal is
to make the show better than it was in the script and even better than the director hoped for.
Lance Luckey, Emmy-Winning Editor


Shooting order versus running order


During the production process, when possible, events are usually shot in the order
that is most convenient or practical, and then the takes are joined together during the
editing process so that they appear consecutive. The eventual “running order” may be very
different from the order in which the scenes were shot (the “shooting order”). Some of the
various shooting situations follow:
 Sometimes the action is shot from start to finish, such as might occur if you are shooting
someone who is blowing a glass vase.
 Only sections of the total action may be deliberately shot, omitting unwanted action.
 The action may be repeated so that it can be shot from various positions.
 All of the action at one location may be shot before going on to the next location,
although the script may cut between them.
 A series of similar subjects may be shot that have reached different stages. For example,
shots of various newborn foals, yearlings, colts, and aging horses can be edited together
to imply the life cycle of a specific horse.

Editing video and audio


SPLICING

The original video edit technique included cutting and splicing segments of the videotape
together. However, the edits were physically hard on the VCR’s delicate heads and did not
look good on the television screen. This method was short-lived.

LINEAR EDITING
Next, editing moved on to the process of linear “dubbing” or copying the master tape to
another tape in a sequential order (Figure 15.1). This worked well for editors until the
director or client wanted significant changes to be made in the middle of a tape. With a
linear tape, that usually meant that the whole project had to be entirely reedited, which was
incredibly time-consuming and frustrating. Linear editing also did not work well if multiple
generations (copies of copies) of the tape had to be made, because each generation
deteriorated a little more. Linear systems are generally made up of a “player” and a
“record” VCR along with a control console. The original footage is placed into the player
and then is edited to the recorder (Figure 15.2). Although some segments of the television
industry are still using linear editing, the majority of programming today is edited on a
nonlinear editor.

NONLINEAR
Today almost all video and television programs are edited on a nonlinear editor.
Nonlinear editing is the process whereby the recorded video is digitized (copied) onto
a computer. Then the footage can be arranged or rearranged, special effects can be
added, and the audio and graphics can be adjusted using editing software. Nonlinear
editing systems make it easy to make changes, moving clips around until the director
or client is happy. Hard disk and memory card cameras have allowed editors to begin
editing much faster because they do not need to digitize all of the footage. Nonlinear
systems cost a fraction of the price of a professional linear editing system. Once the edited
project is complete, it can be output to whatever medium is desired: tape, Internet, iPod,
CD, DVD, and so on.

Logging
An often-neglected but important aspect of the production process is logging the recorded
material. Logging saves time during the actual editing process because it can be
completed before the edit session (Figures 15.3 and 15.4).


FIGURE 15.1
Linear editing— copying the contents of one tape to another tape, one clip after another linearly—is
still used on a limited basis. Although the use of linear editors has been significantly reduced,
segments of the industry, such as news, still use them. (Photo by Jon Greenhoe.)

FIGURE 15.2
Laptop linear systems have been popular with news and sports crews that are on the road. They also
can be used as two separate tape decks when needed.


FIGURE 15.3
Logging can be done on paper or by utilizing software. Here a camera is connected directly into the
computer to capture still frames from each clip and automatically import time code ins and outs.


The screenshot shows the stored thumbnail frame, duration, and description. (Photos
courtesy of Imagine Products.)


FIGURE 15.4
Sample of a log sheet (Courtesy of the Avanti Group)

After logging the footage, the editor can then just digitize the specific clips that will be used
in the program instead of taking time to search through all of the clips. By digitizing the
specific clips instead of all of the footage, logging also saves hard drive space. Generally
some type of log sheet is used where notes can be written including time code (the
address where the footage is located), scene/take numbers, and the length of each shot.
The notes may also include a description of the shot and other comments like “very good,”
“blurry,” and so on. Logging can be simple notes on a piece of paper or can be based on
computer logging software. An advantage to some of the logging software is that it can work
with the editing software by importing the edit decisions automatically into the computer.

Shots can be identified for the log a number of different ways:

 Visually (“the one where he gets into the car”)


 By shooting a “slate” (clapboard) before each shot, containing the shot number
and details (or an inverted board, at the end of shots)
 By time code, a special continuous time-signal throughout the tape that shows the
precise moment of recording
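Time code addresses of the HH:MM:SS:FF form are simple to manipulate numerically, which is what lets logging software hand in/out points to an editor. A minimal non-drop-frame sketch (the 30 fps frame rate is an assumed parameter):

```python
# Convert a non-drop-frame time code string to an absolute frame count,
# and compute a logged clip's length from its in and out points.

def timecode_to_frames(tc, fps=30):
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def clip_duration(in_tc, out_tc, fps=30):
    """Length of a logged shot, in frames."""
    return timecode_to_frames(out_tc, fps) - timecode_to_frames(in_tc, fps)

clip_duration("00:01:10:00", "00:01:22:15")   # 375 frames at 30 fps
```

Drop-frame time code (used with 29.97 fps NTSC material) skips certain frame numbers and needs extra bookkeeping not shown here.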

An overview of the nonlinear process


Step 1
Digitize the footage into the computer.
Step 2
Trim (clean up) each video segment or clip, deleting unwanted video frames.


FIGURE 15.5
Screenshot showing the composition page of an editor. The program monitor allows the editor to see the program or to trim a clip to the desired
length. The video clip bin is usually where video clips are stored that are to be used in the program. At the
bottom is the timeline. This specific editor has two audio tracks and one video track in the timeline.


Step 3
Place the clips into the timeline. The timeline usually includes multiple tracks of video,
audio, and graphics. This timeline allows the editor to view the production and arrange the
segments to fit the script (Figure 15.5).
Step 4
Add video special effects and transitions. Nonlinear edit systems allow all kinds of effects such
as ripple, slow/fast motion, and color correction. Transitions include dissolves, cuts, and a
variety of wipes.
Step 5
Insert additional audio, if desired, at this point. Audio effects may be used to “sweeten” the
sound. Music or voiceovers may be added at different points in the project (Figure 15.6).
Step 6
Output the final program to the distribution medium.
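The steps above can be sketched as a toy data structure: a nonlinear timeline is essentially a list of trimmed references into the digitized source files, so rearranging the program never alters the footage itself. (Purely illustrative; not any real editor's project format.)

```python
from dataclasses import dataclass

@dataclass
class Clip:
    source: str       # digitized source file (step 1)
    in_frame: int     # trim points (step 2)
    out_frame: int

    @property
    def duration(self):
        return self.out_frame - self.in_frame

# Step 3: arrange clips on the timeline to fit the script.
timeline = [
    Clip("interview.mov", 120, 480),
    Clip("broll_city.mov", 0, 240),
]

# Reordering is just list manipulation; no footage is re-copied:
timeline.reverse()
total_frames = sum(clip.duration for clip in timeline)   # 600
```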

FIGURE 15.6
The talent is doing a voiceover in an edit suite for a news story at a local news station. (Photo by Jon Greenhoe.)


Topic no. 72
Film Production
Overview

MULTI-CAMERA SHOOTING

When shooting with two or more cameras, a director tends to think in terms of effective
viewpoints rather than specific shots. The results may be similar, but the strategy is different;
cameras need to be positioned to catch various aspects of the continuous action (Figure 3.22).

When planning a multi-camera production, directors have to consider a variety of situations:

Will one camera come into another camera’s shot?


Is there time for cameras to move to various positions?

What kinds of shots does the script dictate?


How will the microphones and lighting relate to the cameras’ movements? (Visible mics or
shadows cast by the boom pole, etc.)?


SINGLE-CAMERA PRODUCTION

A single lightweight camera is independent (has its own recorder), compact, and free to go
anywhere (is not attached by a cable to a switcher). The director can be right there on the spot
beside the camera, seeing opportunities and explaining exactly what he or she wants to the
camera operator. In the case of documentary productions, the person devising and organizing the
project may be operating the camera too (Figure 4.2).
This method offers the director incredible flexibility—both when shooting and later when
editing. Directors can select and rearrange the material they have shot, trying out several
versions to improve the production’s impact. There is none of the feeling of instant
commitment, which can typify a multi-camera production. But it is slow. Patience is a necessity
(Figure 4.3).

Shooting with a single camera will often involve interrupting or repeating the action in
order to reposition the camera. The problems of maintaining continuity between setups, even
the way in which shooting conditions can change between takes (such as light or weather
variances), are not to be underestimated.
It has been said that unlike multi-camera production, which is a “juggling act,” shooting with a
single camera allows the director to concentrate on doing one thing at a time, on optimizing
each individual shot. The director is free to readjust each camera position, rearrange the
subject, change the lighting, adjust the sound recording, modify the decor, and make any other
alterations that are necessary to suit each take. That’s great. But shooting can degenerate into a
self-indulgent experimental session. It is all too easy to put off problems till tomorrow, with
“We’ll sort it out during the edit” or “Let’s leave it till postproduction.”

When shooting with a single camera, directors do not have to worry about coordinating
several different cameras, each with its different viewpoint. Directors are relieved of the tensions
of continually cueing, guiding, and switching between cameras. All production refinements
and supplementary features from background music to video effects are added at a later stage,
during the postproduction session. The other side of the coin is that when shooting with a

single camera, you finish with a collection of recordings containing a mixture of takes (good,
bad, and indifferent; mistakes and all) shot in any order, all needing to be sorted out at a later
time. So compiling the final production—including titles, music, and effects—can be a lengthy
process.

MULTICAMERA PRODUCTION

If you are shooting continuous action with a single camera and want to change the camera’s
viewpoint, you have basically two choices. You can move the camera to a new position while
still shooting, or you can miss some of the action as the camera is repositioned at the next setup.
A multi-camera production director simply switches from one camera to another, which is a clear
advantage when shooting a series of events going on at the same time or spread over a wide area
(Figure 4.4).
Unlike the director on a single-camera shoot, who is close to the camera, the director of a
multi-camera production is located away from the action. This director watches a series of
television monitors in a production control room and issues instructions to the crew over their
intercom (talkback) headsets and to the floor manager who guides the talent on the director’s
behalf (Figure 4.5).

In a live multi-camera production, most of the shot transitions (cuts, dissolves, wipes, etc.) are
made on a production switcher (also known as a vision mixer).
During the action, the switcher takes the outputs from various video sources (cameras, video
recorders, graphics, etc.) and switches between them (Figure 4.6). There are no opportunities
to correct or improve. However, the director does have the great advantage of continuously
monitoring and comparing the shots Figure 4.7). The continuity problems that can easily develop
during a single- camera production disappear during the multi-camera production because it is
real time. An experienced m u l t i -camera crew can, after a single rehearsal, pr o duce a polished
show in just a few hours. At the end of the recording per iod, the show is finished.

Multi-camera productions may be shot as follows:

 Live. Transmitted live to the viewing audience


 Live on tape. Shot from beginning to end and recorded. This style of production allows the
director to clean it up in postproduction.
 Scene-by-scene. Each scene or act is shot, corrected, and polished one at a time.
 Shot-by-shot. Shot in short action sequences, with multi-camera switching to avoid
interrupting (or repeating) the action (Figure 4.8)

Figure 4.5. Multi-camera production in a studio. (Photo by Josh Taber.)

FIGURE 4.6
Video production switcher. (Photo courtesy of Grass Valley.)


FIGURE 4.7

Multi-camera production directors receive a variety of camera shots all at the same time and must
choose the shot that best communicates the action. (Photos by Josh Taber and Paul Dupree.)

FIGURE 4.8


A sitcom is a good example of a scene-by-scene multi-camera television production. (Photo by Josh
Taber.)

A multi-camera production can degenerate into a routine in which the
director simply cuts between several camera viewpoints for the sake of variety. But for directors
with imagination, the ability to plan ahead, and a skilled team, the results can be of the highest
standards.

Multi-camera ISO

When the action cannot be repeated or events are unpredictable, some directors make use of an ISO
(isolated) camera. This simply means that while all the cameras are connected to the switcher as before,
one of them is also continuously recorded on a separate recorder. This ISO camera takes wide shots of the
action (cover shots) or concentrates on watching out for the arrival of the guest, for instance, or a specific
player at a sports event, so that if the director misses the needed shot “live,” it is still on tape. Shots on
the ISO tape can be played back during a live show or edited in where necessary later.

Multi-camera production without a switcher

Another multi-camera approach is to use camcorders. Instead of cutting between
cameras with a switcher, shots from their separate recordings are edited together later during
a postproduction session. It is fairly simple to sync multiple cameras together on even some of the
lowest cost nonlinear editing systems. Voiceover narration, sound effects, video effects, graphics,
or music are then added in postproduction. The advantage of this type of multi-camera
technique is that it significantly reduces the cost of having to pay for a large crew and a control
room or remote production truck/OB van. The disadvantage is that the production is not over
when the event is over; it still needs to be finalized in postproduction, which can be time
consuming.


Topic no. 73

35 mm Camera

Recorder and Projector

A movie projector is an opto-mechanical device for displaying motion picture film by


projecting it onto a screen. Most of the optical and mechanical elements, except for the
illumination and sound devices, are present in movie cameras.

Projection elements
As in a slide projector there are essential optical elements:
1. Light source
Incandescent lighting and even limelight were the first light sources used in film projection. In
the early 1900s up until the late 1960s, carbon arc lamps were the source of light in almost all
theaters in the world.
The Xenon arc lamp was introduced in Germany in 1957 and in the US in 1963. After film
platters became commonplace in the 1970s, Xenon lamps became the most common light source,
as they could stay lit for extended periods of time, whereas a carbon rod used for a carbon arc
could last for an hour at the most.
Most lamp houses in a professional theatrical setting produce sufficient heat to burn the film
should the film remain stationary for more than a fraction of a second. Because of this, care must
be taken in inspecting a film so that it should not break in the gate and be damaged, particularly
if it is flammable cellulose nitrate film stock.
2. Reflector and condenser lens
A curved reflector redirects light that would otherwise be wasted toward the condensing lens.
A positive curvature lens concentrates the reflected and direct light toward the film gate.
3. Douser
A metal or asbestos blade which cuts off light before it can get to the film. The douser is usually
part of the lamphouse, and may be manually or automatically operated. Some projectors have a
second, electrically controlled douser that is used for changeovers (sometimes called a
"changeover douser" or "changeover shutter"). Some projectors have a third, mechanically
controlled douser that automatically closes when the projector slows down (called a "fire shutter"
or "fire douser"), to protect the film if the projector stops while the first douser is still open.
Dousers protect the film when the lamp is on but the film is not moving, preventing the film

from melting from prolonged exposure to the direct heat of the lamp. It also prevents the lens
from scarring or cracking from excessive heat.
4. Film gate and single image
A single image of the series of images comprising the movie is positioned and held flat within an
aperture called the gate. The gate also provides a slight amount of friction so that the film does
not advance or retreat except when driven to advance the film to the next image.
5. Shutter
A commonly held misconception is that film projection is simply a series of individual frames
dragged very quickly past the projector's intense light source. This is not the case; if a roll of film
were merely passed between the light source and the lens of the projector, all that would be
visible on screen would be a continuous blurred series of images sliding from one edge to the
other. It is the shutter that gives the illusion of one full frame being replaced exactly on top of
another full frame. A rotating petal or gated cylindrical shutter interrupts the emitted light during
the time the film is advanced to the next frame. The viewer does not see the transition, thus
tricking the brain into believing a moving image is on screen. Modern shutters are designed with
a flicker-rate of two times (48 Hz) or even sometimes three times (72 Hz) the frame rate of the
film, so as to reduce the perception of screen flickering. Higher rate shutters are less light
efficient, requiring more powerful light sources for the same light on screen.
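The two flicker rates mentioned above follow directly from the 24 fps frame rate; a small Python sketch of the arithmetic:

```python
# Shutter flicker rate: each frame is shown through two or three
# shutter openings, multiplying the 24 fps film frame rate.
FRAME_RATE = 24  # frames per second for sound film

def flicker_rate(openings_per_frame: int) -> int:
    """Light interruptions per second for a given shutter design."""
    return FRAME_RATE * openings_per_frame

print(flicker_rate(2))  # two openings per frame -> 48 Hz
print(flicker_rate(3))  # three openings per frame -> 72 Hz
```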

Mechanical sequence when image is shown twice and then advanced.


Outer sprockets rotate continuously while the frame advance sprockets are controlled by the
mechanism shown - a Geneva drive.

6. Imaging lens and aperture plate

Imaging lens Diastar of an Askania 35mm movie projector (focal length: 400 mm)

A projection objective with multiple optical elements directs the image of the film to a viewing
screen (imaging lens). Imaging lenses also differ in aperture and focal length. Different lenses
are used for different aspect ratios.
Aspect ratios are controlled by the lens with the appropriate aperture plate, a piece of metal with
a precisely cut rectangular hole in the middle of equivalent aspect ratio. The aperture plate is
placed just behind the gate, and masks off any light from hitting the image outside of the area
intended to be shown. All films, even those in the standard Academy ratio, have extra image on
the frame that is meant to be masked off in the projection.
7. Viewing screen
In most cases this is a reflective surface which may be either aluminized (for high contrast in
moderate ambient light) or a white surface with small glass beads (for high brilliance under dark
conditions). A switchable projection screen can be switched between opaque and clear by a safe
voltage under 36 V AC and is viewable from both sides. In a commercial theater, the screen also
has millions of very small, evenly spaced holes in order to allow the passage of sound from the
speakers and subwoofer which often are directly behind it.

Film transport elements



Film supply and take up


Two-reel system
In the two-reel system the projector has two reels–one is the feed reel, which holds the part of the
film that has not been shown, the other is the takeup reel, which winds the film that has been
shown. In a two-reel projector the feed reel has a slight drag to maintain tension on the film,
while the takeup reel is constantly driven with a mechanism that has mechanical 'slip,' to allow
the film to be wound under constant tension so the film is wound in a smooth manner.
The film being wound on the takeup reel is being wound "head in, tails out." This means that the
beginning (or "head") of the reel is in the center, where it is inaccessible. As each reel is taken
off of the projector, it must be re-wound onto another empty reel. In a theater setting there is
often a separate machine for rewinding reels. For the 16 mm projectors that were often used in
schools and churches, the projector could be re-configured to rewind films.
The size of the reels can vary based on the projectors, but generally films are divided and
distributed in reels of up to 2000 feet (610 m, about 22 minutes at 24 frames/sec). Some
projectors can even accommodate up to 6000 feet (1,830 m), which minimizes the number of
changeovers (see below) in a showing. Certain countries also divide their film reels up
differently; Russian films, for example, often come on 1000-foot (305 m) reels, although it's
likely that most projectionists working with changeovers would combine them into longer reels
of at least 2000 feet (610 m), to minimize changeovers and also give sufficient time for threading
and any possibly needed troubleshooting time.
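The running times quoted for these reel lengths follow from the standard 35 mm figure of 16 frames per foot; a quick sketch (illustrative only):

```python
# Running time of a 35 mm reel: 16 frames per foot at 24 frames/s.
FRAMES_PER_FOOT = 16  # standard 4-perf 35 mm
FPS = 24

def reel_minutes(feet: float) -> float:
    """Approximate running time in minutes for a reel of the given length."""
    return feet * FRAMES_PER_FOOT / FPS / 60

print(round(reel_minutes(2000), 1))  # 2000 ft -> about 22 minutes
print(round(reel_minutes(6000), 1))  # 6000 ft -> about 67 minutes
```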
Films are identified as "short subjects," taking one reel or less of film, "two-reelers," requiring
two reels of film (such as some of the early Laurel & Hardy, 3 Stooges, and other comedies), and
"features," which can take any number of reels (although most are limited to 1½ to 2 hours in
length, enabling the theater to have multiple showings throughout the day and evening, each
showing with a feature, commercials, and intermission to allow the audiences to change). In the
"old days" (i.e., ca. 1930–1960), "going to the movies" meant seeing a short subject (a newsreel,
short documentary, a "2-reeler," etc.), a cartoon, and the feature. Some theaters would have
movie-based commercials for local businesses, and the state of New Jersey required showing a
diagram of the theater showing all of the exits.
Changeover
Because a single film reel does not contain enough film to show an entire feature, the film is
distributed on multiple reels. To prevent having to interrupt the show when one reel ends and the
next is mounted, two projectors are used in what is known as a "changeover system," after the
switching mechanism that operates between the end of one reel on the first projector and the
beginning of the next reel on the second projector. The two-reel system was used almost

universally for movie theaters before the advent of the single-reel system in order to be able to
show feature-length films. Although one-reel long-play systems tend to be more popular with the
newer multiplexes, the two reel system is still in significant use to this day.
The projector operator operates two projectors, starting the first reel of the show on projector
"A." While this reel is being shown, the projectionist threads the second reel on projector "B."
As the reel being shown approaches its end, the projectionist looks for cue marks at the upper-
right corner of the picture. Usually these are dots or circles, although they can also be slashes.
Some older films occasionally used squares or triangles, and sometimes positioned the cues in
the middle of the right edge of the picture.
The first cue appears twelve feet (3.7 m) before the end of the program on the reel, equivalent to
eight seconds at 24 frames/sec. This cue signals the projectionist to start the motor of the
projector containing the next reel. After another ten and a half feet (3.2 m) of film is shown
(seven seconds at 24 frames/sec), the changeover cue should appear, which signals the
projectionist to actually make the changeover. When this second cue appears, the projectionist
has one and a half feet (457 mm), or one second at 24 frame/s, to make the changeover. If it
doesn't occur within one second, the tail leader of the reel coming to an end will be projected on
the screen.
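All of these cue spacings reduce to one conversion (35 mm film travels 1.5 feet per second), which can be sketched as:

```python
# Changeover cue timing: 35 mm film travels 1.5 feet per second
# (24 frames/s divided by 16 frames per foot).
FEET_PER_SECOND = 24 / 16  # = 1.5

def seconds_for(feet: float) -> float:
    """Seconds of running time represented by a length of 35 mm film."""
    return feet / FEET_PER_SECOND

print(seconds_for(12.0))  # motor cue: 8.0 s before the end of the program
print(seconds_for(10.5))  # run-up between the two cues: 7.0 s
print(seconds_for(1.5))   # window to make the changeover: 1.0 s
```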
Twelve feet before the "first frame of action," Academy leaders have a "START" frame. The
projectionist positions the "START" in the gate of the projector. When the first cue is seen, the
motor of the starting projector is started. Seven seconds later the end of the leader and start of
program material on the new reel should just reach the gate of the projector when the changeover
cue is seen.
On some projectors, the operator would be alerted to the time for a change by a bell that operated
when the feed reel rotation exceeded a certain speed (the feed reel rotates faster as the film is
exhausted), or based on the diameter of the remaining film (Premier Changeover Indicator Pat.
No. 411992), although many projectors do not have such an auditory system.
During the actual operation of a changeover, the two projectors use an interconnected electrical
control connected to the changeover button so that as soon as the button is pressed, the
changeover douser on the outgoing projector is closed in sync with the changeover douser on the
incoming projector opening. If done properly, a changeover should be virtually unnoticeable to
an audience. In older theaters, there may be manually operated, sliding covers in front of
the projection booth's windows. A changeover with this system is often clearly visible as
a wipe on the screen.
Once the changeover has been made, the projectionist unloads the full takeup reel from projector
"A," moves the now-empty reel (that used to hold the film just unloaded) from the feed spindle

to the takeup spindle, and loads reel #3 of the presentation on projector "A." When reel 2 on
projector "B" is finished, the changeover switches the live show from projector "B" back to
projector "A," and so on for the rest of the show.
When the projectionist removes a finished reel from the projector it is "tails out," and needs to be
rewound before the next show. The projectionist usually uses a separate rewind machine and a
spare empty reel, and rewinds the film so it is "head out," ready to project again for the next
show.
One advantage of this system (at least for the theatre management) was that if a programme was
running a few minutes late for any reason, the projectionist would simply omit one (or more)
reels of film to recover the time.
Single-reel system

Christie AW3 platter, BIG SKY Industries console, and Century SA projector.

There are two largely used single-reel systems (also known as long-play systems) today: the
tower system (vertical feed and take up) and the platter system (non-rewinding; horizontal feed
and take up).
The tower system largely resembles the two reel system, except in that the tower itself is
generally a separate piece of equipment used with a slightly modified standard projector. The
feed and take-up reels are held vertically, as in the two-reel system, but behind the projector, on
oversized spools with 12,000-foot (3,660 m) capacity, or about 133 minutes at 24 frame/s. This large
capacity alleviates the need for a changeover on an average-length feature; all of the reels are
spliced together into one giant one. The tower is designed with four spools, two on each side,
each with its own motor. This allows the whole spool to be immediately rewound after a
showing; the extra two spools on the other side allow for a film to be shown while another is
being rewound or even made up directly onto the tower. Each spool requires its own motor in

order to set proper tensioning for the film, since it has to travel (relatively) much further between
the projector film transport and the spools. As each spool gains or loses film, the tension must be
periodically checked and adjusted so that the film can be transported on and off the spools
without either sagging or snapping.
In a platter system the individual 20-minute reels of film are also spliced together as one large
reel, but the film is then wound onto a horizontal rotating table called a platter. Three or more
platters are stacked together to create a platter system. Most of the platters in a platter system
will be occupied by film prints; whichever platter happens to be empty serves as the "take-up
reel" to receive the film that is playing from another platter.
The way the film is fed from the platter to the projector is not unlike an eight-track audio
cartridge. Film is unwound from the center of the platter through a mechanism called a payout
unit which controls the speed of the platter's rotation so that it matches the speed of the film as it
is fed to the projector. The film winds through a series of rollers from the platter stack to the
projector, through the projector, through another series of rollers back to the platter stack, and
then onto the platter serving as the take-up reel.
This system makes it possible to project a film multiple times without needing to rewind it. As
the projectionist threads the projector for each showing, he transfers the payout unit from the
empty platter to the full platter and the film then plays back onto the platter it came from. In the
case of a double feature, each film plays from a full platter onto an empty platter, swapping
positions on the platter stack throughout the day.

nonrewind in Royal - Malmo, Sweden.

The advantage of a platter is that the film need not be rewound after each show, which can save
labor. Rewinding risks rubbing the film against itself, which can cause scratching of the film and
smearing of the emulsion which carries the pictures. The disadvantages of the platter system are
that the film can acquire diagonal scratches on it if proper care is not taken while threading film
from platter to projector, and the film has more opportunity to collect dust and dirt as long

lengths of film are exposed to the air. A clean projection booth kept at the proper humidity is of
great importance, as are cleaning devices that can remove dirt from the film print as it plays.
Automation and the rise of the multiplex
The single reel system can allow for the complete automation of the projection booth operations,
given the proper auxiliary equipment. Since films are still transported in multiple reels they must
be joined together when placed on the projector reel and taken apart when the film is to be
returned to the distributor. It is the complete automation of projection that has enabled the
modern "multiplex" cinema - a single site typically containing from 8 to 24 theaters with only a
few projection and sound technicians, rather than a platoon of projectionists. The multiplex also
offers a great amount of flexibility to a theater operator, enabling theaters to exhibit the same
popular production in more than one auditorium with staggered starting times. It is also possible,
with the proper equipment installed, to "interlock", i.e. thread a single length of film through
multiple projectors. This is very useful when dealing with the mass crowds that an extremely
popular film may generate in the first few days of showing, as it allows for a single print to serve
more patrons.
Feed and extraction sprockets
Smooth wheels with triangular pins called sprockets engage perforations punched into one or
both edges of the film stock. These serve to set the pace of film movement through the projector
and any associated sound playback system.
Film loop
As with motion picture cameras, the intermittent motion of the gate requires that there be loops
above and below the gate in order to serve as a buffer between the constant speed enforced by
the sprockets above and below the gate and the intermittent motion enforced at the gate. Some
projectors also have a sensitive trip pin above the gate to guard against the upper loop becoming
too big. If the loop hits the pin, it will close the dousers and stop the motor to prevent an
excessively large loop from jamming the projector.
Film gate pressure plate
A spring-loaded pressure plate functions to align the film in a consistent image plane, both flat
and perpendicular to the optical axis. It also provides sufficient drag to prevent film motion
during the frame display, while still allowing free motion under control of the intermittent
mechanism. The plate also has spring-loaded runners that help hold the film in place and
advance it during motion.
Intermittent mechanism

The intermittent mechanism can be constructed in different ways. For smaller gauge projectors
(8 mm and 16 mm), a pawl mechanism engages the film's sprocket hole one side, or holes on
each side. This pawl advances only when the film is to be moved to the next image. As the pawl
retreats for the next cycle it is drawn back and does not engage the film. This is similar to the
claw mechanism in a motion picture camera.
In 35 mm and 70 mm projectors, there usually is a special sprocket immediately underneath the
pressure plate, known as the intermittent sprocket. Unlike all the other sprockets in the projector,
which run continuously, the intermittent sprocket operates in tandem with the shutter, and only
moves while the shutter is blocking the lamp, so that the motion of the film cannot be seen. It
also moves in a discrete amount at a time, equal to the number of perforations that make up a
frame (4 for 35 mm, 5 for 70 mm). The intermittent movement in these projectors is usually
provided by a Geneva drive, also known as the Maltese Cross mechanism.
IMAX projectors use what is known as the rolling loop method, in which each frame is sucked
into the gate by a vacuum, and positioned by registration pins in the perforations corresponding
to that frame.

Topic no. 74

35 mm

Film Grain

Film grain or granularity is the random optical texture of processed photographic film due to
the presence of small particles of metallic silver, or dye clouds, developed from silver
halide that have received enough photons. While film grain is a function of such particles (or dye
clouds), it is not the same thing as the particles themselves. It is an optical effect, the magnitude of
which (the amount of grain) depends on both the film stock and the definition at which it is
observed. It can be objectionably noticeable in an over-enlarged photograph.

RMS Granularity
Granularity, or RMS granularity, is a numerical quantification of film-grain noise, equal to
the root-mean-square (RMS) fluctuations in optical density, measured with
a microdensitometer with a 0.048 mm (48-micrometre) diameter circular aperture, on a film area
that has been exposed and normally developed to a mean density of 1.0 D (that is, it transmits
10% of the light incident on it).
Granularity is sometimes quoted as "diffuse RMS granularity times 1000", so that a film with
granularity 10 means an RMS density fluctuation of 0.010 in the standard aperture area.
When the particles of silver are small, the standard aperture area measures an average of many
particles, so the granularity is small. When the particles are large, fewer are averaged in the
standard area, so there is a larger random fluctuation, and a higher granularity number.
The standard 0.048 mm aperture size derives from a drill bit used by an employee of Kodak.

Selwyn Granularity
Film grain is also sometimes quantified in a way that is relatively independent of the size of the
aperture through which the microdensitometer measures it, using R. Selwyn's observation
(known as Selwyn's law) that, for a not too small aperture, the product of RMS granularity and
the square root of aperture area tends to be independent of the aperture size. The Selwyn
granularity is defined as:

S = σ √(2a)

where σ is the RMS granularity and a is the aperture area.
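As an illustration only (not Kodak's actual measurement procedure), RMS granularity can be sketched as the standard deviation of microdensitometer readings, with Selwyn granularity derived from it; the sample values below are synthetic:

```python
import math
import random

def rms_granularity(densities):
    """Root-mean-square fluctuation of density samples about their mean."""
    mean = sum(densities) / len(densities)
    return math.sqrt(sum((d - mean) ** 2 for d in densities) / len(densities))

# Synthetic microdensitometer readings: mean density 1.0 D,
# RMS fluctuation about 0.010 (i.e. "granularity 10").
random.seed(1)
samples = [random.gauss(1.0, 0.010) for _ in range(10000)]

sigma = rms_granularity(samples)
aperture_area = math.pi * (0.048 / 2) ** 2  # standard 48-micrometre aperture, in mm^2
selwyn = sigma * math.sqrt(2 * aperture_area)

print(round(sigma * 1000))  # the quoted granularity figure, close to 10
```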



Grain Effect with Film and Digital


The images below show an example of extreme film grain:

Rallycross car pictured on an Agfa 1000 RS slide

Detail of the same photo


Digital photography does not exhibit film grain, since there is no film for any grain to exist
within. In digital cameras, the closest physical equivalents of film grains are the individual
elements of the image sensor (e.g. CCD cell), the pixels; just as small-grain film has better
resolution than large-grain film, so will an image sensor with more elements result in an
image with better resolution. However, unlike pixels, film grain does not represent the limit
of resolution. Because film grains are randomly distributed and vary in size, while image
sensor cells are of the same size and arranged in a grid, a direct comparison of film and digital
resolutions is not straightforward.
In general, as the pixels from a digital image sensor are set in straight lines, they irritate the
eye of the viewer more than the randomly arranged film grains. Most people will reject an
enlargement that shows pixels, whereas a grained film enlargement with lower resolution will
be acceptable, and perceived as 'sharper'.
The effect of film grain can be simulated in some digital photo manipulation programs, such
as Photoshop, adding grain to a digital image after it is taken.
In digital photography, image noise sometimes appears as a "grain-like" effect.
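A minimal sketch of that idea, adding zero-mean random luminance noise to a grayscale image; this is an illustration, not any particular program's grain algorithm (the function name is invented):

```python
import random

def add_grain(image, strength=12.0, seed=0):
    """Add zero-mean Gaussian luminance noise to an 8-bit grayscale image."""
    rng = random.Random(seed)
    return [
        [min(255, max(0, int(round(px + rng.gauss(0, strength))))) for px in row]
        for row in image
    ]

flat_gray = [[128] * 8 for _ in range(8)]  # a featureless mid-gray patch
grainy = add_grain(flat_gray)
print(grainy[0][:4])  # pixel values now scattered around 128
```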

Film Grain overlay



Film grain overlay, sometimes referred to as "FGO," is a process in which film emulsion
characteristics are overlaid using different levels of opacity onto a digital file. This process
adds film noise characteristics and, with moving images, subtle flicker to the more
sterile-looking digital medium.
As opposed to computer plug-ins, FGO is typically derived from actual film grain samples
taken from film, shot against a gray card.

Topic no. 75

35mm Film printing

Color prints have been developed since the 19th century. Modern color film began in 1935 with the
Eastman Kodak Company's Kodachrome film, followed in 1936 by the Agfa Company's Agfacolor
film.[1] Color print film is the most common type of photographic film in consumer use. Print
film produces a negative image when it is developed, requiring it to be reversed again when it is
printed onto photographic paper.
Almost all color print film made today is designed to be processed according to the C-41
process.

Handling color print Film Negative


Color negatives are prone to damage through fingerprints and tears, therefore it is a good idea to
wear nylon or cotton gloves when handling them and placing them in negative sleeves. Avoid
bending, folding or rolling up your negative sleeves as well.
Preserving the Prints from a Color Print Film
Generally, color prints are more sensitive to temperature and light than black and white
film, so there are more precautions to take when trying to protect and optimize their
lifespan.
It is important to keep the prints protected from physical damage from as little as a fingerprint to
as much as scratches that can destroy them completely. Storage for prints that are developed
from color print film should be free of any unsafe, harmful chemicals, specifically referring to
peroxides, sulfur dioxide, ozone and nitrogen oxides. For the best prolonged storage and
protection, placing the prints in polyester uncoated sleeves and then into an envelope seals it
from further damage. When it comes to storing them, the optimal temperature would be at 2 °C,
as it is found to be the most effective preservation temperature when it comes to a mass
collection of colored photographic film prints. It is best to keep the color prints away from strong
sunlight exposure for prolonged periods of time because it may result in the decay of the gelatin
layer as well as a significant fade in the dye found in the print. Similar to that of watercolors and
textiles, dyes in color prints are prone to fade as well when exposed to too much light. However,
it should be noted that color photographs are also susceptible to staining if stored fully in the dark
for prolonged periods; for example, an area of white in a photograph can turn
yellow. Therefore it is key not to store them in an area where they are exposed to long periods of


light and/or long periods of dark; there should be a balance. Prime examples of places to store
the color prints are: durable binders, cabinets, trays or rigid boxes.
Cleaning Prints from Colored Print Films
If there is a chance that the prints got dirty, there are several effective methods that can be
undertaken to clean them carefully without damaging them. First off is using a soft brush that
can remove surface dirt on the print. Make sure to lightly brush the dirt off of the print. Damping
cotton swabs or using a specialized cleaning pad to dry wipe the surface of the print is also
another method to clean it. Remember never to wash photographs until the gelatin layer is dry
and stable. Furthermore, never attempt chemical treatments on color photographs because they
can get distorted and destroy the image as a whole.


Topic no. 76

Film & Video Conversion

Television standards conversion is the process of changing one type of TV system to another.
The most common is from NTSC to PAL or the other way around. This is done so TV programs
in one nation may be viewed in a nation with a different standard. The TV video is fed through
a video standards converter that changes the video to a different video system.
Converting between different numbers of pixels and different frame rates in video pictures is a
complex technical problem. However, the international exchange of TV programming makes
standards conversion necessary and in many cases mandatory. Vastly different TV systems
emerged for political and technical reasons – and it is only luck that makes video programming
from one nation compatible with another.
History
The first known case of TV systems conversion probably was in Europe a few years after World
War II – mainly with the RTF (France) and the BBC (UK) trying to exchange their 441-line
and 405-line programming.
The problem got worse with the introduction of PAL, SECAM (both 625 lines), and the
French 819 line service.
Until the 1980s, standards conversion was so difficult that 24 frame/s 16 mm or 35 mm film was
the preferred medium of programming interchange.
Overview
Perhaps the most technically challenging conversion to make is the PAL to NTSC.

 PAL is 625 lines at 50 fields/s


 NTSC is 525 lines at 59.94 fields/s (60,000/1,001 fields/s)
The two TV standards are for all practical purposes, temporally and spatially incompatible with
each other.
Aside from the line count being different, it is easy to see that generating 60 fields every second
from a format that has only 50 fields might pose some interesting problems.
Every second, an additional 10 fields must be generated seemingly from nothing. The converter
has to create new frames (from the existing input) in real time.
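A toy sketch of where the extra fields come from: map each output field time (59.94 fields/s) to the nearest input field (50 fields/s). Real converters interpolate between fields rather than simply repeating them; nearest-neighbour mapping just makes the repetition visible:

```python
IN_RATE = 50.0               # PAL fields per second
OUT_RATE = 60000.0 / 1001.0  # NTSC fields per second (59.94...)

def source_field(out_index: int) -> int:
    """Nearest input field for a given output field (nearest-neighbour only)."""
    t = out_index / OUT_RATE   # presentation time of the output field
    return round(t * IN_RATE)  # nearest PAL field at that instant

fields = [source_field(i) for i in range(12)]
print(fields)  # some input indices appear twice: those fields must be repeated
```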
Hidden Signals: not always transferred

TV contains many hidden signals. One signal type that is not transferred, except on some very
expensive converters, is the closed captioning signal.
Teletext signals do not need to be transferred, but the captioning data stream should be if it is
technologically possible to do so.
With HDTV broadcasting, this is less of an issue, for the most part meaning only passing the
captioning data stream on to the new source material. However, DVB and ATSC have
significantly different captioning data stream types.

Role of Information Theory

Theory behind systems conversion
Information theory and the Nyquist–Shannon sampling theorem imply that conversion from one
television standard to another will be easier provided that:

 one is going from a higher frame rate to a lower frame rate (NTSC to PAL or SECAM, for
example)
 one is going from a higher resolution to a lower resolution (HDTV to NTSC)
 one is converting from one progressive source to another progressive source
(interlaced PAL and NTSC are temporally and spatially incompatible with each other)
 interframe motion is limited, so as to reduce temporal or spatial judder
 signal-to-noise ratios in the source material are not detrimentally low
 the source material does not possess any continuous (or periodic) signal defect that inhibits
translation
Sampling systems and ratios
The subsampling in a video system is usually expressed as a three part ratio. The three terms of
the ratio are: the number of brightness ("luminance" "luma" or Y) samples, followed by the
number of samples of the two color ("Chroma") components: U/Cb then V/Cr, for each complete
sample area.
For quality comparison, only the ratio between those values is important, so 4:4:4 could easily be
called 1:1:1; however, traditionally the value for brightness is always 4, with the rest of the
values scaled accordingly.

The sampling principles above apply to both digital and analog television.
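As a rough illustration of the J:a:b notation, the sample counts can be tallied over the conventional 4-pixel-wide, two-row reference block. The block size and the row-by-row reading of "a" and "b" are the common interpretation, assumed here rather than stated above:

```python
# A sketch of what the J:a:b subsampling ratio means, counted over the
# conventional J-wide, 2-row reference block (an assumed interpretation).

def chroma_samples(j, a, b):
    """Return (luma, chroma_positions) sample counts per J x 2 block."""
    luma = j * 2        # every pixel carries a luma (Y) sample
    chroma = a + b      # 'a' chroma positions in row 1, 'b' in row 2
    return luma, chroma

# 4:4:4 -> (8, 8): full chroma; 4:2:0 -> (8, 2): one chroma position
# per 2x2 pixels, a quarter of the chroma data of 4:4:4.
```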
Telecine judder
The “3:2 pull down” conversion process for 24 frame/s film to television (telecine) creates a
slight error in the video signal compared to the original film frames.
This is one reason why NTSC films viewed on typical home equipment may not appear as
smooth as when viewed in a cinema. The phenomenon is particularly apparent during slow,
steady camera movements which appear slightly jerky when telecined.
This process is commonly referred to as telecine judder.
PAL material in which 2:2:2:2:2:2:2:2:2:2:2:3 pull down has been applied, suffers from a similar
lack of smoothness, though this effect is not usually called “telecine judder”.
In effect every 12th film frame is displayed for the duration of 3 PAL fields (60 milliseconds) –
whereas the other 11 frames are all displayed for the duration of 2 PAL fields (40 milliseconds).
This causes a slight “hiccup” in the video about twice a second.
Television systems converters must avoid creating telecine judder effects during the conversion
process.
Avoiding this judder is of economic importance, as a substantial amount of NTSC (60 Hz,
technically 29.97 frame/s) material that originates from film will have this problem
when converted to PAL or SECAM (both 50 Hz, 25 frame/s).
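The 3:2 cadence described above can be sketched in a few lines. The alternating 2-field/3-field pattern is the standard cadence; the code itself is only an illustration:

```python
# A small sketch of 2:3 ("3:2") pulldown: 4 film frames become 10 video
# fields (5 video frames), which is how 24 frame/s film fills out the
# roughly 30 frame/s NTSC rate.

def pulldown_32(frames):
    """Expand film frames A, B, C, D, ... into the 2:3 field cadence."""
    fields = []
    for i, frame in enumerate(frames):
        copies = 2 if i % 2 == 0 else 3   # alternate 2 fields, 3 fields
        fields.extend([frame] * copies)
    return fields

# 24 film frames -> 60 fields -> 30 interlaced video frames per second.
```

Because some video frames mix fields from two different film frames, the cadence is what produces the judder discussed above.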

Historical standards conversion techniques


Orthicon to orthicon
This method was used by Ireland to convert 625 line service to 405 line service. It is perhaps the
most basic television standard conversion technique.
RTÉ used this method during the latter years of its use of the 405 line system.

A standards converter was used to provide the 405 line service, but according to more
than one former RTÉ engineering source the converter blew up and afterwards the 405
line service was provided by a 405 line camera pointing at a monitor!
This is not the best conversion technique but it can work if one is going from a higher
resolution to a lower one – at the same frame rate.

The first video standards converters were analog. That is, a special professional video
camera that used a video camera tube would be pointed at a Cathode ray tube video monitor.
Both the Camera and the monitor could be switched to either NTSC or PAL, to convert both
ways. Robert Bosch GmbH's Fernseh Division made a large three-rack analog video
standards converter. These were the high-end converters of the 1960s and 1970s. Image
Transform in Universal City, CA used the Fernseh converter and in the 1980s made their own
custom digital converter. This was also a large three-rack device. As digital memory size
became larger in smaller packages, converters became the size of a microwave oven. Today
one can buy a very small consumer converter for home use.
SSTV to PAL and NTSC
The Apollo moon missions (late 1960s, early 1970s) used SSTV as opposed to normal
bandwidth television; this was mostly done to save battery power. The camera used only 7
watts of power.
SSTV was used to transmit images from inside Apollo 7, Apollo 8, and Apollo 9, as well as
the Apollo 11 Lunar Module television from the Moon; see Apollo TV camera.

 The SSTV system used in NASA's early Apollo missions transferred ten frames per
second with a resolution of 320 frame lines using less bandwidth than a normal TV
transmission.
 The early SSTV systems used by NASA differ significantly from the SSTV systems
currently in use by amateur radio enthusiasts today.

 Standards conversion was necessary so that the missions could be seen by a worldwide
audience in both PAL/SECAM (625 lines, 50 Hz) and NTSC (525 lines, 60 Hz)
resolutions.
Later Apollo missions featured color field sequential cameras that output 60-frame/s video.
Each frame corresponded to one of the RGB primary colors. This method is compatible with
black and white NTSC, but incompatible with color NTSC. In fact, even NTSC monochrome
TV compatibility is marginal. A monochrome set could have reproduced the pictures, but the
pictures would have flickered terribly. The camera color video ran at only 10 frame/s. Also,
Doppler shift in the lunar signal would have caused pictures to tear and flip. For these
reasons, the Apollo moon pictures required special conversion techniques.
The conversion steps were completely electromechanical, and they took place in nearly real
time. First, the downlink station corrected the pictures for Doppler shift. Next, in an analog
disc recorder, the downlink station recorded and replayed every video field six times. On the
six-track recorder, recording and playback took place simultaneously. After the recorder,
analog video processors added the missing components of the NTSC color signal. These
components include:

 The 3.58-MHz color burst
 The high-resolution monochrome signal
 The sound
 The I and Q color signals
The conversion delay lasted only some 10 seconds. Then color moon pictures left the
downlink station for world distribution.
Standards conversion methods in common use
Nyquist subsampling
This conversion technique may become popular with manufacturers of HDTV --> NTSC and
HDTV --> PAL converter boxes for the ongoing global conversion to HDTV.

 Multiple Nyquist subsampling was used by the defunct MUSE HDTV system that was
used in Japan.
 MUSE chipsets that can be used for systems conversion do exist, or can be revised for the
needs of HDTV --> Analog TV converter boxes.
How it works

In a typical image transmission setup, all stationary images are transmitted at full resolution.
Moving pictures possess a lower resolution visually, based on complexity of interframe
image content.
When one uses Nyquist subsampling as a standards conversion technique, the horizontal and
vertical resolution of the material are reduced – this is an excellent method for converting
HDTV to standard definition television, but it works very poorly in reverse.

 As the horizontal and vertical content change from frame to frame, moving images will
be blurred (in a manner similar to using 16 mm movie film for HDTV projection).
 In fact, whole-camera pans would result in a loss of 50% of the horizontal resolution.
The Nyquist subsampling method of systems conversion only works for HDTV to Standard
Definition Television, so as a standards conversion technology it has a very limited use.
Phase Correlation is usually preferred for HDTV to standard definition conversion.
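A toy illustration of subsampling as downconversion, keeping every second pixel on each axis (the factor of two is an assumption chosen for illustration, not a fixed property of the method):

```python
# A sketch of Nyquist-style subsampling for downconversion: keep every
# 'factor'-th sample horizontally and vertically, reducing resolution
# on each axis. Works well HDTV -> SD; useless in the reverse direction,
# since discarded detail cannot be recovered.

def subsample(frame, factor=2):
    """Down-convert a 2-D frame (list of pixel rows) by subsampling."""
    return [row[::factor] for row in frame[::factor]]
```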
Framerate conversion
There is a large difference in frame rate between film (24.0 frames per second) and NTSC
(approximately 29.97 frames per second).
Unlike the two other most common video formats, PAL and SECAM, this difference cannot
be overcome by a simple speed-up, because the required 25% speed-up would be obviously
noticeable.
To convert 24 frame/s film to 29.97 frame/s NTSC, a complex process called "3:2 pulldown"
is utilized, in which parts of some frames are duplicated and blended. This produces
irregularities in the sequence of images which some people can perceive as a jitter/stutter
during slow pans of the camera.
What is Telecine?
Telecine is the process of transferring motion picture film into video and is performed in a
color suite. The term is also used to refer to the equipment used in the post-production
process.

For viewing native PAL or SECAM material (such as European television series and some
European movies) on NTSC equipment, a standards conversion has to take place. There are
basically two ways to accomplish this.

 The framerate can be slowed from 25 to 23.976 frames per second (a slowdown of about
4%) to subsequently apply 3:2 pulldown.

 Interpolation of the contents of adjacent frames in order to produce new intermediate
frames; this introduces artifacts, and even the most modestly trained of eyes can quickly
spot video that has been converted between formats.
Linear interpolation
When converting PAL (625 lines @ 25 frame/s) to NTSC (525 lines @ 30 frame/s), the
converter must eliminate 100 lines per frame. The converter must also create five frames per
second.
To reduce the 625-line signal to 525, less expensive converters drop 100 lines. These
converters maintain picture fidelity by evenly spacing removed lines. (For example, the
system might discard every sixth line from each PAL field. After the 50th discard, this
process would stop. By then the system would have passed the viewable area of the field. In
the following field, the process would repeat, completing one frame.) To create the five
additional frames, the converter repeats every fifth frame.
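The inexpensive scheme just described can be sketched as follows. The line and frame counts follow the text; everything else is illustrative:

```python
# A toy sketch of the cheap linear conversion described above:
# drop lines to go 625 -> 525, and repeat frames to go 25 -> 30 frame/s.

def drop_lines(frame_lines):
    """Discard every sixth line until 100 lines have been removed."""
    out, dropped = [], 0
    for i, line in enumerate(frame_lines):
        if dropped < 100 and i % 6 == 5:
            dropped += 1          # evenly spaced discards preserve fidelity
            continue
        out.append(line)
    return out

def repeat_frames(frames):
    """Repeat every fifth frame to add 5 frames per second (25 -> 30)."""
    out = []
    for i, f in enumerate(frames):
        out.append(f)
        if i % 5 == 4:
            out.append(f)
    return out
```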
If there is little inter-frame motion, this conversion algorithm is fast, inexpensive and
effective. Many inexpensive consumer television system converters have employed this
technique. Yet in practice, most video features significant inter-frame motion. To reduce
conversion artifacts, more modern or expensive equipment may use sophisticated techniques.
Doubler
The most basic and literal way to double lines is to repeat each scanline, though the results of
this are generally very crude. Linear interpolation use digital interpolation to recreate the
missing lines in an interlaced signal, and the resulting quality depends on the technique used.
Generally the bob version of linear deinterlacer will only interpolate within a single field,
rather than merging information from adjacent fields, to preserve the smoothness of motion,
resulting in a frame rate equal to the field rate (i.e. a 60i signal would be converted to 60p.)
The former technique in moving areas and the latter in static areas, which improves overall
sharpness.
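A minimal sketch of the "bob" idea, interpolating the missing scanlines within one field. The pixel values and field sizes here are illustrative:

```python
# A sketch of bob deinterlacing: each field is expanded to a full frame by
# linearly interpolating the missing lines from the lines above and below.

def bob_field(field_lines):
    """Interpolate missing scanlines within a single field."""
    frame = []
    for i, line in enumerate(field_lines):
        frame.append(line)
        if i + 1 < len(field_lines):
            nxt = field_lines[i + 1]
            # the missing line is the average of its two neighbours
            frame.append([(a + b) / 2 for a, b in zip(line, nxt)])
        else:
            frame.append(line[:])   # last line is simply repeated
    return frame
```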
Interfield interpolation
Interfield interpolation is a technique in which new frames are created by blending adjacent
frames, rather than repeating a single frame. This is more complex and computationally
expensive than linear interpolation, because it requires the interpolator to have knowledge of
the preceding and the following frames to produce an intermediate blended
frame. Deinterlacing may also be required in order to produce images which can be
interpolated smoothly.
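The blending step can be sketched as a per-pixel weighted average of the preceding and following frames. The 50/50 default below is an assumption; a real converter would weight by the temediate frame's temporal position:

```python
# A sketch of inter-field blending: a new intermediate frame is a weighted
# mix of the preceding and following frames, pixel by pixel.

def blend_frames(prev_frame, next_frame, weight=0.5):
    """Blend two frames; weight 0.5 gives a plain average."""
    return [(1 - weight) * p + weight * n
            for p, n in zip(prev_frame, next_frame)]
```

The smearing described below follows directly from this averaging: any pixel that moved between the two source frames ends up as a mix of object and background.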

Interpolation can also be used to reduce the number of scan lines in the image by averaging
the colour and intensity of pixels on neighboring lines, a technique similar to Bilinear
filtering, but applied to only one axis.
There are simple 2-line and 4-line converters. The 2-line converter creates a new line by
comparing two adjacent lines, whereas a 4-line model averages four lines to create the fifth.
Again, the greater the complexity, the greater the resulting price tag.
Interfield interpolation reduces judder, but at the expense of picture smearing. The greater
the blending applied to smooth out the judder, the greater the smear caused by blending.
Adaptive motion interpolation
Some more advanced techniques measure the nature and degree of inter-frame motion in the
source, and use adaptive algorithms to blend the image based on the results. Some such
techniques are known as motion compensation algorithms, and are computationally much
more expensive than the simpler techniques, thus requiring more powerful hardware to be
effective in real-time conversion.
Adaptive Motion algorithms capitalize on the way the human eye and brain process moving
images - in particular, detail is perceived less clearly on moving objects than on static ones.
Adaptive interpolation requires that the converter analyze multiple successive fields to
detect the amount and type of motion in different areas of the picture.

 Where little motion is detected, the converter can use linear interpolation.
 When greater motion is detected, the converter can switch to an inter-field technique
which sacrifices detail for smoother motion.
Adaptive Motion Interpolation has many variations and is commonly found in midrange
converters. The quality and cost is dependent upon the accuracy in analyzing the type and
amount of motion, and the selection of the most appropriate algorithm for processing the
type of motion.
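The adaptive switch described above can be sketched as follows. The motion measure and the threshold are illustrative assumptions, not any particular converter's algorithm:

```python
# A sketch of motion-adaptive selection: measure inter-field motion and
# pick linear interpolation when it is small, field blending when large.

def frame_difference(a, b):
    """Mean absolute pixel difference between two fields."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def choose_method(prev_field, next_field, threshold=10):
    """Select a conversion technique based on measured motion."""
    if frame_difference(prev_field, next_field) < threshold:
        return "linear"             # little motion: keep full detail
    return "inter-field blend"      # much motion: trade detail for smoothness
```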
Adaptive motion interpolation + block matching
Block matching involves dividing the image into mosaic blocks - say perhaps for the sake of
explanation, 8x8 pixels. The blocks are then stored in memory. The next field read out is also
divided up into the same number and size of mosaic blocks. The converter's computer then
goes to work and starts matching up blocks. The blocks that stayed in the same relative
position (read: there was no motion in this part of the image) receive relatively little
processing.

 For each block that changed, the converter searches in every direction through its
memory, looking for a match to find out where the "block" went (if there's motion, the
block obviously had to have gone somewhere).
 The search starts at the immediate surrounding blocks (assuming little motion).
 If a match isn't found, then it searches further and further out until it finds a match.
 When the matching block is found, the converter then knows how far the block moved
and in which direction.
 This data is then stored as a motion vector for this block.
 Since interframe motion is often predictable owing to Newton's laws of motion in the real
world, the motion vector can then be used to calculate where the block will probably be
in the next field.
 The Newtonian method saves a lot of search and processing time.
When panning from left to right is taking place (over say 10 fields) it is safe to assume that
the 11th field will be similar or very close.

 Block matching can be seen as the "cutting and pasting" of image blocks.
The technique is highly effective but it does require a tremendous amount of computing
power. Consider a block of only 8x8 pixels. For each block, the computer has 64 possible
directions and 64 pixels to be matched to the block in the next field. Also consider that the
greater the motion, the further out the search must be conducted. Just to find an adjacent
block in the next field would entail making a search of 9 blocks. 2 blocks out would require a
search and match of 25 blocks - 3 blocks further distant and it grows to 49 etc.
The type of motion can exponentially compound the compute power required. Consider a
rotating object, where a simple straight line motion vector is of little help in predicting where
the next block should match. It can quickly be seen that the more inter frame motion
introduced, the much greater the processing power required. This is the general concept of
block matching. Block match converters can vary widely in price and performance
depending on the attention to detail and complexity.
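The search just described can be sketched with a sum-of-absolute-differences (SAD) cost, a common matching criterion. The tiny block and search sizes here are illustrative only:

```python
# A toy block-matching sketch: find where a block moved between two frames
# by minimizing the sum of absolute differences over a small search range.

def sad(a, b):
    """Sum of absolute differences between two equal-sized blocks."""
    return sum(abs(x - y) for row_a, row_b in zip(a, b)
               for x, y in zip(row_a, row_b))

def block(frame, y, x, size):
    return [row[x:x + size] for row in frame[y:y + size]]

def find_motion_vector(prev, curr, y, x, size=2, search=2):
    """Return (dy, dx) that best maps the block at (y, x) in prev to curr."""
    target = block(prev, y, x, size)
    best, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny <= len(curr) - size and 0 <= nx <= len(curr[0]) - size:
                cost = sad(target, block(curr, ny, nx, size))
                if best is None or cost < best:
                    best, best_vec = cost, (dy, dx)
    return best_vec
```

Note how the candidate count grows with the search radius, exactly the (2r+1)-squared explosion (9, 25, 49, ...) described above.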
A weird artifact of block matching owes to the size of the block itself. If a moving object is
smaller than the mosaic block, consider that it's the entire block that gets moved. In most
cases, it's not an issue, but consider a thrown baseball. The ball itself has a high motion
vector, but the background that makes up the rest of the block might not have any motion.
The background gets transported in the moved block as well, based on the motion vector of
the baseball. What you might see is the ball with a small amount of outfield or whatever
tagging along. As it's in motion, the block may be "soft" depending upon what additional
techniques were used, and barely noticeable unless you're looking for it.
Block matching requires a staggering amount of processing horsepower, but today's
microprocessors are making it a viable solution.
Phase correlation
Phase Correlation is perhaps the most computationally complex of the general algorithms.
Phase Correlation's success lies in the fact that it is effective at coping with rapid motion
and random motion. Phase Correlation doesn't easily get confused by rotating or twirling
objects that confuse most other kinds of systems converters.
Phase Correlation is elegant as well as technically and conceptually complex. Its successful
operation is derived by performing a Fourier Transform on each field of video.
A Fast Fourier Transform (FFT) is an algorithm which deals with the transformation of
discrete values (in this case image pixels).
When applied to a sample of finite values, a Fast Fourier Transform expresses any changes
(motion) in terms of frequency components.
What is the advantage of using FFTs over simply trying to predict the motion vector on a
pixel by pixel basis?

 Mathematically, it's far easier and faster to recognize and process frequency signatures
from which very accurate motion vectors can then be calculated.
 Rather than having to measure where every pixel goes from frame to frame the FFT
rather results in representing just the changes from one frame to the next.
Since the result of the FFT represents only the inter-frame changes in terms of frequency
distribution, there's far less data that has to be processed in order to calculate the motion
vectors.

 Unlike other motion vector calculating methods, the FFT technique is not easily fooled
by objects that have rotational or spiraling motions.
 What results from the FFT is a three-dimensional frequency distribution represented
mathematically by peaks in a three-dimensional wave pattern.
 The 3rd dimension in this coordinate system represents subsequent fields of video.
In summation: Objects in motion can be mathematically correlated to their peaks in the
frequency distribution. Once the FFT is performed it becomes a computationally simple
matter for the computer to track just the peaks and assign them the appropriate motion

vectors. This conversion technique is both elegant and computationally involved.


Sophisticated software and large amounts of processor "horsepower" are required for these
complex computations.
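A compact one-dimensional sketch of the idea follows. Real converters work on two-dimensional fields and use fast transforms; the naive DFT here is only for illustration:

```python
# A 1-D sketch of phase correlation: the inverse transform of the
# normalized cross-power spectrum peaks at the shift between two signals.

import cmath

def dft(x, inverse=False):
    """Naive discrete Fourier transform (illustrative, O(n^2))."""
    n, sign = len(x), (1 if inverse else -1)
    out = [sum(x[t] * cmath.exp(sign * 2j * cmath.pi * k * t / n)
               for t in range(n)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def phase_correlate(a, b):
    """Return the circular shift that best maps signal a onto signal b."""
    fa, fb = dft(a), dft(b)
    # keep only the phase of the cross-power spectrum
    cross = [x * y.conjugate() / (abs(x * y.conjugate()) or 1)
             for x, y in zip(fb, fa)]
    peaks = [v.real for v in dft(cross, inverse=True)]
    return peaks.index(max(peaks))   # the peak location is the shift
```

Because only the phase is kept, the peak is sharp regardless of image content, which is why the method copes well with complex motion.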

DTV to analog converters for consumers


A digital television adapter, also known as a coupon-eligible converter box (CECB) or digital-to-analog converter box, is a device that
receives, by means of an antenna, a digital television (DTV) transmission, and converts that
signal into an analog television signal that can be received and displayed on an analog
television.
These boxes cheaply convert HDTV (16:9 at 720 or 1080) to NTSC or PAL at 4:3. Very
little is known about the specific conversion technologies used by these converter boxes in
the PAL and NTSC zones.
Downconversion is usually required, hence very little image quality loss is perceived by
viewers at the recommended viewing distance with most TV sets.

Topic no. 77

Exposure Meter

In digital photography, an exposure meter is an instrument for measuring the amount of light falling
on or being reflected by a subject, and usually equipped to convert this measurement into usable
information, such as the shutter speed and aperture size required to take a reasonable photograph.

Knowing how your digital camera meters light is critical for achieving consistent and accurate
exposures. Metering is the brains behind how your camera determines the shutter speed and
aperture, based on lighting conditions and ISO speed. Metering options often include partial,
evaluative zone or matrix, center-weighted and spot metering. Each of these have subject
lighting conditions for which they excel — and for which they fail. Understanding these can
improve one's photographic intuition.

BACKGROUND: INCIDENT vs. REFLECTED LIGHT

All in-camera light meters have a fundamental flaw: they can only measure reflected light. This
means the best they can do is guess how much light is actually hitting the subject.

Metering diagram: incident vs. reflected light meters

If all objects reflected the same percentage of incident light, this would work just fine, however
real-world subjects vary greatly in their reflectance. For this reason, in-camera metering is
standardized based on the luminance of light which would be reflected from an object appearing
as middle gray. If the camera is aimed directly at any object lighter or darker than middle gray,
the camera's light meter will incorrectly calculate under or over-exposure, respectively. A hand-
held light meter would calculate the same exposure for any object under the same incident
lighting.

18% Gray Tone | 18% Red Tone | 18% Green Tone | 18% Blue Tone

Above patches depict approximations of 18% luminance. This will appear most accurate when
using a PC display which closely mimics the sRGB color space, with a monitor calibrated
accordingly. Monitors emit as opposed to reflect light, so this is also a fundamental
limitation.

What constitutes middle gray? In the printing industry it is standardized as the ink density which
reflects 18% of incident light, however cameras seldom adhere to this. This topic deserves a
discussion of its own, but for the purposes of this tutorial, just know that each camera treats
middle gray slightly differently, but that it's usually somewhere between 10-18% reflectance.
Metering off a subject which reflects more or less light than this may cause your camera's
metering algorithm to go awry — either through under or over-exposure, respectively.
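Under the 18% convention, the size of this metering error can be expressed in stops. The snow figure in the comment is an illustrative assumption:

```python
# A sketch, assuming the common 18% middle-gray convention, of how far a
# reflected-light reading drifts for non-average subjects, in stops.

import math

def metering_error_stops(subject_reflectance, middle_gray=0.18):
    """Positive result: the meter will under-expose by this many stops."""
    return math.log2(subject_reflectance / middle_gray)

# Snow (roughly 90% reflectance) fools the meter into under-exposing by
# over 2 stops, which is why snow scenes need positive compensation.
```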

An in-camera light meter can work surprisingly well if object reflectance is sufficiently diverse
throughout the photo. In other words, if there is an even spread varying from dark to light
objects, then the average reflectance will remain roughly middle gray. Unfortunately, some
scenes may have a significant imbalance in subject reflectivity, such as a photo of a white dove
in the snow, or of a black dog sitting on a pile of charcoal. For such cases the camera may try to
create an image with a histogram whose primary peak is in the midtones, even though it should
have instead produced this peak in the highlights or shadows (see high and low-key histograms).

METERING OPTIONS

In order to accurately expose a greater range of subject lighting and reflectance combinations,
most cameras have several metering options. Each option works by assigning a relative
weighting to different light regions; regions with a higher weighting are considered more
reliable, and thus contribute more to the final exposure calculation.

Center-Weighted | Partial Metering | Spot Metering

Partial and spot areas are roughly 13.5% and 3.8% of the picture area, respectively, which
correspond to settings on the Canon EOS 1D Mark II.

The whitest regions are those which contribute most towards the exposure calculation, whereas
black areas are ignored. Each of the above metering diagrams may also be located off-center,
depending on the metering options and autofocus point used.

More sophisticated algorithms may go beyond just a regional map and include: evaluative, zone
and matrix metering. These are usually the default when your camera is set to auto exposure.
Each generally works by dividing the image up into numerous sub-sections, where each section
is then considered in terms of its relative location, light intensity or color. The location of the
autofocus point and orientation of the camera (portrait vs. landscape) may also contribute to the
calculation.
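The regional weighting idea can be sketched as a weighted average. The zone layout and weights below are illustrative assumptions, not any camera's actual map:

```python
# A sketch of regional metering: each zone's luminance contributes to the
# exposure target in proportion to its weight.

def metered_luminance(zones, weights):
    """Weighted-average luminance over paired zone and weight lists."""
    total = sum(weights)
    return sum(z * w for z, w in zip(zones, weights)) / total

# Center-weighted: the middle zone dominates; spot metering would set the
# outer weights to (or near) zero so only one small region counts.
center_weighted = metered_luminance([200, 120, 40], [1, 3, 1])
```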

WHEN TO USE PARTIAL & SPOT METERING

Partial and spot metering give the photographer far more control over the exposure than any of
the other settings, but this also means that these are more difficult to use — at least initially.
They are useful when there is a relatively small object within your scene which you either need
to be perfectly exposed, or know will provide the closest match to middle gray.

One of the most common applications of partial metering is a portrait of someone who is back-
lit. Metering off their face can help avoid an exposure that makes the subject appear as an under-
exposed silhouette against the bright background. On the other hand, care should be taken as the
shade of a person's skin may lead to inaccurate exposure if this shade is far from neutral gray
reflectance (although not by as much as with backlighting).

Spot metering is used less often because its metering area is very small and thus quite specific.
This can be an advantage when you are unsure of your subject's reflectance and have a specially
designed gray card (or other small object) to meter off of.

Spot and partial metering are also quite useful for creative exposures, and when ambient lighting
is unusual. In the examples to the left and right below, one could meter off the diffusely lit
foreground tiles, or off the directly lit stone below the sky opening.

CENTER-WEIGHTED METERING

Center-weighted metering was once a very common default setting in cameras because it coped
well with a bright sky above a darker landscape. Nowadays, it has more or less been surpassed in
flexibility by evaluative and matrix, and in specificity by partial and spot metering. On the other
hand, the results produced by center-weighted metering are very predictable, whereas matrix and
evaluative metering modes have complicated algorithms which are harder to predict. For this
reason some still prefer to use center-weighted as the default metering mode.

EXPOSURE COMPENSATION

Any of the above metering modes can use a feature called exposure compensation (EC). When
this is activated, the metering calculation still works as normal, but the final exposure
target gets compensated by the EC value. This allows for manual corrections if you observe a
metering mode to be consistently under or over-exposing. Most cameras allow up to 2 stops of
exposure compensation, where each stop provides either a doubling or halving of light compared
to what the metering mode would have done otherwise. A setting of zero means no compensation
will be applied (which is the default).
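The effect of EC on the metered settings is a simple power of two. A sketch applied to shutter time:

```python
# A minimal sketch of exposure compensation applied to shutter time:
# each +1 EC stop doubles the exposure time the meter would have chosen.

def compensated_shutter(metered_seconds, ec_stops):
    """Return the shutter time after applying EC in stops."""
    return metered_seconds * (2 ** ec_stops)

# +1 EC on a metered 1/250 s gives 1/125 s; -1 EC gives 1/500 s.
```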

Exposure compensation is ideal for correcting in-camera metering errors caused by the subject's
reflectivity. No matter what metering mode is used, in-camera light meters will always
mistakenly under-expose a subject such as a white dove in a snowstorm (see incident vs.
reflected light). Photographs in the snow will therefore always require around +1 exposure
compensation, whereas a low-key image may require negative compensation.

When shooting in RAW mode under tricky lighting, sometimes it is useful to set slight negative
exposure compensation (−0.3 to −0.5 stops). This decreases the chance of clipped highlights, yet still allows
one to increase the exposure afterwards. Alternatively, positive exposure compensation can be
used to improve the signal to noise ratio in situations where the highlights are far from clipping.

Topic 78
What is the Light Meter?
For as long as people have been taking photos, there has been a need to determine how bright a
scene is. Any method of recording light can only work in a relatively narrow band without over
or under exposing the image. To find the correct exposure that will record the image without
over or under exposing it too much, photographers need to know how bright the scene is. An
extremely talented photographer may be able to guess a near-enough exposure, but a light meter
is a far more accurate and convenient way to do it.

Light meters in cameras react to how intense the light is as seen from the camera. SLRs measure
the light (called metering) through the lens – TTL. They collect light that has actually passed
through the camera’s lens and measure its intensity. There are problems when the scene has parts
that are much brighter or darker than others, for example shadows on a sunny day. This can trick
the light meter into measuring the intensity of the light incorrectly, depending on which part of
the scene was illuminating the sensor.

Modern SLR cameras use multi-point light meters, meaning that several light meters are actually
scattered around the projected scene, each measuring the light intensity at that point. Very
sophisticated cameras may have dozens of metering points. How much the measured intensity of
the light at each point influences the final meter reading depends on the metering mode selected
by the photographer.

How to Use the Light Meter?

As we now know, the correct exposure is created by juggling the three points of the exposure
triangle: aperture, shutter and ISO. The light meter is the tool that puts us in the right
neighborhood for how these should be set. If you are shooting on full auto, then when you meter
the scene – usually done at the same time as focusing, by half pressing the shutter – the light
meter gives its best guess for each of these variables.

If you want to take creative control of the photo, you can manually set each of the three variables
yourself. Typically ISO is left at the default, or previous setting, and you take control by
choosing aperture priority or shutter priority. On most DSLRs that’s done by turning the
exposure mode dial. If you set the dial to Av – aperture priority, the photographer chooses what
the aperture will be, and the light meter adjusts the shutter speed to maintain the correct
exposure. The reverse is true for Tv – shutter priority.
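The relationship the priority modes exploit can be sketched with the standard exposure-value relation, EV = log2(N²/t) at ISO 100: once the meter fixes EV, choosing one side of the triangle determines the other. The EV figures in the comments are conventional approximations:

```python
# A sketch of aperture priority at ISO 100: the light meter fixes EV, the
# photographer picks the f-number N, and the shutter time t follows from
# EV = log2(N^2 / t), i.e. t = N^2 / 2^EV.

def shutter_for(ev, f_number):
    """Shutter time in seconds for a given EV and f-number."""
    return f_number ** 2 / (2 ** ev)

# A bright "sunny 16" scene is about EV 15: at f/16 that gives 1/128 s.
```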

When using these modes, it’s useful to refer to the exposure meter display on the camera.
The exposure meter (display) shows the result of the measurement taken by the light
meter (sensor). It will typically look something like this:

Exposure meter display on LCD | Exposure meter display in viewfinder

Each number represents a stop change in the light, as indicated, with the central mark being the
“correct” exposure, as determined by the light meter. Each pip between the numbers represents
one third of a stop. The arrow underneath indicates how close the current settings are to the
correct exposure. Usually in priority modes, the arrow will stay in the middle as the light meter
will be able to set the exposure correctly. However, if for example you set your shutter speed to
1/400sec in TV (shutter priority mode) and the light meter indicated that you needed an aperture
of f/4, but your lens was only capable of f/5.6, then the exposure meter will display one stop of
underexposure. You will need to compensate for this by setting a longer shutter time, or by
increasing the ISO.
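The stop arithmetic behind that example can be sketched in a few lines of Python (the function name and the f/4 vs f/5.6 figures are illustrative, mirroring the scenario above):

```python
import math

def aperture_stops(f_wanted, f_available):
    """Stops of light lost when the lens cannot open as wide as metered.
    Light gathered varies with the square of the f-number, so each stop
    is a factor of sqrt(2) in f-number."""
    return 2 * math.log2(f_available / f_wanted)

deficit = aperture_stops(4.0, 5.6)   # close to 1 stop underexposed
shutter = (1 / 400) * 2 ** deficit   # compensate with a longer shutter time...
iso = 200 * 2 ** deficit             # ...or with a higher ISO
```

Doubling the shutter time (to roughly 1/200s) or the ISO recovers the lost stop, which is exactly the trade the exposure triangle describes.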

The juggling act becomes more complicated, and the light meter’s assistance more valuable,
when you go to full manual control of the exposure. Here the exposure meter simply displays
whether the current settings will under or over expose the image, according to the light meter.
The photographer can freely change any of the values on the exposure triangle, and see the
change to the predicted versus recommended exposure.

Exposure compensation

Even though the light meter in your camera is pretty sophisticated, sometimes it can get it wrong,
especially with harsh contrasts, or highly reflective surfaces. Changing metering modes may help
this, but a more controlled approach is to use exposure compensation. Imagine you are
photographing a person against a large bright sky. The light meter thinks the sky is the most
important part, and exposes correctly for that, leaving the person a dark silhouette. By using
exposure compensation, you can tell the camera to take the metered exposure and make it


brighter by a chosen amount. This will then allow the photographer to correctly expose the
person. I’ll look at exposure compensation in more detail in a future post.
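What the exposure compensation dial does to the metered settings can be sketched as follows (the backlit-portrait numbers are illustrative):

```python
def compensate(metered_shutter_s, ev):
    """Apply exposure compensation in EV: each +1 EV doubles the exposure time."""
    return metered_shutter_s * 2 ** ev

# Backlit portrait: the meter exposes for the sky at 1/500s; dialling in
# +2 EV brightens the frame by two stops.
print(compensate(1 / 500, 2))  # 0.008 s, i.e. 1/125s
```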


Topic 79
18% Grey Card
A gray card is an object of neutral color which reflects a known amount of light. It is used to
solve two color-related problems in photography. One is the color cast in
images caused by different illumination conditions and the camera's (in)ability to neutralize it.
The other is the lack of a reference point for managing exposure (the amount of light
captured), to prevent an image turning out too dark or too light.
There are several good reasons to buy a professional gray card but if you are in a hurry or want
to experiment you can also print a gray card first.

Description        RGB Value       Lab Value

12% Gray Card      31.37% / 80     34%
12% Gray Card      33.33% / 85     36%
18% Gray Card      47% / 119       50%
18% Gray Card      50% / 128       54%
90% Gray Card      93.33% / 238    94%
Post Processing    multiple        multiple

To use a gray card, hold it up in the light that is the same as the light hitting your subject, point
your camera at it (preferably using spot metering mode for best results) and you now have a
‘perfect’ setting. This 18% gray card is what your camera assumes the world is; placing such a
card in front of your camera now makes it able to meter the light with better accuracy.
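Why 18% reflectance counts as "middle" gray can be checked against the CIE lightness formula, which is roughly what the Lab column in the table above reports (a back-of-the-envelope sketch, not anything your camera actually computes):

```python
def reflectance_to_lstar(y):
    """CIE 1976 L* lightness from linear reflectance y in 0..1."""
    return 116 * y ** (1 / 3) - 16 if y > 0.008856 else 903.3 * y

print(reflectance_to_lstar(0.18))  # just under 50: the mid-point of the L* scale
```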
Just like eating enough fiber or getting enough sleep, you know using a gray card is good for
you(r photography) but you don’t do it all the time. I don’t do it all the time either. Why? My
number one reason is the card is not always with me. A dedicated shoot I am being paid for? Of
course it is there. But taking my daughter to school and seeing something worth shooting on the
way? It’s not with me 100% of the time (also because I test many camera bags and it doesn’t
always get packed into the right bag). Plus, who has time to pull out the card when the lighting is
just right and the card is in your bag in the car? I’m as good at making excuses as the next guy.
As long as you have two hands, your gray card can always be with you.


The technique is simple, and the idea is that the color and tone of the palm of your hand doesn't
change much. Certainly not as much as the back of your hand, which has more pigment and sees
more sun. So why not use your palm?
To use your hand as a gray card you will first need a gray card. They are cheap and you can
order them online or find them at a local photo shop. In nice, even light, using spot metering
and manual exposure mode, point your camera at the gray card. Set your ISO so it is not on Auto;
800 is fine, as the exact number isn't too important. Now adjust aperture and shutter speed until
the camera metering is at zero, meaning it is not over- or underexposed according to the camera.
Next place your hand (I suggest your left hand) where the card was, with your fingers together.
Ensure the center metering spot is completely covered by your hand.


Topic 80

Lighting Ratio

Lighting ratio in photography refers to the comparison of key light (the main source of light
from which shadows fall) to the fill light (the light that fills in the shadow areas). The higher the
lighting ratio, the higher the contrast of the image; the lower the ratio, the lower the contrast.
Since the lighting ratio is the ratio of the light levels on the brightest lit to the least lit parts of
the subject, and the brightest parts are lit by both key (K) and fill (F), the lighting ratio is
properly (K+F):F, although for contrast ratios of 4:1 or more, K:F is sufficiently accurate.

Light can be measured in foot candles. A key light of 100 foot candles and a fill light of 100 foot
candles have a 1:1 ratio (a ratio of one to one). A key light of 800 foot candles and a fill light of
200 foot candles has a ratio of 4:1.

The ratio can be determined in relation to f-stops, since each increase of one f-stop is equal to
double the amount of light: 2 to the power of the difference in f-stops is equal to the first factor
in the ratio. For example, a difference of two f-stops between key and fill is 2 squared, or a 4:1
ratio. A difference of 3 stops is 2 cubed, or an 8:1 ratio. No difference is equal to 2 to the power
of 0, for a 1:1 ratio.

In situations such as motion picture lighting, sometimes the lighting ratio is described as key
plus fill to fill alone. A light meter can automatically calculate the ratio of key plus fill to fill
alone.
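The two ways of stating the ratio described above can be put side by side in a short Python sketch (function names are illustrative):

```python
def lighting_ratio(key_fc, fill_fc):
    """Simple key:fill ratio from foot-candle readings."""
    return key_fc / fill_fc

def ratio_from_stops(stop_diff):
    """First factor of the ratio from the f-stop difference: 2 ** diff."""
    return 2 ** stop_diff

print(lighting_ratio(800, 200))  # 4.0, the 4:1 example above
print(ratio_from_stops(3))       # 8, three stops of difference gives 8:1
```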

High-key lighting is a style of lighting for film, television, or photography that aims to reduce
the lighting ratio present in the scene. This was originally done partly for technological reasons,
since early film and television did not deal well with high contrast ratios, but now is used to
suggest an upbeat mood. It is often used in sitcoms and comedies. High-key lighting is usually
quite homogeneous and free from dark shadows. The terminology comes from the key
light (main light).

In the 1950s and 1960s, high-key lighting was achieved through multiple light sources lighting a
scene—usually using three fixtures per person (left, right, and central)—which resulted in a
uniform lighting pattern with very little modeling. Nowadays, multiple hot light sources are
substituted by much more efficient fluorescent soft lights which provide a similar effect.

The advantage of high-key lighting is that it doesn't require adjustment for each scene, which
allows the production to complete the shooting in hours instead of days. The primary drawback
is that high-key lighting fails to add meaning or drama by lighting certain parts more
prominently than others.

Shows with bigger budgets have moved away from high-key lighting by using lighting set-ups
different from the standard three-point lighting. Part of the reason for this is the advent of new
lighting fixtures which are easier to use and quicker to set up. Another reason is the growing
sophistication of the audience for TV programs and the need to differentiate.

The term "high-key" has found its way from cinema into more widespread usage, for example
referring to an event that requires much organization or is subject to a great deal of publicity.

Low-key lighting is a style of lighting for photography, film or television. It is a necessary
element in creating a chiaroscuro effect. Traditional photographic three-point lighting uses a key
light, a fill light, and a back light for illumination. Low-key lighting often uses only one key
light, optionally controlled with a fill light or a simple reflector. Low-key light accentuates the
contours of an object by throwing areas into shade, while a fill light or reflector may illuminate
the shadow areas to control contrast. The relative strength of key-to-fill, known as the lighting
ratio, can be measured using a light meter. Low-key lighting has a higher lighting ratio, e.g. 8:1,
than high-key lighting, which can approach 1:1.

The term "low key" is used in cinematography to refer to any scene with a high lighting ratio,
especially if there is a predominance of shadowy areas. It tends to heighten the sense of
alienation felt by the viewer, hence is commonly used in film noir and horror genres.


Topic 81

Camera Filters
In photography and videography, a filter is a camera accessory consisting of an optical filter that
can be inserted into the optical path. The filter can be of a square or oblong shape and mounted
in a holder accessory, or, more commonly, a glass or plastic disk in a metal or plastic ring frame,
which can be screwed into the front of or clipped onto the camera lens.

Filters modify the images recorded. Sometimes they are used to make only subtle changes to
images; other times the image would simply not be possible without them. In monochrome
photography coloured filters affect the relative brightness of different colours; red lipstick may
be rendered as anything from almost white to almost black with different filters. Others change
the colour balance of images, so that photographs under incandescent lighting show colours as
they are perceived, rather than with a reddish tinge. There are filters that distort the image in a
desired way, diffusing an otherwise sharp image, adding a starry effect, etc.
Supplementary close-up lenses may be classified as filters. Linear and circular polarizing filters
reduce oblique reflections from non-metallic surfaces.

Many filters absorb part of the light available, necessitating longer exposure. As the filter is in
the optical path, any imperfections—non-flat or non-parallel surfaces, reflections (minimized by
optical coating), scratches, dirt—affect the image.

There is no universal standard naming system for filters. The Wratten numbers adopted in the
early twentieth century by Kodak, then a dominant force in film photography, are used by
several manufacturers. Colour correction filters are often identified by a code of the form
CC50Y—CC for colour correction, 50 for the strength of the filter, Y for yellow.

Optical filters are used in various areas of science, including in particular astronomy; they are
essentially the same as photographic filters, but in practice often need far more accurately
controlled optical properties and precisely defined transmission curves than filters exclusively
for photographic use. Photographic filters sell in larger quantities at correspondingly lower prices
than many laboratory filters. The article on optical filters has material relevant to photographic
filters.


In digital photography the majority of filters used with film cameras have been rendered
redundant by digital filters applied either in-camera or during post-processing. Exceptions
include the ultraviolet (UV) filter, typically used to protect the front surface of the lens, the
neutral density (ND) filter, the polarizing filter and the infra-red (IR) filter. The neutral density
filter permits effects requiring wide apertures or long exposures to be applied to brightly lit
scenes, while the graduated neutral density filter is useful in situations where the scene's dynamic
range exceeds the capability of the sensor. Not using optical filters in front of the lens has the
advantage of avoiding the reduction of image quality caused by the presence of an extra optical
element in the light path and may be necessary to avoid vignetting when using wide-angle lenses.


Topic 82

Camera Filters & Color Temperature

Understanding color temperature is one of the crucial rules of photography you must learn before
you can begin to break them.

So what is color temperature? In short, each light source has its own individual color, or ‘color
temperature’, which varies from red to blue. Candles, sunsets and tungsten bulbs give off light
that’s close to red (hence the ‘warm’ look they give to pictures), whereas clear blue skies give off
a ‘cool’ blue light. It’s fairly obvious stuff once you read it.

Color temperature is typically recorded in kelvin, the unit of absolute temperature. Cool colors
like blue and white generally have color temperatures over 7000K, while warmer colors like red
and orange lie around the 2000K mark.

When you set your camera’s white balance manually (find out how to make a custom white
balance setting) you can choose from a number of pre-set color temperature options like
Tungsten, Daylight, Cloudy and Shade, or customize your own setting.

Color temperature is measured in 'kelvins', formerly known as 'degrees kelvin'. To get the idea,
think of a piece of metal being heated in a fire. First it gives off a reddish glow and, as it gets
hotter, the color gets whiter and then, as it really warms up, it starts to give off a bluish glow. In
physics, of course, we can't use any old bit of metal for the kelvin scale; we need a 'theoretical
black object'. The photographer's color temperature chart is a loose interpretation of the kelvin
scale; the numbers are not used in any precise manner.

As photographers, all we need to know is that different types of light source emit different
colors. 5000 kelvins is what we photographers call white light and is represented by 'average
daylight', whatever that is; actually it's fairly obvious if you look at the chart below. We also
need to know that household bulbs give off an orange light and a cloudy day will appear blue.
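The chart can be approximated in code. The preset values below are typical ballpark figures, not any particular camera's calibration:

```python
# Ballpark colour temperatures for common white-balance presets (kelvin)
PRESETS = {"tungsten": 3200, "daylight": 5500, "cloudy": 6500, "shade": 7500}

def cast(kelvin):
    """Rough warm/neutral/cool reading of a light source."""
    if kelvin < 4500:
        return "warm (reddish)"
    if kelvin > 6000:
        return "cool (bluish)"
    return "neutral"

print(cast(PRESETS["tungsten"]))  # warm (reddish)
print(cast(PRESETS["shade"]))     # cool (bluish)
```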

White balance (WB) is the process of removing unrealistic color casts, so that objects which
appear white in person are rendered white in your photo. Proper camera white balance has to
take into account the "color temperature" of a light source, which refers to the relative warmth
or coolness of white light. Our eyes are very good at judging what is white under different light
sources, but digital cameras often have great difficulty with auto white balance (AWB) — and
can create unsightly blue, orange, or even green color casts. Understanding digital white balance
can help you avoid these color casts, thereby improving your photos under a wider range of
lighting conditions.

AUTO WHITE BALANCE

Certain subjects create problems for a digital camera's auto white balance — even under normal
daylight conditions. One example is if the image already has an overabundance of warmth or
coolness due to unique subject matter. The camera then tries to compensate for this so that the
average color of the image is closer to neutral, but in doing so it unknowingly creates a bluish
color cast on the stones. Some digital cameras are more susceptible to this than others.

A digital camera's auto white balance is often more effective when the photo contains at least
one white or bright colorless element. Of course, do not try to change your composition to
include a colorless object, but just be aware that its absence may cause problems with the auto
white balance.


Topic 83

Camera Filter Effect

Screw-in Filters

Screw-in filters fit directly onto your lens, in the threads at the edge of the lens barrel. Each
screw-in filter is a specific width, so the more lenses (of different widths) you have the more
filters you’ll need. Screw-in filters are ideal, and make polarizers and UV filters easy to swap in
and out.

Slot-in Filters


For slot-in filters, a filter holder is placed on the lens's adapter ring and filters are dropped into
the holder. The holder usually has interchangeable rings so the holder can fit on a wide array of
lenses. The holder typically has three or four grooves, so you can put more than one filter in the
holder. The advantage of the slot-in filter is that you can add or subtract filters relatively
quickly, and larger filters can work on shorter, smaller lenses.

Filter Factor

Filters change the dynamics of the light entering the lens and usually require you to alter your
exposure to compensate for this fact. This is called the Filter Factor and each filter has a specific
filter factor, so read up on these to learn how to use them.

UV Filter

Ultra Violet filters are transparent filters that block ultra-violet light, in order to reduce the
haziness that is noticeably apparent in some daylight photography. UV filters don’t affect the
majority of visible light, so they are a perfect form of lens protection and they will not alter your
exposure. There are some “strong” UV filters that are more effective at cutting atmospheric haze
and reducing the notorious purple fringing that sometimes shows up in digital photography.
Purple fringing is a purple ghost that you see at the edges of a subject when it is slightly out of
focus.


Polarizing Filter

A Polarizing filter can be used to darken overly light skies as it increases the contrast between
clouds and the sky. Like the UV filter, the Polarizer reduces atmospheric haze, but also reduces
reflected sunlight. The most typical function of a Polarizer is to remove reflections from water
and glass. When angled (or spun) properly, the Polarizer eliminates the reflection when shooting
through a glass window or into water; a handy trick to be sure! There are two types of polarizers:
linear and circular. Both types of polarizers produce a similar effect, except the circular polarizer
eliminates unwanted reflected light with the help of a quarter-wave plate. The resulting image is
free of reflected light, and transparent objects like glass are free of reflections.

Color Balancing Filter

As you know, visible light is made up of a multiple color spectrum. But in photography, you
have to make a choice to capture images with the camera's white balance set to record the
whitish-blue light of daylight or set to record the reddish-orange tungsten (incandescent) light…
with a few variations (i.e. sodium-vapor or fluorescent). This is what the white balance is used to
control, and you use a color balancing filter to effect a change in your light sources,
compensating for the differences in the photographed color of light (e.g. daylight is cooler and
appears blue, whereas tungsten is warmer and appears reddish orange). The 85B (warm-up/
orange filter) and the 80A (cool-down/blue filter) are the two standard filters for color
balancing. The 85B enables you to shoot in the daylight when the white balance/color
temperature is set for tungsten. Without the 85B filter, your image will have a blue color cast to
it. The 80A enables you to shoot under tungsten light when the color temperature/white balance
is set for daylight. Without the 80A, your image will be abnormally warm/reddish orange. These
filters have fallen out of use recently because this type of color temperature correction can easily
be achieved with image processing software. Some photographers still use them for various
artistic effects.
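Conversion filters like the 80A and 85B are traditionally rated in mireds (micro reciprocal degrees), and the required shift is easy to compute. The sketch below uses the standard 1e6/K definition, with the 80A's commonly quoted rating as a sanity check:

```python
def mired_shift(source_k, target_k):
    """Mired shift a conversion filter must supply (1 mired = 1e6 / kelvin).
    Negative values call for a bluish (cooling) filter."""
    return 1e6 / target_k - 1e6 / source_k

# Shooting under tungsten (3200K) with daylight (5500K) balance:
print(round(mired_shift(3200, 5500)))  # about -131, the usual 80A rating
```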

Neutral Density Filter

Attaching a neutral density (ND) filter to your lens uniformly reduces the amount of light
entering the lens. The ND filter is helpful when the contrast between the highlights and shadows
is too great to get a quality exposure. The ND filter can also enable greater motion blur or image
detail by allowing a large aperture and/or a slow shutter speed to be used. A variant on the
ND filter is the graduated ND, which has a gradient that reduces the light by a graduated,
neutral amount from 100% to 0% across the length of the filter. The graduated ND is
recommended for shooting landscapes and seascapes, because you can reduce the brightness of
the sky (for better contrast) while still maintaining an effective exposure of the land or water.


Soft Focus Filter

Soft focus filters do exactly that: they reduce the sharpness of an image, but only to an extent
that is barely noticeable. They are useful for shooting close-up shots of people's faces. With the
help of a little diffusion, imperfect skin conditions are replaced by silky smooth skin. Remember,
you can use soft focus filters while photographing landscapes or monuments as well.

Filters for B&W Photography

There are specific filters for B&W photography that lighten similar colors and darken opposite
colors, thereby enhancing the monochromatic look. There are Red, Orange, Yellow, Green and
Blue filters for use in B&W photography.


Red filters are a favorite among landscape photographers and are often used to add drama. In
nature photography, a red filter will increase the contrast between red flowers and green foliage.
A red filter will deepen a blue sky and make white clouds pop out. It can also decrease the
effects of haze and fog. In some cases, depending on its strength, a red filter could even turn the
sky black.

Orange filters increase contrast between tones in textures such as tile or bricks, making them a
good choice for general use and urban or abstract photography. They also help to decrease haze
and fog, but their effects on the sky and clouds are subtler than the red filter's.

Yellow filters are even subtler than orange filters, making them a 'classic' choice for beginners
just starting to explore using filters with black and white photography. They help to darken the
clouds slightly, and also separate light green foliage from the darker shades of green.

Green filters lighten dark green foliage and boost light green foliage. They have a more specific
use and are not as commonly used as the other filters, but green filters are extremely useful for
the nature photographer. Green filters may lighten the sky, so landscape photographers should
take note of this when using them.

Blue filters are not as commonly used in black and white photography because they lighten the
sky and darken highlights or colors that are seen as light. Blue filters can draw attention to haze
and fog, which can enhance the mood of the photo if needed. It’s a good idea to experiment with
this filter using the B&W setting, as opposed to shooting in color and converting the image to
B&W in an image processor.

Since a filter absorbs light, it necessitates an increase in exposure. Filter makers will usually
suggest an amount of exposure compensation in the form of a "filter factor". A filter factor of 2X
means that you should multiply the exposure by 2; a filter factor of 4X means that you should
multiply your exposure by 4, and so on. For filter factors of 2X and 4X, add 1 f-stop and 2
f-stops to your exposure respectively. Another alternative is to divide your ISO by the filter
factor: if the filter factor is 2X and your ISO is 200, your new ISO is 100.
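The filter-factor rules above reduce to a base-2 logarithm, since each stop is a doubling of light:

```python
import math

def factor_to_stops(filter_factor):
    """Stops of extra exposure a filter factor demands."""
    return math.log2(filter_factor)

print(factor_to_stops(2))  # 1.0 -> open up one stop
print(factor_to_stops(4))  # 2.0 -> open up two stops
print(200 / 2)             # or treat an ISO 200 setting as ISO 100
```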

Conclusion

Photographic filters are used to achieve image enhancement effects that can change the tone and
mood of your photographs. Filters inject slight, but noticeable, alterations to your image. You
can achieve many of the same effects by extensive tweaking in Photoshop (or another image
manipulation software package), but when you use a filter you can immediately see the
difference to your image in the viewfinder. The effects of filters are more pronounced when
working in B&W, as the monochromatic tonal scale reacts much differently, and also with
greater dramatic effect. As with every new photographic accessory, practice and
experimentation are the keys to expanding the application of your creative palette.

(Source: http://www.exposureguide.com/lens-filters.htm)


Topic 84

Aspect Ratio

The aspect ratio of an image describes the proportional relationship between its width and its
height. It is commonly expressed as two numbers separated by a colon, as in 16:9.

Why aspect ratio matters

Why does aspect ratio matter? It's all to do with the relationship of the main subject to the sides
of the frame, and the amount of empty space you end up with around the subject.

An awareness of the characteristics of the aspect ratio of your particular camera can help you
compose better images. It also helps you recognise when cropping to a different aspect ratio will
improve the composition of your image.

What is aspect ratio?

Aspect ratio describes the relationship between the width and height of an image. It's written as
a figure, in this form – width:height (width always comes first).

Virtually every digital camera comes with a sensor of one of two aspect ratios. The 3:2 aspect
ratio is used by 35mm crop sensor and full-frame SLRs, some Leica medium format cameras,
mirror-less cameras, high end compacts and most 35mm film cameras. This aspect ratio has
been with us ever since Leica made the first 35mm film cameras early last century.

35mm crop sensor and full-frame SLRs have an aspect ratio of 3:2. The sensor is 1.5 times as
wide as it is high. A full-frame 35mm sensor measures 36 x 24mm. You can express this figure
as a ratio: 36:24. Mathematicians always like to simplify ratios so that the relationship between
the two numbers is easy to visualise. In this case, you can divide both dimensions by twelve.
That gives you 3:2.

Crop sensor cameras have smaller sensors, measuring approximately 22.5 x 15mm (the exact
measurements vary, depending on brand and model). These figures conform to the 3:2 aspect
ratio of the full-frame sensor.
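The "divide both dimensions" simplification is just a greatest-common-divisor reduction, easy to verify in Python:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce sensor dimensions to the simplest width:height form."""
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

print(aspect_ratio(36, 24))      # 3:2, the full-frame 36 x 24 mm sensor
print(aspect_ratio(1920, 1080))  # 16:9, HD video
```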


Examples

The most common aspect ratios used today in the presentation of films in cinemas
are 1.85:1 and 2.39:1. Two common videographic aspect ratios are 4:3 (1.33:1), the universal
video format of the 20th century, and 16:9 (1.77:1), universal for high-definition television and
European digital television. Other cinema and video aspect ratios exist, but are used
infrequently.

In still camera photography, the most common aspect ratios are 4:3, 3:2, and, more recently
being found in consumer cameras, 16:9. Other aspect ratios, such as 5:3, 5:4, and 1:1 (square
format), are used in photography as well, particularly in medium format and large format.

With television, DVD and Blu-ray Disc, converting formats of unequal ratios is achieved by
enlarging the original image to fill the receiving format's display area and cutting off any excess
picture information (zooming and cropping), by adding horizontal mattes (letterboxing) or
vertical mattes (pillarboxing) to retain the original format's aspect ratio, by stretching (hence
distorting) the image to fill the receiving format's ratio, or by scaling by different factors in both
directions, possibly scaling by a different factor in the center and at the edges (as in Wide Zoom
mode).
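For instance, the height of the letterbox mattes follows directly from the two ratios (a sketch; variable names are illustrative):

```python
def letterbox_matte(source_ar, screen_w, screen_h):
    """Height in pixels of each black bar when a wider source is fitted
    into a narrower screen without cropping or distortion."""
    scaled_h = screen_w / source_ar
    return (screen_h - scaled_h) / 2

# A 2.39:1 feature shown on a 1920x1080 (16:9) display:
print(round(letterbox_matte(2.39, 1920, 1080)))  # roughly 138 px top and bottom
```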

Current video standards

1. 4:3 standard

4:3 (1.33:1) (generally read as "Four-Three", "Four-by-Three", or "Four-to-Three") for standard
television has been in use since the invention of moving picture cameras, and many computer
monitors used to employ the same aspect ratio. 4:3 was the aspect ratio used for 35 mm films in
the silent era. It is also very close to the 1.375:1 aspect ratio defined by the Academy of Motion
Picture Arts and Sciences as a standard after the advent of optical sound-on-film. By having TV
match this aspect ratio, movies originally photographed on 35 mm film could be satisfactorily
viewed on TV in the early days of the medium (i.e. the 1940s and the 1950s). When cinema
attendance dropped, Hollywood created widescreen aspect ratios (such as the 1.85:1 ratio
mentioned earlier) in order to differentiate the film industry from TV. However, since the start
of the 21st century broadcasters worldwide have been phasing out the 4:3 standard entirely, as
technology started to favor the 16:9/16:10 aspect ratio of all modern high-definition television
sets, broadcast cameras and computer monitors.

2. 16:9 standard

16:9 (1.77:1) (generally named as "Sixteen-Nine", "Sixteen-by-Nine" and "Sixteen-to-Nine") is
the international standard format of HDTV, non-HD digital television and analog widescreen
television PALplus. Japan's Hi-Vision originally started with a 5:3 (= 15:9) ratio but converted
when the international standards group introduced a wider ratio of 5⅓ to 3 (= 16:9). Many
digital video cameras have the capability to record in 16:9, and 16:9 is the only widescreen
aspect ratio natively supported by the DVD standard. DVD producers can also choose to show
even wider ratios such as 1.85:1 and 2.39:1 within the 16:9 DVD frame by hard matting or
adding black bars within the image itself.


Topic 85

Depth of Field & Depth of Focus


Due to similarities in name and nature, depth of field and depth of focus are commonly confused

concepts. To simplify the definitions for our purposes, depth of field concerns the image quality

of a stationary lens as an object is repositioned, whereas depth of focus concerns a stationary

object and a sensor’s ability to maintain focus for different sensor positions, including tilt. When

a lens focuses on a subject at a distance, all subjects at that distance are sharply focused. Subjects

that are not at the same distance are out of focus and theoretically are not sharp. However, since

human eyes cannot distinguish very small degree of un-sharpness, some subjects that are in front

of and behind the sharply focused subjects can still appear sharp. The zone of acceptable

sharpness is referred to as the depth of field. Thus, increasing the depth of field increases the

sharpness of an image. We can use smaller apertures for increasing the depth of field.The

following shows an example. The lens focuses at the middle between the 3 inch and 4 inch

marks. Thus, the 3 inch and 4 inch marks are sharp in all images. The 5 inch mark is not very

sharp at F3.2, and is improved as the lens closes down to F3.6. Then, it becomes sharp in all

subsequent images. The 6 inch and 7 inch marks are not sharp until F5.0 and F6.4, respectively.

The 8 inch mark becomes reasonably sharp when the lens closes down to F8.0. The 9 inch and

10 inch marks are not sharp in all images; but, they become sharper as the lens closes down. For

the foreground, the 2 inch mark is acceptable at F3.2 and becomes "focused" at F4.0. The 1 inch

mark is not sharp until F5.6, and the lead of the ruler becomes reasonably sharp at F7.1. As you

can see, the range of sharpness (i.e., depth of field) gets larger as the aperture gets smaller.

Therefore, use a smaller aperture if a greater depth of field is needed. Normally, to increase the

depth of field you must either:

1
Camera Basics, Principles & Techniques- MCD 401 VU

• decrease the size of the aperture A in the final lens,

• decrease the magnification M being used, or

• increase the distance W between the specimen and the lens
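The effect of aperture on the zone of acceptable sharpness can be sketched numerically using the standard hyperfocal-distance approximations. The focal length, f-numbers, subject distance, and circle-of-confusion value below are illustrative assumptions, not figures from the ruler example above.

```python
# Illustrative sketch: depth of field from the common hyperfocal-distance
# approximation. All numeric values are assumed for demonstration only.

def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Return (near_limit, far_limit, total_dof) in millimetres.

    Standard approximations:
      H    = f^2 / (N * c) + f            (hyperfocal distance)
      near = s * (H - f) / (H + s - 2f)
      far  = s * (H - f) / (H - s)        (valid while s < H)
    """
    H = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_mm * (H - focal_mm) / (H + subject_mm - 2 * focal_mm)
    far = subject_mm * (H - focal_mm) / (H - subject_mm)
    return near, far, far - near

# A 50 mm lens focused at 2 m: stopping down from f/3.2 to f/8
# widens the zone of acceptable sharpness, as described above.
for N in (3.2, 5.0, 8.0):
    near, far, dof = depth_of_field(50, N, 2000)
    print(f"f/{N}: near {near:.0f} mm, far {far:.0f} mm, DOF {dof:.0f} mm")
```

Note that the far-limit formula only holds while the subject is closer than the hyperfocal distance; beyond that, the far limit extends to infinity.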

Depth of focus is a lens optics concept that measures the tolerance of placement of the image

plane (the film plane in a camera) in relation to the lens. The same factors that determine depth

of field also determine depth of focus, but these factors can have different effects than they have

in depth of field. Both depth of field and depth of focus increase with smaller apertures. For

distant subjects (beyond macro range), depth of focus is relatively insensitive to focal length and

subject distance, for a fixed f-number. In the macro region, depth of focus increases with longer

focal length or closer subject distance, while depth of field decreases.

Determining factors:

In small-format cameras, the smaller circle of confusion limit yields a proportionately smaller

depth of focus. In motion picture cameras, different lens mount and camera gate combinations

have exact flange focal depth measurements to which lenses are calibrated.

The choice to place gels or other filters behind the lens becomes a much more critical decision

when dealing with smaller formats. Placement of items behind the lens will alter the optics

pathway, shifting the focal plane. Therefore, often this insertion must be done in concert with

stopping down the lens in order to compensate enough to make any shift negligible given a

greater depth of focus. It is often advised in 35 mm motion picture filmmaking not to use filters

behind the lens if the lens is wider than 25 mm.


Topic 86

Rule of Thirds

The rule of thirds is a powerful compositional technique for making photos more interesting and

dynamic. It's also perhaps one of the most well-known. The basic principle behind the rule of

thirds is to imagine breaking an image down into thirds (both horizontally and vertically) so that

you have 9 parts, separated by two horizontal and two vertical guide lines.
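For any frame size, the guide lines and their four intersection points ("power points") are simple to compute; the function below is an illustrative sketch, and the 1920 × 1080 frame is an assumed example.

```python
# Illustrative sketch: rule-of-thirds guide lines for a given frame size.

def thirds_grid(width, height):
    """Return the two vertical lines, two horizontal lines, and the four
    intersection points of the rule-of-thirds grid."""
    verticals = (width / 3, 2 * width / 3)
    horizontals = (height / 3, 2 * height / 3)
    points = [(x, y) for x in verticals for y in horizontals]
    return verticals, horizontals, points

v, h, pts = thirds_grid(1920, 1080)
print(v)    # (640.0, 1280.0)
print(h)    # (360.0, 720.0)
print(pts)  # the four intersection points
```

Aligning a subject with one of the four returned points, rather than the frame centre, is the usual application of the rule.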

The rule of thirds is applied by aligning a subject with the guide lines and their intersection

points, placing the horizon on the top or bottom line, or allowing linear features in the image to

flow from section to section. The main reason for observing the rule of thirds is to discourage

placement of the subject at the center, or prevent a horizon from appearing to divide the picture

in half. Michael Ryan and Melissa Lenos, authors of the book An Introduction to Film Analysis:

Technique and Meaning in Narrative Film state that the use of rule of thirds is "favored by

cinematographers in their effort to design balanced and unified images" (page 40).

When filming or photographing people, it is common to line the body up to a vertical line and

the person's eyes to a horizontal line. If filming a moving subject, the same pattern is often

followed, with the majority of the extra room being in front of the person (the way they are


moving). Likewise, when photographing a still subject who is not directly facing the camera, the

majority of the extra room should be in front of the subject with the vertical line running through

their perceived center of mass.

The rule of thirds was first written down by John Thomas Smith in 1797. In his book Remarks on

Rural Scenery, Smith quotes a 1783 work by Sir Joshua Reynolds, in which Reynolds discusses,

in un-quantified terms, the balance of dark and light in a painting. Smith then continues with an

expansion on the idea, naming it the "Rule of thirds".

Writing in 1845, in his book Chromatics, George Field notes (perhaps erroneously) that Sir

Joshua Reynolds gives the ratio 2:1 as a rule for the proportion of warm to cold colors in a

painting, and attributes to Smith the expansion of that rule to all proportions in painting.

Smith's conception of the rule is meant to apply more generally than the version commonly

explained today, as he recommends it not just for dividing the frame, but also for all division of

straight lines, masses, or groups. On the other hand, he does not discuss the now-common idea

that intersections of the third-lines of the frame are particularly strong or interesting for

composition.

Using the Rule of Thirds comes naturally to some photographers, but for many of us it takes a

little time and practice to become second nature.

In learning how to use the rule of thirds (and then to break it) the most important questions to be

asking of yourself are:

• What are the points of interest in this shot?

• Where am I intentionally placing them?


Once again – remember that breaking the rule can result in some striking shots – so once you’ve

learnt it experiment with purposely breaking it to see what you discover.

Lastly – keep the rule of thirds in mind as you edit your photos later on. Post-production editing

software today offers good tools for cropping and reframing images so that they fit within the rule.

Experiment with some of your old shots to see what impact it might have on your photos.


Topic 87

The Video Camera

A video camera is a camera used for electronic motion picture acquisition (as opposed to a movie

camera, that earlier recorded the images on film), initially developed for the television industry

but now common in other applications as well.

The earliest video cameras were those of John Logie Baird, based on the mechanical Nipkow

disk and used in experimental broadcasts through the 1920s-30s. All-electronic designs based on

the video camera tube, such as Vladimir Zworykin's Iconoscope and Philo Farnsworth's Image

dissector, supplanted the Baird system by the 1930s and remained in wide use until the 1980s,

when cameras based on solid-state image sensors such as CCDs (and later CMOS active pixel

sensors) eliminated common problems with tube technologies such as image burn-in and made

digital video workflow practical. The transition to digital TV gave a boost to digital video cameras,

and by the 2010s most video cameras were digital.

With the advent of digital video capture, the distinction between professional video cameras and

movie cameras has disappeared, as the intermittent mechanism has become the same.

Nowadays, mid-range cameras used exclusively for television and other work (except movies)

are termed professional video cameras.

Video cameras are used primarily in two modes. The first, characteristic of much early

broadcasting, is live television, where the camera feeds real time images directly to a screen for

immediate observation. A few cameras still serve live television production, but most live

connections are for security, military/tactical, and industrial operations where surreptitious or


remote viewing is required. In the second mode the images are recorded to a storage device for

archiving or further processing; for many years, videotape was the primary format used for this

purpose, but it was gradually supplanted by optical disc, hard disk, and finally flash memory.

Recorded video is used in television production and, more often, in surveillance and monitoring

tasks where unattended recording of a situation is required for later analysis.

Modern video cameras have numerous designs and uses, which are listed below.

Professional video cameras, such as those used in television production; these may be television

studio-based or mobile in the case of an electronic field production (EFP). Such cameras

generally offer extremely fine-grained manual control for the camera operator, often to the

exclusion of automated operation. They usually use three sensors to record red, green, and blue

separately.

Camcorders, which combine a camera and a VCR or other recording device in one unit; these are

mobile, and were widely used for television production, home movies, electronic news gathering

(ENG) (including citizen journalism), and similar applications. Since the transition to digital

video cameras, most of the cameras have in-built recording media and as such are also

camcorders.

Closed-circuit television (CCTV) systems generally use pan-tilt-zoom (PTZ) cameras for security,

surveillance, and/or monitoring purposes. Such cameras are designed to be small, easily hidden,

and able to operate unattended; those used in industrial or scientific settings are often meant for

use in environments that are normally inaccessible or uncomfortable for humans, and are

therefore hardened for such hostile environments (e.g. radiation, high heat, or toxic chemical

exposure).


Webcams are video cameras that stream a live video feed to a computer. As for camera phones,

nowadays most video cameras are incorporated into mobile phones.

Special camera systems are those used for scientific research, e.g. on board a satellite or a

space probe, in artificial intelligence and robotics research, or in medical use. Such cameras are

often tuned for non-visible radiation such as infrared (for night vision and heat sensing) or X-rays (for

medical and video astronomy use).

Steadicam is a brand of camera stabilizer mount for motion picture cameras that mechanically

isolates the camera from the operator's movement. It allows for a smooth shot, even when moving quickly

over an uneven surface. The Steadicam was invented by cameraman Garrett Brown and was

introduced in 1975.

A tripod is a portable three-legged frame, used as a platform for supporting the weight and

maintaining the stability of some other object. A tripod provides stability against downward

forces and horizontal forces and movements about horizontal axes. The positioning of the three

legs away from the vertical centre allows the tripod better leverage for resisting lateral forces.

Tripods are used for both motion and still photography to prevent camera movement and provide

stability. They are especially necessary when slow-speed exposures are being made, or when

telephoto lenses are used, as any camera movement while the shutter is open will produce a

blurred image. In the same vein, they reduce camera shake, and thus are instrumental in

achieving maximum sharpness. A tripod is also helpful in achieving precise framing of the

image, or when more than one image is being made of the same scene, for example when

bracketing the exposure. Use of a tripod may also allow for a more thoughtful approach to

photography. For all of these reasons, a tripod of some sort is often necessary for professional


photography. In relation to film/video, use of the tripod offers stability within a shot as well as

certain desired heights. The use of a tripod within film/video is often a creative choice of the

Director.


Topic 88

The Video camera- A Comparison

High-definition, sometimes abbreviated as hi-def or HD, commonly refers

to an increase in display or visual resolution over a previously used standard.

Visual technologies

• HD DVD, discontinued optical disc format


• HD Photo, former name for the JPEG XR image file format
• HDV, format for recording high-definition video onto magnetic tape
• HiDef, 24 frames-per-second digital video format
• High-Definition Multimedia Interface (HDMI), all-digital audio/video
interface capable of transmitting uncompressed streams
• High-definition television (HDTV), television signals and apparatus with
higher resolution than their contemporary counterparts
• High-definition video, used in HDTV broadcasting, digital film, and
computer HD video file formats

High-definition video is video of higher resolution and quality than standard-

definition. While there is no standardized meaning for high-definition, generally

any video image with considerably more than 480 horizontal lines (North America)

or 576 horizontal lines (Europe) is considered high-definition. 720 scan lines is

generally the minimum even though the majority of systems greatly exceed that.
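As a minimal sketch of the thresholds just quoted (the region labels and the simple greater-than test are assumptions; the text notes that HD generally means "considerably more" lines):

```python
# Illustrative sketch: classify a video signal as SD or HD by its line count,
# using the figures quoted above (480 lines North America, 576 lines Europe).

def definition(lines, region="NA"):
    threshold = 480 if region == "NA" else 576
    return "HD" if lines > threshold else "SD"

print(definition(480))          # SD
print(definition(720))          # HD
print(definition(576, "EU"))    # SD
print(definition(1080, "EU"))   # HD
```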


Images of standard resolution captured at rates faster than normal (60

frames/second in North America, 50 fps in Europe) by a high-speed camera may be

considered high-definition in some contexts. Television series shot on high-

definition video are made to look as if they have been shot on film, a technique

which is often known as filmizing.

HD in filmmaking

Film as a medium has inherent limitations, such as difficulty of viewing footage

while recording, and it suffers other problems caused by poor film

development/processing or poor monitoring systems. Given that there is

increasing use of computer-generated or computer-altered imagery in movies, and

that editing picture sequences is often done digitally, some directors have shot their

movies using the HD format via high-end digital video cameras. While the quality

of HD video is very high compared to SD video, and offers improved signal/noise

ratios against comparable sensitivity film, film remains able to resolve more image

detail than current HD video formats. In addition some films have a wider dynamic

range (ability to resolve extremes of dark and light areas in a scene) than even the

best HD cameras. Thus the most persuasive arguments for the use of HD are

currently cost savings on film stock and the ease of transfer to editing systems for

special effects.


Depending on the year and format in which a movie was filmed, the exposed

image can vary greatly in size. Sizes range from as big as 24 mm × 36 mm for

VistaVision/Technirama 8 perforation cameras (same as 35 mm still photo film)

going down through 18 mm × 24 mm for Silent Films or Full Frame 4 perforations

cameras to as small as 9 mm × 21 mm in Academy Sound Aperture cameras

modified for the Techniscope 2 perforation format. Movies are also produced using

other film gauges, including 70 mm films (22 mm × 48 mm) or the rarely used 55

mm and CINERAMA.

The four major film formats provide pixel resolutions (calculated from pixels per
millimeter) roughly as follows:

1. Academy Sound (Sound movies before 1955): 15 mm × 21 mm (1.375) =


2,160 × 2,970
2. Academy camera US Widescreen: 11 mm × 21 mm (1.85) = 1,605 × 2,970
3. Current Anamorphic Panavision ("Scope"): 17.5 mm × 21 mm (2.39) =
2,485 × 2,970
4. Super-35 for Anamorphic prints: 10 mm × 24 mm (2.39) = 1,420 × 3,390

In the process of making prints for exhibition, this negative is copied onto other
film (negative → interpositive → internegative → print) causing the resolution to
be reduced with each emulsion copying step and when the image passes through a
lens (for example, on a projector). In many cases, the resolution can be reduced
down to 1/6 of the original negative's resolution (or worse). Note that resolution
values for 70 mm film are higher than those listed above.
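The listed resolutions can be reproduced approximately from the gate dimensions by assuming a scan density of roughly 141 pixels per millimetre (derived here from the 21 mm → 2,970-pixel dimension; the exact density is an assumption, and the results land within a few percent of the figures above):

```python
# Illustrative sketch: approximate pixel resolutions of the four film formats
# from their gate dimensions, assuming a scan density of ~141.4 px/mm
# (derived from 2970 px / 21 mm; the exact density is an assumption).

PX_PER_MM = 2970 / 21  # ~141.4

formats = {
    "Academy Sound":         (15, 21),
    "Academy US Widescreen": (11, 21),
    "Anamorphic Panavision": (17.5, 21),
    "Super-35 (anamorphic)": (10, 24),
}

for name, (h_mm, w_mm) in formats.items():
    h_px, w_px = round(h_mm * PX_PER_MM), round(w_mm * PX_PER_MM)
    print(f"{name}: {h_px} x {w_px}")
```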


Topic 89

Studio Camera

Studio Cameras

The studio television camera is the beginning of the video signal. It is here that

visible light is transformed or transduced into electrical energy. The video signal

remains in the form of electrical energy, either analog or digital, for most of the

remaining process until a picture monitor (TV set) converts the electrical signal

back into visible light. The principal parts of the studio camera are: the camera

head (including lens, imaging device, and viewfinder), the camera mount, and the

studio pedestal.

The Camera

Lens: The external optics are designed to collect and focus the light onto the face of

the imaging device. The lens contains focusing, focal length, and aperture controls.

The first two controls are made by the camera operator at the camera head, and the

aperture control is typically made by the video engineer at the CCU. Studio

cameras at KTSC-TV have servo controls for zoom, and manual controls for focus.

The servo zoom control, which provides smooth and variable speed zooms with a

little practice, is located on the right pan handle while the focus control is located


on the left pan handle. NOTE: On a properly maintained camera and lens, focus

should be set with the lens set to maximum focal length. Once set, the lens will

maintain accurate focus throughout the zoom range as long as the distance between

subject and lens does not change.

Imaging Devices: The internal optics, including the beam splitter, are housed in

the camera body. KTSC-TV's Hitachi Z-One B cameras employ CCD (Charge-

Coupled Device) imaging devices and are immune to the problem of image

retention and burn-in.

View Finder: The monochrome (black and white) monitor on top of the camera

head is your window on the world. And while it provides no information about the

colors being reproduced, it is an accurate display for the purpose of framing, focus

and composition. The angle of the VF is adjustable to provide optimum viewing

regardless of the height of the camera or the height of the operator. The VF has

contrast and brightness controls and should be adjusted for your particular

situation. These controls do not in any way affect the video output of the camera.

The Camera Mount

The camera is attached to a head which is in turn attached to the camera support--

in our case a tripod and dolly combination. Types of professional camera heads

include cam heads and fluid heads. Both allow for smooth pans and tilts. However,


the smoothness of these movements is determined in part by the operator's

proficiency and muscular coordination. Hours of practice are necessary before one

can be fully proficient with camera moves worthy of "on-air" service. Please be

aware of the location and use of the pan and tilt locks and tension adjustments.

Never try to operate the camera head with the locks engaged, or with the tension

adjustments tightened. Whenever the operator is at the camera, both the pan and tilt

adjustments should be unlocked and loose enough so that the camera movements

can be executed smoothly and quickly according to the director's wishes. Before

the operator leaves the camera, even for a moment, the pan and tilt should be

locked securely. Please follow these directions carefully!

Movement

1. Primary: movement of the subject(s) in front of the camera

2. Secondary: movement of the camera

3. pan: horizontal movement of the camera head

4. tilt: vertical movement of the camera head

5. pedestal: raising or lowering of the camera head

6. truck: pedestal movement left or right (in relation to the subject)

7. dolly: pedestal movement forward or back (in relation to the subject)

8. arc: pedestal movement around a subject, retaining a fixed distance from the

subject

9. Tertiary: movement caused by a sequence of camera shots or transitions, e.g.

cuts, dissolves, fades, wipes, etc.

Camera Operation

Before the Shoot: Check out your headset, make sure that the intercom is working

and that the Director or TD knows that you are on camera.

Unlock the camera head and adjust the pan, and tilt drag (aka tension). Never use

the drag controls to lock down the camera!

If you don't have a cable puller assigned to your camera, make sure that you have

enough cable to reach your positions and that it is coiled neatly out of the way.

Check with the video engineer to uncap the camera. Focus and set your viewfinder

adjustments.

Practice zooming and setting focus--get a feel for the mechanical or servo controls.

If you have a shot sheet, rehearse your shots and moves. Check to see that your

TelePrompTer (if you have one) is working. Always lock your camera and

physically cap the lens before leaving it.


During the Shoot

Unlock the camera head and make sure that the adjustments are correct. The

camera should never be operated with the pan and tilts locks engaged!

Preset the focus once you're in position. Unless you're on air, always set your focus

with the lens in its maximum focal length position.

Make sure that your wheels are set for planned dolly or trucking moves. If you

have a difficult move, have the FD or a floor assistant help you.

Be aware of other objects, people, activities around you--e.g. other cameras, mic

booms, monitors (don't stand between it and the talent), FD (don't run into

him/her), props, light stands, etc. Keep your eyes on the viewfinder and be looking

for your next shot--help the director but try not to "out-direct" him/her.

Be aware of your tally light. Anticipate and get to your next shot quickly.

Mark critical camera positions on the studio floor with tape.

Use your talk back mic only in emergencies. Listen to the directions given to other

cameras as well as your own. Use the external switch to view the "on-air" camera

and try to match shots when appropriate.


After the Shoot

Wait for the "wrap" signal to lock down your camera. Cap the lens

Move your camera to its storage location and coil the cable neatly on the wall

hangers in a figure-eight wrap. Assist with other studio wrap procedures.


Topic 90

Practices of Photography

In film and video, a cutaway shot is the interruption of a continuously filmed action by inserting

a view of something else. It is usually, although not always, followed by a cut back to the first

shot, when the cutaway avoids a jump cut. The cutaway shot does not necessarily contribute any

dramatic content of its own, but is used to help the editor assemble a longer sequence. For this

reason, editors choose cutaway shots related to the main action, such as another action or object

in the same location. For example, if the main shot is of a man walking down an alley, possible

cutaways may include a shot of a cat on a nearby dumpster or a shot of a person watching from a

window overhead.

Similarly, a cutaway scene is the interruption of a scene with the insertion of another scene,

generally unrelated or only peripherally related to the original scene. The interruption is usually

quick, and is usually, although not always, ended by a return to the original scene. The effect is

one of commentary on the original scene, frequently comic in nature.

The most common use of cutaway shots in dramatic films is to adjust the pace of the main

action, to conceal the deletion of some unwanted part of the main shot, or to allow the joining of

parts of two versions of that shot. For example, a scene may be improved by cutting a

few frames out of an actor's pause; a brief view of a listener can help conceal the break. Or the

actor may fumble some of his lines in a group shot; rather than discarding a good version of the

shot, the director may just have the actor repeat the lines for a new shot, and cut to that alternate

view when necessary.


Cutaways are also used often in older horror films in place of special effects. For example, a shot

of a zombie getting its head cut off may, for instance, start with a view of an axe being swung

through the air, followed by a close-up of the actor swinging it, then followed by a cut back to

the now severed head. George A. Romero, creator of the Dead Series, and Tom Savini pioneered

effects that removed the need for cutaways in horror films.

In news broadcasting and documentary work, the cutaway is used much as it would be in fiction.

On location, there is usually just one camera to film an interview, and it's usually trained on the

interviewee. Often there is also only one microphone. After the interview, the interviewer will

usually repeat his questions while he himself is being filmed, pausing as if to

listen to the answers. These shots can be used as cutaways. Cutaways to the interviewer,

called noddies, can also be used to cover cuts.

A jump cut is a cut in film editing in which two sequential shots of the same subject are taken

from camera positions that vary only slightly. This type of edit gives the effect of jumping

forwards in time. It is a manipulation of temporal space using the duration of a single shot, and

fracturing the duration to move the audience ahead. This kind of cut abruptly communicates the

passing of time as opposed to the more seamless dissolve heavily used in films predating Jean-

Luc Godard's Breathless, when jump cuts were first used extensively. For this reason, jump cuts,

while not seen as inherently bad, are considered a violation of classical continuity editing, which

aims to give the appearance of continuous time and space in the story-world by de-emphasizing

editing. Jump cuts, in contrast, draw attention to the constructed nature of the film.

A match cut, also called a graphic match (or, in the French term, raccord), is a cut in film

editing between either two different objects, two different spaces, or two different compositions


in which objects in the two shots graphically match, often helping to establish a strong continuity

of action and linking the two shots metaphorically.

Cutting on action or matching on action refers to film editing and video editing techniques

where the editor cuts from one shot to another view that matches the first shot's action. Although

the two shots may have actually been shot hours apart from each other, cutting on action gives

the impression of continuous time when watching the edited film. By having a subject begin an

action in one shot and carry it through to completion in the next, the editor creates a visual

bridge, which distracts the viewer from noticing the cut or noticing any slight continuity error

between the two shots. A variant of cutting on action is a cut in which the subject exits the frame

in the first shot and then enters the frame in the subsequent shot. The entrance in the second shot

must match the screen direction and motive rhythm of the exit in the first shot.


Topic 91

5 C’s of Cinematography- Camera Angles


Camera Angles

Camera angles and movements combine to create a sequence of images, just as words, word
order and punctuation combine to make the meaning of a sentence. You need a straightforward
set of key terms to describe them.

Describing Shots

When describing camera angles, or creating them yourself, you have to think about three
important factors

— The FRAMING or the LENGTH of shot

— The ANGLE of the shot

— If there is any MOVEMENT involved

When describing different cinematic shots, different terms are used to indicate the amount of
subject matter contained within a frame, how far away the camera is from the subject, and the
perspective of the viewer. Each different shot has a different purpose and effect. A change
between two different shots is called a CUT.

Framing or Shot Length

1. Extreme Long Shot

Extreme Long Shot

This can be taken from as much as a quarter of a mile away, and is generally used as a scene-
setting, establishing shot. It normally shows an EXTERIOR, eg the outside of a building, or a
landscape, and is often used to show scenes of thrilling action eg in a war film or disaster movie.


There will be very little detail visible in the shot, it's meant to give a general impression rather
than specific information.

The extreme long shot on the left is taken from a distance, but denotes a precise location - it
might even connote all of the entertainment industry if used as the opening shot in a news story.

2. Long Shot

This is the most difficult to categorise precisely, but is generally one which shows the image as
approximately "life" size ie corresponding to the real distance between the audience and the
screen in a cinema (the figure of a man would appear as six feet tall). This category includes the
FULL SHOT showing the entire human body, with the head near the top of the frame and the
feet near the bottom. While the focus is on characters, plenty of background detail still emerges:
we can tell the coffins on the right are in a Western-style setting, for instance.

Long Shot

3. Medium Shot

Contains a figure from the knees/waist up and is normally used for dialogue scenes, or to show
some detail of action. Variations on this include the TWO SHOT (containing two figures from
the waist up) and the THREE SHOT (contains 3 figures...). NB. Any more than three figures and
the shot tends to become a long shot. Background detail is minimal, probably because location
has been established earlier in the scene - the audience already know where they are and now
want to focus on dialogue and character interaction. Another variation in this category is the
OVER-THE-SHOULDER-SHOT, which positions the camera behind one figure, revealing the
other figure, and part of the first figure's back, head and shoulder.


Medium Shot

4. Close-Up

This shows very little background, and concentrates on either a face, or a specific detail of mise
en scène. Everything else is just a blur in the background. This shot magnifies the object (think
of how big it looks on a cinema screen) and shows the importance of things, be it words written
on paper, or the expression on someone's face. The close-up takes us into the mind of a character.
In reality, we only let people that we really trust get THAT close to our face - mothers, children
and lovers, usually - so a close up of a face is a very intimate shot. A film-maker may use this to
make us feel extra comfortable or extremely uncomfortable about a character, and usually uses a
zoom lens in order to get the required framing.

Close up

5. Extreme Close-Up

As its name suggests, an extreme version of the close up, generally magnifying beyond what the
human eye would experience in reality. An extreme close-up of a face, for instance, would show

only the mouth or eyes, with no background detail whatsoever. This is a very artificial shot, and
can be used for dramatic effect. The tight focus required means that extra care must be taken
when setting up and lighting the shot - the slightest camera shake or error in focal length is very
noticeable.

Extreme Close Up


Topic 92

5 C’s of Cinematography – Continuity

Continuity is the characteristic of a scene whereby the action seems fluid and

continuous, even though it is composed of a number of shots. There are many ways

that continuity can be broken -- which can be noticeable and therefore distracting

to an audience. For example, if the hero's clothes are dirty and bloody as he is

walking through the doorway, but clean as he emerges from the building, that is a

continuity error. Another kind of continuity error can be caused by poor editing.

For example, a character might move to a chair and sit down in a long shot, and

then we cut to a close-up and see the end of the character's sitting movement.

Depending on how these two shots are edited, it will either look like a continuous

motion (good continuity), or you might see repeated action or a gap in action (poor

continuity). Many people enjoy picking out continuity errors in movies.

Continuity errors are often the result of cutting for performance, where the editor

pieces together shots that form the desired feel of the scene with little or no

attention paid to background objects or actions that cause the errors.

Check continuity between a live camera and a previously recorded clip

To check continuity between a live camera and a recorded clip, display a frame

from the clip in the split region, and make the camera the active source.


1. Scrub to an appropriate frame in a recorded clip.

2. In the Field Monitor, click the Split Screen button.

3. To switch the active source to the live camera feed, click Stop, or press the

Esc key.

Check continuity between multiple cameras

The Split Screen option is useful for comparing and calibrating multiple cameras to

give video from all of them a common appearance.

1. Plug both cameras into the computer, adjust the manual settings on one of

them, and record a small clip to the hard drive.

2. Switch to the second camera, and enable the Split Screen option between the

recorded clip from camera one and the live feed from camera two.

3. Adjust iris, white balance, and other settings so that the image from the

second camera has good continuity with the image from the first.


Topic 93

5C’s of Cinematography- Cutting

A non-linear editing system (NLE) is a video or audio editing system, such as a digital audio workstation (DAW) for sound, that performs non-destructive editing on source material. The name stands in contrast to 20th-century methods of linear video editing and film editing.

In digital video editing, non-linear editing is a method that allows you to access any frame in a digital video clip regardless of its sequence in the clip. The freedom to access any frame, combined with a cut-and-paste method similar to the ease of cutting and pasting text in a word processor, allows you to easily include fades, transitions, and other effects that cannot be achieved with linear editing.

Linear and non-linear editing.

Non-linear editing is the most natural approach when all assets are available as files on video

servers or hard disks, rather than recordings on reels or tapes—while linear editing is tied to the need

to sequentially view film or hear tape.

Metadata

When ingesting audio or video feeds, metadata are attached to the clip. Metadata can be attached automatically (timecode, localization, take number, name of the clip) or manually (players' names, characters, or, in sports, events such as a red card or a goal).

Direct access

Non-linear editing enables direct access to any video frame in a digital video clip, without needing to

play or scrub/shuttle through adjacent footage to reach it, as was necessary with historical video

tape linear editing systems. It is now possible to access any frame by entering directly the time


code or the descriptive metadata. An editor can, for example at the end of the day in the Olympic

Games, ask to retrieve all the clips related to the players who received a gold medal.
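
The time-code lookup described here is simple arithmetic: a non-drop-frame timecode HH:MM:SS:FF maps to a single absolute frame index, which is what lets an NLE jump straight to any frame. A minimal Python sketch (the 25 fps frame rate and the function names are illustrative assumptions; drop-frame timecode is not handled):

```python
def timecode_to_frame(tc: str, fps: int = 25) -> int:
    """Convert a non-drop-frame timecode 'HH:MM:SS:FF' to an absolute frame index."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff


def frame_to_timecode(frame: int, fps: int = 25) -> str:
    """Inverse mapping: absolute frame index back to 'HH:MM:SS:FF'."""
    ff = frame % fps
    total_seconds = frame // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"
```

At 25 fps the timecode 01:00:00:00 is frame 90000, and the two functions invert each other, which is the property a frame-accurate "go to timecode" command relies on.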

Basic techniques

The NLE method is similar in concept to the "cut and paste" techniques used in film editing or in IT.

However, with the use of non-linear editing systems, the destructive act of cutting of film negatives

is eliminated. It can also be viewed as the audio/video equivalent of word processing, which is why

it is called desktop video editing in the consumer space.

Accessing the material

A non-linear editing system retrieves video media for editing. Because these media exist on the video

server or other mass storage that stores the video feeds in a given codec, the editing system can use

several methods to access the material:

• Direct access: the video server records feeds with a codec readable by the editing system,

has an Ethernet connection and allows direct editing. The editor previews material

directly on the server (which it sees as remote storage) and edits directly on the server

without transcoding or transfer. This method is new.

• Shared storage: the video server transfers feeds to and from shared storage that is accessible by all editors. Media already stored in the appropriate codec need only be transferred; if recorded with a different codec, media must be transcoded during transfer. In some cases (depending on the material), files on shared storage can be edited even before the transfer is finished.

• Importing: the editor downloads the material and edits it locally. This method can be combined with the previous methods.


Topic 94

5C’s of Cinematography- Close ups

Close-up

a photograph or part of a film in which the camera seems to have been very close to the subject it captured

What is close up photography?

Close-up photography is the act of photographing objects such as flowers or insects at close range so that the subject fills the frame. In other words, it is the act of photographing subjects close up. This is easily achievable with any lens, even a 300mm telephoto lens.

Close-up photos pull you directly into a subject so you can examine its details from a unique

perspective. A close-up tends to focus on a specific thing—an insect, a plant, a flower, or a face,

for example. Or it can highlight something we don't usually pay much attention to, but which

turns out to be captivating, dramatic, or revealing when intimately observed.

Close-up photos can tell a powerful story in a single shot: Taking a photo of a person's weathered

hands, for example, might be a way to convey the fact that they have worked hard all their life.

Close-up vs. macro

Often we hear the word macro used in reference to—or even interchangeably with—close-up

photography. But there is a key difference. A close-up is an image shot at close range, where the

subject is isolated from its environment. Any camera and lens can shoot a close-up. A macro


photograph, however, is an extreme close-up that portrays the subject as life-size or greater-than-life-size.

Macro photos are characterized by both closeness and magnification. If you wanted to

photograph the details of an insect’s eyes, for example, you would take a macro photograph.

A macro photo is generally expressed as a ratio—a 1:1 ratio is when the image is life-size. To

take a high-quality macro shot, you must use a special macro lens whose performance is

specifically geared to close-focus shooting. A normal lens can't focus when it's very close to the

subject and thus can't take an image at a ratio greater than 1:1. A macro lens, however, can focus

when positioned very close to the subject, allowing it to achieve greater-than-life-

size magnification, a shallower depth of field, and thus clearer focus on tiny details.
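
A reproduction ratio is just the size of the image on the sensor divided by the real size of the subject. A small sketch of that arithmetic (the function names are hypothetical, not part of any camera API):

```python
def reproduction_ratio(image_size_mm: float, subject_size_mm: float) -> float:
    """Magnification m: size of the subject's image on the sensor
    divided by the subject's real size."""
    return image_size_mm / subject_size_mm


def is_macro(magnification: float) -> bool:
    """By the convention described above, 'macro' means life-size (1:1) or greater."""
    return magnification >= 1.0
```

A 10 mm subject rendered 10 mm across on the sensor gives m = 1.0, a true 1:1 macro; rendered 5 mm across it gives m = 0.5, a close-up but not a macro.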

Equipment

If you're aiming for high-quality macro shots, then consider investing in a dedicated macro lens.

Almost all manufacturers of DSLR cameras offer a variety of lenses, including macros ranging

from short (30mm to 60mm) to medium (60mm to 105mm) to tele macro (105mm to 200mm).

However, for regular close-ups, zoom lenses like a 55mm to 200mm or a 70mm to 300mm lens

will work well. Even a fixed 50mm lens with an f/1.8 aperture can produce some nice close-ups.

Macro mode

Certain point-and-shoot cameras or DSLRs let you switch into macro mode simply by turning

the dial to a macro setting (usually a tulip symbol). This allows you to focus at a very short

distance from the subject. The quality of this macro setting, however, is very different from the


quality you get when you use a dedicated macro lens. A camera's macro setting will not shoot a

subject so that it appears greater than life-size.

Focus and composition

For a great close-up, isolate your subject from its background by using a shallow depth of field

(set the aperture to a low number) and/or picking a nondistracting background, if possible. Focus

carefully and pick a specific focus point so your subject comes out looking sharp against a softer

background. If you use a camera or lens with autofocus, make sure the lens is focusing on the

object you want. Without a macro lens, you may have trouble focusing precisely, but you can

remedy this by moving the camera a bit farther away from the subject. If you are using a zoom

lens, then move back and zoom into your subject.
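
The interplay of aperture and sharpness at close range can be put in numbers: a commonly quoted close-up approximation gives the total depth of field as 2·N·c·(m+1)/m², where N is the f-number, c the circle of confusion and m the magnification. A sketch under those assumptions (0.03 mm is a typical full-frame value for c):

```python
def closeup_dof_mm(f_number: float, coc_mm: float, magnification: float) -> float:
    """Approximate total depth of field (in mm) for close-up work:
    DoF ~= 2 * N * c * (m + 1) / m**2."""
    n, c, m = f_number, coc_mm, magnification
    return 2 * n * c * (m + 1) / m ** 2
```

At 1:1 and f/2.8 this gives only about a third of a millimetre of sharp depth, which is why the focus point must be picked so carefully; stopping down to f/11 widens it to a little over one millimetre.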

Lighting and image stability

A common problem with close-ups is that if your light source is behind the camera, it will cast a shadow over the subject. Fix this by using a flash or other off-camera lighting: taking the strobe off the camera avoids flattening the image and eliminates the shadow cast by the camera's own illumination, although any remaining on-camera flash can still throw a slight shadow into the background.

Keeping images sharp

Another common problem with close-up photography is image blur. The most common cause of

image blur is the lens’s inability to focus at such close proximity to the subject. To prevent that,

first switch your camera to the macro setting (if it has one) and try again. If that fails, move the

camera a little further away from the subject, or if you're using a zoom lens, back up and zoom


into the subject. Image blur can also be caused by slow shutter speed, low light, or a moving

subject. To prevent this kind of blurring, set your camera up on a tripod or raise your shutter

speed.
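
The "raise your shutter speed" advice is often quantified by the reciprocal rule of thumb: handheld shots tend to stay sharp when the exposure is no longer than one over the (crop-adjusted) focal length. A sketch of that guideline (it is a rule of thumb, not a guarantee, and the function name is hypothetical):

```python
def min_handheld_shutter(focal_length_mm: float, crop_factor: float = 1.0) -> float:
    """Reciprocal rule of thumb: slowest 'safe' handheld shutter speed,
    in seconds, is roughly 1 / (focal length * crop factor)."""
    return 1.0 / (focal_length_mm * crop_factor)
```

So a 50mm lens suggests 1/50 s or faster, while a 200mm lens on a 1.5x crop body suggests 1/300 s; anything slower calls for the tripod mentioned above.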


Topic 95

5C’s of Cinematography- Composition & Complement

It’s generally a combination of elements that make a great photo – many times when we see an

image that we are drawn to, it’s the whole package that is appealing to us rather than just one

element alone. But that said, composition is one of the most important concepts to master when starting your photography journey, so it is worth getting to grips with a few photography composition tips and putting them into practice when shooting.

Define your subject:

Photography begins with a subject and that’s that. You’ve raised your camera to snap a photo

because there is something that piques your interest, because you’ve seen something that is worthy

of a photograph. Be sure to clearly define that subject before snapping. Know what the subject is,

and define it within the surroundings.

Fill the frame

So you know what the subject is, but where does it fit within the frame? For the most part the subject

should fill the frame. That is, your subject should have a clearly dominant place within the frame and

should clearly stand out from the background. You can easily fill the frame by zooming in digitally

or using the “get closer” principle. However you do it, be sure that your subject sits prominently

within the composition.

Evoke interest & emotions

A great photo tells a story with a single glance. Your viewer should know almost immediately what

is happening in the photo and how the subject fits into that story. People are naturally drawn to


something that makes them stop and think, or something they can relate to so the clearer the story or

the emotion, the easier it is for them to relate to your image. Draw your viewers in with emotion, and

keep them interested with a great story.

Balance the elements

For a visually pleasing composition, try to balance the elements of your image. You could use the

rule of thirds as a guideline in this instance, ensuring that small elements and large elements balance

each other out by falling on the dividing lines. You can also create balance by off-setting the elements, creating a harmonious composition by utilizing the empty space.
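
The rule of thirds mentioned here divides the frame with two horizontal and two vertical lines; strong elements are placed along those lines or at their four intersections. A quick sketch that computes the intersections for a given frame size:

```python
def thirds_points(width: int, height: int):
    """The four rule-of-thirds intersections for a frame of the given pixel size."""
    xs = (width // 3, 2 * width // 3)
    ys = (height // 3, 2 * height // 3)
    return [(x, y) for x in xs for y in ys]
```

For a 1920x1080 frame the intersections fall at (640, 360), (640, 720), (1280, 360) and (1280, 720), which is where many cameras draw their grid overlay.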

Work with the light

Composition and light are usually considered two different elements of a great photograph, but by using the light to complement your composition you will boost the quality of your photo exponentially. To do this, be aware of the direction of your light source. Envision how the light will complement or detract from the subject, and be calculated in the way you choose to shoot the situation. Play with the light and find unique angles to harness it for your composition. Take your photos to the next level by working with the light!


Topic 96

Camera Angles

Camera Angles

The relationship between the camera and the object being photographed (ie the ANGLE) gives

emotional information to an audience, and guides their judgment about the character or object in

shot. The more extreme the angle (ie the further away it is from eye level), the more symbolic and

heavily-loaded the shot.

1. The Bird's-Eye view

This shows a scene from directly overhead, a very unnatural and strange angle. Familiar objects

viewed from this angle might seem totally unrecognisable at first (umbrellas in a crowd, dancers'

legs). This shot does, however, put the audience in a godlike position, looking down on the

action. People can be made to look insignificant, ant-like, part of a wider scheme of things.

Hitchcock (and his admirers, like Brian de Palma) is fond of this style of shot.

A camera man, raised above the action, gets a high-angle shot

2. High Angle

Not so extreme as a bird's eye view. The camera is elevated above the action using a crane to

give a general overview. High angles make the object photographed seem smaller, and less

significant (or scary). The object or character often gets swallowed up by their setting - they

become part of a wider picture.

3. Eye Level

A fairly neutral shot; the camera is positioned as though it is a human actually observing a scene,

so that eg actors' heads are on a level with the focus. The camera will be placed approximately

five to six feet from the ground.

4. Low Angle

These increase height (useful for short actors like Tom Cruise or James McAvoy) and give a

sense of speeded motion. Low angles help give a sense of confusion to a viewer, of

powerlessness within the action of a scene. The background of a low angle shot will tend to be

just sky or ceiling, the lack of detail about the setting adding to the disorientation of the viewer.

The added height of the object may make it inspire fear and insecurity in the viewer, who is

psychologically dominated by the figure on the screen.

5. Oblique/Canted Angle

Sometimes the camera is tilted (ie is not placed horizontal to floor level), to suggest imbalance,

transition and instability (very popular in horror movies). This technique is used to suggest


POINT-OF-VIEW shots (ie when the camera becomes the 'eyes' of one particular character, seeing what they see); a hand-held camera is often used for this.

Source: http://www.mediaknowall.com/camangles.html


Topic 97

Camera Movements

Camera Movement

A director may choose to move action along by telling the story as a series of cuts, going from

one shot to another, or they may decide to move the camera with the action. Moving the camera

often takes a great deal of time, and makes the action seem slower, as it takes several seconds for

a moving camera shot to be effective, when the same information may be placed on screen in a

series of fast cuts. Not only must the style of movement be chosen, but the method of actually

moving the camera must be selected too. There are seven basic methods:

1. Pans

A movement which scans a scene horizontally. The camera is placed on a tripod, which operates

as a stationary axis point as the camera is turned, often to follow a moving object which is kept

in the middle of the frame.

2. Tilts

A movement which scans a scene vertically, otherwise similar to a pan.

3. Dolly Shots

Sometimes called TRUCKING or TRACKING shots. The camera is placed on a moving vehicle

and moves alongside the action, generally following a moving figure or object. Complicated


dolly shots will involve a track being laid on set for the camera to follow, hence the name. The

camera might be mounted on a car, a plane, or even a shopping trolley (good method for

independent film-makers looking to save a few dollars). A dolly shot may be a good way of

portraying movement, the journey of a character for instance, or for moving from a long shot to a

close-up, gradually focusing the audience on a particular object or character.

4. Hand-held shots

The hand-held movie camera first saw widespread use during World War II, when news

reporters took their windup Arriflexes and Eyemos into the heat of battle, producing some of the

most arresting footage of the twentieth century. After the war, it took a while for commercially

produced movies to catch up, and documentary makers led the way, demanding the production of

smaller, lighter cameras that could be moved in and out of a scene with speed, producing a "fly-

on-the-wall" effect. This aesthetic took a while to catch on with mainstream Hollywood, as it

gives a jerky, ragged effect, totally at odds with the organized smoothness of a dolly shot. The

Steadicam (a heavy contraption which attaches a camera to an operator by a harness; the camera is stabilized so it moves independently) was debuted in Marathon Man (1976), bringing a

new smoothness to hand held camera movement and has been used to great effect in movies and

TV shows ever since. No "walk and talk" sequence would be complete without one. Hand held

cameras denote a certain kind of gritty realism, and they can make the audience feel as though

they are part of a scene, rather than viewing it from a detached, frozen position.

5. Crane Shots

Basically, dolly-shots-in-the-air. A crane (or jib), is a large, heavy piece of equipment, but is a

useful way of moving a camera - it can move up, down, left, right, swooping in on action or


moving diagonally out of it. The camera operator and camera are counter-balanced by a heavy

weight, and trust their safety to a skilled crane/jib operator.

6. Zoom Lenses

A zoom lens contains a mechanism that changes the magnification of an image. On a still

camera, this means that the photographer can get a 'close up' shot while still being some distance

from the subject. A video zoom lens can change the position of the audience, either very quickly

(a smash zoom) or slowly, without moving the camera an inch, thus saving a lot of time and

trouble. The drawbacks to zoom use include the fact that while a dolly shot involves a steady

movement similar to the focusing change in the human eye, the zoom lens tends to be jerky

(unless used very slowly) and to distort an image, making objects appear closer together than

they really are. Zoom lenses are also drastically over-used by many directors (including those

holding palm-corders), who try to give the impression of movement and excitement in a scene

where it does not exist. Use with caution - and a tripod!
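
The magnification change a zoom lens produces can be expressed as an angle of view: FoV = 2·atan(w / 2f) for a sensor of width w and focal length f, so doubling the focal length roughly halves the visible slice of the scene. A sketch assuming a full-frame 36 mm sensor width:

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal angle of view, in degrees, for a given focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))
```

A 50mm lens sees about 40 degrees horizontally, while zooming to 200mm narrows that to about 10 degrees, which is the change of apparent position described above.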

7. The Aerial Shot

An exciting variation of a crane shot, usually taken from a helicopter. This is often used at the

beginning of a film, in order to establish setting and movement. A helicopter is like a particularly

flexible sort of crane - it can go anywhere, keep up with anything, move in and out of a scene,

and convey real drama and exhilaration — so long as you don't need to get too close to your

actors or use location sound with the shots.


Topic 98

Camera Shots

A camera shot is the amount of space that is seen in one shot or frame. Camera shots are used to

demonstrate different aspects of a film's setting, characters and themes. As a result, camera shots

are very important in shaping meaning in a film.

An extreme long shot contains a large amount of landscape. It is often used

at the beginning of a scene or a film to establish general location(setting). This is also known as

an establishing shot.

A long shot contains landscape but gives the viewer a more specific idea of

setting. A long shot may show the viewers the building where the action will take place.

A full shot contains a complete view of the characters. From this shot,

viewers can take in the costumes of characters and may also help to demonstrate the

relationships between characters

A mid shot contains the characters or a character from the waist up.

From this shot, viewers can see the characters' faces more clearly as well as their interaction with

other characters. This is also known as a social shot.


A close-up contains just one character's face. This enables viewers to

understand the actor's emotions and also allows them to feel empathy for the character. This is

also known as a personal shot.

An extreme close-up contains one part of a character's face or other object. This technique is quite common in horror films. This type of shot creates an intense mood and a sense of interaction between the viewer and the subject.

When analyzing a film you should always think about the different camera shots and why they

are being used. The next time that you are at the cinema or watching television see what camera

shots are being used.

Important: These camera shots are used in all forms of visual texts including postcards, posters

and print advertisements.

Camera angles

It is important that you do not confuse camera angles and camera shots. Camera shots are used to

demonstrate different aspects of setting, themes and characters. Camera angles are used to

position the viewer so that they can understand the relationships between the characters. These

are very important for shaping meaning in film as well as in other visual texts.

The following examples will help you to understand the differences between the different camera

angles

A bird's eye angle is an angle that looks directly down upon a scene. This

angle is often used as an establishing angle, along with an extreme long shot, to establish setting.


A high angle is a camera angle that looks down upon a subject. A

character shot with a high angle will look vulnerable or small. These angles are often used to

demonstrate to the audience the perspective of a particular character: a shot looking down from a vampire's point of view, for example, lets the viewer understand that the vampire feels powerful.

An eye-level angle puts the audience on an equal footing with the

character/s. This is the most commonly used angle in most films as it allows the viewers to feel

comfortable with the characters.

A low angle is a camera angle that looks up at a character. This is the

opposite of a high angle and makes a character look more powerful. This can make the audience

feel vulnerable and small by looking up at the character. This can help the responder feel

empathy if they are viewing the frame from another character's point of view.

As with camera shots, you will be able to see many examples of camera angles in any film or

visual text that you view. The next time that you watch television or see a film, take note of the

camera angles and think of how they affect your perception (idea) of different characters.

Another camera angle that you might come across is a Dutch angle.

A Dutch angle is used to demonstrate the confusion of a character.

An Evangelion shot is derived from the popular anime series 'Neon

Genesis Evangelion'. This camera movement begins as an extreme close-up and zooms out

abruptly, creating a blurring effect to emphasize the speed and size of the object.


Topic 99

White Balance
White balance (WB) is the process of removing unrealistic color casts, so that objects which

appear white in person are rendered white in your photo. Proper camera white balance has to

take into account the "color temperature" of a light source, which refers to the relative warmth or

coolness of white light. Our eyes are very good at judging what is white under different light

sources, but digital cameras often have great difficulty with auto white balance (AWB) — and

can create unsightly blue, orange, or even green color casts. Understanding digital white balance

can help you avoid these color casts, thereby improving your photos under a wider range of

lighting conditions.
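
As a rough illustration of what a white-balance correction does, here is the simple "gray world" method, which assumes the scene averages to neutral gray and scales each channel accordingly. This is only one naive approach, not necessarily what any particular camera's AWB implements:

```python
def gray_world_gains(avg_r: float, avg_g: float, avg_b: float):
    """Per-channel gains that equalize the channel means,
    using green as the conventional reference channel."""
    return (avg_g / avg_r, 1.0, avg_g / avg_b)


def apply_gains(pixel, gains):
    """Apply the gains to one (r, g, b) pixel, clipping to the 0-255 range."""
    return tuple(min(255, round(v * g)) for v, g in zip(pixel, gains))
```

An orange tungsten cast with channel averages of, say, 200/100/50 yields gains of (0.5, 1.0, 2.0), pulling a pixel of the average color back to neutral gray.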

A light source at roughly 5000 K produces neutral light, whereas 3000 K and 9000 K sources produce light spectrums shifted toward orange and blue wavelengths, respectively. As the color temperature rises, the color distribution becomes cooler. This may not seem intuitive, but it results from the fact that shorter wavelengths contain light of higher energy.

Why is color temperature a useful description of light for photographers, if they never deal with

true blackbodies? Fortunately, light sources such as daylight and tungsten bulbs closely mimic

the distribution of light created by blackbodies, although others such as fluorescent and most

commercial lighting depart from blackbodies significantly. Since photographers never use the

term color temperature to refer to a true blackbody light source, the term is implied to be a

"correlated color temperature" with a similarly colored blackbody.


Topic 100

Steadicam

Steadicam is a brand of camera stabilizer mount for motion picture cameras that mechanically

isolates it from the operator's movement. It allows for a smooth shot, even when moving quickly

over an uneven surface. The Steadicam was invented by cameraman Garrett Brown and was

introduced in 1975.

Before the camera stabilizing system, a director had two choices for moving (or "tracking")

shots:

The camera could be mounted on a camera dolly, a wheeled mount that rolls on tracks or leveled

boards. This is time consuming to set up, and depending on the location, impractical in many

situations.

The camera operator could hold the camera in his hands. This allows greater speed and

flexibility, but even the most skilled operator cannot entirely prevent shaking. Hand-held camera

footage has traditionally been considered suitable mostly for documentaries, news, reportage

work, live action, un-rehearsable footage, or to evoke an atmosphere of authentic immediacy or

cinéma vérité during dramatic sequences.

The operator wears a harness—the Steadicam "vest"—which is attached to an iso-elastic arm.

This is connected by a multi-axis, ultra-low-friction gimbal to the Steadicam armature, which

has the camera mounted at one end and a counterbalance weight at the other. The counterbalance


usually includes the battery pack and a monitor. The monitor substitutes for the camera's

viewfinder, since the range of motion of the camera relative to the operator makes the camera's

own viewfinder unusable. In the film industry the armature and weight are traditionally called

the "sled", as they resembled a sled in an early model of the Steadicam. The sled includes the top

"stage" where the camera is attached, the "post" which in most models can be extended, with the

monitor and batteries at the bottom to counterbalance the camera weight. This is how the

Steadicam stays upright, by simply making the bottom slightly heavier than the top, pivoting at

the gimbal. This leaves the center of gravity of the whole rig, however heavy it may be, exactly

at the operator's fingertip, allowing deft and fine control of the whole system with the lightest

of touches on the gimbal. The skill of the operator is to keep the desired framing and

composition by feathering his or her touch on the gimbal, while the rig and operator is in motion,

and, indeed, when still.

The combined weight of the counterbalance and camera means that the armature bears a

relatively high inertial mass which is not easily moved by small body movements from the

operator (much as it is difficult to quickly shake a bowling ball). The freely pivoting armature

adds additional stabilization to the photographed image, and makes the weight of the camera-sled

assembly acceptable by allowing the body harness to support it.

When the armature is correctly adjusted, the operator is able to remove their hands from the

Steadicam entirely and have the camera stay in place. During operation, the operator usually

rests his or her hand on the camera gimbal and applies force at that point to move the camera. To

avoid shaking the camera when lens adjustments must be made during the shot, a wireless

remote operated by the camera assistant is used to control focus and iris.


For low-angle shots, the camera/sled armature can be rotated vertically, putting the camera on

the bottom and the sled on the top. This is referred to as "low mode" operation.

The newest generation is the Tango, a body-supported camera-stabilization system whose horizontal mechanism makes it possible to move the camera freely while keeping it level. A Steadicam operator can change from low mode to high mode without any alteration, and movement is not limited to up and down: the camera can also travel in depth and over or through obstacles.

The smallest, lightest Steadicam which can be used with a support arm and vest is the Steadicam

Merlin. It is light enough to be hand held with cameras weighing up to about 5.5 pounds (2.5 kg),

and may carry cameras up to about 7 pounds (3.2 kg) when used with the arm. The Merlin may

be folded up and carried in comparatively small spaces such as medium-size camera bags. In its

lightest configuration, the Merlin weighs just 12.5 ounces (0.35 kg). Photographers who shoot

with HDSLR cameras that combine still and motion photography most often work with the

Merlin. Since the Merlin has no facility to carry a separate monitor, cameras suitable for it must

have their own built-in monitors.

A smaller, lighter Steadicam was introduced in 2012, called the Smoothee™. Its tubular frame

can support Apple iPhones (4 through 5S) along with GoPro cameras that have the attached

viewfinder monitor. Its target retail price was originally "under $200" and it may be purchased in

consumer camera stores. Pre-weighted, balanced iPhone and GoPro adapters modularly allow

interchange of cameras.


An even smaller, camera-specific Steadicam Curve™ is available for the GoPro cameras (2, 3

and 3+), which is made of a single curved piece of aluminum. Its target retail price is just $100,

and it, too, is available in consumer camera outlets.


Topic 101

Mock Steadicam Practicing

Achieving dynamic balance is crucial for steadicam operation to be properly executed because

it keeps the camera level during movement. A steadicam rig makes it possible for camera

movement to be achieved without using a crane or dolly, and it eliminates the shakiness of

handheld footage. It allows an operator to walk with a camera while mechanically isolating his

movements, thus creating a smooth effect. But, the rig needs to be properly balanced in order for

this to happen.

The Parts of a Steadicam Rig

In order to properly balance a steadicam rig, you must first understand its parts. The center and

most crucial aspect of the steadicam is the post. At the top of the post is the stage. This is where

the camera gets mounted. There are also knobs on the stage that are used for adjusting the

camera mount for balancing, but there will be more on that later.

At the bottom of the post is the monitor clamp and the battery mount. The gimbal and gimbal

handle are located at the center of the post. The gimbal handle attaches the rig to a vest that the

operator wears. When the operator is not working with the camera, he can remove the gimbal

handle from his vest to detach the rig.


Achieving Dynamic Balance

In order for the operator to use the rig, it needs to be properly balanced. This begins with the

mounting of the camera. You want to find where the camera's center of gravity is. You then want

to mount the camera so that the center of gravity is positioned slightly behind the center of the

post. The positioning of the camera can be adjusted with the knobs on the stage.

The steadicam needs to be balanced on three axes: top to bottom, side to side, and forward to aft. If one of these axes is off balance, then the operator will have difficulty operating the rig.

Once the camera has been mounted, you can add the monitor and batteries to the bottom of the

post to balance out the weight between the top and the bottom. When adding these, you want to

keep them centered and balanced so that the weight distribution is equal in regards to side to side

and forward to aft.

If the rig feels bottom or top heavy, you can adjust the gimbal to balance the load. If that is not

enough, then there are mounts on the rig where weights can be added.

Once dynamic balance is achieved, your shots used with the rig will have no problem staying

level during movement.
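The balancing procedure above amounts to keeping the rig's combined center of gravity on (or very near) the post. A minimal sketch of that check, with purely hypothetical masses and offsets measured from the center of the post:

```python
# Sketch: checking a Steadicam rig's side-to-side / fore-aft balance by
# computing the combined center of mass of its components.
# All masses (kg) and offsets (cm) below are hypothetical examples.

def center_of_mass(components):
    """components: list of (mass_kg, x_offset_cm, y_offset_cm) tuples."""
    total = sum(m for m, _, _ in components)
    x = sum(m * dx for m, dx, _ in components) / total
    y = sum(m * dy for m, _, dy in components) / total
    return x, y

rig = [
    (2.5, 0.5, 0.0),   # camera, mounted slightly behind post center
    (0.6, -1.0, 0.0),  # monitor at the bottom of the post
    (0.8, 0.2, 0.0),   # battery at the bottom of the post
]

x, y = center_of_mass(rig)
balanced = abs(x) < 1.0 and abs(y) < 1.0  # within 1 cm of the post
```

If `balanced` comes out false, the same arithmetic tells you which way to nudge the stage knobs or trim weights.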


Topic 102

Storyboarding for Camera

Storyboarding is a great way to plan and visualize your project before you go out and start

filming. It can even be helpful to you once you start editing your project. Through storyboarding,

you can plan what you’re going to be filming in your shots, the angles and sizes of your shots,

the order of your shots and your camera movement to achieve different feelings.

Your completed storyboard will assist you throughout the entire production process – from pre-production to post-production. Here’s how:

Planning Saves Time

Storyboarding allows you to sit down and plan out all of your shots beforehand, so you don’t

have to make all of those decisions on-the-fly when you’re out filming and have limited time.

When you make the time to storyboard, you will spend more time planning each shot, which is

going to make your project look much better-composed in the end. This tutorial will describe the

different types of shots, shot angles, filming a conversation and camera movement so that you

can make educated decisions when planning your shots.

Sharing Your Vision

The storyboard will be particularly helpful to your camera person, because then they can see a

visualization of what you have in mind for each shot, as opposed to just trying to interpret a

description.


Film More Efficiently

When you’re filming, you don’t have to film your shots in the exact order that they appear on

your storyboard. You can film your shots in whichever order is most convenient for you. The

storyboard can also then act like a checklist, so you know that you’re not forgetting to shoot any

important shots.

Edit More Effectively

Your storyboard will also be helpful when you go to edit and have to rearrange your shots in the

correct order.

Get Started

You can print out our Storyboarding Template to draw out your shots. (It doesn’t have to be

pretty: break out those stick figures!)

Include In Your Storyboards

Shot Sizes

There are various different shot sizes that you can use. Some simply make it easier for the

audience to follow the action, while others have symbolic meaning that implies something

about the characters and/or plot. Additionally, keep the Rule of Thirds in mind when

planning the framing of your shots!
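The Rule of Thirds grid mentioned above can be computed for any frame size: two vertical and two horizontal lines at the one-third marks, whose four intersections ("power points") are the strongest places to put a subject. The 1920x1080 frame below is an assumed example:

```python
# Sketch: rule-of-thirds grid lines and their four intersection points
# for a given frame size (integer pixel positions).

def thirds_grid(width, height):
    xs = [width // 3, 2 * width // 3]    # vertical grid lines
    ys = [height // 3, 2 * height // 3]  # horizontal grid lines
    points = [(x, y) for x in xs for y in ys]  # the four "power points"
    return xs, ys, points

xs, ys, points = thirds_grid(1920, 1080)  # a full-HD frame, for illustration
```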

Wide Shot (WS)

This type of shot can be used as an establishing shot, which is the first shot in a scene that

orients the audience and shows them where the action is taking place. A wide shot can:


1. give the audience a big look at the location (most common)

2. indicate the scope of the location

3. imply that the character or characters are lost, out of control or insignificant (by showing

them to be very small)


Topic 103

Director’s Camera

A director's viewfinder or director's finder is a viewfinder used by film directors and

cinematographers to set the framing of a motion picture or movie camera. There are three types

of director's viewfinder.

The most traditional director's viewfinder looks like a short telescope, can be held easily in one

hand and is often seen hanging from a lanyard on the director's neck. Sometimes called a "Tewe"

in Europe (after a German company that manufactured them), the most common viewfinder of

this type is still manufactured by Alan Gordon Enterprises, known as the Mark Vb. The

functionality of these devices is limited in that they only assist in observing the field of view of

the lenses that will be used on the motion picture camera but not the characteristics of that lens.

This type of viewfinder allows the user to select multiple camera formats, aspect ratios and focal

lengths within a specific range. Devices of this kind vary in price from $300US-$700US,

depending on size and features.

Early blimped motion picture film cameras like the Mitchell Camera BNCR were not reflex

finder cameras. Instead a viewfinder similar in concept to the Alan Gordon Mark Vb bolted to

the side of the camera was employed by the camera operator to frame a shot when filming. In

between takes, the camera could be "racked over" to allow viewing of the actual taking lens.


Lens Finder

The second type, also called a director's viewfinder but sometimes referred to as a lens finder, is

a larger device than the traditional viewfinder and employs the lenses that are intended to be used

on the motion picture camera. These allow both the director and cinematographer to not only

observe the field of view but also the character of the lens in terms of depth of field, optical

aberration and general subjective "feel". These devices are still very common on film sets,

allowing shots to be framed without having to use the motion picture camera as a viewing

device. Lens finders are camera format specific and require the lenses that will be used in

production. These devices are considerably more expensive than the traditional viewfinder,

selling for between $2,000US -$6,000US.

Variations exist for different lens mounting systems, most typically Arri PL, Panavision PV

mount and Mitchell BNCR mounts. Other features, such as video assist, have

been made available on models such as the Kish Optics Ultimate Director's Viewfinder.


Topic 104

Zoom Lens & its Impact

A zoom lens is a mechanical assembly of lens elements for which the focal length (and thus

angle of view) can be varied, as opposed to a fixed focal length (FFL) lens (see prime lens).

A true zoom lens, also called a parfocal lens, is one that maintains focus when its focal

length changes.

Zoom lenses are often described by the ratio of their longest to shortest focal lengths. For

example, a zoom lens with focal lengths ranging from 100 mm to 400 mm may be described

as a 4:1 or "4×" zoom. The term superzoom or hyperzoom is used to describe photographic

zoom lenses with very large focal length factors, typically more than 5× and ranging up to

18× in SLR camera lenses and 50× in amateur digital cameras. This ratio can be as high as

300× in professional television cameras. As of 2009, photographic zoom lenses beyond about

3× cannot generally produce imaging quality on par with prime lenses. Constant fast aperture

zooms (usually f/2.8 or f/2.0) are typically restricted to this zoom range. Quality degradation

is less perceptible when recording moving images at low resolution, which is why

professional video and TV lenses are able to feature high zoom ratios. Digital photography

can also accommodate algorithms that compensate for optical flaws, both within in-camera

processors and post-production software.
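The zoom-ratio arithmetic above, together with the angle of view each focal length gives, can be sketched as follows. The thin-lens angle-of-view formula and the 36 mm full-frame sensor width are standard; the 100–400 mm focal lengths are the document's own example:

```python
import math

# Sketch: zoom ratio and horizontal angle of view for a zoom lens.

def zoom_ratio(short_mm, long_mm):
    """Ratio of longest to shortest focal length, e.g. 400/100 = 4x."""
    return long_mm / short_mm

def angle_of_view_deg(focal_mm, sensor_width_mm=36.0):
    """Thin-lens approximation, lens focused at infinity."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

ratio = zoom_ratio(100, 400)       # the 100-400 mm example above
wide = angle_of_view_deg(100)      # widest angle of view of the zoom
tele = angle_of_view_deg(400)      # narrowest angle of view
```

As expected, the 4× zoom narrows the horizontal view from roughly 20° at 100 mm to about 5° at 400 mm.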


Some photographic zoom lenses are long-focus lenses, with focal lengths longer than a

normal lens, some are wide-angle lenses (wider than normal), and others cover a range from

wide-angle to long-focus. Lenses in the latter group of zoom lenses, sometimes referred to as

"normal" zooms, have displaced the fixed focal length lens as the popular one-lens selection

on many contemporary cameras. The markings on these lenses usually say W and T for

"Wide" and "Telephoto". A lens is designated telephoto because the focal length supplied by the negative diverging lens is longer than the physical length of the overall lens assembly (the negative diverging

lens acting as the "telephoto group").

Some digital cameras allow cropping and enlarging of a captured image, in order to emulate

the effect of a longer focal length zoom lens (narrower angle of view). This is commonly

known as digital zoom and produces an image of lower optical resolution than optical zoom.

Exactly the same effect can be obtained by using digital image processing software on a

computer to crop the digital image and enlarge the cropped area. Many digital cameras have

both, combining them by first using the optical, then the digital zoom.
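Digital zoom as described – crop the center of the image, then enlarge the crop back to full size – can be sketched without any imaging library, using nearest-neighbour resampling on a plain 2-D list of pixel values (a toy 8x8 image, for illustration):

```python
# Sketch: "digital zoom" by cropping and nearest-neighbour enlargement.
# Pixels are plain ints; no imaging library is assumed.

def digital_zoom(pixels, factor):
    h, w = len(pixels), len(pixels[0])
    ch, cw = h // factor, w // factor           # size of the central crop
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = [row[left:left + cw] for row in pixels[top:top + ch]]
    # Enlarge back to the original size; resolution is lost, not gained,
    # which is why digital zoom is inferior to optical zoom.
    return [[crop[y * ch // h][x * cw // w] for x in range(w)]
            for y in range(h)]

image = [[x + 10 * y for x in range(8)] for y in range(8)]
zoomed = digital_zoom(image, 2)  # 2x digital zoom
```

Each source pixel of the crop is simply duplicated, which makes the loss of optical resolution explicit.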

Zoom and superzoom lenses are commonly used with still, video, motion picture cameras,

projectors, some binoculars, microscopes, telescopes, telescopic sights, and other optical

instruments. In addition, the afocal part of a zoom lens can be used as a telescope of variable

magnification to make an adjustable beam expander. This can be used, for example, to

change the size of a laser beam so that the irradiance of the beam can be varied.

Static Shot

In a static shot, the camera does not move or change its aim within the shot, although the

camera may move from one shot to the next.


There’s a certain power and authority to the static shot, which means static shots are still used a lot, although they’re not as dominant as they were in the ’30s and ’40s. If you’re

just learning how to compose shots and shoot really great video, it’s much easier to think

about one thing– the movement in the frame. It’s like learning to drive on an automatic

before tackling shifting gears.

Static shots prevailed in Hollywood during the ’30s by something of a technological

accident. Silent directors moved the camera constantly. But when Hollywood started making

“talkies” at the end of the ’20s, the camera noise could only be kept out of the mics by

putting the cameras in, literally, a small room on the set and shooting through glass. Later

“blimps” around cameras to silence noise were still big and bulky and hard to move. It took a

while to figure out how to make sound cameras small and light enough to maneuver easily.


Topic 105

TV Production Overview

Film frame

In filmmaking, video production, animation, and related fields, a film frame or video frame is

one of the many still images which compose the complete moving picture. The term is derived

from the fact that, from the beginning of modern filmmaking toward the end of the 20th century,

and in many places still up to the present, the single images have been recorded on a strip of

photographic film that quickly increased in length, historically; each image on such a strip looks

rather like a framed picture when examined individually.

The term may also be used more generally as a noun or verb to refer to the edges of the image as

seen in a camera viewfinder or projected on a screen. Thus, the camera operator can be said to

keep a car in frame by panning with it as it speeds past.

When the moving picture is displayed, each frame is flashed on a screen for a short time

(nowadays, usually 1/24, 1/25 or 1/30 of a second) and then immediately replaced by the next

one. Persistence of vision blends the frames together, producing the illusion of a moving image.

The frame is also sometimes used as a unit of time, so that a momentary event might be said to

last six frames, the actual duration of which depends on the frame rate of the system, which

varies according to the video or film standard in use. In North America and Japan, 30 frames per


second (fps) is the broadcast standard, with 24 frames/s now common in production for high-definition video shot to look like film. In much of the rest of the world, 25 frames/s is standard.

In systems historically based on NTSC standards, for reasons originally related to the

Chrominance subcarrier in analog NTSC TV systems, the exact frame rate is actually (3579545 /

227.5) / 525 = 29.97002616 fps.[1] This leads to many synchronization problems which are

unknown outside the NTSC world, and also brings about hacks such as drop-frame time code.
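The exact NTSC rate quoted above can be reproduced from the colour-subcarrier figures. A small sketch using exact rational arithmetic, which also gives the per-frame duration used when counting time in frames:

```python
from fractions import Fraction

# Sketch: deriving the exact NTSC frame rate quoted in the text,
# (3579545 / 227.5) / 525, kept exact with Fraction (227.5 = 455/2).

subcarrier_hz = 3579545                             # NTSC colour subcarrier
line_rate = Fraction(subcarrier_hz) / Fraction(455, 2)  # 227.5 cycles/line
frame_rate = line_rate / 525                        # 525 lines per frame

approx = float(frame_rate)          # ~29.97002616 fps
frame_duration_ms = 1000 / approx   # ~33.37 ms: one "frame" as a time unit
```

This is why "30 fps" NTSC material drifts against real time and needs drop-frame timecode to stay in sync.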

In film projection, 24 fps is the norm, except in some special venue systems, such as IMAX,

Showscan and Iwerks 70, where 30, 48 or even 60 frame/s have been used. Silent films and 8

mm amateur movies used 16 or 18 frame/s.


Topic 106

TV Production-Dailies & Previews

Dailies, in filmmaking, are the raw, unedited footage shot during the making of a motion picture.

They are so called because usually at the end of each day, that day's footage is developed, synced

to sound, and printed on film in a batch (and/or telecined onto video tape or disk) for viewing the

next day by the director and some members of the film crew. However, the term can be used to

refer to any raw footage, regardless of when it is developed or printed.

Another way to describe film dailies is "the first positive prints made by the laboratory from the

negative photographed on the previous day. In addition, during filming, the director and some

actors may view these dailies as an indication of how the filming and the actors' performances

are progressing.

In some regions such as the UK and Canada, dailies are usually referred to as rushes or daily

rushes, referring to the speed at which the prints are developed. Film dailies can refer to the

viewing of dailies on film in a theater. In animation, dailies are also called rushes or Sweat Box

sessions.

Dailies are usually viewed by members of the film crew either early in the morning before

filming starts, during the lunch break, or in the evening after filming ends. It is common for

several members of the film crew including the director, cinematographer, editor and others to

view and discuss the dailies as a group, but some productions opt to distribute multiple copies of

the dailies for individual viewing.


Viewing dailies allows the film crew to see exactly what images and audio were captured the

previous day, allowing them to make sure there are no technical problems such as dirty,

scratched, or out of focus film. It also allows the director to ensure that they are happy with the

performances of the actors and that they have captured a scene from all the necessary camera

angles. If additional filming is desired it can often be done immediately rather than re-shooting

later when sets may have been torn down and actors may no longer be available.

Dailies are also often viewed separately by producers or movie studio executives who are not

directly involved in day-to-day production but seek assurance that the film being produced meets

the expectations they had when they invested in the project. Commonly a dailies sequence is

quite boring, as it often includes multiple takes of the same shot.

Film directors and film producers prefer to view film dailies rather than DVD dailies. However,

because of the costs involved, some productions will start by viewing film dailies and later

switch to DVD dailies. One reason why film dailies are preferred over DVD dailies is that it is much

easier to check for correct focus with film dailies than with video dailies. HD dailies can be as

big as 2k resolution (2048 x 858, 2.39:1 aspect).

In the production of low-budget films with few crew and a short uninterrupted shooting period

there is sometimes no time to view dailies.


Topic 107

TV Production-Sequencing of Editing

Classical film editing has developed a methodology which structures the work process into

precise stages – a methodology that is very similar in every country around the world.

Each stage has its own procedure and order:

1. Logging

The dailies or rushes are sorted and labeled in ‘bins’. Each take can contain extra notes from the

director or the cinematographer. This is the first time the editor sees the film, and since it is shot

out of sequence, it is out of context of the story. A good editor views the rushes and looks for

fluidity of movement and nuances that will later be incorporated into the film.

2. First Assembly

The editor considers all the visual and audio material collected on the shoot for each scene and

then re-orders it in a way that tells the story best. There are dozens of possible combinations the editor can choose for this one simple sequence, each of which creates a different mood and tells a

different story.

Editing on a large budget feature usually commences as soon as the film starts shooting. An

editor will work on the rushes and assemble scenes for the director and producer to view. Often


at this point the editor and director will decide that additional footage of key moments is

necessary in order to make more editing choices available during the edit.

3. Rough Cut and Variations

The rough cut can take up to three months to complete. Each editor works differently.

Sometimes the editor works alone and shows the day’s or week’s work to the director and producer.

Sometimes the editor and director work together, discussing every nuance.

In the rough cut, the scenes are placed in order and checked for continuity. This all-important

step in the editing process allows for revisions and new ideas to be tried and tested.

Hint: Make the edit points between the scenes very obvious in order to emphasize the

‘roughness’. Failure to do so may result in the editor committing to an edit before it is ready.

4. First Cut

The first cut is the rough cut that is accepted by the editor, the director and the producer.

Selection and sequence are basically fixed, although changes can still be made. The shape of the final film is now visible. The detailed fine cut starts from its proportions, structures and rhythms and emphasizes

them.

Hint: Never be afraid to let the first cut ‘rest’ for a few days so everyone involved can see it with

fresh eyes.

5. Fine Cut

The fine cut no longer focuses on the entire film, but on the details of each and every cut. The

fine cut emphasizes and strengthens the rhythms and structures identified in the first cut.


6. Final Cut

When a fine cut has been agreed with the editor, director and producer, the sound designer,

music composer and title designer join the editor. Sound effects and music are created and added

to the final cut. When everyone has agreed with the final cut, the Edit Decision List is sent to the

lab where a negative cutter ‘conforms’ the negative to the EDL in order to create a negative that

is an exact copy of the final cut.
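The Edit Decision List mentioned above is, in its classic CMX3600 form, a plain-text list of numbered events. The sketch below parses one simplified cut-only event line; the sample line and the fixed field layout are assumptions for illustration, and real EDLs also carry dissolves, wipes, audio channels and comment lines:

```python
# Sketch: parsing one event line of a CMX3600-style EDL (simplified).

def parse_edl_event(line):
    fields = line.split()
    return {
        "event": fields[0],        # event number
        "reel": fields[1],         # source reel/tape name
        "track": fields[2],        # V = video, A = audio
        "transition": fields[3],   # C = cut
        "src_in": fields[4], "src_out": fields[5],    # source timecodes
        "rec_in": fields[6], "rec_out": fields[7],    # record timecodes
    }

event = parse_edl_event(
    "001  TAPE1  V  C  00:00:10:00 00:00:15:00 01:00:00:00 01:00:05:00"
)
```

Each event tells the negative cutter (or conforming software) which piece of which reel goes where in the final cut.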


Topic 108

TV Production-180 Degree Rule

Continuity is a big part of filmmaking. If you're shooting a short film or interview, it's important

to set the scene and establish your characters in space and time in order for the viewer to follow

the action. One of the most basic continuity rules is the 180 Degree Rule.

The 180 Degree Rule states that two characters in a scene should always have the same left/right

relationship to each other. If you don't follow the 180 Degree Rule, or break it intentionally, it

disrupts the scene and disorients the audience. When you break the 180 line, a person who was originally facing left in a scene is all of a sudden facing right. Wait! When did they switch

places?

This schematic shows the axis between two characters and the 180° arc on which cameras may be positioned (green). When cutting from the green arc to the red arc, the characters switch places on the screen.

In a dialogue scene between two characters, Daniel (orange shirt, frame left in the diagram) and

Lucas (blue shirt, frame right), the camera may be placed anywhere on the green 180° arc and

the spatial relationship between the two characters will be consistent from shot to shot, even

when one of the characters is not on screen. Shifting to the other side of the characters on a cut,

so that Lucas is now on the left side and Daniel is on the right, may disorient the audience.

The rule also applies to the movement of a character as the "line" created by the path of the

character. For example, if a character is walking in a leftward direction and is to be picked up by

another camera, the character must exit the first shot on frame left and enter the next shot frame

right. A jump cut can be used to denote time. If a character leaves the frame on the left side and

enters the frame on the left in a different location, it can give the illusion of an extended amount

of time passing.

Another example could be a car chase: If a vehicle leaves the right side of the frame in one shot,

it should enter from the left side of the frame in the next shot. Leaving from the right and

entering from the right creates a similar sense of disorientation as in the dialogue example.
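The "which side of the line am I on?" question above can be checked numerically: the sign of a 2-D cross product tells you which side of the character axis a camera setup sits on, and two setups with the same sign respect the 180-degree rule. The floor-plan coordinates below (in metres) are hypothetical:

```python
# Sketch: testing camera positions against the 180-degree line.

def side_of_line(a, b, camera):
    """2-D cross product: >0 and <0 are opposite sides of axis a->b,
    0 means the camera is exactly on the line."""
    (ax, ay), (bx, by), (cx, cy) = a, b, camera
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)

daniel, lucas = (0.0, 0.0), (2.0, 0.0)   # the two characters define the axis
setup_a = (1.0, -1.5)                     # master shot position
setup_b = (0.5, -2.0)                     # close-up, same side of the line
reverse = (1.0, 1.5)                      # a position across the line

sa = side_of_line(daniel, lucas, setup_a)
sb = side_of_line(daniel, lucas, setup_b)
sr = side_of_line(daniel, lucas, reverse)

same_side = sa * sb > 0   # legal cut: both setups on the green arc
crossed = sa * sr < 0     # reverse cut: the line has been crossed
```

A position with value 0 sits exactly on the line – the neutral placement used for a buffer shot, as described later.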

Usage

The 180-degree rule enables the audience to visually connect with unseen movement happening

around and behind the immediate subject and is important in the narration of battle scenes.

Pitfalls

The imaginary line allows viewers to orient themselves with the position and direction of action

in a scene. If a shot following an earlier shot in a sequence is located on the opposite side of the

180-degree line, then it is called a "reverse cut." Reverse cuts disorient the viewer by presenting


an opposing viewpoint of the action in a scene and consequently altering the perspective of the

action and the spatial orientation established in the original shot.

Solutions

There are a variety of ways to avoid confusion when particular actions or situations in a scene would necessitate breaking the 180-degree line.

Prevention

Either alter the movement in a scene, or set up the cameras on one side of the scene so that all the

shots reflect the view from that side of the 180-degree line.

Camera Arch move

One way to allow for crossing the line is to have several shots with the camera arching from one

side of the line to the other during the scene. That shot can be used to orient the audience to the

fact that we are looking at the scene from another angle. In the case of movement, if a character

is seen walking into frame from behind on the left side, heading towards a building corner on the right, the camera can catch them on the other side of the building as they round the corner: they enter the frame from the left, walk straight towards the camera, and then exit on the left side of the frame.

Buffer shot

To minimize the "jolt" between shots in a sequence on either side of the 180-degree line, shoot

a buffer shot along the 180-degree line separating each side. This lets the viewer visually

comprehend the change in viewpoint expressed in the sequence.
