The Basics of Camera Technology
Preface
With the development of technologies and endeavors to reduce production costs, various advanced functions, originally available only on high-end video cameras, have been introduced to a wider range of professional video cameras designed to target larger markets with their versatility. Thus, it has become imperative for those involved in the marketing and sales of such video cameras to understand these functions, as well as to become familiar with the related terminology. With this background in mind, we have created The Basics of Camera Technology, a document that provides comprehensive explanations of the camera terms we consider most significant. These terms have been carefully selected and placed into five categories: Optical System, CCD Mechanism, Camera Functions, VTRs, and Others. This allows readers to quickly find relevant information according to the terminology in question. We hope this document will be helpful in your business.
Marketing Communication Group Product Information Department Business Planning Division B&P Company Sony Corporation
Table of Contents
Optical System
Angle of View
Chromatic Aberration
Color Conversion Filters
Color Temperature
Depth of Field
Flange-Back/Back Focal Length
Flare
F-number
Focal Length
Iris
Light and Color
MTF (Modulation Transfer Function)
Neutral Density (ND) Filters
Optical Low Pass Filter
Prism
White Shading
Zoom
CCD Mechanism
EVS/Super EVS
Field Integration and Frame Integration Mode
HAD Sensor™
IT/FIT CCD
On Chip Lens
Picture Element
Readout Mechanism
RPN (Residual Point Noise)
Spatial Offset Technology
Variable Speed Electronic Shutter
Vertical Smear
Camera Functions
Adaptive Highlight Control
ATW (Auto Tracing White Balance)
AWB (Auto White Balance)
Black Balance
Black Clip
Black Gamma
Black Shading
Center Marker
Clear Scan/Extended Clear Scan (ECS)
Color Bars
Crispening
Cross Color Suppression
Detail Level
DynaLatitude™
Dynamic Contrast Control (Automatic Knee Control)
Electric Soft Focus
File System
Gain
Gamma
Genlock
H/V Ratio
Intercom (Intercommunication) System
Knee Aperture
Knee Correction
Lens File
Level Dependence
Limiter
Linear Matrix Circuit
Low Key Saturation
Mix Ratio
Multi Matrix
Pedestal/Master Black
Preset White
Reference File
Return Video
Scene File
Skin Tone Detail Correction
Sub-carrier Phase Control/Horizontal Phase Control
Tally
Tele-Prompter
TLCS (Total Level Control System)
Triax
TruEye™ (Knee Saturation Function) Processing
Turbo Gain
V Modulation
White Balance
White Clip
Zebra
VTRs
ClipLink™/Index Picture/Automatic Logging Function
EZ Focus
EZ Mode
SetupLog™
SetupNavi™
Others
Additive Mixing
Camera Control System
  Camera Control Unit (CCU)
  Master Setup Unit (MSU)
  Remote Control Panel (RCP)
  Camera Command Network Unit (CNU)
Color Signal Forms
  RGB
  Y/R-Y/B-Y
  Y/C
  Composite
Decibels (dB)
Dynamic Range
HD/SD (High Definition/Standard Definition)
Horizontal Resolution
Interlace/Progressive
Minimum Illumination
Modulation Depth
NTSC/PAL
PsF (Progressive, Segmented Frames)
RS-170A
S/N (signal-to-noise) Ratio
SDI
Sensitivity
Synchronization Signal (Sync Signal)
VBS/BS Signal
Vertical Resolution
© 2003 Sony Corporation. All rights reserved. Reproduction in whole or in part without written permission is prohibited. Sony, BETACAM, DVCAM, DynaLatitude, HAD sensor, Memory Stick, Trinitron and TruEye are trademarks of Sony. Some of the images in this document are simulated. All other trademarks are the property of their respective owners.
Optical System
Angle of View
When shooting a landscape with a camera, as in Figure A, there is a certain range that will be displayed on a picture monitor. Angle of view indicates the displayable range of the image (plane), measured as the angle from the center of the lens to the width of the image in the horizontal, vertical, or diagonal direction. These are called the horizontal, vertical, and diagonal angles of view, respectively. The angle of view becomes narrow when a telephoto lens is used; conversely, it becomes wider with a wide-angle lens (which is why it is called "wide-angle"). Consequently, the wider the angle of view, the wider the displayable range. Since the angle of view also depends on the image size, lenses intended for 2/3-inch and 1/2-inch CCD cameras require different focal lengths to cover the same angle of view.
Figure A: A landscape shot by the video camera and displayed on the picture monitor.
Angle of view can be derived from the following equation: w = 2 tan^-1 (y / 2f), where w is the angle of view, y is the image size (the width of the image in the horizontal, vertical, or diagonal direction), and f is the focal length.
Figure B: Relationship between focal length, image size on the CCD, and angle of view.
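As a quick illustration of the equation above, the following Python sketch computes horizontal and vertical angles of view for a few focal lengths. The 8.8 mm x 6.6 mm image size assumed here is a typical 4:3 figure for a 2/3-inch CCD and is not taken from this document.

import math

def angle_of_view(image_size_mm, focal_length_mm):
    # w = 2 * arctan(y / 2f), returned in degrees
    return math.degrees(2 * math.atan(image_size_mm / (2 * focal_length_mm)))

# Assumed 2/3-inch 4:3 image area of 8.8 mm x 6.6 mm (illustrative values only)
for f in (8, 25, 100):                       # wide, normal, telephoto (mm)
    h = angle_of_view(8.8, f)
    v = angle_of_view(6.6, f)
    print(f"f = {f:3d} mm: horizontal {h:5.1f} deg, vertical {v:5.1f} deg")

As expected, the shorter the focal length, the wider the computed angle of view.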
Chromatic Aberration
When light passes through glass, its path is refracted, or bent. The amount of refraction depends on the light's wavelength, which determines its color. This also holds true for the lenses used in a video camera. The difference in refraction from color to color means that each color of light (R, G, and B in a color camera) comes into focus on a slightly different image plane. For example, if one color is in focus on the CCD imager, the other colors will be slightly out of focus and look less sharp. This phenomenon is more noticeable in lenses with longer focal lengths and results in deterioration at the edges of the image. Recent technology has made it possible to effectively reduce the chromatic aberration of a video camera lens. This is achieved by combining a series of converging and diverging lenses with different refraction characteristics to compensate for the aberration. The use of crystalline substances such as fluorite has also been practiced to offset the aberration and, accordingly, the locus of the reproduced image.
Color Conversion Filters

Your question now may be, "Why do we need color conversion filters if we can correct the change of color temperature electrically (white balance)?" The answer is quite simple. White balance (refer to White Balance) electrically adjusts the amplitudes of the red (R) and blue (B) signals to be equally balanced with the green (G) by use of video amplifiers. We must keep in mind that electrical amplification results in degradation of the signal-to-noise ratio. Although it may be possible to balance the camera for all color temperatures using the R/G/B amplifier gains, this is not practical from a signal-to-noise point of view, especially when a large gain up (refer to Gain) is required. The color conversion filters reduce the gain adjustments required to achieve correct white balance.
Color Temperature
The color reproduced by a camera largely depends on the color of the light source (or illuminant) under which the camera is used. This is sometimes difficult to understand because the human eye adapts to changes in the light source's color, so the color of an object always looks the same under any light source: sunlight, halogen lamps, candlelight, etc. The color of a light source is defined by using heated carbon (a black body, absorbing all radiation without transmission or reflection) as a reference. When a piece of carbon is heated, it starts glowing and emitting light once it reaches a certain absolute temperature (expressed in Kelvin (K)). The spectral distribution of the light emitted from a light source is determined by its corresponding absolute temperature, known as its color temperature.

Light Source                    Color Temperature (approx.)
Skylight                        12000 K - 18000 K
Noon Sunlight                   4900 K - 5800 K
Sunrise and Sunset              3000 K
12 V/100 W Halogen Lamp         3200 K
Candlelight                     2900 K

Since cameras cannot adapt automatically to the color temperature of the light source, it is essential to select the appropriate color conversion filter (refer to Color Conversion Filters) for the shooting environment in order to obtain accurate color reproduction. Combining electronic white balance (refer to White Balance) with the appropriate color conversion filter selection will produce even more accurate color reproduction.
Depth of Field
When focusing a lens on an object, there is a certain distance range in front of and behind the object that also comes into focus. Depth of field indicates the distance between the closest and furthest objects that are in focus. When this distance is long, the depth of field is "deep"; when it is short, the depth of field is "shallow". Needless to say, any object outside the depth of field (range) will be out of focus and look blurred. Depth of field is governed by the following three factors:
1) The larger the iris F-number (refer to F-number) (stopping down the amount of incident light), the deeper the depth of field.
2) The shorter the focal length of the lens, the deeper the depth of field.
3) The further the distance between the camera and the subject, the deeper the depth of field.
Thus depth of field can be controlled by changing these factors, allowing the camera operator creative shooting techniques.
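The following Python sketch illustrates the three factors using the standard thin-lens depth-of-field approximation (hyperfocal distance, near and far limits). Neither the formula nor the 0.02 mm circle of confusion comes from this document; they are assumptions used only to show the trend.

import math

def depth_of_field(f_mm, f_number, distance_mm, coc_mm=0.02):
    # Standard thin-lens approximation (not from the Sony text); coc_mm is the
    # assumed permissible circle of confusion on the imager.
    hyperfocal = f_mm ** 2 / (f_number * coc_mm) + f_mm
    near = hyperfocal * distance_mm / (hyperfocal + (distance_mm - f_mm))
    if hyperfocal > (distance_mm - f_mm):
        far = hyperfocal * distance_mm / (hyperfocal - (distance_mm - f_mm))
    else:
        far = math.inf                      # everything beyond the near limit is sharp
    return near, far

# The three factors listed above, each varied in turn (all distances in mm):
print(depth_of_field(25, 2.8, 3000))   # baseline
print(depth_of_field(25, 8.0, 3000))   # 1) larger F-number        -> deeper
print(depth_of_field(10, 2.8, 3000))   # 2) shorter focal length   -> deeper
print(depth_of_field(25, 2.8, 6000))   # 3) greater subject distance -> deeper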
Flange-Back/Back Focal Length

Flange-back describes the distance from the lens-mount reference plane of the camera to the image plane. So that lenses and camera bodies can be freely interchanged, flange-back is standardized: for the C-mount and CS-mount systems it is standardized as 17.526 mm and 12.5 mm respectively. There are three flange-back standards for the bayonet mount system: 35.74 mm, 38.00 mm, and 48.00 mm.

Similar to flange-back is back focal length, which describes the distance from the very end of the lens (the end of the cylinder that fits into the camera mount opening) to the image plane. The back focal length of a lens is slightly shorter than its flange-back.
Figure: Flange-back and back focal length.
Flare
Flare is a phenomenon that is likely to occur when strong light passes through the camera lens. Flare is caused by numerous diffused reflections of the incoming light inside the lens. This results in the black level of each red, green and blue channel being raised, and/or inaccurate color balance between the three channels. On a video monitor, flare causes the picture to appear as a misty image, sometimes with a color shade. In order to minimize the effects of flare, professional video cameras are provided with a flare adjustment function, which optimizes the pedestal level and corrects the balance between the three channels electronically.
F-number
The maximum aperture of a lens indicates the amount of light that can be gathered by the lens and directed to the camera imager. A lens with a larger physical diameter receives light over a wider area and is therefore more efficient. The aperture is expressed as an F-number (or F-stop), where the numerical value of the F-number (F) is calculated by dividing the focal length (refer to Focal Length) (f) by the effective aperture of the lens (D), as below:

F = f/D

This reciprocal relationship means that the smaller the F-number, the "faster" the lens, and the higher the sensitivity it will provide on a camera. The maximum-aperture F-number is labeled on the front of the lens and is an important distinguishing factor when comparing lenses. In lenses used with TV cameras, a mechanism is required to reduce the sensitivity of the lens and camera, and this is achieved by a variable diaphragm within the lens (refer to Iris). The lens iris ring is also calibrated in F-stops. These calibrations increase by a factor of the square root of two, so lenses normally carry calibrations of 1.4, 2, 2.8, 4, 5.6, 8, 11, 16, and 22. Since the amount of incoming light is proportional to the cross-sectional area of the aperture, the brightness of an image is in inverse proportion to the second power of the F-number. Simply put, as the value of the F-number increases by one stop, the brightness decreases to one half. It is important to know that the F-number or F-stop is a key factor that affects the depth of field of the scene shot by the camera (refer to Depth of Field). The smaller the F-number or F-stop, the shallower the depth of field, and vice versa.
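A small Python sketch of the relationships described above: successive F-stops differ by a factor of the square root of two, and relative brightness falls with the square of the F-number, halving at each stop. The familiar markings (1.4, 2, 2.8, ...) are rounded versions of these values.

import math

base = 1.0
for stop in range(10):
    f_number = base * math.sqrt(2) ** stop          # F-stop series: 1, 1.4, 2, 2.8, ...
    relative_brightness = 1.0 / f_number ** 2        # light is inversely proportional to F^2
    print(f"F{f_number:5.2f}  relative light: {relative_brightness:.4f}")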
Focal Length
Focal length describes the distance between the lens and the point where the light that passes through it converges on the optical axis. This point is where the lens is in focus and is called the focal point. To capture a focused image on a CCD imager, the focal point must coincide with the CCD imager plane by controlling the focus of the lens. Video camera lenses usually consist of a series of individual lenses for zooming and aberration-compensation purposes (refer to Chromatic Aberration ), and thus have a virtual focal point called the principal point.
Iris
The amount of light taken into a camera and directed to its imager is adjusted by a combination of diaphragms integrated in the lens. This mechanism is called the lens iris and functions just like the pupil of the human eye. By opening and closing these diaphragms, the diameter of the opening (also called aperture) changes, thus controlling the amount of light that passes through it. The amount of the iris opening is expressed by its F-number (refer to F-number ).
Figure: Iris opening at F = 4.0 and at F = 11.0.
Light and Color

Figure: "It's green..." - only the green spectrum is reflected by the leaves; other colors are absorbed.
MTF (Modulation Transfer Function)

Figure: MTF (%) of Lens A and Lens B versus spatial frequency (lines/mm). The two curves cross at Point X; on one side of the crossover Lens A produces the higher image quality, and on the other side Lens B does.
Prism
As explained in the section titled Additive Mixing (refer to Additive Mixing), 3-CCD video cameras process color video signals by first separating the incoming light into the three primary colors, red, green, and blue. This is done by the camera's prism system, which is an arrangement of three prisms. The prism system utilizes the different reflection characteristics that light has depending on its color or wavelength. For example, in the figure below, green is not reflected by any of the prisms and thus is directed straight to the green CCD. Red is not reflected at the surface of the second prism, but is reflected at the third and, through one more reflection within the second prism, is directed to the red CCD.
Figure: Color separation system of a 3-CCD camera.
White Shading
White Shading is a phenomenon in which a green or magenta cast appears on the upper and lower parts of the screen, even when the white balance (refer to White Balance) is correctly adjusted in the center of the screen (shown below). White shading is seen in cameras that adopt a dichroic layer (used to reflect one specific color while passing other colors through) in their color separation system. In this system, the three primary colors (red, green, and blue) are separated using color prisms (refer to Prism). The three color prisms use a combination of total-reflection layers and color-selective reflection layers to confine a certain color. For example, the blue prism will confine only the blue light and will direct it to the blue CCD imager. However, the color-filtering characteristics of each prism change slightly depending on the angle at which the light enters (angle of incidence) each reflection layer. This angle of incidence causes different light paths in the multi-layer structure of the dichroic coating and results in a change in the spectral characteristics of the prism. This effect is seen as the upper and lower parts of the screen having a green or magenta cast even when the white balance is correctly adjusted in the center. Another cause of White Shading is uneven sensitivity of the photo sensors in the CCD array; in this case, the phenomenon is not confined to the upper and lower parts of the screen. White shading is more commonly due to the lens having an uneven transmission characteristic, typically seen as the center of the image being brighter than the edges. This is corrected by applying a parabolic correction waveform to the variable gain amplifiers used for white balancing. Sony high-end professional cameras are equipped with circuitry that automatically performs the proper adjustments to suppress the White Shading phenomenon.
Figure: Magenta cast appearing on the upper and lower parts of the screen.
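As a rough illustration of the parabolic correction mentioned above (a simplified, assumed model rather than any camera's actual circuit), the following Python sketch generates a parabolic gain waveform across the vertical position of the picture that lifts the edges relative to the brighter centre:

LINES = 9                # number of picture lines in this toy example
CORRECTION = 0.10        # extra gain applied at the top/bottom edges (arbitrary)

for line in range(LINES):
    x = (line - (LINES - 1) / 2) / ((LINES - 1) / 2)   # -1 at top, 0 at centre, +1 at bottom
    gain = 1.0 + CORRECTION * x * x                    # parabolic gain waveform
    print(f"line {line}: gain {gain:.3f}")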
Zoom
Technically, 'zoom' refers to changing a lens's focal length (refer to Focal Length). A lens that can continually alter its focal length is known as a zoom lens. Zoom lenses allow the camera operator to change the angle of view (refer to Angle of View). Changing the angle of view in turn changes the area of the scene that is directed to the CCD imager. For example, by zooming in on a subject (the lens's telephoto position), a smaller area of the scene is directed to the imager and thus that area appears magnified. Zooming out (wide-angle position) means that a larger area of the scene is directed to the imager, and thus objects in the image look smaller. It must also be noted that the amount of light directed to the imager changes when changing the zoom position. In the telephoto position, less light is reflected from the subject and directed through the lens, and thus the iris must be adjusted accordingly. The correlation between the zoom ratio and the angle of view can be described as shown in the figure below. Since chromatic aberration (refer to Chromatic Aberration) and other light-diffusion characteristics change when the focal length is changed, high-quality zoom lenses use a series of compensation lenses (which accounts for their higher purchase cost).
Figure: Angle of view at focal lengths across the zoom range (4.8 mm, 8 mm, 48 mm, 120 mm, 200 mm, 400 mm, 665 mm).
CCD Mechanism
EVS/Super EVS
EVS (Enhanced Vertical Definition System) and Super EVS are features that were developed to improve the vertical resolution of a camera. Since Super EVS is an enhanced form of EVS, let's first look into the basic technology used in EVS. EVS has been developed to provide a solution when improved vertical resolution is required. Technically, its mechanism is based on Frame Integration (refer to Field Integration and Frame Integration Mode), but it reduces the picture blur inherent to this mode by effectively using the electronic shutter. As explained in Frame Integration, picture blur is seen due to the longer 1/30-second accumulation period. EVS eliminates this by discarding the charges accumulated in the first 1/60 seconds (1/30 = 1/60 + 1/60), thus keeping only those charges accumulated in the second 1/60 seconds. Just like Frame Integration, EVS uses the CCD's even lines to create even fields and its odd lines to create odd fields - thus providing the same high vertical resolution. However, since the first 1/60 seconds of accumulated charges are discarded, EVS sacrifices one half of the sensitivity. Super EVS has been created to provide a solution to this drop in sensitivity. The charge readout method used in Super EVS sits between Field Integration and EVS. Instead of discarding all charges accumulated in the first 1/60 seconds, Super EVS allows this discarded period to be linearly controlled. When the period is set to 0, the results are the same as with Field Integration. Conversely, when set to 1/60, the results are identical to EVS. When set between 0 and 1/60, Super EVS provides a combination of the improved vertical resolution of EVS with less visible picture blur. Most importantly, the amount of resolution improvement and picture blur depends on the selected discarding period. This can be summarized as follows:
When set near 0: less improvement in vertical resolution, less picture blur. When set near 1/60: more improvement in vertical resolution, more picture blur.
Figure: Readout at each photo site - Field Integration (high in sensitivity but low in resolution; no discarded electrons), EVS/Frame Integration readout (high in resolution but low in sensitivity; the shutter is set to 1/60 s for NTSC or 1/50 s for PAL and discarded electrons go to the CCD's overflow drain), and Super EVS (advantages of both, with the electronic shutter operated at a different timing on alternate lines).
Figure: CCD readout modes - in Frame Integration, each field is built from individual lines (A, C, E, etc. for one field; B, D, etc. for the other) integrated over one frame; in Field Integration, adjacent line pairs (A+B, B+C, C+D, D+E) are integrated over each field.
HAD Sensor™
The HAD (Hole Accumulated Diode) sensor is a diode sensor which incorporates a Hole Accumulated Layer on its surface. This layer effectively reduces dark current noise, which is caused by electrons randomly generated at the Si-SiO2 boundary layer. The Hole Accumulated Layer pairs up holes with the electrons generated at the CCD surface, reducing the number of electrons (amount of dark current noise) that enter and accumulate in the sensor. The reduction of dark current noise results in a reduction in fixed pattern noise, a high signal-to-noise ratio (refer to S/N (signal-to-noise) Ratio), and low dark shading. The Hole Accumulated Layer also plays an important role in eliminating lag. The amount of lag in CCDs is determined by the efficiency of transferring the electrons accumulated in the photo sensor to the vertical shift register. In CCDs without a Hole Accumulated Layer, the bottom (potential) of the photo-sensing well tends to shift and, as shown in (a), a number of electrons will remain in the well even after readout. However, with the HAD sensor, since the Hole Accumulated Layer clamps the bottom of the photo-sensing well to the same potential, the accumulated electrons fall completely into the vertical register (b). Thus, electrons do not remain in the photo-sensing well after readout.
Figure: Potential profiles (surface and depth directions) of the photo-sensing well, vertical register (V-Regi) and readout gate (ROG): (a) without the Hole Accumulated Layer, some electrons remain after readout; (b) with the HAD sensor, all accumulated electrons are transferred.
IT/FIT CCD
CCDs are categorized into two types, depending on the CCD's structure and the method used to transfer the charge accumulated at each photo site to the output. The IT (Interline-Transfer) CCD has a structure in which the columns of photo-sensing sites and vertical registers are arrayed alternately. The photo-sensing sites (so-called pixels) convert the incoming light into electrical charges over a 1/60-sec period (1/60 sec for NTSC, 1/50 sec for PAL) (refer to NTSC/PAL). After this period, the accumulated charges are transferred to the vertical shift registers during the vertical blanking interval. The charges within the same line (the same row in the CCD array) are then shifted down through the vertical shift registers in the same sequence and read into the horizontal register, line by line. Once a given line is read into the horizontal register, it is immediately read out (during the same horizontal interval) so the next scanning line can be read into the register. The only significant limitation of the IT imager structure is an artifact called vertical smear (refer to Vertical Smear), which appears when the CCD is exposed to an extreme highlight. Smear appears as a vertical line passing through the highlight, often seen when shooting a bright object in the dark. This phenomenon is due to electric charges, accumulated in highly exposed photo sites, leaking into the vertical register before the transfer from the photo sites to the vertical register occurs. The FIT (Frame Interline Transfer) CCD was primarily designed to overcome this drawback. The upper part of this device operates exactly like an IT CCD, having separate sensing areas and charge-shifting registers. The bottom part operates as a temporary storage area for the accumulated charges: immediately after the charges are transferred from the photo-sensing area to the vertical registers (during the vertical blanking interval), they are quickly shifted to this fully protected storage area. Since the charges travel through the vertical register over a very short period, the effect of unwanted charges leaking into the vertical register from the photo sites is much smaller, especially when the CCD is exposed to highlights.
The FIT structure thus offers superior smear performance but, due to its complexity, usually costs more than an IT CCD. However, it must be noted that in recent Sony IT CCDs, vertical smear has been reduced to an almost negligible level due to the use of the HAD™ sensor and On Chip Lens technology (refer to HAD Sensor™ and On Chip Lens).

Figure: Structures of the IT CCD and FIT CCD (photo sensors, vertical shift registers, and the FIT CCD's temporary storage area).
On Chip Lens
As compared to the human eye's ability to see in the dark, CCD cameras have a limitation in sensitivity (refer to Sensitivity ). Many technologies have been developed to improve sensitivity - the On Chip Lens being one of the most significant. On Chip Lens (OCL) technology drastically enhances the light-collecting capability of a CCD by placing a microlens above each photo sensor so that light is more effectively directed to them. The combination of Sony HAD-sensor technology and OCL has achieved tremendous improvement in image-capture capability, even under extremely low-light conditions. Since each micro-lens converges the incoming light to each photo-sensing area, less light leaks into the CCD's vertical register, greatly reducing vertical smear (refer to Vertical Smear ).
Figure: Cross-section of a CCD picture element with an On Chip Lens, showing the sensed light converged onto the sensor above the V-register, channel stop (C.S), readout gate (R.O.G), P-wells, and N-substrate.
Picture Element
CCD specifications are indicated with the number of horizontal and vertical picture elements they have within their photosensitive areas. A picture element contains one photo sensor to sample the intensity of the incoming light directed to it. The number of picture elements within the CCD's sensing area is the main factor that determines the resultant resolution of the camera. It must be noted that certain areas along the edges of the CCD are masked. These areas correspond to the horizontal and vertical blanking periods and are used as a reference for absolute black. Thus, there are two definitions for describing the picture elements contained within the CCD chip. 'Total picture elements' refers to the entire number of picture elements within the CCD chip, including those which are masked. 'Effective picture elements' describes the number of picture elements that are actually used for sensing the incoming light.
Readout Mechanism
CCDs are the most popular imaging devices used in today's video cameras. In brief, CCDs convert incoming light directed through the camera lens into electrical signals that build a video signal. Since the mechanism of a CCD is very similar to the human eye, it is worthwhile to take a look at how the human eye works and compare this to a CCD. As Figure A shows below, in the human eye, an image (= light) is directed to and formed on the retina, which consists of several million photosensitive cells. The retina then converts the light that forms this image into a very small amount of electrical charge. These charges are then sent to the brain through the nervous system. This is the basic mechanism of how people see. Coming back to the mechanism of a CCD, the CCD has photo sensors that work exactly like the retina's photosensitive cells. However, the electrical charge readout method is quite different. Figure C describes the structure of an Interline Transfer CCD (refer to IT/FIT CCD), with the photo sensors used for the light-to-charge conversion and the charge readout mechanism (to build the video signal) shown. The photo sensors, also called pixels, convert the incoming light into electrical charges. The light conversion and charge accumulation continue over a period of 1/60 seconds. After the 1/60-second period, the electrical charges at each photo sensor are transferred to the vertical shift registers during the vertical blanking interval. The charges within the same lines (the same row in the CCD array) are then shifted down through the vertical shift register, during the next 1/60-second accumulation period, and read into the horizontal register, line by line, at a frequency of 15.734 kHz (NTSC format: refer to NTSC/PAL). Once a given line is read into the horizontal register, it is immediately read out (during the same horizontal interval) so the next scanning line can be read into the horizontal register.
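The following Python/NumPy sketch is a toy model of the interline-transfer readout sequence described above (photosites, vertical shift registers, horizontal register). It ignores interlace, field integration and real timing, and is not Sony's implementation.

import numpy as np

def interline_transfer_readout(photosites):
    # Toy IT-CCD readout: charges move into the vertical registers during the
    # vertical blanking interval, then are shifted down one line at a time
    # into the horizontal register and read out line by line.
    rows, _ = photosites.shape
    vertical_registers = photosites.copy()   # transfer during vertical blanking
    photosites[:] = 0                        # photosites begin a new accumulation
    output_lines = []
    for _ in range(rows):
        horizontal_register = vertical_registers[-1].copy()   # bottom line moves out
        output_lines.append(horizontal_register)              # read out pixel by pixel
        vertical_registers = np.roll(vertical_registers, 1, axis=0)
        vertical_registers[0] = 0                              # remaining lines shift down by one
    return output_lines

sensor = np.zeros((4, 4))
sensor[1, 2] = 100.0                         # one bright photosite
for n, line in enumerate(interline_transfer_readout(sensor)):
    print("line", n, line)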
Figures A-C: The human eye compared with the lens, CCD, and charge readout structure of a camera.
Figure: Spatial offset - the green CCD is mounted shifted by half a pixel pitch (1/2 P) relative to the red and blue CCDs (photo sensor and V-register layout shown).
Figure: Variable speed electronic shutter - within each 1/60-second field, only the electrons generated during the shutter period are output; electrons generated outside the shutter period are discarded.
Vertical Smear
Vertical Smear is a phenomenon peculiar to CCD cameras, which occurs when a bright object or light source is shot with the camera. This phenomenon is observed on the video monitor as a vertical streak above and below the object or light source, as shown below. A most notable example is when the headlights of a vehicle are shot with a CCD camera in the dark. Smear is caused by the direct leakage of incoming light into the vertical shift register or the overflow of the electrical charges accumulated in the photo sites. The reason that smear is observed as a vertical streak is that electrons constantly leak into the vertical register while it shifts down to the horizontal register. The amount of smear is generally in proportion to the intensity of the light from the subject or light source and the area that it occupies on the CCD. Thus, in order to evaluate smear level, the area must be defined. Smear in recent Sony CCD cameras has been drastically reduced to a level that is almost negligible due to the use of the HAD sensor (refer to HAD Sensor™).
Figure: Vertical smear appearing as a streak above and below a bright light source.
Camera Functions
AWB (Auto White Balance)

Unlike ATW, AWB does not continuously follow changes in the color temperature of the lighting. This may sound somewhat inconvenient; however, AWB achieves much more accurate color reproduction compared to ATW. AWB is performed by framing the camera on a white object - typically a piece of white paper - that occupies more than 70% of the viewfinder display, and then pressing the AWB button located on the camera body.
Black Balance
To ensure accurate color reproduction from a camera, it is imperative that the camera reproduces a true black when the lens iris is closed, otherwise a colorcast may be seen. This requires accurate matching of the R, G, and B black levels. Most cameras provide an Auto Black Balance function, which automatically closes the lens iris and balances the R, G, and B black levels when activated.
Black Clip
All cameras have a circuit to prevent the camera output signals from falling below a practical video level, which is specified by the television standard. This is known as Black Clip, which electronically 'clips off' signal levels that are below a given level called the black clip point. The black clip point is set to 0% video level.
Black Gamma
In high-end Sony cameras, the gamma curve near the black signal levels can be adjusted using the Black Gamma feature. This is achieved without affecting the gamma curve of the mid-tone and highlight areas. Adjusting black gamma to obtain a steep gamma curve near the black signal levels allows more contrast to be seen in dark parts of the picture, resulting in better reproduction of picture detail. However, it must be noted that using a steep gamma curve near the black signal levels also increases noise, and black gamma must therefore be adjusted with care. Conversely, black gamma can be adjusted to reduce noise in dark picture areas, but with the drawback of less visible contrast. Reproduction of black is extremely important for obtaining the desired color reproduction of the entire image, accurately and faithfully. Thus, professional video cameras, especially those used in broadcast stations, are required to have the capability to reproduce faithfully the black level stipulated by each broadcast station (different broadcast stations have their own stipulations on the black level).
Figure: Black gamma adjustment - the gamma curve near black is modified below a cross point (output level vs. input, shown together with the gamma-off response).
Black Shading
Black Shading is a phenomenon observed as unevenness in dark areas of the image due to the dark current noise of the imaging device. Dark current noise describes the noise induced in a CCD by unwanted electric current generated by various secondary factors, such as heat accumulated within the imaging device. A Black Shading Adjustment function is available in most professional video cameras to suppress this phenomenon to a negligible level.
Center Marker
The Center Marker is a mark in the viewfinder that indicates the center of the image being shot. This is especially useful when zooming in to a particular area of the subject. By using the center marker as a reference, camera operators can zoom in to a subject with the desired area accurately framed. This is most convenient when a high magnification lens and high zooming speed is used.
Clear Scan/Extended Clear Scan (ECS)

Figure: When a CRT display is shot, black or white bars appear across the reproduced picture depending on whether the display frequency is lower (f disp < f c.scan) or higher (f disp > f c.scan) than the camera's scanning frequency.
Color Bars
Color-bar signals are used to maintain consistent color reproduction throughout the entire production chain. They are usually recorded to the source tape at the acquisition stage as a reference for adjusting the output encoders of VTRs and other equipment used in subsequent production processes. Adjustments are made so that each device outputs the color-bar signal with exactly the same color and brightness as when it was recorded at the acquisition stage. Vectorscopes are used to adjust the color (hue/saturation), while waveform monitors are used to adjust the brightness. There are a few color-bar definitions in the NTSC TV system (refer to NTSC/PAL). However, all color-bar signals basically look the same (in some cases, additional color bars or blocks are placed below the main color bars) when displayed on a picture monitor. There are seven vertical bars - one white bar on the very left, followed by six colored bars to the right. The order of the colored bars from left to right is yellow, cyan, green, magenta, red, and blue, which is the descending order of each color's luminance level. It is also important to know that each color (including the white bar) is a combination (total: seven combinations) of equally adding the three primary colors, red, green, and blue, and each colored bar has 100% saturation. A 75% color bar has the same 100% white bar, but the levels of R, G, and B for the colored bars are 75%. This keeps the peak level at 700 mV but reduces the saturation of the color bars. Shooting a program usually starts by recording a color-bar signal generated in the camera at the top of the tape. For this purpose, production cameras have internal color-bar generators. The color-bar signal from the camera can also be used as a reference for adjusting the chroma, phase, and brightness of a monitor.
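The following Python sketch lists the R, G, B make-up of the seven bars and their luminance, confirming the descending-luminance order given above. The luma weighting Y = 0.299R + 0.587G + 0.114B is the standard SD equation and is an assumption not spelled out in this section.

bars = {            # R, G, B as fractions of full level, in on-screen order
    "white":   (1, 1, 1),
    "yellow":  (1, 1, 0),
    "cyan":    (0, 1, 1),
    "green":   (0, 1, 0),
    "magenta": (1, 0, 1),
    "red":     (1, 0, 0),
    "blue":    (0, 0, 1),
}

def luma(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

for name, (r, g, b) in bars.items():
    y100 = luma(r, g, b)                          # 100% bars
    scale = 1.0 if name == "white" else 0.75       # 75% bars keep the 100% white bar
    y75 = luma(r * scale, g * scale, b * scale)
    print(f"{name:8s} Y(100%) = {y100:.3f}   Y(75%) = {y75:.3f}")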
Crispening
As mentioned in the section on Detail Level (refer to Detail Level ), detail signals are used to raise the contrast at the dark-to-light and light-to-dark transitions, making the edges of objects appear sharper both horizontally and vertically. Simply put, detail correction makes pictures look sharper than the actual resolution provided by the camera. However, since detail correction is applied to the entire picture, its use also emphasizes the noise of the picture, especially when the detail level is high. Crispening is a circuit implemented to avoid detail signals being generated around noise. By activating the Crispening function, detail signals with small amplitudes, which are most likely caused by noise, are removed from the signal. As shown in the figure below, in the Crispening process, only detail signals that exceed a designated threshold are utilized for image enhancement. Conversely, the detail signals with small amplitudes are regarded as noise and removed. Crispening allows detail to be adjusted without worrying about its influence over noise.
Figure: Crispening - only detail signals exceeding the crispening level are used for enhancement; smaller detail signals are removed.
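A minimal Python sketch of the thresholding idea behind Crispening (illustrative only; the threshold and sample values are arbitrary):

def crispen(detail_signal, threshold):
    # Detail components below the threshold are assumed to be noise and discarded.
    return [d if abs(d) > threshold else 0 for d in detail_signal]

detail = [2, -3, 40, -45, 1, 0, -2, 38]     # small values ~ noise, large ~ real edges
print(crispen(detail, threshold=10))         # -> [0, 0, 40, -45, 0, 0, 0, 38]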
Cross Color Suppression
Detail Level
In all cameras, image enhancement is used as a method to improve picture sharpness. In brief, image enhancement raises the contrast at the dark-to-light and light-to-dark transitions, making the edges of objects appear sharper than provided by the actual resolution of the camera. This process is applied electrically within the camera by overshooting the signal at the edges. In most professional cameras, image enhancement is applied to both vertical and horizontal picture edges. In camera terminology, this process is called 'detail'. Detail level refers to the amount of image enhancement, or in other words, the amount of sharpness added to the picture. In most professional cameras, this can be altered with the detail-level control circuitry. It is worthwhile understanding how the horizontal detail signal is created. For simplicity, let's examine how this is done in an analog detail-correction process. The original signal (a) is delayed by 50 nsec to obtain signal (b) and by 100 nsec to obtain signal (c). By adding (a) and (c) we have signal (d). The detail signal used for enhancement (signal (e)) is obtained by subtracting (d) from two times (b). This is further added to (b), completing the detail correction (f). The mechanism for creating the vertical detail signal is basically the same as horizontal detail correction. The only difference is that the delay periods for creating signals (b) and (c) are one horizontal scanning line and two horizontal scanning lines, respectively. Excessive detail correction will lead to an artificial appearance to the picture, as though objects have been "cut out" from the background.
Figure: Detail correction - the original signal (a), its 1T- and 2T-delayed versions (b) and (c), the sum (d) = (a) + (c), the detail signal (e) = 2 x (b) - (d), and the corrected signal (f) = (b) + (e).
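The following Python sketch mirrors the detail-signal arithmetic described above, using one- and two-sample delays in place of the 50 ns/100 ns (1T/2T) delays; the step values are arbitrary.

def detail_correct(signal, detail_gain=1.0):
    out = []
    for i in range(len(signal)):
        a = signal[i]                                  # original
        b = signal[i - 1] if i >= 1 else signal[0]     # delayed by 1T
        c = signal[i - 2] if i >= 2 else signal[0]     # delayed by 2T
        d = a + c
        e = 2 * b - d                                  # detail signal
        out.append(b + detail_gain * e)                # enhanced output
    return out

edge = [0, 0, 0, 0, 100, 100, 100, 100]                # a dark-to-light step
print(detail_correct(edge))    # undershoot/overshoot appears around the transition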
DynaLatitude™
DynaLatitude is a feature offered on Sony DVCAM camcorders for capturing images with a very wide dynamic range or, in other words, images with a very high contrast ratio. For example, when shooting a subject in front of a window from inside the room, details of the scenery outside will be difficult to reproduce due to the video signal's limited 1 Vp-p dynamic range. However, DynaLatitude is a unique technology that overcomes this limitation so that both dark areas and bright areas of a picture can be clearly reproduced within this 1 Vp-p range. DynaLatitude functions in such a way that the signal is compressed within the 1 Vp-p range according to the light distribution of the picture. DynaLatitude first analyzes the light distribution or light histogram of the picture and assigns more video level (or more 'latitude') to light levels that occupy a larger area of the picture. In other words, it applies larger compression to insignificant areas of the picture and applies less or no compression to significant areas.
Figure: DynaLatitude OFF vs. ON for a scene containing highlights of up to 600% illumination.
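The following Python/NumPy sketch illustrates the general idea of histogram-weighted compression - assigning more output range to input levels that occupy a larger area of the picture. It is only an illustration of the principle, not Sony's DynaLatitude algorithm.

import numpy as np

def histogram_weighted_compress(image, bins=64):
    # Map input levels (which may exceed the nominal 0..1 range) into 0..1,
    # giving more output range ('latitude') to levels that occupy a larger
    # area of the picture - a histogram-equalization-like transfer curve.
    hist, edges = np.histogram(image, bins=bins, range=(0.0, float(image.max())))
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]
    return np.interp(image, edges[1:], cdf)

# Toy scene: a dark interior (90% of the pixels, levels 0.0-0.4) and a bright
# window (10% of the pixels, levels up to 6.0, i.e. about 600%).
scene = np.concatenate([np.linspace(0.0, 0.4, 9000), np.linspace(1.0, 6.0, 1000)])
out = histogram_weighted_compress(scene)
# Mapped darkest level, top of the interior, and brightest window level -
# everything now fits within 0..1, with the window compressed into the top.
print(out[0], out[8999], out[-1])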
File System
Professional video cameras allow a variety of detailed and complex adjustments in order to reproduce the desired colorimetry for each shooting scenario, as well as to compensate for technical imperfections in certain camera components. In order to reduce the burden of repeating an adjustment each time a shoot is performed, professional cameras provide facilities that allow these to be saved and recalled as "data files" whenever required. This File System greatly contributes to operational efficiency and has been adopted in all professional video camera systems available today. Depending on their nature and when they are used in the entire setup procedure, adjustment parameters are categorized into appropriate "data files", such as the Reference File (refer to Reference File), Scene File (refer to Scene File), Lens File (refer to Lens File), etc. In Sony camera systems, the Reference File and the Scene File can be stored on removable recording media such as Memory Stick™ and Memory Card, which enables instant reproduction of shooting conditions for particular scenes as well as the duplication of camera setup adjustments between multiple cameras.
Gain
When shooting with a video camera in low-light conditions, a sufficient signal level can often not be obtained due to a lack of light directed to the imager. For such cases, video cameras have a Gain Up function, which electronically boosts the video signal to a sufficient level for viewing on a monitor or recording to a VTR. The Gain Up function usually offers several Gain Up values, which are selected by the operator depending on the lighting. It must be noted that choosing a large Gain Up value will result in degrading the S/N ratio, since noise is also boosted. Some cameras have a minus Gain Up setting to improve their S/N ratio.
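Gain-up values on professional cameras are normally expressed in decibels (refer to Decibels (dB)). Assuming that convention, the small Python sketch below shows how boosting the video level also boosts the noise by the same factor, which is why a large gain up degrades the S/N ratio. The signal and noise levels are arbitrary example values.

def db_to_factor(db):
    return 10 ** (db / 20.0)               # voltage (video level) ratio

signal_mv, noise_mv = 100.0, 1.0            # arbitrary example levels
for gain_db in (0, 6, 12, 18):
    g = db_to_factor(gain_db)               # +6 dB roughly doubles the level
    print(f"+{gain_db} dB: signal {signal_mv * g:6.1f} mV, noise {noise_mv * g:5.2f} mV")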
Gamma
Gamma (γ) is a numerical value that describes the response characteristic between the image brightness of an acquisition device (camera) or display device (CRT monitor) and its input voltage. In order to obtain faithful picture reproduction, the brightness of the picture must be in direct proportion to the input voltage. However, in the CRTs used for conventional picture monitors, the brightness of the CRT and the input voltage are not directly proportional; instead they follow a power function. As shown in (a), the beam current (which is in proportion to the CRT's brightness) versus the input voltage rises as a curve whose exponent is larger than one. On the monitor screen, the dark areas of the signal will look much darker than they actually are, and bright areas of the signal will look much brighter than they should. Technically, this relation is expressed by the following equation: I = C x E^γ, where I is the brightness, E is the input voltage, and C is a specific constant. The exponent γ in this equation is called the 'gamma' of the CRT. It is obvious that the gamma of a picture-monitor CRT must be compensated for in order to faithfully reproduce pictures taken by the camera. Such compensation is called 'gamma correction'. Properly speaking, gamma correction should be done within the picture monitor. However, it is done within the camera, since it is more economically efficient to perform the correction within the cameras used by the broadcaster rather than in the huge number of picture monitors in the market. The goal of compensating (gamma correction) for the CRT's gamma is to output a signal such that the light that enters the camera is in proportion to the brightness of the picture tube, as shown in (b). To make the displayed brightness proportional to the light entering the camera, the camera applies a correction with an exponent of 1/γ. This exponent (1/γ) is what we call the camera's gamma. The gamma exponent of a monitor is about 2.2; thus the camera gamma used to compensate for it is about 0.45 (= 1/2.2). Although gamma correction in the camera was originally intended to compensate for the CRT's gamma, in today's cameras gamma can also be adjusted to give the camera image a specific look. For example, a film-like look can be achieved by changing the settings of the camera's gamma.
Figure: (a) CRT brightness versus input voltage; (b) the camera gamma curve applied so that overall brightness is proportional to the light entering the camera.
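A short Python sketch of the relationship described above: applying the camera gamma of roughly 0.45 (= 1/2.2) before a CRT with a gamma of 2.2 makes the displayed brightness proportional to the scene light.

CRT_GAMMA = 2.2
CAMERA_GAMMA = 1 / CRT_GAMMA          # ~0.45

def camera_correct(light):            # light: scene level, 0..1
    return light ** CAMERA_GAMMA      # camera output voltage

def crt_response(voltage):            # CRT brightness for a given input voltage
    return voltage ** CRT_GAMMA

for light in (0.1, 0.25, 0.5, 1.0):
    v = camera_correct(light)
    print(f"scene {light:4.2f} -> camera out {v:4.2f} -> CRT brightness {crt_response(v):4.2f}")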
Genlock
In multi-camera systems, it is necessary to synchronize (refer to Synchronization Signal (Sync Signal) ) the internal sync generators of each camera within the system. More specifically, the frequencies and phases of the V sync, H sync, and sub-carrier of each camera output must be synchronized with each other. Otherwise, picture shifts will occur when switching from camera to camera with the switcher system used. Synchronization is accomplished by feeding the same composite signal to each camera as a timing reference. In video technology, this is described as 'genlock', which refers to the camera's function of locking its internal sync generator to the signal supplied to its Reference IN connector. The composite signal used to synchronize the cameras can be supplied from a signal generator, the switcher, or one of the cameras within the system designated as the master.
H/V Ratio
As explained in the section on Detail Level (refer to Detail Level ), detail correction is applied to both horizontal and vertical picture edges using separate horizontal detail and vertical detail circuitries. H/V Ratio refers to the ratio between the amount of detail applied to the horizontal and vertical picture edges. It is important to maintain the balance of the horizontal and vertical detail signals to achieve natural picture enhancement. H/V Ratio should thus be checked every time detail signals are added.
Knee Aperture
When knee correction (refer to Knee Correction ) is applied to the video signal within a camera, reduction in contrast in the highlight areas cannot be avoided. This is because the highlight contrast - and the detail signals generated to emphasize this contrast - are compressed by the knee correction process which follows. To compensate for this loss in contrast, the knee aperture circuit activates to emphasize the edges of only those areas where knee correction is applied (highlights above the knee point). Knee aperture can be adjusted in the same way detail correction can but only for those areas above the knee point.
Knee Correction
When we take a photo against a strong backlight, just like shooting a portrait picture in front of a sunlit window, we can still clearly see the subject's face while being able to see the details of scenery outside the room. This is because the human eye can handle wide dynamic range (refer to Dynamic Range ). However, this is not easily done by video cameras because of the limited video-level dynamic range specified by the television standard. Therefore if the camera lens iris was adjusted for correct exposure of human skin tones, the bright areas of the image would not fit into the video-signal range and would be washed out. Vice versa, if the iris was adjusted for the bright areas, the video level of human skin tones would be very low and would look too dark. In order to obtain an image reproduction like the human eye, as naturally as possible, a function called 'Knee Correction' is widely used on today's video cameras. Knee Correction is a function that compresses the wide dynamic video signals acquired by the imager (CCDs) into the limited video-level range specified by the television standard.
Figure: Knee correction - scene illumination of up to 600% is compressed above the knee point so that it fits within the video output range, below the white clip point.
The video level from which signals are compressed is called the knee point. As shown in the figure, the video output above the knee point is compensated for to give a more gradual response. Thus some detail (contrast) can still be observed in the bright areas above the knee point, broadening the dynamic range of the camera.
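The following Python sketch is an illustrative knee transfer function; the knee point, slope, and white clip values are assumptions chosen for the example, not figures from a particular camera.

KNEE_POINT = 0.80      # video level where compression starts (0..1 scale, assumed)
KNEE_SLOPE = 0.035     # gain applied above the knee point (assumed)
WHITE_CLIP = 1.00      # absolute maximum output level

def knee(video_in):
    if video_in <= KNEE_POINT:
        out = video_in                                    # linear below the knee point
    else:
        out = KNEE_POINT + (video_in - KNEE_POINT) * KNEE_SLOPE   # compressed above it
    return min(out, WHITE_CLIP)

# Scene levels up to 600% of nominal white still fit below the clip point.
for level in (0.5, 0.8, 1.0, 2.0, 6.0):
    print(f"in {level:4.2f} -> out {knee(level):5.3f}")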
Lens File
In general, each camera lens introduces different 'offset' characteristics, which are electronically compensated for on a per-lens basis by making appropriate adjustments to the camera. However, when multiple lenses are used on the same camera, these different characteristics require the camera to be readjusted each time the lens is changed. In order to eliminate this burden, most high-end professional cameras have a so-called Lens File system. With this system, camera operators can store lens compensation settings for individual lenses within the camera as lens files. Since each lens file is assigned a file number designated by the operator, pre-adjusted lens-compensation data can be instantly recalled simply by selecting the correct file number. Simply put, once the setting for a given lens has been made and stored as a lens file, all that needs to be done to use this lens again is to select the file number associated with it. Some large studio-type lenses take this a step further by automatically recalling the correct lens file, by registering the same number in the lens's memory as that used for the associated lens file.
Level Dependence
Level Dependence is a function similar to Crispening (refer to Crispening), which is used to prevent unwanted detail signals generated by noise. While Crispening removes detail signals generated by noise at all signal levels, Level Dependence simply reduces the amplitude of detail signals generated in the low-luminance areas. In other words, Level Dependence allows a different amount of detail correction to be applied below a given threshold. Since noise is most noticeable to the human eye in dark picture areas, Level Dependence improves the signal quality in these areas. Level Dependence is effectively used when the picture content has extremely fine detail, which could be mistaken for noise and removed if Crispening were used.
Limiter
As mentioned in the section on Detail Level' (refer to Detail Level ), detail signals are used to raise the contrast at the dark-to-light and light-to-dark transitions, making the edges of objects appear sharper both horizontally and vertically. However, when there is large difference in the luminance level at the dark-to-light or light-to-dark transitions, the detail circuit is likely to generate an over-emphasized picture edge and objects may appear to 'float' on the background. This is because detail signals are generated in proportion to the difference in the luminance levels at the dark-to-light or light-todark transitions. The limiter is a circuit to suppress this unwanted effect. It 'clips off' the top of detail signals that are above a designated threshold, thereby preventing excessive detail correction for both white and black detail signals.
Figure: The limiter clips detail signals that exceed a designated threshold.
Mix Ratio
In most professional video cameras, image enhancement or detail correction (refer to Detail Level) is applied both before and after the gamma correction (refer to Gamma), as shown below. The term Mix Ratio describes the ratio between the amount of detail applied at pre-gamma detail correction and at post-gamma detail correction. The reason that detail correction is applied twice owes to the gamma correction's non-linear nature. Gamma correction used in video cameras boosts the contrast in dark picture areas while compressing it in bright areas. For this reason, pre-gamma detail correction is effective for enhancing contrast in dark areas of the image. Post-gamma correction, on the other hand, effectively enhances the brighter parts of the image. Unlike H/V Ratio (refer to H/V Ratio), there is no optimum value for Mix Ratio. It is a parameter that is adjusted depending on each operator's preference. *It should be noted that while the image will appear sharper as the amount of enhancement is increased, noise may also be enhanced.
[Figure: detail (DTL) correction applied before and after gamma correction]
Multi Matrix
Multi Matrix has been developed for further creative control in color adjustments of a scene. Unlike conventional color correction or matrix control, in which all color control parameters interact with each other, the Multi Matrix function allows color adjustments to be applied only over the color range designated by the operator. The color spectrum is divided into 16 areas of adjustment, where the hue and/or saturation of each area can be modified. For example, the hue and saturation of a flower's red petal can be changed, while keeping other colors unchanged. In addition to such special effects work, this function is also useful for matching color among multiple cameras, or for reproducing the color characteristics of another camera.
[Figure: color-difference (R-Y/B-Y) plane divided into hue/saturation adjustment areas - Normal vs. Multi Matrix On]
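A rough sketch of the idea follows (a simplified model only; it assumes the 16 areas are equal 22.5-degree slices of hue and adjusts saturation alone, whereas a real camera also adjusts hue/phase): the hue of each pixel is computed from its color-difference components, and the adjustment is applied only when the hue falls inside the operator-selected area.

import math

# Simplified Multi Matrix: scale saturation only inside one of 16 hue areas.
# The area index and gain are illustrative assumptions.
def multi_matrix(r_y, b_y, target_area, saturation_gain):
    hue = math.degrees(math.atan2(r_y, b_y)) % 360.0     # hue angle on the R-Y/B-Y plane
    area = int(hue // 22.5)                               # 360 / 16 = 22.5 degrees per area
    if area == target_area:
        r_y *= saturation_gain
        b_y *= saturation_gain
    return r_y, b_y

# Boost saturation of colors falling in area 0 (hue 0 to 22.5 degrees) only.
print(multi_matrix(0.05, 0.30, target_area=0, saturation_gain=1.4))
print(multi_matrix(0.30, -0.10, target_area=0, saturation_gain=1.4))  # unchanged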
Pedestal/Master Black
Pedestal, also called Master Black, refers to the absolute black level, or the darkest black that can be reproduced by the camera. On most cameras, pedestal can be adjusted as an offset to the set-up level. Since pedestal represents the lowest signal level available, it is used as the base reference for all other signal levels. As shown below, if the pedestal level is set too low due to improper adjustment, the entire image will appear darker than it should (the image will appear blackish and heavier). If the pedestal level is set too high, on the other hand, the image will look lighter than it should (the image will look foggy with less contrast). By taking advantage of these pedestal characteristics, it is possible to intentionally increase the clearness of an image when shooting a foggy scene, or when shooting subjects through a window, simply by lowering the pedestal level.
[Figure: pedestal level - normal and misadjusted settings]
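In signal terms, pedestal is simply an offset added to the video level. The sketch below uses illustrative values on a normalized 0-1.0 scale (an assumption, not a camera specification) to show how raising or lowering that offset lifts or sinks the whole picture, which is why a slightly lowered pedestal can 'cut through' a foggy scene.

# Simplified pedestal adjustment: add a black-level offset and clip at black.
# Offsets are illustrative, expressed as a fraction of the 0-1.0 video range.
def apply_pedestal(video, pedestal_offset):
    return [max(0.0, v + pedestal_offset) for v in video]

foggy_scene = [0.15, 0.20, 0.35, 0.60]
print(apply_pedestal(foggy_scene, 0.0))     # normal pedestal
print(apply_pedestal(foggy_scene, -0.05))   # lowered pedestal: darker, higher apparent contrast
print(apply_pedestal(foggy_scene, 0.05))    # raised pedestal: lighter, 'foggy' look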
Preset White
As mentioned in the section on 'Color Temperature' (refer to Color Temperature ), since cameras cannot adapt to the different spectral distributions of each light source, this variation must be compensated for electronically and optically. Taking white balance (refer to White Balance ) refers to compensating for the different spectral distributions electronically. Choosing the correct color conversion filter (refer to Color Conversion Filters ) is also imperative to achieving accurate white balance - it reduces the amount of white-balance adjustment that must be applied electronically. Preset White is a white-balance selection used in shooting scenarios where white balance cannot be adjusted, or where the color temperature of the shooting environment is already known (3200 K, for instance). By selecting Preset White, the R/G/B amplifiers used for white-balance correction are set to their center values. This means that, by simply choosing the correct color conversion filter, an approximate white balance can be achieved. It must be noted, however, that this method is not as accurate as taking white balance.
Reference File
For broadcasters and large video facilities, it is imperative that all cameras are set up to have a consistent color tone or 'look', specified by that facility. This is achieved by using common base-parameter settings for all cameras that govern the overall picture reproduction, such as gamma, detail, and knee (refer to Gamma, Detail Level, Knee Aperture and Knee Correction ). Reference Files are used to store such user-defined reference settings so they can be quickly recalled, reloaded, or transferred from camera to camera. The parameters that can be stored in a camera's reference file may differ slightly between camera types. This difference is due to different design philosophies regarding which base parameters should be commonly shared between all cameras.
Return Video
In combination with the Intercom System (refer to Intercom (Intercommunication) System ) and Tally systems (refer to Tally ), Return Video plays a vital role in multi-camera systems. The main purpose of Return Video is to allow a camera operator to view images captured by other cameras used in the system (or images being aired) by displaying these in the viewfinder of the camera he or she is operating. As shown in the figure below, when Camera 1 is shooting a subject, this same image can be simultaneously displayed on the viewfinders of Camera 2 and Camera 3. This is done by routing the output signal of each camera to all other cameras within the system via their CCUs (refer to Camera Control System ). In camera terminology, this signal is called the Return Video. By using Return Video, the camera operator can switch the viewfinder to display the output of the camera he/she is operating, or the Return Video from other cameras in the system. Most professional cameras can accept two to four return-video signals. Return-video signals are sent from CCU to CCU as a composite video signal, since full picture quality is not required for this purpose. The use of Return Video allows each camera operator to know how other cameras are framing the subject - camera angle, zoom, etc. This allows each camera operator to always be prepared to go on-air, with seamless switching achieved from camera to camera.
[Figure: Return Video - the image shot by Camera 1 routed via the CCUs to the viewfinders of Camera 2 and Camera 3]
Scene File
Reference Files (refer to Reference File ) store parameter data that govern the overall 'look' common to each camera used in a facility. Scene Files, on the other hand, are provided to store parameter settings for color reproduction particular to each 'scene'. Scene Files can be easily created and stored for particular scenes by overriding the data in the Reference File. Scene Files allow camera operators to instantly recall the previously adjusted camera data created for scenes taken outdoors, indoors, or under other lighting conditions whenever demanded.
Tally
Tally is a system equipped on professional camcorders and studio cameras that is used to alert those involved in the shoot to the status of each camera. The word 'tally' can be used to describe the entire tally system, the lamps used to indicate the tally signal, or the tally signal itself. In acquisition operations, a tally equipped on a camcorder usually refers to the REC Tally and is often distinguished from those used in a multi-camera studio system. All professional camcorders have a 'tally lamp' on the front of the viewfinder (for the actor, announcer, etc.) and one within the camera viewfinder. The primary purpose of this tally is to inform those who will be shot (all performers who will be shot by the camcorder) that the camcorder is recording (it is therefore called the REC tally). When the camcorder is put into 'record' mode, the tally lamps on the viewfinder light up in red. In multi-camera studio systems, tallies play a different role: they inform both the performers and the camera operators which camera in the studio is being put to air. In multi-camera systems, the camera output to be put to air (or recorded) is decided by the staff in the control room from the switcher control panel. When a camera is selected for on-air by the switcher, the switcher will also send a tally signal to the camera via the camera control unit controlling that camera. In this way, the camera contributing to the on-air signal will light its tally lamp in red, informing both the performers and camera operators which camera is active. There are several tally lamps on studio cameras. The largest tally lamp is located on the upper front of the camera body so it can be easily recognized by all staff in the studio. There are also tally lamps located on the viewfinders attached to the cameras, which are used to communicate with the crew behind the camera, including the camera operator. Tally lamps are also provided on the camera control unit (usually located in the control room) as well as on the video monitors in the control room.
Tele-Prompter
When watching news programs on TV, viewers usually notice newscasters delivering a sequence of stories without referring to a script or taking their eyes off the camera. To some, this may appear as the newscaster having a good memory, yet in most cases they are actually viewing scripts displayed on a system called a tele-prompter. As shown below, illustrating the traditional tele-prompter system, a ceiling-mounted camera above the desk faces down to shoot the script. The image from this camera is sent to the studio camera via the CCU (refer to Camera Control System ). In front of the studio camera there is a device comprising two parts - a picture monitor facing upwards, and a half mirror to display the newscaster's scripts captured by the ceiling-mounted camera. Key to this system is the half mirror, which allows the picture monitor screen to be seen by the newscaster but appears transparent to the camera (operator). The video signal of the script images is fed to this monitor from the prompter output located on the side of the studio camera. This mechanism allows the newscaster to view the script while looking straight into the camera lens. Tele-prompter systems today generate the script on a computer, displayed on a computer screen viewed through the half-silvered mirror. The tele-prompter system has been a very helpful tool for supporting newscasters all over the world.
[Figure: Tele-Prompter mechanism - script image fed via the CCU to a monitor viewed through a half mirror in front of the camera]
[Figure: TLCS operation - an iris F-stop value for the AE effective point (F5.6 to F16) or the AGC effective point (F1.8 to F5.6) can be preset from the Advanced menu; the AE control range is equivalent to 2 F-stops max. (up to a 1/250-sec. shutter speed) and the AGC control range is equivalent to 2 F-stops max. (the upper AGC gain limit can be preset to 0/3/6/9/12 dB from the Advanced menu)]
Triax
Triax is a unique transmission system widely used in broadcast camera systems due to its reliability, flexibility, and convenience. In brief, Triax is an interface system used for connecting a camera to its associated camera control unit, for the transmission of signals that must be communicated between the two devices. In the earlier days of camera systems, multi-core interfaces were used, in which each signal was transmitted through an individual wire bundled within the multi-core cable. Although the multi-core interface was convenient for use in the studio, it also had a serious drawback - a limitation in transmission distance, most notable in outside-broadcasting applications. This was because thin wires had to be used to transmit each of the many signals through one multi-core cable. The Triax system brought an innovative solution to this drawback. Instead of using one wire per signal, the Triax system allows all signals to be communicated between the camera and the camera control unit through one cable. Each signal is modulated on a carrier signal (frequency) specific to that signal so that signals do not interfere with each other. This allows the signals to be added together and transmitted through the same wire. It also allows bi-directional transmission between the camera head and the camera control unit through the same wire. Since only one wire is used in a Triax cable, a wide-diameter wire can be used. Using a wide-diameter wire naturally results in longer transmission distances without signal-level drop. The result is a transmission distance of up to 2,000 m with the Triax system (when using a 14.5 mm diameter cable and a full-rack CCU). The Triax system is state-of-the-art technology - allowing the communication of all signals between the camera and camera control unit through one simple and flexible cable connection. The figure shows the signals communicated and their frequency allocations.
[Figure: Triax system - signals communicated between the camera and CCU and their frequency allocations]
Turbo Gain
Turbo Gain is a function adopted in Sony video cameras and camcorders that helps when shooting in the dark. Turbo Gain is basically an extension of Gain Up (refer to Gain ) but offers a larger level boost (+42 dB) to the video signal.
V Modulation
V Modulation is a type of white shading (refer to White Shading ) that occurs when there is a vertical disparity between the center of the lens and the prism axis. This causes the red and blue light components to be projected 'off center' on their associated CCDs, which results in green and magenta casts appearing at the top and bottom of the picture frame. V Modulation is caused by the different characteristics of each lens and/or the different optical axis at each zoom position, and can be compensated for in the camera. Since this compensation data relates directly to the lens, it is automatically recalled as part of the Lens File (refer to Lens File ).
White Balance
As mentioned in the section on Color Temperature (refer to Color Temperature ), video cameras cannot adapt to the different spectral distributions of each light source. Therefore, in order to obtain the same color under each different light source, this variation must be compensated for electrically by adjusting the video amps of the camera. For example, imagine shooting a white object. The ratio between the red, green, and blue channels of the camera video output must be 1:1:1 to reproduce white. This ratio must stay the same under any light source (when shooting a white object). However, as in the earlier discussions of Color Temperature, the spectral distribution of light emitted from each light source differs. This also means that the spectral distribution of the light that reflects from the white object and enters the camera prism will change according to the light source. As a result, the output of the three red, green, and blue CCDs will vary depending on the light source under which the white object is shot. For example, when the white object is shot under 3200 K, the signal output from the blue CCD will be very small while that of the red CCD will be very large.
Note: in the figure, white balance for 3200 K appears to require more adjustment of the video amps than for 5600 K. However, the video amps of most cameras are preset to operate at color temperatures around 3200 K, so less gain adjustment is actually required.
This relation reverses for a 5600 K light source. As earlier mentioned, white can only be produced when the red, green, and blue video channels are balanced (R:G:B = 1:1:1), and therefore, electrical adjustments must be done at the CCD output. In the latter example (5600 K), the video amp of the blue CCD must be adjusted to have a gain smaller than 1, making the red, green, and blue signals equal in amplitude. This adjustment is called white balance. In brief, white balance refers to the adjustment of the video amps of the three CCDs, according to the color of the light source, to obtain a 1:1:1 ratio for the red, green, and blue signal levels in order to reproduce white. It is most important to note that, once white balance is adjusted, other colors come into balance as well. White balance should be adjusted frequently when the camera is used outdoors as color temperature changes rapidly with time.
[Figure: relative spectral energy (400-700 nm) reaching the R, G, and B channels under 3200 K and 5600 K sources, and the R=G=B balance after white-balance adjustment]
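A minimal sketch of the adjustment follows (illustrative only; real cameras adjust analog video amps, and typically only the R and B gains relative to G). The gains are derived from a shot of a white reference so that the R, G, and B outputs come back to a 1:1:1 ratio; the numeric channel levels used here are assumed example values.

# Simplified white balance: derive R/B amp gains from a shot of a white object
# so that R:G:B returns to 1:1:1. Channel values are illustrative.
def white_balance_gains(r_white, g_white, b_white):
    return g_white / r_white, 1.0, g_white / b_white     # gain_r, gain_g, gain_b

def apply_gains(r, g, b, gains):
    return r * gains[0], g * gains[1], b * gains[2]

# White object shot under a low-color-temperature (red-rich, blue-poor) source:
gains = white_balance_gains(1.4, 1.0, 0.6)
print(apply_gains(1.4, 1.0, 0.6, gains))    # -> (1.0, 1.0, 1.0), white reproduced
print(apply_gains(0.7, 0.5, 0.3, gains))    # other colors come into balance as well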
White Clip
All cameras have a white-clip circuit to prevent the camera output signal from exceeding a practical video level, even when extreme highlights appear in a picture. The white-clip circuit clips off, or electrically limits, the video level of highlights to a level that can be reproduced on a picture monitor.
[Figure: white-clip characteristic - video output level versus illumination (up to 600%)]
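The operation itself is a simple hard limit on the video level, as in the sketch below (the clip point of 1.0, corresponding to a 100% video level, is an assumed normalization).

# Simplified white clip: limit highlights to a reproducible video level.
def white_clip(video, clip_level=1.0):        # 1.0 corresponds to a 100% video level
    return [min(v, clip_level) for v in video]

print(white_clip([0.4, 0.9, 1.7, 3.2]))       # extreme highlights are clipped to 1.0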
Zebra
Zebra is a feature that displays a striped pattern in the viewfinder across highlight areas that are above a designated brightness level. This is particularly useful when manually adjusting the iris (F-number) of the lens (refer to F-number and Iris ). Two types of zebra mode are available, indicating either the 100 IRE or the 70-90 IRE brightness level. They are used differently, so it is important to know which one to pay attention to. The 100 IRE Zebra displays a zebra pattern only across areas of the picture that exceed 100 IRE, the upper limit of legal video (100 IRE is pure white in NTSC (refer to NTSC/PAL ); even though the camera can expose well above this, white levels above 100 IRE are illegal for broadcast). With this zebra mode, the camera operator adjusts the iris until the zebra becomes visible in highlight areas. The second type, the 70-90 IRE zebra, displays a zebra pattern on highlights between 70 and 90 IRE, and the pattern disappears above the 90 IRE level. This is useful for determining the correct exposure for a subject's facial skin tones, since properly exposed skin highlights (in the case of Caucasian skin) are likely to fall around the 80 IRE level.
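Conceptually, the zebra generator compares the level of each pixel against the selected window and overlays a stripe pattern where the condition is met, roughly as sketched below (a simplified model; the stripe geometry and the exact window limits are assumptions).

# Simplified zebra: mark pixels whose level falls in the selected IRE window.
# Diagonal stripes are simulated with (x + y) % 4 < 2; purely illustrative.
def zebra_mask(luma_ire, mode="70-90"):
    low, high = (100.0, 1e9) if mode == "100" else (70.0, 90.0)
    mask = []
    for y, row in enumerate(luma_ire):
        mask_row = []
        for x, level in enumerate(row):
            in_window = low <= level <= high
            mask_row.append(in_window and (x + y) % 4 < 2)   # stripe pattern
        mask.append(mask_row)
    return mask

frame = [[60, 75, 85, 105],
         [65, 80, 95, 110]]
print(zebra_mask(frame, mode="70-90"))
print(zebra_mask(frame, mode="100"))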
VTRs
EZ Focus
EZ Focus is a feature employed in the high-end DXC series cameras and some DSR (DVCAM) camcorders to make manual focusing easier (it is not an auto-focus function). EZ Focus is activated at the touch of a button located on the control panel of the camera. When activated, the camera instantly opens the lens iris (refer to Iris ) to its widest opening, which shortens the depth of field (refer to Depth of Field ) and in turn allows easier manual focusing. While this mode is active, the video level is properly maintained by activating the electronic shutter. The lens iris is kept open for several seconds and then returns to the iris opening used before EZ Focus was activated.
EZ Mode
In newsgathering, the camera operator must be prepared to start shooting immediately when unexpected incidents occur. In such situations, it is likely that there will be no time for adjustments. EZ Mode provides a solution. EZ Mode is a feature that instantly sets the main parameters of the camera to their standard positions and activates auto functions such as ATW (refer to ATW (Auto Tracing White Balance) ), TLCS (refer to TLCS (Total Level Control System) ), and DCC (refer to Dynamic Contrast Control (Automatic Knee Control) ). This is achieved simply by pressing a button located in a position that can easily be accessed when the camera is mounted on the user's shoulder. In EZ Mode, the camera operates virtually fully automatically.
SetupLogTM
SetupLog is a unique feature adopted in high-end DVCAM camcorders. This function automatically records, or logs, the camera settings - including the iris opening (refer to Iris ), the gain-up setting and filter selection (refer to Gain ), as well as the basic and advanced menu settings - to a DVCAM tape. SetupLog data is constantly recorded to the Video Auxiliary data area of the DVCAM tape while the camcorder is in Rec mode, making it possible to recall the camera settings that were used during any given shoot. SetupLog is particularly useful when the camera must be serviced, since the technician can recall the camera settings that were used when a recording was not made correctly and examine what went wrong.
SetupNaviTM
SetupNavi is a unique feature adopted in high-end DVCAM camcorders. As opposed to SetupLog (refer to SetupLogTM ), which takes a log of the camera settings in real time, SetupNavi is specifically intended for copying camera setup data between cameras using a DVCAM tape as the copy medium. Activating this feature allows all camera setup parameters, including the key/button settings, basic and advanced menus, service menus, etc., to be recorded to a DVCAM tape. It is important to note that SetupNavi data is recorded only when activated by the operator, and that the DVCAM tape used to record this data cannot be used for other purposes. SetupNavi is particularly useful in the studio for copying setup data between multiple cameras. It is also convenient when the camera is used by multiple operators or for multiple projects, since the exact camera settings can be quickly recalled by inserting the DVCAM tape with the SetupNavi data.
Others
Additive Mixing
Prior to the development of the color video system, experiments in colorimetry proved that most hues visible to the human eye can be composed using the three primary colors: red (R), green (G), and blue (B) (figure A). This also holds true when separating a specific hue - that is, any hue can be separated into a combination of these three primary color components. This is called Additive Mixing. The mechanism of reproducing color on a picture monitor is based on this principle and is a good example for understanding how Additive Mixing works. In a video monitor CRT, three R, G, and B guns each emit electrons (electron beams) corresponding to the amount of the R, G, and B components in the hue that is to be reproduced (figure B). This results in the emission of light from each of the R, G, and B phosphors with intensities proportional to their associated electron beams. To the human eye, these lights are seen as one light beam with the appropriate hue when viewed from a certain distance. The mechanism of a color video camera uses the reverse process compared to a picture monitor. The light that enters the camera's lens is first separated into the three primary colors, R, G, and B, using a prism system (refer to Prism ) known as the dichroic mirror. These color light components are converted into R, G, and B signal voltages by their associated R, G, and B CCD imagers. The R, G, and B signals are then processed into the appropriate signal formats to construct the output signal.
*Note: In the Sony Trinitron, only one gun is used to emit the three R, G, and B electron beams.
[Figures: (A) additive mixing of red, green, and blue light producing cyan, magenta, yellow, and white; (B) Sony Trinitron CRT - electron gun and aperture grill]
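As a numerical illustration of additive mixing (a simplified sketch using normalized 0-1.0 values; actual phosphor primaries and display characteristics differ), adding the three primaries in different amounts produces the secondary colors and white:

# Simplified additive mixing of the R, G, B primaries (values normalized to 0-1.0).
def mix(r, g, b):
    return {"R": r, "G": g, "B": b}

print(mix(1.0, 1.0, 0.0))   # red + green   -> yellow
print(mix(0.0, 1.0, 1.0))   # green + blue  -> cyan
print(mix(1.0, 0.0, 1.0))   # red + blue    -> magenta
print(mix(1.0, 1.0, 1.0))   # all three     -> white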
override the general settings made on the MSU and make them appropriate for the given camera. Also, any adjustments made during the shoot are made on the RCPs since these are exclusive to each camera.
[Figure: camera control system - multiple camera/CCU chains controlled from an MSU]
Y/C
The Y/C signal is obtained by using an R-Y/B-Y encoder*. The R-Y/B-Y signals are modulated on a 3.58-MHz** subcarrier using quadrature modulation and are combined into one chrominance signal (C). The bandwidth of the brightness signal is the same as that of the Y component of the Y/R-Y/B-Y signal. The bandwidth of the color is equivalent to that of the R-Y/B-Y signals, but slightly distorted due to the quadrature modulator and the band-pass filter used to eliminate high-frequency harmonics.
*In actuality, an I/Q encoder is used in the NTSC system. **3.58 MHz for NTSC, 4.43 MHz for PAL.
Y/R-Y/B-Y
The Y/R-Y/B-Y signal is obtained by applying the RGB signals to a matrix circuit, which uses defined coefficients. Information on the brightness of the three RGB signals is converted into one signal called the luminance signal (Y), while information on color is converted into two signals called the color difference signals (R-Y/B-Y). Information on luminance is equivalent to that of the RGB signal. The bandwidth of the color (R-Y/B-Y) derived from the RGB signal may be limited to some extent (around 1.5 MHz) but sufficiently covers the human eye's sensitivity (resolution) to fine detail of color, which is much lower than that of brightness.
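For standard-definition video, the matrix uses the well-known luminance coefficients, roughly as sketched below (a simplified model; the scaling and weighting of the color-difference signals applied for transmission are omitted here).

# Matrix from RGB to luminance (Y) and color-difference (R-Y, B-Y) signals.
# Coefficients are the standard-definition luminance weights (ITU-R BT.601).
def rgb_to_component(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, r - y, b - y           # Y, R-Y, B-Y

print(rgb_to_component(1.0, 1.0, 1.0))   # white: Y = 1.0, color differences = 0
print(rgb_to_component(1.0, 0.0, 0.0))   # pure red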
Composite
The luminance (Y) and chrominance (C) signals of the Y/C signal are added to form one signal, which contains both brightness and color information. The quadrature modulator mentioned above uses a color carrier signal with a frequency of approx. 3.58 MHz for NTSC and 4.43 MHz for PAL, so the resultant chrominance (C) signal's spectrum interleaves with the luminance signal's spectrum. This prevents the two signals from interfering with each other when added together to form the composite signal. This method allows the composite signal to be separated back into its luminance and chrominance components for display on a picture monitor.
[Figure: color signal forms - RGB, Component (Y/R-Y/B-Y), Y/C, and Composite]
Decibels (dB)
In electronics, it is often required to handle signal levels over a very wide range. The use of logarithms provides an easier way of expressing both small and large values. It is also more convenient to know the ratio between a signal's amplitude and one defined reference signal (e.g., 1.0 V in video electronics) than to know the actual amplitude of the signal. For these reasons, decibels are used to express signal level. Decibels are defined by the following equation:

dB = 20 log (v'/v)   (v: reference signal level)

By rearranging this we have v' = v x 10^(dB/20). As the relative magnitude is being discussed, substituting v = 1 gives v' = 10^(dB/20). The following chart shows some examples of this calculation.

dB      -60     -50      -40    -30     -20    -10     -3
ratio   0.001   0.00316  0.01   0.0316  0.1    0.316   0.708

dB      0      +3      +6      +9      +12     +15     +18
ratio   1.0    1.413   1.995   2.818   3.981   5.623   7.943

Here it should be noted that a 20-dB drop in signal level means a one-tenth drop in amplitude, that 6 dB is almost equal to a factor of 2, and that conversion from -dB to +dB gives the reciprocal ratio. The use of decibels also allows easier calculation, since multiplication is handled as addition. A typical example is the total gain of a series of amplifiers, which is calculated by adding the gain (in decibels) of each amplifier.
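The conversions and the 'multiplication becomes addition' property can be checked with a few lines of code (a simple sketch using the 20 log rule for amplitude ratios):

import math

def ratio_to_db(ratio):
    return 20.0 * math.log10(ratio)

def db_to_ratio(db):
    return 10.0 ** (db / 20.0)

print(round(db_to_ratio(-20), 3))     # 0.1   : a 20-dB drop is one-tenth the amplitude
print(round(db_to_ratio(6), 3))       # 1.995 : +6 dB is almost a factor of 2
# Total gain of amplifiers in series: multiply the ratios, or simply add the decibels.
print(round(ratio_to_db(2.0 * 10.0), 1), round(ratio_to_db(2.0) + ratio_to_db(10.0), 1))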
Dynamic Range
In general, dynamic range indicates the difference or ratio of the smallest and largest signal that can be handled by a device. Although seldom used in camera technology, the term dynamic range is sometimes used to indicate the ratio between the smallest and largest amount of light that the imaging device can capture. With 2/3-inch CCDs, this is normally 600% while 1/2-inch CCDs have a dynamic range of 300%. The definition in this case is based on the amount of light required to produce a 1.0 Vp-p video signal, which is 100%. This means that an imager with 600% dynamic range has the capability to produce a video signal with six times the amplitude of the 1.0 V video standard. Technologies such as Knee Correction, DCC, and DynaLatitude are used to compress this amplitude into the 1.0 V range (refer to Knee Correction, Dynamic Contrast Control (Automatic Knee Control), and DynaLatitudeTM ).
Horizontal Resolution
This term describes a camera's capability to reproduce the details of its subject. It is expressed as the number of black and white vertical lines resolvable within three-fourths of the picture width (a width equal to the picture height in the 4:3 format). It is important to note that horizontal resolution does not refer to the number of resolvable lines within the total picture width. This is because it must be expressed under the same conditions as vertical resolution, and thus over only three-fourths of the picture width. Horizontal resolution is usually measured by framing a resolution chart. The camera's horizontal resolution is read from the maximum calibration of the vertical wedges at which the black and white lines can still be resolved. Horizontal resolution can also be measured as the maximum number of vertical black and white lines for which the camera output exceeds a 5% video level. Measurement of horizontal resolution must be performed with gamma (refer to Gamma ), aperture, and detail (refer to Detail Level ) set to 'On', and masking set to 'Off'.
Interlace/Progressive
Interlace is the scanning method that has been adopted in today's television broadcast systems such as NTSC and PAL (refer to NTSC/PAL ). In brief, interlace scanning refers to scanning every other TV line of an image as the first field, and then scanning the lines in between as the second field. Since interlace scanning is a method inherited from the black-and-white TV days, it is worthwhile to know why it continues to be used in the NTSC color broadcast system. To display motion pictures on a video monitor with negligible flicker, tests have demonstrated that at least 50 to 60 images must be displayed within one second (film is an exception here). However, the bandwidth available for broadcasting the TV signal (6 MHz in NTSC areas, 7 MHz in PAL areas) was not wide enough to transmit 60 full images per second. It was therefore necessary to find a solution that reduced the signal's bandwidth with minimum quality loss. After thoroughly investigating the characteristics of the human eye, a solution was found. The result was to use a 1/60-second rate for each scan but, in the first scan from the top to the bottom of the image, to scan only the odd-numbered lines (1, 3, 5...525), and in the next scan, only the even-numbered lines (2, 4, 6...524). Since the image rate is 60 per second, this was successful in keeping flicker to a negligible level. Also, through visual experiments, it was found that despite the line count of each scan being reduced to one-half, the perceived vertical resolution dropped only to about 70% of the total 525 (625 for PAL) TV lines. This is due to the human eye's characteristic of retaining an after-image over the short 1/60-second field period. In this way, researchers decided to use the above interlace scanning method for the NTSC and PAL TV systems. Newer broadcast transmission infrastructures are less limited in bandwidth. This gives broadcasters the option of using an interlace system or a non-interlaced system - the latter known as progressive scanning. Progressive scanning has been adopted in computer displays, which are not constrained by transmission bandwidth. In progressive scanning, each line is scanned in order from the very top to the bottom, and the entire 525 lines (or 625 lines for PAL) are displayed in one scanning operation. Thus, superior vertical resolution is achieved.
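The scanning order can be sketched in a few lines (an illustrative model for a 525-line frame; blanking lines are ignored):

# Interlace vs. progressive scanning order for a 525-line frame (blanking ignored).
def interlaced_fields(total_lines=525):
    field1 = list(range(1, total_lines + 1, 2))   # odd lines:  1, 3, 5, ... 525
    field2 = list(range(2, total_lines + 1, 2))   # even lines: 2, 4, 6, ... 524
    return field1, field2

def progressive_frame(total_lines=525):
    return list(range(1, total_lines + 1))        # every line, top to bottom

f1, f2 = interlaced_fields()
print(f1[:5], f2[:5])          # [1, 3, 5, 7, 9] [2, 4, 6, 8, 10]
print(progressive_frame()[:5]) # [1, 2, 3, 4, 5]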
Minimum Illumination
Minimum Illumination indicates the minimum amount of light required for shooting with a given camera. It is expressed in lux. When comparing minimum illumination specifications, it is important to consider the conditions under which they were measured. All cameras provide a Gain Up (refer to Gain ) function to boost the signal level. Although convenient for shooting under low light, Gain Up also boosts the noise level of the signal. Minimum illumination is usually measured with the highest Gain Up setting provided by the camera, and therefore does not represent the camera's sensitivity in any way. Simply put, it is best to have a low minimum illumination achieved at a relatively small Gain Up setting.
Modulation Depth
Modulation Depth is an important factor that governs the resolving performance of a video camera. While Horizontal Resolution (refer to Horizontal Resolution ) indicates only the limit of resolving ability, it is equally important to know the Modulation Depth - the camera's resolving performance across the frequency range practically used within the video bandwidth. Although horizontal resolution does indicate the ability to reproduce fine picture detail, it can be somewhat misleading when judging a camera's performance. This is because horizontal resolution only defines the finest level of detail that is viewable - not how clearly or sharply it is viewable. Modulation Depth, on the other hand, indicates how sharply or clearly an image is reproduced. For this reason, Modulation Depth focuses on the response in the frequency ranges most used in video. Simply put, it is the frequency response in practical frequency ranges that governs the camera's sharpness - not the horizontal resolution. A camera's frequency response is usually measured by shooting a Multi Burst chart. A typical frequency chart has vertical black and white lines which, when converted to video, translate into 0- to 5-MHz signals. When measuring Modulation Depth, the 5-MHz area is usually used, because 5 MHz is a frequency that well represents a video camera's performance. The closer the response is to 100% at 5 MHz, the higher the capability to reproduce clear and sharp picture detail. The Modulation Depth of video cameras for standard-definition broadcasting is typically in the range of 50% to 80% (at 5 MHz). It must be noted that Modulation Depth can be influenced by the performance of the camera lens, and thus measurements should be conducted with an appropriate lens.
[Figure: overall Y response (with optical LPF, without lens) - depth of modulation versus horizontal frequency (0-20 MHz) for Power HAD (520,000 pixels), Pbo 1-1/4-inch (XQ3430), and Pbo 2/3-inch (XQ3457)]
NTSC/PAL
NTSC is an acronym that stands for National Television Systems Committee, the organization that stipulated the standards for the NTSC television system. When used in video terminology, NTSC refers to the color television system used in North America, Japan, and parts of South America. It employs an interlace scanning system (refer to Interlace/Progressive ) of 525 lines per frame/30 frames per second. PAL is the acronym that stands for Phase Alternation by Line. This term refers to the color television system mainly adopted in Europe, China, Malaysia, Australia, New Zealand, the Middle East, and parts of Africa. It has an improved color coding system that reduces the color shift problem that sometimes occurs with NTSC color coding. PAL employs an interlace scanning system with 625 lines per frame/25 frames per second. Some systems, for example PAL-M as used in Brazil, use PAL color coding with 525-line/60 Hz scanning.
RS-170A
RS-170A is a color-synchronization signal standard enacted by the EIA (Electronic Industries Association). This standard includes additional regulations that did not exist in the original color-synchronization signal standard enacted by the FCC (Federal Communications Commission). The main objective of adding these regulations was to prevent hue differences on TV receivers from one broadcast station or program to another. However, the rapid technical advances seen in TV receivers have reduced these hue differences, and the additional regulations have become of less importance, except in the following cases. Otherwise, sync shock may be caused during switching.
3. In digital devices
The A/D converters in digital VTRs and TBCs use the subcarrier as the reference for A/D conversion timing. If SC-H is not maintained, A/D conversion will take place at the wrong areas, and will result in picture shift when the signal is reproduced on picture monitors.
SDI
SDI is an acronym that stands for Serial Digital Interface. SDI is an industry-standard interface designed for transmitting non-compressed (baseband) digital video and audio. SDI has become extremely popular for two reasons: its compatibility with analog coax cables and its long transmission distance. The SDI interface is offered in three versions: composite digital SDI (standard definition), component digital SDI (standard definition), and HD-SDI (high definition). Technically, the transmission rates are 270 Mbps and 1.5 Gbps for standard-definition SDI and high-definition SDI, respectively. SDI offers the following advantages over analog I/Os:
1. Virtually degradation-free picture and audio quality thanks to noise-resistant digital transmission.
2. Conventional coax cable can be used.
3. Only one BNC cable is required for connecting video, audio, and time code.
4. Longer transmission distance (as compared to analog) allows flexible cabling (up to 200 m for SD and 100 m for HD).
Sensitivity
The sensitivity of a camera is described by the iris (refer to Iris ) opening required to provide a video signal of sufficient level under a specified lighting condition. In general, sensitivity is measured using an 89.9% reflectance grayscale chart illuminated by a 2000-lux, 3200 K illuminant. The F-stop (refer to F-number ) at which the camera provides a 100% white video level when shooting this chart is the sensitivity of the camera*. In CCD cameras, sensitivity is mainly determined by the aperture ratio of the photo-sensing sites. The more light gathered onto each photo sensor, the larger the CCD output, and the higher the sensitivity. However, it is also important to know that the true sensitivity may differ from the total sensitivity of the camera. This is because the total sensitivity of a camera correlates with the camera's S/N ratio. That is, the camera's sensitivity can be improved by increasing the gain of the video amp, but at the cost of a drop in the camera's S/N ratio. An example of a sensitivity specification is shown below.
F5.6 at 2000 lux (3200 K, 89.9% reflectance)
*Note: the gain selector must be set at 0 dB.
VBS/BS Signal
The VBS (Video Burst Sync) signal is a composite video signal in which the active video area contains picture content or color bars. As opposed to the VBS signal, the BS (Burst Sync) signal does not contain picture content and the active video area is kept at setup level.
[Figure: VBS signal (video, burst, H-sync) and BS signal waveforms]
Vertical Resolution
Vertical resolution describes the fineness of images in the vertical direction. While horizontal resolution (refer to Horizontal Resolution ) varies from camera to camera and is used as a reference to describe picture quality, vertical resolution is determined only by the number of scanning lines in the particular TV format. For example, the vertical resolution of all existing NTSC video equipment is 525 TV lines (to be precise, the effective vertical resolution is 486 TV lines due to interlace scanning), regardless of the different horizontal resolutions each piece of equipment provides. Therefore, unlike horizontal resolution, vertical resolution is usually of less concern when evaluating a camera's performance.
INDEX
A
Adaptive Highlight Control ....................................................24 Additive Mixing ........................................................................ 54 Angle of View ............................................................................2 ATW (Auto Tracing White Balance) ....................................24 AWB (Auto White Balance) ................................................... 24
G
Gain .......................................................................................... 31 Gamma .................................................................................... 32 Genlock ................................................................................... 32
H
H/V Ratio ................................................................................. 33 HAD SensorTM ........................................................................ 16 HD/SD (High Definition/Standard Definition) ..................... 57 Horizontal Resolution ............................................................ 57
B
Black Balance ......................................................................... 25 Black Clip .................................................................................25 Black Gamma ......................................................................... 25 Black Shading ......................................................................... 26
I
Intercom (Intercommunication) System ............................. 33 Interlace/Progressive ............................................................ 58 Iris ............................................................................................... 6 IT/FIT CCD .............................................................................. 16
C
Camera Control System ........................................................ 54 Center Marker ......................................................................... 26 Chromatic Aberration ...............................................................3 Clear Scan/Extended Clear Scan (ECS) ............................ 26 ClipLinkTM/Index Picture/Automatic Logging Function ...... 50
K
Knee Aperture ........................................................................ 33 Knee Correction ..................................................................... 34
Color Bars ................................................................................27 Color Conversion Filters .......................................................... 3 Color Signal Forms ................................................................ 56 Color Temperature ...................................................................4 Crispening ............................................................................... 28 Cross Color Suppression ...................................................... 28
L
Lens File .................................................................................. 34 Level Dependence ................................................................. 34 Light and Color ......................................................................... 7 Limiter ...................................................................................... 35 Linear Matrix Circuit .............................................................. 36 Low Key Saturation ............................................................... 36
D
Decibels (dB) ........................................................................... 56 Depth of Field ............................................................................4 Detail Level ..............................................................................29 DynaLatitudeTM ........................................................................ 30 Dynamic Contrast Control (Automatic Knee Control) ....... 30 Dynamic Range ...................................................................... 57
M
Minimum Illumination ............................................................ 59 Mix Ratio ................................................................................. 37 Modulation Depth ................................................................... 59 MTF (Modulation Transfer Function) .................................... 8 Multi Matrix .............................................................................. 38
E
Electric Soft Focus .................................................................31 EVS/Super EVS ...................................................................... 14 EZ Focus .................................................................................50 EZ Mode .................................................................................. 51
N
Neutral Density (ND) Filters ................................................... 8 NTSC/PAL .............................................................................. 60
F
Field Integration and Frame Integration Mode ..................15 File System ..............................................................................31 Flange-Back/Back Focal Length ............................................4 Flare ........................................................................................... 5 F-number ...................................................................................5 Focal Length ............................................................................. 6
O
On Chip Lens .......................................................................... 17 Optical Low Pass Filter ........................................................... 9
P
Pedestal/Master Black .......................................................... 38 Picture Element ...................................................................... 18 Preset White ........................................................................... 39 Prism .......................................................................................... 9 PsF (Progressive, Segmented Frames) ............................. 60
R
Readout Mechanism ............................................................. 18 Reference File ........................................................................ 39 Return Video ........................................................................... 40 RPN (Residual Point Noise) ................................................. 19 RS-170A .................................................................................. 61
S
S/N (signal-to-noise) Ratio ................................................... 61 Scene File ............................................................................... 40 SDI ........................................................................................... 61 Sensitivity ................................................................................ 62 SetupLogTM ............................................................................. 51 SetupNaviTM ............................................................................ 51 Skin Tone Detail Correction ................................................. 41 Spatial Offset Technology .................................................... 20 Sub-carrier Phase Control/Horizontal Phase Control ...... 41 Synchronization Signal (Sync Signal) ................................ 62
T
Tally .......................................................................................... 42 Tele-Prompter ........................................................................ 42 TLCS (Total Level Control System) .................................... 43 Triax ......................................................................................... 44 TruEyeTM (Knee Saturation Function) Processing ............ 44 Turbo Gain .............................................................................. 45
V
V Modulation ........................................................................... 45 Variable Speed Electronic Shutter ...................................... 21 VBS/BS Signal ....................................................................... 63 Vertical Resolution ................................................................. 63 Vertical Smear ........................................................................ 22
W
White Balance ........................................................................ 46 White Clip ................................................................................ 47 White Shading ........................................................................ 10
Z
Zebra ........................................................................................ 47 Zoom ........................................................................................ 11
Acknowledgement
With the advent of sophisticated information networks made possible by the development of the Internet, we now have the capability to search and access the information we need much more easily and quickly than ever before. However, in my searches for information, I could not locate any documentation or literature that comprehensively explains professional video camera terminology in a manner that is easily understood by the layman. All of my searches turned up documents that were highly technical and very difficult to comprehend for people who are new to this industry or are unfamiliar with technical language and jargon. I felt that, "There should be some solution to this lack of good information." And this is how the production of this document began. My goal in creating this document was to cover most of the basic terminology that I feel to be significant in understanding professional video cameras. I also paid special attention to writing in a style whose explanations are comprehensible even to those who have a limited technical background. I hope that this document invites its readers to become part of the world of professional video cameras and that it provides a helpful solution in making the induction process a smooth one. Creating this document was the result of more than my individual effort. Without the support and cooperation of my colleagues in a number of different departments within Sony Corporation, this document could not have been completed. I would like to thank all of those who have been involved in this project. I am especially thankful to: Mr. Satoshi Amemiya, a Senior Technical Writer in the Marketing Communication Section, Product Information Department, Business Planning Division, B&P Company, Sony Corporation, for his comprehensive suggestions and advice on the content of this document. Mr. Atsushi Furuzono and Mr. Hitoshi Nakamura of the Contents Creation Business Group, B&P Company, Sony Corporation, for their detailed advice, explanations, and verification of the entire content from a technical point of view. Mr. Isao Matsufune of the Contents Creation Business Division, B&P Company, Sony Corporation, for his help in checking the chapter on VTRs. Mr. Kinichi Ohtsu of the Contents Creation Division, B&P Company, Sony Corporation, for his help in checking the chapter on Camera Functions. Mr. James Webb, a Technical Writer in the Marketing Communication Group, Product Information Department, B&P Company, Sony Corporation, for his suggestions on the appropriate use of English expressions. Finally, I would like to express my gratitude to Ms. Michi Ichikawa, General Manager, and Ms. Kazuko Yagura, Senior Manager of the Product Information Department, B&P Company, Sony Corporation, for their support in providing me with the opportunity to create this document.
September 22, 2003 Toshitaka Ikumo Marketing Communication Group Product Information Department Business Planning Division B&P Company Sony Corporation
MK10088V1KYP03NOV
Printed in Japan