General Terms | |
2K / 4K | 2K and 4K are resolutions for digital images. The numbers refer roughly to the number of pixels across the horizontal axis of the screen/image, the K standing for kilo, or thousand. The number of pixels across the vertical axis may vary according to the aspect ratio of the image/film. In scanning, 2K may refer to a resolution of 2048 x 1556 (full aperture) and 4K to 4096 x 3112 (full aperture). In that case 2K has 3.19 Mpixels and a 1.316:1 aspect ratio. It is used for digitizing full-frame 35mm motion picture film sampled in 4:4:4 RGB colour space, making each image about 12 MB. A 4K image quadruples the area of a 2K image, making it 12.75 Mpixels. DCI has named 2K at 24 fps or 48 fps and 4K at 24 fps as standard for digital cinema projection. In that case 2K refers to a pixel resolution of 2048 x 1080 and 4K to 4096 x 2160. These are able to emulate the traditional cinema projection aspect ratios of 1.85:1 and 2.39:1. There is often confusion between the aspect ratio of capture and the aspect ratio of DCI projection: the DCI aspect ratios are only 1.85:1 and 2.39:1, and older aspect ratios are contained within the 1.85:1 frame. |
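The pixel arithmetic above can be checked directly. This is a minimal sketch; the roughly 12 MB per frame assumes 10-bit 4:4:4 RGB packed into 32-bit words (4 bytes per pixel), as in the DPX format.

```python
width, height = 2048, 1556              # 2K full aperture
pixels = width * height
print(f"{pixels / 1e6:.2f} Mpixels")    # 3.19 Mpixels
print(f"{width / height:.3f}:1")        # 1.316:1 aspect ratio

# 10-bit RGB packed into 32-bit words (DPX-style): 4 bytes per pixel
print(f"{pixels * 4 / 1e6:.1f} MB per frame")   # about 12.7 MB

# 4K full aperture quadruples the area of 2K
print(f"{4096 * 3112 / 1e6:.2f} Mpixels")       # 12.75 Mpixels
```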
Ultra HD / QFHD | New broadcasting resolution. Ultra HD (older designation: QFHD = Quad Full HD) is 3840 x 2160 pixels. It requires a digital input able to carry an Ultra HD signal without upscaling. |
HD / high definition | High Definition video and the HDTV broadcast standard. The HD aspect ratio is always 16:9. Pixels are square, as in a computer display. HDTV is defined by three categories: frame size, scanning system and frame rate. Frame size is given in pixels, for example 1280 x 720 or 1920 x 1080. Often the number of horizontal pixels is implied from context and is omitted, as in 720p and 1080p. The scanning system is identified with the letter p for progressive scanning or i for interlaced scanning. Frame rate (or refresh rate) is given as the number of video frames per second. Current resolutions for HDTV: • 720p (1280x720 progressive scan) • 1080i (1920x1080 split into two interlaced fields of 540 lines each) • 1080p (1920x1080 progressive scan). PsF (Progressive Segmented Frame) is another way of acquiring and distributing 1080 material: a true progressive image that is electronically "separated" or "split" into two "fields" (mainly used by Sony to fit with their equipment). |
Metadata (data description) | Data about data, or data describing other data: information that is useful when associated with the essence (the file, the film) being provided. |
Proxy | Low resolution copy of an image, often used for on-site-editing. |
SD / standard definition | Standard Definition video and the digital TV broadcast standard. SDTV is usually shown in a 4:3 aspect ratio; a 16:9 aspect ratio can be achieved anamorphically or using letterboxing. Current resolutions for SDTV: • 576i (PAL): the picture is 576 active lines or pixels high. Two fields are ->interlaced (i). This gives 25 full pictures (frames) per second. A standard definition digital picture is 704 (visible) pixels across, so the resolution of the picture is 704 x 576 pixels, but the number of horizontal pixels may vary. • 480i (NTSC, mainly used in the USA, Canada, Mexico and Japan): the picture is 480 active lines or pixels high. Two fields are interlaced (i). This gives approximately 30 (29.97) full pictures (frames) per second. Both PAL and NTSC use non-square pixels. |
IIF / ACES Architecture | Standard for a digital workflow developed by the Academy of Motion Picture Arts and Sciences (AMPAS), currently in progress. The request came from Disney and other American companies to simplify the process (for example, matching different supports and codecs). IIF (Academy Image Interchange Framework) enables seamless interchange of high-quality motion picture images regardless of source and defines the "Digital Source Master". ACES (Academy Color Encoding Specification) is a radiometrically linear light encoding. AMPAS provides a methodology to get from any source (film, digital, etc.) into ACES. This is a completely specified architecture for placing any and all capture mediums into a standardized gamma and colour space that far exceeds anything available today. |
Picture specifications | |
Additive colour / subtractive colour | RGB (red-green-blue) colours use the additive method: combined, all three produce white light. CMY (cyan-magenta-yellow) colours are subtractive: each can be created by subtracting one of the RGB colours from white light. |
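The complementary relationship between the additive and subtractive primaries can be sketched as a one-line conversion (a simplification that ignores real-world ink and gamut behaviour; values are normalised to 0.0–1.0):

```python
def rgb_to_cmy(r, g, b):
    # Each subtractive primary is the complement of an additive one
    return (1.0 - r, 1.0 - g, 1.0 - b)

print(rgb_to_cmy(1.0, 1.0, 1.0))  # white light -> (0.0, 0.0, 0.0): no ink needed
print(rgb_to_cmy(1.0, 0.0, 0.0))  # pure red -> full magenta and yellow
```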
Aliasing | A digital artifact consisting of patterns or shapes that have no relation in size and orientation to those found in the original image. It is often caused by too low a scan resolution or sampling rate. The best solution is to acquire the image at a sufficient sampling rate or to use an anti-aliasing algorithm. Aliased images appear as jagged edges; aliased audio produces a buzz. Aliasing can also occur when a picture has enough information but is resampled down (see the typical artifacts in modern DSLRs, or when a high-resolution picture is used in a lower-resolution environment). |
Aspect ratio | Common aspect ratios (image formats expressed as the ratio of picture width to picture height) are 1.66:1, 1.85:1 and 2.35:1 (today 2.39:1) for shooting and screening of theatrical films, and 4:3 or 16:9 for TV. The DCI 2K and 4K projection standard has been chosen to emulate the 1.85:1 and 2.39:1 aspect ratios. |
Colour Depth / Colour Resolution | Colour depth describes the maximum number of colours that can be used in the image. The higher the number of colours, the more realistic the image will appear. It is measured in bits. A 10-bit colour depth means 2 to the power of 10 (1024) levels for each of the red, green and blue channels. The total is 1024 x 1024 x 1024 = 1,073,741,824 colours. |
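The arithmetic above can be verified directly:

```python
bit_depth = 10
levels_per_channel = 2 ** bit_depth      # 1024 levels each for R, G and B
total_colours = levels_per_channel ** 3  # three independent channels
print(levels_per_channel)                # 1024
print(total_colours)                     # 1073741824
```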
Colour space | The range of colours that can be represented by a device. Colour space can be reprogrammed on some devices, including digital cinema projectors, to enable different looks for different content. During theatrical exhibition, the same colour space should be selected on the projector as was used during original mastering. Colour reference masters should be developed in order to secure that the work is displayed as intended by the creators and in the identical version irrespective of the theatre system. |
Gamma encoding | The nonlinear relationship between a pixel's numerical value and its actual luminance. Without gamma, shades captured by digital cameras would not appear as they did to our eyes (on a standard monitor). It is also referred to as gamma correction or gamma compression, but these all refer to a similar concept. Understanding how gamma works can improve one's exposure technique, in addition to helping one make the most of image editing. |
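A simple power-law sketch of gamma encoding and decoding (real standards such as sRGB or Rec. 709 add a linear segment near black, which is omitted here):

```python
def gamma_encode(linear, gamma=2.2):
    """Map a normalised linear-light value (0.0-1.0) to an encoded value."""
    return linear ** (1.0 / gamma)

def gamma_decode(encoded, gamma=2.2):
    """Invert the encoding back to linear light."""
    return encoded ** gamma

# 18% grey encodes to roughly the middle of the code range
print(round(gamma_encode(0.18), 3))
```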
Deinterlacing | Deinterlacing is the process of converting interlaced video, such as common analog television signals or 1080i format HDTV signals, into a non-interlaced form. It is used to display interlaced pictures on a progressive scan monitor or projector. |
Depth of field | The zone between the nearest and furthest points at which the camera can obtain a sharp focus. |
Dynamic range | Indicates the ratio between the maximum and minimum measurable light intensities. |
Interlaced | Method of recording or displaying an image, created in order to reduce bandwidth in TV broadcasts. Interlacing uses two fields to create a frame. One field contains all the odd lines in the image, the other contains all the even lines of the image. The two fields are displayed one after another. A PAL based television display, for example, scans 50 fields every second (25 odd and 25 even). The two sets of 25 fields work together to create a full frame every 1/25th of a second, resulting in a display of 25 frames per second. |
Progressive | The opposite of interlacing. Shooting progressive means recording the whole frame at 25 fps (in PAL), using the full vertical resolution. Progressive scan has therefore a higher vertical resolution and no artifacts associated with ->interlaced scan, but consumes higher bandwidth. Progressive images are also more suited to viewing on large screens and displays. |
Rendering | The process of calculating a three-dimensional image for movies, computer games, architecture, design etc. It is the final step in computer graphics and gives objects their shading, texture and lighting. The term is used in both 2D and 3D environments. |
Resolution | Resolution indicates the number of pixels that can be displayed in each dimension on a display device. |
Compression | |
Bit rate | The bit rate (Mb/s) is the amount of information per second delivered by a device. It depends on: • the image width (from 960 pixels to 4096 pixels, or even more) • the image height (from 720 pixels to 3072 pixels, or even more) • the signal processing (raw, RGB or component) • the quantization (8 bits, 10 bits or 12 bits) • the frequency, or speed (for example 23.98, 24, 25, 29.97, 30, 50, 59.94 or 60 frames/sec) • the type of codec used (JPEG 2000, HDCam SR, HDCam, MPEG-2…) |
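As an illustration of how these factors combine, an uncompressed bit rate can be computed directly (compression with one of the codecs listed then reduces the figure substantially):

```python
def bitrate_mbps(width, height, bits_per_sample, samples_per_pixel, fps):
    """Uncompressed video bit rate in megabits per second."""
    bits_per_frame = width * height * bits_per_sample * samples_per_pixel
    return bits_per_frame * fps / 1e6

# 1920x1080, 10-bit RGB (3 samples per pixel), 25 frames/sec
print(f"{bitrate_mbps(1920, 1080, 10, 3, 25):.0f} Mb/s")  # 1555 Mb/s
```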
CODEC / COmpression-DECompression | A codec is a computer programme that writes and reads digital data in the process of ->COmpression and DECompression. The term codec is sometimes also used for compression formats. |
Compression | A way of reducing the size of digital images or sound files while retaining acceptable quality. The main objective is to save memory space, and to take less bandwidth during transmission or less time to transfer. Image compression is used in digital cinema so file sizes remain manageable for mastering, distributing, loading and projecting. The image must be uncompressed in the theatre before being displayed. Digital Cinema uses JPEG2000 or MPEG-2 compression. Compression can be divided into two types: "lossless" and "lossy". As the names imply, lossless techniques retain all of the original information in a more efficient form (used for archiving), whereas lossy techniques discard or approximate some information, thus reducing the quality of the data. |
Macroblocks / macroblocking | Macroblocks are an image compression component; macroblocking is an image display error resulting in lower-resolution images, also known as pixeling, tiling or mosaicking. The size of a block depends on the codec and is usually a multiple of 4. |
RAW Data (Arri RAW, Red RAW) | A digital camera records raw image data when there is no processing or compression and only the pure data from the sensor is taken. This data needs subsequent processing outside the camera in order to both view and use the material, but it can result in better image quality. |
Technical aspects | |
- 24fps | 24 frames per second, used for film acquisition and cinema projection |
- 25fps | 25 frames per second, used for European television (actually 50 interlaced fields/sec) |
- 30fps | 30 frames per second, used for US and Japanese television (actually 60 interlaced fields/sec) |
Anamorphosis, anamorphic | Anamorphosis refers to a system which alters the image in a reversible way through an optical or a digital process. In film, the anamorphic system is designed for widescreen images (aspect ratio 2.39:1); the best known is CinemaScope. The original process compresses the image laterally during shooting (with anamorphic lenses containing cylindrical elements) and stretches it back during screening. This system creates a typical bokeh (the aesthetic quality of the blur). Other processes use spherical lenses and a dilating process in post-production: for example, Super 35 mm film with optical or digital stretching for screening, or digital cameras using only the 2.39:1 area contained within the 1.85:1 frame. With new large camera sensors, new anamorphic lenses and lighter codecs, digital cinema is producing a revival of true anamorphic capture. |
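The geometry of a 2x anamorphic squeeze can be sketched numerically (the 1.195:1 captured ratio below is only an illustrative value, not a specific format):

```python
squeeze = 2.0            # lateral compression factor of the anamorphic lens
captured_ar = 1.195      # hypothetical aspect ratio recorded on the negative
projected_ar = captured_ar * squeeze   # de-squeezed during screening
print(f"{projected_ar:.2f}:1")         # 2.39:1
```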
Bayer Filter | A Bayer filter is an array of color filters which is used to alternately arrange the colors red, green and blue (RGB) in a grid. The Bayer filter is commonly used on digital photosensors mainly for the purpose of producing a colored image. When filtering colors, the green filter is doubled in a Bayer filter. |
Photosites | A digital camera uses an array of millions of tiny light cavities or "photosites" on the sensor to record an image. Their job is to transform light into electricity. They are commonly called pixels, which is misleading: a sensor is made of photosites, not pixels. A pixel is the smallest part of the image. |
CCD sensor (Charge Coupled Device) | In the past, CCDs have provided the performance benchmarks in the photographic, scientific, and industrial applications that demand the highest image quality at the expense of system size. ->See also CMOS. |
Chrominance (chroma) | Chrominance (or chroma) is the signal used in video systems to convey the color information (roughly the hue and saturation) of the picture, separately from the accompanying luma signal. |
CMOS sensor (complementary metal oxide semiconductor) | CMOS imagers offer more functions on the chip, lower power dissipation (at the chip level), and the possibility of smaller system size, but they have often required tradeoffs between image quality and device cost. ->See also CCD. |
CMS (Color Management System) | Color management is a system that ensures that the color representations that one sees on various digital devices (digital cameras, scanners, computer monitors, projection systems) during the film production process are exactly the same. Ideally a video should appear the same color on a computer monitor during production or post-production and on a screen during projection. -> LUT. |
DPX / digital picture exchange file | The most common file format used in digital post-production. DPX provides a great deal of flexibility because it is easy to share between workstations, equipment, and facilities. It is a bitmap file format used to store a single frame of a motion picture or video data stream. DPX is a modification of the Kodak Cineon format with added header information. |
File based format | ‘File-based’ or 'tapeless' refers to storing media in files rather than as continuous streams like film stock or video tape. Advantages during film production and post are the immediate and direct access to each part of the footage. If footage during a digital shoot has been stored on hard disks or similar random access media it goes directly in the computer and can already be examined on set, be copied without quality loss, transferred to postproduction etc. |
Format | 35 mm film has many formats. Of these the most common are Full Frame (which occupies the largest possible area of the film), Academy and CinemaScope; see also ->aspect ratio. The ->DPX file specification defines different scan sizes for these image formats, and these scan sizes are also used (exactly or nearly) by new digital cinematography cameras. For digital cinema projection the ->DCI specifies exact sizes. Note that the abbreviation FF (full frame) is used both as a film term and as a photographic (and hence DSLR video) sensor format. |
Frequency / Image frame rate | Also called refresh rate: the frequency at which a unique image is produced on a display device. It is expressed in fps (frames per second), and in hertz (Hz) for monitors. |
LUT (Look Up Table) | A look up table is a mathematical translation for viewing, previewing and converting an image in different Color spaces. The look-up table is also a way to calculate and emulate on a monitor during production and post-production how the digital image will look on the final film print. |
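A 1D LUT is literally a table indexed by code value. This sketch builds a 1024-entry LUT from a hypothetical print-emulation curve (a plain power function stands in here for a measured film response):

```python
# Hypothetical emulation curve: a simple power function over 10-bit codes
lut = [round(1023 * (code / 1023) ** (1 / 2.6)) for code in range(1024)]

def apply_lut(code):
    # Applying the LUT at display time is just an index lookup
    return lut[code]

print(apply_lut(0), apply_lut(512), apply_lut(1023))
```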
Native | A resolution is called native when it corresponds exactly to the physical resolution of a digital display. Native resolution stands also for the maximum resolution a display device can achieve. Optimal display quality can be reached only when the signal input matches the native resolution. |
Sensor | The sensor, which is the heart of a digital camera, records an image when you take a picture. Light strikes a sensor through the lens. In 35mm terms, the sensor is “film.” ->CCD (charge coupled device) and ->CMOS (complementary metal oxide semiconductor) image sensors are two different technologies for the acquisition of digital images. |
Storage | During digital post-production different storage concepts may be used to enable collaborative workflow: There are the NAS (Network Attached Storage) and SAN (Storage Area Network) architectures and ever more often a combination of the two, as well as new developments like iSCSI, depending on what is technically and economically feasible. |
Tape format | There are analog and digital tape formats for recording and storing film or video. Access to both is only sequential, i.e. more time-consuming than access to file formats. |
Artifacts / Picture problems | |
Artifacts / Picture problems | An artifact is an anomaly during visual representation of digital imagery not present in the original image, such as aliasing. Artifacts can be caused by many factors, including digital compression, film-to-video transfer, transmission errors, data readout errors and many more. |
Production | |
Acquisition | Process of recording film with a camera. |
Bluescreen / Greenscreen | Also chromakey, a method for creating a key based on a certain colour; a screen of solid colour; the process of shooting foreground elements in front of such a screen. |
Perforation / 4-perf, 3-perf, 2-perf | Perf is short for perforations. It is a way to describe the format of images on 35mm film by how many of the perforations are used per image. A normal 'Academy' frame of 35mm film is pulled through the camera gate four perforations at a time. This allows for all aspect ratios, including anamorphic. To save film stock, the camera can be modified to pull 3 perforations through at a time. This still works for the normal cinema screen aspect ratio of 1.85:1 and also for the 16:9 aspect ratio currently used by television broadcasters. A recently revived option is for the camera to pull through only 2 perfs at a time, saving about 50% in stock and processing. This is only suitable for shooting the widescreen aspect ratio of 2.39:1; however, by cropping the sides, it can also be used in 16:9 mode for TV transmission. |
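The stock savings follow directly from the perforation counts (35mm film has 64 perforations per foot):

```python
perfs_per_foot = 64        # 35mm film: 64 perforations per foot
for perfs in (4, 3, 2):
    frames_per_foot = perfs_per_foot / perfs
    saving = 1 - perfs / 4  # stock saved relative to 4-perf
    print(f"{perfs}-perf: {frames_per_foot:.1f} frames/ft, "
          f"{saving:.0%} saving vs 4-perf")
```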
Pre-production | Pre-production starts as soon as a project has been developed and is greenlit. At this point a project should be fully financed and should have most of the key elements such as principal cast, director and cinematographer in place, as well as a screenplay which is satisfactory to all the financiers. During pre-production, the script is broken down into individual scenes and all the locations, props, cast members, costumes, special effects and visual effects are identified. A detailed schedule is produced. Sets are constructed, the crew is hired, financial arrangements are put in place and a start date for the beginning of principal photography is set. |
Post-production | |
Auto-conforming | Matching the digital intermediate to the final edit. The digital intermediate process creates the possibility of doing an auto-conform, where the movie is conformed during the DI process itself. With the auto-conform method the original camera negative is never cut. Instead, each original camera roll is placed intact on a film scanner and only the shots to be used in the final edit are scanned. The scanned frames are then assembled in the proper order by the computer to conform to the final edit of the movie. To perform this, the computer uses the EDL (Edit Decision List) generated by the editing system to automatically conform the entire movie – hence "auto-conform". |
Blow up | Occurs when a smaller film format is increased to a larger format. An example would be going from Super 16 up to 35 mm. |
Cloud computing | Cloud computing is a model for consuming IT resources on demand. It offers accessibility and attractive costs via networks, through pooling and user-friendly services. It can be used both in post-production workflows and in archiving. |
Colorist | The post-production expert dealing with colour grading, colour correction. |
Colour grading, colour correction | Colour grading is the process of altering and enhancing the colour of a film or image, either photo-chemically or digitally. Today colour correction is generally done digitally because it allows far more control than traditional colour timing. Digital colour grading may be used to: reproduce accurately what was shot, compensate for variations in the material (i.e. film errors, white balance, varying lighting conditions), optimize transfer for use of special effects, establish a desired 'look', enhance and/or alter the mood of a scene. |
Conforming / On line editing | During the Digital Intermediate process, conforming is the electronic equivalent of negative cutting. The digital intermediate has to match the final edit. This can be achieved by a special conforming software which uses an edit decision list (EDL). |
Cut List | A list of edits making up a sequence, usually specified in terms of key numbers. |
Debayer or demosaicing | The image coming from a Bayer filter cannot be viewed directly, since each photosite records only one of the three colours. Bayer "demosaicing" (or debayering) is the process of translating this Bayer array of primary colours into a final image which contains full colour information at each pixel. Different demosaicing algorithms are used to fill in the missing colours of the image: the millions of photosite samples undergo interpolation, turning them into image pixels, which then produce a complete image. The user can choose between different software algorithms to arrive at the final output or image. |
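A minimal sketch of the idea, assuming an RGGB pattern: for each pixel, the missing colours are averaged from the 3x3 neighbourhood (real debayer algorithms use far more sophisticated, edge-aware interpolation):

```python
def bayer_colour(y, x):
    # RGGB pattern: even row/even column = R, odd/odd = B, the rest = G
    if y % 2 == 0 and x % 2 == 0:
        return "R"
    if y % 2 == 1 and x % 2 == 1:
        return "B"
    return "G"

def demosaic(mosaic):
    """Reconstruct full RGB per pixel by averaging nearby photosite samples."""
    h, w = len(mosaic), len(mosaic[0])
    out = [[[0.0, 0.0, 0.0] for _ in range(w)] for _ in range(h)]
    for y in range(h):
        for x in range(w):
            samples = {"R": [], "G": [], "B": []}
            for dy in (-1, 0, 1):       # gather the 3x3 neighbourhood
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        samples[bayer_colour(ny, nx)].append(mosaic[ny][nx])
            for i, c in enumerate("RGB"):
                out[y][x][i] = sum(samples[c]) / len(samples[c])
    return out

# A flat grey field survives demosaicing unchanged
flat = [[0.5] * 4 for _ in range(4)]
print(demosaic(flat)[0][0])  # [0.5, 0.5, 0.5]
```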
Digital intermediate (DI) | The DI is a digital post-production process that allows filmmakers to evaluate and manipulate images prior to recording to film. Through scanning, assembly, digital opticals, SFX integration & colour correction, these multiple digital post-production processes lead to a high resolution digital master. This master can be recorded to film and any other medium such as a Digital Cinema Master, HD Masters, Blu-ray, DVD and any combination of broadcast formats. A digital intermediate is done at higher resolution and with greater color fidelity than telecine transfers. |
Digital-Beta (or Digibeta) | Digital Betacam, also known as Digibeta or D-Beta, was introduced by Sony in 1993 as a replacement for the analog Betacam SP format. Digital Betacam was superior in performance to DVCam and DVCPro, while being cheaper than D1. Digibeta did not become an industry standard like its predecessor. |
Edit List/ Edit Decision List (EDL) | The EDL is a way of representing a film or video edit. It contains an ordered list of digitally stored sequences of images (or on reel) and timecode data representing where each video clip can be obtained in order to conform the final cut. These days, linear editing systems have been superseded by non linear editing systems which can output EDLs electronically to allow autoconform on an online editing system - the recreation of an edited programme from the original sources and the editing decisions in the EDL. |
LTO / Tapes for Data recording | Linear Tape-Open (or LTO) is a magnetic tape data storage technology. The most recent LTO version was released in 2010 and can hold 1.5 TB in one cartridge. LTO is widely used with small and large computer systems, especially for backup. |
Rushes / dailies | Unedited footage from a shoot; the process of watching such footage. |
Secondary colour correction | While primary colour correction changes the whole image with regard to the colours red, green, blue, the mid tones, shadows and highlights, secondary correction selects and alters specific objects or portions of the colour spectrum. It adjusts values within a narrow range while having a minimum effect on the overall colour balance of a scene. |
Telecine | Telecine is the device and the process of transferring motion picture film into video form. Telecine technology is increasingly merging with that of motion picture film scanners; high-resolution telecines can be regarded as film scanners that operate in real time. |
Time Code (TC) | A time code is a sequence of numeric codes generated at regular intervals by a timing system. Time codes are used for synchronization of image and sound, and for logging material in recorded media. |
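Non-drop-frame timecode is just integer arithmetic on a running frame count. A minimal sketch at 25 fps (NTSC drop-frame timecode, which periodically skips numbers to stay in step with 29.97 fps, is more involved and omitted here):

```python
def frames_to_timecode(frame, fps=25):
    """Convert an absolute frame count to HH:MM:SS:FF (non-drop-frame)."""
    ff = frame % fps
    seconds = frame // fps
    hh, mm, ss = seconds // 3600, (seconds // 60) % 60, seconds % 60
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(frames_to_timecode(90125))  # 01:00:05:00
```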
From HD / data files | |
Acquisition | Process that describes the input of media for the Digital Intermediate. All source material has to be digitized. |
Asset (or data) management | Asset (or data) management in film production and post-production is important in order to capture, archive, search and retrieve data, ensuring a smooth production workflow and seamless collaboration between the multiple players in the process. |
Calibration | Calibration ensures that all post-production devices, scanner, display or projection systems show the same picture/quality through the course of their life. It ensures that the input closely matches the output. In general, calibration is a comparison between measurements of two different devices, one of them providing the standard. |
Colour management system (CMS) | Color management is a system that ensures that the color representations that one sees on various digital devices (digital cameras, scanners, computer monitors, projection systems) during the film production process are exactly the same. Ideally a video should appear the same color on a computer monitor during production or post-production and on a screen during projection. -> LUT. |
Conforming | During the Digital Intermediate process, conforming is the electronic equivalent of negative cutting. The digital intermediate has to match the final edit. This can be achieved by a special conforming software which uses an edit decision list (EDL). |
Network SAN (Storage Area Network) / NAS (Network Attached Storage) | Different types of storage architecture in postproduction: A SAN commonly utilizes fibre channel interconnects. A NAS typically makes Ethernet and TCP/IP connections. Hybrids are possible. A NAS allows greater sharing of information especially between disparate operating systems. In a SAN file sharing is operating system dependent. Combinations of the two are possible. |
Editing | |
Edit List (EDL) Edit Decision List | The EDL is a way of representing a film or video edit. It contains an ordered list of digitally stored sequences of images (or on reel) and timecode data representing where each video clip can be obtained in order to conform the final cut. These days, linear editing systems have been superseded by non linear editing systems which can output EDLs electronically to allow autoconform on an online editing system - the recreation of an edited programme from the original sources and the editing decisions in the EDL. |
Transfer to film | |
Broadcast master / On air master | A broadcast master consists of one or more video master tapes that are delivered to a TV broadcaster. It has to fulfil certain technical standards defined by the broadcaster. It contains the film, titles, end titles, subtitles, and several sound mixes. It also contains technical information up front and at the end, as well as ->metadata on the film. |
Data (digital files) to film transfer | Recording the digital master onto film, a (possible) final step of or after the digital intermediate post-production process. |
Internegative (IN) / Intermediate | In traditional photochemical film production, the internegative is an intermediate copy of a film, used to make the release prints for mass distribution to the theatres. In digital post-production, the result of the digital intermediate process (scanning to recording), the ->Digital Source Master, is the equivalent of the internegative. |
Interpositive (IP) | The original edited negative is printed onto stock that comes out as an interpositive (IP). With the IP in hand, an internegative (IN) is made which becomes the printing master or dupe negative (DN) for making multiple release prints. The interpositive is touched only on the occasion of making the first or a replacement internegative. Since interpositives are used so rarely, they are usually the best preserved film element. |
Printing | Multiplication (copying) of a film negative or positive - during post-production, but mainly for distribution. |
Processing | During processing exposed film is developed, fixed, and washed to create a negative or a positive image. |
Theatrical release print | Film prints for mass distribution to cinemas. They are made from the internegative. Release prints are expensive: it can cost $1,500 to $2,000 to print one and ship it to theatres. In digital cinema the equivalent is the DCP. |
Visual effects (VFX) | |
Despotting / Dust Busting | Removal of visible dust and scratches after film has been digitized. |
End titles | A credit sequence generally shown at the end of a motion picture. |
Main/Front titles | A credit sequence generally shown near the beginning of a motion picture. |
Scanner / film to digital | Initial step of a digital intermediate process for non-digital footage. Film is scanned frame by frame, each frame becomes a file. |
Visual effects (VFX) | Visual effects (Visual F/X or VFX) are the various processes by which imagery is created and/or manipulated outside the context of a live action shoot. Visual effects involve the integration of live-action footage and generated imagery to create environments which look realistic, but would be dangerous, costly, or simply impossible to capture on film. Visual effects using computer generated imagery (CGI) have become common in big-budget films, and are becoming accessible to smaller budgets with the introduction of affordable animation and compositing software. Although most visual effects work is completed during post-production, it must be carefully planned in pre-production and production. Visual Effects are designed and edited in Post-Production, with the use of graphic design, modelling, animation and rendering software, while Special Effects are made on set. Visual effects may be divided into at least four categories: • Models: miniature sets and models, animatronics, stop motion animation. • Matte paintings and stills: digital or traditional paintings or photographs which serve as background plates for keyed or rotoscoped elements. • Live-action effects: keying actors or models through bluescreening and greenscreening. • Digital animation: modeling, computer graphics lighting, texturing, rigging, animating, and rendering computer-generated 3D characters, particle effects, digital sets, backgrounds. |
Sound - Recording | |
Bit rate | A measurement of bandwidth. |
Soundmix / Re-recording | During Re-Recording or Dubbing all the sound elements (Dialogue, Automated Dialogue Replacement, Foley, Sound Effects, Atmospheres, and Music), are mixed together to create the final soundtrack. Re-recording ensures that film sound is correct both technically and stylistically. Setting the relative volume levels and positioning these sounds requires the skill and aesthetic judgement provided by experienced Re-recording Mixers. |
TC / Time Code | A time code is a sequence of numeric codes generated at regular intervals by a timing system. Time codes are used extensively for synchronization of image and sound, and for logging material in recorded media. |
Laboratory sound treatment | |
Sound synchronization | The synchronous timing of audio and video streams in a film production. |
Digital Cinema / Digital Distribution | |
Archiving | Experts estimate that digital material has to be migrated to a new medium every 5 years. In addition, reading devices also evolve. Long-term archiving of digital data is therefore one of the biggest challenges today. To this day, the most reliable support for digital preservation is the system of ->LTO tapes. Besides this, the 35mm copy remains one of the best archival aids. Film has the advantage of a life span of up to 100 years, and ->metadata can be stored on the film in human-readable form, without the need for any reading devices. Different archiving solutions are being discussed, e.g. holographic storage and colour microfilm, i.e. data on film. |
DCDM / Digital Cinema Distribution Master | When all of the sound, picture and data elements of a production have been completed, they may be assembled into a DCDM. It is a master set of files that have not (yet) been compressed, encrypted, or packaged for Digital Cinema distribution, and it contains all of the elements required for a Digital Cinema presentation: the image, audio and subtitle structures. The DCDM is the output of the Digital Cinema post-production process (not to be confused with the feature post-production process, which creates the DSM). |
DCI - Digital Cinema Initiatives | DCI is a joint venture of the 6 Hollywood majors Disney, Fox, Paramount, Sony Pictures Entertainment, Universal, and Warner Bros. Studios. The group set up voluntary specifications for an open architecture for Digital Cinema - from mastering to display (they did not provide standards for production and post) - that ensures a uniform and high level of technical performance, reliability, and quality control. |
DCI-SMPTE DC28 | The standard for digital cinema defined by DCI (Digital Cinema Initiatives) and SMPTE (Society of Motion Picture and Television Engineers) in the working group DC-28. |
DCP / Digital Cinema Package | Once the DCDM is compressed, encrypted and packaged for distribution, it is considered to be the Digital Cinema Package or DCP. This term distinguishes the package from the raw collection of files known as the DCDM. The Distribution Package contains a Packing List and one or more Digital Cinema Track Files. When the DCP arrives at the theatre, it is unpackaged, decrypted and decompressed to recreate the DCDM; the resulting image is visually indistinguishable from the original DCDM image. |
Digital cinema | Digital cinema (D-Cinema) denotes the highest-quality theatre display of digital cinematic content. Performance specifications have been defined by ->DCI and are meant to reach at least the quality standard of 35mm projection. However, DCI definitions only cover the process from digital mastering to distribution and screening, not the digital production process. D-Cinema is to be distinguished from E-Cinema, which is not precisely defined but generally covers any digital projection of a quality below the DCI standard. The main advantages of digital film are the ability to produce copies without quality loss and the fact that these copies remain unaltered, i.e. free from scratches or dust, irrespective of the number of screenings. A digital copy, stored on a hard drive, costs a fraction of the 1,500-2,000 US$ needed to produce and ship a feature film print. |
Digital cinema audio | The necessary characteristics for standardized digital audio are: bit depth, sample rate, reference level and channel count (DCI standard: 16 channels uncompressed, 48 kHz or 96 kHz). |
Digital Distribution | Digital distribution offers a major potential for economies on the studio/distributor side, and for independence on the producer's side, since it could enable low-volume distribution. However, the exhibitors have to cover the initial investment and the continuous maintenance of the digital projection and storage systems. One business model which has been agreed upon by the big studios, together with distributors and cinema owners in the U.S. and parts of Europe, is the -> virtual print fee model. For smaller and art-house cinemas a business model has still to be defined. 3D movies are accelerating the acceptance of D-Cinema and are making recoupment of investments easier for those theatre owners who show them. Several challenges for digital distribution still lie ahead: global interoperability, a system that enables mastering to a single colour standard, and creating a new supply chain with a scalable deployment plan that enables theatre owners to accept the system with its inherent financial investments. |
DLP | Digital Light Processing (DLP) is a projection technology developed by Texas Instruments. It is used in many digital projectors with 2K resolution, e.g. by Christie and Barco. |
DSM | The Digital Source Master, or DSM, is the final post-production output. From the DSM many elements are created (e.g. Film Distribution Masters, the DCDM, Home Video Masters and Broadcast Masters). The DSM is not standardized. |
Encrypting / encoding | Techniques used to protect digital data. In digital cinema encryption is used at various points during the electronic chain to prevent data from being altered or stolen. |
Forensic mark, watermarking, fingerprinting | Different security functions used at the time of playback. Transactional or forensic watermarks embed information specific to a particular transaction into a digital watermark. The watermark is embedded into the file as it is delivered to the exhibitor/consumer. If copyrighted content is subsequently illegally distributed, it may be tracked back to the source (but unfortunately not to the forger) through the transactional watermark. This forces the exhibitor to be more prudent concerning security issues. |
JPEG 2000 | Digital compression format, chosen as standard by DCI. It meets the requirements for digital cinema projection (no blocking artifacts) and is scalable in terms of quality and resolution (a 2K layer can be extracted from a 4K image, which is important since not all theatres have 4K projectors). However, it is not as efficient as MPEG-4. |
KDM (key delivery message) or license | KDM is a standardized method of delivering security keys to digital cinema playback systems. KDM contains the keys necessary to decrypt a movie, plus information about how it may be used. |
Projection (digital) | The digital cinema projector projects the digital image from the digital cinema playback system onto the screen. At the moment there are two types of digital cinema projectors: the widely used ->DLP projectors, a technology developed by Texas Instruments and licensed by Christie, Barco and NEC, and the ->SXRD or LCoS technology used by Sony and JVC. Several factors affect the projection system: colour space, resolution, luminance (brightness), contrast and interfaces. Research into laser light projectors, which will be able to deliver the higher light levels useful for 3D and large-screen projection, is still in progress. |
Security, Anti piracy | Security in Digital Cinema systems is provided through encryption and the management of content key access. Content is transported and received using encryption, and is unlocked through standardized methods of delivering and using decryption keys. Together with key exchange comes Digital Rights Management (DRM), which establishes the rules for using content. |
SMPTE | Society of Motion Picture and Television Engineers |
Storage and Media Block | Components of the theatre playback system. Storage is the file server that holds the packaged content for eventual playback. The Media Block is the hardware device (or devices) that converts the packaged content into the streaming data that ultimately turns into the pictures and sound in the theatre. Media Blocks are secure entities. |
SXRD | Sony's LCoS-based projection technology, used in projectors with 4K resolution. |
Theatre playback systems | Consist of projectors, media blocks, security managers, storage, sound systems, DCP ingest, theatre automation, Screen Management System (SMS) and Theatre Management System (TMS). |
Transport | DCP is transported via terrestrial fiber, satellite or physical packaged media (HDD, blue laser discs...). After being transported to the theatre, the files are stored on a file server until playback. |
Virtual Print Fee (VPF) | The Virtual Print Fee (VPF) model is a means of financing the conversion of the cinema exhibition industry to digital cinema. The basic premise is that a third party pays up front for the equipment and then recoups the cost over time, through VPF payments from distributors (who pay the majority of the cost) on the one hand, and usage and maintenance fees paid by the exhibitors on the other. Fees are paid per digital screening of a film. The distributors save money by shipping digital rather than 35mm prints, and these savings contribute towards the cost of the equipment. |
HDR (High Dynamic Range) | HDR is used: • during shooting, in photography and cinematography, to capture a greater dynamic range of luminosity, for example the sky's detail without making the land look too dark, and vice versa; • during post-production, to create HDR deliverables; • and finally in exhibition, as a technology that reproduces on displays and projectors the greater dynamic range of luminosity captured by new cameras. The grading has to be adapted to these types of displays and projectors. |
3D-Cinema, stereoscopic cinema | |
3D-Cinema, stereoscopic cinema | A 3D film is either shot with two parallel cameras on a rig, each representing one eye of the viewer, or computer-generated imagery creates the two perspectives, based on the data of a 2D film. Special projection hardware and/or eyewear are used to provide the illusion of depth when viewing the film. Projection is digital. Today a 3D presentation no longer requires two projectors; with the exception of the IMAX system, only a single projector is used. There are four leading companies offering systems to view 3D: RealD, Xpand (nuVision), Dolby 3D and IMAX. Audiences viewing a film presented in 3D are given a pair of glasses - either anaglyph, polarized, shutter or Infitec glasses. RealD and IMAX need an expensive ->silver screen; the other systems work with normal screens. |
Anaglyph | Anaglyph images are made up of two layers of complementary colours, superimposed, but slightly offset with respect to each other to produce a depth effect. The picture contains two differently filtered coloured images, one for each eye. When viewed through anaglyph glasses (today made of red and cyan foil), they reveal a stereoscopic image, which the brain fuses into the perception of a three-dimensional scene. Anaglyph images may cause unwanted effects like ->ghosting, and they have a limited colour spectrum. Anaglyph glasses are the least expensive glasses for 3D viewing. Most of the 3D movies of the 1950s used anaglyph technology. |
Dolby3D / Infitec glasses | Infitec has developed a technique for channel separation in stereo projection based on interference filters. The primary colours of the left-eye and right-eye images are projected at slightly different wavelengths (so-called wavelength multiplexing). The technology is used by Dolby3D. |
Ghosting | Ghosting is a motion blur or double image; in 3D projection it is caused by incomplete channel separation (crosstalk) in polarized or colour-filtered projection. |
Parallax | Parallax is the angle between two straight lines starting from two different points (e.g. the eyes of the observer) and directed to one point (the object). |
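As a rough worked example for the definition above, assuming a simplified symmetric geometry (the formula 2·atan(b / 2d) for a baseline b and a perpendicular distance d, the function name and the example figures are all illustrative assumptions, not part of this glossary):

```python
import math

# Parallax angle (in degrees) for two viewpoints separated by baseline b,
# both looking at an object at perpendicular distance d (symmetric case).
def parallax_angle_deg(baseline, distance):
    return math.degrees(2 * math.atan(baseline / (2 * distance)))

# Human interocular distance ~6.5 cm, object 2 m away:
print(round(parallax_angle_deg(0.065, 2.0), 2))  # ~1.86 degrees
```

The angle shrinks as the object moves further away, which is why distant objects appear at nearly zero parallax.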
Polarizing glasses | Polarizing glasses contain a pair of different polarizing filters. As each filter passes only that light which is similarly polarized and blocks the light polarized in the opposite direction, each eye sees a different image. This produces a three-dimensional effect by projecting the same scene into both eyes, but depicted from slightly different perspectives. Circularly polarized glasses are normally used today since - as opposed to linearly polarized glasses - they allow for much greater head movement of the viewer. Compared to anaglyph images, the use of polarized 3D glasses produces a full-color image that is considerably more comfortable to watch. However, even low cost polarized glasses are typically double the cost of red-cyan filters. |
Screen space | The region appearing to be within the screen or behind the surface of the screen. Images with positive ->parallax will appear to be in screen space. The boundary between screen and theater space is the plane of the screen and has zero parallax. |
Shutter glasses | Liquid crystal shutter glasses (also called active shutter glasses) contain a liquid crystal layer which becomes dark when voltage is applied, being otherwise transparent. The glasses are controlled by an infrared, radio frequency, DLP-Link or Bluetooth transmitter that sends a timing signal that allows the glasses to alternately darken over one eye, and then the other, in synchronization with the refresh rate of the screen. Meanwhile, the display alternately displays different perspectives for each eye, which achieves the desired effect of each eye seeing only the image intended for it. Unlike red/blue colour filter 3D glasses, LC shutter glasses are colour neutral enabling 3D viewing in the full colour spectrum. Shutter glasses tend to be much more expensive than other forms of stereoscopic glasses. |
Silver screen, polarizing screen | Polarized 3D projection (RealD) and IMAX need a silver screen for projection, because it preserves the light's polarization. Since polarized projection uses interposed filters, the image brightness is reduced. The silver screen doesn't disperse light as much as a non-metallic white screen, and is therefore better suited to polarized projection. The silver screen, however, adds up to a substantial investment for theatre owners, which is the downside of polarized projection. The company MASTERIMAGE also offers a circular polarizing solution: a portable device that is placed in front of the digital projector. |
Stereoscopic | Relating to 3D imaging: creating or enhancing the illusion of depth by presenting a slightly different image to each eye. |