DCI, LLC Private Draft Document. Not for Publication.

Digital Cinema Initiatives, LLC

Digital Cinema System Test Plan

V2.0

April 25, 2004

Draft Approved (Insert Date Here)

Digital Cinema Initiatives, LLC Technology Committee

Final Approved (Insert Date Here)

Digital Cinema Initiatives, LLC Member Representatives Committee

Copyright 2004 by

Digital Cinema Initiatives, LLC.

6834 Hollywood Blvd.

Suite 500

Hollywood, CA. 90028, USA



NOTICE

Digital Cinema Initiatives, LLC (DCI) is the author and creator of this specification for the purpose of copyright and other laws in all countries throughout the world. The DCI copyright notice must be included in all reproductions, whether in whole or in part, and may not be deleted or attributed to others. DCI hereby grants to its members and their suppliers a limited license to reproduce this specification for their own use, provided it is not sold. Others should obtain permission to reproduce this specification from Digital Cinema Initiatives, LLC; Attn: Chief Executive Officer; 6834 Hollywood Blvd., Suite 500; Hollywood, California 90028; USA; (323) 769-2885 (voice); (323) 769-2895 facsimile.

This document is a specification developed and adopted by Digital Cinema Initiatives, LLC (DCI). This document may be revised by DCI. It is intended solely as a guide for companies interested in developing products that can be compatible with other products developed using this document. DCI makes no representation or warranty regarding this document, and any company using this document shall do so at its sole risk, including specifically the risks that a product developed will not be compatible with any other product or that any particular performance will not be achieved. DCI shall not be liable for any exemplary, incidental, proximate or consequential damages or expenses arising from the use of this document. This document defines only one approach to compatibility, and other approaches may be available to the industry.

This document is an authorized and approved publication of DCI. Only DCI has the right and authority to revise or change the material contained in this document, and any revisions by any party other than DCI are unauthorized and prohibited.

Compliance with this document may require use of one or more features covered by proprietary rights (such as features which are the subject of a patent, patent application, copyright, mask work right or trade secret right). By publication of this document, no position is taken by DCI with respect to the validity or infringement of any patent or other proprietary right. DCI hereby expressly disclaims any liability for infringement of intellectual property rights of others by virtue of the use of this document. DCI has not and does not investigate any notices or allegations of infringement prompted by publication of any DCI document, nor does DCI undertake a duty to advise users or potential users of DCI documents of such notices or allegations. DCI hereby expressly advises all users or potential users of this document to investigate and analyze any potential infringement situation, seek the advice of intellectual property counsel, and, if indicated, obtain a license under any applicable intellectual property right or take the necessary steps to avoid infringement of any intellectual property right. DCI expressly disclaims any intent to promote infringement of any intellectual property right by virtue of the evolution, adoption, or publication of this document.

The DCI specification will be undergoing modifications and changes throughout 2004. Companies, organizations and individuals are welcome to contribute comments to DCI for consideration. These contributions may be incorporated into future revisions. However, prior to submitting any changes (even electronically), please submit a signed copy of the “DCI Policy on IP and Confirmation of Understanding” (see next page).

Changes can be sent electronically to either:

Howard

Walt

In addition, the document is being modified to reflect changes resulting from ongoing testing. Therefore, due to the changing nature of this activity, the current version should not be considered final. Updates to this document will be provided as soon as they are available.


Table of Contents

1.System Test Plan Introduction

1.1.Document Conventions

1.2.Test Plan Overview

1.2.1.Functional Framework

1.2.1.1.Setup and Calibration

1.2.1.2.Requirements Testing

1.2.1.3.Interoperation Testing

2.Setup and Calibration

2.1.Setup

2.1.1.Light Measurement Devices (LMD)

2.1.1.1.Photometer Type

2.1.1.2.Stray Light Elimination Tube (SLET)

2.1.1.3.Spectroradiometer Type

2.1.1.4.Digital Still Camera Type

2.1.2.Measurement Repeatability Luminance (VESA 301-4)

2.1.3.Signal Generation

2.1.4.Display Under Test (DUT) Measurement and Display Conditions

2.1.4.1.Electrical Conditions (VESA 301-2B)

2.1.4.2.Environmental (VESA 301-2C)

2.1.4.3.Warm-Up Time (VESA 301-2D)

2.1.4.4.Display controls (VESA 301-2E)

2.1.4.5.Darkroom conditions (VESA 301-2F)

2.1.4.6.Measurement Location (VESA 301-2G)

2.2.Projection Adjustment and Calibration

2.2.1.Aspect Ratio

2.2.2.Operational Check and Coarse Adjustment

2.2.2.1.Sizing of the Projected Image to the Screen

2.2.2.1.1.Center of Screen

2.2.2.2.Zoom Image

2.2.2.3.Focus

2.2.2.4.White Point Check

2.2.2.5.Luminance

2.2.2.6.Defects of Screen/Image (VESA 301-3C,E,G)

2.2.2.7.Flicker Visibility Assessment (VESA 301-3F)

2.2.2.8.Convergence Assessment (VESA 303-3I)

2.2.3.Fine Adjustment

2.2.3.1.Sizing

2.2.3.2.Tilt

2.2.3.3.Keystone

2.2.3.4.Linearity

2.2.3.5.Focus

2.2.3.6.White Point

2.2.3.7.Luminance

2.3.Projection Measurement

2.3.1.Aspect Ratio

2.3.2.Geometry

2.3.2.1.Geometry calculation methods (VESA 503-3A)

2.3.2.1.1.Horizontal Keystone (Trapezium)

2.3.2.1.2.Vertical Keystone (Trapezium)

2.3.2.1.3.Horizontal Rotation or Tilt

2.3.2.1.4.Vertical Rotation or Tilt

2.3.2.1.5.Orthogonality

2.3.2.1.6.Pincushion Distortions

2.3.2.1.7.Top Pincushion Calculation

2.3.2.1.8.Bottom Pincushion Calculation

2.3.2.1.9.Left Pincushion Calculation

2.3.2.1.10.Right Pincushion Calculation

2.3.2.1.11.Horizontal Linearity

2.3.2.1.12.Vertical Linearity

2.3.2.2.Reflector-less Gun Method

2.3.2.3.Reference Device Method

2.3.2.3.1.Sizing and Aspect Ratio

2.3.2.3.2.Linearity

2.3.2.3.3.Keystone

2.3.2.4.Digital Still Camera Method Calibration

2.3.2.5.Convergence

2.3.3.Luminance

2.3.3.1.Luminance Wander

2.3.3.2.Luminance Full Screen White

2.3.3.3.Luminance Full Screen Black

2.3.3.4.Sampled Luminance Non-uniformity (VESA 306-1)

2.3.3.4.1.Alternative Digital Still Camera Method

2.3.4.Contrast

2.3.4.1.Center Sequential Contrast (VESA 302-3)

2.3.4.2.Sampled Sequential Contrast Ratio

2.3.4.3.Sampled Contrast Non-Uniformity (VESA 306-3)

2.3.4.3.1.Alternative Digital Still Camera Method

2.3.4.4.ANSI Contrast (ANSI 228-1997 section 4.3)

2.3.4.4.1.Alternative Digital Still Camera Method

2.3.4.5.Screen Gain ( SMPTE RP-94 )

2.3.4.6.Room Effect

2.3.5.Electro-Optical Transfer Function or Gamma (VESA 302-5)

2.3.6.Color

2.3.6.1.White Point Accuracy (VESA 302-6A)

2.3.6.2.Sampled Chromaticity Non-uniformity of Colors (VESA 306-4)

2.3.6.2.1.Alternative Digital Still Camera Method

2.3.6.3.Chromaticity Tracking (VESA 302-7)

2.3.6.3.1.Alternative Digital Still Camera Method

2.3.6.4.Color Gamut (VESA 302-4)

2.3.7.Resolution

2.3.7.1.Effective Resolution

2.3.7.1.1.Alternative Digital Still Camera Method

3.Requirements Testing

3.1.Requirements Testing

3.1.1.Objective testing

3.1.1.1.Material Preparation for Requirements Testing

3.1.1.1.1.Test Pattern Creation

3.1.1.1.2.For Film Out:

3.1.1.1.3.For OCN to Data:

3.1.1.1.4.For Digital Files:

3.1.1.2.Screen luminance (12fL or 14fL)

3.1.1.2.1.Methodology 1

3.1.1.2.2.Test Material

3.1.1.2.3.Equipment

3.1.1.3.Contrast

3.1.1.3.1.Methodology

3.1.1.3.2.Methodology 2

3.1.1.3.3.Test Material

3.1.1.3.4.Equipment

3.1.1.4.Dynamic Range (Transfer Function)

3.1.1.4.1.Methodology

3.1.1.4.2.Test Material

3.1.1.4.3.Equipment

3.1.1.5.Geometry

3.1.1.5.1.Methodology

3.1.1.5.2.Test Material

3.1.1.5.3.Equipment

3.1.2.Subjective Testing

3.1.2.1.Material Selection for Live Action Content

3.1.2.2.Material Selection for CG Content

3.1.2.3.Digital Source Master Content Creation Procedure

3.1.2.3.1.StEM OCN to Data

3.1.2.3.2.Display Reel OCN to Answer Print

3.1.2.4.Color Correction Procedure for Content

3.1.2.5.Conforming Mini-Movie Procedure for Content

3.1.2.6.Film Output Procedure

3.1.2.7.Image Formatting Procedure

3.1.2.8.Down Conversion Procedure

3.1.2.8.1.Down Conversion to 2k Files

3.1.2.8.2.Down Conversion to HD

3.1.2.9.Test Procedure for Visual Discrimination Threshold

3.1.2.9.1.Objective

3.1.2.9.2.Background

3.1.2.9.3.Test Patterns

3.1.2.9.4.Patterns Created for the Test

3.1.2.9.5.12 Bit Luminance Levels and the Projector

3.1.2.9.6.Test Process

3.1.2.9.7.Forced Choice

3.1.2.9.8.Test Sequence

3.1.2.9.9.Equipment:

3.1.2.10.35mm Answer Print to Digital Cinema Side-by-Side Comparison (Display Reel)

3.1.2.10.1.Objective

3.1.2.10.2.Methodology

3.1.2.10.3.Materials

3.1.2.10.4.Equipment

3.1.2.11.2k / 48 Frame Playout

3.1.2.11.1.Methodology

3.1.2.11.2.Test Material

3.1.2.11.3.Equipment

3.1.2.12.Sharpness

3.1.2.12.1.Objective

3.1.2.12.2.Methodology

3.1.2.12.3.Materials

3.1.2.12.4.Equipment

3.1.2.12.5.Normative References

3.1.2.13.Compression

3.1.2.13.1.Methodology

3.1.2.13.2.Test Material

4.System Interoperation Testing

4.1.Packaging

4.1.1.MXF File(s) Generation

4.1.1.1.Methodology

4.1.1.2.Materials

4.1.1.3.Equipment

4.1.2.File Interchange

4.1.2.1.Methodology

4.1.2.2.Materials

4.1.2.3.Equipment

4.1.3.MXF File(s) Generation with Simple Encryption/Decryption

4.1.3.1.Methodology

4.1.3.2.Materials

4.1.3.3.Equipment

4.2.Security

4.2.1.Validate Integrity of Received Content

4.2.1.1.Objective

4.2.1.2.Test Method

4.2.1.2.1.Deliver to the Exhibition system

4.2.1.2.2.Perform commands to validate the delivered lists and files.

4.2.1.3.Test Material

4.2.1.4.Equipment

4.2.2.Detect Tampering with Received Content

4.2.2.1.Objective

4.2.2.2.Test Method

4.2.2.3.Test Material

4.2.2.3.1.Equipment

4.2.3.Validate KDM Message

4.2.3.1.Objective

4.2.3.2.Test Method

4.2.3.3.Test Material

4.2.3.4.Equipment

4.2.4.Detect Tampering with KDM

4.2.4.1.Objective

4.2.4.2.Test Method

4.2.4.3.Test Material

4.2.4.3.1.Equipment

4.2.5.Ready-To-Go Query from TMS

4.2.5.1.Objective

4.2.5.2.Test Method

4.2.5.3.Test Material

4.2.5.4.Equipment

4.2.6.Successful Playability Query from TMS

4.2.6.1.Objective

4.2.6.2.Test Method

4.2.6.3.Test Material

4.2.6.4.Equipment

4.2.7.Unsuccessful Playability Query from TMS

4.2.7.1.Objective

4.2.7.2.Test Method

4.2.7.3.Test Material

4.2.7.4.Equipment

4.2.8.Perform Play-Out from Single SM and Single KDM

4.2.8.1.Objective

4.2.8.2.Test Method

4.2.8.3.Test Material

4.2.8.4.Equipment

4.2.9.Perform Play-Out from Multiple SMs

4.2.9.1.Objective

4.2.9.2.Test Method

4.2.9.3.Test Material

4.2.9.4.Equipment:

4.2.10.Manual Override of Play-Out Problem

4.2.10.1.Objective

4.2.10.2.Test Method

4.2.10.3.Test Material

4.2.10.4.Equipment

4.2.11.Report Log Information to TMS

4.2.11.1.Objective

4.2.11.2.Test Method

4.2.11.3.Test Material

4.2.11.4.Equipment

4.2.12.Revoke Keys and Certificates for MD

4.2.12.1.Objective

4.2.12.2.Test Method

4.2.12.2.1.Test Material

4.2.12.3.Equipment:

4.2.13.Perform Play-Out from Single SM and Two KDM

4.2.13.1.Objective

4.2.13.2.Test Method

4.2.13.3.Test Material

4.2.13.4.Equipment

4.2.14.Perform Play-Out of MXF with “Integrity Pack”

4.2.14.1.Objective

4.2.14.2.Test Method

4.2.14.3.Test Material

4.2.14.4.Equipment

4.2.15.Detect Tampering When MXF with “Integrity Pack” Used

4.2.15.1.Objective

4.2.15.2.Test Method

4.2.15.3.Test Material

4.2.15.3.1.Equipment

4.2.16.Reasonable Behavior When Media Keys Unknown

4.2.16.1.Objective

4.2.16.2.Test Method

4.2.16.3.Test Material

4.2.16.4.Equipment

4.2.17.Perform Play-Out With LE and LD

4.2.17.1.Objective

4.2.17.2.Test Method

4.2.17.3.Test Material

4.2.17.4.Equipment

4.2.18.Report Log Information to TMS

4.2.18.1.Objective

4.2.18.2.Test Method

4.2.18.3.Test Material

4.2.18.4.Equipment

4.3.Projection

4.3.1.Verification of Color Transforms

4.3.1.1.Objective

4.3.1.2.Test Method

4.3.1.3.Test Material

4.3.1.4.Equipment

4.3.2.Demonstration of Projector Interoperability

4.3.2.1.Objective

4.3.2.2.Test Method

4.3.2.3.Test Material

4.3.2.4.Equipment

4.3.3.Verification of Projection Metadata File Compatibility

4.3.3.1.Objective

4.3.3.2.Test Method

4.3.3.3.Test Material

4.3.3.4.Equipment

4.3.4.Demonstrate Feasibility of Gamut Mapping Capability

4.3.4.1.Objective

4.3.4.2.Test Method

4.3.4.3.Test Material

4.3.4.4.Equipment

4.4.Theatre Systems

4.4.1.Sync

4.4.1.1.Objective

4.4.1.2.Test Method

4.4.1.3.Test Material

4.4.1.4.Equipment

4.4.2.Theatre Automation

4.4.2.1.Objective

4.4.2.2.Test Method

4.4.2.3.Test Material

4.4.2.4.Equipment

Annex A:Performance Specifications

A.1.Digital Cinema Projection Specifications

A.1.1.Performance Classes

A.2.Xenon DCDM Color Primaries, White Reference

A.2.1.Color Primaries

A.2.2.White Reference

Annex B:Equipment List

B.1.Testing Equipment List

Annex C:Condensed Projector Test Procedure

C.1.Condensed Projector Test Procedure

Annex D:Test Materials

D.1.1.Motion Test Patterns

D.1.2.Live Action Stills

D.1.3.Live Action Motion

Annex E:Projection Test Report

E.1.DCI Projection Test Report

Annex F:Stray Light Elimination Tube

F.1.Stray Light Elimination in Making Projection Display Measurements

F.2.NIST Stray Light Elimination Tube Prototype

Annex G:StEM Documentation

Table of Figures

Figure 1: Digital Cinema Test Pattern

Figure 2: Geometry Test Pattern

Figure 3: Self-Leveling Laser Guide

Figure 4: Horizontal Keystone Example

Figure 5: Trapezium and Rotation Example

Figure 6: Pincushion Distortion Example

Figure 7: Luminance Capture Configuration

Figure 8: Digital Scope Capture

Figure 9: Full Screen Black Test Pattern with Circles

Figure 10: Full Screen Black Test Pattern

Figure 11: ANSI Contrast Test Pattern

Figure 12: Single Step White Window Test Pattern

Figure 13: Transfer Function Plot

Figure 14: Test Pattern Material Preparation Flow

Figure 15: ANSI 4x4 Test Pattern

Figure 16: 32 Single Step Test Patterns

Figure 17: Geometry Test Pattern

Figure 18: Live Action Material Preparation

Figure 19: StEM Material Color Correction Flow

Figure 20: StEM Digital Post Production Flow

Figure 21: Graph of LUT Used to Match Project Precise Luminance Values

Figure 22: Requirements Test Server


1.System Test Plan Introduction

1.1.Document Conventions

This test plan references many documents for the different calibration procedures and test methodologies. The primary documents are SMPTE Standards and the VESA FPDM Measurement Standard v2.0. The standards are not included with this text but can be obtained from the following locations.

SMPTE Standards

The Society of Motion Picture and Television Engineers

595 W. Hartsdale Ave.

White Plains, NY 10607

+1 914.761.1100

VESA FPDM Measurement Standard v2.0.

VESA – Video Electronics Standards Association

920 Hillview Ct., Suite 140

Milpitas, CA 95035

+1 408.957.9277

American National Standards Institute

25 West 43rd Street (between 5th and 6th Avenues), 4th Floor
New York, NY 10036
+1 212.642.4900 Fax: +1 212.398.0023

Any other methods that are not published elsewhere are included either in the body of this document or in its annexes.

For the purposes of this document, the following command language conventions apply. Text that does not contain command language is explanatory and cannot contain a requirement or permission.

Shall: Conforming Digital Cinema systems are required to support the requirement.

Should: Conforming Digital Cinema systems are strongly suggested to support the capability.

May: Indicates an allowed behavior for a conforming Digital Cinema system.

All text highlighted in red indicates items that are in a state of flux or are contentious.

1.2.Test Plan Overview

The following test plan method is based upon measurement of an installed projection system, consisting of a projector, a room, and a screen, and not merely a projector. Screen aging or yellowing may not be uniform across the screen surface and may result in luminance or chromaticity variation greater than expected. Likewise, maladjustment of the projector light source optical system or use of a non-optimum screen configuration may cause luminance readings taken at various locations in the seating area and on various areas of the screen to exceed the screen center reading (hot spots). In cases where excessive differences are measured across the image area, it may be helpful to identify the source of the problem by directly measuring the incident light from the projector falling on the screen.

In some cases it may not be possible or advisable to measure the projection system as a whole. One such case is where reflective light measuring devices may not possess the precision required for such measurements and an incident device must be used.
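
Where an incident measurement is used, the reading can be related to an expected on-screen luminance through the nominal screen gain, which helps determine whether a shortfall lies with the projector output or with the screen itself. Below is a minimal sketch of that cross-check in Python, assuming the simple gain model L = E x gain / pi; the function names and the example values are illustrative assumptions, not part of this plan.

import math

def expected_luminance_cd_m2(illuminance_lux, screen_gain):
    # Expected screen luminance from incident illuminance, using the simple
    # gain model L = E * gain / pi (valid near the peak-gain viewing axis).
    return illuminance_lux * screen_gain / math.pi

def measured_to_expected_ratio(illuminance_lux, screen_gain, measured_cd_m2):
    # Ratios well below 1.0 suggest screen losses (aging, soiling); ratios
    # near 1.0 point the investigation back toward the projector.
    return measured_cd_m2 / expected_luminance_cd_m2(illuminance_lux, screen_gain)

# Illustrative values only: a 48 cd/m^2 (about 14 fL) reading against
# 120 lux of incident light on a nominal gain-1.3 screen.
print(round(measured_to_expected_ratio(120.0, 1.3, 48.0), 2))  # prints 0.97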

In all cases, if defective equipment or poor screen performance is discovered during the measurement process, every effort should be made to correct the problem.

1.2.1.Functional Framework

This document is divided into three main sections: Setup and Calibration, Requirements Testing, and Interoperation Testing. There are also annexes that contain an Equipment List, Test Procedures, and Test Reports.

1.2.1.1.Setup and Calibration

The Setup and Calibration section provides a method and procedure for aligning and calibrating the Digital Cinema projection system. Before the results of a test can be properly assessed, the projection system must first be properly set up and calibrated. Only when this has been accomplished can the results of the testing be valid and meaningful.

1.2.1.2.Requirements Testing

The Requirements Testing section of this document provides procedures and methods for testing the requirements in question from the DCI, LLC Digital Cinema System Specification. This section reflects the concerns of the Draft Specification at the time it was prepared and throughout its progress. The test items and procedures attempt to answer the questions that had to be resolved to complete the specification. Any items that were in contention, or that required further objective or subjective answers at the time the specification was written, are included in this section. The test procedures are designed to provide data to help resolve these issues.

1.2.1.3.Interoperation Testing

The Interoperation Testing section provides procedures and methods for testing the interoperation of the components of a Digital Cinema system. It is meant to test the exchange of content, key management, and other components of the DCI Digital Cinema System Specification. Any discrepancies found during this stage will require either a change to the specification or a change to a manufacturer's component.


2.Setup and Calibration

2.1.Setup

The following section defines the setup methods used to measure and calibrate projectors and their surrounding environment. These procedures are based upon ANSI/SMPTE 196M-1995, SMPTE RP 98-1995, and VESA FPDM v2.0. These documents are presently under review and are designed for traditional film projection.

The way in which the Light Measurement Devices (LMD) and the Display Under Test (DUT) are set up is extremely important. These parameters are described below.

2.1.1.Light Measurement Devices (LMD)

2.1.1.1.Photometer Type

Screen luminance or illuminance shall be measured with a spot photometer having the spectral luminance response of the standard observer (photopic vision), as defined in CIE S002.

1. A photometer with a photopic spectral response allows use of a well-known standard response common to all photometer manufacturers.

2. The acceptance angle of the photometer shall be 2° or less.

3. The photometer shall properly integrate any luminance variation over time that occurs at frequencies at or above 24 Hz and shall display the arithmetic mean value (see the sketch following this list).

4. The photometer shall have an adequate integration time.
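
As an informal illustration of items 3 and 4 above, the sketch below averages luminance samples over a window spanning several frame periods so that variation at or above 24 Hz is integrated out and the arithmetic mean is reported. The sampling rate, window length, and function name are assumptions made for illustration, not requirements of this plan.

def mean_luminance(samples_cd_m2, sample_rate_hz=1000.0, frame_rate_hz=24.0, min_periods=10):
    # Arithmetic-mean luminance over a window covering at least min_periods
    # frame periods, so that variation at or above frame_rate_hz averages out.
    needed = int(min_periods * sample_rate_hz / frame_rate_hz)
    if len(samples_cd_m2) < needed:
        raise ValueError("integration window too short: need %d samples" % needed)
    return sum(samples_cd_m2) / len(samples_cd_m2)

# Example: 1000 samples at 1 kHz (one second) of a nominally 48 cd/m^2 source.
print(mean_luminance([48.0] * 1000))  # prints 48.0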

2.1.1.2.Stray Light Elimination Tube (SLET)

To eliminate the room effect (ambient light) that would appear at an incident (illuminance) meter, a Stray Light Elimination Tube (SLET) should be used. Refer to Annex F for the description of the device and details on how to manufacture it.

2.1.1.3.Spectroradiometer Type

Chromaticity shall be measured with a spot spectroradiometer having an acceptance angle of 2° or less. It shall report values in CIE x, y coordinates, with an accuracy of ±0.002 or better for both x and y. (It should be noted that this accuracy applies to luminance levels above 0.1 cd/m².)
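
As a simple numeric illustration of the ±0.002 criterion, a measured chromaticity can be checked against a reference value component-wise. The helper below, its names, and the example coordinates are illustrative assumptions only.

def within_xy_tolerance(measured_xy, reference_xy, tol=0.002):
    # True only if both CIE x and y differ from the reference by no more than tol.
    return (abs(measured_xy[0] - reference_xy[0]) <= tol and
            abs(measured_xy[1] - reference_xy[1]) <= tol)

# Example: a reading of (0.3145, 0.3520) against a nominal white point of
# (0.3140, 0.3510) is within tolerance; (0.3180, 0.3510) would not be.
print(within_xy_tolerance((0.3145, 0.3520), (0.3140, 0.3510)))  # prints True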

2.1.1.4.Digital Still Camera Type

Care must be taken when selecting a digital still camera and lens. The camera shall meet the following requirements:

1. Capture RAW data files without compression.

2. Have a minimum bit depth of 12 bits.

3. Have a non-uniformity rating of 2.5% or less (see the sketch following this list).

4. Provide the ability to turn off all automatic settings.
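
The sketch below shows one way a candidate camera's flat-field capture might be screened against items 2 and 3 above. The 12-bit and 2.5% figures come from the list; the grid sampling, the (max - min)/max non-uniformity formulation (a VESA-style calculation), and the function names are assumptions made for illustration.

import numpy as np

def code_value_range_ok(raw, min_bits=12):
    # Coarse sanity check only: the recorded code values should exceed the
    # range of an (min_bits - 1)-bit capture.  This flags obviously
    # re-quantized data; it does not by itself prove true 12-bit precision.
    return int(raw.max()) > 2 ** (min_bits - 1) - 1

def nonuniformity_percent(flat_field, grid=3):
    # Sampled non-uniformity of a flat-field capture: 100 * (max - min) / max
    # over the mean values of a grid x grid set of patches.
    h, w = flat_field.shape[:2]
    means = [flat_field[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid].mean()
             for i in range(grid) for j in range(grid)]
    return 100.0 * (max(means) - min(means)) / max(means)

# Synthetic flat field (illustrative only): a uniform 3000-count exposure with
# mild sensor noise should pass both screens comfortably.
flat = np.full((600, 800), 3000.0) + np.random.normal(0.0, 10.0, (600, 800))
print(code_value_range_ok(flat.astype(np.uint16)), nonuniformity_percent(flat) < 2.5)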