Draft VQEG Hybrid Testplan Version 1.4

Hybrid Perceptual/Bitstream Group

TEST PLAN

Draft Version 1.4

Jan. 25, 2010

Contacts:

Jens Berger (Co-Chair)  Tel: +41 32 685 0830  Email:

Chulhee Lee (Co-Chair)  Tel: +82 2 2123 2779  Email:

David Hands (Editor)  Tel: +44 (0)1473 648184  Email:

Nicolas Staelens (Editor)  Tel: +32 9 331 49 75  Email:

Yves Dhondt (Editor)  Tel: +32 9 331 49 85  Email:

Margaret Pinson (Editor)  Tel: +1 303 497 3579  Email:

Hybrid Test Plan DRAFT version 1.4, June 10, 2009


Editorial History

Version / Date / Nature of the modification
1.0 / May 9, 2007 / Initial Draft, edited by A. Webster (from Multimedia Testplan 1.6)
1.1 / Revised First Draft, edited by David Hands and Nicolas Staelens
1.1a / September 13, 2007 / Edits approved at the VQEG meeting in Ottawa.
1.2 / July 14, 2008 / Revised by Chulhee Lee and Nicolas Staelens using some of the outputs of the Kyoto VQEG meeting
1.3 / Jan. 4, 2009 / Revised by Chulhee Lee, Nicolas Staelens and Yves Dhondt using some of the outputs of the Ghent VQEG meeting
1.4 / June 10, 2009 / Revised by Chulhee Lee using some of the outputs of the San Jose VQEG meeting
1.5 / June 23, 2009 / The previous decisions are incorporated.
1.6 / June 24, 2009 / Additional changes are made.
1.7 / Jan. 25, 2010 / Revised by Chulhee Lee using the outputs of the Berlin VQEG meeting
1.8 / Jan. 28, 2010 / Revised by Chulhee Lee using the outputs of the Boulder VQEG meeting
1.9 / Jun. 30, 2010 / Revised by Chulhee Lee during the Krakow VQEG meeting
2.0 / Oct. 25, 2010 / Revised by Margaret Pinson



Contents

1. Introduction
2. List of Definitions
3. List of Acronyms
4. Overview: ILG, Proponents, Tasks and Schedule
4.1 Division of Labor
4.1.1 Independent Laboratory Group (ILG)
4.1.2 Proponent Laboratories
4.1.3 VQEG
4.2 Overview
4.2.1 Compatibility Test Phase: Training Data
4.2.2 Testplan Design
4.2.3 Evaluation Phase
4.2.4 Common Set
4.3 Publication of Subjective Data, Objective Data, and Video Sequences
4.4 Test Schedule
4.5 Advice to Proponents on Pre-Model Submission Checking
6. SRC Video Restrictions and Video File Format
6.1 Source Sequence Processing Overview and Restrictions
6.2 SRC Resolution, Frame Rate and Duration
6.3 Source Test Material Requirements: Quality, Camera, Use Restrictions
6.4 Source Conversion
6.4.1 Software Tools
6.4.2 Colour Space Conversion
6.4.3 De-Interlacing
6.4.4 Cropping & Rescaling
6.5 Video File Format: Uncompressed AVI in UYVY
6.6 Source Test Video Sequence Documentation
6.7 Test Materials and Selection Criteria
7. HRC Creation and Sequence Processing
7.1 Reference Encoder, Decoder, Capture, and Stream Generator
7.2 Video Bit-Rates (examples) <XXX>
7.3 Frame Rates <XXX>
7.4 Pre-Processing
7.5 Post-Processing
7.6 Coding Schemes
7.7 Rebuffering
7.8 Transmission Errors
7.8.1 Simulated Transmission Errors
7.8.2 Live Network Conditions
7.9 PVS Editing
8. Calibration and Registration
9. Experiment Design
9.1 Video Sequence Naming Convention <XXX>
9.2 Check on Bit-stream Validity
10. Subjective Evaluation Procedure
10.1 The ACR Method with Hidden Reference
10.1.1 General Description
10.1.2 Viewing Distance, Number of Viewers per Monitor, and Viewer Position
10.2 Display Specification and Set-up
10.2.1 QVGA and WVGA Requirements
10.2.2 HD Monitor Requirements
10.2.3 Viewing Conditions
10.3 Subjective Test Video Playback
10.4 Evaluators (Viewers)
10.4.2 Subjective Experiment Sessions
10.4.3 Randomization
10.4.4 Test Data Collection
10.5 Results Data Format
11. Objective Quality Models
11.1 Model Type and Model Requirements
11.2 Model Input and Output Data Format
11.2.1 No-Reference Hybrid Perceptual Bit-Stream Models and No-Reference Models
11.2.2 Full-Reference Hybrid Perceptual Bit-Stream Models
11.2.3 Reduced-Reference Hybrid Perceptual Bit-Stream Models
11.2.4 Output File Format – All Models
11.3 Submission of Executable Model
11.4 Registration
12. Objective Quality Model Evaluation Criteria <XXX>
12.1 Post Subjective Testing Elimination of SRC or PVS
12.2 Evaluation Procedure
12.3 PSNR <XXX>
12.4 Data Processing
12.4.1 Video Clips and Scores Used in Analysis
12.4.2 Inspection of Viewer Data
12.4.3 Calculating DMOS Values
12.4.4 Mapping to the Subjective Scale
12.4.5 Averaging Process
12.4.6 Aggregation Procedure
12.5 Evaluation Metrics
12.5.1 Pearson Correlation Coefficient
12.5.2 Root Mean Square Error
12.5.3 Outlier Ratio
12.6 Statistical Significance of the Results
12.6.1 Significance of the Difference between the Correlation Coefficients
12.6.2 Significance of the Difference between the Root Mean Square Errors
12.6.3 Significance of the Difference between the Outlier Ratios
13. Recommendation
14. Bibliography
ANNEX I Instructions to the Evaluators
ANNEX II Background and Guidelines on Transmission Errors
ANNEX III Fee and Conditions for Receiving Datasets
ANNEX IV Method for Post-Experiment Screening of Evaluators
ANNEX V Encrypted Source Code Submitted to VQEG
ANNEX VI Definition and Calculation of Gain and Offset in PVSs
APPENDIX I Terms of Reference of Hybrid Models (Scope as Agreed in June, 2009)
