Content for TR 37.977 (Word version 16.0.0)
2	References
3	Definitions, symbols and abbreviations
3.1	Definitions
3.2	Symbols
3.3	Abbreviations
4	Introduction
4.1	Background
4.2	Work item objective
4.3	High level requirements
5	Performance metrics
5.1	Figure of Merits
5.1.1	Definition of MIMO throughput
5.1.2	Definition of Signal-to-Interference Ratio (SIR)
5.1.2.1	SIR Control for Multi-Probe Anechoic Chamber Methodology |R13|
5.1.2.2	SIR Control for the reverberation chamber method |R13|
5.1.2.3	SIR Control for the reverberation chamber plus channel emulator method |R13|
5.1.2.4	SIR Control for the two-stage methodology |R13|
5.2	Averaging of throughput curves
5.2.1	Average of power levels
6	Candidate measurement methodologies
6.1	Void
6.2	Void
6.3	Downlink measurement methodologies
6.3.1	Methodologies based on Anechoic RF Chamber
6.3.1.1	Candidate Solution 1
6.3.1.1.1	Concept and configuration
6.3.1.1.2	Scalability of the methodology
6.3.1.1.3	Test conditions
6.3.1.2	Void
6.3.1.3	Candidate solution 3
6.3.1.3.1	Concept and configuration
6.3.1.3.2	Test conditions
6.3.1.3.3	Overview of calibration procedures specific to the RTS method |R13|
6.3.1.4	Candidate solution 4
6.3.1.4.1	Concept and configuration
6.3.1.4.2	Decomposition approach
6.3.1.4.3	Conducted test
6.3.1.4.4	Radiated test
6.3.1.4.5	Possible extensions of the decomposition method
6.3.1.5	Candidate solution 5
6.3.1.5.1	Concept and configuration
6.3.1.5.2	Test conditions
6.3.2	Methodologies based on Reverberation Chamber
6.3.2.1	Candidate solution 1
6.3.2.1.1	Concept and configuration
6.3.2.1.2	Test conditions
6.3.2.2	Candidate solution 2
6.3.2.2.1	Concept and configuration
6.3.2.2.2	Test conditions
7	Base Station (BS) configuration
7.1	eNodeB emulator settings
8	Channel Models
8.1	Introduction
8.2	Channel Model(s) to be validated
8.3	Verification of Channel Model implementations
8.3.1	Measurement instruments and setup
8.3.1.1	Vector Network Analyzer (VNA) setup
8.3.1.2	Spectrum Analyzer (SA) setup
8.3.2	Validation measurements
8.3.2.1	Power Delay Profile (PDP)
8.3.2.2	Doppler/Temporal correlation
8.3.2.3	Spatial correlation
8.3.2.4	Cross-polarization
8.3.3	Reporting
8.4	Channel Model validation results
8.4.1	Scope
8.4.2	Power Delay Profile (PDP)
8.4.3	Doppler/Temporal correlation
8.4.4	Spatial correlation
8.4.5	Cross-polarization
8.4.6	Summary
8.5	Channel Model emulation of the Base Station antenna pattern configuration
9	Reference antennas and devices testing
9.1	Reference antennas design
9.2	Reference devices
9.3	Description of tests with reference antennas and devices
9.3.1	The Absolute Data Throughput Comparison Framework
9.3.1.1	Introduction
9.3.1.2	Antenna pattern data format
9.3.1.3	Emulation of antenna pattern rotation
9.3.1.4	Absolute Data Throughput measurement enabler
9.3.1.5	Output data format
9.3.1.6	Application of the framework and scenarios for comparison
9.3.1.7	Proof of concept
9.3.1.7.1	The first scenario, anechoic based
9.3.1.7.2	The second scenario, reverberation chamber based
9.3.1.7.3	The third scenario, reverberation chamber and channel emulator based
9.4	Device positioning
9.4.1	Handheld UE - Browsing mode
9.4.1.1	MPAC Positioning Guidelines |R14|
9.4.2	Handheld UE - Speech mode
9.4.2.1	MPAC Positioning Guidelines |R14|
9.4.3	Laptop Mounted Equipment (LME)
9.4.3.1	MPAC Positioning Guidelines |R14|
9.4.4	Laptop Embedded Equipment (LEE)
9.4.4.1	MPAC Positioning Guidelines |R14|
10	Measurement results from testing campaigns
10.1	Introduction
10.2	CTIA test campaign
10.2.1	Description of the test plan
10.2.2	Anechoic chamber method with multiprobe configuration
10.2.3	Reverberation chamber method using NIST channel model and using channel emulator with short delay spread low correlation channel model
10.2.4	RTS method results
10.3	3GPP harmonization test campaign |R13|
10.3.1	Description of the test plan
10.3.2	Devices under test
10.3.3	Measurement uncertainty bound for harmonization
10.3.4	Summary of results
10.3.5	Harmonization outcome
10.3.5a	Harmonization outcome with device set 3 |R14|
10.3.5b	Harmonization outcome with device set 4 |R14|
10.4	Lab alignment procedures for performance labs |R14|
10.4.1	General
10.4.2	Channel model validation data
10.4.3	Calibration with a specific set of reference dipoles
10.4.4	Performance alignment measurements
10.4.5	Acceptance criteria
11	Void
12	MIMO OTA test procedures
12.1	Anechoic chamber method with multiprobe configuration test procedure
12.1.1	Base Station configuration
12.1.2	Channel Models
12.1.3	Device positioning and environmental conditions
12.1.4	System Description
12.1.4.1	Solution Overview
12.1.4.2	Configuration
12.1.4.3	Calibration
12.1.5	Figure of Merit
12.1.6	Test procedure
12.1.6.1	Initial conditions
12.1.6.2	Test procedure
12.1.7	Measurement Uncertainty budget
12.2	Reverberation chamber test procedure
12.2.1	Base Station configuration
12.2.2	Channel Models
12.2.3	Device positioning and environmental conditions
12.2.4	System Description
12.2.4.1	Solution Overview
12.2.4.2	Configuration
12.2.4.3	Calibration
12.2.5	Figure of Merit
12.2.6	Test procedure
12.2.6.1	Initial conditions
12.2.6.2	Test procedure
12.2.7	Measurement Uncertainty budget
12.3	RTS method test procedure
12.3.1	Base Station configuration
12.3.2	Channel Models
12.3.3	Device positioning and environmental conditions
12.3.4	System Description
12.3.4.1	Solution Overview
12.3.4.2	Configuration
12.3.4.3	Calibration
12.3.5	Figure of Merit
12.3.6	Test procedure
12.3.6.1	Initial conditions
12.3.6.2	Test procedure
12.3.7	Measurement Uncertainty budget
12.4	Comparison of methodologies