
3GPP TR 26.928, version 18.0.0

4.9  Ongoing Standardisation Work

4.9.1  Related Work in 3GPP

4.9.1.1  Introduction

This clause summarizes relevant 3GPP activities in the context of XR.
  • 3GPP TR 26.918 provides an introduction to Virtual Reality and TS 26.118 defines Virtual Reality Media Profiles for omnidirectional 3DoF media.
  • 3GPP TR 22.842 on Network Controlled Interactive Service (NCIS) analyses several use cases of NCIS as follows:
    • NCIS Service Supporting New Requirements for VR Based NCIS Service
    • Cloud Rendering for Games
    • High Speed Scenario
    • IoE Based Social Networking
    • Communication within NCIS group
    Based on the TR, several new requirements were identified for TS 22.261. In addition, KPIs for the services mentioned above are documented in clause 6.2 of TR 22.842; these require additional input, including information from this Technical Report.
  • In the context of Release-17, 3GPP work is ongoing to integrate edge processing into the 5G System. TR 23.748 defines modifications to the 5G System architecture to enhance Edge Computing. This work is currently in the study phase, defining Key Issues and the scope for Release-17. In addition, TR 23.758 identifies a new set of application-layer interfaces for Edge Computing that may be useful for integrating edge computing.

4.9.2  Related Work External to 3GPP

4.9.2.1  Introduction

This clause summarizes relevant external standardisation efforts in the context of XR that may provide functionalities of benefit to 5G-based XR applications.

4.9.2.2  MPEG

4.9.2.2.1  Introduction
In October 2016, MPEG initiated a new project on "Coded Representation of Immersive Media", referred to as MPEG-I. The proposal was justified by the emergence of new devices and services that allow users to be immersed in media and to navigate in multimedia scenes. It was observed that a fragmented market exists for such devices and services, notably for content that is delivered "over the top". The project is motivated by the lack of common standards, which prevents interoperable services and devices from providing immersive and navigable experiences. The MPEG-I project is expected to enable existing services in an interoperable manner and to support the evolution of interoperable immersive media services. Enabled by the parts of this standard, end users are expected to be able to access interoperable content and services, and to acquire devices that allow them to consume these.
After the launch of the project, several phases, activities and projects have been initiated that enable the services considered in MPEG-I.
The project is divided into tracks that enable different core experiences. Each phase is supported by key activities in MPEG, namely in systems, video, audio and 3D graphics-related technologies.
Core technologies as well as additional enablers are specified in the parts of the MPEG-I standard. Currently, the following 14 parts are under development:
  • Part 1 - Immersive Media Architectures
  • Part 2 - Omnidirectional MediA Format
  • Part 3 - Versatile Video Coding
  • Part 4 - Immersive Audio Coding
  • Part 5 - Video-Based Point Cloud Coding (V-PCC)
  • Part 6 - Immersive Media Metrics
  • Part 7 - Immersive Media Metadata
  • Part 8 - Network-Based Media Processing
  • Part 9 - Geometry Point Cloud Coding (G-PCC)
  • Part 10 - Carriage of Video-based Point Cloud Coding Data
  • Part 11 - Implementation Guidelines for Network-based Media Processing
  • Part 12 - Carriage of Geometry-based Point Cloud Coding Data
  • Part 13 - Multi-Decoder Video Decoding Interface for Immersive Media
  • Part 14 - Scene Description for MPEG Media
In addition, other technical components may be provided in existing MPEG specifications outside of MPEG-I (e.g., HEVC and AVC) in order to create interoperable immersive experiences.

4.9.2.3  Khronos

Khronos creates open standards for 3D graphics, Virtual and Augmented Reality, Parallel Computing, Neural Networks, and Vision Processing. Specifically relevant for the work on XR are the following activities:
  • OpenGL® is the most widely adopted 2D and 3D graphics API in the industry, bringing thousands of applications to a wide variety of computer platforms. It is window-system and operating-system independent as well as network-transparent. OpenGL enables developers of software for PC, workstation, and supercomputing hardware to create high-performance, visually compelling graphics software applications, in markets such as CAD, content creation, energy, entertainment, game development, manufacturing, medical, and virtual reality. OpenGL exposes all the features of the latest graphics hardware.
  • Vulkan is a new generation graphics and compute API that provides high-efficiency, cross-platform access to modern GPUs used in a wide variety of devices from PCs and consoles to mobile phones and embedded platforms.
  • OpenXR [16] is an open standard that provides high-performance access to Augmented Reality (AR) and Virtual Reality (VR) platforms and devices, collectively known as XR.
  • glTF™ (GL Transmission Format) [39] is a specification for the efficient transmission and loading of 3D scenes and models by applications. glTF minimizes both the size of 3D assets, and the runtime processing needed to unpack and use those assets. glTF defines an extensible, common publishing format for 3D content tools and services that streamlines authoring workflows and enables interoperable use of content across the industry (see the sketch after this list).
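To give a feel for the format, the following TypeScript sketch assembles a minimal glTF 2.0 document as a plain JSON object. This is an illustrative sketch, not text from the glTF specification: only the asset.version property is mandatory in glTF 2.0, the GltfDocument interface is a simplified, hypothetical subset of the schema, and real geometry would additionally require meshes, accessors, bufferViews and buffers.

```typescript
// Minimal glTF 2.0 document assembled in code (illustrative sketch only).
// glTF is a JSON container; "asset.version" is the only mandatory
// top-level property. "GltfDocument" is a hypothetical subset of the
// schema, not an official type definition.
interface GltfDocument {
  asset: { version: string; generator?: string };
  scene?: number;                              // index of the default scene
  scenes?: { nodes: number[] }[];              // each scene lists root nodes
  nodes?: { name?: string; mesh?: number }[];  // node hierarchy / mesh refs
}

const doc: GltfDocument = {
  asset: { version: "2.0", generator: "hand-written example" },
  scene: 0,
  scenes: [{ nodes: [0] }],   // scene 0 has a single root node
  nodes: [{ name: "root" }],  // an empty node; real assets reference meshes
};

// Serialize to the .gltf (JSON) form; binary payloads would live in
// separate .bin buffers or a single-file .glb container.
console.log(JSON.stringify(doc, null, 2));
```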

4.9.2.4  W3C WebXR

The WebXR Device API specification [17], available at https://immersive-web.github.io/webxr/, provides interfaces to VR and AR hardware that allow developers to build compelling, comfortable VR/AR experiences on the web. The latest version, the "WebXR Device API, Editor's Draft, 10 February 2020", is no longer marked as an "UNSTABLE API" and links to an accompanying "WebXR Device API Explained" document.
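As a non-normative sketch of how a web application uses this API, the TypeScript fragment below requests an immersive VR session and reads the viewer pose on each animation frame. It assumes a browser exposing navigator.xr with WebXR type definitions installed (e.g. the @types/webxr package); actual WebGL drawing and error handling are omitted, and the API requires the call to originate from a user gesture.

```typescript
// Sketch of the WebXR Device API session lifecycle. Assumes @types/webxr
// for the XR* types; must be triggered from a user gesture (e.g. a click).
async function enterImmersiveVr(): Promise<void> {
  const xr = navigator.xr;
  if (!xr || !(await xr.isSessionSupported("immersive-vr"))) {
    console.warn("WebXR immersive-vr is not available on this device");
    return;
  }

  const session = await xr.requestSession("immersive-vr");

  // An XRWebGLLayer ties the session to a WebGL context for rendering.
  const canvas = document.createElement("canvas");
  const gl = canvas.getContext("webgl", { xrCompatible: true })!;
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  const refSpace = await session.requestReferenceSpace("local");

  const onXrFrame = (_time: number, frame: XRFrame): void => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      for (const view of pose.views) {
        // Render one view per eye using view.projectionMatrix and
        // view.transform (drawing code omitted in this sketch).
        void view;
      }
    }
    session.requestAnimationFrame(onXrFrame); // schedule the next XR frame
  };
  session.requestAnimationFrame(onXrFrame);
}
```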

4.10  XR Use Cases

In Annex A of this document, a significant number of use cases are collected that serve to identify potential interfaces, formats, protocols and requirements for the 5G System in order to support XR applications.
Table 4.10 provides an overview of the use cases and their characterization.
In addition, the table explicitly lists the device types that were developed in clause 4.8.
Table 4.10: Overview of use cases and their characterization

No | Use Case | Type | Experience | Delivery | Device Types
1 | 3D Image Messaging | AR | 3DoF+, 6DoF | Upload and Download | XR5G-P1, XR5G-AX
2 | AR Sharing | AR, MR | 6DoF | Local, Messaging, Download and Upload | XR5G-P1, XR5G-AX
3 | Streaming of Immersive 6DoF | VR | 3DoF+, 6DoF | Streaming, Interactive, Split | XR5G-V3 or XR5G-V4 with controller
4 | Emotional Streaming | 2D, AR and VR | 2D, 3DoF+, 6DoF | Streaming, Interactive, Split | XR5G-P1, XR5G-V3, XR5G-V4
5 | Untethered Immersive Online Gaming | VR | 6DoF | Streaming, Interactive, Split | XR5G-V3, XR5G-V4 with gaming controller
6 | Immersive Game Spectator Mode | VR | 6DoF | Streaming, Split | XR5G-P1, XR5G-V3, XR5G-V4
7 | Real-time 3D Communication | 3D, AR | 3DoF+ | Conversational | XR5G-P1, XR5G-AX
8 | AR guided assistant at remote location (industrial services) | 2D video with dynamic AR rendering of graphics | 6DoF (2D + AR) | Local, Streaming, Interactive, Conversational | XR5G-P1, XR5G-AX
9 | Police Critical Mission with AR | AR, VR | 3DoF to 6DoF | Local, Streaming, Interactive, Conversational, Group Communication | XR5G-A3, XR5G-A4
10 | Online shopping from a catalogue - downloading | AR | 6DoF | Download | XR5G-P1, XR5G-AX
11 | Real-time communication with the shop assistant | AR | 6DoF | Interactive, Conversational | XR5G-P1, XR5G-AX
12 | 360-degree conference meeting | AR, MR, VR | 3DoF | Conversational | XR5G-P1, XR5G-V3, XR5G-V4
13 | 3D shared experience | AR, MR, VR | 3DoF+, 6DoF | Conversational | XR5G-P1, XR5G-V3, XR5G-V4
14 | 6DOF VR conferencing | VR | 6DoF | Interactive, Conversational | XR5G-V3, XR5G-V4
15 | XR Meeting | AR, VR, XR | 6DoF | Interactive, Conversational | XR5G-P1, XR5G-A1, XR5G-A2, XR5G-A5
16 | Convention / Poster Session | AR, VR, MR | 6DoF | Interactive, Conversational | XR5G-P1, XR5G-A1, XR5G-A2, XR5G-A5
17 | AR animated avatar calls | AR | 2D, 3DoF | Conversational | XR5G-P1, XR5G-A1, XR5G-A2, XR5G-A5
18 | Online shopping from a catalogue - downloading | AR | 6DoF | Download | XR5G-P1, XR5G-A1, XR5G-A2, XR5G-A5
19 | Front-facing camera video multi-party calls | AR | 3DoF | Conversational | XR5G-P1, XR5G-AX
20 | AR Streaming with Localization Registry | AR, Social AR | 6DoF | Streaming, Interactive, Conversational | XR5G-A1, XR5G-A2, XR5G-A5
21 | Immersive 6DoF Streaming with Social Interaction | VR and Social VR | 3DoF+, 6DoF | Streaming, Interactive, Conversational, Split | XR5G-V3, XR5G-V4
22 | 5G Online Gaming Party | VR | 6DoF | Streaming, Interactive, Split, D2D | XR5G-V3, XR5G-V4
23 | Spatial Shared Data | AR | 6DoF | Streaming, Interactive, Conversational, Split | XR5G-AX
In clause 5, these use cases are consolidated into several core use cases and scenarios.

4.11  Summary of Remaining Issues Addressed in this Document

Based on the introduced technologies and the use cases, the remainder of this Technical Report addresses the following:
  • identify the mapping of different XR use cases to the 5G System and 5G Media Delivery services according to clause 4.3.
  • identify functions, interfaces and APIs for different delivery scenarios.
  • define high-level call flows and parameter exchange for the different service scenarios.
  • identify, for the different scenarios, the formats as well as traffic requirements and properties.
  • identify technical requirements on formats, processing functions, interfaces, and protocols in order to achieve adequate Quality of Experience based on the considerations in clause 4.2.
  • identify potential standardisation areas and their potential timeline.
