Figure 6.3.4.1-1 illustrates the procedure for interactive immersive services using a STAR-based UE, where all essential AR/MR functions are available in the UE without assistance from an edge.
The AR/MR Scene Manager includes immersive media rendering and scene graph handling functionalities.
The Media Player includes immersive content delivery and immersive media decoding functionalities.
The AR/MR Application in the UE is started by the user.
The STAR UE initialises AR registration, i.e. it starts analysing the surroundings in which the user/UE is located (an illustrative sketch is given after this list). In particular, the UE:
captures its surroundings via camera(s),
analyses where the device is located, and
registers the device into the analysed surroundings.
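As an illustrative, non-normative example, the following Python sketch shows how a UE implementation could structure this registration step. The camera, spatial analyser and AR Runtime interfaces (capture_frames, localise, register_anchor) are hypothetical and are not defined by this document.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple      # (x, y, z) in metres, world coordinates
    orientation: tuple   # quaternion (x, y, z, w)

class ARRegistration:
    """Illustrative AR registration flow; all interfaces are hypothetical."""

    def __init__(self, camera, spatial_analyser, ar_runtime):
        self.camera = camera                      # capture device(s)
        self.spatial_analyser = spatial_analyser  # SLAM / localisation function
        self.ar_runtime = ar_runtime              # anchor management

    def initialise(self) -> Pose:
        # 1) capture the surroundings via camera(s)
        frames = self.camera.capture_frames()
        # 2) analyse where the device is located
        device_pose = self.spatial_analyser.localise(frames)
        # 3) register the device into the analysed surroundings
        self.ar_runtime.register_anchor("device-origin", device_pose)
        return device_pose
```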
The AR/MR Application and the AR/MR Application Provider have exchanged information needed for content rendering, such as device capabilities or content configuration. The exchange procedures for device capabilities and content configuration are FFS.
The AR/MR Application Provider has established a Provisioning Session, and its detailed configurations have been exchanged.
The AR/MR Application Provider has completed the setup for ingesting immersive content.
A Service Announcement is triggered by the AR/MR Application. The Service Access Information, including the Media Client entry, or a reference to the Service Access Information is provided through the M8d interface.
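The concrete structure of the Service Access Information is specified elsewhere (TS 26.512); the following sketch is only a simplified, non-normative illustration of how a Media Client could obtain the entry point from it. All field names and values are examples, not the normative schema.

```python
import json

# Simplified, non-normative illustration of Service Access Information
# provided over M8d; all field names and values are examples only.
service_access_information = json.loads("""
{
  "provisioningSessionId": "ps-example-001",
  "streamingAccess": {
    "entryPoint": "https://example.com/scenes/entry.gltf",
    "entryPointContentType": "model/gltf+json"
  }
}
""")

entry_point_url = service_access_information["streamingAccess"]["entryPoint"]
```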
The Media Client requests and receives the full scene description. The entry point (scene description) is processed by the AR/MR Scene Manager, and a scene session is created.
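The sketch below illustrates, in a non-normative way, how the Media Client could fetch the entry point and how the AR/MR Scene Manager could create a scene session from it. A glTF-based scene description is assumed, and the SceneManager class and its methods are hypothetical.

```python
import json
import urllib.request

def fetch_scene_description(entry_point_url: str) -> dict:
    """Request and receive the full scene description (glTF JSON assumed)."""
    with urllib.request.urlopen(entry_point_url) as response:
        return json.loads(response.read())

class SceneManager:
    """Hypothetical AR/MR Scene Manager front-end."""

    def create_scene_session(self, scene_description: dict) -> dict:
        # Walk the scene graph nodes and set up a session context.
        nodes = scene_description.get("nodes", [])
        return {"session_id": "scene-session-1", "node_count": len(nodes)}
```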
The latest interaction and pose information is acquired by the AR/MR Scene Manager and shared with the Media Client. The Media Client sends this information to the Media AS and the Scene Server.
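The structure of the interaction/pose report is not specified here; the following sketch merely illustrates one possible, non-normative encoding that the Media Client could forward to the Media AS and Scene Server.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class PoseReport:
    """Illustrative interaction/pose report; the structure is non-normative."""
    timestamp_ms: int
    position: tuple     # (x, y, z) in metres
    orientation: tuple  # quaternion (x, y, z, w)
    interactions: list  # e.g. [{"node": "door-1", "action": "select"}]

def build_pose_report(position, orientation, interactions) -> str:
    report = PoseReport(
        timestamp_ms=int(time.time() * 1000),
        position=position,
        orientation=orientation,
        interactions=interactions,
    )
    return json.dumps(asdict(report))  # forwarded by the Media Client
```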
The Scene Server processes the scene according to the interaction and pose information from the UE. Depending on the level of processing, the current scene may be updated or replaced.
The Media Client processes the delivery manifest(s) and determines, for example, the number of transport sessions needed for media acquisition. The Media Client is expected to use the delivery manifest information to initialise the media pipelines for each media stream.
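Assuming DASH MPD-like delivery manifests, the following non-normative sketch shows how a Media Client could derive one media pipeline per Adaptation Set; the one-pipeline-per-Adaptation-Set mapping is an assumption made purely for illustration.

```python
import xml.etree.ElementTree as ET

MPD_NS = {"mpd": "urn:mpeg:dash:schema:mpd:2011"}

def init_media_pipelines(mpd_xml: str) -> list:
    """Derive one media pipeline per Adaptation Set (illustrative mapping)."""
    mpd = ET.fromstring(mpd_xml)
    pipelines = []
    for period in mpd.findall("mpd:Period", MPD_NS):
        for adaptation_set in period.findall("mpd:AdaptationSet", MPD_NS):
            pipelines.append({
                "content_type": adaptation_set.get("contentType", "unknown"),
                "representations": [
                    rep.get("id")
                    for rep in adaptation_set.findall("mpd:Representation", MPD_NS)
                ],
            })
    return pipelines
```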
The Media Client requests the immersive media data according to the processed delivery manifest(s), possibly taking the pose information into account (e.g., viewport-dependent streaming).
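As a non-normative illustration of viewport-dependent streaming, the sketch below prioritises higher-quality representations for content inside the current viewport. The mapping from pose to visible representations (viewport_reps) and the two-level quality policy are assumptions for illustration only.

```python
def select_requests(pipelines, viewport_reps):
    """Choose a quality per representation based on the current viewport.

    'viewport_reps' is a hypothetical set of representation IDs that the
    current pose makes visible; the selection policy is illustrative only.
    """
    requests = []
    for pipeline in pipelines:
        for rep_id in pipeline["representations"]:
            quality = "high" if rep_id in viewport_reps else "low"
            requests.append({"representation": rep_id, "quality": quality})
    return requests
```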
The Media Client receives the immersive media data and triggers the media rendering pipeline(s), including the registration of the AR content into the real world.
The AR/MR Scene Manager renders the media and passes the rendered media to the AR Runtime, which performs further processing such as registration of the AR content into the real world and pose correction.
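The following non-normative sketch illustrates the hand-off between the AR/MR Scene Manager and the AR Runtime in this step, with pose correction applied between rendering and display; all interfaces shown are hypothetical.

```python
def render_and_display(scene_manager, ar_runtime, decoded_media):
    """Illustrative per-frame hand-off (hypothetical interfaces).

    The Scene Manager renders with a predicted pose; the AR Runtime then
    registers the content into the real world and corrects for the pose
    change that occurred while rendering (late-stage reprojection).
    """
    render_pose = ar_runtime.get_predicted_pose()
    frame = scene_manager.render(decoded_media, render_pose)

    display_pose = ar_runtime.get_latest_pose()
    corrected_frame = ar_runtime.reproject(frame, render_pose, display_pose)
    ar_runtime.compose_and_display(corrected_frame)
```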