3GPP TS 26.143, version 18.1.0


4  Overview and Context

4.1  Background and Assumptions

Messaging services typically define a message container. Such a container carries one or more body parts with the actual message content (for example, an emoji used in a reaction, a plain text or rich text message or reply, a link, an inline image, or richer media types).
An important feature of messages is body parts that include media content. Such media content may be of many different types, such as simple and rich text, still images, graphics, speech, audio, video, 3D scenes and many others.
This specification does not define a container format, but it addresses the use of 3GPP-defined media types and formats in messages, as part of a message body within message containers. Examples of message containers are OMA MMS PDUs [7] [8] [9] [15], IETF MIMI message containers [6] and GSMA RCS [10] [11].
The focus of this specification is the definition of parts of a message body that carry multimedia content, referred to as multimedia messaging body parts (MMBPs). This specification does not generally define how the body part is encoded: existing functionalities, for example those defined in OMA MMS PDUs [7] [8] [9] [15] or MIMI message containers [6], may be used for this purpose. However, this specification provides the definition of an MMBP using the ISO Base Media File Format [13] to provide features for mixing multiple sub-parts into a single body part. The specification relies on ISO/IEC 23000-24 [14].
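The ISO Base Media File Format organizes content as a sequence of boxes, each with a 32-bit size and a four-character type. The following Python sketch is purely illustrative and is not derived from ISO/IEC 23000-24 [14] or from this specification; it only lists the top-level boxes of an ISOBMFF-encoded octet string, without any MMBP-specific interpretation.

```python
# Illustrative sketch only: enumerate top-level ISOBMFF boxes of an octet string.
import struct

def list_top_level_boxes(data: bytes):
    boxes, offset = [], 0
    while offset + 8 <= len(data):
        size, = struct.unpack_from(">I", data, offset)          # 32-bit box size
        box_type = data[offset + 4:offset + 8].decode("ascii", errors="replace")
        header = 8
        if size == 1:                                            # 64-bit largesize follows the type
            size, = struct.unpack_from(">Q", data, offset + 8)
            header = 16
        elif size == 0:                                          # box extends to the end of the data
            size = len(data) - offset
        boxes.append((box_type, size))
        offset += max(size, header)                              # guard against malformed sizes
    return boxes
```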
However, this specification is not restricted to use with a fully specified Messaging Service; it may also be used as part of third-party messaging services as a message body or, more specifically, as an MMBP. It may also serve to support content interoperability across different messaging services.
The term media type is used as shorthand for the IANA media type, subtype, and parameters as defined in RFC 2046, and describes defined properties of the content. For example, it may indicate whether the content is video or audio, it identifies the encapsulation format, and it may provide parameters such as the codec in use. This specification defines, or at least assigns, a media type to each defined MMBP in order to uniquely identify it.
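As a minimal illustration of this terminology, the sketch below splits a media type string into its type, subtype and parameters. The function name and the example value (a codecs parameter in the style of RFC 6381) are chosen here for illustration and are not defined by this specification.

```python
# Illustrative sketch: split an IANA media type string into type, subtype and parameters.
def parse_media_type(value: str):
    base, _, param_str = value.partition(";")
    mtype, _, subtype = base.strip().partition("/")
    params = {}
    for item in param_str.split(";"):
        if "=" in item:
            key, _, val = item.partition("=")
            params[key.strip()] = val.strip().strip('"')
    return mtype, subtype, params

print(parse_media_type('video/mp4; codecs="avc1.640028"'))
# ('video', 'mp4', {'codecs': 'avc1.640028'})
```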
In order to use MMBPs as defined in this specification as part of a message container format, it is expected that the message container format supports the following functionalities:
  1. It can carry an octet string representing the content of the MMBP.
  2. It can signal the media type of the content.
  3. The content and its media type are not restricted, i.e. the container allows inclusion of formats that are not defined in the core container format.
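MIME, as defined in RFC 2046, is one container format that provides these three functionalities. The sketch below uses Python's standard email library to carry an MMBP as an opaque octet string with a signalled media type; the subtype example-mmbp is a placeholder, not a registered media type.

```python
# Illustrative sketch: carrying an MMBP as an opaque octet string in a MIME body part.
from email.message import EmailMessage

mmbp_octets = b"..."  # the encoded MMBP, opaque to the container (placeholder bytes)

part = EmailMessage()
part.set_content(mmbp_octets,
                 maintype="application",
                 subtype="example-mmbp")   # 1) octet string, 2) signalled media type
# 3) the container does not restrict which media type is carried here.
print(part.get_content_type())             # application/example-mmbp
```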
In addition, a container format may support one or more of the following functionalities, in alignment with the definitions in IETF MIMI [6] and RFC 2046 (see the sketch after this list):
  • the body can be multi-part, i.e., it can have multiple, possibly nested parts, referred to as sub-parts, with one of the following properties and structures:
    • mixed: there are multiple media types associated with the same message which need to be rendered together, for example a rich-text message with an inline image. The receiver is expected to process as many of the nested parts at this level as possible.
    • alternative: there are multiple media types associated with the same message and the receiver can choose an appropriate one based on its own policies using the media type or possibly other parameters (e.g. a language) of each part.
    • related: there are multiple media types associated with the same message and all the nested body parts at this level are part of a single entity and are processed jointly, possibly by providing a root object for initial processing. If the receiver does not understand even one of the nested parts at this level, the receiver is not expected to process any of them.
    • nested: there are multiple media types associated with the same message, and one or several of the media types represent a single, mixed, alternative or related structure.
  • it may have body parts that reference external content via a URI that will be processed automatically. Such a body part includes a media type and may optionally include the size of the data, an expiration timestamp, and other parameters. The content may be rendered with the other parts of the message, or it may be downloaded or rendered separately.
  • it may have body parts for which the content is encrypted.
Note that, based on the above, an MMBP may be the entire message body or a sub-part of it.
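As a hedged illustration of the mixed and alternative structures listed above, the sketch below builds a multipart body with Python's standard email library, one example of an RFC 2046 style container; the payload values are placeholders.

```python
# Illustrative sketch: a body with an "alternative" sub-structure (plain vs. rich text)
# combined with an additional part into a "mixed" structure.
from email.message import EmailMessage

msg = EmailMessage()
# alternative: the receiver picks one representation based on its policies.
msg.set_content("Hello in plain text")                                    # text/plain
msg.add_alternative("<p>Hello in <b>rich text</b></p>", subtype="html")  # text/html

# mixed: an additional image part rendered together with the message.
msg.add_attachment(b"\x89PNG...", maintype="image", subtype="png",
                   filename="inline.png", disposition="inline")

print(msg.get_content_type())   # multipart/mixed (outermost structure)
```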

4.2  System Description

Based on the background and assumptions in clause 4.1, Figure 4.2-1 provides an example system for a messaging service and highlights the scope of this specification, namely the definition of a multimedia messaging body part (MMBP) and the associated metadata.
Figure 4.2-1: Example system for Messaging multimedia message exchange
A Messaging Service Sender instructs an MMBP generator to generate an MMBP, for example using an API. This allows, for example, configurations for codecs, size, experiences or other attributes of the MMBP to be defined. The details of such an API are outside the scope of this specification. The sender adds the MMBP to a Container Message (either included as a body part or by reference), together with MMBP metadata parameters that provide information about the MMBP (see the sketch after the list below). Metadata includes, but is not limited to:
  • The media type of the MMBP, including subtypes and parameters for codecs, etc.
  • The size of the MMBP
  • Accessibility or language information about the MMBP
  • Processing requirements or recommendations for the MMBP
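A minimal sketch of such metadata parameters is shown below; the key names and values are assumptions chosen for illustration and are not defined by this specification.

```python
# Illustrative sketch: MMBP metadata a sender might attach alongside the body part.
mmbp_metadata = {
    "media_type": 'application/example-mmbp; codecs="avc1.640028,mp4a.40.2"',  # placeholder type
    "size": 245_760,               # size of the MMBP in bytes
    "language": "en",              # language / accessibility information
    "processing": "render-inline"  # processing requirement or recommendation
}
```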
The client of the messaging service receives the container message that includes the above information. The client communicates with an MMBP player to determine, based on the player's capabilities, whether the MMBP can be played back and, if multiple alternatives are present, which of them is to be selected. The messaging service client then instructs the MMBP player to play back the MMBP as part of the messaging service, based on the processing requirements and instructions. Playback may be combined with additional instructions for the player, including play, pause, seek, etc.

4.3  MMBP Player Model

The design of the formats defined in this document is based on the player model shown in Figure 4.3-1. The figure illustrates the logical components of a conceptual MMBP Player. In this figure, the MMBP parser receives the MMBP and playback instructions. The Messaging Service Client may use metadata provided in a container message for playback selection. Such metadata may, for example, include codec capability information, language codes, accessibility information and other information for the selection of alternative parts in the MMBP.
The client then provides the sub-parts to the related sub-part processors for processing and decoding, and controls them for playback. The rendered message output may be handed back to the Messaging Service Client for inband rendering or may be rendered directly.
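The following sketch illustrates the selection step described above: the metadata of alternative parts is compared against the player's supported media types and language preferences. The function name, inputs and selection policy are assumptions chosen for illustration and are not defined by this specification.

```python
# Illustrative sketch: pick one playable alternative based on capabilities and language.
def select_alternative(alternatives, supported_types, preferred_languages):
    """Return the best supported alternative, preferring the user's languages;
    None if nothing is playable."""
    def language_rank(part):
        lang = part.get("language")
        return (preferred_languages.index(lang)
                if lang in preferred_languages else len(preferred_languages))
    playable = [p for p in alternatives if p["media_type"] in supported_types]
    return min(playable, key=language_rank) if playable else None

choice = select_alternative(
    [{"media_type": "video/mp4", "language": "en"},
     {"media_type": "text/plain", "language": "fr"}],
    supported_types={"video/mp4", "text/plain"},
    preferred_languages=["en", "fr"],
)
print(choice)   # {'media_type': 'video/mp4', 'language': 'en'}
```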
Figure 4.3-1: MMBP Player Model
Beyond the MMBP formats, this specification also defines capabilities of 3GPP-based MMBP players.

4.4  Generic MMBP Data Model

Based on the description in clause 4.1, an MMBP can be the full body or part of the body of a container message.
An MMBP itself is identified by a media type.
The MMBP may be a single content with a media type.
The MMBP may include multiple additional MMBPs. The following multi-part MMBPs are defined:
  • mixed MMBP: multiple MMBPs are associated with the mixed MMBP that shall be rendered together. Each MMBP is identified by a media type. The receiver is expected to process as many as possible of the included MMBPs based on its capabilities.
  • parallel MMBP: multiple MMBPs are associated with the parallel MMBP that shall be rendered together. Each MMBP is identified by a media type. Real-time MMBPs included in a parallel MMBP share the same MMBP presentation timeline, which has a value of zero at the earliest media sample intended for presentation. If presented jointly, they shall be presented using this common MMBP presentation timeline.
  • alternative MMBP: multiple MMBPs are associated with the alternative MMBP. Each MMBP is identified by a media type. The receiver is expected to process exactly one based on its capabilities.
  • related MMBP: multiple objects are associated with the related MMBP. One object is identified as the root MMBP. The root MMBP is identified by a media type. The root MMBP is processed and identifies whether any, several or all of the remaining objects are used as well. Hence, all other objects are typically also identified by a media type and a URL that links the objects as part of the related MMBP. The processor of the root MMBP also controls the selection, presentation and timing of the other objects.
MMBPs form a recursive structure. Hence, a receiver shall expect that multi-part MMBPs may contain other multi-part MMBPs, as illustrated in the sketch below.
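The recursive data model above can be summarized with a small sketch. The class and field names below are illustrative assumptions, not defined by this specification; the selection rules mirror the descriptions of mixed, parallel, alternative and related MMBPs.

```python
# Illustrative sketch: an MMBP is either a single content with a media type, or a
# multi-part MMBP (mixed, parallel, alternative or related) that recursively contains MMBPs.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MMBP:
    media_type: str
    content: Optional[bytes] = None     # set for a single-content MMBP
    structure: Optional[str] = None     # "mixed" | "parallel" | "alternative" | "related"
    parts: list["MMBP"] = field(default_factory=list)
    root_index: Optional[int] = None    # for "related": index of the root MMBP

def playable_parts(mmbp: MMBP, supported: set[str]) -> list[MMBP]:
    """All supported parts of a mixed/parallel MMBP, exactly one part of an
    alternative MMBP, the root of a related MMBP."""
    if mmbp.structure is None:
        return [mmbp] if mmbp.media_type in supported else []
    if mmbp.structure in ("mixed", "parallel"):
        return [p for part in mmbp.parts for p in playable_parts(part, supported)]
    if mmbp.structure == "alternative":
        for part in mmbp.parts:
            selected = playable_parts(part, supported)
            if selected:
                return selected
        return []
    if mmbp.structure == "related":
        root = mmbp.parts[mmbp.root_index or 0]
        # the root's processor decides which other objects are used (not modelled here)
        return playable_parts(root, supported)
    return []
```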

4.5  Media Capabilities and Profiles

This specification defines media capabilities in clause 5 for both MMBP generators and MMBP players. The media capabilities provide requirements for content generation as well as playback instructions, respectively.
This specification also defines profiles for content generators and players. Profiles are a collection of media capability requirements and recommendations as defined in clause 6.
External specifications may reference capabilities defined in this specification.
Preferably, external specifications should reference full media profiles.
