
Content for  TR 26.998  Word version:  18.1.0


7  Considerations on Devices Form-factor

7.1  General

The components of AR glasses are the same as or similar to those of mobile phones, which may launch and execute AR/MR applications. However, AR glasses have rather different requirements and limitations compared with mobile phones.
From a form factor perspective, AR glasses have several different design considerations. For example, AR glasses have two separate see-through displays, one for each eye. They also usually include more than two vision cameras, spatially separated in order to achieve better disparity for depth estimation. In addition, AR glasses are worn closely attached to the user's face and contain IMU sensors to estimate where the user's focal point is. Most of the included components are designed and placed to meet requirements that differ from those for mobile phones.
From a media processing perspective, AR/MR applications consume far more energy than non-AR/MR applications [27]. Multiple cameras of different types are always turned on, continuously tracking the features detected in 2D and 3D video. When AR/MR objects are augmented into the real world, the objects need to be rendered frame by frame with varying view frustum positions and directions. When the AR/MR objects are rendered on a server, the AR/MR device is expected to upload the user's pose at millisecond intervals, then download, decode, correct, and composite the pre-rendered image sequences streamed from the server.
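The upload/download/correct loop described above can be sketched as follows. This is a minimal illustration, not part of the TR: the two-axis `Pose` type and the `pose_delta` correction step are simplifying assumptions standing in for full 6DoF pose handling and pixel reprojection.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    yaw: float    # head rotation in degrees, simplified to two axes
    pitch: float

def pose_delta(rendered_for: Pose, displayed_at: Pose) -> Pose:
    """Late-stage correction: compute the head motion accumulated between
    server-side rendering and on-device display. A real device would
    reproject the pre-rendered pixels by this delta before compositing."""
    return Pose(displayed_at.yaw - rendered_for.yaw,
                displayed_at.pitch - rendered_for.pitch)

# One iteration of the loop (illustrative values):
uploaded = Pose(yaw=10.0, pitch=0.0)    # pose uploaded to the edge server
# ... the server renders a frame for this pose; the head keeps moving ...
current = Pose(yaw=11.5, pitch=-0.5)    # pose when the frame is displayed
delta = pose_delta(uploaded, current)   # Pose(yaw=1.5, pitch=-0.5)
```

The smaller the round-trip delay, the smaller this delta and the less visible the correction.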
In addition, from an ergonomics perspective, the components of the AR glasses need to be placed within a limited space while keeping the user's neck joint torque within a manageable range.
This clause addresses form-factor related issues from the components of AR glasses device architectures, such as battery/power consumption, camera, display, heat dissipation, and weight.

7.2  Battery/Power consumption

The run time of a typical battery is proportional to its physical size, capacity, and weight, while size and weight are in turn proportional to user discomfort and neck torque. A study on the characteristics of AR applications [27] measured the battery consumption of commercially available AR, streaming, and social networking applications and showed that AR applications consume at least around 46% more energy than non-AR applications. The capacity of the battery needs to be designed to support a fair amount of running time for the everyday use of AR/MR applications. The required running time could range from tens of minutes for shopping via AR remote advertising in Annex A.2, to 1-2 hours for streaming of volumetric video in Annex A.3, or even several hours for AR gaming in Annex A.6. However, as capacity is typically proportional to weight, and as AR glasses are expected to be worn under human ergonomics constraints such as neck strain, there are clear limits on extending the capacity of the battery. Such limits may be relaxed by dynamically offloading some energy-intensive workloads to the 5G cloud/edge. In this case, local processing power consumption is exchanged for the power consumption of 3GPP/non-3GPP connectivity, including always-on connectivity. On the connectivity side, Discontinuous Reception (DRX) and Reduced Capability (RedCap) are examples of features that target lower radio power consumption for AR/MR applications.
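The impact of the measured 46% overhead on run time can be illustrated with back-of-the-envelope arithmetic. The battery capacity and baseline power draw below are illustrative assumptions, not figures from the TR; only the 46% factor comes from [27].

```python
# Back-of-the-envelope run-time estimate (capacity and baseline assumed):
battery_wh = 2.0            # assumed glasses battery capacity in watt-hours
baseline_w = 1.0            # assumed average non-AR power draw in watts
ar_w = baseline_w * 1.46    # AR apps measured at least ~46% higher [27]

run_time_min = battery_wh / ar_w * 60
print(round(run_time_min))  # 82 (minutes, vs 120 for the non-AR baseline)
```

Under these assumptions, the AR workload cuts roughly a third off the non-AR run time, which is why offloading to the 5G cloud/edge is attractive despite the added radio power cost.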
The following KPI is related to battery and power consumption and is listed in clause 4.5.2.
  • Maximum Available Power

7.3  Camera

Augmented reality may be realized by SLAM. To understand the physical world through SLAM, multiple cameras of various types need to be continuously turned on and constantly acquiring image sequences.
Among the various components contributing to heat, such as the CPU, GPU, cameras and display, measurements show that the cameras are one of the major sources of heat dissipation for AR applications [27]. AR/MR applications may need to be aware of the available run time remaining and the amount of heat dissipation felt by the user.
In addition, as multiple cameras may be equipped in AR glasses for various purposes, they need to be designed and placed optimally to process the required functions in the AR Runtime. Camera-related parameters, such as those used for calibration, pose correction, the Vision Engine, and SLAM, are expected to have a large impact on the quality of service for AR glasses. AR/MR applications may need to be aware of the intrinsic and extrinsic parameters of the cameras to properly process the required functions. Such parameters may be delivered to the server whenever there is a change in the camera configuration.
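A minimal sketch of the kind of intrinsic-parameter payload that could be delivered to the server on a configuration change is shown below. The pinhole model, the field names, and the JSON encoding are illustrative assumptions; the TR does not define this format.

```python
import json
import math
from dataclasses import dataclass, asdict

@dataclass
class CameraIntrinsics:
    width: int   # image width in pixels
    height: int  # image height in pixels
    fx: float    # horizontal focal length in pixels
    fy: float    # vertical focal length in pixels
    cx: float    # principal point x
    cy: float    # principal point y

def intrinsics_from_hfov(width: int, height: int,
                         hfov_deg: float) -> CameraIntrinsics:
    """Derive simple pinhole intrinsics from a horizontal field of view,
    assuming square pixels and a centred principal point."""
    fx = (width / 2) / math.tan(math.radians(hfov_deg) / 2)
    return CameraIntrinsics(width, height, fx, fx, width / 2, height / 2)

cam = intrinsics_from_hfov(640, 480, 90.0)      # fx == fy == 320 px
payload = json.dumps(asdict(cam))               # delivered on config change
```

A full payload would also carry extrinsics (each camera's pose relative to the device) so the server can fuse the spatially separated views.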
The following KPI is related to the cameras and is listed in clause 4.5.2.
  • Maximum Available Power

7.4  Display

There is at least one display for each eye on a pair of immersive AR glasses. The AR glasses estimate the position of each eye and then present pixels of the rendered AR/MR objects on the display in order to combine each pixel with the rays of light reflected from the surfaces of real-world objects. A renderer in the AR scene manager may take into consideration the shape and optical distortion characteristics of the displays, the pixel arrangements, and the estimated position of each of the user's eyes. At least one view frustum model, representing either the AR glasses, each display, or each eye, together with a 3D map of the surroundings, may be provided to the AR scene manager in order to minimize the post-processing needed to fit a generically rendered image to a particular pair of AR glasses.
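The per-eye view frustum models mentioned above start from a view origin for each eye. A minimal sketch, assuming a fixed interpupillary distance (IPD) rather than per-user eye tracking, could look like this; the value and the function are illustrative, not defined by the TR.

```python
IPD_M = 0.063  # assumed interpupillary distance in metres (illustrative)

def eye_positions(head_pos, ipd=IPD_M):
    """Offset the tracked head position by half the IPD for each eye,
    giving the origin of a per-eye view frustum. head_pos is (x, y, z)
    in metres, with x pointing to the user's right."""
    x, y, z = head_pos
    half = ipd / 2
    return {"left": (x - half, y, z), "right": (x + half, y, z)}

eyes = eye_positions((0.0, 1.6, 0.0))
# eyes["left"]  -> (-0.0315, 1.6, 0.0)
# eyes["right"] -> ( 0.0315, 1.6, 0.0)
```

A complete frustum model would add, per eye, the projection parameters (field of view, near/far planes) and the display's distortion characteristics.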
The following KPIs are related to the display and are listed in clause 4.5.2.
  • Maximum Available Power
  • Persistence - Duty time
  • Display refresh rate
  • Spatial Resolution per eye
  • Content frame rates
  • Brightness
  • Field of View
  • Eye Relief
  • Calibration
  • Depth perception

7.5  Heat dissipation

It has been found that AR applications may generate 4-5 °C more heat than non-AR applications on the same device [27]. Another study shows that a user's heat sensation and discomfort increase with temperature. Overheated components suffer not only degraded performance through thermal throttling but also increased power leakage [28].
The following KPI may be related to heat dissipation and is listed in clause 4.5.2.
  • Maximum Available Power

7.6  Weight

AR glasses consist of displays, sensors, cameras, batteries, and so on. The weight of the AR glasses puts constant pressure on the user's skin and changes the amount of torque applied to the neck joints and muscles in a neutral posture.
A study shows that a user's posture may change from a neutral posture to a look-up, look-down, or body-bending posture because of the relative placement of virtual objects [29]. These postures increase the moment arm between the Centre of Mass (CoM) of the wearable device and the neck joint.
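The effect of the growing moment arm can be made concrete with the static torque relation torque = mass x g x moment arm. The device mass and arm lengths below are illustrative assumptions, not measurements from [29].

```python
G = 9.81  # gravitational acceleration, m/s^2

def neck_torque_nm(device_mass_kg: float, moment_arm_m: float) -> float:
    """Static torque the device weight applies about the neck joint."""
    return device_mass_kg * G * moment_arm_m

# Illustrative figures for 150 g glasses (not from the TR):
neutral   = neck_torque_nm(0.150, 0.10)  # ~10 cm arm, neutral posture
look_down = neck_torque_nm(0.150, 0.16)  # arm grows in a look-down posture
print(f"{neutral:.3f} -> {look_down:.3f} N*m")  # 0.147 -> 0.235 N*m
```

With the same device weight, the posture change alone increases the sustained neck torque by 60% in this example, which is why both weight and CoM placement matter.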
HMD-type and glasses-type devices have different characteristics, as the CoM of glasses-type devices is biased towards the front of the device by design. As a result, AR/MR applications need to consider the issues arising from the ergonomic differences between the two types of wearable devices.
The following KPI is related to weight and is listed in clause 4.5.2.
  • Maximum Weight

7.7  Audio |R18|

The audio requirements in TS 26.131 and TS 26.132, and those in the scope of the ATIAS work item, do not explicitly consider AR glasses. The form factor may require specific definitions of terminal audio performance requirements and objectives for AR glasses.
