
= Project Aria =
Project Aria is a research program and hardware device in a glasses form factor developed and manufactured by Meta. The Project Aria glasses support research into egocentric machine perception challenges, such as positional tracking, scene reconstruction, and action recognition, that must be solved to make AR glasses successful. The program is developed and supported by Meta Reality Labs Research, together with individual and institutional research partners around the globe.

The Project Aria program was first introduced in September 2020, when Meta announced that Project Aria glasses would be made available to a limited group of Meta employees and contractors in the United States. In 2021, Meta announced further expansion of Project Aria, to include employees and contractors based in the UK, EU, Switzerland, Singapore, Canada and Israel.

== Project Aria Glasses ==
The Project Aria device is an egocentric sensor platform in glasses form factor that is sufficiently light and unobtrusive to be worn for long periods of time. The glasses feature a rich multi-modal sensor suite which approximates what can be expected in future all-day-wearable AR glasses. The onboard battery allows the device to record 1-2 hours of data (with the nominal recording profile). Longer recordings are possible with an external power bank.

Project Aria glasses contain the following sensors:


 * 1 x 110 degree HFOV rolling shutter high-resolution RGB camera, up to 8 MP
 * 2 x 150 degree HFOV / 120 degree VFOV global shutter mono cameras for SLAM and hand tracking, 640 x 480 px
 * 2 x 80 degree DFOV global shutter eye-tracking mono cameras with IR illumination, 320 x 240 px
 * 2 x IMU (one at 1 kHz, the other at 800 Hz)
 * 1 x barometer and magnetometer
 * 7 x 48 kHz spatial microphones
 * GPS receiver
 * Bluetooth and Wi-Fi transceiver

Users can configure sensor settings through device recording profiles, including the ability to enable or disable specific sensor streams. This can be used to extend recording time or to protect the privacy of bystanders in certain capture environments.
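
The idea of a recording profile can be sketched as a simple configuration object. This is a hypothetical illustration only; real profiles are predefined by the program and selected on the device or in its companion app, and the field names below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class RecordingProfile:
    """Hypothetical model of an Aria recording profile: each flag
    enables or disables one sensor stream for a recording session."""
    name: str
    rgb_camera: bool = True
    slam_cameras: bool = True
    eye_tracking: bool = True
    audio: bool = True   # often disabled to protect bystander privacy
    imu: bool = True

# A profile that trades sensor coverage for longer recording time
# and bystander privacy (no RGB video, no audio).
low_power = RecordingProfile("low_power", rgb_camera=False, audio=False)
```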

More information about the Project Aria glasses hardware is available in the Project Aria Tools wiki and the Project Aria whitepaper.

== Machine Perception Services (MPS) ==
In addition to the recording device, the Project Aria program provides machine perception capabilities to approved research partners. These capabilities are exposed via cloud-based Machine Perception Services (MPS), which generate derived data and annotations of higher quality than off-the-shelf open-source alternatives.

Machine Perception Services include:

6DoF trajectory

 * Open-loop trajectory: a local odometry estimate produced by visual-inertial odometry (VIO).
 * Closed-loop trajectory: created via batch optimization over multi-sensor input (SLAM cameras, IMU, barometer, Wi-Fi and GPS), fully optimized and providing poses in a consistent frame of reference.
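
The distinction can be illustrated with a toy sketch: an open-loop trajectory is obtained by chaining per-frame relative motion estimates, so errors accumulate until a batch optimization (the closed-loop pass) corrects them. This simplified 2D integrator is illustrative only; real MPS trajectories are timestamped 6DoF (SE(3)) poses:

```python
import math

def integrate_odometry(relative_steps):
    """Chain per-frame (dx, dy, dtheta) increments into a world-frame
    open-loop trajectory, the way VIO accumulates local motion.
    Any per-step error compounds over the whole path."""
    x, y, theta = 0.0, 0.0, 0.0
    poses = [(x, y, theta)]
    for dx, dy, dtheta in relative_steps:
        # Rotate the local step into the world frame, then accumulate.
        x += dx * math.cos(theta) - dy * math.sin(theta)
        y += dx * math.sin(theta) + dy * math.cos(theta)
        theta += dtheta
        poses.append((x, y, theta))
    return poses

# Four 1 m forward steps with 90-degree left turns trace a square
# and return (up to floating-point error) to the starting position.
square = integrate_odometry([(1.0, 0.0, math.pi / 2)] * 4)
```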

Online sensor calibration

 * Time-varying intrinsic and extrinsic calibrations of cameras and IMUs are estimated at the frequency of SLAM cameras by the multi-sensor state estimation pipeline.
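
One way a consumer uses per-frame calibration: look up the intrinsics estimated for that frame's timestamp and project a 3D point into the image. The sketch below uses a plain pinhole model for clarity; the actual Aria cameras are calibrated with fisheye models, and the parameter values here are made up:

```python
def project_point(point_cam, fx, fy, cx, cy):
    """Project a 3D point in the camera frame to pixel coordinates
    with a simple pinhole model: u = fx*X/Z + cx, v = fy*Y/Z + cy."""
    X, Y, Z = point_cam
    if Z <= 0:
        raise ValueError("point is behind the camera")
    return (fx * X / Z + cx, fy * Y / Z + cy)

# With time-varying calibration, fx/fy/cx/cy would be looked up per
# frame timestamp rather than held constant. Values are illustrative.
u, v = project_point((0.1, -0.05, 2.0), fx=300.0, fy=300.0, cx=320.0, cy=240.0)
```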

Semi-dense point cloud

 * Static-scene 3D reconstructions, reliable 2D image tracks, and a representative visualization of the environment.
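
Consumers of a semi-dense point cloud commonly filter points on a per-point uncertainty value before visualization or downstream use. The sketch below assumes each point carries an uncertainty scalar; the field name and threshold are illustrative, not the MPS schema:

```python
def filter_points(cloud, max_uncertainty=0.005):
    """Keep only points whose uncertainty is below a threshold.
    `cloud` is a list of ((x, y, z), uncertainty) pairs -- a
    hypothetical simplification of a semi-dense point cloud record."""
    return [point for (point, u) in cloud if u < max_uncertainty]

# One confident point and one noisy point; only the first survives.
cloud = [((1.0, 2.0, 3.0), 0.001), ((4.0, 5.0, 6.0), 0.02)]
kept = filter_points(cloud)
```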

Eye gaze

 * Eye gaze direction estimation with uncertainty.
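
A gaze direction is often parameterized as a yaw/pitch angle pair that converts to a unit 3D vector. The convention below (+Z forward, +X right, +Y down) is a hypothetical choice for illustration; the real MPS output defines its own coordinate frame and uncertainty representation:

```python
import math

def gaze_direction(yaw_rad, pitch_rad):
    """Convert yaw/pitch gaze angles to a unit direction vector in a
    device frame where +Z points forward (illustrative convention)."""
    x = math.cos(pitch_rad) * math.sin(yaw_rad)
    y = -math.sin(pitch_rad)
    z = math.cos(pitch_rad) * math.cos(yaw_rad)
    return (x, y, z)

# Looking straight ahead yields the forward axis (0, 0, 1).
forward = gaze_direction(0.0, 0.0)
```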

== Open Datasets ==
In 2022, Meta released a first-party dataset, the Aria Pilot Dataset, consisting of 159 sequences captured using Project Aria glasses. Later in 2022, Project Aria glasses were used to capture Ego4D, a large-scale egocentric dataset and benchmark suite collected by a consortium of academic research partners across 9 countries.

In 2023, Meta released two more first-party datasets, Aria Synthetic Environments (ASE) and Aria Digital Twin (ADT), each accompanied by a research challenge to accelerate machine perception research. The Ego4D consortium also released Ego-Exo4D, a follow-up to the Ego4D dataset combining egocentric and exocentric footage captured with Project Aria and GoPro devices.

== Privacy Considerations ==
The Project Aria glasses have an LED indicator that signals to bystanders when the device is recording raw data, as well as a privacy switch that, when activated, immediately stops and deletes the current recording. This allows wearers to easily fulfil a bystander's request to delete any recording that might have captured them.

In addition, to protect the privacy of bystanders in public environments, recorded video is processed by an AI anonymization model that detects and removes bystander faces and vehicle license plates. This process helps ensure that only anonymized versions of videos are retained for research purposes.
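
The effect of such anonymization can be sketched in miniature: given a detected bounding box, the pixels inside it are made unrecognizable. The toy function below simply replaces a region of a grayscale image with its mean intensity; the real pipeline (as in EgoBlur) runs learned face and license-plate detectors and blurs the detected boxes:

```python
def redact_region(image, box):
    """Replace a rectangular region of a grayscale image (a list of
    rows) with the region's mean intensity. A toy stand-in for a
    detect-and-blur anonymization step; `box` is (x0, y0, x1, y1)
    with exclusive upper bounds."""
    x0, y0, x1, y1 = box
    pixels = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    mean = sum(pixels) // len(pixels)
    for y in range(y0, y1):
        for x in range(x0, x1):
            image[y][x] = mean
    return image

# A 4x4 test frame; redact a hypothetical detection box in its corner.
frame = [[4 * r + c for c in range(4)] for r in range(4)]
redact_region(frame, (0, 0, 2, 2))
```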

In 2023, Meta open-sourced the AI anonymization model, under the name ‘EgoBlur’, along with a software library to enable researchers and industry to preserve the privacy of bystanders.