

This exploratory project has successfully prototyped a LiDAR-based motion sensing and visualization tool. LiDAR emits an array of infrared beams and operates on the time-of-flight (ToF) principle, estimating distance from the round-trip travel time of the reflected beams.

The technology we are developing is novel in the following ways:

  • It uses LiDAR technology to capture and display high-precision motion information, and presents it both in open-ended graph displays as well as through a gamified sequence of tasks for learners.

  • It will incorporate the placement of spatially-anchored motion vectors in the camera view ("3-D motion vector maps").

Advancement in LiDAR-based, high-precision position information has the potential to change the way educators, learners, and members of the workforce interact with their smartphones and tablets, and, in turn, to shape the way developers support the needs of education and the workforce.

Technical Details: 

Various iPad Pro models and the iPhone 12 and 13 Pro lineup feature Apple’s LiDAR Scanner. The devices’ imagers are composed of a 3-D-stacked, back-illuminated, near-infrared CMOS imaging sensor, a vertical cavity surface emitting laser (VCSEL) array, and a driver circuit (Hallereau et al., 2020). The CMOS imaging sensor is built around a single-photon avalanche diode (SPAD), a photodetector capable of identifying individual photons (Renker, 2006). Briefly, a SPAD works as follows: a p-n junction is reverse-biased to a high electric field (V > V_breakdown). Incident photons release bound carriers, creating electron-hole pairs that are then accelerated to sufficient energy that the resulting impact ionization events cause a detectable avalanche current (Charbon et al., 2013).
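The Geiger-mode condition above (reverse bias held above breakdown voltage, so a single photon can trigger an avalanche) can be sketched in a few lines of code. This is only an illustration of the criterion, not any real device's operating parameters; the voltage values below are hypothetical.

```python
# Minimal sketch of the SPAD Geiger-mode condition described above.
# The voltage values are illustrative, not taken from any real device.

def excess_bias(v_bias: float, v_breakdown: float) -> float:
    """Excess bias in volts: how far above breakdown the SPAD is held."""
    return v_bias - v_breakdown

def in_geiger_mode(v_bias: float, v_breakdown: float) -> bool:
    """A SPAD operates in Geiger mode when the reverse bias exceeds the
    breakdown voltage (V > V_breakdown), so a single absorbed photon can
    trigger a self-sustaining, detectable avalanche."""
    return excess_bias(v_bias, v_breakdown) > 0.0

# Hypothetical numbers: 25 V applied reverse bias, 20 V breakdown voltage.
print(excess_bias(25.0, 20.0))     # 5.0 V of excess bias
print(in_geiger_mode(25.0, 20.0))  # True: single-photon detection possible
print(in_geiger_mode(18.0, 20.0))  # False: below breakdown, no Geiger mode
```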


Pairing this imaging sensor with the scanner’s VCSEL array, the device functions by emitting near-infrared wave pulses and measuring the time it takes those photons to be reflected off features in the environment and return (hence the term “time of flight (ToF) sensors”). Information about the returning incident rays is then translated into a distance, which can be directly read by the user or used as a feature of an app, such as for charting environmental features. Physics Toolbox employs the LiDAR scanner to plot relative motion, displaying the user’s position and velocity as they traverse their surroundings.
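The round-trip timing described above reduces to a one-line formula: because the emitted pulse travels to a surface and back, the range is half the round-trip time multiplied by the speed of light. A minimal sketch of that conversion (the 20 ns sample value is hypothetical):

```python
# Time-of-flight ranging: an emitted near-infrared pulse travels out and
# back, so distance = (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time of flight to a distance in meters."""
    return C * round_trip_seconds / 2.0

# Example: a reflection arriving 20 ns after emission corresponds to ~3 m.
print(tof_distance(20e-9))  # ≈ 2.998 m
```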

Learn more about how multiple sensors interact through the use of LiDAR by viewing some of our tech-related blogs:


Building a LiDAR-based Position Visualizer—Rationale


Building a LiDAR-based Position Visualizer—Prototype

Learn more about how mobile sensors work by visiting the Vieyra Software webpage.
