
The main function of a LIDAR is to measure the distance to an object. This is done by sending a laser pulse from a source, which is reflected by the object. The reflected light is detected, and the time-of-flight (TOF) is calculated, giving an estimate of the distance to the object based on the photon return time.

In a LIDAR lens design project, delivering high efficiency both in sending the light pulse and in collecting the return pulse is essential. For the collection lens design, this means optimizing the captured energy over the lens' field of view (FOV). Below we discuss the lens designs used in two common LIDAR optical architectures.

Flash LIDAR

In a flash LIDAR, the entire field of view is illuminated by a single laser source. The reflected light is imaged onto a detector array, and the TOF is calculated for each individual element in the detector. Since flash systems lack moving parts, they tend to be very robust, but they are usually used as short-range sensors (<30 m) and have reduced fields of view compared to scanning systems.

Flash LIDARs require homogeneous, full-area illumination of the scene, so the laser beam is expanded with diffusers and then projected onto the FOV. In the image below, we can see the emission optical system for a flash LIDAR. The EWOD (electrowetting) prism is used to scan the laser beam over a 15.6-degree arc. The triplet lens creates a telecentric beam on the fisheye, and the fisheye increases the FOV to almost 180 degrees.

[Figure: Flash LIDAR optical layout]

Scanning LIDAR

A scanning LIDAR has a single collimated source that scans the system's field of view using a MEMS-based micromirror or a rotating prism. At each new location, the light is detected by a single photodetector, and the TOF is then calculated. These systems can be more accurate than flash systems and allow for longer detection ranges, but they are bulkier, more complex, and more expensive. One problem with scanning LIDAR is that if the scene changes rapidly, the scanning system may not provide an accurate description of it.

An example of a scanning LIDAR system is shown in the figure below. In it, a focusing lens focuses the light onto a MEMS mirror, and the MEMS reflects the light into an f-theta lens. The f-theta lens creates an image on a flat plane (image height is proportional to scan angle, y = f·θ) and consists of three optical elements, with an effective focal length of 100 mm. The next stage is a wide-angle group that increases the FOV to a 120-degree arc.

[Figure: Scanning LIDAR optical layout]

Applications

Although we have mentioned autonomous vehicles as an application for LIDARs, there are many other interesting applications.

Agriculture: besides the classification of crops and the monitoring of their growth, LIDAR can be used to detect types of insects and their movement.

Archeology: LIDAR systems mounted on drones or airplanes have been used to identify ruins and settlements that are usually covered by the canopy or heavy vegetation.

Robotics: similar to autonomous vehicles, LIDAR in robotics is used to map an environment and provide the robot with enough information to interact with or avoid obstacles.

On 20th October, Snap Inc announced its third-quarter financial results. Daily active users increased 18% year-over-year to 249 million, and revenue increased 52% year-over-year to $679 million. In a press release, CEO Evan Spiegel said: "Our focus on delivering value for our community and advertising partners is yielding positive results during this challenging time. We're excited about the growth of our business in Q3 as we continue to make long-term investments in our future. The adoption of augmented reality is happening faster than we had previously anticipated, and we are working together as a team to execute on the many opportunities in front of us." Snapchat was one of the earliest adopters of augmented reality in social media with the release of Lenses back in 2015.
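The TOF-to-distance conversion described above is simply d = c·t/2, since the detected pulse has traveled the round trip to the target and back. A minimal sketch (function name and sample timing are illustrative, not from the article):

```python
# Time-of-flight ranging: the pulse travels to the target and back,
# so the one-way distance is half the round-trip path length.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(return_time_s: float) -> float:
    """Estimate target distance (m) from the photon round-trip time (s)."""
    return SPEED_OF_LIGHT * return_time_s / 2.0

# A 200 ns round trip corresponds to a target roughly 30 m away,
# the typical upper range quoted for flash LIDAR sensors.
print(round(tof_distance(200e-9), 2))  # → 29.98
```

Note the steep timing requirement this implies: resolving distance to a few centimeters requires timing the return to within a few hundred picoseconds.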

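To illustrate how a scanning LIDAR of the kind described above builds up a picture of the scene, here is a hedged sketch (all names and sample values are ours, not from the article): each (mirror angle, return time) sample from the single photodetector is converted into a Cartesian point, and sweeping the mirror across the FOV accumulates a 2D profile of the scene.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def scan_to_points(samples):
    """Convert (scan_angle_deg, return_time_s) samples into (x, y) points.

    Each sample corresponds to one mirror position: the TOF gives the
    range to the target, and the mirror angle gives the direction.
    """
    points = []
    for angle_deg, return_time_s in samples:
        r = SPEED_OF_LIGHT * return_time_s / 2.0  # one-way range from TOF
        theta = math.radians(angle_deg)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Three mirror positions across a 120-degree scan (illustrative values only).
cloud = scan_to_points([(-60.0, 100e-9), (0.0, 66e-9), (60.0, 100e-9)])
```

This also makes the rapid-scene-change limitation concrete: the samples are taken sequentially, so points gathered at the start and end of a sweep describe the scene at different instants.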