Lidar, a technology first used by meteorologists and aerospace engineers and then adopted in self-driving vehicles, has slowly crept into consumer electronics over the last five years. If you have a Pro model iPhone or iPad, there's a good chance it has a lidar sensor, and you're likely using it whether you know it or not.
Apple began incorporating lidar (light detection and ranging) into its products starting in 2020 with the iPad Pro 11-inch (2nd generation), iPad Pro 12.9-inch (4th generation), iPhone 12 Pro, and iPhone 12 Pro Max. Since then, lidar sensors have appeared in every iPad Pro and Pro model iPhone. If you have a Pro model that's 2020 or newer, it has a lidar sensor incorporated into the rear camera system.
Lidar works by shining a laser at something and then detecting the time it takes for the light to return to its receiver. Since the speed of light is incredibly fast, even tiny differences in travel time allow your device's software to build a three-dimensional point cloud map of an area. This 3D map can be used to measure items, create 3D representations of physical objects, and enable other cool features like placing virtual objects in your room or navigating in augmented reality (AR).
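To give a sense of scale, the underlying time-of-flight arithmetic is simple: distance is the speed of light multiplied by the round-trip time, halved because the pulse travels out and back. A quick sketch of the math (our own illustration, not Apple's implementation — the `distance` helper is a hypothetical name):

```swift
import Foundation

/// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Distance to a surface given the laser pulse's round-trip time.
/// The light travels out and back, so we halve the total path.
func distance(roundTripSeconds t: Double) -> Double {
    speedOfLight * t / 2.0
}

// A surface about 1 meter away returns the pulse in roughly
// 6.7 nanoseconds -- which is why lidar timing must be so precise.
print(distance(roundTripSeconds: 6.671e-9)) // roughly 1.0 (meters)
```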
When it comes to AR, apps can better understand the geometry of the environment around you to place digital objects more precisely in the real world, ensuring they adhere to the physical properties of the space. Lidar sensors give a more realistic experience than AR on other iPhone and iPad models since they can more accurately map the surroundings captured by your camera and provide better motion tracking and depth perception.
Apple first introduced its API for the lidar scanner in ARKit for iOS and iPadOS 13.4, giving third-party developers access to polygonal models of the surroundings in meshes, which they could then manipulate with RealityKit to create augmented experiences. The company gave even more power to developers starting with iOS and iPadOS 15.4, when it added lidar APIs in AVFoundation to provide high-quality, high-accuracy depth information for recording videos and taking photos.
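On the developer side, opting into the lidar-backed mesh in ARKit is a small configuration change. A minimal sketch, assuming a RealityKit `ARView` is already on screen (the `startLidarMeshing` helper name is our own):

```swift
import ARKit
import RealityKit

// Assumes an ARView from RealityKit is already in the view hierarchy.
func startLidarMeshing(in arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    // Scene reconstruction is only supported on lidar-equipped devices,
    // so check before enabling it.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    config.environmentTexturing = .automatic
    arView.session.run(config)
}
```

From there, the session delivers `ARMeshAnchor` updates that apps can feed into RealityKit to build the polygonal models described above.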
Below, we'll explore some of the built-in uses of lidar on your iPhone or iPad and how third-party apps expand the technology into more advanced use cases.
1. Faster Autofocus
With a lidar sensor, autofocus capabilities in Apple's Camera app are more advanced than those of lidar-less devices. Your camera's ISP (image signal processor) uses lidar to determine how far away objects and people are so it can automatically focus on them more accurately — even in low-light conditions — while reducing capture time. That's because lidar measures depth directly rather than relying on processing the image itself, so focusing doesn't depend on finding contrast in the frame.
This faster autofocusing works not only for still images but also for videos, letting you focus on the moment instead of on focusing the camera. However, lidar autofocus doesn't play well with windows, glass, water, and other transparent or reflective surfaces, since the laser can pass through or bounce off them unpredictably and skew the depth readings.
Lidar autofocus priority is also baked into AVFoundation, the framework for accessing the camera and capturing still images and videos, so third-party photography apps can take advantage of faster, more accurate autofocusing too. Apps like Focus Puller can even bring lidar autofocusing functionality to other hardware, such as Blackmagic cameras.
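In AVFoundation terms, the lidar camera is exposed as its own capture device type on supported hardware (iOS 15.4 and later). A minimal sketch of selecting it — the `lidarCaptureDevice` helper is a hypothetical name for illustration:

```swift
import AVFoundation

// The lidar depth camera is its own device type on supported Pro
// models (iOS 15.4+); this returns nil on lidar-less hardware.
func lidarCaptureDevice() -> AVCaptureDevice? {
    AVCaptureDevice.default(.builtInLiDARDepthCamera,
                            for: .video,
                            position: .back)
}
```

Apps that get a non-nil device here can wire it into a capture session to receive the high-accuracy depth data alongside video frames.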
2. Better Portraits
When Portrait Mode was first released in 2016, it used machine learning to determine the photo's subject and apply the background blur just right. That's still very much the case on regular iPhone and iPad models. But on Pro-level models, lidar dramatically improves the look of Portrait photos taken with the rear camera since it can help capture more accurate depth data with the image.
Even third-party apps can capture better portraits, like ProTake, which leverages lidar to let you record portrait videos with real-time blurred backgrounds. There's also DSLR Camera, which lets you create AR portraits using the lidar sensor.
3. Night Mode Portraits
Lidar also makes Night mode portraits possible on iPhone 12 Pro and newer Pro models. When shooting portraits in a darkened environment on an iPhone 12 Pro, 13 Pro, 14 Pro, or 15 Pro model, the lidar scanner fires more frequently to get a better depth read on the scene, letting the Camera app adapt quickly to subject depths and adjust exposure settings more accurately. This can result in clear portraits captured in low-light environments. On iPhone models without lidar, Night mode portraits simply aren't available.
4. Enhanced Refocusing Precision
While you can change the focus in captured Portrait photos on any iPhone or iPad running iOS or iPadOS 17 or later, you get better results on Pro models, which use the depth data captured by the lidar scanner. On iPhone 15 Pro models, you can even refocus regular photos as long as you have Portraits in Photo Mode enabled.
5. Easier, More Accurate Measurements
Apple's Measure app, which uses AR technology to map your environment and measure objects within the frame, basically turns your iPhone or iPad into a digital tape measure. While it's been available on any iPhone or iPad model since iOS 12, it's easier and more accurate on Pro models with lidar scanners because the device better understands the spatial relationship between you and the objects in the frame.
With lidar, measurements are far more accurate than on devices that rely solely on camera data and motion sensor processing. Lidar also provides Apple's Measure app with vertical and horizontal edge guides. Measure will automatically detect edges after you point the circle near them, snapping yellow guidelines along the sides so that it's easier for you to see and follow. You'll only see a dotted, non-sticky line on lidar-less models after you mark your first point.
Lidar also provides more granular measurements with Ruler View, allowing you to measure items more precisely in increments. Simply plot a line and then move closer to it to show the ruler.
Lidar-equipped Pro models also benefit from a measurement history in the Measure app (tap the List button). The lidar itself isn't responsible for the history, but it is responsible for the extra measurement details. When you measure something, tap the measurement and expand the window to see information like the elevation, distance from you, angle, and more. For example, when viewing an area, you'll see an outline of the shape and additional measurement units.
There are also plenty of third-party measuring apps that use lidar, such as LiDAR Measuring, which will give you the distance between you and any object. It will even let you set a specific distance between you and a fixed point, giving you an approximation of when you reach the distance. Laser Rangefinder - LiDAR also uses lidar to provide "measurements with millisecond-level accuracy" for ranging.
6. People Measurements
One of the coolest features of Apple's Measure app is the ability to calculate the height of a seated or standing person. This function is exclusive to iPhones and iPads equipped with lidar. Whenever the Measure app detects a person seated or standing in the frame, it automatically displays their height at the top of their head, hair, or hat. Snap a photo using the shutter button to share the image with the measurement.
7. Accessibility
Accessibility tools are important to Apple, and its ongoing commitment to making technology available to everyone led to the Magnifier app, first introduced in iOS 10, which helps users magnify and identify items in the camera frame. With iOS 16 and iPadOS 16, Apple added Door Detection for Pro models, which uses lidar to map and identify doors and how far away they are.
Text descriptions of the door appear on the screen, such as the distance away, whether it's open or closed, what type of door it is, how it swings, if there are any signs on it, and more. VoiceOver can also provide descriptions for blind and low-vision users.
Other apps are available to aid users who are blind or have low vision, like Microsoft's Seeing AI, which uses lidar and augmented reality to help a user explore their environment, complete with audible announcements of objects using spatial audio. In a simpler implementation, there's LiDAR Sense. With this app, you hold your iPhone in front of you with the rear camera facing the direction you're walking, and the app vibrates harder and makes a louder sound as you get closer to objects in your path.
8. AR Photo/Video Effects
While your iPhone or iPad already has a lot of built-in uses for lidar, Apple also has its Clips app for you, which can scan and transform the space you're recording a video in with AR effects. With a video open in the app, tap the star icon to open the effects menu, then tap the AR Spaces icon. Start the scan, start the effect, and choose the AR Space you want to use in the scene.
Since lidar became available on Apple products, Snapchat has been building Snapchat Lenses where you can place objects in your environment or turn your surroundings into something completely different. Users can also install Lens Studio on their computer to create their own Lenses, and there's a World Lenses collection to help you find all these creations.
DSLR Camera, a camera app for iPhone and iPad, also uses lidar, but instead of putting stars, shining lights, dancing characters, and flowers around your space, it lets you place graphics, text, and stickers onto a photo scene.
And these are just a few examples of the photo and video apps that take advantage of lidar for cool augmented reality effects.
9. 3D Scanning
In 2022, Apple added the RoomPlan API, which utilizes the power of lidar, to ARKit so that developers can more accurately capture the entirety of a room for making 3D models. Many apps have already taken advantage of this, including Scaniverse, which will help you create a 3D map of your living space, your office, furniture, and even objects using your iPhone or iPad, which you can then save and share.
Scaniverse lets you capture, edit, and share 3D content directly from your phone. Using LiDAR and computer vision, it builds 3D models with high fidelity and stunning detail. Scaniverse can accurately reconstruct objects, rooms, and even whole buildings and outdoor environments.
Other apps, such as 3d Scanner App and Shapr3D CAD modeling, also make use of RoomPlan to help you create floor plans in 2D or 3D, which can aid in the remodeling process of a physical space. And there are plenty more to try out: Canvas, Polycam 3D Scanner, magicplan, etc.
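To give a sense of what these apps build on, RoomPlan wraps the entire scanning flow in a single view. A minimal sketch, assuming iOS 16 or later on a lidar-equipped device (the view controller here is our own illustration, not any particular app's code):

```swift
import RoomPlan
import UIKit

// RoomCaptureView provides the scanning UI and lidar capture itself;
// the app mostly just hosts it and starts the session.
final class RoomScanViewController: UIViewController {
    private var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureView = RoomCaptureView(frame: view.bounds)
        view.addSubview(captureView)
        captureView.captureSession.run(
            configuration: RoomCaptureSession.Configuration())
    }
}
```

When the session ends, RoomPlan hands back a structured model of the room's walls, openings, and furniture that apps can export for floor plans or 3D editing.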
These modeling tools are also beneficial to scientists. For example, a historian used the 3d Scanner App to make 3D maps of trenches in Pompeii. Lidar also helped make scans to showcase and track forest thinning. It's equally useful to geoscientists, who can create 3D scans of cliff faces, rocks, and other geological structures. Archaeologists and paleontologists can use lidar to scan models of bones, ancient artifacts, and fossils. Civil engineers, surveyors, transportation planners, and other specialists can create precise 3D maps of footpath obstructions, pothole locations, street parking, and other pedestrian and vehicular areas.
10. Lidar Visualization
Curious about the visual output of lidar on top of the camera feed? Discover it firsthand with LiDAR View. This app offers a real-time overlay of lidar scanner data onto your camera view, providing a fascinating and nerdy experience in one.
Explore similar tools like Night Vision - LiDAR Scanner and LiDAR & Infrared Night Vision, both available on the App Store. Utilizing the lidar sensor, these apps enable enhanced visibility in low-light conditions, illuminating objects in front of your iPhone or iPad so you can see better in the dark.
Deep Field, another excellent app, demonstrates lidar's potential for learning and education: currently on tour at various museums around the world, it helps users explore nature through augmented reality.
11. Smarter Buying
When ARKit first appeared, IKEA used it for its IKEA Place app, which allowed you to try out furniture in your space before buying anything (and spending the next three days assembling it). It was eventually replaced by IKEA Studio, which utilized lidar in addition to ARKit's other APIs. That eventually morphed into a newer tool called IKEA Kreativ, built into the IKEA app. Its Scene Scanner also uses lidar to scan the environment and learn where the floor and walls are so it can position objects in 3D. You can design an entire room with IKEA furniture, even deleting all your current furniture first.
12. Gaming
Lidar is more than just a utility on your iPhone or iPad. It can also be fun and games, thanks to titles like Hot Lava, which utilizes the lidar scanner on Pro model devices to transform your living room into a fiery obstacle course where you interact with virtual elements in the real world and dodge molten lava. LIDAR.dark is another game, a first-person shooter that takes place in the dark and uses the lidar sensor to see around obstacles in the real world.
13. Health Education
Healthcare professionals have it easier when it comes to learning about the human body thanks to lidar. 3d4medical's Complete Anatomy 2024 allows for a human body to be placed in an AR scene in the real world and then digitally dissected to show individual components, their structures, and more. This learning tool is invaluable for those in medical school, those studying biology, or those wanting to revisit their high school science classes. Health apps like these can also measure joints and movements to diagnose health issues better.
Combining our powerful visualization tools with motion capture and the incredible lidar scanner ... we're excited to empower healthcare professionals to accurately and instantly measure range of motion for patients recovering from injury or surgery.
Apple
Cover photo, screenshots, and GIFs by Gadget Hacks (unless otherwise noted)