The LiDAR scanner shows that the wind of innovation blowing through technology has not left mobile device cameras untouched. The latest smartphones already boast great specs, and the upcoming iPhone 12 is rumoured to be coming out with a LiDAR scanner, perhaps an advanced one compared to the sensor on the iPad Pro 2020.
How would it feel to take 3D pictures that clearly detect the structures in front of you? To know the depth and height of objects just by tapping the shutter button in your smartphone's camera app?
With LiDAR, the camera is expected to produce 3D images and make an impressive contribution to the advancement of augmented reality. While LiDAR holds a lot of promise for tech fans and has pushed developers at many self-driving car and smartphone companies towards LiDAR sensors, it is worth noting that there are alternatives, such as sonar and radar.
While LiDAR uses laser beams, sonar uses sound waves and radar uses radio waves to build an image of a scene.
In this review, we will consider what LiDAR could bring to smartphones, and we will look at the differences between the time-of-flight sensors currently found on smartphones and the LiDAR scanner expected on the iPhone 12.
We will also be left to decide whether what we have on smartphones now is a shadow of LiDAR, used by developers to prepare for the LiDAR sensors yet to come for the benefit of tech fans.
If you have the latest iPad Pro 2020 and are familiar with its LiDAR scanner, you will definitely be asking what additional features LiDAR will offer in the rumoured upcoming iPhone 12.
What is LiDAR?
The concept behind LiDAR has been in use since the 1960s. In short, the tech lets you scan and map your environment by firing out pulses of light called “laser beams”, then timing how quickly they return. These beams bounce back after striking structures; in other words, the beam that hits the nearest structure returns to the sensor first.
All the gadget's LiDAR sensor needs to do is emit a light pulse (laser beam) and wait for it to bounce back. From that round trip, an accurate distance and depth for each structure can be inferred.
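The arithmetic behind this is simple: the pulse travels to the surface and back, so the distance is half the round-trip time multiplied by the speed of light. Here is a minimal illustrative sketch in Python (the function name and timing value are our own, not anything from Apple's hardware):

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface from a laser pulse's round-trip time.

    The pulse covers the distance twice (out and back), so we halve
    the total flight time before multiplying by the speed of light.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A surface about 5 metres away returns the pulse in roughly
# 33.4 nanoseconds, which hints at how precise the timing must be.
print(round(tof_distance(33.356e-9), 2))  # ≈ 5.0 metres
```

The tiny timescales involved are exactly why this needs dedicated sensor hardware rather than software timing.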
Like most futuristic tech, it started life as a military tool on planes, before becoming better known as the system that allowed the Apollo 15 mission to map the surface of the moon. You now know why I said it is big news for augmented reality (AR) and, to a lesser extent, photography too.
More recently, LiDAR (also known as Lidar) has been seen on self-driving cars, where it helps detect objects like cyclists and pedestrians. You might have also unwittingly come across the tech in your robot vacuum.
But it’s in the past couple of years that LiDAR’s possibilities have really opened up. With the systems getting smaller, cheaper and more accurate, they’ve started becoming viable additions to mobile devices that already have things like powerful processors and Global Positioning System (GPS) – tablets and smartphones. A powerful processor is needed for a smartphone to drive the LiDAR sensor.
Of course, not all LiDAR systems are created equal. Until fairly recently, the most common types built 3D maps of their environments by physically sweeping around, in a similar way to a radar dish.
This obviously won’t cut it on mobile devices, so newer LiDAR scanner systems – including the 3D time-of-flight (ToF) sensors already seen on many smartphones – are solid-state affairs with no moving parts. But what is the difference between a time-of-flight sensor and the LiDAR ‘scanner’ that we will most likely see on the iPhone 12?
What’s different about Apple’s LiDAR scanner?
You might already be familiar with the time-of-flight (ToF) sensors seen on many Android phones – these help them sense scene depth and mimic the bokeh effects of larger cameras.
But the LiDAR scanner system used in the iPad Pro 2020 – and, most likely, the two ‘Pro’ versions of the iPhone 12 – promises to go beyond this. That’s because it’s a LiDAR scanner, rather than the ‘scannerless’ systems seen on smartphones so far.
The latter use a single pulse of infra-red light to create their 3D maps, but a LiDAR scanner fires a train of laser pulses at different parts of a scene over a short period of time. This brings two main benefits: an improved range of up to five metres, and better object ‘occlusion’, where virtual objects convincingly disappear behind real ones, like trees.
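Occlusion itself boils down to a per-pixel depth comparison: a virtual object is drawn only where it is closer to the camera than the real surface the LiDAR depth map reports at the same spot. A hedged sketch of that test (the helper and sample depths are purely illustrative, not Apple's API):

```python
def is_visible(virtual_depth_m: float, real_depth_m: float) -> bool:
    """Draw a virtual object's pixel only if it sits in front of the
    real-world surface measured by the depth map at that pixel."""
    return virtual_depth_m < real_depth_m

# A virtual character standing 3 m away is hidden at pixels covered
# by a real tree 2 m away, but drawn where the background is 4.5 m off.
print(is_visible(3.0, 2.0))  # False: the tree occludes the character
print(is_visible(3.0, 4.5))  # True: open space, so the character is drawn
```

In a real AR pipeline this comparison runs for every pixel of the rendered frame, which is why denser and more accurate depth data makes occlusion look so much more convincing.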
Impressively, it’s a speedy process too, but that speed is only really possible with the latest mobile processors. As Apple stated at the iPad Pro 2020 launch, three things are combined for a more detailed understanding of the scene: the LiDAR scanner’s data is processed together with data from the cameras and a motion sensor, all of it “enhanced by computer vision algorithms on the A12Z Bionic”. In other words, there’s a lot going on to make it appear seamless.
Currently, the iPhone 12 is rumoured to be coming with an A14 Bionic processor, which should offer the support Apple’s LiDAR scanner needs to process its data quickly. The LiDAR scanner arriving with the iPhone 12 is also expected to be a marked improvement on the one in the iPad Pro 2020.
A blog post from the developer of the Halide camera app pointed out that the iPad Pro’s depth data just doesn’t offer the resolution needed for some applications, like detailed 3D scanning or even Portrait mode.
This means the iPad Pro’s LiDAR scanner is designed more for room-scale applications like games or shifting around Augmented Reality furniture in IKEA’s Place app.
It doesn’t currently let you 3D scan objects with greater accuracy than other techniques like photogrammetry, which instead combines high-resolution RGB photos taken from different vantage points.
Wouldn’t it be great if these LiDAR scanner meshes could be combined with the kind of resolution and textures seen by RGB cameras or Face ID? That’s the ideal, but we’re not quite there yet – and it’s unlikely that the iPhone 12 will immediately make that leap either.
So what exactly will you be able to do with a LiDAR scanner on the iPhone 12?
What can a LiDAR scanner let you do on the iPhone 12?
So now we know the iPad Pro’s LiDAR scanner works best at room-sized scales. What, then, will a LiDAR scanner let you do on the iPhone 12?
For the average person waiting to use this, the two main benefits are augmented reality for gaming and augmented reality for shopping.
Apple has previewed a few LiDAR-specific applications that are conveniently coming “later this year” (most likely to tie in with the iPhone 12’s announcement) and one of the more interesting is the game Hot Lava.
A first-person adventure game for iOS and PC, Hot Lava will have a new ‘AR mode’ in late 2020 that draws on Apple’s LiDAR sensor to bring its molten rivers into your living room.
So far, the demo isn’t quite as impressive as we’d hoped – most of the objects that your character leaps around are in-game renders rather than your actual furniture, but there’s still time for it to develop.
Naturally, any mention of AR gaming brings to mind Pokemon Go, the only real smash hit for augmented reality so far. Interestingly, the game’s maker Niantic seems to be forging its own AR path, rather than relying on Apple’s tech.
It recently announced a new ‘reality blending’ feature for Pokemon Go – which lets characters realistically hide behind real-world objects like trees – and revealed the acquisition of a 3D spatial mapping company called 6D.
This shows that next generation AR gaming won’t necessarily be tied to Apple’s LiDAR-based tech or ARKit platform, but the iPhone 12 should at least give you a ringside seat for watching the AR battle play out.
But what about non-gaming experiences for the LiDAR sensor? So far, the best semblance of it is based around interior design. For example, the IKEA Place app lets you move around virtual furniture in your living room, as if you’re in a real-life version of the picture.
The LiDAR scanner helps here by letting you take a picture of your room and lay a virtual design over it, giving you a preview of the arrangement before you put it up to make your room beautiful.
But while the iPad Pro 2020’s improved Augmented Reality placement and occlusion (or ability to hide virtual objects behind real ones) are helpful, it’s still not a scintillating new use for the LiDAR scanner.
Is the iPhone 12 coming out with a LiDAR scanner?
Still, while the tech is currently more useful for computer-aided design (CAD) professionals and healthcare workers (if you have an iPad Pro, check out the impressive Complete Anatomy app), there is still plenty of room for creativity and surprises on the upcoming iPhone 12.
As Halide’s proof-of-concept app Esper shows, the LiDAR sensor could help app developers invent new creative forms that go way beyond traditional photography and video.
In the meantime, it’s fair to say that the LiDAR scanner on the iPad Pro and possibly iPhone 12 will initially be there to wow developers rather than tech fans.
You’ll get the chance to test-drive the future on LiDAR-equipped devices – but the real leap should come when these sensors and apps arrive on the Apple Glasses.
With this review opening our eyes to some of the things we will see in LiDAR scanner-powered smartphones, we will no longer be surprised when the LiDAR scanner era arrives. Though we hope it arrives soon, exactly when the full benefits of the LiDAR scanner will materialise is not yet certain.
Thanks for reading my review of the LiDAR scanner and its potential effects on smartphones. Please check our review page for more interesting reviews.