AWE Nite NYC

iPhone 12 Pro and LiDAR

The LiDAR sensor has finally been integrated into the brand-new iPhone 12 Pro and 12 Pro Max, giving users the ability to scan the surrounding environment and greatly improving interactions with Augmented Reality apps.

Apple had already recognized the potential of LiDAR by adding it to the iPad Pro introduced in March 2020, but the iPhone’s portability completely changes how the sensor can be used, opening new opportunities for developers.

In addition to LiDAR, the iPhone 12 Pro includes a telephoto lens, improved autofocus, better performance for portrait mode and Night mode portraits, Apple ProRAW, stainless steel edges, and up to 512 GB of storage. (Digitames.com)

What are the changes that LiDAR will bring to our daily life?

Augmented Reality adds information to what we see through the camera of our mobile phone (or other AR devices, such as glasses).
The biggest limitation so far has been the inability to create a true “relationship” between the phone’s software and the real environment.

LiDAR overcomes this obstacle: it performs a very detailed scan of the surrounding space, analyzing and recognizing objects, and creates a link between the digital world and the real world.
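For developers, this link is exposed at the platform level. Here is a minimal sketch, assuming an iOS app built with Apple’s ARKit and RealityKit (the hypothetical helper class below is illustrative, not any SDK discussed at the event): on LiDAR-equipped devices, the session can rebuild the surrounding space as a live mesh and use per-pixel depth for occlusion and physics.

```swift
import ARKit
import RealityKit

// Minimal sketch: start an AR session that uses the LiDAR sensor to
// reconstruct the surrounding space as a mesh (hypothetical helper class).
final class LiDARSessionStarter {
    func start(in arView: ARView) {
        let configuration = ARWorldTrackingConfiguration()

        // Scene reconstruction (the live LiDAR mesh) is only available on
        // supported hardware such as iPhone 12 Pro / Pro Max and the 2020 iPad Pro.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }

        // Per-pixel depth from the LiDAR sensor, one map per camera frame.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }

        // Let RealityKit use the reconstructed mesh so virtual content is
        // occluded by, and collides with, real-world geometry.
        arView.environment.sceneUnderstanding.options.insert([.occlusion, .physics])

        arView.session.run(configuration)
    }
}
```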

AWE Nite NYC

During the latest AWE Nite NYC, held on October 27, Ori Inbar brought together, in an engaging panel discussion, some of the key leaders from Snapchat, Niantic, Occipital, and Patched Reality to talk about the future of Augmented Reality with the advent of LiDAR.

The following is the agenda of the meeting, which opened with a presentation by the Co-Founder of AWE:

1) Ori Inbar (Co-Founder of AWE and Super Ventures) – the latest and greatest from the Augmented World. One of the world’s foremost leaders in the Augmented Reality sector, he gave a quick overview of AR history, which with LiDAR is now reaching the consumer world.

2) Qi Pan (Senior Manager of Computer Vision Engineering, Snap) – Introducing Snapchat’s LiDAR-powered lenses for developers and Local Lenses.

Qi Pan is Senior Manager of Computer Vision Engineering at Snapchat.
One of his company’s goals is to make the application smarter by giving developers the ability to create new use cases with Augmented Reality, for example by using location or creating AR experiences shared by multiple users simultaneously.
He explained how LiDAR makes it possible to instantly understand the geometry of spaces at full metric scale and the semantics of the surrounding environments and objects.

Qi Pan also talked about Lens Studio, Snap’s AR creation tool, with over 1.5 million filters created by developers.

Snapchat, which had been working with Apple for a long time, released Lens Studio 3.2 the day after the iPhone 12 presentation, adding the ability to create filters that use LiDAR technology.
The features can be simulated by developers in an interactive preview, so lenses can be created without owning a LiDAR-equipped device.

LiDAR offers important new capabilities:
– Instant metric-scale geometry from a distance of 30 centimeters;
– Depth analysis of people;
– A solution to the problem of featureless white walls…
It’s not yet a perfect technology (for example, it doesn’t detect very small objects), but it’s a huge step forward for Augmented Reality on mobile devices.
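On the device side, these capabilities roughly correspond to ARKit’s LiDAR depth and people-segmentation options. The following is a hedged sketch expressed with Apple’s ARKit rather than Lens Studio, which exposes the same hardware features through its own tooling; the class name is illustrative.

```swift
import ARKit

// Hedged sketch of the device-level features listed above, expressed with
// plain ARKit: metric-scale LiDAR depth plus depth-aware person segmentation.
final class DepthReader: NSObject, ARSessionDelegate {
    func makeConfiguration() -> ARWorldTrackingConfiguration {
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics([.sceneDepth, .personSegmentationWithDepth]) {
            // Depth in meters for every pixel, plus segmentation so virtual
            // content can pass behind people in the frame.
            configuration.frameSemantics.insert([.sceneDepth, .personSegmentationWithDepth])
        }
        return configuration
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The LiDAR depth map (metric scale) and its per-pixel confidence.
        guard let sceneDepth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = sceneDepth.depthMap
        let confidence: CVPixelBuffer? = sceneDepth.confidenceMap
        _ = (depthMap, confidence)
    }
}
```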

3) Meghan Hughes (Head of Go-to-Market, Niantic Real World Platform, Niantic Labs) – The new AR Mapping task in Pokémon Go and the iPhone LiDAR

Meghan Hughes, Head of Go-to-Market for the Niantic Real World Platform at Niantic, the company behind Pokémon Go, talked during her panel about the newly released app update, which asks players (Trainers of level 20 or higher) to scan PokéStops in Augmented Reality using the LiDAR sensor.

Niantic is building a layer of LiDAR experiences for all devices that support the sensor. On devices with the built-in sensor, the software gains better performance in the dark and improved 3D reconstruction, dynamics, and object recognition.

It’s not a magic bullet, but it improves the software.

This evolution of the app brings consumers closer to the innovative technology of Augmented Reality through mapping reality, understanding reality, and finally sharing reality.

The cross-platform Augmented Reality APIs that Niantic is building have been field-tested, both the AR functionality and the back-end services, through millions of sessions per day.

Here’s what’s being improved:
– AR experiences become shared;
– Multiple players can share social, hyper-realistic experiences at the same time;
– There is an awareness of the context in which you play;
– In-depth scans are performed;
– The depth of the environments is perceived;
– There is a semantic understanding of objects;
– There is a 3D reconstruction of the environments and a final meshing.
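Niantic’s cross-platform APIs are its own stack; purely as an illustration of the last items in this list (semantic understanding, 3D reconstruction, and meshing), here is a minimal sketch of how a LiDAR iPhone delivers a classified scene mesh through Apple’s ARKit, assuming the session runs with sceneReconstruction = .meshWithClassification. The class name is hypothetical.

```swift
import ARKit

// Minimal sketch (plain ARKit, not Niantic's SDK): with LiDAR, the reconstructed
// scene arrives as ARMeshAnchors whose triangles carry a coarse semantic label.
// Requires running the session with sceneReconstruction = .meshWithClassification.
final class MeshSemanticsObserver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            let geometry = meshAnchor.geometry
            guard let classification = geometry.classification else { continue }

            // One UInt8 classification value per mesh face (triangle).
            for faceIndex in 0..<geometry.faces.count {
                let address = classification.buffer.contents()
                    .advanced(by: classification.offset + faceIndex * classification.stride)
                let rawValue = Int(address.assumingMemoryBound(to: UInt8.self).pointee)
                if let label = ARMeshClassification(rawValue: rawValue) {
                    // label is .wall, .floor, .ceiling, .table, .seat, .window, .door or .none
                    _ = label
                }
            }
        }
    }
}
```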

The AR content becomes aware of the surrounding environment and context.
The Visual Positioning System is powered by an Augmented Reality map of the world, which is constantly updated by players.

The Niantic development team is looking for proposals from game developers who want to join the community.
If you want to access the beta of the development platform, follow this link: Game Developer Application.

4) Jeffrey Powers (Co-founder of Occipital and CEO of Arcturus Industries) – From Occipital Structure Sensor to iPhone LiDAR

In 2008, Jeffrey Powers co-founded Occipital, maker of Canvas, Structure Sensor, 360 Panorama, and RedLaser (acquired by eBay).
In 2020, he co-founded Arcturus Industries, with the aim of advancing the state of the art in 3D perception for AR/VR devices.
During his panel, he talked about the development over the years, from the Occipital Structure Sensor up to the iPhone LiDAR.

5) Patrick O’Shaughnessey (Patched Reality) – A developer’s perspective: how iPhone LiDAR unleashes a new game category like Epic Marble Run: roll marbles over any terrain.

Patrick O’Shaughnessey is a software engineering executive with 25 years of experience in development and management.
He founded Patched Reality ten years ago, through which he develops unique Augmented Reality applications for mobile phones and AR digital glasses.
His company’s client list includes COACH, Pepsi, Ben and Jerry’s, Red Bull, HBO, PBS/Henson, and the American Museum of Natural History in New York.
During his panel, he talked about Epic Marble Run and a new category of games with advanced Augmented Reality, made possible by LiDAR.

During the session, there was also an interesting group conversation involving the many participants connected via live stream from all over the world.
If you were unable to follow the event live, you can watch it at this link:

AWE Nite is a physical and virtual space, organized locally around recurring meetups, where developers, creators, investors, academics, managers, and curious onlookers come together to discuss and grow interest in Augmented Reality.

Reference: AWE Nite

Cecilia Lascialfari