Beyond iPhone X: 3 potential uses for Apple’s TrueDepth


Apple is investing almost $400m in one of the companies responsible for its TrueDepth camera system on the iPhone X, and it’s likely just the first step in rolling the technology out across its range. The decision to grant millions from its Advanced Manufacturing Fund to Finisar, the US-based manufacturer of lasers used in TrueDepth, is being billed as a win for American industry; however, it’ll also give Finisar the scope to make more advanced versions. As the components get smaller, more capable, and more precise, that opens the door to putting the clever camera into new places.


MacBook Pro

This is the obvious one. Live with Face ID on the iPhone X for a while, and one day you catch yourself taking zero-contact biometric security for granted. For me, it was when the Face ID icon flickered briefly on-screen before the iPhone X pulled out my saved passwords and automatically logged me into my favorite websites.

It may sound ungrateful, since we only just got Touch ID on the MacBook Pro, but I’d love to see Apple integrate a TrueDepth camera into the bezel of its next laptop. That might be tricky to do: the camera module is, after all, the reason we have the dreaded notch on the iPhone X in the first place. A MacBook Pro lid is even thinner, and it’s going to take some electronics origami to shrink the whole assembly to fit without leaving a bulge.

Nonetheless, the effort would be worth it. A laptop that automatically logs you in when it sees you, keeps the screen awake while it knows you’re looking, and then locks itself again when you walk away. Video conferencing that can magically remove whatever background you’re actually in front of, since the camera understands depth information too. Not to mention the convenience of accessing saved passwords and accounts without needing to type in codes or stab a finger at the Touch Bar.

iPad Pro

On the face of it, adding TrueDepth to the iPad Pro is another no-brainer. However, while I’m pretty sure Apple is already doing this – and we might see it as soon as the 2018 iPad Pro refresh – I’m actually hoping the camera isn’t looking at me. Instead, I’d love to see TrueDepth give iPad photography a proper reason for existing.

You’ve probably seen them: people holding up their tablets to snap a shot, and looking fairly ridiculous in the process. However, give the iPad Pro a more capable sensor array and suddenly you’ve unlocked some very interesting depth-perception talents.

For instance, it could allow the iPad Pro to do room-scale mapping, or to scan and digitize 3D objects. We’ve seen add-on cameras that do something along those lines before, and of course Google has Tango, its clever but woefully under-utilized camera tech, but building it into an iPad Pro natively would give things like augmented reality a huge boost. Indeed, it could make the tablet the go-to for AR developers wanting to get ready for Apple’s much-anticipated smart glasses.

Apple Watch

I know what you’re probably thinking: Apple Watch? That tiny little display on your wrist – why would you want a TrueDepth camera there? Here, I’m not talking about the value of Face ID security; it’s the measure of attention that I’m most interested in.

The Apple Watch is a great way to preview information popping up on your iPhone, but in most cases it demands two hands: one held up, since the wearable is strapped to that wrist, and another to tap or swipe the touchscreen, or twiddle the Digital Crown. That’s fine, but there are plenty of times I don’t have both hands free. Suddenly, a notification comes in and I find myself trying to touch on-screen controls with the tip of my nose, or figure out how I’m going to read the entire message since I don’t have my other hand free to scroll.

Imagine if, instead, you could navigate the Apple Watch by gaze and attention. That’s well within the TrueDepth camera’s capabilities, given its eye-tracking technology. If it spots I’ve read to the bottom of what’s on-screen, it could auto-scroll further down; if my attention is on a certain button, it could assume that’s the command I want to trigger next. All without having to talk to Siri.


As we’ve seen from other first-generation products from Apple, there’s some clunkiness to go with the cleverness. Nonetheless, the TrueDepth camera looks like it’s here to stay, and Apple’s investment in streamlining production of the complex part – not to mention funding important R&D on improving its resolution and decreasing its physical size – telegraphs a real intention to do far, far more with the tiny sensor bar than just allow people to animate emojis with their face.



