Apple Is Said to Target Rear-Facing 3-D Sensor for 2019 iPhone

22:53, 14 November 2017. Source: Bloomberg

A customer views an Apple Inc. iPhone X smartphone during the sales launch at a store in San Francisco, California, U.S., on Friday, Nov. 3, 2017. The $1,000 price tag on Apple Inc.'s new iPhone X didn't deter throngs of enthusiasts around the world who waited -- sometimes overnight -- in long lines with no guarantee they would walk out of the store with one of the coveted devices. © Bloomberg

(Bloomberg) -- Apple Inc. is working on a rear-facing 3-D sensor system for the iPhone in 2019, another step toward turning the handset into a leading augmented-reality device, according to people familiar with the plan.

Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, the people said. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user’s face and measures the distortion to generate an accurate 3-D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a three-dimensional picture of the environment.
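To make the comparison concrete, here is a minimal Swift sketch of the arithmetic behind a time-of-flight measurement: the sensor times how long an emitted pulse takes to return and converts that into a distance. The function name and the sample timing are illustrative assumptions, not details of Apple's unreleased system.

```swift
import Foundation

// Speed of light in a vacuum, in metres per second.
let speedOfLight = 299_792_458.0

/// Converts the measured round-trip time of an emitted pulse into a
/// distance estimate. The pulse travels to the object and back, so the
/// one-way distance is half of (speed of light x elapsed time).
func timeOfFlightDistance(roundTripSeconds: Double) -> Double {
    return speedOfLight * roundTripSeconds / 2.0
}

// A pulse that returns after roughly 6.67 nanoseconds corresponds to an
// object about one metre away.
let distance = timeOfFlightDistance(roundTripSeconds: 6.67e-9)
print(String(format: "Estimated distance: %.2f m", distance))
```

In an actual time-of-flight camera this measurement is made across the whole image sensor at once, which is what produces the three-dimensional picture of the environment described above.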

The company is expected to keep the TrueDepth system, so future iPhones will have both front- and rear-facing 3-D sensing capabilities. Apple has started discussions with prospective suppliers of the new system, the people said. Companies manufacturing time-of-flight sensors include Infineon Technologies AG, Sony Corp., STMicroelectronics NV and Panasonic Corp. Testing of the technology is still in the early stages, and it could end up not being used in the final version of the phone, the people said. They asked not to be identified discussing unreleased features. An Apple spokeswoman declined to comment.

The addition of a rear-facing sensor would enable more augmented-reality applications in the iPhone. Apple Chief Executive Officer Tim Cook considers AR potentially as revolutionary as the smartphone itself. He’s talked up the technology on Good Morning America and gives it almost as much attention during earnings calls as sales growth. “We’re already seeing things that will transform the way you work, play, connect and learn,” he said in the most recent call. “AR is going to change the way we use technology forever.”

Apple added a software tool called ARKit this year that made it easier for developers to make apps for the iPhone using AR. The tool is good at identifying flat surfaces and placing virtual objects or images on them. But it struggles with vertical planes, such as walls, doors or windows, and lacks accurate depth perception, which makes it harder for digital images to interact with real things. So if a digital tiger walks behind a real chair, the chair is still displayed behind the animal, destroying the illusion. A rear-facing 3-D sensor would help remedy that.
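For a sense of what that limitation looked like to developers at the time, here is a minimal Swift sketch using ARKit's public API from iOS 11, in which plane detection could only be enabled for horizontal surfaces. The surrounding view-controller scaffolding is illustrative; ARWorldTrackingConfiguration and its planeDetection option are the actual interface ARKit exposes.

```swift
import UIKit
import SceneKit
import ARKit

// Minimal ARKit session that detects flat horizontal surfaces, the case
// ARKit handled well at launch. There is no comparable option here for
// walls, doors or windows, and no per-pixel depth map of the scene, which
// is the gap a rear-facing 3-D sensor could fill.
class ARPlaneViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal  // floors and tabletops only
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a flat surface. Virtual content anchored to
    // `node` will sit on that surface, but without depth data it cannot be
    // occluded by real objects standing in front of it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Attach virtual objects to `node` here.
    }
}
```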

The iPhone X uses its front-facing 3-D sensor for Face ID, a facial-recognition system that replaced the fingerprint sensor used in earlier models to unlock the handset. Production problems with the sensor array initially slowed manufacturing of the flagship smartphone, partly because the components must be assembled to a very high degree of accuracy.

While the structured light approach requires lasers to be positioned very precisely, the time-of-flight technology instead relies on a more advanced image sensor. That may make time-of-flight systems easier to assemble in high volume.

Alphabet Inc.’s Google has been working with Infineon on depth perception as part of its AR development push, Project Tango, unveiled in 2014. The Infineon chip is already used in Lenovo Group Ltd.’s Phab 2 Pro and Asustek Computer Inc.’s ZenFone AR, both of which run on Google’s Android operating system.

To contact the reporters on this story: Alex Webb in San Francisco at awebb25@bloomberg.net, Yuji Nakamura in Tokyo at ynakamura56@bloomberg.net.

To contact the editors responsible for this story: Tom Giles at tgiles5@bloomberg.net, Alistair Barr, Molly Schuetz
