Following this month’s release of Apple Vision Pro, Apple has shared an in-depth overview of how Vision Pro and visionOS protect your data. The new “Apple Vision Pro Privacy Overview” covers topics such as Optic ID, cameras and your surroundings, Persona, EyeSight, and more.
Apple Vision Pro and privacy
Apple explains that Vision Pro and visionOS were developed from the start with a focus on privacy.
We integrated hardware and software on Apple Vision Pro to protect your information in light of the unique privacy challenges posed by spatial computing. Apple Vision Pro features, from using it with your eyes and hands to showing digital content in your physical space, also have privacy built in. There are four privacy principles that inform everything we do at Apple, including all the new features on Apple Vision Pro. These four principles are: data minimization, on-device processing, transparency and control, and security.
One of the big themes throughout Apple’s paper on Vision Pro is the emphasis on on-device processing. The paper explains that visionOS processes data on Vision Pro itself when possible, as opposed to sharing it with Apple or other developers:
visionOS processes data on-device where possible instead of sharing it with Apple or other developers. To protect where you look, the hover effects that are shown when you look at content are rendered on-device by visionOS and are not shared with the app you are using. visionOS also maps your surroundings on-device in order to realistically render virtual objects in your physical space. Additionally, your Persona is generated entirely on-device with photos you take of yourself using your Apple Vision Pro.
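For developers, this privacy model is largely automatic. As a rough illustration (the view name here is hypothetical), a standard SwiftUI control on visionOS receives a system-rendered hover highlight when you look at it, while the app itself only ever learns about an actual tap, never where your eyes are:

```swift
import SwiftUI

struct PrivacyAwareButton: View {
    var body: some View {
        Button("Open") {
            // The app is notified of the tap only; it never receives eye-tracking data.
        }
        // visionOS draws this highlight outside the app's process when the user
        // looks at the button, so gaze information stays on-device with the system.
        .hoverEffect(.highlight)
    }
}
```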
What about all those Vision Pro cameras? Apple has a thoughtful approach to making sure all of that data is protected.
The places where you use Apple Vision Pro, like at home, often have detailed information about your personal life. From items on your desk to who is in the room with you, data about your surroundings is protected by visionOS. visionOS blends apps with your surroundings entirely on-device, so the apps you use do not need to access information about your surroundings.
visionOS builds a three-dimensional model to map your surroundings on-device. Apple Vision Pro uses a combination of camera and LiDAR data to map the area around you and save that model on-device. The model enables visionOS to alert you about real-life obstacles, as well as appropriately reflect the lighting and shadows of your physical space. visionOS uses audio ray tracing to analyze your room’s acoustic properties on-device to adapt and match sound to your space. The underlying scene mesh is stored on-device and encrypted with your passcode if one is set.
By default, apps cannot access any information about your surroundings. Apps can also open a Full Space for a more immersive experience, where content from other apps disappears and the app can create windows, volumes, and unbounded content. With your permission, apps in a Full Space can access surroundings data to support more immersive experiences.
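To give a sense of what that permission flow looks like in practice, here is a minimal sketch (assuming the app has opened an ImmersiveSpace and declared NSWorldSensingUsageDescription in its Info.plist; the helper name is hypothetical) of requesting world-sensing authorization and receiving the scene mesh through ARKit on visionOS:

```swift
import ARKit

// Hypothetical helper: meaningful only while the app is showing a Full Space.
func startSurroundingsAccess() async {
    let session = ARKitSession()
    let sceneReconstruction = SceneReconstructionProvider()

    // visionOS presents a permission prompt; without consent the app gets no surroundings data.
    let results = await session.requestAuthorization(for: [.worldSensing])
    guard results[.worldSensing] == .allowed else { return }

    do {
        try await session.run([sceneReconstruction])
        // Mesh anchors describing nearby surfaces arrive asynchronously as the room is mapped.
        for await update in sceneReconstruction.anchorUpdates {
            print("Mesh anchor updated: \(update.anchor.id)")
        }
    } catch {
        print("World sensing unavailable: \(error)")
    }
}
```

The key point the paper makes is that none of this data reaches an app unless the user explicitly allows it for that Full Space experience.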
Check out the full Apple Vision Pro Privacy Overview on Apple’s website.