Over the past year, we’ve been involved in many visionOS projects for the workplace. Apple’s announcements at WWDC this year may look incremental, but they are very significant for enterprise spatial computing.
These new technologies make broader — and deeper — Vision Pro deployments considerably more feasible. So rather than talk in the abstract, here’s my take on how we can apply them to four real project examples (some specific details are omitted to protect the innocent).
Remote Expert
Imagine an on-site engineer working to keep some complex equipment running. When they reach the limit of their knowledge, they need additional expertise to help. Usually, that arrives in the form of a human (likely after traveling on a plane).
In relatively ‘controlled’ (read: safe) environments, Vision Pro is already very useful. Since visionOS 2, a remote expert can see through the eyes of the on-site engineer and provide real-time guidance without travel.
visionOS 26 provides significant improvements:
- Stream 3D (stereo) video to the remote expert (with access to both front-facing cameras) enabling them to provide more effective guidance (via their own Vision Pro)
- Provide a world-locked video feed of a specific part of the user’s field of view — like a virtual camera facing a specific part of the equipment. Even when the user moves their head slightly, the remote expert gets an uninterrupted view.
- Windows can now follow the user as they move around an environment (they are locked to the real world by default) — written instructions from the remote expert can follow the user as they work.
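Camera access of this kind builds on the enterprise Camera Access API introduced with visionOS 2 (it requires an enterprise entitlement and licence file). As a rough sketch of the pattern — with the stereo left-and-right access described above following the same shape — reading main-camera frames looks something like this:

```swift
import ARKit

let session = ARKitSession()
let cameraFrameProvider = CameraFrameProvider()

func streamFramesToRemoteExpert() async throws {
    // Pick a video format that exposes the left main camera.
    guard let format = CameraVideoFormat
        .supportedVideoFormats(for: .main, cameraPositions: [.left])
        .first else { return }

    try await session.run([cameraFrameProvider])

    guard let frameUpdates = cameraFrameProvider.cameraFrameUpdates(for: format) else { return }
    for await frame in frameUpdates {
        guard let sample = frame.sample(for: .left) else { continue }
        // sample.pixelBuffer is a CVPixelBuffer you can encode and send to
        // the remote expert over your own streaming transport.
        _ = sample.pixelBuffer
    }
}
```

The encoding and network transport are up to you — Apple provides the frames, not the call infrastructure.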
Collaborative Design
Creating physical design mockups has many obvious drawbacks, especially for large and complex designs like cars or consumer products.
Vision Pro can already dramatically reduce the need to produce expensive prototypes, and the latest announcements from Apple remove some significant blockers to adoption:
- In-person shared experiences. Now you can share a virtual object (a design prototype) with other people you are physically present with (Apple didn’t provide an API for that before). So you can walk someone through proposed design changes using a virtual object in your physical office.
- No need for FaceTime. SharePlay (the Apple-provided API for sharing) requires FaceTime, which is all well and good unless you are deploying a solution in an enterprise environment. In-person shared experiences can now be delivered with no reliance on Apple cloud services.
- Spatial accessories, like a Bluetooth stylus or a game controller, now make it much easier to mark up virtual objects with drawings and notes.
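The FaceTime-free path works by having nearby devices agree on a shared coordinate space over your own networking. A minimal sketch, assuming the `SharedCoordinateSpaceProvider` API announced for visionOS 26 — `PeerTransport` is a hypothetical stand-in for your network layer, and the exact provider surface should be checked against Apple’s documentation:

```swift
import ARKit
import Foundation

// Hypothetical network layer — not an Apple API.
protocol PeerTransport {
    func broadcast(_ data: Data)
    func onReceive(_ handler: @escaping (Data) -> Void)
}

let session = ARKitSession()
let sharedSpace = SharedCoordinateSpaceProvider()

func runSharedSpace(transport: PeerTransport) async throws {
    try await session.run([sharedSpace])

    // Outgoing: poll the provider for coordinate-space data and
    // broadcast it to nearby peers over your own transport.
    if let data = sharedSpace.nextCoordinateSpaceData {
        transport.broadcast(data.data)
    }

    // Incoming: hand data received from peers back to the provider,
    // so all devices converge on one shared coordinate space.
    transport.onReceive { bytes in
        sharedSpace.push(data: bytes)
    }
}
```

Once the devices share a coordinate space, the prototype entity can simply be placed at the same world position on every headset.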
Experiential Learning
Adults learn better by doing. Vision Pro is extremely good at creating hyper-realistic objects and environments that are exceptionally useful in training.
This year brings some very important technical and logistical improvements that make sharing devices in a training environment much more feasible.
- Devices that are managed by an organization can be returned to a known state in an automated way. This ‘return to service’ capability reduces the need for on-site technical staff.
- Users who have previously set up a Vision Pro can store their setup information in iCloud and transfer it to a Vision Pro via their iPhone. This could significantly speed up onboarding.
- Improved scene understanding and ‘environmental occlusion’ capabilities allow the Vision Pro to much more effectively merge the physical world with virtual objects. Imagine a physical training/simulation center, brought to life with photorealistic virtual objects.
- The previously mentioned in-person shared experiences allow multiple students to receive instruction on the same virtual object (this is wild). Improved virtual Personas make remote learning experiences more engaging.
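To give a feel for the occlusion point above: in RealityKit a virtual training prop can be told to hide behind real-world geometry. A minimal sketch, assuming the `EnvironmentBlendingComponent` described for visionOS 26 (check the exact name and cases against Apple’s documentation):

```swift
import RealityKit

// A placeholder training prop — a simple grey box standing in for a
// photorealistic simulation asset.
let prop = ModelEntity(
    mesh: .generateBox(size: 0.3),
    materials: [SimpleMaterial(color: .gray, isMetallic: false)]
)

// Ask the system to occlude this entity behind the physical surroundings,
// so it appears to sit naturally inside the real training centre.
prop.components.set(
    EnvironmentBlendingComponent(preferredBlendingMode: .occluded(by: .surroundings))
)
```

The important part is that occlusion is handled by the system from its scene understanding — the app doesn’t need its own mesh of the room.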
Product Experiences
Vision Pro is exceedingly good at providing your customer a powerful, emotional impression of a product or experience. Imagine an immersive product catalog that enables potential customers to be captivated by your products before seeing them in the real world.
visionOS 26 gives us some good steps forward here:
- Include immersive media — in a multitude of formats — in your app. Transport your customer into an extremely high quality representation of an experience using 3D, 360 or even Apple Immersive video.
- ‘Progressive immersion’ (more flexible this year) allows a user to experience a ‘portal’ into your content — they need not be dropped into a disorienting virtual environment, but can dial your content in and out (just like Apple’s own virtual environments).
- Metal — Apple’s native graphics API — receives several improvements this year, including the ability to highlight parts of the UI based on what the user is looking at. This means ultra high fidelity graphical experiences can have a much better user experience.
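Progressive immersion is a one-line opt-in at the scene level. A minimal sketch — `ShowroomView` is an illustrative name for your RealityKit content:

```swift
import SwiftUI

@main
struct ProductCatalogApp: App {
    // The user can turn the Digital Crown to dial immersion in and out.
    @State private var style: ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "showroom") {
            ShowroomView() // your RealityKit product content
        }
        .immersionStyle(selection: $style, in: .progressive, .full)
    }
}
```

Listing `.full` alongside `.progressive` lets the customer go all the way into the environment once they are comfortable, while starting them at a portal.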
Chris Jinks, Spatial Computing Consultant: “Spatial computing will become as integral to enterprise workflows as mobile. We are seeing that inflection point getting closer each year. This is Apple at its best in enterprise: making deliberate, iterative improvements based on feedback from the ecosystem.”
Work with Trifork
We don’t believe this technology is right for every use-case yet — we are very honest about that — but with each year an increasing number of work tasks can be fundamentally changed for the better.
If you have an idea for a spatial computing solution for your business, we’d love to talk about it. Contact us here, and we’ll set up a conversation right away.
For further information please contact:
