
Regarding possibility of full-screen applications, or SBS stereoscopic rendering #41

NicholasCStanley opened this issue Apr 10, 2024 · 2 comments


@NicholasCStanley

Given the current Ensemble implementation, and the inherent limits of this type of tethering/mirroring on visionOS, is there any possibility of passing along immersive content / SBS stereoscopic 3D data as a "fullscreen" app window to the Vision Pro?

Essentially, the goal would be to run an app natively on the Mac (think Unity engine), launch Ensemble, open the window on the AVP headset, and enable a stereo 3D VR visualization as a full-screen immersive view.

Ideally this would also allow hand-off of gyroscopic/tracking data from the AVP hardware to the Mac as an input for viewport pose/render perspective updates. Basically: run a macOS-native app and use the Vision Pro as a 6DoF-tracked 3D spatial display, rather than running a visionOS app in the simulator on the Mac and streaming it over the developer strap.
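
For the tracking hand-off half of this, here is a minimal sketch of what reading the headset pose on the visionOS side might look like. It assumes an open ImmersiveSpace (world tracking requires one), and sendPoseToMac(_:) is a placeholder for whatever transport would actually be used; none of this is part of Ensemble today.

import ARKit
import QuartzCore
import simd

// Hypothetical sketch: poll the device (head) pose and forward it to the Mac.
func sendPoseToMac(_ pose: simd_float4x4) {
    // Placeholder: serialize and ship the transform over the existing connection.
}

func streamHeadPose() async throws {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    try await session.run([worldTracking])

    while !Task.isCancelled {
        if let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
            // 4x4 transform of the headset relative to the immersive space origin (6DoF pose).
            sendPoseToMac(device.originFromAnchorTransform)
        }
        try await Task.sleep(for: .milliseconds(11)) // roughly 90 Hz
    }
}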

@NicholasCStanley
Author

Example: converting the ZStack implementation in RootWindowView.swift to a volumetric viewport:

import SwiftUI

struct RootWindowView: View {
    let remote: Remote
    let window: Window
    @State var children = [Window]()
    @State var appIcon: Data?

    var body: some View {
        GeometryReader { geometry in
            ZStack {
                WindowView(remote: remote, window: window)
                ForEach(children) { child in
                    // Child window size, scaled into this view's coordinate space
                    let width = child.frame.width / window.frame.width * geometry.size.width
                    let height = child.frame.height / window.frame.height * geometry.size.height
                    // Child window center as a unit point (0...1) within the parent,
                    // which is what the .scene attachment anchor expects
                    let x = (child.frame.minX - window.frame.minX + child.frame.width / 2) / window.frame.width
                    let y = (child.frame.minY - window.frame.minY + child.frame.height / 2) / window.frame.height

                    Color.clear
                        .ornament(attachmentAnchor: .scene(.init(x: x, y: y))) {
                            WindowView(remote: remote, window: child)
                                .frame(width: width, height: height)
                        }
                }
            }
            .ornament(attachmentAnchor: .scene(.init(x: 0.5, y: 1 + 64 / geometry.size.height))) {
                WindowToolbarView(title: window.title!, icon: appIcon)
            }
        }
        .background {
            AspectRatioConstrainingView(size: window.frame.size)
        }
        // Note: windowStyle and defaultSize are Scene modifiers, not View modifiers,
        // so as written they would not compile here (see the note below the snippet).
        .windowStyle(.volumetric)
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
        .task {
            // (remainder of the original snippet truncated in the comment)
        }
    }
}
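
A side note on the snippet above (not from the original comment): windowStyle(_:) and defaultSize(width:height:depth:in:) are Scene modifiers, so the volumetric styling would presumably have to live on the WindowGroup in the app's Scene body rather than on the view. A rough sketch, with hypothetical type names:

import SwiftUI

// Hypothetical sketch only, not Ensemble's actual scene setup: the volumetric style
// and default size are applied at the Scene level, on the WindowGroup.
@main
struct EnsembleViewerApp: App {
    var body: some Scene {
        WindowGroup {
            // RootWindowView(remote: ..., window: ...) would be constructed here.
            Text("placeholder")
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
    }
}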

Obviously a volumetric window is not a stereoscopic immersive fullscreen view...

@saagarjha
Owner

Sorry, I'm not sure how this would work? Stereoscopic content is rendered for each eye, correct? There isn't really a way to do that on visionOS to my understanding; you need to hand it your 3D content to render.
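
For context (an editorial sketch, not part of the reply above): immersive content on visionOS is normally supplied either as RealityKit entities in an ImmersiveSpace or as a Metal render loop via CompositorServices, with the system performing the per-eye rendering itself. A minimal RealityKit sketch of the "hand it your 3D content" model, with illustrative names:

import SwiftUI
import RealityKit

// Hypothetical sketch: the app supplies 3D entities; visionOS renders a view per eye.
@main
struct ImmersiveSketchApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // Add 3D content; there is no path here for feeding in a
                // pre-rendered side-by-side frame streamed from another machine.
                let sphere = ModelEntity(mesh: .generateSphere(radius: 0.2))
                content.add(sphere)
            }
        }
        .immersionStyle(selection: .constant(.full), in: .full)
    }
}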
