- cross-posted to:
- hackernews@derp.foo
- Users of those services will be steered toward the web
- Searches indicate apps from Meta may also be unavailable
Bypass paywall: https://archive.ph/4kfYI
Though in that case, I’d rather have these virtual displays driven by my PC, not some BS Apple ecosystem.
And their resolution and size are arbitrary. Those have meaning in the physical world because physical monitors need dimensions and have to fit their pixels into that space. For virtual displays, the only limits are how much of your field of view you want to dedicate to each display and how high your headset’s resolution is.
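To put rough numbers on that trade-off, here is a minimal sketch of the arithmetic. The headset specs are illustrative assumptions (roughly Vision Pro class), not figures from the thread, and it assumes pixels are spread evenly across the field of view, which real optics don’t quite do:

```python
def virtual_display_pixels(headset_res_h: float, headset_fov_deg: float, display_fov_deg: float) -> float:
    """Horizontal pixels available to a virtual display spanning `display_fov_deg`
    degrees on a headset with `headset_res_h` pixels across `headset_fov_deg` degrees."""
    pixels_per_degree = headset_res_h / headset_fov_deg
    return pixels_per_degree * display_fov_deg

# Assumed specs: ~3660 px per eye over ~100 degrees of horizontal FOV.
# Dedicating 40 degrees of view to one virtual monitor:
print(virtual_display_pixels(3660, 100, 40))  # ~1464 px wide
```

So the display’s “size” is just an angular budget: a wider virtual monitor either eats more of your field of view or gets fewer pixels per inch of apparent screen.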
And this is only really scratching the surface of what AR might be capable of. Why use virtual displays when windows could float without a display? Why use windows when UI elements could float on their own? Why show a screen playing a video when you could render the video as a semi-transparent 3D scene happening around the viewer (other than the obvious "because it’s in video format, not 3D")?
That said, I’ll wait for someone else to do it, since Apple likes to take good ideas and simplify them down to the point of frustration.
Yeah, I don’t want Apple’s implementation either; I was just saying to the other commenter where I thought the endgame was headed.
Your vision starts with iVision. You can see that Apple is trying to do most of that. If the high-priced niche product succeeds, everyone else will jump on that bandwagon, and your vision is only a few years away.