1349: Building Juno for YouTube visionOS App with iOS developer Christian Selig

source link: https://voicesofvr.com/1349-building-juno-for-youtube-visionos-app-with-ios-developer-christian-selig/

February 15, 2024 by kentbye

One of the more popular apps for the Apple Vision Pro is Juno for YouTube, which iOS developer Christian Selig created in a week using the simulator in order to have it ready in time for the February 2nd launch. Mark Gurman reported on January 18th that Spotify and YouTube were joining Netflix in not having native visionOS apps at launch, and Selig saw an opportunity to port his previous YouTube API integrations from his Apollo Reddit app into a native app. I caught up with Selig on the same day that he pushed out a 1.1 version that fixed a lot of the differences between what the visionOS simulator felt like and what the actual eye tracking and hand gestures felt like on the Apple Vision Pro hardware. There were many differences that we explore in this podcast, as well as some of the existing operating system limitations of the 1.0.3 and 1.1 versions of visionOS that have launched so far in terms of multiple audio sessions, huge-screen theatre modes not being available to developers yet, and the mixed reality limitations of the immersive modes.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So in today's episode, I'm featuring iOS developer Christian Selig, who developed the Juno for YouTube app for visionOS. So Christian has been a longtime iOS developer who actually created the Apollo app for Reddit, which ended up shuttering after Reddit changed their cost structure for the API in a way that made it impossible for him to continue the application. But he developed YouTube integrations in the context of that application that he was able to take and use to create a visionOS application purely within the simulator. Now, when he actually got a hold of the Apple Vision Pro hardware, he noticed that there were a lot of differences between what the app felt like when he built it in the simulator versus what it actually felt like on the device. So I wanted to talk to Christian about his journey into creating what is probably one of the more popular visionOS applications in the beginning, just because YouTube had decided not to launch a native application. They have since come out and said it's on the roadmap, but it's just not something that they had prioritized to get done by the launch. So Christian's taking very much an iterative approach of trying to boil down a lot of the core essential features that you want out of YouTube. This is actually one of the first applications that I bought for visionOS, just because I like to have videos playing in the background, and the Safari interaction is not great when you're trying to use it in the headset. It's sort of like the equivalent of trying to use a desktop or laptop version in the context of a mobile phone: it's not really optimized for that interface. And so there's a lot of ways that the eye tracking and hand gestures are not necessarily optimized when you start to use all these different applications in the Safari browser. So it makes a huge difference to have native integrations. You know, I found that with the Apple Vision Pro, I wanted to have more of this mixed reality environment. But then if you have something that's fully immersive, it basically shuts out all of your other apps, one of which would be having a YouTube video playing in the background while you're playing some sort of casual game. You can't necessarily mix and match when you're doing that on the Apple Vision Pro. So we talk about some of those different design challenges and some of the existing restrictions that Apple has, especially when it comes to how many audio channels you have by default, and that there are some options there on the back end, but yet it feels much more like you're operating on a mobile phone than you are on a full computer. So there's a lot of these things that I think are going to start to get fleshed out over time. But I just wanted to get a sense from Christian about some of his journey into this space and some of his reflections on what it's been like to develop Juno for YouTube, which is probably one of the more popular applications on the platform so far. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Christian happened on Wednesday, February 14th, 2024. So with that, let's go ahead and dive right in.

[00:02:55.085] Christian Selig: My name is Christian Selig. I'm an iOS developer from Eastern Canada. I've mostly developed iOS apps historically, but I'm kind of dipping my toes into visionOS development now. And I do it independently as an indie developer. It's my full-time job; I don't have a proper job on the side of this, for better or worse. And it's been that way for about 10 years for me. And I tend to focus on building apps that scratch my own itch, or apps that I want to see out there in the world, that fill what I think is a gap in the market currently, and that hopefully other people enjoy. And that's kind of been my operational model so far.

[00:03:32.435] Kent Bye: Great. Maybe you could give a bit more context as to your background and your journey into becoming a developer.

[00:03:38.690] Christian Selig: Yeah, so basically in high school, as is common, I'm sure, I wasn't quite sure what I wanted to do, but I liked computers a lot. So that kind of seemed like something to gravitate toward. And this was back at the point where the whole "there's an app for that" commercials were getting really big. This was like 2010, I want to say. So the App Store was becoming a big part of popular culture, Apple was really leaning heavy into it, and I really bought into it. And I loved the idea that you could build something, and without a ton of effort, it would be distributed to a bunch of people's devices, provided it was interesting enough for people. And people out there in the world could be using something you built. And just the idea of going on a bus or something and looking over someone's shoulder and seeing them use something that you made was such a cool concept to me. Because prior to software, that seemed like such a nebulous idea, where you'd have to, I don't know, talk to a manufacturer, get distribution going, then hope to sell at Walmart or something. And just breaking the walls down and making that such a more attainable reality was fascinating to me. So I remember I got a book on Objective-C development at the time and just devoured that. Ultimately, I ended up going to university for computer science. I did my bachelor's there. I always get a lot of questions about how important it is to have a formal education for iOS development or programming in general. And it's one of those areas where I would say it was a good structured program, but for what I do day-to-day, it definitely isn't an important aspect of my journey so far, I would say. But the one thing I did really appreciate with the university was we had a co-op internship program that was a required part of the degree. And for my last one, I applied to Apple on a whim and ended up getting an internship there for the summer. And that was really my last jobby job that I would have gone to. And it was such an incredible experience, and it really cemented my desire just to build fun apps for people and keep going down that road for these fun platforms. And that's kind of been my situation since, I suppose.

[00:05:31.783] Kent Bye: Awesome. What were you doing at Apple when you were working there?

[00:05:34.912] Christian Selig: The team, to my knowledge, doesn't exist anymore, but it was an enterprise app development team. So it was almost evangelist-esque, like developer evangelists, where you'd essentially go to other corporations and show them best practices with iOS development, that kind of thing. And at Apple, a lot of the interns typically get a project to build throughout the summer that might not be super applicable to the job itself, but is just kind of a fun project to learn with. So that was a lot of it too. But it was cool just going around to different companies and talking to smart folks, and being surrounded by so many smart folks at Apple too.

[00:06:05.340] Kent Bye: Well, I know that when I heard of the Juno app, it was cast in the context of the same developer who developed Apollo, and I'm wondering if you could elaborate on how Apollo came about, 'cause it seems like that was one of your big hits.

[00:06:17.113] Christian Selig: Yeah, yeah, totally. So, funnily enough, I remember walking home from university one day, and I was trying to come up with another app idea I wanted to do, because I had built a little speed reading app prior to that that did pretty well and helped make some extra money for university. And I wanted something new to sink my teeth into. And I used Reddit a lot at the time. It was super big. And I was like, I love this platform, but there's not really any app out there that exists right now that I love. There's some great ones, but when you spend so many hours doing something, it's like if you're a carpenter and you're building a house and use a drill all day, it could be a really good drill, but if there's just one thing that you wish was a little different, you're like, gosh, I wish I could design my own drill. So I was like, okay, screw it. I'll take a swing at this. I ended up going to Apple not too long after that, so I kind of put it on pause. But after Apple, I really was jazzed up and really wanted to tackle it. Throughout my last semester of university, I built it up a bunch and got it to a point where I could show it off a little bit. And I posted about it on Reddit, and people were really interested in the idea of this Reddit app for iOS that really felt like a first-class iOS citizen, that if you were comfortable with iOS, you could pick it up and just immediately use it, and that really adopted the platform's conventions and interface and features well. And that's what I wanted: almost as if Apple themselves had built a Reddit app. And thankfully, that idea resonated with a lot of people and made me think, okay, I'll stick with this for a little bit. It seems like there's some smoke here where hopefully there's some fire. And I ended up just taking the money I made at Apple and living frugally off it long enough to get the app out into a state where it could be beta tested and eventually made public. Just being like, well, hopefully this works. If not, it'll be a good resume item, and I can just go back to Apple or something, having had this fun experience building this app. But fundamentally, when I released it, it did really well from the get-go, and people were really passionate about the app. And that became, gosh, I think nine years of my life in total, building that app. And it got millions of downloads by the end of it. Just a really, really fun experience to build that app. Learned an absolute ton. I met a ton of cool people. Yeah, it was really awesome.

[00:08:22.474] Kent Bye: And so I know that when Elon Musk took over Twitter, he changed a lot of the API rules, and it seemed like Reddit took some inspiration from that: hey, maybe we could charge for APIs. Or maybe there was also some stuff going on with Reddit getting scraped for training by the likes of OpenAI. So maybe you could give, from your perspective, what happened with the downfall of Reddit turning off all their APIs, or making them prohibitively expensive for a lot of developers.

[00:08:47.836] Christian Selig: It was... In some ways, I'm still trying to understand it myself, because even from seeing a bit of how the sausage was made, it still didn't make a ton of sense to me. But yeah, I think your timeline is correct. Elon came in and started shaking things up at companies like Twitter, who historically had really powerful third-party apps, and got rid of that. And it seems like Reddit smelled an opportunity there to get rid of some apps as well, for lack of a better term. Because I think it was April when they announced the changes, and February when Elon did his. And in January, I had a meeting with Reddit, and they were like, oh no, the API is great, we have no plans to change anything for a long time, we're focused on other stuff. And then of course, a few months later, it's like, actually, everything's changing. So there was clearly something in that short period that changed things up a little bit. And yeah, I think it was seeing what Elon did. And I think, like you said, there were a lot of apps like ChatGPT and whatnot, and services with AI, that were using Reddit as kind of their training data, and I think Reddit wanted their slice of the pie there. That was one of those things where it was like, that seems totally fair to me. And not only that, but charging apps like mine seemed totally fair to me. And it was one of those things where the solution seemed so obvious from the outside, where it could have just been a two-tiered approach where you'd say: hey, for platforms that are using our data for training but not really giving anything back to Reddit as a whole, so like chat models or large language models where we're not fundamentally getting anything from that, you will pay price X. For apps that are building on our platform and giving stuff back to our users, like moderation tools, or just alternative experiences that users might enjoy, you can pay price Y. Like how Apple has the small business program, for instance. But they went with this instead, this one large lump sum cost. Basically, to put numbers to it, Apollo was paying quite literally nothing for the API. And then it went from that to them announcing the price and saying, okay, within 30 days, you're going to start incurring charges that will cost $20 million a year. And it was one of those things where you're like, okay, that's a substantial jump. Can't quite afford that. And not only that, but the time period that they gave us, the 30 days, was just so impossible to work within. These apps have millions of downloads and tons of users, and completely rerouting a ship, re-architecting the payment model, migrating existing users, and changing things up would take a lot longer than 30 days. That's why you had companies like... I know when Apple bought Dark Sky, I think they had at least a one-year period where they said, look, nothing's going to change for a year. In a year, then we'll start to institute the changes. I mean, that seemed a lot more reasonable than 30 days. But yeah, it was the price coupled with the very, very short timeline that made it really tricky. And ultimately, it just wasn't possible. But it was interesting, because fundamentally, at the beginning, when they said, look, we're going to start charging for the API, I remember talking to a lot of the other third-party developers, and there was legitimately a lot of excitement there about the possibility of a better, more solid relationship with Reddit.
Insofar as prior to that, a lot of the new features Reddit had gotten, such as an instant messaging DM feature, they had developed internally and not published the API for. And we were thinking, oh, stuff like that, if we're paying for the API, presumably they'd open more of it up, since we're paying for it. So there was this idea that, okay, paying for stuff will be an extra expense, but it sounds like we'll get something great out of it, and it'll solidify our relationship more. So that sounds good. But yeah, the proof's in the pudding, and when the bill is $20 million and there's 30 days to pay it, it's not too easy to make it happen. So I think most of us ended up shuttering and closing up shop by the end of the 30 days. Yeah, I think that's the elevator version of it.

[00:12:30.775] Kent Bye: Okay. So you have the end of the Apollo app as this first iteration. And then we just had the launch of the Apple Vision Pro, with February 2nd, 2024 being the launch date. What had you been doing up until the point when you decided to develop the Juno app?

[00:12:45.281] Christian Selig: So funnily enough, time is tricky, but I want to say it was whenever the iPhone 15 or the 14 Pro, the first one with the Dynamic Island, came out. I remember seeing the Dynamic Island and being like, oh, that's a really cool feature. I wonder if, in Apollo, I could do something with that that would be a little quirky. Since the Dynamic Island is not fully integrated into the device anymore, it's floating, there's that little space above it that was empty. And I was like, okay, maybe I'll put a little virtual pet or something up there. And for whatever reason, that blew up on Twitter and TikTok, and people were like, oh my God, you can have a little cat up there or something. And it ended up getting a crap ton of downloads. And the one complaint I had was people were like, okay, what is this virtual pet? Why do I have to download a Reddit app that has nothing to do with it? So I was like, okay, for the common TikTok user who doesn't understand this, I'll spin this off into a separate app called Pixel Pals and just let them download this much smaller, more concise app. And that ended up doing super well too, to the tune of millions of downloads. So I've just been working on that and having fun with it, building such a fundamentally different app, since, I guess, last July, and refactoring things I kind of wanted to add in or change, given more time that I didn't really have when I was working on Apollo. And yeah, that's kind of been the time up until, I guess, late January, when I was like, oh, I want to build a little visionOS YouTube app.

[00:14:03.754] Kent Bye: So yeah, I guess it had been announced that there were a number of different companies, like Spotify, YouTube, and Netflix, that for whatever reason weren't launching apps. I suspect some of the reasons could have been that in order to have their payment processing, you would have Apple take a 30% cut. So I don't know if they were taking a stand against Apple, or if they just decided not to develop some of these apps because they didn't feel like it was going to be a big enough platform to be worth the effort. Yet when you heard that YouTube was not going to be coming to visionOS, what happened from there for you to start thinking about building what ended up becoming the Juno app?

[00:14:39.020] Christian Selig: Well, for me, I just remember reading the headline and being like, oh, that sucks. I watch a lot of YouTube. My apps would be a lot better if YouTube didn't exist, just insofar as I find myself like, oh, I'll just go to YouTube for a second, and I'm there for 45 minutes watching stuff. And watching YouTube videos just sitting at home with a massive screen was one of the things I thought would be a really cool experience on visionOS with the Vision Pro. So it was a real bummer to be like, oh, that's not going to be a thing. And Safari is kind of janky. It doesn't eye track super well, since there's a zillion little anchor tags and everything everywhere, so your eyes are bouncing between a bunch of different stuff in the video player. And I was like, oh, in Apollo, people post YouTube links all the time. r/videos is almost all YouTube links, for instance. So in Apollo, wanting to make that experience better, I built in a little YouTube player, where YouTube has this iframe embed API where you can just embed a web player version of the YouTube video and then interface with it programmatically. And I had all that code and knowledge of working with YouTube in that capacity, so I was like, oh, I could just take the code I had in Apollo and probably throw together pretty quickly a semi-functional YouTube app, just dedicated to the YouTube aspect of it. And then I was like, yeah, okay, I'll just do this. If nobody else downloads this other than me, that's fine. I just want it for myself. And funnily enough, a lot of people really felt the same way, insofar as they were bummed that there wasn't a YouTube app. And there was a lot of excitement there. So it pushed me through a lot of long nights to get this out for day one on the Vision Pro. But it was such a fun experience. It really felt like I was a kid in college again, at a hackathon or something, just coding all weekend to try to get something out by the deadline. And I had so much fun with it, and it continues to be fun. I'm just having a lot of fun working on it.
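For context, the iframe-embed approach Selig describes looks roughly like this in Swift: load YouTube's public IFrame Player API in a WKWebView, then drive the player by evaluating JavaScript. This is a minimal sketch, not Juno's actual code; the `player` variable and video ID are placeholders.

```swift
import WebKit

// Sketch: embed YouTube's web player and control it from native code.
// The HTML uses YouTube's public IFrame Player API; the "player" global
// and the video ID are illustrative placeholders.
final class EmbeddedYouTubePlayer {
    let webView = WKWebView()

    func load(videoID: String) {
        let html = """
        <div id="player"></div>
        <script src="https://www.youtube.com/iframe_api"></script>
        <script>
          var player;
          function onYouTubeIframeAPIReady() {
            player = new YT.Player('player', { videoId: '\(videoID)' });
          }
        </script>
        """
        webView.loadHTMLString(html, baseURL: URL(string: "https://www.youtube.com"))
    }

    // "There's not really an API per se": you just evaluate JavaScript
    // against the embedded page to drive the web player.
    func pause() {
        webView.evaluateJavaScript("player.pauseVideo();")
    }

    func seek(to seconds: Double) {
        webView.evaluateJavaScript("player.seekTo(\(seconds), true);")
    }
}
```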

[00:16:17.339] Kent Bye: Have you been in touch with YouTube at all?

[00:16:20.193] Christian Selig: No, it's funny. I find Google is a very quiet company, insofar as... Even with Apollo, which used the YouTube API... Juno doesn't use the YouTube API. And I say that not as in, oh, it goes around the API, but insofar as the way the iframe API works is you're just embedding a website, and you can call JavaScript commands to interface with the HTML components. So there's not really an API per se there. It's just the YouTube website. But Apollo did use the YouTube API. So if you were scrolling through r/videos, it could show the duration of the video inline, and I didn't want to load up a zillion web views every time for that. So Apollo made extensive use of the YouTube API for video durations and thumbnails, and I did have a YouTube API key. And I had a tweet about it somewhere, but Apollo used an absolute ton of YouTube API requests. It was something like 200 million a month, just through people using r/videos and whatnot. And despite that, despite Apollo using untold billions of API requests over the years, I never talked to Google once. There was never any outreach. Once every two or three years, I'd get a form that would be like, hey, are you still using the API, and in what way are you using it? And it was just a standard Google Doc that would be like, hey, could you tell us what your app is about? How do you use the YouTube API? And they'd be like, okay, cool. Keep going. So even with Apollo and billions of requests, I never heard from YouTube. So I think it's just not really on their radar. They've got bigger stuff going on over there, I guess. And that was kind of why, with Spotify and Netflix, I took their stances a bit more as, yeah, they're making a statement. They're saying, we've clashed with Apple a little publicly, and this is how we're taking a stand. Whereas YouTube and Google, I've never seen them clash that much publicly with Apple. Their stance seems a lot more like they're just maybe a little slower moving. I remember stuff like picture-in-picture playback in the YouTube iOS app. That took forever to get there. Dark mode support, split screen on iPad, all these features that a lot of developers shipped on day one, I think it took years for Google to do. I think they just take a much more slow and calculated approach. And they were probably just like, oh, Vision Pro looks cool, but how many people are going to be there on day one? We're not really in a rush to do it. So it just seems like me, and maybe the platform in general, are just kind of small potatoes to them. That's my read of the tea leaves, but I don't know as much as you do.

[00:18:38.404] Kent Bye: So maybe you can describe a little bit about what it was like for you to develop this on the simulator, and then the kinds of differences or gaps between what you're able to do in the simulator and what was different when you actually experienced this on the Apple Vision Pro for the first time.

[00:18:54.156] Christian Selig: Yeah, that's a great question. It's one of those things where I've never developed for a device where the simulator is so fundamentally different from testing on the device, and it makes sense. With an iPhone, you could theoretically develop an entire app and never test it on a device once. There'd be some stuff that would probably be suboptimal, but you'd probably be fine. But with the Vision Pro, I find that if you're taking the app seriously and you want it to do well, I almost don't see how you can't get a device. It's just so table stakes. And that's not to slight the simulator or anything. It's just so fundamentally... I don't know if it's possible to take a 2D representation of a 3D experience of moving your head with eye tracking and whatnot and make that a one-to-one analog. I don't think that's possible. So you kind of just need a device. There were just so many features where I'd be like, oh, this would be cool, or, I think it kind of works like this. Then you get the device and you'd be like, oh my gosh, that doesn't work at all. I had one feature carried over from Apollo, where people really liked being able to scrub anywhere in the video to go back and forward in time. So if you were just like, oh, what was that?, instead of trying to find a small playhead, you could just put your finger anywhere on the screen and swipe, and it would scrub you around to where you wanted to go. And I was like, okay, I'll add that to Juno, might as well. And that worked beautifully in the simulator, because it's almost like a touch mouse experience. And then you try it on the device, and it turns out, if you go to look at something and you're slightly on the screen, or you're crossing your eyes over it and you accidentally pinch, you're flying stuff around. It's a much more delicate environment, for lack of a better term. And it was just a terrible feature. So I ended up removing it in the 1.1 update and just made the video playhead better. Because fundamentally, it's pretty easy, it's a lot less labor on visionOS, just to look at something. What you gained from being able to scrub anywhere wasn't super big. So it's stuff like that. You need to test it on a device and be like, oh, this works, this doesn't, in order to fully understand what you're working with. And there's just a bunch of APIs in general as well that, depending on what you're doing with visionOS, might not even work in the simulator. Stuff like any of the tracking APIs. So you can attach stuff to certain parts of your body, so you could have something always floating beside you, or you could have hand tracking, or you could stick something in the world so it walks around and runs into stuff. But none of those features fundamentally work in the simulator. Those I find a little more curious that they don't work, because I know Meta's Quest simulator does have some of those features, and they don't work nearly as well as on the device, but it would be cool to have them, even if they didn't work super well, just so you could try them out. So for those things, if you wanted to build something where, I don't know, you had a virtual race car that could run around your apartment, you would, to my knowledge, physically need a device, because you try to call those APIs and they're unavailable.

But for a windowed experience, like a YouTube app or whatnot, you're pretty good in the simulator. It's just one of those things that you should probably test on a device if possible.
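To make the simulator limitation concrete: visionOS's ARKit data providers report whether they're supported in the current environment, so a gate like the following can keep device-only features from failing at runtime in the simulator. This is a simplified sketch using standard visionOS ARKit names, not code from Juno.

```swift
import ARKit

// Sketch: gating device-only visionOS features. HandTrackingProvider
// reports whether the current environment supports it, so the app can
// fall back gracefully in the simulator instead of failing at runtime.
func startHandTrackingIfAvailable() async {
    guard HandTrackingProvider.isSupported else {
        print("Hand tracking not supported here (e.g. the simulator)")
        return
    }
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    do {
        try await session.run([handTracking])
        for await update in handTracking.anchorUpdates {
            // React to hand anchor updates (pinch detection, etc.) here.
            _ = update
        }
    } catch {
        print("ARKit session failed: \(error)")
    }
}
```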

[00:21:44.438] Kent Bye: Yeah, I just had a chance to test the pre-release of Juno 1.1, and there were a lot of bugs I was facing that seem like they've been fixed. One thing that I noticed when I was trying to move the window is that I had accidentally grabbed the scrub line and then basically lost my place in the video. So I feel like there are things like that with the user interface of the little bar that's underneath the window: when I try to grab it, sometimes I might accidentally grab the scrubber and change the video position. So I feel like there's stuff like that in trying to de-conflict some of the native UI versus some of the things that are built within the app. But I feel like the 1.1 has fixed a lot of the biggest gripes that I had, which were the video accidentally scrubbing wrong or getting logged out. And there are more features for being able to control the volume better or to change the resolution of the video. So there's a lot of improvements that you've made, but it's still in the early phases, and I think you'll likely be hearing feedback in terms of what is actually going to be best. 'Cause we're very used to having the scrubber at the bottom, but it's so close to the other bar that it can conflict. That's an interesting one.

[00:22:52.617] Christian Selig: Yeah, that's a good point. I've seen some apps almost detach the video player controls completely, like Apple's does, I think, when you go into the fully immersive mode for the theater experiences. But it's interesting. It's one of those things where it feels like such a wild West right now, where you're trying things and seeing what works, and you're getting feedback from users and seeing, okay, maybe that doesn't work at all. I've been trying new things and seeing how that goes. It's really exciting as a developer in that way, being able to be at the forefront of all these experiments and seeing what works and what doesn't.

[00:23:22.043] Kent Bye: I think one of the things when it comes to watching videos is that a lot of the advertisement that Apple has been doing for the Apple Vision Pro has people within the context of Apple TV, where you can have this really huge giant screen that has reflections at Mount Hood. But when you have any other app that tries to do that, it's locked out from having that big of a screen. That seems like an odd choice that Apple has made, because you kind of expect, hey, I want to have this big theater view just like you see with Apple TV, but it ends up being a lot smaller. It's capped in a way that seems surprising.

[00:23:55.695] Christian Selig: Yeah, I question that too. I feel like it must be a function of them just not getting the APIs out in time for that, maybe, because I can't imagine it's licensing or something, like Mount Hood doesn't want third-party developers using its trees. But it's interesting insofar as you can, as a developer, build that in from scratch. I could go to Mount Hood, and I don't even know how you would, but get a 360 camera and 3D map the area, and then you could open an immersive space and place your video player really far out. And that's something I'm looking into the feasibility of, maybe not quite going to Mount Hood and videoing it myself, but having theater-like experiences. But it's unfortunate that you effectively have to implement it from scratch as a developer rather than being able to leverage it. That happens a lot with Apple tech stuff. I remember, on iOS, for the longest time, if you wanted a dark mode, you'd have to implement it completely from scratch yourself as a developer. And then one day Apple was just like, okay, here's an API to do it super easily. And then it was so much better. Not everyone had to have 700 different implementations of how they did it. It was just one standard way that was really quick to implement. And I'm hoping... It would be really cool if, come June when WWDC is around, Apple says, hey, for third-party apps, you can use these environments in this way, and you can use the theater mode and whatnot. Because yeah, it's a little curious that you're not able to right now.

[00:25:15.700] Kent Bye: One of the things that I've noticed in using the Apple Vision Pro is that, even though I do the Voices of VR podcast and have covered a lot of fully immersive VR for the last decade, when you go into fully immersive mode with an Apple Vision Pro, it blocks you out from doing anything else. You can't do that type of multitasking. So it has to be this decision: if you do enter into immersive mode, then you can no longer have this mixing and mashing of these applications together. I feel like the YouTube app is great to use in combination with other things, but there are apps where it feels like, okay, I'm entering an immersive mode, and now all of a sudden I can't have anything else happening. So I'm curious to hear some of your thoughts on just using the Apple Vision Pro, and some of that experience of the mixed reality portions versus moving into fully immersive modes.

[00:25:59.952] Christian Selig: It's such a curious one, because part of me is like, I 100% agree with you. And part of me could also see it, again, being something that just didn't quite make it to the finish line for the 1.0 of visionOS. And I can also see... I feel like it would be more technologically difficult for Apple than it might immediately seem, in that I think it makes it a lot easier from a programming standpoint to assume that when you're in the immersive mode, it's only your app. For instance, one thing I want to do for Juno, which a lot of people have requested, is that if I'm walking around my house vacuuming or something, I'd like to be able to keep a YouTube video with me, not stuck in my living room where I have to keep grabbing and pulling it around with me. You can effectively do that if you use the head tracking API: you just pin it to your head, and then as your head moves, hopefully alongside the rest of your body, it would also move the window. And that's great. But as you said, as soon as you activate that, all the other apps go away. So it's a little bit of a trade-off, which I think is unfortunate. But then you get into the questions of, okay, if Apple were to make it so that multiple apps could stay open at that time, what happens when you pin seven different apps to your head? Are they all running into each other? I can see why, from Apple's perspective, it would be easier to say, okay, there can only be your app, and if it's conflicting with another window of your own app, that's a you problem, and similar stuff. But it definitely feels very limiting. And I'd love to see Apple, if it's for 2.0, whenever that may be, software-wise, not hardware-wise, take a swing at that. Because it definitely feels like there's this big trade-off with immersive mode. For me, one of the coolest things about visionOS is the multitasking ability. And to lose that to use one of the other coolest parts of visionOS kind of feels like, I want my cake and I want to eat it too. And yeah, I'd love to see that change.
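The head-pinning approach he's describing maps onto RealityKit's head-target anchors, which are only available inside an immersive space, which is exactly why every other app disappears. Here's a minimal sketch under those assumptions, with a hypothetical "player" attachment standing in for the video window:

```swift
import SwiftUI
import RealityKit

// Sketch: a head-target AnchorEntity inside a RealityView, so content
// follows the wearer's head. Requires an ImmersiveSpace, hence the
// trade-off discussed above. The "player" attachment is a hypothetical
// stand-in for a real video player view.
struct FollowingPlayerView: View {
    var body: some View {
        RealityView { content, attachments in
            let headAnchor = AnchorEntity(.head)
            if let player = attachments.entity(for: "player") {
                player.position = [0, 0, -1.0]  // about a meter in front of you
                headAnchor.addChild(player)
            }
            content.add(headAnchor)
        } attachments: {
            Attachment(id: "player") {
                Text("Video player UI would go here")
            }
        }
    }
}
```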

[00:27:41.550] Kent Bye: Yeah, you had mentioned that you had enough downloads to pay for the headset that you had bought. And I know Tom Ffiske of the Immersive Wire had done a whole survey of different developers, and in talking to a lot of developers, he was finding that there seemed to be a cap, where getting more than a thousand downloads in these early phases was really difficult. I don't know if that's because the App Store is really nascent and has a lot of room for improvement. It's difficult to just look at all the visionOS apps on my phone; I have to be in the headset to be able to see them. But I'd love to hear some of your reactions to how well Juno has done, and anything that you can share about how popular it has become.

[00:28:20.599] Christian Selig: Yeah, it's paid for multiple devices at this point. I've been really fortunate in how well it's done and how people have received it. It's a very nice feeling to see that people have enjoyed it that much. But yeah, I've totally heard the same thing, where it can be tricky to break into it and get past certain thresholds. And it's a tricky one, because it's also similar for the App Store, the iPhone App Store, fundamentally. I've known developers who are multiple times smarter and more talented than me build apps that I could drool over, and that fundamentally don't get that many downloads, for whatever reason. And it's just kind of like, why is that? And maybe there's a little bit of that here with the visionOS store, where discovery, I find, in 2024 is such a big part of it. How your app is discovered is key. That needs to be as much of the equation as how good the quality of the app is, essentially. If you had a real stinker of an app that was marketed super well, it would undoubtedly get more downloads than the most polished app ever that was never posted about. And yeah, I don't know how to reconcile that. Because to be honest, I was very fortunate in that Juno did very well, and that's the only app I have on visionOS so far. So my only point of reference is this app that thankfully did pretty well because it had a good amount of news coverage. But at the same time, when you have 700 apps, or 1,000, or whatever it was at launch, that's a lot of competition amongst apps. If your app wasn't fortunate enough to get covered by a larger publication, you're suddenly competing with hundreds upon hundreds of other apps. And if you think about it as, you're at a farmer's market on Saturday and there's 700 stalls, it's fundamentally difficult to sell a lot of bananas or tomatoes or whatever you're selling, unless CNN comes over and it's just like, look at this guy's bananas. Which is tricky. And I honestly don't know the solution to it. It's a bit of, you've got to play that marketing game and try your hardest to make sure it's something that really resonates with people. But fundamentally, too, I think what's lost on a lot of people is that this is a small platform. Juno has done super well, but if it had gone viral, I hate that word, like if it got the level of news coverage that it did and it was an iPhone app, I probably could retire. But it's such a fundamentally smaller platform at this stage that I think you almost have to calibrate your expectations accordingly: success on this platform, getting a few hundred downloads, might actually be really awesome. 300 downloads might be the equivalent of 30,000 downloads on the iPhone. And that might be something to be really happy about, just because it is such a nascent new platform that people are still getting used to, getting devices for, and learning how to navigate and how they want to use. So there's a lot of balls to juggle there, I think.

[00:31:03.189] Kent Bye: Well, I know that as a developer, you got the Apple Vision Pro to develop an app for it, but I'm curious, how have you been using the Apple Vision Pro outside of your development?

[00:31:13.425] Christian Selig: Honestly, a great question. To be honest, not a ton, not out of any difficulty with the device, just because I went to the US to get it, and it was like a 10-hour drive, and I got back, and that was Sunday night, I want to say. And honestly, since then, I've just kind of been living in it, developing Juno, because all I can see are things like, oh, that drives me crazy, or, oh, that's a really good point about that feature not working super well, and wanting to fix it, and kind of feeling a duty to fix it as soon as possible. So I've been spending a ton of time just doing development work in it. And it's kind of just been a work machine for me so far. But as the 1.1 update comes out, and is hopefully approved soon, and it gets to more of a stage where I can say, okay, I can take a little bit of rest, the feedback has slowed down a little bit in terms of things that are driving people crazy, of which thankfully there haven't been too many, I can maybe sit down and watch a 3D movie or something and enjoy myself. But instead of that, it's mostly been... I've really enjoyed using it as a development environment. Sitting at Mount Hood or in Joshua Tree just coding for a few hours is really cool, especially when you're just sitting on your bed and it feels like you're somewhere else completely. I think that's just a really fun experience, and being able to have your Mastodon up on the side, scrolling through it, feels really cool. It's very futuristic. But yeah, I think I've only scratched the surface in terms of what I want to do with the device.

[00:32:27.363] Kent Bye: So you have been using it as a monitor replacement, at least for the development you've been doing inside of it, for sure.

[00:32:32.489] Christian Selig: Oh yeah, for sure. I definitely love that. It's one of those things where, if you're doing visionOS development, particularly on the device, it's very hard not to use that, insofar as you could use passthrough to see your proper monitor, but you're not seeing it one-to-one. It's a camera feed of your monitor, which, even though the cameras are really good, is going to be lower quality than just having the virtual screen in your reality and your debug app over on the side. When you're doing simulator work, of course, you can do it at your desk. You don't need the device. But I find, once you want to actually see how things look on the device, yeah, using the Mac Virtual Display mode is key. And it's a really great experience, honestly, thankfully. I'd love it if you could do more than one window, but that's a super small thing, I think.

[00:33:11.816] Kent Bye: Did you buy the $250 developer strap?

[00:33:15.054] Christian Selig: I didn't. Honestly, to my surprise too, since I haven't had a lot of luck doing wireless iPhone development; I found that very flaky, to be honest. But I've been doing the wireless development for Juno, and it's been rock solid so far. So I didn't want to spend hundreds of US dollars on a strap. If they were giving it out for free, I think I'd probably take it and play around with it. But honestly, I've been so fine with the quality of the wireless debugging that I'm happy sticking with that. And it would also be tricky for me, because they announced it after I got back to Canada, so again, I'd have to go back to the US to get it. And not only that, you need a US developer account too, so I'd need to pay another hundred US dollars to register for a year of the developer program. I guess I could get a friend to buy it for me, but there's a difficulty in acquiring it that also makes it suboptimal. But honestly, to any people who are like, oh my God, I need that to do anything because wireless debugging must suck: it's honestly fine. I'd give it a shot first. I will say, it definitely takes a bit longer when you compile and run for it to actually kick in and show your UI versus in the simulator. In the simulator, it's really quick: you run it, and it's almost immediately up and running. On the device, I find it sometimes takes 10 or 15 seconds for it to actually show up, which is a little unfortunate. But the app binary is so small that I don't think that's a function of wireless transfer speeds. I think it's probably just some provisioning process for the development environment on the actual device. So yeah, a long way of saying that so far, wireless debugging has been fine.

[00:34:42.517] Kent Bye: So it sounds like you're able to mirror your screen as a virtual screen, and then you have Mastodon up. Are there any other native apps that you're using to aid in your development?

[00:34:52.825] Christian Selig: That's a great question. So far, not really. I've mostly leaned on my Mac for the development side of things. I'll keep my tried-and-true software products that I use on the Mac just available, because it's effectively a 5K screen, so you get a lot of screen real estate there to keep up the windows that you're using. But so far, I've stuck with, yeah, Mastodon on one side and maybe the Spotify website on another side, and that's been pretty good to me so far. But there are a lot of cool apps. And it's not to say there aren't great development apps out there; I just haven't really ventured out to look.

[00:35:25.137] Kent Bye: And then you're able to have the Juno app on the side as well, so whenever you push new code, you're able to test it?

[00:35:30.401] Christian Selig: Yeah. It's a little annoying in that, the way visionOS 1.0 works, as a developer, you have no control over where your window spawns. So when the user launches your app, or even if you're like, hey, they clicked on a link, I want to open that in a separate window, you have no way to say, open that slightly off-kilter. It literally always launches where their eyes are positioned. And Apple seemingly doesn't have an API for that either, because the one thing that does drive me crazy about development is that when you compile and run, it goes smack dead in the middle of your eyes. So if you're coding, and you go to run it, and you're looking at Xcode, and it finally boots up, it completely takes over your vision, and you're like, oh, I was coding something. Okay. You drag it and move it over to there, and that's fine. But if it would remember that I dragged it over there, that would be great. So you get into this process where you run it, you wait five seconds, you know, okay, the app's probably about ready to launch, you look over into the middle of nowhere for five seconds for it to spawn the window there, then you can look back at Xcode. And it kind of becomes this fun little stretching routine almost every time you run your app. Whereas I wish you could just say, keep my app to the side of my Mac display, please. Maybe some people want it to take over, but I kind of like being able to see what I'm coding and what the result is at the same time. So it works, but it's a little janky.

[00:36:42.155] Kent Bye: I've had that experience with the Bluetooth keyboard, where whenever I type in a window, it'll have the preview of the typing pop up right in my face, and I'll have to move it to the side. I'm sure there's going to be stuff like that where, over time, it's going to get a little bit easier with a lot better window management, let's say.

[00:36:56.765] Christian Selig: For sure, I would hope so, because the Mac crushes it in window management, and just getting a little bit more of that fervor over there into visionOS, I think, would be really handy.

[00:37:06.559] Kent Bye: So one of the things I told Ben Lang from Road to VR is that it feels like Apple is starting into the more immersive space from a 2D paradigm, with all the insights that are coming from iOS, and then expanding all their accessibility options. Whereas something like the Quest 3 is coming from more of a 3D-first approach and having something that's more fully embodied: you have tracked controllers, you have a lot more embodied agency in the types of experiences. But with Apple, you're starting with this productivity focus and all the background of iOS. So as an iOS developer, I'd love to hear a little bit about your own reactions of coming from iOS development and then entering into something that's much more spatial and immersive.

[00:37:46.784] Christian Selig: That's a great question. I would say for me, it's really interesting, because I like both approaches. I have a Quest 3, and I remember using it for the first time, and kind of my instinct was... the fact that I had to pick up the controllers, and I guess you don't have to pick up the controllers, but there was such a focus on controllers. As someone who had never used VR before the Quest 3, that was not at all intuitive to me. I thought it would be very Ready Player One, where you'd put the thing on and you'd be doing the Minority Report stuff. And the fact that there was this emphasis on, okay, pick up these other gadgets, and this is how you interface with stuff, felt kludgy to me. So I was pretty excited to see that they had some form of the pinch and whatnot. It was pretty janky compared to Apple's, insofar as they have no eye tracking, so it's just dead center: whatever your head is physically pointed at is what's selected. And the reliability of the pinch detection and whatnot was a lot worse. So it was a weird experience for me, where I feel like, yeah, it kind of depends on where you're coming from, because the Vision Pro felt a lot more intuitive to me, as someone who had never used AR or VR, to just use as a productivity device and to do cool stuff with. But I imagine if you came at it from a, okay, that sounds nerdy, I want to play some games angle, the Quest, at this stage at least, would probably be a more compelling device, just because controllers, I assume, give you a lot more dexterity in fundamentally playing a game. Even if you're playing on an iPhone, it's this cool device, but ideally you want a controller, because the touchscreen controls aren't the greatest. And I imagine there's a similar analog with the Quest 3, where having those controllers gives it a lot more game focus. And not only that, just having been out for as long as the Quest and Oculus series of devices have been, I think that gives you a big leg up in terms of the software and the game catalog that's available. So yeah, I guess it depends where you're coming from. It feels like the Quest, at least for the near future, will be a more compelling game device, but I think the Vision Pro is a more compelling computing device that's a little bit more intuitive, I would say. But they're both definitely very, very cool devices.

[00:39:49.085] Kent Bye: One of the things I've noticed with the Apple Vision Pro is this thing that I mentioned earlier, which is that when you get into the fully immersive mode, everything gets taken over. And I feel like the Quest has sort of got that mindset, where it's really only running one app at a time most of the time. Whereas with Apple, it feels like it's more about being able to run multiple apps at the same time. Even with the Quest, you have a browser and you can have three browser windows, but it's difficult to have anything else also running at the same time. With the Apple Vision Pro, it's starting from this, okay, we're going to have this multi-app environment. But there are ways that they're interacting sometimes... I know that with iOS and iPadOS, it's like you have one audio source that's playing across the entire ecosystem, and there's a lot of situations where that audio gets ducked out. Have you found that in developing Juno? Because you can't have Spotify playing at the same time as a YouTube video, at least that's been my experience: it doesn't mix any of the audio feeds, and so you can only have one audio source at the same time.

[00:40:47.841] Christian Selig: So you totally can, as a developer, enable that. You have AVAudioSession, which is the audio API that powers all of iOS, and you can say, look, for my app, I'd like you to either mix with others or duck out the other audio sources. I think a lot of people are just, like you said, coming from the iOS perspective, porting over their iOS apps and keeping that existing methodology or mindset of, as soon as anything plays in my app, I want to stop the other apps, because it's such a modal experience. For Juno, it's a little unique, because it's fundamentally a web view, kind of re-tailored to be a native experience. The API that powers Juno, which is called WKWebView, which is basically just a little WebKit view, runs out of process, and that means it has its own audio session. And for whatever reason, and I tried to bug someone at Apple with a report about this, it completely isolates its audio session, and there's no way for a developer to even say, hey, can you just mix that with other apps? It just goes into kind of the dumb iPhone mode of, oh, did you play a sound effect? Kill every other app's audio session. And it drives me crazy, because I was sitting there like, yeah, it would be nice to be able to put a YouTube video on low audio and have something else playing, but no matter what I did... Sure enough, you look it up, and it's like, yeah, it's running out of process, you have no control over that, Apple doesn't give you anything. And it's one of those things where it feels like just an oversight that hopefully they'll address. In a lot of these visionOS 1.1 updates, they're addressing a lot of the community concerns. So hopefully stuff like that is just a matter of time and growing pains. But yeah, it's a little tricky now.
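For audio an app plays itself, the mixing behavior he mentions is a short AVAudioSession configuration; here's a minimal sketch, with the caveat, per his report, that audio inside an out-of-process WKWebView ignores it:

```swift
import AVFoundation

// Sketch: the AVAudioSession options being described. This works for
// audio the app plays itself; Christian's point is that audio inside an
// out-of-process WKWebView ignores it, which a developer can't override.
func configureMixableAudio() throws {
    let session = AVAudioSession.sharedInstance()
    // .mixWithOthers lets other apps keep playing;
    // .duckOthers would lower their volume instead.
    try session.setCategory(.playback, options: [.mixWithOthers])
    try session.setActive(true)
}
```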

[00:42:15.827] Kent Bye: Yeah, well, I'd say the Quest is even more so, like you can only run one thing at a time, versus at least having more apps running. The AV audio session behavior that you're talking about is one thing I've also found annoying. Even when I'm asking Siri something, it will duck out the audio, and I'm like, no, ah, that's so...

[00:42:32.245] Christian Selig: You don't need to do that. Yeah, that's funny.

[00:42:35.273] Kent Bye: But I guess when you think about the code base of what you're developing, one of the other things is that most of the development that's happening on the Quest is coming out of engines like Unity or Unreal Engine. There's no real native framework that Meta has for developing apps. Whereas Apple has their own operating system and their own frameworks. So if you were to describe the code base, from the percentage of Swift to the percentage of JavaScript, what is the mix of all the different languages that you're using to create Juno?

[00:43:02.976] Christian Selig: That's another good question. I'd say for Juno specifically, it's probably 50% Swift, 50% JavaScript. There's a lot of Swift, but there's also a lot of JavaScript interacting with the YouTube website, and CSS for restyling some things to make them a little bit more visionOS-esque. So yeah, a lot of both. But it's also one of those things where, if you were doing a completely different kind of app, like that race car one I was describing, where it's going around in 3D space, that would be completely different altogether, where you might be using Unity or something, or, no, C# over there. You might be using something completely different, because they have PolySpatial for building visionOS apps, where that might be a better experience, or you might again be back to using Swift with RealityKit. You can be using all sorts of different technologies depending on the kind of app you want to build on visionOS, and it's pretty cool in that way. There's lots of options, but for Juno, it's definitely a lot of Swift and a lot of JavaScript and CSS.
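As a rough illustration of that Swift-plus-JavaScript-plus-CSS mix, here's a minimal sketch of injecting a stylesheet into an embedded page with WKUserScript. The CSS rule is purely illustrative, not YouTube's actual markup or Juno's actual styling:

```swift
import WebKit

// Sketch: inject a stylesheet into the embedded page when it loads,
// the general technique for restyling a web view to feel more native.
func makeRestyledWebView() -> WKWebView {
    let css = "video { border-radius: 12px; }"  // illustrative rule only
    let js = """
    const style = document.createElement('style');
    style.textContent = `\(css)`;
    document.head.appendChild(style);
    """
    let script = WKUserScript(source: js,
                              injectionTime: .atDocumentEnd,
                              forMainFrameOnly: true)
    let config = WKWebViewConfiguration()
    config.userContentController.addUserScript(script)
    return WKWebView(frame: .zero, configuration: config)
}
```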

[00:43:57.280] Kent Bye: Well, right now the Juno app is very 2D, and you're in a spatial context. Have you thought about what kind of 3D UI or other elements could give it a little bit more of a spatial experience?

[00:44:09.611] Christian Selig: A little bit. Yeah, I think it's interesting how to do it in a way that's done elegantly, insofar as I think it's really easy to get carried away and be like, everything's 3D, the buttons are 3D, this is 3D, this is floating over your head. And fundamentally, you're like, okay, cool, I just want to watch a video and have the rest disappear, and you're being very in my face. So I think it's about balancing the novelty of new capabilities with still having a fundamental user experience that's good and easy for the user, one that disappears and lets the content you're trying to show shine through more so than the UI. It's tricky, and you want to go buck wild and make everything crazy, but there's a certain amount of restraint you have to exercise, I think. But on the flip side, I think there totally are some areas where 3D can really make an app fundamentally better and more profound. It's stuff like those theater experiences and whatnot, where you can have a really big screen or have something that surrounds you in some capacity. And that's something that I do want to look into more: leveraging 3D where it makes sense and where you can do some really cool things with it. Because I think for a video playing app, there's definitely some stuff there you could do that would leverage the 3D in a way that wouldn't feel ostentatious to the user.
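For a concrete sense of what "leveraging 3D where it makes sense" can look like on visionOS, here is a minimal hedged sketch of an app that keeps its ordinary window but offers an optional immersive theater scene. All names and views here are hypothetical, not Juno's:

import SwiftUI

// A sketch, not Juno's code: a visionOS app offering an optional
// immersive "theater" scene alongside its ordinary window.
@main
struct HypotheticalVideoApp: App {
    var body: some Scene {
        WindowGroup {
            HomeView()
        }
        ImmersiveSpace(id: "theater") {
            TheaterView()
        }
    }
}

struct HomeView: View {
    // Environment action for opening the immersive scene by its id.
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter theater") {
            Task { await openImmersiveSpace(id: "theater") }
        }
    }
}

struct TheaterView: View {
    var body: some View {
        // A real app would place a large video screen here, for
        // example with RealityKit; a placeholder keeps this short.
        Text("Theater placeholder")
    }
}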

[00:45:22.549] Kent Bye: I've definitely, on my computer, sometimes had a video that I'm watching and then another video that I'm listening to. So have you thought about whether it would even be technically feasible to have two videos playing at the same time within Juno, or is that something that people would even want?

[00:45:39.191] Christian Selig: Yeah, there's been a few requests for that. It's one of those things that I think would be cool: you could have two videos you're interested in, but not necessarily super focused on, that you can watch at the same time. And I think that would be cool to add. I admittedly haven't had a ton of time to look at the feasibility of it. If I had to guess, one of them, just because of that audio limitation I mentioned earlier, would have to be muted in order for it to not pause the other player. I had an issue where playing a sound effect when you interact with a UI component would pause the video, and I had no way around it until I changed the app's audio session to mix with others. If you have two web views that both run in independent processes that don't know about each other, I feel like they'd just be fighting each other on and off: you'd pause one, it would play the other, you'd play that one and it would pause the other, back and forth. Unless you muted one, in which case I would assume its audio session would basically be killed, and then you could do essentially whatever you wanted. And that's probably not even much of a downside, because I can't imagine you necessarily want somebody reviewing an iPhone and somebody reviewing an iPad shouting at you concurrently. That sounds kind of stressful. So it's probably not that bad to just have one versus the other. But it's something I definitely want to look into.
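One hedged way to mute one of two players, given that WKWebView's media runs out of process, would be to flip the muted flag on the page's video elements from Swift. This assumes the page uses standard HTML video elements and is only a sketch, not something Juno is confirmed to do:

import WebKit

// A sketch: mute any <video> elements in a web view so a second
// player can hold the audio session without the two fighting.
func muteVideos(in webView: WKWebView) {
    let js = "document.querySelectorAll('video').forEach(v => { v.muted = true; });"
    webView.evaluateJavaScript(js) { _, error in
        if let error = error {
            print("Mute script failed: \(error)")
        }
    }
}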

[00:46:49.348] Kent Bye: I was able to hack around that by just going to the website and pulling up multiple videos. So it is possible to do now, but you have to kind of suffer through the user interface of YouTube's website, which ends up being super small a lot of the time and difficult to click on easily. Which I think is one of the great benefits of Juno: its user interface is a lot more optimized for eye tracking, so you don't have those types of conflicts.

[00:47:12.534] Christian Selig: Well, yeah, it was funny. Even just using the visionOS simulator with the YouTube website in Safari, I was clicking around with a mouse and trackpad and being like, oh, this isn't great, it's kind of hard to be precise. And then you actually use the device and you're like, oh shoot, I was using a mouse and trackpad and it wasn't great even then. Once you introduce your eyes, which have a much lower degree of precision (they're still precise, but not nearly as precise as a mouse pointer), it was fundamentally worse. And I was like, oh shit, it is nice to have these native controls where Apple put a lot of work into making them really compatible with eye tracking and whatnot.

[00:47:41.818] Kent Bye: Yeah, so when I listen to some commentators talk about the Apple ecosystem, they're saying that visionOS and the Apple Vision Pro is the first major platform Apple has launched since the Apple Watch. So I'm curious to hear a little bit about what it has been like so far to develop for the Apple Vision Pro.

[00:47:58.947] Christian Selig: Well, fundamentally, it's really exciting being there on day one of a product launch. It kind of feels like, I don't know, it's corny, but like you're part of this Apple experience of launching the product. The iPhone is an incredibly sophisticated piece of hardware and software, but for a lot of people, it's the apps they use that matter; if those disappeared, they wouldn't care for the thing at all. So knowing that the experiences you create through your software are such a big reason why people enjoy the device, and being able to be there for day one, even if the product was a total flop and turd, which I don't think it is by any measure, it's just really exciting to be a developer and share in that day one excitement of a launch. It's fun just going to the Apple Store and seeing everyone so excited about this new product, all talking to each other about what you can do with it and sharing that excitement. It's a little bit like going to a new Avengers movie launch or something where everyone's jazzed about it. There's that energy that's really fun to be a part of. So even that aside, for me, even if it didn't pay for itself, I think it was just a really fun experience to be able to be a part of.

[00:49:00.944] Kent Bye: And so what do you want to experience on the Apple Vision Pro?

[00:49:04.422] Christian Selig: Ooh, that's a profound question. I tend to go into things with very small expectations, because I just find that makes me enjoy them a bit better. For now, I'm kind of just excited to see where it goes and where people take it as far as software and experiences go. I don't necessarily have a bar of, if it does this, it's going to be a success for me. It's more that this just seems like such a blank canvas for technology and futuristic applications to exist, and I'm just really excited to have the capability to experience it and be there alongside this potentially profound computing movement. It could be a complete flop, and 10 years from now we'll be looking back like, oh, remember when they tried to do all that AR, VR stuff? I hope that's not the case, but it kind of feels like we're in this age now where it's almost like a space race, where everyone's trying to do this really cool technology thing, competing against each other, excited about this future, and working really hard. It feels a little like that in the AR/VR space, where there's all this movement and excitement for this possible, completely different way of looking at computing. And just being along for the ride is really exciting in and of itself, even if it ultimately doesn't go anywhere. But, I don't know, the space race went somewhere, so I think this will.

[00:50:19.355] Kent Bye: Awesome. And what do you think the ultimate potential of spatial computing might be and what it might be able to enable?

[00:50:26.690] Christian Selig: Yeah, I'll give another cop-out answer, which is that I really don't know. It feels a little bit like trying to predict where the iPhone would go back in 2007; I don't feel like I have any likelihood of success in predicting where it'll go. But the obvious answer is it getting smaller and smaller and cheaper and cheaper, to the extent that it's less obtrusive and more accessible to a wider audience of people. And I think I'd like to see it become more of a shared experience, if only insofar as it gets cheaper to use. Right now, showing my girlfriend something that you built is a bit of an arduous experience: you transfer the headset over, her interpupillary distance has to be recalibrated, and if she had a different prescription than me, all this stuff. Whereas if it was cheaper and less bulky, you could potentially just get two, hopefully for the price of half of one now, or there could just be an easier way to transfer it around. Having more of a shared experience I think would be really cool. But yeah, I guess I'm just excited to see not what I think it will become, but what it could become, insofar as computers have been so 2D up until now that just the concept of breaking into this other dimension is so ripe for opportunities. I'm going into it like going on a roller coaster that you've been told is really fun, but you don't know any of the flips and flops and upside downs to come. You're just like, man, I'm just excited to be on this roller coaster that everyone seems to really enjoy.

[00:51:49.203] Kent Bye: Yeah, I think there's a natural comparison to the iPhone, because the new human-computer interface of the touchscreen was such a revelation, starting the paradigm of mobile computing, and the human-computer interaction with eye tracking and hand gestures feels like a similar type of innovation that's going to change the way we do computing. But I've also heard a lot of people saying that the Apple Vision Pro is more analogous to the Macintosh than the iPhone.

[00:52:15.395] Christian Selig: Interesting. Yeah, yeah. Or it's more of a full-fledged computing platform, I guess, in a sense, and maybe more ambitious in becoming the next way of doing computing. Yeah, that's pretty cool. I definitely see them occupying a similar spot, insofar as sometimes it seems a little silly what you're running: I have an M1 Mac and I'm projecting it with this virtual display onto this computer running an M2 processor. The thing powering this external display is probably fundamentally more powerful than the actual thing it's projecting. So it feels like there's a possibility to combine them there, and you'd get this really cool experience that's more portable and more powerful and, I keep using the word, more profound to use in this 3D space. It could be a zillion things. I'm kind of excited to see what it materializes as.

[00:53:01.617] Kent Bye: Right. And is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[00:53:06.302] Christian Selig: Um, that's a really good question. No, I would just say, I think there's a lot of skeptics out there, and there's nothing wrong with that, it's good to be skeptical about things. But I think it's also fun to be excited about new things and see them for what they could be. I think it's just a really exciting time for computing, and if you have the means, I encourage you to get out there and try a device of your own and see how you feel about it. And if you feel any passion there, and if you have the capabilities or you'd like to learn, build something for it, because it's a really fun, different experience to develop for that I think a lot of people will enjoy.

[00:53:39.148] Kent Bye: Awesome. You have the 1.0 version of Juno. You've submitted the 1.1. Do you have a sense for how long it's going to take for them to approve it?

[00:53:47.702] Christian Selig: That's a great question. From what I can tell, it's been all over the place with people who have submitted updates. The initial one took quite a while to be approved, and I've seen some people also have that experience. Updates typically go a little faster, but they also seem like they're a little swamped, maybe because I imagine all the reviewers need a Vision Pro, so there's probably a human scalability element to it. So it feels like they're a little behind, but hopefully within a day or two.

[00:54:11.425] Kent Bye: Okay, so we should be expecting the 1.1 version of Juno here soon. In my experience, it's fixing a lot of the bugs that I had issues with. And it sounds like you're going to continue to develop it and expand it even more as you get more feedback. It was the first app that I bought because I felt like, okay, this is something that I want; like yourself, I use YouTube all the time. And I feel like the way it's able to use the native interface is just a lot more elegant than trying to go through the Safari website. So that's the goal. Thank you. Yeah, thanks so much for joining me here on the podcast to help share a little bit more of your journey into developing for visionOS and the Apple Vision Pro. I highly encourage folks to go check out the Juno app; I think it gives a much more native experience for YouTube. And thanks for taking the time to share a little bit more about your story.

[00:54:57.081] Christian Selig: Thank you. It genuinely was a pleasure to be here. So thank you so much.

[00:55:00.962] Kent Bye: So that was Christian Selig. He's the developer of the Juno for YouTube visionOS application. So I have a number of different takeaways from this interview. First of all, it was super fascinating just to hear the process of what it was like to develop on the simulator. A lot of the very early apps that were out on launch day were likely developed completely on the simulator, because not very many people had access to the hardware. So he saw that there were a ton of different changes that needed to happen, and the 1.1 release actually came out later in the day after I interviewed him yesterday, so the 1.1 version is now out and you can go check that out. There are just a lot of nuances to the operating system that Apple's been able to build. I think it's worth noting that he was able to build this completely using Swift and JavaScript and CSS, so he was able to take his existing iOS knowledge and start to build visionOS applications. Now, this is very much a windowed application, and I'll be curious to dive much more into some of these other applications and see some of the more spatial features; I'll be digging into much more of that as I move forward. I wanted to start here because this is actually the first app that I decided to buy: there wasn't any native YouTube integration, Safari wasn't a great experience for YouTube, and I just wanted to easily pull up videos and have some of the aspects of my account. Not all the features are there. Watch history, for example: it doesn't actually record what you watch into your watch history. So there are little quirks like that where it isn't totally equivalent to what a native application would provide, but I think it gives you most of the core functionality you'd get in a native application. There's also the whole paradigm of human-computer interfaces and how much easier it is to use this application given some of the optimizations for the Apple Vision Pro. Just like there's been responsive design for going from desktop and laptop to a mobile phone, there's going to be an equivalent kind of responsive design going from 2D to 3D. In this case, I think that's starting with a lot of the native applications, but I'll be very curious in the future whether, if you pull something up in Safari, you'll start to see some of these integrations natively in a web view. The other thing is just reflecting on how, as I'm working in the context of these mixed reality spatial computing paradigms, the fully immersive VR piece completely shuts out all the other applications, so you won't be able to run them at the same time. That's usually how it is with existing VR headsets; either with the Meta Quest or with PC VR, there isn't a lot of mixing and mashing of these things together. I know that some early efforts like Exokit as well as Pluto VR were doing some of these multi-app experimentations, and they found it was super compelling, but the operating system and everything on the Quest is really optimized for running single applications at a time. And this could get down to some of the core differences between what they're able to do on the mobile chipset with the Qualcomm XR2 Gen 2 versus what Apple's able to do with the M2 chip and the R1 chip: just a lot more processing power to potentially run these multi-app environments.
Both Mark Zuckerberg and CTO Andrew Bosworth, aka Boz, have come out over the last couple of days making the argument that the Quest is the better product for mixed reality, period. But there's so much warping that happens in the mixed reality views of the Meta Quest versus the Apple Vision Pro; especially with close, near-field objects you get huge amounts of warping, whereas you have much better point-of-view correction with an Apple Vision Pro. As for the price differences, in terms of value you probably undoubtedly get more value out of the Quest, but I'd say there's so much less of a software ecosystem. This is something that Ben Lang and I talked about: the Apple ecosystem just has so many more resources, with over a million compatible iOS apps and around 600 native visionOS apps at launch. They also have the operating system, where you're actually able to build much more tightly integrated and lightweight applications, whereas on the Quest you end up having everything wrapped within either a Unity or Unreal Engine wrapper, which has much more of a game engine construct and a lot more stuff that may not even be used by some of these applications. There's also a huge difference between how Apple's been curating their ecosystem and how Meta's been curating theirs, with only like 620 or 630 apps that have launched on the main store, and everything else relegated to App Lab, where things get lost, are very difficult to find, and are basically treated as second-class citizens. So even though there are probably thousands of applications that have been launched on App Lab, they're just not very discoverable, and you kind of have to know about them in order to find them. In terms of the software ecosystem, Meta has taken much more of a curated approach and Apple's taking much more of an open approach, even though Zuckerberg is saying that they're much more of an open platform. So certainly there are a lot of things to continue to unpack with these comparisons between the Apple Vision Pro and the Quest 3. I think there are going to be a lot of people looking at this and asking what they actually need. In some cases, the Quest 3 will be perfectly fine; in other cases, you'll probably want to dive into the Apple Vision Pro. If you're a developer, I think it makes much more sense to get the device and see what you can do with all the affordances of this tightly integrated eye tracking and hand gestures that's just super smooth. Overall, I think it's still very early, both for the software ecosystem with Apple and for how native integrations are tied into their system. When it comes to accessibility, I think there's just no question that Apple is by far the superior device. If you want an accessible device, there are a lot more options built into the core operating system of the Apple Vision Pro that have really been prioritized, whereas the Meta Quest is hardly accessible. So depending on what you're using it for, there are going to be things that are just way better on one or the other. I'm going to be continuing to dive into different applications and explore some of the productivity of screen sharing. I think there are more and more options to do screen sharing in the context of my Windows machine; I don't have a Mac, so I haven't been able to test out some of those native screen sharing capabilities.
I'm going to start diving into both Steam Link and Moonlight to try out some of these different screen sharing applications that are out there and see how they work, and start to compare that to the Quest 3 screen sharing capabilities as well, since the last time I really tried out some of those capabilities was back on the Quest 2. I know that with a bump in resolution there may be a little more streamlined integration, especially when you have access to things like Virtual Desktop, which is still in the process of trying to get native integrations built for the Apple Vision Pro. So stay tuned for some of these screen sharing explorations that I hope to be diving into, as well as featuring a little bit more of the other developers that have created applications for the Apple Vision Pro. So that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.

