visionOS Pathway

Pathways are simple and easy-to-navigate collections of the videos, documentation, and resources you’ll need to start building great apps and games. They’re the perfect place to begin your Apple developer journey — all you need is a Mac and an idea.

Looks like Apple launched Pathways for developers a couple of weeks ago.

A great-looking resource to guide developers through the treasure trove of WWDC content, sample code and projects.

Great timing now that I've got the hardware - currently dancing around in the visionOS Pathway while working on a little game.

First Vision Pro Experience

Wow.

I've only been working with it for an hour or so, but my word - the simulator only gets you so far; once you experience the actual hardware.. it's like nothing else.

The single greatest leap of technology I have experienced first-hand - probably in my life to date.

Somewhere over in California and most likely spread around the world, an incredible team of people have achieved something amazing here. Well done to all of you, drop the mic.

The hardware still isn't available for sale in the UK, but I finally managed to get my hands on one.

So we're back from a little hiatus - I was renovating a house for six months 😅 - now that I've got the hardware, it will get a lot more interesting on here.

Developing on the simulator is one thing, but you truly need the actual hardware to do your best work on the software.

Let the journey begin!

Awesome visionOS Links

I recommend taking a look at this awesome public repository on GitHub from stevenpaulhoward.

It's packed full of visionOS links: sample projects, articles, some tutorials and useful people to follow in the community. It's a great resource - this site could use something similar soon!

Unity Beta Program for visionOS

Unity announced their beta program for visionOS developers. The announcement was a couple of weeks ago but well worth a mention in case you missed it. You can register for access here - there seems to be a bit of a wait, so perhaps the sooner the better.

Apple and Unity are obviously working in close partnership, so it's wise to keep in touch with what is on offer here as it develops. Unity are pitching this for apps as well as games on visionOS, so there should be something for everyone, and there's a really good explainer on their site if you want to learn more.

Vision Pro Developer Kits Announced

Apple have announced that visionOS compatibility evaluations, developer labs and developer kits are now available.

There is a short form to apply for the developer kit hardware on loan - if you can make a valid use case, throw your hat in the ring. Labs are open for booking requests on various dates through August, with locations so far in Cupertino, London, Singapore, Munich, Shanghai and Tokyo. The labs need details of a specific app you want to test on a device, and the compatibility checks on offer also require a specific app. There's a checklist to run through before submitting, presumably to make sure requests are valid and to manage resources, and you need to select the app from a list of what's on your Developer/App Store Connect account too.

First Impressions

I fired up visionOS SDK 1 (the 1 is cute, right?) in the latest version of Xcode - so much fun to be had in here. It really hits home when you start to play around with it that this really is a whole new platform; it's no toy, no gimmick.

You can immediately start making things with SwiftUI, without even really thinking about it, straight from File > New > Project. Serious work has gone into this and it really shows. It shines. This already feels like home.. but home is now a spaceship ..with a whole universe to explore.
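For a sense of how little ceremony is involved, here's a minimal sketch of the kind of starting point you get - just plain SwiftUI in a window. The names here are mine for illustration, not from the actual template:

```swift
import SwiftUI

// A minimal sketch of a visionOS starting point: ordinary SwiftUI,
// presented in a window in your space. Names are illustrative.
@main
struct SketchApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        VStack(spacing: 20) {
            Text("Hello, visionOS")
                .font(.largeTitle)
            Text("Plain SwiftUI, floating in your space.")
        }
        .padding(40)
    }
}
```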

The first app I made was placing spheres in the room.. it's a whole new debugging experience when you're panning and orbiting around a 3D kitchen looking for your missing models! Fun though. So much to learn here, but a fortunate time to be able to join a new platform again, right from day one.
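For the curious, that experiment boils down to something like this - a rough sketch rather than the exact code I wrote, with arbitrary sizes, colours and positions:

```swift
import SwiftUI
import RealityKit

// A rough sketch of the "spheres in the room" experiment: a RealityView
// that places a few simple sphere entities in front of the viewer.
// Sizes, colours and positions here are arbitrary.
struct SpheresView: View {
    var body: some View {
        RealityView { content in
            for index in 0..<3 {
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .cyan, isMetallic: true)]
                )
                // Spread the spheres out in front of the viewer (units are metres).
                sphere.position = [Float(index) * 0.3 - 0.3, 1.2, -1.0]
                content.add(sphere)
            }
        }
    }
}
```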

Performance-wise, no complaints here. Not bad for a beta. Everything seems pretty snappy so far. Even the previews are fast, and I was definitely not expecting that. Obviously I haven't pushed the envelope much. The first time I played with an app in the simulator, using just my laptop (M2 MBA), things got very hot indeed. As soon as I switched to running on the Studio Display though, no heat, even with the clamshell closed.

Reality Composer Pro is bundled in with this Xcode now; I was excited to try that and it did not disappoint. I think that app is going to be a bit of a hero - super simple to get started with as well. Within a few minutes I was building worlds with primitive shapes, adding materials and tinkering with particle emitters. Without a glance at the docs - bravo.
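If you're wondering how that content then makes it into an app, the pattern looks roughly like this - assuming the template's RealityKitContent package and a scene I've called "Scene" here for illustration:

```swift
import SwiftUI
import RealityKit
import RealityKitContent // the package the visionOS template generates for Reality Composer Pro content

// A sketch of pulling a Reality Composer Pro scene into the app:
// load the named entity from the content bundle and hand it to a
// RealityView. "Scene" is whatever you named the scene in the tool.
struct WorldView: View {
    var body: some View {
        RealityView { content in
            if let world = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(world)
            }
        }
    }
}
```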

When you do look around the documentation for visionOS, it's really good, and there's a lot of it. The sample projects are good too - you can immediately get your hands on some code to run and play with, which is what we all want. Hopefully they will trickle out some more projects as the year rolls on.

One of the projects is a game, 'Happy Beam' - it's demo software, granted, but there's a lot going on. It serves as a good reminder that the 3D wonderlands come with maths, often lots of maths. Great things will be possible with great minds on that maths. I've never had the head for the matrix math myself - I was reminded of an OpenGL course I tried my hand at back in the Cocos2D days, oof. I expect there will be lots of open source projects eventually to help folks with this stuff - animating model transforms, custom hand gestures etc - all for us to learn!
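As a tiny taste of the 'animating model transforms' theme - generic RealityKit, nothing from the Happy Beam sample - moving an entity can be as simple as handing RealityKit a target transform:

```swift
import RealityKit
import simd

// A tiny, generic example: lift an entity by 25 cm, give it a quarter
// turn around the y axis, and let RealityKit animate the move over
// two seconds.
func nudge(_ entity: Entity) {
    var transform = entity.transform
    transform.translation.y += 0.25
    transform.rotation = simd_quatf(angle: .pi / 2, axis: [0, 1, 0])
    entity.move(to: transform, relativeTo: entity.parent,
                duration: 2.0, timingFunction: .easeInOut)
}
```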

Pro tip for the simulator: get a game controller hooked up to your machine. I had a PS5 controller hanging around from the pandemic days - way easier than using your mouse in the simulator. Said simulator could definitely use some keyboard shortcuts to switch between pan, orbit etc quickly too. On the controller, L2 & R2 are pressure sensitive; they move you up and down nicely, which I found sped up navigation around the scene massively.

The development environment doesn't really lend itself to working on a laptop - which we are all very used to. It needs screen real estate, a controller (or headset) and will probably be pushing tin later too. This could be an excellent excuse to invest in a Mac Studio, I thought.. but the more interesting question.. what does a day developing on this device actually feel like?

We really need the headset to do this properly - if you imagine wearing it during development.. would you take it off to tweak your code? That's a lot of messing around.. so ideally no. So you would probably leave it on. So you're looking at Xcode through the headset with passthrough.. or you're looking at Xcode IN the headset, if that makes sense.. on a giant version of your Mac desktop.. and thus never take it off. There are probably other configurations for the developer's 'daily driver' use case (we'll learn, I'm sure), but you get the point - it's not obvious.. it's not the pictures under glass we're used to.

Parting thought: I had a little look for any sort of UI testing framework - no dice. Just unit tests for now by the looks of it. I'm sure there is a team busy working away on that somewhere.. and hey, nobody is shipping until May.. right?

visionOS Developer Tools Now Available

It's here! Apple have just announced the release of the Vision Pro Developer Tools and visionOS SDK.

Now the fun really starts.. we can start making things with it. Exciting times. We all still have so many questions on just the software and user interface, let alone the device. Fascinated to start exploring this, finding some answers and ideas - very much doubt I am alone in that!

I'm off on holiday in the morning too, damn. Looks like the laptop is coming in the bag to steal some time away for it.

For all those heading over to the developer portal for downloads, have fun!

Thoughts on visionOS & Apple Car

OK, I know this is a reach. Might visionOS be a look ahead to the experience of riding in an Apple Car one day? I think it's an interesting musing on the future. We've all heard the rumours about Apple working on a car for a number of years.. and as we can see with Vision Pro, usually where there is enough smoke.. a day will come when you can toast the marshmallows.

We're undoubtedly going to see some YouTubers strap on one of these headsets and go for a drive - it probably even has some uses.. until someone pulls the power cord or the battery dies ..and you're plunged into complete darkness at the wheel of a fast-moving vehicle. Pretty clearly not advisable, with our current iteration of the Vision Pro at least. Let's not try that one at home. So for now, let's put the notion of wearing this gear while driving to one side - how else might it be practical to use the technology here?

Apple play the long game incredibly well - we often see this when looking back and connecting all the dots to where we are today. Advancements with Swift, SwiftUI and Apple Silicon hardware (to name but a few) all pave the way for the Vision Pro hardware and visionOS software that have been blowing our minds for the last couple of weeks. There's always a journey, a roadmap. Not least because a company of that size needs to be re-using technology across the estate to maintain any kind of pace.. and they certainly have the pace, the legs, for all the races they are in.

So how might visionOS be related to an Apple Car one day? To immerse yourself in the visionOS experience (simplified), you need two core things added to your environment. First, you need the video feed to your eyes that has the visionOS version of the world in it. Second, you need all the hardware sensors reading the environment and reading your movements for input - hand and eye tracking etc. The Vision Pro device brings all of this into your environment, wherever you are. You're good to go.

So effectively you need the device to take control of any environment - to augment it. An alternative environment where this control could perhaps be achieved is within a vehicle. With the sensor hardware put in place, perhaps the windscreen(s) become our augmented video feed, albeit less immersive depending on the size. We're skipping other advancements in technology here - eye tracking from a greater distance, the windscreen having transparent LED integrated to double as a display, things like that (from a long list) - but it's not hard to imagine.

In a car you know where people will be sitting, where eyes and hands are likely to be. There are contradictions here, in that a windscreen is a letterboxed viewing experience versus the full immersion intended by the Vision Pro. But squint your eyes a little, skip some of the details, and you can see how a version of this lends itself well to a vehicle - with different arcs to the story depending on whether it's being piloted by a human or is self-driving, autonomous.

The technology is likely to get miniaturised, maybe into glasses or contact lenses some day. The 'Spatial Computing' experience, as we see it today, essentially needs the view of the world replaced before your eyes see it, plus some sensors in the environment to enable interaction. Engineering- and product-wise, it's also hard to imagine Apple ever separating these out of a single device. That said, we don't know what dots are yet to be connected down the road - the need for a headset might just be a place to start from. As good a place as any.

Vision Pro User Experience Demos

We've seen the keynotes from Apple - but what is this thing actually like to use? At WWDC there was a demo area where some of the attendees could get a 30-minute slot to try it out. What folks (only ~500, apparently) could see and do with the headset was tightly controlled by Apple, which makes sense - it's a new product a year from launch - but even so, folks got to experience a lot.

There is some brilliant reporting on this from some of those lucky folks - I've put together a short list of links where you can learn more about it from them.

Fingers crossed for developer kits so we don't have to wait a year to see for ourselves!