Conferences Condensed: WWDC23
The sessions we can't stop talking about
Another year, another WWDC in the books. This year, we were lucky enough to have a mix of people attending the conference in person and people meeting in NYC to watch sessions together. We’ve compiled a list of a few of those sessions that we really loved — let us know what your favorites were!
In this session, Apple introduced significant enhancements to the Swift Charts framework: pie and donut charts, plus improved interactivity features such as selection and scrolling. The new SectorMark, the building block for pie and donut charts, makes it effortless to depict part-to-whole data relationships, while the enriched interactivity features provide a more intuitive way to navigate and explore data. With these additions, Apple has really rounded out the Swift Charts framework, cementing it as a more comprehensive and robust toolkit for data representation.
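To give a feel for how little code a donut chart now requires, here’s a minimal sketch using SectorMark — the `Category` type and its sample values are made up for illustration:

```swift
import SwiftUI
import Charts

// Hypothetical sample data for illustration.
struct Category: Identifiable {
    let id = UUID()
    let name: String
    let sales: Double
}

struct SalesDonutChart: View {
    let categories = [
        Category(name: "Cherry", sales: 62),
        Category(name: "Apple", sales: 48),
        Category(name: "Pecan", sales: 25)
    ]

    var body: some View {
        Chart(categories) { category in
            SectorMark(
                angle: .value("Sales", category.sales),
                innerRadius: .ratio(0.6),  // a nonzero inner radius turns the pie into a donut
                angularInset: 1.5          // small gaps between sectors
            )
            .foregroundStyle(by: .value("Name", category.name))
        }
    }
}
```

Each sector’s angle is driven by its share of the total, so the part-to-whole relationship falls out of the data automatically.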
During WWDC week, I focused my attention primarily on SwiftUI updates while waiting for the visionOS SDK to become available. While there were many fantastic sessions on our favorite UI framework, my favorite was Demystify SwiftUI Performance.
Lately, “Demystify” sessions have been incredibly strong, so much so that I always search the session list for “demystify” once the names become available each year, and prioritize what I find. Demystify SwiftUI Performance is no exception to this recent trend. It offers another peek behind the curtain to better understand minor code-level changes and their impact on your app’s performance. SwiftUI is mysterious, for better or for worse. Many developers we work with still struggle to wrap their heads around result builders, and at times, SwiftUI feels like magic to those of us who have spent more than a decade writing mobile UI imperatively. While this magic can be great for newcomers and for simple use cases, I strongly recommend taking any opportunity to wash some of it away and develop a deeper understanding of how things actually work under the hood.
Demystify SwiftUI Performance joins our list of “mandatory prerequisites” that we share with any developers we work with on SwiftUI projects, alongside its excellent, earlier counterpart, Demystify SwiftUI from WWDC21 (I wrote about this session here). Even if you’re a SwiftUI expert, I urge you to watch this year’s session, as you’re bound to learn something new that will help to shape the way you write UI.
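One recurring theme in these sessions is giving SwiftUI stable identity to work with so it can diff updates cheaply. As a hedged sketch (the `Dog` model is invented for illustration), conforming your row data to `Identifiable` rather than reaching for `id: \.self` is one of the small code-level changes the session’s advice points toward:

```swift
import SwiftUI

// Illustrative model; any value with a stable, unique id works.
struct Dog: Identifiable {
    let id: UUID
    let name: String
}

struct DogList: View {
    let dogs: [Dog]

    var body: some View {
        // List reads each Dog's stable `id`, so SwiftUI can tell
        // moved, inserted, and removed rows apart across updates
        // instead of re-deriving identity from the values themselves.
        List(dogs) { dog in
            Text(dog.name)
        }
    }
}
```

This is exactly the kind of change that looks cosmetic in a diff but affects how much work SwiftUI does on every update — which is why the session is worth watching even if you think you know the framework.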
— Michael Liberatore
If your app deals at all with a user’s photo library, even if you only need access for them to choose a profile photo, you’re going to want to watch this session. Apple has done a fantastic job with the new APIs for customizing the photo-picking experience, which are sure to make your users’ experiences more streamlined and less fatiguing, with fewer permission prompts. Embedding the photo-picking experience into your own views is so simple, and with Apple-provided subtle user messaging around what your app actually has access to, I think users will be much happier all around. This short session demos a lot of the embedded photo picker’s customization options, and after viewing, I’m sure you’ll come away wanting to upgrade your photo-picking code right away.
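Here’s a minimal sketch of what embedding the picker looks like — the frame height and disabled capabilities are illustrative choices, not requirements:

```swift
import SwiftUI
import PhotosUI

struct ProfilePhotoPicker: View {
    @State private var selection: PhotosPickerItem?

    var body: some View {
        PhotosPicker(selection: $selection, matching: .images) {
            Label("Choose a profile photo", systemImage: "person.crop.circle")
        }
        // Renders the picker inline in your own view hierarchy
        // instead of presenting it as a separate sheet.
        .photosPickerStyle(.inline)
        // Trim picker chrome you don't need for a single-photo flow.
        .photosPickerDisabledCapabilities(.selectionActions)
        .frame(height: 300)
    }
}
```

Because the picker runs out of process, your app only ever receives the items the user explicitly picks — which is what makes the reduced permission prompting possible.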
— Michael Liberatore
This session was probably my biggest surprise of WWDC23. Often Friday sessions are a bit less foundational to new technologies than ones earlier in the week, but this one was jam-packed with incredibly detailed and useful information about how spring animations work and some of the nice quality-of-life changes for them coming in the next releases. Jacob expertly employs straightforward examples at just the right times to show how complex topics like springs — and new APIs like gesture velocity — can improve animations across any app. He’s able to describe the new spring animation APIs that make it easier than ever to get a natural-feeling spring animation, while also going under the hood to describe the math behind it all without putting you to sleep. The beauty of a well-done WWDC talk is taking complex things that feel magical and describing how they work in an understandable and memorable way, and that’s what makes this one so good.
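The new API boils a spring down to two intuitive parameters. A minimal sketch — the circle-toggle example is mine, not from the session:

```swift
import SwiftUI

struct SpringToggle: View {
    @State private var expanded = false

    var body: some View {
        Circle()
            .frame(width: expanded ? 200 : 100)
            .onTapGesture {
                // duration and bounce replace guessing at response and
                // dampingFraction; presets like .smooth, .snappy, and
                // .bouncy are also available for common feels.
                withAnimation(.spring(duration: 0.4, bounce: 0.3)) {
                    expanded.toggle()
                }
            }
    }
}
```

A bounce of 0 gives a smooth, non-overshooting settle; positive values add overshoot — which maps directly onto the math the session walks through.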
— Brian Capps
The newest platform by Apple, visionOS! If you want to get started with spatial computing, this is the place to start. Peter goes through every step: creating a new Xcode project, using the Simulator for Vision Pro, working with Reality Composer Pro, and lastly using entity targeting.
I love how this is truly an introduction that you can follow without watching any other visionOS sessions. He explains new vocabulary like windows and volumes, and notes which sessions to watch to dive deeper into each topic. Throughout the session he notes best practices for visionOS, like always starting the user in a window with clear entry and exit controls so people can decide when to be more immersed in the content, as opposed to being moved into an immersive space without their knowledge.
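That “start in a window, opt in to immersion” structure shows up right in the app’s scene declaration. A hedged sketch — `ContentView`, `ImmersiveView`, and the space identifier are placeholder names:

```swift
import SwiftUI

@main
struct MyFirstImmersiveApp: App {
    var body: some Scene {
        // Start people in a familiar window with clear controls…
        WindowGroup {
            ContentView()
        }

        // …and let them opt in to full immersion when they choose.
        // A view inside ContentView can open this by id via the
        // openImmersiveSpace environment action.
        ImmersiveSpace(id: "ImmersiveSpace") {
            ImmersiveView()
        }
    }
}
```

Keeping the immersive space as a separate scene, opened only by an explicit user action, is the pattern the session’s best practices describe.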
I’m super excited to use the Vision Pro after just seeing the possibilities with the simulator.
— Mikaela Caron
The debate about “should designers code” has been raging for years, sparking countless Twitter threads from some of the most insufferable thought leaders Silicon Valley has to offer. This session seemed like the loudest “yes” from Apple to date.
Even more than a “yes”, it’s about embracing the world of dynamic prototyping. It’s not about designers becoming full-fledged developers or taking over coding responsibilities; it’s about using SwiftUI’s dynamism in ways static tools like Figma cannot match.
Static mockups are great and I am overjoyed that Apple now has official design resources for Figma, but a prototype is worth a thousand words. And there’s no better place to prototype than in the environment your product will actually be running in, where you can make it as rich as you want. Haptics, device sensors, shaders(!!), it’s all there. The only question is how far you want the prototype to go.
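As one small taste of what a static mockup can’t convey, haptics can ride along with state changes in a single modifier. A hedged sketch of a prototype interaction (the button and its copy are invented):

```swift
import SwiftUI

struct PrototypeLikeButton: View {
    @State private var liked = false

    var body: some View {
        Button(liked ? "Liked ❤️" : "Like") {
            withAnimation(.bouncy) {
                liked.toggle()
            }
        }
        // One line of code, and the prototype physically responds
        // on device — no handoff spec required.
        .sensoryFeedback(.success, trigger: liked)
    }
}
```

Run that on a device and the feel of the interaction is the spec — which is the whole argument for prototyping in the environment your product actually ships in.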
The future of interface design is backed by code written by designers. I apologize in advance to anyone who has to review my messy PRs.
— Sam Gold
🌏 Apple Developer Center session
My favorite session at WWDC 2023 was the in-person session presented at the Apple Developer Center. While it wasn’t recorded, I took notes on it and made them available here for anyone who wasn’t able to attend in person. In the session, John Geleynse introduced everyone to the Developer Center and its capabilities on a virtual tour. Then, Apple engineers from the visionOS team gave us a deeper look at the visionOS SDK and did live demos of building visionOS apps using ARKit and RealityKit. We even got to see how Reality Composer Pro works and how to integrate hand tracking into full space experiences.
Afterward, we all got to chat with the evangelists and engineers over some light refreshments and explore a little more of the beautiful new space built for developers to collaborate directly with Apple.
— Matthew Bischoff
If you want help getting your app ready for iOS 17, or want to build something for visionOS, get in touch!