Start with the project in the Starter folder. This project is similar to the default template for an Immersive visionOS app in Xcode 15.2. The tap-to-enlarge gesture has been removed from the template. You will use some of the elements included to learn aspects of visionOS development.
Take a look at the items in the Project Navigator. At the top is the AmazeMeApp.swift file. It sets up a single volumetric WindowGroup and also has references to the ImmersiveSpace that loads an ImmersiveView struct. Inside this ImmersiveView.swift file, you'll do the majority of the coding work.
A volumetric WindowGroup means this window is contained within a 3D volume that the user can move around in the Shared Space environment that visionOS provides. You can think of these 3D volumes as the visionOS equivalent of 2D windows on macOS: just as 2D windows from many different apps can coexist, so can 3D volumes within the Shared Space of visionOS.
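For reference, the scene setup in an app file like this one generally looks like the following sketch. The space identifier string here is an assumption; the starter's may differ.

import SwiftUI

@main
struct AmazeMeApp: App {
    var body: some Scene {
        // The volumetric window: content lives inside a movable 3D volume.
        WindowGroup {
            ContentView()
        }
        .windowStyle(.volumetric)

        // The immersive space that hosts the ImmersiveView.
        ImmersiveSpace(id: "ImmersiveSpace") {
            ImmersiveView()
        }
    }
}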
ContentView.swift contains the initial view for the app, and it sets up the ImmersiveView with a toggle. You may recall that Apple recommends you give the user an entry point before putting them into an immersive experience. You'll stick to that convention because it gives you an easy way to reset the scene as needed.
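As a refresher, that toggle pattern usually looks something like this sketch, built on the openImmersiveSpace and dismissImmersiveSpace environment actions. The state name and space id are assumptions, and the starter's version includes more error handling.

import SwiftUI

struct ContentView: View {
    @State private var showImmersiveSpace = false
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace

    var body: some View {
        Toggle("Show Immersive Space", isOn: $showImmersiveSpace)
            .onChange(of: showImmersiveSpace) { _, isShowing in
                Task {
                    if isShowing {
                        // Open the space declared in the App's Scene.
                        await openImmersiveSpace(id: "ImmersiveSpace")
                    } else {
                        await dismissImmersiveSpace()
                    }
                }
            }
    }
}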
The Assets file has a 3D app icon already loaded. This was also covered in the Getting Started With visionOS modules.
Inside Packages is a RealityKitContent Swift package that contains resources loaded from Reality Composer Pro. You'll use some of these items for demonstration now, but the bulk of the first two lessons will be done in code rather than in Reality Composer Pro.
Build and run to take a look in the Simulator.
When the app loads, you'll see the gray sphere in the volume view along with a toggle to load the immersive view. At the top right of the window, switch to the Museum environment, which has less virtual furniture cluttering the floor. Tap the Interact icon, then flip the toggle to see the Immersive space. Notice the two spheres floating above.
If you move the room with the Orbit tool or a two-finger swipe on your trackpad, the camera changes the viewer's point of view. If you move the window with the window bar, only the first gray sphere and toggle switch move; the immersive view content stays put. If you close the app's initial volume with the X button, the immersive view stays on the screen. Use the Home icon or press Shift-Command-H to reload the app, and then you can toggle the immersive view off.
Switch back to Xcode to explore more about entities. In Xcode, select Package.realitycomposerpro underneath Packages and then RealityKitContent, and choose Open in Reality Composer Pro. Double-click Sphere_Left in the Project Navigator to zoom in on it in the viewport in the center.
Look at the attributes in the Inspector on the right. Transform sets its position, rotation, and scale relative to the user, who is located at the 0,0,0 coordinates.
Note: The RealityKit coordinate space differs from SwiftUI's. Positive z values move toward the user and negative values move away. Negative x moves left and positive x moves right. Positive y values move up and negative values move down, the opposite of SwiftUI, where y values increase downward.
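To make the convention concrete, here's how an entity position reads in RealityKit. These are the same values you'll use for the steel ball later in this lesson:

// RealityKit world coordinates, in meters, with the user at the origin:
var position = SIMD3<Float>(0, 0, 0)
position.x =  0.5   // positive x: 0.5m to the right
position.y =  1.0   // positive y: 1m up
position.z = -1.5   // negative z: 1.5m away from the user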
RealityView's convert function can be used to convert y values between the RealityKit and SwiftUI coordinate spaces.
At the top of the window, tap the "+" icon to Show Content Library, or press Shift-Cmd-L. Scroll down to the Scratched Metallic Paint material and drag it to the Project Navigator. Click Sphere_Left, and in the Inspector, change the Material Binding to Scratched Metallic Paint. Save the file, switch back to Xcode, and build and run. Toggle the Immersive view and zoom in to the left sphere with the Dolly tool.
Notice how the sphere has the same scratched paint as seen in the Reality Composer Pro app, but the lighting and colors in the environment also influence the model.
Take a look at the package in Xcode and notice that the Scratched Metallic Paint is a USDZ file, but there is no preview. Recall that a USDZ file can contain several types of media but may not contain a 3D mesh or object.
Switch back to Reality Composer Pro and select Sphere_Left. In the Inspector, click the Add Component button. Scroll down and choose Physics Body in the modal, then double-click it to apply it. In the Inspector, notice that under Physics Body, the Mode is Dynamic and Affected by Gravity is checked. Save the file, switch back to Xcode, and build and run. In the Simulator, toggle the Immersive view. Hmm. The sphere is still floating. You need to add a Collision component to get physics updates and move the sphere.
Return to Reality Composer Pro, and use Add Component to add a Collision component to the Sphere_Left entity. You'll notice that the interface adds a green cube around the sphere. Change the collision component's Shape to Sphere to match the type of object. Notice the red, green, and blue lines poking out. These represent the x, y, and z axes, respectively. When the object rotates, you can access these values in your app. For example, if you had a cube-shaped die, you could figure out which face is on top from the axes, and from that, calculate the value of the roll.
Save the file, and run the app from Xcode. Watch what happens when you flip the toggle in the Simulator. The sphere immediately starts to fall. However, it continues to fall right through the simulated floor! The virtual environment has no physics body attached, so there’s nothing to stop the ball. Now is your chance to fix this with RealityKit!
It's time to switch back to Xcode and clean up a few things. Open up ContentView.swift. You won't need the RealityView, so select it and delete it. You may recall that if you double-click the opening brace, Xcode will select the whole code block. Make sure to select the RealityView and the update closure that follows the unnamed make closure.
You can also delete the Scene.usda in the RealityKitContent package. You're going to use the Immersive.usda scene.
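For context, the RealityView inside ImmersiveView typically loads that scene in its make closure, along the lines of this sketch. It assumes the scene is named Immersive and that the package exports realityKitContentBundle, as the template does.

RealityView { content in
    // Load the Immersive scene from the RealityKitContent package.
    if let scene = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
        content.add(scene)
    }
}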
Open ImmersiveView.swift to create a new floor mesh. For this, you'll need an instance of ModelEntity made with a 100 x 100 meter plane. You'll apply a special occlusion material, which is invisible to the renderer.
Add the following to the RealityView content:
/* Occluded floor */
let floor = ModelEntity(
    mesh: .generatePlane(width: 100, depth: 100),
    materials: [OcclusionMaterial()]
)
You're using an OcclusionMaterial() that won't be rendered in the scene. Just like in Reality Composer Pro, you'll need to add a collision body and a physics body.
Add the following:
floor.generateCollisionShapes(recursive: false)
floor.components[PhysicsBodyComponent.self] = .init(
    massProperties: .default,
    mode: .static
)
Notice that the collision shape generation has recursive set to false and, more importantly, that the physics body's mode is static.
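As a quick reference for the three physics modes, here's a minimal sketch:

// The three PhysicsBodyMode options in RealityKit:
let staticBody = PhysicsBodyComponent(mode: .static)       // never moves: floors, walls
let kinematicBody = PhysicsBodyComponent(mode: .kinematic) // moved by your code, ignores forces
let dynamicBody = PhysicsBodyComponent(mode: .dynamic)     // fully simulated: gravity, collisions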
Finally, add the floor to the scene's content:
content.add(floor)
Build and run, then switch the Show Immersive Space toggle. Now the metal Sphere_Left drops but stops at the floor.
Now you can build a metal ball in RealityKit and learn how to interact with it. To make a metal ball with a 10cm radius, add the following code below where you added the floor to content. This time, use the generateSphere method.
/* steel ball */
let ball = ModelEntity(
    mesh: .generateSphere(radius: 0.1),
    materials: [SimpleMaterial(color: .white, isMetallic: true)]
)
You also need to specify its x, y, and z position in the world, relative to the center, where the viewer is standing. Add this to the ball code:
ball.position.y = 1.0 // 1 meter (m) above the floor
ball.position.z = -1.5 // 1.5m in front of the user
ball.position.x = 0.5 // 0.5m right of center
content.add(ball)
Build and run, then flip the toggle.
Notice that the ball appears, but it stays exactly where you placed it. While it's there, zoom in with the Dolly tool and rotate the camera around it with the Orbit tool. In 3D parlance, the camera represents the point of view; this is what the wearer of a Vision Pro will see. Notice how the ball reflects the objects in the environment, like a mirror. This is because it's pure metal, with no normal map or roughness applied to the material.
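If you want to experiment with that mirror finish, SimpleMaterial also accepts a roughness parameter. The values below are just illustrative:

// Lower roughness means sharper reflections; isMetallic switches to metal shading.
let mirrorFinish = SimpleMaterial(color: .white, roughness: 0.0, isMetallic: true)
let brushedSteel = SimpleMaterial(color: .white, roughness: 0.4, isMetallic: true)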
Add the following components to the ball before it’s added to the content/scene.
ball.generateCollisionShapes(recursive: false)
// Enable interactions on the entity.
ball.components.set(InputTargetComponent())
ball.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
// add mass to PhysicsBody
ball.components[PhysicsBodyComponent.self] = PhysicsBodyComponent(
    // mass in kilograms
    massProperties: .init(mass: 5.0),
    material: .generate(
        staticFriction: 0.0,
        dynamicFriction: 0.0,
        restitution: 0.0
    ),
    mode: .dynamic
)
// add gravity
ball.components[PhysicsBodyComponent.self]?.isAffectedByGravity = true
You've set an input target so you can tap or drag the ball. You've added a spherical collision body and some mass. The friction values control how the object slides. The physics body mode is dynamic, which is the default for objects created in Reality Composer Pro.
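To feel how the material values change behavior, you could try a bouncier variant. A sketch with illustrative numbers:

// Restitution near 1.0 makes collisions elastic, so the ball bounces
// instead of landing dead on the floor.
let bouncyMaterial = PhysicsMaterialResource.generate(
    staticFriction: 0.5,
    dynamicFriction: 0.5,
    restitution: 0.9
)
ball.components[PhysicsBodyComponent.self]?.material = bouncyMaterial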
Now that you’ve seen how external forces can influence your models, you’ll add a gesture to interact with them in the Simulator.
At the bottom of the ImmersiveView struct, add a computed property with a drag gesture that can target any entity. You'll be more precise about targeting in the next lesson.
var dragGesture: some Gesture {
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            value.entity.position = value.convert(value.location3D, from: .local, to: value.entity.parent!)
            value.entity.components[PhysicsBodyComponent.self]?.mode = .kinematic
        }
        .onEnded { value in
            value.entity.components[PhysicsBodyComponent.self]?.mode = .dynamic
        }
}
Notice that you have targeted "any entity" through the call to targetedToAnyEntity, which will apply to any entities that have an input target component.
Add the dragGesture as a gesture modifier to the closing brace of the RealityView:
.gesture(dragGesture)
Build and run as you did before. Notice that you can drag the shiny ball you created in code, but not Sphere_Left. You need to add an Input Target component in Reality Composer Pro. Head over there, select the sphere, and in the Inspector, click Add Component and choose Input Target. Leave it enabled and save the files. Build and run in Xcode.
You can play with the spheres. As you drag them, their modes change to kinematic so they don't fly off your finger as the view updates. As soon as you release them, they become dynamic, as you set that in .onEnded.
Pro Tip: Constrain the direction of the drag in the Simulator by holding down the Shift key as you drag. On the actual Vision Pro hardware, pinch to pick up and drag the ball, pull it toward you or away.
Start the drag slightly upward, then hold down Shift, and the balls will move on either the x or z axis. Be careful, because the floor is a plane and has no thickness by definition. If a ball slips slightly through and you release the drag or un-pinch, it will fall to infinity. You can also drop the balls near each other, and they'll collide and roll away. The floor is 100 meters square, but if they roll off, they'll also fall. Catch them with a drag gesture or pinch as well.
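If you'd rather rescue runaway balls automatically, you could watch the scene for anything that has fallen and respawn it. This is a minimal sketch, assuming it runs inside the RealityView make closure where ball is in scope, and that you keep the returned subscription alive, for example in a property:

// Respawn the ball if it falls well below the floor.
let subscription = content.subscribe(to: SceneEvents.Update.self) { _ in
    if ball.position(relativeTo: nil).y < -10 {
        ball.position = [0.5, 1.0, -1.5]   // back to its starting spot
        // A more robust reset would also clear any PhysicsMotionComponent velocity.
    }
}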
While your app is still running, go back to Xcode and click the Debug Visualizations button in the toolbar under the center pane, aka the standard editor. Check the options for collision shapes and axes. Back in the Simulator, you can see the collision shapes on the objects. On Sphere_Left, you can also see the x, y, and z axes.
In this lesson, you learned how to apply basic materials to models, or entities. You also learned how to create entities in code and how to add components in both RealityKit and Reality Composer Pro. As with predecessors such as Interface Builder, Apple provides tools, including an inspector, to help you create entities and apply components. You've also seen how to add and set components in code with RealityKit. You added collision shapes, input targets, and physics bodies so that you can interact with the models and influence them.
Do you remember those wooden labyrinth toys? In the next lesson, you’ll build a very simple version of one. You’ll clean up some of the tools and models used here, and you’ll work on interacting in a more game-like way.