## [6.0.2] - 2024-05-22

### Changed

- Rebuilt static libraries with Xcode version 15.4 (15F31d). You are now required to build iOS apps using Xcode 15.4 or newer.
Unity Technologies committed May 22, 2024
1 parent 65882b5 commit 9095ecd
Showing 11 changed files with 91 additions and 39 deletions.
12 changes: 12 additions & 0 deletions CHANGELOG.md
@@ -8,6 +8,18 @@ All notable changes to this package will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).

## [6.0.2] - 2024-05-22

### Changed

- Rebuilt static libraries with Xcode version 15.4 (15F31d). You are now required to build iOS apps using Xcode 15.4 or newer.
- Changed AR Foundation dependency version from 6.0.1 to 6.0.2.

### Fixed

- Fixed issue [ARKB-58](https://issuetracker.unity3d.com/issues/arfoundation-application-crashes-when-arocclusionmanager-is-disabled-if-multithreaded-rendering-is-enabled-on-ios-devices-that-use-lidar) where iOS apps could intermittently crash when destroying the `AROcclusionManager` with multithreaded rendering enabled.
- Fixed the native input provider so that it now explicitly sets **Device** mode as its only supported [Tracking Origin Mode](xref:Unity.XR.CoreUtils.XROrigin.TrackingOriginMode). The [XR Origin component](xref:xr-core-utils-xr-origin-reference) will always use **Device** mode as its Tracking Origin Mode. You should set the XR Origin component's **Camera Y Offset** value to `0` to avoid adding a height offset to your camera and trackables.

## [6.0.1] - 2024-04-01

### Changed
1 change: 0 additions & 1 deletion Documentation~/TableOfContents.md
@@ -1,6 +1,5 @@
* [Introduction](xref:arkit-manual)
* [What's new](xref:arkit-whats-new)
* [Upgrade guide](xref:arkit-upgrade-guide)
* [Project configuration](xref:arkit-project-config)
* Subsystems
* [Session](xref:arkit-session)
56 changes: 48 additions & 8 deletions Documentation~/arkit-face-tracking.md
@@ -3,22 +3,62 @@ uid: arkit-face-tracking
---
# Face tracking

This page is a supplement to the AR Foundation [Face tracking](xref:arfoundation-face-tracking) manual. The following sections only contain information about APIs where ARKit exhibits unique platform-specific behavior.

> [!IMPORTANT]
> To use face tracking with ARKit, you must first enable face tracking in the XR Plug-in Management settings. Refer to [Enable the Face tracking subsystem](xref:arkit-project-config#enable-face-tracking) to understand how to enable face tracking for ARKit.
## Optional feature support

ARKit implements the following optional features of AR Foundation's [XRFaceSubsystem](xref:UnityEngine.XR.ARSubsystems.XRFaceSubsystem). The availability of features on specific devices depends on device hardware and software. Refer to [Requirements](#requirements) for more information.

| Feature | Descriptor Property | Supported |
| :------ | :------------------ | :-------: |
| **Face pose** | [supportsFacePose](xref:UnityEngine.XR.ARSubsystems.XRFaceSubsystemDescriptor.supportsFacePose) | Yes |
| **Face mesh vertices and indices** | [supportsFaceMeshVerticesAndIndices](xref:UnityEngine.XR.ARSubsystems.XRFaceSubsystemDescriptor.supportsFaceMeshVerticesAndIndices) | Yes |
| **Face mesh UVs** | [supportsFaceMeshUVs](xref:UnityEngine.XR.ARSubsystems.XRFaceSubsystemDescriptor.supportsFaceMeshUVs) | Yes |
| **Face mesh normals** | [supportsFaceMeshNormals](xref:UnityEngine.XR.ARSubsystems.XRFaceSubsystemDescriptor.supportsFaceMeshNormals) | |
| **Eye tracking** | [supportsEyeTracking](xref:UnityEngine.XR.ARSubsystems.XRFaceSubsystemDescriptor.supportsEyeTracking) | Yes |

> [!NOTE]
> Refer to AR Foundation [Face tracking platform support](xref:arfoundation-face-tracking-platform-support) for more information on the optional features of the face subsystem.
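
You can check these descriptor properties at runtime before you rely on an optional feature. The following is a minimal sketch, assuming a scene object that also has an AR Face Manager component and a running AR session; `FaceFeatureCheck` is a hypothetical script name:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class FaceFeatureCheck : MonoBehaviour
{
    void Start()
    {
        var faceManager = GetComponent<ARFaceManager>();

        // The subsystem is only created after XR initialization, so this
        // check assumes the ARSession has already started.
        var descriptor = faceManager.subsystem?.subsystemDescriptor;
        if (descriptor != null && descriptor.supportsEyeTracking)
        {
            Debug.Log("Eye tracking is supported on this device.");
        }
    }
}
```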
## Session configuration

Face tracking requires the use of the user-facing or "selfie" camera. It is the responsibility of your session's [XRSessionSubsystem.configurationChooser](xref:UnityEngine.XR.ARSubsystems.XRSessionSubsystem.configurationChooser) to choose the camera facing direction. You can override the configuration chooser to meet your app's needs. For more information on the [ConfigurationChooser](xref:UnityEngine.XR.ARSubsystems.ConfigurationChooser), refer to the [What’s new in Unity’s AR Foundation | Unite Now 2020](https://www.youtube.com/watch?v=jBRxY2KnrUs&t=677s) video (YouTube). You can access a sample that shows how to use the `ConfigurationChooser` to choose between the user-facing and world-facing camera on the [AR Foundation samples](https://github.com/Unity-Technologies/arfoundation-samples/tree/main/Assets/Scenes/ConfigurationChooser) GitHub repository.

### Configuration chooser

iOS devices support different combinations of features in different camera facing directions. If your scene contains several manager components that require the world-facing camera, AR Foundation's default configuration chooser might decide to use the world-facing camera, even if the AR Face Manager component is also enabled in your scene. You can create your own [ConfigurationChooser](xref:UnityEngine.XR.ARSubsystems.ConfigurationChooser) to prioritize face tracking functionality over other features if you desire greater control over the camera's facing direction.

You can access an example of using a custom `ConfigurationChooser` in the `Rear Camera (ARKit)` sample on the [AR Foundation Samples](https://github.com/Unity-Technologies/arfoundation-samples/tree/main/Assets/Scenes/FaceTracking/WorldCameraWithUserFacingFaceTracking) GitHub repository. This example demonstrates how you can use the user-facing camera for face tracking and the world-facing (rear) camera for passthrough video (iOS 13+).
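
The general shape of a custom chooser is sketched below. This is an illustrative sketch rather than the sample's actual implementation: it assumes the `ChooseConfiguration` override, `GetConfigurationDescriptors`, and the `Configuration` constructor from `UnityEngine.XR.ARSubsystems`, and simply prefers any configuration whose capabilities include face tracking:

```csharp
using Unity.Collections;
using UnityEngine.XR.ARSubsystems;

// Hypothetical chooser that prioritizes face tracking over other features.
public class FaceTrackingFirstChooser : ConfigurationChooser
{
    public override Configuration ChooseConfiguration(
        XRSessionSubsystem subsystem, Feature requestedFeatures)
    {
        using (var descriptors = subsystem.GetConfigurationDescriptors(Allocator.Temp))
        {
            foreach (var descriptor in descriptors)
            {
                // Feature is a flags enum, so a bitwise test works here.
                if ((descriptor.capabilities & Feature.FaceTracking) != 0)
                {
                    return new Configuration(
                        descriptor, requestedFeatures & descriptor.capabilities);
                }
            }

            // Fall back to the first available configuration.
            return new Configuration(
                descriptors[0], requestedFeatures & descriptors[0].capabilities);
        }
    }
}
```

You would then assign an instance of this class to your session's `configurationChooser` property, for example `sessionSubsystem.configurationChooser = new FaceTrackingFirstChooser();`.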

## Blend shapes

ARKit provides a series of [blend shapes](https://developer.apple.com/documentation/arkit/arfaceanchor/2928251-blendshapes?language=objc) to describe different features of a face. Each blend shape value ranges from 0 to 1. For example, one blend shape defines how open the mouth is.

A blend shape represents an action at a location on a face. Each blend shape is defined by an [ARKitBlendShapeLocation](xref:UnityEngine.XR.ARKit.ARKitBlendShapeLocation) that identifies the location of the face action and an [ARKitBlendShapeCoefficient](xref:UnityEngine.XR.ARKit.ARKitBlendShapeCoefficient) that describes the amount of action at the location. The `ARKitBlendShapeCoefficient` is a value between `0.0` and `1.0`.

You can learn more about blend shapes with the `Blend shapes` sample on the [AR Foundation Samples](https://github.com/Unity-Technologies/arfoundation-samples/tree/main?tab=readme-ov-file#blend-shapes-arkit) GitHub repository. This sample uses blend shapes to puppet a cartoon face that is displayed over the detected face.
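
As a sketch of how you might read these values, assuming a scene object that also has an AR Face Manager component (`BlendShapeLogger` is a hypothetical script name):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

public class BlendShapeLogger : MonoBehaviour
{
    ARFaceManager m_FaceManager;

    void Start() => m_FaceManager = GetComponent<ARFaceManager>();

    void Update()
    {
        // GetBlendShapeCoefficients is ARKit-specific, so downcast the subsystem.
        var subsystem = m_FaceManager.subsystem as ARKitFaceSubsystem;
        if (subsystem == null)
            return;

        foreach (var face in m_FaceManager.trackables)
        {
            using var coefficients =
                subsystem.GetBlendShapeCoefficients(face.trackableId, Allocator.Temp);

            foreach (var c in coefficients)
            {
                // For example, c.blendShapeLocation == ARKitBlendShapeLocation.JawOpen
                // with c.coefficient near 1.0 indicates a wide-open mouth.
                Debug.Log($"{c.blendShapeLocation}: {c.coefficient}");
            }
        }
    }
}
```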

## Face visualizer samples

The [AR Foundation Samples](https://github.com/Unity-Technologies/arfoundation-samples) GitHub repository contains ARKit-specific prefabs that you can use to visualize faces in your scene, as outlined in the following table. Refer to the AR Foundation [AR Face](xref:arfoundation-face-tracking-arface) manual for more information on how to use these prefabs.

| Prefab | Description |
| :----- | :---------- |
| [AR Eye Pose Visualizer](https://github.com/Unity-Technologies/arfoundation-samples/blob/main/Assets/Prefabs/AR%20Eye%20Pose%20Visualizer.prefab) | Visualize the location and direction of the eyes of a detected face. |
| [Eye Laser Visualizer](https://github.com/Unity-Technologies/arfoundation-samples/blob/main/Assets/Prefabs/Eye%20Laser%20Prefab.prefab) | Use the eye pose to draw laser beams emitted from the detected face. |
| [Sloth Head](https://github.com/Unity-Technologies/arfoundation-samples/blob/main/Assets/Prefabs/SlothHead.prefab) | Use the face blend shapes provided by ARKit to animate a 3D character. |

<a id="requirements"></a>

## Requirements

Face tracking supports devices with Apple Neural Engine in iOS 14 and newer, and iPadOS 14 and newer. Devices with iOS 13 and earlier, and iPadOS 13 and earlier, require a TrueDepth camera for face tracking. Refer to Apple's [Tracking and Visualizing Faces](https://developer.apple.com/documentation/arkit/content_anchors/tracking_and_visualizing_faces?language=objc) documentation for more information.

> [!NOTE]
> To check whether a device has an Apple Neural Engine or a TrueDepth camera, refer to Apple's [Tech Specs](https://support.apple.com/en_US/specs).
[!include[](snippets/apple-arkit-trademark.md)]
18 changes: 18 additions & 0 deletions Documentation~/arkit-occlusion.md
@@ -11,6 +11,22 @@ There are three types of depth images that ARKit exposes through the provider's
- **Human depth**: distance from the device to any part of a human recognized within the camera field of view.
- **Human stencil**: value that designates, for each pixel, whether that pixel is part of a recognized human.

## Optional feature support

ARKit implements the following optional features of AR Foundation's [XROcclusionSubsystem](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystem). The availability of features depends on device hardware and software. Refer to [Requirements](#occlusion-requirements) for more information.

| Feature | Descriptor Property | Supported |
| :------ | :------------------ | :-------: |
| **Environment Depth Image** | [environmentDepthImageSupported](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystemDescriptor.environmentDepthImageSupported) | Yes |
| **Environment Depth Confidence Image** | [environmentDepthConfidenceImageSupported](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystemDescriptor.environmentDepthConfidenceImageSupported) | Yes |
| **Environment Depth Temporal Smoothing** | [environmentDepthTemporalSmoothingSupported](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystemDescriptor.environmentDepthTemporalSmoothingSupported) | Yes |
| **Human Segmentation Stencil Image** | [humanSegmentationStencilImageSupported](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystemDescriptor.humanSegmentationStencilImageSupported) | Yes |
| **Human Segmentation Depth Image** | [humanSegmentationDepthImageSupported](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystemDescriptor.humanSegmentationDepthImageSupported) | Yes |

> [!NOTE]
> Refer to AR Foundation [Occlusion platform support](xref:arfoundation-occlusion-platform-support) for more information on the optional features of the occlusion subsystem.
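
You can query these descriptor properties at runtime before acquiring depth images. A minimal sketch, assuming a scene object that also has an AR Occlusion Manager component (`OcclusionSupportCheck` is a hypothetical script name):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class OcclusionSupportCheck : MonoBehaviour
{
    void Start()
    {
        var occlusionManager = GetComponent<AROcclusionManager>();

        // descriptor is null until the occlusion subsystem has been created.
        var descriptor = occlusionManager.descriptor;
        if (descriptor != null &&
            descriptor.environmentDepthImageSupported == Supported.Supported)
        {
            Debug.Log("Environment depth is supported on this device.");
        }
    }
}
```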
## Environment Depth

The occlusion subsystem provides access to two types of environment depth: [raw](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystem.TryAcquireRawEnvironmentDepthCpuImage(UnityEngine.XR.ARSubsystems.XRCpuImage@)) and [smoothed](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystem.TryAcquireSmoothedEnvironmentDepthCpuImage(UnityEngine.XR.ARSubsystems.XRCpuImage@)). These correspond to the following ARKit APIs:
@@ -21,6 +37,8 @@ The occlusion subsystem provides access to two types of environment depth: [raw]
> [!NOTE]
> You must enable smoothed depth by setting [environmentDepthTemporalSmoothingRequested](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystem.environmentDepthTemporalSmoothingRequested) to `true`. Otherwise, [TryAcquireSmoothedEnvironmentDepthCpuImage](xref:UnityEngine.XR.ARSubsystems.XROcclusionSubsystem.TryAcquireSmoothedEnvironmentDepthCpuImage(UnityEngine.XR.ARSubsystems.XRCpuImage@)) will return `false`.
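
As a sketch of requesting temporal smoothing and then acquiring the smoothed image, assuming a scene object that also has an AR Occlusion Manager component (`SmoothedDepthReader` is a hypothetical script name):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class SmoothedDepthReader : MonoBehaviour
{
    AROcclusionManager m_OcclusionManager;

    void Start() => m_OcclusionManager = GetComponent<AROcclusionManager>();

    void Update()
    {
        var subsystem = m_OcclusionManager.subsystem;
        if (subsystem == null)
            return;

        // Opt in to temporal smoothing; without this,
        // TryAcquireSmoothedEnvironmentDepthCpuImage returns false.
        subsystem.environmentDepthTemporalSmoothingRequested = true;

        if (subsystem.TryAcquireSmoothedEnvironmentDepthCpuImage(out XRCpuImage image))
        {
            using (image)
            {
                // Process the depth image here, for example with XRCpuImage.Convert.
                Debug.Log($"Smoothed depth image: {image.width}x{image.height}");
            }
        }
    }
}
```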
<a id="occlusion-requirements"></a>

## Requirements

Environment depth requires Xcode 12 or later, and it only works on iOS 14 devices with a LiDAR scanner, such as iPad Pro.
4 changes: 2 additions & 2 deletions Documentation~/index.md
@@ -32,7 +32,7 @@ This package does not implement the following AR Foundation features as they are
| [Bounding Box detection](xref:arfoundation-bounding-box-detection) | Detect and track bounding boxes of 3D objects |

> [!IMPORTANT]
> Apple's App Store rejects any app that contains certain face tracking-related symbols in its binary if the app developer doesn't intend to use face tracking. To avoid ambiguity, face tracking support is available only when face tracking is enabled. Refer to [Enable the Face tracking subsystem](xref:arkit-project-config#enable-face-tracking) for instructions for changing this setting.
# Install the Apple ARKit XR Plug-in

@@ -88,6 +88,6 @@ This version of Apple ARKit XR Plug-in includes:
* A shader used for rendering the camera image
* A plug-in metadata file

For more code examples, refer to the [AR Foundation Samples repository](https://github.com/Unity-Technologies/arfoundation-samples).

[!include[](snippets/apple-arkit-trademark.md)]
12 changes: 6 additions & 6 deletions Documentation~/project-configuration-arkit.md
@@ -33,7 +33,7 @@ You must install the **iOS Module** using the Unity Hub before you can enable th

To enable ARKit:

1. Open the **Project Settings** window (menu: **Edit** &gt; **Project Settings**).
2. Select **XR Plug-in Management** to view the plug-in management settings.
3. Select the **iOS** tab to view the iOS settings. (This tab is only shown when you have installed the Editor iOS Module.)
4. Enable the **ARKit** option in the **Plug-in Providers** list.
@@ -48,7 +48,7 @@ The minimum version of iOS that supports ARKit is iOS 11.

To change this setting:

1. Open the **Project Settings** window (menu: **Edit** &gt; **Project Settings**).
2. Select **Player** on the left to view the **Player Settings** page.
3. Select the **iOS** tab to view the iOS settings.
4. Open the **Other Settings** group (if necessary).
@@ -66,7 +66,7 @@ the `NSCameraUsageDescription` key.

To set or change this setting:

1. Open the **Project Settings** window (menu: **Edit** &gt; **Project Settings**).
2. Select **Player** on the left to view the **Player Settings** page.
3. Select the **iOS** tab to view the iOS settings.
4. Open the **Other Settings** group (if necessary).
@@ -85,7 +85,7 @@ You must install the ARKit package before enabling ARKit face tracking. Refer to

To enable ARKit face tracking:

1. Open the **Project Settings** window (menu: **Edit** &gt; **Project Settings**).
2. Click **XR Plug-in Management** on the left to open the plug-in provider list.
3. Select **ARKit** in the list to view the ARKit plug-in settings page.
4. Check the box next to **Face Tracking** to enable the feature.
@@ -101,7 +101,7 @@ You must install the ARKit package before you can change the ARKit requirement s

To change this setting:

1. Open the **Project Settings** window (menu: **Edit** &gt; **Project Settings**).
2. Click **XR Plug-in Management** on the left to open the plug-in provider list.
3. Select **ARKit** in the list to view the ARKit plug-in settings page.

@@ -122,7 +122,7 @@ Some rules serve as warnings for possible configuration problems; you aren't req

To review the ARKit project validation rules:

1. Open the **Project Settings** window (menu: **Edit** &gt; **Project Settings**).
2. Click **XR Plug-in Management** on the left to open the plug-in provider list.
3. Select **Project Validation** in the list to view the validation page.
4. Select the **iOS** tab to view the status of the validation checks for iOS XR plug-ins, including ARKit.
Binary file modified Runtime/FaceTracking/iOS/Xcode1500/libUnityARKitFaceTracking.a
Binary file not shown.
Binary file modified Runtime/iOS/Xcode1500/libUnityARKit.a
Binary file not shown.
10 changes: 0 additions & 10 deletions ValidationExceptions.json

This file was deleted.

7 changes: 0 additions & 7 deletions ValidationExceptions.json.meta

This file was deleted.

