
Usage 📜


AR Simulation comes with a couple of built-in components that let you create and control tracked AR planes, point clouds, and images, as well as the camera (your device).

AR Planes

  • SimulatedARPlane is the most basic way to spawn an ARPlane. Just add the component to a GameObject and position it in your scene. Use the local scale on X and Z to change its size.
  • SimulatedARPlaneGeneration is a more advanced component. It uses raycasts to sample points in your scene and generate planes dynamically, which is closer to how planes are created on device (a sketch for consuming planes via AR Foundation follows below).
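
As a point of reference, here is a minimal sketch (not part of the package) of how planes can be consumed through the standard AR Foundation ARPlaneManager events; simulated planes should arrive the same way as planes detected on a device. The PlaneListener class name is hypothetical and assumes an ARPlaneManager exists on your AR Session Origin.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example component: logs planes reported by ARPlaneManager,
// whether they come from a real device or from SimulatedARPlane / SimulatedARPlaneGeneration.
public class PlaneListener : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // assign the ARPlaneManager on the AR Session Origin

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Plane added: {plane.trackableId}, size {plane.size}, alignment {plane.alignment}");
    }
}
```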

AR Tracked Images

  • SimulatedARTrackedImage can be used to simulate image tracking. You can track any image that is included in an XRReferenceImageLibrary asset. By default, tracking uses the camera frustum to update the Tracking State (a sketch for reacting to tracked images via AR Foundation follows below).
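
Here is a minimal sketch of reacting to tracked images through the standard ARTrackedImageManager events. The TrackedImageListener class name is hypothetical; it assumes the manager's reference library is the XRReferenceImageLibrary containing your images.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example component: reacts to tracked images reported by
// ARTrackedImageManager, which is also where SimulatedARTrackedImage surfaces its data.
public class TrackedImageListener : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager imageManager; // its reference library must contain your images

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
            Debug.Log($"Image found: {image.referenceImage.name}");
        foreach (var image in args.updated)
            Debug.Log($"Image '{image.referenceImage.name}' tracking state: {image.trackingState}");
    }
}
```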

AR PointClouds

  • SimulatedARPointCloud is the most basic way to spawn an ARPointCloud. By default it generates random points in either a spherical or a planar shape, but you can also edit the points directly in the editor or via code.
  • SimulatedARPointCloudRaycaster uses raycasts to sample points in your scene (a sketch for reading point clouds via AR Foundation follows below).
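
Below is a minimal sketch of reading point cloud data through the standard ARPointCloudManager events. The PointCloudListener class name is hypothetical, and the nullable positions property assumes a recent AR Foundation version.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example component: counts the points of every AR point cloud,
// including clouds produced by SimulatedARPointCloud / SimulatedARPointCloudRaycaster.
public class PointCloudListener : MonoBehaviour
{
    [SerializeField] ARPointCloudManager pointCloudManager; // assign the ARPointCloudManager on the AR Session Origin

    void OnEnable()  => pointCloudManager.pointCloudsChanged += OnChanged;
    void OnDisable() => pointCloudManager.pointCloudsChanged -= OnChanged;

    void OnChanged(ARPointCloudChangedEventArgs args)
    {
        foreach (var cloud in args.updated)
        {
            // positions is nullable in recent AR Foundation versions
            if (cloud.positions.HasValue)
                Debug.Log($"Point cloud {cloud.trackableId} has {cloud.positions.Value.Length} points");
        }
    }
}
```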

AR Anchors

  • Supported, but there is currently no editor component implementation as with the other features. You should be able to spawn anchors with AR Foundation and see them being created just as on device (a sketch follows below).
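
Here is a minimal sketch of spawning an anchor with plain AR Foundation, as the note above describes. The AnchorSpawner class name is hypothetical; adding an ARAnchor component assumes AR Foundation 4.x (earlier versions use ARAnchorManager.AddAnchor(Pose) instead) and an active AR session with an ARAnchorManager in the scene.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: creates an anchor at a given pose using plain AR Foundation.
// With AR Simulation installed this should behave the same way as on device.
public class AnchorSpawner : MonoBehaviour
{
    // In AR Foundation 4.x an anchor is created by adding an ARAnchor component
    // to a GameObject placed at the desired pose (in session space).
    public ARAnchor SpawnAnchor(Vector3 position, Quaternion rotation)
    {
        var go = new GameObject("Anchor");
        go.transform.SetPositionAndRotation(position, rotation);
        return go.AddComponent<ARAnchor>();
    }
}
```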

Background Image (Experimental)

  • SimulatedAREnvironment can be added to your root environment GameObject (the GameObject that contains the objects you want to render as the camera image). Enable IsActive to have it assign itself to a SimulatedAREnvironmentManager, which does the heavy lifting.
  • SimulatedAREnvironmentManager can be added to a scene to handle camera background rendering. The Scene Or Prefab field can reference a scene asset, a prefab, or a GameObject in the current scene to be rendered as the camera image.