Example Scenes in Project:
UnityARKitScene.unity: This is a "minimal" scene that demonstrates most of the ARKit functionality.
It has a GameObject (GO) called ARCameraManager which has the UnityARCameraManager.cs script on it and a reference to the Main Camera in the scene. On startup, this script initializes ARKit, and it then updates the camera's position, rotation and projectionMatrix every frame from the information ARKit reports.
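
Conceptually the per-frame update amounts to something like the sketch below. GetArKitPose and GetArKitProjection are hypothetical stand-ins for the values the plugin reads from ARKit each frame; the real script's names and plumbing differ, but the idea is that the Unity camera is driven entirely by that tracked data.

    using UnityEngine;

    // Sketch only: GetArKitPose/GetArKitProjection are stand-ins for the
    // per-frame camera data ARKit provides; here they are stubbed out.
    public class CameraPoseDriverSketch : MonoBehaviour
    {
        public Camera trackedCamera;   // assign the Main Camera here

        void Update()
        {
            Vector3 position; Quaternion rotation;
            GetArKitPose(out position, out rotation);
            trackedCamera.transform.position = position;
            trackedCamera.transform.rotation = rotation;

            // Match the virtual camera's projection to the device camera's optics.
            trackedCamera.projectionMatrix = GetArKitProjection();
        }

        // Stand-in: in the real script this pose comes from ARKit every frame.
        void GetArKitPose(out Vector3 position, out Quaternion rotation)
        {
            position = Vector3.zero;
            rotation = Quaternion.identity;
        }

        // Stand-in: in the real script this matrix comes from ARKit every frame.
        Matrix4x4 GetArKitProjection()
        {
            return trackedCamera.projectionMatrix;
        }
    }
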
The Main Camera has a script on it called UnityARVideo.cs, which renders the live camera video. It references a YUVMaterial whose shader does the actual rendering of the video frames.
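
The rendering pattern is roughly the sketch below: a command buffer blits a full-screen quad with the YUV material before the camera draws the rest of the scene. The per-frame feeding of ARKit's YUV textures into the material is omitted, so this only illustrates the structure, not the plugin's actual implementation.

    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch only: draws the video material as a background before opaque geometry.
    [RequireComponent(typeof(Camera))]
    public class VideoBackgroundSketch : MonoBehaviour
    {
        public Material yuvMaterial;   // assign the YUVMaterial asset

        void Start()
        {
            var cb = new CommandBuffer { name = "Video background" };
            // Full-screen blit with the YUV material into the camera's target.
            cb.Blit(null, BuiltinRenderTextureType.CurrentActive, yuvMaterial);
            GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeForwardOpaque, cb);
        }
    }
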
There is a GO called RandomCube, the checkerboard cube in the scene, which is placed 1 unit along the z axis from the origin of the scene. Since tracking begins with the Main Camera's position and rotation at the origin, when you start up the scene you should see a checkered cube 1 meter straight in front of you.
There is a GO called GeneratePlanes which has the UnityARGeneratePlane script on it, which references a prefab used to display the generated planes. The script hooks into the horizontal plane detection and update events that ARKit signals, so that every newly detected plane gets a corresponding instance of the prefab placed in the world. It uses some utility scripts to keep track of the detected planes and to generate the instances. As you scan your surroundings, this GO creates an instance of the referenced prefab each time ARKit detects a plane, and it updates the extents and orientation of each instance based on the plane update events.
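
The bookkeeping follows a pattern like the sketch below: one prefab instance per detected plane, keyed by the plane's identifier, updated or destroyed as new events arrive. PlaneData and the OnPlane* methods are hypothetical stand-ins for the plugin's plane anchor callbacks, not its actual types.

    using System.Collections.Generic;
    using UnityEngine;

    // Sketch only: PlaneData and OnPlaneAdded/Updated/Removed are stand-ins
    // for the plugin's plane anchor callbacks.
    public class GeneratePlanesSketch : MonoBehaviour
    {
        // Minimal stand-in for the data an ARKit plane anchor carries.
        public struct PlaneData
        {
            public string id;
            public Vector3 center;
            public Quaternion rotation;
            public Vector3 extent;    // size of the plane in its local X/Z
        }

        public GameObject planePrefab;   // prefab instantiated for each detected plane
        readonly Dictionary<string, GameObject> planes = new Dictionary<string, GameObject>();

        public void OnPlaneAdded(PlaneData plane)
        {
            planes[plane.id] = Instantiate(planePrefab);
            Apply(planes[plane.id], plane);
        }

        public void OnPlaneUpdated(PlaneData plane)
        {
            GameObject go;
            if (planes.TryGetValue(plane.id, out go))
                Apply(go, plane);        // follow extent/orientation changes
        }

        public void OnPlaneRemoved(PlaneData plane)
        {
            GameObject go;
            if (planes.TryGetValue(plane.id, out go))
            {
                Destroy(go);
                planes.Remove(plane.id);
            }
        }

        static void Apply(GameObject go, PlaneData plane)
        {
            go.transform.position = plane.center;
            go.transform.rotation = plane.rotation;
            go.transform.localScale = new Vector3(plane.extent.x, 1f, plane.extent.z);
        }
    }
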
There is a GO called HitCube which has a script called UnityARHitTestExample on it, which references a parent transform. Whenever a touch is detected on the screen, the script performs an ARKit hit test, and the position of the hit test result is used to place that parent transform. When running the scene, touching the screen moves the HitCube to where your touch intersected a plane, or, if no plane was hit, to the nearest feature point.
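
The flow is roughly the sketch below. To keep it self-contained it substitutes a plain Unity Physics.Raycast against colliders in the scene for the plugin's ARKit hit test, but the touch handling and the "move the parent transform to the hit" step are the same idea.

    using UnityEngine;

    // Sketch only: uses a physics raycast instead of the ARKit hit test.
    public class HitPlacementSketch : MonoBehaviour
    {
        public Transform target;   // the parent transform that gets moved

        void Update()
        {
            if (Input.touchCount == 0) return;
            Touch touch = Input.GetTouch(0);
            if (touch.phase != TouchPhase.Began) return;

            // Cast a ray from the touch point into the world.
            Ray ray = Camera.main.ScreenPointToRay(touch.position);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit, 30f))
            {
                // Place the parent transform where the ray hit a surface.
                target.position = hit.point;
            }
        }
    }
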
There is a GO called PointCloudParticleExample which has a script of the same name. The script gets the point cloud data from ARKit and displays one particle for each point in the cloud; these show up in the scene as little yellow dots.
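
The display side is roughly the sketch below: one ParticleSystem.Particle per point, pushed into a ParticleSystem with SetParticles. How the Vector3[] of points is obtained from ARKit is not shown here.

    using UnityEngine;

    // Sketch only: turns an array of points into one particle per point.
    [RequireComponent(typeof(ParticleSystem))]
    public class PointCloudParticlesSketch : MonoBehaviour
    {
        public float particleSize = 0.01f;
        public Color particleColor = Color.yellow;

        ParticleSystem ps;

        void Awake()
        {
            ps = GetComponent<ParticleSystem>();
        }

        // Call this with the latest point cloud each time it updates.
        public void UpdatePointCloud(Vector3[] points)
        {
            var particles = new ParticleSystem.Particle[points.Length];
            for (int i = 0; i < points.Length; i++)
            {
                particles[i].position = points[i];
                particles[i].startColor = particleColor;
                particles[i].startSize = particleSize;
                particles[i].remainingLifetime = 1f;   // refreshed on the next update
            }
            // Replace whatever was shown with one particle per point.
            ps.SetParticles(particles, particles.Length);
        }
    }
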
The Directional Light in the scene has a UnityARAmbient script on it, which uses ARKit's light estimation value to change the intensity of the light. So if you go into a dark room, the objects in the scene will be lit more dimly than they would be in daylight.
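
In outline it does something like the sketch below. GetAmbientIntensity is a hypothetical stand-in for ARKit's light estimate (reported in lumens, roughly 1000 in ordinary indoor lighting), and the divisor used to map it to a Unity light intensity is illustrative.

    using UnityEngine;

    // Sketch only: GetAmbientIntensity stands in for ARKit's light estimation.
    [RequireComponent(typeof(Light))]
    public class AmbientLightSketch : MonoBehaviour
    {
        Light sceneLight;

        void Awake()
        {
            sceneLight = GetComponent<Light>();
        }

        void Update()
        {
            // Map the lumen estimate to a Unity light intensity (~1.0 in normal light).
            sceneLight.intensity = GetAmbientIntensity() / 1000f;
        }

        // Stand-in: in the real script this value comes from ARKit each frame.
        float GetAmbientIntensity()
        {
            return 1000f;
        }
    }
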
UnityParticlePainter.unity: This scene is a sample that allows you to paint particles into your scene to augment your surroundings.
The UnityARCameraManager script is set up the same way as in the minimal scene.
The main script used to implement the painting functionality is ParticlePainter.cs. It works like this: there are three painting modes, which you can cycle through as many times as you want using the button at the top right. The first mode is "OFF", which lets you navigate through the scene with the phone so that you can examine your artistic masterpiece. The second mode is "PICK", which brings up a color picker from which you choose the color of the paint you will use. The third mode is "PAINT", which lets you move the phone around and leave particles of the picked color behind in the world; particles are generated continuously as long as you move more than a certain threshold distance. After you have painted, or if you want to start a new section of paint or a new color, press the button again the required number of times to get to the mode you need.
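
A simplified sketch of the mode cycling and the distance-threshold painting is below; the color picker and the actual particle rendering are left out, and the names used (paintThreshold, CycleMode, and so on) are illustrative rather than the script's own.

    using UnityEngine;
    using UnityEngine.UI;

    // Sketch only: mode cycling plus painting when the phone has moved far enough.
    public class ParticlePainterSketch : MonoBehaviour
    {
        enum Mode { Off, Pick, Paint }

        public float paintThreshold = 0.02f;   // meters moved before the next particle
        public Text modeButtonText;            // label on the top-right button

        Mode mode = Mode.Off;
        Vector3 lastPaintPosition;
        Color currentColor = Color.white;

        // Hook this up to the top-right button's OnClick.
        public void CycleMode()
        {
            mode = (Mode)(((int)mode + 1) % 3);
            if (modeButtonText != null) modeButtonText.text = mode.ToString().ToUpper();
        }

        void Update()
        {
            if (mode != Mode.Paint) return;

            // Paint from a point slightly in front of the camera.
            Vector3 brushPosition = Camera.main.transform.position
                                  + Camera.main.transform.forward * 0.2f;

            // Only drop a new particle once the phone has moved past the threshold.
            if (Vector3.Distance(brushPosition, lastPaintPosition) > paintThreshold)
            {
                SpawnParticle(brushPosition, currentColor);
                lastPaintPosition = brushPosition;
            }
        }

        void SpawnParticle(Vector3 position, Color color)
        {
            // Placeholder: the real script accumulates the particles it renders.
            Debug.Log("paint " + position + " " + color);
        }
    }
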
An external HSVPicker source was used for this example: https://github.com/judah4/HSV-Color-Picker-Unity