Writing VR automation with Unium

As part of my talk at Heisenbug, I showed two examples of automation in virtual reality: one with Unium and one with Poco (Airtest). I feel the explanations weren't very deep, so I decided to use this format to expand on them.

1) Preparing the app to test

In the examples below I'm going to be using a VR app that I created as part of the VR Foundations Udacity course. This app was built with Google VR Cardboard (which is currently somewhat deprecated, but I believe it's still the best first step into VR development).

You can find the code here. It was built with Unity version 2017.1.0p4 and Google Cardboard version gvr-unity-sdk-1.0.3. I'm specifying this because Unity can be funny when switching versions and things get deprecated fast in Google VR, so if you use other versions it might not work straight away.

This application is a maze game. The user has to find a key and then a house, to unlock its door with the key. There are also coins scattered around the maze, and the goal is to find them all. Inside the house there is a board showing the number of keys found.

Unit testing the application with Unity's Test Runner lets us check some things. For example, we can check whether the door's 'lock' variable is on or off. However, we don't see the real result as the user would, which can hide issues such as: the lock variable not actually causing anything to happen in the scene, the sound or particle system being unpleasant, the user being unable to reach objects, or object sizes feeling wrong…

To test this, you would have to actually walk around the maze, collect all the coins, get the key, go back to the door and check everything yourself. The path is always the same, so manual testing gets very time-consuming. That's the reason for looking for a way to automate user behaviour, as we will see with Unium.

2) Installing and configuring Unium

Although there is a GitHub project available, I recommend installing it from the Asset Store in Unity.

To open the store in Unity we click Window -> Asset Store.

Then we search for Unium and we download the asset.

A folder will then appear in the Assets folder of the project. Next, we add a new empty object to the scene and attach the script from that folder to it. This script opens a local web server, on port 8342 by default (this can be changed), that we can use to interact with the application, both retrieving data from it and sending data to it. If you are curious about how this could work, check out my post about APIs. This gives us a communication channel between the server and our Unity program.
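Since Unium is just an HTTP server, any language with an HTTP client can drive it. As a quick sketch (my own, not from the Unium docs), here is a small Python helper that composes query URLs against the default port 8342, URL-encoding spaces in object names such as "Main Camera":

```python
import urllib.parse

UNIUM_BASE = "http://localhost:8342"

def unium_query(path: str) -> str:
    """Build a Unium query URL, URL-encoding spaces in object names
    while leaving the query punctuation (dots, slashes, JSON) intact."""
    return UNIUM_BASE + "/q/" + urllib.parse.quote(path, safe="/.={}':,")

# Compose the query for the camera position shown above:
url = unium_query("scene/Main Camera/transform/position")
print(url)  # http://localhost:8342/q/scene/Main%20Camera/transform/position
```

From here, fetching the URL (with `urllib.request.urlopen`, for instance) against a running game would return the queried data as JSON.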

3) Using Unium

Now we can do calls directly in a browser as if we were accessing a website, such as:

http://localhost:8342/q/scene/Main Camera/transform/position={'x':1,'y':1,'z':0}

This moves the camera to position (1, 1, 0). To rotate the camera we can use:

http://localhost:8342/q/scene/Main Camera.Transform.eulerAngles={'x':180,'y':0,'z':0}

This sets the camera to a rotation of 180 degrees on the x axis, 0 on the y axis and 0 on the z axis.

Camera movement is crucial for VR apps, and we can already see three issues with these calls. The first is that in VR applications we usually move by clicking objects called "waypoints" or through handset functions. With Unium we can treat waypoints as objects and click them as such:


Note that I have all waypoints grouped under a parent object called "Waypoints", the exact waypoint to click is called "Waypoint_7", and the script that has the click method is called "Waypoint". The call to click on the door is easier because we only have one door object and we are not reusing names:
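To give a feel for what such a call looks like, here is a sketch of how it could be composed: slashes walk the object hierarchy and dots select the component and method. The method name `OnClick` is my own assumption, since the post does not name the click method on the "Waypoint" script; substitute whatever your script actually exposes.

```python
UNIUM_SCENE = "http://localhost:8342/q/scene"

def method_call_url(hierarchy: str, component: str, method: str) -> str:
    """Compose a Unium method-invocation URL: slashes for the object
    hierarchy, dots for the component and the method to invoke."""
    return f"{UNIUM_SCENE}/{hierarchy}.{component}.{method}()"

# Hypothetical click on Waypoint_7 via its "Waypoint" script:
url = method_call_url("Waypoints/Waypoint_7", "Waypoint", "OnClick")
# -> http://localhost:8342/q/scene/Waypoints/Waypoint_7.Waypoint.OnClick()
```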


If you are not sure of the exact name of the object you can use wildcards and queries.

The second issue is the use of eulerAngles. I was able to read the rotation as quaternions (using "rotation" instead of "eulerAngles" at the end of the call), but I was not able to set it for some reason (maybe there is support for it now, or I was doing something wrong at the time of testing).

The last issue is the way the camera rotates: a user turning towards a point produces a gradual movement, but the rotation done with the call happens in a single step. This does not really emulate user behaviour, but with Unium we can set up a script to emulate it, or use the WebSocket capabilities to run a query at a chosen frequency.
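One way to approximate a gradual turn is to send many small eulerAngles updates instead of one jump. This is my own sketch, not code from the post: the interpolation is real, but the URL sending is left as a stub (`send=print`) so you can plug in an actual HTTP call against a running game.

```python
import time

def rotation_steps(start_y: float, end_y: float, steps: int):
    """Yield intermediate y-axis angles for a smooth camera turn."""
    for i in range(1, steps + 1):
        yield start_y + (end_y - start_y) * i / steps

def turn_camera(start_y, end_y, steps=30, send=print, delay=0.0):
    """Emulate a gradual turn by issuing one small rotation per step.
    `send` would be an HTTP GET in a real run; here it just prints."""
    for y in rotation_steps(start_y, end_y, steps):
        send(f"http://localhost:8342/q/scene/Main Camera.transform.eulerAngles="
             f"{{'x':0,'y':{y:.1f},'z':0}}")
        time.sleep(delay)  # pacing between updates, to taste
```

Calling `turn_camera(0, 90, steps=3)` would issue rotations at 30, 60 and 90 degrees instead of snapping straight to 90.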

Unium can be driven from many scripting languages and test frameworks, for example NUnit (the one we would use in C# for unit tests). Here is an example of code that automates these calls:

// u is a Unium web client instance
dynamic camera = await u.Get("/q/scene/Camera");
await u.Get("/q/scene/Camera.transform.position={\"x\":1, \"y\":1, \"z\":1}");
dynamic pos1 = await u.Get("/q/scene/Camera.transform.position");
await u.Get("/q/scene/Camera.transform.position={\"x\":180, \"y\":1, \"z\":1}");
dynamic pos2 = await u.Get("/q/scene/Camera.transform.position");
Assert.AreNotEqual(pos1.ToString(), pos2.ToString()); // verify the camera moved

4) Conclusions

Unium is a powerful tool for automating Unity projects. VR automation is possible with it, even emulating user behaviour. However, this sort of automation is not fully black-box, as you need to add this code into the application and possibly use methods that already exist in it. Ideally, a fully black-box automation would simulate the behaviour of the controllers directly.

What's the difference between automating any game and automating a VR game? The main difference is that instead of moving a player, you need to automate the camera movement and take the handsets into account (if present).

Movement (walking) is usually done by clicking waypoints, and this can be done easily with Unium. Rotation can be emulated by rotating the camera, as demonstrated before.

The next step would be to automate the handsets. However, if we do this directly by calling functions on the handset's controllers, we might enter the territory of testing that the provider's controller functions work fine. Instead, we might just want to make sure that our own functions work fine, so perhaps we should call the functions we created to handle what happens when the controller selects and clicks around. But that's… well… another story…
