Unity Multi-Platform Development? let's make more!

Date posted
6 March 2019
Reading time
16 Minutes
Chloe Thompson


Augmented and Virtual Reality have been around for over 30 years, with the term 'Virtual Reality' coined in the 1980s by Jaron Lanier. Both technologies are quickly becoming more mainstream as the years go on, driven by falling costs and rapidly improving development techniques. Although the two deliver different experiences (VR is a fully immersive environment viewed through a head-mounted display, while AR lays virtual objects on top of a real environment), they are developed in fairly similar ways. An interesting question is how we can simplify and streamline the development process even further.

Within the Applied Innovation team here at Kainos, we have been developing immersive applications to highlight their potential business value. This work emphasised some of the current development issues to us and led us to research whether we could build a tool to improve the experience.

A common tool for developing immersive experiences is Unity, a platform where scenes are built visually and functionality is implemented through scripts and plugins. Over the past few weeks I've been working on a client project looking at how we can add functionality to Unity to enable simpler workflows and builds for multiple platforms with a single process.

Why Multiple Builds?

Unity has the ability to build for multiple platforms, but to do so you have to configure and build for each platform separately every time. We came up with the idea for a tool that would let us set the configuration once, then build and test on all the platforms we needed in one go.

By creating this single tool we reduce the need to tailor the configuration each time we want to test our project on a different device. The time saved can then be better spent testing and improving the project rather than configuring build settings.
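To make the idea concrete, here is a minimal sketch of what such a build loop can look like as a Unity editor script placed in an Editor folder. The menu name, scene path and output folder are placeholders for illustration, not the actual tool we built.

```csharp
using UnityEditor;
using UnityEngine;

// A minimal sketch of a one-click multi-platform build, assuming a single scene
// and a hypothetical "Builds" output folder. Place this script in an Editor folder.
public static class MultiPlatformBuild
{
    // The platforms to build in one pass; adjust to the targets your project needs.
    static readonly BuildTarget[] targets =
    {
        BuildTarget.Android,
        BuildTarget.iOS,
        BuildTarget.StandaloneWindows64,
    };

    [MenuItem("Build/Build All Platforms")]
    public static void BuildAll()
    {
        foreach (var target in targets)
        {
            var options = new BuildPlayerOptions
            {
                scenes = new[] { "Assets/Scenes/Main.unity" },  // placeholder scene path
                locationPathName = "Builds/" + target,          // some targets expect an extension, e.g. ".apk" for Android
                target = target,
                options = BuildOptions.None,
            };

            // Build once per target, reusing the same configuration.
            var report = BuildPipeline.BuildPlayer(options);
            Debug.Log("Build for " + target + " finished: " + report.summary.result);
        }
    }
}
```

A fuller version would also switch the active build target and apply any per-platform settings before each call, but the core idea is the same: one configuration, one loop, every platform.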

Not only does this tool reduce the need to configure each platform, it also removes the need to have a new project for each platform. Unity tends to recommend a separate project for each platform you wish to build to. However, because the multiple build tool builds scenes based on the platforms selected, it only includes the plugins and elements that are relevant to the platform being built at that moment. Pretty clever, right?
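As an illustration of the idea (and not the structure of the actual tool), a per-platform scene map might look like the sketch below; the scene paths are placeholders. Platform-only plugin code can additionally be fenced behind Unity's built-in define symbols such as UNITY_ANDROID and UNITY_IOS so it is only compiled into the relevant builds.

```csharp
using System.Collections.Generic;
using UnityEditor;

// A hypothetical sketch of per-platform scene filtering; the scene paths below
// are placeholders chosen for illustration.
public static class PlatformScenes
{
    // Each build only pulls in the scenes relevant to its target platform.
    static readonly Dictionary<BuildTarget, string[]> scenesByTarget =
        new Dictionary<BuildTarget, string[]>
        {
            { BuildTarget.Android,             new[] { "Assets/Scenes/ARScene.unity" } },
            { BuildTarget.iOS,                 new[] { "Assets/Scenes/ARScene.unity" } },
            { BuildTarget.StandaloneWindows64, new[] { "Assets/Scenes/VRScene.unity" } },
        };

    public static string[] For(BuildTarget target)
    {
        return scenesByTarget[target];
    }
}
```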

Example AR app (photo by Patrick Schneider on Unsplash)

What is the gain?

My team is tasked with considering technologies that will be beneficial in future years. By looking into tools like Unity and developing a multiple build package, we are able to build upon Kainos' AR and VR capabilities, which will be key in the years to come.

As a team, we can use this tool to test our Unity scenes on multiple platforms at the same time; for example, we used it when creating and testing an AR scene. After starting development on the tool, we discovered that we could build an AR scene to both mobile devices and VR platforms. The value in this is that it allows full interaction with the scene and enables more comprehensive testing of an environment. By experiencing the scene in VR and in a physical environment, we can test both object placement and interactions on two different build platforms, ensuring that our application works as expected across multiple platforms.

Viewing the environment on multiple platforms at the same time significantly shortens the feedback loop during development, as both the user and the developer can quickly see and interact with the scene at the same time on different platforms. With both the user and the developer able to see the scene, they can outline changes much more simply. The multiple build tool then allows those changes to be implemented, built and assessed iteratively until both parties are happy. This process of building to multiple platforms and developing iteratively creates a quick feedback loop for improving applications during development and testing.

This tool also gives our team better test coverage on our projects by making it easy to test our scenes on a range of devices. Using Unity, we can create our project, add the multiple build tool on top, and that one project can then be built for a range of platforms at the same time, allowing us to test the experience and performance on each one. The time saved configuring our projects allows for more rigorous testing, ensuring that we have compatibility across a range of different hardware specifications and that our project builds efficiently across the board.

A multiple platform build tool also reduces the learning curve for anyone new to Unity who wants to build and test on various devices. By removing the tedious and confusing configuration steps when setting up a project, we cut the set-up time and lower the barrier to entry for developing immersive solutions. This lets a less experienced Unity user learn the development process of a project in a shorter period of time. Using this one tool on top of our projects also reduces duplication of effort when creating builds and creates a standardised process across all projects.

The build tool can also be used alongside other forms of development, such as creating experiences inside a VR environment that can then be tested as AR experiences. By creating scenes while inside a VR environment we are able to see how the objects work together, physically interact with them and alter their placement, all while inside the scene. Once we are happy with the scene we can take it out of the VR setting and pass it through the multiple build tool to test it on multiple platforms.

Example of a VR scene (made in Google Blocks)



We will see the rise of this style of development in the coming years with the advancement of tools like Microsoft Maquette and Google Blocks. Being able to build VR scenes to mobile platforms as AR applications will also make an application more accessible to those without the hardware necessary for VR experiences.

How did we do it?

The first option available is a range of paid packages on the Unity Asset Store, with prices starting at $10 and rising from there. These are viable options for building to multiple platforms, but only where they cover the platforms you need, which isn't always the case.

For our solution we went with an open source repository on GitHub that we could build on top of. At the time we used it, the repository didn't cover all the platforms we needed for the project, so we built upon the codebase and added some additional functionality. We spent time developing the iOS platform and updated the Android platform to reflect newer phone hardware. Once these changes were complete, we tested both types of mobile build and they proved highly successful!
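The sketch below gives a flavour of the kind of per-platform tweaks involved; the bundle identifier and version values are placeholders rather than the settings of the real project.

```csharp
using UnityEditor;

// A hedged sketch of mobile-specific player settings; the identifier and
// version values below are placeholders, not the real project's configuration.
public static class MobilePlatformSettings
{
    public static void Apply()
    {
        // Shared application identifier for both mobile platforms (placeholder).
        PlayerSettings.SetApplicationIdentifier(BuildTargetGroup.iOS, "com.example.multibuild");
        PlayerSettings.SetApplicationIdentifier(BuildTargetGroup.Android, "com.example.multibuild");

        // iOS: require an OS version recent enough for AR features.
        PlayerSettings.iOS.targetOSVersionString = "11.0";

        // Android: raise the minimum SDK level to match newer phone hardware.
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel24;
    }
}
```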

The tool was able to determine which plugins were necessary for each device, so packages were only added to a build if they were relevant to that platform. Building AR scenes was more complicated. Unity views scenes through cameras: either a standard camera or an AR camera, which can sense depth for placing objects. If we want to deploy an AR scene on a VR platform, we have to include both the standard camera and the AR camera needed for mobile experiences. However, Unity was not able to build with both of these cameras in the scene at the same time. The way around this was to build a scene containing both cameras and toggle them in and out of the scene depending on the platform.
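One way such a toggle can be expressed is a small component that enables the right camera at startup. This is a runtime sketch with placeholder field names; the switch could equally be performed at build time.

```csharp
using UnityEngine;

// A minimal runtime sketch of the camera toggle described above; the field
// names are placeholders and both cameras are assigned in the Inspector.
public class CameraSwitcher : MonoBehaviour
{
    public GameObject arCamera;       // depth-sensing AR camera used on mobile
    public GameObject standardCamera; // regular camera used for VR / desktop builds

    void Awake()
    {
        bool isMobile = Application.platform == RuntimePlatform.Android ||
                        Application.platform == RuntimePlatform.IPhonePlayer;

        // Activate only the camera relevant to the current platform.
        arCamera.SetActive(isMobile);
        standardCamera.SetActive(!isMobile);
    }
}
```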




I will admit that I learnt a lot over the course of researching and developing this tool. As someone who had never touched Unity before this project, I had to quickly pick up the basics and understand both VR and AR development at the same time. Overall I thoroughly enjoyed the challenge of building this solution and look forward to seeing my team make use of it in the future!



Unity has been pushing out updates since its first release in 2005, and judging by the developments in the last few updates, we can expect a tool that does all of this with ease to be released soon.



If you are interested in contacting the Applied Innovation team to discover more about what we do, feel free to send us an email at appliedinnovation@kainos.com or check out some of our other projects here.

View the original Medium article here.

About the author

Chloe Thompson