By Lynn Thompson
This article provides an example of using Unity* 3D assets to simulate Windows* graphical user interface (GUI) widgets. The TouchScript package, available at no cost from the Unity Asset Store, provides several touch gestures; the example in this article uses the Press Gesture and Release Gesture.
The example begins with a preconfigured scene that has geometry for visual reference points and a capsule asset with a first-person shooter (FPS) controller and the scene’s main camera as a child for navigating the geometry. The scene is initially navigated in the default manner, with the FPSController taking its input from the keyboard. To provide an alternate means of input for the FPS controller, I construct a GUI widget out of Unity 3D cubes. This widget has four buttons: Up, Down, Left, and Right. I make the GUI widget visible on the screen by placing it in view of a second scene camera with an altered normalized view port rectangle and a higher depth setting than the main camera. When these buttons receive a touch gesture, the corresponding action is sent to the FPS input controller to generate the desired effect.
I construct an additional four-button widget to control rotation of the capsule asset that holds the scene’s main camera. These buttons enable users to manipulate the “first person’s” rotation independently of the FPSController, even while that controller is moving the asset. This functionality uses the Press and Release Gestures.
When the example is complete, running the simulation allows users to touch multiple buttons in varying patterns to navigate the scene. After that, I explore how you can use and modify the GUI widgets developed in this example to emulate other common gaming controllers. I also discuss the challenges I experienced using touch-based controllers in contrast to keyboard and handheld controllers.
Creating the Example
I begin this example by importing a preconfigured scene I exported from Autodesk 3ds Max*, then adding a capsule and configuring it with an FPSController. By default, this controller takes its input from the keyboard. See Figure 1.
Figure 1. Unity* 3D Editor with a scene imported from Autodesk 3ds Max*
Adding Geometry
Next, I add geometry (cubes MoveForward, MoveBackward, MoveLeft, and MoveRight) to simulate a Windows GUI widget. I also add a light and camera to visualize the newly added cubes. To place this camera’s view in the bottom left of the runtime scene, I change both of the normalized view port Rect settings for elements W and H from 0 to 0.25. Also, for the camera to appear in the scene, its Depth setting must be greater than that of the main camera. The Depth setting for the main camera is −1, so the default Depth setting of 0 for the new camera will work. I make the light and cubes children of the camera by dragging these elements onto the camera in the Hierarchy panel. Next, I add the TouchScript > Layers > Camera Layer to the camera by clicking Add Component in the Inspector panel. The GUI widgets won’t function if this Camera Layer step is not performed. See Figure 2.
Figure 2. Unity* 3D Editor showing a GUI widget
Adding a GUI Widget
I repeat this process to add a GUI widget to the bottom right of the screen for rotation control of the capsule with the FPSController and main camera. The scene looks like Figure 3, with both of the GUI widgets added to the scene.
Figure 3. Unity* 3D runtime with imported scene and GUI widgets
Connecting the Widgets to the FPS Controller
The next step is to connect the new GUIWidget cubes to the FPSController. To do so, I modify the default FPS input controller script for the capsule to use variables to instigate movement rather than input from the keyboard. See script FPSInputController.js in the accompanying Unity 3D scene.
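The accompanying script is FPSInputController.js; as a rough C# illustration of the same change, the sketch below swaps the keyboard axes for variables that the button scripts can set. The class layout, the static variables, and the use of a CharacterController with SimpleMove are my own assumptions for the sketch, not the stock controller verbatim.

using UnityEngine;

// Hypothetical C# stand-in for the modified FPS input controller: movement is
// driven by variables that the GUI widget buttons set, not by the keyboard.
public class FPSInputController : MonoBehaviour
{
    // "Global" variables written by the button scripts (-1, 0, or 1).
    public static float horizontal = 0.0f;
    public static float vertical = 0.0f;

    public float speed = 6.0f;
    private CharacterController controller;

    private void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    private void Update()
    {
        // Originally: Input.GetAxis("Horizontal") and Input.GetAxis("Vertical").
        Vector3 direction = new Vector3(horizontal, 0.0f, vertical);
        direction = transform.TransformDirection(direction);
        controller.SimpleMove(direction * speed);
    }
}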
Adding Gestures to the Widget
Next, I add TouchScript Press and Release Gestures to each move GUIWidget cube by clicking Add Component in the Inspector panel for each cube. The TouchScript menu for selecting a gesture became available when I downloaded and installed the TouchScript package.
After the gestures have been added, I add a custom script to the cube to receive the gesture and perform the desired action. Choosing to start with CubeMoveLeft, I add a new MoveLeft script to the cube by clicking Add Component in the Inspector panel. This script sends a horizontal value of −1 to the FPS input controller’s global variable horizontal when the cube receives a Press Gesture. I also add code to this script to change the scale of the cube to visually confirm receipt of the gesture. See the C# script MoveLeft.cs in the accompanying Unity 3D scene.
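As a rough sketch of what MoveLeft.cs might look like: the handlers below assume the static horizontal variable from the earlier controller sketch, and the Pressed/Released events are how the TouchScript version I used exposes the Press and Release Gestures; other versions may expose them differently (for example, through the base Gesture class’s StateChanged event).

using System;
using UnityEngine;
using TouchScript.Gestures;

// Hypothetical sketch of MoveLeft.cs: push -1 into the controller on press,
// restore 0 on release, and rescale the cube as visual feedback.
public class MoveLeft : MonoBehaviour
{
    private Vector3 originalScale;

    private void Start()
    {
        originalScale = transform.localScale;

        // Assumption: the Press and Release Gestures on this cube expose Pressed/Released events.
        GetComponent<PressGesture>().Pressed += OnPress;
        GetComponent<ReleaseGesture>().Released += OnRelease;
    }

    private void OnPress(object sender, EventArgs e)
    {
        FPSInputController.horizontal = -1.0f;          // move left
        transform.localScale = originalScale * 0.8f;    // visual confirmation of the press
    }

    private void OnRelease(object sender, EventArgs e)
    {
        FPSInputController.horizontal = 0.0f;           // stop
        transform.localScale = originalScale;
    }
}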
Similarly, I add scripts that send −1 for the MoveBackward cube and 1 for the MoveForward and MoveRight GUIWidget cubes. See the C# scripts Move[Backward,Forward,Right].cs in the accompanying Unity 3D scene.
Enabling Button Functionality
At this point, I can use the Move GUI widgets to navigate the scene, but only one at a time. I can’t use the MoveForward and MoveLeft or MoveRight buttons in combination to move at a 45‑degree angle. To enable this functionality, I create an empty GameObject at the top of the hierarchy and use Add Component to add the Touch Manager script from the TouchScript menu. I also add the Win7TouchInput script from the TouchScript Input menu.
Now that the Move buttons work and I can navigate the scene by touching multiple buttons, I’m ready to implement the rotation functionality. These buttons don’t manipulate the FPSController directly; instead, they rotate the capsule holding the FPSController and the scene’s main camera. Using the OnPress and OnRelease functionality as above, the script attached to the RotateLeft GUIWidget cube rotates the FPS capsule and its child main camera to the left when the cube is touched. See the script RotateLeft.cs in the accompanying Unity 3D scene.
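A minimal sketch of the idea behind RotateLeft.cs follows. The capsule reference, the rotation speed, and the Pressed/Released event wiring are assumptions made for illustration; the accompanying script may differ in detail.

using UnityEngine;
using TouchScript.Gestures;

// Hypothetical sketch of RotateLeft.cs: while the cube is held, rotate the FPS
// capsule (and its child main camera) to the left.
public class RotateLeft : MonoBehaviour
{
    public Transform fpsCapsule;            // drag the capsule here in the Inspector
    public float degreesPerSecond = 45.0f;

    private bool pressed;

    private void Start()
    {
        GetComponent<PressGesture>().Pressed += (sender, e) => pressed = true;
        GetComponent<ReleaseGesture>().Released += (sender, e) => pressed = false;
    }

    private void Update()
    {
        if (pressed)
            fpsCapsule.Rotate(0.0f, -degreesPerSecond * Time.deltaTime, 0.0f);
    }
}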
Similarly, I add scripts to send the appropriate rotation vector to the RotateUp, RotateRight, and RotateDown GUIWidget cubes. See the scripts Rotate[Backward,Forward,Right].cs in the accompanying Unity 3D scene.
The Completed Example
This completes “hooking up” the cubes being used as GUI widgets. I can now navigate the scene with touch-controlled movement and rotation by touching and releasing multiple buttons in varying combinations.
I added a script to the main camera to create a video of the scene being run. This script writes a new .png file each frame. See the script ScreenCapture.cs in the accompanying Unity 3D scene.
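A per-frame capture script can be as small as the sketch below; the file-naming scheme is my own choice, and Application.CaptureScreenshot is the capture call available in the Unity version used here.

using UnityEngine;

// Minimal sketch of a per-frame capture script. Writing a .png every frame is
// expensive, which is why the script is removed once the frames are captured.
public class ScreenCapture : MonoBehaviour
{
    private int frame;

    private void LateUpdate()
    {
        Application.CaptureScreenshot("frame_" + frame.ToString("D5") + ".png");
        frame++;
    }
}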
I compiled the .png files that this script writes into a video called Unity3dTouch2.wmv using Autodesk 3ds Max and Windows Live* Movie Maker. I removed this script from the main camera upon completion of the video because it noticeably degrades the performance of the scene when active.
Video 1: Touch Script Multi Gesture Example
Common Game Controllers
Common game controllers include first person, third person, driving, flying, and overhead. Let’s look at each.
First Person
When comparing the GUI widgets implemented in the example above to the stock Unity 3D first-person controller, one of the most noticeable differences is that the GUI widget example can leave the camera rotation in an odd configuration. When you use two buttons for capsule rotation, it’s not immediately obvious how to return the rotation to its original state, with the camera aligned with the scene horizon.
The stock Unity 3D first-person controller uses a script called MouseLook to perform the functionality that the Rotate[Left,Right,Up,Down] buttons provide. The MouseLook script uses localEulerAngles to rotate the camera; it offers a better means of rotating the camera view than the capsule rotation I used in the example. To take advantage of it, you can proceed in a manner similar to the FPSInputController: add public variables mouseX and mouseY to the MouseLook script, then use them to replace the Input.GetAxis("Mouse X") and Input.GetAxis("Mouse Y") calls in the script. When these variables are hooked up to the rotate buttons and incremented and decremented, respectively, the scene’s main camera rotates in a more useful manner.
Third Person
The stock Unity 3D third-person controller can be adapted to touch in a way similar to the first-person controller: implement the Move[Left,Right,Up,Down] values in the ThirdPersonController.js script after hooking it up to the touch buttons with new scripts, as before. The stock Unity 3D third-person controller automatically calculates the main camera rotation and position, leaving the second GUI widget created in the example available for alternate use. One possibility is to use the top and bottom buttons to increase and decrease the variable jumpHeight, respectively, and the left and right buttons to increase and decrease the variable runSpeed, respectively; a sketch of one such adjustment follows. Many variables in the ThirdPersonController.js script are available for similar adjustment.
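This sketch shows one way a button could bump such a variable on press. Because the stock controller is a JavaScript asset, the sketch reaches the field through reflection so it stays self-contained; the field name, step size, and Pressed event wiring are assumptions.

using System.Reflection;
using UnityEngine;
using TouchScript.Gestures;

// Hypothetical sketch: repurpose one widget button to increase a public float
// (here runSpeed) on the third-person controller each time it is pressed.
public class IncreaseRunSpeed : MonoBehaviour
{
    public MonoBehaviour thirdPersonController;  // drag the controller script here in the Inspector
    public string fieldName = "runSpeed";        // assumed public float on the controller
    public float step = 0.5f;

    private void Start()
    {
        GetComponent<PressGesture>().Pressed += delegate
        {
            FieldInfo field = thirdPersonController.GetType().GetField(fieldName);
            if (field == null || field.FieldType != typeof(float)) return;

            float current = (float)field.GetValue(thirdPersonController);
            field.SetValue(thirdPersonController, current + step);
        };
    }
}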
Driving
In the controllers examined so far, the Move[Left,Right,Forward,Reverse] scripts stop the motion of the character when an OnRelease event is detected. For a driving-type game, the scripts would do more than send a 1 or −1 to the first- or third-person controller. The Forward and Reverse scripts would send a range of values to emulate throttling and braking. The first 80% of the value range might be reached quickly while the button is held down, for rapid acceleration; the remaining 20% would be sent to the appropriate vector slowly, so that maximum speed is only attained by continually holding the Forward button down on a straight road. The left and right buttons would perform similarly, possibly controlling the rotation of a Unity 3D asset that uses a wheel collider. In this type of scene, the GUI widget not used for steering can be used to control parameters such as camera distance from the vehicle, throttle and braking sensitivity, and tire friction.
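A simple way to express that throttle curve is a ramp whose rate changes once 80% of the range has been reached; the rates and threshold below are illustrative only.

using UnityEngine;

// Sketch of the throttle ramp described above: the first 80% of the range
// builds quickly while the Forward button is held, the final 20% accumulates
// slowly toward top speed, and the value decays when the button is released.
public class ThrottleRamp : MonoBehaviour
{
    public float throttle;              // 0..1, read by the vehicle controller
    public bool forwardPressed;         // set by the Forward button's press/release handlers
    public float fastRate = 1.0f;       // per second, used below the 0.8 threshold
    public float slowRate = 0.05f;      // per second, used above the 0.8 threshold
    public float decayRate = 2.0f;      // per second, when the button is released

    private void Update()
    {
        if (forwardPressed)
        {
            float rate = throttle < 0.8f ? fastRate : slowRate;
            throttle = Mathf.Min(throttle + rate * Time.deltaTime, 1.0f);
        }
        else
        {
            throttle = Mathf.Max(throttle - decayRate * Time.deltaTime, 0.0f);
        }
    }
}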
Flying
To use the GUI widget interface developed in the example for a flying-type game, you would use the Move[Left,Right,Forward,Reverse] buttons much like a joystick or flight stick: the left and right buttons would adjust roll, and the up and down buttons would control pitch. The Rotate[Left,Right] buttons in the other GUI widget could be used to increase and decrease yaw and camera distance from the aircraft.
Overhead View
In this type of scene, the main camera orbits the scene from overhead, likely moving around the perimeter of the scene while “looking” at the center of the scene. In a script attached to a Unity 3D scene’s main camera, you could define several Vector3s at points along the perimeter of the scene. Using the Vector3.Lerp function, you can control the fraction parameter with the MoveLeft and MoveRight GUI widget buttons to move the camera between two of the perimeter points. The script can detect when a perimeter point has been reached and begin “Lerp’ing” between the next two Vector3 points. The MoveForward and MoveReverse buttons can be used to adjust the vertical component of the Vector3 points to move the orbiting camera closer to or farther away from the scene. You could employ the other GUI widget, used for Rotate[Left,Right,Up,Down] in the example, for a wide variety of things, such as time-of-day control or season-of-year control.
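A sketch of that orbital camera follows; the perimeter array, input variable, and speed are illustrative placeholders.

using UnityEngine;

// Sketch of the overhead camera described above: MoveLeft/MoveRight adjust the
// Lerp fraction along the current pair of perimeter points while the camera
// keeps looking at the scene center.
public class OverheadOrbit : MonoBehaviour
{
    public Vector3[] perimeterPoints;    // points defined around the scene in the Inspector
    public Vector3 sceneCenter = Vector3.zero;
    public float moveInput;              // -1, 0, or 1 from the MoveLeft/MoveRight buttons
    public float speed = 0.25f;          // fraction change per second

    private int index;                   // start of the current segment
    private float fraction;              // 0..1 along the current segment

    private void Update()
    {
        fraction += moveInput * speed * Time.deltaTime;

        // When a perimeter point is reached, start Lerp'ing along the next segment.
        if (fraction > 1.0f) { fraction = 0.0f; index = (index + 1) % perimeterPoints.Length; }
        if (fraction < 0.0f) { fraction = 1.0f; index = (index - 1 + perimeterPoints.Length) % perimeterPoints.Length; }

        Vector3 a = perimeterPoints[index];
        Vector3 b = perimeterPoints[(index + 1) % perimeterPoints.Length];

        transform.position = Vector3.Lerp(a, b, fraction);
        transform.LookAt(sceneCenter);
    }
}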
Issues with Touch Control
The most readily observed issue in using touch control in the example above is that it blocks the view of the scene in the lower left and lower right corners. You may be able to partially remedy this problem by getting rid of the cameras that view the GUI widget buttons and making the buttons children of the scene’s main camera. The buttons would still be in the scene but would not block out an entire rectangle of the scene.
You could further minimize button visibility by making the buttons larger for more intuitive contact, and then making them disappear when touched and reappear when released. You can achieve this disappearing and reappearing by manipulating the asset’s MeshRenderer in the onPress and onRelease functions as follows:
GetComponent<MeshRenderer>().enabled = false;
. . .
GetComponent<MeshRenderer>().enabled = true;
Another challenge of using touch is ergonomics. When using touch, users can’t rest their wrists on a keyboard and may not be able to rest their elbows on a desk. When developing GUI widgets for touch, take care to place buttons in the best position possible and to use the most efficient gesture possible.
Conclusion
The TouchScript package functions well when implementing the Press and Release Gestures. The resulting Unity 3D scene performed as desired when developed and run on Windows 8, even though the TouchScript input was defined for Windows 7.
The more common gaming interfaces can be emulated with touch. Because you can implement many combinations of touch gestures, many options are available when implementing and expanding these emulations. Keeping ergonomics in mind while implementing these GUI widget interfaces will lead to a better user experience.
About the author
Lynn Thompson is an IT professional with more than 20 years of experience in business and industrial computing environments. His earliest experience was using CAD to modify and create control system drawings during a control system upgrade at a power utility. During this time, Lynn received his B.S. degree in Electrical Engineering from the University of Nebraska, Lincoln. He went on to work as a systems administrator at an IT integrator during the dot-com boom. This work focused primarily on operating system, database, and application administration on a wide variety of platforms. After the dot-com bust, he worked on a range of projects as an IT consultant for companies in the garment, oil and gas, and defense industries. Now, Lynn has come full circle and works as an engineer at a power utility. Lynn has since earned a Master of Engineering degree with a concentration in Engineering Management, also from the University of Nebraska, Lincoln.
Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
Copyright © 2013 Intel Corporation. All rights reserved.
*Other names and brands may be claimed as the property of others.