Scott Purcival

Game Programmer

Kinect Touchscreen

By Scott Purcival on 21 March, 2021

The two-week game.

It started with an idea…

Western Downs Libraries used to host an annual event for kids called ‘Fun Palaces’.
Terrible name, great event. It was essentially a STEAM showcase and, for those who don’t know, STEAM stands for Science, Technology, Engineering, Art and Maths.

As Digital Support Officer for Western Downs Libraries, and resident game programmer, it was my job to come up with innovative tech experiences that pushed the boundaries of what had been done the year before.
The previous year, I had created a Greenscreen application to leverage the greenscreen that the library had invested in before I was hired, and I had also experimented a little with the Kinect in Unity for motion capture.

This year I decided to build on some inspiration that I had seen online and create a mega-touchscreen for kids to play on!


Initially I figured I would just create a simple game based on ‘mouse clicks’ and use a commercial Kinect touchscreen solution to drive it.
After trying a number of them, however, I found that the available options were either buggy and inconsistent, or too expensive.
I already had a Kinect plugin for Unity from the previous year’s mo-cap project, so I decided to try my hand at writing my own depth-sensing plugin for Unity.

The Touchscreen

The Kinect SDK provides a couple of interfaces. One is the skeletal interface, where the Kinect does all of the processing and the programmer is simply presented with a ‘skeleton’ which they can map to their own characters’ bones.
The other interface, which we were more interested in, is the raw depth data.

By pointing the Kinect sensor at a wall, and reading the distance to the wall, we can sense if an object, such as a hand, comes between the wall and the sensor.
By setting a minimum and maximum distance, which is within an inch or so of the wall, we can detect ‘touches’ without falsely triggering a touch from an arm or a body which is further away from the wall.
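As a rough sketch (plain Python rather than the actual Unity plugin code; the frame layout and band values are assumptions), the per-pixel test looks like this:

```python
# Sketch of the touch-band test described above. `baseline` is the
# calibrated per-pixel distance to the wall; a pixel only counts as a
# touch when something sits in a narrow band just in front of the wall,
# so an arm or body further from the wall does not trigger falsely.

def detect_touches(depth_frame, baseline, band_mm=(5, 30)):
    """Return (x, y) pixels whose depth falls inside the touch band.

    depth_frame, baseline -- 2D lists of distances in millimetres.
    band_mm -- (min, max) gap in front of the wall that counts as a touch.
    """
    near, far = band_mm
    touches = []
    for y, row in enumerate(depth_frame):
        for x, dist in enumerate(row):
            gap = baseline[y][x] - dist  # how far in front of the wall
            if near <= gap <= far:
                touches.append((x, y))
    return touches
```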

Of course this required a calibration step, to find the initial distance of the wall from the sensor.
I created a calibration routine which averaged the distance over 5 seconds, as the sensor is prone to some distance jitter.
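In essence (a Python sketch, not the original plugin code), the averaging step just means the per-pixel baseline is the mean over every frame captured during the calibration window:

```python
# Sketch of the jitter-smoothing step: average several seconds' worth of
# depth frames into a single per-pixel baseline distance for the wall.

def calibrate_baseline(frames):
    """Average a list of 2D depth frames into one per-pixel baseline."""
    count = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / count for x in range(width)]
            for y in range(height)]
```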
I also added a calibration step where the operator would pick a point on each corner of the projected screen, as the Kinect saw it. This was required to map the projected screen to screen space on the computer.
I then ran the calibrated points through a perspective (homography) transform matrix in realtime to map the touches on the wall to screen space.

Due to CPU limitations on the target device (an old dual core Pentium NUC) the touchable area was split into a 32 by 18 matrix, and whenever a touch was detected, any enemy in the touched cells would have a ‘hit’ function called on it.
For performance, a simple distance check was performed from the centre point of the cell to the centre point of the enemy.
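A minimal sketch of that cell lookup and distance check (in Python for brevity; the 32×18 grid size comes from the text above, everything else is illustrative):

```python
# Sketch of the coarse hit grid: screen space is split into 32 x 18
# cells, so a touch only costs a cheap distance check against the
# enemies registered in its cell.

GRID_W, GRID_H = 32, 18

def cell_for(pos, screen_w, screen_h):
    """Map a screen-space position to a (col, row) grid cell."""
    x, y = pos
    col = min(int(x / screen_w * GRID_W), GRID_W - 1)
    row = min(int(y / screen_h * GRID_H), GRID_H - 1)
    return col, row

def near_enough(cell_centre, enemy_centre, radius):
    """Cheap squared-distance test from cell centre to enemy centre."""
    dx = enemy_centre[0] - cell_centre[0]
    dy = enemy_centre[1] - cell_centre[1]
    return dx * dx + dy * dy <= radius * radius
```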

The Game

Now that the touchscreen was working, we started to build a game on top of it.

Since it was for the library, we went with a bookworm theme. The player had to stop the bookworms from eating all the books!
The library had a staff member, Alicia Streten, who was also a budding artist, so I brought her on board to create the artwork. Given the specifications, she was able to create excellent art for the game in less than a week.

The game loop was simple: enemies would spawn on the bookshelf and wander left to right. Some enemies were faster than others, and therefore worth more points.
Players had 90 seconds to kill as many worms as they could, with the goal of setting the best high score.

Testing and Reception

We tested the game internally with library staff and student volunteers.

The initial reaction was that we needed to add more worms, and that they were too easy to kill by hitting the wall with your hands. The worms were also often too high for shorter users to reach at the top of the projected screen, and users’ bodies would block the image from the projector.

We decided to use soft beanbags instead, letting the players throw the bags from a distance to try and hit the worms.
This added the perfect level of difficulty, and solved both the height issue and the problem of players obscuring the projection.

Bookworm Blitz at Fun Palaces 2019

We unveiled the game at Fun Palaces the following week and it was an instant mega hit.
Children were so engrossed in the game that parents often had to drag the children away, only for the kids to return later for another go.
Parents would often see how much fun the children were having and join in themselves. Beating the high score proved the perfect goal, leading to frenzied play with over six children at a time.

The game was so popular it was never idle from the start to the end of the day, and was considered a great success.

The game was deployed a number of times at subsequent events in the following years, and each time it was received very well.

Children playing Bookworm Blitz at Fun Palaces 2019 – Faces blurred.


The game project timeline was two weeks from inception to completion.

Given more time, I would have liked to optimise the touch detection routine; it was a little slow, which could lead to delayed or missed touches when players were too fast.

As far as art direction goes, I was very happy with the art Alicia was able to come up with in the short time allocated. The UI could do with more polish, and I would like a persistent high-score system, though the players did not seem to mind.

Overall, I was pleased with the execution of the project, and the only time I would consider as ‘wasted’ was the excessive amount of time I spent trying to get commercial touch solutions to work at the beginning.

Bookworm Blitz was made for, and is the property of Western Downs Regional Council as a paid project, and as such I am unable to provide the game for download or play here.


3 comments on “Kinect Touchscreen”

  1. Hi Scott!!
    My name is Miguel from Colombia. I am an indie developer, and I really like your work with the Kinect touchscreen. Right now I’m creating a similar project using the “zed 2” depth camera and reading the depth data as you do with the Kinect.
    But I would like to ask if there is a possibility you could give me a clue about how you calibrate the camera with the projection. I understand what you do with the 4 corners, and it works if the camera is centred in front of the projection, but when I give it some height, the perspective gives me problems.

    I really hope you can give me some advice about it.

    1. Hi Miguel!
      What you want is an inverse transform homography matrix.

      Here’s a basic class for it:
      You will need to create an instance of that class, initialize it with your four corners, and then pass any ‘touch’ points through GetInverseHomographyPosition() to translate them back to screen space coordinates for Unity to use.

      // in Start
      Homography homography = new Homography();

      // in calibration
      homography.SetHomographyPoints(botL, topL, botR, topR); // these should be the four points where the user touched the wall.

      // in update, or on click
      Vector2 screenSpaceTouchPos = homography.GetInverseHomographyPosition(Input.mousePosition) * new Vector2(Screen.width, Screen.height);

      This should get you out of trouble. 🙂
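For reference, the math such a class implements can be sketched in a few lines of Python (corner ordering and names here are illustrative, not the original C# API): build the 3×3 matrix taking the unit square to the four calibrated corners, invert it once, and run each touch through the inverse.

```python
# Sketch of a square-to-quad homography (closed form following Heckbert's
# texture-mapping derivation) and its inverse, used to map wall touches
# back to normalised screen coordinates. Assumes a non-degenerate quad.

def square_to_quad(c00, c10, c11, c01):
    """Homography H mapping (0,0),(1,0),(1,1),(0,1) to c00,c10,c11,c01."""
    x0, y0 = c00; x1, y1 = c10; x2, y2 = c11; x3, y3 = c01
    sx = x0 - x1 + x2 - x3
    sy = y0 - y1 + y2 - y3
    if sx == 0 and sy == 0:  # quad is a parallelogram: affine case
        return [[x1 - x0, x3 - x0, x0],
                [y1 - y0, y3 - y0, y0],
                [0.0, 0.0, 1.0]]
    dx1, dy1 = x1 - x2, y1 - y2
    dx2, dy2 = x3 - x2, y3 - y2
    den = dx1 * dy2 - dx2 * dy1
    g = (sx * dy2 - sy * dx2) / den
    h = (dx1 * sy - dy1 * sx) / den
    return [[x1 - x0 + g * x1, x3 - x0 + h * x3, x0],
            [y1 - y0 + g * y1, y3 - y0 + h * y3, y0],
            [g, h, 1.0]]

def invert3(m):
    """Inverse of a 3x3 matrix via the adjugate."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [[(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
            [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
            [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det]]

def apply_h(m, x, y):
    """Apply homography m to point (x, y), dividing out the w term."""
    w = m[2][0] * x + m[2][1] * y + m[2][2]
    return ((m[0][0] * x + m[0][1] * y + m[0][2]) / w,
            (m[1][0] * x + m[1][1] * y + m[1][2]) / w)
```

Build `H` once at calibration time from the four corner touches, cache `invert3(H)`, then push each wall touch through `apply_h` to get a normalised position and scale by the screen resolution — the same flow as the C# usage above.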
