For the last three years, the UC Arts Digital Lab has been sharing a floor with the College of Arts Office staff and has been invited to enter their Christmas door-decorating competition. This is an invitation that we do not take lightly, mostly due to an over-zealous thirst for competition, but also because we love decorating things, and (of course) we love Christmas.
This year, we decided that as a Digital Lab our door should reflect the digital skills that we have on offer. So armed with Python and a webcam, we began the process of creating a script that would turn people walking down the hallway into Santa! The idea was to detect people’s faces using facial recognition software and then to superimpose a beard and a hat onto their face and project it onto a screen.
As it happens, there is a fantastic library called OpenCV which can be used for face detection (what we had loosely been calling facial recognition). OpenCV provides an infrastructure for object detection which can be trained to detect any kind of object. It also comes with a number of ready-to-use detectors, such as face, mouth, and nose detectors, which can be used to build face-detecting programs. If you are interested in how these work, you can read about it in this blogpost by Engin Kurutepe.
A quick Google search revealed that there are many tutorials and scripts available online for OpenCV. I started with a tutorial which shows you how to attach a mustache to people’s faces:
While I liked the mustache, our goal had always been to add a beard. However, Jennifer and I quickly found that adapting scripts in OpenCV is surprisingly fiddly, since everything is positioned relative to just four values per face: the x and y coordinates of the top-left corner of its bounding box, plus a width and a height. Moving the beard down the face wasn’t too hard, but adjusting its shape and proportions was incredibly difficult. What’s more, if the image of the beard tried to move beyond the limits of the screen, the programme would crash. We got very sick of seeing this message:
With some help from our volunteers Brad and Aidan, I did manage to get a beard working at one point, but it was very buggy. If you moved too close to the edges or the corners of the screen, the programme would stop running. I decided to abandon this idea.
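In hindsight, that class of crash can be avoided by clipping the overlay region to the frame boundaries before pasting. Here is a sketch of the kind of bounds-safe overlay helper we needed — a hypothetical function, assuming the beard is a BGRA image with an alpha channel:

```python
# A sketch of a bounds-safe overlay: pasting a BGRA image onto a
# BGR frame without crashing when it runs past the frame edges.
import numpy as np

def overlay_image(frame, overlay, x, y):
    """Paste a BGRA overlay onto a BGR frame at (x, y), clipping at
    the frame edges instead of raising an out-of-bounds error."""
    fh, fw = frame.shape[:2]
    oh, ow = overlay.shape[:2]
    # Clip the destination region to the frame boundaries
    x1, y1 = max(x, 0), max(y, 0)
    x2, y2 = min(x + ow, fw), min(y + oh, fh)
    if x1 >= x2 or y1 >= y2:
        return frame  # entirely off-screen: nothing to draw
    # Matching slice of the overlay image
    ox1, oy1 = x1 - x, y1 - y
    ox2, oy2 = ox1 + (x2 - x1), oy1 + (y2 - y1)
    alpha = overlay[oy1:oy2, ox1:ox2, 3:4] / 255.0
    region = frame[y1:y2, x1:x2].astype(float)
    blended = alpha * overlay[oy1:oy2, ox1:ox2, :3] + (1 - alpha) * region
    frame[y1:y2, x1:x2] = blended.astype(np.uint8)
    return frame
```

With clipping like this, a beard sliding off the edge of the screen simply gets cropped rather than killing the programme.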
Luckily for me, there was another tutorial online specifically for a Santa hat! After adjusting the size of the frame, I at least had the hat portion of our programme sorted:
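Under the hood, hat placement boils down to a little arithmetic on the face bounding box: scale the hat to the face width and sit it just above the forehead. A hypothetical helper along these lines (the scale factors here are guesses to be tuned by eye, not the tutorial's actual values):

```python
# A sketch of computing where to draw a hat from a face bounding box.
# The 1.2 width factor and forehead offset are illustrative guesses.
def hat_placement(x, y, w, h, hat_aspect=0.8):
    """Given a face box (x, y, w, h), return (hat_x, hat_y, hat_w, hat_h)
    for a hat image whose height/width ratio is hat_aspect."""
    hat_w = int(w * 1.2)           # a little wider than the face
    hat_h = int(hat_w * hat_aspect)
    hat_x = x - (hat_w - w) // 2   # centred on the face
    hat_y = y - hat_h + h // 8     # sitting just above the forehead
    return hat_x, hat_y, hat_w, hat_h
```

Note that the resulting box can easily have a negative `hat_y` for a face near the top of the frame — exactly the off-screen situation that needs clipping.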
Not happy with merely a hat, I tracked down a video of falling snow and overlaid each of its frames onto the webcam feed. I also played around with the idea of audio, specifically a clip which would play “ho ho ho” whenever it detected a face. I quickly realised that this was not only extremely irritating, but it also slowed our webcam feed to a standstill. Snow and a Santa hat were enough anyway…right???
While I was working on the software component of the project, Jennifer and Rosalee created the “hardware”. Thanks to Rosalee’s talented flatmate Riaan, we had a screen with a metal bracket on the back that could be slotted over our door. We decided that we should construct a 3D structure around this that would look like a Gameboy (Christmas themed, of course):
As you can see, we ran cords under the door so that we could power and connect to the screen. We also spent some time setting the code up on a Raspberry Pi, but unfortunately the bit rate from the webcam was far too low. There are ways to improve this, but we were running out of time. Instead we ran the programme off a laptop on a chair behind the door (with just enough room to open it comfortably).
Are you ready to see the finished result? Without further ado, I present to you “Santa-fy Yourself” by the UC Arts Digital Lab: