The other third of Studio 3 is material design. That means creating some kind of custom controller for a game, using an existing controller in an unconventional way, or building something that doesn't use any digital parts whatsoever. It's pretty fun, and I must say it really makes me feel smart to sit and toy with electrical parts, making stuff light up and so on.
For the material design project, Paul Frame, Nicholas Duxbury, and I are working on a game called... FULL METAL BARBER. You are a barber who uses a shotgun to cut people's hair. Sounds awesome, right?
Well, to get this working we first had to decide how we wanted the game to play. We quickly came up with the idea of making some form of light gun that you aim at a mannequin to make things happen in the game. Paul had the idea of using a Wiimote as the main "pointer" for the weapon, and he also built an amazing cardboard scissor-gun that we could place the Wiimote and Nunchuk inside.
Here come all the challenges we had to go through to get it working. First of all, connecting a Wiimote was easier than we expected, though not entirely painless. All we had to do was pair the Wiimote over Bluetooth and that was it. The only issue we ran into was where to pair it (apparently there are two different places to connect Bluetooth devices).
The next issue was that the Wiimote didn't react to movement or button presses at all. Of course, we needed some software to recognize the Wiimote as a valid input device. Since we were making the game in Unity, we figured we'd use some form of Wiimote API for it, and luckily there is one, made by Flafla. Once it was set up, all we had to do was press a button and the button input showed up!
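For anyone trying the same thing, here's roughly what that setup looks like in a Unity script. This is a hedged sketch based on my reading of the plugin's documentation, so treat the exact class and method names (WiimoteManager, ReadWiimoteData, etc.) as assumptions rather than gospel:

```csharp
using UnityEngine;
using WiimoteApi;

public class WiimoteInput : MonoBehaviour
{
    private Wiimote wiimote;

    void Start()
    {
        // Look for Wiimotes already paired over Bluetooth and grab the first one.
        WiimoteManager.FindWiimotes();
        if (WiimoteManager.HasWiimote())
            wiimote = WiimoteManager.Wiimotes[0];
    }

    void Update()
    {
        if (wiimote == null) return;

        // Drain all pending data reports from the Wiimote this frame.
        int ret;
        do {
            ret = wiimote.ReadWiimoteData();
        } while (ret > 0);

        // Button states are then just flags you can poll.
        if (wiimote.Button.a)
            Debug.Log("A button pressed!");
    }
}
```

The important bit for us was that read-loop: button presses only show up after you've actually pulled the latest data reports off the controller.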
There was, however, no reaction to the actual movement of the controller, so it couldn't be used for what we wanted. After some conferring with our lecturers, we came to the obvious conclusion that we needed IR lights (a sensor bar) for the Wiimote to sense where it was pointing. Luckily, one of our lecturers had a spare PC sensor bar lying around in his office that we were allowed to use for this test, and voilà, there was movement recognition.
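If I've understood the plugin correctly, the hardware isn't the whole story: the Wiimote's IR camera also has to be switched on in code before any pointing data comes through. Something along these lines (again an assumption based on the plugin's docs, not a verified recipe):

```csharp
// Assumes `wiimote` is an already-connected WiimoteApi.Wiimote instance.

// Ask the Wiimote to report button, accelerometer AND IR data...
wiimote.SendDataReportMode(InputDataType.REPORT_BUTTONS_ACCEL_IR10_EXT6);

// ...and enable the IR camera itself, so it starts tracking the
// sensor bar's two clusters of IR lights.
wiimote.SetupIRCamera(IRDataType.BASIC);
```

Without both the sensor bar in front of you and the camera enabled, the Wiimote just sees darkness and reports nothing.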
And now we get to the issue that plagues us to this day: the API. Flafla is amazing for having created it, but as a user with no advanced coding experience, I would have preferred some form of manual on how to use different aspects of the code. Something as simple as "how to make shit happen on button presses" or "how to get the position of the pointer in world space and not in canvas space" would've been extremely helpful. This is still being figured out, but it seems like Paul has a solution, so it remains to be seen what works and what doesn't.
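In the meantime, here's our current working theory for the world-space problem: since the pointer seems to come back as a normalized 0-to-1 position, you should be able to treat it as a Unity viewport coordinate and project it into the scene. A sketch of that idea (GetPointingPosition and its 0-1 return range are assumptions from the plugin's docs; the Camera and Physics calls are standard Unity):

```csharp
using UnityEngine;
using WiimoteApi;

public class WiimotePointer : MonoBehaviour
{
    public Wiimote wiimote;       // assumed to be connected elsewhere
    public float aimDistance = 10f;

    void Update()
    {
        if (wiimote == null) return;

        // Normalized pointer position in [0,1]; assumption: negative
        // values mean the sensor bar is out of the camera's view.
        float[] p = wiimote.Ir.GetPointingPosition();
        if (p[0] < 0f || p[1] < 0f) return;

        // Treat it as a viewport coordinate and raycast into the scene
        // to find the actual world-space point being aimed at.
        Ray ray = Camera.main.ViewportPointToRay(new Vector3(p[0], p[1], 0f));
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            transform.position = hit.point;  // e.g. park a crosshair here
        }
        else
        {
            // Nothing hit: just project the aim out to a fixed distance.
            transform.position = Camera.main.ViewportToWorldPoint(
                new Vector3(p[0], p[1], aimDistance));
        }
    }
}
```

The raycast version is what we'd want for the mannequin, since it gives us the exact spot on its head the shotgun-scissors are pointing at.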
Until that update comes out, take care and thanks for reading this post.