Affective Programming Task

As I mentioned in one of my earlier posts this trimester, one of our tasks was to create an external input tool that would enable the design students to use a webcam, eye tracker, or some other form of non-standard input within their Unity projects. Following on from that post in particular, we decided against trying to convert eyeLike into a library, since it was looking increasingly difficult to actually create a working crossover in Unity.

We opted for an alternative approach that reuses some of our networking code from the draw client. Instead of running eyeLike within Unity, the Unity application will launch eyeLike itself and, with some additions to the code base, open a local network connection between the two processes. This means eyeLike can send a constant stream of update packets to the Unity side of the application, giving the user a number of different pieces of tracking information to draw from.
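To give a rough idea of what the eyeLike side of that connection might look like, here's a minimal sketch of a sender streaming update packets over a local UDP socket. The port number, the GazeUpdate layout, and its fields are all placeholders for illustration, not the actual protocol from our repo.

```cpp
// Minimal sketch of an eyeLike-side sender over a local UDP socket.
// The port, packet layout, and GazeUpdate fields are illustrative
// assumptions, not the project's real protocol.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstring>

// Hypothetical per-frame payload: normalized pupil coordinates plus a
// flag indicating whether the eyes were detected this frame.
struct GazeUpdate {
    float leftX, leftY;
    float rightX, rightY;
    uint8_t detected;
};

int main() {
    // Unity listens on localhost; the port is an assumption.
    const int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return 1;

    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(9050);
    inet_pton(AF_INET, "127.0.0.1", &dest.sin_addr);

    // In the real tool this would run inside eyeLike's per-frame
    // pupil-tracking loop; here we just send a single dummy packet.
    GazeUpdate update{0.48f, 0.52f, 0.51f, 0.53f, 1};
    char buf[sizeof(GazeUpdate)];
    std::memcpy(buf, &update, sizeof(update));
    sendto(sock, buf, sizeof(buf), 0,
           reinterpret_cast<sockaddr*>(&dest), sizeof(dest));

    close(sock);
    return 0;
}
```

UDP suits this kind of per-frame stream because a dropped gaze sample is harmless: the next frame's packet replaces it, and neither side has to block waiting for acknowledgements.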

With our target being a system that can tell when the user is looking at their keyboard, I think we came up with a decent solution. While it still produces quite a few false positives, and in some instances won't detect at all, it does provide the required functionality. Considering it uses a standard webcam rather than dedicated hardware, I'm impressed with the result.

Repo Link.
