Victor - Part 1

I started this blog when I first got to Dartmouth. I'm now over a term into my junior year, and I still haven't mentioned one of my largest ongoing projects, and by far the one that I get the most enjoyment out of: Victor. It was around the summer of my freshman year that I first started messing around with the Kinect v2 SDK and had the idea to build an AI that could see the room, hear commands, and serve as a sort of custom-built Siri. This idea would (eventually) be fleshed out into Victor.

Victor started off not with code, but with something much more important: choosing a name. Apple had Siri, Microsoft had Cortana, Tony Stark had J.A.R.V.I.S., but I wanted something unique. JARVIS stands for "Just a Rather Very Intelligent System", and I figured I could probably come up with an acronym that also formed a name if I tried hard enough. I wrote down a whole bunch of tech/AI terms for every letter of the alphabet, and then went through a bunch of names that popped into my head, trying to force one of them into a somewhat passable acronym. After many failed attempts, I finally settled on V.I.C.T.O.R. - Vocal Intelligence Controlled Through Order and Response.

It was time to get started. I decided that, to start with, the Kinect would be Victor's "eyes" and "ears". It would run through my computer and use the computer speakers to talk. I quickly dove into the Kinect SDK and built a quick prototype that could listen to, process, and respond to commands. He was written in C# in Visual Studio, and in the beginning could tell you the weather using Wunderground's API...and that was it. My roommates and I chose some hardcoded phrases that he could give semi-randomized answers to, and he just kind of stagnated. He ended up continually responding to misheard queries, becoming far more annoying than actually useful. So I unplugged him, shut his code down, and that was that.
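For a sense of what that first command loop looked like, here's a minimal sketch in C#. This is my reconstruction, not Victor's actual code: it uses System.Speech against the default microphone instead of wiring in the Kinect v2's mic array, and the phrases and confidence threshold are placeholders. A confidence cutoff like the one below is one way to cut down on the misheard-query problem.

```csharp
using System;
using System.Speech.Recognition;
using System.Speech.Synthesis;

class VictorSketch
{
    static void Main()
    {
        var synth = new SpeechSynthesizer();
        using (var recognizer = new SpeechRecognitionEngine())
        {
            // A small grammar of hardcoded phrases, much like the originals were.
            var commands = new Choices("victor what's the weather", "victor hello");
            recognizer.LoadGrammar(new Grammar(new GrammarBuilder(commands)));
            recognizer.SetInputToDefaultAudioDevice();

            recognizer.SpeechRecognized += (s, e) =>
            {
                // Ignore low-confidence matches so background chatter
                // doesn't trigger responses.
                if (e.Result.Confidence < 0.6) return;
                synth.Speak("You said: " + e.Result.Text);
            };

            // Keep recognizing continuously until Enter is pressed.
            recognizer.RecognizeAsync(RecognizeMode.Multiple);
            Console.ReadLine();
        }
    }
}
```

The real thing would swap `SetInputToDefaultAudioDevice` for an audio stream from the Kinect's beamforming mic array, which is what made the Kinect worthwhile as "ears" in the first place.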