Hey, we put up a sweet Vimeo page to highlight all the excellent/terrible videos we took leading up to and during Imagine RIT. They show us working, us not working, and our installation mostly working! I’m just kidding, our installation worked great. But they’re pretty fun and you should totally take a look.
I got all the bracelets printed on the 3D printer, and they came out…well, not quite how I built the models. Some parts are completely filled in where they should have been open. I can’t actually explain how that happened, since I used different methods to create different holes: some were booles, some were extruded geometry, and some were polygon holes that were manually filled. Suffice it to say I’m not completely happy with the results, which took hours of messy processing with a Dremel to fix. In the end, I managed to get them to the point that they are usable, and moved on to the dangerous and frustrating task of soldering all the parts into them.
USB cables are stupid expensive. USB patch cables (male-to-female) are even more so! Instead of buying 15′ cables to run our webcams for like $40 apiece, the innovators at DoodleOodle™ decided to make their own!
There have been some serious developments on the tracking front recently. Since we lost the contract with the Japanese, we have had to scramble to figure out a solution. For now, it appears that OpenCV, a computer vision library with a Processing wrapper, will suit our needs. However, the problem of actually tracking people persists.
One of the ideas we initially had last quarter was to allow users to create content for our games. Using a doodle app (now working on our website), they could draw powerups that they could then collect in the game. We had one end of the system working, the ability to create doodles, but we needed a way to incorporate them into the games. So when I got a few hours free I decided to give it a go, and what do you know, I figured something out. The first part is a PHP script on the server that supplies Flash with a filename list for a given directory when asked. Flash then goes through that list and figures out which files are the image files it wants. It loads one at random and places it in a movie clip set up beforehand. The background is a scrap of paper, and the holder for the image is set to multiply so that the black-and-white doodle looks like it was drawn on the paper scrap. The games can then use this created object wherever they want. I was really happy that this worked and wasn’t terribly complicated. Whether we actually implement it depends on our time frame leading up to Imagine RIT. The thumbnail shows my basic implementation of this system; the mug doodle was randomly selected from the approved image folder.
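The two halves of this pipeline (the server script that lists a directory, and the client logic that filters for image files and picks one at random) can be sketched roughly like this. Our actual code is PHP on the server and ActionScript in Flash; this is a hedged Python translation, and the extension list is an assumption:

```python
import os
import random

# Assumed set of image extensions the client cares about.
IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif"}

def list_doodles(folder):
    # Server side: return the folder's filenames as one
    # newline-separated string, like our PHP script does on request.
    return "\n".join(sorted(os.listdir(folder)))

def pick_image(listing):
    # Client side: keep only the image files from the listing,
    # then select one at random (or None if there are none).
    names = [n for n in listing.split("\n")
             if os.path.splitext(n)[1].lower() in IMAGE_EXTS]
    return random.choice(names) if names else None
```

The random pick is what makes each approved doodle show up in the game eventually without any manual bookkeeping.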
Our initial strategy for talking between Processing and Flash was simplistic and flawed. Processing would export an XML file with the information in it, and Flash would read that XML file and extract the information. The problem was two programs reading and writing a single file rapidly with no coordination, which caused quite a few read/write errors. It would still work if we ignored the errors, but it was a bad system. We have updated it significantly by moving to the OSC protocol. Processing broadcasts OSC messages over a network we set up between two separate computers. On the other computer we run a FLOSC server, a Java program that translates received OSC messages into something Flash can recognize with its socket listeners. This involved installing a specific version of Java, making sure the system environment variables were set up correctly, and then making sure the IP and port numbers matched in Processing, FLOSC, and Flash. But the sync issues are now gone, and we now transfer a much smaller string of characters that Flash deciphers instead of a full XML document. The screenshot below shows most of the components needed for this system running on one computer.
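For the curious, the OSC messages Processing broadcasts have a simple binary layout: a null-padded address string, a type-tag string, then big-endian arguments. Here is a minimal encoder sketch in Python (we actually use the Processing OSC library for this; the `/player/1` address is just an example, not our real address space):

```python
import struct

def osc_string(s):
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def encode_osc(address, *args):
    # Build one binary OSC message: address, type-tag string, arguments.
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian int
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        else:
            tags += "s"
            payload += osc_string(str(a))
    return osc_string(address) + osc_string(tags) + payload

# e.g. a tracked position: encode_osc("/player/1", 120, 45)
```

A message like this is a couple dozen bytes, which is why it beats shuttling a whole XML file through the filesystem.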
After constructing the basics of the Hub World/Game Manager at the beginning of the quarter, I’ve moved into the second stage of its construction. This time the focus is mostly on the second part of the title, the game manager. It’s not easy to coordinate the hub world with four different games developed by different people, so we’ve been working on a system that everyone can implement easily. The look and feel of the hub world is mostly finished, as you can see in the screenshot. Lots of bug testing to follow…
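The "system everyone can implement easily" boils down to a shared interface: each mini-game exposes the same start/stop hooks, and the manager is the only thing that decides what runs. A rough Python sketch of that shape (our real code is ActionScript, and all names here are hypothetical):

```python
class Game:
    # Base class each mini-game subclasses; name and hooks are hypothetical.
    name = "unnamed"

    def start(self):
        pass

    def stop(self):
        pass

class GameManager:
    def __init__(self):
        self.games = {}      # registered games by name
        self.active = None   # whichever game currently has control

    def register(self, game):
        self.games[game.name] = game

    def launch(self, name):
        # Stop whatever is running, then hand control to the chosen game.
        if self.active:
            self.active.stop()
        self.active = self.games[name]
        self.active.start()

    def return_to_hub(self):
        # Leaving a game always routes back through the hub world.
        if self.active:
            self.active.stop()
            self.active = None
```

With this, each developer only has to fill in their own `start`/`stop`, and the hub world never needs to know game internals.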
We are working on developing the technology to track players within our environment. We can already track any particular color in the environment; the issue we are trying to overcome is how to track four different colors in a low-light environment. Our first prototype uses light filters, as demonstrated in this YouTube video.
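The core of color tracking is simple: for each target color, collect the pixels within some distance of it in RGB space and take their centroid. A minimal pure-Python sketch of that idea (our real version runs per-frame in Processing on camera input; the threshold value is an assumption):

```python
def color_dist_sq(c1, c2):
    # Squared Euclidean distance between two (r, g, b) triples.
    return sum((a - b) ** 2 for a, b in zip(c1, c2))

def track_colors(frame, targets, threshold=60):
    # frame: list of rows, each row a list of (r, g, b) pixels.
    # targets: {name: (r, g, b)} for each player's color.
    hits = {name: [] for name in targets}
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            for name, target in targets.items():
                if color_dist_sq(px, target) < threshold ** 2:
                    hits[name].append((x, y))
    centers = {}
    for name, pts in hits.items():
        if pts:
            centers[name] = (sum(p[0] for p in pts) / len(pts),
                             sum(p[1] for p in pts) / len(pts))
        else:
            centers[name] = None  # color not found this frame
    return centers
```

The low-light problem shows up here directly: dim pixels all crowd toward black, so the four target colors stop being separable by distance, which is why we are experimenting with filters on the lights themselves.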
Today we tested rear projection surfaces, mirroring to decrease throw distance, and dual floor projection to eliminate shadows.
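The mirror test comes down to basic arithmetic: a projector needs a throw distance of roughly its throw ratio times the image width, and bouncing the beam off a mirror folds that path so the physical depth behind the screen shrinks. A rough sketch of the math (the numbers and the idealized fold model are assumptions, ignoring mirror offsets):

```python
def throw_distance(throw_ratio, image_width):
    # Straight-line distance from lens to screen for a given image width.
    return throw_ratio * image_width

def folded_depth(distance, mirrors=1):
    # Idealized: each mirror splits the optical path into one more
    # segment, so physical depth shrinks by the number of segments.
    return distance / (mirrors + 1)

# e.g. a 1.5:1 projector filling a 2 m wide screen needs 3 m of throw;
# one mirror fold brings the cabinet depth down to about 1.5 m.
```

That halving is the whole reason mirroring made rear projection feasible in our space.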
Mission: To establish the different technologies that will be utilized as integral parts of the big picture of this project.
Adam Sweet, Alex Stroh, Mady Villavicencio
–> Dig into the source code of previously developed Flash motion tracking projects.
–> Research the possibility of using Wii and IR with Flash motion tracking.
–> Dig into more research on ZigBee chips/networking.
–> Test ZigBee chips and document their ability to be worked into the big picture of the project.
The group is continuing its research and pursuit of suitable technologies for this project; however, we have a strong feeling that we are headed in the right direction. The main technologies we have proposed for this project consist of:
–> Flash Motion Tracking