An important aspect of our project will be incorporating elements that allow viewers to interact with the piece before viewing it, with their interaction dictating the events that then take place on screen. How this interaction will be carried out, however, has yet to be decided.
During the time off we have been investigating a number of possible methods of interaction that we could implement in our final piece.
The first method we have explored would be to develop a series of buttons or pull switches, with each switch corresponding to a particular scene of the animation. Once a switch is pulled, its corresponding scene is loaded into the animation, and once enough switches have been pulled the animation begins. As the main characters of the animation will be stars, we envision the switches being shaped as such and illuminating when used. In surveying the site in which we hope to display our piece, we have seen that these switches could be buttons placed on the wall of the space or possibly pulleys hanging from the ceiling.
If we were to move forward with this method, we would most likely use an Arduino setup alongside the appropriate back-end code.
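As a rough idea of what that back-end code might look like, here is a minimal sketch of the switch-to-scene logic, independent of the Arduino side. The switch IDs, scene names, and the threshold of "enough switches" are all placeholder assumptions for illustration.

```python
# Hypothetical sketch of the back-end logic: each switch maps to a scene,
# and the animation only begins once enough switches have been pulled.
# Scene names and the threshold are assumptions, not final values.

SWITCH_TO_SCENE = {
    0: "scene_opening",
    1: "scene_meeting",
    2: "scene_journey",
    3: "scene_ending",
}

MIN_SWITCHES = 3  # assumed number of switches needed before playback starts


class SwitchController:
    def __init__(self):
        self.queued_scenes = []

    def on_switch_pulled(self, switch_id):
        """Called when the Arduino reports a switch event; returns True
        once enough scenes are queued for the animation to begin."""
        scene = SWITCH_TO_SCENE.get(switch_id)
        if scene and scene not in self.queued_scenes:
            self.queued_scenes.append(scene)
        return self.ready()

    def ready(self):
        return len(self.queued_scenes) >= MIN_SWITCHES
```

On the hardware side, the Arduino would simply send the ID of each pulled switch over serial for this controller to consume.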
Another method we have explored is using QR codes as the interactive element. This system would work similarly to the one outlined above, in that each QR code would correspond to a particular scene; upon being scanned, it would lead to a URL that loads that scene into the animation's timeline. The QR codes could again be embedded into star-shaped stickers to keep with the theme of the animation, and could be scattered throughout the Wandesford Gallery rather than being restricted to the display area. However, a downside to this method is that in order to interact with the piece, visitors would need a smartphone or other device capable of reading QR codes.
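To illustrate how the URL side of this might work, here is a small sketch of the handler logic that would read the scene name out of a scanned URL's query string. The domain, parameter name, and scene names are placeholder assumptions.

```python
# Hypothetical sketch: each QR code encodes a URL whose query string names
# a scene; the page handler parses the URL and queues that scene on the
# animation's timeline. Parameter and scene names are assumptions.
from urllib.parse import urlparse, parse_qs

VALID_SCENES = {"scene_opening", "scene_meeting", "scene_journey", "scene_ending"}


def scene_from_qr_url(url):
    """Return the scene named in a scanned QR code's URL, or None if
    the URL carries no recognised scene parameter."""
    query = parse_qs(urlparse(url).query)
    scene = query.get("scene", [None])[0]
    return scene if scene in VALID_SCENES else None
```

For example, a sticker encoding `http://example.com/load?scene=scene_journey` would queue the "journey" scene, while an unrecognised or missing scene name would simply be ignored.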
A further method we are considering would be to develop a system similar to the one seen in the Tangible Viewpoints project:
http://mf.media.mit.edu/pubs/conference/TangibleNavigation.pdf
In considering how we would replicate this system, we envision a webcam looking down on a surface onto which the image of a star chart is projected. We would then have star-shaped counters and, possibly using Processing, track where the counters are placed on the star chart. By splitting the chart into a number of areas and assigning each area a particular scene, the placement of the stars by the users could then determine the story of the piece.
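Setting the camera tracking itself aside, the chart-splitting step above can be sketched as follows: divide the projected chart into a grid of regions, assign each region a scene, and turn the tracked counter positions into an ordered story. The chart dimensions, grid shape, and scene names here are all placeholder assumptions.

```python
# Hypothetical sketch: the projected star chart is split into a 2x2 grid
# of regions, each assigned a scene, and tracked counter positions are
# mapped to scenes. Sizes, grid shape, and scene names are assumptions.

CHART_W, CHART_H = 800, 600   # assumed size of the projected chart in pixels
COLS, ROWS = 2, 2             # assumed 2x2 grid of regions

REGION_SCENES = [
    ["scene_opening", "scene_meeting"],   # top row of the chart
    ["scene_journey", "scene_ending"],    # bottom row of the chart
]


def scene_at(x, y):
    """Return the scene for the grid region containing a tracked counter."""
    col = min(int(x / (CHART_W / COLS)), COLS - 1)
    row = min(int(y / (CHART_H / ROWS)), ROWS - 1)
    return REGION_SCENES[row][col]


def story_order(counter_positions):
    """Build the story from where the users place their star counters,
    keeping the first occurrence of each scene."""
    seen = []
    for x, y in counter_positions:
        scene = scene_at(x, y)
        if scene not in seen:
            seen.append(scene)
    return seen
```

In practice the (x, y) positions would come from the Processing/webcam tracking, with this mapping deciding which scenes play and in what order.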