I spoke with three alumni during the Alumni Feedback event. Overall, I would say this was helpful if for no other reason than to get used to talking about the project with people who are unfamiliar with it. I was very pleased with the interest and feedback from two of the alumni; however, the alum I was most excited to talk with was uninspiring and gave little useful feedback.
Here, as a combined list, is the feedback I received:
- How can I be more concise about what each iteration number entails?
- Could the people in the projection room also interact with the pendulum?
- Look into how the code could create different pendulum movements.
- Look into using heat sensors.
- Basic blob detection could be useful for gathering data about what is happening in the room (see the first sketch after this list).
- Incorporate thresholds for different sensors, i.e., once a specific sensor reaches a threshold, switch to a different sensor (see the second sketch after this list).
- Possibly look into cell phone signals as a way to gauge how many people are in the room.
- Amplify the sound of the motors instead of creating a sound.
- Increase connection between pendulum room and projection room.
- Sync the movements of the pendulum to an individual person.
- Look into using IR instead of visible light so the pendulum does not cast colored light onto the floor.
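To make the blob-detection suggestion a little more concrete, here is a minimal sketch of what it could look like with OpenCV's SimpleBlobDetector in Python, assuming a camera pointed into the room. The camera index and the area limits are placeholder values, not settings from the actual installation.

```python
# Hypothetical blob-detection sketch: find roughly person-sized blobs in a
# camera feed of the room. Camera index and area limits are placeholders.
import cv2

cap = cv2.VideoCapture(0)  # placeholder: whichever camera watches the room

params = cv2.SimpleBlobDetector_Params()
params.filterByArea = True
params.minArea = 500     # ignore small specks of noise
params.maxArea = 50000   # ignore the whole frame changing at once
detector = cv2.SimpleBlobDetector_create(params)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    keypoints = detector.detect(gray)

    # Each keypoint's position (kp.pt) and size (kp.size) is rough data
    # about where activity is in the room -- enough to drive the pendulum.
    print(len(keypoints), [kp.pt for kp in keypoints])

    annotated = cv2.drawKeypoints(
        frame, keypoints, None, (0, 0, 255),
        cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS)
    cv2.imshow("blobs", annotated)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```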
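And here is a rough sketch of the sensor-threshold idea: keep reading one sensor until it crosses a threshold, then hand control over to a different one. The sensor-reading functions and the threshold value are stand-ins for whatever hardware the piece ends up using.

```python
# Hypothetical threshold hand-off between two sensors. The read_* functions
# and THRESHOLD are placeholders, not the installation's real inputs.
import random

THRESHOLD = 0.8  # placeholder; tuned per sensor in practice

def read_primary_sensor() -> float:
    # stand-in for a real reading (e.g., a normalized heat-sensor value)
    return random.random()

def read_secondary_sensor() -> float:
    # stand-in for a second input (e.g., blob count scaled to 0-1)
    return random.random()

active = "primary"

def next_control_value() -> float:
    """Return whichever reading currently drives the pendulum, switching
    to the secondary sensor once the primary crosses THRESHOLD."""
    global active
    if active == "primary":
        value = read_primary_sensor()
        if value >= THRESHOLD:
            active = "secondary"  # hand off once the threshold is hit
        return value
    return read_secondary_sensor()

if __name__ == "__main__":
    for _ in range(10):
        print(round(next_control_value(), 2), "-> now using", active)
```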
I think everything that was said was very relevant; however, for the first iteration I won't be able to research or include many of the suggestions. I was able to incorporate the idea of amplifying the motor sound instead of creating a sound for the installation on April 2nd.
As of now I have everything working. I'm still tweaking a few parts of the TouchDesigner sketch and need to finish fabricating the pendulum and the base for the gantry, but both of those are well on their way to being finished.