G.I. -3

April 24, 2010 · Posted in Documentation, Pure Data, seesaw

The more I work with Pure Data, the more I see it as a series of programming objects or components that are connected. The Gem window is simply a stand-in for a screen. There is also a rendering chain that begins with [gemhead]. There are many pix objects, which modify the Gem window. The [pix_texture] object renders whatever is passed to it onto the Gem window; in the case of See/Saw that is a video clip. The videos are stored in a database or file folder. In both Max/MSP and Pd there is a [coll] object. The [coll] object holds a series of messages, which can be numbered. The messages are "open video titled __". I can then modify the output with a random number selector so that the messages are selected at random rather than in order.
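As a rough sketch of that idea in Python (the dictionary, clip names, and function name are my own stand-ins, not Pd objects): a [coll]-like store maps numbers to "open" messages, and a random index picks one.

```python
import random

# Hypothetical stand-in for Pd's [coll]: numbered slots holding
# "open <clip>" messages (clip names here are made up).
clips = {
    0: "open clip_00.mov",
    1: "open clip_01.mov",
    2: "open clip_02.mov",
}

def random_message(coll):
    """Pick a random slot, as a random-number object feeding [coll] would,
    and return (index, message)."""
    index = random.choice(list(coll))
    return index, coll[index]
```

A call like `random_message(clips)` returns one numbered "open" message at random, which is the behavior the patch needs each time a new clip is selected.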

Here's the sequence -

OnLoad
1. create a gem window
2. select a random video from the database
3. feed it to the gem window.
4. when the video finishes playing, select another video at random.
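The OnLoad sequence above can be sketched as a simple loop in Python (the playlist and function names are hypothetical; `play` stands in for the [pix_film] → [pix_texture] render step):

```python
import random

# Stand-in for the folder/database of clips (names are made up).
playlist = ["clip_00.mov", "clip_01.mov", "clip_02.mov"]

def play(clip):
    # Placeholder for rendering the clip into the Gem window.
    return f"rendering {clip}"

def run(steps=4):
    """Mimic the OnLoad sequence: pick a random clip, render it,
    and pick another when it ends (here, each loop step stands in
    for one clip playing through)."""
    log = []
    for _ in range(steps):
        clip = random.choice(playlist)   # step 2: random selection
        log.append(play(clip))           # step 3: feed to the window
    return log                           # step 4 repeats via the loop
```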

Since the video clips are in 2.35:1 (widescreen) format, I have the potential of running two clips at the same time, one on top of the other, on a 4:3 video screen. Each video has its own sound, but I solve this by using the [dac~] object and routing the top video's audio through the left channel and the bottom video's through the right channel. I also make an automatic slider that fades the sound in as a new movie clip starts.
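The fade-in slider and left/right routing can be sketched as follows (a minimal Python sketch; the function names and the linear ramp are my assumptions, standing in for an automated slider feeding a signal multiply before [dac~]):

```python
def fade_in_gain(elapsed, ramp=1.0):
    """Linear fade-in: gain rises from 0 to 1 over `ramp` seconds
    after a clip starts, then stays at 1."""
    if elapsed <= 0:
        return 0.0
    return min(elapsed / ramp, 1.0)

def route(top_sample, bottom_sample, elapsed):
    """Top video's audio to the left channel, bottom video's to the
    right, each scaled by the fade-in gain (assumes both clips
    started at the same time, for simplicity)."""
    g = fade_in_gain(elapsed)
    return (top_sample * g, bottom_sample * g)
```

For example, half a second into a one-second ramp the gain is 0.5, so both channels are at half level; after the ramp completes, samples pass through unchanged.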

The sequence/render chain is doubled and mapped onto the Gem window. The next task is to turn the render chain off and on for each separate video. This will cause the image to bounce from top to bottom and back.

Another part of the programming task is to attach the seesaw to a sensor that will read its movement. For a previous project with Peter Sinclair called Shooter, we used an iCubeX as an interface between the sensors and the computer. I have purchased a basic kit, along with a WiFi MIDI interface, to connect the seesaw to the computer and feed the sensor data into the Pure Data patch. Since the sensor outputs MIDI numbers (0 to 127), I'm assuming I can have it send numbers to the computer that select the video clips stored under the corresponding numbers in the [coll] object. The real question is how that will look and feel when it's working. Will the audience understand that moving the seesaw selects the movie clip? How do I buffer the number stream so that the program is not overwhelmed with messages it can't process, or does the program jump rapidly from scene to scene? I have no way of knowing how this will work until I build the seesaw and test the interface.
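One way to buffer the number stream is a simple debounce/rate-limit: ignore a new MIDI value if it is the same as the last one accepted, or if it arrives too soon after the last change. A minimal Python sketch (the class name and parameters are hypothetical, not part of any library):

```python
import time

class SensorGate:
    """Rate-limit a stream of MIDI values (0-127) so clip changes
    happen at most once per `interval` seconds, and repeated values
    are dropped. `clock` is injectable so the logic can be tested."""

    def __init__(self, interval=1.0, clock=time.monotonic):
        self.interval = interval
        self.clock = clock
        self.last_time = None
        self.last_value = None

    def accept(self, value):
        """Return `value` if it should trigger a clip change, else None."""
        now = self.clock()
        if value == self.last_value:
            return None  # same clip already playing
        if self.last_time is not None and now - self.last_time < self.interval:
            return None  # too soon after the last change
        self.last_time = now
        self.last_value = value
        return value
```

With a one-second interval, rapid wobbles of the seesaw collapse into at most one clip change per second, which keeps the patch from jumping uncontrollably from scene to scene.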
