Friday, April 4, 2008

Welcome

Hello. This is Tony Schultz. I have been teaching Dance & Technology for the Sarah Lawrence Dance Department for the past two years. The prospect of working with and teaching film and media students is exciting to me. In my class we have been using Max/MSP/Jitter to create our work and the results have been fruitful.

This blog presents some of the many possibilities for using Max/MSP/Jitter as a new media authoring tool. Max/MSP/Jitter (MMJ for short) was first developed as a tool for building interactive electronic music interfaces. Since then its scope has expanded to include a broad range of video, graphics, networking, and interactive control capabilities. With it one can custom-build virtually any interactive media tool. I will give examples to help you understand its potential application in student work.

Thursday, April 3, 2008

Basics

MMJ is a visual programming environment. This means that you "write" your program by dragging and connecting "object boxes" and GUI (graphical user interface) elements. Since MMJ is not a text-based language, it can be friendlier to use than other languages. There is also no "compiling" necessary, meaning you can add to a program while it is running. This is handy: you can investigate a program live, which helps when figuring out how a given computational element works or when troubleshooting something that isn't behaving the way you want it to.

Here is an example of a simple media machine written in MMJ. This machine plays a movie using QuickTime.


Code is also easy to share, since any patch can be copied and pasted as its text equivalent. I will paste the patch below; it can be copied straight out of an email or blog post and pasted back into Max.

#P user jit.pwindow 54 136 82 62 0 1 0 0 1 0;
#P window setfont "Sans Serif" 9.;
#P window linecount 1;
#P message 97 68 30 196617 read;
#P toggle 42 44 15 0;
#P newex 42 68 52 196617 metro 20;
#P newex 84 96 63 196617 jit.qt.movie;
#P connect 0 0 4 0;
#P connect 3 0 0 0;
#P connect 1 0 0 0;
#P connect 2 0 1 0;
#P window clipboard copycount 5;
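
A quick walk through the patch: the toggle starts a metro object that fires a bang every 20 milliseconds. Each bang tells jit.qt.movie to output its current frame, which is drawn in the jit.pwindow. Clicking the read message opens a file dialog for choosing the movie.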

Wednesday, April 2, 2008

Stop Action

Of course you can record video in MMJ too. Since you are building your own interface, changing the record rate is trivial. Here is a simple example of a stop-action interface. This interface actually comes with MMJ as one of its examples.
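
If you want to try the idea without digging up that example, here is a minimal sketch of a stop-action recorder (my own reduction, not the shipped patch). Each press of the button grabs one frame from the camera and appends it to a movie on disk; the file name stopaction.mov is just a placeholder. Setting jit.qt.record's realtime attribute to 0 tells it to treat each incoming matrix as exactly one frame.

#P window setfont "Sans Serif" 9.;
#P window linecount 1;
#P comment 50 30 130 196617 click once per pose;
#P button 30 30 15 0;
#P message 120 30 32 196617 open;
#P newex 30 65 110 196617 jit.qt.grab 320 240;
#P message 150 95 110 196617 write stopaction.mov;
#P message 270 95 33 196617 stop;
#P newex 30 125 170 196617 jit.qt.record 320 240 @realtime 0;
#P connect 5 0 3 0;
#P connect 4 0 3 0;
#P connect 3 0 0 0;
#P connect 2 0 0 0;
#P connect 1 0 0 0;
#P window clipboard copycount 7;

Click open to start the camera, write to begin a file, then tap the button once per pose; stop closes the movie.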

Here is an example stop-action video made with this interface.

Tuesday, April 1, 2008

Time Lapse

One can very easily tweak the stop-action interface to set the record rate to any speed. This makes time lapse possible. Here is a time lapse done with such a patch. This video was recorded at a rate of one frame every 20 seconds.
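
The tweak amounts to swapping the button for a metro. In this sketch (same placeholder file names and assumptions as the stop-action sketch above), the metro's 20000 ms interval gives one frame every 20 seconds; change that one number to change the rate.

#P window setfont "Sans Serif" 9.;
#P window linecount 1;
#P comment 110 30 150 196617 one frame every 20 seconds;
#P toggle 30 30 15 0;
#P newex 30 55 70 196617 metro 20000;
#P message 120 55 32 196617 open;
#P newex 30 85 110 196617 jit.qt.grab 320 240;
#P message 150 115 104 196617 write timelapse.mov;
#P message 264 115 33 196617 stop;
#P newex 30 145 170 196617 jit.qt.record 320 240 @realtime 0;
#P connect 6 0 5 0;
#P connect 5 0 3 0;
#P connect 4 0 3 0;
#P connect 3 0 0 0;
#P connect 2 0 0 0;
#P connect 1 0 0 0;
#P window clipboard copycount 8;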

Doing Math

MMJ handles all video input and output as matrix data. This means the RGB pixel values can be directly accessed and used for analysis. MMJ can do some pretty complicated mathematics out of the box, and more sophisticated mathematics with a little bit of coding.

For example, a video feed can be analyzed as it comes in. With only a little bit of work (setting up a frame differencing operation) the computer can figure out when there is motion in the feed. We can use this information to determine when to record and when not to record, to set up something I call motion-active recording. Like a security system, MMJ can be made to record only those frames in which the accumulated movement crosses a certain threshold.
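
Here is a minimal sketch of the frame differencing core (not my full motion-active recorder). The camera image is reduced to one plane of luminance, then jit.op @op absdiff compares each new frame against the previous one, which the trigger object parks in jit.op's right inlet. jit.3m reports the matrix's minimum, mean, and maximum; the comparison turns the mean outlet's value into a 0/1 motion flag. The threshold of 10 is an arbitrary starting point you would tune by eye.

#P window setfont "Sans Serif" 9.;
#P window linecount 1;
#P comment 150 235 160 196617 mean difference vs. threshold;
#P toggle 30 30 15 0;
#P newex 30 55 52 196617 metro 50;
#P message 120 55 32 196617 open;
#P newex 30 85 110 196617 jit.qt.grab 160 120;
#P newex 30 115 76 196617 jit.rgb2luma;
#P newex 30 145 40 196617 t l l;
#P newex 30 175 110 196617 jit.op @op absdiff;
#P newex 30 205 46 196617 jit.3m;
#P newex 30 235 40 196617 > 10.;
#P connect 8 0 7 0;
#P connect 7 0 5 0;
#P connect 6 0 5 0;
#P connect 5 0 4 0;
#P connect 4 0 3 0;
#P connect 3 1 2 0;
#P connect 3 0 2 1;
#P connect 2 0 1 0;
#P connect 1 1 0 0;
#P window clipboard copycount 10;

From there, the flag can open and close a gate in front of a recorder like the ones above, so frames only get written while something is moving.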



Here is a video artifact of motion-active recording. This was successful at creating a campus drive-through with no stops: any stop yielded no change in the incoming image, and therefore those frames were not recorded. This recording also took advantage of MMJ's capacity to record from many cameras simultaneously.

Playing With Playback

Traditional editing cuts up media and sticks it back together in a linear fashion. With MMJ you can build exotic playback machines offering many different kinds of playback with regard to ordering in time.

Here are a few examples.

This video is an artifact of a machine that plays video back according to certain rules. I built a system with which you could tag frames with certain names. Once a named frame is reached in the playback, the computer randomly cuts to any other frame with that given name.
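
A stripped-down sketch of the idea, minus the tagging: one metro drives normal playback while a slower metro cuts to a random frame twice a second. The random 300 assumes a clip of at least 300 frames; a real patch would query the movie's actual length (jit.qt.movie's framecount attribute) instead.

#P window setfont "Sans Serif" 9.;
#P window linecount 1;
#P comment 180 85 170 196617 cut somewhere new twice a second;
#P toggle 30 30 15 0;
#P newex 30 55 52 196617 metro 20;
#P newex 100 55 62 196617 metro 500;
#P newex 100 85 70 196617 random 300;
#P message 100 115 56 196617 frame \$1;
#P message 200 115 30 196617 read;
#P newex 30 145 90 196617 jit.qt.movie 320 240;
#P user jit.pwindow 30 175 82 62 0 1 0 0 1 0;
#P connect 7 0 6 0;
#P connect 7 0 5 0;
#P connect 6 0 1 0;
#P connect 5 0 4 0;
#P connect 4 0 3 0;
#P connect 3 0 1 0;
#P connect 2 0 1 0;
#P connect 1 0 0 0;
#P window clipboard copycount 9;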


Playback reordering can be controlled by anything: music, a video game controller, or language. Here is a video artifact from a machine that translates language into an editing scheme.

This final example is the best for showing how much fun you can have with this type of method. Shooting a subject against a green screen, or another space with a uniform background that can be subtracted out (like the squash court), gives lots of freedom. This is an artifact of a video game interface made by stitching together small movement phrases. This way video can be reordered both in time and in space.
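
The background subtraction itself is one object in MMJ. A minimal sketch: jit.chromakey takes the shot footage in its left inlet and a replacement background in its right inlet, and composites the two wherever it finds the key color. You would set its color and tol attributes to match your green screen; the two read messages are placeholders for loading the foreground and background movies.

#P window setfont "Sans Serif" 9.;
#P window linecount 1;
#P comment 180 145 180 196617 foreground keyed over background;
#P toggle 30 30 15 0;
#P newex 30 55 52 196617 metro 20;
#P message 120 55 30 196617 read;
#P newex 30 85 90 196617 jit.qt.movie 320 240;
#P message 250 55 30 196617 read;
#P newex 160 85 90 196617 jit.qt.movie 320 240;
#P newex 30 115 84 196617 jit.chromakey;
#P user jit.pwindow 30 145 82 62 0 1 0 0 1 0;
#P connect 7 0 6 0;
#P connect 6 0 4 0;
#P connect 6 0 2 0;
#P connect 5 0 4 0;
#P connect 3 0 2 0;
#P connect 4 0 1 0;
#P connect 2 0 1 1;
#P connect 1 0 0 0;
#P window clipboard copycount 9;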

Playing in Space

Here is a video artifact that shows how video can be composed in a 3D graphics space. This project used MMJ's networking capacity to synchronize recording on four cameras. The camera views are then recomposed in 3D, and a second level of cinematography emerges when we determine the fly-through trajectory of the virtual camera. With this interface the virtual camera's trajectory could be steered live as the viewer investigated the media, or it could be determined by a pre-written score.
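
To give a flavor of how this works, here is a minimal sketch with a single video mapped onto a plane in an OpenGL scene (my four-camera rig is the same idea repeated). The qmetro drives the usual Jitter drawing loop, erasing and redrawing the scene each frame, while the camera message repositions the virtual viewpoint; feed that message changing coordinates and you have a fly-through. The 0. 0. 2. position is just an example value.

#P window setfont "Sans Serif" 9.;
#P window linecount 1;
#P comment 210 115 150 196617 move the virtual camera;
#P toggle 30 30 15 0;
#P newex 30 55 60 196617 qmetro 33;
#P newex 30 85 56 196617 t b erase;
#P message 120 55 30 196617 read;
#P newex 120 85 90 196617 jit.qt.movie 320 240;
#P message 120 115 80 196617 camera 0. 0. 2.;
#P newex 120 145 120 196617 jit.gl.videoplane scene;
#P newex 30 175 100 196617 jit.gl.render scene;
#P newex 150 175 90 196617 jit.window scene;
#P connect 8 0 7 0;
#P connect 7 0 6 0;
#P connect 7 0 4 0;
#P connect 6 1 1 0;
#P connect 6 0 1 0;
#P connect 5 0 4 0;
#P connect 4 0 2 0;
#P connect 3 0 1 0;
#P window clipboard copycount 10;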

This system can also work in real time. This means the four camera feeds can be sent over the network and recomposed elsewhere. At the remote site the camera view is controlled by the viewer using a wireless video game controller.

This kind of assembly begins to blur the lines between performance, film, and gaming.

And it's fun!

Simulation

Of course we can take this all the way and embed our experience in a completely simulated environment. This final example involves a simple physics simulator that models a central gravitational force field. The trajectory of the spaceship emerges out of the interaction between the gravitational field and the "thrusts" the player provides through a game controller. The solar system provides a landscape for play but also serves as a repository of astronomical knowledge: all of the planets in this simulation are in proper ratio to each other with regard to size, distance from the sun, orbital period around the sun, and daily rotation about their own axes.
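
For the curious, the core of such a simulator is only a few lines of math (this is the general scheme, not my exact patch). Each frame we compute the gravitational acceleration on the ship, a = -GM*r/|r|^3, where r is the vector from the sun to the ship, add the acceleration contributed by the controller's thrust, and then take a simple Euler integration step: v <- v + a*dt and x <- x + v*dt, with dt the time since the last frame. The ship's whole trajectory, stable orbits included, falls out of repeating that update.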

Here is a view of the MMJ patch that functioned as the physics simulator.

Here is a video artifact of the game in action.

Interactivity

One of the main allures of this clumsy, undefined beast we call "new media" is the promise of interactivity. New media should not simply mean content shown on an iPod or on a website. It should, if it is to live up to its promises, draw viewers in as active participants in shaping their own experience.

Users can interact with the work through various modalities. One way is by using gaming controllers. The most expressive gaming controller by far is the wireless Wii remote.


The Wii remote can "talk" to MMJ very easily over Bluetooth (typically via a free third-party external such as aka.wiiremote). The controller also contains a three-axis accelerometer, so it responds not only to button presses but also to being moved through space, and it can sense its orientation with respect to the direction of gravity.

The Apple IR remote is another controller that plays well with MMJ.


The iPhone, any MIDI controller, and the camera itself (using computer vision algorithms) are all easily accessible as expressive controllers. MMJ is very good at providing these kinds of connections, repurposing our consumer technologies without too much technical know-how.

In this way MMJ is a great tool for students to gain comfort with while developing their media-making practice. It also provides a good opportunity for students to learn some technological and mathematical foundations they might otherwise neglect.