This project lets a musician interact with their past self, opening up a variety of unique musical possibilities. It makes extensive use of the Java Music Specification Language (JMSL), created by Phil Burk and Nick Didkovsky. In performance, the musician plays with an abstracted version of their past self while simultaneously sculpting the current phrase to which their future self will respond. This dynamic leads to interesting musical results, and can be very enjoyable to play with!
Here is a video explanation and demonstration:
In more technical terms, this interactive performance partner analyzes incoming MIDI and separates it into discrete phrases. The program then analyzes these phrases for average pitch, pitch direction, average note hold time, and average note duration. Using these parameters, the program creates a new, similar phrase. This phrase is played back as soon as the human performer begins playing the next phrase. Additionally, each new phrase is played back using a randomly chosen voice from the General MIDI (GM) specification.
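The actual implementation lives in JMSL, but the analyze-and-respond loop described above can be sketched in plain Java. Everything below is a hypothetical illustration (the `PhraseSketch` class, its `Note` record, and the response heuristic are mine, not the project's): a phrase is modeled as a list of pitch/duration pairs, summarized by its average pitch, overall pitch direction, and average duration, and a new, loosely similar phrase is generated from those parameters.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Hypothetical sketch of the phrase-analysis step, not the project's JMSL code.
public class PhraseSketch {
    // A note as a MIDI pitch number plus a duration in milliseconds.
    public record Note(int pitch, int durationMs) {}

    // Average MIDI pitch across the phrase.
    public static double averagePitch(List<Note> phrase) {
        return phrase.stream().mapToInt(Note::pitch).average().orElse(0);
    }

    // Overall pitch direction: +1 rising, -1 falling, 0 flat
    // (judged from the first note to the last).
    public static int pitchDirection(List<Note> phrase) {
        int delta = phrase.get(phrase.size() - 1).pitch() - phrase.get(0).pitch();
        return Integer.compare(delta, 0);
    }

    // Average note duration in milliseconds.
    public static double averageDuration(List<Note> phrase) {
        return phrase.stream().mapToInt(Note::durationMs).average().orElse(0);
    }

    // Generate a same-length phrase whose notes scatter around the average
    // pitch, drift in the observed direction, and jitter around the
    // average duration. The exact heuristic here is invented for illustration.
    public static List<Note> similarPhrase(List<Note> phrase, Random rng) {
        double avgPitch = averagePitch(phrase);
        double avgDur = averageDuration(phrase);
        int dir = pitchDirection(phrase);
        List<Note> out = new ArrayList<>();
        for (int i = 0; i < phrase.size(); i++) {
            int pitch = (int) Math.round(avgPitch + dir * i + rng.nextInt(5) - 2);
            int dur = (int) Math.round(avgDur * (0.8 + 0.4 * rng.nextDouble()));
            out.add(new Note(pitch, dur));
        }
        return out;
    }

    public static void main(String[] args) {
        // A short rising phrase: C4, E4, G4.
        List<Note> phrase = List.of(new Note(60, 250), new Note(64, 250), new Note(67, 500));
        System.out.println("avg pitch:  " + averagePitch(phrase));   // ~63.67
        System.out.println("direction:  " + pitchDirection(phrase)); // 1 (rising)
        System.out.println("response:   " + similarPhrase(phrase, new Random()));
    }
}
```

In the real system, the response phrase would also be assigned a randomly chosen General MIDI program number (0 to 127) before playback, which is what gives each answering phrase its own instrument voice.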

A snippet of Java code from this project
This system is currently being used to complement Stella, my electronic trumpet.