pdf version
an html version will come.
This master's thesis addresses the problem of using computer-based systems for visualisations in live performances. A prototype system for animation control gave insight into two areas: controlling multiple entities in several ways at the same time, and mapping controls to actions.
In this case, the constructed system was aimed at controlling 3d character animations and was implemented in Python using the API provided by Blender, an interactive 3d graphics suite. The first real use of the solution was at a music/dance event during the Oulu music video festival in August 2003, and this study is limited to that early phase of development, with discussion of future challenges and possibilities.
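To give a concrete flavour of the setup, here is a minimal sketch of driving a character object from Python in Blender. It assumes the current bpy API (the thesis itself used the 2003-era Blender Python API, which differed considerably), and the scene object name "Dancer" is hypothetical::

    import bpy

    # look up a (hypothetical) character object in the current scene
    obj = bpy.data.objects.get("Dancer")
    if obj is not None:
        obj.location.z += 0.5  # move the character up a little
        # record the new pose as a keyframe on the current frame
        obj.keyframe_insert(data_path="location",
                            frame=bpy.context.scene.frame_current)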
The system basically worked, proving the central model viable: it is built on untypical so-called event objects, which are used to implement all possible actions in a scene. However, the evaluation proposes major restructuring. The prototyping also brought up severe limitations of the selected engine, which have later resulted in both fixes to that engine and in the use of other engines. As a conceptual result, the use of high-level variables, such as the amount of *difference* in a scene, is identified as a powerful tool; it is explored through the real-time controlled variance in the animation of so-called clones in this system, and possibly fruitful theoretical underpinnings are found in gestalt theory.
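The event-object and difference ideas can be illustrated with a small, purely hypothetical sketch (the names and structure below are illustrative, not the thesis code): a single event object drives one action on a group of clones, and a high-level difference value spreads that action out across them::

    import random

    class MoveEvent:
        """An event object: one controllable action, here a vertical move."""

        def __init__(self, targets, difference=0.0):
            self.targets = targets        # the clones this event drives
            self.difference = difference  # 0.0 = identical motion, larger = more spread

        def apply(self, amount):
            for clone in self.targets:
                # each clone deviates from the shared motion by up to `difference`
                offset = random.uniform(-self.difference, self.difference)
                clone["z"] += amount + offset

    clones = [{"z": 0.0} for _ in range(5)]
    event = MoveEvent(clones, difference=0.2)
    event.apply(1.0)  # all clones rise by about 1.0, each slightly differently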
the raw text source (readable, rst formatted) is also available: gradu2.rst
a UML diagram of the original design (which will be refactored before the actual release): mover-original.uml.png
bibliographies: performance.bib , software.bib
screenshots of other references, as they were worked on in FenPDF: related_work.jpg
more information about the work, as well as the software itself, is available at KyperMover.
there's also a foreword for the printed book.