Interface Theory of Perception

Discovered Donald Hoffman's Interface Theory of Perception during the past week.
This theory fits well with my view of time as a brain strategy for reducing complexity, so I have added the following links:
Donald Hoffman: The Interface Theory of Perception and
Donald Hoffman: lecture 1 (YouTube)
Donald Hoffman: lecture 2 (YouTube)
to my Pianola Music documentation.
The Pianola Music documentation (January 2016) discusses the history of music notation in the 20th century, and contains a description of my current views on the nature of (perceived) time.

Study 1 (2005)

I've now transcribed and uploaded my Study 1 (2005) for the Assistant Performer.
It can be played there on the Resident Sf2 Synth.

Web MIDI Synths, Pianola Music transcription

This blog was silent in 2015 because I was working on aspects of the Assistant Composer and Assistant Performer that don’t directly affect this website. These are:
  • Spring 2015: As planned, I implemented the “advanced prepared piano” algorithm in the Assistant Performer. This involved making further changes to the file format for SVG-MIDI scores, changes that also affected the Assistant Composer.
  • Autumn 2015: created the following new open-source GitHub repositories:
    WebMIDISynthHost,
    SimpleMIDISynthHost,
    SimpleMIDISynthHost2, and
    SimpleSoundFontSynthHost.
    I am developing Web MIDI Synths that can be used as MIDI output devices by visitors to websites, without their having to install any plug-ins.
    Added the Resident Soundfont Synth to the output devices available to the Assistant Performer.
  • December 2015: completed the transcription of my Pianola Music (1967) for the Assistant Performer. This was done as a test for the Resident Soundfont Synth.
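A Web MIDI Synth of this kind behaves, from the hosting page's point of view, like a MIDIOutput port: the page sends it raw MIDI messages via a send() method and the synth does the rest. A minimal sketch of that interface (the class and property names here are illustrative assumptions, not the actual WebMIDISynthHost API):

```javascript
// Sketch of a software synth exposing a MIDIOutput-like send() method.
// Names are illustrative, not the actual WebMIDISynthHost API.
class SketchSynth {
  constructor() {
    this.notesOn = new Set(); // currently sounding "channel:key" pairs
  }
  // Like MIDIOutput.send(): data is an array of raw MIDI bytes.
  send(data) {
    const status = data[0] & 0xF0;
    const channel = data[0] & 0x0F;
    if (status === 0x90 && data[2] > 0) {            // noteOn, velocity > 0
      this.notesOn.add(`${channel}:${data[1]}`);
    } else if (status === 0x80 ||                    // noteOff
               (status === 0x90 && data[2] === 0)) { // noteOn, velocity 0
      this.notesOn.delete(`${channel}:${data[1]}`);
    }
    // A real synth would start or stop Web Audio voices here.
  }
}

const synth = new SketchSynth();
synth.send([0x90, 60, 100]); // noteOn, middle C
synth.send([0x80, 60, 0]);   // noteOff
```

Because the interface matches MIDIOutput, a page can treat such a synth and a hardware output device interchangeably.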
A change that does affect the site came in January 2016 when I completed the documentation for the Pianola Music transcription. This includes a short summary of my time theory as it affects both the technical and social aspects of music.

Moritz v3 (major update)

Completed a major Moritz revision during the autumn of 2014, and have just finished the corresponding documentation at this web site.

The main entry point for reading about Moritz v3 is at:
http://james-ingram-act-two.de/moritz3/moritz3.html
This page includes not only information about Moritz itself, but also some background, its raison d'être, etc.

The code for Moritz' Assistant Composer has been thoroughly overhauled, and is now open source on GitHub. Many optimisations are still possible, but at least the worst of the spaghetti has disappeared. (My coding style is a bit pedestrian by present-day C# standards, but maybe that's not such a bad thing.)
The biggest change, apart from cleaning up the code, is that the generated scores can now contain both input and output chords. This enables much greater control over what happens when MIDI input information arrives during a live performance: parallel processing can be used to enable a non-blocking, "advanced prepared piano" scenario. Single key presses can trigger either simple events or complex sequences of events, depending on how the links inside the score are organized. An example score can be viewed (but not yet played) here.
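The key-to-sequence idea can be caricatured in a few lines: the score, in effect, supplies a table linking each input key to whatever output it should trigger. This is a hypothetical sketch of that idea, not the Assistant Performer's actual data structures:

```javascript
// Hypothetical sketch of the "advanced prepared piano" idea: the score
// links each input key to one or more output events. Not the Assistant
// Performer's actual data structures.
const keyLinks = {
  60: [{ key: 60, durationMs: 500 }],  // simple event
  62: [{ key: 62, durationMs: 250 },   // complex sequence
       { key: 66, durationMs: 250 },
       { key: 69, durationMs: 1000 }]
};

// Returns the sequence of output events triggered by a single key press.
function eventsForKeyPress(key) {
  return keyLinks[key] || []; // unlinked keys trigger nothing
}
```

In a non-blocking implementation, each returned sequence would be scheduled on its own, so that a new key press never has to wait for an earlier sequence to finish.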

The Assistant Performer now has two GitHub repositories:
Repository 1: https://github.com/notator/assistant-performer-milestones:
This contains an older version which is, however, working and stable. This version can be tried out on the web here.
and
Repository 2: https://github.com/notator/assistant-performer
This is where the Assistant Performer is being developed. This version can be tried out on the web here. Warning: this version is highly volatile, and may often contain bugs or not work at all!

The Assistant Performer does not yet implement the "advanced prepared piano" scenario. That is the next project on my agenda.

Song Six

Recently completed the first setting of Song Six for my Retrospective.
The on-line documentation is not quite complete, but I am publishing this blog now (14.02.2014), because

Heloise Ph. Palmer will be giving the first performances of Song Six as part of her new program The Righteous Fatale between 19th February and 5th May 2014.

I am going to be there on 19th February, 8th March and 5th May.

The score was written using my Assistant Composer software, and designed to be playable by a live performer using a Doepfer R2M MIDI controller in conjunction with my Assistant Performer software. There are, however, several ways to play the piece. The Assistant Performer allows single tracks to be switched off, so the default sound of Clytemnestra's text can be overridden by a live performer, other MIDI input/output devices can be used, etc.

The mp3 recording (of the "default performance" embedded in the score) was made locally (off-line) using Audacity and the Assistant Performer software without a live performer.
Unfortunately I had to document this piece with an mp3 (rather than in the on-line AP) because it is currently rather difficult to play on-line. Not only is the Jazz plugin required, but users would also have to install the special soundFonts and a local soundFont player (I'm using the CoolSoft VirtualMIDISynth).
This recording uses the Arachno SoundFont, and a new soundFont of my own (created using a microphone, Audacity and the Viena SoundFont Editor) that overrides some of the Arachno patches.
Clytemnestra's voice was realized by recording my own whispering voice, and allocating the individual syllables to particular notes in new patches.
I also made the Wind from scratch (in Audacity) because, strangely, I couldn't find anything suitable on the web.

As noted in my earlier blogs, the major browsers are currently implementing the Web MIDI API, so the Jazz plugin should soon become redundant.
It should be possible to implement a soundFont player in JavaScript using the Web Audio API, so that this and other pieces could be played by anyone on the web. I think such a virtual device would be at least as useful as an on-line synthesizer. Is anyone else interested in (helping with) that?
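The core of such a player is zone lookup: given a preset parsed from an .sf2 file, find the sample whose key range contains the incoming note, then play that sample through the Web Audio API. A hedged sketch of the lookup step, with a zone format invented purely for illustration:

```javascript
// Sketch of the key-range lookup at the heart of a soundFont player.
// A real player would parse these zones from an .sf2 file; the zone
// format here is invented for illustration.
const presetZones = [
  { loKey: 0,  hiKey: 59,  sampleName: "pianoLow" },
  { loKey: 60, hiKey: 127, sampleName: "pianoHigh" }
];

// Returns the zone whose key range contains the given MIDI key, or null.
function zoneForKey(zones, key) {
  return zones.find(z => key >= z.loKey && key <= z.hiKey) || null;
}

// In a browser, the chosen zone's sample would then be played via the
// Web Audio API, roughly:
//   const src = audioContext.createBufferSource();
//   src.buffer = decodedSampleBuffers[zone.sampleName];
//   src.connect(audioContext.destination);
//   src.start();
```

Pitch shifting within a zone (playing one sample at several keys by adjusting playback rate) would follow the same pattern.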

Web MIDI progress

Google's Chris Wilson announced today that Chrome Canary (the experimental version of Chrome) now contains an initial round of support for the Web MIDI API! Today's build (Version 30.0.1550.2) supports MIDI input on OSX.
So it looks as if browsers are really going to implement the Web MIDI API natively!
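For the record, the API's entry point is navigator.requestMIDIAccess(), which resolves to a MIDIAccess object whose inputs and outputs are map-like collections of ports. A minimal sketch, as the API is specified today (the helper function is my own, not part of the API):

```javascript
// Lists the names of available MIDI ports. midiAccess.inputs and
// midiAccess.outputs are map-like collections of MIDIInput/MIDIOutput
// ports, as specified in the Web MIDI API.
function listPortNames(midiAccess) {
  const names = { inputs: [], outputs: [] };
  for (const input of midiAccess.inputs.values()) names.inputs.push(input.name);
  for (const output of midiAccess.outputs.values()) names.outputs.push(output.name);
  return names;
}

// In a browser that supports the Web MIDI API:
//   navigator.requestMIDIAccess()
//     .then(access => console.log(listPortNames(access)));
```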

Chris also announced an update to his WebMIDIAPIShim glue code for the Jazz plugin -- both of which are currently used by my Assistant Performer.
So I have updated the Assistant Performer's code, ready for the time when both the WebMIDIAPIShim and the Jazz plugin become redundant.

Assistant Performer now works on Mac Computers

Uploaded a new version of the Assistant Performer.
This version now uses Chris Wilson's Web MIDI API Polyfill, and fixes several bugs which prevented the program from running on Mac computers.
I am still only really testing this program in Chrome on 64-bit Windows 7, but have high hopes that it will soon be running on all other major browsers/systems.

Updated the file About the Assistant Performer