I have had the Lemur for a few weeks now and have been (between various career and personal crises) climbing the learning curve of:
- The Lemur scripting language - A fairly standard JavaScript-like thing.
- The Lemur Object reference - Very cool but still a bit odd and buggy in places.
- Python - Never used it before, to my shame.
- The Python Live API - Access to the objects (clips, tracks, devices, scenes) in Ableton. Talk about in at the deep end.
- OSC - Open Sound Control, think Midi 2.0. Seriously, as Deadmau5 has already said, "Midi should fuck off and die already". (There's a little sketch of what an OSC message actually looks like just after this list.)
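For the curious, the OSC side is less scary than it sounds: each message is just an address pattern plus typed arguments packed into a UDP datagram. Here is a minimal Python sketch of the idea, with a hand-rolled encoder so there are no extra libraries. The address pattern, IP and port are placeholders I made up for illustration, not the actual wiring between my Live scripts and the Lemur.

```python
import socket
import struct


def osc_string(s):
    """Encode a string the OSC way: ASCII, null-terminated, padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)


def osc_message(address, *args):
    """Build a raw OSC message supporting only int and float arguments (enough for mixer data)."""
    type_tags = ","
    payload = b""
    for arg in args:
        if isinstance(arg, float):
            type_tags += "f"
            payload += struct.pack(">f", arg)  # big-endian float32
        elif isinstance(arg, int):
            type_tags += "i"
            payload += struct.pack(">i", arg)  # big-endian int32
        else:
            raise TypeError("only int and float supported in this sketch")
    return osc_string(address) + osc_string(type_tags) + payload


# Hypothetical example: push track 3's volume out to the Lemur.
# The address pattern, IP and port are assumptions, not the real setup.
LEMUR_ADDR = ("192.168.1.10", 8000)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/live/track/volume", 3, 0.85), LEMUR_ADDR)
```

The nice part compared to MIDI is that the value travels as a proper float with a human-readable address, rather than being squashed into a 7-bit controller number.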
Also, I am struggling with exactly what a "set" entails. There are so many ways I can make noise now that are all quite satisfying, ranging from semi-random, set-it-off-and-see-what-happens kind of things to very tightly controlled scene-by-scene progression.
Anyway, I thought I would share a piccy of the UI I have been working on so hard. This bank holiday was pretty good, as I finally got the Python code that sends and receives mixer data patched into the Jazzmutant clip launcher scripts. This means I have added a bank-selectable row of faders below the clip launch grid.
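The bank idea itself is simple: one fixed row of faders gets remapped onto a sliding window of tracks. This is not the actual script code, just a toy sketch of the mapping logic (the fader count, class and method names are all made up for illustration).

```python
NUM_FADERS = 8  # one physical row of faders on the Lemur UI


class FaderBank:
    """Toy sketch: a fixed row of faders remapped onto a sliding window of tracks."""

    def __init__(self, track_count):
        self.track_count = track_count
        self.bank = 0  # which group of NUM_FADERS tracks the row currently controls

    def track_for_fader(self, fader_index):
        """Return the track index a given fader controls, or None if this bank runs past the end of the set."""
        track = self.bank * NUM_FADERS + fader_index
        return track if track < self.track_count else None

    def next_bank(self):
        """Step to the next bank, wrapping back to the first."""
        banks = (self.track_count + NUM_FADERS - 1) // NUM_FADERS
        self.bank = (self.bank + 1) % banks


# Example: 19 tracks gives three banks (8 + 8 + 3 faders in use)
faders = FaderBank(track_count=19)
faders.next_bank()
print(faders.track_for_fader(0))  # -> 8, the first track of the second bank
```

In the real scripts the bank button on the Lemur just bumps that offset and re-sends the current volumes so the faders snap to the right positions.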
I am still trying to work out how best to utilise the new track group function in Ableton, and wondering how much (if at all) I want to use scene triggers. See, a well-constructed group is like a mini scene, and I think that sits better with me than just stepping through the scenes to progress a track.
At some point, when I have it all working nicely, I will try and do a video of it in action with a bit of a walkthrough.