Right, this one was a little challenging but I got there in the end. I’ve been busy entertaining my parents during their visit, which was amazing but kinda threw a spanner in the .app-a-day mission I was on. Add a 6-year-old to the mix and all hell breaks loose…
Anyway, I seem to have built what I’ve cheaply called a Fauxtechre app, clearly inspired by the fantastic work of Warp Records’ Autechre. I’ve had many moments over the years falling in and out of love with this band and their work, which to me is always the sign of fantastic, challenging art.
This one uses the Karplus modules from BEAP. You can draw in your own oscillator-type shapes and morph between them using LFOs. It also has a morphing spectral filter which you can again draw your own shapes into and wobble between. Just click around and find out what works for you, or doesn’t. The steps of the Gate Seq. and the Sequencer itself can be split and run at different step counts to maintain that slightly wonky, out-of-phase cycle…
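For anyone curious what the Karplus modules are actually doing under the hood, the classic Karplus-Strong plucked-string algorithm is tiny: a burst of noise circulating in a delay line, gently lowpassed on each pass. This is just an illustrative sketch of the general technique, not the BEAP patch itself:

```python
import numpy as np

def karplus_strong(freq, duration, sr=44100, damping=0.996):
    """Minimal Karplus-Strong string: a noise burst fed through a
    delay line, with a two-sample average (a crude lowpass) and a
    damping factor in the feedback path."""
    n = int(sr * duration)
    delay = int(sr / freq)                  # delay length sets the pitch
    buf = np.random.uniform(-1, 1, delay)   # initial noise 'pluck'
    out = np.zeros(n)
    for i in range(n):
        out[i] = buf[i % delay]
        # averaging adjacent samples rolls off the highs first,
        # so the tone decays the way a real plucked string does
        buf[i % delay] = damping * 0.5 * (buf[i % delay] + buf[(i + 1) % delay])
    return out

tone = karplus_strong(220.0, 1.0)  # one second of A3
```

Swap the noise burst for one of those hand-drawn shapes and you’re already in Fauxtechre territory.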
Please find below another link to the software, with the same conditions as previous ‘releases’ so to speak. Once again I’ve popped in an audio recorder so you don’t have to dick about with any routing, but foolishly I didn’t add keyboard shortcuts. Silly me. If anyone really wants them implemented then please just ask and I’ll do so.
This .app does nod to algorithmic composition techniques, albeit in a very embryonic way. I do hope that at the very least you’ll have fun.
It comes as is and with zero support, but it shouldn’t really need any. It was created in Max/MSP v7.1 on Mac OS X El Capitan. Sadly I haven’t got any other versions of Mac OS with me at the moment to test on. One small reminder: I think you will have to have Java installed for this to work, as most newish Apple OSes no longer come with it as standard. Just a wee heads up.
I haven’t written any documentation for the .app but it’s pretty straightforward to use. For any comments, ideas and suggestions please contact me and I’ll do my best. I hope it works for you and that you have many happy hours making noises. Please click the image below to start a .zip file download.
I don’t seem to be able to stop this make-a-Max thing at the moment, so I decided to roll with it…
This time, after many discussions with old IRC friends, they said I should make a quick & easy standalone audio recorder. Which is what I then did. For those moments when you’re jamming on a synth or something and don’t want to boot ProTools, set up a session and all that jazz – not that it’s hard, but you just want to hit record!
This supports various bit depths & sample rates. It will AutoName things so you don’t even have to worry about that; of course you can set your own file names if you prefer. A small word of warning about AutoName: it’ll dump the resulting recorded audio files out to the same directory as wherever the .app is stored. Maybe run it from your Music directory or something sensible like that…
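The AutoName idea is simple enough that a sketch might help anyone building something similar. This is my own hypothetical version of such a scheme, not the recorder’s actual code: timestamp the file name and bump a counter if it already exists, so no take ever clobbers another.

```python
import datetime
import os

def autoname(directory=".", prefix="rec", ext=".wav"):
    """Generate a timestamped, collision-free file path, e.g.
    rec_2016-01-31_14-05-09.wav, inside the given directory."""
    stamp = datetime.datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
    candidate = os.path.join(directory, f"{prefix}_{stamp}{ext}")
    n = 1
    while os.path.exists(candidate):   # never overwrite an earlier take
        candidate = os.path.join(directory, f"{prefix}_{stamp}_{n}{ext}")
        n += 1
    return candidate

path = autoname()  # where the next recording would land
```

Note the default directory is the current working one, which is exactly why the .app dumps files next to wherever it lives.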
Here you go, and I hope that you find it useful. It comes as is and with zero support, but it shouldn’t really need any. It was created in Max/MSP v7.1 on Mac OS X El Capitan. Sadly I haven’t got any other versions of Mac OS with me at the moment to test on. One small reminder: I think you will have to have Java installed for this to work, as most newish Apple OSes no longer come with it as standard. Just a wee heads up.
I haven’t written any documentation for the .app but it’s pretty straightforward to use. For any comments, ideas and suggestions please contact me and I’ll do my best. I hope it works for you and that you have many happy hours recording noises.
Right, I seem to have gotten lost in Max v7 once again and I’ve made another thingy. This time it’s a MIDI FM-type synth with reverb, hence the name FM(v)erb. I’m so inventive when it comes to names, eh? Outstanding performance there.
Anyway, grab it like a rabbit and see what you think. It’s still in the v01 state but it should mostly work. It has 3 MIDI CC slots which allow you to control Filter Cut Off, Resonance & Filter Type. It doesn’t do presets, so just dial away every time you open it; sometimes things are just better that way. The Envelope supports times from milliseconds up to 20 seconds, so although it is very limited feature-wise, the scope for variation within the sounds is, I think, still vast…
If people like it, it shouldn’t be too hard to implement more options of this kind. I’m new to all this software bizznizz, so go easy on me, please try it out for yourself, and let me know what you think.
It was created in Max/MSP v7.1 on Mac OS X El Capitan. Sadly I haven’t got any other versions of Mac OS with me at the moment to test on. One small reminder: you will have to have Java installed for this to work, as most newish Apple OSes no longer come with it as standard. Just a wee heads up.
I haven’t written any documentation for the .app but it’s pretty straightforward to use, as far as I can tell. For any comments, ideas and suggestions please contact me and I’ll do my best. I hope it works for you and that you have many happy hours making noises. It has a built-in recorder unit towards the bottom which saves out an audio file at the sample rate you’re running and at whatever bit depth you’ve chosen in the module.
Click on the image below to download the .zip file!
I needed an Impulse Response plugin for a ProTools project I’m working on, so I spent some cash and bought Liquid Sonics’ Reverberate 2. I tried the demo and was well impressed with the flexibility, the options and, of course, the sound quality. The price, on the horrid consumer-mad ‘Black Friday’ (I never did get why it’s called that), was 30% off the asking price. Win!
I would’ve gone down the Altiverb route but its (worthy) price point was a little high right now. Cash is tight at the moment since I’ve recently moved to live in Sydney, Australia and had to pay for all manner of things in order to do so. Not to mention the awful feeling that my entire sound studio is in a shipping container bobbing around somewhere in the Indian Ocean, hopefully making its way eventually to New South Wales.
One of the best things about Impulse Response plugins and software is being able to create your own weird and wonderful IRs, in whatever space you happen to find yourself. Or as most sound designers do, use them ‘incorrectly’ just to listen to the happy accident results at the other end.
One thing I have realised over the years making my own IRs is that it isn’t quite as easy as I think it should be. Sure, the process itself is easy enough once you have all the correct hardware – speakers, mics and so on – but which software do you make them with?
There are various utilities out there, but most are proprietary and tied to their specific piece of host software. Altiverb can easily make IRs by simply dragging and dropping your recordings into the plugin interface. Great. It also sounds fantastic. Logic Pro X still comes with the Impulse Response Utility, even though Apple seem to want to hide it away from everyone: it now resides inside the Logic .app package, and anyone who’s spent any time working with it knows what an absolute pig it is to use. It’s also not been updated in forever…
The results aren’t fantastic either, in my experience, as there’s usually some strange alias ringing / resonance in them. I’ve A/B’d the same all-AES/EBU signal path via a TC System 6000 using both Altiverb and Logic’s Impulse Response Utility, and the difference in the results was shocking. Altiverb, subjectively speaking, whipped Logic’s Impulse Response Utility completely and truly. Sadly I don’t have the A/B files here with me now to add to this post, since they too are on a boat somewhere in the ocean…
Ableton’s Live Suite, since about version 9.x, has included a Convolution Reverb device as well as an IR Measurement Device, so you can create your own IRs and save them out as .AIFF files. Neat.
These are Max for Live devices, so you need to have Max/MSP from Cycling ’74 as well – which I would highly recommend anyway, for a myriad of reasons, but that’s for another post entirely.
So, when thinking once again about making some more groovy IRs for use inside Reverberate 2, these Live devices got me thinking… ‘What if I made a standalone Max/MSP application which anyone could use?’ In the spirit of open-source type thinking. And so this leads me on to the first version of the application, which can be found below:
I took apart the Max for Live device and copied it into Max/MSP, then added, tweaked and sorted as was my wont. Please try it out for yourself and let me know what you think. It was created in Max/MSP v6.x on Mac OS X El Capitan. Sadly I haven’t got any other versions of Mac OS with me at the moment to test on. Please see previous boat-type comments above.
It should work with any Core Audio compliant sound card. Once you have completed your sweep, hit the SAVE button to save out your IR file as a 24-bit 44.1kHz, 48kHz, 88.2kHz or 96kHz .AIFF, depending on which settings you had selected during the creation process. This can then be used in whatever convolution plugin you use. One small reminder: you will have to have Java installed for this to work, as most newish Apple OSes no longer come with it as standard. Just a wee heads up.
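For the curious, the sweep-and-save workflow above rests on a well-known trick (Farina’s exponential sine sweep method): play a log sweep through the space, record the result, then convolve the recording with an amplitude-compensated, time-reversed copy of the sweep, which collapses everything back into an impulse response. A rough sketch of the idea, not the app’s actual patch:

```python
import numpy as np

def exp_sweep(f1, f2, duration, sr=48000):
    """Exponential (log) sine sweep from f1 to f2 Hz, plus its
    inverse filter: the sweep time-reversed and tilted in amplitude
    to compensate for the sweep spending longer on low frequencies."""
    t = np.arange(int(sr * duration)) / sr
    R = np.log(f2 / f1)
    sweep = np.sin(2 * np.pi * f1 * duration / R * (np.exp(t * R / duration) - 1))
    inv = sweep[::-1] * np.exp(-t * R / duration)
    return sweep, inv

def extract_ir(recording, inv):
    """Convolving the room recording with the inverse sweep
    collapses the sweep back into an impulse response."""
    return np.convolve(recording, inv)

# short sweep for the demo; in practice you'd use several seconds
sweep, inv = exp_sweep(20.0, 20000.0, 0.25)
ir = extract_ir(sweep, inv)  # 'recording' of a perfect, echoless room
```

Feed the result through a normalising step and write it out as .AIFF and you have, in essence, what the SAVE button is doing.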
I haven’t written any documentation for the .app but it’s pretty straight forward to use, as far as I can tell. For any comments, ideas and suggestions please contact me and I’ll do my best. I hope it works for you and that you have many happy hours making IR’s.
I was approached by Raymond and asked whether I would like to document the live performances over 3 separate dates. This was an interesting live recording, due to the long reverb times of the gallery space and the need to be sensitive to the live audience and musicians alike.
DPA 4060s taped to the rear walls provided the ambience mics I felt were required, without having to have massive overhead mic stands in the way, blocking sight lines and making a mess of the visuals. Spot mics covered the rest for the close-ups and we were good to go.
I edited, mixed and mastered each of the sessions at the Reid Hall Studio, and what follows below is one performance taster and some further text describing more about the project.
By co-authoring a series of original prints and musical compositions, they test the possibilities of images as conductors of sound and sound as a compositional tool for images. By trading their specialist understandings of music and visual art respectively, they explore the possibilities for creative learning and play outside conventional disciplinary boundaries.
Raymond and Josephine’s process starts with hand drawn ‘grids’ and photographs. These then form the basis for the development of more intricate images that will later become graphic scores for musicians to perform – as well as being artworks in their own right.
The images are developed in many different ways. “Sometimes we work on them together in a studio, sometimes we work through remote collaboration using scanning and digital editing, and sometimes we work on them in a live setting during rehearsals with musicians,” said Raymond, “This organic process allows both the music and the visual outcomes to vary each time the score is played.”
The exhibition traces the development of Raymond and Josephine’s collaboration and their changing approach to co-authorship. Some of their previous work will be on display in the upper gallery, which includes three graphic scores that were created by Josephine in response to existing music by Raymond and Marilyn Crispell; these scores were extended through Josephine’s creation of responsive animations.
On Saturday the 6th of April, Tam Treanor and I were lucky enough to be invited to perform ‘SkypeBack’ at the Edinburgh Science Festival 2013. The venue was the fantastically modular and, in my opinion, somewhat THX-inspired Inspace. With their fantastic array of projectors and flexible routing matrix we were able to up the visual side of things and explore the use of light within this performance piece. To blend in with the colour palette of the space we also purchased and put on some neutral white outfits, which transformed us into moving projection surfaces. Please have a look at the edited short version of the video.
Below is the full live audio recording taken from the concert.
Here is the short edited live audio recording taken from the concert.
The pictures below were kindly taken by our friend and colleague Tracy Foster:
World Premiere of ‘Skypeback’, a feedback-based networked performance instrument. Unfortunately only the camera’s onboard audio was available for this video, but you do get the gist of what the performance contained.
Here is the audio from the public World Premiere performance of ‘Skypeback’, which took place in the Music & Sound research department within Edinburgh College of Art, The University of Edinburgh, at Alison House, Nicolson Square.
Now, the decision to place this in Sound Design or Music is an interesting one that’s still up for debate… The audio below was the result of a 3-month project based around the study of phase transitions of water – mainly solid ice to liquid – and the sounds or music that this could offer. It relates to, and extends, feedback networks and temporary performance interfaces. I have a lot to add here and will do so in time, once I extract the useful details from the Daisy World project website.