Thursday, 29 January 2015

Perfect Find

Very brief post: here is a link to a project that I will be using as a guide and reference for the Arduino side of mine.

I have mentioned the MSGEQ7 a few times before; this project uses it with the Arduino to generate data to drive the LEDs via PWM (Pulse Width Modulation). There are a few other components needed, but since I am going to be buying them all soon and commencing my build, I would rather save listing them all and going in depth about what they do until then. I am going to spend the rest of the evening putting together a list of prices for all the components and where I can get them. Hopefully I should be able to commence building within a few weeks rather than a month as previously stated.
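To give a flavour of what that Arduino code ends up doing, here is the band-to-brightness scaling step as plain C++ (a rough sketch of my own, not taken from the linked project; the noise-floor value is a placeholder to be tuned by ear). On the Arduino itself you would read each of the seven MSGEQ7 bands with analogRead after strobing the chip, then feed the result of this function to analogWrite:

```cpp
#include <cstdint>

// Subtract a small noise floor so the LEDs stay dark in silence,
// then rescale one 10-bit MSGEQ7 band reading (0-1023) onto an
// 8-bit PWM duty cycle (0-255). kNoiseFloor is a guess to tune.
const int kNoiseFloor = 80;

uint8_t bandToPwm(int reading) {
    if (reading < kNoiseFloor) return 0;
    long scaled = (long)(reading - kNoiseFloor) * 255 / (1023 - kNoiseFloor);
    return (uint8_t)scaled;
}
```

The Arduino environment's built-in map() helper can do the same rescale, but writing it out makes the noise floor explicit.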
I also live with another Honours student, and we have talked about the possibility of integrating such a system with acoustic treatments in order to give them far more vibe and presence within a room - this could be pretty marketable to the next generation of bedroom producer. My aim would simply be to keep my device priced below $100, and I reckon it would sell with treatments for sure. Exciting stuff!




So much going on

As well as using this blog as a means to demonstrate where my project is at and where I am going with it, I also want to continue to contextualise the project, as I feel that some people may not see the need for the system I'm proposing, or maybe even understand why you would want it.

There are tons of music tech things going on at the moment if you have been keeping up with NAMM 2015, a musical instrument and technology trade fair in the US.

Something of interest with regards to this project is a new range of video synthesis modules aimed at the Eurorack modular format. These enable the user to generate their own VJ show in tandem with the music that the modular synth is producing. They also give the user the ability to have the system set up in their room so they can see what their music is doing, in a manner of speaking anyway.


The beauty of this is that the device is under voltage control, meaning that whilst we can of course have complete and accurate control over all the parameters, we can also set up a modular to run and change of its own volition. Here is another example, this time running in tandem with a Korg Monotribe.


There is actually a new modular system for within Ableton Live that has only just been released. I could look at integrating one of the simple systems I have already demonstrated and try to create an in-the-box version. I quite like this idea; more on that later.

OSCILLIOT - Max for Live Modular System

It has also got me thinking again that I need to look into whether or not the Arduino can be controlled by CV. This would open up a whole new world of possibilities.
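For reference, the Arduino's analogue inputs only accept 0-5 V, so any bipolar Eurorack CV would need an op-amp offset/attenuator stage in front of it first. The software side of reading a conditioned CV is then just scaling the ADC value; a minimal sketch of the idea in plain C++, with the mode mapping purely hypothetical:

```cpp
// Convert a 10-bit ADC reading (0-1023) into volts, assuming the CV
// has already been conditioned into the Arduino's 0-5 V input range.
float adcToVolts(int reading) {
    return reading * 5.0f / 1023.0f;
}

// Map that voltage onto a parameter - here a hypothetical display
// mode index 0-3, clamped at the top of the range.
int voltsToMode(float volts) {
    int mode = (int)(volts / 5.0f * 4.0f);
    return mode > 3 ? 3 : mode;
}
```

On the Arduino the reading itself would come from analogRead on one of the A0-A5 pins.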

As this project is also focused on how music relates to sound, I am shortly going to start posting a track of my own composition that has then had a Max for Live visual patch applied in order to create a music video. I feel this will be a great way to keep you in the loop on everything I am doing within the project - a window into my testing through use of the system, although at this point it is more developing the system through use. The next post shall be back to practical stuff that I have created.

Wednesday, 28 January 2015

Been a while

Hey there, sorry it has been pretty hectic over Christmas and New Year. Getting right back into the swing of things now though.

So here is a cool video that I stumbled across recently, made cooler by the fact it links to some really useful places online, namely an Arduino forum post about FFT data driving LED colour spectrums.


Although I can already see this changing too rapidly for any sort of analytical use, it could be fun to include it as a setting on the Arduino to get a bit more vibe going when you are just listening back to what you have created, or to other people's work.
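One way I could tame those rapid changes for the more analytical settings is smoothing each band with an exponential moving average, so the LED levels glide rather than flicker. A quick sketch of that in plain C++ (the smoothing coefficient is a guess I would tune by ear):

```cpp
// Exponential moving average per MSGEQ7 band: each new reading only
// nudges the stored level, so the LEDs glide instead of flickering.
// kAlpha closer to 1.0 reacts faster; 0.2 is a starting guess.
float smoothed[7] = {0};
const float kAlpha = 0.2f;

float smoothBand(int band, float newValue) {
    smoothed[band] += kAlpha * (newValue - smoothed[band]);
    return smoothed[band];
}
```

A "vibe" mode could simply use a high kAlpha (or none at all) while analytical modes use a low one.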

On the more analytical side, we can look at something like the setup in the video below; I am pretty sure this could be calibrated for use as a VU meter and RMS meter. As the ideas keep coming, the device I am envisioning is a modernised real-time analyser - I will make sure to get my ideas down in a drawing soon.
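For the RMS side, the core measurement is simple enough to sketch now: square the samples in a block, average them, and take the square root. A VU-style ballistic response would then smooth successive RMS values, but the basic calculation is just this (plain C++, my own sketch rather than anything from the video):

```cpp
#include <cmath>

// RMS of one block of samples: mean of the squares, then square root.
// A VU-style meter would additionally smooth successive RMS values
// to get the characteristic slow needle movement.
float rms(const float* samples, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i) sum += samples[i] * samples[i];
    return std::sqrt(sum / n);
}
```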


This is neat because I could give the user control over certain parameters, enabling them to tailor their visual experience to the way they like it. The beauty of the Arduino is that it can be loaded with multiple libraries of code that can be cycled through very simply for different uses.
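The cycling itself could be as simple as a button press advancing a mode index that the main loop branches on. A tiny sketch of that idea in plain C++ (the mode count and names are placeholders, not a final feature list):

```cpp
// Advance to the next display mode on each button press, wrapping
// back to the first. Three modes here purely as an example:
// 0 = spectrum analyser, 1 = VU/RMS meter, 2 = FFT-colour fun mode.
const int kNumModes = 3;

int nextMode(int current) {
    return (current + 1) % kNumModes;
}
```

In real hardware the button would also need debouncing before each press is counted.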

Sunday, 7 December 2014

Visualising Music 3

For some typically obscure reason, my phone has decided not to record video anymore, which is annoying as I wanted to use it in place of screen-grab software for now. Anyway, this is the most advanced and relevant of the tutorials featured in Max 7.

Here we really start to see the power of using the audio stream for control over visual parameters. This patch again uses the "p turn" subpatchers mentioned in the last post, but with a different path for the VIZZIE data created from the amp and timbre data of the audio.


This utilises the "jit.world" and "jit.mesh" objects within the Jitter category of Max 7. jit.world simply creates the window and container environment for the visuals to be created within; without it we wouldn't see anything, making it a vital part of the patch.

The "floating" and "erase_colour" settings are best left alone when you are playing; however, to get the patch to work you first need to activate it by clicking the "X" in the top left corner, then set the audio to play as in the last tutorial.

The two green devices you see within the patch are again BEAP objects, "INTERPOL8R" and "SLIDR". These control parameters within the "jit.mesh" object below them (which you cannot see in the screen grab). They basically control the grid mesh graphics that can be created within jit.mesh.

Also controlling the jit.mesh object is another object, "jit.gl.material", which has many functions, but in this case is used to control the colour palette of the grid mesh.

When I took the screen grab I was in the middle of starting to iterate on this tutorial device by using the amplitude data to control the colour palette. I will follow up on this development in further posts, but once again, open this patch up and have a play.

The controls you are looking to use in the patch are those contained in the INTERPOL8R and SLIDR objects. For some reason the interp mode on the INTERPOL8R object is a bit temperamental. It starts on an interpolation that is guaranteed to create visuals. Some others work too, but some just make all the visuals disappear. Being honest, I don't know why... What can I say, I'm still learning!

https://www.dropbox.com/s/k8wg3nh8tjw4b7a/Visualising%20Music%203.2.maxpat?dl=0


Max 7 Visualising Music 2

In this next tutorial patcher, we learn about the two subpatchers - "p turn timbre into VIZZIE data" and "p turn amplitude into VIZZIE data".

This is extremely useful knowledge in terms of this project, as timbre and amplitude are two of the most useful data streams from audio for controlling visuals.
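Conceptually, "turning amplitude into data" is an envelope follower: rectify the audio signal and smooth it into a slowly varying control stream. A minimal sketch of the idea in plain C++ (the coefficient is illustrative, not taken from the VIZZIE subpatcher's internals):

```cpp
#include <cmath>

// One-pole envelope follower: take the absolute value of each audio
// sample and smooth it, yielding the kind of slow control stream the
// "p turn amplitude into VIZZIE data" subpatcher outputs. The
// coefficient is a placeholder; closer to 1.0 = smoother and slower.
struct EnvelopeFollower {
    float env = 0.0f;
    float coeff = 0.99f;
    float process(float sample) {
        env = coeff * env + (1.0f - coeff) * std::fabs(sample);
        return env;
    }
};
```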

So the music loop this time routes along two paths. The first is straight to a device called Stereo from the BEAP library, which is essentially a DAC for routing audio out of Max.

The second path is more convoluted. It first routes into the subpatchers mentioned above. You can see the change in data types - while it is an audio signal in Max, yellow-and-black striped cables are used; as soon as it becomes a numerical data stream, the cable changes to a flat grey colour.

The device they feed is called Patterniser. This is the device that gives shape to the graphics you can see at the bottom of the patch. You have choices for many of the parameters of the graphics, including shape position, size, pixel seeds etc.

The device you see in between is called MAPPR. This device allows you to control the levels of RGB saturation within the visualisations.

Run the Drum Loop at the top, and then simply play with the parameters. You can also draw your own saturation curves on the RGB MAPPR device. Have a play about!


And here is a link to the project itself.

https://www.dropbox.com/s/rdbiw96m7q2in9w/Visualising%202.maxpat?dl=0



 


Max 7 Visualising Music 1

In the first tutorial, we learn how to use a combination of the Jitter, MSP and new BEAP sub-categories within Max 7.

Here is a screen grab from the finished article.


On the left above, you can see a drum loop being routed into two devices. On the left you have a four-way splitter, with VU-style meter visualisations to show how hot each of the four frequency bands is. On the right we have a sonogram, which shows an FFT-based visual spectrum of the sound, with the top of the graph being the high frequencies, scaling down to the bottom where the lowest frequencies are.

On the right of the picture, you see another patch which allows for a different view of spectral analysis. When you open the patch, use the drop down menu with "drum" written on it. Select any of the samples there and the device shall work. Below is a link to the patch via Dropbox.

https://www.dropbox.com/s/xrwsnxcguyoh94m/Visualising%20Music%20%601.maxpat?dl=0


Introduction to Max 7

So it's been a while since my last post because of the fairly large amount of paperwork needing done; however, I have been working in Max in my spare time. So rather than going over the breadth of devices available, I am going to focus in now on the devices I am using for development.

I am going to use my digital camera to record the screen rather than use screen grab software at this point as I am pushed for time. This is mostly for the benefit of the Pre-Production document so I can disseminate where I am at in the project properly.

So I touched on the fact that Max 7 had been released. I knew this was coming for a while, and I actually stopped the work in Max 6 that I was doing over the summer to wait for the release, as it is a huge platform upgrade. One of the main advancements is being able to develop and/or use Max for Live devices within Max, without Ableton actually being open. There have also been massive upgrades to the API and search index that make the program far easier to use and understand, which, as someone who doesn't understand programming at a deep level, was very appealing!


If you have ever used Max 6 or previous versions, then the picture above will probably look quite unfamiliar. In Max 6, making a project look like this would have taken a massive amount of time. As of Max 7, these modular objects are all drag-and-drop, contained within a neat search system with great tagging and innovative ways of file searching.

They have also added a frankly incredible tutorial system that is built into the program. As time goes on this tutorial section will be expanded, but it ships with 5 tutorials that show you a massive amount of what is new to Max 7.

It is encouraging for this project that, with the exception of one tutorial, they are all aimed at exploring the new ways Max 7 allows the user to manipulate sound and visuals. (Someone is looking out for me!)

At this point I would say it's advisable to go and get the 30-day free and fully working trial of Max 7 from cycling74.com if you don't have it already, as this will allow you to inspect and play with the projects I have made through following the tutorials. (Kenny, I'm looking at you)

Next post please....