Automated lights with Python and some AI

tume
Posts: 4
Joined: Sun Jul 21, 2024 5:40 am
Real Name: Tuomas Niemi


A showcase of some Python code I've written to automate DMX lights with QLC+.

Given an mp3 file, the code runs the song through some models to predict the BPM, find song segments, and split the instrument tracks. Some simple volume analysis with Librosa helps recognize the more energetic parts. Hardcoded lighting scripts are then fitted to those segments, and strobes based on drum onsets are added (really only works with metal).
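
In rough terms, the Librosa side of the analysis boils down to something like this (a simplified sketch of the idea, not the actual code in the repo; the function name and thresholds here are just for illustration):

import librosa

def loud_sections(path, hop_length=512):
    # Load the audio and compute a frame-wise loudness (RMS) curve
    y, sr = librosa.load(path, sr=None)
    rms = librosa.feature.rms(y=y, hop_length=hop_length)[0]
    times = librosa.times_like(rms, sr=sr, hop_length=hop_length)

    # Rough global tempo estimate, handy for syncing effects to the beat
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)

    # Frames noticeably louder than the track average count as "energetic"
    loud = rms > rms.mean() + rms.std()

    # Collapse consecutive loud frames into (start, end) regions in seconds
    sections, start = [], None
    for t, flag in zip(times, loud):
        if flag and start is None:
            start = t
        elif not flag and start is not None:
            sections.append((start, t))
            start = None
    if start is not None:
        sections.append((start, times[-1]))
    return tempo, sections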

The code writes a script for each element separately and synchronizes them, after which they are placed into a collection that contains the whole show. All of that, plus some chasers and buttons (for the fun of it), is written to a new QLC+ file built on top of an existing setup.
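
For the file-writing part, the general idea is to append new Function elements to an existing workspace XML. A very rough sketch with ElementTree could look like the following; the element and attribute names are based on what a saved .qxw workspace looks like, so verify them against your own file rather than treating this as the exact code from the repo:

import urllib.parse
import xml.etree.ElementTree as ET

QXW_NS = "http://www.qlcplus.org/Workspace"
ET.register_namespace("", QXW_NS)

def append_script(in_path, out_path, func_id, name, commands):
    # Load an existing workspace and locate its <Engine> section
    tree = ET.parse(in_path)
    engine = tree.getroot().find(f"{{{QXW_NS}}}Engine")

    # Add a Script function; QLC+ stores each script line URL-encoded
    func = ET.SubElement(engine, f"{{{QXW_NS}}}Function",
                         ID=str(func_id), Type="Script", Name=name)
    for line in commands:
        ET.SubElement(func, f"{{{QXW_NS}}}Command").text = urllib.parse.quote(line)

    tree.write(out_path, xml_declaration=True, encoding="UTF-8")

# Hypothetical usage: a short strobe hit on fixture 0
# append_script("setup.qxw", "show.qxw", 100, "Strobe hit",
#               ["setfixture:0 ch:0 val:255", "wait:0.1s", "setfixture:0 ch:0 val:0"])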

This is a very experimental project and the code needs a lot of refactoring. I've built the program on WSL2, and getting all the necessary dependencies installed can also be annoying. I can try running the code for your lighting rig or give some insight into the code if you're interested. Add me on Discord, username: _tume_

The models: https://github.com/mir-aidj/all-in-one? ... me-ov-file

The song is Sleepwalker by Parkway Drive.
prophy17
Posts: 118
Joined: Tue Apr 09, 2019 9:24 pm
Real Name: Vladimir

Yes, it is a very interesting topic. I am not a professional in the software industry, but I am very interested in your great work. However, I don't have Discord.
prophy17
Posts: 118
Joined: Tue Apr 09, 2019 9:24 pm
Real Name: Vladimir

I would like, for example, to automatically find with AI the places in a track where a strobe effect should start and stop. Would you mind helping with this?
prophy17
Posts: 118
Joined: Tue Apr 09, 2019 9:24 pm
Real Name: Vladimir

And one more topic: I am interested in colormusic effects for lyrical compositions. The QLC+ trigger and even the beat signal of the OS2L plugin in Virtual DJ often cannot cope with such music. Do you have any work with AI in that direction?
Last edited by prophy17 on Tue Nov 19, 2024 9:12 am, edited 1 time in total.
GGGss
Posts: 3052
Joined: Mon Sep 12, 2016 7:15 pm
Location: Belgium
Real Name: Fredje Gallon

tume wrote: Sun Jul 21, 2024 5:54 am (really only works with metal)
Vladimir, the OP already noted the limitations of such a system. I've seen some pretty nifty programs that can analyse music, but the outcome is always too harsh to be used practically. If you like Chinese light plans, go ahead: bright, randomly flashing colours without any ambience or atmosphere. It will look like the wedding DJ with his four pars behind him.
All electric machines work on smoke... when the smoke escapes... they don't work anymore
prophy17
Posts: 118
Joined: Tue Apr 09, 2019 9:24 pm
Real Name: Vladimir

I think there is a lot of work for AI in the world of colormusic.
prophy17
Posts: 118
Joined: Tue Apr 09, 2019 9:24 pm
Real Name: Vladimir

Dear Mr GGGss, thanks for your reply. I know about that, and that is exactly why I think we should change this situation. AI should be the future of colormusic.
prophy17
Posts: 118
Joined: Tue Apr 09, 2019 9:24 pm
Real Name: Vladimir

And one more question for the author. Would you mind explaining why you use only Linux (via WSL2, as you wrote) and NATTEN, which does not get along well with Windows? Many users work with music and colormusic on Windows.
tume
Posts: 4
Joined: Sun Jul 21, 2024 5:40 am
Real Name: Tuomas Niemi

prophy17 wrote: Tue Nov 19, 2024 8:33 am I would like, for example, to automatically find with AI the places in a track where a strobe effect should start and stop. Would you mind helping with this?
I used librosa's onset feature to find peaks in the drums track. Here's the code responsible for that:

https://github.com/tumffa/aidmx/blob/ma ... /onsets.py

With some tweaks, you could probably get it to work for different instruments or perhaps even just the snare drum, for example.
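
If it helps, the gist of the onset detection is roughly this (a simplified sketch of the idea, not a copy of onsets.py; the default threshold here is just for illustration):

import librosa

def drum_onsets(drums_path, threshold=0.3):
    # Load the separated drums stem and compute an onset strength envelope
    y, sr = librosa.load(drums_path, sr=None)
    env = librosa.onset.onset_strength(y=y, sr=sr)

    # Normalize so the threshold is independent of the track's loudness
    env = env / env.max()

    # Pick onset frames and convert them to timestamps in seconds
    frames = librosa.onset.onset_detect(onset_envelope=env, sr=sr)
    times = librosa.frames_to_time(frames, sr=sr)

    # Keep only onsets whose strength clears the threshold
    return [t for t, f in zip(times, frames) if env[f] >= threshold]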
tume
Posts: 4
Joined: Sun Jul 21, 2024 5:40 am
Real Name: Tuomas Niemi

prophy17 wrote: Tue Nov 19, 2024 9:22 am And one more question for the author. Would you mind explaining why you use only Linux (via WSL2, as you wrote) and NATTEN, which does not get along well with Windows? Many users work with music and colormusic on Windows.
I can't remember exactly, but I think the models were harder to get working on Windows. You could probably get it to work, though. Sorry for not having put the time into making the installation easier.
prophy17
Posts: 118
Joined: Tue Apr 09, 2019 9:24 pm
Real Name: Vladimir

Thanks a lot, dear Mr tume, for your reply. But I am sorry, you did not answer my question about using AI for colormusic with lyrical melodies. And as I have written before, I am not a professional in the software industry; I only know a little VBA :) . So excuse my question, but there is a line at the start of your onsets.py file:
def detect_onsets(filename, threshold)
What does it mean? Which filename (the music track as mp3 or wav, with its path, or something else) and which threshold? Would you mind sending a real example of calling this line of your software?
prophy17
Posts: 118
Joined: Tue Apr 09, 2019 9:24 pm
Real Name: Vladimir

And thanks once more for your great work with AI for QLC+, dear Mr tume. I am amazed by it. I think it will kick-start the use of AI in QLC+ among other users who know Python or another language. As I have said, I think AI is the future of colormusic.
tume
Posts: 4
Joined: Sun Jul 21, 2024 5:40 am
Real Name: Tuomas Niemi

prophy17 wrote: Thu Nov 21, 2024 5:26 am Thanks a lot, dear Mr tume, for your reply. But I am sorry, you did not answer my question about using AI for colormusic with lyrical melodies. And as I have written before, I am not a professional in the software industry; I only know a little VBA :) . So excuse my question, but there is a line at the start of your onsets.py file:
def detect_onsets(filename, threshold)
What does it mean? Which filename (the music track as mp3 or wav, with its path, or something else) and which threshold? Would you mind sending a real example of calling this line of your software?
Sorry for being unclear. The filename is just the path to the audio file (I use the drums track).

This is the function I use to find the time windows for strobes https://github.com/tumffa/aidmx/blob/3a ... ts.py#L128.

After normalizing the onsets, the threshold is used to filter out quiet ones. Then it finds closely grouped onsets and creates time windows for them. Lastly, it merges time windows that are close to each other.
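
In pseudo-Python, that grouping step looks roughly like this (the parameter names and default values here are just for illustration, not the ones used in the repo):

def onset_windows(onset_times, min_count=4, max_gap=0.35, merge_gap=1.0):
    # Group onsets that follow each other within max_gap seconds
    groups, current = [], [onset_times[0]]
    for t in onset_times[1:]:
        if t - current[-1] <= max_gap:
            current.append(t)
        else:
            groups.append(current)
            current = [t]
    groups.append(current)

    # Keep only dense groups and turn them into (start, end) windows
    windows = [(g[0], g[-1]) for g in groups if len(g) >= min_count]

    # Merge windows that sit close to each other
    merged = []
    for start, end in windows:
        if merged and start - merged[-1][1] <= merge_gap:
            merged[-1] = (merged[-1][0], end)
        else:
            merged.append((start, end))
    return merged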

The segments parameter is just a list of section start and end times, for example:
{
"start": 45.33,
"end": 59.79,
"label": "verse",
}

If you hardcode a file path and a list of segments like [{"start": 0.00, "end": 60.00}] into the get_onset_parts function, you should be able to get it to work :)
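
Something along these lines is what I mean; the argument order and names here are a guess, so check the actual definition of get_onset_parts in onsets.py:

from onsets import get_onset_parts

# Treat the whole first minute of the track as a single segment
segments = [{"start": 0.00, "end": 60.00, "label": "whole"}]

# Path to the (separated) drums track to build strobe windows from
strobe_windows = get_onset_parts(segments, "path/to/drums.wav")
print(strobe_windows)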

Also, if you haven't already, I suggest copy-pasting the code to some LLM to get some further explanations.