simultaneous ArtNet and MIDI output using passthrough?

DrNeon
Posts: 10
Joined: Tue Dec 22, 2020 6:04 pm
Real Name:

I'm trying to figure out how to get simultaneous ArtNet and MIDI output using a Universe configured as a passthrough. I think what I want to do is possible but I'm just not having much luck getting it set up.

First, the why:
I'm using QLC+ to develop sequences for some home-built animatronic Halloween props. The props receive ArtNet, but the best way I could find to save show sequences for replay is to have QLC+ generate MIDI (to a port made with loopMIDI, which is then recorded with a MIDI editor program), then replay the MIDI once the prop is triggered and use midimonster to convert it back to ArtNet. Convoluted, yes, but it works pretty well. So, if QLC+ is controlling the prop in real time, it speaks ArtNet to it directly; but if the prop is running based on motion-sensor triggers and pre-recorded sequences, it plays back a MIDI file (from a Raspberry Pi) that gets converted back to ArtNet (while also playing back multitrack audio using ecasound).
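For what it's worth, one side effect of the MIDI round trip described above is a loss of resolution: DMX channels are 8-bit, but MIDI CC values are only 7-bit. The exact scaling QLC+ and midimonster apply may differ from this sketch, which just illustrates the quantization inherent in the round trip:

```python
def dmx_to_midi(value: int) -> int:
    """Scale an 8-bit DMX value (0-255) down to a 7-bit MIDI CC value (0-127)."""
    return value >> 1

def midi_to_dmx(value: int) -> int:
    """Scale a 7-bit MIDI CC value back up to 8-bit DMX (0-254, even values only)."""
    return value << 1

# The low bit is lost in the round trip, so recorded servo positions
# land on even DMX values: 255 -> 127 -> 254.
print(midi_to_dmx(dmx_to_midi(255)))  # -> 254
```

A one-bit step is invisible on lights, but it can matter for tightly tuned servo endpoints such as a jaw's closed position.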

Why I want simultaneous output: I'd like to use the audio trigger widget to help choreograph jaw movement on my skeleton (doing it manually is a pain). If I'm operating in "real-time" mode (ArtNet out), I can have the audio trigger control one of the servo fixtures, and that works pretty well (I'm still running an older QLC+ version, 4.12.2, so the issue with low-level audio on the trigger widget is still there, but I can make it do what I want). But now I'd like to put out ArtNet and MIDI simultaneously, so that I can record whatever sequence is being produced as part of the real-time performance to a MIDI file for future playback. If I operate with just ArtNet output, I can't capture to a file. If I operate with just MIDI, I can't observe my skeleton as I talk to it in real time. So... I'd like both outputs at the same time.

What I've tried:
I've tried setting up another Universe ("Passthrough Universe") with the Input set to ArtNet (either the same IP my "ArtNet Out" Universe is using, or 127.0.0.1), MIDI as the Output, and "passthrough" checked. I don't seem to get anything coming in via the ArtNet input to the Passthrough Universe when I run the Virtual Console and move sliders around in the ArtNet Out universe. If I make a slider widget for the Passthrough Universe and set it to follow an input, that doesn't work, but manually moving the slider will put out values to the MIDI out, so I know the output side of the Passthrough Universe is working. I just can't seem to get the output from my ArtNet Out Universe back in as input to my Passthrough Universe. What am I missing? Is there a simple tutorial anywhere that explains how to have one universe output ArtNet and a second universe take that ArtNet as its input for further use?
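One way to test the input side of the Passthrough Universe in isolation is to hand-craft an ArtDmx packet and send it to 127.0.0.1:6454 yourself; if QLC+ still shows nothing, the problem is in the input configuration rather than the loopback path. A minimal sketch following the standard ArtDmx layout (the universe number and channel values here are arbitrary):

```python
import socket
import struct

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDmx packet (opcode 0x5000, Art-Net protocol version 14)."""
    pkt = b"Art-Net\x00"                     # fixed 8-byte ID string
    pkt += struct.pack("<H", 0x5000)         # OpDmx opcode, little-endian
    pkt += struct.pack(">H", 14)             # protocol version, big-endian
    pkt += bytes([sequence, 0])              # sequence, physical port
    pkt += struct.pack("<H", universe)       # 15-bit port-address, little-endian
    pkt += struct.pack(">H", len(channels))  # data length, big-endian
    return pkt + channels

if __name__ == "__main__":
    # Fire four channel values at a local ArtNet input for testing.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(artdmx_packet(0, bytes([255, 128, 64, 0])), ("127.0.0.1", 6454))
```

If this makes the passthrough's MIDI output move but QLC+'s own ArtNet output does not, the plugin is likely not looping its own traffic back on the same interface.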

Any help or guidance would be greatly appreciated, thanks!
GGGss
Posts: 3052
Joined: Mon Sep 12, 2016 7:15 pm
Location: Belgium
Real Name: Fredje Gallon

So if I'm getting the total picture, you want to merge Artnet signals?
In this scenario, you have to make sure you use unicast for the inputs and might use broadcast for the outputs. But a listening member of Artnet isn't supposed to subscribe to multiple sources (as far as I'm aware).
To ensure your signal flows are running correctly, I'd split things up during development, meaning you need another computer somewhere in the line. The free 'DMX Workshop' tool can sniff Artnet. This gives you assurance that the signal flow is present, and you can try to get signals into QLC+. Then, once signals reach QLC+, you can check the output side of QLC+ (Passthrough).
And then you can add the midi signal flow and verify it follows your wanted logic.
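If a second computer running DMX Workshop isn't handy, the same sanity check can be scripted. The sketch below (my own suggestion, not a QLC+ feature) binds the standard ArtNet UDP port 6454 and decodes ArtDmx frames; note it can only bind that port on a machine where QLC+ is not already listening:

```python
import socket
import struct

def parse_artdmx(packet: bytes):
    """Decode an ArtDmx frame; return (universe, channel values) or None."""
    if len(packet) < 18 or packet[:8] != b"Art-Net\x00":
        return None
    if struct.unpack_from("<H", packet, 8)[0] != 0x5000:  # OpDmx frames only
        return None
    universe = struct.unpack_from("<H", packet, 14)[0]
    length = struct.unpack_from(">H", packet, 16)[0]
    return universe, list(packet[18:18 + length])

def sniff():
    """Print the source IP, universe, and first channels of every ArtDmx frame seen."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 6454))  # standard ArtNet port
    while True:
        data, addr = sock.recvfrom(2048)
        decoded = parse_artdmx(data)
        if decoded:
            universe, values = decoded
            print(addr[0], "universe", universe, values[:8])
```

Call `sniff()` on the receiving machine while moving sliders; if nothing prints, the packets never arrive and the problem is upstream of QLC+'s input.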
All electric machines work on smoke... when the smoke escapes... they don't work anymore
DrNeon
Posts: 10
Joined: Tue Dec 22, 2020 6:04 pm
Real Name:

So I sort of have this working using ArtNet broadcast (X.X.X.255 output) for one universe and then another universe as a passthrough with ArtNet input and MIDI output. The problem I'm running into (at least with 4.12.2, need to check 4.12.7 as well) is that broadcasting results in super slow, jittery performance at the ArtNet node (an ESP8266 device). If I unicast direct to the node, performance is flawless, but then I can't capture and re-output it as MIDI. Also I'm in full transmission mode, but will try partial shortly to see if that cleans things up. In any case, I'm wondering if there's a way to have a unicast ArtNet output that is simultaneously sent over the MIDI output plugin, to avoid the whole broadcast issue. Or some other clever better way of doing what I'm trying to do (simultaneously ArtNet and MIDI output).
GGGss
Posts: 3052
Joined: Mon Sep 12, 2016 7:15 pm
Location: Belgium
Real Name: Fredje Gallon

aha ... ESP8266 - so you are using Wifi for the network connection, and your Access Point is restricting broadcast storms ... (hence the delayed and jittery outcome).
Generally speaking, Wifi doesn't like Artnet at all: first, the protocol is based on UDP (fire and forget), and second, it relies on broadcast packets. Wifi handles both poorly, because broadcasts may consume all bandwidth while being sent to every connected client. Therefore, some firmware (certainly the modern ones) ships with a 'broadcast storm control' mechanism enabled by default.
Try to disable this, or use cables. (Be warned that using this setup in a public venue on a public Wifi network will bring the same problems back, so I would avoid that.)
DrNeon
Posts: 10
Joined: Tue Dec 22, 2020 6:04 pm
Real Name:

Yes, using WiFi. Our network runs on Google Wifi routers, so there's not as much flexibility as with some other firmware out there, and I don't see any mention of this particular option anywhere online. Cables are not an option for this application: the skeletons run on battery-powered ESP8266 boards I've programmed as ArtNet nodes (among other functionality, namely some MQTT communication for sensors and such), and they are not close enough to any RJ45 jack on any system.

If you know how to make this change in the Google Wifi router firmware, please let me know. Otherwise, is there any way to do what I want without broadcasting? Again, my goal is not broadcasting; my goal is simultaneous ArtNet (unicast would be perfectly fine) and MIDI output. I'm trying to figure out how to get QLC+ to send the same signals out over both interfaces (ArtNet and MIDI) at the same time.

This is at my house, so no need to worry about public WiFi
GGGss
Posts: 3052
Joined: Mon Sep 12, 2016 7:15 pm
Location: Belgium
Real Name: Fredje Gallon

So the only reason you need MIDI is that you'd like to record and play back Artnet signals?
And what if you use a DMX recorder? There are tad recorders and playback can be triggered also...

Once your show is programmed, you still need both outputs?
DrNeon
Posts: 10
Joined: Tue Dec 22, 2020 6:04 pm
Real Name:

Yes, MIDI is used to capture the output from QLC+ to a file for later playback (during which the MIDI file is converted back to DMX). Each prop (skeleton) has its own MIDI file that gets played to a separate port and can be selectively muted and unmuted (via a remote signal) to start/stop animatronic motion (in conjunction with multitrack audio playback, with different tracks turning on and off at the same time as motion playback).

If you know of freeware DMX recorder/playback software that will run on a Raspberry Pi Zero W and allows such multitrack control of both DMX and audio (via MQTT interfacing), please do let me know. Right now a motion sensor on each prop sends a signal via MQTT to the Raspberry Pi, triggering playback start and selective unmuting of motion (MIDI->DMX) and audio (WAV) for each prop (via ecasound).
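Not an answer to the QLC+ question, but for concreteness the Pi-side trigger routing described above reduces to a small lookup. The topic names and port labels below are made up for illustration; in a real setup a paho-mqtt `on_message` callback would feed this function:

```python
# Hypothetical MQTT topics mapped to playback resources; a real setup's
# topic layout and labels will differ.
PROPS = {
    "props/skeleton1/motion": {"midi_port": "skel1", "audio_track": 1},
    "props/skeleton2/motion": {"midi_port": "skel2", "audio_track": 2},
}

def on_trigger(topic: str):
    """Map a sensor topic to the unmute actions for that prop's MIDI and audio."""
    prop = PROPS.get(topic)
    if prop is None:
        return []  # unknown topic: do nothing
    return [("unmute_midi", prop["midi_port"]),
            ("unmute_audio", prop["audio_track"])]
```

Keeping the routing pure like this makes it easy to test without a broker; the actual unmuting (MIDI player, ecasound) would be driven by the returned action list.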

Once the show is programmed, I do not need both outputs. I need both outputs to better use the audio trigger function to help program the show, particularly with jaw movement (which is a pain to choreograph manually). I need to be able to send audio trigger output via DMX (so I can visualize what the prop is doing) and via MIDI (to record what is going on, via my current record/playback scheme).

I googled "tad recorder DMX" but didn't find anything that seemed relevant (only pricey hardware DMX decoders, not DMX recording/playback software)
GGGss
Posts: 3052
Joined: Mon Sep 12, 2016 7:15 pm
Location: Belgium
Real Name: Fredje Gallon

Sorry - tad is a word saying numerous - so I wanted to point out that there are numerous DMX recorder solutions.
But now you are squeezing the funnel again, saying it has to run on a Raspberry Pi Zero W...
A quick search gave me this list: https://www.google.com/search?q=standal ... e&ie=UTF-8
edogawa
Posts: 630
Joined: Thu May 07, 2015 10:34 am
Real Name: Edgar Aichinger

Sorry - tad is a word saying numerous - so I wanted to point out that there are numerous DMX recorder solutions
Fredje - I'm not a native speaker either, but in my understanding it means "a bit", "little", "small" or "few", quite the opposite of what you think it means, and all the online translators to German tell me the same - "a tad bit" means "a very small amount", for example.
DrNeon
Posts: 10
Joined: Tue Dec 22, 2020 6:04 pm
Real Name:

I think we are going off on a tangent here. I'm not squeezing a funnel; I'm describing the setup I have (as described in the original post) and how I employ the software, QLC+, that this forum is devoted to. I am asking whether QLC+ can produce simultaneous output via ArtNet and MIDI using a route that runs better than the current "broadcast-over-ArtNet-and-passthrough-to-MIDI-output" approach I have sort of gotten working. I still don't know whether there is a better way to do that.

I appreciate the thought, but solutions along the lines of "just do it a totally different way than what you have mostly working" aren't very helpful. Yes, I have googled ArtNet recorder and playback software for the RPi. And I have looked into them (e.g. Falcon player) and they won't work for my purposes. And, wouldn't you know it, many (most?) of the links in that search you suggest point to....QLC+. In any case, the issue is how to record the DMX output produced by using the audio trigger feature for subsequent playback at a later time.
GGGss
Posts: 3052
Joined: Mon Sep 12, 2016 7:15 pm
Location: Belgium
Real Name: Fredje Gallon

I carry a 10Mbit hub (not a switch) in my tool case. I put it in the middle of an existing network and can sniff whatever is there.
That way, you could keep the unicast working method and still capture the network packets. Use that as an input in another universe and have it passed through to your MIDI world?
I had to think about it over the weekend... this should work.
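A software variant of the hub trick, assuming a third device (e.g. a spare Pi) is available at its own IP: point QLC+'s unicast ArtNet output at that device and run a small UDP relay on it that duplicates every packet to both the node and back to the machine running QLC+, where the Passthrough Universe's ArtNet input can pick it up. All addresses below are hypothetical:

```python
import socket

# Hypothetical addresses; adjust to your network.
NODE = ("192.168.1.50", 6454)   # ESP8266 ArtNet node
QLC  = ("192.168.1.10", 6454)   # machine running QLC+ (passthrough input)

def forward_targets(src_ip: str):
    """Duplicate traffic to both destinations, but never echo back to the sender."""
    return [dest for dest in (NODE, QLC) if dest[0] != src_ip]

def relay():
    """Receive unicast ArtNet on this host and re-unicast it to node and QLC+."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 6454))  # the relay must own the ArtNet port on its host
    while True:
        data, (src_ip, _) = sock.recvfrom(2048)
        for dest in forward_targets(src_ip):
            sock.sendto(data, dest)
```

This keeps everything unicast end to end, so the WiFi broadcast-storm problem never comes up, at the cost of one extra hop of latency.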
DrNeon
Posts: 10
Joined: Tue Dec 22, 2020 6:04 pm
Real Name:

So... it sounds like the answer is no: QLC+ (by itself) cannot do simultaneous output of ArtNet and MIDI without using ArtNet broadcast. That's too bad.