
Accessing ArtNet device that doesn't support ArtPoll/ArtPollReply

Posted: Tue Dec 22, 2020 7:59 pm
by DrNeon
Total newbie here, just starting to learn QLC+ for my grand vision of a motion-activated skeleton band. My goal is to have an ESP8266 for each prop/fixture that controls LEDs, servos, and the like, and that also has a motion sensor. When the sensor is tripped, it unmutes a track of a multi-track audio file and enables whatever sequence of LED and servo movements is supposed to be going on at that point in the music (the first motion event starts playback and also unmutes the tracks corresponding to whichever prop saw the motion). I have a simplified version of this working with Node Red and ecasound and the like, but it can only turn motors and LEDs on and off slowly; it isn't capable of moving servos in time with the music (no sequencing, and not fast enough, since it's just using MQTT). So I've decided to rework this with more established software and hardware, and since QLC+ seems OK with multitrack audio, it seemed like a good place to start.

I'd like to have each ESP8266 respond to ArtNet data (or E1.31, not sure why one format would be better than another in this context, but open to opinions on this) by PWMing its GPIO pins (or, more likely, using a PCA9685 board or two for PWM output). I'm pretty sure I can parse the incoming ArtNet data to get the hardware to do what I want on the output side. The ESP8266 would also send data back to QLC+ if the motion sensor was activated (indicating which device saw motion), at which point various tracks may be unmuted (via the websockets interface? still figuring out how that will work).
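To make the plan concrete, the rough shape I have in mind for the output side is the sketch below (untested; the callback signature is the one from the ArtnetWifi library examples, and the SSID, pins, and channel layout are placeholders I picked for illustration):

Code: Select all

// Minimal, untested sketch: map the first three DMX channels of universe 0
// to PWM on three GPIO pins driving an RGB LED. SSID/password, pin numbers,
// and the channel layout are placeholders; the callback signature follows
// the ArtnetWifi library examples.
#include <ESP8266WiFi.h>
#include <ArtnetWifi.h>

const char* ssid = "my-ssid";            // placeholder
const char* password = "my-password";    // placeholder

const int redPin = 12, greenPin = 13, bluePin = 14;  // example GPIOs

ArtnetWifi artnet;

// Called once for every ArtDmx frame the library parses
void onDmxFrame(uint16_t universe, uint16_t length, uint8_t sequence, uint8_t* data)
{
  if (universe != 0 || length < 3) return; // only react to universe 0 here
  analogWrite(redPin,   data[0]);          // DMX channel 1
  analogWrite(greenPin, data[1]);          // DMX channel 2
  analogWrite(bluePin,  data[2]);          // DMX channel 3
  // Servos / more channels would go to a PCA9685 (e.g. via the Adafruit
  // PWM Servo Driver library) instead of the bare GPIOs.
}

void setup()
{
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) delay(100);
  analogWriteRange(255);                   // scale ESP8266 PWM to DMX's 0-255
  artnet.begin();
  artnet.setArtDmxCallback(onDmxFrame);
}

void loop()
{
  artnet.read();                           // parse any pending Art-Net packets
}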

So I've got an ESP8266 running the ArtNetWifi example (or the ArtNetNodeWifi example), and it sees ArtNet data I put on the network with Jinx, but the code doesn't seem to respond to polling, and so the device isn't showing up in the list of ArtNet plugin devices. The help file suggests that it is possible to communicate with ArtNet devices that do not support ArtPoll/ArtPollReply, but I don't quite see how to set that up. Any advice on how to get QLC+ talking to an ArtNet device that isn't showing up on that list? Again, I can probably switch to E1.31, unless there's some reason I'm missing that it wouldn't work for this. I could also try to figure out how to write the polling response behavior into the ESP8266 code.
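If I do end up writing the polling response myself, my reading of the Art-Net spec so far is that it just means watching UDP port 6454 for an ArtPoll packet (opcode 0x2000) and sending back a 239-byte ArtPollReply. A very rough, untested sketch of that idea (the field offsets are my own reading of the spec and would need verifying; the names are placeholders):

Code: Select all

// Very rough, untested sketch of answering ArtPoll with ArtPollReply so the
// node shows up in QLC+'s device list. Field offsets are my reading of the
// Art-Net spec and need verifying; names and credentials are placeholders.
// If an Art-Net library is already bound to port 6454, this logic would
// have to live inside that library's packet handler instead.
#include <ESP8266WiFi.h>
#include <WiFiUdp.h>

WiFiUDP udp;
const uint16_t ARTNET_PORT = 6454;
uint8_t packetBuffer[530];

void sendArtPollReply(IPAddress dest)
{
  uint8_t reply[239] = {0};
  memcpy(reply, "Art-Net", 8);                 // ID string incl. trailing null
  reply[8] = 0x00; reply[9] = 0x21;            // OpPollReply = 0x2100, low byte first
  IPAddress ip = WiFi.localIP();
  reply[10] = ip[0]; reply[11] = ip[1]; reply[12] = ip[2]; reply[13] = ip[3];
  reply[14] = 0x36;  reply[15] = 0x19;         // port 6454, low byte first
  strncpy((char*)&reply[26], "ESP8266 prop", 17);               // short name (18 bytes)
  strncpy((char*)&reply[44], "ESP8266 skeleton-band prop", 63); // long name (64 bytes)
  reply[173] = 1;                              // one port
  reply[174] = 0x80;                           // port type: outputs DMX from Art-Net
  udp.beginPacket(dest, ARTNET_PORT);
  udp.write(reply, sizeof(reply));
  udp.endPacket();
}

void setup()
{
  WiFi.begin("my-ssid", "my-password");        // placeholders
  while (WiFi.status() != WL_CONNECTED) delay(100);
  udp.begin(ARTNET_PORT);
}

void loop()
{
  int size = udp.parsePacket();
  if (size >= 10) {
    udp.read(packetBuffer, sizeof(packetBuffer));
    uint16_t opcode = packetBuffer[8] | (packetBuffer[9] << 8);
    if (memcmp(packetBuffer, "Art-Net", 8) == 0 && opcode == 0x2000) { // OpPoll
      sendArtPollReply(udp.remoteIP());
    }
  }
}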

I've looked into the ESPixelStick firmware, but I think I'd need to modify the code to get the PWM outputs I want (and to get the motion sensor input back to QLC+), and the need to install gulp on Windows is currently getting in the way (not impossible, just lots of extra steps; it may be worth it though...)

I guess another question worth asking is whether it will be possible to get the motion sensor input on one of the ESP8266s to unmute, in realtime, an audio track and various other tracks (the first device that sees motion starts playback of the whole sequence and unmutes its unique set of tracks, and subsequent motion on other props should unmute additional tracks during playback...hope I've got the nomenclature correct).

The MQTT broker for the current implementation with Node Red and ecasound is running on a Raspberry Pi Zero W, and for the new implementation, I'd likely want QLC+ running on that board controlling the show (outputting audio, sending data to turn LEDs and servos on/off over WiFi, and responding to motion sensing events reported over WiFi).

Re: Accessing ArtNet device that doesn't support ArtPoll/ArtPollReply

Posted: Wed Dec 23, 2020 10:24 am
by GGGss
A lot of if, then, else ;-)
Triggering a music track can be done - unmuting a track is not possible. QLC+ is a lighting program and not a DAW after all.
But you could use QLC+ to send out a signal to a DAW to unmute the track... Keep audio separated from lights/motors/servos.

If this is going to be a movable installation, I strongly suggest using cable for the network part of your setup.
How many times have we troubleshot all kinds of problems that turned out to come down to an unreliable WiFi connection?
Use fixed network IDs (static IP addresses) - then ArtNet discovery isn't necessary. And while you're at it, use unicast rather than broadcast (even more important if you choose WiFi: absolutely use unicast. Access points have the naughty habit of filtering broadcast packets, and broadcast traffic in turn overwhelms your wireless network. So a big no-no).

ArtNet or E1.31 sACN - basically the same except for RDM. Don't forget that both rely on UDP packets (fire and forget / no handshaking involved).
Can the ESP8266 talk DMX? Then you could dedicate a universe in QLC+ as Artnet input.
What is gulp doing in the chain of commands?

Re: Accessing ArtNet device that doesn't support ArtPoll/ArtPollReply

Posted: Wed Dec 23, 2020 3:49 pm
by DrNeon
So I've got something working now in terms of QLC+ -> ESP8266 communication, though it's probably far from optimal. I have a generic RGB fixture that controls a simple RGB LED on three of the ESP8266's GPIO outputs. To respond to GGGss:

-Instead of muting/unmuting the audio track, presumably I can just remotely adjust the volume of each individual track from full to zero? If each track starts out at zero volume, and then I remotely turn up the volume upon seeing motion, that'd accomplish the same thing. Can this be done? I do see that each track has associated S/M buttons (solo/mute?), but can these not be configured remotely via the websockets interface? Not only should the audio tracks start playing (muted) upon first motion, but the other tracks should be playing ("muted" or "disabled") upon first motion as well, and then be enabled/unmuted remotely as needed. Otherwise I see no way to sync things.

-Keeping audio separated from lights/motors/servos would make it tricky to sync movement and lights to the music, I'd think. I guess in theory I could keep my Node Red/ecasound setup and have Node Red start/stop/mute/unmute things in QLC+ via websockets, but that seems overly complex if I can keep it all in QLC+ (audio + everything else).

-When you say use unicast over broadcast, do you mean to assign each universe a specific IP as opposed to going to .255? Does that mean each ESP8266 will be its own "universe"? Still getting up to speed on DMX nomenclature...I think I was just able to switch from .255 to the ESP8266's IP address (.30) for universe 1, and that seems to have improved responsiveness dramatically.

-I can probably get the ESP8266 to speak DMX back to QLC+, but I'm not sure whether that's preferable to the websockets interface. The only information moving from ESP8266->QLC+ would be when a particular ESP8266 saw motion (i.e. telling QLC+ to either start playback or unmute additional tracks).

-The reference to gulp is that it is supposedly needed to compile ESPixelStick. As I'm trying to avoid modifying other code and instead just write stuff mostly from scratch, I'm hoping not to need that. Right now I have a slightly modified version of the ArtNetNodeWifi debug example listening to QLC+-produced DMX and adjusting an RGB LED accordingly (and watching a PIR motion sensor and writing to serial when it sees something).

-Getting back to the S/M notation on each track, what is the purpose of these if not to Solo/Mute individual tracks in QLC+? Can these parameters not be set in realtime during playback?

Re: Accessing ArtNet device that doesn't support ArtPoll/ArtPollReply

Posted: Thu Dec 24, 2020 9:35 am
by GGGss
DrNeon wrote: Wed Dec 23, 2020 3:49 pm -Instead of muting/unmuting the audio track, presumably I can just remotely adjust the volume of each individual track from full to zero? If each track starts out at zero volume, and then I remotely turn up the volume upon seeing motion, that'd accomplish the same thing. Can this be done? I do see that each track has associated S/M buttons (solo/mute?), but can these not be configured remotely via the websockets interface? Not only should the audio tracks start playing (muted) upon first motion, but the other tracks should be playing ("muted" or "disabled") upon first motion as well, and then be enabled/unmuted remotely as needed. Otherwise I see no way to sync things.
Again ... QLC+ is not a DAW. If audio syncing is crucial, you will need to add a DAW to your chain to handle the audio. Sorry. What you describe is perfectly normal in DAW terms, but not for the 'show' part of QLC+. Please be warned that exact syncing is cumbersome in the Show function. It is not precise (far from it, if you are counting on timecode-style show behaviour).
DrNeon wrote: Wed Dec 23, 2020 3:49 pm -Keeping audio separated from lights/motors/servos would make it tricky to sync movement and lights to the music, I'd think. I guess in theory I could keep my Node Red/ecasound setup and have Node Red start/stop/mute/unmute things in QLC via websockets, but that seems overly complex if I can keep it all in QLC+ (audio + everything else).
You will have to decide who will be the 'director'... I presume QLC+? For demanding sync, don't use websockets; use DMX or ArtNet. Websockets add yet more latency, and you are already experiencing responsiveness problems.
DrNeon wrote: Wed Dec 23, 2020 3:49 pm -When you say use unicast over broadcast, do you mean to assign each universe a specific IP as opposed to going to .255? Does that mean each ESP8266 will be its own "universe"? Still getting up to speed on DMX nomenclature...I think I was just able to switch from .255 to the ESP8266's IP address (.30) for universe 1, and that seems to have improved responsiveness dramatically.
As expected... Once you have multiple universes in action you will thank me :)
DrNeon wrote: Wed Dec 23, 2020 3:49 pm -I can probably get the ESP8266 to speak DMX back to QLC+, but not sure whether that's preferable to the websockets interface. The only information moving from ESP8266->QLC+ would be when a particular ESP8266 saw motion (i.e. telling QLC+ to either start playback or unmute additional tracks)
See above re: websocket latency. Let them talk 'hardware language'; that is solid (and easier to debug afterwards).
DrNeon wrote: Wed Dec 23, 2020 3:49 pm -The reference to gulp is that it is supposedly needed to compile ESPixelStick. As I'm trying to avoid modifying other code and instead just write stuff mostly from scratch, I'm hoping not to need that. Right now I have a slightly modified version of the ArtNetNodeWifi debug example listening to QLC+-produced DMX and adjusting an RGB LED accordingly (and watching a PIR motion sensor and writing to serial when it sees something).
Why serial? Have your PIR talk DMX or ArtNet directly. (Raspi + GPIO + QLC+ does the trick - I've used it dozens of times, mostly when people arrive and production wants a dramatic entry scene. These are triggered by multiple PIRs inside the corridor; once people leave the corridor and enter the venue, they get spotlighted... off topic now.)
DrNeon wrote: Wed Dec 23, 2020 3:49 pm -Getting back to the S/M notation on each track, what is the purpose of these if not to Solo/Mute individual tracks in QLC+? Can these parameters not be set in realtime during playback?
Yes, you can use those during show-time; but you can't bind any controls to these - so without a mouse, you are stuck.

Last warning coming from the doc:

Code: Select all

Warning: Even though QLC+ allows you to, it is not possible to play two audio files simultaneously. Especially on Windows, you might experience unwanted crashes
So your audio needs a DAW. (The same goes for video... if video is demanded, a DVW or a video processor comes into play. That is how things work.)

Last spiritual light-bulb: at the movies, what is synced to what? Audio to video or video to audio?
The human eye is less sensitive to changes in speed -> Video is synced to audio.

Imagining what you want to achieve: a kinetic installation, triggered by visitors, with motion and sound... I'm in the same think-tank state for a possible project later in 2021.
My eggs are all in the basket of a true timecoded show (as the 'director'), fed by motion sensors or radar or even camera imaging that tell the director to do its thing. 8-channel sound (yes!), 3 concentric moving light rings (diameter ~8 m), and a user input desk for recording sounds. But hey - I'm still dreaming away...

Re: Accessing ArtNet device that doesn't support ArtPoll/ArtPollReply

Posted: Thu Dec 24, 2020 4:12 pm
by DrNeon
Returning to the original thread topic, sorry.

So I'm putting the motion-sensor activated unmuting on pause for now, while I figure out the websockets interface. I have a show bound to a button on the VC. I can see that show in the program and via the 127.0.0.1:9999 webpage. I can also access the QLC+ Web API test page and query and send data that way from a browser. I cannot, however, get my ESP8266 to connect to QLC+ over websockets. I've tried "<IP>:9999", "ws://<IP>:9999", "http://<IP>:9999", "<IP>:9999/qlcplusWS", "ws://<IP>:9999/qlcplusWS", nothing. The ESP8266 is able to connect to the echo.websocket.org server just fine. Also, using the https://www.websocket.org/echo.html page, my browser can connect to the echo.websocket.org server fine but not the QLC+ websocket on the same computer (using either 127.0.0.1 or the actual IP, with or without /qlcplusWS). So I'm a little stuck on making the ESP8266->QLC+ websocket connection. Using the ArduinoWebsockets library by gilmaimon, which is working with the echo server.

You mentioned that I should consider using ArtNet instead of websockets for controlling show playback, and I'm totally up for that. What I'm not sure of is how to get the ESP8266 to send an ArtNet command to start the show bound to a button on the VC (the ESP8266 will also need to know whether a show is currently running, i.e. an ArtNet equivalent of the getWidgetStatus and Basic Widget value set calls in the websocket interface, for the widget attached to the show). I don't think exact timing for this kind of response is critical (the old MQTT approach was fast enough to provide the desired functionality), but I'm happy to use ArtNet if it'll work.
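My best guess at the ESP8266 side of that ArtNet route (untested, and I may be misunderstanding how QLC+ maps ArtNet input to VC widgets) is to fire a small ArtDmx frame at the QLC+ machine when the PIR trips, and let a QLC+ input profile map that channel to the VC button. Roughly:

Code: Select all

// Untested guess at the "ArtNet back to QLC+" direction: when the PIR fires,
// send a tiny ArtDmx frame to the machine running QLC+, and let a QLC+ input
// profile map that channel to the VC button. Packet offsets are from my
// reading of the Art-Net spec; the QLC+ IP, universe, WiFi credentials, and
// PIR pin are placeholders.
#include <ESP8266WiFi.h>
#include <WiFiUdp.h>

WiFiUDP udp;
IPAddress qlcIp(192, 168, 1, 10);              // placeholder: QLC+ machine
const uint16_t ARTNET_PORT = 6454;
const int pirPin = 5;                          // placeholder GPIO for the PIR
uint8_t seqNum = 0;

void sendTrigger(uint8_t universe, uint8_t value)
{
  const uint16_t dmxLen = 2;                   // Art-Net wants an even length >= 2
  uint8_t pkt[18 + dmxLen] = {0};
  memcpy(pkt, "Art-Net", 8);                   // ID string incl. trailing null
  pkt[8]  = 0x00; pkt[9]  = 0x50;              // OpDmx = 0x5000, low byte first
  pkt[10] = 0;    pkt[11] = 14;                // protocol version 14
  pkt[12] = ++seqNum;                          // sequence counter
  pkt[14] = universe;                          // SubUni (Net byte at [15] left 0)
  pkt[16] = dmxLen >> 8; pkt[17] = dmxLen & 0xFF;
  pkt[18] = value;                             // DMX channel 1 carries the trigger
  udp.beginPacket(qlcIp, ARTNET_PORT);
  udp.write(pkt, sizeof(pkt));
  udp.endPacket();
}

void setup()
{
  pinMode(pirPin, INPUT);
  WiFi.begin("my-ssid", "my-password");        // placeholders
  while (WiFi.status() != WL_CONNECTED) delay(100);
  udp.begin(ARTNET_PORT);
}

void loop()
{
  if (digitalRead(pirPin) == HIGH) {
    sendTrigger(1, 255);                       // motion seen: channel 1 to full
    delay(2000);                               // crude debounce
  }
}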

As for why serial on the PIR... it's for debugging: right now I don't have a connection that allows the ESP8266 to start a show on QLC+, but I wanted to ensure that the PIR is still working while I work on the hardware configuration (i.e. adding the PCA9685). Of course the serial output for the PIR detecting motion is not the actual goal; writing to the serial monitor is just quite helpful for debugging code. I can certainly tell the ESP8266 to say something via ArtNet once the PIR is triggered, but I'm still not clear what it would say to trigger the show widget (after determining that the widget is not already in the on state). In my old configuration, triggering the PIR causes a message to be published via MQTT (it doesn't seem like QLC+ can be controlled that way, so the response to PIR triggering will need to be either a websocket or, possibly, an ArtNet message).

I realize QLC+ is not a DAW. Perhaps realtime, remotely controlled S/M is a "feature request". Or perhaps QLC+ is not the software package I'm looking for (in which case, it'd be great to hear about something else that would do the things I'm hoping to do). I realize that multitrack audio playback is not a supported feature. On my PC, however, I'm getting 7 audio tracks to play back nicely together, at least with v4.12.2...they sync well (except when stopped, when one or two stop a little more slowly than the rest, no big deal). I haven't tried any of this on my Pi Zero W yet, which is the device I hope will serve as the controller/director/thing in charge of the other things. QLC+ or some equivalent software is the "director", but it will need to accept remote control from the ESP8266 nodes. As mentioned above, this was working with my initial Node Red/ecasound/MQTT setup, but it just didn't allow for animatronic-like control. There's no need for video here - this is for a "simple" yard installation of decorations for various holidays - so nothing has to be synced to video.

Long post short...how can I get the ESP8266 to a) detect whether QLC+ is currently running a show (widget status = 127) and b) start a show if one isn't running (set widget status 127)? If you think it is possible and better to do this via ArtNet instead of websockets, I'd certainly like to learn how to do it that way. Currently, I have the ArtNet QLC+->ESP8266 line of communication working pretty well (servos and LEDs on the ESP8266 responding to sliders on the VC).
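For what it's worth, if the websocket route does eventually connect, the shape I have in mind is roughly the sketch below. The "QLC+API|..." strings are taken from the Web API test page; the widget ID, IP address, WiFi credentials, and the assumption that 127 marks the widget as running are placeholders/guesses on my part:

Code: Select all

// Sketch of the websocket route, in case it works out. The "QLC+API|..."
// strings are the ones from the Web API test page; the widget ID, IP,
// WiFi credentials, and the assumption that 127 means "pressed"/running
// are all placeholders or guesses on my part.
#include <ESP8266WiFi.h>
#include <ArduinoWebsockets.h>    // gilmaimon's library, as in my tests
using namespace websockets;

WebsocketsClient ws;
const char* qlcUrl = "ws://192.168.1.10:9999/qlcplusWS";  // placeholder IP
const String widgetId = "0";                              // placeholder VC button ID
bool showRunning = false;
unsigned long lastPoll = 0;

void setup()
{
  WiFi.begin("my-ssid", "my-password");        // placeholders
  while (WiFi.status() != WL_CONNECTED) delay(100);

  // Replies come back as pipe-delimited text, e.g. "QLC+API|getWidgetStatus|..."
  ws.onMessage([](WebsocketsMessage msg) {
    String reply = msg.data();
    if (reply.startsWith("QLC+API|getWidgetStatus")) {
      showRunning = reply.endsWith("127");     // guess: 127 = widget active
    }
  });
  ws.connect(qlcUrl);
}

void onMotion()                                // call this when the PIR fires
{
  if (!showRunning) {
    ws.send(widgetId + "|127");                // "basic widget value set": ID|VALUE
  }
}

void loop()
{
  ws.poll();                                   // service the socket
  if (millis() - lastPoll > 1000) {            // ask for the button's status each second
    ws.send(String("QLC+API|getWidgetStatus|") + widgetId);
    lastPoll = millis();
  }
  // (PIR reading omitted here; onMotion() would be called from that check.)
}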