Wrong value mixing behavior
Posted: Mon Feb 24, 2014 4:40 pm
Hi there,
I think I've found a general misbehavior in QLC+, but I'm not quite sure if it's a bug or a feature.
First some theory about light color mixing: if you mix 1 part red and 1 part green, you get yellow. If you mix 1 part green and 1 part blue, you get cyan. That was easy.
Now we mix 1 yellow and 1 cyan. We now have 1 part red, 2 parts green and 1 part blue, which gives pale green. We can also achieve pale green by mixing a fresh green (1 red, 2 green) with 1 part blue.
And here comes QLC+, which does not work like this. If you create some scenes to represent those settings (I used 128 = 1 part) and mix them with a fader, you get a dark white (128,128,128), i.e. (1,1,1), instead of pale green (1,2,1), i.e. (128,255,128). But if you mix fresh green and dark blue, you get the right result.
The problem seems to be the way QLC+ merges multiple values for the same channel. Instead of summing them up, it just applies the highest value to the channel, which is usually not enough.
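To make the difference concrete, here is a small sketch (my own illustration, not QLC+ code) comparing a highest-takes-precedence merge with an additive merge for the yellow + cyan example above, using 128 = 1 part:

```python
# Sketch: highest-takes-precedence (max) merging vs. additive merging
# for two RGB contributions. 128 represents "1 part"; 255 is the DMX maximum.

def mix_htp(a, b):
    # Per channel, take the highest contributor (what QLC+ appears to do)
    return tuple(max(x, y) for x, y in zip(a, b))

def mix_additive(a, b):
    # Per channel, sum the contributors and clamp to 255
    return tuple(min(x + y, 255) for x, y in zip(a, b))

yellow = (128, 128, 0)   # 1 red + 1 green
cyan   = (0, 128, 128)   # 1 green + 1 blue

print(mix_htp(yellow, cyan))       # (128, 128, 128) -> dark white
print(mix_additive(yellow, cyan))  # (128, 255, 128) -> pale green
```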
Take for example the new amazing crossfaders. I really like them, but they have one tiny problem related to this. If you have a chaser with two identical scenes and crossfade (linked) from one to the other, the value won't stay constant. Instead, it drops to about 1/2 of the original value, which is wrong for a crossfade.
That is because of the maximum function (max). The max sees two values, 1/2 of scene1 and 1/2 of scene2, and just takes the higher one - which is 1/2 of the actual correct value.
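The dip can be demonstrated with a small sketch (again my own illustration, assuming the fader simply scales each scene's contribution before merging):

```python
# Sketch: a linked crossfade between two identical scenes holding the same
# channel value, merged with max() vs. merged with a clamped sum.

def crossfade_max(v1, v2, t):
    # t in [0, 1]: scene1 scaled by (1 - t), scene2 by t, merged with max
    return max(round(v1 * (1 - t)), round(v2 * t))

def crossfade_sum(v1, v2, t):
    # Same scaling, but contributions are summed and clamped to 255
    return min(round(v1 * (1 - t)) + round(v2 * t), 255)

value = 128  # both scenes hold the same channel value
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(t, crossfade_max(value, value, t), crossfade_sum(value, value, t))
# max dips to 64 at the midpoint (t = 0.5); the sum stays at 128 throughout
```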
So my idea would be to sum up the individual channel faders/inputs and scale the result back into the 0-255 range by dividing it by the number of faders/inputs. This would solve both the color mixing and the crossfading problem, I think.
Any ideas on whether this is intended to work this way, or whether my idea would work?
Edit:
I'm working with QLC+ 4.6.1 on Windows, but I already noticed this problem about a year ago (QLC+ 4.3.1).