I haven’t had a whole lot of time to seriously play around with it yet, but I figured out how to get some not-quite-real-time music visualization going on this brand-new BlinkStick Flex.
I used Sonic Visualiser with the Melodic Spectrogram layer and saved a couple of captures of different areas of the spectrum, mixed them together in GIMP and added some color, and then fed the whole thing into a simple Python script to keep the animation and sound synced up.
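The GIMP step could in principle be scripted too. Here’s a rough sketch with Pillow of the one part that matters for the script below: resampling a spectrogram capture so there’s exactly one row per LED. The function name `prepare_strip`, the filename, and the 32-LED count are my assumptions about this setup, not anything from the tools themselves.

```python
from PIL import Image

LED_COUNT = 32  # one image row per LED on the Flex in this setup

def prepare_strip(img, led_count=LED_COUNT):
    """Resample a spectrogram capture so each row drives one LED.

    Width is left alone: one column per animation frame.
    """
    w, h = img.size
    return img.convert("RGB").resize((w, led_count), Image.LANCZOS)

# e.g. prepare_strip(Image.open("spectrogram.png")).save("strip.png")
```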
The script (only tested on Linux):
[code]from PIL import Image
from blinkstick import blinkstick
from time import sleep, time

stick = blinkstick.find_first()

file = '/home/username/Documents/All on u.png'
img = Image.open(file)
(w, h) = img.size
im = img.load()

delay = 25  # 25ms per frame = 40fps

start = int(round(time() * 1000))
curtime = start
last_led_time = start
print(curtime, delay, w)

while curtime - start < delay * w:
    # Pick the column that matches the elapsed time, so the animation
    # stays synced to the audio even if an update runs long.
    x = int((curtime - start) / delay)
    data = []
    for y in range(h):
        for col in range(3):
            data.append(im[x, y][col])
    stick.set_led_data(0, data)
    curtime = int(round(time() * 1000))
    # Throttle to at most one LED update every 20ms.
    if curtime - last_led_time < 20:
        sleep((20 - (curtime - last_led_time)) / 1000)
    last_led_time = int(round(time() * 1000))[/code]
Something’s not quite right about the script, though: the colors are off. I’ll have to play with it later to see where it’s going wrong, but I think I have a good idea of where to look.
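One thing worth double-checking: if I remember right, blinkstick-python’s set_led_data wants the data flattened in GRB order ([g0, r0, b0, g1, r1, b1, …]), while PIL hands pixels back as RGB tuples, and that mismatch would produce exactly this kind of color weirdness. A sketch of the reordering (rgb_to_grb is my name, not part of the library):

```python
def rgb_to_grb(pixels):
    """Flatten a list of (r, g, b) tuples into [g0, r0, b0, g1, r1, b1, ...],
    the per-LED byte order set_led_data reportedly expects."""
    data = []
    for r, g, b in pixels:
        data.extend([g, r, b])
    return data

# e.g. rgb_to_grb([(255, 0, 0)]) == [0, 255, 0]
```

In the script above that would mean collecting each column as tuples and calling stick.set_led_data(0, rgb_to_grb(column)) instead of appending raw RGB channels.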
The final image that I fed the script (open it in a new tab):
It’s 8178x32, one horizontal pixel for each frame of the animation, and one vertical pixel for each LED.
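As a quick sanity check on those dimensions (animation_stats is just a helper I made up for this post): at 25ms per column, 8178 columns works out to a bit under three and a half minutes of animation, and the image height has to match the LED count:

```python
def animation_stats(width, height, delay_ms=25, led_count=32):
    """Return (duration in seconds, True if there is one image row per LED)."""
    return width * delay_ms / 1000, height == led_count

# animation_stats(8178, 32) -> (204.45, True), i.e. about 3.4 minutes
```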