We also want our art to respond to beats and percussion in the music, producing a kind of dance. The approach is inspired by Lucid Sonic Dreams and responds to percussive and harmonic effects with images generated from a pretrained StyleGAN2, since training one's own GAN can take a lot of time. The images move with a motion derived from the audio pulse, and the frame rate changes based on the beats.
The music is the Sherlock theme (copyright BBC). A little work in main.py, together with some manipulation with librosa, lets the system learn from practically any instrument and transfer any audio style to art. The music then becomes visible: contrast and brightness change, and frames are appended to one another based on the sound signal, which is decomposed into percussive and harmonic parts that each drive their own effects. The image weights come from pretrained StyleGAN and StyleGAN2 models loaded through a JSON config; training our own weights on VGG would require a lot of GPU time.
A mel spectrogram is generated using librosa and normalised, and a chromagram is generated as well. Pitches are assigned based on dominance, and attributes are assigned for vector generation. Pulse and motion update vectors are appended to their respective lists, and the different class vectors are smoothed using mean vectors.
It is a year-long collaboration between different groups that met each other on Discord!