Hackathon project that uses a webcam to read Rubik’s cube faces and converts the color patterns into music. The more solved a cube face is, the more harmonious it sounds.

Built in under 24 hours at the Drexel 2017 Music Hackathon with Emmanuel Espino and Jason Zogheb. Uses Python for computer vision (detecting cube colors) and audio synthesis (generating sounds).
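Below is a minimal sketch of the idea, not the project's actual code. It assumes OpenCV and NumPy are available, samples a 3x3 grid of pixels around the frame center as facelet colors, and mixes one sine tone per distinct color, so a more uniform (more solved) face yields a simpler, more consonant sound. The hue thresholds, note frequencies, and output filename are all illustrative assumptions.

```python
# Illustrative sketch only -- not the original hackathon code.
# Assumes OpenCV (cv2) and NumPy; color thresholds and pitches are guesses.
import wave
import cv2
import numpy as np

# Approximate hue ranges (OpenCV hue runs 0-179) for the colored facelets.
HUE_TO_COLOR = [
    (0, 10, "red"), (10, 25, "orange"), (25, 35, "yellow"),
    (35, 85, "green"), (85, 130, "blue"), (170, 180, "red"),
]

# Hypothetical mapping from facelet color to a pitch in Hz.
COLOR_TO_FREQ = {
    "white": 261.63, "yellow": 293.66, "green": 329.63,
    "blue": 392.00, "red": 440.00, "orange": 523.25,
}

def classify(hsv_pixel):
    """Very rough facelet color guess from a single HSV pixel."""
    h, s, v = int(hsv_pixel[0]), int(hsv_pixel[1]), int(hsv_pixel[2])
    if s < 60 and v > 150:  # low saturation + bright -> treat as white
        return "white"
    for lo, hi, name in HUE_TO_COLOR:
        if lo <= h < hi:
            return name
    return "white"

def read_face(frame):
    """Sample a 3x3 grid of points around the frame center as facelet colors."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    h, w = hsv.shape[:2]
    cx, cy, step = w // 2, h // 2, min(h, w) // 6
    return [classify(hsv[cy + dy * step, cx + dx * step])
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)]

def face_to_audio(colors, seconds=2.0, rate=44100):
    """Mix one sine tone per distinct color: fewer distinct colors means
    fewer tones, so a solved (single-color) face sounds plain and consonant."""
    t = np.linspace(0, seconds, int(rate * seconds), endpoint=False)
    tones = [np.sin(2 * np.pi * COLOR_TO_FREQ[c] * t) for c in set(colors)]
    mix = sum(tones) / len(tones)
    return (mix * 32767 * 0.5).astype(np.int16), rate

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)      # grab one webcam frame
    ok, frame = cap.read()
    cap.release()
    if ok:
        samples, rate = face_to_audio(read_face(frame))
        with wave.open("face_chord.wav", "wb") as f:
            f.setnchannels(1)
            f.setsampwidth(2)
            f.setframerate(rate)
            f.writeframes(samples.tobytes())
```

The real pipeline likely differs (e.g., how facelets are located and how harmony is generated); this sketch only shows one way the color-to-sound mapping could work.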

This project is documented in: