As a founder of Sonation, a Harvard Innovation Lab startup, I brought together a team to create apps that let musicians live their dream: becoming the star of a musical experience built around their own real-time performance, accompanied by some of the greatest music ever created.
Cadenza, our first app, listened to a musician and, using predictive algorithms, accompanied them with a real orchestra, band, or piano recording synced to their expressive performance as it unfolded. Then, using machine learning, Cadenza learned each user’s style, improving its predictions of their performance nuances over time.
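To give a flavor of what “predictive” means here, the sketch below shows one simple way an app could anticipate a player’s timing: estimate the current tempo from recent note onsets and predict when the next beat should fall, so the accompaniment can be nudged to land there. This is only an illustrative assumption, not Cadenza’s actual algorithm, and the class and parameter names are hypothetical.

```python
# Hypothetical sketch of predictive tempo following (NOT Cadenza's real algorithm).
# It estimates the player's beat period from recent note-onset times and predicts
# when the next beat should arrive, so an accompaniment could be scheduled to match.

class TempoFollower:
    def __init__(self, initial_beat_period: float = 0.5, smoothing: float = 0.3):
        """initial_beat_period: seconds per beat (0.5 s = 120 BPM).
        smoothing: weight given to each newly observed inter-onset interval."""
        self.beat_period = initial_beat_period
        self.smoothing = smoothing
        self.last_onset = None

    def observe_onset(self, onset_time: float) -> None:
        """Update the tempo estimate from a detected note onset (in seconds)."""
        if self.last_onset is not None:
            interval = onset_time - self.last_onset
            # Exponentially weighted average: recent playing counts more,
            # so the estimate follows rubato without overreacting to one note.
            self.beat_period += self.smoothing * (interval - self.beat_period)
        self.last_onset = onset_time

    def predict_next_onset(self) -> float:
        """Predict when the next beat should fall, for scheduling accompaniment."""
        return self.last_onset + self.beat_period


# Example: a player who is gradually slowing down.
follower = TempoFollower()
for t in [0.0, 0.50, 1.02, 1.57, 2.15]:
    follower.observe_onset(t)
print(f"Predicted next onset: {follower.predict_next_onset():.2f} s")
```

In practice, learning a user’s style (as described above) would mean fitting such prediction parameters to each player’s tendencies over time rather than using fixed constants.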
This montage shows different musicians each playing the same piece of music. Each was originally recorded without accompaniment; here, Cadenza listens to them, accompanies them, and even detects and recovers from their inevitable mistakes.
When musicians read music, they can’t interact with apps. Their hands are already full, yet they still have to turn pages. For centuries this need was partly addressed by clever printing and layout tricks. More ambitious solutions have been tried, including pedal-powered contraptions and even employing a real person just to turn pages for another musician (a practice still in use today).
Show the right music at the right time, without requiring any interaction (foot pedals, taps, gestures) and without forcing musicians to bend to pre-recorded page turns. Free the musician to focus on making music. Eliminate page-turning anxiety.
Most solutions simply replicate those of the pre-digital era. Instead of reaching out to turn a page, they have you reach out to tap a screen. That’s not an improvement. Slightly better: pressing a Bluetooth pedal to turn the page onscreen. Still, these solutions are not ideal.