AI Music
"The world's first AI-composed music album is here, and it sounds amazing."
About The Album
I AM AI is the first album by a solo artist composed and produced with artificial intelligence. The songs explore the future of humans and machines, asking the questions:
Who are we?
What will we become?
…and are we ready?
The album comprises eight tracks. Its first single, “Break Free,” currently has more than 4 million streams and reached #48 on the Mediabase Indicator radio chart in August 2018. The song has garnered reviews and coverage from publications including Wired, Forbes, and Fast Company.
In 2017, YouTube's Creator Lab awarded Taryn a grant to create three immersive VR videos; two songs on the album were written for those videos.
I AM AI was released in September 2018 and can be streamed here.
About The Process
Machine learning can be used to compose and produce both composition and instrumentation. With rule-based AI, the artist directs parameters such as BPM, rhythm, instrumentation, and style. With generative AI, the artist inputs musical data and applies deep learning to output new compositions based on statistical probabilities and patterns. In either scenario, editorial arrangement plays a heavy part in the artist's process.
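To make the "statistical probabilities and patterns" idea concrete, here is a deliberately simplified sketch. It is not the deep-learning systems named below (Watson Beat, Amper, AIVA, Magenta); it is a toy Markov model that learns pitch-to-pitch transition probabilities from an input melody and samples a new melody from them. The primer melody, function names, and parameters are illustrative only.

```python
# Toy illustration of learning musical patterns from input data and
# generating new material from them (a stand-in for the far more capable
# deep-learning tools named in the text).
import random
from collections import defaultdict

def learn_transitions(melody):
    """Count how often each MIDI pitch is followed by each other pitch."""
    counts = defaultdict(lambda: defaultdict(int))
    for current, following in zip(melody, melody[1:]):
        counts[current][following] += 1
    return counts

def generate(counts, start, length, seed=None):
    """Sample a new pitch sequence from the learned transition counts."""
    rng = random.Random(seed)
    pitch, output = start, [start]
    for _ in range(length - 1):
        followers = counts.get(pitch)
        if not followers:            # dead end: restart from the opening pitch
            pitch = start
            output.append(pitch)
            continue
        pitches, weights = zip(*followers.items())
        pitch = rng.choices(pitches, weights=weights, k=1)[0]
        output.append(pitch)
    return output

if __name__ == "__main__":
    # Input "musical data": a short C-major phrase as MIDI pitch numbers.
    primer = [60, 62, 64, 65, 67, 65, 64, 62, 60, 64, 67, 72, 67, 64, 60]
    model = learn_transitions(primer)
    print(generate(model, start=60, length=16, seed=42))
```

The artist's editorial role corresponds to everything this sketch leaves out: choosing which generated material to keep, arranging it, and layering it into a finished track.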
Taryn used a combination of tools, including IBM’s Watson Beat, Amper, AIVA, and Google Magenta. In all cases, AI software composed the notation; when Amper was used, the AI also produced the instrumentation.
Taryn arranged the compositions and wrote vocal melodies and lyrics, while producer Ethan Carlson handled vocal production, mixing, and mastering.