Like many students, Taylor Colimore wasn’t sure what she wanted to do when she came to Virginia Commonwealth University. “I just knew that I really loved making art,” she said.
The Towson, Maryland, native considered studying science and medical illustration or crafts and material studies. She did not envision working on human-robot interaction based on choreography, algorithmic choreography, or live coding languages.
Colimore took a class called “Time Studio” during her freshman year. She wasn’t convinced that the time-based digital media art class would be for her.
But she discovered a fascination with software and the “endless possibilities of working with a non-destructive medium” like kinetic imaging, which emphasizes animation, video, sound and the art of performance.
“I was pleasantly surprised at how much I enjoyed working with digital media,” Colimore said. “In this class, I did my first animation, video, sound and performance projects and was excited about the new tools I had just discovered. The Kinetic Imaging department offered the same experimental environment that I loved so much in the art foundation, but with a focus on digital and temporal art.”
Colimore graduated in May from the VCU School of the Arts with a bachelor’s degree in kinetic imaging and experiences she never could have imagined as a freshman.
In 2020, Colimore performed live-coded choreography using Terpsicode, a live coding language created by Kate Sicchio, Ph.D., assistant professor of dance and media technologies, which uses dance vocabulary to create sequences performed by dancers in real time.
Here’s how it works: Sicchio uses code to call up an image on a projected screen large enough for the dancers to see. The dancers reproduce the position shown in the image. Another image is added to the mix, and the dancers respond to the new one.
“As new frames are added, the sequence of frames changes, changing the sequences of movements the dancers perform in real time,” Colimore said. “I am fortunate to be part of Terpsicode 2.0, where images of myself and my movement can be called upon to be interpreted by any dancer in a Terpsicode performance.”
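The mechanic Colimore describes, a live coder appending pose frames to a running sequence that dancers reinterpret as it grows, can be sketched in a few lines. This is only an illustrative toy (the class and pose names are invented, not Terpsicode’s actual vocabulary or implementation):

```python
# Toy sketch of the Terpsicode idea: a live coder appends named pose
# frames to a sequence, and the projected sequence the dancers read
# updates in real time. All names here are hypothetical.

class PoseSequence:
    def __init__(self):
        self.frames = []  # ordered pose images shown to the dancers

    def add_frame(self, pose_name):
        """Live-coding call: append a pose, changing the whole sequence."""
        self.frames.append(pose_name)

    def project(self):
        """The sequence as the dancers would read it off the screen."""
        return " -> ".join(self.frames)

seq = PoseSequence()
seq.add_frame("arm-raise")
seq.add_frame("turn")
print(seq.project())  # arm-raise -> turn
```

Each `add_frame` call changes the sequence the performers see, which is why, as Colimore notes, the dancers’ movements change in real time as frames are added.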
The following year, as part of the 2021 International Live Coding Conference, Colimore took part in a live performance of “Studio//Stage” with Sicchio and VCU dance student Tamara Denson. The web-based, live-coded screen dance performance uses a mini live coding language to create choreographed compositions in real time. The team used live code video clips of Denson and Colimore dancing in the same live coding environment. The piece explores loops, time and position.
Colimore also worked with Sicchio and Patrick Martin, Ph.D., an assistant professor in VCU’s Department of Electrical and Computer Engineering, on the first iteration of “Amelia and the Machine,” a gesture-focused duet between dancer Amelia Virtue and a robot named Isadora that debuted in February. Colimore recorded the dancer’s motion capture data to teach the robot choreography. She also soldered the robot’s LED “suit,” which lit up to match the dancer’s teal unitard.
“During the actual performance, I was part of the play execution team as a cameraman for the live stream that was projected behind the two performers,” Colimore said. “All in all, this piece involved a lot of people and a lot of moving parts, but it was so exciting to see this project move from conceptualization to performance, and I’m so happy to be a part of the beginning of this collaboration.”
After graduation, Colimore plans to stay in Richmond to pursue a full-time creative career as a video editor or in a post-production position. Her work can be seen at https://www.instagram.com/taylorcolimore/?hl=en.