Project Timeline

In brief, Portraits of Change can be split into three phases. Phase 1 consists of the “alpha” version of our software and hardware: a proof of concept before we scale the Portraits prototype to more instruments and devices and create real performance opportunities. In Phase 2 (which we are dubbing the “beta”) we will have the base technology to perform live mixed reality music performances, with at least one head-mounted display and a way for the audience to see the mixed performance space through projectors or mobile devices. Phase 3 represents the endgame of this project, using future technologies such as augmented reality HMDs and other technologies for which we still need to raise research and development funds. Learn more about the different phases below, and check out the Dev Logs for test videos of our work in action!

Phase 1 (Jan '19-Mar '20)

Portraits of Change Alpha (0.1.0)-(0.2.6)

Since January of 2019, Portraits of Change has been built from the ground up. What started as a recital that focused on past innovations in jazz led to entirely new ways of seeing music. In March of 2019, Portraits of Change became a fiscally sponsored project through arts non-profit Fractured Atlas. This gave us the tools to thrive. The Portraits of Change Alpha represents the major bootstrapping and rapid prototyping that took place before public performances.

This alpha prototype was built using the Oculus CV1, a Leap Motion sensor, various music equipment for MIDI conversion of live instruments, LIV software, and our own Unity software. This was the cheapest and one of the few options for mixed reality at the time this project started. VR has grown commercially since then, and we can leverage new technology and the growing VR audience to get to Phase 2.
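For the curious, here is a rough sketch of the kind of note-to-visual mapping our Unity software handles. This is illustrative only, not our actual code: NoteOn() is a hypothetical hook, and the real project wires MIDI input through its own equipment and input layer.

```csharp
using UnityEngine;

// Illustrative sketch only: how a MIDI note-on event from a converted live
// instrument might drive a visual effect in Unity. NoteOn() is a hypothetical
// hook that a MIDI input layer would call; it is not the project's real API.
public class NoteVisualizer : MonoBehaviour
{
    public ParticleSystem burstPrefab; // effect spawned once per note
    public float spread = 2f;          // horizontal span across the pitch range

    // Called with a MIDI note number (0-127) and a normalized velocity (0-1).
    public void NoteOn(int note, float velocity)
    {
        // Map pitch to horizontal position and velocity to effect scale.
        float x = Mathf.Lerp(-spread, spread, note / 127f);
        ParticleSystem burst = Instantiate(burstPrefab, new Vector3(x, 1f, 0f), Quaternion.identity);
        burst.transform.localScale = Vector3.one * Mathf.Lerp(0.5f, 2f, velocity);
        burst.Play();
    }
}
```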

Phase 2 (Current Phase)

Portraits of Change Beta (0.5.0)-(x.x.x)

In January of 2020, we began a push to upgrade our tracking system from the now-outdated Oculus sensor setup to SteamVR. With the release of the Valve Index and the 2.0 Base Stations, it became clear that we needed to move to this system. We applied for the Peak Arts Prize 2020 in the hope of starting beta performances with this new tracking system and continuing development of the Unity software. In March, we acquired the Index system along with a Vive tracker and have since been refining the software. We have experimented with pass-through AR during this phase, and will continue to prototype and test different types of head-mounted displays as they are developed and become available to us. Regardless of the device our audience wears, we will always use SteamVR tracking, so the fundamental technology we build on is future-proof to a degree.

Upgrading to SteamVR tracking with Vive trackers allows us to support six individually tracked instruments. Our current AR prototype uses an Index headset with a ZED Mini mounted to the front for stereoscopic AR pass-through. In this beta phase we currently support one participant in the HMD. Phase 2 performances will project the beta AR headset's view onto a screen so the whole audience can see what the participant sees. It is also possible to add another ZED camera tracked with a Vive tracker if we want to get fancy.
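To give a sense of how the tracking side fits together, here is a minimal sketch of following multiple Vive trackers (one per instrument) using Unity's built-in XR input API. The class and field names are hypothetical stand-ins, not our production code.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: snap up to six instrument visuals to Vive tracker
// poses each frame, using Unity's built-in XR input API.
public class InstrumentTrackerFollower : MonoBehaviour
{
    // One transform per tracked instrument, assigned in the Inspector.
    public Transform[] instrumentVisuals = new Transform[6];

    private readonly List<InputDevice> trackers = new List<InputDevice>();

    void Update()
    {
        // Enumerate generic hardware trackers; Vive trackers report here.
        InputDevices.GetDevicesAtXRNode(XRNode.HardwareTracker, trackers);

        int count = Mathf.Min(trackers.Count, instrumentVisuals.Length);
        for (int i = 0; i < count; i++)
        {
            if (trackers[i].TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 pos) &&
                trackers[i].TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rot))
            {
                // Pin the instrument's visual anchor to the tracker pose.
                instrumentVisuals[i].SetPositionAndRotation(pos, rot);
            }
        }
    }
}
```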

With the first public performance (date TBA) approaching soon, we are putting together the final pieces to present our hard work from these last two years. I made the decision to end the fiscal sponsorship between this project and Fractured Atlas in August 2021, mainly because we were no longer actively seeking donations or applying for grants as we finalized the hardware and software for our first performance. We will continue to pursue funding opportunities, arts and STEM grants, investment, and community outreach and grassroots support in the future, but for now we are taking a step back and digging into the music and software. We learned a lot from our fiscal sponsorship and from the various grants we applied for with Fractured Atlas's support! We especially appreciate the fundraising platform they gave us, which helped with the initial trajectory of building the prototype.

Phase 3

Portraits of Change 1.0

The great unknown. This is the frontier of audio-visual performance: experiencing an improvised concert where you can see every note from every player in 3D. We hope to move to a standalone AR device like the Lynx R-1, HoloLens, or Nreal platforms to support multiple participants. Maybe we could even hire a Unity developer and/or a VFX artist to flesh the whole thing out. I would really like to dig into collaboration in this phase and continue to evolve what Portraits can do for technology, for musicians and music lovers, and for society as a whole. This would be the cutting edge of live performance and visual effects: portraits of sound and light painted through music, right in front of you in full-on augmented reality. You arrive at a club or a stage, put on some smart glasses, and see galaxies form in front of you. You look at your hand and browse information on the music you hear. We can use the same glasses to read sheet music and guide our effects, which will be totally unique each and every time you see them. Maybe in the future, when we all have smart glasses, you could bring your own to a show and just download an app ;)