
Japanese Computer Programmer and Producer Keijiro Takahashi's Journey from 8-Bit to MML to The Grid

Unity x The Grid: The multimedia artist sheds light on how he combines programming, 3-D graphics and generative music to create stunning audio-visual experiences.

How can backgrounds in music production and computer programming intersect organically?

"The Grid devices have all musical features," says Keijiro Takahashi, referring to Bitwig's modular sound design environment that brought to life his recent generative album, G.M.C.1. The Japanese producer's creative history dates back to the times of 8-bit computers, when no dedicated music production software was available on the market. Takahashi was inspired by the sample-based programs he used back then to create his own version of the software in order to play back sounds. Later, he encountered the world of hardware and DAWs and shared a series of performance videos as Denkitribe on YouTube while working for a major video game company.

Recent years have seen Takahashi merging generative music with live visual performance built on Unity, the game engine for which he currently serves as a developer advocate. As videos from his recent audio-visual live performances show, he has a knack for using programming to connect the dots between an advanced game development environment and his theory-grounded musicality.

We sat down with the media artist and discussed the creative process behind G.M.C.1, an album made with a custom plug-in that integrates generative music with real-time visuals, as well as his insights into the generative approach and beyond. 

When did you start making music? Did you already produce tracks before Denkitribe?

I started making music back in the days of 8-bit computers. The sample programs that came with my computer played music and sound effects. They grabbed my interest, and I started learning about them. In the 8-bit era, there was no proper music production software, so it was more like "writing programs that play sounds" than "creating music." Around that time in Japan, MML (Music Macro Language) was the main means of programming music on a computer. I remember learning the basics of how to read and write sheet music through MML. I also learned the basic principles behind FM synthesis, as FM chips were commonly featured in Japanese computers. I didn't really understand music theory then, so I'd just program what I saw on a score.
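In classic MML, a melody is written as a plain text string of note letters, octaves, and lengths, which a small interpreter turns into pitches and durations. The sketch below is a hypothetical illustration of that idea in Python, not the software Takahashi used; the command subset and defaults are assumptions.

```python
# Hypothetical sketch of the MML idea: a melody is a text string, and a tiny
# interpreter converts each note letter into a frequency and a duration.
# This is an illustration only, not the actual 8-bit-era software.

A4_FREQ = 440.0
SEMITONES = {"c": 0, "d": 2, "e": 4, "f": 5, "g": 7, "a": 9, "b": 11}

def play_mml(score, octave=4, length=8, tempo=120):
    """Interpret a tiny MML-like string such as "cdefgab"."""
    quarter = 60.0 / tempo                    # seconds per quarter note
    duration = quarter * 4 / length           # e.g. length=8 -> eighth notes
    for ch in score:
        if ch in SEMITONES:
            midi = 12 * (octave + 1) + SEMITONES[ch]   # c4 -> MIDI note 60
            freq = A4_FREQ * 2 ** ((midi - 69) / 12)   # equal temperament
            print(f"note {ch}{octave}: {freq:.1f} Hz for {duration:.3f} s")

play_mml("cdefgab")
```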

After moving on to more modern computers, I tried my hand at various approaches in line with the changing times. I enjoyed sampling with what's often called "tracker software," like Scream Tracker and Impulse Tracker, and I was amazed by the potential of virtual analog synths when ReBirth RB-338 arrived. I liked the simplicity and high modularity of the music-making software of the 2000s. About ten years later, though, Bitwig came along with an even higher degree of modularity, and it impressed me so much that I tried incorporating it into my creative process. More recently, the introduction of the Grid devices made me decide to fully convert to Bitwig as my main approach to music-making.

In other DAWs and synthesizers, modulators like LFOs and envelopes are built in as device-specific features and can only be used and assigned within a fixed range. Bitwig, on the other hand, lets you freely combine modulators and expand them as much as you like. I personally don't like monolithic synths whose UIs sprawl across numerous pages, so this kind of simplicity appeals to me.

You used The Grid to produce the tracks for the G.M.C.1 album. Why did you decide to produce an album with a generative approach?

I'd been working on a generative approach to music production for a while, but I had the impression that the software used in that field leaned toward experimental music and lacked the right tools for making "traditional music." The Grid devices, in contrast, have all the musical features and are very well suited to it. Then I thought, "What kind of music can I actually create with them?" The result was the series of tracks on this album.

Around the same time, there was a plan to produce video tutorials for Unity, so I set a secondary goal of making music that could be used in the background of those videos. For this reason, none of the tracks on this album make a strong statement; they're kind of like ever-evolving elevator music. Unassuming background tracks don't need a compressed dynamic range, so I left the mix rather simple, with EQs and limiters applied only lightly. I didn't use any external gear, software or plug-ins, as I wanted to keep the tracks adjustable at any time with nothing but Bitwig Studio.

No edits were made to the album's arrangement; the tracks are simply spliced together with crossfades. In my previous experimental projects, I deliberately used notes outside the scale or mixed random rhythms at different BPMs, but I gave that a miss this time. Instead, I routed the Grid devices so they were generative within the standard bounds of music theory; the Pitch and Data modules in The Grid make this easy to achieve.
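His exact patches are in the published project files, but the core trick can be sketched outside The Grid too. Below is a minimal Python illustration of the same principle, assuming a C major scale: an arbitrary random pitch is quantized to the nearest scale degree, so the generated output never leaves the key.

```python
import random

# A rough sketch (in Python, not The Grid) of the idea: random pitch values
# are quantized to a scale, so the output stays inside standard music theory
# no matter how random the source is. The actual Grid patching differs; this
# only illustrates the principle.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # scale degrees as semitone offsets

def quantize_to_scale(semitone, scale=C_MAJOR):
    """Snap an arbitrary semitone value to the nearest scale degree."""
    octave, degree = divmod(semitone, 12)
    nearest = min(scale, key=lambda d: abs(d - degree))
    return octave * 12 + nearest

random.seed(1)
# Generate a random melody that is guaranteed to stay in C major
# (60 is middle C in MIDI numbering).
melody = [60 + quantize_to_scale(random.randint(0, 24)) for _ in range(8)]
print(melody)
```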

When you created the instruments that generate audio signals and notes, did you patch modules together toward an image of the finished work that you had in mind from the beginning? Was there anything you paid particular attention to in the process?

This time, my main approach was to experiment with the functionality of the Grid devices as ideas flowed. I had no specific image in mind before getting started with production; it was more like feeling my way through trial and error. Once I've tested everything The Grid is capable of, I may shift to a more image-based approach at some point.

Besides music, you also create visuals and do computer programming. Is there one project where you think you drew on your skills across these fields synergistically?

If I had to choose, it would be this album. It taps into both musical and programming knowledge. As a programming system, The Grid is limited, which means a certain amount of background knowledge is still required to put together anything complex in it. For instance, you can only fully leverage what boolean operations have to offer in music-making when you have experience in both composition and computer programming.
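As a rough illustration of what boolean operations can mean musically (a hypothetical example, not a patch from the album), two rhythmic gate patterns can be combined with AND and XOR to derive new rhythms, much as logic modules in The Grid combine gate signals:

```python
# Two 16-step gate patterns (1 = hit, 0 = rest) combined with boolean logic.
kick   = [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]  # four-on-the-floor
shaker = [1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0]  # busier pattern

and_rhythm = [a & b for a, b in zip(kick, shaker)]  # hits where both fire
xor_rhythm = [a ^ b for a, b in zip(kick, shaker)]  # hits where exactly one fires

print("AND:", and_rhythm)
print("XOR:", xor_rhythm)
```

The AND result accents only the steps where both source rhythms coincide, while XOR yields a syncopated pattern from the steps where exactly one of them fires.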

Also, I'm experimenting with integration between Bitwig and external software via VZO, an original plug-in I developed. I used it to connect the DAW and the visuals for a recent live performance. In future live sets, I want to explore The Grid in various ways to integrate generative music and real-time graphics. If you load the VZO plug-in on an instrument track in Bitwig, it sends out OSC signals that VZO VFX in Unity receives and uses as visual controls. Converting Bitwig's output this way opens up a much wider range of creativity and expression.
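For a sense of what the receiving side of such an OSC bridge looks like, here is a minimal Python sketch using the python-osc library. The address `/vzo/note`, its arguments, and the port are assumptions for illustration; VZO's actual message format may differ.

```python
# Minimal OSC receiver sketch. The "/vzo/note" address, its (pitch, velocity)
# arguments, and port 9000 are placeholders, not VZO's documented interface.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_note(address, pitch, velocity):
    # Map the incoming note event to a visual parameter, e.g. effect intensity.
    intensity = velocity / 127.0
    print(f"{address}: pitch={pitch}, intensity={intensity:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/vzo/note", on_note)

server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
server.serve_forever()
```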

Looking at generative music and real-time graphics in the context of live performance, do you see the interaction as the performer responding to generative elements that in turn respond to the performer? What relationship do you envision between human and machine?

I guess there are many ways of thinking about how to bring a generative performance together. The approach I personally take most often is to prepare a number of mechanisms that keep generating music and imagery, and then manually manipulate the mix and FX to create developments in the performance.

That combination with Unity is exactly what I did at the recent live performance. I tried a simple integration that allows some FX to respond to several rhythmic elements. For this performance, I prepared about 30 tracks of Grid devices alongside beat-repeat effects built in FX Grid. I controlled them in real time to trigger the tune's developments, while switching between around 10 instances of VZO VFX and applying them to volumetric visuals shot in Shibuya with the iPhone's LiDAR sensor to create the visual developments.

For the backgrounds of the video tutorials, dramatic changes weren't required, so I chose to play the generated tracks as they were. But live performances are more entertaining with extra movement and development like this, so I manipulated them manually. From a directorial perspective, in deciding how generated material gets used, I feel there is still room for human involvement.

Bitwig project files for Keijiro Takahashi's album G.M.C.1 are available as a free download on GitHub, letting you discover for yourself how his Grid devices are patched up to make these captivating generative music pieces.
