Putting composers back in the loop
Applying the latest deep learning techniques to music composition is appealing to AI researchers, but for composers this intrusion of machines into their domain of expertise can be perceived as a threat. The fear of being replaced is legitimate: many recent generative models for music produce endless streams of scores without any human intervention. I believe this behavior is undesirable and that AI algorithms should instead serve artists as assistants during the compositional process. By establishing a fruitful dialogue between the composer and the machine, the artist can focus on developing their musical ideas and delegate the technical work to the AI. Professional composers can use these tools to become more productive and to explore uncharted regions of musical creation, while amateur musicians can use them to express themselves intuitively. By putting composers back in the loop, we move from automatic music composition to AI-augmented composition and redefine the way people compose music.
Exponential families and mixture families are parametric probability models that can be studied geometrically as smooth statistical manifolds with respect to any statistical divergence, such as the Kullback–Leibler (KL) divergence or the Hellinger divergence. When a statistical manifold is equipped with the KL divergence, the induced structure is dually flat, and the KL divergence between distributions amounts to an equivalent Bregman divergence on their corresponding parameters. In practice, however, the corresponding Bregman generators of mixture/exponential families require computing definite integrals that are either too time-consuming (in the case of exponentially large discrete supports) or do not admit a closed-form formula (in the continuous case). In such cases, the dually flat construction remains purely theoretical and cannot be used by information-geometric algorithms. To bypass this problem, we perform stochastic Monte Carlo (MC) estimation of these integral-based mixture/exponential family Bregman generators. We show that, under natural assumptions, the MC generators are almost surely Bregman generators, and we define a series of dually flat information geometries, termed Monte Carlo Information Geometries (MCIGs), that approximate the intractable geometry increasingly finely. The advantage of MCIG is that it enables practical use of the Bregman algorithmic toolbox on a wide range of probability distribution families. We demonstrate our approach with a clustering task on a mixture family manifold, and we then show how to generate an MCIG for an arbitrary separable statistical divergence between distributions belonging to the same parametric family.
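The core idea can be sketched in a few lines of code. The following is a minimal illustration, not the paper's exact construction: it assumes a one-parameter mixture of two fixed Gaussians, an arbitrary broad Gaussian proposal, a sample size of 10,000, and a finite-difference gradient — all choices made here for illustration. The negentropy generator of the mixture family (an intractable integral in general) is replaced by an importance-sampled estimate over a sample drawn once and then reused for every parameter value; reusing the fixed sample makes the estimated generator a deterministic function that is convex (hence a Bregman generator almost surely), so the induced Bregman divergence is non-negative.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss(x, mu, sigma):
    """Density of a univariate Gaussian N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Two fixed components; the mixture family is
# m_theta(x) = (1 - theta) * p0(x) + theta * p1(x),  theta in (0, 1).
p0 = lambda x: gauss(x, -2.0, 1.0)
p1 = lambda x: gauss(x, 2.0, 1.0)

# Fixed MC sample from a broad proposal q, drawn ONCE and reused for
# every theta: this is what makes F_mc a deterministic convex generator.
q_sigma = 4.0
xs = rng.normal(0.0, q_sigma, size=10_000)
qx = gauss(xs, 0.0, q_sigma)

def F_mc(theta):
    """Importance-sampled negentropy generator of the mixture family:
    an MC estimate of the integral of m_theta(x) * log m_theta(x) dx."""
    m = (1 - theta) * p0(xs) + theta * p1(xs)
    return np.mean(m * np.log(m) / qx)

def bregman(theta1, theta2, h=1e-5):
    """B_F(theta1 : theta2) = F(t1) - F(t2) - (t1 - t2) * F'(t2),
    with F'(t2) approximated by a central finite difference."""
    grad = (F_mc(theta2 + h) - F_mc(theta2 - h)) / (2 * h)
    return F_mc(theta1) - F_mc(theta2) - (theta1 - theta2) * grad

d = bregman(0.3, 0.7)
print(d)  # non-negative (up to finite-difference error), since F_mc is convex
```

Because `m_theta` is affine in `theta` and `t * log(t)` is convex, each summand of `F_mc` is convex in `theta`, so the MC generator is convex for any realized sample — this is the mechanism that makes the estimated geometry a genuine dually flat (Bregman) geometry.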