Music and technology have always gone together. Without paper, there is no fugue or counterpoint. Acoustic instruments are the result of several centuries of innovation. Electronic synthesisers have enriched music with a wide range of novel sounds. Without digital technology, popular music in the twenty-first century would be unthinkable.
At Sony CSL, we imagine the technology for the music of the next decade. Our researchers work with musicians and content providers to push the boundaries of creativity and to understand the complexity of modern music production processes. By combining cutting-edge A.I. research with strong musical expertise, we develop the artificial intelligence that will extend the scope of the musician's capabilities. We pave the way for musical experiences yet to be imagined.
Exponential families and mixture families are parametric probability models that can be studied geometrically as smooth statistical manifolds with respect to any statistical divergence, such as the Kullback–Leibler (KL) divergence or the Hellinger divergence. When a statistical manifold is equipped with the KL divergence, the induced structure is dually flat, and the KL divergence between distributions amounts to an equivalent Bregman divergence on their corresponding parameters. In practice, the corresponding Bregman generators of mixture/exponential families require definite integral computations that can either be too time-consuming (for discrete supports of exponential size) or admit no closed-form formula (for continuous supports). In these cases, the dually flat construction remains theoretical and cannot be used by information-geometric algorithms. To bypass this problem, we consider stochastic Monte Carlo (MC) estimation of those integral-based mixture/exponential family Bregman generators. We show that, under natural assumptions, these MC generators are almost surely Bregman generators. We define a series of dually flat information geometries, termed Monte Carlo Information Geometries (MCIGs), that increasingly finely approximate the intractable geometry. The advantage of the MCIG is that it allows practical use of the Bregman algorithmic toolbox on a wide range of probability distribution families. We demonstrate our approach with a clustering task on a mixture family manifold. We then show how to generate an MCIG for an arbitrary separable statistical divergence between distributions belonging to the same parametric family.
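The key trick above is to estimate the integral-based Bregman generator by importance sampling while reusing the same fixed Monte Carlo sample for every parameter value, so that the estimate remains a convex function of the parameter. The sketch below illustrates this on a hypothetical one-parameter mixture family of two Gaussians (the components, proposal, and sample size are illustrative assumptions, not taken from the abstract): the generator is the Shannon negentropy of the mixture, and the induced MC Bregman divergence approximates the KL divergence between mixtures.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative mixture family (an assumption for this sketch):
# m_theta(x) = (1 - theta) * N(x; -2, 1) + theta * N(x; 2, 1), theta in (0, 1).
def gaussian_pdf(x, mu, sigma=1.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def mixture_pdf(x, theta):
    return (1 - theta) * gaussian_pdf(x, -2.0) + theta * gaussian_pdf(x, 2.0)

# Fixed proposal q and a fixed MC sample. The generator is the negentropy
# F(theta) = E_{m_theta}[log m_theta], estimated by importance sampling.
# Because the SAME sample is reused for every theta, F_hat is convex in theta
# (each term p * log(p) / q_j is convex in p, and p is affine in theta),
# hence a valid Monte Carlo Bregman generator.
m = 10_000
samples = rng.normal(0.0, 3.0, size=m)        # x_j ~ q = N(0, 3^2)
q_vals = gaussian_pdf(samples, 0.0, 3.0)

def F_hat(theta):
    p = mixture_pdf(samples, theta)
    return np.mean(p * np.log(p) / q_vals)

def bregman(theta1, theta2, eps=1e-5):
    """MC Bregman divergence B_F(theta1 : theta2) ~ KL(m_theta1 : m_theta2)."""
    grad = (F_hat(theta2 + eps) - F_hat(theta2 - eps)) / (2 * eps)
    return F_hat(theta1) - F_hat(theta2) - (theta1 - theta2) * grad
```

By convexity of `F_hat`, the resulting divergence is non-negative and vanishes only when the parameters coincide, which is what lets standard Bregman algorithms (e.g. Bregman k-means clustering) run directly on the MC generator.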
We introduce a model for music generation in which melodies are seen as a network of interacting notes. Starting from the principle of maximum entropy, we assign to this network a probability distribution, which is learned from an existing musical corpus. We use this model to generate novel musical sequences that mimic the style of the corpus. Our main result is that this model can reproduce high-order patterns despite having a polynomial sample complexity. This is in contrast with the more traditionally used Markov models, which have an exponential sample complexity.
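The abstract does not spell out the model, but a network of pairwise-interacting notes under the maximum-entropy principle amounts to a Potts-like energy-based distribution over note sequences. The sketch below is a minimal illustration under that assumption, with a hand-set coupling tensor `J` standing in for couplings that would, in the described approach, be learned so the model's pairwise statistics match a corpus; melodies are drawn by Metropolis sampling.

```python
import numpy as np

rng = np.random.default_rng(1)

K = 4     # size of the note alphabet (illustrative)
L = 16    # melody length
R = 3     # interaction range: notes interact up to R steps apart
# J[d, a, b]: coupling between note a and note b at distance d + 1.
# Random here for illustration; in practice it would be fit to a corpus.
J = rng.normal(0.0, 1.0, size=(R, K, K))

def energy(seq):
    """Negative log-probability (up to a constant) of a note sequence."""
    e = 0.0
    for d in range(R):
        for t in range(len(seq) - d - 1):
            e -= J[d, seq[t], seq[t + d + 1]]
    return e

def metropolis_sample(n_steps=5000):
    """Draw a melody from the maximum-entropy model by Metropolis moves."""
    seq = rng.integers(0, K, size=L)
    e = energy(seq)
    for _ in range(n_steps):
        t = rng.integers(L)
        proposal = seq.copy()
        proposal[t] = rng.integers(K)
        e_new = energy(proposal)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if e_new < e or rng.random() < np.exp(e - e_new):
            seq, e = proposal, e_new
    return seq
```

Because the energy only contains pairwise terms, the number of parameters grows polynomially with the range `R` and alphabet size `K`, whereas an order-`R` Markov model would need a transition table of size `K ** R`, which is the sample-complexity contrast the abstract highlights.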