A recurrent Markov state-space generative model for sequences

Anand Ramachandran, Steven S. Lumetta, Eric Klee, Deming Chen

Research output: Contribution to conference · Paper · Peer-reviewed

Abstract

While the Hidden Markov Model (HMM) is a versatile generative model of sequences capable of performing many exact inferences efficiently, it is not suited for capturing complex long-term structure in the data. Advanced state-space models based on Deep Neural Networks (DNNs) overcome this limitation but cannot perform exact inferences. In this article, we present a new generative model for sequences that combines both aspects: the ability to perform exact inferences and the ability to model long-term structure. It does so by augmenting the HMM with a deterministic, continuous state variable modeled through a Recurrent Neural Network. We empirically study the performance of the model on (i) synthetic data, comparing it to the HMM, (ii) a supervised learning task in bioinformatics, where it outperforms two DNN-based regressors, and (iii) the generative modeling of music, where it outperforms many prominent DNN-based generative models.
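The central idea, as the abstract describes it, is that the continuous state is a deterministic function of the past, so summing over the discrete HMM state with the usual forward recursion remains exact even though the transition and emission distributions vary with an RNN hidden state. The following is a minimal sketch of that construction; all dimensions, parameter names, and the simple tanh recurrence are illustrative assumptions, not the paper's actual parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)

K = 3   # number of discrete HMM states (assumption)
H = 8   # RNN hidden-state size (assumption)
V = 5   # observation alphabet size (assumption)

# Hypothetical parameters, randomly initialized for illustration.
W_hh = rng.normal(scale=0.1, size=(H, H))       # recurrent weights
W_xh = rng.normal(scale=0.1, size=(V, H))       # input-to-hidden weights
W_trans = rng.normal(scale=0.1, size=(H, K * K))  # h_t -> transition logits
W_emit = rng.normal(scale=0.1, size=(H, K * V))   # h_t -> emission logits


def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)


def log_likelihood(x):
    """Exact log p(x) via the forward algorithm.

    Because h_t is a deterministic function of past observations,
    the sum over the discrete state z_t is still an ordinary HMM
    forward recursion, just with time-varying A and B matrices.
    """
    h = np.zeros(H)                 # deterministic continuous state
    alpha = np.full(K, 1.0 / K)     # uniform initial discrete-state prior
    logp = 0.0
    for t, xt in enumerate(x):
        # Time-varying transition / emission matrices computed from h_t
        A = softmax((h @ W_trans).reshape(K, K), axis=1)
        B = softmax((h @ W_emit).reshape(K, V), axis=1)
        if t == 0:
            alpha = alpha * B[:, xt]
        else:
            alpha = (alpha @ A) * B[:, xt]
        c = alpha.sum()             # normalize for numerical stability
        logp += np.log(c)
        alpha /= c
        # Deterministic recurrent update from the observed symbol
        one_hot = np.zeros(V)
        one_hot[xt] = 1.0
        h = np.tanh(h @ W_hh + one_hot @ W_xh)
    return logp


x = rng.integers(0, V, size=20)
print(log_likelihood(x))
```

In a trained model the parameters would be learned by gradient ascent on this exact log-likelihood; here they are random, so the printed value is just a (finite, negative) log-probability demonstrating that the recursion runs.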

Original language: English (US)
State: Published - 2020
Event: 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019 - Naha, Japan
Duration: Apr 16 2019 - Apr 18 2019


ASJC Scopus subject areas

  • Artificial Intelligence
  • Statistics and Probability

