TY - GEN
T1 - One-shot ontogenetic learning in biomedical datastreams
AU - Kalantari, John
AU - Mackey, Michael A.
N1 - Publisher Copyright:
© Springer International Publishing AG 2017.
PY - 2017
Y1 - 2017
N2 - Recent technological advances in the biological and physical sciences have enabled the generation of the large datasets necessary for applying deep neural networks. Despite the demonstrable success of these methods in a variety of tasks, including image classification, machine translation, and query-answering, their widespread adoption in biomedical research has been tempered by issues inherent to modeling complex biological systems that are not readily addressed by traditional gradient-based neural networks. We consider the problem of unsupervised, general-purpose learning in biological sequence data, wherein variable-order temporal dependencies, multi-dimensionality, and uncertainty in model structure and data are the norm. To model and learn these dependencies in an intuitive and holistic manner, we utilize the data abstraction of Simplicial Grammar within a Bayesian learning framework. We demonstrate that this framework offers the ability to quickly encode and integrate new information and to perform prediction tasks without extensive, iterative training.
KW - Artificial intelligence
KW - Bayesian nonparametrics
KW - Probabilistic generative models
KW - Simplicial complexes
KW - Systems biology
KW - Unsupervised learning
UR - http://www.scopus.com/inward/record.url?scp=85028449844&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85028449844&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-63703-7_14
DO - 10.1007/978-3-319-63703-7_14
M3 - Conference contribution
AN - SCOPUS:85028449844
SN - 9783319637020
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 143
EP - 153
BT - Artificial General Intelligence - 10th International Conference, AGI 2017, Proceedings
A2 - Everitt, Tom
A2 - Goertzel, Ben
A2 - Potapov, Alexey
PB - Springer Verlag
T2 - 10th International Conference on Artificial General Intelligence, AGI 2017
Y2 - 15 August 2017 through 18 August 2017
ER -