Recent technological advances in the biological and physical sciences have enabled the generation of datasets large enough to support deep neural networks. Despite the demonstrable success of these methods on a variety of tasks, including image classification, machine translation, and question answering, their widespread adoption in biomedical research has been tempered by issues inherent to modeling complex biological systems that traditional gradient-based neural networks do not readily address. We consider the problem of unsupervised, general-purpose learning in biological sequence data, where variable-order temporal dependencies, multi-dimensionality, and uncertainty in both model structure and data are the norm. To model and learn these dependencies in an intuitive and holistic manner, we employ the data abstraction of Simplicial Grammar within a Bayesian learning framework. We demonstrate that this framework can quickly encode and integrate new information and perform prediction tasks without extensive, iterative training.
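As a generic illustration of the Bayesian property invoked here (and not of the paper's Simplicial Grammar machinery, which is not specified in this abstract), a conjugate Beta-Bernoulli model shows how new observations can be integrated through closed-form posterior updates, with no gradient-based, iterative training loop:

```python
# Hedged sketch: a Beta-Bernoulli conjugate update. Each observation is
# folded into the posterior in closed form -- the "encode and integrate
# new information" step requires no iterative retraining.

def update_beta(alpha, beta, observation):
    """Fold one binary observation (0 or 1) into a Beta(alpha, beta) posterior."""
    return (alpha + observation, beta + (1 - observation))

def posterior_mean(alpha, beta):
    """Posterior predictive probability of observing a 1."""
    return alpha / (alpha + beta)

# Start from a uniform Beta(1, 1) prior and stream observations one at a time.
alpha, beta = 1.0, 1.0
for obs in [1, 1, 0, 1, 1, 0, 1, 1]:
    alpha, beta = update_beta(alpha, beta, obs)

print(posterior_mean(alpha, beta))  # 7/10 = 0.7
```

The prediction step is a single arithmetic evaluation of the current posterior, which is the sense in which Bayesian frameworks can answer queries as soon as the evidence is absorbed.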