Previous chapters focused on VAEs for data whose samples are treated as independent, such as individual images. However, many datasets possess inherent structure. For instance, text data consists of ordered sequences of words, time series data exhibits temporal dependencies, and social networks or molecules are best represented as graphs. This chapter extends the VAE framework to effectively model such sequential and structured information.
You will learn to:

- Build recurrent VAEs (RVAEs) that capture temporal dependencies in time series data.
- Incorporate attention mechanisms into VAEs for sequence modeling.
- Apply graph VAEs to structured data such as molecules and networks.
- Use VAEs for natural language processing tasks.
- Model video and other dynamic systems with temporal VAEs.
- Relate state-space models to the VAE framework.
- Implement VAEs for sequential data in practice.

By the end of this chapter, you'll be equipped to apply VAEs to a broader range of complex data formats, capturing the rich dependencies they contain.
6.1 Recurrent VAEs (RVAEs) for Time Series Modeling
6.2 VAEs with Attention Mechanisms for Sequences
6.3 Graph VAEs for Structured Data Representation
6.4 VAEs in Natural Language Processing
6.5 Temporal VAEs for Video and Dynamic Systems
6.6 Connections between State-Space Models and VAEs
6.7 Practice: Implementing VAEs for Sequential Data
© 2025 ApX Machine Learning