From system identification to sequence models - a primer on Structured State-Space Models

Some useful resources

Welcome to this talk at Reglermöte 2025! Below you can find information about the talk, links to GitHub repositories, and a list of the publications mentioned during the presentation.

Information

  • Where: LTH, Lund, Track M:B.
  • When: June 12, 2025. 11:40 - 12:00.
  • Speaker: Fabio Bonassi
  • Extended Abstract: pdf

Repositories


References

  • A. Orvieto et al. (2023). “Resurrecting recurrent neural networks for long sequences.” International Conference on Machine Learning (ICML), PMLR. [link]
  • L. Ljung (1999). “System Identification: Theory for the User.” Prentice Hall. [info]
  • A. Gu and T. Dao (2024). “Mamba: Linear-Time Sequence Modeling with Selective State Spaces.” First Conference on Language Modeling (COLM). [link]
  • Alonso et al. (2024). “State space models as foundation models: A control theoretic overview.” arXiv preprint. [link]

Selected references

2024

  1. Structured state-space models are deep Wiener models
     Fabio Bonassi, Carl Andersson, Per Mattsson, and Thomas B. Schön
     In 20th IFAC Symposium on System Identification (SYSID), 2024