Abstract
A system is a complex, dynamic whole composed of interrelated, interacting components. One of the most significant challenges in describing system dynamics is identifying how extensively the different components of the system interact and where distinct patterns of dynamical behavior originate. A key question is how information flows among the different components and how information is exchanged with the surrounding environment. The tools of information theory make it possible to quantify this information flow and to use it to discover causal dependencies within the system. In this chapter, we discuss a number of entropy-based measures of dependency, including mutual information, cumulants, conditional mutual information, transfer entropy, redundancy, and information flow. We provide examples from known dynamical systems and from space weather illustrating how these discriminating statistics can be applied. We show how these tools can address questions such as: how do the dynamics of the magnetospheric response to the solar wind change over the solar cycle, can we identify the physical processes responsible for substorm onset, how does information flow through the system, and how is that flow related to various physical processes?
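As a flavor of the entropy-based measures the chapter discusses, the sketch below estimates mutual information between two time series using a simple histogram (binning) estimator. This is a generic illustration, not the chapter's specific methodology; the bin count, series names, and coupling model are illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Estimate mutual information (in bits) between two series
    from a 2-D histogram of their joint distribution.
    Note: binning and bin count are illustrative choices; real
    analyses often use adaptive or kernel estimators."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()           # joint probability estimate
    px = pxy.sum(axis=1)            # marginal of x
    py = pxy.sum(axis=0)            # marginal of y
    nz = pxy > 0                    # avoid log(0)
    return float(np.sum(pxy[nz] *
                        np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz])))

rng = np.random.default_rng(0)
driver = rng.normal(size=5000)                   # hypothetical "solar wind" input
response = driver + 0.5 * rng.normal(size=5000)  # coupled output
noise = rng.normal(size=5000)                    # independent series

# Coupled pairs share substantial information; independent pairs nearly none.
print(mutual_information(driver, response))
print(mutual_information(driver, noise))
```

A coupled pair yields a mutual information well above zero, while an independent pair yields a value near zero (small positive bias is inherent to histogram estimators with finite samples).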
| Original language | American English |
|---|---|
| Title of host publication | Machine Learning Techniques for Space Weather |
| Editors | Enrico Camporeale, Simon Wing, Jay Johnson |
| Place of Publication | Cambridge, MA |
| Publisher | Elsevier |
| Pages | 46-70 |
| ISBN (Electronic) | 9780128117897 |
| ISBN (Print) | 9780128117897 |
| DOIs | |
| State | Published - 2018 |
Keywords
- Information theory
- Cumulant-based analysis
- Mutual information
- Conditional mutual information
- Transfer entropy
Disciplines
- Astrophysics and Astronomy
- Stars, Interstellar Medium and the Galaxy