Measuring Information Transfer
About
An information theoretic measure is derived that quantifies the statistical coherence between systems evolving in time. The standard time delayed mutual information fails to distinguish information that is actually exchanged from shared information due to common history and input signals. In our new approach, these influences are excluded by appropriate conditioning of transition probabilities. The resulting transfer entropy is able to distinguish driving and responding elements and to detect asymmetry in the coupling of subsystems.
Thomas Schreiber · 2000
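The abstract above describes transfer entropy as a conditioned transition-probability measure: information that Y provides about the next state of X beyond what X's own past already provides. A minimal sketch of that idea for binary time series, using histogram-estimated probabilities with history lengths k = l = 1 (the noisy-copy example data are an illustration, not from the paper):

```python
# Sketch: transfer entropy T_{Y->X} for discrete time series, with
# plug-in (histogram) probability estimates and history length k = l = 1:
#   T_{Y->X} = sum_n p(x_{n+1}, x_n, y_n) * log2[ p(x_{n+1}|x_n,y_n) / p(x_{n+1}|x_n) ]
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Estimate T_{Y->X} in bits from two equal-length discrete sequences."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))    # counts of (x_{n+1}, x_n, y_n)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))          # counts of (x_n, y_n)
    pairs_xx = Counter(zip(x[1:], x[:-1]))           # counts of (x_{n+1}, x_n)
    singles = Counter(x[:-1])                        # counts of x_n
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n                                  # p(x_{n+1}, x_n, y_n)
        p_cond_full = c / pairs_xy[(x0, y0)]             # p(x_{n+1} | x_n, y_n)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]   # p(x_{n+1} | x_n)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

# Illustrative coupled pair: x copies y with a one-step delay, so
# information flows Y -> X but not X -> Y.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 10000).tolist()
x = [0] + y[:-1]

te_yx = transfer_entropy(x, y)  # large: ~1 bit transferred per step
te_xy = transfer_entropy(y, x)  # near zero: no information flows X -> Y
print(te_yx, te_xy)
```

The asymmetry between the two directions is the point of the measure: time-delayed mutual information would be large in both directions here, while transfer entropy isolates the driving direction.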
Related benchmarks
| Task | Dataset | Result (AUC) | Rank |
|---|---|---|---|
| Relational inference | Diffusion (DIFF) WS-50 | 90.04 | 7 |
| Relational inference | Friedkin-Johnsen (FJ) WS-50 | 95.37 | 7 |
| Relational inference | CMN WS-50 | 86.7 | 7 |
| Relational inference | Diffusion (DIFF) WS-50 low-data | 67.76 | 7 |
| Relational inference | Kuramoto (KURA) BA-50 low-data | 68.86 | 7 |
| Relational inference | Kuramoto (KURA) WS-50 low-data | 91.09 | 7 |
| Relational inference | Friedkin-Johnsen (FJ) BA-50 low-data | 79.26 | 7 |
| Relational inference | Friedkin-Johnsen (FJ) ER-50 low-data | 75.65 | 7 |
| Relational inference | Friedkin-Johnsen (FJ) WS-50 low-data | 91.02 | 7 |
| Relational inference | Competitive Dynamics (CMN) WS-50 low-data | 81.11 | 7 |
*Showing 10 of 37 rows.*