The Neural Hawkes Process: A Neurally Self-Modulating Multivariate Point Process
About
Many events occur in the world. Some event types are stochastically excited or inhibited (in the sense of having their probabilities raised or lowered) by patterns in the sequence of previous events. Discovering such patterns can help us predict which type of event will happen next and when. We model streams of discrete events in continuous time by constructing a neurally self-modulating multivariate point process in which the intensities of multiple event types evolve according to a novel continuous-time LSTM. This generative model allows past events to influence the future in complex and realistic ways, by conditioning future event intensities on the hidden state of a recurrent neural network that has consumed the stream of past events. The model has desirable qualitative properties, and it achieves competitive likelihood and predictive accuracy on real and synthetic datasets, including under missing-data conditions.
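The key mechanism is that between events, the continuous-time LSTM's cell state decays exponentially from its value just after the last event toward a target value, and each event type's intensity is a positive (softplus) read-out of the resulting hidden state. The sketch below illustrates this with made-up parameter values (the variable names `c_start`, `c_target`, `decay`, `o_gate`, and `W` are illustrative, not the paper's notation); it is a minimal numpy toy, not the authors' implementation.

```python
import numpy as np

def softplus(x, s=1.0):
    # Scaled softplus f(x) = s * log(1 + exp(x / s)): always positive,
    # so it yields a valid intensity; s controls the curvature.
    return s * np.log1p(np.exp(x / s))

def decayed_state(c_start, c_target, decay_rate, o_gate, dt):
    # Between events, the cell state drifts exponentially from its value
    # just after the last event (c_start) toward a target (c_target)
    # at elementwise rate decay_rate; the hidden state is read out
    # through the output gate at any continuous time offset dt.
    c_t = c_target + (c_start - c_target) * np.exp(-decay_rate * dt)
    h_t = o_gate * np.tanh(c_t)
    return c_t, h_t

def intensities(h_t, W, s=1.0):
    # One intensity per event type k: lambda_k(t) = f(w_k . h(t)).
    return softplus(W @ h_t, s)

# Toy usage: 2 hidden units, 3 event types, random parameters.
rng = np.random.default_rng(0)
c_start = rng.normal(size=2)                        # cell state after last event
c_target = rng.normal(size=2)                       # asymptotic cell state
decay = np.abs(rng.normal(size=2)) + 0.1            # positive decay rates
o_gate = 1.0 / (1.0 + np.exp(-rng.normal(size=2)))  # sigmoid output gate
W = rng.normal(size=(3, 2))                         # per-type read-out weights

for dt in (0.0, 0.5, 5.0):
    _, h = decayed_state(c_start, c_target, decay, o_gate, dt)
    print(dt, intensities(h, W))                    # stays strictly positive
```

Because the decay rates are themselves outputs of the LSTM in the full model, different event types can excite, inhibit, or gradually forget past events; here they are fixed only to keep the sketch short.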
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Event Prediction | StackOverflow | RMSE | 1.027 | 42 |
| Per time-step regression | Walker2D | Squared Error | 1.014 | 19 |
| Sequence Classification | Bit-stream XOR Event-based (irregular) encoding (test) | Accuracy | 95.09 | 18 |
| Event Prediction | Retweet | RMSE | 22.32 | 18 |
| Sequence Classification | Bit-stream XOR Equidistant encoding (test) | Accuracy | 97.73 | 18 |
| Event Forecasting | taxi | RMSE | 0.369 | 16 |
| Event Prediction | MIMIC | Accuracy | 53.4 | 15 |
| Next event prediction | AMAZON | RMSE | 0.612 | 14 |
| Event sequence classification | Irregular sequential MNIST (test) | Accuracy | 94.84 | 11 |
| Sequence Classification | Bit-stream sequence Event-based encoding (test) | Accuracy | 95.09 | 11 |