Long-Range Transformers for Dynamic Spatiotemporal Forecasting

Multivariate Time Series Forecasting focuses on the prediction of future values based on historical context. State-of-the-art sequence-to-sequence models rely on neural attention between timesteps, which allows for temporal learning but fails to consider distinct spatial relationships between variables. In contrast, methods based on graph neural networks explicitly model variable relationships. However, these methods often rely on predefined graphs and perform separate spatial and temporal updates without establishing direct connections between each variable at every timestep. This paper addresses these problems by translating multivariate forecasting into a spatiotemporal sequence formulation where each Transformer input token represents the value of a single variable at a given time. Long-Range Transformers can then learn interactions between space, time, and value information jointly along this extended sequence. Our method, which we call Spacetimeformer, achieves competitive results on benchmarks from traffic forecasting to electricity demand and weather prediction while learning fully-connected spatiotemporal relationships purely from data.

Full post: https://towardsdatascience.com/multivariate-time-series-forecasting-with-transformers-384dc6ce989b
PDF: https://arxiv.org/pdf/2109.12218.pdf
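
To make the sequence formulation concrete, here is a minimal sketch (not the official Spacetimeformer code, and with the embedding and attention layers omitted) of how a multivariate series can be flattened into one token per variable per timestep:

```python
# Minimal illustration only: flatten a multivariate series of shape
# (T timesteps, N variables) into T*N tokens of [time index, variable index,
# value], so a long-range Transformer could attend across both axes jointly.
import numpy as np

def flatten_spatiotemporal(series: np.ndarray) -> np.ndarray:
    """series: (T, N) array -> (T*N, 3) array of [time, variable, value] tokens."""
    T, N = series.shape
    time_idx = np.repeat(np.arange(T), N)   # 0,0,...,0, 1,1,...,1, ...
    var_idx = np.tile(np.arange(N), T)      # 0,1,...,N-1, 0,1,...,N-1, ...
    values = series.reshape(-1)             # row-major: all N variables at t=0, then t=1, ...
    return np.stack([time_idx, var_idx, values], axis=1)

# Example: 4 timesteps of 3 variables (say, 3 traffic sensors) -> 12 tokens.
x = np.random.rand(4, 3)
tokens = flatten_spatiotemporal(x)
print(tokens.shape)  # (12, 3)
```

Each token would then receive a value projection plus learned time and variable embeddings before entering the attention stack. Note that the sequence length grows from T to T×N, which is why an efficient long-range attention mechanism matters in this formulation.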

3D Cellular Automata

This is a project that resembles the classic Game of Life, but implemented in a 3D world. The project needs OpenGL 4.5 or greater to render, and currently only compiles under Linux (although it would be relatively easy to adapt the CMakeLists to build for Windows or macOS, noting that Apple has deprecated OpenGL support in favor of Metal).

Cellular Automaton 3D

Source:  https://github.com/w84death/cellular-automaton
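
The actual rules live in the repository's C++/OpenGL source; purely as an illustration of the general idea, the sketch below advances a 3D grid by one generation of a Life-like rule over a 26-cell Moore neighborhood, with made-up survival and birth counts:

```python
# Illustrative only: one update step of a 3D cellular automaton. The
# survival/birth thresholds here are invented for the example; the linked
# project defines its own rules.
import numpy as np
from scipy.ndimage import convolve

def step(grid: np.ndarray, survive=(4, 5), birth=(5,)) -> np.ndarray:
    """grid: 3D array of 0/1 cells -> next generation under a Life-like rule."""
    kernel = np.ones((3, 3, 3), dtype=int)
    kernel[1, 1, 1] = 0                              # exclude the cell itself
    neighbors = convolve(grid, kernel, mode="wrap")  # periodic boundaries
    stay = (grid == 1) & np.isin(neighbors, survive)
    born = (grid == 0) & np.isin(neighbors, birth)
    return (stay | born).astype(grid.dtype)

# Example: evolve a random 32^3 world for a few generations.
world = (np.random.rand(32, 32, 32) < 0.1).astype(int)
for _ in range(5):
    world = step(world)
```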

Machine Learning’s ‘Amazing’ Ability to Predict Chaos

In new computer experiments, artificial-intelligence algorithms can tell the future of chaotic systems.

Researchers have used machine learning to predict the chaotic evolution of a model flame front.

Half a century ago, the pioneers of chaos theory discovered that the “butterfly effect” makes long-term prediction impossible. Even the smallest perturbation to a complex system (like the weather, the economy or just about anything else) can touch off a concatenation of events that leads to a dramatically divergent future. Unable to pin down the state of these systems precisely enough to predict how they’ll play out, we live under a veil of uncertainty.
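
As a concrete, purely illustrative picture of that sensitivity (not taken from the article), the chaotic logistic map below is iterated from two starting points that differ by only 1e-10; within a few dozen steps the trajectories become completely uncorrelated:

```python
# Illustration of sensitive dependence on initial conditions: iterate the
# chaotic logistic map x -> 4*x*(1-x) from two nearly identical starting
# points and watch the gap between the trajectories grow.
def logistic(x, steps):
    traj = [x]
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
        traj.append(x)
    return traj

a = logistic(0.2, 50)
b = logistic(0.2 + 1e-10, 50)
for t in (0, 10, 30, 50):
    print(t, abs(a[t] - b[t]))
# The difference grows from 1e-10 to order 1, so predicting the exact
# long-horizon state requires essentially perfect knowledge of the start.
```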

In a series of results reported in the journals Physical Review Letters and Chaos, scientists have used machine learning — the same computational technique behind recent successes in artificial intelligence — to predict the future evolution of chaotic systems out to stunningly distant horizons. The approach is being lauded by outside experts as groundbreaking and likely to find wide application.

“I find it really amazing how far into the future they predict” a system’s chaotic evolution, said Herbert Jaeger, a professor of computational science at Jacobs University in Bremen, Germany. …

Full pages:

https://www.wired.com/story/machine-learnings-amazing-ability-to-predict-chaos/

https://www.quantamagazine.org/machine-learnings-amazing-ability-to-predict-chaos-20180418/

A Novel Algorithm Enables Statistical Analysis of Time Series Data

MIT scientists have developed a new approach to analyzing time series data, based on an algorithm termed state-space multitaper time-frequency analysis (SS-MT). SS-MT provides a framework for analyzing time series data in real time, enabling researchers to work in a more informed way with long series of data that are nonstationary, i.e. whose characteristics evolve over time.

Using a novel analytical method they have developed, MIT researchers analyzed raw brain activity data (B). The spectrogram shows decreased noise and increased frequency resolution, or contrast (E and F) compared to standard spectral analysis methods (C and D). Image courtesy of Seong-Eun Kim et al.

Whether the task is tracking brain activity in the operating room, seismic vibrations during an earthquake, or biodiversity in a single ecosystem over a million years, measuring the frequency of an event over a period of time is a fundamental data analysis task that yields critical insight in many scientific fields.

The newly developed approach enables researchers to quantify the time-varying properties of the data, as well as to make formal statistical comparisons between arbitrary segments of it.

Emery Brown, the Edward Hood Taplin Professor of Medical Engineering and Computational Neuroscience, said, “The algorithm functions similarly to the way a GPS calculates your route when driving. If you stray from your predicted route, the GPS triggers a recalculation to incorporate the new information.”

“This allows you to use what you have already computed to get a more accurate estimate of what you’re about to calculate in the next time period. Current approaches to analyses of long, nonstationary time series ignore what you have already calculated in the previous interval, leading to an enormous information loss.” …
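
The sketch below is a toy stand-in for the idea described in these quotes, not the authors' SS-MT algorithm (the full state-space formulation is in the PNAS paper linked below): it computes a multitaper spectrum for each window using DPSS tapers and blends it with the running estimate carried over from earlier windows, rather than treating each window independently. The blending weight `alpha` is a hand-picked placeholder for the model-based gain that the state-space formulation estimates.

```python
# Toy illustration only, NOT the authors' SS-MT algorithm: per-window
# multitaper spectra, recursively blended with the estimate from previous
# windows instead of being computed in isolation.
import numpy as np
from scipy.signal.windows import dpss

def multitaper_spectrum(x, NW=3.0, K=5):
    tapers = dpss(len(x), NW, Kmax=K)                 # K orthogonal DPSS tapers
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return spectra.mean(axis=0)                       # average over tapers

def recursive_spectrogram(signal, win_len=256, alpha=0.3):
    est, rows = None, []
    for start in range(0, len(signal) - win_len + 1, win_len):
        s = multitaper_spectrum(signal[start:start + win_len])
        est = s if est is None else (1 - alpha) * est + alpha * s  # reuse past estimate
        rows.append(est)
    return np.array(rows)                             # (n_windows, n_freqs)

# Example: a noisy 10 Hz tone sampled at 1 kHz for 10 seconds.
fs = 1000
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
S = recursive_spectrogram(x)
print(S.shape)
```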

Full post: https://www.techexplorist.com/novel-algorithm-enables-statistical-analysis-time-series-data/

Abstract: http://www.pnas.org/content/early/2017/12/15/1702877115