Espen Haugsdal

Circle Attention: Forecasting Network Traffic by Learning Interpretable Spatial Relationships from Intersecting Circles (2023)

ECML-PKDD 2023

Abstract Accurately forecasting traffic in telecommunication networks is essential for operators to efficiently allocate resources, provide better services, and save energy. We propose Circle Attention, a novel spatial attention mechanism for telecom traffic forecasting, which directly models the area of effect of neighboring cell towers. The antennas of a cell tower typically point in three different geographical directions, called sectors. Circle Attention models the relationships between sectors of neighboring cell towers by assigning to each sector a circle with learnable parameters: the azimuth of the sector, the distance from the cell tower to the center of the circle, and the radius of the circle.
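A minimal sketch of how such per-sector circle parameters might be represented and turned into pairwise spatial weights. It assumes, beyond what the abstract states, that the weight between two sectors is derived from the intersection area of their circles; the class name CircleSectorWeights, the tensor shapes, and the row-wise normalization are all hypothetical, not the paper's reference implementation.

```python
# Hypothetical sketch (PyTorch) of circle-based spatial weights between sectors.
import math
import torch
import torch.nn as nn


class CircleSectorWeights(nn.Module):
    """Assigns each sector a circle via learnable azimuth, center distance,
    and radius, and derives pairwise weights from circle overlap (assumption)."""

    def __init__(self, tower_positions: torch.Tensor):
        # tower_positions: (num_sectors, 2) planar float coordinates of the
        # cell tower that each sector belongs to (one row per sector).
        super().__init__()
        self.register_buffer("tower_pos", tower_positions)
        n = tower_positions.shape[0]
        self.azimuth = nn.Parameter(torch.rand(n) * 2 * math.pi)  # sector direction
        self.center_dist = nn.Parameter(torch.ones(n))            # tower -> circle center
        self.log_radius = nn.Parameter(torch.zeros(n))            # circle radius (log scale)

    def forward(self) -> torch.Tensor:
        # Place each circle's center at a learnable distance along the
        # sector's learnable azimuth, measured from its tower.
        direction = torch.stack([torch.cos(self.azimuth), torch.sin(self.azimuth)], dim=-1)
        centers = self.tower_pos + self.center_dist.unsqueeze(-1) * direction
        radius = torch.exp(self.log_radius)

        # Pairwise distances between circle centers.
        d = torch.cdist(centers, centers).clamp_min(1e-6)
        r1 = radius.unsqueeze(1)
        r2 = radius.unsqueeze(0)

        # Standard circle-circle intersection (lens) area, clamped so that
        # disjoint circles give 0 and nested circles give the smaller area.
        a1 = torch.acos(((d**2 + r1**2 - r2**2) / (2 * d * r1)).clamp(-1, 1))
        a2 = torch.acos(((d**2 + r2**2 - r1**2) / (2 * d * r2)).clamp(-1, 1))
        k = ((r1 + r2 - d) * (d + r1 - r2) * (d - r1 + r2) * (d + r1 + r2)).clamp_min(1e-12)
        overlap = r1**2 * a1 + r2**2 * a2 - 0.5 * torch.sqrt(k)

        # Normalize overlaps row-wise into attention-like weights
        # (the diagonal is each sector's self-overlap, i.e. its full circle).
        return overlap / overlap.sum(dim=-1, keepdim=True).clamp_min(1e-6)
```

Because the azimuth, center distance, and radius are ordinary parameters, gradients from the forecasting loss can adjust both where and how widely each sector is assumed to have an effect, which is what makes the learned circles interpretable on a map.
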
Persistence initialization: A novel adaptation of the transformer architecture for time series forecasting (2023)

Applied Intelligence

Abstract Time series forecasting is an important problem, with many real-world applications. Transformer models have been successfully applied to natural language processing tasks, but have received relatively little attention for time series forecasting. Motivated by the differences between classification tasks and forecasting, we propose PI-Transformer, an adaptation of the Transformer architecture designed for time series forecasting, consisting of three parts: First, we propose a novel initialization method called Persistence Initialization, with the goal of increasing the training stability of forecasting models by ensuring that the initial outputs of an untrained model are identical to the outputs of a simple baseline model.
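A minimal sketch of the initialization idea described above: wrap a forecasting model so that, before any training, its predictions coincide with a naive persistence baseline (repeat the last observed value). The zero-initialized scalar gate and all names below are assumptions for illustration, not necessarily the paper's exact design.

```python
# Hypothetical sketch (PyTorch) of persistence-style initialization via a
# zero-initialized gate on top of a persistence forecast.
import torch
import torch.nn as nn


class PersistenceInitialized(nn.Module):
    def __init__(self, forecaster: nn.Module, horizon: int):
        super().__init__()
        self.forecaster = forecaster              # any model mapping history -> forecast
        self.horizon = horizon
        self.gate = nn.Parameter(torch.zeros(1))  # zero at init => output equals persistence

    def forward(self, history: torch.Tensor) -> torch.Tensor:
        # history: (batch, length); persistence repeats the last observed value.
        persistence = history[:, -1:].expand(-1, self.horizon)
        return persistence + self.gate * self.forecaster(history)


# Usage: with an untrained forecaster, the outputs match the persistence baseline.
model = PersistenceInitialized(nn.Linear(24, 6), horizon=6)
x = torch.randn(8, 24)
assert torch.allclose(model(x), x[:, -1:].expand(-1, 6))
```

Since the gate starts at zero, early training only has to learn a correction to the baseline rather than the full forecast, which is one plausible reading of how such an initialization stabilizes training.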