Recently, time series prediction models based on deep neural networks have demonstrated excellent ability to capture hidden relationships across time steps. However, because these models directly output a scalar value at each time step, it is difficult to quantify the uncertainty associated with their predictions. To address this challenge, we propose a novel model that directly constructs a discrete probability distribution at each step instead of a scalar. The regression output at each time step is obtained by taking the expectation of the predictive distribution over a predefined support set. To mitigate prediction anomalies, we introduce a dual-branch architecture with interleaved support sets, augmented by coarse-temporal-scale branches for long-term trend forecasting. The output of the other branch is treated as an auxiliary signal that imposes a self-supervised consistency constraint on the current branch's prediction. Extensive experiments on multiple real-world datasets demonstrate the superior performance of the proposed model. All source code will be released upon acceptance.
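The core idea, predicting a discrete distribution over a predefined support set and reading off its expectation as the regression output, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the head parameters, shapes, and the linear projection are hypothetical stand-ins for whatever backbone the paper uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Predefined support set: K candidate output values (illustrative range).
support = np.linspace(-1.0, 1.0, 21)             # shape (K,)

def distributional_head(h, w, b, support):
    """Map hidden states to a distribution over `support`; return its expectation."""
    logits = h @ w + b                            # (batch, time, K)
    logits = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=-1, keepdims=True)    # softmax over the K bins
    # Regression output = expectation of the per-step distribution.
    return probs @ support                        # (batch, time)

# Hypothetical backbone hidden states and head parameters.
h = rng.normal(size=(2, 8, 16))                   # (batch, time, d_model)
w = rng.normal(size=(16, support.size))
b = np.zeros(support.size)

y = distributional_head(h, w, b, support)
print(y.shape)  # (2, 8)
```

One consequence of this design is that every prediction is a convex combination of the support values, so outputs are automatically bounded by the support's range; the interleaved support sets of the two branches would cover complementary candidate values.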