@article{kumar2025qonvolution,
title={Qonvolution: Towards Learning of High-Frequency Signals with Queried Convolution},
author={Kumar, Abhinav and Aumentado-Armstrong*, Tristan and Valkov*, Lazar and Sharma, Gopal and Levinshtein, Alex and Grzeszczuk, Radek and Kumar, Suren},
journal={arXiv preprint arXiv:2512.12898},
year={2025}
}
1D Regression Results. QNN outperforms MLP-based architectures, including those with Fourier encodings, at regressing high-frequency signals. This simple experiment compares networks that take 1D queries and a low-frequency (LF) signal as input and predict the high-frequency 1D signal. The standard MLP-based networks, including those with Fourier encodings, take 1D coordinates as queries. QNN replaces the linear layer with a 1D convolutional layer and also takes the low-frequency signal in addition to the 1D queries.
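To make the contrast concrete, here is a minimal NumPy sketch of the idea (not the paper's exact architecture): an MLP-style layer processes each query coordinate in isolation, which is equivalent to a 1x1 convolution, while a QNN-style layer convolves over the query axis and also receives the LF signal as an extra input channel. The kernel size, channel counts, and the plain `conv1d` helper are illustrative assumptions.

```python
import numpy as np

def conv1d(x, w, b):
    """Valid 1D convolution: x is (C_in, L), w is (C_out, C_in, K), b is (C_out,)."""
    C_out, C_in, K = w.shape
    L_out = x.shape[1] - K + 1
    y = np.zeros((C_out, L_out))
    for o in range(C_out):
        for t in range(L_out):
            y[o, t] = np.sum(w[o] * x[:, t:t + K]) + b[o]
    return y

rng = np.random.default_rng(0)

# Toy inputs: 1D query coordinates and a low-frequency signal sampled at them.
L = 64
queries = np.linspace(0.0, 1.0, L)       # 1D query coordinates
lf = np.sin(2 * np.pi * queries)         # low-frequency input signal

# An MLP-style layer sees each coordinate in isolation (a 1x1 "convolution"):
w_mlp = rng.standard_normal((8, 1, 1))
b = np.zeros(8)
h_mlp = conv1d(queries[None, :], w_mlp, b)   # shape (8, 64)

# A QNN-style layer convolves over the query axis and also sees the LF signal,
# so each output depends on a local neighborhood of queries and LF values:
x = np.stack([queries, lf])                  # (2, L): query channel + LF channel
w_q = rng.standard_normal((8, 2, 5))         # kernel of size 5 over the query axis
h_q = conv1d(x, w_q, b)                      # shape (8, 60)
```

The only structural change is that the weight now spans a window of neighboring queries and an extra LF channel, rather than a single coordinate.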
SR Results on DIV2K Val images. Adding QNN to Real-ESRGAN faithfully reconstructs high-frequency details in various regions and yields visually higher-quality synthesis. We highlight the differences in inset figures.
NVS Results. We provide examples of the NVS task using the 3DGS (Kerbl et al., 2023) baseline on multiple datasets. Adding QNN to 3DGS faithfully reconstructs high-frequency details in various regions and yields visually higher-quality synthesis. We highlight the differences in inset figures.
"We stand on the shoulders of giants." (William of Conches, 1123)
The following are some great works on learning high-frequency signals / details:
1. Encodings: Fourier encodings and hash-grids map the input coordinates to higher-dimensional features for an MLP.
2. Activations: SIREN, sinc, QIREN and FINER change activation functions for MLPs.
3. Frequency Domain Methods: Lee et al. predict Fourier series coefficients, while Cai et al. predict phase-shifted signals for MLP.
4. Frequency-weighted Loss: Fre-GS applies frequency-weighted losses during training.
5. Network Ensembles: Galerkin neural networks use multiple networks to approximate high-frequency signals.
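As an example of the first family, here is a minimal NumPy sketch of a NeRF-style Fourier (positional) encoding; the geometric frequency ladder and the number of frequencies are illustrative choices, not a prescription from any specific paper above.

```python
import numpy as np

def fourier_encode(x, num_freqs=4):
    """Map scalar coordinates x of shape (N,) to
    [sin(2^k * pi * x), cos(2^k * pi * x)] features of shape (N, 2 * num_freqs)."""
    freqs = (2.0 ** np.arange(num_freqs)) * np.pi   # geometric frequency ladder
    ang = x[:, None] * freqs[None, :]               # (N, num_freqs)
    return np.concatenate([np.sin(ang), np.cos(ang)], axis=1)

coords = np.linspace(0.0, 1.0, 5)
feats = fourier_encode(coords)   # shape (5, 8): a richer input for the MLP
```

The MLP then consumes `feats` instead of the raw coordinate, which is what lets it fit higher-frequency variation than a coordinate-input MLP with the same depth.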
There are probably many more by the time you are reading this.