Filip Szatkowski
Title · Cited by · Year
Hypernetworks build implicit neural representations of sounds
F Szatkowski, KJ Piczak, P Spurek, J Tabor, T Trzciński
Joint European Conference on Machine Learning and Knowledge Discovery in …, 2023
Cited by 17* · 2023
Adapt Your Teacher: Improving Knowledge Distillation for Exemplar-free Continual Learning
F Szatkowski, M Pyla, M Przewięźlikowski, S Cygert, B Twardowski, ...
Proceedings of the IEEE/CVF Winter Conference on Applications of Computer …, 2024
Cited by 14 · 2024
Exploiting Activation Sparsity with Dense to Dynamic-k Mixture-of-Experts Conversion
F Szatkowski, B Wójcik, M Piórczyński, S Scardapane
The Thirty-eighth Annual Conference on Neural Information Processing Systems, 2024
Cited by 10* · 2024
Zero time waste in pre-trained early exit neural networks
B Wójcik, M Przewięźlikowski, F Szatkowski, M Wołczyk, K Bałazy, ...
Neural Networks 168, 580-601, 2023
Cited by 8 · 2023
Progressive Latent Replay for Efficient Generative Rehearsal
S Pawlak, F Szatkowski, M Bortkiewicz, J Dubiński, T Trzciński
International Conference on Neural Information Processing, 457-467, 2022
Cited by 1 · 2022
Sparser, Better, Deeper, Stronger: Improving Static Sparse Training with Exact Orthogonal Initialization
A Nowak, Ł Gniecki, F Szatkowski, J Tabor
Forty-first International Conference on Machine Learning, 2024
Cited by 1 · 2024
Exploring the Stability Gap in Continual Learning: The Role of the Classification Head
W Łapacz, D Marczak, F Szatkowski, T Trzciński
arXiv preprint arXiv:2411.04723, 2024
2024
Accelerated Inference and Reduced Forgetting: The Dual Benefits of Early-Exit Networks in Continual Learning
F Szatkowski, F Yang, B Twardowski, T Trzciński, J van de Weijer
arXiv preprint arXiv:2403.07404, 2024
2024