Mostafa Dehghani
Research Scientist, Google DeepMind
Verified email at google.com
Title · Cited by · Year
An image is worth 16x16 words: Transformers for image recognition at scale
A Dosovitskiy, L Beyer, A Kolesnikov, D Weissenborn, X Zhai, ...
arXiv preprint arXiv:2010.11929, 2020
Cited by: 33980
ViViT: A Video Vision Transformer
A Arnab*, M Dehghani*, G Heigold, C Sun, M Lučić, C Schmid
arXiv preprint arXiv:2103.15691, 2021
Cited by: 1787
Scaling instruction-finetuned language models
HW Chung, L Hou, S Longpre, B Zoph, Y Tay, W Fedus, Y Li, X Wang, ...
Journal of Machine Learning Research 25 (70), 1-53, 2024
Cited by: 1510
Efficient Transformers: A Survey
Y Tay, M Dehghani, D Bahri, D Metzler
ACM Computing Surveys 55 (6), 1–28, 2022
Cited by: 1056*
Universal Transformers
M Dehghani, S Gouws, O Vinyals, J Uszkoreit, Ł Kaiser
International Conference on Learning Representations (ICLR), 2019
Cited by: 887
PaLM 2 technical report
R Anil, AM Dai, O Firat, M Johnson, D Lepikhin, A Passos, S Shakeri, ...
arXiv preprint arXiv:2305.10403, 2023
Cited by: 750
Long Range Arena: A Benchmark for Efficient Transformers
Y Tay*, M Dehghani*, S Abnar, Y Shen, D Bahri, P Pham, J Rao, L Yang, ...
arXiv preprint arXiv:2011.04006, 2020
Cited by: 495
Neural Ranking Models with Weak Supervision
M Dehghani, H Zamani, A Severyn, J Kamps, WB Croft
The 40th International ACM SIGIR Conference on Research and Development in …, 2017
Cited by: 401
Gemini: a family of highly capable multimodal models
G Team, R Anil, S Borgeaud, Y Wu, JB Alayrac, J Yu, R Soricut, ...
arXiv preprint arXiv:2312.11805, 2023
Cited by: 400
MetNet: A neural weather model for precipitation forecasting
CK Sønderby, L Espeholt, J Heek, M Dehghani, A Oliver, T Salimans, ...
arXiv preprint arXiv:2003.12140, 2020
Cited by: 287
UL2: Unifying language learning paradigms
Y Tay, M Dehghani, VQ Tran, X Garcia, J Wei, X Wang, HW Chung, ...
arXiv preprint arXiv:2205.05131, 2022
Cited by: 266
Simple open-vocabulary object detection
M Minderer, A Gritsenko, A Stone, M Neumann, D Weissenborn, ...
European Conference on Computer Vision, 728-755, 2022
Cited by: 264
Scaling vision transformers to 22 billion parameters
M Dehghani, J Djolonga, B Mustafa, P Padlewski, J Heek, J Gilmer, ...
International Conference on Machine Learning, 7480-7512, 2023
Cited by: 250
Parameter-efficient multi-task fine-tuning for transformers via shared hypernetworks
RK Mahabadi, S Ruder, M Dehghani, J Henderson
arXiv preprint arXiv:2106.04489, 2021
Cited by: 211
From neural re-ranking to neural ranking: Learning a sparse representation for inverted indexing
H Zamani, M Dehghani, WB Croft, E Learned-Miller, J Kamps
Proceedings of the 27th ACM international conference on information and …, 2018
Cited by: 171
Transformer memory as a differentiable search index
Y Tay, V Tran, M Dehghani, J Ni, D Bahri, H Mehta, Z Qin, K Hui, Z Zhao, ...
Advances in Neural Information Processing Systems 35, 21831-21843, 2022
Cited by: 146
TokenLearner: Adaptive space-time tokenization for videos
M Ryoo, AJ Piergiovanni, A Arnab, M Dehghani, A Angelova
Advances in neural information processing systems 34, 12786-12797, 2021
Cited by: 124
Learning to Attend, Copy, and Generate for Session-Based Query Suggestion
M Dehghani, S Rothe, E Alfonseca, P Fleury
International Conference on Information and Knowledge Management (CIKM'17), 2017
Cited by: 120
Scale efficiently: Insights from pre-training and fine-tuning transformers
Y Tay, M Dehghani, J Rao, W Fedus, S Abnar, HW Chung, S Narang, ...
arXiv preprint arXiv:2109.10686, 2021
Cited by: 103
Exploring the limits of large scale pre-training
S Abnar, M Dehghani, B Neyshabur, H Sedghi
arXiv preprint arXiv:2110.02095, 2021
Cited by: 102
Articles 1–20