RECENT PREPRINTS

First authors in bold, correspondence authors underlined

MAKE ME HAPPIER: EVOKING EMOTIONS THROUGH IMAGE DIFFUSION MODELS [PDF]

Qing Lin, Jingfeng Zhang, Yew Soon Ong, Mengmi Zhang

TTA-NAV: TEST-TIME ADAPTIVE RECONSTRUCTION FOR POINT-GOAL NAVIGATION UNDER VISUAL CORRUPTIONS [PDF]

Maytus Piriyajitakonkij, Mingfei Sun, Mengmi Zhang, Wei Pan

ADAPTIVE VISUAL SCENE UNDERSTANDING: INCREMENTAL LEARNING IN SCENE GRAPH GENERATION [PDF]

Naitik Khandelwal, Xiao Liu, Mengmi Zhang

REASON FROM CONTEXT WITH SELF-SUPERVISED LEARNING [PDF]

Xiao Liu, Ankur Sikarwar, Joo Hwee Lim, Gabriel Kreiman, Zenglin Shi, Mengmi Zhang

UNVEILING THE TAPESTRY: THE INTERPLAY OF GENERALIZATION AND FORGETTING IN CONTINUAL LEARNING [PDF]

Zenglin Shi, Jie Jing, Ying Sun, Joo Hwee Lim, Mengmi Zhang

HUMAN OR MACHINE? TURING TESTS FOR VISION AND LANGUAGE [PDF]

Mengmi Zhang, Giorgia Dellaferrera, Ankur Sikarwar, Marcelo Armendariz, Noga Mudrik, Prachi Agrawal, Spandan Madan, Andrei Barbu, Haochen Yang, Tanishq Kumar, Meghna Sadwani, Stella Dellaferrera, Michele Pizzochero, Hanspeter Pfister, Gabriel Kreiman

EFFICIENT ZERO-SHOT VISUAL SEARCH VIA TARGET AND CONTEXT-AWARE TRANSFORMER [PDF]

Zhiwei Ding, Xuezhe Ren, Erwan David, Melissa Vo, Gabriel Kreiman, Mengmi Zhang

WHAT MAKES DOMAIN GENERALIZATION HARD? [PDF]

Spandan Madan, Li You, Mengmi Zhang, Hanspeter Pfister, Gabriel Kreiman

SELECTED PUBLICATIONS

Full publication list [Google Scholar]

TUNED COMPOSITIONAL FEATURE REPLAYS FOR EFFICIENT STREAM LEARNING [PDF]

Morgan B. Talbot, Rushikesh Zawar, Rohil Badkundri, Mengmi Zhang, Gabriel Kreiman

IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2023

DECODING THE ENIGMA: BENCHMARKING HUMANS AND AIS ON THE MANY FACETS OF WORKING MEMORY [PDF]

Ankur Sikarwar, Mengmi Zhang

NeurIPS, Dataset and Benchmark Track, 2023

OBJECT-CENTRIC LEARNING WITH CYCLIC WALKS BETWEEN PARTS AND WHOLE [PDF]

Ziyu Wang, Mike Shou, Mengmi Zhang

NeurIPS, 2023

TRAINING-FREE OBJECT COUNTING WITH PROMPTS [PDF]

Zenglin Shi, Ying Sun, Mengmi Zhang

WACV, 2023

LEARNING TO LEARN: HOW TO CONTINUOUSLY TEACH HUMANS AND MACHINES [PDF]

Parantak Singh, You Li, Ankur Sikarwar, Weixian Lei, Daniel Gao, Morgan Bruce Talbot, Ying Sun, Mike Zheng Shou, Gabriel Kreiman, Mengmi Zhang

ICCV, 2023

LABEL-EFFICIENT ONLINE CONTINUAL OBJECT DETECTION IN STREAMING VIDEO [PDF]

Jay Zhangjie Wu, David Junhao Zhang, Wynne Hsu, Mengmi Zhang, Mike Zheng Shou

ICCV, 2023 

INTEGRATING CURRICULA WITH REPLAYS: ITS EFFECTS ON CONTINUAL LEARNING [PDF]

Ren Jie Tee, Mengmi Zhang

AAAI Symposium, 2023

SYMBOLIC REPLAY: SCENE GRAPH AS PROMPT FOR CONTINUAL LEARNING ON VQA TASK

Stan Weixian Lei, Difei Gao, Jay Zhangjie Wu, Yuxuan Wang, Wei Liu, Mengmi Zhang, Mike Zheng Shou

AAAI, 2023 [ORAL paper] [PDF]

LOOK TWICE: A GENERALIST COMPUTATIONAL MODEL PREDICTS RETURN FIXATIONS ACROSS TASKS AND SPECIES

Mengmi Zhang, Marcelo Armendariz, Will Xiao, Olivia Rose, Katarina Bendtz, Margaret Livingstone, Carlos Ponce, Gabriel Kreiman

PLOS Computational Biology, 2022 [PDF]

WHEN PIGS FLY: CONTEXTUAL REASONING IN SYNTHETIC AND NATURAL SCENES

Philipp Bomatter, Mengmi Zhang, Dimitar Karev, Spandan Madan, Claire Tseng, Gabriel Kreiman

ICCV, 2021 [PDF]

BEAUTY IS IN THE EYE OF THE MACHINE

Mengmi Zhang, Gabriel Kreiman

Nature Human Behaviour, 2021 [PDF]

VISUAL SEARCH ASYMMETRY: DEEP NETS AND HUMANS SHARE SIMILAR INHERENT BIASES

Shashi Kant Gupta, Mengmi Zhang, Chia-Chien Wu, Jeremy M. Wolfe, Gabriel Kreiman

NeurIPS, 2021 [PDF]

PUTTING VISUAL OBJECT RECOGNITION IN CONTEXT

Mengmi Zhang, Claire Tseng, Gabriel Kreiman

CVPR, 2020 [PDF]

FINDING ANY WALDO WITH ZERO-SHOT INVARIANT AND EFFICIENT VISUAL SEARCH

Mengmi Zhang, Jiashi Feng, Keng Teck Ma, Joo Hwee Lim, Qi Zhao, Gabriel Kreiman

Nature Communications, 2018 [PDF]

DEEP FUTURE GAZE: GAZE ANTICIPATION ON EGOCENTRIC VIDEOS USING ADVERSARIAL NETWORKS

Mengmi Zhang, Keng Teck Ma, Joo Hwee Lim, Qi Zhao, Jiashi Feng

CVPR, 2017 [ORAL paper, acceptance rate 2.6%] [PDF]