Lifelong compositional, scalable and object-centric learning
- Udandarao, Vishaal, et al. “A Practitioner’s Guide to Continual Multimodal Pretraining.” NeurIPS (2024).
- Yıldız, Çağatay, et al. “Investigating Continual Pretraining in Large Language Models: Insights and Implications.” arXiv preprint arXiv:2402.17400 (2024).
- Wiedemer, Thaddäus, et al. “Compositional generalization from first principles.” NeurIPS (2023).
- Dziadzio, Sebastian, et al. “Disentangled Continual Learning: Separating Memory Edits from Model Updates.” CoLLAs (2024).
- Wiedemer, Thaddäus, et al. “Provable Compositional Generalization for Object-Centric Learning.” arXiv preprint arXiv:2310.05327 (2023).
- Tangemann, Matthias, et al. “Object segmentation from common fate: Motion energy processing enables human-like zero-shot generalization to random dot stimuli.” NeurIPS (2024).