If you’re interested in collaborating, email me at email@example.com. I’ve posted a (work-in-progress) summary of my research approach.
Schaeffer et al. Are Emergent Abilities of Large Language Models a Mirage? 2023.
Schaeffer*, Khona*, Fiete. Self-Supervised Learning of Efficient Algebraic Codes Generates Grid Cells. 2023.
Schaeffer, Khona, Fiete. No Free Lunch from Deep Learning in Neuroscience: A Case Study through Models of the Entorhinal-Hippocampal Circuit. NeurIPS 2022.
Schaeffer, Liu, Du, Linderman, Fiete. Streaming Inference for Infinite Non-Stationary Clustering. CoLLAs 2022.
Schaeffer, Du, Liu, Fiete. Streaming Inference for Infinite Latent Feature Models. ICML 2022.
Schaeffer, Khona, Fiete. No Free Lunch from Deep Learning in Neuroscience: A Case Study through Models of the Entorhinal-Hippocampal Circuit. ICML 2022 Workshop: AI for Science.
Schaeffer, Liu, Du, Linderman, Fiete. Streaming Inference for Infinite Non-Stationary Clustering. ICLR 2022 Workshop: Agent Learning in Open Endedness.
Schaeffer. An Algorithmic Theory of Metacognition in Minds and Machines. NeurIPS 2021 Workshop: Metacognition in the Age of AI.
Schaeffer et al., Fiete, IBL. Neural population dynamics for hierarchical inference in mice performing the International Brain Lab task. Society for Neuroscience 2021.
Schaeffer, Bordelon, Khona, Pan, Fiete. Efficient Online Inference for Nonparametric Mixture Models. UAI 2021.
Schaeffer, Shaham, Kreiman, Sompolinsky. Neural network model of amygdalar memory engram formation and function. COSYNE 2021.
Schaeffer et al. Memory Engrams Perform Nonparametric Latent State Associative Learning. 2022.
Schaeffer et al. Streaming Inference for Infinite Latent Feature Models. AISTATS 2021.
Schaeffer et al. Recovering low dimensional, interpretable mechanistic models via Representations and Dynamics Distillation. 2021.
Schaeffer, Khona, Meshulam, IBL, Fiete. Reverse-engineering recurrent neural network solutions to a hierarchical inference task for mice. NeurIPS 2020.