Talk Title: Towards The Next Generation Foundation Model for Embodied AI
Building a generalized foundation model for Embodied AI (EAI) is a significant challenge. It requires not only comprehending the world through multi-modal sensors but also the capacity to adapt to novel tasks, environments, and hardware configurations through continual interaction with the world. Large-scale generative models exhibit the potential for learning-to-learn, enabling them to generalize to new tasks and acquire new skills without parameter tuning. Building on this, General-Purpose In-Context Learning (GPICL) offers a promising approach to addressing diverse tasks and hardware configurations with a single foundation model. Furthermore, it has been observed that the extent of generalization depends not solely on parameter scale but more significantly on the diversity of datasets, the length of context, and the size of memory states. These insights suggest potential new pathways for developing a generalized foundation model for EAI.
Speaker Information:
Wang Fan is currently a researcher at the AIRS Embodied AI (EAI) Center. He was formerly a Distinguished Architect at Baidu. He received his Master's degree from the University of Colorado Boulder and his Bachelor's degree from the University of Science and Technology of China. His main research areas and interests include end-to-end robot models, large language models, and AI for Science. He has published over 30 papers in top conferences and journals, with a series of frontier contributions in large-scale generative language models, large models for biomolecular representation, Human-In-The-Loop reinforcement learning, and meta-learning, and has received honors including the Special Prize of the Wu Wenjun Artificial Intelligence Science and Technology Progress Award.