About Us
          
          
             
             
             
             As a laboratory rooted in education and academia, we advance technology through innovative research.
             Our mission is to enable machines to interact with the physical world via a unified natural language interface—Language + X, where X can be code, vision, tables, etc. We believe this will lay a solid foundation for future innovation.
            
          
            Projects
          
          
            
              
              
                Code Intelligence
Automating software development via code LLMs.
               
             
            
              
              
                Multimodal
                
                Understanding and generating content across multiple modalities.
               
             
            
              
              
                Language Model
Building more reliable and robust language models while extending their creativity.
               
             
            
              
              
                UI Intelligence
Automating UI design via UI code generation.
               
             
            
              
              
                Table Intelligence
                Building a more user-friendly interface for interacting with tabular data (e.g., Excel).
               
             
            
              
              
                Trustworthy AI
                Building reliable and robust generative models.
               
             
           
          
            News
          
          
             
              Sep. 2025
              Two papers have been accepted by NeurIPS 2025! Congratulations to Mingyang, Yuyang, Zetong, and Yuliang!
             
             
              Aug. 2025
              One paper has been accepted by EMNLP 2025! Congratulations to Chenlong and Yuanning!
             
            
              Aug. 2025
              One paper has been accepted by ACMMM 2025! Congratulations to Ruoxi!
             
            
May 2025
              Two papers have been accepted by ACL 2025! Congratulations to Geliang and Caixi!
             
            
May 2025
              Two papers have been accepted by KDD 2025! Congratulations to Shu and Zhongyi!
             
            
May 2025
              One paper has been accepted by ICML 2025! Congratulations to Chenlong, Zhengxiang, and Zhaoyang!
             
            
              Jan. 2025
Three papers have been accepted by ICLR 2025 (1 Spotlight, top 5%)! Congratulations to Dongping and Siyuan!
             
            
              Jan. 2025
Two papers have been accepted by WWW 2025! Congratulations to Zhongyi and Dongping!
             
            
              Oct. 2024
              Four papers have been accepted by NeurIPS 2024 Workshop! Congratulations to Caixi, Yanru, Siyuan, and Dongping!
             
            
              Sep. 2024
One paper has been accepted by NeurIPS 2024 and EMNLP! Congratulations to Chujie and Batu!
             
            
              Jun. 2024
              One paper has been accepted by ICML 2024 Workshop. Congratulations to Dongping!
             
            
May 2024
              MLLM-as-a-Judge has been accepted by ICML 2024 Oral (top 1.44%). Congratulations to Dongping, Ruoxi, and Yaochen!
             
            
              Mar. 2024
              LLM-as-a-Coauthor has been accepted by NAACL 2024 Findings. Congratulations to Chujie and Dongping!
             
            
              Jan. 2024
              Metatool has been accepted by ICLR 2024. Congratulations to Yue!
             
           
            
            Acknowledgement
            
            
Dongping Chen and ONE Lab are supported by an education program from 
Linear Company. We thank them for their invaluable support!