Culture
We strive to build an interdisciplinary team working at the intersection of deep learning, neuroscience, and HCI. Our core values are:
- Experimental: We value scientific rigor, grounding our research in solid scientific foundations and conducting sound experiments that yield definitive, repeatable findings.
- Computational: Our work relies on algorithms, mathematical models, a strong theoretical background, and strong coding skills.
Lab Entry
We welcome students from all disciplines, but particularly those with a strong passion for research and/or building real industrial products. All Master's/Ph.D. students are encouraged to publish at top-tier conferences/journals: HCI (CHI, UIST), NLP (ACL, EMNLP), CV (CVPR, ICCV), Brain (Journal of Neural Engineering).
Current Projects
Theory
- Distillation, Pruning, Grafting: combining distillation and pruning to build better models (see the distillation sketch after this list)
- Query-based summarization: designing effective models for query-based summarization
- Medical visual QA: designing effective models for medical visual QA
- Multimodality: understanding and utilizing multimodal signals (e.g., text, image) to design better models
- Foundation models for Thai: developing foundation models (e.g., embeddings, GPT-style models) for the Thai language
- EEG + Wellbeing: analyzing EEG signals to understand human behavior and well-being
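
As context for the distillation work above, here is a minimal sketch of a standard knowledge-distillation loss in PyTorch; the `student_logits`/`teacher_logits` names and the temperature/weighting values are illustrative assumptions, not the lab's actual implementation.

```python
# Minimal knowledge-distillation sketch (assumes PyTorch is installed).
# Names and hyperparameters (T, alpha) are illustrative, not the lab's code.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # scale by T^2 so the soft term stays comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

In practice, the distilled student can then be pruned and fine-tuned with the same loss, which is one common way the two techniques are combined.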
Application
Note: some of the demos can be found at https://ait-brainlab.github.io/demo/
- Glucose monitoring: contributes to the use of Raman spectroscopy and the development of Raman wearables to monitor blood glucose in real time
- BCI speller: contributes to the development of a BCI speller for locked-in patients using EEG paradigms such as P300, SSVEP, hybrid P300-SSVEP, and motor imagery.
- AI-assisted radiology software: contributes to the development of automatic annotation and reporting for fMRI/MRI images.
- Conversational AI: contributes to the development of ChatGPT-like systems in domains such as legal, logistics, medical, insurance, corporate, etc. There are many components: (1) instruction (parameter-efficient) tuning with dialogue datasets, (2) reinforcement learning, (3) retrieval augmentation (see the sketch after this list), (4) quantization for efficiency, (5) image support, (6) support for English and Thai.
- Code generation: contributes to automatic code generation, testing and deployment using large language models.
- ThaiGovAI: contributes to the development of models that help turn informal text into formal government text, as well as generate common forms used in government operations.
- Research writing assistant: contributes to the development of models that help adapt research writing to a particular conference's style.
- AI-generated content detector: contributes to the development of detectors that flag AI-generated text for plagiarism checking.
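
As a rough illustration of the retrieval-augmentation component mentioned under Conversational AI, the sketch below pairs a cosine-similarity retriever with a simple prompt template; `embed` and `generate` are hypothetical callables standing in for whatever embedding model and LLM are actually used, so this is a sketch under those assumptions rather than the lab's implementation.

```python
# Minimal retrieval-augmentation sketch; `embed` and `generate` are hypothetical
# stand-ins for an embedding model and an LLM, not the lab's actual components.
import numpy as np

def retrieve(query_vec, doc_vecs, docs, k=3):
    """Return the k documents whose embeddings are most cosine-similar to the query."""
    sims = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9
    )
    return [docs[i] for i in np.argsort(-sims)[:k]]

def answer(question, docs, doc_vecs, embed, generate):
    """Ground the generator in retrieved passages before answering."""
    context = "\n".join(retrieve(embed(question), doc_vecs, docs))
    prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
    return generate(prompt)
```

The same pattern extends naturally to domain-specific corpora (legal, logistics, medical, etc.) by swapping in the relevant document collection.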