To ensure effective applied behavior analysis (ABA) interventions for children with ASD, accurate and detailed real-time annotation of targeted behaviors is crucial for therapists' intervention evaluation and clinical decision-making. However, most prior computational modeling research for ASD has focused on designing one-size-fits-all models for research domains that are standardized across participants, rather than on modeling targeted atypical or functional behaviors that are highly individual and require long-term personalization. To bridge this gap, this work aims to develop socially assistive robot-supported ABA interventions to collect a novel long-term multimodal dataset of targeted behaviors from individual children with ASD, including frame-by-frame ground-truth labels for both atypical maladaptive behaviors (vocal protest, vocal and physical stereotypy, and self-injurious behaviors) and encouraged functional behaviors (manding and tacting) targeted in ABA interventions. Based on this multi-session multimodal dataset, we propose to train and evaluate personalized multimodal machine learning models with supervised domain adaptation, automating the annotation process while adapting to each child's unique behavioral patterns. The ultimate goal of this project is to free therapists from tedious manual annotation, paving the way for more cost-effective robot-assisted ABA interventions that fully support the special needs and challenges of children with ASD.
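The personalization strategy above can be illustrated with a minimal sketch: pretrain a classifier on pooled data from many children, then fine-tune it on a small labeled set from one child. This is one simple form of supervised domain adaptation; the model (a NumPy logistic regression), the synthetic data, and all names here are illustrative assumptions, not the project's actual architecture or dataset.

```python
import numpy as np

def train_logreg(X, y, w=None, lr=0.1, epochs=200):
    """Gradient-descent logistic regression. Passing a pretrained weight
    vector `w` fine-tunes it, a basic supervised domain-adaptation scheme."""
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)       # mean cross-entropy gradient
    return w

rng = np.random.default_rng(0)
# Source domain: synthetic stand-in for pooled multi-child session features.
X_pop = rng.normal(size=(500, 8))
w_true = rng.normal(size=8)
y_pop = (X_pop @ w_true > 0).astype(float)
# Target domain: a small labeled set from one child, with a shifted boundary
# standing in for that child's individual behavioral patterns.
X_child = rng.normal(size=(40, 8))
y_child = (X_child @ (w_true + 0.5) > 0).astype(float)

w_pop = train_logreg(X_pop, y_pop)                                  # generic model
w_personal = train_logreg(X_child, y_child, w=w_pop.copy(), lr=0.02)  # adapted

acc = ((1.0 / (1.0 + np.exp(-X_child @ w_personal)) > 0.5) == y_child).mean()
```

The small learning rate in the fine-tuning step keeps the adapted weights close to the population model, so the child-specific model benefits from pooled data while still shifting toward the individual's patterns.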