Graduate and Professional Student Association

Poster Session

12:45 PM – 1:45 PM | Merten Hall, Room 1204


Bridging the Gap: A Conceptual Framework for AI-Driven Triage in Higher Education Mental Health Services

Eunjin Han (College of Education and Human Development)

The demand for mental health services in higher education has surged, overwhelming university counseling centers (UCCs) and creating critical bottlenecks in service delivery. Traditional triage models rely heavily on limited human resources, which often results in delayed interventions for students in distress. This proposal introduces a conceptual framework for an AI-driven triage and routing system designed to optimize student support efficiency and accessibility. Synthesizing principles from counseling psychology and educational technology, the proposed model utilizes Natural Language Processing (NLP) to analyze student self-reports and intake data. The framework categorizes student needs based on a tiered risk assessment model: (1) Crisis-level risks trigger immediate routing to human clinicians; (2) Moderate concerns are directed to scheduled counseling or group therapy; and (3) Lower-risk maintenance needs are matched with AI-curated self-help resources, workshops, or peer support networks. This presentation will outline the theoretical architecture of the system, emphasizing the integration of the Stepped Care Model with machine learning algorithms. Furthermore, it addresses critical ethical considerations, including data privacy, algorithmic bias, and the necessity of “human-in-the-loop” protocols to ensure clinical safety. By reimagining the intake process, this interdisciplinary framework offers a scalable solution to the mental health crisis on campus, aiming to reduce wait times and ensure that every student receives the appropriate level of care at the right time.
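
As an illustration only, the tiered routing described above can be sketched as a mapping from an assessed risk score to a service tier. The numeric thresholds, tier labels, and function name below are placeholder assumptions, not part of the proposed framework, whose boundaries would come from a clinically validated assessment with human-in-the-loop review:

```python
def route_student(risk_score: float) -> str:
    """Map an assessed risk score in [0, 1] to a service tier.

    The 0.8 / 0.4 thresholds are illustrative placeholders only;
    real tier boundaries would be clinically validated and subject
    to human-in-the-loop review.
    """
    if risk_score >= 0.8:   # Tier 1: crisis-level risk
        return "immediate human clinician"
    if risk_score >= 0.4:   # Tier 2: moderate concern
        return "scheduled counseling or group therapy"
    # Tier 3: lower-risk maintenance needs
    return "self-help resources, workshops, or peer support"
```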


Integrating Social Vulnerability into Shelter Location-Allocation Model for Wildfire Evacuation

Md Mahfuzur Rhaman (College of Science), Hoda Bidkhori

Wildfires increasingly threaten communities across the western United States, yet evacuation planning tools often prioritize efficiency while overlooking social vulnerability. This research develops an equity-aware shelter location–allocation framework to improve evacuation outcomes for socially vulnerable populations during wildfire events. The central research question asks how explicitly incorporating social vulnerability into shelter allocation models affects coverage, travel burden, and unmet demand under limited shelter capacity. We hypothesize that integrating Social Vulnerability Index–based equity weighting significantly improves shelter access for high-vulnerability communities with manageable efficiency trade-offs. To test this hypothesis, three optimization models are formulated: a distance-based shelter allocation model, a travel-time-based allocation model using road-network travel times, and a shelter location–allocation model that allows the opening of new candidate shelters. All models include a tunable equity parameter that prioritizes high-SVI census tracts. The framework is evaluated using Los Angeles County as a case study, integrating tract-level population data, social vulnerability scores, shelter capacities, and realistic road-network travel times. Large-scale problem instances are solved efficiently using a column generation algorithm. Results show that increasing equity weights consistently improves coverage and reduces unmet demand for very high-SVI communities across all models. Moderate equity settings achieve substantial equity gains while maintaining near-optimal overall system performance, and the inclusion of new shelters further amplifies benefits for vulnerable populations. Travel-time-based results confirm that equity effects persist under realistic network constraints. This study concludes that social vulnerability can be operationalized as a policy-relevant decision lever in wildfire evacuation planning. Future work will extend the framework using robust and adaptive robust optimization to address uncertainty.
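
The effect of a tunable SVI-based equity parameter can be illustrated with a toy greedy assignment; this is a deliberately simplified sketch, not the authors' column-generation formulation, and the weighting scheme 1 + α·SVI, function name, and data layout are assumptions for illustration. Tracts with a higher equity weight are served first, so raising α shifts scarce nearby shelter capacity toward high-SVI tracts:

```python
def greedy_assign(tracts, capacities, alpha):
    """Toy equity-weighted shelter assignment (illustrative only).

    tracts:     list of (name, svi, [distance to each shelter]);
                each tract contributes one unit of demand
    capacities: units of capacity per shelter
    alpha:      equity parameter; higher alpha prioritizes high-SVI tracts
    """
    cap = list(capacities)
    assignment = {}
    # Serve tracts in order of equity weight 1 + alpha * SVI, highest first.
    for name, svi, dists in sorted(tracts, key=lambda t: 1 + alpha * t[1],
                                   reverse=True):
        # Each tract takes its nearest shelter that still has capacity.
        open_shelters = [j for j in range(len(cap)) if cap[j] > 0]
        j = min(open_shelters, key=lambda k: dists[k])
        cap[j] -= 1
        assignment[name] = j
    return assignment
```

With α = 0 the scarce nearby shelter goes to whichever tract is processed first; with a large α the high-SVI tract is prioritized, mirroring the coverage shifts reported above (in toy form only).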


Distributionally Robust Logistic Regression with Missing Data

Weicong Chen (College of Science), Hoda Bidkhori

Missing data presents a persistent challenge in machine learning. Conventional approaches often rely on data imputation followed by standard learning procedures, typically overlooking the uncertainty introduced by the imputation process. This paper introduces Imputation-based Distributionally Robust Logistic Regression (I-DRLR)—a novel framework that integrates data imputation with class-conditional Distributionally Robust Optimization under the Wasserstein distance. I-DRLR explicitly models distributional ambiguity in the imputed data and seeks to minimize the worst-case logistic loss over the resulting uncertainty set. We derive a convex reformulation to enable tractable optimization and evaluate the method on the Breast Cancer and Heart Disease datasets from the UCI Repository. Experimental results demonstrate consistent improvements in out-of-sample performance in both prediction accuracy and ROC-AUC, outperforming traditional methods that treat imputed data as fully reliable.
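
For intuition only: in the well-known features-only special case, Wasserstein-robust logistic regression reduces to ordinary logistic regression with the ambiguity radius ε multiplying a dual norm of the weight vector. The sketch below illustrates that reduced form with plain gradient descent on a toy problem; it is not the paper's I-DRLR formulation (which additionally handles imputation and class-conditional ambiguity), and the function name and hyperparameters are assumptions:

```python
import math

def drlr_fit(X, y, eps, lr=0.1, steps=500):
    """Toy Wasserstein-robust logistic regression (features-only case).

    Minimizes average log-loss plus eps * ||w||_2, the norm term being
    the regularizer induced by a Wasserstein ambiguity radius eps.
    """
    d = len(X[0])
    w = [0.0] * d
    for _ in range(steps):
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))        # sigmoid
            for j in range(d):
                grad[j] += (p - yi) * xi[j] / len(X)
        # Subgradient of the eps * ||w||_2 robustness term.
        norm = math.sqrt(sum(wj * wj for wj in w)) or 1.0
        for j in range(d):
            w[j] -= lr * (grad[j] + eps * wj_dir(w[j], norm, eps))
    return w

def wj_dir(wj, norm, eps):
    """Direction of the norm subgradient for one coordinate."""
    return wj / norm
```

A larger ε shrinks the fitted weights, reflecting a more conservative worst-case stance toward the (here, hypothetically imputed) data.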


What Do We Mean by “Pilot Study”: Early Findings from a Meta-Review of Pilot Study Reporting at CHI

Belu Ticona Oquendo (College of Engineering and Computing), Amna Liaqat, Antonis Anastasopoulos

Pilot studies are ubiquitous in Human-Computer Interaction (HCI) research, yet their role, purpose, and reporting practices remain under-specified. Authors frequently reference pilot studies to justify design decisions or methodological choices, but it is often unclear what constitutes a pilot study, how it differs from other forms of exploratory work, or how pilot findings shape final study designs. We present early findings from an ongoing, large-scale meta-review of pilot study reporting in ACM CHI papers published between 2008 and 2025. Analyzing 904 full papers that explicitly reference “pilot study,” we characterize where pilot studies are reported, how much detail is provided, and how their impact on main studies is described. Our preliminary results reveal substantial variability in reporting structure, limited documentation of pilot outcomes, and frequent ambiguity around pilot contributions.


Vision Development for High School Choirs

Judith Rautenberg (College of Visual and Performing Arts)

Charismatic leaders are often charged with the responsibility of creating a vision and path for their group (Bass, 1985; Conger & Kanungo, 1987; Conger, 1989). Vision is one of the most important variables a charismatic leader can possess, and it is exemplified by their ability to develop ideas that can lead their group towards a successful future (Burns, 1978; Wis, 2007). A clear and well-communicated vision engages and unites the group members to invest time and work for their benefit (Wis, 2007). For example, choral conductors are leaders who are required to constantly develop their vision by programming music and recruiting singers for their choir. However, the previous literature barely addresses how this vision can be created and nurtured (Goetze, 2016; Halsey, 2011; Jordan, 1996).
The purpose of this research was to interview a high school choir conductor and determine how they developed a vision for their school ensemble. This research addresses the results of a single case study, based on an interview with a high school choir director in the Mid-Atlantic region of the United States. The results emphasize the process of developing a future path for a high school choir, along with its challenges, expectations, and solutions.
The interview was recorded via Zoom. The data were cleaned, de-identified, and coded using the software Dedoose. The researcher organized the results into three main themes: (a) the status of the choir groups and the expectations of students, parents, and administration; (b) reasons and paths to initiate change for the choir; and (c) solutions and compromises the choir director and administration agreed on.

A Systematic Examination of Cohort Selection Decisions and Equity in Machine Learning Outcomes Using N3C Data

Atefehsadat Haghighathoseini (College of Public Health), Dr. Janusz Wojtusiak

Cohort selection critically influences machine learning (ML) model performance and the equity of clinical outcome predictions across demographic groups. In practice, cohort definitions are often driven by arbitrary or inconsistent data processing decisions, introducing bias and limiting model generalizability. During the COVID-19 pandemic, the need for rapid cohort construction further intensified concerns about transparency and fairness in ML-based analyses. This study systematically examines how cohort selection and data processing decisions influence ML performance and demographic equity in predicting COVID-19–related in-hospital mortality. Using data from the National COVID Cohort Collaborative (N3C), we evaluated two structured sets of cohorts. The first set included 16 cohorts derived from four primary data processing decisions: COVID-19 case identification, inpatient inclusion criteria, diagnosis date selection, and admission timestamp availability. The second set expanded this framework to 64 cohorts by additionally incorporating provider ID and location ID filtering. Model performance was evaluated using the area under the receiver operating characteristic curve (AUC) across multiple training–testing cohort combinations. Three ML models (logistic regression, random forest, and gradient boosting) were assessed using multiple analytical strategies. Performance was further analyzed across demographic subgroups defined by gender, race, and ethnicity. Results show that cohort selection decisions substantially influence ML model performance. Admission time inclusion consistently emerged as a key factor, while additional filtering criteria further affected results across models and subgroups. These findings demonstrate that seemingly minor cohort construction choices can meaningfully impact both predictive accuracy and demographic equity, underscoring the need for transparent, standardized, and equity-aware cohort selection practices in ML-based healthcare research.
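
The per-subgroup AUC comparison described above can be illustrated with the standard rank-based (Mann–Whitney) definition of AUC computed separately within each demographic subgroup; the function names and the toy data in the usage note are assumptions for illustration, not the study's N3C pipeline:

```python
def auc(scores, labels):
    """Rank-based AUC: probability a positive case outranks a negative one
    (ties count as 0.5), equivalent to the Mann-Whitney U statistic."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    if not pos or not neg:
        return float("nan")   # AUC undefined with only one class present
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def subgroup_auc(scores, labels, groups):
    """AUC computed separately within each demographic subgroup."""
    out = {}
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        out[g] = auc([scores[i] for i in idx], [labels[i] for i in idx])
    return out
```

Comparing these per-subgroup values across cohort definitions is one simple way to surface the equity effects the study reports.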


Enhancing Prediction of Systolic Heart Failure Mortality through Genetic Algorithm-Optimized Laboratory Tests

Priscille Ngana (College of Public Health), Bhumi Patel, Janusz Wojtusiak

Heart failure is a leading cause of ICU admissions and is associated with high mortality, necessitating robust risk stratification to identify high-risk patients and guide care. Prior studies have applied machine learning to predict mortality and readmissions in heart failure populations. Leveraging clinical and laboratory data, our study addresses variability in laboratory test readings across time intervals. We aimed to optimize the timing and combination of these variables to identify the most informative measurements for improving mortality prediction in systolic heart failure. We utilized data from the MIMIC-IV database. The final cohort comprised 1,037 adult patients admitted to the ICU with systolic heart failure between 2008 and 2019, each with an ICU stay of >24 hours. Input variables included 48 demographic and clinical features, of which 38 were laboratory tests. A Genetic Algorithm (GA) was used to optimize feature selection and laboratory test timing for predicting systolic heart failure in-hospital mortality. Starting with randomly generated timing intervals, a 1000-generation evolutionary process (fitness evaluation, selection, crossover, mutation) was conducted to enhance model performance. GA optimization improved model performance, increasing AUC from 0.70 to 0.78 (range 0.71-0.87), outperforming traditional machine learning methods reported in prior studies (AUC ~ 0.70-0.80). The model also identified optimal timing windows for laboratory measurements and key mortality predictors. Genetic algorithm optimization of laboratory features and timing significantly enhances in-hospital mortality prediction in systolic heart failure. This strategy supports improved early risk stratification and has the potential to advance prognostic modeling in clinical practice.
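
A minimal genetic algorithm loop of the kind described (fitness evaluation, selection, crossover, mutation over candidate encodings such as feature-inclusion masks) might look like the toy sketch below. The bitstring encoding, truncation selection, and placeholder fitness in the usage note are assumptions for illustration, not the study's actual MIMIC-IV setup or fitness function:

```python
import random

def evolve(fitness, n_bits, pop_size=20, generations=100,
           mut_rate=0.05, seed=0):
    """Toy GA: evolve bitstrings (e.g., feature-inclusion masks)
    to maximize a caller-supplied fitness function."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]      # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)     # one-point crossover
            child = a[:cut] + b[cut:]
            for j in range(n_bits):            # bit-flip mutation
                if rng.random() < mut_rate:
                    child[j] = 1 - child[j]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

In the study's setting, the fitness of a candidate mask would be a model performance score such as AUC; here a placeholder fitness (agreement with a fixed target mask) stands in for that expensive evaluation.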


Transgenic expression of chimeric NMDA receptor GluN2 subunits reveals that ionotropic signaling links novelty exposure to hippocampal Arc expression

Hannah Zikria-Hagemeier (College of Humanities and Social Sciences), Guno D. Kletter, Diborah A. Gutema, Maryam S. Rafie, and Theodore C. Dumas

N-methyl-D-aspartate receptors (NMDARs) enable activity-dependent synaptic plasticity subserving learning and memory. Hippocampal NMDAR activation leads to expression of activity-regulated cytoskeleton-associated protein (Arc), marking contextual engram formation. Recently, it was shown that NMDARs produce postsynaptic signals via both ionotropic and non-ionotropic signaling. However, links between these separate NMDAR signaling pathways and Arc have not been investigated. We created transgenic mice expressing chimeric GluN2 subunits [swapped carboxy terminal domains between GluN2A and GluN2B; GluN2A-Bctd and GluN2B-Actd] to separate ionotropic from non-ionotropic contributions to synaptic plasticity, learning, and memory. We briefly exposed mice at different ages (with different background content of GluN2A and GluN2B) to a novel maze, then examined hippocampal Arc expression via in situ hybridization staining in CA1 and DG. Animals under three weeks of age, with a predominantly GluN2B background, expressing GluN2A-Bctd subunits displayed increased Arc levels compared to age-matched wildtype controls in both areas. Conversely, animals over three weeks of age, having a predominantly GluN2A background, expressing GluN2B-Actd subunits exhibited lower levels of Arc compared to controls. However, baseline levels of Arc expression in home cage controls differed across genotypes, with GluN2B-Actd-expressing animals showing more Arc expression. This suggests that NMDAR ionotropic signaling regulates Arc expression. Additionally, excessive baseline Arc expression might explain the impaired spatial learning in mice expressing GluN2B-Actd subunits, while greater activity-driven Arc levels might explain the superior long-term memory in mice expressing GluN2A-Bctd subunits. A better understanding of the signaling pathways between environment and contextual engram formation may provide more molecular targets to improve treatments for spatial memory impairments in disorders including Alzheimer’s disease and schizophrenia.


Stronger Families, Stronger Students: Supporting academic achievement through a parenting support group

Maribel Tohara Nakamatsu (College of Education and Human Development), Dr. Rachael Goodman, Alejandra Salazar Salame, Ph.D., Leonela G. Jabbour Yammine, Gozde Sensoy Murt

Strong parent-child relationships play a critical role in supporting academic success. Research (Shao & Kang, 2022) demonstrates that positive parent-child relationships help to improve academic outcomes, including higher motivation and stronger engagement in learning. However, parents whose own wellbeing is negatively impacted by systemic challenges (e.g., financial hardship, family separation, job insecurity, emotional stress) may have difficulty cultivating the relationships they aspire to have with their children. The ‘Parents Helping Parents: Building Strong Families’ support group was created through the initiative of Amigas de la Comunidad (Vesely et al., 2018), a group of Latina immigrant mothers, who sought to support other community members with mental health and parenting skills from a culture-centered framework. This project is unique because it uses a Community-Based Participatory Action Research (CBPAR) approach, ensuring that the experiences of community members guide its development and implementation. The support group was designed to strengthen parent-child relationships by empowering parents through shared experiences, guided reflection, and community connection. Approaches were designed for the community’s specific cultural and contextual factors, which created a supportive environment where participants could discuss challenges and provide support to one another. Group discussions, role-playing, and problem-solving exercises helped parents reflect on their practices and improve communication with their children. By strengthening family bonds and increasing parental confidence, the program supports children’s academic success. When parents feel prepared to provide emotional and academic support, students are more likely to thrive. Ultimately, this initiative promotes stronger families, greater parent engagement, and improved educational outcomes.


Don’t Believe the Hype(rsonic): Implications of Russia’s Novel Weapon Systems

Alvina Ahmed (Schar School of Policy and Government)

In March 2018, Vladimir Putin unveiled Russia’s five major novel weapon systems: Kinzhal, Avangard, Sarmat, Burevestnik, and Poseidon. Subsequently, the Tsirkon hypersonic ship-launched missile was unveiled, and a new intermediate-range ballistic missile possessing hypersonic speed, Oreshnik, was later introduced in 2024. The Kremlin’s rationale behind developing these new and sometimes dual-capable weapon systems was to possess capabilities that could counter U.S. precision strike munitions and evade U.S. and European missile defenses, in turn granting Russian forces both strategic advantages and operational superiority on the battlefield. Nevertheless, previous literature analyzing Russia’s use of these systems to carry conventional payloads in Ukraine suggests that the systems do not provide significant tactical utility. Therefore, this research asks the following questions: What were the Kremlin’s objectives in using the novel, dual-capable weapons systems in Ukraine? How well did it achieve these objectives? And what are the implications of Russia’s novel systems for NATO? The research conducts a cross-case analysis of five instances when Russian forces deployed some of the novel weapon systems in Russia’s ongoing war in Ukraine. The findings indicate that Russia primarily employed these weapons to conduct psychological warfare. The aim was to coerce Ukrainian forces to back down and to force the West to stop providing aid to Ukraine. The implication of Russia’s novel weapons for NATO is that they do not grant Russia a decisive strategic or tactical edge, since Russia’s legacy weapons already maintain a credible second-strike capability against Moscow’s adversaries.