My research investigates how artificial intelligence can deepen scientific understanding by reinterpreting classroom demonstration experiments through data-driven modeling. Originally trained as a physics educator, I gradually shifted toward computer science, and this hybrid perspective highlighted a growing educational gap: many secondary-school and university students experience physics primarily as abstract formulas, while opportunities for hands-on experimentation are often limited. To test this assumption, I conducted a survey with 144 participants. The results confirmed a strong demand for practice-oriented learning and for activities in which theory is connected to real measurements and analysis.
To bridge physics and IT, I develop demonstrations that can be integrated into computer science lessons. Students collect data from physical processes, build models, validate predictions, and—crucially—use modern AI methods to infer the underlying physical relationships. This approach supports an “indirect proof” of natural laws: instead of starting from equations, learners begin with data and reconstruct the governing rules, experiencing scientific reasoning as discovery.
The first experiment focuses on thermodynamics, using IoT sensors to measure indoor and outdoor temperature, air pressure, and background radiation. Because these signals form time series, we apply an LSTM (Long Short-Term Memory) neural network to predict environmental changes. The model achieves good predictive accuracy and reveals how indoor temperature responds to external thermal effects. Beyond prediction, the results allow students to estimate quantities such as building heat capacity and to formulate simplified thermodynamic descriptions of the system.
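To make the LSTM's role concrete for students, its internal mechanics can be shown in a few lines. The following is a minimal NumPy sketch of a single LSTM cell stepping through a toy window of three sensor channels; it is illustrative only and not the project's actual model (which would use a trained deep-learning framework), and the weights here are random rather than learned.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step: gates are computed from input x and previous hidden state h."""
    z = W @ x + U @ h + b              # stacked pre-activations for the four gates
    n = h.shape[0]
    i = 1 / (1 + np.exp(-z[:n]))       # input gate: how much new information enters
    f = 1 / (1 + np.exp(-z[n:2*n]))    # forget gate: how much old cell state survives
    o = 1 / (1 + np.exp(-z[2*n:3*n]))  # output gate: how much cell state is exposed
    g = np.tanh(z[3*n:])               # candidate cell state
    c_new = f * c + i * g              # updated long-term memory
    h_new = o * np.tanh(c_new)         # updated hidden state (the prediction features)
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4                     # e.g. temperature, pressure, radiation channels
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h, c = np.zeros(n_hid), np.zeros(n_hid)
series = rng.normal(size=(10, n_in))   # toy 10-step sensor window
for x in series:
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (4,)
```

In classroom use, the hidden state `h` would feed a small output layer predicting the next temperature reading; the forget gate is the natural hook for discussing how the building's thermal inertia is "remembered" by the model.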
The second demonstration examines kinematics through disc collisions recorded on video. Using shape and color recognition, we extract trajectories and analyze them to investigate conservation of momentum and energy. We apply the SINDy algorithm (Sparse Identification of Nonlinear Dynamics; Brunton et al.), which identifies interpretable differential equations directly from measurement data. The reconstructed motion is highly realistic and provides compelling evidence for students that an AI-discovered equation of motion can match observed reality. A deliberately data-driven configuration is used to let the model independently "rediscover" the physics with minimal prior assumptions.
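The core of SINDy is sparse regression over a library of candidate terms. The sketch below implements the sequentially thresholded least-squares (STLSQ) idea from scratch in NumPy on synthetic data, rather than using the `pysindy` package the project would likely employ; the exponential-decay example and the chosen threshold are illustrative assumptions.

```python
import numpy as np

# Synthetic data: x(t) = exp(-0.5 t), so the true law is dx/dt = -0.5 x.
t = np.linspace(0, 4, 200)
x = np.exp(-0.5 * t)
dx = -0.5 * x  # analytic derivative; real measurements would use finite differences

# Library of candidate terms: [1, x, x^2, x^3]
Theta = np.column_stack([np.ones_like(x), x, x**2, x**3])

def stlsq(Theta, dx, threshold=0.05, n_iter=10):
    """Sequentially thresholded least squares: the sparse regression at the heart of SINDy."""
    xi = np.linalg.lstsq(Theta, dx, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0                        # prune negligible terms
        big = ~small
        if big.any():                          # refit only the surviving terms
            xi[big] = np.linalg.lstsq(Theta[:, big], dx, rcond=None)[0]
    return xi

xi = stlsq(Theta, dx)
print(xi)  # only the coefficient of x survives, close to -0.5
```

Students see directly that the algorithm zeroes out every library term except `x` and recovers the coefficient −0.5, which is exactly the "indirect proof" experience: the governing equation emerges from data, not from a textbook.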
The third demonstration introduces the physics of music, an area often underrepresented in standard curricula despite its relevance in engineering (e.g., noise detection and predictive maintenance). Students record sounds digitally and use Python-based analysis to visualize Fourier spectra and Mel spectrograms, linking harmonics, decay, and signal profiles to physical interpretation and human perception.
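The spectral analysis step can be demonstrated with nothing beyond NumPy's FFT. The snippet below builds a toy tone (an assumed 440 Hz fundamental with a weaker octave harmonic, standing in for a recorded instrument) and locates the dominant frequency in its Fourier spectrum; Mel spectrograms would additionally need a dedicated audio library and are omitted here.

```python
import numpy as np

fs = 8000                           # sample rate in Hz (assumption for this sketch)
t = np.arange(0, 1.0, 1 / fs)       # one second of samples
# Toy "instrument" tone: 440 Hz fundamental plus a weaker 880 Hz harmonic
signal = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 880 * t)

spectrum = np.abs(np.fft.rfft(signal))          # magnitude spectrum
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)  # frequency axis in Hz

peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # 440.0
```

Plotting `freqs` against `spectrum` gives the Fourier view the demonstration describes: the fundamental and its harmonics appear as discrete peaks, and students can relate peak height and decay to timbre and perceived loudness.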
Future work aims to make “invisible” phenomena tangible, including muon detection from cosmic rays (supporting relativistic concepts such as time dilation) and magnetic levitation, where AI-based control methods may provide robustness beyond classical PID control. To maximize accessibility, all tools and workflows are planned for distribution via containerization (e.g., Docker), enabling schools and individual learners to reproduce experiments and analyses with minimal setup.
Overall, this project demonstrates how AI-enhanced, measurement-based activities can reconnect students with natural laws. By moving from data to models—and from models to physical meaning—learners gain both deeper conceptual understanding and modern computational skills.