The new version features enhanced automation, machine learning, and visualization tools that help researchers uncover human insights faster, deeper, and more intuitively.
COPENHAGEN, Denmark, Nov. 5, 2025 /PRNewswire-PRWeb/ -- iMotions, the world's leading software platform for human behavior research, today announced the release of iMotions 11, the latest version of its industry-defining platform used by more than 1,500 organizations worldwide.
Built on nearly two decades of innovation, iMotions 11 continues to elevate possibilities in behavioral research by making it faster, more flexible, and more intuitive. The new release refines everything researchers already love about the iMotions platform while expanding sensor support, simplifying complex workflows, and unlocking deeper insights into how people think, feel, and act.
"With iMotions 11, we've focused on what matters most to our users: flexibility, speed, and confidence," said Peter Hartzbech, CEO and Founder of iMotions. "With enhanced machine learning, advanced visualizations, and smarter automation, researchers can better capture and analyze behavior and decision making. iMotions 11 helps them see human behavior faster, deeper, and more clearly than ever before."
With iMotions 11, the company builds on the foundation of version 10, streamlining integrations, boosting automation, and elevating the entire user experience.
Expanded hardware support is one of the key enhancements, reinforcing iMotions' long-term commitment to a hardware-agnostic platform that lets researchers choose the tools that fit their study. iMotions 11 now integrates seamlessly with the latest biosensors and eye-tracking systems, including Smart Eye Pro v12 (introducing drowsiness and profile-ID signals), Pupil Labs Neon (with blink and fixation imports), and the full Biosignalsplux suite. The platform also adds native fNIRS support with industry-standard export options, making brain-oxygenation studies easier to run and analyze.
Workflow efficiency has also been reimagined. Researchers can now create Areas of Interest (AOIs) faster and more precisely with dynamic drawing options and improved axis detection. Surveys are richer and more flexible, supporting skip logic, multimedia, and remote participation. A completely new Data Visualization Dashboard provides a clean, customizable view of multimodal signals, giving researchers a clear window into every layer of their data.
For data analysis, gaze mapping now includes mouse tracking and calibration exports for more precise attention mapping. Affectiva metrics have been enhanced to detect speaking, adaptive engagement, and emotional valence shifts. The new Multiface Analysis feature enables emotion tracking for multiple participants simultaneously, which is ideal for studies of group behavior and interaction.
Additional upgrades include a searchable study library, optimized ECG peak detection, and dynamic API configurations, which increase responsiveness and customization.
Several videos highlight the increased possibilities:
- In a lab demonstration, eye tracking and facial expression analysis are used together to quantify engagement, capturing both conscious and nonconscious reactions in real time.
- A second lab study showcases a multimodal setup that synchronizes ECG, respiration, voice analysis, eye tracking, and facial expression analysis, demonstrating how physiological and emotional data can combine to reveal deeper insights into stress, attention, and cognitive load under controlled conditions.
- A demonstration outside the lab, which will be available later this week, uses eye-tracking glasses, respiration, ECG, GPS, and voice analysis to analyze how drivers react to real-world stimuli, exploring focus, fatigue, and emotional states behind the wheel.
About iMotions
Founded in 2005 and headquartered in Copenhagen, iMotions has developed the world's leading human behavior research software platform. More than 1,500 organizations around the world – from leading academic institutions to global brands and highly respected healthcare organizations – use iMotions to access real-time and nonconscious emotional, cognitive and behavioral data. By integrating and synchronizing all types of sensors into a single platform, iMotions provides researchers with access to deeper and richer insights – and the most complete picture of human behavior. For more information, visit iMotions.com.
Media Contact
Todd Graff, Graff Communications for iMotions, 1 617-309-0401, [email protected]
SOURCE iMotions