At NexusCon 2025, April Yi, Director of Digital Engineering at Microsoft, walked through how her team is using machine learning to dynamically control HVAC start and stop times across roughly 50 buildings. The challenge wasn’t installing new hardware—it was replacing static schedules with predictive, data-driven operations that actually reflect when people show up.
Using indoor and outdoor temperature data, badge swipes, Wi-Fi session data, and HVAC telemetry, Microsoft built a system that forecasts occupancy and pushes optimized schedules directly into the BMS. The session focuses on what this looks like in real buildings, not a lab or a pilot slide.
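To make the idea concrete, here is a minimal sketch of occupancy-driven start-time scheduling. This is not Microsoft's actual model; it assumes a naive forecast (averaging recent first-badge-swipe times for a weekday) and a fixed warm-up lead time, purely for illustration:

```python
from statistics import mean

def predict_first_arrival(arrival_minutes_history: list[int]) -> int:
    """Naive occupancy forecast: average of recent first-badge-swipe
    times (minutes after midnight) for the same weekday."""
    return round(mean(arrival_minutes_history))

def hvac_start_minute(predicted_arrival_min: int, warmup_min: int = 90) -> int:
    """Schedule the HVAC start far enough ahead of predicted occupancy
    to complete the warm-up ramp before people arrive."""
    return max(0, predicted_arrival_min - warmup_min)

# Hypothetical data: first badge swipes over four recent Mondays,
# expressed as minutes after midnight (~7:38-7:52 AM)
history = [465, 472, 458, 469]
arrival = predict_first_arrival(history)
start = hvac_start_minute(arrival, warmup_min=90)
print(f"Predicted first arrival: {arrival // 60:02d}:{arrival % 60:02d}")
print(f"HVAC start pushed to BMS: {start // 60:02d}:{start % 60:02d}")
```

A production system would replace the averaging step with a trained model over the richer inputs described above (temperature, Wi-Fi sessions, HVAC telemetry) and refresh its predictions daily, but the control decision it emits is the same shape: a start time written into the BMS schedule.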
Behind the paywall, April gets specific about what worked, what didn’t, and what surprised them once the system was live. You’ll hear how close the predictions were to actual ramp times, how Microsoft validated results with simple dashboards instead of over-engineered analytics, and why continuous daily model refreshes mattered more than perfect accuracy.
The talk also digs into the organizational realities—data reliability, model trust, and why removing manual processes was as important as the energy savings themselves. If you’re running large portfolios and still relying on fixed schedules, this is a concrete look at what it takes to operationalize ML without breaking comfort or ops trust.
Watch the full recording inside Nexus Pro →

Head over to Nexus Connect and see what’s new in the community. Don’t forget to check out the latest member-only events.
Go to Nexus Connect
Join Nexus Pro and get full access, including invite-only member gatherings, access to the community chatroom Nexus Connect, networking opportunities, and deep dive essays.
Sign Up