News · 5 min read · By Brad Bonavida

Stanford's Lesson from Migrating Its Utility Historian: The Technical Work Only Matters If Stakeholders Trust the Data

April 6, 2026

When AVEVA acquired OSIsoft and announced it would sunset eDNA, the utility data historian Stanford University had run for 10 years, Chris Guest, Manager of the Sustainability and Utilities Infrastructure Business Systems team, was tasked with migrating to OSIsoft PI. Three years in, the university has learned valuable lessons about what it actually takes to make an energy data migration succeed.

Stanford's historian feeds intercampus billing, real-time energy curtailment monitoring, and sustainability reporting across a campus with a 53 MW electrical feed from PG&E and a central cooling plant serving research labs and healthcare facilities. When the data is wrong, the people who depend on that data notice.

Two migration challenges proved that. The first involved compression and exception settings: PI features that discard data points falling within a tolerance band between two existing values, on the assumption that they're redundant and can be interpolated. A bug in the migration code incorrectly configured those settings, creating discrepancies in the historical record that were difficult to trace.
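PI's actual archive logic is more involved (a swinging-door test against interpolated values), but the core idea of discarding points inside a tolerance band can be sketched as a simple deadband filter. This is an illustrative stand-in, not PI's API, and the numbers are invented:

```python
def deadband_filter(samples, tolerance):
    """Keep only samples that deviate from the last kept value by more
    than `tolerance`. A simplified stand-in for PI-style exception/
    compression: points inside the band are treated as redundant,
    on the assumption they can be recovered by interpolation."""
    if not samples:
        return []
    kept = [samples[0]]
    for value in samples[1:]:
        if abs(value - kept[-1]) > tolerance:
            kept.append(value)
    return kept

# Small readings near 10.0 are dropped; real movement is kept.
print(deadband_filter([10.0, 10.05, 10.2, 10.21, 9.5], 0.1))
# [10.0, 10.2, 9.5]
```

The catch the migration hit is visible even in this toy: if `tolerance` is set wrong, the archive silently loses detail, and the loss only shows up later as unexplained discrepancies.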

The second was subtler. A researcher studying chilled water peak demand noticed that before a specific date in February 2023, peak demand always appeared around 8 or 10 in the morning. After that date, it showed up around 3 or 4 in the afternoon (a more plausible time for peak demand). The cutover date matched exactly when Stanford put its first interface in place. The old historian had been storing data in Pacific time; the migration team assumed UTC and converted it, shifting 10 years of timestamps by seven to eight hours.

"Things that you take for granted, time zones, daylight savings time, those kinds of details come out, and they can bite you," said Chris Guest, Manager of the Sustainability and Utilities Infrastructure Business Systems team at Stanford University.

Those challenges compound. "They get that skepticism in their mind, and then they have an actual problem like this," Guest said. "If your stakeholders aren't using that data in a way that they're confident about it, you're not gonna get the value out of it."

Stanford is now running an 18,000-point data validation effort with student data scientists and the third-party firm Oros, and building a data standards document with DB Engineering that defines expected behavior for each point (units, reporting frequency, compression settings, step vs. interpolated) so that fault detection can run on the data itself.
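A per-point standard like that can live as a small machine-readable record that automated checks run against. The field names and tag below are hypothetical, not Stanford's or DB Engineering's actual schema:

```python
# Hypothetical per-point standard, modeled on the fields the article
# lists; the tag name and values are illustrative.
POINT_STANDARD = {
    "tag": "CHW.PLANT.FLOW",
    "units": "gpm",
    "reporting_interval_s": 60,
    "compression_deviation": 0.5,
    "interpolation": "linear",   # vs. "step"
}

def check_reporting_gaps(timestamps, standard, slack=3.0):
    """Flag gaps longer than `slack` times the expected reporting
    interval: a basic fault-detection check on the data itself."""
    limit = standard["reporting_interval_s"] * slack
    return [
        (a, b)
        for a, b in zip(timestamps, timestamps[1:])
        if b - a > limit
    ]

# Seconds since midnight for one point; the 120 -> 600 jump is a fault.
print(check_reporting_gaps([0, 60, 120, 600], POINT_STANDARD))
# [(120, 600)]
```

Once every point carries a record like this, "the data looks wrong" becomes a testable claim instead of a stakeholder's hunch.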

Guest's lesson for anyone starting a similar migration: begin with the business case, not the data inventory. For one source system with 8,000 points, the energy operations team wanted everything brought in. Half still aren't fully cataloged. "I wish we had started more from the business case," he said.

Technical migrations are inevitable. Whether they deliver value depends entirely on whether the people using the data trust it.

Watch the full recording.

Register for the next Nexus Labs event.

Sign up for the newsletter to get 5 stories like this per week:


