Article · Nexus Pro · 16 min read
James Dice

Defining and exploring the Independent Data Layer

May 18, 2021

Hey friends,

As a follow-up to two recent newsletters, Where are the API-first companies? (last week) and The API-first Data Layer (today), this members-only deep dive goes deeper: it defines the independent data layer, explores the nuances and unanswered questions, and updates the Nexus Vendor Landscape with this new category. If you haven’t read those two essays yet, please do so first.

Enjoy and let me know what you think!

“Middleware”…

“Data lake”…

“Data aggregation layer”…

“Data infrastructure layer”…

“Data transformation layer”…

“Infrastructure as a Service”…

We love to make up new terms and acronyms in our industry, and the new wave of data layer terms is no different. I’m not sure where the term “Independent Data Layer” (IDL) came from, but it’s my preference, folks, and I’m sticking with it.

So what is it?

Terry Herr1 defines it as “an edge connectivity layer / middleware that is independent of the applications or application layer”. As he recently recounted in the March Pro member gathering, the need for this layer arose with the advent of cloud-based energy analytics software. These software providers needed meter, sensor, and actuator data from the building.

Early offerings like Building Robotics’ (now Comfy) Trendr, Lynxspring’s BACnet data pump, Sierra Monitor’s IoT gateway, and the open-source VOLTTRON did just that. But as we’ll see, that original need has expanded beyond simply pumping BAS/meter data into the cloud.

A good way to explain where we are today is the gap Brian Turner talked about on the Nexus podcast:

On one side you have modern enterprise (cloud) technologies (data lakes, middleware, applications) and on the other you have on-premise systems. You fill the gap between them with a domain-specific, web-developer-grade API.

And each vendor in this category is going to approach filling that gap differently.
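
To make that gap feel concrete, here is a rough sketch of what a domain-specific, web-developer-grade API could look like from an application developer’s seat. Every URL, endpoint, and field name below is hypothetical, invented for illustration rather than taken from any vendor’s actual API.

```python
# Hypothetical IDL API sketch: endpoints, parameters, and response fields are
# invented for illustration only.
import requests

BASE = "https://idl.example.com/v1"            # hypothetical IDL cloud API
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credential

# 1. Discover equipment without knowing anything about the BAS underneath.
ahus = requests.get(f"{BASE}/equipment",
                    params={"type": "ahu", "site": "hq"},
                    headers=HEADERS).json()

# 2. Pull a week of trend data for one point on the first AHU.
point_id = ahus[0]["points"]["discharge_air_temp"]
trend = requests.get(f"{BASE}/points/{point_id}/timeseries",
                     params={"start": "2021-05-01", "end": "2021-05-08"},
                     headers=HEADERS).json()

# 3. Write a setpoint back down through the same abstraction.
requests.post(f"{BASE}/points/{point_id}/commands",
              json={"value": 55.0, "priority": 8},
              headers=HEADERS)
```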

Capabilities of IDL Products

Therefore, we can also define it in terms of capabilities, some of which are required to fill the gap and all of which vary widely across today’s vendor landscape.

  1. Drivers and pre-built integrations—From simple BACnet and Modbus to proprietary drivers to IoT device connectivity. Schlep work, part 1.
  2. Trending and edge compute—From network scanning to caching, device monitoring, and management capabilities. Schlep work, part 2.
  3. Data model / metadata database—The option to make the IDL the single source of truth for what the data means. Options range from the type of data model to the automated (as much as possible) modeling capabilities. (See the sketch after this list.)
  4. Time series database—The option to store data in the IDL cloud or simply perform routing.
  5. Two-way communication / control—The option for bi-directional communication on-premise and the ability for the edge device to execute advanced supervisory control algorithms.
  6. Cloud API abstraction—The option/ability for the IDL’s API to abstract away all the other cloud APIs. Useful for integration, but also allows the IDL to be the glue between apps in multi-app use cases.
  7. User interface—The option for the user to log into a simple interface to perform administration and make changes to 3D graphics.
  8. API—This will be each company’s special sauce and core differentiator. How simple can you make the schlep work for the layers above you?
  9. Turnkey service—Integrations are notorious for being unreliable over time. Shaun Cooley called this the “day 2 problem”. The value of someone handling this layer—so that others don’t need to worry about it—is massive.
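
To ground capability 3, here is a minimal sketch of what making the IDL the single source of truth for point meaning might look like. The tag names and relationships are illustrative, loosely in the spirit of Project Haystack and Brick rather than any particular vendor’s schema.

```python
# Illustrative metadata model: tags and relationships are invented examples.
from dataclasses import dataclass, field

@dataclass
class Equipment:
    id: str
    tags: set
    feeds: list = field(default_factory=list)  # e.g. an AHU feeds its VAV boxes

@dataclass
class Point:
    raw_name: str                               # whatever the BAS called it
    tags: set = field(default_factory=set)
    unit: str = ""
    equip_ref: str = ""                         # link back to parent equipment

equipment = {
    "ahu-1": Equipment(id="ahu-1", tags={"ahu", "equip"}, feeds=["vav-2-01"]),
}
points = [
    Point(raw_name="AHU1_DAT", tags={"discharge", "air", "temp", "sensor"},
          unit="degF", equip_ref="ahu-1"),
    Point(raw_name="AHU1_SF_SS", tags={"supply", "fan", "run", "cmd"},
          equip_ref="ahu-1"),
]

# An application can now ask a semantic question instead of guessing at names:
discharge_temps = [p for p in points if {"discharge", "air", "temp"} <= p.tags]
print([p.raw_name for p in discharge_temps])
```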

While this layer started out very simple, you can see that beyond the range of capabilities, there are also a ton of nuances to explore and understand. Let’s explore a few of them.

Exploring the nuances

As with any layer of the smart building stack, the devil is in the details. From my vantage point, abstracting the details away with a few lines of code, as is the API-first promise, is still a few years and many unanswered questions away.

So let’s dive into the questions, rapid-fire style. When you’re done reading these, you’ll understand the edges of the problem and my perspective on them. As usual, I make no guarantees on the right answers!

How big is the market?

On one hand, you could say every building needs an IDL. But I’d argue that a building owner really doesn’t feel the pain of not having it unless they have a ton of buildings and/or they’re pretty far along the smart building journey. And that limited and/or delayed pain could really limit the market for the IDL vendor.

Put simply, the business value of the IDL is exponentially greater when you have:

  • Multiple on-prem, siloed systems you’re connecting
  • Multiple use cases you’re enabling
  • The need/desire to switch application providers

In other words, if I’m just doing analytics on top of HVAC and metering, which honestly is the state of the art right now, how big of a deal is not having the IDL? Techy types like to say “the future is here, it’s just not evenly distributed”. That cliche applies here too. How long will the pain take to distribute?

Should the data be “transformed”?

Among Nexus Pro members, there seems to be a bit of a debate on whether the IDL should be just an abstraction tool for on-premise systems (capabilities 1 through 3 above) or whether it should also have the full suite of capabilities (1 through 9 above).

The former camp seems to think that the IDL should stay focused on integration with on-prem systems and leave everything else up to the application providers.

My take is similar to the one Brian Turner shared at the March gathering:

“It’s becoming less and less of a problem to get data out, especially with the new technologies and architectures implemented. And so the way we're thinking about things is less about how do I get data out of a 20 year old system that probably needs to be modernized anyway, and really focus on the newer technologies. And, and now that we have sensor clouds and all of these different streams of data, it's no longer coming from a BMS per se, it's coming from all over the place. It's now extremely important to understand how that data all relates. That's where the transformation comes in.”

I agree. I think this layer needs to go beyond basic integration in two ways:

  1. I think the data modeling and transformation will be where the true value lies for this layer, because that’s where you’re really enabling more than one app and more than one use case, and creating the relationships between producers and consumers that enable the network effects and other benefits of being API-first. (A simplified sketch of this transformation follows below.)

  2. I think this layer needs to be considered a software product in its own right—this isn’t as simple as a data feed with context included. As Andrew Rodgers said on the podcast:

What our product brings is the tools that allow us to deploy faster, collect data better and more efficiently. We monitor for changes in the network, monitor the hardware platforms we deploy, and maintain the health of those hardware platforms. We're really selling a service around reliable data acquisition. Volttron plays a central role in that, but it's really the suite of the monitoring, maintenance, and all that stuff that we're offering on top of Volttron that is really where our value comes from.

I think the companies that win this layer will build their product better than anyone else because of their singular focus.
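
As a concrete (and heavily simplified) illustration of the transformation in point 1 above, here is a sketch of mapping raw, vendor-specific point names onto a normalized vocabulary. The rules and names are invented; a real portfolio needs far larger rule sets plus human review.

```python
# Toy normalization rules: patterns and tag sets are illustrative only.
import re

RULES = [
    (re.compile(r"DAT|DISCH.*TEMP", re.I), {"discharge", "air", "temp"}),
    (re.compile(r"ZN[-_]?T|ZONE.*TEMP", re.I), {"zone", "air", "temp"}),
    (re.compile(r"SF[-_]?SS|SUP.*FAN.*(CMD|SS)", re.I), {"supply", "fan", "cmd"}),
]

def normalize(raw_name: str) -> set:
    """Return a normalized tag set for a raw BAS point name (empty if unknown)."""
    for pattern, tags in RULES:
        if pattern.search(raw_name):
            return tags
    return set()  # the messy leftovers still need a human

# Point names arriving from three silos with three different naming habits:
for raw in ["AHU1.DAT", "VAV_2_01 ZN-T", "RTU-3 SupFanCmd", "MISC_PT_47"]:
    print(raw, "->", normalize(raw) or "needs manual review")
```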

Will software vendors say “me too”?

As we’ve discussed at length, it’s very confusing to go shopping for smart building software. Even if a shopper understands the value of this layer, I predict that API-second application providers will claim that they do it too. And they will probably be at least somewhat correct. After all, it’s already part of their stack.

My take is that it will take some time for the advantages of the API-first strategy to kick in. It’s like a flywheel that takes a long time to start spinning, but once it does it’s nearly unstoppable.

But we’re not quite there yet.

Specifically, I think digital twin companies will lay claim to this layer because the value propositions have heavy overlap. And vice versa: I think IDL companies can and will begin to compete with API-second digital twin offerings. And let the buyer confusion continue.

Is the value of AI overhyped?

Getting data from buildings is hard no matter how you do it. IDL vendors will claim to do it faster and better with modern machine learning (ML) technology, reducing the need for engineers that specialize in integration and understanding how building systems work.

I’m a skeptic on this. Parts of this process are messy, resistant to automation, and still require a ton of integration work by people who know building systems. The power of ML to make it easier is unproven and unquantified. As always, I’m happy and eager to be proven wrong here. But if I’m right, IDL companies’ growth will be limited by how many integration engineers they can hire away from others.
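
For context, the sketch below shows the general shape of what vendors tend to mean by ML here: a classifier that guesses a point’s role from its name, trained on a handful of invented examples. It illustrates the idea, not any vendor’s actual pipeline, and the messiness I’m describing shows up precisely in the names this kind of model can’t be trusted on.

```python
# Toy ML-assisted point classification: training names and labels are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_names = ["AHU1_DAT", "AHU2 DischAirTemp", "VAV_ZN_T", "Zone Temp 2-01",
               "SF_SS", "SupFanCmd"]
train_labels = ["discharge_air_temp", "discharge_air_temp",
                "zone_air_temp", "zone_air_temp",
                "supply_fan_cmd", "supply_fan_cmd"]

# Character n-grams cope with abbreviations better than whole-word features.
model = make_pipeline(CountVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
                      MultinomialNB())
model.fit(train_names, train_labels)

print(model.predict(["RTU3_DA_TEMP", "FlrTwoZnTmp"]))  # guesses it has clues for
print(model.predict(["MISC_PT_47"]))                   # it guesses here too, with no real basis
```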

Is this a new form of vendor lock-in?

While the IDL gives you choice at the application layer, one could argue that, depending on the role it’s playing and its capabilities, it still creates significant vendor lock-in. In other words, you’re reducing vendor lock-in at one layer and severely increasing it at the data layer.2

This is the double-edged sword of this layer: approaching this as a complete product and filling the gap makes you the best of the best, BUT it also makes it harder to replace you. As we’ll get into below, interoperability standards could help with this.

Can reality match the theory?

There are at least a few software application providers that are skeptical of whether IDL vendors can fulfill the value proposition at the heart of their offering. The main concern is whether the IDL’s processes and data model are robust enough to support the use cases sitting on top of it.

One example is fault detection and diagnostics (FDD). To do it right, you need a robust data model and you need to collect a lot of metadata—sequences of operations, design parameters, equipment sizes, etc.—that isn’t necessary for other use cases. It’s not as simple as collecting data and throwing some tags on it.
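
To make the FDD example concrete, here is a hedged sketch of the extra context an FDD engine typically needs on top of tagged trend data. All field names and values are illustrative.

```python
# Illustrative FDD context: field names and values are invented examples.
from dataclasses import dataclass

@dataclass
class AhuFddContext:
    # Enough for basic analytics:
    points: dict                        # normalized role -> point id
    # Extra context an FDD engine typically also needs:
    design_supply_cfm: float            # design parameters / equipment sizes
    cooling_coil_capacity_tons: float
    min_outdoor_air_fraction: float
    sequence_of_operations: str         # e.g. when the economizer should run

ahu1 = AhuFddContext(
    points={"discharge_air_temp": "pt-101", "mixed_air_temp": "pt-102"},
    design_supply_cfm=12000.0,
    cooling_coil_capacity_tons=40.0,
    min_outdoor_air_fraction=0.2,
    sequence_of_operations="economizer enabled below 65 degF outdoor air temp",
)

# A rule like "economizing when it shouldn't be" can't fire without both the
# sequence of operations and the outdoor-air design values above.
print(ahu1.sequence_of_operations)
```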

This hurdle will be an important one because IDL vendors need application providers to bring them in as their partner for that layer.

Does the IDL really solve the interoperability problem?

I can see it now: because an IDL can bring together silos and provide an API on top of them, someone will claim that their solution solves our collective interoperability problem. And while that might be halfway true, data infrastructure is a separate issue from the independent interoperability standard we know we need.

That said, an IDL vendor can play a significant role in advancing existing standards by (1) maintaining conformance to the standards as they progress and (2) updating the standards as they extend them.

The question is: what will force them to do that? I think we as a community and building owners need to force the issue here… because this solves the double-edged sword problem above. Interoperability standards are the only way to avoid significant vendor lock-in at this layer AND get the advantages of making it API-first.
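
As a rough sketch of what conformance could buy the buyer, here is one way an IDL might keep its internal model exportable in a standards-shaped form so the layer itself stays replaceable. The dictionary keys and structure are illustrative, not any official schema.

```python
# Illustrative export of an internal model to a portable, standards-shaped
# document; keys and structure are invented for the example.
import json

internal_model = {
    "ahu-1": {"tags": ["ahu", "equip"], "feeds": ["vav-2-01"],
              "points": {"discharge_air_temp": "pt-101"}},
}

def export_portable(model: dict) -> str:
    """Serialize the internal model so another IDL (or an app) could ingest it."""
    rows = []
    for equip_id, equip in model.items():
        rows.append({"id": equip_id,
                     "markers": equip["tags"],
                     "refs": {"feeds": equip["feeds"]},
                     "points": equip["points"]})
    return json.dumps({"rows": rows}, indent=2)

print(export_portable(internal_model))
```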

When will we get a true app store?

An app store is a natural extension of the IDL but is total hype at this point.

Key stakeholders and what this means for them

On the recent podcast episode with Shaun Cooley, we unpacked the different types of stakeholders the IDL will impact.

It seems like the early adopters will be big companies with lots of real estate and data science teams that want to analyze the data in-house (e.g., Google). For them, it’s a no-brainer... as long as the API meets their needs.

Then there are the building owners without the internal staff to use this layer or, frankly, to grasp the need for it. They need someone to bring them a whole product—so the IDL vendor will need to partner with whoever puts the full solution together.

That brings us to other vendors. Master systems integrators (MSIs) are perfect partners for IDL vendors—if they can be convinced to transform a business model that sells projects and therefore benefits from the absence of an IDL. So are application providers that want to focus on developing software rather than integration—if they can be convinced to give up that revenue and the IDL can do the work better than they could internally.

My hope is that the IDL vendors create the tooling for these partners to be able to set up the IDL on their own. We can’t afford to wait for the IDL vendors to do this as a centralized entity and we don’t want one company to be the bottleneck.

Conclusions and takeaways

Hopefully, you’ve enjoyed my synthesis here. And hopefully, it’s obvious that almost none of these ideas are my own and I’m grateful to our members and podcast guests for enlightening me on how this space is developing. That said, let’s try to distill the state of this layer into a few concluding bullets:

  • As more and more technology use cases that depend on the same data gain traction, the IDL is the right thing to do from the building owner’s perspective… so big building owners standardizing on one of these vendors is highly likely to occur.
  • Once they do, that starts the IDL flywheel, making it easier for these vendors to serve less sophisticated building owners and making it appetizing for full-stack software providers to hand over a big chunk of their stack.
  • The industry needs the best of the best applications to win out. For that to happen, app developers need to focus on what they do best and be okay with being part of the stack rather than the whole thing.
  • To do that, this layer needs to be a full-fledged product—not just an unopinionated open-source layer. That doesn’t mean parts of it can’t be open source (e.g., VOLTTRON).
  • This layer, and the applications sitting on top of it, need to contribute to and converge on interoperability standards. That will allow the IDL itself to be switched out more easily, giving building owners choice and letting the best product win at this layer too.

What are your thoughts and takeaways?

Discuss on Nexus Connect

The Nexus Vendor Landscape has been updated with this new category. Check it out for the IDL companies I know about today. As usual, we’ll keep it updated as things evolve and we get your feedback.

Check out the landscape

1. It may have come from him!

2. On the podcast, Andrew Rodgers talked about the value of VOLTTRON to avoid some of the potential handcuffs:

The value of being in an ecosystem where there are other people using that technology, where we have this plausible presentation to our customers that, yes, we’re a small company, but we’re using this technology that is open and there are other providers for it. So if you decide that we’re not delivering, you can fire us, and you’re not left holding a bag of some proprietary crap that you can’t do anything with.
