Reasonable people can argue about just how much of today’s information technology is deployed in the cloud (5%? 20%?), but no one can doubt that hybrid cloud is the buzzy topic of the moment.
In the last few weeks alone, a slew of tech news has centered on hybrid cloud. For example, when Hewlett-Packard (HPQ), now Hewlett Packard Enterprise (HPE), said it was discontinuing its Helion public cloud, it stressed that it will instead support customers who want to use Amazon Web Services (AWS) or Microsoft Azure as the public cloud portion of their hybrid cloud deployments. HP most assuredly hopes that the private cloud component will run on its latest release of Helion OpenStack Cloud.
And, on Tuesday, when IBM (IBM) said it was buying Gravitant, a cloud broker company, one stated goal was to ease hybrid cloud deployments. On Wednesday, when Microsoft (MSFT) and Red Hat (RHT) finally made nice and said they’d jointly support Red Hat Enterprise Linux running on Azure, executives cited the need to (stop me if you’ve heard this before) ease hybrid cloud deployment.
For those who aren’t in the weeds on these things, a public cloud, such as AWS, Microsoft Azure, or Google (GOOG) Cloud Platform, comprises a huge shared trove of computing, storage, and networking infrastructure run by those companies and offered for rent to customers.
A private cloud, on the other hand, comprises those same computing, storage, and networking components, but they’re dedicated to a single customer. Ideally, a private cloud offers users the same ability to turn resources on and off as needed, so departments within a company can be billed for that use accordingly. And that “no sharing” aspect is reassuring to customers in financial services, healthcare, and other industries where concerns about data security and compliance with regulations are paramount.
In theory, hybrid cloud takes the best from both worlds: Mission-critical “stuff” stays on private resources, while less sensitive data or applications can run beyond the firewall in someone else’s cloud.
“Hybrid capability is becoming the key factor of enterprise cloud adoption,” said analyst Janakiram MSV, founder of Janakiram and Associates. “With legacy workloads that run on-premises, only the front-ends and clients are making it to the public cloud. Those large Oracle databases and SAP workloads will not move to the cloud in the near future. The surrounding applications that depend on these workloads are moving to the public cloud with a low-latency connectivity to the enterprise data center.”
The need to put specific tasks on the infrastructure best suited to them, and the ability to move them when necessary, will be a big topic of conversation at the Structure Conference in a few weeks. High on the agenda will be a set of new technologies (containers, container management, and orchestration) that promise to make that portability easier.
At the show, Mesosphere chief executive and founder Florian Leibert will discuss the need to build IT implementations that span data centers and the cloud; Docker senior vice president of engineering Mariana Tessel will run through best practices for deploying Docker containers so that applications and data can be more portable; and Tim Kimmet, vice president of platform and systems for WalmartLabs, the technology arm of the Walmart (WMT) retail colossus, will go over the decision to open source some of the company’s own key technology to help other companies avoid cloud lock-in.
And Urs Hölzle, Google’s senior vice president of infrastructure and technical fellow, will also be on hand to talk about how the cloud of 2020 will differ dramatically from today’s cloud and why.