The tech world is enamored of containers, a new technology exemplified by fast-rising startup Docker that packages applications in a resource-efficient, portable way. The advantage for businesses is that containers can run applications on less hardware, and those applications can pull data from many sources. But, perhaps most important, those applications can be moved from one set of infrastructure to another with minimal muss and fuss.
Sets or clusters of containers can be scheduled and run on a given cloud with tools like the newly available Google Container Engine, the Google-backed Kubernetes, or the Amazon EC2 Container Service. But now developers have their eye on the next frontier: deploying container clusters across different clouds—something IBM (IBM) says it has accomplished. A team at IBM Research, working with Moustafa AbdelBaky, a PhD candidate at Rutgers University’s Discovery Informatics Institute, used open-source technology called CometCloud as the basis of this work, which AbdelBaky calls C-Ports.
Being able to run containers across clouds and geographies can address several enterprise concerns. For example, it can ensure that a particular application and its associated data stay within a set region, which is important given data sovereignty rules. Or it can parcel out work across regions and clouds so that if there’s a data center issue in one area, life can go on. Or if a company runs out of capacity in one situation, it can “burst” that job to another.
According to an IBM blog post about the work, C-Ports has proven itself in at least one situation.
That demonstration spanned four environments: Bluemix, IBM’s cloud development environment; Chameleon, a cloud used for large-scale testing; FutureSystems, an academic cloud; and Amazon (AMZN) Web Services, the market-leading public cloud.
Parceling out containers on multiple environments is indeed an impressive feat if it works as advertised. “The whole premise behind containers is portability,” said David Mytton, CEO of London-based Server Density, who follows cloud developments closely. The ability to run workloads across cloud computing environments is an attractive proposition for big customers that want to take advantage of the best infrastructure to suit their needs and not get locked into any one cloud provider.
But Mytton and other cloud watchers want to hear more about how this works in the real world. “Cloudbursting is harder than people think because of network and storage constraints. I’d like to see more about how this project addresses those issues,” said Sebastian Stadil, CEO of San Francisco-based Scalr.
IBM, which did not respond to Fortune’s request for additional comment, is not alone in this quest for distributed container deployment. The nascent Google (GOOG) Ubernetes project aims to enable Kubernetes container clusters to share jobs across clouds. Rancher and Apcera promise similar capabilities.
As container use proliferates—and it will—look for more talk about the need to deploy and manage containers across clouds. After all, that’s what’s needed for them to fulfill their promise of portability.
Note: This story was updated August 27 at 7:28 a.m. to reflect that Rancher and Apcera promise similar cross-cloud container capabilities.