Supercomputers are coming to Microsoft’s Azure cloud computing service.
Now, companies won’t need to buy a Cray supercomputer and house it in their own data centers.
“The way it works, is that a customer will have a dedicated Cray supercomputer in an Azure datacenter, which they pay for over time,” a Cray spokesperson said in an email. “Cray will work with each customer to architect a system, including storage, to match their business and application needs.”
The two companies did not say when the supercomputers would be available through Microsoft, nor did they share any pricing details.
Organizations like the U.S. Department of Energy typically use supercomputers to crunch tremendous amounts of data for tasks like predicting natural disasters and forecasting the weather.
Although intensive data crunching is generally done by research organizations and government agencies, Microsoft and Cray hope to convince more traditional businesses to use supercomputers, which can handle cutting-edge artificial intelligence techniques like deep learning. For example, the two companies said that pharmaceutical firms could use the supercomputers for genome sequencing, while automotive companies would be able to simulate crashes.
“Our partnership with Microsoft will introduce Cray supercomputers to a whole new class of customers that need the most advanced computing resources to expand their problem-solving capabilities, but want this new capability available to them in the cloud,” Cray CEO Peter Ungaro said in a statement.
Besides Microsoft, several other companies pitch their respective technologies as the preferred way to crunch data.
Google, for example, built its own computer chip for machine learning tasks and now rents customers access to it through its cloud computing service. Chipmaker Nvidia also rents access to its chips via cloud computing providers for deep learning projects, among other things.
Story updated Oct 25 at 10:50 AM PT with additional information on pricing.