Microsoft has already talked a bit about its new computing architecture that makes more extensive use of field programmable gate array (FPGA) chips as well as conventional microprocessors to power its Azure cloud. But on Monday, the tech giant is revealing more details about that deployment.
Essentially, Microsoft (msft) researchers say that treating FPGAs as the “front door” to each server can expedite processes and even offload some key jobs from the server’s main microprocessor chip. For prospective business customers, that could mean that Microsoft Bing would return better, more relevant Internet searches based on a wider array of sources without adding delays.
Additionally, complex computing tasks required by artificial intelligence (AI) applications would be accelerated, according to two Microsoft professionals involved in this work: Derek Chiou and Sitaram Lanka. Chiou, Microsoft partner hardware engineering manager, will present a research paper on the topic later on Monday at a tech conference in Taiwan. Microsoft also published an article on the project.
These chips, which are more flexible than the standard microprocessors or central processing units (CPUs) exemplified by Intel’s (intc) X86 chip family, can be reprogrammed with software to handle different tasks as needed. That means that as work requirements change, the chip can be adapted on the fly and doesn’t necessarily need to be replaced. And just as Microsoft worked with Intel on the X86 microprocessor design in past decades, it’s working with the same company on FPGA issues, Chiou tells Fortune in an interview. Intel bought Altera, a maker of FPGAs, last year for $16.7 billion.
There are many advantages to plugging an FPGA into the server, Chiou adds. For one thing, he explains, it can “sniff every packet” of data coming into the server and perform encryption, decryption, and compression without involving the CPU.
“Putting it at the front door doesn’t just make the network faster and more secure, you can also do many more things outside the purview of the CPU,” Chiou says. “It’s like going to the teller at a bank to withdraw money. You don’t have to go to the manager.”
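The “front door” idea can be illustrated with a toy sketch. Microsoft’s actual implementation runs in FPGA hardware, not software; the Python below (with invented function names) only models the division of labor Chiou describes, in which a front-door stage decompresses and filters every incoming packet so the CPU-side handler never sees malformed or unwanted traffic:

```python
import zlib

def front_door(packet: bytes):
    """Toy 'front door' stage (invented for illustration): decompress and
    filter each packet before it ever reaches the CPU-side handler."""
    try:
        payload = zlib.decompress(packet)  # decompression offloaded from the CPU
    except zlib.error:
        return None  # drop malformed traffic at the door
    if payload.startswith(b"DROP"):
        return None  # simple policy filter, still outside the CPU's purview
    return payload

def cpu_handler(payload: bytes) -> str:
    """The CPU only ever sees packets the front door lets through."""
    return payload.decode()

incoming = [zlib.compress(b"hello"), b"garbage", zlib.compress(b"DROP me")]
results = [cpu_handler(p) for pkt in incoming
           if (p := front_door(pkt)) is not None]
print(results)  # only the clean packet reaches the CPU: ['hello']
```

In this sketch, two of the three packets are rejected without the "CPU" function ever running, which is the point of the teller analogy: routine work is finished at the door.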
Microsoft has been rolling out this technology internally and some of the advantages are showing up in Bing searches already. But the full benefits to people outside Microsoft should start showing up in the next six months or so, notes Lanka, partner group engineering manager at Microsoft.
While it’s impossible to quantify performance gains customers will get for any given application, Microsoft itself has seen some internal applications perform 200 times faster than in the past due to this additional hardware help, Chiou notes.
Microsoft, Google (goog), Amazon (amzn), and others are all beefing up their data center infrastructures to boost artificial intelligence and other applications. Google, for example, is betting big on its Tensor Processing Unit, an application-specific integrated circuit (ASIC), another chip technology, to speed up these applications.
When asked if flexible FPGAs will take on more jobs in data centers—perhaps even replacing other chip types—Chiou replies that data centers will use the best tools for any given job. So while there will likely be heavier use of FPGAs going forward, they’re not going to make microprocessors or ASICs obsolete.
“This will remain a heterogeneous world,” he remarks.