With all the presidential campaign talk about American exceptionalism, it might be easy to forget that we do a pretty unexceptional job at some things — like shopping for computers.
No question, we Americans buy a lot of them — the latest estimates say more than 75% of U.S. households have at least one PC, among the highest ownership rates in the world. The problem is, we are hooked on the underpowered, bargain-bin variety, the sort that putters around on the Internet, chokes on high-definition video, and struggles to render 3D games. Our habits make PC buyers in places like Germany laugh at us. (The mainstream German PC buyer has a nose for good engineering — no big surprise there.)
What should we Americans be buying that we’re not? Something called a graphics processor is high on the list. These special chips, made by companies like AMD and Nvidia, speed up visually intensive (and increasingly popular) tasks such as viewing photos and high-definition video, and playing games. According to research firm IDC, last year 39% of consumer PCs worldwide shipped with graphics chips — but both AMD and Nvidia say the United States lags savvy countries in Europe and Asia when it comes to embracing the technology.
That’s why, when Apple unveiled new MacBook laptops last week, the specs turned a few heads. Unlike the other mainstream PC makers, Apple has chosen to stop using the standard-issue integrated graphics that come packaged with Intel chips and switch to a new setup from Nvidia, which Apple says can run about five times faster. Apple will continue to source the main laptop processors from Intel, but those Intel processors will now work in tandem with a respectable graphics chip, part of Nvidia’s GeForce 9400M chipset.
It’s not that graphics chips are new; add-on, or “discrete,” graphics chips have been around for a long time. What’s new here is that these Nvidia graphics are built into the basic chipset. So mainstream Mac users will get the benefit of improved visual performance without having to pony up for a separate chip. It’s an acknowledgment that these chips can lead to a better experience for everyone, not just gamers and video geeks. (And Nvidia managed to keep this chip from being a heat-making power guzzler; otherwise, it never would have fit into the svelte MacBook Air.)
Could this endorsement from tech’s hottest company finally put graphics processors on the map for the mainstream? The folks at Nvidia certainly hope so. The day after Apple’s announcement, I caught up with Drew Henry, general manager of Nvidia’s media communications processor business unit, and he was practically gushing.
“I think this is the beginning of the era of visual computing,” he said. “I believe that Oct. 14, 2008, will be remembered as the moment when an inflection point happened.” He said other computer makers have already expressed interest in the chips. “You’ll see other designs over the next few weeks and months,” in time for the holiday season, he said, though Apple won the opportunity to release it first.
Apple just weighed in on one of the most intense battles brewing in technology. Nvidia and AMD’s ATI graphics unit have long vied for supremacy in their niche. Patrick Moorhead, AMD’s vice president of advanced marketing, recently drove the point home with a demo: two computers, one with AMD graphics and one with Intel’s basic integrated graphics, running the popular Iron Man game and playing “The Simpsons Movie.”
The Intel-powered machine failed to display some ceilings and walls in the Iron Man game, and sputtered during complex scenes in the movie; the AMD-powered machine handled both smoothly. Adobe Systems, maker of graphics-heavy software programs like Photoshop, Flash and Illustrator, recently threw its weight behind the graphics chipmakers’ point of view; its latest version of those programs, CS4, is crafted to tap a graphics processor for a speed boost.
Intel, meanwhile, is not sitting still. In the hours after Apple’s laptop announcement, it put out a statement saying it intended to fight hard for Apple’s future business. One of the ways it will do that, no doubt, will be to try to lure Apple back into the Intel fold with its own upcoming graphics processor, code-named Larrabee, which will use multiple Intel computing cores to deliver extra visual oomph. The first products should arrive next year at the earliest.
So where does all of this leave consumers? And will Apple’s move really make U.S. computer buyers smarter about buying PCs? In the short term, probably not. The most affordable laptop to carry the chip so far costs $1,299, and folks like Hewlett-Packard and Dell sell laptops with discrete graphics processors for less money. But Apple’s embrace of graphics is clearly just a first step — it’s only a matter of time before the company begins offering similar graphics performance in systems priced at $1,000 or less, and then in every computer it makes.
And that’s when things will get really interesting. Once Apple has built special graphics capabilities into most of its consumer systems, it’s sure to release a new version of its iLife software suite that takes advantage of the extra speed. And once that happens, it’s possible that mainstream American consumers will finally start to see the benefit of investing in graphics when we buy PCs — and those Europeans will have one less thing to snicker about.