I first met Steve Jobs 13 years ago, when I was working on a book on the history of Silicon Valley. Following an extended tap dance with his Apple gatekeeper, and after I’d already interviewed most of the Valley’s other leaders, Jobs agreed to see me, in a conference room at Apple headquarters. I got to see firsthand what I’d so often heard about: smarts, breadth, charm and abrasiveness.
Even before sitting down, he said, “You’ve got 20 minutes,” adding with some derision, “You’re not from here, are you?”
I asked why he asked, also wondering to myself where he’d honed his social graces. “Look at how you’re dressed!” he said. Jobs had on his usual St. Croix black mock turtleneck and Levi’s faded jeans. I was wearing a blue blazer and Oxford shirt.
“I was just trying to show you some respect,” I offered.
He nodded, smiled slightly and acknowledged my efforts. We wound up talking for three hours. I liked him right away, idiosyncrasies and all. Since his death last week — such a sad moment for the Valley, the culture and his family — I’ve found myself remembering our occasional conversations.
In that initial one, back in 1998, Jobs began by going to a whiteboard to draw a biographical timeline of the Valley. There were Bill Hewlett and Dave Packard back in 1938, developing an audio oscillator in their Palo Alto garage, and in the process giving birth to Silicon Valley (though it wasn’t so called until the early 1970s, when silicon became the main element in microchips); there was brilliant-but-pathological William Shockley, who founded the Valley’s first semiconductor company in 1956, in Mountain View; there were the “Traitorous Eight” — including Gordon Moore, Bob Noyce and Gene Kleiner — who bolted from Shockley to launch Fairchild Semiconductor in 1957, which led to the most famous of the “Fairchildren” spin-offs, a company called Intel (INTC), started by Moore and Noyce in 1968, as well as the Valley’s first major venture-capital firm, Kleiner Perkins Caufield & Byers, co-founded by Kleiner and Tom Perkins four years later.
Jobs played the role of history teacher, with an appreciation for his entrepreneurial forebears that is rare in the Valley — a place that cares mostly for the new. And he told the narrative with personal reverence and humility: Packard and Noyce had been mentors, so much so that when Jobs got fired from Apple (AAPL) in 1985 he met with them “to apologize for screwing up so badly.”
What Jobs left out of the narrative, with even more uncharacteristic modesty, was Steve Jobs. At the end of that glorious chronology, sketched out over the course of 45 minutes, he should have added himself (and Steve Wozniak), for starting Apple Computer in 1976. If anybody were to carve a Mount Rushmore of high-tech into the Santa Cruz hills above the Valley, you’d unquestionably see the images of Hewlett and Packard, Noyce and Moore, Kleiner and Perkins, out-of-towner Bill Gates, and Woz and Jobs. Not certain which is Jobs? He’ll be the one with a smirk, seeming to whisper in the ear of Gates, “Even up here in granite I’m still cooler than you.”
* * *
In the iconography of Silicon Valley, Steve Jobs is nonpareil. While he and Wozniak did conceive the personal-computer industry, you could argue that other historical figures of the Valley rival him as inventors of mass technologies or risk-taking founders of companies that dominated an industry. Neither Apple nor Pixar has the revenue of Intel or the workforce of Oracle (ORCL). Nor do they have their fingers in the pie of dozens of start-ups the way that Kleiner Perkins and Sequoia Capital always have. Moreover, given that the boss was sui generis, Apple’s management style has never been emulated by others. Since Jobs wasn’t Mr. Congeniality, Apple’s workplace environment has never been a template.
And, of course, it remains to be seen how his beloved company will fare now that his gravitational field is gone; the next act for Apple will be its most important, for Jobs is no longer there to lead it. In short, for all its significance as the creator of Macs and iTunes and iPods and iPhones and iPads — which all rightfully deserve credit for revolutionizing industries — Apple’s lasting significance may be debatable. And so, by extension, may Jobs’s — at least by any traditional indicia.
Yet all that misses the point. Jobs’s cultural legacy extends far beyond the impressive commercial reach of what he has wrought. You can’t overstate his role in developing the Macintosh or driving the design of the iPhone or having the smarts to buy Pixar away from George Lucas and turn it into the premier animation studio of its time. Those brilliant corporate decisions — the result of a marketing and aesthetics genius fixated on detail and unmatched in his ability to enchant a cadre of consumers — were preconditions to his legend. But they do not alone explain it.
Due respect to oscillators and printers, nobody ever made a TV movie about Bill Hewlett and Dave Packard. On the other hand, TNT’s “Pirates of Silicon Valley” in 1999 became a cult classic and is the reason Noah Wyle, who mischievously played Jobs, won’t be remembered only as Dr. Carter on “E.R.” (For Wyle’s hilarious and sweet reminiscence to me of Jobs’s reaction to “Pirates,” and how Jobs got him to appear onstage at the Macworld 1999 conference in New York, read this). Similarly, there’s a good reason that Jobs over the years has been depicted in the press as Jesus, Aladdin, an oracle, a magician and Buzz Lightyear. And it’s a pretty fair bet that Bill Gates will never be a prized miniature Lego creature.
Jobs’s star derived from celebrity, in the best and most legitimate sense — a persona born of natural showmanship, relentless perfectionism and fanatical drive. In an alternative universe, where manners weren’t a prerequisite to human interaction, Jobs might have been the ideal American politician. “He seems to be the most Shakespearean figure in American culture in the last 50 years—the rise of, the fall of, and the return of,” Wyle told me, over sandwiches at the Warner Brothers commissary during the last days of “E.R.” “But you get the ‘bonus round’ that F. Scott Fitzgerald said didn’t exist. Jobs has had one hell of a second act.” (It is Jobs’s celebrity, I think, that made the media ravenous for every detail about his health struggles. Business journalists did measured, masterful reporting on Apple’s secrecy and lack of a CEO succession plan. But calls for more information about Jobs’s cancer, liver transplant, prognosis and all the rest still struck me as akin to what the tabloids wanted to know about the dying days of Farrah Fawcett. Jobs is now dead and Apple has had a new CEO since August — and the stock price remains entirely robust.)
In the entrepreneurial epic that the Valley represents, Jobs was a hero of the first rank. You can see that progression in the Steve Jobs covers of national magazines over a quarter-century. “Striking It Rich,” proclaimed Time in February 1982, and by the end of the year Jobs pretty much got co-billing with “PC of the Year.” (He would’ve been named Man of the Year, but editors found him too unpleasant a character.) Little more than three years later, the worm had turned. Fortune featured “The Fall of Steve Jobs” and Newsweek chronicled “How Apple Dumped Its Chairman.”
Yet Jobs didn’t disappear as a cover boy. Different magazines covered his various ventures in the wilderness. Business Week asked “Could He Do It Again?” (1987); Red Herring soon announced “He’s Back!” (1988); and Fortune said his dealmaking was “outfoxing” Bill Gates (1995). But not until his triumphal return to Apple in 1996 did the magazines come full circle. FSB ran his smiling face below the line, “Heroes” (1998); Fortune called him “Stevie Wonder” (1999); Time put him on its most-influential list (2007); and New York declared him “iGod” (2007). Two years ago, Fortune justly declared him “CEO of the Decade.”
At the opening of “Pirates,” Wyle’s Jobs performs an extended monologue. While it’s nominally directed at another character, it’s mostly for the audience. “I don’t want you to think of this as…some process of converting electrons and magnetic impulses into shapes and figures and sounds,” he says. “No, listen to me, we’re here to make a dent in the universe — otherwise, why even be here? We’re creating a completely new consciousness, like an artist or poet! That’s how you have to think about this. We’re rewriting the history of human thought with what we’re doing.”
Fictional though it was, that riff is the essence of Jobs’s megalomaniacal ambition, which in turn governed how he did business: working with a small group of people whose apparent shortcomings he could tolerate, obsessing over design, consumed not with the consumer’s sense of style but chiefly his own — as if he simply intuited that his would charm others, or didn’t care if it didn’t. Do you think Picasso did market research?
During his remarkable commencement speech at Stanford University in 2005 — the most revealing window into his heart, as well as a Gettysburgian model of elegance — Jobs explained what drove him. “Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose,” he told the students. “You are already naked. There is no reason not to follow your heart…Because almost everything — all external expectations, all pride, all fear of embarrassment or failure — these things just fall away in the face of death, leaving only what is truly important.”
Although you might call that management-by-narcissism if applied in the workplace, it’s inspirational leadership by any other name — the antithesis of the “organization man,” once the archetype for corporate America. In another era, Edison, Ford, Vanderbilt and Rockefeller harnessed new technologies and used their larger-than-life personalities to build industries. Jobs managed to make narcissism a corporate ethos in Silicon Valley: He was the rebel operating from the inside, the individualistic champion who surpassed mere ego.
At Apple headquarters in the Valley, Jobs didn’t limit himself to the big strategic picture. More than a decade ago, he picked the chef who runs the company cafeteria. You say that’s being a control freak. Employees and guests reply the food’s pretty damn good — even the tofu. At Pixar, east of San Francisco, Jobs oversaw the design of the new building. Because the software jockeys worked in one area and the marketing folks worked in another and so forth, he decided to put the bathrooms in a central atrium. That way, employees had to run into each other each day.
Jobs wasn’t a talk-at-the-water-cooler kind of guy. You used to be able to see him walking about the Apple campus, arms extended in front and fingers typing away at his iPhone; he was lost in the machine, oblivious to people and stimuli around him. But he well understood that interaction between diffuse departments was good for the company; “inadvertent encounters,” wrote Pixar’s president, fostered “collective creativity.” Initially Jobs’s management-by-architecture scheme — pooh-poohed by the press as aesthetics run amok — drove employees nuts. Then they bought in.
That micromanaging M.O. didn’t make him a prince to work for, but it’s why he and his products soared above other companies and leaders in the Valley. Ever see a Sun Microsystems workstation that made you drool? I doubt it, but I do remember my younger brother’s first Macintosh in the mid-1980s—I wasn’t sure what it could do, but I had to play with it. How about a Blackberry (RIMM) application that made you cheer? By contrast, my older son’s iPhone has countless apps that embarrass any other handheld. Or an Oracle marketing presentation you even understood? As the impresario of Macworlds, Jobs’s performances were stagecraft to rival Barack Obama or Bruce Springsteen.
Marissa Mayer, Google’s (GOOG) reigning arbiter of design, has often praised Jobs’s “great eye” and ability to connect complex technology with what people like. In Jobs’s Stanford speech (which was viewed online a phenomenal 8 million times the day after he died), he neatly showed where his sensibilities came from. Before he left Reed College without a degree, he wandered into a calligraphy class, simply because it intrigued him. “I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great,” Jobs said. “It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating.” Can you imagine Bill Gates talking that way?
Jobs then went on: “None of this had even a hope of any practical application in my life. But 10 years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it’s likely that no personal computer would have them.”
Mayer is dead right that Jobs wasn’t given sufficient credit for knowing technology; it’s just that he so integrated that understanding into his graceful products that the technology almost seemed an afterthought. “Design is not just what it looks like and feels like,” Jobs liked to say. “Design is how it works.”
Remember that Jobs and Apple didn’t even invent the most important aspects of the early Macs—the mouse and the graphical user interface (GUI). Macs popularized them, but it was Xerox PARC (XRX) in Palo Alto that created them. Call it borrowing or call it theft, Jobs’s singular brilliance was to recognize that a GUI and a mouse could make computers accessible to the masses. Wozniak himself — the virtuoso programmer — said Jobs was underrated as a technologist. “He knows far more about technology than I’ll ever know about marketing,” Woz told me a few years ago, “and it was his ability to combine the two that made Apple.”
By moving technology from the technical to the vernacular, Jobs made himself not only cool but transformational in the culture. Hewlett and Packard never did that in the Valley or outside it. Neither did Noyce, whose combination of intense privacy and pied-piper charisma seemed to presage Jobs’s; Tom Wolfe did him up nice in a 1983 Esquire piece, “The Tinkerings of Robert Noyce,” but Noyce never was a rock star. And Bill Gates, operating up in Seattle? While Microsoft (MSFT) products are surely more widely consumed than Apple’s, few would suggest that over the long arc of their parallel careers Gates has been the leader and Jobs the follower.
* * *
The second time I met Steve Jobs was on a Manhattan street corner. He was coming to speak to a group of us at Newsweek and we entered the building at the same time. It was in 1999, the week after my book on the Valley had come out. “I’m hearing great things about your book, David,” he told me.
“Really?” I said. “That’s good to hear. What did you think of the book?”
“Haven’t read it — probably won’t.” He seemed to say it as a punch line, with some glee.
Hey, maybe I asked for that, but still, was the shot worth it? I doubt he intended to come off as snarky — it just didn’t matter. Jobs’s notorious temper, brusqueness, or what admirers describe as directness, paid few dividends for his image. Some leaders can score points for being gruff or even bullying; their foibles become assets, part of what sets them apart. Think Jack Welch, Donald Trump, Rudy Giuliani, Leo Durocher. The idiosyncrasies didn’t much work for Jobs. They became part of the exaggerated negative character that could overtake biographies, journalistic profiles, idle chatter in the Apple cafeteria, and even on occasion the executive suite. “Bad Steve” is an inextricable part of his identity.
Bob Metcalfe, founder of 3Com and co-inventor of Ethernet, was friends with Jobs for more than 30 years, despite their first encounter, when Jobs unsuccessfully tried to lure him to join a nascent Apple. As geek-lads around the Valley, they double-dated and socialized in the early 1980s; on one double-date, the car had a blowout and Metcalfe changed the tire while Jobs supervised.
“I’ve been on the receiving end of him being an a-hole,” Metcalfe once told me with pure admiration. “I see the same thing with Gates. If you argue with them, one of the ways they respond is to attack. The ad hominem insult is part of their ‘persuasion repertoire.’ I’m sure Jobs is the one who came up with the phrase, ‘You just don’t get it, do you?’ Those are among the most-feared words to hear in Silicon Valley.” Words from Jobs to that effect—“Do you want to spend the rest of your life selling sugared water or do you want a chance to change the world?” — are supposedly how he goaded John Sculley in 1983 to leave Pepsi (PEP) to run Apple.
Like the titans in any field, the stars of the Valley are regularly caricatured. Hewlett and Packard were geniuses! After all, they ran an egalitarian shop in which innovation flourished; in truth, Packard could be a tyrant and Hewlett was not always accessible. Then there’s Larry Ellison of Oracle — what a rascal! He’s a womanizing, jet-setting playboy who doesn’t worry enough about his software company; yes, he does have an extravagant lifestyle, but he hasn’t remained one of the richest billionaires in the world solely by luck. And what of Larry Page and Sergey Brin of Google? They’re peculiar recluses who rule by equations, as well as make job applicants build Legos; maybe the Google boys do prize quantification as a management principle, but there’s lots of evidence theirs isn’t such a simplistic view of decision-making.
By contrast, portraits of Jobs typically ran to polar extremes. In the zeitgeist of the Valley, except among the fanboys who know only the god and not the man, both Jobses emerged over the years. If you run “Steve Jobs” and “mercurial” through the Nexis database, you get hundreds of entries over the last 25 years; “Steve Jobs” and “temperamental” comes in right behind. The obituaries and appreciations of recent weeks understandably tilted to hagiography. But the essence of Jobs combined Good Steve and Bad Steve. “It’s a package deal with him, as it is with so many talented people with high standards,” Metcalfe correctly points out. The highs and the lows—the elusive contradiction—are why Jobs so fascinated us. In the tale of Steve Jobs in Silicon Valley, played out for 37 years, he alone constituted the dramatis personae.
* * *
The last time I saw Jobs was by chance in the courtyard at Apple headquarters 3½ years ago. I was there with my older son, then 15, to have lunch with an Apple friend. My son is a big Apple fan and user. We saw Jobs walking along by himself, pecking away at his iPhone. I said hello, as did he — and he then took my son aside to chat for several minutes, about technology and thinking large. My son was rapt.
It was a gracious thing for Jobs to do, with no payoff for himself. (I don’t merit efforts to co-opt.) He later e-mailed me about the joys of parenthood. While Jobs was tone-deaf at times, he wasn’t a jerk. Some who knew him wondered if he was just missing the gene for civility. I wouldn’t let him off that easy. I saw him exercise those skills when he chose to. My hunch is he just couldn’t be bothered to most of the time because he was busy with something else. My hunch, too, is he knew full well how he came across and didn’t particularly care. You could do a lot worse.
Some years ago, Douglas Holt wrote an insightful article for the Harvard Business Review titled “What Becomes an Icon Most?” In it, Holt cites a handful of brands that “every marketer regards with awe” — for example, Nike (NKE), Harley-Davidson (HOG), Absolut, Volkswagen and Apple. “Revered by their core customers,” he wrote, “they have the power to maintain a firm hold in the marketplace for many years.” Most marketing “experts” haven’t a clue how to turn a brand into an icon. “That’s because icons are built according to principles entirely different from those of conventional marketing,” Holt goes on. “These brands win competitive battles not because they deliver distinctive benefits, trustworthy service, or innovative technologies (though they may provide all of these). Rather, they succeed because they forge a deep connection with the culture. In essence, they compete for culture share… As Apple’s customers typed away on their keyboards in the late 1990s, they communed with the company’s myth of rebellious, creative, libertarian values at work in a new economy.”
What Holt left out of his analysis is that only one of the brands he applauded was associated with a single person. Can you name the CEO of Harley-Davidson? In Silicon Valley, who’s running Intel or eBay or even Hewlett-Packard (HPQ) these days? Readers of Fortune know, but the typical consumer doesn’t. In their day, Noyce at Intel and Packard at HP were not brands unto themselves. Even Bill Gates, for a generation the overlord of Microsoft, is famous primarily because he’s rich. His was once the most valuable high-tech company, his foundation has the most money, his house is the biggest. Jobs’s multi-billion-dollar wealth was beside the point. His Hansel-and-Gretel house in Palo Alto is modest by Siliconillionaire standards; only the locals knew he lived there. Ever see photos of him vacationing on a megayacht in the Mediterranean? Or tooling about in the private jet Apple bought for him? Jobs did both, but the techno-paparazzi cared little. In that sense, his renown transcended mere lucre. He was an icon. His death at a young age bequeaths him to the ages.
In his coda to the graduating class at Stanford, Jobs spoke of death with the wisdom that sometimes comes with mortal illness. But he was wrong about one thing. “No one wants to die,” he told students. “Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because death is very likely the single best invention of life. It is life’s change agent. It clears out the old to make way for the new.” Jobs will not so quickly be forgotten.
For more, please read Fortune’s ebook All About Steve.
David A. Kaplan, a Fortune contributor and former senior editor at Newsweek, is the author of the national bestseller “The Silicon Boys” (1999) and other books.