by Chuck House
Mr. Phenomenon, Chuck House
What words come to mind when you think of the 2009 book "HP Phenomenon"? Comprehensive, insightful, strategic, historical, illuminating, thorough, definitive, interesting, analytical, fascinating, biographical, all presented with a sweeping view of how Hewlett-Packard, the pre-eminent high-tech company of the 20th century, got that way.
When Chuck House and Ray Price conducted their methodical 10-year research into HP's inner operations and its emerging modes of strategic product planning, they interviewed nearly 600 people and reviewed hundreds if not thousands of documents (there were 1,114 footnotes). From all of this mass of research, they compiled their essential treatise on HP's amazing success: how HP seemed almost always to envision and choose the right new product performance to beat the new technologies just coming into the market.
The central business strategy of HP was ALWAYS the strategic product plan. In the early years, it was mostly internal discussions among the principals. New products were driven by the near-term measurement requirements of HP's customers. In the 1960s, the rolling 5-year product plan became well organized and well honed. As the products moved to desktop computers and then to mid-range computer systems, the R&D funding became staggering, mistakes VERY costly, and decisions much more cautious.
These elements of business and product strategies became excruciatingly complex in John Young's years as CEO, because HP was then playing with the Big Boys: IBM, DEC, GE, and more. The stakes were crucial for proper decisions on technologies which drove operating systems, software, firmware, and long-shots like RISC. Chuck and Ray made those complex times meaningful and were masterful in their analysis.
Thus "HP Phenomenon" delivered a remarkable picture in describing and analyzing the EVENTS and PROCESSES of company progress. It was biographical to the extent that all those unique personalities had their places in the ultimate business success. But it did NOT deliver any of the life and times of the author Chuck House himself. House's memories of his HP period were partially revealed in several associated articles here on this website under the "Other HP Writings" section, GR vs. HP, and the story of the HP 1300A display.
This HP Memory by Chuck House, on the "Birthing of the HP Logic Analyzer" business over some 10+ years, is much more descriptive of the life and times of Chuck. It shows the personal involvement, incisive thinking and detailed analysis he went through in guiding his teams to beat the competition in some EXCEEDINGLY complex technologies. At times in that period, it seemed that technologies were moving ahead at the speed of light.
We remember the first halting logic products. One, from the F&T division, was a simple clip-on display which enveloped an integrated circuit package, with 14 tiny LED lights. From there, an oscilloscope was commandeered to display tabular columns of algorithmic state conditions. Then life got A LOT more complicated, and Chuck House's imagination and diagnostic abilities showed through. He concludes with Seven Rules for successful product management.
This Logic Analyzer period represented only about 30% of House's career. He had HP experience with the 1300A display, which he developed against the specific directions of Dave Packard. He spent some years at HP's Palo Alto headquarters, with responsibilities in corporate engineering planning. He moved to Intel for a decade and later to Stanford to lead their programs on Modern Media Innovations. Lastly, he was just appointed Chancellor of Cogswell College, a digital media teaching institution. This HP Memory shows us how he thinks.
Click here to download "Logic State Analyzer Birthing Pains" - the 58-page document is a 1.5 MB PDF file.
In early 1970 I was striving to become a successful program manager, to erase the image of earlier gaffes at Hewlett-Packard's Colorado Springs oscilloscope division. I had wangled the chance to lead 'Next Gen' - HP's bold but belated effort to match Tektronix's new 7000 series.
Our program was defined around four contributions - what one wag called "Smaller, Faster, Cheaper, Lighter." The Tektronix units, by comparison to all previous oscilloscopes, were "high, wide and handsome," making traditional measurements with more accuracy, precision, sophistication and ease, at a greater price than ever before.
On March 23, 1970 in New York City, senior circuit designer Kent Hardage and I were sharing a room in the ritzy Plaza Hotel looking out on New York's Central Park. Neither he nor I had ever known such opulent and plush style - quite an experience for two engineers from 'the sticks.' We had lunch in the hotel, discovering that they expected a coat and tie. The menu was amazing; we'd never seen so many different names for pasta.
We were attending the annual IEEE International Convention with sixty thousand others. It was the biggest show of its day for vendors to demonstrate their wares - an early precursor of today's extravagant Las Vegas Consumer Electronics Show aimed at gadget-happy consumers.
Taciturn and gruff, Kent was a hard-core hardware design engineer, one of HP's best in Colorado Springs. He and I were there to evaluate the new Tektronix 7000 Series oscilloscope family - a competitive spying operation for our Next Gen boxes, so to speak. It was hardly covert because we had to wear our conference badges, and it took the Tektronix team less than three minutes to figure out who we were, what we were doing, and thus to develop a coping strategy. Oddly, they let us in. We knew the functionality better than most folk in their booth; surprisingly, we were invited to stay and exhibit their new 'scope to potential customers.
Inexplicably, I found myself showing their new boxes to erstwhile customers for several hours, after one of them exclaimed, "You demonstrate our features better than we do!"
The learning from demo'ing the box was better than any imaginable market research trip. Potential customers admired the equipment, but the oft-repeated question "What does it do that I cannot do today?" was a terrific lead-in for us to ask, "What measurements do you need that you cannot do today?" Kent and I asked this so many times we lost track.
The most immediate need was to watch more signals simultaneously - new integrated circuits were confusing designers with their complexity, and many booth visitors were hopeful that by combining two four-channel vertical amplifiers, they could gain some understanding of these new circuit devices. With eight channels (we even tried twelve for a few folk), screen clutter was an issue, and the probing problems became physically daunting as well.
This amount of digital data probing should help understand what's happening on the PC board.
Over dinner, we traded notes. Kent, involved heavily in HP's recently introduced very high frequency 'scope, was incredibly detailed in some of his observations, particularly with respect to sweep circuit non-linearity and CRT spot defocusing. My focus had been both on vertical amplifier distortions, and interactions between the 'scope functionality and time-shared characters that 'gave answers' - rise times, pulse widths, and other 'analytic data.'
But the real learning was sociological - or as we engineers saw it, business issues rather than technology. We learned, for example, how big their R&D team was, how long it took to develop the equipment, and which capabilities still needed further refinement. The 7000 Series was like an overgrown Swiss Army knife - with up to four plug-ins, it could be configured to make virtually any known measurement for design engineers or technicians. Such complexity, coupled with a novel concept for displaying 'answers' on the CRT, was historically unavailable from any instrument manufacturer. It was a huge bet on the future of instrumentation - by the 2nd largest instrumentation company. Our employer, Hewlett-Packard, was the obvious target.
The basic premise for the Tek 7000 Series was that electronics designers used only four basic equipments - voltmeters, oscillators, oscilloscopes, and spectrum analyzers - with occasional additional needs (e.g. ammeters, distortion analyzers). Why not put all of them in one box?
If this strategy worked, the advantage for Tektronix was obvious. They were leaders in 'scopes, but also-rans to HP in nearly every other category. The 7000 Series was thus intended to embed all tools that an electronics engineer might ever need, inside one mainframe, with an integrated systems approach to display readouts as well as electronic interaction between various tools. It augured to give them entrée for a host of related tools that HP then dominated.
HP had much to lose if this approach worked, but at HP headquarters in Palo Alto, there was surprisingly little concern. HP strategists felt that the various sub-disciplines of electronics were sufficiently separated, and each so sophisticated, that it was folly to centralize the functions. It was the classic specialist versus generalist argument - hard to know the results up-front, even hard to know until the experiments have run for a lengthy time. Both views could be correct.
My perspective, though, was different. We were not trying to answer a need perceived in Palo Alto; instead, we were trying to leverage HP strengths in voltmeters, spectrum analyzers, etc. into more 'scope sales. Our market belief was based in part on the remarkable success HP had crafted with the HP 140/141 'scope platform, when it added spectrum analyzer and network analyzer capability to that venerable mainframe. But the 140/141 platform only handled two plug-ins, and it was restricted both in power and CRT capability to relatively low-frequency 'scope functions. Our Next Gen program was defined around several contributions - up to eight plug-ins, very high frequency capability, stored memory and novel user interfaces.
Standing in the Tektronix booth, I never heard a potential customer excited about combining spectrum analysis with time-domain measurements. No one seemed enthused enough about active display answers to want to pay the overhead costs. By contrast, almost everyone seemed to have requirements for integrated circuit testing, either at the device or the board level. There wasn't agreement on what was needed, but there was widespread belief that it was 'the problem'.
My Next Gen program, under way for twenty months, didn't address these issues either. We'd already spent $2 million, with a burn rate that had to escalate for our next phase. We were, at best, two years from delivery. The Tek engineers told me that they had spent $34 million and seven years of effort to get to this show with an incomplete line. Whew!
Worse, my team was having trouble debugging our own prototype scheduled for a division review in four weeks. That review would be with two guys notoriously chary about poor engineering detail execution - Packard and Hewlett, headed to Colorado Springs in late April. It was not hard to figure out what to do. I had to cancel Next Gen.
The first problem was that I was in love with it. The second problem was that I viewed this as my redemption opportunity. I had defined the program, sold it to management, investigated and selected the new technologies, built a crackerjack team, and made great progress compared to historic efforts. It was, as I saw it, my chance for recovery from some prior management miscues, including the time I'd pissed off Packard by continuing the displays program.
To recommend that we disband the effort seemed like a suicidal path, especially if I had no alternative answer for what might be a better path to take.
Next Gen by now was a sizable bet - in fact, the largest bet we had underway in the division - and fundamentally, the only chance we had to match Tektronix, however late and understaffed we'd be. So the third problem was, "What now?" And since I had no answer, canceling the project would be a certain death knell for my management hopes.
In my heart, though, I knew that the project was doomed. We'd never be able to match Tek's functionality or capability, not with the constraints on R&D spending at the division. Choosing between continuing a fatally flawed project and boldly arguing that it should be disbanded nonetheless turned out to be a quite difficult emotional decision. So obvious in hindsight, these things are seldom clear when you're in the heat of battle.
I reflected on the display program for which I'd disregarded Packard's dictum. There the decision had open-ended possibility: if the project were to complete within the year, we had a chance to prove its value in the market. If we failed to finish, we'd at least have run a great low-cost experiment. Here the circumstance was different. If the analysis that Kent and I conducted was correct, no sanguine event in a year could 'save us.' We had no hope of completion in a year. If we did succeed in getting back on schedule, we'd still face a much better funded, more complete competitor with a head-start. Most importantly, there was no new measurement contribution in the plan - we'd be ducking the same problem that Tektronix avoided.
It took many sleepless nights to resolve this situation in my own mind. Today, the question that recurs is "Why did it take so long to get to this answer?"
The next question was how and when to cancel the program. I felt a strong obligation to support my team. They were working twelve to fourteen hour days to complete the breadboard for the annual review. To pull the plug on that, a mere three weeks before the event, seemed unfair, especially given recent progress and palpable enthusiasm. Yet, it felt duplicitous not to tell the team that no matter how well they did, the program wouldn't continue.
I was torn - a true dilemma, where neither choice is any good at all.
My team tried to explain in self-defense why they were having so much difficulty, ironically pulling me into the project more deeply. They showed me a maze of wiring between multiple connectors that mated plug-ins to the main frame, and they spared no detail in describing how hard it was to keep track of several hundred wiring connections, with no tools to facilitate it. More significantly, they laboriously discussed the data being shipped back and forth from a plug-in to the main frame, and the difficulty and importance of agreeing on data structure formats.
I'd never worried about the details of how computers actually worked. I knew that bits, combined into bytes and words, flowed through registers in sequence, determined by a software program. But what those words looked like, and how they were coded, was a mystery. No one on the team had been conversant with these 'normal' computing standards when they started, but they now patiently explained to me the issues surrounding MSBF (Most Significant Bit First) or LSBF (Least Significant Bit First), plus EBCDIC (Extended Binary Coded Decimal Interchange Code) and ASCII (American Standard Code for Information Interchange) codes, and even hexadecimal versus octal machine code. Which 'word' came first in serial word streams bedeviled our communications bus decisions in the prototype.
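Those coding conventions are easy to illustrate today. Here is a minimal sketch in Python (my own illustration, not anything from the original project) of MSB-first versus LSB-first serialization of a byte, and of how the same character carries different codes in ASCII and EBCDIC:

```python
def bits_msb_first(byte):
    """Serialize one byte most-significant-bit first."""
    return [(byte >> i) & 1 for i in range(7, -1, -1)]

def bits_lsb_first(byte):
    """Serialize one byte least-significant-bit first."""
    return [(byte >> i) & 1 for i in range(8)]

ch = "A"
ascii_code = ord(ch)                  # 0x41 in ASCII
ebcdic_code = ch.encode("cp037")[0]   # 0xC1 in EBCDIC (code page 037)

print(f"ASCII  {ascii_code:#04x} MSB-first: {bits_msb_first(ascii_code)}")
print(f"ASCII  {ascii_code:#04x} LSB-first: {bits_lsb_first(ascii_code)}")
print(f"EBCDIC {ebcdic_code:#04x} MSB-first: {bits_msb_first(ebcdic_code)}")
```

The same letter 'A' arrives as two entirely different bit streams depending on code set and bit order - exactly the kind of ambiguity that bedeviled a shared communications bus.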
So this was more complicated, not because of electronics design problems with these new complex integrated circuits, but rather because we had little awareness of the contextual meaning and coding requirements of the more complex data being manipulated. The aha moment, late one night, was realizing that this was what some erstwhile customers had been trying to tell us in New York City two weeks earlier. Kent and I just didn't know enough then to hear what they said.
For the fateful meeting with Messrs Packard and Hewlett, we had made enough progress that the development team was proud of their accomplishment. And then I proffered the opinion, without having cleared it with my own bosses (as I now recall the situation), that the chase was unimportant - that the world didn't think that the Tektronix product, investment notwithstanding, was worth it. In short, there was no significant measurement contribution beyond what was already available in the market. These two patriarchs understood - almost in unison, they chimed, "Then what is the real contribution to be made?"
Cancelling the Next Gen program notwithstanding, late spring 1970 was actually a positive time for HP Colorado Springs. A year before, the division had introduced the HP 183A, the world's highest frequency direct-writing oscilloscope (250 MHz), 150% faster than anything from Tektronix. Market acceptance was stunning. Project manager Jim Pettit, who'd started work at HP the same month as me, Kent Hardage, with whom I'd traveled to New York, and Al DeVilbiss, who had designed the vertical amplifier for the HP 1300A Display system that got me in trouble with Packard, were the key developers for this wonder box.
The displays business, now astutely led by Coloradoan John Riggen, was also quite profitable, and with the division now competitive in the high frequency arena with Tektronix, our Palo Alto bosses were willing to let us experiment a bit more than usual.
Strangely, when I renounced Next Gen, eating humble pie in front of the division, it seemed to humanize me to some folk - a lesson I hadn't previously learned. Partially in atonement, I was allowed to set up a modest investigation for a 'digital scope' program, a series of prototype ideas intended to test some very different types of tools that might help with the new digital circuit technologies that were starting to present themselves.
This 'digital 'scope' idea was not just a sudden inspiration, born out of Next Gen frustration, in a situation where no one else had noticed these issues. Au contraire. There had been four previous project managers working on 'digital tools' in our lab, plus several projects in other groups around HP. So, in one respect, this was hardly a risky proposition. If I failed, I could point to the previous attempts and say, "Well, it's a really tough problem." If, on the other hand, I were to succeed, it would dimensionally set me apart from the previous managers. The question was, what will we do differently this time? The remarkable thing, in hindsight, is that we were given a free hand to discover as best we could what might work.
I spent much time thinking - philosophically musing if you will - about what really was happening before our very eyes. One way to figure out what is happening is to go look. I sent our small team out to where 'digital designers' worked. What do they actually do in their designs? What is hard for them, and what helps? It would have been nice to find dedicated groups of folk who had only the new problems. Instead, hybrid systems abounded. In all labs, oscilloscopes were by far the most prevalent debugging tools. And we didn't yet have anything new to describe, so it wasn't really product marketing research so much as process observation.
But there were concrete ideas that we used as metaphors. When transistors arrived on the scene in the late 1950s and early 1960s, experienced circuit designers who were intuitively great with vacuum tube designs mostly struggled - the colleges used this example to describe why a scientific education rather than a technical education was better: "if you learn semiconductor physics, you can handle the duality of holes and electrons, whereas there is no equivalent concept of holes in a vacuum tube." At age twenty, I had thought such an argument silly, but when I had joined the workforce, there were indeed a number of circuit designers - derisively called cookbook engineers - who couldn't make the shift in thinking.
I've seen this happen numerous times in electronic circuit design. The shift from transistors to integrated circuits (groups of transistors which acted as a function, rather than as movement of holes or electrons) derailed many. Going from small-scale integration (SSI) to medium-scale (MSI), large-scale (LSI) and then very-large-scale (VLSI) changed the rules each time. When the key ingredient in designs changed from hardware to software, many more were disabled.
The particular design perspective that emerged from the MIT Radiation Lab experience in World War II was that while linear circuits worked well for communications and control systems, switching circuits worked better for systems such as radar, sonar, and computers. The value of oscilloscopes became quickly evident, which Tektronix capitalized on in the late 1940s for military designs for Korean War equipments and Cold War electronics. Although the two largest instrument manufacturers - General Radio and HP - each designed early oscilloscopes, both companies were founded and led by communications designers who struggled with this new switching circuits design paradigm. Each company misunderstood the importance of 'scopes to the new electronics, allowing a nascent Tektronix to fill the void.
As coincidence would have it, I was completing a Master's Degree in history at the local university. An unusual focus for a Caltech scientist, even more so for a Hewlett-Packard engineer, it often engendered conversation at cocktail parties: "Why?"
Focused on the history of technology, the quest took me to Princeton, where I met with Dr. Thomas Kuhn about a possible PhD. Kuhn, a cadaverous figure, had garnered some fame, even notoriety, in both history and scientific circles for his insightful 1962 book, The Structure of Scientific Revolutions. A turgid book to read, it nonetheless outlined an impressive series of studies in history where new scientific theses were advanced, only to be categorically dismissed by those in power - essentially, 'conventional wisdom' won, even when overwhelming evidence was marshaled to the contrary. The new ideas essentially had to wait until the 'old guard' died.
Kuhn, a stooped, cranky curmudgeon at 57, was not open to new ideas that stretched his own thesis, which both surprised and dismayed me at the time. Naiveté is so easy when you're young. I had little appreciation for how much opprobrium he'd endured from the intellectual community assailing his thesis - and here I was, arguing that his thesis matched a crass commercial situation that in hindsight was quite unimportant to a serious scholar. A wave of the hand was summarily dismissive as he said, "At 29, you're too old to invest time in teaching you."
Flying home, I couldn't shake the notion that his thesis fit our situation - we were seeing new measurement needs from an old paradigm - and it was proving, as he prophesied, very hard to let go of the old beliefs and assumptions. With forty years more experience, it is now much easier to recognize the problem - but then it was strange, baffling, and difficult.
There certainly was a nugget of truth buried in this set of unusual problems that we faced. A few phone calls to friends in HP computer divisions were revealing. I'd start with a description of our issues, and almost immediately they'd respond with a similar horror story. Asking what tools they used elicited laughter or sober reflection that, "Hummn, there really aren't any."
The problem was that none of the descriptors helped to categorize the needed tools. Yes, we were going from linear circuits to switching circuits, and 'scopes worked great for that analysis. Yes, we were shifting from transistors to integrated circuits, but 'scopes still probed signals at IC pins. And yes, the integrated circuits generated functions - operational amplifiers, AND and OR gates, and flip-flops - but 'scopes could still manage these signals. And then - aha!
We realized that some of the new computer circuits created 'logical' combinations of parallel circuits - it was both these combinations and the switching behaviors which could not easily be monitored. 'Race conditions' can occur in parallel circuits - did circuit A get done before B? Keeping track of the finite rise and fall times of the switching behavior of multiple circuit gates became daunting. If eight rise and fall times were all displayed in perfect order on an oscilloscope, it was difficult but possible to discern some 'false positive' and 'true negative' events that only lasted very briefly, but could actually create anomalous failure modes.
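The kind of very brief 'false positive' event described above can be sketched in code. Here is a minimal, hypothetical glitch scanner (a Python illustration of the idea, not anything HP built): it walks a sampled logic trace and flags any pulse narrower than a minimum legitimate width.

```python
def find_glitches(samples, min_width):
    """Flag runs (pulses) shorter than min_width sample intervals.

    samples   -- list of 0/1 logic levels, one per sample interval
    min_width -- narrowest pulse considered legitimate
    Returns (start_index, length) for each suspect pulse.
    """
    glitches = []
    run_start = 0
    for i in range(1, len(samples)):
        if samples[i] != samples[run_start]:   # level change: a run just ended
            length = i - run_start
            # The leading run has no known starting edge, so skip it.
            if run_start > 0 and length < min_width:
                glitches.append((run_start, length))
            run_start = i
    return glitches

# A one-sample spike at index 4 and a two-sample dropout at index 5:
trace = [0, 0, 0, 0, 1, 0, 0, 1, 1, 1, 1, 0, 0]
print(find_glitches(trace, min_width=3))   # → [(4, 1), (5, 2)]
```

On an oscilloscope such events are a faint flicker at best; scanning the sampled record makes them explicit.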
Integrated circuit (IC) design itself was hobbled by similar questions, and novice IC designers at HP begged us for tools to be able to probe their small-scale integrated chips to examine them for race conditions. But we wanted to focus on system, not chip, testing.
The logical combinations were the more interesting phenomena, we felt. Called synchronous design by some and algorithmic state design by others, such designs were functional in nature rather than determined by electronic parameters. In other words, the devices would all be in a steady-state mode, describable as an event-time, and a switching action would shift everything into a new steady-state event-time. There would be no reason to sample the intermediate switching actions because nothing mattered functionally until the next true steady-state event.
What was hard for test equipment designers was to decide to concentrate on the steady-state conditions rather than on the switching behavior. Heretofore, the most interesting things to analyze were the dynamic transitions, not the sequence of steady state conditions. Somehow, our design group came to believe that handling the complexity of so many signals might better be handled by ignoring the transitional behaviors, instead monitoring and decoding the increasingly complicated 'state conditions' in-between the switching times.
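That shift in emphasis can be sketched simply: instead of digitizing the whole waveform, a state analyzer samples every data line once per clock edge, after the circuit has settled. A minimal illustration in Python (my own sketch of the concept, not HP code):

```python
def capture_states(clock, data_lines):
    """Sample every data line on each rising edge of the clock.

    clock      -- list of 0/1 clock samples
    data_lines -- list of equally long 0/1 sample lists, one per signal
    Returns one tuple of line values per rising clock edge.
    """
    states = []
    for i in range(1, len(clock)):
        if clock[i - 1] == 0 and clock[i] == 1:   # rising edge: steady state
            states.append(tuple(line[i] for line in data_lines))
    return states

clk = [0, 1, 0, 1, 0, 1, 0, 1]
d0  = [0, 0, 1, 1, 0, 0, 1, 1]
d1  = [1, 1, 0, 0, 0, 0, 1, 1]
print(capture_states(clk, [d0, d1]))   # → [(0, 1), (1, 0), (0, 0), (1, 1)]
```

Everything between clock edges - the switching transients that a 'scope works so hard to render faithfully - is simply ignored, because nothing matters functionally until the next steady-state event.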
It would be nice to report that we sat down around a table and discussed these issues rationally, and sometime during the meeting, we had that aha moment and we all left the room in agreement that this, THIS would be the breakthrough idea.
It of course didn't happen that way. We gathered up several folk with a bit of digital design experience, reassigned the bulk of the team on the Next Gen project to other 'scope projects, and tried to imagine tools that would help. Circuit designers Duncan Terry, Virgil West, and Kurt Gfeller, plus product designer Jim Freeman and I from the Next Gen system (down from twelve) built a Digital Waveform Display Conditioner prototype (nicknamed DWDC, or D'wuck ), that tried out several concepts - pattern triggers for both parallel and serial data streams, digitized or clocked event delays rather than linear time, and specialized small high-impedance probes.
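One of those concepts, the serial pattern trigger, is simple to sketch. The Python below is my illustration of the general idea rather than the D'wuck's actual logic: watch a bit stream and fire at the end of the first occurrence of a chosen pattern.

```python
def serial_trigger(stream, pattern):
    """Return the index just past the first occurrence of pattern, or None.

    stream, pattern -- lists of 0/1 bits
    """
    n = len(pattern)
    for i in range(len(stream) - n + 1):
        if stream[i:i + n] == pattern:
            return i + n          # trigger point: end of the matched pattern
    return None

bits = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1]
print(serial_trigger(bits, [0, 0, 1]))   # → 7
```

A parallel pattern trigger is the same comparison applied across several lines at one clock instant instead of along one line over time.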
Then, the bottom fell out of the electronics business. HP President Bill Hewlett announced over the public address system on a clear Friday morning, June 26, 1970, that the company would take every second Friday off without pay for the foreseeable future, until the economy recovered. Fortunately for us, our peripatetic division manager, Bill Terry, persuaded Bill Hewlett that our division should be exempt, because we had Tektronix under pressure for the first time. Indeed, initial Tek 7000 sales reports were poor, just as Kent and I had speculated.
In subsequent months, we ran many D'wuck experiments. Each feature or functionality taught us something; the set showed that random logic chip combinations were less important than gated logic arrays. Gated logic arrays made synchronized designs possible. The Intel 1103 D/RAM (Dynamic Random Access Memory) was the integrated circuit chip that made the difference. Announced in October 1970, it offered a 'low-cost high-speed memory' semiconductor chip with 1,024 bits of memory, enough for 128 eight-bit words.
HP Loveland had been producing desktop calculators since late 1968; this D/RAM chip augured to alter greatly their cost/performance ratio. The first major adopter of Intel's chip, they accounted for one-third of the world's D/RAM consumption for 1971 and 1972 in the new HP 9800 desktop calculators. Thus, one hundred thirty miles away, in a collegial Hewlett-Packard lab friendly with ours, we had a 'next-bench' lab to study.
With the HP Loveland insights, the D'wuck prototype came together, and I scheduled a key market research trip to the Bay Area. On Monday, June 21, 1971, I proudly stood in front of thirty-three digital designers in the HP Cupertino, California minicomputer lab, with Bert Forbes as my host. Bert, affable and smart as a whip, had listened on the phone to my story, and promised an audience. As I intoned our approach and findings, my Swiss émigré designer Kurt Gfeller turned on the D'wuck - and smoke filled the room. Gfeller started babbling in German. Not an auspicious start.
The next three days would be profoundly important. Bert gave us a lab bench to work on. By Thursday, the D'wuck worked. We only drew a crowd of six this time. But one invited us back into his test bay, and soon we all saw the LRCC (Longitudinal Redundancy Checksum Character) pulse on their disc drive, displayed on our 'scope CRT. The entire group was excited - "We've never seen that before. WOW!!!"
We did several other measurements that seemed to dazzle them, and then Bert did something that changed our world. He called a friend at IBM Santa Teresa labs, and said, "We've got some guys here with a box you should see."
Friday, we went to the IBM Santa Teresa research lab, the group that invented the RAMAC (i.e. Random Access Method of Accounting and Control), the first random access disc drive dating back to 1956. Now mid-1971, they were trying to perfect what would become the first Winchester disc drive. They loved the D'wuck; in fact, they wanted us to leave it with them, rather than take it back to Colorado. We refused, saying that this was the only unit in the world. They reluctantly let us go. Duncan voiced it, driving out: "There might be something here."
Reality set in as we returned home to HP Colorado Springs.
I held a small seminar to describe our market research findings. To illustrate the value of our serial pattern trigger capability, I carefully drew how LRCC validation checks operate for computer systems. Then I sketched an outline of how the addressing schema for satellite communications worked, pointing out that we'd have to widen the pattern search, but conceptually it was analogous. It was a careful, meticulous, precise presentation.
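One common form of longitudinal redundancy check is a running XOR over the record's bytes, with the check character chosen so the whole block XORs to zero. The sketch below (a Python illustration of that general principle, not IBM's exact disc-drive scheme) shows why a pattern trigger that can find the check character is so useful:

```python
def lrc(data):
    """XOR-accumulate the bytes; the result is the LRC check byte."""
    check = 0
    for byte in data:
        check ^= byte
    return check

record = bytes([0x48, 0x50, 0x31, 0x36, 0x30, 0x31])   # arbitrary sample data
check_byte = lrc(record)

# Appending the check byte makes the whole block XOR to zero,
# which is exactly the validation test a drive controller applies:
print(lrc(record + bytes([check_byte])))   # → 0
```

Any single-bit error anywhere in the record makes that final XOR nonzero, flagging the block as corrupt.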
Walking out, my reserved supervisor, John Strathman, uncharacteristically threw his arm over my shoulder, and said, "That was the finest technical talk I've ever heard you give." Since he'd often given me feedback of the form, "Don't exaggerate so much," this felt like high praise.
Over the next week, though, a different message seeped through the division: "House doesn't believe in it. He gave a dry technical talk, for crying out loud. No enthusiasm whatsoever."
We were put 'on hold' - told to wait for some 'decisions' before we started the next design.
And then, for days, the management staff was locked in huddles. Rumors were rife - "This program goes, that one stays. . ." The ambiguity got to us, and somehow one Friday our HP team of five got fed up and decided to quit en masse. We didn't know what to do next, but we knew we were done working for HP. They just didn't get it. We bundled up all of the D'wuck work, wrote out resignations, and put them all in an envelope destined for the HR director's desk.
The envelope found its way to Strathman's boss Dar Howard, who was home in bed recuperating from unexpected back surgery. Howard's surgery accounted for the re-organization delay, which in turn had fueled our precipitous action. Dar, my sponsor for the original display box who had shielded me from Packard's dictum to cease development, now appealed to me, saying "You've done what we need with this 'digital scope' stuff. I have been trying to get you some real help, and we've figured it out. Don't leave now."
He shared some privileged information. IBM Poughkeepsie had called him after we'd been to the IBM Santa Teresa labs. They loved the D'wuck, which impressed Dar's management team. But budgets had become much tighter. While HP had relented on the nine-day fortnight after nine months, Tektronix had matched the HP 183 specs, slowing our division sales. Dar and his team had decided to ship our sampling 'scope technology to the sister division in Japan, and our pulse generator lines to the German division. This would free up some of the better digital designers in our division - those who knew about sampling and pulse generation rather than linear circuits - to work in this new arena that my small team had been exploring. Our group would grow - from five to eighteen people. Howard cajoled: "This is your big chance."
I told the team, and we all agreed to stay. It was another few weeks before I learned that Dar intended for Jim Pettit to manage the group, not me. And still more weeks to find out how the group would be composed. It was a polyglot group, assembled from four different labs on-site, with little history in common. It wasn't, in retrospect, very hard to imagine why Pettit was a better choice to manage the group; I'd not exhibited a lot of tolerance for ambiguity in the past.
And then, some serendipity. HP had hired Roy Hayes from Tektronix. Roy, a linear circuit designer, co-owned a tractor with a Tektronix design engineer who lived in Beaverton, Oregon. The question became: "Would you consider buying out his half-interest?" I said, "Sure" and Roy and I hopped into his van and drove to Portland, where he arranged for us to stay with Merle Kaufmann, the tractor owner, while I filled a rental trailer with the tractor, plus trees and shrubs.
The van lacked a trailer hitch; every night Roy and Merle would go to someone else's place to build one while I babysat Merle's kids. And some Tektronix engineer would knock on the door, ask for Roy or Merle, and when I'd say, "They're not here, but I'll tell you where they are," they'd reply, "Oh, are you Chuck House? You're working on digital 'scopes? I'd love to talk." I invited them in. We compared notes on testing these new logic systems, for three nights. I learned a lot from them; I am sure they learned a lot from me. We each lamented that neither company's management seemed interested. And then we parted, not to meet again for years.
Back in the Springs, Jim Pettit and I guardedly eyed each other. I respected his ability, but the thought that he'd manage my program was more than galling. And he had little sensitivity to the issues we'd been wrestling with for a couple of years now. It is very hard for practiced managers to shift thinking, even harder if they believe that the methods they've been using were successful - and he had every reason to think his methods had worked, and little reason to think mine had.
As the holidays approached, Jim came to my home one Saturday to work on our proposal for a Monday review with some Palo Alto folk. Jim, unaware that we'd sold our home, was surprised to find us living in a two bedroom rental home in a decidedly low rent district. Gayle's brother was living with us and our three girls; it was easy to deduce that we were 'unfettered' in terms of staying with HP. I did nothing to ease his mind on the topic when he inevitably raised it.
New Year's Eve, December 31st, 1971, my wife and I were reflecting, champagne glasses in hand - the past year and a half seemed like a blur. Neither Gayle nor I could quite put it in words, but the pace had been exhausting, the energy expended quite high, and the costs seemed overwhelming. And she was expecting - a son to join our three daughters, due in July 1972. But the piece of big news that night, wonder of wonders, was that Jim Pettit was headed for a new job in HP Palo Alto, and I would get my group at HP back.
The next week, I took stock. The eighteen members of the team were quite a menagerie. Duncan, Jim, Virgil and Kurt were there from the original group; they felt some entitlement to be first among equals, since they'd been pioneering with the D'wuck. Pettit had not listened much to them in his few weeks at the helm; they were overjoyed to have me back as their leader.
There were four experienced designers from the sampling lab, notably phlegmatic Al Best and clever but enigmatic Bill Farnbach. There were also four from the pulse generator lab, including fiery leader Eddie Donn and skeptical Jeff Smith. Seven of these eight were senior designers, compared to the original team. Five designers from the 'scope lab rounded it out, none with stand-out skills as I remember it. The whole feeling seemed to me to resemble that of expansion baseball teams in the major leagues - where every extant owner offered the new team some utility infielders, disabled ballplayers, or fading stars; none were invested in having the new team produce a winning club in the foreseeable future.
Despite all eighteen people sharing the vaunted 'HP Way culture' in the same Colorado Springs division for years, there was precious little bonding amongst the group. Vocabularies for the design issues were different; perspectives varied widely. Debates about which measurements mattered were easy to stimulate; resolution was not. Attitudes about whether we were doing 'something new' or instead should simply be building 'scope accessories were strongly held, and bitterly contested. The erstwhile leadership - me, taciturn Duncan Terry, edgy Bill Farnbach, volatile Eddie Donn, and placating marketer Dick Cochran - had three things in common: mistrust of and even lack of respect for each other, profound lack of understanding of these new devices and their design and test issues, and essentially no management maturity.
Other than those limitations, we were a pretty good leadership team.
The group we managed wasn't so sure, and we weren't so sure about them apparently. The roster listed eighteen names to start the year, and ended with twenty-one. But an astounding eighty-two names graced the list during the year. Hardly the image of a stable, controlled, well-managed operation, it did not win many points for the kind, gentle, humane HP Way. Of the sixty-one who were intermittently on the roster during the year, nearly all found other roles within HP, so it wasn't really a hatchet job. Such roiling turmoil was unheard of then at HP; a decade hence in the hotly competitive entrepreneurial Silicon Valley world, such turnover would not be surprising, and HP's typically genteel approach increasingly seemed anachronistic.
Challenges abounded six months after taking over the digital 'scope program from Jim Pettit. Many doubted my ability to manage, others questioned the vision, almost all seemed uncertain of the strategic and tactical plans that might produce positive results. To be truthful, in the dead of night, my self-doubts were substantial as well. To deal with this, I developed the chart shown below, as a way of calibrating how I felt about things at any given time. With the chart, I could ask members of the team where they thought we stood - it became a litmus paper test of our confidence level. In retrospect, I'd found a semi-quantifiable way to communicate our fears, hopes, and beliefs in a manner that could surface differences of opinion, and tackle real issues.
Six Stages of Leadership Capability
The chart was built to analyze 'vision' situations. Let's walk through the various stages:
This chart can be helpful for a leader or manager of any program. Imagine giving it to each member of your team at, say, quarterly intervals - you cannot miss getting a sense of whether their confidence is growing or waning in the project or in your leadership. Bear in mind that if you yourself do not have a vision of where you and the team are trying to go, it is incredibly hard to build a consensus about how to get there. If you do not appeal to the underlying ego-driven needs of the individuals who are going to take the journey with you, it will be very hard to build the team. Because the journey to realize a new vision is almost always so difficult and unknown, it is really important that you as the leader be able to appeal to your team at more levels than just the game plan written out in some procedural way.
Point of view is obviously crucial to the success of leaders - nowhere is this more true than in these truly innovative entrepreneurial or intrapreneurial environments. Table 2 below shows some common perspectives about your program and your own leadership, as seen by your managers, and your team, for each of the six stages. In fact, it is often easier to find out how people really view the program by asking these seemingly peripheral questions - that way, they don't have to be directly confrontive with you, but they can express their doubts about the program without putting personal invective into the equation.
Multiple Perspectives of the Six Stages of Leadership Capability
For our polyglot digital tools team, I would use this chart several times that first year. It was a useful discriminator for "who got it" and who didn't, as well as for those who had enlisted versus Doubting Thomases who'd adopted a 'wait-and-see' attitude. In mid-1972, the digital team had strong belief that we were working on the right things, but not that we had 'the vision' yet. Bear in mind that I'd been an HP manager in this small division for most of a decade, seldom viewed as a leader in an E-stage or F-stage role. As 'our vision' crystallized over the year, rich debates between competing factions erupted, not easily resolved. Views of our leadership more often surfaced words like pig-headed or arrogant than charismatic or inspired.
Oh, my gawd. He took me literally. Virgil, our prototypical nerd, nicknamed 'high pockets' for the high-waisted dress slacks he wore, was calling from the Denver airport, beseeching my help. His words didn't make sense, as his high-pitched voice blurted out: "I cannot get on the plane, I put my tickets in the mailbox."
This thin, painfully shy, angular man hiding behind his horn-rimmed glasses had three master's degrees - in astronomy, quantum physics, and circuit design - but he persistently seemed unable to synthesize answers at the intersection of those fields. And now, this. He was on his way to Atlanta, to see how electrical engineers were designing products using integrated circuits (ICs). His mistake? He'd written out his bills - telephone, utility, and garbage - and handed them to the airline counter clerk, after he'd put his airplane tickets in the USPS box outside.
At the start of the year, I had had an inspired thought - at least it felt like it at the time. The idea was to formalize the market research tactics that I had used on both the display box and the Next Gen investigation in previous years. This R&D lab was created to build tools for designers who might use these new ICs. But none of us had used ICs ourselves.
So my idea was that each designer should go spend a week in a lab where designers were actually using these devices, and try to figure out what kinds of tools would help them. And in the first few months, half of my designers had done so, giving us many ideas.
But half of the designers had not gone anywhere - they hadn't even made plans to go. I lost patience at our monthly luncheon in June, and said something to the effect of "I don't care if all you do is go to Atlanta, and find out how the restrooms work - you'll have learned something."
Virgil, eager-to-please, promptly filed a trip request to go to Atlanta. He asked me what I expected him to find out, and I'd said, "If you know what you'll find, you don't need to go."
Circuit design with discrete devices had a sixty-year history - using either vacuum tubes or transistors as amplifying elements. For forty years, communications circuits for AM or FM radio had dominated the work, and many tools - including oscillators and voltmeters - had evolved to help both designers and maintenance people. Analyzing such circuits was named the frequency domain. World War II had stimulated a second class of circuits, called switching circuits, to build different kinds of systems - radar, sonar, and television. These circuits needed new design and analysis tools, primarily pulse generators and oscilloscopes - a field called the time domain.
Both frequency and time domain designs used discrete parts, wherein each device did one thing, and it connected to the whole circuit via two or three nodes or 'pins' that could be probed to see the voltage as a function of time or of frequency. The breakthrough idea of integrated circuits was that multiple discrete devices could be put on one semiconductor 'chip', thus making a complete circuit function within one 'device'. This creation earned a Nobel Prize in Physics in 2000 for its 1958 co-inventor, Jack Kilby of Texas Instruments.
IC industry progress was rapid over the first dozen years. Gordon Moore, co-founder of Intel Corporation, predicted in 1965 that the number of devices on a chip could double every two years. This became known as Moore's Law. In late 1971, Intel introduced the first microprocessor on a chip, the Intel 4004, with 2,300 transistors, mind-boggling for circuit designers of the day. And even though Moore's Law suggested that this number could become two billion devices by the turn of the century, none of us then imagined such a future. Moore's Law has held now for more than fifty years, giving rise to an Information Age product explosion - cellphones, laptops, entertainment devices, as well as automated systems for airplane guidance or traffic control to sophisticated banking and manufacturing operations.
Integrated circuits posed a major paradigm shift in requirements - and the instrumentation world had no answer for the design and analysis issues for this new class of devices. Drawing on the frequency vs. time domain duality, I postulated that this new category should be termed the data domain. As a term, it caught on. But that didn't answer the fundamental question - how do you design analysis tools for the data domain?
Bill Hewlett, HP's founder, had for years offered sage advice about such situations: first, "marketing is too important to leave up to marketing people"; and second, "we believe in the Next Bench Syndrome". The Next Bench Syndrome meant that HP engineers didn't have to travel for market research - we just solved problems that the design engineer at the next bench in the lab was experiencing. But Hewlett's dictum couldn't work if the problems weren't in nearby labs. The parent company, based in Palo Alto near dozens of leading-edge electronics companies, was close to many designers using such devices. My lab was thirteen hundred miles away, in a Wild West resort town with scant electronics activity. Except for Denver, a banking, commerce, and mining town, there wasn't another city within five hundred miles. There was no choice for our team but to travel to other companies and other HP labs to find designers using ICs. Citing Hewlett's sayings, I insisted that every designer spend a week with potential customers during 1972. Upon return, they were to report their findings to our monthly luncheon. We would then have a group debate about the lessons learned from those trips.
This travel decision was not without critics. My peers were certain that our 'kids' were unskilled with customers, and that such travel was profligate with Hewlett-Packard monies. They pointed out that engineers notoriously seek to find customers who agree with them, rather than independent unbiased input. There was a snobbish belief that our designers were so naïve, unsophisticated, or even crude that their presence would reflect badly on HP's reputation. True, we did have a motley crew. Field sales engineers still wore coats and ties to call on customers - I'd wager that half of our designers didn't own a tie. Peers invoked personal hygiene habits and salty language as reasons to keep the 'kids' out of customer facilities as well.
I had a different concern. What if we couldn't figure out the riddle? My view was that more brains are better on this problem - it was just not at all clear what kinds of solutions might best work. So I had to bet on the 'kids' - they were the only chance we had. My peers grudgingly agreed, if a field sales person set up the meetings and accompanied every engineer to a customer.
Many of these designers had never been to a customer facility in their lives; they were very unclear on what they were to do. I used our monthly luncheon meetings to describe how they might go about it. Their questions included "Where do I go?", "What should I be looking for?", and "What do you expect me to find?" Some engineers couldn't wait to go; others seemed frozen at their lab benches. Two product marketing managers, Dick Cochran and Bruce Farly, helped them choose which customers to visit, and what to look for and ask while they were there. In effect, they gave training classes in market research. The biggest point was not to ask "What do you need?" but to ask "What are you trying to do? Show me what you do and how you do it."
Importantly, we had carte blanche to design our strategy without interference or help from corporate planners - a legacy of HP's unusually decentralized organizational structure, and Hewlett's belief that innovation in remote divisions was a vital force for corporate renewal. We began with the idea that we wanted to build a digital 'scope rather than an analog 'scope. Our definition of a 'digital 'scope' was not an analog 'scope with digitized readouts, but instead a tool as analogously valuable for digital designers as 'scopes were for analog designers.
We just didn't know the features yet. And Virgil couldn't even catch his plane to Atlanta.
Finding receptive companies to visit wasn't as hard as we expected. But the trip reports were often conflicting. Burly, argumentative Dan Kolody, with previous IC design experience, visited five different design teams in Phoenix - at Motorola, Honeywell, and Hughes Aircraft. He found a plethora of integrated circuits with exotic mnemonics - RTL, DTL, TTL, and ECL - each a family cluster that required a specific set of voltages, pin-outs, fan-outs and design rules. But RTL (Resistor-Transistor-Logic) was not compatible with TTL (Transistor-Transistor-Logic), and so forth. Emitter-coupled Logic (ECL), five times faster than any other family, naturally was the choice for military systems, but it was incompatible with all other ICs.
Back home, Kolody shared his learning. Half the team challenged him, calling him just a classic 'scope designer looking for the highest frequency problem on which to masturbate. Those were fighting words; the group polarized almost instantly - Kolody, loud and profane, rose to the bait like a small-mouth bass in spawning season. Amicable discussions eluded us for months.
An independent team led by urbane Sam Lee traveled to IBM Poughkeepsie, NY, and to Burroughs in Blue Bell, PA. Their high-frequency requirements, also ECL, were well in excess of 100 MHz, for multi-pin integrated circuits used by the hundreds on large printed circuit boards. It boggled our minds. Corroborating trips to computer manufacturers Control Data and Sperry in Minneapolis, MN and Cray Research in Chippewa Falls, WI affirmed these needs. Soft-spoken, but standing tall, Lee delivered his sobering report to a silent audience.
Another team, with methodical Bruce Farly and gregarious John Marshall, found companies designing with lower-frequency TTL logic. These were companies like Raytheon Data Systems, building airline reservation systems, and NCR, building Point-of-Sale terminals such as Wendy's hamburger cash registers with lettuce and mustard keys. These companies were all enamored of Stanford University methodologies such as Quine-McCluskey gate minimization rules. These methods were concepts familiar to HP lab designers in the Bay area, but no one at our division had ever heard of them. Bruce and John were both careful, patient journalists, recording what they saw without putting judgment on their observations. Not so for some of the team. Passions could run high, and arguments often ensued.
The HP-35, "Worldwide First"
Scientific Pocket Calculator
Internal sub-assemblies of the HP-35
HP Memory Collection
Baby-faced Bill Farnbach, a surprisingly caustic manager, and his precocious lab technician Chuck Small were fascinated by the algorithmic state machine (ASM) designs used in HP's desktop calculators and the very exciting handheld calculators from Palo Alto that debuted in January, 1972. While the desktops were the highest volume user of Intel D/RAMs, they lacked an integral microprocessor. The handhelds used a very simple microcomputer, a one-bit chip from Mostek Corporation. As a group, we were profoundly impressed by HPite Chris Clare's Algorithmic State Machine Design book, which described an analytic design technique of state variable mathematics taught only in specialized places (e.g., the prestigious Indian Institute of Science in Bangalore). Brilliant, insouciant Clare, teaching one semester at Stanford, was invited not to return - his avant-garde concepts were far more advanced than Quine-McCluskey.
The HP 1620A Pattern Analyzer
HP Memory Collection
One problem with the Intel circuits was that they were based on Metal Oxide Semiconductor (MOS) technology, only one-tenth as fast as even the slowest TTL chips. P/MOS was the original technology, then N/MOS processing produced speeds twice as fast. Complementary MOS, or C/MOS, had the additional advantage of a quiescent state which drew almost no power - making it the right choice for battery-operated equipment. But these were toy applications, idiotic in the view of "real" designers because they were so slow.
The team, combative all spring and summer, never could agree. I decided to let the market decide, so we initiated a multi-faceted strategy by autumn. First, Duncan Terry expanded the D'wuck into a 'scope-like machine, with focus on real-time attributes and a wide-word serial trigger. This box, defined around IBM and Bell Labs needs, emphasized historic analog designer skills with augmented 'scope functionality. Consequently, it was viewed with suspicion by the new recruits to the program. It proved an expensive dud, even at IBM and Bell Labs.
Second, Kolody's team built some accessory boxes, tools intended to supply 'scope trigger signals. These units, aimed at TTL and ECL IC families, failed to ignite much customer interest.
The HP 1645A Data Error Analyzer
HP Memory Collection
The third group was hunkered down, trying to solve digital communications problems. They chose a new European communication standard from CCITT (Comité Consultatif International Téléphonique et Télégraphique) which enabled "digital telephony". We built an end-to-end system tester, the HP 1645A Pseudo Random Bit Generator, to test the pioneering Austrian phone system. European telephone system designers bought enough systems to keep us going.
The fourth group was the most speculative - after studying the new P/MOS Intel chips, Bill Farnbach constructed a low-frequency (10 MHz) sampling system with twelve parallel channels. It was a useful tool that displayed a sequence of register events within a microcomputer, while the chip executed a small program. Farnbach, experienced with sampling at frequencies one hundred times faster, was a one-man band in defining the features and the approach.
Dick Cochran, left, discusses 1601L LSA with Wescon Visitors.
Measure Magazine, August 1973.
Courtesy of the Hewlett Packard Company
The breakthrough result was the HP 1601L Logic State Analyzer, a plug-in for an HP 180A 'scope mainframe that could handle twelve input signals, printing the results as either a "1" or a "0" at each event-time. Assembling twelve signals side-by-side gave a "machine state" that could be organized as twelve binary, four octal, or three hexadecimal words. The HP 1601L (picture below) was introduced at the WESCON trade show in Anaheim, California in August 1973.
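Those binary, octal, and hexadecimal groupings are just positional renderings of the same twelve bits. A minimal sketch (illustrative only - the state value here is made up, not from any HP document) of one machine state shown in each base:

```python
# A hypothetical 12-bit "machine state" captured across twelve probe channels.
state = 0b101101001011

print(f"binary: {state:012b}")  # twelve 1/0 columns, as on the 1601L display
print(f"octal : {state:04o}")   # the same state as four octal digits
print(f"hex   : {state:03x}")   # or as three hexadecimal digits
```

Grouping the twelve columns in threes yields the octal digits; grouping in fours yields the hexadecimal ones.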
It seems crazy to imagine now, but this unit represented a paradigm shift that confused many folk. At a division review demonstration, HP's corporate development director uttered in frustration, "Time always flows left to right; you're telling me that it goes top to bottom?" Computer folk said, "Twelve channels? All computers have sixteen or more." The same folk also said, "No one builds 10MHz systems, they're all 20MHz and up." And almost all early visitors wanted to know how you'd see rise times on fast channels, or compare race conditions.
After the August 1973 launch, the sales team struggled at first, but a few salesmen did enjoy success. Whereas the large screen display had sold forty units within the first four months, this first logic state analyzer sold ninety-one units by year end. The division manager, Hal Edmondson, was happy, and I was invited by Bill Terry, the newly named Instrument Group Vice President, to Las Vegas for the annual sales kick-off, with all one hundred eighty salesmen.
In retrospect, it is clear that Terry expected me to give an upbeat sales talk, which was the normal mode for division presenters invited to speak to this group. I hadn't really spent much time with Bill since he'd left Colorado Springs, and there we'd last faced off on the question of how he'd concluded that only thirty-one large screen displays could be sold. I focused the talk to the sales team on how well individuals were dealing with this new logic tool.
I singled out two folk - a superb young Denver-based chap who had sold eleven analyzers to ten different customers in four months, and veteran Jack Zorn in our Boston office, who had sold seventeen units to Raytheon Data Systems outside Boston. I'd been out to visit each, and I shared that Jack had simply found a 'digital guy' within Raytheon, who became enthused and the orders just started coming for Jack without any further effort. By contrast, the Denver fellow had become a bit of an expert on microcomputer chips, and he demonstrated the features of the new analyzer with exuberance to each prospective customer. Using these two extremes as examples of 'how to do it', I went on to note that only seven salesmen had sold more than two units. Gratuitously supplying the results, I shared that one person had sold six, another four, and three people had sold three. Twenty-eight had sold one; one hundred thirty-seven had yet to succeed.
In fact, three major sales regions had never sold a unit - an embarrassing factoid to be sure, and one that the sales leadership did not appreciate being mentioned. Terry was livid, but not nearly as enraged as some of his regional sales managers. I did not appreciate the gravity of this political error; suffice to say that it earned me a reputation if not accolades with the field force.
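Gratuitous or not, the numbers I quoted do tally with the ninety-one units and one hundred eighty salesmen - provided the eight salesmen not otherwise accounted for had each sold two units, an inference on my part rather than something stated in the talk:

```python
# Units per salesman, as reported: the Denver engineer (11) and Jack Zorn (17),
# plus one at six, one at four, and three at three - the seven above two units.
# Twenty-eight sold one; one hundred thirty-seven sold none. That names 172 of
# the 180; the remaining eight are assumed to have sold two units each.
counts = [11, 17, 6, 4, 3, 3, 3] + [2] * 8 + [1] * 28 + [0] * 137

print(len(counts))  # 180 salesmen in all
print(sum(counts))  # 91 units, matching the year-end total
```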
A year later, it was clear that the HP 1601L passed the 'customer excitement' test. And the HP 1645A opened up a new market for HP. We got a few sales, we got noticed, and we learned a lot. The team embraced the notion that synchronous design was important. Importantly, we came to trust each other, jelling as a team. Thank goodness, our group was allowed to continue. We retitled ourselves, "Logic Analyzer Lab." The road would be rocky, but we were underway.
How did we do financially? The HP 1645A PRBS Generator and the HP 1601L Logic Analyzer plug-in each sold several dozen units per month; eventually each produced about $3 million revenues, with net profits approaching twelve percent. HP's R&D lab goal was a profit return of six times investment. Our development 'burn rate' was about $1.1 million per year, so back-of-an-envelope accounting for the twenty months it took to "get to market" penciled out returns of maybe $0.40 per dollar invested instead of the $6.00 goal. Not so hot.
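That back-of-an-envelope arithmetic can be checked directly - a rough sketch using only the round numbers quoted above, not HP's actual accounting:

```python
# Two products at roughly $3M lifetime revenue each, with ~12% net profit.
profit = 2 * 3.0e6 * 0.12          # about $720,000 net

# Burn rate of ~$1.1M per year over the twenty months to market.
investment = 1.1e6 * (20 / 12)     # about $1.83M invested

# Return per dollar invested - roughly $0.39, versus HP's $6.00 goal.
print(round(profit / investment, 2))
```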
The next round featured new cabinets and linkage. The HP 1600A and HP 1607A, bundled as the HP 1600S (picture below), allowed 16 or 32 bits versus 12 bits for the HP 1601L.
The HP 1600S Logic State Analyzer 'system'
The HP 1600S Display Screen
These products, once we learned how to market them, sold several hundred units per month. They cumulatively sold about $38 million in revenues, with 17 percent margins - giving a respectable $3.25 return per dollar invested. We hadn't hit the ball out of the park, but we were still at bat. We were earning our own way by this point; importantly, we were getting press.
The launch of the second round was troubled though. I was supposed to manage the R&D and product marketing program, not the sales success. That was the task of our sales team, both within the division and across HP's far-flung worldwide sales offices. Intrapreneurs, according to Gifford Pinchot's Ten Commandments, "circumvent any orders aimed at stopping the dream." I'd done it on the HP 1300A Display when Packard tried to stop it. Now a new threat emerged.
Our division sales manager, Bob MacVeety, returned from the annual July 1976 sales quota setting meeting with a $6.6 million Logic quota, after 1975 sales of $4.1 million. Bob was enthused that the notoriously chary field force accepted a stunning sixty-one percent increase.
MacVeety unwisely chose to share his quota setting achievement in a division staff meeting rather than telling me first - a big mistake for both of us. I blew up in the meeting - literally coming apart at the seams. Shouting "it should be $16.6 million, not 6.6," I stormed out of the room. I couldn't believe the level of my anger; my hunch is that no one else in the room could either. Fortunately for me, Hal Edmondson, the division manager took my side. We set a new goal, ignoring the field force - "$16.6 or Bust," built on the Colorado Springs history of "Pikes Peak or Bust" by the original gold miners. Even more fortunately, MacVeety had the maturity to perceive that my anger wasn't personally directed; he proved a tremendously helpful partner.
Logic Analyzer Belt Buckle
Three independent marketing ideas made the program work.
The first marketing idea - a brilliant one - was MacVeety's. He created the Logic Analyzer Belt Buckle. Selling ten Logic Analyzers earned a Bronze unit with the person's name and the date engraved on the back; twenty-five earned a Silver, fifty a Gold, one hundred a Platinum.
Just as Boy Scout merit badges built energy and enthusiasm, these garish buckles - nearly three inches wide, five inches long, weighing half a pound, and fashioned after gunslinger buckles from Colorado Gold Rush days - became a cult hit with the HP sales force. Salesmen wearing them to annual conventions would whip off their belt and compare dates of success. Bob, a 'good ole boy' at heart, created a wall-mounted kit to hold all four, with a suitable certificate. This idea, which I thought completely corny, proved to be one of the best incentive programs I've ever seen. They became collector's items.
Second was the idea of training the sales force, but doing so subtly, with a seminar that they invited their key customers to attend. This was conceived by William Wagner, who came to us from Motorola after we demonstrated to a surprised design team there that we knew about their unannounced 'look-ahead' feature in the new Motorola 6800 microprocessor. Wagner reasoned that a seminar teaching the fundamentals of microprocessor design with proper tools would be effective. He spent nine months designing a 'suitcase' of experiments and writing a set of twenty test cases - it was a leading edge showcase. Six 'teachers' from our product marketing group, plus Bill Farnbach and I, took this show "on the road" for a year. 25,000 attendees enrolled for a two-day free seminar, averaging one hundred attendees per group - it was a spectacular success.
Third, John Marshall and Bruce Farly constructed a University Associates program that eventually involved more than one hundred engineering schools with grants of $10,000 of logic analyzer equipment, selected by faculty who attended a three-day intensive at our division (plus two days at a Colorado ski resort). The hook? The faculty person had to design a lab course using the equipment, and a year later, report to our next meeting how well the course worked.
The surprise for me was the enthusiasm that this program generated both for faculty and for our own team. Many of our junior designers loved the idea of 'teaching their old professor' something new! The motivation they had to demonstrate their own growth since leaving school was immense, as was the camaraderie that this close involvement built between our team and faculty who hitherto had not given much credence to our division's tools.
More importantly, over time this gave rise to new curricula, developed by a cadre of faculty who came to know each other through our program. Years later, they had become the leaders of the 'next generation' of college professors teaching electronic design to countless students, cementing digital logic design or data domain concepts firmly in the landscape.
Again, it would be great to report that this was a mostly smooth path to success. Alas, it was not. We had stormy fights, defections, and casualties within the group; relations with other Colorado Springs groups were often strained. Interactions with the HP Santa Clara Division, busy building their own interpretation of 'digital tools', were volatile shouting matches rather than synergistic exchanges. We did continue to use HP divisions for market research. Returning from one trip to HP Fort Collins, we stopped in Denver for dinner. Over dessert, one designer said, "You know, they used to know so much more than we did; it's amazing how far we've come."
Asynchronous latching for the HP 1600S was a great example of the value of such visits. We had learned internal synchronous state-flow measurement well - so well, in fact, that when questions arose about comparing two asynchronous machines, we didn't listen. True, we had heard customers complain that the HP 1601L's twelve channels fell short of what they required, so we fashioned two units - the HP 1600A with a display CRT, and its accessory unit, the HP 1607A without a CRT, each with 16 channels. The two could be linked - as the HP 1600S - to create a 32-channel machine to watch synchronized dual 16-bit register flows. So we were relatively smug about our 'listening ability' - we heard well what we wanted to hear.
What couldn't be done with these linked machines, though, was "handshake" between two different microcomputer systems; displaying first one and then the other on screen. Somehow our team just couldn't hear this asynchronous handshake idea as a valid request. "Why would you want to?" went the argument. Much unwarranted energy and time was consumed with this debate, until one day pragmatic Bruce Farly had heard enough. He called the Loveland team, saying, "We'd like to bring our new prototypes up." On site, within minutes the feedback was, "Boy, it'd sure be useful if we could look at this machine interacting with that one." Which we couldn't do. Finally setting aside our pride factor, a relatively simple change enabled it. It became a most important feature for minicomputer designers.
Years later, I had the privilege of listening to Irwin Jacobs, founder of Qualcomm, explain to a Computer History Museum audience just how important this asynchronous handshake mode was for his first company, Link-a-Bit, when he tried vainly for months to get IBM engineers to listen to him. Harking back to the analysis that I had done for Bill Terry's sales force, I approached Jacobs afterwards to say that Link-a-Bit had been the seventh buyer of our first Logic Analyzer. He straightened up, surveyed the small group around him, stuck out his hand and said, "Congratulations, that instrument made my company."
These tools did enable key computer developments. In 1977, the second largest computer company in the world, Digital Equipment Corporation, had a new product under development. The DEC VAX-780 had Virtual Address eXtensions, a unique method of extending 16-bit computer address space to a 32-bit address. The power of this bit-doubling, non-intuitive to the lay person, is that it changes addressable locations from a mere 65,536 to some 4.3 billion. Virtualizing this space was a brilliant technical approach, novel and effective, but hard to debug.
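The arithmetic behind that bit-doubling is easy to verify - a quick sketch of my own, not anything from the VAX documentation:

```python
# Addressable locations double with every added address bit:
# 16 bits -> 2**16 locations, 32 bits -> 2**32 locations.
narrow = 2 ** 16
wide = 2 ** 32

print(narrow)          # 65536 locations for a 16-bit address
print(wide)            # 4294967296 - roughly 4.3 billion
print(wide // narrow)  # doubling the bits multiplies the space by 65536
```

Counter-intuitively, going from 16 to 32 bits does not double the address space; it multiplies it by 65,536.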
To analyze the VAX operation, DEC engineers needed our tools, but feared we'd share their designs with HP's computer group. HP computer managers argued that our products should be built primarily for HP's computer group, to give them an edge against stiff competition. I felt that our Logic Analyzer tools could only be 'best' if we understood the toughest measurements in computing, problems more apt to occur with some of HP's many competitors. I passionately argued that "instrumentation engineers have to maintain a Swiss-like neutrality" for all users.
The decision to help DEC was affirmed by the Executive committees and Board of Directors of both companies. Under non-disclosure rules, we put five people in DEC's lab for several months. Legendary Gordon Bell, DEC's R&D VP, would much later note that "HP's program-controlled logic analyzers allowed engineers to test and debug these complex systems."
IBM's R&D Vice President Joel Birnbaum wrote in 2011 that, "It is easy to forget how very difficult it was to evaluate design decisions and resultant performance characteristics of digital circuitry before these instruments were developed. From 1971-75, it is not an exaggeration to say that this single-cycle, microcode-free style of machine design, so cache-and timing-dependent, could not have succeeded at IBM without the Hewlett-Packard instruments."
Electronics magazine, the leading publication for electronic engineers, took a bold step, putting a new concept, The Data Domain, on its front cover in May 1975. Two lengthy articles two weeks apart made the point that these microcomputer marvels were ushering in a new perspective for designers - electronics would never again be the same.
The editors were nervous when we outlined the thesis for them in January 1975 - they said, "This really reduces the importance of everything our magazine has always stood for. And we're not sure our audience is ready for this."
Somehow, we persuaded them. In retrospect I think it was obvious, but I recall a half-day of argument and debate. I felt it important that they really understood the distinction between - and the significance of - the frequency-domain and time-domain descriptors on which MIT's Ernst Guillemin had spent so much time in the 1950s, trying to move electronics from worrying just about continuous wave theory to focusing on switching theory. Radar and computers, with invaluable oscilloscope tools, couldn't have happened with only a frequency-domain perspective.
I drilled this point home with the data domain. No longer will designers really care about pulse shapes, rise and fall times, overshoot and ringing, any more than radar systems worry about carrier frequencies and modulation. They instead will worry about logic state, register contents and flow, and flag line settings - in short, the data status of the machine in terms of data events, not in terms of either time or frequency events and parameters.
The editors bought it. And they gave us two seminal articles to illustrate our thesis. It was a watershed moment, and the products changed the world. But we still didn't have buy-in from the Colorado Springs management staff. I invited designers Tom Whitney and Chris Clare out from Palo Alto to describe how the exciting HP handheld calculator was invented. Clare had given us copies of his algorithmic state design book; Whitney provided some microprocessor chips.
Walt Fischer in HP's 'scope lab had taken one of these microcomputer chips and embedded it into the HP 1700 family of portable scopes, calculating rise and fall times and pulse width for signals shown on screen between two dots movable by the user. Since it was the first HP instrument to incorporate a microprocessor, it got considerable fanfare, once again beating Tek.
HP 1740A and HP 1607A combination
HP Memory Collection
While the HP 1722A was indeed a 'digital 'scope', it was not a new paradigm - it merely digitized measurements that 'scopes had made for years. Somewhat shamelessly, we traded on this for our second generation Logic State Analyzer machines, borrowing existing HP 1700 'scope tooling to save costs, but also to provide an image that these were 'normal' extensions of 'scopes, even though they weren't. For the HP field force, this was psychologically important - the boxes came from the same division, they 'looked the same' from external appearance, and we even created datasheets and application notes describing how the HP 1740A and the HP 1607A could be linked to find both synchronous logic errors and asynchronous spurious signals. Nonetheless, the net result for the local HP executive team was still a fair amount of skepticism and cloudy understanding at best of our true contribution. How to solve this conundrum?
Was it important to solve? The emphatic answer: "YES." Bob MacVeety's willingness to accept a ridiculous quota from the HP sales force revealed that he was playing by 'the book' rather than from innate belief in the nature and impact of this new equipment. I had worked out a strategic positioning paper - "One Strategic Cycle equals Three Product Cycles" - which characterizes much of what I still teach today. We needed a string of three successive product families to establish a solid leadership position. But the third wave was at risk.
This second generation of equipment, while a logical follow-on to the HP 1601L, was at best evolutionary - fixing the channel count, and mostly providing a way to handle wider busses. If these intermediate tools didn't 'blow the doors off', we'd not get third-round funding, or wouldn't get it soon enough to maintain momentum.
The machines our lab now envisioned would use microprocessors themselves for significant new capability. We imagined a line of dedicated bus analyzers, some serial and some parallel, and all would display answers in 'data domain' format rather than as simple "1's" and "0's".
A corollary of a fully thought-out Strategic Cycle approach is that the costs to build the third generation of equipment dwarf the investment in the first two rounds. Such a third-generation bet needs to start well before the second generation is ready for launch. Such processes are seldom taught in management schools and seldom appreciated by corporate consultants, which is why many corporate innovation initiatives lose momentum. We now found ourselves in that precarious spot - needing huge support for the third round of investment before results were in from the second.
For fiscal year 1976, the Logic Analyzer group booked $12.9 million in orders, and delivered $11.8 million to customers, almost double what the sales force quota had been. I was thrilled - we'd tripled sales from the previous year, established HP as the world leader in Logic Analyzers, and 'taught' the sales force a lesson. But I wasn't prepared for the reaction in the overall division, or even within our group. People had internalized "16.6 or Bust" to the point that $12.9 million was seen as a monumental disaster. Nothing I could say could assuage that feeling. We got tarred with a 'loser' label within the larger division; even people in Logic felt bad, as though we'd been whipped.
Support from the executive team waned, both for our group and my brand of leadership.
What to do? I decided to construct and teach a ten-week workshop for the Colorado Springs management team. HP had from inception believed in promoting design engineers into all key jobs of the company; every member of Hal Edmondson's division staff had an engineering background. I traded heavily on this, setting up a workshop project in which, working in teams of two, these folk would define, design, and build a calculator. Meeting at 7am every Tuesday for two hours, they came. The class studied how a four-bit microprocessor worked, and how to connect a keyboard and recognize keystrokes. We figured out together how light-emitting diode (LED) readouts were driven to make the right numbers. And then it was 'build time'.
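To give a flavor of what those workshop teams wrestled with: driving an LED readout comes down to mapping each digit to a pattern of lit segments. This is my own minimal seven-segment sketch of the idea, not the workshop's actual circuit or code:

```python
# Illustrative seven-segment digit patterns; segments a-g, with 'a' as bit 0.
# Segment layout:    a
#                  f   b
#                    g
#                  e   c
#                    d
SEGMENTS = {
    0: 0b0111111,  # a b c d e f
    1: 0b0000110,  # b c
    2: 0b1011011,  # a b d e g
    3: 0b1001111,  # a b c d g
    4: 0b1100110,  # b c f g
    7: 0b0000111,  # a b c
    8: 0b1111111,  # all seven segments
}

def lit_segments(digit):
    """Return the set of segment letters that must be driven for a digit."""
    pattern = SEGMENTS[digit]
    return {letter for i, letter in enumerate("abcdefg") if pattern >> i & 1}

print(sorted(lit_segments(7)))  # ['a', 'b', 'c'] - the three strokes of a '7'
```

A real calculator would strobe these patterns onto multiplexed digit positions fast enough that all digits appear lit at once - exactly the kind of behavior that was hard to see with a 'scope and easy to see with a state analyzer.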
For test equipment, anyone was free to use tools that they knew from their history - 'scopes, voltmeters, etc. - and they could use our Logic State Analyzers if they desired.
It wasn't long before the lessons began to show their worth. There just was no way to figure out what was happening with these small but powerful chips using any tool that was historically available. Yet it was simple to discern proper operation using the new tools.
Once the class had their calculators 'up and running', we moved to more sophisticated goals, such as linking two machines, illustrating the more interesting features of our latest products. For the last two classes, a couple of our key designers described some possible extensions. 'Lightbulbs' dawned around the room. Resistance to our 'third investment wave' waned.
This would prove to be the breakthrough that 'made the line.'
While ASM designs had proliferated, the rapid growth of microprocessors changed the rules. We built microprocessors into our entire line - the first time any test equipment vendor used microprocessors in tools to monitor microprocessors. More capabilities were added to these state analyzers: wider busses, sequential state triggering, and alternative displays, including decoded octal, hexadecimal, and microprocessor mnemonic instruction sets that replaced the "1's" and "0's", plus state-space maps that helped illustrate more complex state flows.
The first product out of the gate was led by Jeff Smith, an experienced sampling 'scope and pulse generator designer, who teamed with a native Colorado Springs software developer, Tom Saponas, to define and create the HP 1611A Logic State Analyzer. The HP 1611A featured 'plug-in personality modules', each tuned for individual microcode instruction sets for a specific microprocessor from one company. The first module released was for Intel's newly released 8080 microcomputer. Personality module designers included Gail Hamilton, Dave Hood and Debbie Ogden. Don Bloyer, Don Miller and Roger Molnar contributed product design skills, each of them experienced in our 'scope labs.
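To give a feel for what a 'personality module' display accomplished, here is a toy sketch of decoding captured bus bytes into Intel 8080 mnemonics instead of raw bits. The handful of opcodes shown are real 8080 instructions, but this is a simplified illustration of the idea, not the HP 1611A's actual decode tables:

```python
# Illustrative only: a few real Intel 8080 opcodes mapped to mnemonics.
# A real personality module decoded the full instruction set, including
# multi-byte instructions and their operand fields.
OPCODES_8080 = {
    0x00: "NOP",
    0x76: "HLT",
    0xC3: "JMP",    # followed by a 16-bit target address
    0x3E: "MVI A",  # followed by an 8-bit immediate
}

def display(captured_bytes):
    """Show each captured bus value in binary, hex, and mnemonic form."""
    lines = []
    for b in captured_bytes:
        mnemonic = OPCODES_8080.get(b, "?")
        lines.append(f"{b:08b}  {b:02X}  {mnemonic}")
    return lines

for line in display([0x00, 0x3E, 0x76]):
    print(line)
```

Reading "MVI A" at a glance, rather than mentally translating `00111110`, is the whole point of the mnemonic display.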
One afternoon, Debbie approached me with a request. She timidly said, "I'd like to become a lab manager like you. What were the three most important things you learned along the way?"
I'd not been asked something like that before - startled, I blurted out the first thing that came to mind: "I got a History degree in the History of Science and Technology, I joined the Colorado Air Pollution Control Commission, and I bought a plant nursery with my wife." Debbie looked at me as though I had three heads. In retrospect, it wasn't that bad an answer, but in truth, it stopped her cold. So then, I had to take a few minutes to explain - in actuality I had to figure out why such an answer came so easily to me.
The history degree, I explained, allowed me to learn from history - so that you don't have to repeat every mistake along the way. The example I used was the learning from historian Thomas Kuhn's thesis about the structure of scientific revolutions, which admirably described how our management, our customers, and our competitors all reacted to our products - disbelief in the new thesis, and a redoubling of efforts on the conventional wisdom approaches.
The air pollution work, I told her, was the first time that I found myself in a situation where I knew very little about the science or technology, and yet I was expected to make decisions. How do you handle decision making in the absence of adequate knowledge? The conclusion was that it is okay to ask 'dumb questions' - you're not expected to know everything.
The nursery business was perhaps most important: when I couldn't meet payroll, I really 'got it' about cash flow and profits. To my surprise, I found that HP financing worked the same. Ironically, the Return Map that Ray Price and I installed at HP and wrote for the Harvard Business Review had its origin in teaching nursery employees about return on roses and shrubs.
Debbie and I sat together, pondering these answers. I still wonder why they came so quickly.
New contributors joined our group - George Haag, from the California Data Terminals division, led development of the flagship HP 1610A (Figure 3a, and the picture below), a high, wide, and handsome $10,000 state machine. Haag, an erudite but bellicose leader, led a skilled team - experienced circuit designers Justin Morrill, Jim Donnelly and Steve Shepard; the division's first software developer, Gordon Greenley; and long-time mechanical 'scope designer Don Skarke.
Many folk within the Logic division viewed this product as overkill - too expensive, too late, out of touch with the emerging microprocessors - but Haag's insight came from the minicomputer world, not the microprocessor world. This is the box that appealed to the Cupertino divisions and to Digital Equipment, and, importantly, changed IBM's outlook on these tools. Extensions of this line designed for the nascent Reduced Instruction Set Computer (RISC) architectures in 1981 would give HP Labs important insights for a major restructuring of the mainframe computer world, which brought DEC to its knees and eventually even dethroned IBM.
The major bet of the Logic team was instead on the HP 1615A, our first product defined as a combination Logic Timing and State Analyzer. Logic Timing Analyzers portrayed logic signals as pseudo timing diagrams, much easier for conventional electronic engineers to grasp (HP managers from Palo Alto liked these better, since time still ran from left to right). The competition, Biomation and Tektronix, mostly built Logic Timing Analyzers; their designers still struggled with the paradigm shift to thinking in instruction sets and register flow. Thomas Kuhn's thesis was holding true in our field as surely as in the Copernican astronomical revolution of Galileo's time.
Bill Martin and John Scharrer teamed up with Bob Wickliff to create the HP 1615A, and it rightly proved to be a winning combination - with a $6,800 price tag, it matched the best Logic Timing Analyzers from the competition, and in addition handled up to 24 channels of synchronous state capability (Figure 3b).
HP's sales force was much more comfortable with this switch-hitting machine than with our state analyzers. They loved selling head-to-head against Biomation and Tektronix - good salesmen always do.
Figure 3
(a) HP 1610A Logic State Analyzer    (b) HP 1615A Logic State / Timing Analyzer
But the real point of this machine was much more significant than that. The HP Journal article, co-authored by the three ex-'scope designers Scharrer, Wickliff and Martin, brilliantly outlined the unique capabilities available to designers using the combination of asynchronous and synchronous analysis - along with a 'glitch trigger' that captured sales folk and designer imaginations alike. This unique 'glitch trigger' - a throwback to old concerns about 'race conditions' - actually did have important value on occasion for isolating misbehaving circuitry.
The HP 1602A Logic State Analyzer
HP Memory Collection
The HP 1615A was the first HP Logic Analyzer product to book $15 million per year. Only a very few HP instruments - the historic counter from the Santa Clara division, the big Spectrum Analyzer from the Microwave division, the HP 180A 'scopes, and the HP 1700 portable 'scope series - ever reached this level. While it was a big accomplishment for the group, from my standpoint it was a signal achievement in marrying the talents of our 'scope designers with logic designers to build truly amazing merged capabilities.
Two other products, each uniquely defined, also debuted. Bill Farnbach contributed a portable version for field work, the HP 1602A. Chuck Small was his project leader, and versatile Al DeVilbiss - who had done so much in both 'scopes and displays - was the software developer. The expectation was that this would be a 'portable scope' equivalent - volumes would be correspondingly huge. The price point, the size and weight, and the ease of use all contributed beautifully, but we hadn't reckoned on the enormous time lag for adopters. Lab designers were just learning these tools; service groups weren't about to jump in yet.
The HP 1640B Serial Data Analyzer
HP Memory Collection
John Poss, Bob Erdmann, and Rick Vestal teamed to define and build the HP 1640A, a serial bus analyzer that was the first logic analyzer to decode transmitted text, in ASCII character format. We had a request from Y-HP, our Japanese division, to do this also for JIS-7 code; when we made the conversion, we had a hard time finding any Japanese near Colorado Springs who could verify operation. The unique requirements of this unit led me to think hard about partners.
Collectively, the line-up was formidable. It took the industry by storm, creating profitable sales unheard of for the division - cumulative revenues passed $100 million within three years; return on investment exceeded 10:1. Once our seminars began, coupled with the Electronics stories, we attracted attention. Sales blossomed, editors visited, and stories accumulated. It is seductive to be wanted, to be noticed. It also has its costs, of which we were blissfully unaware.
We had come a long way. We had a vision, and we had a plan that most people understood, and worked on diligently. The wider division, not to mention HP leadership elsewhere, was more chary. Many cross-currents flow within and around organizations; leadership is seldom universally acknowledged. Plenty of 'not-quite-converted' Monday-morning quarterbacks still existed in the parent division.