The Integrated Circuit Era   (1959–1975)

Compressing the World of Electronics


The Kennedy–Nixon Presidential campaign of 1960 brought forth an unlikely issue: American science and technology. The Russians had beaten the United States to space with their Sputnik satellite, and many felt the Russians had even more guided missiles than did the U.S.

The years ahead would show the "missile gap" to be illusory, though the arms race would continue. By the end of the decade, American astronaut Neil Armstrong had set foot on the moon, a staggering achievement made possible by breakthroughs in electronics and other fields throughout the sixties. In fact, the next 15 years would teem with great discoveries in fields as diverse as microelectronics and microbiology.

While human life was being transported to the earth's nearest neighbor in space, scientists probing the mysteries of the living cell isolated the basic particle of life itself. And South African surgeon Christiaan Barnard showed that a man could survive with another man's heart, thereby raising hopes that organ transplants would one day cure patients once considered incurable.

Integrated circuits, spawned in the 1950s, dominated the 60s as much as transistors ruled the previous decade. ICs found their way into missiles, computers, electronic instruments, communications equipment and consumer products. The tiny, ever–more–complex circuits made possible systems that were smaller, cheaper and more reliable than before.

Further, steady and dramatic advances in IC complexity and fabrication led directly to "computers on a chip" (microprocessors) in the 1970s, which were as much an advance over conventional ICs as ICs were over transistors.

But if the 1960s proved the best of times for science and technology, they were anything but that for the country's political and social institutions. Upheavals of one sort or another seemed to be the new way of life. President John F. Kennedy, his brother Robert, and black activist Martin Luther King were all assassinated. Parents acquired a new fear, that their children might succumb to the growing incidence of drug use. The war in Vietnam expanded throughout Southeast Asia as protests increased on campus and elsewhere.

Enter the IC era

Against this unsettling background, engineers and scientists strove to establish new frontiers of knowledge. By the late 1950s, the key techniques that would be used for the fabrication of ICs had been developed for transistors.

A young, Dallas–based company, Texas Instruments, successfully replaced germanium with silicon as the transistor's semiconductor material, thereby extending operating temperatures to military ranges.

About the same time, Bell Laboratories developed oxide masking and diffusion. These techniques led to improved quality control and reduced manufacturing costs.

Long before energy conservation became a burning issue, International Rectifier in 1959 built a solar–cell energy source for autos and mounted the source on a 1912 Baker Electric. The solar cells put out about 200 watts.  

New solar cell technology

The crucial next step was taken at Fairchild and Texas Instruments. Engineers at both companies sought to produce, on a single chip of silicon, not only transistors and diodes but also resistors and capacitors — and then join the components to form a complete circuit.

The special properties of the circuit elements were to be achieved by selectively diffusing traces of impurities into the silicon or by oxidizing its surface to silicon dioxide. By the use of photolithography, selected regions of silicon would be exposed while other regions would be protected.

At first, TI used fine wires for bonding the various elements into a functional circuit. At Fairchild, engineers achieved the same result simply by evaporating a thin film of aluminum over the circuit elements and etching it selectively to leave a two–dimensional network. The Fairchild technique produced what became known as planar integrated circuits.

Fairchild invented the planar process in 1960 for the production of transistors. When it was found that passive components could be incorporated as easily as active devices, the process spread quickly to ICs. Two years later, Fairchild reached another milestone by producing the metal–oxide–semiconductor (MOS) transistor.

The first monolithic integrated circuit was built in 1958 at Texas Instruments. J.S. Kilby constructed a phase–shift oscillator from a single silicon bar. The device required no interconnections between one component and another; the electrical path was through the silicon. TI was also the first company to announce a product line of ICs. Called Solid Circuit Series 51, the initial offering in 1960 consisted of simple logic circuits.

Telstar, the first active communications satellite, was also the first privately owned satellite. Its construction and launch in 1962 were financed by AT&T. The 170-lb. satellite marked the first international attempt to transmit TV pictures and sound by use of an active–repeater station in space.

July 20, 1969: Apollo 11 astronauts land on the moon. Neil Armstrong and Edwin Aldrin spent a day there, while Michael Collins circled the moon in a command module. The successful landing and return from earth's nearest neighbor in space culminated a decade of breakthroughs in electronics.

Meanwhile Bell Labs developed the epitaxial process. With this important tool, junctions could be formed economically in production by growing one crystal structure on another. The technique rapidly became a mainstay of transistor and IC fabrication.

The birth–control pill wasn't the only invention to draw widespread attention at the start of the decade. Another was spawned in July, 1960, when Theodore H. Maiman at Hughes Research Labs reached the end of his tenacious efforts. Following a path of research that differed markedly from that of his colleagues, Maiman at long last obtained emission from an "optical maser" — better known as a laser — made from a ruby crystal. The emission was obtained by pumping the ruby with a pulsed mercury arc.

The invention of this spectacular electro–optic device touched off a frenetic laser–development race. By March, 1961 — one month before the collapse of the invasion of Cuba by exiles at the Bay of Pigs — six different types of lasers were in use. Ruby lasers were operating at Hughes, Bell Labs, Raytheon and other plants; IBM had developed calcium–fluoride lasers; and Bell had also produced the first continuous–wave helium–neon gas–discharge laser, pumped by a 28–megahertz source.

Rapidly growing knowledge of the physics of semiconductor and other materials, triggered by the transistor, led to a solid–state laser, one that would work if current was simply passed through it. By the time of the Cuban missile crisis of October, 1962, such a solid–state laser had been developed independently at GE Research Labs and IBM Research Lab.

In the same year, the laser properties of gallium arsenide were verified by RCA and others. An important by–product of these studies would be the development of light–emitting diodes, or LEDs. Now available in yellow, green and red, LEDs were first introduced commercially, only in red, in 1968.

The government contributes heavily

A major factor in the rapid development of new devices was the heavy support and financial backing of space and military agencies. In the fifties the American military, seeing the value of Bell Labs' transistor, awarded contracts for its continued development. Significantly, the military did not classify the new device in order to let everyone explore its uses. Similar support in the years ahead would accelerate the development of integrated circuits.

A direct beneficiary of these efforts was NASA's space program. Giant strides were made with each successive manned spacecraft. Mercury, for example, flew in 1961 without an on-board computer. Gemini had one with about 4000 (4–k) words of memory. Apollo — whose eleventh mission was the 1969 moon landing — had a 32–k memory computer in both the command and lunar modules. The latter even had a backup computer with 4–k memory.

NASA ushered in a new era in communications with the first use of artificial satellites as relay stations. In August, 1960, the space agency orbited Echo 1, a 100–ft sphere of aluminized Mylar plastic. On the very day of the launch — August 12 — the first two–way–radio voice transmission was accomplished via the artificial satellite. Three days later the first transcontinental telephone call via satellite was made from New Jersey to California.

The first commercial minicomputer, a 12–bit machine with 1–k of memory, was introduced by Digital Equipment Corp. in 1963 as the PDP–5. It sold for $27,000.

A computer the size of a two–drawer file cabinet, the PDP–8 became popularly known as a minicomputer. When Digital Equipment Corp. offered it in 1965, the mini cost $9000 less than its less powerful and larger predecessor, the PDP–5.

The first active communications satellite, Telstar, was launched by a Thor–Delta rocket almost two years later, on July 10, 1962 — five months after John Glenn became the first American to orbit the earth. Telstar settled into a nearly perfect elliptical orbit. That night the first telephone call, television program and photo–facsimile transmission were relayed to and from the satellite. For the next few weeks technical firsts filled the air as Telstar relayed telephone conversations and color and black–and–white TV signals.

Telstar may well have become the best–known satellite of all. A song of the day, appropriately entitled "Telstar" and recorded by both the Tornadoes and the Ventures, became an international hit.

That a solid–state device could oscillate at microwave frequencies was discovered unexpectedly in 1963 by IBM's John Gunn, whose work led to the active diode that bears his name.

A series of communication satellites followed, beginning with Early Bird (Intelsat I) in 1965 and ending with Intelsat IV just five years ago. The result was a system of synchronous–orbit satellites and the realization of a dream: communications coverage between any two points on the globe.

Computers and instruments evolve rapidly

In the private sector, computers and instruments moved rapidly to embrace the fruits of the latest technologies. Computers moved up from vacuum tubes and transistors to so–called third–generation machines — which used microelectronic components — and gave birth to a new class — minicomputers. The touch of solid state in instruments led to drastic reductions in size and weight that allowed increased circuit density within a steadily shrinking package.

In April, 1964, IBM introduced its System 360 series. Intended as replacements for all existing IBM computer series, the 360 standardized such characteristics as instruction and character codes, units of information and modes of arithmetic. IBM developed a hybrid technology called Solid Logic Technology for the System 360. Many features of the series have since been accepted as the industry standard.

Shortly after the 360 was introduced, RCA announced a similar series, the Spectra 70, which used monolithic ICs rather than hybrids. RCA has since dropped out of the computer business, as has General Electric, another firm that tried unsuccessfully to tackle IBM.

One year before the arrival of third–generation machines, a small Maynard, MA, company that had started in the late fifties selling logic–circuit modules came out with a parallel–data processor called the PDP–5. Digital Equipment Corp.'s 1963 entry — the first "mini" — had a 12–bit word length and contained 1–k of memory. It sold for $27,000 — expensive by today's standards, but not compared with competitive machines then.

A 1964 microwave spectrum analyzer came with all basic functions fully calibrated. Hewlett–Packard developers Arthur Fong (left) and Harvey Halverson produced a unit that covered 10 MHz to 40 GHz with sweeps as wide as 2 GHz, and helped establish HP as an important supplier of spectrum analyzers.

Two years later the company introduced the PDP–8, which was more powerful than the earlier model and cost only $18,000. With a size approximating that of a two–drawer legal file cabinet, it was the first machine popularly called a minicomputer. The PDP–8 was widely imitated because it was nearly as powerful as much larger computers costing several times more, and within a few years had given rise to an entire industry.

New kinds of instruments appear

Different kinds of instruments also began to appear. A new signal source — the function generator — was unveiled by Hewlett–Packard in the late fifties. The vacuum–tube instrument was intended as the source for process–control systems and low–frequency mechanical vibrators and for testing servo–mechanisms. It covered a range from 1200 hertz down to several millihertz, but the 50–pound unit never caught on.

It was not until late 1961, when a new company called Wavetek introduced the solid–state Model 101, that function generators took hold. They have since evolved into general–purpose signal sources that can provide square waves, triangles, ramps and pulses, as well as sinusoids, over the entire range from a microhertz to 20 megahertz and higher.

The standard signal generator confronted another competitor in 1964 when HP unveiled its 5100A frequency synthesizer. The unit employed over 2000 discrete semiconductors to provide frequencies up to 50 MHz in 0.1–hertz increments. Four years later the signal generator struck back in the form of Logimetrics' 900 series, a generator with a built–in counter. Like the synthesizer, the counter made exact frequency settings possible. Similar signal generators were soon offered by Singer and then HP.

Designers of consumer products such as radios and televisions were equally quick to embrace the results of semiconductor technology. Today's digital watches, employing ICs and solid-state displays, trace their beginnings to 1960. In that year Bulova introduced its Accutron tuning–fork watch. The discrete–component watch established that electronic accuracies in timekeeping products were possible.

One of the first consumer products to successfully incorporate transistors and miniaturized components was the hearing aid. In August, 1958, Zenith produced the Solaris, a hearing aid powered by silicon solar cells mounted on the temple bar of eyeglasses, but space–saving ICs seemed destined for this application. In March, 1964, Zenith introduced the first IC–based hearing aid. The unit's integrated circuit contained six transistors and 16 resistors, and it was small enough that 10 of these circuits could be stacked inside the head of a match.

Logic families vie for dominance

Much of the early activity of semiconductor manufacturers centered on digital logic families. From the beginning, a host of companies was attempting to establish the dominance of one logic family over the others, or second–sourcing the strong suit of a competitor.

At first, resistor–transistor logic (RTL) seemed the way to go. Fairchild and Texas Instruments were strongly behind it. Then diode–transistor logic (DTL) came along in 1962 from recently formed Signetics, and that type of logic took off.

The enormous impact of DTL stemmed from the fact that designers were familiar with the logic form from their work with discrete–component (nonintegrated) circuits. Fairchild, noting the fast rise of DTL, was not long in following Signetics' lead. In 1964, Fairchild came out with its 930 DTL series. With better noise immunity and less sensitivity to clock waveforms than Signetics' version, the Fairchild family became the most successful DTL line.

Meanwhile, work on transistor–transistor logic (TTL) was proceeding at Fairchild, Pacific Semiconductors and Signetics, among others. At Sylvania, the effort was spearheaded by Thomas Longo, who had pushed it as early as 1961. The first TTL circuits had high speed but suffered from poor noise immunity, among other problems.

A new microwave source, the Impatt diode, was discovered in the mid–1960s by Bernard DeLoach (left) and Ralph Johnston (right), along with Barry Cohen. The three Bell Labs researchers made the diode emit microwaves by pulsing it until an avalanche of carriers had been produced internally.

Charge–coupled devices found use in an experimental TV camera demonstrated by Bell Labs' Willard Boyle (left) and George Smith — the inventors who received a patent for charge–coupled devices in 1974.

Longo developed improved versions that emerged from Sylvania in 1963 as Sylvania Universal High–Level Logic (SUHL). The first practical application of SUHL followed soon after, in the Phoenix missile being built by Hughes Aircraft.

1964: A year of sensational headlines

The year 1964 saw news headlines break from all parts of the globe. American planes bombed North Vietnam in retaliation for an attack against U.S. destroyers in the Gulf of Tonkin, and Mainland China conducted a successful test explosion of its first atomic bomb.

The Warren Commission released a report that claimed Lee Harvey Oswald was solely responsible for the killing of President Kennedy. Lyndon Johnson, Kennedy's successor, won a lopsided victory over Republican conservative Barry Goldwater. In the Soviet Union, a spacecraft launched with three men became the first space vehicle to carry more than a single man.

For the IC industry, the year marked the entrance of Texas Instruments' 5400 Series TTL family and the beginning of its surge to the front of the pack. TI's strategy in 1964 was a frontal attack on DTL, the most widely used logic line of the time. The Dallas–based manufacturer used DTL pin configurations and the same kind of packaging (first ceramic, and later plastic, in the 7400 Series).

Very early in the game, TI also offered medium–scale integration (MSI) parts. With these ICs, and others that followed, designers could replace several circuits with a single, more complex, MSI circuit.

The beginnings of emitter–coupled logic (ECL) actually go back to 1962. Motorola introduced MECL I in that year and has since upgraded it with faster versions. This evolutionary process was matched by TI's drive to develop faster versions of its 54/74 family.

Standard 54/74 offered a typical gate–propagation delay of 10 nanoseconds and a typical gate–power dissipation of 10 milliwatts. It was slower than MECL I (8–nanosecond delay), but it consumed much less than the 31 milliwatts that a MECL gate did.

Succeeding versions of both the ECL and TTL families cut gate delays, though with an increase in dissipation. The top speed was reached in the late sixties when Motorola introduced MECL III, which offered a 1–nanosecond gate delay at 60 milliwatts of gate dissipation. However, MECL III didn't catch on. For many applications the speed was too high to be useful without special and usually costly packaging techniques, and the power dissipation was simply too high.

The result was the 1971 introduction of MECL 10,000 (sometimes referred to as MECL II 1/2), which offered a 2–nanosecond delay and 25 milliwatts of dissipation. Currently MECL 10,000 competes with a TTL version that uses Schottky clamping to achieve the fastest speeds in TI's 54/74 line. Called 54S/74S, it boasts a 3–nanosecond delay and 20 milliwatts of dissipation.
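The delay and dissipation figures quoted for these families can be condensed into a single figure of merit, the speed–power product (gate delay times gate dissipation, i.e. the energy spent per switching event); lower is better. A small sketch, using only the typical values cited above:

```python
# Speed-power products for the logic families discussed, computed
# from the typical per-gate figures quoted in the text.
families = {
    "MECL I":      (8.0, 31.0),   # (gate delay in ns, dissipation in mW)
    "Std 54/74":   (10.0, 10.0),
    "MECL III":    (1.0, 60.0),
    "MECL 10,000": (2.0, 25.0),
    "54S/74S":     (3.0, 20.0),
}

def speed_power_pj(delay_ns, power_mw):
    """ns x mW works out to picojoules of energy per gate switch."""
    return delay_ns * power_mw

for name, (delay, power) in families.items():
    print(f"{name:12s} {speed_power_pj(delay, power):6.0f} pJ")
```

By this measure MECL 10,000 (50 pJ) and 54S/74S (60 pJ) are nearly even, while first-generation MECL I (248 pJ) trails both by a wide margin.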

Standard logic lines haven't been limited to bipolar families, however. In 1968, RCA introduced CD4000 COS / MOS, the company's name for its complementary MOS (CMOS) logic series. Since then CMOS has become a strong competitor for TTL — especially for low–power applications — and the 4000 series has drawn more alternate sources than has any other logic line.

Linear ICs, too, made great strides in the sixties. Beginning with operational amplifiers, linear monolithics grew steadily in complexity and functions–per–chip.

In the early 1960s monolithic op amps were sold by at least two manufacturers, Texas Instruments and Westinghouse. Then in 1964 Fairchild came out with the 702, the result of the first collaboration between the now–legendary team of Bob Widlar and Dave Talbert. The new op amp found only limited acceptance, but its development led to the 709, one of the most successful products of its day.

Op amp alters design rules

The 709 marked a turning point in the design of linear microcircuits. Instead of translating a discrete design into a monolithic (IC) form — the standard approach — Widlar followed a different set of rules: "Use transistors and diodes, even matched ones, with impunity. But use resistors and capacitors, particularly those with large values, only where necessary."

Even where use of a large resistor seemed inevitable, Widlar put a dc–biased transistor in its place. He exploited a monolith's natural ability to produce matched resistors and assumed only loose absolute values.

Improved op amps, like the 741, have since come along to replace the 709 in most new applications. Among the user benefits that have been added are internal compensation and short–circuit protection. But the op amp and variations of it — like comparators and voltage regulators — account for a large portion of all the linear microcircuits available.

For both linear and digital ICs, packaging problems had to be overcome. Transistor packages were found to lack sufficient heat–sinking capability and an adequate number of interconnections. One solution was the flatpack, created by Yung Tao while at Texas Instruments. The original flatpack had 10 leads and measured 1/4 x 1/8 inch. In 1964 Fairchild's Bryant ("Buck") Rogers fostered the invention of the dual–inline package. The original DIP had 14 leads, and looked just as it does today. The same year, Martin Lepselter of Bell Labs invented the beam lead as a mechanical and electrical interconnection between the IC and its case.

IC advances help discrete devices

An important by–product of the technological innovations of the day was the development of improved discrete devices. Semiconductor technology had come of age in the IC era, though its birth had been due to the transistor. Now other discrete devices as well as the transistor would reap the benefits of the new frontiers of knowledge.

For example, the Gunn diode — discovered at IBM in 1963 — was one of the first important applications for the semiconducting material gallium arsenide. Researchers at Siemens in Germany more than a decade earlier had uncovered the material during work on semiconductors made from elements in the third and fifth groups of the periodic table.

Other high–frequency diodes followed. The first microwave gallium–arsenide field–effect transistor was built at IBM, also in 1963. Bell Labs introduced the Impatt diode oscillator in 1965, and in 1966 presented the theory for the Trapatt oscillator, which RCA developed in 1970. In the following year the Baritt oscillator emerged from Bell.

A major trend in transistors was toward ever–higher powers at higher frequencies. By 1964 epitaxial processing had been applied to commercial interdigitated rf devices. Refinements in geometry and better mask–production and alignment techniques also helped boost power ratings. A typical interdigitated transistor of the day could output 5 watts at 100 megahertz and 0.5 watts at 400 megahertz.

Then RCA came out with the first commercial transistor to employ an overlay structure — the 2N3375. It produced 10 watts of output power at 100 megahertz and could generate 4 watts at 400 megahertz. The key feature of the overlay structure was that part of the emitter metal lay over the base instead of adjacent to it. Emitter current was carried in metal conductors, formed into fingers that crossed over the base. Base and emitter areas were insulated from one another by a layer of silicon dioxide.

MOS makes its move

One of the most important developments of the last 10 years has been the emergence of MOS (metal–oxide semiconductor). For digital ICs, MOS types usually implied higher density and lower manufacturing costs, while bipolar types implied higher speed. In the early 1960s the benefits of MOS were more promise than reality — then engineers found out how to overcome the tricky processing problems of MOS. Their success blazed the path to today's high–capacity memories and microprocessors, and led the way to the current proliferation of desk and pocket calculators — the largest commercial application of MOS circuits.

Much simpler than a bipolar device, a MOS device required fewer diffusion and masking steps, and its theory of operation had been known as far back as the 1930s. In fact, the research that led to the first transistor — a bipolar device — was actually intended to develop a MOS device.

The major obstacle to its development lay in the fact that a MOS device depended on the properties of the semiconductor surface. In contrast, a bipolar version primarily used the easier–to–control bulk properties of the semiconductor crystal. By the middle sixties, however, engineers had solved the stability problems associated with the oxide–silicon interface and the behavior of the oxide itself.

Not quite computers on chips, microprocessors do perform many of the functions of central processing units in conventional computers. The first emerged from Intel in 1971. It was a 4–bit unit called the 4004.

The 8008, the first 8–bit unit, followed shortly thereafter. Three years later Intel came out with its current high–flying 8080, an 8–bit processor. By that time microprocessors had formed an entire industry.

Those early years also saw efforts aimed at increasing the speed of MOS circuits. In 1960 Rockwell International reported the first successful growth of single–crystal silicon on an insulating sapphire substrate. Better known today as silicon–on–sapphire, or SOS, the technique has been employed in the seventies to build microprocessors and memories. Recently RCA combined SOS with CMOS to produce memories having speeds comparable to bipolar memories, but at a fraction of the latter's dissipation.

Memories also marked some of the major successes of conventional MOS in the late sixties, when MOS memories began to seriously challenge magnetic cores for computer applications. Early bipolar memories had led the way into computers by creating a new class of memory systems: the cache (a high–speed memory similar to a scratch–pad but with a larger capacity). It was the first large semiconductor memory system to be used in a computer, first reported by IBM in 1969. IBM designers had turned to costly bipolar ICs because no other memory component could provide the necessary high–speed performance.

But in the competition between cores and semiconductor memories, memories had to offer the right combination of speed, density and price. Something of a breakthrough came in 1970 when Intel developed the 1103, a 1024–bit dynamic MOS random–access memory (RAM). It had about the right specs and quickly caught on despite its initial price tag of $60.

The 1103 wasn't the final step. Power dissipation was on the high side, and external devices were needed to make it work. But the 1103 signaled that computer manufacturers would hereafter have to regard MOS dynamic RAMs as serious alternatives to cores.

By the end of the decade, a new term — large–scale integration, or LSI — had been coined to describe the level of chip complexity possible with ICs, especially MOS. The technology had advanced to the point that an entire four–function calculator could be built with just four to eight MOS integrated circuits. However, the accomplishment would soon be dwarfed as MOS/LSI advances accelerated during the next few years.

In 1970 — the year Thor Heyerdahl showed that ancient Egyptians could have crossed the Atlantic in a frail papyrus boat — Mostek, and then TI, showed that all the logic for a four–function calculator could be put on a single chip. The IC became the forerunner of chips used in today's low–cost pocket calculators.

Calculator ICs grow into microprocessors

But the next step, a multifunction calculator, proved too cumbersome for the usual logic techniques. Not only would the calculator have to handle the standard arithmetic functions, it would also have to accommodate exponential, logarithmic and trigonometric functions — an unwieldy assignment for a direct logic approach.

An IC array performs all computations in the first handheld electronic calculator. From Texas Instruments, the battery–operated instrument prints answers without impact on a narrow strip of heat–sensitive paper tape.

The problem was solved by the development of programmable calculators. In these, the necessary functions would be performed by algorithms stored in read–only memories, or ROMs. This concept was applied by Hewlett–Packard in its highly successful HP–35 "pocket slide rule."
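The kind of algorithm that fits in a small ROM is one built from adds, shifts and a short table of constants. The classic example for trigonometric functions is the CORDIC shift-and-add iteration, which HP's calculator designers used; a minimal floating-point sketch follows (the real machines worked in BCD fixed point, and the routine below is an illustration, not HP's code):

```python
import math

def cordic_sin_cos(theta, iterations=24):
    """Compute sin and cos by successive micro-rotations (CORDIC).

    Each step rotates the vector (x, y) by +/- atan(2**-i), which
    needs only a shift and an add in fixed-point hardware. Valid
    for |theta| up to about 1.74 radians.
    """
    # Small table of constants -- the part that would live in ROM.
    angles = [math.atan(2.0 ** -i) for i in range(iterations)]
    # Gain correction: each un-normalized rotation stretches the
    # vector by sqrt(1 + 2**-2i); K undoes the accumulated stretch.
    gain = 1.0
    for i in range(iterations):
        gain *= math.sqrt(1 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, theta
    for i, a in enumerate(angles):
        d = 1.0 if z >= 0 else -1.0          # rotate toward z = 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * a
    return y / gain, x / gain                # (sin, cos)

s, c = cordic_sin_cos(math.pi / 6)           # s -> 0.5, c -> 0.866...
```

With 24 iterations the result is accurate to roughly seven decimal digits, comparable to what a ten-digit calculator needed internally.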

Another company working on a programmable calculator was Busicom, a Japanese manufacturer that contracted Intel to produce the calculator's chips. Intel's Ted Hoff, a young Ph.D. from Stanford University who had worked on the 1103, condensed the Japanese design.

Hoff got the design, originally spread across 11 chips, down to three. One formed a central processing unit (CPU), or "brain." The other two were memory chips, one to move data in and out of the CPU and the other to provide the program to drive it. From this design emerged the first microprocessor, a 4–bit unit that Intel introduced in 1971 as the 4004.
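Hoff's partition — a CPU fetching instructions from a program memory while shuttling operands through a separate data memory — can be caricatured in a few lines. The instruction names and semantics here are invented for illustration and are not the 4004's actual instruction set:

```python
# Toy 4-bit accumulator machine illustrating the three-chip split:
# a CPU, a program memory, and a data memory. Hypothetical ISA.
program = [            # contents of the "program ROM" chip
    ("LDI", 3),        # load immediate 3 into the accumulator
    ("ADD", 0),        # add data-memory cell 0 to the accumulator
    ("STO", 1),        # store the accumulator in data-memory cell 1
    ("HLT", 0),
]
data = [5, 0]          # contents of the "data RAM" chip

def run(program, data):
    acc, pc = 0, 0     # the "CPU" chip: accumulator and program counter
    while True:
        op, arg = program[pc]
        pc += 1
        if op == "LDI":
            acc = arg & 0xF                 # 4-bit registers wrap at 16
        elif op == "ADD":
            acc = (acc + data[arg]) & 0xF
        elif op == "STO":
            data[arg] = acc
        elif op == "HLT":
            return data

run(program, data)     # data[1] ends up 8 (3 + 5)
```

The point of the exercise is that the CPU itself holds almost no state — just an accumulator and a program counter — which is what made it small enough to integrate on one chip.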

Shortly thereafter, Intel introduced an 8–bit microprocessor chip, the 8008. It had more computing power and flexibility than the 4004 and was better suited for data–handling and control applications. However, it also had serious limitations, due mainly to a package constraint of 18 pins. Nevertheless, the 8008 remained the sole 8–bit microprocessor for two years. Then Intel announced an upgraded version, the 8080, and National Semiconductor and Rockwell, among others, fielded their own entries.

By that time the microprocessor industry was off and running. Applications were sprouting up everywhere, from sales terminals and electronic games to instruments — possibly the most affected area. Virtually every type of instrument seemed to be touched by micros, speeding up design changes that had been in the wind for years. Space–saving, flexible microprocessors presented instrument designers and others with the means to build smaller, cheaper and more versatile equipment.

On the bipolar side of the seventies, advances in technology produced density–enhancing techniques — and a challenge to the lure of MOS. Fairchild's Isoplanar process, announced in 1971, achieved a substantial reduction in chip real estate by eliminating the empty spaces between bipolar devices. The manufacturer employed the Isoplanar process in its highly successful 1–k bipolar RAM.

More recently the spotlight has turned full force on integrated–injection logic, I2L, which emerged simultaneously from research laboratories at IBM in Germany and Philips in the Netherlands. More a circuit technique than a new process, I2L allows chip densities that are comparable to MOS yet offers higher speed and even lower dissipation. It combines readily with other bipolar structures — TTL, ECL and linear — on the same monolithic chip. As if that weren't enough, I2L needs as few as four to five masking steps. The first I2L products — a 4–bit microprocessor slice and a watch circuit — have been announced by Texas Instruments.

In addition to bipolar and MOS, two new memory technologies have recently started to catch on — charge–coupled–device (CCD) memories and magnetic–bubble memories. CCDs, cousins of MOS devices, have considerably higher density but lower speed. They were invented at Bell Labs in 1970 — the year Harris Semiconductor started to use the term PROM for its new user–programmable memories. Both Intel and Fairchild have announced CCD memories.

Magnetic–bubble memories — developed in the mid–60s — aren't semiconductor devices. They employ the material yttrium garnet and must have a magnetic drive field. But they can accommodate very high densities. Both CCDs and magnetic–bubble memories are serial–memory devices, though they can be organized into blocks to provide a pseudorandom–access memory. And both can be expected to play important roles in the years ahead.


Based on the bicentennial issue of

Electronic Design
for engineers and engineering managers

Vol 24, number 4   Feb. 16, 1976
© 1976   Hayden Publishing Company Inc.
50 Essex St.   Rochelle Park, NJ   07662

