History Podcasts

Computer business in 1969


Mad Men, season 7, episode 4, describes the computer business in 1969 as follows: IBM would lease computers to businesses, but only on short leases, and would replace the machines with newer models at renewal. In response, a number of competing businesses sprang up that bought IBM computers and leased them out for longer terms at lower prices.

Of course, Mad Men is fiction, but I wonder: (to what extent) is this true?


IBM started offering the "lease" option in the late 1960s, when leasing became popular in other industries. It seemed like a way to segment the market, but in doing so, IBM opened a window into its business that weakened what had until then been a quasi-monopoly.

There are quite a few examples of firms that bought IBM computers for re-leasing to others, thereby competing with IBM's leasing business. One of them was a firm called Comdisco (Computer Discount Corp.), started in 1969 by Ken Pontikes, a former IBM salesman. Comdisco competed successfully with IBM by offering "better" prices and terms, because it was better at guessing the "residual" values of the equipment.

I am writing as a former stock analyst who covered Comdisco stock for Value Line.


Fascinating history!

There was a first generation of computer lessors (Saul Steinberg's Leasco, Itel, OPM), most of which underpriced IBM by far too much and paid the price.

Then a new generation, led by Ken Pontikes of Comdisco, followed a more disciplined model and prospered. Sadly, after Ken's untimely death at 52, his son ran the company into the ground.

(I was an investor in Comdisco starting in the late 1970s.)


ARPAnet: The World's First Internet

On a Cold War kind of day in 1969, work began on ARPAnet, the grandfather of the Internet. Designed as a computer version of the nuclear bomb shelter, ARPAnet protected the flow of information between military installations by creating a network of geographically separated computers that could exchange information via a newly developed technology called NCP, or Network Control Protocol.

ARPA stands for the Advanced Research Projects Agency, a branch of the military that developed top secret systems and weapons during the Cold War. But Charles M. Herzfeld, the former director of ARPA, stated that ARPAnet was not created due to military needs and that it “came out of our frustration that there were only a limited number of large, powerful research computers in the country and that many research investigators who should have access were geographically separated from them."

Originally, there were only four computers connected when ARPAnet was created. They were located in the respective computer research labs of UCLA (SDS Sigma 7 computer), the Stanford Research Institute (SDS 940 computer), the University of California, Santa Barbara (IBM 360/75) and the University of Utah (DEC PDP-10). The first data exchange over this new network occurred between computers at UCLA and the Stanford Research Institute. On the first attempt to log into Stanford's computer by typing "login," the system crashed as the UCLA researchers typed the letter 'g,' so only the letters "lo" got through.

As the network expanded, different models of computers were connected, which created compatibility problems. The solution was a better set of protocols, TCP/IP (Transmission Control Protocol/Internet Protocol), designed in the 1970s and adopted across ARPAnet on January 1, 1983. The protocols work by breaking data into IP (Internet Protocol) packets, like individually addressed digital envelopes. TCP (Transmission Control Protocol) then makes sure the packets are delivered from sender to receiver and reassembled in the right order.
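
The packetize-and-reassemble idea can be sketched in a few lines. This is a toy illustration only, not real TCP/IP; the function names and the five-byte chunk size are invented for the example.

```python
# Toy illustration: break a message into numbered "packets",
# deliver them out of order, and reassemble by sequence number.

def packetize(data: bytes, size: int):
    """Split data into (sequence_number, chunk) pairs, like IP packets."""
    return [(i, data[i * size:(i + 1) * size])
            for i in range((len(data) + size - 1) // size)]

def reassemble(packets):
    """Sort by sequence number and rejoin, as TCP does at the receiver."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"ARPANET became the Internet"
packets = packetize(message, 5)
packets.reverse()  # simulate packets arriving out of order
assert reassemble(packets) == message
```

Real TCP also handles lost and duplicated packets with acknowledgements and retransmission, which this sketch omits.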

Under ARPAnet, several major innovations occurred. Some examples are email (or electronic mail), a system that allows for simple messages to be sent to another person across the network (1971), telnet, a remote connection service for controlling a computer (1972) and file transfer protocol (FTP), which allows information to be sent from one computer to another in bulk (1973). And as non-military uses for the network increased, more and more people had access and it was no longer safe for military purposes. As a result, MILnet, a military only network, was started in 1983.

Internet Protocol software was soon being placed on every type of computer. Universities and research groups also began using in-house networks known as Local Area Networks or LANs. These in-house networks then started using Internet Protocol software so one LAN could connect with other LANs.

In 1986, one LAN branched out to form a new competing network called NSFnet (National Science Foundation Network). NSFnet first linked together the five national supercomputer centers, then every major university. Over time, it started to replace the slower ARPAnet, which was finally shut down in 1990. NSFnet formed the backbone of what we call the Internet today.

Here's a quote from the U.S. Department of Commerce report The Emerging Digital Economy:

"The Internet's pace of adoption eclipses all other technologies that preceded it. Radio was in existence 38 years before 50 million people tuned in; TV took 13 years to reach that benchmark. Sixteen years after the first PC kit came out, 50 million people were using one. Once it was opened to the general public, the Internet crossed that line in four years."


UNIVAC Computer

The research for the project proceeded badly, and it was not until 1948 that the actual design and contract were finalized. The Census Bureau's ceiling for the project was $400,000. J. Presper Eckert and John Mauchly were prepared to absorb any cost overrun in hopes of recouping it through future service contracts, but the economics of the situation brought the inventors to the edge of bankruptcy.

In 1950, Eckert and Mauchly were bailed out of financial trouble by Remington Rand Inc. (manufacturers of electric razors), and the "Eckert-Mauchly Computer Corporation" became the "Univac Division of Remington Rand." Remington Rand's lawyers unsuccessfully tried to re-negotiate the government contract for additional money. Under threat of legal action, however, Remington Rand had no choice but to complete the UNIVAC at the original price.

On March 31, 1951, the Census Bureau accepted delivery of the first UNIVAC computer. The final cost of constructing the first UNIVAC was close to $1 million. Forty-six UNIVAC computers were built for both government and business uses. Remington Rand became the first American manufacturer of a commercial computer system. Its first non-government contract was with General Electric's Appliance Park facility in Louisville, Kentucky, which used the UNIVAC computer for a payroll application.


Chronological History of IBM

The character of a company -- the stamp it puts on its products, services and the marketplace -- is shaped and defined over time. It evolves. It deepens. It is expressed in an ever-changing corporate culture, in transformational strategies, and in new and compelling offerings for customers. IBM's character has been formed over nearly 100 years of doing business in the field of information-handling. Nearly all of the company's products were designed and developed to record, process, communicate, store and retrieve information -- from its first scales, tabulators and clocks to today's powerful computers and vast global networks.

IBM helped pioneer information technology over the years, and it stands today at the forefront of a worldwide industry that is revolutionizing the way in which enterprises, organizations and people operate and thrive.

The pace of change in that industry, of course, is accelerating, and its scope and impact are widening. In these pages, you can trace that change from the earliest antecedents of IBM, to the most recent developments. You can scan the entire IBM continuum from the 19th century to the 21st or pinpoint -- year-by year or decade-by-decade -- the key events that have led to the IBM of today. We hope that you enjoy this unique look back at the highly textured history of the International Business Machines Corporation.


History and Timeline

Since it began to escape from AT&T's Bell Laboratories in the early 1970s, the success of the UNIX operating system has led to many different versions: recipients of the (at that time free) UNIX system code all began developing their own versions in their own, different ways for use and sale. Universities, research institutes, government bodies and computer companies all began using the powerful UNIX system to develop many of the technologies which today are part of a UNIX system.

Computer aided design, manufacturing control systems, laboratory simulations, even the Internet itself, all began life with and because of UNIX systems. Today, without UNIX systems, the Internet would come to a screeching halt. Most telephone calls could not be made, electronic commerce would grind to a halt and there would have never been "Jurassic Park"!

By the late 1970s, a ripple effect had come into play. By now the undergraduate and postgraduate students whose lab work had pioneered these new applications of technology were attaining management and decision-making positions inside the computer system suppliers and among their customers. And they wanted to continue using UNIX systems.

Soon all the large vendors, and many smaller ones, were marketing their own, diverging versions of the UNIX system, optimized for their own computer architectures and boasting many different strengths and features. Customers found that, although UNIX systems were available everywhere, they seldom were able to interwork or co-exist without significant investment of time and effort. The trademark UNIX was ubiquitous, but it was applied to a multitude of different, incompatible products.

In the early 1980's, the market for UNIX systems had grown enough to be noticed by industry analysts and researchers. Now the question was no longer "What is a UNIX system?" but "Is a UNIX system suitable for business and commerce?"

Throughout the early and mid-1980's, the debate about the strengths and weaknesses of UNIX systems raged, often fuelled by the utterances of the vendors themselves who sought to protect their profitable proprietary system sales by talking UNIX systems down. And, in an effort to further differentiate their competing UNIX system products, they kept developing and adding features of their own.

In 1984, another factor brought added attention to UNIX systems. A group of vendors, concerned about the continuing encroachment into their markets and control of system interfaces by the larger companies, developed the concept of "open systems."

Open systems were those that would meet agreed specifications or standards. This resulted in the formation of X/Open Company Ltd, whose remit was, and today in the guise of The Open Group remains, to define a comprehensive open systems environment. Open systems, they declared, would save on costs, attract a wider portfolio of applications and enable competition on equal terms. X/Open chose the UNIX system as the platform for the basis of open systems.

Although UNIX was still owned by AT&T, the company did little commercially with it until the mid-1980's. Then the spotlight of X/Open showed clearly that a single, standard version of the UNIX system would be in the wider interests of the industry and its customers. The question now was, "which version?".

In 1987, in a move intended to unify the market, AT&T announced a pact with Sun Microsystems, the leading proponent of the Berkeley-derived strain of UNIX. However, the rest of the industry viewed the development with considerable concern. Believing that their own markets were under threat, they clubbed together to develop their own "new" open systems operating system. Their new organization was called the Open Software Foundation (OSF). In response, the AT&T/Sun faction formed UNIX International.

The ensuing "UNIX wars" divided the system vendors between these two camps clustered around the two dominant UNIX system technologies: AT&T's System V and the OSF system called OSF/1. In the meantime, X/Open Company held the center ground. It continued the process of standardizing the APIs necessary for an open operating system specification.

In addition, it looked at areas of the system beyond the operating system level where a standard approach would add value for supplier and customer alike, developing or adopting specifications for languages, database connectivity, networking and mainframe interworking. The results of this work were published in successive X/Open Portability Guides.

XPG4 was released in October 1992. During this time, X/Open had put in place a brand program based on vendor guarantees and supported by testing. Since the publication of XPG4, X/Open has continued to broaden the scope of open systems specifications in line with market requirements. As the benefits of the X/Open brand became known and understood, many large organizations began using X/Open as the basis for system design and procurement. By 1993, over $7 billion had been spent on X/Open branded systems. By the start of 1997 that figure had risen to over $23 billion. To date, procurements referencing the Single UNIX Specification amount to over $5.2 billion.

In early 1993, AT&T sold its UNIX System Laboratories to Novell, which was looking for a heavyweight operating system to link to its NetWare product range. At the same time, the company recognized that vesting control of the definition (specification) and trademark with a vendor-neutral organization would further facilitate the value of UNIX as a foundation of open systems. So the constituent parts of the UNIX system (source code/technology and specification/trademark), previously owned by a single entity, are now quite separate.

In 1995 X/Open introduced the UNIX 95 brand for computer systems guaranteed to meet the Single UNIX Specification. The Single UNIX Specification brand program has now achieved critical mass: vendors whose products have met the demanding criteria now account for the majority of UNIX systems by value.

For over twenty years, since the inception of X/Open, UNIX has been closely linked with open systems. X/Open, now The Open Group, continues to develop and evolve the Single UNIX Specification and associated brand program on behalf of the IT community. The freeing of the specification of the interfaces from the technology is allowing many systems to support the UNIX philosophy of small, often simple tools that can be combined in many ways to perform often complex tasks. The stability of the core interfaces preserves existing investment and is allowing development of a rich set of software tools. The Open Source movement is building on this stable foundation and is creating a resurgence of enthusiasm for the UNIX philosophy. In many ways Open Source can be seen as the true delivery of open systems, which will ensure it continues to go from strength to strength.

1969 The Beginning The history of UNIX starts back in 1969, when Ken Thompson, Dennis Ritchie and others started working on the "little-used PDP-7 in a corner" at Bell Labs on what was to become UNIX.
1971 First Edition It had an assembler for a PDP-11/20, a file system, fork(), roff and ed. It was used for text processing of patent documents.
1973 Fourth Edition It was rewritten in C. This made it portable and changed the history of OS's.
1975 Sixth Edition UNIX leaves home. Also widely known as Version 6, this is the first to be widely available outside of Bell Labs. The first BSD version (1.x) was derived from V6.
1979 Seventh Edition It was "an improvement over all preceding and following Unices" [Bourne]. It had C, UUCP and the Bourne shell. It was ported to the VAX and the kernel was more than 40 kilobytes (K).
1980 Xenix Microsoft introduces Xenix. 32V and 4BSD introduced.
1982 System III AT&T's UNIX System Group (USG) releases System III, the first public release outside Bell Laboratories. SunOS 1.0 ships. HP-UX introduced. Ultrix-11 introduced.
1983 System V Computer Research Group (CRG), UNIX System Group (USG) and a third group merge to become UNIX System Development Lab. AT&T announces UNIX System V, the first supported release. Installed base 45,000.
1984 4.2BSD University of California at Berkeley releases 4.2BSD, includes TCP/IP, new signals and much more. X/Open formed.
1984 SVR2 System V Release 2 introduced. At this time there are 100,000 UNIX installations around the world.
1986 4.3BSD 4.3BSD released, including internet name server. SVID introduced. NFS shipped. AIX announced. Installed base 250,000.
1987 SVR3 System V Release 3 including STREAMS, TLI, RFS. At this time there are 750,000 UNIX installations around the world. IRIX introduced.
1988 POSIX.1 published. Open Software Foundation (OSF) and UNIX International (UI) formed. Ultrix 4.2 ships.
1989 AT&T UNIX Software Operation formed in preparation for spinoff of USL. Motif 1.0 ships.
1989 SVR4 UNIX System V Release 4 ships, unifying System V, BSD and Xenix. Installed base 1.2 million.
1990 XPG3 X/Open launches XPG3 Brand. OSF/1 debuts. Plan 9 from Bell Labs ships.
1991 UNIX System Laboratories (USL) becomes a company - majority-owned by AT&T. Linus Torvalds commences Linux development. Solaris 1.0 debuts.
1992 SVR4.2 USL releases UNIX System V Release 4.2 (Destiny). October - XPG4 Brand launched by X/Open. December 22nd Novell announces intent to acquire USL. Solaris 2.0 ships.
1993 4.4BSD 4.4BSD the final release from Berkeley. June 16 Novell acquires USL
Late 1993 SVR4.2MP Novell transfers rights to the "UNIX" trademark and the Single UNIX Specification to X/Open. COSE initiative delivers "Spec 1170" to X/Open for fasttrack. In December Novell ships SVR4.2MP, the final USL OEM release of System V.
1994 Single UNIX Specification BSD 4.4-Lite eliminated all code claimed to infringe on USL/Novell. As the new owner of the UNIX trademark, X/Open introduces the Single UNIX Specification (formerly Spec 1170), separating the UNIX trademark from any actual code stream.
1995 UNIX 95 X/Open introduces the UNIX 95 branding programme for implementations of the Single UNIX Specification. Novell sells UnixWare business line to SCO. Digital UNIX introduced. UnixWare 2.0 ships. OpenServer 5.0 debuts.
1996 The Open Group forms as a merger of OSF and X/Open.
1997 Single UNIX Specification, Version 2 The Open Group introduces Version 2 of the Single UNIX Specification, including support for realtime, threads and 64-bit and larger processors. The specification is made freely available on the web. IRIX 6.4, AIX 4.3 and HP-UX 11 ship.
1998 UNIX 98 The Open Group introduces the UNIX 98 family of brands, including Base, Workstation and Server. First UNIX 98 registered products shipped by Sun, IBM and NCR. The Open Source movement starts to take off with announcements from Netscape and IBM. UnixWare 7 and IRIX 6.5 ship.
1999 UNIX at 30 The UNIX system reaches its 30th anniversary. Linux 2.2 kernel released. The Open Group and the IEEE commence joint development of a revision to POSIX and the Single UNIX Specification. First LinuxWorld conferences. Dot com fever on the stock markets. Tru64 UNIX ships.
2001 Single UNIX Specification, Version 3 Version 3 of the Single UNIX Specification unites IEEE POSIX, The Open Group and the industry efforts. Linux 2.4 kernel released. IT stocks face a hard time at the markets. The value of procurements for the UNIX brand exceeds $25 billion. AIX 5L ships.
2003 ISO/IEC 9945:2003 The core volumes of Version 3 of the Single UNIX Specification are approved as an international standard. The "Westwood" test suite ship for the UNIX 03 brand. Solaris 9.0 E ships. Linux 2.6 kernel released.
2007 Apple Mac OS X certified to UNIX 03.
2008 ISO/IEC 9945:2008 Latest revision of the UNIX API set formally standardized at ISO/IEC, IEEE and The Open Group. Adds further APIs.
2009 UNIX at 40 IDC on UNIX market -- says UNIX $69 billion in 2008, predicts UNIX $74 billion in 2013
2010 UNIX on the Desktop Apple reports 50 million desktops and growing -- these are Certified UNIX systems.



Why Does My Gadget Say It's December 31, 1969?

If you’ve ever had the date on a cell phone or computer mysteriously switch to December 31, 1969, you may have thought it was simply random. But the reason behind this odd glitch is a nice little tidbit of computer trivia.

Unix is a computer operating system that, in one form or another, is used on most servers, workstations, and mobile devices. It was launched in November 1971 and, after some teething problems, the “epoch date” was set to the beginning of the decade, January 1, 1970. What this means is that time began for Unix at midnight on January 1, 1970 GMT. Time measurement units are counted from the epoch so that the date and time of events can be specified without question. If a time stamp is somehow reset to 0, the clock will display January 1, 1970.

So where does December 31 fit in? It's because you live in a time zone behind Greenwich Mean Time. When it's midnight in Greenwich, England, it's still December 31 in the Americas, where users will see December 31, 1969, the day before Unix's epoch.
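
The arithmetic is easy to check with a few lines of Python; the UTC-5 offset below stands in for U.S. Eastern Standard Time, chosen as an example of a zone behind GMT.

```python
from datetime import datetime, timezone, timedelta

# Unix time counts seconds from the epoch: 1970-01-01 00:00:00 GMT (UTC).
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch.isoformat())  # 1970-01-01T00:00:00+00:00

# The same instant, viewed from a zone five hours behind GMT,
# falls on the previous day: December 31, 1969.
eastern = timezone(timedelta(hours=-5))
print(epoch.astimezone(eastern).isoformat())  # 1969-12-31T19:00:00-05:00
```

A device whose clock has been reset to timestamp 0 is therefore showing the epoch itself, shifted into your local time zone.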

So how do you fix it? Simple. Just correct the date to the present time.



1970s

HP's first PC

HP introduces its first personal computer, the HP-85. The unit had input/output modules that allowed it to control instruments, add on more powerful peripherals and even to talk to other computers.

David and Lucile Packard with Chinese delegation at Big Sur in 1980.

HP moves into China

HP's products become available in China, with the opening of the China Hewlett-Packard Representative office in Beijing.

HP's calculator standard

HP introduces the HP-12C business calculator. It will go on to become the world's standard financial calculator and is still being sold by HP today.

HP wins Deming prize

Yokogawa-Hewlett-Packard wins the prestigious Deming Prize for quality.

1st handheld computer debuts

The HP-75C debuts as HP's first handheld computer. Able to connect to peripherals such as a digital cassette drive and printer, it's an early tool for mobile computing.

1st desktop mainframe

HP introduces the HP 9000 technical computer. The first "desktop mainframe," it's as powerful as the room-size computers of the 1960s.

HP introduces Touchscreen PC

HP introduces the HP-150 Touchscreen PC, allowing users to activate features simply by touching the screen.

Bill wins science medal

Bill Hewlett is awarded the National Medal of Science, the nation's highest scientific honor.

First laptop

HP's first laptop, the HP-110.

HP invents ThinkJet printing

HP introduces thermal inkjet printing with the debut of the HP ThinkJet. It marks the success of HP Labs in miniaturizing inkjet technology to deliver superior quality, quieter operation and lower-power consumption over dot-matrix printers.

HP LaserJet takes off

HP introduces the HP LaserJet, which quickly becomes the world's most popular personal desktop laser printer.


China goes high-tech

China Hewlett-Packard (CHP), the first high-tech joint venture in China, is established.

A single VLSI chip containing the entire CPU of a RISC-based computer.

HP creates RISC architecture

HP becomes the first major computer company to introduce a precision architecture based on reduced instruction set computing (RISC), making computers faster and less expensive. RISC executes instructions faster and does more work than previous generations of chips.

3D graphics

3D graphics come of age with the HP SRX, the first generation of graphics workstations. The device helps HP become a leading graphics workstation vendor.

Bill named HP's director emeritus

Bill Hewlett retires as vice chairman of the HP board and is named director emeritus.

Hardware recycling begins

HP begins its hardware recycling program.

HP Deskjet launched

The HP DeskJet debuts as the company's first mass-market inkjet printer.

Bill and Dave at the dedication of the HP Garage, California Historical Landmark 976, Birthplace of Silicon Valley, 1989

Garage named landmark

The birthplace of the company—Bill and Dave's rented garage—is dedicated as a California Historical Landmark to celebrate the 50th anniversary of HP.

World's first x86 server

Customers can achieve PC-like economics and flexibility for their server environments with the first x86 server built on industry standards. The Compaq SystemPro ushers in a new era of computing in enterprise server reliability, capacity and performance.


Computer business in 1969 - History

A database model is the structure or format of a database.

There are three types of database model that are widely used:

- Hierarchical database model
- Network database model
- Relational database model

Within a database, there are three types of relationships records can have among them:

- One-to-one
- One-to-many
- Many-to-many

A network database model is a database model that allows multiple records to be linked to the same owner file. The model can be seen as an upside-down tree where the branches are the member information linked to the owner, which forms the base of the tree. The multiple linkages this structure allows make the network database model very flexible. In addition, the relationships in the network database model are many-to-many: one owner file can be linked to many member files and vice versa.
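
The many-to-many idea can be sketched with plain sets. This is a toy model, not a real network database; the department and person names are invented for illustration.

```python
# Toy sketch of a network-model database: each "owner" record can link to
# many "member" records, and each member can belong to many owners.

owners = {"Sales Dept": set(), "Support Dept": set()}   # owner files
members = {"Alice": set(), "Bob": set()}                # member files

def link(owner: str, member: str) -> None:
    """Record a two-way owner<->member linkage, as in the network model."""
    owners[owner].add(member)
    members[member].add(owner)

link("Sales Dept", "Alice")
link("Sales Dept", "Bob")
link("Support Dept", "Alice")   # Alice now has two owners: many-to-many

assert owners["Sales Dept"] == {"Alice", "Bob"}
assert members["Alice"] == {"Sales Dept", "Support Dept"}
```

In a strict hierarchical model, by contrast, each member could carry only a single owner, so the third `link` call would be disallowed.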

The network database model was invented by Charles Bachman in 1969 as an enhancement of an already existing database model, the hierarchical database model. Because the hierarchical database model was highly flawed, Bachman decided to create a database model that was similar but offered more flexibility and fewer restrictions. The original hierarchical database has each member file linked strictly to a single owner file, creating a ladder effect that restricted the database from finding relationships outside its own branch.

Network Database Model vs. Hierarchical Database Model

- Network: easily accessed because of the linkage between the information; great flexibility among the information files because of the multiple relationships among the files
- Hierarchical: difficult to navigate because of its strict owner-to-member connection; less flexibility with the collection of information because of the hierarchical position of the files

Network Database Model vs. Relational Database Model

- Network: the files are directly related to one another
- Relational: information is stored in separate tables tied together with other groups of information

Advantages of a Network Database Model

- Because it has many-to-many relationships, any table record in the database can easily be accessed

- For more complex data, it is easier to use because of the multiple relationships among its data

- Easier to navigate and search for information because of its flexibility

Disadvantages of a Network Database Model

- Difficult for first-time users

- Alterations to the database are difficult, because newly entered information can alter the entire database


1969: ARPANET is born

A map of the four connected computers when the first ARPANET message was sent. Image Source: VOX

Four university computers, at UCLA's Network Measurement Center, the Stanford Research Institute, the University of California, Santa Barbara, and the University of Utah, are connected via nodes that allow electronic communication. UCLA sends off the first message, "lo," to Stanford on October 29.


Computer business in 1969 - History

In 1969 Intel was commissioned by a Japanese calculator company to produce an integrated circuit, a computer chip, for its line of calculators. Ted Hoff, who was given the assignment, was troubled by the fact that if he used standard methods of design, the Japanese calculators would be just about as expensive as one of the new minicomputers then being marketed, and would not do nearly as much. Hoff decided he would have to take a new approach to the calculator chip. Instead of "hardwiring" the logic of the calculator into the chip, he created what is now called a microprocessor: a chip that can be programmed to perform the operations of a calculator, i.e., a computer on a slice of silicon. It was called the 4004 because that was the number of transistors it would replace. The contract gave the Japanese calculator company exclusive rights to the 4004. Hoff realized that the 4004 was a significant technical breakthrough and was concerned that Intel should not give it away to the Japanese calculator company as part of a relatively small contract. Fortunately for Intel, the Japanese company did not realize the significance of what it had obtained and traded away its exclusive rights to the 4004 for a price reduction and some modifications to the calculator specifications.

Intel later developed another microprocessor for the Computer Terminal Corporation (CTC). This one was called the 8008. In this case CTC could purchase the product from Intel, but Intel retained the right to market the 8008 to other customers. Intel began to create support for this programmable chip. An employee of Intel, Adam Osborne, was given the assignment of writing manuals for the programming language for the 8008. Osborne later became important in the development of the personal computer by bringing about the creation of the first portable computer; there is more about this below.

Gary Kildall, a professor at the Naval Postgraduate School in Monterey, worked at Intel to develop a language and programs for its microprocessors. Kildall also played another important role in the development of the personal computer: he wrote the first operating system for a microprocessor, called CP/M. Without an operating system, a personal computer is a very awkward device to use.

By the early 1970s there was a vast number of people who had had some experience with mainframe computers and would have loved to have a computer of their own. In Albuquerque, New Mexico there was a man named Ed Roberts who ran a business selling kits for assembling electronic devices. The company's name was MITS, for Micro Instrumentation Telemetry Systems. The company was not doing too well and Ed Roberts was looking for some new products to increase sales. The calculator business was becoming saturated, especially when chip manufacturers such as Texas Instruments began to market calculators themselves. After a disastrous attempt to sell kits for programmable calculators, Ed Roberts was desperate for a new product. He decided to try what no one else had attempted: to create a kit for assembling a home computer. He decided to base it upon a new chip Intel had developed, the 8080. Roberts negotiated a contract with Intel that gave him a low price on the 8080 chips if he could buy in large volume. About that time a magazine, Popular Electronics, edited by Les Solomon, was looking for workable designs for desktop computers. Roberts promised Solomon a working model if Solomon would promote it through Popular Electronics. Ed Roberts decided to call his computer the Altair, after the name of a planet in a Star Trek episode Les Solomon's daughter was watching. Roberts and the MITS people worked feverishly on building a prototype of the Altair to send to Popular Electronics, but when the deadline for publication arrived the model was not quite ready. Nevertheless, Popular Electronics published a picture of the empty case of the Altair on its front cover. The computer case, with its lights and switches, did look impressive. An article in the magazine revealed that kits for the Altair were available for $397 from MITS in Albuquerque, New Mexico.

To everyone's surprise, computer buffs from all over the country sent in their $397 to buy an Altair kit. In fact, MITS was flooded with money. It went from a state of near bankruptcy, owing $365,000, to having hundreds of thousands of dollars in the bank. MITS's bank was a bit concerned that MITS had started engaging in something lucrative but illegal.

The Altair had very limited capabilities. It had no keyboard, no video display, and only 256 bytes of memory. Data input had to be done by flipping toggle switches, and the only output was the flashing lights on the front panel. Nevertheless there was great enthusiasm for the Altair.

Two programmers in the Boston area (students at Harvard, actually) decided to develop software for the Altair. Their names were Bill Gates and Paul Allen. They called Ed Roberts and told him they had a program to run the programming language BASIC on the Altair. Roberts said he would buy it if he could see it running on the Altair. Gates and Allen had not actually written the program, but they immediately set out to do so. It took about six weeks. They developed the program by programming a Harvard computer to emulate the limited capabilities of the Altair, and when they were done Paul Allen flew to Albuquerque to demonstrate the result. Given the multitude of things that could have gone wrong, it was an amazing accomplishment that the program worked. It worked, however, on a more sophisticated lab version of the Altair at MITS rather than on the version sold to the general public. Gates and Allen's company, Microsoft, was founded in Albuquerque and only later moved to the Seattle area.

The members of the general public who sent in their $397 faced a long, long wait before they received their Altair kits. MITS was simply not prepared to handle the volume of business that came in. But MITS had shown the demand was there, and the market started to work.

Gary Kildall joined forces with a professor from U.C. Berkeley, John Torode, to produce a small computer, also based upon the 8080 chip. Torode built the computers under the name Digital Systems, and Kildall wrote the software under the name Intergalactic Digital Research.

The Altair's most effective early competitor was created by IMSAI Manufacturing of San Leandro, California. IMSAI was established by Bill Millard, who had no particular interest in computers but knew a hot marketing opportunity when he saw one.

About this time Lee Felsenstein entered the picture. Felsenstein was an interesting individual who played a number of important roles in the development of the personal computer. He grew up in Philadelphia and became an engineering student. One summer he got a job in the Los Angeles area working as an engineer for an operation that required a security clearance. He loved being an engineer and had no plans to do anything else. Then one day the security officer where he worked called him into his office to inform him that he would not be given the necessary clearance. When Lee had filled out the application forms for the job, he had stated that he did not know any members of the Communist Party, which he reaffirmed under questioning by the security officer. The security officer then informed Lee that his parents were members of the Communist Party. Steven Levy recounts the episode, based upon an interview with Lee, in his book Hackers.

The security officer told him that he could not be given a security clearance at that time, but that if he kept out of political involvements he could reapply in a year or so and would probably get one then. Lee left the organization and after a while, in 1963, moved to Berkeley, where the countercultural revolution was just beginning.

Lee went to work on a weekly newspaper called the Berkeley Barb as a technician and journalist. The Barb was a radical newspaper run by Max Scheer. It did not make much money, and the staff received no pay other than when Max took them home for his wife Jane to feed them. Later Max started selling advertising space in the Barb to massage parlors and began making a lot of money. But he still did not pay the staff any salary. This upset many on the staff in two ways: they were perplexed that their newspaper called for social revolution while selling ads to massage parlors, and they were not getting any of that money. A group of the Barb staff, including Lee Felsenstein, left and started another newspaper, the Berkeley Tribe. The Tribe was committed to ideological anarchism. Lee managed the Tribe for a while and then entered UC Berkeley and finished his engineering degree. After graduation he joined a communal organization called Resource One and later an offshoot, Community Memory, which sought to bring computers to the people by installing remote terminals in places of business.

About the time the Altair was announced, a group of San Francisco Bay Area computer buffs organized the Homebrew Computer Club. After the club had been operating for some time, Lee Felsenstein became its facilitator, an informal master of ceremonies who directed the meetings and discussions. As many as 750 people attended the meetings, which became a major locus of information exchange on computers in the Bay Area. Steve Jobs and Stephen Wozniak attended these meetings. Adam Osborne sold his book An Introduction to Microcomputers at them.

Lee Felsenstein did occasional engineering design work, including a computer named the Sol after Les Solomon of Popular Electronics. The Sol sold for about $1000 but included far more capability than the Altair. Felsenstein and others were also creating enhancements for the Altair, such as memory boards. Felsenstein also designed the Osborne computer, the first commercially successful portable computer. It was not portable in the sense of a laptop that can be used while traveling; it was portable in the sense that it could be conveniently carried from one place to another and there plugged in and used. Its size was limited to the dimensions that could fit under a jetliner seat.



The IBM Pavilion at the New York World's Fair closes, having hosted more than 10 million visitors during its two-year existence.

A 59-pound onboard IBM guidance computer is used on all Gemini flights, including the first spaceship rendezvous. The IBM 2361, the largest computer memory built to that time, is shipped to the NASA Space Center in Houston. IBM scientists complete the most precise computation of the Moon's orbit and develop a fabrication technique to connect hundreds of circuits on a tiny silicon wafer.

IBM product launches include the IBM 1130, a low-cost, desk-size computer; the 2740 and 2741 typewriter communications terminals; and the 2321 data cell drive.

The first IBM-sponsored computer centers in European universities open in London, Copenhagen, and Pisa, Italy.