The Increased Use of Software Applications: Insights from
the Case of the American Petroleum Industry, 1950-2000
James W. Cortada
IBM Global Services
Published: 13 September 2002
Abstract: Software applications represent a new field
of study within the history of computers. The author discusses
how to do research on the history of applications, using the
example of the petroleum industry to illustrate the approach.
The approach involves discussing interactions among technologies,
software, business issues, and industries; the author argues
that without such a holistic approach it becomes difficult to
appreciate the historical importance of computers and their
effects on society.
Keywords: software applications, petroleum
industry, Texas Instruments, IBM, historiography
When historians launch a major phase in their study of a general
subfield, for example the history of software, they must engage
in a dialogue concerning the definition and scope of such
studies. So far, and at the risk of a broad generalization,
software is generally thought of as programming languages,
systems that make computers operate, or code for sale (such
as an application or game). Taxonomies exist and, no doubt,
new ones will be developed in response to the historian's
impulse to organize and rationalize the study of important subjects.
The majority of the research done on the history of software
has focused on ephemera, such as programs ("lines of
code"), how they were written, and the history of their
integration and performance relative to one another.
So far, most of the work has centered on programming
languages and systems software.
Yet even with these two topics, there still exists a paucity
of material, which becomes evident whenever an historian looks
about for the prior historical literature to offer direction
and scope. The study of software's history by professional
historians is sufficiently new that the subject has a startup
feel about it. However, a body of responsible research is
slowly accumulating around the business of software, that
is to say, histories and economic analyses of the "Software
Industry."
Proponents of the particularist historiography of software might argue that
the history of an industry is not the same as the study of
software's past, claiming instead that the history of software
concerns the science and technology behind it and, in time,
the circumstances under which these evolved. Over time, one
might expect such arguments to suggest
a hint of the future historiography that is more social, perhaps
economic in tone. But that all lies in the future; today the
hunt is largely focused on the history of specific types of
software and the techniques for their creation and use.
We have an equally large problem involving the one category
of software that provided the rationale for using computers
in the first place: applications. Corporations, government
agencies, schools, and
universities rarely acquired programming languages or systems
software just for their own sake; rather, these were put into
service to create and operate machines running application
software. The latter provided services to an end user; put
another way, "answers" from a computer. Examples
include word processors, spreadsheets, inventory control systems,
software to operate ATMs, POS terminals, factory production
planning systems, online blueprint design tools, and so forth.
The list is endless.
But how are we
to study such a vast, eclectic accumulation of trillions of
lines of application software?
It would seem a more comfortable proposition to write histories
of programming languages, a few famous software packages,
perhaps boldly move toward a history of a specific use (e.g.,
email, word processing), and, of course, all the various
methods for designing, writing, testing, and installing
software. But I would like to propose that the study of
software applications can be given a discipline that
makes it possible to take the other kinds of studies of software
(e.g., of specific programming languages) and place them,
simultaneously, into the rich context of their use. Only if we do that does the history of software
ultimately become historically important in a broad sense,
especially to audiences beyond historians of information technology
(IT). A rational approach to the study of applications also
serves the important requirement of the historian to provide
a taxonomy with which to organize the work of millions of
programmers and users of computers over the past half century,
whose total output of effort led the economies of the world
to spend one to four percent of their gross domestic products
on computing over the past quarter century; in short, trillions of dollars.
A possibly useful,
early approach is to examine the role of applications, software
packages, and end user communities within the context of specific
industries. What little we know about the use of computers
at an industry level suggests that which applications were
adopted by specific companies was largely influenced by what
other firms in the same industry did; in fact, adoption often
took the form of an industry-wide initiative coordinated by
an industry organization (as occurred with bar codes in retail
and check design in banking). To test that assumption, examining
the uses to which companies put computers can serve as an early
step in the right direction.
It holds out several possibilities. First, we can learn what
applications were used, when, and by how many. That inventory
can lead, second, to a series of discussions about providers
of such software, such as data processing departments within
firms, vendors of hardware and software, their interactions,
and so forth. Third, we can then move to discussions about
the effects of computing on specific industries, and ultimately
to individual types of applications, programming languages,
and other software tools across multiple industries or a national economy.
Historians of technology have learned that the study of specific
artifacts must proceed hand in hand with non-technical issues
such as biographies of inventors and users and analyses of
societal impacts. The automobile, the clock, computers, flight,
and weapons, to mention just a few, are familiar examples.
But the same also applies to application software.
To write a history of an inventory control package written
in COBOL, operating on an IBM System/360 computer running OS/360
in the mid-1960s, ignores the central issues of why someone
wanted the software in the first place and how it influenced
the affairs of a company, an industry, or an economy. To conduct
the research effectively, we are quickly forced to recognize
what historians of clocks, cars, and computer chips have known
for some time, that context and other issues must be drawn
into the discussion.
Looking at software through industry eyes can help launch
a fuller discussion of software's history and role.
It is in the nature of application software that it is rarely
unique to one company or organization. A team of programmers
might write an inventory control application in COBOL for an
aerospace company that, line for line, reads differently from
an inventory control system written in Assembler and used at
the same time by an automotive manufacturer. But they both
perform essentially the
same functions: tracking and forecasting the "ins"
and "outs" of inventory movement in a company. Furthermore,
it is also a characteristic of users of applications to borrow
notions of how best to incorporate them, and for what reasons,
from other enterprises and industries. Nobody, it seems, operated
in isolation. They read each other's trade magazines and IT
industry publications, and attended industry and technical
conferences where people shared their experiences with software
applications. By the early 1960s, computer user groups were
in evidence, such as the important IBM users' group, SHARE.
My own research
suggests that the greatest influence on which applications
were written (or bought) came from the experiences others
had within an industry.
That is not to say that vendors, or prior programming experiences
within a particular IT organization or company, were unimportant;
ultimately, however, enterprises learned from others in their
own industries. In commercial operations, the hunt for competitive
advantage and productivity meant that business and technical
managers almost always made their decisions about what application
software to acquire within the context of their competitive
circumstances. Put another way,
it is difficult to imagine that one could study the history
of software applications without coming to the topic from
a business or user-centric perspective. There may be other
ways to study application software, but approaching it through
the lens of business history is a useful way to start since
we know little about the history of this topic.
Furthermore, it appears practical to begin the historiographical
exercise by looking at the subject from a user perspective,
since industries influenced the types of applications enterprises
installed, the experience an organization had with this class
of software, and the rate of its deployment.
An industry-centric approach offers certain advantages for the
historian. First, every major industry long had associations
and trade publications that documented the issues of the day,
including the role of computers and applications. Most of these
span the entire period during which computers existed and, in
some cases, date back to the early years of the twentieth century,
which provides the added benefit of being able to study pre-computer
histories of other uses of business machine applications.
In the case of the United States, there are almost no exceptions
to this generalization about the availability of materials
for the entire period of the computer. There exists a large
body of contemporary material conveniently available on such
issues as what applications were used, their features, why,
when, by whom, to what extent, and with what consequences.
Second, there are a substantial number of contemporary "how to"
technical IT publications that describe how to write and implement
software. These often include case studies. Related materials
include thousands of application briefs published by hardware
and software vendors, describing specific examples of implementation,
all done to support the sale of their products. These ephemera
nonetheless cite specific users by name, date the installation
and use of applications, describe the rationale for these and
the consequences of deployment, and provide many other specific
details, all grist for the historian. Third, archival evidence
exists, primarily of hardware and software vendors, along
with old consultants' reports on the scope of deployment of
applications; the collections of these kinds of resources
at the Charles Babbage Institute (CBI) and at the IBM Archives,
for example, are substantial yet understudied; very few historians
have examined these materials. Finally, we know that some
individual companies and government agencies have buried in
their respective archives documentation on their use of applications.
These collections have barely been tapped, although we have
some stunning examples of the potential of this documentary
evidence in the studies done on military applications, airline
reservation systems, and ERMA (an early banking application).
Not all of these various sources need be used in every study
of an industry's use of applications, but eventually one would
expect all to be examined.
So how would one
go about examining applications from an industry-centric perspective?
How would technical issues and business concerns integrate
to provide both insights into the use of computing technology
(especially software) while contributing to the history of
an industry? These are new questions to ask of the history
of software applications. In part the answers will be a reflection
of what the historical evidence begins to suggest, while over
time, other concerns will surface suggesting new lines of
research. But taking an institutional, non-technical view
of applications suggests a useful approach to their initial study.
The best way to
illustrate this approach is to try it. It quickly becomes
very clear that software, hardware, business, and economic
issues mixed together from the beginning of commercial uses
of computers. It is virtually impossible to isolate them and
still preserve a sense of the true importance of application
software from the context of its use. The case study below
is intended to suggest what a study of an application might
look like from an industry perspective. Given that in any
advanced economy there are literally hundreds of industries
one could study, the opportunity for historians of technology
and of business to come together, to learn from each other,
and to be informed in new ways is significant.
I focus entirely
on the perspective of the user, leaving out the real possibility
that the software or hardware industries had a role to play
in injecting software into an industry. To discuss that side
of the economic equation, the supply side of the story, would
have more than doubled the size of this essay. But it will
need to be done elsewhere to provide a full picture of other
industries, such as those that supplied the technology and
software. For example, Texas Instruments, in the decades before
the arrival of the computer, had sold companies in the petroleum
industry a variety of seismographic tools, which were logical
candidates for computerization in the 1950s and 1960s; that
business led to the evolution of this firm into a major participant
in the IT industry.
Yet to be answered, and beyond the scope of this essay, are such obvious questions
as who provided software and how.
Did users write all their own software, or if not, who did? Were
conditions placed on the use of software tools by vendors?
These are all questions that would increase our understanding
of the role of software in general, and more specifically,
the interactions between the petroleum industry and other
computer-related industries and firms, such as software vendors
beginning in the 1970s.
The U.S. petroleum
industry suggests how the history of software applications
can be studied. It is an industry profoundly influenced by
the use of software. That influence extends to the daily
work in all corners of the trade, and ultimately affects
the organization of its enterprises and the configuration
of the industry as a whole. It is an industry that has a long
history of working with all kinds of technologies, so that
by the time the computer came along, its managers were sufficiently
versed in complex instrumentation and earlier information
technologies to effectively leverage computers and software. This is also an industry that
embraced the use of computers early on, pushing some of the
technical boundaries of both hardware and especially software,
particularly in the formative years of the technology (1950s-60s),
for which we have a useful body of historical records with
which to trace their adoption and evolution. The companies
making up the industry were large, hence could afford to install
many computers, build complex IT systems, and write a great
deal of software. The industry also employed many thousands
of IT professionals, spending a great deal on computing over
the past half century. While other industries might well have
been chosen, such as those in manufacturing or even other
process industries (like the chemical trade), the petroleum industry
did just a little more with computing than others over the
entire half-century.
The high level
of concentration of activities in this industry in the hands
of a few companies is one of its most distinguishing features.
For the historian, it is an example of the arguments put forth
by Alfred D. Chandler, Jr., about the propensity of industries
to look for efficiency and, to use a term from the late 1990s,
to reduce "friction" in the economy of their internal operations.
It is an industry that also spent a great deal of its time and
resources selling and working within itself, unlike so many
manufacturing industries that needed to interact with firms in
other industries, or retail, which must work with manufacturers
and transportation industries.
Its insularity made it possible for the petroleum industry
to develop its own distinctive cultural traits. While a discussion
of that culture is outside the scope of this essay, one feature
that at least should be acknowledged is a propensity to rely
on and be comfortable with technologies of all types, especially
those that deal with continuous processing. It is within that
context that the industry embraced various forms of information
technology and its management software as a valued asset,
much like an invention or a trade secret, even from the earliest
days of computing.
II. Software applications in the U.S. petroleum industry
This is one industry
of the post-World War II period which became the subject of
so much attention around the world because it was the source
of a series of oil supply and pricing crises. It remained tied
up in various Middle Eastern wars, including most recently
the Gulf War of 1991, and involved questions of national security
as American dependence on Arab oil remained high through the
second half of the twentieth century. However interesting
and important all these issues are in understanding the international
economics of the recent past, we can safely bypass most of
that history because, regardless of political considerations,
day in and day out this industry still drilled for oil, refined
it, and shipped it to customers. Its products fueled the American economy.
This is an industry
which has long suffered from an image of being stodgy, even
old-fashioned; however, nothing could be further from the
truth insofar as the use of technologies of many types is
concerned. Of all the long-established manufacturing industries,
the petroleum trade has consistently been one of the most
high-tech, extensive users of computing technologies. It has
been high-tech all over the world, from the oil fields in
the Arab Emirates to retail gas stations across America.
The breadth of its inventory of software applications
is impressive by any measure.
But to understand
the nature and value of these applications, first we need
to understand the industry's structure because firms within
each part of the industry are the ones that embraced information
technology, and installed applications. This industry essentially
comprised four parts over the half-century. The first was
production, which located and extracted natural gas and crude
oil from the earth. The second consisted of refineries, which
manufactured such finished products as gasoline, jet fuel,
kerosene, and other liquid petroleum-based goods. This second
cluster has often also been called the petrochemical industry.
A third piece of the industry consisted of those firms or
divisions of companies that marketed and sold products, both
wholesale and retail. Gas stations fit into this segment of
the industry. The fourth component comprised transportation,
which, in this industry in the U.S., normally consisted of
all the pipelines that moved oil from wellheads to refineries
(but also ships if it came from outside the country), and by
truck and pipe to retail outlets. Over the past century, the
largest firms in the industry have generally attempted to
play an active role in each of these four segments. There
also existed smaller firms active in one or more of this
industry's segments.
In the decade
following World War II, seven vertically integrated firms
dominated the industry and were called the majors or "seven
sisters." Five were based in the U.S., the others in
Europe. The American firms were Exxon (Esso initially), Texaco,
Gulf, Chevron, and Mobil. British Petroleum and Royal Dutch/Shell
were the two major European firms, although experts on the
industry sometimes like to add an eighth, the French Compagnie
Française des Pétroles (CFP). In 1950 these eight companies
controlled one hundred percent of the production of crude
oil outside of North America and the Communist bloc. In 1970,
they still controlled eighty percent. This control always
involved an extensive array of alliances, supported by national
governments. In short, it was an oligopolistic arrangement,
operating on a global basis for much of the twentieth century.
In addition to these firms, in the 1950s a series of smaller
companies emerged called "the Independents." These companies
found, extracted, and sold oil on the "spot" market.
As the century marched on, they prompted national governments
in producing nations to play an extensive role in determining
prices, availability, and other terms and conditions.
In 1950 the U.S.
produced the majority of the oil it needed; by the early 1970s
it was importing over a third of its requirements. On a worldwide
basis, the U.S. produced 15.6 percent of the world supply in
1974 and 10.6 percent in 1998. The Middle East produced 38.9
percent in 1974 and 34.8 percent in 1998. Newly developed sources
in the intervening years added to the total supply of oil,
such as the North Sea fields in the Atlantic Ocean off northern
Europe, and Latin America. Beginning in the late 1990s new
sources emerged in Asia and in what used to be the Soviet
Union.
At the end of
the twentieth century, oil companies went through another
round of mergers and acquisitions to reduce overall operating
costs, and to enhance what economists liked to call "forward
integration," which meant firms participating in more
sectors of the four parts of the industry.
National governments permitted this after several decades
of attempting to control industry dynamics. In August 1998
British Petroleum (BP) and Amoco merged; then in December
Exxon and Mobil. Other mergers occurred around the world as
well. Members of OPEC (Organization of Petroleum Exporting
Countries, established in September 1960) expanded their ownership
of assets in the American economy, most notably Saudi Arabia,
which had acquired half-ownership of Texaco's American refining
and distribution network in November 1988. The newly merged
majors accounted for about four percent of the world's crude
oil production. One reason such mergers could occur was
application software, which made it possible to integrate
various operating units. This held out the promise
of efficiencies of scale while extending market share, or
what economists called "scope."
Because of the
widely differing activities of each of the four parts of this
industry, a brief survey of computing by sector gives us a
better sense of what happened within petroleum firms. Production
is the first area to look at. There are essentially two basic
activities involved: determining where to drill for oil and
gas, a geological exercise, and drilling holes and extracting
the crude oil and natural gas. The earliest software
applications in production were used to accumulate and present
data on various processing conditions, beginning in the 1950s.
Linking existing instruments to computers allowed firms to
begin presenting information useful to operators in the field.
During the 1960s and 1970s, software increasingly directed
instruments to change their activities in real-time, thereby
bringing a profound level of automation to field production
work. The same trend appeared in refinery operations.
In the 1950s and 1960s, the majors all experimented with centralized
computing and data collection from field operations.
With the availability
of smaller computer systems (e.g., IBM's 1400 series) in the 1960s,
and minicomputers in the 1970s, local data processing operations
were established in the field, but normally linked to some
centralized control function, particularly as software increasingly
took over functions performed by field personnel. By the late
1960s, one could speak about a computer production and control
system that monitored the status of remotely located wells, collected
production information, and generated a raft of management
reports. Automating these functions proved essential since
many wells were small and isolated, and there were tens of
thousands of them scattered all over the United States. Typical
software applications written by these companies included
monitoring and scheduling production, conducting automated
well testing, controlling secondary recovery, operating alarms
for problems with flow and leaks and machine failures, performing
data reduction and reporting, executing engineering computations,
and managing optimized gas plant controls. With these tools,
firms increased overall production and yields per site, determined
from an economic perspective when it was best to abandon a field,
reduced production downtime, lowered operating costs, utilized
field engineers more effectively, and kept management informed.
In the 1950s much of this was batch
processing, but by the end of the 1960s, a great deal was
real-time and online computing.
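As an illustration of the logic behind these monitoring and alarm applications, consider the minimal sketch below. It is written in Python for readability (period systems used languages such as Fortran or Assembler), and every name and threshold in it is hypothetical rather than drawn from any documented installation.

```python
from dataclasses import dataclass

# Hypothetical flow limits; real thresholds varied by well and era.
MIN_FLOW_BPD = 50.0    # below this with the pump on, suspect a leak or failure
MAX_FLOW_BPD = 900.0   # above this, suspect a metering fault

@dataclass
class WellReading:
    well_id: str
    flow_bpd: float    # barrels per day
    pump_on: bool

def check_alarms(reading: WellReading) -> list[str]:
    """Return alarm messages for one polled reading."""
    alarms = []
    if reading.pump_on and reading.flow_bpd < MIN_FLOW_BPD:
        alarms.append(f"{reading.well_id}: low flow, possible leak or machine failure")
    if reading.flow_bpd > MAX_FLOW_BPD:
        alarms.append(f"{reading.well_id}: abnormally high flow, check metering")
    return alarms

def daily_report(readings: list[WellReading]) -> str:
    """Data reduction: condense one polling cycle into a management summary."""
    total = sum(r.flow_bpd for r in readings)
    idle = [r.well_id for r in readings if not r.pump_on]
    return f"{len(readings)} wells polled, {total:,.0f} bpd total, idle: {idle or 'none'}"

cycle = [
    WellReading("TX-014", 210.0, True),
    WellReading("TX-015", 12.0, True),   # triggers the low-flow alarm
    WellReading("TX-016", 0.0, False),
]
for r in cycle:
    for alarm in check_alarms(r):
        print("ALARM:", alarm)
print(daily_report(cycle))
```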
By the early 1970s, drilling operations were coming under
computer control for the purposes of monitoring drilling activities
(such as drill penetration rates), reporting results in offshore
drilling, optimizing drill-bit life, and reducing various
testing operations. Many locations had a staff of one to three
DP professionals to do this work. However,
extensive deployment of computing in drilling operations did
not occur until the end of the 1970s.
Determining where to drill for oil has long been a critical activity, requiring
extensive knowledge of geology and calling for good judgment
because drilling has always been an expensive operation. "Dry
holes" normally cost millions of dollars in unproductive
expenses. In the 1950s, computers held out the hope of being
used to model geological conditions and perform analysis to
help firms determine where to drill. Not until available computing
power increased sufficiently in the 1960s did this become possible.
At that point the first geological and geophysical mapping software
applications were written, a collection that companies have
continuously enhanced to the present.
These applications included the study of shock wave reflection
patterns, and analysis of data from test well drilling. By
the end of the 1960s many firms had developed software applications
that helped control well drilling, mathematical models to determine
in advance where best to put wells, financial models to figure out
what to pay for mineral drilling rights, others for online testing,
and pipeline management systems. By the end of the 1970s these tools
were in wide use, reducing the amount of guesswork by highly
experienced personnel, whose work and decisions were now guided
more by software.
When initial experimentation
with computer-based models started at the beginning of the
1960s, it was not clear to management in general, and even
to DP professionals, exactly how useful computing could be
in the area of simulation. That is largely why one commentator
from the period could argue casually that simulation made
it possible "to allow management to 'play around' with
supply and distribution schedules, or the design of complex
process facilities, without disrupting present operations."
But they quickly found that software could handle the "interpretation
of tremendous volumes of seismic and geologic data" even better
than humans could. One of the earliest databases
of such information, known at the time as the Permian Basin
Well Data system, evolved into "a huge electronic library
of information relating to one particular oil producing area."
Today no drilling occurs without extensive computer-based
modeling of options. At the same time, drill sites are
overwhelmingly automated and extensively controlled from remote
locations.
Refining is as
close to pure manufacturing as one gets in this industry,
with the critical task being the conversion of raw crude into
a variety of products that can be shipped in bulk to other firms
(e.g., gas stations), converted into consumer products such as
cans of oil sold in a Kmart store, or used as raw materials by
other industries (e.g., for plastics).
A typical refinery looks like an organized bowl of spaghetti
with many miles of pipes going every which way and attached
to large, tall containers that do the transformation of crude
to various products. The key notion to keep in mind is that
all the work is continuous. The industry places a premium on
uninterrupted operations, and on absolute understanding of
what is happening at every stage of the process. Long before
the arrival of the computer, the industry had developed a raft
of analog-based instruments to support these objectives. Computers
and software applications tailored to specific configurations
of petroleum industry technologies were then deployed to take
control of these instruments, monitor and manage them, and
redirect activities as needed to optimize continuous production.
Over the entire period we are looking at, firms continuously
upgraded instruments, digitizing many of them, and optimizing
the whole process.
Refineries are large, complex, and expensive installations,
providing perfect locations for computing, so it should be no
surprise that some of the earliest installations in this industry
were housed at refineries. By the middle of 1963, nearly fifty
refineries in the U.S. had installed one or more data processing
systems for the purposes of increasing production, reducing
operating costs, and improving quality control. Standard of
California used its system to control catalytic cracking operations,
while American Oil Company in Indiana used a computer to manage
437 tanks controlled by 13 pump houses, as well as for more
traditional inventory control, production management, and shipments.
How things changed from the
1950s to the 1960s can be gleaned from this description of
what occurred at American Oil's refinery in Whiting, Indiana:
[There] was no scarcity of basic data at the Whiting refinery. During
70 years of operations, such information had been developed
each day through the use of gauge sheets, unit morning reports,
weekly stock reports, routine staff reports and similar
documents. These items of information, however, arrived
at different times and were difficult, if not impossible, to [integrate].
This led the company to install computer systems to integrate
the data and better time its delivery:
[The] system today [1960s] involves a smooth and rapid flow of
data from 13 reporting locations to a computer system. Supervisors
at the various pumphouses mark sense tank inventory and
shipment information onto cards. These cards are collected
periodically, automatically translated into standard punched
codes and fed into the computer system. Final result: a
daily IPS (inventories, production and shipments) report
providing all the information needed by management, ready
and waiting by 8 a.m.
The same location
next upgraded to online systems, feeding data into its applications
in real-time. In time, all refineries connected their various
analog instruments to computers, translating analog readings
into digital data. Deployment in these early years proved
impressive. Refineries in the U.S. had five computer systems
in 1959, ten by early 1961, and 110 by mid-1968.
Rapid deployment of computing in process industries was normal
in the 1960s, and not limited just to refineries. Between
1963 (the year in which a significant number of systems were
installed across the nation in process industries) and 1968,
the installation rate averaged 48 percent per year in the U.S.,
and 55 percent around the world.
In short, this was the era when computing arrived in volume
in all process industries. In the 1970s and 1980s, refineries
filled up and upgraded and enhanced their systems, but in
the 1960s they had figured out what to use computers for and
began to invest in them. By the start of the 1970s, the U.S.
Bureau of Labor Statistics (BLS) was able to report that about
25 percent of all American refineries, which constituted about
two-thirds of the entire industry's production capacity, used
computer-based process control. This control involved all refining
processes, from crude distillation to online gasoline blending.
In the 1960s and 1970s, open-loop processing was widespread
across the industry.
Open-loop processing involved data gathered by instruments,
which software turned into reports presented to human operators.
These reports informed the operators' decision-making, and their
decisions were then relayed back to equipment through computers.
Increasingly in the 1970s and throughout the 1980s, the industry
moved to closed-loop applications in which humans were taken out
of the decision-making steps. These steps were delegated to
software, which used decision tables to make many more incremental
operational decisions and adjustments automatically. This approach took
advantage of the growing experience the industry was acquiring
with software and the fact that refineries were becoming increasingly
large, hence more complex to operate. BLS economists in the
1970s observed that it was not uncommon for a refinery to
have over one thousand instruments and sensors linked to a
computer system. Some of these instruments, such as chromatographs,
mass spectrometers, and octane analyzers, collected complex data.
Computers helped with the complexity, but as in other industries,
speed in resolving problems proved an essential benefit of computing.
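The difference between the two modes can be made concrete with a short sketch. The Python fragment below illustrates, under invented setpoints and rules, how a closed-loop application could use a decision table to make small operational adjustments automatically; it is an illustrative reconstruction, not a description of any particular refinery's system.

```python
# One hypothetical control point on a distillation column. In the open-loop
# style, readings like these were merely printed in a report for an operator;
# in the closed-loop style sketched here, a decision table maps each reading
# directly to an incremental adjustment, with no human in the loop.

SETPOINT_C = 350.0   # invented target temperature
DEADBAND_C = 2.0     # tolerance band within which no action is taken

# Decision table: (condition on the error, fuel-valve adjustment in percent).
DECISION_TABLE = [
    (lambda err: abs(err) <= DEADBAND_C, 0.0),   # within tolerance: hold
    (lambda err: err > DEADBAND_C, -1.0),        # too hot: close the valve a little
    (lambda err: err < -DEADBAND_C, +1.0),       # too cool: open it a little
]

def closed_loop_step(measured_c: float, valve_pct: float) -> float:
    """Apply the first matching decision-table rule; return the new valve setting."""
    error = measured_c - SETPOINT_C
    for condition, adjustment in DECISION_TABLE:
        if condition(error):
            return min(100.0, max(0.0, valve_pct + adjustment))
    return valve_pct

# Simulate a few control cycles of a column drifting hot.
valve = 50.0
for temp in (349.0, 353.5, 356.0, 351.5):
    valve = closed_loop_step(temp, valve)
    print(f"measured {temp:.1f} C -> fuel valve at {valve:.1f}%")
```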
The third sector
of this industry, wholesale and retail sales, is perhaps the
most visible part of the industry to any American because
its companies sold the bulk of their products either through
gas stations or to fuel oil companies that, in turn, delivered
gas and heating oil to homes and businesses in trucks. Increasingly
in the 1980s, and extensively during the 1990s, products also
were sold through non-industry controlled retail outlets,
such as oil in quart containers at Kmart, 7-Eleven stores,
and so forth. The most visible application, and in a sense
novel in the 1950s and 1960s, was the deployment of the gas
credit card. It is unique because the retail industry as a
whole embraced the credit card later than the petroleum industry
at large, with the two exceptions of restaurants and hotels.
Much of the early experience with credit cards on a massive
basis, therefore, came out of the work of such oil companies
as Mobil, Esso (later Exxon), Gulf, and so forth. Credit
cards were developed in the 1950s,
and by the early 1960s, sixty petroleum companies were issuing
gas credit cards; in fact, they had issued seventy million of them.
A quarter of all purchases made by the public in the U.S.
via credit cards in the early 1960s came from gas credit cards,
involving billions of dollars in small transactions. In the
late 1950s, only ten percent of all Americans had a gas credit
card; one-third of all adults did by the mid-1960s. It was
also not uncommon for Americans to have multiple gas credit cards.
In short, credit cards brought about a major change in how
petroleum companies interacted with their customers in post-World
War II America.
These cards exposed
millions of American adults to credit cards in general, conditioning
them for bank-issued cards before the major expansion of the
likes of American Express, MasterCard, and Visa that took
place in the 1970s and 1980s.
We know a great
deal about the early history of this application, thanks to
a well-informed report presented at the 1964 annual meeting
of the Data Processing Management Association (DPMA) by James
C. Beardsmore, Sr. At the time he was employed in the marketing
department of the Gulf Oil Company. He indicated that the
application was so significant because the volumes of transactions,
and the dollars involved, were so extensive. Simultaneously,
the number of billing centers and personnel needed to process
these transactions had grown so quickly that his industry faced
huge costs and managerial issues. At the same time, there was
intense focus on providing high levels of customer and dealer
service because of growing competition in the American market. The
application was not always computerized, but became increasingly
so to reduce costs and improve service. One major consideration
was quickly getting bills out to customers, leading to processes
and software programs whereby a certain number of bills were
created and mailed every day.
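At bottom, this is a cycle-billing scheme: partition the customer master file so that a predictable share of accounts is billed each business day. The short Python sketch below illustrates the idea; the partitioning rule is my own illustration, not Gulf's documented method.

```python
# Cycle billing: spread the customer master file across N daily billing
# cycles so that a roughly equal number of bills is prepared and mailed
# each business day. The assignment rule (account number modulo the
# number of cycles) is illustrative only.

BILLING_CYCLES = 20   # e.g., one cycle per business day of the month

def billing_cycle(account_number: int) -> int:
    """Assign an account to one of the daily billing cycles."""
    return account_number % BILLING_CYCLES

def accounts_due_today(master_file: list[int], today_cycle: int) -> list[int]:
    """Select the slice of the master file to bill on this cycle."""
    return [a for a in master_file if billing_cycle(a) == today_cycle]

master_file = list(range(100000, 100100))   # 100 hypothetical account numbers
for cycle in (0, 1):
    due = accounts_due_today(master_file, cycle)
    print(f"cycle {cycle}: {len(due)} bills to prepare and mail")
```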
Beardsmore described Gulf's operations as typical for his industry.
The extent of computer technology deployed was impressive. In the
period just prior to IBM's introduction of the System/360 (which
Gulf installed), the firm used first IBM 1401s and then IBM 1460s
to support this application, along with early bar code readers
and punched card peripheral equipment. He said the firm borrowed
time on the company's 7070 and 7074 systems to do processing.
Prior to 1961, the firm did not use a credit card imprinter
and had to develop one with various suppliers, because the
imprinter became the critical data-gathering instrument needed
to capture information about an individual sales transaction.
His company's credit card accounting system consisted
of a customer master record, which triggered the billing and
accounts receivable processes. Once data was in the billing
system, the application was typical of what existed in most
companies during the 1960s.
These firms also implemented incremental changes in the application.
The most visible for customers arrived in the mid-1990s when
they could insert their card into the gas pump and have it
authorize their purchase on credit without requiring a signature.
It was fast and paperless, saving
gas companies and retail outlets the expense of handling millions
of small transaction documents. Use of credit cards by American
drivers has remained to the present an essential sales tool
for the industry.
The actual form of the credit card, a plastic document just
slightly larger than a calling card, illustrates the interaction
between applications and technologies. As one group of observers
in the 1960s argued,
"the optical scanner was one of the factors which made
possible the introduction of credit cards."
The problem this industry faced with credit card sales prior
to the arrival of the plastic card and optical scanning was
described this way:
[The] conditions under which the credit slip is filled are often
difficult. The card might be exposed to rain, be covered
with grease from the station operator's hand, be bent or
mutilated during handling, and so on. The imprinting on
the card must be clear and uniform in order to be scanned.
Moving from paper
stock to plastic (itself a petroleum-based material) and to
pressure-type imprinters resolved the problem. With uniform
lettering impressed on the slips from the card, a scanner
could be used to read these little documents, regardless of
whether or not the slips were dirty or otherwise in less than
pristine condition. Initially 80 to 92 percent
of all documents submitted to scanners were read; later versions
of the scanners read higher percentages of these papers. Companies
outside the petroleum industry made arrangements allowing
customers to use their gas cards to make purchases in stores.
That capability forced all card issuers to standardize the
lettering and design of cards, as well as the use of both scanners
and the software applications supporting this billing process.
The fourth area
of the petroleum industry, and also one that relied extensively
on computing, involved transportation of oil and natural gas.
The industry needed to transport crude and natural gas (which
could be liquefied for that purpose) to refineries, and then
to regional wholesalers and dealers. Local wholesalers using
trucks made deliveries to homes and businesses. Companies
delivered natural gas almost universally by pipelines. The
vast majority of all transportation occurred in one of two
ways. Ever-larger tanker ships would move crude oil from overseas
sources (e.g., the Middle East) to American refineries or
via pipelines from wellheads. Refineries normally shipped
their finished products by pipeline around the U.S. or, for
specialized products, by tanker trucks. In the years following
World War II, the industry integrated its network of pipelines
from all across the North American continent, and regularly
increased the diameter of the pipes themselves in order to
expand the volume of product it could ship. Software could
help by minimizing the expense of running oil and its various
derivatives through the system and by ensuring their continuous
flow to the right refinery or wholesaler. The entire industry
created whole bodies of practices, policies, inter-firm trade
agreements, and so forth to optimize the use of the pipeline
network.
For the same reasons that computers appeared in refineries,
they appeared in transportation, often first applied by the
majors. While simulation applications
were tried in the 1960s, it was not until the 1970s that enough
computer capacity existed to make simulation tools sufficiently
effective. Yet warehouse location studies were successfully
simulated in the 1960s, largely because they required less
data for analysis than, for example, geological studies.
At the same time that automation came to refineries, such
approaches were applied to the management of pipelines: flow
monitoring, notification of emergencies, and so forth.
Prior to the arrival of the computer, the industry had trunk
line stations along the entire network, staffed with employees
monitoring what was happening within their section of the
pipeline. Increasingly, instruments communicated with computers
housed in centralized facilities, displacing workers in the
field. The key strategy was to automate the operation of trunk
stations as much as possible. By 1958, nearly a third of all
trunk stations were controlled remotely, and by the end of
1966, half of them were. That share climbed to over 60 percent
by 1970 and to nearly 100 percent in the 1980s.
Tied to this deployment
were a series of other uses of software and hardware. Delivery
scheduling was an early application that remains central to
the entire network. Field process controls were linked to more
traditional applications evident in other industries, such
as administrative, accounting, financial, and auditing work.
The major changes in the 1970s were the industry's initiatives
to computerize scheduling and to link online and central control
processing in order to move greater volumes of product through
the network.
Economists at the BLS reported that "productivity for
complex pipeline scheduling is being increased through better
computer programs for database updating, original and revised
scheduling, and shipment report preparation." Computer-aided pipeline
engineering design also became more widespread, while firms
continuously upgraded pipeline instrumentation throughout
the 1970s and 1980s, using an ever growing combination of
commercially available products and home grown software and
instruments. The result was that by the end of the decade,
as one commentator declared, "for many pipelines, monitoring
and regulatory tasks, including the operation of unmanned
pumping stations, are performed by headquarters dispatchers
using solid state electronic telecommunications equipment
and computers." Minicomputer systems at online stations
could be activated to take charge of the operations of a specific
section of the pipeline.
About 10 percent of all pipeline operators used computers
to schedule flows in 1971; that number rose steadily throughout
the decade. The industry learned early on to centralize its
computer operations as much as possible, and to link applications
together, for example, from scheduling to inventory control.
They also added pipeline-specific applications, such as computer-controlled
leak detection systems that reduced the number of human
inspections required.
It is easy to dismiss accounting applications as ubiquitous
and uniform across all industries, with little industry
differentiation. However, precisely because these applications
were so ubiquitous, we should acknowledge them, if only briefly.
We should recognize that in the late 1950s many manufacturing
firms, including members of the process industry, moved their
applications from tabulating and accounting equipment to computers.
The Ashland Oil and Refining Company typified many firms. In
October 1956 it installed an IBM 650 computer system, becoming
one of over 1,500 firms to do so over the life of this product
line. In addition
to using the system for scientific and engineering applications,
the company also ran accounting work through the same computer. Early
applications involved billing and payroll, which held out
the promise of saving on the costs of labor. By 1958, the
company had written software that allowed it to run a series
of accounting applications on this system: accounts receivable
aging analysis, depletion, depreciation and amortization schedules,
daily refinery inventory, and other programs. The programs
were batch, and the data entry occurred with cards. Many of
these programs were essentially sort/merge operations, which
accounts for why this system had 10 sorters, 7 collators,
and 17 card punches. As occurred in so many companies, the
strategy was to weave computing into existing accounting practices.
The report on which this paragraph draws originated
in 1958 and noted that the company "was able to integrate
the machine [meaning the IBM 650] into the present machine
accounting section with a minimum of change." The technical
staff then moved on to the installation of magnetic tape and
later RAMAC storage.
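Because so much of this card-era accounting work reduced to sort/merge runs, a brief sketch may help. The classic pattern, shown here in Python with invented record layouts, is the sequential master-file update: sort the day's transactions, then merge them against the sorted master file in a single pass.

```python
# The classic card/tape-era batch pattern: sort the day's transaction cards,
# then merge them against the sorted master file in one sequential pass.
# Record layouts are invented; transactions for unknown accounts are simply
# skipped in this sketch.

def merge_update(master, transactions):
    """Apply sorted (account, amount) transactions to a sorted
    (account, balance) master list, returning the updated master."""
    transactions = sorted(transactions)        # the 'sort' step
    updated, t = [], 0
    for account, balance in master:            # the 'merge' step
        # skip any transactions for accounts that sort before this one
        while t < len(transactions) and transactions[t][0] < account:
            t += 1
        while t < len(transactions) and transactions[t][0] == account:
            balance += transactions[t][1]
            t += 1
        updated.append((account, balance))
    return updated

old_master = [(1001, 250.00), (1002, 0.00), (1003, 75.50)]
todays_cards = [(1003, -20.00), (1001, 100.00), (1001, 5.00)]
for account, balance in merge_update(old_master, todays_cards):
    print(account, f"{balance:.2f}")
```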
In the late 1950s,
others did the same. For example, Standard Oil Company of
California also installed an IBM 650 and an IBM 704 for the
purpose, as stated by Comptroller W.K. Minor, "of utilizing
these machines whenever they will effect savings over the best
manual or punched card system or render improved or more timely
service."
In those days, prior to the time when a software industry
existed, a "system" included a variety of software
tools provided by the vendor of the installed computer, in
this case IBM. Systems included system control software (also
known as operating systems), utilities (such as sort/merge
programs), and compilers (e.g., for Assembler and Fortran,
and by the early 1960s a new generation, such as for COBOL).
All application software, however, was written by
the EDP programmers of the firm or by contract programmers
from consulting firms. General Petroleum Corporation, located
in Los Angeles, installed a Datatron-Cardatron system in November
1956 to run a payroll accounting system, and a production
lease profit and loss application. Like other firms, it could
only anticipate the savings from the installation of computers,
but its expectations were similar to others'. As one employee
put it, "we remain convinced that we will save time and
money, and produce for management information not now available.
The degree to which success in these areas is finally reached
will not honestly be known for perhaps another year."
The applications described above were cataloged by the major
functions within the industry. However, a few observations
about the source of these various types of software are in
order. There existed essentially three classes of software:
the first provided monitoring functions, the second comprised
basic accounting applications, and the third consisted of
advanced, complex modeling tools. Accounting
applications were quick rewrites of tabulating applications
of the 1940s or 1950s, enhanced with tools provided by key
computer vendors, such as Burroughs and IBM, for such things
as sort/merge packages, compilers, file managers and access
methods (e.g., VSAM), and operating systems. By the 1970s,
commercial accounting packages widely available across all
industries began to appear in this trade, along with, by the
early 1980s, such widely used database packages as IMS and
DL/1. Commercially available accounting software has remained
the norm to the present. Reports on business applications
were usually homegrown affairs written in COBOL from the 1960s
through the 1970s. After that, business applications were
distributed either as commercially available products or as
internally created reports written in RPG, COBOL, and C++,
to name a few programming tools and languages.
As for monitoring software, the earliest tools were developed
in the years before computers existed. One of the major providers
of such tools was a company that was a predecessor to Texas
Instruments. Originally formed in 1930 as the Geophysical
Service, later Geophysical Service, Inc., by J. Clarence
Karcher and Eugene McDermott, the firm developed aids to help
explore for oil, relying on methods for measuring seismic
waves to map conditions under the surface of the earth and
determine the possibility of oil deposits existing in an area.
The business proved successful, and in a later incarnation
during World War II it acquired a substantial amount of expertise
in electronics, much as happened to many other firms, such
as IBM and NCR. That newly acquired knowledge led Geophysical
Service, after the war, to concentrate more on electronics,
although it remained committed to geophysical exploration. In 1953,
it acquired Houston Technical Laboratories, which specialized
in gravity meters used in geophysical work, selling its services
and products to petroleum firms around the world. Beginning
at the end of the 1950s, and extending over the next two decades,
Texas Instruments (TI) also manufactured and deployed integrated
circuits designed to conduct seismic studies under contract
to the industry. Applications and software from TI were initially,
to use a later term, real-time. Over time, increasing use of
digital technologies and of both batch and online software
tools characterized the collection of software packages used
by TI and the petroleum industry.
Across the entire industry, beginning in the 1950s and extending
into the 1960s, batch reports of monitoring applications were
written internally within firms, in such languages as Fortran
and Assembler. When online processing became available in the
1960s, a long process of converting monitoring applications to
online, computer-based software began. In the 1980s, specialized
software vendors began offering software products to the industry
for various functions.
Finally, we have
that large class of applications called modeling tools. These
were the most complicated packages in existence within the
industry. They came from a variety of sources. Individual
companies wrote some of the earliest modeling tools in the
late 1950s and in the 1960s. CDC, later Cray, and more
specialized software firms also wrote modeling tools, often
in conjunction with joint development projects with the largest
firms in the industry. In fact, each of the major firms at one
time or another had a project with a local university or a
computer vendor, a pattern of development that has continued
to the present.
While going through a description of a series of these projects
would be a worthwhile exercise, it could easily fill another
paper. What is important
to understand is that because these firms were large, often
having hundreds of IT professionals working for them, even
as early as the 1960s, they were able to combine a number
of strategies for the acquisition of software. Simple software
tools such as compilers and database managers were either
acquired from computer vendors as part of renting hardware
in the 1950s and 1960s, or were bought on the open market
in subsequent decades. Complicated, industry-specific software
tools were either written entirely in-house or later acquired
from software firms, although there were always internal software
development projects in the areas of monitoring and business
applications. Simulation applications normally were acquired
from joint development projects or as products from other
software and hardware firms. These were complicated, calling
for extensive knowledge of algorithms and of scientific fields
such as geology. In its most advanced stages, this scientific
knowledge often existed in universities and in a few specialized
software and hardware firms, much as it had in the 1920s and
1930s with the companies preceding Texas Instruments.
The effects of software on the industry's productivity
Productivity in this sector of the industry grew, as it did
in refining. In the latter, productivity grew all through the
half century at
rates of between 2.9 and 3.2 percent, despite oil crises but
also in part due to increased demand. Occupations became far
more complex, increasingly technical and IT-centric throughout
the entire period.
Pipeline transportation productivity grew at annual rates
of ten percent in the years 1958 to 1967 as demand and pipeline
management practices improved dramatically. It then dropped
to a low of 1.9 percent between 1967 and 1986, largely
due to relative declines in output and to various international
oil crises, which periodically reduced demand. Automation
in the earlier period actually caused an annual decline in
employment of 3.5 percent, evidence of its effects, and more
specifically of the increased centralization and complexity
of the control systems used to manage the network.
This sector of the industry, as well as the other three, installed
many of the new computer systems that came out in the 1980s
and early 1990s, largely because of their increased capacity
and versatility, which proved so essential to the centralized
applications embraced by all four sectors of the industry.
Larger systems also made possible simulation
applications in all four sectors, and especially in pipeline
management practices in the 1980s and 1990s. A BLS economist
in 1988 concluded that, "the petroleum pipeline industry
has attained a high degree of automation."
We can quickly
summarize the complex history of computing in pipeline management.
In the 1950s traditional accounting applications were the
norm. The first use of data processing with computer systems
to manage the flow of petroleum came in the early 1960s and
expanded over the next two decades. Upgrades to all major
applications occurred continuously, but most dramatically
in the 1980s and 1990s, as major improvements in computer
capacity and their sophistication occurred. By the end of
the 1980s all firms operating in this sector of the industry
relied on computers and industry-specific applications to
do their daily work. Intelligent terminals and PCs were in
wide use by the early 1990s, while programmable control logic
units were embedded in many monitoring instruments and machinery
by the late 1980s. The "hot" applications in the
1980s included improved scheduling and dispatching, leak detection
systems, power optimization, and shipment documentation. All
were infused with extensive upgrades in telecommunications
across all four sectors in each decade.
As in the other sectors, as one labor economist noted,
[Because] of the increasing use of centralized computer control systems
involving the operation of almost all the functions of a
major pipeline, there has been a shift away from operators
involved in such manual operations as opening and closing
valves and switches, checking tank levels, and reading meters.
These functions have been almost completely taken over by
the computer. Pipeline personnel tend to be skilled in computer
operations and programming.
The industry's dependence on telecommunications to support many
of its strategies for centralized computing meant it would
be an early user of Electronic Data Interchange (EDI) and
later the Internet.
Common IT systems and industry-wide practices facilitated
the spate of mergers that occurred in the 1990s. They were
designed to further operational efficiencies and to expand
global reach to sources of crude oil and natural gas, and to
markets. IT issues in these mergers were now news items.
Standardization, therefore, once again became a high priority
within IT organizations in the industry for such application
areas as telecommunications, ERP packages, and desktop computing.
IT was seen as an essential component of what was happening
in the industry. The Oil and Gas Journal, clearly not
a computing publication, acknowledged the role IT played in
helping companies to survive the commoditization of energy
that occurred in the mid-1980s, a lesson that it again reminded
readers about at the end of the 1990s.
At the start of the new century, the petroleum industry's expectations
included further global consolidations of what by then many
called the energy trade. This consolidation was often facilitated by the kinds
of software applications already installed in the industry.
The industry had many very large players who had the resources
to continue optimizing computing. In 1990, Royal Dutch/Shell
Group and Exxon were on the list of top ten global companies,
as measured by market capitalization; in 1998 they were still
there. Both had been leaders in the industry in transforming
major segments of their businesses in the 1980s and 1990s,
despite oil spills and wildly fluctuating prices for crude
oil, and increased competition from ever-larger rivals.
As they entered the new century, oil company executives reached
out again to IT to help support their strategies of consolidating
and expanding markets.
The petroleum industry is an example of one in which IT effectively
supported corporate strategies, but only in collaboration
with industry-wide coordination of such activities as prospecting
and transportation. As one observer in September 2000 wrote,
"IT and the information it drives are making it possible
for energy companies to expand their reach into remote pockets
of the world, to understand the consumers that buy their products,
and to align their supply chains and procurement efforts with...."
The Internet became
the next enhancement to the variety of telecommunications
tools already in use. It became especially useful in expanding
communications and the sharing of information with retailers,
who used it to order products and to report levels of supplies
in their tanks. Purchasing practices changed, much along the
lines evident in the steel, automotive, and aerospace industries
and for the same reasons. Communications with tankers expanded,
making it possible to send information back and forth in graphical,
audio, and textual formats.
In this industry, as in so many others, the Internet emerged
as a basic component of the infrastructure of the supply chain.
This case study
of the petroleum industry has both strengths and weaknesses.
On the plus side, it is a useful approach for conducting an
initial inventory of what applications there were and for
measuring the extent to which companies deployed them. While
this exercise is primitive, it nonetheless needs to be done
first to establish the rationale and scope for further research.
One of the pleasant surprises for an historian
of software coming out of such an approach is the extent to
which software applications can be placed in the mainstream
of more established topics, such as business and economic
history. It begins to suggest the true importance of
software beyond the circle of technologists, computer scientists
and engineers, and their historians. In a long-term study
I am conducting on nearly fifty industries, the evidence already demonstrates
that computing profoundly changed the nature of work across
the entire American economy. In fact, it was as determinative
an influence as any other offered by economists or business
historians, particularly by the 1980s. This case study of
the petroleum industry hints at what is possible to uncover.
Another lesson from this study is the need to consider the
role of hardware and software together. Historians seem to
be dividing the field in two: the study of the history of
machines and now an emerging subfield on the history of software.
Each topic has issues of sufficient differentiation to warrant
separate examination. But when applications
are discussed, and specifically from the perspective of the
user community, it makes more sense to combine the two because
that is the way companies and the DP departments treated them.
Both were components of systems installed for the purpose
of generating answers or performing tasks. The fact that systems
were made up of software and hardware was a fine point of
differentiation that only became of intense concern to the
technicians installing components of a system, or to the vendors
who supplied hardware and software. In other words, the closer
an historian moves toward a discussion of the business history
of computing, perhaps the less it matters that some of the
technologies were machines and others software.
This approach to the study of software applications makes it
possible to use largely untapped bodies of research materials
that are also rich in other evidence regarding a wide variety
of topics, such as programming experiences, features of specific
hardware devices, and the role of computing in specific companies.
Each of the major oil companies has various types of archival
material; however, none of it was examined for the purposes
of writing this article. While that material would have been
useful, it would not have necessarily changed the basic scope
of this essay, namely to provide a short, cursory view of
the major applications in use in this industry. Armed with
the general inventory of applications, it now becomes easier
to examine in more detail, by company, what their specific
experiences were with software over the half century, and
to do that would require a careful examination of corporate
records and interviews of retired members of the industry,
because the published record cannot take us much beyond what
was presented in this paper. In short, this little exercise
in providing an overview of the industry suggests that there
is much yet that can be learned both about computing in this
or any industry, and about the nature of software over time
in its various forms.
The approach described
in this essay, however, appears initially to have some important
limitations that historians need to be aware of. Scholars
who want to learn what programming languages were used or how
applications were run will find that, like the historian of
the clock who sets out to study how its mechanisms worked,
they are pushed almost too soon into discussions concerning
society, economics, and business, and away from the pros and
cons of using COBOL versus BASIC, or of one programming
methodology over another. Another topic that
this approach tends to push historians away from, yet is profoundly
important for the history of software applications, is the
role of file management systems. But as hardware capacity
increased over the decades and database management systems
came on stream, these two developments made it possible to
write more sophisticated applications, stimulated changes in
already-installed software packages, and changed the architecture
of new systems. The petroleum industry relied
on vast quantities of data to perform some of its most essential
work; geological modeling and management of credit card accounts
are two examples. One cannot fully understand the history
of software in this industry without a clear appreciation
of the role of file management systems, and in particular,
databases. Yet, as the case above demonstrates, we have an
inadequate understanding of the history of database management
systems (DBMS), let alone their role in this industry. The
reader will note that in this article I did not discuss databases;
it is a gap, but one that cannot be rectified without specific
studies of database tools in general.
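The architectural difference at stake can be suggested with
a short sketch. An application built on simple file management
had to pass sequentially through a master file to answer a
question, while a database management system offers keyed,
declarative access to the same records. The account records
below are invented, and sqlite3 stands in for the mainframe
DBMSs the industry actually used.

    # Sequential file access versus keyed DBMS access (invented records).
    import io
    import sqlite3

    # Style 1: a sequential "master file" is scanned from the top.
    master_file = io.StringIO("1001,SMITH,250.00\n1002,JONES,75.50\n")

    def find_account_sequential(account_no):
        """Read every record until the wanted account turns up."""
        master_file.seek(0)
        for line in master_file:
            fields = line.strip().split(",")
            if fields[0] == account_no:
                return fields
        return None

    # Style 2: a DBMS answers the same question with a keyed query.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE accounts (no TEXT PRIMARY KEY, name TEXT, balance REAL)")
    db.executemany("INSERT INTO accounts VALUES (?, ?, ?)",
                   [("1001", "SMITH", 250.00), ("1002", "JONES", 75.50)])

    print(find_account_sequential("1002"))
    print(db.execute("SELECT * FROM accounts WHERE no = ?", ("1002",)).fetchone())

For applications such as credit card account management, which
had to find one record among millions on demand, the second
style changed what it was feasible to write, which is the
historical shift the paragraph above points to.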
Without a detailed
study of the internal managerial operations of the key firms
in the industry we are left with the impression that the adoption
of computing in this industry occurred uncritically, that
is to say, without the normal battles between "systems
people" and already-established communities that might
have seen computing as a threat to their fiefdoms or views,
battles evident in such industries as automotive manufacturing,
airline transportation, and retail. Part of the problem historians
face is that by relying on industry publications, they perforce
use sources that tend to gloss over frictions as authors attempt
to provide optimistic accounts. That does not mean that this
material is not useful, far from it, but it does mask some of the internal struggles and debates
that probably took place and that will not become evident
until company-level studies are done. A new
generation of historians is providing examples of how to begin
to study some of the institutional-centric issues, which can
be applied neatly to the study of computer applications within organizations.
One of the byproducts
of an industry-centric study is what it points to as promising
future research topics. Just looking at the subject of software
history, the petroleum industry strongly demonstrates the
urgency of examining the history of file management and database
management systems. Other industries point to different topics.
For example, in an industry that requires high levels of rapid
transaction processing, such as the retail industry, databases
remain important, but not so crucial as the role of transaction
processing software. In the latter case, the channel speeds of
computers, in combination with smart and "dumb" telecommunications
control software and the bandwidth available, become an important
topic, as does the role of TP access software (like IBM's VSAM
of the 1970s and 1980s) working in concert with operating systems.
Studies of any
major industry make it possible to connect various disparate
aspects of the history of computing. In the last decade of
the 1990s, historians began to examine a variety of managerial
and operational aspects of the history of computing, such
as how systems were designed, the role of technical staffs,
and the relationship of government, universities, and corporations
in the development of computing technologies. In addition,
since technologies, and hence one can say software too, are
conditioned by the social and political forces at work within
any organization or industry, we can learn about the influences
on software that came from outside the computing communities,
primarily from users and their managers. This
line of research will go far in linking the evolution of software
to the greater historical patterns in the transformation of
various technologies. In short, the possibilities presented
for the study of new questions are quite formidable.
These opportunities are particularly exciting since we are
really at the threshold of the study of the history of software
in general by professional historians.
There is also
the basic question of how circumstances varied from one industry
to another. Each has its own personality, particular economic
circumstances, business models, and handed-down values and
practices. Each influences all technologies used within its
member enterprises and institutions, and thus can add to our
understanding of possibilities. For example, because the retail
industry needs high speed transactions and data collection,
the role it played in the development of the bar code and
its relevant software systems is a dramatically different
experience from how the military encouraged the development
of software to provide high levels of "up time."
Then we have petroleum companies deeply concerned about modeling
systems that used vast quantities of data. Manufacturing companies
prized data collection systems and the use of robotics. In
other words, we have hardly begun to explore the role of all
these various influences on software's development, use, and
effects on modern society.
James W. Cortada, "Studying the Increased Use of Software Applications:
Insights from the Case of the American Petroleum Industry,
1950-2000," Iterations: An Interdisciplinary Journal
of Software History 1 (September 13, 2002): 1-26.
We do not have, however, a modern general history of software.
Rather, there exist many accounts of individual software projects
and programmers. The most recent one is by Steve Lohr, Go
To: The Programmers Who Created the Software Revolution
(New York: Basic Books/Perseus, 2001).
For examples of this literature, see James W. Cortada, A
Bibliographic Guide to the History of Computing, Computers,
and the Information Processing Industry (Westport, CT:
Greenwood Press, 1990): 343-491, and the sequel, Second
Bibliographic Guide (1996): 276-278.
There are several dozen books and a few dozen articles, most,
however, not written by historians but rather by economists
or veterans of the world of computing. Two accessible examples
of books are, David C. Mowery (ed.), The International
Computer Software Industry: A Comparative Study of Industry
Evolution and Structure (New York: Oxford University Press,
1996) and Jason Dedrick and Kenneth L. Kraemer, Asia's
Computer Challenge: Threat or Opportunity for the United States
and the World? (New York: Oxford University Press, 1998).
Professor Martin Campbell-Kelly, an historian, is just finishing
a history of the American software industry which will include
discussions of the economic literature on the subject.
I am just completing a large study on the use of application
software in manufacturing, process, transportation, wholesale,
and retail industries that describes various types of applications
and their role in the American economy.
For a recent answer to the question, see Martin Campbell-Kelly,
"Software Preservation: Accumulation and Simulation,"
IEEE Annals of the History of Computing 24, no. 1 (January-March 2002).
The major example on clock history is David S. Landes, Revolution
in Time: Clocks and the Making of the Modern World (Cambridge,
MA: Harvard University Press, 1983); but see also on another
widely studied example, the wheel, Richard W. Bulliet, The
Camel and the Wheel (Cambridge, MA: Harvard University Press,
1975); and on automobiles (a favorite example for many historians),
see John B. Rae, The Road and the Car in American Life
(Cambridge, MA: MIT Press, 1971). For an extensive bibliography
on other examples involving stone tools, combustion engines,
electric motors, and other artifacts of technology and for
an appreciation of the argument in favor of studying specific
artifacts to enrich the traditional scope of historical research
on technologies, see George Basalla, The Evolution of Technology
(Cambridge: Cambridge University Press, 1988). The significance
of Basalla's work is that he has profoundly influenced how
historians go about looking at the history of technology,
including that of computers.
For a sense of what it was like, see John Backus, "Programming
in America in the 1950s. Some Personal Impressions,"
in Nicholas Metropolis et al. (eds.), A History of Computing
in the Twentieth Century: A Collection of Essays (New
York: Academic Press, 1980): 127.
Atsushi Akera, "Voluntarism and the Fruits of Collaboration:
The IBM User Group SHARE," Technology and Culture
42, no. 4 (October 2001): 710-736.
"The Digital Hand," forthcoming.
The case for organizing views of business and technical issues
within industry constructs was made by Alfred D. Chandler,
Jr., in a series of books; however, for the application of
this approach to IT, see his Inventing the Electronic Century:
The Epic Story of the Consumer Electronics and Computer Industries
(New York: Free Press, 2001): 1-12, in which he also integrates
recent economic literature on the notion of path-dependency.
For an early example of cases, see James L. McKenney, Waves
of Change: Business Evolution through Information Technology
(Boston: Harvard Business School Press, 1995).
Franco Malerba, The Semiconductor Business: The Economics
of Rapid Growth and Decline (Madison, WI: University of
Wisconsin Press, 1985): 111-117; T.R. Reid, The Chip: How
Two Americans Invented the Microchip and Launched a Revolution
(New York: Simon and Schuster, 1984): 90-95; Hans Queisser,
The Conquest of the Microchip (Cambridge, MA: Harvard
University Press, 1988): 97, 98, 115, 148; Ernest Braun and
Stuart Macdonald, Revolution in Miniature: The History
and Impact of Semiconductor Electronics, 2nd ed. (Cambridge:
Cambridge University Press, 1982): 55, 58-59, 88-90, 94-97;
Michael S. Malone,
The Microprocessor: A Biography (New York: Springer-Verlag,
1995): 13-14, 16, 54, 130-136, 143, 146, 148, 150, 210, 279;
Paul E. Ceruzzi, A History of Modern Computing (Cambridge,
MA: MIT Press, 1998): 179, 182, 187-188, 213, 215, 217.
For a summary of the modern history and economic realities
of this industry, see Stephen Martin, "Petroleum,"
in Walter Adams and James W. Brock (eds.), The Structure
of American Industry (Upper Saddle River, NJ: Prentice-Hall,
For a summary of how the industry functions within its four
parts, see Thomas G. Moore, "The Petroleum Industry,"
in Walter Adams (ed.), The Structure of American Industry
(New York: Macmillan, 1971): 117-155.
U.S. Bureau of Labor Statistics, Outlook for Computer Process
Control: Manpower Implications in Process Industries,
Bulletin 1658 (Washington, DC: U.S. GPO, 1970): 1.
For a description of how these applications worked, see IBM
Corporation, System/7 for Computer Production Control of
Oil and Gas Wells (White Plains, NY: IBM Corp., undated,
circa 1971), DP Application Briefs, Box B-116-3, IBM Archives.
"On-Site Instruments Help Avoid Troubles, Optimize Drilling,"
Oil and Gas Journal (September 24, 1973); W.D. Moore
III, "Computer-Aided Drilling Pays Off," Ibid. (May
31, 1976): 56-60; U.S. Bureau of Labor Statistics, Technological
Change and Its Labor Impact in Five Energy Industries,
Bulletin 2005 (Washington, DC: U.S. GPO, 1979): 19-20.
Donald C. Holmes, "Computers in Oil-1967-1987,"
Computer Yearbook and Directory, 2nd edition (Detroit:
American Data Processing, 1968): 168-169.
T.E. McEntee, "Computers in the petroleum industry,"
in Edith Harwith Goodman (ed.), Data Processing Yearbook
(Detroit: American Data Processing, 1965): 246-247.
Comments made in the last sentence are drawn from a reading
of dozens of articles on the industry published in the late
1990s in Oil & Gas Investor, Computerworld, Oil &
Gas Journal, Informationweek, and Petroleum Economist.
Oil & Gas Journal published several articles each
year on these kinds of applications, and is a rich source
on applications for the 1990s, including the now emerging
area of e-business.
BLS, Outlook for Computer Process Control, 12.
BLS, Technological Change and Its Labor Impact in Five
Energy Industries, 26.
Ibid., 28-29. These economists described life before and after
the arrival of open loop computing: "The duties of an
operator of a fluid catalytic cracking unit before computer
control were to manually adjust automatic analog controllers
at the control console and to monitor automatic data logging
equipment. After installation, the computer controls and monitors
a large part of the process and automatically logs the data,
although the operator still performs manual control. In case
of emergency, the operator can take control of any part or
all of the process."
For its history, see David Evans and Richard Schmalensee,
Paying with Plastic: The Digital Revolution in Buying and
Borrowing (Cambridge, MA: MIT Press, 1999): 61-68.
James C. Beardsmore, Sr., "Credit Card Accounting,"
Data Processing Proceedings 1964 (New Orleans, LA:
DPMA, 1964): 2-3.
The material for the last two paragraphs came from Ibid.,
Robert H. Church, Ralph P. Day, William R. Schnitzler, and
Elmer S. Seeley, Optical Scanning for the Business Man
(New York: Hobbs, Dorman & Company, Inc., 1966): 122.
Ibid., 122-125; includes a flowchart of the application, p.
Moore, "The Petroleum Industry," 135-136.
J.C. Ranyard, "A History of OR and Computing," Journal
of the Operational Research Society 39, no. 12 (December
1988): 1073-1086; Albert N. Schriber (ed.), Corporate Simulation
Models (Seattle, WA: University of Washington, Graduate
School of Business Administration, 1970); Ron Wolfe, "Evolution
of Computer Applications in Science and Engineering,"
Research & Development 31, no. 3a (March 21, 1989):
Holmes, "Computers in Oil," 169-170.
The American Petroleum Institute tracked this form of automation
very closely throughout the half century. For data on early
implementation of unmanned trunk line stations, see Hugh D.
Luke, Automation for Productivity (New York: John Wiley
& Sons, 1972): 262-263.
There is an extensive contemporary industry literature documenting
these; see James W. Cortada, A Bibliographic Guide to the
History of Computer Applications, 1950-1990 (Westport,
CT: Greenwood Press, 1996): 206-207.
BLS, Technological Change and Its Labor Impact in Five
Energy Industries, quotes from p. 39.
Management could increasingly then rely on visual inspections
done from low flying aircraft augmented with ground-based
inspection teams looking at segments of the pipeline that
either those flying overhead had called out for further examination
or that had a history of problems, based on reports from
pipeline management software.
Automation Consultants, Inc., "The 650 Used in Refinery
Sales Billing," undated case study (circa 1958), quote,
p. III c1-8, full case study, III C1-1-8, CBI 55, "Market
Reports," Box 70, Folder 1, Archives Charles Babbage
Institute, University of Minnesota.
"EDP at Standard Oil of California," Ibid., III
"Datatron in Petroleum Accounting, Ibid., III C3-6.
For the early history of TI in the petroleum industry, see
Texas Instruments, Inc., 50 Years of Innovation: The History
of Texas Instruments: A Story of People and Their Ideas
(Dallas: Texas Instruments, 1980). While there is much published
material on TI dealing with its development of computer chips,
its pre-chip history has yet to be fully explored.
See for example, Dale O. Cooper, "Advances in EDP in
The Petroleum Industry," Data Processing Proceedings
1964 (New Orleans: DPMA, 1964): 20-30; "BP, Amoco
Merger Marries IT Opposites," Computerworld 32,
no. 33, August 17, 1998, p. 76; Stuart J. Johnston, "IT
Fuels Speedup in Energy Industry," Informationweek
(September 14, 1998): 139-146.
The Charles Babbage Institute at the University of Minnesota
houses the corporate archives of CDC, which are replete with
material on this subject. There is, however, limited historical
literature on the topic; see J.C. Ranyard, "A History
of OR and Computing," Journal of the Operational Research
Society 39, no. 12 (December 1988): 1073-1086. Hundreds
of articles and dozens of "how to" books were published
that included sporadic case studies of this application across
many industries, including petroleum.
Rose N. Zeisel and Michael D. Dymmel, "Petroleum Refining,"
in U.S. Bureau of Labor Statistics, A BLS Reader on Productivity,
Bulletin 2171 (Washington, DC: U.S. GPO, June 1983): 197-206.
U.S. Bureau of Labor Statistics, Technological Change and
Its Labor Impact in Four Industries, Bulletin 2316 (Washington,
DC: U.S. GPO, December 1988): 34.
Bob Tippee, "Electronic Data Interchange Changing Petroleum
Industry's Basic Business Interactions," Oil and Gas
Journal 96, no. 28 (July 13, 1998): 41-47.
See for example, Julia King, "BP, Amoco Merger Marries
IT Opposites," Computerworld 32, no. 33, August
17, 1998, pp. 1, 76.
Stuart J. Johnston, "IT Fuels Speedup in Energy Industry,"
Informationweek, September 14, 1998, pp. 139-146.
John Kennedy, "In Global Energy, Information Technology
Knits It All Together," Oil and Gas Journal, "Windows
in Energy Supplement" (Spring 1999): 1.
"The Middle East, New Super-Majors, and More Industry
Consolidation," Offshore 60, no. 4 (April 2000):
Jeff Sweat, "Information: The Most Valuable Asset,"
Informationweek, September 11, 2000, pp. 213-220, quote,
Don Painter and Robert Dorsey, e-Business: Refining the
Petroleum Industry (Somers, NY: IBM Corp., 2000).
See, for example, Thomas Haigh, "The Chromium-Plated
Tabulator: Institutionalizing an Electronic Revolution, 1954-1958,"
IEEE Annals of the History of Computing 23, no. 4 (October-December
2001): 75-104, which describes how first-generation computers
were sold and acquired; Nathan L. Ensmenger, "The 'Question
of Professionalism' in the Computer Fields," ibid., 23,
no. 4 (October-December 2001): 56-74, on the roles of programmers;
Thomas Haigh, "Inventing Information Systems: The Systems
Men and the Computer, 1950-1968," Business History
Review 75, no. 1 (Spring 2001): 15-62, which is an important
study on the ideas and role of "systems men" of
the 1950s and 1960s, people who wanted to take over the management
of information within corporations.
For examples of current historiography, see the entire issue
of the IEEE Annals of the History of Computing 24,
no. 1 (January-March 2002).