The discussion of the scientific revolution in geology (Lecture 31) ended in the second half of the 20th century. The remaining few lectures will address developments that reach into our own lifetime. Such a discussion is impossible without an elementary assessment of the impact of technology on science and society.
Before the 20th century science and technology developed more or less independently, and before the Renaissance European scientists did not take much notice of technical skills. (Lecture 20) This situation changed towards the end of the 19th century, when scientists began to use technological devices that allowed them to investigate phenomena that could not be experienced by the human senses alone. (Lecture 28) During the second half of the 20th century science and technology were brought closer together again, to the degree that much scientific work grew directly out of new technological inventions, while new inventions in turn flowed from scientific discoveries.
The single technological development with the greatest impact on science was the invention of the digital computer. Modern astronomy, space research, meteorology, oceanography, climate research, the decoding of the human genome, medical research and research in many other scientific fields would be impossible without powerful computers, and digital image analysis and animation techniques have become essential tools for understanding the results of scientific research.
The impact on society has been equally far-reaching. Many technical products of today, from cars to kitchen implements, contain computer chips. Their use determines the lifestyle of affluent societies, but they are not central to their very existence; replacing them with models without computer chips would cause some inconvenience but would not touch society at its core. It is in areas such as banking, warehouse and goods distribution management, and communication that computers have become indispensable.
The use of the word "computer" to describe a machine developed during the years between the two World Wars; before the 20th century a computer was a person employed to do computations. The history of the computer has been documented in a number of studies. Like other studies of technology they often focus on personalities, priority of ideas, and the excitement of new inventions. In the context of science, civilization and society the invention of the computer becomes important because of its role in the competition between different social systems: Its history is inseparably linked with the increasing control of the military-industrial complex over scientific developments and with Japan's success in establishing itself as a leading capitalist country.
Technology, like science, is motivated by the need to solve practical problems. The need to ease the burden of calculation was felt in the earliest societies. Technological solutions were developed soon after the invention of the place-value number system. (Lecture 4) The most successful of these solutions is the abacus, believed to have originated in Babylon around 3000 BC. (The Greek word abakos is thought to have its root in a Semitic word meaning "to wipe the dust.") In its original form it probably consisted of an area of cleared sandy floor and was developed into a board with divisions. The modern form of the abacus, which uses beads strung on a wire, was invented in Egypt around 500 BC. China and Japan used a computing tray from about 200 AD. In medieval Europe Gerbert of Aurillac used the abacus in the form of a calculating board.
The modern Chinese abacus uses beads on rods. Its use requires some training, but once it is mastered it allows extremely fast and accurate calculations. This was demonstrated in 1946 in a competition between "the most expert operator of the electric calculator" of the United States army and the "champion operator of the abacus" of the Japanese postal service. The contest ended 4:1 for the abacus.
Electronic calculators have undergone further development since 1946, but that did not challenge the position of the abacus as the fastest implement for elementary calculations. The speed of the calculator is determined by the operator's ability to key in the numbers, which has natural limits. The invention that will spell the eventual demise of the abacus is not the electronic calculator but the electronic cash register that links the sales to the electronic ordering of stock.
As civilizations developed and populations grew the need to make counting and calculating more efficient took on a new urgency. To establish a fair taxation system required a census of people and land, and even the basic task of regular bookkeeping required a veritable army of clerks. In 17th century Europe this attracted the attention of mathematicians, who designed and built mechanical calculators.
In 1624 Wilhelm Schickard (1592-1635), a friend of Johannes Kepler who worked at the University of Tübingen, built a "Calculating Clock", a 6-digit calculator that performed addition and subtraction when a crank handle was turned. His invention was lost, but his plans were found in 1935, lost again during World War II and located again in 1956. A copy built from his plans in 1960 proved that the design performed perfectly.
In 1642 Blaise Pascal built his "Pascaline", a mechanical calculator capable of addition, to assist his father in his work as judge of the tax court. His first machine had only 5 digits, and subtraction could only be achieved by adding the complement of a number (to subtract 16 the user had to add 9,984, its complement with respect to 10,000, and ignore the carry out of the highest digit). But each digit wheel could be turned independently, with the result that addition was achieved digit by digit (to add 16 the user added 6 to the first position and 1 to the second position), which made adding sets of large numbers fast and easy. Pascal sold a few dozen Pascalines, some supporting calculations with 8 digits, which shows that there was a need for faster and more accurate calculations. The principle is still used today in water meters and odometers.
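The complement trick is easy to follow in a short sketch. The following Python fragment is a modern illustration of the arithmetic, not a description of the Pascaline's mechanism; the register width of 4 digits is chosen so that the complement of 16 comes out as 9,984, matching the example above:

```python
# Subtraction by complement addition on a fixed-width decimal register.
DIGITS = 4
MODULUS = 10 ** DIGITS               # a 4-digit register wraps around at 10,000

def subtract_by_complement(minuend, subtrahend):
    """Subtract by adding the ten's complement and discarding the carry
    out of the highest digit (the register simply wraps around)."""
    complement = MODULUS - subtrahend        # e.g. 10000 - 16 = 9984
    return (minuend + complement) % MODULUS  # the overflow is dropped

print(subtract_by_complement(2000, 16))      # prints 1984
```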
The central problem for all computing devices was and remains the need for calculations that go beyond addition and subtraction. In 1674 Leibniz invented a mechanical calculator that used a movable carriage for multiplications with operands of up to 5 and 12 digits and a product of up to 16. Multiplication was carried out as repeated addition, with the carriage shifted one place for each digit of the multiplier; the machine required user intervention to carry over intermediate results and was difficult to use. Similar machines were constructed during the 18th century. All showed that mechanical calculators could not offer a practical solution to the multiplication problem.
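The method embodied in such machines — multiplication as repeated addition, with a shift of one decimal place for each digit of the multiplier — can be sketched in a few lines of Python (a modern illustration of the method, not of Leibniz's mechanism):

```python
def multiply_by_repeated_addition(multiplicand, multiplier):
    """Multiply two non-negative integers by repeated addition:
    for each decimal digit of the multiplier the multiplicand is added
    that many times, then shifted one decimal place (the carriage movement)."""
    product = 0
    while multiplier > 0:
        digit = multiplier % 10        # lowest remaining digit of the multiplier
        for _ in range(digit):
            product += multiplicand    # repeated addition for this digit
        multiplicand *= 10             # shift the carriage one place
        multiplier //= 10              # move on to the next digit
    return product

print(multiply_by_repeated_addition(365, 24))   # prints 8760
```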
While Pascal and Leibniz occupied themselves with the design of mechanical calculators, other mathematicians had already found an alternative and vastly more powerful answer to the multiplication problem. The Scottish laird and amateur mathematician John Napier and the Swiss mathematician Jost Bürgi independently discovered the principle of logarithms, which convert multiplication and division into the simpler operations of addition and subtraction. Napier published his Mirifici Logarithmorum Canonis Descriptio ("Description of the Marvelous Canon of Logarithms") in 1614; Bürgi published his work in 1620.
In both works multiplication and division by logarithms relied on printed tables. This guaranteed the high accuracy required for astronomical calculations but made their use cumbersome for applications in engineering and technology. However, soon after publication of the tables William Oughtred invented the slide rule, which sacrificed extreme accuracy for ease of use. His invention remained a standard tool in technology and science well into the 20th century, when it was replaced by the electronic calculator.
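The identity behind both the tables and the slide rule is log(xy) = log x + log y: a multiplication is replaced by looking up two logarithms, adding them, and looking up the antilogarithm of the sum. A minimal Python sketch of the procedure, with the computer's own logarithm function standing in for Napier's tables:

```python
import math

def multiply_with_logarithms(x, y):
    """Multiply two positive numbers by adding their base-10 logarithms,
    as a user of logarithm tables or a slide rule would do."""
    log_sum = math.log10(x) + math.log10(y)   # two table look-ups and one addition
    return 10 ** log_sum                      # the antilogarithm look-up

print(multiply_with_logarithms(123.0, 456.0))   # about 56088, up to rounding error
```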
The abacus, the Pascaline and all their relatives are based on the representation of numbers with a fixed number of digits. They are therefore called digital calculators, and their more powerful successors digital computers. Oughtred's slide rule was one of the earliest calculating devices that used a continuous scale for solving mathematical problems. Such devices are today called analog computers. Before the invention of the electronic computer analog devices were better suited to problems that involve the continuous change of a variable in time.
A particularly pressing case was the prediction of tides, which involves the combination of harmonic functions (such as the sine function) with different periods. Before the 18th century tidal predictions necessary to plan ship departures had to be obtained from people who kept their knowledge a family secret. Daniel Bernoulli's equilibrium theory of 1740 had placed the procedure on a scientific basis, but it took another 133 years before William Thomson (Lord Kelvin) invented his "Tide Predicter", an analog machine that made tidal prediction a simple routine.
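What Kelvin's machine did mechanically — adding several sine curves of different period, amplitude and phase — is easy to express in a few lines of Python. The constituent periods below are the physical periods of the main tidal constituents; the amplitudes and phases are invented values for a hypothetical harbour:

```python
import math

# (period in hours, amplitude in metres, phase in radians) for four main constituents;
# the periods are physical constants, the amplitudes and phases are illustrative only.
constituents = [
    (12.42, 1.20, 0.0),   # M2, principal lunar semidiurnal
    (12.00, 0.45, 1.0),   # S2, principal solar semidiurnal
    (23.93, 0.30, 2.0),   # K1, lunisolar diurnal
    (25.82, 0.20, 0.5),   # O1, principal lunar diurnal
]

def tide_height(t_hours, mean_level=0.0):
    """Predicted water level as a sum of harmonic constituents (Kelvin's principle)."""
    return mean_level + sum(
        amplitude * math.cos(2 * math.pi * t_hours / period - phase)
        for period, amplitude, phase in constituents
    )

for t in range(0, 25, 3):                       # a prediction every three hours for one day
    print(f"hour {t:2d}: {tide_height(t):+.2f} m")
```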
Like the slide rule, Kelvin's machine for tidal prediction was in use until the second half of the 20th century, when it was replaced by the computer. Other analog computers simulated continuous functions through the strength of the current in electrical circuits. The engineer Vannevar Bush built an analog computer known as a "differential analyzer" in 1935. It weighed 100 tons, contained 2,000 vacuum tubes, several thousand relays, 150 motors and more than 300 km of wire and was accurate to one part in 25,000. It was successful enough to be duplicated twice, for the U.S. Army's Ballistics Research Laboratory and for the University of Pennsylvania. (Shurkin, 1984) Much smaller analog computers are well suited for the analysis of damped oscillations and were used for the design of car suspension systems until only very recently.
Analog computers were successful to some degree, but in many applications an output in the form of numbers is more useful than a continuous graph. If digital computers could find solutions to problems with arbitrary functions they would most likely be the preferred machines. The key to digital computing was the method of differences. By the 18th century it was known that every continuous function can be approximated by a polynomial, and that if the differences between successive values of a polynomial are taken repeatedly, they eventually become constant (and the next order of differences zero). This fact was already used to turn the tabulation of polynomials into a sequence of simple additions.
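A short example shows the method at work. For the polynomial f(x) = x² + x + 41 (a hypothetical choice; any polynomial behaves the same way) the second differences are constant, so once the starting value and the first difference are known the whole table can be produced by additions alone — which is exactly what a difference engine mechanized:

```python
def tabulate_by_differences(f0, d1, d2, n):
    """Tabulate a quadratic polynomial using only additions.
    f0: value at x = 0, d1: first difference f(1) - f(0), d2: constant second difference."""
    value, diff = f0, d1
    table = []
    for _ in range(n):
        table.append(value)
        value += diff       # next function value: one addition
        diff += d2          # next first difference: one addition
    return table

# f(x) = x*x + x + 41: f(0) = 41, first difference 2, second difference 2.
print(tabulate_by_differences(41, 2, 2, 6))   # [41, 43, 47, 53, 61, 71]
```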
In 1786 J. H. Müller in Hessen, Germany, designed the first "difference engine", a digital computer for the tabulation of values of polynomials. Unable to raise funds for his project, he had to abandon its construction, and the idea was forgotten.
A few decades later Charles Babbage was more successful in England. With an initial government grant of 1,500 pounds he began the construction of his first difference engine in 1822. His design envisaged 20-digit numbers and 6th-order differences. But Babbage was an eccentric character with an unsteady mind; he rarely brought a project to its end, being sidetracked by one fascinating problem after another. Only a prototype of his difference engine, which operated on 6-digit numbers and 2nd-order differences, was ever completed.
Having consumed 17,000 pounds of government money and a similar amount of his own fortune, Babbage abandoned the difference engine project in 1833. (Shurkin, 1984) A year later the Swedish printer Georg Scheutz and his son Edvard saw a report on Babbage's project in the Edinburgh Review and decided to build a difference engine of their own. After twenty years of effort they exhibited their machine at the Paris Exhibition of 1855. It was sold to the Dudley Observatory in Schenectady, New York. A second engine was ordered by the British government and delivered at a cost of a mere 1,200 pounds. (Campbell-Kelly and Aspray, 1996) It handled 15-digit numbers and 4th-order differences and printed the results.
Although the design of the difference engine was a great intellectual achievement, Babbage was never satisfied with the limited ability of his invention. It could only perform the set of calculations for which it was set up. Working on a different problem required a major resetting of its parts. Babbage envisaged a machine that could perform virtually any calculation and make decisions based on the result of its computations. In 1833, in collaboration with the 17-year-old mathematician Ada Byron, he began work on his "Analytical Engine", again a project that did not see its completion. But the concept of the Analytical Engine was a revolution in computing: It described the first programmable computer. Babbage's own description of the engine contains the essence of modern computing, as a comparison of his terms with modern terminology shows:
Babbage's Analytical Engine                                  | modern digital computer
Hardware:  store                                             | memory
           mill                                              | CPU (central processing unit)
Software:  operation cards                                   | operation statements
           variable cards                                    | variables
           "the Engine moving forward by eating its own tail" | conditional (if ... then) statements
Babbage envisaged a library of functions that could be called upon as required. It would consist of bundles of perforated cards, the same cards used to feed programs into the engine. The use of perforated cards in process control was already proven technology; Joseph-Marie Jacquard had invented it in 1805 and used it to program the weaving of patterns in looms.
No working prototype of the Analytical Engine was ever completed; the government was reluctant to spend funds on yet another of Babbage's projects. (A small part of the mill is now in the Science Museum in London.) Even if funding had been found, implementing a programmable computer as a mechanical machine would have limited its versatility.
Babbage's dream was forgotten. It was rediscovered a century later, when the theoretical physicist Howard Hathaway Aiken (1900 - 1973) of Harvard University had a need to solve nonlinear differential equations and thought of building a digital calculator. The result was the IBM Automatic Sequence Controlled Calculator, better known as the Harvard Mark I, the first fully functional programmable computer.
Aiken later recalled that when he talked to colleagues about his idea in 1936, a technician said that he "couldn't see why in the world I wanted to do anything like this in the Physics Laboratory, because we already had such a machine and nobody ever used it." (Campbell-Kelly and Aspray, 1996) He was then taken to the attic, where a fragment of one of Babbage's engines, donated by Babbage's son, had been sitting for nearly a century.
Aiken went to the library and found Babbage's autobiography. Three years later the IBM board approved the spending of 100,000 dollars on an "Automatic Computing Plant." Another four years later, in early 1943, the Harvard Mark I did its first calculation. Aiken missed Babbage's description of the procedure for conditional statements, so the Harvard Mark I could only perform computations without conditional branching.
Aiken's design was an electromechanical machine driven by a 5-horsepower electric motor. It was somewhat easier to program than Babbage had envisioned, but its speed was similarly limited and could not be increased significantly. The way to faster speeds was through electronics. While IBM worked on its electromechanical machine, John Vincent Atanasoff, Professor of Mathematics and Physics at Iowa State College, and his graduate student Clifford Berry built in 1939 an adding machine with vacuum tubes that used binary numbers stored in capacitors.
There was considerable debate, accompanied by years of litigation in the courts, whether this was the first electronic computer or just an adding machine. (In 1973 a judge ruled that it was the first automatic digital computer.) Many books on the history of the computer describe the ENIAC (Electronic Numerical Integrator and Computer) as the first programmable electronic computer. Construction of the ENIAC began in 1943, and the machine was completed in 1945.
But new ideas in science rarely grow in isolation; they sprout in many places when the time is ripe. Newton considered himself the inventor of calculus and attacked Leibniz for having the same idea; Darwin's ideas were formulated independently by Wallace. The major ideas that make up a programmable digital computer likewise developed in several countries. Konrad Zuse in Germany built his first computer, the purely mechanical Z1, in 1938; it was based on binary arithmetic and was the first machine to introduce floating-point calculation with overflow protection. His Z3, completed in 1941, is now considered the "first properly functioning computer" (Zuse, 1987), and in 1998 a panel session at the International Conference on History of Computing identified Zuse as a pioneer of modern computing. (Zuse, 2004)
The ENIAC used decimal arithmetic and thus required ten vacuum tubes per digit, which gave it very limited memory capacity. The EDVAC (Electronic Discrete Variable Automatic Computer), which did its first computation in 1949, rectified this by switching to binary arithmetic.
In the context of science, civilization and society most of the computer developments that followed led to improvements of existing concepts rather than innovations. The invention of the transistor made computers smaller, memory larger and computations faster. Increasing computing speed created new opportunities in the entertainment, leisure and music industry and influenced people's lifestyles. The introduction of the personal computer and the invention of the internet changed the way in which people communicate.
Like many other inventions, the internet can be put to different uses. Capitalist monopolies use it to strengthen their economic position; ordinary people use it to build coalitions against imperialist governments and multinational companies. Computers do not take sides in this; they may have supported the development of weapons of mass destruction, but unlike bombs and guns they can be put to good use as well.
Were computers really instrumental in the development of advanced warfare? They were not invented exclusively for that use, but there is no denying that as time went on military considerations overruled everything else.
J. H. Müller was a member of the Hessian army when he designed his calculator in 1786. He may have been motivated by military considerations. For the next 150 years non-military uses, including science applications, were more prominent. Babbage had visited Paris in 1819 and again on later occasions. He was familiar with Napoleon's project to establish a fair system of property taxation and knew how the Bureau du Cadastre, the French ordnance survey office, organized the calculation of new tables based on the metric system of measures like a factory that "manufactured logarithms."
On his return to England in 1820 Babbage and his friend John Herschel were put in charge of supervising the calculation of tables for the Astronomical Society by freelance computers. They found the process slow and riddled with errors. The first difference engine built by Scheutz father and son was sold to an astronomical observatory.
The second of the Scheutz engines went to the British government to assist with the handling of another problem, the computation of new life insurance tables. In the 18th century Equitable Societies offered a payout at death in return for payment of an annual premium. These societies were set up for the affluent classes who could afford to pay an annual bill when it arrived in the mail. When the Prudential was established in 1856 its aim was to offer life insurance to the working class. Ten pounds were enough to cover a working-class funeral, but the annual premiums would have to be replaced by small weekly payments. To enable the Prudential to make a profit from the collection of pennies from a large number of customers every week it had to keep its administrative overheads at a minimum. It organized its office into a calculation factory and managed to service millions of policies with a workforce of 300 clerks. (Campbell-Kelly and Aspray, 1996)
By the end of the 19th century another problem had become intractable. Taxation tables and life insurance statistics require census data. The rapid population growth during the 19th century had turned a census into a mammoth undertaking. In the USA the number of employees at the Census Bureau had spun out of control. The answer was Hollerith's punched card system, the successor to Jacquard's system of perforated cards used in Babbage's engines. Hollerith's monopoly for the census operations of 1890 and 1900 laid the foundations for what was to become the International Business Machines Corporation (IBM).
As the 20th century progressed computer development came more and more under military control. It began with the need for ballistic tables. The trajectory of a projectile is determined by many factors, chiefly by head winds, cross winds, air temperature, air density and local gravity. Each type of projectile responds differently to these variables, and accurate targeting requires a complicated calculation. In the 1930s and 1940s these calculations were provided in the form of tables for the gunner.
A typical firing table contained data for about 3,000 different trajectories. Each trajectory required the solution of a differential equation in seven variables. The coefficients for the equations that expressed the drag of the air on the projectile had to be determined experimentally, and at the time of World War I all major countries had established ballistic firing ranges or "proving grounds" for that purpose. In the USA the research laboratory of the proving ground was known since 1938 as the Ballistics Research Laboratory (BRL).
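The heart of a firing-table calculation is the numerical integration of the equations of motion with air resistance. The real BRL equations involved seven variables and experimentally determined drag functions; a heavily simplified sketch — a point mass in two dimensions with a quadratic drag term and invented launch values — looks like this:

```python
import math

def trajectory_range(speed, elevation_deg, drag_k=5e-5, g=9.81, dt=0.01):
    """Integrate a simplified 2-D trajectory with quadratic air drag (Euler steps)
    and return the horizontal range in metres. drag_k is a hypothetical drag constant."""
    vx = speed * math.cos(math.radians(elevation_deg))
    vy = speed * math.sin(math.radians(elevation_deg))
    x = y = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        ax = -drag_k * v * vx          # drag opposes the motion
        ay = -g - drag_k * v * vy
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x

# One entry of a hypothetical firing table: range for 500 m/s muzzle velocity at 45 degrees.
print(f"range: {trajectory_range(500.0, 45.0):.0f} m")
```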
In the 1930s the computation of ballistic tables was a task for female computers. A team of 100 "computer girls" could produce a complete table in about one month. Every modification to an existing firing weapon required new tables, and the task of producing new tables was never-ending. The BRL was one of the first customers to acquire a Bush differential analyzer but still could not cope with the work, a situation described in a memo of 1942.
The BRL responded by commissioning and financing the development of the ENIAC at the Moore School of Electrical Engineering at the University of Pennsylvania. Support came also from an initially independent but increasingly important contract with the School to develop a "Moving Target Indicator" for military radar that introduced new ideas on the use of vacuum tubes in computers. The mathematician John von Neumann, who became instrumental in the development of the EDVAC, was a consultant to the Manhattan Project to build a nuclear bomb.
The ENIAC and the EDVAC remained military secrets until February 1946, when the army held a demonstration press conference. It did not reveal to what extent its computers had assisted the nuclear bomb project, but one reporter quoted a source as having said that the ENIAC had been used on "a very difficult wartime problem" that would have taken 100 human computers a year to solve; the ENIAC had completed it in two hours. (Shurkin, 1984)
It would be wrong to claim that computers contributed significantly to the events of World War II. Their development took far longer than most military departments had hoped, and the first real use of computer power did not occur until after the war. In Germany the development time for computers was judged more realistically. But the German military were convinced that the war would be over in a year, and years of computer development were judged not worthy of support. As a result Zuse's proposal for an anti-aircraft defence computer was turned down, and Zuse never became involved in military development contracts. (Zuse, 1987)
Science and technology always develop in response to a need. Early attempts to build computing machines were driven by tax administration, life insurance and census requirements. Equally important was the demand for better computing facilities in astronomy and mathematics, and the history of the computer provides an illuminating example of how science and technology have come to depend on each other.
But the low impact of computers on the development of World War II cannot diminish the role of the military in their more recent history. This becomes particularly clear during the years after the war, which saw the USA emerge as the leader in computer technology. When the war ended Germany, Britain and the USSR had comparable expertise in computer design. Britain had gone down the path of strictly secret military development and set up project Colossus, an electronic computer for the decoding of German coded messages. Although several machines were built and operated during the war, the project remained a military secret into the 1970s, and British industry did not receive the support and incentive to develop computers for civilian use.
A similar development occurred in the USSR. The successful launch of Sputnik, the Earth's first artificial satellite, in 1957 testified to the advanced state of Soviet computer design; but here, as in Britain, computers remained military devices, and a Soviet computer industry aimed at civilian applications remained crippled.
The close connection between the military, academia and industry typical of the economy of the USA was engineered by Vannevar Bush (the inventor of the differential analyzer), who became director of the Office of Scientific Research and Development in 1941. Rather than building a huge military research establishment, Bush used his budget to outsource military research to university departments. With his large financial resources he could steer academic institutions into research directions of military interest, influence the appointment of key scientists to key institutions and establish closer links between universities and the military-industrial complex.
After World War II Bush promoted the establishment of a National Research Foundation that would be at arm's length from the military but continue "by contract and otherwise [to] support long-range research on military matters." Today the National Science Foundation is the major non-military research support agency, but military involvement in academic research is still strong in several areas. Thus, the Office of Naval Research dominated the development of oceanography after World War II (Hamblin, 2005) and is still one of the key financial sources for oceanographic research in the USA.
The launch of Sputnik led to the establishment of the Advanced Research Projects Agency (ARPA), which received a 7 million dollar grant to promote the use of computers in military applications. By 1969 it had developed ARPANET, a system to increase the power of computers by linking them together through a network. This was the beginning of the internet.
In Germany the only military forces after the war were those of the occupying powers. This ruled out any military support for the German computer industry established by Zuse. His company grew to 1200 employees before it was taken over by Siemens and later became part of Fujitsu-Siemens computers.
Japanese scientists had been involved in computer research already before World War II, but after the war Japan's industry was not capable of competing with the designs developed in the USA, and no Japanese company built a computer with vacuum tube technology. Yoshiro Nakamats at the Imperial University in Tokyo invented the floppy disk in 1950, but the sales license was given to IBM.
After the invention of the transistor at Bell Laboratories Japan's industry put the transistor to use in the leisure electronics market. In 1977 the playing card manufacturer Nintendo introduced its first home video game console. Today Japan is the undisputed leader in electronic gadgets from games to robots. The question of how Japan could develop into an industrial power that dominates a section of the electronics industry cannot be answered without a closer look at the history of Japanese civilization and society.
A suitable point in time to begin a discussion of Japanese civilization is the unification of the country under the rule of a tenno ("emperor of heaven"). This occurred somewhere between 266 and 413. The dating is suggested by the absence of any records of visits from Japan in Chinese government documents. Such visits, exchanges of diplomatic missions and occasional major presents had been recorded in previous centuries, when Japan consisted of over 100 states. The lack of recorded contact for a period of 150 years suggests that Japan went through turmoil that made contact with the outside world impossible.
From the 5th century onward Japan was always a powerful regional force. During periods of strength the tenno received tribute from states in Korea; during periods of weakness he sometimes tried (unsuccessfully) to find strength in the role of a regional representative of the Chinese court. Chinese influence was pervasive in all aspects of Japanese life, from the arts to philosophy, technology, law and administration, with often only minor adaptations to local conditions. The Japanese emperor had the dual role of monarch and high priest, and the government was structured into parallel agencies, the Dajokan (Council of State) and the Jingikan (Office of Deities).
Buddhism arrived from China and became the major religion. During the 8th century the Buddhist monasteries of Nara amassed great wealth, and the monks began to gain significant political power. This led to reactions from the feudal aristocracy, and Buddhism was forbidden to interfere in politics but continued to receive support for its religious affairs. Certain aristocratic families became quite powerful in the process, and exemption from taxes and other privileges established them in a position where the emperor could not always exercise his power unhindered.
The title shogun (sei-i tai shogun, "barbarian-quelling generalissimo") originated in the 8th century, when the Japanese emperor commissioned military commanders to subdue the Ainu people of northern Japan. Legally the shogun was under the command of the emperor, and his authority extended only to the control of the military and decisions of warfare. From the late 12th century the feudal Japanese society could not maintain a centre of power outside the military, and the shogun became the real ruler, reducing the emperor to a symbolic head of state.
Although the shogun's rule was never acknowledged in legal documents, shoguns controlled the military, the administration and the judiciary. They appointed military governors (shugo) to every province. The position of shogun became hereditary, and shoguns were often described as military dictators. Several shoguns were sponsors of architecture, literature and the arts. The system of shogun government lasted for 675 years, from 1192 until 1867.
When the first European vessels turned up in Asian waters in the 16th century Japan's rulers grew wary of colonial invasion, and in the 17th century the shogun Tokugawa Iemitsu took measures to close the country. Shortly after 1630 Christianity was banned, and Japanese were forbidden to travel abroad. As a result of these and other measures Japan never became a colony. But they were imposed through a stern and dictatorial regime, which led to a revival of Confucian ideas based on the teachings of Mencius in the Kogaku ("Study of Antiquity") School.
The policy of closure to the outside world was successful in the sense that it allowed Japan to maintain its political independence, but it also cemented the structure of its feudal society. Towards the end of the 19th century Japan was an ancient feudal empire in a world dominated by strong capitalist economies. Some aristocratic families realized that to maintain Japan's independence they had to strengthen the country's economy.
In 1853 the USA sent Commodore Matthew C. Perry with a fleet of gunships to Japan to force the end of the ban on foreigners that had existed since 1633 and to impose negotiations on trade relations. This posed the question: Should Japan open itself to foreigners, as demanded by the USA, and run the risk of becoming a colony like India, Indonesia and others, or should it keep its borders closed but adopt European technology and science? Tokugawa Nariaki, the head of the Mito fiefdom, favoured the second option. Not being part of the shogun government, he concentrated on reform in his own fiefdom, reorganizing its finance and administration, instituting a public works programme, setting up an iron works and a shipyard, and introducing European military organization.
Nariaki's activities became a model for the rest of the country, but the shogun government saw them as a threat. This led to a situation where the movement for modernization rallied around the idea of replacing the shogun government by restoring the power of the emperor. The Meiji Restoration of 1868 ended the shogunate and restored the emperor as the central power; it was Japan's bourgeois revolution. Capitalism in Japan therefore began with the coronation of an emperor.
Capitalism in Japan developed in response to outside pressure, before an industrial working class was established in the country. The unique character of the Japanese transition from feudalism to capitalism led to a unique brand of capitalism characterized by lifetime employment and loyalty of workers to their employers bordering on devotion: The feudal landlord, who owned the land with all its villages including their inhabitants, became the new industrialist and turned his peasants into his factory workers. This system survived until the end of the 20th century, when the necessities of modern capitalism such as mass dismissals required its modification.
As a latecomer to capitalism Japan was in a situation similar to that of Germany: Its share of the colonial land distribution was extremely small. To rectify the situation Japan resorted to the same means as Germany: It turned to fascism. Like Germany it was defeated, but in a very different way.
The war in Europe had seen several war crimes; Hitler had ordered the bombing of Coventry, Churchill the bombing of Dresden. But Germany was a fascist country, and no one in Germany could point to the British war crimes as an excuse for the crimes of German fascism.
The war in the Pacific ended with a different balance sheet. Japan had already been defeated and was close to surrender when the USA dropped two experimental nuclear bombs on the country. What could have been accepted as a just defeat was thus turned into an outcry against the inhumanity of a barbaric adversary. This explains why to this day Japan has not been able to face its own war crimes history. While a German chancellor has publicly apologized and acknowledged Germany's historical guilt, Japan's ruling elite still teaches a distorted history in the country's schools and refuses to acknowledge the crimes of Japan's fascist period.
Some other traditions are also still alive and contribute to a positive future for the country. In 1956 IBM requested permission to set up a wholly owned manufacturing subsidiary in Japan with the right to return royalty payments and profits to its parent company. The Japanese government denied the request. IBM persevered with its demand. In 1960 it was allowed to establish its desired subsidiary, but repatriation of royalties back to its parent was restricted to ten percent, and in return IBM had to license its patents to all interested Japanese companies for a 5-year period at a reduced rate.
Despite these protective measures Japanese companies are still struggling to catch up with the development of computer technology in the USA. But Japan's computer industry found its own area where it became a world leader. Modern entertainment and leisure electronics is driven more by Japanese inventions than by anything else, and this has led to a growing penetration of the movie and music industry of the USA by Japanese capital.
The most remarkable development in the context of science, civilization and society is the gradual influence of elements of Japanese culture on the European/American civilization. Japan's boom in electronic entertainment products has expanded the country's comic book tradition into animated movies that begin to surpass the animated features from Hollywood. The Oscar won by the movie Sen to Chihiro no Kamikakushi ("Spirited Away") in 2003 documents this trend. Its creator Hayao Miyazaki based his story entirely in the Japanese cultural context, yet cinemas in Europe and other western countries are now running Hayao Miyazaki festivals. Hopefully this is one step on the way to multiculturalism at a global level.
Campbell-Kelly, M. and W. Aspray (1996) Computer: A History of the Information Machine. Basic Books, New York.
Hamblin, J. D. (2005) Oceanographers and the Cold War: Disciples of Marine Science. University of Washington Press, Seattle.
Lévy, P. (1995) The Invention of the Computer. In: M. Serres (editor): A History of Scientific Thought, Elements of a History of Science. Blackwell, Oxford, 636 - 663. (Translation of Éléments d'Histoire des Sciences, Bordas, Paris, 1989)
Shurkin, J. (1984) Engines of the Mind. W. W. Norton & Company, New York.
Zuse, H. (2004) The Life and Work of Konrad Zuse. EPE Online, http://www.epemag.com/zuse/ (accessed 8 September 2004)
Zuse, K. (1987) Computer Design - Past, Present, Future. Talk given in Lund (Sweden) on 2nd October. Quoted from Lee, J. A. N. (1994) Konrad Zuse. http://ei.cs.vt.edu/~history/Zuse.html (accessed 7 September 2004)