Essay for the subject "Foreign Language"


The Dawn of the Digital Age

The history of computers begins about two thousand years ago with the birth of the abacus. The abacus is a wooden rack holding two horizontal wires with beads strung on them. When these beads are moved around according to "programming" rules memorized by the user, all regular arithmetic problems can be done.

Blaise Pascal is usually credited with building the first digital computer in 1642. It added numbers entered with dials and was made to help his father, a tax collector. Gottfried Wilhelm von Leibniz designed a computer that was built in 1694; it could add and, after some of its parts were rearranged, multiply. Leibniz invented a special stepped-gear mechanism for introducing the addend digits, a mechanism that is still in use.

The prototypes made by Pascal and Leibniz were not used in many places and were even considered a little strange until, a little more than a century later, Charles Xavier Thomas created the first successful mechanical calculator. Thomas's calculator could add, subtract, multiply, and divide. Many improved versions of the desktop calculator followed. By about 1890, the range of improvements included accumulation of partial results, storage and automatic re-entry of past results (memory functions), and printing of the results. These improvements were made mainly for commercial users, not for the needs of science.

While Thomas was developing the desktop calculator, a series of very interesting developments in computers started in Cambridge, England. In 1812, Charles Babbage, a mathematics professor, realized that many long calculations, especially those needed to make mathematical tables, were really a series of predictable actions that were constantly repeated. From this he suspected that it should be possible to perform these actions automatically.

Babbage began to design his automatic mechanical calculating machine, which he called a Difference Engine. By 1822, he had a working model to demonstrate. With financial help from the British government, Babbage started fabrication of the Difference Engine in 1823. It was intended to be steam-powered and fully automatic, including the printing of the resulting tables, and commanded by a fixed instruction program. The Difference Engine, although of limited adaptability and applicability, was a great advance. Babbage continued to work on it for the next ten years, but in 1833 he lost interest in it in favor of what he thought was a better idea: the construction of what would now be called a general-purpose, fully program-controlled, automatic mechanical digital computer. Babbage called his idea an Analytical Engine. The ideas behind this design showed a lot of foresight, although they would not be appreciated until a full century later.

The plans for this engine called for a decimal computer operating on numbers of fifty decimal digits (or words) and having a storage capacity (memory) of one thousand such numbers. The built-in operations were supposed to include everything that a modern general-purpose computer would need. There was even to be a conditional control transfer capability that would allow commands to be executed in any order, not just the order in which they were programmed. The Analytical Engine was to use punched cards (similar to those used in a Jacquard loom), which would be read into the machine from several different reading stations. The machine was supposed to operate automatically, by steam power, and require only one user.

Babbage's computers were never finished. There are various theories as to why he failed. Most blame the lack of precision machining techniques of the time. Another speculation is that Babbage was working on a solution to a problem that few people in 1840 really needed to solve. After Babbage's attempts, there was a temporary loss of interest in automatic digital computers.

Between 1850 and 1900, further advances were made in mathematical physics. It came to be understood that most observable dynamic phenomena could be described by differential equations, meaning that most events occurring in nature can be measured or described by one equation or another. This paved the way for easier means of calculation.

Also, the availability of steam power caused manufacturing, transportation, and commerce to prosper. This led to a period of advanced engineering achievements. The design of railroads and the construction of steamships, textile mills, and bridges required differential calculus to determine such things as the center of gravity, the center of buoyancy, the moment of inertia, and stress distributions. Even the assessment of the power output of a steam engine required mathematical integration. A strong need thus developed for a machine that could rapidly perform many repetitive calculations.
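
To give a feel for the kind of repetitive arithmetic involved, here is a minimal sketch of estimating the work done by expanding steam by numerically integrating pressure over a volume change with the trapezoidal rule. The figures are invented for illustration; only the method, the sort of hand calculation engineers of that era had to repeat endlessly, is the point.

```python
# Trapezoidal-rule sketch of the repetitive arithmetic behind such engineering work:
# the work done by expanding steam is approximately the integral of pressure with
# respect to volume. The numbers below are hypothetical; only the method matters.

def trapezoid(xs, ys):
    """Approximate the integral of y dx from tabulated points."""
    return sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2
               for i in range(len(xs) - 1))

volumes   = [0.10, 0.15, 0.20, 0.25, 0.30]       # cubic metres (hypothetical)
pressures = [8.0e5, 5.6e5, 4.3e5, 3.5e5, 3.0e5]  # pascals (hypothetical)

work = trapezoid(volumes, pressures)  # joules
print(f"Approximate work per stroke: {work:.0f} J")
```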

A step toward automated computing was the development of punched cards. Herman Hollerith and James Powers, both of whom worked for the U.S. Census Bureau, were the first to use punched cards successfully for automatic data processing, in 1890. They developed devices that could read the information punched into the cards automatically, without human help. Because of this, reading errors were reduced dramatically, workflow increased, and stacks of punched cards could be used as an easily accessible memory of almost unlimited size. Furthermore, different problems could be stored on different stacks of cards and accessed when needed.

These advantages were seen by commercial companies and soon led to the development of improved punched-card machines created by International Business Machines (IBM), Remington, Burroughs, and other corporations. These machines used electromechanical devices in which electrical power provided mechanical motion. Such systems included features that could feed in a specified number of cards automatically. They could also add, multiply, and sort, and they could feed out cards with punched results.

Compared to today's machines, these computers were slow, usually processing fifty to two hundred and twenty cards per minute, each card holding about eighty decimal numbers or characters. At the time, however, punched cards were a huge step forward. They provided a means of input/output and memory storage on a huge scale. For more than fifty years after their first use, punched-card machines did most of the world's business computing and a considerable amount of the computing work in science.

The start of World War II produced a large need for computer capacity, especially for the military. New weapons were made for which trajectory tables and other essential data were needed. In 1942, John P. Eckert, John W. Mauchly, and their associates at the Moore School of Electrical Engineering of the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC (Electronic Numerical Integrator And Calculator).

The size of ENIAC's numerical "word" was ten decimal digits, and it could multiply two such numbers at a rate of three hundred per second by finding the value of each product from a multiplication table stored in its memory. ENIAC was therefore about one thousand times faster than the previous generation of relay computers.
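
A rough way to picture this table-lookup approach is digit-by-digit long multiplication against a precomputed table of single-digit products. This is only a sketch of the idea described above, not a model of ENIAC's actual ring-counter circuitry.

```python
# Sketch of multiplication by table lookup: every single-digit product is
# precomputed and stored, and a multi-digit product is assembled from table
# entries plus decimal shifts and additions.

PRODUCT_TABLE = {(a, b): a * b for a in range(10) for b in range(10)}

def multiply_by_table(x: int, y: int) -> int:
    """Multiply two non-negative integers using only table lookups,
    decimal shifts, and additions."""
    result = 0
    for i, xd in enumerate(reversed(str(x))):      # digits of x, least significant first
        for j, yd in enumerate(reversed(str(y))):  # digits of y, least significant first
            partial = PRODUCT_TABLE[(int(xd), int(yd))]
            result += partial * 10 ** (i + j)      # shift into the right decimal place
    return result

assert multiply_by_table(1234567890, 987654321) == 1234567890 * 987654321
```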

ENIAC used eighteen thousand vacuum tubes, occupied about one thousand eight hundred square feet of floor space, and consumed about one hundred eighty thousand watts of electrical power. It had punched-card input/output, one multiplier, one divider/square-rooter, and twenty adders using decimal ring counters, which served as adders and also as quick-access read-write register storage. The executable instructions making up a program were embodied in the separate "units" of ENIAC, which were plugged together to form a "route" for the flow of information.

These connections had to be redone after each computation, together with presetting function tables and switches. This "wire your own" technique was inconvenient, for obvious reasons, and ENIAC could be considered programmable only with some latitude. However, it was efficient in handling the particular programs for which it had been designed.

ENIAC is commonly accepted as the first successful high-speed electronic digital computer (EDC) and was used from 1946 to 1955. In 1971, however, a controversy developed over the patentability of ENIAC's basic digital concepts. The claim was made that another physicist, John V. Atanasoff, had already used the same ideas in a simpler vacuum-tube device. In 1939, John Atanasoff and Clifford Berry of Iowa State College had completed their prototype of the first digital computer; it could store data and perform addition and subtraction using binary code. They had to abandon their efforts toward a next-generation machine because of the onset of World War II. In 1973 the courts found in favor of the company backing the Atanasoff claim.

Fascinated by the success of ENIAC, the mathematician John von Neumann undertook, in 1945, an abstract study of computation that showed that a computer should have a very simple, fixed physical structure. Furthermore, he concluded, it should be able to execute any kind of computation by means of a properly programmed control, without the need for any change in the unit itself.

Von Neumann contributed a new awareness of how practical and fast computers should be organized and built. These ideas, usually referred to as the stored-program technique, became essential for future generations of high-speed digital computers and were universally adopted.

The stored-program technique involves many features of computer design and function. In combination, these features make high-speed operation attainable. Consider what one thousand operations per second means: at that rate, a single hour of running already consumes some 3.6 million instructions, so if each instruction in a job program were used only once, in consecutive order, no human programmer could generate enough instructions to keep the computer busy.

Arrangements must be made for parts of the job program (called subroutines) to be used repeatedly in a manner that depends on the computation variables. It would also be helpful if instructions could be changed when needed during a computation to make them behave differently. Von Neumann met these two needs by developing a special type of machine instruction, called a conditional control transfer, which allowed the program sequence to be stopped and restarted at any point, and by storing all instruction programs together with data in the same memory unit, so that, when needed, instructions could be arithmetically modified in the same way as data.
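
To make the stored-program idea concrete, here is a minimal sketch of a machine whose memory holds instructions and data side by side, with a conditional jump serving as the conditional control transfer. The instruction set (LOAD, ADD, SUB, STORE, JZ, JMP, HALT) is invented purely for illustration and does not correspond to any historical machine.

```python
# Minimal stored-program sketch: instructions and data share one memory, and a
# conditional jump ("JZ") transfers control, which makes loops and reusable
# subroutines possible. Hypothetical instruction set, for illustration only.

def run(memory):
    acc = 0   # accumulator
    pc = 0    # program counter
    while True:
        op, *args = memory[pc]
        if op == "LOAD":     # copy a memory cell into the accumulator
            acc = memory[args[0]]
        elif op == "ADD":    # add a memory cell to the accumulator
            acc += memory[args[0]]
        elif op == "SUB":    # subtract a memory cell from the accumulator
            acc -= memory[args[0]]
        elif op == "STORE":  # write the accumulator back into memory
            memory[args[0]] = acc
        elif op == "JZ":     # conditional control transfer: jump if accumulator is zero
            if acc == 0:
                pc = args[0]
                continue
        elif op == "JMP":    # unconditional jump
            pc = args[0]
            continue
        elif op == "HALT":
            return memory
        pc += 1

# Cells 0-9 hold the program; cells 10-12 hold data: a counter, a running
# total, and the constant 1. The loop computes counter + (counter-1) + ... + 1.
memory = [
    ("LOAD", 10), ("JZ", 9),                   # 0-1: stop when the counter reaches zero
    ("LOAD", 11), ("ADD", 10), ("STORE", 11),  # 2-4: total += counter
    ("LOAD", 10), ("SUB", 12), ("STORE", 10),  # 5-7: counter -= 1
    ("JMP", 0), ("HALT",),                     # 8-9: loop back / halt
    5, 0, 1,                                   # 10-12: counter, total, constant 1
]

run(memory)
assert memory[11] == 5 + 4 + 3 + 2 + 1
```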

As a result of these techniques, computing and programming became much faster, more flexible, and more efficient. Regularly used subroutines did not have to be reprogrammed for each new program, but could be kept in “libraries” and read into memory only when needed. Thus, much of a given program could be assembled from the subroutine library.

The multi-purpose computer memory became the assembly place in which all parts of a long computation were kept, worked on piece by piece, and put together to form the final results. The computer control served only as an "errand runner" for the overall process. As soon as the advantages of these techniques became clear, they became standard practice.

The first generation of modern programmed electronic computers to take advantage of these improvements appeared in 1947. This group included computers using random-access memory (RAM), which is memory designed to give almost constant access time to any particular piece of information. These machines had punched-card or punched-tape input/output devices and RAMs of one-thousand-word capacity. Access times were about half a microsecond, and some machines could perform multiplications in two to four microseconds. Physically, they were much smaller than ENIAC: some were about the size of a grand piano and used only about two thousand five hundred electron tubes, far fewer than ENIAC required. The first-generation stored-program computers needed a lot of maintenance, reached perhaps seventy to eighty percent reliability of operation, and were used for eight to twelve years. They were usually programmed in machine language (ML), although by the mid-1950s progress had been made in several aspects of advanced programming. This group of computers included EDVAC and UNIVAC, the first commercially available computers.

Early in the 1950s, two important engineering discoveries changed the image of the electronic computer field from one of fast but unreliable hardware to one of relatively high reliability and even greater capability: the magnetic-core memory and the transistor circuit element. These technical discoveries quickly found their way into new models of digital computers. RAM capacities increased from eight thousand to sixty-four thousand words in commercially available machines by the 1960s, with access times of two to three microseconds. These machines were very expensive to purchase or rent and were particularly expensive to operate because of the cost of expanding the programming. Such computers were mostly found in large computer centers operated by industry, government, and private laboratories that could staff them with many programmers and support personnel. This situation led to modes of operation that enabled the sharing of these machines.

One such mode was batch processing, in which problems were prepared and then held ready for computation on a relatively cheap storage medium, usually magnetic drums, magnetic-disk packs, or magnetic tapes. When the computer finished with one problem, it "dumped" the whole problem (program and results) onto one of these peripheral storage units and started on a new one.

Another mode for accessing these fast, powerful machines was called time-sharing. In time-sharing, the computer processes many jobs in such rapid succession that each job runs as if the other jobs did not exist, thus keeping each "customer" satisfied. Such operating modes required elaborate executive programs to attend to the administration of the various tasks.
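
As a rough sketch of the time-sharing idea, the toy round-robin scheduler below gives each job a short slice of work in turn, so all jobs appear to progress at once. The job names and slice size are invented for illustration; a real executive program of the era also had to handle input/output, accounting, and memory protection.

```python
# Toy round-robin time-sharing sketch: each "job" gets a small slice of work in
# turn, so to each user the machine appears dedicated to that user's job alone.
# This illustrates only the scheduling idea, not any historical executive program.

from collections import deque

def time_share(jobs, slice_size=3):
    """jobs: dict mapping a job name to the units of work it still needs."""
    ready = deque(jobs.items())
    while ready:
        name, remaining = ready.popleft()
        work_done = min(slice_size, remaining)
        print(f"{name}: ran {work_done} units, {remaining - work_done} left")
        if remaining > work_done:
            ready.append((name, remaining - work_done))  # go to the back of the queue

time_share({"payroll": 7, "inventory": 4, "billing": 5})
```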

In the 1960s, efforts to design and develop the fastest possible computer with the greatest capacity reached a turning point with the LARC machine, built for the Livermore Radiation Laboratories of the University of California by the Sperry Rand Corporation, and the Stretch computer built by IBM. The LARC had a core memory of ninety-eight thousand words and multiplied in ten microseconds. Stretch was provided with several ranks of memory, with slower access for the ranks of greater capacity; the fastest access time was less than one microsecond, and the total capacity was in the vicinity of one hundred million words.

During this period, the major computer manufacturers began to offer a range of capabilities and prices, as well as extras such as consoles, card feeders, page printers, cathode-ray-tube displays, and graphing devices. These were widely used in businesses for accounting, payroll, inventory control, ordering supplies, and billing purposes.

CPUs for these uses did not have to be very fast arithmetically and were used mainly to access large numbers of records on file and keep them up to date. By far, most computer systems were sold for simpler uses, such as for hospitals to keep track of patient records, medications, and treatments given. They were also used in libraries, such as the National Medical Library retrieval system, and in the Chemical Abstracts System, where computer records on file now cover nearly all known chemical compounds.

The trend during the 1970s was, to some extent, away from very powerful single-purpose computers and toward computers with a larger range of applications and toward cheaper computer systems. Most continuous-process manufacturing, such as petroleum refining and electrical-power distribution systems, now used smaller computers for controlling and regulating their operations.

In the 1960s, the problems of programming applications were an obstacle to the independence of medium-sized on-site computers, but gains in applications programming language technologies removed these obstacles. Applications languages became available for controlling a great range of manufacturing processes and for operating machine tools with computers. Moreover, a new revolution in computer hardware was under way, involving the shrinking of computer-logic circuitry components by what are called large-scale integration (LSI) techniques. In the 1950s it was realized that "scaling down" the size of electronic digital computer circuits and parts would increase speed and efficiency and thereby improve performance, if only a way to do this could be found. Around 1960, photoprinting of conductive circuit boards to eliminate wiring became more developed. It then became possible to build resistors and capacitors into the circuitry by the same process. In the 1970s, vacuum deposition of transistors became the norm, and entire assemblies, with adders, shifting registers, and counters, became available on tiny "chips."

In the 1980s, very-large-scale integration (VLSI), in which hundreds of thousands of transistors were placed on a single chip, became more and more common. Many companies introduced programmable minicomputers supplied with software packages. The "shrinking" trend continued with the introduction of personal computers (PCs), programmable machines small enough and inexpensive enough to be purchased and used by individuals.

Many companies, such as Apple Computer and Radio Shack, introduced very successful PCs in the 1970s, encouraged in part by a fad in computer video games. In the manufacturing of semiconductor chips, the Intel and Motorola Corporations were very competitive, although Japanese firms were making strong economic advances, especially in the area of memory chips. By the late 1980s, some personal computers were run by microprocessors that could process about four million instructions per second.

Microprocessors equipped with read-only memory (ROM) now performed an increasing number of process-control, testing, monitoring, and diagnostic functions, such as automobile ignition control, automobile-engine diagnosis, and production-line inspection.

Cray Research and Control Data Inc. dominated the field of supercomputers through the 1970s and 1980s. In the early 1980s, however, the Japanese government announced a gigantic plan to design and build a new generation of supercomputers. This new generation, the so-called "fifth" generation, would use new technologies in very-large-scale integration, along with new programming languages, and would be capable of amazing feats in the area of artificial intelligence, such as voice recognition.

Progress in the area of software has not matched the great advances in hardware. Software has become the major cost of many systems because programming productivity has not increased very quickly. New programming techniques, such as object-oriented programming, have been developed to help relieve this problem. Despite difficulties with software, however, the cost per calculation of computers is rapidly decreasing, and their convenience and efficiency are expected to increase in the near future.

The computer field continues to experience huge growth. Computer networking, computer mail, and electronic publishing are just a few of the applications that have grown in recent years. Advances in technology continue to produce cheaper and more powerful computers, offering the promise that in the near future computers or terminals will reside in most, if not all, homes, offices, and schools.
