A computing machine is a machine that can carry out mathematical operations. Some are built from mechanical devices, such as hand-cranked calculators; others are built from electronic components, such as the electronic computer.
Interpretation
Computer (calculating machine) is the general term, usually used in academic or formal contexts. In everyday usage, "computer" most often refers to a personal computer. A computer is an electronic device that automatically processes and handles data and information of many kinds according to the instructions it is given. It is composed of several parts, such as the CPU, motherboard, memory, power supply, graphics card, and so on. As a device that receives, processes, and delivers data, it usually consists of input and output devices, memory, an arithmetic and logic unit, and a controller. Computers fall into three types: analog, digital, and hybrid.
Categories
Computers can be classified in many ways: by type, by mode of operation, by the devices they are built from, by operating principle, or by field of application. By the way data is represented, computers divide into three kinds: digital computers, analog computers, and hybrid computers. By the devices digital computers are built from, there have been mechanical computers and electronic computers, and optical computers, quantum computers, biological computers, neural computers, and others are now being studied. By scale or capability, computer systems can be divided into supercomputers and large, medium-sized, small, and micro computers.
Introduction
The computer evolved from the early electric calculator. In 1946 the world saw the first electronic digital computer, ENIAC, built to calculate artillery trajectories. It was made at the Moore School of Electrical Engineering of the University of Pennsylvania, but it was huge: it covered more than 170 square meters, weighed about 30 tons, and consumed roughly 100 kilowatts of electricity. Such a computer was obviously costly and inconvenient to use. In 1956 the transistor computer was born; this was the second generation of computers, which fit into a few large cabinets and ran much faster. The third generation, built from integrated circuits, appeared in 1959. (The first computers associated with John von Neumann, whose computing power was then roughly that of a modern calculator, were the size of three warehouses; later machines evolved from them.) The period from the 1970s onward is the latest stage of computer development. By 1976, the Cray-1, built from large-scale and very-large-scale integrated circuits, had brought computers into the fourth generation. The invention of very-large-scale integration lets computers keep moving, generation by generation, toward machines that are smaller, lower in power, more intelligent, and more systematic. In the 1990s computers moved in the "intelligent" direction, toward machines resembling the human brain that can think, learn, remember, and communicate over networks. The 21st century brings laptop computers, miniaturized and specialized, performing more than 100 million operations per second; they are easy to operate and inexpensive, can take over part of people's mental work, and in some areas even extend human intelligence. For this reason today's microcomputer is vividly called an "electronic brain" in Chinese. The world's first personal computer was introduced by IBM in 1981.
Principle and structure
Whatever their type, computers are composed of hardware and software.
Hardware
The hardware of a computer system is its visible, tangible part: the electronic circuits and physical devices, such as the central processing unit (CPU), memory, external devices (input and output devices, i.e. I/O devices), and the buses that connect them. ① Memory. Its main function is to store programs and data; the program is the basis of the computer's operation, and the data are the objects it operates on. A memory unit consists of a storage array, an address decoder, read/write control circuits, an address bus, and a data bus. Random-access memory whose instructions and data the central processor can reach directly serves as main memory; disks, tapes, optical discs, and other large-capacity devices serve as external (secondary) storage. Main memory, external memory, and the corresponding software together form the computer's storage system. ② The central processor's main function is to execute, one by one, the operations specified by the program held in memory. Its main components are the data registers, instruction register, instruction decoder, arithmetic logic unit, operation controller, program counter (instruction address counter), and address register. ③ External devices are the bridge between users and the machine. Input devices take the data, characters, text, graphics, programs, and other information that the user wants processed and convert them into a coded form the computer can accept and store. Output devices deliver the results of processing in the form the user needs, such as screen display, printed text, graphics, charts, speech, or sound. The input/output interface is a buffer between an external device and the central processing unit, responsible for matching electrical characteristics and information formats.
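To make the stored-program idea in this paragraph concrete, here is a minimal sketch of a toy machine whose main memory holds both instructions and data and whose "CPU" repeatedly fetches, decodes, and executes instructions. The instruction format and opcodes are invented purely for illustration; no real machine uses this encoding.

```python
# Toy stored-program machine: memory holds instructions and data side by side.
def run(memory):
    """memory: flat list of ints; each instruction is an (opcode, operand) pair."""
    acc = 0          # accumulator register
    pc = 0           # program counter (instruction address counter)
    while True:
        opcode, operand = memory[pc], memory[pc + 1]   # fetch
        pc += 2
        if opcode == 1:      # LOAD addr   -> acc = memory[addr]
            acc = memory[operand]
        elif opcode == 2:    # ADD addr    -> acc += memory[addr]
            acc += memory[operand]
        elif opcode == 3:    # STORE addr  -> memory[addr] = acc
            memory[operand] = acc
        elif opcode == 0:    # HALT
            return memory

# Program: add the numbers stored at addresses 8 and 9, put the sum at address 10.
ram = [1, 8,    # LOAD  [8]
       2, 9,    # ADD   [9]
       3, 10,   # STORE [10]
       0, 0,    # HALT
       5, 7, 0] # data area
print(run(ram)[10])   # -> 12
```

The fetch-decode-execute loop, the program counter, and the single memory shared by code and data are exactly the roles the paragraph assigns to the CPU and main memory.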
Software
Software is the general name for the set of programs that make the computer hardware system work smoothly and effectively. A program is always stored on and represented by some physical medium, such as a disk, tape, paper tape, or punched cards, but "software" does not mean the physical medium; it means the programs themselves, which cannot be seen or touched. Reliable hardware is like a person's strong physique; effective software is like a person's intelligent thinking. A computer software system can be divided into two parts, system software and application software. System software is responsible for managing, scheduling, monitoring, and serving the resources of the whole computer system. Application software refers to the various programs that users in different fields develop for their own needs. A computer software system includes: ① the operating system, the core of the system software, responsible for managing, controlling, and monitoring the computer system's hardware and software resources; ② the database management system, responsible for managing and sharing all the files, information, and data in the computer system; ③ the language-processing (compilation) system, responsible for translating source programs written by users in a high-level language into machine language that the machine can understand and execute; ④ the network system, responsible for organizing and managing network resources so that several otherwise independent computers can share resources and communicate with one another; ⑤ the standard program library, a collection of programs written in a standard format, including routines for evaluating elementary functions, solving systems of linear equations and ordinary differential equations, and performing numerical integration. To exploit the potential of a single processor fully, people turned to parallel processing techniques. The first step (1952) was to design parallelism into the arithmetic logic unit itself; later came multiple functional units, that is, independent components within the central processor that can work at the same time. After some thirty years of development, single-processor computer systems reached a very high level of performance, and vector supercomputer technology is the crystallization of that period.
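Since the paragraph above mentions a language-processing (compilation) system that translates high-level source into machine instructions, here is a hedged toy sketch of that idea: a tiny "compiler" that turns an arithmetic expression into instructions for an invented stack machine, then executes them. The opcodes, the stack machine, and the restriction to + and * are assumptions made for the example; real compilers are vastly more elaborate.

```python
import ast

def compile_expr(source):
    """Compile an arithmetic expression such as '2 + 3 * 4' to toy stack-machine code."""
    code = []
    def emit(node):
        if isinstance(node, ast.BinOp):
            emit(node.left)                 # code for the left operand first
            emit(node.right)                # then the right operand
            code.append("ADD" if isinstance(node.op, ast.Add) else "MUL")
        elif isinstance(node, ast.Constant):
            code.append(("PUSH", node.value))
    emit(ast.parse(source, mode="eval").body)
    return code

def execute(code):
    stack = []
    for op in code:
        if isinstance(op, tuple):           # ("PUSH", n)
            stack.append(op[1])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

program = compile_expr("2 + 3 * 4")
print(program)           # [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), 'MUL', 'ADD']
print(execute(program))  # 14
```

The point is only the division of labor: the programmer writes one readable expression, and the language-processing system produces the sequence of primitive instructions the machine actually runs.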
History
The advent of ENIAC was of great significance: it marked the start of the computer era. Over the following forty years computer technology developed extremely rapidly; no other discipline in the history of science and technology has matched the pace of the electronic computer. The modern computer stage (that is, the traditional mainframe stage) refers to the use of advanced electronic technology in place of obsolete mechanical or relay technology. Over more than half a century of modern computer development, the outstanding representatives of this period were the British scientist Alan Turing and the Hungarian-American scientist John von Neumann. Turing's main contributions to the modern computer were establishing the Turing machine model and developing computability theory, and proposing the Turing test as a definition of machine intelligence. Von Neumann's main contribution was establishing the basic structure of the modern computer, the von Neumann architecture, whose characteristics can be summarized as follows: (1) a single processing unit carries out the computation, storage, and communication work; (2) the storage units are organized linearly with a fixed length; (3) the units of the storage space are directly addressable; (4) machine-language instructions are used, each completing a simple operation specified by its operation code; (5) the order of computation is centrally controlled. The generations of computers built on these principles are usually divided according to the electronic devices used: vacuum tubes, transistors, integrated circuits, and very-large-scale integrated circuits, four generations in all.
Development process
In 1614 the Scot John Napier (1550-1617) published a paper describing his invention of a sophisticated device for carrying out the four basic arithmetic operations and square roots. In 1623 Wilhelm Schickard (1592-1635) built a machine that could add and subtract numbers of up to six digits and announced the result by ringing a bell; it was operated by turning gears. In 1625 William Oughtred (1575-1660) invented the slide rule. In 1642-1643 Blaise Pascal, to help his tax-collector father, invented a gear-driven adding machine called the Pascaline, the first mechanical adder. In 1666 Samuel Morland in England invented a mechanical counting machine that could add and subtract. In 1673 Gottfried Leibniz built a calculator with a stepped cylindrical drum, called the Stepped Reckoner, which could multiply by repeated automatic addition. In 1694 the German mathematician Gottfried Leibniz improved Pascal's Pascaline, creating a machine that could multiply; it still operated with gears and dials. In 1773 Philipp Matthäus Hahn built and sold a small number of calculating machines accurate to 12 digits. In 1775 the third Earl of Stanhope invented a multiplying calculator similar to Leibniz's. In 1786 J. H. Müller designed a difference engine, but unfortunately there was no funding to build it. In 1801 Joseph-Marie Jacquard's loom used punched cards connected in sequence to control the weaving pattern. In 1854 George Boole published "An Investigation of the Laws of Thought", on symbols and logical reasoning; it later became a basic concept of computer design. In 1882 William S. Burroughs quit his job as a bank clerk and concentrated on inventing an adding machine. In 1889 Herman Hollerith's electric tabulating machine performed outstandingly in a competition and was used in the 1890 census. Hollerith applied the Jacquard loom's concept to computation: he stored data on punched cards and then fed them into the machine to compile the results. A population survey that would otherwise have taken a decade was finished in just six weeks. In 1893 the first four-function calculator was invented. In 1895 Guglielmo Marconi transmitted a radio signal. In 1896 Hollerith founded the Tabulating Machine Company. In 1901 the keypunch appeared; it changed little over the following half century. In 1904 John A. Fleming obtained a patent for the vacuum diode, laying the foundation for radio communication. In 1906 Lee De Forest added a third electrode to Fleming's diode, creating the triode vacuum tube. In 1907 recorded music was broadcast in New York, the first official station broadcast. In 1908 the British scientist Campbell Swinton described an electronic scanning method and pointed toward making television with the cathode ray tube. In 1911 Hollerith's Tabulating Machine Company merged with two other companies to form the Computing-Tabulating-Recording Company (CTR), which in 1924 was renamed International Business Machines Corporation (IBM). Also in 1911 the Dutch physicist Kamerlingh Onnes discovered superconductivity at Leiden University. In 1931 Vannevar Bush invented a machine for solving differential equations, capable of handling the complex differential equations that gave mathematicians and scientists headaches. In 1935 IBM introduced the IBM 601, a punched-card machine with an arithmetic unit that could compute a multiplication within one second.
It played a great role in both scientific and commercial calculation, and about 1,500 were made. In 1937 Alan Turing proposed the concept of a "universal machine" able to execute any algorithm, forming the basic notion of "computability". Turing's concept was superior to other inventions of the same kind because it rested on the idea of symbol processing. In November 1939 John Vincent Atanasoff and John Berry built a 16-bit adder, the first calculating device to use vacuum tubes. In 1939 Zuse and Schreyer began building the "V2" (later called the Z2); this machine reused the Z1's mechanical storage and added a new arithmetic unit built from relay logic. The project was interrupted for a year when Zuse was called up for military service. In 1939-40 Schreyer completed a 10-bit adder using vacuum tubes, with neon lamps for memory. In January 1940, at Bell Labs, Samuel Williams and Stibitz completed a machine that could operate on complex numbers, called the Complex Number Calculator and later renamed the Model I Relay Calculator. Its logic units were built from telephone-switch parts: 145 relays and 10 crossbar switches. Numbers were represented in excess-3 BCD. In September of the same year a teletype was installed at a mathematics conference in New Hampshire, connected to the machine in New York. In 1940 Zuse finally finished the Z2; it worked better than the V2 but was not very reliable. In the summer of 1941 Atanasoff and Berry completed a calculator designed to solve systems of simultaneous linear equations, later called the ABC (Atanasoff-Berry Computer); it had a memory of sixty 50-bit words held as capacitors on two rotating drums, and its clock speed was 60 Hz. In February 1941 Zuse completed the "V3" (later called the Z3), the first programmable calculating machine in operation. It used floating-point arithmetic with a 7-bit exponent, a 14-bit mantissa, and a sign bit. The memory could store 64 words and required 1,400 relays; more than 1,200 further relays made up the arithmetic and control units, and its programming, input, and output were the same as the Z1's. In January 1943 Howard H. Aiken completed the ASCC Mark I (Automatic Sequence Controlled Calculator Mark I), also known as the Harvard Mark I. The machine was 51 feet long, weighed 5 tons, and was assembled from 750,000 parts. It had 72 accumulators, each with its own arithmetic unit and a 23-digit register. In December 1943 Tommy Flowers and his team completed the first Colossus, which used 2,400 vacuum tubes for its logic and had five paper-tape loop readers, each able to process 5,000 characters per second. In 1943, under the leadership of John Brainerd, work on ENIAC began, with John Mauchly and J. Presper Eckert responsible for carrying out the plan. In 1946 the Electronic Numerical Integrator and Computer (ENIAC) was completed in the United States. In 1947 the Association for Computing Machinery (ACM) was founded. In 1947 Britain completed the first storage tube. In 1948 the Bell Telephone Company developed the semiconductor transistor. In 1949 Britain completed the Electronic Delay Storage Automatic Calculator (EDSAC). In 1950 the term "automation" was first used, in the automotive industry.
In 1951 the Massachusetts Institute of Technology built magnetic core memory. In 1952 the first "stored-program calculator" was born. In 1952 the first large-scale computer system, the IBM 701, was announced. In 1952 the first symbolic-language translator was successfully created. In 1954 the Bell Telephone Company developed the first semiconductor (transistor) computer. In 1954 the first general-purpose data processor, the IBM 650, was born. In 1955 the first large computer using core memory, the IBM 705, was completed. In 1956 IBM introduced the scientific IBM 704 computer. In 1957 the programming language FORTRAN appeared. In 1959 the first small scientific calculator, the IBM 620, was successfully developed. In 1960 the data processing system IBM 1401 was developed. In 1961 the programming language COBOL appeared. In 1961 the first computer subsystem design was completed by the Massachusetts Institute of Technology. In 1963 the BASIC language appeared. In 1964 the third-generation IBM 360 series of computers was made. In 1965 the American Digital Equipment Corporation introduced the first minicomputer, the PDP-8. In 1969 IBM successfully developed 96-column card equipment and the System/3 computer system. In 1970 the IBM System/370 computer series was made. In 1971 the University of Illinois completed the design of the ILLIAC IV supercomputer. In 1971 Intel Corporation developed the first microprocessor, the 4004. In 1972 microprocessor chips went into mass production and sale. In 1973 the first floppy disk was successfully developed by IBM. In 1975 the Altair 8800 microcomputer appeared. In 1977 Commodore announced the all-in-one PET 2001 microcomputer. In 1977 the TRS-80 microcomputer was born. In 1977 the Apple II microcomputer was born. In 1978 VLSI began to be applied. In 1978 magnetic bubble memory was used for the second time in a commercial computer. In 1979 Sharp announced the first portable microcomputer. In 1982 microcomputers began to spread, entering schools and homes in large numbers. In 1984 the Japanese computer industry set out to develop the "fifth-generation computer", a computer with artificial intelligence. 1984: DNS (the domain name server) was released; more than 1,000 hosts were running on the interconnected network. 1984: Hewlett-Packard released an excellent laser printer, and HP also stayed ahead in inkjet printer technology. January 1984: Apple's Macintosh was released, based on the Motorola 68000 microprocessor and able to address 16 MB. August 1984: MS-DOS 3.0, PC-DOS 3.0, and the IBM AT were released; the AT used the ISA standard and supported large hard drives and 1.2 MB high-density floppy drives. September 1984: Apple released a Macintosh with 512 KB of memory, with no improvement in other areas. End of 1984: Compaq began developing the IDE interface, which could transfer data faster; it was adopted by many of its peers, and the later EIDE extension supported drives up to 528 MB with still faster transfers. 1985: Philips and Sony launched the CD-ROM drive. 1985: the EGA standard became available. March 1985: MS-DOS 3.1 and PC-DOS 3.1, the first DOS versions to provide some support for networking. October 17, 1985: the 80386 DX was launched, with clock frequencies eventually reaching 33 MHz; it could address 1 GB of memory, had more instructions than the 286, executed 6 million instructions per second, and integrated 275,000 transistors. November 1985: Microsoft Windows was released; it was not widely used until version 3.0. It required DOS support and had an interface similar to the Mac's, for which Apple sued; the proceedings did not end until August 1997.
December 1985: MS-DOS 3.2 and PC-DOS 3.2, the first versions to support 3.5-inch disks, though only up to 720 KB; version 3.3 was the first to support 1.44 MB. January 1986: Apple released a higher-performance Macintosh, with four megabytes of memory and a SCSI adapter. September 1986: Amstrad announced the cheap and powerful Amstrad PC 1512, with a CGA graphics adapter, 512 KB of memory, an 8086 processor, and a 20 MB hard drive; it used a mouse and a graphical user interface and was aimed at the home. 1987: the Connection Machine supercomputer was released; using parallel processing, it performed 200 million operations per second. 1987: Microsoft Windows 2.0 was released; it was more successful than the first version, but not much improved. 1987: the British mathematician Michael F. Barnsley found a method of image compression. 1987: the Macintosh II was released, based on the Motorola 68020 processor, clocked at 16 MHz and executing 2.6 million instructions per second, with a SCSI adapter and a color adapter. April 2, 1987: IBM introduced the PS/2 line, originally based on the old XT's 8086 processor and bus and later moving to the 80386; it began the use of 3.5-inch 1.44 MB floppy drives and introduced Micro Channel technology. The series was a great success, shipping 2 million units. 1987: IBM released its VGA technology. 1987: IBM released the 8514/A display adapter of its own design. April 1987: MS-DOS 3.3 and PC-DOS 3.3 were released together with the IBM PS/2, supporting 1.44 MB drives and hard disk partitioning, which allowed one hard disk to be divided into multiple logical drives. April 1987: Microsoft and IBM released the OS/2 operating system, without much success. August 1987: the AdLib sound card, a Canadian company's product, was released. October 1987: Compaq DOS (CPQ-DOS) v3.31 was released, supporting hard disk partitions larger than 32 MB. 1988: development of optical computer input began, using photons instead of electrons to increase processing speed. 1988: the XMS standard was established. 1988: the EISA standard was established. June 6, 1988: the 80386 SX was released to meet the demand for low-cost PCs. July-August 1988: PC-DOS 4.0 and MS-DOS 4.0 were released with EMS memory support; because of bugs, 4.01a was released later. September 1988: the IBM PS/2 Model 30 286 was released, based on the 80286 processor and without the Micro Channel bus, though other machines in the line continued to use that bus. October 1988: the Macintosh IIx was released, based on the Motorola 68030 processor, still at 16 MHz, executing 3.9 million instructions per second and supporting 128 MB of RAM. November 1988: MS-DOS 4.01 and PC-DOS 4.01 were released. 1989: Tim Berners-Lee, working at the European particle physics laboratory, created the prototype of the World Wide Web; through hypertext links even a novice could browse the Internet easily, which greatly promoted the development of the Internet. 1989: Philips and Sony released the CD-i standard. January 1989: the Macintosh SE/30 was released, based on the new 68030 processor. March 1989: the E-IDE standard was established, supporting hard drives larger than 528 MB, reaching transfer speeds of 33.3 MB/s, and widely used for CD-ROM drives. April 10, 1989: the 80486 DX was released, integrating 1.2 million transistors; its later models reached a clock frequency of 100 MHz. November 1989: the Sound Blaster sound card was released. 1990: the SVGA standard was established. March 1990: the Macintosh IIfx was released, based on a 68030 CPU clocked at 40 MHz, with a faster SCSI interface. May 22, 1990: Microsoft released Windows 3.0, with an MS-DOS compatibility mode. October 1990: the Macintosh Classic was released, with a display adapter supporting 256 colors.
November 1990: the first generation of the MPC (Multimedia PC) standard was released, requiring at least an 80286/12 MHz processor (later raised to an 80386SX/16 MHz) and an optical drive with a transfer rate of at least 150 KB/s. 1991: the ISA standard was published. May 1991: Sound Blaster Pro was released. June 1991: MS-DOS 5.0 and PC-DOS 5.0 were released. To promote the development of OS/2, Bill Gates said that DOS 5.0 was the terminator of DOS and that Microsoft would no longer spend its energy there. This version essentially broke through the 640 KB memory limit, and it also marked the end of the cooperation between Microsoft and IBM on DOS. 1992: Windows NT was released, able to address 2 GB of RAM. April 1992: Windows 3.1 was released. June 1992: Sound Blaster 16 ASP was released. 1993: the Internet began commercial operation. 1993: the classic game Doom was released. 1993: Novell acquired Digital Research, and DR-DOS became Novell DOS. March 22, 1993: the Pentium was released, integrating more than 3 million transistors; early versions ran at 60-66 MHz and executed 100 million instructions per second. May 1993: the MPC Level 2 standard was released, requiring a CD-ROM transfer rate of 300 KB/s and 15 frames per second in a 320 x 240 window. December 1993: MS-DOS 6.0 was released, including the hard disk compression program DoubleSpace; a small company claimed that Microsoft had stolen some of its technology, so in the later DOS 6.2 Microsoft renamed it DriveSpace. Later, the DOS inside Windows 95 was known as DOS 7.0, and the one in Windows 95 OSR2 as DOS 7.10. March 7, 1994: Intel released 90 and 100 MHz Pentium processors. September 1994: PC-DOS 6.3 was released. October 10, 1994: Intel released the 75 MHz Pentium processor. 1994: Doom II was released, opening up a vast market for PC games. 1994: the Netscape 1.0 browser was released. 1994: Command & Conquer (C&C) was released. March 27, 1995: Intel released the 120 MHz Pentium processor. June 1, 1995: Intel released the 133 MHz Pentium processor. August 23, 1995: Windows 95 was released. Significantly different from its previous versions, it was a pure 32-bit multitasking operating system that broke away from MS-DOS completely while retaining a DOS mode to accommodate user habits. This version was a great success. November 1, 1995: the Pentium Pro was released, with frequencies up to 200 MHz, executing 440 million instructions per second and integrating 5.5 million transistors. December 1995: Netscape released its JavaScript. 1996: Quake, Civilization II, Command & Conquer: Red Alert, and a series of other well-known games were released. January 1996: Netscape Navigator 2.0 was released, the first browser to support JavaScript. January 4, 1996: Intel released 150-166 MHz Pentium processors, integrating about 3.3 million transistors. 1996: Windows 95 OSR2 was released, fixing some bugs and extending some functions. 1997: Grand Theft Auto, Quake 2, Blade Runner, and other famous games were released, and 3D graphics accelerator cards were in great demand. January 8, 1997: Intel released the Pentium MMX, with enhancements for games and multimedia. April 1997: IBM's Deep Blue computer defeated the human world chess champion Garry Kasparov. May 7, 1997: Intel released the Pentium II, adding more instructions and more cache. June 2, 1997: Intel released the 233 MHz Pentium MMX. August 1997: Apple ran into a serious financial crisis, and Microsoft extended a helping hand with an investment of 150 million US dollars, on condition that Apple withdraw its complaint that Microsoft Windows imitated the Apple interface; Microsoft countered that Apple's design was itself copied from Xerox. February 1998: Intel released the 333 MHz Pentium II processor, using a 0.25-micron process to improve speed and reduce heat.
June 25, 1998: Microsoft released Windows 98. Some people attempted to break up Microsoft; Microsoft hit back, saying that would hurt U.S. national interests. January 25, 1999: Linux kernel 2.2.0 was released, and people placed high hopes on it. February 22, 1999: AMD announced the 400 MHz K6-III; some tests showed it outperforming Intel's Pentium III. It integrated 23 million transistors and used the Socket 7 structure. February 26, 1999: Intel launched the Pentium III processor. It used the same Slot 1 architecture as the Pentium II and added the 70 new instructions of the SSE instruction set to enhance 3D and multimedia processing. Initial clock frequencies were 450 MHz and above, the bus speed was 100 MHz or more, it was made on a 0.25-micron process, and it integrated 512 KB or more of second-level cache. April 26, 1999: the CIH virus, written by the Taiwanese student Chen Ing-hau, broke out worldwide; the hardware and software of nearly a hundred million computers were damaged to varying degrees, with direct economic losses of billions of dollars. May 10, 1999: id Software launched the first test version of Quake III; in the time that followed, Quake III gradually became the standard for competitive FPS games and a benchmark for testing computer hardware performance. June 23, 1999: AMD introduced the Athlon processor with a new architecture, surpassing Intel in CPU clock frequency for the first time and opening the exciting end-of-century clock-speed war. September 1, 1999: the graphics chip company NVIDIA launched the GeForce 256 and proposed the new concept of the GPU. October 25, 1999: the Pentium III processor code-named Coppermine was released, made on a 0.18-micron process, with 256 KB of full-speed L2 cache on the die and 28 million transistors. January 1, 2000: the world waited, and the millennium bug did not explode. February 17, 2000: Microsoft officially released Windows 2000 in the United States. March 16, 2000: AMD officially launched the 1 GHz Athlon processor, opening the gigahertz processor war. March 18, 2000: Intel launched its own 1 GHz Pentium III processor. On the same day the Iridium company, whose system had cost more than five billion dollars, declared bankruptcy and completely terminated its satellite phone service; the Pentagon eventually won the right to use Iridium, though for purposes that remain unknown. April 27, 2000: AMD released the Duron processor, beginning its attack on Intel's low-end market. May 14, 2000: the "I Love You" (Love Bug) virus broke out worldwide, infecting nearly 45 million computers in just three days, with economic losses of 2.6 billion dollars. September 14, 2000: Microsoft officially launched Windows Me, the Millennium Edition for home users, which was also Microsoft's last DOS-based operating system. November 12, 2000: Microsoft announced the Tablet PC, a slim PC. November 20, 2000: Intel formally launched the Pentium 4 processor, with the new NetBurst architecture, a 400 MHz bus, and 144 new instructions for improved video, audio, other multimedia, and 3D graphics. December 14, 2000: 3dfx announced the sale of all its assets to its rival NVIDIA, ending its legendary history. February 1, 2001: Sega announced its withdrawal from the game hardware market.
March 26, 2001: Apple released the Mac OS X operating system, the first major overhaul of Apple's operating system since its introduction in 1984. June 19, 2001: Intel introduced Pentium III and Celeron processors based on the "Tualatin" core, Intel's first chips on a 0.13-micron process. October 8, 2001: AMD announced the Athlon XP series of processors, using a new core, the professional 3DNow! instruction set, and OPGA (organic pin grid array) packaging, together with a "relative performance rating" (PR rating) naming convention; the processors' excellent price/performance put heavy pressure on Intel. October 25, 2001: Microsoft launched the Windows XP operating system, and Bill Gates declared that "the DOS era is over." The release of Windows XP also pushed the world toward cheaper PC hardware. February 5, 2002: NVIDIA released the GeForce 4 series of graphics chips, divided into the Ti and MX lines; the GeForce4 Ti 4200 and GeForce4 MX 440 proved to be long-lived market models. May 13, 2002: after a long silence, the veteran graphics chip manufacturer Matrox officially launched the Parhelia-512 graphics chip, the world's first 512-bit GPU. July 17, 2002: the ATI Radeon 9700 graphics card was released, using the graphics core code-named R300, and without dispute it knocked NVIDIA off the throne of 3D performance. November 18, 2002: NVIDIA released the GeForce FX graphics card, code-named NV30; it was the first product made on a 0.13-micron manufacturing process and used a number of advanced technologies, so it was hailed as an epoch-making product. January 7, 2003: Intel released "Centrino", a new standard for mobile processing. February 10, 2003: AMD released Athlon XP processors with the Barton core; although they received little media recognition for a long time after launch, their high price/performance ratio and excellent overclocking ability created a Barton era that DIY enthusiasts remember fondly. February 12, 2003: Futuremark released 3DMark03, which triggered a crisis of confidence in benchmark software. In 2004 Intel turned fully to PCI Express. In 2005 Intel began to promote dual-core CPUs, and in 2006 quad-core CPUs. At IDF 2007 Intel shocked the world by announcing an 80-core CPU capable of 2 trillion operations per second.
Computers in China
China, one of the early cradles of human civilization, has written a glorious chapter in the history of inventing computing tools. As far back as the Shang dynasty, China created the decimal notation system, leading the world by more than a thousand years. In the Zhou dynasty the most advanced computing tool of the time was invented: counting rods, small sticks of bamboo, wood, or bone in different colors. For each mathematical problem the algorithm was usually composed as a set of verses, and the rods were repeatedly rearranged as the calculation proceeded. Using counting rods, the ancient Chinese mathematician Zu Chongzhi calculated that pi lies between 3.1415926 and 3.1415927, a result obtained about a millennium earlier than in the West. The abacus was another original Chinese invention and a major milestone in the history of computing tools. Light, flexible, easy to carry, and close to people's daily lives, it first appeared around the Han dynasty and matured during the Yuan dynasty. The abacus not only played a useful role in China's economic development but also spread to Japan, Korea, and Southeast Asia; it has withstood the test of history and is still in use. Chinese inventions such as the south-pointing chariot, the water-powered armillary sphere, the distance-recording drum carriage, and the drawloom not only made outstanding contributions to automatic mechanical control but also had a direct or indirect influence on the development and evolution of computing tools. For example, Zhang Heng's water-driven armillary sphere could run automatically in synchrony with the Earth; improved during the Tang and Song dynasties, it became the world's first astronomical clock. The distance-recording drum carriage was the world's first automatic counting device. The principle of the drawloom indirectly influenced the development of program control in computers. The eight trigrams of ancient China, composed of the yang and yin lines, also had a direct influence on the development of computing technology: Leibniz wrote a research paper on the trigrams, systematically presenting binary arithmetic, and in his view the world's earliest binary representation was the Chinese trigrams. After a long silence, with the founding of the People's Republic of China, Chinese computing technology entered a new period of development. Research institutions were established, programs in computing and computational mathematics were set up in institutions of higher learning, and the country set out to create its own computer industry. In 1958 and 1959 China built its first small and large vacuum-tube computers. In the mid-1960s China successfully developed a number of transistor computers and system software such as compilers for ALGOL and other languages. In the late 1960s China began to study integrated-circuit computers. In the 1970s China put small integrated-circuit computers into production. After 1980 China began to focus on developing microcomputer systems and their applications; important progress was also made in large computers and especially in supercomputer technology, and a computer services sector was established, gradually improving the structure of the computer industry. In computer science and technology, China has accomplishments in the finite element method, machine proof of mathematical theorems, Chinese-language information processing, computer architecture, and software. Computer applications in China have made remarkable achievements in scientific computing and engineering.
Research and practice in computer applications for management and process control are also increasingly active.
Computer Science and Technology
Computer science and technology is a highly practical technical discipline that develops extremely rapidly and has a large impact on society. It rests on mathematics, electronics (especially microelectronics), magnetics, optics, precision machinery, and other subjects, but it is not a simple application of knowledge from those subjects; rather, it is a highly integrated body of theory, methods, and techniques for the transformation, storage, processing, control, and use of information. Computer science is the study of the phenomena and laws in and around the computer, and includes theoretical computer science, computer architecture, software, and artificial intelligence. Computer technology refers to the technical methods and means applied in the computer field, including computer system technology, software technology, component technology, device technology, and assembly technology. Computer science and technology comprises five branches: theoretical computer science, computer architecture, computer organization and implementation, computer software, and computer applications.
Theoretical Computer Science
This is the discipline that studies the basic theory of computers. Over several thousand years of mathematical development, people performed a great variety of computations and created many algorithms. But the mathematical theory that studies the nature of computation itself developed only in the 1930s. At that time several scholars of mathematical logic created the theory of computable functions (also known as recursive function theory) and algorithm theory, which by the 1940s had influenced the design of the modern computer. Since then, research on mathematical models of the real nature of computers and programs and on computational complexity has continued to develop.
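Since the Turing machine is the central model this branch (and the History section) refers to, here is a minimal, hedged simulator sketch. The tape encoding, the blank symbol, and the example transition table (which increments a binary number) are choices made for this illustration, not anything prescribed by the text.

```python
# Minimal Turing machine: tape is a dict from position to symbol, '_' is blank.
def run_tm(tape_str, transitions, state="right", blank="_"):
    tape = dict(enumerate(tape_str))
    head = 0
    while state != "halt":
        symbol = tape.get(head, blank)
        write, move, state = transitions[(state, symbol)]   # look up the rule
        tape[head] = write                                  # write the symbol
        head += 1 if move == "R" else -1                    # move the head
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Increment a binary number: scan right to the end, then propagate the carry left.
INC = {
    ("right", "0"): ("0", "R", "right"),
    ("right", "1"): ("1", "R", "right"),
    ("right", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}

print(run_tm("1011", INC))   # -> 1100  (11 + 1 = 12 in binary)
```

A table of rules plus an unbounded tape is the whole model; computability theory asks what such tables can and cannot compute.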
Computer Architecture
Computer architecture is the computer as seen by the programmer: it focuses on the computer's conceptual structure and functional characteristics, and on how functions are allocated among the hardware, software, and firmware subsystems and their interfaces. What a designer programming in a high-level language sees are mainly the properties of the software and firmware subsystems, including the programming language, the operating system, the database management system, the network software, and the user interface. What a designer programming in machine language sees is the conceptual structure and functional characteristics of the hardware subsystem, including the instruction set (machine language), the register definitions, the interrupt mechanism, the input/output methods, and the machine's working states. The typical structure of the hardware subsystem is the von Neumann structure, composed of an arithmetic unit, a controller, memory, and input and output devices, working in an "instruction-driven" way. It was originally designed for numerical problems such as nonlinear and differential equations, and did not foresee high-level languages, operating systems, or the special requirements of other later applications. For a long time the software subsystem was developed on the basis of this von Neumann structure, but incompatibilities gradually became apparent, which in turn pushed computer architecture forward.
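To illustrate the "programmer-visible properties" this paragraph describes, here is a hedged sketch that writes an architecture down as data: which registers, instructions, and machine states exist, with no commitment to how hardware realizes them. Every name and encoding below is invented for the example.

```python
from dataclasses import dataclass, field
from enum import IntEnum

class Opcode(IntEnum):        # the instruction set (machine language) the programmer sees
    HALT = 0x0
    LOAD = 0x1
    STORE = 0x2
    ADD = 0x3
    JUMP = 0x4

@dataclass
class ArchitecturalState:     # state a machine-language program can observe and change
    registers: dict = field(default_factory=lambda: {f"R{i}": 0 for i in range(8)})
    pc: int = 0               # program counter
    zero_flag: bool = False   # condition code set by arithmetic instructions
    interrupts_enabled: bool = True

# Two different hardware organizations (pipelined or not, one bus or several buses)
# could both implement exactly this visible state and instruction set.
spec = ArchitecturalState()
print(list(Opcode), spec.registers["R0"], spec.pc)
```

Keeping this specification separate from its implementation is the distinction the next subsection, on computer organization, builds on.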
Computer Organization and Implementation
Studying the functions of a computer's components, the connections and interactions between them, and the technology for implementing them are the tasks of computer organization and implementation. Once the computer's architecture has determined the allocation of functions to the hardware subsystem and its conceptual structure, the task of computer organization is to study the internal structure of the components and their interconnection so as to realize the required functions and characteristics at the machine-instruction level, including the arrangement of the mutual connections and interactions among components. As computers' capabilities expand and their performance improves, the number of functional components they contain keeps growing, and the interconnection structure becomes more complex. Modern interconnection schemes fall into three types, centered respectively on the central processor, on the memory, or on a communication subsystem, with the other parts connected around that center. Organization around a communication subsystem brought computer technology and communication technology closely together, giving rise to computer networks, distributed computer systems, and other important areas of computer research and application. Computer implementation covers a broad range of technologies, including computer components and devices, digital circuit technology, assembly technology, and the related manufacturing technologies and processes.
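As a toy illustration of the interconnection idea above, the sketch below attaches independent components (memory and an I/O device) to one shared bus and addresses them through it. The address map and the Bus/device interfaces are assumptions made for the example, not a description of any real machine's organization.

```python
class Bus:
    """A shared bus that routes read requests to whichever device owns the address."""
    def __init__(self):
        self.devices = []                    # list of (start_address, end_address, device)

    def attach(self, start, end, device):
        self.devices.append((start, end, device))

    def read(self, address):
        for start, end, dev in self.devices:
            if start <= address <= end:
                return dev.read(address - start)
        raise ValueError(f"no device mapped at {address:#x}")

class RAM:
    def __init__(self, size):
        self.cells = [0] * size
    def read(self, offset):
        return self.cells[offset]

class ConsoleDevice:
    def read(self, offset):
        return 42                            # pretend status register of an I/O device

bus = Bus()
bus.attach(0x0000, 0x0FFF, RAM(0x1000))      # main memory window
bus.attach(0x1000, 0x1000, ConsoleDevice())  # memory-mapped I/O register
print(bus.read(0x0010), bus.read(0x1000))    # -> 0 42
```

Whether such a bus is centered on the processor, the memory, or a communication subsystem is exactly the organizational choice the paragraph describes.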
Computer Industry
The computer industry comprises two sectors: computer manufacturing and computer services. The latter is also known as the information processing industry or the information services industry. The computer industry is an energy-saving, resource-saving, high value-added, knowledge- and technology-intensive industry with a tremendous impact on the development of the national economy, on national defense capability, and on social progress, so many countries have adopted policies to promote its prosperity. Computer manufacturing includes the production of computer systems, peripherals, terminals, and related equipment, as well as the manufacture of components, devices, and materials. As industrial products, computers are required to be compatible across generations and to offer a high performance/price ratio and good overall performance. Compatibility across generations is reflected above all in software compatibility, which allows users and manufacturers to run software developed in the past on new products, so that the costly accumulated software continues to be useful and repeated development costs are avoided. Improving the performance/price ratio is the driving goal of computer product updates. The products of the computer manufacturing industry generally include only part of the hardware subsystem and the software subsystem, and the software subsystem often lacks the application software adapted to a specific environment. For a computer to perform well in a particular environment, an application system must be designed and application software developed; in addition, the operation and maintenance of the computer requires technical staff with specialized knowledge, which an individual user often cannot provide. In response to these social needs, some computer manufacturers attach great importance to providing users with a variety of technical and marketing services. Computer service firms independent of the manufacturers also began to appear in the 1950s, and by the late 1960s the computer services industry had formed as a separate industry worldwide.
Development and application
Combining computer science and technology with other disciplines improves research tools and methods and promotes the development of every discipline. In the past, people carried out scientific and technological research mainly in two ways, experiment and theory; now calculation and simulation have become a third way. Computers combined with experimental observation equipment can record, sort, process, analyze, and chart experimental data in the field, significantly improving the quality and efficiency of experimental work. Computer-aided design has become an important means of producing high-quality engineering designs and of automation. In theoretical work, the computer is an extension of the human brain and can take over and strengthen some of its functions. Mathematical calculation, once done with pen and paper, now has the computer as a new tool, and heavy mental work such as proving mathematical theorems can be done wholly or partly by computer. As a new research tool, calculation and simulation often give rise to new branches of a subject. For example, aerodynamics, meteorology, elasticity analysis, and structural mechanics all faced "computational difficulties" in their applications; after breakthroughs were made with high-speed computers and new computational methods, border branches such as computational aerodynamics and numerical weather prediction were derived. Quantitative research with computers plays a major role not only in the natural sciences but also in the social sciences and humanities; in censuses, social surveys, and research on natural language, for example, the computer is a very effective tool. The wide application of computers in every trade and profession often brings significant economic and social benefits, leading to major changes in industrial structure, product structure, and the modes of management and service. In the industrial structure, computer manufacturing and computer services have emerged, along with the knowledge industry and other new industries. Microprocessors and microcomputers embedded in mechanical and electrical equipment, electronic devices, communications equipment, instruments, and household appliances move these products in an intelligent direction. Introducing computer systems into production processes has greatly raised the level of automation in the chemical, petroleum, steel, electric power, machinery, papermaking, cement, and other industries, increasing labor productivity, improving quality, and reducing costs. Embedding computer systems in weapons and weapon platforms can significantly improve their combat effectiveness. In management, computers can be used for statistics, planning, information handling, inventory management, market analysis, and decision support, making operation and management scientific and efficient, speeding up the turnover of funds, reducing inventory levels, improving service quality, shortening new-product development cycles, and raising labor productivity. In office automation, computers can be used to draft, search, and manage documents, significantly improving office efficiency. The computer is also a tool for people's learning and daily life. With home computers, personal computers, computer networks, database systems, and various terminal devices, people can study a variety of courses, obtain all kinds of information and knowledge, handle the affairs of daily life (such as buying tickets, shopping, and withdrawing money), and even work from home.
More and more of people's work, study, and daily life will involve direct or indirect contact with computers, and universal computer education has become an important issue. In short, the development and application of the computer is not only a technological phenomenon but also a political, economic, military, and social one.
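As a tiny instance of the "calculation and simulation" research mode described above, the sketch below estimates pi by random sampling (a Monte Carlo calculation), the kind of task that is trivial for a computer and impractical by hand. The sample count and the fixed random seed are arbitrary choices for reproducibility.

```python
import random

random.seed(0)
samples = 1_000_000
# Count random points in the unit square that fall inside the quarter circle x^2 + y^2 <= 1.
inside = sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(samples))
print(4 * inside / samples)   # approaches 3.14159... as the sample count grows
```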
Computer Type
Emerging new types of computer include the biological computer, the photonic computer, the quantum computer, and others. 1. The biological computer. The advantages of the bionic biological computer are very attractive, and scientists around the world are now working on it. Many scientists point out that fifty years ago, in the vacuum-tube era, no one imagined that computers would become popular throughout the world; today's biological computer, quietly being developed, may likewise step onto the technological stage one day, possibly realizing functions that existing computers cannot handle, such as those of the human right brain and the fuzzy processing of the brain's neural networks. 2. The quantum computer, based on binary nonlinear quantum effects. 3. The photonic computer. In early 1990, Bell Labs in the United States built the world's first photonic computer. A photonic computer is a new kind of computer that uses light signals for digital operations, logical operations, and information storage and processing. Its basic components are integrated-optics elements; it must contain lasers, lenses, and mirrors. Because photons travel faster than electrons, a photonic computer could run at up to a trillion operations per second, with a storage capacity many times that of modern computers, and could also recognize and synthesize language, graphics, and gestures. Many countries have invested heavily in photonic computer research. With the combination of modern optics, computer technology, and microelectronics, the photonic computer may become a universal tool for humanity in the near future. Compared with the electronic computer, the photonic computer has, among others, the following advantages. One is ultra-large-scale information storage capacity: compared with the electronic computer, the photonic computer can store vastly more information. Its signals are carried by laser light rather than wires, and even where beams cross they do not affect one another in the slightest; a photonic computer can transmit information over wireless parallel channels at a density that is practically unlimited - a mirror one-fifth the size of a coin could carry many times the information capacity of all the world's existing telephone cables. Another is low energy consumption and low heat dissipation, making it an energy-saving product: driving a photonic computer requires only a small fraction of the energy needed by an electronic computer of similar specification, which not only reduces power consumption and greatly reduces the heat produced by the machine but also provides convenient conditions for making photonic computers miniaturized and portable. Scientists are experimenting with combining traditional electronic converters and photonic elements to create "hybrid" computers that not only process information faster but also overcome the internal overheating problem of giant computers. At present breakthroughs have been made in a number of key photonic computer technologies, such as optical storage, optical interconnection, and optoelectronic integrated circuits; the most important current research task is increasing photonic computing power. The advent and further development of the photonic computer will provide endless power for humanity's progress toward a better tomorrow. [The hybrid computer] The modern hybrid computer has developed into a multiprocessor system with automatic scheduling of simulation programs and hybrid computation.
It includes a super-minicomputer, one or two peripheral array processors, and several simulation processors with automatic programming capability; the various types of processors exchange data and control signals through intelligent hybrid interfaces. Such a system has strong real-time simulation capability, but it is expensive. [The intelligent computer] So far there is no universally accepted definition of the intelligent computer. Turing, one of the founders of the theory of computation, defined the computer as a machine for processing discrete digital information. On the question of whether a digital computer can, in principle, simulate human intelligence, there are diametrically opposed views. In 1937 Church and Turing independently put forward the hypothesis that human thinking ability is equivalent in power to the recursive functions. This hypothesis has not been proved; some artificial intelligence scholars express it as: if a problem submitted to a Turing machine cannot be solved by the Turing machine, then human thought cannot solve it either. This school inherits the traditional philosophy of reducing thought to logic and emphasizes the great potential of the digital computer to simulate human thought. Other scholars, such as the philosopher H. Dreyfus, maintain that a digital computer based on the Turing machine cannot simulate human intelligence. They hold that digital computers can only do formal information processing, while human intelligent activity is not necessarily formal and not necessarily information processing, and that human intelligence cannot be reduced to computation over a discrete, independent set of rules and situations dictated by the environment. In principle this school does not rule out intelligent machines built from materials close in composition to the human brain, but such generalized intelligent machines would be different from the digital computer. Some scholars even believe that no machine of any kind can simulate human intelligence, but the majority believe that the brain's activity can be analyzed in terms of symbols and computation. It must be pointed out that people's understanding of computation is deepening and broadening: some scholars regard certain physical processes as realizable computations, gene switching can be seen as computation performed by cells, and some cellular operations can be explained as so-called molecular computation. In this sense the generalized intelligent computer, or intelligent machine, covers almost the same ground as machine intelligence at large. [The single-chip computer]
Computer network
A computer network is a system in which multiple computers with independent functions, located in different geographic positions, together with their peripheral equipment, are connected by communication lines and, under the management and coordination of a network operating system, network management software, and network communication protocols, share resources and transmit information.
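A minimal sketch of the "information transmission between connected computers" mentioned in the definition: a TCP server and client on the local machine exchanging one message over a socket (both ends run in one process here for convenience). The port number and message text are arbitrary choices for the example.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007
ready = threading.Event()

def serve_once():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                          # signal that the server is listening
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)           # receive a request
            conn.sendall(b"echo: " + data)   # send a reply back

server = threading.Thread(target=serve_once)
server.start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello over the network")
    print(cli.recv(1024).decode())           # -> echo: hello over the network

server.join()
```

On a real network the two endpoints would be different machines, with the communication protocol (here TCP/IP) doing the coordination the definition attributes to network software.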
Computer language
A computer language is a language used for communication between humans and computers; it is the medium through which information is passed between people and machines. Computer programming languages have evolved from machine language through assembly language to high-level languages. Computer languages fall into three categories: 1. low-level languages; 2. high-level languages; 3. special-purpose languages.
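To make the layering above concrete, here is a small hedged illustration of how one high-level statement corresponds to several lower-level instructions. CPython bytecode is an interpreter's instruction set, not hardware machine language, but the relationship (one readable line expanding into primitive operations) is the same idea.

```python
import dis

def add(a, b):
    return a + b       # one line of a high-level language

# Prints the low-level instructions this function is translated into,
# e.g. LOAD_FAST, a binary-add instruction, and a return instruction
# (exact opcode names vary between Python versions).
dis.dis(add)
```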
Supercomputer
A supercomputer typically refers to a machine composed of hundreds or thousands of processors (or more) that can tackle large and complex problems that ordinary PCs and servers cannot handle. To give a better sense of a supercomputer's speed: if an ordinary computer's processing speed were an adult's walking pace, a supercomputer would move at the speed of a rocket. With computing speed on that scale, it can predict and explain, by numerical simulation, natural phenomena that could not previously be studied by experiment.
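The essence of the processor counts quoted above is splitting one large computation across many processing units. The sketch below does this in miniature with four worker processes summing squares over disjoint ranges; the worker count and the toy workload are arbitrary choices, and real supercomputers distribute work across thousands of nodes with far more sophisticated scheduling.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))   # each worker handles one slice

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))   # combine the partial results
    print(total == sum(i * i for i in range(n)))     # -> True
```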
Name
Name: computer. Early on it was also known as the PC; among early businessmen in Hong Kong it was called 电脑 ("electronic brain"), and after this name spread to the mainland it became the one most familiar to the public.
Hanyu Pinyin
jì suàn jī
English
computer, calculating machine, PC (personal computer), laptop (notebook computer), workstation, server, mainframe, supercomputer
French
Ordinateur
Basic concept
A computer is a modern intelligent electronic device that, according to a program stored in advance, automatically carries out large amounts of numerical calculation and many kinds of information processing at high speed.
Introduction
The computer evolved from the early electric calculator. Generally speaking, most people regard the "world's first recognized electronic digital computer" as the ENIAC unveiled in 1946, which was used mainly to calculate artillery trajectories. It was built at the Moore School of Electrical Engineering of the University of Pennsylvania, but it was huge: it covered more than 170 square meters, weighed about 30 tons, and consumed nearly 140 kilowatts of electricity. Obviously, such a computer was costly and inconvenient to use. That account is the one commonly used in introductory computer textbooks, but in fact, according to a 1973 ruling of a U.S. court, the first electronic digital computer should be the "ABC" (Atanasoff-Berry Computer), built in October 1939 by John Atanasoff, a professor of physics at Iowa State University, and his graduate assistant Clifford E. Berry (1918-1963). The misunderstanding arose because Mauchly, a member of the ENIAC team, drew on Atanasoff's research in 1941 and applied for a patent in 1946; for various reasons the error was not overturned until 1973. (For the details, see the Baidu Encyclopedia entry on John Atanasoff; it is to be hoped that the textbooks will be corrected, and that the ABC and Atanasoff will be remembered.) Later, in recognition and commemoration of Atanasoff's great contribution to the computer field, U.S. President George H. W. Bush awarded him the National Medal of Technology, the country's highest technology award, in 1990. In 1956 the transistor computer, the second generation, was born, fitting into a few large cabinets and running much faster; the third generation, built from integrated circuits, appeared in 1959; and by 1976 the Cray-1, made from large-scale and very-large-scale integrated circuits, had brought computers into the fourth generation. The world's first personal computer was introduced by IBM in 1981. IBM's personal computer used the Intel x86 hardware architecture and Microsoft's MS-DOS operating system, and IBM developed the PC/AT as the PC specification. From then on, the history of the personal computer has been driven largely by Intel's successive microprocessors and Microsoft's successive operating systems, and the Wintel architecture completely displaced IBM from its leading position in the personal computer market.
Categories
Computers can be classified in many ways: by type, mode of operation, constituent devices, operating principle, or field of application. By the way data are represented, computers divide into digital, analog, and hybrid machines. By the devices they are built from, digital computers have progressed from mechanical and electromechanical machines to electronic ones, and optical, quantum, biological, and neural computers are now being studied. By scale and function, computers can be divided into supercomputers, mainframes, medium-sized and small computers, microcomputers, and microcontrollers. Taken together, the usual classification is as follows.
(1) By performance: ① supercomputers: very high speed and very large capacity; ② mainframes: high speed, used in fields such as military technology and scientific research; ③ minicomputers: simple structure and low cost, with performance that is outstanding for the price; ④ microcomputers: small, light, and inexpensive.
(2) By purpose: ① special-purpose machines: designed specifically for one task or service; ② general-purpose machines: scientific computing, data processing, process control, and general problem solving.
(3) By operating principle: ① digital machines: fast, precise, automatic, and general-purpose; ② analog machines: compute with analog quantities, fast but of limited precision; ③ hybrid machines: combine the strengths of the other two and avoid their weaknesses, still at the development stage.
Composition
Whatever the type of computer, it consists of hardware and software, and the two are inseparable. A computer with no software installed at all is called a bare machine.
Hardware
The hardware of a computer system is the visible, tangible collection of electronic circuits and physical devices: the central processing unit (CPU), memory, external devices (input and output, or I/O, devices), the buses, and so on.
① Memory. Its main function is to store programs and data: the program is the basis of the computer's operation, and the data are the objects the computer operates on. A memory consists of storage cells, an address decoder, read/write control circuitry, an address bus, and a data bus. Memory from which the central processor can fetch instructions and data directly and at random serves as main memory; disks, tapes, optical discs, and other large-capacity devices serve as external memory (secondary storage). Main memory, external memory, and the corresponding software together make up the computer's storage system.
② Central processor. Its main function is to execute, one by one, the operations specified by the program held in memory. Its main components are the data registers, the instruction register, the instruction decoder, the arithmetic logic unit, the operation controller, the program counter (instruction address counter), and the address register.
③ External devices. These are the bridge between the user and the machine. Input devices take the data, characters, text, graphics, and programs the user wants processed and convert them into a coded form the computer can accept and store. Output devices deliver the results of processing in the form the user needs, such as a screen display, printed text, graphics, charts, speech, or sound. The input/output interface is a buffer between an external device and the central processor, responsible for matching electrical characteristics and information formats.
Software
Software is, in general terms, the set of programs that make the computer hardware system work smoothly and effectively. A program is always stored on and represented by some physical medium (a disk, a tape, paper tape, punched cards), but software does not mean the physical medium; it means the programs themselves, which cannot be seen or touched directly. Reliable hardware is like a person's strong physique; effective software is like a person's intelligent thinking. Computer software is divided into system software and application software. System software is responsible for managing, scheduling, monitoring, and serving all the resources of the computer system; application software refers to the various programs that users in different fields develop for their own needs. A computer software system includes:
① Operating system: the core system software, responsible for managing, controlling, and monitoring the system's hardware and software resources.
② Database management system: responsible for managing and sharing all the files, information, and data in the computer system.
③ Compilation system: responsible for translating source programs written in high-level languages into machine language that the machine can understand and execute.
④ Network system: responsible for organizing and managing the computer system's network resources, so that multiple independent computers can share resources and communicate with one another.
⑤ Standard program library: a collection of standard routines prepared in a standard format, including computer programs for evaluating elementary functions, solving systems of linear equations and ordinary differential equations, and performing numerical integration.
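The stored-program principle described above, in which memory holds both the program and the data while the central processor fetches and executes one instruction at a time under the control of a program counter, can be illustrated with a minimal sketch. The Python code below is only a toy model written for this article; its tiny instruction set is invented for illustration and does not correspond to any real processor.

    # Minimal sketch of the stored-program principle: one memory holds both
    # instructions and data, and the processor fetches and executes them in turn.
    def run(memory):
        pc = 0   # program counter (instruction address counter)
        acc = 0  # accumulator (a data register)
        while True:
            op, arg = memory[pc]   # fetch the instruction stored at address pc
            pc += 1                # advance to the next instruction
            if op == "LOAD":       # copy a value from memory into the accumulator
                acc = memory[arg]
            elif op == "ADD":      # add a value from memory to the accumulator
                acc += memory[arg]
            elif op == "STORE":    # write the accumulator back into memory
                memory[arg] = acc
            elif op == "HALT":     # stop the machine and report the result
                return acc

    # Addresses 0-3 hold the program; addresses 4-6 hold the data.
    memory = {
        0: ("LOAD", 4),
        1: ("ADD", 5),
        2: ("STORE", 6),
        3: ("HALT", 0),
        4: 2, 5: 3, 6: 0,
    }
    print(run(memory))  # prints 5, the sum of the data stored at addresses 4 and 5

The only point of the sketch is that instructions and data live in the same memory and that the program counter drives the machine, which is exactly the division of labor between memory and central processor described above.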
History
In 1614 the Scotsman John Napier (1550-1617) published a paper describing a sophisticated device he had invented for carrying out arithmetic and solving equations. In 1623 Wilhelm Schickard (1592-1635) built a "calculating clock" that could add and subtract numbers of up to six digits and announced the answer with a bell; it was operated by turning gears. In 1625 William Oughtred (1575-1660) invented the slide rule. In 1642-1643 Blaise Pascal, to help his tax-collector father, invented a gear-driven adding machine called the Pascaline, the first mechanical adder. In 1666 Samuel Morland in England built a mechanical calculating machine that could add and subtract. In 1673 Gottfried Leibniz built a calculating machine with a stepped cylindrical drum, the Stepped Reckoner, which could multiply by repeated automatic addition. In 1694 Leibniz improved Pascal's Pascaline into a machine that could also multiply, still operated with gears and dials. In 1773 Philipp Matthäus Hahn built and sold a small number of calculating machines accurate to twelve digits. In 1775 the third Earl of Stanhope invented a multiplying calculator similar to Leibniz's. In 1786 J. H. Müller designed a difference engine but unfortunately had no funding to build it. In 1801 Joseph-Marie Jacquard built a loom whose weaving pattern was controlled by punched cards read in sequence.
In 1847 the computing pioneer and British mathematician Charles Babbage began designing his mechanical difference engine. The overall design took two years; the machine was to carry out calculations to 31 digits of precision and print the results on paper, and it is widely considered the world's first mechanical computer. The design, however, was so complex and changed so often that Babbage never turned it into reality before his death. Only in March 2008 was Babbage's difference engine finally built; the machine has 8,000 parts, weighs 5 tons, and is on display at the Computer History Museum in California's Silicon Valley.
In 1854 George Boole published An Investigation of the Laws of Thought, a work on symbols and logical reasoning that later supplied basic concepts for computer design. In 1882 William S. Burroughs resigned from his job as a bank clerk to concentrate on inventing an adding machine. In 1889 Herman Hollerith's electric tabulating machine performed well in a competition and was used in the 1890 census. Hollerith applied the idea of the Jacquard loom to calculation: he stored the data on cards and then fed the cards into the machine to compile the results, so that a population count which would otherwise have taken a decade was finished in just six weeks. In 1893 the first four-function calculator was invented. In 1895 Guglielmo Marconi transmitted radio signals. In 1896 Hollerith founded the Tabulating Machine Company. The keypunch appeared in 1901 and underwent only a few changes over the following half century. In 1904 John A. Fleming obtained a patent on the vacuum diode, laying a foundation for radio communication. In 1906 Lee De Forest added a third electrode to Fleming's diode, creating the triode vacuum tube. In 1907 a station in New York began regular broadcasts of recorded music.
In 1908 the British scientist Campbell Swinton described a method of electronic scanning and pointed the way to the cathode-ray-tube television. In 1911 Hollerith's Tabulating Machine Company merged with two other firms to form the Computing-Tabulating-Recording Company (CTR), which in 1924 changed its name to International Business Machines Corporation (IBM). Also in 1911 the Dutch physicist Kamerlingh Onnes discovered superconductivity at Leiden University. In 1931 Vannevar Bush built a differential analyzer, a machine that could solve the differential equations that gave mathematicians and scientists such headaches. In 1935 IBM introduced the IBM 601, a punched-card machine with an arithmetic unit that could perform a multiplication in one second; it played a great role in both scientific and commercial calculation, and about 1,500 were built. In 1937 Alan Turing conceived the "universal machine", able to carry out any algorithm, and with it the basic concept of computability; Turing's idea was superior to other inventions of the same kind because it rested on the notion of symbol processing.
In November 1939 John Vincent Atanasoff and Clifford Berry completed a 16-bit adder, the first calculating device to use vacuum tubes. In 1939 Zuse and Schreyer began building the V2 (later called the Z2), which kept the Z1's mechanical memory but used a new arithmetic unit built from relay logic; after the design was drafted, the work was interrupted for a year. In 1939-40 Schreyer completed a 10-bit adder using vacuum tubes, with neon lamps as memory. In January 1940, at Bell Labs, Samuel Williams and Stibitz completed a machine that could calculate with complex numbers, the Complex Number Calculator, later renamed the Model I Relay Calculator. Its logic units were built from telephone-exchange parts, 145 relays and 10 crossbar switches, and its digits were represented in "plus 3" (excess-3) BCD. In September of the same year a teletype installed at a mathematics conference in New Hampshire was connected to the machine in New York. In 1940 Zuse finally completed the Z2, which worked better than the V2 but was not very reliable. In the summer of 1941 Atanasoff and Berry completed a special-purpose calculator for solving systems of simultaneous linear equations, later called the ABC (Atanasoff-Berry Computer); it had a memory of 60 fifty-bit words held as capacitors on two rotating drums, and its clock speed was 60 Hz. In 1941 Zuse completed the V3 (later called the Z3), the first calculating machine whose operation could be controlled by a program. It used floating-point arithmetic with a 7-bit exponent, a 14-bit mantissa, and a sign bit; its memory could store 64 words and required 1,400 relays, with more than 1,200 further relays for arithmetic and control, and its programming, input, and output were the same as on the Z1. In January 1943 Howard H. Aiken completed the ASCC Mark I (Automatic Sequence Controlled Calculator Mark I), also known as the Harvard Mark I; the machine was 51 feet long, weighed 5 tons, and was assembled from 750,000 parts.
It had 72 accumulators, each with its own arithmetic unit, and a 23-digit register. In December 1943 Tommy Flowers and his team completed the first Colossus, with 2,400 vacuum tubes for its logic and five looped paper-tape readers, each able to read 5,000 characters per second. In 1943 work on ENIAC began under the leadership of John Brainerd, with John Mauchly and J. Presper Eckert responsible for carrying out the plan, and in 1946 ENIAC was completed in the United States. In 1947 the Association for Computing Machinery (ACM) was founded, and Britain completed the first storage tube. In 1948 Bell Telephone Laboratories developed the transistor. In 1949 Britain completed the EDSAC (Electronic Delay Storage Automatic Calculator). In 1950 the word "automation" was first used, in the automobile industry. In 1951 the Massachusetts Institute of Technology developed the magnetic core memory. In 1952 the first stored-program calculator was born, the first large computer system, the IBM 701, was announced, and the first symbolic-language translator was produced. In 1954 the first transistorized computer was developed by Bell Telephone Laboratories, and the IBM 650, the first general-purpose data-processing machine, was born. In 1955 the IBM 705, the first large computer to make extensive use of core memory, was completed. In 1956 IBM introduced the scientific 704. In 1957 the programming language FORTRAN appeared. In 1959 the first small scientific computer, the IBM 1620, was developed. In 1960 the data-processing system IBM 1401 was developed. In 1961 the programming language COBOL appeared, and the Massachusetts Institute of Technology completed the design of the first computer time-sharing system. In 1964 the BASIC language appeared, and the third-generation IBM System/360 series was produced. In 1965 the Digital Equipment Corporation in the United States introduced the first minicomputer, the PDP-8. In 1969 IBM successfully developed its card machines and the System/3 computer system. In 1970 the IBM System/370 series was produced. In 1971 the University of Illinois completed the design of the ILLIAC IV supercomputer, and Intel developed the 4004, the first microprocessor. In 1972 microprocessor chips went into mass production and sale. In 1973 IBM developed the first floppy disk. In 1975 the Altair 8800 microcomputer appeared. In 1977 Commodore announced the PET 2001 microcomputer, and the TRS-80 and the Apple II were born in the same year. In 1978 VLSI began to be applied, and magnetic bubble memory was used for the second time in commercial computers. In 1979 Sharp announced the first portable microcomputer. In 1982 microcomputers began to spread, entering many schools and homes. In 1984 Japan's computer industry set out to develop the "fifth generation computer", a computer with artificial intelligence.
1984: the DNS (Domain Name Server) was released, and more than 1,000 hosts were running on the interconnected network. 1984: Hewlett-Packard released its excellent laser printers, and HP inkjet printers also stayed ahead in technology. January 1984: Apple's Macintosh was released, based on the Motorola 68000 microprocessor and able to address 16MB. August 1984: MS-DOS 3.0, PC-DOS 3.0, and the IBM AT were released; the AT used the ISA standard and supported large hard drives and 1.2MB high-density floppy drives. September 1984: Apple issued a Macintosh with 512KB of memory but no improvements in other areas.
End of 1984: Compaq began developing the IDE interface, which could transfer data faster and was adopted by many of its peers; the EIDE it introduced later could support drives of up to 528MB and still faster transfers. 1985: Philips and Sony launched the CD-ROM drive. 1985: the EGA standard was introduced. March 1985: MS-DOS 3.1 and PC-DOS 3.1, the first DOS versions to provide some support for networking, were released. October 17, 1985: the 80386 DX was released; its clock eventually reached 33MHz, it could address 4GB of memory, it executed 6 million instructions per second, it integrated 275,000 transistors, and it had more instructions than the 286. November 1985: Microsoft Windows was released, though it was not widely used until version 3.0; it needed the support of DOS, and its similarity to the Mac user interface led to a suit by Apple, whose proceedings were not terminated until August 1997. December 1985: MS-DOS 3.2 and PC-DOS 3.2, the first versions to support 3.5-inch disks, were released, though only at 720KB; 1.44MB support came only with version 3.3. January 1986: Apple released a higher-performance Macintosh, with four megabytes of memory and a SCSI adapter. September 1986: Amstrad announced the cheap and powerful Amstrad PC 1512, with a CGA graphics adapter, 512KB of memory, an 8086 processor, and a 20MB hard drive; it used a mouse and a graphical user interface and was designed for the home. 1987: the Connection Machine supercomputer was released; using parallel processing, it performed 200 million operations per second. 1987: Microsoft Windows 2.0 was released; it did better than the first version but was not much improved. 1987: the English mathematician Michael F. Barnsley found a method of image compression. 1987: the Macintosh II was released, based on the Motorola 68020 at 16MHz, executing 2.6 million instructions per second, with a SCSI adapter and a color adapter. April 2, 1987: IBM introduced the PS/2 systems, originally based on the 8086 processor and the old XT bus and later moving to the 80386; they began the use of 3.5-inch 1.44MB floppy drives and introduced Micro Channel technology, and the series was a great success, shipping in the millions of units. 1987: IBM released VGA technology. 1987: IBM released the 8514/A display adapter of its own design. April 1987: MS-DOS 3.3 and PC-DOS 3.3 were released together with the IBM PS/2, supporting 1.44MB drives and the partitioning of a hard disk into multiple logical drives. April 1987: Microsoft and IBM released the OS/2 operating system, without much success. August 1987: the AD-LIB sound card, the product of a Canadian company, was released. October 1987: Compaq DOS (CPQ-DOS) v3.31 was released, supporting hard disk partitions larger than 32MB. 1988: development of optical computing began, using photons instead of electrons to increase processing speed. 1988: the XMS standard was established. 1988: the EISA standard was established. June 6, 1988: the 80386 SX was released to meet the demand for low-priced machines. July-August 1988: PC-DOS 4.0 and MS-DOS 4.0 were released, with support for EMS memory; because of bugs, 4.01a was launched later. September 1988: the IBM PS/2 Model 30 286 was released, based on the 80286 processor and without the Micro Channel bus, though the other machines in the line continued to use that bus. October 1988: the Macintosh IIx was released, based on the Motorola 68030, still at 16MHz, executing 3.9 million instructions per second and supporting 128MB of RAM. November 1988: MS-DOS 4.01 and PC-DOS 4.01 were released. 1989: Tim Berners-Lee, working at the European particle physics laboratory, created the prototype of the World Wide Web; through hypertext links, even a novice could browse the Internet easily.
This greatly contributed to the development of the Internet. 1989: Philips and Sony released the CD-I standard. January 1989: the Macintosh SE/30 was released, based on the new 68030 processor. March 1989: the E-IDE standard was established, able to support hard drives of more than 528MB and transfer speeds of up to 33.3 MB/s; it was used in many CD-ROM drives. April 10, 1989: the 80486 DX was released, integrating 1.2 million transistors; its successor models reached a clock frequency of 100MHz. November 1989: the Sound Blaster sound card was released. 1990: the SVGA standard was established. March 1990: the Macintosh IIfx was released, based on the 68030 at 40MHz, with a faster SCSI interface. May 22, 1990: Microsoft released Windows 3.0, compatible with MS-DOS programs. October 1990: the Macintosh Classic was released, with support for a 256-color display adapter. November 1990: the first generation of the MPC (Multimedia PC) standard was released, requiring at least an 80286 at 12MHz (later raised to an 80386SX at 16MHz) and an optical drive with a transfer rate of at least 150 KB/s. 1991: the ISA standard was published. May 1991: the Sound Blaster Pro was released. June 1991: MS-DOS 5.0 and PC-DOS 5.0 were released. To promote the development of OS/2, Bill Gates said that DOS 5.0 was the terminator of DOS and that Microsoft would no longer spend its energy there. This version essentially broke through the 640KB memory limit, and it also marked the end of the cooperation between Microsoft and IBM on DOS. 1992: Windows NT was released, able to address 2GB of RAM. April 1992: Windows 3.1 was released. June 1992: the Sound Blaster 16 ASP was released. 1993: the Internet began commercial operation. 1993: the classic game Doom was released. 1993: Novell acquired Digital Research, and DR-DOS became Novell DOS. March 22, 1993: the Pentium was released, integrating more than 3 million transistors; early parts ran at 60-66MHz and executed about 100 million instructions per second. May 1993: the MPC Level 2 standard was released, requiring a CD-ROM transfer rate of 300KB/s and 15 frames per second in a 320 x 240 window. December 1993: MS-DOS 6.0 was released, including the DoubleSpace disk-compression program; after a small company claimed that Microsoft had stolen some of its technology, Microsoft renamed it DriveSpace in the later DOS 6.2. The DOS inside Windows 95 was DOS 7.0, and that in Windows 95 OSR2 was called DOS 7.10. March 7, 1994: Intel released 90 and 100MHz Pentium processors. September 1994: PC-DOS 6.3 was released. October 10, 1994: Intel released the 75MHz Pentium. 1994: Doom II was released, opening up a vast market for PC games. 1994: the Netscape 1.0 browser was released. 1994: Command & Conquer (C&C) was released. March 27, 1995: Intel released the 120MHz Pentium. June 1, 1995: Intel released the 133MHz Pentium. August 23, 1995: Windows 95 was released, significantly different from its previous versions: a pure 32-bit multitasking operating system that no longer ran on top of MS-DOS, though it kept a DOS-like form out of consideration for user habits. This version was a great success. November 1, 1995: the Pentium Pro was released, with frequencies of up to 200MHz, executing 440 million instructions per second and integrating 5.5 million transistors. December 1995: Netscape released JavaScript. 1996: Quake, Civilization 2, Command & Conquer: Red Alert, and a series of other well-known games were released. January 1996: Netscape Navigator 2.0 was released, the first browser to support JavaScript. January 4, 1996: Intel released 150-166MHz Pentium processors, integrating 3.3 million transistors. 1996: Windows 95 OSR2 was released, fixing some bugs and extending some functions.
1997: Grand Theft Auto, Quake 2, Blade Runner, and other famous games were published, and 3D graphics accelerator cards were in great demand. January 8, 1997: Intel released the Pentium MMX, with enhanced support for games and multimedia. April 1997: IBM's Deep Blue computer defeated the human world chess champion, Garry Kasparov. May 7, 1997: Intel released the Pentium II, adding more instructions and more cache. June 2, 1997: Intel released the 233MHz Pentium MMX. 1997: Apple ran into a serious financial crisis, and Microsoft extended a helping hand with an investment of 150 million dollars, on condition that Apple withdraw its complaint charging that the Microsoft Windows interface imitated Apple's; Microsoft noted that Apple had itself imitated the Xerox design. February 1998: Intel released the 333MHz Pentium II, using a 0.25-micron process to improve speed and reduce heat. June 25, 1998: Microsoft released Windows 98; when some people attempted to dismember Microsoft, the company hit back, saying that doing so would hurt U.S. national interests. January 25, 1999: Linux kernel 2.2.0 was released, to high hopes. February 22, 1999: AMD released the 400MHz K6-III, which in at least one test outperformed the Intel Pentium III; it integrated 23 million transistors and used the Socket 7 structure. February 26, 1999: Intel introduced the Pentium III, which used the same Slot 1 structure as the Pentium II and added 70 new SSE instructions to strengthen its 3D and multimedia processing; clock frequencies started above 450MHz and bus speeds at 100MHz or more, on a 0.25μm process with 512KB or more of secondary cache. April 26, 1999: the CIH virus written by the Taiwanese student Chen Ing-hau broke out worldwide, damaging the hardware and software of tens of millions of computers to varying degrees, with direct economic losses running into billions of dollars. May 10, 1999: id Software launched the first test version of Quake III, which over the following period gradually set the standard for competitive FPS games and became a criterion for testing computer hardware performance. June 23, 1999: AMD introduced a processor with a new architecture, the Athlon, whose clock frequency exceeded Intel's for the first time and opened the exciting end-of-century war over processor clock speeds. September 1, 1999: Nvidia introduced the GeForce 256 graphics chip and put forward the new concept of the GPU. October 25, 1999: the Pentium III code-named Coppermine was released, built on a 0.18μm process with 256KB of full-speed L2 cache on the die and 28 million transistors. January 1, 2000: the world waited, but the millennium bug did not explode. February 17, 2000: Microsoft officially released Windows 2000 in the United States. March 16, 2000: AMD officially launched its Athlon processor running at 1GHz, opening the GHz processor war. March 18, 2000: Intel launched its own 1GHz Pentium III. The same day Iridium, with billions of dollars of assets, declared bankruptcy, and the company fully terminated its satellite phone service; the Pentagon eventually won the right to use Iridium, though for what purposes is unknown. April 27, 2000: AMD released the Duron processor and began to attack Intel in the low-end market. May 14, 2000: the "I LOVE YOU" (Love Bug) virus attacked worldwide; in only three days it infected nearly 45 million computers around the world, with economic losses of 2.6 billion dollars.
September 14, 2000: Microsoft launched Windows Me, the millennium edition for home users and the last Microsoft operating system based on the 9x kernel. November 12, 2000: Microsoft announced the slim Tablet PC. November 20, 2000: Intel officially launched the Pentium 4, which used the new NetBurst architecture, reached a bus frequency of 400MHz, and added another 144 new instructions for improved video, audio, other multimedia, and 3D graphics. December 14, 2000: 3dfx announced the sale of all its assets to its rival Nvidia, ending its own legendary history. February 1, 2001: Sega announced its withdrawal from the game hardware market. March 26, 2001: Apple released the Mac OS X operating system, the first major revision of Apple's operating system since its birth in 1984. June 19, 2001: Intel launched Pentium III and Celeron processors with the Tualatin core, its first use of a 0.13-micron process. October 8, 2001: AMD announced the Athlon XP series, using a new core, the 3DNow! Professional instruction set, and OPGA (organic pin grid array) packaging, together with a naming convention based on relative performance (the PR rating); these processors put heavy price pressure on Intel's otherwise excellent parts. October 25, 2001: Microsoft released the Windows XP operating system, and Bill Gates declared that "the DOS era is over"; the release of Windows XP also coincided with a downturn in the global PC hardware market. February 5, 2002: Nvidia released the GeForce4 series of graphics chips, divided into the Ti and MX lines; the GeForce4 Ti 4200 and the GeForce4 MX 440 proved to be two of the most enduring products on the market. May 13, 2002: the long-silent veteran graphics chip maker Matrox officially launched the Parhelia-512 graphics chip, the world's first 512-bit GPU. July 17, 2002: ATI released the Radeon 9700 graphics card, built around the graphics core code-named R300, which for the first time indisputably took the 3D performance crown from Nvidia. November 18, 2002: Nvidia released the GeForce FX, code-named NV30, the first product made on a 0.13-micron process; it used a number of advanced technologies and was hailed as an epoch-making product. January 7, 2003: Intel released the new mobile processing standard Centrino. February 10, 2003: AMD released Athlon XP processors with the Barton core; although they received little media recognition for quite a while after launch, their high price-performance ratio and excellent overclocking ability made the Barton era an unforgettable memory for DIYers. February 12, 2003: FutureMark released 3DMark03, which triggered a crisis of confidence in benchmarking software. In 2004 Intel shifted overall to PCI Express. In 2005 Intel began to promote dual-core CPUs, and in 2006 quad-core CPUs. At IDF in 2007 Intel stunned the world by announcing an 80-core CPU capable of two trillion operations per second. In January 2007 Microsoft released Windows Vista (Windows 6), and in 2009 it released Windows 7.
Computers in China
In the early history of computing tools, Chinese civilization wrote a glorious chapter of invention. As far back as the Shang dynasty, China had created a decimal system of notation, leading the world by more than a thousand years.
In the Zhou dynasty came the invention of the most advanced computing tool of the time, counting rods: small sticks of bamboo, wood, or bone in different colors. For each kind of mathematical problem, the algorithm was usually set out as a series of verses, and the rods were repeatedly laid out and rearranged as the calculation proceeded. Using counting rods, the ancient Chinese mathematician Zu Chongzhi computed the value of pi to lie between 3.1415926 and 3.1415927, a result about a thousand years earlier than in the West. The abacus was another original invention, and the first great invention in the history of computing tools. Light, flexible, easy to carry, and close to people's daily lives, it first appeared around the Han dynasty and had matured by the Yuan dynasty. The abacus not only played a useful role in China's economic development but also spread to Japan, Korea, and Southeast Asia, and, having withstood the test of history, it is still in use. Chinese inventions such as the south-pointing chariot, water-driven mechanisms, the li-recording drum carriage, and the jacquard loom not only made outstanding contributions to automatic mechanical control but also had a direct or indirect influence on the development and evolution of computing tools. Zhang Heng's water-powered armillary sphere, for example, could run automatically in step with the Earth's motion; improved in the Tang and Song dynasties, it became the world's first astronomical clock. The li-recording drum carriage was the world's first automatic counting device. The principle of the jacquard loom indirectly influenced the development of program control in computers. The Eight Trigrams of ancient China, composed of yang (solid) and yin (broken) lines, also had a direct influence on the development of computing technique: Leibniz wrote a research paper on the trigrams, systematically presenting binary arithmetic, and in his view the world's first binary representation was China's Eight Trigrams (a small sketch at the end of this passage makes the correspondence concrete).
After a long silence, with the founding of New China, Chinese computing technology entered a new period of development. Research institutions were established, computing and computational-mathematics programs were set up in institutions of higher learning, and China set out to create its own computer industry. In 1958 and 1959 China built its first small and large vacuum-tube computers. In the mid-1960s China successfully developed a number of transistor computers, along with system software such as an ALGOL compiler. In the late 1960s China began to study integrated-circuit computers, and in the 1970s small integrated-circuit computers went into production. After 1980 China began to focus on developing microcomputer systems and their applications; important progress was also made in large computers and especially in supercomputer technology; computer service sectors were established, and the structure of the computer industry gradually improved. In computer science and technology, China has made real accomplishments in the finite element method, mathematical theorem proving, Chinese information processing, computer system architecture, and software. Computer applications in China have achieved remarkable results in scientific computing and engineering, and research and practice in management and process-control applications are increasingly active.
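The remark that Leibniz saw the Eight Trigrams as a binary representation can be made concrete with the small sketch promised above. The Python snippet below simply reads each of the eight three-line figures as a 3-bit number; treating a solid (yang) line as 1 and a broken (yin) line as 0 is a convention adopted here purely for illustration.

    # Read each trigram as a 3-bit binary number: a solid (yang) line is 1,
    # a broken (yin) line is 0. The eight figures are exactly the numbers 0-7.
    for value in range(8):
        bits = format(value, "03b")                        # e.g. "101"
        lines = ["yang" if b == "1" else "yin" for b in bits]
        print(value, bits, "-".join(lines))

Running it lists the eight patterns from 0 (three yin lines) to 7 (three yang lines), which is all that the claimed correspondence amounts to.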
Notes on computer use
First, switching the machine on and off. Computer equipment must be powered on and off in the proper order, otherwise its working life will be shortened, and improper switching is the culprit behind some failures. The correct order is: when starting up, first connect and switch on the peripheral devices (such as the monitor and printer), then switch on the main unit; when shutting down, do the opposite, first turning off the main unit and then disconnecting the other peripherals.
Second, safety instructions for computer equipment:
(A) Do not place the computer in dusty locations (such as near a window facing the street); if that cannot be avoided, cover it with a dust cover when it is not in use. Do not place it in damp locations (such as next to a water dispenser, where water can easily splash onto the equipment). Pay attention to the ventilation of the main case and keep the computer out of direct sunlight.
(B) Do not plug other electrical appliances, hand warmers, or other personal electrical devices into the power outlet dedicated to the computer, and before leaving, check that all the equipment has been switched off.
(C) Do not move the computer while it is working; the hard disk's structure is particularly delicate, so avoid subjecting the drive to vibration.
(D) Do not plug devices in or unplug them while the computer is working, and do not switch the machine on and off frequently; hot-plugging interfaces (other than USB) can easily burn out interface cards or damage the motherboard.
(E) Guard against static electricity and dust, and do not let water get into the keyboard, mouse, or other equipment.
(F) Back up data and tidy the disk regularly. Because of heavy drive use, viruses, and mistaken operations, data are easily lost, so important data should be backed up regularly to avoid losing months of work for want of a timely backup (a short backup-script sketch appears at the end of these notes). Tidy the disk often and clean out junk files, so that they do not take up too much disk space and interfere with finding and managing normal files; otherwise it is easy to delete important files by mistake or to be unable to find a file when it is urgently needed.
(G) Identify problems and have them repaired promptly, so that the machine always works in good condition; when equipment behaves abnormally, check among other things whether any cabling has come loose.
(H) Guard against computer viruses: install antivirus software, update it regularly, and scan for and remove viruses.
Some further notes on computer use:
1. Beware of being automatically redirected to strange websites. When browsing, do not click on things you do not understand, especially pornographic pictures or advertisements floating over the page; if they block your view, drag the scroll bar down until you can see properly. Avoid installing unnecessary browser plug-ins, Internet assistants, and toolbars; such software can interfere with the normal use of the browser.
2. Do not download and install small, unknown Internet programs.
3. Be careful with e-mail sent by strangers, especially messages with tempting subject lines such as a joke or a love letter, and messages carrying attachments.
4. Scan a USB drive for viruses before using it, and regularly use antivirus software to check the system.
5. Install antivirus software; the system will work better with it in place.
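Point (F) above recommends regular data backups; the script promised there is sketched below. It is a minimal illustration in Python, not a replacement for a full backup tool: it archives one folder into a date-stamped zip file on another drive, and the source and destination paths are placeholders to be replaced with your own.

    # Minimal backup sketch: archive a folder of important files into a
    # date-stamped zip file on another drive. The paths are placeholders.
    import shutil
    from datetime import date
    from pathlib import Path

    source = Path.home() / "Documents"     # folder to back up (example path)
    dest_dir = Path("D:/backups")          # backup drive or folder (example path)
    dest_dir.mkdir(parents=True, exist_ok=True)

    archive = dest_dir / f"documents-{date.today():%Y%m%d}"
    shutil.make_archive(str(archive), "zip", root_dir=source)
    print(f"Backup written to {archive}.zip")

Run regularly, for example from a scheduled task, even a script this small follows the advice above: the backup lives on a different drive from the data it protects, and each run is dated so that older copies are not overwritten.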
Common signs of a computer virus attack:
1. Running speed becomes noticeably slower.
2. Software that used to run now frequently reports out-of-memory errors.
3. Irrelevant messages appear.
4. Particular images are displayed.
5. The hard disk light flashes although no operation is being performed.
6. Windows desktop icons change.
7. The computer suddenly crashes or restarts.
8. E-mail is sent automatically.
9. The mouse pointer is busy by itself.
10. Some normal files suddenly disappear because the virus has forcibly hidden them.
11. Text containing the word "virus" cannot be searched for in the address bar, or such pages are automatically and forcibly closed.
Possible harmful consequences of a computer virus:
1. The drive fails to power up and data are lost.
2. System files are missing or damaged.
3. The file directory is thrown into chaos.
4. Some files are missing or corrupted.
5. Some files are automatically encrypted.
6. The network is paralyzed and cannot provide normal service.
What you need to know about computer maintenance and assembly
The topics to know include: the computer's basic components and their development, and an understanding of the main parts of the system; knowledge of common hardware and peripherals and how to read their performance parameters; hands-on assembly of a computer and installation of system software; installation of system drivers and common software, and basic driver backup; and file and system backup using Ghost.
Before backing up and restoring files or the system with Ghost:
1. First decide which disk is to be restored (for example, the C drive).
2. Then tidy the C drive: move security software that was installed in the wrong place off the C drive, but do not remove the software that relies on the C drive (pinyin input methods, Thunder, office software such as WPS, and so on), which would otherwise have to be reinstalled after every restore.
3. Then run a full virus scan of the C drive with antivirus software, and use an optimization tool (Super Rabbit, Optimize Master, or the like) to clean the registry and remove useless entries.
4. Back up the C drive with Ghost (do not put the backup file on the disk being backed up).
5. When the backup is finished, the C drive can later be restored from it.
6. Restart the machine.
Further topics include: entering the BIOS for basic setup and maintenance; basic system testing, system optimization, and security testing; basic troubleshooting and maintenance; and basic knowledge of computer information security.
Given how widely computer viruses spread and how much damage they do, a computer system that becomes infected may fail, with potentially disastrous consequences. Countries have therefore attached great importance to combating computer viruses and have gradually adopted a series of scientific management methods and preventive measures, usually including the following:
1. Staff should be educated about computer viruses and the harm they do, so that they realize that destroyed programs and data or system failures not only cause huge economic losses to a company or factory but also seriously damage its image and reputation.
2. To guard against virus invasion, system software should be write-protected, and attention should be paid to write-protecting executable programs and data files where necessary.
3. Files that need protection should allow read operations only, to prevent any possible modification, and should be checked regularly for changes.
4. The programs in the computer system should be tested and inspected regularly to detect any viruses.
5. Public-domain software should be used with caution to prevent the transmission and spread of computer viruses.
6. When setting a password, choose random characters as far as possible, so that the password itself is meaningless; do not use birthdays or names as passwords, so as to make them harder to crack (a short password-generation sketch appears at the end of this section).
7. Strengthen the management of floppy disks and avoid booting from a floppy where possible, since booting from a floppy carries a somewhat higher chance of infection than booting from the hard disk.
8. Programs from other departments should be strictly barred from the system; external programs that really must be used should be authorized by management and pass strict inspection and testing, and should be allowed to run on the system only once they are certain to be free of viruses.
9. Do not casually connect the system to external systems, to prevent viruses outside the system from taking the opportunity to invade.
10. Do not allow game software of any kind into the computer system; games offer many opportunities for viruses to spread and are very likely to carry them, so keeping them out prevents viruses from taking advantage.
11. Use antivirus software. A virus is essentially a program, and preventing or removing viruses by means of software is what antivirus software does. Many international and domestic antivirus packages are popular on the market today, such as Scan, Clean, the KILL family, KV200, KV300, and so on. Each has its own characteristics; the KILL series and the KV series are the most widely used, and both offer menu-driven interfaces that are easy to operate.
The price of a modern computer: for a home user, a basic machine costs on the order of 3,000 yuan.
The next generation of computers
The quantum computer, by definition, exploits the extraordinary properties of the quantum-physical world. If quantum computers can be built, they will offer speed increases far beyond what ordinary computers can provide; this bears on the physical simulation of quantum systems and on cryptography. The next-generation computer is, of course, still only at the conceptual stage.
Computer science
In today's world almost every profession is closely linked to the computer, but only certain occupations and disciplines study in depth how computers themselves are made, programmed, and used. The meanings of the academic terms used to describe the different areas of computing research keep changing, and new subjects keep emerging.
Computer engineering: a branch of electronic engineering, concerned mainly with computer hardware and software and the interaction between the two.
Computer science: the traditional name for academic research on computing, concerned mainly with computation and with efficient algorithms for performing specific tasks. The subject helps us determine whether a problem has a solution in the computing domain, how efficient the solution can be, and how to make programs more efficient. Today computer science has spawned many branches, each studying a different kind of problem in depth.
Software engineering: focuses on methodologies and practical methods for researching and developing high-quality software systems, and tries to compress and predict development costs and development cycles.
Computer information systems: the application of computers in a broadly organized (business-oriented) environment.
Many of these disciplines are intertwined with others; a geographic information systems specialist, for example, uses computer technology to manage geographic information.
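Measure 6 in the list above advises choosing random characters as a password rather than names or birthdays; the sketch promised there is given below. It uses only the Python standard library, and the 12-character length and the particular character set are arbitrary illustrative choices, not a security standard.

    # Generate a password from random letters, digits and punctuation,
    # so that the password itself carries no personal meaning.
    import secrets
    import string

    def random_password(length=12):
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(random_password())  # e.g. 'q7#Vd!p2Lr@x' (output differs on every run)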
Encyclopedia
Computers are widely used in aerospace ground systems, for example in civil aviation ticketing, ground simulation tests, spacecraft tracking and control, and aircraft design and manufacture.
Airline passenger reservation. Computers began to be used for reservation services in the 1950s and have since developed into complete, programmable passenger reservation systems. High-speed communication networks link the ticketing offices of major cities around the world to a computer center, forming a computer network that handles long-distance booking. Such a system is generally connected to thousands of terminal devices, such as display terminals, printers, and ticketing machines. Information and data on flight seat availability, timetables, and passengers are stored in large memories, and the relevant data are shown on the display terminals in answer to the various queries.
Ground simulation. High-performance computers are the main equipment of ground simulation tests, which have opened a new path for developing new aircraft. A ground simulation test can replace some or all of the vehicle's ground tests and can partly replace flight tests. Such tests require accurate mathematical models of the physical processes and have a certain generality; they save development cost, shorten the development cycle, and reduce flight-test accidents (see flight simulator; control system simulation technology).
Spacecraft tracking and control. The space tracking and data acquisition network that tracks, measures, and controls spacecraft is the basic guarantee of their operation. This work is time-critical, information-rich, and demands high reliability, so it must rely on large high-speed computers. An aerospace control center is generally equipped with several large high-speed computers and a sophisticated software system. The computer system processes spacecraft orbit and telemetry data in real time and provides the information needed to monitor and control the spacecraft's orbit, attitude, and the operating parameters of its various subsystems. When a spacecraft malfunctions, the computer system can help find the cause of the fault and suggest possible ways of dealing with it.
Aircraft design and manufacture. The design and manufacture of aircraft involve a large number of complex problems in aerodynamics, flight mechanics, structural mechanics, mechanics of materials, manufacturing processes, and other fields, which must be solved with the help of computers. With the appearance of large CNC machine tools and general-purpose drawing software, computer-aided design and manufacture has become an important tool in aircraft design and production. Working interactively with the machine, designers can output design drawings and the information needed for manufacturing directly from the computer, significantly shortening the development cycle, improving product quality, and reducing costs.
Figure 1 (caption): the main cabinet of China's "Galaxy" (Yinhe) supercomputer, which performs on the order of 100 million operations per second.
Airborne computers. Since the 1960s, microelectronics and computer technology have developed rapidly; in particular, since the advent of the microprocessor in the 1970s, digital computers have found an increasingly wide range of applications aboard aircraft.
Special requirements for airborne computers. Aircraft place special requirements on their computers, namely: ① they must withstand harsh environmental conditions, such as a wide operating temperature range (about -60 to +60 °C on aircraft, and up to -55 to +125 °C on missiles), shock overloads of up to 40 g, as well as vibration, humidity, salt fog, electromagnetic interference, and space particle and nuclear radiation; for this reason, carefully selected military-grade electronic devices are generally required. CMOS devices on sapphire substrates have higher radiation resistance and are often used in missile and spacecraft computers.
n.: brain, accounting machine, calculating machine, calculator, electronic device for storing and analysing information fed into it, making calculations, or controlling machinery automatically
pref.: Cyber-
French Expression
n. ordinateur
Thesaurus
to calculate, watch, apparatus, catalog, breviate, horologe, clock, docket, bill, computer, brain, artificial brain