"Electric brain" is the computer's common colloquial name in Chinese. Xu Chi wrote in "The Evergreen Tree of Life": "To work at the command of the computer, acting on nature to make a living, accords both with the protection of nature and with the will of human progress, and transforms human society completely and perfectly."
Sense 2: refers to the electronic computer.
English name: computer. A computer is a machine that processes data according to a sequence of instructions. The related research discipline, which takes data processing as its core, is called computer science, and the field built around it is known as information technology. Older definitions restricted "computer" to equipment built for a particular purpose with only a limited set of functions. For the modern electronic computer, however, the most important property is that, given the correct instructions, any one computer can simulate the behavior of any other computer (limited only by its storage capacity and execution speed). For this reason, the modern electronic computer is also called a general-purpose computer, in contrast to earlier electronic machines.
History
ENIAC is a milestone in the history of computer development. The English word "computer" originally referred to a person engaged in numerical calculation, often with the help of mechanical calculating devices or analog computers. The ancestors of these early devices include the abacus and the Antikythera mechanism, used by the ancient Greeks around 87 BC to calculate planetary motion. As mathematics and engineering flourished in late-medieval Europe, Wilhelm Schickard built Europe's first calculating device in 1623: a "calculating clock" that could add and subtract numbers of up to six digits, rang a bell to announce its answers, and operated by means of rotating gears. In 1642 the French mathematician Blaise Pascal, building on William Oughtred's slide rule, improved the design to handle eight-digit calculations; his machines sold well and became fashionable commodities. In 1801, Joseph Marie Jacquard improved the design of the loom, using a series of punched paper cards as a program for weaving complex patterns. Although the Jacquard loom is not considered a true computer, its appearance was an important step in the development of the modern computer. Charles Babbage was the first person to conceive and design a fully programmable computer, around 1820. However, owing to limited technology, financial constraints, and his inability to resist endlessly revising the design, the machine was never built in his lifetime. By the late 19th century, many technologies of great later significance to computing had emerged, including the punched card and the vacuum tube; Hermann Hollerith designed a tabulating machine that used punched cards for large-scale automatic data processing. In the first half of the 20th century, many increasingly sophisticated single-purpose analog computers were developed to meet the needs of scientific computing.
These computers took mechanical or electronic models of the specific problems they addressed as their basis. In the 1930s and 1940s, computer performance and versatility gradually improved, and the key features of the modern computer were added one after another. (Image caption: the Commodore Amiga 500, produced in the 1980s.) Along this long road of exploration, pinning down the so-called "first electronic computer" proves quite difficult. On May 12, 1941, Konrad Zuse completed his electromechanical machine "Z3", the first computer featuring automatic binary arithmetic and feasible programming, though it was not "electronic." Other notable achievements include: the Atanasoff-Berry Computer, born in the summer of 1941, the world's first electronic computer, which used vacuum-tube calculation, binary values, and reusable memory; the mysterious Colossus computer, demonstrated in Britain in 1943, which despite extremely limited programmability convincingly showed that vacuum tubes were trustworthy and could be reprogrammed electrically; Harvard University's Harvard Mark I; and the decimal-based ENIAC (1946), the first general-purpose electronic computer, whose inflexible design meant that every change of program required rewiring its electrical circuits. ENIAC's developers, recognizing its flaws, went on to improve the design and eventually produced the now familiar von Neumann architecture (the stored-program architecture). This design is the foundation of all computers today. In the late 1940s, a large number of computers based on it began to be developed, Britain's first among them. Although the first to be completed and put into operation was the Small-Scale Experimental Machine (SSEM), the first truly practical machine to be developed was probably EDSAC.
Throughout the 1950s, vacuum-tube computers dominated. On September 12, 1958, the first integrated circuit was demonstrated; Robert Noyce (later a founder of Intel) led the independent invention of the planar integrated circuit shortly afterwards. The microprocessor followed some years later. Computers designed between 1959 and 1964 are generally known as second-generation computers. In the 1960s, transistors replaced vacuum tubes: transistors were smaller, faster, cheaper, and more reliable, which made commercial production possible. Computers from 1964 to 1972 are generally called third-generation computers; they made extensive use of integrated circuits, and a typical model is the IBM 360 series. By the 1970s, IC technology had greatly reduced the production cost of computers, and they began to reach millions of households. Computers after 1972 are called fourth-generation computers, based on large-scale integrated circuits and later on VLSI. On April 1, 1972, the Intel 8008 microprocessor was launched. In 1976, Stephen Wozniak and Steven Jobs founded Apple Computer and launched the Apple I; in May 1977 the Apple II followed. On June 1, 1979, Intel released the 8088 microprocessor (16-bit internally, with an 8-bit external data bus). In 1982, microcomputers began to spread, pouring into schools and homes. In January 1982 the Commodore 64 was released at a price of $595. The 80286 was released in February 1982: clock frequencies up to 20 MHz, a new protected mode, 16 MB of addressable memory, support for over 1 GB of virtual memory, 2.7 million instructions per second, and 134,000 integrated transistors. In November 1990 the first-generation MPC (Multimedia PC) standard was released: a processor of at least 80286/12 MHz (later raised to 80386SX/16 MHz) and a CD-ROM drive with a transfer rate of at least 150 KB/s. On October 10, 1994, Intel released the 75 MHz Pentium processor. On November 1, 1995, the Pentium Pro was released: frequencies up to 200 MHz, 440 million instructions per second, and 5.5 million integrated transistors. On January 8, 1997, Intel released the Pentium MMX.
Its gaming and multimedia capabilities were enhanced. The computer has continued to change rapidly since then; Moore's Law, published in 1965, has held so far, and it is predicted to remain applicable for another 10 to 15 years.
Principle
The main components of a personal computer (PC) are: the monitor; the motherboard; the CPU (central processing unit); main memory (RAM); expansion cards (graphics card, sound card, and so on; some of these may be integrated on the motherboard); the power supply; the optical drive; secondary storage (the hard drive); the keyboard; and the mouse. Although computer technology has developed at a dazzling pace since the birth of the first electronic general-purpose computers in the 1940s, today's computers still basically follow the stored-program design, that is, the von Neumann architecture. It is this structure that made the practical general-purpose computer possible. The stored-program structure divides a computer into four main sections: the arithmetic logic unit (ALU), the control circuitry, the memory, and the input and output devices (I/O). These components are connected by groups of wires (in particular, a group of wires used to carry several kinds of data at different times is called a bus) and are driven by a clock (although other events may also drive the control circuitry). Conceptually, a computer's memory can be viewed as a collection of "cells." Each cell has a number, called its address, and can store a small, fixed-length piece of information. The information may be either an instruction (telling the computer what to do) or data (the object an instruction processes); in principle, any cell can store either. The arithmetic logic unit may be called the computer's brain. It performs two kinds of operation. The first is arithmetic, such as adding or subtracting two numbers; the arithmetic repertoire of an ALU is very limited, and some ALUs do not even support multiplication and division at the circuit level, leaving users to implement them in software. The second kind is comparison: given two numbers, the ALU compares them to determine which is larger.
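The two classes of ALU operation described above can be sketched in a few lines of Python. This is purely an illustration, not any real processor's ALU; the operation names are invented for the example.

```python
def alu(op, a, b):
    """A toy arithmetic logic unit supporting the two operation
    classes described in the text: arithmetic and comparison."""
    if op == "add":        # arithmetic
        return a + b
    if op == "sub":        # arithmetic
        return a - b
    if op == "cmp":        # comparison: which operand is larger?
        if a > b:
            return 1
        if a < b:
            return -1
        return 0
    raise ValueError("unsupported operation: " + op)

print(alu("add", 2, 3))   # 5
print(alu("cmp", 7, 7))   # 0
```

Note that even multiplication is absent here, mirroring the point that some real ALUs leave it to software.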
The input and output system is the means by which a computer receives information from the outside world and returns the results of its operations to it. For a standard personal computer the input devices are the keyboard and mouse, and the output devices are the monitor and printer; many other I/O devices, discussed later, can also be connected to the computer. The control system ties the various parts of the computer together. Its function is to read instructions and data from memory and from the input and output devices, decode each instruction, deliver the correct inputs to the ALU as the instruction requires, tell the ALU what to do with that data, and return the results to the proper place. An important component of the control system is the program counter, which keeps track of the address of the current instruction. Typically the counter is incremented as instructions execute, but a jump instruction can override that rule. Since the 1980s, the ALU and the control unit (together forming the central processing unit, or CPU) have been integrated onto a single integrated circuit, called a microprocessor. The computer's mode of operation is straightforward: in each cycle it fetches an instruction and data from memory, executes the instruction, stores the results, and then fetches the next instruction. This process repeats until a halt instruction is reached. The set of instructions interpreted by the controller, and executed as computations, is a limited, well-defined set of simple instructions.
These generally fall into four categories: (1) data movement (e.g., copy a value from storage cell A to storage cell B); (2) arithmetic and logic operations (e.g., add the contents of cell A and cell B and return the result to cell C); (3) condition testing (e.g., if the value in cell A is 100, continue with the instruction at address F); (4) instruction-sequence changes (e.g., jump unconditionally to the instruction at address F). Like data, instructions inside the computer are represented in binary. For example, 10110000 is a copy-instruction opcode in the Intel x86 family of microprocessors. The instruction set supported by a particular computer is that computer's machine language. Using a popular machine language therefore makes it much easier to run existing software on a new computer, so developers of commercial software usually concern themselves with only one or a few machine languages. More powerful minicomputers, mainframes, and servers may differ: they often divide their work among several CPUs, and with multicore microprocessors, personal computers are moving in the same direction. Supercomputers often differ significantly from the basic stored-program architecture; they may contain thousands of CPUs, though such designs tend to be useful only for specialized tasks. In various computers, some microcontrollers separate program from data, following the Harvard architecture.
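The four instruction categories above, together with the fetch-execute cycle described earlier, can be sketched as a toy stored-program machine. The opcodes and encoding here are invented for illustration; they are not those of any real processor.

```python
def run(memory):
    """Toy von Neumann machine: instructions and data share one
    memory list; pc is the program counter."""
    pc = 0
    while True:
        op, *args = memory[pc]      # fetch and decode
        pc += 1
        if op == "MOV":             # 1) data movement: mem[a] -> mem[b]
            a, b = args
            memory[b] = memory[a]
        elif op == "ADD":           # 2) arithmetic: mem[a] + mem[b] -> mem[c]
            a, b, c = args
            memory[c] = memory[a] + memory[b]
        elif op == "JNZ":           # 3) condition test: jump to f if mem[a] != 0
            a, f = args
            if memory[a] != 0:
                pc = f
        elif op == "JMP":           # 4) sequence change: jump to f
            (f,) = args
            pc = f
        elif op == "HALT":
            return memory

# Program in cells 0-2, data in cells 5-7: note that instructions and
# data live in the same memory, the hallmark of the stored-program design.
mem = [("ADD", 5, 6, 7), ("MOV", 7, 5), ("HALT",), None, None, 2, 3, 0]
print(run(mem)[7])   # 5
```

Reprogramming this machine means writing different tuples into memory, not rewiring anything; that is exactly the advance the von Neumann architecture made over ENIAC.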
Digital circuits
Through permutations and combinations of logic gates, circuits can be designed to accomplish many complex tasks. The adder is one example: a device that, in electronics, computes the sum of two numbers and saves the result. In computer science, a method of achieving a specific intent through such a set of operations is called an algorithm. Eventually, people succeeded in assembling complete ALUs and controllers out of significant numbers of logic gates. How significant? Consider CSIRAC, perhaps the smallest practical vacuum-tube computer: it contained 2,000 tubes, many of them dual-purpose devices, which amounts to some 2,000 to 4,000 combined logic devices. Vacuum tubes were clearly strained when used to build gates at scale. They were expensive, unstable (especially in quantity over long periods), bulky, power-hungry, and not fast enough, although still far faster than mechanical switching circuits. All of this led to their replacement by transistors in the 1960s; transistors are smaller, easier to work with, more reliable, more frugal with energy, and cheaper. Integrated circuits are the basis of today's computers: after the 1960s, discrete transistors began to be replaced by integrated circuits, which place large numbers of transistors, various other electrical components, and the wiring between them on a single piece of silicon. In the 1970s, the ALU and the controller, the two parts that make up the CPU, began to be integrated onto a single chip, known as the "microprocessor." Along this line of development, the number of devices integrated on one chip has grown rapidly: the first integrated circuits contained only dozens of components, while by 2006 an Intel Core Duo processor carried a huge one hundred fifty-one million transistors. Whether tubes, transistors, or integrated circuits, all of them can serve as the "storage" components of the stored-program architecture by employing the flip-flop mechanism, and flip-flops have indeed been used as small-scale high-speed storage.
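The adder mentioned above really can be assembled from nothing but gates. The following sketch models each gate as a Boolean function and chains one-bit full adders into a ripple-carry adder; it is an illustration of the principle, not a hardware description.

```python
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """One-bit full adder assembled purely from logic gates."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add(x, y, width=8):
    """Ripple-carry adder: chain `width` full adders together,
    feeding each stage's carry into the next."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add(23, 42))   # 65
```

An 8-bit adder built this way uses a few dozen gates, which gives a feel for why a whole ALU takes thousands of devices.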
However, flip-flops are almost never used for large-scale data storage in computer designs. The earliest computers used a Williams tube (which wrote data onto the face of a television-style cathode-ray tube with an electron beam) or a mercury delay line (through which sound waves traveled slowly enough to be regarded as "stored" in the line), writing data in and reading it back as it returned. These methods, neither elegant nor efficient, were eventually displaced by magnetic storage. In magnetic core memory, for example, a current creates a persistent weak magnetic field in a ring of iron material, and the data are recovered when that field is read out again. Dynamic random access memory (DRAM) was also invented: an integrated circuit containing a large number of capacitors, in which each capacitor's stored charge holds the data, with the charge level defining the data value. Input and output devices. Input and output devices (I/O) are, broadly, the equipment that brings information from the outside world into the computer and returns results to it. The results may be returned in a form users can perceive directly, or as input to another device under the computer's control: for a computer controlling a robot, the output is essentially the robot itself, that is, the behaviors it is made to perform. Programs. Simply put, a computer program is a sequence of instructions to be executed by the computer. It may be a few instructions performing a simple task, or a complex instruction queue operating on huge amounts of data. Many computer programs contain millions of instructions, many of which may be executed repeatedly. In 2005, a typical personal computer could execute roughly three billion instructions per second.
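A flip-flop of the kind described above can be sketched as an SR latch: two cross-coupled NOR gates that remember one bit. The simulation below is illustrative only (real latches settle through analog feedback, which is approximated here by iterating the gates until the outputs stabilize).

```python
def NOR(a, b):
    return 0 if (a or b) else 1

class SRLatch:
    """Set-reset latch: two cross-coupled NOR gates storing one bit."""
    def __init__(self):
        self.q, self.q_bar = 0, 1

    def step(self, s, r):
        # Iterate the cross-coupled gates until the outputs settle.
        for _ in range(4):
            self.q = NOR(r, self.q_bar)
            self.q_bar = NOR(s, self.q)
        return self.q

latch = SRLatch()
latch.step(1, 0)         # set the bit
print(latch.step(0, 0))  # 1: the bit is remembered with no input applied
latch.step(0, 1)         # reset the bit
print(latch.step(0, 0))  # 0
```

The key behavior is the `step(0, 0)` calls: with neither input asserted, the latch holds its last value, which is what makes it storage rather than mere logic.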
Computers do not usually acquire extra capability by executing exotic, very complex instructions; more often, programmers arrange large numbers of short, relatively simple instructions to run. "Computer software" is not simply another word for "computer program": software is a more inclusive technical term covering the various programs used to complete a task together with all their related materials. A video game, for example, contains not only the program itself but also the pictures, sounds, and other data that make up its virtual environment. In the retail market, an application on a given computer is just one copy of its software among a large number of users. The commonplace example is of course Microsoft's Office suite, which includes a series of interrelated programs meeting general office needs. Building powerful applications out of very simple machine-language instructions inevitably means programming on no small scale: the Windows XP operating system contains some 40 million lines of high-level C++ source code, and it is by no means the largest. Such scale shows the importance of management in the software development process. In practice, a program is broken down into pieces that each programmer can complete within an acceptable length of time. Even so, software development remains slow, unpredictable, and error-prone; software engineering arose to focus on how to accelerate progress and improve efficiency and quality. Libraries and operating systems. Shortly after the birth of the computer, it was discovered that certain operations were being implemented in many different programs, for example the calculation of standard mathematical functions. For efficiency, standard versions of these routines were collected into "libraries" for each program to call.
Many tasks also need to handle a wide range of additional input and output interfaces, and here libraries for making such connections come in handy. In the 1960s, with the industrialization of computing, more and more computers were used to handle the different jobs within an organization. Special software that could automatically schedule and carry out a continuous stream of jobs soon appeared. This software, which controls the hardware and is responsible for job scheduling, is called the "operating system." An early example is IBM's OS/360. Through continual improvement, operating systems introduced time-sharing, a form of concurrency: different users could use the machine "simultaneously" to run their own programs, as though each had a computer to themselves. To this end, the operating system supplies each user with a "virtual machine" that separates the different programs from one another. The range of equipment the operating system must control has also kept growing; one such device is the hard disk. Accordingly, operating systems introduced file management and directory (folder) management, which greatly simplified the use of this kind of permanent storage. Operating systems are also responsible for security control, ensuring that users can access only the files they have been granted permission to use. The last important step in operating system development to date is, of course, providing programs with a standard graphical user interface (GUI). Although there is no technical reason an operating system must provide such an interface, operating system vendors always encourage the software running on their systems to be consistent with, or similar to, the operating system in appearance and behavior. Beyond these core functions, operating systems also bundle a number of other common tools.
Some of these have little significance for managing the computer itself but are useful to users; Apple's Mac OS X, for example, ships with a video-editing application. The operating systems of some smaller-scale computers may have no use for so many features. Early microcomputers, with their limited memory and processing capacity, provided no such extras, and embedded computers either use specially trimmed operating systems or none at all, often implementing whatever operating-system functions they need directly in the application.
Application
Computer-controlled machinery is very common in industry. Many modern mass-produced toys, such as the Furby, would be impossible without cheap embedded processors. At first, the bulky and expensive digital computer was used mainly for scientific computation, especially on military topics. ENIAC was first used to calculate artillery trajectories and to compute neutron cross-section densities in hydrogen-bomb design (many of today's supercomputers still play a huge role in simulating nuclear tests). Australia's first stored-program computer, the CSIR Mk I, was responsible for assessing rainfall in the catchment areas of hydroelectric projects. Some machines were used for decryption, such as Britain's programmable "Colossus." Apart from these early scientific and military applications, the computer spread very quickly into other fields. From the beginning, stored-program computers were closely tied to solving business problems. Even before the birth of IBM's first commercial computer, Britain's J. Lyons company had designed and built LEO for asset management and other commercial purposes. As size and cost kept coming down, computers began to spread within smaller organizations, and the invention of the microprocessor in the 1970s made low-cost computers a reality. In the 1980s, personal computers became fully popular, and repetitive operations such as writing and printing electronic documents and calculating budgets and reports came increasingly to rely on them. As computers became cheap, creative artistic work also began to use them: people used synthesizers, computer graphics, and animation to create and modify sound, images, and video. The video game industry shows that the computer also opened a new chapter in the history of entertainment. Since the miniaturization of the computer, machinery and equipment of all kinds have come to rely on computer control.
In fact, it was the demand for an embedded computer small enough to control the Apollo spacecraft that stimulated a great leap forward in integrated-circuit technology. Today, trying to find a machine that is not computer-controlled is much harder than finding one that is. Perhaps the most famous computer-controlled devices are robots: machines with a more or less human appearance and some subset of human behaviors. In mass production, industrial robots are now an everyday sight, but the fully humanoid robot still lives only in science fiction or the laboratory. Robotics is essentially the physical expression of the field of artificial intelligence. Artificial intelligence is a vaguely defined concept, but what is certain is that the discipline attempts to give computers abilities they do not now have but that human beings naturally possess. Over the years, many new methods have been developed to let computers do what previously only people could do, such as reading text or playing chess. To date, however, progress toward a computer with general, "overall" human intelligence has been very slow.
Networks and the Internet
Since the 1950s, computers have been used as tools for coordinating information among different locations. The U.S. military's SAGE system was the first large-scale system of this kind, and it was followed by a series of special-purpose commercial systems such as SABRE. In the 1970s, American university computer engineering departments began using telecommunications technology to connect their computers. Because this work was sponsored by ARPA, its computer network became known as the ARPANET. The technologies behind the ARPANET then spread and evolved; the network broke out of the universities and the military and grew into today's Internet. The emergence of networks led to a redefinition of the computer's nature and boundaries. As John Gage and Bill Joy of Sun Microsystems put it: "The network is the computer." Computer operating systems and applications have since developed toward giving access to the resources of other computers on the network. Initially such networks were limited to scientists with high-end equipment, but after the 1990s, with the diffusion of e-mail and World Wide Web technology and of low-cost connection technologies such as Ethernet and ADSL, the Internet became ubiquitous. The total number of networked computers now runs into the many millions, and the spread of wireless networking has made the Internet the constant companion of mobile computing; the Wi-Fi technology widely used in notebook computers is representative of wireless Internet applications.
The next generation of computers
The quantum computer, as the name implies, exploits the extraordinary properties of the quantum-physical world. If quantum computers can be built, their speed on certain problems, notably cryptanalysis and the simulation of quantum physics, will leave today's machines far behind. For now, this next generation of computers remains at the conceptual stage.
Computer science
In today's world, almost every profession is closely tied to computers, but only certain professions and academic disciplines study in depth how computers themselves are manufactured, programmed, and used. The terms used to name the research areas within computing keep shifting in meaning, and new subjects emerge constantly. Computer engineering is a branch of electrical engineering that studies computer hardware, software, and the connections between the two. Computer science is the traditional academic title for the study of computers; it mainly investigates computation and the efficient algorithms needed to perform specific tasks. The discipline determines, for a problem in the computing domain, whether it is solvable, how efficiently it can be solved, and how that solution can be made into a more efficient program. Computer science has since spawned many branches, each pursuing a different class of problems in depth. Software engineering focuses on the methodology and practice of developing high-quality software systems, and on compressing and predicting development costs and schedules. Computer information systems studies computer applications in a broad organizational (business-oriented) setting. Many of these disciplines intertwine with others; geographic information systems specialists, for example, apply computer technology to the management of geographic information. Three large organizations worldwide are devoted to computing: the British Computer Society (BCS), the Association for Computing Machinery (ACM), and the Institute of Electrical and Electronics Engineers (IEEE).
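As an illustration of the kind of efficiency question computer science asks (a generic textbook example, not drawn from the source text): the same task, finding an item in a sorted list, can be solved in time proportional to n or to log n, and the second is a "more efficient program" in exactly the sense described above.

```python
def linear_search(items, target):
    """O(n): examine every element in turn."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(items, target):
    """O(log n): requires sorted input; halves the search range each step."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(0, 1000, 2))     # sorted even numbers 0, 2, ..., 998
print(linear_search(data, 500))    # 250
print(binary_search(data, 500))    # 250
```

For 500 elements the difference is invisible; for billions of records it is the difference between a usable system and an unusable one.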
Computer processor
The processor's function is to interpret and execute instructions. Each processor has a unique set of operations, such as add, store, or load; this set of operations is the processor's instruction set. Computer system designers habitually refer to the computer as "the machine," so the instruction set is sometimes called the machine instruction set, and the binary language in which it is written is called machine language. Note that the processor's instruction set should not be confused with the instructions of a high-level programming language such as BASIC or Pascal. An instruction consists of an operation code and operands: the operation code states what function the operation performs, and the operands identify the objects of the operation. For example, for an instruction that adds two numbers together, the processor must know (1) which two numbers, and (2) where the two numbers are. When the numbers are stored in the computer's memory, they have addresses specifying their locations; so if an operand is data in computer memory, its address is called the operand address. The processor's job is to fetch instructions and operands from memory, perform each operation, and then tell memory where to store the result before fetching the next instruction. The processor repeats this step-by-step operation over and over at an astonishing rate. A timer called the clock issues precisely timed signals that supply the processor with regular pulses. The terminology for measuring computer speed is borrowed from electrical engineering: megahertz (MHz), meaning one million cycles per second. In an 8 MHz processor, for example, the computer's clock ticks eight million times per second. A processor consists of two functional components (the control unit and the arithmetic logic unit) and a set of special workspaces called registers.
The control unit. The control unit is responsible for supervising the operation of the entire computer system. In some respects it resembles an intelligent telephone switchboard, because it connects the various parts of the computer system and, according to the needs of the program currently executing, controls each component to carry out its operations. The control unit fetches an instruction from memory, determines its type (decodes it), and then breaks each instruction into a series of simple, small steps or actions. In this way it controls the step-by-step operation of the whole computer system.
The arithmetic logic unit. The arithmetic logic unit (ALU) provides the computer's logical and computational capability. The control unit delivers data to the ALU, which then executes the arithmetic or logic operation required. Arithmetic operations include addition, subtraction, multiplication, and division. Logic operations perform comparisons and make selections based on the results; for example, two numbers may be compared for equality, with processing continuing if they are equal and stopping if they are not.
Registers. A register is a memory cell inside the processor. The control unit uses registers to track the overall state of a running program; they hold information such as the current instruction, the address of the next instruction to execute, and the current operands. In the arithmetic logic unit, registers hold the data items to be added, subtracted, multiplied, divided, or compared, while other registers store the results of arithmetic and logic operations. Register size is an important factor affecting processor speed and performance. The term "word size" (or "word length") describes the size of the operand registers, and it is also used more loosely to describe the width of the pathways into and out of the processor. Today, the word length of general-purpose computers is usually between 8 and 64 bits; a processor whose operand registers hold 16 bits is said to be a 16-bit processor.
2. Instructions. The general-purpose digital computer is a universal digital system.
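The opcode/operand structure just described can be sketched as follows. The 16-bit format here (a 4-bit opcode followed by two 6-bit operand addresses) and the opcode values are invented for illustration; they do not belong to any real instruction set.

```python
# Hypothetical 16-bit instruction word: 4-bit opcode, two 6-bit operand addresses.
OPCODES = {0b0001: "ADD", 0b0010: "SUB", 0b0011: "CMP"}

def encode(opcode_bits, addr_a, addr_b):
    """Pack an opcode and two operand addresses into one instruction word."""
    return (opcode_bits << 12) | (addr_a << 6) | addr_b

def decode(instruction):
    """Split an instruction word back into its opcode and operand addresses,
    as a control unit's decoder would."""
    opcode = OPCODES[(instruction >> 12) & 0xF]
    addr_a = (instruction >> 6) & 0x3F
    addr_b = instruction & 0x3F
    return opcode, addr_a, addr_b

word = encode(0b0001, 5, 9)   # "add the numbers at addresses 5 and 9"
print(decode(word))           # ('ADD', 5, 9)
```

The two questions in the text, "which operation?" and "where are the operands?", correspond exactly to the opcode field and the two address fields of the word.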
A general-purpose digital computer can perform a variety of micro-operations, but it must also be told which specific sequence of operations to perform. The user of the system controls this process through a program: a set of instructions specifying the operations, the operands, and the order of processing. The data-processing task can be changed simply by writing a new program with different instructions, or by supplying different data to the same instructions. A computer instruction is a binary code that specifies a sequence of micro-operations for the computer. Instructions are stored in memory together with the data. The controller reads each instruction from memory and places it in a control register; it then interprets the instruction's binary code and carries out the instruction by issuing a sequence of control signals. Each general-purpose computer has its own unique instruction set. The ability to store and execute instructions (the stored-program concept) is the most important characteristic of a general-purpose computer. The clock rate, also called the "main frequency": the higher the frequency, the faster instructions execute, the shorter each instruction takes, and the higher the machine's information-processing capacity and efficiency. A caution for beginners, however: a processor's operating frequency alone cannot fully determine its performance; design, operating environment, and other factors are all important to good or bad performance. The CPUs used in current mainstream mobile phones run at frequencies including 104 MHz, 160 MHz, 200 MHz, 220 MHz, and 400 MHz. The central processor, or CPU for short (central processing unit), is one of the principal components of the electronic computer. Its main function is to interpret computer instructions and to process the data in computer software; the CPU's design determines the computer's fundamental digital characteristics.
The CPU, storage devices, and input/output devices are the three core components of the modern microcomputer. A CPU manufactured as an integrated circuit is usually called a microprocessor. From the mid-1970s on, single-chip microprocessors replaced almost all other types of CPU, and today the term CPU almost always refers to a microprocessor.

The name "central processing unit" is, broadly, a conventional term for a class of logic machines that can execute complex computer programs. This loose definition easily covers the early computers that existed before the term "CPU" came into common use. In any case, the name and its abbreviation have been in widespread use in the computer industry since at least the early 1960s (Weik 1961). Although the physical form, design, manufacture, and specific tasks of central processors have changed dramatically since the earliest examples, their basic principle of operation has not changed.

Early CPUs were custom-designed for particular, often large, computers. This expensive approach of customizing a CPU for a specific application has largely given way to cheap, standardized processor families suited to one or more general purposes. The trend toward standardization began in the era of discrete-transistor mainframes and minicomputers and accelerated with the advent of the integrated circuit (IC). The IC allowed increasingly complex CPUs to be designed and manufactured within a very small space (on the order of micrometres). The standardization and miniaturization of CPUs have spread these digital devices through modern life far beyond the limited applications of dedicated computing machines. Modern microprocessors appear in everything from cars to mobile phones to children's toys.
Composition of the CPU. Arithmetic unit: performs arithmetic and logic operations (components: arithmetic logic unit, accumulator, register set, path converter, data bus). Controller: handles reset and enable (components: program counter, instruction register, instruction decoder, status register, timing generator, micro-operation signal generator). A computer is a combination of logic circuits and timing circuits.

Other forms of computer: 1. notebook computers (laptops); 2. pocket PCs; 3. supercomputers; 4. desktop computers.

Websites for learning about computers: 1. Nebula Tutorial http://www.gonet8.com/ 2. Computer Tutorial Home http://www.pcppc.cn/ 3. Golden Eagle Network Tutorial http://www.xjke.com/ 4. Pacific Internet http://www.pconline.com.cn/
Introduction
English name: computer. Commonly known in Chinese as 电脑 ("electric brain"). Common abbreviation: PC (personal computer).
Computer components
A PC can generally be divided into two parts: the software system and the hardware system.

Software system. The software system includes the operating system and application software. Application software such as industry management software and ERP software is an essential tool of the IT industry and of the computer industry's development.

Hardware system. The hardware system includes the case (power supply, hard drive, memory, motherboard, CPU, optical drive, sound card, network card, video card), monitor, keyboard, and mouse (optionally also headphones, speakers, a printer, and so on). Home computer motherboards generally have an onboard sound card and network card, and some motherboards integrate a graphics card as well.

CPU. CPU stands for "Central Processing Unit". Its role in the PC is equivalent to that of the brain in the body: all computer programs are run by it.

Motherboard. The motherboard (Mother Board, or mainboard) is a circuit board densely covered with circuitry. It can be regarded as the PC's nervous system: the CPU, memory, video card, sound card, and so on are installed directly on the motherboard, while the hard drive, floppy drive, and other components connect to it via cables.

Host. The components placed inside the computer case are collectively called the "host". It is the most important part of the computer; the motherboard, CPU, hard drive, and other major components are all inside the host.

Video card. The video card is the main component connecting the monitor to the PC motherboard. It is inserted into an expansion slot on the motherboard. It is chiefly responsible for converting the display signals sent by the host into electrical signals the monitor can understand, letting the monitor show what the PC is doing. The video card also has its own memory, called "video memory", whose amount directly affects display quality, such as clarity and color richness.
Monitor. The monitor is one of the computer's output devices, similar in appearance to a television but with a higher display resolution than the average TV.

Disks and disk drives. Disks are one form of the PC's external storage and come in two kinds: hard disks and floppy disks. Both use magnetic media to store data, hence the name "disk". To use a disk, the PC must place it in a dedicated device: the disk drive.

Hard disks and hard drives. "Hard disk" in English is Hard Disk. Because the disk itself is built into the drive, the terms hard disk and hard drive are generally used interchangeably. The typical hard drive is 3.5 inches in size. Hard drive capacity is generally measured in MB (megabytes) and GB (gigabytes); common drives range from several megabytes to several gigabytes. The C drive and D drive one usually sees are not exactly the same thing as the hard disk itself. The real hard disk is technically called the "physical disk"; under the DOS operating system a physical disk is partitioned into C, D, E, and so on: a number of "logical drives".

Keyboard and mouse.

DVD. DVD stands for digital versatile disc; a DVD drive is a device for reading DVD discs. A DVD disc holds 4.7 GB, about seven times the capacity of a CD-ROM, enough to store a 133-minute film with Dolby Digital surround audio. DVD discs are divided into DVD-ROM, DVD-R (write-once), DVD-RAM (rewritable many times), and DVD-RW (readable and rewritable). Current DVD drives mostly use the EIDE interface and, like CD-ROM drives, can be connected to IDE, SATA, or SCSI interfaces.
The development of computer
Humanity's first electronic computer, ENIAC, was originally designed as a dedicated computer for calculating artillery ballistics. By rewiring its plug-in panels, however, it could be set up to solve various problems, and it thus became a general-purpose machine; one modified configuration was used in the development of the hydrogen bomb. ENIAC's programs were set by external plugboards: whenever a new calculation was needed, its wiring had to be reconnected. A computation taking only minutes could require hours, or one to two days, of preparation to wire up, which was a fatal weakness. Another weakness was its very small storage capacity.

On February 15, 1996, the 50th anniversary of ENIAC's debut, U.S. Vice President Al Gore pressed the start button of the great machine, dormant for 40 years, at an anniversary ceremony held at the University of Pennsylvania. Addressing the scientists who took part in the ENIAC project and are still alive today, Gore said: "I would like to congratulate the pioneers who developed this computer." Two rows of lights on ENIAC flashed in sequence to display 46, marking its birth in 1946, and then 96, marking 50 years of the computer age.

The evolution of the computer. In 1642-1643, Blaise Pascal, to help his tax-collector father, invented an adding machine operated by gears, called the "Pascaline"; this was the first mechanical adding machine. In 1666, Samuel Morland in England invented a mechanical calculating machine that could add and subtract. In 1671, the famous German mathematician Gottfried Wilhelm Leibniz designed the first mechanical calculator able to perform all four operations: addition, subtraction, multiplication, and division. In 1673, Leibniz built a calculating machine based on a stepped cylindrical gear, called the "Stepped Reckoner", which could multiply by repeated automatic addition.
In 1694, Leibniz improved Pascal's Pascaline, creating a machine that could multiply, still operated with gears and dials. In 1773, Philipp Matthäus Hahn built and sold a small number of calculating machines precise to 12 digits. In 1775, the third Earl of Stanhope invented a multiplying calculator similar to Leibniz's. In 1786, J. H. Mueller designed a difference engine, but unfortunately had no funding to build it. In 1801, Joseph-Marie Jacquard's loom used punched cards connected in sequence to control the weaving pattern. In 1854, George Boole published "An Investigation of the Laws of Thought", on symbols and logical reasoning, which later became a basic concept of computer design. In 1858, the first telegraph line crossed the Atlantic and provided a few days of service. In 1861, a transcontinental telegraph line connected the Atlantic and Pacific coasts. In 1876, Alexander Graham Bell invented the telephone and patented it. In 1876-1878, Lord Kelvin built a harmonic analyzer and a tide-predicting machine. In 1882, William S. Burroughs resigned from his job as a bank clerk to concentrate on inventing an adding machine. In 1889, Herman Hollerith's electric tabulating machine performed well in a competition and was used in the 1890 census. Hollerith applied the concept of the Jacquard loom to computation: he stored data on cards, then fed the cards into the machine to compile the results. A census count that would otherwise have taken a decade was completed in just six weeks. In 1893, the first four-function calculator was invented. In 1895, Guglielmo Marconi transmitted radio signals. In 1896, Hollerith founded the Tabulating Machine Company. In 1908, the British scientist Campbell Swinton described an electronic scanning method and foresaw using the cathode-ray tube to make television.
In 1911, Hollerith's Tabulating Machine Company merged with two other companies to form the Computing-Tabulating-Recording Company (CTR), which in 1924 changed its name to International Business Machines Corporation (IBM). Also in 1911, the Dutch physicist Kamerlingh Onnes discovered superconductivity at Leiden University. In 1931, Vannevar Bush invented a machine for solving differential equations, able to handle the complex differential problems that gave mathematicians and scientists headaches. In 1935, IBM introduced the "IBM 601", a punched-card machine with an arithmetic unit that could compute a multiplication in one second. It played a large role in both scientific and commercial computing; about 1,500 were made. In 1937, Alan Turing conceived of a "universal machine" that could execute any algorithm, forming the basic concept of "computability". Turing's concept surpassed other inventions of the same kind because it was built on the idea of symbol processing. In November 1939, John Vincent Atanasoff and John Berry built a 16-bit adding device, the first calculating machine to use vacuum tubes. In 1939, Zuse and Schreyer began building the "V2" (later called the Z2), which kept the Z1's mechanical storage but added a new arithmetic unit built from relay logic; after Zuse completed the draft, the plan was interrupted for a year. In 1946, the first general-purpose electronic computer, ENIAC, was born in the United States, though it consumed enormous amounts of power. In 1959, the first small scientific computer, the IBM 620, was developed. In 1960, the IBM 1401 data-processing system was developed. In 1961, the programming language COBOL appeared. In 1961, the design of the first time-sharing computer system was completed at the Massachusetts Institute of Technology. In 1963, the BASIC language appeared.
In 1964, the IBM System/360 series, a third-generation computer, was built. In 1965, Digital Equipment Corporation introduced the first minicomputer, the PDP-8. In 1969, IBM successfully developed the 96-column card machine and the System/3 computer system. In 1970, the IBM System/370 computer series was built. In 1971, the University of Illinois completed the design of the ILLIAC IV supercomputer. In 1971, Intel developed the first microprocessor, the 4004. In 1972, microprocessor chips entered mass production and sale. In 1973, IBM developed the first floppy disk. In 1975, the Altair 8800 microcomputer appeared. In 1977, Commodore announced the all-in-one PET 2001 microcomputer. In 1977, the TRS-80 microcomputer was born. In 1977, the Apple II microcomputer was born. In 1978, very-large-scale integration (VLSI) began. In 1978, magnetic-bubble memory was applied in commercial computers. In 1979, Sharp announced the first portable microcomputer. In 1982, microcomputers began to spread, pouring into schools and homes. In 1984, Japan's computer industry set out to develop the "fifth-generation computer": a computer with artificial intelligence. 1984: DNS (the Domain Name System) was released, and more than 1,000 hosts ran on the interconnected network. 1984: Hewlett-Packard released an excellent laser printer, and HP also stayed ahead in inkjet printer technology. January 1984: Apple's Macintosh was released, based on the Motorola 68000 microprocessor and able to address 16 MB. August 1984: MS-DOS 3.0, PC-DOS 3.0, and the IBM AT were released; the AT used the ISA standard and supported 1.2 MB high-density floppy disks. September 1984: Apple released a Macintosh with 512 KB of memory, but with no improvements in other areas. End of 1984: Compaq began developing the IDE interface, which transferred data faster and was adopted by many of its peers; the later extension EIDE could support drives up to 528 MB with still faster data transfer. 1985: Philips and Sony launched the CD-ROM drive.
1985: the EGA standard was introduced. March 1985: MS-DOS 3.1 and PC-DOS 3.1 were released, the first DOS versions to provide partial network support. October 17, 1985: the 80386 DX was released; its clock reached 33 MHz, it could address 1 GB of memory, it had more instructions than the 286, it executed 6 million instructions per second, and it integrated 275,000 transistors. November 1985: Microsoft Windows was released; it was not widely used until version 3.0. It required DOS support and had an interface similar to the Mac's, for which Apple sued; the proceedings did not end until August 1997. December 1985: MS-DOS 3.2 and PC-DOS 3.2 were released, the first versions to support 3.5-inch disks, though only up to 720 KB; version 3.3 added support for 1.44 MB. January 1986: Apple released a higher-performance Macintosh, with four megabytes of memory and a SCSI adapter. September 1986: Amstrad announced the cheap and powerful Amstrad PC 1512, with a CGA graphics adapter, 512 KB of memory, an 8086 processor, and a 20 MB hard drive; it used a mouse and a graphical user interface and was designed for the home. 1987: Microsoft Windows 2.0 was released. 1988: the EISA standard was established. 1989: Tim Berners-Lee of the European particle physics laboratory (CERN) created the prototype of the World Wide Web; through hypertext links, even novices could browse the Internet easily, which greatly advanced the Internet's development. March 1989: the EIDE standard was established, supporting hard drives over 528 MB and transfer rates up to 33.3 MB/s, and it was also used by many CD-ROM drives. April 10, 1989: the 80486 DX was released; the processor integrated 1.2 million transistors, and its successor models reached a clock frequency of 100 MHz. November 1989: the Sound Blaster sound card was released. May 22, 1990: Microsoft released Windows 3.0, with an MS-DOS compatibility mode. November 1990: the first-generation MPC (Multimedia PC) standard was released.
The standard required at least an 80286/12 MHz processor (later raised to an 80386SX/16 MHz) and a CD-ROM drive with a transfer rate of at least 150 KB/sec. 1991: the ISA standard was released. June 1991: MS-DOS 5.0 and PC-DOS 5.0 were released. To promote the development of OS/2, Bill Gates declared DOS 5.0 the terminator of DOS, saying no more energy would be spent on it. This version broke the 640 KB base-memory limit, and it also marked the end of Microsoft's collaboration with IBM on DOS. 1992: Windows NT was released, able to address 2 GB of memory. April 1992: Windows 3.1 was released. 1993: the Internet began commercial operation. 1993: the classic game Doom was released. March 22, 1993: the Pentium was released; the processor integrated over 3 million transistors, and early versions ran at core frequencies of 60-66 MHz, executing 100 million instructions per second. May 1993: the MPC Level 2 standard was released, requiring a CD-ROM transfer rate of 300 KB/s and video at 15 frames per second in a 320 × 240 window. March 7, 1994: Intel released 90-100 MHz Pentium processors. 1994: the Netscape 1.0 browser was released. 1994: the famous real-time strategy game Command & Conquer (C&C) was released. March 27, 1995: Intel released the 120 MHz Pentium. June 1, 1995: Intel released the 133 MHz Pentium. August 23, 1995: the purely 32-bit multitasking operating system Windows 95 was released. It differed markedly from previous versions, moving completely away from MS-DOS while retaining a DOS mode to accommodate user habits. Windows 95 was a great success. November 1, 1995: the Pentium Pro was released, clocked up to 200 MHz, executing 440 million instructions per second and integrating 5.5 million transistors. December 1995: Netscape released JavaScript. January 1996: Netscape Navigator 2.0 was released, the first browser to support JavaScript. January 4, 1996: Intel released 150-166 MHz Pentium processors, integrating 3.1-3.3 million transistors. 1996: Windows 95 OSR2 was released, fixing some bugs and extending some functions.
1997: famous games such as Grand Theft Auto, Quake 2, and Blade Runner were released, driving the rapid rise of 3D graphics accelerator cards. January 8, 1997: Intel released the Pentium MMX, with enhanced gaming and multimedia capabilities. April 1997: IBM's Deep Blue computer defeated the human world chess champion, Garry Kasparov. May 7, 1997: Intel released the Pentium II, adding more instructions and more cache. June 2, 1997: Intel released the 233 MHz Pentium MMX.
Principle
The main structure of a personal computer (PC). Host: motherboard, CPU (central processing unit), main memory (RAM), expansion cards (video card, sound card, and so on, some of which can be integrated into the motherboard), power supply, optical drive, secondary storage (hard drive), floppy drive. Peripherals: monitor, keyboard, mouse (speakers, camera, external modem, and so on).

Although computer technology has developed at a dazzling pace since the birth of the first electronic general-purpose computers in the 1940s, today's computers still basically follow the stored-program design, the von Neumann architecture, which made the practical general-purpose computer achievable. The stored-program architecture divides a computer into four main sections: the arithmetic logic unit (ALU), the control circuitry, the memory, and the input and output devices (I/O). These components are connected by sets of wires (buses) and driven by a clock.

Conceptually, a computer's memory can be regarded as a collection of "cells". Each cell has a number, called its address, and can store a small, fixed-length piece of information, which may be either an instruction or data; in principle, any cell can store either. Since the 1980s, the ALU and the control unit have been integrated onto a single integrated circuit, called a microprocessor.

The computer's mode of operation is very straightforward: in each cycle, the computer fetches instructions and data from memory, executes the instruction, stores the data back, and then fetches the next instruction. This process repeats until a halt instruction is reached. The instructions interpreted by the controller and executed by the ALU form a limited, well-defined set of simple operations.
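The fetch-decode-execute cycle just described can be shown with a minimal stored-program machine, in which instructions and data share a single memory, exactly as in the von Neumann architecture. The four-instruction set (LOAD/ADD/STORE/HALT) and the memory layout are invented for this sketch, not any real machine's design.

```python
def run(memory):
    """Repeatedly fetch, decode, and execute instructions until HALT.
    `memory` maps cell addresses to either (opcode, operand) tuples
    (instructions) or plain integers (data) -- both live in one memory."""
    acc, pc = 0, 0                # accumulator and program counter
    while True:
        op, arg = memory[pc]      # fetch the instruction at the program counter
        pc += 1                   # advance to the next instruction
        if op == "LOAD":
            acc = memory[arg]     # decode + execute: read a data cell
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc     # write the result back to memory
        elif op == "HALT":
            return memory

# Program in cells 0-3, data in cells 4-6: compute memory[4] + memory[5].
mem = {0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
       4: 30, 5: 12, 6: 0}
print(run(mem)[6])  # 42
```

Because instructions and data sit in the same memory, a different program, or different data for the same program, changes the task without changing the machine, which is the stored-program concept in miniature.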
These instructions can generally be divided into four categories: (1) data movement, (2) arithmetic and logic operations, (3) condition testing, and (4) sequence changes (jumps).

Instructions, like data, are represented inside the computer in binary. For example, 10110000 is the code of one copy instruction in the Intel x86 family of microprocessors. The instruction set supported by a particular computer is that computer's machine language. Consequently, adopting a machine language that is already popular makes it much easier to run existing software on a new computer. People who develop commercial software for particular models therefore usually concern themselves with only one or a few machine languages.

More powerful minicomputers, mainframes, and servers may differ from this picture: they often share tasks among different CPUs for execution, and today multi-core microprocessors are taking personal computers in the same direction. Supercomputers often go further, using thousands of CPUs, though such designs seem useful only for specific tasks. Some microcontrollers found in various kinds of computers use the Harvard architecture, which keeps program and data separate.
Computer digital circuit
The physical implementations of these conceptual designs have varied widely. A stored-program computer can be mechanical, like Babbage's, or based on digital electronics. Digital circuits can implement binary arithmetic and logic operations using devices such as relays, and scholars soon pointed out that vacuum tubes could replace relay circuits. Originally used as amplifiers in radio circuits, tubes came to be used more and more as fast switches in digital electronic circuits: when one pin of the tube is energized, current can pass freely between the other two terminals. Through arrangements and combinations of logic gates, many complex tasks can be accomplished. The adder is one example: a device that electronically adds two numbers together and saves the result. In computer science, such a set of operations designed to achieve a specific purpose is called an algorithm. Eventually, people succeeded in assembling complete ALUs and control units out of considerable numbers of logic gates. To see how considerable, look at CSIRAC, perhaps the smallest practical tube computer: it contained 2,000 tubes, many of them dual-purpose devices, amounting to some 2,000 to 4,000 logic devices in all.

Vacuum tubes were clearly inadequate for building gates at large scale: expensive, unstable, bulky, and power-hungry, and not fast enough, though far faster than mechanical switching circuits. All this led to their replacement by transistors in the 1960s; transistors are smaller, easier to work with, more reliable, more energy-efficient, and cheaper. After the 1960s, discrete transistors in turn gave way to integrated circuits, the basis of today's computers, which place large numbers of transistors, various other electrical components, and their wiring onto a single piece of silicon.
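The adder built from logic gates, mentioned above, can be sketched directly: AND, OR, and XOR combine into a full adder for one bit position, and chained full adders (a ripple-carry adder) add whole binary numbers.

```python
def full_adder(a, b, carry_in):
    """One bit position of an adder, built only from gate operations."""
    s = a ^ b ^ carry_in                        # XOR gates produce the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND/OR gates produce the carry
    return s, carry_out

def add_bits(x_bits, y_bits):
    """Ripple-carry addition of two equal-length bit lists (least-significant
    bit first); the carry out of each stage feeds the next stage."""
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)   # final carry becomes the top bit
    return result

# 3 (binary 011) + 5 (binary 101), least-significant bit first:
print(add_bits([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1], i.e. binary 1000 = 8
```

Whether the gates are relays, vacuum tubes, or transistors, the logic is identical; only the physical switching device changes, which is exactly the point of the paragraph above.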
In the 1970s, the CPU, composed of the ALU and the controller, began to be integrated onto a single chip, known as the "microprocessor". Along the history of integrated-circuit development, the number of devices integrated on a chip has grown rapidly: the first integrated circuits contained only dozens of components, while by 2006 an Intel Core Duo processor held the huge total of 151 million transistors.
Input and output devices
Input and output devices (Input/Output, I/O) are, broadly, the devices that bring information from the outside world into the computer and return the results to the outside world. The results may be returned in a form users can perceive directly, or as input to another device under the computer's control: for a robot, the controlling computer's output is essentially the robot's own behavior, such as the movements it makes.

The first generation of computers had very limited input and output equipment. The usual input device was a punched-card reader, used to bring instructions and data into memory, and the output device for storing results was usually magnetic tape. As technology advanced, the variety of input and output devices grew. Taking the personal computer as an example: the keyboard and mouse are the user's main tools for feeding information directly into the computer, while the monitor, printer, speakers, and headphones return the results. Many other input devices accept other kinds of information; a digital camera, for instance, can input images.

Among input and output devices, two classes are particularly notable. The first is secondary storage, such as hard disks, optical discs, and other slow but high-capacity devices. The second is network access devices, through which direct data transfer between computers has greatly enhanced the computer's value. Today, the success of the Internet rests on tens of millions of computers transferring data of every kind among one another.
Program
Simply put, a computer program is a sequence of instructions executed by the computer. It may be just a few instructions performing a simple task, or a complex instruction queue operating on a huge amount of data. Many computer programs contain millions of instructions, many of which may be executed repeatedly. In 2005, a typical personal computer could execute about 3 billion instructions per second. Computers do not usually gain their power from executing individually complex instructions; rather, programmers arrange vast numbers of relatively simple, short instructions.

In general, programmers do not write instructions for the computer directly in machine language; doing so is time-consuming, inefficient, and riddled with errors. Instead, programmers usually write programs in "higher-level" languages, which special computer programs, such as interpreters or compilers, then translate into machine language. Some programming languages, such as assembly language, look very close to machine language and are considered low-level. Others, such as Prolog, are built on abstract principles and ignore the operational details of the machine entirely; these can be described as high-level languages. For a specific task, the language should be selected according to the nature of the problem, the programmer's skills, the available tools, and the customer's needs, of which the customer's needs are the most important.
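The translation step described above can be made concrete: one high-level statement typically expands into several of the simple instructions the computer actually executes. The three-instruction listing below is illustrative, not any real machine's encoding.

```python
# A single high-level statement...
high_level = "c = a + b"

# ...translated into simple accumulator-machine instructions:
low_level = [
    ("LOAD", "a"),    # fetch a into the accumulator
    ("ADD", "b"),     # add b to it
    ("STORE", "c"),   # write the result to c
]

def execute(program, variables):
    """Run the low-level instruction list against a variable store."""
    acc = 0
    for op, name in program:
        if op == "LOAD":
            acc = variables[name]
        elif op == "ADD":
            acc += variables[name]
        elif op == "STORE":
            variables[name] = acc
    return variables

print(execute(low_level, {"a": 2, "b": 3})["c"])  # 5
```

A compiler performs this kind of expansion once, ahead of time, while an interpreter performs it on the fly as the program runs; both end at the same sort of simple instruction stream.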
Library and operating system
Shortly after the birth of the computer, it was discovered that certain operations were being implemented in many different programs, for example the calculation of standard mathematical functions. For efficiency, standard versions of these routines were collected into a "library" for each program to call. Many tasks also involved handling a wide range of input and output interfaces, and here too libraries came in handy.

In the 1960s, with the industrialization and spread of computers, more and more computers were used to process different jobs within organizations. Soon, special software appeared that could automatically schedule jobs and carry them out in succession. This software, which controls the hardware and is responsible for job scheduling, is called an "operating system". An early example is IBM's OS/360.

With constant improvement, operating systems next introduced the time-sharing mechanism: concurrency. This allows different users to use the machine "simultaneously" to run their own programs, as if each had a computer of their own. To this end, the operating system provides each user with a "virtual machine" that keeps the different programs separate. The range of devices operating systems must control also kept growing, one of them being the hard disk. Operating systems consequently introduced file management and directory (folder) management, which greatly simplified the use of such permanent storage devices. The operating system is also responsible for security control, ensuring that users can access only the files they are permitted to access. The last important step in operating-system development so far, of course, has been providing programs with a standard graphical user interface (GUI).
Although there is no technical reason why an operating system must provide these interfaces, operating-system vendors always encourage the software that runs on their systems to match, or resemble, the operating system in appearance and behavior.
Computer applications
With their increasing popularity, computers have entered almost every industry and play an important role; they have become an indispensable tool for the normal operation of today's society. The computer occupies such an important position in modern life, and our dependence on it is so high, that it is hard to imagine what life without computers would look like.

Applications:

1. Numerical computation. Scientific research and engineering design involve a great many tedious, complex numerical problems, which manpower alone is often unequal to solving. High-speed, high-precision solution of complex mathematical problems is the computer's strength, and to this day numerical computation remains an important area of computer application.

2. Data processing. This is the use of computers to process, manage, and operate on data in various forms, generally for some management purpose. For example, the financial sector uses computers for bill processing, account processing, and settlement; personnel departments use computers to create and manage personnel files; and so on. Unlike numerical computation, data processing focuses on the comprehensive analysis and handling of large volumes of data. It generally involves no complex mathematics, but often requires large amounts of data to be processed and stored in a short time.

3. Real-time control. Also known as process control, this is the use of a computer to control a continuously operating object automatically. The computer must collect detection signals promptly, compute on them, and automatically send out adjustment signals to regulate the controlled object. In process-control applications, the computer's input and output are always in real time.
For example, during a missile's launch and guidance, its flight parameters are continuously measured, rapidly computed and processed, and control signals are issued continuously to steer the missile's flight until it reaches the set target. Real-time control is widely used in industrial automation, the military, and elsewhere.

4. Computer-aided design (CAD). This is the use of computers for product design. The technique is widely applied in the design of machinery, ships, aircraft, and large-scale integrated-circuit layouts. CAD can improve design quality, shorten the design cycle, and raise the level of design automation. For example, a computer-aided drafting system is a general-purpose software package providing basic drawing elements and commands, on top of which different departments can develop their own application libraries. This frees engineers and technicians from heavy repetitive work, accelerating product development and improving product quality. CAD technology has developed rapidly, its scope of application keeps expanding, and it has given birth to many new technical branches, such as computer-aided manufacturing (CAM) and computer-assisted instruction (CAI).

5. Pattern recognition. This is an application in which computers simulate human intelligence. For example, using the principles of spectrum analysis, a computer can decompose and synthesize human speech, so that a machine can recognize voices and produce human-like synthesized speech. Likewise, computers can be used to recognize types of images, and even human fingerprints.

6. Entertainment and games. In the home, entertainment and games are almost the main uses of the home computer, with video playback and gaming its chief forms. With their strong performance and easy connection to the Internet, computers have taken over from the TV-connected console as a major gaming platform in the home.
At the same time, the home computer has been evolving toward the home theater. Especially as high-definition video becomes more common, a home computer serving as the audio and video media center is the most capable and most affordable approach, giving rise to the new concept of the HTPC (home theater PC). In summary, the computer is an electronic device that takes in information of various kinds — numbers, text, images, signals, and so on — processes it automatically and efficiently, and outputs the results.
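The numerical-calculation application area described above can be illustrated with a small sketch. Newton's method for approximating a square root is a classic example of the iterative numerical work computers excel at; the function name and tolerance below are invented for illustration, not taken from any particular library.

```python
# Newton's method: a classic numerical-calculation task of the kind
# described in application area 1 above. All names are illustrative.

def newton_sqrt(a, tolerance=1e-9):
    """Approximate the square root of a non-negative number a
    by repeatedly averaging x and a/x."""
    x = a if a > 1 else 1.0          # initial guess
    while abs(x * x - a) > tolerance:
        x = (x + a / x) / 2          # Newton iteration step
    return x

if __name__ == "__main__":
    print(newton_sqrt(2))            # close to 1.41421356...
```

Each iteration roughly doubles the number of correct digits, which is why even early machines could produce high-precision results from a handful of simple arithmetic steps.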
Translated by Google
Computer processor
1. The processor interprets and executes instructions. Each processor has a unique set of operations, such as ADD, STORE, or LOAD, called its instruction set. Computer system designers habitually speak of the computer as "the machine," so the instruction set is sometimes called the machine instruction set, and the binary language in which it is written is called machine language. (Note: do not confuse the processor's instruction set with the statements of a high-level programming language such as BASIC or Pascal.) An instruction consists of an operation code and operands: the operation code specifies the function to be performed, and the operands identify the objects of the operation. For example, for an instruction that adds two numbers together, the processor must know: 1. What are the two numbers? 2. Where are the two numbers? When the numbers are stored in the computer's memory, each has an address specifying its location; when operand data resides in memory, the instruction refers to it by its operand address. The processor's job is to fetch instructions and operands from memory, perform each operation, and then signal memory to send it the next instruction. The processor repeats these steps over and over at astonishing speed. A timer called the clock accurately issues timing signals that give the processor its regular pulse. The unit for measuring a computer's speed is taken from electronic engineering: the megahertz (MHz), meaning one million cycles per second. An ordinary clock ticks once per second, while in an 8 MHz processor the computer's clock "ticks" eight million times per second. The processor consists of two functional components — the control unit and the arithmetic logic unit — plus a set of special workspaces called registers. The control unit is responsible for supervising the operation of the entire computer system.
In some respects the control unit resembles an intelligent telephone switchboard, because it links together the various functional components of the computer system and, based on the needs of the currently executing program, controls each component so that it completes its operations. The control unit fetches an instruction from memory, determines its type (decodes it), and then breaks each instruction down into a series of small, simple steps or actions; in this way the entire computer system operates step by step under its control. The arithmetic logic unit (ALU) provides the computer's computational and logical capabilities. The control unit delivers data to the ALU, which then executes the required arithmetic or logic operation. Arithmetic operations include addition, subtraction, multiplication, and division. Logic operations perform comparisons and make selections based on the result — for example, comparing two numbers for equality: if they are equal, processing continues; if not, processing stops. Registers are storage units inside the processor. Some registers are used by the control unit to keep track of the overall state of the running program; they hold information such as the current instruction, the address of the next instruction to be executed, and the current operands. In the ALU, registers hold the data items to be added, subtracted, multiplied, divided, or compared, while other registers store the results of arithmetic and logic operations. An important factor affecting processor speed and performance is register size. The term word size (also called word length) describes the size of the operand registers, and it is also used more loosely to describe the width of the data paths into and out of the processor. Today, general-purpose computers usually have word lengths of 8 to 64 bits. If a processor's operand registers are 16 bits wide, it is said to be a 16-bit processor. 2. The program. A general-purpose digital computer is a programmable digital system.
A general-purpose digital computer can perform a variety of micro-operations and, moreover, can be told the specific sequence of operations it must perform. The user controls this process through a program: a set of instructions that specifies the operations, the operands, and the sequence in which processing is to occur. A data-processing task can be changed simply by writing a new program, or by supplying different data to the same program. A computer instruction is a binary code that specifies a sequence of micro-operations for the computer. Programs are stored in memory together with the data. The control unit reads each instruction from memory, stores it in a control register, interprets the instruction's binary code, and executes it by issuing a sequence of control operations. Every general-purpose computer has its own unique instruction set. The ability to store and execute programs (the stored-program concept) is the most important characteristic of the general-purpose computer. The "clock frequency," also called the "main frequency," matters because the higher the frequency, the faster instructions execute, the shorter each instruction takes, and the higher the machine's information-processing capacity and efficiency. Beginners should note, however, that a processor's operating frequency does not fully determine its performance; design, operating environment, and other factors are all important to how well it performs. The central processing unit, or processor for short, abbreviated CPU (Central Processing Unit), is one of the main components of the computer (in Hong Kong and Taiwan, "electronic calculator"); its chief functions are to interpret computer instructions and to process the data handled by computer software. The CPU provides the fundamental digital-computing capability around which computers are designed. The CPU, storage devices, and input/output devices are the three core components of the modern microcomputer.
A CPU manufactured as an integrated circuit is usually called a microprocessor.
Other forms of computers
1. laptop (portable/notebook computer) 2. pocket PC 3. supercomputer 4. desktop computer 5. photonic computer 6. black hole computer 7. biological computer 8. DNA computer 9. nanocomputer 10. AI (artificial intelligence)
Computer classification
Computers can be divided into two major categories: analog computers and digital computers. The main characteristics of the analog computer are that the quantities involved in its calculations are represented as continuous, uninterrupted values and the computing process is continuous; because it is limited by the quality of its components, its accuracy is low and its range of application narrow, and analog computers are now rarely produced. The main characteristics of the digital computer are that the quantities involved are represented as discrete digits and computation proceeds digit by digit; because the logical operation of a digital computer resembles the "thinking" of the human brain, it is also known as the "electronic brain." By purpose, digital computers can be divided into special-purpose and general-purpose computers, which differ in efficiency, speed, configuration, structural complexity, cost, and adaptability. A special-purpose computer can be the most effective, fastest, and most economical solution for a particular class of problems, but it adapts poorly and is unsuited to other applications. The computers used aboard missiles and rockets are largely special-purpose machines; however advanced such a machine is, you would not use it to play games. General-purpose computers are highly adaptable and very widely applicable, but their efficiency, speed, and economy vary with the application. By scale, speed, and function, general-purpose computers can be divided into supercomputers, mainframes, midrange computers, minicomputers, microcomputers, and single-chip computers. The basic differences among these classes usually lie in size, structural complexity, power consumption, performance, data-storage capacity, instruction sets and peripherals, and software configuration.
In general, a supercomputer has extremely high processing speed — hundreds of millions of instructions per second and more — large data-storage capacity, a large and complex structure, and a high price; it is used mainly for large-scale scientific computing and is an important measure of a nation's scientific strength. A single-chip computer, made from a single integrated circuit, is small, light, and structurally very simple. Between the supercomputer and the single-chip computer, the performance of mainframes, midrange computers, minicomputers, and microcomputers descends in that order, with size and structural complexity to match. The microcomputer, also called the personal computer, is currently the fastest-growing segment. PCs are divided into several types according to the microprocessor chip they use: first, the IBM PC and compatibles using Intel chips such as the 386, 486, and Pentium; second, machines using the PowerPC chip jointly developed by IBM, Apple, and Motorola, which Apple's Macintosh machines have adopted; third, DEC's own machines using its Alpha chip. The PC is developing from the desktop toward the portable notebook. There are also multimedia personal computers that integrate CD audio and video, telephone, fax, and television, and that connect to networks. John von Neumann is regarded as the father of the computer.
Computer - Notes
1. Switching the machine on and off. Computer equipment must be powered on and off in the correct order; doing otherwise shortens its working life and is the culprit behind some failures. The correct order: when starting up, first switch on the peripherals and their power supplies (monitor, printer, and so on), then switch on the host computer; when shutting down, do the opposite — power off the host first, then disconnect the other peripheral devices. 2. Safety precautions for computer equipment. (A) Do not place the equipment where it is dusty (for example near a street-facing window); where this cannot be avoided, use a dust cover when the machine is not in use. Do not place it somewhere damp (for example next to a water boiler or drinking fountain, where water is easily splashed onto the device). Pay attention to the cooling of the main chassis, and keep the computer out of direct sunlight. (B) Do not plug other appliances — hand-warming stoves and other personal electrical devices — into the outlet dedicated to the computer, and check that all the computer equipment is switched off before leaving work. (C) Do not move the computer while it is running. (D) While the machine is working, do not plug and unplug equipment or switch it on and off frequently; hot-plugging interfaces (other than USB) can easily burn out an interface card or damage the mainboard. (E) Guard against static electricity and dust, and keep water away from the keyboard, mouse, and other equipment. (F) Regularly back up and organize the data on the disk. Because the hard drive is in constant use, data is easily lost to viruses or mistaken operations, so important data should be backed up regularly lest months of work be lost in a moment.
Organize the disk often and clean out junk files, so that they do not take up too much disk space or interfere with normal file searching and management — otherwise important documents are easily deleted by mistake, and urgently needed files cannot be found. (G) Identify problems and repair them promptly, so the machine is always in good working condition: check whether any device is behaving abnormally, whether any wiring is loose, and so on. (H) Guard against computer viruses: install antivirus software, upgrade it regularly, and scan for viruses. Points to note when using a computer online: 1. Automatic links to strange sites. Be careful on the Internet: do not click on things you do not understand, especially pornographic pictures or advertisements floating over the browser page; if one blocks your view of the page, drag the scroll bar up or down until you have a clear view. Also, avoid installing Internet plug-ins, "Internet assistants," and toolbars; such software can sometimes interfere with normal use. 2. Do not casually download and install small programs from the Internet. 3. Be wary of e-mail sent by strangers, especially messages with tempting subject lines such as a joke or a love letter, and messages carrying attachments. 4. Scan USB flash drives for viruses before using them, and run antivirus software regularly to check that the system is free of viruses.
Commonly used computer operating system
Linux: Red Hat Linux, Red Flag Linux, Ubuntu Linux. Windows: Microsoft Windows 95/98/Me/NT/2000/XP/2003/Vista/7, Microsoft Windows Server 2000/2003. DOS: MS-DOS. Others: the Unix operating system; the Kylin operating system of the Chinese Academy of Sciences; and Mac OS X, the operating system software of Apple Computer's Macintosh and the latest version of Mac OS.
World's top ten supercomputers
Name — developer — host institution — operating speed:
Roadrunner — IBM — Los Alamos National Laboratory — 1.105 petaflop/s
Jaguar — Cray — Oak Ridge National Laboratory — 1.059 petaflop/s
JUGENE — IBM — Jülich Research Centre, Germany — 825.5 teraflop/s
Pleiades — SGI — NASA Ames Research Center — 487.01 teraflop/s
BlueGene/L — IBM — Lawrence Livermore National Laboratory — 478.2 teraflop/s
Kraken XT5 — Cray — National Institute for Computational Sciences, USA — 463.3 teraflop/s
BlueGene/P — IBM — Argonne National Laboratory — 458.61 teraflop/s
Ranger — Sun — Texas Advanced Computing Center — 433.20 teraflop/s
Dawn — IBM — Lawrence Livermore National Laboratory — 415.70 teraflop/s
JUROPA — Bull SA — Jülich Research Centre, Germany — 274.80 teraflop/s
Influential computer brands in China
Lenovo, Tongfang, Hasee (Shenzhou), Founder, Great Wall, Haier, TCL, Acer, ASUS, MSI, SALO, Hisense, HP, IBM, Dell
Computer science
In today's world, almost every profession is closely linked to the computer, yet only certain professions and academic disciplines study in depth how computers themselves are manufactured, programmed, and used. The meanings of the academic terms used to describe the different fields of research within computing keep changing, and new disciplines emerge endlessly. Computer engineering is a branch of electrical engineering that studies computer hardware, software, and the connections between the two. Computer science is the traditional academic name for the study of the computer; its main concerns are computation and the efficient algorithms that perform specific tasks. The discipline determines whether a problem in the computing domain is solvable, how efficiently it can be solved, and how the solution can be turned into a more efficient program. Today computer science has spawned many branches, each conducting in-depth research into a different class of problems. Software engineering focuses on the methodology and practice of researching and developing high-quality software systems, and tries to compress and forecast development cost and the development cycle. Computer information systems studies computer applications in broadly organized (business-oriented) environments. Many of these disciplines intertwine with others: geographic information systems specialists, for example, use computer technology to manage geographic information. Three large organizations worldwide are committed to computer science: the British Computer Society (BCS), the Association for Computing Machinery (ACM), and the Institute of Electrical and Electronics Engineers (IEEE). Computer viruses. The Regulations of the People's Republic of China on the Security Protection of Computer Information Systems define the virus clearly: a virus is "a set of computer instructions or program code, inserted into a computer program, that destroys computer functions or damages data, affects the use of the computer, and is able to replicate itself." Twenty-four symptoms that a user's computer is infected: 1. The computer system runs slowly. 2. The system often crashes for no apparent reason. 3. The lengths of files change. 4. Storage capacity is abnormally reduced. 5. Booting slows down. 6. Files go missing or are corrupted. 7. The screen display is abnormal. 8. The system buzzer sounds abnormally. 9. Disk volume labels change. 10. The system does not recognize the hard drive. 11. Access to the storage system is abnormal. 12. The keyboard behaves abnormally. 13. File dates, times, attributes, and other properties change. 14. Files cannot be properly read, copied, or opened.
15. Commands execute incorrectly. 16. False alarms occur. 17. The current drive changes: some viruses switch the current drive to drive C. 18. The clock runs backward: some viruses set the system time back, reversing the clock. 19. The Windows operating system reports frequent errors for no reason. 20. The system restarts abnormally. 21. Some external devices stop working. 22. The user is abnormally asked to enter a password. 23. Word or Excel prompts about executing a "macro." 24. A program that should not be memory-resident becomes resident in memory. Computer radiation. 1. The hazards of radiation. The electromagnetic radiation that computers emit is often ignored. The international MPR II radiation-safety standard requires that exposure at a distance of 50 cm be no more than 25 V/m. Typical computer radiation levels: 1. keyboard 1000 V/m; 2. mouse 450 V/m; 3. screen 218 V/m; 4. host 170 V/m; 5. notebook 2500 V/m. The World Health Organization has accordingly listed electromagnetic radiation — after water, air, and noise pollution — as the fourth major source of environmental pollution, an invisible "killer" hazardous to human health, and protection against it has become a priority. 2. Preventing computer radiation. (1) Avoid long stretches of continuous computer work and take breaks. Maintain the most appropriate posture: the eyes should be 40 to 50 cm from the screen, looking at it level or slightly downward. (2) Keep a good working environment in the room, such as a comfortable temperature, clean air, and suitable concentrations of negative ions and ozone. (3) Lighting in the computer room should be appropriate, neither too bright nor too dark, and direct light on the screen should be avoided because it causes glare. Keep the workspace ventilated and dry. (4) Use a color filter on the screen to reduce eye strain; glass or high-quality plastic filters are better. (5) Install protective devices to reduce the intensity of electromagnetic radiation. (6) Keep the skin clean.
The surface of a computer screen carries a large amount of static electricity, and the dust it concentrates can be redeposited onto the exposed skin of the face and hands; over time this readily causes rashes and pigmentation, and in severe cases even skin lesions. (7) Pay attention to nutrition. In operators who work long hours in front of a screen, the rhodopsin in the retina is consumed, and rhodopsin is synthesized mainly from vitamin A. Computer operators should therefore eat more carrots, cabbage, bean sprouts, tofu, red dates, and oranges, as well as milk, eggs, liver, lean meat, and other foods, to replenish the body's vitamin A and protein. Drinking some tea also helps, since tea polyphenols and other active substances aid the body in resisting radioactive substances.
Major notebook computer brands and comments
European and American brands: IBM (International Business Machines): the hardware giant. HP (Hewlett-Packard): the world's largest computer manufacturer. Apple (Apple Computer): the stability of the Mac system is absolutely outstanding. DELL (Dell): an American brand built on build-to-order desktops, with low prices and high configurations. Gateway: a well-known PC manufacturer. Japanese brands: TOSHIBA (Toshiba): a master of core hardware technology. FUJITSU (Fujitsu): Japan's IBM — Japan's largest IT company and the world's third-largest IT services provider. SONY (Sony): unbeatable in video entertainment products, with excellent design and whimsically inventive features. Panasonic (Matsushita): unique design, lightweight bodies, strong durability, the world's longest battery life, and rugged notebooks. NEC: a large Japanese IT service provider; NEC's hallmarks are thin, fashionable machines with uniquely delicate design and very particular key placement around the body. SOTEC: a Japanese specialist notebook manufacturer with nearly two decades of R&D history. Korean brands: SAMSUNG (Samsung): one of the world's best electronics companies and a leader in several digital fields; its notebooks feature fine, fashionable styling with a focus on entertainment, and performance slightly steadier than the Japanese makers. LG: LG notebooks have beautiful styling, excellent LCD screens, reliable build quality and workmanship, strong performance, ultra-thin portability, and good battery life. Taiwanese brands: ASUS: one of the world's ten most important computer manufacturers; the stable quality of its notebooks puts people at ease. Acer: the world's seventh-largest PC maker, combining business machines and consumer PCs perfectly — delicate, exquisite, beautiful, fashionable, technically innovative, and continuously improving. BenQ: a fresh, fashionable image in entertainment notebooks.
MSI (Micro-Star International): a model hardware manufacturer, one of the world's top three motherboard makers and the world's largest graphics-card manufacturer. Domestic brands: Lenovo: the pride of China's IT industry; having acquired IBM's PC division, Lenovo is the flag-bearer among mainland Chinese notebook brands confronting those from overseas. TCL: a mainstream domestic notebook line that monitors quality strictly from the components and production processes onward, aiming to build China's most outstanding notebooks. FOUNDER (Founder): does design, innovation, security and stability, and service very well, with novel styling and unique design that have won over fashion-minded users. HASEE (Shenzhou): the model of low prices; functionality, performance, and quality vary between models, but Hasee does not cut corners to achieve them. TONGFANG (Tsinghua Tongfang): low prices with decent quality; Tongfang's commercial portables and high-performance business notebooks are mainstream in the market, backed by an IT company founded by a famous Chinese university. Greatwall (Great Wall): the country's oldest IT company, bringing consumers notebooks of excellent quality at reasonable prices. HEDY: focuses on distinctive features that keep a certain differentiation from other brands; its widescreen notebooks have a great advantage over the 4:3 screens most common among other brands. AMOI (Amoi): a rookie among Chinese notebook makers with real strength — China's first notebook manufacturer to develop and market products independently, with its own R&D, an international-standard quality system, and a complete production line. Haier: a newcomer to the notebook field, still short on brand presence there, but given the influence of and trust in its brand in the appliance industry, it is bound to become a rising star. SALO: relatively simple in appearance; SALO holds a large share of the desktop market and has taken single orders from Africa running into the tens of millions.
Encyclopedia
Diannao ("electronic brain") is another Chinese name for the computer.