Evolution and history of the most prominent 16- and 32-bit microprocessors in the microcomputer, and how they are similar to and different from each other. The addition of interactivity to the combination of text, graphics, images, audio and video by multimedia.
Summary on the subject: Microprocessor Evolution
Only once in a lifetime does a new invention come along that touches every aspect of our lives. A device that changes the way we work, live, and play is a special one, indeed. The microprocessor has been around since 1971, but in the last few years it has found its way into everything from American calculators to video games and computers (Givone 1). Many microprocessors have been manufactured for all sorts of products; some have succeeded and some have not. This paper will discuss the evolution and history of the most prominent 16- and 32-bit microprocessors in the microcomputer and how they are similar to and different from each other.

Because microprocessors are a subject that most people cannot relate to and do not know much about, this paragraph will introduce some of the terms that will be used in the subsequent paragraphs. Throughout the paper the 16-bit and 32-bit microprocessors are compared and contrasted. The number 16 in "16-bit microprocessor" refers to the width of the registers, and hence to how much storage is available to the microprocessor (Aumiaux, 3). The microprocessor has a memory address, such as A16, and at this address the specific commands to the microprocessor are stored in the memory of the computer (Aumiaux, 3). With a 16-bit address there are 65,536 places to store data; a 32-bit address reaches vastly more places, and the wider registers also let the microprocessor handle more data at once, making it faster. Another term mentioned frequently in the paper is the oscillator, which sets the pace at which the processor's "clock" ticks. The oscillator is the pacemaker of the microprocessor: it determines the frequency at which the microprocessor can process information, a value measured in megahertz (MHz). A nanosecond, a billionth of a second, is the unit used to measure the time it takes for the computer to execute an instruction, otherwise known as a cycle.
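The terms above lend themselves to a little arithmetic. The sketch below (in Python, with hypothetical helper names of my own) shows how address width relates to the number of storage locations, and how a clock frequency in MHz relates to a cycle time in nanoseconds:

```python
def address_space(bits: int) -> int:
    """Number of distinct locations an n-bit address can select."""
    return 2 ** bits

def cycle_time_ns(clock_mhz: float) -> float:
    """Duration of one clock tick in nanoseconds (1000 / MHz)."""
    return 1000.0 / clock_mhz

print(address_space(16))   # 65536 locations for a 16-bit address
print(address_space(32))   # 4294967296 -- far more than double
print(cycle_time_ns(2.0))  # 500.0 ns, the slow end of the TMS9900's quoted range
print(cycle_time_ns(3.3))  # ~303 ns, near the fast end of that range
```

This is only a sketch of the relationships, not anything taken from the cited sources.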
There are many different companies, each with its own family of processors. Since the individual processors in these families were developed over a fairly long period of time, it is hard to say in exactly what order they were introduced, so this paper will mention the families in no particular order. The first family to be discussed is the 9900 series, manufactured by Texas Instruments during the mid-70s and developed from the architecture of the 900 minicomputer series (Titus, 178). Five microprocessors were designed in this family: the TMS9900, TMS9980A, TMS9981, TMS9985, and TMS9940. The TMS9900 was the first, so the next four were simply variations of it (Titus, 178). The 9900 series microprocessors run with 64K of memory, and although the 9900 is a 16-bit microprocessor, only 15 of the address lines are in use (Titus, 179); the 16th is used by the computer to distinguish between word and byte functions (Titus, 179). The 9900 series runs with cycle times from 300 to 500 nanoseconds at clock rates from 2 MHz to 3.3 MHz, and some variations of the original were even made to go up to 4 MHz (Avtar, 115).

The next microprocessor to be discussed is the LSI-11, which was produced from the structural plans of the PDP-11 minicomputer family. There are three microprocessors in the LSI-11 family: the LSI-11, the LSI-11/2, and the much-improved LSI-11/23 (Titus, 131). The big difference between the LSI-11 family and other similar microprocessors is that they have the instruction codes of a microcomputer, but since the LSI-11 originated from the PDP-11 family it is a multi-microprocessor (Avtar, 207).
The fact that the LSI-11 is a multi-microprocessor means that many other microprocessors are used in conjunction with it to function properly (Avtar, 207). The LSI-11 directly processes 16-bit words and 8-bit data, while the improved LSI-11/23 can directly process 64-bit data (Titus, 131). The average cycle time of the LSI-11 and LSI-11/2 is 380 nanoseconds, while the LSI-11/23 is clocked at 300 nanoseconds (Titus, 132). There are some great strengths in the LSI-11 family, among them the efficiency with which the microprocessor processes and the ability to run minicomputer software, which leads to great hardware support (Avtar, 179). There are also a couple of weaknesses: limited memory, and the relatively slow speed at which the LSI-11 processes (Avtar, 179).

The next major microprocessors in the microcomputing industry were the Z8001 and Z8002; when they entered the market, the term Z8000 was used to mean either or both of them (Titus, 73), so when describing features shared by the Z8001 and the Z8002 they will be referred to as the Z8000. The microprocessor was designed by the Zilog Corporation and put on the market in 1979 (Titus, 73). The Z8000 is a lot like many of the previous microprocessors, except for the obvious fact that it is faster and better, and similar in that it depends on its registers to function properly (Titus, 73). The Z8000 was improved by using 21 16-bit registers, 14 of which are used for general-purpose operations (Titus, 73). The difference between the Z8001 and the Z8002 is that the Z8002 can only address 65K bytes of memory, which is impressive compared to earlier microprocessors but greatly inferior to the Z8001, which can address 8M bytes (8000K) of memory (Titus, 73).
The addressable memory of the two otherwise very similar microprocessors is thus drastically different, whereas their other functions are much the same. An example is the cycle time: it is 250 nanoseconds, and the average number of cycles per instruction is between 10 and 14, for both microprocessors (Avtar, 25).

The next microprocessor to be discussed is the 8086. This microprocessor is, in my opinion, the best of all the 16-bit microprocessors, not only because its processing speeds are tremendous but because it paved the way to the 32-bit microprocessors using various techniques that will be discussed later. The 8086 was the second Intel microprocessor, being preceded by the 8080 (Avtar, 19). It was introduced in early 1978 by Intel (Avtar, 19). Like so many of the other processors, the 8086 is register oriented, with fourteen 16-bit registers, eight of which are used for general processing purposes (Avtar, 19). The 8086 can directly address 1MB (1,048,576 bytes) of memory. The basic clock frequency for the 8086 is between 4 MHz and 8 MHz, depending on the type of 8086 microprocessor used (Avtar, 20).

Up until this point in the paper there have been recurring phrases such as "a microprocessor containing fourteen 16-bit registers". At this point in the evolution of microprocessors comes the 32-bit register, which obviously has double the capacity to hold information for the microprocessor. Because of this simple increase in register capacity we have a whole different type of microprocessor. Although the 16-bit and 32-bit microprocessors are quite different (the latter have more components and such), the 32-bit microprocessors will be described in the same terms as the 16-bit microprocessors were. The remainder of the paper will discuss the 32-bit microprocessor series.
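The address-space figures quoted for these chips all follow from the number of usable address bits. A quick sketch in Python (the function name is my own, and the bit counts are inferred from the sizes quoted above, not stated in the sources):

```python
def addressable_bytes(address_bits: int) -> int:
    """Bytes reachable through a given number of address bits."""
    return 2 ** address_bits

KB, MB = 2 ** 10, 2 ** 20
print(addressable_bytes(16) // KB)  # 64 -> the Z8002's 65K bytes
print(addressable_bytes(20))        # 1048576 -> the 8086's 1MB
print(addressable_bytes(23) // MB)  # 8 -> the Z8001's 8M bytes
```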
The external data bus is a term that will be referred to in the remainder of the paper. The data bus is basically what brings data from the memory to the processor and from the processor to the memory (Givone, 123). It is similar to the registers located on the microprocessor, but a little slower to access (Givone, 123).

The first 32-bit microprocessor to be discussed is the series 32000 family, which was originally built for main-frame computers. In the 32000 family all of the different microprocessors have the same 32-bit internal structure, but may have external bus widths of 8, 16, or 32 bits (Mitchell, 225). The 32000 family microprocessors use only 24 of the potential 32 bits of addressing space, giving the microprocessor a 16 Mbyte address space (Mitchell, 225). The 32-bit registers are set up so that there are six 32-bit dedicated registers, and in combination there are two 16-bit dedicated registers (Mitchell, 231). Each dedicated register holds its own specific type of information for processing (Mitchell, 232). The microprocessor's oscillator (which now comes from an external source) runs at 2.5 MHz, but due to a "divide-by-four prescaler" the clock frequency runs at 10 MHz. Many new ideas have been put into practice to improve the 32000 series microprocessor generally, making it run faster and more efficiently.

The next family of microprocessors fabricated for the microcomputer is the MC68020 32-bit microprocessor, which is based on the MC68000 family. The other microprocessors included in this family are the MC68000, MC68008, MC68010 and MC68012 (Avtar, 302).
Before going into the components this microprocessor contains, it should first be known that the making of the MC68020 was the product of 60 man-years of design, including the manufacturing of the high-density Complementary Metal Oxide Semiconductor process that gives the microprocessor high speed with low resistance and heat loss (Avtar, 302). Because of all the work put into the MC68020 and its related microprocessors, it is an extremely complex microprocessor. The MC68020 operates in two modes: the user mode (for application programs) and the supervisor mode (for the operating system and other special functions) (Mitchell, 155). The user and supervisor modes each have their own specific registers for their functions. The user programming model has 17 32-bit address registers and an 8-bit register (Mitchell, 155). The supervisor programming model has three 32-bit registers, an 8-bit register and two 3-bit registers for small miscellaneous functions (Mitchell, 155). All of these registers within the two modes are split into different groups holding different information as usual, but this arrangement gives the microprocessor the storage capacity of twenty 32-bit registers.

The next families of microprocessors are Intel's 80386 and 80486. The 80386 and 80486 were overall better than the other microprocessors being made in the industry at the time, and Intel is now the leading microprocessor producer in today's market. The 80386 was a product that evolved from Intel's very first microprocessor, the 8-bit 8080 (Mitchell, 85); next came the earlier-mentioned 16-bit 8086. The reason Intel did so well in the microprocessor market was that every microprocessor they made was compatible with the previous and future ones (Mitchell, 85). This means that if a piece of software worked on the 8080, then it worked on the future microprocessors, and vice versa.
Not only did Intel look forward, but they looked back. The main difference between the 80386 and the other 32-bit microprocessors is the added feature of a barrel shifter (Mitchell, 88). The barrel shifter allows information to switch places multiple times in the registers within a single cycle (Mitchell, 88). The microprocessor contains 8 general-purpose 32-bit registers, but with the barrel shifter that is increased to the equivalent of a 64-bit microprocessor. For the most common 20 MHz 80386 microprocessor the run time for each cycle is 59 nanoseconds, but for a 33 MHz microprocessor the cycle time is reduced to 49 nanoseconds.

The next 32-bit microprocessors on the market are AT&T's WE32100 and WE32200 (Mitchell, 5). These microprocessors also need several peripheral chips in order to run, termed: Memory Management Units, floating point arithmetic units, Maths Acceleration Units, Direct Memory Access Control, and Dynamic Random Access Memory Control (Mitchell, 5). These chips, apart from the microprocessor itself, each play an important part in processing the data that comes through the microprocessor. What distinguishes the WE32200 from the others is that it can address information over the 32-bit range with the help of a disk that works as a slow form of memory (Mitchell, 9). The WE32200 runs at a frequency of 24 MHz (Mitchell, 9).

The 16-bit and 32-bit microprocessors are a mere page in the great book of processor history. There will be many new and extremely different processors in the near future. A tremendous amount of time and money has been put into making and improving the microprocessor, and billions of dollars continue to go toward elaborating it. The microprocessor will continue to evolve for the better until the time when a much faster and more efficient electronic device is invented.
This in turn will create a whole new and powerful generation of computers. Hopefully this paper has given the reader some insight into the world of microprocessors and how much work has been put into their manufacture over the years.

The Evolution of The Microprocessor November 25, 1996
The term media refers to the storage, transmission, interchange, presentation, representation and perception of different information types (data types) such as text, graphics, voice, audio and video. The term multimedia denotes the property of handling a variety of representation media in an integrated manner. The phrase 'representation media' is used because the most fundamental aspect of multimedia systems is believed to be the support for different representation types. It is necessary for a multimedia system to support a variety of representation media types, and it is also important that the various sources of media types are integrated into a single system framework.

Multimedia is more than multiple media. Multimedia adds interactivity to the combination of text, graphics, images, audio and video. Creating your own media is more interactive than using existing content, and collaborating with others in the creation of media is more interactive still. Multimedia systems use a number of different media to communicate supplementary, additional or redundant information. Often this takes the form of using multiple sensory channels, but it may also take the form of different types of visual input: textual, graphical, iconic, animation and video. Multimedia, the combination of text, animated graphics, video and sound, presents information in a way that is more interesting and easier to grasp than text alone. It has been used for education at all levels, job training and games, and by the entertainment industry. It is becoming more readily available as the price of personal computers and their accessories declines.

Multimedia as a human-computer interface was made possible some half-dozen years ago by the rise of affordable digital technology. Previously, multimedia effects were produced by computer-controlled analogue devices such as videocassette recorders, projectors and tape recorders.
Digital technology's exponential decline in price and increase in capacity have enabled it to overtake analogue technology. The Internet is the breeding ground for multimedia ideas and the delivery vehicle of multimedia objects to a huge audience. While we have treated various output media in isolation, it is clear that interesting issues emerge as they are combined in what is termed multimedia. In this sense, any computer application that employs a video disk, images from a CD-ROM, high quality sound, or high quality video images on screen may be termed a multimedia application. Such interfaces are often aesthetically appealing and, where high-capacity storage devices such as CD-ROM are used, can provide effective interactions for the user by acting as very large databases or storehouses of information with dense but easy-to-use cross-referencing and indexing.

Multimedia is all things to all people. The name can convey a highly specific meaning or less than nothing, depending on your audience. In fact, multimedia is a singular mix of disparate technologies with overlapping applications in pursuit of a market and an identity. We can describe it as the seamless integration of data, text, images and sound within a single digital information environment. Multimedia finds its worth in presenting information in a manner that is intuitive and more natural than traditional means. A multimedia user interface must provide a wide variety of easily understood and usable media control tools. In addition, information views need to be integrated with structural views, since the viewing of information will often alternate with moving through the structure by one means or another. Interactive Multimedia (IMM) is about empowering the user to explore new realms by a variety of pathways.
It is an umbrella term for a range of videodisc, compact disc and computer-based systems that allow the creation, integration and manipulation of text, graphics, still and moving video images and sound. The computer elements of an IMM system have the capacity to:

· Store, manipulate and present a range of information forms
· Allow various forms of computer-based information to be accessed in linear and non-linear ways
· Provide graphics overlay and print out screen material
· Enable learners to work independently
· Provide feedback to the learner

Interactive multimedia provides a powerful means of enhancing learning and information provision. There are, however, some cautions which need to be heeded if the full potential of IMM is to be realised:

· Lack of world standards
· Technical problems
· Platforms
· Building successful teams
· Developmental costs

Interactivity means that the user receives appropriate and expected feedback in response to actions taken. It is a two-way human-machine communication involving an end-user and a computer-based instructional system. Users actively direct the flow and direction of the instructional or information programmes which, in turn, exchange information with the viewers, processing their inputs in order to generate the appropriate response within the context of the programme.

The basic elements of human interface design are now well established. The user, not the computer, should initiate all actions. The user accesses and manipulates the various elements of the product by clicking on buttons, icons or metaphors with a mouse or other pointing device. Interface design should be consistent where appropriate and differentiated where needed, so the user can rely on recognition rather than recall. The user should always be given immediate auditory or visual feedback. User activities should be broken into small steps where tasks are complex.
The interface design should be aesthetically pleasing, appropriate to the content and suited to the learner's culture and prior knowledge. For designers of multimedia the main design issues are how to integrate the media and which media to use for presenting different kinds of information. The development of metaphorical interfaces, direct manipulation, graphical user interfaces (GUIs) and recent advances in the field of virtual reality allow users to control the system by manipulating objects such as icons, windows, menus and scroll bars. In well-designed interfaces, these objects are so selected and represented that users can intuitively deduce their meaning and their function in the system from prior 'everyday knowledge' and experience.

Hypertext is a system for presenting active text. The key feature from the learner's point of view is that the text has many nodes and links, which allow learners to determine their own routes through the material. Hypertext has many applications, including use as a presentation medium for information management and browsing, providing access to information that the public needs (such as tourism information), and various other activities. Hypermedia combines aspects of hypertext with a variety of multimedia used in some combination: the branching structure of hypertext is used with multimedia to produce a system in which learners can determine their own paths through the medium. Hypertext is the process of linking concepts within text documents through the use of 'hotwords'. A hotword is an active word within a document that the user can click on to navigate to another part of the project or to initiate some form of interaction. However, navigation by hypertext can be confusing: it is easy for a user to become 'lost in hyperspace'. After a few clicks users can be so far from the original topic that they become hopelessly confused. Nearly all multimedia applications include text in some form.
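The node-and-link structure of hypertext described above can be sketched as a tiny graph. This is a hypothetical example of my own (the page names and hotwords are invented), not a real system:

```python
# Each node holds some text plus named links (the "hotwords") to other
# nodes, so the reader, not the author, decides the route through the material.
pages = {
    "home":    {"text": "Welcome.",             "links": {"tourism": "tourism"}},
    "tourism": {"text": "Tourist information.", "links": {"hotels": "hotels",
                                                          "back": "home"}},
    "hotels":  {"text": "Hotel listings.",      "links": {"back": "tourism"}},
}

def follow(start, hotwords):
    """Click the given hotwords in order; return the node we end up at."""
    node = start
    for word in hotwords:
        node = pages[node]["links"][word]
    return node

print(follow("home", ["tourism", "hotels"]))          # hotels
print(follow("home", ["tourism", "hotels", "back"]))  # tourism
```

Because every node may link anywhere, a long chain of clicks can leave the reader far from where they started, which is exactly the 'lost in hyperspace' problem described above.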
Text and the written language remain the most common way of communicating information in our society. The computer brings extra power to text, not only by allowing you to manipulate its size and shape but also by making it an interactive medium. The ability to show moving images using digital video can greatly enhance IMM projects. Just as video has a role in multimedia, sound also plays an important part in a project. A few carefully placed sounds can greatly enhance a project, but a continuous monologue can be highly distracting.

With text-to-speech technology, the computer interprets text and converts it into phonetic sounds in much the same way as a human would. Thus, the computer can read back any text within any program with reasonable fidelity. This feature is very useful within an IMM program because large amounts of text can be converted to audio without large sound files. A particular use of this technology is to offer an alternative for vision-impaired people. There are, however, some disadvantages to computer-generated speech: it can sound robotic compared to human speech, and it lacks the variation that can make human speakers appealing.

Unlike print or graphics, animation is a dynamic medium. We get a sense of relative timing, position, direction and speed of action. We need no captions because the message is conveyed by the motion and the scene. Simply put, animation is the process of creating, usually graphically, a series of frames and then displaying them rapidly to give a sense of movement. Video provides high-speed information transfer and shows temporal relationships; it is produced by the successive capture and storage of images as they change with time. Two types of speech are available for use by multimedia developers: digitised and synthesised. Digitised speech provides high-quality natural speech, while synthesised speech may not sound as natural as human speech.
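The claim above that text-to-speech avoids large sound files can be made concrete with a rough calculation (Python; the speaking rate and bytes-per-word figures are my own assumptions, not from the source):

```python
def pcm_bytes(seconds, rate_hz=44_100, bytes_per_sample=2, channels=2):
    """Size of uncompressed (CD-quality) digitised audio, in bytes."""
    return seconds * rate_hz * bytes_per_sample * channels

# One minute of CD-quality digitised speech:
print(pcm_bytes(60))  # 10584000 bytes, roughly 10 MB

# The same minute spoken at ~150 words per minute, stored as plain text
# (assuming ~6 bytes per word) and handed to a text-to-speech engine:
print(150 * 6)        # 900 bytes
```

The four orders of magnitude between the two figures is why text-to-speech is attractive when storage is tight, despite the robotic sound.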
Even with improved techniques for generating speech, it is not incorporated into multimedia programs as often as it could be. This may be due to a lack of understanding of how high-quality speech is produced. Multimedia interface designers have typically used a navigation/map metaphor, a menu/hierarchy metaphor or a journal (sequence) metaphor. An example of the first strategy is the Virtual Museum, produced by Apple Computer. Here the user accesses the multimedia information by navigating through the virtual museum, moving from room to room by selecting directions of movement. Examples of the second strategy include on-line encyclopaedias and electronic books, where a table of contents is used to organise the material.

It is helpful to view multimedia applications as a convergence of today's content and titles (such as movies and books), of today's computer application programs (such as word processors), and of today's network services. As an example, a multimedia book should have the following features. Besides text, the book has other media that the author created, including not only graphics and images but also audio and video, to make the book's content clearer or more enjoyable. Programs should be built in to help a user navigate through the author's media. Multimedia's driving technologies, mainly digital electronics and fibre-optic communications, are making more and more functions sufficiently economical for consumers to use. Example applications include:

· Desktop video conferences with collaboration
· Multimedia store-and-forward mail
· Consumer edutainment, infotainment, sociotainment
· Digital libraries
· Video on demand
· Hybrid applications

IMM has many applications in libraries. IMM can bring knowledge in all its media formats into condensed, accessible forms capable of being used for reference and educational applications. On the whole, within the library sector IMM is currently regarded with some ambivalence.
Many library professionals look upon it as an interesting technology, but one that will require significant investment and change if its potential is to be fully realised. Possible barriers to the effective adoption of IMM by librarians include: financial constraints and a lack of requisite resources, resulting in a lack of opportunity to become familiar with new and emergent systems; ingrained traditional resistance to change; a degree of uncertainty regarding the appropriateness of the technology to various applications; an inability to grasp the significance of IMM; and a lack of experience, knowledge and skills in regard to IMM among library professionals.

Example applications include the Book House, a library system using hypertext techniques to help users find books without the limitations of traditional information retrieval. The user interface of the Book House is based on a building, like a real library, with the user able to enter rooms filled with children's books, adult books and so on. The system supports four basic search strategies, using icons and pictures to enable location of the books or topics sought. Voice response and voice recognition technologies could also be used in a library situation: merely speaking a unique book identifier or name could trigger the system into automatically filling in the remainder of the bibliographic or personal details relating to that item or person.

Increasingly, multimedia systems will be developed with the aim of allowing non-textual information to be used directly, in a demonstrational manner. Even when text is present, other media provide different additional information. Also, when dealing with multimedia, users are naturally disposed to interact in ways other than those developed for text. A first step towards giving users the impression that they are dealing directly with non-textual material is to allow database search on the basis of identifying images that best suit the user's purposes.
An initial query that turns up a large number of images can be refined by allowing the user to point out a few images in the set that contain items of interest. The system can then use the text descriptions attached to the chosen images to form a new query and offer a further set of possibly more relevant images. My conclusion is that design could benefit tremendously from open and collaborative multimedia research, not from relatively closed multimedia packages.
Are MP3s a breakthrough in technology, or are they just another bomb waiting to explode on us? Many people say they are good, while others say they are not just bad but horrifying to musicians who want to make it to the top. MP3s are widely used by teenagers on their computers, usually illegally, and their distributors are constantly being threatened by the producers of the music. Millions of dollars are being lost to the Internet craze of MP3 technology, mainly because fewer people are buying legal music from record stores. Now that the problem is here, Internet police are on the loose to find these illegal distributors of music and put them to a stop.

MP3s are highly compressed, CD-quality sound files. The MP3 has become the most commonly used unofficial file format downloadable from the Internet. The only requirement to play an MP3 is a program like Winamp (found at www.winamp.com) or Microsoft Windows Media Player. The Internet allows users to download songs in MP3 format in a matter of minutes without paying any money. This compressed MP3 technology is popping up everywhere on the Internet; there is almost no music site you can go to where an MP3 of some sort is not being offered. All you have to do is log in and download.

MP3s are breaking copyright laws and are a part of online piracy. Online piracy is playing or downloading songs and lyrics on the Internet without authorization and without paying the artists. Downloading even one song without permission is considered online piracy. When people download MP3s from the Internet, they choose to ignore the copyright laws, because the disclaimers are all written in font sizes under 10 pt at the bottom of the page. If people stop going to a site, the site stops making money, so anything that might make the user leave the site is hidden. The RIAA (Recording Industry Association of America) has two copyrights that apply to MP3s:

1. Copyright in the musical work: the lyrics and musical notes as they are written on paper. The songwriter or music publisher typically owns this copyright.
2. Copyright in the sound recording: a recording of a performer singing or playing the particular song. The record company usually owns this copyright.

Therefore, the only legal way to copy, download or upload an MP3 is to get permission from the artist, which every user either forgets to do or doesn't bother with. This is the primary cause of the legal war that goes on today, because free is good, right? Wrong! Having free MP3s on the Internet creates a problem: millions and millions of dollars are lost every day to the musicians who make the music possible. The Canadian Recording Industry Association reported that there are around 80,000 infringing MP3 sites on the Internet, each carrying around 300 or more recordings. That means around 24 million songs are illegally on the Internet. Major money is being lost here. The RIAA has also calculated that there are 120 million downloads from MP3 sites weekly and climbing, representing an annual loss of $5 billion (US) to the recording industry, and around $1 million a day in the United States alone. The recording industry is going crazy trying to fix this problem. Brian Robertson, president of the Canadian Recording Industry Association, said at a conference: "There are tens of thousands of sound recordings that are basically sitting around in a virtual record store with the door wide open and everyone is helping themselves", and concluded, "Everyone using MP3's feels they have the inalienable right to use the product". Because of the increase in hard drive capacity, users can not only trade individual songs but full albums too.
This makes matters even worse, because people can simply get a CD burner and write the MP3s onto a CD, so they can listen to them on any audio CD player. People can also get an MP3 player, a small portable device that stores and plays MP3s; one example is the NOMAD player (made by Creative). The users of MP3s are having their fun now, but how long will this adventure last? How long will recording companies and artists let money fall out of their pockets to some teenager who has no clue about the copyrights or laws he or she is breaking? Not very long, it seems. More and more companies are teaming up to fight MP3s. The five biggest global music and entertainment companies (Time Warner Inc., EMI, Sony, Seagram and Bertelsmann) have joined with big computer businesses like IBM to try to control music distribution over the Internet. According to Market Tracker International, legal Internet-related music sales rose to $147 million from $29 million in 1997. This shows that companies can use the Internet to their advantage. Companies need to use marketing techniques to lure users into their sites to actually pay for music, even though the net is filled with illegal web sites distributing the product for free. Vorton Corp., for example, draws up to 50,000 visitors a day just by selling CDs at reasonable prices, and its sales increase as illegal downloads decrease. Organizations all over the web have full-time employees surfing the Internet all day looking for offending MP3 sites. Artists and recording companies are losing the money they should make from their hard, creative work because of illegal downloading of MP3s. The battle is just beginning. People need to know that even though it is easy to get MP3 files for free, doing so cheats the artists and the recording companies and breaks the law.
Although MP3 files seem like a friend, they are really everyone's foe.
With its Pentium-crushing speed and new design, the Power Mac G4 picks up where the old Macintosh (G3) left off. Its enclosure is now highly polished silver and graphite, yet it still offers easy access to every internal component through its swing-open side door. With a PowerPC G4 processor with the Velocity Engine running at up to 450MHz, one megabyte of backside level-2 cache running at half the processor speed, and a 100MHz system bus supporting up to 800 megabytes per second of data throughput, the Power Mac G4 delivers high performance. And when you have completed your projects, shooting those big files across the network is a snap, because every new Power Mac G4 comes with 10/100BASE-T Ethernet built in; when you buy it, it is ready to be set up with your local cable Internet provider. The secret of the G4's revolutionary performance is its aptly named Velocity Engine, the heart of a supercomputer miniaturized onto a sliver of silicon. The Velocity Engine can process data in 128-bit chunks, instead of the smaller 32-bit or 64-bit chunks used in traditional processors (it is the 128-bit vector processing technology used in scientific supercomputers, except that Apple has added 162 new instructions to speed up computations). In addition, it can perform four (in some cases eight) 32-bit floating-point calculations in a single cycle, which is two to four times faster than traditional processors. The new G4s have proven to be faster than Pentium IIIs: using six of Intel's own tests, the 450MHz G4 was, on average, more than two and a half times as fast as the 600MHz Pentium III (2.65 times, to be exact). These benchmark advantages translate directly into real-world advantages. For example, typical Photoshop tasks run 187% faster on the Power Mac G4 than on the fastest Pentium III-based PCs, with specific Photoshop filters running up to four times faster. Compressing QuickTime files is also much faster.
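The benchmark claim above is even more striking when normalized for clock speed. A small sketch follows; the 2.65x figure and the four-calculations-per-cycle claim are the ones quoted in the text, while the per-clock factor and peak-throughput number are derived from them:

```python
# Clock-for-clock comparison implied by the quoted benchmark results.
g4_mhz, p3_mhz = 450, 600
overall_speedup = 2.65             # 450MHz G4 vs 600MHz Pentium III, per the text
per_clock = overall_speedup * (p3_mhz / g4_mhz)
print(round(per_clock, 2))         # ~3.53x faster per clock cycle

# Peak 32-bit floating-point throughput implied by the Velocity Engine claim:
flops_per_cycle = 4                # "four 32-bit calculations in a single cycle"
print(g4_mhz * 1_000_000 * flops_per_cycle)  # 1,800,000,000 ops/s (1.8 GFLOPS peak)
```

In other words, the slower-clocked G4 does roughly three and a half times as much work per tick as the Pentium III in these tests, which is what the 128-bit vector unit buys.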
Other Hardware

Monitor. The ultimate companion to the Power Mac G4 is the Apple Cinema Display. Its 22-inch diagonal screen makes it the largest LCD display ever brought to market, with a viewing area identical to that of a 24-inch flat CRT display. This is state-of-the-art technology, and supplies will naturally be limited; it is sold exclusively through the Apple Store, bundled with a 450MHz or faster Power Mac G4. But if you are fortunate enough to use one, your office view will never be the same again. Its viewing quality is a world apart: twice as bright and twice as sharp as a CRT, with triple the contrast ratio and absolutely zero flicker. And its colors remain true from almost any viewing angle. Like a movie theatre, the Apple Cinema Display has a letterbox format (1600x1024 pixels), with room enough to display an entire 11x17 image. And unlike most other displays, it receives its data digitally from the computer, preserving the highest-quality image.

Operating System: Mac OS 9. Mac OS 9 is the ultimate system for anyone on the Internet, or anyone who wants to be, and it is also very good for businesses. With more than 50 powerful new features, Mac OS 9 offers a full range of capabilities for new and advanced users. At work, there is no better way to get the most out of your Macintosh computer and the Internet. What it comes with: Sherlock 2. In addition to its smart disk and file searching, Sherlock 2 is your Internet search detective and personal shopper. Search the Web for people, references, Apple info and current news in a flash. Sherlock 2 also lets you shop and compare prices. ColorSync 3. Included in Mac OS 9, ColorSync 3 manages color across input, display and output. It features improved AppleScript support, saveable workflow settings, a powerful Calibration Assistant and enhanced control for profiles.
At www.apple.com, there is also a comprehensive manual and tutorial to teach you and your colleagues about all the uses and features of Mac OS 9.

Software

QuickTime 4. With a customer base of more than fifteen million Mac and Windows users who downloaded the preview release, and a growing list of online publishers (including, most recently, Fox News Online, Fox Sports Online and The Weather Channel), QuickTime 4 is the hottest streaming technology on the Internet. Examples of how QuickTime is used in live and on-demand programming on the web include the BBC, Bloomberg, HBO, NPR and WGBH Boston, as well as industry giants like Broderbund, Voyager, Cyan, Pixar, Lucasfilm, Macromedia, Microsoft, Disney and CNN. All for one simple reason: QuickTime is the standard for digital video and streaming media. QuickTime can be used to communicate with distant colleagues and potential clients, and is a must for businesses. AppleWorks. Everything you need to create professional-quality documents with ease and speed in one affordable package. AppleWorks provides word processing, graphics and illustration, spreadsheet, charting/graphing, presentation and database capabilities, as well as the ability to save your work in HTML for publishing on the web. AppleShare IP. AppleShare IP software enhances workgroup productivity with an integrated suite of network services. Using a single administrative interface, you can quickly set up file, print, mail and web servers for your business. Support for standard Internet protocols makes AppleShare IP file, mail and web services available to authorized and guest users via the Internet and your intranet. HyperCard. Now more integrated with QuickTime, HyperCard organizes information into easy-to-use "stacks" of cards through which users can navigate and search for the information they need. Simply by clicking on a button, they can view related text, see a graphic, hear a sound, watch a QuickTime movie, or listen to text spoken out loud.
Silicon is the raw material most often used in integrated circuit (IC) fabrication. It is the second most abundant substance on the earth. It is extracted from rocks and common beach sand and put through an exhaustive purification process. In this form, silicon is the purest industrial substance that man produces, with impurities comprising less than one part in a billion. That is the equivalent of one tennis ball in a string of golf balls stretching from the earth to the moon. Semiconductors are usually materials which have energy-band gaps smaller than 2 eV. An important property of semiconductors is the ability to change their resistivity over several orders of magnitude by doping. Semiconductors have electrical resistivities between 10^-5 and 10^7 ohm-cm. Semiconductors can be crystalline or amorphous. Elemental semiconductors are simple-element semiconductor materials such as silicon or germanium. Silicon is the most common semiconductor material used today. It is used for diodes, transistors, integrated circuits, memories, infrared detection and lenses, light-emitting diodes (LEDs), photosensors, strain gauges, solar cells, charge transfer devices, radiation detectors and a variety of other devices. Silicon belongs to group IV in the periodic table. It is a gray, brittle material with a diamond cubic structure. Silicon is conventionally doped with phosphorus, arsenic and antimony donors, and with boron, aluminum and gallium acceptors. The energy gap of silicon is 1.1 eV. This value permits the operation of silicon semiconductor devices at higher temperatures than germanium allows. Now I will give you some brief history of the evolution of electronics, which will help you understand more about semiconductors and the silicon chip. In the early 1900s, before integrated circuits and silicon chips were invented, computers and radios were made with vacuum tubes. The vacuum tube was invented in 1906 by Dr. Lee DeForest.
Throughout the first half of the 20th century, vacuum tubes were used to conduct, modulate and amplify electrical signals. They made possible a variety of new products, including the radio and the computer. However, vacuum tubes had some inherent problems: they were bulky, delicate and expensive, consumed a great deal of power, took time to warm up, got very hot, and eventually burned out. The first digital computer contained 18,000 vacuum tubes, weighed 50 tons, and required 140 kilowatts of power. By the 1930s, researchers at the Bell Telephone Laboratories were looking for a replacement for the vacuum tube. They began studying the electrical properties of semiconductors, non-metallic substances such as silicon that are neither conductors of electricity like metal nor insulators like wood, but whose electrical properties lie between these extremes. By 1947 the transistor was invented. The Bell Labs research team sought a way of directly altering the electrical properties of semiconductor material. They learned they could change and control these properties by "doping" the semiconductor, or infusing it with selected elements heated to a gaseous phase. When the semiconductor was also heated, atoms from the gases would seep into it and modify its pure crystal structure by displacing some atoms. Because these dopant atoms had different numbers of electrons than the semiconductor atoms, they formed conductive paths. If the dopant atoms had more electrons than the semiconductor atoms, the doped regions were called n-type to signify an excess of negative charge. Fewer electrons, or an excess of positive charge, created p-type regions. By allowing this doping to take place in carefully delineated areas on the surface of the semiconductor, p-type regions could be created within n-type regions, and vice versa. The transistor was much smaller than the vacuum tube, did not get very hot, and did not require a heated filament that would eventually burn out.
Finally, in 1958, integrated circuits were invented. By the mid-1950s, the first commercial transistors were being shipped, but research continued. Scientists began to think that if one transistor could be built within one solid piece of semiconductor material, why not multiple transistors, or even an entire circuit? Within a few years this speculation became reality. These integrated circuits (ICs) reduced the number of electrical interconnections required in a piece of electronic equipment, thus increasing reliability and speed. In contrast, the first digital electronic computer was built with 18,000 vacuum tubes, weighed 50 tons, cost about $1 million, required 140 kilowatts of power, and occupied an entire room. Today, a complete computer, fabricated within a single piece of silicon the size of a child's fingernail, costs only about $10.00. Now I will describe how integrated circuits and the silicon chip are formed. Before the IC is actually created, a large-scale drawing, about 400 times larger than the actual size, is made; it takes approximately one year to design an integrated circuit. Then a mask has to be made. Depending on the level of complexity, an IC will require from 5 to 18 different glass masks, or "work plates," to create the layers of circuit patterns that must be transferred to the surface of a silicon wafer. Mask-making begins with an electron-beam exposure system called MEBES. MEBES translates the digitized data from the pattern-generating tape into physical form by shooting an intense beam of electrons at a chemically coated glass plate. The result is a precise rendering, at its exact size, of a single circuit layer, often less than one-quarter inch square. Working with incredible precision, it can produce a line one-sixtieth the width of a human hair. After purification, molten silicon is doped to give it a specific electrical characteristic.
Then it is grown as a crystal into a cylindrical ingot. A diamond saw is used to slice the ingot into thin, circular wafers, which are then polished mechanically and chemically to a perfect mirror finish. At this point IC fabrication is ready to begin. To begin the fabrication process, a silicon wafer (p-type, in this case) is loaded into a 1,200°C furnace through which pure oxygen flows. The end result is an added layer of silicon dioxide (SiO2) "grown" on the surface of the wafer. The oxidized wafer is then coated with photoresist, a light-sensitive, honey-like emulsion. In this case we use a negative resist, which hardens when exposed to ultraviolet light. To transfer the first layer of circuit patterns, the appropriate glass mask is placed directly over the wafer. In a machine much like a very precise photographic enlarger, ultraviolet light is projected through the mask. The dark pattern on the mask conceals the wafer beneath it, allowing the photoresist to stay soft; but in all other areas, where light passes through the clear glass, the photoresist hardens. The wafer is then washed in a solvent that removes the soft photoresist but leaves the hardened photoresist on the wafer. Where the photoresist was removed, the oxide layer is exposed. An etching bath removes this exposed oxide, as well as the remaining photoresist. What remains is a stencil of the mask pattern, in the form of minute channels of oxide and silicon. The wafer is placed in a diffusion furnace filled with gaseous compounds (all n-type dopants) for a process known as impurity doping. In the hot furnace, the dopant atoms enter the areas of exposed silicon, forming a pattern of n-type material. An etching bath removes the remaining oxide, and a new layer of silicon (n-type) is deposited onto the wafer.
The first layer of the chip is now complete, and the masking process begins again: a new layer of oxide is grown, the wafer is coated with photoresist, the second mask pattern is exposed to the wafer, and the oxide is etched away to reveal new diffusion areas. The process is repeated for every mask (as many as 18) needed to create a particular IC. Of critical importance here is the precise alignment of each mask over the wafer surface: if it is out of alignment by more than a fraction of a micrometer (one-millionth of a meter), the entire wafer is useless. During the last diffusion, a layer of oxide is again grown over the wafer. Most of this oxide layer is left on the wafer to serve as an electrical insulator, and only small openings are etched through the oxide to expose circuit contact areas. To interconnect these areas, a thin layer of metal (usually aluminum) is deposited over the entire surface. The metal dips down into the circuit contact areas, touching the silicon. Most of the surface metal is then etched away, leaving an interconnection pattern between the circuit elements. The final layer is "vapox," or vapor-deposited oxide, a glass-like material that protects the IC from contamination and damage. It, too, is etched away, but only above the "bonding pads," the square aluminum areas to which wires will later be attached.
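The repeated mask cycle described above can be sketched schematically. This is only an illustrative model of the sequence of steps named in the text, not a process simulator; the step names are taken from the description, and the function name is invented for this sketch:

```python
# Schematic model of the repeated photolithography cycle described above.
def fabricate_layers(num_masks):
    """Return the list of steps performed for each mask layer."""
    cycle = ["grow oxide", "coat photoresist", "expose mask",
             "wash away soft resist", "etch exposed oxide", "dope or deposit"]
    return [[f"{step} (layer {i})" for step in cycle]
            for i in range(1, num_masks + 1)]

layers = fabricate_layers(18)   # a complex IC may need as many as 18 masks
print(len(layers))              # 18 completed layers
print(len(layers[0]))           # 6 steps in each mask cycle
```

The point of the sketch is that the same six-step cycle simply runs once per mask, which is why a complex 18-mask IC takes so long to fabricate and why alignment errors compound across layers.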
Mitchel, H. J. 32-bit Microprocessors. Boston: CRC Press, 1986, 1991.
Titus, Christopher A. 16-Bit Microprocessors. Indiana: Howard W. Sams & Co., Inc., 1981.
Aumiaux, M. Microprocessor Systems. New York: John Wiley & Sons, 1982.
Givone, Donald D.; Rosser, Robert P. Microprocessors/Microcomputers. New York: McGraw-Hill Book Company, 1980.
Singh, Avtar. 16-Bit and 32-Bit Microprocessors: Architecture, Software, and Interfacing Techniques. Englewood Cliffs, New Jersey, 1991.