Friday, June 19, 2009

19) HISTORY OF COMPUTER PERIPHERAL DEVICES

This category focuses on the history and timeline of various items used with computers. Each item is explained in general terms, providing information suitable for everyone from beginners to advanced users.
Take a cruise through history to discover some of the key developments that have brought us to our present state of computing, including the development of numbers, the introduction of mechanical aids to calculation, the evolution of electronics, and the impact of electronics on computing.
No one person can be credited with the invention of computers, but several names stand proud in the crowd. The following offers some of the more notable developments and individuals, with great regret for any omissions.


1- CD-R DISCS



CD-R is an abbreviation of compact disc recordable. Operating on the same premise as a CD, a CD-R is a thin polycarbonate disc, 120 mm in diameter, used to store music or data. However, unlike conventional CD media, a CD-R has a dye layer instead of a pressed metal core. A laser is used to etch "pits" into the dye so that the disc can later be read by the laser in a CD-ROM drive or CD player. Once written, a CD-R cannot be erased and reused, but it can be recorded in multiple sessions by using the UDF format. A CD-RW, though, can be reused.
There was some incompatibility between CD-R discs and older CD-ROM drives, primarily due to the lower reflectivity of the CD-R disc. In general, CD drives marked as 8x or greater will usually read CD-R discs.


2- DVD-ROMS



DVD started as the Digital Video Disc but now means Digital Versatile Disc or just DVD. It is a multi-application family of optical disc formats for read-only, recordable and re-writable applications. The main features of the DVD formats are:
Backwards compatibility with current CD media. All DVD hardware will play audio CDs and CD-ROMs (although not all hardware will play CD-Rs or CD-RWs).
Physical dimensions identical to compact disc but using two 0.6 mm thick substrates, bonded together.
Single-layer/dual-layer and single/double sided options.
Up to 4.7 GB read-only capacity per layer, 8.5 GB per side maximum.
Designed from the outset for video, audio and multimedia, not just audio.
All formats use a common file system (UDF).
Digital and analogue copy protection for DVD-Video and DVD-Audio built into the standard.
Recordable and re-writable versions are part of the family.
DVD started in 1994 as two competing formats, the Super Density disc (SD) and the Multimedia CD (MMCD). DVD now is the result of an agreement by both camps on a single standard to meet the requirements of all the various industries involved.


3- FLOPPY DISKETTES



In 1971, an IBM team led by Alan Shugart invented the 8-inch floppy diskette. This floppy was an 8" plastic disk coated with magnetic iron oxide; data was written to and read from the disk's surface. The nickname "floppy" came from its flexibility. Floppies were considered revolutionary devices at the time for their portability, which provided a new and easy physical means of transporting data from one computer to another.
A floppy disk is a circle of magnetic material similar to any kind of recording tape; one or two sides of the disk are used for recording. The disk drive grabs the floppy by its center and spins it like a record inside its housing. The read/write head, much like the head on a tape deck, contacts the surface through an opening in the plastic shell, or envelope.


4- HARD DISK DRIVES


The hard disk drive has a short and fascinating history. In some 44 years it evolved from a monstrosity with fifty two-foot-diameter disks holding five MBytes (5,000,000 bytes) of data to today's drives measuring 3 1/2 inches wide and an inch high (and smaller) holding more than 70 GBytes (70,000,000,000 bytes/characters). Here, then, is the short history of this marvelous device.
Before the disk drive there were drums... In 1950 Engineering Research Associates of Minneapolis built the first commercial magnetic drum storage unit for the U.S. Navy, the ERA 110. It could store one million bits of data and retrieve a word in 5 thousandths of a second.
In 1956 IBM invented the first computer disk storage system, the 305 RAMAC (Random Access Method of Accounting and Control). This system could store five MBytes on fifty 24-inch-diameter disks!
By 1961 IBM had invented the first disk drive with air bearing heads, and in 1963 it introduced the removable disk pack drive. In 1970 the eight-inch floppy disk drive was introduced by IBM. My first floppy drives were made by Shugart, who was one of the "dirty dozen" who left IBM to start their own companies. In 1981 two Shugart 8-inch floppy drives with enclosure and power supply cost me about $350.00. They were for my second computer. My first computer had no drives at all.
In 1973 IBM shipped the model 3340 Winchester sealed hard disk drive, the predecessor of all current hard disk drives. The 3340 had two spindles each with a capacity of 30 MBytes, and the term "30/30 Winchester" was thus coined.
[Photo caption: Seagate ST4053, a 40 MByte 5 1/4-inch, full-height "clunker" with ST506 interface and voice coil, circa 1987. My cost was $435.00.]
In 1980, Seagate Technology introduced the first hard disk drive for microcomputers, the ST506. It was a full-height (twice as high as most current 5 1/4" drives) 5 1/4" drive, with a stepper motor, and held 5 MBytes. My first hard disk drive was an ST506. I cannot remember exactly how much it cost, but it plus its enclosure, etc. was well over a thousand dollars. It took me three years to fill the drive. Also in 1980, Philips introduced the first optical laser drive. In the early '80s, the first 5 1/4" hard disks with voice coil actuators (more on this later) started shipping in volume, but stepper motor drives continued in production into the early 1990s. In 1981, Sony shipped the first 3 1/2" floppy drives.
In 1983 Rodime made the first 3.5-inch rigid disk drive. The first CD-ROM drives were shipped in 1984, and "Grolier's Electronic Encyclopedia" followed in 1985. The 3 1/2" IDE drive started its existence as a drive on a plug-in expansion board, or "hard card." The hard card put the drive on the controller board, which in turn evolved into the Integrated Drive Electronics (IDE) hard disk drive, where the controller became incorporated into the printed circuit board on the bottom of the hard disk drive. Quantum made the first hard card in 1985.
In 1986 the first 3 1/2" hard disks with voice coil actuators were introduced in volume by Conner, but half-height (1.6") and full-height 5 1/4" drives persisted for several years. In 1988 Conner introduced the first one-inch-high 3 1/2" hard disk drives. In the same year PrairieTek shipped the first 2 1/2" hard disks.
In 1997 Seagate introduced the first 7,200 RPM Ultra ATA hard disk drive for desktop computers, and in February 2000 it introduced the first 15,000 RPM hard disk drive, the Cheetah X15. Milestones for IDE DMA, ATA/33, and ATA/66 drives follow: 1994, DMA Mode 2 at 16.6 MB/s; 1997, Ultra ATA/33 at 33.3 MB/s; 1999, Ultra ATA/66 at 66.6 MB/s.



5- IDE (INTEGRATED DRIVE ELECTRONICS)


Compaq started the development of the IDE interface. This standard was designed specifically for the IBM PC and can achieve high data transfer rates through a 1:1 interleave factor and caching by the disk controller itself - the bottleneck is often the old AT bus, and the drive may read data far quicker than the bus can accept it, so the cache is used as a buffer. Theoretically 1 MBps is possible, but 700 KBps is perhaps more typical of such drives. This standard has been adopted by many other models of computer, such as the Acorn Archimedes A4000 and above. A later improvement was EIDE, laid down in 1989, which also removed the 528 MB maximum drive size and increased data transfer rates.
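To make the buffering idea concrete, here is a minimal Python sketch (purely illustrative - the class name, capacity, and sector values are invented and are not part of any IDE specification) of a cache sitting between a fast producer, the drive, and a slow consumer, the bus:

    from collections import deque

    class DriveCache:
        # Toy model of a drive's on-board cache: the platter side fills it
        # quickly; the bus side drains it at its own slower pace.
        def __init__(self, capacity):
            self.capacity = capacity
            self.buf = deque()

        def fill(self, sector):
            # Fast side: the drive reads ahead as the head passes the track.
            if len(self.buf) < self.capacity:
                self.buf.append(sector)
                return True
            return False  # cache full; the drive must wait for the bus

        def drain(self):
            # Slow side: the AT bus fetches one sector per request.
            return self.buf.popleft() if self.buf else None

    cache = DriveCache(capacity=8)
    for sector in range(4):   # the drive deposits four sectors in one pass
        cache.fill(sector)
    print(cache.drain())      # the bus collects them later, one at a time

As long as the cache does not overflow, the drive never has to wait for the slower bus, which is exactly the role the controller's cache plays above.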


6- MODEMS



A modem is a device that converts between analog and digital signals. Digital signals, which are used by computers, are made up of separate units, usually represented by a series of 1s and 0s. Analog signals vary continuously; an example of an analog signal is a sound wave. Modems are often used to enable computers to communicate with each other across telephone lines. A modem converts the digital signals of the sending computer to analog signals that can be transmitted through telephone lines. When the signal reaches its destination, another modem reconstructs the original digital signal, which is processed by the receiving computer. If both modems can transmit data to each other simultaneously, the modems are operating in full-duplex mode; if only one modem can transmit at a time, the modems are operating in half-duplex mode.
To convert a digital signal to an analog one, the modem generates a carrier wave and modulates it according to the digital signal. The kind of modulation used depends on the application and the speed of operation for which the modem is designed. For example, many high-speed modems use a combination of amplitude modulation, where the amplitude of the carrier wave is changed to encode the digital information, and phase modulation, where the phase of the carrier wave is changed to encode the digital information. The process of receiving the analog signal and converting it back to a digital signal is called demodulation. The word "modem" is a contraction of its two basic functions: modulation and demodulation.
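As a rough illustration of that modulation step, here is a short Python sketch that encodes bits by flipping the phase of a carrier wave, one simple form of the phase modulation described above. The carrier frequency, sample rate, and baud rate are invented for the example and are not taken from any modem standard:

    import math

    carrier_hz, sample_rate, baud = 1200, 9600, 300
    samples_per_bit = sample_rate // baud

    def modulate(bits):
        # Each bit selects one of two carrier phases (0 or 180 degrees).
        signal = []
        for i, bit in enumerate(bits):
            phase = 0.0 if bit else math.pi
            for n in range(samples_per_bit):
                t = (i * samples_per_bit + n) / sample_rate
                signal.append(math.sin(2 * math.pi * carrier_hz * t + phase))
        return signal

    samples = modulate([1, 0, 1, 1])   # analog-style samples, ready for a DAC

Demodulation runs the same idea in reverse: the receiving modem watches the incoming wave's phase and recovers the bit stream.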
Dennis C. Hayes invented the PC modem in 1977, establishing the critical technology that allowed today's online and Internet industries to emerge and grow. He sold the first Hayes modem products to computer hobbyists in April of 1977 and founded D.C. Hayes Associates, Inc., the company known today as Hayes Corp., in January of 1978. Hayes quality and innovation resulted in performance enhancements and cost reductions that led the industry in the conversion from leased line modems to intelligent dial modems - the PC Modem.
"Hayes-compatible" describes a modem that responds to the same set of commands as a modem manufactured by Hayes Microcomputer Products, originator of the de facto command-set standard for microcomputer modems.
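Because the Hayes command set is plain text sent over a serial line, it is easy to sketch. The following Python fragment uses the third-party pyserial package; the port name and phone number are placeholders for illustration:

    import serial  # third-party pyserial package

    with serial.Serial("/dev/ttyS0", 9600, timeout=2) as port:
        port.write(b"AT\r")             # "attention" - a working modem replies "OK"
        print(port.readline())          # read the modem's response
        port.write(b"ATDT5551234\r")    # dial 555-1234 using tone dialing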



7- MONITORS



Often referred to as a monitor when packaged in a separate case, the display is the most-used output device on a computer. The display provides instant feedback by showing your text and graphic images as you work or play. Most desktop displays use a cathode ray tube (CRT), while portable computing devices such as laptops incorporate liquid crystal display (LCD), light-emitting diode (LED), gas plasma or other image projection technology. Because of their slimmer design and lower energy consumption, monitors using LCD technologies are beginning to replace the venerable CRT on many desktops.
Displays have come a long way since the blinking green monitors of text-based computer systems in the 1970s. Just look at the advances made by IBM over the course of a decade. In 1981, IBM introduced the Color Graphics Adapter (CGA), which was capable of rendering four colors and had a maximum resolution of 320 pixels horizontally by 200 pixels vertically. IBM introduced the Enhanced Graphics Adapter (EGA) display in 1984. EGA allowed up to 16 different colors and increased the resolution to 640x350 pixels, improving the appearance of the display and making it easier to read text. In 1987, IBM introduced the Video Graphics Array (VGA) display system. Most computers today support the VGA standard and many VGA monitors are still in use. IBM introduced the Extended Graphics Array (XGA) display in 1990, offering 800x600 pixel resolution in true color (16.8 million colors) and 1,024x768 resolution in 65,536 colors. Most displays sold today support the Ultra Extended Graphics Array (UXGA) standard. UXGA can support a palette of up to 16.8 million colors and resolutions of up to 1600x1200 pixels, depending on the video memory of the graphics card in your computer. The maximum resolution normally depends on the number of colors displayed. For example, your card might require that you choose between 16.8 million colors at 800x600 and 65,536 colors at 1600x1200.
The combination of the display modes supported by your graphics adapter and the color capability of your monitor determines how many colors can be displayed. For example, a display that can operate in SuperVGA (SVGA) mode can display up to 16,777,216 (usually rounded to 16.8 million) colors because it can process a 24-bit-long description of a pixel. The number of bits used to describe a pixel is known as its bit depth. With a 24-bit bit depth, 8 bits are dedicated to each of the three additive primary colors -- red, green and blue. This bit depth is also called true color because it can produce more colors than the roughly 10,000,000 the human eye can discern, while a 16-bit display is only capable of producing 65,536 colors. Displays jumped from 16-bit color to 24-bit color because working in 8-bit increments makes things a whole lot easier for developers and programmers.
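The arithmetic behind these figures is simple enough to sketch in a few lines of Python; the memory calculation below is a back-of-the-envelope illustration that ignores any per-card overhead:

    def colors(bit_depth):
        return 2 ** bit_depth   # each extra bit doubles the palette

    def framebuffer_bytes(width, height, bit_depth):
        return width * height * bit_depth // 8

    print(colors(24))                          # 16777216 ("true color")
    print(colors(16))                          # 65536
    print(framebuffer_bytes(1600, 1200, 24))   # 5760000 bytes, about 5.5 MB
    print(framebuffer_bytes(1600, 1200, 16))   # 3840000 bytes, about 3.7 MB

On a hypothetical 4 MB card, 1600x1200 fits at 16 bits but not at 24, which is exactly the kind of trade-off described above.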
Dot pitch is, briefly, the measure of how much space there is between a display's pixels. When considering dot pitch, remember that smaller is better. Packing the pixels closer together is fundamental to achieving higher resolutions. A display normally can support resolutions that match the physical dot (pixel) size as well as several lesser resolutions. For example, a display with a physical grid of 1280 columns by 1024 rows can obviously support a maximum resolution of 1280x1024 pixels. It usually also supports lower resolutions such as 1024x768, 800x600, and 640x480.
In monitors based on CRT technology, the refresh rate is the number of times that the image on the display is drawn each second. If your CRT monitor has a refresh rate of 72 Hertz (Hz), then it cycles through all the pixels from top to bottom 72 times a second. Refresh rates are very important because they control flicker, and you want the refresh rate as high as possible. Too few cycles per second and you will notice a flickering, which can lead to headaches and eye strain.


8- MOUSE POINTERS


Years before personal computers and desktop information processing became commonplace or even practicable, Douglas Engelbart had invented a number of interactive, user-friendly information access systems that we take for granted today; the computer mouse was one of his inventions. At the Fall Joint Computer Conference in San Francisco in 1968, Engelbart astonished his colleagues by demonstrating these systems -- using an utterly primitive mainframe computer with 192 kilobytes of memory, located 25 miles away! Engelbart has earned nearly two dozen patents, the most memorable being perhaps for his "X-Y Position Indicator for a Display System": the prototype of the computer "mouse" whose convenience has revolutionized personal computing.
The mouse is a common pointing device, popularized by its inclusion as standard equipment with the Apple Macintosh. With the rise in popularity of graphical user interfaces in MS-DOS, UNIX, and OS/2, use of mice is growing throughout the personal computer and workstation worlds. The basic features of a mouse are a casing with a flat bottom, designed to be gripped by one hand; one or more buttons on the top; a multidirectional detection device (usually a ball) on the bottom; and a cable connecting the mouse to the computer. By moving the mouse on a surface (such as a desk), the user typically controls an on-screen cursor. A mouse is a relative pointing device because there are no defined limits to the mouse's movement and because its placement on a surface does not map directly to a specific screen location. To select items or choose commands on the screen, the user presses one of the mouse's buttons, producing a "mouse click."
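A few lines of Python make the "relative" part concrete. The screen size and handler name are invented for illustration; in a real system the deltas arrive from the mouse driver:

    SCREEN_W, SCREEN_H = 640, 480
    cursor_x, cursor_y = SCREEN_W // 2, SCREEN_H // 2

    def on_mouse_moved(dx, dy):
        # The device reports only deltas, never absolute coordinates, so
        # lifting and re-centering the mouse does not move the cursor.
        global cursor_x, cursor_y
        cursor_x = min(max(cursor_x + dx, 0), SCREEN_W - 1)
        cursor_y = min(max(cursor_y + dy, 0), SCREEN_H - 1)

    on_mouse_moved(25, -10)   # cursor moves; the mouse's desk position is irrelevant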
Mouse patent #3,541,541 was issued 11/17/70 for the "X-Y Position Indicator For A Display System." Douglas Engelbart's patent for the mouse is only one representation of his pioneering work in the design of modern interactive computer environments.



9- PLOTTERS


A plotter is a vector graphics printing device that connects to a computer.
Plotters print their output by moving a pen across the surface of a piece of paper. This means that plotters are restricted to line art, rather than raster graphics as with other printers. They can draw complex line art, including text, but do so very slowly because of the mechanical movement of the pens.
Another difference between plotters and printers is that a printer is aimed primarily at printing text. This makes it fairly easy to control; simply sending the text to the printer is usually enough to generate a page of output. This is not the case for the line art on a plotter, so a number of plotter control languages were created to send more detailed instructions like "draw a line from here to here". The most popular of these is likely HPGL.
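To give a flavor of such a language, here is a small Python sketch that emits real HPGL commands (IN, SP, PU, PD) to draw a rectangle. The coordinates are invented for the example; actual plotters expect them in plotter units, typically 0.025 mm each:

    def rectangle(x, y, w, h):
        return (
            "IN;"          # initialize the plotter
            "SP1;"         # select pen 1
            f"PU{x},{y};"  # pen up, move to the starting corner
            # pen down, trace the four edges back to the start
            f"PD{x + w},{y},{x + w},{y + h},{x},{y + h},{x},{y};"
        )

    print(rectangle(1000, 1000, 4000, 2000))  # string to send to the plotter port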
Early plotters were created by attaching ball-point pens to drafting pantographs and driving the machines with motors controlled by the computer. This had the disadvantage of being somewhat slow to move, as well as requiring floor space equal to the size of the paper. Later versions worked by placing the paper over a roller which moved the paper back and forth for X motion, while the pen moved back and forth on a single arm for Y motion. Another change was the addition of an electrically controlled clamp to hold the pens, which allowed them to be changed and thus create multi-colored output.
For a time in the 1980s smaller "home-use" plotters became popular for experimentation in computer graphics. But their low speed meant they were not useful for general printing purposes, and you would need another conventional printer for those jobs. With the widespread availability of high-resolution inkjet and laser printers, plotters have all but disappeared.
Plotters are used primarily in drafting and CAD applications, where they have the advantage of working on very large paper sizes while maintaining high resolution. Another use has been found by replacing the pen with a cutter, and in this form plotters can be found in many garment and sign shops.


10- SOUND CARDS


Computers were never designed to handle sound. About the only audio you'd hear from an early computer was beeps, designed to tell you if there was a problem. Computer games manipulated these beeps to produce truly awful music as an accompaniment to games like Space Invaders. However, surely there was more to sound than beeps? Thankfully Creative Labs, a company from Singapore, recognised this and made the original Sound Blaster sound card for the now ancient ISA bus. It could record real audio and play it back, something of a quantum leap. It also had a MIDI interface, still common on sound cards today, which could control synthesisers, samplers and other electronic music equipment. It could "create" sounds by using FM synthesis; these were not that realistic but were nevertheless better than simple beeps. The quality of the audio was 8-bit 11 kHz, so it sounded roughly like an AM radio.
The sound card is quite a complicated piece of electronics. The most important parts are the ADC and DAC. The ADC (analogue-to-digital convertor) takes in analogue signals, for example from a microphone, and converts them to digital signals for the computer to store. The DAC (digital-to-analogue convertor) does the opposite. However, in the future there may be no need for either, if speakers and microphones become able to record and play back digital signals directly. The heart of a CD player is also a DAC. CD players tend to sound better than the average sound card because they generally cost more and are simpler devices, so the DAC component of a CD player tends to be more expensive (and thus better quality). Having said that, the quality of DACs on sound cards is improving all the time.
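To see what those digital signals actually look like, here is a minimal Python sketch that computes one second of a 440 Hz tone as 16-bit samples, the same numbers a DAC would turn back into sound, and writes them to a WAV file using only the standard library. The file name and tone parameters are arbitrary choices:

    import math
    import struct
    import wave

    RATE, SECONDS, FREQ = 44100, 1, 440   # CD sample rate, one second, concert A

    frames = bytearray()
    for n in range(RATE * SECONDS):
        sample = int(32767 * math.sin(2 * math.pi * FREQ * n / RATE))
        frames += struct.pack("<h", sample)   # 16-bit little-endian sample

    with wave.open("tone.wav", "wb") as f:
        f.setnchannels(1)      # mono
        f.setsampwidth(2)      # 16-bit
        f.setframerate(RATE)
        f.writeframes(bytes(frames))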
The advantage of digital audio (i.e. storing audio as 1s and 0s) is that no matter how many times it is copied it remains identical, and does not degrade like analogue sources such as vinyl. The next major development for sound cards was the leap up to 16-bit 44.1 kHz stereo audio, i.e. CD quality. However, this posed problems for the archaic ISA bus, which had problems playing back and recording more than one track at the same time. This effectively meant it was difficult to use your computer to make phone calls on the internet (since you couldn't talk and hear at the same time!) or use it as a multitrack audio editor (for musicians). The PCI bus solved this problem, and nowadays virtually all sound cards are PCI. Currently we are seeing 24-bit 96 kHz sound cards emerging, which promise even better sound quality than CDs! Some sound cards also decode Dolby Digital sound, so you can connect computer speakers to them for surround sound when playing back DVDs. High-end sound cards also come with digital inputs and outputs, letting you bypass the sound card's convertors and use external ones.
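The raw data rates behind these format jumps are easy to work out. This back-of-the-envelope Python helper uses the sample rates as commonly implemented (11,025 Hz for the "11 kHz" mode):

    def bytes_per_second(bit_depth, sample_rate, channels):
        return bit_depth // 8 * sample_rate * channels

    print(bytes_per_second(8, 11025, 1))    # 11025  - early Sound Blaster
    print(bytes_per_second(16, 44100, 2))   # 176400 - CD quality
    print(bytes_per_second(24, 96000, 2))   # 576000 - 24-bit / 96 kHz

The roughly fifty-fold jump from the first line to the last goes some way to explaining why the ISA bus struggled and PCI took over.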
Recently there has been the advent of the USB and FireWire buses, which enable you to connect fast external devices to your computer. Sound cards attached to the USB bus cannot play back as many tracks simultaneously as a PCI sound card, although for people other than musicians this is hardly relevant. Also, being external, they can be used on more than one machine and on laptops, which notoriously have poor sound cards. There are also several external FireWire sound cards. These are quite expensive and designed to play back and record many tracks, so they are a waste of money if all you do is watch DVDs or play MP3s on your computer.
Sound on the PC has come a long way since all those beeps twenty years ago!


11- TOUCH SCREENS


A touch screen is a special type of visual display unit with a screen that is sensitive to pressure or touch. The screen can detect the position of the point of touch. Touch screens are best suited to inputting simple, programmable choices. The device is very user-friendly, since it 'talks' with the user as the user picks choices on the screen.
Touch technology turns a CRT, flat panel display or flat surface into a dynamic data entry device that replaces both the keyboard and mouse. In addition to eliminating these separate data entry devices, touch offers an "intuitive" interface. In public kiosks, for example, users receive no more instruction than 'touch your selection.'
Specific areas of the screen are defined as "buttons" that the operator selects simply by touching them. One significant advantage to touch screen applications is that each screen can be customized to reflect only the valid options for each phase of an operation, greatly reducing the frustration of hunting for the right key or function.
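In code, those "buttons" are little more than a table of screen rectangles. This Python sketch is illustrative only; the button names and coordinates are invented:

    buttons = {
        "COFFEE": (0, 0, 160, 120),     # (left, top, right, bottom)
        "TEA":    (160, 0, 320, 120),
        "CANCEL": (0, 120, 320, 240),
    }

    def hit_test(x, y):
        for name, (left, top, right, bottom) in buttons.items():
            if left <= x < right and top <= y < bottom:
                return name
        return None   # the touch landed outside every valid option

    print(hit_test(200, 60))   # "TEA"

Customizing a screen for each phase of an operation then amounts to swapping in a different table of rectangles.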
Pen-based systems, such as the Palm Pilot® and signature capture systems, also use touch technology but are not included in this article. The essential difference is that the pressure levels are set higher for pen-based systems than for touch.
Touch screens come in a wide range of options, from full color VGA and SVGA monitors designed for highly graphic Windows® or Macintosh® applications to small monochrome displays designed for keypad replacement and enhancement.
Specific figures on the growth of touch screen technology are hard to come by, but a 1995 study by Venture Development Corporation predicted overall growth of 17%, with at least 10% in the industrial sector.
According to Jim Sido, IBM's National Marketing Manager for Food Service Products, this year should see even greater growth than last year.
John Muhlberger, Director of Product Management at PAR Microsystems, estimated that, for POS applications, touch screen terminals outsell keyboard terminals about 4:1, even though the touch terminals cost somewhat more.
Other vendors agree that touch screen technology is becoming more popular because of its ease-of-use, proven reliability, expanded functionality, and decreasing cost.
