Monthly Archives: October 2016

History of the Microprocessor

The evolution of the microprocessor has been one of the greatest achievements of our civilization. In some cases, the terms ‘CPU’ and ‘microprocessor’ are used interchangeably to denote the same device. Like every genuine engineering marvel, the microprocessor too has evolved through a series of improvements throughout the 20th century. A brief history of the device along with its functioning is described below.

Working of a Processor

☞ The microprocessor is the central processing unit (CPU), which coordinates all the functions of a computer. It generates timing signals, and sends and receives data to and from every peripheral used inside or outside the computer.

☞ The commands required to do this are fed into the device in the form of current variations, which are converted into meaningful instructions using Boolean logic.
☞ Its functions fall into two categories: processing (arithmetic and logic) and control.

☞ The arithmetic and logic unit (ALU) and the control unit handle these functions, respectively. Information is communicated through bundles of wires called buses.

☞ The address bus carries the ‘address’ of the location with which communication is desired, while the data bus carries the data that is being exchanged.
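The address-bus/data-bus traffic described above can be pictured with a toy model in Python (a minimal sketch for illustration only; the `Bus` class and its methods are hypothetical, not a real hardware API):

```python
# Toy model: the CPU places a location on the "address bus" and
# exchanges the value over the "data bus".
class Bus:
    """Simple memory reachable through an address bus and a data bus."""

    def __init__(self, size):
        self.memory = [0] * size       # memory cells, all initially zero

    def read(self, address):
        # address bus carries the location; data bus returns the value
        return self.memory[address]

    def write(self, address, data):
        # address bus carries the location; data bus carries the value in
        self.memory[address] = data


bus = Bus(256)
bus.write(0x10, 42)        # store 42 at address 0x10
print(bus.read(0x10))      # -> 42
```

Real buses, of course, carry these values as parallel electrical signals, but the read/write pattern is the same.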
Types of Microprocessors
◆ CISC (Complex Instruction Set Computers)
◆ RISC (Reduced Instruction Set Computers)
◆ VLIW (Very Long Instruction Word Computers)
◆ Superscalar processors

Types of Specialized Processors
◆ General Purpose Processor (GPP)
◆ Special Purpose Processor (SPP)
◆ Application-Specific Integrated Circuit (ASIC)
◆ Digital Signal Processor (DSP)

History and Evolution

The First Stage
The invention of the transistor in 1947 was a significant development in the world of technology. It could perform the function of the bulky vacuum tube used in early computers. Shockley, Brattain, and Bardeen are credited with this invention and were awarded the Nobel Prize for it.

Soon, it was found that the function of this component could just as easily be performed by a group of transistors arranged on a single platform. This platform, known as the integrated circuit (IC), turned out to be a crucial achievement and brought about a revolution in the use of computers.

Jack Kilby of Texas Instruments was honored with the Nobel Prize for the invention of the IC, which laid the foundation on which microprocessors were developed. At the same time, Robert Noyce of Fairchild made a parallel development in IC technology, for which he was awarded the patent.

The Second Stage
ICs proved beyond doubt that complex functions could be integrated on a single chip with highly developed speed and storage capacity. Both Fairchild and Texas Instruments began manufacturing commercial ICs in 1961.

Later, complex developments in the IC led to the addition of more complex functions on a single chip. The stage was set for a single controlling circuit for all the computer functions.
Finally, Intel Corporation’s Ted Hoff and Federico Faggin were credited with the design of the first microprocessor.

The Third Stage
The work on this project began with an order to Intel from Busicom, a Japanese calculator company, for a set of chips. Hoff felt that the design could integrate a number of functions on a single chip, making it feasible to provide the required functionality.

This led to the design of Intel 4004, the world’s first microprocessor. The next in line was the 8-bit 8008 microprocessor. It was developed by Intel in 1972 to perform complex functions in harmony with the 4004.

This was the beginning of a new era in computer applications. The use of mainframes and huge computers was scaled down to a much smaller device that was affordable to many.
Earlier, their use was limited to large organizations and universities. With the advent of microprocessors, the use of computers trickled down to the common man.
Further Developments
▪ The next processor in line was Intel’s 8080 with an 8-bit data bus and a 16-bit address bus. This was amongst the most popular microprocessors of all time.

▪ Very soon, Motorola developed its own 6800 in competition with Intel’s 8080.

▪ Faggin left Intel and formed his own firm, Zilog. It launched a new microprocessor, the Z80, in 1976, which was far superior to the previous two.

▪ Similarly, a breakaway group from Motorola designed the 6502, a derivative of the 6800. Such attempts continued, with some modifications to the base structure.

▪ The use of microprocessors was limited to task-specific operations in industries such as the automobile sector. The concept of a ‘personal computer’ was still a distant dream for the world, and microprocessors were yet to come into personal use.

▪ 16-bit microprocessors became commercially successful in the late 1970s and early 1980s, with Texas Instruments’ TMS9900 among the first.

▪ Intel developed the 8086, whose architecture still serves as the base for the x86 family of processors. It was largely a complete processor, integrating all the required features.

▪ Motorola’s 68000 was one of the first microprocessors to make extensive use of microcoding in its instruction set. These designs were further developed into 32-bit architectures.

▪ Similarly, many players like Zilog, IBM, and Apple were successful in getting their own products in the market. However, Intel had a commanding position in the market right through the microprocessor era.

▪ The 1990s saw a large-scale application of microprocessors in personal computers from companies such as Apple, IBM, and Microsoft. The decade witnessed a revolution in the use of computers, which by then had become a household item.

▪ This growth was complemented by highly sophisticated development in the commercial use of microprocessors. In 1993, Intel brought out its ‘Pentium’ processor, one of the most popular processor lines in use to date.

▪ It was followed by a series of excellent processors of the Pentium family, leading into the 21st century. The latest one in commercial use is the Pentium Quad Core technology.

▪ They have opened up a whole new world of diverse applications. Supercomputers have become common, owing to this amazing development in microprocessors.

Certainly, these little chips have gone down in history, and they will continue to reign in the future as an ingenious creation of the human mind.

History And Working of Linux

Linux is one of the most popular operating systems and a piece of free software supporting open source development. Originally designed for Intel 80386 microprocessors, Linux now runs on a variety of computer architectures and is widely used.

A Brief History

Unix was the third operating system in a lineage that began with CTSS and continued with MULTICS. A team of programmers led by Prof. Fernando J. Corbató at the MIT Computation Center wrote CTSS, the first operating system to support the concept of time-sharing. AT&T then worked on the MULTICS operating system but left the project as it was failing to meet deadlines. Ken Thompson, Dennis Ritchie, and Brian Kernighan at Bell Labs used ideas from the MULTICS project to develop the first version of Unix.

MINIX was a Unix-like system released by Andrew Tanenbaum. Its source code was made available to users, but there were restrictions on the modification and distribution of the software. On August 25, 1991, Linus Torvalds, a second-year computer engineering student at the University of Helsinki, announced that he was going to write an operating system. With the intent to replace MINIX, Torvalds started writing the Linux kernel. With this announcement, a success story had begun! Linux previously depended on the MINIX user space, but with the adoption of the GNU GPL, the GNU developers worked toward the integration of Linux and the GNU components.

An Introduction to the Linux Operating System

The Unix-like operating system that uses the Linux kernel is known as the Linux operating system. In 1991, Linus Torvalds came up with the Linux kernel; after he started writing it, around 250 programmers contributed to the kernel code. Richard Stallman, an American software developer who was part of the GNU project, created the General Public License, under which Linux is distributed. The utilities and libraries of Linux come from the GNU operating system.

By the term ‘free software’, we mean that Linux can be copied and redistributed in the altered or unaltered form without many restrictions. Each recipient of the Linux software is entitled to obtain the human readable form of the software and a notice granting the person the permissions to modify its source code. In other words, the distribution of the Linux software implies the distribution of a free software license to its recipients. Linux supports open source development by which we mean that all its underlying source code can be freely modified, used and distributed. The open source method of development enables the users to access its source code.

A Linux distribution is a project that manages the collection of Linux software and the installation of the OS. It includes the system software and the application software in the form of packages, along with the initial installation and configuration details. There are around 300 different Linux distributions. The most prominent of them include Red Hat, Fedora, and Mandrake. Fedora Core came up after the ninth version of Red Hat Linux, and it is a rapidly updated Linux distribution. Most Linux distributions support a diverse range of programming languages, including Perl, Python, Ruby, and other dynamic languages. Linux supports a number of Java virtual machines and development kits, as well as C++ compilers.

Linux is a freely available OS based on the Linux kernel. It is an inexpensive and effective alternative to UNIX programs and utilities. Its open source implementation enables any programmer to modify its code. Linux supports a multi-tasking, multi-user environment, as well as copy-on-write functionality. The monolithic Linux kernel handles process control, networking, and the file system, and device drivers are integrated into the kernel. The Linux operating system is equipped with libraries, compilers, text editors, a Unix shell, and a windowing system. Linux supports both command-line and graphical user interfaces. It is popularly used in servers, and also in desktop computers, supercomputers, video games, and embedded systems. I have always enjoyed working on the Linux platform, have you?
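As a small illustration of what the kernel exposes to user programs, Python’s standard `platform` module can report the kernel release and machine architecture of the system it runs on (the values shown in the comments are examples, not guaranteed outputs):

```python
# Query the running operating system from Python's standard library.
import platform

print(platform.system())    # OS name, e.g. "Linux"
print(platform.release())   # kernel release, e.g. "5.15.0-91-generic"
print(platform.machine())   # hardware architecture, e.g. "x86_64"
```

On Linux, these values come from the same kernel interfaces that the `uname` command uses.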

Facts About a Personal Computer

You might be surprised to know that the first computer came in the year 1945. Early computer history tells us that the first electronic digital computer, called ENIAC (Electronic Numerical Integrator and Computer), was introduced in 1945 in Philadelphia, and it consumed so much electricity that lights in the nearby town dimmed every time it was switched on! Can you imagine how huge it was? It weighed 30 tons and required 1500 square feet of floor space! Initially, computers were so huge that they occupied a whole room. The size was reduced gradually, while the working and storage capacities improved significantly. The development of the microprocessor led to the evolution of personal computers. These days, you can find numerous types of personal or desktop computers, for example, portable PCs, laptops, palmtops, etc. The way a PC works with different gadgets is just incredible. Now that the prices for good personal computers have come down considerably, even schools in developing countries can afford PCs. Owing to the increasing demand, they are now commonly available in the market.

Personal Computer

Even though laptops are becoming more popular, there is still a big market for desktop PCs. Kids and adults both love playing PC games. A PC takes care of all your personal things. While buying a PC, you have more choices than ever before. Companies like Dell allow you to design your computer, and you can have a PC exactly the way you want it to be! They build a system just for you. You have to decide what the computer will be used for, whether you want it for playing games or music, for business, or for photography, and they provide the required things. With the help of computer software, you can use your PC as an intelligent typewriter. By using the Internet, you can send and receive mail and explore vast treasures of information. Over the years, computers have become smaller but more powerful. They have become faster and more user-friendly.

Facts
According to the available statistics, it took 38 years for radio to reach 50 million users, 13 years for TV, and only 5 years for the Internet.
One of the most amazing personal computer facts is that early hard drives in PCs held 20 megabytes (MB) and cost about $800, whereas an $8 flash drive today holds 2 gigabytes (GB). That’s a 100-fold decrease in price for a 100-fold increase in capacity!
PC viruses are little files or codes written to ruin your computer, so you need to install an antivirus scanner whether or not you’re connecting to the Internet. AVG antivirus is quite popular today. A ‘firewall’ is another line of defense, which helps block other computers and programs from connecting to you and playing with your stuff.
Computer history tells us that the first commercial computer, the UNIVAC, was introduced in 1951 by J. Presper Eckert and John W. Mauchly. In 1953, International Business Machines (IBM) entered the field of computers.
The first computer mouse was demonstrated in 1968 by Douglas Engelbart at the Fall Joint Computer Conference in San Francisco.
The first consumer computers were introduced by IBM in 1974/75. In 1981, Microsoft introduced MS-DOS, the dominant PC operating system of its era. The user-friendly Microsoft Windows was first introduced in November 1985.
In 1990, Tim Berners-Lee coined the phrase ‘World Wide Web’, and he is considered the father of the Web. The Internet gained popularity with the release of Mosaic, the first popular web browser, in 1993.
Steve Jobs and Steve Wozniak built the first Apple computer from parts they got for free from their employers. Their idea of a ‘PC’ was rejected by the employers!
One of the most amazing PC facts is that the computer mouse, the windowing GUI (graphical user interface, use of icons), laser printing and the network card were all developed at one company, ‘Xerox’ in Palo Alto, California.
Do you know that the computer in your cell phone has more processing power than all the computers aboard the Apollo 11 lunar lander that helped two men land on the moon?
I was really surprised (and pleased) when I came to know that the popular programming language COBOL was pioneered by Grace Hopper, who rose to the rank of rear admiral in the US Navy.
One of the latest products, the pocket PC, has brought the world closer together. Now we can stay in touch with the world while on the go.
Tablet PCs, ultra-mobile PCs, laptops, notebooks, home theater PCs, and workstations are all different types of personal computers. Microsoft and Intel are the dominant players in the PC market.
PC WIZARD is among the most popular and essential system information programs available. It is designed for hardware detection and analysis, and it can identify a wide range of system components while supporting the latest technologies and standards.
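The hard-drive comparison in the facts above is easy to verify with a little arithmetic (figures as quoted in the text, taking 1 GB as 1024 MB):

```python
# Early PC hard drive vs. a modern flash drive (figures from the text).
old_capacity_mb, old_price_usd = 20, 800       # 20 MB for about $800
new_capacity_mb, new_price_usd = 2 * 1024, 8   # 2 GB flash drive for $8

capacity_ratio = new_capacity_mb / old_capacity_mb   # ~102x more storage
price_ratio = old_price_usd / new_price_usd          # 100x cheaper

print(capacity_ratio, price_ratio)   # -> 102.4 100.0
```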

Edsger Dijkstra’s quote, “Computer science is no more about computers than astronomy is about telescopes”, reflects how far the technology has evolved. Most modern PCs are lightweight, and they can perform almost any function. They are so versatile that they can meet almost any need. Moreover, to use a computer, you don’t have to be a computer or electronics expert. One of the most important personal computer facts is that in the life of a computer, time goes fast. What is hot today will be lukewarm tomorrow! Facts about computers from previous decades may seem quite funny, but they make us all the more proud of human intelligence.

Macintosh Computers

The Macintosh 128K, released on January 24, 1984, was a commercial success. It was one of the first personal computers to come with a mouse and a graphical user interface. Over the passing years, Apple Inc. evolved, and today it is a business giant in the field of computers.

History

Jef Raskin, an American human-computer interface expert, was the Apple employee who came up with the idea of building an affordable and easy-to-use computer. In 1979, Raskin started putting together a team that would bring his idea to reality. He soon formed a team including Bill Atkinson, a Lisa team member, Burrell Smith, a service technician, and others, and they started working on Raskin’s idea. The first Macintosh board the team developed had 64 KB of RAM, used a Motorola microprocessor, and featured a black-and-white bitmap display.

By the end of 1980, Smith, a member of the first Macintosh team, created a board that ran at a higher speed, featured higher-capacity RAM, and supported a wider display. Steve Jobs, impressed by this design, began to take an interest in the project. His ideas heavily influenced the design of the final Macintosh. Jobs resigned from Apple in 1985.

The following years witnessed the development of desktop publishing and of applications such as Macromedia FreeHand, Adobe Photoshop, and Adobe Illustrator, which helped in the expansion of the desktop publishing market. It was also during these years that the shortfalls of the Mac were exposed to users: it did not have a hard disk drive and had little memory. In 1986, Apple came up with the Macintosh Plus. It supported some excellent features, like a parallel SCSI interface, a megabyte of expandable RAM, and support for the attachment of peripheral devices. The Mac Plus was produced until 1990, making it the longest-lived Macintosh.

In 1987, Apple introduced HyperCard and MultiFinder, which endowed the Macintosh with multitasking features. After the Macintosh II, the Macintosh SE was released. The Macintosh SE followed the Snow White design language and supported the Apple Desktop Bus mouse and keyboard.

Claris, a computer software company formed as a spin-off from Apple Computer in 1987, brought the Pro series to the market. Their line of products included MacPaint Pro, MacDraw Pro, and others. By the early 1990s, Claris had become immensely popular. Claris released ClarisWorks, which later came to be known as AppleWorks.

In 1991, Apple came up with System 7, a 32-bit rewrite of the Macintosh operating system. They soon introduced the Macintosh Quadra 700 and 900, both using the Motorola 68040 processor. They also established the Apple Industrial Design Group to work on further developments. The year 1991 also witnessed the creation of the PowerBook range. In the following year, Apple started selling its low-end Mac, the Performa. In 1994, Apple adopted the RISC PowerPC architecture developed by the alliance of Apple Computer, IBM, and Motorola. The new product line was a huge success.

Apple has always faced fierce competition from Intel and Microsoft. After the return of Steve Jobs, Apple had a ‘no looking back’ period. They introduced an all-in-one Macintosh and called it the iMac. It was a great success. In 1999, they came up with the iBook, their first consumer laptop. The Mac mini, launched in 2005, is the least expensive Mac to date. Mac OS 9 evolved into Mac OS X, which was based on Unix and arrived in 2001 after a public beta in 2000. Mac OS remains one of the most popular operating systems to date.

The glorious history of Macintosh computers convinces us of their bright future.