Question:
what is the difference between C.P.U , M.P.U and M.C.U? explain them in detail?
vishwas t
2006-12-07 22:54:09 UTC
THIS QUESTION IS RELATED TO COMPUTER SCIENCE
C.P.U.=central processing unit,
M.P.U=micro processing unit,
M.C.U.=micro controller unit
Three answers:
Tanvi B
2006-12-07 23:01:28 UTC
See, it's quite simple.

A central processing unit (CPU), or sometimes simply processor, is the component in a digital computer that interprets computer program instructions and processes data. CPUs provide the fundamental digital computer trait of programmability, and are one of the necessary components found in computers of any era, along with primary storage and input/output facilities.

A CPU that is manufactured as a single integrated circuit is usually known as a microprocessor.

A microcontroller (or MCU) is a computer-on-a-chip used to control electronic devices. It is a type of microprocessor emphasizing self-sufficiency and cost-effectiveness, in contrast to a general-purpose microprocessor (the kind used in a PC). A typical microcontroller contains all the memory and interfaces needed for a simple application, whereas a general purpose microprocessor requires additional chips to provide these functions.

So there you go.
David W
2006-12-07 23:22:53 UTC
CPU refers to the core of the chip. There could be Intel, Renesas, Freescale, or IBM cores (Pentiums, ColdFire, PowerPC, ...).

MPU is an accelerator, for example the MMU in the Pentium processors circa 1999. It gave a performance increase to graphics.

MCU is a block that handles specific interfaces. For example, a block that handles PCI traffic is an MCU.

So when you look at it from the software point of view, the CPU is the most robust and can handle all sorts of transactions that can be programmed through general inputs/outputs, while the MCU has specific functions in hardware to speed up transactions.

CPU - 80486

MPU - Southbridge device

MCU - MCF547x, MPC8220i, MCF5282

(Don't you hate it when someone just copies straight from wikipedia without doing some sort of edit...else just post a link)
anonymous
2006-12-07 23:04:40 UTC
CPU

A central processing unit (CPU), or sometimes simply processor, is the component in a digital computer that interprets computer program instructions and processes data. CPUs provide the fundamental digital computer trait of programmability, and are one of the necessary components found in computers of any era, along with primary storage and input/output facilities. A CPU that is manufactured as a single integrated circuit is usually known as a microprocessor. Beginning in the mid-1970s, microprocessors of ever-increasing complexity and power gradually supplanted other designs, and today the term "CPU" is usually applied to some type of microprocessor.

The phrase "central processing unit" is a description of a certain class of logic machines that can execute computer programs. This broad definition can easily be applied to many early computers that existed long before the term "CPU" ever came into widespread usage. However, the term itself and its initialism have been in use in the computer industry at least since the early 1960s (Weik 1961). The form, design and implementation of CPUs have changed dramatically since the earliest examples, but their fundamental operation has remained much the same.

Early CPUs were custom-designed as a part of a larger, usually one-of-a-kind, computer. However, this costly method of designing custom CPUs for a particular application has largely given way to the development of mass-produced processors that are suited for one or many purposes. This standardization trend generally began in the era of discrete transistor mainframes and minicomputers and has rapidly accelerated with the popularization of the integrated circuit (IC). The IC has allowed increasingly complex CPUs to be designed and manufactured in very small spaces (on the order of millimeters). Both the miniaturization and standardization of CPUs have increased the presence of these digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in everything from automobiles to cell phones to children's toys.

Microprocessor

[Image: Intel 80486DX2 microprocessor in a ceramic PGA package]

The introduction of the microprocessor in the 1970s significantly affected the design and implementation of CPUs. Since the introduction of the first microprocessor (the Intel 4004) in 1971 and the first widely used microprocessor (the Intel 8080) in 1974, this class of CPUs has almost completely overtaken all other central processing unit implementation methods. Mainframe and minicomputer manufacturers of the time launched proprietary IC development programs to upgrade their older computer architectures, and eventually produced instruction set compatible microprocessors that were backward-compatible with their older hardware and software. Combined with the advent and eventual vast success of the now ubiquitous personal computer, the term "CPU" is now applied almost exclusively to microprocessors.

Previous generations of CPUs were implemented as discrete components and numerous small integrated circuits (ICs) on one or more circuit boards. Microprocessors, on the other hand, are CPUs manufactured on a very small number of ICs; usually just one. The overall smaller CPU size as a result of being implemented on a single die means faster switching time because of physical factors like decreased gate parasitic capacitance. This has allowed synchronous microprocessors to have clock rates ranging from tens of megahertz to several gigahertz. Additionally, as the ability to construct exceedingly small transistors on an IC has increased, the complexity and number of transistors in a single CPU has increased dramatically. This widely observed trend is described by Moore's law, which has proven to be a fairly accurate predictor of the growth of CPU (and other IC) complexity to date.
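
As a rough, back-of-the-envelope illustration of that doubling trend, the sketch below projects transistor counts forward from the Intel 4004. The starting figure of roughly 2,300 transistors and the two-year doubling period are common textbook approximations, not numbers taken from the answer above.

```c
#include <stdio.h>
#include <math.h>

/* Rough Moore's-law sketch: transistor count doubling roughly every two
 * years, starting from the ~2,300 transistors of the Intel 4004 (1971).
 * Purely illustrative; real products deviate from the curve. */
int main(void)
{
    const double n0 = 2300.0;          /* approx. transistor count of the 4004 */
    const int    start_year = 1971;
    const double doubling_years = 2.0;

    for (int year = start_year; year <= 2006; year += 5) {
        double n = n0 * pow(2.0, (year - start_year) / doubling_years);
        printf("%d: ~%.0f transistors\n", year, n);
    }
    return 0;
}
```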

While the complexity, size, construction, and general form of CPUs have changed drastically over the past sixty years, it is notable that the basic design and function has not changed much at all. Almost all common CPUs today can be very accurately described as von Neumann stored-program machines.
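
To make the "stored-program machine" idea concrete, here is a minimal C sketch of the fetch-decode-execute cycle that every von Neumann CPU performs in some form. The tiny three-opcode instruction set is invented purely for illustration and does not correspond to any real processor.

```c
#include <stdio.h>

/* A toy von Neumann machine: program and data share one memory, and the
 * "CPU" repeatedly fetches, decodes and executes instructions until it
 * halts. The three opcodes are invented for illustration only. */
enum { OP_LOAD = 0, OP_ADD = 1, OP_HALT = 2 };

int main(void)
{
    /* Unified memory: instructions first, then a data word at address 7. */
    int mem[8] = {
        OP_LOAD, 7,   /* acc = mem[7]  */
        OP_ADD,  7,   /* acc += mem[7] */
        OP_HALT, 0,
        0,            /* unused        */
        21            /* data word     */
    };
    int pc = 0, acc = 0, running = 1;

    while (running) {
        int opcode  = mem[pc];            /* fetch */
        int operand = mem[pc + 1];
        pc += 2;
        switch (opcode) {                 /* decode and execute */
        case OP_LOAD: acc = mem[operand];  break;
        case OP_ADD:  acc += mem[operand]; break;
        case OP_HALT: running = 0;         break;
        }
    }
    printf("result: %d\n", acc);          /* prints 42 */
    return 0;
}
```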

As the aforementioned Moore's law continues to hold true, concerns have arisen about the limits of integrated circuit transistor technology. Extreme miniaturization of electronic gates is causing the effects of phenomena like electromigration and subthreshold leakage to become much more significant. These newer concerns are among the many factors causing researchers to investigate new methods of computing such as the quantum computer, as well as to expand the usage of parallelism and other methods that extend the usefulness of the classical von Neumann model.

MCU

A microcontroller (or MCU) is a computer-on-a-chip used to control electronic devices. It is a type of microprocessor emphasizing self-sufficiency and cost-effectiveness, in contrast to a general-purpose microprocessor (the kind used in a PC). A typical microcontroller contains all the memory and interfaces needed for a simple application, whereas a general purpose microprocessor requires additional chips to provide these functions. A microcontroller is a single integrated circuit, commonly with the following features:

* central processing unit - ranging from small and simple 4-bit processors to sophisticated 32- or 64-bit processors

* input/output interfaces such as serial ports

* peripherals such as timers, watchdog circuits and signal conversion circuits

* RAM for data storage

* ROM, EPROM, EEPROM or Flash memory for program storage

* clock generator - often an oscillator for a quartz timing crystal, resonator or RC circuit

This integration drastically reduces the number of chips and the amount of wiring and PCB space that would be needed to produce equivalent systems using separate chips. Microcontrollers have proved highly popular in embedded systems since their introduction in the 1970s.
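
As a hedged sketch of what that on-chip integration means for code: on a microcontroller, peripherals such as GPIO ports are typically driven by writing to fixed, memory-mapped registers, with no operating system or external support chips involved. The register addresses, bit position and delay value below are invented for a hypothetical MCU; on a real part they come from the vendor's datasheet or device header.

```c
#include <stdint.h>

/* Bare-metal sketch for a HYPOTHETICAL microcontroller: the register
 * addresses and bit numbers are made up for illustration only. */
#define GPIO_DIR_REG  (*(volatile uint32_t *)0x40020000u)  /* pin direction register */
#define GPIO_OUT_REG  (*(volatile uint32_t *)0x40020004u)  /* output latch register  */
#define LED_PIN       (1u << 5)                            /* assumed LED pin        */

static void delay(volatile uint32_t count)
{
    while (count--) { /* crude busy-wait; real code would use an on-chip timer */ }
}

int main(void)
{
    GPIO_DIR_REG |= LED_PIN;         /* configure the pin as an output */

    for (;;) {                       /* embedded firmware never returns */
        GPIO_OUT_REG ^= LED_PIN;     /* toggle the LED */
        delay(100000u);
    }
}
```

On a general-purpose microprocessor the same job would normally go through an operating system driver and external I/O hardware, which is exactly the "additional chips" contrast described above.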

Microcontrollers take the largest share of sales in the wider microprocessor market. Over 50% are "simple" controllers, and another 20% are more specialized digital signal processors (DSPs). A typical home in a developed country is likely to have only one or two general-purpose microprocessors but somewhere between one and two dozen microcontrollers. A typical mid-range vehicle has as many as 50 or more microcontrollers. They can also be found in almost any electrical device: washing machines, microwave ovens, telephones, etc.

[Image: A PIC 18F8720 microcontroller in an 80-pin TQFP package]

Manufacturers have often produced special versions of their microcontrollers to help with hardware and software development of the target system. These have included EPROM versions with a "window" on top of the device through which program memory can be erased by ultraviolet light, ready for reprogramming after a programming ("burn") and test cycle.

Other versions are available in which the ROM is accessed as an external device rather than as internal memory. A simple EPROM programmer, rather than a more complex and expensive microcontroller programmer, can then be used; however, there is a potential loss of functionality because pinouts are tied up with external memory addressing rather than general input/output. These kinds of devices usually carry a higher part price, but if the target production quantities are small, certainly in the case of a hobbyist, they can be the most economical option compared with the setup charges involved in mask-programmed devices.

A more rarely encountered development microcontroller is the "piggy-back" version. This device has no internal ROM; instead, pinouts on top of the microcontroller form a socket into which a standard EPROM program memory device can be installed. The benefit of this approach is that it frees microcontroller pins for input and output use rather than program memory. These devices are normally expensive and impractical for anything but the development phase of a project.


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.