Tuesday, 23 November 2010

What are the functions of a processor?

A central processing unit (CPU), sometimes simply called the processor, describes a class of logic machines that can execute computer programs. This broad definition can easily be applied to many early computers that existed long before the term "CPU" ever came into widespread usage. The term itself and its initialism have been in use in the computer industry at least since the early 1960s (Weik 1961). The form, design, and implementation of CPUs have changed dramatically since the earliest examples, but their fundamental operation has remained much the same.

Early CPUs were custom-designed as a part of a larger, sometimes one-of-a-kind, computer. However, this costly method of designing custom CPUs for a particular application has largely given way to the development of mass-produced processors that are suited for one or many purposes. This standardization trend generally began in the era of discrete transistor mainframes and minicomputers and has rapidly accelerated with the popularization of the integrated circuit (IC). The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers. Both the miniaturization and standardization of CPUs have increased the presence of these digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in everything from automobiles to cell phones to children's toys.

What is the function of a socket processor?

"Socket" is the name for the frame the processor is seated in; "Socket A processors", for example, may be what you mean. They perform the same task as any other processor; they just happen to fit a standardized socket called "Socket A". A socket has no processor of its own: it simply connects the CPU to the motherboard.


Monday, 22 November 2010

Nuclear weapon

A nuclear weapon is an explosive device that derives its destructive force from nuclear reactions, either fission or a combination of fission and fusion. Both reactions release vast quantities of energy from relatively small amounts of matter. The first fission ("atomic") bomb test released the same amount of energy as approximately 20,000 tons of TNT. The first thermonuclear ("hydrogen") bomb test released the same amount of energy as approximately 10,000,000 tons of TNT.
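
These TNT-equivalent figures can be put in absolute terms using the standard convention that one ton of TNT corresponds to 4.184 gigajoules. A quick sketch (assuming only that conventional conversion factor):

```python
# Energy released per ton of TNT, by the standard convention: 4.184e9 joules.
TNT_JOULES_PER_TON = 4.184e9

def tnt_to_joules(tons: float) -> float:
    """Convert a TNT-equivalent yield in tons to joules."""
    return tons * TNT_JOULES_PER_TON

# The first fission bomb test (~20,000 tons of TNT equivalent):
fission_j = tnt_to_joules(20_000)       # roughly 8.4e13 J
# The first thermonuclear bomb test (~10,000,000 tons of TNT equivalent):
fusion_j = tnt_to_joules(10_000_000)    # roughly 4.2e16 J
print(f"{fission_j:.2e} J, {fusion_j:.2e} J")
```

The ratio of the two results shows the thermonuclear test releasing about 500 times the energy of the fission test.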

A modern thermonuclear weapon weighing little more than a thousand kilograms (2,200 pounds) can produce an explosion comparable to the detonation of more than a billion kilograms (2.2 billion pounds) of conventional high explosive. Thus, even single small nuclear devices no larger than traditional bombs can devastate an entire city by blast, fire and radiation. Nuclear weapons are considered weapons of mass destruction, and their use and control has been a major focus of international relations policy since their debut.

In the history of warfare, only two nuclear weapons have been detonated offensively, both near the end of World War II. The first was detonated on the morning of 6 August 1945, when the United States dropped a uranium gun-type device code-named "Little Boy" on the Japanese city of Hiroshima. The second was detonated three days later, when the United States dropped a plutonium implosion-type device code-named "Fat Man" on the city of Nagasaki, Japan. These two bombings resulted in the deaths of approximately 200,000 Japanese people (mostly civilians) from acute injuries sustained in the explosions. The role of the bombings in Japan's surrender, and the U.S.'s ethical justification for them, remain the subject of scholarly and popular debate.

Since the Hiroshima and Nagasaki bombings, nuclear weapons have been detonated on over two thousand occasions for testing and demonstration purposes. Only a few states possess such weapons or are suspected of seeking them. The only countries known to have detonated nuclear weapons, and that acknowledge possessing them, are (chronologically) the United States, the Soviet Union (succeeded as a nuclear power by Russia), the United Kingdom, France, the People's Republic of China, India, Pakistan, and North Korea. Israel is also widely believed to possess nuclear weapons, though it does not acknowledge having them.

Types of nuclear weapons

There are two basic types of nuclear weapon. The first type produces its explosive energy through nuclear fission reactions alone. Such fission weapons are commonly referred to as atomic bombs or atom bombs (abbreviated as A-bombs), even though their energy comes specifically from the nucleus of the atom.

In fission weapons, a mass of fissile material (enriched uranium or plutonium) is assembled into a supercritical mass—the amount of material needed to start an exponentially growing nuclear chain reaction—either by shooting one piece of sub-critical material into another (the "gun" method) or by compressing a sub-critical sphere of material to many times its original density using chemical explosives (the "implosion" method). The latter approach is considered more sophisticated than the former, and only the latter can be used if the fissile material is plutonium.
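
The phrase "exponentially growing chain reaction" can be made concrete with a toy multiplication model. This is an illustrative sketch only, not weapons physics: the multiplication factor `k` and the generation count are made-up numbers chosen to show the shape of the growth.

```python
def neutron_population(n0: float, k: float, generations: int) -> float:
    """Toy model: each generation, every neutron triggers a fission that
    yields k new neutrons on average. Supercritical means k > 1, so the
    population grows geometrically from one generation to the next."""
    return n0 * k ** generations

# With an illustrative k = 2, a single neutron multiplies past 10**24
# in about 80 generations, which is why a supercritical assembly
# releases its energy almost instantaneously.
print(neutron_population(1, 2, 80))  # ~1.2e24
```

With k below 1 (sub-critical), the same formula instead decays toward zero, which is why the two sub-critical pieces are safe until assembled.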

A major challenge in all nuclear weapon designs is to ensure that a significant fraction of the fuel is consumed before the weapon destroys itself. The amount of energy released by fission bombs can range from the equivalent of less than a ton of TNT to upwards of 500,000 tons (500 kilotons) of TNT.

The second basic type of nuclear weapon produces a large amount of its energy through nuclear fusion reactions. Such fusion weapons are generally referred to as thermonuclear weapons or more colloquially as hydrogen bombs (abbreviated as H-bombs), as they rely on fusion reactions between isotopes of hydrogen (deuterium and tritium). However, all such weapons derive a significant portion, and sometimes a majority, of their energy from fission (including fission induced by neutrons from fusion reactions). Unlike fission weapons, there are no inherent limits on the energy released by thermonuclear weapons. Only six countries—United States, Russia, United Kingdom, People's Republic of China, France and India—have conducted thermonuclear weapon tests. (Whether India has detonated a "true", multi-staged thermonuclear weapon is controversial.)
The basics of the Teller–Ulam design for a hydrogen bomb: a fission bomb uses radiation to compress and heat a separate section of fusion fuel.

Thermonuclear bombs work by using the energy of a fission bomb to compress and heat fusion fuel. In the Teller–Ulam design, which accounts for all multi-megaton-yield hydrogen bombs, this is accomplished by placing a fission bomb and fusion fuel (tritium, deuterium, or lithium deuteride) in proximity within a special, radiation-reflecting container. When the fission bomb is detonated, the gamma rays and X-rays it emits first compress the fusion fuel, then heat it to thermonuclear temperatures. The ensuing fusion reaction creates enormous numbers of high-speed neutrons, which can then induce fission in materials not normally prone to it, such as depleted uranium. Each of these components is known as a "stage", with the fission bomb as the "primary" and the fusion capsule as the "secondary". In large hydrogen bombs, about half of the yield, and much of the resulting nuclear fallout, comes from the final fissioning of depleted uranium.

By chaining together numerous stages with increasing amounts of fusion fuel, thermonuclear weapons can be made to an almost arbitrary yield; the largest ever detonated (the Tsar Bomba of the USSR) released an energy equivalent of over 50 million tons (50 megatons) of TNT. Most thermonuclear weapons are considerably smaller than this, due to practical constraints arising from the space and weight requirements of missile warheads.

There are other types of nuclear weapons as well. For example, a boosted fission weapon is a fission bomb which increases its explosive yield through a small amount of fusion reactions, but it is not a fusion bomb. In the boosted bomb, the neutrons produced by the fusion reactions serve primarily to increase the efficiency of the fission bomb. Some weapons are designed for special purposes; a neutron bomb is a thermonuclear weapon that yields a relatively small explosion but a relatively large amount of neutron radiation; such a device could theoretically be used to cause massive casualties while leaving infrastructure mostly intact and creating a minimal amount of fallout.

The detonation of any nuclear weapon is accompanied by a blast of neutron radiation. Surrounding a nuclear weapon with suitable materials (such as cobalt or gold) creates a weapon known as a salted bomb. This device can produce exceptionally large quantities of radioactive contamination.

Most variation in nuclear weapon design is for the purpose of achieving different yields for different situations, and in manipulating design elements to attempt to minimize weapon size.

Weapons delivery

Nuclear weapons delivery—the technology and systems used to bring a nuclear weapon to its target—is an important aspect of nuclear weapons, relating both to nuclear weapon design and to nuclear strategy. Additionally, development and maintenance of delivery options is among the most resource-intensive aspects of a nuclear weapons program: according to one estimate, deployment costs accounted for 57% of the total financial resources spent by the United States in relation to nuclear weapons since 1940.

Historically the first method of delivery, and the method used in the two nuclear weapons actually used in warfare, was as a gravity bomb, dropped from bomber aircraft. This method is usually the first developed by countries, as it does not place many restrictions on the size of the weapon, whereas weapon miniaturization requires considerable weapons-design knowledge. It does, however, limit the range of attack, the response time to an impending attack, and the number of weapons that can be fielded at any given time.

With the advent of miniaturization, nuclear bombs can be delivered by both strategic bombers and tactical fighter-bombers, allowing an air force to use its current fleet with little or no modification. This method may still be considered the primary means of nuclear weapons delivery; the majority of U.S. nuclear warheads, for example, are free-fall gravity bombs, namely the B61.
A Trident II SLBM launched from a Royal Navy Vanguard class ballistic missile submarine.

Preferable from a strategic point of view is a nuclear weapon mounted on a missile, which can use a ballistic trajectory to deliver the warhead over the horizon. While even short-range missiles allow for a faster and less vulnerable attack, the development of long-range intercontinental ballistic missiles (ICBMs) and submarine-launched ballistic missiles (SLBMs) has given some nations the ability to plausibly deliver missiles anywhere on the globe with a high likelihood of success.

More advanced systems, such as multiple independently targetable reentry vehicles (MIRVs), allow multiple warheads to be launched at different targets from one missile, reducing the chance of a successful missile defense. Today, missiles are most common among systems designed for delivery of nuclear weapons. Making a warhead small enough to fit onto a missile, though, can be a difficult task.

Tactical weapons have seen the greatest variety of delivery types, including not only gravity bombs and missiles but also artillery shells, land mines, and nuclear depth charges and torpedoes for anti-submarine warfare. An atomic mortar was also tested at one time by the United States. Small, two-man portable tactical weapons (somewhat misleadingly referred to as suitcase bombs), such as the Special Atomic Demolition Munition, have been developed, although the difficulty of combining sufficient yield with portability limits their military utility.

Governance, control, and law

Because of the immense military power they can confer, the political control of nuclear weapons has been a key issue for as long as they have existed; in most countries the use of nuclear force can only be authorized by the head of government or head of state.

In the late 1940s, lack of mutual trust prevented the United States and the Soviet Union from making headway toward international arms control agreements, but by the 1960s steps were being taken to limit both the proliferation of nuclear weapons to other countries and the environmental effects of nuclear testing. The Partial Test Ban Treaty (1963) restricted all nuclear testing to underground nuclear testing, to prevent contamination from nuclear fallout, while the Nuclear Non-Proliferation Treaty (1968) attempted to place restrictions on the types of activities signatories could participate in, with the goal of allowing the transfer of non-military nuclear technology to member countries without fear of proliferation.

In 1957, the International Atomic Energy Agency (IAEA) was established under the mandate of the United Nations to encourage the development of peaceful applications of nuclear technology, provide international safeguards against its misuse, and facilitate the application of safety measures in its use. In 1996, many nations signed and ratified the Comprehensive Test Ban Treaty, which prohibits all testing of nuclear weapons; such a ban would impose a significant hindrance on nuclear weapons development by any complying country.

Additional treaties have governed nuclear weapons stockpiles between individual countries, such as the SALT I and START I treaties, which limited the numbers and types of nuclear weapons between the United States and the Soviet Union.

Nuclear weapons have also been opposed by agreements between countries. Many nations have been declared nuclear-weapon-free zones, areas where nuclear weapons production and deployment are prohibited, through the use of treaties. The Treaty of Tlatelolco (1967) prohibited any production or deployment of nuclear weapons in Latin America and the Caribbean, and the Treaty of Pelindaba (1996) prohibits nuclear weapons in many African countries. As recently as 2006 a Central Asian Nuclear Weapon Free Zone was established among the former Soviet republics of Central Asia, prohibiting nuclear weapons.

In the middle of 1996, the International Court of Justice, the highest court of the United Nations, issued an Advisory Opinion concerned with the "Legality of the Threat or Use of Nuclear Weapons". The court ruled that the use or threat of use of nuclear weapons would violate various articles of international law, including the Geneva Conventions, the Hague Conventions, the UN Charter, and the Universal Declaration of Human Rights. In view of the unique, destructive characteristics of nuclear weapons, the International Committee of the Red Cross calls on States to ensure that these weapons are never used, irrespective of whether they consider them to be lawful or not.

Additionally, there have been other, specific actions meant to discourage countries from developing nuclear arms. In the wake of the tests by India and Pakistan in 1998, economic sanctions were (temporarily) levied against both countries, though neither was a signatory to the Nuclear Non-Proliferation Treaty. One of the stated casus belli for the initiation of the 2003 Iraq War was an accusation by the United States that Iraq was actively pursuing nuclear arms (though this was soon discovered not to be the case, as the program had been discontinued). In 1981, Israel bombed a nuclear reactor being constructed at Osirak, Iraq, in what it called an attempt to halt Iraq's nuclear arms ambitions.


Beginning with the 1963 Partial Test Ban Treaty and continuing through the 1996 Comprehensive Test Ban Treaty, there have been many treaties to limit or reduce nuclear weapons testing and stockpiles. The 1968 Nuclear Non-Proliferation Treaty has as one of its explicit conditions that all signatories must "pursue negotiations in good faith" towards the long-term goal of "complete disarmament". However, no nuclear state has treated that aspect of the agreement as having binding force.

Only one country—South Africa—has ever fully renounced nuclear weapons it had independently developed. A number of former Soviet republics—Belarus, Kazakhstan, and Ukraine—returned Soviet nuclear arms stationed in their countries to Russia after the collapse of the USSR.
The 1962 Sedan nuclear test formed a crater 100 m (330 ft) deep with a diameter of about 390 m (1,300 ft), as a means of investigating the possibilities of using peaceful nuclear explosions for large-scale earth moving.

Apart from their use as weapons, nuclear explosives have been tested and used for various non-military purposes, and have been proposed, but never used, for large-scale earth moving. When long-term health and clean-up costs were included, there was no economic advantage over conventional explosives.

Synthetic elements, such as einsteinium and fermium, created by neutron bombardment of uranium and plutonium during thermonuclear explosions, were discovered in the aftermath of the first thermonuclear bomb test. In 2008, the worldwide presence of isotopes produced by atmospheric testing since the 1950s was shown to provide a reliable way of detecting art forgeries: paintings created after that period may contain traces of cesium-137 and strontium-90, isotopes that did not exist in nature before 1945.

Nuclear explosives have also been seriously studied as potential propulsion mechanisms for space travel (see Project Orion).


Sunday, 21 November 2010

Computer Aided Design (CAD)

Computer-aided design (CAD), also known as computer-aided design and drafting (CADD), is the use of computer technology for the process of design and design documentation. Computer-aided drafting describes the process of drafting with a computer. CADD software, or environments, provide the user with input tools that streamline design, drafting, documentation, and manufacturing processes. CADD output is often in the form of electronic files for print or machining operations. The development of CADD-based software corresponds directly to the processes it seeks to economize; industry-based software (construction, manufacturing, etc.) typically uses vector-based (linear) environments, whereas graphic-based software uses raster-based (pixelated) environments.

CADD environments often involve more than just shapes. As in the manual drafting of technical and engineering drawings, the output of CAD must convey information, such as materials, processes, dimensions, and tolerances, according to application-specific conventions.

CAD may be used to design curves and figures in two-dimensional (2D) space, or curves, surfaces, and solids in three-dimensional (3D) space.
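
As a concrete example of the kind of 2D curve a CAD system lets a user design, here is a minimal cubic Bézier evaluator using de Casteljau's algorithm. This is a generic sketch of the technique, not code from any particular CAD package:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def de_casteljau(control: List[Point], t: float) -> Point:
    """Evaluate a Bézier curve at parameter t in [0, 1] by repeatedly
    linearly interpolating adjacent points of the control polygon."""
    pts = list(control)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# A cubic curve: the endpoints are interpolated exactly,
# the two inner control points shape the curve between them.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
print(de_casteljau(ctrl, 0.0))  # (0.0, 0.0), the start point
print(de_casteljau(ctrl, 0.5))  # (2.0, 1.5), the curve's midpoint
```

Dragging a control point and re-evaluating over many values of t is, in essence, what a CAD curve tool does interactively.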

CAD is an important industrial art extensively used in many applications, including automotive, shipbuilding, and aerospace industries, industrial and architectural design, prosthetics, and many more. CAD is also widely used to produce computer animation for special effects in movies, advertising and technical manuals. The modern ubiquity and power of computers means that even perfume bottles and shampoo dispensers are designed using techniques unheard of by engineers of the 1960s. Because of its enormous economic importance, CAD has been a major driving force for research in computational geometry, computer graphics (both hardware and software), and discrete differential geometry.

The design of geometric models for object shapes, in particular, is often called computer-aided geometric design (CAGD).


Current computer-aided design software packages range from 2D vector-based drafting systems to 3D solid and surface modellers. Modern CAD packages also frequently allow rotations in three dimensions, permitting a designed object to be viewed from any desired angle, even from the inside looking out. Some CAD software is capable of dynamic mathematical modeling, in which case it may be marketed as CADD, computer-aided design and drafting.
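
The 3D view rotation mentioned above comes down to multiplying model coordinates by a rotation matrix. A minimal sketch in plain Python, with no graphics library assumed:

```python
import math

def rotate_z(point, angle_rad):
    """Rotate a 3D point about the z-axis by angle_rad radians,
    i.e. apply the standard 3x3 z-rotation matrix to it."""
    x, y, z = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y, z)

# Rotating (1, 0, 0) by 90 degrees about z lands (up to rounding) on (0, 1, 0).
x, y, z = rotate_z((1.0, 0.0, 0.0), math.pi / 2)
print(round(x, 9), round(y, 9), round(z, 9))  # 0.0 1.0 0.0
```

A CAD viewer applies a transform like this (composed for all three axes) to every vertex of the model each time the user orbits the view.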

CAD is used in the design of tools and machinery and in the drafting and design of all types of buildings, from small residential types (houses) to the largest commercial and industrial structures (hospitals and factories).

CAD is mainly used for detailed engineering of 3D models and/or 2D drawings of physical components, but it is also used throughout the engineering process, from conceptual design and layout of products, through strength and dynamic analysis of assemblies, to the definition of manufacturing methods for components.

CAD has become an especially important technology within the scope of computer-aided technologies, with benefits such as lower product development costs and a greatly shortened design cycle. CAD enables designers to lay out and develop work on screen, print it out and save it for future editing, saving time on their drawings.


Computer-aided design is one of the many tools used by engineers and designers and is used in many ways depending on the profession of the user and the type of software in question.

CAD is one part of the whole Digital Product Development (DPD) activity within the Product Lifecycle Management (PLM) process, and as such is used together with other tools, which are either integrated modules or stand-alone products, such as:

Computer-aided engineering (CAE) and Finite element analysis (FEA)
Computer-aided manufacturing (CAM) including instructions to Computer Numerical Control (CNC) machines
Photo realistic rendering
Document management and revision control using Product Data Management (PDM).

CAD is also used for the accurate creation of photo simulations that are often required in the preparation of Environmental Impact Reports, in which computer-aided designs of intended buildings are superimposed onto photographs of existing environments to represent what the locale would look like if the proposed facilities were built. Potential blockage of view corridors and shadow studies are also frequently analyzed through the use of CAD.


Originally, software for computer-aided design systems was developed in languages such as Fortran, but with the advancement of object-oriented programming methods this has radically changed. Typical modern parametric feature-based modelers and freeform surface systems are built around a number of key C modules with their own APIs. A CAD system can be seen as built up from the interaction of a graphical user interface (GUI) with NURBS geometry and/or boundary representation (B-rep) data via a geometric modeling kernel. A geometry constraint engine may also be employed to manage the associative relationships between geometry, such as wireframe geometry in a sketch or components in an assembly.
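
To make the B-rep idea concrete, here is a minimal, hypothetical boundary-representation data structure: a solid described purely by its boundary topology. This is an illustrative sketch only; the kernels inside real CAD systems track far more (faces bound by edge loops, surface geometry per face, orientation, and so on):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class BRepSolid:
    """Toy boundary representation of a solid: vertices, edges
    (as vertex-index pairs), and faces (as vertex-index loops)."""
    vertices: List[Tuple[float, float, float]] = field(default_factory=list)
    edges: List[Tuple[int, int]] = field(default_factory=list)
    faces: List[List[int]] = field(default_factory=list)

    def euler_characteristic(self) -> int:
        # For a simple closed solid, Euler's formula requires
        # V - E + F == 2; a modeling kernel can use this as a cheap
        # topological sanity check on its data.
        return len(self.vertices) - len(self.edges) + len(self.faces)

# A unit cube: 8 vertices, 12 edges, 6 faces.
cube = BRepSolid(
    vertices=[(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)],
    edges=[(0, 1), (0, 2), (0, 4), (1, 3), (1, 5), (2, 3),
           (2, 6), (3, 7), (4, 5), (4, 6), (5, 7), (6, 7)],
    faces=[[0, 1, 3, 2], [4, 5, 7, 6], [0, 1, 5, 4],
           [2, 3, 7, 6], [0, 2, 6, 4], [1, 3, 7, 5]],
)
print(cube.euler_characteristic())  # 2
```

Operations like extrude or fillet in a real kernel are edits to exactly this kind of topology plus the attached surface geometry.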

Unexpected capabilities of these associative relationships have led to a new form of prototyping called digital prototyping, which, unlike physical prototyping, does not add manufacturing time to the design process.

Today, CAD systems exist for all the major platforms (Windows, Linux, UNIX and Mac OS X); some packages even support multiple platforms.

At present, no special hardware is required for most CAD software. However, some CAD systems perform graphically and computationally intensive tasks, so a good graphics card, high-speed (and possibly multiple) CPUs, and large amounts of RAM are recommended.

The human-machine interface is generally via a computer mouse but can also be via a pen and digitizing graphics tablet. Manipulation of the view of the model on the screen is also sometimes done with the use of a spacemouse/SpaceBall. Some systems also support stereoscopic glasses for viewing the 3D model.
Effects

Beginning in the 1980s, computer-aided design programs reduced the need for draftsmen significantly, especially in small to mid-sized companies. Their affordability and ability to run on personal computers also allowed engineers to do their own drafting work, eliminating the need for entire departments. Today most, if not all, university students do not learn hand-drafting techniques because they are no longer required to; the days of hand-drawn final drawings are almost over.[4] Universities such as the New Jersey Institute of Technology no longer require the use of protractors and compasses to create drawings; instead, several classes focus on the use of CAD software such as Pro Engineer or IDEAS-MS.

Another consequence has been that, since the latest advances were often quite expensive, small and even mid-size firms often could not compete against large firms who could use their computational edge for competitive purposes.[citation needed] Today, however, hardware and software costs have come down. Even high-end packages work on less expensive platforms, and some even support multiple platforms. The costs associated with CAD implementation are now weighted more heavily toward the costs of training in the use of these high-level tools, the cost of integrating CAD/CAM/CAE PLM across multi-CAD and multi-platform environments, and the costs of modifying design workflows to exploit the full advantage of CAD tools. CAD vendors have effectively lowered these training costs. Their methods can be split into three categories:

Improved and simplified user interfaces. This includes the availability of “role” specific tailorable user interfaces through which commands are presented to users in a form appropriate to their function and expertise.
Enhancements to application software. One such example is improved design-in-context, through the ability to model/edit a design component from within the context of a large, even multi-CAD, active digital mockup.
User oriented modeling options. This includes the ability to free the user from the need to understand the design intent history of a complex intelligent model.

Source : Wikipedia


Sunday, 07 November 2010

The 10 best antivirus programs in the world

1. BitDefender Antivirus Pro 2011

Publisher: SOFTWIN
Version: Pro 2011
Pros: BitDefender Antivirus Pro 2011 has the most comprehensive feature set and is completely flexible.
Cons: The software is a little heavy on the resources used (comparatively), though you won't notice a difference on most computers.
The Verdict: BitDefender's base-line antivirus software is well-balanced, powerful, and effective.

2. Kaspersky Anti-Virus 2011

Publisher: Kaspersky
Version: 2011
Pros: Kaspersky Anti-Virus has all the right features and tools, packed in an interface that is advanced but straightforward.
Cons: The price for a 3-user license is definitely worth it, but isn't as competitive.
The Verdict: This antivirus power-house is flexible enough for experts, simple enough for beginners, and effective enough for everyone.

3. Webroot AntiVirus 2011

Publisher: Webroot Software
Version: 2011
Pros: The software combines antivirus protection from Sophos with the best antispyware program Spy Sweeper for an effective one-two punch against malware.
Cons: Webroot is still missing a few nice features like a laptop mode and link scanner.
The Verdict: Webroot has one of the best antispyware/antivirus combo packages we've seen.

4. Norton AntiVirus 2011

Publisher: Symantec Corporation
Version: 2011
Pros: Norton 2011 improves on all the great features from last year and adds a few more. Norton AntiVirus continues to provide proven protection while maintaining a light footprint.
Cons: The performance issues of a few years ago are gone, but Norton still doesn't always play nice with other applications, or uninstall completely.
The Verdict: Norton's most basic antivirus protection is superior to most of the competition, and includes some features you simply won't find elsewhere.

5. ESET Nod32 Antivirus 4

Publisher: Eset
Version: 4
Pros: Proven security without the slowdown, ESET features heuristic detection and advanced diagnostic tools.
Cons: ESET has all the essentials covered, but misses others like IM protection and antiphishing. The interface is good, but not great.
The Verdict: We can definitely give ESET a 'nod' of approval for security and performance.

6. AVG Anti-Virus 2011

Publisher: AVG Technologies
Version: 2011
Pros: AVG Anti-Virus 2011 includes the LinkScanner and Social Networking Protection.
Cons: AVG’s scanning engine is effective, but not the fastest. The software also overprotects at times, causing false positives.
The Verdict: AVG Anti-Virus continues to deliver new features and is one of the best antivirus software available, just not THE best.

7. G DATA AntiVirus 2011

Publisher: GData
Version: 2011
Pros: G Data AntiVirus provides unparalleled security by using two separate scanning engines, heuristics and self-learning fingerprinting.
Cons: Unfortunately, the software is missing a few extra features such as a gamer mode, battery saving mode or link scanner.
The Verdict: G Data is affordable antivirus software with a high level of security.

8. Avira AntiVir Premium

Publisher: Avira GmbH
Version: Premium
Pros: Avira AntiVir Premium balances protection and performance, and includes protection from a number of threats.
Cons: The Advanced Heuristics Analysis and Detection (AHeAD) technology sometimes overprotects by flagging non-malware as dangerous (false positives).
The Verdict: Known for their cost-effective antivirus software, Avira AntiVir is a good overall solution.

9. Vipre Antivirus 4

Publisher: Sunbelt Software
Version: 4
Pros: Vipre has one of the most efficient scanning engines with a small scan footprint. The home site license is perfect for households with multiple PCs.
Cons: The installation is thorough, but requires a restart.
The Verdict: The Vipre brand may still be pretty new, but the software is definitely worth looking into.

10. Trend Micro Titanium Antivirus +

Publisher: Trend Micro
Version: Antivirus +
Pros: Trend Micro Titanium Antivirus + features cloud security for real-time protection and updates.
Cons: The software has fallen behind a bit, and is still missing a handful of nice features and tools that you can find elsewhere.
The Verdict: Trend Micro Titanium Antivirus + offers good protection, but the software still has a ways to go to catch up with the competition.


Saturday, 06 November 2010

Robot
A robot is a virtual or mechanical artificial agent. In practice, it is usually an electro-mechanical machine which is guided by computer or electronic programming, and is thus able to do tasks on its own. Another common characteristic is that by its appearance or movements, a robot often conveys a sense that it has intent or agency of its own.


Building the robot of Leonardo da Vinci

Since the beginnings of civilisation man has had a fascination with a human-like creation that would assist him. Societies in the early part of the first millennium engaged in slavery and used those slaves to perform tasks that were dirty or menial. Having slaves freed the enslavers to carry on their society and concentrate on what they perceived as more important tasks, such as business and politics. Man had discovered mechanics and the means of creating complex mechanisms that would perform repetitive functions, such as waterwheels and pumps. Technological advances were slow, but there were more complex machines, generally limited to a very small number, which performed grander functions, such as those invented by Hero of Alexandria.

In the first half of the second millennium man began to develop more complex machines, as well as rediscovering the Greek engineering methods. Men from Leonardo da Vinci in 1495 through to Jacques de Vaucanson in 1739 made plans for, and built, automata and robots, leading to books of designs such as the Japanese Karakuri zui (Illustrated Machinery) of 1796. As mechanical techniques developed through the industrial age, more practical applications appeared, such as Nikola Tesla's radio-controlled torpedo of 1898 and the Westinghouse Electric Corporation's Televox of 1926. From here development turned more android, as designers tried to mimic human-like features, including designs such as biologist Makoto Nishimura's 1929 creation Gakutensoku, which cried and changed its facial expressions, and the cruder Elektro from Westinghouse in 1938.

Electronics, rather than mechanics, now became the driving force of development with the advent of the first electronic autonomous robots, created by William Grey Walter in Bristol, England, in 1948. The first digital and programmable robot was invented by George Devol in 1954 and was ultimately called the Unimate. Devol sold the first Unimate to General Motors in 1960, where it was used to lift pieces of hot metal from die casting machines in a plant in Trenton, New Jersey.

Since then, robots have finally reached a truer assimilation of all these technologies, producing machines such as ASIMO, which can walk and move like a human. Robots have replaced slaves in performing the repetitive and dangerous tasks which humans prefer not to do, or are unable to do because of size limitations or extreme environments, such as outer space or the bottom of the sea, where humans could not survive.

Robots come in two basic forms: those used to make or move things, such as industrial robots and mobile or servicing robots, and those used for research, whether into human-like robots such as ASIMO and TOPIO or into more specific roles such as nanorobots and swarm robots.

Man has developed a fear of the autonomous robot and how it may behave in society, as seen in Shelley's Frankenstein and reactions to the EATR, and yet we still use robots for a wide variety of tasks such as vacuuming floors, mowing lawns, cleaning drains, investigating other planets, building cars, entertainment and warfare.


Date | Significance | Robot name | Inventor
1st century AD and earlier | Descriptions of over a hundred machines and automata, including a fire engine, wind organ, coin-operated machine, and steam-powered aeolipile, in Pneumatica and Automata by Heron | — | Ctesibius, Philo, Heron, and others
1206 | Early programmable automata | Robot band | Al-Jazari
c. 1495 | Designs for a humanoid robot | Mechanical knight | Leonardo da Vinci
1738 | Mechanical duck that was able to eat, flap its wings, and excrete | Digesting Duck | Jacques de Vaucanson
19th century | Japanese mechanical toys that served tea, fired arrows, and painted | Karakuri toys | Hisashige Tanaka
1921 | First fictional automata called "robots" appear in the play R.U.R. | Rossum's Universal Robots | Karel Čapek
1928 | Humanoid robot, based on a suit of armor with electrical actuators, exhibited at the annual exhibition of the Model Engineers Society in London | Eric | W. H. Richards
1930s | Humanoid robot exhibited at the 1939 and 1940 World's Fairs | Elektro | Westinghouse Electric Corporation
1948 | Simple robots exhibiting biological behaviors | Elsie and Elmer | William Grey Walter
1956 | First commercial robot, from the Unimation company founded by George Devol and Joseph Engelberger, based on Devol's patents | Unimate | George Devol
1961 | First installed industrial robot | Unimate | George Devol
1963 | First palletizing robot | Palletizer | Fuji Yusoki Kogyo
1973 | First robot with six electromechanically driven axes | Famulus | KUKA Robot Group
1975 | Programmable universal manipulation arm, a Unimation product | PUMA | Victor Scheinman

Source : Wikipedia

Read More......

Future Car Technologies

Potential future car technologies include new energy sources and materials, which are being developed in order to make automobiles more sustainable, safer, more energy efficient, or less polluting. Cars are being developed in many different ways.

With rising gas prices, the future of cars is leaning towards fuel efficiency, energy-savers, hybrid vehicles, battery electric vehicles and fuel-cell vehicles (Xiang, Jia, Jianzhong, Zhibiao, Yuanzhang, & Qinglin 2008).

Energy source

One major problem in developing cleaner, energy efficient automobiles is the source of power to drive the engine. A variety of alternative fuel vehicles have been proposed or sold, including electric cars, hydrogen cars, and compressed-air cars.

In one experiment done to improve the future of cars, a new kind of battery was installed which can be easily removed and recharged in two different ways: first, by a generator integrated with the IC engine, and second, by removing the cassettes so that they can be recharged off-board in the home (Charters, Watkinson, Wykes, & Simpkin, 2008).

Energy savers

Conventional automobiles operate at about 15% efficiency. The rest of the energy is lost to engine and drive-train inefficiencies and idling. Therefore, the potential to improve fuel efficiency with advanced technologies is enormous.
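The scale of that loss can be illustrated with a back-of-envelope calculation. This is only a sketch: the figure of roughly 34 MJ of chemical energy per litre of gasoline is an assumed round value, and `useful_energy_mj` is a hypothetical helper, not part of any cited study.

```python
# Rough illustration of the ~15% tank-to-wheels efficiency figure above.
# Assumed value: gasoline holds roughly 34 MJ of chemical energy per litre.
GASOLINE_MJ_PER_LITRE = 34.0

def useful_energy_mj(litres, efficiency=0.15):
    """Energy actually delivered to the wheels, in megajoules."""
    return litres * GASOLINE_MJ_PER_LITRE * efficiency

energy = useful_energy_mj(1.0)           # one litre of fuel
wasted = GASOLINE_MJ_PER_LITRE - energy  # lost to the engine, drive-train, idling
print(f"useful: {energy:.1f} MJ, wasted: {wasted:.1f} MJ")
# → useful: 5.1 MJ, wasted: 28.9 MJ
```

On these assumed numbers, only about 5 MJ of every litre burned ever reaches the wheels, which is why even modest efficiency gains translate into large fuel savings.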

Various technologies have been developed and utilized to increase the energy efficiency of conventional cars or supplement them, resulting in energy savings.

Regenerative braking: saves and stores braking energy for future use or as back-up power. When conventional brakes are used, 30% of the energy is lost in the form of heat (Raunekk, 2009); regenerative braking instead uses this energy to recharge the batteries in a hybrid vehicle.
BMW's Turbosteamer: this concept uses energy from the exhaust gases of the traditional internal combustion engine (ICE) to power a steam engine that also contributes power to the automobile (Hanlon, 2005). This can increase energy efficiency by up to 15%.
Compressed-air hybrid: an engine developed by researchers at Brunel University in Britain that forces highly compressed air into the engine, which they claim reduces fuel consumption by 30%.
Waste-heat recovery: utilization of waste heat from D.W. as useful mechanical energy through exhaust-powered steam, Stirling engines, thermal diodes, etc.
Aerodynamic design: using computational fluid dynamics in the design stage can produce vehicles that take significantly less energy to push through the air, a major consideration at highway speeds. The Volkswagen 1-litre car and Aptera 2 Series are examples of ultra-low-drag vehicles.
Vortex prevention: installing vortex-prevention devices at the back of a car's roof reduces drag and therefore improves fuel efficiency.
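To see why drag dominates at highway speeds, the standard aerodynamic drag-power formula P = ½ρ·CdA·v³ can be evaluated for two assumed drag areas. The CdA values below (0.70 m² for a typical sedan, 0.20 m² for an ultra-low-drag body) are illustrative round numbers, not measured figures for the cars named above.

```python
# Aerodynamic drag power: P = 0.5 * rho * CdA * v^3.
# Assumed: air density 1.2 kg/m^3; illustrative CdA values, not measured data.
RHO = 1.2  # air density in kg/m^3

def drag_power_kw(cd_times_a, speed_kmh):
    """Power (kW) needed just to overcome aerodynamic drag at a given speed."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return 0.5 * RHO * cd_times_a * v**3 / 1000.0

for label, cda in [("typical sedan", 0.70), ("low-drag body", 0.20)]:
    print(f"{label}: {drag_power_kw(cda, 120):.1f} kW at 120 km/h")
```

Because drag power grows with the cube of speed, halving CdA at highway speed saves far more energy than the same change would in city driving, which is the motivation for the low-drag designs mentioned above.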

Source : Wikipedia

Read More......

Jumat, 05 November 2010


A smartphone is a mobile phone that offers more advanced computing ability and connectivity than a contemporary basic feature phone. Both may be thought of as handheld computers integrated within a mobile telephone, but while most feature phones can only run applications based on platforms such as Java ME, a smartphone allows the user to install and run more advanced applications on a specific platform. Smartphones run complete operating system software that provides a platform for application developers. A smartphone can be considered a Personal Pocket Computer (PPC) with mobile phone functions: these devices are essentially computers, although much smaller than a desktop computer, and a PPC is more personal than a desktop.

Growth in demand for advanced mobile devices boasting powerful processors, abundant memory, larger screens, and open operating systems has outpaced the rest of the mobile phone market for several years. According to a study by ComScore, over 45.5 million people in the United States owned smartphones in 2010 and it is the fastest growing segment of the mobile phone market, which comprised 234 million subscribers in the United States. Despite the large increase in smartphone sales in the last few years, smartphone shipments only make up 20% of total handset shipments, as of the first half of 2010.

Rise of Symbian, Windows Mobile, and BlackBerry

In 2000 Ericsson released the touchscreen smartphone R380, the first device to use the new Symbian OS. It was followed by the P800 in 2002, the first camera smartphone.

In 2001 Microsoft announced its Windows CE Pocket PC OS would be offered as "Microsoft Windows Powered Smartphone 2002." Microsoft originally defined its Windows Smartphone products as lacking a touchscreen and offering a lower screen resolution compared to its sibling Pocket PC devices.

In early 2002 Handspring released the Palm OS Treo smartphone, utilizing a full keyboard that combined wireless web browsing, email, calendar, and contact organizer with mobile third-party applications that could be downloaded or synced with a computer.

In 2002 RIM released the first BlackBerry, the first smartphone optimized for wireless email use; it achieved a total customer base of 32 million subscribers by December 2009.

In 2007 Nokia launched the Nokia N95 which integrated a wide range of features into a consumer-oriented smartphone: GPS, a 5 megapixel camera with autofocus and LED flash, 3G and wi-fi connectivity and TV-out. In the next few years these features would become standard on high-end smartphones.


A smartbook is a concept mobile device that falls between smartphones and netbooks, delivering features typically found in smartphones (always-on operation, all-day battery life, 3G connectivity, GPS) in a slightly larger device with a full keyboard. Smartbooks tend to be designed to work with online applications.

Smartbooks use ARM processors, which give them much greater battery life than netbooks built around traditional Intel x86 processors. They are likely to be sold initially through mobile network operators, as mobile phones are today, along with a wireless data plan.

Source : Wikipedia

Read More......
