Tuesday, October 25, 2011

Nobel Prize 2011




Physics 

"for the discovery of the accelerating expansion of the Universe through observations of distant supernovae"

Written in the stars

"Some say the world will end in fire, some say in ice..." *
What will be the final destiny of the Universe? Probably it will end in ice, if we are to believe this year's Nobel Laureates in Physics. They have studied several dozen exploding stars, called supernovae, and discovered that the Universe is expanding at an ever-accelerating rate. The discovery came as a complete surprise even to the Laureates themselves.

In 1998, cosmology was shaken at its foundations as two research teams presented their findings. Headed by Saul Perlmutter, one of the teams had set to work in 1988. Brian Schmidt headed another team, launched at the end of 1994, where Adam Riess was to play a crucial role.

The research teams raced to map the Universe by locating the most distant supernovae. More sophisticated telescopes on the ground and in space, as well as more powerful computers and new digital imaging sensors (CCD, Nobel Prize in Physics in 2009), opened the possibility in the 1990s to add more pieces to the cosmological puzzle.

The teams used a particular kind of supernova, called type Ia supernova. It is an explosion of an old compact star that is as heavy as the Sun but as small as the Earth. A single such supernova can emit as much light as a whole galaxy. All in all, the two research teams found over 50 distant supernovae whose light was weaker than expected - this was a sign that the expansion of the Universe was accelerating. The potential pitfalls had been numerous, and the scientists found reassurance in the fact that both groups had reached the same astonishing conclusion.
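
In slightly more technical terms (these are standard textbook relations, not specific to the prize-winning papers), a type Ia supernova works as a "standard candle": because its peak luminosity L is nearly the same from one explosion to the next, the flux F measured on Earth fixes its luminosity distance d_L,

\[ F = \frac{L}{4\pi d_L^{2}}, \qquad m - M = 5\log_{10}\!\left(\frac{d_L}{10\ \mathrm{pc}}\right), \]

where m - M is the astronomers' distance modulus. A supernova that looks dimmer than expected at its measured redshift must be farther away than a non-accelerating model predicts, and that extra distance is the signature of accelerating expansion.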

For almost a century, the Universe has been known to be expanding as a consequence of the Big Bang about 14 billion years ago. However, the discovery that this expansion is accelerating is astounding. If the expansion continues to speed up, the Universe will end in ice.

The acceleration is thought to be driven by dark energy, but what that dark energy is remains an enigma - perhaps the greatest in physics today. What is known is that dark energy constitutes about three quarters of the Universe. Therefore the findings of the 2011 Nobel Laureates in Physics have helped to unveil a Universe that to a large extent is unknown to science. And everything is possible again.

Chemistry 

"for the discovery of quasicrystals"


A remarkable mosaic of atoms

In quasicrystals, we find the fascinating mosaics of the Arabic world reproduced at the level of atoms: regular patterns that never repeat themselves. However, the configuration found in quasicrystals was considered impossible, and Dan Shechtman had to fight a fierce battle against established science. The Nobel Prize in Chemistry 2011 has fundamentally altered how chemists conceive of solid matter.

On the morning of 8 April 1982, an image counter to the laws of nature appeared in Dan Shechtman's electron microscope. In all solid matter, atoms were believed to be packed inside crystals in symmetrical patterns that were repeated periodically over and over again. For scientists, this repetition was required in order to obtain a crystal.

Shechtman's image, however, showed that the atoms in his crystal were packed in a pattern that could not be repeated. Such a pattern was considered just as impossible as creating a football using only six-cornered polygons, when a sphere needs both five- and six-cornered polygons. His discovery was extremely controversial. In the course of defending his findings, he was asked to leave his research group. However, his battle eventually forced scientists to reconsider their conception of the very nature of matter.

Aperiodic mosaics, such as those found in the medieval Islamic mosaics of the Alhambra Palace in Spain and the Darb-i Imam Shrine in Iran, have helped scientists understand what quasicrystals look like at the atomic level. In those mosaics, as in quasicrystals, the patterns are regular - they follow mathematical rules - but they never repeat themselves.

When scientists describe Shechtman's quasicrystals, they use a concept that comes from mathematics and art: the golden ratio. This number had already caught the interest of mathematicians in Ancient Greece, as it often appeared in geometry. In quasicrystals, for instance, the ratio of various distances between atoms is related to the golden mean.
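
The golden ratio itself is easy to compute. As a small illustrative sketch in C (not part of the Nobel material), the program below approximates it as the limit of ratios of consecutive Fibonacci numbers, the same irrational number that governs the spacing sequence in simple one-dimensional quasicrystal models such as the Fibonacci chain.

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Consecutive Fibonacci numbers: the ratio b/a converges to the golden ratio. */
    long long a = 1, b = 1;
    for (int i = 0; i < 40; i++) {
        long long next = a + b;
        a = b;
        b = next;
    }
    printf("Fibonacci ratio: %.12f\n", (double)b / (double)a);
    printf("Golden ratio   : %.12f\n", (1.0 + sqrt(5.0)) / 2.0);
    return 0;
}

Because the golden ratio is irrational, no finite repeating block of spacings can reproduce it exactly, which is one way of seeing why such patterns never repeat.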

Following Shechtman's discovery, scientists have produced other kinds of quasicrystals in the lab and discovered naturally occurring quasicrystals in mineral samples from a Russian river. A Swedish company has also found quasicrystals in a certain form of steel, where the crystals reinforce the material like armor. Scientists are currently experimenting with using quasicrystals in different products such as frying pans and diesel engines.

Medicine 

This year's Nobel Laureates have revolutionized our understanding of the immune system by discovering key principles for its activation.

Scientists have long been searching for the gatekeepers of the immune response by which man and other animals defend themselves against attack by bacteria and other microorganisms. Bruce Beutler and Jules Hoffmann discovered receptor proteins that can recognize such microorganisms and activate innate immunity, the first step in the body's immune response. Ralph Steinman discovered the dendritic cells of the immune system and their unique capacity to activate and regulate adaptive immunity, the later stage of the immune response during which microorganisms are cleared from the body.

The discoveries of the three Nobel Laureates have revealed how the innate and adaptive phases of the immune response are activated and thereby provided novel insights into disease mechanisms. Their work has opened up new avenues for the development of prevention and therapy against infections, cancer, and inflammatory diseases.

Two lines of defense in the immune system

We live in a dangerous world. Pathogenic microorganisms (bacteria, viruses, fungi, and parasites) threaten us continuously, but we are equipped with powerful defense mechanisms. The first line of defense, innate immunity, can destroy invading microorganisms and trigger inflammation that contributes to blocking their assault. If microorganisms break through this defense line, adaptive immunity is called into action. With its T and B cells, it produces antibodies and killer cells that destroy infected cells. After successfully combating the infectious assault, our adaptive immune system maintains an immunologic memory that allows a more rapid and powerful mobilization of defense forces the next time the same microorganism attacks. These two defense lines of the immune system provide good protection against infections, but they also pose a risk. If the activation threshold is too low, or if endogenous molecules can activate the system, inflammatory disease may follow.

The components of the immune system have been identified step by step during the 20th century. Thanks to a series of discoveries awarded the Nobel Prize, we know, for instance, how antibodies are constructed and how T cells recognize foreign substances. However, until the work of Beutler, Hoffmann and Steinman, the mechanisms triggering the activation of innate immunity and mediating the communication between innate and adaptive immunity remained enigmatic.

Discovering the sensors of innate immunity

Jules Hoffmann made his pioneering discovery in 1996, when he and his co-workers investigated how fruit flies combat infections. They had access to flies with mutations in several different genes including Toll, a gene previously found to be involved in embryonal development by Christiane Nüsslein-Volhard (Nobel Prize 1995). When Hoffmann infected his fruit flies with bacteria or fungi, he discovered that Toll mutants died because they could not mount an effective defense. He was also able to conclude that the product of the Toll gene was involved in sensing pathogenic microorganisms and Toll activation was needed for successful defense against them.

Bruce Beutler was searching for a receptor that could bind the bacterial product, lipopolysaccharide (LPS), which can cause septic shock, a life threatening condition that involves overstimulation of the immune system. In 1998, Beutler and his colleagues discovered that mice resistant to LPS had a mutation in a gene that was quite similar to the Toll gene of the fruit fly. This Toll-like receptor (TLR) turned out to be the elusive LPS receptor. When it binds LPS, signals are activated that cause inflammation and, when LPS doses are excessive, septic shock. These findings showed that mammals and fruit flies use similar molecules to activate innate immunity when encountering pathogenic microorganisms. The sensors of innate immunity had finally been discovered.

The discoveries of Hoffmann and Beutler triggered an explosion of research in innate immunity. Around a dozen different TLRs have now been identified in humans and mice. Each one of them recognizes certain types of molecules common in microorganisms. Individuals with certain mutations in these receptors carry an increased risk of infections while other genetic variants of TLR are associated with an increased risk for chronic inflammatory diseases.

A new cell type that controls adaptive immunity

Ralph Steinman discovered, in 1973, a new cell type that he called the dendritic cell. He speculated that it could be important in the immune system and went on to test whether dendritic cells could activate T cells, a cell type that has a key role in adaptive immunity and develops an immunologic memory against many different substances. In cell culture experiments, he showed that the presence of dendritic cells resulted in vivid responses of T cells to such substances. These findings were initially met with skepticism but subsequent work by Steinman demonstrated that dendritic cells have a unique capacity to activate T cells.

Further studies by Steinman and other scientists went on to address the question of how the adaptive immune system decides whether or not it should be activated when encountering various substances. Signals arising from the innate immune response and sensed by dendritic cells were shown to control T cell activation. This makes it possible for the immune system to react towards pathogenic microorganisms while avoiding an attack on the body's own endogenous molecules.

From fundamental research to medical use

The discoveries that are awarded the 2011 Nobel Prize have provided novel insights into the activation and regulation of our immune system. They have made possible the development of new methods for preventing and treating disease, for instance with improved vaccines against infections and in attempts to stimulate the immune system to attack tumors. These discoveries also help us understand why the immune system can attack our own tissues, thus providing clues for novel treatment of inflammatory diseases.


Bruce A. Beutler was born in 1957 in Chicago, USA. He received his MD from the University of Chicago in 1981 and has worked as a scientist at Rockefeller University in New York, at UT Southwestern Medical Center in Dallas, where he discovered the LPS receptor, and at the Scripps Research Institute in La Jolla, CA. Very recently, he rejoined the University of Texas Southwestern Medical Center in Dallas as professor in its Center for the Genetics of Host Defense.

Jules A. Hoffmann was born in Echternach, Luxembourg in 1941. He studied at the University of Strasbourg in France, where he obtained his PhD in 1969. After postdoctoral training at the University of Marburg, Germany, he returned to Strasbourg, where he headed a research laboratory from 1974 to 2009. He has also served as director of the Institute for Molecular Cell Biology in Strasbourg and during 2007-2008 as President of the French National Academy of Sciences.

Ralph M. Steinman was born in 1943 in Montreal, Canada, where he studied biology and chemistry at McGill University. After studying medicine at Harvard Medical School in Boston, MA, USA, he received his MD in 1968. He had been affiliated with Rockefeller University in New York since 1970 and was professor of immunology there from 1988. Sadly, Ralph Steinman passed away before the news of his Nobel Prize reached him.



Key publications:
Poltorak A, He X, Smirnova I, Liu MY, Van Huffel C, Du X, Birdwell D, Alejos E, Silva M, Galanos C, Freudenberg M, Ricciardi-Castagnoli P, Layton B, Beutler B. Defective LPS signaling in C3H/HeJ and C57BL/10ScCr mice: Mutations in Tlr4 gene. Science 1998;282:2085-2088.
Lemaitre B, Nicolas E, Michaut L, Reichhart JM, Hoffmann JA. The dorsoventral regulatory gene cassette spätzle/Toll/cactus controls the potent antifungal response in drosophila adults. Cell 1996;86:973-983.
Steinman RM, Cohn ZA. Identification of a novel cell type in peripheral lymphoid organs of mice. J Exp Med 1973;137:1142-1162.
Steinman RM, Witmer MD. Lymphoid dendritic cells are potent stimulators of the primary mixed leukocyte reaction in mice. Proc Natl Acad Sci USA 1978;75:5132-5136.
Schuler G, Steinman RM. Murine epidermal Langerhans cells mature into potent immunostimulatory dendritic cells in vitro. J Exp Med 1985;161:526-546.

Peace 

The Norwegian Nobel Committee has decided that the Nobel Peace Prize for 2011 is to be divided in three equal parts between Ellen Johnson Sirleaf, Leymah Gbowee and Tawakkul Karman for their non-violent struggle for the safety of women and for women’s rights to full participation in peace-building work. We cannot achieve democracy and lasting peace in the world unless women obtain the same opportunities as men to influence developments at all levels of society.

In October 2000, the UN Security Council adopted Resolution 1325. The resolution for the first time made violence against women in armed conflict an international security issue. It underlined the need for women to become participants on an equal footing with men in peace processes and in peace work in general.

Ellen Johnson Sirleaf is Africa’s first democratically elected female president. Since her inauguration in 2006, she has contributed to securing peace in Liberia, to promoting economic and social development, and to strengthening the position of women. Leymah Gbowee mobilized and organized women across ethnic and religious dividing lines to bring an end to the long war in Liberia, and to ensure women’s participation in elections. She has since worked to enhance the influence of women in West Africa during and after war. In the most trying circumstances, both before and during the “Arab spring”, Tawakkul Karman has played a leading part in the struggle for women’s rights and for democracy and peace in Yemen.

It is the Norwegian Nobel Committee’s hope that the prize to Ellen Johnson Sirleaf, Leymah Gbowee and Tawakkul Karman will help to bring an end to the suppression of women that still occurs in many countries, and to realise the great potential for democracy and peace that women can represent. 

Literature 

Tomas Tranströmer

The Nobel Prize in Literature 2011 was awarded to Tomas Tranströmer "because, through his condensed, translucent images, he gives us fresh access to reality".

Economics 

"for their empirical research on cause and effect in the macroeconomy"


Cause and effect in the macroeconomy

How are GDP and inflation affected by a temporary increase in the interest rate or a tax cut? What happens if a central bank makes a permanent change in its inflation target or a government modifies its objective for budgetary balance? This year's Laureates in economic sciences have developed methods for answering these and many other questions regarding the causal relationship between economic policy and different macroeconomic variables such as GDP, inflation, employment and investments.

These occurrences are usually two-way relationships – policy affects the economy, but the economy also affects policy. Expectations regarding the future are primary aspects of this interplay. The expectations of the private sector regarding future economic activity and policy influence decisions about wages, saving and investments. Concurrently, economic-policy decisions are influenced by expectations about developments in the private sector. The Laureates' methods can be applied to identify these causal relationships and explain the role of expectations. This makes it possible to ascertain the effects of unexpected policy measures as well as systematic policy shifts.

Thomas Sargent has shown how structural macroeconometrics can be used to analyze permanent changes in economic policy. This method can be applied to study macroeconomic relationships when households and firms adjust their expectations concurrently with economic developments. Sargent has examined, for instance, the post-World War II era, when many countries initially tended to implement a high-inflation policy, but eventually introduced systematic changes in economic policy and reverted to a lower inflation rate.

Christopher Sims has developed a method based on so-called vector autoregression to analyze how the economy is affected by temporary changes in economic policy and other factors. Sims and other researchers have applied this method to examine, for instance, the effects of an increase in the interest rate set by a central bank. It usually takes one or two years for the inflation rate to decrease, whereas economic growth declines gradually in the short run and does not return to its normal path until after a couple of years.
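
As a rough, purely illustrative sketch (the coefficients below are invented, not Sims's estimates), a two-variable VAR(1) written in C shows the kind of dynamics the paragraph describes: a one-off interest-rate shock feeds through to inflation only gradually over the following quarters before both variables settle back down.

#include <stdio.h>

/* Toy VAR(1): x_t = A * x_{t-1}, with x = (interest rate, inflation),
   started from a one-unit interest-rate shock. The coefficients are
   hypothetical and chosen only so the response decays plausibly. */
int main(void)
{
    double A[2][2] = {
        {  0.80, 0.10 },   /* the interest rate is persistent */
        { -0.15, 0.85 }    /* higher rates slowly pull inflation down */
    };
    double x[2] = { 1.0, 0.0 };   /* period 0: interest-rate shock */

    printf("quarter      rate   inflation\n");
    for (int t = 0; t <= 12; t++) {
        printf("%7d  %8.4f  %10.4f\n", t, x[0], x[1]);
        double next0 = A[0][0] * x[0] + A[0][1] * x[1];
        double next1 = A[1][0] * x[0] + A[1][1] * x[1];
        x[0] = next0;
        x[1] = next1;
    }
    return 0;
}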

Although Sargent and Sims carried out their research independently, their contributions are complementary in several ways. The laureates' seminal work during the 1970s and 1980s has been adopted by both researchers and policymakers throughout the world. Today, the methods developed by Sargent and Sims are essential tools in macroeconomic analysis. 

(http://www.nobelprize.org/nobel_prizes/lists/year/) 

The End

Muammar Qaddafi

Born: 1942 (exact date not known)
Birth Place: Sirte region, Libya
Death: 20 October 2011, Sirte, Libya
Nationality: Libyan


Qaddafi was the youngest child born to a nomadic Bedouin peasant family living in the desert around Sirte. He received a traditional religious primary school education. From 1956 he attended the Sebha preparatory school, where he was an excellent student who quickly took up the nationalist Arab cause and became politically active. He was expelled in 1961 for his political activities.

In 1963 he entered the Benghazi Military Academy. It was here that he and a few of his fellow militants organized a secretive group dedicated to overthrowing the pro-Western Libyan monarchy. After he graduated in 1965, he was sent to Britain for further training. He returned a year later as a commissioned officer in the Signal Corps.

On 1 September 1969 Colonel Qaddafi and other young officer conspirators staged a bloodless and unopposed coup d'état in Tripoli against King Idris I, who was out of the country on a visit to Turkey. For a short time afterwards there was a power struggle between Qaddafi and his young supporters on one side and older senior officers and civilians on the other. By January 1970 his faction had received support from Egypt and had eliminated its opponents. Qaddafi assumed power and renamed the country the Libyan Arab Republic. He ruled first as president of the Revolutionary Command Council until 1977, when he switched to the title of secretary-general of the General People's Congress. Two years later he renounced all official titles but remained the de facto head of the Libyan government.

Qaddafi's politics were a blend of Arab nationalism and the social welfare state. He described this as "direct, popular democracy" and named it "Islamic socialism." He outlined his political philosophy in his Green Book, published in 1976. All this was not quite as idealistic as it sounds, since Qaddafi was not above using violence and repressive tactics when his regime was threatened. He called for political assassination as a tool and sent agents to carry out his wishes.

He was a fervent supporter of the unity of all Arab states into a single Arab nation. Following Egyptian President Nasser's death he attempted to take on the mantle of ideological leader of Arab nationalism. In 1972 he proclaimed the "Federation of Arab Republics" (Egypt, Libya and Syria). Two years later he tried again with a proposal to merge Tunisia and Libya. Both attempts failed. Qaddafi also called for the creation of a Saharan Islamic state.

He provided general support for almost any liberation movement that cared to contact him, but his support for the Palestine Liberation Organization was particularly strong, although it caused a rift between himself and Egypt. Finding his efforts for pan-Arab alliances faltering, he turned to the Soviet Union for support and military aid. Although Libya became the first recipient of MiG-25 fighters outside the Soviet bloc, relations were never warm and could be characterized as distant.

He was widely regarded in the West as the principal financier of international terrorism. Amongst other events, Libya was implicated in support for Black September and their massacre at the 1972 Munich Olympic Games, and in the 1986 bombing of a Berlin discotheque. The latter event resulted in three deaths and over two hundred wounded, including a substantial number of American servicemen.

When Ronald Reagan came to power, his administration saw Libya as an unacceptable player on the international stage because of its support for such terrorist activities. At first it imposed economic sanctions, but in January and March 1986 this flared into open conflict. The U.S. attacked Libyan patrol boats during clashes over access to the Gulf of Sidra, which Libya claimed as territorial waters, a claim the Americans did not accept.

On 14 April 1986 President Reagan ordered major bombing raids, called Operation El Dorado Canyon, against targets in Tripoli and Benghazi. The United States accused Libya of direct involvement in the Berlin bombing and the resulting deaths. The raids killed sixty people; amongst the victims was Qaddafi's adopted daughter.

During the 1990s Libya endured economic sanctions and diplomatic isolation as a result of Qaddafi's refusal to hand over two Libyans accused of planting a bomb on Pan Am Flight 103, which exploded over Lockerbie, Scotland.

After his isolation ended, Qaddafi managed to improve his connections among Middle Eastern nations and came to be regarded as a more moderate and responsible leader in the Arab world. At the same time he emerged as a popular African leader. Finally, he did much to improve his relationship with the West. This included his admission that his country had had an active weapons of mass destruction programme; international inspectors identified considerable quantities of equipment and material, and a programme of destruction went ahead.

Quotes


Ronald Reagan plays with fire! He sees the world like the theatre.

Irrespective of the conflict with America, it is a human duty to show sympathy with the American people and be with them at these horrifying and awesome events which are bound to awaken human conscience.
(Referring to 11 September 2001)

http://www.nationalcoldwarexhibition.org/explore/biography.cfm?name=Al-Gaddafi,%20Muammar

World of Computer Science on Dennis Ritchie




Dennis Ritchie was a computer scientist best known for his work with Kenneth Thompson in creating UNIX, a computer operating system. Ritchie also went on to develop the high-level and enormously popular computer programming language C. For their work on the UNIX operating system, Ritchie and Thompson were awarded the prestigious Turing Award by the Association for Computing Machinery (ACM) in 1983.


Dennis MacAlistair Ritchie was born in Bronxville, New York, on September 9, 1941, and grew up in New Jersey, where his father, Alistair Ritchie, worked as a switching systems engineer for Bell Laboratories. His mother, Jean McGee Ritchie, was a homemaker. Ritchie went to Harvard University, where he received his B.S. in Physics in 1963. However, a lecture he attended on the operation of Harvard's computer system, a Univac I, led him to develop an interest in computing in the early 1960s. Thereafter, Ritchie spent a considerable amount of time at the nearby Massachusetts Institute of Technology (MIT), where many scientists were developing computer systems and software. In 1967 Ritchie began working for Bell Laboratories. Ritchie's job increased his association with the programming world, and in the late 1960s he began working with the Computer Science Research Department at Bell. It was here that he met Kenneth Thompson.

Ritchie's lifestyle at Bell was that of a typical computer guru: he was devoted to his work. He showed up to his cluttered office in Murray Hill, New Jersey, around noon every day, worked until seven in the evening, and then went home to work some more. His computer system at home was connected on a dedicated private line to a system at Bell Labs, and he often worked at home until three in the morning. Even in the early 1990s, after he became a manager at Bell Labs, his work habits did not change substantially. "It still tends to be sort of late, but not quite that late," Ritchie told Patrick Moore in an interview. "It depends on what meetings and so forth I have."

When Ritchie and Thompson began working for Bell Labs, the company was involved in a major initiative with General Electric and MIT to develop a multi-user, time-sharing operating system called Multics. This system would replace the old one, which was based on batch programming. In a system based on batch programming, the programmers had no opportunity to interact with the computer system directly. Instead, they would write the program on a deck or batch of cards, which were then input into a mainframe computer by an operator. In other words, since the system was centered around a mainframe, and cards were manually fed into machines to relate instructions or generate responses, the programmers had no contact with the program once it had been activated. Multics, or the multiplexed information and computing service, would enable several programmers to work on a system simultaneously while the computer itself would be capable of processing multiple sets of information. Although programmers from three institutions were working on Multics, Bell Labs decided that the development costs were too high and the possibility of launching a usable system in the near future too low. Therefore, the company pulled out of the project. Ritchie and Thompson, who had been working on the Multics project, were suddenly thrown back into the batch programming environment. In light of the advanced techniques and expertise they had acquired while working on the Multics project, this was a major setback for them and they found it extremely difficult to adapt.

Thus it was in 1969 that Thompson began working on what would become the UNIX operating system. Ritchie soon joined the project and together they set out to find a useful alternate to Multics. However, working with a more advanced system was not the only motivation in developing UNIX. A major factor in their efforts to develop a multi-user, multi-tasking system was the communication and information-sharing it facilitated between programmers. As Ritchie said in his article titled "The Evolution of the UNIX Time-sharing System," "What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form. We knew from experience that the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication."

In 1969 Thompson found a little-used PDP-7, an old computer manufactured by the Digital Equipment Corporation (DEC). To make the PDP-7 efficiently run the computer programs that they created, Ritchie, Thompson, and others began to develop an operating system. Among other things, an operating system enables a user to copy, delete, edit, and print data files; to move data from a disk to the screen or to a printer; to manage the movement of data from disk storage to memory storage; and so on. Without operating systems, computers are very difficult and time-consuming for experts to run.
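
To make that concrete, here is a minimal sketch of a UNIX-style file copy written against the classic open/read/write system calls; it assumes a POSIX environment, keeps error handling deliberately short, and is an illustration rather than code from the original system.

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

/* Copy one file to another using the low-level UNIX file interface. */
int main(int argc, char *argv[])
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s source destination\n", argv[0]);
        return 1;
    }

    int in = open(argv[1], O_RDONLY);
    int out = open(argv[2], O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (in < 0 || out < 0) {
        perror("open");
        return 1;
    }

    char buf[4096];
    ssize_t n;
    while ((n = read(in, buf, sizeof buf)) > 0) {
        if (write(out, buf, (size_t)n) != n) {
            perror("write");
            return 1;
        }
    }

    close(in);
    close(out);
    return 0;
}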

It was clear, however, that the PDP-7 was too primitive for what Ritchie and Thompson wanted to do, so they persuaded Bell Labs to purchase a PDP-11, a far more advanced computer at the time. To justify their acquisition of the PDP-11 to the management of Bell Labs, Ritchie and Thompson said that they would use the PDP-11 to develop a word-processing system for the secretaries in the patent department. With the new PDP-11, Ritchie and Thompson could refine their operating system even more. Soon, other departments in Bell Labs began to find UNIX useful. The system was used and refined within the company for some time before it was announced to the outside world in 1973 during a symposium on Operating Systems Principles hosted by International Business Machines (IBM).

One of the most important characteristics of UNIX was its portability. Making UNIX portable meant that it could be run with relatively few modifications on different computer systems. Most operating systems are developed around specific hardware configurations, that is, specific microprocessor chips, memory sizes, and input and output devices (e.g., printers, keyboards, screens, etc.). To transfer an operating system from one hardware environment to another--for example, from a microcomputer to a mainframe computer--required so many internal changes to the programming that, in effect, the whole operating system had to be rewritten. Ritchie circumvented this problem by rewriting UNIX in such a way that it was largely machine independent. The resulting portability made UNIX easier to use in a variety of computer and organizational environments, saving time, money, and energy for its users.
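
A small, hypothetical example of what "machine independent" means in practice: the C program below compiles unchanged on different machines because it asks the compiler and the running machine about word sizes and byte order instead of assuming them. It reflects the spirit of the portable style described above, not actual UNIX source.

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* Report type sizes instead of hard-coding machine assumptions. */
    printf("char  : %zu bytes (%d bits)\n", sizeof(char), CHAR_BIT);
    printf("int   : %zu bytes\n", sizeof(int));
    printf("long  : %zu bytes\n", sizeof(long));
    printf("void *: %zu bytes\n", sizeof(void *));

    /* Detect byte order at run time rather than assuming it. */
    unsigned int probe = 1;
    printf("byte order: %s-endian\n",
           *(unsigned char *)&probe ? "little" : "big");
    return 0;
}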

To help make UNIX portable, Ritchie created a new programming language, called C, in 1972. C used features of low-level languages or machine languages (i.e., languages that allow programmers to move bits of data between the components inside microprocessor chips) and features of high-level languages (i.e., languages that have more complex data manipulating functions such as looping, branching, and subroutines). High-level languages are easier to learn than low-level languages because they are closer to everyday English. However, because C combined functions of both high- and low-level languages and was very flexible, it was not for beginners. C was very portable because, while it used a relatively small syntax and instruction set, it was also highly structured and modular. Therefore, it was easy to adapt it to different computers, and programmers could copy preexisting blocks of C functions into their programs. These blocks, which were stored on disks in various libraries and could be accessed by using C programs, allowed programmers to create their own programs without having to reinvent the wheel. Because C had features of low-level programming languages, it ran very quickly and efficiently compared to other high-level languages, and it took up relatively little computer time.
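
As an illustrative sketch (not Ritchie's own code), the short C program below mixes the two levels the paragraph describes: bit-level shifts and masks on one side, and functions, loops, and formatted output on the other.

#include <stdio.h>

/* Count the 1-bits in a value. The shifts and masks are C's "low-level"
   side; the function, loop, and printf are its "high-level" side. */
static unsigned popcount(unsigned x)
{
    unsigned count = 0;
    while (x) {
        count += x & 1u;   /* inspect the lowest bit */
        x >>= 1;           /* shift the next bit into place */
    }
    return count;
}

int main(void)
{
    unsigned values[] = { 0u, 7u, 255u, 1024u };
    for (size_t i = 0; i < sizeof values / sizeof values[0]; i++)
        printf("%5u has %u one-bits\n", values[i], popcount(values[i]));
    return 0;
}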

Interestingly, because of federal antitrust regulations, Bell Labs, which is owned by American Telephone & Telegraph (AT&T), could not copyright C or UNIX after AT&T was broken up into smaller corporations. Thus, C was used at many college and university computing centers, and each year thousands of new college graduates arrived in the marketplace with a lot of experience with C. In the mid and late 1980s, C became one of the most popular programming languages in the world. The speed at which C worked made it a valuable tool for companies that developed software commercially. C was also popular because it was written for UNIX, which, by the early 1990s, was shipped out on over $20 billion of new computer systems a year, making it one of the most commonly used operating systems in the world.

At the end of 1990, Ritchie became the head of the Computing Techniques Research Department at Bell Labs, contributing applications and managing the development of distributed operating systems. He has received several awards for his contributions to computer programming, including the ACM Turing award in 1983, which he shared with Thompson.

Courtesy: http://www.bookrags.com/biography/dennis-ritchie-wcs/

Steve Jobs - A real Hero


His personality was reflected in the products he created. Just as the core of Apple's philosophy, from the original Macintosh in 1984 to the iPad a generation later, was the end-to-end integration of hardware and software, so too was it the case with Steve Jobs: His passions, perfectionism, demons, desires, artistry, devilry, and obsession for control were integrally connected to his approach to business and the products that resulted.

The unified field theory that ties together Jobs's personality and products begins with his most salient trait: his intensity. His silences could be as searing as his rants; he had taught himself to stare without blinking. Sometimes this intensity was charming, in a geeky way, such as when he was explaining the profundity of Bob Dylan's music or why whatever product he was unveiling at that moment was the most amazing thing that Apple had ever made.

At other times it could be terrifying, such as when he was fulminating about Google or Microsoft ripping off Apple. This intensity encouraged a binary view of the world. Colleagues referred to the hero/shithead dichotomy. You were either one or the other, sometimes on the same day.

The same was true of products, ideas, even food: Something was either "the best thing ever," or it was shitty, brain-dead, inedible. As a result, any perceived flaw could set off a rant. The finish on a piece of metal, the curve of the head of a screw, the shade of blue on a box, the intuitiveness of a navigation screen, he would declare them to "completely suck" until that moment when he suddenly pronounced them "absolutely perfect." He thought of himself as an artist, which he was, and he indulged in the temperament of one.


His quest for perfection led to his compulsion for Apple to have end-to-end control of every product that it made. He got hives, or worse, when contemplating great Apple software running on another company's crappy hardware, and he likewise was allergic to the thought of unapproved apps or content polluting the perfection of an Apple device. This ability to integrate hardware and software and content into one unified system enabled him to impose simplicity. The astronomer Johannes Kepler declared that "nature loves simplicity and unity." So did Steve Jobs.

Steve Jobs was born February 24, 1955, to two University of Wisconsin graduate students who gave him up for adoption. Smart but directionless, Jobs experimented with different pursuits before starting Apple Computers with Stephen Wozniak in the Jobs family garage. Apple's revolutionary products, which include the iPod, iPhone and iPad, are now seen as dictating the evolution of modern technology.

Early Life

Steven Paul Jobs was born on February 24, 1955, to Joanne Simpson and Abdulfattah "John" Jandali, two University of Wisconsin graduate students who gave their unnamed son up for adoption. His father, Abdulfattah Jandali, was a Syrian political science professor and his mother, Joanne Simpson, worked as a speech therapist. Shortly after Steve was placed for adoption, his biological parents married and had another child, Mona Simpson. It was not until Jobs was 27 that he was able to uncover information on his biological parents.

As an infant, Steven was adopted by Clara and Paul Jobs and named Steven Paul Jobs. Clara worked as an accountant and Paul was a Coast Guard veteran and machinist. The family lived in Mountain View within California's Silicon Valley. As a boy, Jobs and his father would work on electronics in the family garage. Paul would show his son how to take apart and reconstruct electronics, a hobby which instilled confidence, tenacity, and mechanical prowess in young Jobs.

While Jobs was always an intelligent and innovative thinker, his youth was riddled with frustrations over formal schooling. In elementary school he was a prankster whose fourth-grade teacher needed to bribe him to study. Jobs tested so well, however, that administrators wanted to skip him ahead to high school, a proposal his parents declined.

After he did enroll in high school, Jobs spent his free time at Hewlett-Packard. It was there that he befriended computer club guru Steve Wozniak. Wozniak was a brilliant computer engineer, and the two developed great respect for one another.

Apple Computers

After high school, Jobs enrolled at Reed College in Portland, Oregon. Lacking direction, he dropped out of college after six months and spent the next 18 months dropping in on creative classes. Jobs later recounted how one course in calligraphy developed his love of typography.

In 1974, Jobs took a position as a video game designer with Atari. Several months later he left Atari to find spiritual enlightenment in India, traveling the country and experimenting with psychedelic drugs. In 1976, when Jobs was just 21, he and Wozniak started Apple Computers. The duo started in the Jobs family garage, and funded their entrepreneurial venture after Jobs sold his Volkswagen bus and Wozniak sold his beloved scientific calculator.

Jobs and Wozniak are credited with revolutionizing the computer industry by democratizing the technology and making the machines smaller, cheaper, intuitive, and accessible to everyday consumers. The two conceived a series of user-friendly personal computers that they initially marketed for $666.66 each. Their first model, the Apple I, earned them $774,000. Three years after the release of their second model, the Apple II, sales increased 700 percent to $139 million. In 1980, Apple Computer became a publicly traded company with a market value of $1.2 billion on the very first day of trading. Jobs looked to marketing expert John Sculley of Pepsi-Cola to help fill the role of Apple's president.

Departure from Apple

However, the next several products from Apple suffered significant design flaws, resulting in recalls and consumer disappointment. IBM suddenly surpassed Apple in sales, and Apple had to compete with an IBM/PC-dominated business world. In 1984 Apple released the Macintosh, marketing the computer as a piece of a counterculture lifestyle: romantic, youthful, creative. But despite positive sales and performance superior to IBM's PCs, the Macintosh was still not IBM compatible. Sculley believed Jobs was hurting Apple, and executives began to phase him out.

In 1985, Jobs left Apple to begin a new hardware and software company called NeXT, Inc. The following year Jobs purchased an animation company from George Lucas, which later became Pixar Animation Studios. Believing in Pixar's potential, Jobs initially invested $50 million of his own money into the company. Pixar Studios went on to produce wildly popular animated films such as Toy Story, Finding Nemo and The Incredibles. Pixar's films have netted $4 billion. The studio merged with Walt Disney in 2006, making Steve Jobs Disney's largest shareholder.

Reinventing Apple

Despite Pixar's success, NeXT, Inc. floundered in its attempts to sell its specialized operating system to mainstream America. Apple eventually bought the company in 1997 for $429 million. That same year, Jobs returned to his post as Apple's CEO.

Much as Steve Jobs instigated Apple's success in the 1970s, he is credited with revitalizing the company in the 1990s. With a new management team, altered stock options, and a self-imposed salary of $1 a year, Jobs put Apple back on track. His ingenious products such as the iMac, effective branding campaigns, and stylish designs caught the attention of consumers once again.

Pancreatic Cancer

In 2003, Jobs discovered he had a neuroendocrine tumor, a rare but operable form of pancreatic cancer. Instead of immediately opting for surgery, Jobs chose to alter his pesco-vegetarian diet while weighing Eastern treatment options. For nine months Jobs postponed surgery, making Apple's board of directors nervous. Executives feared that shareholders would pull their stocks if word got out that their CEO was ill. But in the end, Jobs' confidentiality took precedence over shareholder disclosure. In 2004, he had successful surgery to remove the pancreatic tumor. True to form, in subsequent years Jobs disclosed little about his health.

Recent Innovations

Apple introduced such revolutionary products as the MacBook Air, iPod, and iPhone, all of which have dictated the evolution of modern technology. Almost immediately after Apple releases a new product, competitors scramble to produce comparable technologies. In 2007, Apple's quarterly reports were the company's most impressive to date. Stocks were worth a record-breaking $199.99 a share, and the company boasted a staggering $1.58 billion profit, an $18 billion surplus in the bank, and zero debt.

In 2008, iTunes became the second-biggest music retailer in America, second only to Wal-Mart. Half of Apple's revenue came from iTunes and iPod sales, with 200 million iPods sold and six billion songs downloaded. For these reasons, Apple has been rated No. 1 among America's Most Admired Companies, and No. 1 among Fortune 500 companies for returns to shareholders.

Personal Life

Early in 2009, reports circulated about Jobs' weight loss, with some predicting that his health issues had returned; these issues eventually included a liver transplant. Jobs responded to the concerns by stating he was dealing with a hormone imbalance. After nearly a year out of the spotlight, Steve Jobs delivered a keynote address at an invite-only Apple event on September 9, 2009.

With respect to his personal life, Steve Jobs remained a private man who rarely disclosed information about his family. What is known is that Jobs fathered a daughter with girlfriend Chrisann Brennan when he was 23. Jobs denied paternity of his daughter Lisa in court documents, claiming he was sterile. He did not initiate a relationship with his daughter until she was 7; when she was a teenager, she came to live with her father.

In the early 1990s, Jobs met Laurene Powell at Stanford business school, where Powell was an MBA student. They married on March 18, 1991, and lived together in Palo Alto, California, with their three children.

Final Years

On October 5, 2011, Apple Inc. announced that co-founder Steve Jobs had died. He was 56 years old at the time of his death.

(Courtesy: biography.com and The Times of India)