Tuesday, January 31, 2012

CCNA Routing

CCNA routing is a foundational topic in the Cisco networking world: packets are routed according to their source and destination IP addresses, with the corresponding MAC addresses used for delivery on each local link.
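As a rough illustration of the destination-based forwarding described above, here is a minimal longest-prefix-match lookup sketched in Python. The routing table, prefixes, and next-hop addresses are all made-up examples for illustration, not real Cisco configuration:

```python
import ipaddress

# Toy routing table: destination network -> next-hop IP (all values hypothetical).
ROUTES = {
    ipaddress.ip_network("10.0.0.0/8"): "192.168.1.1",
    ipaddress.ip_network("10.1.0.0/16"): "192.168.1.2",
    ipaddress.ip_network("0.0.0.0/0"): "192.168.1.254",  # default route
}

def next_hop(dst: str) -> str:
    """Pick the most specific (longest-prefix) route matching the destination."""
    addr = ipaddress.ip_address(dst)
    matches = [net for net in ROUTES if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return ROUTES[best]
```

A destination inside 10.1.0.0/16 matches both 10.0.0.0/8 and the default route too, but the /16 wins because it is the longest matching prefix, which is exactly how real routers break ties.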


Will update the post soon... :))

Sunday, December 23, 2007

Computer science Major achievements

Despite its relatively short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society. These include:

Applications within computer science

A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems.

The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.
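The unsolvability claim above can be made concrete with the classic halting-problem diagonal argument, sketched here in Python; `halts` is a hypothetical decider supplied by the caller, not a real function:

```python
def build_paradox(halts):
    """Given any claimed halting-decider, build a program it must judge wrongly."""
    def paradox():
        if halts(paradox):
            while True:    # loop forever exactly when the decider says "halts"
                pass
        return "halted"    # halt exactly when the decider says "loops forever"
    return paradox

# A decider claiming every program loops forever is refuted by its own paradox:
assert build_paradox(lambda prog: False)() == "halted"
# (A decider claiming everything halts would be refuted too, but actually running
# that paradox program would loop forever, so we only argue it in a comment.)
```

Whatever answer `halts` gives about `paradox`, `paradox` does the opposite, so no total, always-correct halting decider can exist.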

Applications outside of computing

Sparked the Digital Revolution which led to the current Information Age and the internet.[12]
In cryptography, breaking the Enigma machine was an important factor contributing to the Allied victory in World War II.[9]
Scientific computing enabled advanced study of the mind, and the mapping of the human genome was made possible by the Human Genome Project.[12] Distributed computing projects such as Folding@home explore protein folding.
Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning, and other statistical and numerical techniques on a large scale.

Relationship with other fields

Computer science is frequently derided with the quip "Any field which has to have 'science' in its name isn't one." The remark was put into print in physicist Richard Feynman's Lectures on Computation, published posthumously in 1996.

Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed. Danish scientist Peter Naur suggested the term datalogy, to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to apply the term was DIKU, the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur as the first professor of datalogy. The term is used mainly in the Scandinavian countries. Also, in the early days of computing, a number of terms for the practitioners of the field were suggested in the Communications of the ACM—turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist.[14] Three months later in the same journal, comptologist was suggested, followed next year by hypologist.[15] More recently the term computics has been suggested.[16] Informatik is the term used with greater frequency in continental Europe.
In fact, the renowned computer scientist Edsger Dijkstra is often quoted as saying, "Computer science is no more about computers than astronomy is about telescopes." The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems. Computer science is sometimes criticized as being insufficiently scientific, a view espoused in the statement "Science is to computer science as hydrodynamics is to plumbing" credited to Stan Kelly-Bootle[17] and others. However, there has been much cross-fertilization of ideas between the various computer-related disciplines. Computer science research has also often crossed into other disciplines, such as artificial intelligence, cognitive science, physics (see quantum computing), and linguistics.
Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines.[8] Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.
The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term "software engineering" means, and how computer science is defined. David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines.[18]
The academic, political, and funding aspects of computer science tend to depend on whether a department in the US formed with a mathematical emphasis or an engineering emphasis. In general, CS departments with electrical-engineering roots have tended to succeed as computer science and/or engineering departments, while departments with a mathematics emphasis and a numerical orientation consider themselves aligned with computational science. Both types of departments tend to make efforts to bridge the field educationally, if not across all research.

Computer science

Computer science, or computing science, is the study of the theoretical foundations of information and computation and their implementation and application in computer systems.[1][2][3] Computer science has many sub-fields; some emphasize the computation of specific results (such as computer graphics), while others relate to properties of computational problems (such as computational complexity theory). Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to describing computations, while computer programming applies specific programming languages to solve specific computational problems. A further subfield, human-computer interaction, focuses on the challenges in making computers and computations useful, usable and universally accessible to people.

History

The history of computer science predates the invention of the modern digital computer by many centuries. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity. Wilhelm Schickard built the first mechanical calculator in 1623.[4] Charles Babbage designed a difference engine in the Victorian era (between 1837 and 1901),[5] helped by Ada Lovelace.[6] Around 1900, punched-card machines were sold by the companies that would later merge to form IBM.[7] However, all of these machines were constrained to perform a single task, or at best, some subset of all possible tasks.

During the 1940s, as newer and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors. As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1960s, with the creation of the first computer science departments and degree programs.[8] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.

Many initially believed it impossible that "computers themselves could actually be a scientific field of study" (Levy 1984, p. 11), though it was in the "late fifties" (Levy 1984, p. 11) that this view gradually became accepted among the greater academic population. It was the now well-known IBM brand that formed part of the computer science revolution during this time. IBM (International Business Machines, as the company is officially named) released the IBM 704 and later the IBM 709, computers which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating...if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again" (Levy 1984, p. 13). Clearly, during the late 1950s the computer science discipline was very much in its developmental stages, and such issues were commonplace.

Thursday, December 20, 2007

DNA basis for new generation of computers

SAN FRANCISCO, California (AP) -- It almost sounds too fantastic to be true, but a growing amount of research supports the idea that DNA, the basic building block of life, could also be the basis of a staggeringly powerful new generation of computers.

If it happens, the revolution someday might be traced to the night a decade ago when University of Southern California computer scientist Leonard Adleman lay in bed reading James Watson's textbook "Molecular Biology of the Gene."

"This is amazing stuff," he said to his wife, and then a foggy notion robbed him of his sleep: Human cells and computers process and store information in much the same way.

Computers store data in strings made up of the numbers 0 and 1. Living things store information in molecules represented by the letters A, T, C and G.
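The parallel between the two alphabets can be sketched directly: under one common (and essentially arbitrary) convention, each base carries two bits, so any binary string maps onto the four-letter DNA alphabet. A minimal Python illustration, with the particular base-to-bits mapping chosen purely for the example:

```python
# One possible 2-bits-per-base convention (the specific mapping is arbitrary).
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}
BITS_TO_BASE = {bits: base for base, bits in BASE_TO_BITS.items()}

def encode(bits: str) -> str:
    """Pack a bit string (length divisible by 2) into a DNA base sequence."""
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> str:
    """Unpack a DNA base sequence back into the original bit string."""
    return "".join(BASE_TO_BITS[b] for b in dna)
```

Because there are four symbols instead of two, a strand of n bases holds 2n bits, which is one reason DNA is such a dense storage medium.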

There were many more intriguing similarities, Adleman realized as he hopped out of bed. He began sketching the basics of DNA computing.

Those late-night scribbles have long since given way to hard science, backed by research grants from NASA, the Pentagon and other federal agencies. Now a handful of researchers around the world are creating tiny biology-based computers, hoping to harness the powers of life itself.

They call their creations "machines" and "devices." Really, they are nothing more than test tubes of DNA-laden water, and yet this liquid has been coaxed to crunch algorithms and spit out data.

The problems solved by DNA computers to date are rudimentary. Children could come up with the answers more quickly with a pencil and paper.

But the researchers hope to someday inject tiny computers into humans to zap viruses, fix good cells gone bad and otherwise keep us healthy.

They're also pursuing the idea that genetic material can self-replicate and grow into processors so powerful that they can handle problems too complex for silicon-based computers to solve.

Eventually, the scientists aim to create self-sustaining computers that can be used, for instance, on deep-space voyages, to monitor and maintain the health of humans on board.

DNA computing is born

What struck Adleman most that night he jumped out of bed was how a living enzyme "reads" DNA much the same way computer pioneer Alan Turing first contemplated in 1936 how a machine could read data.

"If you look inside the cell you find a bunch of amazing little tools," said Adleman, who made the first DNA-based computation in 1994. "The cell is a treasure chest."

Adleman used his computer to solve the classic "traveling salesman" mathematical problem -- how a salesman can visit a given number of cities without passing through any city twice -- by exploiting the predictability of how DNA interacts.

Adleman assigned each of seven cities a different strip of DNA, 20 molecules long, then dropped them into a stew of millions of more strips of DNA that naturally bonded with the "cities." That generated thousands of random paths, in much the same way that a computer can sift through random numbers to break a code.

From this hodgepodge of connected DNA, Adleman eventually extracted a satisfactory solution -- a strand that led directly from the first city to the last, without retracing any steps. DNA computing was born.
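Conceptually, Adleman's test tube explored an enormous number of candidate paths at once; a sequential brute-force sketch of the same search, finding a path that visits every city exactly once, might look like this in Python. The seven-city road map here is invented for illustration and is not Adleman's actual instance:

```python
from itertools import permutations

# Hypothetical one-way roads over seven cities, numbered 0 through 6.
EDGES = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (0, 2), (1, 3)}
CITIES = range(7)

def hamiltonian_path(start, end):
    """Brute-force a path from start to end that visits every city exactly once."""
    middle = [c for c in CITIES if c not in (start, end)]
    for order in permutations(middle):
        path = (start, *order, end)
        # Keep the path only if every consecutive pair is a real road.
        if all((a, b) in EDGES for a, b in zip(path, path[1:])):
            return path
    return None
```

The sequential search tries up to 5! orderings one at a time; Adleman's insight was that trillions of DNA strands can form and test such candidate paths simultaneously in a single reaction.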

What these researchers are essentially trying to do is control, predict and understand life itself. So there's little wonder that their machines are decades away from being anything more than a neat laboratory trick.

Biologists are only now grasping the basics of how and why DNA unzips, recombines and sends and receives information. DNA is notoriously fragile and prone to transcription errors -- as the world's cancer rates prove.

These realizations and others have tempered initial expectations that DNA would ultimately replace silicon chips. Still, researchers in this field believe they remain on the vanguard of a computational revolution.

After all, a single gram of dried DNA, about the size of a half-inch sugar cube, can hold as much information as a trillion compact discs. Adleman senses that can be exploited somehow, some way.

"I'm just not sure how," he said.

Fifth generation computer

The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see history of computing hardware) that was intended to perform large amounts of computation using massive parallelism. It was to be the end result of a massive government/industry research project in Japan during the 1980s. It aimed to create an "epoch-making computer" with supercomputer-like performance and usable artificial intelligence capabilities.

The term fifth generation was intended to convey the system as being a leap beyond existing machines. Computers using vacuum tubes were called the first generation; transistors and diodes, the second; ICs, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance.

Background and Design Philosophy

Throughout these earlier computer generations, Japan had largely been a follower in terms of computing advancement, building computers following US and British leads. The Ministry of International Trade and Industry (MITI) decided to attempt to break out of this follow-the-leader pattern, and in the mid-1970s started looking, on a small scale, into the future of computing. It asked the Japan Information Processing Development Center (JIPDEC) to indicate a number of future directions, and in 1979 offered a three-year contract to carry out more in-depth studies together with industry and academia. It was during this period that the term "fifth-generation computer" started to be used.

The primary fields for investigation from this initial project were:

Inference computer technologies for knowledge processing

Computer technologies to process large-scale data bases and knowledge bases

High performance workstations

Distributed functional computer technologies

Super-computers for scientific calculation

The project imagined a parallel processing computer running on top of massive databases (as opposed to a traditional filesystem), using a logic programming language to access the data. They envisioned building a prototype machine with performance between 100M and 1G LIPS, where one LIPS is one logical inference per second. At the time, typical workstation machines were capable of about 100k LIPS. They proposed to build this machine over a ten-year period: three years for initial R&D, four years for building various subsystems, and a final three years to complete a working prototype system. In 1982 the government decided to go ahead with the project, and established the Institute for New Generation Computer Technology (ICOT) through joint investment with various Japanese computer companies.
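The stated target amounts to a jump of three to four orders of magnitude over the workstations of the day, which a quick back-of-envelope check makes concrete (the figures are taken from the performance numbers quoted above):

```python
# FGCS performance targets versus a contemporary workstation, in LIPS
# (logical inferences per second), using the figures cited in the text.
workstation_lips = 100_000                            # ~100k LIPS
target_low, target_high = 100_000_000, 1_000_000_000  # 100M to 1G LIPS

speedup_low = target_low // workstation_lips    # 1,000x at the low end
speedup_high = target_high // workstation_lips  # 10,000x at the high end
```

A 1,000x to 10,000x gap is far more than single-CPU improvements could plausibly deliver over the project's ten-year horizon, which is why the plan leaned on massive parallelism.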

Intel Processor

Processor technology is a collection of technologies designed to work together to deliver a great computing experience. Specifically, it is a microprocessor, chipset, and software and may also include additional hardware, services, and support. Processor technologies enable new capabilities for people in all facets of their lives—at work and at home.

Mobile high performance and manageability with great battery life.

Maximize productivity and increase security of mobile business PCs with Intel® Centrino® Pro processor technology. This new generation of business notebooks delivers up to 2x performance¹ that can provide the headroom to multi-task with ease and run security processes like anti-virus scans in the background without impacting end users.

Intel Centrino Pro processor technology also lets you remotely manage notebooks over the network, even if the system is down or off.² That increases uptime, reduces costly desk-side visits, and frees you up to provide strategic value to your business.

Innovative power-saving features designed to extend battery life are also built right in,³ giving your users the ability to get more work done on the road.

Saturday, December 15, 2007

Interesting Management Stories

It's a fine sunny day in the forest and a lion is sitting outside his cave, lying lazily in the sun. Along comes a fox, out on a walk.

Fox: "Do you know the time, because my watch is broken."
Lion: "Oh, I can easily fix the watch for you."

Fox: "Hmm... But it's a very complicated mechanism, and your big claws will only destroy it even more."
Lion: "Oh no, give it to me, and it will be fixed."

Fox: "That's ridiculous! Any fool knows that lazy lions with great claws cannot fix complicated watches."
Lion: "Sure they do, give it to me and it will be fixed."

The lion disappears into his cave, and after a while he comes back with the watch which is running perfectly. The fox is impressed, and the lion continues to lie lazily in the sun, looking very pleased with himself.

Soon a wolf comes along and stops to watch the lazy lion in the sun.

Wolf: "Can I come and watch TV tonight with you, because mine is broken."
Lion: "Oh, I can easily fix your TV for you."

Wolf: "You don't expect me to believe such rubbish, do you? There is no way that a lazy lion with big claws can fix a complicated TV."
Lion: "No problem. Do you want to try it?"

The lion goes into his cave, and after a while comes back with a perfectly fixed TV. The wolf goes away happily and amazed.


Scene :

Inside the lion's cave. In one corner are half a dozen small and intelligent looking rabbits who are busily doing very complicated work with very detailed instruments. In the other corner lies a huge lion looking very pleased with himself.


Moral :

IF YOU WANT TO KNOW WHY A MANAGER IS FAMOUS; LOOK AT THE WORK OF HIS SUBORDINATES.

Management lesson in the context of the working world: IF YOU WANT TO KNOW WHY SOMEONE UNDESERVING IS PROMOTED, LOOK AT THE WORK OF HIS SUBORDINATES.
