Quantum Computers
computer science crazy Super Moderator Posts: 3,048 Joined: Dec 2008 
22-09-2008, 09:59 AM
Definition

The history of computer technology has involved a sequence of changes from one type of physical realisation to another: from gears to relays to valves to transistors to integrated circuits, and so on. Today's advanced lithographic techniques can squeeze fraction-of-a-micron-wide logic gates and wires onto the surface of silicon chips. Soon they will yield even smaller parts and inevitably reach a point where logic gates are so small that they are made out of only a handful of atoms; that is, the size of the logic gates becomes comparable to the size of atoms. On the atomic scale, matter obeys the rules of quantum mechanics, which are quite different from the classical rules that determine the properties of conventional logic gates. So if computers are to become smaller in the future, new quantum technology must replace or supplement what we have now.

The point, however, is that quantum technology can offer much more than cramming more and more bits onto silicon and multiplying the clock speed of microprocessors. It can support an entirely new kind of computation, with qualitatively new algorithms based on quantum principles.

The story of quantum computation started as early as 1982, when the physicist Richard Feynman considered the simulation of quantum-mechanical objects by other quantum systems. However, the unusual power of quantum computation was not really anticipated until 1985, when David Deutsch of the University of Oxford published a crucial theoretical paper in which he described a universal quantum computer. After the Deutsch paper, the hunt was on for something interesting for quantum computers to do. At the time, all that could be found were a few rather contrived mathematical problems, and the whole issue of quantum computation seemed little more than an academic curiosity.
It all changed rather suddenly in 1994, when Peter Shor of AT&T's Bell Laboratories in New Jersey devised the first quantum algorithm that, in principle, can perform efficient factorisation. This became a 'killer application': something very useful that only a quantum computer could do.

Concept of Information

To explain what makes quantum computers so different from their classical counterparts, we begin by having a closer look at a basic chunk of information, namely one bit. A bit is the basic unit of information in a digital computer. From a physical point of view, a bit is a physical system which can be prepared in one of two different states representing two logical values: no or yes, false or true, or simply 0 or 1. For example, in digital computers the voltage between the plates of a capacitor represents a bit of information: a charged capacitor denotes bit value 1 and an uncharged capacitor bit value 0. One bit of information can also be encoded using two different polarisations of light or two different electronic states of an atom. In any of the systems listed above, a bit stores a value of logical 1 or logical 0 using some method that depends on the system used.

In quantum computers, too, the basic unit of information is a bit. The concept of quantum computing first arose when the use of an atom as a bit was suggested. If we choose an atom as a physical bit, then quantum mechanics tells us that, apart from the two distinct electronic states (the excited state and the ground state), the atom can also be prepared in what is known as a coherent superposition of the two states.
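The coherent superposition described above can be sketched numerically. The following Python snippet (an illustration added here, not part of the original post) models a qubit as a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

```python
import math

# A qubit modelled as two amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
# Measurement yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.

def make_qubit(alpha: complex, beta: complex) -> tuple:
    """Normalise the amplitudes so the outcome probabilities sum to 1."""
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return alpha / norm, beta / norm

# An equal (coherent) superposition of the ground and excited states:
alpha, beta = make_qubit(1, 1)
p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
print(p0, p1)  # each outcome is (approximately) equally likely
```

A classical bit would force `p0` or `p1` to be exactly 1; the superposition keeps both possibilities alive until a measurement is made.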



pigannu Active In SP Posts: 1 Joined: Mar 2011 
19-03-2011, 10:01 AM
Please send me the full seminar and presentation report for the topic Quantum Computers.



seminar flower Super Moderator Posts: 10,120 Joined: Apr 2012 
01-08-2012, 01:40 PM
QUANTUM COMPUTERS
QUANTUM COMPUTERS.docx (Size: 477.26 KB / Downloads: 30)

ABSTRACT

In a quantum computer, any superposition of inputs evolves unitarily into the corresponding superposition of outputs. It has recently been demonstrated that such computers can dramatically speed up the task of finding factors of large numbers, a problem of great practical significance because of its cryptographic applications. Instead of the nearly exponential ($\sim \exp L^{1/3}$, for a number with $L$ digits) time required by the fastest classical algorithm, the quantum algorithm gives factors in a time polynomial in $L$ ($\sim L^2$). This enormous speedup is possible in principle because quantum computation can simultaneously follow all of the paths corresponding to the distinct classical inputs, obtaining the solution as a result of coherent quantum interference between the alternatives. Hence, a quantum computer is a sophisticated interference device, and it is essential for its quantum state to remain coherent in the course of the operation. In this report we investigate the effect of decoherence on the quantum factorisation algorithm and establish an upper bound on a ``quantum factorizable'' $L$ based on the decoherence suffered per operational step.

Introduction to Quantum Computers

By around 2030, computers might not have any transistors and chips. Think of a computer that is much faster than a common classical silicon computer: this might be a quantum computer. Theoretically, it could run with almost no energy consumption and billions of times faster than today's PIII computers. Scientists already think of the quantum computer as the next generation of classical computers. Gershenfeld says that if transistors continue to shrink at the same rate as in past years, then by the year 2020 the width of a wire in a computer chip will be no larger than a single atom. At these sizes, the rules of classical physics no longer apply.
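The scaling contrast in the abstract, roughly exp(L^(1/3)) classical steps versus L^2 quantum steps for an L-digit number, can be made concrete with a quick calculation. This sketch (added for illustration; constants and logarithmic factors are deliberately omitted, so only the trend is meaningful) compares the two growth rates:

```python
import math

# Rough scaling from the abstract: classical factoring time grows roughly like
# exp(L^(1/3)), while Shor's algorithm scales polynomially, about L^2.
# With all constants set to 1, the exponential only overtakes the polynomial
# for large L; asymptotically it always wins.

def classical_steps(L: int) -> float:
    return math.exp(L ** (1 / 3))

def quantum_steps(L: int) -> float:
    return float(L ** 2)

for L in (1_000, 1_000_000):
    print(L, classical_steps(L), quantum_steps(L))
```

At L = 1,000 the toy "classical" count is still smaller (constants matter at small sizes), but by L = 1,000,000 the exponential term is astronomically larger than the polynomial one, which is the sense in which the quantum algorithm is an enormous asymptotic speedup.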
Computers designed on today's chip technology will not continue to get cheaper and better indefinitely. Because of its great power, the quantum computer is an attractive next step in computer technology (Manay, 1998, p. 5). The technology of quantum computers is also very different. For its operation, a quantum computer uses quantum bits (qubits). A qubit is governed by the laws of quantum mechanics, which are completely different from the laws of classical physics: a qubit can exist not only in the states corresponding to the logical values 0 or 1, as in the case of a classical bit, but also in a superposition of both states.

History of Quantum Computers

In 1982, R. Feynman presented an interesting idea of how a quantum system could be used for computation. He also explained how the effects of quantum physics could be simulated by such a quantum computer. This was a very promising idea for future research into quantum effects: every experiment investigating the effects and laws of quantum physics is complicated and expensive, whereas a quantum computer would be a system performing such experiments permanently. Later, in 1985, it was shown that a quantum computer would be much more powerful than a classical one (West, 2000, p. 3).

The Major Difference between Quantum and Classical Computers

The memory of a classical computer is a string of 0s and 1s, and it can perform calculations on only one set of numbers at a time. The memory of a quantum computer is a quantum state that can be a superposition of different numbers. A quantum computer can perform an arbitrary reversible classical computation on all those numbers simultaneously. Performing a computation on many different numbers at the same time, and then interfering all the results to obtain a single answer, makes a quantum computer much more powerful than a classical one (West, 2000).

The Potential and Power of Quantum Computing

A quantum computer with 500 qubits gives 2^500 superposition states.
Each state would be classically equivalent to a single list of 500 1s and 0s. Such a computer could operate on 2^500 states simultaneously. Eventually, observing the system would cause it to collapse into a single quantum state corresponding to a single answer, a single list of 500 1s and 0s, as dictated by the measurement axiom of quantum mechanics. In this sense, such a machine is equivalent to a classical computer with approximately 10^150 processors (West, 2000, p. 3).

Conclusion

It is important to note that practical quantum computing still lies far in the future. The programming style for a quantum computer will also be quite different. Developing a quantum computer requires a great deal of money, and even the best scientists cannot yet answer many questions about quantum physics. The quantum computer is based on theoretical physics, and some experiments have already been carried out; building a practical quantum computer may be just a matter of time. Quantum computers could solve problems that cannot be tackled with today's computers. This would be one of the biggest steps in science and could revolutionize the practical computing world.
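The counting-and-collapse argument above can be tried at toy scale. This Python sketch (added for illustration; a real 500-qubit state could never be stored this way, which is exactly the point) builds an equal superposition over all 2^n basis states of a small register and then "measures" it, collapsing to one definite bit string:

```python
import math
import random

# An n-qubit register in equal superposition: one amplitude per basis state,
# 2**n amplitudes in total, each with squared magnitude 1 / 2**n.

def equal_superposition(n: int) -> list:
    dim = 2 ** n
    return [1 / math.sqrt(dim)] * dim

# Measurement collapses the state: exactly one basis state is returned,
# chosen with probability equal to its squared amplitude.

def measure(state: list) -> int:
    probs = [a * a for a in state]
    return random.choices(range(len(state)), weights=probs)[0]

state = equal_superposition(5)
print(len(state))                          # 32 amplitudes for 5 qubits
print(format(measure(state), "05b"))       # one definite 5-bit outcome
```

Doubling n doubles the number of amplitudes, so a 500-qubit register would need 2^500 of them, far more than any classical memory could hold, yet a measurement still yields only a single 500-bit answer.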


