As my own UNISCA First Committee chair report to the General Assembly – "Converging Technologies: The Future of the Global Information Society" – focused specifically on these long-term technological and cultural challenges, which we will have to confront both as a society and as a species, I have high hopes that this meeting will provide a stimulating and unparalleled open venue for exploring novel ideas, discussing alternative paradigms, and brainstorming original, innovative solutions. My congratulations to the conference organizers on their initiative. I look forward to reporting back once the conference concludes; an open conference wiki has been set up at the conference website for contributing new ideas to the discussion.
20081115
Convergence08 Mountain View "From 15-16 November 2008, the world's most dangerous ideas will collide in Mountain View, California. Convergence08 examines the world-changing possibilities of nanotechnology and the life-changing promises of biotechnology. It is the premier forum for debate and exploration of cognitive technology ethics – and ground zero of the past and future information technology revolution. Convergence08 is an innovative, lively 'unconference' – the first and only unclassified forum dedicated solely to the convergence of NBIC – nano-, bio-, information and cognitive – technology developments."
20081002
Time in Quantum Mechanics Perimeter Institute The Clock and the Quantum focuses on conceptual and technical issues concerning the role of time in quantum theory – including quantum correlations in time, histories approaches, pre- and post-selected ensembles, time and quantum measurement, and causality within the framework of quantum theory.
Lee Smolin discusses cosmological inflation, the problem of initial conditions, and the interpretation of the "wavefunction of the universe." Lev Vaidman provides a review of the two-state vector formalism, which considers backwards-evolving quantum states. Noriyuki Hatakenaka presents a new scheme for testing macrorealism without statistical treatments by combining Leggett-Garg and Greenberger-Horne-Zeilinger (GHZ) inequalities, i.e. a temporal GHZ test using quantum correlations in time. Lucian Ionescu outlines an "upgrade" of the Feynman path integral formalism in which qubits, instead of complex amplitudes, are associated with the elementary transitions of a causal network structure.
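For orientation, the standard three-time Leggett-Garg inequality underlying such temporal tests (a textbook statement, not Hatakenaka's specific construction) bounds the two-time correlators of a dichotomic observable Q = ±1 under the joint assumptions of macrorealism and noninvasive measurability:

\[ K \equiv C_{21} + C_{32} - C_{31} \le 1, \qquad C_{ij} = \langle Q(t_i)\,Q(t_j) \rangle, \]

whereas quantum mechanics allows K up to 3/2 for a two-level system – the gap that temporal correlation experiments set out to exploit.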
The conference is the first in a series of foundations conferences organized jointly by the Perimeter Institute and three Australian national universities.
20080909
From Qubits to Black Holes Technion | Macquarie University "Asher Peres (1934-2005) was an Israeli scientist who is widely considered one of the pioneering founders of quantum information science. A student of Nathan Rosen (the “R” of EPR), Asher codiscovered quantum teleportation and a time-reversal test for quantum entanglement, and published numerous works on the foundations of quantum science. His research legacy continues through his many research collaborators, students, textbooks and research papers."
The Technion (Israel) and Macquarie University (Australia) will host the inaugural Asher Peres International Physics School 2008 for senior undergraduates and junior postgraduates, with a series of lectures ranging from quantum mechanics – theory and experiments – through to quantum gravity. Held over five days, from November 17-22, 2008, in the historic environs of Chowder Bay on Sydney Harbour, the School will feature lectures from leading scientists from around the world, including Sir Peter Knight, Artur Ekert, Christian Kurtsiefer, Chris Fuchs, Bei Lok Hu, Jason Twamley, Daniel Terno, Gavin Brennen, Alexei Gilchrist, James Rabeau, and James Cresser.
20080817
Progress in Quantum Computing IQSA | LT25 | Lorentz Center – I've recently returned from a series of international conferences and workshops on superconductivity, quantum computation, entanglement and quantum coherence. In Sopot, Poland, at the International Conference on Quantum Structures (IQSA), much of the week was spent in long walks on the shores of the Baltic Sea, holding intense discussions on quantum information theory with Lev Levitin, who will host the IQSA meeting at MIT in two years. We also continued ongoing research with Roman Zapatrin (Starlab) on adaptive quantum networks for applications in fault-tolerant quantum computation, associative processing and pattern recognition.
Following IQSA, I moved on to the 25th triennial International Conference on Low-temperature Condensed Matter Physics (LT25), where I met with Keith Schwab after the presentation of his group's recent experiments using nanomechanical resonators to probe the boundary between the quantum and classical regimes, and discussed present and upcoming experiments in superconducting flux qubit systems with Yasu Nakamura, John Clarke, Robert Schoelkopf, and John Martinis.
Upon conclusion of LT25, a satellite conference on Quantum Decoherence in Quantum Information Systems was held at the Lorentz Center, where I met with Vlatko Vedral to discuss long-term research initiatives in multipartite and macroscopic entanglement in condensed matter systems. Jasper van Wezel presented a review of the limits to quantum behavior related to spontaneous symmetry breaking, summarizing recent results on the quantum-to-classical transition and future experiments which may elucidate the process of wavefunction collapse. Dirk Bouwmeester was generous enough to offer a tour of the experimental laboratory setup for the MiniGRAIL gravitational wave detector, which has just undergone several modifications: improvements to the antenna, the cryogenic cool-down systems, and the shielding; a redesign of the capacitive transducer; and fabrication of a new two-stage SQUID module for more stable operation at low temperatures.
20080619
Space-QUEST: Experiments with quantum entanglement in space Vienna | ESA | ISS In a recent submission to the arXiv, Zeilinger's group at the University of Vienna, Austria, has proposed an experiment – Space-QUEST, Quantum Entanglement Science and Technology – for space-to-ground, entangled-photon Bell inequality violation measurements to verify quantum nonlocality over distances of thousands of kilometers, in a joint operation between the International Space Station and a ground observatory in the European Union.
Entanglement and nonlocality have been pivotal controversies since the birth of quantum mechanics – Einstein's "spooky action at a distance" implies simultaneous, nonlocal correlations between separated entangled particles. In 1964, J. S. Bell derived the inequality that made the question experimentally testable, and the predicted nonlocal correlations were first confirmed in experiments during the following decade.
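For reference, the CHSH form of Bell's inequality on which most photonic tests rely combines correlation functions E(a,b) for two measurement settings per observer; the statement below is textbook material rather than a detail of the Space-QUEST proposal:

\[ S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 \ \text{(local realism)}, \qquad |S| \le 2\sqrt{2} \ \text{(quantum mechanics)}. \]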
Further refinements and increasing precision in succeeding experiments have consistently shown quantum mechanics to be an explicitly nonlocal theory – the outcome Einstein was most averse to accepting. However, long-distance relativistic experiments, such as between orbiting satellites, have been technologically cost-prohibitive to date. The paper will be presented at the 2008 IAC Microgravity Sciences and Processes Symposium, under a proposed joint initiative between the European Space Agency and the International Space Station.
"Testing quantum correlations over distances achievable with systems placed in the Earth orbit, or even beyond, would allow to verify both the validity of quantum physics and the preservation of entanglement over distances impossible to achieve on the ground. Using the large relative velocity of two orbiting satellites, one can perform experiments on entanglement where – due to special relativity – both observers can claim that they have performed the measurement on their system prior to the measurement of the other observer. In such an experiment, it is not possible anymore to think of any local realistic mechanisms that potentially influence one measurement outcome according to the other one."
Zeilinger's group has previously conducted proof-of-principle experiments in the Canary Islands with a 144 km free-space link, using an ESA receiver telescope to detect single entangled photons [Nature Physics 3, 481-486 (2007)]. A more recent experiment in Italy has demonstrated the viability of single-photon downlink communications from a near-Earth-orbit satellite [New Journal of Physics 10, 033038 (2008)].
The Reality Tests Vienna In Seed (June 2008), the Vienna experimental group discusses the physical and philosophical implications of new correlations between entangled photons, which violate an inequality proposed by Leggett for nonlocal realistic theories. This new series of experiments is reported to rule out such realistic descriptions by more than 80 orders of magnitude. Preliminary coverage of the experimental results was first presented in Nature 446, 871 (2007) and PhysicsWorld, 20 April 2007. According to Časlav Brukner: "Quantum mechanics does not always wash itself out – but to observe its effects for larger and larger objects, we would need more and more accurate measurement devices. We just do not have the sensitivity to observe the quantum effects around us. In essence, we do create the classical world we perceive. There could be other classical worlds completely different from ours."
Quantum networks: Entanglement of distant atoms by projective measurement University of Barcelona | ICFO | Spain Quantum cryptography is rapidly developing into a mature and robust technology for secure data transactions in financial, government and military sector applications. In arXiv:0806.1052, Zippilli et al. quantify the role of photon detector efficiency in quantum repeaters, which will be necessary to scale beyond the point-to-point networks currently employed for secure communications.
Presently, state-of-the-art systems employ atom-photon interaction to generate entanglement between distant nodes across a quantum network through projective measurement. "We assess proposals for entangling two distant atoms by measurement of emitted photons, analyzing how their performance depends on the photon detection efficiency – we believe that these concepts are generally applicable to all systems that may be considered for the creation of distant entanglement, including atomic-ensemble, photonic, and solid state implementations."
The group's objective is to quantify the importance of detector efficiency as applied to generating remote entanglement across quantum networks. With minor modifications, these results can be extended to the efficiency of quantum teleportation protocols that are also based on projective quantum measurement. "In all such systems, the detection efficiency will have a similar, important role for the use of the entanglement as a resource in quantum technologies."
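As a rough illustration of why detector efficiency dominates the scaling (a toy sketch of my own, not code or numbers from Zippilli et al.): heralded schemes that require a single photon detection succeed with probability proportional to the efficiency η per attempt, while coincidence schemes requiring two detections scale as η², and this factor multiplies directly into the achievable entanglement distribution rate.

# Toy scaling of heralded remote-entanglement rates with photon detection
# efficiency. The scheme labels, p_emit and the attempt rate are assumptions
# for illustration only, not parameters from the paper discussed above.

def success_probability(eta, p_emit, clicks):
    """Probability that a single entanglement attempt is heralded.

    eta     -- end-to-end photon detection efficiency (0..1)
    p_emit  -- probability a usable photon is emitted and collected per attempt
    clicks  -- number of independent detections required (1 or 2)
    """
    return (p_emit * eta) ** clicks

def distribution_rate(eta, p_emit, clicks, attempt_rate_hz):
    """Average heralded entangled-pair rate in Hz."""
    return attempt_rate_hz * success_probability(eta, p_emit, clicks)

if __name__ == "__main__":
    for eta in (0.1, 0.3, 0.5, 0.9):
        single = distribution_rate(eta, 0.1, 1, 1e6)   # one-click scheme, rate ~ eta
        double = distribution_rate(eta, 0.1, 2, 1e6)   # two-click scheme, rate ~ eta**2
        print(f"eta={eta:.1f}  single-click ~ {single:8.1f} Hz   two-click ~ {double:8.3f} Hz")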
20080601
Superconducting Qubits RIKEN | UBC | Sherbrooke – In arXiv:0805.0164, Zagoskin and Blais provide a broad and accessible introduction to quantum information processing with superconducting qubits. "From a physicist's standpoint, the most interesting part of quantum computing research may well be the possibility to probe the boundary between the quantum and the classical worlds. The more macroscopic are the structures involved, the better. So far, the most "macroscopic" qubit prototypes that have been studied in the laboratory are certain kinds of superconducting qubits. To get a feeling for how macroscopic these systems can be, the states of flux qubits which are brought in a quantum superposition corresponds to currents composed of as much as 10⁵–10⁶ electrons flowing in opposite directions in a superconducting loop."
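For orientation (standard textbook material, not a result specific to the review), near its degeneracy point a flux qubit is usually modeled as an effective two-level system in the basis of clockwise and counterclockwise persistent-current states:

\[ H = -\tfrac{1}{2}\left( \varepsilon\,\sigma_z + \Delta\,\sigma_x \right), \qquad \varepsilon = 2 I_p \left( \Phi_{\mathrm{ext}} - \tfrac{\Phi_0}{2} \right), \]

where I_p is the persistent current, Φ_ext the applied flux, Φ_0 the flux quantum, and Δ the tunnel splitting between the two current states; the superpositions at ε = 0 are the "macroscopic" states referred to in the quote.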
20080519
Efficient pulsed gates for an oscillator stabilized Josephson qubit IBM Watson In arXiv:0709.1478 and New J. Phys. 10, 033027 (2008), Koch, DiVincenzo, Brito and Steffen derive operational specifications for high-fidelity one- and two-qubit pulsed gates for a superconducting flux qubit, calculating the Hamiltonian with tunable interaction from initialization to readout.
"The quantitative fact that the values of gate infidelity are at the 1% level – and below – is the major result of this paper."
So, can a "debugged" IBM qubit be used soon for universal quantum computation?
"The short answer is, in our opinion, ultimately yes."
"The answer would certainly be no if the noise threshold for fault-tolerant quantum computation were in the neighborhood of the oft-quoted value of 10−5. It is not inconceivable for the experiment to get to these values someday, since we find that the infidelities decrease much faster than linearly with the assumed noise levels."
"To get to 10−5, we would need to get to the very daunting levels of 100nΦ0 at 1Hz for the 1/f noise amplitudes and 100 f s for timing accuracies; there is optimism that both of these numbers are ultimately attainable. Fortunately, while 10−5 was the threshold as it was understood ten years ago, much recent work shows that with good designs, much higher thresholds are possible. According to Terhal and Burkard – 1% is, in fact, on the high end of the noise levels for which fault tolerance may be possible."
"The quantitative fact that the values of gate infidelity are at the 1% level – and below – is the major result of this paper."
So, can a "debugged" IBM qubit be used soon for universal quantum computation?
"The short answer is, in our opinion, ultimately yes."
"The answer would certainly be no if the noise threshold for fault-tolerant quantum computation were in the neighborhood of the oft-quoted value of 10−5. It is not inconceivable for the experiment to get to these values someday, since we find that the infidelities decrease much faster than linearly with the assumed noise levels."
"To get to 10−5, we would need to get to the very daunting levels of 100nΦ0 at 1Hz for the 1/f noise amplitudes and 100 f s for timing accuracies; there is optimism that both of these numbers are ultimately attainable. Fortunately, while 10−5 was the threshold as it was understood ten years ago, much recent work shows that with good designs, much higher thresholds are possible. According to Terhal and Burkard – 1% is, in fact, on the high end of the noise levels for which fault tolerance may be possible."
20080514
Photon transmission through sub-wavelength diameter apertures Delft | Optica In Optics Express 16, 10 (abstract, full article), together with a concurrent TU Delft summary and Photonics review, Adam, Planken et al. report on high time-resolution terahertz mapping of photon transmission through sub-wavelength diameter apertures:
"According to the laws of physics, it is particularly difficult to pass light through a hole smaller than half the wavelength of the light used." The Delft group conducted experiments using extremely high time-resolution measurements in the terahertz (THz) frequency range. The group discovered that even if the hole is up to fifty times smaller than the wavelength used, sufficient light can pass through to allow measurements near the hole – an extremely difficult task using other methods. "Improving the sharpness of THz microscopes, coupled with more sensitive detectors, will improve the viability of creating images of biological cells using this type of measurement."
Prior experiments at Leiden University (Nature 418, 304-306) have also studied photon transmission through arrays of sub-wavelength holes in metal films, and have shown entanglement preservation to be much more robust than expected – the entanglement survives conversion into surface-plasmon waves, which traverse the film before reradiating as photons on the opposite side. "It's a good omen, because it's saying quantum entanglement can survive when you might not expect it to," says Bill Barnes, a photonics expert at the University of Exeter. "If they can survive this, what else can they survive?"
"According to the laws of physics, it is particularly difficult to pass light through a hole smaller than half the wavelength of the light used." The Delft group conducted experiments using extremely high time-resolution measurements in the terahertz (THz) frequency range. The group discovered that even if the hole is up to fifty times smaller than the wavelength used, sufficient light can pass through to allow measurements near the hole – an extremely difficult task using other methods. "Improving the sharpness of THz microscopes, coupled with more sensitive detectors, will improve the viability of creating images of biological cells using this type of measurement."
Prior experiments at Leiden University (Nature 418, 304-306) have also studied photon transmission through sub-wavelength metal films and shown entanglement conservation to be much more robust than expected – surviving the conversion process from surface-plasmon waves, which tunnel through the barrier, before reradiating as photons on the opposite side of the film. "It's a good omen, because it's saying quantum entanglement can survive when you might not expect it to," says Bill Barnes, a photonics expert at the University of Exeter. "If they can survive this, what else can they survive?"
20080508
Time Reversal in Bose-Einstein Condensates Toulouse | CNRS In arXiv:0804.3514, Martin, Georgeot, and Shepelyansky of the Quantware MIPS Center investigate time reversibility in Bose-Einstein condensates (BEC). "We show that inside the regime of quantum chaos, time-reversal dynamics can be inverted from explosion to collapse. The accuracy of time reversal decreases with the increase of atom interactions inside the BEC, until it is completely lost – though, surprisingly, quantum chaos helps to restore time reversibility. Existing experimental setups, similar to those of Ryu, Behinaein, and Wayper, can test the fundamental question of BEC time reversal discussed here."
20080506
DARPA INFOSEC Mandate DARPA | EOP | Congress In a Wired briefing of 01 May 2008, Danger Room reports on the new DARPA information security program mandated by Congress and ratified by the President. The UNISCA First Committee INFOSEC Chair briefing to the UN General Assembly is particularly apropos to this initiative. "The Defense Advanced Research Projects Agency, or DARPA, was created 50 years ago in response to the Soviets' launch of Sputnik. In less than a year, DARPA put together the infrastructure that guided the American space effort for decades to come. Now, DARPA has been given new marching orders: to help America fight and win battles online.
Under a directive signed by the President – and recently approved by Congress – nearly every arm of the government's security apparatus is starting work on a massive national cybersecurity initiative designed to protect the United States from electronic attack and strike at adversaries online. DARPA's role: to create a cyberwarfare range where all these new forms of electronic combat can be tried out. According to a defense official familiar with the program, "Congress has given DARPA a direct order; that's only happened once before – with the Sputnik program in the '50s."
Danger Room's sister blog, Threat Level, has a good writeup of the cybersecurity initiative, which has been labeled a Manhattan Project-type effort. In the case of cybersecurity, there is at least talk of big money: about $30 billion. For its part, DARPA's "National Cyber Range" would create a virtual environment where the Defense Department can mock up real warfare, both defense and offense.
DARPA today issued an announcement describing how the range would serve as a testbed where the government could conduct unbiased, quantitative and qualitative assessment of information assurance and survivability tools in a representative network environment; replicate complex, large-scale, heterogeneous networks and users in current and future Department of Defense (DoD) weapon systems and operations; enable multiple, independent, simultaneous experiments on the same infrastructure; enable realistic testing of Internet/Global Information Grid (GIG) scale research; and develop and deploy revolutionary cyber testing capabilities, enabling the use of the scientific method for rigorous cyber testing.
This is clearly a serious deal for the agency: DARPA Director Tony Tether is scheduled to speak at the proposers' day workshop in mid-May, and apparently plans to help handpick the contractors. Tether is known for his close involvement in DARPA contracts. Many of the details surrounding this program will be classified."
20080419
Quantum sensor effect in bird navigation? Institute for Electronic Structure and Lasers, Foundation for Research and Technology, Heraklion | University of Crete In arXiv:0804.2646, arXivblog, and Slashdot reports, Kominis investigates the potential for Zeno effect-based quantum sensing in bird navigation. "How birds use the Earth's magnetic field to navigate has puzzled researchers for decades. In recent years, a growing body of evidence has pointed to the possibility that a weak magnetic field can influence the outcome of a certain type of chemical reaction involving the recombination of pairs of ions in bird retinas. The trouble is that the ion recombination is known to happen too quickly for the Earth's weak magnetic field to have any effect. Now it looks as if the quantum Zeno effect may explain the process (abstract). This is the "watched-pot-never-boils" effect, in which the act of observing a quantum system maintains it for longer than expected. This is extraordinary news, because it means a quantum sensor is determining the macroscopic behavior of living birds. Kominis says we may well see these effects elsewhere, and mentions that a similar mechanism might be at work in photosynthesis."
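As a toy illustration of the quantum Zeno effect invoked here (a generic two-level example of my own, not the radical-pair model from the paper): splitting a driven evolution into many measured intervals suppresses the net transition probability, since each unmeasured interval contributes only quadratically in its duration, so frequent "observation" can hold a state in place far longer than its free evolution would allow.

import math

# Toy quantum Zeno demonstration for a two-level system driven at Rabi
# frequency omega. The parameters are arbitrary (a resonant pi-pulse:
# complete population transfer if left unmeasured) and illustrative only --
# they are not taken from the radical-pair model discussed above.

def survival_probability(omega, t, n_measurements):
    """Probability of remaining in the initial state after n projective
    measurements spaced evenly over total evolution time t."""
    dt = t / n_measurements
    p_stay_per_interval = math.cos(omega * dt / 2.0) ** 2
    return p_stay_per_interval ** n_measurements

if __name__ == "__main__":
    omega, t = math.pi, 1.0   # omega * t = pi -> full spin flip without measurement
    for n in (1, 2, 10, 100, 1000):
        print(f"n={n:5d} measurements:  P(stay) = {survival_probability(omega, t, n):.4f}")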
MagiQ Research Labs Andrew Hammond | MagiQ Technologies MagiQ Technologies (NYC) has founded MagiQ Research Labs in Somerville, MA. The lab will provide a technical, engineering, and production base for public- and private-sector work in medical optics, quantum information, fiber sensing, and aerospace and defense applications, and has been established under research grants from ARO, DARPA, and NASA.
20080316
Room-temperature quantum oscillations in diamond crystals UCSB | CNSI | Kavli TU Delft | Ames DOE In Science Express (Science) and a concurrent UCSB press release, Hanson, Awschalom et al. report striking experimental observations of quantum oscillations in diamond crystals at room temperature. "We were stunned by these unexpected experimental results, and extremely excited by the ability to control and monitor single quantum states, especially at room temperature," said David Awschalom. "To our surprise, when looking at longer times, the oscillations disappeared, then re-appeared. At first it looked like an artifact, but repeated measurements reproduced this behavior," said co-author Ronald Hanson, a postdoctoral researcher at UCSB during this period who is now a professor at the Kavli Institute of Nanoscience Delft, Delft University of Technology, in the Netherlands.
20080217
Workshop on Neuromorphic Computing DSO As profiled in Wired and BAA SN08-16, DSO is hosting a workshop on neuromorphic adaptive plastic scalable electronics, to be held on 4 March 2008. "Briefly, the vision for the anticipated DARPA SyNAPSE program is to enable electronic neuromorphic machine technology that is scalable to biological levels. As compared to biological systems, today’s intelligent machines are less efficient by a factor of one million to one billion in real world, complex environments. The key to achieving the vision of the SyNAPSE program will be an unprecedented multidisciplinary approach that can coordinate aggressive technology development activities in the following SyNAPSE areas: 1) hardware; 2) architecture; 3) simulation; and 4) environment. Hardware includes neuromorphic electronics with novel, high density, plastic, synaptic components; architecture includes neuromorphic design from microcircuit to complete system; simulation includes large-scale digital simulation of neuromorphic circuits and functional neuromorphic systems; and environment includes virtual training, testing and benchmarking for neuromorphic systems realized in hardware or simulation."
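As a minimal, purely illustrative example of what a "plastic synaptic component" computes (my own sketch; the exponential form and constants are a common textbook choice, not anything specified in the SyNAPSE announcement), here is the canonical spike-timing-dependent plasticity (STDP) update for a single pre/post spike pair:

import math

# Minimal spike-timing-dependent plasticity (STDP) rule -- a standard toy
# model of a "plastic synaptic component". Amplitudes and time constants are
# conventional illustrative values only.

A_PLUS, A_MINUS = 0.010, 0.012     # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # decay time constants in milliseconds

def stdp_weight_change(t_pre_ms, t_post_ms):
    """Synaptic weight change for one pre/post spike pair."""
    dt = t_post_ms - t_pre_ms
    if dt > 0:   # presynaptic spike precedes postsynaptic spike: strengthen
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    return -A_MINUS * math.exp(dt / TAU_MINUS)   # otherwise: weaken

if __name__ == "__main__":
    for dt in (-40, -10, -1, 1, 10, 40):
        print(f"dt = {dt:+4d} ms   dw = {stdp_weight_change(0.0, float(dt)):+.5f}")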
20080118
QuanTalk EU QIST Via the Pontiff, a new web-based initiative supported by the European Union under an ERA-Pilot QIST grant: QuanTalk. "The purpose of the project is to provide a central facility for open review and discussion of research in quantum information science worldwide. The heart of the site is the Articles section, wherein we provide three main features: (1) Open scientific discussion of the latest research in quantum information. The system will be familiar to those who use forum systems or blogs, but it differs in its approach to permanence and accountability. (2) A community review process, open peer review, to which authors may submit their work. This feature is not yet available during the current beta test of quantalk.org. (3) An open archive where authors can deposit digital material that they wish to make available to the community. During the beta phase, hosting is restricted to PDF documents."