Greetings to our dear colleague in the Iranian physics community,
We are pleased to send, attached, the poster for the second national job fair for physicists. The 2nd Iran's Physics Job Fair is scheduled to take place on 14 Dey 1402 at the Department of Physics, University of Tehran, from 9:00 to 17:00, concurrently with the 1402 Annual Physics Conference of Iran.
Your presence will warm this fair.
We also look forward to the participation of companies seeking to recruit the country's physicists. Please support the nation's physics community in this important endeavor. We await you.
Please inform the students and graduates of the physics community about visiting the fair.
For any questions, please contact the fair's coordinator, Mr. Masoud Mozaffari (09120673360).
With thanks,
Executive Secretary of Iran's Physics Job Fair
Dr. Mohammadizadeh
Scientists are building extremely powerful lasers. For around a trillionth of a second, one of these machines will emit thousands of times the power of the US electric grid. Such devices could allow researchers to explore unsolved problems related to fundamental physics principles and to develop innovative laser-based technologies. But these applications require precise knowledge of the intensity of any such laser, a parameter that is difficult to measure, as no known material can withstand the anticipated extreme conditions of the laser beams. Now Wendell Hill at the University of Maryland, College Park, and his colleagues demonstrate a technique that uses electrons to determine this intensity [1].
For their demonstration, the researchers fired a high-power laser pulse at a low-density gas, causing the gas to release electrons. The laser’s electromagnetic field then propelled these electrons forward and out of the laser beam. The team observed the angular distribution of the ejected electrons in real time using surfaces called image plates that act like photographic film.
Analyzing these image plates, Hill and his colleagues find that the angle of the emitted electrons relative to the beam’s direction is inversely proportional to a laser’s intensity, allowing this angle to serve as an intensity measure. The researchers demonstrate their method for laser intensities of 10^19–10^20 W/cm^2 and suggest that it could be applied to intensities in the range of 10^18–10^21 W/cm^2. They say that the approach could help scientists in field testing the next generation of ultrapowerful lasers and in studying, with high precision, the interaction between matter and strong electromagnetic fields.
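The trend can be illustrated with a textbook plane-wave estimate (a generic sketch, not the authors' calibration): an electron born at rest in a linearly polarized laser field drifts out at an angle satisfying tan θ = √(2/(γ − 1)), with γ − 1 = a0²/2, where a0 ≈ 8.5×10⁻¹⁰ λ[µm] √(I[W/cm²]) is the standard normalized field strength. In this estimate the angle falls monotonically with intensity (as 1/√I in the strong-field limit), which is the behavior the measurement exploits. Assuming an 800-nm drive laser:

```python
import math

def ejection_angle_deg(intensity_W_cm2, wavelength_um=0.8):
    """Polar ejection angle of an electron leaving a linearly polarized
    plane-wave laser focus, via the textbook relations
    tan(theta) = sqrt(2/(gamma - 1)) and gamma - 1 = a0**2 / 2."""
    # Normalized vector potential a0 (standard estimate, linear polarization)
    a0 = 8.5e-10 * wavelength_um * math.sqrt(intensity_W_cm2)
    gamma_minus_1 = a0**2 / 2
    return math.degrees(math.atan(math.sqrt(2 / gamma_minus_1)))

for intensity in (1e18, 1e19, 1e20, 1e21):
    print(f"I = {intensity:.0e} W/cm^2 -> theta ~ {ejection_angle_deg(intensity):.1f} deg")
```

Measuring the angle and inverting this kind of monotonic relation then yields the intensity, which is the essence of the technique.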
–Ryan Wilkinson
Ryan Wilkinson is a Corresponding Editor for Physics Magazine based in Durham, UK.
Many physicists hear the words “quantum field theory,” and their thoughts turn to electrons, quarks, and Higgs bosons. In fact, the mathematics of quantum fields has been used extensively in domains outside particle physics for the past 40 years. The 2024 Breakthrough Prize in Fundamental Physics has been awarded to two theorists who were instrumental in repurposing quantum field theory for condensed-matter, statistical, and gravitational physics.
“I really want to stress that quantum field theory is not the preserve of particle physics,” says John Cardy, a professor emeritus from the University of Oxford. He shares the Breakthrough Prize with Alexander Zamolodchikov from Stony Brook University, New York.
The Breakthrough Prize is perhaps the “blingiest” of science awards, with $3 million being given for each of the five main awards (three for life sciences, one for physics, and one for mathematics). Additional awards are given to early-career scientists. The founding sponsors of the Breakthrough Prize are entrepreneurs Sergey Brin, Priscilla Chan and Mark Zuckerberg, Julia and Yuri Milner, and Anne Wojcicki.
The fundamental physics prize going to Cardy and Zamolodchikov is “for profound contributions to statistical physics and quantum field theory, with diverse and far-reaching applications in different branches of physics and mathematics.” When notified about the award, Zamolodchikov expressed astonishment. “I never thought to find myself in this distinguished company,” he says. He was educated as a nuclear engineer in the former Soviet Union but became interested in particle physics. “I had to clarify for myself the basics.” Those basics were quantum field theory, which describes the behaviors of elementary particles with equations that are often very difficult to solve. In the early 1980s, Zamolodchikov realized that he could make more progress in a specialized corner of mathematics called two-dimensional conformal field theory (2D CFT). “I was lucky to stumble on this interesting situation where I could find exact solutions,” Zamolodchikov says.
CFT describes “scale-invariant” mappings from one space to another. “If you take a part of the system and blow it up by the right factor, then that part looks like the whole in a statistical sense,” explains Cardy. More precisely, conformal mappings preserve the angles between lines as the lines stretch or contract in length. In certain situations, quantum fields obey this conformal symmetry. Zamolodchikov’s realization was that solving problems in CFT—especially in 2D where the mathematics is easiest—gives a starting point for studying generic quantum fields, Cardy says.
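As a standard illustration of why angles are preserved (a textbook fact, not taken from the prize citation): any holomorphic map f is conformal wherever f'(z) ≠ 0, because near a point z_0 it acts as multiplication by the complex number f'(z_0), which is a rotation combined with a uniform rescaling:

```latex
f(z) \approx f(z_0) + f'(z_0)\,(z - z_0),
\qquad f'(z_0) = r\,e^{i\theta} \neq 0 .
```

Two curves crossing at z_0 are each rotated by the same angle θ and stretched by the same factor r, so the angle between them is unchanged even though lengths are not — exactly the "blow up a part by the right factor" picture Cardy describes. For example, f(z) = z^2 is conformal everywhere except at z = 0, where f'(z) vanishes.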
Cardy started out as a particle physicist, but he became interested in applying quantum fields to the world beyond elementary particles. When he heard about the work of Zamolodchikov and other scientists in the Soviet Union, he immediately saw the potential and versatility of 2D CFT. One of the first places he applied this mathematics was in phase transitions, which arise when, for example, the atomic spins of a material suddenly align to form a ferromagnet. Within the 2D CFT framework, Cardy showed that you could perform computations on small systems—with just ten spins, for example—and extract information that pertains to an infinitely large system. In particular, he was able to calculate the critical exponents that describe the behavior of various phase transitions.
Cardy found other uses of 2D CFT in, for example, percolation theory and quantum spin chains. “I would hope people consider my contributions as being quite broad, because that’s what I tried to be over the years,” he says. Zamolodchikov also explored the application of quantum field theory in diverse topics, such as critical phenomena and fluid turbulence. “I tried to develop it in many respects,” he says. The two theorists never collaborated, but they both confess to admiring the other’s work. “We’ve written papers on very similar things,” Cardy says. “I would say that we have a friendly rivalry.” He remembers first encountering Zamolodchikov in 1986 at a conference organized in Sweden as a “neutral” meeting point for Western and Soviet physicists. “It was wonderful to meet him and his colleagues for the first time,” Cardy says.
“Zamolodchikov and Cardy are the oracles of two dimensions,” says Pedro Vieira, a quantum field theorist from the Perimeter Institute in Canada. He says that one of the things that Zamolodchikov showed was the infinite number of symmetries that can exist in 2D CFT. Cardy was especially insightful in how to apply the mathematical insights of 2D to other dimensions. Vieira says of the pair, “They understood the power of 2D physics, in that it is very simple and elegant, and at the same time mathematically rich and complex.”
Vieira says that the work of Zamolodchikov and Cardy continues to be important for a wide range of researchers, including condensed-matter physicists who study 2D surfaces and string theorists who model the motion of 1D strings moving in time. One topic attracting a lot of attention these days is the so-called AdS/CFT correspondence, which connects CFT mathematics with gravitational theory (see Viewpoint: Are Black Holes Really Two Dimensional?). Cardy says that there’s also been a great deal of recent work on CFT in more than two dimensions. “I’m sure that [higher-dimensional CFT] will win lots of awards in the future,” he says. Zamolodchikov continues to work on extensions of quantum field theory, such as the “TT̄ deformation,” that may provide insights into fundamental physics, just as CFT has done.
Zamolodchikov and Cardy and the 18 other Breakthrough winners will receive their awards on April 13, 2024, at a red-carpet ceremony that routinely attracts celebrities from film, music, and sports. Cardy says that he is looking forward to it. “I like a good party.”
–Michael Schirber
Michael Schirber is a Corresponding Editor for Physics Magazine based in Lyon, France.
REFERENCE: https://physics.aps.org/articles/v16/165
A ratchet is a device that produces a net forward motion of an object from a periodic (or random) driving force. Although ratchets are common in watches and in cells (see Focus: Stalling a Molecular Motor), they are hard to make for quantum systems. Now researchers demonstrate a quantum ratchet for a collection of cold atoms trapped in an optical lattice [1]. By varying the lattice’s light fields in a time-dependent way, the researchers show that they can move the atoms coherently from one lattice site to the next without disturbing the atoms’ quantum states.
One type of ratchet (a Hamiltonian ratchet) works by providing periodic, nonlossy pushes to a gas or other multiparticle system. For particles starting in certain initial states, the pushes are timed with their motion, and the resulting movement is in a particular forward direction. For particles in other states, the pushes are out of sync, and the particles travel in chaotic trajectories with no preferred direction.
Hamiltonian ratchets have previously been demonstrated for quantum systems, but for those ratchets the particles ended up spread out in space. The ratchet designed by David Guéry-Odelin from the University of Toulouse, France, and his colleagues has tighter directional control. For the demonstration, the researchers placed 10^5 rubidium atoms in the periodic potential of an optical lattice. Applying specially tuned modulations to this potential, they showed that the atoms moved in discrete steps from one lattice site to the next. At the end of each step, the atoms came to rest in their ground state. This well-defined transport could have potential applications in controlling matter waves for quantum experiments, Guéry-Odelin says.
–Michael Schirber
Michael Schirber is a Corresponding Editor for Physics Magazine based in Lyon, France.
The advent of ChatGPT, Bard, and other large language models (LLMs) has naturally excited everybody, including the entire physics community. There are many evolving questions for physicists about LLMs in particular and artificial intelligence (AI) in general. What do these stupendous developments in large-data technology mean for physics? How can they be incorporated into physics? What will be the role of machine learning (ML) itself in the process of physics discovery?
Before I explore the implications of those questions, I should point out there is no doubt that AI and ML will become integral parts of physics research and education. Even so, similar to the role of AI in human society, we do not know how this new and rapidly evolving technology will affect physics in the long run, just as our predecessors did not know how transistors or computers would affect physics when the technologies were being developed in the early 1950s. What we do know is that the impact of AI/ML on physics will be profound and ever evolving as the technology develops.
The impact is already being felt. Just a cursory search of Physical Review journals for “machine learning” in articles’ titles, abstracts, or both returned 1456 hits since 2015 and only 64 for the entire period from Physical Review’s debut in 1893 to 2014! The second derivative of ML usage in articles is also increasing: the same search yielded 310 Physical Review articles in 2022 with ML in the title, abstract, or both, and in the first 6 months of 2023 there were already 189 such publications.
ML is already being used extensively in physics, which is unsurprising since physics deals with data that are often very large, as is the case in some high-energy physics and astrophysics experiments. In fact, physicists have been using some forms of ML for a long time, even before the term ML became popular. Neural networks—the fundamental pillars of AI—also have a long history in theoretical physics: the term “neural networks” has appeared in hundreds of Physical Review articles’ titles and abstracts since its first usage in 1985 in the context of models for understanding spin glasses. The AI/ML use of neural networks is quite different from the way neural networks appear in spin glass models, but the basic idea of representing a complex system using neural networks is shared by both cases. ML and neural networks have been woven into the fabric of physics going back 40 years or more.
What has changed is the availability of very large computer clusters with huge computing power, which enable ML to be applied in a practical manner to many physical systems. For my field, condensed-matter physics, these advances mean that ML is being increasingly used to analyze large datasets involving materials properties and predictions. In these complex situations, the use of AI/ML will become a routine tool for every professional physicist, just like vector calculus, differential geometry, and group theory. Indeed, the use of AI/ML will soon become so widespread that we simply will not remember why it was ever a big deal. At that point, this opinion piece of mine will look a bit naive, much like pontifications in the 1940s about using computers for doing physics.
But what about deeper usage of AI/ML in physics beyond using it as an everyday tool? Can it help us solve deep problems of great significance? Could physicists, for example, have used AI/ML to come up with the Bardeen-Cooper-Schrieffer theory of superconductivity in 1957 if it had been available? Can AI/ML revolutionize theoretical physics by finding ideas and concepts such as the general theory of relativity or the Schrödinger equation? Most physicists I talk to firmly believe that this would be impossible. Mathematicians feel this way too. I do not know of any mathematician who believes that AI/ML can prove, say, Riemann’s hypothesis or Goldbach’s conjecture. I, on the other hand, am not so sure. All ideas are somehow deeply rooted in accumulated knowledge, and I am unwilling to assert that I already know what AI/ML won’t ever be able to do. After all, I remember the time when there was a widespread feeling that AI could never beat the great champions of the complex game of Go. A scholarly example is the ability of DeepMind’s AlphaFold to predict what structure a protein’s string of amino acids will adopt, a feat that was thought impossible 20 years ago.
This brings me to my final point. Doing physics using AI/ML is happening, and it will become routine soon. But what about understanding the effectiveness of AI/ML and of LLMs in particular? If we think of an LLM as a complex system that suddenly becomes extremely predictive after it has trained on a huge amount of data, the natural question for a physicist to ask is: What is the nature of that shift? Is it a true dynamical phase transition that occurs at some threshold training point? Or is it just the routine consequence of interpolations among known data, which work empirically, sometimes even when extrapolated? The latter, which is what most professional statisticians seem to believe, involves no deep principle. But the former involves what could be called the physics of AI/ML and constitutes in my mind the most important intellectual question: Why does AI/ML work and when does it fail? Is there a phase transition at some threshold where the AI/ML algorithm simply predicts everything correctly? Or is the algorithm just some huge interpolation, which works because the amount of data being interpolated is so gigantic that most questions simply fall within its regime of validity? As physicists, we should not just be passive users of AI/ML but also dig into these questions. To paraphrase a famous quote from a former US president, we should not only ask what AI/ML can do for us (a lot actually), but also what we can do for AI/ML.
Sankar Das Sarma is the Richard E. Prange Chair in Physics and a Distinguished University Professor at the University of Maryland, College Park. He is also a fellow of the Joint Quantum Institute and the director of the Condensed Matter Theory Center, both at the University of Maryland. Das Sarma received his PhD from Brown University, Rhode Island. He has belonged to the University of Maryland physics faculty since 1980. His research interests are condensed-matter physics, quantum computation, and statistical mechanics.
According to ISNA, citing the Nobel Prize website, the 2023 Nobel Prize in Physics was awarded to Pierre Agostini, Ferenc Krausz, and Anne L'Huillier for research on "experimental methods that generate attosecond pulses of light for the study of electron dynamics in matter."
Capturing the shortest of moments becomes possible
The three winners of the 2023 Nobel Prize in Physics are recognized for experiments that have given humanity new tools for exploring the world of electrons inside atoms and molecules. Pierre Agostini, Ferenc Krausz, and Anne L'Huillier demonstrated a way to create extremely short pulses of light that can be used to measure the rapid processes in which electrons move or change energy.
Fast-moving events flow into one another, just as a film made up of still images is perceived as continuous motion. To investigate truly brief events, special technology is needed. In the world of electrons, changes occur in a few tenths of an attosecond, an extraordinarily short time.
The laureates' experiments produced pulses of light so short that they are measured in attoseconds, demonstrating that these pulses can be used to provide images of processes inside atoms and molecules.
In 1987, L'Huillier discovered that many different overtones of light arise when infrared laser light is transmitted through a noble gas. Each overtone is a light wave with a given number of cycles for each cycle in the laser light. The overtones arise from the laser light interacting with atoms in the gas, giving some electrons extra energy that is then emitted as light. L'Huillier continued to explore this phenomenon, laying the groundwork for subsequent breakthroughs.
In 2001, Agostini succeeded in producing and investigating a series of consecutive light pulses in which each pulse lasted just 250 attoseconds. At the same time, Krausz was working on another type of experiment, one that made it possible to isolate a single light pulse lasting 650 attoseconds.
The laureates' contributions have made it possible to investigate extremely rapid processes that were previously impossible to follow.
Eva Olsson, chair of the Nobel Committee for Physics, said: "We can now open the door to the world of electrons. Attosecond physics gives us the opportunity to understand mechanisms that are governed by electrons. The next step will be utilizing them."
The phenomenon has potential applications in many fields. In electronics, for example, it is important to understand and control how electrons behave in a material. Attosecond pulses can also be used to identify different molecules in various areas, including medical diagnostics.
Current affiliations of the 2023 physics laureates, according to the Nobel Prize website:
Pierre Agostini: The Ohio State University (OSU), Columbus, USA
Ferenc Krausz: Max Planck Institute of Quantum Optics (MPQ) and Ludwig Maximilian University of Munich (LMU)
Anne L'Huillier: Lund University
X-ray free-electron lasers (XFELs) first came into existence two decades ago. They have since enabled pioneering experiments that “see” both the ultrafast and the ultrasmall. Existing devices typically generate short and intense x-ray pulses at a rate of around 100 x-ray pulses per second. But one of these facilities, the Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory in California, is set to eclipse this pulse rate. The LCLS Collaboration has now announced “first light” for its upgraded machine, LCLS-II. When it is fully up and running, LCLS-II is expected to fire one million pulses per second, making it the world’s most powerful x-ray laser.
The LCLS-II upgrade signifies a quantum leap in the machine’s potential for discovery, says Robert Schoenlein, the LCLS’s deputy director for science. Now, rather than “demonstration” experiments on simple, model systems, scientists will be able to explore complex, real-world systems, he adds. For example, experimenters could peer into biological systems at ambient temperatures and physiological conditions, study photochemical systems and catalysts under the conditions in which they operate, and monitor nanoscale fluctuations of the electronic and magnetic correlations thought to govern the behavior of quantum materials.
The XFEL was first proposed in 1992 to tackle the challenge of building an x-ray laser. Conventional laser schemes excite large numbers of atoms into states from which they emit light. But excited states with energies corresponding to x-ray wavelengths are too short-lived to build up a sizeable excited-state population. XFELs instead rely on electrons traveling at relativistic speed through a periodic magnetic array called an undulator. Moving in a bunch, the electrons wiggle through the undulator, emitting x-ray radiation that interacts multiple times with the bunch and becomes amplified. The result is a bright x-ray beam with laser coherence.
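The wiggling motion sets the emitted wavelength through the standard undulator equation (a textbook relation, not specific to any one facility): for undulator period λ_u, electron Lorentz factor γ, deflection parameter K, and observation angle θ from the axis,

```latex
\lambda_{\text{x-ray}} \;=\; \frac{\lambda_u}{2\gamma^2}
\left( 1 + \frac{K^2}{2} + \gamma^2 \theta^2 \right).
```

The relativistic factor 2γ² compresses a centimeter-scale magnet period down to nanometer or angstrom wavelengths on axis, which is why a multi-GeV accelerator is needed to reach hard x rays.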
The first XFEL was built in Hamburg, Germany, in 2005. Today that XFEL emits “soft” x-ray radiation, which has wavelengths as short as a few nanometers. LCLS switched on four years later and expanded XFEL’s reach to the much shorter wavelengths of “hard” x rays, which are essential to atomic-resolution imaging and diffraction experiments. These and other facilities that later appeared in Japan, Italy, South Korea, Germany, and Switzerland have enabled scientists to probe catalytic reactions in real time, solve the structures of hard-to-crystallize proteins, and shed light on the role of electron–phonon coupling in high-temperature superconductors. The ability to record movies of the dynamics of molecules, atoms, and even electrons also became possible because x-ray pulses can be as short as a couple of hundred attoseconds.
The upgrades to LCLS offer a new mode of XFEL operation, in which the facility delivers an almost continuous x-ray beam in the form of a megahertz pulse train. For the original LCLS, the pulse rate, which maxed out at 120 Hz, was set by the properties of the linear accelerator that produced the relativistic electrons. Built out of copper, a conventional metal, and operated at room temperature, the accelerator had to be switched on and off 120 times per second to avoid heat-induced damage. In LCLS-II some of the copper has been replaced with niobium, which is superconducting at the operating temperature of 2 K. Bypassing the damage limitations of copper, the dissipationless superconducting elements allow an 8000-fold gain in the maximum repetition rate. The new superconducting technology is also expected to reduce “jitter” in the beam, says LCLS director Michael Dunne. Greater stability and reproducibility, higher repetition rate, and increased average power “will transform our ability to look at a whole range of systems,” he adds.
LCLS-II is a boon for time-resolved chemistry-focused experiments, says Munira Khalil, a physical chemist at the University of Washington in Seattle. Khalil, a user of LCLS, plans to take advantage of the photon bounty of the rapid-fire pulses to perform dynamical experiments. She hopes such experiments may fulfill “a chemist’s dream”: real-time observations of the coupled motion of atoms and electrons. With extra photons, scientists could also probe dilute samples, potentially shedding light on how metals bind to specific sites in proteins—a process relevant to the function of half of all of nature’s proteins.
The megahertz pulse rate also means that experiments that previously took days to perform could now be completed in hours or minutes, says Henry Chapman of the Center for Free Electron Laser Science at DESY, Germany. At LCLS and later at Hamburg’s XFEL, Chapman ran pioneering experiments to determine the structures of proteins. The method he used, called serial crystallography, involves merging the diffraction patterns of multiple samples sequentially injected into the XFEL’s beam. Serial crystallography has allowed scientists to determine the structures of biologically relevant proteins that form crystals too small to study with conventional crystallography techniques. Chapman says that the increased throughput enabled by LCLS-II will permit much more ambitious experiments, such as measurements of biomolecular reactions on timescales from femtoseconds to milliseconds. “One could also think of an ‘on the fly’ analysis that feeds back into the experiment to discover optimal conditions for drug binding or catalysis,” he says.
For Khalil, the dramatic speedup of the experiments is a key advance of LCLS-II, as she thinks it will make these kinds of experiments accessible to a wider group of people. Until now, she says, XFEL facilities were mostly used by people who had the opportunity to work extensively at XFELs as postdocs or graduate students. Many more experimenters should now be able to enter the XFEL arena, she says. “It’s an exciting time for the field.”
–Matteo Rini
Matteo Rini is the Editor of Physics Magazine.
When light and matter interact, quasiparticles called polaritons can form. Polaritons can change a material’s chemical reactivity or its electronic properties, but the details behind how those changes occur remain unknown. A new finding by Rakesh Arul and colleagues at the University of Cambridge could help change that [1, 2]. The team has uncovered an interaction regime where the light–matter coupling strength significantly exceeds that seen in previous experiments. Arul says that the discovery could improve researchers’ understanding of how polaritons induce material changes and could allow the exploration of a wider range of these quasiparticles.
Light–matter coupling experiments are typically performed using molecules trapped in an optical cavity. Researchers have explored the coupling of such systems to visible and microwave radiation, but probing the wavelengths in between has been tricky because of difficulties in detecting polaritons created using infrared light. Midinfrared wavelengths are interesting because they correspond to the frequencies of molecular vibrations, which can control a material’s optoelectronic properties.
To study this regime, Arul and his colleagues developed a technique to shift the wavelengths of midinfrared polaritons to the visible range, where detectors can better pick them up. Rather than molecules in a cavity, they coupled light to molecules atop a metallic grating. When irradiated with a visible laser beam, polaritons generated by the infrared radiation imprinted a signal on the scattered visible light, allowing polariton detection.
Using their technique, the team found a light–matter coupling strength at specific locations that was 350% higher than the average for the whole system. Researchers have been exploring the interaction of light with metallic gratings since the time of Lord Rayleigh, Arul says. “The system is still delivering surprises.”
–Katherine Wright
Katherine Wright is the Deputy Editor of Physics Magazine.
Transferring quantum information between widely separated locations is necessary to develop a global quantum network. This project is hindered by the high photon loss inherent to long-distance fiber-based transmission—the default for photonic qubits. To get around this problem, researchers have demonstrated the transmission of a quantum signal via satellite instead (see Viewpoint: Paving the Way for Satellite Quantum Communications). Now Sumit Goswami of the University of Calgary, Canada, and Sayandip Dhara of the University of Central Florida show how quantum information could be relayed over large distances by a network of such satellites [1].
The scheme involves a train of satellites in low-Earth orbit, each equipped with a pair of reflecting telescopes. A given satellite would receive a photonic qubit using one telescope and transmit the qubit onward using the other. The satellite train would effectively bend photons around Earth’s curvature while controlling photon loss due to beam divergence. The team likens the scheme to a set of lenses on an optical table.
Simulations of satellites 120 km apart with 60-cm-diameter telescopes showed that beam-divergence loss vanished. Over a distance of 20,000 km, total losses—primarily reflection loss but also alignment and focusing errors—could be kept orders of magnitude below the loss incurred by just a few hundred kilometers of optical fiber. Ultrahigh-reflectivity telescope mirrors could further decrease this loss.
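Why divergence loss can vanish follows from Gaussian-beam optics: if each telescope refocuses the beam so that its waist sits midway to the next satellite, the optimal waist radius for a half-hop of length z is w0 = sqrt(λz/π), and the beam radius reaching the next mirror is only √2·w0. A back-of-envelope sketch (the 120-km spacing and 60-cm apertures come from the paper; the 1550-nm telecom wavelength is an assumed value, and this is a generic estimate, not the authors' simulation):

```python
import math

WAVELENGTH = 1.55e-6   # m, assumed telecom-band photons
SPACING    = 120e3     # m, satellite separation
APERTURE   = 0.60      # m, telescope diameter

# Each telescope refocuses the beam so its waist lies midway to the next
# satellite; the waist that minimizes the spot at range z is w0 = sqrt(lambda*z/pi),
# for which z equals the Rayleigh range and the beam radius there is w0*sqrt(2).
z = SPACING / 2
w0 = math.sqrt(WAVELENGTH * z / math.pi)
w_at_telescope = w0 * math.sqrt(2)

print(f"optimal waist radius : {w0:.3f} m")
print(f"beam radius at mirror: {w_at_telescope:.3f} m")
print(f"fits 60-cm aperture? : {2 * w_at_telescope < APERTURE}")
```

With these numbers the beam diameter arriving at each telescope stays below the 60-cm aperture, so essentially no light spills past the mirror, which is the sense in which divergence loss "vanishes" and only reflection and alignment losses remain.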
Goswami and Dhara examined two protocols based on such a setup. In one, two entangled photons are transmitted in opposite directions from a satellite-based source. In the other, qubits are transmitted unidirectionally, with source and detector both on the ground. Despite the effects of atmospheric turbulence, the latter performed well and has the advantage of keeping the necessary quantum hardware on Earth.
–Rachel Berkowitz
Rachel Berkowitz is a Corresponding Editor for Physics Magazine based in Vancouver, Canada.
Ultracold atoms trapped in an optical lattice—a periodic array of laser-produced trapping sites—could potentially be used to perform quantum computations and should be scalable, according to experts. But until now researchers had failed to accomplish a critical step: quantum mechanically entangling more than two atoms at a time. Now Jian-Wei Pan of the University of Science and Technology of China and his colleagues have entangled one-dimensional chains of ten atoms and two-dimensional groups of eight atoms with high reliability [1]. The team has also demonstrated control and imaging of the states of the atoms with single-atom resolution [1]. The results show that several of the required building blocks needed for optical-lattice-based quantum processors are now practical.
Previously, Pan and his colleagues had entangled pairs of atoms in a system containing over 2000 rubidium atoms [2]. In that experiment they used a two-dimensional lattice that had two trapping locations at each lattice site—a so-called superlattice. To entangle larger numbers of atoms, the researchers again used the superlattice, along with additional technologies: a so-called quantum gas microscope and three digital micromirror devices—spatial light modulators with high spatial precision. These tools provided the team with the single-atom resolution needed to create and then verify the simultaneous entanglement of groups of up to ten atoms within the set of 100 or so atoms involved in these experiments. Single-atom resolution will ultimately be necessary for quantum computers, so that devices can manipulate individual qubits and read their values.
–David Ehrenstein
David Ehrenstein is a Senior Editor for Physics Magazine.