CERN News


CERN surprises Swiss expo visitors

Tue, 11/21/2017 - 11:27

In the middle of the exhibition hall, visitors to the grand fair in Geneva could immerse themselves in the underground world of science, visiting the large LHC accelerator and one of its experiments. (Image: Julien Ordan, Maximilien Brice/CERN)

In between a fondue tasting and trying out a reclining armchair, visitors to Geneva’s big annual fair, the Automnales, had the chance to immerse themselves for a few minutes in the world of fundamental science. CERN was the guest of honour at this unmissable regional event, held from 10 to 19 November. It was an excellent opportunity for us to go out and meet members of the public who might otherwise never have thought to visit a research laboratory.

The CERN stand was designed to resemble a particle collision, at the centre of which visitors could take a virtual-reality tour. Around it, workshops, film screenings, games and interactive screens were on offer. CERN volunteers explained the research, its technologies and its applications to visitors. (Image: Maximilien Brice, Julien Ordan)

Some 172 volunteers from CERN presented science in an entertaining way to visitors at the Automnales. Around 145 000 people attended the fair, and most of them stopped at the CERN stand, which occupied a prime location in the middle of the hall.

The stand covered 1000 square metres and was designed to resemble a particle collision. Guests young and old had the chance to take a virtual-reality tour of the Large Hadron Collider (LHC) and one of its detectors, to learn how to conduct physics experiments using household objects, to marvel at shows about electricity and extreme cold, to play proton football, to programme robots and to travel through a tunnel taking them back in time to the Big Bang. Most importantly, they met CERN’s enthusiastic researchers, engineers, technicians and administrative employees, all delighted to share their passion for research. In short, ten days of enriching discoveries, meetings and exchanges.

For more information and pictures of the event, read this article.

For ten days, CERN was guest of honour at the Automnales, the annual trade fair in Geneva. (Video: Jacques Fichet/CERN)

 

50 years since iconic 'A Model of Leptons' published

Mon, 11/20/2017 - 10:29

This event display shows the real tracks produced in the 1200-litre Gargamelle bubble chamber that provided the first confirmation of a neutral-current interaction. (Image: CERN)

Fifty years ago today, Steven Weinberg published the iconic paper A Model of Leptons, which reveals the profound link between mathematics and nature. This paper lies at the core of the Standard Model, our most complete theory of how particles interact in our universe.

Just two pages long, Weinberg’s elegant and simply written theory was revolutionary at the time, yet it was virtually ignored for many years. Now, however, it is cited at least three times a week.

The paper uses the idea of symmetry between particles called pions – the notion that everything in our universe has a corresponding mirror image – to build Weinberg’s theory of the fundamental forces.

From 1965, Weinberg had been building a mathematical structure and theorems based on this symmetry, explaining why physicists had observed certain interactions between pions and nucleons and how pions behave when they are scattered off one another. This paved the way for a complete theory of hadronic physics at low energy.

“It’s what keeps you going as a theoretical physicist to hope that one of your squiggles will turn out to describe reality.”
Steven Weinberg, Nobel prize winner and author of A Model of Leptons

Physicists had been using the concept of symmetry since the 1930s, but had not yet been able to unite the electromagnetic and weak forces. Uniting the two would bring physicists closer to a single theory describing how and why all the fundamental interactions in our universe occur. The mathematics required the particles carrying these two forces to be massless, but Weinberg and other physicists knew that if these particles really created the forces seen in nature, they had to be very heavy.

One day, as the 34-year-old Weinberg was driving his red Camaro to work, he had a flash of insight – he had been looking for massless particles in the wrong place. He applied his theory to a rarely mentioned and often disregarded particle, the massive W boson, and paired it with a massless photon. Theorists accounted for the mass of the W by introducing another, unseen mechanism. This later became known as the Higgs mechanism, which calls for the existence of a Higgs boson.

Proving the validity of Weinberg’s theory inspired one of the biggest experimental science programmes ever seen, and CERN has built major projects with these discoveries at their heart: the Gargamelle bubble chamber found the first evidence of the electroweak neutral current in 1973; the Super Proton Synchrotron provided the first evidence of the W boson in 1983; and, most recently, the Large Hadron Collider confirmed the existence of the Higgs boson in 2012.


Steven Weinberg visiting the ATLAS collaboration in 2009. (Image: Maximilien Brice/CERN)

Speaking to the CERN Courier, Weinberg, now 84, describes what it’s like to see his work confirmed: “It’s what keeps you going as a theoretical physicist to hope that one of your squiggles will turn out to describe reality.” He received the Nobel Prize for this iconic, game-changing theory in 1979.

Half a century after this publication, it’s hard to find a theory that explains fundamental physics as clearly as Weinberg’s, which brought together all the different pieces of the puzzle and assembled them into one, very simple idea.

 

Read more about the original theory, and an interview with Steven Weinberg, in this month’s CERN Courier.

Record luminosity: well done LHC

Mon, 11/13/2017 - 14:56

View of the LHC tunnel. (Image: Maximilien Brice/CERN)

It’s the end of the road for the protons this year after a magnificent performance from the Large Hadron Collider (LHC). On Friday, the final beams of the 2017 proton run circulated in the LHC. The run ended, as it does every year, with a round-up of the luminosity performance – the indicator by which the effectiveness of a collider is measured, and one on which the operators keep a constant eye.

The LHC has far exceeded its target for 2017. It has provided its two major experiments, ATLAS and CMS, with 50 inverse femtobarns of data, i.e. around 4000 million million collisions. The inverse femtobarn (fb-1) is the unit used to measure integrated luminosity, or the cumulative number of potential collisions over a given period.
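The rule of thumb behind these figures can be sketched in a few lines. This is a rough illustration, not the experiments' actual accounting: the inelastic proton-proton cross-section of about 80 millibarns at 13 TeV is an assumed round figure (the number of collisions is the integrated luminosity multiplied by the cross-section).

```python
# Rough sketch: converting integrated luminosity (fb^-1) to a collision count.
# Assumption: inelastic pp cross-section ~80 millibarns at 13 TeV (round figure).

SIGMA_INELASTIC_MB = 80.0   # assumed cross-section, in millibarns
FB_PER_MB = 1e12            # 1 millibarn = 10^12 femtobarns

def collisions(integrated_lumi_fb_inv):
    """Potential collisions for a given integrated luminosity in inverse femtobarns."""
    return integrated_lumi_fb_inv * SIGMA_INELASTIC_MB * FB_PER_MB

print(collisions(1.0))   # about 80 million million (8e13) collisions per fb^-1
print(collisions(50.0))  # the 2017 dataset: about 4e15 collisions
```

With these assumptions, one inverse femtobarn corresponds to roughly 80 million million collisions, matching the conversion quoted elsewhere in this feed.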

This result is all the more remarkable because the machine experts had to overcome a serious setback. A vacuum problem in the beam pipe of a magnet cell limited the number of bunches that could circulate in the machine. Several teams were brought in to find a solution. Notably, the arrangement of the bunches in the beams was changed. After a few weeks, luminosity started to increase again.

At the same time, over the course of the year, the operators have optimised the operating parameters. Using a new system put in place this year, they have notably reduced the size of the beams when they meet at the centre of the experiments. The more squeezed the beams, the more collisions occur each time they meet. Last year, the operators managed to obtain 40 collisions at each bunch crossing, with each bunch containing 100 billion particles. In 2017, up to 60 collisions were produced at each crossing.

Thanks to these improvements, the instantaneous luminosity record was smashed, reaching 2.06 x 10^34 cm^-2 s^-1, or twice the nominal value. Instantaneous luminosity corresponds to the potential number of collisions per second.
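To put the record in perspective, multiplying an instantaneous luminosity by a cross-section gives a collision rate. A minimal sketch, again assuming an inelastic proton-proton cross-section of about 80 millibarns (an assumed round figure, not taken from this article):

```python
# Collision rate = instantaneous luminosity x cross-section.
L_INST = 2.06e34       # record instantaneous luminosity from the article, cm^-2 s^-1
SIGMA_CM2 = 80.0e-27   # assumed inelastic pp cross-section: 80 mb = 80e-27 cm^2

rate_per_second = L_INST * SIGMA_CM2   # potential collisions per second
print(rate_per_second)                 # roughly 1.6e9: over a billion per second
```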

The LHC will continue to operate for another two weeks, with two special runs and a week of machine studies. The first special run will consist of carrying out proton collisions at 5.02 TeV (as opposed to the usual 13 TeV), the same energy as that planned for next year’s lead-ion runs. This will enable physicists to collect data with protons, which they will then be able to compare with the lead-ion data.

The second special run, at very low luminosity, will provide data for the TOTEM and ATLAS/ALFA experiments. These two experiments use detectors located on either side of two large LHC detectors: CMS in the case of TOTEM and ATLAS in the case of ATLAS/ALFA. They study interactions called elastic scattering, where two protons merely change direction slightly when they interact, rather than colliding. For these studies, the LHC makes the beams as wide as possible. What’s more, the energy will be limited to 450 GeV, i.e. the energy at which beams are injected from the accelerator complex into the LHC.

Finally, the operators will carry out a “machine development” campaign. Over a week, they will perform operating tests to improve the accelerator’s performance still further (it can never be too good) and begin to prepare the High-Luminosity LHC, which will take over from the LHC after 2025.

When these tests are over, the operators will stop the machine for the year-end technical shutdown. 

Graphs showing the integrated luminosity of the LHC in 2017. The unit is the inverse femtobarn. The green squares represent the achieved luminosity, while the blue line shows the planned luminosity. (Image: CERN)

Combatting cancer in challenging environments

Thu, 11/09/2017 - 15:59

World map showing access to radiotherapy treatment centres and the shortfall of more than 5000 radiotherapy machines in low- to middle-income countries. (Image: IAEA, AGaRT)

If you live in a low- or middle-income country, your chances of surviving cancer are significantly lower than if you live in a wealthier economy, and that’s largely due to the availability of radiation therapy.

A group of international experts in the fields of accelerator design, medical physics and oncology recently met at CERN to try to solve the technical problem of designing a robust linear accelerator (linac) that can be used in more challenging environments. 

Between 2015 and 2035, the number of cancer diagnoses worldwide is expected to increase by 10 million, with around 65% of those cases in poorer economies. 

It’s estimated that 12 600 new radiotherapy treatment machines will be needed to treat those patients. 

“We need to develop a machine that provides state-of-the-art radiation therapy in situations where the power supply is unreliable, the climate is harsh or communications are poor,” explains Manjit Dosanjh, senior advisor for CERN medical applications. “We need to avoid a linac of sub-standard quality that would not only provide lower-quality treatment but would be a disincentive for the recruitment and retention of high-quality staff.”

Limiting factors to the development and implementation of radiotherapy in lower-resourced nations don’t just include the cost of equipment and infrastructure, but also a shortage of trained personnel to properly calibrate and maintain the equipment and to deliver high-quality treatment. The plan is to design a medical accelerator that is affordable, easy to operate and maintain, and robust enough to be used in areas where these operational challenges might occur.

“I grew up in Australia, where the distances to hospitals can be vast, the climate can be harsh and local access to medical experts can quite literally be the difference between life and death,” explains accelerator physicist Suzie Sheehy from the University of Oxford and the Science and Technology Facilities Council (STFC). “In this project, the challenges in different environments will be extremely varied, but it seems obvious to me that those of us on the cutting-edge of research in particle accelerators should rise to the challenge of re-designing systems to make them more available to those who need them. I see this as a challenge and an opportunity to take my research into spaces where it is most needed.”

Jointly organised by CERN, the International Cancer Expert Corps (ICEC) and STFC, the workshop at CERN from 26 to 27 October was funded through the UK’s Global Challenges Research Fund, enabling key participants from Botswana, Ghana, Jordan, Nigeria and Tanzania to share their grass-roots perspectives. Understanding the in-country challenges will improve the effectiveness of the technology under design. Zubi Zubizaretta of the International Atomic Energy Agency (IAEA) also presented the results of the 2017 IAEA Radiation Therapy survey.

This workshop followed on from the inaugural workshop in November 2016, and a future ICEC workshop will look at the education and training requirements for the estimated 130 000 local staff (oncologists, medical physicists and technicians) who will be needed to operate the treatment machines and deliver patient care.

This ambitious project aims to have facilities and staff available to treat patients in low- and middle-income countries within 10 years.

Marie Skłodowska-Curie: more alive today than ever!

Tue, 11/07/2017 - 08:14

Exactly 150 years ago, on 7 November 1867, Marie Skłodowska was born in Warsaw in Poland. A century and a half later, the name Marie Skłodowska-Curie is associated not only with this double-Nobel-prizewinning scientific luminary, but also with a whole community of European scientists: the Marie Skłodowska-Curie fellows.

Since the programme was introduced by the European Commission in 1990, the Marie Skłodowska-Curie fellowships have benefitted more than 100 000 scientists at all stages of their careers (from doctoral students to experienced researchers). Above all, the programme aims to promote international and interdisciplinary mobility and excellence in research across all fields.

Since 2004, the Marie Skłodowska-Curie programme has enabled more than 490 fellows to continue their studies at CERN, usually for a period of two years. At present, 134 participants in the programme are spread across various departments of the Laboratory.

The aim of the Marie Skłodowska-Curie programme fits perfectly with CERN’s training mission. Several hundred undergraduate, doctoral and post-doctoral students have already benefited from CERN’s exceptional scientific environment and the know-how of its researchers, and the Marie Skłodowska-Curie programme has played a key role in making this happen. No doubt Marie Skłodowska-Curie herself would be proud of this success.

 

______________________________________

The Marie Skłodowska-Curie programme through the eyes of its fellows

 

Alessandra Gnecchi has been a Marie Skłodowska-Curie fellow in CERN’s Theoretical Physics department since April 2017. She is currently working on black holes in supersymmetric theories.

Alessandra Gnecchi. (Image: Julien Ordan/CERN)

 

“Marie Curie was the first female role model of the modern scientific era – I have a particular attachment to her. I was a young girl in the 1990s and read the book "Madame Curie", which made me decide to become a scientist.

Today, the Marie Skłodowska-Curie Fellowship Programme allows scientists to study more than one research topic, which is very important. It demonstrates moreover that the recipient was able to write a challenging proposal. Because of these aspects, this fellowship could allow my career to become highly visible and productive, and it is up to me now to exploit this opportunity.”


Roberto Cardella. (Image: Julien Ordan/CERN)

Roberto Cardella has been a Marie Skłodowska-Curie fellow in CERN’s Experimental Physics department since September 2016. He is currently working on the upgrade for the inner tracker of the ATLAS experiment.

“My Marie Skłodowska-Curie Fellowship falls under an ITN (Innovation Training Network) called STREAM. In our consortium, there are currently 17 fellows, spread all over Europe, working on related topics. It is inspiring to work on an innovative topic and to collaborate with other students.

This programme is a great opportunity for my career. Being part of a training network has already allowed me to get in touch with many institutes all over Europe. I am learning a lot from my colleagues here at CERN and the periodic meetings with the other students and partners in STREAM allow me to broaden my view of this field.”


Anna Stakia. (Image: Sophia Bennett/CERN)

Anna Stakia has been a Marie Skłodowska-Curie fellow in CERN’s Experimental Physics department since May 2016. She is currently working on New Physics searches and Machine Learning.

“Marie Skłodowska-Curie is without a doubt one of the most eminent figures in physics, and in science in general. I feel honoured to be part of a programme that carries her name and I am personally incredibly inspired by it.

To me, the strongest aspect of the Marie Skłodowska-Curie Programme is that it offers a broad variety of sub-academic and training options through which a fellow can navigate. In this way, any strict academic barriers are overcome. At the same time, the mobility opportunities enhance the fruitful interaction of students not only with researchers and working environments in foreign countries, but also among themselves, thus creating a fertile field for collaboration, which expands their research horizons, accelerates their progress and boosts their career potential.”

 

______________________________________

Today, to mark 150 years since the birth of Marie Skłodowska-Curie, CERN, the University of Liverpool and the Ludwig Maximilian University of Munich are organising a series of events for the scientific community and the general public. For more information, visit the event website.

From 3 pm (CET), watch the presentations at the University of Liverpool and at CERN.

To go further, read the article published in July on Marie Curie's granddaughter's visit to CERN.

How much does a kilogram weigh?

Thu, 11/02/2017 - 15:49

The NIST-4 watt balance at the National Institute of Standards and Technology has measured the Planck constant to within 13 parts per billion, a precision sufficient for it to take part in the redefinition of the kilogram. (Image: J. L. Lee/NIST)

The kilogram no longer weighs one kilogram. This disconcerting news was announced during a seminar at CERN on Thursday 26 October by Klaus von Klitzing, who received the 1985 Nobel Prize in Physics for the discovery of the quantum Hall effect. “We are about to witness a revolutionary change in the way the kilogram is defined,” he said.

The kilogram, the unit of mass, belongs with six other units (the metre, the second, the ampere, the kelvin, the mole and the candela) to the International System of Units (SI), which serves as the reference for expressing any measurable object or natural phenomenon in numbers. The current definition of the kilogram is based on a small cylinder of platinum-iridium alloy, nicknamed the “grand K”, which weighs exactly one kilogram. The cylinder was made in 1889 and has since been kept safe, protected by three glass bell jars in a high-security vault on the outskirts of Paris. But there is a problem: the kilogram prototype is losing weight. According to the latest check, it has lost around 50 micrograms. Those 50 micrograms are enough for the prototype to no longer be identical, as it originally was, to its copies kept in laboratories around the world.

To solve this weighty problem, scientists have been looking for a new way to define the kilogram.

At the quadrennial General Conference on Weights and Measures in 2014, the metrology community officially decided to redefine the kilogram in terms of the Planck constant (h), a quantity from quantum mechanics that relates a particle’s energy to its frequency and, through Einstein’s equation E = mc2, to its mass. The Planck constant is one of the fundamental numbers of our Universe, a quantity whose value is universally fixed in nature, like the speed of light or the electric charge of a proton.
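The chain of relations the new definition relies on, E = hv combined with E = mc2 (hence m = hv/c2), can be written out directly. A sketch using the constants involved: the numerical value of h shown here is the exact value eventually fixed in the SI, and the helper name is illustrative.

```python
# Mass from the Planck constant, via E = h*nu = m*c^2.
H_PLANCK = 6.62607015e-34   # Planck constant, J*s (the value later fixed exactly in the SI)
C_LIGHT = 299792458.0       # speed of light, m/s (exact by definition)

def mass_from_frequency(nu_hz):
    """Mass equivalent of energy E = h*nu, i.e. m = h*nu / c^2, in kilograms."""
    return H_PLANCK * nu_hz / C_LIGHT**2

# The frequency whose energy equivalent is one kilogram is enormous (~1.4e50 Hz):
nu_one_kg = C_LIGHT**2 / H_PLANCK
print(mass_from_frequency(nu_one_kg))  # ~1.0 kg
```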

K20, a replica of the kilogram prototype and a national prototype, kept by the United States government at NIST in Bethesda, Maryland. (Image: NIST)

An exact fixed value will be determined for the Planck constant, based on the best measurements obtained around the world. The kilogram will then be redefined through the relationship between the Planck constant and mass.

“There is no need to worry,” Klaus von Klitzing reassures us. “The kilogram will be redefined in a way that changes (almost) nothing in our daily lives. Its value will not become more precise; it will, however, become more stable and more universal.”

The redefinition process is not as simple as it might seem, however. The International Bureau of Weights and Measures, the body responsible for managing international agreements on measurement, has imposed strict requirements on the procedure: three independent experiments measuring the Planck constant must agree on the value derived from it for the kilogram, with an uncertainty of less than 50 parts per billion (ppb), and less than 20 ppb for at least one of them. Fifty parts per billion is equivalent, in this case, to about 50 micrograms, roughly the weight of an eyelash.
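The tolerance arithmetic is a pure unit conversion and easy to check directly; a one-line sketch:

```python
# 50 parts per billion of a one-kilogram mass, expressed in micrograms.
PPB = 1e-9                 # one part per billion
MICROGRAMS_PER_KG = 1e9

uncertainty_kg = 50 * PPB * 1.0                 # 50 ppb of 1 kg
print(uncertainty_kg * MICROGRAMS_PER_KG)       # 50 micrograms, roughly an eyelash
```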

Two types of experiment have proved capable of linking the Planck constant to mass with such exceptional precision. One method, pursued by an international team known as the Avogadro Project, consists of counting the atoms contained in a sphere of silicon-28 with exactly the same weight as the kilogram prototype. The second method requires a special kind of balance, known as a watt balance. It works by balancing electromagnetic forces against a mass, with the mass used for the test calibrated to match the kilogram prototype exactly.

This is where the major discovery made by Klaus von Klitzing in 1980, which earned him the Nobel Prize in Physics, comes into play. To obtain very high-precision measurements of the current and voltage of the electromagnetic forces in the watt balance, scientists use two universal constants. The first is the von Klitzing constant, which is known with extreme precision and can in turn be defined in terms of the Planck constant and the charge of the electron. The von Klitzing constant describes how resistance is quantised in a quantum-mechanical phenomenon known as the quantum Hall effect, observed when electrons are confined in an extremely thin metal layer at low temperatures and in strong magnetic fields.

“This really is a major revolution,” concludes Klaus von Klitzing. “In fact, it has even been called the greatest revolution in metrology since the French Revolution, when the first international system of units was introduced by the French Academy of Sciences.”

CERN is taking part in this revolution: the Laboratory has joined a metrology project, initiated by the Swiss metrology institute METAS, to build a watt balance that will play a role in harmonising the new definition of the kilogram by providing extremely precise measurements of the Planck constant. CERN supplied a key element of the watt balance: the magnetic circuit needed to create the electromagnetic forces that will be balanced against the test mass. The magnet must be extremely stable during the measurement and provide a very homogeneous magnetic field.

From Earth to space: developing radiation-tolerant systems

Wed, 11/01/2017 - 10:09

CERN helps develop many new technologies that have applications both in particle physics experiments and in the aerospace industry. (Image: K. Anthony/CERN)

Aerospace engineering and particle physics might not at first seem like obvious partners. However, both fields have to deal with radiation and other extreme environments, posing stringent technological requirements that are often similar.

CERN operates testing facilities and develops qualification technologies for high-energy physics, which are also useful for the ground-testing and qualification of flight equipment. This opportunity is particularly attractive for miniaturised satellites called CubeSats, which are typically built from commercial off-the-shelf components, since using standard procedures to ensure radiation tolerance is expensive and time-consuming.

The CERN Latch-up Experiment Student Satellite (CELESTA) intends to develop a miniaturised and space-qualified version of RadMon, a radiation monitor developed at CERN, and to prove that CERN’s High-Energy Accelerator Mixed-Field Facility (CHARM) can be used to test whether products are suitable for low Earth orbit. CELESTA is being developed in collaboration with the University of Montpellier and this year was selected by ESA’s Fly Your Satellite! programme to be sent into orbit in 2018 or 2019.

Many other technologies and facilities link space and accelerator radiation. The TimePix detectors, which are USB-powered particle trackers, are already used by NASA aboard the International Space Station to monitor radiation doses accurately. Monte Carlo codes such as FLUKA and Geant4, which were developed and have been maintained by worldwide collaborations with strong support from CERN since their conception, have been used routinely to study the radiation environment of past, recent and future space missions.

Magnesium diboride (MgB2), the high-temperature superconductor that will be used for the innovative electrical-transmission lines of the High-Luminosity LHC, has also demonstrated its potential for future space missions. VESPER, the Very Energetic Electron Facility for Space Planetary Exploration Missions in Harsh Radiative Environments (part of the CERN Linear Electron Accelerator for Research (CLEAR) facility), is a high-energy electron beam line used for radiation testing and suitable for testing electronic components for operation in Jupiter’s environment.

These synergies were in the limelight during RADECS 2017, the latest annual conference on Radiation Effects on Components and Systems, held for the first time at CERN in October this year. The aim of the RADECS conference is to provide an annual European forum on the effects of radiation on electronic and photonic materials, devices, circuits, sensors and systems. This year’s theme was “From space to ground and below”, referring to the need for radiation-tolerant systems both in space, aeronautical and terrestrial applications and in underground particle physics experiments.

This text is based on an article that first appeared in the October 2017 issue of the CERN Courier.

Happy Dark Matter Day

Mon, 10/30/2017 - 19:35

A simulation of the large-scale structure of the Universe that shows density filaments in blue and places of galaxy formation in yellow. (Image: Zarija Lukic/Berkeley Lab)

This 31 October might be the spookiest Halloween yet. Scientists at CERN are celebrating the hidden sectors of our Universe as part of the international celebration of Dark Matter Day. Dark matter warps distant starlight and enables galaxies to rotate at unfathomable speeds, yet is completely invisible to traditional detectors. In fact, scientists only know that dark matter exists because of its massive gravitational pull on ordinary matter. To catch this stealthy stuff, scientists must become a little more creative with how they search for it.

Today, CERN scientists will be showcasing the tools and techniques they are using to uncover the true nature of dark matter. At 15.00 CET, researchers from the CAST experiment will explain, during a Facebook Live session, how the Sun could radiate these invisible particles. Later this evening, scientists will explore how and why they are looking for dark matter, during a public event in CERN’s Globe of Science and Innovation, webcast live from 20.00 CET.

Can’t make it to the CERN events? There are numerous organisations around the world hosting activities to celebrate Dark Matter Day and plenty of ways to engage with the hunt from home. Check out this TED-Ed animation, which explains how the LHC could create dark matter, or find out when the Phantom of the Universe show is being screened at a planetarium near you.

As you celebrate and explore unexplainable mysteries during your Halloween celebrations, remember, don’t be afraid of the dark!

LHC reaches 2017 targets ahead of schedule

Mon, 10/30/2017 - 16:59

Trillions of protons race around the LHC’s 27km ring in opposite directions more than 11,000 times a second, travelling at 99.9999991 per cent the speed of light. (Image: Max Brice and Julien Ordan/CERN)

Today, CERN Control Centre operators announced good news: the Large Hadron Collider (LHC) has successfully met its production target for 2017, delivering more than 45 inverse femtobarns* to the experiments.

This achievement was all the more impressive because it came ahead of schedule. The LHC still has 19 more days of proton collisions in which to provide physics data to the experiments. Yet earlier this year it looked unlikely that this target would be achieved: an issue had developed with a small group of magnets known as 16L2 that was affecting machine performance. Then, in early September, thanks to effective and creative collaboration between different teams around CERN, alternative ways of dealing with the technical issue were developed that brought the LHC and its injector chain back to top performance. In addition, at the end of September, the 2017 production run was shortened when special runs planned for 2018 were brought forward to 2017, putting yet more pressure on the operators to deliver in a smaller timeframe.

The LHC has outperformed its target for 2017, delivering more collisions than expected to LHC experiments.

Nonetheless, with the target met, as well as another recent milestone of reaching twice the design luminosity, the LHC has once again shown its excellence. Physicists are already looking ahead to upgrades decades in the future and the physics potential they bring. Today at CERN, scientists are gathering for a three-day workshop to review, extend and further refine understanding of the physics potential of the High-Luminosity LHC – the planned upgrade of the LHC – and even beyond.

In the more immediate future, once the main proton physics run ends this year, the LHC will have 15 days of special runs plus machine development before its winter shutdown begins on 11 December. This “Year-end technical stop” (YETS) will be used to consolidate and improve the machine ahead of its restart in spring 2018.

 

* The inverse femtobarn (fb-1) is the unit of measurement for integrated luminosity, indicating the cumulative number of potential collisions. One inverse femtobarn corresponds to around 80 million million collisions.

Xenon in the SPS: First tests for a photon factory

Fri, 10/27/2017 - 17:20

The Super Proton Synchrotron (SPS), pictured during a recent technical stop. (Image: Max Brice/CERN)

Accelerator operators can perform amazing acrobatics with particle beams, most recently in the Super Proton Synchrotron (SPS), CERN’s second-largest accelerator. For the first time, they have successfully injected a beam of partially ionised xenon particles into the SPS and accelerated it. Before they were injected into the SPS, these atoms were stripped of 39 of their 54 electrons.

During the first test, which took place in September, the beam was injected into the SPS ring and circulated for about one second. Now, the beam has been accelerated for the first time, reaching an energy of 81.6 gigaelectronvolts (GeV) per nucleon.

What makes this performance so remarkable is that these beams of partially ionised xenon atoms are extremely fragile and have a very short lifespan. If an atom loses just one of its 15 electrons, it changes orbit and is lost. “The SPS vacuum is not quite as high as that of the LHC. The residual gas molecules present in the vacuum chamber disturb the beam, which explains why it is lost quite quickly,” says Reyes Alemany, who is responsible for the SPS tests. “But keeping the beam going for one cycle in the SPS is already a very promising result!”

So why are accelerator physicists experimenting with these atoms? It’s to test a novel idea: a high-intensity source of gamma rays (photons with energies in the megaelectronvolt (MeV) range). This gamma factory, as it is known, would generate photons of up to 400 MeV in energy and at intensities comparable to those of synchrotrons or X-ray free-electron lasers (XFELs). XFELs produce high-intensity beams of X-rays – that is, photons with energies below about 100 kiloelectronvolts (keV).

“A source of that kind would pave the way for studies never done before in fundamental physics, in the fields of quantum electrodynamics or dark matter research,” explains Witold Krasny, a CNRS physicist and CERN associate who founded the project and leads the working group. “It also opens the door for industrial and medical applications.” It could even serve as a test bench for a future neutrino factory or muon collider.

The principle is to accelerate partially ionised atoms and then excite them using a laser. As they return to their stable state, the atoms release high-energy photons.
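The energy gain behind this scheme comes from two relativistic Doppler boosts: the laser photon is blue-shifted in the ion’s rest frame, and the re-emitted photon is boosted again in the lab frame, for a maximum overall factor of about 4γ². The Lorentz factor and laser photon energy in this sketch are illustrative assumptions, not values from the article.

```python
# Maximum boost of a laser photon scattered off a relativistic ion:
# E_out <= ~4 * gamma**2 * E_laser (head-on absorption, forward emission).

def max_boosted_energy_ev(laser_photon_ev, gamma):
    """Upper bound on the lab-frame energy of the re-emitted photon, in eV."""
    return 4.0 * gamma**2 * laser_photon_ev

gamma_ion = 2900.0  # assumed Lorentz factor, of the order of LHC heavy ions
laser_ev = 1.2      # assumed near-infrared laser photon (~1 micron)

e_out_mev = max_boosted_energy_ev(laser_ev, gamma_ion) / 1e6
print(f"up to ~{e_out_mev:.0f} MeV from an eV-scale laser photon")
```

Reaching the 400 MeV quoted in the article would require higher Lorentz factors or more energetic laser photons than the assumed values here.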

The team took advantage of the presence of xenon in the accelerator complex to carry out this first test without disrupting the other ongoing physics programmes. Next year, during the LHC heavy-ion run, the team will repeat the experiment using ionised lead atoms, which will be stripped of all but one or two electrons. Those beams will be much more stable; having fewer electrons means that the atoms are less at risk of losing them. In addition, their electrons are only found in the “K” shell, the closest to the nucleus, and therefore have a stronger link to the nucleus than in the xenon atoms. The heavy-ion beams could be accelerated first in the SPS and then in the LHC.

The gamma factory project is part of the Physics Beyond Colliders study, which was launched in 2016 with the goal of investigating all possible non-collider experiments, particularly those that could be done using CERN’s accelerator complex. Hundreds of scientists are expected to attend the annual Physics Beyond Colliders conference at CERN at the end of November.

From the web to a start-up near you

Thu, 10/26/2017 - 15:01

The Timepix3 chip is a multipurpose hybrid pixel detector developed within the Medipix3 collaboration, with applications in medical imaging, education, space dosimetry and materials analysis. (Image: Max Brice/CERN)

Twenty years ago, in 1997, CERN set up a reinforced policy and team to support its knowledge- and technology-transfer activities. As a publicly funded Laboratory, CERN has a remit to ensure that its technology and expertise deliver prompt and tangible benefits to society wherever possible. Through novel developments in the field of accelerator technologies and detectors, and more recently in computing and digital sciences, CERN technologies and know-how have contributed to applications in many fields, including the World Wide Web, which was invented at CERN by Tim Berners-Lee in 1989.

Today, these activities are still going strong, and CERN’s influence has broadened. Its knowledge-transfer activities have had an impact on a wide range of fields, from medical and biomedical technologies to aerospace applications, safety and “industry 4.0”.

CERN’s expertise builds broadly on three technical fields: accelerators, detectors and computing. Behind these three pillars of technology lie many threads of technology and human expertise that translate into positive impacts on society in many different fields. (Image: G. Dorne/CERN)

Early activities at CERN relating to medical applications date back to the 1970s. In the 1990s, the Crystal Clear and Medipix collaborations started to explore the feasibility of developing the technologies used in the LHC detectors for possible medical applications, such as PET and X-ray imaging. The Proton-Ion Medical Machine Study (PIMMS) was launched at CERN with the aim of producing a synchrotron design optimised for treating cancer patients using protons and carbon ions. The initial design evolved into the machine built for the Italian National Centre for Oncological Hadrontherapy (CNAO). Later on, MedAustron in Austria built its treatment centre based on the CNAO design. For the past 50 years, CERN has hosted the ISOLDE facility. Over 1200 radioisotopes from more than 70 chemical elements have been made available for fundamental and applied research, including in the medical field. Today, activities pertinent to medical applications are happening throughout CERN and, in June 2017, the CERN Council approved a document setting out the “Strategy and framework applicable to knowledge transfer by CERN for the benefit of medical applications”.

The first brazed RFQ module for medical applications. (Image: M. Brice/CERN)

CERN’s computing expertise is also finding applications in aerospace. To solve the challenge of sharing software and codes in big-data environments, researchers at CERN have developed a system called CernVM-FS (CERN Virtual Machine File System), which is currently used in high-energy physics experiments to distribute around 350 million files. The system is now also being used by Euclid, a European space mission that aims to study the nature of dark matter and dark energy, to deploy software in its nine science data centres.

“Industry 4.0” is a push towards increasing automation and efficiency in manufacturing processes with connected sensors and machines, autonomous robots and big-data technology. In the field of robotics, CERN has developed TIM (Train Inspection Monorail), a mini-vehicle that monitors the LHC tunnel autonomously while moving along tracks suspended from the ceiling, and can be programmed to perform real-time inspection missions. This innovation has already caught the eye of industry, where it could be used in particular for the autonomous monitoring of utilities infrastructure, such as underground water pipelines.

Since the early days of technology transfer, CERN has continued to build a general culture of entrepreneurship within the Organization through many different avenues. In order to assist entrepreneurs and small technology businesses in taking CERN technologies and expertise to the market, CERN has established a network of nine Business Incubation Centres (BICs) throughout its Member States. The BIC managers provide office space, expertise, business support, access to local and national networks and support in accessing funding. There are currently 18 start-ups and spin-offs using CERN technologies in their businesses, four of which joined BICs last year alone.

Many other interesting projects are also in the pipeline. CERN’s expertise in superconducting technologies can be used in MRI machines and gantries for hadrontherapy, while its skill at handling large amounts of data can benefit the health sector more widely. In other fields, detector technologies developed at CERN can be used in non-destructive testing techniques, while compact accelerators benefit the analysis of artwork. These are just some examples of the new projects we are working on, and more initiatives will be launched to meet the needs of industrial and research partners in CERN’s Member States and Associate Member States over the next 20 years and beyond.

This article is a condensed excerpt from a feature article published in the CERN Courier September 2017 issue, which you can read in full here. Find out more at kt.cern

Webcast: How does collaboration shape innovation?

Wed, 10/25/2017 - 11:14

IdeaSquare is an innovation hub at CERN (Image: Jean-Claude Gadmer/CERN)

Today,‪ IdeaSquare at CERN is hosting The Other Side of Innovation – a workshop to discover how collaboration across disciplines and rapid prototyping is shaping innovation.

Join us via webcast from 13:00 CEST.

Professor of technology and innovation at ETH Zürich, Stefano Brusoni, will give a talk on the divergent process of innovation, while Bruno Herbelin, deputy director of the Center for Neuroprosthetics at EPFL, will showcase how virtual reality can be used by researchers.

IdeaSquare is an innovation hub at CERN, which aims to bring together people from many fields, to generate new ideas and work on conceptual prototypes related to detector research in an open, collaborative environment.

It brings together CERN personnel, visiting students, and external project collaborators from the domains of research, technology development and education. It also contributes to CERN’s Knowledge Transfer Group, helping them to shape and innovate new product ideas into socially and globally relevant activities.

 

For more information, visit the event page.

Meet the DUNEs

Mon, 10/23/2017 - 14:31

Inside one of the protoDUNE detectors, currently under construction at CERN (Image: Max Brice/CERN)

A new duo is living in CERN’s test beam area. On the outside, they look like a pair of Rubik’s Cubes that rubbed a magic lamp and transformed into castle turrets. But on the inside, they’ve got the glamour of a disco ball.

These 12 m × 12 m × 12 m boxes are two prototypes for the massive detectors of the Deep Underground Neutrino Experiment (DUNE). DUNE, an international experiment hosted by Fermilab in the United States, will live deep underground and trap neutrinos: tiny fundamental particles that rarely interact with matter.

“Learning more about neutrinos could help us better understand how the early Universe evolved and why the world is made of matter and not antimatter,” said Stefania Bordoni, a CERN researcher working on neutrino detector development.

These DUNE prototypes are testing two variations of a detection technique first developed by Nobel laureate Carlo Rubbia. Each cube is a chilled thermos that will hold approximately 800 tonnes of liquid argon. When a neutrino bumps into an atom of argon, it will release a flash of light and a cascade of electrons, which will glide through the electrically charged chamber to detectors lining the walls.

Inside their reinforced walls sits a liquid-tight metallic balloon, which can expand and contract to accommodate the changing volume of the argon as it cools from a gas to a liquid.

Even though these cubes are huge, they are mere miniature models of the final detectors, which will be 20 times larger and together hold a total of 72 000 tonnes of liquid argon.

In the coming months, these prototypes will be cooled down so that their testing can begin using a dedicated beam line at CERN’s SPS accelerator complex.

ATLAS and CMS join forces to tackle top-quark asymmetry

Fri, 10/20/2017 - 09:29

Event display of a tt̄ event candidate in the 2015 data (Image: ATLAS/CERN)

In their hunt for new particles and phenomena lurking in LHC collisions, the ATLAS and CMS experiments have joined forces to investigate the top quark. As the heaviest of all elementary particles, weighing almost as much as an atom of gold, the top quark is less well understood than its lighter siblings. With the promise of finding new physics hidden amongst the top quark’s antics, ATLAS and CMS have combined their top-quark data for the first time.

There were already hints that the top quark didn’t play by the rules in data collected at the Tevatron collider at Fermilab in the US (the same laboratory that discovered the particle in 1995). Around a decade ago, researchers found that, when produced in pairs from the Tevatron’s proton-antiproton collisions, top quarks tended to be emitted in the direction of the proton beam, while anti-tops aligned in the direction of the antiproton beam. A small forward-backward asymmetry is predicted by the Standard Model, but the data showed the measured asymmetry to be tantalisingly bigger than expected, potentially showing that new particles or forces are influencing top-quark pair production.

“As physicists, when we see something like this, we get excited,” says ATLAS researcher Frederic Deliot. If the asymmetry is much larger than predicted, it means “there could be lots of new physics to discover.”


All matter around us is made of elementary particles called quarks and leptons. Each group consists of six particles, which are related in pairs, or “generations” – the up quark and the down quark form the first, lightest and most stable generation, followed by the charm quark and strange quark, then the top quark and bottom (or beauty) quark, the heaviest and least stable generation. (Image: Daniel Dominguez/CERN)

The forward-backward asymmetry measured at the Tevatron cannot be seen at the LHC because the LHC collides protons with protons, not antiprotons. But a related charge asymmetry, which causes top antiquarks to be produced preferentially in the centre of the LHC’s collisions while top quarks are emitted slightly more forward, can be measured. The Standard Model predicts the effect to be small (around 1%) but, as with the forward-backward asymmetry, it could be enhanced by new physics. The ATLAS and CMS experiments both measured the asymmetry by studying differences in the angular distributions of top quarks and antiquarks produced at the LHC at energies of 7 and 8 TeV.
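The charge asymmetry is defined from the difference of absolute rapidities of the top quark and antiquark in each event. A minimal sketch of the counting, using made-up toy rapidities:

```python
# Charge asymmetry A_C = (N(d>0) - N(d<0)) / (N(d>0) + N(d<0)),
# where d = |y_top| - |y_antitop| per event. A positive A_C means tops
# tend to sit at larger absolute rapidity than antitops.

def charge_asymmetry(y_top, y_antitop):
    deltas = [abs(t) - abs(tbar) for t, tbar in zip(y_top, y_antitop)]
    n_pos = sum(1 for d in deltas if d > 0)
    n_neg = sum(1 for d in deltas if d < 0)
    return (n_pos - n_neg) / (n_pos + n_neg)

# Toy rapidity values, invented purely for illustration:
y_t    = [1.5, -2.0, 0.3, 2.2, -1.1]
y_tbar = [0.5, -1.0, 0.8, 1.0, -0.2]
print(charge_asymmetry(y_t, y_tbar))  # 0.6 for this toy sample
```

The real measurements correct such raw counts for detector effects before comparing with the Standard Model prediction.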

Alas, individually and combined, their results show no deviation from the latest Standard Model calculations. These calculations have in fact recently been improved, and show that the predicted asymmetry is slightly higher than previously thought. This, along with improvements in data analysis, even brings the earlier Tevatron result into line with the Standard Model.

ATLAS and CMS will continue to subject the heavyweight top quark to tests at energies of 13 TeV to see if it deviates from its expected behaviour, including precision measurements of its mass and interactions with other Standard Model particles. But measuring the asymmetry will get even tougher, because the effect is predicted to be half as big at the higher energy. “It’s going to be difficult,” says Deliot. “It will be possible to explore using the improved statistics at higher energy, but it is clear that the space for new physics has been severely restricted.”

The successful combination of the charge-asymmetry measurements was achieved within the LHC top-quark physics working group, where scientists from ATLAS and CMS and theory experts work together intensively towards improving the interplay between theory and the two experiments, explains CMS collaborator Thorsten Chwalek. “Although the combination of ATLAS and CMS charge asymmetry results didn’t reveal any hints of new physics, the exercise of understanding all the correlations between the measurements was very important and paved the way for future ATLAS+CMS combinations in the top-quark sector.”

For more information:

ATLAS webpage

CMS webpage

ATLAS physics briefing

 

 

 

A more precise measurement for antimatter than for matter

Thu, 10/19/2017 - 08:30

Stefan Ulmer, spokesperson of the BASE collaboration, working on the experiment set-up. (Image: Maximilien Brice, Julien Ordan/CERN)

This week, the BASE collaboration published in Nature a new measurement of the magnetic moment of the antiproton, with a precision exceeding that of the proton. Thanks to a new method involving simultaneous measurements of two separately trapped antiprotons in two Penning traps, BASE has broken its own record, presented last January. The new result improves the precision of the previous measurement by a factor of 350 and allows matter and antimatter to be compared with unprecedented accuracy.

“This result is the culmination of many years of continuous research and development, and the successful completion of one of the most difficult measurements ever performed in a Penning trap instrument,” said BASE spokesperson Stefan Ulmer.

The results are consistent with the magnetic moments of the proton and antiproton being equal, with the experimental uncertainty of the new antiproton measurement now significantly smaller than that for the proton. The magnetic moment of the antiproton is found to be 2.792 847 344 1 (in units of the nuclear magneton), to be compared with the value of 2.792 847 350 that the same collaboration found for the proton in 2014, at BASE’s companion experiment in Mainz, Germany.
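To put the two numbers side by side, the fractional difference between the quoted antiproton and proton values can be computed directly; both figures are taken from the article.

```python
# Magnetic-moment magnitudes in units of the nuclear magneton, as quoted.
mu_antiproton = 2.7928473441  # BASE, antiproton (this measurement)
mu_proton     = 2.792847350   # BASE companion experiment, Mainz, 2014

relative_diff = abs(mu_proton - mu_antiproton) / mu_proton
print(f"relative difference ~ {relative_diff:.1e}")  # a few parts per billion
```

The two values thus agree to within a few parts per billion, consistent with the matter-antimatter symmetry expected from the Standard Model.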

“It is probably the first time that physicists have obtained a more precise measurement for antimatter than for matter, which demonstrates the extraordinary progress accomplished at CERN’s Antiproton Decelerator,” added Christian Smorra, first author of the study.

You can read the scientific paper here.

Drone footage of CERN's BASE experiment (Video: Noemi Caraban/CERN)
