Category Archives: Physics

CERN sets date for first attempt at 7 TeV collisions in the LHC

Geneva, 23 March 2010. With beams routinely circulating in the Large Hadron Collider at 3.5 TeV, the highest energy yet achieved in a particle accelerator, CERN has set the date for the start of the LHC research programme. The first attempt for collisions at 7 TeV (3.5 TeV per beam) is scheduled for 30 March.

“With two beams at 3.5 TeV, we’re on the verge of launching the LHC physics programme,” explained CERN’s Director for Accelerators and Technology, Steve Myers. “But we’ve still got a lot of work to do before collisions. Just lining the beams up is a challenge in itself: it’s a bit like firing needles across the Atlantic and getting them to collide half way.”

Between now and 30 March, the LHC team will be working with 3.5 TeV beams to commission the beam control systems and the systems that protect the particle detectors from stray particles. All these systems must be fully commissioned before collisions can begin.

“The LHC is not a turnkey machine,” said CERN Director General Rolf Heuer. “The machine is working well, but we’re still very much in a commissioning phase and we have to recognize that the first attempt to collide is precisely that. It may take hours or even days to get collisions.”

The last time CERN switched on a major new research machine, the Large Electron Positron collider (LEP) in 1989, it took three days from the first attempt to collide to the first recorded collisions.

The current LHC run began on 20 November 2009, with the first circulating beam at 0.45 TeV. Milestones were quick to follow, with twin circulating beams established by 23 November and a world record beam energy of 1.18 TeV being set on 30 November. By the time the LHC switched off for 2009 on 16 December, another record had been set with collisions recorded at 2.36 TeV and significant quantities of data recorded. Over the 2009 part of the run, each of the LHC’s four major experiments (ALICE, ATLAS, CMS and LHCb) recorded over a million particle collisions, which were distributed smoothly for analysis around the world on the LHC computing grid. The first physics papers were soon to follow. After a short technical stop, beams were again circulating on 28 February 2010, and the first acceleration to 3.5 TeV was on 19 March.

Once 7 TeV collisions have been established, the plan is to run continuously for a period of 18-24 months, with a short technical stop at the end of 2010. This will bring enough data across all the potential discovery areas to firmly establish the LHC as the world’s foremost facility for high-energy particle physics.

A webcast will be available on the day of the first attempt to collide protons at 7 TeV. More details will be available at:


CERN Press Office,
+41 22 767 34 32
+41 22 767 21 41

1. CERN, the European Organization for Nuclear Research, is the world’s leading laboratory for particle physics. It has its headquarters in Geneva. At present, its Member States are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the United Kingdom. India, Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission and UNESCO have Observer status.


Posted on March 30, 2010 in LHC, Physics, Science News



Large Hadron Collider Triples Its Own Record – 3.5 TeV

CERN Operations Group leader Mike Lamont (foreground) and LHC engineer in charge Alick Macpherson in the CERN Control Centre early this morning.

In the early hours of this morning, the beam energy was ramped up to 3.5 TeV, a new world record and the highest energy for this year’s run. Now operators will prepare the machine to make high-energy collisions later this month.

At 5:23 this morning, Friday 19 March, the energy of both beams in the LHC was ramped up to 3.5 TeV, a new world record. During the night, operators had tested the performance of the whole machine with two so-called ‘dry runs’, that is, without beams. Given the good overall response, beams were injected at around 3:00 a.m. and stabilized soon after. The ramp started at around 4:10 and lasted about one hour.

In my message this week, I’d like to congratulate the LHC team on accelerating two beams to 3.5 TeV in the early hours of this morning. The timing could not have been better. Coming during a week of CERN Council meetings, it allowed us to show delegates the great progress we’re making.

The occasion also gave us the opportunity to set out again the prudent step-by-step approach that we’re taking to get the LHC up and running, and it was refreshing to hear one member of the Scientific Policy Committee declare on Monday that we should never forget that the LHC is not a turnkey machine. With the progress the LHC is making, that simple fact would be easy to overlook. The figures coming back from this first run are already quite remarkable. In Week 10, the LHC’s availability for the operators was over 65%: it usually takes a new accelerator years to reach that level. And over the last few weeks, operation of the LHC at 450 GeV has become routinely reproducible, which is again a feat that usually takes a new machine much longer to achieve.

The operators were able to test and optimize the beam orbit, the beam collimation, and the injection and extraction phases, as well as the associated protection systems. On 12 March, both beams were ramped up to 1.18 TeV. The overall response from the machine was very positive.

The first part of this week saw a technical stop, during which the magnet and magnet protection experts continued their campaign to commission the machine to 6 kA – the current needed to operate at 3.5 TeV per beam. Tests are still ongoing to fully understand the electrical behaviour of the dipole circuits with currents higher than 2 kA, which has an impact on the quench protection system and on the procedure for ramping the beam energy to 3.5 TeV (6 kA).

While the experts are working to fully understand the circuit performance (for details, watch the embedded video interview with Andrzej Siemko, Group Leader of the LHC machine protection), the operators will continue ramping the beam energy and prepare for high-energy collisions later this month.


Posted on March 21, 2010 in LHC, Physics, Science News



Compressed Sensing: Filling in the Blanks

I stumbled across this article today, Fill in the Blanks: Using Math to Turn Lo-Res Datasets Into Hi-Res Samples by Jordan Ellenberg (Wired Magazine, February 22, 2010), and found it so fascinating – in particular because of my recent research into fractals – that I had to take a little tour around the internet to find more information on compressed sensing.

Wikipedia describes compressed sensing as: “a technique for acquiring and reconstructing a signal utilizing the prior knowledge that it is sparse or compressible. The field has existed for at least four decades, but recently the field has exploded, in part due to several important results by David Donoho, Emmanuel Candès, Justin Romberg and Terence Tao.”

“The ideas behind compressive sensing came together in 2004 when Emmanuel J. Candès, a mathematician at Caltech, was working on a problem in magnetic resonance imaging. He discovered that a test image [a badly corrupted version of the Shepp–Logan phantom] could be reconstructed exactly even with data deemed insufficient by the Nyquist-Shannon criterion.”
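Candès’s observation can be reproduced in miniature: when the unknown signal is sparse, minimizing its l1 norm subject to the measurement constraints often recovers it exactly from far fewer linear measurements than unknowns. The sketch below (Python with NumPy/SciPy; the sizes, seed and variable names are illustrative choices of mine, not from the article) recasts the l1 problem as a linear program:

```python
# Toy compressed-sensing recovery: minimize ||x||_1 subject to A x = b.
# Illustrative sketch only; all sizes and the seed are arbitrary choices.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n, m, k = 40, 16, 2          # 40 unknowns, 16 measurements, 2 nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)

A = rng.normal(size=(m, n))  # random Gaussian sensing matrix
b = A @ x_true               # the m < n observed measurements

# Split x = u - v with u, v >= 0, so ||x||_1 = sum(u + v); minimizing
# sum(u + v) subject to A(u - v) = b is then a linear program.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x_rec = res.x[:n] - res.x[n:]

print(np.linalg.norm(x_rec - x_true))  # near zero: exact recovery
```

With only 16 measurements of 40 unknowns, a random Gaussian sensing matrix makes exact recovery of the 2-sparse signal overwhelmingly likely. That is the surprise at the heart of compressed sensing: directly counting nonzeros (the l0 problem) is combinatorial, but the l1 relaxation is a tractable linear program that usually finds the same answer.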

According to Ellenberg’s Wired story, the mathematical technique called l1 minimization is now being looked at in a number of experimental applications, such as DARPA-funded research into the acquisition of enemy communication signals:

DARPA ~ …mathematics thrust area have successfully applied a methodology of discovery of physics-based structure within a sensing problem, from which it was often possible to determine and algorithmically exploit efficient low-dimensional representations of those problems even though they are originally posed in high-dimensional settings. Computational complexity and statistical performance of fielded algorithms within the DSP component of sensor systems have both been substantially improved through this approach. The aim of ISP is much more ambitious: to develop and amplify this concept across all components of an entire sensor system and then across networks of sensor systems.

data storage:

Wired ~ …the technique will help us in the future as we struggle with how to treat the vast amounts of information we have in storage. The world produces untold petabytes of data every day — data that we’d like to see packed away securely, efficiently, and retrievably. At present, most of our audiovisual info is stored in sophisticated compression formats. If, or when, the format becomes obsolete, you’ve got a painful conversion project on your hands.

and further into the future, perhaps CS will even live in our digital cameras:

Wired ~ Candès believes we’ll record just 20 percent of the pixels in certain images, like expensive-to-capture infrared shots of astronomical phenomena. Because we’re recording so much less data to begin with, there will be no need to compress. And instead of steadily improving compression algorithms, we’ll have steadily improving decompression algorithms that reconstruct the original image more and more faithfully from the stored data.

Today though, CS is already rewriting the way we capture medical information. A team at the University of Wisconsin, with participation from GE Healthcare, is combining CS with technologies called HYPR and VIPR to speed up certain kinds of magnetic resonance scans, in some cases by a factor of several thousand. GE Healthcare is also experimenting with a novel protocol that promises to use CS to vastly improve observations of the metabolic dynamics of cancer patients. Meanwhile, the CS-enabled MRI machines at Packard can record images up to three times as quickly as conventional scanners.

Wired ~ In the early spring of 2009, a team of doctors at the Lucile Packard Children’s Hospital at Stanford University lifted a 2-year-old into an MRI scanner. The boy, whom I’ll call Bryce, looked tiny and forlorn inside the cavernous metal device. The stuffed monkey dangling from the entrance to the scanner did little to cheer up the scene. Bryce couldn’t see it, in any case; he was under general anesthesia, with a tube snaking from his throat to a ventilator beside the scanner. Ten months earlier, Bryce had received a portion of a donor’s liver to replace his own failing organ. For a while, he did well. But his latest lab tests were alarming. Something was going wrong — there was a chance that one or both of the liver’s bile ducts were blocked.

Shreyas Vasanawala, a pediatric radiologist at Packard, didn’t know for sure what was wrong, and hoped the MRI would reveal the answer. Vasanawala needed a phenomenally hi-res scan, but if he was going to get it, his young patient would have to remain perfectly still. If Bryce took a single breath, the image would be blurred. That meant deepening the anesthesia enough to stop respiration. It would take a full two minutes for a standard MRI to capture the image, but if the anesthesiologists shut down Bryce’s breathing for that long, his glitchy liver would be the least of his problems.

However, Vasanawala and one of his colleagues, an electrical engineer named Michael Lustig, were going to use a new and much faster scanning method. Their MRI machine used an experimental algorithm called compressed sensing — a technique that may be the hottest topic in applied math today. In the future, it could transform the way that we look for distant galaxies. For now, it means that Vasanawala and Lustig needed only 40 seconds to gather enough data to produce a crystal-clear image of Bryce’s liver.

Read the rest of this entry »



Outcome from Chamonix: Better in the long run

The Large Hadron Collider, the world’s most powerful particle accelerator, is about to enter its longest continuous operational period, in preparation for full-strength particle-smashing.

On Wednesday, Steve Myers, the LHC’s director for accelerators and technology, blogged that CERN had decided last week to run the giant particle collider for 18 to 24 months at a collision energy of seven tera-electron-volts (TeV), or 3.5 TeV per beam, with the powering-up phase starting later this month.

After that, the LHC will "go into a long shutdown in which we’ll do all the necessary work to allow us to reach the LHC’s design collision energy of 14 TeV for the next run," Myers wrote.

Last week, the Chamonix workshop once again proved its worth as a place where all the stakeholders in the LHC can come together, take difficult decisions and reach a consensus on important issues for the future of particle physics. The most important decision we reached last week is to run the LHC for 18 to 24 months at a collision energy of 7 TeV (3.5 TeV per beam). After that, we’ll go into a long shutdown in which we’ll do all the necessary work to allow us to reach the LHC’s design collision energy of 14 TeV for the next run. This means that when beams go back into the LHC later this month, we’ll be entering the longest phase of accelerator operation in CERN’s history, scheduled to take us into summer or autumn 2011.

What led us to this conclusion? Firstly, the LHC is unlike any previous CERN machine. Because it is a cryogenic facility, each run is accompanied by lengthy cool-down and warm-up phases. For that reason, CERN’s traditional ‘run through summer and shutdown for winter’ operational model had already been brought into question. Furthermore, we’ve known for some time that work is needed to prepare the LHC for running at energies significantly higher than the 7 TeV collision energy we’ve chosen for the first physics run. The latest data show that while we can run the LHC at 7 TeV without risk to the machine, running it at higher energy would require more work in the tunnel. These facts led us to a simple choice: run for a few months now and programme successive short shutdowns to step up in energy, or run for a long time now and schedule a single long shutdown before allowing 14 TeV (7 TeV per beam).

A long run now is the right decision for the LHC and for the experiments. It gives the machine people the time necessary to prepare carefully for the work that’s needed before allowing 14 TeV. And for the experiments, 18 to 24 months will bring enough data across all the potential discovery areas to firmly establish the LHC as the world’s foremost facility for high-energy particle physics.

I’d like to invite you all to the summary of the Chamonix workshop on Friday 5 February at 14:00 in the Main auditorium. See:

Steve Myers
Director for Accelerators and Technology

Read more: ZDNet UK


Posted on February 5, 2010 in Physics, Science News



Scientists Produce Unprecedented 1 Megajoule Laser Shot, Step Towards Fusion Ignition

Lawrence Livermore National Laboratory

US scientists have produced a laser shot with an unprecedented energy level that could be a key step towards nuclear fusion, the US National Nuclear Security Administration said Wednesday, January 27, 2010.

The National Nuclear Security Administration announced that scientists at the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) [Lawrence Livermore National Laboratory is located in Livermore, California, about 40 miles east of San Francisco in southern Alameda County] have successfully delivered an historic level of laser energy — more than 1 megajoule — to a target in a few billionths of a second and demonstrated the target drive conditions required to achieve fusion ignition. A megajoule (MJ) is equal to one million joules, or approximately the kinetic energy of a one-ton vehicle moving at 160 km/h (100 mph). This is about 30 times greater than the energy ever delivered by any other group of lasers in the world.
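The comparisons in that paragraph are easy to sanity-check with back-of-the-envelope arithmetic. The sketch below (Python) does so, taking “a few billionths of a second” as 3 ns purely for illustration:

```python
# Sanity-check the energy and power comparisons in the press release.
mass = 1000.0           # a "one-ton" vehicle, in kg
v = 160 / 3.6           # 160 km/h converted to m/s
ke = 0.5 * mass * v**2  # kinetic energy in joules
print(f"{ke:.0f} J")    # ~0.99 MJ, matching the 1 MJ comparison

# Delivering ~1 MJ "in a few billionths of a second" implies an enormous
# peak power; the 3 ns pulse length here is an illustrative assumption.
power = 1.0e6 / 3e-9    # watts
print(f"{power:.1e} W") # ~3e14 W, i.e. hundreds of terawatts
```

The kinetic-energy figure comes out at roughly 0.99 MJ, so the vehicle analogy is accurate; the implied peak power, hundreds of terawatts, is what makes the comparison to national electricity consumption in the next paragraph plausible.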

The peak power of the laser light, which was delivered within a few billionths of a second, was about 500 times that used by the United States at any given time.

“Breaking the megajoule barrier brings us one step closer to fusion ignition at the National Ignition Facility, and shows the universe of opportunities made possible by one of the largest scientific and engineering challenges of our time,” said NNSA Administrator Thomas D’Agostino. “NIF is a critical component in our stockpile stewardship program to maintain a safe, secure and effective nuclear deterrent without underground nuclear testing. This milestone is an example of how our nation’s investment in nuclear security is producing benefits in other areas, from advances in energy technology to a better understanding of the universe.”

In order to demonstrate fusion, the energy that powers the sun and the stars, NIF focuses the energy of 192 powerful laser beams into a pencil-eraser-sized cylinder containing a tiny spherical target filled with deuterium and tritium, two isotopes of hydrogen. Inside the cylinder, the laser energy is converted to X-rays, which compress the fuel until it reaches temperatures of more than 200 million degrees Fahrenheit and pressures billions of times greater than Earth’s atmospheric pressure. The rapid compression of the fuel capsule forces the hydrogen nuclei to fuse and release many times more energy than the laser energy that was required to initiate the reaction.

This experimental program to achieve fusion ignition, known as the National Ignition Campaign, is sponsored by NNSA and is a partnership among LLNL, Los Alamos National Laboratory, the Laboratory for Laser Energetics, General Atomics, Sandia National Laboratories, and numerous other national laboratories and universities.

Source: National Ignition Facility News Release


Posted on February 3, 2010 in Physics, Science News, Technology



LHC at CERN Explained via TED, Brian Cox

About LHC at CERN

“Rock-star physicist” Brian Cox talks about his work on the Large Hadron Collider at CERN. Discussing the biggest of big science in an engaging, accessible way, Cox brings us along on a tour of the massive project.

About Brian Cox

Physicist Brian Cox has two jobs: working with the Large Hadron Collider at CERN, and explaining big science to the general public. He’s a professor at the University of Manchester.




Posted on January 1, 2010 in Physics, Science



LHC to restart in 2009


Geneva, 5 December 2008. CERN today confirmed that the Large Hadron Collider (LHC) will restart in 2009. This news forms part of an updated report, published today, on the status of the LHC following a malfunction on 19 September.

“The top priority for CERN today is to provide collision data for the experiments as soon as reasonably possible,” said CERN Director General Robert Aymar. “This will be in the summer of 2009.”

The initial malfunction was caused by a faulty electrical connection between two of the accelerator’s magnets. This resulted in mechanical damage and release of helium from the magnet cold mass into the tunnel. Proper safety procedures were in force, the safety systems performed as expected, and no one was put at risk.

Detailed studies of the malfunction have allowed the LHC’s engineers to identify means of preventing a similar incident from reoccurring in the future, and to design new protection systems for the machine. A total of 53 magnet units have to be removed from the tunnel for cleaning or repair; of these, 28 have already been brought to the surface, and the first two replacement units have been installed in the tunnel. The current schedule foresees the final magnet being reinstalled by the end of March 2009, with the LHC being cold and ready for powering tests by the end of June 2009.

“We have a lot of work to do over the coming months,” said LHC Project Leader Lyn Evans, “but we now have the roadmap, the time and the competence necessary to be ready for physics by summer. We are currently in a scheduled annual shutdown until May, so we’re hopeful that not too much time will be lost.”

Full details of the timetable to restart are available in the report published today.

Download the report [PDF format]   *Note* –  The report link is broken at this time!


Posted on January 11, 2009 in Physics, Science News

