Collider: The Search for the World's Smallest Particles - Paul Halpern (2009)

Introduction

The Machinery of Perfection

Does not so evident a brotherhood among the atoms point to a common parentage? . . . Does not the entireness of the complex hint at the perfection of the simple? . . . Is it not because originally, and therefore normally, they were One—that now, in all circumstances—at all points—in all directions—by all modes of approach—in all relations and through all conditions—they struggle back to this absolutely, this irrelatively, this unconditionally one?

—EDGAR ALLAN POE (EUREKA)

Deep in the human heart is an irrepressible longing for unity. Symmetry and completeness guide our sense of beauty and steer us toward people, places, and things that seem balanced and whole. Architects understand this quest when they make use of geometric principles to design aesthetically pleasing structures. Photographers tap into this yearning when they frame their images to highlight a scene’s harmonies. And lovers realize this desire when they seek the comfort of profound connection.

Where might we find perfection? Is it by digging deep into our past to an ancient time before symmetry was shattered? Or is it by digging deep underground, crafting mighty machinery, and shattering particles ourselves—hoping that in the rubble we might somehow find fossils of a paradise lost?

The opposite of beauty is the macabre. Unbalanced things, like broken art or atonal music, make us uneasy. Perhaps no writer better expressed this contrast than Edgar Allan Poe—a master at capturing the gorgeous and the gruesome—who spent much of his final years developing and promoting an attempt to understand the deep unity underlying creation. His prose poem Eureka suggests that the universe’s original oneness longs to reconstitute itself; like the unsteady House of Usher, it pines for its native soil.

Contemporary physics, a triumph of generations of attempts to map the properties of nature, contains satisfying islands of harmony. Yet it is a discipline with uncomfortable inequities and gaps. Completing the cartography of the cosmos has called for intrepid exploration by today’s ablest researchers.

In understanding the forces that steer the universe, and trying to unify them into a single explanation of everything, most of the greatest strides have been made in the past two centuries. In the mid-nineteenth century, the brilliant Scottish physicist James Clerk Maxwell demonstrated that electricity and magnetism are integrally connected, and that the relationship between them could be expressed through a set of four simple equations. Maxwell’s equations are so succinct they can literally fit on a T-shirt—as evidenced by a popular fashion choice at many physics conferences. These relationships offer a startling conclusion: all light in the world, from the brilliant yellow hues of sunflowers to the scarlet shades of sunsets, consists of electromagnetic waves—electricity and magnetism working in tandem.

By the early twentieth century, physicists had come to realize that these waves always travel in discrete packets of energy, called photons. These hop at the speed of light between electrically charged objects, causing either attraction or repulsion. Hence, all electromagnetic phenomena in the world, from the turning of a compass needle to the blazing of lightning in the sky, involve the exchange of photons between charged particles.

In addition to electromagnetism, the other known natural interactions include two forces that operate on the nuclear scale—called the weak and strong interactions—and the apple-dropping, planet-guiding force of gravitation. These four forces govern all of the ways material objects can attract, repel, or transform. Whenever motion changes—a sudden jolt, a subtle twist, a quiet start, or a screeching halt—it is due to one or more of the four natural interactions.

Each interaction occurs by means of its own exchange particle, or set of particles. An exchange particle works by drawing other particles together, pushing them apart, or changing their properties. It’s like a Frisbee game in which the players move closer to catch the Frisbee or step back when it hits them. The process of tossing something back and forth cements the players’ connection.

Given Maxwell’s fantastic success with marrying electricity and magnetism, many physicists have tried to play matchmaker with the other forces. Like party hosts trying to facilitate connections among their guests, researchers have looked for commonalities as a way of making introductions. Could all four interactions be linked through a mutual set of relationships?

The most successful unification so far has been the melding of electromagnetism and the weak force, performed by American physicists Steven Weinberg and Sheldon Glashow and Pakistani physicist Abdus Salam (each working independently). In unity, these would be known as the electroweak interaction. There were many tricky details that needed to be ironed out, however, before the match could be made.

One of the major issues had to do with a great disparity in the masses of the exchange particles corresponding to each force. While photons have zero mass, the carriers of the weak interaction are bulky—signified by the latter force’s much shorter range. The difference between electromagnetic and weak exchanges is a bit like tossing a foam ball across a field and then trying to do the same with a lead bowling ball. The latter would scarcely spend any time in the air before plunging to the ground. With such different volleys, how could the two forces play the same game?

Sometimes, however, inequities emerge from once-balanced situations. Symmetry, as collectors of ancient sculpture know, can be fragile. Perhaps the very early universe, instants after the fiery Big Bang that ushered in its birth, was in a fleeting state of harmony. All forces were perfectly balanced, until some transformation tilted the scales. Equity shattered, and some of the exchange particles became bulkier than others. Could today’s unequal forces constitute the culmination of a universal symmetry-breaking process?

In 1964, British physicist Peter Higgs proposed a promising mechanism for spontaneously breaking the universe’s initial symmetry. His mechanism requires a special entity, dubbed the Higgs field, that pervades all of space and sets its fundamental energy scale. (A field is a mathematical description of how properties of a force or particle change from point to point.) Within its internal structure is a marker called a phase angle that can point anywhere around a circle. At extremely high temperatures, such as in the nascent moments of the universe, the direction of this marker is blurry, like a rapidly spinning roulette wheel. However, as temperatures lower, the roulette wheel freezes, and the marker points in a random direction. The Higgs field’s initial symmetry, with all angles being equal, has spontaneously broken to favor a single angle. Because the Higgs field sets the baseline for the vacuum (lowest energy) state of the universe, symmetry breaking carries that state from a false vacuum, an unstable configuration in which the field’s value is zero, to the true vacuum, in which the field settles at a nonzero value everywhere in space. Following Albert Einstein’s famous dictum E = mc² (energy equals mass times the speed of light squared), the energy bound up in that nonzero field manifests as mass, shared among many elementary particles, including the carriers of the weak interaction. In short, the halting of the Higgs field’s “roulette wheel” channels mass into the weak exchange (and other) particles and explains why they are bulky while the photons remain massless. With its phenomenal ability to bestow mass on other particles, the Higgs has acquired the nickname the “God particle.”
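The roulette-wheel picture has a standard textbook counterpart, not spelled out in the text: a potential energy for the Higgs field whose minimum is not a single point but an entire circle of equivalent phase angles,

\[
V(\phi) = -\mu^2\,|\phi|^2 + \lambda\,|\phi|^4, \qquad |\phi|_{\min} = \frac{\mu}{\sqrt{2\lambda}},
\]

where μ and λ are positive constants. Every angle around the circle of minima has the same energy; when the cooling field settles onto one particular angle, the symmetry among them is spontaneously broken.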

If the Higgs mechanism is correct, a remnant of the original field should exist as a fundamental particle. Because of its high mass—more than a hundred times that of the protons that compose the cores of hydrogen atoms—it would be seen only during energetic particle events, such as high-energy collisions. Despite decades of searching, this key ingredient for electroweak unification has yet to be found. Hence, the elusive God particle has become the holy grail of contemporary physics.

Aside from the missing Higgs particle, electroweak unification has proven enormously successful. Together with the theory of the strong interaction, it forms the framework known as the Standard Model. However, much to the physics community’s disappointment, attempts to unite the electroweak interaction with the other two forces have yet to bear fruit.

At least the electroweak and strong forces can be written in the same language—the lexicon of quantum mechanics. Developed in the 1920s, quantum mechanics is a powerful toolbox for describing the subatomic realm. Although it offers accurate odds for the outcome of physical events such as scattering (the bouncing of one particle off another) and decay, it frustratingly contains a built-in fuzziness. No matter how hard you try to nail down the precise course of events for natural occurrences on the subatomic scale, you are often left with the flip of a coin or the roll of a die. Einstein detested having to make wagers on what ought to be known with crystal clarity and spent his later years trying to develop an alternative. Nevertheless, like a brilliant but naughty young Mozart, quantum mechanics has offered enough stunning symphonies to hide its lack of decorum.

Physicists who relish exactness tend to turn toward Einstein’s own masterpiece, the general theory of relativity, which offers a precise way of describing gravity. Unlike theories of the other interactions, it is deterministic, not probabilistic, and treats space and time as participants rather than just background coordinates. Although there have been numerous attempts, there is no broadly accepted way of placing gravity on a quantum footing. It’s like trying to assemble a winning team for a spelling championship but finding that one of the four players, though an expert, speaks a completely different language.

Researchers are left with an odd puzzle. Of the four fundamental interactions, two, electromagnetism and the weak, appear to fit together perfectly. The strong interaction seems as if it ought to fit, but no one has quite been able to match it up. And gravity seems to belong to another set of pieces altogether. How then to reconstruct the original symmetry of the cosmos?

Other areas of asymmetry in contemporary physics include a huge discrepancy between the amount of matter and antimatter (like matter but oppositely charged) in the universe—the former is far more plentiful—and behavioral differences between the particles that build up matter, the fermions, and those that convey forces, the bosons. Like the Montagues and the Capulets, fermions and bosons belong to separate houses with distinct traditions. They like to group themselves in different ways, with fermions demanding more breathing room. Attempts to reconcile the two, in a grand cosmic union called supersymmetry, require that each member of one family have a counterpart in the other. These supersymmetric companions could also help explain a deep conundrum in astronomy: why galaxies appear to be steered by more mass than they visibly contain. Could supersymmetric companions constitute some or all of this dark matter? So far such invisible agents have yet to be found.

Such gaps and discrepancies rattle the human spirit. We like our science to tell a full story, not a shaggy-dog tale. If we can’t think of a solid ending, perhaps we haven’t imagined hard enough—not that theoretical physicists haven’t tried. For each scientific mystery a bevy of possible explanations have sprung up with varying degrees of plausibility.

Recent theoretical efforts to replace elementary particles with vibrating strands or membranes of energy—in what are called string theory and M-theory, respectively—have captured the imagination. These make use of supersymmetry or extra dimensions beyond space and time to explain in an elegant way some of the differences between gravity and the other interactions. An attractive mathematical feature of these theories is that while in prior approaches some calculations involving infinitesimal point particles yielded nonsensical results, using finite strands or membranes removes these problems. Given the difficulties with completing unification through extending the Standard Model, a number of prominent theorists have been won over to the mathematical elegance of these novel approaches. Steven Weinberg, for instance, once remarked, “Strings are the only game in town.”1

Detractors of string theory and M-theory, on the other hand, question their physical relevance because they contain undetermined values and require unseen dimensions. In the array of all configurations, the real world is just a subset of myriad possibilities. If a theory has enough free parameters, opponents point out, it could represent virtually any kind of particle or interaction. It’s like a writer who compares himself to Dickens churning out tens of thousands of pages of uneven prose and instructing an editor to piece together the most Dickensian passages. To echo Truman Capote’s famous remark, “that’s not writing; that’s typing,” detractors could well say about string theory, “that’s not physics; that’s model-making.”

Even the most stalwart supporters and fervent detractors would agree that the ultimate test of a theory lies in its experimental verification. So far, for string theory and M-theory, such evidence has been lacking. As noted theorist Bryce DeWitt once told me, “With M-theory I feel dubious about graduate students [going into a field] where there is not one shred of experimental evidence supporting it.”2

From the 1930s until the mid-1990s, enormous strides were made in particle physics by means of high-energy experimentation with various types of accelerators. An accelerator is a device that uses electric and magnetic fields in tandem to steer particles (such as protons) along tracks or rings while propelling them to higher and higher energies. These particles are then allowed to collide, converting their energy into a multitude of collision products. Following Einstein’s equation, the higher the energy at the collision point, the greater the chance of massive particles being produced. While older accelerators used fixed targets, physicists came to realize that smashing particles together head-on produced even higher energies. Accelerators in which particles crash into one another are called colliders.

During those pivotal decades, by using various types of detectors to collect and analyze collision data, researchers were able to identify a zoo of elementary particles. The major theoretical breakthroughs of those times were motivated by a need to organize these particles into families and to understand their decays and interactions. Physicists discovered that all matter particles are either hadrons (subject to the strong force) or leptons (unaffected by the strong force). Protons are examples of hadrons and electrons are examples of leptons. Hadrons are composed of elementary components called quarks—either two or three per particle—bound together by gluons. Quarks come in six varieties called flavors: “down,” “up,” “strange,” “charm,” “bottom,” and “top.” There are also six types of antiquarks, which are similar to quarks but oppositely charged.

In that era of discovery, whenever novel theories were proposed, such as the quark model, researchers set out to test them through further experimentation. Testability lent certain theories a special relevance and clout—allowing them to raise their voices above others and demand to be heard. Upon verification, they could then say, “I told you so.”

For example, the top quark, predicted in the 1970s, was identified in 1995 through an analysis of the collision debris of what was then the mightiest accelerator in the world: the Tevatron, at Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois. Inaugurated in 1983, the Tevatron propels streams of protons and antiprotons (negatively charged, antimatter counterparts of protons) up to energies of about 1 TeV (one tera-electron volt) each before bashing them together. One electron volt is the amount of energy involved in transporting a single electron or proton between the terminals of a one-volt battery. Multiply that quantity by one trillion, and the result is 1 TeV—a colossal amount of energy for a minuscule elementary particle.
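The electron-volt arithmetic above is easy to check directly. In this sketch, the conversion constant is the standard CODATA value, not a figure given in the text:

```python
# One electron volt is the energy an electron gains crossing one volt;
# scaling it by a trillion expresses the Tevatron's 1 TeV in everyday units.
EV_IN_JOULES = 1.602176634e-19  # standard value, joules per electron volt

tev_in_joules = 1e12 * EV_IN_JOULES
print(f"1 TeV = {tev_in_joules:.3e} J")  # about 1.6e-7 joules
```

Tiny by macroscopic standards, yet concentrated into a single proton it amounts to an enormous energy density.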

As it turned out, the discovery of the top quark represented the last major triumph of the Tevatron—and of experimental high-energy physics for some time. Finding the Higgs particle, identifying supersymmetric companions, and other important goals seemed to require more energy than even that mighty machine could muster. Without the opportunity for experimental confirmation the chorus of contending theories began to resemble a meaningless cacophony. Only by building more energetic colliders to test these competing ideas could the malaise of theoretical physics be remedied and significance restored to its voice.

The European Organization for Nuclear Research, better known by the acronym CERN (Conseil Européen pour la Recherche Nucléaire), took up the challenge. With the aim of finding the Higgs particle, discovering possible supersymmetric companions, discerning the nature of dark matter, exploring the possibility of hidden extra dimensions, understanding why there is an abundance of matter over antimatter in the universe, reproducing some of the conditions of the Big Bang, and resolving a host of other critical scientific issues, CERN would channel its resources into the construction of the largest and most powerful accelerator in the world, built at its headquarters near Geneva, Switzerland.

After more than fifteen years of planning and more than eight billion dollars in funding, the Large Hadron Collider (LHC), science’s groundbreaking effort to unlock the deepest secrets of particle physics, is finally complete. It is truly the grandest experiment of all time—the pinnacle of humanity’s quest for unification. Befitting the pursuit of cosmic grandeur and unity, it is set in a stunning location.

Query a world traveler about locales of striking beauty and harmony, and chances are Switzerland would be high up on the list. From its majestic mountains and crystalline lakes to its quaint cog railways and charming medieval towns, it is hard to imagine a better place to base a search for unification. Indeed the Swiss confederation, uniting inhabitants divided into four different official languages (French, German, Italian, and Romansch), several major religions (Protestant, Catholic, and other faiths), and twenty-six distinct cantons, physically isolated in many cases from one another, represents a model for bringing disparate forces together into a single system. Though in past centuries Switzerland experienced its share of turmoil, in more recent times it has become a haven for peace and neutrality.

As Europe’s political frontiers have receded, many scientific roadblocks have fallen as well. The LHC crosses the Swiss-French border with the ease of a diplomat. Its seventeen-mile-long circular underground tunnel, recycled from a retired accelerator called the Large Electron-Positron Collider (LEP), represents a triumph for international cooperation. Only by working in unison, it reminds us, might we discover the secrets of natural unity.

American researchers form a large contingent in the major LHC experiments. They are proud to contribute to such a pivotal venture. Although the United States is not a member of CERN, it donates ample funds toward LHC research. While celebrating Europe’s achievements, however, many American physicists still quietly mourn what could have taken place at home.

In 1993, the U.S. Congress voted to cut off funding for what would have been a far bigger, more powerful project, the Superconducting Super Collider (SSC). About fourteen miles of a planned fifty-four-mile tunnel in the region of Waxahachie, Texas, had already been excavated before the plug was pulled. Today that land sits fallow, except for the weed-strewn abandoned buildings on the site. Years of anticipation of novel discoveries were crushed in a single budgetary decision.

President Bill Clinton sent a letter to the House Appropriations Committee expressing his strong concerns: “Abandoning the SSC at this point would signal that the United States is compromising its position of leadership in basic science—a position unquestioned for generations.”3

Nevertheless, tight purse strings won out over sweeping visions. The cancellation of the SSC shattered the plans of those who had made multiyear commitments to the enterprise and discouraged young researchers from pursuing the field. It would prove a horrendous setback for American high-energy physics, shifting the momentum across the Atlantic.

By accelerating protons to a planned 20 TeV apiece, for a combined 40 TeV per head-on collision, the particle-smasher in Texas would have been energetic enough to conduct a thorough search for the elusive God particle. Perhaps in its hatchery, supersymmetric companion particles would have been born, presenting themselves through their characteristic decay profiles. Dark matter could have made itself known in caverns deep beneath the Texas soil. The ramifications of string theory and other unification models could have been explored. Like the moon landings, these expeditions could have been launched from U.S. soil. With the LHC’s completion, the Tevatron will soon be obsolete and no more large American accelerators are planned. What went wrong?

The reason lies with long-term planning and commitment to science, an area where sadly the United States has in recent times often fallen short. Each European member of CERN pledges a certain amount every year, depending on its gross national product. Thus the designers of the LHC could count on designated funding over the many years required to get the enterprise up and running. Already, the upgrades of coming years are being programmed. Foresight and persistence are the keys to the LHC’s success.

Not that there haven’t been frustrating glitches and delays. Contemporary high-energy physics requires delicate instrumentation that must be aligned perfectly and maintained in extreme environmental conditions such as ultracold temperatures. Despite researchers’ best efforts, systems often fail. The LHC, originally slated to go on line in 2005, missed that target, and its opening was delayed again in 2007 because of accidental damage to some of its magnets.

On September 10, 2008, proton beams were circulated successfully for the first time around the LHC’s large ring. Project leader Lyn Evans and the international team of researchers working at the lab were elated. “It’s a fantastic moment,” said Evans. “We can now look forward to a new era of understanding about the origins and evolution of the universe.”4

Nine days later, however, that heady summer of hope screeched to a halt due to a devastating malfunction. Before particle collisions had even been attempted, a faulty electrical connection in the wiring between two magnets heated up, causing the supercooled helium surrounding them to vaporize. Liquid helium is a critical part of the LHC’s cooling system that keeps its superconducting magnets functioning properly. In gaseous form, the helium began to leak profusely into the vacuum layer that surrounds the system, thwarting attempts by emergency release valves to channel it off safely. Then came the knockout punch. The flood of helium slammed into the magnets, jostled them out of position, and destroyed more wiring and part of the beam pipe. Upon inspection, technicians realized that it would take many months to repair the damage, recheck the electrical and magnetic systems around the ring, and attempt operations once more. Currently, the LHC is scheduled to go on line in September 2009.

When it is up and running, the LHC will be a marvel to behold—albeit remotely, given that its action will take place well beneath the surface. Burrowed hundreds of feet beneath the earth but only ten feet in diameter, the LHC tunnel will serve as the racetrack for two opposing beams of particles. Steered by more than a thousand gigantic supercooled magnets—the coldest objects on Earth—these particles will race eleven thousand times per second around the loop, traveling up to 99.999999 percent of the speed of light. Reaching energies up to 7 TeV each, the beams will be forced to collide at one of four designated intersection points.
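The lap rate quoted above follows from the tunnel’s size. The circumference figure in this sketch is the published value for the former LEP tunnel, roughly 26.7 kilometers, an assumption not stated in the passage:

```python
# Protons at 99.999999 percent of light speed circling a ~26,659 m ring
# complete on the order of eleven thousand laps every second.
C_LIGHT = 299_792_458.0    # speed of light in vacuum, m/s
CIRCUMFERENCE = 26_659.0   # LHC/LEP tunnel circumference, m (published figure)

speed = 0.99999999 * C_LIGHT
laps_per_second = speed / CIRCUMFERENCE
print(f"{laps_per_second:,.0f} laps per second")  # roughly 11,245
```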

One of these collision sites houses the ATLAS (A Toroidal LHC ApparatuS) detector, a colossal instrument seven stories high (more than half the height of the Statue of Liberty) and spanning 150 feet (half the length of a football field) from end to end. Using sensitive tracking and calorimetry (energy-measuring) devices, it will monitor the debris of protons crashing together in its center, collecting an encyclopedia of data about the by-products of each collision. Halfway around the ring, another general-purpose detector called CMS (Compact Muon Solenoid) will employ alternative tracking and calorimetry systems to similarly collect reams of valuable collision data. At a third site, a specialty detector called LHCb (Large Hadron Collider beauty) will search for the decays of particles containing bottom quarks, with the hope of discovering the reason for the dearth of antimatter in the cosmos. Finally, at a fourth collision site, another specialized detector called ALICE (A Large Ion Collider Experiment) will be reserved for times of the year when lead ions are collided instead of protons. By smashing these together, researchers hope to re-create some of the conditions of the early universe. From each detector, based on careful assessment of the signals for possible new particles, the most promising information will be sent off for analysis via a global computing network called the Grid.

A vast group of researchers from numerous countries around the world will wait eagerly for the LHC results, hoping to find signs of the Higgs, supersymmetric companions, and other long-hoped-for particles. Discovery of any of these would spur a renaissance in physics and an enormous boost for the scientific enterprise—not to mention grounds for a Nobel Prize. The world would celebrate the achievements of those involved in this extraordinary undertaking, including the hardworking Evans and the thousands of workers contributing their vital efforts and ideas to the project.

If the Higgs is found, depending on exactly what mass it turns out to be, the Standard Model could be either confirmed or found in need of major revision. Some supersymmetric alternatives to the Standard Model predict multiple Higgs particles at various energies. Finding evidence of these would be a triumph for supersymmetric theories, especially if other supersymmetric particles are discovered along with them. At the energies of the LHC, most physicists expect to see some new particles. If all goes well, there should be enough for theorists to chew on for many years.

Anticipation is high for the LHC, but so is apprehension. More than with any other scientific device in recent memory, an undercurrent of fear has formed that its operation somehow places the Earth or even the whole universe at risk. Disseminated largely through the Internet, these views have caught flame (and been flamed) in numerous blogs and user groups.

The principal culprits for the LHC’s supposed threats to our existence include voracious microscopic black holes, ferocious hypothetical particles called “strangelets,” magnetic monopoles, and other purported catalysts of doom. Apocalyptic fears are nothing new; many people choose to spend their time worrying about potential calamities such as the collision of asteroids or the evaporation of Earth by a nearby stellar explosion. What is novel about the LHC fears is the worry over the world-gobbling powers of theoretical objects that have never been detected and could very well not even exist.

Of these LHC doomsday scenarios, perhaps the most widespread is the notion that the intensity of the collisions would forge a mini-black hole at the collision site that would then grow to Earth-swallowing dimensions by ingesting more and more material, like the gelatinous creature in the horror film The Blob. Indeed, there are some theoretical speculations about the creation of microscopic gravitationally intense objects. However, the idea that mini-black holes would act like blobs is an unfortunate misconception; any objects created would be far too minuscule to pose a threat.

Ordinary black holes are the products of the late-stage collapse of heavy stars, at least three times as massive as the Sun. They are called such because of their intense gravitational fields, which are so strong that within an invisible barrier called the event horizon not even light can escape. Typically, a black hole accrues matter only if it has a companion star, in which case it slowly absorbs its unfortunate partner’s material and grows gradually over time.

Miniature black holes are a hypothetical concept based on the premise of concentrating a large amount of mass in a region the size of an elementary particle. Their event horizons would be so small that the minute bodies would have virtually no gravitational effect on the space a mere fraction of an inch away, let alone elsewhere in the Earth. Moreover, due to a process called Hawking radiation, they would almost immediately evaporate—decaying into other particles. Thus, mini-black holes would scarcely have the opportunity to exist, let alone to grow beyond subatomic proportions. In short, they’d have no chance of destroying even part of the LHC, let alone the Earth.

As Peter Higgs told the Independent, “The black hole business has become rather inflated. Even the theorists who are suggesting that mini-black holes are things that could be produced are not predicting black holes large enough to swallow up chunks of the universe. I think the publicity has rather got out of hand and some people have misunderstood it.”5

In the days before the aborted start-up in September 2008, apocalyptic worries dominated news stories about the collider. “Meet Evans the Atom, Who Will End the World on Wednesday,” proclaimed a headline to a piece in the British tabloid Daily Mail about the LHC’s project leader. The story begins by mentioning that as a child the Welsh-born Evans made explosives with his chemistry set that “blew the fuses of [his] whole house a few times.”6 Could the whole world, the piece considers, be the next thing for him to blow up?

One team of activists, led by former nuclear safety official Walter Wagner, has gone so far as to file suit seeking a halt to the LHC’s operations. In response to public concerns about the LHC’s purported dangers, researchers working on the project have issued detailed analyses of potential threats to the planet, demonstrating how none of these are worth fretting about. A 2003 report, “Study of Potentially Dangerous Events during Heavy-ion Collisions at the LHC,” found that “classical gravitational effects [of mini-black holes] are completely negligible for LHC energies and luminosities.”7 A follow-up study conducted in 2008 similarly indicated that mini-black holes would present no danger. Both reports pointed out that if such entities could be created, they would have been produced in energetic cosmic rays that continuously rain down upon the Earth. The mere fact that we are here means that anything forged at such energies wouldn’t be threatening.

Indeed, the French and Swiss residents living above the particle-smasher seem for the most part calm and happy. CERN prides itself on openness, publishing all of its decisions. It also takes great pains to respect the environment. The land above the collider is largely unspoiled and clean, flourishing with farms and vineyards. If the agency believed there was any reasonable chance that the LHC would imperil the Earth, the device would have been canceled.

Another aspect of the LHC that has caused some consternation as well as excitement is its ability to reproduce some of the conditions believed to have occurred less than one-trillionth of a second after the Big Bang. Does that mean it will actually create a new cosmic explosion and potentially destroy our own universe? Hardly. Rather, it is the energy per particle that will resemble conditions from the dawn of time. Forget about astronomical bursts; by human standards the actual energies produced are small—less than a billionth of a dietary calorie per collision! For a subatomic particle, nevertheless, that’s one hefty meal. By recording and studying events at such energies, scientists will be able to understand what happened during the actual creation of the cosmos—without risking engendering a new one themselves.
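The “billionth of a dietary calorie” comparison checks out arithmetically. The constants in this sketch are standard values rather than figures from the text, and a full 14 TeV collision (two 7 TeV beams meeting head-on) is assumed:

```python
# Express a 14 TeV proton-proton collision in food-energy terms.
EV_IN_JOULES = 1.602176634e-19   # joules per electron volt
KCAL_IN_JOULES = 4184.0          # joules per dietary calorie (kilocalorie)

collision_joules = 14e12 * EV_IN_JOULES
fraction_of_calorie = collision_joules / KCAL_IN_JOULES
print(f"{fraction_of_calorie:.1e} dietary calories per collision")  # ~5e-10
```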

Unraveling the secrets of the origin and composition of the universe is hardly a new venture, although tools such as the LHC make this much easier. Philosophers and scientists have long wondered what happened during the earliest moments of time. What are the smallest things in the world and how do all of the pieces fit together? Could there be a theory of theories explaining all aspects of nature, from the tiniest particles to the cosmos itself? It is wondrous that such longstanding riddles may soon be answered.