Particle Manifestation of Quantum States Part 1

©Fernando Caracena 2014

This post argues that particles are emergent phenomena that result from the interaction of quanta and the matter matrix. Part 1 defines the problem, which is further discussed in Part 2.

The Wave-Particle Duality

Introductory courses in quantum mechanics compress the problem of the wave-particle duality into almost cartoon form, which contains the paradox but in a very simplified fashion. A good physics student goes along with the arguments, but later, on becoming more familiar with the observational results of quantum phenomena, may become somewhat skeptical of the whole business. This happened to David Bohm, a student of J. Robert Oppenheimer, who wrote an introductory book on quantum mechanics that became the standard text of its time and is now regarded as a classic. His book was based on Bohr's interpretation of quantum mechanics, called the Copenhagen interpretation. Later, however, he changed his mind and sought an alternative to quantum theory, the theory of hidden variables, which aimed at finding a way around the conceptual problems presented by quantum theory.

 

Computer Imaging and Analysis of Particle Interactions in the LHC

In practice, visualizing subatomic particles as small bits of matter that move from one point to another along well defined, continuous trajectories is an analysis tool for detecting and classifying the particles that result from very small scale interactions in high energy collisions; but it is a tool that has no place in quantum theory, being more in the nature of the particles of Newtonian mechanics. Trajectory visualization depended on the invention of chamber instruments such as the Wilson cloud chamber, the spark chamber, and the bubble chamber.

Experimentalists of the recent past relied on pictures of particle tracks from photographs of cloud chambers, spark chambers, and bubble chambers to select and analyse the events of interest. Analysis procedures involved converting several flat photographs of the same event, taken from different angles, into digital data about three dimensional structures by having a human operator trace out paths using a mouse-like device. The pace of this work, limited by human reaction times on the order of 0.1 sec, was in practice much slower even than that. Software now accomplishes tasks that in the past involved a great deal of pattern recognition by human beings.

Imaging by Solid State Devices in the LHC

Automated trajectory analysis by computers at very high speed is an essential feature of modern particle accelerators such as the LHC, which generate a huge flow of data containing the signals of elusive particles such as the Higgs boson. Today, solid state technology has brought the imaging of particle tracks to a high level of electronic sophistication, giving experimentalists computer access to mechanisms that perform a preliminary analysis of events and sort the events of interest out of the fire-hose flow of data in real time. In the Large Hadron Collider (LHC) at Geneva, Switzerland, events are "imaged" digitally and electronically by various solid state devices. The following quote from a Wikipedia article describing the ATLAS particle detector, which monitors the results of head-on collisions of protons, gives the reader an idea of the complex technology involved:

"ATLAS is 46 metres long, 25 metres in diameter, and weighs about 7,000 tonnes; it contains some 3000 km of cable.[2] The experiment is a collaboration involving roughly 3,000 physicists from over 175 institutions in 38 countries...

The Pixel Detector,[18] the innermost part of the detector, contains three concentric layers and three disks on each end-cap, with a total of 1,744 modules, each measuring 2 centimetres by 6 centimetres. The detecting material is 250 µm thick silicon. Each module contains 16 readout chips and other electronic components. The smallest unit that can be read out is a pixel (50 by 400 micrometres); there are roughly 47,000 pixels per module. The minute pixel size is designed for extremely precise tracking very close to the interaction point. In total, the Pixel Detector has over 80 million readout channels, which is about 50% of the total readout channels of the whole experiment. "

Digital Selection and Analysis of Events in the LHC

The three-dimensional "images" generated by the ATLAS detector are extremely large digital data sets. The scientific analysis of interesting events takes teams of scientists using high-speed computing far longer than it takes to generate the data sets. The task of saving the best data sets for later analysis therefore becomes an enormous problem of real-time sorting in the face of the huge data stream. The article quoted above goes on to give some idea of the data processing task faced by the experimentalists who want to extract useful information from the experimental results:

"The detector generates unmanageably large amounts of raw data: about 25 megabytes per event (raw; zero suppression reduces this to 1.6 MB), multiplied by 40 million beam crossings per second in the center of the detector. This produces a total of 1 petabyte of raw data per second.[22] The trigger system[23] uses simple information to identify, in real time, the most interesting events to retain for detailed analysis. There are three trigger levels. The first is based in electronics on the detector while the other two run primarily on a large computer cluster near the detector. The first-level trigger selects about 100,000 events per second. After the third-level trigger has been applied, a few hundred events remain to be stored for further analysis. This amount of data still requires over 100 megabytes of disk space per second – at least a petabyte each year.[24]

Offline event reconstruction is performed on all permanently stored events, turning the pattern of signals from the detector into physics objects, such as jets, photons, and leptons. Grid computing is being extensively used for event reconstruction, allowing the parallel use of university and laboratory computer networks throughout the world for the CPU-intensive task of reducing large quantities of raw data into a form suitable for physics analysis. The software for these tasks has been under development for many years, and will continue to be refined even now that the experiment is collecting data."

[The LHC and its vast collaboration openly show the deep level of commitment to pure science that is alive and growing in Europe, even as interest in everything of human value appears to be diminishing and disappearing from the United States. This signals that the U.S. is headed toward becoming a vast agricultural wasteland, devoid of most intellectual pursuits. The implications of this deserve to be addressed in a separate blog entry.]

ATLAS, "the largest detector ever built at a particle collider", images events as digital clusters of pixels in three dimensions. The geometries of the different particle tracks identify their physical characteristics. For example, charged particle tracks curve along the arc of a circle at right angles to an applied magnetic field: positively charged ones turn in one direction, and negatively charged ones in the opposite direction. The component of motion along the direction of the magnetic field feels no force from it, and so remains unaltered. A variety of energy and momentum characteristics are computed from the geometry of the particle tracks together with calorimetry data. The pieces of the puzzle are further assembled using conservation principles.
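As a rough illustration of how track geometry yields momentum, here is a sketch of the standard textbook relation (not the ATLAS reconstruction code): a track of charge q, in units of the elementary charge, bending with radius r metres in a uniform field of B tesla has a transverse momentum of roughly 0.3·q·B·r in GeV/c. The numbers below are purely illustrative.

def transverse_momentum_GeV(charge_e, B_tesla, radius_m):
    # p_T (GeV/c) from track curvature: p_T ~= 0.3 * |q| * B * r
    return 0.3 * abs(charge_e) * B_tesla * radius_m

# e.g. a singly charged track curving with a 5 m radius in a 2 T solenoid field
print(transverse_momentum_GeV(1.0, 2.0, 5.0))   # ~3.0 GeV/c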

ATLAS involves an incredible amount of fast, automated sifting through the huge data stream in real time to narrow it down to the much smaller number of events of interest to be archived in permanent storage. Even then, the data featuring events of interest come as huge digital data sets. These must be reduced by many orders of magnitude in real time by selecting the ones of interest through pattern recognition software, which narrows the archive stream to about 100 megabytes per second (10⁸ bytes per second) from an initial data stream of a petabyte per second (10¹⁵ bytes per second).
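A back-of-the-envelope check of those figures, using only the rates quoted above:

raw_rate = 1e15                      # roughly 1 petabyte per second before triggering
archived_rate = 1e8                  # roughly 100 megabytes per second sent to storage
print(raw_rate / archived_rate)      # reduction factor of about ten million to one
seconds_per_year = 3600 * 24 * 365
print(archived_rate * seconds_per_year / 1e15)   # about 3.2 petabytes archived per year

That yearly total is consistent with the article's figure of "at least a petabyte each year".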

The article quoted above succinctly summarizes the whole process as follows: "The different traces that particles leave in each layer of the detector allow for effective particle identification and accurate measurements of energy and momentum."

How are Pure Quantum States reconciled with Particle Tracks?

The short answer suggested in the post, Localization of Quantum Events, has two parts:

a) although quanta are not localizable in time and space;

nevertheless,

b) particles acquire position in time and space through their interaction with the matter matrix.

In connection with the wave-particle duality, this post discusses the two quantum physics themes above in more detail. The blog entry, "Further Mathematical treatment of Quantum States", discusses the operator mathematics used in quantum theory, which corresponds to the mathematics of Fourier integrals. Basically, the results of such analyses can be reduced to some simple observations (illustrated numerically in the sketch after this list):

1) particle-like entities can be produced by a superposition of pure quantum states;

2) a finite number of pure components superpose into an infinitely repeating pattern;

3) an isolated particle-like entity requires the superposition of an infinity of pure quantum states.
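A minimal numerical sketch of these three observations, assuming nothing beyond standard Fourier superposition; the position grid, wave numbers, and weights below are arbitrary illustrative choices, and the dense grid of k values stands in for the continuum of pure states mentioned in point 3.

import numpy as np

x = np.linspace(-60.0, 60.0, 4001)                 # position grid (arbitrary units)

# Many pure components: plane waves exp(i k x) with a Gaussian weight in k
k = np.linspace(-2.0, 2.0, 801)
w = np.exp(-(k - 1.0) ** 2 / (2 * 0.1 ** 2))       # weights spread around k0 = 1.0
psi_packet = (w[:, None] * np.exp(1j * np.outer(k, x))).sum(axis=0)

# Only three discrete components: the pattern repeats with period 2*pi/dk
k_few = np.array([0.8, 1.0, 1.2])                  # dk = 0.2, so period ~31.4
psi_periodic = np.exp(1j * np.outer(k_few, x)).sum(axis=0)

# |psi_packet| is a single localized, particle-like lump, while |psi_periodic|
# shows identical lumps repeating across the grid; plot both (e.g. with
# matplotlib) to see the contrast.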

Thus quantum theory reveals that we live in a holistic universe, where the totality of everything feeds back to make individuality possible. Not only do time and space emerge from processes of the whole of creation, but individual objects also emerge from the totality of processes of the universe; and this is all the more remarkable because the universe, although vast, is not infinite.

Two Models of Particles

In working through high energy experiments involving all kinds of elementary particle interactions, physicists have avoided the philosophical complexity of trying to reconcile the idea of particles travelling along characteristic trajectories with that of waves propagating outward. There is a hand-waving explanation: when the wavelength of a wave train is small enough compared to the size of obstacles in its path, the waves move along paths described by geometrical optics, which are more like particle trajectories. In practice, physicists switch from the particle model in one part of analysing experimental data to the wave model in defining and unscrambling the quantum processes involved in these interactions.
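A rough numerical illustration of that limit (my own numbers, not from the post): the de Broglie wavelength λ = h/p of a proton at LHC energies is so short compared with any detector structure that its wave follows essentially ray-like, particle-like paths.

h = 6.626e-34                        # Planck constant, J*s
c = 2.998e8                          # speed of light, m/s
eV = 1.602e-19                       # joules per electron volt
p = 7e12 * eV / c                    # momentum of ~7 TeV/c converted to kg*m/s
print(h / p)                         # de Broglie wavelength ~1.8e-19 m, far below
                                     # the micrometre scale of the pixel detector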

Pushing the Quantum Explanation too far into Many Worlds

If you push quantum theory to its limits in trying to account for all of reality, you wind up with a hydra theory of reality, a Medusa that turns us all into stone. Hugh Everett (1930–1982) found such a logically consistent interpretation that included everything, in what has since been called the "Many-worlds interpretation". This interpretation has a huge disconnect from common sense in that it bifurcates the universe at every quantum detection point, creating a myriad of universe clones that climb geometrically in number at every turn. Let me know when we reach infinity!

In a sense the Many-worlds interpretation is an ingenious explanation that unites the rigidity of the past with the indefiniteness of the future—but at what a cost! Nevertheless, Hugh Everett's idea may have provided us with a lead as to where to look for explanations: the void in our thinking, missed by modern theories, is some theory of reality itself. That is the burr in the saddle of modern thought, which is the focus of the second part of this series.

 
