Here Today... Gone To Hell! | Message Board



Author Topic: Some Scientific Quandaries  (Read 5935 times)
MCT
Guest
« on: April 20, 2005, 10:24:22 AM »

http://www.newscientist.com/channel/space/mg18524911.600

13 things that do not make sense
19 March 2005
NewScientist.com news service
Michael Brooks

1 - The placebo effect:

DON'T try this at home. Several times a day, for several days, you induce pain in someone. You control the pain with morphine until the final day of the experiment, when you replace the morphine with saline solution. Guess what? The saline takes the pain away.

This is the placebo effect: somehow, sometimes, a whole lot of nothing can be very powerful. Except it's not quite nothing. When Fabrizio Benedetti of the University of Turin in Italy carried out the above experiment, he added a final twist by adding naloxone, a drug that blocks the effects of morphine, to the saline. The shocking result? The pain-relieving power of saline solution disappeared.

So what is going on? Doctors have known about the placebo effect for decades, and the naloxone result seems to show that the placebo effect is somehow biochemical. But apart from that, we simply don't know.

Benedetti has since shown that a saline placebo can also reduce tremors and muscle stiffness in people with Parkinson's disease (Nature Neuroscience, vol 7, p 587). He and his team measured the activity of neurons in the patients' brains as they administered the saline. They found that individual neurons in the subthalamic nucleus (a common target for surgical attempts to relieve Parkinson's symptoms) began to fire less often when the saline was given, and with fewer "bursts" of firing - another feature associated with Parkinson's. The neuron activity decreased at the same time as the symptoms improved: the saline was definitely doing something.

We have a lot to learn about what is happening here, Benedetti says, but one thing is clear: the mind can affect the body's biochemistry. "The relationship between expectation and therapeutic outcome is a wonderful model to understand mind-body interaction," he says. Researchers now need to identify when and where placebo works. There may be diseases in which it has no effect. There may be a common mechanism in different illnesses. As yet, we just don't know.

2 - The horizon problem:

OUR universe appears to be unfathomably uniform. Look across space from one edge of the visible universe to the other, and you'll see that the microwave background radiation filling the cosmos is at the same temperature everywhere. That may not seem surprising until you consider that the two edges are nearly 28 billion light years apart and our universe is only 14 billion years old.
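
(Not from the article: a quick back-of-the-envelope check of that arithmetic in Python. It only compares the maximum light-travel distance with the quoted separation and ignores the details of cosmic expansion.)

Code:
# Rough horizon-problem arithmetic (illustrative only; ignores expansion details).
age_of_universe_yr = 14e9        # years
separation_ly = 28e9             # light years between opposite edges of the visible universe
max_light_travel_ly = age_of_universe_yr * 1.0   # light covers one light year per year
print(max_light_travel_ly < separation_ly)       # True: no signal could have linked the two edges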

Nothing can travel faster than the speed of light, so there is no way heat radiation could have travelled between the two horizons to even out the hot and cold spots created in the big bang and leave the thermal equilibrium we see now.

This "horizon problem" is a big headache for cosmologists, so big that they have come up with some pretty wild solutions. "Inflation", for example.

You can solve the horizon problem by having the universe expand ultra-fast for a time, just after the big bang, blowing up by a factor of 10^50 in 10^-33 seconds. But is that just wishful thinking? "Inflation would be an explanation if it occurred," says University of Cambridge astronomer Martin Rees. The trouble is that no one knows what could have made that happen.

So, in effect, inflation solves one mystery only to invoke another. A variation in the speed of light could also solve the horizon problem - but this too is impotent in the face of the question "why?" In scientific terms, the uniform temperature of the background radiation remains an anomaly.

3 - Ultra-energetic cosmic rays:

FOR more than a decade, physicists in Japan have been seeing cosmic rays that should not exist. Cosmic rays are particles - mostly protons but sometimes heavy atomic nuclei - that travel through the universe at close to the speed of light. Some cosmic rays detected on Earth are produced in violent events such as supernovae, but we still don't know the origins of the highest-energy particles, which are the most energetic particles ever seen in nature. But that's not the real mystery.

As cosmic-ray particles travel through space, they lose energy in collisions with the low-energy photons that pervade the universe, such as those of the cosmic microwave background radiation. Einstein's special theory of relativity dictates that any cosmic rays reaching Earth from a source outside our galaxy will have suffered so many energy-shedding collisions that their maximum possible energy is 5 × 10^19 electronvolts. This is known as the Greisen-Zatsepin-Kuzmin limit.
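
(Not from the article: a very rough sketch of where that limit comes from. The simplified head-on threshold estimate below, using the Delta(1232) resonance mass and an assumed CMB photon energy of about 10^-3 eV, lands within a factor of a few of the quoted 5 × 10^19 eV; the published figure comes from a full calculation over the whole photon spectrum.)

Code:
# Very rough head-on GZK threshold estimate (illustrative only).
m_delta = 1232e6    # Delta(1232) resonance mass-energy, eV
m_p     = 938e6     # proton mass-energy, eV
E_cmb   = 1e-3      # assumed energy of an energetic CMB photon, eV (thermal peak is ~6e-4 eV)
E_threshold = (m_delta**2 - m_p**2) / (4 * E_cmb)
print(f"{E_threshold:.1e} eV")   # ~1.6e20 eV, the same ballpark as the quoted 5e19 eV limit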

Over the past decade, however, the University of Tokyo's Akeno Giant Air Shower Array - 111 particle detectors spread out over 100 square kilometres - has detected several cosmic rays above the GZK limit. In theory, they can only have come from within our galaxy, avoiding an energy-sapping journey across the cosmos. However, astronomers can find no source for these cosmic rays in our galaxy. So what is going on?

One possibility is that there is something wrong with the Akeno results. Another is that Einstein was wrong. His special theory of relativity says that space is the same in all directions, but what if particles found it easier to move in certain directions? Then the cosmic rays could retain more of their energy, allowing them to beat the GZK limit.

Physicists at the Pierre Auger experiment in Mendoza, Argentina, are now working on this problem. Using 1600 detectors spread over 3000 square kilometres, Auger should be able to determine the energies of incoming cosmic rays and shed more light on the Akeno results.

Alan Watson, an astronomer at the University of Leeds, UK, and spokesman for the Pierre Auger project, is already convinced there is something worth following up here. "I have no doubts that events above 10^20 electronvolts exist. There are sufficient examples to convince me," he says. The question now is, what are they? How many of these particles are coming in, and what direction are they coming from? Until we get that information, there's no telling how exotic the true explanation could be.

4 - Belfast homeopathy results:

MADELEINE Ennis, a pharmacologist at Queen's University, Belfast, was the scourge of homeopathy. She railed against its claims that a chemical remedy could be diluted to the point where a sample was unlikely to contain a single molecule of anything but water, and yet still have a healing effect. Until, that is, she set out to prove once and for all that homeopathy was bunkum.

In her most recent paper, Ennis describes how her team looked at the effects of ultra-dilute solutions of histamine on human white blood cells involved in inflammation. These "basophils" release histamine when the cells are under attack. Once released, the histamine stops them releasing any more. The study, replicated in four different labs, found that homeopathic solutions - so dilute that they probably didn't contain a single histamine molecule - worked just like histamine. Ennis might not be happy with the homeopaths' claims, but she admits that an effect cannot be ruled out.

So how could it happen? Homeopaths prepare their remedies by dissolving things like charcoal, deadly nightshade or spider venom in ethanol, and then diluting this "mother tincture" in water again and again. No matter what the level of dilution, homeopaths claim, the original remedy leaves some kind of imprint on the water molecules. Thus, however dilute the solution becomes, it is still imbued with the properties of the remedy.
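
(Not from the article: a quick Python illustration of why such solutions probably contain no molecules of the original substance. The "30C" scheme of thirty successive 1:100 dilutions is a common homeopathic preparation used here only as an assumed example.)

Code:
# Expected number of solute molecules surviving serial 1:100 dilutions (illustrative).
avogadro = 6.022e23                 # molecules per mole
starting_molecules = avogadro       # assume one mole of the "mother tincture" substance
dilutions = 30                      # an assumed "30C" remedy: thirty 1:100 dilutions
expected_remaining = starting_molecules * (1 / 100) ** dilutions
print(f"{expected_remaining:.1e}")  # ~6e-37: effectively zero chance a single molecule survives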

You can understand why Ennis remains sceptical. And it remains true that no homeopathic remedy has ever been shown to work in a large randomised placebo-controlled clinical trial. But the Belfast study (Inflammation Research, vol 53, p 181) suggests that something is going on. "We are," Ennis says in her paper, "unable to explain our findings and are reporting them to encourage others to investigate this phenomenon." If the results turn out to be real, she says, the implications are profound: we may have to rewrite physics and chemistry.

5 - Dark matter:

TAKE our best understanding of gravity, apply it to the way galaxies spin, and you'll quickly see the problem: the galaxies should be falling apart. Galactic matter orbits around a central point because its mutual gravitational attraction creates centripetal forces. But there is not enough mass in the galaxies to produce the observed spin.
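
(Not from the article: a minimal sketch of that mismatch, using made-up round numbers. If nearly all of a galaxy's visible mass sits inside some inner radius, Newtonian gravity predicts orbital speeds falling off as 1/sqrt(r) beyond it, whereas measured rotation curves stay roughly flat.)

Code:
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_visible = 1e41       # assumed visible mass inside the inner galaxy, kg (illustrative)
kpc = 3.086e19         # metres per kiloparsec

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * kpc
    v_kepler = math.sqrt(G * M_visible / r) / 1000   # km/s predicted from visible mass alone
    print(f"r = {r_kpc:2d} kpc: predicted ~{v_kepler:4.0f} km/s (observed curves stay roughly flat)")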

Vera Rubin, an astronomer working at the Carnegie Institution's department of terrestrial magnetism in Washington DC, spotted this anomaly in the late 1970s. The best response from physicists was to suggest there is more stuff out there than we can see. The trouble was, nobody could explain what this "dark matter" was.

And they still can't. Although researchers have made many suggestions about what kind of particles might make up dark matter, there is no consensus. It's an embarrassing hole in our understanding. Astronomical observations suggest that dark matter must make up about 90 per cent of the mass in the universe, yet we are astonishingly ignorant of what that 90 per cent is.

Maybe we can't work out what dark matter is because it doesn't actually exist. That's certainly the way Rubin would like it to turn out. "If I could have my pick, I would like to learn that Newton's laws must be modified in order to correctly describe gravitational interactions at large distances," she says. "That's more appealing than a universe filled with a new kind of sub-nuclear particle."

MCT
« Reply #1 on: April 20, 2005, 10:26:07 AM »

6 - Viking's methane:

JULY 20, 1976. Gilbert Levin is on the edge of his seat. Millions of kilometres away on Mars, the Viking landers have scooped up some soil and mixed it with carbon-14-labelled nutrients. The mission's scientists have all agreed that if Levin's instruments on board the landers detect emissions of carbon-14-containing methane from the soil, then there must be life on Mars.

Viking reports a positive result. Something is ingesting the nutrients, metabolising them, and then belching out gas laced with carbon-14.

So why no party?

Because another instrument, designed to identify organic molecules considered essential signs of life, found nothing. Almost all the mission scientists erred on the side of caution and declared Viking's discovery a false positive. But was it?

The arguments continue to rage, but results from NASA's latest rovers show that the surface of Mars was almost certainly wet in the past and therefore hospitable to life. And there is plenty more evidence where that came from, Levin says. "Every mission to Mars has produced evidence supporting my conclusion. None has contradicted it."

Levin stands by his claim, and he is no longer alone. Joe Miller, a cell biologist at the University of Southern California in Los Angeles, has re-analysed the data and he thinks that the emissions show evidence of a circadian cycle. That is highly suggestive of life.

Levin is petitioning ESA and NASA to fly a modified version of his mission to look for "chiral" molecules. These come in left or right-handed versions: they are mirror images of each other. While biological processes tend to produce molecules that favour one chirality over the other, non-living processes create left and right-handed versions in equal numbers. If a future mission to Mars were to find that Martian "metabolism" also prefers one chiral form of a molecule to the other, that would be the best indication yet of life on Mars.

7 - Tetraneutrons:

FOUR years ago, a particle accelerator in France detected six particles that should not exist. They are called tetraneutrons: four neutrons that are bound together in a way that defies the laws of physics.

Francisco Miguel Marqués and colleagues at the Ganil accelerator in Caen are now gearing up to do it again. If they succeed, these clusters may oblige us to rethink the forces that hold atomic nuclei together.

The team fired beryllium nuclei at a small carbon target and analysed the debris that shot into surrounding particle detectors. They expected to see evidence for four separate neutrons hitting their detectors. Instead the Ganil team found just one flash of light in one detector. And the energy of this flash suggested that four neutrons were arriving together at the detector. Of course, their finding could have been an accident: four neutrons might just have arrived in the same place at the same time by coincidence. But that's ridiculously improbable.

Not as improbable as tetraneutrons, some might say, because in the standard model of particle physics tetraneutrons simply can't exist. According to the Pauli exclusion principle, not even two protons or neutrons in the same system can have identical quantum properties. In fact, the strong nuclear force that would hold them together is tuned in such a way that it can't even hold two lone neutrons together, let alone four. Marqués and his team were so bemused by their result that they buried the data in a research paper that was ostensibly about the possibility of finding tetraneutrons in the future (Physical Review C, vol 65, p 44006).

And there are still more compelling reasons to doubt the existence of tetraneutrons. If you tweak the laws of physics to allow four neutrons to bind together, all kinds of chaos ensues (Journal of Physics G, vol 29, L9). It would mean that the mix of elements formed after the big bang was inconsistent with what we now observe and, even worse, the elements formed would have quickly become far too heavy for the cosmos to cope. "Maybe the universe would have collapsed before it had any chance to expand," says Natalia Timofeyuk, a theorist at the University of Surrey in Guildford, UK.

There are, however, a couple of holes in this reasoning. Established theory does allow the tetraneutron to exist - though only as a ridiculously short-lived particle. "This could be a reason for four neutrons hitting the Ganil detectors simultaneously," Timofeyuk says. And there is other evidence that supports the idea of matter composed of multiple neutrons: neutron stars. These bodies, which contain an enormous number of bound neutrons, suggest that as yet unexplained forces come into play when neutrons gather en masse.

8 - The Pioneer anomaly:

THIS is a tale of two spacecraft. Pioneer 10 was launched in 1972; Pioneer 11 a year later. By now both craft should be drifting off into deep space with no one watching. However, their trajectories have proved far too fascinating to ignore.

That's because something has been pulling - or pushing - on them, causing them to slow down more than expected. The resulting acceleration is tiny, less than a nanometre per second per second. That's equivalent to just one ten-billionth of the gravity at Earth's surface, but it is enough to have shifted Pioneer 10 some 400,000 kilometres off track. NASA lost touch with Pioneer 11 in 1995, but up to that point it was experiencing exactly the same deviation as its sister probe. So what is causing it?
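
(Not from the article: a rough check that the quoted numbers hang together, assuming the effect acts over roughly 30 years of cruise. An acceleration just under a nanometre per second per second is indeed about one ten-billionth of Earth's surface gravity, and over three decades it accumulates to a few hundred thousand kilometres of drift.)

Code:
a = 8.7e-10                    # anomalous acceleration, m/s^2 (just under a nanometre per second per second)
g = 9.81                       # Earth surface gravity, m/s^2
years = 30                     # assumed duration of the effect, roughly Pioneer 10's mission span
t = years * 3.156e7            # seconds
print(a / g)                   # ~8.9e-11, about one ten-billionth of g
print(0.5 * a * t**2 / 1000)   # ~390,000 km of accumulated drift, matching the ~400,000 km quoted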

Nobody knows. Some possible explanations have already been ruled out, including software errors, the solar wind or a fuel leak. If the cause is some gravitational effect, it is not one we know anything about. In fact, physicists are so completely at a loss that some have resorted to linking this mystery with other inexplicable phenomena.

Bruce Bassett of the University of Portsmouth, UK, has suggested that the Pioneer conundrum might have something to do with variations in alpha, the fine structure constant (see "Not so constant constants", page 37). Others have talked about it as arising from dark matter - but since we don't know what dark matter is, that doesn't help much either. "This is all so maddeningly intriguing," says Michael Martin Nieto of the Los Alamos National Laboratory. "We only have proposals, none of which has been demonstrated."

Nieto has called for a new analysis of the early trajectory data from the craft, which he says might yield fresh clues. But to get to the bottom of the problem what scientists really need is a mission designed specifically to test unusual gravitational effects in the outer reaches of the solar system. Such a probe would cost between $300 million and $500 million and could piggyback on a future mission to the outer reaches of the solar system (www.arxiv.org/gr-qc/0411077).

"An explanation will be found eventually," Nieto says. "Of course I hope it is due to new physics - how stupendous that would be. But once a physicist starts working on the basis of hope he is heading for a fall." Disappointing as it may seem, Nieto thinks the explanation for the Pioneer anomaly will eventually be found in some mundane effect, such as an unnoticed source of heat on board the craft.

9 - Dark energy:

IT IS one of the most famous, and most embarrassing, problems in physics. In 1998, astronomers discovered that the universe is expanding at ever faster speeds. It's an effect still searching for a cause - until then, everyone thought the universe's expansion was slowing down after the big bang. "Theorists are still floundering around, looking for a sensible explanation," says cosmologist Katherine Freese of the University of Michigan, Ann Arbor. "We're all hoping that upcoming observations of supernovae, of clusters of galaxies and so on will give us more clues."

One suggestion is that some property of empty space is responsible - cosmologists call it dark energy. But all attempts to pin it down have fallen woefully short. It's also possible that Einstein's theory of general relativity may need to be tweaked when applied to the very largest scales of the universe. "The field is still wide open," Freese says.

10 - The Kuiper cliff:

IF YOU travel out to the far edge of the solar system, into the frigid wastes beyond Pluto, you'll see something strange. Suddenly, after passing through the Kuiper belt, a region of space teeming with icy rocks, there's nothing.

Astronomers call this boundary the Kuiper cliff, because the density of space rocks drops off so steeply. What caused it? The only answer seems to be a 10th planet. We're not talking about Quaoar or Sedna: this is a massive object, as big as Earth or Mars, that has swept the area clean of debris.

The evidence for the existence of "Planet X" is compelling, says Alan Stern, an astronomer at the Southwest Research Institute in Boulder, Colorado. But although calculations show that such a body could account for the Kuiper cliff (Icarus, vol 160, p 32), no one has ever seen this fabled 10th planet.

There's a good reason for that. The Kuiper belt is just too far away for us to get a decent view. We need to get out there and have a look before we can say anything about the region. And that won't be possible for another decade, at least. NASA's New Horizons probe, which will head out to Pluto and the Kuiper belt, is scheduled for launch in January 2006. It won't reach Pluto until 2015, so if you are looking for an explanation of the vast, empty gulf of the Kuiper cliff, watch this space.

MCT
« Reply #2 on: April 20, 2005, 10:27:34 AM »

11 - The Wow signal:

IT WAS 37 seconds long and came from outer space. On 15 August 1977 it caused astronomer Jerry Ehman, then of Ohio State University in Columbus, to scrawl "Wow!" on the printout from Big Ear, Ohio State's radio telescope in Delaware. And 28 years later no one knows what created the signal. "I am still waiting for a definitive explanation that makes sense," Ehman says.

Coming from the direction of Sagittarius, the pulse of radiation was confined to a narrow range of radio frequencies around 1420 megahertz. This frequency is in a part of the radio spectrum in which all transmissions are prohibited by international agreement. Natural sources of radiation, such as the thermal emissions from planets, usually cover a much broader sweep of frequencies. So what caused it?
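
(Not from the article: for context, 1420 MHz corresponds to a wavelength of about 21 cm, the emission line of neutral hydrogen, which is why that band is set aside for radio astronomy. The arithmetic is one line.)

Code:
c = 2.998e8          # speed of light, m/s
f = 1420e6           # frequency of the Wow signal, Hz
print(c / f * 100)   # ~21 cm, the neutral-hydrogen line wavelength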

The nearest star in that direction is 220 light years away. If that is where it came from, it would have had to be a pretty powerful astronomical event - or an advanced alien civilisation using an astonishingly large and powerful transmitter.

The fact that hundreds of sweeps over the same patch of sky have found nothing like the Wow signal doesn't mean it's not aliens. When you consider the fact that the Big Ear telescope covers only one-millionth of the sky at any time, and an alien transmitter would also likely beam out over the same fraction of sky, the chances of spotting the signal again are remote, to say the least.

Others think there must be a mundane explanation. Dan Werthimer, chief scientist for the SETI@home project, says the Wow signal was almost certainly pollution: radio-frequency interference from Earth-based transmissions. "We've seen many signals like this, and these sorts of signals have always turned out to be interference," he says. The debate continues.

12 - Not-so-constant constants:

IN 1997 astronomer John Webb and his team at the University of New South Wales in Sydney analysed the light reaching Earth from distant quasars. On its 12-billion-year journey, the light had passed through interstellar clouds of metals such as iron, nickel and chromium, and the researchers found these atoms had absorbed some of the photons of quasar light - but not the ones they were expecting.

If the observations are correct, the only vaguely reasonable explanation is that a constant of physics called the fine structure constant, or alpha, had a different value at the time the light passed through the clouds.

But that's heresy. Alpha is an extremely important constant that determines how light interacts with matter - and it shouldn't be able to change. Its value depends on, among other things, the charge on the electron, the speed of light and Planck's constant. Could one of these really have changed?
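
(Not from the article: concretely, alpha combines those quantities as alpha = e^2 / (4 * pi * epsilon_0 * hbar * c), which works out to roughly 1/137. A quick check in Python, using standard values of the constants:)

Code:
import math

e    = 1.602176634e-19     # electron charge, C
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
hbar = 1.054571817e-34     # reduced Planck constant, J s
c    = 2.99792458e8        # speed of light, m/s

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(alpha, 1 / alpha)    # ~0.0072973..., ~137.036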

No one in physics wanted to believe the measurements. Webb and his team have been trying for years to find an error in their results. But so far they have failed.

Webb's are not the only results that suggest something is missing from our understanding of alpha. A recent analysis of the only known natural nuclear reactor, which was active nearly 2 billion years ago at what is now Oklo in Gabon, also suggests something about light's interaction with matter has changed.

The ratio of certain radioactive isotopes produced within such a reactor depends on alpha, and so looking at the fission products left behind in the ground at Oklo provides a way to work out the value of the constant at the time of their formation. Using this method, Steve Lamoreaux and his colleagues at the Los Alamos National Laboratory in New Mexico suggest that alpha may have decreased by more than 4 per cent since Oklo started up (Physical Review D, vol 69, p 121701).

There are gainsayers who still dispute any change in alpha. Patrick Petitjean, an astronomer at the Institute of Astrophysics in Paris, led a team that analysed quasar light picked up by the Very Large Telescope (VLT) in Chile and found no evidence that alpha has changed. But Webb, who is now looking at the VLT measurements, says that they require a more complex analysis than Petitjean's team has carried out. Webb's group is working on that now, and may be in a position to declare the anomaly resolved - or not - later this year.

"It's difficult to say how long it's going to take," says team member Michael Murphy of the University of Cambridge. "The more we look at these new data, the more difficulties we see." But whatever the answer, the work will still be valuable. An analysis of the way light passes through distant molecular clouds will reveal more about how the elements were produced early in the universe's history.

13 - Cold fusion:

AFTER 16 years, it's back. In fact, cold fusion never really went away. Over a 10-year period from 1989, US navy labs ran more than 200 experiments to investigate whether nuclear reactions generating more energy than they consume - supposedly only possible inside stars - can occur at room temperature. Numerous researchers have since pronounced themselves believers.

With controllable cold fusion, many of the world's energy problems would melt away: no wonder the US Department of Energy is interested. In December, after a lengthy review of the evidence, it said it was open to receiving proposals for new cold fusion experiments.

That's quite a turnaround. The DoE's first report on the subject, published 15 years ago, concluded that the original cold fusion results, produced by Martin Fleischmann and Stanley Pons of the University of Utah and unveiled at a press conference in 1989, were impossible to reproduce, and thus probably false.

The basic claim of cold fusion is that dunking palladium electrodes into heavy water - in which oxygen is combined with the hydrogen isotope deuterium - can release a large amount of energy. Placing a voltage across the electrodes supposedly allows deuterium nuclei to move into palladium's crystal lattice, enabling them to overcome their natural repulsion and fuse together, releasing a blast of energy. The snag is that fusion at room temperature is deemed impossible by every accepted scientific theory.
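
(Not from the article, which doesn't specify the reaction being claimed: for reference, ordinary deuterium-deuterium fusion proceeds through two roughly equal branches, each releasing a few MeV.)

D + D -> helium-3 + neutron   (Q = 3.27 MeV)
D + D -> tritium + proton     (Q = 4.03 MeV)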

That doesn't matter, according to David Nagel, an engineer at George Washington University in Washington DC. Superconductors took 40 years to explain, he points out, so there's no reason to dismiss cold fusion. "The experimental case is bulletproof," he says. "You can't make it go away."

(From issue 2491 of New Scientist magazine, 19 March 2005, page 30)
*Izzy*
« Reply #3 on: April 20, 2005, 11:37:37 AM »

Interesting, a couple I don't understand though

 smoking Izzy? smoking
Mr. Dick Purple
« Reply #4 on: April 20, 2005, 02:20:37 PM »

I felt quite dizzy when I read it all.  nervous
McGann
« Reply #5 on: April 20, 2005, 03:39:53 PM »

This may be the most interesting thread I've ever seen, anywhere.  Quantum mechanics and astrophysics are pet interests of mine.

Thanks for posting, MCT!!!

/Mike
D
« Reply #6 on: April 20, 2005, 04:37:03 PM »

I've always believed in the placebo effect

They used to say that asthma medicine was nothing but sugar water (I heard that before the Stephen King movie "IT") - that u think u are getting a dose of medicine, therefore u calm down and get ok.

I wonder how many psychotic medications are placebos?

they say matter cant be created or destroyed, so after we die, where does our energy go? it has to go somewhere, it just doesnt cease to exist.
Gunner80
« Reply #7 on: April 20, 2005, 08:56:47 PM »

Quote from: D
they say matter cant be created or destroyed, so after we die, where does our energy go? it has to go somewhere, it just doesnt cease to exist.

Our bodies create energy that keeps us alive and when we die it stops.
Narcissa
« Reply #8 on: April 20, 2005, 09:08:23 PM »

Quote from: D
after we die, where does our energy go?

the last of the body's energy is expired as heat, into the atmosphere, that's why dead bodies are cold.
MCT
« Reply #9 on: April 21, 2005, 06:28:46 PM »

Quote from: McGann
Quantum mechanics and astrophysics are pet interests of mine.

Yeah, the quantum side of things is pretty interesting. That, coupled with your post, reminded me of a series of fairly recent articles regarding a newly emerging field of science - quantum astronomy. I'll throw 'em in just in case you haven't read them, and of course for the benefit of anyone who might actually want to; read them that is...

http://www.space.com/searchforlife/quantum_astronomy_041111.html

By Laurance R. Doyle
SETI Institute
posted: 11 November 2004

This is a series of four articles each with a separate explanation of different quantum phenomena. Each of the four articles is a piece of a mosaic and so every one is needed to understand the final explanation of the quantum astronomy experiment we propose, possibly using the Allen Telescope Array and the narrow-band radio-wave detectors being built by the SETI Institute and the University of California, Berkeley.

With the success of recent movies such as "What the &$@# Do We Know?" and the ongoing -- and continuously surprising -- revelations of the unexpected nature of underlying reality that have been unfolding in quantum physics for three-quarters of a century now, it may not be particularly surprising that the quantum nature of the universe may actually now be making in-roads into what has previously been considered classical observational astronomy. Quantum physics has been applied for decades to cosmology, and the strange "singularity" physics of black holes. It is also applicable to macroscopic effects such as Bose-Einstein condensates (extremely cold conglomerations of material that behave in non-classical ways) as well as neutron stars and even white dwarfs (which are kept from collapse, not by nuclear fusion explosions but by the Pauli Exclusion Principle -- a process whereby no two elementary particles can have the same quantum state and therefore, in a sense, cannot collapse into each other).

Well, congratulations if you have gotten through the first paragraph of this essay. I can't honestly tell you that things will get better, but I can say that to the intrepid reader things should get even more interesting. The famous quantum physicist Richard Feynman once said essentially that anyone who thought he understood quantum physics did not understand it enough to understand that he did not actually understand it! In other words, no classical interpretation of quantum physics is the correct one. Parallel evolving universes (one being created every time a quantum-level choice is made), faster-than-light interconnectedness underlying everything, nothing existing until it is observed, these are a few of the interpretations of quantum reality that are consistent with the experiments and observations.

There are many ways we could go now in examining quantum results. If conscious observation is needed for the creation of an electron (this is one aspect of the Copenhagen Interpretation, the most popular version of quantum physics interpretations), then ideas about the origin of consciousness must be revised. If electrons in the brain create consciousness, but electrons require consciousness to exist, one is apparently caught in circular reasoning at best. But for this essay, we shall not discuss quantum biology. Another path we might go down would be the application of quantum physics to cosmology -- either the Inflationary origin of the universe, or the Hawking evaporation of black holes, as examples. But our essay is not about this vast field either. Today we will discuss the scaling of the simple double-slit laboratory experiment to cosmic distances, what can truly be called, "quantum astronomy."

The laboratory double-slit experiment contains a lot of the best aspects of the weirdness of quantum physics. It can involve various kinds of elementary particles, but for today's discussion we will be talking solely about light -- the particle nature of which is called the "photon." A light shining through a small hole or slit (like in a pinhole camera) creates a spot of light on the screen (or film, or detector). However, light shone through two slits that are close together creates not two spots on the screen, but rather a series of alternating bright and dark lines with the brightest line in the exact middle of this interference pattern. This shows that light is a wave since such a pattern results from the interference of the waves coming from slit one (which we shall call "A") with the waves coming from slit two (which we shall call "B"). When peaks of waves from light source A meet peaks from light source B, they add and the bright lines are produced. Not far to the left and right of this brightness peak, however, peaks from A meet troughs from B (because the crests of the light waves are no longer aligned) and a dark line is produced. This alternates on either side until the visibility of the lines fades out. This pattern is simply called an "interference pattern" and Thomas Young used this experiment to demonstrate the wave nature of light in the early 19th Century.
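
(Not from the article: a small illustrative calculation with made-up slit parameters. For slit separation d and wavelength lambda, bright lines appear where d*sin(theta) equals a whole number of wavelengths.)

Code:
import math

wavelength = 633e-9        # red laser light, m (assumed illustrative value)
d = 0.1e-3                 # slit separation, m (assumed illustrative value)
L = 1.0                    # distance from slits to screen, m

for m in range(4):                         # m = 0 is the central bright line
    theta = math.asin(m * wavelength / d)  # bright fringe where d*sin(theta) = m*wavelength
    print(f"bright line {m}: {L * math.tan(theta) * 1000:.2f} mm from centre")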

However, in the year 1900 physicist Max Planck showed that certain other effects in physics could only be explained by light being a particle. Many experiments followed to also show that light was indeed also a particle (a "photon") and Albert Einstein was awarded the Nobel Prize in physics in 1921 for his work showing that the particle nature of light could explain the "photoelectric effect." This was an experiment whereby low energy (red) light, when shining onto a photoelectric material, caused the material to emit low energy (slow moving) electrons, while high energy (blue) light caused the same material to emit high energy (fast moving) electrons. However, lots of red light only ever produced more low energy electrons, never any high-energy electrons. In other words, the energy could not be "saved up" but rather must be absorbed by the electrons in the photoelectric material individually. The conclusion was that light came in packets, little quantities, and behaved thus as a particle as well as a wave.
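
(Not from the article: a numerical illustration of that point, assuming a hypothetical photoelectric surface with a work function of 1.5 eV. Each photon hands over E = hf; only photons whose energy exceeds the work function can free an electron, and extra red photons never add up to one blue one.)

Code:
h = 6.626e-34            # Planck constant, J s
c = 2.998e8              # speed of light, m/s
eV = 1.602e-19           # joules per electronvolt
work_function = 1.5      # assumed work function of a hypothetical surface, eV

for colour, wavelength in (("red", 700e-9), ("blue", 400e-9)):
    photon_energy = h * c / wavelength / eV            # energy per photon, eV
    kinetic = max(0.0, photon_energy - work_function)  # maximum electron energy, eV
    print(f"{colour}: photon {photon_energy:.2f} eV -> electron up to {kinetic:.2f} eV")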

So light is both a particle and a wave. OK, kind of unexpected (like Jell-O) but perhaps not totally weird. But the double slit experiment had another trick up its sleeve. One could send one photon (or "quantum" of energy) through a single slit at a time, with a sufficiently long interval in between, and eventually a spot builds up that looks just like the one produced when a very intense (many photons) light was sent through the slit. But then a strange thing happened. When one sends a single photon at a time (waiting between each laser pulse, for example) toward the screen when both slits are open, rather than two spots eventually building up opposite the two slit openings, what eventually builds up is the interference pattern of alternating bright and dark lines! Hmm... how can this be, if only one photon was sent through the apparatus at a time?

The answer is that each individual photon must -- in order to have produced an interference pattern -- have gone through both slits! This, the simplest of quantum weirdness experiments, has been the basis of many of the unintuitive interpretations of quantum physics. We can see, perhaps, how physicists might conclude, for example, that a particle of light is not a particle until it is measured at the screen. It turns out that the particle of light is rather a wave before it is measured. But it is not a wave in the ocean-wave sense. It is not a wave of matter but rather, it turns out that it is apparently a wave of probability. That is, the elementary particles making up the trees, people, and planets -- what we see around us -- are apparently just distributions of likelihood until they are measured (that is, observed). So much for the Victorian view of solid matter!

The shock of matter being largely empty space may have been extreme enough -- if an atom were the size of a huge cathedral, then the electrons would be dust particles floating around at all distances inside the building, while the nucleus, or center of the atom, would be smaller than a sugar cube. But with quantum physics, even this tenuous result would be superseded by the atom itself not really being anything that exists until it is measured. One might rightly ask, then, what does it mean to measure something? And this brings us to the Uncertainty Principle first discovered by Werner Heisenberg. Dr. Heisenberg wrote, "Some physicists would prefer to come back to the idea of an objective real world whose smallest parts exist objectively in the same sense as stones or trees exist independently of whether we observe them. This however is impossible."

Perhaps that is enough to think about for now. So in the next essay we will examine, in some detail, the uncertainty principle as it relates to what is called "the measurement problem" in quantum physics. We shall find that the uncertainty principle will be the key to performing the double-slit experiment over astronomical distances, and demonstrating that quantum effects are not just microscopic phenomena, but can be extended across the cosmos.
MCT
« Reply #10 on: April 21, 2005, 06:30:37 PM »

http://www.space.com/searchforlife/quantum_astronomy_041118.html

By Laurance R. Doyle
Astronomer, SETI Institute
posted: 18 November 2004

This is the second article in a series of four articles each with a separate explanation of different quantum phenomena. Each article is a piece of a mosaic, so every one is needed to understand the final explanation of the quantum astronomy experiment we propose, possibly using the Allen Telescope Array and the narrow-band radio-wave detectors being built by the SETI Institute and the University of California, Berkeley.

In the first article, we discussed the double-slit experiment and how a quantum particle of light (a photon) can be thought of as a wave of probability until it is actually detected. In this article we shall examine another feature of quantum physics that places fundamental constraints on what can actually be measured, a basic property first discovered by Werner Heisenberg, whose simplest form is known as the "Heisenberg Uncertainty Principle."

In scientific circles we are perhaps used to thinking of the word "principle" as "order", "certainty", or "a law of the universe". So the term "uncertainty principle" may strike us as something akin to the terms "jumbo shrimp" or "guest host" in the sense of juxtaposing opposites. However, the uncertainty principle is a fundamental property of quantum physics initially discovered through somewhat classical reasoning -- a classically based logic that is still used by many physics teachers to explain the uncertainty principle today. This classical approach is that if one looks at an elementary particle using light to see it, the very act of hitting the particle with light (even just one photon) should knock it out of the way so that one can no longer tell where the particle actually is located -- just that it is no longer where it was.

Smaller wavelength light (blue, for example, which is more energetic) imparts more energy to the particle than longer wavelength light (red, for example, which is less energetic). So using a smaller (more precise) "yardstick" of light to measure position means that one "messes up" the possible position of the particle more by "hitting" it with more energy. While his sponsor, Niels Bohr (who successfully argued with Einstein on many of these matters), was away travelling, Werner Heisenberg first published his Uncertainty Principle paper using this more-or-less classical reasoning just given. (The deviation from the classical notion was the idea that light comes in little packets or quantities, known as "quanta," as discussed in article one.) However the uncertainty principle was to turn out to be much more fundamental than even Heisenberg imagined in his first paper.

Momentum is a fundamental concept in physics. It is classically defined as the mass of a particle multiplied by its velocity. We can picture a baseball thrown at us at 100 miles per hour having a similar effect as a bat being thrown at us at ten miles per hour; they would both have about the same momentum although they have quite different masses. The Heisenberg Uncertainty Principle basically stated that if one starts to know the change in the momentum of an elementary particle very well (that is, usually, the change in a particle's velocity) then one begins to lose knowledge of the change in the position of the particle, that is, where the particle is actually located. Stating this principle with relativity folded into the formulation yields another version of the uncertainty principle. This relativistic version states that as one gets to know the energy of an elementary particle very well, one cannot at the same time know (i.e., measure) very accurately at what time it actually had that energy. So we have, in quantum physics, what are called "complementary pairs." (If you'd really like to impress your friends, you can also call them "non-commuting observables.")
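
(Not from the article: in symbols, the two complementary pairs described here are the standard relations delta-x * delta-p >= hbar/2 and delta-E * delta-t >= hbar/2. A tiny numerical example, with the confinement scale and energy spread chosen purely for illustration:)

Code:
hbar = 1.054571817e-34     # reduced Planck constant, J s

# Confine an electron's position to roughly an atom's width...
delta_x = 1e-10            # metres (assumed)
delta_p = hbar / (2 * delta_x)
print(delta_p)             # ~5.3e-25 kg m/s minimum momentum spread

# ...and the energy/time version: an energy pinned down to within 1 eV
delta_E = 1.602e-19        # joules (assumed)
delta_t = hbar / (2 * delta_E)
print(delta_t)             # ~3.3e-16 s minimum uncertainty in when it had that energy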

One can illustrate the basic results of the uncertainty principle with a not-quite-filled balloon. On one side we could write "delta-E" to represent our uncertainty in the value of the energy of a particle, and on the other side of the balloon write "delta-t" which stands for our uncertainty in the time the particle had that energy. If we squeeze the delta-E side (constrain the energy so that it fits into our hand, for example) we can see that the delta-t side of the balloon would get larger. Similarly, if we decide to make the delta-t side fit within our hand, the delta-E side would get larger. But the total value of air in the balloon would not change; it would just shift. The total value of air in the balloon in our analogy is one quantity, or one "quantum," the smallest unit of energy possible in quantum physics. You can add more quanta-air to the balloon (making all the values larger, both in delta-E and delta-t) but you can never take more than one quantum of air out of the balloon in our analogy. Thus "quantum balloons" do not come in packets any smaller than one quantum, or photon. (It is interesting that the term "quantum leap" has come to mean a large, rather than the smallest possible, change in something, and the order of the dictionary definitions of "quantum leap" has now switched, with the popular usage first and the opposite, physics usage second. If you say to your boss, "We've made a quantum leap in progress today" this can still, however, be considered an honest statement of making absolutely no progress at all.)

When quantum physics was still young, Albert Einstein (and colleagues) would challenge Niels Bohr (and colleagues) with many strange quantum puzzles. Some of these included effects that seemed to imply that elementary particles, through quantum effects, could communicate faster than light. Einstein argued that we could not be understanding physics correctly if such effects were allowed to take place, since, among other things, such faster-than-light connectedness would violate the speed-of-light limit set by relativity. Einstein came up with several such self-evidently absurd thought experiments one could perform, the most famous being the EPR (Einstein, Podolsky, Rosen) paradox, named after the three authors of the paper, which showed that faster-than-light communication would appear to be the result of certain quantum experiments and therefore argued that quantum physics was not complete -- that some factors had to be, as yet, undiscovered. This led Niels Bohr and his associates to formulate the "Copenhagen Interpretation" of quantum physics reality. This interpretation (overly simplified, in a nutshell) is that it makes no sense to talk about an elementary particle until it is observed because it really doesn't exist unless it is observed. In other words, elementary particles might be thought of as being made up not just of forces; the observer or measurer must be counted among their constituents as well, and the observer can never really be separated from the observation.

Using the wave equations formulated for quantum particles by Erwin Schrödinger, Max Born was the first to make the suggestion that these elementary particle waves were not made up of anything but probabilities! So the constituents of everything we see are made up of what one might call "tendencies to exist" which are made into particles by adding the essential ingredient of "looking." Looking as an ingredient itself, it must be noted, took some getting used to! There were other possible interpretations we could follow, but it can be said that none of them was consistent with any sort of objective reality as Victorian physics had known it before. The wildest theories could fit the data equally well, but none of them allowed the particles making up the universe to consist of anything without either an underlying faster-than-light communication (theory of David Bohm), another parallel universe branching off ours every time there is a minute decision to be made (many worlds interpretation), or the "old" favorite, the observer creates the reality when he looks (the Copenhagen Interpretation).

Inspired by all these theories, a physicist at CERN in Switzerland named John Bell came up with an experiment that could perhaps test some of these theories and certainly test how far quantum physics was from classical physics. By now (1964) quantum physics was old enough to have distinguished itself from all previous physics to the point that physics before 1900 was dubbed "classical physics" and physics discovered after 1900 (mainly quantum physics) was dubbed "modern physics." So, in a sense, the history of science is broken up into the first 46 centuries (if one starts with Imhotep who built the first pyramid as the first historical scientist) and the last century, with quantum physics. So, we can see that we are quite young in the age of modern physics, this new fundamental view of science. It might even be fair to say that most people are not even aware, even after a century, of the great change that has been taking place in the fundamental basis of the scientific endeavor and interpretations of reality.

MCT
« Reply #11 on: April 21, 2005, 06:31:59 PM »

(cont'd)

John Bell proposed an experiment that could measure whether a given elementary particle could "communicate" with another elementary particle farther away faster than any light could have traveled between them. In 1984 a team led by Alain Aspect in Paris did this experiment, and this was indeed the apparent result. The experiment had to do with polarized light. For illustrative purposes, let's say that you have a container of light, and the light is waving all over the place and -- if the container is coated with a reflective substance, except for the ends -- the light is bouncing off the walls. (One might picture a can of spaghetti with noodles at all orientations as the directions of random light waves.) At the ends we place polarizing filters. This means that only light with a given orientation (say like noodles that are oriented up-and-down) can get out, while back-and-forth light waves (noodles) cannot get out. If we rotate the polarizers at both ends by 90 degrees we would then let out back-and-forth light waves, but now not up-and-down light.

It turns out that if we were to rotate the ends so that they were at an angle of 30 degrees to each other, about half of the total light could get out of the container -- one-fourth from one side of the bottle and one-fourth through the other side. This is (close enough to) what John Bell proposed and Alain Aspect demonstrated. When the "bottle" was rotated at one end, making a 30-degree angle with the other side so that only half the light could escape, a surprising thing happened. Before any light could have had time to travel from the rotated side of the "bottle" (actually a long tube) to the other side, the light coming out of the opposite side from the one that was rotated changed to one-fourth instantaneously (or as close to instantaneous as anyone could measure). Somehow that side of the "bottle" had gotten the message that the other side had been rotated, faster than the speed of light could have carried it. Since then this experiment has been confirmed many times.

John Bell's formulation of the fundamental ideas in this experiment has been called "Bell's Theorem" and can be stated most succinctly in his own words: "Reality is non-local." In other words, not only do the elementary particles that make up the things we see around us not exist until they are observed (Copenhagen Interpretation), but they are not, at the most essential level, even identifiably separable from other such particles arbitrarily far away. John Muir, the 19th-century naturalist, once said, "When we try to pick out anything by itself, we find it hitched to everything else in the universe." Well, he might have been surprised at how literally -- in physics as well as in ecology -- this turned out to be true.

In the next essay we will combine the uncertainty principle with the results of Bell's Theorem and increase the scale of the double slit experiment to cosmic proportions with what Einstein's colleague, John Wheeler, has called "The Participatory Universe." This will involve juggling what is knowable and what is unknowable in the universe at the same time.


MCT
« Reply #12 on: April 21, 2005, 06:33:25 PM »

http://www.space.com/searchforlife/quantum_astronomy_041216.html

By Laurance R. Doyle
SETI Institute
posted: 16 December 2004

This is the third article in a series of four articles each with a separate explanation of different quantum phenomena. Each article is a piece of a mosaic, so every one is needed to understand the final explanation of the quantum astronomy experiment we propose, possibly using the Allen Telescope Array and the narrow-band radio-wave detectors being built by the SETI Institute and the University of California, Berkeley.

In the previous two articles we discussed the basic double-slit experiment that demonstrates the dual nature of light -- wave and particle -- and then the Heisenberg Uncertainty Principle which demonstrates the complementarity (mutual exclusion) of what one can measure at the same time. In this article we shall discuss the more basic interpretation of quantum physics in terms of what one can even know or not know, and how this affects the results one is trying to measure.

John Bell formulated the uncertainty principle in terms of what one could know or not know in an experiment. Several Bell-type experiments have successfully shown that this would seem to be the simplest interpretation of the situation. Taking our double-slit example, if one puts a detector at one or the other of the two slits -- even one that does not destroy the photon, electron, or whatever particle as it goes through the slit -- then an interference pattern does not appear at the detector. This is because one has set up an experiment in which one can "know" which path the particle took (i.e., which slit the particle went through). As long as one can tell this, then the particle cannot go through both slits at once and one no longer gets an interference pattern.

Now, you might say, what if I decide not to look at the detector set up next to one or the other of the slits? Well, one still does not get an interference pattern because the potential exists for one to be able to tell which path the photon, for example, traveled. Even this potential (i.e., "knowability") is enough to stop the formation of an interference pattern. All ability to detect which path an elementary particle took (in this case a photon of light) must be removed to obtain interference. In other words, one must not even be able to tell -- even in principle -- which path the elementary particle took in order for it to "take" both paths and form an interference pattern.

This is the most fundamental concept in quantum physics -- knowable and unknowable. It is from this more fundamental concept of the uncertainty principle that we shall approach our quantum astronomy experiment. It has been experimentally verified that if one can know which path a photon traveled, then an interference pattern is not possible. But if one can become ignorant of which path the photon (or any elementary particle) took, then an interference pattern is assured. That is, if one is ignorant of which path the photon took, then an interference pattern is not just possible, it must occur.

This last point can produce some decidedly non-classical effects. One example is the phenomenon known as "quantum beats." Picture an atom (classically, for now) as consisting of a nucleus with electrons jumping all around it. Electrons do not move smoothly away from and toward their central nucleus; they take discrete steps (energy quanta, actually) to transition from a lower to a higher (farther from the nucleus) orbital level. They actually disappear from one level and reappear at another, but are never found in between the two. As an electron "jumps" from a higher-level step to a lower-level step, it emits a photon of light. Just the fact that one cannot, even in principle, tell which energy level jump the electron took, is enough to produce a special kind of interference fringes called "quantum beats."

Thus, while picturing probability distributions as classical waves (like water waves) may be helpful for beginning physics students, real quantum wave phenomena are decidedly non-classical and produce decidedly non-classical results. They are not waves made of anything but probabilities, or tendencies to exist. Yet they can interfere with each other in a wave-phenomena-type way before they are measured, and so "turn" from probability waves to measured particles. (This "collapse of the wave function" is also said to take place instantaneously, as we shall discuss more in article four.)

Einstein wrote several times "God does not play dice with the universe." Quantum physics, however, has reduced everything to probabilities mathematically, and such a formulation inherently implies dice rolling for all possibilities until a measurement is made. Richard Feynman pointed out that the mathematics really does mean all possibilities. Every elementary particle takes every path it possibly can -- a kind of infinite-slit experiment -- and then these infinite numbers of paths all cancel in the multi-dimensional mathematics -- called Hilbert space -- so that only one result is finally measured.

However, a colleague of Einstein's, Professor John Wheeler of Princeton University, has pointed out that one could take another interpretation, an interpretation he has dubbed "The Participatory Universe." In this approach one can look at the universe as directly participating in each quantum effect in real time. In other words, the concept of a First Cause starting things off (winding up the clock of the universe, one might say) and then leaving the laws of physics to run things, may be what is incorrect in the basic approach of classical physics. Rather, in this participatory scenario, the Cause of the laws of physics remains an active Participant. (If one would like to also draw some religious points into such discussions I would just say that it is important to understand what is being said and what is not being said here -- that is, to not oversimplify what went into Prof. Wheeler's introduction of this interesting proposed conceptualization for quantum reality.)

Professor Wheeler then came up with a Gedanken experiment (i.e., a thought experiment) that he called the "delayed choice" experiment. He proposed a huge scaling up, to cosmic proportions, of Young's double-slit experiment that we've talked so much about. In this Gedanken experiment gravitational lenses, which can bend light from distant quasars or galaxies, are used as a sort of giant pair of slits to create two paths for photons from a distant quasar. General relativity shows that masses in space can bend light.

The first observational confirmation of Einstein's theory came with the measurement, during a total solar eclipse, of starlight being bent as it passed close behind the Sun. Light was indeed bent by the mass of the Sun (that is to say, space-time is curved near large masses). It turns out that a large mass, such as a galaxy, lying nearly directly between a distant quasar and us will bend the light from the distant object toward us. One can think of light coming toward us more-or-less directly from the distant quasar (let's call this path A) while light from the same quasar also heads off into space at a slightly different angle.

This light, however, encounters a massive galaxy along the way, so that rays that would normally have missed the Earth get bent toward us as well (we'll call this light path B). It therefore appears that we have two quasars with a massive galaxy in between, when we actually have two images of the same quasar: one formed by rays coming more-or-less directly toward us along path A, and a second image, on the other side of the intervening galaxy, formed by the rays traveling along the bent path B.

John Wheeler realized that these two paths constituted a kind of double-slit experiment in which the slits were the two gravitational-lens images. The two paths of light from the quasar might then be made to interfere with each other. However, this could be done -- according to Bell's approach to the uncertainty principle -- only if one could not tell which path any particular photon had traveled. One way to avoid knowing which path an individual photon took is to make the paths equal (within the uncertainty principle), so that one could not tell whether an arriving photon had traveled along path A or path B. Even if there were a flare in the quasar, the flare (peak in brightness) would arrive at Earth at the same time along both paths, so one could not use timing to tell which way it came. (One can see that if the paths are not equal, light from the flare would arrive along path A before path B, so one could tell which path it took, which would negate the possibility of getting an interference pattern.)
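
A toy arithmetic sketch of that timing argument, with hypothetical numbers (the article gives no specific path lengths): the extra length of the bent path divided by the speed of light is the lens delay time, and it is this gap that Wheeler's light-year-scale cable, described below, would have to cancel.

[code]
# Hypothetical geometry: path B is about one light year longer than path A.
C = 299_792_458.0                    # speed of light, m/s
LIGHT_YEAR_M = 9.4607e15             # metres in one light year

extra_path_B_m = 1.0 * LIGHT_YEAR_M  # assumed extra length of the bent path
delay_seconds = extra_path_B_m / C
delay_years = delay_seconds / (365.25 * 24 * 3600)

print(f"a flare arrives along A about {delay_years:.2f} years before B")
# To erase that which-path clue, path A would need roughly a light year of
# extra fiber -- hence the "Gedanken only" status of the original proposal.
[/code]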

Professor Wheeler "solved" this problem by adding an immensely long fiber optics cable to path A to make it as long as path B. (The fiber optics cable turned out to have to be over a light year long in this case, so it really was truly a Gedanken experiment without much hope of realization -- but we will propose a possible solution to this problem in the fourth and last essay.) The delayed-choice part of the experiment was, nevertheless, still very interesting. Given, then, that one achieves an interference pattern in this way, one should be able to put a detector at the intersection of light paths A and B and just re-do a cosmic-scale version of the Young's double-slit experiment (where light-photons from quasar images A and B crossing the universe are equivalent to light going through slits 1 and 2 in the laboratory).

Logged
MCT
Guest
« Reply #13 on: April 21, 2005, 06:35:25 PM »

(cont'd)

One of the founders of quantum physics, P.A.M. Dirac, noted that, at least in Young's double-slit experiment, one can only get an interference pattern if each photon interferes only with itself -- that is, each single photon has to go through both slits and interfere with itself, not with any other photon. This certainly made sense in terms of the interference experiment being done with one photon at a time and still producing interference. (There are also conservation-of-energy arguments: two photons should not be expected to produce four times the energy when they meet at some places -- making a bright line -- and no energy when they meet at others -- making a dark line in the interference pattern.) Thus if one did the Wheeler delayed-choice experiment detecting single photons one at a time, one would still expect -- if one could not tell which path, A or B, the photons traveled along -- that an interference pattern would result where the two paths met. However, if one moves the detector to detect photons only along path A, then those photons will have traveled only along path A (by classical reasoning). Similarly, if one moves the detector to intercept path B, then the photons will have traveled only along path B.

The interesting part of Professor Wheeler's thought experiment is that the quasar emitting the photons is about one billion light years away -- that is, the light from this quasar is supposed to have taken a billion years to travel to Earth. It seems perplexing that any given photon will have had to travel both paths when you put the detector at the intersection of the paths, but only one path or the other when you decide to put the detector directly into one of the paths rather than at their intersection.

In other words, how can your decision as to where to put the detector affect the path of a given photon a billion years after it supposedly started along one of the paths toward Earth -- long before humans even existed on this planet (much less discovered quantum physics)? It would appear that what "happened" in the distant past may be determined by what is happening right now, even though it is supposed to have "happened" over a billion years ago. The choice of path, in other words, has somehow been "delayed." One might view this as the Universe playing the part of an active participant in what is happening now, rather than merely a finished record of what happened in the past. Hence the "Participatory Universe" conceptualization.

This interesting Gedanken experiment points out what may be the main difference between general relativity and quantum physics. In general relativity, time is a definite dimension, part of an already fixed space-time continuum. In quantum physics, by contrast, time is at best a variable, and in some formulations it is even quantized (i.e., there would be particles of time). Far from being an absolute, time in quantum physics is not a solid background upon which particles in space change; in a sense, it is not really even there until the "time particles" are measured.

In our fourth and final essay we will talk about the possible realization of Professor Wheeler's Gedanken experiment, which may open up a whole new field of investigation -- a field we will call "Quantum Astronomy."


Logged
MCT
Guest
« Reply #14 on: April 21, 2005, 06:36:26 PM »

http://www.space.com/searchforlife/quantum_astronomy_050113.html

By Laurance R. Doyle
SETI Institute
posted: 13 January 2005

This is the final article in a series of four, each explaining a different quantum phenomenon. Each article is a piece of a mosaic, so every one is needed to understand the final explanation of the quantum astronomy experiment we propose, possibly using the Allen Telescope Array and the narrow-band radio-wave detectors being built by the SETI Institute and the University of California, Berkeley.

In the preceding three essays we discussed Young's double-slit experiment, in which light was shown to behave as a wave, and the birth of quantum physics, in which light was also shown to behave like a particle. In the second article we discussed a basic limitation on measurement imposed by the Heisenberg uncertainty principle and how one may "trade" knowledge of one measurement for another. In article three we discussed John Bell's concept of knowability and unknowability, and then John Wheeler's Gedanken (thought) experiment: a cosmic-scale double-slit experiment requiring an immensely long (light-year-scale) fiber optics cable. In this article we apply Bell's concept of knowability and unknowability to the uncertainty principle in order to try to perform Wheeler's double-slit experiment over cosmic distances.

In order to realize this experiment, however, one must come up with a substitute for this unbuildably long fiber optics cable, and this is where the SETI Institute's new Allen Telescope Array and its narrow-band radio-wave detectors can play an important part. SETI radio projects use the fact that, as far as we know, no natural (i.e., non-technological) source of radio waves can produce a very narrow-band radio channel. When you tune to a station on the radio, one turn of the dial and you are on another channel. If you tune to a radio galaxy, however, you can turn the dial many dozens of times and you will still be on the same channel, so to speak -- you will hear the same sounds. In other words, as far as we know, only technology can make a narrow (1 Hertz wide) radio channel. Thus, looking for narrow-band signals in space should be a good way to look for evidence of radio-technological civilizations around other stars. Fortunately for quantum astronomy, it also turns out that an extremely narrow-band radio channel can be used to replace that unrealistically long fiber optics cable! But to explain just how this can be done we need to look again at the uncertainty principle.

When a colleague, Dr. David P. Carico of San Francisco State University, and I began thinking about actually carrying out Professor Wheeler's delayed-choice experiment, we realized that the uncertainty principle needed to be satisfied if one is to obtain an interference pattern. That is, one needs to be ignorant of which path the light traveled -- path A (directly from the quasar) or path B (the path bent by the gravity of the intervening galaxy back toward Earth) -- so that it can "travel both paths" and so interfere with itself. (The terms "travel" and "path" as applied to a photon-wave, of course, do not have any real meaning in quantum physics if the particle nature does not exist until it is measured. But for now we will use such terms, as it is difficult to speak of quantum effects without some reference to our classical notions of space and time.) The energy-time uncertainty principle, as we will recall, says that knowing the energy of a given particle precisely means one cannot know precisely the time the particle had that energy. And, "complementarily" (Niels Bohr's term for this was complementarity), if one knows the time to high precision, one cannot know the energy with greater accuracy than the basic quantum limit allows. (That limit is set by Planck's constant, the fundamental quantum of action, which is a very small value, so we do not usually notice this uncertainty constraint in everyday activities.)
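
A hedged worked example of that relation (standard constants; the specific bandwidth is just an illustration): writing the photon energy as E = h*nu turns delta-E * delta-t >= hbar/2 into delta-nu * delta-t >= 1/(4*pi), so pinning the frequency very tightly forces a large blur in time.

[code]
import math

h = 6.62607015e-34            # Planck's constant, J*s
hbar = h / (2 * math.pi)

delta_nu = 0.01               # suppose the frequency is pinned to 0.01 Hz
delta_E = h * delta_nu        # corresponding energy uncertainty, joules
delta_t_min = hbar / (2 * delta_E)   # minimum time uncertainty, seconds
                                     # (equivalently 1 / (4 * pi * delta_nu))

print(f"delta_E    ~ {delta_E:.3e} J")
print(f"delta_t   >= {delta_t_min:.1f} s")     # about 8 s in the strict form
print(f"1/delta_nu = {1.0 / delta_nu:.0f} s")  # the rougher rule of thumb used later
[/code]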

Now, in thinking about how to do this experiment, we thought that perhaps it might be possible to "trade" knowledge of energy for knowledge of time, where in this case the time would be the delay time between the two paths of the gravitational-lens images, A and B. The uncertainty in energy, then, might make it possible to replace the hugely long fiber optics cable with a very narrow-band radio detector. It's OK. Read on. I can hopefully explain what I mean. We have seen that we can trade knowledge of energy for knowledge of time (remember the balloon image in a previous article with "delta-E" written on one end and "delta-t" written on the other). We also remember that if we can tell which path each photon traveled, we will not get an interference pattern but rather just a picture of a quasar at A and another (image of it) at B. To understand this "trade," then, let's take a bit closer look at what we mean by a narrow-band radio wave.

It is known, in the physics of electromagnetic waves, that longer waves have less energy than shorter waves. The blue light we see has more energy per photon than the red light we see. (This extends to lower-energy infrared photons and higher-energy ultraviolet photons, and indeed to very low-energy radio photons and very much higher-energy X-ray photons.) In photography, a filter on the camera lens can let only blue light, or only red light, into the camera. Sunlight is usually a mixture of blues, greens, yellows, oranges, reds, and so on, and therefore also a mixture of photons of all kinds of energy, high and low. When one uses, say, a red filter, one is cutting out the higher-energy blue photons and so only detects the lower-energy red light. The narrower the filter, the narrower the range of photon energies let into the camera.
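
A quick hedged check of that ordering using E = h*c/wavelength (standard constants; the example wavelengths are just representative values, not from the article):

[code]
h = 6.62607015e-34     # Planck's constant, J*s
c = 299_792_458.0      # speed of light, m/s

examples = [("blue light", 450e-9), ("red light", 650e-9), ("21 cm radio", 0.21)]
for name, wavelength_m in examples:
    energy_joules = h * c / wavelength_m
    print(f"{name:12s} photon energy ~ {energy_joules:.3e} J")
# Longer wavelength -> lower photon energy, as stated above.
[/code]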

Similarly for radio detectors: if one has a broadband detector, one is letting in radio waves of all sorts of energies at once. However, if one has a very narrow-band radio detector (such as those used in the search for extraterrestrial intelligent technology), one is highly constraining the range of energies being detected. Only the radio photons within a very narrow spread of energy are actually measured. Remembering the uncertainty principle for energy and time, we can recognize that narrow-band radio detectors thus represent a tight constraint on the value of the energy being measured. But what about time? For that, let's look at the crossing of the radio waves (which are just long-wavelength light) coming along paths A and/or B. We can only get an interference pattern if we cannot tell (or even potentially be able to tell) which path a radio photon took to reach our detector. But if the difference in travel time between paths A and B is long enough (this is called the "delay time" of the gravitational lens), then there is plenty of time to detect, for example, a flare going off at the quasar, so that image A brightens, followed by image B some time (the delay time) later. This is actually how the delay time between gravitational-lens paths is measured. Now the next sentence is the most important. If we use a narrow enough radio bandpass, we can constrain the energy to such a precise value that the time uncertainty becomes so large that it exceeds the actual delay time of the gravitational lens. In other words, we can constrain the energy (by using narrow-band radio detectors) so much that we exceed the ability -- even in principle -- to measure which path the photon traveled, because our uncertainty in the arrival time of the photon is now larger (because of the uncertainty principle) than the actual delay time, the travel-time difference between paths A and B. Thus we cannot tell along which path the photon traveled and so should get an interference pattern at the detectors. A very narrow-band (but real) radio detector, then, can substitute for an unrealistically long fiber optics cable to get an interference pattern at the intersection of paths A and B.
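
A minimal sketch of that trade in Python (the 30-second delay time and the bandwidths are hypothetical; the article quotes no specific lens): treat the arrival-time blur as roughly 1/bandwidth and compare it with the lens delay time to decide whether which-path information survives.

[code]
def which_path_knowable(bandwidth_hz: float, lens_delay_s: float) -> bool:
    """Order-of-magnitude test: a detector of the given bandwidth blurs photon
    arrival times by roughly 1/bandwidth (uncertainty principle). The path is
    tellable only if that blur is smaller than the lens delay time."""
    arrival_time_blur_s = 1.0 / bandwidth_hz
    return arrival_time_blur_s < lens_delay_s

lens_delay_s = 30.0                      # hypothetical delay between images A and B
for bandwidth_hz in (1.0, 0.1, 0.01):    # progressively narrower detectors
    knowable = which_path_knowable(bandwidth_hz, lens_delay_s)
    verdict = "impossible" if knowable else "possible"
    print(f"{bandwidth_hz:5.2f} Hz bandpass -> which-path knowable: {knowable} "
          f"-> interference {verdict}")
[/code]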

Logged
MCT
Guest
« Reply #15 on: April 21, 2005, 06:37:31 PM »

(cont'd)

So, how does one proceed to do this? We can start observing the gravitational lens using a radio telescope with very narrow-band detectors. We set the detectors on the narrowest band possible (let's say one-hundredth of a Hertz, which means we know the frequency -- and therefore the energy -- of the incoming radio wave to within one-hundredth of a cycle per second). We bring the light from the two images of the quasar together so that it crosses, and (if the delay time is not too long -- no longer than about 100 seconds in this case) we will obtain an interference pattern. This means we cannot know which path the radio photons "took." (We also assume, for simplicity, no detectable rapid fluctuations from the quasar, although there are ways of dealing with this effect as well, using "choppers" in the path of the incoming light.) Now what happens if we increase the range of energies being detected (i.e., widen the bandpass of the radio detectors)? At first we may still get an interference pattern. But if we continue to widen the bandpass, at some point the interference pattern will disappear, and we shall simply get a (radio) picture of a quasar at location A and another of its image at location B. The interference pattern will have disappeared at exactly the point where we could begin to tell which path the photons took. In other words, by allowing ourselves to become more and more ignorant of the energy of the arriving radio waves, we simultaneously allowed an increased knowledge (according to the uncertainty principle) of the time interval. And when we decreased our knowledge of the energy to the point where our uncertainty in the arrival time dropped below the actual delay time between the light paths of the gravitational lens, we could (at least in principle) tell which path each photon took. Thus the uncertainty principle "kicks in" and says that one cannot know which path a photon took and still get a wave phenomenon (i.e., an interference pattern). One cannot have one's photon and wave it too.
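
A hedged sketch of that observing procedure as a bandpass sweep (again with a made-up 30-second delay time): the fringes should persist while the arrival-time blur exceeds the delay and vanish once it drops below it, and the crossover bandpass itself reads off the delay time.

[code]
import numpy as np

lens_delay_s = 30.0                       # hypothetical delay time we want to find
bandpasses_hz = np.logspace(-3, 1, 9)     # sweep from 0.001 Hz up to 10 Hz

for bandwidth_hz in bandpasses_hz:
    arrival_time_blur_s = 1.0 / bandwidth_hz        # rough uncertainty blur
    fringes_expected = arrival_time_blur_s > lens_delay_s
    print(f"{bandwidth_hz:8.3f} Hz   blur ~ {arrival_time_blur_s:9.1f} s   "
          f"fringes expected: {fringes_expected}")

# The fringes vanish where 1/bandwidth ~ delay time, so locating that crossover
# is itself a measurement of the delay time (and hence, with a lens model, a
# handle on the Hubble constant, as discussed below).
[/code]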

Thus we may be able to use very narrow-band radio detectors to realize the delayed-choice (perhaps no longer just Gedanken) experiment proposed by Professor Wheeler. What is of interest in doing such an experiment? First, it may offer a way to directly measure delay times for gravitational lenses that don't vary much in brightness, and such delay times can be used to measure the expansion rate of the universe (the parameter called the "Hubble constant"). But more intriguing, perhaps, is that it could provide a measure of the minimum time it takes for a wave to "become" a particle. If the quasar is one billion light years away (that's about six billion trillion miles) and the interference pattern is being formed by a probability wave traveling along both paths A and B, then if one widens the bandpass (say, over one hour's time) to the point where the wave becomes a particle (photon), one might be able to speak of the wave "becoming" a particle at a minimum rate of a billion light years per hour. This rate is considered in most quantum physics formulations to be instantaneous, but one is reminded of Galileo and a colleague standing on opposite hillsides with lamps, trying to measure the speed of light. When one opened his lampshade, the other, as soon as he saw it, opened his, and so on, back and forth. They decided that the speed of light was either instantaneous or very, very fast. It turned out to be very, very fast (about 186,000 miles per second) -- far too fast to measure with shaded lamps on nearby hills. So perhaps quantum astronomy may someday allow a measurement of the speed of the wave-to-particle transition, if it is not instantaneous. What we have outlined here is just one experiment among many that could be performed in what may be one of the most interesting new fields of the 21st century, quantum astronomy.
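
A small arithmetic check of that figure of speech (purely illustrative, using the article's one-billion-light-year distance and one-hour timescale): expressed as a multiple of the speed of light, the implied lower bound is enormous, which is why the transition is usually treated as instantaneous.

[code]
HOURS_PER_YEAR = 365.25 * 24

distance_ly = 1.0e9          # the article's illustrative quasar distance
timescale_hours = 1.0        # the assumed bandpass-widening timescale

# Since the speed of light is exactly 1 light year per year, a rate expressed
# in light years per year is already a multiple of c.
rate_ly_per_year = (distance_ly / timescale_hours) * HOURS_PER_YEAR
print(f"lower bound on the wave-to-particle rate: about {rate_ly_per_year:.1e} c")
# Roughly 8.8e12 times the speed of light for these assumed numbers.
[/code]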

For more information:

Werner Heisenberg - http://www.aip.org/history/heisenberg/

Niels Bohr - http://en.wikipedia.org/wiki/Niels_Bohr

Albert Einstein - http://scienceworld.wolfram.com/biography/Einstein.html

John Bell - http://physicsweb.org/articles/world/11/12/8

Erwin Schrödinger - http://scienceworld.wolfram.com/biography/Schroedinger.html

Max Born - http://en.wikipedia.org/wiki/Max_Born

***
Enjoy!


Logged
Prometheus
VIP
****

Karma: 0
Offline

Gender: Male
Posts: 1476


I've been working all week on one of them.....


« Reply #16 on: April 21, 2005, 10:18:34 PM »

post whore...................
Logged

........oh wait..... nooooooo...... How come there aren't any fake business seminars in Newfoundland?!?? Sad? ............