Monday, 21 December 2009
Five laws of human nature
http://www.newscientist.com/article/dn18301-five-laws-of-human-nature.html
You're so predictable.
Offended? We're used to the idea that nature is governed by laws that spell out how things work. But the idea that human nature is governed by such laws raises hackles. Perhaps because of this, they have often been proposed with tongue in cheek – which makes it all the more disconcerting when they turn out to be backed up by evidence.
One such law is the Peter principle, which states that in any organisation "people reach the level of their own incompetence". As we report this week, physics-based simulations suggest that this is more than just a cynical snipe at our bosses' competence. And that means we might have to rethink our ideas about who to promote to what jobs.
So what other laws of human nature might we have to reluctantly accept? Here are five that may – or may not – govern our lives.
Parkinson's law
Why is there always so much work to do? Anyone searching for an explanation might find one in Parkinson's law. Civil servant, historian and theorist Cyril Northcote Parkinson suggested in a 1955 article that work expands to fill the time available for its completion – backed up with statistical evidence drawn from his historical research. More recent mathematical analyses have lent support to the idea.
Parkinson also came up with the "law of triviality", which states that the amount of time an organisation spends discussing an issue is inversely proportional to its importance. He argued that nobody dares to expound on important issues in case they're wrong – but everyone is happy to opine at length about the trivial.
This in turn may be a result of Sayre's law, which states that in any dispute, the intensity of feeling is inversely proportional to the value of the stakes at issue.
Parkinson also proposed a coefficient of inefficiency, which attempts to define the maximum size a committee can reach before it becomes unable to make decisions. His suggestion that it lay "somewhere between 19.9 and 22.4" has stood the test of time: more recent research suggests that committees cannot include many more than 20 members before becoming utterly hapless.
Student syndrome
"If it weren't for the last minute, I wouldn't get anything done." So said an anonymous wit, and none but the most ferociously well-organised can disagree.
In fact, procrastination is a major problem for some people, especially those who are easily distracted or are uncertain of their ability to complete a task.
One of the most well-known examples of vigorous procrastination is student syndrome. As anyone who has ever been (or known) a student will know, it is standard practice to apply yourself to a task only at the last possible moment before the deadline.
Student syndrome is so common that some experts in project management recommend not assigning long periods of time to particular tasks, because the people who are supposed to do them will simply wait until just before the deadline to start work, and the project will overrun anyway (International Journal of Project Management, vol 18, p 173).
Some of the blame for student syndrome may be laid at the feet of the planning fallacy: the tendency for people to underestimate how long it will take to do something.
If you often get caught out by how long things take, we recommend considering Hofstadter's law, coined by the cognitive scientist Douglas Hofstadter: "It always takes longer than you expect, even when you take into account Hofstadter's law."
Pareto principle
The rich have a lot more money than you. That might sound like a statement of the obvious, but you may be surprised by just how much richer than you they are. In fact, in most countries 80 per cent of the wealth is owned by just 20 per cent of the population.
This was first spotted by the economist Vilfredo Pareto in the early 20th century, and it seems to be a universal rule in societies – although the precise nature of the distribution has been revised over the years.
But the Pareto principle is not just about money. For most systems, 80 per cent of events are triggered by just 20 per cent of the causes. For instance, 20 per cent of the users of a popular science website are responsible for 80 per cent of the page clicks.
Businesses often use the Pareto principle as a rule of thumb, for instance deciding to do the most important 20 per cent of a job in order to get 80 per cent of the reward.
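As a side note not in the original article, the 80:20 split has a tidy mathematical form: if wealth follows a Pareto distribution with tail index α > 1, the richest fraction p of the population holds p^((α−1)/α) of the total. A minimal sketch, with the index value assumed for illustration:

```python
def top_share(p, alpha):
    # For a Pareto distribution with tail index alpha > 1, the richest
    # fraction p of the population holds p**((alpha - 1) / alpha)
    # of the total wealth.
    return p ** ((alpha - 1) / alpha)

# A tail index of roughly 1.16 reproduces the classic 80:20 split:
print(f"top 20% hold {top_share(0.2, 1.16):.1%} of the wealth")  # → top 20% hold 80.1% of the wealth
```

The same function also shows the rule's self-similarity: applying it again, the top 4 per cent (20 per cent of 20 per cent) hold about 64 per cent (80 per cent of 80 per cent).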
Salem hypothesis
First proposed by Bruce Salem on the discussion site Usenet, the Salem hypothesis claims that "an education in the engineering disciplines forms a predisposition to [creationist] viewpoints". This was rephrased somewhat by P. Z. Myers as "creationists with advanced degrees are often engineers".
Is there any evidence to back this up, or is it just a gratuitous slander against engineers? A 1982 article in the Proceedings of the Iowa Academy of Science suggested that many leading creationists trained as engineers, notably Henry Morris, one of the authors of the key creationist book The Genesis Flood. But the article did not present any figures.
More recently, Diego Gambetta and Steffen Hertog have noted a preponderance of engineers among Islamic extremist groups. They suggested that engineers may be at greater risk of being recruited by such groups than other graduates.
Obviously creationism is not the same thing as violent activism, but Gambetta and Hertog's analysis may be useful nevertheless because they discuss the engineering mindset in some detail. They show, for instance, that engineers are more likely to be religious than other graduates.
None of this is anywhere near enough to prove the Salem hypothesis, but it does provide some intriguing circumstantial evidence.
Maes-Garreau law
Everyone loves predicting the future, and some make a career out of it. These futurists often present detailed, authoritative claims about what is going to happen, though their success rate isn't always exemplary.
A common theme in futurist predictions is that revolutionary technology of one sort or another is just around the corner, and that this technology will allow people to live forever. This can mean physical immortality or some more abstracted technique like downloading one's personality into a computer. The "singularity", which Ray Kurzweil says will arrive "by 2045 or thereabouts", is a prime example.
And thus we come to the Maes-Garreau law, which states that any such prediction about a favourable future technology will fall just within the expected lifespan of the person making it.
Pattie Maes, a researcher at the Massachusetts Institute of Technology, observed in the late 1980s that many of her male colleagues were interested in these ideas, and tabulated when they expected the miracle technology to arrive. Sure enough, she found that the dates they predicted for the singularity were always on or around their 70th birthdays.
She mentioned her findings in a talk, but did not write them up. Subsequently, the journalist Joel Garreau made similar observations in his book Radical Evolution, which looked at the implications of such "transhumanist" ideas.
The Maes-Garreau law was finally coined, and given its name, by Wired editor Kevin Kelly. Kelly informally repeated Maes's analysis, confirming her findings. He then defined the "Maes-Garreau point" as the latest possible date a prediction can come true and still remain in the lifetime of the person making it.
Saturday, 17 January 2009
Our world may be a giant hologram

Has GEO600's laser probed the fundamental fuzziness of space-time? (Image: Wolfgang Filser / Max Planck Society)
DRIVING through the countryside south of Hanover, it would be easy to miss the GEO600 experiment. From the outside, it doesn't look much: in the corner of a field stands an assortment of boxy temporary buildings, from which two long trenches emerge, at a right angle to each other, covered with corrugated iron. Underneath the metal sheets, however, lies a detector that stretches for 600 metres.
For the past seven years, this German set-up has been looking for gravitational waves - ripples in space-time thrown off by super-dense astronomical objects such as neutron stars and black holes. GEO600 has not detected any gravitational waves so far, but it might inadvertently have made the most important discovery in physics for half a century.
For many months, the GEO600 team members had been scratching their heads over inexplicable noise plaguing their giant detector. Then, out of the blue, a researcher approached them with an explanation. In fact, he had even predicted the noise before he knew they were detecting it. According to Craig Hogan, a physicist at the Fermilab particle physics lab in Batavia, Illinois, GEO600 has stumbled upon the fundamental limit of space-time - the point where space-time stops behaving like the smooth continuum Einstein described and instead dissolves into "grains", just as a newspaper photograph dissolves into dots as you zoom in. "It looks like GEO600 is being buffeted by the microscopic quantum convulsions of space-time," says Hogan.
If this doesn't blow your socks off, then Hogan, who has just been appointed director of Fermilab's Center for Particle Astrophysics, has an even bigger shock in store: "If the GEO600 result is what I suspect it is, then we are all living in a giant cosmic hologram."
The idea that we live in a hologram probably sounds absurd, but it is a natural extension of our best understanding of black holes, and something with a pretty firm theoretical footing. It has also been surprisingly helpful for physicists wrestling with theories of how the universe works at its most fundamental level.
The holograms you find on credit cards and banknotes are etched on two-dimensional plastic films. When light bounces off them, it recreates the appearance of a 3D image. In the 1990s physicists Leonard Susskind and Nobel prizewinner Gerard 't Hooft suggested that the same principle might apply to the universe as a whole. Our everyday experience might itself be a holographic projection of physical processes that take place on a distant, 2D surface.
The "holographic principle" challenges our sensibilities. It seems hard to believe that you woke up, brushed your teeth and are reading this article because of something happening on the boundary of the universe. No one knows what it would mean for us if we really do live in a hologram, yet theorists have good reasons to believe that many aspects of the holographic principle are true.
Susskind and 't Hooft's remarkable idea was motivated by ground-breaking work on black holes by Jacob Bekenstein of the Hebrew University of Jerusalem in Israel and Stephen Hawking at the University of Cambridge. In the mid-1970s, Hawking showed that black holes are in fact not entirely "black" but instead slowly emit radiation, which causes them to evaporate and eventually disappear. This poses a puzzle, because Hawking radiation does not convey any information about the interior of a black hole. When the black hole has gone, all the information about the star that collapsed to form the black hole has vanished, which contradicts the widely affirmed principle that information cannot be destroyed. This is known as the black hole information paradox.
Bekenstein's work provided an important clue in resolving the paradox. He discovered that a black hole's entropy - which is synonymous with its information content - is proportional to the surface area of its event horizon. This is the theoretical surface that cloaks the black hole and marks the point of no return for infalling matter or light. Theorists have since shown that microscopic quantum ripples at the event horizon can encode the information inside the black hole, so there is no mysterious information loss as the black hole evaporates.
Crucially, this provides a deep physical insight: the 3D information about a precursor star can be completely encoded in the 2D horizon of the subsequent black hole - not unlike the 3D image of an object being encoded in a 2D hologram. Susskind and 't Hooft extended the insight to the universe as a whole on the basis that the cosmos has a horizon too - the boundary from beyond which light has not had time to reach us in the 13.7-billion-year lifespan of the universe. What's more, work by several string theorists, most notably Juan Maldacena at the Institute for Advanced Study in Princeton, has confirmed that the idea is on the right track. He showed that the physics inside a hypothetical universe with five dimensions and shaped like a Pringle is the same as the physics taking place on the four-dimensional boundary.
According to Hogan, the holographic principle radically changes our picture of space-time. Theoretical physicists have long believed that quantum effects will cause space-time to convulse wildly on the tiniest scales. At this magnification, the fabric of space-time becomes grainy and is ultimately made of tiny units rather like pixels, but a hundred billion billion times smaller than a proton. This distance is known as the Planck length, a mere 10^-35 metres. The Planck length is far beyond the reach of any conceivable experiment, so nobody dared dream that the graininess of space-time might be discernible.
That is, not until Hogan realised that the holographic principle changes everything. If space-time is a grainy hologram, then you can think of the universe as a sphere whose outer surface is papered in Planck length-sized squares, each containing one bit of information. The holographic principle says that the amount of information papering the outside must match the number of bits contained inside the volume of the universe.
Since the volume of the spherical universe is much bigger than its outer surface, how could this be true? Hogan realised that in order to have the same number of bits inside the universe as on the boundary, the world inside must be made up of grains bigger than the Planck length. "Or, to put it another way, a holographic universe is blurry," says Hogan.
This is good news for anyone trying to probe the smallest unit of space-time. "Contrary to all expectations, it brings its microscopic quantum structure within reach of current experiments," says Hogan. So while the Planck length is too small for experiments to detect, the holographic "projection" of that graininess could be much, much larger, at around 10^-16 metres. "If you lived inside a hologram, you could tell by measuring the blurring," he says.
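The projected grain size can be reproduced with a back-of-envelope estimate. The sketch below is not from the article: it assumes one bit per Planck-area patch of the cosmic horizon, uses round-number values for the Planck length and horizon radius, and drops geometric factors of order one.

```python
l_planck = 1.6e-35   # Planck length in metres (rounded)
r_horizon = 1.3e26   # radius of the cosmic horizon in metres (rounded)

# Bits available on the boundary: one per Planck-area patch.
surface_bits = r_horizon**2 / l_planck**2

# Require the same number of volume cells of size `grain` inside:
#   (r_horizon / grain)**3 == surface_bits
grain = r_horizon / surface_bits ** (1 / 3)

print(f"holographic grain size ~ {grain:.1e} m")
```

With these round numbers the grain comes out at a few times 10^-15 metres, within an order of magnitude of the 10^-16 metres quoted above, which is as close as an estimate that ignores factors of order one can be expected to get.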
When Hogan first realised this, he wondered if any experiment might be able to detect the holographic blurriness of space-time. That's where GEO600 comes in.
Gravitational wave detectors like GEO600 are essentially fantastically sensitive rulers. The idea is that if a gravitational wave passes through GEO600, it will alternately stretch space in one direction and squeeze it in another. To measure this, the GEO600 team fires a single laser through a half-silvered mirror called a beam splitter. This divides the light into two beams, which pass down the instrument's 600-metre perpendicular arms and bounce back again. The returning light beams merge together at the beam splitter and create an interference pattern of light and dark regions where the light waves either cancel out or reinforce each other. Any shift in the position of those regions tells you that the relative lengths of the arms have changed.
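The readout described above can be sketched in a few lines. This toy model is not GEO600's actual signal chain; the 1064-nanometre wavelength is a typical infrared laser value assumed for illustration.

```python
import math

WAVELENGTH = 1.064e-6  # metres: a typical infrared laser wavelength (assumed)

def output_intensity(delta_arm):
    # Normalised intensity at the interferometer output for an arm-length
    # difference delta_arm. Light traverses each arm twice, so the
    # round-trip path difference is 2 * delta_arm.
    phase = 2 * math.pi * (2 * delta_arm) / WAVELENGTH
    return math.cos(phase / 2) ** 2

print(output_intensity(0.0))             # equal arms: bright fringe, 1.0
print(output_intensity(WAVELENGTH / 4))  # quarter-wave arm offset: dark fringe, ~0
```

The point of the toy model is the sensitivity: the output swings from fully bright to fully dark over an arm-length change of just a quarter wavelength, which is why tiny jitters in the beam splitter's position show up as measurable noise.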
"The key thing is that such experiments are sensitive to changes in the length of the rulers that are far smaller than the diameter of a proton," says Hogan.
So would they be able to detect a holographic projection of grainy space-time? Of the five gravitational wave detectors around the world, Hogan realised that the Anglo-German GEO600 experiment ought to be the most sensitive to what he had in mind. He predicted that if the experiment's beam splitter is buffeted by the quantum convulsions of space-time, this will show up in its measurements (Physical Review D, vol 77, p 104031). "This random jitter would cause noise in the laser light signal," says Hogan.
In June he sent his prediction to the GEO600 team. "Incredibly, I discovered that the experiment was picking up unexpected noise," says Hogan. GEO600's principal investigator Karsten Danzmann of the Max Planck Institute for Gravitational Physics in Potsdam, Germany, and also the University of Hanover, admits that the excess noise, with frequencies of between 300 and 1500 hertz, had been bothering the team for a long time. He replied to Hogan and sent him a plot of the noise. "It looked exactly the same as my prediction," says Hogan. "It was as if the beam splitter had an extra sideways jitter."
No one - including Hogan - is yet claiming that GEO600 has found evidence that we live in a holographic universe. It is far too soon to say. "There could still be a mundane source of the noise," Hogan admits.
Gravitational-wave detectors are extremely sensitive, so those who operate them have to work harder than most to rule out noise. They have to take into account passing clouds, distant traffic, seismological rumbles and many, many other sources that could mask a real signal. "The daily business of improving the sensitivity of these experiments always throws up some excess noise," says Danzmann. "We work to identify its cause, get rid of it and tackle the next source of excess noise." At present there are no clear candidate sources for the noise GEO600 is experiencing. "In this respect I would consider the present situation unpleasant, but not really worrying."
For a while, the GEO600 team thought the noise Hogan was interested in was caused by fluctuations in temperature across the beam splitter. However, the team worked out that this could account for only one-third of the noise at most.
Danzmann says several planned upgrades should improve the sensitivity of GEO600 and eliminate some possible experimental sources of excess noise. "If the noise remains where it is now after these measures, then we have to think again," he says.
If GEO600 really has discovered holographic noise from quantum convulsions of space-time, then it presents a double-edged sword for gravitational wave researchers. On one hand, the noise will handicap their attempts to detect gravitational waves. On the other, it could represent an even more fundamental discovery.
Such a situation would not be unprecedented in physics. Giant detectors built to look for a hypothetical form of radioactivity in which protons decay never found such a thing. Instead, they discovered that neutrinos can change from one type into another - arguably more important because it could tell us how the universe came to be filled with matter and not antimatter (New Scientist, 12 April 2008, p 26).
It would be ironic if an instrument built to detect something as vast as astrophysical sources of gravitational waves inadvertently detected the minuscule graininess of space-time. "Speaking as a fundamental physicist, I see discovering holographic noise as far more interesting," says Hogan.
Small price to pay
If Hogan is right, holographic noise will spoil GEO600's ability to detect gravitational waves, yet Danzmann is upbeat. "Even if it limits GEO600's sensitivity in some frequency range, it would be a price we would be happy to pay in return for the first detection of the graininess of space-time," he says. "You bet we would be pleased. It would be one of the most remarkable discoveries in a long time."
However, Danzmann is cautious about Hogan's proposal and believes more theoretical work needs to be done. "It's intriguing," he says. "But it's not really a theory yet, more just an idea." Like many others, Danzmann agrees it is too early to make any definitive claims. "Let's wait and see," he says. "We think it's at least a year too early to get excited."
The longer the puzzle remains, however, the stronger the motivation becomes to build a dedicated instrument to probe holographic noise. John Cramer of the University of Washington in Seattle agrees. It was a "lucky accident" that Hogan's predictions could be connected to the GEO600 experiment, he says. "It seems clear that much better experimental investigations could be mounted if they were focused specifically on the measurement and characterisation of holographic noise and related phenomena."
One possibility, according to Hogan, would be to use a device called an atom interferometer. These operate using the same principle as laser-based detectors but use beams made of ultracold atoms rather than laser light. Because atoms can behave as waves with a much smaller wavelength than light, atom interferometers are significantly smaller and therefore cheaper to build than their gravitational-wave-detector counterparts.
So what would it mean if holographic noise has been found? Cramer likens it to the discovery of unexpected noise by an antenna at Bell Labs in New Jersey in 1964. That noise turned out to be the cosmic microwave background, the afterglow of the big bang fireball. "Not only did it earn Arno Penzias and Robert Wilson a Nobel prize, but it confirmed the big bang and opened up a whole field of cosmology," says Cramer.
Hogan is more specific. "Forget Quantum of Solace, we would have directly observed the quantum of time," says Hogan. "It's the smallest possible interval of time - the Planck length divided by the speed of light."
More importantly, confirming the holographic principle would be a big help to researchers trying to unite quantum mechanics and Einstein's theory of gravity. Today the most popular approach to quantum gravity is string theory, which researchers hope could describe happenings in the universe at the most fundamental level. But it is not the only show in town. "Holographic space-time is used in certain approaches to quantising gravity that have a strong connection to string theory," says Cramer. "Consequently, some quantum gravity theories might be falsified and others reinforced."
Hogan agrees that if the holographic principle is confirmed, it rules out all approaches to quantum gravity that do not incorporate the holographic principle. Conversely, it would be a boost for those that do - including some derived from string theory and something called matrix theory. "Ultimately, we may have our first indication of how space-time emerges out of quantum theory." As serendipitous discoveries go, it's hard to get more ground-breaking than that.
Saturday, 10 January 2009
New theory of the universe fits together two of its biggest mysteries
Physicists have developed a theory that unites two of the most widely studied mysteries of the universe: why there is an imbalance between ordinary matter and antimatter (scientists expect to see equal amounts of both, but observe less antimatter), and the identity of "dark matter" - the enigmatic particles thought to account for the extra gravitational pull observed in distant galaxies.

"We propose that at some point in the early universe, dark matter interacted with ordinary matter in a particular way that shifted the balance between matter and antimatter slightly towards matter, a process called baryogenesis," Jeff Jones, a physicist at the University of California, Santa Cruz, who was involved in the work, told PhysOrg.com. "We have proposed a new mechanism for baryogenesis that links these two mysteries, which were usually assumed to be unrelated."
The prefix "baryo" in baryogenesis comes from "baryon", a class of particles composed of three quarks. Protons and neutrons are the most familiar examples of baryons. By extension, ordinary matter - atoms, in other words, which are mostly protons and neutrons - is also essentially made of baryons. Likewise, antimatter is mostly antibaryons.
The Russian physicist Andrei Sakharov, father of the Russian hydrogen bomb and an advocate of peaceful coexistence between the Soviet and Western systems, pointed out in the 1960s that for baryogenesis to occur there would have to be a violation of CP symmetry. CP symmetry is the idea that if the ordinary particles in any physical process are replaced by antiparticles, and "handedness" is simultaneously reversed (rather as I am right-handed but my mirror image, my "anti-self", is left-handed), the result should be an equally feasible process occurring at the same rate as the first. Of the four known fundamental forces - the strong and weak nuclear forces, electromagnetism and gravity - scientists have only ever seen the weak nuclear force violate CP symmetry, in a handful of experiments. However, when that violation results in the production of baryons, it always generates antibaryons as well, so no imbalance is produced.
There must therefore be a process that does not conserve the total number of baryons involved and that violates CP symmetry. The Standard Model of particle physics - a theory describing the relationship between the weak and strong nuclear forces and the electromagnetic force, and all the particles that feel these forces - predicts that such a process exists. Known as the "sphaleron process", it would only occur at temperatures far too high to achieve in a laboratory, but could have occurred in the early universe. The sphaleron process allows for the possibility that matter is generated without its corresponding antimatter, but this is still not a complete solution. As Jones puts it: "The universe has no reason to prefer matter over antimatter, so in the long run you would expect them eventually to even out. Mysteriously, that hasn't happened, and that's where our work comes in."
He and his colleagues show in their paper (which appeared in the November issue of the Journal of High Energy Physics) that if dark matter particles have certain properties, they would have interacted with ordinary matter. When this is added to the rest of the equations of the Standard Model (dark matter is not currently part of it), the interaction could have caused a preference for matter over antimatter. As with the sphaleron process, this interaction would only occur at the very high temperatures of the early universe and would have "switched off" as the universe cooled. This explains not only how more baryons than antibaryons could have been generated for a time, but also why there would be more baryons in the long run. For this to work, however, the sphaleron process would have to have ended before the interaction between ordinary matter and dark matter occurred. If they happened in the opposite order, the early universe's excess baryons would have been annihilated by the new antibaryons produced by the sphaleron process.
The group hit upon these ideas while studying an extension of the Standard Model known as the Pentagon model, which involves theoretical particles dubbed pentaquarks that it attempts to combine together in groups of five. Scientists can verify the Standard Model only for particles with energies reachable by human experiments, which are, relatively speaking, not very high. At higher energies, many scientists suspect that the Standard Model breaks down and new physics comes into play. The Pentagon model, for its part, opens the field to new, unknown forces and holds up at energies at which the Standard Model fails.
The researchers say their theory of baryogenesis could remain valid even if the Pentagon model turns out to be wrong, as long as their assumptions about the identity and nature of dark matter are correct.
"Ultimately, to know whether our theory really is the right explanation, it will need to be tested. Since it involves connections between many different physical processes, there will be more opportunities to test it, and that could in some respects make it easier to check," says Jones. "Still, it has a certain advantage over the alternatives: it kills two birds with one stone."
Citation: Tom Banks, Sean Echols and Jeff L. Jones, "Baryogenesis, dark matter and the pentagon", J. High Energy Phys. JHEP11 (2006) 046
Author: Laura Mgrdichian
Original date: 2007-01-31