The MDL Times - Science and Tech. News on MDL

Discussion in 'Serious Discussion' started by kldpdas, Jun 30, 2011.

  1. half Man Half Biscuit

    half Man Half Biscuit MDL Addicted

    Lighting the universe
    Rethinking what the first stars were like



    The Big Bang wasn’t all it has been cracked up to be. Sure, it created the universe. But after the heat of the primordial fireball faded, the cosmos plunged into darkness. The universe was cold and black — a sea of hydrogen and helium atoms mixed with a mysterious dark form of matter making its presence known only by its gravity. No stars.

    It took a series of violent events — starting about 100 million years after the Big Bang—to end the cosmic Dark Ages. First, the evenly spread dark matter gathered into clumps, pulling in hydrogen gas that coalesced into clouds. Then pressure inside the clouds grew strong enough to fuse atoms, triggering nuclear reactions. The first stars created this way looked like roses with diaphanous petals, unfolding against a sea of darkness. The universe was finally in bloom.

    The first stars marked a milestone in the history of the universe, bringing light and warmth back to the cosmos. Later, those primeval stars met their end in spectacular explosions known as supernovas, which seeded the universe with its first dollops of oxygen, carbon and silicon. Those elements made it possible for a second generation of stars to form.

    The second-gen stars eventually burned through the opaque fog of hydrogen atoms and set the skies twinkling. These stars gathered into the first recognizable galaxies — dwarf galaxies of a few million stars. Dwarf galaxies merged, and after billions of years life emerged in one of the bigger galaxies, on a smallish backwater planet called Earth.

    On that much, astronomers agree. But new simulations that track the star-formation process further than ever before are casting doubt on earlier ideas about the properties of the first stars. They’ve been cast as loners and extremely massive, for instance. But now the massive-loner theory is in dispute. And that has profound consequences for nearly everything that happened next, because the mass of the first stars may have determined the size of the first galaxies and how quickly the second generation of stars could assemble to form them.

    “There is widespread confusion and disagreement,” says astronomer Jason Tumlinson of the Space Telescope Science Institute in Baltimore. “I can no longer say with any confidence what the first stars were like.” But, he adds, “that’s what makes the field so exciting.”
    Recent simulations have shown that some of the first stars may have formed as twins or even triplets. Here, star embryos (crosses) form in a swirling cloud of hydrogen and helium gas. Credit: P. Clark, S. Glover, R. Smith, T. Greif, R. Klessen, V. Bromm

    New simulations, new ideas


    Retracing the steps of star formation is a tricky business. Less than a decade ago, computer simulations by Tom Abel of Stanford’s Kavli Institute for Particle Astrophysics and Cosmology and his colleagues indicated that the first stars were whoppers — between 30 and 300 times as heavy as the sun — and that each formed in solitary confinement within separate clouds of gas (SN: 6/8/02, p. 362). The gas showed no sign of fragmenting into several stars; instead, it appeared that the condensing object would keep growing to become one behemoth. And because massive stars die out in just a few million years, none of these first stars could still exist in the universe today.

    Although the researchers could follow the steps toward star formation during the first 100 million years or so of cosmic history, they could not track the additional 100,000 years it takes for an infant star to grow to its final size. The team had to stop because supercomputers couldn’t — and still can’t — precisely track the rapid changes in density a cloud core undergoes as it becomes a star.

    Using a mathematical trick, however, other teams have now gone slightly further, simulating about 1,000 years more of the star-formation process. Rather than attempting to track the rapid changes in the dense cloud core, these teams in effect ignore the core, treating it as a sink or black hole, with material falling onto the central region simply disappearing from sight.
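    To make that trick concrete, here is a minimal sketch of how a "sink" works: gas parcels that fall inside a chosen radius are removed from the calculation and their mass is added to a point-like core, so the code never has to resolve the runaway densities inside. This is only a toy illustration of the idea, with made-up units and an assumed seed mass, not the research teams' actual simulation codes.

```python
# Toy "sink" accretion: absorb gas parcels inside R_SINK instead of resolving
# the ever-denser core, which would otherwise force vanishingly small timesteps.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
pos = rng.normal(scale=1.0, size=(n, 3))      # gas parcels (arbitrary units)
vel = -0.2 * pos                              # slow initial infall (assumed)
mass = np.full(n, 1e-3)

sink_mass = 0.1                               # assumed seed mass for the unresolved core
R_SINK = 0.05                                 # everything inside counts as "the core"
G, dt = 1.0, 1e-3

for _ in range(5000):
    d = np.linalg.norm(pos, axis=1, keepdims=True)
    d = np.maximum(d, R_SINK)                 # soften to avoid huge accelerations
    vel += -G * sink_mass * pos / d**3 * dt   # pull from the sink at the origin
    pos += vel * dt

    inside = np.linalg.norm(pos, axis=1) < R_SINK
    if inside.any():                          # absorb, don't resolve
        sink_mass += mass[inside].sum()
        pos, vel, mass = pos[~inside], vel[~inside], mass[~inside]

print(f"sink mass: {sink_mass:.3f}; parcels remaining: {len(mass)}")
```

    A real simulation would compute the gas's self-gravity and hydrodynamics as well; the point here is only the bookkeeping that lets the clock keep running once the core becomes too dense to follow.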

    Adopting that approach, the researchers have found evidence that a disk of material that forms around each of the embryonic stars can fragment into several fledgling stars, much the way the disk of material around the infant sun broke into clumps that formed the planets (SN: 2/26/11, p. 18).

    The net result, as these astrophysicists now see it, is that stars could have been born in pairs or even threesomes. Since they coalesce from the same cloud, each partner would be lighter than if it had formed in solitary confinement.
    SWISS CHEESE | Researchers hope to reconstruct the star-formation process during the first billion years of cosmic history by measuring the brightness of 21-centimeter radio waves emitted by hydrogen relative to the cosmic microwave background (chart, bottom). Holes should appear (top, left to right) where radiation emitted by stars ionized hydrogen atoms, stripping electrons. As more stars formed, the holes would grow and merge, leaving the universe completely ionized as it is today. By seeing how quickly ionization proceeded, scientists hope to learn whether the first stars formed singly or in multiples. Credit: Texas Advanced Computing Center; J. Pritchard & A. Loeb/Nature 2010

    “Whether at the end of this process one, two or a few massive stars will remain is currently unknown,” says Abel. Some studies even suggest that very small fragments, weighing no more than the mass of the sun, might form. Because low-mass stars take billions of years to burn out, some of the first stars could have survived to the present day, some researchers suggest.

    To find out what the first stars were like, researchers are now looking to the scars those stars left behind — the extent to which they broke apart nearby atoms of hydrogen gas.

    For instance, if most of the first stars were single and massive, they would have transformed the early universe into a giant hunk of Swiss cheese. That’s because big stars emit copious amounts of ultraviolet light, which ionizes surrounding gases — stripping electrons from the neutral hydrogen and helium atoms that veiled the cosmos during the Dark Ages. The birth of each individual star would create an ionized bubble, or hole, in the gases around it. Over time, the universe would be riddled with these holes. Once the holes grew large enough to overlap, the universe would be almost completely ionized — as evidence suggests it has been ever since the cosmos was a few hundred million years old.
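    A toy calculation shows how such a "Swiss cheese" phase would play out: scatter some ionizing sources at random, grow a bubble around each, and watch the ionized filling fraction climb toward 1 as the bubbles overlap. All of the numbers below are illustrative assumptions, not measured quantities.

```python
# Monte Carlo estimate of the ionized filling fraction as bubbles grow.
import numpy as np

rng = np.random.default_rng(1)
n_stars, box = 50, 1.0
centers = rng.uniform(0, box, size=(n_stars, 3))       # ionizing sources
samples = rng.uniform(0, box, size=(20_000, 3))        # random test points

for radius in (0.02, 0.05, 0.1, 0.2):
    # a point is ionized if it lies inside any bubble (edge effects ignored)
    d2 = ((samples[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    ionized = (d2 < radius**2).any(axis=1)
    print(f"bubble radius {radius:.2f}: ionized fraction = {ionized.mean():.3f}")
```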

    But if the very first stars were extremely massive, they could have prevented other stars from forming. The energy from their ultraviolet emissions would break molecules of hydrogen into atoms. Without hydrogen molecules, which provide a clump-promoting cooling effect, the dark matter at the heart of star formation would not have enough gravity to pull gas into a star.

    If the new simulations showing that primeval stars were born with partners are correct, the universe might never have gone through a Swiss cheese phase, Zoltán Haiman of Columbia University thinks. If the partnerships were close enough, one star would be more likely to collapse to become a black hole and draw matter from the other, emitting X-rays in the process. Far more penetrating than ultraviolet light, the X-rays would rapidly strip electrons from hydrogen and helium atoms throughout the cosmos, leaving a uniformly ionized universe instead of holes, Haiman suggested in the April 7 Nature.

    The stellar-partnership scenario could explain an enduring puzzle in the universe today, suggests a team led by I. Félix Mirabel of the French Atomic and Alternative Energies Commission in Gif-sur-Yvette, France and the Institute for Astronomy and Space Physics in Buenos Aires. The leading theory of dark matter predicts that the Milky Way should be surrounded by hundreds of dwarf galaxies, but observers have found only about 25. Mirabel’s team suggests in the April Astronomy & Astrophysics that the other dwarf galaxies exist but can’t be seen because they’re starless — shadowy leftovers from the early universe, when such galaxies were too small to either forge or hold onto the first stars.
    AFTER THE BANG | The universe was a quiet place for millions of years after the Big Bang, plunged into darkness when electrons and protons cooled enough to combine into neutral hydrogen atoms. Today, scientists are reconstructing the series of events that led to the first stars, galaxies and ultimately the universe as seen today. Credit, from left: pederk/istockphoto; T. Dubé; Detlev van Ravenswaay/Photo Researchers, Inc.; Hubble Heritage Team/STSCI/AURA/NASA; John H. Wise; NASA, JPL-Caltech, T. Pyle/SSC; NASA, ESA, Hubble Heritage Team/STScI/AURA

    Researchers, however, don’t agree on how these X-ray–emitting partnerships would affect the universe. According to Haiman, the partners would emit so much more heat than a lone star that they would delay the formation of the first galaxies.

    The extra heat from the stellar partners could boost the temperature and pressure of surrounding gases and prevent any clump of matter weighing less than a billion suns from corralling the gas to make new stars. Waiting around until dark matter clumps were that heavy may have delayed the onset of galaxy formation by 100,000 years.
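    The reasoning here rests on standard Jeans-mass scaling, M_J ∝ T^(3/2)/√ρ: heat the gas and the minimum mass that can collapse rises steeply. A rough sketch, using assumed (not published) densities and temperatures:

```python
# Jeans-mass arithmetic: hotter gas needs a much heavier halo to collapse.
import numpy as np

k_B   = 1.380649e-23      # J/K
G     = 6.674e-11         # m^3 kg^-1 s^-2
m_H   = 1.6735e-27        # kg
M_sun = 1.989e30          # kg
mu    = 1.22              # mean molecular weight of neutral primordial gas

def jeans_mass(T, n):
    """Jeans mass (solar masses) for temperature T [K], number density n [m^-3]."""
    rho = mu * m_H * n
    return (5 * k_B * T / (G * mu * m_H))**1.5 * (3 / (4 * np.pi * rho))**0.5 / M_sun

n = 1e6   # ~1 atom per cubic centimeter (illustrative)
for T in (100, 1000, 10000):
    print(f"T = {T:>5} K  ->  M_J ~ {jeans_mass(T, n):.2e} solar masses")
```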

    But other astronomers disagree. Some theorists argue that rather than delaying the first galaxies, X-ray–emitting binaries would promote cooling that would hasten star formation. Tumlinson notes that through a chain of chemical reactions, X-rays would promote the formation of the HD molecule, in which one hydrogen atom is replaced by its heavier isotope, deuterium. That molecule might act as a new coolant.

    “People argue about this for hours at meetings and still there’s no consensus,” notes Tumlinson.

    Ground truth

    As the theorists continue to debate their models, observations to test their ideas are about to begin.

    New arrays of radio telescopes will look for imprints that the first stars left behind on the clouds of hydrogen atoms surrounding them. Radio astronomers can tune in to radio waves from hydrogen atoms that existed at different epochs of the Dark Ages — before, during and after the first stars formed — thanks to shifts in wavelength caused by the expansion of the universe.

    In particular, astronomers will look for radio emissions with wavelengths of 21 centimeters, which neutral hydrogen emits but ionized hydrogen cannot. If the Swiss cheese model is correct and the first stars were massive loners, observers should see the holes created when the stars broke apart the neutral hydrogen atoms.
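    The mapping between epoch and radio dial is simple redshifting: hydrogen's rest-frame 21.1-centimeter line (1420.4 MHz) arrives stretched by a factor of 1 + z, so each observed frequency corresponds to one cosmic epoch. A quick table for Dark Ages redshifts:

```python
# nu_obs = 1420.406 MHz / (1 + z); lambda_obs = 21.106 cm * (1 + z)
NU_REST_MHZ = 1420.406

for z in (6, 10, 15, 20, 30):
    nu = NU_REST_MHZ / (1 + z)
    lam_cm = 21.106 * (1 + z)
    print(f"z = {z:>2}: observe at {nu:6.1f} MHz  (wavelength {lam_cm:5.0f} cm)")
```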

    By using 21-centimeter radiation to pinpoint if and when holes formed and merged, low-frequency radio telescopes such as LOFAR, a set of radio dishes spread across the Netherlands and other parts of Europe, will map out the history of the first stars, says Avi Loeb of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass. Such maps should indicate whether the first stars were massive loners after all.

    Last year in Physical Review D, Loeb and his Harvard-Smithsonian colleague Jonathan Pritchard calculated that even a relatively inexpensive single radio dish that would record the intensity of the 21-centimeter radio emission averaged over the entire sky could indicate when the first stars were born and how quickly they ionized helium and hydrogen atoms by emitting ultraviolet light or X-rays.

    Other researchers are attempting to read a fossil record of the elements cast into space by the very first generation of stars. Theorist John Wise of Princeton University and his colleagues are trying to simulate the second generation of stars, dubbed Pop II, which are the first stars that got incorporated into galaxies. Because Pop II stars are small enough to be relatively long-lived, researchers can examine them to see what they inherited from their parents’ generation.

    “Astronomers are actually able to see Pop II stars in galaxies” and learn about their predecessors, says Wise. In addition to giant, 30-meter ground-based telescopes that astronomers are now planning to build, the James Webb Space Telescope, which researchers hope will launch late this decade, will closely examine Pop II stars from the first galaxies.

    But researchers aren’t just waiting for Webb to be launched. Astronomers using the European Southern Observatory’s Very Large Telescope in Chile are getting a head start by re-examining the surfaces of eight elderly Milky Way stars. The stars are at least 12 billion years old and are probably members of the Pop II generation, Cristina Chiappini of the Leibniz Institute for Astrophysics Potsdam in Germany and her colleagues report in the April 28 Nature.

    The team found high abundances of two rare, heavy elements — strontium and yttrium — relative to iron. To explain the composition of those second-generation stars, the researchers propose that the first stars were massive and rotated rapidly, spinning about 250 times faster than the sun. By mixing different layers of nuclear-burning gases, these whirling dervishes could trigger a chain of nuclear reactions that could have produced the high levels of strontium and yttrium.

    If the first stars were fast rotators, they would be more likely to end their lives as gamma-ray bursts, Tumlinson notes in a commentary accompanying the Nature article. Such bursts are the most powerful explosions in the universe and would serve as cosmic fireworks that would brilliantly signal the first stars’ demise.

    The bursts would be the ultimate messengers — death throes that traveled billions of light-years through space to reach Earth. For Loeb, recording those signals would be the thrill of a lifetime. “This is our roots, our origins,” he says. The bursts would put humans face to face “with our earliest ancestors, one star at a time.”
     
  2. UVAIS

    UVAIS MDL Expert

    :eek::eek::worthy::worthy: Thanks HMHB
     
  3. UVAIS

    UVAIS MDL Expert

    Chinese couple sells their three children to surf net!!!

    :eek:

     
  4. Opus

    Opus MDL Member

    A New Direction for Digital Compasses

    The advance could lead to motion sensors showing up in running shoes and tennis rackets.

    Cell phones and many other mobile devices now come packed with sensors capable of tracking them as they move. The digital compasses, gyroscopes, and accelerometers embedded in such devices have spawned a wide range of location-based services, as well as novel ways of controlling mobile gadgets—for instance, with a shake or a flick. Now a new way of making these sensors could make such technology smaller and cheaper.

    The advance could also result in motion sensors appearing in many more devices and objects, including running shoes or tennis rackets, says Nigel Drew of the Barcelona, Spain-based Baolab Microsystems, which developed the new technology.

    Baolab has made a new kind of digital compass using a simpler manufacturing method. The technology will appear in GPS devices next year, says Drew. The company has also made prototype accelerometers and gyroscopes, and plans to combine all three types of sensor on the same chip.

    Conventional digital compasses are made using what's called complementary metal-oxide-semiconductor manufacturing, the most common method for making microchips and electronic control circuitry. But such compasses include structures such as magnetic field concentrators that need to be added after the chip is made, which adds complexity and cost. "The fundamental difference is that [our compass is] made entirely within the standard CMOS," says Drew.

    This is possible because the compass exploits a phenomenon called the Lorentz force. Most commercial digital compasses rely on a different phenomenon, the Hall effect, which works by running a current through a conductor and measuring the changes in voltage caused by the Earth's magnetic field.

    The Lorentz force, in contrast, is the force that a magnetic field exerts on a conducting material carrying a current. A device can determine the magnetic field by measuring the displacement of an object on which this force acts.

    In Baolab's chips, a nanoscale micro-electromechanical system (MEMS) is etched out of a conventional silicon chip. This nano-MEMS device consists of an aluminum mass suspended by springs. When a device drives a current through the mass, any magnetic field present will exert a force on the mass and affect its resonance. A pair of metal plates flanking the mass detects these changes: the device can then measure the magnetic field in one direction by noting minuscule changes in the capacitance of the plates. Using a set of three of these sensors, the device can determine the direction of the Earth's magnetic field, and hence its own orientation.
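    To get a feel for the scale of the signal, here is an order-of-magnitude sketch of that sensing chain, force, deflection, capacitance shift, using standard textbook formulas. Every parameter value below is an assumption for illustration, not a Baolab specification; real devices also drive the mass at resonance to amplify the response.

```python
# Lorentz-force magnetometer sketch: F = I * L * B for a perpendicular field,
# static deflection x = F / k, then the resulting parallel-plate capacitance shift.
EPS0 = 8.854e-12          # F/m, permittivity of free space

I = 1e-3                  # drive current, A (assumed)
L = 90e-6                 # conductor length, m (order of the quoted 90 microns)
B = 50e-6                 # Earth's field, ~50 microtesla
k = 1.0                   # spring constant, N/m (assumed)

F = I * L * B             # Lorentz force on the suspended mass
x = F / k                 # static deflection

# parallel-plate capacitance before/after deflection (assumed geometry)
A, d = (90e-6)**2, 1e-6
C0 = EPS0 * A / d
C1 = EPS0 * A / (d - x)

print(f"force {F:.2e} N, deflection {x:.2e} m, dC = {C1 - C0:.2e} F")
```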

    "This sort of MEMS-CMOS co-integration technology will improve the sensitivity and enable smaller, and therefore cheaper, sensor chips compared to the conventional ones," says Hiroshi Mizuta, a professor of nanoelectronics at Southampton University's NANO Group.

    Each of Baolab's nano-MEMS sensors is less than 90 microns long. Drew says it should be possible to integrate all three types of sensors into a single chip just three millimeters long.

     
  5. Opus

    Opus MDL Member

    #65 Opus, Aug 3, 2011
    Last edited by a moderator: Apr 20, 2017
    Euclideon claims to revolutionize gaming graphics tech

    Small Australian developer Euclideon has released a video on YouTube claiming that its new graphics processing technology will revolutionize how games are made forever.

    This isn't the first we have heard of this technology or this company. In early 2010, Unlimited Detail's CEO Bruce Dell released a video demonstrating the company's progress with this new approach to computer graphics. After the video went viral, the company went quiet, leading many to conclude that the video was fake or that the company had dissolved.

    The team reemerged this week with a new video under a new company name. Unlimited Detail is now known as Euclideon and this video installment looks even more promising than the last, at least on the surface.

    Without delving into too much technical detail, the Unlimited Detail method works by eliminating traditional polygon limits, instead using "unlimited detail point cloud data". A point cloud is made up of huge numbers of tiny points, or "atoms", the same approach used in medical and scientific imaging. This method allows for much higher detail but demands a lot of processing power, so until now a full game level couldn't be created with it.
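    For readers curious what point-based rendering looks like in practice, below is a generic toy "splat" renderer: project each point onto the screen and keep the nearest one per pixel via a depth buffer. It illustrates point-cloud rendering in general, not Euclideon's undisclosed search algorithm; all scene parameters are made up.

```python
# Minimal point-cloud splatting with a z-buffer.
import numpy as np

rng = np.random.default_rng(2)
points = rng.uniform(-1, 1, size=(50_000, 3))
points[:, 2] += 3.0                       # push the cloud in front of the camera

W = H = 64
depth = np.full((H, W), np.inf)
f = 40.0                                  # focal length in pixels (assumed)

# perspective projection: u = f*x/z, v = f*y/z (pinhole camera at the origin)
u = (f * points[:, 0] / points[:, 2] + W / 2).astype(int)
v = (f * points[:, 1] / points[:, 2] + H / 2).astype(int)
ok = (0 <= u) & (u < W) & (0 <= v) & (v < H)

for x, y, z in zip(u[ok], v[ok], points[ok, 2]):
    if z < depth[y, x]:                   # keep the closest point per pixel
        depth[y, x] = z

print(f"{np.isfinite(depth).mean():.0%} of pixels covered by point splats")
```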

    Euclideon claims their technology will render polygons useless by instead using these tiny atoms to create objects and worlds, just like how things are created in the real world. The theory and video are certainly impressive, but many still have their doubts about how it would actually work in a real game.

    The video demo is just that: a rendered video. No animations are shown, and features like collision detection could prove a challenge. Others with knowledge of the subject suggest this is just a rehash of voxel technology, which is very memory-intensive.

    To their credit, Euclideon makes it clear that they are not graphic artists or game developers. Once their SDK is finished “in several months”, they hope to pass it along to true game developers.

     
  6. R29k

    R29k MDL GLaDOS

    Black holes and pulsars could reveal extra dimensions

    MAKING a black hole let go of anything is a tall order. But their grip may slowly weaken if the universe has extra dimensions, something that pulsars could help us to test.
    String theory, which attempts to unify all the known forces, calls for extra spatial dimensions beyond the three we experience. Testing the theory has proved difficult, however.
    Now John Simonetti of Virginia Tech in Blacksburg and colleagues say black holes orbited by neutron stars called pulsars could do just that - if cosmic surveys can locate such pairings. "The universe contains 'experimental' setups we cannot produce on Earth," he says.
    Black holes are predicted to fritter away their mass over time by emitting particles, a phenomenon called Hawking radiation. Without extra dimensions, this process is predicted to be agonisingly slow for run-of-the-mill black holes weighing a few times as much as the sun, making it impossible to measure.
    Extra dimensions would give the particles more ways to escape, speeding up the process. This rapid weight loss would loosen a black hole's gravitational grip on any orbiting objects, causing them to spiral outwards by a few metres per year, the team calculates (The Astrophysical Journal, DOI: 10.1088/2041-8205/737/2/L28).
    A pulsar orbiting a black hole could reveal this distancing, because the lighthouse-like pulses of radiation the pulsar emits would vary slightly depending on the size of its orbit.
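    The quoted scale is easy to sanity-check: for mass loss slow compared with the orbital period, the orbit widens adiabatically with a ∝ 1/M, so da/a = -dM/M. In the sketch below the mass-loss rate is an arbitrary stand-in chosen to reproduce the metres-per-year scale, not the extra-dimensional rate computed in the paper.

```python
# Adiabatic orbital widening from black hole mass loss: da/dt = -a * (dM/dt) / M.
M_sun = 1.989e30
a = 1.0e9                  # orbital radius, m (assumed)
M = 10 * M_sun             # black hole mass (assumed)

dM_dt = -2e15              # kg/s, illustrative stand-in only
seconds_per_year = 3.156e7

da_dt = -a * dM_dt / M     # widening rate, m/s
print(f"orbit widens by ~{da_dt * seconds_per_year:.1f} m per year")
```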

    Source
     
  7. R29k

    R29k MDL GLaDOS

    Blazars and Active Galaxies

    GLAST will study a wide variety of astronomical objects and phenomena, but according to GLAST Project Scientist Steve Ritz of NASA's Goddard Space Flight Center in Greenbelt, Md., "Active galactic nuclei will be GLAST's bread and butter. There are guaranteed results."

    Active galactic nuclei, or AGN for short, are galaxies with extraordinarily luminous cores powered by black holes containing millions or even billions of times more material than our Sun. As gas is trapped by a monster black hole's gravity, it settles into an accretion disk and starts to spiral down the Universe's ultimate drain. Before the gas crosses the black hole's outer boundary (the event horizon) — beyond which nothing can escape — the material generates a vast outpouring of electromagnetic radiation. In the most luminous AGN, the visible light exceeds the combined output of an entire galaxy's worth of stars, even though the light-emitting area is only about the size of our solar system.

    Even more amazing, radio, optical, and X-ray telescopes have resolved jets shooting away from galactic cores in opposite directions. The material in these jets can rip across space at more than 99% the speed of light, and some jets remain tightly collimated for hundreds of thousands of light-years. When a jet points almost directly toward Earth, the material can appear to be moving faster than the speed of light. This superluminal motion is an illusion caused by the geometry of a source moving at high speed that is nearly but not perfectly head-on.
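    The illusion follows from a standard bit of geometry: a blob moving at speed β (in units of c) at angle θ to the line of sight appears to move across the sky at β_app = β sin θ / (1 - β cos θ), which can exceed 1 for fast, nearly head-on jets. For example:

```python
# Apparent (superluminal) speed of a relativistic jet blob.
import math

beta = 0.995                       # 99.5% of the speed of light
for theta_deg in (2, 5, 10, 30, 60):
    th = math.radians(theta_deg)
    beta_app = beta * math.sin(th) / (1 - beta * math.cos(th))
    print(f"theta = {theta_deg:>2} deg -> apparent speed = {beta_app:.1f} c")
```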

    But despite the staggering scale and speed of these jets, astronomers haven't been able to answer the most basic questions about them, such as how matter is accelerated to within a whisker of the speed of light. "We don’t know what the jets are made of or how they are produced. It is one of the biggest unsolved mysteries of astrophysics. But jets are the link between the activity of the supermassive black hole and the AGN's surrounding environment in intergalactic space," says Peter Michelson of Stanford University in California, who is the Principal Investigator of GLAST's primary science instrument: the Large Area Telescope (LAT).

    The LAT will probably detect gamma rays from different types of AGN, such as radio galaxies, Seyfert galaxies, quasars, and blazars. But the biggest contribution may come from blazars, which are thought to be AGN whose black holes aim their jets almost directly at Earth. Whereas the Energetic Gamma-Ray Experiment Telescope (EGRET) on NASA's Compton Gamma-ray Observatory identified 66 blazars during the mission, GLAST should see thousands. By studying the energy spectra and variability of gamma rays and other wavelengths of light coming from blazars, the LAT instrument should be able to determine the composition of the jets, establishing whether they are dominated by electrons and positrons (the antimatter counterpart of electrons), or by protons.

    In this radio image, two jets shoot out of the center of active galaxy Cygnus A. GLAST may solve the mystery of how these jets are produced and what they are made of. Credit: NRAO

    "When GLAST detects a blazar, it is monitoring violent activity from a black hole taking place in the distant past," says GLAST Interdisciplinary Scientist Charles Dermer of the Naval Research Laboratory in Washington, D.C. "Understanding gamma rays from these sources is a form of black hole archeology that reveals the high-energy history of our Universe."

    The LAT may also detect AGN that do not produce jets, or whose jets are not aimed directly at Earth. EGRET saw hints of gamma rays from at least two radio galaxies. The High Energy Stereoscopic System (H.E.S.S.), an array of four telescopes currently operating in Namibia, has discovered that gamma rays are coming from the giant elliptical galaxy M87, whose jets do not point toward Earth. These gamma-ray photons may originate from a region of the accretion disk very near the central black hole. By observing these and other galaxies, the LAT should provide precious insights into the mechanism that powers AGN activity.

    Moreover, the LAT will investigate a curious discrepancy between EGRET and results from several ground-based observatories, including the Whipple Observatory in Arizona. EGRET detected low-energy gamma rays from blazars, whereas Whipple has discovered high-energy TeV-level gamma rays. "Ground- and space-based telescopes have detected blazars, but there is almost no overlap in the blazars they detect," notes GLAST Deputy Project Scientist Julie McEnery of NASA Goddard. "Clearly, each type of telescope is seeing a different type of object."

    With its ability to survey the entire sky every three hours, the LAT will undoubtedly catch many AGN giving off giant flares of energy, and this flaring is one of the most important tools for studying AGN. Blazars in particular are extremely variable at all wavelengths, changing both their total energy output and spectra on timescales ranging from less than an hour to many years. The relationship of variability at different wavelengths is a crucial test for models attempting to explain these outbursts and to identify the nature of the jet particles. Obtaining measurements across the spectrum is challenging, especially on short timescales, so GLAST team members will communicate with other astronomers, who can point various ground- and space-based telescopes at flaring blazars.

    by Robert Naeye

    Source
     
  8. R29k

    R29k MDL GLaDOS

    The pressure there is 8 tons per square inch :hmm:
     
  9. R29k

    R29k MDL GLaDOS

    The limits of Gravity, Space and Time ...




    NASA's Cosmic Journeys: in Search of Gravity

    NASA is planning a series of space science missions that would take us to the limits of space and time. These missions, collectively known as Cosmic Journeys, seek to understand the nature of gravity - the force generating the fantastic outpouring of energy around a black hole, and one that may have been intertwined with the other three fundamental forces at the moment of the Big Bang.

    We are embarking upon a cosmic journey. From the safety of our home planet Earth, scientists plan to explore the very limits of the known universe. Our travels will take us to where space and time cease to exist as we know them, and to where the secrets of the past and future lie captured in the starlight of the present across an expanse of billions of light-years. Cosmic Journeys, a new series of NASA space science missions, will take us to the limits of gravity, space and time.

    This virtual journey will use the power of resolution far greater than what current telescopes can muster to transport us to the rim of a black hole, to eagle-eye views of the galaxies and voids that pervade the Universe, and to the earliest moments of time, just fractions of a second after the Big Bang.

    The goal of our Cosmic Journeys is to solve the mystery of gravity, a force that is all around us but cannot be seen. If you have ever slipped on a wet floor or had your favourite scoop of ice cream tumble over your cone, then you have come face to face with gravity. This is the force that keeps us pinned to the Earth, no matter if we live in Norway or Australia. Indeed, the Space Shuttle requires huge rocket boosters just to escape the Earth's gravity. And even in orbit, the Space Shuttle still feels the Earth's pull.

    Gravity acts from here to the edge of the Universe, affecting all that is seen and much of what remains unseen. This force can never be completely removed from an environment, unlike sound or light waves. Even in the 'zero gravity' of space, there is still the force of gravity that cannot be avoided or screened - a force that always attracts, never repels.

    We clearly understand what gravity does, but we do not fundamentally know how it does it. Yet it is this force that holds the answers to the most basic questions of our humanity, such as what is the universe made of, how does it grow, and what is its fate?

    Gravity has puzzled the greatest minds of the past century. Albert Einstein described gravity in a revolutionary way in his Theory of General Relativity, which says that mass distorts space and time to produce the force of gravity. A black hole is an extreme example of mass warping space-time. Einstein also predicted that gravity propagates in waves, just like light. These would be ripples in the fabric of space that move at the speed of light. Gravity may be associated with a particle, called the graviton. If so, gravity may be similar to the other fundamental forces of nature.

    The difficulty is that gravity doesn't fit into what scientists call the Standard Model, which describes the behavior of light and subatomic particles. We do not have one model that can describe everything in the Universe. Instead, we have two theories: General Relativity and Quantum Physics. General Relativity accounts for gravity, the force that acts across large scales. Quantum Physics, part of the Standard Model, describes the behavior of the other three fundamental forces: electromagnetism, weak forces (seen in radioactive decay), and strong forces (holding subatomic particles together). These forces act over small scales.

    Einstein spent most of his life trying to make things simpler, to find laws of physics more general than known before and to unite gravity with electromagnetism. Today, we may be very close to merging these concepts of Quantum Physics and General Relativity into what scientists call the 'Theory of Everything', a unified theory that predicts the behavior of all matter and energy in all situations. Such a theory would be a windfall for science, likely leading to spectacular technological advancements that we cannot even begin to imagine. Gravity is the secret ingredient in this Grand Unified Theory. So we must move beyond the Standard Model to reach our goal.

    Moving beyond the Standard Model requires us to investigate the connection between General Relativity and Quantum Physics. Do these two theories meet in the earliest moments after the Big Bang, when the size of the newly formed, ultra-hot Universe was confined to quantum (very small) scales possibly described by quantum gravity? The answer may lie at 10^-44 seconds after the Big Bang, when the Universe visible to us today was only 10^-33 cm wide and when gravity - confined within what physicists call the Planck scale - played a role equal to the other forces.

    Also, is General Relativity the ultimate theory of gravity in the Universe? Do black holes, predicted by General Relativity, truly exist? Or are these black holes that fill the Universe some different type of phenomenon?

    NASA's Cosmic Journeys seeks to answer these questions by using the Universe as a laboratory to probe the most extreme environments of gravity and temperature and the earliest moments of time. These environments exist for us to visit today in the vicinity of black holes, where gravity is king; in the early Universe, where space was hot and dense enough to perhaps unite gravity with the other three fundamental forces; and on a universal scale, where the gravity of dark matter shapes galaxies and clusters of galaxies into walls and voids. Thus, our Cosmic Journeys, in pursuit of gravity, will take us to these regions of the Universe.

    Journey to a Black Hole

    Nowhere is gravity greater than in the region around a black hole. These objects exert a gravitational force so great that not even a beam of light can escape their pull.

    There are two main types of black holes: the stellar black hole and the supermassive black hole. A stellar black hole is a massive star that ran out of fuel. Without fuel, the core of the star collapses and the outer shell explodes into space. We can often see this explosion as a beautiful supernova remnant. The collapsed core becomes the stellar black hole, an infinitely dense object.

    A supermassive black hole lies in the core of perhaps all galaxies, including our own. This type of black hole is up to a billion times more massive than the stellar variety, and we do not know how it forms. Cosmic Journeys missions concentrate mostly on the more powerful, supermassive variety.

    Although black holes emit no light, we can still see the action around them. Their intense gravitational fields pull in surrounding matter, perhaps from a nearby star or from interstellar gas floating freely. This transfer of gas spiraling toward the black hole, called accretion, is amazingly bright in many wavelengths of light. Once light crosses the boundary of a black hole, called the event horizon, it is lost forever. The light we see, therefore, has escaped that final plunge. Other particles are not so lucky.

    The Hubble Space Telescope and the Chandra X-ray Observatory are finding that black holes are everywhere - alone in empty space, in the hearts of normal galaxies, and in the chassis of powerful quasars. Both of these telescopes are producing superb images of a variety of objects and phenomena, each one showing gravity hard at work.

    Hubble and Chandra, which collect optical and X-ray light respectively, are like spaceships transporting us to the world of black holes. They have taken us to the ballpark; we have a taste of the excitement. Now we want to get front-row seats. That is, we want to get close enough to actually take a picture of the black hole itself, beyond the accretion disk. This is a central Cosmic Journeys goal.

    A direct image of gravity at its extreme will be of fundamental importance to physics. Yet imaging a black hole requires a million-fold improvement over Chandra. That's a big step. Over the next 20 years, the Cosmic Journeys missions will take us closer and closer to a black hole through the power of resolution. Each successive mission will further our journey by 10- or 100-fold increases in resolution, step by step as we approach our goal of zooming in a million times closer. And each stop along the way will bring us new understandings of the nature of matter and energy.

    GLAST is a gamma-ray observatory mission that will observe jets of particles that shoot away in opposite directions from a supermassive black hole at near the speed of light. We do not fully understand how a black hole, which is known for pulling matter in, can generate high-speed jets that stretch out for billions of miles. Galaxies that harbor black holes with a jet aimed in our direction are called blazars, as opposed to quasars, which have their jets aimed in other directions. GLAST, up to 50 times more sensitive than previous gamma-ray observatories, will stare down the barrel of these jets to unlock the mechanism of how the enigmatic jets form.

    The Constellation-X mission will probe the inner disk of matter swirling into a black hole, using spectroscopy to journey 1,000 times closer to a black hole than any mission before it. With such resolution, Constellation-X will be able to measure the mass and spin of black holes, two key properties. This X-ray mission will also map the distortions of space-time predicted by Einstein. Constellation-X draws its superior resolution from pooling the resources of four X-ray satellites orbiting in unison into one massive X-ray telescope.

    The ARISE mission will produce radio-wave images from the base of supermassive black hole jets with resolution 100,000 times sharper than Hubble. Such unprecedented resolution can reveal how black holes are fed and how jets are created. ARISE will attain this resolution through interferometry, a technique used today with land-based radio telescopes. Smaller radio telescopes spread out on land - perhaps one mile apart - can work together to generate a single, huge radio telescope with the collecting power of a one-mile radio dish. ARISE will utilize one large radio telescope in space with many other radio telescopes on Earth, bringing what is now a land-based technology to new heights.

    Closer and closer we will travel through resolution. The MAXIM mission, a million times more powerful than Chandra, will capture a direct image of a black hole. MAXIM will be another interferometry mission, with many smaller components positioned in a deep Earth orbit to focus X-ray photons onto a detector. X-ray interferometry, an emerging technology, has the potential to resolve the event horizon of a supermassive black hole in the nucleus of a nearby galaxy and at the center of our galaxy. This is equivalent to resolving a feature the size of a dinner plate on the surface of the Sun. With MAXIM, we will be able to see light and matter plunging across the event horizon. We will also see up close how gravity distorts light and how time comes to a virtual standstill at the event horizon.

    Gravitational wave antennae, a new type of probe ...

    There is another window on the Universe, different from light waves, through which we can see the deepest, most dust-enshrouded sources of strong gravity. LISA is a mission that will probe the Universe through the detection of gravitational waves. These waves come from the violent motions of massive objects, such as black holes. Gravitational waves can pierce through regions of space that light cannot shine through, for matter does not absorb these waves. As such, LISA can detect black hole activity buried within dust and gas that other types of telescopes cannot see.

    With gravitational waves unimpeded by even the foggiest patches of the Universe, LISA will detect far more binary black holes than any satellite that comes before it. These are supermassive black holes in colliding galaxies, or massive stellar black holes orbiting each other. As the orbits slowly decay, the black holes move closer and closer to each other, creating larger and larger gravitational waves as they spiral together. Finally, the black holes coalesce in a tremendous outpouring of energy.

    Like a ship floating on the ocean, LISA will detect the subtle waves that "rock" its gravitational antennae -- moving them less than 100 times the width of an atom over a distance of five million kilometers. LISA comprises three satellites orbiting the Sun in the form of a triangle connected by laser beams. The beams will measure the change in distance between satellites caused by a passing gravitational wave.

    LISA will specifically detect low-frequency gravitational waves, and will thus complement the ground-based gravitational wave detectors now being built, which detect higher-frequency waves. The lower-frequency waves are those produced by coalescing massive black holes, as opposed to merging neutron stars, white dwarfs and smaller black holes.
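    That sensitivity figure is worth translating into the dimensionless strain that gravitational-wave experimenters quote: roughly 100 atomic widths over five million kilometres is a strain of order 10^-18. The arithmetic:

```python
# LISA sensitivity as a dimensionless strain, strain = delta_L / L.
atom_width = 1e-10          # m, rough size of an atom
arm_length = 5e9            # m, five million kilometres

delta_L = 100 * atom_width
strain = delta_L / arm_length
print(f"delta L = {delta_L:.1e} m over {arm_length:.0e} m -> strain ~ {strain:.0e}")
```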

    To the stars and beyond...

    Where else can resolution take us other than to a black hole? Imagine a trip through X-ray resolution to a binary star system called Capella, approximately 45 light-years from Earth. This is the sixth-brightest star visible in the northern hemisphere, located in the constellation Auriga. Do you have your seat belt buckled? Then let's zoom in.

    The Chandra X-ray Observatory sees this star system with up to 500 milli-arcsecond resolution. That is fantastically sharp for an X-ray telescope, yet we plan to do even better. At first glance with X-ray glasses, it is not obvious we are seeing two stars in the Capella system - just a blending into what seems like one bright source.

    At 10 milli-arcsecond resolution, 50 times sharper, we see two distinct sources. Moving in 100 times closer from here, to 100 micro-arcsecond resolution, we see the spherical features of the two stars. MAXIM-Pathfinder will take us to this spot. Swooping in 10 times closer yet, to 10 micro-arcsecond resolution, we see one of the stars as if it were our sun, complete with solar flares.

    The journey ends 10 times closer than this, at 1 micro-arcsecond resolution, where we see detailed features on the surface of the star. The MAXIM mission, placing us a million times 'closer' than Chandra, will approach this micro-arcsecond resolution. This is sharp enough to image a black hole. Through the power of resolution, we will travel 45 light-years in only 20 years' time.
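    The zoom steps above, collected in one place, along with the diffraction-limit arithmetic (θ ≈ 1.22 λ/D) for the baseline an X-ray interferometer would need at micro-arcsecond resolution; the 1 keV photon energy is an assumed, representative value:

```python
# Resolution ladder and a diffraction-limit baseline estimate.
import math

MAS = math.radians(1 / 3600) / 1000        # one milli-arcsecond in radians

steps_mas = [500, 10, 0.1, 0.01, 0.001]    # 500 mas down to 1 micro-arcsecond
for s in steps_mas:
    print(f"{s:8.3f} mas = {500 / s:>9.0f}x sharper than the starting view")

lam = 1.24e-9                              # wavelength of a 1 keV X-ray, m
theta = 0.001 * MAS                        # 1 micro-arcsecond in radians
D = 1.22 * lam / theta                     # required aperture/baseline
print(f"baseline for 1 micro-arcsecond at 1 keV: ~{D:.0f} m")
```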

    Journey Through Dark Matter

    Over 90 percent of the matter in the Universe is in a form we cannot see with any type of telescope. This so-called dark matter might be composed of exotic particles that do not readily interact with our detectors on Earth, perhaps invisible matter that is all around us every day. We simply don't know. The nature of dark matter, in fact, is one of astronomy's greatest mysteries. A Nobel Prize is likely the award for the clever souls who can figure it out!

    If we can't see dark matter, you might ask, how do we know it's there... and in such abundance? Basically, we can feel it. All matter exerts gravity; dark matter is no exception. In the same way that the Earth's gravity keeps us safely on the ground and the Sun's gravity controls the orbits of the planets, the gravitational influence of ubiquitous dark matter is responsible for the very shape of the Universe.

    This is particularly evident over large scales. We know, for example, that there is not enough visible mass in clusters of galaxies to hold all the contents together. There must be the additional mass of abundant dark matter forming the glue.

    One of the great accomplishments of the last decade was creating models of the structure of the Universe with supercomputers. These models essentially place the Universe in a box, starting from the early Universe and expanding to the modern era. Dark matter is not evenly dispersed through the Universe. Instead, it forms a cosmic web, with galaxy clusters at the intersections of long chains of galaxies, all separated by voids of apparently empty space. The gravity of the dark matter is the force behind this structure.

    The Universe started as a dense, ultra-hot bundle of subatomic particles. Slight density fluctuations gave way to the large-scale structure we see today. As the Universe expanded - cooling and heating once again - dark matter collapsed under the force of gravity. Ordinary matter followed this dark matter. Denser regions of dark matter attracted greater amounts of ordinary matter. If dark matter is the web of structure, then ordinary matter - in the form of stars and galaxies - comprises the flies caught on the web.

    NASA's COBE mission searched for and found density fluctuations in the early Universe. These fluctuations were reflected as temperature differences in the cosmic microwave background, the low-energy radiation that formed around 300,000 years after the Big Bang. Supercomputer models of the Universe build upon the COBE data, extrapolating through time to reveal the cosmic web of the modern era. The high- and low-density regions we see with COBE are essentially the walls and voids we see today.

    Although we cannot see the dark matter that influences the structure of the Universe, we can trace it in several ways in hopes of understanding its nature. Two Cosmic Journeys missions, MAP and Planck, will probe the microwave background with even greater resolution than COBE. These missions will sharpen supercomputer models by placing greater constraints on density fluctuations, by determining the shape of the Universe, and by establishing the ratio of ordinary matter to dark matter.

    Another key component to understanding dark matter will be solving the mystery of the missing baryons. More mysterious matter? Yes, it's true. As stated before, over 90 percent of all matter is dark matter. Of the remaining 10 percent, most is missing! This type of matter is called baryonic, the ordinary stuff we see every day, composed of protons, neutrons and electrons. Hydrogen is an example of baryonic matter.

    Large amounts of baryonic matter formed in the Big Bang and are seen in the early, distant Universe in the spectra of light from quasars as it passes through clouds of hydrogen, known as the Lyman-alpha forest. This matter seems to have disappeared, however, from our local universe. Finding it will lead us to the location and distribution of dark matter.

    Constellation-X will search for the missing baryons trapped in the channels of dark matter that connect galaxy clusters, the largest known structures in the Universe. In the early Universe, we see much more hydrogen than we do today because the hydrogen was cold. Clouds of cold hydrogen absorb light that passes through them; we can 'see' the hydrogen clouds by virtue of the light that can't get through. The more hydrogen, the less light that passes through. As the universe grew older, more stars began to 'turn on', adding more heat to galaxy clusters and warming up the hydrogen. The gas also shock-heated as it collapsed under the force of gravity. Hot hydrogen is harder to see: light passes through it without being absorbed as much as when it was colder.

    Constellation-X will instead look for absorption lines from oxygen and other elements heavier than hydrogen. These elements, which constitute perhaps only 1 percent of the missing baryons, tell us how much hydrogen is out there. These absorption lines are very faint, though. Hubble has seen traces of one type of oxygen isotope. Constellation-X will be sensitive enough to detect faint absorption lines from several isotopes of several different elements, providing tighter constraints in calculating the total amount of missing hydrogen. Also, in regions where material is dense enough to glow in X-rays, Constellation-X will observe emission lines from different gases.

    So, by looking for absorption lines from elements heavier than hydrogen, Constellation-X will essentially "X-ray" the structure of the Universe. And through a search for X-ray emission lines, called an X-ray survey, we will have an unbiased way of finding dark matter potentials over a wide range of redshifts and masses.

    Albert Einstein's work also plays into the dark matter search. General Relativity predicts that matter (and the gravity it produces) distorts light. We have seen the way light bends as it passes by galaxy clusters and black holes, two sources of great gravitational force. The gravity of dark matter should also distort light, albeit more subtly.

    Work is underway to search for the effects of gravitational lensing, or the bending of light, produced by dark matter. This involves the careful analysis of light from very distant galaxies for evidence of distortion as the light passes through intervening regions of dark matter. To observers, the light from distant spherical objects is pulled by gravity into elliptical shapes, an effect known as cosmic shear. By analyzing the cosmic shear produced in thousands of galaxies, we can determine the distribution of dark matter over large regions of the sky - a powerful tool to test the foundations of cosmology.

    The GLAST mission takes a different approach. GLAST will search for dark matter by observing the gamma rays produced in the interactions of certain exotic particles - matter yet to be observed in nature but predicted by scientists. Some scientists believe that WIMPs, weakly interacting massive particles, are major contributors to dark matter. WIMPs may have formed in the early Universe and may now reside in dark matter halos that surround galaxies. GLAST will be sensitive enough to detect the gamma rays produced when certain types of WIMPs collide. In this way, GLAST uses the universe as a laboratory to determine whether these exotic particles truly exist in nature.

    The types of dark matter and their amounts are key factors in determining the structure of the Universe as well as its fate - whether it will collapse or expand forever. Scientists classify dark matter into 'cold' and 'hot'. Examples of cold dark matter would be WIMPs and axions. An example of hot dark matter is the neutrino, a particle similar to an electron but with zero charge and very little mass. Neutrinos have been detected, namely by the Super-Kamiokande neutrino detector in Japan.

    Hot dark matter moves quickly and is less gravitationally bound than the cold variety. An abundance of hot dark matter would lead to an evenly dispersed universe. An abundance of cold dark matter, in contrast, would produce a clumpy universe, since its stronger gravitational potential leads to matter congregating about it. The evidence available to us today points to a universe with more cold dark matter than hot dark matter. Thus, we see a clumpy universe, but one that distributes its clumpiness. The collapse of dark matter created this structure, with galaxies and hot gas trapped like flies in a spider's web.

    Evidence is mounting that the universe will expand forever. The biggest discovery of recent years is that the expansion rate of the Universe appears to be accelerating. The most distant galaxies are moving farther and farther apart at an ever-increasing speed. What is driving this acceleration? Not gravity. Gravity should act to slow the expansion rate. If the Universe is truly accelerating, then there might be an unknown form of energy that counters the work of gravity. This is called 'dark energy'.

    Solving the mystery of dark energy involves investigating the underlying conditions that point to its existence. Namely, we must know the density of matter in the Universe and the rate of its expansion over time. The Cosmic Journeys missions will accomplish this by teaming up to take an inventory of the universe across many wavelengths.

    Journey to the Beginning of Time

    How was the Universe formed? The leading model is called the Big Bang theory. This model says that all the matter and radiation we see today originated at a finite time in the past - a singularity that looked like what we would expect to see in the center of a black hole! Physicists and astronomers want to journey back to as close to this time as physically possible.

    What happened in the first second after the Big Bang is as important as the billions of years that have followed. During this time, temperatures were so hot that matter and radiation as we see them today could not exist. What did exist, perhaps, were the many theorized particles that physicists hunt for today. Also, such high temperatures may have allowed gravity to merge with the other three forces.

    Traditionally, physicists have used giant, earthbound particle accelerators to reproduce the heat and environment of the early Universe. So far they have reproduced an environment similar to when the Universe was a ten-billionth of a second old, a period called the electro-weak era, when electromagnetism and the weak force became distinguishable. By 'journeying' to this era, physicists showed that these two forces are two aspects of the same phenomenon. Our goal is to peer back even farther in time, when the Universe was far younger than even a trillionth of a trillionth of a second old.

    Journeying back, we hope to see the inflation era, the Grand Unified Theory (GUT) era, and the speculative superstring era - all occurring in the first fraction of a second after the Big Bang, and all crucial to our understanding of physics beyond the Standard Model.

    During the inflation era, 10^-32 seconds after the Big Bang, the Universe grew trillions of trillions of times larger in a mere thousandth of a second. This theorized rapid expansion period explains the breadth of the Universe we see today. In the GUT era, when the Universe was only 10^-35 seconds old, the strong force was united with the electro-weak force. GUT stands for grand unified theories, because three of the four fundamental forces were united and quantum gravity may have existed. In the superstring era, at 10^-44 seconds, all forces may have been indistinguishable. This is the period called the Planck time, the earliest time about which physicists can speculate. We will likely never be able to reproduce environments associated with these eras of the early Universe on Earth with particle accelerators.

    We can, however, use the Universe as a laboratory and journey back in time. The MAP and Planck missions are our 'time machines'. These satellites will detect the cosmic microwave background, radiation produced when the Universe was only 300,000 years old, before all other forms of light. And they will do so with much greater angular and spectral resolution than any mission that has come before, from the COBE satellite in 1990 to balloon-borne experiments in 2000.

    Slight temperature differences in this microwave radiation reflect density differences from when the Universe was less than 10^-25 seconds old. This places us at the end of the inflation era. MAP and Planck will, in fact, provide the first solid test of the inflation theory.

    You wouldn't think we could travel farther back in time than that, but we plan to. The CMBPOL mission will also observe the cosmic microwave background, only it will search for the polarization of the microwave radiation, not temperature. This mission depends on MAP and Planck's confirmation of inflation theory predictions: a flat Universe with primordial perturbations. Inflation would produce gravitational waves, which could be detected via the unique polarization pattern they inscribe upon the cosmic microwave background. With CMBPOL, we journey to the beginning of the inflation era, at 10^-32 seconds, closer yet to the secrets held in the GUT era.

    A mission to follow after LISA will directly detect gravitational radiation from this period of inflation. The mission involves two independent gravitational wave antennae tuned to a wave period one second long, where the Universe is quiet in all other forms of gravitational radiation. These gravitational waves, relics from the Big Bang, fill the Universe now, only they are too subtle to detect with our current technology.

    Atomic fossils also tell tales ...

    The OWL mission may also probe the inflation era with its detection of rare, high-energy cosmic rays. These cosmic rays are subatomic particles moving so fast that they possess more energy than scientists thought was possible. The cosmic rays had to be produced in the local Universe, for any known particle farther than 150 million light-years away would have lost energy on the long journey to Earth by colliding with cosmic microwave background radiation.

    Some scientists believe that the highest-energy cosmic rays come from the annihilation of topological defects formed during the inflation era. But they may also come from nearby supermassive black holes or even from neutron stars. We could tell which if we had a large sample of these cosmic rays. The highest-energy particles, however, are very rare - striking once per square kilometer per century. As such, no more than a few have been identified with ground-based detectors.

    OWL will employ a new type of technology to monitor huge regions of the Earth's atmosphere for cosmic-ray activity by looking down from space, not up. When the highest-energy particles enter the atmosphere, they produce a faint light that OWL will detect. OWL hopes to identify hundreds of these mysterious particles each year, with the goal of identifying their origin. They could indeed point to a yet-undiscovered phenomenon.

    Cosmic rays are like fossils from the Universe, and NASA has several Cosmic Journeys missions that will collect them. The ACCESS mission will sit on the International Space Station to collect a full range of medium-energy cosmic rays, from hydrogen to bismuth. These cosmic rays are less energetic than the ones OWL will search for, and ACCESS will measure them directly in a box-shaped cosmic ray detector. ACCESS will help determine where these mysterious cosmic rays come from, what they are made of, and how they were accelerated to such high speeds. These cosmic rays likely originate in the remnants of star explosions.

    Cosmic Connections

    NASA won't be alone in its Cosmic Journeys. The space agency will work hand-in-hand with the National Science Foundation (NSF) and the Department of Energy (DOE). The partnership now being forged, called 'Connections: From Quarks to the Cosmos', hopes to bring together physicists, astronomers and other professionals who have traditionally worked independently.

    The NSF and DOE conduct major research projects involving particle accelerators and underground particle detectors; ground-based observations of ultra-high-energy cosmic rays, high-energy gamma rays, dark matter and dark energy; large-scale sky surveys in microwave, radio and optical wavelengths; space-based observations of cosmic rays and gamma rays; and theory and computer simulation work.

    Underground detectors search for neutrinos and relics of dark matter. The NSF- and DOE-supported Super-Kamiokande neutrino detector, for example, is a 50,000-ton tank of ultra-pure water buried nearly one kilometer underground in Japan. Neutrinos are elementary particles somewhat like electrons, only they have zero charge and hardly any mass. Many believe that neutrinos contribute to some of the dark matter mentioned earlier in this article.

    Other earth-based detectors aim to directly detect the dark matter that may be all around us. These include WIMP and axion detectors. Axions, theoretical exotic particles thought to contribute to dark matter, may burst into detectable microwaves when they encounter very strong magnetic fields.

    Particle accelerators are used to produce dark matter particles, discover new forces, and understand why we see more matter than antimatter in the Universe. Two DOE-supported accelerators are at the Fermi National Accelerator Laboratory outside of Chicago and the Stanford Linear Accelerator Center, operated by Stanford University. Particle accelerators have revealed the substructure of the nucleus of the atom. Accelerators work by colliding particles, such as protons or lead nuclei, generating high energies and simulating the conditions of the first moments after the Big Bang, when the Universe was a hot soup of subatomic particles. The Connection among NASA, NSF and DOE will surely generate a total that is greater than the sum of its parts.

    The Cosmic Journey Missions

    Ours is a Cosmic Journey inspired by gravity and propelled by resolution. Each Cosmic Journeys mission will transport us closer to a black hole, closer to the invisible gases that percolate between stars and galaxies, closer to the very beginning of time. We will leave no star unturned.

    We will place the Universe under tight surveillance, examining both the massive and the minute, from merging galaxies ripping apart stars to atomic particles shooting through spectacular accelerators the size of the Milky Way. These phenomena, dictated by gravity, hold the secrets of the birth of the Universe, its fate, and all the swirling and glowing that goes on in between.

    We are on the verge of major breakthroughs based on connecting particle physics, gravity and cosmology. As with previous advances in fundamental physics, this new program may yield "Nobel Prize" discoveries... and perhaps even more dramatic Cosmic Journeys! Here are a few of the Cosmic Journeys missions either approved or under consideration by NASA.

    Approved Missions

    MAP, the Microwave Anisotropy Probe, will produce an accurate full-sky map of the cosmic microwave background with high sensitivity and angular resolution. By measuring temperature fluctuations in this microwave light that bathes the Universe, MAP will provide insight into the nature of gravity, dark matter, and the early growth and ultimate fate of the Universe.

    Swift is a mid-size satellite mission that will detect gamma-ray bursts and 'swiftly' (within a minute) point its UV/optical and X-ray telescopes at the bursts, while at the same time relaying the information to other satellites and telescopes so that they too can observe the bursts. Swift needs to act quickly, because these bursts - the most powerful events known in the Universe other than the Big Bang - last only a few days before fading forever. Gamma-ray bursts occur randomly from all directions; their origin is not known. In between bursts, Swift will be busy studying supermassive black holes.

    GLAST, the Gamma-ray Large Area Space Telescope, will measure the most energetic form of light in the Universe, gamma rays. One of GLAST's many targets will be black hole jets, particle accelerators in space far more powerful than anything we can build on Earth. GLAST will study the mechanism of these jets, as well as search for clues to the nature of dark matter.

    FIRST, the Far InfraRed and Submillimetre Telescope, is a cornerstone mission of the European Space Agency (ESA) that will probe a poorly studied region of the electromagnetic spectrum. FIRST will provide insight into galaxy formation, the life cycle of energy and matter, and gravity at work in the early Universe.

    Planck, named after German scientist and Nobel Prize winner Max Planck, will probe the cosmic microwave background with greater accuracy than MAP. Planck is an ESA mission that will launch with FIRST.

    Missions Under Formulation

    ACCESS, the Advanced Cosmic-ray Composition Experiment for the Space Station, is a cosmic-ray detector to be launched and attached to the International Space Station in 2006 to help us understand the origin, variety, distribution and life span of elementary particles in our galaxy.

    Constellation-X is a next-generation X-ray telescope mission that will investigate black holes, Einstein's theory of general relativity, galaxy formation, the evolution of the Universe on the largest scales, the recycling of matter and energy, and the nature of dark matter. The Constellation-X spectroscopy mission entails four moderate-sized telescopes orbiting and observing in unison, combining to yield the collecting power of one giant telescope.

    LISA, the Laser Interferometer Space Antenna, will observe gravitational waves from the very massive black holes found in the centers of many galaxies. Gravitational waves are one of the fundamental building blocks of our theoretical picture of the universe, yet they have never been observed directly. The LISA mission will consist of three spacecraft forming an equilateral triangle, with a distance of five million kilometers between any two spacecraft. Gravitational waves passing through the solar system will generate small changes in the distances between the spacecraft.
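    To get a feel for how small those distance changes are, here is a hedged illustration. The strain value is an assumed representative amplitude for massive-black-hole signals, not a mission specification:

```python
# Hedged illustration of LISA's measurement challenge. The strain value is an
# assumed representative amplitude for massive black hole signals, not a spec.
arm_length_m = 5.0e9  # five million kilometers between spacecraft (from the text)
strain_h = 1.0e-20    # assumed dimensionless gravitational-wave strain

# A passing wave changes a length L by roughly dL = h * L
delta_L_m = strain_h * arm_length_m
print(f"Arm-length change: {delta_L_m:.0e} m")  # ~5e-11 m, about half an atom's diameter
```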

    Proposed Mission Concepts

    HSI, the High-resolution Spectroscopy Mission, will study the highest-energy X rays with unprecedented sensitivity, addressing fundamental questions on the origin of heavy elements and black holes. HSI will provide the closest look yet at distant quasars and at nearby black holes and neutron stars.

    MAXIM Pathfinder, part of the Micro Arcsecond X-ray Imaging Mission program, will test visionary technology as well as carry out important scientific objectives. This mission, as the name implies, serves as a pathfinder toward the ultimate goal of imaging a black hole, which will be accomplished by the MAXIM mission itself. MAXIM Pathfinder will be about 10,000 times more sensitive than Chandra, and will bring us ever closer to the disks and jets associated with black holes.

    MAXIM, the Micro Arcsecond X-ray Imaging Mission, will image a black hole, a primary goal of NASA's Office of Space Science and Cosmic Journeys. MAXIM must be a million times more sensitive than Chandra to accomplish this. A direct image of gravity at its extreme will be of fundamental importance to physics. MAXIM's unsurpassed resolution - equivalent to resolving a feature the size of a dinner plate on the surface of the Sun, a figure checked in the sketch below - will yield untold discoveries and tremendously improve our understanding of a multitude of cosmic sources.

    EXIST, the Energetic X-ray Imaging Survey Telescope, will collect the highest-energy X-ray photons from sources such as neutron stars, galactic black holes, dust-enshrouded supermassive black holes and regions of nucleosynthesis. EXIST will complement HSI, a hard X-ray spectroscopy mission, and may sit on the International Space Station.

    ARISE, Advanced Radio Interferometry between Space and Earth, comprises one (or possibly two) 25-meter radio telescopes in highly elliptical Earth orbit working in conjunction with a large number of radio telescopes on the ground. Using the technique of interferometry, which pools many smaller telescopes into one powerful telescope, ARISE will have the resolution needed to zoom in on the base of a black hole jet to see how matter is fed into a black hole and how these jets of particles form.

    CMBPOL, the Cosmic Microwave Background Polarization Experiment, is a follow-up to the MAP and Planck missions, this time measuring the polarization of the microwave radiation produced by the Big Bang instead of temperature differences in that radiation. This mission tests the theory of inflation, which states that the early Universe grew trillions of times larger in a tiny fraction of a second, perhaps beginning at 10^-35 second after the Big Bang. If the inflation theory is true, CMBPOL would be able to detect the cosmological background of gravitational waves produced by this era. CMBPOL can also discriminate between competing models for how the earliest galaxies and supermassive black holes formed.

    OWL, short for Orbiting Wide-angle Light-collectors, will detect the highest-energy cosmic rays. The origin of these energetic particles is a mystery: the particles must have originated somewhere close (within 150 million light-years), for particles from distant sources would lose energy on the way to Earth. Yet nothing that we know of close to Earth could produce such energetic particles. OWL, comprising two satellites that observe the Earth's atmosphere from above, may also tell us about the conditions of the Universe when it was only trillionths of trillionths of a second old.
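    As a quick plausibility check on MAXIM's 'dinner plate on the Sun' figure (the plate size is an assumed round value):

```python
import math

# Angular size of a dinner plate at the Sun's distance (small-angle approximation).
plate_diameter_m = 0.3     # assumed dinner-plate diameter
sun_distance_m = 1.496e11  # one astronomical unit

angle_rad = plate_diameter_m / sun_distance_m
angle_microarcsec = math.degrees(angle_rad) * 3600.0 * 1e6
print(f"{angle_microarcsec:.1f} micro-arcseconds")  # ~0.4 uas: truly micro-arcsecond imaging
```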
    Author: Christopher Wanjek

    Source
     
  10. Petronium

    Petronium MDL Senior Member

    Oct 1, 2009
    359
    614
    10
    INTERNET

    10,000 volunteers needed to assess Europe's broadband

    Sep 27, 2011

    The Commission plans to check real ISP speeds and performance
    by Jennifer Baker



    The European Commission will put monitoring devices into 10,000 volunteers' homes to find out if ISPs are living up to their promises.

    The project, being run for the Commission by SamKnows, will take place simultaneously in 30 European countries. The aim is to map broadband performance in all European Union member states as well as Croatia, Iceland and Norway.

    Volunteers will have to plug a small device into their home Internet connections. When the broadband line is not in use, the device will run a series of automated tests over the volunteer's broadband connection, simulating common Internet applications and protocols. This will measure the speed and performance of the connection, improving transparency for consumers paying for broadband connectivity. It could also help ISPs and regulators plan for the future.
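    The article doesn't specify SamKnows' test suite, so purely as a loose illustration, here is a minimal throughput measurement of the kind such a device might run. The URL is a placeholder, not a SamKnows endpoint:

```python
import time
import urllib.request

# Hypothetical test file; a real probe would hit dedicated, well-provisioned servers.
TEST_URL = "http://example.com/testfile.bin"

def measure_download_mbps(url: str) -> float:
    """Download a file and return the achieved throughput in megabits per second."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        payload = response.read()
    elapsed = time.monotonic() - start
    return (len(payload) * 8 / 1e6) / elapsed

if __name__ == "__main__":
    print(f"Downstream: {measure_download_mbps(TEST_URL):.1f} Mbps")
```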

    State aid for broadband networks reached a record high in the E.U. in 2010. The European Commission approved the use of more than €1.8 billion (US$2.4 billion) of public funds for broadband development in 2010 -- four times the amount allowed in 2009. The aid was approved in an effort to achieve the ambitious digital agenda goal of ensuring that all E.U. citizens have access to high-speed Internet by 2020.

    It is estimated that currently only around 30 percent of E.U. broadband lines have speeds of at least 10 Mbps (megabits per second), and only 5 percent of lines have average speeds at or above 30 Mbps.

    To volunteer, go to http://www.samknows.eu/ to register for the testing equipment. The device does not monitor the volunteer's activity on the Internet or record any personally identifiable information.

    Source
     
  11. kldpdas

    kldpdas MDL Member

    Oct 21, 2009
    212
    391
    10
    #71 kldpdas, Oct 5, 2011
    Last edited by a moderator: Apr 20, 2017
    (OP)
    Invisibility Cloak by Texas Researchers



    :ufo:
     
  12. kldpdas

    kldpdas MDL Member

    Oct 21, 2009
    212
    391
    10
    #72 kldpdas, Oct 5, 2011
    Last edited by a moderator: Apr 20, 2017
    (OP)
    WISE Finds Fewer Asteroids Near Earth

     
  13. R29k

    R29k MDL GLaDOS

    Feb 13, 2011
    5,171
    4,811
    180
    Earthquakes Generate Big Heat In Super-small Areas

    [​IMG]
    Most earthquakes that are seen, heard, and felt around the world are caused by fast slip on faults. While the earthquake rupture itself can travel along a fault as fast as the speed of sound or faster, the fault surfaces behind the rupture slide against each other at about a meter per second.
    But the mechanics that underlie fast slip during earthquakes have eluded scientists, because it’s difficult to replicate those conditions in the laboratory. “We still largely don’t understand what is going on at earthquake slip speeds,” said David Goldsby, a geophysicist at Brown, “because it’s difficult to do experiments at these speeds.”
    Now, in experiments mimicking earthquake slip rates, Goldsby and Brown geophysicist Terry Tullis show that fault surfaces in earthquake zones come into contact only at microscopic points between scattered bumps, called asperities, on the fault. These tiny contacts support all the force across the fault. The experiments show that when two fault surfaces slide against each other at fast slip rates, the asperities may reach temperatures in excess of 2,700 degrees Fahrenheit, lowering their friction, the scientists write in a paper published in Science. The localized, intense heating can occur even while the temperature of the rest of the fault remains largely unaffected, a phenomenon known as flash heating.
    “This study could explain a lot of the questions about the mechanics of the San Andreas Fault and other earthquakes,” said Tullis, professor emeritus of geological sciences, who has studied earthquakes for more than three decades.
    The experiments simulated earthquake slip speeds of close to half a meter per second. The rock surfaces touched only at the asperities, each less than 10 microns across, a tiny fraction of the total surface area. When the surfaces move against each other at high slip rates, the experiments revealed, heat is generated so quickly at the contacts that temperatures can spike enough to melt most rock types associated with earthquakes. Yet the intense heat is confined to the contact flashpoints; the temperature of the surrounding rock remained largely unaffected by these microscopic hot spots, maintaining a “room temperature” of around 77 degrees Fahrenheit, the researchers write.
    “You’re dumping in heat extremely quickly into the contacts at high slip rates, and there’s simply no time for the heat to get away, which causes the dramatic spike in temperature and decrease in friction,” Goldsby said.
    “The friction stays low so long as the slip rate remains fast,” said Goldsby, associate professor of geological sciences (research). “As slip slows, the friction immediately increases. It doesn’t take a long time for the fault to restrengthen after you weaken it. The reason is the population of asperities is short-lived and continually being renewed, and therefore at any given slip rate, the asperities have a temperature and therefore friction appropriate for that slip rate. As the slip rate decreases, there is more time for heat to diffuse away from the asperities, and they therefore have lower temperature and higher friction.”
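    A rough way to see why the heat has "no time to get away" is a diffusion-length estimate. The contact size and thermal diffusivity below are assumed, representative values for rock, not figures from the paper:

```python
import math

# How far can heat diffuse during the lifetime of a single asperity contact?
slip_rate = 0.5        # m/s, the experimental slip rate quoted above
contact_size = 10e-6   # m, assumed asperity contact dimension (illustrative)
diffusivity = 1.0e-6   # m^2/s, assumed thermal diffusivity typical of rock

contact_lifetime = contact_size / slip_rate              # time before the contact shears away
diffusion_length = math.sqrt(diffusivity * contact_lifetime)

print(f"Contact lifetime: {contact_lifetime:.0e} s")               # ~2e-5 s
print(f"Diffusion length: {diffusion_length * 1e6:.1f} microns")   # ~4.5 microns
# In the instant a contact exists, heat escapes only a few microns, so the
# asperity tip spikes in temperature while the bulk fault stays near room temperature.
```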
    Flash heating and other weakening processes that lead to low friction during earthquakes may explain the lack of significant measured heat flows along some active faults like the San Andreas Fault, which might be expected if friction was high on faults during earthquakes. Flash heating in particular may also explain how faults rupture as “slip pulses,” wrinkle-like zones of slip on faults, which would also decrease the amount of heat generated.
    If that is the case, then many earthquakes have been misunderstood as high-friction events. “It’s a new view with low dynamic friction. How can it be compatible with what we know?” asked Tullis, who chairs the National Earthquake Prediction Evaluation Council, an advisory body for the U.S. Geological Survey.
    “Flash heating may explain it,” Goldsby replied.
    The U.S. Geological Survey funded the research.

    Source
     
  14. R29k

    R29k MDL GLaDOS

    Feb 13, 2011
    5,171
    4,811
    180
    Gravitational Waves Are The ‘Sounds Of The Universe’

    [​IMG]
    Einstein wrote about them, and we’re still looking for them: gravitational waves, small ripples in the fabric of space-time that many consider to be the sounds of our universe.
    Just as sound complements vision in our daily life, gravitational waves will complement our view of the universe taken by standard telescopes.
    Albert Einstein predicted gravitational waves in 1918. Today, almost 100 years later, advanced gravitational wave detectors are being constructed in the US, Europe, Japan and Australia to search for them.
    While any motion produces gravitational waves, a signal loud enough to be detected requires the motion of huge masses at extreme velocities. The prime candidate sources are mergers of two neutron stars: two bodies, each with a mass comparable to the mass of our sun, spiraling around each other and merging at a velocity close to the speed of light.
    Such events are rare, taking place about once per hundreds of thousands of years in any given galaxy. Hence, to detect a signal within our lifetime, the detectors must be sensitive enough to pick up signals from as far as a billion light-years away from Earth. This poses an immense technological challenge. At such distances, the gravitational-wave signal would sound like a faint knock on our door while a TV set is turned on and a phone rings at the same time.
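    To see why a billion-light-year reach matters, here is a hedged back-of-envelope rate estimate. The galaxy density and per-galaxy merger rate are assumed round numbers:

```python
import math

# Expected neutron-star merger detections per year out to one billion light-years.
rate_per_galaxy_per_yr = 1.0 / 300_000.0  # assumed: one merger per few hundred thousand years
galaxies_per_mpc3 = 0.01                  # assumed density of Milky Way-like galaxies
reach_mpc = 1.0e9 / 3.26e6                # one billion light-years in megaparsecs (~307 Mpc)

volume_mpc3 = 4.0 / 3.0 * math.pi * reach_mpc**3
n_galaxies = galaxies_per_mpc3 * volume_mpc3
events_per_yr = n_galaxies * rate_per_galaxy_per_yr

print(f"Galaxies within reach: {n_galaxies:.1e}")              # ~1e6
print(f"Expected detections:   {events_per_yr:.1f} per year")  # a few per year
# Shrink the reach tenfold and the surveyed volume (and event rate) drops a
# thousandfold, which is why detectors must hear out to a billion light-years.
```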
    Competing noise sources are numerous, ranging from seismic noise produced by tiny quakes to the rumble of a distant ocean wave. How can we know that we have detected a gravitational wave from space rather than a falling tree or a rumbling truck?
    This is why astronomers have spent years looking for a potential electromagnetic light signal that would accompany or follow the gravitational waves. This signal would allow us to “look through the peephole” after hearing the faint knock on the door, and verify that indeed “someone” is there. In their new article just published in Nature, Prof. Tsvi Piran, Schwarzmann University Professor at the Hebrew University of Jerusalem, and Dr. Ehud Nakar of Tel Aviv University describe having found just that.
    They noticed that surrounding interstellar material would slow debris ejected at velocities close to the speed of light during the merger of two neutron stars. Heat generated during this process would be radiated away as radio waves. The resulting strong radio flare would last a few months and would be detectable with current radio telescopes from a billion light years away.
    A search for such a radio signal would certainly take place following a future detection, or even a tentative detection, of gravitational waves. But even before the advanced gravitational-wave detectors become operational, as expected in 2015, radio astronomers are gearing up to look for these unique flares.
    Nakar and Piran point out in their article that an unidentified radio transient observed in 1987 by Bower et al. has all the characteristics of such a radio flare and may in fact have been the first direct detection of a neutron star binary merger in this way.
    Dr. Nakar’s research was supported by an International Reintegration Grant from the European Union and a grant from the Israeli Science Foundation and an Alon Fellowship. Prof. Piran’s research was supported by an Advanced European Research Council grant and by the High Energy Astrophysics Center of the Israeli Science Foundation.

    Image Caption: This is a simulation of matter ejected from a star merger. Credit: Stephan Rosswog

    Source

     
  15. R29k

    R29k MDL GLaDOS

    Feb 13, 2011
    5,171
    4,811
    180
    Fundamental Constant May Depend on Where in the Universe You Are

    A fundamental physical constant akin to the charge of the electron or the speed of light may depend on where in the universe you are, a team of astronomers reports. If true, that observation would overturn scientists' basic assumption that the laws of physics are the same everywhere in the universe. Other researchers are skeptical, however.
    The constant in question is the so-called fine-structure constant. A number with a value of about 1/137, the constant dictates the strength of the electromagnetic force and, hence, determines the exact wavelengths of light an atom will absorb. The idea that the constant may have changed over the age of the universe isn't new. Astrophysicist John Webb of the University of New South Wales in Sydney, Australia, and his colleagues first rang that bell in 1998, using data from the 10-meter telescope at the W. M. Keck Observatory on Mauna Kea, Hawaii, which peers into the northern sky.
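    For reference, the 1/137 value quoted above can be reproduced from standard constants; a minimal sketch using CODATA values:

```python
import math

# Fine-structure constant: alpha = e^2 / (4 * pi * epsilon_0 * hbar * c)
e = 1.602176634e-19           # elementary charge, C
epsilon_0 = 8.8541878128e-12  # vacuum permittivity, F/m
hbar = 1.054571817e-34        # reduced Planck constant, J*s
c = 2.99792458e8              # speed of light, m/s

alpha = e**2 / (4 * math.pi * epsilon_0 * hbar * c)
print(f"alpha   = {alpha:.9f}")      # ~0.007297353
print(f"1/alpha = {1 / alpha:.3f}")  # ~137.036, the familiar 'about 1/137'
```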
    Back then, the team looked at the brightly shining centers of ancient galaxies known as quasars. Light from the quasars must pass through clouds of gas on its several-billion-year journey to Earth, and the atoms in the gas absorb light of specific wavelengths. So the spectrum of the light reaching Earth is missing these wavelengths and looks a bit like a bar code. The overall shift of the lines tells researchers how far away a gas cloud is and, hence, how long ago the light passed through it. The relative spacing of the lines lets them estimate the fine-structure constant at that time. Analyzing such data, Webb and colleagues argued that the fine-structure constant was about 1 part in 100,000 smaller 12 billion years ago than it is today. That was a radical proposition, as the laws of physics are supposed to be the same no matter where you are in the universe.
    The result was not universally accepted, however. In 2004, Patrick Petitjean, an astronomer at the Institute for Astrophysics in Paris, and colleagues used observations of 23 clouds from the Very Large Telescope (VLT) on Cerro Paranal in Chile, which peers into the southern sky, and found no discernible variation in the fine-structure constant.
    Case closed? Not quite. Now Webb and his colleagues have scoured the southern sky themselves using the VLT. Their 153 clouds suggested a difference of 1 part in 100,000 in the fine-structure constant 12 billion years ago. Except in the southern sky, the constant seems to be larger. Connecting the two extremes with a line, the team found that absorption patterns in the clouds along that line are consistent with the fine-structure constant changing slowly through space—smaller in the distant northern sky and larger on the southern side.
    "The result is thrilling," says atomic physicist Wim Ubachs of the Free University of Amsterdam, who wasn't involved in the work. "It might be an indicator that the universe is different from what we thought it to be." Ubachs says he's open to the idea that fundamental constants might actually change over time and position, as scientists don't have a decent explanation for why the fundamental constants have their particular values anyway. Still, the huge claim that a constant changes demands weighty evidence—which the new data are not, as even Webb's team agrees. They say the chances that random statistical fluctuations in the data could produce a fake signal as big are less than 1 in 15,000, the team reported online 31 October in Physical Review Letters. To qualify as hard evidence, those odds must drop to 1 in 2 million.
    Not surprisingly, Petitjean finds the suggestion that the fine-structure constant changes across space "very difficult to believe." He argues that taken by themselves, the Webb team's VLT data wouldn't be interesting. Webb admits that the chances that random fluctuations in the new VLT data could produce a fake trend are a fairly large 1 in 34. But he argues that the data are compelling because two independent telescopes, pointing in different directions, saw the fine-structure constant changing at the same rate and in the same direction. As for why Petitjean's group didn't see the increase in its own data from the VLT, Webb says Petitjean and colleagues were looking in the wrong direction. The 23 clouds Petitjean's team studied don't run along the line through the universe where the fine-structure constant appears to change, Webb says, so it's no surprise that they didn't see the same trend.
    Petitjean sees the agreement differently. He says that the results match because in light up to about 10 billion years old, his team and Webb's see the same thing: no change. Only Webb's group analyzed the older light, and that is the source of its trend. Until it is confirmed independently by others, he warns, "everybody should be careful about the result."
    If it stands up, Webb says, the claim might help answer a grand conceptual question: Why do the fundamental constants take on values that permit life to exist when tiny changes would make life impossible? If the fundamental constants vary over the potentially infinite extent of the universe, our place in the universe would naturally be where the constants are tuned just right to make our existence possible—a version of the so-called anthropic principle. In some circles, however, the anthropic principle raises eyebrows even higher than the idea of changing physical constants.

    Source
     
  16. R29k

    R29k MDL GLaDOS

    Feb 13, 2011
    5,171
    4,811
    180
    How innovative is Apple's new voice assistant, Siri?

    IT LETS you check the weather or make an appointment simply by asking aloud, but is Siri, the "personal assistant" on Apple's newly released iPhone 4S, really such an advance?
    Yes, says Boris Katz, an artificial intelligence (AI) researcher at MIT. He says Apple has created a "very impressive piece of engineering" by combining established techniques from fields such as voice recognition and natural language processing.
    Phil Blunsom, who researches machine learning at the University of Oxford, stresses that Apple hasn't just put together existing techniques. But he has reservations: "The difficulty is that each one of these systems makes errors, and when they are fed into each other the errors multiply."
    Apple won't talk about Siri's underlying technology, though a patent application it filed earlier this year reveals that the software manages these errors by restricting queries to specific areas like dining or the weather. Apple calls such themes, for which Siri has access to databases of information, "active ontologies". For example, the dining ontology contains databases of restaurants, cuisines and dishes, along with information on the concept of a meal - that it involves one or more people gathering to eat.
    The active ontology idea is not new - Tom Gruber, one of the inventors of Siri, formally defined it in 1995. What is unusual about Siri is that, unlike earlier grand AI projects, it is "very specifically focused on helping in particular domains", says Philip Resnik, a computational linguist at the University of Maryland in College Park. "If you go out of those domains, all bets are off."
    Siri listens out for keywords such as "Mexican" or "taco" to identify the subject area. It also works out whether to prompt for more information - such as what time to book a table - or whether it has enough details to access a reservations website and make the booking. This final step is possible because most web services now offer application programming interfaces (APIs) that let apps feed information to them. "That's one of the reasons Siri is possible now when it wouldn't have been five or 10 years ago," says Resnik.
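    Apple hasn't published Siri's internals, so purely as a toy illustration of the keyword-to-ontology routing described above (every name and entry here is invented, and this is in no way Apple's implementation):

```python
# Toy sketch of routing an utterance to a domain 'ontology' by keyword spotting.
ACTIVE_ONTOLOGIES = {
    "dining": {"mexican", "taco", "restaurant", "table", "reservation"},
    "weather": {"weather", "rain", "forecast", "temperature"},
}

def route(utterance: str) -> str | None:
    """Pick the domain whose keyword set best overlaps the utterance."""
    words = set(utterance.lower().split())
    scores = {domain: len(words & keywords)
              for domain, keywords in ACTIVE_ONTOLOGIES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None  # out of domain: 'all bets are off'

print(route("book me a table for tacos"))  # dining
print(route("will it rain tomorrow"))      # weather
print(route("explain quantum gravity"))    # None
```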
    The ability to make sense of requests phrased in ordinary language sets Siri apart from competitors such as Android's Voice Actions, which requires commands in a certain format - saying "navigate to" will elicit directions, but "how do I get to... ?" will not. It doesn't look as if Google is planning a Siri competitor yet. "I don't believe that your phone should be an assistant," said Andy Rubin, who heads Android development at Google, last week.
    Siri will only get better. All queries users put to it are processed by Apple's servers, giving the company a wealth of data it can use to improve the app. Katz suggests Apple could mine this data to discover commonly asked questions that Siri cannot yet handle. That's simple enough, but what about asking it to "book a meal for my family when we're all available"?
    "Siri 2 might involve taking advantage of the fact that many of the tasks you attempt to solve have a social aspect to them," says Resnik. So, for example, the Siris on your family members' iPhones could all work together to organise the meal.
    Blunsom says Apple must try to keep expectations realistic, otherwise people might dismiss Siri because it "can't answer esoteric questions, despite the fact that it can find you a good sushi restaurant nearby".

    Source
     
  17. R29k

    R29k MDL GLaDOS

    Feb 13, 2011
    5,171
    4,811
    180
    It's got 16,000 eyes on you—the vision of a Cambrian-era predator

    By John Timmer
    [​IMG]
    This Cambrian predator was about a meter long.
    Those of you who get a bit weirded out by spiders and other arthropods would probably have a coronary if an Anomalocaris were to swim in your direction. The animals were about a meter long and shaped like a flattened oval, a bit like a modern flounder. That's about the only similarity with a fish, though. Instead of fins, the Anomalocarids propelled themselves through the water using a series of elongated, paddle-like structures running down both edges of the body. In front, a pair of appendages could shovel prey into a circular mouth located on its underside.
    And then there were the large, bulging eyes, springing from each side of the animal's head. Until now, we could only guess at what the eyes looked like, but some spectacular, 515 million-year-old fossils from Australia have now shown that they had a huge number of small lenses, arranged much like those in modern insects and other arthropods. The finding suggests that the compound eyes evolved right at the origin of this branch of the evolutionary tree, long before the sorts of hard exoskeletons we now consider typical of arthropods.
    First, the fossils themselves, which are absolutely spectacular. We've discovered a number of different Anomalocarid species in fossil deposits around the globe but, at best, these simply left behind an impression of the eyes. So we knew the eyes were roughly pear-shaped and where they appeared on the animal, but nothing about their internal structure. The eyes found in the new fossils clearly show details of that internal structure. They aren't actually attached to an Anomalocaris, but they match the impressions previously found with them, and we've not found anything else in these fossil beds big enough to support an eye of this size.
    It takes a microscope to see them, but individual lenses were preserved in each eye. For someone who has seen countless images of the compound eyes of Drosophila, they are startling in how modern they look. Based on their density, the authors estimate that each eye housed 16,000 individual lenses, the most that have ever been seen on any animal we know about. Based on the curve of the eye and what we know about modern compound eyes, they suggest that the animal had very good visual acuity.
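    A crude way to translate 16,000 lenses into acuity, assuming (purely for illustration) that the lenses tile roughly a hemisphere of view:

```python
import math

# Estimate the angular pitch between ommatidia (individual lenses).
n_lenses = 16_000
field_of_view_sr = 2.0 * math.pi  # assumed hemispherical field, in steradians

solid_angle_per_lens = field_of_view_sr / n_lenses
pitch_deg = math.degrees(math.sqrt(solid_angle_per_lens))
print(f"~{pitch_deg:.1f} degree(s) between neighboring lenses")  # ~1.1 degrees
# Degree-scale sampling is coarse beside a camera eye, but it is sharp for a
# compound eye and consistent with an animal that hunted by sight.
```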
    [​IMG]
    The fossilized remains of 515 million year old eyes.
    John Paterson.
    Those findings feed into a number of evolutionary arguments. For one, a number of researchers have suggested that Anomalocarids were the apex predators of the Cambrian seas, able to shove many of the smaller creatures into their circular jaws; in fact, a number of fossils of smaller creatures have been discovered that appear to bear bite marks from an Anomalocarid. The visual capabilities suggested by these fossils support the idea that these animals were hunters, using their visual system to spot prey.
    The clear similarities between the visual system of the Anomalocarids and that of modern arthropods also strengthen previous indications of a close relationship between these groups. Previous phylogenetic trees place Anomalocarids as branching off earlier than the origin of all modern arthropods, near the base of an entire phylogenetic group that includes both extinct species and modern arthropods. Confirming that these animals had modern-looking compound eyes pushes the origin of those eyes back to near the base of the group, and suggests that the evolution of this mode of vision may be one of its defining features.
    The final argument made by the authors is in support of an idea that's been around for decades: the proliferation of new phylogenetic groups in the Cambrian came about in part because the first large predators drove an evolutionary arms race that led to new forms of protection (shells and hard plates) and new modes of motion. The presence of a meter-long predator with excellent vision, in the authors' opinion, certainly fits nicely with the idea of an extremely competitive environment.
    Nature, 2011. DOI: 10.1038/nature10689 (About DOIs).
    Photograph by Katrina Kenny & University of Adelaide

    Link

     
  18. R29k

    R29k MDL GLaDOS

    Feb 13, 2011
    5,171
    4,811
    180
    Scientists shrink a Stirling heat engine to single microscopic particle

    By Kyle Niemeyer
    [​IMG]
    A much-larger-than-particle-size Stirling engine


    Just how small can you make an engine? Two researchers from the University of Stuttgart and the Max Planck Institute for Intelligent Systems, Valentin Blickle and Clemens Bechinger, successfully shrank the Stirling heat engine down to a single, microscopic particle. The engine is so small, in fact, that the random fluctuations in position due to Brownian motion cause variations in its work output. This microscopic Stirling engine is controlled using a pair of highly focused lasers.
    Stirling engines, named after the Scottish inventor who created them in 1816, offer the highest theoretical efficiency of any heat engine—the same as the Carnot efficiency. Due to pesky entropy and the second law of thermodynamics, you can’t get all the heat you put in back out as work. The efficiency of any heat engine, then, is just the ratio of output work to input heat. The Carnot efficiency, conceived by Nicolas Léonard Sadi Carnot (the father of thermodynamics), gives the maximum theoretical efficiency of the engine and depends only on the temperature range within which the engine operates.
    The typical, macroscopic Stirling cycle consists of four steps: an isothermal (constant temperature) compression, isochoric (constant volume) heat addition, isothermal expansion, and isochoric heat removal. The heat addition and removal processes operate through the engine walls, making this an external combustion engine as opposed to internal combustion engines like gasoline and diesel, where the heat exchange occurs in the working fluid. The steam engine is another example of an external combustion engine.
    In order to create a microscopic version of a Stirling engine, the researchers sandwiched a tiny, 2.94 μm melamine bead in a 4 μm gap of water between glass slides. Heat was added and removed by near-instantaneously raising and lowering the temperature of the surrounding water bath using a laser; it shot between room temperature (22°C) and 86°C in less than 10 milliseconds.
    They used another infrared laser to create an optical trap for the bead. Increasing and decreasing the stiffness of the trap (done by varying the beam intensity) functioned as the isothermal compression and expansion steps. The expansion, for example, occurs when the energized particle (from the heat addition) relaxes and is able to move freely, due to the stiffness of the trap decreasing.
    Now, since we’re dealing with a single particle, rather than a continuum, the position—and extracted work—fluctuates randomly due to Brownian motion. In a macroscopic system, well described by the laws of thermodynamics, the large number of degrees of freedom negate the effect of microscopic fluctuations. Here, however, they are obvious in a plot of the extracted work.
    The team observed a small difference between the work extracted during expansion and the work spent during compression in each cycle, and measured the average production of work over time. They suggest this was caused by the greater variation in position at the high temperature condition, where the particle moves around more due to higher energy.
    Probably the most exciting result of the study is the high efficiency obtained: 14 percent thermal efficiency, which corresponds to 90 percent of the Carnot (and maximum possible) efficiency of 15.5 percent. That ceiling may seem low, but the Carnot efficiency depends only on the ratio of the high and low temperatures, 76°C and 22°C for this particular experiment.
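    Those numbers check out. Using the temperatures quoted with the efficiency figure, converted to kelvin:

```python
# Verify the quoted efficiency figures using absolute (kelvin) temperatures.
t_cold = 22.0 + 273.15  # K
t_hot = 76.0 + 273.15   # K

carnot = 1.0 - t_cold / t_hot  # Carnot limit: depends only on the two temperatures
measured = 0.14
print(f"Carnot limit:       {carnot:.1%}")             # ~15.5%
print(f"Measured:           {measured:.1%}")
print(f"Fraction of Carnot: {measured / carnot:.0%}")  # ~90%
```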
    The authors also found that some work is unavoidably dissipated to the surrounding environment, especially when the engine operates at higher frequencies, outputting more work. Conversely, the efficiency increases with increasing cycle time, only reaching the 14 percent value at low frequencies. This means that the optimal power is a competition between dissipation and efficiency, something the researchers plan to study further.
    Blickle and Bechinger talk about future microscopic machines benefiting from this research, and it’s certainly an accomplishment to create a single-particle engine, but direct applications of this aren’t clear. One issue is that the engine is completely externally controlled by the two lasers, so a different approach would be needed for a self-contained device. However, the concept is fascinating, and provides new insights on thermodynamics at such small scales.
    Nature Physics, 2011. DOI: 10.1038/nphys2163 (About DOIs)
    Photograph by .scribe

    Source
     
  19. R29k

    R29k MDL GLaDOS

    Feb 13, 2011
    5,171
    4,811
    180
    Pentagon-backed 'time cloak' makes event undetectable

    January 6, 2012


    PARIS — Pentagon-supported physicists say they have devised a "time cloak" that briefly makes an event undetectable.
    The laboratory device manipulates the flow of light in such a way that for the merest fraction of a second an event cannot be seen, according to a paper published in the science journal Nature.
    It adds to experimental work in creating next-generation camouflage - a so-called invisibility cloak in which specific colours cannot be perceived by the human eye.

    "Our results represent a significant step towards obtaining a complete spatio-temporal cloaking device," says the study, headed by Moti Fridman of Cornell University in New York.
    The breakthrough exploits the fact that frequencies of light move at fractionally different speeds.
    The so-called temporal cloak starts with a beam of green light that is passed down a fibre-optic cable.
    The beam goes through a two-way lens that splits it into two frequencies: bluish light, which travels relatively fast, and reddish light, which is slower.
    The tiny difference in speed is then accentuated by placing a transparent obstacle in front of the two beams.
    Eventually a time gap opens up between the red and blue beams as they travel through the optical fibre.
    The gap is tiny - just 50 picoseconds, or 50 millionths of a millionth of a second.
    But it is just long enough to squeeze in a pulse of laser at a different frequency from the light passing through the system.
    The red and blue light are then given the reverse treatment.
    They go through another obstacle, which this time speeds up the red and slows down the blue, and come to a reverse lens that reconstitutes them as a single green light.
    But the 40-picosecond burst of laser is not part of the flow of photons, and thus cannot be detected.
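    For scale, here is how far light actually travels during that gap. The fiber refractive index is an assumed typical value for silica, not a figure from the paper:

```python
# How long is the cloak's time gap in spatial terms?
c = 2.998e8      # speed of light in vacuum, m/s
n_fiber = 1.468  # assumed refractive index of a typical silica fiber core
gap = 50e-12     # the 50-picosecond gap between the fast and slow beams

print(f"In vacuum: {c * gap * 1e3:.1f} mm")            # ~15.0 mm
print(f"In fiber:  {c / n_fiber * gap * 1e3:.1f} mm")  # ~10.2 mm
# The 'hole' opened in the light stream is roughly a centimeter long inside
# the fiber, just enough room to hide a 40-picosecond laser burst.
```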
    In a commentary, optical engineers Robert Boyd and Zhimin Shi of the University of Rochester in New York likened the experiment to a level crossing on a busy road.
    When a train comes, the cars are stopped, and this causes a gap in the traffic.
    When the train has passed, the stopped cars speed up until they catch up with the traffic in front of them. To the observer, the flow seems quite normal, and there is no evidence that a train has crossed the intersection.
    After proving that the "cloak" is possible, the next step for the researchers is to expand the time gap by orders of magnitude, firstly to microseconds and then to milliseconds, said Boyd and Shi.
    The time cloak has a potential use in boosting security in fibre-optic communications because it breaks up optical signals, lets them travel at different speeds and then reassembles them, which makes data hard to intercept.
    Last year, scientists reported a step forward in so-called metamaterials which act as a cloaking of space, as opposed to time.
    Metamaterials are novel compounds whose surfaces interact with light at specific frequencies thanks to a tiny, nano-level structure. As a result, light flows around the object - rather like water bending around a rock in a stream - as opposed to being absorbed by it.
    Fridman's work was part-supported by the Defense Advanced Research Projects Agency, or DARPA, a Pentagon unit which develops futuristic technology that can have military uses. Its achievements include ARPANET, a predecessor of the Internet.


    AFP

    Source
     
  20. R29k

    R29k MDL GLaDOS

    Feb 13, 2011
    5,171
    4,811
    180
    LHC: Higgs boson 'may have been glimpsed'

    By Paul Rincon, Science editor, BBC News website, Geneva
    [​IMG]
    Two teams at the LHC have seen hints of what may well prove to be the Higgs


    The most coveted prize in particle physics - the Higgs boson - may have been glimpsed, say researchers reporting at the Large Hadron Collider (LHC) in Geneva.
    The particle is purported to be the means by which things in the Universe obtain their mass.
    Scientists say that two experiments at the LHC see hints of the Higgs at the same mass, fuelling huge excitement.
    But the LHC does not yet have enough data to claim a discovery.
    Finding the Higgs would be one of the biggest scientific advances of the last 60 years. It is crucial for allowing us to make sense of the Universe, but has never been observed by experiments.
    The Higgs boson
    • The Higgs is a sub-atomic particle that is predicted to exist, but has not yet been seen
    • It was proposed as a mechanism to explain mass by six physicists, including Peter Higgs, in 1964
    • It imparts mass to other fundamental particles via the associated Higgs field
    • It is the last missing member of the Standard Model, which explains how particles interact

    This basic building block of the Universe is a significant missing component of the Standard Model - the "instruction booklet" that describes how particles and forces interact.
    Two separate experiments at the LHC - Atlas and CMS - have been conducting independent searches for the Higgs. Because the Standard Model does not predict an exact mass for the Higgs, physicists have to use particle accelerators like the LHC to systematically look for it across a broad search area.
    At a seminar at Cern (the organisation that operates the LHC) on Tuesday, the heads of Atlas and CMS said they see "spikes" in their data at roughly the same mass: 124-125 gigaelectronvolts (GeV; this is about 130 times as heavy as the protons found in atomic nuclei).
    "The excess may be due to a fluctuation, but it could also be something more interesting. We cannot exclude anything at this stage," said Fabiola Gianotti, spokesperson for the Atlas experiment.
    [​IMG]


    Professor Rolf-Dieter Heuer, director-general of Cern: ''We have made extremely good progress''

    Guido Tonelli, spokesperson for the CMS experiment, said: "The excess is most compatible with a Standard Model Higgs in the vicinity of 124 GeV and below, but the statistical significance is not large enough to say anything conclusive.
    "As of today, what we see is consistent either with a background fluctuation or with the presence of the boson."
    'Exciting'
    Prof Rolf-Dieter Heuer, director-general of Cern, told BBC News: "Such signals can come and go… Although there is correspondence between the two experiments, we need more solid numbers."
    None of the spikes seen by the experiments is at much more than the "two sigma" level of certainty.
    Statistics of a 'discovery'

    [​IMG]
    • Particle physics has an accepted definition for a "discovery": a five-sigma level of certainty
    • The number of standard deviations, or sigmas, is a measure of how unlikely it is that an experimental result is simply down to chance rather than a real effect
    • Similarly, tossing a coin and getting a number of heads in a row may just be chance, rather than a sign of a "loaded" coin
    • The "three sigma" level represents about the same likelihood of tossing more than eight heads in a row
    • Five sigma, on the other hand, would correspond to tossing more than 20 in a row
    • Unlikely results can occur if several experiments are being carried out at once - equivalent to several people flipping coins at the same time
    • With independent confirmation by other experiments, five-sigma findings become accepted discoveries

    A level of "five sigma" is required to claim a discovery, meaning there is less than a one in a million chance the data spike is down to a statistical fluke.
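    The coin-flip comparison in the box above can be made concrete; a small sketch assuming the one-sided convention (a judgment call, since the article doesn't specify):

```python
import math
from scipy.stats import norm

# Match sigma levels to runs of coin flips: P(n heads in a row) = 0.5**n,
# so the equivalent run length is n = log2(1/p).
for sigma in (2, 3, 5):
    p = norm.sf(sigma)  # one-sided probability of a fluctuation at least this large
    n_heads = math.log2(1.0 / p)
    print(f"{sigma} sigma: p = {p:.2e}, ~{n_heads:.0f} heads in a row")
# Three sigma lands near 9-10 consecutive heads and five sigma near 21-22,
# close to the 'more than eight' and 'more than 20' figures in the box above.
```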
    Another complicating factor is that these tantalising hints consist only of a handful of events among the billions of particle collisions analysed at the LHC.
    Prof Heuer said: "We can be misled by small numbers, so we need more statistics," but added: "It is exciting."
    If it exists, the Higgs is very short-lived, quickly decaying - or transforming - into more stable particles. There are several different ways this can happen, which provides scientists with different routes to search for the boson.
    They looked at particular decay routes for the Higgs that produce only a handful of events, but have the advantage of having less background noise in the data. This background noise consists of random combinations of events, some of which can look like Higgs decays.
    Other decay modes produce more events - which are better for statistical certainty - but also more background noise. Prof Heuer said physicists were "squeezed" between these two options.
    Prof Stefan Soldner-Rembold, from the University of Manchester, called the quality of the LHC's results "exceptional", adding: "Within one year we will probably know whether the Higgs particle exists, but it is likely not going to be a Christmas present."
    The simple fact that both Atlas and CMS seem to be seeing a data spike at the same mass has been enough to cause enormous excitement in the particle physics community.

    Source
     