In the first sense, it refers to an inability to connect with others on an emotional level, as well as a means of coping with anxiety by avoiding situations that trigger it; it is often described as "emotional numbing" or dissociation.
In the second sense, it is a type of mental assertiveness that allows people to maintain their boundaries and psychic integrity when faced with the emotional demands of another person or group of persons.
Wednesday, 28 May 2008
Erikson's stages of psychosocial development
Erikson's stages of psychosocial development describes eight developmental stages through which a healthily developing human should pass from infancy to late adulthood.
In each stage the person confronts, and hopefully masters, new challenges.
Each stage builds on the successful completion of earlier stages.
The challenges of stages not successfully completed may be expected to reappear as problems in the future. Erik Erikson developed the theory in the 1950s as an improvement on Freud's psychosexual stages.
Erikson accepted many of Freud's theories (including the id, ego, and superego, and Freud's infantile sexuality represented in psychosexual development), but rejected Freud's attempt to describe personality solely on the basis of sexuality.
In his most influential work, Childhood and Society (1950), he divided the human life cycle into eight psychosocial stages of development.
Empathy
Empathy is the recognition and understanding of the states of mind, beliefs, desires, and particularly, emotions of others.
It is often characterized as the ability to "put oneself into another's shoes", or experiencing the outlook or emotions of another being within oneself; a sort of emotional resonance.
Anger management
The term anger management commonly refers to a system of psychological therapeutic techniques and exercises by which one with excessive or uncontrollable anger can control or reduce the triggers, degrees, and effects of an angered emotional state.
Courses in anger management are sometimes mandated for violent offenders by the legal system.
For more information about the topic Anger management, read the full article at Wikipedia.org, or see the following related articles:
Anger — Anger is a term for the emotional aspect of aggression, as a basic aspect of the stress response in animals in which a perceived aggravating stimulus ...
Anchoring bias in decision-making — Anchoring or focalism is a term used in psychology to describe the common human tendency to rely too heavily, or "anchor," on one trait or piece of ...
Mental confusion — Severe confusion of a degree considered pathological usually refers to loss of orientation (ability to place oneself correctly in the world by time, ...
Pyromania — Pyromania is an obsession with fire and starting fires, in an intentional fashion, usually on multiple occasions. It should be contrasted with other ...
Social cognition
Social cognition is the study of how people process social information, especially its encoding, storage, retrieval, and application to social situations.
There has been much recent interest in the links between social cognition and brain function, particularly as neuropsychological studies have shown that brain injury (particularly to the frontal lobes) can adversely affect social judgements and interaction.
People diagnosed with certain mental illnesses are also known to show differences in how they process social information.
There is now an expanding research field examining how such conditions may bias cognitive processes involved in social interaction, or conversely, how such biases may lead to the symptoms associated with the condition. It is also becoming clear that some aspects of psychological processes that promote social behaviour (such as face recognition) may be innate.
Studies have shown that newborn babies less than an hour old can selectively recognize and respond to faces, while people with some developmental disorders such as autism or Williams syndrome may show differences in social interaction and social communication compared with their unaffected peers.
Mental retardation
Mental retardation is a term for a pattern of persistently slow learning of basic motor and language skills ("milestones") during childhood, and a significantly below-normal global intellectual capacity as an adult.
For more information about the topic Mental retardation, read the full article at Wikipedia.org, or see the following related articles:
Learning disability — In the United States and Canada, the term learning disability is used to refer to psychological and neurological conditions that affect a person's ...
Brain damage — Brain damage or brain injury is the destruction or degeneration of brain cells. Brain damage may occur due to a wide range of conditions, illnesses, ...
Personality disorder — Personality disorders form a class of mental disorders that are characterized by long-lasting rigid patterns of thought and behaviour. Because of the ...
Cognition
The term cognition is used in several loosely related ways to refer to a faculty for the human-like processing of information, applying knowledge and changing preferences.
Cognition or cognitive processes can be natural and artificial, conscious and not conscious; therefore, they are analyzed from different perspectives and in different contexts, in anesthesia, neurology, psychology, philosophy, systemics and computer science.
The concept of cognition is closely related to such abstract concepts as mind, reasoning, perception, intelligence, learning, and many others that describe numerous capabilities of the human mind and expected properties of artificial or synthetic intelligence.
Cognition is an abstract property of advanced living organisms; therefore, it is studied as a direct property of a brain or of an abstract mind on subsymbolic and symbolic levels. In psychology and in artificial intelligence, it is used to refer to the mental functions, mental processes and states of intelligent entities (humans, human organizations, highly autonomous robots), with a particular focus on such mental processes as comprehension, inference, decision-making, planning and learning (see also cognitive science and cognitivism).
Recently, advanced cognitive researchers have been especially focused on the capacities of abstraction, generalization, concretization/specialization and meta-reasoning, whose descriptions involve such concepts as the beliefs, knowledge, desires, preferences and intentions of intelligent individuals/objects/agents/systems. The term "cognition" is also used in a wider sense to mean the act of knowing or knowledge, and may be interpreted in a social or cultural sense to describe the emergent development of knowledge and concepts within a group that culminate in both thought and action.
Thursday, 22 May 2008
MRI/PET Scanner Combo Made For First Time
Two kinds of body imaging -- positron emission tomography (PET) and magnetic resonance imaging (MRI) -- have been combined for the first time in a single scanner.
MRI scans provide exquisite structural detail but little functional information, while PET scans -- which follow a radioactive tracer in the body -- can show body processes but not structures, said Simon Cherry, professor and chair of biomedical engineering at UC Davis. Cherry's lab built the scanner for studies with laboratory mice, for example in cancer research.
"We can correlate the structure of a tumor by MRI with the functional information from PET, and understand what's happening inside a tumor," Cherry said.
Combining the two types of scan in a single machine is difficult because the two systems interfere with each other. MRI scanners rely on very strong, very smooth magnetic fields that can easily be disturbed by metallic objects inside the scanner. At the same time, those magnetic fields can seriously affect the detectors and electronics needed for PET scanning. There is also a limited amount of space within the scanner in which to fit everything together, Cherry noted.
Scanners that combine computer-assisted tomography (CAT) and PET scans are already available, but CAT scans provide less structural detail than MRI scans, especially of soft tissue, Cherry said. They also give the patient a dose of radiation from X-rays.
The photomultiplier tubes used in conventional PET machines are very sensitive to magnetic fields. So the researchers used a new technology -- the silicon avalanche photodiode detector -- in their machine. They were able to show that the scanner could acquire accurate PET and MRI images at the same time from test objects and mice.
The other authors on the paper, published online March 3 by the journal Proceedings of the National Academy of Sciences, are UC Davis graduate student Ciprian Catana, now at the Martinos Center for Biomedical Imaging, Harvard University; postdoctoral researcher Yibao Wu and Jinyi Qi, assistant professor of biomedical engineering, both at UC Davis; Daniel Procissi, Caltech; Bernd Pichler, University of Tübingen, Germany; and Russell Jacobs from the Beckman Institute, California Institute of Technology. Another paper by Cherry, Jacobs and UC Davis Associate Professor Angelique Louie reviewing the opportunities and challenges for combined PET/MRI was published in the Feb. 2008 issue of the Proceedings of the IEEE. The work was supported by grants from the National Institutes of Health.
Sniffing Out Uses For The Electronic Nose
Despite 25 years of research, development of an "electronic nose" even approaching the capabilities of the human sniffer remains a dream, chemists in Germany conclude in an overview on the topic.
In a new article, Udo Weimar and colleagues describe major advances that have produced olfactory sensors with a range of uses in detecting certain odors. Electronic noses excel, for instance, at picking up so-called "non-odorant volatiles" -- chemicals, such as carbon monoxide, that mammalian noses cannot detect.
Ideally, however, an electronic nose should mimic the discrimination of the mammalian olfactory system for smells -- reliably identifying odors like "fruity," "grassy" and "earthy" given off by certain chemicals. Until electronic noses become more selective, their roles probably will be limited to serving as valuable tools for tasks such as monitoring air quality and detecting explosives.
"The electronic nose has the potential to enter our daily life far away from well-equipped chemical laboratories and skilled specialists," the article states. "Keeping its limitations in mind and adapted for a special purpose, this will be the future for the electronic nose for as long as the ability to smell odors rather than detect volatiles is still far away over the rainbow."
New Detector Can 'See' Single Neutrons Over Broad Range
Researchers at the National Institute of Standards and Technology (NIST) and the University of Maryland have developed a new optical method that can detect individual neutrons and record them over a range of intensities at least a hundred times greater than existing detectors. The new detector, described at the March Meeting of the American Physical Society by Charles Clark, a Fellow of the Joint Quantum Institute of NIST and the University of Maryland, promises to improve existing neutron measurements and enable tests of new phenomena beyond the Standard Model, the basic framework of particle physics.
The prototype laboratory device is based on a process first observed by the research team: the emission of light from hydrogen atoms produced when neutrons are absorbed by helium-3 atoms (3He). Lyman alpha light, discovered by Harvard physicist Theodore Lyman in 1906, results from the jump between the two lowest-energy states of the hydrogen atom. Although it is the brightest light emitted by the sun and is one of the most abundant forms of light in the universe, Lyman alpha is invisible to the eye because it lies in the far ultraviolet region of the optical spectrum. It is strongly absorbed by most substances and can travel through only about a millimeter of air.
Helium gas, however, does not absorb Lyman alpha light. When a neutron is absorbed by a helium-3 atom, one atom of hydrogen and one atom of tritium (a heavy form of hydrogen) are produced. These atoms fly apart at high speeds, can be excited by collisions with surrounding helium gas, and subsequently emit Lyman alpha light. This light is recorded by the new device, known as the Lyman alpha neutron detector (LAND).
Using an ultracold neutron beam at the NIST Center for Neutron Research, the research team has discovered that Lyman alpha light is generated with surprisingly high efficiency: about 40 photons are generated per neutron for helium gas at atmospheric pressure. According to Alan Thompson, neutron expert on the team, "This device thus has the potential to detect both single neutrons and large numbers of neutrons, which is very difficult to do with present neutron detectors based on electrical discharges."
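As a back-of-envelope illustration of that photon yield, here is a minimal sketch; the absorbed-neutron rate and light-collection efficiency are assumptions for illustration, not NIST figures.

```python
# Back-of-envelope sketch: detected Lyman alpha photons per second, using the
# ~40 photons per absorbed neutron quoted above. The absorbed-neutron rate and
# the light-collection efficiency are illustrative assumptions, not NIST values.
PHOTONS_PER_NEUTRON = 40
absorbed_neutrons_per_s = 1.0e4     # assumed
collection_efficiency = 0.05        # fraction of photons reaching the photodetector, assumed
detected_per_s = absorbed_neutrons_per_s * PHOTONS_PER_NEUTRON * collection_efficiency
print(f"~{detected_per_s:.0f} detected photons per second")   # ~20000 in this example
```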
The use of an optical means of detection, rather than an electronic one, also offers the prospect of at least a hundredfold improvement in neutron detectors' dynamic range (the spread in recordable neutron intensity from faint to bright). This stems from the fact that optical detectors respond more quickly than electronic detectors (which suffer from longer periods of inactivity known as "dead time.")
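To make the dead-time point concrete, here is a minimal sketch assuming the standard non-paralyzable dead-time model; the two dead-time values are purely illustrative stand-ins for "slower electronic" versus "faster optical" detection, since the article gives no figures.

```python
# Minimal sketch (not from the article): how detector dead time caps dynamic range.
# Uses the standard non-paralyzable dead-time model, m = n / (1 + n * tau),
# where n is the true event rate and tau is the dead time per event.
# The dead-time values below are illustrative assumptions, not NIST figures.

def measured_rate(true_rate_hz: float, dead_time_s: float) -> float:
    """Observed count rate for a non-paralyzable detector with the given dead time."""
    return true_rate_hz / (1.0 + true_rate_hz * dead_time_s)

for dead_time in (1e-6, 1e-8):          # 1 microsecond vs 10 ns, assumed
    for true_rate in (1e3, 1e5, 1e7):   # true neutron rates per second
        m = measured_rate(true_rate, dead_time)
        loss = 100.0 * (1.0 - m / true_rate)
        print(f"tau={dead_time:.0e}s  true={true_rate:.0e}/s  measured={m:.3e}/s  loss={loss:.1f}%")
```

The shorter dead time keeps the measured rate nearly proportional to the true rate up to much higher intensities, which is the dynamic-range advantage described above.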
With further development, this new method can potentially lead to better measurements at existing neutron facilities (for example, neutron diffraction instruments at the NIST Center for Neutron Research) and enable new tests of physics beyond the Standard Model. Measurements at NIST of a property in neutrons known as the electric dipole moment and more precise measurements of the neutron lifetime are planned.
Handheld DNA Detector
A researcher at National University in San Diego has taken a mathematical approach to a biological problem - how to design a portable DNA detector. Writing in the International Journal of Nanotechnology, he describes a mathematical simulation to show how a new type of nanoscale transistor might be coupled to a DNA sensor system to produce a characteristic signal for specific DNA fragments in a sample.
Samuel Afuwape of the National University, in San Diego, California, explains that a portable DNA sequencer could make life easier for environmental scientists testing contaminated sites. Clinicians and medical researchers too could use it to diagnose genetic disorders and study problems in genetics. Such a sensor might also be used to spot the weapons of the bioterrorist or in criminal forensic investigations.
The earliest DNA biosensors used fluorescent labels to target DNA, but these were expensive and slow. The next generation used mediator molecules to speed up the process and labeled enzymes to make the sensors highly selective for their target molecules. None of these systems were portable, however, and the current research trend is towards systems that use no molecular labels and avoid costly reagents.
Nevertheless, DNA biosensors are already becoming ubiquitous in many areas, but the instrumentation is usually limited to the laboratory setting. Afuwape says that a commercially viable, off-the-shelf handheld DNA biosensor that could be used in environmental, medical, forensics and other applications might be possible if researchers could unravel the basic molecular machinery operating at the interface between sample and detector.
Afuwape suggests that a new type of electronic device, the ion-selective field-effect transistor (ISFET), might be integrated into a DNA biosensor. Such a sensor would be coated with thousands of known DNA sequences that could match up - hybridize - with specific DNA fragments in a given medical or environmental sample.
The key to making the system work is that the ISFET can measure changes in conductivity. Constructing a sensor so that the process of DNA hybridization is coupled to a chemical reaction that generates electricity would produce discrete electronic signals. These signals would be picked up by the ISFET. The characteristic pattern of the signals would correspond to hybridization of a known DNA sequence on the sensor and so could reveal the presence of its counterpart DNA in the sample. Afuwape's mathematical work demonstrates that various known chemical reaction circuits involving DNA could be exploited in such a sensor.
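As a rough illustration of that read-out logic (and only that; this is not Afuwape's model, and the probe names and sequences below are invented), each spot on the sensor carries a known probe, a complementary fragment in the sample hybridizes to it and produces a signal at that spot, and the pattern of signalling spots identifies the target.

```python
# Illustrative sketch of the signature-pattern idea: which probe spots on the
# sensor would hybridize with fragments in a sample. Probe names and sequences
# are hypothetical examples, not from the paper.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    return "".join(COMPLEMENT[base] for base in reversed(seq))

# Hypothetical probe sequences immobilized on the ISFET surface.
probes = {
    "anthrax_marker": "ATGCGTTACG",
    "ecoli_marker":   "GGATCCTTAA",
}

def sensor_pattern(sample_fragments: list[str]) -> dict[str, bool]:
    """Return which probe spots would hybridize, i.e. produce an ISFET signal."""
    return {
        name: any(reverse_complement(probe) in fragment for fragment in sample_fragments)
        for name, probe in probes.items()
    }

# Example: the sample contains a fragment complementary to the first probe.
print(sensor_pattern(["TTTCGTAACGCATTTT", "AAAAAAAAAA"]))
# -> {'anthrax_marker': True, 'ecoli_marker': False}
```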
"The ISFET is proving to be a powerful platform on which to design and develop selective, sensitive, and fast miniature DNA sensors," says Afuwape, "such portable DNA sensors will find broad application in medical, agriculture, environmental and bioweapons detection."
New Tool To Monitor Nuclear Reactors Developed
International inspectors may have a new tool in the form of an antineutrino detector that could help them peer inside a working nuclear reactor.
A Lawrence Livermore National Laboratory-Sandia National Laboratories team recently demonstrated that the operational status and thermal power of reactors can be quickly and precisely monitored over hour-to-month time scales, using a cubic-meter-scale antineutrino detector.
Adam Bernstein, leader of the Advanced Detectors Group at LLNL, is the project's principal investigator. He works on the detector development project with colleagues Nathaniel Bowden, Steven Dazeley and Robert Svoboda at LLNL; David Reyna, Jim Lund and Lorraine Sadler from Sandia National Laboratories' California branch in Livermore, and Professor Todd Palmer and graduate student Alex Misner at Oregon State University.
Antineutrinos are elusive neutral particles produced in nuclear decay. They interact with other matter only through gravitational and weak forces, which makes them very difficult to detect. However, the number of antineutrinos emitted by nuclear reactors is so large that a cubic-meter scale detector suffices to record them by the hundreds or thousands per day. As the team has demonstrated, this new detector makes practical monitoring devices for nonproliferation applications possible.
The detector could be used to determine the operational amount of plutonium or uranium necessary to run the reactor and place a direct constraint on the amount of fissile material the reactor creates throughout its lifecycle.
It is a long-recognized and fundamental dilemma of the nuclear age that nuclear reactors and nuclear weapons use very similar fuels. The fuels are generically known as fissile material -- principally uranium and plutonium, either of which elements, in appropriate isotopic mixtures, can be used to build a nuclear device. Reactors consume uranium and produce plutonium, typically over periods of a year or so. Bombs consume either or both materials, in a few microseconds.
According to the National Academy of Engineering, nearly 2 million kilograms of highly-enriched uranium (90 percent or greater U-235) and plutonium have already been produced and exist in the world today -- some from military and some from civil production. While these fuels can be and are used to produce electric power with incredible efficiency, it takes less than 10 kilograms of plutonium, or a few tens of kilograms of highly-enriched uranium, to build a bomb. Therein lies the nonproliferation problem.
Because reactors consume uranium and are the source of all the world's plutonium, they are a critical nuclear fuel cycle element within the jurisdiction of the International Atomic Energy Agency's (IAEA) safeguards regime. The regime was put in place by international treaty (the Nuclear Nonproliferation Treaty) to detect the diversion of fissile materials from civil nuclear fuel cycle facilities into weapons programs. Part of the nonproliferation program involves comparing the actual operations of a reactor -- specifically its changing plutonium and uranium inventories -- with operator declarations of what the reactor is expected to produce during its normal operations.
That's where the new detector comes in. It provides a direct measurement of the operational status (on/off) of the reactor, measures the reactor thermal power and places a direct constraint on the fissile inventory of the reactor throughout its lifecycle. All three parameters are derived directly from the antineutrino rate, measured nonintrusively and continuously by the detector. The data can be acquired directly by the safeguards agency (for example, the IAEA) without any intervention or support from the reactor operator. The detector is located on the reactor site in an out-of-the-way location tens of meters from the core, and outside the containment dome.
In a 2006 article in the journal Nuclear Instruments and Methods, the team presented the first data from a prototype detector, SONGS1, deployed at the San Onofre Nuclear Generating Station in Southern California. Those results confirmed the successful detection of antineutrinos with this simple prototype, which was shown to be continuously and stably operable for yearlong periods with remote and automatic data collection and detector calibration.
"Our forthcoming Journal of Applied Physics paper provides a fuller analysis of the data," Bernstein said. "By comparing our antineutrino rate with the publicly available records of the reactor's thermal power and using a detailed simulation of the reactor dynamics over time, we've now been able to quantify the precision with which the detector can monitor the reactor power and operational status over hourly to monthly time scales using only the antineutrino signals.
"What's interesting is that the precision of our simple prototype is limited only by counting statistics -- there was no evidence for long-term drifts or other confounding detector problems that could have compromised the detector performance. This robust, predictable behavior bodes well for the deployability of these devices, which have previously and mistakenly been considered delicate apparati, squarely within the realm of fundamental physics.
"Through our work, and that of a growing community of researchers around the world, it is appropriate to speak of a new kind of applied physics -- applied antineutrino physics. Assuming this technology is broadly adopted as a Nonproliferation tool, it's not much of a stretch to imagine a small industry springing up in a few years, churning out antineutrino detectors for nuclear safeguards."
Antineutrino emission in nuclear reactors comes from the decay of neutron-rich fragments produced by heavy element fissions and is linked to the fissile isotope production and consumption processes. On average, a single fission is followed by the production of approximately six antineutrinos. However, above a certain energy threshold, the average number of antineutrinos produced per fission is significantly different for the two major fissile elements, uranium-235 and plutonium-239. This difference results in measurable changes in the antineutrino rate over the course of the reactor fuel cycle, as the ratios of these two elements change.
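A toy calculation along the lines of that paragraph can make the effect visible. The yields, fission rate and lumped detection efficiency below are assumptions chosen only for plausibility (roughly matching the "hundreds or thousands per day" figure quoted earlier), not the LLNL/Sandia team's values.

```python
# Toy illustration: how the detected antineutrino rate drifts as Pu-239 fissions
# displace U-235 fissions over a fuel cycle. All numbers are assumptions.

FISSIONS_PER_SECOND = 1.0e20      # roughly a 3 GW(thermal) reactor, assumed
DETECTION_EFFICIENCY = 6.0e-23    # lumped flux/cross-section/detector factor, assumed
YIELD_ABOVE_THRESHOLD = {"U235": 1.9, "Pu239": 1.45}  # detectable antineutrinos per fission, illustrative

def detected_per_day(u235_fission_fraction: float) -> float:
    """Expected detected antineutrinos per day for a given U-235 fission fraction."""
    pu239_fraction = 1.0 - u235_fission_fraction
    per_fission = (u235_fission_fraction * YIELD_ABOVE_THRESHOLD["U235"]
                   + pu239_fraction * YIELD_ABOVE_THRESHOLD["Pu239"])
    return FISSIONS_PER_SECOND * per_fission * DETECTION_EFFICIENCY * 86400

# Fission fractions at the start and end of a fuel cycle are also illustrative.
for label, frac in [("start of cycle", 0.90), ("end of cycle", 0.55)]:
    print(f"{label}: ~{detected_per_day(frac):.0f} antineutrinos detected per day")
```

Under these assumptions the daily count falls by several percent as the fuel burns, which is the kind of drift the detector would register over a cycle.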
The new detector operates unattended for long periods without significant maintenance, is self-calibrating, and does not affect plant operations in any way. Data from the detector is acquired remotely in real time, Bernstein said.
As for the detector being tampered with: "There are a host of pretty much standard techniques, used by the IAEA and others to guard against tampering," Bernstein said. "Furthermore, the antineutrino signature seen by the detector is hard to mimic with surrogate neutron or gamma sources."
The detector can be used to monitor reactor activities at different time scales of interest for safeguards. The reactor can be monitored on an hourly basis to look for sudden outages or other short-term anomalies in reactor operations. Or it can verify long-term stable operation of the reactor by measuring the antineutrino rate over the course of weeks, months and even years.
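Since Bernstein notes that the precision is limited only by counting statistics, the trade-off between these time scales can be sketched with a simple 1/sqrt(N) estimate; the assumed daily count below is merely consistent with the "hundreds or thousands per day" figure quoted earlier.

```python
# Counting-statistics sketch: with Poisson statistics the fractional precision
# on the measured antineutrino rate scales as 1/sqrt(N). The daily count is an
# assumption, not a measured value from the experiment.
from math import sqrt

COUNTS_PER_DAY = 1000.0   # assumed

for label, days in [("1 hour", 1 / 24), ("1 day", 1.0), ("1 month", 30.0)]:
    n = COUNTS_PER_DAY * days
    print(f"{label:>8}: N ~ {n:7.0f} counts -> relative precision ~ {100 / sqrt(n):.1f}%")
```

Hour-scale monitoring therefore resolves only gross changes such as outages, while month-scale integration pins down the rate at the percent level or better.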
"It's important to emphasize that the monitoring need not depend on operation declarations," Bernstein said. "It can be kept under the control of the safeguard agency, providing a completely independent measure of reactor status."
The research will appear in an upcoming issue of the Journal of Applied Physics. In addition, the National Academy of Engineering referenced the antineutrino detection work as one of 14 Grand Challenges for Engineering under the grand challenge "Prevent Nuclear Terror." The work already has garnered national and international attention as a possible new tool in IAEA's set of methods used to detect diversion of fissile materials.
Tiny Sensor Developed To Detect Homemade Bombs
A team of chemists and physicists at the University of California, San Diego has developed a tiny, inexpensive sensor chip capable of detecting trace amounts of hydrogen peroxide, a chemical used in the most common form of homemade explosives.
The invention and operation of this penny-sized electronic sensor, capable of sniffing out hydrogen peroxide vapor in the parts-per-billion range from peroxide-based explosives, such as those used in the 2005 bombing of the London transit system, is detailed in a new article. In addition to detecting explosives, UC San Diego scientists say the sensor could have widespread applications in improving the health of industrial workers by providing a new tool to inexpensively monitor the toxic hydrogen peroxide vapors from bleached pulp and other products to which factory workers are exposed.
“The detection capability of this tiny electronic sensor is comparable to current instruments, which are large, bulky and cost thousands of dollars each,” said William Trogler, a professor of chemistry and biochemistry at UCSD and one of its inventors. “If this device were mass produced, it’s not inconceivable that it could be made for less than a dollar.”
The device was invented by a team led by Trogler; Andrew Kummel, a professor of chemistry and biochemistry; and Ivan Schuller, a professor of physics. Much of the work was done by UCSD chemistry and physics graduate students Forest Bohrer, Corneliu Colesniuc and Jeongwon Park.
The sensor works by monitoring the variability of electrical conductivity through thin films of “metal phthalocyanines.” When exposed to most oxidizing agents, such as chlorine, these metal films show an increase in electrical current, while reducing agents have the opposite effect—a decrease of electrical current.
But when exposed to hydrogen peroxide, an oxidant, the metal phthalocyanine films behave differently depending on the type of metal used. Films made of cobalt phthalocyanine show decreases in current, while those made from copper or nickel show increases in current.
The UCSD team used this unusual trait to build their sensor. It is composed of thin films of both cobalt phthalocyanine and copper phthalocyanine to display a unique signature whenever tiny amounts of hydrogen peroxide are present.
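The two-film signature lends itself to a very simple decision rule. The sketch below is a toy classifier of my own construction, not the UCSD group's algorithm, and the threshold is arbitrary; it only illustrates how opposite-sign responses on the two films single out hydrogen peroxide.

```python
# Toy classifier illustrating the two-film signature described above.
def classify(delta_cobalt: float, delta_copper: float, threshold: float = 0.05) -> str:
    """Classify an exposure from fractional current changes on the two films."""
    if delta_cobalt < -threshold and delta_copper > threshold:
        return "hydrogen peroxide signature"      # CoPc current falls, CuPc current rises
    if delta_cobalt > threshold and delta_copper > threshold:
        return "generic oxidant (e.g. chlorine)"  # both currents rise
    if delta_cobalt < -threshold and delta_copper < -threshold:
        return "reducing agent"                   # both currents fall
    return "no clear signature"

print(classify(-0.20, +0.15))   # hydrogen peroxide signature
print(classify(+0.30, +0.25))   # generic oxidant (e.g. chlorine)
```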
Bombs constructed with hydrogen peroxide killed more than 50 people and injured 700 more on two London subway trains and a transit bus during rush hour on July 7, 2005. More than 1,500 pounds of a hydrogen peroxide-based mixture was discovered after an alleged bomb plot in Germany that resulted in the widely publicized arrest last September of three people.
Trogler said that because the team’s sensor is so little affected by water vapor, it can be used in industrial and other “real-life applications.” The university has applied for a patent on the invention, which has not yet been licensed.
The article, "Selective Detection of Vapor Phase Hydrogen Peroxide with Phthalocyanine Chemiresistors," is published in the Journal of the American Chemical Society.
Funding for the research study was provided by the Air Force Office of Scientific Research.
Biosensing Nanodevice To Revolutionize Health Screenings
One day soon a biosensing nanodevice developed by Arizona State University researcher Wayne Frasch may eliminate long lines at airport security checkpoints and revolutionize health screenings for threats like anthrax, cancer and methicillin-resistant Staphylococcus aureus (MRSA).
Even more incredible than the device itself is that it is based on the world's tiniest rotary motor: a biological engine measured on the order of molecules.
Frasch works with the enzyme F1-adenosine triphosphatase, better known as F1-ATPase. This enzyme, only 10 to 12 nanometers in diameter, has an axle that spins and produces torque. This tiny wonder is part of a complex of proteins key to creating energy in all living things, including photosynthesis in plants. F1-ATPase breaks down adenosine triphosphate (ATP) to adenosine diphosphate (ADP), releasing energy. Previous studies of its structure and characteristics have been the source of two Nobel Prizes awarded in 1979 and 1997.
It was through his own detailed study of the rotational mechanism of the F1-ATPase, which operates like a three-cylinder Mazda rotary motor, that Frasch conceived of a way to take this tiny biological powerhouse and couple it with science applications outside of the human body.
An article authored by Frasch and his colleagues in the ASU School of Life Sciences details the technology that would allow this. Their publication "Single-molecule detection of DNA via sequence-specific links between F1-ATPase motors and gold nanorod sensors" was recently published in the journal Lab on a Chip, and featured in the online journal Chemical Biology.
What Frasch and his colleagues show is that the enzyme can be armed with an optical probe (gold nanorod) and manipulated to emit a signal when it detects a single molecule of target DNA. This is achieved by anchoring a quiescent F1-ATPase motor to a surface. A single strand of a reference biotinylated DNA molecule is then attached to its axle. The marker molecule, biotin, on the DNA is known to bind specifically and tightly to the glycoprotein avidin, so an avidin-coated gold nanorod is then added. The avidin-nanorod attaches to the biotinylated DNA strand and forms a stable complex.
When a test solution containing a target piece of DNA is added, this DNA binds to the single complementary reference strand attached to the F1-ATPase. The DNA complex, suspended between the nanorod and the axle, forms a stiff bridge. Once ATP is added to the test solution, the F1-ATPase axle spins, and with it, the attached (now double-stranded) DNA and nanorod. The whirling nano-sized device emits a pulsing red signal that can then be detected with a microscope.
According to Frasch, the rotation discriminates fully assembled nanodevices from nonspecifically bound nanorods, resulting in a sensitivity limit of one zeptomole (600 molecules). Simply put, if it's not moving and flashing, it simply isn't relevant.
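One way to picture that discrimination step is the toy sketch below (my own construction, not Frasch's analysis code): a correctly assembled device blinks at its rotation frequency, while a nonspecifically stuck nanorod gives only an aperiodic trace, so a dominant peak in the Fourier spectrum of a spot's intensity flags a real device.

```python
# Toy sketch of discriminating rotating (assembled) nanorods from stuck ones by
# looking for a dominant blink frequency in each spot's intensity trace.
import numpy as np

def is_rotating(intensity_trace: np.ndarray, peak_ratio: float = 5.0) -> bool:
    """True if the trace has one dominant oscillation frequency (assembled device)."""
    spectrum = np.abs(np.fft.rfft(intensity_trace - intensity_trace.mean()))
    return spectrum.max() > peak_ratio * np.median(spectrum)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
rotating = 1 + np.sin(2 * np.pi * 40 * t) + 0.3 * rng.normal(size=t.size)   # blinking spot
stuck = 1 + 0.3 * rng.normal(size=t.size)                                   # noise only

print(is_rotating(rotating), is_rotating(stuck))   # expect: True False
```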
Moreover, Frasch says, "Studies with the F1-ATPase in my laboratory show that since it can detect single DNA molecules, it far exceeds the detection limits of conventional PCR [polymerase chain reaction] technology."
Such a detection instrument based on the F1-ATPase enzyme would also be "faster and more portable," he adds.
With support from Science Foundation Arizona (SFAz), Frasch will transfer his work from the bench to biotech through the establishment of a local company that uses the nano-sized F1-ATPase to produce a DNA detection instrument.
A prototype of the DNA detector is already in development. It is roughly the size of a small tissue box. Sampling would be as simple as taking a swab from an infected wound or a piece of baggage, dissolving it in a solution and placing a drop on a slide bearing reference F1-ATPases and their nanorods. Once in the instrument, red blinking signals emitted by rotating nanorods would let a computer know there's trouble, literally, in a flash.
SFAz funding has also enabled Frasch to extend the method to protein detection at the single-molecule level. This is novel because, unlike DNA, proteins cannot be amplified artificially to improve the chances of detection.
"Rapid and sensitive biosensing of nucleic acids and proteins is vital for the identification of pathogenic agents of biomedical and bioterrorist importance," notes Frasch, who is also with the Center for Bioenergy and Photosynthesis in the College of Liberal Arts and Sciences. "It also provides a new avenue through which to analyze genotypes and forensic evidence."
Safer, Easier System For Remote Explosive Detection
Detecting roadside bombs may become easier, thanks to chemical sensors being developed at the University of Michigan.
A team led by chemistry professor Theodore Goodson III has created materials that sniff out TNT and give off signals that can be detected remotely -- from a moving Humvee, for example. Their work was recently described in the journal Nanotechnology and also is the subject of a presentation at the 235th national meeting of the American Chemical Society in New Orleans, April 9, 2008.
The materials under study are large macromolecules made up of smaller active parts (chromophores), put together in a branching pattern. When TNT vapor contacts the material, "the TNT gets caught in the branches, as if in a sieve," said Goodson, who has a joint appointment in the Department of Macromolecular Science and Engineering. Normally, these materials emit light (fluoresce) when their molecules are excited with pulses of infrared light. But even the slightest trace of TNT quenches that fluorescence.
Goodson envisions a system in which sensors -- which can be made for about $10 each -- are positioned along the roadside and in other important locations. Passing military vehicles would be equipped with lasers to shoot infrared light at the sensors to excite the fluorescence, and a specially designed light-collection system to detect the sensors' response. Any sensors that don't fluoresce would be tip-offs to possible locations of roadside bombs.
Goodson's remote detection scheme relies on highly sensitive, low-cost, battery-free, thin-film sensors that require no electronic equipment or excitation source at the sites where they are installed. In contrast, conventional chemical TNT sensors for explosive detection have no remote capability and must be used in close proximity to the suspicious site, increasing the danger for military personnel. Using infrared light to excite the remote sensors minimizes light scattering, allows for greater penetration through the atmosphere, and is safe for soldiers' eyes.
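The readout logic is simple in principle: any interrogated sensor whose fluorescence falls below an expected level is flagged. The following sketch is purely illustrative; the sensor IDs, intensity values and threshold are invented and are not taken from Goodson's system.

    # Hypothetical fluorescence intensities collected by the vehicle's
    # light-collection system for each roadside sensor it interrogates.
    readings = {
        "sensor_01": 0.92,   # normal fluorescence
        "sensor_02": 0.88,
        "sensor_03": 0.07,   # fluorescence quenched -> possible TNT exposure
        "sensor_04": 0.95,
    }

    QUENCH_THRESHOLD = 0.30  # assumed value; a real system would be calibrated

    def flag_quenched(readings, threshold=QUENCH_THRESHOLD):
        """Return the IDs of sensors whose fluorescence is suspiciously low."""
        return [sid for sid, intensity in sorted(readings.items()) if intensity < threshold]

    for sensor_id in flag_quenched(readings):
        print(f"ALERT: {sensor_id} is not fluorescing -- possible explosive nearby")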
Goodson's research team also is working on laser-based methods for directly detecting TNT, with no sensors on site.
Goodson's collaborators on the research are graduate student Aditya Narayanan and research associate Oleg Varnavski of U-M; Oliver Mongin and Mireille Blanchard-Desce of Université de Rennes 1 in Rennes, France, and Jean-Pierre Majoral of Laboratoire de Chimie de Coordination in Toulouse, France.
Traffic Woes? New Method Allows Traffic Optimization Over Large Geographic Areas
How can traffic be monitored and controlled more effectively? In the ORINOKO project, scientists have developed methods of determining the traffic situation across a wide area, and have refined processes that enable traffic to be optimally channeled.
Traffic jams on the way to work, to the shops or to a holiday destination – a common experience for most of us. Traffic management systems can provide help. Various concepts and measures are being tested, for example in the transport research project ORINOKO (Operative Regional Integrated and Optimized Corridor Control). The project received funding to the tune of almost three million euros from the German Federal Ministry of Economics and Technology (BMWi) over a period of about three years.
The Fraunhofer Institute for Transportation and Infrastructure Systems IVI in Dresden was among the project partners. The IVI team led by Ulf Jung and Georg Förster performed a variety of tasks. “One thing we did was set up a central database containing a digital map of the road network. A vast amount of relevant measurement data flows continuously into this database,” says Georg Förster. “We also provided software interfaces that enable dynamic data from a variety of sources, such as journey times, traffic volume or tailback lengths, to be used for control and information purposes within the scope of the traffic management system.”
The team is particularly proud of having established a sensor system based on video cameras, which was installed and tested on a trial basis at ten different sites in Nuremberg over the past few months. It can automatically determine certain traffic statistics such as the number of vehicles on the roads or the length of a tailback. These values are continuously fed into a central computer system where they are processed and used to control the traffic. For instance, traffic lights are switched to suit the situation observed by the cameras. “This combination of advanced computer technology and the image processing software developed by us delivers data of a similar quality to those of conventional induction loops, but is much cheaper and more flexible to use,” says IVI head of department Ulf Jung.
The video detector can determine the number of vehicles, their speed, the length of a tailback, and other factors. At present, it is able to analyze up to six traffic lanes simultaneously. The recorded images are processed and interpreted in real time on the spot by a small computer connected to the camera module, which then sends the traffic data and live images to a control center. The new system fills the gap between the established but expensive induction loops and the journey time measurements obtained using sensors in taxis. The video detectors are not only cost-efficient but also deliver a continuous stream of reliable data.
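As an illustration of how a camera-based detector of this general kind typically works (this is not the IVI software; the file name, lane boundaries, blob-size threshold and use of OpenCV 4 are all assumptions), a background-subtraction vehicle counter in Python might look roughly like this:

    import cv2

    # Assumed inputs: a fixed roadside camera feed and known pixel boundaries of the lanes.
    VIDEO_SOURCE = "intersection_cam.mp4"        # hypothetical file name
    LANES = [(0, 100), (100, 200), (200, 300)]   # hypothetical lane x-ranges in pixels
    MIN_VEHICLE_AREA = 800                       # assumed minimum blob size in pixels

    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)
    capture = cv2.VideoCapture(VIDEO_SOURCE)

    while True:
        ok, frame = capture.read()
        if not ok:
            break
        mask = subtractor.apply(frame)           # moving objects become foreground
        mask = cv2.medianBlur(mask, 5)           # suppress noise and shadow speckle
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        counts = [0] * len(LANES)
        for contour in contours:
            if cv2.contourArea(contour) < MIN_VEHICLE_AREA:
                continue
            x, y, w, h = cv2.boundingRect(contour)
            centre_x = x + w // 2
            for i, (left, right) in enumerate(LANES):
                if left <= centre_x < right:
                    counts[i] += 1               # one vehicle blob currently in this lane
        print("vehicles visible per lane:", counts)   # would be sent to the control centre

    capture.release()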
New Molecules Could Change The Face Of Explosives Detection
Researchers at the University of Massachusetts Amherst have created complex molecules containing zinc for use in portable sensors that quickly and reliably detect the presence of plastic explosives, a pressing need for soldiers in Iraq and other hostile environments.
Sensors containing the zinc complexes are also the first devices that allow the user to identify which type of explosive is present, since each metal complex has a unique response to explosives and explosive mimics.
“This is a big improvement over existing sensors based on polymers, since the metal complexes can discriminate between closely related explosives compounds,” says Michael Knapp, a professor of chemistry. “This ability is a real advantage for airport security personnel and law enforcement officials, who need to quickly detect and identify what type of explosives they are dealing with.”
Results of the study by Knapp, doctoral candidate Meaghan Germain and undergraduate student Thomas Vargo were published April 23 in the Journal of the American Chemical Society.
Knapp and Germain currently hold a patent for the zinc complexes, and are working with the UMass Amherst Office of Commercial Ventures and Intellectual Property to bring this technology to market. The research was supported by start-up funds provided by the University of Massachusetts Amherst.
The zinc complexes are naturally fluorescent, but they lose this ability when exposed to chemicals contained in plastic explosives, a phenomenon called quenching. Since each complex loses a different amount of its fluorescence, the complexes can be used to create sensor arrays that produce a different visual display when exposed to different explosives.
During testing, the sensors also responded quickly, since the zinc complexes are very efficient at changing energy states, making them suitable for hostile environments. “Of all the molecules that fluoresce, these go from a high energy state to a low energy state like falling off a cliff,” says Knapp. “They don’t lose energy gradually like metal complexes made with copper.”
“Identifying and distinguishing related compounds by optical methods is an enormous challenge for chemical sensing,” says Knapp. “The differential quenching of the zinc complexes is what permits discrimination within the closely related nitroaromatic family used in explosives.”
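Conceptually, identification with such an array amounts to matching the observed pattern of quenching responses against reference fingerprints. The sketch below is only a schematic stand-in; the complex names, response values and compound fingerprints are invented, not data from the UMass Amherst study.

    import math

    # Hypothetical reference fingerprints: fractional fluorescence quenching of three
    # zinc complexes (Zn-A, Zn-B, Zn-C) when exposed to known nitroaromatic compounds.
    REFERENCE = {
        "TNT": (0.85, 0.40, 0.10),
        "DNT": (0.55, 0.65, 0.20),
        "picric acid": (0.30, 0.90, 0.75),
    }

    def identify(observed):
        """Return the reference compound whose quenching pattern is closest (Euclidean)."""
        return min(REFERENCE, key=lambda name: math.dist(observed, REFERENCE[name]))

    # Hypothetical measurement from a sensor array exposed to an unknown vapor
    sample = (0.82, 0.44, 0.12)
    print(identify(sample))   # expected: TNT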
Monday, 19 May 2008
New Technology Tests Maturity Of Stem Cells
Stem cells can differentiate into 220 different types of body cell. The development of these cells can now be systematically observed and investigated with the aid of two new machines that imitate the conditions in the human body with unprecedented accuracy.
Stem cells are extremely versatile: They can develop in 220 different ways, transforming themselves into a correspondingly diverse range of specialized body cells. Biologists and medical scientists plan to make use of this differentiation ability to selectively harvest cardiac, skin or nerve cells for the treatment of different diseases. However, the stem cell culture techniques practiced today are not very efficient.
What proportion of a mass of stem cells is transformed into which body cells? And in what conditions? “We need devices that keep doing the same thing and thus deliver statistically reliable data,” says Professor Günter Fuhr, director of the Fraunhofer Institute for Biomedical Engineering IBMT in St. Ingbert.
Two prototypes of laboratory devices for stem cell differentiation enable the complex careers of stem cells to be systematically examined for the first time ever. These devices are the result of the international project ‘CellPROM’ – ‘Cell Programming by Nanoscaled Devices’ – which was funded by the European Union to the tune of 16.7 million euros and coordinated by the IBMT. “The type of cell culture used until now is too far removed from the natural situation,” says CellPROM project coordinator Daniel Schmitt – for in the body, the stem cells come into contact with solute nutrients, messenger RNAs and a large number of different cells.
Millions of proteins rest in or on the cell membranes and excite the stem cells to transform themselves into specialized cells. “We want to provide the stem cells in the laboratory with a surface that is as similar as possible to the cell membranes,” explains Daniel Schmitt. “To this end, the consortium developed a variety of methods by which different biomolecules can be efficiently applied to cell-compatible surfaces.”
In the two machines – MagnaLab and NazcaLab – the stem cells are brought into contact with the signal factors in a pre-defined manner. In MagnaLab, several hundred cells grow on culture substrates that are coated with biomolecules. In NazcaLab, large numbers of individual cells, washed around by a nutrient solution, float along parallel channels where they encounter micro-particles that are charged with signal factors.
“We use a microscope and a camera to document in fast motion how individual cells divide and differentiate,” says Schmitt. The researchers demonstrated on about 20 different cell models that these versatile cells can be stimulated by surface signals to transform themselves into specialized cells.
Boosting 'Mussel' Power: New Technique For Making Key Marine Mussel Protein
Researchers in Korea report development of a way to double production of a sticky protein from marine mussels destined for use as an antibacterial coating to prevent life-threatening infections in medical implants. The coating, produced by genetically-engineered bacteria, could cut medical costs and improve implant safety, the researchers say.
Bacterial infection of medical implants, such as cardiac stents and dialysis tubing, threatens thousands of people each year and is a major medical challenge due to the emergence of antibiotic-resistant bacteria. Several research groups are working on long-lasting, germ-fighting coatings from mussel proteins, but production of these coatings is inefficient and expensive.
Hyung Joon Cha and colleagues previously developed a way to use genetically engineered E. coli bacteria to produce mussel adhesive proteins. Now they report adding a new gene for producing Vitreoscilla hemoglobin (VHb), a substance that boosts production of proteins under low-oxygen conditions. Adding the VHb gene to the engineered E. coli doubled the amount of mussel proteins produced, which could lead to more cost-effective coatings, the researchers say.
The article "Enhancement of Mussel Adhesive Protein Production in Escherichia coli by Co-expression of Bacterial Hemoglobin" is scheduled for the June 6 issue of ACS' Biotechnology Progress.
Groundbreaking Methodology For Identifying Cancerous Cells
Recognizing the distinction between healthy and cancerous cells has traditionally been up to the eye of highly-trained cytologists and pathologists. While the majority of the resulting diagnoses are accurate, new technology can enhance the accuracy and alleviate the physical strain on the human observer. Northeastern University professor Max Diem and his team have developed an automatic method based on vibrational microspectroscopy that identifies the presence of metastatic cancer cells without the need for staining, and without human input.
The innovative method aids classical cytology (where visual inspection is used to detect changes in the morphology of cells obtained from bodily fluids, exfoliation or fine needle biopsy) and classical pathology (where stained tissue sections are examined visually).
“The idea behind the methodology is to examine the chemical composition of cells, as opposed to relying solely on the morphology,” said Diem, Professor of Chemistry and Chemical Biology at Northeastern University. “Abnormalities in exfoliated cells, for instance in Pap smears, can be difficult to discern visually; however, by looking at the biochemical composition of the cell with the help of vibrational spectroscopy, we can detect specific cellular changes indicating cancer.”
Funded by the National Cancer Institute (NCI) of the National Institutes of Health (NIH), the novel method developed by Diem and his team uses a reproducible, quantitative approach to measure cervical, urothelial or buccal exfoliated cells. As disease changes the chemical composition of the cell, the instrument is able to detect variations in cellular properties without the need to stain the slides and inspect them visually.
“The method is entirely machine-based and computer-interpreted, and thus, reduces the workload in diagnostic laboratories,” added Diem. “It allows us to increase the overall accuracy and decrease the time required to render medical diagnoses.”
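The machine-based interpretation step can be pictured as a supervised-classification problem over measured spectra. The sketch below applies a nearest-centroid rule to synthetic spectra; it is only a schematic illustration of that idea, and the wavenumber range, band positions and class labels are invented rather than taken from Diem's work.

    import numpy as np

    rng = np.random.default_rng(1)
    wavenumbers = np.linspace(900, 1800, 200)     # cm^-1, a typical mid-IR fingerprint region

    def synthetic_spectrum(peak_shift):
        """Toy spectrum: one absorption band whose position differs between cell classes."""
        band = np.exp(-((wavenumbers - (1240 + peak_shift)) / 25.0) ** 2)
        return band + 0.02 * rng.standard_normal(wavenumbers.size)

    # Hypothetical training data: spectra from cells labelled by a pathologist
    normal_spectra = np.array([synthetic_spectrum(0) for _ in range(20)])
    abnormal_spectra = np.array([synthetic_spectrum(30) for _ in range(20)])
    centroids = {"normal": normal_spectra.mean(axis=0),
                 "abnormal": abnormal_spectra.mean(axis=0)}

    def classify(spectrum):
        """Assign the class whose mean spectrum is nearest in Euclidean distance."""
        return min(centroids, key=lambda label: np.linalg.norm(spectrum - centroids[label]))

    print(classify(synthetic_spectrum(30)))   # expected: abnormal
    print(classify(synthetic_spectrum(0)))    # expected: normal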
Under another grant from NCI, the researchers are working on developing an operating room-based instrument that will produce a diagnosis of breast cancer cells in the axillary lymph nodes within 15 minutes after excision. The goal is to produce instrumentation and software that can analyze lymph node sections in the operating room, and provide the surgeon with an objective diagnosis of the spread of disease.
“We have identified three major milestones for this particular research,” said Diem. “We want to develop a rapid sample preparation methodology, refine the imaging instrumentation, and construct reliable databases and algorithms for the detection.”
Underscoring the university’s emphasis on interdisciplinary research, Diem’s laboratory also collaborates with the Center for Subsurface Sensing and Imaging Systems (CenSISS) at Northeastern University, making the professor one of the non-engineer members of the CenSISS group.
Magnet Lab Researchers Make Observing Cell Functions Easier
Now that the genomes (DNA) of humans and many other organisms have been sequenced, biologists are turning their attention to discovering how the many thousands of structural and control genes -- the "worker bees" of living cells that can turn genes on and off -- function.
To do that, they need to develop new techniques and tools. Scientists in the Optical Microscopy group at the National High Magnetic Field Laboratory at Florida State University, working in collaboration with researchers from the University of Alberta in Canada and the University of California, San Diego, have done just that, and in the process have produced back-to-back articles in the journal Nature Methods.
In the first paper, magnet-lab biologists Michael Davidson and Kristen Hazelwood worked with researchers from the University of Alberta to create two new fluorescent-protein biosensors, molecular "beacons" that can tell if there is activity within a cell. The biosensors can be used simultaneously to monitor two separate dynamic functions in a single cell -- a key to understanding how different proteins and enzymes (the biomolecules that cause chemical reactions) work together to complete the daily chores that help cells grow and divide. Knowing how cells work together can help researchers learn a great deal more about tumors and developmental biology, among many other things.
The researchers improved a powerful technique used to monitor cellular dynamics called fluorescence resonance energy transfer, or FRET. The technique is used to examine a new class of biosensor molecules that tether two fluorescent proteins together through an intervening peptide (which is like a polymer). Several hundred of these new biosensors have been developed over the past few years and are being used by scientists around the world to study a variety of functions, including programmed cell death, carbohydrate metabolism, cell division, hormone stimulation, acidity changes -- just about any cellular process that can occur.
"In FRET, two molecules that are fluorescent act as 'molecular beacons' under the microscope, transferring energy between each other if they interact in the living cell," said Davidson, who directs the magnet lab's Optical Microscopy program. "With FRET, we can see that happen, but until now, we have only been able to monitor one biosensor at a time."
The new technique, called Dual FRET, is outlined in the paper "Fluorescent Protein FRET Pairs for Ratiometric Imaging of Dual Biosensors." http://www.nature.com/nmeth/journal/v5/n5/abs/nmeth.1207.html
Further expanding the capabilities of optical microscopy, Davidson and his team worked with collaborators from the University of California, San Diego to create a new screening method for fluorescent proteins that makes them more stable under the microscope. These proteins are sensitive to light, which can bleach them out after a certain period of time. By making the proteins more stable, microscopists can observe live cell dynamics for longer periods of time. The paper describing their work, "Improving the Photostability of Bright Monomeric Orange and Red Fluorescent Proteins," was published in the May 4 online edition of Nature Methods. http://www.nature.com/nmeth/journal/v4/n9/full/nmeth1083.html
Taken together, the new technique and tool are expected to speed up experiments and expand the utility of optical microscopy by allowing two dynamic processes inside a cell to be observed at once -- and for longer periods of time.
Warming Up For Magnetic Resonance Imaging
Standard magnetic resonance imaging, MRI, is a superb diagnostic tool, but one that suffers from low sensitivity, requiring patients to remain motionless for long periods of time inside noisy, claustrophobic machines. A promising new MRI method that is much faster, more selective (able to distinguish even among specific target molecules) and many thousands of times more sensitive has now been developed in the laboratory by researchers at the Department of Energy's Lawrence Berkeley National Laboratory and the University of California at Berkeley.
The key to the new technique is called "temperature-controlled molecular depolarization gates." It builds on a series of previous developments in MRI and the closely related field of nuclear magnetic resonance, NMR (which instead of an image yields a spectrum of molecular information), by members of the laboratories of Alexander Pines and David Wemmer at Berkeley Lab and UC Berkeley. Pines is the Glenn T. Seaborg Professor of Chemistry at the University of California at Berkeley and a senior scientist in Berkeley Lab's Materials Sciences Division. Wemmer is Professor of Chemistry at UC Berkeley and a member of Berkeley Lab's Physical Biosciences Division.
The technique was developed by a team of past and present Pines and Wemmer lab members headed by Leif Schröder of Berkeley Lab's Materials Sciences Division and including Lana Chavez, Tyler Meldrum, Monica Smith, and Thomas Lowery.
"The new method holds the promise of combining a set of proven NMR tools for the first time into a practical, supersensitive diagnostic system for imaging the distribution of specific molecules on such targets as tumors in human subjects," says lead author Schröder, "or even on individual cancer cells."
Molecule With 'Self-control' Synthesized
Plants have an ambivalent relationship with light. They need it to live, but too much light leads to the increased production of high-energy chemical intermediates that can injure or kill the plant.
The intermediates do this because the efficient conversion of sunlight into chemical energy cannot keep up with sunlight streaming into the plant.
"The intermediates don't have anywhere to go because the system is jammed up down the line," says ASU chemist Devens Gust. Plants employ a sophisticated process to defend against damage.
To better understand this process, Gust, along with fellow ASU researchers Thomas Moore and Ana Moore, both professors of chemistry and biochemistry, designed a molecule that mimics what happens in nature.
In nature, plants defend against this sunlight overload using a process called non-photochemical quenching (NPQ). This process drains off the excess light excitation energy as heat so that it cannot generate the destructive high-energy species.
The ASU-designed molecule works in a similar fashion in that it converts absorbed light to electrochemical energy but reduces the efficiency of the conversion as light intensity increases. The molecule has several components: two light-gathering antennas, a porphyrin electron donor, a fullerene acceptor, and a control unit that reversibly photoisomerizes between a dihydroindolizine (DHI) form and a betaine (BT) form.
When white light (sunlight) shines on a solution of the molecules, light absorbed by the porphyrin (or by the antennas) is converted to electrochemical potential energy. When the white light intensity is increased, the DHI on some molecules changes to a different molecular structure, BT, that drains light excitation energy out of the porphyrin and converts it to heat, avoiding the generation of excess electrochemical potential. As the light becomes brighter, more molecules switch to the non-functional form, so that the conversion of light to chemical energy becomes less efficient. The molecule adapts to its environment, regulating its behavior in response to the light intensity.
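The qualitative behavior described above, with conversion efficiency falling as more control units switch to the quenching BT form under bright light, can be captured in a toy model. The functional form and constants below are assumptions chosen purely for illustration, not measurements from the ASU study.

    def bt_fraction(intensity, half_saturation=1.0):
        """Toy model: the fraction of control units in the BT (quenching) form
        grows toward 1 as the light intensity increases."""
        return intensity / (intensity + half_saturation)

    def conversion_efficiency(intensity, max_efficiency=0.8):
        """Energy stored per absorbed photon falls as more molecules are quenched."""
        return max_efficiency * (1.0 - bt_fraction(intensity))

    for intensity in [0.1, 1.0, 10.0]:   # arbitrary light-intensity units
        print(f"intensity {intensity:5.1f} -> efficiency {conversion_efficiency(intensity):.2f}")
    # Output trends downward: dim light is converted efficiently, bright light is throttled.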
"One hallmark of living cells is their ability to sense and respond to surrounding conditions," explains Thomas Moore. "In the case of metabolic control this process involves molecular-level recognition events that are translated into control of a chemical process."
"Functionally, this mimics one of the processes in photosynthesis that severely limits the energy conversion efficiency of higher plants," he added. "One way in which this work is important is that by understanding these events at the molecular level one can imagine redesigning photosynthesis to improve energy conversion efficiency and thereby come closer to meeting our energy needs."
The research is also important to one aspect of the exploding field of nanotechnology, that of regulation, Gust adds. Biological systems are known for their ability to engage in adaptive self-regulation. The nanoscale components respond to other nanoscale systems and to external stimuli in order to keep everything in balance and functioning properly. The ASU research shows how a bio-regulation system has been captured in a non-biological molecular scale analog process.
"Achieving such behavior in human-made devices is vital if we are to realize the promise of nanotechnology," adds Gust. "Although the mechanism of control used in the ASU molecule is different from that employed in NPQ, the overall effect is the same as occurs in the natural photosynthetic process."
Results were reported in the advance online publication of Nature Nanotechnology (May 4, 2008).
In addition to Gust, Thomas Moore and Ana Moore, the ASU work was carried out by Stephen Straight, Gerdenis Kodis, Yuichi Terazono and Michael Hambourger.
Engineering Researchers Automate Analysis Of Protein Patterns
Carnegie Mellon University's Justin Y. Newberg and Robert F. Murphy have developed a software toolbox that is intended to help bioscience researchers characterize protein patterns in human tissues.
Newberg, a Ph.D. student in biomedical engineering, described the automated protein pattern recognition tool and its underlying methods as important for identifying biomarkers that could be useful for cancer diagnosis and therapy.
"Distribution of proteins in a cell or group of cells can be used to identify the state of surrounding tissue, whether it is healthy or diseased," said Newberg, the newsletter editor for Carnegie Mellon's Graduate Biomedical Engineering Society. "So, our tools can be used to develop novel approaches to screen tissue, which could have an immense benefit in such things as cancer diagnosis."
Newberg, a member of Murphy's research group, added that researchers are increasingly collecting large numbers of images due to the availability of automated microscopes. These images provide an excellent opportunity for improving the understanding of biological processes, but also create a need for automated bioimage analysis tools. Development of such tools has been a major focus of Carnegie Mellon's Center for Bioimage Informatics for many years.
Newberg said the Human Protein Atlas is an excellent example of a large-scale dataset ripe for automated analysis. The atlas consists of more than 3,000 proteins imaged in 45 normal and 20 cancerous human tissues.
In a research article in the Journal of Proteome Research, Newberg and Murphy, the Ray and Stephanie Lane Professor of Computational Biology and a professor in the departments of Biological Sciences, Biomedical Engineering and Machine Learning at Carnegie Mellon, described how they applied their tools to analyze images of eight major subcellular location patterns with a high degree of accuracy. They pointed to their work as a strong indication that automated analysis of the whole atlas is feasible, and they plan to continue to study and characterize all of the proteins in the atlas.
"Knowing the exact location of thousands of proteins in human cells will enable a much better understanding of how these cells work and could ultimately advance the detection and diagnosis of serious diseases," Murphy said.
Lasers Used To Align Molecules: Technique Could Revolutionize Human Protein Imaging
Protein crystallographers have only scratched the surface of the human proteins important for drug interactions because of difficulties crystallizing the molecules for synchrotron x-ray diffraction.
Scientists at the U.S. Department of Energy's (DOE) Argonne National Laboratory have devised a way to eliminate the need for crystallization by using lasers to align large groups of molecules.
"Strong laser fields can be used to control the behavior of atoms and molecules," Argonne Distinguished Fellow Linda Young said. "Using x-rays, we can investigate their properties in a totally new way."
Crystallization allows scientists to create a periodic structure that will strongly diffract in specific directions when bombarded with x-rays. From the resulting diffraction pattern, a real-space image can be reconstructed.
However, without crystallization, when x-rays collide with multiple, randomly oriented molecules, they diffract in different directions, making it impossible to create a composite diffraction image, Argonne Physicist Robin Santra said.
Some molecules, including many involved in drug interactions, cannot be crystallized, and imaging them would otherwise require bombarding numerous separate samples to build up a full composite picture. Young's laser technique allows millions of molecules suspended in a gaseous state to be aligned so that, when bombarded with x-rays, they all diffract in the same way. The resulting images are at atomic-level resolution and do not require crystallization.
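The gain from alignment can be illustrated with a small numerical sketch (a toy 2-D "molecule" and kinematic diffraction via a Fourier transform; all parameters are assumptions for illustration): averaging the diffraction patterns of randomly oriented copies washes out the angular detail, while averaging identically aligned copies preserves it.

# Toy illustration of why molecular orientation matters for diffraction imaging.
import numpy as np

rng = np.random.default_rng(0)
N = 128
coords = rng.uniform(-20, 20, size=(30, 2))   # toy "atom" positions (arbitrary units)

def density(points):
    img = np.zeros((N, N))
    for x, y in points:
        img[int(N / 2 + y), int(N / 2 + x)] += 1.0
    return img

def diffraction(points):
    # Kinematic approximation: diffracted intensity ~ |Fourier transform|^2.
    return np.abs(np.fft.fft2(density(points))) ** 2

def rotated(points, angle):
    c, s = np.cos(angle), np.sin(angle)
    return points @ np.array([[c, -s], [s, c]])

aligned = np.mean([diffraction(coords) for _ in range(50)], axis=0)
random_ = np.mean([diffraction(rotated(coords, rng.uniform(0, 2 * np.pi)))
                   for _ in range(50)], axis=0)

# The aligned average keeps strong angular structure (large relative spread),
# while the randomly oriented average is much smoother.
print(aligned.std() / aligned.mean(), random_.std() / random_.mean())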
"Understanding the structure of the approximately 1 million human proteins that cannot be crystallized is perhaps the most important challenge facing structural biology," Young said.
"A method for structure determination at atomic resolution without the need to crystallize would be revolutionary."
Young and her team have successfully aligned molecules using a laser, probed the aligned ensemble with x-rays, and shown theoretically that the technique could be used for x-ray imaging (see E. R. Peterson et al., Applied Physics Letters 92, 094106 (2008)), but a proposed upgrade to the Advanced Photon Source facility at Argonne is needed before x-ray diffraction can be done experimentally.
Carbon-coated Nanomagnets Could Be A New Form Of Cancer Treatment
Carbon-coated nanomagnets may offer a new form of cancer treatment. Research presented at the 103rd Annual Scientific Meeting of the American Urological Association (AUA) suggests that nanoparticles consisting of metallic iron with a protective carbon coat could serve as a safe and effective hyperthermia agent.
Researchers from Germany have found that, in animal models, using heat to selectively kill tumor cells is effective. Using metallic iron in the nanoparticles (in lieu of iron oxide) would allow heating to greater temperatures, and coating the iron with carbon would prevent the iron from rusting, which can hinder the effectiveness of the therapy.
In order to ensure that the nanoparticles did not harm non-cancerous cells, researchers tested their compatibility with normal tissues. Human PC-3 prostate cells and a non-malignant fibroblast cell line were incubated with the carbon coated nanomagnets and, after the incubation period, the cells did not experience major cytotoxic (cell-destroying) effects. The cell cycle distribution and the apoptosis rate were not impaired by the presence of nanomagnets, reflecting the biocompatible character of these structures.
This breakthrough could provide an effective treatment option for many types of cancer, without the destruction of surrounding cells associated with chemotherapy or invasive surgery.
The finding that the carbon-coated nanoparticles caused no cell destruction during incubation suggests that they could serve as safe and effective hyperthermia agents, targeting and destroying cancerous cells. These findings underscore the need for more research into the use of nanoparticles as potential cancer treatments.
Saturday, 10 May 2008
Vitamin D And Calcium Influence Cell Death In The Colon, Researchers Find
Researchers at Emory University are learning how vitamins and minerals in the diet can stimulate or prevent the appearance of colon cancer.
Emory investigators will present their findings on biological markers that could influence colon cancer risk in three abstracts at the American Association for Cancer Research meeting in San Diego.
In a clinical study of 92 patients, supplementing diet with calcium and vitamin D appeared to increase the levels of a protein called Bax that controls programmed cell death in the colon. More Bax might be pushing pre-cancerous cells into programmed cell death, says Emory researcher Veronika Fedirko, who will present her team's results.
Previous studies have shown that calcium and vitamin D tend to reduce colon cancer risk.
"We were pleased that the effects of calcium and vitamin D were visible enough in this small study to be significant and reportable," Fedirko says. "We will have to fully evaluate each marker's strength as we accumulate more data."
The studies of colorectal biopsy samples are part of a larger effort to identify a portfolio of measurements that together can gauge someone's risk of getting colon cancer, says Roberd Bostick, MD, MPH, professor of epidemiology at Emory's Rollins School of Public Health.
"We want to have the equivalent of measuring cholesterol or high blood pressure, but for colon cancer instead of heart disease," Bostick says. "These measurements will describe the climate of risk in the colon rather than spotting individual tumors or cells that may become tumors."
Bostick has plans for developing non-invasive blood or urine tests for colon cancer risk.
Bostick and his colleagues demonstrated in a 200-patient case-control study that high levels of calcium and vitamin D together are associated with increased levels of E-cadherin, which moderates colon cells' movement and proliferation.
A third abstract, based on the same case-control study and to be presented at the same meeting, shows that high levels of iron in the diet are linked to low levels of APC, a protein whose absence in colon cancer cells leads to their runaway growth.
Bostick and his colleagues are participating in a ten-year multi-center study of the effects of increased vitamin D and calcium and biomarker-guided treatment of colon cancer recurrence. The study involves almost 2,500 people nationwide who have regular colonoscopies.
The Bostick team's research is funded by the National Cancer Institute and the Wilson and Anne Franklin Foundation.
Twin Findings Raise Hopes Of Improved Anemia Treatments
A new understanding of how red blood cell production is controlled could lead to improvements in the treatment of the blood disorder anaemia, according to West Australian medical researchers.
The findings are reported in two papers published in Blood, the journal of the American Society of Hematology, by a group of Australian scientists, led by Western Australian Institute for Medical Research (WAIMR) Director Peter Klinken and his Laboratory for Cancer Medicine.
One of the papers shows how the gene Hls5, which was discovered by Professor Klinken's team, affects red cell production.
"We have established that Hls5 impedes the maturation of immature red blood cells which has provided us with a much better understanding of what Hls5 does and how it is linked with the development of leukaemias and cancers," he said.
"Another arm of our research has revealed that thyroid hormone, which it was already established affected metabolism, also contributes to red blood cell formation -- which was previously unknown."
Professor Klinken said both findings opened the door to exploring new ways of treating a range of anaemias.
"Anaemias develop where a person's blood is low in red blood cells so the two discoveries we have made may provide an insight into how to turn these conditions around," he said.
"Our findings indicate that minor changes in Hls5 levels can have a big impact and so the possibility of modulating this gene to generate new treatments is significant.
"As a number of patients don't respond to erythropoietin (EPO) -- the current form of hormone therapy for anaemias -- this new knowledge will hopefully lead to alternative treatments."
The research being conducted by Professor Klinken and his team is funded by the National Health and Medical Research Council and ASX-listed Perth-based biotechnology company BioPharmica.
Anaemia occurs when the amount of haemoglobin (which is found in red blood cells) drops below normal. Haemoglobin is necessary for the transportation of oxygen throughout the body.
It can be caused by iron or vitamin deficiency, blood loss, a chronic illness, a genetic or acquired defect or disease or through the use of some medications.
Synthetic Vitamin D Helps Prevent Some Breast Cancers, Animal Study Suggests
Researchers at Rutgers University have found that, in animal studies, a synthetic form of active vitamin D has a substantive preventive effect on the development of both estrogen receptor (ER)-positive and ER-negative breast cancers. Unlike many of the other synthetic vitamin D agents that have been tested in humans, this compound, known as Gemini 0097, shows no toxicity, they report.
The research team found that daily injections of Gemini 0097 cut growth of ER-positive cancer by 60 percent in rat studies, and reduced ER-negative breast cancer by half in mice.
"These are very promising findings, especially because no toxicity is observed," said researcher Hong Jin Lee, a graduate student at Rutgers. Lee works in the laboratory of lead investigator Nanjoo Suh, Ph.D., an assistant professor at the Susan Lehman Cullman Laboratory for Cancer Research at Rutgers, the State University of New Jersey. Suh said that Gemini 0097 likely did not cause the most common vitamin D toxicity, an overload of calcium in blood known as hypercalcemia, because the compound has an extra side chain of chemicals.
"It is quite different from the natural shape of active vitamin D," she said. "Because the binding affinity of Gemini 0097 with vitamin D receptor is low that may contribute to the lower toxicity, but the efficacy stays the same or even better."
Epidemiologic studies have shown that use of vitamin D is beneficial in preventing colon cancer, but studies in prostate and breast cancer have yielded mixed conclusions, Suh says.
Vitamin D is a pro-hormone that is produced in the skin after exposure to sunlight. Vitamin D dietary supplements are converted into an active, useful form by metabolism in the liver and kidneys. Although the active form of vitamin D has been tested as a cancer treatment, the higher doses needed for prevention or treatment have typically produced intolerable side effects in clinical trials, Suh says.
In this study, the researchers tested 60 novel Gemini vitamin compounds, with Gemini 0097 performing the best, Lee says.
In one set of studies, the researchers exposed rats to a mammary carcinogen, then injected groups of 15 animals with different doses of Gemini 0097. They found that the lowest dose had little effect but higher doses slowed the growth of resultant ER-positive tumors by 60 percent, compared with a group of control rats. Some treated rats developed small mammary tumors and some developed none at all, says Lee. "The data are very convincing," he said.
In a second, similar experiment in a mouse model of ER-negative breast cancer, mice treated with Gemini vitamin D had 50 percent fewer tumors than did control mice.
The researchers analyzed tumor samples from both the rats and the mice and discovered that Gemini 0097 prevents tumorigenesis by increasing expression of the p21 protein, which arrests the cell cycle, and by inducing insulin-like growth factor binding protein-3 (IGFBP-3), which slows down cell proliferation.
"These data are from animal studies, and we need more data before these compounds can be tested in humans," said Suh. "Still, we are hopeful that we have found a way of providing vitamin D without toxicity that has a significant effect on cancer prevention."
This research was presented at the 2008 Annual Meeting of the American Association for Cancer Research, April 12-16, 2008.
Vitamin E May Help Alzheimer's Patients Live Longer, Study Suggests
People with Alzheimer's disease who take vitamin E appear to live longer than those who don't take vitamin E, according to new research.
For the study, researchers followed 847 people with Alzheimer's disease for an average of five years. About two-thirds of the group took 1,000 international units of vitamin E twice a day along with an Alzheimer's drug (a cholinesterase inhibitor). Less than 10 percent of the group took vitamin E alone and approximately 15 percent did not take vitamin E.
The study found people who took vitamin E, with or without a cholinesterase inhibitor, were 26 percent less likely to die than people who didn't take vitamin E.
"Vitamin E has previously been shown to delay the progression of moderately severe Alzheimer's disease. Now, we've been able to show that vitamin E appears to increase the survival time of Alzheimer's patients as well," said study author Valory Pavlik, PhD, with Baylor College of Medicine's Alzheimer's Disease and Memory Disorders Center in Houston, TX, and member of the American Academy of Neurology. "This is particularly important because recent studies in heart disease patients have questioned whether vitamin E is beneficial for survival."
In addition, the study found vitamin E plus a cholinesterase inhibitor may be more beneficial than taking either agent alone. "Our findings show that people who took a cholinesterase inhibitor without vitamin E did not have a survival benefit," said Pavlik. "More research needs to be done to determine why this may be the case."
In addition to vitamin E supplements, some vegetable oils, nuts, and green leafy vegetables are the main food sources of vitamin E. Some fortified cereals in the United States also contain vitamin E. "The daily amount of vitamin E taken by patients in this study was much higher than what is currently recommended for the general population," said Pavlik.
This research was presented at the American Academy of Neurology 60th Anniversary Annual Meeting in Chicago, April 12--19, 2008.
Variants Of Vitamin D Receptor Linked To Increased Risk Of Breast Cancer
Genetic variations in the body's receptor for vitamin D could increase the risk of breast cancer in postmenopausal women, according to a new study.
Jenny Chang-Claude of the Division of Cancer Epidemiology, at the German Cancer Research Center, in Heidelberg, and colleagues there and at the Institute for Medical Biometrics and Epidemiology, University Clinic Hamburg-Eppendorf, Germany, undertook a population-based case-control study involving 1,408 patients and 2,612 control individuals.
The researchers explain that vitamin D intake and serum concentrations of its metabolites have been associated with a decreased risk of developing breast cancer. The vitamin plays a known role in controlling calcium levels and influences the differentiation of cells, and so could play a part in preventing the runaway proliferation of cells characteristic of cancer.
Previous studies regarding the association between vitamin D and breast cancer have been inconsistent in their conclusions.
Chang-Claude and her colleagues have investigated variations in the gene encoding the vitamin D receptor protein. They found no differences in the biomarker for vitamin D, 25-hydroxyvitamin D, between women carrying any of four variants in the receptor gene: the two well-known polymorphisms FokI and TaqI, and two putative functional variants, VDR-5132 and Cdx2. Moreover, they found no relationship between the presence of these polymorphisms and overall risk of postmenopausal breast cancer.
However, they found a significant increase in the risk of estrogen receptor (ER) positive tumours among women with the TaqI genetic variant. This suggests the involvement of estrogen metabolism in the anticancer activity of vitamin D.
"Further studies focusing on the influence of genetic variations on vitamin D receptor functionality, activity and concentration are now needed" says Chang-Claude.
Journal reference: Vitamin D receptor gene polymorphisms and haplotypes and postmenopausal breast cancer risk. Sascha Abbas, Alexandra Nieters, Jakob Linseisen, Tracy Slanger, Silke Kropp, Elke J Mutschelknauss, Dieter Flesch-Janys and Jenny Chang-Claude. Breast Cancer Research (in press)
High Blood Levels Of Vitamin D Protect Women From Breast Cancer, Study Suggests
A connection between vitamin D levels and the risk of developing breast cancer has long been suspected, but its clinical relevance had not yet been proven. Sascha Abbas and colleagues from the working group headed by Dr. Jenny Chang-Claude at the German Cancer Research Center (Deutsches Krebsforschungszentrum, DKFZ), collaborating with researchers at the University Hospitals in Hamburg-Eppendorf, have now obtained clear results:
While previous studies had concentrated chiefly on nutritional vitamin D, the researchers have now investigated the complete vitamin D status. To this end, they studied 25-hydroxyvitamin D (25(OH)D) as a marker for both endogenous vitamin D and vitamin D from food intake.
The result of the study involving 1,394 breast cancer patients and an equal number of healthy women after menopause was surprisingly clear: Women with a very low blood level of 25(OH)D have a considerably increased breast cancer risk. The effect was found to be strongest in women who were not taking hormones for relief of menopausal symptoms. However, the authors note that, in this retrospective study, diagnosis-related factors such as chemotherapy or lack of sunlight after prolonged hospital stays might have contributed to low vitamin levels of breast cancer patients.
In addition, the investigators focused on the vitamin D receptor. The gene of this receptor is found in several variants known as polymorphisms. The research team of the DKFZ and Eppendorf Hospitals investigated the effect of four of these polymorphisms on the risk of developing breast cancer. They found that carriers of the TaqI polymorphism have a slightly increased risk of breast tumors that carry receptors for the female sex hormone estrogen on their surface. No effects on the overall breast cancer risk were found. A possible explanation offered by the authors is that vitamin D can exert its cancer-preventing effect by counteracting the growth-promoting effect of estrogens.
Besides its cancer-preventing influence with effects on cell growth, cell differentiation and programmed cell death (apoptosis), vitamin D regulates, above all, the calcium metabolism in our body. Foods that are particularly rich in vitamin D include seafish (cod liver oil),
Vitamin D Important In Brain Development And Function
In a definitive critical review, scientists at Children's Hospital & Research Center Oakland ask whether there is convincing biological or behavioral evidence linking vitamin D deficiency to brain dysfunction. Joyce C. McCann, Ph.D., assistant staff scientist, and Bruce N. Ames, Ph.D., senior scientist at Children's Hospital Oakland Research Institute (CHORI), conclude that there is ample biological evidence to suggest an important role for vitamin D in brain development and function, and that supplementation for groups chronically low in vitamin D is warranted. Their conclusions will be published on April 22, 2008 in the Federation of American Societies for Experimental Biology (FASEB) Journal.
"This critical analysis of vitamin D function and the brain is a model of careful thinking about nutrition and behavior", says Gerald Weissmann, MD, Editor-in-Chief of the FASEB Journal "One wishes that all studies of nutritional supplements or requirements were this thoughtful. Drs. McCann and Ames deftly show that while vitamin D has an important role in the development and function of the brain, its exact effects on behavior remain unclear. Pointing to the need for further study, the authors argue for vitamin D supplementation in groups at risk."
Vitamin D has long been known to promote healthy bones by regulating calcium levels in the body. Lack of sufficient vitamin D in very young children results in rickets, which can be easily prevented by vitamin D supplements. Only recently the scientific community has become aware of a much broader role for vitamin D. For example, we now know that, in addition to its role in maintaining bone health, vitamin D is involved in differentiation of tissues during development and in proper functioning of the immune system.
In fact, over 900 different genes are now known to be able to bind the vitamin D receptor, through which vitamin D mediates its effects. In addition to protecting against rickets, evidence now strongly indicates that a plentiful supply of vitamin D helps to protect against bone fractures in the elderly. Evidence also continues to accumulate suggesting a beneficial role for vitamin D in protecting against autoimmune diseases, including multiple sclerosis and type I diabetes, as well as some forms of cancer, particularly colorectal and breast.
Vitamin D is present in only a few foods (e.g., fatty fish), and is also added to fortified milk, but our supply typically comes mostly from exposure to ultraviolet rays (UV) in sunlight. UV from the sun converts a biochemical in the skin to vitamin D, which is then metabolized to calcitriol, its active form and an important hormone. Formation of vitamin D by UV can be 6 times more efficient in light skin than dark skin, which is an important cause of the known widespread vitamin D deficiency among African Americans living in northern latitudes. Dark skin has been selected during evolution because it protects against the burning UV rays of the sun in the tropics.
White skin, by contrast, was selected because it allows enough UV exposure to make sufficient vitamin D at northern (high) latitudes. Thus, fair-skinned northerners are at risk in Australia or Arizona for sunburns and UV-induced cancer, while dark-skinned people in the northern U.S. or at European latitudes with little exposure to the sun are at risk for rickets, bone fractures and possibly other diseases, including several types of cancer, due to a lack of vitamin D. Fortunately, sunscreens and vitamin D supplements are inexpensive.
McCann & Ames point out that evidence for vitamin D's involvement in brain function includes the wide distribution of vitamin D receptors throughout the brain. They also discuss vitamin D's ability to affect proteins in the brain known to be directly involved in learning and memory, motor control, and possibly even maternal and social behavior. The review also discusses studies in both humans and animals that present suggestive though not definitive evidence of cognitive or behavioral consequences of vitamin D inadequacy. The authors discuss possible reasons for the apparent discrepancy between the biological and behavioral evidence, and suggest new, possibly clarifying avenues of research.
Many vitamin D experts advise that the currently recommended level of vitamin D intake is much too low and should be raised to protect against bone fractures and possibly cancer in addition to rickets (2). Indeed, even using present guidelines, too many Americans have low vitamin D blood levels. McCann & Ames propose that, despite uncertainty regarding all of the deleterious effects of vitamin D inadequacy, the evidence overall indicates that supplementation, which is both inexpensive and prudent, is warranted for groups whose vitamin D status is exceptionally low, particularly nursing infants, the elderly, and African Americans (e.g., see (3)).
This review is the fourth in a series by McCann & Ames that critically evaluate scientific evidence linking deficiencies in micronutrients (the approximately 40 vitamins, minerals, amino acids, and fatty acids required for the body to function) to brain function. Other reviews in the series discuss the long-chain polyunsaturated fatty acid docosahexaenoic acid (DHA) (4, 5), choline (6), and iron (7).
"This critical analysis of vitamin D function and the brain is a model of careful thinking about nutrition and behavior", says Gerald Weissmann, MD, Editor-in-Chief of the FASEB Journal "One wishes that all studies of nutritional supplements or requirements were this thoughtful. Drs. McCann and Ames deftly show that while vitamin D has an important role in the development and function of the brain, its exact effects on behavior remain unclear. Pointing to the need for further study, the authors argue for vitamin D supplementation in groups at risk."
Vitamin D has long been known to promote healthy bones by regulating calcium levels in the body. Lack of sufficient vitamin D in very young children results in rickets, which can be easily prevented by vitamin D supplements. Only recently the scientific community has become aware of a much broader role for vitamin D. For example, we now know that, in addition to its role in maintaining bone health, vitamin D is involved in differentiation of tissues during development and in proper functioning of the immune system.
In fact, over 900 different genes are now known to be able to bind the vitamin D receptor, through which vitamin D mediates its effects. In addition to protecting against rickets, evidence now strongly indicates that a plentiful supply of vitamin D helps to protect against bone fractures in the elderly. Evidence also continues to accumulate suggesting a beneficial role for vitamin D in protecting against autoimmune diseases, including multiple sclerosis and type I diabetes, as well as some forms of cancer, particularly colorectal and breast.
Vitamin D is present in only a few foods (e.g., fatty fish), and is also added to fortified milk, but our supply typically comes mostly from exposure to ultraviolet rays (UV) in sunlight. UV from the sun converts a biochemical in the skin to vitamin D, which is then metabolized to calcitriol, its active form and an important hormone. Formation of vitamin D by UV can be 6 times more efficient in light skin than dark skin, which is an important cause of the known widespread vitamin D deficiency among African Americans living in northern latitudes. Dark skin has been selected during evolution because it protects against the burning UV rays of the sun in the tropics.
White skin has been selected for allowing as much UV exposure to make sufficient vitamin D in Northern (high) latitudes. Thus, fair-skinned northerners are at risk in Australia or Arizona for sunburns and UV-induced cancer, while dark-skinned people in the Northern U.S. or European latitudes with little exposure to the sun are at risk for rickets, bone fractures and possibly other diseases including several types of cancer due to a lack of vitamin D. Fortunately sun-screens and vitamin D supplements are inexpensive.
McCann & Ames point out that evidence for vitamin D's involvement in brain function includes the wide distribution of vitamin D receptors throughout the brain. They also discuss vitamin D's ability to affect proteins in the brain known to be directly involved in learning and memory, motor control, and possibly even maternal and social behavior. The review also discusses studies in both humans and animals that present suggestive though not definitive evidence of cognitive or behavioral consequences of vitamin D inadequacy. The authors discuss possible reasons for the apparent discrepancy between the biological and behavioral evidence, and suggest new, possibly clarifying avenues of research.
Many vitamin D experts advise that the currently recommended level of vitamin D intake is much too low and should be raised to protect against bone fractures and possibly cancer in addition to rickets (2). Indeed, even using present guidelines, too many Americans have low vitamin D blood levels. McCann & Ames propose that, despite uncertainty regarding all of the deleterious effects of vitamin D inadequacy, the evidence overall indicates that supplementation, which is both inexpensive and prudent, is warranted for groups whose vitamin D status is exceptionally low, particularly nursing infants, the elderly, and African Americans (e.g., see (3)).
This review is the fourth in a series by McCann & Ames that critically evaluate scientific evidence linking deficiencies in micronutrients (the approximately 40 vitamins, minerals, amino acids, and fatty acids required for the body to function) to brain function. Other reviews in the series discuss the long-chain polyunsaturated fatty acid docosahexaenoic acid (DHA) (4, 5), choline (6), and iron (7).
Low Blood Levels Of Vitamin D May Be Associated With Depression In Older Adults
Older adults with low blood levels of vitamin D and high blood levels of a hormone secreted by the parathyroid glands may have a higher risk of depression, according to a new report.
About 13 percent of older individuals have symptoms of depression, and other researchers have speculated that vitamin D may be linked to depression and other psychiatric illnesses, according to background information in the article. "Underlying causes of vitamin D deficiency such as less sun exposure as a result of decreased outdoor activity, different housing or clothing habits and decreased vitamin intake may be secondary to depression, but depression may also be the consequence of poor vitamin D status," the authors write. "Moreover, poor vitamin D status causes an increase in serum parathyroid hormone levels." Overactive parathyroid glands are frequently accompanied by symptoms of depression that disappear after treatment of the condition.
Witte J. G. Hoogendijk, M.D., Ph.D., and colleagues at VU University Medical Center, Vrije Universiteit Amsterdam, the Netherlands, measured blood levels of vitamin D and parathyroid hormone and assessed symptoms of depression among 1,282 community residents age 65 to 95. Of those individuals, 26 had a diagnosis of major depressive disorder, 169 had minor depression and 1,087 were not depressed. The average blood vitamin D level was 21 nanograms per milliliter and the average parathyroid hormone level was 3.6 picograms per milliliter.
Blood vitamin D levels were 14 percent lower in individuals with major and minor depression (average, 19 nanograms per milliliter) compared with non-depressed participants (average, 22 nanograms per milliliter). In addition, parathyroid hormone levels were an average of 5 percent higher in those with minor depression (average, 3.72 picograms per milliliter) and 33 percent higher in those with major depressive disorder (average, 4.69 picograms per milliliter) than in those who were not depressed (average, 3.53 picograms per milliliter).
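The reported percentages follow directly from the group averages quoted above, as this back-of-the-envelope check shows; it uses only the figures reported in the article.

# Arithmetic check of the reported differences, using the quoted group averages.
vit_d_depressed, vit_d_control = 19.0, 22.0              # ng/mL
pth_minor, pth_major, pth_control = 3.72, 4.69, 3.53     # pg/mL

print((vit_d_control - vit_d_depressed) / vit_d_control * 100)  # ~14 percent lower
print((pth_minor - pth_control) / pth_control * 100)            # ~5 percent higher
print((pth_major - pth_control) / pth_control * 100)            # ~33 percent higher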
The findings may be important to patients because both low blood vitamin D levels and high parathyroid hormone levels can be treated with higher dietary intake of vitamin D or calcium and increased sunlight exposure. "Moreover, the clinical relevance of the present study is underscored by our finding that 38.8 percent of men and 56.9 percent of women in our community-based cohort had an insufficient vitamin D status," they conclude. Additional studies are needed to determine whether changes in levels of vitamin D and parathyroid hormone precede depression or follow it.
Journal reference: Arch Gen Psychiatry. 2008;65[5]:508-512.
This study was supported by a clinical fellow grant from the Netherlands Organisation for Scientific Research.
Folic Acid, B Vitamins Not Linked To Reduced Risk Of Cardiovascular Events In High-risk Women
Women at high-risk of cardiovascular disease who took a daily supplement of folic acid and vitamin B6 and B12 for seven years did not have an overall reduced rate of cardiovascular events, despite a significant lowering of homocysteine levels, according to a new study.
"Homocysteine [an amino acid produced by the body] levels have been directly associated with cardiovascular risk in observational studies; and daily supplementation with folic acid, vitamin B6, vitamin B12, or a combination have been shown to reduce homocysteine levels to varying degrees in intervention studies," the authors write. Observational data suggest cardiovascular benefits from B-vitamin supplementation may be greater among women, yet women have been underrepresented in published randomized trials. "Given the paucity of data on women and the known influences of estrogen on homocysteine levels, adequately powered randomized trials of homocysteine lowering in women are still needed."
Christine M. Albert, M.D., M.P.H., of Brigham and Women's Hospital and Harvard Medical School, Boston, and colleagues tested whether a combination of folic acid, vitamin B6 and vitamin B12 would reduce total cardiovascular events among women at high risk for the development of cardiovascular disease (CVD) over 7 years of follow-up. Within an ongoing randomized trial of antioxidant vitamins, 5,442 women who were U.S. health professionals age 42 years or older, with either a history of CVD or three or more coronary risk factors, were enrolled in a randomized trial to receive a combination pill containing folic acid (2.5 mg), vitamin B6 (50 mg), and vitamin B12 (1 mg) or a matching placebo.
During the 7.3 years of follow-up, 796 participants (14.6 percent) experienced a confirmed CVD event included in the primary end point (heart attack, stroke, coronary revascularization, or CVD death), with some individuals experiencing more than one event. There was no difference in the cumulative incidence of the primary combined end point in the active vs. placebo treatment groups at any time during study follow-up. A total of 406 women (14.9 percent) in the active treatment group and 390 (14.3 percent) in the placebo group experienced at least one cardiovascular event included in the primary end point.
When analyzed separately, there were no significant differences for each of the components of the primary outcome including heart attack, stroke, and CVD death, between the active treatment and placebo groups. Also, the risk of death from any cause was similar between the active and placebo treatment groups.
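A quick arithmetic sketch reproduces the reported event rates and shows how close the two groups were; it assumes, as the trial design implies but the article does not state explicitly, that the 5,442 women were split roughly evenly between the two arms.

# Rough reconstruction of the reported event rates (equal-arm split is an assumption).
n_per_group = 5442 / 2                       # ~2,721 women per arm
events_active, events_placebo = 406, 390

rate_active = events_active / n_per_group    # ~0.149, i.e. 14.9 percent
rate_placebo = events_placebo / n_per_group  # ~0.143, i.e. 14.3 percent
print(rate_active, rate_placebo, rate_active / rate_placebo)   # relative risk ~1.04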
The researchers also found that the average plasma homocysteine level was 18.5 percent lower in the active group than that observed in the placebo group.
"Our results are consistent with prior randomized trials performed primarily among men with established vascular disease and do not support the use of folic acid and B vitamin supplements as preventive interventions for CVD in these high-risk fortified populations," the authors write.
Journal reference: JAMA. 2008;299[17]:2027-2036.
Editorial: Homocysteine-Lowering B Vitamin Therapy in Cardiovascular Prevention--Wrong Again?
In an accompanying editorial, Eva Lonn, M.D., M.Sc., F.R.C.P.C., of McMaster University, Hamilton, Ontario, Canada, comments on the findings of Albert and colleagues.
"... B vitamin supplements cannot currently be recommended for the prevention of CVD events (with the exception of rare genetic disorders) and there is no role for routine screening for elevated homocysteine levels. However, ongoing clinical research should provide further evidence on whether there may be any role for homocysteine-lowering B vitamin supplements in CVD prevention and for the overall importance of homocysteine as a CV risk factor."
"Homocysteine [an amino acid produced by the body] levels have been directly associated with cardiovascular risk in observational studies; and daily supplementation with folic acid, vitamin B6, vitamin B12, or a combination have been shown to reduce homocysteine levels to varying degrees in intervention studies," the authors write. Observational data suggest cardiovascular benefits from B-vitamin supplementation may be greater among women, yet women have been underrepresented in published randomized trials. "Given the paucity of data on women and the known influences of estrogen on homocysteine levels, adequately powered randomized trials of homocysteine lowering in women are still needed."
Christine M. Albert, M.D., M.P.H., of Brigham and Women's Hospital and Harvard Medical School, Boston, and colleagues tested whether a combination of folic acid, vitamin B6 and vitamin B12 would reduce total cardiovascular events among women at high risk for the development of cardiovascular disease (CVD) over 7 years of follow-up. Within an ongoing randomized trial of antioxidant vitamins, 5,442 women who were U.S. health professionals age 42 years or older, with either a history of CVD or three or more coronary risk factors, were enrolled in a randomized trial to receive a combination pill containing folic acid (2.5 mg), vitamin B6 (50 mg), and vitamin B12 (1 mg) or a matching placebo.
During the 7.3 years of follow-up, 796 participants (14.6 percent) experienced a confirmed CVD event included in the primary end point (heart attack, stroke, coronary revascularization, or CVD death), with some individuals experiencing more than one event. There was no difference in the cumulative incidence of the primary combined end point in the active vs. placebo treatment groups at any time during study follow-up. A total of 406 women (14.9 percent) in the active treatment group and 390 (14.3 percent) in the placebo group experienced at least one cardiovascular event included in the primary end point.
When analyzed separately, there were no significant differences for each of the components of the primary outcome including heart attack, stroke, and CVD death, between the active treatment and placebo groups. Also, the risk of death from any cause was similar between the active and placebo treatment groups.
The researchers also found that the average plasma homocysteine level was 18.5 percent lower in the active group than that observed in the placebo group.
"Our results are consistent with prior randomized trials performed primarily among men with established vascular disease and do not support the use of folic acid and B vitamin supplements as preventive interventions for CVD in these high-risk fortified populations," the authors write.
Journal reference: JAMA. 2008;299[17]:2027-2036.
Editorial: Homocysteine-Lowering B Vitamin Therapy in Cardiovascular Prevention--Wrong Again?
In an accompanying editorial, Eva Lonn, M.D., M.Sc., F.R.C.P.C., of McMaster University, Hamilton, Ontario, Canada, comments on the findings of Albert and colleagues.
"... B vitamin supplements cannot currently be recommended for the prevention of CVD events (with the exception of rare genetic disorders) and there is no role for routine screening for elevated homocysteine levels. However, ongoing clinical research should provide further evidence on whether there may be any role for homocysteine-lowering B vitamin supplements in CVD prevention and for the overall importance of homocysteine as a CV risk factor."
Vitamin D Linked To Reduced Mortality Rate In Chronic Kidney Disease
For patients with moderate to severe chronic kidney disease (CKD), treatment with activated vitamin D may reduce the risk of death by approximately one-fourth, suggests a study in the August Journal of the American Society of Nephrology.
Many patients with advanced CKD take the drug calcitriol, an oral form of activated vitamin D, to treat elevated levels of parathyroid hormone. "Although activated vitamin D is known to influence many biological processes, previous clinical knowledge is limited to its effect on parathyroid hormone levels," explains Dr. Bryan Kestenbaum of the University of Washington in Seattle, one of the study authors.
The study included 1,418 patients who had stage 3 to 4 CKD, which means moderately to severely reduced kidney function. All patients also had high parathyroid hormone levels (hyperparathyroidism), which can contribute to weakening of the bones in CKD. The researchers identified one group of patients who were being treated with calcitriol to lower their parathyroid hormone levels and another group who were not receiving calcitriol.
During a two-year follow-up period, mortality rates were compared for patients who were and were not taking calcitriol. "We then adjusted for differences in age, kidney function, parathyroid hormone levels, other illnesses, and other medications," says Dr. Kestenbaum.
In the adjusted analysis, the overall risk of death was about 26 percent lower for patients taking calcitriol. Patients on calcitriol were also less likely to develop end-stage renal disease, requiring dialysis to replace lost kidney function.
Overall, treatment with calcitriol was associated with a 20 percent reduction in the risk of either death or dialysis. The reduction in mortality with calcitriol was unrelated to its effect on parathyroid hormone levels.
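The percentage reductions quoted above can be read as approximate hazard ratios. As a rough, hedged illustration (the article reports percentage reductions rather than the hazard ratios themselves, so these values are inferred, not taken from the paper), a 26 percent lower risk corresponds to a hazard ratio of about 0.74, and a 20 percent reduction to about 0.80:

```python
# Rough sketch: convert the reported relative risk reductions into the
# hazard ratios they imply (reduction = 1 - hazard ratio). These values are
# inferred from the percentages quoted in the article, not taken from the paper.

def hazard_ratio_from_reduction(percent_reduction):
    """Return the hazard ratio implied by a stated relative risk reduction."""
    return 1.0 - percent_reduction / 100.0

hr_death = hazard_ratio_from_reduction(26)              # ~0.74 for death alone
hr_death_or_dialysis = hazard_ratio_from_reduction(20)  # ~0.80 for death or dialysis

print(f"implied hazard ratio, death:             {hr_death:.2f}")
print(f"implied hazard ratio, death or dialysis: {hr_death_or_dialysis:.2f}")
```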
"Recently, there has been an increased focus on the effects of vitamin D beyond those on bone health," Dr. Kestenbaum comments. "Vitamin D deficiency has been associated with risk factors for cardiovascular disease, such as high blood pressure, diabetes, and inflammation." Previous studies have suggested that treatment with intravenous vitamin D can improve survival in patients on hemodialysis.
The new results suggest that treatment with oral activated vitamin D may also improve survival in patients with CKD who do not yet require dialysis. "Randomized clinical trials are needed to test the hypothesis that vitamin D therapy can improve cardiovascular health and survival in CKD," Dr. Kestenbaum adds. "Future studies should also examine the role of non-activated vitamin D, which is less expensive and less toxic."
The study has some important limitations, including a lack of data on other factors that may have affected survival in patients taking calcitriol. Also, since the study included mainly older, white men, the results may not apply to younger, more ethnically diverse populations with CKD.
Tuesday, 6 May 2008
Allergic-like Reactions Can Occur In Premedicated Patients
Allergic-like reactions can occur in patients (both children and adults) when given gadolinium-containing contrast agents, even if they have been pre-medicated with corticosteroids and antihistamines, according to a recent study conducted by researchers at the University of Michigan Health System in Ann Arbor.
"We pre-medicate patients at our institution who have a history of prior allergic-like reaction to gadolinium-containing contrast agents," said Jonathan R. Dillman, M.D., lead author of the study.
"Pre-medication is sometimes also considered in patients who have a history of prior severe allergic-like reaction to another substance (including iodinated contrast material)," said Dr. Dillman. "While we know from previous studies that allergic-like reactions may occur following pre-medication in the setting of repeat iodinated contrast material injections (the so-called 'breakthrough reaction'), we were uncertain if this phenomenon also occurred in the setting of repeat gadolinium-containing contrast material administration," he said.
The researchers reviewed contrast material reaction forms from the institution's department of radiology over a five-year period. According to the study, eight patients experienced nine allergic-like reactions (one patient experienced two breakthrough reactions) after being administered a gadolinium-containing contrast agent despite being pre-medicated.
Of these reactions, six were mild and three were moderate. There were no severe or fatal breakthrough reactions. All patients who experienced breakthrough reactions had a history of allergic-like reactions to either gadolinium- or iodine-containing contrast media.
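For a sense of scale, the severity breakdown reported above can be tallied directly. This is a simple tabulation of the nine breakthrough reactions described in the study, nothing more:

```python
# Simple tally of the breakthrough reactions reported in the study:
# nine reactions in eight premedicated patients, none severe or fatal.

reactions = {"mild": 6, "moderate": 3, "severe": 0, "fatal": 0}
total = sum(reactions.values())

for severity, count in reactions.items():
    share = count / total if total else 0.0
    print(f"{severity:>8}: {count} ({share:.0%})")
```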
"While we believe that pre-medication likely decreases an individual's risk of allergic-like reaction to gadolinium-containing contrast material, our study concludes that 'breakthrough reactions' do occur. Radiologists, therefore, must be available to treat an allergic-like reaction following gadolinium-containing contrast material administration, even when a patient has been pre-medicated with corticosteroids and antihistamines," said Dr. Dillman.
"We pre-medicate patients at our institution who have a history of prior allergic-like reaction to gadolinium-containing contrast agents", said Jonathan R. Dillman, MD, lead author of the study.
"Pre-medication is sometimes also considered in patients who have a history of prior severe allergic-like reaction to another substance (including iodinated contrast material)," said Dr. Dillman. "While we know from previous studies that allergic-like reactions may occur following pre-medication in the setting of repeat iodinated contrast material injections (the so-called 'breakthrough reaction'), we were uncertain if this phenomenon also occurred in the setting of repeat gadolinium-containing contrast material administration," he said.
The researchers reviewed contrast material reaction forms from the institution's department of radiology over a five-year period. According to the study, eight patients experienced nine allergic-like reactions (one patient experienced two breakthrough reactions) after being administered a gadolinium-containing contrast agent despite being pre-medicated.
Of these reactions, six were mild and three were moderate. There were no severe or fatal breakthrough reactions. All patients who experienced breakthrough reactions had a history of allergic-like reactions to either gadolinium or iodine containing contrast media.
"While we believe that pre-medication likely decreases an individual's risk of allergic-like reaction to gadolinium-containing contrast material, our study concludes that 'breakthrough reactions' do occur. Radiologists, therefore, must be available to treat an allergic-like reaction following gadolinium-containing contrast material administration, even when a patient has been pre-medicated with corticosteroids and antihistamines," said Dr. Dillman.