Tuesday, April 20, 2010
Tiny, Slippery Symmetries: In Search of the Bizarro Electron
My friends the particle physicists look for clues to the secrets of the universe: they want to know how things are put together, or, to put it better—how things are coming apart, given that the universe is expanding. They’re tracing the clues back to the beginning, when all things were one.
That sounds so cosmic. Better than the Bible.
So I thought that when Robbie Pattie, a research fellow with the NCSU physics department, showed me his work space, I’d be seeing something…well, cool. I thought he’d lead me over to some eyepiece, and I’d look in, and there would be the Big Bang repeating, or something.
Nope. He showed me what looked like a 1-inch USB port in a block of wood and a cold keg of helium. (See above. Photo courtesy of Dave Baker.)
Okay—it was REALLY cold. The helium. It was 4 Kelvin, which works out to about -269 Celsius or -452 Fahrenheit. So if you want to argue semantics, Robbie did show me something “cool.”
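If you want to check that conversion yourself, it’s one line of arithmetic (my own sketch, in Python, using the standard formulas):

    # Kelvin to Celsius and Fahrenheit.
    kelvin = 4.0
    celsius = kelvin - 273.15            # about -269.15
    fahrenheit = celsius * 9 / 5 + 32    # about -452.47
    print(f"{kelvin} K = {celsius:.2f} C = {fahrenheit:.2f} F")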
The really cool stuff is too small to see, though. Down in that thing that looks like a USB port is a battery with each pole hooked to a tiny bronze plate (think middle school shop class electronics). Hovering (actually, doing little mini-orbits, little loops) between the plates is a neutron from that 4 Kelvin helium (there’s none of that in shop class). Robbie’s colleague Chris is trying to measure the ways that the electric field between the plates affects the neutron’s motion.
If the neutron's loop stays the same, we aren’t too excited. We know that the neutron has a magnetic moment: it is built from charged particles called quarks, so even though its total charge is neutral, it has poles. But if the neutron wavers in its little orbit, that means the electric field generated by the battery is affecting it; it has an electric dipole moment. That would mean physics we have never observed before is tugging it toward the field. And that is extremely—cosmically—exciting.
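For the formula-minded, here is the standard textbook picture (general physics background, not a description of the NCSU apparatus): a neutron sitting in parallel magnetic and electric fields B and E precesses at a frequency set by

    hν = 2μB ± 2dE

where μ is the magnetic moment, d is the electric dipole moment, and the sign depends on whether E points along or against B. Flip the electric field and watch for a tiny shift in ν: no shift means d is zero, or at least too small to see.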
Most particle physicists think that there’s a dark half of the universe. A Bizarro half made of matter that we can’t see, even though its gravity tugs on everything we can see. They call it dark matter, and we know something is out there because galaxies spin and cluster as if they hold far more mass than the stuff we can observe.
Electric dipole moment—if Chris’s neutron has it—would be evidence of the Bizarro electron. Proof of the other side. If Chris can find it, physicists can adjust their picture of how the universe works.
Did you get that? ADJUST THEIR PICTURE OF HOW THE UNIVERSE WORKS. That’s what physicists do.
The NCSU group's work also adds to a body of scholarship that looks for the materials that work best in these kinds of experiments, which use ultracold neutrons (remember that 4 Kelvin helium?). If you want to see the cool stuff Robbie builds in a lab out in Los Alamos (think shop class, again), check out this PowerPoint of a paper he contributed to. It was presented at an ultracold neutron conference at the end of last year in Santa Fe, New Mexico.
Monday, April 19, 2010
Breaking Down Pancreatitis
The pancreas is that underappreciated organ beneath your stomach that nobody pays attention to until it swells to the size of a foot-long sub and jams into your nearby internal organs. This swelling, commonly known as acute pancreatitis, has many causes, including the introduction of scorpion venom into an animal’s body. Researchers Keith Weninger of North Carolina State’s Department of Physics and Paul Fletcher of East Carolina University’s Department of Microbiology and Immunology have recently completed a study of the effects of scorpion venom on protein and enzyme production in the pancreases of guinea pigs. According to the researchers, “Clinical studies report that scorpion venoms induce significant pathology, including acute pancreatitis in humans following envenomation.” Approximately 80,000 people are affected by acute pancreatitis every year in the U.S., a number which could be significantly reduced using data collected by Weninger and his team.
Image: the Brazilian scorpion Tityus serrulatus (http://www.latoxan.com/VENOM/SCORPION/IMG/Tityus-serrulatus.jpg)
It is important to note that while scorpion stings do occur in the U.S., they are widespread worldwide, especially in Africa and India. Additionally, according to the researchers, “Secretagogues of non-scorpion venom origin used by others can also produce similar effects but require excessive levels of administration in vivo in order to achieve those results.”
In their experiments, swelling of the pancreas was caused by a toxin-induced failure of normal vesicular traffic, the basis of intracellular protein transport. Toxin molecules attacked proteins referred to as vesicle-associated membrane proteins (VAMPs), rendering them unable to move other proteins through the pancreas. Basically, pancreatitis set in because the toxin disabled the pancreatic cells’ ability to release or absorb components.
This transport process is called “vesicle fusion,” and it works like so: a cargo-carrying vesicle merges its membrane with a target membrane so that the vesicle’s contents can be released into another compartment or out of the cell, with VAMPs helping to drive the merger. Weninger and Fletcher report that “understanding of these functions is fundamental to extending knowledge of transport in normal and diseased cells.” Additionally, “Simultaneous cleavage of multiple SNAREs, such as VAMP2, VAMP3, and VAMP8, would presumably have major physiological consequences.”
The cutting of protein molecules, known as proteolysis, occurred between the soluble N-ethylmaleimide-sensitive factor attachment protein receptor (SNARE) motif and the transmembrane anchor. This cleavage was reportedly performed by an enzyme called antarease, a newly discovered metalloprotease whose sequence was not previously present in amino acid sequence databases.
Before their experiments, no scorpion toxins had been associated with intracellular targets. Therefore, “a definitive function in pathogenesis for the metalloprotease activity of scorpion venom remains to be determined beyond a theoretical role”.
According to Dr. Weninger, “results from the experiments have important implications for potential effects on secretory discharge as well as vesicular transport mechanisms in the exocrine process.” Understanding the effects of VAMP cleavage by a metalloprotease will lead to a better understanding of the mechanisms responsible for pancreatitis and potential treatments or cures. Vesicle fusion has been explored in recent years as a method of cellular-level drug introduction.
Results of the experiment were published in the March edition of the Journal of Biological Chemistry.
Fletcher Jr., Paul L., et al. “Vesicle-associated Membrane Protein (VAMP) Cleavage by a New Metalloprotease from the Brazilian Scorpion Tityus serrulatus.” Journal of Biological Chemistry 285.10 (2010): 7405-7416. jbc.org. Web. 06 Apr. 2010.
Revamped Design Increases Computer Efficiency
It's hard to imagine that single processors in computers once ran so hot they could melt the very chips they were engineered for. Now they are designed to split up the workload to counter this incredible heat. However, this has presented new challenges for today's engineers at both the hardware and software level as far as maintaining and increasing a computer's computational speed is concerned. Luckily, researchers at North Carolina State University seem to have just raised the bar in this matter by devising a way to increase the efficiency of modern processors by up to twenty percent.
To understand what is going on under the hood of these machines, a quick overview is in order:
Computers have a so-called “brain” known as the Central Processing Unit (CPU) or core. This is where all of the computations take place when executing an application/program such as your everyday web-browser.
The calculations necessary to run a program can be split up into separate tasks called “threads” (a process known as parallelization), which can then be computed simultaneously on multiple cores, making for a very fast means of computation.
Unfortunately, some programs are difficult to split up into threads because of their sequential nature. Their tasks depend on the outcomes of other threads in order to continue computation, limiting their use of multi-core systems and slowing execution.
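Here is a toy illustration of that difference (my own example, not from the paper): independent work spreads across cores, while a dependent chain can only crawl along one step at a time.

    # Hypothetical example: independent vs. dependent work.
    from concurrent.futures import ProcessPoolExecutor

    def square(x):
        return x * x

    if __name__ == "__main__":
        # Independent tasks: each square() can run on a different core.
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(square, range(8)))

        # Dependent tasks: each step needs the previous result,
        # so only one core can make progress at a time.
        value = 2
        for _ in range(8):
            value = square(value)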
Process execution traditionally alternates a calculation step with a memory-management step that frees up memory and prepares data for storage. For difficult-to-parallelize programs, this two-step task can only be completed on a single core, slowing things down significantly.
The researchers achieved their twenty percent efficiency gain by treating the memory-management step as a separate thread (MMT), allowing both steps to execute simultaneously. This ensures utilization of multiple cores, increasing processing speed. Programs frequently request that memory be used or freed up for various reasons. These requests now pass through a small layer of code between the program and the operating system that determines how they should be satisfied.
Their method takes these memory-management requests and lumps them together, then predicts how the requests should be handled to satisfy the needs of the program. Dr. James Tuck, assistant professor of electrical and computer engineering at NC State, says: “As it turns out, programs are very predictable.” Predictable in the sense that it is possible to calculate the next memory-management request before it happens. So, by handling requests in bulk and ahead of time, the work is already done when the program needs it. These requests are completed on a different core from the one running the calculation thread, via the MMT. “It's like having two cooks in a kitchen...” Dr. Tuck explains. One cook does the mixing while the other prepares the ingredients. The mixing, or computation, is the important part of satisfying program requests, and the preparation of the ingredients is what the MMT takes care of.
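To make the two-cooks analogy concrete, here is a minimal sketch of the idea (a hypothetical illustration of my own, not the researchers' actual implementation, which works down at the level of the memory allocator): the computation thread hands its cleanup requests to a helper thread that batches them up on another core.

    # Hypothetical MMT-style sketch: computation on one thread,
    # batched memory management on another.
    import queue
    import threading

    free_requests = queue.Queue()   # requests passed to the memory-management thread

    def memory_manager():
        """Collect free() requests and handle them in bulk."""
        batch = []
        while True:
            obj = free_requests.get()
            if obj is None:         # shutdown signal
                break
            batch.append(obj)
            if len(batch) >= 64:    # satisfy requests in bulk, as the paper describes
                batch.clear()       # stand-in for actually releasing the memory
        batch.clear()

    mmt = threading.Thread(target=memory_manager)
    mmt.start()

    for i in range(10_000):
        data = [i] * 100            # the computation thread allocates...
        total = sum(data)           # ...does the "mixing"...
        free_requests.put(data)     # ...and hands the cleanup to the MMT

    free_requests.put(None)         # tell the helper thread to finish
    mmt.join()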
This exploitation of parallelism produced some interesting findings, the most significant being that the MMT approach is independent of existing applications. In layman’s terms, MMT allows for a boost of speed without having to alter pre-existing code, and it is effectively transparent at the user level. This is good news, considering that a large overhaul of complicated programs such as common web browsers and word processors is not necessary, saving lots of time and money for a possible twenty percent increase in speed.
The scientific article “MMT: Exploiting Fine-Grained Parallelism in Dynamic Memory Management” can be found at http://www.ece.ncsu.edu/arpers/Papers/MMT_IPDPS10.pdf
A visual of an Intel Core 2 Duo processor.
Sunday, April 18, 2010
Polypropylene -- The Fiber that Could
Tissue scaffolds, gym socks, and potable water…items you would never think could go together are being linked through a material you use every day. That material is one of the most industrially important polymers in the world: polypropylene (PP). The applications of PP fibers that link these three unrelated topics are only the beginning of what can be accomplished by changing the surface chemistry of the original fibers.
Research at North Carolina State University (NCSU) in the Department of Chemical Engineering is making headway in changing the chemistry of fiber surfaces through a process known as “atomic layer deposition,” or ALD. ALD deposits a uniform layer of a chosen chemical across an entire fiber surface, or really across any surface you want it on.
Chris Hanson, an undergraduate researcher in the lab of Gregory Parsons at NCSU, has been using ALD to modify the surface chemistry of the common polymer, PP, in an effort to produce a product much more valuable than the original plastic (which is commonly used for making various consumer disposable bottles).
“Nonwoven polypropylene (which is again used for consumer bottles) is incredibly cheap to produce, but the original inert surface renders the fibers not useful for specialty applications such as advanced bio-filtration. Our goal is to show that modifying the surface chemistry of PP by adding tiny layers of aluminum oxide can alter how water adheres to the surface,” said Chris. Demonstrating a change in how water adheres to the fiber surfaces shows that a change really is occurring on the fibers.
The research was done on small squares of PP placed inside a homemade reactor (depicted in the image). To build up multiple aluminum oxide layers on the same surface, a layer of aluminum oxide is deposited, followed by a chemical that makes the surface ready to accept the next aluminum oxide layer. The researchers repeated this cyclical procedure to obtain PP samples with anywhere from zero to 100 layers of aluminum oxide, in increments of ten (where the thickest coating of 100 layers corresponds to a thickness of ~10 nanometers). A drop of water was then placed on each sample, and the angle between the water and the sample surface was measured to determine how well the sample repelled or attracted the water.
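Those figures imply a remarkably fine level of control. A quick back-of-the-envelope calculation (mine, using only the numbers above) shows each deposition cycle adds roughly a tenth of a nanometer, about the width of a single atom:

    # Estimated ALD growth per cycle, from the article's figures.
    cycles = 100
    total_thickness_nm = 10.0
    growth_per_cycle = total_thickness_nm / cycles   # ~0.1 nm per cycle

    # Estimated thickness of each sample in the study (0 to 100 layers, steps of ten).
    for n in range(0, 101, 10):
        print(f"{n:3d} cycles -> ~{n * growth_per_cycle:.1f} nm of aluminum oxide")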
In this study, the researchers were able to show that depositing layers of aluminum oxide on the surface of the originally “water-repelling” polymer does change the polymer to have water-attracting properties.
So why does it matter that a surface can be changed from having water-repelling to water-attracting properties?
“Once the surface chemistry of the inert, water-repelling PP has been made to attract water, it can undergo chemical reactions much more easily and can then be used to make advanced bio-filters, clothes that destroy bacteria, water filters, etc,” said Chris. When looking at the specific application to anti-microbial coatings and what is already on the market, the new feature of this technology is that it can be used to deposit a more robust set of microbe killers than what has been used before, opening up many new opportunities.
The future of using ALD to change the surface chemistry of inert materials is promising not only to researchers, but to every person in the world who will use a product born from this technology.
Steven Burgess
Peer Reviewed Literature:
Hyde, G. K. et al "Atomic Layer Deposition and Abrupt Wetting Transitions on Nonwoven Polypropylene and Woven Cotton Fabrics." Langmuir 26.4 (2010): 2550-2558.
http://pubs.acs.org/doi/abs/10.1021/la902830d
New Scale Helps Diagnose OCD
You begin organizing the cereal boxes by height. A strange calm washes over you.
You’ve just experienced five minutes in the life of someone with obsessive-compulsive disorder.
Nearly everyone experiences some level of obsessive-compulsive behavior. Obsessive-compulsive disorder, or OCD, is characterized by intrusive thoughts, or obsessions, and repetitive actions, or compulsions. In its most severe form, OCD can severely inhibit a person’s ability to function normally. Time that could be spent with friends is instead consumed by the need to respond to the obsessions.
Before a psychiatrist or psychologist begins therapy with a person showing OCD symptoms, he or she must determine whether the patient indeed has OCD or a different anxiety disorder. To make a definite diagnosis, doctors use one of several diagnostic tests. However, according to Dr. Jonathan Abramowitz, each one has flaws.
So he and several of his colleagues from across the country developed their own measure for OCD, called the Dimensional Obsessive-Compulsive Scale (DOCS). In a 2010 peer-reviewed article in Psychological Assessment, Abramowitz introduces the DOCS and explains how it diagnoses symptom severity more accurately than the Obsessive-Compulsive Inventory-Revised (OCI-R) and distinguishes OCD symptoms more clearly than the Yale-Brown Obsessive-Compulsive Scale (Y-BOCS), the two most widely used OCD measures.
A research psychologist from the University of North Carolina at Chapel Hill, Abramowitz argues that obsessions and compulsions work as pairs. He groups these obsession-compulsion pairs into four areas, or dimensions: Contamination, Responsibility for Harm, Unacceptable Thoughts, and Symmetry. The OCI-R and Y-BOCS also include Hoarding.
Unlike some other OCD diagnostic tests, the DOCS doesn’t separate obsessions from compulsions. In fact, it doesn’t ask questions about specific behaviors at all. Instead, the scale asks five questions for each OCD dimension, covering time spent on the behavior, avoidance of situations, distress over unwelcome thoughts, disruption of life, and difficulty ignoring OCD thoughts. The DOCS introduces two new elements to OCD diagnostics: it adds the concept of avoidance, and it removes Hoarding from the scale.
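To picture how such a scale tallies severity, here is a hypothetical scoring sketch based only on the structure described above (four dimensions, five questions each; the 0-4 rating per question is my assumption, so check Abramowitz’s article for the real scoring rules):

    # Hypothetical DOCS-style scoring sketch (assumed 0-4 rating per question).
    DIMENSIONS = ["Contamination", "Responsibility for Harm",
                  "Unacceptable Thoughts", "Symmetry"]

    def score(responses):
        """responses maps each dimension to its five question ratings."""
        subtotals = {dim: sum(responses[dim]) for dim in DIMENSIONS}
        return subtotals, sum(subtotals.values())

    example = {dim: [1, 0, 2, 1, 0] for dim in DIMENSIONS}
    subtotals, total = score(example)
    print(subtotals)   # severity within each dimension
    print(total)       # overall severity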
Abramowitz tested the DOCS on three different groups: people diagnosed with OCD, people diagnosed with other anxiety disorders, and college students from Tennessee, Florida, and Arkansas. Members from each group completed the DOCS and one or more of six other tests, including two tests for OCD, three scales for anxiety disorders and one scale for depression.
When compared with results from the other scales, the DOCS results were more similar to results from the other two OCD scales than they were to the anxiety or depression scales. In addition to being a useful clinical tool in diagnosing OCD, the DOCS also shows promise in measuring treatment outcomes.
Source: Abramowitz, J., et al. (2010). Assessment of Obsessive-Compulsive Symptom Dimensions: Development and Evaluation of the Dimensional Obsessive-Compulsive Scale. Psychological Assessment, 22(1), 180-198.
Cleaning Mickey Mouse
It is everywhere, the “Mickey Mouse” of molecules: H2O, or water. It is essential for all life: animal, plant, and even bacterial. Bacteria can live in our water pipes, which can pose a risk to human health. In a 2009 article, “Effect of free chlorine application on microbial quality of drinking water in chloraminated distribution systems,” researchers from Chile, Massachusetts, and North Carolina developed a new method of measuring bacterial counts in water by flow cytometry.
Bacteria eat the ammonia produced by decaying materials and release nitrate and nitrite into the water in a process called nitrification. When pregnant women drink nitrate- and nitrite-rich water, the nitrate competes with the oxygen in the blood of the fetus, and the child can be born with blue-tinted skin. This is blue baby syndrome, which results from low oxygen levels in the bloodstream.
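For the chemistry-minded, nitrification is usually summarized as a two-step reaction. This is the textbook version (my addition, not taken from the article):

    2 NH4+ + 3 O2 → 2 NO2− + 4 H+ + 2 H2O    (ammonia to nitrite)
    2 NO2− + O2 → 2 NO3−                      (nitrite to nitrate)

Different groups of bacteria handle each step, and the nitrite and nitrate they leave behind are what end up in the water.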
The distribution systems (DS), the networks that carry water from treatment plants to taps, in this research used chloramines as their primary method of disinfection, due to chloramines’ ability to kill bacteria without producing high levels of harmful by-products.
The trade-off is higher nitrification levels in chloramine-treated water. In an effort to reduce nitrification levels in tap water, North Carolina law requires water treatment facilities that use chloramination as their primary disinfection method to flush the system with free chlorine (hypochlorous acid) one month a year.
While the free chlorine reduces nitrification levels, it presents other health problems. Free chlorine reacts with the water to make harmful by-products like chloroform, a known carcinogen, which prevents its widespread use.
In areas where water is in constant use, like urban and suburban centers, the public doesn’t face as much risk as people near dead-end sites. A “dead end site” is essentially a place where water sits or pools for long periods of time, like a school during summer (1).
The researchers looked at a closed community center and found that, before the introduction of free chlorine into the DS, the water showed high levels of nitrification. To move the free-chlorinated water into the community center, the researchers used hydrant flushing, opening up a water source in order to get water moving. Once the free-chlorinated water reached the community center, nitrification went down, but chloroform levels increased. A second hydrant flushing was needed to remove the chloroform-rich water; however, it was not done.
According to Dr. Detlef Knappe, one of the study’s researchers, the flow cytometry method of counting bacteria will hopefully help engineers and scientists strike a balance between bacterial and harmful chemical effects.
Whether by informing the design of water systems that circulate water to eliminate dead-end sites or by helping balance chemical disinfectants against bacterial counts, flow cytometry will help provide clean and safe water to the public.
For more information on the concepts in this blog please check out:
(1)Rosenfeldt EJ, Baeza C, Knappe DR. 2009. Effect of free chlorine application on microbial quality of drinking water in chloraminated distribution systems. American Water Works Association 101(10):60-70.
Growing Technology that Grows
Runoff from construction sites has long been a major factor in polluting our rivers and lakes. The EPA has even issued a new rule requiring all construction sites over 10 acres to reduce the pollution in their runoff to about 100 times lower than typical discharge levels.
Luckily, research out of NC State University shows how a remarkable polymer can be used to reduce pollution in road-construction runoff by up to 98% compared with current methods.
Professors R.A. McLaughlin, S.E. King, and G.D. Jennings are taking advantage of a remarkable polymer called polyacrylamide (PAM). PAM is a water-soluble, synthetic polymer that expands when it comes into contact with water.
So, when it rains, PAM dissolves into the runoff and binds to tiny sediment particles, causing them to clump together and settle. Not only does this stop the sediments from polluting a river or lake, but it also creates a physical barrier that slows the runoff.
When used in conjunction with fiber check dams (FCDs) consisting of straw wattles and coir logs, construction companies might actually have a chance of meeting the EPA’s requirements.
Construction companies’ current method of using rock check dams is highly ineffective and even more expensive than McLaughlin et al.’s method of using fiber check dams with PAM.
This research has major implications in the United States, where some 45% of rivers and streams are considered impaired.
When water carries high levels of suspended particles, it is called turbid water. High turbidity in drinking water can shield disease-causing bacteria from ultraviolet sterilization; drinking such water can cause nausea, vomiting, and dizziness, and long-term consumption of turbid water can lead to gastrointestinal disease or death.
McLaughlin et al. tested three different systems for controlling erosion at two roadway projects under construction in North Carolina.
The first experimental section of the first roadway was treated with the standard technique used by construction companies:
“[They] are narrow sediment traps in the ditch along with rock check dams,” McLaughlin said.
The team treated the second experimental section of the first roadway with fiber check dams (FCDs) consisting of straw wattles and coir logs.
And the team treated the third experimental section of the first roadway using fiber check dams with granulated PAM added to each. The PAM is granulated rather than liquid so that when it rains, it dissolves into the runoff, thickening the barrier of the dam and weighing down the sediments.
The team compared the three systems by measuring the turbidity, or amount of suspended sediment, in the water.
Turbidity is measured in nephelometric turbidity units (NTU), and the EPA requires construction-site turbidity to stay below 280 NTU. At site one, the average turbidity of the storm-water runoff was 3,913 NTU for rock check dams. The fiber check dams with PAM, however, averaged only 34 NTU.
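Running those reported numbers (a quick check of my own, in Python) shows how dramatic the improvement is:

    # Quick check of the reported turbidity figures.
    rock_ntu = 3913      # average turbidity with rock check dams
    pam_ntu = 34         # average turbidity with fiber check dams plus PAM
    epa_limit = 280      # EPA limit in NTU

    reduction = (rock_ntu - pam_ntu) / rock_ntu
    print(f"Reduction: {reduction:.1%}")             # roughly 99%
    print(f"Under the EPA limit: {pam_ntu < epa_limit}")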
"Because we did this research before the rule was issued, we have good confidence that we can train the industry to attain the turbidity goal following key elements we have determined are necessary based on our research," McLaughlin said.
McLaughlin et al. published their research, titled “Improving construction site runoff quality with fiber check dams and polyacrylamide,” in the Journal of Soil and Water Conservation 64(2):144-154.
http://www.jswconline.org/content/64/2/144.full.pdf+html
Monday, April 5, 2010
Justice is blind--but can it be deaf?
If I were reading this text aloud to you, your brain could use the sound of my voice—the way I combined vowels and consonants, the subtle shifts in inflection, the vocabulary I used, the order of those words, and maybe even the frequencies I produced—to determine my ethnicity. And your brain would very likely get it right.
Can you tell “what” I am from the slant of this typeface? I didn’t think so.
The field of sociolinguistics studies how language is socially constructed—that is, how who we are determines how we say what we say. And any good sociolinguist will tell you that, as listeners, we place almost as much value on the “how” as we do on the “what”: we judge people according to how they sound. Speech is as important as skin color to our assessments of—and potential discrimination against—people who are different from us. Sociolinguists call such discrimination linguistic profiling.
(Linguistic profiling can be used for more than just discrimination. Voices can be analyzed like fingerprints and used to ID individuals; such analysis can solve crime or open doors with voice recognition technology.)
But sociolinguists have a hard job. It’s nearly impossible to sort out all of the variables that contribute to our judgments about who someone is. Erik Thomas, Jeff Reaser, and Walt Wolfram at NCSU have spent parts of their careers trying to narrow it down. In a forthcoming chapter in Linguistic Profiling and Linguistic Human Rights, Thomas reviews several decades’ worth of scholarship, and the big conclusion is…a question mark. We know that people can make good guesses about ethnicity from phonological features (sounds alone, no syntax or vocabulary clues), but we don’t know which phonological features are the key.
Thomas writes that studies—most of which use recorded samples of African American (or “black”) speech as well as other control samples to survey listeners—have determined that there are prototypical features of African American speech. Sociolinguists call speech that contains such features “marked” speech. Studies also show that non-prototypical black speech (i.e., the speech of black speakers from isolated southern towns, which shows more features typical of southern, not black, speech) is less recognizable to white listeners. But more prototypical black speech, even without marked words and syntax, is recognizable.
The key, says Thomas, is in the way that African American speakers use vowels and consonants and in the rhythm and intonation—or prosody—of their speech. But what’s the recipe? What features, when absent, erase the speaker’s race?
In 2004, Thomas and his colleague Jeff Reaser collected samples from African American speakers in Hyde County, North Carolina—a region where black speech contains fewer prototypical features. They then used a computer program that allowed them to modulate vowel sounds to tweak, or monotonize, prototypical /o/ sounds. When they compared listener responses to the monotonized /o/s against responses to the unchanged samples with “marked” /o/s, they found that the vowel helped listeners identify African American speakers. (A control group of speakers from the Piedmont, where black speech has many more prototypical features—sounds, words, and syntax—was much easier for listeners to identify.)
A follow-up study using different analytical methods and extending the listener sample to include West Virginia University students as well as NCSU students is forthcoming. Its results imply that listeners use different cues to determine the ethnicity of males and females.
Sunday, April 4, 2010
Slow-growing Minds May Imply Adult Schizophrenia
One thousand participants from New Zealand were used to establish a testing group for this research. They were all born between 1972 and 1973 and then tested periodically throughout their growing years for cognitive impairments characteristic of schizophrenic patients. It was shown that participants who were considered schizophrenic in their adult years had displayed related developmental difficulties as early as age 7. These difficulties involve a child's visual learning, working memory, processing speed, attention span, verbal reasoning, and solving of visual-spatial problems.
The research acknowledges three fundamental yet unanswered questions within its scope of study: What is the developmental course of schizophrenia prior to its onset? Do different cognitive characteristics follow similar or different developmental paths? And are there developmental difficulties specific to schizophrenia?
Testing for these disabilities began at age 3 and continued until age 13 in two-year intervals, a methodology that is part of the Dunedin Multidisciplinary Health and Development Study. Furthermore, between the ages of 7 and 13, a loss of between 0.17 and 0.26 years of mental age was apparent in children who would later be diagnosed with schizophrenia.
By the age of 32, 2.5 percent of the monitored participants met diagnostic criteria for schizophrenia, and 1 percent met the formal criteria; only those meeting the formal criteria had been hospitalized and put on antipsychotic medication. The conclusions consisted of two findings evident between childhood and early adolescence in the children who grew up to develop schizophrenia: upon entering primary school, they struggled with verbal reasoning, and as they aged, they lagged behind their peers in working memory, attention, and processing speed.
Now what does all this mean? Schizophrenia is not an all-of-a-sudden occurrence, but is related to a child's development. Children who went on to develop schizophrenia lagged behind their peers in school and continued to do so. Initially their verbal skills were poor, and as they aged, they developed other hindrances to learning. Their minds did grow, but their difficulty making sense of the world pushed them toward social isolation and delusion. This study offers a unique view not so much into why schizophrenia occurs but into what traits are shared by those diagnosed. Future research will focus on the causes of the disease based on these stages of human development.
Friday, April 2, 2010
Feeling good: the perception of what is “pleasant” to the touch
It is important for us to understand how the touch of various materials influences our emotional responses, as the sense of touch is one of the major ways we, as humans, gather information about the world. Now, scientists have revealed that males and females have different perceptions of what is pleasant to the touch.
Previous investigations of pleasant touch suggested that soft and smooth materials are pleasant and that stiff, rough, or coarse ones are unpleasant. However, these investigations did not have quantitative data to back them up, nor did they consider the possibility of differences between males and females.
A research team led by Greg Essick of UNC Chapel Hill decided to fill in these gaps in the knowledge of tactile stimulation, or what they call “pleasantness to touch.” Their study, published online in Neuroscience & Biobehavioral Reviews on February 21, 2010, sought to make a quantitative assessment of people’s perceptions of pleasant touch. In other words, by using controlled experiments and gathering concrete data, the scientists could identify what and where females feel pleasantness differently than males.
In the experiment, a rotary tactile stimulator (pictured below) was used to control how materials brushed across the skin while participants entered a “pleasantness rating” on a scale from 100% unpleasant to 100% pleasant. “Pleasantness” of contact was treated as affective touch: touch that evokes a positive emotional response to tactile stimulation.
There were 21 male and 22 female participants. After repeating the process several times (16 stimulus trials per participant), the researchers obtained ratings of the pleasantness of different textured materials stroked across the skin at multiple body sites, at controlled velocities and forces of application.
Their data supports previous results that smooth stimuli are pleasant, and that they continue to be pleasant even with increased force. Conversely, rough stimuli start out neutral and become more unpleasant as the force is increased.
The most unexpected findings were that males found stimulation of the forehead, particularly with terry toweling and denim, unpleasant, whereas female participants found terry toweling and denim unpleasant on the hand and thigh. For the more pleasant materials, male and female responses were similar when tested on the hand, forearm, and thigh. Interestingly, male participants found stimulation of the calf more pleasant than female participants did.
This was the first study to conclude, based on data, that there are differences between male and female responses to unpleasant materials. The researchers believe this phenomenon is most likely due to sex-dependent mechanical responses of the skin.
We all know that males and females have areas that are more sensitive than others, but who would have thought there are some pretty mundane differences in addition to the sexual ones that usually come to mind?
It is only a first step, but as we continue to learn about, and assign values to the different emotional responses to touch, we can begin to understand the psychology of why people are attracted by some things and repulsed by others.
Health Effects of Gillnets on Sea Turtles along the Cape Fear River
In an effort to decrease mortality rates, North Carolina law requires fishermen to supervise their gillnets during the summer months. But since gillnets are left in the water (“soaked”) for an average of twelve hours, many fishermen don’t supervise their nets for the full time, in an effort to cut costs (1).
Many sea turtles escape the gillnets only to die later on from boating accidents, predators, or their injuries. In an effort to decrease sea turtle mortality, researchers from the University of North Carolina-Wilmington and Grice Laboratory in Charleston, South Carolina analyzed sea turtle blood biochemistry to determine the likelihood of survival after entanglement and release.
The researchers captured eighteen sea turtles using a mesh gillnet soaked for a maximum of six hours from May to October. To ensure the safety of the animals, the sea turtles were immediately brought to the surface if they were in danger of drowning.
Once a sea turtle was untangled, the researchers drew blood immediately after it was brought on board. They then tested the turtle’s reflexes, checked for injuries, and monitored its behavior before releasing it about ten meters from the capture site.
The scientists rated the turtles from A to D, with A meaning perfect condition and D being low activity, severe injuries, and missing or delayed reflexes.
Only three of the eighteen sea turtles rated as an A. The blood work of those that did not showed elevated levels of lactate, LDH (lactate dehydrogenase), and other chemicals that accompany an increase in metabolism. Due to the increased metabolism, the sea turtles have to spend longer periods of time at the surface in order to recover, and these longer surface times greatly increase their risk of death.
The researchers hoped their study would help determine a maximum unattended soak time for fishermen that would also minimize the unintentional capture of sea turtles. Because other factors also lead to sea turtle deaths, the scientists recommended that the current restriction remain. They also recommend that captured sea turtles be brought onboard so their physical well-being can be assessed.
They explain that fishermen can gently touch a turtle’s tail or eyelid to assess its reflexes and do a simple visual inspection to check for injuries. They also suggest that fishermen take turtles to a rehabilitation facility at the first sign of injury or distress.
1. Snoddy, J. E. (2009). Blood Biochemistry of Sea Turtles Captured in Gillnets in the Lower Cape Fear River North Carolina, USA. The Journal of Wildlife Management, 8(73), 1394-1402.
Thursday, April 1, 2010
Pesticides, Urine and Vacuum Dust: What’s the Connection?
According to Starr, almost no one can avoid the presence of pesticides. Even the neatest, most compulsive housekeeper who eats organic food and shuns furry companions may get at least slight exposure to pesticides in the air. Most of the time, the exposure goes unnoticed. Although most people won’t ingest even a fraction of the pesticides needed to make them sick, the Centers for Disease Control and Prevention periodically assesses people’s exposure to pyrethroid pesticides by examining their urine. Pyrethroids are a type of pesticide commonly used to control insects on farms, in homes, and on pets.
In the urine, Starr says, the metabolized parent pesticide and the degradates look identical when they are analyzed. Starr wanted to know if urinary analyses were accurately measuring the amount of the original pyrethroids in a person’s body or if some of the person’s exposure came from the less toxic degradates. If people were inhaling both the parent pesticide and the degradates, results from urinary analyses would be unnecessarily alarming. So he decided to test dust samples vacuumed from people’s homes.
To sterilize the dust samples, Starr sent them to a lab in Maryland to be zapped with gamma radiation. Dust is as nasty as it seems: it’s a collection of skin cells, bacteria, hair, dirt, and any other particles flying around in the air. Starr says he only works with dust inside a fume hood.
“If you can smell the contents of the bag when you vacuum, you are redistributing dust throughout your home,” he says.
Thanks to Starr, we can all breathe easier.
Reference: Starr, J., Graham, S., Stout, D., Andrews, K. and Nishioka, M. Pyrethroid pesticides and their metabolites in vacuum cleaner dust collected from homes and day-care centers. Environmental Research, 108 (2008) 271-279.
Sovereign Wealth Funds: Risk and Reward
Jim Allen
So what’s the big deal about Norway? “It’s a hereditary monarchy and parliamentary democracy, cold, and has universal health care, but what’s risky about that?” one might ask. The Norwegian Sovereign Wealth Fund (SWF) performed better than its private-sector counterparts, with a higher associated risk, according to a recent study published by North Carolina State researchers in The World Economy. Funds like the one in Norway are large and growing, and are only recently coming into the public light.
Sovereign Wealth Funds are basically large government investment funds which function like mutual funds in terms of risk and return on investment. While not all SWFs are the same, they each have a similar goal: “to achieve maximum income subject to a level of risk and certain portfolio constraints specified by its Ministry of Finance.” In Norway, the Fund allows government spending to be smoothed out relative to the volatile pattern of the nation’s oil revenue. SWFs are essentially savings accounts for governments; citizens have banks, and now government entities do also. Many nations which have SWFs are also oil producing nations; SWFs provide a stabilizing effect for nations which have erratic economies related to fluctuations in oil prices. These nations include the United Arab Emirates, Saudi Arabia, Norway, and even the U.S.
According to the research, “Total assets held by SWFs have been estimated to be around US$3 trillion (Jen, 2007). This value is greater than the total assets of hedge funds (US$2 trillion) but less than total official monetary reserves of central banks (US$6 trillion)” (see the figure below). By 2013, the holdings of SWFs are estimated to exceed US$6 trillion. These are just approximations, because until now the literature on SWFs has been limited and descriptive at best. The large amount of money held worldwide by SWFs was quite alarming to the researchers at NC State when they looked at the relative return ratios and how much risk the SWFs were taking to achieve those returns.
Researchers found that the Norway Fund had average monthly returns of 0.36%, while the Social Choice Fund (generally accepted as a balanced and fiscally responsible fund) had a return of 0.16%. To put this in context, the research also included data on return on investment versus risk. Not surprisingly, the increased returns carried increased risk: the Norway Fund’s return-to-risk ratio was 0.14, while the Social Choice Fund’s was much lower, at 0.07. Critics also point out mistakes made by SWFs in recent years, in particular investments in American corporations such as Citigroup, Morgan Stanley, and Merrill Lynch.
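Reading those figures back-of-the-envelope style (my own arithmetic, and assuming “return-to-risk ratio” means average monthly return divided by the volatility of those returns, which the article summary doesn’t spell out):

    # Implied monthly volatility under the assumed definition of the ratio.
    norway_return, norway_ratio = 0.36, 0.14   # percent, ratio
    social_return, social_ratio = 0.16, 0.07

    print(f"Norway implied risk: {norway_return / norway_ratio:.2f}% per month")
    print(f"Social Choice implied risk: {social_return / social_ratio:.2f}% per month")

Under that assumption, Norway took on somewhat more volatility (roughly 2.6% versus 2.3% per month) in exchange for more than double the return.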
This research effort with the Norwegian SWF (referred to as “The Fund”) is one of the first of its kind, since the Norwegian government has allowed a level of transparency unrivaled by other funds. Researchers stated that “The Fund demonstrated transparency by providing detailed and reliable information about its activities in a timely manner.” Because of this, researchers were able to analyze important data about how The Fund affects global markets and what kinds of returns its investors receive.
Caner, Mehmet, and Thomas Grennes. “Sovereign Wealth Funds: The Norwegian Experience.” World Economy 33.4 (2010): 597-614. Wiley Interscience. Web. 1 Mar. 2010.
Nightmare on Hog Street
I’m talking, of course, about the foul, foul, foul odors emitted by hog and chicken farms. Driving two hours from
Working with Dr. James Kastner at the University of Georgia, NC State researcher Praveen Kolar has developed a way to break down these odors through a process called catalytic oxidation.
Catalytic oxidation uses specially-designed catalysts and ozone to break down the odor-causing compounds.
Catalysts are substances added to a process to start or change the rate of its chemical reaction; the catalyst is not consumed by the reaction itself, meaning it can be used many times.
Kolar and Kastner developed the catalysts by “coating structures made of activated carbon with a nanoscale film made of cobalt or nickel oxide.”
Activated carbon’s porous structure gives it a very large surface area on which to expose the odorous agents to the catalyst.
The cobalt and nickel oxide nanofilms make excellent catalysts, Kolar explains, “because they increase the rate of the chemical reaction between the odor-causing compounds and the ozone, making the process more efficient. They are also metals that are both readily available and relatively inexpensive.”
Another advantage of catalytic oxidation is that it takes place at room temperature, meaning there are no heating costs, and the only two byproducts created are carbon dioxide and water.
The current system, which uses chemical “scrubbers” to remove the odor-causing agents, has many disadvantages.
Most obviously, it is ineffective. All the empirical evidence you need is your nose.
Although Kolar has so far targeted industrial poultry farms, he plans to target hog farms next. “This technology could be applied to swine operations to address odors and ammonia emissions,” Kolar says. “My next step is to try to pursue this research on a large scale.”