Tuesday, April 20, 2010

Tiny, Slippery Symmetries: In Search of the Bizarro Electron


My friends the particle physicists look for clues to the secrets of the universe: they want to know how things are put together, or, to put it better—how things are coming apart, given that the universe is expanding. They’re tracing the clues back to the beginning, when all things were one.

That sounds so cosmic. Better than the Bible.

So I thought that when Robbie Pattie, a research fellow with the NCSU physics department, showed me his work space, I’d be seeing something…well, cool. I thought he’d lead me over to some eyepiece, and I’d look in, and there would be the Big Bang repeating, or something.
Nope. He showed me what looked like a 1-inch USB port in a block of wood and a cold keg of helium. (See above. Photo courtesy of Dave Baker.)

Okay—it was REALLY cold. The helium. It was 4 Kelvin, which works out to about -269 Celsius or -452 Fahrenheit. So if you want to argue semantics, Robbie did show me something “cool.”
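For anyone who wants to double-check that arithmetic, the conversions are simple; here's a minimal sketch in Python:

```python
# Convert the helium temperature quoted above from Kelvin to Celsius and Fahrenheit.
def kelvin_to_celsius(k):
    return k - 273.15

def celsius_to_fahrenheit(c):
    return c * 9 / 5 + 32

k = 4.0
c = kelvin_to_celsius(k)        # -269.15 C
f = celsius_to_fahrenheit(c)    # about -452.5 F
print(f"{k} K = {c:.2f} C = {f:.2f} F")
```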

The really cool stuff is too small to see, though. Down in that thing that looks like a USB port is a battery with each pole hooked to a tiny bronze plate (think middle school shop class electronics). Hovering (actually, doing little mini-orbits, little loops) between the plates is a neutron from that 4 Kelvin helium (there’s none of that in shop class). Robbie’s colleague Chris is trying to measure the ways that the electric field between the plates affects the neutron’s motion.

If the neutron's loop stays the same, we aren’t too excited. We know that the neutron has a magnetic moment: it’s built from charged quarks whose charges cancel out, so overall it’s electrically neutral, but it still has magnetic poles. But if the neutron wavers in its little orbit, that means the electric field generated by the battery is affecting it; it has an electric dipole moment. That would mean there’s something in there we’ve never observed before, tugging it toward the field. And that is extremely—cosmically—exciting.

Most particle physicists think that there’s a dark half of the universe. A Bizarro half made of matter that we can’t see, even though its gravity tugs on everything we can see. They call it dark matter, and we know it’s there because galaxies spin and clump as if they hold far more mass than the stuff that shines.

An electric dipole moment—if Chris’s neutron has it—would be evidence of the Bizarro electron. Proof of the other side. If Chris can find it, physicists can adjust their picture of how the universe works.

Did you get that? ADJUST THEIR PICTURE OF HOW THE UNIVERSE WORKS. That’s what physicists do.

The NCSU group's work also adds to a body of scholarship that looks for the materials that work best in these kinds of experiments. The neutrons they use are called ultracold neutrons (remember that 4 Kelvin helium?). If you want to see the cool stuff Robbie builds in a lab out in Los Alamos (think shop class, again), check out this PowerPoint of a paper he contributed to. It was presented at an ultracold neutron conference at the end of last year in Santa Fe, New Mexico.

Monday, April 19, 2010

Breaking Down Pancreatitis

Jim Allen

The pancreas is that underappreciated organ beneath your stomach which nobody pays attention to until it swells to the size of a foot-long sub and jams into the nearby internal organs. This swelling, commonly known as acute pancreatitis, has many causes, including the introduction of scorpion venom into an animal’s body. Researchers Keith Weninger of North Carolina State’s Department of Physics and Paul Fletcher of East Carolina University’s Department of Microbiology and Immunology have recently completed a study of the effects of scorpion venom on protein and enzyme production in the pancreas of guinea pigs. According to the researchers, “Clinical studies report that scorpion venoms induce significant pathology, including acute pancreatitis in humans following envenomation.” Approximately 80,000 people are affected by acute pancreatitis every year in the U.S., a number which could be significantly reduced using data collected by Weninger and his team.

http://www.latoxan.com/VENOM/SCORPION/IMG/Tityus-serrulatus.jpg

It is important to note that scorpion stings are widespread in the U.S., and they also occur worldwide, especially in Africa and India. Additionally, according to the researchers, “Secretagogues of non-scorpion venom origin used by others can also produce similar effects but require excessive levels of administration in vivo in order to achieve those results.”

In their experiments, swelling of the pancreas was caused by a toxin-induced failure of normal vesicular traffic, which is the basis for intracellular transport of proteins. Toxin molecules attacked proteins referred to as vesicle-associated membrane proteins (VAMPs), rendering them unable to help transport other proteins throughout the pancreas. Basically, pancreatitis was brought on by disabling the pancreatic cells’ ability to release or absorb components.

This transport process is called “vesicle fusion,” and it works like so: cargo is packed into a small membrane sac (a vesicle), VAMPs on the vesicle help it dock with a target membrane, the two membranes merge, and the contents are delivered to their new location. Weninger and Fletcher report that “understanding of these functions is fundamental to extending knowledge of transport in normal and diseased cells.” Additionally, “Simultaneous cleavage of multiple SNAREs, such as VAMP2, VAMP3, and VAMP8, would presumably have major physiological consequences.”

A cutting of the protein molecules, known as proteolysis, occurred between the soluble N-ethylmaleimide-sensitive factor attachment protein receptor (SNARE) motif and the transmembrane anchor. This cleavage was reportedly performed by an enzyme called antarease, a newly discovered metalloprotease whose sequence was not previously present in amino acid sequence databases.

Before their experiments, no scorpion toxins had been associated with intracellular targets. Therefore, “a definitive function in pathogenesis for the metalloprotease activity of scorpion venom remains to be determined beyond a theoretical role”.

According to Dr. Weninger, “results from the experiments have important implications for potential effects on secretory discharge as well as vesicular transport mechanisms in the exocrine process.” Understanding the effects of VAMP cleavage by a metalloprotease will lead to a better understanding of the mechanisms responsible for pancreatitis and potential treatments or cures. Vesicle fusion has been explored in recent years as a method of cellular-level drug introduction.


Results of the experiment were published in the March edition of the Journal of Biological Chemistry.


Fletcher, Paul L., Jr., et al. "Vesicle-associated Membrane Protein (VAMP) Cleavage by a New Metalloprotease from the Brazilian Scorpion Tityus serrulatus." Journal of Biological Chemistry 285.10 (2010): 7405-7416. jbc.org. Web. 6 Apr. 2010.

Revamped Design Increases Computer Efficiency

It's hard to imagine that single processors in computers were once pushed so fast that the heat they generated threatened to melt the very chips they were built on. Now processors are designed to split up the workload to counter that heat. However, this has presented new challenges for today's engineers, on both the hardware and software level, when it comes to maintaining and increasing a computer's computational speed. Luckily, researchers at North Carolina State University seem to have just raised the bar in this matter by devising a way to increase the efficiency of modern processors by up to twenty percent.

To understand what is going on under the hood of these machines, a quick overview is in order:

Computers have a so-called “brain” known as the Central Processing Unit (CPU) or core. This is where all of the computations take place when executing an application/program such as your everyday web-browser.

The calculations necessary to run a program can be split up into separate tasks called “threads”—a process known as parallelization—which can then be computed simultaneously on multiple cores, making for a very fast means of computation.

Unfortunately, some programs are difficult to split up into threads because of their sequential nature. They are dependent on the outcomes of other threads/programs in order to continue computation, limiting their usage of multi-core systems – slowing execution time.

Process execution traditionally takes place as a calculation step followed by a memory-management step to free up memory and prepare data for storage. For difficult-to-parallelize programs, this two-step task can only be completed on a single core, slowing things down significantly.

The researchers achieved their twenty percent efficiency gain by treating the memory-management step as a separate thread (MMT), allowing both steps to execute simultaneously. This ensures utilization of multiple cores, increasing processing speed. Programs frequently request that memory be allocated or freed up for various reasons. These requests are now passed through a small layer of code between the program and the operating system that determines how they should be satisfied.

Their method involves taking these memory-management requests and lumping them together. Then they predict how the requests should be handled to satisfy the needs of the program. Dr. James Tuck, assistant professor of electrical and computer engineering at NC State, says: “As it turns out, programs are very predictable.” Predictable in the sense that it is possible to anticipate the next memory-management request before it happens. So, when requests are predicted in bulk, time is saved because the work has already been done. These requests are then completed on a separate core from the one running the calculation thread, via the MMT. “It's like having two cooks in a kitchen...” Dr. Tuck explains. One cook does the mixing while the other prepares the ingredients. The mixing, or computation, is the important part of satisfying program requests, and the preparation of the ingredients is what the MMT takes care of.
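The actual MMT work happens down at the level of the memory allocator, but the basic idea—defer and batch memory-management requests, then hand the batch to a helper thread running on another core—can be sketched in a few lines. This is a conceptual illustration only, not the researchers' code:

```python
import threading
import queue

# A conceptual sketch of the "two cooks" idea: the main thread does the
# computation while a helper thread services batched memory-management
# requests (simulated here as simple bookkeeping) pulled from a shared queue.
requests = queue.Queue()

def memory_management_thread():
    while True:
        batch = requests.get()      # a batch of deferred "free this object" requests
        if batch is None:           # sentinel: no more work, shut down the helper
            break
        for obj_id in batch:
            pass                    # real work would free or recycle memory here

helper = threading.Thread(target=memory_management_thread)
helper.start()

# The "computation" thread: do the work and lump free requests into batches.
batch = []
for obj_id in range(1000):
    # ... compute with obj_id ...
    batch.append(obj_id)            # defer the free instead of doing it now
    if len(batch) == 64:            # hand off a whole batch at once
        requests.put(batch)
        batch = []

requests.put(batch)                 # flush the final partial batch
requests.put(None)                  # tell the helper we're done
helper.join()
```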

This exploitation of parallelization resulted in some interesting findings, the most significant being that the MMT approach is independent of existing applications. In layman's terms, MMT allows for a boost in speed without having to alter pre-existing code, and it is effectively transparent at the user level. This is good news, considering that a large overhaul of complicated programs such as common web browsers and word processors is not necessary—saving lots of time and money for a possible twenty percent increase in speed.

The scientific article MMT: Exploiting Fine-Grained Parallelism in Dynamic Memory Management can be found at http://www.ece.ncsu.edu/arpers/Papers/MMT_IPDPS10.pdf

A visual of an Intel Core 2 Duo processor.

Sunday, April 18, 2010

Polypropylene -- The Fiber that Could


Tissue scaffolds, gym socks, and potable water…items you would never think could go together are being linked through a material you use every day. The material is one of the most industrially important polymers in the world: polypropylene (PP). The applications using PP fibers that link these three unrelated topics are only the beginning of what can be accomplished by changing the surface chemistry of the original fibers.

Research at North Carolina State University (NCSU) in the department of Chemical Engineering is making headway into changing the chemistry of the fiber surfaces through a process known as “Atomic Layer Deposition”, or ALD. ALD is the process of depositing a uniform layer of a certain chemical across the entire fiber surface, or really across any surface you want it deposited on.

Chris Hanson, an undergraduate researcher in the lab of Gregory Parsons at NCSU, has been using ALD to modify the surface chemistry of the common polymer, PP, in an effort to produce a product much more valuable than the original plastic (which is commonly used for making various consumer disposable bottles).

“Nonwoven polypropylene (which is again used for consumer bottles) is incredibly cheap to produce, but the original inert surface renders the fibers not useful for specialty applications such as advanced bio-filtration. Our goal is to show that modifying the surface chemistry of PP by adding tiny layers of aluminum oxide can alter how water adheres to the surface”, said Chris. Illustrating the change in how water adheres to the fiber surfaces shows that a change is indeed occurring on the fibers.

The research was done on small squares of PP placed inside a homemade reactor (depicted in the image). In order to build up multiple aluminum oxide layers on the same surface, a layer of aluminum oxide is deposited, followed by a chemical that makes the surface ready to accept the next aluminum oxide layer. The researchers repeated this cyclical procedure to obtain PP samples with anywhere from zero to 100 aluminum oxide layers, in increments of ten (where the thickest coating of 100 layers corresponds to a thickness of ~10 nanometers). A drop of water was then placed on each sample, and the angle between the drop and the sample surface was measured to determine how well the sample repelled or attracted the water.
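From the numbers above (100 deposition cycles giving a coating roughly 10 nanometers thick), the growth per cycle works out to about 0.1 nm. A quick sketch of that arithmetic, plus the usual contact-angle convention for calling a surface water-attracting (the 90-degree cutoff is a standard rule of thumb, not a value from the paper):

```python
# Rough numbers from the description above: 100 ALD cycles ~ 10 nm of aluminum oxide.
total_thickness_nm = 10.0
cycles = 100
print(f"growth per cycle ~ {total_thickness_nm / cycles:.2f} nm")  # about 0.1 nm

# Contact-angle bookkeeping: by the common convention, a drop that spreads out
# (angle below 90 degrees) means the surface attracts water; a bead-like drop
# (angle above 90 degrees) means the surface repels it.
def attracts_water(contact_angle_deg):
    return contact_angle_deg < 90.0

print(attracts_water(35.0))   # True: water-attracting (hydrophilic)
print(attracts_water(120.0))  # False: water-repelling (hydrophobic)
```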

In this study, the researchers were able to show that depositing layers of aluminum oxide on the surface of the originally “water-repelling” polymer does change the polymer to have water-attracting properties.

So why does it matter that a surface can be changed from having water-repelling to water-attracting properties?

“Once the surface chemistry of the inert, water-repelling PP has been made to attract water, it can undergo chemical reactions much more easily and can then be used to make advanced bio-filters, clothes that destroy bacteria, water filters, etc,” said Chris. When looking at the specific application to anti-microbial coatings and what is already on the market, the new feature of this technology is that it can be used to deposit a more robust set of microbe killers than what has been used before, opening up many new opportunities.

The future of using ALD to change the surface chemistry of inert materials is promising not only to researchers, but to every person in the world who will use a product born from this technology.

Steven Burgess



Peer Reviewed Literature:

Hyde, G. K. et al "Atomic Layer Deposition and Abrupt Wetting Transitions on Nonwoven Polypropylene and Woven Cotton Fabrics." Langmuir 26.4 (2010): 2550-2558.

http://pubs.acs.org/doi/abs/10.1021/la902830d

New Scale Helps Diagnose OCD

Imagine being in the cereal aisle of the grocery store when suddenly you see something that doesn’t look right: boxes of various heights, randomly mixed together. You break out into a cold sweat; your heart thumps hard. You try to resist the urge to reorganize the boxes, but you can’t.

You begin organizing the cereal boxes by height. A strange calm washes over you.

You’ve just experienced five minutes in the life of someone with obsessive-compulsive disorder.


Nearly everyone experiences some level of obsessive-compulsive behavior. Obsessive-compulsive disorder, or OCD, is characterized by intrusive thoughts, or obsessions, and repetitive actions, or compulsions. In its most severe form, OCD can seriously inhibit a person’s ability to function normally. Time that could be spent with friends is instead consumed by the need to respond to the obsessions.

Before a psychiatrist or psychologist begins therapy with a person showing OCD symptoms, he or she must determine whether the patient indeed has OCD or a different anxiety disorder. To make a definite diagnosis, doctors use one of several diagnostic tests. However, according to Dr. Jonathan Abramowitz, each one has flaws.

So he and several of his colleagues from across the country developed their own measure for OCD, called the Dimensional Obsessive-Compulsive Scale (DOCS). In a 2010 peer-reviewed article in Psychological Assessment, Abramowitz introduces the DOCS and explains how it gauges symptom severity more accurately than the Obsessive-Compulsive Inventory-Revised (OCI-R) and distinguishes OCD symptoms more clearly than the Yale-Brown Obsessive-Compulsive Scale (Y-BOCS)—the two most widely used OCD measures.

A research psychologist from the University of North Carolina at Chapel Hill, Abramowitz argues that obsessions and compulsions work as pairs. He groups these obsession-compulsion pairs into four areas, or dimensions: Contamination, Responsibility for Harm, Unacceptable Thoughts, and Symmetry. The OCI-R and Y-BOCS also include Hoarding.

Unlike some other OCD diagnostic tests, the DOCS doesn’t separate obsessions from compulsions. In fact, it doesn’t ask questions about specific behaviors at all. The scale asks five questions for each OCD dimension: time spent on the behavior, avoidance of situations, distress over unwelcome thoughts, disruption of life, and difficulty in ignoring OCD thoughts. The DOCS introduces two new components to OCD diagnostics: adding the concept of avoidance, and removing Hoarding from the scale.
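Assuming each of those five items is rated on a simple 0-4 scale (a common format for instruments like this; the published scale spells out the exact wording and scoring), the bookkeeping per dimension is just a sum:

```python
# A minimal sketch of tallying a DOCS-style questionnaire. The 0-4 item scale
# here is an assumed format for illustration, not the published scoring rules.
DIMENSIONS = ["Contamination", "Responsibility for Harm",
              "Unacceptable Thoughts", "Symmetry"]

def dimension_score(item_ratings):
    """Sum the five item ratings for one dimension: time spent, avoidance,
    distress, disruption of life, and difficulty ignoring the thoughts."""
    assert len(item_ratings) == 5
    return sum(item_ratings)

# Hypothetical answers from one respondent.
responses = {
    "Contamination": [1, 2, 1, 0, 1],
    "Responsibility for Harm": [0, 0, 1, 0, 0],
    "Unacceptable Thoughts": [2, 1, 2, 1, 2],
    "Symmetry": [0, 1, 0, 0, 0],
}

for dim in DIMENSIONS:
    print(dim, dimension_score(responses[dim]))
```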

Abramowitz tested the DOCS on three different groups: people diagnosed with OCD, people diagnosed with other anxiety disorders, and college students from Tennessee, Florida, and Arkansas. Members from each group completed the DOCS and one or more of six other tests, including two tests for OCD, three scales for anxiety disorders and one scale for depression.

When compared with results from the other scales, the DOCS results were more similar to results from the other two OCD scales than they were to the anxiety or depression scales. In addition to being a useful clinical tool in diagnosing OCD, the DOCS also shows promise in measuring treatment outcomes.

Source: Abramowitz, J., et al. (2010). Assessment of Obsessive-Compulsive Symptom Dimensions: Development and Evaluation of the Dimensional Obsessive-Compulsive Scale. Psychological Assessment, 22(1), 180-198.

Cleaning Mickey Mouse


It is everywhere, the “Mickey Mouse” of molecules: H2O, or water. It is essential for all life—animal, plant, and even bacterial. Bacteria can also live in our water pipes, which can pose a risk to human health. In the 2009 article “Effect of free chlorine application on microbial quality of drinking water in chloraminated distribution systems,” researchers from Chile, Massachusetts, and North Carolina applied a new method of measuring bacteria levels in water: flow cytometry.

The old method, R2A agar, did not give scientists an accurate picture of the amount of bacteria in water. Flow cytometry, by contrast, uses a dye that causes organic material (bacteria) to glow, making it easier for scientists to count the number of bacteria present in a water sample.

Bacteria eat the ammonia produced by decaying materials and release nitrate and nitrite into the water in a process called nitrification. When pregnant women drink nitrate- and nitrite-rich water, the nitrate interferes with oxygen transport in the blood of the fetus, and the child can be born with blue-tinted skin—“blue baby syndrome,” which results from low oxygen levels in the bloodstream.

The distribution systems (DS), or water plants, in the research used chloramines as their primary method of disinfection because chloramines kill bacteria without producing high levels of harmful by-products.

The trade-off is higher nitrification levels in chloramine-treated water. In an effort to reduce nitrification levels in tap water, North Carolina law requires water treatment facilities that use chloramination as their primary disinfection method to flush the system with free chlorine (hypochlorous acid) one month a year.

While the free chlorine reduces nitrification levels, it presents other health problems. The free chlorine reacts with material in the water to form harmful by-products like chloroform, a known carcinogen, which limits its widespread use.

In areas where water is in constant use like urban and suburban centers, the public doesn’t face as much risk as people in dead end sites. A “dead end site” is essentially where water sits or pools for long periods of time, like a school during summer (1).

The researchers looked at a closed community center and found that before the introduction of free chlorine into the DS, the water showed high levels of nitrification. To move the free-chlorinated water into the community center, they used hydrant flushing—opening up a water source to get the water moving. Once the free-chlorinated water reached the community center, nitrification went down, but chloroform levels went up. A second hydrant flushing was needed to remove the chloroform-rich water; however, it was not done.

According to Dr. Detlef Knappe, one of the study’s researchers, the flow cytometry method of counting bacteria will hopefully help engineers and scientists strike a balance between bacterial levels and negative chemical effects.

Whether by helping engineers design water systems that circulate water to eliminate dead-end sites or by guiding the balance between chemical disinfectants and bacterial levels, flow cytometry will help provide clean and safe water to the public.

For more information on the concepts in this blog please check out:

(1)Rosenfeldt EJ, Baeza C, Knappe DR. 2009. Effect of free chlorine application on microbial quality of drinking water in chloraminated distribution systems. American Water Works Association 101(10):60-70.

Growing Technology that Grows

Runoff from construction sites has long been a major factor in polluting our rivers and lakes. The EPA has even issued a new rule requiring construction sites over 10 acres to cut the sediment load of their runoff to roughly 100 times less than what they typically discharge.

Luckily, research out of NC State University shows how a remarkable polymer can be used to reduce the sediment in runoff from road construction by up to 98% compared with current methods.

Professors R.A. McLaughlin, S.E. King, and G.D. Jennings are taking advantage of a remarkable polymer called polyacrylamide (PAM). PAM is a water-soluble, synthetic polymer that expands when it comes into contact with water.

So, when it rains, PAM dissolves into the runoff and binds to the suspended sediment, causing the tiny particles to clump together and settle. Not only does this stop the sediments from polluting a river or lake, but it also creates a physical barrier that slows runoff.

When used in conjunction with fiber check dams (FCDs) consisting of straw wattles and coir logs, construction companies might actually have a chance of meeting the new runoff requirements.

Construction companies’ current method of using rock check dams is highly ineffective and even more expensive than McLaughlin et al.’s method of using fiber check dams with PAM.

This research has major implications: according to a 2002 EPA study, 45% of rivers and streams in the United States are “impaired for their intended use, with sediment and siltation as the leading cause.”

Water with high levels of suspended particles is called turbid water. High turbidity in drinking water can shield disease-causing bacteria from ultraviolet sterilization; drinking such water can cause nausea, vomiting, and dizziness, and drinking it over the long term can lead to gastrointestinal disease or death.

McLaughlin et al. tested three different systems for controlling erosion at two roadway projects undergoing construction in the North Carolina mountains from June 2006 to March 2007. The team marked off three experimental sections next to each other at the first roadway project, and two experimental sections at the second roadway project.

The first experimental section of the first roadway was treated using the standard technique used by construction companies:

“[The standard approach] is narrow sediment traps in the ditch along with rock check dams,” McLaughlin said.

The team treated the second experimental section of the first roadway with fiber check dams (FCDs) consisting of straw wattles and coir logs.

And the team treated the third experimental section of the first roadway using fiber check dams with granulated PAM added to each. The PAM is granulated rather than liquid so that when it rains, it dissolves into the soil and thickens the barrier of the dam, as well as weighing down the sediments.

The team compared the three systems by measuring the turbidity, or the amount of sediments, in the water.

Turbidity is measured in nephelometric turbidity units (NTU). The EPA requires that turbidity in construction-site runoff be less than 280 NTU. At site one, the average turbidity of the storm-water runoff was 3,913 NTU for the rock check dams; for the fiber check dams with PAM, the average was only 34 NTU.
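Using those site-one figures, the relative improvement is easy to check; the back-of-the-envelope calculation below is mine, not a number from the paper:

```python
# Average turbidity at site one, as reported above (in NTU).
rock_check_dam_ntu = 3913
fcd_with_pam_ntu = 34

# Fractional reduction achieved by fiber check dams + PAM versus rock check dams.
reduction = (rock_check_dam_ntu - fcd_with_pam_ntu) / rock_check_dam_ntu
print(f"turbidity reduced by about {reduction:.0%}")   # ~99% at this site
print(fcd_with_pam_ntu < 280)                          # True: under the 280 NTU limit
```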

"Because we did this research before the rule was issued, we have good confidence that we can train the industry to attain the turbidity goal following key elements we have determined are necessary based on our research," McLaughlin said.

McLaughlin et al. published their research, titled “Improving construction site runoff quality with fiber check dams and polyacrylamide,” in the Journal of Soil and Water Conservation 64(2): 144-154.

http://www.swcs.org

http://www.jswconline.org.www.lib.ncsu.edu:2048/content/64/2/144.full.pdf+html

Monday, April 5, 2010

Justice is blind--but can it be deaf?

People accept that it’s wrong to judge based on skin color, but most people think that there’s a right (read: white, middle class, “standard”) way to speak.

If I were reading this text aloud to you, your brain could use the sound of my voice—the way I combined vowels and consonants, the subtle shifts in inflection, the vocabulary I used, the order of those words, and maybe even the frequencies I produced—to determine my ethnicity. And your brain would very likely get it right.

Can you tell “what” I am from the slant of this typeface? I didn’t think so.

The field of sociolinguistics studies how language is socially constructed—that is, how who we are determines how we say what we say. And any good sociolinguist will tell you that, as listeners, we place almost as much value on the “how” as we do on the “what”: we judge people according to how they sound. Speech is as important as skin color to our assessments of—and potential discrimination against—people who are different from us. Sociolinguists call such discrimination linguistic profiling.



(Linguistic profiling can be used for more than just discrimination. Voices can be analyzed like fingerprints and used to ID individuals; such analysis can solve crime or open doors with voice recognition technology.)

But sociolinguists have a hard job. It’s nearly impossible to sort out all of the variables that contribute to our judgments about who someone is. Erik Thomas, Jeff Reaser, and Walt Wolfram at NCSU have spent parts of their careers trying to narrow it down. In a forthcoming chapter in Linguistic Profiling and Linguistic Human Rights, Thomas reviews several decades’ worth of scholarship, and the big conclusion is…a question mark. We know that people can make good guesses about ethnicity from phonological features—sounds alone, no syntax or vocabulary clues—but we don’t know which phonological features are the key.

Thomas writes that studies—most of which use recorded samples of African American (or “black”) speech along with other control samples to survey listeners—have determined that there are prototypical features of African American speech. Sociolinguists call speech that contains such features “marked” speech. Studies also show that non-prototypical black speech (i.e., the speech of black speakers from isolated southern towns, which shows more marked features typical of southern, not black, speech) is less recognizable to white listeners. But more prototypical black speech—even without marked words and syntax—is recognizable.

The key, says Thomas, is in the way that African American speakers use vowels and consonants and in the rhythm and intonation—or prosody—of their speech. But what’s the recipe? What features, when absent, erase the speaker’s race?

In 2004, Thomas and his colleague Jeff Reaser collected samples from African American speakers in Hyde County, North Carolina—a region where black speech contains fewer prototypical features. They then used a computer program that allowed them to modulate vowel sounds to tweak, or monotonize, prototypical /o/ sounds. When they compared listener responses to the monotonized /o/s with responses to the unchanged, “marked” /o/s, they found that the vowel helped listeners identify African American speakers. (A control group from the piedmont, where black speech has many more prototypical features—sounds, words, and syntax—was much easier for listeners to identify.)

A follow-up study using different analytical methods and extending the listener sample to include West Virginia University students as well as NCSU students is forthcoming. Its results imply that listeners use different cues to determine the ethnicity of males and females.

Sunday, April 4, 2010

Slow-growing Minds May Imply Adult Schizophrenia

Have you ever had trouble paying attention as a child? Maybe difficulty processing new ideas or information? Based on the title and these two open-ended questions, you're probably thinking: “YES! I did! Does this mean I'm destined to develop schizophrenia?” Try to dislodge your heart from your throat and then take a deep breath. It's very unlikely. However, a recent study shows that these characteristics in children may provide clues toward the early detection of adulthood schizophrenia. Researchers from Duke University published an article in the American Journal of Psychiatry describing a long-term study of children’s developing minds, with the objective of finding a correlation between a child's cognitive shortcomings and the likelihood of developing schizophrenia in adulthood.

A group of 1,000 participants from New Zealand was used as the testing group for this research. They were all born between 1972 and 1973 and then tested periodically throughout their growing years for cognitive impairments characteristic of schizophrenic patients. Participants who were considered schizophrenic in their adult years had shown related developmental difficulties as early as age 7. These difficulties involved visual learning, working memory, processing speed, attention span, verbal reasoning, and the solving of visual-spatial problems.

To facilitate this research, three fundamental yet unanswered questions within the scope of study are acknowledged: What is the developmental course of schizophrenia prior to its onset? Do different cognitive characteristics follow similar or different developmental paths? And are there developmental difficulties specific to schizophrenia?

Testing for these impairments began at age 3 and continued until age 13 at two-year intervals, as part of the Dunedin Multidisciplinary Health and Development Study. Furthermore, it was noticed that between the ages of 7 and 13, a lag of between 0.17 and 0.26 in mental-age growth was apparent in children who would later be diagnosed with schizophrenia.

By the age of 32, 2.5 percent of the participants monitored met the diagnostic criteria for schizophrenia and 1 percent met the formal criteria. Only the participants who met the formal criteria were hospitalized and put on antipsychotic medication. The researchers’ conclusions consisted of two findings about the childhood and early adolescence of those who grew up to develop schizophrenia: upon entering primary school, they struggled with verbal reasoning, and as they aged, they lagged behind their peers in working memory, attention, and processing.

Now what does all this mean? Schizophrenia is not an all-of-a-sudden occurrence; it is related to a child's development. Children who developed schizophrenia lagged behind their peers in school and continued to do so. Initially their verbal skills were poor, and as they aged they developed other hindrances to learning. Their minds did grow, but their difficulty making sense of the world pushed them toward social isolation and delusion. This study offers a unique view not so much into why schizophrenia occurs but into what traits are shared by those diagnosed. Future research will focus on the causes of this disease based on these stages of human development.

Friday, April 2, 2010

Feeling good: the perception of what is “pleasant” to the touch

Something strokes your skin and you shiver with pleasure. Or is it displeasure? Whether it is pleasant or unpleasant, touch forms a cornerstone of social behavior in humans. Before we even opened our eyes as babies, we tried to make sense of our surroundings through our skins. We touch to learn, to show affection and feel pleasure.

It is important for us to understand how the touch of various materials influences our emotional responses, as the sense of touch is one of the major ways we, as humans, gather information about the world. Now, scientists have revealed that males and females have different perceptions of what is pleasant to the touch.

Previous investigations about pleasant touch suggested soft and smooth materials as pleasant; those that were stiff, rough, or coarse as unpleasant. However, these investigations did not have any numbers to back them up, nor did they consider the possibility of there being differences between males and females.

A research team led by Greg Essick of UNC Chapel Hill in North Carolina decided to fill in these gaps in the knowledge of tactile stimulation, or what they call “pleasantness-to-touch.” Their study, published online in Neuroscience & Biobehavioral Reviews on February 21, 2010, sought to make a quantitative assessment of people’s perceptions of pleasant touch. In other words, by using controlled experiments and gathering concrete data, the scientists can identify which materials, and on which parts of the body, females perceive pleasantness differently than males.

In the experiment, a rotary tactile stimulator (pictured below) was used to control how the materials brushed across the skin while a participant entered a “pleasantness rating” on a scale from 100% unpleasant to 100% pleasant. “Pleasantness” of contact was treated as affective touch, or touch that evoked a positive emotional response to tactile stimulation.




There were 21 male and 22 female participants. After repeating the process several times, 16 stimulus trials per participant, the researchers obtained ratings of pleasantness of different textured materials stroked across the skin of multiple body sites at controlled velocities and forces of application.
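Analyzing a data set like that largely comes down to averaging the ratings over trials and grouping them by sex, body site, and material. A toy sketch of that bookkeeping (the ratings below are invented for illustration, not the study's data):

```python
from collections import defaultdict
from statistics import mean

# Each record: (sex, body_site, material, pleasantness rating from -100 to +100).
# These values are invented for illustration only.
trials = [
    ("M", "forehead", "denim", -35), ("M", "forehead", "denim", -20),
    ("F", "hand", "terry", -15), ("F", "hand", "terry", -25),
    ("M", "calf", "satin", 40), ("F", "calf", "satin", 25),
]

# Group ratings by (sex, site, material) and report the average per group.
groups = defaultdict(list)
for sex, site, material, rating in trials:
    groups[(sex, site, material)].append(rating)

for key, ratings in groups.items():
    print(key, round(mean(ratings), 1))
```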

Their data supports previous results that smooth stimuli are pleasant, and that they continue to be pleasant even with increased force. Conversely, rough stimuli start out neutral and become more unpleasant as the force is increased.

The most unexpected findings were that males found stimulation of the forehead, particularly with terry toweling and denim, unpleasant, whereas female participants found terry toweling and denim to be unpleasant on the hand and thigh. For the more pleasant materials, responses from males and females were similar when tested on the hand, forearm, and thigh. Interestingly, male participants found stimulation of the calf more pleasant than female participants did.

This was the first study to conclude, based on data, that there are differences between male and female responses to unpleasant materials. The researchers believe this phenomenon is most likely due to sex-dependent mechanical responses of the skin.

We all know that males and females have areas that are more sensitive than others, but who would have thought there are some pretty mundane differences in addition to the sexual ones that usually come to mind.

It is only a first step, but as we continue to learn about, and assign values to the different emotional responses to touch, we can begin to understand the psychology of why people are attracted by some things and repulsed by others.

Health Effects of Gillnets on Sea Turtles along the Cape Fear River


The lower portion of the Cape Fear River in North Carolina is home to both fisheries and sea turtles—a costly combination for each. In this area, fishermen use gillnets, large mesh nets used to capture fish, that are hazardous to sea turtles and the fishing industry alike. Sea turtles can get caught in the nets, ruining the gear and injuring or killing themselves.

In an effort to decrease mortality rates, North Carolina law requires fishermen to supervise their gillnets during the summer months. Since gillnets are left in the water (soaked) for an average of twelve hours, many fishermen don’t supervise their nets the whole time, in an effort to cut costs (1).

Many sea turtles escape the gillnets only to die later on from boating accidents, predators, or their injuries. In an effort to decrease sea turtle mortality, researchers from the University of North Carolina-Wilmington and Grice Laboratory in Charleston, South Carolina analyzed sea turtle blood biochemistry to determine the likelihood of survival after entanglement and release.

The researchers captured eighteen sea turtles using a mesh gillnet soaked for a maximum of six hours from May to October. To ensure the safety of the animals, the sea turtles were immediately brought to the surface if they were in danger of drowning.

Once a sea turtle was untangled, the researchers brought it on board and drew blood immediately. They then tested the turtle’s reflexes, checked for injuries, and monitored its behavior before releasing it about ten meters from the capture site.

The scientists rated the turtles from A to D, with A meaning perfect condition and D being low activity, severe injuries, and missing or delayed reflexes.

Only three of the eighteen sea turtles rated as an A. The blood work of those that did not showed elevated levels of lactate, LDH (lactate dehydrogenase), and other markers of increased metabolic stress. Because of this increased metabolic demand, the sea turtles have to spend longer periods of time at the surface in order to recover, and these longer surface times greatly increase their risk of death.

The researchers hoped their study would help determine a maximum unattended soak time for fishermen that would also minimize the unintentional capture of sea turtles. Because other factors also lead to sea turtle deaths, the scientists recommended that the current restriction remain. However, they also recommend that captured sea turtles be brought on board to assess their physical well-being.

They outline that fishermen can gently touch the tail or eyelid to assess reflexes and do a simple visual inspection to check for injuries. They also suggest that fishermen take the turtles to a rehabilitation facility at the first sign of injury or distress.

1. Snoddy, J. E. (2009). Blood Biochemistry of Sea Turtles Captured in Gillnets in the Lower Cape Fear River, North Carolina, USA. The Journal of Wildlife Management, 73(8), 1394-1402.

Thursday, April 1, 2010

Pesticides, Urine and Vacuum Dust: What’s the Connection?

Dust is nearly as ubiquitous as the wind, floating and flying into every crevice of our lives. It begins as a speck of dirt or a couple of hairs, and as it settles over every surface, it clings together as a fuzzy blanket or a furry tumbleweed. That blanket—and every particle that binds it together—holds evidence of what you breathe each day.

That evidence inspired Dr. James Starr to conduct a study on what was in the dust in people’s homes. A physical research scientist with the U.S. Environmental Protection Agency, Starr was particularly interested in a specific ingredient that he suspected he would find in the dust—pesticides.

According to Starr, almost no one can avoid the presence of pesticides. Even the neatest, most compulsive housekeeper who eats organic food and shuns furry companions is likely exposed to at least trace amounts of pesticides in the air. Most of the time, the exposure goes unnoticed. Although most people won’t ingest even a fraction of the pesticides needed to make them sick, the Centers for Disease Control and Prevention periodically assesses people’s exposure to pyrethroid pesticides by examining their urine. Pyrethroids are a type of pesticide commonly used to control insects on farms, in homes and on pets.

When someone uses a pesticide—whether it’s a shot of ant or roach spray in a corner or a flea product on a dog—that pesticide remains where it is applied. Once it enters the environment, the original parent pesticide and the compounds it breaks down into—called degradates—enter the body with a breath or a bite of food, may be further metabolized, and may exit the body through the urine.

In the urine, Starr says, the metabolized parent pesticide and the degradates look identical when they are analyzed. Starr wanted to know if urinary analyses were accurately measuring the amount of the original pyrethroids in a person’s body or if some of the person’s exposure came from the less toxic degradates. If people were inhaling both the parent pesticide and the degradates, results from urinary analyses would be unnecessarily alarming. So he decided to test dust samples vacuumed from people’s homes.

“We’re interested in dose, the follow-up to exposure,” Starr says. “That’s where the chemicals have the ability to have an effect.” The greater the toxicity of a substance and the more that’s ingested, the more likely the substance is to have an effect.

To purify the dust samples, Starr sent them to a lab in Maryland to be zapped with gamma radiation. Dust is as nasty as it seems; it’s a collection of skin cells, bacteria, hair, dirt, and any other particle flying around in the air. Starr says he only works with dust inside a fume hood.

“If you can smell the contents of the bag when you vacuum, you are redistributing dust throughout your home,” he says.

The pyrethroid products in the dust samples—in doses too minuscule to likely be harmful—included the degradates.

“This shows that exposure can be to the degradates in addition to the parent compound,” says Starr. “If you don’t know which, you may overestimate exposure to the more harmful parent.”

Thanks to Starr, we can all breathe easier.

Reference: Starr, J., Graham, S., Stout, D., Andrews, K. and Nishioka, M. Pyrethroid pesticides and their metabolites in vacuum cleaner dust collected from homes and day-care centers. Environmental Research, 108 (2008) 271-279.

Sovereign Wealth Funds: Risk and Reward

Norwegian Risks Pay Off

Jim Allen

So what’s the big deal about Norway? “It’s a hereditary monarchy and parliamentary democracy, cold, and has universal health care—but what’s risky about that?” one might ask. The Norwegian Sovereign Wealth Fund (SWF) performed better than its private sector counterparts, with a higher associated risk, according to a recent study published by North Carolina State researchers in The World Economy. Funds like the one in Norway are large and growing, and are only recently coming into public light.

Sovereign Wealth Funds are basically large government investment funds which function like mutual funds in terms of risk and return on investment. While not all SWFs are the same, they each have a similar goal: “to achieve maximum income subject to a level of risk and certain portfolio constraints specified by its Ministry of Finance.” In Norway, the Fund allows government spending to be smoothed out relative to the volatile pattern of the nation’s oil revenue. SWFs are essentially savings accounts for governments; citizens have banks, and now government entities do also. Many nations which have SWFs are also oil producing nations; SWFs provide a stabilizing effect for nations which have erratic economies related to fluctuations in oil prices. These nations include the United Arab Emirates, Saudi Arabia, Norway, and even the U.S.

According to the research, “Total assets held by SWFs have been estimated to be around US$3 trillion (Jen, 2007). This value is greater than the total assets of hedge funds (US$2 trillion) but less than total official monetary reserves of central banks (US$6 trillion)” (see the figure below). By 2013, the holdings of SWFs are estimated to exceed US$6 trillion. These are just approximations, because the literature on SWFs has been limited and descriptive at best until now. The large amount of money held worldwide by SWFs was quite alarming to the researchers at NC State when they looked at the relative return ratios and how much risk the SWFs were taking to achieve those returns.



Researchers found that the Norway Fund had average monthly returns of 0.36%, while the Social Choice Fund (generally accepted as a balanced and fiscally responsible fund) had a return of 0.16%. To explain this, the research also included data on return on investment versus risk. Not surprisingly, the increased returns also carried increased risk. More specifically, the Norway return to risk ratio was 0.14 while the Social Choice Fund had a much lower ratio at 0.07. Critics also point out mistakes made by SWFs in recent years, and in particular investments in American corporations such as Citigroup, Morgan Stanley, and Merrill Lynch.
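A return-to-risk ratio of this kind is typically just average return divided by a measure of the return's variability; the study's exact risk measure isn't spelled out here, so the sketch below uses the standard deviation of monthly returns as a stand-in, with made-up numbers:

```python
import statistics

def return_to_risk(monthly_returns_pct):
    """Average monthly return divided by the standard deviation of those
    returns (a Sharpe-style ratio; illustrative only)."""
    return statistics.mean(monthly_returns_pct) / statistics.stdev(monthly_returns_pct)

# Hypothetical monthly return series (in percent) -- not data from the study.
hypothetical_fund = [0.9, -0.4, 1.2, 0.1, 0.5, -0.8, 1.0, 0.3]
print(round(return_to_risk(hypothetical_fund), 2))
```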

This research effort with the Norwegian SWF (referred to as “The Fund”) is one of the first of its kind, because the Norwegian government has allowed a level of transparency unrivaled by other funds. The researchers stated that “The Fund demonstrated transparency by providing detailed and reliable information about its activities in a timely manner.” Because of this, the researchers were able to analyze important data about how The Fund affects global markets and what kinds of returns its investors receive.




Caner, Mehmet, and Thomas Grennes. "Sovereign Wealth Funds: The Norwegian Experience." World Economy 33.4 (2010): 597-614. Wiley Interscience. Web. 1 Mar. 2010.

Nightmare on Hog Street


The nightmare is always the same: I’m a kid again, riding in the passenger seat of my mom’s 1997 Chrysler Town and Country, playing Pokémon happily with the wind blowing in my face. All of a sudden, I feel it creep up my spine – a dark matter with properties beyond comparison.

I’m talking, of course, about the foul, foul, foul odors emitted by hog and chicken farms. Driving two hours from Raleigh in any direction is putting your nose at serious risk and heading toward the beach is a death wish.

In fact, North Carolina produced 781,000,000 chickens and 2.96 billion eggs in 2007, comprising 38.8% of the state’s total farm income. However, the process of taking animal byproducts like skin, bones, and feathers and turning them into useful products produces extremely foul odors.

There may finally be a solution!

Dr. Praveen Kolar, assistant professor of biological and agricultural engineering at NC State, has devised a new, inexpensive treatment process that significantly reduces the foul odors and air pollutants emitted by Concentrated Animal Feeding Operations (CAFOs).

Working with Dr. James Kastner at the University of Georgia, Dr. Kolar “has designed an effective filtration system that takes advantage of catalytic oxidation to remove these odor-causing pollutants.”

Catalytic oxidation uses specially-designed catalysts and ozone to break down the odor-causing compounds.

Catalysts are substances added to a process to start a chemical reaction or change its rate; the catalyst is not consumed by the reaction itself, meaning it can be used many times.

Kolar and Kastner developed the catalysts by “coating structures made of activated carbon with a nanoscale film made of cobalt or nickel oxide.”

Activated carbon’s porous structure gives it a very large surface area on which to expose the odorous agents to the catalyst.

“The cobalt and nickel oxide nanofilms make excellent catalysts, Kolar explains, ‘because they increase the rate of the chemical reaction between the odor-causing compounds and the ozone, making the process more efficient. They are also metals that are both readily available and relatively inexpensive.’”

Another advantage of catalytic oxidation is that it takes place at room temperature, meaning no extra energy is needed for heating, and the only two byproducts created are carbon dioxide and water.

The current system, which uses chemical “scrubbers” to remove the odor-causing agents, has many disadvantages.

Most obviously, it is ineffective. All the empirical evidence you need is your nose.

Also, some of the odor-causing animal compounds are aldehydes, which are used in making plasticizers and detergents. These aldehydes can “combine with other atmospheric compounds to form ozone – triggering asthma attacks and causing other adverse respiratory health effects.”

Although Kolar has so far targeted industrial poultry farms, he plans to use his research to target hog farms next. “This technology could be applied to swine operations to address odors and ammonia emissions,” Kolar says. “My next step is to try to pursue this research on a large scale.”

Wednesday, March 31, 2010

OCD: Relieving the Obsession




How many times do you have to wash your hands until they are really clean? One time a day? Five times a day? Would you be surprised if some people wash their hands over 100 times a day? This “germ phobia,” while disruptive to daily life, is sometimes the least of the worries for people suffering from obsessive compulsive disorder (OCD).

Research in the Department of Psychology at the University of North Carolina at Chapel Hill (UNC-CH) is offering new insight into OCD and ways to treat and alleviate it. With over 3.3 million adults in America suffering from OCD each year, and the high probability of countless more undiagnosed cases, this research is close to offering some answers.

Jonathan S. Abramowitz and co-workers in the Department of Psychology at UNC-CH have shown a clear connection between experiential avoidance and the symptoms of OCD. Experiential avoidance—commonly described as a person going to irrational lengths to avoid experiencing unpleasant triggers such as disturbing thoughts or emotions—had previously been thought to relate to OCD, but this is among the first studies to test that link directly.

“There are competing theories for explaining obsessive-compulsive (OC) symptoms and this study was designed to try to figure out which theory is stronger. Knowing this can help with better understanding and treating people with OCD,” said Abramowitz.

The study was conducted using 353 volunteer students from UNC-CH enrolled in an introductory psychology course. The students, under confidentiality, were administered three questionnaires via the internet at their convenience, measuring: 1) tendency toward experiential avoidance; 2) tendency toward depression; and 3) tendency toward specific OCD symptoms, such as hand-washing, hoarding, checking, etc. After scoring the questionnaires based on criteria published in the literature, the researchers divided the respondents into two groups, one with a higher degree of obsessive-compulsive (OC) symptoms and another with a much lower degree. The analysis at the core of the journal article was built from the data specific to the group with the higher OC tendency.
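The core comparison amounts to splitting respondents by their OC-symptom scores and then comparing average experiential-avoidance scores between the two groups. A minimal sketch of that step, with a made-up cutoff and made-up scores rather than the study's data:

```python
import statistics

# Hypothetical records: (oc_symptom_score, experiential_avoidance_score).
respondents = [(28, 41), (5, 22), (19, 35), (3, 18), (24, 39), (7, 25)]

cutoff = 15   # hypothetical threshold separating the High-OC group from the Low-OC group
high_oc = [ea for oc, ea in respondents if oc >= cutoff]
low_oc = [ea for oc, ea in respondents if oc < cutoff]

print("mean experiential avoidance, High-OC group:", statistics.mean(high_oc))
print("mean experiential avoidance, Low-OC group:", statistics.mean(low_oc))
```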

Upon examining the results from a statistical analysis of the higher OC group, the researchers discovered that “As expected, the High-OC group evidenced greater levels of…experiential avoidance relative to the Low-OC group.”

But how is this new and what does this mean for the public afflicted with OCD?

From the results of this research, “…people with OCD probably have specific dysfunctional beliefs that underlie their OCD symptoms moreso than having a general tendency to avoid unpleasant experiences. Also, that treatment might be focused on modifying obsessive beliefs which underlie the problem, rather than helping patient develop more psychological flexibility in general,” said Abramowitz.

When looking at what is to come next, “Future research should examine this same question using actual treatment-seeking patients with OCD…[rather than using a]… non-clinical sample of people scoring highly on a measure of OCD symptoms,” says Abramowitz.

It can sometimes be difficult to treat such an illness as OCD, but what remains certain is that the illness can never be treated or cured if you are focusing on the wrong aspect of the illness. Thanks to Abramowitz and co-workers at UNC-CH, even more successful treatments for this condition are that much closer to becoming a reality.

Steven Burgess

Peer-reviewed literature:
Abramowitz, Jonathan S., Gerald R. Lackey, and Michael G. Wheaton "Obsessive-compulsive Symptoms: The Contribution of Obsessional Beliefs and Experiential Avoidance." Journal of Anxiety Disorders 23 (2009): 160-166.

Picture from:
http://bit.ly/9r0nSJ

Tuesday, March 23, 2010

Science Journalism: Magazine vs. Online Series

The 2009 AAAS Science Journalism Award Recipients in the magazine and online categories approach the concept “explaining science to the public” in divergent ways.

Gary Wolf’s magazine article, “A Simple Plan to ID Every Creature on Earth,” manages to keep a nice balance between the science and the overall story. He explains the science involved using specific details and easy-to-understand explanations. The cold, hard science about barcoding species by analyzing the CO1 mitochondrial gene is tempered by descriptions that appeal to the different senses. The result is a story that is both engaging and highly informational.

By contrast, the online series by Lisa Friedman, concerning the displacement of people in Bangladesh due to climate change and global warming, raises the question, “Where is the science?” It takes a very humanistic approach to a topic that has inundated the news of late, and it presents the consequences of human actions. To that effect, it is a well-written human interest story. Because it does not explain anything scientifically, though, it really shouldn‘t have won an award for science journalism. Instead it is a nice story of how humans are suffering because of a scientific phenomenon.

The two winners both use pictures to enhance their pieces. The online series takes it a step further with a short video clip. This video clip tugs on the heartstrings, but it does not explain any science and does not add much to the overall piece. Both stories are well written, clear, and flow from beginning to end. However, where the former is scientifically dense, the latter is watered down and contains almost no scientific basis or fact.

Tuesday, March 9, 2010

Large & Small (But Really Long) Newspaper Articles

Carl Zimmer’s three long newspaper articles are on very different topics: “Now: The Rest of the Genome”, “10 Genes, Furiously Evolving”, and “Blink Twice If You Like Me”.

Zimmer’s “Now: The Rest of the Genome” is about Encode (the Encyclopedia of DNA Elements) and what the project is doing. It is full of scientific language referring to terminology that I have heard many times but never fully comprehended. Although Zimmer does explain it, the article is full of very long sentences and paragraphs compared to the small-newspaper article “One mother loses five of her kids in ‘worst-case scenario’,” which is full of short, informative sentences and paragraphs.

Its main focus is providing a reworked definition for ‘gene’:

“These new concepts are moving the gene away from a physical snippet of DNA and back to a more abstract definition. ‘It’s almost a recapture of what the term was originally meant to convey,’ Dr. Gingeras said.”

This is all I really retained from this article. I got bored many times throughout and can’t imagine someone who is not interested in science finding this interesting or sitting down to read the article in its entirety.

“10 Genes, Furiously Evolving” is about the evolutionary biology of viruses. Virologists study the complexity and mystery surrounding the evolution of viruses. Viruses are described in scientific language, filled with quotes from doctors, etc.

Zimmer includes interesting trivia such as "Virologists have estimated that there are a million trillion viruses in the world's oceans." Further, Zimmer explains that birds are constantly mixing up the constellation of viruses.

This is certainly a relevant article considering the recent swine flu outbreak that has caused everyone to panic. It does a good job explaining how swine flu has evolved and it flows well. Its length was not as daunting as “Now: The Rest of the Genome.”

“Blink Twice if You Like Me” is the most interesting of Zimmer’s three articles. Although the other two are arguably more important and impactful than cannibal fireflies, they didn’t interest me nearly as much.

Zimmer again uses very long paragraphs with very long sentences. However, multimedia such as audio recordings, photos, and videos on the left side of the page add some color and life and break up the massive wall of text running down the page.

The article is interesting because everyone loves fireflies. Hating fireflies is like hating rainbows. Who hates rainbows? The science was interesting enough and Zimmer explained the observable traits and practices of firefly mating.

However, I found the end of the article the most fascinating: when the author talked about Photuris predators, their deceitful flashes, and how evolution and natural selection came into play. It seemed to be disconnected from the rest of the article, and I’d like to read an expanded article on this topic.

“Lethal Legacy” by Amie Thompson may be from a small newspaper, but it is a long, long, long, long, long article. It’s so long that it needs several headings (“Merciless Killer”, “Road to Montana”, etc.) and includes pictures throughout. It is a long narrative of one family’s history, with not much science. It is a sad story that seems to throw TONS of information at the page.

This may be harsh, but I got bored reading the stories of more and more people getting sick and dying. Compared to “One mother loses five of her kids in ‘worst-case scenario’”, it drags and seems to just tell a story of everyone dying. It was all over the place and I didn’t like it one bit. The pictures on the right were the same on every new page (and there were six long pages).

“One mother loses five of her kids in ‘worst-case scenario’” is, by contrast, a short article with short paragraphs that does the job. It doesn’t add tons of detail or take a long time explaining the science, but gets straight to the point and briefly portrays the tragic events.

This article does not have much science in it, but it is also far shorter. I think it is the article most likely to be read in full by the average reader (including me) because, unlike the others, it is not intimidating.

In my opinion, “Disease leads neurologist on research odyssey” is the best article about pallido-ponto-nigral degeneration (PPND). It is medium in length, and I feel its readability and flow are the best. It focuses on Wszolek’s research and path instead of jumping around to a bunch of different people and stories the way “Lethal Legacy” seemed to do. It is a step up from the shortest article and a step down from the longest ones, but it still explains all the needed information.

Megabeast and Barcoding Life: A Comparison of Award-Winning Science Media

I am sure we are all glad that saber-toothed tigers, woolly mammoths, and the other giant beasts of the Ice Age are no longer around, but we still wonder why they disappeared. Of course, like any insect or animal of today, the “megabeasts” of the past would still be classified into different species, and an “automatic animal identifying machine” would be useful for that task.

The animals of the Ice Age are never coming back, but we may now know why they went extinct, and we do in fact have an “automatic animal identifying machine.”

Gary Wolf is a human (something you could confirm with Paul Hebert’s “automatic animal identifying machine”) who wrote the 2009 AAAS (American Association for the Advancement of Science) award-winning article “A Simple Plan to ID Every Creature on Earth” for Wired magazine.

Wolf’s article tells the story of Hebert’s quest to rid the biology world of “operational taxonomic units,” an imprecise way of classifying organisms before they have been identified as species. Hebert uses the CO1 mitochondrial gene as a unique “barcode” to separate specimens into species.
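To make the barcoding idea a little more concrete, here is a minimal, purely illustrative sketch in Python of how a “barcode” lookup could work: compare a specimen’s CO1 sequence against a reference library and assign it to the closest species. The species names, sequences, and distance threshold below are invented for this post; this is not Hebert’s actual system or anything described in Wolf’s article.

# Toy illustration of DNA barcoding: match a specimen's CO1 fragment to the
# closest reference species. Sequences and threshold are made up.

def distance(a, b):
    """Fraction of positions that differ between two equal-length sequences."""
    mismatches = sum(1 for x, y in zip(a, b) if x != y)
    return mismatches / len(a)

# Hypothetical reference library: species name -> short CO1 fragment.
reference_barcodes = {
    "Species A": "ATGCGTACGTTAGC",
    "Species B": "ATGCGAACGTTGGC",
    "Species C": "TTGCGTACCTTAGC",
}

def identify(specimen_seq, threshold=0.1):
    """Return the closest reference species, or flag the specimen as unknown."""
    best_species, best_dist = None, 1.0
    for species, barcode in reference_barcodes.items():
        d = distance(specimen_seq, barcode)
        if d < best_dist:
            best_species, best_dist = species, d
    if best_dist <= threshold:
        return best_species
    return "operational taxonomic unit (no close match)"

if __name__ == "__main__":
    print(identify("ATGCGTACGTTAGC"))  # identical to Species A's barcode
    print(identify("CCCCCCCCCCCCCC"))  # far from everything -> unknown

This uses the simplest possible design: a per-position mismatch fraction and a fixed cutoff. Real barcoding work would use proper sequence alignment and carefully chosen thresholds, but the basic lookup logic is the same.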

While pictures are few, Wolf uses anecdote and simple language to keep the reader engaged. Hebert’s machine is not widely accepted by the scientific community, and Wolf shows that with quotes from the opposition.

In comparison, you have Doug Hamilton’s AAAS award-winning documentary The Last Extinction, about the controversial new theory on the extinction of the “megabeasts” of the Ice Age.

Hamilton goes where the scientists go, showing their triumphs and failures and making viewers feel like they are there when a theory is proven right or wrong. Like Wolf, he uses opposing opinions to keep viewers engaged while also informing them of alternate theories.

It could be that I simply find DNA more interesting than extinct Ice Age animals, but I thought Wolf did a better job of explaining and entertaining than Hamilton did. Decide for yourself by reading and watching the originals at:

http://www.wired.com/science/discoveries/magazine/16-10/ff_barcode?currentPage=1

http://www.pbs.org/wgbh/nova/clovis/

Genes and Climate Change: Recognizing Great Science Writing

Every year the American Association for the Advancement of Science gives out awards to journalists to acknowledge great achievement in reporting science to the general public. What makes these writers so good? What do they have in common? Why is it important to recognize science writing for a general audience? These are all questions one may ask of the AAAS, and questions which deserve to be answered.

The first three awards for 2009, for “Large Newspaper” articles published in papers with circulations over 100,000, had to do with genetics and biology. What made these articles particularly spectacular was their ability to describe immensely detailed and profound research findings in ways that any adult could understand. This was achieved very subtly, using a combination of images, metaphors, and analogies. By relating a complex theory or idea to a simple, familiar concept, the writer lets the reader gain a fuller understanding and appreciation of the topic. This method of writing was prevalent in all of the newspaper articles. Additionally, the articles made a point of answering the “so what” question that all readers have.

This “so what” question was especially important in the online publications. Online content allows the author to include links to other content and to show pictures and other multimedia, and by doing so draw readers into the article. In the online publication titled “Bangladesh: Where the climate exodus begins,” an image is used to draw out the reader’s pathos:

http://www.eenews.net/special_reports/bangladesh/part_one


Similar techniques are used in the other posts to give the subject matter relevance and make the general public want to keep reading. This combination of writing technique, media use, and overall presentation is what made these publications compelling to the general public while still conveying science in a meaningful and accurate way, and it is what makes them deserving of praise and recognition from the scientific and journalism communities.

Comparing Lisa Friedman's online science journalism to Julia Cort's educational science television

Comparing a fun video about scientists who grow diamonds, complete with Indiana Jones references and goofy cartoons, to hard journalism about the effects of climate change on Bangladesh makes for some pretty stark contrasts. One is compelling because of its topic's Mr. Wizardesque appeal. The other demands attention with gravity rather than levity...

Consider both in terms of Cornelia Dean's measurements of "newsworthiness."

Extent: Climate Change, yes. Growing Diamonds, no.
Intensity: Neither, really. I'm used to people in Bangladesh doing worse than I am, and I'm not buying a diamond--grown or otherwise--anytime soon.
Consequence: Climate Change, yes. Growing Diamonds, no.
Celebrity: Climate Change, no. Growing Diamonds, yes--Neil deGrasse Tyson is, to quote a colleague, "a scientist rock-star."
Proximity: nope and nope
Timeliness: Climate Change, ALWAYS. Diamonds? Timeless, maybe, but not particularly timely.
Novelty: Climate Change, good lord no. But Friedman's idea of going in-depth to observe direct and measurable effects of climate change is quite novel. Her approach is much of the story's success, I think.
Diamonds, obviously extremely novel. This is where Cort capitalizes.
Human interest: Friedman focuses here, and lets her subjects speak in their own voices. Photos add credence to the narrative. We see economic forces at the micro level.
Diamonds--A trip to New York's Diamond District puts human interest where there might've been none. We get to watch a veteran diamond appraiser react to a "grown" diamond. His disdainful joking about the idea of "grown" diamonds devaluing natural ones makes us wonder if there is anxiety underneath. And his perspective raises the question: what do we value?
Currency: both are a yes

Interesting. These pieces of science communication are ostensibly opposites (one trading on novelty, the other on consequence), except for the fact that both capitalize on human interest.