Tuesday, April 20, 2010

Tiny, Slippery Symmetries: In Search of the Bizarro Electron


My friends the particle physicists look for clues to the secrets of the universe: they want to know how things are put together, or, to put it better—how things are coming apart, given that the universe is expanding. They’re tracing the clues back to the beginning, when all things were one.

That sounds so cosmic. Better than the Bible.

So I thought that when Robbie Pattie, a research fellow with the NCSU physics department, showed me his work space, I’d be seeing something…well, cool. I thought he’d lead me over to some eyepiece, and I’d look in, and there would be the Big Bang repeating, or something.
Nope. He showed me what looked like a 1-inch USB port in a block of wood and a cold keg of helium. (See above. Photo courtesy of Dave Baker.)

Okay—it was REALLY cold. The helium. It was 4 Kelvin, which works out to about -269 Celsius or -452 Fahrenheit. So if you want to argue semantics, Robbie did show me something “cool.”
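If you want to check that conversion yourself, here is a quick Python sketch of the arithmetic (nothing lab-specific, just the standard temperature formulas):

```python
def kelvin_to_celsius(k):
    # Celsius is Kelvin shifted down by 273.15
    return k - 273.15

def kelvin_to_fahrenheit(k):
    # convert to Celsius, then scale and shift into Fahrenheit
    return kelvin_to_celsius(k) * 9 / 5 + 32

print(kelvin_to_celsius(4))     # -269.15, close to the -270 often quoted
print(kelvin_to_fahrenheit(4))  # about -452.5
```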

The really cool stuff is too small to see, though. Down in that thing that looks like a USB port is a battery with each pole hooked to a tiny bronze plate (think middle school shop class electronics). Hovering (actually, doing little mini-orbits, little loops) between the plates is a neutron from that 4 Kelvin helium (there’s none of that in shop class). Robbie’s colleague Chris is trying to measure the ways that the electric field between the plates affects the neutron’s motion.

If the neutron's loop stays the same, we aren’t too excited. We know that the neutron has a magnetic moment: its overall charge is neutral, but the charged quarks inside it give it magnetic poles. But if the neutron wavers in its little orbit, that means the electric field generated by the battery is affecting it; it has an electric dipole moment. That would point to something at work inside it that we’ve never observed before, pulling it toward the field. And that is extremely—cosmically—exciting.

Most particle physicists think that there’s a dark half of the universe. A Bizarro half made of matter that we can’t see, even though its gravity tugs on everything we can see. They call it dark matter, and we know it’s there from the way galaxies move and from its imprint on the universe’s background radiation.

An electric dipole moment—if Chris’s neutron has it—would be evidence of the Bizarro electron. Proof of the other side. If Chris can find it, physicists can adjust their picture of how the universe works.

Did you get that? ADJUST THEIR PICTURE OF HOW THE UNIVERSE WORKS. That’s what physicists do.

The NCSU group's work also adds to a body of scholarship on the materials that work best in these kinds of experiments, which use ultracold neutrons (remember that 4 Kelvin helium?). If you want to see the cool stuff Robbie builds in a lab out in Los Alamos (think shop class, again), check out this PowerPoint for a paper he contributed to. It was presented at an ultracold neutron conference at the end of last year in Santa Fe, New Mexico.

Monday, April 19, 2010

Breaking Down Pancreatitis

Jim Allen

The pancreas is that underappreciated organ beneath your stomach that nobody pays attention to until it swells to the size of a foot-long sub and jams into your nearby internal organs. This swelling, commonly known as acute pancreatitis, has many causes, including the introduction of scorpion venom into an animal’s body. Researchers Keith Weninger of North Carolina State’s Department of Physics and Paul Fletcher of East Carolina University’s Department of Microbiology and Immunology have recently completed a study of the effects of scorpion venom on protein and enzyme production in the pancreas of guinea pigs. According to the researchers, “Clinical studies report that scorpion venoms induce significant pathology, including acute pancreatitis in humans following envenomation.” Approximately 80,000 people are affected by acute pancreatitis every year in the U.S., a number the researchers hope their data could eventually help reduce.

http://www.latoxan.com/VENOM/SCORPION/IMG/Tityus-serrulatus.jpg

It is important to note that while scorpion stings do occur in the U.S., they happen worldwide, especially in Africa and India. Additionally, according to the researchers, “Secretagogues of non-scorpion venom origin used by others can also produce similar effects but require excessive levels of administration in vivo in order to achieve those results.”

In their experiments, swelling of the pancreas was caused by a toxin-induced failure of normal vesicular traffic, the basis for intracellular transport of proteins. Toxin molecules attacked proteins referred to as vesicle-associated membrane proteins (VAMPs), rendering them unable to move other proteins throughout the pancreas. Basically, pancreatitis set in because the toxin disabled the pancreatic cells’ ability to release or absorb components.

This transport process is called “vesicle fusion,” and it works like so: a small membrane-bound packet, or vesicle, merges with a target membrane so that its contents can be released or delivered to another location, and VAMPs are among the proteins that let the vesicle dock onto its target. Weninger and Fletcher report that “understanding of these functions is fundamental to extending knowledge of transport in normal and diseased cells.” Additionally, “Simultaneous cleavage of multiple SNAREs, such as VAMP2, VAMP3, and VAMP8, would presumably have major physiological consequences.”

The cutting of the VAMP proteins, known as proteolysis, occurred between the soluble N-ethylmaleimide-sensitive factor attachment protein receptor (SNARE) motif and the transmembrane anchor. The cleavage was performed by an enzyme called antarease, a newly discovered metalloprotease whose sequence was not previously present in amino acid sequence databases.

Before their experiments, no scorpion toxins had been associated with intracellular targets. Therefore, “a definitive function in pathogenesis for the metalloprotease activity of scorpion venom remains to be determined beyond a theoretical role”.

According to Dr. Weninger, “results from the experiments have important implications for potential effects on secretory discharge as well as vesicular transport mechanisms in the exocrine process.” Understanding the effects of VAMP cleavage by a metalloprotease will lead to a better understanding of the mechanisms responsible for pancreatitis and potential treatments or cures. Vesicle fusion has been explored in recent years as a method of cellular-level drug introduction.


Results of the experiment were published in the March edition of the Journal of Biological Chemistry.


Fletcher, Paul L., Jr., et al. "Vesicle-associated Membrane Protein (VAMP) Cleavage by a New Metalloprotease from the Brazilian Scorpion Tityus serrulatus." Journal of Biological Chemistry 285.10 (2010): 7405-7416. jbc.org. Web. 6 Apr. 2010.

Revamped Design Increases Computer Efficiency

It's hard to imagine that single processors in computers once ran so hot that they threatened to melt the very chips they were engineered for. Now processors are designed to split up the workload to counter that incredible heat. However, this has presented new challenges for today's engineers, at both the hardware and software level, in maintaining and increasing a computer's computational speed. Luckily, researchers at North Carolina State University seem to have just raised the bar in this matter by devising a way to increase the efficiency of modern processors by up to twenty percent.

To understand what is going on under the hood of these machines, a quick overview is in order:

Computers have a so-called “brain” known as the Central Processing Unit (CPU) or core. This is where all of the computations take place when executing an application/program such as your everyday web-browser.

The calculations necessary to run a program can be split up into separate tasks called “threads,” a process known as parallelization; the threads can then be computed simultaneously on multiple cores, making for a very fast means of computation.

Unfortunately, some programs are difficult to split up into threads because of their sequential nature. Each step depends on the outcomes of other threads or programs in order to continue computation, which limits their use of multi-core systems and slows execution time.

Process execution traditionally takes place as a calculation step followed by a memory-management step that frees up memory and prepares data for storage. For difficult-to-parallelize programs, this two-step task can only be completed on a single core, slowing things down significantly.

The researchers achieved their twenty percent gain in efficiency by treating the memory-management step as a separate thread, the memory-management thread (MMT), allowing both steps to execute simultaneously. This ensures utilization of multiple cores, increasing processing speed. Programs frequently request that memory be allocated or freed up for various reasons. These requests are now passed through a small layer of code between the program and the operating system that determines how they should be satisfied.

Their method involves taking these memory-management requests and lumping them together, then predicting how the requests should be handled to satisfy the needs of the program. Dr. James Tuck, assistant professor of electrical and computer engineering at NC State, says: “As it turns out, programs are very predictable.” Predictable in the sense that it is possible to anticipate the next memory-management request before it happens. So when requests are predicted and handled in bulk, time is saved, since the work is already done by the time the program asks for it. These requests are completed on a different core from the calculation thread via the MMT. “It's like having two cooks in a kitchen...” Dr. Tuck explains. One cook does the mixing while the other prepares the ingredients. The mixing, or computation, is the important part of satisfying program requests, and the preparation of the ingredients is what the MMT takes care of.
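To make the “two cooks” picture concrete, here is a toy Python sketch. It is my own illustration, not the researchers' code (their MMT works down at the level of the program's actual memory allocator), and Python's global interpreter lock limits true parallelism, so treat it strictly as a picture of the division of labor:

```python
import threading
import queue

cleanup_requests = queue.Queue()

def memory_management_thread():
    # The helper "cook": drain cleanup requests so the computing thread
    # never has to stop and do the chores itself.
    while True:
        buffer = cleanup_requests.get()
        if buffer is None:      # sentinel value: no more work is coming
            break
        buffer.clear()          # stand-in for freeing or recycling memory

helper = threading.Thread(target=memory_management_thread)
helper.start()

# The "mixing" cook: do the useful computation, then hand off the cleanup.
for step in range(5):
    scratch = [x * x for x in range(10_000)]
    result = sum(scratch)             # the real work
    cleanup_requests.put(scratch)     # offload the memory-management request
    print(f"step {step}: result = {result}")

cleanup_requests.put(None)            # tell the helper it can stop
helper.join()
```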

This exploitation of parallelization resulted in some interesting findings, the most significant being that the MMT approach works independently of existing applications. In layman's terms, MMT allows for a boost of speed without having to alter pre-existing code, and it is effectively transparent at the user level. This is good news, considering that a large overhaul of complicated programs such as common web browsers and word processors is not necessary, saving lots of time and money for a possible twenty percent increase in speed.

The scientific article MMT: Exploiting Fine-Grained Parallelism in Dynamic Memory Management can be found at http://www.ece.ncsu.edu/arpers/Papers/MMT_IPDPS10.pdf

A visual of an Intel Core 2 Duo processor.

Sunday, April 18, 2010

Polypropylene -- The Fiber that Could


Tissue scaffolds, gym socks, and potable water…items you would never think could go together are being linked through a material you use every day. The material is one of the most industrially important polymers in the world, polypropylene (PP). The applications using natural and man-made fibers that link these three unrelated topics are only the beginning of what can be accomplished by changing the surface chemistry of the original fibers.

Research in the Department of Chemical Engineering at North Carolina State University (NCSU) is making headway in changing the chemistry of fiber surfaces through a process known as “Atomic Layer Deposition,” or ALD. ALD is the process of depositing a uniform layer of a certain chemical across the entire fiber surface, or really across any surface you want it deposited on.

Chris Hanson, an undergraduate researcher in the lab of Gregory Parsons at NCSU, has been using ALD to modify the surface chemistry of the common polymer, PP, in an effort to produce a product much more valuable than the original plastic (which is commonly used for making various consumer disposable bottles).

“Nonwoven polypropylene (which is again used for consumer bottles) is incredibly cheap to produce, but the original inert surface renders the fibers not useful for specialty applications such as advanced bio-filtration. Our goal is to show that modifying the surface chemistry of PP by adding tiny layers of aluminum oxide can alter how water adheres to the surface,” said Chris. Showing that water adheres differently to the treated fibers is evidence that the surface chemistry has indeed been changed.

The research was done on small squares of PP placed inside a homemade reactor (depicted in the image). To build up multiple aluminum oxide layers on the same surface, a layer of aluminum oxide is deposited, followed by a chemical that makes the surface ready to accept the next aluminum oxide layer. The researchers repeated this cycle to obtain PP samples with anywhere from zero to 100 aluminum oxide layers, in increments of ten (the thickest coating of 100 layers corresponds to a thickness of roughly 10 nanometers). A drop of water was then placed on each sample, and the angle between the water and the sample surface was measured to determine how well the sample repelled or attracted the water.
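As a back-of-the-envelope check on those numbers, here is a small sketch (my own, assuming the coating grows evenly with each cycle, which real ALD only approximates) that translates layer counts into estimated thickness:

```python
# ~10 nm at 100 layers implies roughly 0.1 nm of aluminum oxide per ALD cycle
NM_PER_LAYER = 10 / 100

for layers in range(0, 101, 10):
    thickness_nm = layers * NM_PER_LAYER
    print(f"{layers:3d} layers -> ~{thickness_nm:.1f} nm of aluminum oxide")
```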

In this study, the researchers were able to show that depositing layers of aluminum oxide on the surface of the originally “water-repelling” polymer does change the polymer to have water-attracting properties.

So why does it matter that a surface can be changed from having water-repelling to water-attracting properties?

“Once the surface chemistry of the inert, water-repelling PP has been made to attract water, it can undergo chemical reactions much more easily and can then be used to make advanced bio-filters, clothes that destroy bacteria, water filters, etc,” said Chris. Compared with the anti-microbial coatings already on the market, the new feature of this technology is that it can deposit a more robust set of microbe killers than has been used before, opening up many new opportunities.

The future of using ALD to change the surface chemistry of inert materials is promising not only to researchers, but to every person in the world who will use a product born from this technology.

Steven Burgess



Peer Reviewed Literature:

Hyde, G. K. et al "Atomic Layer Deposition and Abrupt Wetting Transitions on Nonwoven Polypropylene and Woven Cotton Fabrics." Langmuir 26.4 (2010): 2550-2558.

http://pubs.acs.org/doi/abs/10.1021/la902830d

New Scale Helps Diagnose OCD

Imagine being in the cereal aisle of the grocery store when suddenly you see something that doesn’t look right: boxes of various heights, randomly mixed together. You break out into a cold sweat; your heart thumps hard. You try to resist the urge to reorganize the boxes, but you can’t.

You begin organizing the cereal boxes by height. A strange calm washes over you.

You’ve just experienced five minutes in the life of someone with obsessive-compulsive disorder.


Nearly everyone experiences some level of obsessive-compulsive behavior. Obsessive-compulsive disorder, or OCD, is characterized by intrusive thoughts, or obsessions, and repetitive actions, or compulsions. In its most severe form, OCD can severely inhibit a person’s ability to function normally. Time that could be spent with friends is instead consumed by the need to respond to the obsessions.

Before a psychiatrist or psychologist begins therapy with a person showing OCD symptoms, he or she must determine whether the patient has OCD or a different anxiety disorder. To make a definite diagnosis, doctors use one of several diagnostic tests. However, according to Dr. Jonathan Abramowitz, each one has flaws.

So he and several of his colleagues from across the country developed their own measure for OCD, called the Dimensional Obsessive-Compulsive Scale (DOCS). In a 2010 peer-reviewed article in Psychological Assessment, Abramowitz introduces the DOCS and explains how it gauges symptom severity more accurately than the Obsessive-Compulsive Inventory-Revised (OCI-R) and distinguishes OCD symptoms more clearly than the Yale-Brown Obsessive-Compulsive Scale (Y-BOCS), the two most widely used OCD measures.

A research psychologist from the University of North Carolina at Chapel Hill, Abramowitz argues that obsessions and compulsions work as pairs. He groups these obsession-compulsion pairs into four areas, or dimensions: Contamination, Responsibility for Harm, Unacceptable Thoughts, and Symmetry. The OCI-R and Y-BOCS also include Hoarding.

Unlike some other OCD diagnostic tests, the DOCS doesn’t separate obsessions from compulsions. In fact, it doesn’t ask questions about specific behaviors at all. The scale asks five questions for each OCD dimension: time spent on the behavior, avoidance of situations, distress over unwelcome thoughts, disruption of life, and difficulty in ignoring OCD thoughts. The DOCS makes two changes to OCD diagnostics: it adds the concept of avoidance, and it removes Hoarding from the scale.
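For a sense of how a dimensional scale like this gets tallied, here is a rough sketch. The item rating range (assumed here to run from 0 to 4) and the example numbers are illustrative stand-ins, not the published DOCS form or its scoring rules:

```python
# Tally a DOCS-like questionnaire: four dimensions, five items each.
# The 0-4 rating range is an assumption for this example only.
def score(responses):
    """responses maps each dimension name to its five item ratings."""
    per_dimension = {dim: sum(items) for dim, items in responses.items()}
    return per_dimension, sum(per_dimension.values())

example = {
    "Contamination": [2, 3, 1, 2, 2],
    "Responsibility for Harm": [1, 0, 1, 1, 0],
    "Unacceptable Thoughts": [0, 1, 0, 0, 1],
    "Symmetry": [3, 3, 2, 3, 2],
}

per_dimension, total = score(example)
print(per_dimension)
print("total:", total)
```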

Abramowitz tested the DOCS on three different groups: people diagnosed with OCD, people diagnosed with other anxiety disorders, and college students from Tennessee, Florida, and Arkansas. Members from each group completed the DOCS and one or more of six other tests, including two tests for OCD, three scales for anxiety disorders and one scale for depression.

When compared with results from the other scales, the DOCS results were more similar to results from the other two OCD scales than they were to the anxiety or depression scales. In addition to being a useful clinical tool in diagnosing OCD, the DOCS also shows promise in measuring treatment outcomes.

Source: Abramowitz, J., et al. (2010). Assessment of Obsessive-Compulsive Symptom Dimensions: Development and Evaluation of the Dimensional Obsessive-Compulsive Scale. Psychological Assessment, 22(1), 180-198.

Cleaning Mickey Mouse


It is everywhere, the “Mickey Mouse” of molecules: H2O, or water. It is essential for all life, whether animal, plant, or even bacterial. Bacteria can also live in our water pipes, which can pose a risk to human health. In the 2009 article “Effect of free chlorine application on microbial quality of drinking water in chloraminated distribution systems,” researchers from Chile, Massachusetts, and North Carolina developed a new method of measuring the amount of bacteria in water using flow cytometry.

The old method, R2A agar, did not give scientists an accurate picture of the amount of bacteria in water. Flow cytometry, however, uses a dye that causes organic material (bacteria) to glow, making it easier for scientists to count the number of bacteria present in a water sample.

Bacteria eat the ammonia produced by decaying materials and release nitrate and nitrite into the water in a process called nitrification. When pregnant women drink nitrate- and nitrite-rich water, the nitrate competes with the oxygen in the blood of the fetus, causing the child to be born with blue-tinted skin. This condition, blue baby syndrome, results from low oxygen levels in the bloodstream.

The distribution systems (DS) in the study, the networks that carry treated water from the plant to the tap, used chloramine as their primary method of disinfection because of its ability to kill bacteria without producing high levels of harmful by-products.

The trade-off is higher nitrification levels in chloramine-treated water. In an effort to reduce nitrification in tap water, North Carolina law requires water treatment facilities that use chloramination as their primary disinfection method to flush the system with free chlorine (hypochlorous acid) one month a year.

While the free chlorine reduces nitrification levels, it presents other health problems. Free chlorine reacts with organic matter in the water to make harmful by-products like chloroform, a known carcinogen, which limits its widespread use.

In areas where water is in constant use, like urban and suburban centers, the public doesn’t face as much risk as people at dead-end sites. A “dead-end site” is essentially a place where water sits or pools for long periods of time, like a school during summer (1).

The researchers looked at a closed community center and found that before the introduction of free chlorine into the DS, the water showed high levels of nitrification. To move the free-chlorinated water into the community center, hydrant flushing, opening up a hydrant to get water moving, was used. Once the free-chlorinated water reached the community center, nitrification went down, but chloroform levels increased. A second hydrant flushing was needed to remove the chloroform-rich water; however, it was not done.

According to Dr. Detlef Knappe, one of the study’s researchers, the flow cytometry method of counting bacteria will hopefully help engineers and scientists strike a balance between bacterial risks and harmful chemical effects.

Whether by guiding the design of water systems that circulate water to eliminate dead-end sites or by helping strike a balance between chemical disinfectants and bacterial levels, flow cytometry will help provide clean and safe water to the public.

For more information on the concepts in this blog please check out:

(1)Rosenfeldt EJ, Baeza C, Knappe DR. 2009. Effect of free chlorine application on microbial quality of drinking water in chloraminated distribution systems. American Water Works Association 101(10):60-70.

Growing Technology that Grows

Runoff from construction sites has long been a major factor in polluting our rivers and lakes. The EPA has even issued a new rule requiring all construction sites over 10 acres to reduce the pollution in their runoff to roughly 100 times below typical discharge levels.

Luckily, research out of NC State University shows how a remarkable polymer can be used to cut the sediment pollution in runoff from road construction by up to 98% compared with current methods.

Professors R.A. McLaughlin, S.E. King, and G.D. Jennings are taking advantage of a remarkable polymer called polyacrylamide (PAM). PAM is a water-soluble, synthetic polymer that swells and dissolves when it comes into contact with water.

So when it rains, PAM dissolves into the runoff and binds the tiny suspended sediments together so they settle out. Not only does this stop the sediments from polluting a river or lake, but it also creates a physical barrier that slows runoff.

When used in conjunction with fiber check dams (FCDs) consisting of straw wattles and coir logs, construction companies might actually have a chance of meeting the EPA’s new requirements.

Construction companies’ current method of using rock check dams is highly ineffective and even more expensive than McLaughlin et al.’s method of using fiber check dams with PAM.

This research has major implications in the United States, where 45% of rivers and streams are “impaired for their intended use, with sediment and siltation as the leading cause,” according to a 2002 EPA study.

Water with high levels of suspended particles due to sedimentation is called turbid water. High turbidity in drinking water can shield disease-causing bacteria from ultraviolet sterilization; drinking such water can cause nausea, vomiting, and dizziness, and long-term consumption can lead to gastrointestinal disease or death.

McLaughlin et al. tested three different systems for controlling erosion at two roadway projects undergoing construction in the North Carolina mountains from June 2006 to March 2007. The team marked off three experimental sections next to each other at the first roadway project, and two experimental sections at the second roadway project.

The first experimental section of the first roadway was treated using the standard technique used by construction companies:

“[The standard systems] are narrow sediment traps in the ditch along with rock check dams,” McLaughlin said.

The team treated the second experimental section of the first roadway with fiber check dams (FCDs) consisting of straw wattles and coir logs.

And the team treated the third experimental section of the first roadway using fiber check dams with granulated PAM added to each. The PAM is granulated rather than liquid so that when it rains, it dissolves gradually, thickening the barrier of the dam and weighing down the sediments.

The team compared the three systems by measuring the turbidity, or the cloudiness caused by suspended sediment, of the water.

Turbidity is measured in nephelometric turbidity units (NTU). The EPA requires that turbidity from construction sites be less than 280 NTU. At the first site, the average turbidity of the storm-water runoff was 3,913 NTU with rock check dams; with fiber check dams plus PAM, the average was only 34 NTU.
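Putting those figures side by side, a quick calculation shows how large the improvement is and how each system fares against the 280 NTU limit (at this site the drop works out to roughly 99 percent):

```python
ROCK_CHECK_DAMS_NTU = 3913   # average storm-water turbidity with rock check dams
FCD_PLUS_PAM_NTU = 34        # average with fiber check dams plus PAM
EPA_LIMIT_NTU = 280

reduction = (ROCK_CHECK_DAMS_NTU - FCD_PLUS_PAM_NTU) / ROCK_CHECK_DAMS_NTU
print(f"turbidity reduction: {reduction:.0%}")
print("rock check dams meet the limit:", ROCK_CHECK_DAMS_NTU <= EPA_LIMIT_NTU)
print("FCDs plus PAM meet the limit:", FCD_PLUS_PAM_NTU <= EPA_LIMIT_NTU)
```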

"Because we did this research before the rule was issued, we have good confidence that we can train the industry to attain the turbidity goal following key elements we have determined are necessary based on our research," McLaughlin said.

McLaughlin et al. published their research, titled “Improving construction site runoff quality with fiber check dams and polyacrylamide,” in the Journal of Soil and Water Conservation 64(2):144-154.

http://www.swcs.org

http://www.jswconline.org.www.lib.ncsu.edu:2048/content/64/2/144.full.pdf+html

Monday, April 5, 2010

Justice is blind--but can it be deaf?

People accept that it’s wrong to judge based on skin color, but most people think that there’s a right (read: white, middle class, “standard”) way to speak.

If I were reading this text aloud to you, your brain could use the sound of my voice—the way I combined vowels and consonants, the subtle shifts in inflection, the vocabulary I used, the order of those words, and maybe even the frequencies I produced—to determine my ethnicity. And your brain would very likely get it right.

Can you tell “what” I am from the slant of this typeface? I didn’t think so.

The field of sociolinguistics studies how language is socially constructed—that is, how who we are determines how we say what we say. And any good sociolinguist will tell you that, as listeners, we place almost as much value on the “how” as we do on the “what”: we judge people according to how they sound. Speech is as important as skin color to our assessments of—and potential discrimination against—people who are different from us. Sociolinguists call such discrimination linguistic profiling.



(Linguistic profiling can be used for more than just discrimination. Voices can be analyzed like fingerprints and used to ID individuals; such analysis can solve crime or open doors with voice recognition technology.)

But sociolinguists have a hard job. It’s nearly impossible to sort out all of the variables that contribute to our judgments about who someone is. Erik Thomas, Jeff Reaser, and Walt Wolfram at NCSU have spent parts of their careers trying to narrow it down. In a forthcoming chapter in Linguistic Profiling and Linguistic Human Rights, Thomas reviews several decades’ worth of scholarship, and the big conclusion is…a question mark. We know that people can make good guesses about ethnicity from phonological features—sounds alone, no syntax or vocabulary clues—but we don’t know which phonological features are the key.

Thomas writes that studies—most of which use recorded samples of African American (or “black”) speech as well as other control samples to survey listeners—have determined that there are prototypical features of African American speech. Sociolinguists call speech that contains such features “marked” speech. Studies also show that non-prototypical black speech (i.e., the speech of black speakers from isolated southern towns, which shows more marked features typical of southern, not black, speech) is less recognizable to white listeners. But more prototypical black speech—even without marked words and syntax—is recognizable.

The key, says Thomas, is in the way that African American speakers use vowels and consonants and in the rhythm and intonation—or prosody—of their speech. But what’s the recipe? What features, when absent, erase the speaker’s race?

In 2004, Thomas and his colleague Jeff Reaser collected samples from African American speakers in Hyde County, North Carolina—a region where Black speech contains fewer prototypical features. They then used a computer program that allowed them to modulate vowel sounds to tweak, or monotonize, prototypical /o/ sounds. When they compared listener responses to the monotonized /o/s with responses to the unchanged samples containing “marked” /o/s, they found that the vowel helped listeners identify African American speakers. (A control group from the Piedmont, where Black speech has many more prototypical features--sounds, words, and syntax--was much easier for listeners to identify.)

A follow-up study using different analytical methods and extending the listener sample to include West Virginia University students as well as NCSU students is forthcoming. Its results imply that listeners use different cues to determine the ethnicity of males and females.

Sunday, April 4, 2010

Slow-growing Minds May Imply Adult Schizophrenia

Did you ever have trouble paying attention as a child? Maybe difficulty processing new ideas or information? Based on the title and these two open-ended questions, you're probably thinking: “YES! I did! Does this mean I'm destined to develop schizophrenia?” Try to dislodge your heart from your throat and then take a deep breath. It's very unlikely. However, a recent study shows that these characteristics in children may provide clues toward the early detection of adulthood schizophrenia. Researchers from Duke University published an article in the American Journal of Psychiatry describing a long-term study of the growing minds of children, with the objective of finding a correlation between a child's cognitive shortcomings and their likelihood of developing schizophrenia in adulthood.

The study drew on 1,000 participants from New Zealand, all born in 1972 or 1973, who were tested periodically throughout their growing years for cognitive impairments characteristic of schizophrenia. Participants who were considered schizophrenic in their adult years had shown related developmental difficulties as early as age 7. These difficulties involved visual learning, working memory, processing speed, attention span, verbal reasoning, and the solving of visual-spatial problems.

To frame this research, the authors acknowledge three fundamental yet unanswered questions: What is the developmental course of schizophrenia prior to its onset? Do different cognitive characteristics follow similar or different developmental paths? And are there developmental difficulties specific to schizophrenia?

Testing for these impairments began at age 3 and continued until age 13 at two-year intervals, as part of the Dunedin Multidisciplinary Health and Development Study. Furthermore, between the ages of 7 and 13, a loss of between 0.17 and 0.26 in mental age was apparent in children who would later be diagnosed with schizophrenia.

By the age of 32, 2.5 percent of the participants monitored met the diagnostic criteria for schizophrenia, and 1 percent met the formal criteria; only the participants who met the formal criteria were hospitalized and put on antipsychotic medication. The researchers' conclusions consisted of two findings evident in the childhood and early adolescence of the children who grew up to develop schizophrenia: upon entering primary school they struggled with verbal reasoning, and as they aged they lagged behind their peers in working memory, attention, and processing speed.

Now what does all this mean? Schizophrenia is not a sudden occurrence; it is related to a child's development. Children who developed schizophrenia lagged behind their peers in school and continued to do so. Initially their verbal skills were poor, and as they aged they developed other hindrances to learning. Their minds did grow, but their difficulty making sense of the world pushed them toward social isolation and delusion. This study offers a unique view not so much into why schizophrenia occurs as into what traits are shared by those diagnosed. Future research will focus on the causes of the disease in light of these stages of human development.

Friday, April 2, 2010

Feeling good: the perception of what is “pleasant” to the touch

Something strokes your skin and you shiver with pleasure. Or is it displeasure? Whether it is pleasant or unpleasant, touch forms a cornerstone of social behavior in humans. Before we even opened our eyes as babies, we tried to make sense of our surroundings through our skins. We touch to learn, to show affection and feel pleasure.

It is important for us to understand how the touch of various materials influences our emotional responses, as the sense of touch is one of the major ways we, as humans, gather information about the world. Now, scientists have revealed that males and females have different perceptions of what is pleasant to the touch.

Previous investigations of pleasant touch identified soft and smooth materials as pleasant, and stiff, rough, or coarse ones as unpleasant. However, these investigations did not have any numbers to back them up, nor did they consider the possibility of differences between males and females.

A research team led by Greg Essick of UNC Chapel Hill in North Carolina decided to fill in these gaps in the knowledge of tactile stimulation, or what they call “pleasantness-to-touch.” Their study, published online in Neuroscience & Biobehavioral Reviews on February 21, 2010, sought to make a quantitative assessment of people’s perceptions of pleasant touch. In other words, by using controlled experiments and gathering concrete data, the scientists could identify which materials, and which body sites, females perceive differently from males.

In the experiment, a rotary tactile stimulator (pictured below) was used to control how the materials brushed across the skin while participants entered their “pleasantness rating” on a scale from 100% unpleasant to 100% pleasant. “Pleasantness” of contact was recorded as affective touch, or touch that evoked a positive emotional response to tactile stimulation.




There were 21 male and 22 female participants. Over 16 stimulus trials per participant, the researchers obtained ratings of the pleasantness of different textured materials stroked across the skin at multiple body sites, at controlled velocities and forces of application.
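To picture how such ratings get summarized, here is a small sketch with made-up numbers. The materials, body sites, and values are placeholders, not the study’s data; the only assumption carried over is the rating scale, rendered here as -100 (fully unpleasant) to +100 (fully pleasant):

```python
from collections import defaultdict
from statistics import mean

# (material, body site, rating on the -100 unpleasant to +100 pleasant scale)
trials = [
    ("velvet", "forearm", 62), ("velvet", "forearm", 55),
    ("denim", "forehead", -20), ("denim", "forehead", -35),
    ("terry toweling", "thigh", -10), ("terry toweling", "thigh", -25),
]

by_condition = defaultdict(list)
for material, site, rating in trials:
    by_condition[(material, site)].append(rating)

for (material, site), ratings in sorted(by_condition.items()):
    print(f"{material} on the {site}: mean pleasantness {mean(ratings):+.1f}")
```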

Their data supports previous results that smooth stimuli are pleasant, and that they continue to be pleasant even with increased force. Conversely, rough stimuli start out neutral and become more unpleasant as the force is increased.

The most unexpected findings were that males found stimulation of the forehead, particularly for terry toweling and denim, unpleasant, whereas female participants found terry toweling and denim to be unpleasant on the hand and thigh. For materials that were more pleasant, gendered-responses were similar when tested on the hand, forearm, and thigh. Interestingly, male participants found stimulation of the calf more pleasant than female participants.

This was the first study to conclude, based on data, that there are differences between male and female responses to unpleasant materials. The researchers believe this phenomenon is most likely due to sex-dependent mechanical responses of the skin.

We all know that males and females have areas that are more sensitive than others, but who would have thought there are some pretty mundane differences in addition to the sexual ones that usually come to mind?

It is only a first step, but as we continue to learn about, and assign values to, the different emotional responses to touch, we can begin to understand the psychology of why people are attracted by some things and repulsed by others.