Posts tagged with "Science Daily"

Shocker: The racial observation, ‘You all look alike to me,’ is hard-wired into our brains

July 10, 2019

It’s a phenomenon that occurs when a member of one race looks at an individual of another. You often hear it framed as a joke (“You all look alike to me”), although it’s really nothing to laugh at, since it’s a form of stereotyping, and even prejudice.

Indeed, scientists at the University of California-Riverside recently discovered that we are hard-wired to process—or not to process—facial differences, based on race. And that tendency occurs in the earliest filters of our thought processes.

The one caveat: Those of the majority race experience the most pronounced difficulty individuating those of other races. The scientists hypothesize that minorities are exposed more frequently to the features of the majority race and, therefore, may be able to individuate more easily.

The research, published this week in the journal Proceedings of the National Academy of Sciences of the United States of America (PNAS), was led by UC-Riverside psychologist Brent Hughes. The overriding question posed in the paper: When we observe members of another racial group, do their physical features actually register as less distinct?

The study’s participants, 17 white people, viewed white and black faces on a monitor while lying inside a functional MRI scanner, which detects changes in brain activity, according to a report by Science Daily. Some experiments were also conducted outside of the MRI.

Hughes and his team looked at the white participants’ high-level visual cortex to see whether it was more tuned in to differences in white faces than black ones. The visual cortex is the first stop for processing impulses from the eyes; the high-level visual cortex specializes in processing faces.

Their findings affirmed previous studies, determining that participants showed a greater tendency to individuate (recognize differences in) the faces of individuals of their own race, and were less able to recognize the differences among people of other races. But Hughes’ study went further, demonstrating how deep this tendency runs: as far as our earliest sensory processes.

“Our results suggest that biases for other-race faces emerge at some of the earliest stages of sensory perception,” Hughes wrote in the paper, entitled “Neural adaptation to faces reveals racial outgroup homogeneity effects in early perception.”

Hughes wrote that the fallout from noticing the differences in members of one’s own race but not others is profound. These early perceptions can cascade, affecting downstream beliefs and behaviors. The implications can range from embarrassing to life-changing: Think of when the wrong suspect in a crime is selected from a lineup.

“We are much more likely to generalize negative experiences if we see individuals as similar or interchangeable parts of a broad social group,” Hughes said.

“Members of minority groups wind up being exposed to more members of majority groups than majority members get exposed to minority members,” he said. “It could be that exposure to individuals of different groups may help the visual system develop expertise that reduces this effect.”

The study shouldn’t be interpreted as a pass for “you all look the same to me,” Hughes said.

“These effects are not uncontrollable,” he said. “These race biases in perception are malleable and subject to individual motivations and goals. In this sense, attitudes, motives and goals can be shaping visual perceptual processes.”

Research contact: brent.hughes@ucr.edu

Full house, fuller life: The more children a woman has, the more slowly she ages

February 22, 2019

Just as the U.S. birthrate has hit an all-time low (12.2 newborns per 1,000 people during 2016, according to the Centers for Disease Control and Prevention), a study has found that the more children a woman has, the more youthful she remains.

It’s counterintuitive, considering all the mental and physical stress mothers experience. However, a study conducted in 2016 by researchers at Canada’s Simon Fraser University indicates that the higher the number of children a woman gives birth to, the more slowly her body will age, People magazine reported on February 20.

Specifically, the study, led by health sciences professor Pablo Nepomnaschy and postdoctoral researcher Cindy Barha, found that women who gave birth to more surviving children exhibited longer telomeres. Telomeres are the protective caps found at the ends of chromosomes and are indicative of cellular aging. Longer telomeres are integral to cell replication and are associated with longevity.

Some common physical examples of this can be seen in skin and hair, which are most affected by telomere shortening because their cells divide the most often, the study found.

The researchers actually did not study a cohort of U.S. women. Instead, they evaluated a pool of 75 indigenous Guatemalan women over the course of 13 years—finding that the women who gave birth to more children had longer telomeres.

Hormones may play a role in the anti-aging results, Nepomnaschy recently told Science Daily, noting, “The slower pace of telomere shortening found in the study participants who have more children … may be attributed to the dramatic increase in estrogen, a hormone produced during pregnancy.”

This makes sense, he said, because “Estrogen functions as a potent antioxidant that protects cells against telomere shortening.”

Another plausible explanation for why having a higher number of children slows down the shortening of telomeres is the social environment it creates. Nepomnaschy explained that women with more kids tend to receive more support from outside sources, like relatives and friends, which can help increase the amount of metabolic energy that such mothers have. This, in turn, contributes to tissue upkeep and slows down aging.

Research contact: @SFU_FHS

Eye contact: As plain as the nose on your face

February 5, 2019

For years, we’ve been told that eye contact is essential to establishing relationships—and that failing to meet a hiring manager’s eyes during a job interview foils any chance of employment.

But just recently, researchers at Edith Cowan University in Perth, Australia, have found that simply looking somewhere between the forehead and the chin of a conversational partner will suffice, according to a February 5 report by Science Daily.

Yes, eye contact might be all in our heads.

Lead author Dr. Shane Rogers, a lecturer in the School of Arts and Humanities, believes that, for those of us who experience social anxiety when making eye contact, or when being engaged eye-to-eye, this finding will be welcome news.

“Maintaining strong eye contact is widely accepted to be an important communication skill in western cultures,” he notes. “People [have been led to] believe if you aren’t willing to engage in soul-to-soul mutual eye contact then you are at best lacking in confidence; at worst, untrustworthy.

“However, the reverence devoted to eye contact is not supported by scientific evidence,” he asserts.

During the course of the study, a researcher engaged in four-minute conversations with 46 participants, during which both parties wore Tobii eye-tracking glasses.

“For approximately half the conversations, the researcher looked at the eyes most of the time; for the other half, [they] gazed predominantly at the mouth,” Dr. Rogers explains.

Afterward, the participants rated how much they enjoyed the conversations.

“The mouth group perceived the same amount of eye contact and enjoyed the conversations just as much as the eye group,” Dr. Rogers comments.

According to Rogers, the results suggest that—when specifically focused on trying to determine the direction of a partner’s glance—people demonstrate a limited capacity to identify it accurately.

“People are not very sensitive to the specific gaze focus of their partner to their face; instead, they perceive direct gaze towards their face as eye contact,” Dr. Rogers says.

“So don’t get hung up on seeking out the eyes of your audience, just look generally at their face, and let the eye contact illusion experienced by your partner do the work for you,” he recommends.

Research contact: shane.rogers@ecu.edu.au

Schadenfreude: Feeling good about someone else’s bad luck

October 26, 2018

Embarrassing as it is to admit, few among us have not experienced schadenfreude, the German word for the sense of pleasure that people derive from the misfortune of others. This common, yet poorly understood, emotion may provide a valuable window into the darker side of humanity, according to psychologists at Emory University in Atlanta.

In a study to be published in New Ideas in Psychology in January, and covered on October 23 in Science Daily, the Emory researchers propose that schadenfreude comprises three separable but interrelated sensibilities—aggression, rivalry, and justice—and that people who experience schadenfreude feel a sense of detachment from the subject of their glee.

Indeed, says Shensheng Wang, a Ph.D. candidate in Psychology at Emory and the lead author of the paper, “Dehumanization appears to be at the core of schadenfreude. The scenarios that elicit schadenfreude, such as intergroup conflicts, tend to also promote dehumanization.”

Dehumanization can range from subtle forms, such as assuming that someone from another ethnic group does not feel the same full range of emotions that one’s in-group members do, all the way to blatant forms, such as equating sex offenders to animals.

“Our literature review strongly suggests that the propensity to experience schadenfreude isn’t entirely unique, but that it overlaps substantially with several other ‘dark’ personality traits, such as sadism, narcissism, and psychopathy,” comments co-author Philippe Rochat, who studies infant and child development. “Moreover, different subforms of schadenfreude may relate somewhat differently to these often malevolent traits.”

“Schadenfreude is an uncanny emotion that is difficult to assimilate,” Rochat says. “It’s kind of a warm-cold experience that is associated with a sense of guilt. It can make you feel odd to experience pleasure when hearing about bad things happening to someone else.”

Psychologists view schadenfreude through the lens of three theories:

  • Envy theory focuses on a concern for self-evaluation, and a lessening of painful feelings when someone perceived as enviable gets knocked down a peg.
  • Deservingness theory links schadenfreude to a concern for social justice and the feeling that someone dealt a misfortune received what was coming to them.
  • Intergroup-conflict theory concerns social identity and the schadenfreude experienced after the defeat of members of a rival group, such as during sporting or political competitions.

The Emory research study found that infants as young as eight months demonstrate a sophisticated sense of social justice. In experiments, they showed a preference for puppets who assisted a helpful puppet, and who punished puppets that had exhibited antisocial behavior. Research on infants also points to the early roots of intergroup aggression, showing that, by nine months, infants preferred puppets that punished others who were unlike themselves.

“When you think of normal child development, you think of children becoming good-natured and sociable,” Rochat says. “But there’s a dark side to becoming socialized. You create friends and other in-groups to the exclusion of others.”

Spiteful rivalry appears by at least age five or six, when research has shown that children will sometimes opt to maximize their gain over another child, even if they have to sacrifice a resource to do so.

By the time they reach adulthood, many people have learned to hide any tendencies for making a sacrifice just for spite, but they may be more open about making sacrifices that are considered pro-social.

The review article posits a unifying, motivational theory: Concerns of self-evaluation, social identity, and justice are the three motivators that drive people toward schadenfreude. What pulls people away from schadenfreude is the ability to perceive others as fully human and to feel empathy for them.

Ordinary people may temporarily lose empathy for others. But those with certain personality disorders and associated traits—such as psychopathy, narcissism, or sadism—either are less able or less motivated to put themselves in the shoes of others.

“By broadening the perspective of schadenfreude, and connecting all of the related phenomena underlying it, we hope we’ve provided a framework to gain deeper insights into this complex, multi-faceted emotion,” Wang says.

Research contact: shensheng.wang@emory.edu

Patients care what their doctors wear

June 26, 2018

A survey of 4,062 patients at ten major medical centers nationwide by Michigan Medicine at the University of Michigan has found that doctors should consider their white coats as much more than a fashion statement. In fact, one-third of respondents to the study, covered on June 6 by Science Daily, said that what a doctor wears influences their satisfaction with their care.

Based on the findings, the researchers have called for more hospitals, health systems, and practice groups to look at their dress standards for physicians, or to create them if they don’t already exist.

“Professional dress on Wall Street, law and nearly every other industry is relatively clear—and it typically mirrors what applicants would wear to their job interview,” says Christopher Petrilli, M.D., lead author of the study and an assistant professor of hospital medicine at the University of Michigan Medical School, who worked in the finance industry before entering medicine. “In medicine, the dress code is quite heterogeneous, but, as physicians, we should make sure that our attire reflects a certain level of professionalism that is also mindful of patients’ preferences.”

The study asked patients to look at pictures of male and female physicians in seven different forms of attire, and to think of them in both inpatient and outpatient clinical settings. For each photo, they rated the providers on how knowledgeable, trustworthy, caring, and approachable the physician appeared, and how comfortable the attire made the patient feel.

The options were:

  • Casual: Short-sleeved collared shirt and jeans with tennis shoes, with or without white coat;
  • Scrubs: Blue short-sleeved scrub top and pants, with or without white coat;
  • Formal: Light blue long-sleeved dress shirt and navy blue suit pants, with or without white coat; black leather shoes with one-inch heels for women, and black leather shoes with a dark blue tie for men; or
  • Business suit: Navy blue jacket and pants with the same dress shirt, tie and shoes as in the “formal” option, no white coat.

Formal attire with a white coat got the highest score on the composite of five measures, and was especially popular with people over age 65. It was followed by scrubs with a white coat, and formal attire without a white coat. Indeed, when asked directly what they thought their own doctors should wear, 44% of patients said they preferred formal attire with a white coat, and 26% said scrubs with a white coat. When asked what they would prefer surgeons and emergency physicians to wear, scrubs alone got 34% of the vote, followed by scrubs with a white coat at 23%.

The results were largely the same for physicians of either gender, except for male surgeons, whom patients tended to prefer in formal wear without a white coat.

The setting of care mattered, too. Sixty-two percent agreed or strongly agreed that when seeing patients in the hospital, doctors should wear a white coat, and 55% said the same for doctors seeing patients in an office setting. The percentage preferring a white coat fell to 44% for emergency physicians.

Though the surveys were conducted during business hours on weekdays, the researchers asked patients what they thought doctors should wear when seeing patients on weekends. In this case, 44% said the short-sleeved outfit with jeans was appropriate, although 56% were neutral or disapproved of such a look even on weekends.

Interestingly, patients in the Northeast (38%) and Midwest (40%) were less insistent on white coats and formal attire, compared with those in the West (50%) and in the South (51%). Northeasterners were more than twice as likely as southerners to prefer scrubs alone for surgeons.

“This is by far the largest study to date in this area. We used the expertise gained from our previous systematic review along with a panel of psychometricians, research scientists, choice architects, survey experts, and bioethicists to develop our study instrument. Given the size, methodological rigor and representativeness of these data, local, nuanced policies addressing physician attire should be considered to improve the patient experience,” says Petrilli, who treats patients in the hospitals of Michigan Medicine, U-M’s academic medical center, and holds a position at the VA Ann Arbor Healthcare System. He is a member of the U-M Institute for Healthcare Policy and Innovation.

The researchers note that, while physicians’ white coats, neckties, and sleeves have been shown to harbor infectious organisms (leading some countries to require physicians’ arms to be “bare below the elbow”), no studies have shown actual transmission of infection to patients through contact with physician attire.

However, other research has suggested that physicians may be more attentive to tasks when wearing their white coats, perhaps increasing patient safety.

“Patients appear to care about attire and may expect to see their doctor in certain ways, which may explain why even white lab coats received a high rating for ‘approachability’ — patients may see a white coat as similar to a physician’s ‘uniform’ and may similarly also expect formal attire in most settings,” notes Petrilli. “Patients don’t always have the opportunity to choose their doctor. In this era of appropriately increased focus on patient centeredness and satisfaction, physician attire may be an important, easily modifiable component of the patient care experience.”

Research contact: MichMedmedia@med.umich.edu