Category Archives: Science

20th Century sea level rise underestimated?

Sea level change resulting from Greenland ice melt, derived from NASA GRACE measurements. Black circles show locations of the best historical water level records, which underestimate global average sea level rise due to Greenland melt by about 25 percent. Credits: University of Hawaii/NASA-JPL/Caltech


The world’s coastal regions have been submerging even faster than we thought, according to a new study which finds that the measuring devices used to calculate the rise may have given readings lower than the real rate of rise.

From NASA:

A new NASA and university study using NASA satellite data finds that tide gauges — the longest and highest-quality records of historical ocean water levels — may have underestimated the amount of global average sea level rise that occurred during the 20th century.

A research team led by Philip Thompson, associate director of the University of Hawaii Sea Level Center in the School of Ocean and Earth Science and Technology, Manoa, evaluated how various processes that cause sea level to change differently in different places may have affected past measurements. The team also included scientists from NASA’s Jet Propulsion Laboratory, Pasadena, California, and Old Dominion University, Norfolk, Virginia.

“It’s not that there’s something wrong with the instruments or the data,” said Thompson, “but for a variety of reasons, sea level does not change at the same pace everywhere at the same time. As it turns out, our best historical sea level records tend to be located where 20th century sea level rise was most likely less than the true global average.”

One of the key processes the researchers looked at is the effect of “ice melt fingerprints,” which are global patterns of sea level change caused by deviations in Earth’s rotation and local gravity that occur when a large ice mass melts. To determine the unique melt fingerprint for glaciers, ice caps and ice sheets, the team used data from NASA’s Gravity Recovery and Climate Experiment (GRACE) satellites on Earth’s changing gravitational field, and a novel modeling tool (developed by study co-author Surendra Adhikari and the JPL team) that simulates how ocean mass is redistributed due to ice melting.

Continue reading

The upper class pays less attention to others

Class status really does color our perceptions of others.

It even colors whether or not we even perceive others, according to some fascinating new research.

The findings offer a clue to the impacts of sharply widening class divisions in a world where neoliberalism has become the dominant political paradigm: a model, backed by the wealthy, in which the needs of others are simply ignored to justify the concentration of wealth.

From the Association for Psychological Science:

The degree to which other people divert your attention may depend on your social class, according to new findings published in Psychological Science [$35 to access], a journal of the Association for Psychological Science.

The research shows that people who categorize themselves as being in a relatively high social class spend less time looking at passersby compared with those who aren’t as well off, a difference that seems to stem from spontaneous processes related to perception and attention.

“Across field, lab, and online studies, our research documents that other humans are more likely to capture the attention of lower-class individuals than the attention of higher-class individuals,” says psychological scientist Pia Dietze of New York University. “Like other cultural groups, social class affects information processing in a pervasive and spontaneous manner.”

Previous studies have shown a variety of behavioral differences among people of various social classes — including levels of compassion, interpersonal engagement, charity, ethicality, and empathy toward others. Dietze and co-author Eric Knowles wondered whether these discrepancies might stem, at least in part, from deep, culturally ingrained differences in the way people process information.

The researchers hypothesized that our social class affects how relevant others are to us in terms of our own goals and motivations. Compared with people who come from less-advantaged circumstances, people from relatively privileged backgrounds are likely to be less dependent on others socially; as such, they are less likely to view other people as potentially rewarding, threatening, or otherwise worth paying attention to. Importantly, Dietze and Knowles posited that this difference in what they call “motivational relevance” is so fundamental that it manifests in basic cognitive processes — like visual attention — that operate quickly and involuntarily.

Continue reading

Many cloud hosting services plagued by malware

This map shows locations where the impacts of bad repositories (Bars) occur. (Credit: Xiaojing Liao, Georgia Tech)


Cloud hosting services, providers that host your data in third-party data centers so you can save space on your hard drives, like to boast that they’re more secure than your own computer.

But that may not be the case, according to new research from three major U.S. universities.

Fortunately for U.S. users, American servers seem to have the fewest infections, but that’s not the case in many other countries.

From the Georgia Institute of Technology:

A study of 20 major cloud hosting services has found that as many as 10 percent of the repositories hosted by them had been compromised – with several hundred of the “buckets” actively providing malware. Such bad content could be challenging to find, however, because it can be rapidly assembled from stored components that individually may not appear to be malicious.

To identify the bad content, researchers created a scanning tool that looks for features unique to the bad repositories, known as “Bars.” The features included certain types of redirection schemes and “gatekeeper” elements designed to protect the malware from scanners. Researchers from the Georgia Institute of Technology, Indiana University Bloomington and the University of California Santa Barbara conducted the study.

Believed to be the first systematic study of cloud-based malicious activity, the research will be presented October 24 at the ACM Conference on Computer and Communications Security in Vienna, Austria. The work was supported in part by the National Science Foundation.

“Bad actors have migrated to the cloud along with everybody else,” said Raheem Beyah, a professor in Georgia Tech’s School of Electrical and Computer Engineering. “The bad guys are using the cloud to deliver malware and other nefarious things while remaining undetected. The resources they use are compromised in a variety of ways, from traditional exploits to simply taking advantage of poor configurations.”

Beyah and graduate student Xiaojing Liao found that the bad actors could hide their activities by keeping components of their malware in separate repositories that by themselves didn’t trigger traditional scanners. Only when they were needed to launch an attack were the different parts of this malware assembled.

“Some exploits appear to be benign until they are assembled in a certain way,” explained Beyah, who is the Motorola Foundation Professor and associate chair for strategic initiatives and innovation in the School of Electrical and Computer Engineering. “When you scan the components in a piecemeal kind of way, you only see part of the malware, and the part you see may not be malicious.”

In the cloud, malicious actors take advantage of how difficult it can be to scan so much storage. Operators of cloud hosting services may not have the resources to do the deep scans that may be necessary to find the Bars – and their monitoring of repositories may be limited by service-level agreements.

While splitting the malicious software up helped hide it, the strategy also created a technique for finding the “bad buckets” hosting it, Beyah said. Many of the bad actors had redundant repositories connected by specific kinds of redirection schemes that allowed attacks to continue if one bucket were lost. The bad buckets also usually had “gatekeepers” designed to keep scanners out of the repositories, and where webpages were served, they had simple structures that were easy to propagate.
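The detection strategy described above — flagging repositories that combine multi-hop redirection schemes, “gatekeeper” code, and redundant backup buckets — can be sketched as a simple feature-based heuristic. The feature names, thresholds, and scoring below are hypothetical stand-ins for illustration; the study’s actual scanner and feature set are not detailed in this excerpt.

```python
# Toy heuristic in the spirit of the researchers' "bad bucket" scanner.
# All feature names and thresholds here are invented for illustration.

def looks_like_bad_bucket(repo):
    """Score a repository record on features the study associates with
    malicious cloud hosting: redirection chains between buckets,
    gatekeeper code that blocks scanners, and redundant sibling buckets
    kept for failover."""
    score = 0
    if repo.get("redirect_chain_length", 0) >= 2:
        score += 1  # multi-hop redirects linking buckets together
    if repo.get("has_gatekeeper", False):
        score += 1  # cloaking logic designed to keep scanners out
    if repo.get("redundant_siblings", 0) > 0:
        score += 1  # backup buckets so attacks survive a takedown
    return score >= 2  # two or more signals -> flag for deep scan

suspicious = {
    "redirect_chain_length": 3,
    "has_gatekeeper": True,
    "redundant_siblings": 2,
}
benign = {"redirect_chain_length": 0, "has_gatekeeper": False}

print(looks_like_bad_bucket(suspicious))  # True
print(looks_like_bad_bucket(benign))      # False
```

The point of the sketch is the inversion Beyah describes: the very redundancy and cloaking that hide the malware become the fingerprint that identifies the repositories hosting it.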

Continue reading

Billions in health costs from plastic bottles, cans


For several years we’ve been posting about the grave health dangers posed by the flood of chemicals we’ve poured into our world, particularly compounds capable of mimicking natural chemicals critical to our welfare and manufactured by the body’s endocrine system.

These so-called endocrine disruptors have been linked to a whole host of afflictions, ranging from cancer and obesity to ADHD, fetal abnormalities, and social isolation.

And now, for the first time, comes a report detailing the huge financial costs these chemicals are imposing on our already staggered healthcare system.

From New York University’s Langone Medical Center:

Annual healthcare costs and lost earnings in the United States from low-level but daily exposure to hazardous chemicals commonly found in plastic bottles, metal food cans, detergents, flame retardants, toys, cosmetics, and pesticides exceed $340 billion, according to a detailed economic analysis by researchers at NYU Langone Medical Center.

The investigators who performed the calculations say the massive toll from everyday contact with endocrine-disrupting chemicals amounts to more than 2.3 percent of the country’s gross domestic product.

Included in the team’s analysis, described online [$31.50 to read] October 17 in The Lancet Diabetes & Endocrinology, are estimated costs from more than 15 medical conditions linked by previous research to toxic levels of these chemicals. Scientists say chemical exposure occurs through gradual ingestion and buildup of these toxins as consumer products are used and break down.

According to researchers, endocrine-disrupting chemicals have for decades been known to pose a danger to human health because the compounds can interfere with natural hormone function. Such chemicals include bisphenol A (BPA), commonly used to line tin food cans; phthalates, used in the manufacture of plastic food containers and many cosmetics; polychlorinated biphenyl (PCB)-like polybrominated diphenyl ethers, or PBDEs, found in flame retardants in furniture and packaging; and pesticides, such as chlorpyrifos and organophosphates.

However, the researchers say their new analysis, which took three years to complete, is the first U.S. assessment of the costs associated with routine endocrine-disrupting chemical exposure and resulting increases not only in rates of neurological and behavioral disorders, but also in rates of male infertility, birth defects, endometriosis, obesity, diabetes, and some cancers, as well as diminished IQ scores.

More, including a video report, after the jump. . . Continue reading

The universe abruptly grows a lot more populated

That’s because astronomers had been underestimating the number of galaxies in the known universe by a factor of ten.

And that means a radical increase in the number of solar systems with planets capable of supporting life.

From the Space Telescope Science Institute in Baltimore:

The universe suddenly looks a lot more crowded, thanks to a deep-sky census assembled from surveys taken by NASA’s Hubble Space Telescope and other observatories.

Astronomers came to the surprising conclusion that there are at least 10 times more galaxies in the observable universe than previously thought.

The results have clear implications for galaxy formation, and also help shed light on an ancient astronomical paradox — why is the sky dark at night?

In analyzing the data, a team led by Christopher Conselice of the University of Nottingham, U.K., found that 10 times as many galaxies were packed into a given volume of space in the early universe than found today. Most of these galaxies were relatively small and faint, with masses similar to those of the satellite galaxies surrounding the Milky Way. As they merged to form larger galaxies the population density of galaxies in space dwindled. This means that galaxies are not evenly distributed throughout the universe’s history, the research team reports in a paper to be published in The Astrophysical Journal.

“These results are powerful evidence that a significant galaxy evolution has taken place throughout the universe’s history, which dramatically reduced the number of galaxies through mergers between them — thus reducing their total number. This gives us a verification of the so-called top-down formation of structure in the universe,” explained Conselice.

One of the most fundamental questions in astronomy is that of just how many galaxies the universe contains. The landmark Hubble Deep Field, taken in the mid-1990s, gave the first real insight into the universe’s galaxy population. Subsequent sensitive observations such as Hubble’s Ultra Deep Field revealed a myriad of faint galaxies. This led to an estimate that the observable universe contained about 200 billion galaxies. The new research shows that this estimate is at least 10 times too low.

Conselice and his team reached this conclusion using deep-space images from Hubble and the already published data from other teams. They painstakingly converted the images into 3-D, in order to make accurate measurements of the number of galaxies at different epochs in the universe’s history. In addition, they used new mathematical models, which allowed them to infer the existence of galaxies that the current generation of telescopes cannot observe. This led to the surprising conclusion that in order for the numbers of galaxies we now see and their masses to add up, there must be a further 90 percent of galaxies in the observable universe that are too faint and too far away to be seen with present-day telescopes. These myriad small faint galaxies from the early universe merged over time into the larger galaxies we can now observe.
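The extrapolation step — inferring galaxies too faint to detect from the shape of the galaxy population’s mass distribution — can be illustrated with a toy power-law calculation. The slope and mass cutoffs below are hypothetical stand-ins, not the paper’s fitted models, which were built from deep multi-epoch observations.

```python
# Toy sketch of the counting argument: if the galaxy mass function
# rises steeply toward low masses (a power law with faint-end slope
# around -1.9 is a common low-mass approximation; the study's actual
# models are far more elaborate), then lowering the detection limit
# multiplies the census many times over.

ALPHA = -1.9   # assumed faint-end slope (hypothetical value)
M_MAX = 1e11   # upper mass cutoff in solar masses (toy value)

def count_above(m_limit, alpha=ALPHA, m_max=M_MAX):
    """Closed-form integral of the power-law mass function m**alpha
    from m_limit up to m_max -- proportional to a galaxy count."""
    p = alpha + 1
    return (m_max**p - m_limit**p) / p

observed = count_above(1e8)  # galaxies massive enough to detect today
total = count_above(1e6)     # including those below the detection limit
print(f"census grows by a factor of {total / observed:.0f}")
```

Because the mass function climbs so steeply at the faint end, almost all galaxies by number sit below any realistic detection limit — which is how a deeper census can dwarf the visible one, exactly the effect the team quantified.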

Continue reading

Map of the day: Spread of a deer-killing disease

A lethal plague is spreading through the deer population of North America, a disease threatening to annihilate a critical species in an already endangered ecosystem.

Dubbed Chronic Wasting Disease [CWD], the ailment is similar to Creutzfeldt-Jakob disease, the human counterpart of mad cow disease, and like that affliction, it attacks the brain and nervous system, resulting in erratic behavior and a wasting away of bodily tissues.

Both diseases appear to be caused by prions, misfolded proteins far smaller than viruses.

From the U.S. Geological Survey:

Chronic wasting disease may have long-term negative effects on white-tailed deer, a highly visible and economically valuable keystone species, according to a new study from the USGS and published in Ecology [$38 to read].

CWD is an always-fatal neurological disease of the deer family, scientifically referred to as cervids, which includes white-tailed deer, mule deer, elk and moose. The disease is an internationally significant wildlife management issue for free-ranging and captive white-tailed deer. Originally described in captive mule deer about 35 years ago in Colorado, CWD has now been discovered in 24 states, two Canadian provinces, the Republic of Korea and Norway.

“Despite the health threat of CWD to deer populations, little is known about the rates of infection and mortality caused by this disease,” said Dr. Michael D. Samuel, USGS wildlife biologist and lead author on the report.

Researchers used mathematical models to estimate infection and mortality for white-tailed deer in Wisconsin and Illinois outbreaks. They found that adult male deer have three times the risk of CWD infection than female deer and males also have higher disease mortality than females.

“Additional research is needed to more fully understand how CWD is transmitted to healthy deer and the potential long-term impact of the disease on North American deer populations,” said Samuel. USGS scientists found that CWD-associated deaths can cause substantial reductions in deer populations in areas where CWD is not addressed.

Scientific understanding of the ecology and transmission of CWD in free-ranging wildlife is limited, but this information is critical for making management decisions and helping to better understand the ecology of CWD in free-ranging populations.

The paper, “Chronic wasting disease in white-tailed deer: infection, mortality, and implications for heterogeneous transmission,” was published in Ecology and authored by Michael D. Samuel, USGS, Wisconsin Cooperative Wildlife Research Unit, University of Wisconsin; Daniel Storm, Department of Forest and Wildlife Ecology, University of Wisconsin, and currently with the Wisconsin Department of Natural Resources.

Southwest U.S. megadroughts loom as Earth warms

Maps of megadrought risk for the American Southwest under different levels of warming, and the required increase in precipitation to compensate for that warming. From the study [see below].


It’s the nightmare everyone should fear, and it’s almost inevitable.

From Cornell University:

As a consequence of a warming Earth, the risk of a megadrought – one that lasts more than 35 years – in the American Southwest likely will rise from a low chance over the past thousand years to a 20 to 50 percent chance in this century. However, by slashing greenhouse gas emissions, these risks are nearly cut in half, according to a Cornell-led study in Science Advances, Oct. 5.

“Megadroughts are rare events, occurring only once or twice each millennium. In earlier work, we showed that climate change boosts the chances of a megadrought, but in this paper we investigated how cutting fossil fuel emissions reduces this risk,” said lead author Toby Ault, Cornell professor of earth and atmospheric science.

If climate change goes unabated – and causes more than a 2 degree Celsius increase in atmospheric temperature – megadroughts will become very probable, Ault said.

“The increase in risk is not due to any particular change in the dynamic circulation of the atmosphere,” Ault said. “It’s because the projected increase in atmospheric demand for moisture from the land surface will shift the soil moisture balance. If this happens, megadroughts will be far more likely for next millennium.”

Ault explained a natural “tug-of-war” governing the surface moisture balance between the precipitation supply (rain) and evaporation (transpiration). But he cautions that increases in average regional temperatures could be so dramatic – more than 4 degrees Celsius (7.2 degrees Fahrenheit) – that evaporation wins out. This, in turn, dries out the land surface and makes megadroughts 70 to 99 percent likely.
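The supply-versus-demand “tug of war” Ault describes can be sketched with a toy calculation. The key assumption here — that evaporative demand grows roughly 7 percent per degree Celsius of warming, a rough Clausius-Clapeyron-style scaling — is a hypothetical stand-in for illustration, not the study’s actual hydrological model.

```python
# Toy moisture-balance sketch: how much must precipitation (supply)
# increase to offset rising evaporative demand under warming?
# The 7%-per-degree demand scaling is an illustrative assumption.

DEMAND_GROWTH_PER_DEG_C = 0.07  # assumed fractional rise in demand per deg C

def required_precip_increase(warming_deg_c):
    """Fractional precipitation increase needed to keep the
    supply/demand moisture balance unchanged as demand compounds
    with each degree of warming."""
    return (1 + DEMAND_GROWTH_PER_DEG_C) ** warming_deg_c - 1

for dt in (2, 4, 6):
    print(f"+{dt} C warming -> precipitation must rise "
          f"{required_precip_increase(dt):.0%} to hold the balance")
```

Under these toy numbers, even modest warming demands a double-digit percentage boost in rainfall just to break even — which is why, in the study’s scenarios, evaporation “wins out” unless emissions are cut.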

“We found that megadrought risk depends strongly on temperature, which is somewhat good news,” Ault said. “This means that an aggressive strategy for cutting greenhouse gas emissions could keep regional temperature changes from going beyond about 2 degrees Celsius (3.6 degrees Fahrenheit).”

This lower average warming figure cuts the megadrought risk almost in half, he said.

These tug-of-war scenarios could very well play out in the American Southwest, according to tree ring and geologic records. During sequences of exceptionally dry years, those rings tend to be relatively narrower than in wet years, he said.

“Tree rings from the American Southwest provide evidence of megadroughts, as there are multiple decades when growth is suppressed by dry conditions,” Ault said, pointing to several megadroughts that occurred in North America between A.D. 1100 and 1300.

“We also know they have occurred in other parts of the world, and they have been linked to the demise of several pre-industrial civilizations,” he said.

The tug-of-war between moisture supply and demand might play out differently in other parts of the world, Ault said.

“Nonetheless, even in the Southwest we found examples of plausible 21st-century climates where precipitation increases, but megadroughts still become more likely,” said Ault, who noted the normally verdant Northeast is in the middle of a drought. “This should serve as a cautionary note for areas like the Northeast expecting to see a more-average moisture supply.

“Megadrought risks are still likely to be higher in the future than they were in the past,” he said. “Hence, efficient use of water resources in the drought-stricken American Southwest is likely to help that region thrive during a changing climate.”

On the paper, “Relative Impacts of Mitigation, Temperature, and Precipitation on 21st-Century Megadrought Risk in the American Southwest [open access],” Ault is joined by Justin S. Mankin and Benjamin Cook, both of the NASA Goddard Institute for Space Studies, and Jason E. Smerdon of Columbia University. The National Science Foundation supported this research.