Category Archives: Science

Chart of the day: Heat of the oceans soars


The heat content in the upper 2,000 meters of the ocean from 1958 through 2020. The graph shows the departure from a baseline (the average temperature between 1981-2010), with red bars showing more heat than the baseline and blue bars showing less.

More from the National Center for Atmospheric Research [emphasis added]:

The temperatures in the upper 2,000 meters of the ocean hit a record high in 2020, according to a new analysis by a research team that included scientists from the National Center for Atmospheric Research (NCAR). The five hottest years for the upper ocean on record have all occurred since 2015.

The results of the new analysis, published in the journal Advances in Atmospheric Science, further illustrate how Earth is warming — just over 90 percent of the additional heat due to human-caused climate change is absorbed by the ocean. Ocean heat is a valuable indicator of climate change because it does not fluctuate as much as temperatures at the Earth’s surface, which can vary in response to weather and natural climate variations such as El Niño. Thus it more clearly reveals the gradual accumulation of heat due to human activities.

The increase in ocean temperatures can cause a number of societal impacts. Warmer ocean surface waters can, for example, supercharge hurricanes and other storms that travel over the sea. Warmer water also expands to take up more room, driving sea level rise and causing coastal flooding.

The uneven vertical heating of the ocean — the surface warms more quickly than the deeper ocean layers — also causes the ocean to become more stratified. This stratification inhibits ocean mixing and the distribution of dissolved oxygen and nutrients, which impacts marine ecosystems and fisheries. 

“The warming of the ocean has real consequences,” said NCAR scientist Kevin Trenberth, a co-author of the study. “Ocean heat has exacerbated many of the most significant climate-related events in recent history, and contributed to the record number of billion-dollar disasters in the United States in 2020.”

The new research was led by Lijing Cheng, of the Chinese Academy of Sciences. The NCAR contributors are Trenberth and John Fasullo.

The research was supported by the National Science Foundation, which is NCAR’s sponsor, as well as by the National Key R&D Program of China, the Chinese Academy of Sciences, NASA, and the U.S. Department of Energy. 

Ocean heat is challenging to analyze because direct measurements of the ocean’s temperature and other attributes can be few and far between. Scientists depend on models to help fill in the gaps between these measurements. However, a network of ocean floats deployed in the last couple of decades that move throughout the top 2,000 meters of the open ocean has provided researchers with valuable data for better understanding past measurements and calibrating their models.

For this analysis, the scientists used two different ocean heat datasets, one from the Institute of Atmospheric Physics, which is part of the Chinese Academy of Sciences, and one from the National Centers for Environmental Information, which is part of the U.S. National Oceanic and Atmospheric Administration.

While the two datasets yielded slightly different values for the globally integrated ocean heat in 2020, they both found that 2020 was the warmest year on record. The results also agree with results from independent research teams using slightly different methods and from independent data, such as satellite measurements of global sea level.

“Despite the challenges of measuring the entire ocean’s temperature, we can now say definitively that the ocean is warming and has been for decades,” Fasullo said. “In fact, each decade back to the 1970s has been discernibly warmer than the decade before, revealing an accumulation of heat that can only be explained by human activities.”

Linguists: How Trump speeches fueled insurrection


“Sticks and stones may break my bones but words will never hurt me.” — Children’s saying

“Sticks and stones may break my bones, but words will always hurt me. Bones mend and become actually stronger in the very place they were broken and where they have knitted up; mental wounds can grind and ooze for decades and be re-opened by the quietest whisper.” — Stephen Fry

No President has ever deployed violent language against his own people in the way Donald Trump has.

Samira Saramo, Kone Foundation Senior Researcher at the Migration Institute of Finland, examined Trump’s rhetoric in the 2016 election in The Meta-violence of Trumpism, research published in the European Journal of American Studies, an open-access academic journal.

Here’s a telling passage [emphasis added]:

Rather than denouncing violence, Trump frequently praised the “passion” and “energy” of his supporters, and he even promised to pay the legal fees of supporters caught in violent altercations. At a March 4, 2016 rally, he commented on a protestor’s removal: “Try not to hurt him. If you do, I’ll defend you in court. Don’t worry about it.” At times, Trump explicitly condoned the use of violence against protestors. On February 1, 2016, he stated: “If you see someone getting ready to throw tomatoes knock the crap out of them, would you? Seriously. OK. Just knock the hell… I promise you, I will pay for the legal fees.” Though Trump himself wished he could “punch [a protestor] in the face,” he recognized that such tactics were unpopular: “Part of the problem and part of the reason it takes so long [to remove protestors] is that nobody wants to hurt each other anymore.” Trump praised violent action against protestors: “I love the old days, you know? You know what I hate? There’s a guy totally disruptive, throwing punches. We’re not allowed to punch back anymore. I love the old days. You know what they used to do to guys like that when they were in a place like this? They’d be carried out on a stretcher, folks.”

Donald Trump: Aggressive Rhetoric and Political Violence, a more recent study published in October in the journal Perspectives on Terrorism, was authored by two Columbia University scholars, political scientist and journalist Brigitte L. Nacos and Wallace S. Sayre Professor of Government and International and Public Affairs Robert Y. Shapiro, along with Yaeli Bloch-Elkon, Senior Lecturer/Assistant Professor of Communications and Political Science at Bar Ilan University.

Here’s one key excerpt [emphasis added]:

Examining whether correlations existed between counties that were venues of Donald Trump’s 275 campaign rallies in 2016 and subsequent hate crimes, three political scientists found that “counties that had hosted a 2016 Trump campaign rally saw a 226 percent increase in reported hate crimes over comparable counties that did not host such a rally.” While cautioning that this “analysis cannot be certain it was Trump’s campaign rally rhetoric that caused people to commit more crime in the host county,” the researchers also found it “hard to discount a ‘Trump effect’ since data of the Anti-Defamation League showed “a considerable number of these reported hate crimes referenced Trump.” Moreover, investigative reporting identified 41 cases of domestic terrorism/hate crimes or threats thereof, in which the perpetrators invoked Trump favorably in manifestos, social media posts, police interrogations, or court documents. Almost all of this violence was committed by White males against minorities or politicians singled out frequently by Trump for rhetorical attacks, and journalists. The U.S. Press Freedom Tracker recorded a total of 202 attacks on U.S. journalists from 2017, Trump’s first year in office, through mid-2020.

Trumpspeak and the assault on the Capitol

And now another study parses Trump’s speeches in the lead-up to and the aftermath of the January 6 insurrection at the nation’s Capitol.

Two scholars from the University of Memphis, Roger J. Kreuz, Associate Dean and Professor of Psychology, and Leah Cathryn Windsor, Research Assistant Professor, parse presidential speech in a report for The Conversation, the open-access, plain-language academic journal:

How Trump’s language shifted in the weeks leading up to the Capitol riot – 2 linguists explain

On Jan. 6, the world witnessed how language can incite violence.

One after another, a series of speakers at the “Save America” rally at the Ellipse in Washington redoubled the messages of anger and outrage.

This rhetoric culminated with a directive by the president to go to the Capitol building to embolden Republicans in Congress to overturn the results of the 2020 election.

“Fight like hell,” President Donald Trump implored his supporters. “And if you don’t fight like hell, you’re not going to have a country anymore.”

Shortly thereafter, some of Trump’s supporters breached the Capitol.

Throughout his presidency, Trump’s unorthodox use of language has fascinated linguists and social scientists. But it wasn’t just his words that day that led to the violence.

Starting with a speech he made on Dec. 2 – in which he made his case for election fraud – we analyzed six public addresses Trump made before and after the riot at the Capitol building. The others were the campaign rally ahead of the runoff elections in Georgia, the speech he made at the “Save America” rally on Jan. 6, the videotaped message that aired later that same day, his denouncement of the violence on Jan. 7 and his speech en route to Texas on Jan. 12.

Together, they reveal how the president’s language escalated in intensity in the weeks and days leading up to the riots.

Finding patterns in language

Textual analysis – converting words into numbers that can be analyzed as data – can identify patterns in the types of words people use, including their syntax, semantics and vocabulary choice. Linguistic analysis can reveal latent trends in the speaker’s psychological, emotional and physical states beneath the surface of what’s being heard or read.

This sort of analysis has led to a number of discoveries.

For example, researchers have used it to identify the authors of The Federalist Papers, the Unabomber manifesto and a novel written by J.K. Rowling under a pseudonym.

Textual analysis continues to offer fresh political insights, such as its use to advance the theory that social media posts attributed to QAnon are actually written by two different people.
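
To make the idea concrete, here is a minimal sketch, in Python, of the kind of word-category counting that underlies this sort of textual analysis. The categories, word lists and sample sentences below are invented for illustration only; they are not drawn from the studies discussed here, which rely on much larger, validated dictionaries.

# A toy example of textual analysis: convert each text into
# per-100-word rates for a few hand-picked word categories.
# The category word lists and sample texts are hypothetical.
import re
from collections import Counter

CATEGORIES = {
    "action": {"fight", "win", "take", "march", "stop"},
    "positive_emotion": {"great", "love", "beautiful", "best"},
    "negative_emotion": {"bad", "weak", "disaster", "rigged"},
}

def category_rates(text):
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = max(len(words), 1)
    return {name: 100 * sum(counts[w] for w in vocab) / total
            for name, vocab in CATEGORIES.items()}

# Hypothetical snippets standing in for two speeches being compared.
speech_a = "We will fight and we will win, because they are weak."
speech_b = "What a great and beautiful day this is for everyone."

for label, text in (("speech_a", speech_a), ("speech_b", speech_b)):
    print(label, category_rates(text))

Expressing the counts as rates per 100 words makes speeches of different lengths comparable, which is why that kind of normalization shows up throughout this research.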

The ‘official’ sounding Trump

Contrary to popular thinking, Trump does not universally use inflammatory rhetoric. While he is well known for his unique speaking style and his once-frequent social media posts, in official settings his language has been quite similar to that of other presidents.

Researchers have noted how people routinely alter their speaking and writing depending on whether a setting is formal or informal. In formal venues, like the State of the Union speeches, textual analysis has found Trump to use language in ways that echo his predecessors.

In addition, a recent study analyzed 10,000 words from Trump’s and President-elect Joe Biden’s campaign speeches. It concluded – perhaps surprisingly – that Trump and Biden’s language was similar.

Both men used ample emotional language – the kind that aims to persuade people to vote – at roughly the same rates. They also used comparable rates of positive language, as well as language related to trust, anticipation and surprise. One possible reason for this could be the audience, and the persuasive and evocative nature of campaign speeches themselves, rather than individual differences between speakers.

The road to incitement

Of course, Trump has, at times, used overtly dire and violent language.

After studying Trump’s speeches before the storming of the Capitol building, we found some underlying patterns. If it seemed there was a growing sense of momentum and action in his speeches, it’s because there was.

More, including graphics, after the jump. . .


UN Climate report: We’re falling short


The United Nations Environment Program took a look at how well the world’s nations are planning and acting to meet the ongoing and worsening global climate crisis.

Their verdict: We’re doing much too little.

One interesting feature of their report is an analysis of programs that work with nature, illustrated in this graphic from the document, linking climate crises and nature-based mitigations:

Here’s a briefing on the report from the United Nations Environment Program:

Step up climate change adaptation or face serious human and economic damage – UN report

As temperatures rise and climate change impacts intensify, nations must urgently step up action to adapt to the new climate reality or face serious costs, damages and losses, a new UN Environment Programme (UNEP) report finds.

Adaptation – reducing countries’ and communities’ vulnerability to climate change by increasing their ability to absorb impacts – is a key pillar of the Paris Agreement on Climate Change. The agreement requires its signatories to implement adaptation measures through national plans, climate information systems, early warning, protective measures and investments in a green future.

The UNEP Adaptation Gap Report 2020 finds that while nations have advanced in planning, huge gaps remain in finance for developing countries and bringing adaptation projects to the stage where they bring real protection against climate impacts such as droughts, floods and sea-level rise.

Public and private finance for adaptation must be stepped up urgently, along with faster implementation. Nature-based solutions – locally appropriate actions that address societal challenges, such as climate change, and provide human well-being and biodiversity benefits by protecting, sustainably managing and restoring natural or modified ecosystems – must also become a priority.

“The hard truth is that climate change is upon us,” said Inger Andersen, Executive Director of UNEP. “Its impacts will intensify and hit vulnerable countries and communities the hardest – even if we meet the Paris Agreement goals of holding global warming this century to well below 2°C and pursuing 1.5°C.”

“As the UN Secretary-General has said, we need a global commitment to put half of all global climate finance towards adaptation in the next year,” she added. “This will allow a huge step up in adaptation – in everything from early warning systems to resilient water resources to nature-based solutions.”

Adaptation planning is growing, but funding and follow-up lagging

The most encouraging finding of the report is that 72 per cent of countries have adopted at least one national-level adaptation planning instrument. Most developing countries are preparing National Adaptation Plans. However, the finance needed to implement these plans is not growing fast enough.

The pace of adaptation financing is indeed rising, but it continues to be outpaced by rapidly increasing adaptation costs. Annual adaptation costs in developing countries are estimated at USD 70 billion. This figure is expected to reach USD 140-300 billion in 2030 and USD 280-500 billion in 2050.

There are some encouraging developments. The Green Climate Fund (GCF) has allocated 40 per cent of its total portfolio to adaptation and is increasingly crowding-in private sector investment. Another important development is increasing momentum to ensure a sustainable financial system. However, increased public and private adaptation finance is needed. New tools such as sustainability investment criteria, climate-related disclosure principles and mainstreaming of climate risks into investment decisions can stimulate investments in climate resilience.

Implementation of adaptation actions is also growing. Since 2006, close to 400 adaptation projects financed by multilateral funds serving the Paris Agreement have taken place in developing countries. While earlier projects rarely exceeded USD 10 million, 21 new projects since 2017 reached a value of over USD 25 million. However, of over 1,700 adaptation initiatives surveyed, only 3 per cent had already reported real reductions to climate risks posed to the communities where the projects were being implemented.

Nature-based solutions for adaptation can make a huge contribution

The report places a special focus on nature-based solutions as low-cost options that reduce climate risks, restore and protect biodiversity and bring benefits for communities and economies.

An analysis of four major climate and development funds – the Global Environment Facility, the Green Climate Fund, the Adaptation Fund and the International Climate Initiative – suggested that support for green initiatives with some element of nature-based solutions has risen over the last two decades. Cumulative investment for climate change mitigation and adaptation projects under the four funds stood at USD 94 billion. However, only USD 12 billion was spent on nature-based solutions – a tiny fraction of total adaptation and conservation finance.

Stepping up action

According to the report, cutting greenhouse gas emissions will reduce the impacts and costs associated with climate change. Achieving the 2°C target of the Paris Agreement could limit losses in annual growth to up to 1.6 per cent, compared to 2.2 per cent for the 3°C trajectory.

All nations must pursue the efforts outlined in UNEP’s Emissions Gap Report 2020, which called for a green pandemic recovery and updated Nationally Determined Contributions that include new net-zero commitments. However, the world must also plan for, finance and implement climate change adaptation to support those nations least responsible for climate change but most at risk.

While the COVID-19 pandemic is expected to hit the ability of countries to adapt to climate change, investing in adaptation is a sound economic decision.

Most of world’s land to lose water as climate heats


Water is the most precious thing on earth, the life-sustaining fluid that most of us take for granted.

But as the earth warms and the climate shifts, most of the planet will be suffering from reduced water supplies, one of the most worrying impacts of climate change.

Yadu Pokhrel, Associate Professor of Civil and Environmental Engineering at Michigan State University, and Farshid Felfelani, an MSU Postdoctoral Research Associate, examine the data and its implications in a report for The Conversation, an academic journal written in conversational English:

Two-thirds of Earth’s land is on pace to lose water as the climate warms – that’s a problem for people, crops and forests

The world watched with a sense of dread in 2018 as Cape Town, South Africa, counted down the days until the city would run out of water. The region’s surface reservoirs were going dry amid its worst drought on record, and the public countdown was a plea for help.

By drastically cutting their water use, Cape Town residents and farmers were able to push back “Day Zero” until the rain came, but the close call showed just how precarious water security can be. California also faced severe water restrictions during its recent multiyear drought. And Mexico City is now facing water restrictions after a year with little rain.

There are growing concerns that many regions of the world will face water crises like these in the coming decades as rising temperatures exacerbate drought conditions.

Understanding the risks ahead requires looking at the entire landscape of terrestrial water storage – not just the rivers, but also the water stored in soils, groundwater, snowpack, forest canopies, wetlands, lakes and reservoirs.

We study changes in the terrestrial water cycle as engineers and hydrologists. In a new study published Jan. 11, we and a team of colleagues from universities and institutes around the world showed for the first time how climate change will likely affect water availability on land from all water storage sources over the course of this century.

We found that the sum of this terrestrial water storage is on pace to decline across two-thirds of the land on the planet. The worst impacts will be in areas of the Southern Hemisphere where water scarcity is already threatening food security and leading to human migration and conflict. Globally, one in 12 people could face extreme drought related to water storage every year by the end of this century, compared to an average of about one in 33 at the end of the 20th century.

These findings have implications for water availability, not only for human needs, but also for trees, plants and the sustainability of agriculture.

Where the risks are highest

The water that keeps land healthy, crops growing and human needs met comes from a variety of sources. Mountain snow and rainfall feed streams that affect community water supplies. Soil water content directly affects plant growth. Groundwater resources are crucial for both drinking water supplies and crop productivity in irrigated regions.

While studies often focus just on river flow as an indicator of water availability and drought, our study instead provides a holistic picture of the changes in total water available on land. That allows us to capture nuances, such as the ability of forests to draw water from deep groundwater sources during years when the upper soil levels are drier.

The declines we found in land water storage are especially alarming in the Amazon River basin, Australia, southern Africa, the Mediterranean region and parts of the United States. In these regions, precipitation is expected to decline sharply with climate change, and rising temperatures will increase evaporation. At the same time, some other regions will become wetter, a process already seen today.

The map shows the projected change in terrestrial water storage by the end of the 21st century, compared to the 1975-2005 average, under a mid-range scenario for global warming. A continuum of yellow to orange to dark red reflects increasing severity of loss of stored water; teal to blue to dark blue reflects increasing gains in stored water. Yadu Pokhrel, et al, Nature Climate Change, 2021, CC BY-ND

Our findings for the Amazon basin add to the longstanding debate over the fate of the rainforest in a warmer world. Many studies using climate model projections have warned of widespread forest die-off in the future as less rainfall and warmer temperatures lead to higher heat and moisture stress combined with forest fires.

In an earlier study, we found that the deep-rooted rainforests may be more resilient to short-term drought than they appear because they can tap water stored in soils deeper in the ground that aren’t considered in typical climate model projections. However, our new findings, using multiple models, indicate that the declines in total water storage, including deep groundwater stores, may lead to more water shortages during dry seasons when trees need stored water the most and exacerbate future droughts. All weaken the resilience of the rainforests.

A new way of looking at drought

Our study also provides a new perspective on future droughts.

There are different kinds of droughts. Meteorological droughts are caused by lack of precipitation. Agricultural droughts are caused by lack of water in soils. Hydrological droughts involve lack of water in rivers and groundwater. We provided a new perspective on droughts by looking at the total water storage.

We found that moderate to severe droughts involving water storage would increase until the middle of the 21st century and then remain stable under future scenarios in which countries cut their emissions, but extreme to exceptional water storage droughts could continue to increase until the end of the century.

That would further threaten water availability in regions where water storage is projected to decline.

Changes driven by global warming

These declines in water storage and increases in future droughts are primarily driven by climate change, not land-water management activities such as irrigation and groundwater pumping. This became clear when we examined simulations of what the future would look like if climate conditions were unchanged from preindustrial times. Without the increase in greenhouse gas emissions, terrestrial water storage would remain generally stable in most regions.

If future increases in groundwater use for irrigation and other needs are also considered, the projected reduction in water storage and increase in drought could be even more severe.

UFO rider sneaks into gov’t spending bill


No, not an alien a la X Files, but a rider in the sense of an add-on to legislation created for other purposes.

And there’s more UFO news, too.

From CNN:

When President Donald Trump signed the $2.3 trillion coronavirus relief and government funding bill into law in December, so began the 180-day countdown for US intelligence agencies to tell Congress what they know about UFOs.

No, really. The director of National Intelligence and the secretary of defense have a little less than six months now to provide the congressional intelligence and armed services committees with an unclassified report about “unidentified aerial phenomena.”

It’s a stipulation that was tucked into the “committee comment” section of the Intelligence Authorization Act for Fiscal Year 2021, which was contained in the massive spending bill.

The legislation mandates that the report contain an unclassified analysis of all the UFO information collected by Naval Intelligence, the FBI, and the Unidentified Aerial Phenomena Task Force, though a classified annex may also be submitted.

The legislation orders creation of an inter-agency review process to ensure timely reporting of information, headed by a designated official.

Congress also ordered a report on any national security threats posed by the phenomena.

CIA UFO reports also go online

In a second development today, a website offering all manner of declassified secrets has made available for download all thus-far released UFO reports from the Central Intelligence Agency.

All of the CIA’s publicly available documents on unidentified flying objects are now available to be downloaded.

The website The Black Vault, run by John Greenewald Jr., has published a downloadable archive of every released record on Unidentified Aerial Phenomena (UAP), the government classification for UFOs. All the files are available online at The Black Vault’s website.

Greenewald scanned thousands of pages by hand following approximately 10,000 Freedom of Information Act (FOIA) requests filed with multiple agencies, including the CIA, which have resulted in the 2.2 million pages uploaded to The Black Vault.

The documents had been released before, but in a very difficult format for the average reader. Black Vault offers the records in accessible PDF versions for easy downloading and perusal.

Study: Unhealthy wildfire smoke soars in the West


And it’s precisely the sort that does the most damage to your health.

From the Associated Press:

Wildfire smoke accounted for up to half of all health-damaging small particle air pollution in the western U.S. in recent years as warming temperatures fueled more destructive blazes, according to a study released Monday.

Even as pollution emissions declined from other sources including vehicle exhaust and power plants, the amount from fires increased sharply, said researchers at Stanford University and the University of California, San Diego.

The findings underscore the growing public health threat posed by climate change as it contributes to catastrophic wildfires such as those that charred huge areas of California and the Pacific Northwest in 2020. Nationwide, wildfires were the source of up to 25% of small particle pollution in some years, the researchers said.

“From a climate perspective, wildfires should be the first things on our minds for many of us in the U.S.,” said Marshall Burke, an associate professor of earth system science at Stanford and lead author of the study.

The AP notes that particulates from the smoke have been linked to breathing problems and other ailments, as well as an increased rate of premature deaths, according to health experts.

More than a million tons of particulate pollution were recorded in five of the last ten years.

From the report, published in the Proceedings of the National Academy of Science:

Over the past four decades, burned area from wildfires has roughly quadrupled in the United States. This rapid growth has been driven by a number of factors, including the accumulation of fuels due to a legacy of fire suppression over the last century and a more recent increase in fuel aridity (shown for the western United States), a trend which is expected to continue as the climate warms. These increases have happened parallel to a substantial rise in the number of houses in the wildland–urban interface (WUI). Using data on the universe of home locations across the United States and updated national land cover maps, we update earlier studies and estimate that there are now ∼49 million residential homes in the WUI, a number that has been increasing by roughly 350,000 houses per year over the last two decades. As firefighting effort focuses substantially on the protection of private homes, these factors have contributed to a steady rise in spending on wildfire suppression by the US government, which in recent years has totaled ∼$3 billion/y in federal expenditure. Total prescribed burn acreage has increased in the southeastern United States but has remained largely flat elsewhere, suggesting to many that there is underinvestment in this risk-mitigation strategy, given the massive overall growth in wildfire risk.

A graphic from the report sums up the rapid spread of wildfire pollution:

The quantity, source, and incidence of wildfire smoke. (A and B) Average predicted micrograms per cubic meter of PM2.5 attributable to wildfire smoke in 2006 to 2008 and 2016 to 2018, as calculated from a statistical model fitting satellite-derived smoke plume data. (C) Share of smoke originating outside the United States, June to September 2007 to 2014, with a substantial amount of smoke in the Northeast and Midwest originating from Canadian fires and about 60% of smoke in the Northeast originating outside the country; nationally, ∼11% of smoke is estimated to originate outside the country. (D) The share of smoke originating in the western United States, June to September 2007 to 2014. Smoke originating in the western United States accounts for 54% of the smoke experienced in the rest of the United States. (E and F) Racial exposure gradients are opposite for particulate matter from smoke compared to total particulate matter: Across the coterminous United States, counties with a higher population proportion of non-Hispanic whites have lower average particulate matter exposure but higher average ambient exposure to particulate matter from smoke (P <0.01 for both relationships).

Scientists worry as U.S. rivers change color


Pollution, in the form of soil runoff and farming fertilizers, is changing the colors of American rivers, and the implications may be profound.

From Yale Environment360:

One in three large American rivers has changed color over the last 36 years, shifting from shades of blue to green and yellow, raising concerns about the health of U.S. waterways, according to an analysis of nearly 235,000 satellite images published in the journal Geophysical Research Letters.

The research examined satellite images covering 67,000 miles of large rivers (measuring more than 197 feet wide) taken from 1984 to 2018 by NASA and the U.S. Geological Survey. It found that 56 percent of rivers appeared yellow instead of blue, and 38 percent appeared green. In one-third of the rivers examined, the color shift from blue to green or yellow was a long-term change, not tied to seasonal variation. Just 8 percent of the satellite images showed rivers as blue.

“Most of the rivers are changing gradually and not noticeable to the human eye,” lead author John Gardner, a postdoctoral researcher in the global hydrology lab at the University of North Carolina, told Live Science. “But areas that are the fastest changing are more likely to be man-made.”

A yellow hue is likely due to a higher sediment load in the water, which can be caused by human activity, such as dredging or construction, or natural causes, such as heavy rainfall. Rivers appear green when there are large amounts of algae, often the result of fertilizer runoff from farms.

More from the Associated Press:

“If things are becoming more green, that’s a problem,” said study lead author John Gardner, a University of Pittsburgh geology and environmental sciences professor. Although some green tint to rivers can be normal, Gardner said, it often means large algae blooms that cause oxygen loss and can produce toxins.

The chief causes of color changes are farm fertilizer run-off, dams, efforts to fight soil erosion and man-made climate change, which increases water temperature and rain-related run-off, the study authors said.

“We change our rivers a lot. A lot of that has to do with human activity,” said study co-author Tamlin Pavelsky, a professor of global hydrology at the University of North Carolina.

Excess fertilizer runoff from U.S. farmland, carried into the Gulf of Mexico by the Mississippi River, has fueled major outbreaks of toxic algae that create dead zones.

Map of the day: 2020 billion-dollar climate disasters


From the National Oceanic and Atmospheric Administration, a map of the billion-dollar-plus climate disasters that struck the U.S. in 2020:

More from the report:

There were 22 separate billion-dollar weather and climate disasters across the United States, shattering the previous annual record of 16 events, which occurred in 2017 and 2011. The billion-dollar events of 2020 included a record 7 disasters linked to tropical cyclones, 13 to severe storms, 1 to drought, and 1 to wildfires. The 22 events cost the nation a combined $95 billion in damages. 

Adding the 2020 events to the record that began in 1980, the U.S. has sustained 285 weather and climate disasters where the overall damage costs reached or exceeded $1 billion. (All cost estimates are adjusted based on the Consumer Price Index as of December 2020). The cumulative cost for these 285 events exceeds $1.875 trillion.
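
As a rough illustration of how that inflation adjustment works (this is not NOAA’s actual methodology, and the CPI figures below are placeholders rather than official values), a nominal cost is scaled by the ratio of the reference-period CPI to the CPI of the year the damage occurred:

# Hypothetical example of CPI adjustment; the CPI values are made up.
def cpi_adjust(nominal_cost, cpi_event_year, cpi_reference):
    """Express a cost from the event year in reference-period dollars."""
    return nominal_cost * (cpi_reference / cpi_event_year)

# A hypothetical $10 billion disaster in a year with CPI 180,
# restated in December 2020 dollars assuming a reference CPI of 260:
print(cpi_adjust(10e9, cpi_event_year=180.0, cpi_reference=260.0))  # about $14.4 billion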

More generally, the U.S. experienced a record-breaking number of named tropical cyclones (30), eclipsing the record of 28 set in 2005, the year of Hurricane Katrina. Of these 30 storms, a record 12 made landfall in the United States. And 7 of the 12 became billion-dollar disasters—also a new record.

Not to be left out, many central states were impacted by a historically powerful derecho on August 10, which caused impacts comparable to an inland hurricane. 2020 also brought a record-breaking U.S. wildfire season, which burned more than 10.2 million acres. California more than doubled its previous annual record for area burned (last set in 2018) with over 4.1 million acres. In total, it is clear that 2020 (red line below) stands head and shoulders above all other years in regard to the number of billion-dollar disasters.

In broader context, the total cost of U.S. billion-dollar disasters over the last 5 years (2016-2020) exceeds $600 billion, with a 5-year annual cost average of $121.3 billion, both of which are new records. The U.S. billion-dollar disaster damage costs over the last 10-years (2011-2020) were also historically large:  at least $890 billion from 135 separate billion-dollar events. Moreover, the losses over the most recent 15 years (2006-2020) are $1.036 trillion in damages from 173 separate billion-dollar disaster events.

Among the other items in their report is this dramatic graph, revealing the ever-growing number of billion-dollar climate disasters in recent years, compared with previous record years:

Month-by-month accumulation of billion-dollar disasters for each year on record. The colored lines represent the top 5 years for most billion-dollar disasters prior to 2020. All other years are colored black. Before the end of August, 2020 (red line) had broken the previous annual record for billion-dollar disasters—16—set in 2011 (royal blue) and tied in 2017 (purple).  NOAA image by NCEI.

Europe hit hottest-ever year, planet record tied


The Copernicus Climate Change Service, an intergovernmental service in the European Union, keeps close track of leading indicators of the earth’s changing climate, and this year’s report offers a stark warning about the years ahead.

The planet tied its warmest year on record, while Europe marked its hottest year ever.

From the Copernicus Climate Change Service:

Air temperature at a height of two metres for 2020, shown relative to its 1981–2010 average. Source: ERA5. Credit: Copernicus Climate Change Service/ECMWF

The Copernicus Climate Change Service (C3S) today reveals that globally 2020 was tied with the previous warmest year 2016, making it the sixth in a series of exceptionally warm years starting in 2015, and 2011-2020 the warmest decade recorded. Meanwhile, Europe saw its warmest year on record, 0.4°C warmer than 2019 which was previously the warmest year. Together with the Copernicus Atmosphere Monitoring Service (CAMS), C3S also reports that CO2 concentrations in the atmosphere have continued to rise at a rate of approximately 2.3 ppm/year in 2020 reaching a maximum of 413 ppm during May 2020. Both C3S and CAMS are implemented by the European Centre for Medium-Range Weather Forecasts on behalf of the European Commission with funding by the European Union.

Decadal averages of global air temperature at a height of two metres estimated change since the pre-industrial period according to different datasets: ERA5 (ECMWF Copernicus Climate Change Service, C3S); GISTEMPv4 (NASA); HadCRUT5 (Met Office Hadley Centre); NOAAGlobalTempv5 (NOAA), JRA-55 (JMA); and Berkeley Earth. Credit: Copernicus Climate Change Service/ECMWF

C3S’s dataset for surface air temperatures shows that:

  • Globally, 2020 was on a par with the 2016 record
  • 2020 was 0.6°C warmer than the standard 1981-2010 reference period and around 1.25°C above the 1850-1900 pre-industrial period
  • This makes the last six years the warmest six on record
  • Europe saw its warmest year on record at 1.6°C above the 1981-2010 reference period, and 0.4°C above 2019, the previous warmest year
  • The largest annual temperature deviation from the 1981-2010 average was concentrated over the Arctic and northern Siberia, reaching to over 6°C above average

Furthermore, satellite measurements of global atmospheric CO2 concentrations show that:

  • CO2 global column-averaged maximum reached 413 ppm
  • CO2 continued to rise in 2020, increasing by 2.3 ± 0.4 ppm, slightly less than the growth rate of the previous year

Parts of the Arctic and northern Siberia saw some of the largest annual temperature deviations from average in 2020, with a large region seeing deviations of as much as 3°C and in some locations even over 6°C for the year as a whole. On a monthly basis, the largest positive temperature anomalies for the region repeatedly reached more than 8°C. Western Siberia experienced an exceptionally warm winter and spring, a pattern also seen over summer and autumn in the Siberian Arctic and over much of the Arctic Ocean.

Furthermore, the wildfire season was unusually active in this region, with fires first detected in May, continuing throughout summer and well into autumn. As a result, poleward of the Arctic Circle, fires released a record amount of 244 megatonnes of carbon dioxide in 2020, over a third more than the 2019 record. During the second half of the year, Arctic sea ice was significantly lower than average for the time of the year with July and October seeing the lowest sea ice extent on record for the respective month.

In general, the Northern Hemisphere experienced above average temperatures for the year, apart from a region over the central North Atlantic. In contrast, parts of the Southern Hemisphere saw below average temperatures, most notably over the eastern equatorial Pacific, associated with the cooler La Niña conditions developing during the second half of the year. It is notable that 2020 matches the 2016 record despite a cooling La Niña, whereas 2016 was a record year that began with a strong warming El Niño event.

Annual averages of global air temperature at a height of two metres estimated change since the pre-industrial period (left-hand axis) and relative to 1981-2010 (right-hand axis) according to different datasets: Red bars: ERA5 (ECMWF Copernicus Climate Change Service, C3S); Dots: GISTEMPv4 (NASA); HadCRUT5 (Met Office Hadley Centre); NOAAGlobalTempv5 (NOAA), JRA-55 (JMA); and Berkeley Earth. Credit: Copernicus Climate Change Service/ECMWF.

Europe 2020: warmest year on record

2020 was Europe’s warmest year recorded, and seasonally winter 2019/20 and autumn 2020 were also the warmest recorded. Winter 2020, meaning December 2019 to February 2020, exceeded the previous warmest of 2016 by almost 1.4°C, while autumn (September to November 2020) passed the old record set in 2006 by 0.4°C. In addition, western Europe experienced a significant heatwave in late July and early August. The next four warmest years for Europe also happened during the last decade.

A full and detailed analysis of Europe’s climate will be released in April when Copernicus presents its annual European State of the Climate 2020.

Carlo Buontempo, Director of the Copernicus Climate Change Service (C3S), comments: “2020 stands out for its exceptional warmth in the Arctic and a record number of tropical storms in the North Atlantic. It is no surprise that the last decade was the warmest on record, and is yet another reminder of the urgency of ambitious emissions reductions to prevent adverse climate impacts in the future.”

CO2 concentrations continue to rise in 2020

Monthly global CO2 concentrations from satellites (top panel) and derived annual mean growth rates (bottom panel) for 2003-2020. Top: column-averaged CO2 (XCO2) based on the C3S/Obs4MIPs (v4.2) consolidated (2003-2019) and CAMS preliminary near-real time data (2020) records. The listed numerical values in red indicate annual XCO2 averages. Bottom: Annual mean XCO2 growth rates derived from data shown in the top panel. The listed numerical values correspond to the growth rate in ppm/year including an uncertainty estimate in brackets. Source: University of Bremen for Copernicus Climate Change Service and Copernicus Atmosphere Monitoring Service/ECMWF

Analysis of satellite data reveals that carbon dioxide concentrations have continued to rise in 2020 reaching an unprecedented global column-averaged maximum of approximately 413.1 ppm. The estimated annual mean XCO2 growth rate for 2020 was 2.3 ± 0.4 ppm/year. This is less than the growth rate in 2019, which was 2.5 ± 0.2 ppm/year and also less than the 2.9 ppm/year increase in 2015 and 2016. However, 2015 and 2016 experienced a strong El Niño climate event, which resulted in a larger atmospheric growth rate due to a weaker than normal uptake of atmospheric CO2 by land vegetation and large CO2 wildfire emissions, particularly in Indonesia in those years. The wildfires in the Arctic and Australia in 2020, although of unprecedented magnitude in their regions, represent only a small fraction of global fire emissions.
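
For readers curious how an annual mean growth rate can be derived from monthly column-averaged CO2, here is a minimal sketch: average the monthly values for each year, then difference successive annual means. This is not the C3S/CAMS processing chain, and the monthly values below are invented placeholders, not real XCO2 retrievals.

# Toy derivation of annual mean growth rates from monthly XCO2 (ppm).
from statistics import mean

monthly_xco2 = {  # {year: twelve hypothetical monthly values}
    2018: [407.0 + 0.2 * m for m in range(12)],
    2019: [409.5 + 0.2 * m for m in range(12)],
    2020: [412.0 + 0.2 * m for m in range(12)],
}

annual_means = {yr: mean(vals) for yr, vals in monthly_xco2.items()}
growth_rates = {yr: annual_means[yr] - annual_means[yr - 1]
                for yr in annual_means if yr - 1 in annual_means}
print(growth_rates)  # ppm/year for each year that has a preceding year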

Vincent-Henri Peuch, Director of the Copernicus Atmosphere Monitoring Service (CAMS), comments: “While carbon dioxide concentrations have risen slightly less in 2020 than in 2019, this is no cause for complacency. Until the net global emissions reduce to zero, CO2 will continue to accumulate in the atmosphere and drive further climate change.”

In the context of the COVID-19 pandemic, it has been estimated by the Global Carbon Project that there was a reduction of around 7% of fossil CO2 emissions.

“To what extent this was a factor in the lower total increase is debatable though, as the variations in global growth rate are dominated by natural processes. We must continue efforts to decrease CO2 net emissions to reduce the risk of climate-related change”, Vincent-Henri Peuch adds.

“The extraordinary climate events of 2020 and the data from the Copernicus Climate Change Service show us that we have no time to lose. We must come together as a global community, to ensure a just transition to a net zero future. It will be difficult, but the cost of inaction is too great, which is why the commitments made under our European Green Deal are so very necessary”, highlights Matthias Petschke, Director for Space, European Commission’s Directorate-General for Defence industry and Space.

Biodiversity collapses in the Eastern Mediterranean


For anyone doubting the reality of global warming, consider this: The populations of native lifeforms of the once-temperate Eastern Mediterranean Sea have collapsed, replaced by species previously native only to tropical waters.

The implications are profound.

From the University of Vienna:

Native biodiversity collapse in the Eastern Mediterranean

An international team led by Paolo G. Albano from the Department of Palaeontology at the University of Vienna quantified a dramatic biodiversity collapse of up to 95 per cent of native species in the Eastern Mediterranean. The study is published in the Proceedings of the Royal Society B: Biological Sciences.

The coastline of Israel is one of the warmest areas in the Mediterranean Sea. Here, most marine species have been at the limits of their tolerance to high temperatures for a long time – and now they are already beyond those limits. Global warming has led to an increase in sea temperatures beyond those temperatures that Mediterranean species can sustain. Consequently, many of them are going locally extinct.

Paolo Albano’s team quantified this local extinction for marine molluscs, an invertebrate group encompassing snails, clams and mussels. They thoroughly surveyed the Israeli coastline and reconstructed the historical species diversity using the accumulations of empty shells on the sea bottom.

Biodiversity loss in the last few decades

The shallow habitats at scuba diving depths are affected most. Here, the researchers were not able to find living individuals of up to 95 per cent of the species whose shells were found in the sediments. The study suggests that most of this loss has occurred recently, presumably in just the last few decades.

Additionally, most of the species still found alive cannot grow enough to reproduce, “a clear sign that the biodiversity collapse will further continue,” says Albano. In contrast, the tropical species that enter from the Suez Canal thrive. The warm waters in the Eastern Mediterranean are very suitable habitats for them. Indeed, they occur in large populations and their individuals are fully fit to reproduce.

“For anyone accustomed to snorkel or dive in the Mediterranean,” explains the researcher, “the underwater scenario in Israel is unrecognisable: The most common species are missing, while in contrast tropical species are everywhere”.

The future perspectives for the Mediterranean are not good. The sea will continue to warm even if we would stop carbon dioxide emissions today. This is due to the inertia of the system, the long braking distance, so to speak. 

It is thus likely that the biodiversity collapse will continue to spread. It may already be occurring in other eastern Mediterranean areas not surveyed yet, and it will expand to the West and intensify. Only intertidal organisms, which are to some extent pre-adapted to temperature extremes, and habitats in deeper water, where the temperature is markedly lower, will continue to persist – at least for some time.

“But the future is dim unless we immediately act to reduce our carbon emissions and to protect marine habitats from other pressures which contribute to biodiversity loss,” says Paolo Albano, “The changes that already occurred in the warmest areas of the Mediterranean may not be reversible, but we would be able to save large parts of the rest of the basin.”

Methodologically, the study was also interesting due to its interdisciplinary character: “These results came from the cooperation of scientists with very different backgrounds,” says Martin Zuschin, Head of the Department of Palaeontology and co-author of the study – “In particular, the cooperation between ecologists and palaeontologists is providing unique new views on how humankind is impacting biodiversity”.

Trump’s anti-environmental rampage continues


Like an angry teenager smashing up his room in a fit of pique at his parents, Donald Trump is doing everything he can in his [hopefully] final days to trash the environment.

Here’s the latest.

A major blow to medical science and clean air

From the Associated Press:

The Environmental Protection Agency released one of its last major rollbacks under the Trump administration on Tuesday, limiting what evidence it will consider about risks of pollutants in a way that opponents say could cripple future public health regulation.

EPA Administrator Andrew Wheeler said the new rule, which restricts what findings from public health studies the agency can consider in crafting health protections, was made in the name of transparency about government decision-making. “We’re going to take all this information and shine light on it,” Wheeler said Tuesday, in unveiling the terms of the new rule in a virtual appearance hosted by a conservative think tank.

“I don’t think we get enough credit as an administration about wanting to open up … to sunlight and scrutiny,” Wheeler said of the Trump administration, which has already rolled back dozens of public health and environmental protections.

Opponents say the latest rule would threaten patient confidentiality and privacy of individuals in public health studies, and call the requirement an overall ruse to handicap future regulation.

The kind of research findings that appear targeted in the new rule “present the most direct and persuasive evidence of pollution’s adverse health effects,” said Richard Revesz, an expert in pollution law at the New York University School of Law.

The stench becomes pervasive when you look at the origins of Trump’s policy change.

From the New York Times:

Nearly a quarter century ago, a team of tobacco industry consultants outlined a plan to create “explicit procedural hurdles” for the Environmental Protection Agency to clear before it could use science to address the health impacts of smoking.

President Trump’s E.P.A. has now embedded parts of that strategy into federal environmental policy. On Tuesday Andrew Wheeler, the administrator of the E.P.A., formally released a new regulation that favors certain kinds of scientific research over others in the drafting of public health rules.

<snip>

“Right now we’re in the grips of a serious public health crisis due to a deadly respiratory virus, and there’s evidence showing that air pollution exposure increases the risk of worse outcomes,” said Dr. Mary Rice, a pulmonary and critical care physician who is chairwoman of the environmental health policy committee at the American Thoracic Society.

“We would want E.P.A. going forward to make decisions about air quality using all available evidence, not just putting arbitrary limits on what it will consider,” she said.

It’s not a good sign when your science rules come from a powerful industry that spent millions on elections and on propaganda designed to conceal the fact that it had probably killed as many people as Hitler.

But it gets even worse, the Times notes:

The E.P.A. says the regulation only deals with future rules. Public health experts, however, warned that studies that have been used for decades to show, for example, that lead in paint dust is tied to behavioral disorders in children might be inadmissible when existing regulations come up for renewal.

Most significantly, they warned, a groundbreaking 1993 Harvard University project that definitively linked polluted air to premature deaths, currently the foundation of the nation’s air-quality laws, could become inadmissible as the agency considers whether to strengthen protections. In that study, scientists signed confidentiality agreements to track the private medical and occupational histories of more than 22,000 people in six cities. Its findings have long been attacked by the fossil fuel industry and some Republican lawmakers.

Corporate bird slaughter legalized

From The Hill:

The Fish and Wildlife Service (FWS) has finalized a rule rolling back protections for migratory birds, according to a document that will be published in the Federal Register this week. 

The new rule changes the implementation of the 1918 Migratory Bird Treaty Act (MBTA) so that companies are no longer penalized for accidentally or incidentally harming or killing these birds. 

The MBTA has protected more than 1,000 different species of birds for more than 100 years by punishing companies whose projects cause them harm. 

The Trump administration has argued, however, that companies should only be punished for intentionally killing the animals, though it has admitted that relaxing these rules may cause companies not to carry out best practices that limit incidental bird deaths. 

By way of analogy, in the case of the killing of a human being, the law differentiates between intentional homicide and negligent homicide, the latter being a death caused by negligent actions not intended to lead to the death at issue.

But both are crimes; only the punishment differs.

And a fresh assault on Alaska’s wilderness

Finally, there’s a new front in his assault on Alaska’s Arctic National Wildlife Refuge (ANWR), via the Guardian:

On Monday, the Trump administration also dramatically expanded the area where the government can lease public land for oil drilling to the west of ANWR.

The plan would allow drilling in 82% of the National Petroleum Reserve-Alaska, an area bigger than the state of West Virginia, according to environmental groups, though the Biden administration could reverse that decision more easily than it could hold off drilling in ANWR.

<snip>

Native groups in Alaska have fought ANWR drilling proposals with lawsuits. For the Gwich’in, indigenous Alaskans who have migrated alongside the caribou and relied upon them as a food source, the fight is personal. They formed the Gwich’in Steering Committee in 1988 to oppose drilling in the coastal plain, which they call the Sacred Place Where Life Begins.

“We come from some of the strongest people that ever walked this earth. They survived some of the coldest, harshest winters so that we can be here,” Bernadette Demientieff, executive director of the committee, said during an AM radio segment last week. “I feel like this is my responsibility as a Gwich’in, to protect the caribou.”

Polar bear advocates say the habitat is also critical to a population in dire straits from development and rising temperatures that are melting sea ice. The Arctic is heating at a much faster pace than the rest of the world. Polar bear numbers in Alaska and western Canada declined 40% from 2001 to 2010, said Steven Amstrup, chief scientist for Polar Bears International.

We’ll sign off with a quote:

“I want the cleanest water on Earth, I want the cleanest air on Earth and that’s what we’re doing — and I’m an environmentalist.” — Donald J. Trump

Vibrations key to brain communications?


“I’m pickin’ up good vibrations,” sang the Beach Boys, and they may have been more right than they knew.

Except that those vibrations are happening inside our own heads, in different parts of the brain at the same time, and new research reveals that they may be how different brain regions communicate with each other.

By our lights this is one of the most fascinating discoveries we’ve read about in a long time, and the implications may be profound.

From the University of Helsinki:

A new means of neuronal communication discovered in the human brain

In a new study published in Nature Communications [open access], research groups of Professor J. Matias Palva and Research Director Satu Palva at the Neuroscience Centre of the University of Helsinki and Aalto University, in collaboration with the University of Glasgow and the University of Genoa, have identified a novel coupling mechanism linking neuronal networks by using human intracerebral recordings.

Neuronal oscillations are an essential part of the functioning of the human brain. They regulate the communication between neural networks and the processing of information carried out by the brain by pacing neuronal groups and synchronising brain regions.  

High-frequency oscillations with frequencies over 100 Hertz are known to indicate the activity of small neuronal populations. However, up to now, they have been considered to be exclusively a local phenomenon.

The findings of the European research project demonstrate that high-frequency oscillations over 100 Hertz also synchronize across several brain regions. This important finding reveals that strictly-timed communication between brain regions can be achieved by high-frequency oscillations.

The researchers observed that high-frequency oscillations were synchronised between neuronal groups with a similar architecture of brain structures across subjects, but occurring in individual frequency bands. Carrying out a visual task resulted in the synchronisation of high-frequency oscillations in the specific brain regions responsible for the task execution.

These observations suggest that high-frequency oscillations convey within the brain ‘information packages’ from one small neuronal group to another.

The discovery of high-frequency oscillations synchronised between brain regions is the first evidence of the transmission and reception of such information packages in a context broader than individual locations in the brain. The finding also helps to understand how the healthy brain processes information and how this processing is altered in brain diseases.
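How such cross-regional synchrony is measured isn’t spelled out in the press release, but a common approach with intracerebral recordings is to band-pass the signals, extract each channel’s instantaneous phase, and compute a phase-locking value between sites. Here’s a minimal Python sketch of that idea; the signals, sampling rate, and frequency band are purely illustrative, and this is not the study’s actual analysis pipeline.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_locking_value(x, y, fs, band=(100.0, 150.0)):
    # Band-pass both signals, take instantaneous phases via the Hilbert
    # transform, and return the phase-locking value (0 = no consistent
    # phase relationship, 1 = perfectly locked).
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Toy example: two noisy channels sharing a 120 Hz component.
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
shared = np.sin(2 * np.pi * 120.0 * t)
x = shared + 0.5 * np.random.randn(t.size)
y = shared + 0.5 * np.random.randn(t.size)
print(round(phase_locking_value(x, y, fs), 3))  # near 1 for synchronized channels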

We’ll conclude with the appropriate song:

Trump drops another environmental bombshell


They’re called gillnets, defined by the National Oceanic and Atmospheric Administration’s Fisheries department as “a wall of netting that hangs in the water column, typically made of monofilament or multifilament nylon.”

A gillnet in action, via NOAA Fisheries.

I first became aware of gillnets and their destructive impacts on threatened and endangered sea mammals back in the 1970s, when my paper, the late, great Santa Monica Evening Outlook, teamed up with the Sacramento Bee to report extensively on corruption in the California fishing industry and the state’s lax enforcement policies.

Conservationists warned us that gillnet fishermen were catching dolphins, sea turtles and even small whales, which died, trapped under water and unable to reach the surface to breathe.

That’s why we were alarmed at one of Donald Trump’s latest moves, as reported by the Associated Press:

President Donald Trump vetoed a bill Friday that would have gradually ended the use of large-mesh drift gillnets deployed exclusively in federal waters off the coast of California, saying such legislation would increase reliance on imported seafood and worsen a multibillion-dollar seafood trade deficit.

Trump also said in his veto message to the Senate that the legislation sponsored by Sens. Dianne Feinstein, D-Calif., and Shelley Moore Capito, R-W.Va., “will not achieve its purported conservation benefits.”

Feinstein issued a statement late Friday saying Trump’s veto “has ensured that more whales, dolphins, sea turtles and other marine species will be needlessly killed, even as we have a proven alternative available.”

Trump vetoed the fishing bill as the Republican-controlled Senate followed the Democratic-led House and voted to overturn his earlier veto of the annual defense policy bill, enacting it into law despite Trump’s objections.

The fishing bill’s sponsors said large-mesh drift gillnets, which measure between 1 mile (1.6 kilometers) and 1.5 miles (2.4 kilometers) long and can extend 200 feet (60.9 meters) below the surface of the ocean, are left in the waters overnight to catch swordfish and thresher sharks. But they said at least 60 other marine species — including whales, dolphins and sea lions — can also become entangled in the nets, where they are injured or die.

Trump’s wilful ignorance belied by his own government

Had Trump wanted to learn about gillnets, he could have simply gone to his own government’s NOAA Fisheries gillnet page and learned this:

Depending on the gillnet mesh size, animals can become entangled around their necks, mouths, and flippers. Entanglement can prevent proper feeding, constrict growth, or cause infection after many months. Marine mammals entangled in set gillnets can drown while those entangled in drift gillnets can drag gear for miles as they migrate and forage, leading to extreme fatigue. Species most commonly caught in gillnets include:

Large whales: Humpback whales, Fin whales, Right whales

Harbor porpoise

Dolphins: Bottlenose dolphins, Common dolphin, Right whale dolphins,

Steller sea lions

The report also cites the danger to sea turtles:

Gillnetting has been a major source of mortality for all sea turtle species.

Turtles encountering a gillnet can quickly become entangled around their head or flippers as they try to escape. Entangled turtles will drown if held under the water but have a higher chance of survival if they can reach the surface to breathe. The nylon can tighten around the turtle’s soft body parts and cause deep cuts potentially leading to infections, limited movement, or complete loss of the limb. Limited use of appendages can impair a turtle’s natural feeding, breathing, and swimming behavior.

And as the Wildlife Conservation Network notes:

Gillnets do an unbelievable amount of damage. Used to capture large amounts of fish, they kill not only targeted species, but any creature that swims into them. Critically endangered hammerhead sharks are particularly vulnerable to being caught as their unique T-shaped heads become easily entangled. Imagine if on land a single hunting trap was set in a forest and caught not just one animal, but hundreds—every rabbit, deer, squirrel, and song bird across acres. Similarly, gillnets empty the seas. Fishers use them because the payout seems big in the short-term, but in the long-term, they eviscerate fish stocks, devastate fishing livelihoods and marine ecosystems, and reduce populations of threatened marine wildlife like hammerheads.

And it’s not just mammals and reptiles at risk, as the American Bird Conservancy reported in 2016:

A new study published in the journal Biological Conservation provides the first global review of seabird mortality associated with the gillnet fishing industry and finds that, at a minimum, 400,000 seabirds are killed accidentally in gillnets each year, with numerous species suffering potentially significant impacts.

The study documents 81 species that have been caught and killed in gillnets and another 67 species that are potential victims. The list of susceptible species includes five Critically Endangered, 14 Endangered, 29 Vulnerable, and 15 Near Threatened species, as classified by the International Union for Conservation of Nature. The highest bycatch has been reported in the Pacific Northwest (about 194,000 individuals), Iceland (around 100,000), and the Baltic Sea (around 76,000). However, the report said that it is almost certain that the actual number of birds killed in gillnets worldwide is much higher because of numerous data gaps.

According to the report, “…gillnets have been the cause of some of the highest recorded mortalities of seabirds worldwide. The status of seabird populations is deteriorating faster compared to other bird groups, and bycatch in fisheries is identified as one of the principal causes of declines.”

Gillnets, in other words, are like using nukes to hunt deer.

An activist group fights back

The Sea Shepherd Conservation Society has been battling gillnets for decades.

Founded in 1977, the organization describes itself thusly:

Sea Shepherd is an international, non-profit marine conservation organization that engages in direct action campaigns to defend wildlife, and conserve and protect the world’s oceans from illegal exploitation and environmental destruction.

Sea Shepherd has been working hard to save the vaquita, a beautiful little porpoise brought to the brink of destruction by gillnets in the Gulf of California. By all estimates, fewer than 20 remain, possibly fewer than 10.

The nets are deployed to catch another critically endangered creature, a fish called the totoaba, whose swim bladder is valued in Asia as a cure for erectile dysfunction. The dead vaquitas trapped in the nets are merely what the American military likes to call collateral damage, like the hundreds of thousands of innocent Iraqis killed in the endless post-9/11 war in the Mideast.

In response to international pressure from Sea Shepherd and other activist organizations, Mexico banned gillnets in the vaquita habitat in the northern waters of the Gulf of California on 29 June 2017, but that hasn’t stopped the gillnetters.

As Sea Shepherd reported on 3 November:

This week marks the completion of a collaborative effort aimed at removing abandoned fishing gear from the habitat of the critically endangered vaquita.

The program, which is funded by the Government of Mexico, has been operating since 2016. Every year, a group of small-scale fishers from the community of San Felipe in the Upper Gulf of California undertakes seasonal ghost net removal operations in the Vaquita Refuge. Sea Shepherd provides support with the recovery of the nets located by the fishers, ensuring that they never find their way back into the marine ecosystem.

“Ghost nets” refer to abandoned fishing gear, nets that have been discarded or lost at sea. These inactive nets pose a deadly threat to all marine wildlife and can continue to kill marine animals indefinitely for as long as they remain in the ocean. Whales, turtles, dolphins, and vaquitas are all vulnerable to entanglement in these ​deserted nets.

A group of 35 local fishers, working from 17 small boats, systematically search the Vaquita Refuge in a grid pattern to locate discarded fishing gear. Following GPS coordinates, the boats drag modified hooks under the water to detect submerged nets. As the vessels move over the nets, the hooks become entangled in the fishing gear. Once a net is detected, the fishers mark the area, and Sea Shepherd’s Sharpie moves in to retrieve the gear.

This season, the operation successfully retrieved 20 nets from the Vaquita Refuge between Sept. 12 and Oct. 31, 2020.

“There are many more nets in the water than vaquitas,” said Andrea Bonilla, Sea Shepherd’s Ghost Net Project Coordinator. 

Here’s a video report filed after the campaign’s conclusion:

Tensions Escalate in Vaquita Refuge

Program notes:

Tensions are escalating in the habitat of the world’s most endangered marine mammal as Sea Shepherd fights to save the vaquita from imminent extinction.

There are less than 20 vaquitas left alive. Entanglement in illegal gillnets is the primary threat to the survival of this critically endangered animal.

The work is hazardous, as this shooting incident recorded last February reveals:

Shots fired at Sea Shepherd inside Vaquita Refuge

Program notes:

On February 8th, 2020, a group of four fishing skiffs chased and opened fire on Sea Shepherd vessel the M/V Sharpie.

Four years earlier, in February 2016, Sea Shepherd found a surprise, a massive Humpback whale trapped in a gillnet inside vaquita waters, proof of the danger the nets pose to large marine mammals:

Sea Shepherd Crew Save Humpback Whale Entangled in Illegal Gillnet

Program notes:

Sea Shepherd crew rescued a whale entangled in an illegal totoaba gillnet in the Gulf of California. Sea Shepherd currently has two vessels in Mexico’s Gulf of California on OPERATION MILAGRO. Our goal is to save the vaquita porpoises, the most endangered marine mammal. The vaquita are caught as a result of fishing the totoaba, a fish poached for its swim bladder. Both the vaquita and the totoaba are endangered species and protected by law. Both species live only in the Gulf of California.

Another video from further afield comes from Italy, where a whole pod of sperm whales was entangled in a drifting net.

From the Italian Mediterranean cetacean advocacy group Oceanomare Delphis:

Sperm Whale Rescue

Program notes:

On 9 Aug 2004 a herd of female and young sperm whales, entangled in an illegal drifting net set for swordfish, is rescued by divers of the Italian Coast Guard. After two days of work, all whales are released alive. Video by the Coast Guard of Naples. Editing by Oceanomare Delphis Onlus.

To conclude, we turn to one of our previous vaquita posts, this one from 21 June 2016.

Marine biologist Barbara Taylor of the Southwest Fisheries Science Center in La Jolla is passionate about saving the world’s endangered cetaceans, and her focus in recent years has been on the Vaquita, a recently discovered porpoise in the Sea of Cortez now facing imminent extinction.

Taylor’s passion for saving the rare mammal extends beyond the laboratory and field research to the other passion of her life, art [she has her own gallery where you can purchase her graphics and jewellery featuring the Vaquita]. Here’s one example:

And here’s a video featuring Barbara Taylor from University of California Television:

Net Loss: New Abundance Estimate Reveals That Mexico’s Vaquita Faces Imminent Extinction

Program notes:

Barbara Taylor of the National Marine Fisheries Service Southwest Fisheries Science Center, who participated in the last effort to save the recently extinct Chinese River Dolphin, or Baiji, gives a detailed chronicle of her involvement in documenting the decline of earth’s most endangered marine mammal, the Vaquita, found only in the Sea of Cortez, Mexico. Their primary threat is death in gillnets, which until very recently supplied shrimp to the U.S. market. The catastrophic 80% decline since 2011 results from illegal sales of endangered totoaba swim bladders to China. Recorded on 06/13/2016.

Tweet of the day: Happy 18th to a world hero


From her Tweetstream, with a hefty dash of humor:

London’s Sunday Times has published a compelling interview with the Swedish activist today.

Some excerpts:

She started thinking seriously about climate change after a lesson in which a teacher showed a documentary about the island of plastic floating in the Pacific Ocean. Thunberg started to cry. Others in the class were distressed too but they moved on when the school bell rang. Thunberg could not. It has been pointed out that people with autism are overrepresented within the climate movement and I’m interested to know why she thinks this is. “Humans are social animals. We copy each other’s behaviour, so if no one else is acting as though there’s no crisis then it can’t be that bad. But we who have autism, for instance, we don’t follow social codes, we don’t copy each other’s behaviour, we have our own behaviour,” she says. “It’s like the tale of The Emperor’s New Clothes; the child who doesn’t care about his reputation or becoming unpopular or being ridiculed is the only one who dares to question this lie that everyone else just silently accepts.”

It is a different folk tale that springs to my mind as I talk to her; the Dutch boy with his finger in the dyke. She is not at all emotional when she discusses the environment; she reads, speaks to scientists regularly and is motivated by cold, hard facts. Fame was just a consequence of her conviction and is not something she enjoys. She gets stopped in the street everywhere she goes except at home in Sweden. It is a cultural phenomenon called Jantelagen, or Jante’s law, she has said: a term used by Scandinavians to describe their cultural inclination towards disapproval of individual achievement. “I know that people see me, I can see in their eyes that they recognise me, and sometimes they point, but they don’t stop and talk,” she says. “It’s nice because I’m being left alone, but it gets very socially awkward because I know they know and it becomes like a game they all pretend.”

She copes with it by spending most of her time at home with her family. Her younger sister, Beata, was diagnosed with ADHD, and the family is a tight-knit unit. Over the years there has been a lot of speculation about the influence her parents have over her profile and her campaigning, but it is very clear when you talk to her that Thunberg thinks for herself. Does it make her feel lonely? She shakes her head. “Of course it is hard to find someone who understands what my life is like, but that doesn’t mean I’m lonely because I have so many people supporting me,” she says. One of them is Malala Yousafzai, the Nobel prize-winning Pakistani girl who was shot in the head by the Taliban and became a global champion of education for girls. They met when they were filming a series for the BBC and have stayed close. Yousafzai, 23, has advised her to “take care of yourself, to remember that you are probably in it for the long run, so you shouldn’t take on too much”, Thunberg says.

<snip>

She is decidedly laid-back about other people’s choices too. I ask what she makes of celebrities who talk about the environment while flying around the world. “I don’t care,” she says. “I’m not telling anyone else what to do, but there is a risk when you are vocal about these things and don’t practise as you preach, then you will become criticised for that and what you are saying won’t be taken seriously.” Nor does she agree that having children is bad for the planet. The whole issue is a distraction, she says, and one that scares people away. “I don’t think it’s selfish to have children. It is not the people who are the problem, it is our behaviour.”

Her own choices demonstrate what she believes is the right way to live. She stopped flying years ago — she famously sailed to America to speak at the 2019 UN climate summit, a voyage that took 15 days (footage shows her ashen-faced, disappearing out of shot with a bucket). She is a vegan and has stopped “consuming things”. What does that mean, I ask. Clothes? She nods. What if she needs something? “The worst-case scenario I guess I’ll buy second-hand, but I don’t need new clothes. I know people who have clothes, so I would ask them if I could borrow them or if they have something they don’t need any more,” she says. “I don’t need to fly to Thailand to be happy. I don’t need to buy clothes I don’t need, so I don’t see it as a sacrifice.”

Greenland: Scenes of an icy climatic breakup


From NASA’s Earth Observatory, a dramatic image captures the breakup of Greenland’s ice cap as the two-mile-thick frozen mantle melts and ice begins flowing into the sea. The added red regions mark areas of movement, with the intensity of the reds reflecting the relative pace of glacial movement:

More from NASA:

A recent study of Greenland’s ice sheet found that glaciers are retreating in nearly every sector of the island, while also undergoing other physical changes. Some of those changes are causing the rerouting of freshwater rivers beneath the ice.

In a study led by Twila Moon of the National Snow and Ice Data Center, researchers took a detailed look at physical changes to 225 of Greenland’s ocean-terminating glaciers—narrow fingers of ice that flow from the ice sheet interior to the ocean. They found that none of those glaciers has substantially advanced since the year 2000, and 200 of them have retreated.

The map at the top of this page shows measurements of ice velocity across Greenland as measured by satellites. The data were compiled through the Inter-mission Time Series of Land Ice Velocity and Elevation project (ITS_LIVE), which brings together observations of glaciers collected by multiple Landsat satellites between 1985 and 2015 into a single dataset open to scientists and the public.

About 80 percent of Greenland is blanketed by an ice sheet, also known as a continental glacier, that reaches a thickness of up to 3 kilometers (2 miles). As glaciers flow toward the sea, they are usually replenished by new snowfall on the interior of the ice sheet that gets compacted into ice. Multiple studies have shown that the balance between glacier melting and replenishment is changing, as is the rate of iceberg calving. Due to rising air and ocean temperatures, the ice sheet is losing mass at an accelerating rate and additional meltwater is flowing into the sea.

“The coastal environment in Greenland is undergoing a major transformation,” said Alex Gardner, a snow and ice scientist at NASA’s Jet Propulsion Laboratory and co-author of the study. “We are already seeing new sections of the ocean and fjords opening up as the ice sheet retreats, and now we have evidence of changes to these freshwater flows. So losing ice is not just about changing sea level, it’s also about reshaping Greenland’s coastline and altering the coastal ecology.”

Though the findings by Moon, Gardner, and colleagues are in line with other Greenland observations, the new survey captures a trend that has not been apparent in previous work. As individual glaciers retreat, they are also changing in ways that are likely rerouting freshwater flows under the ice. For example, glaciers change in thickness not only as warmer air melts ice off of their surfaces, but also as their flow speed changes. Both scenarios can lead to changes in the distribution of pressure beneath the ice. This, in turn, can change the path of subglacial rivers, since water will always take the path of least resistance (lowest pressure).

Citing previous studies on the ecology of Greenland, the authors note that freshwater rivers under the ice sheet deliver nutrients to bays, deltas, and fjords around Greenland. In addition, the under-ice rivers enter the ocean where the ice and bedrock meet, which is often well below the ocean’s surface. The relatively buoyant freshwater rises, carrying nutrient-rich deep ocean water to the surface, where the nutrients can be consumed by phytoplankton. Research has shown that glacial meltwater rivers directly affect the productivity of phytoplankton, which serve as a foundation of the marine food chain. Combined with the opening of new fjords and sections of ocean as glaciers and ice shelves retreat, these changes amount to a transformation of the local environment.

“The speed of ice loss in Greenland is stunning,” said Moon. “As the ice sheet edge responds to rapid ice loss, the character and behavior of the system as a whole are changing, with the potential to influence ecosystems and people who depend on them.”

Europe drives push for free science journal access


Most of the world’s innovative science originates in public universities and non-profits, but to see the results of that research, readers have to either buy expensive subscriptions to scientific journals or pay hefty single-user, limited-time access charges ranging from a few dollars to as high as $70, judging from our own experience.

The charges have steadily risen as science journals undergo the corporate consolidation process that has plagued print and electronic journalism.

Here’s the last price survey from the Library Journal reflecting typical prices for one-year academic journal subscriptions:

What makes the problem worse for college and university libraries is that the major publishers have pushed for so-called Big Deal contracts forcing institutions to buy all their publications as a bundle. While the cost for individual subscriptions is lower in a bundle, libraries wind up with dozens of publications of little or no interest to students or faculty.

But there’s a revolution at work, as the University of Virginia reported 20 December:

In the 1990s, large publishers began marketing bundles of online journals to libraries at a discount. However, since the year 2000, the cost of journals has outpaced both inflation and library budgets, with publishers justifying increases by adding titles that libraries and faculty often do not want. As a result, a growing percentage of collections expenditures have been going toward keeping a shrinking percentage of desired titles. In spring 2019 UVA University Librarian John Unsworth joined six other Deans and Directors of research libraries at Virginia public doctoral institutions in signing an open letter supporting a decision by the University of California library system not to renew its $11 million-a-year scholarly journal subscription with academic publishing behemoth Elsevier. Since then, more institutions have ended or downsized their financial commitments to big publishers.

The publishers’ refusal to remedy an unsustainable purchasing model that locks research behind a paywall puts them at odds with scholars, who strongly prefer the impact of having their work available for free to anyone online. Even the federal government has signaled its interest in ensuring immediate public access to all taxpayer-funded research.

The seven Virginia institutions agree with their peers in the UC system and elsewhere — they can no longer invest in a broken model, paying faculty to produce scholarship which they then must purchase back from publishers at exorbitant rates. When the letter was written in 2019, several large journal packages consumed about 40 percent of the seven libraries’ collections budgets, affecting their ability to build collections most useful to scholarly communities. By 2025, if nothing changes, Elsevier alone is expected to take up 22.7 percent of UVA’s collections budget.

A fascinating 27 June 2017 report in the Guardian describes the way the journal publishing game works at Elsevier, the biggest player of all:

The core of Elsevier’s operation is in scientific journals, the weekly or monthly publications in which scientists share their results. Despite the narrow audience, scientific publishing is a remarkably big business. With total global revenues of more than £19bn, it weighs in somewhere between the recording and the film industries in size, but it is far more profitable. In 2010, Elsevier’s scientific publishing arm reported profits of £724m on just over £2bn in revenue. It was a 36% margin – higher than Apple, Google, or Amazon posted that year.

But Elsevier’s business model seemed a truly puzzling thing. In order to make money, a traditional publisher – say, a magazine – first has to cover a multitude of costs: it pays writers for the articles; it employs editors to commission, shape and check the articles; and it pays to distribute the finished product to subscribers and retailers. All of this is expensive, and successful magazines typically make profits of around 12-15%.

The way to make money from a scientific article looks very similar, except that scientific publishers manage to duck most of the actual costs. Scientists create work under their own direction – funded largely by governments – and give it to publishers for free; the publisher pays scientific editors who judge whether the work is worth publishing and check its grammar, but the bulk of the editorial burden – checking the scientific validity and evaluating the experiments, a process known as peer review – is done by working scientists on a volunteer basis. The publishers then sell the product back to government-funded institutional and university libraries, to be read by scientists – who, in a collective sense, created the product in the first place.

It is as if the New Yorker or the Economist demanded that journalists write and edit each other’s work for free, and asked the government to foot the bill. Outside observers tend to fall into a sort of stunned disbelief when describing this setup. A 2004 parliamentary science and technology committee report on the industry drily observed that “in a traditional market suppliers are paid for the goods they provide”. A 2005 Deutsche Bank report referred to it as a “bizarre” “triple-pay” system, in which “the state funds most research, pays the salaries of most of those checking the quality of research, and then buys most of the published product”.

Academic publishers’ unique advantage: A captive stable of authors

Writing in the open access journal of the Royal Society, Britain’s [and the world’s] oldest national scientific academy, three noted scholars describe the conditions that basically force academics to feed the beast:

In academia, the phrase ‘publish or perish’ is more than a pithy witticism—it reflects the reality that researchers are under immense pressure to continuously produce outputs, with career advancement dependent upon them. Academic publications are deemed a proxy for scientific productivity and ability, and with an increasing number of scientists competing for funding, the previous decades have seen an explosion in the rate of scientific publishing. Yet while output has increased dramatically, increasing publication volume does not imply that the average trustworthiness of publications has improved.

<snip>

Despite their vital importance in conveying accurate science, top-tier journals possess a limited number of publication slots and are thus overwhelmingly weighted towards publishing only novel or significant results. Despite the fact that null results and replications are important scientific contributions, the reality is that journals do not much care for these findings. Researchers are not rewarded for submitting these findings nor for correcting the scientific record, as high-profile examples attest. This pressure to produce positive results may function as a perverse incentive. Edwards & Roy argue that such incentives encourage a cascade of questionable findings and false positives. Heightened pressure on academics has created an environment where ‘Work must be rushed out to minimize the danger of being scooped’. The range of questionable behaviour itself is wide. Classic ‘fraud’ (falsification, fabrication and plagiarism (FFP) [20]) may be far less important than more subtle questionable research practices, which might include selective reporting of (dependent) variables, failure to disclose experimental conditions and unreported data exclusions. So how common are such practices? A study of National Institute of Health (NIH)-funded early and mid-career scientists (n=3247) found that within the previous 3 years, 0.3% admitted to falsification of data, 6% to a failure to present conflicting evidence and a worrying 15.5% to changing of study design, methodology or results in response to funder pressure. An overview by Fanelli has shown that questionable research practices are as common as 75%, while fraud per se occurs only in 1–3% of scientists.

We would also argue that the pressure to publish is the primary reason schools are relying on graduate students to teach classes once taught by the researchers themselves, depriving undergraduate students of the chance to learn from those mentors, who have offloaded their teaching responsibilities onto the backs of overworked and underpaid graduate assistants and non-tenured faculty.

And they, too, need their publications if they’re to have a chance at landing a tenured position.

An academic stranglehold

Last 18 February the McGill University Tribune in Canada noted the stranglehold the top publishers hold on academic publishing:

Elsevier dominates the industry. A 2015 report from Vincent Larivière of the Université de Montréal (UdeM) showed that Elsevier controls roughly a quarter of the scientific journal market, while competitors Springer and Wiley-Blackwell own nearly another quarter between them. The stranglehold that these companies have on the industry has allowed them to charge astronomically high subscription fees to universities, which had to field a 215 per cent increase in such fees between 1986 and 2003. These fees have come to claim an ever larger portion of university library budgets; in the 2018-2019 school year, McGill paid nearly $1.9 million to Elsevier alone for a subscription to ScienceDirect.

Academic publishers have been growing the way most businesses grow these days, by swallowing the competition. Here’s a 15 January 2015 report from Science on one of the industry’s biggest plays:

The London-based publisher of Nature and Scientific American, Macmillan Science and Education, announced today that it will merge with Berlin-based Springer Science+Business Media, one of the world’s largest science, technology, and medicine publishers. Together, the duo will generate an estimated $1.75 billion in annual sales and employ some 13,000 people. 

The German Holtzbrinck Publishing Group, which owns Macmillan Science and Education, will own 53% of the new company. BC Partners, a private equity firm that owns Springer+Business Media, will hold the rest. In 2013, BC Partners bought Springer in a deal worth approximately $3.8 billion.

The move is “aimed at securing the long-term growth of both businesses,” BC Partners said in a statement. Eventually, the firm aims to sell the new publishing giant, perhaps by transforming it into a publicly held company, managing partner Ewald Walgenbach told reporters. “The most likely exit will be an IPO [initial public offering],” he told Reuters. “However, that is still at least 2-3 years away.”

Elsevier and the other journal giants are militant in pursuing their control over the vastly profitable business, as the MIT Libraries explained in their examination of Elsevier’s actions during one critical year:

In 2011, Elsevier supported the Research Works Act (RWA), a bill that would have made illegal the NIH Public Access Policy, along with any other similar government effort to make taxpayer-funded research openly accessible to the public. Following public outcry, including a boycott, Elsevier withdrew its support, just hours before the bill’s sponsors declared it dead. In their statement, Elsevier indicated they would still “continue to oppose government mandates in this area.”

Elsevier and its senior executives made 31 contributions to members of the House in 2011, of which 12 went to Representative Maloney (NY), one of the sponsors of RWA.

The MIT Press was the first to disavow the Association of American Publishers’ support of RWA. Nature and Science and several university presses followed MIT Press’ lead with disavowals of their own.

Also in 2011, Elsevier supported the Stop Online Piracy Act (SOPA), which threatened free speech and innovation, in part by enabling law enforcement to block access to entire internet domains for infringing material posted on a single web page. In comparison, competitors Springer, Wiley, and Taylor & Francis did not make public statements in support.

In other words, a classic example of what economists call rent-seeking.

Another graphic, this one from a research report published in the open access journal PLOS ONE in 2015, offers convincing proof that academic publishing is an oligopoly:

Percentage of papers published by the five major publishers, by discipline in the Natural and Medical Sciences, 1973–2013.

But the major publishers want to control even more of the knowledge flowing from taxpayer-funded institutions, as Bloomberg reported 30 June 2020:

In an article published in Science in May, Aspesi and MIT Press Director Amy Brand warned that Elsevier and other big publishers are positioning themselves to play ever bigger roles in measuring researchers’ productivity and universities’ quality, and possibly even to act as one-stop portals for the global exchange of information within scientific disciplines. “The dominance of a limited number of social networks, shopping services, and search engines shows us how internet platforms based on data and analytics can tend toward monopoly,” they wrote. Such concentration isn’t inevitable in scientific communication, they concluded, but preventing it will require “the academic community to act in coordination.”

Changes under way in Europe

But changes are in the works, according to a report published yesterday in Science, the journal of the American Association for the Advancement of Science:

In 2018, a group of mostly European funders sent shock waves through the world of scientific publishing by proposing an unprecedented rule: The scientists they funded would be required to make journal articles developed with their support immediately free to read when published.

The new requirement, which takes effect starting this month, seeks to upend decades of tradition in scientific publishing, whereby scientists publish their research in journals for free and publishers make money by charging universities and other institutions for subscriptions. Advocates of the new scheme, called Plan S (the “S” stands for the intended “shock” to the status quo), hope to destroy subscription paywalls and speed scientific progress by allowing findings to be shared more freely. It’s part of a larger shift in scientific communication that began more than 20 years ago and has recently picked up steam.

Scientists have several ways to comply with Plan S, including by paying publishers a fee to make an article freely available on a journal website, or depositing the article in a free public repository where anyone can download it. The mandate is the first by an international coalition of funders, which now includes 17 agencies and six foundations, including the Wellcome Trust and Howard Hughes Medical Institute, two of the world’s largest funders of biomedical research.

The group, which calls itself Coalition S, has fallen short of its initial aspiration to catalyze a truly international movement, however. Officials in three top producers of scientific papers—China, India, and the United States—have expressed general support for open access, but have not signed on to Plan S. Its mandate for immediate open access will apply to authors who produced only about 6% of the world’s papers in 2017, according to an estimate by the Clarivate analytics firm, publisher of the Web of Science database.

Still, there’s reason to think Coalition S will make an outsize impact, says Johan Rooryck, Coalition S’s executive director and a linguist at Leiden University. In 2017, 35% of papers published in Nature and 31% of those in Science cited at least one coalition member as a funding source. “The people who get [Coalition S] funding are very prominent scientists who put out very visible papers,” Rooryck says. “We punch above our weight.” In a dramatic sign of that influence, the Nature and Cell Press families of journals—stables of high-profile publications—announced in recent weeks that they would allow authors to publish papers outside their paywall, for hefty fees.

However, as Jefferson Pooley of the London School of Economics writes, there are problems with the new model:

The deals offer, in Roger Schonfeld’s phrase, “to crown the existing major publishers as the OA [open access] Royalty.” Any open future, the reasoning goes, will be underwritten by the library expenditures already in the system. By hoovering those up—by sheltering their windfall profits—the big five are, at the same time, starving would-be competitors. Elsevier and the others are, as Richard Poynder has observed, “embedding themselves and their high prices into the new OA world, while elbowing aside OA publishers like Hindawi and PLOS.” There’s the related problem that the deals’ terms, in most cases, aren’t made public. So the pricing transparency that was supposed to discipline the APC—by introducing price-dampening competition for authors—is, in practice, obscured.

More fundamentally, the move to fold in author fees is an implicit endorsement of the deeply flawed APC regime—one that lowers barriers to readers only to raise them for authors. For scholars in the Global South—and in the humanities and social sciences everywhere—the APC option is laughably beyond reach. Yes, some publishers offer fee waivers, but the system is limited, shoddy, and patronizing—a charity band-aid on a broken system. Since author fees are stitched into read-and-publish deals, the approach serves to ratify—and secure in place—a scholarly publishing system underwritten by the APC. The deals also prop up, at least temporarily, the “hybrid” journal—the thousands of titles that publish tolled- and open-access articles side-by-side. Authors covered by the deals have every incentive to publish in hybrids: These are, very often, the established journals, marinated in prestige, and—unsurprisingly—the outlets that register the largest OA advantage.

More from the Association of College & Research Libraries:

Libraries and the faculty and institutions they serve are participants in the unusual business model that funds traditional scholarly publishing. Faculty produce and edit, typically without any direct financial advantage, the content that publishers then evaluate, assemble, publish and distribute. The colleges and universities that employ these faculty authors/editors then purchase, through their libraries, that packaged content back at exorbitant prices for use by those same faculty and their students. This unusual business model where the “necessary inputs” are provided free of cost to publishers who then in return sell that “input” back to the institutions that pay the salaries of the persons producing it has given rise to an unsustainable system begging for transformation. 

The subscription prices charged to institutions has far outpaced the budgets of the institutions’ libraries who are responsible for paying those bills. Years of stagnant university funding and the economic downturn rendered many library budgets flat, while journal pricing continued to rise. This problem became known as the “serials crisis.” Another element of the serials crisis that has been subject to discussion and debate has been the “big deal,” which is when large commercial publishers sell their complete list of titles to libraries at less than what a la carte pricing for titles would be individually. Some postulate that the big deal has helped negate the effects of the serials crisis while others argue that it actually hurts more than it helps.

But there’s a problem with the Big Deal

A problem that could better be described as the Big Screw.

University of California, Davis Librarian and Vice Provost MacKenzie Smith explained in an article for the open access news site The Conversation:

Under the new business model of licensing access to journals online rather than distributing them in print, for-profit publishers often lock libraries into bundled subscriptions that wrap the majority of a publisher’s portfolio of journals – almost 3,000 in Elsevier’s case – into a single, multimillion dollar package. Rather than storing back issues on shelves, libraries can lose permanent access to journals when a contract expires. And members of the public can no longer read the library’s copy of a journal because the licenses are limited to members of the university. Now the public must buy online copies of academic articles for an average of US$35 to $40 a pop.

The shift to digital has been good for researchers in many ways. It is far more convenient to search for articles online, and easier to access and download a copy – provided you work for an institution with a paid subscription. Modern software makes organizing and annotating them simpler, too. With all of these benefits, no one would advocate for going back to the old days of print journals.

Online access to journals did not improve the picture overall. Despite digital copies of articles costing nothing to duplicate and the cost of producing an article online being lower than in the past, the cost to libraries of licensing access to them has continued to experience hyperinflation. No library can afford to license all the journals its faculty and students want access to, and many researchers around the world are shut out completely. Compounding the problem, consolidation in the scholarly publishing market has reduced competition significantly, causing even more price inflexibility.

The Times of London on 12 March 2020 reported on the costs of academic journals to colleges and universities in the United Kingdom:

UK negotiators have vowed to strike “cost-effective and sustainable” deals with big publishers, as figures reveal that subscriptions to academic journals and other publishing charges are likely to have cost UK universities more than £1 billion over the past decade.

Data obtained using Freedom of Information requests show that UK universities paid some £950.6 million to the world’s 10 biggest publishing houses between 2010 and 2019. For the sector as a whole, however, the overall bill is likely to have topped £1 billion as one in five universities, including several Russell Group institutions, failed to provide cost information.

More than 90 per cent of this outlay was spent with five companies: Elsevier, Wiley, Springer Nature, Taylor & Francis and Sage, with Elsevier claiming £394 million over the 10-year period, roughly 41 per cent of monies received by big publishers.

Overall, the main publishers collected some £109.5 million in 2018-19 – up 44 per cent from 2010, when the bill was £76.1 million. In recent years, however, publishing costs have risen less sharply, climbing by 15 per cent since 2014-15.

And the Big Deal also leads to Big Profits and soaring stock prices

Here’s how share prices of RELX, Elsevier’s parent company, have performed compared to the FTSE [Financial Times Stock Exchange 100 Index of the top 100 most-capitalized firms on the London Stock Exchange], via the Financial Times:

University of California stuns academic publishers

But what if libraries stood up to the publishing giants?

Here’s the 28 February 2019 announcement from the University of California’s Office of the President that sent shockwaves through the academic publishing world:

As a leader in the global movement toward open access to publicly funded research, the University of California is taking a firm stand by deciding not to renew its subscriptions with Elsevier. Despite months of contract negotiations, Elsevier was unwilling to meet UC’s key goal: securing universal open access to UC research while containing the rapidly escalating costs associated with for-profit journals.

In negotiating with Elsevier, UC aimed to accelerate the pace of scientific discovery by ensuring that research produced by UC’s 10 campuses — which accounts for nearly 10 percent of all U.S. publishing output — would be immediately available to the world, without cost to the reader. Under Elsevier’s proposed terms, the publisher would have charged UC authors large publishing fees on top of the university’s multi-million dollar subscription, resulting in much greater cost to the university and much higher profits for Elsevier.

“Knowledge should not be accessible only to those who can pay,” said Robert May, chair of UC’s faculty Academic Senate. “The quest for full open access is essential if we are to truly uphold the mission of this university.” The Academic Senate issued a statement today endorsing UC’s position.

Open access publishing, which makes research freely available to anyone, anywhere in the world, fulfills UC’s mission by transmitting knowledge more broadly and facilitating new discoveries that build on the university’s research and scholarly work. This follows UC’s faculty-driven principles on scholarly communication.

“I fully support our faculty, staff and students in breaking down paywalls that hinder the sharing of groundbreaking research,” said UC President Janet Napolitano. “This issue does not just impact UC, but also countless scholars, researchers and scientists across the globe — and we stand with them in their push for full, unfettered access.”

Elsevier is the largest scholarly publisher in the world, disseminating about 18 percent of journal articles produced by UC faculty. The transformative model that UC faculty and libraries are championing would make it easier and more affordable for UC authors to publish in an open access environment.

“Make no mistake: The prices of scientific journals now are so high that not a single university in the U.S. — not the University of California, not Harvard, no institution — can afford to subscribe to them all,” said Jeffrey MacKie-Mason, university librarian and economics professor at UC Berkeley, and co-chair of UC’s negotiation team. “Publishing our scholarship behind a paywall deprives people of the access to and benefits of publicly funded research. That is terrible for society.”

Elsevier was unwilling to meet UC’s reasonable contract terms, which would integrate subscription charges and open access publishing fees, making open access the default for any article by a UC scholar and stabilizing journal costs for the university.

“The university’s, and the world’s, move toward open access has been a long time in the making. Many institutions and countries agree that the current system is both financially unsustainable and ill-suited to the needs of today’s global research enterprise,” said Ivy Anderson, associate executive director of UC’s California Digital Library and co-chair of UC’s negotiation team. “Open access will spur faster and better research — and greater global equity of access to new knowledge.”

The University of California followed up its Elsevier decision with an agreement a year later with the open access publisher PLOS, as reported by the UCLA Daily Bruin on 9 March 2020:

The University of California made a two-year open-access agreement Feb. 19 with the Public Library of Science, which researchers say is part of an upending of the traditional academic publishing model.

Under the deal, the UC Library will cover the first $1,000 of the article-processing charges required for researchers to publish in PLOS journals, which typically range from $1,500 to $3,000. Researchers without sufficient funds can petition the library to cover the remainder of the cost.

Academic research has traditionally been closed access, meaning universities have to pay publishers subscription costs to give researchers access to publications in academic journals.

Conversely, articles in open-access journals are publicly available at no cost to readers. Instead of subscription charges, open-access journals levy an article-processing charge to researchers once their article passes peer review.

The UC’s deal would benefit researchers with low grant funding, such as early-career researchers and researchers in the humanities and social sciences, by allowing them to submit their work to PLOS for publication more easily.

Still more universities jump on the bandwagon

West Virginia University also jumped on the unbundling bandwagon, with immediate results, as WVU Dean of Libraries Karen Diaz reported on 3 December 2018:

For two years now, West Virginia University Libraries has been working toward bringing our materials spending in line with the new budget realities that we have faced since 2016. One of the biggest challenges in our reduction in funds is managing “bundled” journals subscriptions that historically provided us with more journal title subscriptions at less cost. Unfortunately, over time the inflationary costs of these bundle subscriptions have outpaced the size of our budget.

In 2016, when we were first presented with the need to reduce our spending, bundled journal packages accounted for 30 percent of our materials budget but only provided 6.2 percent of our titles. We recognized at the time that we would have to address this significant portion of our budget to achieve the necessary savings. We did so immediately by unbundling our Wiley subscription package which provided us with about $400,000 in savings at that time. Now we are moving to unbundle the remaining packages.

Three years later, Florida State University did the same, with a very pleasing result, as Ars Technica reported last February:

When Florida State University cancelled its “big deal” contract for all Elsevier’s 2,500 journals last March to save money, the publisher warned it would backfire and cost the library $1 million extra in pay-per-view fees.

But even to the surprise of Gale Etschmaier, dean of FSU’s library, the charges after eight months were actually less than $20,000. “Elsevier has not come back to us about ‘the big deal’,” she said, noting it had made up a quarter of her content budget before the terms were changed.

Two months later, another university followed suit as Inside Higher Ed reported 13 April 2020:

The State University of New York Libraries Consortium announced on April 7 that it will not renew its bundled journal subscription deal with publisher Elsevier.

“While both parties negotiated in earnest and tried to come to acceptable terms for SUNY to maintain access to the full ScienceDirect package, in the end there was considerable disagreement around the value proposition of the ‘big deal,’” said the SUNY Libraries Consortium in a statement.

By subscribing to a core list of 248 journals, the SUNY libraries anticipate saving around $5 to $7 million per year. They currently spend around $10 million annually.

The University of North Carolina at Chapel Hill also announced last week that it is canceling its big deal with Elsevier for budgetary reasons.

Among other libraries unbundling are those of the University of North Carolina, the SUNY [State University of New York] Libraries Consortium, and Iowa State University.

The sordid roots of the academic publishing oligopoly

Writing for Med Page Today, British cardiologist Rohin Francis offers an interesting glimpse at the curious roots of the corporate academic publishing racket [and let’s face it, who else but organized crime racks up 40 percent annual profits?]:

People outside Britain might not have heard of Robert Maxwell, but you’ve certainly heard of his daughter, an associate of the convicted sex offender and dubiously-suicided Jeffrey Epstein. Ghislaine Maxwell is the daughter of Britain’s most notorious media tycoon, Robert Maxwell, fraudster, alleged spy, and one of the inspirations for Logan Roy’s character in “Succession.”

If you go back a few decades, the idea of making money out of scientific work was absurd. Of course, businessmen used scientific ideas throughout the Industrial Revolution and people could patent use of their ideas, but knowledge itself was shared freely particularly among scientists until Maxwell realized he could turn science into profit and created Pergamon Press. Stephen Buranyi colorfully illustrates Maxwell’s rise to power and influence in his excellent article that I’m linking below.

But essentially Maxwell wowed scientists with flash hotels, glamorous parties, and cold hard cash, then signed them up to exclusive deals with his journals. We would get dinner and fine wine, and at the end, he would present us a check, a few thousand pounds for the society. It was more money than us poor scientists had ever seen.

I didn’t realize until researching this video how enormous his influence on modern science has been. The whole system has been shaped by his model: the paid subscriptions, the journalistic way controversy and novelty are prioritized, the dominance of a handful of journals, and the way scientists dream of being published in those high-profile periodicals.

For more on Maxwell, see this profile in the Guardian.

Clearly, academic journals have become what lawyers like to call the fruit of the poisonous tree.

Quote of the day: Climate and COVID are linked


From Ilana Cohen, associate managing editor of the Harvard Political Review, writing at Inside Climate News:

In many ways, the United States’ struggle to control Covid-19 has painted a picture, part hopeful and part harrowing, of how the climate crisis might play out in the decades to come. 

Many climate activists and progressives hoped—at least initially—that the death and illness associated with a worldwide pandemic would make it easier for people to take distant climate threats more seriously.

It didn’t take all that much imagination. The parallels were everywhere.

As Bullard noted, the same communities were being disproportionately affected in each crisis. 

And the same fine particle air pollution, known as PM 2.5, caused primarily by burning fossil fuels, was shown in an early Harvard study to be linked to higher Covid-19 death rates among people living in polluted areas.

Climate change is also responsible for the proliferation of zoonotic diseases, like Covid-19, as drought, flooding and extreme weather force food production to encroach on habitats populated by bats, monkeys and other virus-carrying wild animals.

But while Covid-19 has raised some people’s consciousness about the urgent need to act on climate change, it has had the opposite effect on others. At least in the United States, the president and much of his base have embraced the same science denialism that has for years greeted climate change, even as deaths from the coronavirus soared.

Whether or not the Covid-19 pandemic ultimately bolsters or hampers the prospects for U.S. and global climate action, the two crises remain inextricably linked. At least for the foreseeable future, any effort to meaningfully address the root causes of one will involve confronting the other. 

Read the rest. . .

Scientists stymied Trump climate report rewrite


The Washington Post today has an excellent report on how U.S. government scientists thwarted the President’s efforts to rewrite a critical report on climate change.

Here’s the intro:

The National Climate Assessment, America’s premier contribution to climate knowledge, stands out for many reasons: Hundreds of scientists across the federal government and academia join forces to compile the best insights available on climate change. The results, released just twice a decade or so, shape years of government decisions.

Now, as the clock runs down on President Trump’s time in office, the climate assessment has gained a new distinction: It is one of the few major U.S. climate initiatives that his administration tried, yet largely failed, to undermine.

<snip>

In November, the administration removed the person responsible for the next edition of the report and replaced him with someone who has downplayed climate science, though at this point it seems to be too little, too late. But the efforts started back in 2018, when officials pushed out a top official and leaned on scientists to soften their conclusions — the scientists refused — and then later tried to bury the report, which didn’t work either.

“Thank God they didn’t know how to run a government,” said Thomas Armstrong, who during the Obama administration led the U.S. Global Change Research Program, which produces the assessment. “It could have been a lot worse.”

Donald Trump has consistently denied climate change, as Inside Climate News noted a year ago:

In almost every agency overseeing energy, the environment and health, Trump selected top officials who dispute the mainstream consensus on the urgency of climate action. People with little scientific background, or strong ties to industries they would be regulating, were appointed to scientific leadership positions. One of the administration’s first actions was to order scientists and other employees at EPA and other agencies to halt public communications.

Several federal scientists working on climate change have said they were silenced, sidelined or demoted. At least three—a senior employee at the Department of Interior, one at the Centers for Disease Control and Prevention and another at the National Park Service—invoked whistleblower protections. Independent science advisors, such as members of the EPA’s Board of Scientific Counselors, have also been sidelined.

Scientific content on government websites has been altered and the public’s access to data reduced. Climate data from the government’s open portal website was removed. So was the EPA’s climate change website. The words “climate change” have been purged from government reports, and other reports have been buried, including by officials at the Department of Agriculture. The administration even edited a major Defense Department report to downplay its climate findings. Through speeches and tweets, the president has repeatedly spread misinformation to the public through his climate denial and denigration of renewable energy.

EPA, meanwhile, is working to finalize its proposal to restrict the types of scientific evidence the agency can use in writing its rules. This includes prohibiting the use of well-established, long-term scientific studies underpinning the nation’s air pollution rules, a change the fossil fuel industry had sought for years. Known as the “secret science” rule, it has been lambasted by scientists and health experts worldwide. Relatedly, the White House issued a memo offering new ways for fossil fuel and other industries to challenge science-based policies.

Trump’s focus, instead, was on corporate profits, and his ceaseless gutting of environmental rules came with a promise that his efforts would launch an economic boom.

That didn’t go so well, as the New York Times reported 3 December:

Economists see little evidence that Mr. Trump’s rollback of climate change rules bolstered the economy. Jobs in the auto sector have been declining since the beginning of 2019, and the trend continued despite the rollback of rules aimed at vehicle pollution from greenhouse gases. Domestic coal production last year dropped to its lowest level since 1978. In September, the French government actually blocked a $7 billion contract to purchase American natural gas, arguing that gas produced without controls on methane leaks was too harmful to the climate.

Meantime, in May, carbon dioxide levels reached 417 parts per million, the highest level recorded in human history.

“Because global emissions in 2020 are so much higher than they were 10 or 20 or 30 years ago, that means that a year wasted in the Trump administration on not acting on climate has much bigger consequences than a year wasted in Ronald Reagan or George W. Bush or Bill Clinton’s administration,” said Michael Wara, a climate and energy expert at Stanford University.

Analysts say that the past four years represented a closing window in which the world’s largest polluting economies, working together, could have charted a path toward slowing the rate of planet-warming emissions. To do that, a scientific report in 2018 found that the world’s economies would need to reduce emissions 45 percent from 2010 levels by 2030 — and the policies to do so should be implemented rapidly.

Instead, in the largest economy in the world, they began to fray.

We leave the last word, or rather burn, to our favorite Swede [and note the face of the guard, too]:

Can psychedelic drugs ease the pain of prejudice?


There are strong indications that they might be able to do just that, according to a fascinating new study from Ohio State University.

We’ve noted extensively the groundbreaking research showing that drugs such as psilocybin and LSD hold strong promise for treating a wide range of afflictions, from spousal abuse, migraines, depression, and social isolation to nicotine addiction and alcoholism.

The latest findings concern the impact of these drugs on the daily stresses imposed by racial bigotry.

From Ohio State University:

One psychedelic experience may lessen trauma of racial injustice

A single positive experience on a psychedelic drug may help reduce stress, depression and anxiety symptoms in Black, Indigenous and people of color whose encounters with racism have caused lasting harm, a new study suggests.

The participants in the retrospective study reported that their trauma-related symptoms linked to racist acts were lowered in the 30 days after an experience with either psilocybin (Magic Mushrooms), LSD or MDMA (Ecstasy).

“Their experience with psychedelic drugs was so powerful that they could recall and report on changes in symptoms from racial trauma that they had experienced in their lives, and they remembered it having a significant reduction in their mental health problems afterward,” said Alan Davis, co-lead author of the study and an assistant professor of social work at The Ohio State University.

Overall, the study also showed that the more intensely spiritual and insightful the psychedelic experience was, the more significant the recalled decreases in trauma-related symptoms were.

A growing body of research has suggested psychedelics have a place in therapy, especially when administered in a controlled setting. What previous mental health research has generally lacked, Davis noted, is a focus on people of color and on treatment that could specifically address the trauma of chronic exposure to racism.

Davis partnered with co-lead author Monnica Williams, Canada Research Chair in Mental Health Disparities at the University of Ottawa, to conduct the research.

“Currently, there are no empirically supported treatments specifically for racial trauma. This study shows that psychedelics can be an important avenue for healing,” Williams said.

The study is published online in the journal Drugs: Education, Prevention and Policy.

The researchers recruited participants in the United States and Canada using Qualtrics survey research panels, assembling a sample of 313 people who reported they had taken a dose of a psychedelic drug in the past that they believed contributed to “relief from the challenging effects of racial discrimination.” The sample comprised adults who identified as Black, Asian, Hispanic, Native American/Indigenous Canadian, Native Hawaiian and Pacific Islander.

Details, after the jump. . .


Heavens above! Where giants are born


A stunning image from the Herschel Observatory, via NASA [click on the image to enlarge]:

The Little Fox and the Giant Stars

From NASA:

New stars are the lifeblood of our galaxy, and there is enough material revealed by this Herschel infrared image to build stars for millions of years to come.

Situated 8,000 light-years away in the constellation Vulpecula — Latin for “little fox” — the region in the image is known as Vulpecula OB1. It is a “stellar association” in which a batch of truly giant “OB” stars is being born. O and B stars are the largest stars that can form.

The giant stars at the heart of Vulpecula OB1 are some of the biggest in the galaxy. Containing dozens of times the mass of the sun, they have short lives, astronomically speaking, because they burn their fuel so quickly. At an estimated age of 2 million years, they are already well through their lifespans. When their fuel runs out, they will collapse and explode as supernovas. The shock this will send through the surrounding cloud will trigger the birth of even more stars, and the cycle will begin again.

O stars are at least 16 times more massive than the sun, and could be well over 100 times as massive. They are anywhere from 30,000 to 1 million times brighter than the sun, but they only live up to a few million years before exploding. B stars are between two and 16 times as massive as the sun. They can range from 25 to 30,000 times brighter than the sun.

OB associations are regions with collections of O and B stars. Since OB stars have such short lives, finding them in large numbers indicates the region must be a strong site of ongoing star formation, which will include many more smaller stars that will survive far longer.

The vast quantities of ultraviolet light and other radiation emitted by these stars are compressing the surrounding cloud, causing nearby regions of dust and gas to begin the collapse into more new stars. In time, this process will “eat” its way through the cloud, transforming some of the raw material into shining new stars.

The image was obtained as part of Herschel’s Hi-GAL key-project. This used the infrared space observatory’s instruments to image the entire galactic plane in five different infrared wavelengths.

These wavelengths reveal cold material, most of it between -220°C and -260°C. None of it can be seen in ordinary optical wavelengths, but this infrared view shows astronomers a surprising amount of structure in the cloud’s interior.

The surprise is that the Hi-GAL survey has revealed a spider’s web of filaments that stretches across the star-forming regions of our galaxy. Part of this vast network can be seen in this image as a filigree of red and orange threads.

In visual wavelengths, the OB association is linked to a star cluster catalogued as NGC 6823. It was discovered by William Herschel in 1785 and contains 50 to 100 stars. A nebula emitting visible light, catalogued as NGC 6820, is also part of this multi-faceted star-forming region.

Herschel is a European Space Agency mission, with science instruments provided by consortia of European institutes and with important participation by NASA. While the observatory stopped making science observations in April 2013, after running out of liquid coolant as expected, scientists continue to analyze its data. NASA’s Herschel Project Office is based at NASA’s Jet Propulsion Laboratory, Pasadena, California. JPL contributed mission-enabling technology for two of Herschel’s three science instruments. The NASA Herschel Science Center, part of the Infrared Processing and Analysis Center at the California Institute of Technology in Pasadena, supports the U.S. astronomical community. Caltech manages JPL for NASA.
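One aside that isn’t in the NASA release, but helps explain why such cold material shows up only in the far-infrared: Wien’s displacement law relates a body’s temperature to the wavelength at which it radiates most strongly,

\[
\lambda_{\text{peak}} = \frac{b}{T}, \qquad b \approx 2898\ \mu\text{m}\cdot\text{K}.
\]

Material at about -220°C (roughly 53 K) therefore peaks near 2898/53 ≈ 55 μm, and material at -260°C (roughly 13 K) near 220 μm, both squarely within the far-infrared range Herschel’s instruments were built to observe (roughly 55 to 670 μm).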