OREGON STATE UNIVERSITY

Scientific Research and Advances

New study shows three abrupt pulses of CO2 during last deglaciation

CORVALLIS, Ore. – A new study shows that the rise of atmospheric carbon dioxide that contributed to the end of the last ice age more than 10,000 years ago did not occur gradually, but was characterized by three “pulses” in which CO2 rose abruptly.

Scientists are not sure what caused these abrupt increases, during which CO2 levels rose about 10-15 parts per million – or about 5 percent per episode – over a period of 1-2 centuries. It likely was a combination of factors, they say, including ocean circulation, changing wind patterns, and terrestrial processes.

The finding is important, however, because it casts new light on the mechanisms that take the Earth in and out of ice age regimes. Results of the study, which was funded by the National Science Foundation, appear this week in the journal Nature.

“We used to think that naturally occurring changes in carbon dioxide took place relatively slowly over the 10,000 years it took to move out of the last ice age,” said Shaun Marcott, lead author on the article who conducted his study as a post-doctoral researcher at Oregon State University. “This abrupt, centennial-scale variability of CO2 appears to be a fundamental part of the global carbon cycle.”

Some previous research has hinted at the possibility that spikes in atmospheric carbon dioxide may have accelerated the last deglaciation, but that hypothesis had not been resolved, the researchers say. The key to the new finding is the analysis of an ice core from the West Antarctic that provided the scientists with an unprecedented glimpse into the past.

Scientists studying past climate have been hampered by the limitations of previous ice cores. Cores from Greenland, for example, provide unique records of rapid climate events going back 120,000 years – but high concentrations of impurities don’t allow researchers to accurately determine atmospheric carbon dioxide records. Antarctic ice cores have fewer impurities, but generally have had lower “temporal resolution,” providing less detailed information about atmospheric CO2.

However, a new core from West Antarctica, drilled to a depth of 3,405 meters in 2011 and spanning the last 68,000 years, has “extraordinary detail,” said Oregon State paleoclimatologist Edward Brook, a co-author on the Nature study and an internationally recognized ice core expert. Because the area where the core was taken gets high annual snowfall, he said, the new ice core provides one of the most detailed records of atmospheric CO2.

“It is a remarkable ice core and it clearly shows distinct pulses of carbon dioxide increase that can be very reliably dated,” Brook said. “These are some of the fastest natural changes in CO2 we have observed, and were probably big enough on their own to impact the Earth’s climate.

“The abrupt events did not end the ice age by themselves,” Brook added. “That might be jumping the gun a bit. But it is fair to say that the natural carbon cycle can change a lot faster than was previously thought – and we don’t know all of the mechanisms that caused that rapid change.”

The researchers say that the increase in atmospheric CO2 from the peak of the last ice age to complete deglaciation was about 80 parts per million, taking place over 10,000 years. Thus, the finding that 30-45 ppm of the increase happened in just a few centuries was significant.
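For a rough sense of how different those rates are, the figures above can be compared directly. The short sketch below uses only the numbers quoted in this release; the arithmetic is illustrative and not taken from the study itself.

```python
# Rate comparison using only the figures quoted in this release.
total_rise_ppm = 80         # full deglacial CO2 rise
total_span_yr = 10_000      # years over which that rise occurred

pulse_rise_ppm = (10, 15)   # size of a single abrupt pulse, ppm
pulse_span_yr = (100, 200)  # duration of a pulse (one to two centuries)

background_rate = total_rise_ppm / total_span_yr        # ~0.008 ppm/yr
pulse_rate_slow = pulse_rise_ppm[0] / pulse_span_yr[1]  # 0.05 ppm/yr
pulse_rate_fast = pulse_rise_ppm[1] / pulse_span_yr[0]  # 0.15 ppm/yr

print(f"Average deglacial rate: {background_rate:.3f} ppm/yr")
print(f"Pulse rate: {pulse_rate_slow:.2f} to {pulse_rate_fast:.2f} ppm/yr, "
      f"roughly {pulse_rate_slow / background_rate:.0f}x to "
      f"{pulse_rate_fast / background_rate:.0f}x the average")
```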

The overall rise of atmospheric carbon dioxide during the last deglaciation was thought to have been triggered by the release of CO2 from the deep ocean – especially the Southern Ocean. However, the researchers say that no obvious ocean mechanism is known that would trigger rises of 10-15 ppm over a time span as short as one to two centuries.

“The oceans are simply not thought to respond that fast,” Brook said. “Either the cause of these pulses is at least part terrestrial, or there is some mechanism in the ocean system we don’t yet know about.”

One reason the researchers are reluctant to pin the end of the last ice age solely on CO2 increases is that other processes were taking place, according to Marcott, who recently joined the faculty of the University of Wisconsin-Madison.

“At the same time CO2 was increasing, methane in the atmosphere was also increasing at the same or a slightly higher rate,” Marcott said. “We also know that during at least two of these pulses, the Atlantic Meridional Overturning Circulation changed as well. Changes in the ocean circulation would have affected CO2 – and indirectly methane, by impacting global rainfall patterns.”

“The Earth is a big coupled system,” he added, “and there are many pieces to the puzzle. The discovery of these strong, rapid pulses of CO2 is an important piece.”

Sources: Shaun Marcott, smarcott@wisc.edu; Ed Brook, 541-737-8197, brooke@geo.oregonstate.edu

Multimedia Downloads

(Feature photo) Donald Voigt from Penn State looks at an ice core in January 2012 during the WAIS Divide project. Photo courtesy of Gifford Wong, Dartmouth.

OSU scientists have examined air bubbles trapped in a new ice core that are providing some of the clearest indications of atmospheric conditions during the last ice age.

Oral contraception may become renewed option for HIV-positive women

CORVALLIS, Ore. – Contrary to guidelines issued by the World Health Organization, new research has found that HIV-positive women receiving one of the most common forms of drug therapy should be able to use at least some forms of oral contraceptives for birth control.

The findings, just published in the journal Contraception, may lead to new birth control options for women with HIV. Further research should be done to confirm that clinical outcomes are consistent with conclusions that have been based on pharmacokinetic analysis, scientists said.

Worldwide, the leading cause of death among women ages 18-45 is HIV/AIDS, and prevention of mother-to-child transmission of HIV by reducing unintended pregnancy is a United Nations millennium goal for 2010-15. This research, and broad access to oral contraception, could help reach that goal.

The research was done by scientists from the Oregon State University/Oregon Health & Science University College of Pharmacy, the Albert Einstein College of Medicine and the University of Southern California.

Although millions of women around the world routinely use oral contraception, it has been largely avoided by those with HIV infections because some of the drugs commonly used to control HIV are believed to reduce the effectiveness of birth control pills.

Because of that, both the World Health Organization and the Centers for Disease Control and Prevention have suggested that oral contraceptives should not be used by HIV-positive women if other methods of birth control are available – such as barrier devices, IUDs or other approaches.

The new study, however, raises doubts about such a broad guideline against oral contraception. It found that while some types of oral contraceptives may be subject to this concern, others that are highly efficacious should be even more effective when particular HIV drugs are used.

“Oral contraception is used by millions of women and is among the most popular forms of birth control,” said Ganesh Cherala, an OSU assistant professor of pharmacy practice, a corresponding author on the study. Cherala is an expert in pharmacokinetics, or how drugs behave, interact and are transformed in the body.

“It’s important for women to have access to – and the ability to choose from – as wide a range of birth control options as possible,” Cherala said. “We believe this research shows the WHO guidelines are too generic and unnecessarily cautious. There clearly appear to be oral contraceptives that should be safe and effective in women being treated with HIV medications.”

In general, there are two types of oral contraceptives: “combination” drugs that include both estrogen and progestin, and drugs that are based solely on progestin for their efficacy. The concerns raised about reduced efficacy were based primarily on studies of the combination oral contraceptives, and may be valid for that group of drugs, Cherala said.

However, the new study found that progestin-only contraceptives based on at least one progestin, norethindrone, should actually have a slightly higher level of birth control efficacy, not a reduced one, when a woman is taking one of the primary therapies for HIV, called a ritonavir-boosted atazanavir antiretroviral therapy.

Other progestin-only birth control drugs may also have the same properties as the ones based on norethindrone, but that has not yet been conclusively demonstrated, Cherala said. 

The research was done with both treatment and control groups of women who were HIV-positive, ages 18-44, with no other recent use of hormonal contraception.

“These findings are interesting and exciting,” Cherala said. “They should ultimately give women more options to consider for birth control.”

Historically, some of the progestin-only oral contraceptives had more unwanted side effects than the combination contraceptives, Cherala said. However, those differences are now very small, as improved forms of progestin-only contraceptives have come to market.

This research was supported by the National Institutes of Health.

Source: Ganesh Cherala, 503-418-0447

Study: Could sleeper sharks be preying on protected Steller sea lions?

NEWPORT, Ore. – Pacific sleeper sharks, a large, slow-moving species thought of as primarily a scavenger or predator of fish, may be preying on something a bit larger – protected Steller sea lions in the Gulf of Alaska.

A new study found the first indirect evidence that this cold-blooded shark that can grow to a length of more than 20 feet – longer than a great white shark – may be an opportunistic predator of juvenile Steller sea lions.

Results of the study have just been published in the journal Fishery Bulletin. The findings are important, scientists say, because of management implications for the protected Steller sea lions.

For the past decade, Markus Horning of the Marine Mammal Institute at Oregon State University has led a project in collaboration with Jo-Ann Mellish of the Alaska SeaLife Center to deploy specially designed “life history transmitters” into the abdomens of juvenile Steller sea lions. These buoyant archival tags record data on temperature, light and other properties during the sea lions’ lives; after the animals die, the tags float to the surface or fall out onshore and transmit data to researchers via satellite.

From 2005-11, Horning and his colleagues implanted tags into 36 juvenile Steller sea lions and, over a period of several years, 17 of the sea lions died. Fifteen transmitters sent data indicating the sea lions had been killed by predation.

“The tags sense light and air to which they are suddenly exposed, and record rapid temperature change,” said Horning, who is in OSU’s Department of Fisheries and Wildlife. “That is an indication that the tag has been ripped out of the body, though we don’t know what the predator is that did this.

“At least three of the deaths were different,” he added. “They recorded abrupt temperature drops, but the tags were still dark and still surrounded by tissue. We surmise that the sea lions were consumed by a cold-blooded predator because the recorded temperatures aligned with the deep waters of the Gulf of Alaska and not the surface waters.

“We know the predator was not a killer whale, for example, because the temperatures would be much higher since they are warm-blooded animals.” Data collected from the transmitters recorded temperatures of 5-8 degrees Celsius.

That leaves a few other suspects, Horning said. However, two known predators of sea lions – great white sharks and salmon sharks – have counter-current heat exchanges in their bodies that make them partially warm-blooded and the tags would have reflected higher temperatures.

By process of elimination, Horning suspects sleeper sharks.
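That elimination logic can be summarized in a brief illustrative sketch. The 5-8 degree Celsius tag readings come from the release; the approximate predator body temperatures below are assumed values for the example only, not measurements from the study.

```python
# Illustrative sketch of the elimination reasoning. The 5-8 C tag readings
# are from the release; the candidate body temperatures are rough assumed
# values, for illustration only.
TAG_READING_C = (5.0, 8.0)  # temperatures recorded by the recovered tags

ASSUMED_BODY_TEMP_C = {
    "killer whale (warm-blooded)": 36.0,
    "great white shark (partially warm-blooded)": 20.0,
    "salmon shark (partially warm-blooded)": 20.0,
    "Pacific sleeper shark (cold-blooded)": 6.0,  # ~ambient deep water
}

low, high = TAG_READING_C
for predator, body_temp in ASSUMED_BODY_TEMP_C.items():
    verdict = "consistent" if low <= body_temp <= high else "ruled out"
    print(f"{predator}: {verdict} with tag readings of {low}-{high} C")
```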

The Oregon State pinniped specialist acknowledges that the evidence for sleeper sharks is indirect and not definitive, thus he is planning to study them more closely beginning in 2015. The number of sleeper sharks killed in Alaska as bycatch ranges from 3,000 to 15,000 annually, indicating there are large numbers of the shark out there. The sleeper sharks caught up in the nets are usually comparatively small; larger sharks are big enough to tear the fishing gear and are rarely landed.

“If sleeper sharks are involved in predation, it creates something of a dilemma,” said Horning, who works out of OSU’s Hatfield Marine Science Center in Newport, Ore. “In recent years, groundfish harvests in the Gulf of Alaska have been limited in some regions to reduce the potential competition for fish that would be preferred food for Steller sea lions.

“By limiting fishing, however, you may be reducing the bycatch that helps keep a possible limit on a potential predator of the sea lions,” he added. “The implication could be profound, and the net effect of such management actions could be the opposite of what was intended.”

Other studies have found remains of Steller sea lions and other marine mammals in the stomachs of sleeper sharks, but those could have been the result of scavenging instead of predation, Horning pointed out.

The western distinct population of Steller sea lions has declined to about 20 percent of its level prior to 1975.

Source: Markus Horning, 541-867-0270, markus.horning@oregonstate.edu

Scientists discover carbonate rocks are unrecognized methane sink

CORVALLIS, Ore. – Since the first undersea methane seep was discovered 30 years ago, scientists have meticulously analyzed and measured how microbes in the seafloor sediments consume the greenhouse gas methane as part of understanding how the Earth works.

The sediment-based microbes form an important methane “sink,” preventing much of the chemical from reaching the atmosphere and contributing to greenhouse gas accumulation. As a byproduct of this process, the microbes create a type of rock known as authigenic carbonate, which while interesting to scientists was not thought to be involved in the processing of methane.

That is no longer the case. A team of scientists has discovered that these authigenic carbonate rocks also contain vast amounts of active microbes that take up methane. The results of their study, which was funded by the National Science Foundation, were reported today in the journal Nature Communications.

“No one had really examined these rocks as living habitats before,” noted Andrew Thurber, an Oregon State University marine ecologist and co-author on the paper. “It was just assumed that they were inactive. In previous studies, we had seen remnants of microbes in the rocks – DNA and lipids – but we thought they were relics of past activity. We didn’t know they were active.

“This goes to show how the global methane process is still rather poorly understood,” Thurber added.

Lead author Jeffrey Marlow of the California Institute of Technology and his colleagues studied samples of authigenic carbonate collected off the coasts of the Pacific Northwest (Hydrate Ridge), northern California (Eel River Basin) and Central America (the Costa Rica margin). The rocks range in size and distribution from small pebbles to carbonate “pavement” stretching dozens of square miles.

“Methane-derived carbonates represent a large volume within many seep systems and finding active methane-consuming archaea and bacteria in the interior of these carbonate rocks extends the known habitat for methane-consuming microorganisms beyond the relatively thin layer of sediment that may overlay a carbonate mound,” said Marlow, a geobiology graduate student in the lab of Victoria Orphan of Caltech.

These assemblages are also found in the Gulf of Mexico as well as off Chile, New Zealand, Africa, Europe – “and pretty much every ocean basin in the world,” noted Thurber, an assistant professor (senior research) in Oregon State’s College of Earth, Ocean, and Atmospheric Sciences.

The study is important, scientists say, because the rock-based microbes potentially may consume a huge amount of methane. The microbes were less active than those found in the sediment, but were more abundant – and the areas they inhabit are extensive, making their potential importance enormous. Studies have found that approximately 3-6 percent of the methane in the atmosphere comes from marine sources – a number that is low because microbes in the ocean sediments consume some 60-90 percent of the methane that would otherwise escape.
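The quoted percentages give a rough sense of how large that microbial sink is. The sketch below is illustrative arithmetic only, using the release's round numbers rather than values from the paper.

```python
# Illustrative arithmetic based on the round percentages quoted above.
marine_share = 0.03       # marine methane as share of atmospheric sources (3-6%)
consumed_fraction = 0.90  # fraction of seep methane consumed before escaping (60-90%)

escaping = marine_share                                  # what reaches the atmosphere
gross_production = escaping / (1.0 - consumed_fraction)  # what seeps actually emit
other_sources = 1.0 - escaping                           # everything non-marine

# Rough counterfactual: marine share if nothing were consumed on the way up.
no_sink_share = gross_production / (other_sources + gross_production)
print(f"Without the microbial sink, marine methane could make up roughly "
      f"{no_sink_share:.0%} of atmospheric sources instead of {marine_share:.0%}")
```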

Now those ratios will have to be re-examined to determine how much of the methane sink can be attributed to microbes in rocks versus those in sediments. The distinction is important, the researchers say, because it is an unrecognized sink for a potentially very important greenhouse gas.

“We found that these carbonate rocks located in areas of active methane seeps are themselves more active,” Thurber said. “Rocks located in comparatively inactive regions had little microbial activity. However, they can quickly activate when methane becomes available.

“In some ways, these rocks are like armies waiting in the wings to be called upon when needed to absorb methane.”

The ocean contains vast amounts of methane, which has long been a concern to scientists. Marine reservoirs of methane are estimated to total more than 455 gigatons of carbon, and may hold as much as 10,000 gigatons of carbon in methane. A gigaton is approximately 1.1 billion tons.

By contrast, all of the planet’s gas and oil deposits are thought to total about 200-300 gigatons of carbon.
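A short calculation puts that comparison in perspective; the figures below are those quoted in this release.

```python
# Unit conversion and comparison using the figures quoted above.
GIGATON_IN_TONS = 1.1e9               # per the release, ~1.1 billion tons per gigaton

marine_methane_gt_c = (455, 10_000)   # marine methane reservoir, gigatons of carbon
fossil_fuel_gt_c = (200, 300)         # all gas and oil deposits, gigatons of carbon

low_ratio = marine_methane_gt_c[0] / fossil_fuel_gt_c[1]    # ~1.5x
high_ratio = marine_methane_gt_c[1] / fossil_fuel_gt_c[0]   # ~50x
print(f"Marine methane holds roughly {low_ratio:.1f}x to {high_ratio:.0f}x as much "
      f"carbon as known gas and oil deposits")
print(f"455 gigatons is about {455 * GIGATON_IN_TONS:.2e} tons")
```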

Source: Andrew Thurber, 541-737-4500, athurber@coas.oregonstate.edu

Educated community no protection against a poor diet for children

 

CORVALLIS, Ore. – A study of elementary school children in a highly educated community in the Pacific Northwest found that about three fourths of the students had vitamin D levels that were either insufficient or deficient, and they also lacked an adequate intake of other important nutrients.

The findings, reported recently in the Journal of Extension by scientists from the Linus Pauling Institute at Oregon State University, make it clear that nutritional deficiencies can be profound even in communities with a very knowledgeable population and easy access to high quality, affordable food.

Many other studies have found similar concerns in areas with low socioeconomic status, poor food availability, and lower levels of education. This research identified significant nutritional problems in Corvallis, Ore., a university town with many grocery stores, a free bus transit system and some of the highest educational levels in the nation. In Corvallis, 26 percent of residents have completed a graduate degree, which is more than double the national average.

As students grew from younger children into adolescents, the problems only got worse, the research showed. The trend was toward a diet increasingly focused on simple carbohydrates and low in both fiber and important micronutrients.

“It’s becoming increasingly clear that health and dietary concerns cut across all populations, including comparatively well-educated or affluent groups,” said Gerd Bobe, an assistant professor with the OSU College of Agricultural Sciences and member of the Healthy Youth Program at the Linus Pauling Institute.

“This research also showed nutritional status and dietary choices are getting worse as students become teenagers,” Bobe said. “The foundation for lifelong health is laid in childhood, and puberty is a critical time for growth, brain and bone development. It’s a really bad time to have a bad diet. Since Corvallis is a very educated community with many health-conscious individuals, this is an illustration of just how widespread these problems are.”

The study focused some of its attention on vitamin D, taking blood samples from 71 students at four public elementary schools in Corvallis. The researchers found that about 8 percent of students were outright deficient in this vitamin, and about 69 percent were “insufficient,” defined as a level that’s less than ideal for optimal health.
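Those percentages are consistent with the “about three fourths” figure cited earlier; the counts in this quick check are back-calculated from the quoted percentages and are therefore approximate.

```python
# Consistency check: counts back-calculated from the quoted percentages.
students = 71
deficient = round(0.08 * students)     # ~6 students outright deficient
insufficient = round(0.69 * students)  # ~49 students insufficient

below_ideal = deficient + insufficient
print(f"{below_ideal} of {students} students ({below_ideal / students:.0%}) "
      f"were deficient or insufficient in vitamin D - about three-fourths")
```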

Vitamin D is important for immune function; brain, muscle and bone development and health; and prevention of chronic diseases, including diabetes and cancer. It is often found to be deficient in many temperate zones where people don’t get adequate sun exposure.

The study examined children in two age groups, 5-8 years old and 9-11 years old. The older children had even lower vitamin D levels than the younger group, correlated with a lower consumption of dairy products.

Based on their findings, the researchers suggested that educational or outreach programs to improve nutrition understanding are needed in a broad cross section of society, not just low-income or underserved groups.

“Studies show that children and adults both learn best when all their senses are involved, through something like cooking classes combined with nutrition education,” said Simone Frei, manager of the Healthy Youth Program at the Linus Pauling Institute.

“In our classes, youth and families learn about eating plenty of fresh fruits and vegetables, whole grain foods and reducing processed foods,” Frei said. “This is even more effective when combined with growing and harvesting fresh vegetables from the garden.”

This program, among other initiatives, operates a “Chefs in the Garden” summer camp, and data show that 65 percent of participating children increased their vegetable consumption after the camp experience.

Among the other findings of the study:

  • Most of the children reported a diet insufficient in fiber and essential fatty acids;
  • Nearly all children consumed less potassium and more sodium than recommended, a health habit that ultimately can be associated with higher levels of chronic diseases, including heart disease and cancer;
  • Only a single child in the entire study group reported a diet that would provide adequate intake of vitamin E, which is important for neurological development, cognition and anemia prevention.

As a result of the study, educational programs were developed by the Healthy Youth Program at OSU, some of which are available online at http://bit.ly/1ukBNnT

Source: Gerd Bobe, 541-737-1898

Multimedia Downloads

Learning to garden

A healthy feast

Study finds air temperature models poor at predicting stream temps

CORVALLIS, Ore. – Stream temperatures are expected to rise in the future as a result of climate change, but a new study has found that the correlation between air temperature and stream temperature is surprisingly tenuous.

The findings cast doubt on many statistical models using air temperatures to predict future stream temperatures.

Lead author Ivan Arismendi, a stream ecologist at Oregon State University, examined historic stream temperature data over a period of one to four decades from 25 sites in the western United States to see if increases in air temperature during this period could have predicted – through the use of statistical models – the observed stream temperatures.

He discovered that many streams were cooler than the models predicted, while others were warmer. The difference between modeled and measured temperatures, however, was staggering – as much as 12 degrees Celsius in some rivers.

Results of the study have recently been published in the journal Environmental Research Letters. The study involved scientists from Oregon State, the U.S. Forest Service and the U.S. Geological Survey, and was supported by all three organizations, as well as by the National Science Foundation.

“These air-stream temperature models originated as a tool for looking at short-term relationships,” said Arismendi, a researcher in the OSU Department of Fisheries and Wildlife. “The problem is that people are starting to use them for long-term extrapolation. It is unreliable to apply uniform temperature impacts on a regional scale because there are so many micro-climate factors influencing streams on a local basis.”
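The kind of model at issue can be illustrated with a minimal sketch: a simple linear air-to-stream temperature regression fitted to historic data and then extrapolated. All data in the sketch are synthetic and for illustration only; the study's actual models and datasets are more involved.

```python
# Minimal sketch of an air-temperature-based stream model of the kind
# discussed above: fit a linear air-to-stream relationship on historic
# data, then extrapolate. All numbers here are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "historic" record: stream temperature loosely tracks air
# temperature, damped by shading, groundwater and other local factors.
air_hist = rng.uniform(2, 22, size=200)                        # deg C
stream_hist = 4.0 + 0.45 * air_hist + rng.normal(0, 1.0, 200)  # deg C

# Fit the statistical model: stream = intercept + slope * air
slope, intercept = np.polyfit(air_hist, stream_hist, 1)

# Extrapolate to a warmer future air temperature.
future_air = 26.0
predicted = intercept + slope * future_air
print(f"Fitted model: stream ~= {intercept:.1f} + {slope:.2f} * air")
print(f"Predicted stream temperature at {future_air} C air: {predicted:.1f} C")
# The study's caution: local controls (shading, groundwater, dams) can make
# the real response much smaller or larger than this linear extrapolation.
```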

Sherri Johnson, a U.S. Forest Service research ecologist and co-author on the study, said the findings are important because decisions based on these models may not be accurate. Some states, for example, have projected a major loss of suitable habitat for trout and other species because the models suggest increases in stream temperature commensurate with projected increases in air temperature.

“It just isn’t that simple,” Arismendi said. “Stream temperatures are influenced by riparian shading and in-stream habitat, like side channels. Dams can have an enormous influence, as can groundwater. It is a messy, complex challenge to project stream temperatures into the future.”

What made this study work, the authors say, was evaluating more than two dozen sites that had historic stream temperature data, which can be hard to find. The development about a dozen years ago of data loggers that can be deployed in streams is contributing enormous amounts of new data, but accurate historic records of stream temperatures are sparse.

Researchers at USGS and at sites like the H.J. Andrews Experimental Forest in Oregon, part of the National Science Foundation’s Long-Term Ecological Research program, have compiled stream data for up to 44 years, giving Arismendi and his colleagues enough historical data to conduct the comparative study.

In many of the 25 sites examined in the study, the researchers found that the difference between model-projected stream temperatures and actual stream temperatures was as great as the actual amount of warming projected – 3.0 degrees Celsius, or 5.5 degrees Fahrenheit. And in some cases, the projections were even farther off target.

“The models’ predictions were poor in summer and winter, and when there are extreme situations,” Arismendi noted. “They were developed to look at Midwest streams and don’t account for the complexity of western streams that are influenced by topography, extensive riparian areas and other factors.”

Increases in air temperatures in the future are still likely to influence stream temperatures, but climate sensitivity of streams “is more complex than what is being realized by using air temperature-based models,” said Mohammad Safeeq, an Oregon State University researcher and co-author on the study.

“The good news is that some of the draconian projections of future stream temperatures may be overstated,” noted Safeeq, who is in OSU’s College of Earth, Ocean, and Atmospheric Sciences. “On the other hand, some may actually be warmer than what air temperature-based models project.”

Not all streams will be affected equally, Johnson said.

“The one constant is that a healthy watershed will be more resilient to climate change than one that isn’t healthy – and that should continue to be the focus of restoration and management efforts,” she noted.

Jason Dunham, an aquatic ecologist with the USGS and co-author on the study, said the study highlights the value of long-term stream temperature records in the Northwest and globally.

“Without a long-term commitment to collecting this kind of data, we won’t have the ability to evaluate existing models as we did in this work,” Dunham said. “Long-term datasets provide vital material for developing better methods for quantifying the effects of climate on our water resources.”

Sources: Ivan Arismendi, 541-750-7443; Sherri Johnson, 541-758-7771

Task force outlines major initiatives to prepare for Pacific Northwest earthquake, tsunami

CORVALLIS, Ore. – A task force that studied implementation of the Oregon Resilience Plan today submitted to the Oregon legislature an ambitious program to save lives, mitigate damage and prepare for a massive subduction zone earthquake and tsunami looming in the future of the Pacific Northwest.

The recommendations of the Governor’s Task Force on Resilience Plan Implementation, if enacted, would result in spending more than $200 million every biennium in a long-term initiative.

The program would touch everyone from energy providers and utility companies to their customers, parents and school children, businesses, builders, land use regulators, transportation planners and fire responders. It would become one of the most aggressive efforts in the nation to prepare for a costly, life-threatening disaster that’s seen as both catastrophic and inevitable.

“We have a clear plan for what needs to be done, and now is the time to take our first significant steps forward,” said Scott Ashford, dean of the College of Engineering at Oregon State University, chair of the Governor’s Task Force, and an expert on liquefaction and earthquake engineering who has studied disasters all over the world, similar to those that Oregon will face.

“The scope of the disaster that the Pacific Northwest faces is daunting,” Ashford said. “And we won’t be able to accomplish everything we need to do in one or two years, but hopefully we won’t have to. What’s important is to get started, and the time for that is now.”

The task force making these recommendations included members of the Oregon legislature; advisers to Gov. Kitzhaber; private companies; the Oregon Office of Emergency Management; the Oregon Department of Transportation; the Oregon Health Authority; city, county and business leaders; the Red Cross; and others.

The Oregon Resilience Plan, which was completed in early 2013, outlines more than 140 recommendations to reduce risk and improve recovery from a massive earthquake and tsunami that’s anticipated on the Cascadia Subduction Zone, similar to the one that hit Fukushima, Japan, in 2011.

The newest analysis identified specific steps that are recommended for the 2015-17 biennium. They address not only earthquake damage, but also the special risks facing coastal residents from what is expected to be a major tsunami.

One of the largest single steps would be biennial funding of $200 million or more for the OBDD/IFA Seismic Rehabilitation Grant Program, with similar or higher levels of funding in the future. Funds could be used to rehabilitate existing public structures such as schools to improve their seismic safety; demolish unsafe structures; or replace facilities that must be moved out of a tsunami inundation zone.

It was recommended that additional revenue be identified to complete work within a decade on the most critical roads and bridges that form “backbone” transportation routes; that the state Department of Geology and Mineral Industries receive $20 million to update inventory and evaluate critical facilities; and that $5 million be made available through existing programs for tsunami resilience planning by coastal communities.

Utility companies regulated by the Oregon Public Utility Commission would also be required to conduct seismic assessments of their facilities, and be allowed through rate increases to recover their costs if they make prudent investments to mitigate vulnerabilities.

“When I studied areas that had been hard-hit by earthquakes in Chile, New Zealand and Japan, it became apparent that money spent to prepare for and minimize damage from the earthquake was hugely cost-effective,” Ashford said.

“One utility company in New Zealand said they saved about $10 for every $1 they had spent in retrofitting and rebuilding their infrastructure,” he said. “There’s a lot we can do right now that will make a difference and save money in the long run.”

Other key recommendations included:

  • Establish a resilience policy adviser to the governor;
  • Use the most recent tsunami hazard maps to redefine the inundation zone for construction;
  • Provide $1 million annually for scientific research by Oregon universities, to provide matching funds for earthquake research supported by the state, federal government or private industry;
  • Provide $500,000 to the Office of Emergency Management for educational programs and training aimed at managers, agencies, businesses and the general public;
  • Provide $500,000 to the Department of Education to lead a K-12 educational program;
  • Require water providers and wastewater agencies to complete a seismic risk assessment and mitigation plan, as part of periodic updates to master plans;
  • Require firefighting agencies, water providers and emergency management officials to create joint standards to use in a firefighting response to a large seismic event.

“Our next steps will include a lot of discussion, with the legislature, with business and community leaders, with the general public all over the state,” Ashford said. “The challenges we face are enormous but I really believe Oregonians are ready to take an important step toward resilience. This is our chance.”

Source: Scott Ashford, 541-737-5232

Multimedia Downloads

Sinking structures

Japan liquefaction

YouTube video of damage done in the Japanese earthquake is available online: http://bit.ly/ZYH35d

Anglers, beachcombers asked to watch for transponders from Japan

CORVALLIS, Ore. – Northwest anglers venturing out into the Pacific Ocean in pursuit of salmon and other fish this fall may scoop up something unusual into their nets – instruments released from Japan called “transponders.”

These floating instruments are about the size of a 2-liter soda bottle and were set in the ocean from different ports off Japan in 2011-12 after the massive Tohoku earthquake and tsunami. Researchers from Tottori University of Environmental Studies in Japan have been collaborating with Oregon State University, Oregon Sea Grant, and the NOAA Marine Debris Program on the project.

The researchers’ goal is to track the movement of debris via ocean currents and help determine the path and timing of the debris from the 2011 disaster. An estimated 1.5 million tons of debris was washed out to sea and it is expected to continue drifting ashore along the West Coast of the United States for several years, according to Sam Chan, a watershed health specialist with Oregon State University Extension and Oregon Sea Grant.

“These transponders only have a battery life of about 30 months and then they no longer communicate their location,” Chan said. “So the only way to find out where they end up is to physically find them and report their location. That’s why we need the help of fishermen, beachcombers and other coastal visitors.

“These bottles contain transmitters and they are not a hazardous device,” Chan added. “If you find something that looks like an orange soda bottle with a short antenna, we’d certainly like your help in turning it in.”

Persons who find a transponder are asked to photograph it if possible, and report the location of their find to Chan at Samuel.Chan@oregonstate.edu; or to the NOAA Marine Debris Program regional coordinator in their area at http://marinedebris.noaa.gov/contact-us. They will provide shipping instructions to persons who find the transponders so that the instruments can be returned to the research team.

One of the first transponders discovered in the Northwest washed ashore near Arch Cape, Oregon, in March 2013, about 19 months after it was set adrift. The persons who found it reported it to Chan, who began collaborating with researchers in Japan.

Another transponder was found near the Haida Heritage Site, formerly the Queen Charlotte Islands – the same location where a Harley-Davidson motorcycle floated up on a beach in a shipping container long after being swept out to sea in Japan by the tsunami.

“These transponders have recorded a lot of important data that will help us better understand the movement of tsunami and marine debris throughout the Pacific Ocean,” Chan said. “Everyone’s help in recovering these instruments is greatly appreciated.”

Source: Sam Chan, 541-737-4828; samuel.chan@oregonstate.edu

Mechanized human hands: System designed to improve hand function lost to nerve damage

CORVALLIS, Ore. – Engineers at Oregon State University have developed and successfully demonstrated the value of a simple pulley mechanism to improve hand function after surgery.

The device, tested in cadaver hands, is one of the first instruments ever created that could improve the transmission of mechanical forces and movement while implanted inside the body.

After continued research, technology such as this may offer new options to people who have lost the use of their hands due to nerve trauma, and ultimately be expanded to improve function of a wide range of damaged joints in the human body.

The findings were just reported in Hand, a professional journal, by researchers from OSU and the School of Medicine at the University of Washington. The research was supported by OSU.

“This technology is definitely going to work, and it will merge artificial mechanisms with biological hand function,” said Ravi Balasubramanian, an expert in robotics, biomechanics and human control systems, and assistant professor in the OSU College of Engineering.

“We’ll still need a few years to develop biocompatible materials, coatings to prevent fibrosis, make other needed advances and then test the systems in animals and humans,” Balasubramanian said. “But working at first with hands – and then later with other damaged joints such as knees or ankles – we will help people recover the function they’ve lost due to illness or injury.”

Initially, the OSU research will offer a significant improvement on the surgery now used to help restore the gripping capability of hands following nerve damage. That procedure, called tendon-transfer surgery for high median-ulnar palsy, essentially reattaches finger tendons to a muscle that still works. But hand function remains significantly impaired: gripping requires a large amount of force, tendons stretch, and the fingers all move at the same time instead of separately, as is often needed to grasp an object.

The new mechanism developed at OSU is not really robotic since it has no sensory, electronic or motor capabilities, Balasubramanian said. Rather, it’s a passive technology using a basic pulley that will be implanted within a person’s hand to allow more natural grasping function with less use of muscle energy.
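The general idea of a passive pulley spreading a single tendon pull across multiple fingers can be illustrated with a highly simplified sketch. The two-finger model, cable geometry and numbers below are assumptions made for illustration; they do not describe the actual OSU device.

```python
# Highly simplified model of a passive differential pulley spreading one
# tendon pull across two fingers. The geometry, numbers and finger limits
# here are illustrative assumptions, not the actual OSU design.

def pull_through_pulley(tendon_travel_cm, finger_limits_cm):
    """Split tendon travel between two fingers via a free movable pulley.

    The cable runs over the pulley, so both branches carry equal tension
    and the total cable taken up is twice the pulley travel. If one finger
    stops early (it has touched the object), the leftover travel goes to
    the other finger instead of being wasted.
    """
    total_cable = 2.0 * tendon_travel_cm
    share = [total_cable / 2.0, total_cable / 2.0]  # start with an even split
    for i in (0, 1):
        excess = share[i] - finger_limits_cm[i]
        if excess > 0:  # this finger is blocked; pass the slack to the other
            share[i] = finger_limits_cm[i]
            share[1 - i] = min(share[1 - i] + excess, finger_limits_cm[1 - i])
    return share

# Finger 1 touches the object after 1 cm of closure; finger 2 can close 3 cm.
print(pull_through_pulley(tendon_travel_cm=1.5, finger_limits_cm=[1.0, 3.0]))
# -> [1.0, 2.0]: the blocked finger stops while the free finger keeps closing.
```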

“Many people have lost the functional use of their hands due to nerve damage, sometimes from traumatic injury and at other times from stroke, paralysis or other disorders,” Balasubramanian said. “The impact can be devastating, since grasping is a fundamental aspect of our daily life. The surgery we’re focusing on, for instance, is commonly performed in the military on people who have been injured in combat.”

The new research showed, in cadavers, how the mechanism developed for this problem can produce more natural and adaptive flexion of the fingers in grasping. The force needed to close all four fingers around an object was reduced by 45 percent, and the improved grasp reduced slippage on the object by 52 percent.

Such progress can be an important step to improve function beyond the existing surgical procedure, by providing an alternative to the suture that has been its mainstay. The hand, experts say, is amazingly complex, with 35-38 muscles and 22 joints all working together, innervated by three nerves between the elbow and fingertip.

The long-term potential of such mechanized assistance is profound. In some cases, Balasubramanian said, it may indeed be possible to create joints or limbs that mechanically function as well or better than they did originally.

“There’s a lot we may be able to do,” he said. “Thousands of people now have knee replacements, for instance, but the knee is weaker after surgery. With mechanical assistance we may be able to strengthen and improve that joint.”

This work is part of a rapidly expanding robotics research and education program at OSU, in fields ranging from robotic underwater vehicles to prosthetic limbs, search and rescue missions and advanced manufacturing. New graduate degrees in robotics were just recently added at the university, one of the few institutions in the nation to have such graduate programs.

Multimedia Downloads

Mechanical hand

Pulley mechanism

Compound from hops aids cognitive function in young animals

CORVALLIS, Ore. – Xanthohumol, a type of flavonoid found in hops and beer, has been shown in a new study to improve cognitive function in young mice, but not in older animals.

The research was just published in Behavioural Brain Research by scientists from the Linus Pauling Institute and the College of Veterinary Medicine at Oregon State University. It’s another step toward understanding, and ultimately reducing, the degradation of memory that happens with age in many mammalian species, including humans.

Flavonoids are compounds found in plants that often give them their color. The study of them – whether in blueberries, dark chocolate or red wine – has increased in recent years due to their apparent nutritional benefits on issues ranging from cancer to inflammation and cardiovascular disease. Several have also been shown to be important in cognition.

Xanthohumol has been of particular interest because of possible value in treating metabolic syndrome, a condition associated with obesity, high blood pressure and other concerns, including age-related deficits in memory. The compound has been used successfully to lower body weight and blood sugar in a rat model of obesity.

The new research studied use of xanthohumol in high dosages, far beyond what could be obtained just by diet. At least in young animals, it appeared to enhance their ability to adapt to changes in the environment. This cognitive flexibility was tested with a special type of maze designed for that purpose.

“Our goal was to determine whether xanthohumol could affect a process we call palmitoylation, which is a normal biological process but in older animals may become harmful,” said Daniel Zamzow, a former OSU doctoral student and now a lecturer at the University of Wisconsin/Rock County.

“Xanthohumol can speed the metabolism, reduce fatty acids in the liver and, at least with young mice, appeared to improve their cognitive flexibility, or higher level thinking,” Zamzow said. “Unfortunately it did not reduce palmitoylation in older mice, or improve their learning or cognitive performance, at least in the amounts of the compound we gave them.”

Kathy Magnusson, a professor in the OSU Department of Biomedical Sciences, principal investigator with the Linus Pauling Institute and corresponding author on this study, said that xanthohumol continues to be of significant interest for its biological properties, as are many other flavonoids.

“This flavonoid and others may have a function in the optimal ability to form memories,” Magnusson said. “Part of what this study seems to be suggesting is that it’s important to begin early in life to gain the full benefits of healthy nutrition.”

It’s also important to note, Magnusson said, that the levels of xanthohumol used in this study were only possible with supplements. As a fairly rare micronutrient, the only normal dietary source of it would be through the hops used in making beer, and “a human would have to drink 2000 liters of beer a day to reach the xanthohumol levels we used in this research.”

In this and other work, Magnusson’s research has focused primarily on two subunits of the NMDA receptor, called GluN1 and GluN2B. Their decline with age appears to be related to a decreased ability to form and quickly recall memories.

In humans, many adults start to experience deficits in memory around the age of 50, and some aspects of cognition begin to decline around age 40, the researchers noted in their report.

This research was supported by the National Institutes of Health.

Source: Kathy Magnusson, 541-737-6923

Multimedia Downloads

Hops