OREGON STATE UNIVERSITY

Scientific Research and Advances

Nanofiber sutures promote production of infection-thwarting peptide

CORVALLIS, Ore. – Loading nanofiber sutures with vitamin D induces the production of an infection-fighting peptide, new research shows.

The discovery could represent an important advance in the prevention of surgical site infections, a multibillion-dollar challenge each year in the United States alone.

A collaboration that included Adrian Gombart of the Linus Pauling Institute at Oregon State University used coaxial electrospinning deposition and rolling to fabricate sutures that contained 25-hydroxyvitamin D3 and the Pam3CSK4 peptide.

A peptide is a compound consisting of two or more amino acids linked in a chain; Pam3CSK4’s function is to activate a cell’s toll-like receptor, which in turn triggers immune responses in which vitamin D plays a key role.

The research showed the sutures released 25D3 – the same form of the vitamin that’s measured in the blood when a patient’s vitamin D levels are tested – on a sustained basis over four weeks. The sutures released Pam3CSK4 in an initial burst followed by a prolonged four-week release.
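
That profile – an immediate burst plus a weeks-long tail – is commonly approximated with a simple biphasic release model. Here is a minimal sketch in Python, assuming a burst-plus-first-order form with made-up parameters rather than the study’s measured kinetics:

```python
import math

def biphasic_release(t_days, burst_frac=0.3, k_per_day=0.15):
    """Cumulative fraction of payload released by day t: an immediate
    burst plus first-order release of the remainder. Both parameters
    are illustrative assumptions, not the study's measured kinetics."""
    return burst_frac + (1.0 - burst_frac) * (1.0 - math.exp(-k_per_day * t_days))

for day in (0, 7, 14, 21, 28):
    print(f"day {day:2d}: {biphasic_release(day):.1%} released")
```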

“When the toll-like receptor is activated, you induce a particular enzyme to convert 25D3 to its bioactive form, known as 1,25-dihydroxyvitamin D3, which activates the vitamin D receptor,” Gombart said. “When activity increases, that increases expression of vitamin D receptor target genes, one of which produces the LL-37 peptide, which kills microbes by disrupting their membranes.

“The idea is, if you were to have an infection, the sutures would activate the toll-like receptors and start increasing production of 1,25D3 from the 25D3 that’s being released from sutures – so you get both local induction and an increase in the production of the antimicrobial peptide.”

The study’s corresponding author, Jingwei Xie of the University of Nebraska Medical Center, notes that the anti-infective sutures currently in use contain triclosan, an antibacterial and antifungal agent also found in a variety of consumer products.

“However, its frequent use has resulted in bacterial resistance,” Xie said. “Triclosan also shows a wide range of health risks including endocrine disruption, impaired muscle function, liver damage and the development of cancerous tumors. Compared to the currently available products and treatment options, the anti-infective sutures we developed could circumvent the selection for multidrug resistance and other health-associated shortcomings. The new sutures are also highly configurable and can deliver a variety of bioactive compounds to minimize infection risk, optimize healing and minimize scarring. None of the currently available sutures has this level of function.”

Gombart adds that the vitamin D delivered by the sutures could also affect other genes involved in the immune response, beyond the one that produces LL-37.

“So a compound like vitamin D not only targets bacteria via the antimicrobial peptide, but other immune responses can also be modulated to help combat infection,” he said. “Targeting multiple fronts helps minimize the chance of resistance.”

The University of Nebraska Medical Center, the National Institutes of Health, and the Otis Glebe Medical Research Foundation supported this research.

Also involved in the collaboration were researchers from the Joan C. Edwards School of Medicine at Marshall University in Huntington, West Virginia, and the Chongqing Academy of Animal Sciences & Key Laboratory of Pig Industry Sciences in Chongqing, China.

Findings were recently published in Nanomedicine.

Media Contact: 

Steve Lundeberg, 541-737-4039

Multimedia Downloads

nanofiber sutures

How the sutures work

Assessment shows metagenomics software has much room for improvement

CORVALLIS, Ore. – A recent critical assessment of software tools represents a key step toward taming the “Wild West” nature of the burgeoning field of metagenomics, said an Oregon State University mathematical biologist who took part in the research.

Metagenomics refers to the science of genetically studying whole communities of microorganisms, as opposed to sequencing single species grown in culture.

“Microbes are ridiculously important to life,” said David Koslicki, assistant professor of mathematics in the OSU College of Science. “They not only can cause terrible things to happen, like blight and disease, but in general, overwhelmingly, microbes are our friends. Without them doing their jobs, crops couldn’t grow as well, it would be hard to digest our food, we might not get sleepy at appropriate times. Microbes are so fundamental to life, to health, we really need to know as much as we can about them.”

Koslicki, a leader in a university-wide research and education program known as OMBI – the OSU Microbiome Initiative – described the findings, published recently in Nature Methods, as “sobering.”

“There are not a lot of well-established, well-characterized computational techniques and tools that biologists can use,” he said. “And the assessment showed that a lot of the tools being used do not do nearly as well as had been initially thought, so there’s definitely room for improvement there.

“That said, depending on the situation that a biologist is interested in, there are definitely different tools that have proven to be the best so far.”

Metagenomics is a relatively new field that developed quickly once next-generation sequencing grew inexpensive enough that looking at entire microbial communities became economically feasible, said Koslicki.

“The typical view of biology is a wet lab and everything like that, but a whole other facet has to do with these high-throughput ways of accessing genetic material,” he said. “You end up with a ton of data, and when you end up with a ton of data, you introduce a new problem: How do I get the important information out of it? You have to come up with an algorithm that allows biologists to answer the questions they find important: What critters are there, how many are there, what are they doing, are there any viruses? We need to answer those questions and not just answer them quickly but also have some sort of idea how accurate the answer is.”
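
The “what critters are there” question is the job of taxonomic profilers and binners. As a toy illustration of one common strategy – matching a read’s k-mers against indexed reference genomes – here is a minimal sketch in Python. The reference sequences and read are hypothetical, and real tools, including those CAMI evaluated, index millions of genomes far more efficiently.

```python
def kmers(seq, k=4):
    """All overlapping substrings of length k in seq."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Hypothetical reference genomes; real indexes hold millions of k-mers.
references = {
    "Species_A": "ATGGCGTACGTTAGCCGTATCGGA",
    "Species_B": "TTGACCGTAAGGCTAACGGTTCCA",
}
ref_index = {name: kmers(seq) for name, seq in references.items()}

def classify(read, index, k=4):
    """Assign a read to the reference sharing the most k-mers with it."""
    read_kmers = kmers(read, k)
    scores = {name: len(read_kmers & ref) for name, ref in index.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclassified"

print(classify("GCGTACGTTAGC", ref_index))  # -> Species_A
```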

The dizzying array of tools biologists are using to try to answer those questions is “kind of like the Wild West,” Koslicki said. “If you want to learn what bacteria are in a sample, there are no less than three or four dozen different tools people have come up with, and in a rather disjointed manner. You have teams of statisticians, mathematicians, biologists, microbiologists, engineers all looking at this from their own perspectives and coming up with their own tools. Then the end-user biologist comes along and is faced with 40 different tools, and how do they know how good they are at answering the questions they need answered?”

Koslicki’s research, known as the CAMI challenge – critical assessment of metagenome interpretation – was aimed at ranking those tools to provide a road map for biologists.

“The challenge engaged the global developer community to benchmark their programs on highly complex and realistic data sets, generated from roughly 700 newly sequenced microorganisms and about 600 novel viruses and plasmids and representing common experimental setups,” he said. “This was an independent initiative. Typically when tools are compared, it’s attached to the publication of a new method that’s compared against other tools that do worse, so the new method looks good. There hasn’t been a lot of independent research into which tools actually work, how well they work, what kinds of data they do well on, and so on.”

The UK Engineering and Physical Sciences Research Council, the U.S. Department of Energy, the Cluster of Excellence on Plant Sciences, the Australian Research Council, the European Research Council, the Agency for Science, Technology and Research Singapore, the Lundbeck Foundation, and the National Science Foundation supported this research.

Media Contact: 

Steve Lundeberg, 541-737-4039

Gamma-ray burst detection just what OSU researchers alone predicted

CORVALLIS, Ore. – More than a month before a game-changing detection of a short gamma-ray burst – a finding announced today – scientists at Oregon State University predicted such a discovery would occur.

Scientists from U.S. and European collaborations converged on the National Press Club in Washington, D.C., today to say they’ve detected an X-ray/gamma-ray flash that coincided with a burst of gravitational waves, followed by visible light from a new cosmic explosion called a kilonova.

Gravitational waves were first detected in September 2015, and that too was a red-letter event in physics and astronomy; it confirmed one of the main predictions of Albert Einstein’s 1915 general theory of relativity and earned a Nobel Prize for the scientists who discovered them.

“A simultaneous detection of gamma rays and gravitational waves from the same place in the sky is a major milestone in our understanding of the universe,” said Davide Lazzati, a theoretical astrophysicist in the OSU College of Science. “The gamma rays allow for a precise localization of where the gravitational waves are coming from, and the combined information from gravitational and electromagnetic radiation allows scientists to probe the binary neutron star system that’s responsible in unprecedented ways. We can tell things like which galaxy the waves come from, if there are other stars nearby, and whether or not the gravitational waves are followed by visible radiation after a few hours or days.”

Collaborators from the Laser Interferometer Gravitational-Wave Observatory, known as LIGO, and the European Gravitational Observatory’s Virgo team on Aug. 17, 2017, detected gravitational waves – ripples in the fabric of space-time – produced by the coalescence of two neutron stars.

Roughly two seconds later, NASA’s Fermi Gamma-ray Space Telescope detected a short flash of X- and gamma rays from the same location in the sky.

“The Fermi transient is more than 1,000 times weaker than a ‘normal’ short gamma-ray burst and has the characteristics that we predicted,” Lazzati said. “No other prediction of such flashes had been made. Just by pen and paper almost, we could say hey, we might see the bursts, even if they’re not in a configuration that makes them obvious.”

On July 6, Lazzati’s team of theorists had published a paper predicting that, contrary to earlier estimates by the astrophysics community, short gamma-ray bursts associated with the gravitational emission of binary neutron star coalescence could be detected – whether or not the gamma-ray burst was pointing at Earth.

The paper appeared in the journal Monthly Notices of the Royal Astronomical Society.

“X- and gamma rays are collimated, like the light of a lighthouse, and can be easily detected only if the beam points toward Earth,” Lazzati said. “Gravitational waves, on the other hand, are almost isotropic and can always be detected. We argued that the interaction of the short gamma-ray burst jet with its surroundings creates a secondary source of emission called the cocoon. The cocoon is much weaker than the main beam and is undetectable if the main beam points toward our instruments. However, it could be detected for nearby bursts whose beam points away from us.”
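
Lazzati’s lighthouse analogy can be made quantitative with a standard beaming estimate (a textbook calculation, not a figure from the paper). For a two-sided jet with half-opening angle θ_j, the fraction of randomly oriented bursts whose beam sweeps across Earth is

```latex
P_{\mathrm{beamed}} = 1 - \cos\theta_j \approx \frac{\theta_j^{2}}{2},
\qquad \theta_j = 10^{\circ} \approx 0.17\ \mathrm{rad}
\;\Rightarrow\; P_{\mathrm{beamed}} \approx 1.5\%
```

so only a small minority of such bursts point our way, while the nearly isotropic gravitational waves carry no such penalty – which is why off-axis emission from the cocoon matters for joint detections.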

Since the first gravitational wave discovery, there have been three more confirmed detections, including the one from August that was jointly seen by scientists from the LIGO and Virgo groups.

“All observations until the last one were from the coalescence of binary black hole systems,” Lazzati said. “While these systems are interesting, they are dark in any other form of radiation and relatively little can be understood from them compared to binary neutron star systems.

“It’s a really lucky set of circumstances for a theorist, where you have a working theory to use to make predictions and new instruments such as LIGO and Virgo coming online to test them,” Lazzati said. “Scientists don’t make predictions because we want to be right – we make predictions because we want to test them. Even if we’re wrong, we’re still learning something – but it’s much more exciting to be right.”

The term neutron star refers to the gravitationally collapsed core of a large star; neutron stars are the smallest, densest stars known. According to NASA, neutron stars’ matter is packed so tightly that a sugar-cube-sized amount of it weighs in excess of a billion tons.
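
The sugar-cube figure follows directly from the density. Taking a representative neutron-star core density of about 10^18 kg/m³ (an order-of-magnitude textbook value, not a number from this research):

```latex
m = \rho V \approx 10^{18}\ \mathrm{kg/m^{3}} \times 10^{-6}\ \mathrm{m^{3}}
  = 10^{12}\ \mathrm{kg} \approx 10^{9}\ \mathrm{metric\ tons}
```

or roughly a billion tons per cubic centimeter.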

Media Contact: 

Steve Lundeberg, 541-737-4039

Multimedia Downloads

Radiation jet

GRB computer simulation

New book ‘Singlewide’ explores the role of the American trailer park as affordable housing

CORVALLIS, Ore. – America’s rural trailer parks offer the promise of the American home ownership dream but often fail to deliver on it, as residents get caught in the trap of rising rents for home sites and depreciating home values, a new book on rural trailer park life concludes.

“Singlewide: Chasing the American Dream in a Rural Trailer Park,” by Oregon State University’s Katherine MacTavish and Sonya Salamon of the University of Illinois at Urbana-Champaign, explores the trailer park’s role as affordable rural housing and a path to home ownership.

“All of the people we interviewed saw their purchase of a mobile home as progress toward the American dream, but that just doesn’t happen,” said MacTavish, an associate professor of human development and family science in OSU’s College of Public Health and Human Sciences. “Owning your own home but having it parked on someone else’s land doesn’t net the same benefits for families as conventional home ownership would.”

MacTavish, whose academic research focuses on how the places where children grow up shape their life experiences and their futures, spent more than 18 months immersing herself in the culture of rural trailer park communities in Illinois and New Mexico. Another researcher conducted similar fieldwork at a trailer park in North Carolina.

“We spent a lot of time in people’s homes, talking with them, sharing a family meal, hanging out with the kids at the church youth group and visiting children in their classrooms,” MacTavish said. “We also interviewed a range of stakeholders, including park managers, school leaders and elected officials.”

The book is the culmination of that work, offering an in-depth assessment of the role of trailer parks in meeting affordable rural housing needs and helping people move up the economic ladder.

The researchers found that mobile homes depreciate rapidly, like vehicles do, making it hard for families to build equity; sales of homes located in parks can be hampered by landlord rules; and lot rent or lease costs continue to rise, making it difficult for families to save money or to move out of parks.

They also found that an interrelated system involving the manufacture, sale and financing of mobile homes, along with investor ownership of land-lease parks – a system they termed the “mobile home industrial complex” – undergirds the struggles of rural homeowners of modest means.

“This system, in which a number of players earn substantial profits, leaves families struggling to gain the benefits they anticipated from buying a home,” MacTavish said.

The researchers interviewed people in more than 240 trailer park households and followed 39 of those families closely. They found just a handful who were financially able to move from the trailer park to a conventional home or a mobile home on land they owned – the American dream of many in the parks.

But they also found little truth to the concept of “trailer trash” – a moniker applied almost exclusively to white families living in trailer parks but not to African American or Hispanic families living there. Within the parks, the researchers found parents working hard to move their children out of poverty and attain a higher social class.

“These were not neighborhoods riddled with crime, noise and disarray,” MacTavish said. “They were mainly people who worked full-time for not great wages and not great benefits, just trying to get by and improve the lives of their children.”

Overall, MacTavish and Salamon found that trailer park residents see themselves as doing the best they can for their families, despite the financial and social pitfalls they may face.

“They are just trying to give their kids a chance at a more stable and secure life than they had, and they are optimistic that they can manage that,” MacTavish said. 

“Singlewide,” from Cornell University Press, publishes Oct. 15 and is available for purchase online at www.cornellpress.cornell.edu/ and from a wide range of booksellers.

Source: 

Kate MacTavish, 541-737-9130, kate.mactavish@oregonstate.edu

Multimedia Downloads

Kate MacTavish

Cover of "Singlewide"

Oregon State to host grid energy storage symposium

CORVALLIS, Ore. – Leaders in energy storage technology will converge on the Oregon State University campus Nov. 5-6 for a symposium to discuss opportunities and challenges for next-generation, large-scale grid energy storage systems in the Pacific Northwest and nationwide.

The meeting, expected to draw 100 to 150 participants, is intended to serve as a forum for industry representatives, utility companies, academic and government researchers, and policymakers to discuss energy storage and potential major applications in the region.

“This meeting brings together the thought leaders who are driving the implementation of novel energy storage systems for the grid, wave power, and other sustainable energy technologies,” said conference chair Zhenxing Feng, assistant professor of chemical engineering in OSU’s College of Engineering. “These are the enabling technologies that can make the dream of 100 percent renewable energy a reality.”

The symposium is being organized by Oregon State with assistance from the Joint Center for Energy Storage Research, a public/private partnership established by the U.S. Department of Energy in 2012. Topics for discussion include the status of current battery technology, challenges and opportunities in the emerging sectors of transportation and the energy grid, energy resilience in the electrical grid, special needs in Oregon, and commercialization and manufacturing opportunities throughout the region.

Invited presenters include researchers from Argonne National Laboratory, Pacific Northwest National Laboratory and Idaho National Laboratory, as well as representatives from industry, such as Lebanon, Oregon-based Entek International LLC.

The agenda includes keynote speakers, panel discussions, breakout sessions and a poster session networking event. Also planned are tours to a local utility company and Oregon State’s state-of-the-art facility for energy storage and materials characterization research.

More information and registration are available online at cbee.oregonstate.edu/energy-storage-symposium. 

Media Contact: 

Keith Hautala, 541-737-1478

Multimedia Downloads

Ocean Sentinel

Testing wave energy

‘Transformative’ research unrealistic to predict, scientists tell granting agencies

CORVALLIS, Ore. – Research-funding agencies that require scientists to declare at the proposal stage how their projects will be “transformative” may actually be hindering discovery, according to a study by Oregon State University ecologists.

The requirement can result in decreased funding for the “incremental” research that often paves the way for paradigm-shifting breakthroughs, the OSU scientists assert.

Their findings, as well as their recommendation for how to best foster transformative research, were published recently in Trends in Ecology and Evolution.

Sarah Gravem, postdoctoral scholar in integrative biology in Oregon State’s College of Science, was the lead author on the paper, titled “Transformative Research is Not Easily Predicted.”

Gravem, integrative biology professor Bruce Menge and the other collaborators note that the National Science Foundation, which funds roughly one-quarter of the federally supported research at U.S. colleges and universities, “has made the pursuit of transformative research a top priority by asking for a transformative research statement in every major research proposal solicited.”

The NSF defines transformative research as being “driven by ideas that have the potential to radically change our understanding of an important existing scientific or engineering concept or leading to the creation of a new paradigm … . Such research is also characterized by its challenge to current understanding or its pathway to new frontiers.”

Gravem says asking scientists to attempt to create new paradigms or fields in every proposal is unrealistic and potentially harmful.

The OSU scientists argue that a better approach, and one that was suggested more than a decade ago by the board that oversees the National Science Foundation, would be to create a funding subset: a separate NSF-wide program to solicit and support transformational research proposals.

“The board had been concerned that the U.S. was lagging behind other countries in scientific advances, concerned that creative and risky research was not getting funding,” Menge said. “It concluded that what the NSF should do is set aside some funds for risky research proposals, those defined by reviewers as they may or may not work, the chances are sort of slim, but they could turn out to be pretty cool.”

What the NSF did instead, Menge said, was require all proposals to show how the research being proposed would be transformative.

“Instructions to reviewers include the expectation that the reviewer will comment on how transformative the proposed research is,” Menge added.

The problem, the Oregon State collaborators say, is that it’s rarely possible to know at the proposal stage whether a project will turn out to be transformative; their assertion follows interviews and surveys of 78 highly cited ecologists who began with incremental goals and only later realized the transformative potential of their work.

“To start out with that transformative question is a backward way of thinking,” Gravem said. “Surely you have to think big to come up with big answers, and everyone is striving for that, but truly transformative research is an unobtainable standard to place on people at the proposal stage. Trying to make every project paradigm shifting can mean ignoring the incremental and basic science that eventually goes into shifting paradigms. It’s a detriment to ignore the building blocks in favor of the building.”

Gravem said the necessity of incremental research was also explained recently on Freakonomics Radio.

“Economist Ed Glaeser noted that Nobel Prizes are not typically given for single transformative research papers but are often given for a body of incremental research,” she said. “If transformations arise from incremental research, then the transformative criterion is redundant with the solicitation of incremental research. This is reflected by mixed evidence that soliciting transformative research led to increases in transformative outcomes compared with the typical model.”

Expanding fields of knowledge, adding to bodies of evidence, and comparing two fields that haven’t been compared before are the types of gains researchers can reasonably predict, Gravem added. Being asked to forecast how a project will turn out to be transformative puts “researchers in an awkward position that nobody likes.”

“We’re being forced to hype our work at the beginning of a proposal, which doesn’t do anything to help science or to help build trust in science,” Gravem said. “And it turns the funding process into an essay competition that favors people who take more liberty in predicting what their research might show.”

Menge notes that NSF’s plan all along was to reassess the transformative research statement requirement at some point, “and now is the time.”

“Research funding is effectively decreasing, but the demand for funding is increasing, so they look for ways to prune the field of who gets funded – I recognize that as a problem,” he said. “But making artificial hurdles is just wrong. Funding agencies should concentrate on the goals of the research rather than the unknowable outcome.” 

Source: 

Bruce Menge, mengeb@oregonstate.edu; Sarah Gravem, sgravem@gmail.com

New blue pigment discovered at Oregon State earns EPA approval

CORVALLIS, Ore. – The vibrant YInMn blue pigment discovered at Oregon State University has been approved for commercial sale by the Environmental Protection Agency.

The Shepherd Color Co., which licensed the pigment from OSU, announced that the EPA has granted the company a “low volume exemption” that paves the way for the pigment, commercially known as Blue 10G513, to be used in industrial coatings and plastics.

YInMn refers to the elements yttrium, indium and manganese, which, along with oxygen, make up the pigment. It features a unique chemical structure that allows the manganese ions to absorb red and green wavelengths of light while reflecting only blue.

The pigment, created in OSU’s College of Science, has sparked worldwide interest, including from crayon maker Crayola, which used the color as the inspiration for its new Bluetiful crayon.

The pigment is so durable, and its compounds are so stable – even in oil and water – that the color does not fade. Those characteristics make the pigment versatile for a variety of commercial products; used in paints, for example, it can help keep buildings cool by reflecting the infrared part of sunlight.

The EPA approval announced this week does not include making the pigment available for artists’ color materials, but Shepherd is in the process of seeking approval for its use in all applications and is confident that will happen, company spokesman Mark Ryan said.

YInMn blue was discovered by accident in 2009 when OSU chemist Mas Subramanian and his team were experimenting with new materials that could be used in electronics applications.

The researchers mixed manganese oxide – which is black in color – with other chemicals and heated them in a furnace to nearly 2,000 degrees Fahrenheit. One of their samples turned out to be a vivid blue. Oregon State graduate student Andrew Smith initially made these samples to study their electrical properties.

“This was a serendipitous discovery, a happy accident,” said Subramanian, the Milton Harris Chair of Materials Science at OSU. “But in fact, many breakthrough discoveries in science happen when one is not looking for it. As Louis Pasteur famously said, ‘In the fields of observation, chance favors only the prepared mind.’

“Most pigments are discovered by chance,” Subramanian added. “The reason is that the origin of the color of a material depends not only on the chemical composition, but also on the intricate arrangement of atoms in the crystal structure. So someone has to make the material first, then study its crystal structure thoroughly to explain the color.”

Subramanian notes that blue is associated with open spaces, freedom, intuition, imagination, expansiveness, inspiration and sensitivity.

“Blue also represents meanings of depth, trust, loyalty, sincerity, wisdom, confidence, stability, faith, heaven and intelligence,” he said. “Through much of human history, civilizations around the world have sought inorganic compounds that could be used to paint things blue but often had limited success. Most had environmental and/or durability issues. The YInMn blue pigment is very stable and durable. There is no change in the color when exposed to high temperatures, water, and mildly acidic and alkali conditions.”

Media Contact: 

Steve Lundeberg, 541-737-4039

Multimedia Downloads

Blue pigment

YInMn blue

Climate change, population growth may lead to open ocean aquaculture

CORVALLIS, Ore. – A new analysis suggests that open-ocean aquaculture for three species of finfish is a viable option for industry expansion under most climate change scenarios – an option that may provide a new source of protein for the world’s growing population.

This modeling study found that the warming of near-shore surface waters would shift the range of many species toward the higher latitudes – where they would have better growth rates – but even in areas that will be significantly warmer, open-ocean aquaculture could survive because of adaptation techniques including selective breeding.

Results of the study are being published this week in the Proceedings of the Royal Society B.

“Open-ocean aquaculture is still a young and mostly unregulated industry that isn’t necessarily environmentally sound, but aquaculture also is the fastest growing food sector globally,” said James Watson, an Oregon State University environmental scientist and co-author on the study. “One important step before developing such an industry is to assess whether such operations will succeed under warming conditions.

“In general, all three species we assessed – which represent species in different thermal regions globally – would respond favorably to climate change.”

Aquaculture provides a primary protein source for approximately one billion people worldwide and is projected to become even more important in the future, the authors say. However, land-based operations, as well as those in bays and estuaries, have limited expansion potential because of the lack of available water or space.

Open-ocean aquaculture operations, despite the name, are usually located within several miles of land – near enough to market to reduce costs, but far enough out to have clean water and less competition for space. However, aquaculture managers have less control over currents, water temperature, and waves.

To assess the possible range for aquaculture, the researchers looked at three species of fish – Atlantic salmon (Salmo salar), which grows fastest in sub-polar and temperate waters; gilthead seabream (Sparus aurata), found in temperate and sub-tropical waters; and cobia (Rachycentron canadum), which is in sub-tropical and tropical waters.

“We found that all three species would shift farther away from the tropics, which most models say will heat more than other regions,” said Dane Klinger, a former postdoctoral researcher at Princeton University and lead author of the study. “Production of Atlantic salmon, for example, could expand well into the higher latitudes, and though the trailing edge of their range may face difficulties, adaptation techniques can offset those difficulties.

“Further, in most areas where these species are currently farmed, growth rates are likely to increase as temperatures rise.”

Open-ocean aquaculture is not without risk, the researchers acknowledge. The recent escape of farmed Atlantic salmon in Washington’s Puget Sound alarmed fisheries managers, who worry that the species may breed with wild Chinook or coho salmon that are found in the Pacific Northwest. Introduced species and populations also have the potential to introduce disease to native species. “A key unresolved question is how large the industry and individual farms can become before they begin to negatively impact surrounding ecosystems,” Klinger said.

The authors say their modeling study was designed to assess the potential growth rates and potential range for the three fish species, based on climate warming scenarios of 2-5 degrees Celsius (or 3.6 to 9 degrees Fahrenheit).
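
Growth models of this kind typically rest on a thermal performance curve: growth peaks at a species’ optimal temperature and falls off toward its thermal limits, so warming improves cool, high-latitude sites while hurting sites already near the optimum. A minimal sketch in Python, assuming a simple quadratic dome with made-up limits rather than the study’s fitted model:

```python
def growth_rate(temp_c, t_min=4.0, t_opt=13.0, t_max=20.0):
    """Relative growth on a dome-shaped thermal performance curve:
    1.0 at the optimum t_opt, falling to 0 at the thermal limits.
    The limits here are illustrative, not the study's fitted values."""
    if temp_c <= t_min or temp_c >= t_max:
        return 0.0
    half_width = (t_opt - t_min) if temp_c <= t_opt else (t_max - t_opt)
    return 1.0 - ((temp_c - t_opt) / half_width) ** 2

# A cool, high-latitude site gains under a +3 C scenario;
# a site already at the optimum declines.
for today in (8.0, 13.0):
    print(f"{today:.0f} C: {growth_rate(today):.2f} -> +3 C: {growth_rate(today + 3.0):.2f}")
```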

The study also found:

  • Seabream will have the greatest potential for open-ocean farming in terms of area, but the fish will grow at a slower rate than salmon or cobia;
  • Cobia has the second largest potential area for growth, just ahead of salmon;
  • For all species, depth of water is the greatest constraint to development, followed by suitable currents;
  • Other factors dictating success include environment, economics (feed, fuel and labor), regulations and politics, ecology (disease, predators, and harmful algal blooms), and social norms.

“Offshore aquaculture will continue to be a small segment of the industry in the near-term, but there is only so much you can do on land and there are not enough wild fish to feed the world’s population,” Watson said. “Assessing the potential is the first step toward reducing some of the uncertainties for the future.”

Watson, who is on the faculty of OSU’s College of Earth, Ocean, and Atmospheric Sciences, did his research while at Princeton University.

Source: 

James Watson, 541-737-2519, jrwatson@coas.oregonstate.edu;

Dane Klinger, dhklinger@stanford.edu

Multimedia Downloads

aquaculture

Study: Sunlight and the right microbes convert Arctic carbon into carbon dioxide

CORVALLIS, Ore. – Nearly half of the organic carbon stored in soil around the world is contained in Arctic permafrost, which has experienced rapid melting, and that organic material could be converted to greenhouse gases that would exacerbate global warming.

When permafrost thaws, microbial consumption of those carbon reserves produces carbon dioxide, much of which eventually winds up in the atmosphere – but scientists have been unsure of just how the system works.

A new study published this week in Nature Communications outlines the mechanisms and points to the importance of both sunlight and the right microbial community as keys to converting permafrost carbon to CO2. The research was supported by the U.S. National Science Foundation and the Department of Energy.

“We’ve long known that microbes convert the carbon into CO2, but previous attempts to replicate the Arctic system in laboratory settings have failed,” noted Byron Crump, an Oregon State University biogeochemist and co-author on the study. “As it turns out, that is because the laboratory experiments did not include a very important element – sunlight.

“When the permafrost melts and stored carbon is released into streams and lakes in the Arctic, it gets exposed to sunlight, which enhances decay by some microbial communities and destroys the activity of others. Different microbes react differently, but there are hundreds, even thousands of different microbes out there, and it turns out that the microbes in soils are well-equipped to eat sunlight-exposed permafrost carbon.”

Using high-resolution chemistry and genetic approaches, the research team from Oregon State and the University of Michigan was able to identify the compounds the microbes prefer. They found that sunlight makes permafrost carbon tastier for microbes because it converts that carbon into the same kinds they already like to eat – the kinds they are adapted to metabolize.

“The carbon we’re talking about moves from the soil into rivers and lakes, where it is completely exposed to sunlight,” Crump said. “There are no trees and no shade, and in the summer, there are 24 hours a day of sunlight. That makes sunlight potentially more important in converting carbon into CO2 in the Arctic than in a tropical forest, for example.”

As the climate continues to warm, there are interesting ramifications for the Arctic, said Crump, who is a faculty member in OSU’s College of Earth, Ocean, and Atmospheric Sciences.

“The long-term forecast for the Arctic tundra ecosystem is for the warming to lead to shrubs and bigger plants replacing the tundra, which will provide shade from the sunlight,” Crump said. “That is considered a negative feedback. But there also is a positive feedback, in that seasons are projected to expand. Spring will arrive earlier, and fall will be later, and more water and carbon will enter lakes and streams with more rapid degradation of carbon.

“Which feedback will be stronger? No one can say for sure.”

The stakes are high, Crump said. There is more carbon stored in the frozen permafrost than in the atmosphere. It has accumulated over millions of years as plants grew and died, decaying only very slowly because of the freezing conditions.

“Some of the organic matter is less tasty to microbes than others,” Crump said, “but bacterial communities are diverse, so there will be something out there that wants that energy and will use it.”

Source: 

Byron Crump, 541-737-4369, bcrump@coas.oregonstate.edu

Multimedia Downloads

When Arctic permafrost melts, the carbon it carries seeps into streams and lakes, where exposure to sunlight starts the process of converting it to carbon dioxide.

Study shows high cost of truckers not having enough places to park and rest

CORVALLIS, Ore. – A pilot study by Oregon State University illustrates the high economic cost of having too few safe places for commercial truck drivers to park and rest.

Over a seven-year period on one 290-mile stretch of highway alone, at-fault truck crashes resulted in approximately $75 million of “crash harm,” research conducted by the OSU College of Engineering for the Oregon Department of Transportation shows.
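
For scale, simple division of those reported totals (not a statistic computed by the study) puts the toll at about

```latex
\frac{\$75\ \mathrm{million}}{7\ \mathrm{years} \times 290\ \mathrm{miles}}
\approx \$37{,}000\ \mathrm{per\ highway\ mile\ per\ year}
```

in crash harm along that corridor.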

“Current crash data collection forms don’t have an explicit section for truck-parking-related crashes, but we can operate under the assumption that specific types of at-fault truck crashes, such as those due to fatigue, may be the result of inadequate parking,” said the study’s lead author, Salvador Hernandez, a transportation safety and logistics researcher at Oregon State.

Hernandez and graduate research assistant Jason Anderson analyzed Oregon’s portion of U.S. Highway 97, which runs the entire north-south distance of the state along the eastern slope of the Cascade Range.

Highway 97 was chosen, Hernandez said, because the idea for the study originated from ODOT’s office in Bend, which is near the highway’s Oregon midpoint. An impetus for the research was the 2012 passage of “Jason’s Law,” which prioritized federal funding to address a national shortage of truck parking.

Jason’s Law is named for truck driver Jason Rivenburg, who was robbed and fatally shot in South Carolina in 2009 after pulling off to rest at an abandoned gas station.

For “property-carrying drivers,” as opposed to bus operators, federal rules require drivers to get off the road after 11 hours and to park and rest for at least 10 hours before driving again.

“Around the country, commercial drivers are often unable to find safe and adequate parking to meet hours-of-service regulations,” Hernandez said. “This holds true in Oregon, where rest areas and truck stops in high-use corridors have a demand for truck parking that exceeds capacity. That means an inherent safety concern for all highway users, primarily due to trucks parking in undesignated areas or drivers exceeding the rules to find a place to park.”

Researchers looked at what other states were doing in response to the parking issue, surveyed more than 200 truck drivers, assessed current and future parking demand on Highway 97, and used historical crash data to identify trends and hot spots and to estimate crash harm.

“Crash trends in terms of time of day, day of the week, and month of the year follow the time periods drivers stated having trouble finding places to park,” Hernandez said. “In Oregon, if we do nothing to address the problem and freight-related traffic continues to grow, we’ll face greater truck parking shortages. A possible solution is finding ways to promote public-private partnerships, the state working together with industry.”

A solution is not, Hernandez said, simply waiting for the day autonomous vehicles take over the hauling of freight as some predict.

“There are many issues yet to be worked out with autonomous commercial motor vehicles,” he said, “and even if autonomous commercial motor vehicles become commonplace, we’re still going to need truck drivers in some capacity. For now and in the foreseeable future, we need truck drivers and safe and adequate places for the drivers to park and rest.” 

Media Contact: 

Steve Lundeberg, 541-737-4039