OREGON STATE UNIVERSITY

Scientific Research and Advances

OSU ember research: Smaller branches pack the fastest, biggest fire-spreading punch

CORVALLIS, Ore. – As the West tallies the damages from the 2017 wildfire season, researchers at Oregon State University are trying to learn more about how embers form and about the blaze-starting potential they carry.

Preliminary findings indicate that the diameter of a burning branch is the single biggest factor in how quickly it forms embers and how much energy those embers carry.

“Increased population in the wildland-urban interface means increased risk to life and property from wildland fires,” said Tyler Hudson, a graduate student in the College of Engineering. “Spot fires started by embers lofted ahead of the main fire front are difficult to predict and can jump defensible space around structures.”

Research shows smaller-diameter branches are better at producing embers, also known as firebrands.

“Embers are wildfires’ most challenging mode of causing spread,” said David Blunck, assistant professor of mechanical engineering. “By understanding how embers form and travel through the air, scientists can more accurately predict how fire will spread. We have a multiscale approach that involves burning samples in a laboratory setting, larger burns – burning 10-foot-tall trees – and then working with the U.S. Forest Service to participate in prescribed burns.”

In his lab, Blunck’s research group controls multiple parameters that can influence ember generation rates: fire intensity, crosswind velocity, tree species, sample diameter, fuel condition (natural vs. processed) and fuel moisture content.

“Fire intensity had little effect on the time needed for ember generation,” Hudson said. “And natural samples and dowels with similar diameters can have quite different ember generation times.”

Using samples of Douglas fir, western juniper, ponderosa pine and white oak with diameters of 2 and 6 millimeters, the researchers determined that 2-millimeter samples generated embers roughly five times as fast as 6-millimeter samples.

This trend can be explained by the fact that bending stress is proportional to the inverse cube of the diameter: the larger the diameter, the lower the bending stress and the less likely a burning branch is to break off and create an ember. Smaller-diameter branches also contain less fuel that must burn away before they fail.
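
The scaling follows from elementary beam theory (a textbook derivation offered as background, not one reproduced from the study): for a circular branch of diameter d under a bending moment M, the peak stress at the surface is

\[
\sigma_{\max} = \frac{M\,(d/2)}{\pi d^{4}/64} = \frac{32M}{\pi d^{3}} \propto \frac{1}{d^{3}},
\]

so tripling the diameter from 2 to 6 millimeters cuts the peak bending stress by a factor of 27, leaving thicker branches far less prone to snapping off as embers.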

In the field, researchers can track embers’ energy “from the time they leave the tree until they get to their destination,” Hudson said, using techniques ranging from infrared videography to measuring scorch marks on squares of fire-resistant fabric placed on the ground at varying distances from the fire. 

Blunck, Hudson and fellow mechanical engineering graduate student Mick Carter presented their preliminary findings in April at the 10th edition of the biennial U.S. National Combustion Meeting in College Park, Maryland.

In August, Blunck was among a group of collaborators receiving a $500,000 grant from the National Institute of Standards and Technology “for the development of a computer model that will define patterns for firebrand distribution during wildland-urban interface fires and their likelihood of igniting nearby structures.”

This past fire season in Oregon, roughly 2,000 fires combined to burn more than a half-million acres. At 640 acres per square mile, that’s about 1,000 square miles – an area roughly the size of Rhode Island.

One of the most devastating of those blazes was the Eagle Creek fire in the Columbia River Gorge, which scorched nearly 50,000 acres and threatened the historic Multnomah Falls Lodge – and provided a terrifying illustration of what embers can do.

“The fire jumped the river and started burning in Washington because of embers,” Blunck said. “We estimate that the fire jumped 2 miles across the river.”

Media Contact: 

Steve Lundeberg, 541-737-4039


NASA looks for citizen scientists to collect snowpack depth measurements

CORVALLIS, Ore. – The National Aeronautics and Space Administration is looking for snowshoers, backcountry skiers and snow-machine users in the Pacific Northwest to gather data to use in computer modeling for snow-water equivalent, or SWE.

SWE refers to how much water a particular amount of snow contains, information that’s important to scientists, engineers, and land and watershed managers.
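
As a rough illustration of the quantity (the function below and its density figures are assumptions for this example, not the project’s code), SWE is simply snow depth scaled by the snowpack’s bulk density relative to water:

```python
def swe_mm(depth_cm: float, snow_density_kg_m3: float = 250.0) -> float:
    """Convert a probe depth reading to snow-water equivalent (SWE).

    SWE = depth x (bulk snow density / density of water).
    The default 250 kg/m^3 is an illustrative mid-season value;
    real snowpacks range from roughly 50 kg/m^3 (fresh powder)
    to 500 kg/m^3 (spring snow) as they settle.
    """
    water_density_kg_m3 = 1000.0
    depth_mm = depth_cm * 10.0
    return depth_mm * (snow_density_kg_m3 / water_density_kg_m3)

# Example: a 150 cm avalanche-probe reading
print(swe_mm(150))  # -> 375.0 mm of water
```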

NASA is funding a four-year project that involves an Oregon State University civil engineering professor, David Hill, and Ph.D. student, Ryan Crumley, as well as researchers at the University of Washington and the Alaska Division of Geological & Geophysical Surveys.

The project is called Community Snow Observations and is part of NASA’s Citizen Science for Earth Systems program.

“Our initial model runs show that the citizen science measurements are doing an amazing job of improving our simulations,” said David Hill of the OSU College of Engineering. “NASA has an unbelievable number of satellite assets in the sky producing incredible information about what’s going on in the earth’s systems, and they’re leveraging information and expertise from the public to make their product even better.”

Getting involved in Community Snow Observations is easy. A smartphone, the free Mountain Hub application, and an avalanche probe with graduated markings in centimeters are the only tools a recreationist needs.

As citizen scientists make their way through the mountains, they use their avalanche probes to take snow depth readings that they then upload into Mountain Hub, a fully featured app for outdoor users.

That’s all there is to it.

“Traditionally, the types of models we run have relied on ‘point’ measurements, such as snow telemetry stations,” Hill said. “Citizen scientists who are traveling in backcountry snow environments can provide us with much more data than those stations provide.”

Community Snow Observations kicked off in February 2017. Led by Hill, Gabe Wolken of the University of Alaska Fairbanks and Anthony Arendt of the University of Washington, the project has so far focused primarily on Alaskan snowpacks. Researchers are now looking to recruit citizen scientists in the Pacific Northwest and, if possible, the Rocky Mountain region as well.

Alaska Fairbanks has spearheaded the public involvement aspect of the project, while the UW’s chief role is managing the data. Hill and Crumley are responsible for the modeling.

Which particular geographic areas get modeled “is kind of up to the public,” Hill said, adding that the more the data are spread out over time and space, the better.

“The models take into account the temporal densification of the snowpack and the spatial variability in snow-water equivalent and how snow properties are always changing, even in a given location,” he said. “If we get a whole bunch of measurements on one day in one spot, that has value, but the more we can get things stretched out, the more coverage we get, the better modeling products we can produce.”
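
A minimal sketch of the densification Hill describes, assuming a simple exponential settling law (the model form and every constant here are illustrative assumptions, not the project’s actual model): as the pack densifies over time, the measured depth drops even while the SWE it represents holds steady.

```python
import math

def settled_density(t_days: float, rho_new: float = 100.0,
                    rho_max: float = 450.0, tau_days: float = 30.0) -> float:
    """Bulk snow density (kg/m^3) relaxing from fresh-snow density
    toward a maximum settled density. Illustrative assumption only."""
    return rho_max - (rho_max - rho_new) * math.exp(-t_days / tau_days)

# A snowpack holding a constant 375 mm of SWE gets shallower as it settles,
# which is why depth measurements need timestamps to be useful to the model.
swe_mm = 375.0
for day in (0, 15, 30, 60):
    rho = settled_density(day)
    depth_cm = swe_mm * 1000.0 / rho / 10.0
    print(f"day {day:2d}: density {rho:5.1f} kg/m^3, depth {depth_cm:5.1f} cm")
```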

Media Contact: 

Steve Lundeberg, 541-737-4039


With ‘material robotics,’ intelligent products won’t even look like robots

CORVALLIS, Ore. – Robots as inconspicuous as they are ubiquitous represent the vision of researchers in the new and burgeoning field of material robotics.

In an invited perspective paper published today in Science Robotics, Oregon State University researcher Yiğit Mengüç and three co-authors argue against looking at robotics as a “dichotomy of brain versus body.”

Mengüç and collaborators from the University of Colorado, Yale University and École Polytechnique Fédérale de Lausanne take a view that seeks to dissolve the basic assumption that robots are either “machines that run bits of code” or “software ‘bots’ interacting with the world through a physical instrument.”

“We take a third path: one that imbues intelligence into the very matter of a robot,” said Mengüç, assistant professor of mechanical engineering in OSU’s College of Engineering and part of the college’s Collaborative Robotics and Intelligent Systems Institute. “The future we’re dreaming of is one of material-enabled robotics, something akin to robots themselves being integrated into day-to-day objects.”

Such as footwear, for example.

“Shoes that are able to intelligently support your gait, change stiffness as you’re running or walking, or based upon the surface you’re on or the biomechanics of your foot,” Mengüç said. “That’s one potential product. Examples of that kind of material intelligence abound in nature, where complex functionality results from systems of simple materials.

“The point here with something like a self-adjusting shoe is it no longer resembles a robot – that’s kind of the direction of ubiquity we’re imagining.”

Mengüç notes that as technology becomes more capable it tends to follow a pattern of disappearing into the background of everyday life.

“Take smartphones,” he said. “Autocorrect, a very small and impoverished version of artificial intelligence, is ubiquitous.

“In the future, your smartphone may be made from stretchable, foldable material so there’s no danger of it shattering. Or it might have some actuation, where it changes shape in your hand to help with the display, or it can be able to communicate something about what you’re observing on the screen. What I would see as success for material robotics is where the technology we make is not static anymore – all these bits and pieces of technology that we take for granted in life will be living, physically responsive things, moving, changing shape in response to our needs, not just flat, static screens.”

At present, the authors note, two distinct approaches remain for creating composite materials that match the complexity of functional biological tissue: new materials synthesis and system-level integration of material components. 

Materials scientists are developing new bulk materials with the inherent multifunctionality required for robotic applications, while roboticists are working on new material systems with tightly integrated components.

“The convergence of these approaches will ultimately yield the next generation of material-enabled robots,” Mengüç said. “It’s a natural partnership that will lead to robots with brains in their bodies – inexpensive and ever-present robots integrated into the real world.”

Joining Mengüç in authoring the paper were Nikolaus Correll of the University of Colorado, Rebecca Kramer of Yale, and Jamie Paik of École Polytechnique Fédérale de Lausanne in Switzerland.

They were invited to contribute their thoughts on the state and direction of material robotics after organizing a workshop on the subject at the “Robotics: Science and Systems” conference held in July at the Massachusetts Institute of Technology. 

Media Contact: 

Steve Lundeberg, 541-737-4039

Floods are necessary for maintaining healthy river ecosystems

CORVALLIS, Ore. – Flooding rivers can wreak havoc on homes and roads but are necessary for healthy ecosystems, research at Oregon State University suggests.

The study shows that alterations to rivers’ natural flow patterns – because of dams, diversions and changes in precipitation – cause damage to riparian plant communities and river ecosystems in general.

Even minor shifts in temporal flow patterns harm networks of competing vegetation, said the study’s corresponding author, Jonathan Tonkin of the OSU College of Science.

The most severe effects, he said, occur when cyclical flooding is removed from the equation.

“We think of floods as being these damaging forces because of what they can do to human infrastructure,” said Tonkin, a postdoctoral scholar in integrative biology. “But flooding has benefits across the board, for both organisms and habitats in and around rivers.”

Findings were published today in Nature Ecology and Evolution.

Researchers used models to explore how a variety of possible flow scenarios could affect the diversity and integrity of riparian forests along major rivers and looked at five tree and shrub guilds common to rivers worldwide. The guilds were groupings of species with similar responses to water availability and river flow disturbance.

The scientists used detailed species biology and 83 years of flow data from Colorado’s Yampa River, an undammed, 250-mile waterway, to build a computer model to predict future flows and to quantify the effects of flow changes on riparian plant communities.

Results showed that even modest alterations to historical patterns of flood and drought can have negative effects on ecological networks – in this case, networks of competing plant guilds – and that network “connectance” decreased as flow-regime alteration increased. Connectance is a measure of how densely the species in a network are linked to one another.
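
For a network of S species or guilds with L realized pairwise links, connectance is commonly computed as the fraction of possible links that are realized – the standard ecological definition, offered here as background rather than as a formula quoted from the paper:

\[
C = \frac{L}{S^{2}}
\]

For example, five guilds joined by 10 of the 25 possible directed links (self-links included) would have a connectance of 0.4; flow alteration that prunes links pushes C downward.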

Study results also indicate that river flow homogenization, a result of damming, may be just as detrimental as drought to riparian communities.

“Connectance plays a fundamental role in maintaining biodiversity,” Tonkin said. “Evidence suggests that highly connected communities are better able to deal with species losses in food webs and are more resistant to invasion by non-native species. The simplification of these networks, including because of drought conditions that are predicted to increase widely over the next century, may predispose networks to collapse.”

Thus, preserving or restoring key components of natural flow regimes, which enhance connectance, should be a priority for river managers, he said.

“River-dependent communities have evolved over millennia and have been tailored by natural selection to the volume and seasonal variability of the flows,” he said. “Maintenance of flooding is fundamentally important for ecosystem health. Flooding is a vital driver of the ecology of rivers.”

One of the effects of reduced flooding is a change in which riparian guild plays the keystone role – that is, the guild with the single largest effect on the ecological network in terms of its influence on other species.

“Removing floods, in particular, led to a loss of keystone status of hydroriparian pioneer trees, which are species like cottonwoods, alders, and river red gum,” Tonkin said. “Loss of keystone guilds leads to changes in fundamentally important ecosystem services.”

Those include habitat provision for wildlife, flood mitigation and bank stability, microclimatic regulation, and nutrient cycling.

“Because different guilds have different soil requirements and ecological roles, it is important to predict which ones will function in a keystone role under future flow regime scenarios,” he said.

Supporting this study were the U.S. Department of Defense, the U.S. Forest Service and Dinosaur National Monument.

Collaborators included David Lytle of the OSU College of Science as well as researchers from the University of Washington and the Forest Service.

Media Contact: 

Steve Lundeberg, 541-737-4039


Another danger sign for coral reefs: Substitute symbiont falls short

CORVALLIS, Ore. – For reef-building corals, not just any symbiotic algae will do, new research shows.

The findings are important because they amount to another danger sign for the world’s coral reefs, which rely on a partnership with the millions of phototrophic algae they host to obtain food.

Global climate change is threatening the reefs in part because the symbionts, dinoflagellates of the genus Symbiodinium, can be stressed by warming oceans to the point of dysbiosis – a collapse of the host-symbiont partnership, which results in a phenomenon known as coral bleaching.

Earlier studies had suggested the more heat-tolerant Symbiodinium trenchii might be able to take the place of other, more sensitive species of Symbiodinium.

But an international research group that included Virginia Weis, Eli Meyer and Camerron Crowder of Oregon State University found that likely won’t be the case.

“Our research suggests that while S. trenchii might be able to establish a population in a host, it’s not correct to say it will be a beneficial partnership for the coral,” said Weis, professor of integrative biology in OSU’s College of Science.

Findings were published today in the Proceedings of the National Academy of Sciences.

Weis and collaborators at Victoria University of Wellington in New Zealand, the University of Melbourne and Stanford University worked with the sea anemone Exaiptasia pallida, commonly called Aiptasia, an established model for studying the type of symbiosis upon which coral reefs rely.

“Corals are really hard to grow in a lab,” Weis said. “They’re very fussy, they’re slow growing, and many of them are endangered. But this anemone grows very fast and is easy to manipulate.”

Aiptasia anemones were colonized separately with S. trenchii and with their native symbiont, S. minutum.

S. trenchii has been observed to invade corals after bleaching – when the corals become stressed and lose their algae.

“When we challenged Aiptasia with the regular symbiont, it went as expected,” Weis said. “There was no immune system response, and there was productivity – we could see signs of the host getting sugars and nutrients from its symbiont.”

But with the introduction of S. trenchii, it was a much different story.

“We got a completely different set of signals,” she said. “The hosts’ immune system went on alert, mounting a response to try to eject this invader, and we saw signs of catabolism – instead of growing and putting carbon away for a rainy day, the host was having to break down its own tissues because it wasn’t getting enough food. So it was quite a dramatically different set of responses.”

Understanding as much as possible about the symbiosis corals require, and the biology that underlies it, is a key to the “grave and existential threat” they face from climate change, Weis said.

“We’re at the point now where coral reefs as we know them will in fact largely disappear, and what we’re hoping is to get carbon emissions more under control and bring global temperatures back down so we can manage their reappearance as a dominant ecosystem,” she said. “One approach to mitigate the problem would be to shift the host to a symbiont population that can develop corals that are more robust to climate change. One of the hopes had centered on S. trenchii, but what studies show in the model system is that it’s unlikely that combination would result in an ecologically healthy partnership that could last. It’s a cautionary tale to those who think we can willy-nilly make symbiont switches and have healthy corals be the result.”

The Royal Society of New Zealand supported this research.

Media Contact: 

Steve Lundeberg, 541-737-4039


OSU researcher part of DARPA grant for autonomous drone swarms

CORVALLIS, Ore. – An Oregon State University computer science professor is part of a team that will receive up to $7.1 million to develop a drone swarm infrastructure to help the U.S. military in urban combat.

The contract is part of the Defense Advanced Research Projects Agency’s OFFSET program, short for Offensive Swarm-Enabled Tactics. The program’s goal, according to DARPA’s website, is “to empower … troops with technology to control scores of unmanned air and ground vehicles at a time.”

Julie A. Adams of OSU’s College of Engineering is on one of two teams of “swarm systems integrators” whose job is to develop the system infrastructure and integrate the work of the “sprint” teams that will focus on swarm tactics, swarm autonomy, human-swarm teaming, physical experimentation and virtual environments.

Raytheon BBN, a key research and development arm of the Raytheon Company, a major defense contractor, leads Adams’ team. The team also includes Smart Information Flow Technologies, a research and development firm. Northrop Grumman, an aerospace and defense technology company, heads the other team of integrators.

Adams, the associate director for deployed systems and policy at the college’s Collaborative Robotics and Intelligent Systems Institute, is the only university-based principal investigator on either team of integrators.

Researchers envision swarms of more than 250 autonomous vehicles – multi-rotor aerial drones and ground rovers – gathering information and assisting troops in “concrete canyon” surroundings where line-of-sight, satellite-based communication is impaired by buildings.

The information the swarms collect can help keep U.S. troops safer, as well as civilians in the battle areas.

“I specifically will work on swarm interaction grammar – how we take things like flanking or establishing a perimeter and create a system of translations that will allow someone to use those tactics,” Adams said. “We want to be able to identify algorithms to go with the tactics and tie those things together, and also identify how operators interact with the use of a particular tactic.

“Our focus is on the individuals who will be deployed with the swarms, and our intent is to develop enhanced interactive capabilities: speech, gestures, a head tilt, tactile interaction. If a person is receiving information from a swarm, he might have a belt that vibrates. We want to make the interaction immersive and more understandable for humans and enable them to interact with the swarm.”

Adams noted that China last summer launched a record swarm of 119 fixed-wing unmanned aerial vehicles.

“Right now we don’t have the infrastructure available for testing the capabilities of large swarms,” Adams said. “Advances have been made with indoor systems, including accurately tracking individual swarm members and by using simulations. Those are good first steps but they don’t match what will happen in the real world. Those approaches allow for testing and validation of some system aspects but they don’t allow for full system validation.”

The integrators’ objective is for operators to interact with the swarm as a whole, or subgroups of the swarm, and not individual agents – like a football coach orchestrating his entire offense as it runs a play.

“What the agents do individually is simple; what they do as a whole is much more interesting,” said Adams, likening a drone swarm to a school of fish acting in concert in response to a predator. “We’ve got these ‘primitives’” – basic actions a swarm can execute – “and we’ll map these primitives to algorithms for the individual agents in the swarm, and determine how humans can interact with the swarm based on all of these things. We want to advance and accelerate enabling swarm technologies that focus on swarm autonomy and how humans can interact and team with the swarm.” 
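
As a toy illustration of that fish-school idea – simple per-agent rules producing coherent group behavior – here is a classic boids-style update. This is generic flocking code, not DARPA’s or the team’s swarm software, and every name and constant in it is invented for the example:

```python
import numpy as np

def boids_step(pos, vel, dt=0.1, r_neigh=5.0,
               w_coh=0.01, w_align=0.05, w_sep=0.05):
    """One step of a classic boids flock: each agent steers by cohesion
    (toward its neighbors' center), alignment (matching their velocity)
    and separation (avoiding crowding). pos and vel are (N, 2) arrays."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        dist = np.linalg.norm(pos - pos[i], axis=1)
        mask = (dist > 0) & (dist < r_neigh)           # neighbors within range
        if not mask.any():
            continue
        cohesion = pos[mask].mean(axis=0) - pos[i]     # pull toward local center
        alignment = vel[mask].mean(axis=0) - vel[i]    # match local heading
        separation = (pos[i] - pos[mask]).sum(axis=0)  # push away from neighbors
        new_vel[i] += w_coh * cohesion + w_align * alignment + w_sep * separation
    return pos + new_vel * dt, new_vel

# 250 agents, matching the envisioned swarm size, from random starts
rng = np.random.default_rng(0)
pos = rng.uniform(0, 50, size=(250, 2))
vel = rng.normal(0, 1, size=(250, 2))
for _ in range(100):
    pos, vel = boids_step(pos, vel)
```

A tactic like “establish a perimeter” would sit a level above rules like these, selecting and parameterizing the per-agent behaviors rather than steering any single drone.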

Media Contact: 

Steve Lundeberg, 541-737-4039

Traffic signal countdown timers lead to improved driver responses

CORVALLIS, Ore. – Countdown timers that let motorists know when a traffic light will go from green to yellow lead to safer responses from drivers, research at Oregon State University suggests.

The findings are important because of mistakes made in what traffic engineers call the “dilemma zone” – the area in which a driver isn’t sure whether to stop or keep going when the light turns yellow.
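
A minimal sketch of the standard kinematic check behind that zone (the reaction time, deceleration rate and intersection geometry here are textbook-style assumptions, not values from the OSU study): a driver is trapped when the car can neither stop comfortably before the stop bar nor clear the intersection before the yellow ends.

```python
def dilemma_zone(x_ft: float, v_fps: float, yellow_s: float = 4.0,
                 react_s: float = 1.0, decel_fps2: float = 10.0,
                 width_ft: float = 60.0, car_ft: float = 15.0) -> str:
    """Classify a driver's situation at yellow onset.

    x_ft: distance from the stop bar; v_fps: speed in ft/s.
    Stopping needs x >= reaction distance + braking distance;
    clearing needs the car to cross the far side within the yellow.
    All constants are illustrative textbook values.
    """
    can_stop = x_ft >= v_fps * react_s + v_fps**2 / (2 * decel_fps2)
    can_clear = v_fps * yellow_s >= x_ft + width_ft + car_ft
    if can_stop:
        return "stop"
    if can_clear:
        return "go"
    return "dilemma"

# 45 mph is 66 ft/s; a car 250 ft from the stop bar at yellow onset
print(dilemma_zone(250, 66))  # -> 'dilemma' under these assumptions
```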

A traffic signal countdown timer, or TSCT, is a clock that digitally displays the time remaining for the current stoplight indication – i.e., red, yellow or green. 

Widely adopted by roughly two dozen countries around the world, traffic signal countdown timers are not used in the U.S. Crosswalk timers for pedestrians are allowed, but TSCTs are prohibited by the Department of Transportation.

“When you introduce inconsistencies – sometimes you give drivers certain information, sometimes you don’t – that has the potential to cause confusion,” said David Hurwitz, transportation engineering researcher in OSU’s College of Engineering and corresponding author on the study.

There were more than 37,000 traffic fatalities in the United States in 2016. Around 20 percent of those occurred at intersections, he said.

It’s not known exactly how many U.S. intersections are signalized because no agency does a comprehensive count, but the National Transportation Operations Coalition estimates the number to be greater than 300,000.

A significant percentage of those feature fixed-time signals, which are recommended in areas with low vehicle speed and heavy pedestrian traffic.

Traffic signal countdown timers work well at fixed-time signals, Hurwitz said, but they may not be practical for actuated signals; at those intersections, he said, a light typically changes only one to four seconds after the decision to change it is made – not enough time for a countdown timer to be of value.

In this study, which used a green signal countdown timer, or GSCT, in Oregon State’s driving simulator, the clock counted down the final 10 seconds of a green indication.

A subject pool of 55 drivers ranging in age from 19 to 73 produced a data set of 1,100 intersection interactions, half of which involved a GSCT. The presence of the countdown timer increased the probability that a driver in the dilemma zone would stop by an average of just over 13 percent and lowered deceleration rates by an average of 1.50 feet per second squared.

“These results suggest that the information provided to drivers by GSCTs may contribute to improved intersection safety in the U.S.,” Hurwitz said. “When looking at driver response, deceleration rates were more gentle when presented with the countdown timers, and we did not find that drivers accelerated to try to beat the light – those are positives for safety. Drivers were significantly more likely to slow down and stop when caught in the dilemma zone. The results in the lab were really consistent and statistically convincing.”

The findings, published recently in Transportation Research Part F: Traffic Psychology and Behaviour, build on a 2016 paper in Transportation Research Part C: Emerging Technologies.

The earlier results, which arose from a related research project, showed drivers were more ready to go when the light turned green at intersections with a red signal countdown timer, which indicates how much time remains until the light goes from red to green. The first vehicle in line got moving an average of 0.82 seconds more quickly in the presence of a timer, suggesting improved intersection efficiency thanks to a reduction in start-up lost time.

The papers grew out of the dissertation work of then-Ph.D. student Mohammad Islam, who now works for Traffic Technology Services, a company based in Beaverton, Oregon. Amy Wyman, an OSU Honors College undergraduate who completed her degree in 2017, collaborated on the publication.

TTS, whose chief executive officer, Thomas Bauer, is also an OSU College of Engineering alumnus, has developed a cloud-connected countdown timer for the automotive industry.

Several cars in the German luxury carmaker Audi’s 2017 lineup already feature the timer, which can be viewed both on the instrument panel and via a heads-up display. The system is currently operational in several U.S. cities including Portland.

Unlike the traffic-signal-mounted timers, the onboard clocks are allowed in the U.S. 

Media Contact: 

Steve Lundeberg, 541-737-4039


Nanofiber sutures promote production of infection-thwarting peptide

CORVALLIS, Ore. – Loading nanofiber sutures with vitamin D induces the production of an infection-fighting peptide, new research shows.

The discovery could represent an important advance in the prevention of surgical site infections, a multibillion-dollar challenge each year in the United States alone.

A collaboration that included Adrian Gombart of the Linus Pauling Institute at Oregon State University used coaxial electrospinning deposition and rolling to fabricate sutures that contained 25-hydroxyvitamin D3 and the pam3CSK4 peptide.

A peptide is a compound consisting of two or more amino acids linked in a chain; pam3CSK4’s function is to activate a cell’s toll-like receptor, which in turn triggers immune responses, in which vitamin D plays a key role.

The research showed the sutures released 25D3 – the same form of the vitamin that’s measured in the blood when a patient’s vitamin D levels are tested – on a sustained basis over four weeks. The sutures released pam3CSK4 via an initial burst followed by a four-week prolonged release.
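
As a rough illustration of those two profiles (the burst fraction and rate constant below are invented for the example, not fitted to the study’s measurements), a common way to describe burst-plus-sustained delivery is a first-order release term on top of an immediate burst:

```python
import math

def cumulative_release(t_days: float, burst_frac: float = 0.0,
                       k_per_day: float = 0.08) -> float:
    """Fraction of the loaded compound released by day t under a
    simple first-order model with an optional initial burst.
    Illustrative only; not the study's fitted release curves."""
    return burst_frac + (1 - burst_frac) * (1 - math.exp(-k_per_day * t_days))

for day in (1, 7, 14, 28):
    d3 = cumulative_release(day)                   # sustained, 25D3-like
    pam = cumulative_release(day, burst_frac=0.4)  # burst then prolonged, pam3CSK4-like
    print(f"day {day:2d}: 25D3-like {d3:.2f}, pam3CSK4-like {pam:.2f}")
```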

“When the toll-like receptor is activated, you induce a particular enzyme to convert 25D3 to its bioactive form, known as 1,25-dihydroxy vitamin D3, that activates the vitamin D receptor,” Gombart said. “When activity increases, that increases expression of vitamin D receptor target genes, one of which produces the LL-37 peptide, which kills microbes by disrupting their membranes.

“The idea is, if you were to have an infection, the sutures would activate the toll-like receptors and start increasing production of 1,25D3 from the 25D3 that’s being released from sutures – so you get both local induction and an increase in the production of the antimicrobial peptide.”

The study’s corresponding author, Jingwei Xie of the University of Nebraska Medical Center, notes that the anti-infective sutures currently in use contain triclosan, an antibacterial and antifungal agent also found in a variety of consumer products.

“However, the frequent use has resulted in bacterial resistance,” Xie said. “Triclosan also shows a wide range of health risks including endocrine disruption, impaired muscle function, liver damage and the development of cancerous tumors. Compared to the currently available products and treatment options, the anti-infective sutures we develop could circumvent the selection for multidrug resistance and other health-associated shortcomings. The new sutures are also highly configurable and can deliver a variety of bioactive compounds to minimize infection risk, optimize healing and minimize scarring. None of the currently available sutures has this level of function.”

Gombart adds that the vitamin D delivered by the sutures could also affect other genes involved in the immune response, beyond the one responsible for LL-37.

“So a compound like vitamin D not only targets bacteria via the antimicrobial peptide, but other immune responses can also be modulated to help combat infection,” he said. “Targeting on multiple fronts helps minimize the chance of resistance.”

The University of Nebraska Medical Center, the National Institutes of Health, and the Otis Glebe Medical Research Foundation supported this research.

Also involved in the collaboration were researchers from the Joan C. Edwards School of Medicine at Marshall University in Huntington, West Virginia, and the Chongqing Academy of Animal Sciences & Key Laboratory of Pig Industry Sciences in Chongqing, China.

Findings were recently published in Nanomedicine.

Media Contact: 

Steve Lundeberg, 541-737-4039


Assessment shows metagenomics software has much room for improvement

CORVALLIS, Ore. – A recent critical assessment of software tools represents a key step toward taming the “Wild West” nature of the burgeoning field of metagenomics, said an Oregon State University mathematical biologist who took part in the research.

Metagenomics refers to the science of genetically studying whole communities of microorganisms, as opposed to sequencing single species grown in culture.

“Microbes are ridiculously important to life,” said David Koslicki, assistant professor of mathematics in the OSU College of Science. “They not only can cause terrible things to happen, like blight and disease, but in general, overwhelmingly, microbes are our friends. Without them doing their jobs, crops couldn’t grow as well, it would be hard to digest our food, we might not get sleepy at appropriate times. Microbes are so fundamental to life, to health, we really need to know as much as we can about them.”

Koslicki, a leader in a university-wide research and education program known as OMBI – the OSU Microbiome Initiative – described the findings, published recently in Nature Methods, as “sobering.”

“There are not a lot of well-established, well-characterized computational techniques and tools that biologists can use,” he said. “And the assessment showed that a lot of the tools being used do not do nearly as well as had been initially thought, so there’s definitely room for improvement there.

“That said, depending on the situation that a biologist is interested in, there are definitely different tools that have proven to be the best so far.”

Metagenomics is a relatively new field that developed quickly once next-generation sequencing grew inexpensive enough that looking at entire microbial communities became economically feasible, said Koslicki.

“The typical view of biology is a wet lab and everything like that, but a whole other facet has to do with these high-throughput ways of accessing genetic material,” he said. “You end up with a ton of data, and when you end up with a ton of data, you introduce a new problem: How do I get the important information out of it? You have to come up with an algorithm that allows biologists to answer the questions they find important: What critters are there, how many are there, what are they doing, are there any viruses? We need to answer those questions and not just answer them quickly but also have some sort of idea how accurate the answer is.”

The dizzying array of tools biologists are using to try to answer those questions is “kind of like the Wild West,” Koslicki said. “If you want to learn what bacteria are in a sample, there are no less than three or four dozen different tools people have come up with, and in a rather disjointed manner. You have teams of statisticians, mathematicians, biologists, microbiologists, engineers all looking at this from their own perspectives and coming up with their own tools. Then the end-user biologist comes along and is faced with 40 different tools, and how do they know how good they are at answering the questions they need answered?”

Koslicki’s research, known as the CAMI challenge – critical assessment of metagenome interpretation – was aimed at ranking those tools to provide a road map for biologists.

“The challenge engaged the global developer community to benchmark their programs on highly complex and realistic data sets, generated from roughly 700 newly sequenced microorganisms and about 600 novel viruses and plasmids and representing common experimental setups,” he said. “This was an independent initiative. Typically when tools are compared, it’s attached to the publication of a new method that’s compared to other tools that do worse, so the new method looks good. There hasn’t been a lot of independent research into which tools actually work, how well they work, what kind of data they do well on, etc.”
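
As a toy version of the kind of scoring such a benchmark uses (the abundances below are made up, and a plain L1 distance is a simplification of CAMI’s fuller metric suite), one basic question for a taxonomic profiler is how far its predicted abundances sit from the gold standard:

```python
def l1_error(truth: dict, predicted: dict) -> float:
    """L1 distance between two relative-abundance profiles keyed by
    taxon: 0.0 is a perfect match, 2.0 is maximally wrong."""
    taxa = set(truth) | set(predicted)
    return sum(abs(truth.get(t, 0.0) - predicted.get(t, 0.0)) for t in taxa)

gold = {"E. coli": 0.5, "B. subtilis": 0.3, "S. aureus": 0.2}
tool_a = {"E. coli": 0.45, "B. subtilis": 0.35, "S. aureus": 0.20}
tool_b = {"E. coli": 0.80, "P. putida": 0.20}  # misses two taxa, invents one

print(l1_error(gold, tool_a))  # 0.10 -> close to the truth
print(l1_error(gold, tool_b))  # 1.00 -> badly off
```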

The UK Engineering and Physical Sciences Research Council, the U.S. Department of Energy, the Cluster of Excellence on Plant Sciences, the Australian Research Council, the European Research Council, the Agency for Science, Technology and Research Singapore, the Lundbeck Foundation, and the National Science Foundation supported this research.

Media Contact: 

Steve Lundeberg, 541-737-4039

Gamma-ray burst detection just what OSU researchers exclusively predicted

CORVALLIS, Ore. – More than a month before a game-changing detection of a short gamma-ray burst – a finding announced today – scientists at Oregon State University predicted such a discovery would occur.

Scientists from U.S. and European collaborations converged on the National Press Club in Washington, D.C., today to say they’ve detected an X-ray/gamma-ray flash that coincided with a burst of gravitational waves, followed by visible light from a new cosmic explosion called a kilonova.

Gravitational waves were first detected in September 2015, and that too was a red-letter event in physics and astronomy; it confirmed one of the main predictions of Albert Einstein’s 1915 general theory of relativity and earned a Nobel prize for the scientists who discovered them.

“A simultaneous detection of gamma rays and gravitational waves from the same place in the sky is a major milestone in our understanding of the universe,” said Davide Lazzati, a theoretical astrophysicist in the OSU College of Science. “The gamma rays allow for a precise localization of where the gravitational waves are coming from, and the combined information from gravitational and electromagnetic radiation allows scientists to probe the binary neutron star system that’s responsible in unprecedented ways. We can tell things like which galaxy the waves come from, if there are other stars nearby, and whether or not the gravitational waves are followed by visible radiation after a few hours or days.”

On Aug. 17, 2017, collaborators from the Laser Interferometer Gravitational-Wave Observatory, known as LIGO, and the European Gravitational Observatory’s Virgo team detected gravitational waves – ripples in the fabric of space-time – produced by the coalescence of two neutron stars.

Roughly two seconds later, NASA’s Fermi Gamma-ray Space Telescope detected a short flash of X- and gamma rays from the same location in the sky.

“The Fermi transient is more than 1,000 times weaker than a ‘normal’ short gamma-ray burst and has the characteristics that we predicted,” Lazzati said. “No other prediction of such flashes had been made. Just by pen and paper almost, we could say hey, we might see the bursts, even if they’re not in a configuration that makes them obvious.”

On July 6, Lazzati’s team of theorists had published a paper predicting that, contrary to earlier estimates by the astrophysics community, short gamma-ray bursts associated with the gravitational emission of binary neutron star coalescence could be detected – whether or not the gamma-ray burst was pointing at Earth.

The paper appeared in the journal Monthly Notices of the Royal Astronomical Society.

“X- and gamma rays are collimated, like the light of a lighthouse, and can be easily detected only if the beam points toward Earth,” Lazzati said. “Gravitational waves, on the other hand, are almost isotropic and can always be detected. We argued that the interaction of the short gamma-ray burst jet with its surroundings creates a secondary source of emission called the cocoon. The cocoon is much weaker than the main beam and is undetectable if the main beam points toward our instruments. However, it could be detected for nearby bursts whose beam points away from us.”
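
To see how restrictive that lighthouse-like collimation is, consider the beaming fraction of a jet with half-opening angle θ_j, the fraction of the sky each beam illuminates (the 16-degree figure below is an illustrative value, not one quoted by Lazzati):

\[
f_b = 1 - \cos\theta_j \approx \frac{\theta_j^{2}}{2}
\]

A jet with θ_j ≈ 16° (0.28 radians) lights up only about 4 percent of the sky per beam, so the vast majority of such bursts point away from any given observer – which is why a detectable off-axis cocoon signal matters so much.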

Since the first gravitational wave discovery, there have been three more confirmed detections, including the one from August that was jointly seen by scientists from the LIGO and Virgo groups.

“All observations until the last one were from the coalescence of binary black hole systems,” Lazzati said. “While these systems are interesting, they are dark in any other form of radiation and relatively little can be understood from them compared to binary neutron star systems.

“It’s a really lucky set of circumstances for a theorist, where you have a working theory to use to make predictions and new instruments such as LIGO and Virgo coming online to test them,” Lazzati said. “Scientists don’t make predictions because we want to be right – we make predictions because we want to test them. Even if we’re wrong, we’re still learning something – but it’s much more exciting to be right.”

The term neutron star refers to the gravitationally collapsed core of a large star; neutron stars are the smallest, densest stars known. According to NASA, neutron stars’ matter is packed so tightly that a sugar-cube-sized amount of it weighs in excess of a billion tons.

Media Contact: 

Steve Lundeberg, 541-737-4039
