OREGON STATE UNIVERSITY

College of Engineering

Nuclear “forensics” program will aid national security efforts

CORVALLIS, Ore.  – Oregon State University is helping to bolster U.S. anti-terrorism and nuclear security efforts through a new graduate student training initiative in nuclear forensics.

A new option in an existing degree program will train the next generation of nuclear forensics professionals, giving them the technical expertise needed to identify pre- or post-detonation nuclear and radiological materials, and determine how and where they were created.

Training in this field, university officials said, will produce experts with the skills to help identify, through forensic evidence, those responsible for an attack or attempted attack. Funding for the graduate student emphasis, one of the first of its kind in the nation, will be provided by the Department of Homeland Security through the Nuclear Forensics Education Award Program.

"The use of nuclear materials in several capacities is being pursued, and the reality of the world is that not everyone doing so has honorable intentions," said Brittany Robertson, the first student pursuing the emphasis. "I believe in being proactive, so that we don’t have to be reactive. A nuclear tragedy anywhere, whether intentional or accidental, has the potential to affect everywhere."

The nuclear forensics emphasis is led by Camille Palmer, research professor and instructor at OSU’s Department of Nuclear Engineering and Radiation Health Physics. It draws on faculty expertise in nuclear engineering, radiation health physics, radiation detection and radiochemistry, and utilizes state-of-the-art laboratory and spectroscopy facilities in OSU’s Radiation Center.

New courses are being created in nuclear materials science, nuclear forensics analysis, and detection of special nuclear material, which will build on existing courses in radiophysics, radiochemistry, and applied radiation safety.

“Oregon State is one of a handful of universities in the world positioned to make a significant impact in nuclear forensics education and research,” Palmer said. “Our human capital, facilities, and proximity to national laboratories make us a natural fit for a forensics program, and our goal is to continue to strengthen research collaborations to ensure that we are consistently relevant and productive in this field.”

Media Contact: 

Jens Odegaard, 541-737-2644

Source: 

Camille Palmer, 541-737-7059


Map outlines western Oregon landslide risks from a subduction zone earthquake

CORVALLIS, Ore. – New landslide maps have been developed that will help the Oregon Department of Transportation determine which coastal roads and bridges in Oregon are most likely to remain usable after the major subduction zone earthquake expected in the Pacific Northwest’s future.

The maps were created by Oregon State University and the Oregon Department of Geology and Mineral Industries, or DOGAMI, as part of a research project for ODOT. They outline the landslide risks following a large earthquake on the Cascadia Subduction Zone.

The mapping is part of ongoing ODOT efforts to preserve the critical transportation routes that will facilitate response and recovery.

“Landslides are a natural part of both the Oregon Coast Range and Cascade Range, but it’s expected there will be a significant number of them that are seismically induced from a major earthquake,” said Michael Olsen, an assistant professor in the OSU School of Civil and Construction Engineering. “A massive earthquake can put extraordinary additional strain on unstable slopes that already are prone to landslides.”

Landslides are already a serious geologic hazard for western Oregon. But during an earthquake, lateral ground forces can be as high as half the force of gravity.
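
As a rough back-of-envelope illustration – not a calculation from the study – engineers often approximate earthquake loading on a slope as a pseudostatic horizontal force proportional to the slope’s weight. With the seismic coefficient set to 0.5 to match the “half the force of gravity” figure above:

$$ F_h = k_h \, W \approx 0.5 \times (1000\,\mathrm{kg} \times 9.81\,\mathrm{m/s^2}) \approx 4.9\,\mathrm{kN} $$

That is, roughly 4.9 kilonewtons of added sideways force for every metric ton of soil on the slope.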

The Coast Range is of special concern, officials say, because it will be the closest part of the state to the actual subduction zone earthquake, and will experience the greatest shaking and ground movement. The research identified some of the most vulnerable landslide areas in Oregon as parts of the Coast Range between Tillamook and Astoria, and from Cape Blanco south to the California border – in each case, from the coast to about 30 miles inland.

“Major landslides have been identified by DOGAMI throughout western Oregon using high-resolution lidar mapping,” Olsen said. “Some experts believe that a number of these landslides date back to the last subduction zone earthquake in Oregon, in 1700. Coast Range slopes that are filled with weak layers of sedimentary rock are particularly vulnerable, and many areas are already on the verge of failure.”

According to the new map, the highway corridors to the coast that will face comparatively less risk from landslides will be Oregon Highway 36 from near Eugene to Florence; Oregon Highway 38 from near Cottage Grove to Reedsport; Oregon Highway 18 from Salem to Lincoln City; and large portions of U.S. Highway 30 from Portland to Astoria. However, landslides or other damages could occur on any road to the coast or in the Cascade Range due to the anticipated high levels of ground shaking.

The new research, along with other considerations, will help ODOT and other officials determine which areas merit the most investment in coming years as part of long-term planning for the expected earthquake. Given the high potential for damage and minimal resources available for mitigation, experts may choose to focus their efforts on highway corridors that are expected to receive less damage from the earthquake, Olsen said.

The research reflected in the new map considered factors such as slope, direction of ground movement, soil type, vegetation, distance to rivers, roads and faults, peak ground acceleration, peak ground velocity, and average annual precipitation.
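
As a simplified sketch of how such factors can be combined into a single hazard score, the fragment below computes a weighted susceptibility index for one map cell. The factor names follow the article, but the weights and scores are hypothetical, not values used by the OSU and DOGAMI researchers.

```python
# Minimal weighted-index sketch; weights are hypothetical illustrations.
WEIGHTS = {
    "slope": 0.25,
    "peak_ground_acceleration": 0.20,
    "peak_ground_velocity": 0.10,
    "soil_type": 0.10,
    "vegetation": 0.05,
    "distance_to_rivers": 0.10,
    "distance_to_faults": 0.10,
    "annual_precipitation": 0.10,
}

def susceptibility(factors: dict) -> float:
    """Weighted sum of factor scores, each pre-normalized to [0, 1]."""
    return sum(WEIGHTS[name] * score for name, score in factors.items())

# Example: a steep, wet Coast Range cell near a mapped fault.
cell = {
    "slope": 0.9,
    "peak_ground_acceleration": 0.8,
    "peak_ground_velocity": 0.7,
    "soil_type": 0.6,            # weak sedimentary rock scores high
    "vegetation": 0.4,
    "distance_to_rivers": 0.5,
    "distance_to_faults": 0.8,
    "annual_precipitation": 0.9,
}
print(f"susceptibility score: {susceptibility(cell):.2f}")  # ~0.75
```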

ODOT, Oregon State and DOGAMI have been state leaders in research on risks posed by the Cascadia Subduction Zone, earthquake and tsunami impacts, and initiatives to help the state prepare for a future disaster that scientists say is a certainty.

Officials said it’s important to consider not just the damage to structures that can occur as a result of an earthquake, but also landslide and transportation issues.

“ODOT recognizes the potential not only for casualties due to landslides during and after an earthquake, but also for the likelihood of isolating whole segments of the state’s population,” one ODOT official said. “Thousands of people in the coastal communities would be stranded and cut off from rescue, relief and recovery that would arrive by surface transport.”

ODOT recently completed a seismic vulnerability assessment and selected lifeline corridor routes to prioritize following an earthquake.  ODOT also maintains an unstable slopes program, evaluating the frequency of rockfalls and landslides affecting highway corridors.

DOGAMI recently released another open file report as part of the Oregon Resilience Plan, which evaluated multiple potential hazards resulting from a Cascadia subduction zone earthquake, including landslides, liquefaction, and tsunamis.

Some recent efforts at OSU have also focused on understanding the different concerns raised by a subduction zone earthquake compared to the type of strike-slip faults more common in California, on which many seismic plans are based. Subduction earthquakes tend to be larger, affect a wider area and last longer.

The following publications are available:

DOGAMI Open-File Report O-15-01, Landslide Susceptibility Analysis of Lifeline Routes in the Oregon Coast Range, by Rubini Mahalingam; Michael J. Olsen; Mahyar Sharifi-Mood; and Daniel T. Gillins, Oregon State University School of Civil and Construction Engineering.  The report can be purchased on DVD for $30 each from the Nature of the Northwest Information Center (NNW), 800 N.E. Oregon St., Suite 965, Portland, Ore., 97232. You may also call NNW at (971) 673-2331 or order online at www.NatureNW.org. There is a $4.95 shipping and handling charge for all mailed items.

ODOT Research Report SPR-740, Impacts of Potential Seismic Landslides on Lifeline Corridors, by Michael J. Olsen; Scott A. Ashford; Rubini Mahalingam; Mahyar Sharifi-Mood; Matt O’Banion and Daniel T. Gillins, Oregon State University School of Civil and Construction Engineering.  Download the report:  http://1.usa.gov/18352DF

 

Source: 

Michael Olsen, 541-737-9327


OSU to outfit undersea gliders to “think like a fish”

CORVALLIS, Ore. – Oregon State University researchers have received a $1 million grant from the W.M. Keck Foundation that will allow them to outfit a pair of undersea gliders with acoustical sensors to identify biological “hot spots” in the coastal ocean.

They also hope to develop an onboard computing system that will program the gliders to perform different functions depending on what they encounter.

In other words, the scientists say, they want to outfit a robotic undersea glider to “think like a fish.”

“We spend all of this time on ships, deploying instrumentation that basically is designed to see how ocean biology aggregates around physical features – like hake at the edge of the continental shelf or salmon at upwelling fronts,” said Jack Barth, a professor in OSU’s College of Earth, Ocean, and Atmospheric Sciences and a principal investigator on the project. “But that just gives us a two-week window into a particular area.

“We already have a basic understanding of the ecosystem,” Barth added. “Now we want to get a better handle of what kind of marine animals are out there, how many there are, where they are distributed, and how they respond to phytoplankton blooms, schools of baitfish or oceanic features. It will benefit a variety of stakeholders, from the fishing industry and resource managers to the scientific community.”

Barth is a physical oceanographer who studies the physical processes of the coastal ocean. He’ll work with Kelly Benoit-Bird, a marine ecologist who specializes in the relationships among marine organisms, from tiny plankton to large whales. Her work uses acoustics to identify and track animals below the ocean surface – and it is these sensors that will open up a new world of research aboard the gliders.

“Our first goals are to understand the dynamics of the Pacific Northwest upwelling system, find the biological hotspots, and then see how long they last,” Benoit-Bird said. “Then we’d like to learn what we can about the distribution of prey and predators – and the relationship of both to oceanic conditions.”

Using robot-mounted acoustic sensors, the OSU researchers will be able to identify different kinds of marine animals using their unique acoustical signatures. Diving seabirds, for example, leave a trail of bubbles through the water like the contrail left by a jet. Zooplankton show up as a diffuse cloud. Schooling fish create a glowing, amoeba-shaped image.
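
A toy sketch of that signature-based identification idea follows; the shape descriptors and thresholds are invented for illustration and are not the researchers’ actual processing.

```python
# Toy classifier mapping simple echo-shape descriptors to the animal
# types described above. Descriptor names and thresholds are invented;
# real bioacoustic classification is far more sophisticated.

def classify_echo(vertical_extent: float, horizontal_extent: float,
                  intensity: float) -> str:
    """Guess a source from the rough shape and strength of an echo."""
    if vertical_extent > 5 * horizontal_extent and intensity > 0.5:
        return "diving seabird (narrow bubble trail, like a contrail)"
    if intensity < 0.3 and horizontal_extent > 2 * vertical_extent:
        return "zooplankton (broad, diffuse cloud)"
    if intensity > 0.7:
        return "schooling fish (compact, bright, amoeba-shaped patch)"
    return "unclassified"

print(classify_echo(vertical_extent=20, horizontal_extent=2, intensity=0.8))
```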

“We’ve done this kind of work from ships, but you’re more or less anchored in one spot, which is limiting,” Benoit-Bird said. “By putting sensors on gliders, we hope to follow fish, or circle around a plankton bloom, or see how seabirds dive. We want to learn more about what is going on out there.”

Programming a glider to spend weeks out in the ocean and then “think” when it encounters certain cues is a challenge that falls to the third member of the research team, Geoff Hollinger, from OSU’s robotics program in the College of Engineering. Undersea gliders operated by Oregon State already can be programmed to patrol offshore for weeks at a time, following a transect, moving up and down in the water column, and even rising to the surface to beam data back to onshore labs via satellite.

But the instruments aboard the gliders that measure temperature, salinity and dissolved oxygen are comparatively simple and require limited power. Using sophisticated bioacoustics sensors that record huge amounts of data, and then programming the gliders to respond to environmental cues, is a significant technological advance.

“All of the technology is there,” Hollinger said, “but combining it into a package to perform on a glider is a huge robotics and systems engineering challenge. You need lots of computing power, longer battery life, and advanced control algorithms.”

Making a glider “think,” or respond to environmental cues, is all about predictive algorithms, he said.

“It is a little like looking at economic indicators in the stock market,” Hollinger pointed out. “Just one indicator is unlikely to tell you how a stock will perform. We need to develop an algorithm that essentially turns the glider into an autonomous vehicle that can run on autopilot.”
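
In the spirit of that analogy, here is a minimal sketch of the kind of rule-based trigger logic involved. The sensor names, thresholds and behaviors are hypothetical; the team’s actual predictive algorithms would be considerably more sophisticated.

```python
# Toy sketch of onboard trigger logic for a glider. All names and
# thresholds are hypothetical illustrations, not the OSU design.

from dataclasses import dataclass

@dataclass
class SensorReading:
    acoustic_backscatter: float  # relative echo intensity, 0-1
    chlorophyll: float           # proxy for phytoplankton, 0-1
    depth_m: float

def choose_behavior(reading: SensorReading) -> str:
    """Pick the next mission behavior from the latest sensor snapshot."""
    if reading.acoustic_backscatter > 0.7:
        # Strong echoes suggest a dense aggregation such as a fish
        # school: break off the transect and loiter to sample it.
        return "circle_target"
    if reading.chlorophyll > 0.6:
        # Likely a phytoplankton bloom; map its edges.
        return "map_bloom_boundary"
    # Nothing notable: continue the pre-programmed transect.
    return "continue_transect"

print(choose_behavior(SensorReading(0.85, 0.3, 40.0)))  # circle_target
```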

The three-year research project should benefit fisheries management, protection of endangered species, analyzing the impacts of new ocean uses such as wave energy, and documenting impacts of climate change, the researchers say.

Oregon State has become a national leader in the use of undersea gliders to study the coastal ocean and now owns and operates more than 20 of the instruments through three separate research initiatives. Barth said the vision is to establish a center for underwater vehicles and acoustics research – which would be a key component of the university’s recently announced Marine Studies Initiative.

The university also has a growing program in robotics, of which Hollinger is a key faculty member. The Keck-funded project exemplifies the collaborative nature of research at Oregon State, the researchers say, where ecologists, oceanographers and roboticists work together.

“This project and the innovative technology could revolutionize how marine scientists study the world’s oceans,” Barth said.

Source: 

Jack Barth, 541-737-1607, barth@coas.oregonstate.edu;

Kelly Benoit-Bird, 541-737-2063, kbenoit@coas.oregonstate.edu;

Geoff Hollinger, 541-737-5906, Geoff.hollinger@oregonstate.edu


Study outlines impact of tsunami on the Columbia River

CORVALLIS, Ore. – Engineers at Oregon State University have completed one of the most precise evaluations yet of how a major tsunami would affect the Columbia River: which forces most strongly control the water flow, and which areas might be inundated.

They found, in general, that tidal stages are far more important than river flow in determining the impact of a tsunami; that it would have its greatest effect at the highest tides of the year; and that a tsunami would be largely dissipated within about 50 miles of the river’s mouth, near Longview, Wash.

Any water level increases caused by a tsunami would be so slight as to be almost immeasurable around the Portland metropolitan area or Bonneville Dam, the study showed. But water could rise as much as 13 feet just inside the mouth of the Columbia River, and almost 7 feet within a few miles of Astoria.

“There have been previous models of Columbia River run-up as a result of a tsunami, but they had less resolution than this work,” said David Hill, an associate professor of civil engineering in the OSU College of Engineering. “We carefully considered the complex hydrodynamics, the ground subsidence such an event might cause, and the impacts under different scenarios.”

The impact of tsunamis on rivers is difficult to predict, researchers say, because many variables are involved that can either dampen or magnify their effect. Such factors can include the width and shape of river mouths, bays, river flow, tidal effects, and other forces.

But the major tsunami in Japan in 2011, which was caused by geologic forces similar to those facing the Pacific Northwest, also reached far inland along local rivers and caused significant damage there. As a result, researchers are paying increased attention to the risks facing residents along such rivers.

The OSU research has been published in the Journal of Waterway, Port, Coastal and Ocean Engineering by Hill and OSU graduate student Kirk Kalmbacher. It’s based on a major earthquake on the Cascadia Subduction Zone and a resulting tsunami, with simulations run at different river flows and at high, low, flood and ebb tides.

Of some interest is that the elevation of the tsunami wave itself is generally lowest at high tide, yet its overall flooding impact is greatest then, because water levels are already so high. Because of complex hydrodynamic interactions, the study also found that only on a flood tide would water actually wash up and over the southern spit of the Columbia River mouth, causing some local flooding.

Tides, overall, had much more impact on the reach of a tsunami than did the amount of water flowing in the river.

“We were a little surprised that the river’s water flow didn’t really matter that much,” Hill said. “The maximum reach of a tsunami on the Columbia will be based on the tidal level at the time, and of course the magnitude of the earthquake causing the event.”

Based on a maximum magnitude 9.0 earthquake and associated tsunami at the highest tide of the year, the research concluded (a rough interpolation sketch follows the list):

  • Just offshore, the tsunami would raise water levels about 11.5 to 13 feet.
  • Just inside the mouth of the Columbia River, the water would rise about 13 feet.
  • At river mile 6, approaching Hammond, Ore., the river would rise about 10 feet.
  • At river mile 25, near Welch Island, the river would rise about 1.6 feet.
  • At river mile 50, near Longview, Wash., there would be no measurable rise in the river.
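
The promised sketch: a simple linear interpolation between those mapped points gives a rough feel for the rise at intermediate river miles. The study’s hydrodynamic model is nonlinear, so this is illustrative only, not a substitute for the published maps.

```python
# Piecewise-linear estimate between the river-mile figures listed above.
river_mile = [0, 6, 25, 50]      # mouth, Hammond, Welch Island, Longview
rise_ft    = [13.0, 10.0, 1.6, 0.0]

def estimated_rise(mile: float) -> float:
    """Rough tsunami-induced rise (feet) at a given river mile."""
    for i in range(len(river_mile) - 1):
        x0, x1 = river_mile[i], river_mile[i + 1]
        if x0 <= mile <= x1:
            frac = (mile - x0) / (x1 - x0)
            return rise_ft[i] + frac * (rise_ft[i + 1] - rise_ft[i])
    return 0.0  # beyond Longview: no measurable rise

print(f"{estimated_rise(15):.1f} ft")  # ~6.0 ft near river mile 15
```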

Maps developed as a result of this research make more precise estimates of the areas that might face tsunami-induced flooding. They should aid landowners and land use planners, Hill said, in making improved preparations for an event that researchers now say is inevitable in the region’s future. Experts believe the region faces subduction zone earthquakes every 300-600 years; the last one occurred in January 1700.

There are some notable differences between the projections on these newer maps and those on older ones, Hill said.

Source: 

David Hill, 541-737-4939


OSU professor elected to the National Academy of Engineering

CORVALLIS, Ore. – Gabor Temes, a professor of electrical and computer engineering at Oregon State University, has been elected to the National Academy of Engineering, the highest professional distinction for engineers in both industry and academia.

Temes, who has been at OSU’s School of Electrical Engineering and Computer Science since 1990, was honored for his “contributions to analog signal processing and engineering education.”

Members are selected for significant contributions to engineering research, practice, or education, and for the “pioneering of new and developing fields of technology.” Temes is the second OSU faculty member to receive the rare honor; the first was professor emeritus Octave Levenspiel, who was elected in 2000.

Temes’ career has spanned work in industry and academia. He served as distinguished professor and department chair at UCLA and as professor and department head at OSU. His research in the area of analog integrated circuits – the interface between the “real” analog world and digital signal processors – has improved the quality of sound and data communications.

He holds 14 patents and has more than 500 publications, including several books. His long career has earned him many accolades including the IEEE Kirchhoff Award, a prestigious distinction recognizing outstanding career achievements.

“We are extremely fortunate to have Gabor Temes at Oregon State,” said Bella Bose, professor and interim school head in the School of Electrical Engineering and Computer Science. “In addition to being an outstanding researcher, he is an excellent mentor and many of his graduate students have gone on to become leaders in industry and academia.”

This year, 67 new members were elected to the academy, bringing the total U.S. membership to 2,263. The induction ceremony will be held on Oct. 4 during the National Academy of Engineering’s annual meeting in Washington, D.C.

Media Contact: 

Rachel Robertson, 541-737-7098

Source: 

Gabor Temes, 541-737-2979


Wave energy integration costs should compare favorably to other energy sources

 

CORVALLIS, Ore. – A new analysis suggests that large-scale wave energy systems developed in the Pacific Northwest should be comparatively steady, dependable and able to be integrated into the overall energy grid at lower costs than some other forms of alternative energy, including wind power.

The findings, published in the journal Renewable Energy, confirm what scientists have expected – that wave energy will have fewer problems with variability than some energy sources and that by balancing wave energy production over a larger geographic area, the variability can be even further reduced.
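
As a minimal statistical illustration of that geographic-balancing effect (with made-up site counts and noise levels, not data from the study): the average of several imperfectly correlated site outputs varies less than any single site’s output.

```python
# Toy simulation: five wave sites share a common swell signal plus
# independent local noise; the portfolio average varies less than any
# single site. All numbers here are invented for illustration.
import random
import statistics

random.seed(1)
N_SITES, N_HOURS = 5, 10_000

shared = [random.gauss(1.0, 0.3) for _ in range(N_HOURS)]   # common swell
sites = [
    [max(0.0, s + random.gauss(0.0, 0.3)) for s in shared]  # + local noise
    for _ in range(N_SITES)
]
portfolio = [sum(hour) / N_SITES for hour in zip(*sites)]

print(f"single-site std dev:  {statistics.stdev(sites[0]):.3f}")
print(f"5-site portfolio std: {statistics.stdev(portfolio):.3f}")  # smaller
```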

The variability of alternative energy sources is one factor that holds back their wider use – if wind or solar output drops or varies widely, some other form of generation has to back it up, and that adds to the overall cost of energy supply.

“Whenever any new form of energy is added, a challenge is to integrate it into the system along with the other sources,” said Ted Brekken, an associate professor and renewable energy expert in the College of Engineering at Oregon State University.

“By producing wave energy from a range of different sites, possibly with different types of technology, and taking advantage of the comparative consistency of the wave resource itself, it appears that wave energy integration should be easier than that of wind energy,” he said. “The reserve, or backup generation, necessary for wave energy integration should be minimal.”

The study’s estimate of the cost of integrating wave energy indicated it would be 10 percent or less of the charges currently assessed for integrating wind energy. Energy integration, however, is just one component of the overall cost of the power generated. Wave energy, still in the infancy of its development, is not yet cost-competitive on an overall basis.

Wave energy is not now being commercially produced in the Pacific Northwest, but experts say its future potential is significant, and costs should come down as technologies improve and more systems are developed. This study examined the hypothetical addition of 500 megawatts of generating capacity in this region by 2025, which would be comparable to approximately five large wind farms.

Another strength of wave energy, the study suggested, is that its short-term generation capacity can be predicted with a high degree of accuracy over a time scale ranging from minutes to hours, and with some accuracy even seasonally or annually.

The Pacific Northwest has some of the nation’s best wave energy resources, and as a result is home to the Northwest National Marine Renewable Energy Center, supported by the U.S. Department of Energy.

Wave energy in the region is expected to spur economic growth, help diversify the energy portfolio, reduce greenhouse gas emissions and reduce transmission losses, the study noted.

This study was a collaboration of researchers at OSU, the University of Victoria, and private industry.

Source: 

Ted Brekken, 541-737-2995


New type of semiconductor could change face of consumer electronics

CORVALLIS, Ore. – Materials first developed at Oregon State University more than a decade ago with an eye toward making “transparent” transistors may be about to shake up the field of consumer electronics – and the first uses are not even based on the transparent capability of the materials.

Transparent transistors were invented by OSU researchers in 2002. In continued work and in collaboration with private industry, certain transparent transistor materials – amorphous oxide semiconductors – are now gaining some of their first commercial applications. Licensing of the compounds is under way to a range of companies.

One of the first and most important of the semiconductors is based on the compound indium gallium zinc oxide, or IGZO. It’s now being used to produce flat-panel displays for computer monitors with extraordinary resolution and clarity, and in ultrathin HDTVs. IGZO will also soon find its way into tablets and cell phone displays.

But that may be just the beginning, experts say.

“Amorphous oxide semiconductors appear well-positioned to significantly impact a $100 billion industry,” said John Wager, holder of the Michael and Judith Gaulke Chair in the OSU School of Electrical Engineering and Computer Science.

“Because of their increased electron mobility, compounds like IGZO can provide brighter displays with higher resolution,” Wager said.

Transistors made using IGZO also consume much less standby power; cell phones might be created that need charging only once or twice a week instead of once a day.

The primary competition for amorphous oxide semiconductors is low-temperature polysilicon, Wager said. But this technology is more complex and expensive.

“Amorphous oxide semiconductors benefit from the fact that they can be implemented by retrofitting an existing fabrication facility,” Wager said. “This would save billions of dollars, rather than having to build a new plant, as required for low-temperature polysilicon.

“Amorphous oxide semiconductor implementation appears on the verge of exploding,” he said. “If the current trend continues, in the next five years most people will likely own some device with these materials in them. This is a breathtaking pace.”

The commercialization of amorphous oxide semiconductors also bodes well for the future of transparent electronics.

Conceptually, electronics could be incorporated into any glass surface. A bathroom mirror could display your schedule for that day in an updatable and interactive way. A window could function as a computer display in conjunction with touchscreen control. Driving directions could appear on the windshield of your automobile. Or you could replace your drapes with a bedroom window that would automatically or manually darken to block out light.

Source: 

John Wager, 541-737-2994

Atmospheric carbon dioxide used for energy storage products

CORVALLIS, Ore. – Chemists and engineers at Oregon State University have discovered a fascinating new way to take some of the atmospheric carbon dioxide that’s causing the greenhouse effect and use it to make an advanced, high-value material for use in energy storage products.

This innovation in nanotechnology won’t soak up enough carbon to solve global warming, researchers say. However, it will provide an environmentally friendly, low-cost way to make nanoporous graphene for use in “supercapacitors” – devices that can store energy and release it rapidly.

Such devices are used in everything from heavy industry to consumer electronics.

The findings were just published in Nano Energy by scientists from the OSU College of Science, OSU College of Engineering, Argonne National Laboratory, the University of South Florida and the National Energy Technology Laboratory in Albany, Ore. The work was supported by OSU.

In the chemical reaction that was developed, the end result is nanoporous graphene, a form of carbon that’s ordered in its atomic and crystalline structure. It has an enormous specific surface area of about 1,900 square meters per gram of material. Because of that, it has an electrical conductivity at least 10 times higher than the activated carbon now used to make commercial supercapacitors.
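
For a back-of-envelope sense of why that surface area matters – assuming a typical carbon double-layer capacitance of about 10 microfarads per square centimeter, a textbook value rather than a figure from the study:

$$ C \approx c_{dl} \, A \approx 0.10\,\mathrm{F/m^2} \times 1900\,\mathrm{m^2/g} \approx 190\,\mathrm{F/g}, $$

on the order of the gravimetric capacitances reported for high-performance carbon electrodes.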

“There are other ways to fabricate nanoporous graphene, but this approach is faster, has little environmental impact and costs less,” said Xiulei (David) Ji, an OSU assistant professor of chemistry in the OSU College of Science and lead author on the study. “The product exhibits high surface area, great conductivity and, most importantly, it has a fairly high density that is comparable to the commercial activated carbons.

“And the carbon source is carbon dioxide, which is a sustainable resource, to say the least,” Ji said. “This methodology uses abundant carbon dioxide while making energy storage products of significant value.”

Because the materials involved are inexpensive and the fabrication is simple, this approach has the potential to be scaled up for production at commercial levels, Ji said.

The chemical reaction outlined in this study involves a mixture of magnesium and zinc metals, a combination reported for the first time. These are heated to a high temperature in the presence of a flow of carbon dioxide to produce a controlled “metallothermic” reaction. The reaction converts the metals into their oxides and yields nanoporous graphene, a pure form of carbon that’s remarkably strong and can efficiently conduct heat and electricity. The metal oxides can later be recycled back into their metallic forms to make the industrial process more efficient.
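
In simplified overall form – a textbook-style idealization rather than the paper’s full reaction scheme – the carbon-producing step for each metal can be written:

$$ 2\,\mathrm{Mg} + \mathrm{CO_2} \rightarrow 2\,\mathrm{MgO} + \mathrm{C}, \qquad 2\,\mathrm{Zn} + \mathrm{CO_2} \rightarrow 2\,\mathrm{ZnO} + \mathrm{C}. $$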

By comparison, other methods to make nanoporous graphene often use corrosive and toxic chemicals, in systems that would be challenging to use at large commercial levels.

“Most commercial carbon supercapacitors now use activated carbon as electrodes, but their electrical conductivity is very low,” Ji said. “We want fast energy storage and release that will deliver more power, and for that purpose the more conductive nanoporous graphene will work much better. This solves a major problem in creating more powerful supercapacitors.”

A supercapacitor is a type of energy storage device, but it can be recharged much faster than a battery and delivers far more power. Supercapacitors are used wherever rapid energy storage and short, powerful bursts of release are needed.

They are being used in consumer electronics, and have applications in heavy industry, with the ability to power anything from a crane to a forklift. A supercapacitor can capture energy that might otherwise be wasted, such as in braking operations. And their energy storage abilities may help “smooth out” the power flow from alternative energy systems, such as wind energy.

They can power a defibrillator, open the emergency slides on an aircraft and greatly improve the efficiency of hybrid electric automobiles. Nanoporous carbon materials can also adsorb gas pollutants, work as environmental filters, or be used in water treatment. The uses are expanding constantly and have been constrained mostly by their cost.

Source: 

Xiulei (David) Ji, 541-737-6798


Matched “hybrid” systems may hold key to wider use of renewable energy

CORVALLIS, Ore. – The use of renewable energy in the United States could take a significant leap forward with improved storage technologies or more efforts to “match” different forms of alternative energy systems that provide an overall more steady flow of electricity, researchers say in a new report.

Historically, a major drawback to the use and cost-effectiveness of alternative energy systems has been that they are too variable – if the wind doesn’t blow or the sun doesn’t shine, a completely different energy system has to be available to pick up the slack. This lack of dependability is costly and inefficient.

But in an analysis just published in The Electricity Journal, scientists say that much of this problem could be addressed with enhanced energy storage technology or by developing “hybrid” systems in which, on a broader geographic scale, one form of renewable energy is ramping up even while the other is declining.

“Wind energy is already pretty cost-competitive and solar energy is quickly getting there,” said Anna Kelly, a graduate student in the School of Public Policy at Oregon State University and an energy policy analyst. “The key to greater use of these and other technologies is to match them in smart-grid, connected systems.

“This is already being done successfully in a number of countries and the approach could be expanded.”

For instance, the wind often blows more strongly at night in some regions, Kelly said, and solar technology can only produce energy during the day. By making more sophisticated use of that basic concept in a connected grid, and pairing it with more advanced forms of energy storage, the door could be opened for a much wider use of renewable energy systems, scientists say.
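
A stylized sketch of that day/night pairing is below; the sinusoidal generation profiles are idealized, not measured data.

```python
# Idealized diurnal profiles: solar peaks at noon, wind (here) at night.
# The combined output is much steadier than either source alone.
import math

def solar(hour: float) -> float:
    """Zero at night, peaking at noon (idealized)."""
    return math.sin(math.pi * (hour - 6) / 12) if 6 <= hour <= 18 else 0.0

def wind(hour: float) -> float:
    """Strongest around midnight in this stylized profile."""
    return 0.5 + 0.4 * math.cos(math.pi * hour / 12)

hours = range(24)
combined = [solar(h) + wind(h) for h in hours]
print(f"solar alone ranges 0.00 to {max(map(solar, hours)):.2f}")
print(f"combined ranges    {min(combined):.2f} to {max(combined):.2f}")
```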

“This is more than just an idea, it’s a working reality in energy facilities around the world, in places like Spain, Morocco and China, as well as the U.S.,” Kelly said. “Geothermal is being paired with solar; wind and solar with lithium-ion batteries; and wind and biodiesel with batteries. Hybrid systems help address the price issue, and renewable energy is being produced in them by real, private companies that are making real money.”

Advanced energy storage could be another huge key to making renewable energy more functional, and one example is just being developed in several cooperating states in the West. Electricity is being produced by efficient wind farms in Wyoming; transmitted to Utah where it’s being stored via compressed air in certain rock formations; and ultimately used to help power Los Angeles.

This $8 billion system could be an indicator of things to come, since compressed air can rapidly respond to energy needs and be readily scaled up to be cost-competitive at a significant commercial level.

“There are still a number of obstacles to overcome,” said Joshua Merritt, a co-author on the report and also a graduate student in mechanical engineering and public policy at OSU. “Our transmission grids need major improvements so we can more easily produce energy and then send it to where it’s needed. There are some regulatory hurdles to overcome. And the public has to more readily accept energy systems like wind, wave or solar in practice, not just in theory.”

The “not in my back yard” opposition to renewable energy systems is still a reality, the researchers said, and there are still some environmental concerns about virtually any form of energy, whether it’s birds killed by wind turbine rotors, fish losses in hydroelectric dams or chemical contaminants from use of solar energy.

The near future may offer more options, the researchers said. Advanced battery storage technologies are becoming more feasible. Wave or tidal energy may become a real contributor, and some of those forces are more predictable and stable by definition. And the birth of small, modular nuclear reactors – which can be built at lower cost and produce no greenhouse gas emissions – could play a significant role in helping to balance energy outflows from renewable sources.

The long-term goal, the report concluded, is to identify technologies that can work in a hybrid system that offers consistency, dependability and doesn’t rely on fossil fuels. With careful matching of systems, improved transmission abilities and some new technological advances, that goal may be closer than realized, they said.

“With development, the cost of these hybrid systems will decrease and become increasingly competitive, hopefully playing a larger role in power generation in the future,” the researchers wrote in their conclusion.


New technology may speed up, build awareness of landslide risks

CORVALLIS, Ore. – Engineers have created a new way to use lidar technology to identify and classify landslides on a landscape scale, which may revolutionize the understanding of landslides in the U.S. and reveal them to be far more common and hazardous than previously recognized.

The new, objective technology, created by researchers at Oregon State University and George Mason University, can analyze and classify the landslide risk in an area of 50 or more square miles in about 30 minutes – a task that previously might have taken an expert weeks or months. It can also identify risks common to a broad area rather than just an individual site.

And with such speed and precision, it reveals that some landslide-prone areas of the Pacific Northwest are literally covered by landslides from one time or another in history. The system is based on new ways to use light detection and ranging, or lidar, technology, which can seemingly strip away vegetation and other obstructions to show land features in their bare form.

“With lidar we can see areas that are 50-80 percent covered by landslide deposits,” said Michael Olsen, an expert in geomatics and the Eric H.I. and Janice Hoffman Faculty Scholar in the OSU College of Engineering. “It may turn out that there are 10-100 times more landslides in some places than we knew of before.

“We’ve always known landslides were a problem in the Pacific Northwest,” Olsen said. “Many people are just now beginning to realize how big the problem is.”

An outline of the new technology was recently published in the professional journal Computers & Geosciences.

Oregon and Washington, especially the Coast Range and Cascade Range, are well known for landslides, and as a result Oregon’s Department of Geology and Mineral Industries has become a national leader in mapping them, Olsen said. But previous approaches are slow, and the new technology, called the Contour Connection Method, could radically speed up widespread mapping and build both professional and public awareness of the issue.

Despite the prevalence and frequency of landslides, they are generally not covered by most homeowner insurance policies; coverage can be purchased separately, but most people don’t buy it. And with continued population growth, more people are moving into remote locations or building in scenic areas near the hills around cities, where landslide risk can be high.

“A lot of people don’t think in geologic terms, so if they see a hill that’s been there for a long time, they assume there’s no risk,” said Ben Leshchinsky, a geotechnical engineer in the OSU College of Forestry. “And many times they don’t want to pay extra to have an expert assess landslide risks or do something that might interfere with their land development plans.”

Lidar is already a powerful tool, but the new system developed at OSU offers an automated way to improve the use of it, and could usher in a new era of landslide awareness, experts say. Information could be more routinely factored into road, bridge, land use, zoning, building and other decisions.

With this technology, a computer automatically looks for land features, such as suddenly steeper areas of soil, that might be evidence of a past landslide. It then searches the terrain for other features, such as a “toe” of soils at the base of the landslide. And in minutes it can make unbiased, science-based classifications of past landslides that consistently use the same criteria.
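
For a flavor of the automation – emphatically not the published Contour Connection Method itself, just a toy version of the first step – the sketch below computes slope from a bare-earth elevation grid and flags abruptly steep cells of the kind that can mark a landslide scarp.

```python
# Toy scarp-flagging sketch on a synthetic bare-earth DEM. The cell size
# and slope threshold are hypothetical; this only illustrates the idea
# of automated feature detection, not the actual published algorithm.
import numpy as np

CELL_SIZE_M = 1.0        # lidar DEM resolution (hypothetical)
SCARP_SLOPE_DEG = 35.0   # flag slopes steeper than this (hypothetical)

def flag_steep_cells(dem: np.ndarray) -> np.ndarray:
    """Return a boolean grid marking cells steeper than the threshold."""
    dz_dy, dz_dx = np.gradient(dem, CELL_SIZE_M)
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    return slope_deg > SCARP_SLOPE_DEG

# Tiny synthetic DEM: a gentle slope with one sharp step (a "scarp").
dem = np.tile(np.linspace(100, 110, 50), (50, 1))
dem[:, 25:] += 8.0  # abrupt 8 m step between columns 24 and 25
print(f"{flag_steep_cells(dem).sum()} cells flagged as steep")
```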

The technology was applied to the region surrounding the March 2014 landslide that killed 43 people near the small town of Oso, Washington. In about nine minutes it analyzed more than 2,200 acres, identifying many prehistoric landslide features that are readily apparent in lidar images of this region known for slope instability.

Eventually, adaptations of the technology might even allow for real-time monitoring of soil movement, the researchers said.

Source: 

Michael Olsen, 541-737-9327
