Category: Energy

Offshore wind farms provide feeding opportunities for seals – and more

New research has revealed that offshore wind turbines may act as artificial reefs, with seals deliberately seeking out the structures while hunting for prey.

Dr Deborah Russell and her team at the University of St Andrews gathered data from GPS devices attached to seals in the North Sea. The findings were published in the journal Current Biology.

The movements of harbour and grey seals were tracked, and the researchers found that a proportion of the seals returned repeatedly to offshore wind structures. This suggested seals forage around wind farms and underwater pipelines along the British and Dutch coasts.

Russell said, “I was shocked when I first saw the stunning grid pattern of a seal track around Sheringham Shoal – an offshore wind farm in Norfolk.

“You could see that the seal appeared to travel in straight lines between turbines, as if he was checking them out for potential prey and then stopping to forage at certain ones.”

She added, “The behaviour observed could have implications for both offshore wind farm developments and the decommissioning of oil and gas infrastructure.”

A study published in the Journal of Applied Ecology in May suggested that renewable energy projects could help certain marine species settle in new areas and thrive.

The first thing I learned about offshore structures when I started work down on the Gulf of Mexico was that the best fishing spots in the Gulf were underneath producing platforms. Between the structure and the shade, which offered temperature gradients, you always had better luck catching your limit next to an oil platform.

The shade was nice, too.

Thanks, Mike


First Buckyballs from boron


Wang Lab/Brown University

Just in time for the World Cup final, researchers have succeeded in building the first ‘buckyballs’ made entirely from boron atoms. Unlike true, carbon-based buckyballs, the boron molecules are not shaped exactly like footballs. But this novel form of boron might lead to new nanomaterials and could find uses in hydrogen storage.

Robert Curl, Harold Kroto and Richard Smalley found the first buckyball — or buckminsterfullerene — in 1985. The hollow cage, made of 60 carbon atoms arranged in pentagons and hexagons like a football, got its name from the US architect and engineer Richard Buckminster Fuller, who used the same shapes in designing his domes. The discovery opened the floodgates for creating more carbon structures with impressive qualities, such as carbon nanotubes and the single-atom-thick graphene. Since then, materials scientists have also searched for buckyball-like structures made of other elements.
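As a quick sanity check on that geometry, Euler's polyhedron formula recovers the 60 carbon atoms from the buckyball's faces. The face counts below are the well-known C60 (truncated icosahedron) values, not figures from this article:

```python
import math  # not strictly needed; kept for clarity that this is plain arithmetic

# Verify the C60 buckyball geometry with Euler's polyhedron formula: V - E + F = 2.
pentagons, hexagons = 12, 20                  # the classic truncated-icosahedron faces
faces = pentagons + hexagons                  # F = 32
edges = (5 * pentagons + 6 * hexagons) // 2   # each edge is shared by two faces: E = 90
vertices = 2 * edges // 3                     # each carbon bonds to three neighbours: V = 60

print(vertices, edges, faces)                 # 60 90 32
print(vertices - edges + faces == 2)          # True: Euler's formula holds
```

The same bookkeeping is why an all-pentagon-and-hexagon cage needs exactly 12 pentagons, whatever its size.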

In 2007, Boris Yakobson, a materials scientist at Rice University in Houston, Texas, theorized that a cage made of 80 boron atoms should be stable. Another study published just last week predicts a stable structure with 36 boron atoms.

Publishing…in Nature Chemistry, a team led by Lai-Sheng Wang, a chemist at Brown University in Providence, Rhode Island, has become the first to see such a beast — although its structure is slightly different from that predicted. The researchers call their 40-atom molecule borospherene. It is arranged in hexagons, heptagons and triangles…

In addition to having a less elegant shape, the borospherene balls form a different type of internal bond from their carbon counterparts. This makes them difficult to use as isolated building blocks, as they have a tendency to interact with each other, but this reactivity may make boron buckyballs good for connecting in chains. It also makes the balls capable of bonding with hydrogen, which the team says could make them useful in hydrogen storage.

Which should have GM and Toyota knocking at the hallowed doors of Brown University. Both are dedicated to producing alternative-fuel vehicles using hydrogen.

Thanks, Mike

Net-zero energy test home ends first year with leftover energy

Braving a harsh winter with snow-covered solar panels, a net-zero energy home in the Washington DC area has come up trumps in a year-long study of its energy-harvesting capabilities. Located on the campus of the National Institute of Standards and Technology, the house was paired with computer simulations that replicated the energy consumption of a family of four. At the end of its first 12 months, there was a large enough surplus to power an electric car for 1,440 miles.

The 2,700 sq ft, two-story construction was developed to look like a regular home but function as a laboratory for clean energy research. Much like the Honda Smart Home, NIST's effort combines stable ground temperatures with geothermal systems to minimize heating and cooling loads throughout the building. Another factor in overall energy efficiency is a doubling of insulation levels, sealed by special sheeting that reportedly heals itself when pierced.

“The most important difference between this home and a Maryland code-compliant home is the improvement in the thermal envelope – the insulation and air barrier,” says NIST mechanical engineer Mark Davis…

The energy surplus, and the home's claim to net-zero living, came despite a stretch of severe weather. For 38 days through winter, the 32 photovoltaic panels were largely covered in snow and ice, hampering their ability to harvest energy from the sun. But over the 12-month period, the home generated 13,577 kWh of energy. This surpassed the virtual family's energy usage by 491 kWh, an excess that could in theory be directed toward an electric vehicle or back into the grid…
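The arithmetic behind those figures is easy to check. A minimal sketch; the implied EV efficiency of roughly a third of a kWh per mile is my inference, not a number from the article:

```python
# Check the NIST test home's energy balance from the figures quoted above.
generated_kwh = 13577          # solar generation over the 12-month study
surplus_kwh = 491              # excess beyond the virtual family's usage
ev_miles = 1440                # claimed electric-car range from the surplus

consumption_kwh = generated_kwh - surplus_kwh
kwh_per_mile = surplus_kwh / ev_miles

print(consumption_kwh)             # 13086 kWh used by the simulated family
print(round(kwh_per_mile, 2))      # 0.34 kWh per mile, a plausible EV efficiency
```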

Despite the test home boasting the aesthetics of a typical suburban house, adoption of the technologies used will largely come down to cost. NIST estimates that fitting out a similar-sized house with all the bells and whistles of its test home would cost around US$162,700. On the upside, it puts savings in electricity costs at $4,373 for the year.
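Dividing the quoted up-front premium by the annual savings gives the simple payback period, which shows why cost is the sticking point. This ignores energy-price inflation and financing, which the article doesn't quantify:

```python
# Simple payback on the NIST home's efficiency package, from the quoted figures.
upfront_cost = 162700      # estimated cost to fit out a similar-sized house (US$)
annual_savings = 4373      # first-year electricity savings (US$)

payback_years = upfront_cost / annual_savings
print(round(payback_years, 1))   # 37.2 years, longer than many mortgages
```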

Further research will center on how measurements from the home can improve its energy efficiency, and on addressing the difference between up-front costs and long-term savings. NIST hopes its findings will lead to improved energy-efficiency standards as a resource for builders, regulators and home buyers.

A couple of comments.

First, the design is two or three times the size of sensible requirements. Make your decisions based on need instead of cultural McMansions, and a family of four could be quite comfortable in 1,200 square feet instead of 2,700. My wife and I and a dog live in 1,400 sq ft and use about 900 sq ft, including a study/home office. We have a spare bedroom we refloored a couple of years ago and haven't yet gotten round to moving anything into!

Second, custom home building adds a premium of as much as 30% to cost. Build comparable homes as part of a subdivision and the growing economies of scale will reduce the cost of building homes like this. Working this one out from scratch probably increased the cost 10-15% over projections just on change orders. :)

Wastewater from Big Oil triggers surge in small earthquakes

Massive injections of wastewater from the oil and gas industry are likely to have triggered a sharp rise in earthquakes in the state of Oklahoma.

Researchers say there has been a forty-fold increase in the rate of quakes in the US state between 2008 and 2013.

The scientists found that the disposal of water in four high-volume wells could be responsible for a swarm of tremors up to 35km away.

Their research has been published in the journal Science…

There has been increasing evidence of links between the process of oil and gas extraction and earthquakes in states like Arkansas, Texas, Ohio and Oklahoma in recent years…

The US Geological Survey (USGS) has also reported on the question of seismicity induced by wastewater disposal.

This new research goes further, linking a large swarm of Oklahoma tremors to a number of specific disposal wells located some distance away.

More than 2,500 earthquakes greater than magnitude 3.0 have occurred around the small town of Jones since 2008. This represents about 20% of the total in the central and western US in this period.

Researchers have now linked this increase to a near doubling in the volumes of wastewater disposed of in the central Oklahoma region between 2004 and 2008.

Water is never far away in the energy extraction process. It is used not just for hydraulic fracturing, but also to squeeze more oil out of conventional wells…

According to Dr Bill Ellsworth from the USGS, the high price of oil has driven this water-based approach. But the law says that drinking water has to be protected from the salty flow.

“As part of the business model, you have to be able to dispose of these very large volumes of saline water. You can’t treat it; you can’t put it into the rivers. So, you have to inject it underground…”

Studies like this always characterize quakes around magnitude 3.0 as small. There is the possibility that all these little quakes are what folks are forced to put up with – until the Big One.

Thanks, Mike

An alga that switches its internal quantum computer on and off

Scientists have shown that certain algae which use quantum effects to optimize photosynthesis are also capable of switching those effects off. It's a discovery that could lead to highly efficient organic solar cells and quantum-based electronics.

Like quantum computers, some organisms are capable of scanning all possible options in order to choose the most efficient path or solution. For plants and some photosynthetic algae, this means the ability to make the most of the energy they receive and then deliver that energy from leaves with near perfect efficiency. This effect, called quantum coherence, is what allows some algae to survive in very low levels of light.

Recently, scientists from the UNSW School of Physics studied one group of these algae, tiny single-celled organisms called cryptophytes. They typically live at the bottom of pools of water, or under thick ice, where light is scarce. The researchers found that there's a class of cryptophytes in which quantum coherence is switched off, on account of a single genetic mutation that alters the shape of a light-harvesting protein.

In quantum mechanics, a system is coherent when all quantum waves are in step with each other. When it’s coherent, it can exist in many different states simultaneously, an effect known as superposition.
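One way to picture coherence and its loss is with a two-state density matrix: a coherent superposition has non-zero off-diagonal terms, and decoherence wipes them out. A toy sketch of my own, not the paper's model:

```python
import math

# A coherent equal superposition of two states, (|0> + |1>) / sqrt(2).
amp0 = amp1 = 1 / math.sqrt(2)

# Density matrix entries rho[i][j] = amp_i * amp_j (real amplitudes here).
rho = [[amp0 * amp0, amp0 * amp1],
       [amp1 * amp0, amp1 * amp1]]

# The off-diagonal "coherences" are what keep the quantum waves in step.
print(round(rho[0][1], 3))   # 0.5 -> fully coherent

# Decoherence damps the off-diagonal terms, leaving a classical mixture.
rho_decohered = [[rho[0][0], 0.0],
                 [0.0, rho[1][1]]]
print(rho_decohered[0][1])   # 0.0 -> the quantum advantage is switched off
```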

The researchers used x-ray crystallography to determine the crystal structure of the light-harvesting complexes from three different species. Two cryptophyte species had a mutation that led to the insertion of an extra amino acid, changing the structure of the protein complex and disrupting coherence.

The next step for the scientists will be to determine whether the switching effect is assisting the algae’s survival. What’s more, further understanding of this phenomenon could eventually lead to technological advances, such as better organic solar cells and quantum-based electronic devices.

I’m beginning to worry that quantum mechanics is starting to rub off on my little gray cells. This is making sense to me – and it only took five or six decades.

Thanks, Mike

LEDs to revolutionize hi-tech greenhouse farming

It won’t come as a surprise to discover that consumers all over the developed world are increasingly demanding seasonal vegetables all year round, even when the local climate simply doesn’t allow that kind of growth. Particularly sought-after are tomatoes, cucumbers, and leaf vegetables. Which is why greenhouse farming has become a major factor in the food supply of the developed world.

Consequently, the number of commercial greenhouses and the area they occupy is rocketing. In the Netherlands, for example, greenhouses occupy around 0.25 percent of the land area of the entire country. And the Netherlands isn’t even the largest producer of greenhouse vegetables in Europe. That position is held by Spain. And the largest producer of greenhouse vegetables in the world is now China…

So an important question is how to minimize the energy it takes to grow these crops. One obvious answer is to convert greenhouses from traditional lighting, usually high pressure sodium lamps, to more energy-efficient LEDs.

That might seem like an economic no-brainer, but the industry has been slow to make the change because of the high initial cost of LEDs. The question farmers have pondered is whether they will ever recoup the upfront cost of a brand-new lighting system.

Today, they get an answer thanks to the work of Devesh Singh and pals at the Hannover Centre for Optical Technologies at the University of Hannover in Germany. These guys have compared the life-cycle costs of traditional high pressure sodium lamps against those of LEDs for greenhouse lighting.

And they say the advantages are clear. They calculate that the cumulative cost of high pressure sodium lamps surpasses that of LEDs after just seven years and that after 16 years the cumulative cost of high pressure sodium lamps is more than double the equivalent cost of LEDs.

It's not hard to see where this saving comes from. Although high pressure sodium lamps are individually cheaper than LEDs, they have to be replaced every year, compared with every 19 years for LEDs. And, of course, LEDs use considerably less electricity, wasting far less as heat.
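The crossover logic can be sketched as a toy life-cycle model. All the inputs below (prices, wattages, lighting hours) are placeholder values I chose to reproduce the qualitative result, not figures from Singh and co's paper:

```python
# Toy life-cycle cost model: cumulative cost of HPS vs LED greenhouse lighting.
HOURS_PER_YEAR = 5000     # assumed supplemental-lighting hours
PRICE_PER_KWH = 0.10      # assumed electricity price

def cumulative_costs(fixture_cost, lamp_life_years, watts, years):
    """Cumulative cost at the end of each year: fixture replacements plus electricity."""
    energy_per_year = watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH
    total, out = 0.0, []
    for year in range(1, years + 1):
        if (year - 1) % lamp_life_years == 0:   # (re)buy the fixture when it wears out
            total += fixture_cost
        total += energy_per_year
        out.append(total)
    return out

hps = cumulative_costs(fixture_cost=350, lamp_life_years=1, watts=600, years=16)
led = cumulative_costs(fixture_cost=3100, lamp_life_years=19, watts=400, years=16)

# First year in which the cheaper-to-buy HPS becomes dearer overall.
crossover = next(y for y, (h, l) in enumerate(zip(hps, led), start=1) if h > l)
print(crossover)   # 7: with these placeholder numbers, HPS overtakes LED in year 7
```

The shape of the result, not the exact year, is the point: yearly relamping plus higher electricity use steadily erodes the HPS head start.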

But the most interesting part of Singh and co’s analysis is in the potential of LEDs to change the way that vegetables are grown. High pressure sodium lamps emit light across the entire visible part of the spectrum and well into the infrared where much energy is lost as heat. By contrast, LEDs can be adjusted to emit light in very specific parts of the spectrum.

Plant physiologists have long known that chlorophyll absorbs mainly in the blue and red parts of the spectrum and hardly at all in the green. So it makes sense to produce light only in the parts plants actually use. That's easy with LEDs, of course, but impossible with sodium lamps.

The strategy seems clear: convert to LED lighting as quickly as possible. Conversion will pay for itself in a few years, and the advantages for yield and quality of output start with the flip of a light switch – and taking the resulting veggies to market.

RTFA for a few more ways in which crop quality is improved – while saving on the cost of production.

What would a functioning warp-drive ship look like?


Image by Mark Rademaker

Artist Mark Rademaker has unveiled a set of concept images imagining what a spaceship capable of traveling to other stars in a matter of months would really look like. Although it may look like something from the next science fiction epic and is unlikely to lift off anytime soon, his IXS Enterprise design is actually based on some hard science…

The idea comes from the work published by Miguel Alcubierre in 1994. His version of a warp drive is based on the observation that, though light can only travel at a maximum speed of 186,000 miles per second, spacetime itself has a theoretically unlimited speed. Indeed, many physicists believe that during the first seconds of the Big Bang, the universe expanded at some 30 billion times the speed of light.

The Alcubierre warp drive works by recreating this ancient expansion in the form of a localized bubble around a spaceship. Alcubierre reasoned that if he could form a torus of negative energy density around a spacecraft and push it in the right direction, this would compress space in front of it and expand space behind it. As a result, the ship could travel at many times the speed of light while the ship itself sits in zero gravity, meaning the crew don’t end up as a grease stain on the aft bulkhead from the acceleration.

Unfortunately, the original maths indicated that a torus the size of Jupiter would be needed, and you’d have to turn Jupiter itself into pure energy to power it. Worse, negative energy density violates a lot of physical limits itself and to create it requires forms of matter so exotic that their existence is largely hypothetical.

In recent years, Dr Harold “Sonny” White of NASA’s Johnson Space Center has given the interstellar minded some cause for optimism by showing that even if the warp drive may not be possible, it may be much less impossible than previously thought. White looked at the equations and discovered that making the torus thicker, while reducing the space available for the ship, allowed the size of the torus to be greatly decreased, down to a width of 10 meters for a ship traveling ten times the speed of light.

According to White, with such a setup, a ship could reach Alpha Centauri in a little over five months, and oscillating the bubble around the craft reduces the stiffness of spacetime, making it easier to distort. This would reduce the amount of energy required by several orders of magnitude, making it possible to design a craft that, rather than being the size of Jupiter, is smaller than the Voyager 1 probe.

RTFA for another set of reasons why we ain’t seeing this anytime soon. Or maybe even later.

But, as David Szondy says in his article, “It doesn’t hurt to dream”.

Northeastern states cut emissions and enjoyed above-average growth

Some critics of the Environmental Protection Agency’s new requirements for power plants argue that forcing emissions reduction will curtail economic growth. But the recent experience of states that already cap carbon emissions reveals that emissions and economic growth are no longer tightly tied together.

One of the ways that states will be able to meet the new E.P.A. standards is by joining a Northeastern cap-and-trade program known as the Regional Greenhouse Gas Initiative, which first imposed a carbon cap in 2009. In a cap-and-trade system, the government places a ceiling on total carbon emissions and issues permits for those emissions, which companies can buy and sell from one another.

The nine states already in the program — Connecticut, Delaware, Maine, Maryland, Massachusetts, New Hampshire, New York, Rhode Island and Vermont — have substantially reduced their carbon emissions in recent years. At the same time, those states have had stronger economic growth than the rest of the country.

These nine states had large emissions drops even before the program began in 2009, in part because the recession and warmer winters lowered the demand for power. The states also began switching to natural gas power, retiring coal units, and adding wind and solar energy generation. As the economy recovered, participation in the program spurred the states to find ways to meet the increasing demand for power without driving up emissions.

Since 2009, the nine states have cut their emissions by 18 percent, while their economies grew by 9.2 percent. By comparison, emissions in the other 41 states fell by 4 percent, while their economies grew by 8.8 percent…
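Those four numbers make the decoupling concrete once you convert them to emissions intensity, i.e. emissions per unit of economic output. The intensity figures below are my derivation from the percentages quoted, not from the article:

```python
# Change in emissions intensity = (1 + emissions change) / (1 + GDP change) - 1.
def intensity_change(emissions_change, gdp_change):
    return (1 + emissions_change) / (1 + gdp_change) - 1

rggi = intensity_change(-0.18, 0.092)     # the nine cap-and-trade states
others = intensity_change(-0.04, 0.088)   # the other 41 states

print(round(rggi * 100, 1))    # -24.9: intensity fell by a quarter under the cap
print(round(others * 100, 1))  # -11.8: less than half that drop elsewhere
```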

Joining the Northeast cap-and-trade program, or another similar program in California that began in 2013, is one of many ways states can reach the new goals the E.P.A. has set. Other options include taking a series of individual steps — such as upgrading older power plants and expanding nuclear, wind and solar power generation — without a statewide cap. The states themselves will decide exactly how they will meet the goals set for them.

Of course, though economists uniformly fart in the general direction of Reaganomics, and though honest historians catalogue the revisionist lies of Republicans who claim every financial success, the ideologues of climate denial are so tightly beholden to the fossil fuel economy that they will simply hire more Madison Avenue flacks and continue their stroll down memory lane – hoping for a burning Bush or some other miracle.

CO2 hits 400 parts per million, highest monthly average in human history

The World Meteorological Organization announced Monday that the average monthly concentration of carbon dioxide (CO2) in the northern hemisphere exceeded 400 parts per million in April, the highest monthly average on record.

A press release from the United Nations’ weather agency says crossing the “threshold is of symbolic and scientific significance and reinforces evidence that the burning of fossil fuels and other human activities are responsible for the continuing increase in heat-trapping greenhouse gases warming our planet.”

According to the WMO's release, CO2 levels have risen more than 40 percent, up from pre-industrial levels of around 278 ppm, since humans began burning fossil fuels. CO2 can remain in the atmosphere for hundreds of years, longer still in the oceans. Because plants absorb more carbon dioxide during summer months when their foliage is more dense and plentiful, CO2 levels fluctuate from season to season and tend to peak in the spring. The northern hemisphere, with far more vegetated land than the southern hemisphere, tends to have a more pronounced seasonal cycle.
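The "more than 40 percent" figure follows directly from the two concentrations quoted. A one-line check:

```python
# Rise in atmospheric CO2 from the pre-industrial baseline to April's reading.
preindustrial_ppm = 278
current_ppm = 400

rise = (current_ppm - preindustrial_ppm) / preindustrial_ppm
print(round(rise * 100, 1))   # 43.9 - indeed "more than 40 percent"
```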

But even with April’s reading representing a seasonal peak, Earth’s atmosphere hasn’t seen levels as high as 400 ppm for millions of years.

“This should serve as yet another wakeup call about the constantly rising levels of greenhouse gases which are driving climate change. If we are to preserve our planet for future generations, we need urgent action to curb new emissions of these heat trapping gases,” WMO Secretary-General Michel Jarraud said…”Time is running out.”

Scientists are essentially conservative in the methodology they choose to analyze the real world; but, they are relentless. While skeptics prance about and prate this week’s talking points from the Koch Bros and the other fossil fuel barons – real science moves steadily onward. There may be dialectical moments of sharp new understanding. In general, though, the core characteristic is step-by-step until a qualitative level is raised.

And then the process starts anew.

Reserve junk science politics for philistines who put profits before people’s needs.