Tuesday 17 March 2009

Trilliant Energy Valet - iPhone-Based Smart Grid Integration

Original Article - Trilliant Inc.

Helping consumers and the environment
Electric car charged? Check. Schedule hot tub at home? Check. Helping the environment? Check. All this and more is made possible by Trilliant's Energy Valet, a widget for use on Apple's iPhone® and iPod touch® that will give consumers unprecedented control over energy consumption and savings. Sign up for Trilliant's Preview to get more information as it becomes available.

Benefits include:

* Instant status of utility bill costs, savings, and carbon footprint reduction.
* Instant Time of Use Rate schedule.
* Long-term and near-term remote scheduling of household devices.
* 1Touch charging of electric vehicles, and remote control of household devices.

SerViewCom
Trilliant's SerViewCom® data acquisition and data processing platform provides a streamlined and cost-effective 24/7 Advanced Metering Infrastructure (AMI) system. Compared to other meter data acquisition systems, SerViewCom is very flexible in terms of operating system platform, database selection, data acquisition approaches, look and feel, reporting, and alarms. Trilliant also offers solutions for MV-90 customers, enhancing the value of an overall AMI system.

SerViewCom offers various levels of configuration that allow you to refine the applications so they meet your corporate needs:

* Choose your own database: Oracle, Sybase, PostgreSQL, MySQL, and MS SQL
* Change look and feel: SerViewCom embeds transparently into your current corporate web design
* Run on your preferred OS: Windows 2000, Windows XP, and Windows Server 2003
* Create reports: several formats available including PDF, HTML, and CSV

San Diego Utility to Roll Out Smart Meters Mid-March

Original Article: earth2tech.com

The recession could be standing in the way of some utilities’ smart grid plans, but San Diego Gas & Electric, which services 3.4 million residents across 4,100 square miles, is trying to keep on track. Stephanie Donovan, spokesperson for the utility, told us recently that in mid-March San Diego Gas & Electric (SDG&E) will begin rolling out 2.3 million electric and gas meters at its customers’ homes.

That means the utility’s project is just slightly behind schedule, as SDG&E had been planning to start a broad smart meter deployment in February. Itron, which will be providing the meters for the rollout, said last month that some of its utility customers would be changing up deployment schedules, with some moving forward more quickly than expected and others deploying somewhat later than their initial schedule.


SDG&E has chosen to use Itron’s OpenWay smart meters, which use an open-standards approach to provide two-way communication between the utility and the resident’s home. The OpenWay meters also have a ZigBee chip embedded, enabling the resident to create a home area network that can help cut energy consumption.

For SDG&E, coming close to the deployment deadline is still pretty good — these smart meter projects are a substantial effort for utilities and require years of planning, regulatory approval and hundreds of millions of dollars. SDG&E’s project was approved by the California Public Utilities Commission back in 2007, and the utility estimates expenditures for the project on the order of $572 million (including at least $500 million in capital investments).

That kind of investment is pretty standard for a utility’s smart meter project. Ed Legge, an analyst with the Edison Electric Institute, said an average utility will spend at least $500 million on a large scale smart meter rollout. And it would take at least $50 billion for all of the investor-owned utilities (which make up 70 percent of the U.S. utilities) to roll out smart grid networks.

This week, utilities and smart grid firms will all be waiting for a vote on the stimulus package, which would allocate $4.5 billion for a smarter power grid, with a goal of 40 million smart meters installed. While those funds are just a drop in the bucket for the overall industry, companies like smart grid software maker eMeter are already saying that they’re seeing a stimulus-induced pickup in the market.

AT&T Taps Into Smart Grid With SmartSynch

Original Article: earth2tech.com

Smart grid analyst Jesse Berst wasn’t kidding when he said companies are in a frenzy repositioning themselves to grab a piece of the smart grid market. The latest is the largest cell phone company in the U.S.: AT&T. This morning, AT&T says it is working with smart meter technology maker SmartSynch to provide its wireless network for residential installations using SmartSynch’s smart meter technology.

SmartSynch and AT&T already have a partnership whereby AT&T’s wireless network is used to connect smart meters at commercial and industrial locations to around 100 different utilities’ back offices. This morning’s announcement is an expansion of that business into the residential market.

SmartSynch is a decade-old company based in Jackson, Miss., that makes a smart meter system that uses Internet protocol (IP) networks, like cellular or Wi-Fi, to connect smart meters at buildings to utilities. The benefit of SmartSynch’s system for utilities is that they don’t have to build their own network to run smart meters, they can use existing networks, which means a much lower cost to deploy the system.
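The appeal of IP-based metering is that a reading becomes a small message any existing network can carry. A minimal sketch of what a meter-to-utility payload might look like (the field names and meter ID here are hypothetical illustrations, not SmartSynch's actual protocol):

```python
import json
from datetime import datetime, timezone

def encode_reading(meter_id, kwh):
    """Serialize one meter reading as JSON, deliverable over any IP
    transport (cellular, Wi-Fi, ...) to the utility's back office."""
    return json.dumps({
        "meter_id": meter_id,
        "kwh": kwh,
        "ts": datetime.now(timezone.utc).isoformat(),
    })

# Hypothetical meter posting a cumulative reading
payload = encode_reading("SM-001", 1234.5)
reading = json.loads(payload)  # what the utility's collector would decode
```

Because the payload is ordinary IP traffic, the utility needs no purpose-built radio network, which is the cost advantage the article describes.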

While AT&T has been using its network for smart meter deployments for a while, this new extension is a bet on the growth of the smart meter residential market. U.S. utilities are just starting to do trials of smart meters in homes in select regions, but with the stimulus package injecting billions into the smart meter and smart grid markets, utilities and companies are getting ready to take advantage of dramatic growth this year. President Barack Obama has called for the installation of 40 million smart meters and 3,000 miles of transmission lines.

A buildout of the smart grid could also be one of the largest creators of wealth of the decade. As Berst said recently, the smart grid will “spawn new Googles and Microsofts,” and is “akin to the transcontinental railroad, the phone system, the interstate highway system and the Internet.” Of course AT&T wants a piece of that.

Friday 13 March 2009

Original Article - MIT Technology Review


Outside vibe: Citysense is a downloadable application for the iPhone and BlackBerry. It provides a heat map of GPS activity in a major city. Here, San Francisco is shown with red patches that indicate higher activity. The application has also identified the user’s location (a solid yellow dot) and suggests popular destinations (yellow circles).
Credit: Sense Networks
Over the course of any day, people congregate around different parts of a city. In the morning hours, workers commute downtown, while at lunchtime and in the evening, people disperse to eateries and bars.

While this sort of behavior is common knowledge, it hasn't been visible to the average person. Sense Networks, a startup based in New York, is now trying to bring this side of a city to life. Using cell-phone and taxi GPS data, the startup's software produces a heat map that shows activity at hot spots across a city. Currently, the service, called Citysense, only works in San Francisco, but it will launch in New York in the next few months.
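A heat map like Citysense's can be sketched by binning GPS fixes into a grid and counting the fixes per cell; the densest cells are the "red" hot spots. A minimal illustration (the coordinates and cell size below are invented for the example):

```python
from collections import Counter

def heat_map(points, cell=0.01):
    """Bin (lat, lon) GPS fixes into a grid and count fixes per cell.

    `cell` is the cell edge in degrees; the per-cell counts are the
    intensities a renderer would color from cool to hot.
    """
    grid = Counter()
    for lat, lon in points:
        grid[(int(lat / cell), int(lon / cell))] += 1
    return grid

# Hypothetical fixes clustered near two San Francisco hot spots
fixes = [(37.7749, -122.4194)] * 5 + [(37.7849, -122.4094)] * 2
hot = heat_map(fixes)
hottest = max(hot, key=hot.get)  # the busiest cell on the map
```

A production system would add time windows and normalization against historical baselines, which is how "above or below normal" activity can be reported.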

On Wednesday, at the O'Reilly Emerging Technologies conference in San Jose, CA, Tony Jebara, chief scientist for Sense Networks and a professor at Columbia University, detailed plans of a forthcoming update to Citysense that shows not only where people are gathering in real time, but where people with similar behavioral patterns--students, tourists, or businesspeople, for instance--are congregating. A user downloads Citysense to her phone to view the map and can choose whether or not to allow the application to track her own location.

The idea, says Jebara, is that a person could travel to a new city, launch Citysense on her phone, and instantly get a feel for which neighborhoods she might want to spend the evening visiting. This information could also help her filter restaurant or bar suggestions from online recommendation services like Yelp. Equally important, from the company's business perspective, advertisers would have a better idea of where and when to advertise to certain groups of people.

Citysense, which has access to four million GPS sensors, currently offers simple statistics about a city, says Jebara. It shows, for instance, whether the overall activity in the city is above or below normal (Sense Networks' GPS data indicates that activity in San Francisco is down 34 percent since October) or whether a particular part of town has more or less activity than usual. But the next version of the software, due out in a couple of months, will help users dig more deeply into this data. It will reveal the movement of people with certain behavior patterns.

"It's like Facebook, but without the self-reporting," Jebara says, meaning that a user doesn't need to actively update her profile. "We want an honest social network where you're connected to someone because you colocate."

In other words, if you live in San Francisco and go to Starbucks at 4 P.M. a couple of times a week, you probably have some similarities with someone in New York who also visits Starbucks at around the same time. Knowing where a person in New York goes to dinner on a Friday night could help a visitor to the city make a better restaurant choice, Jebara says.

As smart phones with GPS sensors become more popular, companies and researchers have clamored to make sense of all the data that this can reveal. Sense Networks is a part of a research trend known as reality mining, pioneered by Alex Pentland of MIT, who is a cofounder of Sense Networks. Another example of reality mining is a research project at Intel that uses cell phones to determine whether a person is the hub of a social network or at the periphery, based on her tone of voice and the amount of time she talks.



Jebara is aware that the idea of tracking people's movements makes some people uncomfortable, but he insists that the data used is stripped of all identifying information. In addition, anyone who uses Citysense must first agree to let the system log her position. A user can also, at any time, delete her data from the Sense Networks database, Jebara says.

Part of Sense Networks' business plan involves providing GPS data about city activity to advertisers, Jebara says. But again, this does not mean revealing an individual's whereabouts--just where certain types of people congregate and when. For instance, Sense Networks' data-analysis algorithms may show that a particular demographic heads to bars downtown between 6 and 9 P.M. on weekdays. Advertisers could then tailor ads on a billboard screen to that specific crowd.

So far, Jebara says, Sense Networks has categorized 20 types, or "tribes," of people in cities, including "young and edgy," "business traveler," "weekend mole," and "homebody." These tribes are determined using three types of data: a person's "flow," or movements around a city; publicly available data concerning the company addresses in a city; and demographic data collected by the U.S. Census Bureau. If a person spends the evening in a certain neighborhood, it's more likely that she lives in that neighborhood and shares some of its demographic traits.

By analyzing these types of data, engineers at Sense Networks can determine the probability that a user will visit a certain type of location, like a coffee shop, at any time. Within a couple of weeks, says Jebara, the matrix provides a reliable probability of the type of place--not the exact place or location--that a person will be at any given hour in a week. The probability is constantly updated, but in general, says Jebara, most people's behavior does not vary dramatically from day to day.
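The place-type probability matrix described above can be sketched as a simple count-and-normalize over visit logs. This is only an illustration of the idea, not Sense Networks' actual algorithm, and the visit data and place types are invented:

```python
from collections import defaultdict

def place_type_matrix(visits):
    """visits: iterable of (hour_of_week, place_type) observations.

    Returns P[hour][place_type]: the probability that the user is at a
    given *type* of place (not an exact location) during that hour.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for hour, place in visits:
        counts[hour][place] += 1
    matrix = {}
    for hour, row in counts.items():
        total = sum(row.values())
        matrix[hour] = {place: n / total for place, n in row.items()}
    return matrix

# Hypothetical log: at hour 100 of the week this user is usually at a restaurant
log = [(100, "restaurant"), (100, "restaurant"), (100, "bar"), (40, "coffee_shop")]
P = place_type_matrix(log)
```

As new fixes arrive the counts are updated, so the probabilities track behavior continuously, which matches Jebara's point that a couple of weeks of data already yields reliable estimates.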

Sense Networks is exploring what GPS data can reveal about behavior, says Eric Paulos, a professor of computer science at Carnegie Mellon. "It's interesting to see things like this, [something] that was just research a few years ago, coming to the market," he adds. Paulos says it will be important to make sure that people are aware of what data is being used and how, but he predicts that more and more companies are going to find ways to make use of the digital bread crumbs we leave behind. "It's going to happen," he says.

Video of Citysense in action

Thursday 19 February 2009

Bus drivers' skills kept sharp

Original Article - Arizona Star


Joe Riggs, a school bus driver for the Marana district, could afford to laugh it off when he crashed into an SUV — it wasn't real.
Riggs, 64, wasn't at the wheel of his 40-foot-long bus when he hit the other vehicle. He was sharpening his skills in a white trailer equipped with a bus-driving simulation system. Seventy-five Marana Unified School District bus drivers, including Riggs, received one hour of training last week in the driving simulator.

"It's a good training system," Riggs said. "Every driver should have access to a simulator at one time or another." Training concentrated on safe and proper turning techniques and the appropriate use of mirrors.

"There's a lot of blind spots because of the size of the bus, and if drivers are aware of how to use those mirrors, it greatly helps them," said Dean Humphrey, a senior loss-control consultant for the Arizona School Risk Retention Trust Inc., which provided the simulator.

Also known as the Trust, the group provides insurance to Arizona public schools, including Marana, and to community colleges. Marana paid nothing for the one-week use of the mobile simulator.

Humphrey's goal is to train between 2,600 and 2,800 bus drivers all across the state this year. He said about 7,000 school bus drivers are working in Arizona.

Amphitheater, Flowing Wells, Sahuarita and Sunnyside school district bus drivers also will receive training sometime this year, Humphrey said. And he's trying to schedule a training session with the Tucson Unified School District.

"We want to provide a different delivery system for training that's cost-effective as well as effective," Humphrey said.


When Riggs hit the simulated sport utility vehicle, he was practicing his defensive-driving skills on a course in which vehicles dart into the path of the bus.

Drivers sit in front of four screens in a driver's seat that's equipped with all the functions of an actual school bus.

They operate the driving simulator from their point of view and can travel through city, freeway or rural courses. Defensive-driving techniques also can be practiced on skills courses.

"It is great," Riggs said about the driving simulator. "It doesn't teach you, of course, how to drive, but it does make you rely on and learn to use your mirrors more often than you might. And to drive more defensively."

Driving a school bus is the retired stockbroker's second career, but Riggs does have some experience operating large vehicles. Riggs and his wife sold their belongings in Homer, Alaska, four years ago and bought a motor home.

"This gave me a good way to get back to motor home driving, which I dearly loved," Riggs said.

Marana bus drivers viewed the training as an opportunity to hone their skills. They know how important their job is, and additional training will help them be better drivers.

"I think it's great. It's different. I was skeptical," school bus driver Toni Keepers said. "It lets us see what we're doing. It's a different look."

She has driven a school bus for nearly 12 years. She said there's more to the job than driving.

Keepers, 52, doesn't leave the bus yard until she checks the fluid levels on her bus, thoroughly checks the exterior and interior of her bus, and fuels up.

"I take better care of my bus than I do my own car," she said.

Wednesday 18 February 2009

Sun-powered device converts CO2 into fuel

Original Article - New Scientist

Powered only by natural sunlight, an array of nanotubes is able to convert a mixture of carbon dioxide and water vapour into natural gas at unprecedented rates.

Such devices offer a new way to take carbon dioxide from the atmosphere and convert it into fuel or other chemicals to cut the effect of fossil fuel emissions on global climate, says Craig Grimes, from Pennsylvania State University, whose team came up with the device.

Although other research groups have developed methods for converting carbon dioxide into organic compounds like methane, often using titanium-dioxide nanoparticles as catalysts, they have needed ultraviolet light to power the reactions.

The researchers' breakthrough has been to develop a method that works with the wider range of visible frequencies within sunlight.

Enhanced activity

The team found it could enhance the catalytic abilities of titanium dioxide by forming it into nanotubes each around 135 nanometres wide and 40 microns long to increase surface area. Coating the nanotubes with catalytic copper and platinum particles also boosted their activity.

The researchers housed a 2-centimetre-square section of material bristling with the tubes inside a metal chamber with a quartz window. They then pumped in a mixture of carbon dioxide and water vapour and placed it in sunlight for three hours.

The energy provided by the sunlight transformed the carbon dioxide and water vapour into methane and related organic compounds, such as ethane and propane, at rates as high as 160 microlitres an hour per gram of nanotubes. This is 20 times higher than published results achieved using any previous method, but still too low to be immediately practical.

If the reaction is halted early, the device produces a mixture of carbon monoxide and hydrogen known as syngas, which can be converted into diesel.

Copper boost

"If you tried to build a commercial system using what we have accomplished to date, you'd go broke," admits Grimes. But he is confident that commercially viable results are possible.

"We are now working on uniformly sensitising the entire nanotube array surface with copper nanoparticles, which should dramatically increase conversion rates," says Grimes; he expects improvements of at least two orders of magnitude for a given area of tubes.

This work suggests a "potentially very exciting" application for titanium-dioxide nanotubes, says Milo Shaffer, a nanotube researcher at Imperial College, London. "The high surface area, small critical dimensions, and open structure [of these nanotubes] apparently provide a relatively high activity," he says.

Abstract in Nano Letters

Wednesday 4 February 2009

Managing Energy with Swarm Logic

Original Article - MIT Technology Review

Self-organizing equipment could cut energy bills.
By Tyler Hamilton

Smart switch: The controller shown here could improve the energy efficiency of building appliances. The devices communicate wirelessly and use swarming algorithms to collaboratively decide how to manage power usage.
Credit: REGEN ENERGY
Air-conditioning units and heating systems are examples of power-hungry equipment that regularly switches on and off in commercial buildings. When these devices are all switched on at once, power consumption spikes, and a building's owners are left with hefty peak-demand charges on their electricity bills.

A startup based in Toronto says that it has come up with a way to reduce energy use by mimicking the self-organizing behavior of bees. REGEN Energy has developed a wireless controller that connects to the control box on a piece of building equipment and functions as a smart power switch. Once several controllers have been activated, they detect each other using a networking standard called ZigBee and begin negotiating the best times to turn equipment on and off. The devices learn the power cycles of each appliance and reconfigure them to maximize collective efficiency.

The goal is to avoid everything coming on at the same time without sacrificing individual performance. The devices work through this problem using a "swarm algorithm" that coordinates activity without any single device issuing orders.

"Every node thinks for itself," says Mark Kerbel, cofounder and chief executive officer of REGEN Energy, which invented the proprietary algorithm embedded in each device. Before making a decision, he explains, a node will consider the circumstances of other nodes in its network. For example, if a refrigerator needs to cycle on to maintain a minimum temperature, a node connected to a fan or pump will stay off for an extra 15 minutes to keep power use below a certain threshold. "The devices must satisfy the local restraint but simultaneously satisfy the system objective," says Kerbel, adding that a typical building might have between 10 and 40 controllers working together in a single "hive." The devices are simple and quick to install and, because there's no human intervention, require no special training to use.
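REGEN's algorithm is proprietary, but the coordination idea described here, with each node checking its peers' current draw before cycling on and deferring when a threshold would be exceeded, can be sketched roughly as follows. The threshold, loads, and appliance names are invented for illustration:

```python
class Node:
    """A controller attached to one appliance. There is no central
    coordinator: each node inspects its peers before switching on."""

    def __init__(self, name, load_kw, must_run=False):
        self.name = name
        self.load_kw = load_kw
        self.must_run = must_run  # e.g. a fridge holding minimum temperature
        self.on = False
        self.peers = []

    def peer_load(self):
        return sum(p.load_kw for p in self.peers if p.on)

    def decide(self, threshold_kw):
        # A must-run node always cycles on; the others defer whenever
        # the hive's combined draw would exceed the threshold.
        if self.must_run or self.peer_load() + self.load_kw <= threshold_kw:
            self.on = True
        else:
            self.on = False  # wait for a later cycle

# A tiny "hive": a fridge that must run, plus a fan and a pump
fridge = Node("fridge", 3.0, must_run=True)
fan = Node("fan", 2.0)
pump = Node("pump", 2.0)
hive = [fridge, fan, pump]
for n in hive:
    n.peers = [p for p in hive if p is not n]

for n in hive:
    n.decide(threshold_kw=5.0)

total = sum(n.load_kw for n in hive if n.on)  # stays at or below 5.0 kW
```

In this sketch the pump sits out one cycle so the hive never spikes above the threshold, which is exactly the peak-shaving behavior the article describes; a real deployment would also rotate which nodes defer so no appliance is starved.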

It's a dramatic departure from the top-down command model associated with current building-automation systems. Some researchers say that the decentralized approach to energy management offers a cheaper, more effective way to manage supply and demand in a delicately balanced electricity system. Indeed, some believe that it could be an early prescription for an emerging smart grid.

"You're seeing a lot more interest in this on a modest scale," says David Chassin, a scientist at Pacific Northwest National Laboratory's energy-technology group, which is heading up the GridWise smart-grid initiative.

The benefits could extend beyond electricity savings for building owners. Today's electricity system is designed for peak consumption, which means that power plants are built to satisfy those few minutes of each day when power demand surges well above daily averages. By reducing peak demand on a large scale, utilities can maximize the operation of existing power plants while reducing the need to build new plants for occasional use. Another potential benefit is reduced carbon emissions, since power plants that supply peak electricity tend to be less efficient and fueled by coal and natural gas.



George Pappas, a professor of electrical and systems engineering at the University of Pennsylvania and an expert in distributed control systems, says that swarm logic is a natural fit for energy applications. "REGEN is ahead of the curve on this," says Pappas.

Operation within a building is one thing, but less certain is whether swarm logic can be trusted to manage the grid itself. Chassin says that the engineering community is understandably wary of decentralized or "emergent" control systems for the grid because, while they work remarkably well in certain applications, the approach is not well tested.

Kerbel first came up with the idea of using a swarm algorithm to manage power consumption in 2005. "We were politely told that this style of control just isn't ready and requires far more academic research," he says. "It's difficult to think outside the command-and-control box and allow this leap of faith--that is, relinquishing decision-making capabilities to individual nodes of the collective."

It's a bias that Herb Sinnock, manager of the Centennial Energy Institute, in Toronto, admits to having. He says that engineers typically want constant feedback so that they can measure system operation and make refinements. REGEN's technology dispenses with all that, but he notes that its application will allow for some mistakes. "It's not like they're positioning control rods in a nuclear reactor core. We're talking about affecting the temperature in a room by half a degree, so there's room for error," says Sinnock.

Sinnock's institute has been working with REGEN to evaluate the performance of its devices in the field. Tests have so far demonstrated that building owners--of hospitals, hotels, shopping malls, factories, and other large facilities--could save as much as 30 percent on their peak-demand charges. Those savings, REGEN claims, more than cover the cost of renting the devices, which is an option for major electricity consumers reluctant to buy the technology up front. If the devices are purchased, the payback is less than three years, says Kerbel.
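The payback claim is easy to sanity-check with a back-of-the-envelope calculation. The dollar figures below are invented for illustration; only the 30 percent savings rate comes from the article:

```python
def payback_years(device_cost, annual_peak_charges, savings_rate=0.30):
    """Years for peak-demand savings to repay the purchase price."""
    return device_cost / (annual_peak_charges * savings_rate)

# Hypothetical facility: $50,000 of controllers, $60,000/yr in peak charges
years = payback_years(50_000, 60_000)
```

Under these assumed numbers the devices pay for themselves in under three years, consistent with Kerbel's figure.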

The simplicity of the installation is what impresses Sinnock most. "In a few hours, they can have the devices installed and figuring out their environment and surroundings," he says. Pappas, meanwhile, says that he expects there will be much more interest in this type of application over the coming years, pointing to a U.S. economic stimulus package that calls for more investment in energy efficiency and smart-grid technologies. "A lot of the big impact and low-hanging fruit is going to come from using this approach," he says.

Sunday 1 February 2009

SENSEable City - MIT at Davos

The SENSORy bike wheel (giving a mesh network for data acquisition, monitoring ... why, it's endless)


Disruptive Innovation, Applied to Health Care

Original Article - New York Times


THE health care system in America is on life support. It costs too much and saps economic vitality, achieves far too little return on investment and isn’t distributed equitably. As the Obama administration tries to diagnose and treat what ails the system, however, reformers shouldn’t be worried only about how to pay for it.
Dr. Yan Chow, a pediatrician with Kaiser, demonstrates a videoconferencing system that would allow doctors to speak with patients in their homes.

A laser keyboard could be used in spaces too small for a conventional one and might help prevent the spread of infection among hospital workers.
Instead, the country needs to innovate its way toward a new health care business model — one that reduces costs yet improves both quality and accessibility.

Two main causes of the system’s ills are century-old business models, for the general hospital and the physician’s practice, both of which are based on treating illness, not promoting wellness. Hospitals and doctors are paid by insurers and the government for the health care equivalent of piecework: hospitals profit from full beds and doctors profit from repeat visits. There is no financial incentive to keep patients healthy.

“The business models were all created decades ago, and acute disease drove those costs at the time,” says Steve Wunker, a senior partner at the consulting firm Innosight. “Most businesses in this industry are looking at their business model as entirely immutable. They’re looking for innovative offerings that fit this frozen model.”

Advances in technology and medical research are making it possible to envision an entirely new health care system that provides more individualized care without necessarily increasing costs, some health care experts say.

For instance, genetic breakthroughs have helped reveal time and again that what we thought was one disease — Type 2 diabetes, for instance — actually represents a score or more of distinct illnesses, each of which responds best to a different type of therapy, according to medical professionals.

As researchers develop ways to define diagnoses more precisely, more effective treatments can be prescribed, says Matthew Holt, founder of the Health Care Blog and co-founder of the biannual conference Health 2.0. Ultimately, those therapies can be administered by nurse practitioners or others trained to handle routine ailments. The expensive “intuitive medicine” practiced by doctors trained to wade through a thicket of mysterious symptoms in search of an accurate diagnosis can then focus on those cases that truly require their services.

Using innovation management models previously applied to other industries, Clayton M. Christensen, a Harvard Business School professor, argues in “The Innovator’s Prescription” that the concepts behind “disruptive innovation” can reinvent health care. The term “disruptive innovation,” which he introduced in 2003, refers to an unexpected new offering that through price or quality improvements turns a market on its head.

Disruptive innovators in health care aim to shape a new system that provides a continuum of care focused on each individual patient’s needs, instead of focusing on crises. Mr. Christensen and his co-authors argue that by putting the financial interests of hospitals and doctors at the center, the current system gives routine illnesses with proven therapies the same intensive and costly specialized care that more complicated cases require.

“Health care hasn’t become affordable,” he said in an interview, “because it hasn’t yet gone through disruptive decentralization.”

It’s coming, though. Some health care suppliers have set up fixed-fee integrated systems, and accept monthly payments from members in exchange for a promise of cradle-to-grave health care. Each usually also charges a small co-payment for treatment. Routine cases are handled through lower-cost facilities, leaving more complicated cases to higher-cost hospitals and specialists. Such systems include Kaiser Permanente, Intermountain Healthcare in Utah, the Mayo Clinic, the Geisinger Health System in Pennsylvania and the Veterans Health Administration.

By creating a continuum of care that follows patients wherever they go within an integrated system, says the Princeton University economist Uwe Reinhardt, care providers can stay on top of what preventive measures and therapies are most effective. Tests aren’t needlessly duplicated, competing medications aren’t prescribed by different doctors, and everyone knows what therapies a patient has received. As a result, integrated systems like Kaiser’s provide 22 percent greater cost efficiency than competing systems, according to a 2007 study by Hewitt Associates.

Kaiser’s system, in particular, has proved the benefits of an integrated system, Mr. Reinhardt says. “It is much cheaper than pay-for-service systems, because they have absolutely no incentive to overtreat you, but they have every incentive to keep you healthy,” he says. “Kaiser still makes mistakes — any large system does — but their facilities always come out ahead in every service quality survey I’ve reviewed.”

At Kaiser, experimentation with new technologies and business models occurs at the Sidney R. Garfield Health Care Innovation Center in San Leandro, Calif. Kaiser opened the facility in 2006 to test such new technologies as a videoconferencing system linking health care professionals to patients in their homes. Another is a laser-projected keyboard to prevent the spread of germs via computer equipment.

The Stanford economist Alain C. Enthoven, who has been studying the nation’s health care system for more than 30 years, said integrated systems “are the disruptive innovation we need to turn loose on the rest of America.” In a recent report for the Committee for Economic Development, Mr. Enthoven advocates letting consumers choose between traditional fee-for-service plans and less expensive integrated systems, then letting consumers pocket the difference in premiums. “Medicine is a complicated team sport,” he notes. “It takes an integrated system to keep the patient at the center of it.”

DR. JOHN H. COCHRAN, who as executive director of the Permanente Foundation is the highest-ranking physician among Kaiser’s 14,000-plus doctors, says information technology will play a crucial role in revolutionizing the country’s health care system.

“There’s a mythology that I.T. decreases the personal relationship between the physician and the patient,” he said. “In point of fact, it enhances it.”

Bringing business school concepts to bear on health care simply makes sense, Dr. Cochran says.

“We have a financial, macroeconomic, multinational crisis right now that can be paralytic or catalytic,” he said. “Let’s make sure we’re a catalyst.”

Thursday 29 January 2009

Clean Energy Innovations - An Overview

Original Article - MIT Technology Review

As the energy debate rages in political and scientific circles, investment in clean energy technologies continues to rise. Clean Energy Trends 2008, published by research firm Clean Edge, estimates that fuel cells, solar PV, wind energy, and biofuels — a combined $77.3 billion market in 2007 — will grow to $254.5 billion, a 229 percent increase, within a decade. Even ocean energy research, which once ebbed due to its costly nature, has enjoyed a resurgence, attracting $250 million in global capital expenditure since 2004, according to ABS Energy Research. Yet it remains to be seen which clean energy technologies have the brightest commercial future.

Is ocean energy the next wave?

A recent report completed by Greentech Media and the Prometheus Institute for Sustainable Development estimates that the annual market size for ocean energy in the next six years will total $500 million, with installed capacity growing from less than 10MW today to about 1GW during the same period. However, ocean energy remains largely uncharted compared to the progress made on other renewable energy fronts.
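Growing from under 10 MW of installed capacity to about 1 GW in six years implies a steep compound annual growth rate. A one-line sketch, using round figures of 10 MW and 1,000 MW, makes the implied rate explicit:

```python
# Implied compound annual growth rate (CAGR) for the projection above:
# installed ocean energy capacity growing from roughly 10 MW to about
# 1 GW (1,000 MW) over six years.
def cagr(start, end, years):
    """Compound annual growth rate as a fraction (1.0 == +100% per year)."""
    return (end / start) ** (1 / years) - 1

growth = cagr(10, 1000, 6)
print(f"Implied growth rate: {growth:.0%} per year")  # prints: Implied growth rate: 115% per year
```

More than doubling every year for six years running is aggressive even by renewable energy standards, which underlines how early-stage this market still is.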

The Florida Atlantic University Center for Ocean Energy Technology (COET) is working to change that by focusing on ocean energy permitting and policy development, education and workforce development, public outreach, standards development and economic analysis.

[More ...]

Cheap, super-efficient LED lights on the horizon

Original Article - New Scientist

Incandescent tungsten-filament light bulbs face a global switch-off as governments push for energy-efficient fluorescent lamps to become the standard. But the light could soon go out on those lamps too, now that UK materials scientists have discovered a cheaper way to produce LED bulbs, which are three times as efficient as fluorescent lamps.

Although the ultimate dominance of LED lights has long been predicted, the expense of the super-efficient technology has made the timescale uncertain. The researchers now say LED bulbs based on their new process could be commercially available within five years.

Gallium nitride (GaN) LEDs have many advantages over compact fluorescent lamps (CFLs) and incandescent bulbs. They switch on instantly, with no gradual warm-up, and can burn for an average of 100,000 hours before they need replacing -- 10 times as long as fluorescent lamps and some 130 times as long as an incandescent bulb. CFLs also contain small amounts of mercury, which makes environmentally friendly disposal of spent bulbs difficult.

Cracking up

The cost of production has kept the LEDs far from homes and offices, however. Gallium nitride cannot be grown on silicon like other solid-state electronic components because it shrinks at twice the rate of silicon as it cools. Crystals of GaN must be grown at 1000°C, so by the time a new LED made on silicon has cooled, it has already cracked, rendering the devices unusable.

One solution is to grow the LEDs on sapphire, which shrinks at much the same rate as GaN as it cools. But sapphire is too expensive for the resulting bulbs to be commercially competitive.

Now Colin Humphreys's team at the University of Cambridge has discovered a simple solution to the shrinkage problem.

They included layers of aluminium gallium nitride in their LED design. These layers shrink at a much slower rate during cooling and help to counteract the fast shrinkage of pure gallium nitride. As a result, the LEDs can be grown on silicon, just as many other electronic components are. "They still work well as LEDs even with those extra layers inside," says Humphreys.

Early switch-over

A 15-centimetre silicon wafer costs just $15 and can accommodate 150,000 LEDs, making the cost per unit tiny. That levels the playing field with CFLs, which many people only ever saw as a stopgap solution to the lighting problem.
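The per-unit arithmetic behind that claim is straightforward; a quick sketch using the figures quoted above:

```python
# Per-unit silicon cost implied by the figures above: a $15,
# 15-centimetre wafer holding 150,000 LED dies.
wafer_cost_usd = 15.0
leds_per_wafer = 150_000

cost_per_led = wafer_cost_usd / leds_per_wafer
print(f"Silicon cost per LED: ${cost_per_led:.4f}")  # prints: Silicon cost per LED: $0.0001
```

At a hundredth of a cent per die, the substrate effectively vanishes from the per-bulb price; packaging, drivers, and assembly dominate the remaining cost.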

Humphreys reckons that the UK government encouraged consumers to drop tungsten bulbs too soon. "We should have stayed with tungsten for another five years and then switched to LEDs," he says.

Humphreys's team was funded by the UK Engineering and Physical Sciences Research Council. The UK government's Technology Strategy Board will now provide the funding to turn the new technology into a commercial process.

Monday 19 January 2009

Balloon power isn't just a load of hot air

Original Article - New Scientist

For those who dislike the sight of wind turbines on the horizon, would a spectacular hot-air balloon farm be more acceptable?

Ian Edmonds, an environmental consultant with Solartran in Brisbane, Australia, has designed a giant engine with a balloon as its "piston". A greenhouse traps solar energy, providing hot air to fill the balloon. As the balloon rises, it pulls a tether, which turns a generator on the ground. Once the balloon has reached 3 kilometres, air is released through its vent and it loses buoyancy. This means less energy is needed to pull the balloon back down again, resulting in a net power gain (Renewable Energy, DOI: 10.1016/j.renene.2008.06.022). "It is like a huge two-stroke engine, with a capacity of 45 million litres, a stroke of 3 kilometres, and a frequency of one revolution per hour," says Edmonds.
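As a rough plausibility check, here is a back-of-the-envelope sketch of the buoyant lift and gross mechanical power of the 45-million-litre balloon described above. The air densities and ascent speed are our own illustrative assumptions, not figures from Edmonds's paper:

```python
# Back-of-the-envelope lift and gross power for the 45-million-litre
# balloon "piston". The air densities and the ascent-time split are
# illustrative assumptions, not figures from Edmonds's paper.
g = 9.81                # m/s^2
volume_m3 = 45_000      # 45 million litres
rho_ambient = 1.20      # kg/m^3, cool ambient air (assumed)
rho_hot = 1.00          # kg/m^3, greenhouse-heated air (assumed)

lift_N = (rho_ambient - rho_hot) * volume_m3 * g  # net buoyant force

# One revolution per hour over a 3 km stroke: assuming roughly half the
# cycle is spent ascending, the balloon climbs ~3,000 m in ~1,800 s.
ascent_speed = 3000 / 1800                        # ~1.7 m/s

gross_power_W = lift_N * ascent_speed
print(f"Lift: {lift_N / 1000:.0f} kN, gross power: {gross_power_W / 1000:.0f} kW")
```

With these assumptions the gross figure comes out around 150 kW, comfortably above the roughly 50-kilowatt net output Edmonds quotes for a balloon of about this size (a 44-metre sphere holds roughly 45 million litres), leaving room for the energy spent hauling the vented balloon back down and for generator losses.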

Edmonds calculates that a large 44-metre-diameter recreational balloon could generate 50 kilowatts, enough to supply about 10 homes, at roughly the same cost as wind power. Doubling the diameter of the balloon would increase power production tenfold, substantially reducing costs, he says.

Using air heated by the sun to generate power has been attempted before: solar towers use the rising air to turn turbines. But a prototype solar tower in Manzanares in Spain proved too short even at 200 metres, limiting the amount of energy that could be captured from the rising air. Building towers of 500 metres or more has so far proved too expensive.

Science Direct

Wednesday 14 January 2009

Cut power costs with DC power

Original Article - InfoWorld

(Note: The AC vs. DC discussion below, while centered on dense data center applications, is relevant to the New City Design discussion as we attempt to rationalize the integration of micro-generator capability, DC motors, devices that currently rely on low-amperage DC power bricks, and the AC transmission grid. Where is the most appropriate AC/DC boundary, and how do we make it flexible and capable of evolving as our social structures change and as public and private spaces rearrange themselves?)


Our recent article "10 power-saving myths debunked" generated a lot of interest and controversy. One topic that sparked plenty of discussion was the use of DC power in the datacenter. Because all computers use DC power internally, the basic concept is to limit the number of energy-wasting AC-to-DC conversions between the utility pad and the servers and to make those conversions as efficient as possible.

In a typical datacenter environment, power conversions abound along the path from the outside utility pad to the servers, and with each conversion some power is lost. The power starts at the utility pad at 16,000 VAC (volts alternating current), then is converted to 440 VAC, then 220 VAC, then 110 VAC before it reaches the UPSes feeding each server rack. Each UPS converts the incoming AC power to DC, then back to AC, and distributes that AC power to its servers -- where it's converted back to DC yet again. As much as 50 to 70 percent of the electricity that comes into the datacenter is wasted in this long and winding conversion process.
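The cumulative waste is easy to see once the per-stage efficiencies are multiplied together. The individual figures below are illustrative assumptions (the article only gives the overall 50 to 70 percent loss), but they show how quickly even modest per-stage losses compound:

```python
# Losses along the AC conversion chain described above compound
# multiplicatively. Per-stage efficiencies are illustrative assumptions --
# the article only states the overall 50-70 percent loss.
stages = {
    "16 kVAC -> 440 VAC":  0.96,
    "440 VAC -> 220 VAC":  0.96,
    "220 VAC -> 110 VAC":  0.96,
    "UPS AC -> DC":        0.90,
    "UPS DC -> AC":        0.90,
    "server PSU AC -> DC": 0.70,  # older server power supply (assumed)
}

overall = 1.0
for step, efficiency in stages.items():
    overall *= efficiency

print(f"Power reaching the server internals: {overall:.0%}")  # about 50%
```

Six stages, none worse than 70 percent efficient on its own, and still only about half the utility power survives to do useful work.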

There's a more efficient approach, one promoted by Validus DC Systems: taking the utility-supplied 13,000 VAC and converting it directly to 575 VDC (volts direct current) using an outdoor-rated conversion unit, then running power into the datacenter over 1.5-inch cabling. Each rack in the datacenter then has a 575-to-48-VDC converter that is 95 percent efficient. The direct DC approach can save users 50 percent or more through a combination of cooling savings and the elimination of conversion losses, according to Ron Croce, COO of Validus.

It might be tempting to place an AC-DC conversion unit outside the datacenter so that heat dissipation occurs outdoors and to run 48 VDC into the datacenter. However, long runs at 48 VDC suffer from voltage drop, which means that a good deal of power is lost before it gets to the servers -- about 20 percent for every 100 feet of cabling.
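The voltage-drop penalty follows from Ohm's law: for the same delivered power, current scales inversely with voltage, and resistive cable loss scales with the square of the current. A short sketch, with an assumed rack load and cable resistance, shows why 575 VDC distribution loses far less in the cable than 48 VDC:

```python
# For the same delivered power, current scales as 1/V and resistive
# cable loss as I^2 * R, which is why long 48 VDC runs suffer so badly.
# The rack load and cable resistance are assumed, illustrative figures.
power_W = 10_000          # one rack's draw (assumed)
cable_resistance = 0.05   # ohms for the whole run (assumed)

for volts in (48, 575):
    current = power_W / volts
    loss_W = current ** 2 * cable_resistance
    print(f"{volts:>4} VDC: {current:6.1f} A, cable loss {loss_W:7.1f} W "
          f"({loss_W / power_W:.1%} of the load)")
```

With these assumed numbers, the 48 VDC run dissipates on the order of 20 percent of the load in the cable, the same order as the per-100-foot figure quoted above, while the 575 VDC run loses only a fraction of a percent.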

One of the common arguments against using DC power in the datacenter is that machines don't support it: Most servers run on 110 VAC, which is then converted internally into 5 VDC and 12 VDC. However, with the use of DC power gaining some traction in datacenters, a number of server vendors, including HP, IBM, and Sun, are making DC power supplies available on some or all of their server lines, such that the machines can run on 48 VDC. HP's next generation of server chassis will be the same for all AC- and DC-powered systems, with modular power supplies.

Moreover, some large systems, such as the IBM P Series, are already designed to use 575 VDC. Although there is no current standard for high-voltage DC power in datacenters, Panduit and other companies are working on a standardized 400-VDC connector and cabling solution. General Electric is currently working on listing 600-VDC circuit breakers with Underwriters Laboratories. These breakers already function at 600 VDC but were not previously rated because there was no demand.

In addition to providing cost savings through higher efficiency, DC systems may also provide an opportunity to expand datacenter capacity: Many existing datacenters are using only part of their available square footage because they can't get more power or cooling capacity. Some telecommunications centers are finding that newer rack systems require 100 watts per square foot rather than the old standard of 40 watts per square foot. Due to lack of space, their buildings can't support the 4-foot-thick bundles of cabling necessary for that much 48-VDC power. Moving to high-voltage DC could get around these limitations, because the required cabling would be just 1.5 inches thick.

In many areas, including New York, the San Francisco Bay Area, and Los Angeles, companies are unable to get additional power from the local utilities. Increasing the efficiency of existing systems could also allow companies to continue to use existing buildings and still expand datacenter capacity.

A high-voltage power system like that from Validus requires substantial installation and investment, including running large-diameter cabling from the utility pad outside into the datacenter, installing the 575-to-48-VDC converters for each rack, and converting servers to 48 VDC. However, saving 50 percent or more on power over many years represents a big return. For companies that are unable to increase datacenter capacity by buying more power capacity, turning to DC may be the only solution.

Monday 5 January 2009

Report: Toyota developing solar powered green car

By YURI KAGEYAMA, AP Business Writer

TOKYO – Toyota Motor Corp. is secretly developing a vehicle that will be powered solely by solar energy in an effort to turn around its struggling business with a futuristic ecological car, a top business daily reported Thursday.

The Nikkei newspaper, however, said it will be years before the planned vehicle will be available on the market. Toyota's offices were closed Thursday and officials were not immediately available for comment.

According to The Nikkei, Toyota is working on an electric vehicle that will get some of its power from solar cells equipped on the vehicle, and that can be recharged with electricity generated from solar panels on the roofs of homes. The automaker later hopes to develop a model totally powered by solar cells on the vehicle, the newspaper said without citing sources.

The solar car is part of efforts by Japan's top automaker to grow during hard times, The Nikkei said.

In December, Toyota stunned the nation by announcing it will slip into its first operating loss in 70 years, as it gets battered by a global slump, especially in the key U.S. market. The surging yen has also hurt the earnings of Japanese automakers.

Still, Toyota is a leader in green technology and executives have stressed they won't cut back on environmental research despite its troubles.

Toyota, the manufacturer of the Lexus luxury car and Camry sedan, has already begun using solar panels at its Tsutsumi plant in central Japan to produce some of its own electricity.

The solar panels on the roofs cover the equivalent of 60 tennis courts and produce enough electricity to power 500 homes, according to Toyota. That cuts carbon dioxide emissions by 740 tons a year, the equivalent of using 1,500 barrels of crude oil.

Toyota is also likely to indirectly gain expertise in solar energy when its partner in developing and producing hybrid batteries, Panasonic Corp., takes over Japanese rival Sanyo Electric Co., a leader in solar energy, early next year.