Thursday, January 13. 2011
Via MIT Technology Review
-----
Big computing providers are developing energy-saving strategies for new server farms.
By Cindy Waxer
When it came time for Hewlett-Packard to decide on a location for its new data center, the company could have considered variables like network connectivity, local talent, or proximity to corporate headquarters. Instead, a 100-year weather report convinced HP to build its new 360,000-square-foot facility in breezy Billingham, England.
Server farm: Yahoo’s data center in Lockport, New York, was inspired by a chicken coop and lets air naturally vent through the top.
Credit: Yahoo
"You get a lot of cool and moist winds coming over the northeast coast of Britain," says Ian Brooks, HP's European head of sustainable computing. By harnessing these winds with massive fans, Brooks says, HP has created a system that uses 40 percent less energy than conventional methods of keeping data centers cool.
HP isn't the only company taking its cues from nature when it comes to the design and construction of data centers, clusters of server computers that run Internet services and store and crunch data. These facilities have been the smokestacks of the digital era because they use so much electricity: not only does it take a lot of power to run the machines themselves, but data centers are heavily air conditioned because servers generate a lot of heat and don't run well in environments much warmer than 25 ºC. As demand for online services skyrockets, the EPA predicts, U.S. data centers could nearly double their 2006 levels of energy consumption by 2011, reaching 100 billion kilowatt-hours per year—enough to power 10 million homes. By 2020, data centers will account for 18 percent of the world's carbon emissions, according to the Smart 2020 report released by the Climate Group, a nonprofit organization.
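The EPA comparison is straightforward to sanity-check: 100 billion kilowatt-hours spread across 10 million homes implies about 10,000 kWh per household per year, roughly the average annual electricity use of a US home. A quick back-of-the-envelope check, using only the figures quoted in the article:

```python
# Sanity-check the EPA comparison cited above:
# 100 billion kWh/year of projected data-center use vs. 10 million homes.
data_center_kwh_per_year = 100e9   # projected annual US data-center consumption
homes_powered = 10e6               # number of homes cited in the article

kwh_per_home = data_center_kwh_per_year / homes_powered
print(kwh_per_home)  # 10000.0 kWh per household per year
```

The implied 10,000 kWh per household is consistent with typical US residential consumption, so the article's two figures agree with each other.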
To reduce the environmental—and financial—burdens, more and more companies are trying innovative designs for data centers. For instance, at the HP center in Britain, known as Wynyard, fans more than two meters in diameter pull the North Sea winds into a mixing chamber, where they cool the warm air given off by the center's servers. That air is funneled into a large cavity beneath the servers, directed through vents in the floor, and then circulated throughout a series of aisles to chill the computers. The resulting warm exhaust is extracted, mixed with the incoming fresh air, and recirculated.
By eliminating the need for energy-intensive cooling equipment, the Wynyard facility emits 12,500 fewer metric tons of carbon dioxide than an industry-standard data center, the equivalent of taking nearly 3,000 midsize vehicles off the road.
Another innovative data center is one that Yahoo opened in September 2010 in Lockport, New York. In this case, the inspiration came from chicken coops rather than coastal winds. "Chickens throw off a fair bit of heat; servers throw off a fair bit of heat," says Christina Page, Yahoo's director of climate and energy strategy. "So we built a long, tall, narrow building with a coop along the top to vent the air."
Drawn in: At Hewlett-Packard’s data center in Billingham, England, large fans pull in fresh air.
Credit: HP
The 155,000-square-foot facility mimics the narrow design of a chicken coop and features louvers along the sides of the building so that prevailing winds can flow freely throughout the halls. On particularly hot days, the center can activate an evaporative cooling system, which uses less energy than traditional chillers. That means the facility uses at least 95 percent less water than a conventional data center, and 40 percent less energy—enough to power more than 9,000 households annually. What's more, with its preconstructed metal components, the chicken-coop structure can be assembled in less than six months.
"There's a good case to be made for the return on investment on a lot of green practices," says Page. "This data center was cheaper and faster to build, in addition to being more efficient on the operating-expenditure side."
The information-management company Iron Mountain, meanwhile, is taking advantage of natural geothermal conditions to slash energy consumption by locating a data center in a former limestone mine, 22 stories below ground. Iron Mountain's storage facility in Butler County, Pennsylvania, houses Room 48, whose racks of servers rely on the natural cooling properties of the limestone walls to remain at 13 ºC. Iron Mountain also developed a high-static air pressure differential cooling system that relies on high-velocity ducts, located in the cold aisles separating rows of servers, and linear return ducts in its hot aisles. The system creates winds that naturally cause cold air to sink and hot air to rise and exit the room through perforated ceiling tiles. The absence of air conditioners not only freed up about 30 percent more space in Room 48 but cut energy consumption for cooling by 10 to 15 percent compared with traditional data centers.
These are the kinds of unheralded changes that can really make a difference, says Mark Lafferty, director of strategic solutions at technology services provider CDW. "The really basic, non-glamorous, non-sexy stuff companies do can have a dramatic effect on the amount of resource consumption in a data center," he says.
Copyright Technology Review 2011.
Tuesday, January 11. 2011
Via Rhizome
-----
by Ceci Moss
Friday, January 07. 2011
Via ArchDaily
-----
by Alison Furuto
All are invited to the free Critical Futures event, moderated by Joseph Grima (Domus), starting at 6:30pm on January 13th. It will focus on a debate on the future of architecture criticism, followed by complimentary drinks and further discussion after the talk. Participants include Charles Holland (author, Fantastic Journal), Peter Kelly (Blueprint), Kieran Long (architecture critic, Evening Standard), Geoff Manaugh (author, BLDGBLOG), and Beatrice Galilee (writer, curator, DomusWeb, The Gopher Hole). The event is located at The Gopher Hole, 350-354 Old Street, London, EC1V 9NQ. More event description after the break.
Over the past decade, epochal transformations have profoundly reshaped the context within which architecture is conceived and debated. The Internet has made images and information free and instantly ubiquitous; magazines, once the undisputed platforms for the criticism of architecture and design, have been challenged to redefine their purpose and economic model in the light of dwindling readerships; blogs have given a global audience, potentially of millions, to anyone with an Internet connection. In all of this, architecture criticism in the traditional sense appears to have all but vanished – not only from the Internet but from magazines themselves. As Peter Kelly, editor of Blueprint, wrote in a recent editorial, “As traditional publishing media and institutions become less influential, one wonders where architects can go to find informed, intelligent criticism of their work”.
Does, as author of BLDGBLOG Geoff Manaugh proposes, the designer of the videogame Grand Theft Auto have more influence as an architect than David Chipperfield? Is criticism in the traditional sense still relevant or useful? If the role of the print publication in contemporary production irreversibly declines, what is its future role? What forces will shape architectural production in a post-critical environment? Is, as Kelly writes, a more realistic and rigorous approach to architectural criticism online urgently needed?
As the first in a three-part series of debates on the future of architecture criticism organized by Domus in London, Milan and New York to celebrate the launch of its new website, this discussion will bring together writers, editors, bloggers and theorists active in the field today to address these and other questions.
The event will be hosted by The Gopher Hole, an exhibition and events space in Shoreditch, London.
Wednesday, January 05. 2011
Via MIT Technology Review
-----
Software-designed microbes could make biofuels and drugs.
By Katherine Bourzac
Bio coder: Christopher Voigt, an associate professor at the University of California, San Francisco, is developing software to speed up designing microbes that produce biofuels and other useful chemicals.
Credit: Technology Review
Genetically modified microbes can produce biofuels, drugs, and other products efficiently, and can do arduous work such as cleaning up toxic waste. But designing the complex biochemical pathways that modified microbes need to perform such tasks is like building a Rube Goldberg machine. Getting the genetic designs right is a time-consuming process of trial and error.
Christopher Voigt, an associate professor at the University of California, San Francisco, hopes to change that with software that automates the creation of "genetic circuits" in microbes. These circuits are the pathways of genes, proteins, and other biomolecules that the cells use to perform a particular task, such as breaking down sugar and turning it into fuel. Voigt and colleagues have so far made basic circuit components in E. coli. They are working with the large California biotechnology company Life Technologies to develop software that would let bioengineers design complete genetic circuits more easily.
Designing a microbe for a particular task would then be much like writing a new computer program, says Voigt. Just as programmers do not have to think about how electrons move through the gates in an integrated circuit, he says, biological engineers may eventually be able to design circuits for genes, proteins, and other biomolecules at a level of abstraction. "If we apply computational processes to things that bacteria can already do, we can get complete control over making spider silk, or drugs, or other chemicals," he says.
Certain types of circuits could, for instance, help regulate the activity of bacteria that produce biofuels. Instead of outside controls, internal circuits could maintain the chemical levels and other conditions needed to keep bacteria producing at high yields. "We're trying to make the cell understand where it is and what it should be doing based on its understanding of the world," says Voigt. Trying to design such a control circuit without the help of a computer would take a lot of trial and error.
Voigt has now made a type of circuit component called a NOR gate in E. coli bacteria. NOR gates can be combined to perform any logical operation. In work described in the journal Nature, Voigt's group also showed they could improve the quality of the output of bacterial circuits by having them work collectively, forming a circuit of NOR gates, one in each cell. Voigt has designed bacterial circuits to hook into natural bacterial communication systems called quorum sensing, so that the cells can "vote" on an output. This increases the quality of the computation performed.
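The claim that NOR gates suffice for any logical operation (functional completeness) is easy to verify in software. A minimal Python sketch, with illustrative function names not taken from Voigt's paper, builds NOT, OR, and AND from NOR alone:

```python
# NOR is functionally complete: NOT, OR, and AND can all be built
# from NOR gates alone, which is why a single gate type implemented
# in E. coli is enough to compose arbitrary logic.

def nor(a, b):
    """The only primitive gate: true iff both inputs are false."""
    return not (a or b)

def not_(a):
    # NOT(a) = NOR(a, a)
    return nor(a, a)

def or_(a, b):
    # OR(a, b) = NOT(NOR(a, b))
    return nor(nor(a, b), nor(a, b))

def and_(a, b):
    # AND(a, b) = NOR(NOT(a), NOT(b))
    return nor(nor(a, a), nor(b, b))

# Exhaustively check all four derived gates against Python's built-ins.
for a in (False, True):
    for b in (False, True):
        assert or_(a, b) == (a or b)
        assert and_(a, b) == (a and b)
    assert not_(a) == (not a)
```

In the bacterial version each gate lives in a separate cell and the "wires" are quorum-sensing signal molecules, but the composition principle is the same.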
"This breakthrough work in synthetic biology expands our capacity to construct functional, programmable bacteria," says James Collins, a professor of biomedical engineering at Boston University who is not affiliated with Voigt's team. Collins observes that the California researchers have learned to combine simple circuits in individual cells to make a more complex circuit at the population level. "This represents an important step towards harnessing the power of synthetic ecosystems for biotech applications," he says.
The University of California researchers are now entering the second year of a research agreement with Life Technologies to develop software to automate the biological design process. "The vision is to take these software modules and develop them so that the process of biological parts selection and circuit design is far more automated and simplified than it is today," says Todd Peterson, vice president of synthetic biology research and development at the company. The company hopes to incorporate most of the software modules being designed by Voigt's group into its Vector NTI software by the end of spring 2012.
Copyright Technology Review 2011.
Personal comment:
That move was expected and is now coming (soon to a software near you?): when code meets biology that meets design.
Monday, January 03. 2011
Old post but one of the most requested SPACEINVADING projects on archinect.com
Via bldgblog
---
Remnants of the Biosphere by Biospheric Design
Location: Arizona
Image Credits: Noah Sheldon
Photographer Noah Sheldon got in touch the other week with a beautiful series of photos documenting the decrepit state of Biosphere 2, a semi-derelict bio-architectural experiment in the Arizona desert.
The largest sealed environment ever created, constructed at a cost of $200 million, and now falling somewhere between David Gissen's idea of subnature—wherein the slow power of vegetative life is unleashed "as a transgressive animated force against buildings"—and a bioclimatically inspired Dubai, Biosphere 2 even included its own one million-gallon artificial sea.
"The structure was billed as the first large habitat for humans that would live and breathe on its own, as cut off from the earth as a spaceship," the New York Times wrote back in 1992, but the project was a near-instant failure.
Scientists ridiculed it. Members of the support team resigned, charging publicly that the enterprise was awash in deception. And even some crew members living under the glass domes, gaunt after considerable loss of weight, tempers flaring, this winter threatened to mutiny if management did not repair a growing blot on the project's reputation.
The entire site was sold to private developers in 2007, leaving the buildings still functional and open for tours but falling apart.
Sheldon was originally inspired to visit and photograph the site after reading in the New York Times that "suburban sprawl" had come to surround the once-remote research site.
Indeed, we read, real estate development has "conquered vast swaths of the Sonoran Desert. The Biosphere, miles from nowhere when it was built in the 1980s, is now within the reach of a building boom streaking north from Tucson and south from Phoenix (and which some demographers say will eventually join the two cities, once 100 miles apart)." Traffic jams are not infrequent where there were once country roads, and new suburbs have sprung up within just a few miles of the research site.
Now, like something straight out of J.G. Ballard, the property might someday be home to a development called Biosphere Estates.
Sheldon's images, reproduced here with his permission, show the facility advancing into old age. A vast biological folly in the shadow of desert over-development, the project of Biosphere 2 seems particularly poignant in this unkempt state.
The fertile promise of the microcosm has been abandoned.
In this context, Biosphere 2 could perhaps be considered one of architect Francois Roche's "buildings that die," a term Roche used in a recent interview with Jeffrey Inaba. Indeed, in its current state Biosphere 2 is easily one of the ultimate candidates for Roche's idea of "corrupted biotopes"; the site's ongoing transformation into suburbia only makes this corruption more explicit.
Watching something originally built precisely as a simulation of the Earth (the "2" in "Biosphere 2" is meant to differentiate this place from the Earth itself, i.e. Biosphere 1) slowly taken over by the very forces it was meant to model is philosophically extraordinary: the model taken over by the thing it represents. A replicant in its dying throes.