Tuesday, August 02. 2016
By fabric | ch
As we lack a decent search engine on this blog and don't use a "tag cloud" either, and because August is certainly one of the best periods of the year to spend time reading and digging into past content and topics:
HERE ARE ALL THE CURRENT UPDATED CATEGORIES TO NAVIGATE ON | RBLG BLOG:
(to be seen below if you're navigating on the blog's page or here for rss readers)
Posted by Patrick Keller in fabric | ch, Architecture, Art, Culture & society, Design, Interaction design, Science & technology, Sustainability, Territory at 17:01
Defined tags for this entry: architecture, art, culture & society, design, fabric | ch, interaction design, science & technology, sustainability, territory, toorop
Monday, December 21. 2015
Note: for all friends, artists and architects who will still be around in Lausanne next Tuesday (22.12), let's all meet at Circuit gallery for Philippe Rahm's talk about his first novel, Météorologie des sentiments, published last spring. The book is a pilgrimage through many past memories, in a non-linear way and under the combined angles of feelings and meteorology (as the title of the book states...)
The novel is indeed closely related to Philippe's practice and teaching as an architect, with which we share many interests!
Météorologie des sentiments
Monday, November 23. 2015
Note: this article was published a while ago and has already been reblogged here and there. I also kept it for a long time in my pile of "interesting articles to read later when I'll have time". But it makes sense to post it in conjunction with the previous one about Norman Foster and, by extension, with the other one concerning the Chicago Biennial.
It is also sometimes interesting to read posts with delay, once the hype and buzzwords are gone. Written in the aftermath of Tesla's announcement of its home battery (Powerwall), the article was all about energy revolution. But since then, what? We're definitely looking forward...
Nobody wants to say it outright, but the Apple Watch sucks. So do most smartwatches. Every time I use my beautiful Moto 360, its lack of functionality makes me despair. But the problem isn’t our gadgets. It’s that the future of consumer tech isn’t going to come from information devices. It’s going to come from infrastructure.
That’s why Elon Musk’s announcements of the new Tesla battery line last night were more revolutionary than Apple Watch and more exciting than Microsoft’s admittedly nifty HoloLens. Information tech isn’t dead — it has just matured to the point where all we’ll get are better iterations of the same thing. Better cameras and apps for our phones. VR that actually works. But these are not revolutionary gadgets. They are just realizations of dreams that began in the 1980s, when the information revolution transformed the consumer electronics market.
But now we’re entering the age of infrastructure gadgets. Thanks to devices like Tesla’s household battery, Powerwall, electrical grid technology that was once hidden behind massive barbed wire fences, owned by municipalities and counties, is now seeping slowly into our homes. And this isn’t just about alternative energy like solar. It’s about how we conceive of what technology is. It’s about what kinds of gadgets we’ll be buying for ourselves in 20 years.
It’s about how the kids of tomorrow won’t freak out over terabytes of storage. They’ll freak out over kilowatt-hours.
Beyond transforming our relationship to energy, though, the infrastructure age is about where we expect computers to live. The so-called internet of things is a big part of this. Our computers aren’t living in isolated boxes on our desktops, and they aren’t going to be inside our phones either. The apps in your phone won’t always suck you into virtual worlds, where you can escape to build treehouses and tunnels in Minecraft. Instead, they will control your home, your transit, and even your body.
Once you accept that the thing our ancestors called the information superhighway will actually be controlling cars on real-life highways, you start to appreciate the sea change we’re witnessing. The internet isn’t that thing in there, inside your little glowing box. It’s in your washing machine, kitchen appliances, pet feeder, your internal organs, your car, your streets, the very walls of your house. You use your wearable to interface with the world out there.
It makes perfect sense to me that a company like Tesla could be at the heart of the new infrastructure age. Musk’s focus has always been relentlessly about remolding the physical world, changing the way we power our transit — and, with SpaceX, where future generations might live beyond Earth. The opposite of cyberspace is, well, physical space. And that’s where Tesla is taking us.
But in the infrastructure age, physical space has been irrevocably transformed by cyberspace. Now we use computers to experience the world in ways we never could before computer networks and data analysis, using distributed sensor devices over fault lines to give people early warnings about earthquakes that are rippling beneath the ground — and using satellites like NASA’s SMAP to predict droughts years before they happen.
Of course, there are the inevitable dangers that come with infusing physical space with all the vulnerabilities of cyberspace. People will hack your house; they’ll inject malicious code into delivery drones; stealing your phone might become the same thing as stealing your car. We’ll still be mining unsustainably to support our glorious batteries and photovoltaics and smart dance clubs.
But we will also benefit enormously from personalizing the energy grid, creating a battery-powered hearth for every home. Plus the infrastructure age leads directly into outer space, to tackle big problems of human survival, and diverts our impoverished attention spans from gazing neurotically at the social scene unfolding in tiny glowing rectangles on our wrists.
The information age brought us together, for better or worse. It allowed us to understand our environment and our bodies in ways we never could before. But the infrastructure age is what will prevent us from killing ourselves as we grow up into a truly global civilization. That is far more important, and exciting, than any gold watch could ever be.
Note: Meanwhile, on the "big architects" end of the spectrum... where I enjoyed reading the sentence "Foster is delighted that Britain now has an infrastructure commission, chaired by Andrew Adonis, which he says gives the opportunity to plan in 30-year cycles and remove the politics from infrastructure."
Via The Guardian
By Rowan Moore
Norman Foster’s Millau viaduct in France, which has ‘cut out five-hour traffic jams’. Photograph: Michael Reinhard/Corbis
“Do you believe in infrastructure?” asks Norman Foster, with challenge in his voice. He does. Infrastructure, he says, is about “investing not to solve the problems of today but to anticipate the issues of future generations”. He cites his hero, Joseph Bazalgette, who, in solving Victorian London’s sewage problems, “thought holistically to integrate drains with below-ground public transportation and above-ground civic virtue”.
Foster is delighted that Britain now has an infrastructure commission, chaired by Andrew Adonis, which he says gives the opportunity to plan in 30-year cycles and remove the politics from infrastructure. He will expound these views this week at the Urban Age 10th anniversary Global Debates, Urban Age being the LSE’s Deutsche Bank-sponsored series of conferences in which high-powered and highly powerful people travel the world exchanging views on city building.
Statistics spin out of him about sustainability. “If you take the carbon footprint of London, that’s one seventh of that of Atlanta, so there’s a relationship between density and emissions. The whole climate change issue, which many would argue is about the survival of the species, comes down to urbanism.
Foster’s proposed design for the Thames Hub airport. Photograph: dbox/Foster & Partners
“When I was in Harvard recently, I said that each of us in this room, the energy that we consume in one year would equal the energy consumed by two Japanese, 13 Chinese, 31 Indians and 370 Ethiopians. So you start to take the relationship between energy consumed by a society and infant mortality, life expectancy, sexual freedom, academic freedom, freedom from violence. So those societies that consume more energy have more of those desirable qualities, so all those issues are inseparable from the nature of the infrastructure.” The connections between these points are not always clear, but the argument seems to be that better use of energy through better infrastructure will enable more people to live better.
Of his own work, Foster says that many of the most important projects are not what are normally considered buildings, but things such as the Millennium Bridge, the pedestrianisation of Trafalgar Square in London, the Millau viaduct in southern France and the remaking of the Marseille waterfront. More statistics: "Millau cut out five-hour traffic jams, which meant that the saving in CO2 from the 10% of traffic that is heavy goods vehicles had an effect equivalent to a forest of 40,000 trees."
He has campaigned vigorously for the Thames Hub, a new airport in the Thames estuary with an associated network of huge ambition: an orbital railway around London, a flood barrier, tidal energy generation. He is profoundly disappointed that his plan is likely to be rejected in favour of an expanded Heathrow: “The reality of a hub airport is that you can never ever do that at Heathrow. If you do that at Heathrow now you can absolutely guarantee that we will still be pedalling furiously to stand still. You can never accommodate long-term needs there.”
Norman Foster: ‘The whole climate change issue comes down to urbanism.’ Photograph: Manolo Yllera
But given what he just said about sustainability, should we be expanding airports at all? “Do you eat meat?” he asks scathingly. “You’re probably going to have your hamburger in spite of the fact that you’re going to make a much greater impact than any travel.” Air travel, he says, “compares well statistically with the amount of methane produced by cows and the amount of energy and water needed to produce a hamburger”.
“The reality is that all society is embedded in mobility. You’re going to take that flight. You’d be better to take the flight out of an airport that is driven by tidal power and which uses natural light, and which anticipates the day when air travel will be more sustainable.” He talks of solar-powered flight and planes made of lightweight composite materials.
It could also be asked what is the role of the architect in what is generally the province of engineers, planners and politicians. Around us is evidence of his practice’s apparent potency – towers in China and India, a model of the giant circle, one mile in circumference, which will be Apple’s new headquarters, images of a concept for habitats on Mars – but Foster says: “I have no power as an architect, none whatsoever. I can’t even go on to a building site and tell people what to do.” Advocacy, he says, is the only power an architect ever has.
To write about Foster presents a particular challenge to an architecture critic. The scale of his achievement is immense and he has created many outstanding buildings. A wise man recently pointed out that if Foster had only built his 20 or 30 best works, critical admiration would be virtually unqualified. It is largely because his practice has designed many more projects than this that he sometimes gets a bad press. But would it really have been better if he had confined himself to a boutique practice in order to preserve his architectural purity?
It can seem peevish and petty to question his work, but it is not beyond criticism. In particular, it can become weaker the more it makes contact with realities outside itself. If you look upwards in the Great Court he designed in the British Museum, you will see an impressive structure of steel and glass, but at your own level it becomes bland and sometimes clumsy. The Gherkin is a memorable presence on the London skyline, but awkward at pavement level. The Millennium Bridge, even with the modifications necessary to stop it wobbling, is confident and elegant except at its landing, where the overhang of its cantilever creates spaces that are plain nasty.
In the context of infrastructure, the question is also whether it adapts to the political, social and physical conditions that surround it. In answer to Foster’s question, yes, I do believe in infrastructure. Or, rather, I’d compare it to water: essential to existence, life-enhancing and sometimes beautiful, but with the power to damage and destroy if misused.
Design for the proposed drone-port project in Rwanda. Photograph: Foster & Partners
All this makes a new drone-port project in Rwanda one of Foster and Partners’ most intriguing. Conceived with Jonathan Ledgard, the director of Afrotech, who describes himself as a thinker on the future of Africa, it is a plan to create a network of cargo drones that can bring medical supplies and blood, plus spare parts, electronics and e-commerce, to hard-to-access parts of Africa. The drones have ports – shelters where they can safely land and unload, but which also serve as “a health clinic, a digital fabrication shop, a post and courier room, and an e-commerce trading hub, allowing it to become part of local community life”. Because of their inaccessible locations, they have to be built using materials close to hand, so techniques have been developed for efficiently making local earth into bricks and stones into foundations.
It is impossible at this point and at this distance to know if the drone-port project will achieve what it hopes, but its ambition to adapt to local conditions seems absolutely to the point. The interesting question is then how to bring the same thinking to infrastructure in a developed country, such as Britain. What is the right infrastructure for the society and culture of this country, at this point? Has it changed since Foster’s Victorian heroes, such as Bazalgette, did their work? Can we import the large-scale thinking of modern China and, if so, with what modification? These are good questions for an architect to address.
Urban Age Global Debates run until 3 December; lsecities.net/ua
Wednesday, October 21. 2015
Note: suddenly speaking about web design, wouldn't it be time to start doing some interaction design on the web again? Aren't we in need of some "net art" approach, some weirder propositions than the too-slick "responsive design" of a predictable "user-centered" or even "experience" design dogma? These kinds of complex web/interaction experiences have almost all vanished (remember Jodi?), to the point that there is now a vast experimental void for designers to tap into again!
Well, after the site that can only be browsed by one person at a time (with a poor visual design indeed), here comes the one that self-destructs. Could be a start... Btw, thinking about files, sites, contents, etc. that would destroy themselves would probably help save lots of energy in data storage, hard drives and datacenters of all sorts, where these data sit like zombies.
By Isis Madrid
Former head of product at Flickr and Bitly, Matt Rothenberg recently caused an internet hubbub with his Unindexed project. The communal website continuously searched for itself on Google for 22 days, at which point, upon finding itself, spontaneously combusted.
In addition to chasing its own tail on Google, Unindexed provided a platform for visitors to leave comments and encourage one another to spread the word about the website. According to Rothenberg, knowledge of the website was primarily passed on in the physical world via word of mouth.
“Part of the goal with the project was to create a sense of unease with the participants—if they liked it, they could and should share it with others, so that the conversation on the site could grow,” Rothenberg told Motherboard. “But by doing so they were potentially contributing to its demise via indexing, as the more the URL was out there, the faster Google would find it.”
When the website finally found itself on Google, the platform disappeared and this message replaced it:
If you are interested in creating a similar self-destructing site, feel free to start with Rothenberg’s open source code.
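The mechanic is simple to restate in code. Below is a minimal sketch of the core loop, with the search lookup injected as a plain function so the example runs offline — Rothenberg's real, open-source implementation queried Google itself; everything here (URL, stubbed search pages, function names) is illustrative:

```python
def is_indexed(url, fetch):
    """Return True if `url` appears in the search-result page returned by
    `fetch`, a callable taking a query string. Injecting `fetch` keeps the
    sketch testable without a live search engine."""
    return url in fetch(f'"{url}"')

def check_and_destroy(url, fetch, destroy):
    """One polling cycle: if the site has been indexed, call `destroy`
    (e.g. wipe the content, leaving only a farewell message)."""
    if is_indexed(url, fetch):
        destroy()
        return True
    return False

# Demo with a stubbed search engine: the site "finds itself" on the
# third check, at which point it self-destructs.
site_url = "https://example.org/unindexed"
pages = ["nothing here", "still nothing", f"result: {site_url}"]
state = {"alive": True, "checks": 0}

def fake_fetch(query):
    page = pages[min(state["checks"], len(pages) - 1)]
    state["checks"] += 1
    return page

destroyed_on = None
for cycle in range(5):
    if check_and_destroy(site_url, fake_fetch, lambda: state.update(alive=False)):
        destroyed_on = cycle
        break

print(destroyed_on, state["alive"])  # found on the third check (cycle 2)
```

The interesting design twist is exactly the one Rothenberg describes: every act of sharing the URL makes `is_indexed` return True sooner.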
Friday, October 02. 2015
The design research Inhabiting and Interfacing the Cloud(s) will be presented during the peer-reviewed Renewable Futures Conference next week in Riga (Latvia), the first edition of a series that promises to scout for radical approaches.
Christophe Guignard will introduce the participants to the stakes and progress of our ongoing experimental work. There will be high-profile and inspiring speakers such as Lev Manovich, John Thackara, Andreas Broeckmann, etc.
Christophe Guignard will post a short "follow up" about the conference on this blog once he's back from Riga.
Friday, August 14. 2015
Note: I've been interested in the idea of the commune for some time now --digging into old stories, like those of the aptly named Haight-Ashbury Diggers, or the Droppers, in connection to systems theory, cybernetics and information theory, and then of course to THE Personal Computer as "small scale technology", as well as to "the biggest commune of all: the internet" (F. Turner)--.
The idealistic social flatness of the communes, anarchic yet with an inevitable emerging order, their "counter" approach to Western social organization, but also the fact that, in the end, the 60s initiatives seem to have "failed" for different reasons, all interest me for further works. These "diggings" are also somehow connected to an ongoing project and tool we recently published online, a "data commune": Datadroppers (even if it is just a shared tool).
Following this interest, I came across this latest online publication by uncube (Issue #34) about the Commune Revisited, which takes a historical approach to old experiments (like Drop City) as well as looking at more recent ones, up to the "gated community"... The idea of the editors being to investigate the diversity of the concepts. It brings an interesting contemporary twist and understanding to the general idea... in a time when we are totally fed up with neoliberalism.
"One year after our Urban Commons issue, we're returning to the idea of the communal, this time investigating just how diversely the concept of "commune" can be interpreted - and not always with entirely benevolent intentions or successful results.
Whether trying to escape a broken economy or an oppressive system via new forms of existence, or looking to break the system itself via anarchic methodologies, forming a commune traditionally involves segregation or stepping "outside" society.
But no matter how off-grid and back-to-nature the contemporary communities that we investigate here are, it turns out they are far more connected than we think.
Turn on, tune out, drop in.
Sunday, February 01. 2015
By fabric | ch
Alongside the different projects we are undertaking at fabric | ch, we continue to work on self-initiated research and experiments (slowly, way too slowly... time is of course lacking). Deterritorialized House is one of them, introduced below.
Some of these experimental works concern the mutating "home" program (considered as "inhabited housing"), which is obviously a historical one for architecture but which is also rapidly changing "(...) under pressure of multiple forces --financial, environmental, technological, geopolitical. What we used to call home may not even exist anymore, having transmuted into a financial commodity measured in sqm (square meters)", following Joseph Grima's statement in sqm. the quantified home: "Home is the answer, but what is the question?"
In a different line of works, we are looking to build physical materializations in the form of small pavilions for projects like i.e. Satellite Daylight, 46°28'N, while other researches are about functions: based on live data feeds, how would you inhabit a transformed --almost geo-engineered atmospheric/environmental condition? Like the one of Deterritorialized Living (night doesn't exist in this fictional climate that consists of only one day, no years, no months, no seasons), the physiological environment of I-Weather, or the one of Perpetual Tropical Sunshine, etc.?
We are therefore very interested to explore further into the ways you would inhabit such singular and "creolized" environments composed of combined dimensions, like some of the ones we've designed for installations. Yet considering these environments as proto-architecture (architectured/mediated atmospheres) and as conditions to inhabit, looking for their own logic.
We are looking forward to publishing the results of these different projects over the year. Some as early sketches, some as results, or both. Below, I publish early sketches of one such experiment, Deterritorialized House, linked to the "home/house" line of research. It is about symbiotically inhabiting the data center... Whether you like it or not, we surely inhabit it de facto, as it is a globally spread program and infrastructure that surrounds us; but we are thinking here of physically inhabiting it, possibly making it a "home", sharing it with the machines...
What happens when you combine a fully deterritorialized program (super- or hyper-modern, "non-lieu", ...) with that of the home? What might it say about contemporary living? Could the symbiotic relation take advantage of the heat the machines are generating --directly related to the amount of processing power used--, the quality of the air, the fact that the center must be up and running, possibly lit 24/7, etc.?
As we'll run a workshop next week in the context of another research project (Inhabiting and Interfacing the Cloud(s), an academic program between ECAL, HEAD, EPFL-ECAL Lab and EPFL in this case) linked to this idea of questioning the data center --its paradoxically centralized program, its location, its size, its functionalism, etc.--, it might be useful to publish these drawings, even if at an early stage (they date back to early 2014; the project went back and forth from this point and we are still working on it.)
1) The data center level (level -1 or level +1) serves as a speculative territory and environment to inhabit (each circle in this drawing is a fresh-air pipe surrounded by a certain number of computer cabinets --between 3 and 9).
A potential and idealistic new "infinite monument" (global)? It still needs to be decided whether it should be underground, cut off from natural lighting, or fragmented into many pieces and located at altitude (--likely, according to our other scenarios, which are looking for decentralization and collaboration), etc. Both?
Fresh air comes from the outside through the pipes surrounded by the servers and their cabinets (the incoming air could be underground-cooled, or the air found at altitude, in the Swiss Alps --triggering scenarios like cities in the mountains? mountain data farming? Likely too, as we are looking to bring data centers back into small or big urban environments). The computing and data storage units are organized like a "landscape", trying to trigger different atmospheric qualities (some areas are hotter than others due to the amount of hot air coming out of the data servers' cabinets, some areas are charged with positive ions, air connectivity is obviously everywhere, etc.)
Artificial lighting follows a similar organization, as the servers' cabinets need to be well lit. A light pattern therefore emerges in the data center level as well. Running 24/7, with the need to be always lit, the data center uses a very specific programmed lighting system: Deterritorialized Daylight, linked to global online data flows.
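The principle of a lighting system driven by live data flows can be illustrated with a toy mapping. The linear normalisation below is a hypothetical illustration only, not fabric | ch's actual Deterritorialized Daylight algorithm, and the traffic values are invented:

```python
def daylight_level(traffic, low, high):
    """Map a global data-traffic reading onto a lighting intensity in
    [0.0, 1.0] by linear normalisation with clamping. An illustrative
    stand-in for a data-driven lighting program."""
    if high <= low:
        raise ValueError("high must exceed low")
    t = (traffic - low) / (high - low)
    return max(0.0, min(1.0, t))

# Example: traffic samples (arbitrary units) over a "day" that never
# ends -- the lights dim with traffic but never switch off entirely.
samples = [120, 480, 900, 640]
levels = [round(daylight_level(s, low=100, high=1000), 2) for s in samples]
print(levels)  # [0.02, 0.42, 0.89, 0.6]
```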
2) Linked to the special atmospheric conditions found in this "geo-data engineered atmosphere" (that of the data center itself, level -1 or +1), freely organized functions can be located according to their best matching location. There are no thick walls, as the "cabinet islands" act as semi-open partitions.
A program starts to appear that combines the needs of a data center with those of a small housing program immersed into this "climate" (dense connectivity, always artificially lit, a permanent 24°C heat). "Houses" start to appear as "plugs" into a larger data center.
3) A detailed view (data center, level -1 or +1) of the "housing plug" that combines programs. At this level, the combination of an office-administration unit for a small data center starts to emerge, together with a kind of "small office - home office" immersed into this perpetually lit data space. This specific small housing space (a studio, or a "small office - home office") becomes a "deterritorialized" room within a larger housing program that we'll find on the upper level(s), likely the ground floor or level +2 of the overall compound.
4) Using the patterns emerging from different spatial components (heat, light, air quality --dried, charged with positive ions--, wifi connectivity), a map is traced and "moiré" patterns of spatial configurations ("moiré spaces") start to appear. These define spatial qualities. Functions are "structurelessly" placed accordingly, on a "best matching location" basis (needs in heat, humidity, light, connectivity), which connects this approach to that of Philippe Rahm, initiated in a former research project, Form & Function Follow Climate (2006), or also, i.e., to that of Walter Henn, Bürolandschaft (1963), if not Junya Ishigami's Kanagawa Institute.
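A toy version of this "best matching location" placement can be sketched as a scoring problem: each function is assigned to the zone whose measured qualities differ least from its preferred profile. The zones, profiles and scoring metric below are illustrative assumptions, not the project's real data:

```python
# Each zone of the data-center floor is described by a few qualities
# (0.0-1.0); each function states the profile it prefers. A function
# is placed in the zone that minimises the mismatch.

zones = {
    "hot_core":    {"heat": 0.9, "light": 0.8, "connectivity": 1.0},
    "cool_edge":   {"heat": 0.2, "light": 0.4, "connectivity": 0.6},
    "lit_plateau": {"heat": 0.5, "light": 1.0, "connectivity": 0.8},
}

functions = {
    "bathing":  {"heat": 1.0, "light": 0.5, "connectivity": 0.1},
    "sleeping": {"heat": 0.3, "light": 0.1, "connectivity": 0.2},
    "office":   {"heat": 0.4, "light": 0.9, "connectivity": 1.0},
}

def mismatch(profile, qualities):
    """Sum of absolute differences: lower means a better match."""
    return sum(abs(profile[k] - qualities[k]) for k in profile)

placement = {
    name: min(zones, key=lambda z: mismatch(profile, zones[z]))
    for name, profile in functions.items()
}
print(placement)
# {'bathing': 'hot_core', 'sleeping': 'cool_edge', 'office': 'lit_plateau'}
```

The "structureless" quality of the approach shows up here as well: nothing in the assignment depends on walls or a floor plan, only on the measured environmental map.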
Note also that this is a line of work we are following in another experimental project at fabric | ch, about which we also hope to publish during the year, Algorithmic Atomized Functioning --a glimpse of which can be seen in Desierto Issue #3, 28° Celsius.
5) On the ground level or on level +2, the rest of the larger house program and the few parts of the data center that emerge there. There are no other heating or artificial lighting devices besides the ones provided by the data center program itself. The energy spent by the data center must serve the house, and somehow be spared by it. Fresh and hot zones, artificial light, connectivity, etc. are provided by the data center's emergences in the house, as well as by the open "small office - home office" located one floor below. Again, a map is traced and moiré patterns of specific locations and spatial configurations emerge. Functions are also placed accordingly (hot, cold, lit, connected zones).
A "creolized" housing object starts, or tries, to appear: somewhere in between a symbiotic fragmented data center and a house, possibly sustaining or triggering new inhabiting patterns...
Project (ongoing): fabric | ch
Team: Patrick Keller, Christophe Guignard, Christian Babski, Sinan Mansuroglu
Tuesday, December 23. 2014
A Columbia scientist and his startup think they have a plan to save the world. Now they have to convince the rest of us.
By Eli Kintish
CTO and co-founder Peter Eisenberger in front of Global Thermostat’s air-capturing machine.
Physicist Peter Eisenberger had expected colleagues to react to his idea with skepticism. He was claiming, after all, to have invented a machine that could clean the atmosphere of its excess carbon dioxide, making the gas into fuel or storing it underground. And the Columbia University scientist was aware that naming his two-year-old startup Global Thermostat hadn’t exactly been an exercise in humility.
But the reception in the spring of 2009 had been even more dismissive than he had expected. First, he spoke to a special committee convened by the American Physical Society to review possible ways of reducing carbon dioxide in the atmosphere through so-called air capture, which means, essentially, scrubbing it from the sky. They listened politely to his presentation but barely asked any questions. A few weeks later he spoke at the U.S. Department of Energy’s National Energy Technology Laboratory in West Virginia to a similarly skeptical audience. Eisenberger explained that his lab’s research involves chemicals called amines that are already used to capture concentrated carbon dioxide emitted from fossil-fuel power plants. This same amine-based technology, he said, also showed potential for the far more difficult and ambitious task of capturing the gas from the open air, where carbon dioxide is found at concentrations of 400 parts per million. That’s up to 300 times more diffuse than in power plant smokestacks. But Eisenberger argued that he had a simple design for achieving the feat in a cost-effective way, in part because of the way he would recycle the amines. “That didn’t even register,” he recalls. “I felt a lot of people were pissing on me.”
The next day, however, a manager from the lab called him excitedly. The DOE scientists had realized that amine samples sitting around the lab had been bonding with carbon dioxide at room temperature—a fact they hadn’t much appreciated until then. It meant that Eisenberger’s approach to air capture was at least “feasible,” says one of the DOE lab’s chemists, Mac Gray.
Five years later, Eisenberger’s company has raised $24 million in investments, built a working demonstration plant, and struck deals to supply at least one customer with carbon dioxide harvested from the sky. But the next challenge is proving that the technology could have a transformative impact on the world, befitting his company’s name.
The need for a carbon-sucking machine is easy to see. Most technologies for mitigating carbon dioxide work only where the gas is emitted in large concentrations, as in power plants. But air-capture machines, installed anywhere on earth, could deal with the 52 percent of carbon-dioxide emissions that are caused by distributed, smaller sources like cars, farms, and homes. Secondly, air capture, if it ever becomes practical, could gradually reduce the concentration of carbon dioxide in the atmosphere. As emissions have accelerated—they’re now rising at 2 percent per year, twice as rapidly as they did in the last three decades of the 20th century—scientists have begun to recognize the urgency of achieving so-called “negative emissions.”
The obvious need for the technology has enticed several other efforts to come up with various approaches that might be practical. For example, Carbon Engineering, based in Calgary, captures carbon using a liquid solution of sodium hydroxide, a well-established industrial technique. A firm cofounded by an early pioneer of the idea, Eisenberger's Columbia colleague Klaus Lackner, worked on the problem for several years before giving up in 2012.
A report released in April by the Intergovernmental Panel on Climate Change says that avoiding the internationally agreed upon goal of 2 °C of global warming will likely require the global deployment of “carbon dioxide removal” strategies like air capture. (See “The Cost of Limiting Climate Change Could Double without Carbon Capture Technology.”) “Negative emissions are definitely needed to restore the atmosphere given that we’re going to far exceed any safe limit for CO2, if there is one,” says Daniel Schrag, director of the Harvard University Center for the Environment. “The question in my mind is, can it be done in an economical way?”
Most experts are skeptical. (See “What Carbon Capture Can’t Do.”) A 2011 report by the American Physical Society identified key physical and economic challenges. The fact that carbon dioxide will bind with amines, forming a molecule called a carbamate, is well known chemistry. But carbon dioxide still represents only one in 2,500 molecules in the air. That means an effective air-capture machine would need to push vast amounts of air past amines to get enough carbon dioxide to stick to them and then regenerate the amines to capture more. That would require a lot of energy and thus be very expensive, the 2011 report said. That’s why it concluded that air capture “is not currently an economically viable approach to mitigating climate change.”
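The dilution figures quoted above are straightforward to check. In the sketch below, the ~12% flue-gas concentration is an assumed typical value for a coal plant, consistent with the article's "up to 300 times more diffuse" claim:

```python
# 400 ppm of CO2 means roughly one molecule in 2,500, and open air
# is a few hundred times more dilute than power-plant flue gas.

ambient_ppm = 400
one_in = 1_000_000 // ambient_ppm            # one CO2 molecule per 2,500
ambient_fraction = ambient_ppm / 1_000_000   # 0.0004
flue_gas_fraction = 0.12                     # assumed smokestack concentration
dilution = flue_gas_fraction / ambient_fraction
print(one_in, round(dilution))  # 2500 300
```

This is the whole economic problem in two numbers: an air-capture machine must move roughly 300 times more gas past its sorbent than a smokestack scrubber does for the same ton of CO2.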
The people at Global Thermostat understand these daunting economics but remain defiantly optimistic. The way to make air capture profitable, says Global Thermostat cofounder Graciela Chichilnisky, a Columbia University economist and mathematician, is to take advantage of the demand for the gas by various industries. There already exists a well-established, billion-dollar market for carbon dioxide, which is used to rejuvenate oil wells, make carbonated beverages, and stimulate plant growth in commercial greenhouses. Historically, the gas sells for around $100 per ton. But Eisenberger says his company’s prototype machine could extract a concentrated ton of the gas for far less than that. The idea is to first sell carbon dioxide to niche markets, such as oil-well recovery, to eventually create bigger ones, like using catalysts to make fuels in processes that are driven by solar energy. “Once capturing carbon from the air is profitable, people acting in their own self-interest will make it happen,” says Chichilnisky.
Eisenberger and Chichilnisky were colleagues at Columbia in 2008 when they realized that they had complementary interests: his in energy, and hers in environmental economics, including work to help shape the 1997 Kyoto Protocol, the first global treaty on cutting emissions. Nations had pledged big cuts, says Chichilnisky, but economic and political realities had provided “no way to implement it.” The pair decided to create a business to tackle the carbon challenge.
They focused on air capture, which was first developed by Nazi scientists who used liquid sorbents to remove accumulations of CO2 in submarines. In the winter of 2008 Eisenberger sequestered himself in a quiet house with big glass windows overlooking the ocean in Mendocino County, California. There he studied the existing literature on capturing carbon and made a key decision. Scientists developing techniques to capture CO2 had thus far sought to work at high concentrations of the gas. But Eisenberger and Chichilnisky focused on a different variable in the underlying chemistry: temperature.
Engineers have previously deployed amines to scrub CO2 from flue gases, whose temperatures are around 70 °C when they exit power plants. Subsequently removing the CO2 from the amines—“regenerating” the amines—generally requires reactions at 120 °C. By contrast, Eisenberger calculated that his system would operate at roughly 85 °C, requiring less total energy. It would use relatively cheap steam for two purposes. The steam would heat the surface, driving the CO2 off the amines to be collected, while also blowing CO2 away from the surface.
The upshot? With less heat-management infrastructure than what is required with amines in the smokestacks of power plants, the design of a scrubber could be simpler and therefore cheaper. Using data from their prototype, Eisenberger’s team figures the approach could cost between $15 and $50 per ton of carbon dioxide captured from air, depending on how long the amine surfaces last.
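The sensitivity to sorbent lifetime mentioned above can be sketched with a toy amortization model. Every number below is a hypothetical assumption for illustration, not a Global Thermostat figure:

```python
# Hypothetical sensitivity sketch: how sorbent lifetime drives cost per ton.
# base_opex, sorbent_cost, and the lifetimes are all made-up inputs.

def cost_per_ton(base_opex=15.0, sorbent_cost=100_000.0,
                 tons_per_replacement=5_000.0):
    """Dollars per ton captured: fixed operating cost plus amortized
    sorbent replacement.

    base_opex            -- steam/energy/maintenance cost per ton captured
    sorbent_cost         -- cost of one full set of amine surfaces
    tons_per_replacement -- CO2 captured before the surfaces degrade and
                            must be replaced (the key unknown)
    """
    return base_opex + sorbent_cost / tons_per_replacement

# Long-lived surfaces keep the cost near the floor; short-lived ones
# blow well past the top of the quoted range.
for lifetime in (20_000, 5_000, 2_000):
    print(lifetime, round(cost_per_ton(tons_per_replacement=lifetime), 2))
```

Under these made-up inputs the cost swings from $20 to $65 per ton as lifetime shrinks tenfold, which illustrates why the quoted $15–$50 band hinges on "how long the amine surfaces last."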
If Global Thermostat can achieve anywhere near the prices it’s touting, a number of niche markets beckon. The startup has partnered with a Carson City, Nevada-based company called Algae Systems to make biofuels using carbon dioxide and algae. Meanwhile the demand is rising for carbon dioxide to inject into depleted oil wells, a technique known as enhanced oil recovery. One study estimates that the application could require as much as 3 billion tons of carbon dioxide annually by 2021, a nearly tenfold increase over the 2011 market.
That still represents a drop in the bucket in terms of the amounts needed to reduce or even stabilize the concentration of CO2 in the atmosphere. But Eisenberger says there are really no alternatives to air capture. Simply capturing carbon emissions from coal-fired power plants, he says, only extends society’s dependence on carbon-intensive coal.
Suck it up
It’s a warm December afternoon in Silicon Valley as Eisenberger and I make our way across SRI International’s concrete research center. It’s in these low-slung buildings where engineers first developed ARPAnet, Apple’s Siri software, and countless other technological advances. About a quarter mile from the entrance, a 40-foot-high tower of fans, steel, and silver tubes comes into view. This is the Global Thermostat demonstration plant. It’s imposing and clean. Eisenberger gazes at the quiet scene around the tower, which includes a tall tree. “It’s doing exactly what the tree is doing,” says Eisenberger. But then he corrects himself. “Well, actually, it’s doing it a lot better.”
After Eisenberger earned a PhD in physics at Harvard in 1967, stints at Bell Labs, Princeton, and Stanford followed. At Exxon in the 1980s he led work on solar energy, then served as director of Lamont-Doherty, the geosciences lab at Columbia. There he has taught a long-standing seminar called “The Earth/Human System.” It was in that seminar, in 2007, with Lackner as a guest lecturer, that Eisenberger first heard about air capture. After a year or so of preparation, he and Chichilnisky reached out to billionaire Edgar Bronfman Jr. “Sometimes when you hear something that must be too good to be true, it’s because it is,” was Bronfman’s reaction, according to his son, who was present at the meeting. But the scion implored his father: “If they’re right, this is one of the biggest opportunities out there.” The family invested $18 million.
That largesse has allowed the company to build its demonstration plant despite essentially no federal support for air-capture research. (Global Thermostat chose SRI as its site because of the facility’s prior experience with carbon-capture technology.) The rectangular tower uses fans to draw air in over alternating 10-foot-wide surfaces known as contactors. Each is composed of 640 ceramic cubes embedded with the amine sorbent. The tower raises one contactor as another is lowered. That allows the cubes of one to collect CO2 from ambient air while the other is stripped of the gas by an application of steam at 85 °C. For now that gas is simply vented, but depending on the customer it could be injected into the ground, shipped by pipe, or transferred to a chemical plant for industrial use.
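The alternating-contactor cycle described above can be sketched as a toy state machine: while one contactor adsorbs CO2 from ambient air, the other is steam-stripped, and then the tower swaps them. States and yields here are illustrative, not specifications of the SRI plant:

```python
# Toy sketch of the alternating two-contactor cycle: one contactor collects
# CO2 while the other regenerates, then they trade places.

from dataclasses import dataclass

@dataclass
class Contactor:
    name: str
    loaded: bool = False  # True once the amine cubes are saturated with CO2

def run_cycles(n_cycles):
    """Return the number of CO2 batches collected over n_cycles swaps."""
    adsorbing = Contactor("A")
    stripping = Contactor("B", loaded=True)
    collected = 0
    for _ in range(n_cycles):
        adsorbing.loaded = True      # cubes pick up CO2 from ambient air
        if stripping.loaded:
            collected += 1           # steam drives off one batch of CO2
            stripping.loaded = False
        adsorbing, stripping = stripping, adsorbing  # tower swaps contactors
    return collected

print(run_cycles(10))  # one batch per swap: prints 10
```

The point of the alternation is continuous operation: neither the fans nor the steam supply ever sits idle waiting for a single contactor to finish both halves of its cycle.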
A key challenge facing the company is the ruggedness of the amine sorbent surfaces. They tend to decay rapidly when oxidized, and frequently replacing the sorbents could make the process much less cost-effective than Eisenberger projects.
None of the world’s thousands of coal plants have been outfitted for full-scale capture of their carbon pollution. And if it isn’t economical for use in power plants, with their concentrated source of carbon dioxide, the prospects of capturing it out of the air seem dim to many experts. “There’s really little chance that you could capture CO2 from ambient air more cheaply than from a coal plant, where the flue gas is 300 times more concentrated,” says Robert Socolow, director of the Princeton Environmental Institute and co-director of the university’s carbon mitigation initiative.
Adding to the skepticism over the feasibility of air capture is that there are other, cheaper ways to achieve so-called negative emissions. A more practical route, Schrag says, would involve deriving fuels from biomass, which removes CO2 from the atmosphere as it grows. As that feedstock is fermented in a reactor to create ethanol, it produces a stream of pure carbon dioxide that can be captured and stored underground. It’s a proven technique that has been tested at a handful of sites worldwide.
Even if air capture were to someday prove profitable, whether it should be scaled up is another question. Say a solar power plant is built outside an existing coal plant. Should the energy the new solar plant produces be used to suck carbon out of the atmosphere, or to allow the coal plant to be shut down by replacing its energy output? The latter makes much more sense, says Socolow. He and others have another concern about air capture: that claims about its feasibility could breed complacency. “I don’t want us to give people the false hope that air capture can solve the carbon emissions problem without a strong focus on [reducing the use of] fossil fuels,” he says.
Eisenberger and Chichilnisky are adamant about the importance of sucking CO2 out of the atmosphere rather than focusing entirely on capturing it from coal plants. In 2010, the pair developed a version of their technology that mixes air with flue gas from a coal- or gas-fired power plant. That approach provides a source of steam while capturing both atmospheric carbon and new emissions. It also could lower costs by providing a higher concentration of CO2 for the machine to capture. “It’s a very impressive system, a triumph,” says Socolow, who thinks scientific advances made in air capture will eventually be used primarily on coal and gas power plants.
Such an application could play a critical role in cleaning up greenhouse gas emissions. But Eisenberger has revealed even loftier goals. A patent granted to him and Chichilnisky in 2008 described air capture technology as, among other things, “a global thermostat for controlling average temperature of a planet’s atmosphere.”
Eli Kintisch is a correspondent for Science magazine.
Friday, April 25. 2014
This article by Carlo Ratti originally appeared in The European under the title “The Sense-able City”. Ratti outlines the driving forces behind the smart-cities movement and explains why we may be best off retrofitting existing cities with new technologies rather than building new ones.
What was empty space just a few years ago is now becoming New Songdo in Korea, Masdar in the United Arab Emirates or PlanIT in Portugal — new “smart cities”, built from scratch, are sprouting across the planet and traditional actors like governments, urban planners and real estate developers, are, for the first time, working alongside large IT firms — the likes of IBM, Cisco, and Microsoft.
The resulting cities are based on the idea of becoming “living labs” for new technologies at the urban scale, blurring the boundary between bits and atoms, habitation and telemetry. If 20th century French architect Le Corbusier advanced the concept of the house as a “machine for living in”, these cities could be imagined as inhabitable microchips, or “computers in open air”.
Wearable Computers and Smart Trash
The very idea of a smart city runs parallel to “ambient intelligence” — the dissemination of ubiquitous electronic systems in our living environments, allowing them to sense and respond to people. That fluid sensing and actuation is the logical conclusion of the liberation of computing: from mainframe solidity to desktop fixity, from laptop mobility to handheld ubiquity, to a final ephemerality as computing disappears into the environment and into humans themselves with the development of wearable computers.
It is impossible to forget the striking side-by-side images of the past two papal inaugurations: the first, for Benedict XVI in 2005, shows the raised hands of a cheering crowd, while the second, for Francis I in 2013, shows a glimmering constellation of smartphone screens held aloft to take pictures. Smart cities are enabled by the atomization of technology, ushering in an age when the physical world is indistinguishable from its digital overlay.
The key mechanism behind ambient intelligence, then, is “sensing” — the ability to measure what happens around us and to respond dynamically. New means of sensing are suffusing every aspect of urban space, revealing its visible and invisible dimensions: we are learning more about our cities so that they can learn about us. As people talk, text, and browse, data collected from telecommunication networks is capturing urban flows in real time and crystallizing them as Google’s traffic congestion maps.
Like a tracer running through the veins of the city, networks of air quality sensors attached to bikes can help measure an individual’s exposure to pollution and draw a dynamic map of the urban air on a human scale, as in the case of the Copenhagen Wheel developed by new startup Superpedestrian. Even trash could become smarter: the deployment of geolocating tags attached to ordinary garbage could paint a surprising picture of the waste management system, as trash is shipped throughout the country in a maze-like disposal process — as we saw in Seattle with our own Trash Track project.
Afraid of Our Own Bed
Today, people themselves (equipped with smartphones, naturally) can be instruments of sensing. Over the past few years, a new universe of urban apps has appeared — allowing people to broadcast their location, information and needs — and facilitating new interactions with the city. Hail a taxi (“Uber”), book a table for dinner (“OpenTable”), or have physical encounters based on proximity and profiles (“Grindr” and “Blendr”): real-time information is sent out from our pockets, into the city, and right back to our fingertips.
In some cases, the very process of sensing becomes a deliberate civic action: citizens themselves are taking an increasingly active role in participatory data sharing. Users of Waze automatically upload detailed road and traffic information so that their community can benefit from it. 311-type apps allow people to report non-emergencies in their immediate neighborhood, from potholes to fallen tree branches, and subsequently organize a fix. OpenStreetMap does the same, enabling citizens to collaboratively draw maps of places that have never been systematically charted before — especially in developing countries not yet graced by a visit from Google.
These examples show the positive implications of ambient urban intelligence but the data that emerges from fine-grained sensing is inherently neutral. It is a tool that can be used in many different applications, and to widely varying ends. As artist-turned-XeroxPARC-pioneer Rich Gold once asked in an incisive (and humorous) essay: “How smart does your bed have to be, before you are afraid to go to sleep at night?” What might make our nights sleepless, in this case, is the sheer amount of data being generated by sensing. According to a famous quantification by Google’s Eric Schmidt, every 48 hours we produce as much data as all of humanity until 2003 (an estimation that is already three years old). Who has access to this data? How do we avoid the dystopian ending of Italo Calvino’s 1960s short story “The Memory of the World,” where humanity’s act of infinite recording unravels as intrigue, drama, and murder?
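Schmidt's claim can be turned into quick arithmetic. Assuming the ~5 exabyte figure usually quoted alongside it for everything humanity produced through 2003 (our assumption here, not a number given in this article), the implied annual output is striking:

```python
# Quick arithmetic on Eric Schmidt's claim: if we now produce as much data
# every 48 hours as all of humanity did through 2003, what is the implied
# annual output? The 5 EB baseline is an assumed figure for illustration.

PRE_2003_TOTAL_EB = 5.0  # exabytes, the figure commonly cited with the claim
HOURS_PER_BATCH = 48

batches_per_year = 365 * 24 / HOURS_PER_BATCH  # 182.5 two-day periods a year
annual_eb = PRE_2003_TOTAL_EB * batches_per_year
print(f"~{annual_eb:.1f} EB per year")  # close to a zettabyte per year
```

At that rate, a single year's output dwarfs the entire pre-2003 archive by more than two orders of magnitude, which is precisely what makes the access and governance questions above pressing.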
And finally, does this new pervasive data dimension require an entirely new city? Probably not. Of course, ambient intelligence might have architectural ramifications, like responsive building facades or occupant-targeted climates. But in each of the city-sensing examples above, technology does not necessarily call for new urban space — many IT-infused “smart city initiatives” feel less like a necessity and more like a justification of real estate operations on a massive scale – with a net result of bland spatial products.
Forget About Flying Cars
Ambient intelligence can indeed pervade new cities, but perhaps most importantly, it can also animate the rich, chaotic urban spaces we already inhabit — like a new operating system for existing hardware. This was already noted by Bill Mitchell at the beginning of our digital era: “The gorgeous old city of Venice […] can integrate modern telecommunications infrastructure far more gracefully than it could ever have adapted to the demands of the industrial revolution.” Could ambient intelligence bring new life to the winding streets of Italian hill towns, the sweeping vistas of Santorini, or the empty husks of Detroit?
We might need to forget about the flying cars that zip through standard future-cities discourse. Urban form has shown an impressive persistence over millennia — most elements of the modern city were already present in Greek and Roman times. Humans have always needed, and will continue to need, the same physical structures for their daily lives: horizontal planes and vertical walls (no offense, Frank O. Gehry). But the very lives that unfold inside those walls are now the subject of one of the most striking transformations in human history. Ambient intelligence and sensing networks will not change the container but the contained; not smart cities but smart citizens.
This article by Carlo Ratti originally appeared in The European Magazine
Posted by Patrick Keller in Architecture, Culture & society, Interaction design, Science & technology, Sustainability at 07:51
Defined tags for this entry: architecture, citizen, culture & society, data, interaction design, monitoring, participative, science & technology, sustainability, urbanism, users
fabric | rblg
This blog is the survey website of fabric | ch - studio for architecture, interaction and research.
We curate and reblog articles, research, writings, exhibitions and projects that we notice and find interesting in our everyday practice and reading.
Most articles concern the intertwined fields of architecture, territory, art, interaction design, thinking and science. From time to time, we also publish documentation about our own work and research, immersed among these related resources and inspirations.
This website serves fabric | ch as an archive of references and resources. It is shared with all those interested in the same topics, in the hope that they too will find valuable references and content in it.