Tuesday, December 23, 2014

Can Sucking CO2 Out of the Atmosphere Really Work? | #atmosphere
-----
A Columbia scientist and his startup think they have a plan to save the world. Now they have to convince the rest of us.
By Eli Kintisch
CTO and co-founder Peter Eisenberger in front of Global Thermostat’s air-capturing machine.
Physicist Peter Eisenberger had expected colleagues to react to his idea with skepticism. He was claiming, after all, to have invented a machine that could clean the atmosphere of its excess carbon dioxide, making the gas into fuel or storing it underground. And the Columbia University scientist was aware that naming his two-year-old startup Global Thermostat hadn’t exactly been an exercise in humility. But the reception in the spring of 2009 had been even more dismissive than he had expected. First, he spoke to a special committee convened by the American Physical Society to review possible ways of reducing carbon dioxide in the atmosphere through so-called air capture—essentially, scrubbing it from the sky. The committee listened politely to his presentation but barely asked any questions. A few weeks later he spoke at the U.S. Department of Energy’s National Energy Technology Laboratory in West Virginia to a similarly skeptical audience. Eisenberger explained that his lab’s research involved chemicals called amines that are already used to capture concentrated carbon dioxide emitted from fossil-fuel power plants. This same amine-based technology, he said, also showed potential for the far more difficult and ambitious task of capturing the gas from the open air, where carbon dioxide is found at concentrations of 400 parts per million—up to 300 times more diffuse than in power plant smokestacks. Eisenberger argued that he had a simple design for achieving the feat cost-effectively, in part because of the way he would recycle the amines. “That didn’t even register,” he recalls. “I felt a lot of people were pissing on me.”

The next day, however, a manager from the lab called him excitedly. The DOE scientists had realized that amine samples sitting around the lab had been bonding with carbon dioxide at room temperature—a fact they hadn’t much appreciated until then. It meant that Eisenberger’s approach to air capture was at least “feasible,” says one of the DOE lab’s chemists, Mac Gray.

Five years later, Eisenberger’s company has raised $24 million in investments, built a working demonstration plant, and struck deals to supply at least one customer with carbon dioxide harvested from the sky. But the next challenge is proving that the technology could have a transformative impact on the world, befitting his company’s name.

The need for a carbon-sucking machine is easy to see. Most technologies for mitigating carbon dioxide work only where the gas is emitted in large concentrations, as in power plants. Air-capture machines, by contrast, could be installed anywhere on earth and could deal with the 52 percent of carbon-dioxide emissions that come from distributed, smaller sources like cars, farms, and homes. Moreover, air capture, if it ever becomes practical, could gradually reduce the concentration of carbon dioxide in the atmosphere. As emissions have accelerated—they’re now rising at 2 percent per year, twice as fast as in the last three decades of the 20th century—scientists have begun to recognize the urgency of achieving so-called “negative emissions.”

The obvious need for the technology has enticed several other groups to pursue approaches that might prove practical. For example, Carbon Engineering, based in Calgary, captures carbon using a liquid solution of sodium hydroxide, a well-established industrial technique.
A firm cofounded by an early pioneer of the idea, Eisenberger’s Columbia colleague Klaus Lackner, worked on the problem for several years before giving up in 2012.
A report released in April by the Intergovernmental Panel on Climate Change says that staying below the internationally agreed-upon limit of 2 °C of global warming will likely require the global deployment of “carbon dioxide removal” strategies like air capture. (See “The Cost of Limiting Climate Change Could Double without Carbon Capture Technology.”) “Negative emissions are definitely needed to restore the atmosphere given that we’re going to far exceed any safe limit for CO2, if there is one,” says Daniel Schrag, director of the Harvard University Center for the Environment. “The question in my mind is, can it be done in an economical way?”

Most experts are skeptical. (See “What Carbon Capture Can’t Do.”) A 2011 report by the American Physical Society identified key physical and economic challenges. The fact that carbon dioxide will bind with amines, forming a molecule called a carbamate, is well-known chemistry. But carbon dioxide still represents only one in 2,500 molecules in the air. That means an effective air-capture machine would need to push vast amounts of air past the amines to get enough carbon dioxide to stick to them, and then regenerate the amines to capture more. That would require a lot of energy and thus be very expensive, the report said, concluding that air capture “is not currently an economically viable approach to mitigating climate change.”

The people at Global Thermostat understand these daunting economics but remain defiantly optimistic. The way to make air capture profitable, says Global Thermostat cofounder Graciela Chichilnisky, a Columbia University economist and mathematician, is to take advantage of the demand for the gas from various industries. There already exists a well-established, billion-dollar market for carbon dioxide, which is used to rejuvenate oil wells, make carbonated beverages, and stimulate plant growth in commercial greenhouses. Historically, the gas sells for around $100 per ton, but Eisenberger says his company’s prototype machine could extract a concentrated ton of the gas for far less than that. The idea is to sell carbon dioxide first to niche markets, such as oil-well recovery, and eventually to create bigger ones, like using catalysts to make fuels in processes driven by solar energy. “Once capturing carbon from the air is profitable, people acting in their own self-interest will make it happen,” says Chichilnisky.
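Those economics can be sanity-checked with a back-of-the-envelope calculation. The short C program below is my own illustration, not a figure from the APS report: in the dilute limit, thermodynamics sets a floor of roughly W = RT ln(1/x) per mole on the work needed to separate a gas present at mole fraction x.

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    /* Back-of-envelope: thermodynamic minimum work to extract pure CO2
       from ambient air, using the dilute-mixture bound W = R*T*ln(1/x). */
    const double R = 8.314;   /* gas constant, J/(mol*K) */
    const double T = 298.0;   /* ambient temperature, K */
    const double x = 400e-6;  /* CO2 mole fraction: 400 ppm = 1 in 2,500 */
    const double M = 0.044;   /* molar mass of CO2, kg/mol */

    double w_per_mol  = R * T * log(1.0 / x);   /* J/mol */
    double w_per_ton  = w_per_mol / M * 1000.0; /* J per metric ton */
    double kwh_per_ton = w_per_ton / 3.6e6;     /* joules -> kWh */

    printf("Minimum work: %.1f kJ/mol, about %.0f kWh per ton of CO2\n",
           w_per_mol / 1000.0, kwh_per_ton);
    /* Prints roughly 19.4 kJ/mol, ~122 kWh/ton. At $0.10 per kWh that is
       only ~$12/ton of electricity, but real separation processes tend to
       need several times the thermodynamic minimum, which is why the APS
       report reached such pessimistic cost estimates. */
    return 0;
}
```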
Warming up

Eisenberger and Chichilnisky were colleagues at Columbia in 2008 when they realized that they had complementary interests: his in energy, and hers in environmental economics, including work to help shape the 1997 Kyoto Protocol, the first global treaty on cutting emissions. Nations had pledged big cuts, says Chichilnisky, but economic and political realities had provided “no way to implement it.” The pair decided to create a business to tackle the carbon challenge. They focused on air capture, which was first developed by Nazi scientists who used liquid sorbents to remove accumulations of CO2 in submarines.

In the winter of 2008 Eisenberger sequestered himself in a quiet house with big glass windows overlooking the ocean in Mendocino County, California. There he studied the existing literature on capturing carbon and made a key decision. Scientists developing capture techniques had concentrated on one term in the governing equations: the concentration of the gas. Eisenberger and Chichilnisky focused on another term: temperature. Engineers have previously deployed amines to scrub CO2 from flue gases, whose temperatures are around 70 °C when they exit power plants. Subsequently removing the CO2 from the amines—“regenerating” them—generally requires reactions at 120 °C. By contrast, Eisenberger calculated that his system could operate at roughly 85 °C, requiring less total energy. It would use relatively cheap steam for two purposes: the steam would heat the surface, driving the CO2 off the amines to be collected, while also blowing the freed CO2 away from the surface.
The upshot? With less heat-management infrastructure than is required for amines in the smokestacks of power plants, the design of a scrubber could be simpler and therefore cheaper. Using data from their prototype, Eisenberger’s team figures the approach could cost between $15 and $50 per ton of carbon dioxide captured from air, depending on how long the amine surfaces last.

If Global Thermostat can achieve anywhere near the prices it’s touting, a number of niche markets beckon. The startup has partnered with a Carson City, Nevada-based company called Algae Systems to make biofuels using carbon dioxide and algae. Meanwhile, demand is rising for carbon dioxide to inject into depleted oil wells, a technique known as enhanced oil recovery. One study estimates that the application could require as much as 3 billion tons of carbon dioxide annually by 2021, a nearly tenfold increase over the 2011 market.

That still represents a drop in the bucket compared with the amounts needed to reduce or even stabilize the concentration of CO2 in the atmosphere. But Eisenberger says there are really no alternatives to air capture. Simply capturing carbon emissions from coal-fired power plants, he says, only extends society’s dependence on carbon-intensive coal.
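For a sense of why even 3 billion tons a year remains “a drop in the bucket,” a rough conversion helps. The numbers in the sketch below are standard atmospheric bookkeeping (about 7.8 billion tons of CO2 per part per million of concentration), not figures from the article.

```c
#include <stdio.h>

int main(void) {
    /* Rough scale check: how much does removing CO2 move the atmospheric
       concentration? Standard conversion: 1 ppm of atmospheric CO2
       corresponds to ~2.13 Gt of carbon, i.e. ~7.8 Gt of CO2. */
    const double gt_co2_per_ppm = 2.13 * (44.0 / 12.0); /* ~7.81 Gt/ppm */
    const double capture_gt_yr  = 3.0;  /* projected EOR demand, Gt/yr */

    printf("1 ppm of atmospheric CO2 ~= %.2f Gt\n", gt_co2_per_ppm);
    printf("3 Gt/yr of capture ~= %.2f ppm/yr\n",
           capture_gt_yr / gt_co2_per_ppm);
    /* ~0.38 ppm/yr, while emissions currently add roughly 2 ppm/yr.
       Capture at EOR scale would offset only a fraction of the annual
       growth, let alone draw down the existing stock. */
    return 0;
}
```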
Suck it up

It’s a warm December afternoon in Silicon Valley as Eisenberger and I make our way across SRI International’s concrete research center. It’s in these low-slung buildings that engineers first developed ARPAnet, Apple’s Siri software, and countless other technological advances. About a quarter mile from the entrance, a 40-foot-high tower of fans, steel, and silver tubes comes into view. This is the Global Thermostat demonstration plant. It’s imposing and clean. Eisenberger gazes at the quiet scene around the tower, which includes a tall tree. “It’s doing exactly what the tree is doing,” says Eisenberger. But then he corrects himself. “Well, actually, it’s doing it a lot better.”

After Eisenberger earned a PhD in physics at Harvard in 1967, stints at Bell Labs, Princeton, and Stanford followed. At Exxon in the 1980s he led work on solar energy, then served as director of Lamont-Doherty, the geosciences lab at Columbia. There he has taught a long-standing seminar called “The Earth/Human system.” It was in that seminar, in 2007, with Lackner as a guest lecturer, that Eisenberger first heard about air capture.

After a year or so of preparation, he and Chichilnisky reached out to billionaire Edgar Bronfman Jr. “Sometimes when you hear something that must be too good to be true, it’s because it is,” was Bronfman’s reaction, according to his son, who was present at the meeting. But the scion implored his father: “If they’re right, this is one of the biggest opportunities out there.” The family invested $18 million. That largesse has allowed the company to build its demonstration plant despite essentially no federal support for air-capture research. (Global Thermostat chose SRI as its site because of the facility’s prior experience with carbon-capture technology.)

The rectangular tower uses fans to draw air in over alternating 10-foot-wide surfaces known as contactors. Each comprises 640 ceramic cubes embedded with the amine sorbent. The tower raises one contactor as another is lowered. That allows the cubes of one to collect CO2 from ambient air while the other is stripped of the gas by the application of steam at 85 °C. For now that gas is simply vented, but depending on the customer it could be injected into the ground, shipped by pipe, or transferred to a chemical plant for industrial use.

A key challenge facing the company is the ruggedness of the amine sorbent surfaces. They tend to decay rapidly when oxidized, and frequently replacing the sorbents could make the process much less cost-effective than Eisenberger projects.
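The alternating-contactor design can be pictured as a simple swing cycle. The sketch below is purely illustrative; its cycle count, per-cycle capture, and decay rate are invented for the example and are not Global Thermostat’s operating figures. It shows how one contactor adsorbs while the other is steam-stripped, and how sorbent decay erodes output over time.

```c
#include <stdio.h>

/* Illustrative model of a two-contactor temperature-swing cycle: while
   one contactor adsorbs CO2 from ambient air, the other is stripped with
   85 C steam. All numbers are invented for illustration only. */

#define CYCLES          1000
#define KG_PER_CYCLE    50.0    /* hypothetical CO2 per contactor cycle */
#define DECAY_PER_CYCLE 0.0002  /* hypothetical oxidative sorbent decay */

int main(void) {
    double capacity[2] = {1.0, 1.0}; /* fractional sorbent capacity */
    double total_kg = 0.0;

    for (int cycle = 0; cycle < CYCLES; cycle++) {
        int adsorbing = cycle % 2;   /* contactors alternate roles */
        total_kg += KG_PER_CYCLE * capacity[adsorbing];
        capacity[adsorbing] -= DECAY_PER_CYCLE; /* amines oxidize in use */
    }
    printf("Captured %.0f kg over %d cycles; capacities now %.2f / %.2f\n",
           total_kg, CYCLES, capacity[0], capacity[1]);
    /* Faster decay means earlier sorbent replacement, which is why
       sorbent lifetime dominates the $15-50/ton cost range quoted above. */
    return 0;
}
```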
False hope

None of the world’s thousands of coal plants have been outfitted for full-scale capture of their carbon pollution. And if carbon capture isn’t economical for power plants, with their concentrated source of carbon dioxide, the prospects of capturing the gas out of the air seem dim to many experts. “There’s really little chance that you could capture CO2 from ambient air more cheaply than from a coal plant, where the flue gas is 300 times more concentrated,” says Robert Socolow, director of the Princeton Environmental Institute and co-director of the university’s carbon mitigation initiative.

Adding to the skepticism over the feasibility of air capture is that there are other, cheaper ways to achieve so-called negative emissions. A more practical approach, Schrag says, would involve deriving fuels from biomass, which removes CO2 from the atmosphere as it grows. As that feedstock is fermented in a reactor to create ethanol, it produces a stream of pure carbon dioxide that can be captured and stored underground. It’s a proven technique that has been tested at a handful of sites worldwide.

Even if air capture were someday to prove profitable, whether it should be scaled up is another question. Say a solar power plant is built next to an existing coal plant. Should the energy the new solar plant produces be used to suck carbon out of the atmosphere, or to replace the coal plant’s output so that it can be shut down? The latter makes much more sense, says Socolow.

He and others have another concern about air capture: that claims about its feasibility could breed complacency. “I don’t want us to give people the false hope that air capture can solve the carbon emissions problem without a strong focus on [reducing the use of] fossil fuels,” he says.

Eisenberger and Chichilnisky are adamant about the importance of sucking CO2 out of the atmosphere rather than focusing entirely on capturing it from coal plants. In 2010, the pair developed a version of their technology that mixes air with flue gas from a coal- or gas-fired power plant. That approach provides a source of steam while capturing both atmospheric carbon and new emissions. It also could lower costs by providing a higher concentration of CO2 for the machine to capture. “It’s a very impressive system, a triumph,” says Socolow, who thinks scientific advances made in air capture will eventually be used primarily on coal and gas power plants. Such an application could play a critical role in cleaning up greenhouse gas emissions.

But Eisenberger has even loftier goals. A patent granted to him and Chichilnisky in 2008 described air-capture technology as, among other things, “a global thermostat for controlling average temperature of a planet’s atmosphere.”
Eli Kintisch is a correspondent for Science magazine.
Posted by Patrick Keller in Science & technology, Sustainability at 13:18
Defined tags for this entry: air, atmosphere, engineering, geography, pollution, science & technology, sustainability
Imposing Security | #code
Note: while I'm rather wary of too much security (hence not "imposing security"), and somewhat uneasy that we, as human beings, are delegating so much of our daily routines and actions to algorithms (which we wrote), this article stresses the importance of code in our everyday lives, and shows that reliability comes down to the very language used to write a program. It is interesting to learn that some programming languages are more likely than others to produce mistakes and errors.
-----
Computer programmers won’t stop making dangerous errors on their own. It’s time they adopted an idea that makes the physical world safer.
Three computer bugs this year exposed passwords, e-mails, financial data, and other kinds of sensitive information connected to potentially billions of people. The flaws cropped up in different places—the software running on Web servers, iPhones, the Windows operating system—but they all had the same root cause: careless mistakes by programmers.

Each of these bugs—the “Heartbleed” bug in a program called OpenSSL, the “goto fail” bug in Apple’s operating systems, and a so-called zero-day exploit discovered in Microsoft’s Internet Explorer—was created years ago by programmers writing in C, a language known for its power, its expressiveness, and the ease with which it leads programmers to make all manner of errors. Using C to write critical Internet software is like using a spring-loaded razor to open boxes—it’s really cool until you slice your fingers.

Alas, as dangerous as it is, we won’t eliminate C anytime soon—programs written in C and the related language C++ make up a large portion of the software that powers the Internet. New projects are started in these languages all the time by programmers who think they need C’s speed and think they’re good enough to avoid its traps and pitfalls. But even if we can’t get rid of the language, we can force those who use it to do a better job, by borrowing a concept used every day in the physical world.

Obvious in retrospect

Of the three flaws, Heartbleed was by far the most significant. It is a bug in a program that implements a protocol called Secure Sockets Layer/Transport Layer Security (SSL/TLS), which is the fundamental encryption method used to protect the vast majority of the financial, medical, and personal information sent over the Internet. The original SSL protocol made Internet commerce possible back in the 1990s. OpenSSL, an open-source implementation of SSL/TLS, has been around nearly as long, and the program has steadily grown and been extended over the years.

Today’s cryptographic protocols are thought to be so strong that there is, in practice, no way to break them. But Heartbleed made SSL’s encryption irrelevant. Using Heartbleed, an attacker anywhere on the Internet could reach into the heart of a Web server’s memory and rip out a little piece of private data. The name doesn’t come from this metaphor, though, but from the fact that Heartbleed is a flaw in the “heartbeat” protocol Web browsers can use to tell Web servers that they are still connected. Essentially, the attacker could ping Web servers in a way that not only confirmed the connection but also got them to spill some of their contents. It’s like being able to check into a hotel that occasionally forgets to empty its rooms’ trash cans between guests. Sometimes those cans contain highly valuable information.

Heartbleed resulted from a combination of factors, including a mistake made by a volunteer working on the OpenSSL program when he implemented the heartbeat protocol. Any of the mistakes could have happened if OpenSSL had been written in a modern programming language like Java or C#, but they were more likely to happen because OpenSSL was written in C.
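The mechanics are easy to see in miniature. The C sketch below is a simplified, hypothetical analogue of the Heartbleed flaw rather than OpenSSL’s actual code: the reply routine trusts the length field the client supplies instead of checking it against the payload’s real size.

```c
#include <stdio.h>
#include <string.h>

/* Simplified, hypothetical analogue of the Heartbleed bug -- not the
   actual OpenSSL code. The "heartbeat" reply trusts the length field
   supplied by the client instead of the payload's real size. */

struct heartbeat {
    unsigned short claimed_len; /* length the client CLAIMS to have sent */
    char payload[16];           /* what the client actually sent */
};

void broken_reply(const struct heartbeat *req) {
    char reply[65536];
    /* BUG: copies claimed_len bytes, which may run far past payload[],
       leaking whatever secrets sit in adjacent server memory. */
    memcpy(reply, req->payload, req->claimed_len);
    fwrite(reply, 1, req->claimed_len, stdout);
}

void fixed_reply(const struct heartbeat *req) {
    char reply[65536];
    /* FIX: never trust the sender; bound the copy by the real size. */
    size_t len = req->claimed_len;
    if (len > sizeof(req->payload))
        len = sizeof(req->payload);
    memcpy(reply, req->payload, len);
    fwrite(reply, 1, len, stdout);
}

int main(void) {
    struct heartbeat req = { 16, "bird" }; /* claims 16, really sent 5 */
    fixed_reply(&req); /* safe version echoes at most the real payload */
    return 0;
}
```

A bounds check of one or two lines is all that separates the broken version from the fixed one, which is precisely why mistakes like this are so easy to make and so hard to spot.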
Apple’s flaw came about because some programmer inadvertently duplicated a line of code that, appropriately, read “goto fail.” The result was that under some conditions, iPhones and Macs would silently ignore errors that might occur when trying to ascertain the legitimacy of a website. With knowledge of this bug, an attacker could set up a wireless access point that might intercept Internet communications between iPhone users and their banks, silently steal usernames and passwords, and then reëncrypt the communications and send them on their merry way. This is called a “man-in-the-middle” attack, and it’s the very sort of thing that SSL/TLS was designed to prevent.

Remarkably, “goto fail” happened because of a feature of the C programming language that was known to be problematic before C was even invented! The “goto” statement makes a computer program jump from one place to another. Although such statements are common inside the computer’s machine code, computer scientists have tried for more than 40 years to avoid using “goto” statements in programs written in so-called high-level languages. Java (designed in the early 1990s) doesn’t have a “goto” statement, but C (designed in the early 1970s) does. Although the Apple programmer responsible for the “goto fail” problem could have made a similar mistake without using the “goto” statement, it would have been much less probable.

We know less about the third bug because the underlying source code, part of Microsoft’s Internet Explorer, hasn’t been released. What we do know is that it was a “use after free” error: the program tells the operating system that it is finished using a piece of memory, and then it goes ahead and uses that memory again. According to the security firm FireEye, which tracked down the bug after hackers started using it against high-value targets, the flaw had been in Internet Explorer since August 2001 and affected more than half of those who got on the Web through traditional PCs. The bug was so significant that the Department of Homeland Security took the unusual step of telling people to temporarily stop running Internet Explorer. (Microsoft released a patch for the bug on May 1.)

Automated inspectors

There will always be problems in anything designed or built by humans, of course. That’s why we have policies in the physical world to minimize the chance for errors to occur, and procedures designed to catch the mistakes that slip through. Home builders must follow building codes, which regulate which construction materials can be used and govern certain aspects of the building’s layout—for example, hallways must reach a minimum width, and fire exits are required. Building inspectors visit the site throughout construction to review the work and make sure that it meets the codes. Inspectors will make contractors open up walls if they’ve installed them before getting the work inside inspected.

The world of software development is completely different. It’s common for developers to choose the language they write in and the tools they use. Many developers design their own reliability tests and then run the tests themselves! Big companies can afford separate quality-assurance teams, but many small firms go without. Even in large companies, code that seems to work properly is frequently not tested for lurking security flaws, because manual testing by other humans is incredibly expensive—sometimes more expensive than writing the original software, given that testing can reveal problems the developers then have to fix.
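Both of those coding mistakes are small enough to fit on one screen. The hypothetical C sketch below reproduces the two patterns in miniature: a duplicated “goto fail” line that skips verification, and a “use after free.” These are illustrations of the bug classes, not Apple’s or Microsoft’s actual code.

```c
#include <stdio.h>
#include <stdlib.h>

/* Miniature, hypothetical versions of the two bug patterns discussed
   above -- illustrations of the classes, not the vendors' real code. */

int verify_signature(void) { return 0; } /* stand-in checks: 0 = OK */
int verify_hostname(void)  { return 0; }

int check_certificate(void) {
    int err;
    if ((err = verify_signature()) != 0)
        goto fail;
        goto fail; /* BUG: duplicated line always jumps to fail...      */
    if ((err = verify_hostname()) != 0)  /* ...so this check never runs */
        goto fail;                       /* and err still holds 0 ("OK") */
fail:
    return err; /* returns success even though the host was unverified */
}

void use_after_free(void) {
    char *buf = malloc(64);
    snprintf(buf, 64, "session token");
    free(buf);           /* memory handed back to the allocator...      */
    printf("%s\n", buf); /* BUG: ...then read again; an attacker may
                            meanwhile control what that memory holds */
}

int main(void) {
    printf("check_certificate() = %d\n", check_certificate());
    return 0;
}
```

Notably, today’s tools can flag both patterns automatically: compilers such as GCC and Clang warn about the misleading indentation of the duplicated goto, and runtime checkers like AddressSanitizer catch use-after-free errors. That is exactly the kind of automated inspection argued for below.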
Such flaws are sometimes called “technical debt,” since they are engineering costs borrowed against the future in the interest of shipping code now. The solution is to establish software building codes and enforce them with an army of unpaid inspectors. Crucially, those unpaid inspectors should not be people, or at least not only people.

Some advocates of open-source software subscribe to the “many eyes” theory of software development: that if a piece of code is looked at by enough people, the security vulnerabilities will be found. Unfortunately, Heartbleed shows the fallacy in this argument: though OpenSSL is one of the most widely used open-source security programs, it took paid security engineers at Google and the Finnish IT security firm Codenomicon to find the bug—and they didn’t find it until two years after the many eyes of the Internet first got access to the code.

Instead, this army of software building inspectors should be software development tools—the programs that developers use to create programs. These tools can needle, prod, and cajole programmers into doing the right thing.

This has happened before. Back in 1988, the primary infection vector for the world’s first Internet worm was another program written in C. It used a function called “gets()” that was common at the time but is inherently insecure. After the worm was unleashed, the engineers who maintained the core libraries of the Unix operating system (which now underlies Linux and Mac OS) modified the gets() function to make it print the message “Warning: this program uses gets(), which is unsafe.” Soon afterward, developers everywhere removed gets() from their programs.

The same sort of approach can be used to prevent future bugs. Today many software development tools can analyze programs and warn of stylistic sloppiness (such as the use of a “goto” statement), memory bugs (such as the “use after free” flaw), or code that doesn’t follow established good-programming standards. Often, though, such warnings are disabled by default because many of them can be merely annoying: they require that code be rewritten and cleaned up with no corresponding improvement in security. Other bug-finding tools aren’t even included in standard development tool sets but must be separately downloaded, installed, and run. As a result, many developers don’t even know about them, let alone use them.

To make the Internet safer, the most stringent checking will need to be enabled by default. This will cause programmers to write better code from the beginning. And because program-analysis tools work better with modern languages like C# and Java and less well with programs written in C, programmers should avoid starting new projects in C or C++—just as it is unwise to start construction projects using old-fashioned building materials and techniques.

Programmers are only human, and everybody makes mistakes. Software companies need to accept this fact and make bugs easier to prevent.

Simson L. Garfinkel is a contributing editor to MIT Technology Review and a professor of computer science at the Naval Postgraduate School.
Posted by Patrick Keller in Culture & society, Science & technology at 13:13
Defined tags for this entry: code, computing, culture & society, hack, language, science & technology, software