In the first part of the 19th century, our landscape was gradually populated by water towers. They arrived together with the advance of railways and steam trains, as well as with the delivery of water under pressure to households, offices, factories, etc. They played their part in the implementation of the first industrial revolution.
Will we now see our landscape progressively transformed by new types of energy constructions (i.e., above, a "duct turbine" designed to increase wind velocity, which we could also call a "wind tower"), and will these become the new landmarks of our (still to come) sustainable society? Could we combine this type of structure with some other program? With housing, or with data centers, or with other new "iconic structures" of our still early century? Should these types of structure also inhabit hurricanes and their usual paths, and collect huge amounts of energy?
What two crustaceans and a timepiece have to do with the future of medical electronics.
By Nidhi Subbaraman
In Evgeny Katz’s vision of the future, medical implants will use the human body as a battery. They’ll just run on the same juice that powers us human beings. His lab at Clarkson University has been building a biofuel cell, an energy harvester that has successfully drawn electrical energy from the glucose coursing through the bloodstreams of snails, clams and, now, lobsters.
Human medical implants powered by what we eat are a long way away, but in a new paper, Katz and his team demonstrate how their technology is maturing towards such a reality. That’s where the lobsters come in. Researchers from Clarkson University and the University of Vermont College of Medicine explain how they’ve powered a watch using glucose from two lobsters, connected as batteries would be, in series. They also show that it’s possible to keep a pacemaker ticking with glucose levels usually seen in the human body.
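The series wiring is doing real work here: like ordinary battery cells, biofuel cells connected in series add their voltages, which is how two weak harvesters reach a usable level. A minimal sketch, where the per-cell voltage is an illustrative assumption rather than a figure quoted from the paper:

```python
# Rough sketch of why the lobsters were wired in series: like battery
# cells, series connection adds voltages, while parallel connection
# adds currents. The per-cell figures are illustrative assumptions.

def stack_voltage(cell_voltages):
    """Open-circuit voltage of cells connected in series."""
    return sum(cell_voltages)

def stack_current(cell_currents):
    """Deliverable current of identical cells connected in parallel."""
    return sum(cell_currents)

# Assume each lobster's biofuel cell delivers ~0.5 V, a plausible
# order of magnitude for a single glucose biofuel cell.
lobsters = [0.5, 0.5]
print(stack_voltage(lobsters))  # 1.0 V: enough for a low-power watch
```

The same logic explains the pacemaker test: the question is whether physiological glucose concentrations can sustain that voltage under load.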
The key to this setup is an enzyme stationed at implanted electrodes made of carbon nanotubes. Together, the two efficiently convert chemical energy from glucose in an animal’s circulatory system to electricity.
In the past, these energy-harvesting biofuel cells have been tested in the ears of rabbits, in the abdomens of insects, and in the body cavities of snails and clams. But the lobsters are different: it’s the first time living organisms have powered up a piece of electronics.
With electrodes in their abdomens, the two lobsters powered the watch for an hour, until the lobsters’ glucose levels near the electrodes dropped. (They don’t feel any pain, a member of the team has explained, because they don’t have nerve endings where the electrodes were implanted.) The voltage then recovered, and the crustaceans powered the watch for as long as they remained alive in the lab.
People with pacemakers are ideal bio-battery candidates. As an early test of the idea, the team hooked up a pacemaker to an artificial setup resembling the human circulatory system. It contained serum spiked with glucose at different levels, to represent glucose levels in the blood immediately after you hit the gym, or while sitting at your desk at work, or if you’re diabetic. (Serum is blood with the proteins and cells filtered out.)
With its battery removed, the pacemaker became the first of its kind to run solely on glucose derived from body fluid, for five hours. It won’t be the last, though: Katz and co. have a list of other medical devices awaiting their turn.
Carbon cell: The all-carbon solar cell consists of a photoactive layer between two electrodes.
Using a grab bag of novel nanomaterials, researchers at Stanford University have built the first all-carbon solar cells. Their carbon photovoltaics don’t produce much electricity, but as the technology is perfected, all-carbon cells could be inexpensive, printable, flexible, and tough enough to withstand extreme environments and weather.
The goal is not to replace solar cells made from silicon and other inorganic materials, says Zhenan Bao, professor of chemical engineering at Stanford University, who led the work. Rather, it is to fill new niches. “Carbon is one of the most abundant elements on earth, and it is versatile,” Bao says.
Carbon is remarkably tough—atom-thick graphene and long, thin carbon nanotubes are two of the strongest materials ever tested. So carbon photovoltaics might be sprayed on the sides of buildings, or rolled up and taken into the desert. Various forms of carbon can be printed to make thin, flexible, transparent, and even stretchable electronics.
Thanks to its versatility, carbon in one form or another was used to make each solar-cell component. The three main parts—a nanotube cathode and a graphene anode sandwiching an active layer made of nanotubes and buckyballs—were all made by printing or evaporating from inks.
Making the cathode work was the trickiest part, says Bao—researchers have had a hard time making carbon nanomaterials that collect electrons. The Stanford researchers solved the problem by picking the right flavor of nanotubes and giving them a chemical treatment. This work is described in the journal ACS Nano.
The all-carbon photovoltaics convert less than 1 percent of the energy in light into electricity (by comparison, a silicon solar cell converts around 20 percent of light into electricity). However, Bao says that her group worked mostly with off-the-shelf materials, with just a bit of tuning. She attributes part of the problem to the roughness of the carbon films, which trips up traveling charges, and says it should be possible to smooth them out by working on the processing methods.
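To make the efficiency gap concrete: the collector area needed to deliver a given power scales inversely with conversion efficiency. A quick sketch, assuming the standard test irradiance of 1000 W/m²:

```python
# Collector area needed for a target output power, at a given
# conversion efficiency, under standard test irradiance.

STC_IRRADIANCE = 1000.0  # watts per square meter

def area_needed(power_w, efficiency):
    """Collector area (m^2) to deliver power_w at the given efficiency."""
    return power_w / (efficiency * STC_IRRADIANCE)

# 100 W of output: ~20x more area at 1% than at 20% efficiency.
print(area_needed(100, 0.01))  # 10.0 m^2 for the all-carbon cell
print(area_needed(100, 0.20))  # 0.5 m^2 for a silicon cell
```

This is why sub-1 percent cells only make sense in niches where cheapness, printability, or toughness matters more than footprint.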
Carbon nanomaterials “are still relatively new materials,” says Bao. “There’s a lot of research on how to control their properties and how to use them.”
IBM Yorktown researcher and 2011 MIT Technology Review young innovator Fengnian Xia, who is not involved in the work, agrees, saying that the solar cells need better-quality starting materials and processes. “The idea is great, and this is a good first demonstration, but it’s not ready for realistic applications,” he says.
Other groups are focused on making better carbon materials for the active layers of photovoltaics. According to theoretical calculations by Jeffrey Grossman at MIT, carbon solar cells should be able to reach 13 percent conversion efficiency.
For carbon solar cells to be commercially viable, says Shenqiang Ren, assistant professor of chemistry at the University of Kansas, their efficiency must cross 10 percent. Ren’s lab set the conversion-efficiency record for carbon solar cells (equipped with conventional metal electrodes) at 1.3 percent this September, in work that appeared in ACS Nano. That’s about how well the first polymer solar cells performed, he notes.
Ren is working with computational materials scientists, including Grossman, to design better carbon photovoltaics by picking the right kinds of carbon nanomaterials. With this guidance, Ren says, his lab has already made carbon solar cells that convert 5 percent of light energy into electricity, and he expects to go higher still.
Beyond the debate about the location of this solar farm (apparently in what should be a nature reserve), there are impressive pictures of the construction of the plant, in particular the one above. A really impressive structure, yet highly symmetrical and centralized. A sort of reverse power panopticon.
David Keith spoke at MIT Technology Review’s EmTech conference this week.
Geoengineering—using technology to purposefully change the climate—is the only option for reducing the risk of climate change from greenhouse-gas emissions in the next few decades, says David Keith, a professor of public policy and applied physics at Harvard University. And he says that if it’s done in moderation, it could be much safer than some experts have argued. In fact, says Keith, effective methods of geoengineering are so cheap and easy that just about any country could do it—for better or worse.
Keith, speaking this week at MIT Technology Review’s annual EmTech conference, says it is already too late to avoid climate changes by reducing carbon emissions alone. The carbon dioxide that’s been released into the atmosphere by burning fossil fuels is already likely to cause significant harm, such as raising temperatures enough to hurt crop yields in many places. “If you want to, say, really stop the loss of Arctic sea ice or stop heat-stress crop losses over the next few decades, geoengineering is pretty much the only thing you can do,” he says (see “Why Climate Scientists Support Geoengineering Research”).
Keith’s preferred method of geoengineering is to shade the earth by injecting sulfate particles into the upper atmosphere, imitating a similar process that happens with large volcanic eruptions, which are known to temporarily cool the planet. The technique could be effective even if far less sulfate were injected than is currently emitted by fossil-fuel power plants. A million tons per year injected into the stratosphere would be enough—whereas 50 million tons are injected into the lower part of the atmosphere by coal plants, he says. (In the lower atmosphere, the sulfates are less effective at cooling because they stay airborne for shorter periods.)
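A rough way to see why residence time matters as much as tonnage: the steady-state airborne burden is the injection rate times the particle residence time. The residence times below are commonly cited orders of magnitude (about a week in the lower atmosphere, about a year in the stratosphere), not figures from Keith's talk:

```python
# Steady-state airborne aerosol burden for a constant injection rate:
# burden = rate * residence time. Residence times are rough,
# commonly cited orders of magnitude, used only for illustration.

def steady_state_burden(injection_mt_per_year, residence_years):
    """Long-run airborne mass (megatons) for a constant injection rate."""
    return injection_mt_per_year * residence_years

stratosphere = steady_state_burden(1.0, 1.0)       # ~1 year aloft
troposphere = steady_state_burden(50.0, 1.0 / 52)  # ~1 week aloft
print(stratosphere, troposphere)  # comparable airborne burdens
```

Under these assumptions, one megaton per year in the stratosphere maintains roughly the same airborne mass as fifty megatons per year near the surface.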
One of the main objections to geoengineering is that the measures that might be taken to cool the planet won’t exactly offset the effects of carbon dioxide, so they could actually make things much worse—for example, by altering patterns of precipitation. Keith says recent climate models suggest that injecting sulfate particles into the upper reaches of the atmosphere might not affect precipitation nearly as much as others have warned.
“I propose that you start in about 2020, and you start very, very gradually increasing your amount of sulfate engineering so that you cut about in half the rate of warming,” he says. “Not eliminate it, but cut it about in half. Cutting it in half is a big benefit.”
One of the benefits could be increased crop production. Though some critics have worried that geoengineering would alter monsoon patterns that are key to agriculture in India, Keith says moderate geoengineering could actually boost crop productivity there by 20 percent, in part by reducing temperatures.
Keith and some of his colleagues recently hired engineers to estimate how well one approach to sulfate injection might work, and how much it might cost. It could be done at first with existing airplanes: certain business jets can fly high enough to inject the particles into the upper atmosphere. Eventually, new planes that can fly higher would be needed. All in all, once the procedure is scaled up, it would cost about a billion dollars a year and require about 100 aircraft. That’s cheap enough for most countries to pull off on their own.
The fact that it’s easy isn’t necessarily a good thing, Keith says. There’s the potential that if one country does it, another might blame that country—rightly or wrongly—for ensuing bad weather (see “The Geoengineering Gambit”).
And there are also real concerns about the impact sulfates might have on the atmosphere (see “Geoengineering May Be Necessary, Despite Its Perils”). It’s known that sulfates can be involved in reactions that deplete the ozone layer. As the earth warms, water vapor levels are increasing, which could exacerbate the problem. Keith is proposing a test to discover quantitatively just what the effect of the injections could be. He would introduce small clouds of sulfate and water vapor into the stratosphere using balloons, and then carefully measure the reactions that take place.
And Keith acknowledges a concern many have had about geoengineering: that using it to offset problems from climate change will reduce the incentive to tackle the greenhouse-gas emissions at the root of the problem. Even if geoengineering is employed, reducing emissions will still be important. Sulfate injection does nothing to address the ocean acidification associated with increased levels of carbon dioxide in the atmosphere. And if emissions continue to grow, ever-increasing amounts of sulfate will be needed.
But Keith thinks the potential benefits might be worth the dangers. “We don’t know enough yet to start,” he says. “But the current balance of evidence is that doing this really would reduce risks. And for that reason, we’ve got to take it seriously. It really would be reckless not to look at something that could reduce risk like this could.”
Indeed, largely because of their gargantuan energy requirements and high-tech secrets, Data Centers have been shrouded in mystery since their beginnings. This is particularly true in Google’s case. When Andrew Blum, author of Tubes: A Journey to the Center of the Internet, visited Google’s Data Center in The Dalles, Oregon, he said it was like “a prison,” and couldn’t even get past the cafeteria. Nary a peek has been seen of a Google Data Center.
Until now, that is. Google just launched a new website, Where the Internet Lives, which features never-before-seen images of eight of Google’s nine data centers, the places the “physical internet” calls home.
Google didn’t share with us why they’re choosing to go transparent now (we’re guessing Facebook’s decision to openly tout their Data Centers’ design & energy-efficiency might have something to do with it), but they did alert us to some of their centers’ more ecologically noteworthy features.
According to Google, their facilities, which must process 3 billion search queries a day and 72 hours of YouTube videos every minute, are “among the most energy efficient in the world,” “using half the energy of a typical data center.”
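"Half the energy" claims like this are usually framed in terms of PUE (power usage effectiveness): the ratio of total facility energy to the energy that actually reaches the IT equipment. A minimal sketch with illustrative figures, not numbers taken from Google's site:

```python
# PUE (power usage effectiveness): total facility energy divided by
# the energy delivered to IT equipment. 1.0 is ideal; everything
# above 1.0 is cooling, power conversion, and other overhead.
# Sample figures are typical published orders of magnitude.

def pue(total_facility_kwh, it_equipment_kwh):
    """PUE for a given accounting period."""
    return total_facility_kwh / it_equipment_kwh

typical = pue(200, 100)    # a conventional data center: PUE ~2.0
efficient = pue(112, 100)  # a best-in-class facility: PUE ~1.12
print(typical, efficient)  # overhead drops from 100% to 12%
```

A facility at PUE 2.0 spends as much on overhead as on computing; cutting total energy roughly in half for the same IT load is what a drop toward PUE 1.1 amounts to.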
Google’s Data Center in Hamina, Finland, for example, with an Alvar Aalto-designed machine hall, uses a cutting-edge cooling system, utilizing sea water from the Bay of Finland.
Scroll on for images of Hamina and other Google Data Centers – and see them all at Google’s new site: Where the Internet Lives.
Of course this news has been reblogged a lot recently; I know it, and I will just add to the pile of reblogs, a few days later, now that the news has spread (and has already started to be forgotten). But it is nonetheless interesting for us to keep it as a resource here on | rblg, as is Facebook's "Open Compute" data center architecture, as we plan to start soon a new set of projects and architectural experiments around the idea of "inhabiting the cloud". A good read on this subject is also the recent publication by CLOG about Data Space.
-
See also this video from Google (and access a data center through Street View).
Northrop Grumman is having no trouble adapting its speeders to the cold.
By Christopher Mims
Because moose aren't the only thing in Canada's north.
As it becomes increasingly clear that climate change and the race for new sources of oil and gas are going to turn Earth's poles into hotbeds of military contention, Northrop Grumman is responding by offering Canada a drone that can fly under even the harshest of conditions.
Dubbed the Polar Hawk, the aircraft is a modified version of the basic Block 30 airframe. […] To meet Canada's specific requirements, the aircraft's satellite communications system has been modified to cope with the spotty coverage found in the Arctic. The aircraft would also have wing de-icing and engine anti-icing capability.
The Polar Hawk can survey 40,000 square miles of territory a day, which means it would take only three of them to monitor all of Canada's northern reaches. Which is good, because one Hawk plus all its support infrastructure is $215 million.
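The "only three of them" figure is simple coverage arithmetic: divide the territory by what one aircraft can survey within the acceptable revisit interval. A sketch, where the territory size is a hypothetical figure chosen to be consistent with the article, not an official number:

```python
# Fleet sizing for area surveillance: how many aircraft are needed
# so the whole territory is resurveyed within the revisit interval.
import math

def fleet_size(territory_sq_mi, coverage_per_day_sq_mi, revisit_days):
    """Smallest number of aircraft that cover the territory at least
    once every revisit_days."""
    per_aircraft = coverage_per_day_sq_mi * revisit_days
    return math.ceil(territory_sq_mi / per_aircraft)

# ~120,000 sq mi surveyed daily at 40,000 sq mi per aircraft per day
# implies three Polar Hawks.
print(fleet_size(120_000, 40_000, 1))  # 3
```

The `math.ceil` matters: surveillance fleets round up, which is why a slightly larger territory or a tighter revisit requirement immediately buys a fourth $215 million aircraft.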
The Arctic is getting permanent monitoring... or rather, let's say surveillance in this case. It reminds me of the project we exhibited in 2010 on the Frioul archipelago, Arctic Opening (published in Bracket issue #2 [Goes Soft]), where we tried to pinpoint the changes that would occur in the Arctic.
With every passing project I feel like my basement is being converted from a living-only area to a work and project area. Computers being built, gadgets being taken apart, Lego projects all around. I’m not complaining by any means, but I do feel, as my basement becomes populated with more and more tech-based projects, that the environment is missing something organic, something natural to balance things out.
… at some point I started wanting to use the heat from a computer as a way to warm the soil and help with germination and growth. I’m about as far from a botanist as it comes, but I did some reading online and became pretty interested in the effects of soil temperature on germination and growth. I read different studies and papers from various universities. It was not long into that process that I became hooked on the idea of using computer heat as a way to control the soil temperature of some sort of living plant life.
Alexander Trevi (Pruned) presented, a couple of months ago, the idea of "gardens as crypto computers". What about, in addition, burying full data centers (and therefore "cloud" infrastructure) underground and using their generated heat to grow entire fields of crops, in the countryside (or in the city)? Even so, I've no idea whether ground heat has a positive or negative effect on crops...
BrightFarms CEO Paul Lightfoot is obsessed with efficiency. Having spent most of his career improving market supply chains, he has now turned his attention to the supply chains of America’s produce. BrightFarms is an innovative and straightforward program whose goals are to eliminate the energy wasted on travel between the farm and the shelf, to provide more nutritious and safer produce that is grown for the table rather than to endure days and weeks of transport, and to create a local market where consumers know their farmers, where their food comes from, and who is responsible for growing it. Lightfoot describes the blatant problem with the food industry today: it is efficient at factory-farming and preserving produce that moves from one end of the country to the other, and inefficient at providing nutritious and tasty produce.
The challenge is to create a model that ensures quality while keeping costs down, and BrightFarms appears to have found a strategy that works: hydroponic rooftop gardening near supermarket distribution centers or local markets. The newly renamed Federal Plaza #2 in Brooklyn, NY, soon to be known as Liberty View Industrial Plaza and to be developed by Salmar Properties, is set to host the world’s largest rooftop garden, which will reportedly grow “1 million pounds of local produce per year, including tomatoes, lettuces and herbs”. Find out how it works after the break!
BrightFarms’ business model seems simple – and too good to be true. The company is essentially a middleman – connecting experienced and reliable local farmers with accredited grocery stores – that finances, develops and builds each BrightFarms operation. BrightFarms ensures that both parties enter into individual agreements with the program: the grocery stores are obligated to purchase the output of the farms for a 10-year period, while the farmers must guarantee the volume and quality of that output. And of course, the key ingredient that distinguishes this operation from the country’s prevailing trends is the proximity of the farms, farmers and grocery stores. Community is essential.
Aside from providing goods that are fresher and more nutritious, BrightFarms’ hydroponic system also reduces carbon output dramatically. Hydroponic farming delivers nutrients to plants directly through water, without soil. These systems can be trays or columns made of PVC that expose the roots to the nutrient- and mineral-filled water. No soil means no land use and no heavy, gas-guzzling equipment. There is greater control of the nutrients, which means reduced waste, and because the water stays in the system and can be reused, agricultural runoff is greatly reduced. The setup also consolidates space, which makes maintenance and harvesting much easier.
The system is perfect for urban rooftop applications, which is why Liberty View Industrial Plaza is set to be the model for urban agriculture, covering the rooftop of an 8-story, 1.1 million-square-foot warehouse building along Brooklyn’s industrial waterfront in Sunset Park. The project will provide numerous benefits for the city: it will produce enough food to feed 5,000 New Yorkers, create an anticipated 1,300 permanent industrial jobs and 400 construction jobs, and keep 1.8 million gallons of storm water out of the over-burdened sewer system and the waterways. It is also part of Mayor Bloomberg’s initiative to revitalize Brooklyn’s waterfront – which is already underway at the Brooklyn Navy Yard, aka Navy Hill.
Everyone is optimistic that the project will not only bring fresh and healthy food and a revitalized attitude toward local farming, but will also push long-dormant industrial buildings into a new territory of sustainable development for cities. Follow this link to see other projects by BrightFarms.
While I definitely think that producing food closer to where it will be eaten is a necessary thing (but guess what? this was still how we produced and ate food, at least in my neighborhood, when I was a very young kid: my grandfather sold the surplus from his garden to the local shop, which means we need local shops again, as well as a different economic and consumption model; nor did we need to take a car to buy a few tomatoes, so to speak), I also question this whole idea of urban farming: how much energy does it really take to grow produce? Growing on rooftops and exposed platforms seems a good direction; building skyscrapers that need artificial lighting and air conditioning to produce food does not.
But it looks like there will be a "competition" over the use of rooftops in the near future. Will we use them to produce clean electricity (solar or by other means), locally? Should we use white rooftops to raise the overall albedo of our cities (the albedo of soil and green plants is low) and thereby artificially replace the solar reflection of disappearing glaciers and ice caps? Should we collect water from the roofs instead? Or should we use them to grow plants (and eventually capture CO2 locally too, along with particles of pollution that we'll then eat...)?
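The rooftop trade-off can be put in toy numbers as an area-weighted average albedo for the city's roofscape. The albedo values below are rough textbook orders of magnitude (fresh white coating ~0.8, vegetation ~0.2, dark roofing ~0.1), used purely for illustration:

```python
# Area-weighted average albedo of a roofscape: each (fraction, albedo)
# pair is a roofing strategy and the share of roof area it covers.
# Albedo values are rough textbook orders of magnitude.

def mean_albedo(fractions_and_albedos):
    """Area-weighted average albedo; area fractions must sum to 1."""
    assert abs(sum(f for f, _ in fractions_and_albedos) - 1.0) < 1e-9
    return sum(f * a for f, a in fractions_and_albedos)

dark_city = mean_albedo([(1.0, 0.1)])
mixed_city = mean_albedo([(0.5, 0.8), (0.3, 0.2), (0.2, 0.1)])
print(dark_city, mixed_city)  # 0.1 vs ~0.48
```

Even this crude model shows why the strategies compete: every square meter given to vegetation or solar panels is a square meter not reflecting sunlight, and vice versa.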
I have the feeling that we need a more general view (systems theory?) that helps take more parameters into consideration.
Maybe the solution will look like this: bioengineering new white algae that we can eat, that need little energy and water to grow under a minimal amount of natural light, and that help produce biofuel... (but that fuel will still emit CO2 when burned... damn...)
At present, demand for energy is growing faster than supply. According to the International Energy Agency, by 2030 the planet's needs will be hard to satisfy, across all types of energy combined. It will take a great deal of creativity to meet demand.
Vincent Schachter, director of research and development for new energies at Total, begins his talk on synthetic biology. "It's important to specify the framework we are working in." His researchers redesign the living. They strive to develop microscopic organisms, bacteria, capable of producing energy.
By combining engineering, chemistry, computer science and molecular biology, scientists are recreating life.
Demiurgic ambition
No scientific advance has embodied so many promises: turning bacteria into biological factories capable of producing cancer therapeutics, biofuels, or molecules able to degrade toxic substances.
In the Lamartine room of the Assemblée nationale on February 15, the audience of specialists invited by the Parliamentary Office for the Evaluation of Scientific and Technological Choices (OPECST) is silent. The public hearing, entitled "The stakes of synthetic biology", tackles this young discipline, already a strategic issue. Geneviève Fioraso, deputy for Isère, who organized it, confesses: "I have fellow parliamentarians at the Office who are biologists. They tell me they are sometimes out of their depth with what is presented. These are very complex questions from a scientific point of view."
The Office, whose mission is "to inform Parliament of the consequences of choices of a scientific and technological nature in order, in particular, to enlighten its decisions", is composed of parliamentarians, deputies and senators: eighteen elected members from each chamber, proportionally representing the political balance of Parliament. Assisted by an ad hoc scientific council, they take up contemporary scientific subjects: nuclear safety in France, the health effects of endocrine disruptors, the lessons to be drawn from the eruption of the Eyjafjöll volcano...
Marc Delcourt, CEO of the start-up Global Bioenergies, based in Evry, takes the floor:
Synthetic biology means creating biological objects. We work on transforming the metabolism of bacteria to make them produce, from sugars, a molecule that until now has come only from petroleum, and whose industrial applications are enormous.
Met a few days later, Philippe Marlière, the company's co-founder, "apologizes". He gives, for his part, a "rather philosophical" definition of synthetic biology: "For me, it is the discipline that aims to make biological species, or any biological object, that nature could not have made. Not 'that it has not made', but 'that it could not have made'. It has to be our own musings that change what happens in the living world."
This biochemist, trained at the École Normale Supérieure, openly assumes a demiurge's ambition: the point is to create life synthetically in order to supplant nature. He adds:
I am not a naturalist; I am not one of those who think nature is harmonious and good. On the contrary, synthetic biology posits nature as imperfect and proposes to improve it.
As provocative as that may sound, it is the stated objective, and one partly achieved, of the hundred or so researchers who have been devoting themselves to the discipline in France for the past 10 years. He continues: "However vast the diversity of genes on the surface of the earth, industry has already convinced itself that natural biodiversity will not be enough to provide all the processes it will need to produce drugs or biofuels more efficiently. We are going to have to roll up our sleeves and set about creating radically new biodiversity ourselves."
Biologist-engineer
Evolution on earth over the past three and a half billion years, as described by Darwin, is strictly contingent. Natural selection, writes Nobel laureate in medicine François Jacob in Le jeu des possibles, "operates like a tinkerer who does not yet know what he is going to produce, but picks up whatever falls into his hands, the most disparate objects, bits of string, pieces of wood, old cardboard that might possibly provide him with materials [...] From an old car wheel he makes a fan; from a broken table, a parasol. This kind of operation hardly differs from what evolution accomplishes when it produces a wing from a leg, or a piece of ear from a fragment of jaw."
The chance of natural evolution, combined with the necessity of adaptation, has sculpted a world "which is only one among many possible worlds. Its present structure results from the history of the earth. It could very well have been different. It could even not have existed at all." Philippe Marlière adds, laconically: "In hindsight we always have the impression that things could not have been otherwise, but that is false; the world could very well have existed without Beethoven."
Understand: evolution has neither goal nor plan. And science is on the verge of being able to put an end to evolution's ineffective tinkering. The biologist, here, is also an engineer. Starting from a set of specifications, he defines the structure of an organism to make it produce the molecule he needs. If synthetic biology is still in its infancy, it is also a cultural revolution.
It is now a matter of creating new species whose very existence is oriented toward the needs of humanity. "The limit not to be touched, for me, is human nature. I am a fierce opponent of transhumanism," the geneticist immediately warns.
A, T, G, C
Since Francis Crick, James Watson and Rosalind Franklin identified the existence of DNA, deoxyribonucleic acid, in 1953, a succession of discoveries has made it possible to modify this alphabet of life.
We now know how to read, replicate, and above all create a genome and its genes, either by replacing some of its parts or by synthesizing it entirely from a computer model. Genes are written with four nitrogenous bases, A, T, G and C, which follow one another along each of the two strands of DNA to form the famous double helix, that iconic representation of life. Four chemical molecules that encode life: A for adenine, T for thymine, G for guanine, and C for cytosine. Their arrangement determines the activity of the gene, the protein or proteins for which it codes, which it creates. The proteins, in turn, determine the action of cells within living organisms: producing blond hair, white blood cells, or biofuels.
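The gene-to-protein mapping described above can be sketched in a few lines: triplets of bases (codons) are read in order and looked up in the genetic code. The table below contains only a handful of real entries of the standard code, enough to translate a toy gene:

```python
# A tiny slice of the standard genetic code: codons (base triplets)
# map to amino acids; TAA is one of the stop codons.

CODON_TABLE = {
    "ATG": "Met",  # also the start codon
    "AAA": "Lys",
    "GGC": "Gly",
    "TGC": "Cys",
    "TAA": "STOP",
}

def translate(dna):
    """Translate a DNA coding sequence, three bases at a time."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino = CODON_TABLE[dna[i:i + 3]]
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("ATGAAAGGCTGCTAA"))  # ['Met', 'Lys', 'Gly', 'Cys']
```

Redesigning an organism, in these terms, means rewriting the input string (and, in Marlière's more radical program, the table itself).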
Today, in a few clicks, one can buy a nitrogenous base on the Internet for 30 cents. A medium-sized bacterial gene costs between €300 and €500; it is delivered to laboratories in small translucent plastic tubes. There it is integrated into a genome that will generate new proteins, in line with the needs of industry and the environment.
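The quoted prices also imply a gene length, which makes a nice sanity check: at roughly 30 cents per base, a €300 to €500 gene works out to about 1,000 to 1,700 bases, consistent with the roughly 1 kb length of a typical bacterial gene:

```python
# Implied gene length at the per-base synthesis price quoted in the
# article (30 cents per base, genes at 300-500 euros).

PRICE_PER_BASE_EUR = 0.30

def implied_length(gene_price_eur):
    """Number of bases a synthesis budget buys at the quoted rate."""
    return round(gene_price_eur / PRICE_PER_BASE_EUR)

print(implied_length(300))  # 1000 bases
print(implied_length(500))  # 1667 bases
```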
Human beings have become engineers of the living: they can transform simple unicellular organisms, yeasts or bacteria, into small factories under their control. It was the American bio-entrepreneur Craig Venter who took the discipline out of the laboratory by announcing, in June 2010, that he had created Mycoplasma mycoides, a totally artificial bacterium "made from four bottles of chemicals in a chemical synthesizer, from information stored on a computer".
While the creation was hailed by his peers and the media, some were keen to point out that his Mycoplasma mycoides was not created ex nihilo, since the modified genome was inserted into the envelope of a natural bacterium. But the manipulation was a major first.
Genetic Tower of Babel
Philippe Marlière has placed a small A5 notebook in front of him, in which, after letting his gaze drift, he takes a few notes. "We have been trying to change the living in depth for a long time. What interests me is the chemical side of the thing: where in the Mendeleev table do we need to dig to make living organisms? How far can they be deformed? To what extent can we launch them into parallel worlds on earth?" He glances at his Schweppes:
Take the example of heavy water. It is a water molecule that behaves practically like water, and living organisms can be forced to live and evolve in it. Yet there is no heavy water anywhere in the universe; only humans know how to concentrate it. We can create a completely artificial microcosm and be sure that the evolution taking place within it has not taken place in the universe. It is evolution under conditions that could not have occurred on earth, and that is interesting. Synthetic biology is a radical form of alter-globalism: it consists in saying that other lives really are possible, by changing them from top to bottom.
This is not feigned provocation; it is not even provocation. The man is keen to be properly understood. It is a matter of overcoming Darwinian evolution, pathetically stuck at a stage that can no longer meet the energy needs of the 10 billion humans to come. For that, life and its code must be rewritten. Innovate within the four-letter alphabet, A, C, G and T. Create a new biodiversity. One sine qua non condition: these worlds, ours, the natural one, and the new, artificial one, should coexist without ever being able to exchange information. He calls this the genetic Tower of Babel, in which crosses between species would be impossible.
"Ecologists often exaggerate, but they warn against the risks of genetic dissemination, and they are right. Crosses between species go very far. I recently read that the cat and the serval are interfertile." With his hand he indicates the height of the serval, a spotted feline close to the cheetah that lives in Africa. About a meter tall.
Besides, you had to be superstitious to imagine that GMO pollen would not disseminate. Pollen exists for genetic dissemination! Hence our project: to bring forth living lineages for which the probability of transmitting genetic information is zero.
The concept fits in one sentence:
"The farther, the safer: the more distant artificial life is from the life we know, the smaller the risk of genetic exchange between species. That is where the most patents and technological hegemony are up for grabs."
The point is to modify our four-letter alphabet, A, C, G and T, to create a new DNA, XNA, the key to "xenobiology":
X for xeno, "foreign", plus biology. The meaning of this alphabet would not be readable by living organisms; that is the world we want to make. It is like launching a Sputnik: it is difficult. But as Kennedy said, "We do not go to the moon because it is easy; we go because it is difficult."
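Marlière's "genetic Tower of Babel" can be caricatured in code: natural cellular machinery recognizes only the four canonical bases, so any sequence written with additional, xeno letters is simply unreadable to it. The letters X and K below are invented stand-ins for illustration, not real XNA chemistry:

```python
# Toy model of xenobiological containment: a sequence is "readable"
# by natural machinery only if every base belongs to the canonical
# alphabet. X and K are invented xeno letters, not real chemistry.

NATURAL_ALPHABET = set("ACGT")

def readable_by_natural_machinery(sequence):
    """True only if every base is one of the four canonical bases."""
    return set(sequence) <= NATURAL_ALPHABET

print(readable_by_natural_machinery("ATGGCA"))  # True
print(readable_by_natural_machinery("ATXGKA"))  # False: xeno bases
```

"The farther, the safer" amounts to pushing engineered lineages entirely into the second case, so that no natural organism can copy or express their genetic information.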
This blog is the survey website of fabric | ch - studio for architecture, interaction and research.
We curate and reblog articles, research, writings, exhibitions and projects that we notice and find interesting in our everyday practice and reading.
Most articles concern the intertwined fields of architecture, territory, art, interaction design, thinking and science. From time to time, we also publish documentation about our own work and research, immersed among these related resources and inspirations.
This website is used by fabric | ch as an archive and a collection of references and resources. It is shared with all those interested in the same topics as we are, in the hope that they too will find valuable references and content in it.