Thursday, January 27. 2011
Via Change Observer via Dan Hill
-----
By John Thackara
Gram junkies are those fanatical hikers and climbers who fret about every gram of weight that might be carried — in everything from titanium cook pans to toothbrush covers. Excess weight is not just an objective performance issue for these guys; they take it personally. In the matter of mobility and modern transportation, we all need to become gram junkies. We need to obsess not about speed, or about exotic power sources, but about the weight of every step taken, every vehicle used, every infrastructure investment contemplated.
Why? Because modern mobility damages the biosphere, our only home. Air, rail and road travel are greedy in their use of space, matter, energy, biodiversity and land. Designers around the world are busily developing a dazzling array of solutions to deal with these complex challenges. The website Newmobility.org, for example, has identified 177 different projects and approaches to sustainable mobility. These include bus rapid transit (BRT), car free days, demand-responsive transport (DRT), hitch-hiking, pedestrianization, smart parking strategies and van-pooling.
The trouble is that, once whole-system costs are calculated, every solution that assumes current or increased levels of transport intensity turns out not to be viable. Many transport strategies help solve one or two problems — but exacerbate others. The best-known example is the way that the expansion of highways reduces congestion for a time, but tends to increase total vehicle traffic. Another rebound effect: increased vehicle fuel efficiency conserves energy, but because it reduces vehicle operating costs, it tends to increase total vehicle travel. The growth of the US Interstate Highway System changed fundamental relationships between time, cost and space. These, in turn, enabled forms of economic development that have proved devastating to the biosphere.
Todd Litman, who runs the Victoria Transport Policy Institute in Canada, points out that depreciation, insurance, registration and residential parking are not directly affected by how much a vehicle is driven. Motorists maximize their vehicle travel to get their money’s worth from such expenditures and receive no incentives to drive less. Litman describes these market distortions as "economic traps" in which competition for resources creates conflicts between individual interests and the common good.
The effects of these economic traps are "cumulative and synergistic," in Litman's words: the total impact is greater than the sum of individual effects. These distortions skew countless travel decisions and contribute to a long-term cycle of automobile dependency.
Modern mobility kills people too — but without fuss. The total number of people killed on 9/11 was 2,819. And yes, that was a tragedy. But consider this: An average of 3,242 people die worldwide on the roads each day of the year, year after year. Children are especially vulnerable; traffic accident deaths account for 41 percent of all child deaths by injury.
Even if it doesn't kill you outright, modern mobility contributes to poor health. The highest rates of obesity correlate 1:1 with the proportion of journeys children take by car — and the costs of obesity are heading for 10 percent of US GDP. Increased auto dependency and air pollution also contribute to respiratory illnesses, cardiovascular disease and hospital admissions.
We squander time on mobility. We spend the same amount of time traveling today as we did 50 years ago — but we use that time to travel longer distances. The average German citizen today drives 15,000 km (9,320 miles) a year; in 1950, she covered just 2,000 km (1,242 miles). A lot of our travel time is commuting time and work-related travel that we believe we cannot avoid. We also spend a lot of time traveling in order to shop and to take the kids to distant schools.
Even if modern mobility were not a climate change or social problem, the fact that global mobility depends on a finite energy source — oil — means it is fundamentally not resilient. Whether oil and gas are at a peak, or on a plateau, increasing consumption means that the 9 million barrels of gasoline people currently use in the U.S. each day simply will not be available in the future. Ninety-five percent of all transportation depends on oil — and with that, food systems in all developed countries are vulnerable to any disruption in the prevailing logistical system.
To a car company, replacing the chrome wing mirror on an SUV with one made of carbon fiber is a step toward sustainable transportation. To a radical ecologist, all motorized movement is unsustainable. So when is transportation sustainable, and when is it not?
Motorized horse cart featured on the French website Traits en Savoie.
Meterus Horse Power, in France, is modernizing animal traction with an approach that combines technology, ecology, profitability and horse wellness. From Equishop website (Switzerland).
Chris Bradshaw, a transport economist, emphasizes that “light” transport systems are not, per se, sustainable — only less unsustainable than commuting by car. “Light rail supports far-flung suburbs; street cars support, well, street-car suburbs,” Bradshaw says. “A smaller, more efficient, or alternative-fuel vehicle is only less unsustainable than another private vehicle. It will still take up space on the road and in parking lots, it will still threaten the life and limb of others, it will still create noise, and it still will require lots of energy and resources to manufacture, transport to a dealer and dispose of when its life ends.”
Bradshaw wants planners and designers to respect what he calls “the scalar hierarchy.” This is when trips taken most frequently are short enough to be made by walking (even if pulling a small cart), while the next more frequent trips require a bike or street car and so on. “If one adheres to this, then there are so few trips to be made by car, that owning one is foolish.”
Investment in high-speed trains is another non-solution. Europeans believe that high-speed trains are environmentally far friendlier than aircraft — but they're not. When researchers at Martin Luther University studied the construction, use and disposal of the high-speed rail infrastructure in Germany, they found that 48 kilograms (about a hundred pounds) of solid primary resources are needed for one passenger to travel 100 kilometers.
China's proposed "straddling" bus would use existing infrastructure to greatly increase capacity along arterial routes.
Is one answer to go by banana boat? Not really. The world's merchant fleet contributes nearly 4.5 percent of all global emissions — a huge amount, up there with cars, housing, agriculture and industry. (Like aviation, shipping emissions are omitted from European targets for cutting global warming.)
Electric cars are the biggest distraction of all. The assumption in European and U.S. policy is that renewable energy-powered smart grids will power millions of electric or hybrid vehicles. Unfortunately, these technology-driven solutions are not viable once the economics of electrical grid modernization are factored in. The German branch of the World Wide Fund for Nature (WWF) published a study in May 2009 (conducted with IZES, a German institute for future energy systems). Electric cars only reduce greenhouse gases marginally, they found. The manufacturing processes of both the hybrid and the fully electric car require more energy than those of any conventional petrol-powered car. A worst-case (and frankly most likely) scenario is that most electric cars will be run on electricity from coal instead of from renewable sources.
The least talked about obstacle to electric transportation concerns the raw materials needed to manufacture the vehicles. Rare earth metals are key to global efforts to switch to cleaner energy and therefore transportation. But mining and processing the metals causes immense environmental damage. China's rare earth industry each year produces more than five times the amount of waste gas, including deadly fluorine and sulfur dioxide, as the total flared annually by all miners and oil refiners in the U.S. Alongside that 13 billion cubic meters of gas is 25 million tons of wastewater laced with cancer-causing heavy metals such as cadmium. And, just as we already have a problem with peak oil, a shortfall looks likely in the world's capacity to produce lithium. One rare metals expert, William Tahil, claims the production of hybrid and electric cars will soon tax the world's production of lithium carbonate.
Think More, Move Less
Politicians tend to dissemble or lie, or both, whenever the subject of transportation strategy crops up. Despite proof that transportation damages the biosphere, costs a fortune and kills people, the policy establishment is in thrall to the belief that transportation-enabled economic growth takes priority over all else. This false belief is based on inadequate ecological accounting, and the power of the myriad industries involved. Every aspect of the aviation industry, for example — airplane manufacturers, airlines, airports — is subsidized by direct grants and tax breaks. Remove these hidden subsidies and also charge aviation the true costs of its environmental impact and the whole enterprise becomes un-economic even on its own terms.
Politicians are also scared that no voter will tolerate a curtailment of air travel. The better way to put it is that no rich voter will do so. Only 5 percent of the world's population has ever flown. Aviation is overwhelmingly an activity of the rich, and strong measures to combat the environmental impact of aviation would not adversely affect poor people.
We once hoped that the internet would replace trips to the mall; that air travel would give way to teleconferencing; and that digital transmission would replace the physical delivery of books and videos. In the event, technology has indeed enabled some of these new kinds of mobility — but in addition to, not as replacements for, the old kinds. The internet has increased transport intensity in the economy as a whole. Rhetoric about a "weightless" economy, the "death of distance" and the "displacement of matter by mind" sounds ridiculous in retrospect.
Rather than tinker with symptoms — such as inventing hydrogen-powered vehicles, or turning gas stations into battery stations — the more interesting and pertinent design task is to re-think the way we use time and space and to reduce the movement of matter — whether goods or people — by changing the word "faster" to "closer."
Our transportation challenge can be compared to distributed computing. The speed-obsessed computer world, in which network designers rail against delays measured in milliseconds, is years ahead of the rest of us in rethinking space-time issues. It can teach us how to rethink relationships between place and time in the real world, too. Embedded on microchips, computer operations entail careful accounting for the speed of light. The problem geeks struggle constantly with is called latency — the delay caused by the time it takes for a remote request to be serviced, or for a message to travel between two processing nodes. Another key word, attenuation, describes the loss of transmitted signal strength as a result of interference — a weakening of the signal as it travels farther from its source — much as the taste of strawberries grown in Spain weakens as they are trucked to faraway places. The brick walls of latency and attenuation prompt computer designers to talk of a “light-speed crisis” in microprocessor design. The clever design solution to the light-speed crisis is to move processors closer to the data — in ecological terms, to re-localize the economy.
Network designers are good localizers. Striving to reduce geodesic distance, they have developed the so-called store-width paradigm or "cache and carry." They focus on copying, replicating and storing web pages as close as possible to their final destination, at content access points. Thus, if you go online to retrieve a large software update from an online file library, you are often given a choice of countries from which to download it. This technique is called "load balancing" — even though the loads in question, packets of information, don't actually weigh anything in real-world terms. Cache-and-carry companies maintain tens of thousands of such caches around the world.
By monitoring demand for each item downloaded and making more copies available in its caches when demand rises and fewer when demand falls, operators help to smooth out huge fluctuations in traffic. Other companies combine the cache-and-carry approach with smart file sharing, or "portable shared memory parallel programming." Users’ own computers, anywhere on the internet, are used as shared memory systems so that recently accessed content can be delivered quickly when needed to other users nearby on the network.
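The cache-and-carry idea above — watch demand per region, and replicate an item locally once demand justifies it — can be sketched in a few lines. This is a toy illustration only: the class, method names and threshold are all invented for the example, not any real CDN's API.

```python
# Toy sketch of "cache and carry": count requests per region, and once
# regional demand crosses a threshold, replicate the item into that
# region's cache so later requests are served locally ("closer, not faster").

class CacheNetwork:
    def __init__(self, origin_region, replicate_after=3):
        self.origin = origin_region
        self.replicate_after = replicate_after   # demand needed to justify a local copy
        self.caches = {origin_region: set()}     # region -> items held locally
        self.demand = {}                         # (region, item) -> request count

    def publish(self, item):
        """A new item starts out at the origin only."""
        self.caches[self.origin].add(item)

    def fetch(self, region, item):
        """Serve a request and return which region's cache served it."""
        count = self.demand.get((region, item), 0) + 1
        self.demand[(region, item)] = count
        local = self.caches.setdefault(region, set())
        if item in local:
            return region                        # local hit: shortest "distance"
        if count >= self.replicate_after:
            local.add(item)                      # demand now justifies a local copy
        return self.origin                       # this request still travels to origin
```

For example, repeated requests from one region are first served from the origin; after the threshold is reached, a local copy exists and subsequent requests never leave the region — the network analogue of relocalizing production.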
The Law of Locality
My favorite example of decentralized production concerns drinks. The weight of beer and other beverages, especially mineral water, trucked from one rich nation to another is a large component of the freight flood that threatens to overwhelm us. But first Coca-Cola and now a boom in microbreweries demonstrate a radically lighter approach: Export the recipe and sometimes the production equipment, but source raw material and distribute locally.
People and information want to be closer. When planning where to put capacity, network designers are guided by the law of locality; this law states that network traffic is at least 80 percent local, 95 percent continental and only 5 percent intercontinental. Communication network designers use another rule that we can learn from in the analogue world: “The less the space, the more the room.” In silicon, the trade-off between speed and heat generated improves dramatically as size diminishes: Small transistors run faster, cooler and cheaper. Hence the development of the so-called processor-in-memory (PIM) — an integrated circuit that contains both memory and logic on the same chip.
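The law of locality quoted above is, in practice, a capacity-planning rule: provision at least 80 percent of capacity for local traffic, 15 percent for continental, 5 percent for intercontinental. A minimal sketch, using the essay's own percentages (the function and its names are illustrative, not a real planning tool):

```python
# Split provisioned capacity according to the "law of locality":
# at least 80% of traffic is local, 95% cumulative is continental,
# and only 5% crosses continents.

def provision(total_traffic):
    return {
        "local": 0.80 * total_traffic,            # stays within the local network
        "continental": 0.15 * total_traffic,      # brings the cumulative share to 95%
        "intercontinental": 0.05 * total_traffic, # the long-haul remainder
    }
```

Read ecologically, the same split is the essay's argument in miniature: design systems so that the overwhelming majority of movement stays close to home.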
So, too, in the analogue world: radically decentralized architectures of production and distribution can radically reduce the material costs of production. We need to build systems that take advantage of the power of networks — but that do so in ways that optimize local-ness.
This design principle — “the less the space, the more the room” — is nowhere better demonstrated than in the human brain. The brain, in Edward O. Wilson's words, is “like one hundred billion squids linked together... an intricately-wired system of nerve cells, each a few millionths of a meter wide, that are connected to other nerve cells by hundreds of thousands of endings. Information transfer in brains is improved when neuron circuits, filling specialized functions, are placed together in clusters.”
Neurobiologists have discovered an extraordinary array of such functions: sensory relay stations, integrative centers, memory modules, emotional control centers, among others. The ideal brain case is spherical or close to it, Wilson observes, because a sphere has the smallest surface relative to volume of any geometric form. A sphere also allows more circuits to be placed close together; the average length of circuits can thus be minimized, which raises the speed of transmission while lowering the energy cost for their construction and maintenance.
Approached in the spirit of the gram junkie, the mobility dilemma is not as hard to solve as it looks, once one replaces the word "speed" with the word "weight." By changing the success measurement from faster to closer, it becomes possible to borrow from other domains, such as microprocessor design, network topography and the geodesy of the human brain. The biosphere itself is the result of 3.8 billion years of iterative, trial-and-error design — so we can safely assume it's an optimized solution. As Janine Benyus explains in her wonderful book Biomimicry: Innovation Inspired by Nature, biological communities, by and large, are localized or relatively closely connected in time and space. Their energy flux is low, distances covered are proximate. With the exception of a few high-flying species, in other words, "nature does not commute to work."
Personal comment:
A well-documented essay by John Thackara about the different forms of mobility and their footprints, with some consideration of "immobile mobility" — or rather "mediated mobility" — and an interesting proposal about a "law of locality" that takes its inspiration from network computing.
Wednesday, January 26. 2011
Via TreeHugger
-----
Image via TED
The rise of lending libraries, swapping sites, and product-as-a-service systems over the last 5 years or so has been impressive. We've seen an upswing in everything from clothing swap parties to local rental communities, to big services like Zipcar for getting around without having to own a car, and even AirBnB for renting spare bedrooms from locals rather than hotel rooms. Rachel Botsman is the co-author of the book What's Mine Is Yours: The Rise of Collaborative Consumption. She studies how we're switching to a culture of sharing, and how that will transform business, consumerism, and the meaning and impact of social networking in our lives. She took the time to answer a few questions from us about what's behind collaborative consumerism, and what we can expect over the next few years.
Photo by Jamiesrabbits via Flickr Creative Commons
Lending libraries, rental sites for stuff, and even car sharing is getting more popular these days. But what area of consumables have you seen the most growth in for sharing or swapping among community members?
Swapping sites for goods with limited value or that fulfill a temporary need, such as baby goods, books and DVDs, are growing at a staggering rate. Peer-to-peer space rental sites (homes, gardens, parking spaces, storage etc.) such as AirBnB, Landshare and Parkatmyhouse are exploding in mainstream popularity. Bike sharing is the fastest growing form of transportation in the world. Co-working spaces are popping up in the world's major cities. And I think 2011 is the year that we start to see skill or 'favor' share communities such as TaskRabbit, Skillshare and Hey Neighbor start to take off.
As collaborative consumerism becomes more practical and popular, how do you think it will shift our economy as a whole?
Big picture (and I am talking in 10-20 years time), I think we will see the way we measure 'wealth', 'growth' and 'happiness' being completely redefined. We are already seeing countries such as the UK, Canada and France looking at reinventing measures beyond GDP that give a picture of the holistic well-being of a nation. As Sarkozy commented, "So many things that are important to individuals are not included in GDP."
The way assets and income are taxed is going to be an interesting area as more people become "micro-entrepreneurs" earning money renting out their assets or bartering their skills. Peer-to-peer marketplaces essentially cut out a lot of middlemen but in the process create a whole array of cottage industries. Just think of Etsy. It's going to be interesting to see whether big brands and global businesses retain their appeal or whether small really is the next big thing.
Photo by Orin Zebest via Flickr Creative Commons
Some of the big environmental benefits we can see with a culture of sharing goods is reduced production of stuff, and definitely less waste. What are some of the lesser seen eco-benefits we might see?
In short, a) better utilization of assets b) products designed for longevity not obsolescence and c) mindset and behavior change.
All around us, we are surrounded by stuff that has what I call 'idling capacity', the untapped value of unused or underused assets. There are different kinds of idling capacity: products that are underutilized (e.g. the average car that sits parked for 23 hours a day); products that fulfill a temporary need (e.g. baby goods and clothes); or those that diminish in appeal and value after usage (e.g. a movie or a book). At the heart of Collaborative Consumption is how we can use the latest technologies to redistribute 'idling capacity' and maximize usage.
I could not think of a more exciting time to be a designer. Longevity does not just mean designing with durable materials but making goods with modularity that can be seamlessly updated, as well as easily broken down for future reuse, resale or repair. It will mean designing products that can be easily shared, customized and personalized by different users. If a designer had a blank sheet of paper and was designing a car for shared usage versus individual ownership how would it differ? How can we use RFID tags to embed stories, images, and videos into shared goods so they become smarter and more interesting than individually owned products? There are endless sustainable design opportunities...
When people start using different examples of Collaborative Consumption they frequently describe a 'mindset change.' There are examples like car sharing, where users think twice about whether they need to drive and thereby reduce their miles traveled by approximately 45%. And there are examples like peer-to-peer rental, where people are using platforms such as Neighborgoods or Snapgoods: 'owners' are realizing they can make money from renting out their assets peer-to-peer, and 'renters' are experiencing the benefits of not needing to own. Finally, you have examples like 'swap trading', where people suddenly realize they are surrounded by assets they can swap to get what they want versus buying new stuff. The behavior becomes addictive.
How far do you think we are from having collaborative consumerism be a mainstream way of using goods, and what are some of the steps we still need to take to get there?
We are just in the nascent stages of Collaborative Consumption. We have already seen examples like Netflix, eBay and Zipcar become household names but that has taken a decade - technology and consumer values were playing catch-up. But I think the current massive cultural and technological shift is accelerating the next wave of Collaborative Consumption at an astonishing rate.
I think it's critical for more big brands to enter the space. BMW, Daimler and Peugeot have all recently launched car sharing models. Amazon just announced its 'Buy Back' scheme of second-hand unwanted books. I would love to see a big bank enter the social lending space; for a retail giant like Target to launch an innovative rental model; for a brand like Zappos to create a shoe swapping and repair platform....
Big brands can reach scale faster, they prove there are real business models behind Collaborative Consumption (and there are), but they also create the social proof, the cultural cachet, for this new culture and economy to become mainstream.
More on Collaborative Consumerism
Meet Rachel Botsman and Roo Rogers Authors of What's Mine Is Yours
TED Talk: Systems of Sharing About to Revolutionize Consumerism
Monday, January 17. 2011
Via Next Nature
-----
Remember the beginning of 2010, when the Eyjafjallajökull volcano erupted, throwing huge amounts of ash into the air? Thousands of flights were canceled across Europe over fears that the ash could turn into molten glass inside a hot jet engine, crippling the aircraft. This picture was taken during and after the flight ban at the same location.
Personal comment:
Are planes in fact among the first unspoken geo-engineering devices? Cloud generators?
Thursday, January 13. 2011
Via MIT Technology Review
-----
Big computing providers are developing energy-saving strategies for new server farms.
By Cindy Waxer
When it came time for Hewlett-Packard to decide on a location for its new data center, the company could have considered variables like network connectivity, local talent, or proximity to corporate headquarters. Instead, a 100-year weather report convinced HP to build its new 360,000-square-foot facility in breezy Billingham, England.
Server farm: Yahoo's data center in Lockport, New York, was inspired by a chicken coop and lets air naturally vent through the top.
Credit: Yahoo
"You get a lot of cool and moist winds coming over the northeast coast of Britain," says Ian Brooks, HP's European head of sustainable computing. By harnessing these winds with massive fans, Brooks says, HP has created a system that uses 40 percent less energy than conventional methods of keeping data centers cool.
HP isn't the only company taking its cues from nature when it comes to the design and construction of data centers, clusters of server computers that run Internet services and store and crunch data. These facilities have been the smokestacks of the digital era because they use so much electricity: not only does it take a lot of power to run the machines themselves, but data centers are heavily air conditioned because servers generate a lot of heat and don't run well in environments much warmer than 25 ºC. As demand for online services skyrockets, the EPA predicts, U.S. data centers could nearly double their 2006 levels of energy consumption by 2011, reaching 100 billion kilowatt-hours per year—enough to power 10 million homes. By 2020, data centers will account for 18 percent of the world's carbon emissions, according to the Smart 2020 report released by the Climate Group, a nonprofit organization.
To reduce the environmental—and financial—burdens, more and more companies are trying innovative designs for data centers. For instance, at the HP center in Britain, known as Wynyard, fans more than two meters in diameter pull the North Sea winds into a mixing chamber, where they cool the warm air given off by the center's servers. That air is funneled into a large cavity beneath the servers, directed through vents in the floor, and then circulated throughout a series of aisles to chill the computers. The resulting warm exhaust is extracted, mixed with the incoming fresh air, and recirculated.
By eliminating the need for energy-intensive cooling equipment, the Wynyard facility cuts 12,500 metric tons of carbon dioxide from the total generated by the industry-standard data center. That is the equivalent of taking nearly 3,000 midsize vehicles off the road.
Another innovative data center is one that Yahoo opened in September 2010 in Lockport, New York. In this case, the inspiration came from chicken coops rather than coastal winds. "Chickens throw off a fair bit of heat; servers throw off a fair bit of heat," says Christina Page, Yahoo's director of climate and energy strategy. "So we built a long, tall, narrow building with a coop along the top to vent the air."
Drawn in: At Hewlett-Packard's data center in Billingham, England, large fans pull in fresh air.
Credit: HP
The 155,000-square-foot facility mimics the narrow design of a chicken coop and features louvers along the sides of the building so that prevailing winds can flow freely throughout the halls. On particularly hot days, the center can activate an evaporative cooling system, which uses less energy than traditional chillers. That means the facility uses at least 95 percent less water than a conventional data center, and 40 percent less energy—enough to power more than 9,000 households annually. What's more, with its preconstructed metal components, the chicken-coop structure can be assembled in less than six months.
"There's a good case to be made for the return on investment on a lot of green practices," says Page. "This data center was cheaper and faster to build, in addition to being more efficient on the operating-expenditure side."
The information-management company Iron Mountain, meanwhile, is taking advantage of natural geothermal conditions to slash energy consumption by locating a data center in a former limestone mine, 22 stories below ground. Iron Mountain's storage facility in Butler County, Pennsylvania, houses Room 48, whose racks of servers rely on the natural cooling properties of the limestone walls to remain at 13 ºC. Iron Mountain also developed a high-static air pressure differential cooling system that relies on high-velocity ducts, located in the cold aisles separating rows of servers, and linear return ducts in its hot aisles. The system creates winds that naturally cause cold air to sink and hot air to rise and exit the room through perforated ceiling tiles. The absence of air conditioners not only freed up about 30 percent more space in Room 48 but cut energy consumption for cooling by 10 to 15 percent compared with traditional data centers.
These are the kinds of unheralded changes that can really make a difference, says Mark Lafferty, director of strategic solutions at technology services provider CDW. "The really basic, non-glamorous, non-sexy stuff companies do can have a dramatic effect on the amount of resource consumption in a data center," he says.
Copyright Technology Review 2011.
Monday, January 03. 2011
Old post but one of the most requested SPACEINVADING projects on archinect.com
Via bldgblog
---
Remnants of the Biosphere by Biospheric Design
Location: Arizona
Image Credits: Noah Sheldon
Photographer Noah Sheldon got in touch the other week with a beautiful series of photos documenting the decrepit state of Biosphere 2, a semi-derelict bio-architectural experiment in the Arizona desert.
The largest sealed environment ever created, constructed at a cost of $200 million, and now falling somewhere between David Gissen's idea of subnature—wherein the slow power of vegetative life is unleashed "as a transgressive animated force against buildings"—and a bioclimatically inspired Dubai, Biosphere 2 even included its own one million-gallon artificial sea.
"The structure was billed as the first large habitat for humans that would live and breathe on its own, as cut off from the earth as a spaceship," the New York Times wrote back in 1992, but the project was a near-instant failure.
Scientists ridiculed it. Members of the support team resigned, charging publicly that the enterprise was awash in deception. And even some crew members living under the glass domes, gaunt after considerable loss of weight, tempers flaring, this winter threatened to mutiny if management did not repair a growing blot on the project's reputation.
The entire site was sold to private developers in 2007, leaving the buildings still functional and open for tours but falling apart.
Sheldon was originally inspired to visit and photograph the site after reading in the New York Times that "suburban sprawl" had come to surround the once-remote research site.
Indeed, we read, real estate development has "conquered vast swaths of the Sonoran Desert. The Biosphere, miles from nowhere when it was built in the 1980s, is now within the reach of a building boom streaking north from Tucson and south from Phoenix (and which some demographers say will eventually join the two cities, once 100 miles apart)." Traffic jams are not infrequent where there were once country roads, and new suburbs have sprung up within just a few miles of the research site.
Now, like something straight out of J.G. Ballard, the property might someday be home to a development called Biosphere Estates.
Sheldon's images, reproduced here with his permission, show the facility advancing into old age. A vast biological folly in the shadow of desert over-development, the project of Biosphere 2 seems particularly poignant in this unkempt state.
The fertile promise of the microcosm has been abandoned.
In this context, Biosphere 2 could perhaps be considered one of architect Francois Roche's "buildings that die," a term Roche used in a recent interview with Jeffrey Inaba. Indeed, in its current state Biosphere 2 is easily one of the ultimate candidates for Roche's idea of "corrupted biotopes"; the site's ongoing transformation into suburbia only makes this corruption more explicit.
Watching something originally built precisely as a simulation of the Earth — the "2" in "Biosphere 2" is meant to differentiate this place from the Earth itself, i.e. Biosphere 1 — slowly taken over by the very forces it was meant to model is philosophically extraordinary: the model taken over by the thing it represents. A replicant in its dying throes.