A Utah-based startup called Chamtech Operations claims that its Spray On Antenna Kit can turn any surface into a high-powered antenna.
As explained by Anthony Sutera in the video below of his presentation at Google's Solve for X event, “Our material uses thousands of nano-capacitors that we can spray paint on in the right pattern. All of these little capacitors charge and discharge extremely quickly in real time and they don't create any heat. When we hook up our material to a radio, the signal hops from capacitor to capacitor very quickly, finds its happy spot, and launches into space.”
Imagine painting wireless antennas on walls. Instead of planting them as fake trees or leaving them uncamouflaged, phone companies could blend their unsightly cell towers into the landscape. They could even commission artists like Haas & Hahn to make the city look as though it were fully draped in Joseph's Amazing Technicolor Dreamcoat.
Alternatively, they could enlist children from local elementary schools to reconfigure the city's electromagnetic infrascape in exchange for very plump donations. Does that mural on the once seamy but now chromatically resplendent underpass depict the neighborhood's tempestuous history and encouraging future, their parents will ask. Why yes, they'll answer back, and you can also bounce phone calls off of it.
Elsewhere, graffiti artists will independently fill in the dead zones. Or maybe they'll paint not to patch up the network, but as part of a collaboration with sound artists, creating a pop-up pirate radio station to broadcast an improvisational audio documentary of the local area. During times of protest, the antennas could be used to burst through whatever electromagnetic kettling the security forces might be using to prevent the crowds from organizing and reaching critical mass. If there's no graffiti antenna nearby, just whip out a Spray On Antenna Kit to reclaim the public spectrum.
In any case, I'd like to see those “nut jobs” at DEMILIT given some of the stuff, so we can see how they might use it in one of their sonic tours of military landscapes. Will they spray it on crumbling bunkers and derelict silos, repurposing them into antennas that transmit the acoustic ecology of war in real time?
What might avant-gardeners, who presumably already know how to harness energy from trees, do with the stuff if they find out that it can also be applied to trees, turning them into arboreal arrays?
Trees broadcasting themselves singing. An entire national park airing an epic botanical opera. The Amazon sending messages to exoplanets.
[The Placer County Courthouse, in Auburn, California -- imagine it swarmed by a glitch jam.]
NPR reported this morning (ed. note: yesterday morning, due to the repost) on a traffic jam in California caused by an algorithmic glitch “accidentally summon[ing] 1,200 people to jury duty on the same morning”. An excellent reminder of the tendency of algorithmic dysfunction to manifest as physical dysfunction, and (at a relatively small scale) of the potentially disproportionate impact of glitches when they are translated from dataspace into an infrastructural system. The glitch may be as simple as having accidentally swapped the 0 indicating “do not come in” for the 1 indicating “come in”, but the resulting jam is rendered in aluminum autobodies and on asphalt corridors, where it is much more difficult to clear than it was to create.
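The bug is trivial to caricature in code; a purely hypothetical sketch, since the county's actual system, field names, and logic are unknown:

```python
# Purely hypothetical sketch of the failure mode described above: one
# inverted flag summons the entire pool at once. The county's real
# system, field names, and logic are unknown.
jurors = [{"id": i, "report_today": 0} for i in range(1, 1201)]  # 0 = do not come in

def buggy_notify(pool):
    # The glitch: 0 and 1 swapped, so every "do not come in" becomes "come in".
    for juror in pool:
        juror["report_today"] = 1 - juror["report_today"]

buggy_notify(jurors)
print(sum(j["report_today"] for j in jurors), "people summoned on the same morning")  # 1200
```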
No other means of storing energy may be able to reach the scale required to run Germany on solar and wind power.
By Kevin Bullis
Kevin Casper - Wind Turbines (Public Domain Pictures)
If Germany is to meet its ambitious goals of getting a third of its electricity from renewable energy by 2020 and 80 percent by 2050, it must find a way to store huge quantities of electricity in order to make up for the intermittency of renewable energy.
Siemens says it has just the technology: electrolyzer plants, each the size of a large warehouse, that split water to make hydrogen gas. The hydrogen could be used when the wind isn't blowing to generate electricity in gas-fired power plants, or it could be used to fuel cars.
Producing hydrogen is an inefficient way to store energy—about two-thirds of the power is lost in the processes of making the hydrogen and using the hydrogen to generate electricity. But Siemens says it's the only storage option that can achieve the scale that's going to be needed in Germany.
Unlike conventional industrial electrolyzers, which need a fairly steady supply of power to efficiently split water, Siemens's new design is flexible enough to run on intermittent power from wind turbines. It's based on proton-exchange membrane technology similar to that used in fuel cells for cars, which can operate at widely different power levels. The electrolyzers can also temporarily operate at two to three times their rated power levels, which could be useful for accommodating surges in power on windy days.
Germany, which has led the world in installing solar capacity, isn't just concerned about climate change. Its leaders think that in the long term, renewable energy will be cheaper than fossil fuels, so it could give the country an economic advantage, says Miranda Schreurs, director of the Environmental Policy Research Center at the Freie Universität Berlin. Germany will serve as a test case to show whether industrialized countries can compete while relying on renewables.
Another reason Germany is turning to renewable energy is to meet its goal of reducing emissions of greenhouse gases by 40 percent by 2020, relative to 1990 levels, and by 80 percent by 2050. Some other countries have similarly ambitious carbon dioxide reduction goals, but Germany stands out because it's a large economy that depends on cheap electricity to make manufactured goods. It has decided not to use nuclear power as a source of steady, carbon-free electricity. And it can't rely heavily on natural gas, which emits about half as much carbon dioxide as coal. Natural gas is more expensive in Europe than in the United States, and it comes from countries such as Russia that aren't always reliable suppliers.
Keeping electricity costs low while transitioning toward renewable power will be difficult. Solar power is far more expensive than fossil-fuel power, especially in Germany, where skies are often cloudy. And although wind power is already nearly as cheap as fossil-fuel power—which is why Germany is starting to shift its policies to favor wind—like solar, it is intermittent: even some of the best-situated wind turbines generate electricity only a third of the time.
Ensuring reliable power supplies will therefore require installing high-voltage power lines to get renewable energy from places that happen to be sunny or windy to the places energy is needed. Germany is already struggling with limits to its ability to transmit its existing renewable energy supply, which accounts for about 20 percent of its electricity: according to Siemens, Germany throws away 20 percent of the power its wind turbines produce because it doesn't have enough transmission capacity.
Renewable energy will require very large-scale energy storage. The most affordable way to store electricity is to use it to pump water up a hill, and then let it flow down again to spin a turbine and generator when electricity is needed. But this only works in places where there are hills and dams, and most of Germany is flat.
The total amount of pumped-water storage in Germany now is about 40 gigawatt-hours—no more than renewable sources could generate in an hour on a sunny and windy day, says Michael Weinhold, Siemens Energy's chief technology officer. "They were not made for buffering hours or days, or even weeks, of volatility."
Right now, batteries are far too expensive—and not nearly enough are being made to accommodate the scale required. It would take the battery capacity of millions of electric vehicles to equal the existing pumped-water storage capacity.
Germany does, however, have the potential to store a vast amount of hydrogen, because it's possible to mix small amounts of hydrogen into existing natural gas pipelines and storage containers. These offer enough capacity to store about two weeks of current renewable energy production in Germany. Salt caverns, some of which are now used to store Germany's strategic oil reserve, could provide far more storage.
Siemens estimates that generating 85 percent of Germany's electricity using renewables will require 30,000 gigawatt-hours of storage. The hydrogen needed to supply that much electricity could be stored in a quarter of the space available in underground caverns. The hydrogen could be distributed initially through existing natural gas pipelines, and eventually through dedicated pipelines.
Siemens says its electrolyzers are about 60 percent efficient; 40 percent of the energy generated by a wind turbine would be lost making hydrogen gas. Then at least 40 percent of the energy in the hydrogen would be lost in generating electricity in gas-fired power plants or fuel cells. So only about a third of the original energy would be retained. But Weinhold says the system would make hydrogen from electricity that couldn't otherwise be used on the grid and therefore would be wasted without such a storage system.
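The arithmetic checks out; here is a minimal sketch using the article's own figures (only the two roughly 60 percent efficiencies and the storage totals come from the article, the rest is multiplication):

```python
# Minimal check of the round-trip efficiency described above:
# ~60% efficiency making hydrogen, then ~60% converting it back.
electrolyzer_eff = 0.60   # Siemens: electrolyzers are about 60 percent efficient
generation_eff = 0.60     # "at least 40 percent ... would be lost" generating power

round_trip = electrolyzer_eff * generation_eff
print(f"Round-trip efficiency: {round_trip:.0%}")  # 36%, i.e. about a third retained

# Scale: Siemens's 30,000 GWh storage estimate versus Germany's ~40 GWh
# of existing pumped-water storage.
needed_gwh, pumped_gwh = 30_000, 40
print(f"Gap: {needed_gwh // pumped_gwh}x existing pumped storage")  # 750x
```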
In addition to being inefficient, the system could be expensive. The high cost of fuel cells is a key reason they haven't been used widely in cars. But Weinhold says Siemens is working to bring down costs. Siemens is conducting pilot demonstrations of the technology this year, and it plans to sell two-megawatt systems by 2015 and to build systems as large as 250 megawatts by 2018. The largest plants could harness the power produced by about 100 wind turbines.
Copyright Technology Review 2012.
Personal comment:
A pretty detailed article by Kevin Bullis about what it means, in terms of infrastructure, to set up a sustainable energy society (in this case, in Germany), and about what it takes to store large amounts of renewable energy for relatively long periods of time.
We can wonder why they don't think more about a distributed system (one similar to what Jeremy Rifkin envisions in his recent book, The Third Industrial Revolution); this still looks like the old pattern of big top-down corporations or influential lobbies keeping their grasp on energy. Why not "store" all this excess energy in "the millions of electric cars"? Or in a distributed network of houses/buildings that would also produce their own energy in parallel?
“Secret Servers”, an article by James Bridle originally published in issue 099 of Icon magazine, looks at the relationship between architecture and the physical infrastructure of the internet. I found Bridle’s last few paragraphs particularly provocative:
“What is at stake is the way in which architects help to define and shape the image of the network to the general public. Datacenters are the outward embodiment of a huge range of public and private services, from banking to electronic voting, government bureaucracy to social networks. As such, they stand as a new form of civic architecture, at odds with their historical desire for anonymity.
Facebook’s largest facility is its new datacenter in Prineville, Oregon, tapping into the same cheap electricity which powers Google’s project in The Dalles. The social network of more than 600 million users is instantiated as a 307,000 square foot site currently employing over 1,000 construction workers—which will dwindle to just 35 jobs when operational. But in addition to the $110,000 a year Facebook has promised to local civic funds, and a franchise fee for power sold by the city, comes a new definition for datacenters and their workers, articulated by site manager Ken Patchett: “We’re the blue collar guys of the tech industry, and we’re really proud of that. This is a factory. It’s just a different kind of factory than you might be used to. It’s not a sawmill or a plywood mill, but it’s a factory nonetheless.”
This sentiment is echoed in McDonald’s description of “a new age industrial architecture”, of cities re-industrialised rather than trying to become “cultural cities”, a modern Milan emphasising the value of engineering and the craft and “making” inherent in information technology and digital real estate.
The role of the architect in the new digital real estate is to work at different levels, in McDonald’s words “from planning and building design right down to cultural integration with other activities.” The cloud, the network, the “new heavy industry”, is reshaping the physical landscape, from the reconfiguration of Lower Manhattan to provide low-latency access to the New York Stock Exchange, to the tangles of transatlantic fiber cables coming ashore at Widemouth Bay, an old smuggler’s haunt on the Cornish coast. A formerly stealth sector is coming out into the open, revealing a tension between historical discretion and corporate projection, and bringing with it the opportunity to define a new architectural vocabulary for the digitised world.”
Though Bridle does not make this link explicit in the article, the idea of a potential “new architectural vocabulary” is clearly related to the “New Aesthetic” that Bridle began talking about this past May. (I’ve always liked Matt Jones’s description of it as a “sensor vernacular”, and Robin Sloan’s “digital backwash aesthetic”. I’m not sure either of those captures exactly what Bridle’s been talking about — more like pieces of it — but they all dance around the same set of things, or at least similar sets.) Here’s Bridle’s original description, pinched together:
For so long we’ve stared up at space in wonder, but with cheap satellite imagery and cameras on kites and RC helicopters, we’re looking at the ground with new eyes, to see structures and infrastructures.
The map fragments, visible at different resolutions, accepting of differing hierarchies of objects.
Views of the landscape are superimposed on one another. Time itself dilates.
Representations of people and of technology begin to break down, to come apart not at the seams, but at the pixels.
The rough, pixelated, low-resolution edges of the screen are becoming in the world.
And when that — a new aesthetic vocabulary — gets linked to a “re-industrialization”, pulling together aesthetics, culture, economics, and politics, you’ve got a pretty significant project. I’d like to talk about this at more length later, but for now I will just quote from Dan Hill’s fantastic 14 Cities project. (Independent of the concerns in this post, the whole project is worth a read.) This is the fourth of the fourteen fictional future cities Hill describes, “Re-industrial City”:
“The advances in various light manufacturing technologies throughout the early part of the 21st century — rapid prototyping, 3D printing and various local clean energy sources — enabled a return of industry to the city. Noise, pollution and other externalities were so low as to be insignificant, and allied to the nascent interest in digitally-enabled craft at the turn of the century, by the early 2020s suburbs had become light industrial zones once again.
Waterloo, Alexandria and the Inner West of Sydney through to Pyrmont once again became a thriving manufacturing centre, albeit on a domestic scale, as people were able to ‘micro-manufacture’ products from their backyard, or send designs to mass-manufacture hubs supported by logistics networks of electric delivery vans and trains. Melbourne had led the way through its nurturing of production in the creative industries and its existing built fabric.
In an ironic twist, former warehouses and factories are being partially converted from apartments back into warehouses and factories. Yet the domestic scale of the technologies means they can coexist with living spaces, actually suggesting a return to the craftsman’s studio model of the Middle Ages. The ‘faber’ movement — faber, to make — spread through most Australian cities, with the ‘re-industrial city’ as the result, a genuinely mixed-use productive place — with an identity.”
["Bundled, Buried, and Behind Closed Doors", a documentary short by Ben Mendelsohn and Alex Cholas-Wood, looks at one of our favorite things -- the physical infrastructure of the internet -- and, in particular, the telco hotel at 60 Hudson Street. It's particularly fascinating to see how 60 Hudson Street exhibits the "tendency of communications infrastructure to retrofit pre-existing networks to suit the needs of new technologies": the building became a modern internet hub primarily because it was already a hub in earlier communications networks, permeated by pneumatic tubes, telegraph cables, and telephone lines, and thus easily suited to the running of fiber-optic cables. (This is important because it demonstrates the relative fixity of infrastructural geographies -- like the pattern of the cities they are embedded in, the positions of infrastructures tend to endure even as the infrastructures themselves decay and are replaced.)]
During the 1930s, the United Kingdom wanted to reinforce its defenses against the growing antagonism that would culminate in the Second World War. That effort produced radar (RAdio Detection And Ranging), but before that achievement, the British experimented with an architectural system on the Kent coast that would give early warning of approaching enemy planes and an idea of their direction. These monumental sound mirrors reflected sound into a microphone, and the direction of the enemy aircraft could be determined by finding which area of the mirror reflected the sound most strongly.
To learn more about them (they were eventually abandoned before the beginning of the Blitz), you can read the article Listening for the Enemy, written by Solveig Grothe for the German magazine Der Spiegel.
Thank you Carla.
Personal comment:
Built "architectures" to amplify and (in a way) deterritorialize sounds.
A 30-day Kickstarter campaign to raise funds for the continued development of + Pool is underway. From the creative minds at Family and PlayLab, + Pool is a collaboration to design a floating riverwater pool for everyone in the rivers of New York City. Beginning the next phase of the project, material testing and design, the online fundraising campaign aims to raise the initial $25,000 needed to begin physically testing filtration membranes, with the results determining the best membranes and methods for providing clean and safe riverwater for the public to swim in. A preliminary engineering feasibility report was conducted by Arup New York, which assessed the water quality, filtration, structural, mechanical and energy systems of + Pool.
Family and PlayLab launched a Kickstarter online fundraising campaign this month with the ultimate goal of generating enough support to prototype the filtration system by building a full-scale working mockup of one section of + Pool. Research, design, testing and development will continue through the year in conjunction with permitting, approvals and building partnerships with community, municipal, commercial and environmental organizations.
Donation levels for the Kickstarter campaign range from $1 to $10,000 with the hope that everyone interested in cleaner public waterways can get involved. Donors can choose from a variety of incentives and gear up for a day at the pool. For more information about the project and the campaign or to donate click here. Or write to info@pluspool.org.
Follow the break for more details about this project and the history of floating pools in New York City, which date back to the early 19th century.
+ Pool is the collaborative initiative of design studios Family and PlayLab to build a floating pool for everyone in the rivers of New York City. The project was launched with the ambition to improve the use of the city’s natural resources by providing a clean and safe way for the public to swim in New York’s waters.
As both a public amenity and an ecological prototype, + Pool is a small but exciting precedent for environmental urbanism in the 21st Century.
NYC + POOL
+ Pool is for you, for your friend, for your mom, for your dad, for your girlfriend, for your kids, for your boss, for your bartender, for your tamale guy, for your other girlfriend, for New York City, for everyone.
An offshore reflection of the city intersection, + Pool both exemplifies the dense, busy character of New York City and offers an island retreat from it.
Floating pools have paralleled the development of New York City dating back to the early 19th Century. When the city’s elite used lower Manhattan as a resort in the 1800s, floating spas were located just off the Battery. After the Civil War, the huge influx of immigrants required bathhouses in the Hudson and East Rivers, as many were without proper bathing facilities in their homes. In the early 1900s, improved plumbing infrastructure and increasing water-quality concerns closed the last of the river-borne pools, relocating aquatic leisure activities to more sanitized and inland sites.
In 1972, the Clean Water Act set forth the goal of making every body of water in the country safe for recreation, and in 2007 the Floating Pool Lady, a reclaimed barge now located in the Bronx, brought back the first semblance of New York’s floating pool culture in almost a century.
Today, as the appreciation for our city’s natural resources becomes increasingly crucial, a permanent floating pool in the river will help restore the water culture so integral to New York City.
+ Pool should be enjoyed by everyone, at all times, which is why it’s designed as four pools in one: Children’s Pool, Sports Pool, Lap Pool and Lounge Pool. Each pool can be used independently to cater to all types of swimmers, combined to form an Olympic-length lap pool, or opened completely into a 9,000 square foot pool for play.
WATER + POOL
The most important aspect of + Pool’s design is that it filters river water through the pool’s walls – like a giant strainer dropped into the river.
The concentric layers of filtration materials that make up the sides of the pool are designed to remove bacteria, contaminants and odors, leaving only safe and swimmable water that meets city, state and federal standards of quality.
PARK + POOL
Its universally recognizable shape and unusual offshore siting immediately position + Pool as an iconic piece of public infrastructure.
Whether as a complement to a thriving park or a catalyst for a growing one, the pool can serve as a destination for weekend visitors, an island haven for busy locals, and a symbol for the surrounding neighborhood.
After the launch of + Pool in the summer of 2010, Family and PlayLab began meeting with waterfront organizations, engineers, urban planners, environmental experts, public and private developers and community organizations to build a team to push the project forward. Likeminded institutions like The Metropolitan Waterfront Association, NYC Swim and the Department of Parks and Recreation have all been integral in shaping both the design and process of the pool itself.
The + Pool team has been working with renowned engineering firm Arup New York to study the filtration, structural, mechanical and energy systems of the pool as well as the water quality conditions and regulations necessary for the project. The team recently completed a preliminary engineering feasibility report in preparation for the material and methods testing phase.
Following the completion of the preliminary engineering report done in collaboration with Arup, the + Pool team is now moving into the material-testing phase, to determine the best filtration membranes and methods for providing clean and safe riverwater for the public to swim in.
Family and PlayLab launched a Kickstarter online fundraising campaign in June of 2011 with the ultimate goal of generating enough support to prototype the filtration system by building a full-scale working mockup of one section of + Pool.
Research, design, testing and development will continue through the year in conjunction with permitting, approvals and building partnerships with community, municipal, commercial and environmental organizations.
As I've said before, concentrated solar power (the "other" kind of solar—not photovoltaic panels) is a core component of any carbon-free energy future. This week, the United States is one giant step closer to plugging in the world's largest concentrated solar power plant—Brightsource's Ivanpah plant—which will pump out a massive 392 megawatts of clean solar energy in the Mojave Desert as soon as 2013. A rendering of the enormous project is above.
The project took a big leap from business plan to reality this week with two serious funding deals. First, Google announced its largest energy investment ever—placing a $168 million bet on Brightsource. The same day, the Department of Energy finalized a whopping $1.6 billion in loan guarantees for the project. "Today's announcement is creating over 1,000 jobs in California," said Energy Secretary Steven Chu, "while laying the foundation for thousands more clean energy jobs across the country in the future."
Not familiar with this core clean energy solution? Here's how I described it a couple years ago:
Whereas photovoltaic panels directly convert sunlight into an electric current, concentrated solar uses the sun's heat energy itself to generate power. [...] The intense heat boils the water, which creates steam. The steam spins a turbine, and—voila!—electricity is generated. Under optimum conditions, the plant can churn out 20 megawatts of juice, enough to power 10,000 homes.
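For a quick sanity check of the quoted figures, here's the implied average draw per home (assuming only the 20 megawatt and 10,000-home numbers above):

```python
# Implied per-home draw from the quoted figures: a 20 MW plant serving
# 10,000 homes. A sanity check, not a design number.
plant_output_w = 20e6
homes = 10_000
print(f"Average power per home: {plant_output_w / homes / 1e3:.1f} kW")  # 2.0 kW
```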
Here's an illustration from our graphics team from a few years back:
So just how big is the potential for concentrated solar? A recent study found that 1,000 square miles of the Mojave Desert devoted to CSP could produce enough energy to power the entire country. On a grander scale, less than one percent of the world's deserts could power the whole world, if transmission lines could accommodate the electricity. In other words, plants like Ivanpah can absolutely replace coal-burning power plants in sunny areas.
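That last claim can be roughly sanity-checked. In the back-of-envelope sketch below, the demand, power-density, and desert-area figures are my own assumptions, not numbers from the study:

```python
# Back-of-envelope check of the "less than one percent of the world's
# deserts" claim. Every input below is a rough assumption of mine,
# not a figure from the study cited above.
world_avg_demand_w = 2.3e12      # ~2.3 TW average global electricity demand (assumed)
csp_density_w_per_m2 = 20.0      # average CSP output per unit of land (assumed)
desert_area_m2 = 14e6 * 1e6      # ~14 million km^2 of hot desert (assumed)

required_area_m2 = world_avg_demand_w / csp_density_w_per_m2
fraction = required_area_m2 / desert_area_m2
print(f"Required area: {required_area_m2 / 1e6:,.0f} km^2")  # ~115,000 km^2
print(f"Fraction of hot deserts: {fraction:.1%}")            # ~0.8%
```

Under those assumptions the estimate does come out under one percent, at least to order of magnitude.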
Here's Brightsource's test site in Israel, where it's proving the technology.
I found it interesting that the Department of Energy and Google made these funding announcements on the same day. Such public-private cooperation seems to represent the best path forward for clean energy absent any federal legislation or mandate.
The social network breaks an unwritten rule by giving away plans to its new data center—an action it hopes will make the Web more efficient.
By Tom Simonite
The new data center, in Prineville, Oregon, covers 147,000 square feet and is one of the most energy-efficient computing warehouses ever built.
Credit: Jason Madera
Just weeks before switching on a massive, super-efficient data center in rural Oregon, Facebook is giving away the designs and specifications to the whole thing online. In doing so, the company is breaking a long-established unwritten rule for Web companies: don't share the secrets of your server-stuffed data warehouses.
Ironically, most of those secret servers rely heavily on open-source or free software, such as the Linux operating system and the Apache web server. Facebook's move—dubbed the Open Compute Project—aims to kick-start a similar trend with hardware.
"Mark [Zuckerberg] was able to start Facebook in his dorm room because PHP and Apache and other free and open-source software existed," says David Recordon, who helps coordinate Facebook's use of, and contribution to, open-source software. "We wanted to encourage that for hardware, and release enough information about our data center and servers that someone else could go and actually build them."
The attitude of other large technology firms couldn't be more different, says Ricardo Bianchini, who researches energy-efficient computing infrastructure at Rutgers University. "Typically, companies like Google or Microsoft won't tell you anything about their designs," he says. A more open approach could help the Web as a whole become more efficient, he says. "Opening up the building like this will help researchers a lot, and also other industry players," he says. "It's opening up new opportunities to share and collaborate."
The open hardware designs are for a new data center in Prineville, Oregon, that will be switched on later this month. The 147,000-square-foot building will increase Facebook's overall computing capacity by around half; the social network already processes some 100 million new photos every day, and its user base of over 500 million is growing fast.
The material being made available on a new website includes detailed specifications of the building's electrical and cooling systems, as well as the custom designs of the servers inside. Facebook is dubbing the approach "open" rather than open-source because its designs won't be subject to a true open-source legal license, which would require anyone modifying them to share any changes they make.
The plans reveal the fruits of Facebook's efforts to create one of the most energy-efficient data centers ever built. Unlike almost every other data center, Facebook's new building doesn't use chillers to cool the air flowing past the servers. Instead, air from the outside flows over foam pads moistened by water sprays to cool by evaporation. The building is carefully oriented so that prevailing winds direct outside air into the building in both winter and summer.
Facebook's engineers also created a novel electrical design that cuts the number of times that the electricity from the grid is run through a transformer to reduce its voltage en route to the servers inside. Most data centers use transformers to reduce the 480 volts from the nearest substation down to 208 volts, but Facebook's design skips that step. "We run 480 volts right up to the server," says Jay Park, Facebook's director of data-center engineering. "That eliminates the need for a transformer that wastes energy."
To make this possible, Park and colleagues created a new type of server power supply that takes 277 volts, which can be split off from the 480-volt supply without the need for a transformer. The 480 volts is delivered using a method known as "three-phase power": three wires carry three alternating currents with carefully offset timings. Splitting off one of those wires extracts a 277-volt supply.
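The 277-volt figure falls directly out of three-phase geometry: the line-to-neutral voltage equals the line-to-line voltage divided by the square root of three. A quick check:

```python
# Line-to-neutral voltage in a three-phase system: V_ln = V_ll / sqrt(3).
# Splitting a single phase off a 480 V three-phase feed yields ~277 V,
# which is why no step-down transformer is needed.
import math

v_line_to_line = 480.0
v_line_to_neutral = v_line_to_line / math.sqrt(3)
print(f"{v_line_to_neutral:.1f} V")  # 277.1 V
```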
Park and colleagues also came up with a new design for the backup batteries that keep servers running during power outages before backup generators kick in—a period of about 90 seconds. Instead of building one huge battery store in a dedicated room, many cabinet-sized battery packs are spread among the servers. This is more efficient because the batteries share electrical connections with the computers around them, eliminating the dedicated connections and transformers needed for one large store. Park calculates that his new electrical design wastes about 7 percent of the power fed into it, compared to around 23 percent for a more conventional design.
According to the standard measure of data-center efficiency—the power usage effectiveness (PUE) score—Facebook's tweaks have created one of the most efficient data centers ever. PUE is calculated by dividing a building's total power use by the power used by its computers; a perfect data center would score 1. "Our tests show that Prineville has a PUE of 1.07," says Park. Google, which invests heavily in data-center efficiency, reported an average PUE of 1.13 across all its locations for the last quarter of 2010 (when winter temperatures make data centers most efficient), with the most efficient scoring 1.1.
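For concreteness, here is the PUE calculation as a short sketch; only the ratios come from the article, and the kilowatt loads below are invented to match them:

```python
# PUE as defined above: total facility power divided by the power drawn
# by the IT equipment. The kilowatt loads are illustrative placeholders;
# only the resulting ratios are reported in the article.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

print(pue(10_700, 10_000))  # 1.07 -- Facebook's reported Prineville score
print(pue(11_300, 10_000))  # 1.13 -- Google's reported fleet-wide average
```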
Google and others will now be able to cherry pick elements from Facebook's designs, but that poses no threat to Facebook's real business, says Frank Frankovsky, the company's director of hardware design. "Facebook is successful because of the great social product, not [because] we can build low-cost infrastructure," he says. "There's no reason we shouldn't help others out with this."
Copyright Technology Review 2011.
Personal comment:
Will efficient and sustainable ways to organize architectural climate, as well as to use energy, become a by-product of data centers? Might be.
A recent paper published in the Physical Review has some astonishing suggestions for the geographic future of financial markets. Its authors, Alexander Wissner-Gross and Cameron Freer, discuss the spatial implications of speed-of-light trading. Trades now occur so rapidly, they explain, and in such fantastic quantity, that the speed of light itself presents limits to the efficiency of global computerized trading networks.
These limits are described as "light propagation delays."
[Image: Global map of "optimal intermediate locations between trading centers," based on the earth's geometry and the speed of light, by Alexander Wissner-Gross and Cameron Freer].
It is thus in traders' direct financial interest, they suggest, to install themselves at specific points on the Earth's surface—a kind of light-speed financial acupuncture—to take advantage both of the planet's geometry and of the networks along which trades are ordered and filled. They conclude that "the construction of relativistic statistical arbitrage trading nodes across the Earth’s surface" is thus economically justified, if not required.
Amazingly, their analysis—seen in the map, above—suggests that many of these financially strategic points are actually out in the middle of nowhere: hundreds of miles offshore in the Indian Ocean, for instance, on the shores of Antarctica, and scattered throughout the South Pacific (though, of course, most of Europe, Japan, and the U.S. Bos-Wash corridor also make the cut).
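To get a feel for the delays being arbitraged, here is a minimal sketch of the one-way light propagation delay between two exchanges over the great circle; the coordinates are approximate, and real fiber routes are both longer and slower than light in vacuum, so actual delays are higher:

```python
# Great-circle light propagation delay between two exchanges.
# Coordinates are approximate; real fiber is ~1.5x slower than vacuum
# light and never follows the geodesic exactly.
import math

R_EARTH = 6_371_000.0   # mean Earth radius, meters
C = 299_792_458.0       # speed of light in vacuum, m/s

def great_circle_m(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * R_EARTH * math.asin(math.sqrt(a))

# Approximate locations of the New York and London exchanges
d = great_circle_m(40.71, -74.01, 51.51, -0.09)
print(f"Distance: {d / 1e3:.0f} km")            # ~5,575 km
print(f"One-way delay: {d / C * 1e3:.1f} ms")   # ~18.6 ms at vacuum light speed
```

In the simplest symmetric reading of the paper, the optimal intermediate node for a pair of exchanges sits at the midpoint of this geodesic, where information from both sides arrives simultaneously.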
These nodes exist in what the authors refer to as "the past light cones" of distant trading centers—thus the paper's multiple references to relativity. Astonishingly, this seems to elide financial trading networks with the laws of physics, implying the eventual emergence of what we might call quantum financial products. Quantum derivatives! (This also seems to push us ever closer to the artificially intelligent financial instruments described in Charles Stross's novel Accelerando). Erwin Schrödinger meets the Dow.
It's financial science fiction: when the dollar value of a given product depends on its position in a planet's light-cone.
[Image: Diagrammatic explanation of a "light cone," courtesy of Wikipedia].
These points scattered along the earth's surface are described as "optimal intermediate locations between trading centers," each site "maximiz[ing] profit potential in a locally auditable manner."
Wissner-Gross and Freer then suggest that trading centers themselves could be moved to these nodal points: "we show that if such intermediate coordination nodes are themselves promoted to trading centers that can utilize local information, a novel econophysical effect arises wherein the propagation of security pricing information through a chain of such nodes is effectively slowed or stopped." An econophysical effect.
In the end, then, they more or less explicitly argue for the economic viability of building artificial islands and inhabitable seasteads—i.e. the "construction of relativistic statistical arbitrage trading nodes"—out in the middle of the ocean somewhere as a way to profit from speed-of-light trades. Imagine, for a moment, the New York Stock Exchange moving out into the mid-Atlantic, somewhere near the Azores, onto a series of New Babylon-like platforms, run not by human traders but by Watson-esque artificially intelligent supercomputers housed in waterproof tombs, all calculating money at the speed of light.
"In summary," the authors write, "we have demonstrated that light propagation delays present new opportunities for statistical arbitrage at the planetary scale, and have calculated a representative map of locations from which to coordinate such relativistic statistical arbitrage among the world’s major securities exchanges. We furthermore have shown that for chains of trading centers along geodesics, the propagation of tradable information is effectively slowed or stopped by such arbitrage."
Historically, technologies for transportation and communication have resulted in the consolidation of financial markets. For example, in the nineteenth century, more than 200 stock exchanges were formed in the United States, but most were eliminated as the telegraph spread. The growth of electronic markets has led to further consolidation in recent years. Although there are advantages to centralization for many types of transactions, we have described a type of arbitrage that is just beginning to become relevant, and for which the trend is, surprisingly, in the direction of decentralization. In fact, our calculations suggest that this type of arbitrage may already be technologically feasible for the most distant pairs of exchanges, and may soon be feasible at the fastest relevant time scales for closer pairs.
Our results are both scientifically relevant because they identify an econophysical mechanism by which the propagation of tradable information can be slowed or stopped, and technologically significant, because they motivate the construction of relativistic statistical arbitrage trading nodes across the Earth’s surface.
This blog is the survey website of fabric | ch - studio for architecture, interaction and research.
We curate and reblog articles, research, writings, exhibitions and projects that we notice and find interesting during our everyday practice and reading.
Most articles concern the intertwined fields of architecture, territory, art, interaction design, thinking and science. From time to time, we also publish documentation about our own work and research, immersed among these related resources and inspirations.
This website is used by fabric | ch as an archive of references and resources. It is shared with all those interested in the same topics as we are, in the hope that they too will find valuable references and content in it.