Tuesday, March 18. 2014
Airships and atmosphere | #airship #satellite
-----
Airships can patrol the upper atmosphere, monitoring the ground or peering at the stars for a fraction of the cost of satellites, according to a new report. All that’s needed is a prize to kick-start innovation.
The Naval Air Engineering Station in Lakehurst, New Jersey, must be one of the most famous airfields in the world. If you’ve ever watched the extraordinary footage of the German passenger airship Hindenburg catching fire as it attempted to moor, you’ll have seen Lakehurst. That’s where the disaster took place.

Despite its notorious past, Lakehurst is still a center of airship engineering and technology. In 2012, it was home to the Long Endurance Multi-Intelligence Vehicle, an airship designed and built for the U.S. military to use for surveillance purposes over Afghanistan. The vehicle is colossal—91 meters long, 34 meters wide, and 26 meters high, about the size of a 30-story office block lying on its side. And it is designed to fly uncrewed at about 10 kilometers for up to three weeks at a time. (Last year, the program was canceled and the airship sold back to the British contractor that built it, which now intends to fly it commercially.)

This ambitious program, and a few others like it mostly funded by the U.S. military, have attracted some jealous glances from scientists. The ability to fly at 20 kilometers or more for extended periods of time could be hugely useful. Fitted with cameras that scan the ground, sensors that monitor the atmosphere, or telescopes that point to the stars, these observatories could revolutionize the kind of data researchers are able to gather about the universe.

Today, Sarah Miller and a few pals have prepared a report for the Keck Institute for Space Studies in Pasadena suggesting that scientists have unnecessarily ignored the advantages of airships and that the time is right for a new era of science based on this capability.

The problem, of course, is that airships capable of these missions have not yet been built. Most of the well-funded development has come from the military for long-duration surveillance missions. But with the end of the wars in Iraq and Afghanistan and the downsizing of the U.S. military machine, this funding has dried up.

But Miller and co have a suggestion. They say that innovation in this area could be stimulated by setting up a prize for the development of a next-generation airship, just as the X Prize stimulated interest in reusable rocket flights. The goal, they say, should be to build a maneuverable, station-keeping airship that can stay aloft at an altitude of more than 20 km for at least 20 hours while carrying a science payload of at least 20 kg.

That’s a significant challenge. One problem will be carrying or generating the power required to propel the airship. This increases with the cube of its airspeed and so will be the biggest drain on the vehicle’s resources. Another challenge is to handle the thermal loads at this altitude, where temperatures can vary by as much as 50 °C and where there is little air to carry heat away.

But none of these problems look like showstoppers. Given the right kind of incentives, it should be possible to put one of these things in the air in the very near future, perhaps based on the technology developed for vehicles like the Long Endurance Multi-Intelligence Vehicle. All that’s needed is a sponsor willing to cough up a few million dollars for a prize. Anybody with a few bucks to spare?
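The cube law quoted above follows directly from the standard drag equation: drag force grows with the square of airspeed, and power is force times speed. A minimal back-of-the-envelope sketch in Python, assuming a drag coefficient and a frontal area for illustration (neither figure comes from the report):

```python
# Rough estimate of airship propulsion power at altitude.
# Drag force F = 0.5 * rho * Cd * A * v**2, so drag power P = F * v ~ v**3.
# Cd and A below are illustrative assumptions, not values from the report.

RHO_20KM = 0.089   # air density at ~20 km, kg/m^3 (US Standard Atmosphere)
CD = 0.03          # assumed drag coefficient for a streamlined hull
AREA = 694.0       # assumed frontal area, m^2 (ellipse ~34 m wide x 26 m high)

def drag_power_watts(airspeed_m_s: float) -> float:
    """Power needed to overcome aerodynamic drag at the given airspeed."""
    return 0.5 * RHO_20KM * CD * AREA * airspeed_m_s ** 3

for v in (5, 10, 20):  # candidate station-keeping airspeeds, m/s
    print(f"{v:>2} m/s -> {drag_power_watts(v) / 1000:.1f} kW")
```

Under these assumptions, doubling the airspeed from 10 to 20 m/s multiplies the required power by eight, which is why station-keeping against stratospheric winds dominates the vehicle’s energy budget.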
Ref: arxiv.org/abs/1402.6706 : AIRSHIPS : A New Horizon for Science.
Related Links:
Personal comment:
Given the now pressing need to monitor our human-transformed atmosphere (and not merely to provide universal Internet access through private companies, a wholly artificial need), the creation of such blimp-drones would be interesting. Yet I disagree entirely that this should be a private initiative. To my mind it is essential that it remain public: in the hands of the public and driven by public technology (including the monitored data).
Monday, March 17. 2014
Air public | #atmosphere #health #public
Via Le Monde, via Philippe Rahm Architectes ----- By Philippe Rahm
A few weeks before the municipal elections, the weather in Paris has never been so fine. The sun is shining, it is warm, and yet we are advised against going outdoors because air pollution is hitting record highs. Bad news for lunch on a terrace. It is rather paradoxical, this fine weather that is in reality not fine at all. It cannot be taken for granted, and in the future we will have to revise our criteria of the beautiful and the ugly: no longer trusting the perceptible (the sun, the temperature, the blue sky) but rather the invisible, and telling ourselves in the morning that the weather is fine only when the forecast announces a low level of fine particles in the air for the day.
The cloud of pollution over Paris, Thursday, March 13. | AP/Christophe Ena
But while the traditional weather report informed us of the state of the sky according to natural forces beyond our control, against which we could only decide whether or not to take an umbrella, urban pollution is a consequence of human activity. And because it concerns us all, because it defines the chemical reality of our streets and squares, because it threatens our health, it is eminently political. I would even argue that it is the fundamental raison d'être of politics: to guarantee good health for all.

Politics was born from the sanitary management of the city and from the definition of its public values, which today are inscribed in building codes and urban plans: natural light in every bedroom, drinkable water, the removal and treatment of waste and excrement. Beneath its cultural interpretation, the history of urbanism and of politics is ultimately that of a physiological conquest, for cities and for people, of well-being, comfort and good health. And breathing healthy air in the city? Could we not think that this is ultimately what we ask of politics today?

The demand is not new. At the beginning of the nineteenth century, Rambuteau, prefect of Paris, cut the street that bears his name through the heart of the Marais to let air circulate and prevent the confinement of germs. After him, prefect Haussmann laid out the boulevards with the same concern for hygiene, planted trees along them to temper them, and created parks (the Buttes-Chaumont, the Bois de Boulogne, etc.), as Olmsted did with Central Park in New York, conceived as green lungs to cool the city in summer, absorb dust and pollution, and improve air quality, because at the time people really did die of tuberculosis and other bacterial diseases in cities.

But all these sanitary measures lost their legitimacy with the discovery of penicillin and the spread of antibiotics from the 1950s onward. What was the point of razing the airless, dark little streets of the Middle Ages, of relocating housing into vast green parks, if disease could be driven away simply by swallowing an antibiotic twice a day for a week? Was it really reasonable to widen the small windows of old stone houses, to replace pitched roofs with roof terraces, if illness could in fact be avoided with a little penicillin?

If we stopped demolishing the old quarters of European cities from the 1970s onward, if we began to find charm in the winding alleys and narrow medieval houses, in the dark, damp interiors of city centers, if prices in the historic arrondissements that everyone had deserted until the 1970s began to climb, if heritage-protection measures were voted in, if these old stones became witnesses of our civilization and a touristic and economic asset, if we moved back into the old historic centers, we owe it perhaps as much to the postmodern theories of Bernard Huet, the architect of the Place Stalingrad and the Champs-Elysées in the 1980s, as to the medical discovery of antibiotics. But antibiotics can do nothing against today's fine-particle pollution.
Does this mean that we are about to witness the same phenomenon as in the first half of the twentieth century: a desertion of city centers, a loss of property value in the central districts of Paris, to the benefit of the suburbs and the countryside where the air is not polluted? Will the city we relearned to love and inhabit at the end of the twentieth century fall back into desolation? In a globalized world, one may want to believe that the mission of local politics today is to reduce unemployment or lower taxes. But more profoundly, politics must today take up its fundamental mission again: ensuring the quality of our public goods, offering us in the city, after water and light, air of good quality, the only guarantee of future social and economic prosperity.
Philippe Rahm is currently building a 70-hectare park for the city of Taichung, Taiwan, to be delivered in December 2015, which proposes to mitigate heat, humidity and air pollution through the use of vegetation and green technologies.
Philippe Rahm (architect; teaches at Princeton and Harvard Universities, United States)
Related Links:
Posted by Patrick Keller
in Architecture, Culture & society, Territory
at 08:48
Defined tags for this entry: architecture, atmosphere, culture & society, health, politics, public, territory, thinking, weather
Friday, March 07. 2014
Cosmic Web | #topology
Via VC Blog ----- By Manuel Lima
At the end of many of my talks, after going through a variety of compelling examples of network visualization, I wrap up with a bit of a quandary, asking the audience if there’s such a thing as a universal structure. This teaser usually comprises a side-by-side comparison between a mouse’s neuronal network and a simulation of the growth of cosmic structure and the formation of galaxies and quasars.
A common juxtaposition, shown during many of my lectures, between a neuronal network (left) and the vast cosmic structure (right).
As it turns out, this inquiry might not be as far-fetched as we might think. A few days ago, National Geographic posted an intriguing article titled “Astronomers Get First Glimpse of Cosmic Web,” where they report how scientists have for the first time captured a peek at the “vast, web-like network of diffuse gas that links all of the galaxies in the cosmos.” As stated in the article:

Leading cosmological theories suggest that galaxies are cocooned within gigantic, wispy filaments of gas. This “cosmic web” of gas-filled nebulas stretches between large, spacious voids that are tens of millions of light years wide. Like spiders, galaxies mostly appear to lie within the intersections of the long-sought webs.
From the original image caption in the article: Computer simulations suggest that matter in the universe is distributed in a “cosmic web” of filaments, as seen in the image above from a large-scale dark-matter simulation. The inset is a zoomed-in, high-resolution image of a smaller part of the cosmic web, 10 million light-years across, from a simulation that includes gas as well as dark matter. The intense radiation from a quasar can, like a flashlight, illuminate part of the surrounding cosmic web (highlighted in the image) and make a filament of gas glow, as was observed in the case of quasar UM287. Credit: Anatoly Klypin and Joel Primack, S. Cantalupo
This find is not just impressive and thought-provoking, but it could also become a major focus of the emerging fields of complex systems and network science.
Wednesday, March 05. 2014
The Mission to Decentralize the Internet | #network #decentralization
Following my previous reblogs about The Real Privacy Problem & Snowden's Leaks.
Via The New Yorker ----- Posted by Joshua Kopstein
In the nineteen-seventies, the Internet was a small, decentralized collective of computers. The personal-computer revolution that followed built upon that foundation, stoking optimism encapsulated by John Perry Barlow’s 1996 manifesto “A Declaration of the Independence of Cyberspace.” Barlow described a chaotic digital utopia, where “netizens” self-govern and the institutions of old hold no sway. “On behalf of the future, I ask you of the past to leave us alone,” he writes. “You are not welcome among us. You have no sovereignty where we gather.”
This is not the Internet we know today. Nearly two decades later, a staggering percentage of communications flow through a small set of corporations—and thus, under the profound influence of those companies and other institutions. Google, for instance, now comprises twenty-five per cent of all North American Internet traffic; an outage last August caused worldwide traffic to plummet by around forty per cent. Engineers anticipated this convergence. As early as 1967, one of the key architects of the system for exchanging small packets of data that gave birth to the Internet, Paul Baran, predicted the rise of a centralized “computer utility” that would offer computing much the same way that power companies provide electricity. Today, that model is largely embodied by the information empires of Amazon, Google, and other cloud-computing companies. Like Baran anticipated, they offer us convenience at the expense of privacy. Internet users now regularly submit to terms-of-service agreements that give companies license to share their personal data with other institutions, from advertisers to governments. In the U.S., the Electronic Communications Privacy Act, a law that predates the Web, allows law enforcement to obtain without a warrant private data that citizens entrust to third parties—including location data passively gathered from cell phones and the contents of e-mails that have either been opened or left unattended for a hundred and eighty days. As Edward Snowden’s leaks have shown, these vast troves of information allow intelligence agencies to focus on just a few key targets in order to monitor large portions of the world’s population. One of those leaks, reported by the Washington Post in late October (2013), revealed that the National Security Agency secretly wiretapped the connections between data centers owned by Google and Yahoo, allowing the agency to collect users’ data as it flowed across the companies’ networks. Google engineers bristled at the news, and responded by encrypting those connections to prevent future intrusions; Yahoo has said it plans to do so by next year. More recently, Microsoft announced it would do the same, as well as open “transparency centers” that will allow some of its software’s source code to be inspected for hidden back doors. (However, that privilege appears to only extend to “government customers.”) On Monday, eight major tech firms, many of them competitors, united to demand an overhaul of government transparency and surveillance laws. Still, an air of distrust surrounds the U.S. cloud industry. The N.S.A. collects data through formal arrangements with tech companies; ingests Web traffic as it enters and leaves the U.S.; and deliberately weakens cryptographic standards. A recently revealed document detailing the agency’s strategy specifically notes its mission to “influence the global commercial encryption market through commercial relationships” with companies developing and deploying security products. One solution, espoused by some programmers, is to make the Internet more like it used to be—less centralized and more distributed. Jacob Cook, a twenty-three-year-old student, is the brains behind ArkOS, a lightweight version of the free Linux operating system. It runs on the credit-card-sized Raspberry Pi, a thirty-five dollar microcomputer adored by teachers and tinkerers. It’s designed so that average users can create personal clouds to store data that they can access anywhere, without relying on a distant data center owned by Dropbox or Amazon. 
It’s sort of like buying and maintaining your own car to get around, rather than relying on privately owned taxis. Cook’s mission is to “make hosting a server as easy as using a desktop P.C. or a smartphone,” he said. Like other privacy advocates, Cook doesn’t aim to end surveillance, but to make it harder to do en masse. “When you couple a secure, self-hosted platform with properly implemented cryptography, you can make N.S.A.-style spying and network intrusion extremely difficult and expensive,” he told me in an e-mail.

Persuading consumers to ditch the convenience of the cloud has never been an easy sell, however. In 2010, a team of young programmers announced Diaspora, a privacy-centric social network, to challenge Facebook’s centralized dominance. A year later, Eben Moglen, a law professor and champion of the Free Software movement, proposed a similar solution called the Freedom Box. The device he envisioned was to be a small computer that plugs into your home network, hosting files, enabling secure communication, and connecting to other boxes when needed. It was considered a call to arms—you alone would control your data. But, while both projects met their fund-raising goals and drummed up a good deal of hype, neither came to fruition. Diaspora’s team fell into disarray after a disappointing beta launch, personal drama, and the appearance of new competitors such as Google+; apart from some privacy software released last year, Moglen’s Freedom Box has yet to materialize at all.

“There is a bigger problem with why so many of these efforts have failed” to achieve mass adoption, said Brennan Novak, a user-interface designer who works on privacy tools. The challenge, Novak said, is to make decentralized alternatives that are as secure, convenient, and seductive as a Google account. “It’s a tricky thing to pin down,” he told me in an encrypted online chat. “But I believe the problem exists somewhere between the barrier to entry (user-interface design, technical difficulty to set up, and over-all user experience) versus the perceived value of the tool, as seen by Joe Public and Joe Amateur Techie.”

One of Novak’s projects, Mailpile, is a crowd-funded e-mail application with built-in security tools that are normally too onerous for average people to set up and use—namely, Phil Zimmermann’s revolutionary but never widely adopted Pretty Good Privacy. “It’s a hard thing to explain…. A lot of people’s eyes glaze over,” he said. Instead, Mailpile is being designed in a way that gives users a sense of their level of privacy, without requiring them to know about encryption keys or other complicated technology. Just as important, the app will allow users to self-host their e-mail accounts on a machine they control, so it can run on platforms like ArkOS.

“There already exist deep and geeky communities in cryptology or self-hosting or free software, but the message is rarely aimed at non-technical people,” said Irina Bolychevsky, an organizer for Redecentralize.org, an advocacy group that provides support for projects that aim to make the Web less centralized.

Several of those projects have been inspired by Bitcoin, the math-based e-money created by the mysterious Satoshi Nakamoto. While the peer-to-peer technology that Bitcoin employs isn’t novel, many engineers consider its implementation an enormous technical achievement. The network’s “nodes”—users running the Bitcoin software on their computers—collectively check the integrity of other nodes to ensure that no one spends the same coins twice.
All transactions are published on a shared public ledger, called the “block chain,” and verified by “miners,” users whose powerful computers solve difficult math problems in exchange for freshly minted bitcoins (a toy sketch of this proof-of-work mechanism appears at the end of this post). The system’s elegance has led some to wonder: if money can be decentralized and, to some extent, anonymized, can’t the same model be applied to other things, like e-mail?

Bitmessage is an e-mail replacement proposed last year that has been called “the Bitcoin of online communication.” Instead of talking to a central mail server, Bitmessage distributes messages across a network of peers running the Bitmessage software. Unlike both Bitcoin and e-mail, Bitmessage “addresses” are cryptographically derived sequences that help encrypt a message’s contents automatically. That means that many parties help store and deliver the message, but only the intended recipient can read it. Another option obscures the sender’s identity; an alternate address sends the message on her behalf, similar to the anonymous “re-mailers” that arose from the cypherpunk movement of the nineteen-nineties.

Another ambitious project, Namecoin, is a P2P system almost identical to Bitcoin. But instead of currency, it functions as a decentralized replacement for the Internet’s Domain Name System. The D.N.S. is the essential “phone book” that translates a Web site’s typed address (www.newyorker.com) to the corresponding computer’s numerical I.P. address (192.168.1.1). The directory is decentralized by design, but it still has central points of authority: domain registrars, which buy and lease Web addresses to site owners, and the U.S.-based Internet Corporation for Assigned Names and Numbers, or I.C.A.N.N., which controls the distribution of domains. The infrastructure does allow for large-scale takedowns, as in 2010, when the Department of Justice tried to seize ten domains it believed to be hosting child pornography but accidentally took down eighty-four thousand innocent Web sites in the process.

Instead of centralized registrars, Namecoin uses cryptographic tokens similar to bitcoins to authenticate ownership of “.bit” domains. In theory, these domain names can’t be hijacked by criminals or blocked by governments; no one except the owner can surrender them.

Solutions like these follow a path different from Mailpile and ArkOS. Their peer-to-peer architecture holds the potential for greatly improved privacy and security on the Internet. But existing apart from commonly used protocols and standards can also preclude any possibility of widespread adoption. Still, Novak said, the transition to an Internet that relies more extensively on decentralized, P2P technology is “an absolutely essential development,” since it would make many attacks by malicious actors—criminals and intelligence agencies alike—impractical.

Though Snowden has raised the profile of privacy technology, it will be up to engineers and their allies to make that technology viable for the masses. “Decentralization must become a viable alternative,” said Cook, the ArkOS developer, “not just to give options to users that can self-host, but also to put pressure on the political and corporate institutions.” “Discussions about innovation, resilience, open protocols, data ownership and the numerous surrounding issues,” said Redecentralize’s Bolychevsky, “need to become mainstream if we want the Internet to stay free, democratic, and engaging.”

Illustration by Maximilian Bode.
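As a concrete illustration of the “difficult math problems” miners solve, here is a minimal, hypothetical proof-of-work sketch in Python. It is a toy version of the idea only, not the actual Bitcoin protocol: real mining applies SHA-256 twice to a structured block header and compares against a dynamically adjusted numeric target.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Search for a nonce such that SHA-256(block_data + nonce) starts with
    `difficulty` hex zeros. Each extra zero makes the search ~16x harder,
    which is why mining consumes serious compute."""
    nonce = 0
    prefix = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int) -> bool:
    """Verification costs a single hash: cheap for every node to check,
    expensive to forge."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# Toy "block": a made-up previous hash and one transaction.
block = "prev_hash=abc123;tx=alice->bob:5"
nonce = mine(block, difficulty=4)
print(nonce, verify(block, nonce, difficulty=4))  # some integer, True
```

The asymmetry is the point: finding a valid nonce takes many hash attempts, while any node can verify one with a single hash. Chaining each block to the hash of its predecessor is what makes rewriting the ledger, and thus double-spending, impractical.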
Posted by Patrick Keller
in Culture & society, Science & technology, Territory
at 08:46
Defined tags for this entry: culture & society, data, hardware, internet, opensource, operating system, science & technology, software, surveillance, territory, topology
fabric | rblg
This blog is the survey website of fabric | ch - studio for architecture, interaction and research. We curate and reblog articles, research, writings, exhibitions and projects that we notice and find interesting during our everyday practice and readings. Most articles concern the intertwined fields of architecture, territory, art, interaction design, thinking and science. From time to time, we also publish documentation about our own work and research, immersed among these related resources and inspirations. This website is used by fabric | ch as an archive and a collection of references and resources. It is shared with all those interested in the same topics as we are, in the hope that they will also find valuable references and content in it.