LED traffic lights are a great way for cities around the world to save on energy costs, but while they use 90 percent less energy than traditional incandescent bulbs, that efficiency becomes a liability when heavy snow strikes the way it did in many places around Europe and the US in the last couple of days.
Of course, global warming is probably the decade’s most debated topic and LED traffic lights are just a small piece of the puzzle, but the hazardous downside is that the aforementioned LEDs don’t burn hot enough to melt any snow that covers them. And word is that this has already led to numerous traffic jams, a few accidents and at least one death.
The problem has already surfaced in many places that have switched to LED traffic lights (which are, admittedly, much easier to see during the day), and authorities are working on a fix. Options include installing weather shields, coating the lights with water-repellent substances, or adding heating elements like those used on airport runways.
Funny how the biggest advantage of LED technology becomes its biggest drawback when it’s used in traffic lights. Let’s hope for a fix soon!
The best news I have heard this week was that the specs for 3D Blu-ray were finalized and that the PS3 would support 3D. That means with a simple firmware update PS3 owners will be able to enjoy 3D movies assuming they have a compatible TV. Sounds like it’s time for me to upgrade!
Sony also announced this week that it has teamed up with RealD to bring 3D technology to the home. You might recognize RealD as the company behind the 3D technology used in many of the high-end 3D films in theaters, such as Avatar. Odds are that if you watched a 3D movie in a theater lately, RealD supplied the tech behind it.
Sony will be licensing RealD tech for use in consumer products in 2010, including the stereoscopic 3D technology RealD uses in theaters. The tech will go into 3D Bravia LCD TVs and the 3D eyewear needed to view the content.
3D for film is gaining ground, but that doesn’t take away from the gimmicky side of the whole thing. The energy invested in this technology is also biased/boosted by the fact that it doubles as a way to fight illegal copying of films... is 3D being offered because it actually adds something, or is it being offered to prevent, or at least make much harder, the copying of the content?
The fact that all of this is still “reproduced” through a (flat) screen makes me think that we are still very far from a true volumetric reproduction of a recorded scene.
FEATURE, FIBER — BY Blaine Brownell on December 11, 2009 AT 9:00 AM
Mycobond is a mycological bio-composite that can be used in a wide variety of applications. Instead of conventional manufacturing processes, Mycobond uses mycelium—which is essentially the root system of a mushroom—to transform loose aggregates into strong composites. This process can be varied by using different species of fungus and mixtures of aggregates in order to make a composite with an optimal density, strength, appearance, and performance for the specific application.
Additionally, Mycobond represents a low-embodied-energy manufacturing process, as the material self-assembles at room temperature and pressure in the dark. Furthermore, Mycobond upcycles resources like rice hulls, cotton burrs, and buckwheat hulls that are otherwise thrown away, transforming them into valuable products, including rigid board insulation and protective packaging buffers.
Today, at the International Climate Change Conference (COP15) in Copenhagen, we demonstrated a new technology prototype that enables online, global-scale observation and measurement of changes in the earth's forests. We hope this technology will help stop the destruction of the world's rapidly-disappearing forests. Emissions from tropical deforestation are comparable to the emissions of all of the European Union, and are greater than those of all cars, trucks, planes, ships and trains worldwide. According to the Stern Review, protecting the world's standing forests is a highly cost-effective way to cut carbon emissions and mitigate climate change.

The United Nations has proposed a framework known as REDD (Reducing Emissions from Deforestation and Forest Degradation in Developing Countries) that would provide financial incentives to rainforest nations to protect their forests, in an effort to make forests worth "more alive than dead." Implementing a global REDD system will require that each nation have the ability to accurately monitor and report the state of its forests over time, in a manner that is independently verifiable. However, many tropical nations lack the technological resources to do this, so we're working with scientists, governments and non-profits to change that. Here's what we've done with this prototype to help nations monitor their forests:
Start with satellite imagery
Satellite imagery data can provide the foundation for measurement and monitoring of the world's forests. For example, in Google Earth today, you can fly to Rondonia, Brazil and easily observe the advancement of deforestation over time, from 1975 to 2001:
(Landsat images courtesy USGS)
This type of imagery data — past, present and future — is available all over the globe. Even so, while today you can view deforestation in Google Earth, until now there hasn't been a way to measure it.
Then add science
With this technology, it's now possible for scientists to analyze raw satellite imagery data and extract meaningful information about the world's forests, such as locations and measurements of deforestation or even regeneration of a forest. In developing this prototype, we've collaborated with Greg Asner of the Carnegie Institution for Science, and Carlos Souza of Imazon. Greg and Carlos are both at the cutting edge of forest science and have developed software that creates forest cover and deforestation maps from satellite imagery. Organizations across Latin America use Greg's program, Carnegie Landsat Analysis System (CLASlite), and Carlos' program, Sistema de Alerta de Desmatamento (SAD), to analyze forest cover change. However, widespread use of this analysis has been hampered by lack of access to satellite imagery data and computational resources for processing.
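To give a rough feel for the kind of per-pixel analysis such tools perform — this is a minimal illustrative sketch, not CLASlite's or SAD's actual algorithms, and the thresholds and band values here are assumptions — one common approach is to compute a vegetation index (NDVI) from a satellite image's red and near-infrared bands at two dates, then flag pixels that were forested at the first date but lost vegetation by the second:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red and near-infrared bands."""
    return (nir - red) / (nir + red + 1e-9)  # small epsilon avoids division by zero

def deforestation_mask(red_t0, nir_t0, red_t1, nir_t1,
                       forest_thresh=0.6, loss_thresh=0.25):
    """Flag pixels that looked like forest at t0 but lost significant
    vegetation by t1. Thresholds are illustrative, not calibrated values."""
    v0 = ndvi(red_t0, nir_t0)
    v1 = ndvi(red_t1, nir_t1)
    was_forest = v0 > forest_thresh   # densely vegetated at the first date
    lost = (v0 - v1) > loss_thresh    # large drop in vegetation index since then
    return was_forest & lost

# Two-pixel toy scene: pixel 0 is cleared between dates, pixel 1 is unchanged.
red_t0 = np.array([[0.1, 0.1]]); nir_t0 = np.array([[0.8, 0.8]])
red_t1 = np.array([[0.4, 0.1]]); nir_t1 = np.array([[0.5, 0.8]])
print(deforestation_mask(red_t0, nir_t0, red_t1, nir_t1))  # [[ True False]]
```

Running this kind of computation over petabytes of imagery is exactly the part that, as described below, benefits from being moved into the cloud.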
Handle computation in the cloud
What if we could offer scientists and tropical nations access to a high-performance satellite imagery-processing engine running online, in the “Google cloud”? And what if we could gather together all of the earth’s raw satellite imagery data — petabytes of historical, present and future data — and make it easily available on this platform? We decided to find out, by working with Greg and Carlos to re-implement their software online, on top of a prototype platform we've built that gives them easy access to terabytes of satellite imagery and thousands of computers in our data centers.
Here are the results of running CLASlite on the satellite imagery sequence shown above:
CLASlite online: This shows deforestation and degradation in Rondonia, Brazil from 1986 to 2008, with red indicating recent activity
Here's the result of running SAD in a region of recent deforestation pressure in Mato Grosso, Brazil:
SAD online: The red "hotspots" indicate deforestation that has happened within the last 30 days
Combining science with massive data and technology resources in this way offers the following advantages:
Unprecedented speed: On a top-of-the-line desktop computer, it can take days or weeks to analyze deforestation across the Amazon. Using our cloud-based computing power, we can reduce that time to seconds. Detecting illegal logging activities faster can help support local law enforcement and prevent further deforestation.
Ease of use and lower costs: An online platform that offers easy access to data, scientific algorithms and computation horsepower from any web browser can dramatically lower the cost and complexity for tropical nations to monitor their forests.
Security, privacy and transparency: Governments and researchers don't want to share sensitive data and results before they are ready. Our cloud-based platform allows users to control access to their data and results. At the same time, because the data, analysis and results reside online, they can also be easily shared, made available for collaboration, presented to the public and independently verified — when appropriate.
Climate change impact: We think that a suitably scaled-up and enhanced version of this platform could be a promising tool for forest monitoring, reporting and verification (MRV) in support of efforts such as REDD.
As a Google.org product, this technology will be provided to the world as a not-for-profit service. This technology prototype is currently available to a small set of partners for testing purposes — it's not yet available to the general public, but we expect to make it more broadly available over the next year. We are grateful to a host of individuals and organizations (find full list here) who have advised us on developing this technology. In particular, we would like to thank the Gordon and Betty Moore Foundation for their close partnership since the inception of this project. We're also working with the Group on Earth Observations (GEO), a consortium of national government bodies, inter-governmental organizations, space agencies and research institutions, through GEO's Forest Carbon Tracking (FCT) task force. Last month we launched the GEO FCT portal together, and we are now exploring how we can jointly bring the power of this new technology to tropical nations.
We're excited to be able to share this early prototype and look forward to seeing what's possible.
Posted by Rebecca Moore, Engineering Manager, Google.org and Dr. Amy Luers, Environment Manager, Google.org
Google is breaking news left and right today. Of course the big news is that real-time search is live, but the release of Google Goggles can’t be overlooked. The brand-new addition to Google Labs is an experimental application for Android devices that supports visual search.
How does it work? Just open the app, snap a photo and voilà: Google will process the image and return search results. The photo search functionality eliminates the need to type or say anything on your mobile device, and it adds context to your real-world surroundings.
While the technology is pretty remarkable, Google admits that it is still in its infancy. So while some image searches work brilliantly — think photos of books, business cards, artwork, places, logos and landmarks — don’t be too disappointed if your image searches for food, animals, plants and cars are less than stellar.
Still, the application should prove useful, and we hope to see versions of it made available for other smartphone users as well. For now, though, iPhone users can turn to a number of different augmented reality applications for camera-enabled search functionality.
Watch the video below for a demonstration of Google Goggles.
More about Google today: yet another example of Google starting to mix the real and virtual worlds, pushing the mediatization of space a bit further. They call it Google Goggles.
This blog is the survey website of fabric | ch - studio for architecture, interaction and research.
We curate and reblog articles, research, writings, exhibitions and projects that we notice and find interesting in our everyday practice and reading.
Most articles concern the intertwined fields of architecture, territory, art, interaction design, thinking and science. From time to time, we also publish documentation about our own work and research, immersed among these related resources and inspirations.
This website serves fabric | ch as an archive and a pool of references and resources. We share it with everyone interested in the same topics, in the hope that they too will find valuable references and content here.