Tuesday, August 02. 2016
By fabric | ch
Since this blog still lacks a decent search engine and we don't use a "tag cloud"... this post may help you navigate the updated content on | rblg (as of 07.2016), via all of its tags!
HERE ARE ALL THE CURRENT TAGS TO NAVIGATE ON | RBLG BLOG:
(to be seen just below if you're navigating on the blog's page or here for rss readers)
Posted by Patrick Keller in fabric | ch at 16:58
Defined tags for this entry: 3d, activism, advertising, agriculture, air, animation, applications, archeology, architects, architecture, art, art direction, artificial reality, artists, atmosphere, automation, behaviour, bioinspired, biotech, blog, body, books, brand, character, citizen, city, climate, clips, code, cognition, collaboration, commodification, communication, community, computing, conditioning, conferences, consumption, content, control, craft, culture & society, curators, customization, data, density, design, design (environments), design (fashion), design (graphic), design (interactions), design (motion), design (products), designers, development, devices, digital, digital fabrication, digital life, digital marketing, dimensions, direct, display, documentary, earth, ecal, ecology, economy, electronics, energy, engineering, environment, equipment, event, exhibitions, experience, experimentation, fabric | ch, farming, fashion, fiction, films, food, form, franchised, friends, function, future, gadgets, games, garden, generative, geography, globalization, goods, hack, hardware, harvesting, health, history, housing, hybrid, identification, illustration, images, information, infrastructure, installations, interaction design, interface, interferences, kinetic, knowledge, landscape, language, law, life, lighting, localization, localized, magazines, make, mapping, marketing, mashup, materials, media, mediated, mind, mining, mobile, mobility, molecules, monitoring, monography, movie, museum, music, nanotech, narrative, nature, networks, neurosciences, opensource, operating system, participative, particles, people, perception, photography, physics, physiological, politics, pollution, presence, print, privacy, product, profiling, projects, psychological, public, publishing, reactive, real time, recycling, research, resources, responsive, ressources, robotics, santé, scenography, schools, science & technology, scientists, screen, search, security, semantic, 
services, sharing, shopping, signage, smart, social, society, software, solar, sound, space, speculation, statement, surveillance, sustainability, tactile, tagging, tangible, targeted, teaching, technology, tele-, telecom, territory, text, textile, theory, thinkers, thinking, time, tools, topology, tourism, toys, transmission, trend, typography, ubiquitous, urbanism, users, variable, vernacular, video, viral, vision, visualization, voice, vr, war, weather, web, wireless, writing
Monday, December 09. 2013
Researchers at the MIT Media Lab and the Max Planck Institutes have created a foldable, cuttable multi-touch sensor that works no matter how you cut it, allowing multi-touch input on nearly any surface.
In traditional sensors the connectors are laid out in a grid, and when one part of the grid is damaged you lose sensitivity across a wide swathe of other sensors. This system lays the sensors out like a star, which means that cutting one part of the sensor only affects other parts down the same line. For example, you can cut the corners off a square and the sensor will still work, or even cut all the way down to the main, central connector array and, as long as there are still sensors on the surface, it will pick up input.
The team that created it, Simon Olberding, Nan-Wei Gong, John Tiab, Joseph A. Paradiso, and Jürgen Steimle, write:
This very direct manipulation allows the end-user to easily make real-world objects and surfaces touch-interactive.
You can read the research paper here, but this looks to be very useful in the DIY hacker space, as well as for flexible, wearable projects that require some sort of multi-touch input. While I can’t imagine we need shirts made of this stuff, I could see a sleeve with lots of inputs or, say, a watch with a multi-touch band.
Don’t expect this to hit the next iWatch any time soon – it’s still very much in the prototype stages, but it definitely looks quite cool.
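The cut-resilience described above comes down to wiring topology: a star routes an independent trace from each sensor to the central connector, while a grid shares row and column traces. A toy connectivity model (the 4x4 layout and function names here are illustrative, not from the paper) makes the difference concrete:

```python
# Toy model contrasting star vs. grid sensor wiring under damage.
# The 4x4 layout and these functions are illustrative, not from the paper.

def surviving_sensors_star(sensors, cut):
    """Star wiring: each sensor has its own trace to the central hub,
    so only the sensors inside the cut region are lost."""
    return {s for s in sensors if s not in cut}

def surviving_sensors_grid(sensors, cut):
    """Grid wiring: row/column traces run through every cell, so a cut
    cell breaks the shared trace for every cell beyond it."""
    alive = set()
    for (r, c) in sensors:
        # a sensor is readable only if its row trace (from column 0) and
        # its column trace (from row 0) both reach it uncut
        row_ok = all((r, cc) not in cut for cc in range(c + 1))
        col_ok = all((rr, c) not in cut for rr in range(r + 1))
        if row_ok and col_ok:
            alive.add((r, c))
    return alive

sensors = {(r, c) for r in range(4) for c in range(4)}
cut = {(0, 0), (0, 1), (1, 0)}  # snip off one corner

print(len(surviving_sensors_star(sensors, cut)))  # 13: only the cut cells are lost
print(len(surviving_sensors_grid(sensors, cut)))  # 4: downstream cells go dark too
```

Cutting the corner off the star-wired sheet loses only the three removed cells; in the grid model, every cell whose shared trace passes through the cut goes dark as well.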
Tuesday, May 03. 2011
By Kate Greene
Our lives are awash with ambient electromagnetic radiation, from the fields generated by power lines to the signals used to send data between Wi-Fi transmitters. Researchers at Microsoft and the University of Washington have found a way to harness this radiation for a computer interface that turns any wall in a building into a touch-sensitive surface.
The technology could allow light switches, thermostats, stereos, televisions, and security systems to be controlled from anywhere in the house, and could lead to new interfaces for games.
"There's all this electromagnetic radiation in the air," says Desney Tan, senior researcher at Microsoft (and a TR35 honoree in 2007). Radio antennas pick up some of the signals, Tan explains, but people can do this too. "It turns out that the body is a relatively good antenna," he says.
The ambient electromagnetic radiation emitted by home appliances, mobile phones, computers, and the electrical wiring within walls is usually considered noise. But the researchers chose to put it at the core of their new interface.
When a person touches a wall with electrical wiring behind it, she becomes an antenna that tunes the background radiation, producing a distinct electrical signal, depending on her body position and proximity to and location on the wall. This unique electrical signal can be collected and interpreted by a device in contact with or close to her body. When a person touches a spot on the wall behind her couch, the gesture can be recognized, and it could be used, for example, to turn down the volume on the stereo.
So far, the researchers have demonstrated only that a body can turn electromagnetic noise into a usable signal for a gesture-based interface. A paper outlining this will be presented next week at the CHI Conference on Human Factors in Computing Systems in Vancouver, BC.
In an experiment, test subjects wore a grounding strap on their wrist—a bracelet that is normally used to prevent the buildup of static electricity in the body. A wire from the strap was connected to an analog-to-digital converter, which fed data from the strap to a laptop worn in a backpack. Machine-learning algorithms then processed the data to identify characteristic changes in the electrical signals corresponding to a person's proximity to a wall, the position of her hand on the wall, and her location within the house.
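The processing chain described above (strap voltage, ADC samples, features, classifier) can be sketched roughly as follows. This is a hypothetical illustration with synthetic signals and a simple 1-nearest-neighbour classifier; the actual features and learning algorithm used in the paper may differ:

```python
# Hypothetical sketch of the sensing pipeline: windows of ADC readings
# from the wrist strap -> summary features -> position classifier.
# The features, classifier, and fake signals are illustrative only.
import numpy as np

def extract_features(samples):
    """Summarize a window of readings: amplitude and dominant frequency
    content change with hand position and proximity to the wall."""
    spectrum = np.abs(np.fft.rfft(samples))
    return np.array([samples.std(), spectrum.argmax(), spectrum.max()])

def nearest_neighbor(train_X, train_y, x):
    """1-NN classifier: return the label of the closest training window."""
    dists = np.linalg.norm(train_X - x, axis=1)
    return train_y[int(dists.argmin())]

rng = np.random.default_rng(0)

def fake_window(freq):
    """Synthetic stand-in for a captured signal window: a tone whose
    frequency depends on (pretend) hand position, plus noise."""
    t = np.arange(256)
    return np.sin(2 * np.pi * freq * t / 256) + 0.1 * rng.standard_normal(256)

# Two (pretend) wall positions with different noise signatures.
train_X = np.array([extract_features(fake_window(f)) for f in (5, 5, 40, 40)])
train_y = ["upper-left", "upper-left", "lower-right", "lower-right"]

print(nearest_neighbor(train_X, train_y, extract_features(fake_window(40))))
# -> "lower-right"
```

The real system would replace the fake windows with digitized strap voltages and likely use richer features and a trained model, but the shape of the pipeline is the same.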
"Now we can turn any arbitrary wall surface into a touch-input surface," says Shwetak Patel, professor of computer science and engineering and electrical engineering at the University of Washington (and a TR35 honoree in 2009), who was involved with the work. The next step, he says, is to make the data analysis real-time and to make the system even smaller—with a phone or a watch instead of a laptop collecting and analyzing data.
"With Nintendo Wii and Microsoft's Kinect, people are starting to realize that these gesture interfaces can be quite compelling and useful," says Thad Starner, professor in Georgia Tech's College of Computing. "This is the sort of paper that says here is a new direction, an interesting idea; now can we refine it and make it better over time."
Refining the system to make it more user-friendly will be important, says Pattie Maes, a professor in MIT's Media Lab who specializes in computer interfaces. "Many interfaces require some visual, tangible, or auditory feedback so the user knows where to touch." While the researchers suggest using stickers or other marks to denote wall-based controls, this approach might not appeal to everyone. "I think it is intriguing," says Maes, "but may only have limited-use cases."
Joe Paradiso, another professor in MIT's Media Lab, says, "The idea is wild and different enough to attract attention," but he notes that the signal produced could vary depending on the way a person wears the device that collects the signal.
Patel has previously used a building's electrical, water, and ventilation systems to locate people indoors. Tan has worked with sensors that use human brain power for computing and muscle activity to control electronics wirelessly. The two researchers share an interest in pulling useful information out of noisy signals. With the recent joint project, Tan says, the researchers are "taking junk and making sense of it."
Friday, April 01. 2011
bitforms gallery is pleased to announce an exhibition that explores the sense of touch as a metaphor of bodily presence and an extension across boundaries. Actualizing our understandings of public/private and inside/outside, touch is a gesture system. It manipulates physical, social, psychological and electronic domains, aiding in their transformation.
Touched: A Space of Relations
Monday, February 21. 2011
While the bedside table and lamp have been a staple of our lives for as long as there have been beds and tables, French firm Quarks is now floating a new invention that may make them obsolete.
With Quarks’ new paint, the entire wall acts as an on/off switch, meaning you no longer need to place a lamp close enough to reach over and turn off. The On/Off Paint registers a single touch and carries an electric current that turns a device on or off. The best part is that it doesn’t take a great deal of wiring to make it work: simply choose the area you want to paint and, as long as it surrounds a plug, you’ll be able to touch any part of that area.
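The behaviour described — a single touch anywhere on the painted area flipping a device's state — amounts to a debounced toggle. A minimal sketch, with the class name and debounce window invented for illustration:

```python
# Hypothetical model of the touch-toggle behaviour: any contact inside
# the painted area flips the connected device on or off, ignoring
# rapid repeat contacts (debouncing). Names and timings are invented.

class TouchTogglePaint:
    def __init__(self, debounce_s=0.3):
        self.on = False
        self.debounce_s = debounce_s       # ignore contacts closer than this
        self._last_touch = -float("inf")

    def touch(self, now_s):
        """Register a touch (timestamp in seconds) anywhere on the paint."""
        if now_s - self._last_touch >= self.debounce_s:
            self.on = not self.on
            self._last_touch = now_s
        return self.on

wall = TouchTogglePaint()
print(wall.touch(0.0))  # True  -> lamp turns on
print(wall.touch(0.1))  # True  -> within debounce window, ignored
print(wall.touch(1.0))  # False -> lamp turns off
```

Debouncing matters here because a hand brushing the wall produces several electrical contacts in quick succession, which should count as one switch press.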
While this isn’t currently available in the United States, if the company knows what’s good for it, the On/Off Switch Paint will be here soon.
Friday, November 05. 2010
Via Art Press
I just read the following lines from David Hockney (73 this year, and whose paintings have always been influenced by technologies --photography, photocopy, fax--) in Art Press n° 372:
"(...). You can understand what's happening today much better if you see history the way I do. For 500 years the Church controlled society by reigning over images. With the advent of the mass media, it gradually lost its power, media magnates circulated whatever images they wanted, and Hollywood extended its empire around the world. Now images are undergoing another upheaval, marking the decline of newspapers and television. People have moved onto this [gesturing at his iPad]. The monopoly on the distribution of images has been shattered. Now I can circulate whatever I want for free. This new era also signals the changing nature of photography."
"In my book Secret Knowledge, I demonstrated that photography existed long before chemical development on paper. Art historians may dispute it but I don't mind: this is the result of observation by a painter, an insider's view. Why did Caravaggio invent Hollywood lighting? Because he used a whole system of lenses, concave mirrors and a camera obscura to project faces and real objects onto the canvas. Vermeer, Van Eyck, Canaletto, Chardin and Ingres used these methods. The invention of the camera in 1837 was just a way to fix the projection on paper. That lasted 160 years and now it's over. Now it's the era of digital photography, the manipulation of images, and the comeback of manual dexterity. The idea that photography is the most striking illustration of reality is outdated. In this country, possessing or looking at certain kinds of photos will land you in jail. I ask you: how can anyone tell whether or not these photos have anything to do with reality? Has this point been discussed? No. Has it been talked about in the art world? No. People think the world is like photography, but the camera gives us an optical projection of the world. It sees the world in a spectacular way, whereas we perceive it psychologically.
No matter how good a photo may be, it doesn't haunt you the way a painting can. A good painting embraces ambiguities that can never be untangled; that's why it's so fascinating. (...) Painting will always be superior to photography in one respect: time, that juxtaposition of moments which is what makes a great painting so deep, rich and ambiguous. It has been said that the surface is all that matters. But that is to cancel what can be called the magic of art. The magic is the indeterminate part. I think the only way to renew art is to go back to nature. Nature is unlimited; it's foolish to say that we've seen it all."
Can we still speak about "nature" (I mean, a place where there's no wifi, no telecom networks and no communication --probably in the deep oceans, in remote and/or inaccessible landscapes, or high in the mountains, though even the Himalayas recently got wifi, and the Swiss mountains are fully covered by Swisscom networks...)? And should we now speak about "manual dexterity" or "software dexterity"?
Thursday, January 07. 2010
We’ve seen virtual keyboards before, but the technology has never quite taken off. But this is quite exceptional. UK-based Light Blue Optics (LBO) has unveiled a revolutionary compact projector that lets users interact with images projected onto a flat surface, essentially turning it into a touch screen. Remarkably, it has built-in Wi-Fi and Bluetooth connectivity, which allows users to access and connect with internet-based applications and engage in social networking and multimedia sharing via the projector itself, which runs Adobe Flash Lite 3.1.
The Light Touch projector transforms the usually static image created by a projector into interactive multimedia content using Light Blue Optics’ proprietary holographic laser projection technology. The device can project a virtual 10-inch interactive display onto any surface and the integrated infra-red touch sensing system enables users to interact with the virtual image in the same way they would use a handheld touchscreen device.
The device is set to be showcased at an invite-only event on Thursday, January 7th at CES. LBO also announced a list of global companies working with the company to bring its unique projection technology to market: Adobe, CSR, Foxconn, Interbrand, Microsoft, Micron, Nichia, Photop, Opnext and Toshiba. The Light Touch comes equipped with 2GB of onboard Flash memory and a Micro SD card slot for up to 32GB of storage; it can run wall-powered or on battery, with a run-time of two hours before recharging is required. It certainly looks very interesting, but there are no details on pricing or availability yet.
Friday, October 02. 2009
Most sci-fi fans remember the movie Minority Report, and the scene where Tom Cruise manipulates data on a large, vertical, virtual screen using only his hands and fingers. Although I prefer to do my computer work sitting down, it was the stuff geek dreams are made of, and now Apple has applied for a patent that sounds eerily similar.
AppleInsider has the details, but the gist of it is this: Apple’s multitouch functions currently implemented on the iPhone and on Macbook trackpads are nice enough, but aren’t good enough for a larger screen (a larger screen on a mythical Apple tablet device, that is).
Therefore, Apple’s new patent describes a far more sophisticated multitouch input method, allowing for use of all ten fingers, complex movements and, yes, proper typing. Built-in sensors should be able to distinguish various hand configurations, detect when the user wants the cursor to move, and enable various types of input (typing, manipulating 2D objects, and handwriting). We can’t say all of this is definite proof that an Apple tablet is coming, but it’s painfully easy to imagine all of this applied on such a device.
Minority Report syndrome again... This time possibly coming to a table(t) near you.
Thursday, April 09. 2009
A Microsoft project lets a touch screen control other hardware.
By Kate Greene
Morris and his colleagues have developed software for touch-screen surfaces that allows physical controls to be added to them. In addition, the software lets people define the functions that each knob, button, and slider on a controller will perform.
The researchers' system, called Ensemble, was presented on Monday at the Computer-Human Interaction (CHI 2009) Conference in Boston. It consists of a custom-made touch table that is two meters long and one meter wide, and several portable sound-editing controllers that connect to the computer that controls the surface. The table is similar to Microsoft's Surface, but larger. As with Surface, cameras underneath the tabletop are used to sense when a user touches the surface or when an object is placed on top of it.
The idea of incorporating traditional input devices like mice or keyboards with a touch display is not new, but with Ensemble the Microsoft researchers show that it's possible to make hardware do more than a single specified task.
Cameras within the Ensemble table detect a special tag on the bottom of each audio control box to recognize each box and determine its position on the surface. The software then produces an "aura" around each device, including touch-surface controls like "play," "pause," and "stop," and virtual sliders that correspond to physical knobs on the box.
A person can then edit a music track, for example, using both the physical device and the touch-surface controls. The virtual sliders can be used to zoom in on the audio waveform of a track, or to go to a different location on the waveform by panning. The physical knobs on the box perform the same function but offer much finer control. The system also allows a person to change the function of the knobs to, say, control the volume of a trumpet track instead.
"It's a software mechanism for telling the hardware what to do," says Morris. He explains that once a person has mapped different functions onto the controller, she's able to save it for later or pass it along to someone else who has a similar role in the editing process.
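The remap-and-share workflow Morris describes can be sketched as a small software-defined binding table: raw hardware events are dispatched through whatever mapping the user has defined, and the mapping itself is serializable so it can be saved or handed to a colleague. All names here are hypothetical, not Ensemble's actual API:

```python
# Illustrative sketch of a remappable hardware control, in the spirit of
# Ensemble's software-defined knobs. Class and control names are invented.
import json

class ControlMapping:
    def __init__(self):
        self.bindings = {}  # physical control id -> function name

    def bind(self, control_id, function):
        """Map (or remap) a physical knob/slider onto a function."""
        self.bindings[control_id] = function

    def handle(self, control_id, value):
        """Dispatch a raw hardware event through the current mapping."""
        return (self.bindings.get(control_id, "unmapped"), value)

    def save(self):
        """Serialize, so a mapping can be stored or passed to a colleague."""
        return json.dumps(self.bindings)

    @classmethod
    def load(cls, blob):
        m = cls()
        m.bindings = json.loads(blob)
        return m

m = ControlMapping()
m.bind("knob-1", "zoom-waveform")
print(m.handle("knob-1", 0.75))        # ('zoom-waveform', 0.75)
m.bind("knob-1", "trumpet-volume")     # remap the same physical knob
print(m.handle("knob-1", 0.75))        # ('trumpet-volume', 0.75)
shared = ControlMapping.load(m.save()) # hand the mapping to someone else
print(shared.handle("knob-1", 0.5))    # ('trumpet-volume', 0.5)
```

The point of the indirection is exactly what the article describes: the hardware emits fixed events, and software decides, per user and per task, what those events mean.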
The paper, presented at CHI 2009 by Rebecca Fiebrink, a graduate student at Princeton University, also describes a study examining how people used the interface. Most of the study participants used the physical controls, favoring the accuracy and responsiveness that they offer. However, these participants also made extensive use of surface controls, choosing them mainly for tasks in which a single touch produced a discrete result, such as playing or stopping a track.
Robert Jacob, a professor of electrical engineering at Tufts University, in Medford, MA, says that the researchers "did a nice job of investigating what users actually did when given both [physical controllers and a touch screen] and the opportunity to switch between them."
Jacob, who chaired the session in which the paper was presented, acknowledges that bridging the gap between physical and digital objects can be challenging. "It's a difficult problem with no general solutions, but rather individual interesting designs," he says. "Ideally, you want the benefits of the digital without giving up those of the physical."
While Ensemble was designed for sound editing, its underlying technology could find other applications in graphics, gaming, and visual design, says Morris. "It could be used in scenarios where you want people to collaborate on a surface as a group," he says, but where the resolution of the touch surface limits the precision of the virtual controls.
Copyright Technology Review 2009.
Not yet entirely convincing (see the video: the interaction is very slow... and I'd like to see a genuine "working" collaboration in such a context one day), but there is a promising idea in this kind of shared, multi-user augmented reality projected around work or editing tools.
(Page 1 of 1, totaling 9 entries)
fabric | rblg
This blog is the survey website of fabric | ch - studio for architecture, interaction and research.
We curate and reblog articles, research, writings, exhibitions and projects that we notice and find interesting in our everyday practice and readings.
Most articles concern the intertwined fields of architecture, territory, art, interaction design, thinking and science. From time to time, we also publish documentation about our own work and research, immersed among these related resources and inspirations.
This website is used by fabric | ch as an archive and a collection of references and resources. It is shared with all those interested in the same topics as we are, in the hope that they will also find valuable references and content in it.
| rblg on Twitter