Minding my own business, 2011, Robert Overweg, Grand Theft Auto 4
Whale car, 2011, Robert Overweg, Grand Theft Auto 4
-----
"Robert Overweg is a photographer of virtual world environments. These environments appear in first- and third-person shooter games, but Overweg sees them operating as a direct extension of his physical world, revealing the new public space of contemporary society.
Overweg proceeds to the outskirts of the virtual world, which he dissects through his photography. In doing so, he draws our attention to environments that are often overlooked and yet, ironically, appear eerily familiar."
Seen everywhere online these days and now on | rblg too... Yet another "trojan horse" by Google to turn you into a mobile and indoor sensor for their own sake (data collection, needless to say). Soon we will be able to visit your flat, or those of your friends, through Google Maps/Earth, or through a constellation of other applications. After clicking at the door, of course.
But, as is often the case with such devices, it is an interesting tool as well... On top of it, disruptive apps will be built that will further mix material and immaterial experiences, and further relocate parts of your "home" into "clouds".
Since this is an open call for ideas, before they give away 200 dev kits, don't hesitate to send them a line if you have an unpredictable one (this promises to be very competitive...)!
“Our current prototype is a 5” phone containing customized hardware and software designed to track the full 3D motion of the device, while simultaneously creating a map of the environment. These sensors allow the phone to make over a quarter million 3D measurements every second, updating its position and orientation in real-time, combining that data into a single 3D model of the space around you.
“It runs Android and includes development APIs to provide position, orientation, and depth data to standard Android applications written in Java, C/C++, as well as the Unity Game Engine. These early prototypes, algorithms, and APIs are still in active development. So, these experimental devices are intended only for the adventurous and are not a final shipping product….”
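The general idea behind such a device, tracking the phone's pose while fusing depth measurements into a single map, can be sketched in a few lines. This is an illustration of the principle only, not the actual Tango/Android API; the pose format, function names and sample values are assumptions:

```python
# Sketch of the general pipeline (not the actual Tango API): each depth
# measurement is taken in the device's own frame and must be transformed
# into a fixed world frame using the tracked pose before being merged
# into one shared 3D map.
import math

def yaw_to_rot(yaw):
    # rotation about the vertical axis only, to keep the sketch simple
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def device_to_world(point, position, yaw):
    R = yaw_to_rot(yaw)
    rotated = [sum(R[i][j] * point[j] for j in range(3)) for i in range(3)]
    return [rotated[i] + position[i] for i in range(3)]

world_map = []
# pretend pose stream: (position, yaw) while the phone moves and turns,
# each paired with the depth points captured at that instant
for (position, yaw), depth_points in [(([0, 0, 0], 0.0), [[1, 0, 0]]),
                                      (([2, 0, 0], math.pi / 2), [[1, 0, 0]])]:
    for p in depth_points:
        world_map.append(device_to_world(p, position, yaw))
```

The same depth point, seen from two different poses, lands at two different world coordinates; accumulating these transformed points is what builds up "a single 3D model of the space around you".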
Shapeoko was the little milling machine that could. It surpassed its Kickstarter goal and went into production with the goal of supplying CNC mill fans with an easy-to-use and inexpensive ($300) CNC machine.
Two years after the Kickstarter campaign concluded, creator Edward Ford has joined forces with Inventables to build the Shapeoko 2, which goes on pre-sale today. The second version features a completely redesigned Z-axis, dual Y-axis steppers, as well as Inventables’ MakerSlide linear bearing system.
If you're in Chicago today (note: last Monday), Inventables will be holding a Shapeoko 2 launch event where you'll get the opportunity to see the machine in action. You can also pre-order the kit. The price is $300 for just the mechanics — just add electronics — or you can get a full kit for $650.
There is a great, undiscovered potential in virtual reality development. Sure, you can create lifelike virtual worlds, but you can also make players sick. Oculus VR founder Palmer Luckey and VP of product Nate Mitchell hosted a panel at GDC Europe last week, instructing developers on how to avoid the VR development pitfalls that make players uncomfortable. It was a lovely service for VR developers, but we saw a much greater opportunity. Inadvertently, the panel explained how to make players as queasy and uncomfortable as possible.
And so, we now present the VR developer's guide to manipulating your players right down to the vestibular level. Just follow these tips and your players will be tossing their cookies in minutes.
Note: If you'd rather not make your players horribly ill and angry, just do the opposite of everything below.
Include lots of small, tight spaces
In virtual reality, small and closed-off areas truly feel small, said Luckey. "Small corridors are really claustrophobic. It's actually one of the worst things you can do for most people in VR, is to put them in a really small corridor with the walls and the ceiling closing in on them, and then tell them to move rapidly through it."
Meanwhile, open spaces are a "relief," he said, so you'll want to avoid those.
Possible applications: Air duct exploration game.
Create a user interface that neglects depth and head-tracking
Virtual reality is all about depth and immersion, said Mitchell. So, if you want to break that immersion, your ideal user interface should be as traditional and flat as possible.
For example, put targeting reticles on a 2D plane in the center of a player's field of view. Maybe set it up so the reticle floats a couple of feet away from the player's face. "That is pretty uncomfortable for most players and they'll just try to grapple with what do they converge on: That near-field reticle or that distant mech that they're trying to shoot at?" To sample this effect yourself, said Mitchell, you can hold your thumb in front of your eyes. When you focus on a distant object, your thumb will appear to split in two. Now just imagine that happening to something as vital as a targeting reticle!
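The thumb experiment comes down to simple geometry: the closer the fixation point, the larger the vergence angle between the two eyes' lines of sight, and the further out of focus everything at the other distance appears. A quick sketch, where the interpupillary distance and the two object distances are illustrative assumptions:

```python
import math

IPD_M = 0.064  # typical interpupillary distance in meters (assumption)

def vergence_deg(distance_m):
    # angle between the two eyes' lines of sight when fixating a point
    # at the given distance straight ahead
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

near = vergence_deg(0.6)   # reticle floating a couple of feet away
far = vergence_deg(30.0)   # the distant mech the player is aiming at
```

With these numbers the near reticle demands roughly a 6 degree vergence angle while the distant target needs almost none, which is exactly the conflict that makes the player's eyes "grapple".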
You might think that setting the reticle closer to the player will make things even worse, and you're right. "The sense of personal space can make people actually feel uncomfortable, like there's this TV floating right in front of their face that they try to bat out of the way." Mitchell said a dynamic reticle that paints itself onto in-game surfaces feels much more natural, so don't do that.
You can use similar techniques to create an intrusive, annoying heads-up display. Place a traditional HUD directly in front of the player's face. Again, they'll have to deal with double vision as their eyes struggle to focus on different elements of the game. Another option, since VR has a much wider field of view than monitors, is to put your HUD elements in the far corners of the display, effectively putting it into a player's peripheral vision. "Suddenly it's too far for the player to glance at, and they actually can't see pretty effectively." What's more, when players try to turn their head to look at it, the HUD will turn with them. Your players will spin around wildly as they desperately try to look at their ammo counter.
Possible applications: Any menu or user interface from Windows 3.1.
Disable head-tracking or take control away from the player
"Simulator sickness," when players become sick in a VR game, is actually the inverse of motion sickness, said Mitchell. Motion sickness is caused by feeling motion without being able to see it; Mitchell cited riding on a boat rocking in the ocean as an example. "There's all this motion, but visually you don't perceive that the floor, ceiling and walls are moving. And that sensory disconnect, mainly in your vestibular senses, is what creates that conflict that makes you dizzy." Simulator sickness, he said, is the opposite. "You're in an environment where you perceive there to be motion, visually, but there is no motion. You're just sitting in a chair."
If you disable head-tracking in part of your game, it artificially creates just that sort of sensory disconnect. Furthermore, if you move the camera without player input, say to display a cut-scene, it can be very disorienting. When you turn your head in VR, you expect the world to turn with you. When it doesn't, you can have an uncomfortable reaction.
Possible applications: Frequent, Unskippable Cutscenes: The Game.
Feature plenty of backwards and lateral movement
Forward movement in a VR game tends not to cause problems, but many users have trouble dealing with backwards movement, said Mitchell. "You can imagine sometimes if you sit on a train and you perceive no motion, and the train starts moving backwards very quickly, or you see another car pulling off, all of those different sensations are very similar to that discomfort that comes from moving backwards in space." Lateral movement (i.e. sideways movement) has a similar effect, Mitchell said. "Being able to sort of strafe on a dime doesn't always cause the most comfortable experience."
Possible applications: Backwards roller coaster simulator.
Quick changes in altitude
"Quick changes in altitude do seem to cause disorientation," said Mitchell. Exactly why that happens isn't really understood, but it seems to hold true among VR developers. This means that implementing stairs or ramps in your games can throw players for a loop, which, remember, is exactly what we're after. Don't use closed elevators: they prevent users from perceiving the change in altitude, which is generally much more comfortable.
Possible applications: A VR version of the last level from Ghostbusters on NES. Also: Backwards roller coaster simulator.
Don't include visual points of reference
When players look down in VR, they expect to see their character's body. Likewise, in a space combat or mech game, they expect to see the insides of the cockpit when they look around. "Having a visual identity is really crucial to VR. People don't want to look down and be a disembodied head." For the purposes of this guide, that makes a disembodied head the ideal avatar for aggravating your players.
Possible applications: Disembodied Heads ... in ... Spaaaaaace. Also: Disembodied head in a backwards roller coaster.
Shift the horizon line
Okay, this is probably one of the most devious ways to manipulate your players. Mitchell imagines a simulation of sitting on a beach, watching the sunset. "If you subtly tilt the horizon line very, very minimally, a couple degrees, the player will start to become dizzy and disoriented and won't know why."
Possible applications: Drunk at the Beach.
Shoot for a low frame rate, disable V-sync
"With VR, having the world tear non-stop is miserable." Enough said. Furthermore, a low frame rate can be disorienting as well. When players move their heads and the world doesn't move at the same rate, it's jarring to their natural senses.
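The mismatch between head motion and world motion is easy to quantify: during one frame of latency, the rendered world lags behind the head by rotation speed times frame time. A tiny sketch with illustrative numbers (the head speed is an assumption):

```python
def world_lag_deg(head_speed_deg_s, latency_s):
    # how far the rendered world lags behind the real head orientation
    # after one frame of latency
    return head_speed_deg_s * latency_s

# a brisk head turn of 120 degrees per second (assumed value)
lag_60hz = world_lag_deg(120, 1 / 60)  # one frame at 60 Hz
lag_20hz = world_lag_deg(120, 1 / 20)  # one frame at 20 Hz
```

At 60 Hz the world trails the head by about 2 degrees per frame; drop to 20 Hz and it trails by 6 degrees, which is the kind of visible lag that sets off the vestibular conflict described above.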
Possible applications: Limitless.
In Closing
Virtual reality is still a fledgling technology and, as Luckey and Mitchell explained, there's still a long way to go before both players and developers fully understand it. There are very few points of reference, and there is no widely established design language that developers can draw from.
What Luckey and Mitchell have detailed - and what we've decided to ignore - is a basic set of guidelines on maintaining player comfort in the VR space. Fair warning though, if you really want to design a game that makes players sick, the developers of AaaaaAAaaaAAAaaAAAAaAAAAA!!! already beat you to it.
A 3D printer approved by NASA will be flown to the International Space Station next year so astronauts can print components, tools and equipment on-demand in space.
Note: I'm joining here two posts that hit the blogs recently: the FilaBot 3D printer that prints from garbage, and the sort of narcissistic-souvenir 3D photo booth from Omote 3D. Will it become possible to 3D print snapshots of ourselves, our houses, even our food with our own garbage (including, therefore, food garbage...)? That would be a decent way to recycle trash (the best way actually might be district heating).
Of the many fictionalized, futuristic innovations shown in the Back to the Future movies, one of the most beneficial belonged to the DeLorean at the center of it all, and I don’t mean the ability to time travel. Rather, if even regular engines could run on garbage, we’d solve the issues of fuel availability and waste disposal in one fell swoop. That’s why it’s nice to see that this concept has come into existence right at the upswing of the 3D printing phenomenon.
FilaBot is a desktop device that breaks down various types of plastics and processes them into filament that you can use for your home 3D printer. That includes your botched 3D printed experiments, so you won’t be wasting filament when you’re testing out a design.
Their Kickstarter campaign, which closed in early 2012, clocked three times its goal, and should prove to be a great accompanying device for home 3D printers like MakerBot. Founder Tyler McNaney plans to create a whole range of products that offer this functionality, some with great potential for customization.
FilaBot is a welcome arrival to a burgeoning world of creativity that threatens to create an immense amount of waste, something that we’re already pretty good at rapidly creating in large volumes. Now, instead of adding to the garbage pile, we can process some of our existing waste into something useful… well, depending what you’ll be designing and fabricating.
A new kind of photo booth has just appeared in Tokyo: it creates a figurine in the likeness of its model. Completely megalomaniac, but ideal for fans of little lead soldiers.
Omote 3D offers Tokyoites the chance to have their portrait taken and turned into a 3D figurine. - D.R.
At first glance, it's just one more gadget for consumers on an ego trip. But the figurines created by Omote 3D, a photo booth installed for a few weeks in Omotesando, the heart of Tokyo's luxury shopping district, prove that 3D printing is on its way to becoming a mainstream consumer product.
The pop-up studio will open in a Tokyo gallery on November 24. There, you can have your portrait taken, photo-booth style, but with the help of a professional photographer. The portrait is then scanned and, from the recorded data, a figurine in the customer's likeness is produced. The machine, called Omote 3D, thus offers to turn passers-by into little toy soldiers, though in plastic and without a rifle. For 200 euros (for the ten-centimetre figurine), the Party lab, Rhizomatiks and Engine Film deliver the object, which you then colour yourself.
Prices are still fairly high, but the technique is in its infancy: from 21,000 yen (200 euros) for a 10 cm figurine, up to 42,000 yen (400 euros) for the largest, 20-centimetre version.
An interesting twist on 3D printing: using it as a way to recycle our old PET bottles and plastic trash (and by extension any trash, including organic waste to 3D print food?). And a way to potentially produce strange self-consumption portraits.
Here’s a mind-blowing view of the Earth that you’ve probably never seen — or even thought of — before. Dubbed “Portrait of Global Aerosols” by NASA, this is the kind of imagery that climate scientists use to analyze the Earth’s atmosphere, the weather, and trends such as global climate change.
Now, first things first: The Earth doesn’t actually look like this from space (alas). Rather, this is an image output by the Goddard Earth Observation System Model, version 5 (GEOS-5). GEOS-5 is an almighty piece of software that runs on a supercomputer at NASA’s Center for Climate Simulation in Maryland.
In the case of this image, GEOS-5 is modeling the presence of aerosols (solid or liquid particles suspended in gas) across the Earth's atmosphere. Each of the colors represents a different aerosol: red is dust (swept up from deserts, like the Sahara); blue is sea salt, swirling inside cyclones; green is smoke from forest fires; and white is sulfates, which bubble forth from volcanoes and from burning fossil fuels. The full-size version of the image is particularly mesmerizing, with beautiful swirls of Saharan sand in the Atlantic, and perhaps the tail end of the Gulf Stream circling around Iceland.
It’s hard to be certain, but it seems like the US east coast, central Europe, and east Asia are burning a lot of fossil fuels. Japan, of course, sits on the edge of the Pacific Ring of Fire, so the sulfates there could be from volcanoes. The smoke in Australia is probably from forest fires — but the large volume of smoke from the Amazon rain forest and sub-Saharan Africa is curious. Are these forest fires, or the large-scale burning of wood for heat and power?
As you can imagine, the amount of raw data required to produce such imagery is immense. Weather modeling is still one of the primary uses of supercomputers. To create the Portrait of Global Aerosols, GEOS-5 will have aggregated the measurements from hundreds of weather stations across Earth, along with data from the four NASA/NOAA GOES weather satellites. To give you some idea of the complexity of the GEOS-5 model, the resolution of this image is 10 kilometers (6 miles), meaning the Earth has been split into regions ("pixels") of 100 km² each, and the atmospheric conditions are simulated for each region. The surface area of the Earth is 510,072,000 km², which means the total number of regions is around 5 million.
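The region count is straightforward arithmetic: at a 10 km resolution each region covers 10 km × 10 km = 100 km², and dividing the Earth's surface area by that gives roughly five million regions:

```python
earth_surface_km2 = 510_072_000  # surface area quoted in the article
resolution_km = 10               # GEOS-5 grid resolution for this image

region_area_km2 = resolution_km ** 2            # 10 km x 10 km = 100 km^2
n_regions = earth_surface_km2 // region_area_km2
print(n_regions)  # 5100720, i.e. around 5 million regions
```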
Each of these 5 million pixels might have megabytes or gigabytes of weather data associated with it — and of course, in any given area, the weather in each pixel interacts with those around it. This gives you some idea of how much data needs to be processed and moved around — and it only becomes exponentially more complex as sensors improve (producing more data) and as you increase the depth of your analysis. In the case of climate change, for example, scientists are modeling decades or even centuries of data to try to divine some kind of pattern — a task that taxes even the most powerful supercomputers. If you've ever wondered why we keep building faster and faster supercomputers, now you know.
SCI-Arc Masters of Architecture graduates Liz and Kyle von Hasseln have been awarded the inaugural Gehry Prize for developing an interruptible 3D printing method, dubbed Phantom Geometry, that allows designers to make alterations to the design while it is being printed. The Phantom Geometry method is a convenient alternative to the conventional, static 3D printing systems available today. The system's main components include a UV light projector, a special photo-sensitive resin, and controlled robotic arms from SCI-Arc's Robot House.
See also the project ProtoHouse by Softkill Design in the area of digital fabrication (obviously a technology that is currently in the "peak of inflated expectations" phase of the hype cycle for emerging technologies graph).
Let's assume for a minute that 3D printing becomes as good as its proponents say it will, and soon. We're talking high strength plastics, high resolution models, all at prices that the average consumer can afford.
It seems obvious that at the point where all these trend lines meet, there's a powerful incentive for tinkerers and teenagers to start downloading plans from the internet and simply making their own sets.
In this scenario, if physical objects made from single materials follow the same trajectory as other media that were physical until they became just bits, there will at first be resistance from toymakers, in the form of lawsuits. Collectors will be sued as a deterrent to other rogues, and websites for sharing designs shut down.
Meanwhile, an underground of Makers will continue to experiment. Amateurs will collaborate to create LEGO sets and other toys that no cadre of designers in Denmark could match. Some will go pro. Gradually, the industry will adapt.
I realize that some proponents of 3D printing envision this process eating pretty much all the manufacturing on the planet. There are good reasons that won't happen. But for certain industries that are uniquely susceptible to being disrupted by better versions of today's 3D printing technology, who knows? Perhaps the YouTube of the future deals in atoms, not bits.
This blog is the survey website of fabric | ch - studio for architecture, interaction and research.
We curate and reblog articles, research, writing, exhibitions and projects that we notice and find interesting during our everyday practice and readings.
Most articles concern the intertwined fields of architecture, territory, art, interaction design, thinking and science. From time to time, we also publish documentation about our own work and research, immersed among these related resources and inspirations.
This website is used by fabric | ch as an archive and a collection of references and resources. It is shared with all those interested in the same topics as we are, in the hope that they too will find valuable references and content in it.