Tuesday, August 02. 2016
By fabric | ch
Since we still lack a decent search engine on this blog and don't use a "tag cloud", this post may help you navigate the updated content on | rblg (as of 07.2016) via all its tags!
HERE ARE ALL THE CURRENT TAGS TO NAVIGATE ON | RBLG BLOG:
(shown just below if you're navigating on the blog's page, or here for RSS readers)
Posted by Patrick Keller in fabric | ch at 16:58
Defined tags for this entry: 3d, activism, advertising, agriculture, air, animation, applications, archeology, architects, architecture, art, art direction, artificial reality, artists, atmosphere, automation, behaviour, bioinspired, biotech, blog, body, books, brand, character, citizen, city, climate, clips, code, cognition, collaboration, commodification, communication, community, computing, conditioning, conferences, consumption, content, control, craft, culture & society, curators, customization, data, density, design, design (environments), design (fashion), design (graphic), design (interactions), design (motion), design (products), designers, development, devices, digital, digital fabrication, digital life, digital marketing, dimensions, direct, display, documentary, earth, ecal, ecology, economy, electronics, energy, engineering, environment, equipment, event, exhibitions, experience, experimentation, fabric | ch, farming, fashion, fiction, films, food, form, franchised, friends, function, future, gadgets, games, garden, generative, geography, globalization, goods, hack, hardware, harvesting, health, history, housing, hybrid, identification, illustration, images, information, infrastructure, installations, interaction design, interface, interferences, kinetic, knowledge, landscape, language, law, life, lighting, localization, localized, magazines, make, mapping, marketing, mashup, materials, media, mediated, mind, mining, mobile, mobility, molecules, monitoring, monography, movie, museum, music, nanotech, narrative, nature, networks, neurosciences, opensource, operating system, participative, particles, people, perception, photography, physics, physiological, politics, pollution, presence, print, privacy, product, profiling, projects, psychological, public, publishing, reactive, real time, recycling, research, resources, responsive, robotics, scenography, schools, science & technology, scientists, screen, search, security, semantic, 
services, sharing, shopping, signage, smart, social, society, software, solar, sound, space, speculation, statement, surveillance, sustainability, tactile, tagging, tangible, targeted, teaching, technology, tele-, telecom, territory, text, textile, theory, thinkers, thinking, time, tools, topology, tourism, toys, transmission, trend, typography, ubiquitous, urbanism, users, variable, vernacular, video, viral, vision, visualization, voice, vr, war, weather, web, wireless, writing
Wednesday, March 05. 2014
Via The New Yorker
Posted by Joshua Kopstein
In the nineteen-seventies, the Internet was a small, decentralized collective of computers. The personal-computer revolution that followed built upon that foundation, stoking optimism encapsulated by John Perry Barlow’s 1996 manifesto “A Declaration of the Independence of Cyberspace.” Barlow described a chaotic digital utopia, where “netizens” self-govern and the institutions of old hold no sway. “On behalf of the future, I ask you of the past to leave us alone,” he writes. “You are not welcome among us. You have no sovereignty where we gather.”
This is not the Internet we know today. Nearly two decades later, a staggering percentage of communications flow through a small set of corporations—and thus, under the profound influence of those companies and other institutions. Google, for instance, now accounts for twenty-five per cent of all North American Internet traffic; an outage last August caused worldwide traffic to plummet by around forty per cent.
Engineers anticipated this convergence. As early as 1967, Paul Baran, one of the key architects of the packet-switching system that gave birth to the Internet, predicted the rise of a centralized “computer utility” that would offer computing much the same way that power companies provide electricity. Today, that model is largely embodied by the information empires of Amazon, Google, and other cloud-computing companies. As Baran anticipated, they offer us convenience at the expense of privacy.
Internet users now regularly submit to terms-of-service agreements that give companies license to share their personal data with other institutions, from advertisers to governments. In the U.S., the Electronic Communications Privacy Act, a law that predates the Web, allows law enforcement to obtain without a warrant private data that citizens entrust to third parties—including location data passively gathered from cell phones and the contents of e-mails that have either been opened or left unattended for a hundred and eighty days. As Edward Snowden’s leaks have shown, these vast troves of information allow intelligence agencies to focus on just a few key targets in order to monitor large portions of the world’s population.
One of those leaks, reported by the Washington Post in late October (2013), revealed that the National Security Agency secretly wiretapped the connections between data centers owned by Google and Yahoo, allowing the agency to collect users’ data as it flowed across the companies’ networks. Google engineers bristled at the news, and responded by encrypting those connections to prevent future intrusions; Yahoo has said it plans to do so by next year. More recently, Microsoft announced it would do the same, as well as open “transparency centers” that will allow some of its software’s source code to be inspected for hidden back doors. (However, that privilege appears to extend only to “government customers.”) On Monday, eight major tech firms, many of them competitors, united to demand an overhaul of government transparency and surveillance laws.
Still, an air of distrust surrounds the U.S. cloud industry. The N.S.A. collects data through formal arrangements with tech companies; ingests Web traffic as it enters and leaves the U.S.; and deliberately weakens cryptographic standards. A recently revealed document detailing the agency’s strategy specifically notes its mission to “influence the global commercial encryption market through commercial relationships” with companies developing and deploying security products.
One solution, espoused by some programmers, is to make the Internet more like it used to be—less centralized and more distributed. Jacob Cook, a twenty-three-year-old student, is the brains behind ArkOS, a lightweight version of the free Linux operating system. It runs on the Raspberry Pi, a thirty-five-dollar, credit-card-sized microcomputer adored by teachers and tinkerers. ArkOS is designed so that average users can create personal clouds to store data that they can access anywhere, without relying on a distant data center owned by Dropbox or Amazon. It’s sort of like buying and maintaining your own car to get around, rather than relying on privately owned taxis. Cook’s mission is to “make hosting a server as easy as using a desktop P.C. or a smartphone,” he said.
Like other privacy advocates, Cook doesn’t aim to end surveillance, but to make it harder to do en masse. “When you couple a secure, self-hosted platform with properly implemented cryptography, you can make N.S.A.-style spying and network intrusion extremely difficult and expensive,” he told me in an e-mail.
Persuading consumers to ditch the convenience of the cloud has never been an easy sell, however. In 2010, a team of young programmers announced Diaspora, a privacy-centric social network, to challenge Facebook’s centralized dominance. A year later, Eben Moglen, a law professor and champion of the Free Software movement, proposed a similar solution called the Freedom Box. The device he envisioned was to be a small computer that plugs into your home network, hosting files, enabling secure communication, and connecting to other boxes when needed. It was considered a call to arms—you alone would control your data.
But, while both projects met their fund-raising goals and drummed up a good deal of hype, neither came to fruition. Diaspora’s team fell into disarray after a disappointing beta launch, personal drama, and the appearance of new competitors such as Google+; apart from some privacy software released last year, Moglen’s Freedom Box has yet to materialize at all.
“There is a bigger problem with why so many of these efforts have failed” to achieve mass adoption, said Brennan Novak, a user-interface designer who works on privacy tools. The challenge, Novak said, is to make decentralized alternatives that are as secure, convenient, and seductive as a Google account. “It’s a tricky thing to pin down,” he told me in an encrypted online chat. “But I believe the problem exists somewhere between the barrier to entry (user-interface design, technical difficulty to set up, and over-all user experience) versus the perceived value of the tool, as seen by Joe Public and Joe Amateur Techie.”
One of Novak’s projects, Mailpile, is a crowd-funded e-mail application with built-in security tools that are normally too onerous for average people to set up and use—namely, Phil Zimmermann’s revolutionary but never widely adopted Pretty Good Privacy. “It’s a hard thing to explain…. A lot of peoples’ eyes glaze over,” he said. Instead, Mailpile is being designed in a way that gives users a sense of their level of privacy, without knowing about encryption keys or other complicated technology. Just as important, the app will allow users to self-host their e-mail accounts on a machine they control, so it can run on platforms like ArkOS.
“There already exist deep and geeky communities in cryptology or self-hosting or free software, but the message is rarely aimed at non-technical people,” said Irina Bolychevsky, an organizer for Redecentralize.org, an advocacy group that provides support for projects that aim to make the Web less centralized.
Several of those projects have been inspired by Bitcoin, the math-based e-money created by the mysterious Satoshi Nakamoto. While the peer-to-peer technology that Bitcoin employs isn’t novel, many engineers consider its implementation an enormous technical achievement. The network’s “nodes”—users running the Bitcoin software on their computers—collectively check the integrity of other nodes to ensure that no one spends the same coins twice. All transactions are published on a shared public ledger, called the “block chain,” and verified by “miners,” users whose powerful computers solve difficult math problems in exchange for freshly minted bitcoins. The system’s elegance has led some to wonder: if money can be decentralized and, to some extent, anonymized, can’t the same model be applied to other things, like e-mail?
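The double-spend check described above rests on a simple idea: each block commits to the hash of its predecessor, so altering any earlier block invalidates every later link. A minimal Python sketch of such a hash chain (a toy illustration only; real Bitcoin adds proof-of-work mining, digital signatures, and a peer-to-peer network, none of which appear here):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(transactions, prev_hash):
    """Create a block that commits to the previous block's hash."""
    return {"transactions": transactions, "prev_hash": prev_hash}

def chain_is_valid(chain):
    """Verify that each block really points at the hash of its predecessor."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

# Build a tiny three-block chain.
genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
b1 = make_block(["bob pays carol 2"], prev_hash=block_hash(genesis))
b2 = make_block(["carol pays dan 1"], prev_hash=block_hash(b1))
chain = [genesis, b1, b2]

print(chain_is_valid(chain))   # True

# Tampering with an earlier block breaks every later link.
genesis["transactions"] = ["alice pays bob 500"]
print(chain_is_valid(chain))   # False
```

This is why the nodes' collective integrity check works: a cheater who rewrites history must also recompute every subsequent block, which proof-of-work makes prohibitively expensive in the real system.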
Bitmessage is an e-mail replacement proposed last year that has been called “the Bitcoin of online communication.” Instead of talking to a central mail server, Bitmessage distributes messages across a network of peers running the Bitmessage software. Unlike both Bitcoin and e-mail, Bitmessage “addresses” are cryptographically derived sequences that help encrypt a message’s contents automatically. That means that many parties help store and deliver the message, but only the intended recipient can read it. Another option obscures the sender’s identity; an alternate address sends the message on her behalf, similar to the anonymous “re-mailers” that arose from the cypherpunk movement of the nineteen-nineties.
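The idea that an address is derived from cryptographic key material, rather than assigned by a server, can be sketched in a few lines. This toy example uses SHA-256 and hex encoding for portability; real Bitmessage derives addresses differently (hashing the public keys and base58-encoding the result), so treat the details here as illustrative assumptions:

```python
import hashlib

def derive_address(public_key: bytes) -> str:
    """Derive a short address by hashing a public key.

    Illustrative only: real Bitmessage hashes with SHA-512 then
    RIPEMD-160 and base58-encodes the result; we use SHA-256 and
    hex here so the sketch runs with the standard library alone.
    """
    digest = hashlib.sha256(public_key).digest()
    return "BM-" + digest[:10].hex()

# Stand-in bytes for a real elliptic-curve public key.
alice_pub = b"\x02" + bytes(range(32))
addr = derive_address(alice_pub)
print(addr)

# The address is deterministic: the same key always yields the same
# address, so the address itself identifies the key used to encrypt
# messages to Alice. No registry or mail server has to hand it out.
assert derive_address(alice_pub) == addr
```

Because the address is just a fingerprint of the key, anyone holding it can encrypt to the owner, and no central authority can reassign it.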
Another ambitious project, Namecoin, is a P2P system almost identical to Bitcoin. But instead of currency, it functions as a decentralized replacement for the Internet’s Domain Name System. The D.N.S. is the essential “phone book” that translates a Web site’s typed address (www.newyorker.com) to the corresponding computer’s numerical I.P. address (192.168.1.1). The directory is decentralized by design, but it still has central points of authority: domain registrars, which buy and lease Web addresses to site owners, and the U.S.-based Internet Corporation for Assigned Names and Numbers, or I.C.A.N.N., which controls the distribution of domains.
The infrastructure does allow for large-scale takedowns, as in 2010, when the Department of Justice tried to seize ten domains it believed to be hosting child pornography but accidentally took down eighty-four thousand innocent Web sites in the process. Instead of centralized registrars, Namecoin uses cryptographic tokens similar to bitcoins to authenticate ownership of “.bit” domains. In theory, these domain names can’t be hijacked by criminals or blocked by governments; no one except the owner can surrender them.
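That ownership model can be illustrated with a toy first-come-first-served registry in which only the holder of a secret can update a name. This is a sketch under stated assumptions: real Namecoin records names in a Bitcoin-style block chain and proves ownership with digital signatures rather than a shared secret, and the class and method names below are invented for the example:

```python
import hashlib

class NameRegistry:
    """Toy first-come-first-served name registry.

    A name belongs to whoever registered it first, proven by a secret
    only the owner holds. (Illustrative only: real Namecoin stores
    records in a block chain and uses ECDSA signatures.)
    """

    def __init__(self):
        self._records = {}  # name -> (owner_token, value)

    def register(self, name, owner_secret, value):
        if name in self._records:
            raise ValueError("name already taken")
        token = hashlib.sha256(owner_secret).hexdigest()
        self._records[name] = (token, value)

    def update(self, name, owner_secret, new_value):
        token, _ = self._records[name]
        if hashlib.sha256(owner_secret).hexdigest() != token:
            raise PermissionError("only the owner can update this name")
        self._records[name] = (token, new_value)

    def resolve(self, name):
        return self._records[name][1]

reg = NameRegistry()
reg.register("example.bit", b"alice-secret", "93.184.216.34")
print(reg.resolve("example.bit"))           # 93.184.216.34

# The owner can repoint the name; anyone else is rejected.
reg.update("example.bit", b"alice-secret", "203.0.113.7")
print(reg.resolve("example.bit"))           # 203.0.113.7

try:
    reg.update("example.bit", b"mallory", "198.51.100.1")
except PermissionError:
    print("hijack rejected")                # hijack rejected
```

The key property is that no registrar or government sits between the owner and the record: control follows the cryptographic secret, not an administrative account.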
Solutions like these follow a path different from Mailpile and ArkOS. Their peer-to-peer architecture holds the potential for greatly improved privacy and security on the Internet. But existing apart from commonly used protocols and standards can also preclude any possibility of widespread adoption. Still, Novak said, the transition to an Internet that relies more extensively on decentralized, P2P technology is “an absolutely essential development,” since it would make many attacks by malicious actors—criminals and intelligence agencies alike—impractical.
Though Snowden has raised the profile of privacy technology, it will be up to engineers and their allies to make that technology viable for the masses. “Decentralization must become a viable alternative,” said Cook, the ArkOS developer, “not just to give options to users that can self-host, but also to put pressure on the political and corporate institutions.”
“Discussions about innovation, resilience, open protocols, data ownership and the numerous surrounding issues,” said Redecentralize’s Bolychevsky, “need to become mainstream if we want the Internet to stay free, democratic, and engaging.”
Illustration by Maximilian Bode.
Tuesday, July 13. 2010
by Nick Jones
I’ll be in South Africa at the end of August at the Gartner Symposium in Cape Town, which means I am just finishing some updates to the presentations I’ll be delivering. If any of my South African colleagues are reading this, there’s no need to remind me that my presentations are late; I’m painfully aware of that fact. My only excuse is that I worked for many years as a software developer, and everyone knows that software people never deliver on time. However, I digress.
I just updated a slide on future smartphone market share which makes depressing reading for Symbian fans. The rate at which Symbian is losing share is accelerating. Our new forecasts will be published at the end of July, but I doubt anyone will be surprised. That’s not to say that Symbian won’t remain the dominant platform for a few years more, but it does mean that the competition – especially Android – is catching up very fast.
Market share is an existential threat to Symbian: it imperils the very existence of the platform. And the main reason Symbian is losing share is its user experience, which isn't competitive with Apple or Android. Based on the early previews I've seen, Symbian 3 looks to have polished a few of the rough edges, but it doesn't fix the problem. So if the weak UI is threatening Symbian's very survival, the Foundation ought to be seriously worried, right? Wrong. I just looked at the roadmap and features for future releases on the Foundation web site and blogs. What I see is too much effort on stuff that really doesn't matter. For example: audio policy packages for Symbian, Wi-Fi Direct, support for an "open cloud manifesto", an accredited Symbian developer program for China, better multitasking, multiple personalised home screens, HDMI connection to external TVs, better web runtime support, better internal architecture and so on.
Forget elegant architecture, forget better multitasking, forget Chinese developers, forget release schedules that don't deliver S4 devices with a new user experience until 2011. None of these matter. People will never use the features if they don't buy the phone. The situation is now serious enough that any developer who isn't working on something directly related to a new UI is wasting their time. The S4 UI is a "bet the platform" project. For any organisation to be in a situation where its survival depends on one project is very dangerous, especially when its track record in the area isn't outstanding. I think the Foundation needs a contingency plan in case the planned S4 interface isn't radical enough or good enough. Maybe redirect some developers and start a couple of skunkworks projects to create new competing UIs for S4, or perhaps announce a competition with a $1M prize for a new Symbian UI to encourage some radical ideas.
I think the Symbian foundation is just re-arranging the deck chairs on the Titanic and ignoring the Android iceberg ahead.
Monday, March 01. 2010
Worldwide mobile phone sales to end users totalled 1.211 billion units in 2009, a 0.9 per cent decline from 2008, according to Gartner, Inc. In the fourth quarter of 2009, the market registered single-digit growth as mobile phone sales to end users surpassed 340 million units, an 8.3 per cent increase from the fourth quarter of 2008.
"The mobile devices market finished on a very positive note, driven by growth in smartphones and low-end devices," said Carolina Milanesi, research director at Gartner. "Smartphone sales to end users continued their strong growth in the fourth quarter of 2009, totalling 53.8 million units, up 41.1 per cent from the same period in 2008. In 2009, smartphone sales reached 172.4 million units, a 23.8 per cent increase from 2008. In 2009, smartphone-focused vendors like Apple and Research In Motion (RIM) successfully captured market share from other larger device producers, controlling 14.4 and 19.9 per cent of the worldwide smartphone market, respectively."
Throughout 2009, intense price competition put pressure on average selling prices (ASPs). The major handset producers had to respond more aggressively in markets such as China and India to compete with white-box producers, while in mature markets they competed hard with each other for market share. Gartner expects the better economic environment and the changing mix of sales to stabilise ASPs in 2010.
Three of the top five mobile phone vendors experienced a decline in sales in 2009 (see Table 1). The top five vendors continued to lose market share to Apple and other vendors, with their combined share dropping from 79.7 per cent in 2008 to 75.3 per cent in 2009.
Note: This table includes iDEN shipments but excludes ODM-to-OEM shipments.
In 2009, Nokia's annual mobile phone sales to end users reached 441 million units, a 2.2 per cent drop in market share from 2008. Although Nokia outperformed industry expectations in sales and revenue in the fourth quarter of 2009, its declining smartphone ASP showed that it continues to face challenges from other smartphone vendors. "Nokia will face a tough first half of 2010 as improvements to Symbian and new products based on the MeeGo platform will not reach the market before the second half of 2010," said Ms Milanesi. "Its very strong mid-tier portfolio will help it hold market share, but its ongoing weakness at the high end of the portfolio will hurt its share of market value."
Samsung was the clear winner among the top five with market share growing by 3.2 percentage points from 2008. This achievement came as a result of improved channel relationships with distributors to extend its reach and better address the needs of individual markets as well as a rich mid-tier portfolio. For 2010, the company is putting a focus on Bada, its new operating system (OS) that aims at adding the value of an ecosystem to its successful hardware lineup.
Motorola sold slightly more than half as many phones as in 2008 and exhibited the sharpest drop in market share, accounting for 4.8 per cent in 2009. "Its refocus away from the low-end market limited the volume opportunity, but should help it drive margins going forward. Motorola's hardest barrier is to grow brand awareness outside the North American market, where it benefits from a long-lasting relationship with key communications service providers (CSPs)."
In the smartphone OS market, Symbian continued its lead, but its share dropped 5.4 percentage points in 2009 (see Table 2). Pressure from competitors such as RIM and Apple, and the continued weakness of Nokia's high-end device sales, have negatively impacted Symbian's share.
At Mobile World Congress 2010, Symbian Foundation announced its first release since Symbian became fully open source. Symbian^3 should be made available by the end of the first quarter of 2010 and may reach the first devices by the third quarter of 2010, while Symbian^4 should be released by the end of 2010.
"Symbian had become uncompetitive in recent years, but its market share, particularly on Nokia devices, is still strong. If Symbian can use this momentum, it could return to positive growth," said Roberta Cozza, principal research analyst at Gartner.
Source: Gartner (February 2010)
The two best performers in 2009 were Android and Apple. Android increased its market share by 3.5 percentage points in 2009, while Apple's share grew by 6.2 percentage points from 2008, which helped it move to the No. 3 position and displace Microsoft Windows Mobile.
“Android's success experienced in the fourth quarter of 2009 should continue into 2010 as more manufacturers launch Android products, but some CSPs and manufacturers have expressed growing concern about Google's intentions in the mobile market,” Ms Cozza said. “If such concerns cause manufacturers to change their product strategies or CSPs to change which devices they stock, this might hinder Android's growth in 2010.”
"Looking back at the announcements during Mobile World Congress 2010, we can expect 2010 to retain a strong focus around operating systems, services and applications while hardware takes a back seat," said Ms Milanesi. "Sales will return to low-double-digit growth, but competition will continue to put a strain on vendors' margins."
Thursday, February 04. 2010
The smartphone market is very hot right now, with handsets selling well and many different companies competing. The open source Android OS is doing very well against the proprietary iPhone OS and Windows Mobile. The most widely used smartphone OS in the world is Symbian, and the Symbian Foundation announced today that its open source migration is complete.
The Symbian OS has been in development for more than ten years and has shipped on more than 330 million devices. The entire source code for the OS is now open source and available to anyone who wants to download it at no charge.
The code can now be used and modified by anyone, for any purpose, from mobile phones to other types of gear. The move was made to position Symbian for growth and faster time to market. I wonder if we will see the Symbian OS start to pop up on consumer electronics such as tablets, as Android is doing. Use of the software is governed by the Eclipse Public License and other open source licenses.
Tuesday, September 22. 2009
Google’s Chrome OS is now available to download. The search giant first announced its plans to release a desktop OS back in July, and this is the first “early development” build available to the general public.
With it you get the GNOME 2.24 desktop environment, Google’s own Chrome 4.0.206 web browser and the OpenOffice.org 3.0 office suite if you really need to get some work done. Since you’ll probably be more excited about “living in the browser,” as Google expects us all eventually to be doing, you’ll be pleased to hear there’s an Adobe Flash Player 10 plugin too, so streaming video will work.
Recent interviews with the Google exec team suggest that Chrome OS may end up moving closer to Android at some point in the future, but for now it’s a standalone platform that they expect to see hitting production netbooks sometime in the second half of 2010. You can find the 476MB Google Chrome OS download here.
fabric | rblg
This blog is the survey website of fabric | ch - studio for architecture, interaction and research.
We curate and reblog articles, researches, writings, exhibitions and projects that we notice and find interesting during our everyday practice and readings.
Most articles concern the intertwined fields of architecture, territory, art, interaction design, thinking and science. From time to time, we also publish documentation about our own work and research, immersed among these related resources and inspirations.
This website is used by fabric | ch as archive, references and resources. It is shared with all those interested in the same topics as we are, in the hope that they will also find valuable references and content in it.