>>> context weblog
sampling new cultural context
friday :: february 24, 2006
   
 
protect net neutrality!

"The Internet’s open, neutral architecture has proven to be an enormous engine for market innovation, economic growth, social discourse, and the free flow of ideas. The remarkable success of the Internet can be traced to a few simple network principles – end-to-end design, layered architecture, and open standards -- which together give consumers choice and control over their online activities. This “neutral” network has supported an explosion of innovation at the edges of the network (...) Because the network is neutral, the creators of new Internet content and services need not seek permission from carriers or pay special fees to be seen online. As a result, we have seen an array of unpredictable new offerings – from Voice-over-IP to wireless home networks to blogging – that might never have evolved had central control of the network been required by design. Allowing broadband carriers to control what people see and do online would fundamentally undermine the principles that have made the Internet such a success. For the foreseeable future most Americans will face little choice among broadband carriers. Enshrining a rule that permits carriers to discriminate in favor of certain kinds or sources of services would place those carriers in control of online activity. Allowing broadband carriers to reserve huge amounts of bandwidth for their own services will not give consumers the broadband Internet our country and economy need. Promoting an open and accessible Internet is critical for consumers. It is also critical to our nation’s competitiveness

(...) Some believe that the Internet was born and flourished out of a fortuitous accident, a random interaction of market forces and technology. But that simply is not the case. The advent of the Internet took tremendous vision and initiative by numerous network engineers, software developers, hardware vendors, and entrepreneurs. That advent also included visionary U.S. policymakers who recognized that the government largely needed to get out of the way and allow the free market to work its genius in this new interactive, online environment. At the same time, as I will explain below, that policy judgment rested on an existing regulatory framework that allowed open and nondiscriminatory access to the Internet. I was fortunate to be involved in the earliest days of the “network of networks.” From that experience, I can attest to how the actual design of the Internet – the way its digital hardware and software protocols, including the TCP/IP suite, were put together – led to its remarkable economic and social success.

First, the layered nature of the Internet describes the “what,” or its overall structural architecture. The use of layering means that functional tasks are divided up and assigned to different software-based protocol layers. For example, the “physical” layers of the network govern how electrical signals are carried over a physical medium, such as copper wire or radio waves. The “transport” layers help route the user’s data packets to their correct destinations, while the application layers control how those packets are used by a consumer's email program, web browser, or other computer application. This simple and flexible system creates a network of modular “building blocks,” where applications or protocols at higher layers can be developed or modified with no impact on lower layers, while lower layers can adopt new transmission and switching technologies without requiring changes to upper layers. Reliance on a layered system greatly facilitates the unimpeded delivery of packets from one point to another.

Second, the end-to-end design principle describes the “where,” or the place for network functions to reside in the layered protocol stack. With the Internet, decisions were made to allow the control and intelligence functions to reside largely with users at the “edges” of the network, rather than in the core of the network itself. For example, it is the user’s choice what security to use for his or her communications, what VOIP system to use in assembling digital bits into voice communications, or what web browser to adopt. This is precisely the opposite of the traditional telephony and cable networks, where control over permitted applications is handled in the core (in headends and central offices), away from the users at the edge. As a result, the power and functionality of the Internet are left in the hands of the end users.

Third, the design of the Internet Protocol, or the “how,” allows for the separation of the networks from the services that ride on top of them. IP was designed to be an open standard, so that anyone could use it to create new applications and new networks (by nature, IP is completely indifferent to both the underlying physical networks, and to the countless applications and devices using those networks). As it turns out, IP quickly became the ubiquitous bearer protocol at the center of the Internet. Thus, using IP, individuals are free to create new and innovative applications that they know will work on the network in predictable ways.

Finally, from these different yet related design components, one can see the overarching rationale – the “why” – that no central gatekeeper should exert control over the Internet. This governing principle allows for vibrant user activity and creativity to occur at the network edges. In such an environment, entrepreneurs need not worry about getting permission for their inventions to reach the end users. In essence, the Internet has become a platform for innovation. One could think of it like the electric grid, where the ready availability of an open, standardized, and stable source of electricity allows anyone to build and use a myriad of different electric devices. This is a direct contrast to closed networks like the cable video system, where network owners control what the consumer can see and do.

In addition to this architectural design, the Internet has thrived because of an underlying regulatory framework that supported openness. Wisely, government has largely avoided regulating the Internet directly.

(...) In the zone of governmental noninterference surrounding the Internet, one crucial exception had been the nondiscrimination requirements for the so-called last mile. Developed by the FCC over a decade before the commercial advent of the Internet, these “Computer Inquiry” safeguards required that the underlying providers of last-mile network facilities – the incumbent local telephone companies – allow end users to choose any ISP and utilize any device they desired. In turn, ISPs were allowed to purchase retail telecommunications services from the local carriers on nondiscriminatory rates, terms, and conditions. The end result was, paradoxically, a regulatory safeguard applied to last-mile facilities that allowed the Internet itself to remain open and “unregulated” as originally designed. Indeed, it is hard to imagine the innovation and creativity of the commercial Internet in the 1990s ever occurring without those minimal but necessary safeguards already in place. By removing any possibility of ILEC barriers to entry, the FCC paved the way for an explosion in what some have called “innovation without permission.” A generation of innovators – like Tim Berners-Lee with the World Wide Web, Yair Goldfinger with Instant Messaging, David Filo and Jerry Yang with Yahoo!, Jeff Bezos with Amazon, and Larry Page and Sergey Brin with Google – were able to offer new applications and services to the world, without needing permission from network operators or paying exorbitant carrier rents to ensure that their services were seen online. And we all have benefited enormously from their inventions." >from *The Testimony of Mr. Vinton Cerf*, Vice President and Chief Internet Evangelist, Google. Senate Committee on Commerce, Science, and Transportation. Full Committee Hearing on Net Neutrality. February 7, 2006
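
A minimal sketch of what the layered, end-to-end design described in the testimony means for a program. The snippet assumes only the Python standard library; the host and the HTTP request are illustrative, not taken from the testimony. The application speaks only to the transport layer through a TCP socket; which route the packets take, and whether they travel over copper, fibre or radio, is decided by lower layers the program never sees, and no carrier's permission is involved.

import socket

HOST = "example.com"  # illustrative target; any public web server would do

# Open a TCP connection (transport layer) and exchange bytes.
# Routing (IP) and the physical medium are handled below this level.
with socket.create_connection((HOST, 80), timeout=10) as s:
    s.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    reply = b""
    while True:
        chunk = s.recv(4096)
        if not chunk:
            break
        reply += chunk

# First line of the response, e.g. "HTTP/1.1 200 OK"
print(reply.decode(errors="replace").splitlines()[0])

Swapping the underlying network (home WiFi, office Ethernet, a mobile link) requires no change to this code, which is the practical meaning of the layered design.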

related context
>
Full Committee Hearing on Net Neutrality. Senate Committee on Commerce, Science, and Transportation. February 7, 2006
> network neutrality.
> universities diffused internet technology in the mid-1990s. 'students brought it to the public.' february 22, 2006
> rfc3271: the internet is for everyone. may 6, 2002

imago
>
net noninterference path

sonic flow
>
end to end design [stream]
end to end design [download]

| permaLink

 
friday :: february 17, 2006
   
 
feral robots: bridging robotics with pervasive location-based public authoring

Successful field trial of a public authoring feral robot. The robot combines air quality and carbon dioxide sensors and a GPS unit with WiFi communications to upload its geo-referenced readings to the Urban Tapestries platform. Over the next two months the team will be developing a web interface to allow more sophisticated visualisation and interrogation of the robot's sensor readings, and to combine them with local knowledge uploaded to Urban Tapestries by people in the area.
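
As a rough sketch of the data path described above, the snippet below bundles a GPS fix with the gas measurements and posts it as a single geo-referenced reading over the WiFi link. The endpoint URL, field names and sample values are hypothetical placeholders; the actual Urban Tapestries upload interface is not documented here.

import json
import time
import urllib.request

# Hypothetical endpoint standing in for the Urban Tapestries platform.
UPLOAD_URL = "http://urbantapestries.example/api/readings"

# One geo-referenced reading; on the robot these values would come from
# the GPS unit and the air quality / carbon dioxide sensors.
reading = {
    "timestamp": time.time(),
    "lat": 51.5268,            # placeholder GPS fix
    "lon": -0.0876,
    "co2_ppm": 410.0,          # placeholder carbon dioxide measurement
    "air_quality_index": 3,    # placeholder air quality measurement
}

request = urllib.request.Request(
    UPLOAD_URL,
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(request)  # sent whenever the robot has WiFi coverage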

Social Tapestries is a research programme exploring the potential benefits and costs of local knowledge mapping and sharing, which Proboscis has termed the public authoring of social knowledge. Over the next few years Proboscis will be developing a series of experimental uses of public authoring to demonstrate the social and cultural benefits of local knowledge sharing enabled by mobile and network technologies. These playful and challenging experiments build upon the Urban Tapestries framework and software platform developed by Proboscis and its partners.

The aim is to bridge the experimental robotics field with that of pervasive location-based public authoring. The collaboration is intended to investigate the possibilities for artists and engineers to develop compelling new forms of social and cultural intervention that can be adopted and adapted by ordinary people, using the tools and materials available to them.

Proboscis and Natalie Jeremijenko are collaborating to explore intersections between their current work, specifically Proboscis' Urban Tapestries platform and Natalie Jeremijenko's Feral Robots. Jeremijenko's experimental robotics projects reconfigure low-cost robots that are sold as consumer toys into vehicles of social and cultural activism. Her workshops have explored the possibilities of robotics breaking out of the academic lab and using the economies of scale of consumer manufacturers to put sophisticated equipment into the hands of the general public. Her project has developed a series of kits that adapt the toy robots into powerful sensing devices for locating and identifying chemical pollution and radiation.

Combining these two modes of social and cultural exploration will form the basis of the Visiting Fellowship, which aims to couple the practical, hands-on approach of 'hobbyist' robotics with the ability to annotate specific geo-locations. Our aim is to design and create practical applications of such 'creative misuse' of commercially available technologies for social and cultural public benefit. >from *Robotic Feral Public Authoring site*.

related context
>
interactive city. 'contemporary city is weighted down. we dream of something more. not something planned and canned, like another confectionary spectacle. something that can respond to our dreams. something that will transform with us, not just perform change on us, like an operation.' isea, august, 2006
> pigeonblog. an alternative, participatory approach to environmental air pollution data gathering. it equips urban homing pigeons with gps-enabled electronic air pollution sensing devices, capable of sending location-based air pollution data as well as images to an online mapping/blogging environment in real time. august, 2006
> biology inspires perceptive machines. february 9, 2006
> nsf releases 'sensors for environmental observatories' report. 'in place sensors are producing a revolution in our understanding of the environment... better-informed public policies that address the interactions between human society and the natural environment will be the result.' january 17, 2006
> gridswarms. may 20, 2005
> datacities: sensity. may 13, 2005
> plan: pervasive and locative arts network. january 28, 2005
> greenbots project. 'artbots are small robots able to develop some type of creative activity...' december 17, 2004
> the sensor revolution. march 2, 2004

imago
>
escaped from domestication

sonic flow
>
feral robot [stream]
feral robot [download]

| permaLink

 
friday :: february 10, 2006
   
 
discovery of two super heavy elements ratified

For the first time, a Swiss research group has participated in the discovery of new chemical elements. The elements have the atomic numbers 113 and 115 and were discovered by a combination of physical and chemical techniques at the Russian nuclear research centre (JINR) in Dubna. With its radiochemical expertise, the Paul Scherrer Institute (PSI) was central to the experiment’s success.

Chemistry is currently pushing the boundaries of the scientific unknown. Until 1940, uranium was the heaviest known element. This naturally occurring metal has atomic number 92, as it has 92 positively charged protons in its nucleus. Since then, over twenty elements with higher atomic numbers have been discovered.

The birth of element 115 [ununpentium]: heavy elements decay by emitting charged helium nuclei – alpha particles. Such decay chains were used by American, Russian and Swiss scientists to physically prove the existence of element 115 and its decay product after emission of the first alpha particle – element 113 [ununtrium]. In order to synthesize atoms of element 115, a rotating target disc of americium was bombarded with a calcium beam. In a fusion reaction between target and beam particles, element 115 was born. However, their formation alone was not sufficient to prove the element’s existence, as its atoms lived a mere tenth of a second and were difficult to detect. The radiochemical experiment proved much more successful, as it yielded five times as many atoms.
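
The fusion step can be written out explicitly. The isotopes below (a calcium-48 beam on an americium-243 target, with three neutrons evaporated from the excited compound nucleus) are those generally reported for the Dubna experiments; the excerpt does not name them, so they are stated here as an assumption.

\[
{}^{48}_{20}\mathrm{Ca} \;+\; {}^{243}_{95}\mathrm{Am} \;\longrightarrow\; {}^{291}_{115}\mathrm{Uup}^{*} \;\longrightarrow\; {}^{288}_{115}\mathrm{Uup} \;+\; 3\,{}^{1}_{0}\mathrm{n}
\]

The proton count balances (20 + 95 = 115), and the mass number drops from 291 to 288 as the three neutrons boil off.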

Radiochemical proof: As expected, element 115 decayed by emitting an alpha particle to become element 113 and then, through four further alpha emissions, dubnium, element 105. It was here that the elegant experimental approach from the PSI came into play. Behind the rotating americium disc (target), a copper plate was placed which collected all element 115 atoms emitted from the target. The copper plate was chemically processed by means of liquid chromatography techniques, and 15 atoms of dubnium (which have a half-life of 32 hours) were observed. The decay pattern of these atoms supported the evidence of the physics experiment. Thus the discovery of element 115 and its progeny, element 113, was proven. All elements below atomic number 113 are already known. >from *Two super heavy elements discovered*. Element identification in Russia, due to Swiss expertise in radiochemistry. January 31, 2006
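
Written as a chain, the decay sequence described above (one alpha emission to element 113, then four more down to dubnium, each alpha carrying away two protons and four nucleons) looks as follows; the mass numbers assume the same three-neutron channel as in the sketch above.

\[
{}^{288}_{115}\mathrm{Uup} \xrightarrow{\alpha} {}^{284}_{113}\mathrm{Uut} \xrightarrow{\alpha} {}^{280}_{111}\mathrm{Rg} \xrightarrow{\alpha} {}^{276}_{109}\mathrm{Mt} \xrightarrow{\alpha} {}^{272}_{107}\mathrm{Bh} \xrightarrow{\alpha} {}^{268}_{105}\mathrm{Db}
\]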

related context
>
nucleosynthesis. the process of creating new atomic nuclei from preexisting nucleons.
> about superheavy elements. elements with an atomic number greater than 112, produced artificially in cyclotron experiments.
> livermore scientists team with russia to discover elements 113 and 115. february 2, 2004
> nuclear physicists confirm element 110 discovery. 'confirmation that element 110 can be made in collisions between lead and nickel nuclei is noteworthy given the recent scandal over element 118.' july 4, 2003
> lawrence berkeley lab concludes that evidence of element 118 was a fabrication. 'finding superheavy element 118 would have been a giant step in the quest for the conjectured island of nuclear stability.' september, 2002.
> results of element 118 experiment retracted. 'science is self-correcting. if you get the facts wrong, your experiment is not reproducible.' july 27, 2001

imago
>
r-process nucleosynthesis

sonic flow
>
super heavy elements 113 and 115 [stream]
super heavy elements 113 and 115 [download]

| permaLink

 
friday :: february 3, 2006
   
 
rain gardens: urban low-tech against pollution

Properly designed 'rain gardens' can effectively trap and retain up to 99 percent of common pollutants in urban storm runoff, potentially improving water quality and promoting the conversion of some pollutants into less harmful compounds, according to new research. The affordable, easy-to-design gardens could help solve one of the nation’s most pressing pollution problems, according to the study’s authors, Michael Dietz and John Clausen of the University of Connecticut.

More than half of the rainwater that falls on a typical city block, one with 75 percent or more impervious cover — such as roads or parking lots — will leave as runoff, according to the Environmental Protection Agency. This runoff includes metals, oils, fertilizers and other particulate matter, the Connecticut researchers note. Easy-to-construct rain gardens — shallow depressions in the earth landscaped with hardy shrubs and plants such as chokeberry or winterberry surrounded by bark mulch — offer a simple remedy to this problem, they say.
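
A rough back-of-the-envelope check of that figure, assuming textbook runoff coefficients of about 0.9 for paved surfaces and 0.2 for lawns and soil (the study's own values are not given in the excerpt):

\[
C_{\text{block}} \approx 0.75 \times 0.9 \;+\; 0.25 \times 0.2 \;=\; 0.725
\]

so roughly 70 percent of the rain falling on such a block leaves as runoff, consistent with the EPA's "more than half".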

The gardens are designed to replicate the natural water cycle that existed before roads and other impervious surfaces were constructed, Dietz and Clausen say. As the water collects and soaks into the rain garden, it infiltrates into the ground rather than draining directly into sewers or waterways. The gardens work well year-round, they say.

In their two-year study of roof-water runoff, the researchers found that rain gardens significantly reduced concentrations of nitrates, ammonia, phosphorus and other pollutants reaching storm drains. In addition, design tweaks that allowed polluted rainwater to pool at the bottom of the gardens permitted bacteria in the soil to convert harmful nitrates into nitrogen gas, preventing them from entering the groundwater.
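
The conversion described here is denitrification: when the pooled water keeps oxygen levels in the soil low, bacteria respire using nitrate instead of oxygen and release nitrogen gas. A simplified view of the pathway, with the overall stoichiometry written assuming organic matter (represented as CH2O) as the electron donor:

\[
\mathrm{NO_3^-} \;\rightarrow\; \mathrm{NO_2^-} \;\rightarrow\; \mathrm{NO} \;\rightarrow\; \mathrm{N_2O} \;\rightarrow\; \mathrm{N_2}
\]
\[
5\,\mathrm{CH_2O} \;+\; 4\,\mathrm{NO_3^-} \;+\; 4\,\mathrm{H^+} \;\longrightarrow\; 2\,\mathrm{N_2} \;+\; 5\,\mathrm{CO_2} \;+\; 7\,\mathrm{H_2O}
\]

The nitrogen escapes to the air rather than reaching the groundwater, which is the benefit the authors report.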

Dietz and Clausen hope their results will encourage developers and homeowners to create these low-tech rain water collectors. “Rain gardens are pleasing to look at, while they are performing an important function,” Dietz says. >from *Rain gardens soak up urban storm water pollution*. January 26, 2006

related context
>
research/demonstration rain garden.
> the green into our urban open spaces. december 10, 2004
> parccentralpark. 'an emerging urban kitchen.' may-september, 2004
> second skin: emergent architecture. december 16, 2002

imago
>
visual pollutant treatment

sonic flow
>
rain garden [stream]
rain garden [download]

| permaLink

 






   "active, informed citizen participation is the key to shaping the network society. a new 'public sphere' is required." seattle statement