> context weblog november 2002
sampling new cultural context
| home | site map | about context | lang >>> español - català |
 
 
thursday :: november 28, 2002
 
international children's digital library
:: to serve and explore with young readers

Led by the University of Maryland and the Internet Archive, a partnership of government, non-profit, industry and academic organizations launched the world's largest international digital library for children.

The project is funded by the National Science Foundation with additional support from other partners as part of a long-term research project to develop new technology to serve young readers. The *International Children's Digital Library* (ICDL, see http://www.icdlbooks.org/) will provide children ages 3 to 13 with an unparalleled opportunity to experience different cultures through literature. The new digital library will begin with 200 books in 15 languages representing 27 cultures, with plans to grow over five years to 10,000 books representing 100 different cultures.

While the ICDL's collection is intended to provide access to the best children's books worldwide, a primary long-term benefit of the project may be in discovering how children can best interact with digital books. "Engaging stories help children grow intellectually and emotionally, gain an understanding of who they are and learn about others and the world around them, all while having a great deal of fun," said Allison Druin, project leader of the ICDL at the University of Maryland. "We believe that the International Children's Digital Library can provide an important new avenue for children to experience new books and explore other cultures."

In compiling the 10,000-book collection, the ICDL will also work to understand data acquisition and rights management in the creation of a large-scale digital library. The ICDL will collaborate with industry partners on technologies needed to incorporate copyrighted books while respecting intellectual property laws and rights. >from *NSF-Supported International Children's Digital Library to Launch November 20*, november 18, 2002.

related context
> Internet Archive
> project gutenberg: net public domain library awarded. october 31, 2002

> kids and books
wednesday :: november 27, 2002
 
 
> experimental tele-immersion
tele-immersion demonstration
:: milestone of grid computing

Drawing on a bank of cameras that constantly scans participants and their surroundings, tele-immersion allows participants in different states to feel as if they’re chatting in the same room. But gathering such comprehensive, real-time measurements of a person and his environment takes a toll: Tele-immersion generates huge amounts of data, requiring massive computing power and bandwidth.

While the tele-immersion system gathered and displayed information in side-by-side booths at the Baltimore Convention Center, actual data processing occurred some 250 miles away at the Pittsburgh Supercomputing Center. The boost in computing power achieved with the move to the Pittsburgh Supercomputing Center will permit at least one significant advance in tele-immersion's capabilities: for the first time, the system will be able to image an entire room in real time. Previously, limited processing power restricted the gathering of images to a small area where participants were seated, while the background remained static, not unlike a television anchor seated before an unchanging image of a city skyline.

"The reassigning of tele-immersion data processing to a faraway supercomputing center is a milestone for grid computing, which uses remote machines to process data,” Kostas Daniilidis said. "If connections are fast enough – as with Internet2 – the network itself becomes a giant computer, linking processors scattered over many hundreds of miles. This tele- immersion experiment shows definitively that a network computer configured this way can handle extremely data-intensive operations much more quickly than if processing were occurring within the confines of a single room."

All this computing is for a good cause. Daniilidis and his colleagues say tele-immersion may well revolutionize the way people communicate, allowing people on opposite ends of the country or world to feel temporarily as if they're in each other's presence. Key to tele-immersion's realistic feel are a hemispherical bank of digital cameras that capture participants from a variety of angles and tracking gear worn on their heads. Combined with polarized glasses much like those worn at 3D movies, the setup presents subtly different images to each eye, much as our two eyes see in daily life.
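As a rough illustration of the geometry behind such multi-camera capture, the depth of a point can be recovered from how far its image shifts between two offset cameras. This is the standard stereo triangulation relation with assumed numbers, not the project's actual reconstruction pipeline:

```python
# Depth from two horizontally offset cameras via the standard stereo relation
# Z = f * B / d (focal length x baseline / disparity). Values below are
# illustrative assumptions, not parameters of the tele-immersion system.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to a point whose image shifts by `disparity_px` between the two cameras."""
    return focal_px * baseline_m / disparity_px

# A feature that shifts 40 pixels between cameras 30 cm apart, imaged with an
# 800-pixel focal length, lies about 6 meters from the cameras.
print(depth_from_disparity(focal_px=800, baseline_m=0.30, disparity_px=40))  # -> 6.0
```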

The tele-immersion collaboration involving Penn, UNC and the Pittsburgh Supercomputing Center is funded by the National Science Foundation. >from *Tele-Immersion System Is First ‘Network Computer,’ with Input, Processing and Output in Different Locations*, november 18, 2002.

related context
> About Internet2
> first transatlantic touch: virtual reality touch. november 4, 2002

 
 
 
tuesday :: november 26, 2002
 
computer model of e-coli
:: predict evolution of bacteria

Bioengineers used their computer model of E-coli (patent pending) to accurately predict how the bacteria would evolve under specific conditions. The results may have applications for designing tailor-made biological materials for commercial uses or for predicting the evolution of drug-resistant bacteria.

"This is totally revolutionary - that you can actually predict the outcome of such a complicated and intricate process as adaptive evolution," says Bernhard Palsson, UCSD Bioengineering Professor and study co-author with UCSD Bioengineering Professor Jeff Hasty. The study also serves as an example of the power of systems biology, a hot emerging field dedicated to employing mathematics and computer simulation to understand how genes and proteins work together to control the function of cells.

Palsson first created a computer model of E-coli in 2000, and since then has shown that the model accurately mimics the behavior of the bacteria 80% of the time. Over the past two decades, Palsson has been working on the enormous challenge of creating computer models of these biological functions. He employs a technique he calls constraints-based modeling - basically describing what a cell does not do in order to define what it can do, through a process of elimination. To date, Palsson has created in silico models of metabolism for E-coli, the red blood cell, H. influenzae, H. pylori and yeast.
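A minimal sketch of the constraints-based idea, using a made-up two-metabolite network rather than Palsson's actual E-coli reconstruction: steady-state mass balance and capacity bounds rule out impossible flux patterns, and a linear program then finds the best the cell can do within whatever remains.

```python
# Toy flux-balance style model, in the spirit of constraints-based modeling.
# Hypothetical two-metabolite, three-reaction network; NOT the published E-coli model.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B; columns: reactions v1, v2, v3)
#   v1: uptake -> A,   v2: A -> B,   v3: B -> biomass drain
S = np.array([[1, -1,  0],
              [0,  1, -1]])

# Constraint 1: steady state, S @ v = 0 (metabolites neither pile up nor run out).
# Constraint 2: capacity bounds, i.e. what each reaction *cannot* exceed.
bounds = [(0, 10),    # uptake capped at 10 units (assumed capacity)
          (0, None),  # internal conversion otherwise unconstrained
          (0, None)]  # biomass drain otherwise unconstrained

# Within everything the constraints have not eliminated, maximize the biomass
# flux v3 (linprog minimizes, so negate the objective).
c = [0, 0, -1]
result = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal flux distribution:", result.x)  # all flux funnels through: [10, 10, 10]
```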

Researchers are beginning to build and test synthetic versions of control modules within the circuitry of cells. For example, Hasty has developed a model of a positive feedback loop, in which a gene produces a protein which in turn causes that gene to become more active. He says that as scientists begin to synthesize these simple network modules in the context of mathematical models, it will set the stage for controlling cellular function, which could have important applications in nanotechnology and gene and cell therapy. Hasty's long-term goal is to build synthetic genetic networks which could be inserted into a patient's cells to tightly regulate the expression of a desired protein, or even to cause an undesirable cell to self-destruct. >from *UCSD Bioengineers Use Computer Model to Predict Evolution of Bacteria*, november 20, 2002.
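A minimal sketch of such a positive feedback loop as a one-variable rate equation, with made-up parameter values rather than the specific model from the study: the protein activates its own production, and depending on where the system starts it settles into either a low or a high expression state.

```python
# Positive feedback: protein x activates the gene that produces it.
# Illustrative Hill-function model with assumed parameters; not Hasty's published model.

def dxdt(x, alpha=0.1, beta=2.0, K=1.0, n=2, gamma=1.0):
    # basal expression + self-activation (Hill term) - first-order degradation
    return alpha + beta * x**n / (K**n + x**n) - gamma * x

# Crude Euler integration from two starting points shows two distinct steady
# states (bistability), a hallmark of positive feedback loops.
for x0 in (0.05, 2.0):
    x, dt = x0, 0.01
    for _ in range(5000):
        x += dt * dxdt(x)
    print(f"start at {x0:.2f} -> settles near {x:.2f}")
```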

related context
> Worldwide E.coli alliance. september 4, 2002

> e-coli: bits of life; life in bits?
monday :: november 25, 2002
 
 
> solar cell for automotive use: service station
full spectrum solar cell
:: unexpected discovery

A single system of alloys incorporating indium, gallium, and nitrogen can convert virtually the full spectrum of sunlight - from the near infrared to the far ultraviolet - to electrical current.

What began as a basic research question points to a practical application of great potential value. For if solar cells can be made with this alloy, they promise to be rugged, relatively inexpensive, and the most efficient ever created. Wladek Walukiewicz and his colleagues (from Lawrence Berkeley National Laboratory - Berkeley Lab - together with a group at Cornell University headed by William Schaff and a group at Japan's Ritsumeikan University headed by Yasushi Nanishi) were studying not how semiconductors absorb light to create electrical power, but how they use electricity to emit light.

"If it works, the cost should be on the same order of magnitude as traffic lights," Walukiewicz says. "Maybe less." Solar cells so efficient and so relatively cheap could revolutionize the use of solar power not just in space but also on Earth. From *An unexpected discovery could yield a full spectrum solar cell*, november 18, 2002.

related context
> energy for greenhouse planet: towards a global energy system. november 13, 2002
> plastic spintronics: from silicon to plastic based computers. october 1, 2002
> nanoparticles used in solar energy conversion. august 9, 2002

 
 
 
thursday :: november 21, 2002
 
world's fastest supercomputers
:: 2002 top500 list released

In what has become a much-anticipated event in the world of high performance computing, the 20th edition of the "TOP500" list of the world's fastest supercomputers was released.

The Earth Simulator supercomputer, installed earlier this year at the Earth Simulator Center in Yokohama, Japan, retains the number one position with its Linpack benchmark performance of 35.86 Tflop/s (trillions of calculations per second); last year the number one supercomputer in the world was the IBM ASCI White system at Lawrence Livermore National Laboratory, with a performance of 7.2 Tflop/s. The #2 and #3 positions are held by two new, identical ASCI Q systems at Los Alamos National Laboratory (7.73 Tflop/s each). These systems are built by Hewlett-Packard and based on the AlphaServer SC computer system.

For the first time ever, 2 PC-based clusters were able to gain a top 10 spot. At position 5 is a cluster at Lawrence Livermore National Laboratory built by Linux NetworX and Quadrics. At position 8 is a cluster at the Forecast Systems Laboratory at NOAA built by HPTi with a Myrinet interconnect.

In terms of total performance of all the installed systems, the latest TOP500 edition shows IBM as still the clear leader with 31.8 percent, ahead of HP with 22.1 percent and NEC with 14.6 percent. A total of 55 Intel-based and eight AMD-based PC clusters are now present in the TOP500. The number of clusters in the TOP500 grew again to a total of 93 systems. Fourteen of these clusters are labeled as 'Self-Made', as they are designed and assembled by the end users themselves.

Seven of the TOP10 systems, 46% of all 500 systems, and 51% of the total performance are installed in the United States. Also, 91% of all 500 systems are produced in the United States. >from *TOP500 List Of World's Fastest Supercomputers Released*, november 15, 2002.

related context
> Earth Simulator press release. april 18, 2002
> linux for world's biggest computer. january 31, 2002
> world's fastest supercomputers: latest edition of top500 list released. november 22, 2001
> asci white supercomputer. june 29, 2000

> the vacuum packed computer concept! by Allen,
underemployed inventor and industrial designer
wednesday :: november 20, 2002
 
 
> clean out electrical communication
systems to circumvent Internet censorship
:: possible weaknesses

A wide variety of systems, including programs by the names of *Triangle Boy*, *Peek-A-Booty*, *Six/Four*, and *CGIProxy*, have been proposed for circumventing Internet censorship in countries such as China and Saudi Arabia, with no clear winner emerging as the single best anti-censorship solution.

One reason is that there hasn't been much discussion of how well these systems would hold up against the various types of attacks that could be mounted by the censors. The worst thing that could happen would be for an anti-censorship system to be widely deployed - with volunteers all over the world running software to assist in the effort, and people in China and other censored countries using it every day to beat censorship - only for the censors to discover a flaw that undermines and blocks the whole system. If the censors discover a technique to detect circumvention traffic, then not only can the system be blocked and rendered obsolete, but if the traffic can be traced back to individual users in the censored countries, the penalties imposed on them could be severe.

Plus, if the traffic is detectable, the censors can also trace it to the sites outside their country which are helping defeat Internet censorship, and add those sites to a permanent blacklist. Even if those sites later upgrade to a more secure, undetectable version of the software, they will still be blacklisted, and it may be prohibitively difficult for them to move to a new location to get around the blacklist.

So, it is a high priority to think of possible attacks against a system before the system is deployed. The document is a collection of common attacks, weaknesses, and fallacies that must be avoided. >from *List of possible weaknesses in systems to circumvent Internet censorship* by Bennett Haselton

related context
> response to "List of Possible Weaknesses in Systems to Circumvent Internet Censorship" by Paul Baranowski. november 11, 2002
> the free network project: freedom of communication. november 5, 2002
> the hacktivismo declaration: assertions of liberty in support of an uncensored internet. july 18, 2002
> CodeCon 2002: p2p and crypto programming. february 21, 2002

 
 
 
tuesday :: november 19, 2002
 
transfer data at world record pace
:: developing grid concept

Physicists recently set a world record for high-speed disk-to-disk transfer of research data. The rates achieved were equivalent to transferring all the data from a full-length DVD movie from one part of the world to another in less than 60 seconds, or a full compact disk in less than eight seconds. Within three hours, physicists successfully moved one terabyte of research data - equivalent to roughly 1,500 CDs - from TRIUMF, the particle physics lab in Vancouver, to CERN, the international particle physics lab in Geneva.
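A quick back-of-the-envelope check of those figures (using an assumed 4.7 GB DVD and 700 MB CD, capacities the announcement does not specify) shows they all imply a sustained disk-to-disk rate on the order of a gigabit per second:

```python
# Implied sustained throughput for each quoted comparison (decimal units assumed).
GB = 1e9
cases = {
    "1 TB in 3 hours":    1e12 / (3 * 3600),
    "4.7 GB DVD in 60 s": 4.7 * GB / 60,
    "700 MB CD in 8 s":   700e6 / 8,
}
for label, bytes_per_s in cases.items():
    print(f"{label}: ~{bytes_per_s / 1e6:.0f} MB/s (~{8 * bytes_per_s / 1e9:.2f} Gbit/s)")
# All three work out to roughly 80-95 MB/s, i.e. about 0.6-0.75 Gbit/s sustained.
```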

"We are now coming to grips with the practical problems of establishing a world-wide computational grid to process the unprecedented amounts of data that will be generated by the ATLAS experiment when it begins to operate in 2007," said James Pinfold, who speaks for ATLAS Canada, the Canadian contingent of an international collaboration of 2,000 physicists. The ATLAS experiment will employ the world's highest-energy collider (the LHC) to explore the fundamental nature of matter and the basic forces which shape our universe. "The only way to deal with such unprecedented amounts of data is to utilize a new class of computing infrastructure--the GRID. The GRID is the World Wide Web of the 21st century," said Pinfold. The GRID will eventually link together computers, supercomputers and storage centers across the globe to create a world computer that makes it possible for the increasingly large and international modern scientific collaborations to share resources on a mammoth scale. This allows globally distributed groups to work together in ways that were previously impossible. The GRID "world computer" must have lightning fast and reliable communication of information between its nodes. >from *University of Alberta physicist helps transfer data at world record pace*, November 13, 2002

related context
> indiana virtual machine room: tera-scale supercomputer grid. june 13, 2002
> mmg: massively-multiplayer games platform. may 10, 2002
> science grid deployment: emerging model of computing. april 3, 2002
> end of lep accelerator at cern: large electron positron collider. december, 2000

> chronos, our father time
monday :: november 18, 2002
 
 
> I want you as glass,
never as a mirror

mirror matter in the solar system?
:: dark matter candidate

Mirror matter is an entirely new form of matter predicted to exist if mirror symmetry is a fundamental symmetry of nature. Mirror matter has the right broad properties to explain the inferred dark matter of the Universe and might also be responsible for a variety of other puzzles in particle physics, astrophysics, meteoritics and planetary science.

The mysterious blue 'ponds' and the dearth of small craters found on the asteroid Eros by the NEAR-Shoemaker spacecraft can be explained by the mirror matter space-body hypothesis, according to Robert Foot and Saibal Mitra.

Mirror matter is a candidate for the mysterious dark matter inferred to exist in galaxies. Some amount of mirror matter should exist in our solar system, perhaps not enough to form a large body like a planet, but enough to form small asteroid-sized space-bodies. Indeed, the total mass of asteroids in the asteroid belt is estimated to be only about 0.05% of the mass of the Earth. A similar, or even greater, number of mirror bodies - perhaps orbiting in a different plane, or even spherically distributed like the Oort cloud - is a fascinating possibility.

An even more remarkable possibility is that the comets themselves are the mirror matter space-bodies! Observations by spacecraft have found that the cores of comets appear to be composed of extremely dark material. Could this dark material be the dark matter? >from *Mirror matter in the solar system: New evidence for mirror matter from Eros by Robert Foot and Saibal Mitra*, november 4, 2002, and *Mirror matter footprint on Eros?*

related context
> the seven wonders of the mirror world
> center for cosmological physics: probing phenomena beyond standard model. september 13, 2001
> first "map" of dark matter. march 7, 2000

 
 
 
thursday :: november 14, 2002
 
the advent of genomic medicine
:: to take center stage in clinical medicine

Humans have known for millennia that heredity affects health. However, Mendel's seminal contribution to the elucidation of the mechanisms by which heredity affects phenotype occurred less than 150 years ago, and Garrod began applying this knowledge to human health only at the start of the past century. For most of the 20th century, many medical practitioners viewed genetics as an esoteric academic specialty; that view is now dangerously outdated.

Except for monozygotic twins, each person's genome is unique. All physicians will soon need to understand the concept of genetic variability, its interactions with the environment, and its implications for patient care. With the sequencing of the human genome only months from its finish, the practice of medicine has now entered an era in which the individual patient's genome will help determine the optimal approach to care, whether it is preventive, diagnostic, or therapeutic. Genomics, which has quickly emerged as the central basic science of biomedical research, is poised to take center stage in clinical medicine as well. >from *Genomic Medicine: A Primer* by Alan E. Guttmacher, M.D., and Francis S. Collins, M.D., Ph.D., The New England Journal of Medicine, volume 347:1512-1520, number 19, november 7, 2002

related context
> nhgri launches genome.gov. june 24, 2002
> ethical dilemmas of human genome. may 23, 2002
> gene(sis): contemporary art explores human genomics. april 5, 2002
> working draft of the human genome. june 26th, 2000

> medicine, what medicine?
wednesday :: november 13, 2002
 
 
> new refineries ?
energy for greenhouse planet
:: towards a global energy system

In an effort to stabilize climate and slow down global warming, an international team of researchers has evaluated a series of new primary energy sources that either do not emit carbon dioxide or limit the amount released to the atmosphere.

Possible candidates for primary energy sources include terrestrial solar and wind energy; solar-power satellites; biomass; nuclear fission; nuclear fusion; fission-fusion hybrids; and fossil fuels from which carbon has been removed.

"Reducing carbon dioxide emissions will require major changes in how we produce, distribute, store and use our energy," said Ken Caldeira, one author of the paper titled "Advanced Technology Paths to Global Climate Stability: Energy for a Greenhouse Planet." Martin I. Hoffert was lead author of the report.

"What our research clearly shows is that scientific innovation can only reverse this trend if we adopt an aggressive, global strategy for developing alternative fuel sources that can produce up to three times the amount of power we use today,” Hoffert said. “Currently , these technologies simply don’t exist – either operationally or as pilot projects.”

Non-primary power technologies that could contribute to climate stability and slowing down global warming include conservation, efficiency improvements, hydrogen production, storage and transport, superconducting global electric grids and geoengineering.

The report concludes: "Combating global warming by radical restructuring of the global energy system could be the technology challenge of the century." >from *International Researchers Propose Advanced Energy Technologies To Help Quell Global Warming*, october 31, 2002

related context
> emerging link between freak weather, climate change. august 30, 2002
> earth 'will expire by 2050': living planet report. july 12, 2002
> global warming: u.s. climate action report 2002. june 12, 2002

 
 
 
tuesday :: november 12, 2002
 
microchip
:: environmental impact

Microchips may be small, but their impact on our world has been huge. And this impact goes beyond the obvious effects of e-mail, cell phones and electronic organizers: A new study shows that the 'environmental weight' of microchips far exceeds their small size. Scientists have estimated that producing a single two-gram chip - the tiny wafer used for memory in personal computers - requires at least 3.7 pounds of fossil fuel and chemical inputs.
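For scale, the study's own figures imply that the production inputs outweigh the finished chip by nearly three orders of magnitude; the quick conversion below assumes nothing beyond the numbers quoted above.

```python
# Ratio of quoted production inputs to the chip's own mass.
chip_mass_g = 2.0
inputs_g = 3.7 * 453.6        # 3.7 pounds of fossil fuel and chemicals, in grams
print(f"inputs ~= {inputs_g / 1000:.2f} kg, about {inputs_g / chip_mass_g:.0f} times the chip's mass")
# -> roughly 1.7 kg of inputs for a 2 g chip, i.e. about 840:1
```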

"The public needs to be aware that the technology is not free; the environmental footprint of the device is much more substantial than its small physical size would suggest," says Eric Williams, Ph.D., of United Nations University in Tokyo, Japan. Williams is the lead author of the paper and director of a project investigating the environmental implications of the Information Technology revolution.

The results have crucial implications for the debate on dematerialization - the concept that technological progress should lead to radical reductions in the amount of materials and energy required to produce goods. The microchip is often seen as the prime example of dematerialization because of its high value and small size, but the new findings suggest this might not be the case. >from *The three-and-a-half pound microchip: Environmental implications of the IT revolution*, november 5, 2002

related context
> e-waste: cyber-age nightmare. march 5, 2002

> 1.7 kg weight
monday :: november 11, 2002
 
 
> the topkapi scroll:
geometry and ornament in islamic architecture
genes, neurons, internet
:: organizing principles of networks

How do 30,000 genes in our DNA work together to form a large part of who we are? How do one hundred billion neurons operate in our brain? The huge number of factors involved makes such complex networks hard to crack. Now, a study uncovers a strategy for finding the organizing principles of virtually any network – from neural networks to ecological food webs or the Internet. A team headed by Dr. Uri Alon of the Weizmann Institute has found several such organizational patterns – which they call 'network motifs' – underlying genetic, neural, technological, and food networks.

Alon surmised that patterns serving an important function in nature might recur more often than in randomized networks. With this in mind, he devised an algorithm that enabled him to analyze the plentiful scientific findings examining key networks in some well-researched organisms. Alon noticed that some patterns in the networks were inexplicably more repetitive than they would be in randomized networks. This handful of patterns was singled out as a potential set of network motifs.
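A toy version of that procedure, using a made-up nine-edge network rather than any of the real datasets: count how often one candidate pattern (the three-node 'feed-forward loop') occurs, then compare with the count in randomized networks of the same size.

```python
# Toy motif search: count feed-forward loops (a->b, b->c, a->c) in a small
# made-up directed network and compare with same-size random networks.
import random
from itertools import permutations

def count_ffl(edges):
    e = set(edges)
    nodes = {n for pair in e for n in pair}
    return sum(1 for a, b, c in permutations(nodes, 3)
               if (a, b) in e and (b, c) in e and (a, c) in e)

def random_network(nodes, n_edges):
    candidates = [(u, v) for u in nodes for v in nodes if u != v]
    return random.sample(candidates, n_edges)

network = [(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (3, 5), (5, 6), (6, 7), (5, 7)]
nodes = sorted({n for pair in network for n in pair})

observed = count_ffl(network)
randomized = [count_ffl(random_network(nodes, len(network))) for _ in range(500)]
expected = sum(randomized) / len(randomized)
print(f"feed-forward loops: {observed} observed vs ~{expected:.1f} in randomized networks")
# A pattern that turns up far more often than in the randomized versions is
# flagged as a candidate network motif.
```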

Surprisingly, the team found two identical motifs in genetic and neural systems. 'Apparently both information-processing systems employ similar strategies,' says Alon. 'The motifs shared by neural and genetic networks may serve to filter noise or allow for complex activation of neurons or genes.' Exposing the 'wiring' of such networks can thus help scientists classify systems generically (just as lions and mice both belong to the same 'class,' neural and genetic systems could be classified in the same generic category if they have many motifs in common). >from *Genes, Neurons, and The Internet Found to Have Organizing Principles – Some Identical*

related context
> think networks: the new science of networks. june 6, 2002

 
 
 
thursday :: november 7, 2002
 
rosetta project
:: linux of linguistics

Fifty to ninety percent of the world's languages are predicted to disappear in the next century, many with little or no significant documentation. As part of the effort to secure this critical legacy of linguistic diversity, The Long Now Foundation is creating a broad online survey and near permanent physical archive of 1,000 of the approximately 7,000 languages on the planet.

The project is creating this broad language archive through an open contribution, peer review model similar to the strategy that created the original Oxford English Dictionary. Their goal is an open source "Linux of Linguistics" - an effort of collaborative online scholarship drawing on the expertise and contributions of thousands of academic specialists and native speakers around the world.

In the end, they hope the process of creating a new global Rosetta, as well as the imaginative power of having a 1,000-language archive on a single, aesthetically suggestive object, will help draw attention to the tragedy of language extinction as well as speed the work to preserve what we have left of this critical manifestation of the human intellect. >from *Rosetta Project site*

related context
> the worldwide lexicon: link dictionaries and semantic networks. june 28, 2002

> visual language disks
wednesday :: november 6, 2002
 
 
> flow patterns
flow
:: the design challenge of pervasive computing

What happens to society when there are hundreds of microchips for every man, woman and child on the planet – most of them (the chips) talking to each other? What are the implications of a world filled with sensors and actuators? What does 'the world as spreadsheet' look like? What will it mean to be 'always on' in a real-time economy? Some of the world's most insightful designers, thinkers and entrepreneurs will address these questions at Doors of Perception 7 in Amsterdam, november 14-16, 2002. The theme of the celebrated international gathering is Flow: the design challenge of pervasive computing.

Doors of Perception (Doors) is an international conference which helps set the design agenda for information and communication technologies. Doors brings together innovators, entrepreneurs, educators, and designers who need to imagine alternative futures - sustainable ones - and take design steps to realize them. Its products are a better understanding of the design process; scenarios for services that meet emerging needs in new ways; and new connections and capabilities among innovative people and organisations. Seven conferences have been organised since 1993. >from *Doors of Perception 7*, october 28, 2002

related context
> UbiComp 2002, fourth international conference on ubiquitous computing. october 1, 2002
> international conference on pervasive computing. august 26-28, 2002
> (re)distributions :: a culture of ubiquity. july 15, 2002

 
 
 
tuesday :: november 5, 2002
 
the free network project
:: freedom of communication

After over a year of development, several rewrites of various parts of the code, and dramatic enhancements to every aspect of operation, the first new stable release of Freenet in over 14 months is finally ready.

Freenet is free software designed to ensure true freedom of communication over the Internet. It allows anybody to publish and read information with complete anonymity. Nobody controls Freenet, not even its creators, meaning that the system is not vulnerable to manipulation or shutdown.

Freenet is also efficient in how it deals with information, adaptively replicating content in response to demand. We have pioneered, and continue to pioneer, innovative ideas such as the application of emergent behavior to computer communication, and of public-key cryptography to creating secure namespaces.
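A minimal sketch of those two cryptographic ideas, self-certifying content keys and publicly verifiable namespaces, using generic hashing plus Ed25519 signatures from the third-party cryptography package; this illustrates the general concept only, not Freenet's actual key formats or protocol.

```python
# Sketch of (1) content-hash keys and (2) signed namespaces. Conceptual only;
# NOT Freenet's actual key formats. Signing uses the third-party 'cryptography' package.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def content_key(data: bytes) -> str:
    # The document's address is the hash of its bytes, so any node that serves
    # it can be checked: re-hash what you received and compare with the key.
    return hashlib.sha1(data).hexdigest()

document = b"an anonymously published document"
key = content_key(document)
assert content_key(document) == key              # tampering would change the key

# A namespace owner signs whatever they publish under their namespace; readers
# verify with the owner's public key, so nobody else can insert content there.
owner = Ed25519PrivateKey.generate()
signature = owner.sign(document)
owner.public_key().verify(signature, document)   # raises InvalidSignature if forged
print("content key:", key)
```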

The system provides a flexible and powerful infrastructure capable of supporting a wide range of applications, including: uncensorable dissemination of controversial information (Freenet protects freedom of speech by enabling anonymous and uncensorable publication of material ranging from grassroots alternative journalism to banned exposés like Peter "Spycatcher" Wright's and David Shayler's revelations about MI5); efficient distribution of high-bandwidth content (Freenet's adaptive caching and mirroring is being used to distribute Debian Linux software updates and to combat the Slashdot effect); and universal personal publishing (Freenet enables anyone to have a website, without space restrictions or compulsory advertising, even if you don't own a computer). >from *Freenet 0.5 released*, october 28, 2002

related context
> freenet testing phase. february 18, 2000

> building a network
monday :: november 4, 2002
 
 
> virtual touch?
first transatlantic touch
:: virtual reality touch

In a milestone that conjures up the refrain to a Paul McCartney song, researchers at MIT and University College London have linked 'hands across the water' in the first transatlantic touch, literally 'feeling' each other's manipulations of a small box on a computer screen. The MIT researchers supplied the haptics expertise for the work; the UCL team covered software development and network issues.

Potential applications abound. "In addition to sound and vision, virtual reality programs could include touch as well," said Mandayam A. Srinivasan, director of MIT's Touch Lab and leader of the MIT team.

The demonstration of long-distance touch involves a computer and a small robotic arm that takes the place of a mouse. A user can manipulate the arm by clasping its end, which resembles a thick stylus. The overall system creates the sensation of touch by exerting a precisely controlled force on the user's fingers. On the computer screen, each user sees a three-dimensional room. Within that room are a black box and two tiny square pointers that show the users where they are in the room. They then use the robotic arms to collaboratively lift the box. That's where the touch comes in. As a user at MIT moves the arm—and therefore the pointer—to touch the box, he can 'feel' the box, which has the texture of hard rubber. The user in London does the same thing. Together they attempt to pick up the box—one applying force from the left, the other from the right—and hold it as long as possible. All the while, each user can feel the other's manipulations of the box. >from *MIT and London team report first transatlantic touch*, october 28, 2002
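A common way such force feedback is computed is a spring-like penalty model: the deeper the stylus tip presses into the virtual box, the harder the device pushes back. The sketch below uses assumed numbers and is not necessarily how the MIT/UCL system renders contact.

```python
# Penalty ("virtual spring") haptic rendering for a flat box surface.
import numpy as np

STIFFNESS = 800.0   # N/m, assumed; a higher value feels more like hard rubber

def contact_force(tip_position, box_top_height):
    """Force (N) applied to the stylus when its tip is below the box's top surface."""
    penetration = box_top_height - tip_position[2]   # depth of the tip inside the box
    if penetration <= 0:
        return np.zeros(3)                           # not touching: no force
    return np.array([0.0, 0.0, STIFFNESS * penetration])  # push straight back out

# A tip pressed 2 mm into the surface gets about 1.6 N of upward resistance.
print(contact_force(np.array([0.10, 0.00, 0.098]), box_top_height=0.10))
```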

related context
> reach out and touch molecules (about a haptic device). july 16, 2002
> researchers discover molecule that detects touch. october 26, 2000
> mit touch lab research explores how the hand works. march 18, 1999

> context weblog archive


november 02
october 02
july 02
june 02
may 02
april 02
march 02
february 02
january 02
countdown 02
december 01
november 01
october 01
september 01
august 01

more news 00-01
>>> archive

send your comments to
> context@straddle3.net

 

      "active, informed citizen participation is the key to shaping the network society. a new "public sphere" is required." seattle statement
 
http://straddle3.net/context/02/blog_0211.en.html