Introduction

Tracing New Topologies on the Internet

«The map is open and connectable in all of its dimensions; it is detachable, reversible, susceptible to constant modification. It can be torn, reversed, adapted to any kind of mounting, reworked by an individual, group, or social formation. (...) The map has to do with performance, whereas the tracing always involves an alleged "competence".»

Gilles Deleuze and Félix Guattari, "Rhizome" (A Thousand Plateaus)

The context of information on the Internet

The mechanisms of web searching

Access to content and resources on the Internet, such as web pages or images, usually requires the use of a search tool. Commercial, centralized search engines such as Google or Yahoo compute and rank answers to user requests on the basis of information taken from the structure of the Web graph and from estimated query-document similarities. Web search queries are unstructured and often ambiguous; the results returned to the user are not targeted enough and rarely take the user's profile, tastes and interests into account in the search experience. Even when more targeted searches are proposed, the user's expectations and search history are not taken into consideration in the process.

User position

New digital technologies have transformed members of an audience consuming content produced by others into users who interact, create and share content with their peers. This is a fundamental element of a social and cultural transformation. Two kinds of architectures in particular have contributed to valuing and strengthening this culture of distributive practices and knowledge sharing:

- Web 2.0 applications: social and collaborative platforms such as wikis (Wikipedia, Wiktionary, Wikimediation, Culture Free, ...), blogs, and content-sharing platforms (YouTube, Flickr, Twitter, ...) have provided users with better opportunities and tools to interact with their peers through sharing. They have radically modified the landscape of the Internet.

- Peer-to-peer (P2P) networks differ from centralized (client-server) networks by the modularity of their architecture: they deliver services in which peers share a portion of their own resources, providing them directly to other participants without intermediary hosts or servers.
Captive knowledge vs distributive knowledge

Search history files, popularity rankings, data mining, non-indexed pages and the like provide a better understanding of the nature of information on the Internet. These data are part of a collective memory of the Web and are elements of a historiography. Web trading companies use their patented software, algorithms and databases to privatize and enclose these data, which become objects of economic and political greed: data filtering, targeted advertising, commercial profiling, public monitoring of the Internet, political censorship... Driven by aggressive methods based on competition and outbidding, these companies hide the data they artificially own in order to create a monopoly, monitoring users and controlling the network to strengthen their position. For users, the data they create and send on the Internet remain almost totally fragmented and useless: either the data do not belong to them, or they simply have no access to them.

We propose to relay information stored in search history files, bookmarks, web queries and the like to create a network of peers linked to each other and sharing their data, clustered in communities of shared interests, participating freely in the construction of a collective and distributed knowledge. We believe that these communities will open new topologies, unknown to the databases of the main proprietary and centralized search engines, in which results will be more complex, more diversified and more dynamic.

The system

The system is a peer-to-peer, distributed and structured network based on two network layers sharing different levels of information. This kind of architecture is already developed in major networked projects at EPFL. Users will need to install a «Firefox» plug-in (Firefox add-on) whose task will be to send data into the network.

- The interest-proximity layer is composed of the user members of the network. They are the regular peers; they are unpredictable and unreliable because they can leave and join the network at any time.

- The indexing layer is composed of the backbone peers: reliable and trusted machines, such as institutional servers, that are connected all the time. They are responsible for the global level of information (user profiles, visited sites, search histories and links) and for the security of the network.

How it works

One must be a member of the network (become a peer) to use its request system. The purpose is to share information and links with others, in a system of common shared interests. Peers «vote» for websites, meaning that they send data (URLs) into the network. They can search for information as in any information retrieval system. Requests are filtered and sorted by popularity and co-occurrence frequencies (see the co-occurrence recommender system in the FAQ). Our main purpose is to provide peers with interesting feedback to their requests.
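The two-layer vote flow described above can be sketched as follows. All names here (RegularPeer, BackbonePeer, make_vote) are hypothetical, not taken from any existing EPFL codebase: a regular peer packages a browser-history entry as a vote carrying only a random session pseudonym and the URL, and a backbone peer aggregates the votes.

```python
import hashlib
import json
import os
from collections import defaultdict

class RegularPeer:
    """Interest-proximity layer: may join or leave at any time."""

    def __init__(self):
        # A fresh random pseudonym per session, so votes can be
        # deduplicated without revealing identity or IP address.
        self.pseudonym = hashlib.sha256(os.urandom(16)).hexdigest()[:16]

    def make_vote(self, url):
        """Package a browser-history entry as a vote message."""
        return json.dumps({"from": self.pseudonym, "url": url})

class BackbonePeer:
    """Indexing layer: always-on machine that aggregates votes."""

    def __init__(self):
        self.index = defaultdict(set)  # url -> set of pseudonyms

    def receive(self, vote_msg):
        vote = json.loads(vote_msg)
        self.index[vote["url"]].add(vote["from"])

    def popularity(self, url):
        """Number of distinct peers that voted for `url`."""
        return len(self.index[url])

backbone = BackbonePeer()
alice, bob = RegularPeer(), RegularPeer()
backbone.receive(alice.make_vote("https://example.org"))
backbone.receive(bob.make_vote("https://example.org"))
backbone.receive(alice.make_vote("https://example.org"))  # duplicate vote
print(backbone.popularity("https://example.org"))  # → 2
```

Because the pseudonym is derived from random bytes and rotates each session, the backbone can count distinct voters without ever storing a stable identity.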
When peers perform their searches and activate hyperlinks in the network, the returned results will be sorted and filtered, and new topologies based on common interests will be displayed.
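A minimal sketch of the popularity and co-occurrence ranking mentioned above, assuming each peer's votes are available as a set of URLs (the sample data and the `recommend` helper are illustrative, not part of the actual system):

```python
from collections import Counter
from itertools import combinations

# Hypothetical vote data: each peer's set of "voted" URLs.
votes = {
    "peer1": {"a.org", "b.org", "c.org"},
    "peer2": {"a.org", "b.org"},
    "peer3": {"b.org", "c.org"},
}

# Popularity: how many peers voted for each URL.
popularity = Counter(url for urls in votes.values() for url in urls)

# Co-occurrence: how often two URLs appear in the same peer's votes.
cooccur = Counter()
for urls in votes.values():
    for pair in combinations(sorted(urls), 2):
        cooccur[pair] += 1

def recommend(url, k=3):
    """Return URLs most often co-voted with `url`, ranked by frequency."""
    scores = Counter()
    for (u, v), n in cooccur.items():
        if u == url:
            scores[v] += n
        elif v == url:
            scores[u] += n
    return [u for u, _ in scores.most_common(k)]

print(recommend("a.org"))  # → ['b.org', 'c.org']
```

Sites that are frequently voted for by the same peers thus surface together, which is the seed of the interest-based topologies described here.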

As member peers can have multiple and diversified interests, the networks of websites thus created frame dynamic and diversified web topologies that the flat result list of a search engine based on graph requests and keywords will never be able to show. A user profile is created for each peer. It is computed from the data sent by the user: search history files, web queries and so on. This profile provides the backbone peers with useful information on the peer's interests.

Privacy policy and anonymization

Peer profiles are private, and these data are stored on the peer's own computer. Only the information related to the login process is stored in a part of the system, different for each peer. When peers want to join the network, they create a login through a standard process of validation by email (one email = one human) and a form with a captcha script (a series of letters and numbers that must be reproduced by a human). A network-encrypted key, which changes periodically, protects these data in order to prevent their appropriation by anyone outside the network.

Information - «votes»

When a peer activates a hyperlink during the search process, the link is stored in the browser's history log file. That information will be sent and filtered in the system. This process is called the vote process. Only this information is of interest and will be sorted out - not the person it comes from, nor their IP address. The information will be fragmented and encrypted so that peer privacy is guaranteed. No one is able to find out or reconstruct the physical identity of the peer who sends the data, nor the IP address from which the data is sent.

Visualization of information

The visualization of information is a meta-language that contains the information as well as a discourse on this information: its representativeness (what is represented, how and why).
It is a visual experience, transforming into a display the questions of time and granularity of the given information. It provides the user with the possibility to interact with the displayed information, as well as a new kind of understanding (visual and sensory): information is displayed in a 3D or 4D environment, whereas it often lies beyond these limits and is thus invisible to the human eye. The visual display of networks and their topologies is part of this exploration. Since the nineties, net artists have experimented with the possibilities provided by computer simulation programs and graphical software. They have brought to these fields their thoughts about the aesthetics, sensitivity and unpredictability of data. In the network, the peer will be able to navigate in a visual interface, made of 3D or 4D dynamic environments and mappings that will be updated periodically - a visual representation of the reticular and dynamic structures of the network.
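As one concrete way such a periodically updated map could be computed, the toy sketch below lays out a small illustrative graph with a basic force-directed (spring) algorithm in 2D; the graph data is invented for illustration, and a real 3D/4D renderer is beyond this fragment:

```python
import math
import random

# Illustrative peer-link graph (not real network data).
edges = [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]
nodes = sorted({n for e in edges for n in e})

random.seed(0)
pos = {n: [random.random(), random.random()] for n in nodes}
k = 0.5  # ideal spring length

for step in range(200):
    disp = {n: [0.0, 0.0] for n in nodes}
    # Repulsion between every pair of nodes keeps the map spread out.
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            dx = pos[u][0] - pos[v][0]
            dy = pos[u][1] - pos[v][1]
            d = math.hypot(dx, dy) or 1e-9
            f = k * k / d
            disp[u][0] += dx / d * f; disp[u][1] += dy / d * f
            disp[v][0] -= dx / d * f; disp[v][1] -= dy / d * f
    # Attraction along edges pulls linked peers together.
    for u, v in edges:
        dx = pos[u][0] - pos[v][0]
        dy = pos[u][1] - pos[v][1]
        d = math.hypot(dx, dy) or 1e-9
        f = d * d / k
        disp[u][0] -= dx / d * f; disp[u][1] -= dy / d * f
        disp[v][0] += dx / d * f; disp[v][1] += dy / d * f
    # Small, gradually cooling step (simple simulated annealing schedule).
    t = 0.05 * (1 - step / 200)
    for n in nodes:
        pos[n][0] += disp[n][0] * t
        pos[n][1] += disp[n][1] * t
```

Re-running the layout as votes arrive would yield the periodically refreshed, dynamic mapping the text describes; moving to 3D only adds a coordinate to the same force computation.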