Peer-to-peer
A peer-to-peer (or P2P) computer network is a network that relies primarily on the computing power and bandwidth of the participants in the network rather than concentrating it in a relatively low number of servers. P2P networks are typically used for connecting nodes via largely ad hoc connections. Such networks are useful for many purposes. Sharing content files (see file sharing) containing audio, video, data or anything in digital format is very common, and real-time data, such as telephony traffic, is also carried using P2P technology.
A pure peer-to-peer network does not have the notion of clients or servers, but only equal peer nodes that simultaneously function as both "clients" and "servers" to the other nodes on the network. This model of network arrangement differs from the client-server model, where communication is usually to and from a central server. A typical example of a non-peer-to-peer file transfer is an FTP server, where the client and server programs are quite distinct: the clients initiate downloads and uploads, and the servers react to and satisfy these requests.
The earliest peer-to-peer network in widespread use was the Usenet news server system, in which peers communicated with one another in order to propagate Usenet news articles over the entire Usenet network. Particularly in the earlier days of Usenet, UUCP was used to extend even beyond the Internet. However, the news server system also acted in a client-server form when individual users accessed a local news server in order to read and post articles.
Some networks and channels, such as Napster, OpenNAP, or IRC @find, use a client-server structure for some tasks (e.g., searching) and a peer-to-peer structure for others. Networks such as Gnutella or Freenet use a peer-to-peer structure for all purposes, and are sometimes referred to as true peer-to-peer networks, although Gnutella is greatly facilitated by directory servers that inform peers of the network addresses of other peers.
Peer-to-peer architecture embodies one of the key technical concepts of the internet, described in the first internet Request for Comments, "RFC 1, Host Software" dated 7 April 1969. More recently, the concept has achieved recognition in the general public in the context of the absence of central indexing servers in architectures used for exchanging multimedia files.
The concept of peer to peer is increasingly evolving to an expanded usage as the relational dynamic active in distributed networks, i.e. not just computer to computer, but human to human. Yochai Benkler has developed the notion of commons-based peer production to denote collaborative projects such as free software. Associated with peer production are the concepts of peer governance (referring to the manner in which peer production projects are managed) and peer property (referring to new types of licenses which recognize individual authorship but not exclusive property rights, such as the GNU General Public License and the Creative Commons licenses).
Operation of peer-to-peer networks
One possible classification of peer-to-peer networks is according to their degree of centralisation.

Pure peer-to-peer:
Peers act as equals, merging the roles of client and server
There is no central server managing the network
There is no central router

Hybrid peer-to-peer:
Has a central server that keeps information on peers and responds to requests for that information
Peers are responsible for hosting available resources (as the central server does not have them), for letting the central server know what resources they want to share, and for making their shareable resources available to peers that request them
Route terminals are used as addresses, which are referenced by a set of indices to obtain an absolute address

Some examples of "pure" peer-to-peer application layer networks designed for file sharing are Gnutella and Freenet.
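The hybrid model can be sketched with a toy central index in the Napster style: the server only records which peer shares which resource, while the resources themselves stay on the peers. All class, method and peer names below are illustrative, not taken from any real protocol.

```python
class CentralIndex:
    """A minimal, hypothetical central index for a hybrid P2P network."""

    def __init__(self):
        self.index = {}  # resource name -> set of peer addresses

    def register(self, peer, resources):
        """A peer tells the server which resources it wants to share."""
        for name in resources:
            self.index.setdefault(name, set()).add(peer)

    def lookup(self, resource):
        """Searches go to the server; it returns peers, not the resource.
        The actual transfer would then happen directly between peers."""
        return self.index.get(resource, set())

server = CentralIndex()
server.register("10.0.0.5:6699", ["song.mp3", "talk.ogg"])
server.register("10.0.0.9:6699", ["song.mp3"])
print(server.lookup("song.mp3"))
```

Note that the server never holds the files; if it goes down, searching breaks even though every shared resource is still present on the peers, which is exactly the single point of failure that pure P2P designs avoid.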
Advantages of peer-to-peer networks
An important goal in peer-to-peer networks is that all clients provide resources, including bandwidth, storage space, and computing power. Thus, as nodes arrive and demand on the system increases, the total capacity of the system also increases. This is not true of a client-server architecture with a fixed set of servers, in which adding more clients could mean slower data transfer for all users.
The distributed nature of peer-to-peer networks also increases robustness in case of failures by replicating data over multiple peers, and -- in pure P2P systems -- by enabling peers to find the data without relying on a centralized index server. In the latter case, there is no single point of failure in the system.
When the term peer-to-peer was used to describe the Napster network, it implied that the peer protocol was important, but, in reality, the great achievement of Napster was the empowerment of the peers (i.e., the fringes of the network) in association with a central index, which made it fast and efficient to locate available content. The peer protocol was just a common way to achieve this.
Unstructured and structured P2P networks
The P2P overlay network consists of all the participating peers as network nodes. There are links between any two nodes that know each other: i.e. if a participating peer knows the location of another peer in the P2P network, then there is a directed edge from the former node to the latter in the overlay network. Based on how the nodes in the overlay network are linked to each other, we can classify the P2P networks as unstructured or structured.
An unstructured P2P network is formed when the overlay links are established arbitrarily. Such networks can be easily constructed, as a new peer that wants to join the network can copy the existing links of another node and then form its own links over time. In an unstructured P2P network, if a peer wants to find a desired piece of data, the query has to be flooded through the network in order to reach as many peers as possible that share the data. The main disadvantage of such networks is that queries may not always be resolved. Popular content is likely to be available at several peers, and any peer searching for it is likely to find it; but if a peer is looking for rare or less popular data shared by only a few other peers, the search is unlikely to succeed. Since there is no correlation between a peer and the content it manages, there is no guarantee that flooding will find a peer that has the desired data. Flooding also causes a high amount of signalling traffic, and hence such networks typically have very poor search efficiency. Most of the popular P2P networks, such as Napster, Gnutella and KaZaA, are unstructured.
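The flooding search described above can be sketched as a hop-limited breadth-first traversal of the overlay. The sketch below is illustrative only; the overlay, peer names and the `ttl` hop limit are assumptions, not part of any specific protocol.

```python
def flood_search(overlay, start, has_data, ttl):
    """Flood a query through an unstructured overlay, hop by hop.

    overlay: dict mapping each peer to the set of peers it knows about.
    start: the peer issuing the query.
    has_data: set of peers that actually hold the desired data.
    ttl: hop limit; flooding stops when it reaches zero.
    Returns the set of peers found to hold the data.
    """
    visited = {start}
    frontier = [start]
    hits = set()
    while frontier and ttl > 0:
        next_frontier = []
        for peer in frontier:
            for neighbour in overlay[peer]:
                if neighbour not in visited:
                    visited.add(neighbour)
                    next_frontier.append(neighbour)
                    if neighbour in has_data:
                        hits.add(neighbour)
        frontier = next_frontier
        ttl -= 1
    return hits

# A small arbitrary overlay, with links established at random as in an
# unstructured network. Only peer E holds the data being searched for.
overlay = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A", "D"},
    "D": {"B", "C", "E"},
    "E": {"D"},
}
print(flood_search(overlay, "A", has_data={"E"}, ttl=3))  # found in 3 hops
print(flood_search(overlay, "A", has_data={"E"}, ttl=2))  # too far: not found
```

The second call illustrates the disadvantage noted above: data held by a distant or poorly connected peer can be missed entirely once the hop limit is exhausted, even though it exists in the network.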
Structured P2P networks overcome the limitations of unstructured networks by maintaining a Distributed Hash Table (DHT) and by allowing each peer to be responsible for a specific part of the content in the network. These networks use hash functions and assign values to every content and every peer in the network and then follow a global protocol in determining which peer is responsible for which content. This way, whenever a peer wants to search for some data, it uses the global protocol to determine the peer(s) responsible for the data and then directs the search towards the responsible peer(s). Some well known structured P2P networks are Chord, Pastry, Tapestry, CAN, and Tulip.
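The global assignment rule used by structured networks can be sketched with a simplified consistent-hashing scheme in the spirit of Chord: peers and content are hashed onto the same identifier ring, and each content item belongs to the first peer at or after its identifier. This is a hypothetical sketch, not any one system's actual protocol; the ring size and names are assumptions.

```python
import hashlib

RING_SIZE = 2 ** 16  # size of the shared identifier space (illustrative)

def ring_id(name):
    """Hash a peer or content name onto the identifier ring."""
    digest = hashlib.sha1(name.encode()).digest()
    return int.from_bytes(digest[:2], "big") % RING_SIZE

def responsible_peer(peers, content_name):
    """The peer responsible for a content item is the first peer whose
    identifier is >= the content's identifier, wrapping around the ring."""
    key = ring_id(content_name)
    ring = sorted((ring_id(p), p) for p in peers)
    for peer_id, peer in ring:
        if peer_id >= key:
            return peer
    return ring[0][1]  # wrapped past the largest peer id

peers = ["peer-1", "peer-2", "peer-3", "peer-4"]
print(responsible_peer(peers, "some-file.ogg"))
```

Because every peer applies the same hash function and the same rule, any peer can compute which peer is responsible for a given item and direct its search there, instead of flooding the network.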
Under US law, the "Betamax decision" (Sony Corp. of America v. Universal City Studios, Inc.) holds that copying "technologies" are not inherently illegal if substantial non-infringing use can be made of them. This decision, predating the widespread use of the Internet, applies to most data networks, including peer-to-peer networks, since distribution of correctly licensed files can be performed. These non-infringing uses include distributing open source software, public domain files and out-of-copyright works. Other jurisdictions tend to view the situation in somewhat similar ways.
In practice, many, often most, of the files shared on peer-to-peer networks are copies of copyrighted popular music and movies. Sharing these copies among strangers is illegal in most jurisdictions. This has led many observers, including most media companies and some peer-to-peer critics, to conclude that the networks themselves pose grave threats to the established distribution model. Research attempting to measure the actual monetary loss has been somewhat equivocal: whilst on paper the existence of these networks results in large losses, actual industry income does not seem to have changed much since these networks started up. Whether the threat is real or not, both the RIAA and the MPAA now spend large amounts of money lobbying lawmakers for new laws, and some copyright owners pay companies to help legally challenge users engaging in illegal sharing of their material.
In spite of the Betamax decision, peer-to-peer networks themselves have been targeted by the representatives of those artists and organizations who license their creative works, including industry trade organizations such as the RIAA and MPAA as a potential threat. The Napster service was shut down by an RIAA lawsuit.
In A&M Records v. Napster, 239 F.3d 1004 (9th Cir. 2001), the court found that Napster was both vicariously and contributorily liable for the copyright infringement its users were engaged in. Vicarious liability in these types of cases extends to a provider that financially benefits from the infringement committed by its users while having the ability to police that infringement but failing to do so. The court found ample evidence that Napster's future revenue was directly dependent upon "increases in userbase." It also found that Napster could have done more than it claimed with regard to restricting users from sharing copyrighted material. Later in the opinion, Napster was also found contributorily liable: it knew of the infringing uses its software could and did have, in a medium where the software provided essential access. The RIAA could have sued individual users at the time for violating federal law, but thought it more prudent to shut down the means by which those users shared music.
Napster's use of a central server distinguishes it, on its facts, from the next generation peer-to-peer technology, in which the communication of files is truly "peer to peer." Therefore, additional litigation was needed to determine the legality of their uses.
In MGM v. Grokster, the U.S. Supreme Court reversed a decision of the Ninth Circuit Court of Appeals which had granted a summary judgment of dismissal, and held that there were factual issues concerning whether the defendant p2p software providers had, or had not, encouraged their users to infringe copyrights. If they had done so, they could be held liable for secondary copyright infringement.
A little over a year later, the RIAA initiated the first major post-Grokster case, Arista v. Limewire, in Manhattan federal court. Lime Wire has counterclaimed in that suit, charging the major record companies with antitrust violations and other misconduct ("Lime Wire Sues RIAA for Antitrust Violations").
Shortly thereafter, the lower court judge in Grokster found one of the defendants, Streamcast, the maker of Morpheus, to be liable under the standards enunciated by the Supreme Court ("Streamcast Held Liable for Copyright Infringement in MGM v. Grokster, Round 2", Recording Industry vs. The People).
As media companies' actions against copyright infringement expand, the networks have quickly adapted, becoming both technologically and legally more difficult to dismantle. This has shifted attention to the users who are actually breaking the law, because whilst the underlying technology may be legal, its abuse by individuals redistributing content in a copyright-infringing way is clearly not.
Anonymous peer-to-peer networks allow for distribution of material - legal or not - with little or no legal accountability across a wide variety of jurisdictions. Many profess that this will lead to greater or easier trading of illegal material and even (as some suggest) facilitate terrorism, and call for its regulation on those grounds. Others counter that the potential for illegal uses should not prevent the technology from being used for legal purposes, that the presumption of innocence must apply, and that non peer-to-peer technologies like e-mail, which also possess anonymizing services, have similar capabilities.
In the European Union (EU), the 2001 EU Copyright Directive, which implemented the 1996 WIPO Copyright Treaty ("World Intellectual Property Organization Copyright Treaty"), prohibits the unauthorized sharing of copyrighted works, including over peer-to-peer networks. However, not all European member states have implemented the directive in national legislation. Notably, on December 22, 2005, after discussing the EU directive, the French parliament passed two amendments legalizing the exchange of copies on the internet for private use. In a later proceeding, the French government withdrew the article in question and made illegal any p2p client obviously aimed at sharing copyrighted material. The term "obviously" was not defined. The bill (called DADVSI) has still to be discussed by the French senate and, if its decision differs too much from the Parliament's, it will be debated in a second reading back at the Parliament (Assemblée Nationale).
Interestingly, Canada stands out by authorizing, at least until the projected copyright reform, downloads on peer-to-peer networks under the "private copying" exception.
Attacks on peer-to-peer networks
Many peer-to-peer networks are under constant attack by people with a variety of motives.
poisoning attacks (e.g. providing files whose contents are different from the description)
polluting attacks (e.g. inserting "bad" chunks/packets into an otherwise valid file on the network)
defection attacks (users or software that make use of the network without contributing resources to it)
insertion of viruses into carried data (e.g. downloaded or carried files may be infected with viruses or other malware)
malware in the peer-to-peer network software itself (e.g. distributed software may contain spyware)
denial of service attacks (attacks that may make the network run very slowly or break completely)
filtering (network operators may attempt to prevent peer-to-peer network data from being carried)
identity attacks (e.g. tracking down the users of the network and harassing or legally attacking them)
spamming (e.g. sending unsolicited information across the network, not necessarily as a denial of service attack)
Most attacks can be defeated or controlled by careful design of the peer-to-peer network and through the use of encryption. P2P network defense is in fact closely related to the "Byzantine Generals Problem". However, almost any network will fail when a majority of the peers are trying to damage it, and many protocols may be rendered impotent by far fewer attackers.
From Wikipedia, the free encyclopedia.
File last modified on 2016-5-11
Contributor : devassal thibault
See also this article on Wikipedia : Peer-to-peer
All text is available under the terms of the GNU Free Documentation License.