Completing BitTorrent Decentralization

Njaal writes "With BitTorrent going trackerless, searching for and distributing .torrent files is a natural next step. The Socialized.Net (TSN) is a pure P2P search infrastructure which facilitates P2P searching and distribution of .torrent files. It comes complete with an Azureus (and Firefox) search plugin. TSN is written in Python and is made available under the GPL. Note that this is part of my PhD thesis, and is as such meant as a technology demonstrator."
This discussion has been archived. No new comments can be posted.

  • by LiquidCoooled ( 634315 ) on Saturday May 21, 2005 @01:19PM (#12599599) Homepage Journal
    Note that this is part of my PhD thesis, and is as such ment as a technology demonstrator

    really means:

    Pleassseeeeeeeeeeeeeeeeeeeeee don't sue my ass.
    • by Anonymous Coward
      I thought it ment, er, meant he didn't use a spell checker.
    • by Anonymous Coward on Saturday May 21, 2005 @01:59PM (#12599830)
      It is worth noting that every P2P software distributor sued by the RIAA has used built-in searching. Built-in searching is really the big thing that separates the internet from what people commonly call peer-to-peer networks (even though the internet is itself a P2P network).

      With the conventional internet, you were stuck using a centralized search engine which is easy to censor. To censor a network with built-in searching, you have to censor the whole network.
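The point about censoring a whole network can be made concrete. As a toy illustration, here is a Gnutella-style flooding query in Python; this is not TSN's actual algorithm (the linked papers describe that), just the general idea of a search with no central index to shut down:

```python
from collections import deque

def flood_search(peers, start, keyword, ttl=3):
    """Breadth-first flood of a query through a peer graph.

    `peers` maps a node id to (neighbour ids, set of shared file names).
    The query is relayed hop by hop until the TTL runs out, so no single
    node holds the index: censoring the search means censoring every
    node that relays it.
    """
    hits = set()
    seen = {start}
    queue = deque([(start, ttl)])
    while queue:
        node, hops = queue.popleft()
        neighbours, files = peers[node]
        hits.update(f for f in files if keyword in f)
        if hops == 0:
            continue  # TTL exhausted; don't forward further
        for n in neighbours:
            if n not in seen:
                seen.add(n)
                queue.append((n, hops - 1))
    return hits
```

With a TTL of 1 only your direct horizon is searched; raising the TTL widens the search at the cost of more relayed traffic, which is the classic flooding trade-off.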
  • Though the author is just a student, this is positive. I also applaud him for considering Firefox first. What will it take for him to consider "the other" browser? Next, let me look for a torrent of that other newly released movie. I guess Slashdotters know it.
  • by Anonymous Coward
    "The file isn't a valid Azureus plugin."

    • right... but read the same page where you clicked.

      in bold:
      Sorry about that, unzip the file into your azureus plugin directory, the wizard does not work on this file.

      so, what I did, in Linux was:
      # cd ~/.Azureus/plugins
      # cp /[path to saved]/TorrentSearch.tar.gz .
      # tar xzvf TorrentSearch.tar.gz
      and then restarted Azureus. It is now in the "plugins" menu.

  • by Sv-Manowar ( 772313 ) on Saturday May 21, 2005 @01:29PM (#12599667) Homepage Journal

    If this technology works as advertised (and obviously that remains to be seen), it will only really take off through the kind of mass adoption created by inclusion in the standard BitTorrent clients. This is why the Azureus distributed database has worked out so well: the existing userbase was rolled over seamlessly when it was included by default.

    If Azureus or other clients decided to include functionality like this, it would effectively leave programs like eXeem dead in the water and provide BitTorrent users with a closed 'single-stop' solution for finding and downloading files.

  • 400% growth in nodes (Score:2, Interesting)

    by jzono1 ( 772920 )
    400% growth in known nodes: went from 4 to 18; I wonder how many there are in an hour :) Too bad one has to restart Azureus/Firefox to use the plugins, though.
    • As for Azureus, I don't see the problem with restarting. It may be a bit slow at first, but it quickly picks up.

      As for Firefox... That's where the Session Saver extension [mozdev.org] comes in handy! My new favorite extension.

      • Ahh, watch out for the session saver - there needs to be a way to disable it when needed. I had an issue with a particular web page that crashed FF on OS X, and the damn session saver would open directly to the crasher! I finally had to dig the extension out of the firefox settings in order to get my browser back.

          I had the same problem. I didn't have time to dive into it, though, and it was a fresh FF install, so I just wiped the FF config dir and started again.

          SessionSaver has to save its last 'state' somewhere. Anyone know where?

          SB
        • Ahh, watch out for the session saver - there needs to be a way to disable it when needed. I had an issue with a particular web page that crashed FF on OS X, and the damn session saver would open directly to the crasher!

          Couldn't you just disconnect the network/modem, and possibly delete the cache? Then it'd fail to load and you could open a local page, then plug back in and continue.

  • by tepples ( 727027 ) <tepples.gmail@com> on Saturday May 21, 2005 @01:31PM (#12599685) Homepage Journal

    Now that BT has decentralized tracker and decentralized search, it appears that the only remaining advantages over ed2k (e.g. eMule) are the tit-for-tat algorithm and smaller complete block size before one can begin uploading (256 KB for BT vs. 9500 KB for ed2k).

  • by Stalyn ( 662 ) on Saturday May 21, 2005 @01:39PM (#12599725) Homepage Journal
    one which helps me download pr0n faster.

  • by iammaxus ( 683241 ) on Saturday May 21, 2005 @01:42PM (#12599734)
    Trackerless torrents and search technologies like this seem to be changing BitTorrent into a conventional p2p system. Can anyone explain the difference? Is it just a regular p2p system with a highly efficient segmented downloading system?
    • The difference in this case being that all the trackerless stuff for BT is optional. You can still run a tracker and provide the .torrent file on your Web server.
    • Meta data search? (Score:2, Interesting)

      by bobbuck ( 675253 )
      Do any of these P2P systems allow a better description of the shared resource than the filename? It would be great if there were a description file or database for the shared resources. That way you could search for certain filetypes, versions, sources, licenses, etc., and get a real description of the file before you download. If P2P grows beyond MP3s, this will quickly become a necessity.

      P2P could even replace things like classified ads or directories. Share a picture of your car with tags se

      • Well, I haven't tried out this program, but the papers on his website describe the searching as being done on XML metadata, which can include file and format information as well as things like director or genre.
    • by Daedalon ( 848458 ) on Saturday May 21, 2005 @04:36PM (#12600729)

      There are quite a lot of differences between the three major P2P technologies. Here I try to cover the most important points of each:

      ed2k (eMule [emule-project.net])

      • + Easy linking. Links can be shared anywhere: in web pages, IRC, email. The single 100-200 character link contains everything that is needed to download the file.
      • + Supports usage with and without a server (in eMule, ed2k server and serverless Kademlia)
      • - If you run a server, you can't make it private
      • - If you run a server, you cannot control what is shared there
      • - Inefficient, seems to waste bandwidth

      Direct Connect (DC++ [sourceforge.net], Reverse Connect [sourceforge.net])

      • + You can keep your servers (hubs) private
      • + You can see what everyone in your hub is sharing
      • + eMule-like links have recently become available, though clicking a link doesn't add the file to your queue; it only lets you search for it
      • + Efficient; you can download directly from someone very fast, even over an intranet
      • - No serverless mode
      • - You don't have total control over what is shared on your server
      • - Only Reverse Connect lets you download from multiple sources simultaneously

      BitTorrent (Azureus [sourceforge.net], BitComet [bitcomet.com])

      • + The most efficient p2p yet
      • + The server (tracker) admin can have total control over what is shared, by choosing a directory to which he uploads only the allowed torrents
      • + A single .torrent file can contain instructions on how to download multiple files
      • - No serverless mode
      • - No searching
      • - To share download instructions for a file(set), you have to be able to transfer a .torrent file, a plaintext link isn't enough

      This has been the situation for a while. In ed2k nothing big has changed for a year. DC++ (incl. Reverse Connect) is evolving, but magnet (TTH) linking has been the only major change in years. When DC++ gets its support for ADC [sourceforge.net] complete, the evolution of Direct Connect is predicted to get a major boost.

      What trackerless BitTorrent [bittorrent.com] does is to make every client a small tracker. So it doesn't just enable searching and serverless usage, it also makes sharing illegal files easier (more than it does for legal). Previously, to share content, you had to find a tracker that allows posting .torrents. To share copyrighted content, you also had to find a tracker that didn't care about legal aspects. So sharing legal and illegal content is now equally easy, while it previously was (at least in theory) a little bit easier to share legal content.

      Overall, with the changes of trackerless BitTorrent, it would still be the best available p2p technology. For certain special cases Direct Connect could be better, and both DC and ed2k support easier linking than BT, but even that can change in the future: BT could implement a meta-p2p engine, so that you could share plaintext links that make your client download the right .torrent file and add it to your queue. This would make BT superior to eMule in every aspect.

      • Emule:
        - Slooooow at single downloads, need a long queue, lots of incomplete files wasting disk space
        Direct Connect:
        - If the last source on your server goes missing, you often have to jump around servers to find another source.
        - Haven't tried RC, but slow clients can block fast downloads (e.g. kick out 2k/s modem user, get 200k/s Uni user)
        BT:
        - Nearly impossible to find rare files!!!

        And I guess it's also worth comparing with 3rd-gen networks (Freenet, Ants, I2P, etc.):

        + Anonymous
        + Serverless
        + Does
      • re: ed2k:

        # - If you run a server, you can't make it private
        # - If you run a server, you cannot control what is shared there

        Both of these statements are incorrect. eMule supports secure user identification based on a public key system. All the server has to do is reject login requests from clients not on a whitelist. Similarly, when a client issues a search request the server is free to do whatever it wants both with the request and the results. If you only want people sharing known good files via your
      • "BT could implement a meta-p2p engine, so that you could share plaintext links that make your client download the right .torrent file and add it to your queue. This would make BT superior to eMule in every aspect." The latest Azureus already has that (magnet links). For example, try magnet:?xt=urn:btih:GCT5DYD6RADW6TY2ICW54UZDXB6OPCXD
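A magnet link like the one above is just a URI whose query string carries the info-hash (plus an optional display name and tracker list). A minimal sketch of unpacking one in Python's standard library; this is an illustration of the format, not Azureus's actual parser:

```python
from urllib.parse import urlparse, parse_qs

def parse_magnet(uri):
    """Extract the BitTorrent info-hash from a magnet URI.

    `xt` (exact topic) carries a urn:btih: info-hash, `dn` an optional
    display name, `tr` optional tracker URLs. The plaintext link is all
    a client needs to fetch the .torrent metadata from peers.
    """
    parsed = urlparse(uri)
    if parsed.scheme != "magnet":
        raise ValueError("not a magnet link")
    params = parse_qs(parsed.query)
    for topic in params.get("xt", []):
        if topic.startswith("urn:btih:"):
            return {
                "infohash": topic[len("urn:btih:"):],
                "name": params.get("dn", [None])[0],
                "trackers": params.get("tr", []),
            }
    raise ValueError("no BitTorrent info-hash in link")
```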
  • Azureus install (Score:2, Interesting)

    by Kahless2k ( 799262 )
    Anyone else having trouble with the Azureus install?

    Kahless2k
  • Ok, bad pun, I know...
    this gives us a redundancy admins can only dream of in other areas.
    The fact that you can have your files spread over a massive number of computers across the world is the way of future file distribution. The load changes from a constant one on your server to a one-off (well, perhaps one day) of uploading it; then, before you know it, the file propagates itself across the p2p network, allowing for speeds unattainable in the classic server-client model which is still prevalent.
    The
  • Ever wondered why BitTorrent is faster than other P2P networks like eDonkey or Overnet? It's because there is no built-in decentralized search engine. Users have to download one of the files that are available to them, and consequently more people download the same file at any given time. The result is that you get the files faster.
    • by dizzydogg ( 127440 ) on Saturday May 21, 2005 @03:52PM (#12600494)
      BitTorrent is designed for a massive swarm of people all downloading the same file at the same time. It won't die under these circumstances; it will thrive. The more people you have downloading the file, the more people are sharing the file. The reason BitTorrent is faster is that it forces you to share with others and doesn't let you get away with not sharing the file, unlike many p2p programs where people with "slow" connections or a cap on their monthly bandwidth turn off their uploads. That's why so many people download off one guy with other p2p programs: the file is never shared by most of the people who download it, and the queues of the few people who are sharing their copies are swamped.

      It's all because of BitTorrent's tit-for-tat system: if the seeders are swamped, you'll usually get your upload speed returned to you by the other peers you are downloading with. If you upload at 5k/s, you download at 5k/s, but if you can do 30k/s you usually get 30k/s. You swap the pieces you have for pieces you're missing with the other downloaders. Your client remembers the people who traded with it successfully and tries to make further trades with them, since it can confirm they are uploading and that it will get something in return. Meanwhile, the seeders feed the rarest pieces to the people they see uploading the most to others, who swap with others in turn, until everyone has a complete copy.
      • If you upload at 5k/s, you download at 5k/s, but if you can do 30k/s you usually get 30k/s.

        My lousy DSL can't sustain much more than 8kb/s upstream before all my other net apps die. There is no upstream left even for sending requests.

        So I have Azureus capped at 5kb/s upstream to keep the net usable. Guess what? On occasion I still get download speeds that max out my downstream at 80kb/s. And that's while there are still other peers in the swarm.

        You may be correct about the rest, but on this point you are wrong.

        • Re:Not so (Score:3, Informative)

          by petermgreen ( 876956 )
          seeds/finished downloaders don't/can't engage in tit for tat and so will give you as much as they can.

          so it seems likely that in your case you were simply getting data from a seed/finished downloader when your rate spiked like that.
      • With my highly throttled upstream bandwidth, my bittorrent download rate peaks when I cap the upload to 1-4k/s. Uploading more causes my download rate to drop significantly. I can hit my max upload, or max download, but I can't hit both at the same time.
      • If you upload at 5k/s, you download at 5k/s, but if you can do 30k/s you usually get 30k/s.

        I don't know what the exact relationship between upload and download speeds is in BT, but it's not 1:1 as you indicate here. I regularly have torrents running at 80k/s - 150k/s *sustained* with my upload rate capped at 10k/s or 15k/s and have even seen it get as high as 400k/s (and stay there until the torrent was finished). Indeed, even with my upload capped at 3k/s I've seen d/l speeds of 50k/s.

    • As I understand it this, like the Azureus DHT technology, is simply a secondary protocol one can use on top of BT. Thus even if you are correct and decentralized searching imposes a large performance hit, you can still go ahead and use the original BT without any of this additional crap.

      Moreover, I expect the speed difference is really caused by two factors. First, simple algorithmic efficiency; BT seems better than the earlier generation of P2P clients (people learned something). Secondly, previous dec
    • No, the reason BitTorrent is faster is that it stops leeching, but without requiring you to wait in line for ages to get files.
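The tit-for-tat behaviour debated above boils down to a periodic "choking" decision: upload slots go to the peers that recently uploaded the most to you, plus one random "optimistic unchoke" so newcomers with nothing to trade yet can bootstrap. A simplified sketch (real clients use rolling rate estimates and re-run this every few seconds; the function below is only a schematic, and the names are mine):

```python
import random

def choose_unchoked(peers, slots=4):
    """Pick which peers to upload to, tit-for-tat style.

    `peers` maps a peer id to the number of bytes that peer has
    recently sent us. The best reciprocators get the regular slots;
    one extra slot goes to a random choked peer (the optimistic
    unchoke) so it gets a chance to prove it will trade back.
    """
    by_rate = sorted(peers, key=peers.get, reverse=True)
    unchoked = by_rate[:slots]          # reward the best uploaders
    choked = by_rate[slots:]
    if choked:
        unchoked.append(random.choice(choked))  # optimistic unchoke
    return unchoked
```

This also explains the observations upthread about seeds: a finished downloader has nothing left to request, so it cannot reciprocate and simply gives away as much as it can.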
  • This is very interesting (though I would really like to see more info on how it works; I only saw one paper). However, while perfect for a PhD demo, it seems in the long run it would be better to build a fully distributed system from the ground up. I seem to remember Freenet doing something similar, but I don't think they ever implemented searching.

    In particular, by building both searching and trackerlessness into such a system from the ground up, one could benefit from a clean, elegant metaphor (both searching a
  • by logicnazi ( 169418 ) <gerdesNO@SPAMinvariant.org> on Saturday May 21, 2005 @03:12PM (#12600271) Homepage
    Well perhaps not quite. However, this is where web technology is headed.

    One benefit of P2P is pseudo-anonymous file hosting: if I wish to spread some information, I need not set up a webserver and be easily traceable (ideally, once everything goes trackerless). Another is that the consumers of information can provide the bandwidth for the resources they consume.

    The benefits for open-source-esque projects cannot be overstated. Running community sites like Wikipedia is very difficult, as they need to pay for a lot of bandwidth and server space. A well-designed P2P system would turn every user of a resource into a partial server. This means it is no more expensive to provide information a million people want than to provide information 10 people want.

    Of course some issues such as file ownership permissions need to be dealt with. However, this is exactly the sort of technology that is needed to realize the great leveling capacity of the internet and turn non-profit groups and individuals into just as important media distribution entities as major corporations.

    I fully expect this to change the world.
    • One immediate snag with something dedicated would be controlling those files on people's computers. As in, what's to stop someone from deleting something important, or tampering with it in some way?

      While there are certain to be backups (and likely a good chunk of people serving the same information, to reduce that likelihood of error), I wonder if it could be introduced like a SETI@Home type of thing -- people serve intentionally, or as a screensaver/when the computer isn't being used.

      Probably would ha

        Well, for such a system to work you would need to automatically replicate files in some manner; otherwise files might disappear when computers go down. If such a system became truly popular, one might well integrate it into the web browser or a similar program, which would essentially keep it running all the time.

        The real problems are not replication but allocation of resources. What prevents someone from flooding all the disk space with junk? Presumably you would need some sort of credit system whe
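On the tampering worry raised above: BitTorrent already addresses integrity (though not availability) by storing a SHA-1 hash of every fixed-size piece in the .torrent file, so a client can reject a corrupted or tampered piece and re-fetch it from someone else. A minimal sketch of that check (piece size here is just a typical value, not mandated by the protocol):

```python
import hashlib

PIECE_LEN = 256 * 1024  # 256 KiB, a common piece size in 2005-era torrents

def piece_hashes(data, piece_len=PIECE_LEN):
    """SHA-1 digest of each fixed-size piece, as a .torrent stores them."""
    return [hashlib.sha1(data[i:i + piece_len]).digest()
            for i in range(0, len(data), piece_len)]

def verify_piece(index, piece, expected):
    """Check a received piece against the published hash list.

    A downloader verifies every piece before keeping it, so a
    tampered or corrupted copy is detected and discarded.
    """
    return hashlib.sha1(piece).digest() == expected[index]
```

What hashing does not solve is the allocation problem the comment ends on: it stops bad data from spreading, but not someone flooding the network's storage with junk.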
  • Efforts to turn a great distributed download acceleration technology into a shady decentralized p2p search and file sharing system like Kazaa are bearing fruit.
    • Efforts to turn a great distributed download acceleration technology into a shady decentralized p2p search and file sharing system like Kazaa are bearing fruit.


      Or, depending on how you look at it, propaganda designed to portray modern social tools as "shady" are becoming more popular among those who don't understand the issues.
  • A Different Solution (Score:3, Interesting)

    by 26199 ( 577806 ) on Saturday May 21, 2005 @03:59PM (#12600539) Homepage

    Port advertising instead of service advertising.

    I haven't come across this idea elsewhere, so, please let me know if you actually do it ;)... I would if I had a server handy, it's an easy project.

    One centralised server can be used as a central tracker for P2P, or anything else, with no legal implications. The idea is simple. Your server doesn't advertise services, it advertises open ports.

    Let's say my awesome new p2p program uses port 23145. On starting up, it sends a packet to central server saying "my port 23145 is open". When someone else asks the server for someone with port 23145 open, there's a chance they get my IP address in return. When I have enough connections, I send a packet asking that I be delisted.

    Obviously there need to be controls against spoofing, etc, but the application is so simple that these are pretty easy to do.

    Because the central server stores nothing more than IP/port pairs (plus timing and security stuff), there is complete deniability. You have no way to tell which program people are running, either on the server or the client. And you never see any application data whatsoever. It's just as useful for legitimate apps as for legally difficult stuff.

    Problem solved. Any program can find other instances of the same program without nasty legal questions being raised. Admittedly they'll have to check the identity of the other program on connection, but they should be doing that anyway...
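The server side of the proposal above needs to keep nothing but (IP, port) pairs and timestamps. An in-memory sketch of that registry in Python; the class name, TTL policy, and method names are my own assumptions, not part of the poster's design:

```python
import time

class PortRegistry:
    """Content-blind rendezvous: the server stores only which IPs
    claim to have a given port open, never any application data, so
    it cannot tell which program its clients are running."""

    def __init__(self, ttl=300):
        self.ttl = ttl        # seconds before a listing goes stale
        self.listings = {}    # port -> {ip: time advertised}

    def advertise(self, ip, port):
        """Client says: 'my port N is open'."""
        self.listings.setdefault(port, {})[ip] = time.time()

    def delist(self, ip, port):
        """Client has enough connections and asks to be removed."""
        self.listings.get(port, {}).pop(ip, None)

    def lookup(self, port, now=None):
        """Return IPs that recently advertised this open port."""
        now = time.time() if now is None else now
        peers = self.listings.get(port, {})
        return [ip for ip, t in peers.items() if now - t < self.ttl]
```

The TTL stands in for the "timing stuff" the comment mentions: stale entries expire on their own, so a crashed client doesn't pollute the listings. Spoofing protection (e.g. connecting back to the advertised port before listing it) would sit on top of this.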

  • Anyone have a torrent of the installer? It's dog-slow/slashdotted already.
  • I've been thinking about this tracker-less idea, and it scares me.

    When BitTorrent first came out, the AAs didn't know what to attack, because as far as Bram Cohen was concerned he was in the free and clear. BitTorrent did not handle any searching, was not the central tracker for the clients, and didn't really do anything but provide a decentralized file transfer client to spread bandwidth costs and increase speed. So the way the AAs handled BitTorrent was to take out the trackers that were doing the il
  • Making BitTorrent trackerless is a horrible idea that will only serve to reduce the usefulness of BitTorrent below that of Kazaa/Gnutella.

    The biggest thing BitTorrent has going for it is that a central authority you trust is listing legit files, with descriptions, etc. With Gnutella/Kazaa you don't have any such assurance, hence the problem with fake files.

    BitTorrent makes this problem worse (if it goes decentralized) because it downloads chunks completely randomly, so you can't even preview a 9GB
