
Political and Technical Implications of GitTorrent

lkcl writes "The GitTorrent Protocol (GTP) is a protocol for collaborative git repository distribution across the Internet. Git promises to be a distributed software management tool, yet the mechanisms used to date to actually 'distribute' a repository, such as ssh, are still very much centralized. GitTorrent makes Git truly distributed. The initial plans are for reducing mirror load; the full plans, however, include totally distributed development: no central mirrors whatsoever. PGP signing (an existing feature of git) and other web-of-trust-based mechanisms will take over from protocols on ports (e.g. ssh) as the access control 'clearing house.' The implications of a truly distributed revision control system are truly staggering: unrestricted software freedom. The playing field is leveled in so many ways, as 'The Web Site' is no longer the central choke-point of control. Coming just in time for that all-encompassing Free Software revolution hinted at by The Rebellion Against Vista, this article will explain more fully some of the implications that make this quiet and technically brilliant project, GitTorrent, so important to Software Freedom, from both technical and political perspectives."
This discussion has been archived. No new comments can be posted.

  • by nategoose ( 1004564 ) on Thursday December 04, 2008 @02:06PM (#25991695)
    Reread the summary in Davros's voice, increasing the volume and excitement as you get closer to the end. Come on -- it'll be fun.
  • by Anonymous Coward on Thursday December 04, 2008 @02:07PM (#25991721)

    The hyperbole makes you look like a frothing idiot.

  • by ooglek ( 98453 ) <beckman@@@angryox...com> on Thursday December 04, 2008 @02:10PM (#25991753) Homepage Journal
    This is cool, your code can be free. But unfortunately you're still stuck with hosting the documentation on a central website of some sort. I'm hopeful someone will whip up a standard for hosting the documentation website. I.e., PHP + SQLite + GitTorrent docRoot == Distributed website. Now several websites could support any GitTorrent-hosted documentation. Go to any GitTorrentDoc-enabled website, type in the .torrent of the repository, and blam -- the server pulls it down (or has it already cached) and you can page through the fully-dynamic docRoot. Could even contain Trac or something, so all the bug tracking is also in the GitTorrent repository.
    • by ooglek ( 98453 ) <beckman@@@angryox...com> on Thursday December 04, 2008 @02:14PM (#25991833) Homepage Journal
      Hmm. Except that SQLite being updated by two or more people at the same time would create problems. Unless BugIDs were md5 hashes, an insert would likely cause conflicts. And even md5 hashes have collisions, though a collision is pretty unlikely even if you have 100,000 bugs.
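
      (For a rough sense of scale, here's a quick birthday-bound estimate in Python -- purely illustrative, using the 100,000-bug figure above.)

          import math

          # Birthday-bound estimate: probability that any two of n random
          # 128-bit (md5-sized) IDs collide, p = 1 - exp(-n*(n-1) / (2 * 2**128)).
          n = 100_000                  # the bug count mentioned above
          space = 2 ** 128             # size of the md5 output space
          p = -math.expm1(-n * (n - 1) / (2 * space))   # = 1 - exp(-x), accurate for tiny x
          print(p)                     # roughly 1.5e-29: effectively never by accident
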
    • by lkcl ( 517947 ) <lkcl@lkcl.net> on Thursday December 04, 2008 @02:32PM (#25992135) Homepage

      This is cool, your code can be free. But unfortunately you're still stuck with hosting the documentation on a central website of some sort.

      no - you're not :) read the article [advogato.org]: it mentions that static content such as that generated by ikiwiki could perfectly well be generated by a locally-checked-out (gittorrent-distributed) copy of the documentation

      extend that concept a little further (one step at a time!) and you have, as you rightly mention:

      a standard for hosting the documentation website. I.e., PHP + SQLite + GitTorrent docRoot == Distributed website.

      yes! although, to do it much better technically, you'd have a distributed SQL server - a peer-to-peer SQL server. there's a project that IngreSQL are keeping an eye on, called "d", which might show some promise here.

      Could even contain Trac or something, so all the bug tracking is also in the GitTorrent repository.

      yes!

      _now_ you're getting it :)

    • Re: (Score:3, Insightful)

      What is it that prevents you from putting the documentation into git as well? Does git somehow refuse to store plain English text?

      • Re: (Score:3, Funny)

        by Anonymous Coward

        Yes, that's why there are no comments in the Linux kernel

        <ducks>

      • by vrmlguy ( 120854 )

        If I'm understanding things correctly (and there's no guarantee that I am), the "problem" is that a lot of documentation is generated from source code (structured comments in C and Java, POD files in Perl, docstrings in Python, etc.). For example, Doxygen generates the HTML files that you browse, so a signed version of those HTML files isn't available anywhere. The solution is to require anyone wanting authenticated documentation to install the tools needed to generate it from the authenticated source.
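
        (A minimal sketch of that "generate it from the authenticated source" workflow, assuming a Doxygen-based project whose releases carry GPG-signed git tags; the repository URL and tag name below are placeholders.)

            import subprocess

            REPO = "git://example.org/project.git"  # placeholder URL
            TAG = "v1.0"                            # placeholder signed release tag

            # Fetch the source, check the GPG signature on the release tag, and only
            # then build the HTML docs locally rather than trusting pre-built pages.
            subprocess.run(["git", "clone", REPO, "project"], check=True)
            subprocess.run(["git", "tag", "-v", TAG], cwd="project", check=True)  # non-zero exit if the signature doesn't verify
            subprocess.run(["git", "checkout", TAG], cwd="project", check=True)
            subprocess.run(["doxygen", "Doxyfile"], cwd="project", check=True)    # assumes the project ships a Doxyfile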

    • by PouletFou ( 1221320 ) on Thursday December 04, 2008 @02:45PM (#25992349)
      From TFA: "The possibilities that GitTorrent opens up are just mind-blowing. Here are a few: Imagine that an entire project - its web site, documentation, wiki, bug-tracker, source code and binaries - are all managed and stored in a peer-to-peer distributed git repository. To view the web site, you either go to the main site, http://web-site.org/ [web-site.org] or, if you are offline or want faster access, you go to the locally checked out copy."
    • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Thursday December 04, 2008 @03:44PM (#25993245) Journal

      Using sqlite would probably not work very well.

      For issue tracking, a better example would be ditz [rubyforge.org], which stores issues as plain text. YAML, actually, but close enough. Thus, rather than thinking about this whole separate layer of SQL transactions, you deal with changes to the bug tracker with the same tools you use for managing the code.

      For instance, rather than Trac's retarded behavior of refusing to let you modify an issue when someone else already has (and refusing to let you see their changes without opening a new tab), you'd let Git try to merge them, and fix it manually if necessary.

      PHP would not be a good idea, either, unless it was very well secured -- you'd probably want static files for your wiki, or a safer markup language (Markdown, etc.). In fact, no need to make it a wiki -- again, just keep it flat, and use git as the mechanism for distributing changes.
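
      (To make the plain-text-issues idea concrete: a hypothetical issue record in the spirit of ditz -- the field names here are invented for illustration, not ditz's actual schema -- written out with PyYAML so git can diff and merge it like any other file.)

          import yaml  # PyYAML; the field layout below is illustrative only

          issue = {
              "title": "crash when repository URL contains spaces",
              "status": "open",
              "reporter": "alice@example.org",
              "log": ["2008-12-04 reported by alice",
                      "2008-12-04 reproduced on git 1.6 by bob"],
          }

          # One file per issue, committed alongside the code; concurrent edits
          # become ordinary git merges instead of SQL transaction conflicts.
          with open("issue-0001.yaml", "w") as f:
              yaml.safe_dump(issue, f, default_flow_style=False)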

  • by Chris Mattern ( 191822 ) on Thursday December 04, 2008 @02:10PM (#25991765)

    ...there's too many gits on the internet *now*...

  • Why? (Score:3, Interesting)

    by Rix ( 54095 ) on Thursday December 04, 2008 @02:12PM (#25991799)

    The primary purpose of peer-to-peer systems is either to avoid censorship or to provide lots of cheap/free bandwidth.

    Neither of these really applies to source code management. Hosting is easily sponsored and the files aren't very big anyway. Few projects will face censorship anywhere other than the most regressive regimes (i.e., China or the US).

    • Think of it as abstracting away servers, sort of like the "cloud computing" concept but from a different angle. At minimum, it gives you automatic load balancing between mirrors.

      I'm not sure if this particular implementation is the greatest thing since sliced bread, but there are still a ton of areas where just adding distribution + pgp signatures will make the world a better place.

      • Re: (Score:3, Informative)

        I don't see the need, though. Git is small and lightweight. Large-ish projects just work off of Github, which is fast enough. If the central repository goes down, you have other means (mailing list, etc) for getting back in touch -- granted, GitTorrent would do that for you, but it seems a premature optimization when a central repository works most of the time.

        • You're used to permanent online Internet access.

          in cases where internet access is prohibitively expensive or even impossible, it makes perfect sense to have everything in easily-syncable git repositories.

          once you have the documentation, the wiki, the code and the bugtracker in repositories, you could even sync those repositories up with the rest of the world through the exchange of floppy disks, CDs or USB memory sticks.

          so the article is about "thinking ahead".
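
          (a sketch of that sneakernet case using git's own bundle mechanism; the paths below are just placeholders)

              import subprocess

              # on the connected machine: pack the whole repository (all refs) into
              # a single file that fits on a USB stick or a CD
              subprocess.run(["git", "bundle", "create", "/media/usb/project.bundle", "--all"],
                             cwd="/home/dev/project", check=True)

              # on the offline machine: clone from the bundle, or fetch from it into
              # an existing clone to pick up whatever is new
              subprocess.run(["git", "clone", "/media/usb/project.bundle", "project"], check=True)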

    • Re:Why? (Score:5, Insightful)

      by lkcl ( 517947 ) <lkcl@lkcl.net> on Thursday December 04, 2008 @02:39PM (#25992239) Homepage

      The primary purpose of peer-to-peer systems is either to avoid censorship or to provide lots of cheap/free bandwidth.

      the primary purposes _now_ are to avoid censorship and to provide lots of cheap/free bandwidth.

      the last major upgrade of debian REDLINED the world's internet backbone infrastructure for a WEEK.

      with the total linux usage only being - what... 1% of the world's desktop systems, and debian being a small fraction of that, the debian mirror system is ALREADY creaking under the load.

      Neither of these really applies to source code management.

      why not?

      Hosting is easily sponsored and the files aren't very big anyway. Few projects will face censorship anywhere other than the most regressive regimes (i.e., China or the US).

      i don't _want_ "sponsorship". i don't _want_ my pet project hosted by a large corporation. i want it completely independent.

      i want my web site content hosted and automatically mirrored across the world, along with its bugs database and its wiki all linked together.

      i want people in the emerging markets and the third world to be able to have exactly the same kind of luxury that we do - and they DO NOT have "continuous access to the web site or access to the lovely sponsored hosting".

      think much bigger and you will start to see why this is so damn important.

      • [Citation Needed] (Score:2, Insightful)

        by Rix ( 54095 )

        Proof or it didn't happen.

        Why don't you want your pet project hosted by a large corporation? You really just sound like you're whining about nothing.

        I'm pretty sure neither Google Code nor Sourceforge discriminate against the third world.

        • I'm pretty sure neither Google Code nor Sourceforge discriminate against the third world.

          But it could easily be the other way around. You could be a third world developer and have a great piece of software to share, say KQuicken for Linux, but you can't reach sourceforge because of your government's firewall.
      • Re: (Score:3, Funny)

        by BlowChunx ( 168122 )

        simple answer is to have debian update more often... ;-)

      • the last major upgrade of debian REDLINED the world's internet backbone infrastructure for a WEEK.

        I very much doubt it was the source which caused that issue, or that a distributed git repository would help. I'm guessing it was downloading DVD images, or individual packages -- assuming that this actually happened; I only have your word for it at the moment.

        And the reason it doesn't apply to source code management is, at least technologically, git is damned good -- fast and small enough that it really does not cost very much in the way of hosting. Typically, when I send an update to Github, it's on the o

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      "the files aren't very big anyway."

      Speak for yourself. Ever work on a game or film project?

      • I don't see why you'd keep a film project in an SCM, and game art assets can be kept separate from the code anyway. SCMs won't track them very well.

        • by grumbel ( 592662 )

          I don't see why you'd keep a film project in an SCM

          For the same reason you keep source in SCM.

          SCMs won't track them very well.

          Centralized SCMs do quite fine; Git, on the other hand, does rather horribly, since it forces you to check out the complete history of the project - not so much a problem with text files, but a huge issue with binary blobs, which don't diff well and thus don't compress. A 'git clone' can easily get 10 times as large as a 'svn checkout' for a project with lots of binary stuff.
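
          (One partial mitigation, at least for users who only need the latest snapshot rather than the full history: a shallow clone. A sketch with a placeholder URL; shallow clones come with their own restrictions, so this is not a complete answer to the binary-history problem.)

              import subprocess

              REPO = "git://example.org/game-assets.git"  # placeholder URL

              # --depth 1 fetches only the most recent commit instead of the whole
              # history, so every old revision of each large binary asset isn't
              # downloaded along with it.
              subprocess.run(["git", "clone", "--depth", "1", REPO, "game-assets"], check=True)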

  • A website and bandwidth have never been a chokepoint; SourceForge and Google Code have provided bandwidth for years.

    This is a problem in search of a solution.

    • by GCsoftware ( 68281 ) on Thursday December 04, 2008 @02:19PM (#25991901) Homepage

      This is a problem in search of a solution.

      I believe you meant "solution in search of a problem."

    • across most of europe, america and asia, internet access is near-unlimited.

      have you considered the implications of receiving linux on a CD, and being cut off from the rest of the internet?

      how would a group of 100 developers, or 1000 developers, or 10,000 developers - all of them "used to" the current levels of internet access and speed, cope in a situation where the access to the internet was restricted to intermittent 56k dialup?

      • In such a situation, I still don't see how GitTorrent helps. I can still use Git over a LAN, manually, and that intermittent 56k is still plenty fast for source-level changes to be distributed via Git.

    • Re: (Score:2, Interesting)

      by sakonofie ( 979872 )
      From http://code.google.com/p/gittorrent/ [google.com]:

      It might currently come across as a solution looking for a problem - and as one smart-ass with admin rights to the Google Code project reminds you on the source tab, "more alpha than the greek letter". The initial motivation was performance of downloads and in particular reducing load on kernel.org.

      Not convinced this is a good idea yet? Oh don't worry it goes on:

      That's one reason d'etre, but to those who argue that is insufficient justification for its existence, that Git is already fast enough - it is a first step towards applying decentralizing Peer to Peer concepts to Git.

      BTW, an excellent way to convince someone a project really doesn't have a "reason d'etre" is insisting it has multiple "reason d'etre"s.

      If you decentralize the download layer, it's just another small step before you decentralize the push rights and tie it to a web of trust such as PGP, and then you don't actually need discrete mirror sites. Every mirror can track the git repositories the owners want it to carry, and those authorized to sign updates can make signed updates to push the cloud forward.

      You had me at performance and distributing bandwidth costs, and probably should have stopped there. Changing ownership of a project from those who control "The Web Site" to those "authorized to sign updates" doesn't do much for me.
      And srly, "central choke-point of co

  • It amuses me (Score:5, Interesting)

    by Reality Master 201 ( 578873 ) on Thursday December 04, 2008 @02:16PM (#25991853) Journal

    The hyperventilation notwithstanding, what amuses me most is the fact that the project is currently hosted at Google Code.

    Try meditation or something.

    • by lkcl ( 517947 )

      ohmmmmmmmmmmmm

      "i am at onnnne with the universe. i am greeeen!"

      the project was found by accident: the author of the article and the project's authors are not related, in any way.

      think of google code as a bootstrap mechanism: you have to get from here to there _somehow_, and if it wasn't for the old, you'd never get a leg-up into the new....

      • think of google code as a bootstrap mechanism: you have to get from here to there _somehow_, and if it wasn't for the old, you'd never get a leg-up into the new....

        True enough, but you'd think that you'd start with one of the "old" things which was at least managed with the same SCM your project is for. That is, why wouldn't you use Github for that?

        • by lkcl ( 517947 )

          i think that google might be a bit peeved if the people who were on that particular GSoC-sponsored project decided that they wanted to use github.com, not code.google.com.... :)

          notwithstanding: even github would be sidelined by gittorrent - or would have to adopt gittorrent...

  • you don't need the hype. linking it to the downfall of vista makes us laugh at you

    just describe what it does, dryly, concisely, technically. if it is worthy of the hype, we will supply the hype for you

    but when you supply the hype, we are inclined to believe there's not much really going on with your project. which might not be true. so change your tone, for your own sake

  • by Anonymous Coward on Thursday December 04, 2008 @02:26PM (#25992029)

    Coming just in time for that all-encompassing Free Software revolution hinted at by The Rebellion Against Vista

    Can you also point me to where the rainbow-powered unicorn factories are? I imagine they probably exist in the world you seem to live in, you insufferable twit.

  • But a central repository doesn't disappear when seeders disappear, and it is more easily controlled to protect commits. The magic of git is that I can easily have a private branch, and then easily merge it. But is this really a good idea?

    • by lkcl ( 517947 )

      there's nothing to stop an EXISTING site from being the one that publishes their "central" repository via gittorrent. in fact, that's the whole point - initially - of gittorrent: to take the load off the "central" repositories, currently utilising http mirroring.

      but thank you - i will make mention of that, explicitly, in the article.

  • The nice part about a repository hosted on a well-known site is (relative) confidence in the security of the code. If a repo is fully distributed, what's to protect against someone at a node adding malicious code? And, if something malicious is discovered in software you downloaded, how do you track it back to the source node?

    Curious,

    • by Sloppy ( 14984 )

      The nice part about a repository hosted on a well-known site is (relative) confidence in the security of the code. If a repo is fully distributed, what's to protect against someone at a node adding malicious code? And, if something malicious is discovered in software you downloaded, how do you track it back to the source node?

      Wasn't that answered in the summary?

      PGP signing (an existing feature of git) and other web-of-trust-based mechanisms will take over from protocols on ports (e.g. ssh) as the access c

    • Re: (Score:3, Informative)

      by TheRaven64 ( 641858 )
      As mentioned in the summary, PGP. Each branch will be signed with a PGP key, so if you trust the person who owns the key then you trust the code. If someone tampers with it, then they won't be able to sign it. You can still grab their branch, but only if you trust them.
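
      (Concretely, the mechanism git already ships for this is GPG-signed tags, as mentioned elsewhere in the thread; a minimal sketch, where the tag name and key ID are placeholders.)

          import subprocess

          TAG = "v1.0"                    # placeholder tag name
          KEY = "maintainer@example.org"  # placeholder GPG key ID

          # Maintainer side: attach a GPG signature to the current commit via a tag.
          subprocess.run(["git", "tag", "-s", TAG, "-u", KEY, "-m", "signed release"], check=True)

          # Consumer side: verification exits non-zero if the tag has been tampered
          # with, or was signed by a key you haven't imported and trusted.
          subprocess.run(["git", "tag", "-v", TAG], check=True)
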
    • by lkcl ( 517947 )

      Git GPG signing, and KeyNote [ietf.org].

      http://www1.cs.columbia.edu/~angelos/keynote.html [columbia.edu]

  • Rebellion you say? (Score:3, Insightful)

    by Jamie's Nightmare ( 1410247 ) on Thursday December 04, 2008 @02:34PM (#25992157)

    I would rather see a rebellion on Slashdot against articles that announce FOSS news as if they were predicting the second coming of Christ.

    This story is in no way related to Microsoft's (perceived) loss in market share, not to mention the fact that those who are dropping Windows are moving to Apple, not Linux. But hey, gotta go for every low blow you can get while the news is still fresh, right?

  • by eddy ( 18759 ) on Thursday December 04, 2008 @02:35PM (#25992175) Homepage Journal

    BitTorrent Trademark Guidelines: [bittorrent.com] "Misleading or Confusing People. If you are using any of our trademarks in a way that will cause people to get the wrong idea about BitTorrent's involvement in something, you should stop! If you have some reason why you think your proposed use isn't misleading or confusing, let's talk."

    • Re: (Score:3, Informative)

      They've already renamed it MirrorSync [google.com] and redesigned the entire concept to fit better into the way git repositories already talk to each other.
  • Comment removed based on user account deletion
    • by lkcl ( 517947 )

      A distributed repository has no political implications that mirroring in general doesn't already have.

      China. Dubai. internet access is monitored and censored. In Dubai, if a mime-encoded download *happens* to have the letters "sEx" in it, it gets shut off.

      • by richlv ( 778496 )

        darn. they can't read about sussex, england.
        and they can't register at hostels/hotels/conferences where sex has to be specified (this actually happened at my workplace where some admin had set overzealous filters on his own).

      • Dang, man! Are you replying to every single comment? That's quite the astroturfing campaign you are running....

        • by lkcl ( 517947 )

          ha ha :)

          no, not every single one - just the ones that get the wrong end of the stick in some subtle way that could misdirect readers.

  • This is an iffy idea for data that actually matters. The "torrent" type systems sort of work because they're willing to accept very poor data integrity in exchange for free music and video. Even that's going downhill, as more content shows up with logos, ads, and other various dreck tacked on.

    When it doesn't work, or something gets lost, who do you blame?

    Security is supposed to be through "signing". Who's signing what? Does everybody sign their own check-in, do servers sign collections of files, or w

    • read the article [advogato.org]: in it, you will see links to the fact that Git already has GPG signing on tags.

      also, you will see references to KeyNote [columbia.edu], aka RFC 2704 [ietf.org]. for convenience, i'm cut/pasting the top bit, here:

      "Trust management, introduced in the PolicyMaker system [BFL96], is a unified approach to specifying and interpreting security policies, credentials, and relationships; it allows direct authorization of security-critical actions. A trust-management system provides standard, general-purpose mechanisms for s

    • by autocracy ( 192714 ) *
      You're supposed to have a connection to the "web of trust" system. The system isn't meant to work based on the idea of, "Oh, there are a bunch of keys that have signed each other. Must be fine."
  • I don't know how this would work with software, as BitTorrent files seem to have a half-life of sorts, so that older files might disappear. What ensures that the entire list of files expected is actually available, and how do you browse "the repository" for a project?
    • by lkcl ( 517947 )

      Persistence happens by mistake when people forget to clean out their gittorrent-backed git repositories.

      the nice thing about using gittorrent is that you would end up with copies of the bits of source code and the binaries that YOU were interested in - and, consequently, so would anyone else.

      so, if you were a maintainer of a project, you would be interested in hosting a "central" repository, just like is done now, keeping all the revisions of the software, but it would *happen* to also be _distributed_...

      in

  • This sounds like a nice way to take the load off of the central servers. I don't think it will replace them or make them unnecessary.

    From a technical standpoint, with Git, there's nothing about the central server that is unique. Instead, it's a social convention. Everyone knows where to get the code. Linus discusses this here. http://lwn.net/Articles/246381/ [lwn.net]

    Perhaps, my imagination is failing; but, I don't think this will change. Most people want to go to a well known trusted place to at least get

  • This is a very legitimate torrent use that will frustrate the RIAA in its attempts to stamp out torrents.

    • To a reasonable and informed person, this might constitute a legitimate non-infringing use of P2P.

      To the IP Crusaders, this is another step on the slippery slope!

      "Now this evil infringing peer-to-peer technology is being used to host UNSTOPPABLE repositories of COPYRIGHT-INFRINGEMENT and COPYRIGHT-PROTECTION-CIRCUMVENTION software! The P2P terrorists have gone from Torrent as a WMD to Torrent as a WMD factory!"

      I hope /code doesn't get all "all-caps-filter" on me; I'm trying to simulate hysterics.

  • Dead project (Score:5, Informative)

    by nniillss ( 577580 ) on Thursday December 04, 2008 @03:33PM (#25993079)
    Status, according to the project site, http://code.google.com/p/gittorrent/ [google.com]: Currently no-one is actively developing either this developed version or Jonas' C++ implementation.

    The last project entries/downloads are from February 2008. Why such hype over a dead/dormant project?

  • Vista rebellion? (Score:2, Insightful)

    by Vamman ( 1156411 )
    You mean someone else supports ClearType fonts now?

    I'm not a Microsoft fan but this shit about a vista rebellion has nothing to do with bringing two technologies together (that also have their warts).

    I'm pretty sure the frustrated Vista users won't be benefiting from peer-to-peer distributed source code anytime soon.
  • Git is basically just a generic distributed versioning-filesystem layer, right? Source control is its current killer app, but it's got no particular hooks to make it dependent on that domain.

    So if we combined Git + Bittorrent... does that give us a generalised peer-to-peer distributed filesystem?

    If so, that's a whole lot more interesting than just a way to share source code fast. Imagine a true peer-to-peer Web built on something like this.

    Imagine, for instance, posting blog or wiki posts as little paragrap

  • by hachete ( 473378 ) on Thursday December 04, 2008 @06:05PM (#25995119) Homepage Journal

    Given that a fair proportion of the firms I've worked for do not know how to use SCMs, a lot of the SCMs I've maintained contain rather large binary snapshots. Also, distributed firms. So this might be a useful tool if I could get people to use it. Which is unlikely.

    but the politics? In this case, leave it out. Just a distraction.

  • by syousef ( 465911 ) on Thursday December 04, 2008 @08:16PM (#25996885) Journal

    I always check my GIT repository into SVN for safe keeping ;-)

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...