Why Mozilla Is Committed To Using Gecko

Ars Technica has published an article about Mozilla's commitment to the Gecko rendering engine rather than WebKit, the engine adopted by Apple and Google for the Safari and Chrome browsers. I have been using Chrome on my work PC and find many of its features compelling, and I wonder how soon we will see its best innovations in Firefox. Why is Gecko worth keeping if it is outdated and bloated?

  • by suck_burners_rice ( 1258684 ) on Tuesday September 09, 2008 @08:00PM (#24939887)
    Because it has a cooler name than the boring sounding WebKit. Besides, it'll save you 15% on car insurance.
  • lite (Score:5, Insightful)

    by TheSHAD0W ( 258774 ) on Tuesday September 09, 2008 @08:02PM (#24939921) Homepage

    Why is Gecko worth keeping if it is outdated and bloated?

    Because it's bloated as a single app, but less bloated than opening up a new process (or more than one!) for every single web page loaded. Until every computer in use has multi-gigabyte memory, including handheld devices, there will be a need for something lighter than WebKit.

    • Re:lite (Score:4, Informative)

      by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Tuesday September 09, 2008 @08:08PM (#24939997) Homepage Journal

      You're confusing Firefox-the-browser with Gecko-the-renderer. There's no reason Firefox couldn't have one process per tab, and most WebKit/KHTML implementations currently use one process per browser window (like Firefox).

      In short, pick something else to distinguish them. You're way off this time around.

    • Re:lite (Score:5, Informative)

      by Anonymous Coward on Tuesday September 09, 2008 @08:09PM (#24940019)

      WebKit doesn't specify that you have to use a separate process for each page. That's a Google Chrome feature.

    • by Estanislao Martínez ( 203477 ) on Tuesday September 09, 2008 @08:19PM (#24940185) Homepage

      Because it's bloated as a single app, but less bloated than opening up a new process (or more than one!) for every single web page loaded. Until every computer in use has multi-gigabyte memory, including handheld devices, there will be a need for something lighter than WebKit.

      First of all, WebKit itself doesn't impose the multi-process model that Google's Chrome uses. For example, Safari uses WebKit, and it runs as a single process.

      With that cleared up, here's the more important flawed assumption in your post: that having the browser use n processes to display n pages will require n times as much memory as doing it all with n threads in one process. That's far from true, because such a browser can be architected so that the processes use shared memory for all shared resources and state.

      The multi-process architecture will carry additional memory overhead, but done correctly, it will scale up much better than linearly. The real costs are the costs of process creation and switching in the OS, plus the costs of the inter-process communication method. Using shared memory for the latter is cheap, but it can potentially make one process bring down the others, defeating the purpose of isolating each page into a process; it's a balancing act, and the memory overhead really depends on what tradeoffs one picks here.
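
      A rough illustration of why the memory overhead can scale well (numbers invented for the example): if each page needs 5 MB of private state on top of 50 MB of shareable resources, ten one-page processes sharing those resources cost roughly 50 + 10 x 5 = 100 MB, not 10 x 55 = 550 MB. Below is a minimal POSIX/Linux sketch of the underlying mechanism -- not from any browser's code, just an illustration that a region mapped MAP_SHARED before fork() stays visible to both processes, while ordinary heap memory becomes private:

        #include <sys/mman.h>
        #include <sys/wait.h>
        #include <unistd.h>
        #include <cstdio>
        #include <cstring>

        int main() {
            // A region mapped MAP_SHARED before fork(): both processes see it.
            // (Error handling omitted for brevity.)
            char* shared = static_cast<char*>(mmap(nullptr, 4096,
                PROT_READ | PROT_WRITE, MAP_SHARED | MAP_ANONYMOUS, -1, 0));
            char* priv = new char[4096];   // ordinary heap: private after fork()
            std::strcpy(shared, "initial");
            std::strcpy(priv, "initial");

            if (fork() == 0) {             // child plays the "page" process
                std::strcpy(shared, "child wrote this");
                std::strcpy(priv, "child wrote this");  // copy-on-write copy
                _exit(0);
            }
            wait(nullptr);                 // parent plays the "browser" process
            std::printf("shared:  %s\n", shared);   // sees the child's write
            std::printf("private: %s\n", priv);     // still "initial"
            return 0;
        }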

      • Amendment (Score:5, Insightful)

        by Estanislao Martínez ( 203477 ) on Tuesday September 09, 2008 @08:26PM (#24940283) Homepage

        The multi-process architecture will carry additional memory overhead, but done correctly, it will scale up much better than linearly. The real costs are the costs of process creation and switching in the OS, plus the costs of the inter-process communication method. Using shared memory for the latter is cheap, but it can potentially make one process bring down the others, defeating the purpose of isolating each page into a process; it's a balancing act, and the memory overhead really depends on what tradeoffs one picks here.

        Actually, I take that back. The only real overhead is the OS overhead for separate processes.

        The architectural choice of which memory contents should be shared between processes and which should be private isn't specific to the multi-process architecture. The same choices and tradeoffs exist in a multi-threaded application: you can have each thread keep its own copy of some piece of memory (uses more memory, but isolates each thread from the others), or have all the threads share it (uses less memory, but access must be synchronized, and any bug involving that shared memory may let one thread bring the others down).
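
        A minimal sketch of that thread-side tradeoff (illustrative only, not from the comment): a thread_local variable gives each thread its own private copy and needs no locking, while a plain shared variable must be guarded by a mutex, and a bug in any thread that touches it affects all of them.

          #include <cstdio>
          #include <mutex>
          #include <thread>
          #include <vector>

          int shared_count = 0;                 // one copy, shared by all threads
          std::mutex shared_mutex;              // access must be synchronized
          thread_local int private_count = 0;   // each thread gets its own copy

          void work() {
              for (int i = 0; i < 1000; ++i) {
                  ++private_count;              // private: no locking needed
                  std::lock_guard<std::mutex> lock(shared_mutex);
                  ++shared_count;               // shared: safe only under the lock
              }
          }

          int main() {
              std::vector<std::thread> threads;
              for (int i = 0; i < 4; ++i) threads.emplace_back(work);
              for (auto& t : threads) t.join();
              std::printf("shared_count = %d\n", shared_count);  // 4 x 1000 = 4000
              return 0;
          }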

  • Heterogeny (Score:5, Insightful)

    by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Tuesday September 09, 2008 @08:02PM (#24939929) Homepage Journal

    Variety is the spice of life. If every browser used the same engine, there'd be no competitive spirit to improve it. Besides, when was a monoculture ever a good thing?

    I've been using Konqueror for my primary browser for several years now, but still respect the Mozilla group and wish them the best of luck. As long as everyone follows the standards (which the Open Source browser folks have excelled at), the more the merrier!

    • Re: (Score:3, Insightful)

      by Enderandrew ( 866215 )

      Gecko just had a major two-year-plus makeover, and it still isn't as good as WebKit. One could argue that the WebKit of two years ago stacks up reasonably well against the Gecko of today.

      Mozilla spent so much time on rendering-engine refactoring, and now they want to focus on stabilizing Firefox 3 and then moving on to Firefox 4.

      Moving to a new rendering engine would be daunting; I don't see Mozilla taking on such a project themselves.

      There is a new Qt branch of Firefox, but even that isn't a proper Qt branch. It uses Qt widget

    • Re: (Score:3, Interesting)

      by localman ( 111171 )

      As a has-been web developer and regular web user, I'm going to suggest that the advantages of having many browsers (or, more specifically, rendering engines) are largely overstated.

      I don't see that browsers have made any wondrous leaps of progress due to competition. In fact it seems that competition has stymied progress at times, as browsers had to attempt supporting incompatible features that grew out of attempts to one-up the competition. Companies that develop websites have to waste a lot of resources o

      • Re:Heterogeny (Score:5, Insightful)

        by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Tuesday September 09, 2008 @09:11PM (#24940791) Homepage Journal

        As a current web developer, I develop with KHTML. When I like it, I verify that it looks the same under Gecko (and it always does). If it's a major change, I'll check it under MSIE and screw around with the CSS until IE manages to display it without barfing. I don't bother testing with Opera anymore because I've never once seen it fail on a valid page that renders under KHTML - it's just kind of assumed that it will work.

        So with all the HTML engines out there, you only have to test two camps: MSIE and everything else. Adding another standards-compliant engine wouldn't increase my workload one iota.

      • Re: (Score:3, Informative)

        Flash is an example of something that seemingly progressed well, perhaps faster than browsers, while having essentially no competition.

        It's not quite what it seems to be...

        You make some interesting points, and your point about Flash isn't entirely invalid, but Flash is also a great example of the sheer lunacy of pinning any kind of "web standard" on a single closed implementation from a single vendor.

        Let's start with the basics: Flash is slow. Version 10 might finally get hardware acceleration -- OpenGL has been available for how long, again?

        Firefox already uses Cairo, a cross-platform vector graphics engine, which already has some acceler

  • by d34thm0nk3y ( 653414 ) on Tuesday September 09, 2008 @08:02PM (#24939931)
    Holy begging the question, Batman!

    Yes, I did check Wikipedia to make sure a million angry slashdotters weren't going to kill me for its usage.
  • by creature124 ( 1148937 ) on Tuesday September 09, 2008 @08:03PM (#24939935)
    This article ignores the real question: Why change? I personally see nothing 'outdated' or 'bloated' about Gecko, and there is no point in changing if Webkit provides no real advantage.
    • by bigstrat2003 ( 1058574 ) * on Tuesday September 09, 2008 @08:07PM (#24939983)
      Did you RTFA, or just TFS? Because I RTFA'd, and the article specifically says that there's no reason for Firefox to switch engines. TFS is full of it, basically, so I could understand if you got the wrong idea from that.
    • by PunkOfLinux ( 870955 ) <mewshi@mewshi.com> on Tuesday September 09, 2008 @08:09PM (#24940027) Homepage

      I think it's more that WebKit is the new buzzword in browser dev. Plus, Apple uses it, so it's *obviously* the holy grail. I think Gecko is fine; if it's the bloat, maybe the competition from WebKit will whip it into shape.

  • Because... (Score:5, Informative)

    by not already in use ( 972294 ) on Tuesday September 09, 2008 @08:04PM (#24939951)

    It's required for the XUL-based interface?

  • Woah... (Score:5, Insightful)

    by JustinOpinion ( 1246824 ) on Tuesday September 09, 2008 @08:05PM (#24939953)

    Why is Gecko worth keeping if it is outdated and bloated?

    You've begged the question, there. The fact is that Gecko isn't outdated and bloated. Mozilla has kept the code up to date, and they've improved rendering and JavaScript performance remarkably in recent Firefox releases.

    Personally, I'd rather see alternatives being independently developed and improved; all the while competing with each other for mindshare and technical superiority. The alternative, of relying on a single rendering engine for all browsers, is a bad idea. History has taught us it will lead to stagnation and quirky (rather than standards-compliant) rendering.

    • Re:Woah... (Score:5, Funny)

      by Anonymous Coward on Tuesday September 09, 2008 @08:14PM (#24940103)

      You've begged the question, there

      GOD DAMNIT! No, Begging the question is a logi... wait, you used it RIGHT?

      *reads it again*

      Okay... WHO ARE YOU AND WHAT HAVE YOU DONE WITH SLASHDOT!?

    • Re:Woah... (Score:5, Interesting)

      by QuantumG ( 50515 ) * <qg@biodome.org> on Tuesday September 09, 2008 @08:15PM (#24940119) Homepage Journal

      Ya know what I'd like to see? Standards revision. It's great to trot out "standards compliance" as the holy grail, but the problem is that there are plenty of things that the standard just does not define.. and those things get discovered by web developers who work around the issues and it never gets back to the standards drafters. For example, how do you prefetch images? For a long time there was no standard way. Now there's the link tag but it's optional.. yeah, that's right, the standard says that a browser can optionally implement the tag.. what kind of standard is that anyway? So no-one used it. Instead, they use the img tag and set the width and height of the image to 0.. unfortunately, the standard never said "if the width of the image is zero, thou shalt not render anything." Yeah, yeah, I know, it should be implied, but some browsers render a white pixel and figure that's good enough.. the fact that this isn't good enough should be fed back to the standard and made explicit.

      Thankfully the interest in Acid tests has taken on this role. Unfortunately even a lot of stuff that is in the acid test never makes it back to the standard, so browser developers have to reverse engineer the Acid test!
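
      For concreteness, the two prefetch approaches described above look roughly like this (a sketch; the image URL is invented):

        <!-- The optional hint from the standard: the browser MAY fetch this early. -->
        <link rel="prefetch" href="next-page.jpg">

        <!-- The workaround: force the fetch with a zero-size image, and hope the
             browser really renders nothing for it. -->
        <img src="next-page.jpg" width="0" height="0" alt="">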

      • Please do not take this negatively:

        Ya know what I'd like to see? Standards revision.

        And yet, they do revise them by working on and ratifying a new version.

        It's great to trot out "standards compliance" as the holy grail, but the problem is that there are plenty of things that the standard just does not define.. and those things get discovered by web developers who work around the issues and it never gets back to the standards drafters.

        That sounds nice, but you're advocating a moving target. Standards or recommendations would never be finished.

        Now there's the link tag but it's optional.. yeah, that's right, the standard says that a browser can optionally implement the tag.. what kind of standard is that anyway? So no-one used it. Instead, they use the img tag and set the width and height of the image to 0.. unfortunately, the standard never said "if the width of the image is zero, thou shalt not render anything."

        Just because *you* want it, doesn't mean others do.

        Unfortunately even a lot of stuff that is in the acid test never makes it back to the standard, so browser developers have to reverse engineer the Acid test!

        I'm guessing you're a web developer. Therefore, you or your company have a demonstrated interest in the recommendations, which means you can sign up, be a member of the committees, and advocate your changes and proposals for the next version.

      • Re:Woah... (Score:4, Insightful)

        by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Tuesday September 09, 2008 @10:29PM (#24941597) Journal

        Now there's the link tag but it's optional.. yeah, that's right, the standard says that a browser can optionally implement the tag.. what kind of standard is that anyway?

        One that makes sense?

        Out of curiosity -- when was the last time you used lynx? Or links2? Or w3m? Browsers don't even have to implement images at all.

        Seems to me, about the best they can do is define what the behavior should be when implemented. So, I'd suggest just using the link tag -- it's not like your page will break if it's not implemented, it'll just be slightly slower.

        And I've found that prefetching images isn't useful, most of the time. Let the browser cache do its job.

        Thankfully the interest in Acid tests has taken on this role. Unfortunately even a lot of stuff that is in the acid test never makes it back to the standard, so browser developers have to reverse engineer the Acid test!

        Aren't the Acid tests documented?

        Sure, it's easier if you only have to read the standards, but if you're just looking for a higher Acid test score, that seems like the obvious solution.

  • by daceaser ( 239083 ) on Tuesday September 09, 2008 @08:06PM (#24939969) Homepage

    The whole of the Mozilla code tree is tied into a framework called XPCOM, a cross-platform reimplementation of Microsoft's COM. The XPCOM influence is extremely pervasive throughout the Mozilla/Firefox/Thunderbird/Sunbird/Gecko code trees.

    WebKit would not fit in very well with the existing ecosystem because it does not tie into the XPCOM framework which is used to tie all of the Mozilla group's projects together. A lot of the potential performance benefits of moving to WebKit would be lost because of all the bridging between WebKit and XPCOM that would be required.
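
    For readers unfamiliar with COM, here is a hedged sketch of the COM-style pattern that XPCOM re-implements (all names are invented for illustration; this is not the actual XPCOM API): every component exposes a refcounted base interface with runtime interface discovery, and callers reach functionality through that machinery -- exactly the kind of glue an outside engine would need bridged.

      #include <cstdio>

      // Illustrative stand-ins for the framework's interface-ID machinery.
      using IID = int;
      constexpr IID IID_ISupports = 1;
      constexpr IID IID_IRenderer = 2;

      // The COM-style base interface: refcounting + interface discovery.
      struct ISupports {
          virtual int QueryInterface(IID iid, void** out) = 0;
          virtual int AddRef() = 0;
          virtual int Release() = 0;
          virtual ~ISupports() = default;
      };

      struct IRenderer : ISupports {
          virtual void RenderPage(const char* url) = 0;
      };

      class Renderer : public IRenderer {
          int refs_ = 1;
      public:
          int QueryInterface(IID iid, void** out) override {
              if (iid == IID_ISupports || iid == IID_IRenderer) {
                  *out = static_cast<IRenderer*>(this);
                  AddRef();
                  return 0;    // success, a la NS_OK
              }
              *out = nullptr;
              return -1;       // no such interface
          }
          int AddRef() override { return ++refs_; }
          int Release() override {
              int r = --refs_;
              if (r == 0) delete this;
              return r;
          }
          void RenderPage(const char* url) override {
              std::printf("rendering %s\n", url);
          }
      };

      int main() {
          ISupports* component = new Renderer();
          void* out = nullptr;
          // Capabilities are discovered at runtime rather than via C++ casts;
          // an engine that doesn't speak this protocol needs bridging code.
          if (component->QueryInterface(IID_IRenderer, &out) == 0) {
              static_cast<IRenderer*>(out)->RenderPage("http://example.org/");
              static_cast<IRenderer*>(out)->Release();  // drop the QI reference
          }
          component->Release();
          return 0;
      }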

  • It's NIH (Score:3, Insightful)

    by coolgeek ( 140561 ) on Tuesday September 09, 2008 @08:10PM (#24940039) Homepage

    The Mozilla crew are still pissed at David Hyatt for choosing Konqueror's KHTML over Gecko as "the best open source rendering engine available" when he defected from Mozilla to Apple.

    That's why they will never consider WebKit. Too much pride.

  • by ya really ( 1257084 ) on Tuesday September 09, 2008 @08:12PM (#24940063)

    He goes from talking about rendering engines (WebKit/Gecko) to talking about how great the features are in Chrome (the browser, not the rendering engine), then back to rendering engines (Gecko). What exactly is your topic?

    Just a hunch, but the writer doesn't sound intelligent enough to know the features of a rendering engine.

  • by melted ( 227442 ) on Tuesday September 09, 2008 @08:12PM (#24940065) Homepage

    Recently I've been seeing some indirect evidence of memory corruption in FF. After a while it fails to download images or connect to the network, for example. Restart the process and it all works like buttah again. Heck, Internet Explorer is more stable than this.

    I guess fixing hard-to-repro bugs is a far less glorious job than bolting on a new JS interpreter (even though the old one was OK to begin with) or tweaking the UI.

  • by fuzzyfuzzyfungus ( 1223518 ) on Tuesday September 09, 2008 @08:14PM (#24940095) Journal
    While it is certainly true that the Mozilla codebase has a rather sordid past, its trajectory has been extremely encouraging (particularly given that it essentially includes its own cross-platform widget set, used by Mozilla apps and a few others). JavaScript performance is competitive with the best, memory performance has steadily improved, and rendering support is quite credible.

    I can understand why a third party, starting a project from scratch, might be disinclined to use Gecko; but Gecko seems to be very much on the worthwhile side of the "improve vs. scrap" question.
  • RTFA (more closely) (Score:4, Informative)

    by nadamsieee ( 708934 ) on Tuesday September 09, 2008 @08:58PM (#24940659)

    From a technical perspective, Gecko is now very solid and no longer lags behind WebKit. A testament to the rate at which Gecko has been improving is its newfound viability in the mobile space, where it was practically considered a nonstarter not too long ago. Mozilla clearly has the resources, developer expertise, and community support to take Gecko anywhere that WebKit can go.

  • Mozilla IS Gecko (Score:4, Insightful)

    by mysidia ( 191772 ) on Tuesday September 09, 2008 @09:16PM (#24940865)

    Gecko is what they developed.

    This is like having an article on Redhat's commitment to the Linux kernel.

    As if they could just arbitrarily change their flagship product to use the BSD kernel instead.

    Or like discussing Microsoft's commitment to the Windows platform.

    Just because Unix/Linux-based kernels and software are becoming more popular in some circles does not mean that it is conceivable for M$ to drop the Windows kernel in favor of a *IX one.

    If Gecko in Mozilla dies it will be because they have developed a better Gecko, or because Mozilla as a whole has died.

  • Security? (Score:3, Informative)

    by Shadow-isoHunt ( 1014539 ) on Tuesday September 09, 2008 @10:11PM (#24941421) Homepage
    Why is WebKit worth switching to when Chrome had six vulnerabilities published in the space of three days?

    2008-09-05: http://milw0rm.com/exploits/6367 [milw0rm.com]
    2008-09-05: http://milw0rm.com/exploits/6386 [milw0rm.com]
    2008-09-05: http://milw0rm.com/exploits/6372 [milw0rm.com]
    2008-09-04: http://milw0rm.com/exploits/6365 [milw0rm.com]
    2008-09-03: http://milw0rm.com/exploits/6355 [milw0rm.com]
    2008-09-03: http://milw0rm.com/exploits/6353 [milw0rm.com]

    WebKit isn't touching my machine, thank you very much. Might throw Bunny (the fuzzer) at the codebase, though.
  • Wait wait... (Score:5, Insightful)

    by xouumalperxe ( 815707 ) on Wednesday September 10, 2008 @04:45AM (#24943985)

    Chrome is all new and bright and shiny, Firefox has some (plenty of?) memory leaks, and all of a sudden we go from comparing browsers to making sweeping statements about their respective rendering engines? Why?

    How is a rendering engine that scores 85% on Acid3 "outdated"? Why should Mozilla drop a codebase that is quite successful in the marketplace, which they know intimately and have full control over, in favour of one they don't know all that well and that is controlled by Apple, just because it's (arguably) king of the hill right now?

    Frankly, the summary is a troll -- and the article feels like little more than a jab at free clicks.
