Google to be Our Web-Based Anti-Virus Protector?

cyberianpan writes "For some time now, searches have displayed 'this site may harm your computer' when Google has tagged a site as containing malware. Now the search engine giant is further publicizing the level of infection in a paper titled: The Ghost In The Browser. For good reason, too: the company found that nearly 1 in 10 sites (or about 450,000) are loaded with malicious software. Google is now promising to identify all web pages on the internet that could be malicious - with its powerful crawling abilities & data centers, the company is in an excellent position to do this. 'As well as characterizing the scale of the problem on the net, the Google study analyzed the main methods by which criminals inject malicious code on to innocent web pages. It found that the code was often contained in those parts of the website not designed or controlled by the website owner, such as banner adverts and widgets. Widgets are small programs that may, for example, display a calendar on a webpage or a web traffic counter. These are often downloaded from third party sites. The rise of web 2.0 and user-generated content gave criminals other channels, or vectors, of attack, it found.'"
  • 1 in 10? (Score:3, Funny)

    by Xoltri ( 1052470 ) on Friday May 11, 2007 @01:23PM (#19086275)
    When I was living at home my sister must've found every last one of them. She was terrible for breaking the computer.
    • Re:1 in 10? (Score:5, Funny)

      by hal2814 ( 725639 ) on Friday May 11, 2007 @01:34PM (#19086549)
      Well most downloaded malware comes through online games and porn. Which one did your sister have a hankering for?
      • Why can't they be one and the same?
    • Re: (Score:3, Funny)

      by Kurrurrin ( 790594 )
      I'm trying to figure out how the first post can be tagged as redundant. It doesn't work, unless one is taking into account the entire history of posting on /. And if that is the case, then everyone should just start off with (Score:-1, Redundant) to save mods the trouble.
      • by Kijori ( 897770 )
        I suppose it could be if it repeated something universally known (or something in the summary).
      • Re: (Score:2, Funny)

        by Shinmizu ( 725298 )
        Statistically, it's probably a safe bet to automatically tag the first post as such:

        1) In Soviet Russia, first post, for one, welcomes our new Cowboy Neal overlords that can run linux on beowulf goatse clusters of this article was submitted three years ago, you stupid editors.
        2)?????
        3) Profit
  • Since most of this malware attacks windows machines, isn't google helping microsoft more than it's helping linux or apple?
    • by Aldur42 ( 1042038 ) on Friday May 11, 2007 @01:30PM (#19086439)
      Maybe, but any reduction in the number of infected PCs is win for the entire net.
    • Does it matter? (Score:5, Insightful)

      by Radon360 ( 951529 ) on Friday May 11, 2007 @01:43PM (#19086729)

      I would hope that Google is looking at it more from the perspective of what is generally good for the betterment of the entire internet. Who cares if it directly benefits users of Microsoft products more than Linux/OSX users? Bottom line, it is potentially one less infection, and one less pwned computer in a bot network. Fewer infections means fewer machines that are probing ports on random addresses, or being used in brute force attacks, such as DoS attempts.

      Don't get too tied up in the means, but rather what the potential end results, good or bad, might be.

      • by pegr ( 46683 )
        Do we really want to make it easier to identify malware sites so evil-doers will have a ready-made list of sites to entrap the unsuspecting? At least going through Google, you get a heads-up first. With a direct link, you don't even get that...
      • Perhaps the defining characteristic of organization by cooperative lifeforms is the recognition of self versus non-self from the multi-cellular level on up to the tribal level. Here is a small step towards a network that can recognize what is healthy and what is hostile to it.

        Up until now this has mainly been done in a supervised method where some central authority made a finding. Now this is becoming automated to recognize intruders without human intervention. And it's happening in a collective way in wh
    • by LurkerXXX ( 667952 ) on Friday May 11, 2007 @01:45PM (#19086789)
      Do Linux or Apple users not mind when a bot-net army takes down a website they are trying to access, or clogs the pipes?

      Do Linux or Apple users not mind all the spam to their inbox from hijacked machines?

      Do Linux or Apple users not have to worry about some family member being taken in by a phishing scheme, hosted on a hijacked machine?

      Do Linux or Apple users not mind tons of hijacked machines probing any SSH or other ports you might have open, looking for vulnerabilities or doing dictionary password attacks?

      Fewer hijacked machines on the internet helps us all. Be you a Windows, Linux, Apple, BSD, or other user. Not caring about hijacked Windows boxes because you are leet enough to use Linux is stupid.
      • Re: (Score:2, Insightful)

        by Synchis ( 191050 )
        On that same note, just because there is currently not much malware on Linux or Mac, doesn't mean it will always be that way.

        I'm fairly indifferent to which platform I use as long as it functions well. I'm also not the norm, but am privy to using many a malware free Windows Machine.

        The more Linux distros are out there, the larger the market share, the more malware will target it. If you think you will always have a high horse to sit on just because you run Linux or Mac, then I'll be there when you fall and bust your ass on the first widespread linux or mac malware invasion to point and laugh at you.
        • by dbcad7 ( 771464 )
          So you're Switzerland when it comes to operating systems...

          I'm fairly indifferent to which platform I use as long as it functions well.

          If you think you will always have a high horse to sit on just because you run Linux or Mac, then I'll be there when you fall and bust your ass on the first widespread linux or mac malware invasion to point and laugh at you.

          It obviously bothers you a lot that Mac and Linux machines don't have the same experience as Windows users. Too bad my machine will be all screwed up fr

    • by dave562 ( 969951 )
      There's no sense in making the user suffer or declaring them an enemy combatant.
    • by __aawdrj2992 ( 996973 ) on Friday May 11, 2007 @02:06PM (#19087193) Journal

      Since most of this malware attacks windows machines, isn't google helping microsoft more than it's helping linux or apple?

      Since morality is defined by the desire to limit human suffering, protecting innocent people who don't know better from malware is always going to be for a greater good. People shouldn't have to get their OS reloaded every few months.

      Not running your choice of OS doesn't make them bad, and that's a startlingly simplistic world view. There's no "helping Microsoft" here; they are trying to protect all Internet users. Since those people are using Google search, it's really more like trying to serve their customers better. All their customers are Internet users, so ask yourself: what is concern #1 amongst Internet users?

      • Re: (Score:3, Interesting)

        by a.d.trick ( 894813 )

        Since morality is defined by the desire to limit human suffering

        Really? I won't say that human suffering is good or anything, but I think that's a pretty short-sighted definition. I mean, if I just killed everyone there would be no more suffering.

    • Re: (Score:2, Informative)

      It is in everyone's interest to both secure Windows and stop malware in general, because an infected box can be used for things other than gathering info on the owner, which then affects people who have nothing to do with Windows.

      For instance, botnets generally are made up of windows PCs, but are used to DDoS attack Unix webservers for ransom or political gain. They can also be used to attack network nodes such as vulnerable Cisco routers or corporate firewalls, it's a generic proxy model of attack which ca
    • Most of their customer base is probably using MS Windows machines, too -- probably over 90% (eg, [url:http://www.w3schools.com/browsers/browsers_os.asp]). Why shouldn't they help their customer base?
    • by Heembo ( 916647 )
      As/if the popularity of Macs increases, so will their susceptibility to malware. This has nothing to do with poor engineering on MS's part, it's just the popularity of MS that makes winbloz, ie (and even FF) such a target. Casual surfing does have the capability to wack a mac http://it.slashdot.org/article.pl?sid=07/04/21/0336255 [slashdot.org] .
    • Ever heard the phrase "Cut off your nose to spite your face?"
  • by cyberianpan ( 975767 ) on Friday May 11, 2007 @01:26PM (#19086331)
    This is potentially a very useful service, but not all URLs we visit come from Google searches; some we still type in, others come as links from pages. However, could we soon expect a Firefox add-on that will filter all HTTP requests through Google? So then our new overlords will indeed know everything about our web habits?
  • be blocked?

    It found that the code was often contained in those parts of the website not designed or controlled by the website owner, such as banner adverts and widgets.
    Wouldn't it be far better to have safer browsers than to shut out (as many people or their organizations will do) 10% of the web?
    • Re: (Score:1, Funny)

      by Anonymous Coward

      Wouldn't it be far better to have safer browsers than to shut out (as many people or their organizations will do) 10% of the web?
      No. Because that will impact Google's ability to monetize their intellectual property through certification / exception schemes.
      • | Wouldn't it be far better to have safer browsers than to shut out (as many people or their organizations will do) 10% of the web? No. Because that will impact Google's ability to monetize their intellectual property through certification / exception schemes.
        Do you mean something like SORBS?
    • by zCyl ( 14362 )

      Wouldn't it be far better to have safer browsers than to shut out (as many people or their organizations will do) 10% of the web?

      Websites from people or organizations accidentally distributing viruses are probably not the most insightful or useful websites anyway.
    • Re: (Score:3, Insightful)

      by Radon360 ( 951529 )

      The answer to your first question is most likely yes.

      What it would do, hopefully, is force companies in the business of serving up ads for pages to clean up their act, or find themselves going out of business. When word gets out that XYZ web ad agency's ads led Google to flag ABC company's web page as having malware, those looking to whore search rank positions will drop them like a bad habit.

    • Re: (Score:3, Interesting)

      by arivanov ( 12034 )
      They would.

      And the only thing a person who wants to distribute malware needs to do is some minimal robots.txt manipulation. The pages with the "bait" content can still be "crawlable" by google while the malware may sit in areas which have been made non-crawlable.

      Yet another stupid idea. Almost as stupid as the .bank domain. Or windows asking you to reboot just because the program you run was called "install" or had an MSI extension.
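
For readers who haven't run into this: the trick described above takes only a couple of lines. A hypothetical robots.txt (the paths are invented for illustration) that keeps the "bait" pages crawlable while asking well-behaved crawlers to skip the payload directory:

```text
# Hypothetical layout: bait pages live at the site root and stay crawlable;
# the exploit payload sits under /payload/, which polite crawlers will skip.
User-agent: *
Disallow: /payload/
```

Of course, robots.txt is only a request, not an access control: nothing stops a malware scanner from ignoring it.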
      • Hello?!? McFly?!? I know this is /. but the least you could do is read the summary!

        It found that the code was often contained in those parts of the website not designed or controlled by the website owner, such as banner adverts and widgets... These are often downloaded from third party sites.

        The robots.txt file on the website's server has no effect on third-party content hosted on a completely different server.

        And for the record, I think it's a brilliant idea. If an advertising agency serves up spyware it'll trash the rankings of the sites hosting its own ads, and pretty soon it'll have such a bad reputation among the entire web that nobody will use it. Thus it will force these advertising muppets to clean up th

      • And the only thing a person who wants to distribute malware needs to do is some minimal robots.txt manipulation. The pages with the "bait" content can still be "crawlable" by google while the malware may sit in areas which have been made non-crawlable.

        Seems like the solution to that is obvious -- don't obey robots.txt for the purposes of the malware scan.

        I'm not sure that robots.txt is legally binding anyway, except perhaps where it relates to an implicit permission to cache content (and even there I don't
        • Seems like the solution to that is obvious -- don't obey robots.txt for the purposes of the malware scan.

          Google already does that. It won't index content that's blocked, but it will still crawl it -- just in case. The rationale given is that when google was first starting out, web sites like the California DMV (Department of Motor Vehicles) and web sites like the New York Times, would just block all bots by default. And Google felt it couldn't afford to ignore such mainstream web sites, especially since
    • by mblase ( 200735 )
      Wouldn't it be far better to have safer browsers than to shut out (as many people or their organizations will do) 10% of the web?

      Yes, but there's nothing Google can do about that.

      Google does not yet make a web browser that can out-marketshare Internet Explorer.

      They do, however, have a search engine that significantly out-marketshares MSN Search.
    • by suv4x4 ( 956391 )
      Wouldn't it be far better to have safer browsers than to shut out (as many people or their organizations will do) 10% of the web?

      I don't know. Wouldn't it be best if we had both?

      It's optional whether you'll use Google's warning system. I know in quite a lot of use cases people would rather filter 10%, hell, 20% or 30% of the web, if the remaining sites are guaranteed to be safe.
  • Pros and Cons (Score:5, Interesting)

    by PixieDust ( 971386 ) on Friday May 11, 2007 @01:26PM (#19086347)
    I can see a lot of Pros and Cons to this. While certainly it's good that such a major player is taking an active and aggressive stance on this, I think it's also going to cause a lot of people to have a false sense of security. And while this only affects users who search for pages (and that is a LOT of traffic), it's still going to bring the question to some users "Google tells me if a site is dangerous, what do I need malware protection for?"

    I surf almost exclusively in Windows, using IE (IE6 + XP Pro on Desktop, IE7 + Vista on laptop) with no protection, and I've not had an issue with malware in years. But most people's browsing habits aren't quite like mine.

    One other effect I can see this having, is let's say www.bigcompanyhere.com gets tagged as being potentially harmful. Now Google has done them a favor by alerting them to a security problem, which they can then address, and are likely to do so much quicker to try and minimize damage to their image.

    I'm fairly interested to see how this plays out.

    • Re: (Score:3, Interesting)

      by Radon360 ( 951529 )

      One other effect I can see this having, is let's say www.bigcompanyhere.com gets tagged as being potentially harmful. Now Google has done them a favor by alerting them to a security problem, which they can then address, and are likely to do so much quicker to try and minimize damage to their image.

      The next question would be, what are Google's plans/procedure for getting a site recrawled after a problem is corrected? I could see a company being upset about not having a quick and effective way of getting this flag cleared after fixing the problem. Or, for that matter, a less scrupulous site operator removing the malware, getting cleared, then reintroducing it, and repeating the cycle on the next crawl when it gets flagged again.

      While I think Google would like to just say that such a warning

    • Re: (Score:3, Insightful)

      "One other effect I can see this having, is let's say www.bigcompanyhere.com gets tagged as being potentially harmful. Now Google has done them a favor by alerting them to a security problem, which they can then address, and are likely to do so much quicker to try and minimize damage to their image."

      A favor? Google has likely killed their company, or at least its online portion. Remember the big debate about how certain companies weren't being seen on the front page of google searches a while ago? Remember
      • Personally, I kind of like the side-effects and I don't really see the problem with this.

        It means that the security of the site that I am using is positively correlated with its place in the rankings.

        If a site is poorly designed and capable of being exploited with malware, it probably does deserve to be kicked into the 'get your s#!t together' pool down with the people who pay SEO 'professionals.'

        The risk of such things happening will cause sites to care a lot more about security.

        As for the 'low low price'
      • Re:Pros and Cons (Score:4, Insightful)

        by fuzz6y ( 240555 ) on Friday May 11, 2007 @03:22PM (#19088463)

        . . . even if they fix the minor problem that google flagged for them?

        minor problem my foot. Your notion that bigcompanyhere.com is entitled to grandma's money even if they're peddling spyware is ridiculous. Google gave grandma exactly what she wanted: a place to buy a widget without getting 0wn3d. The fact that they did no favors for bigcompanyhere.com is of no concern to her. Or me.

        I wouldn't be surprised if they (google) began offering "consulting" fees to remove the malware that google flagged from the companies site quickly

        I would be very surprised indeed. They don't offer consulting fees to get you back on the gravy train after you got penaltyboxed for purveying spam links

        Their job should not be to tell people where to search but rather to let them go where they want to go.

        Spyware central isn't where I want to go, even if they sell the cheapest RAM by four cents. Google, of course, is working for their shareholders and get paid by their advertisers, but they have a vested interest in keeping the searchers happy so the advertisers will keep paying them. The people whose sites are included in the results don't have some God given right to be on the first page so they can make money. Nevertheless, google has always tried to walk the tightrope between being overrun by crappy keyword farms and kicking out legitimate sites.

    • One other effect I can see this having, is let's say www.bigcompanyhere.com gets tagged as being potentially harmful. Now Google has done them a favor by alerting them to a security problem, which they can then address, and are likely to do so much quicker to try and minimize damage to their image.
      Address? Surely they'd just insist that the malware was a customer service, and sue Google for defamation?
      • Hahaha! You should totally get modded up "Funny". The sad part is, it's probably true. Anyone remember abetterinternet and apropos?
  • Already being done (Score:5, Informative)

    by zappepcs ( 820751 ) on Friday May 11, 2007 @01:27PM (#19086361) Journal
    McAfee SiteAdvisor already does this for Google search results pages. This is nothing new. It's a FF extension and works well, though lately it has pointed out that proxy servers are trying to steal my identity when I try to use them.
  • by truthsearch ( 249536 ) on Friday May 11, 2007 @01:28PM (#19086381) Homepage Journal
    Instead of just flagging sites for users, they should first add the detailed information to the Google Webmaster Tools. If it's third party software that's the problem inform the webmasters (at least those who use Google's tools) so they can take it down. Granted, it's their own fault for using third party software without enough investigation, but let them fix the problem before they're flagged for end users.
    • Re: (Score:3, Insightful)

      by Miseph ( 979059 )
      Um, no. A website can get hits 24 hours a day, 7 days a week, and while some websites have webmasters able to give that much coverage, most do not. What about all of the users who could potentially become infected in the time between when Google spots the malware and the webmaster can fix the problem? How long would Google give them to fix it before just putting up a notice anyway? The point is to control the propagation of malware, not give webmasters a chance to stop sucking at life before warning end use
      • Um, yes. Not every webmaster is incompetent. Having malware through a generally respectable ad agency, for example, may be no fault of the webmaster. Why would it hurt to wait one week before putting the feature on the front end of Google, and inform webmasters through their tools first? One week wouldn't make any significant difference when the new version of this feature doesn't even exist today.
        • by Miseph ( 979059 )
          Well, unless you're one of the users whose machine got infected with malware during the week. Then you'd be a little bit pissed.

          I agree that not all webmasters are incompetent, but I don't see why that means a tool like this should assume the opposite. A competent webmaster would probably fix the problem (even if it means temporarily removing the widget) as soon as they were notified in any case, not to mention that they'd be less likely to put a sketchy 3rd party component anyway.
  • Huh (Score:5, Funny)

    by Realistic_Dragon ( 655151 ) on Friday May 11, 2007 @01:28PM (#19086401) Homepage
    I browse the internet on my Linux box, running OS X with MacOnLinux. On OS X I run VMWare player hosting FreeBSD, where I have all the options turned to OFF. That runs Firefox, which connects to a web-2.0 version of Lynx. I use this to connect to another site which manually lets me enter netcat commands and read the result.

    My only complaint is that the pirates at Macrodobe STILL won't support my platform of choice! When will there be a flash player for people like me!
  • by WrongSizeGlass ( 838941 ) on Friday May 11, 2007 @01:30PM (#19086447)
    Of course Google can protect us against everything and everyone (except the IRS, acne and that kid on the bike in Better Off Dead). They can do anything they say they can do ... and even stuff that they haven't thought of yet.

    Google is good, Google is great, and Google can do no wrong. Where on Earth did I ever get that pearl of wisdom? I read it on the internets, of course ... on some site that rhymes with froogle.
    • by hal2814 ( 725639 )
      "except the IRS, acne and that kid on the bike in Better Off Dead"

      Google did take care of that kid on the bike for me. I don't know how they did it, but all I had to do was give Google $2 and they made him go away somehow.
    • I read it on the internets, of course ... on some site that rhymes with froogle.
      I wonder how the Froogles.com [zdnet.com] guy is feeling, now that Google calls that service Google Product Search [google.com].
  • right.. (Score:5, Funny)

    by mastershake_phd ( 1050150 ) on Friday May 11, 2007 @01:31PM (#19086471) Homepage
    It found that the code was often contained in those parts of the website not designed or controlled by the website owner, such as banner adverts and widgets.
     
    So google is going to protect us from webpages that use less than reputable advertising and widget services. Hmm, maybe google should go into the advertising and widget service, oh wait...
  • by Bearhouse ( 1034238 ) on Friday May 11, 2007 @01:32PM (#19086483)
    Some people don't like, or cannot use, Firefox or Opera, plus sensible add-ons such as anti-phishing plug-ins, noscript...

    For example, one of my (very big) corp. customers is still running IE 7...

    When I challenged the support guys about this, they said 'that's OK, we detect & block most things at the firewall'...

    *sigh*

    When I pointed out that:
    1. That's bullshit.
    2. Lots of their managers travelled, and surfed the net via insecure methods like hotel proxy servers and public wifi, they said 'that's OK, they can only access the intranet and internal mail via VPN'.

    *double sigh*

    So now I advise people not to click on URLs directly, or type them in, but go via Google. It's better than nothing...
    • You do realise that a properly configured setup of IE7 can be more secure than Firefox, right?

      I don't see the issue; the type of people who get malware/spyware/viruses will get them regardless of browser. Sure, a good browser will help stop some of it, but you're forgetting how stupid some people can be. The company sounds like they have a good approach; the VPN probably blocks all but a few ports and hopefully some sort of firewall stops the other attacks. Sure, it doesn't help that externally infected machine but it
  • From the article,

    The user is presented with links that promise access to 'interesting' pages with explicit pornographic content, copyrighted software or media.
    In other words, the people who have their computers hacked are those looking for trouble in the first place (although I have to admit that I don't consider porn trouble but I bet most of these problematic sites are serving copyrighted material anyways.) I guess you get what you pay for!
  • Just display something different (that is, hide the malware) when googlebot comes to your website.
  • end-users, man (Score:4, Insightful)

    by Skadet ( 528657 ) on Friday May 11, 2007 @01:33PM (#19086521) Homepage

    It found that the code was often contained in those parts of the website not designed or controlled by the website owner, such as banner adverts and widgets.
    These days, almost nothing is designed by the website owner. Unless you're coding your own html/php/asp/perl/ruby/python or at the very least peruse the source code of the widgets you download to make sure there's nothing bad in there, you're just another end-user. And so this is not unexpected. End-users are the ones that "CL1CK TH3 PURPL3 M0NK3Y F0R ELEVENTY M1LL10N DOLLERZZZZ!!!" and install all sorts of crazy stuff on their machines. (Rabbit trail: one of my clients many years ago actually ASKED me to install the infamous purple monkey for him because he liked the text-to-speech). Whether it's on the desktop or on the web, people who will install anything without even a hint of research will continue to spread computer-borne diseases. It's one of the reasons I hate MySpace. What 13-year-old girl isn't going to think sparkly, smiling unicorns aren't cute? Of COURSE they're going to spread them around, even though they're attached to a malicious website.
  • 450,000? (Score:5, Informative)

    by rueger ( 210566 ) on Friday May 11, 2007 @01:34PM (#19086539) Homepage
    Sigh, are basic editorial skills too much to ask here? (I know, it's a rhetorical question).

    TFA does not say that "the company found that nearly 1 in ten sites (or about 450,000) are loaded with malicious software." This implies that there are a total of less than a half million sites that pose a risk.

    It said that of the 4.5 million pages examined, "about 450,000 were capable of launching so-called "drive-by downloads"..."

    It also notes that "A further 700,000 pages were thought to contain code that could compromise a user's computer, the team report."

    The problem is probably quite a bit larger than presented in the summary, even if one ignores the confusion between "sites" and "pages".
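
The arithmetic behind those corrected figures, for anyone checking (numbers taken from TFA as quoted above):

```python
# Figures as quoted from TFA: 4.5 million pages analyzed in depth,
# 450,000 launching drive-by downloads, a further 700,000 suspect.
examined = 4_500_000
drive_by = 450_000
suspect = 700_000

confirmed_fraction = drive_by / examined
print(confirmed_fraction)  # → 0.1, i.e. 1 in 10 *pages* examined, not all sites

# Counting the merely suspect pages as well pushes the figure past a quarter.
upper_bound = (drive_by + suspect) / examined
print(upper_bound > 0.25)  # → True
```

So the summary's "1 in ten sites" conflates the sampled page set with the whole web, and leaves out the additional suspect pages entirely.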
  • by Bearhouse ( 1034238 ) on Friday May 11, 2007 @01:38PM (#19086633)
    "Our Web-Based Anti-Virus.."

    Is this not aimed more at phishing scams, trojans and other exploits, rather than just viruses?

    What's the main source of virus infections? Anybody got some research?

    I'm guessing it's swapping infected files, not visiting pr0n sites...

    • It depends on what you call a virus. Most spyware has viral qualities, usually with the exception that it doesn't use the host to propagate itself. Those are usually delivered through the web via the standard Punch-the-Monkey-type flashlets. Real viruses are much worse, and I use the propagation property to decide what's 'real'. Propagation consumes resources on your PC and becomes a risk to anybody directly connected to your network. Spyware usually just, well, spies on you and reports back to a central serve
  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Friday May 11, 2007 @01:39PM (#19086645)
    Comment removed based on user account deletion
Google has farmed this process out to a third party, stopbadware.org, thereby ensuring that an understaffed company is forced to deal with tons of irate web users trying desperately to get their site traffic restored before their business goes belly up.

      Not a good idea.

  • by Orinthe ( 680210 ) on Friday May 11, 2007 @01:44PM (#19086745) Homepage
    It should be noted that the 10% of the web number is somewhat misleading--some comments seem to think it implies that 1 in every 10 pages one visits is likely to contain malware, or the like. Chances are, most of these pages are not worth visiting. This isn't 1 in every 10 pages on yahoo.com or cnn.com; it's probably more like 8 in 10 pages on freekiddiepornplz.com and piratewarezserialzhackz.tv.
  • by PlayItBogart ( 1099739 ) on Friday May 11, 2007 @01:45PM (#19086785)
    Is that anything like Ghost in the Shell?
  • by Animats ( 122034 ) on Friday May 11, 2007 @01:55PM (#19086989) Homepage

    Here's the actual paper. [usenix.org] It's a Usenix paper.

    What they're doing is straightforward, and it's much like what many virus scanners do. First, they look at web pages to see if there's anything suspicious that requires further analysis. If there is, they load the page into Internet Explorer (of course) in a virtual machine, and see if it changes its environment. The better virus scanners have been doing something like that for a few years now, running possible viruses in some kind of sandbox. Although they usually don't go all the way and run Internet Explorer in a virtual machine. (Are you allowed to do that under Microsoft's current EULA for IE 7?)

    The main problem with Google's approach here is that it's after the fact. They won't notice a bad page until the next time they crawl it. Bad pages come and go so fast today that they'll always be behind. As the paper says, "Since many of the malicious URLs are too short-lived to provide statistically meaningful data, we analyzed only the URLs whose presence on the Internet lasted longer than one week."

    If Google implements this, the main effect will be to push attackers into changing site names for attack sites even faster.

    It's all so backward. What we need is to run most of Internet Explorer in a tightly sandboxed environment on the user's machine, so that when you close the window, any browser damage goes away. That would actually work.
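
To make the two-stage pipeline described above concrete, here is a toy version of the first stage, the cheap heuristic pre-filter that decides which pages are worth the expensive VM run. The patterns, weights, and threshold are invented for illustration; the paper's real feature set is far richer:

```python
import re

# Crude features that commonly flagged pages in this era of web malware.
# Patterns and weights are made up for illustration only.
SUSPICIOUS = [
    (re.compile(r'<iframe[^>]*(width|height)\s*=\s*["\']?0', re.I), 2),  # invisible iframe
    (re.compile(r'unescape\s*\(\s*["\']%u', re.I), 2),                   # escaped shellcode-style blob
    (re.compile(r'eval\s*\(', re.I), 1),                                 # dynamic code execution
    (re.compile(r'document\.write\s*\(\s*unescape', re.I), 2),           # obfuscated injection
]

def suspicion_score(html: str) -> int:
    """Sum the weights of suspicious patterns found in a page's HTML."""
    return sum(weight for pattern, weight in SUSPICIOUS if pattern.search(html))

def needs_vm_analysis(html: str, threshold: int = 2) -> bool:
    """Pages scoring at or above the threshold go on to the expensive VM stage."""
    return suspicion_score(html) >= threshold

clean = "<html><body><p>hello</p></body></html>"
dodgy = ('<html><iframe width=0 height=0 src="x"></iframe>'
         '<script>document.write(unescape("%u9090"))</script></html>')
print(needs_vm_analysis(clean), needs_vm_analysis(dodgy))  # → False True
```

The point of such a filter is purely economic: only the small flagged fraction of the crawl needs to be loaded into an instrumented browser in a virtual machine.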

    • It's all so backward. What we need is to run most of Internet Explorer in a tightly sandboxed environment on the user's machine, so that when you close the window, any browser damage goes away. That would actually work.

      Or, just not run Internet Explorer, which as far as I can tell, is the most effective solution overall.

    • What we need is to run most of Internet Explorer in a tightly sandboxed environment on the user's machine, so that when you close the window, any browser damage goes away.

      What we need is for Internet Explorer to actually implement a real sandbox, and make all the attack vectors that involve ActiveX go away.
    • by mcrbids ( 148650 )
      What we need is to run most of Internet Explorer in a tightly sandboxed environment on the user's machine, so that when you close the window, any browser damage goes away. That would actually work.

      Sort of. Your conclusion that this would work is a result of looking at security only from a limited context. While this does limit the damage of a single type of attack (virus meddling with O/S files) it doesn't do anything at all to defend against the many other forms of online attacks.

      To wit:

      What about phishing
    • What's to stop Google from using the Google Toolbar to do basic scanning of incoming web pages? If anything looks suspicious in the initial scan, they can push the URL to "Googlenet" to have the URL fully analyzed.

      As much as I hate giving so much power to a single company... a Google web antivirus system is actually a pretty good idea.
    • Like using Parallels on a Mac?
  • I once wrote a document called Ghost in the Shell [google.com] which dealt with crypto/stego. I wonder if I can sue Google for stealing the concept name in order to pay back the anime producer who will sue me after they get wind of it..
  • Easy to defeat? (Score:5, Interesting)

    by 140Mandak262Jamuna ( 970587 ) on Friday May 11, 2007 @01:59PM (#19087079) Journal
    The malicious websites just have to skip serving the malicious code when the user-agent string is Google's crawler. Is Google going to change its user-agent string? Would that be considered pretexting (the euphemism for impersonating)?
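    Cloaking cuts both ways, though: a scanner can fetch the same URL under two different User-Agent strings and flag pages whose responses diverge. A rough Python sketch (the size-difference heuristic is my own crude stand-in for a real content diff; a serious scanner would compare rendered DOMs):

    ```python
    import urllib.request

    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")
    BROWSER_UA = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US)"

    def fetch(url, user_agent):
        """Fetch a URL, presenting the given User-Agent string."""
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    def looks_cloaked(url, fetch=fetch):
        """Crude cloaking check: does the page served to a browser differ
        substantially from the page served to Googlebot?"""
        as_bot = fetch(url, GOOGLEBOT_UA)
        as_browser = fetch(url, BROWSER_UA)
        if as_bot == as_browser:
            return False
        # Size difference alone is a weak signal, but an injected exploit
        # payload usually bloats the browser-facing copy noticeably.
        longer = max(len(as_bot), len(as_browser))
        return abs(len(as_bot) - len(as_browser)) > 0.2 * longer
    ```

    Of course, attackers can key on the crawler's IP ranges instead of the UA string, which is why scanning from non-Google addresses matters.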
    It's very nice of Google or any other company to do so, but I think the real solution is to teach people to surf smarter. E.g., when you think you're downloading a movie, there is no legitimate reason for it to arrive as an .exe file; running it anyway is just plain stupidity. People need to read the messages that pop up before they click Yes on every dialog like: "By clicking Yes, 1Click-weather-adware-traybar will be installed."
    One day people will learn to surf smarter; meanwhile, we can help them become smarter.
  • by mblase ( 200735 ) on Friday May 11, 2007 @02:24PM (#19087483)
    the Google study analyzed the main methods by which criminals inject malicious code on to innocent web pages. It found that the code was often contained in those parts of the website not designed or controlled by the website owner, such as banner adverts and widgets

    I am shocked, SHOCKED, to discover that a company that makes money selling ads on other websites would want to highlight malware-spouting ads by other companies.

    Yes, I agree that identifying these ads is a Good Thing. No, I don't think publicly-traded Google's intentions are entirely noble.
  • by madsheep ( 984404 ) on Friday May 11, 2007 @02:26PM (#19087521) Homepage
    Regardless of whether or not this provides a "false sense of security," it is a good idea. It would certainly be better than nothing. It won't really provide a false sense of security any more than a phishing toolbar, antivirus software, or e-mail filtering does. Right now people search for stuff on Google and click the link; there is no false sense of security because people are already assuming the websites are safe. If Google steps in and says "hey, this site isn't safe," then at least people have advance notice and a choice.

    I see references to common things like widgets, but I don't see those as the most commonly attacked/exploited part of websites. Sure, it's a real issue and is common (yes, AdSense was hit with this kind of attack), but I hope they look for a lot more. One of the most common techniques these days is the surprise addition to a website's source of iframes with widths of 0, or new and sudden references to .js files, or new obfuscated JavaScript. If they look for all of this and actually analyze/process it, they can go a long way toward stopping this type of malware. This feature, if implemented correctly, is a win for everyone on the Internet... well, except the bad guys. :)
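    Flagging the zero-width-iframe trick is mostly pattern matching. A quick regex sketch (a real scanner would parse the DOM properly and also score obfuscated JavaScript, which regexes alone can't do):

    ```python
    import re

    # Matches iframe tags that declare zero width or height, a common way
    # to hide an injected exploit frame from the page's visitors.
    HIDDEN_IFRAME = re.compile(
        r'<iframe[^>]*\b(?:width|height)\s*=\s*["\']?0["\']?[^>]*>',
        re.IGNORECASE,
    )

    def suspicious_iframes(html):
        """Return every iframe tag in html that declares zero width or height."""
        return HIDDEN_IFRAME.findall(html)
    ```

    Run that over a page's source on each crawl and diff against the previous crawl; an iframe that wasn't there yesterday and points off-site is exactly the "surprise addition" pattern.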
  • robots.txt (Score:2, Insightful)

    by _bug_ ( 112702 )
    What about malicious sites (fake login pages) that disallow indexing/crawling via meta tags or robots.txt? If Google still searches/indexes those pages, then it breaks the rules for crawlers/bots; how does that reflect on them?

    Also, what about content that's delivered on pages that require you to log in first (portals, message boards, etc.)? These are areas a crawler is not going to reach and will miss completely.

    Going back to the fake login pages bit, unless Google can index every site every day these fake logi
    • by Shados ( 741919 )
      Well, I'd think the point is to only check the pages that are actually displayed in Google. If there's a robots.txt blocking a page, Google won't display that -exact- page, so it won't even be among the links I might end up clicking directly from Google.

      The loss is that you could go to a safe link, then be redirected or whatever to an unsafe one, so it's indeed not perfect, but...
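    For what it's worth, the "would a polite crawler even see this page" question is answerable with the standard library (a quick Python sketch; whether Google *should* honor robots.txt on a suspected attack page is the open policy question):

    ```python
    from urllib.robotparser import RobotFileParser

    def crawl_allowed(robots_txt, user_agent, url):
        """Would a well-behaved crawler with this User-Agent fetch url,
        given the site's robots.txt text?"""
        rp = RobotFileParser()
        rp.parse(robots_txt.splitlines())
        return rp.can_fetch(user_agent, url)
    ```

    A phisher who hides his fake login page behind `Disallow:` keeps it out of the index entirely, so it never gets the warning label in the first place; the scam link arrives by e-mail instead.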
  • So, are they going to point out all of the scam/spam/malware pages in their "Sponsored Links"? Hell, even searching for "Google Earth" turns up five pages purporting to be the download location, pages which no doubt either make their money from ads, or encase the download in absolute spyware hell.
  • So this helps redress the balance.

    What a great idea.
  • Google will take care of us. Not to worry. They don't do evil...as long as you watch their ads. Just don't ask questions or break their NDA's. Then you're fucked.
