[WEB SECURITY] Concerning anti-malware systems in search engines

MustLive mustlive at websecurity.com.ua
Thu Sep 8 16:55:17 EDT 2011


Hello, participants of the Mailing List.

As I can see, some people found my previous article about search engines'
built-in antiviruses interesting - "Bypassing of behavioral analysis, or
malware strikes back"
(http://lists.webappsec.org/pipermail/websecurity_lists.webappsec.org/2011-August/008007.html).
I'll answer everyone who wrote to me about that post. It was the first post
in the series, and this will be the second one ;-).

I have an idea for improving anti-malware systems in search engines, which
I've decided to release publicly so that any search engine can use it to
improve their system. In May, when I devised this idea, there were three
search engines with built-in antiviruses that I knew of: Google, Yahoo and
Yandex. And at the end of August the web mail service Mail.ru added
protection against malicious links using WOT's service (so this idea can
partly apply to them as well, though Mail.ru would do better to add an
antivirus to their search engine and apply the idea there). So this idea may
interest these companies (and anyone else who will be implementing built-in
antiviruses in their systems). The company that implements this idea first
will gain a competitive advantage.

As I mentioned earlier, in May I gave a talk at the UISG and ISACA Kiev
Chapter conference about systems for detecting infected web sites. For those
who are interested, the talk (in Russian) can be found at my site
(http://websecurity.com.ua/uploads/articles/speech-2011.swf). One
interesting aspect of this topic, to which I drew attention during my talk,
was the methods of influencing web site owners to remove malware from their
sites (and, as a side effect, forcing them to pay more attention to the
security of their sites). The most effective method - which can be used both
for informing and for influencing site owners - is marking infected sites in
search engine results (as Google, who did it first, does).

I covered this only briefly, not spending much time on the topic of
informing users (because my talk covered many aspects), but it is one of the
important aspects of fighting malware on the Internet. A few days after the
conference I devised this idea for improving the way search engines inform
their users about infected sites.

Currently, search engines show the infected status only while they believe
the specific site is infected, and they remove this status as soon as they
decide the site is clean. To improve this situation, I recommend using two
statuses instead of one: "infected" and "was infected". The second, new
status should be shown for some time (such as one month) after the site is
found to be clean, and it shouldn't prevent users from clicking the link and
visiting the site - it should only inform them about the site's past.
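The two-status scheme described above can be sketched in code. This is a
hypothetical illustration only - the class name, the one-month window and
the label strings are my assumptions, not any search engine's real
implementation:

```python
from datetime import datetime, timedelta

# Assumed grace period during which the informational status is shown.
WAS_INFECTED_WINDOW = timedelta(days=30)

class SiteStatus:
    """Tracks the warning label a search engine would show for one site."""

    def __init__(self):
        self.infected = False
        self.cleaned_at = None  # when the site was last found to be clean

    def mark_infected(self):
        self.infected = True
        self.cleaned_at = None

    def mark_clean(self, now=None):
        self.infected = False
        self.cleaned_at = now or datetime.utcnow()

    def label(self, now=None):
        """Return the label to show next to the search result, or None."""
        now = now or datetime.utcnow()
        if self.infected:
            return "infected"       # warns and blocks the click, as today
        if self.cleaned_at and now - self.cleaned_at < WAS_INFECTED_WINDOW:
            return "was infected"   # informational only; link stays clickable
        return None                 # no warning at all

# A site cleaned a week ago still carries the "was infected" note;
# after the window expires, the note disappears.
s = SiteStatus()
s.mark_infected()
s.mark_clean(now=datetime(2011, 9, 1))
print(s.label(now=datetime(2011, 9, 8)))   # within the 30-day window
print(s.label(now=datetime(2011, 10, 8)))  # window has expired
```

The key design point is that only the "infected" state interferes with the
user's click; "was infected" is purely informational.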

How can it improve the security of search engine users? The same sites are
often infected many times during a year - I have observed this many times in
my research. Moreover, some systems can mistakenly decide that a site is
clean. So such a new status gives users additional protection. For example,
Google's antivirus system, which I often use in my research, has problems
such as not seeing viruses on previously infected sites (where the malware
still exists), or saying that a site is clean (for the first pages of the
site in the SERP and in Safe Browsing) while at the same time saying that
other pages are infected (there are sometimes discrepancies between Safe
Browsing and Google's SERP). I have seen such cases many times over the last
two years. Adding the second status would improve this situation (of course,
companies should also improve their antiviruses).

Besides that, last week I wrote the article "Effective use of cloaking
against web antiviruses". I won't describe it in detail here, because it
covers things that have been well known for a long time and that I took into
account when developing my Web VDS in 2008 (and I've heard that some AV
vendors also avoid the mistakes Google made). In the article I wrote about a
recent case in which I found Google using cloaking (UA spoofing) - I assume
it was done for the purpose of de-cloaking malware, but Google did it very
ineffectively (without taking well-known aspects into account), so it
wouldn't help against advanced malware. All AV vendors should take this into
account.
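To illustrate why naive UA spoofing alone is ineffective: malware that
cloaks on more than the User-Agent header (Referer, cookies, source IP
ranges, and so on) will serve a clean page to a scanner that only changes
its UA string. The following is a hypothetical sketch - the header values
and the heuristic are my assumptions, not the method of any real scanner:

```python
import urllib.request

# Illustrative browser-like headers: cloaking malware often checks more
# than the User-Agent, e.g. whether the visitor arrived from a search engine.
BROWSER_HEADERS = {
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.1",
    "Referer": "http://www.google.com/search?q=example",
    "Accept-Language": "en-US,en;q=0.8",
}

def fetch(url, headers):
    """Fetch a page with the given request headers (network access assumed)."""
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def looks_cloaked(body_plain, body_browser):
    """Crude heuristic: the page injects script/iframe content only for
    requests that look like a real browser visit, not for a bare request."""
    if body_plain == body_browser:
        return False
    markers = (b"<script", b"<iframe")
    plain_hits = sum(body_plain.count(m) for m in markers)
    browser_hits = sum(body_browser.count(m) for m in markers)
    return browser_hits > plain_hits

# Usage (network assumed):
#   plain   = fetch("http://example.com/", {"User-Agent": "scanner/1.0"})
#   browser = fetch("http://example.com/", BROWSER_HEADERS)
#   print(looks_cloaked(plain, browser))
```

The point of the sketch is that a scanner must imitate a real visit as a
whole - headers, referer, and behavior together - not merely swap one
header value.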

Best wishes & regards,
MustLive
http://soundcloud.com/mustlive 
