Relevant Change Detection: Framework for the Precise Extraction of Modified and Novel Web-based Content as a Filtering Technique for Analysis Engines


Kevin Borgolte, Christopher Kruegel, Giovanni Vigna


Proceedings of the 23rd World Wide Web Conference (WWW), April 2014 Developers' Track


Tracking the evolution of websites has become fundamental to understanding today’s Internet. Automatically reasoning about how and why websites change has become essential to developers and businesses alike, particularly because manual reasoning is impractical given the sheer number of modifications that websites undergo during their operational lifetime, including, but not limited to, rotating advertisements, personalized content, the insertion of new content, and the removal of old content.

Prior work in the area of change detection, such as XyDiff, X-Diff, or the AT&T Internet Difference Engine, focused mainly on “diffing” XML-encoded literary documents or XML-encoded databases. Only some prior work investigated the peculiarities that must be taken into account to accurately extract the differences between HTML documents, in which the markup does not necessarily describe the content itself but rather how the content is displayed. Additionally, prior work identifies all changes to a website, even those that are irrelevant to the overall analysis goal, and thus unnecessarily burdens the analysis engine with additional workload.

In this paper, we introduce a novel analysis framework, the Delta framework, which works by (i) extracting the modifications between two versions of the same website using a fuzzy tree difference algorithm, and (ii) using a machine-learning algorithm to derive a model of relevant website changes, which clusters similar modifications to reduce the overall workload imposed on the analysis engine. Based on this model, for example, the tracked content changes can be used to identify ongoing or even inactive web-based malware campaigns, or to automatically learn semantic translations of sentences or paragraphs by analyzing websites that are available in multiple languages.
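To make the two steps concrete, the following is a minimal, hypothetical sketch in Python. It approximates step (i) with a text-based fuzzy comparison over toy DOM tuples (the paper's actual fuzzy tree difference algorithm operates on real DOM trees and is considerably more involved), and step (ii) with a simple greedy grouping standing in for the machine-learning clustering. The tuple representation and all names (`flatten`, `fuzzy_diff`, `cluster`, the thresholds) are illustrative assumptions, not the framework's API.

```python
from difflib import SequenceMatcher

# A DOM subtree is modeled here as (tag, text, [children]); this is a
# toy stand-in for a parsed DOM, not the Delta framework's data structure.

def flatten(node, path=""):
    """Yield (path, tag, text) for every node in the subtree."""
    tag, text, children = node
    here = f"{path}/{tag}"
    yield here, tag, text
    for i, child in enumerate(children):
        yield from flatten(child, f"{here}[{i}]")

def node_similarity(a, b):
    """Fuzzy node similarity: tags must match exactly, text is compared fuzzily."""
    if a[1] != b[1]:
        return 0.0
    return SequenceMatcher(None, a[2], b[2]).ratio()

def fuzzy_diff(old, new, threshold=0.85):
    """Step (i), simplified: report nodes of `new` that have no
    sufficiently similar counterpart anywhere in `old`."""
    old_nodes = list(flatten(old))
    changed = []
    for node in flatten(new):
        best = max((node_similarity(o, node) for o in old_nodes), default=0.0)
        if best < threshold:
            changed.append(node)
    return changed

def cluster(changes, sim_threshold=0.85):
    """Step (ii), simplified: greedily group mutually similar changes so an
    analysis engine only inspects one representative per cluster."""
    clusters = []
    for node in changes:
        for c in clusters:
            if node_similarity(c[0], node) >= sim_threshold:
                c.append(node)
                break
        else:
            clusters.append([node])
    return clusters

# Two versions of the same toy page: a minor wording edit, a rotated
# advertisement, and a newly injected script.
old = ("body", "", [("p", "Welcome to our site!", []),
                    ("div", "ad: buy now", [])])
new = ("body", "", [("p", "Welcome to our website!", []),
                    ("div", "ad: sale today", []),
                    ("script", "eval(unescape('%65...'))", [])])

# The rotated ad and the injected script are reported; the minor
# wording edit in the paragraph falls above the similarity threshold
# and is filtered out before it ever reaches the analysis engine.
for path, tag, text in fuzzy_diff(old, new):
    print(path, repr(text))
```

In this toy setting, the similarity threshold plays the role of the fuzziness in the tree difference: a near-identical paragraph is not treated as a change, while the new `script` node, which has no counterpart in the old version, always surfaces.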

In prior work, we showed the effectiveness of the Delta framework by applying it to the detection and automatic identification of web-based malware campaigns on a data set of over 26 million pairs of websites that were crawled over a time span of four months. During this time, the system based on our framework successfully identified previously unknown web-based malware campaigns, such as a targeted campaign infecting installations of the Discuz!X Internet forum software.

@inproceedings{borgolte2014delta,
  title     = {{Relevant Change Detection: Framework for the Precise Extraction of Modified and Novel Web-based Content as a Filtering Technique for Analysis Engines}},
  author    = {Borgolte, Kevin and Kruegel, Christopher and Vigna, Giovanni},
  booktitle = {Proceedings of the 23rd World Wide Web Conference (WWW)},
  series    = {WWW Companion},
  acmid     = {2578039},
  date      = {2014-04},
  doi       = {10.1145/2567948.2578039},
  edition   = {23},
  editor    = {Broder, Andrei Z. and Shim, Kyuseok and Suel, Torsten},
  isbn      = {978-1-4503-2745-9},
  location  = {Seoul, Republic of Korea},
  numpages  = {4},
  pages     = {595--598},
  publisher = {International World Wide Web Conference Committee (IW3C2)},
  url       = {},
}