Web sites are the public face of corporations, businesses and governmental bodies. As such, they require proper maintenance to ensure that accurate, up-to-date information is presented adequately. Dynamically generated Web sites are sometimes not an option: they add computational overhead on the server and hinder the automatic indexing of pages. Static Web sites, however, may grow to the point where performing maintenance operations manually becomes unfeasible, so automatic or semi-automatic means must be put in place. In this paper we present one such approach, which uses software agents to look after large, data-intensive Web sites. Maintenance operations are performed by a team of autonomous agents that communicate with one another as well as with Web masters or other human agents. Our approach can be naturally incorporated into existing Web sites, and its use can be gradually extended to encompass larger portions of a site. Because the agents are independent, the malfunctioning of any one of them should not halt the maintenance effort as a whole.
Number of pages: 16
Publication status: Published - 2004