Abstract. Despite the wide use of the Internet, it remains difficult to evaluate web documents in a way that reflects users’ needs. Many automatic ranking systems rely on citation analysis to measure the relative importance of consumer products or documents. However, automatic citation analysis is limited in that it does not capture the varying viewpoints of human evaluators. Human evaluation of web documents is therefore very helpful for finding relevant information in a specific domain. Currently, human evaluation is performed by a single expert or by general users, without considering the evaluators’ degree of domain knowledge. In this paper, we propose that a dynamic group of experts be automatically formed from among users for a given web document, in order to evaluate domain-specific web documents. Each expert carries a dynamic authority weight that changes with his or her performance in the ranking evaluation. In addition, we develop an evaluation effectiveness...
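To make the idea of performance-dependent authority weights concrete, the sketch below shows one possible (purely illustrative) scheme, not the paper’s actual algorithm: each expert in the group rates a document, the group score is an authority-weighted average, and an expert’s weight is nudged down when the rating strays far from the group consensus. All class names, the 0–1 rating scale, and the multiplicative update rule are assumptions introduced for illustration.

```python
# Illustrative sketch only: a dynamic expert group whose authority weights
# adapt to each expert's agreement with the group consensus. The update
# rule and all identifiers are assumptions, not taken from the paper.
from dataclasses import dataclass, field


@dataclass
class Expert:
    name: str
    weight: float = 1.0  # dynamic authority weight


@dataclass
class ExpertGroup:
    experts: list[Expert] = field(default_factory=list)
    learning_rate: float = 0.1  # assumed step size for weight updates

    def group_score(self, ratings: dict[str, float]) -> float:
        """Authority-weighted average of the experts' ratings for one document."""
        total_weight = sum(e.weight for e in self.experts)
        return sum(e.weight * ratings[e.name] for e in self.experts) / total_weight

    def update_weights(self, ratings: dict[str, float]) -> None:
        """Shrink the weight of experts whose rating deviates from the consensus."""
        consensus = self.group_score(ratings)
        for e in self.experts:
            error = abs(ratings[e.name] - consensus)
            e.weight = max(0.01, e.weight * (1.0 - self.learning_rate * error))


# Usage: three experts rate one domain-specific document on a 0-1 scale.
group = ExpertGroup([Expert("a"), Expert("b"), Expert("c")])
ratings = {"a": 0.9, "b": 0.8, "c": 0.2}
print(group.group_score(ratings))  # weighted relevance score for the document
group.update_weights(ratings)      # expert "c" loses some authority
```

Under this toy rule, experts who consistently disagree with the group gradually contribute less to future document scores, which is the general behavior the abstract describes for dynamic authority weights.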