Master's Thesis from the year 2013 in the subject Computer Science - Software, grade: A, course: Master of Technology, language: English, abstract: The World Wide Web is a source of enormous information and has a massive influence on our lives. A large number of websites today are designed without sufficient resources or professional skills, so evaluating the quality of a website is an important issue; quality is one of the most important attributes for the success of any website. Many authors have described guidelines, tools, and methodologies for maintaining website quality, but their implementation is often unclear. Web metrics measure various attributes of websites quantitatively and can be used to evaluate website quality, so it is important to assess websites in order to improve both their quality and the web development process. In our research we calculated twenty web page metrics using an automated tool, WEB METRICS CALCULATOR, developed in ASP.NET. We collected data from websites of various categories from the Pixel Awards of the years 2010, 2011, and 2012 and categorized the websites as good or bad. We applied logistic regression and ten machine learning techniques (Bayes Net, Naïve Bayes, Multilayer Perceptron, AdaBoost, Decision Table, NNge, PART, BFTree, J48, and Random Forest). Among all these techniques, the area under the ROC curve is greatest for Random Forest, ranging from 0.842 to 0.891 across the data sets for all years, so the Random Forest model outperforms all the other models.
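The thesis tool, WEB METRICS CALCULATOR, was built in ASP.NET and its internals are not reproduced in the abstract. As a rough illustration only, the following Java sketch shows how a few typical page metrics (link count, image count, word count) could be extracted with the jsoup HTML parser; the three metrics and the example URL are assumptions for illustration, not the tool's actual twenty-metric set.

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

// Minimal sketch: extract a few common web page metrics with jsoup.
// The thesis tool computed twenty metrics; the three below are
// illustrative assumptions only.
public class PageMetricsSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; pass a real address as the first argument.
        String url = args.length > 0 ? args[0] : "https://example.com";
        Document doc = Jsoup.connect(url).get();

        int linkCount  = doc.select("a[href]").size(); // hyperlinks on the page
        int imageCount = doc.select("img").size();     // embedded images
        String bodyText = doc.body().text().trim();
        int wordCount = bodyText.isEmpty() ? 0 : bodyText.split("\\s+").length;

        System.out.printf("links=%d images=%d words=%d%n",
                linkCount, imageCount, wordCount);
    }
}
```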
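The classifier names listed (J48, PART, NNge, BFTree, Bayes Net, Decision Table) match implementations in the Weka toolkit, so the experiments were very likely run there, although the abstract does not say so explicitly. Assuming Weka, the sketch below shows a 10-fold cross-validated Random Forest evaluation reporting the area under the ROC curve, the comparison criterion the abstract uses; the ARFF file name and the "good" class label are placeholders, not artifacts from the thesis.

```java
import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

// Sketch: 10-fold cross-validated AUC for Random Forest in Weka.
// "metrics2010.arff" and the class value "good" are placeholder
// assumptions; the thesis data sets are not reproduced here.
public class RandomForestAucSketch {
    public static void main(String[] args) throws Exception {
        Instances data = new DataSource("metrics2010.arff").getDataSet();
        data.setClassIndex(data.numAttributes() - 1); // last attribute: good/bad label

        RandomForest rf = new RandomForest();         // default parameters

        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(rf, data, 10, new Random(1));

        int goodIdx = data.classAttribute().indexOfValue("good");
        System.out.printf("AUC (class 'good') = %.3f%n",
                eval.areaUnderROC(goodIdx));
    }
}
```

Running the same loop over all ten classifiers and the 2010, 2011, and 2012 data sets would mirror the comparison the abstract summarizes.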