Part V: Tools and Techniques

20. Online University Link Databases

Useful links

Instructions

The recommended general approach for analyzing the files is as follows; a short script sketching the first three steps appears after the list.

  1. Create a new folder on the local computer with an appropriate name, such as NZ_2004. Copy the domain names file into this folder.
  2. Download the appropriate link structure zip file.
  3. Create a subfolder of your new folder and unzip the link structure files into this subfolder.
  4. Download and unzip the latest SocSciBot Tools program from the same page as the database.
  5. Print the SocSciBot Tools user manual that accompanies the Tools program and follow it to choose and run the analyses you need.
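
If you prefer to script the folder preparation, the Python sketch below mirrors steps 1-3. The file names nz_domains.txt (the domain names file) and nz_2004_links.zip (the downloaded link structure archive) are placeholders, not the actual download names; the SocSciBot Tools analyses in steps 4-5 are still run from the Tools program itself.

    # Sketch of steps 1-3: set up the analysis folder and unzip the link data.
    # The file names are placeholders - substitute the real domain names file
    # and link structure zip for the country and year of interest.
    import shutil
    import zipfile
    from pathlib import Path

    base = Path("NZ_2004")                    # step 1: new folder for this analysis
    base.mkdir(exist_ok=True)
    shutil.copy("nz_domains.txt", base)       # step 1: copy the domain names file in

    links_dir = base / "linkstructure"        # step 3: subfolder for the link data
    links_dir.mkdir(exist_ok=True)
    with zipfile.ZipFile("nz_2004_links.zip") as zf:  # step 2: the downloaded zip
        zf.extractall(links_dir)              # step 3: unzip into the subfolder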

Note that some of the analyses need a large amount of hard disk space for the files they create and may take up to an hour to complete.