Part V: Tools and Techniques
20. Online University Link Databases
The recommended general approach for analyzing the files is as follows.
- Create a new folder on the local computer with an appropriate name, such
as NZ_2004. Copy the domain names file into this folder.
- Download the appropriate link structure zip file.
- Create a subfolder of your new folder and unzip the link structure files
into this subfolder.
- Download and unzip the latest SocSciBot Tools program from the same page
as the database.
- Print the SocSciBot Tools user manual that accompanies the tools
program and follow it to select the analyses needed.
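The folder-and-unzip setup in the steps above can be sketched in Python. This is only an illustration, not part of SocSciBot Tools: the function name and the file names (the domain names file, the link structure zip, the `links` subfolder name) are hypothetical stand-ins for whatever you downloaded from the database page.

```python
# Sketch of the setup steps above. The file and folder names are
# hypothetical; substitute the actual files from the database page.
import shutil
import zipfile
from pathlib import Path

def set_up_analysis_folder(base: Path, domains_file: Path, links_zip: Path) -> Path:
    """Create the working folder (e.g. NZ_2004), copy the domain names
    file into it, and unzip the link structure files into a subfolder."""
    base.mkdir(parents=True, exist_ok=True)   # the new folder, e.g. NZ_2004
    shutil.copy(domains_file, base)           # copy the domain names file in
    links_dir = base / "links"                # subfolder for the link data
    links_dir.mkdir(exist_ok=True)
    with zipfile.ZipFile(links_zip) as zf:
        zf.extractall(links_dir)              # unzip the link structure files
    return links_dir
```

After this runs, the working folder holds the domain names file and a subfolder of unzipped link structure files, ready for SocSciBot Tools to process.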
Note that some of the analyses require considerable hard disk space to store
the information created and may take up to an hour to complete.