I recently revisited my project (see 6 Degrees of Separation, Part 1) and discovered Gephi, a network analysis tool designed for large graphs. After a few hours of experimentation, I produced these two visualizations (click each to open the SVG):
Wikipedia can be represented as a graph in which each page links to other pages, forming a complex network. By recursively scraping Wikipedia, starting from the "Wikipedia" page, I obtained an enormous graph with 1.3 million nodes and 10.4 million edges. Here is the visualization I managed to generate:
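The recursive scrape described above is essentially a breadth-first crawl: visit a page, record an edge for every link on it, and queue unseen link targets until a node budget is reached. As a rough sketch of the idea (not my exact crawler), the snippet below takes a pluggable `get_links` function, so the demo runs on a tiny in-memory "wiki" rather than live HTTP requests; in the real run, `get_links` would fetch and parse a Wikipedia page:

```python
from collections import deque

def crawl(start, get_links, max_nodes=1000):
    """Breadth-first crawl from `start`, collecting (source, target) edges.

    `get_links(page)` must return the pages linked from `page`; for the real
    dataset it would scrape Wikipedia's HTML (or call the MediaWiki API).
    """
    seen = {start}
    edges = []
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in get_links(page):
            edges.append((page, target))
            if target not in seen and len(seen) < max_nodes:
                seen.add(target)
                queue.append(target)
    return seen, edges

# Demo on a hypothetical four-page wiki instead of the live site:
toy_wiki = {
    "Wikipedia": ["Encyclopedia", "Wiki"],
    "Encyclopedia": ["Knowledge"],
    "Wiki": ["Wikipedia"],
    "Knowledge": [],
}
nodes, edges = crawl("Wikipedia", lambda p: toy_wiki.get(p, []))
print(len(nodes), len(edges))  # 4 nodes, 4 edges in this toy example
```

The resulting edge list can be written out as a two-column `Source,Target` CSV, which Gephi imports directly as an edge table.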
In the middle of the graph, you may notice a white square. My working theory is that the layout algorithm, which arranges the nodes to make the graph readable, cannot handle those particular nodes and edges, leaving them stuck in place.