Preprint
Article

This version is not peer-reviewed.

New Methodology for Calculating Webometrics University Ranking: From Google Scholar to OpenAlex

Submitted: 05 January 2026

Posted: 07 January 2026


Abstract
The Webometrics University Ranking website ceased to function in 2025 because citation data could no longer be obtained from Google Scholar. Since then, Webometrics University Ranking data have been published on the Figshare server, but the values of the three individual indicators have not been ranked. From July 2025 onwards, the Openness indicator has been calculated from citation counts in OpenAlex retrieved via the ROR identifier. Ranked data for all three indicators will be provided twice a year as an Excel file on a paid basis. Universities in the TOP-2,000 of the January 2025 and July 2025 Webometrics Rankings that lacked legitimate Institutional Google Scholar Citation profiles (IGSCPs) show a sharp rise in their positions under the new methodology for calculating the Openness indicator. Experiments on the webometric ranking of TOP-2,000 universities whose missing IGSCPs were later restored showed nearly identical average changes in world ranks, both under the old methodology and across the transition from the old methodology to the new one. This result, together with the strong rank correlation between corresponding layers of the Top-1,000 Webometrics University Ranking at adjacent points in time, leads to the conclusion that the transition to the new methodology will have virtually no effect on university rankings. This conclusion allows researchers and university managers to conduct comparative analyses of university positioning and benchmarking exercises going back to 2016, when IGSCPs were introduced into the Webometrics University Ranking calculation. The importance of creating and maintaining Institutional Google Scholar Citation profiles, despite the change in the Webometrics University Ranking calculation methodology, is also demonstrated.

Introduction

Since 2004, the Spanish Cybermetrics Laboratory has been calculating the Webometrics Ranking of the World's Universities for institutions with autonomous domains (URL addresses) (www.webometrics.info). This global university ranking stands alongside the THE-QS University Rankings, as well as the Shanghai and Taiwanese University Rankings. Four indices were originally measured from the responses of four high-capacity search engines (Google, Yahoo, Live Search, and Exalead): SIZE (the total number of pages obtained from the above-mentioned engines for each university domain), VISIBILITY (the total number of unique external citations obtained with the help of the last three engines), RICH FILES (the number of pdf, ps, doc, and ppt files obtained using the first engine), and SCHOLAR (the number of academic documents and their citations obtained with the help of the Google Scholar search engine). An integral webometric index, according to which the world's universities are ranked, was then calculated using special mathematical procedures, including logarithmic normalization and weighting, which made it possible to build the integral indicator on an additive basis (Aguillo et al., 2006).
The launch of this ranking attracted a great deal of attention from university management around the globe, since, unlike all other rankings, it made it possible to rank almost all universities in the world. Naturally, great interest also arose in the scholarly literature in analyzing this ranking in comparison with other rankings.
Since the launch of the Webometrics University Ranking (2004), its methodology has been changing constantly, but the most significant change took place in July 2016 (Moskovkin, Yawei, Sadovski, 2019). Whereas originally the third indicator (Openness) measured the number of PDF files affiliated with the university site and found through the Google Scholar search engine, from the second half of 2016 this indicator shifted to the citations found through the same search engine, with Openness weighted at only 10%.
Naturally, most of the world's universities were not ready for this change. They would have needed their scientists to create personal Google Scholar Citation profiles (GSCPs) in advance; profiles tied to the university domain would then have automatically made up institutional Google Scholar Citation profiles (IGSCPs). That same year also saw the introduction of the bibliometric Excellence indicator, weighted at as much as 30%, calculated on the basis of institutional profiles of the top 10% most-cited Scopus papers.
A certain number of the best personal GSCPs were used to calculate the total number of citations for each university. The ranked list of these total citations was called the Transparent Ranking, which has been published on the Webometrics University Ranking website since July 2016. The positions (ranks) of universities in this ranking were determined by their Openness indicator values.
Regarding the comparison of Transparent Ranking (July 2016) with other rankings, A. Basu et al. (2019) wrote:
“Rank correlation shows that correlation is low with THE and QS. This may have been expected as they are partially survey-based and partially based on bibliometric indicators. The correlation with Scimago and the Leiden Ranking, which are based only on bibliometric indicators is better. In fact, correlation is best with the Leiden ranking (~ 0.7) which also has a size-independent variant.”
However, it should be noted that Google Scholar launched a new service called “Google Scholar Citations” on 16 November 2011 (Connor, 2011), i.e. five years before its integration into the Webometrics Ranking.
Three years after the launch of this service, J.L. Ortega and I.F. Aguillo (2014) wrote the following about its comparison with Microsoft Academic Search:
“However, their technical limitations such as duplicated profiles, spurious citations, and possible manipulations, make advisable that the use of these citation indexes in research evaluation will be jointly other citation databases that permit to detect biases or gaps that could under value the analysis of an institution, discipline, or author. As many of these limitations are probably due to the novelty of these platforms, it can be expected that future developments would improve these services and they became stronger competitors of the actual subscription-based citation databases.”
When using Google Scholar Citation profiles to rank institutions, as N. Ul Sabah et al. (2023) wrote, the following methodological problems arise:
“Firstly, Redundant profiles, which increased their citation and rank to produce false results. Secondly, the exclusion of theses and dissertation documents to retrieve the actual publications to count for citations. Thirdly, the elimination of falsely owned articles from scholars’ profiles.”
These findings are confirmed by other studies (Ortega, Aguillo, 2014; Mingers, O’Hanley, & Okunola, 2017). To address these issues, N. Ul Sabah et al. (2023) developed a specialised computer programme, which they tested using data from 120 UK-REF (United Kingdom’s Research Excellence Framework) institutes and Google Scholar.
In July 2016, out of about 22,000 universities of the world ranked by Webometrics, only 4,120 had IGSCPs. The number of universities with such profiles subsequently grew as follows: in January 2017 – 8,634 (out of 26,000 ranked universities), in July 2017 – 9,491, in January 2018 – 9,593, and in July 2018 – 10,778 (Moskovkin, Yawei, Sadovski, 2019).
As noted by A. Basu et al. (2019), the July 2016 edition of the Webometrics University Ranking contains 4,132 institutions with an aggregate of 174,915,125 citations, which makes it one of the largest ranking exercises in the world.
The illegitimacy of the IGSCP in the Webometrics Ranking (January 2018) of Boris Grinchenko Kyiv University was pointed out by O. Buinytska, B. Hrytseliak and V. Smirnova (2018). They noted that the list of Personal GSCPs of this university included profiles of departments.
A comprehensive quantitative and qualitative analysis of the legitimacy of IGSCPs has been conducted for the Asian University Webometrics Ranking by V. M. Moskovkin and O. V. Serkina (2024).
I have not encountered other works devoted to a large-scale analysis of the legitimacy of IGSCPs, whose illegitimacy is largely related to errors in the functioning of Google Scholar, as well as to errors made by the creators of these profiles. In the early years of this search engine's operation, its errors were tracked on an annual basis by P. Jacso (2005, 2012).
The most comprehensive list of such errors immediately after the launch of the Google Scholar Citation service in calculating the Openness indicator in the Webometrics University Ranking was identified by A. Martin-Martin et al. (2016). They noted that errors that can compromise the metric portrait of an author offered by Google Scholar can be grouped into two main sections. First, the errors Google Scholar sometimes makes when it indexes a document or when it assigns citations to it. Second, the specific errors that are sometimes made during the creation of a Google Scholar Citations profile. Their list is given below:
a) Incorrect identification of the title of the document;
b) Ghost authors;
c) Book reviews indexed as books;
d) Incorrect attribution of documents to authors;
e) Failure to merge all versions of the same document into one record;
f) Grouping different editions of the same book in a single record;
g) Improper attribution of citations to a document;
h) Duplicate citations;
i) Missing citations;
j) Duplicate profiles;
k) Variety of document types (including non-academic documents);
l) Inclusion of misattributed documents in the profile;
m) Deliberate manipulation of documents and citations in Google Scholar;
n) Duplicate documents in profiles;
o) Incorrectly merged documents;
p) Unclean document titles;
q) Missing or uncommon areas of interest.
Thus, I have shown that after the introduction of the Openness indicator in the calculation of the Webometrics University Ranking, university management and researchers have been faced with the problem of the correct creation of personal GSCPs, which still receives very little attention. However, at the beginning of 2025, another, but more radical, change in the Webometrics University Ranking calculation methodology took place. IGSCPs were abandoned in the calculation of the Openness indicator.
Let us briefly consider the reasons for this abandonment and the new methodology for calculating this ranking.

New Methodology for Calculating the Webometrics University Ranking and Its Resource Support

In 2025, the Webometrics Ranking website ceased to function. As Isidro Aguillo, the head of the Webometrics project, informed me: “Unfortunately, Google Scholar no longer allows data extraction,” which makes it impossible to calculate the Openness indicator based on Institutional Google Scholar Citation profiles (IGSCPs).
The January 2025 Webometrics University Ranking, calculated using the old methodology, was published on the Figshare server (Aguillo, 2025 a).
It ranks about 32,000 HEIs from over 200 countries. Data on the ranks of the Visibility, Excellence and Openness indicators are not provided. It does, however, indicate the ROR identifier for universities. The Research Organisation Registry (ROR) is a global, community-led registry of open, persistent identifiers for research organisations. ROR makes it easy for anyone or any system to disambiguate institution names and connect research organisations to researchers and their outputs.
The July 2025 Webometrics University Ranking, calculated using a new methodology, has been published on the Figshare server (Aguillo, 2025 b).
In this methodology, the values of the Visibility and Excellence indicators are calculated as before, while the values of the Openness indicator are calculated from the number of citations of institutions over the period 2020-2024, with the indicator retaining its 10% weight. These calculations are performed using OpenAlex via the ROR identifier. Approximately the first 8,000 universities have an ROR identifier (e.g., https://ror.org/03vek6s52 for Harvard University).
After the 8,000th place in the ranking, we see universities with missing ROR identifiers, and the closer to the end of the ranking, the more such universities there are. For universities with missing ROR identifiers, Total Citation calculations are not performed.
University managers and researchers can check the total number of citations over the last five years using the ROR identifier. For example, Leiden University's ROR identifier is https://ror.org/027bh9e22. To view its publication activity and citation counts, the following query can be made: https://api.openalex.org/institutions/ror:https://ror.org/027bh9e22
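Such checks are easy to automate. The sketch below, a minimal Python illustration using only the standard library, builds the OpenAlex institutions endpoint from a bare ROR id and sums the yearly citation counts that OpenAlex reports in its `counts_by_year` field over the 2020-2024 window used by the Openness indicator. The function names are my own, and the exact response schema should be verified against the OpenAlex documentation.

```python
import json
from urllib.request import urlopen

OPENALEX_INSTITUTIONS = "https://api.openalex.org/institutions/ror:{ror_id}"

def institution_url(ror_id: str) -> str:
    """Build the OpenAlex institutions endpoint for a bare ROR id,
    e.g. '027bh9e22' for Leiden University."""
    return OPENALEX_INSTITUTIONS.format(ror_id=ror_id)

def citations_in_window(record: dict,
                        first_year: int = 2020,
                        last_year: int = 2024) -> int:
    """Sum the yearly citation counts in OpenAlex's `counts_by_year`
    field over the five-year window used by the Openness indicator."""
    return sum(row["cited_by_count"]
               for row in record.get("counts_by_year", [])
               if first_year <= row["year"] <= last_year)

def fetch_openness_citations(ror_id: str) -> int:
    """Fetch one institution record and return its 2020-2024 citation total."""
    with urlopen(institution_url(ror_id)) as resp:
        record = json.load(resp)
    return citations_in_window(record)
```

Looping such a function over the ROR column of the Figshare file would reproduce the raw citation totals behind the new Openness indicator for any subset of universities.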
As Isidro Aguillo informed me, “a full Excel file with a breakdown of indicators will be available at a price of 200 euros.”

Proof of Comparability Between the Old and New Webometrics University Ranking Calculation Methodologies

In July 2025, all leading universities with missing legitimate IGSCPs sharply improved their positions in the ranking. There were 40 such universities in the TOP-2,000 Webometrics University Ranking in January 2025 (Table 1).
In January 2025, their World Ranks did not change significantly compared with July 2024, as the calculation methodology remained unchanged; in July 2025, however, these World Ranks changed dramatically due to the change in methodology.
In January 2025, the citations of the Top 310 authors (excluding the Top 20 outliers) according to Google Scholar were not counted for the universities under consideration, as these universities had no IGSCPs; in July 2025, this absence was compensated for by total citations according to OpenAlex.
The average change in university rankings for the 40 universities under consideration (Table 1, WR January 2025/ WR July 2025) is 2.187. This means that the universities with missing legitimate IGSCPs improved their positions in the Webometrics University Ranking by an average of 2.187 times when the calculation of total citations was switched from Google Scholar to Open Alex.
Let us now examine how the positions of universities improved following the restoration of their illegitimate IGSCPs under the old calculation methodology. In January 2024, the TOP-2,000 Webometrics University Ranking contained 14 universities with illegitimate IGSCPs that restored their profiles by July 2024 (Table 2).
From this table, we can calculate the average change in university rankings, which is 2.134. Consequently, universities whose IGSCPs had been illegitimate improved their positions in the Webometrics University Ranking by an average of 2.134 times when their profiles were restored after a year.
Descriptive statistics for both calculations are shown in Table 3.
As we can see, similar average values for changes in world rankings and confidence intervals were obtained in both cases. This suggests that the transition to the new calculation methodology will have little effect on university rankings.
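The calculation behind these comparisons is straightforward to reproduce. The sketch below, written in Python with illustrative rank values rather than the actual Table 1 or Table 2 data, computes the per-university rank-change ratio (old World Rank divided by new World Rank, so values above 1 indicate an improved position) together with the mean and an approximate 95% confidence interval; the helper names are hypothetical.

```python
import math
import statistics

def rank_change_ratios(old_ranks, new_ranks):
    """Per-university ratio old/new: a value above 1 means the university
    moved up (received a smaller rank number) after the change."""
    return [old / new for old, new in zip(old_ranks, new_ranks)]

def mean_with_ci(values, t_crit=2.0):
    """Mean and an approximate confidence interval; t_crit ~ 2 roughly
    corresponds to a 95% interval at sample sizes like n = 14 or n = 40."""
    m = statistics.mean(values)
    half = t_crit * statistics.stdev(values) / math.sqrt(len(values))
    return m, (m - half, m + half)

# Illustrative ranks only, not the actual Table 1 data.
wr_old = [1200, 900, 1800, 640]
wr_new = [600, 450, 500, 320]
ratios = rank_change_ratios(wr_old, wr_new)
mean_ratio, (ci_low, ci_high) = mean_with_ci(ratios)
```

Running this over the 40 universities of Table 1 and the 14 universities of Table 2 would yield the averages of 2.187 and 2.134 reported above, along with the confidence intervals of Table 3.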
When creating personal GSCPs, researchers try to include their best publications from the Scopus and WoS databases, as well as all their Open Access publications with a DOI that are included in DOAJ. Virtually all of these publications are indexed by OpenAlex. The work of D. Chavarro, J.P. Alperin and J. Willinsky (2025) supports this. They made the following conclusions:
“To summarize, items with a Crossref DOI have a 96% chance of appearing in OpenAlex. Journals included in the Directory of Open Access Journals (DOAJ) without a DataCite DOI (and without a Crossref DOI) have an 81% chance of inclusion in Open Alex. Journals in Scopus, while lacking a Crossref DOI and absent from DOAJ also have an 81% chance of appearing in OpenAlex.”
At the same time, the rejection of IGSCPs cuts off citations of old publications from closed sources and grey literature. Nevertheless, I believe that the correlation coefficient between the total citation values calculated by IGSCPs and OpenAlex will be close to unity. The same can be assumed for Spearman's rank correlation coefficient between World Ranks calculated using the two methodologies, which would confirm the conclusion regarding the mutual compatibility of these methodologies.
Indeed, let us take the TOP-1,000 Webometrics University Ranking in January 2025 and perform such a correlation with the corresponding World Ranks in July 2025 for three layers (Table 4).
In these calculations, we removed the statistical outlier for the University of Michigan (94; 6) from the top layer (Top-100), where the first number is the rank of this university in January 2025 and the second is its rank in July 2025. For the middle layer, we removed three statistical outliers: University of Birmingham (485; 116); Leiden University (528; 116); Peking University Health Science Center (Beijing Medical University) (535; 1,077). For the lower layer, we removed two: Charité Universitätsmedizin Berlin (905; 233); Swinburne University of Technology (963; 433).
As a result, we have made the following conclusions:
Global Hierarchical Stability: Despite the radical shift in data sources, the global ranking hierarchy remains exceptionally robust. The overall Spearman correlation coefficient (ρ=0.9843) for the combined sample (N=294) confirms that the transition to OpenAlex does not disrupt the fundamental structure of Webometrics University Ranking. The macro-level distance between top-layer and lower-layer institutions remains consistent.
The “Fading Correlation” Phenomenon: A distinct “fading correlation” effect is observed as we move down the ranking layers. While the Top-100 displays near-perfect identity (ρ=0.9910), the correlation drops significantly in the bottom layer (ρ=0.3369). This volatility in the 901–1,000 range is attributed to a higher density of data points and the increased sensitivity of the OpenAlex indicator to previously unindexed or incorrectly mapped institutional citations. The dense lower layers of the Webometrics University Ranking, where universities have similar Openness scores, lead to high turbulence and poor predictability of changes in positions. This situation is universal in nature and would also be observed when studying the temporal stability of any ranking. The effect has also been confirmed on other data sets in the works of Selten et al. (2020) and Moskovkin et al. (2022).
Correction of Institutional Invisibility: The low correlation in the lower tiers is interpreted not as a methodological flaw, but as a corrective feature. The transition to OpenAlex via ROR identifiers effectively "unmasks" universities that were previously penalized or invisible due to the lack of legitimate Institutional Google Scholar Citation Profiles (IGSCP).
Methodological Validity: The high global correlation provides empirical justification for the adoption of OpenAlex as a reliable and transparent alternative to Google Scholar. This transition aligns Webometrics with the broader movement toward Open Science and verifiable bibliometric data.
In the Top-100 Webometrics University Ranking, the correlation coefficients with adjacent rankings in time are extremely high, indicating remarkable stability of leading universities. Institutions in this layer tend to retain their positions over time, which is consistent with the idea that once an institution enters the global elite, it can remain there indefinitely due to accumulated prestige, resources, and visibility. In ideological terms, this situation is known as the Matthew effect (Merton, 1968). More rigorously, this effect was substantiated by D. J. de S. Price (1976), who named it the Cumulative Advantage Distribution.
The advantage of switching to citation counting using OpenAlex is related to the synchronisation of the five-year time interval when calculating top cited papers (Excellence indicator) with the calculation of total citations (Openness indicator).

Comparison of Google Scholar and OpenAlex Services and Reasons for Switching to OpenAlex When Calculating Webometrics University Ranking

As we noted earlier, referring to Isidro Aguillo, the direct reason for the transition to the new Webometrics University Ranking calculation methodology was related to Google Scholar's refusal to provide citation data to Cybermetrics Laboratory. Let us try to understand the preconditions and deeper reasons for this, which were unknown at the time of writing this manuscript.
Let us begin with a comparison between the two services, according to L. Delgado-Quirós and J.L. Ortega (2024):
“Google Scholar: One of the most important academic search engines due to its estimated size (389 million) and age (2004), it obtains data directly from the web. Special agreements with publishers allow it to extract citations and content information from paywalled journals. Its search interface can also access books (Google Books) and patents (Google Patents);
OpenAlex: This is the newest service (2022). It was created by OurResearch, a nonprofit organization that recovered the defunct Microsoft Academic Graph (203 million) to implement a new open product. The core of OpenAlex was then set up by Microsoft Academic Graph, with the addition of data from other open sources such as Crossref, PubMed, and ORCID. OpenAlex now indexes 240 million publications.”
The size of the Google Scholar database, containing 389 million records, was obtained by M. Gusenbauer (2019) as of January 2018 using the Query Hit Count (QHC) methodology. To date, this is the largest bibliographic database, but large-scale searches and bibliometric analysis require an API.
Indeed, it is well known that APIs play a key role in automating data retrieval for large-scale bibliometric analysis. A. Imran and M. Q. Pasta (2024) note that OpenAlex and Semantic Scholar offer the most user-friendly experiences with minimal technical requirements, while Google Scholar is the most cumbersome to work with. Earlier, E. Orduna-Malea et al. (2015) noted that there is no API for Google Scholar and that it displays only the first 1,000 results, which prevents large-scale empirical studies of these issues.
The same was later written by A. Martín-Martín et al. (2021), who noted that Google Scholar has no data-exporting capabilities in its web interface and no API, and by E.M. Gavron et al. (2025), who noted that the absence of an API and the lack of a master list of journals in the Google Scholar web interface hinder its use in large-scale analyses.
In this regard, as noted by L. Delgado-Quirós and J.L. Ortega (2024), special agreements are necessary for such large-scale analyses.
The reasons for switching from calculating the Openness indicator based on Google Scholar to calculating it based on OpenAlex can be understood from why this new service was created.
In May 2021, it was announced by the Microsoft Academic Blog (Next Steps…, 2021) that the Microsoft Academic website, application programming interfaces (API), and snapshots would retire on December 31, 2021. With the announcement of the retirement of Microsoft Academic Graph (MAG), the non-profit organization OurResearch announced that they would provide a similar resource under the name OpenAlex, with virtually all publications indexed by MAG being transferred to OpenAlex with their bibliographic data preserved (Scheidsteger, Haunschild, 2023).
Thus, the commercial company Microsoft discontinued support for its open-access MAG service, and we can assume that in our case the commercial company Google has done the same. As a result, Cybermetrics Laboratory was forced to use the OpenAlex service to calculate the Webometrics University Ranking.
It should be noted that Google Scholar and OpenAlex are quite comparable in terms of bibliographic data coverage (Gusenbauer, 2019; Delgado-Quirós & Ortega, 2024; Walters, 2025), and integration between Google Scholar Metrics and OpenAlex constitutes a viable approach for bibliometric studies, as it offers greater transparency, reproducibility, and analytical scope (Gavron et al., 2025).
Comparing the bibliographic search capabilities of fifteen discovery/access mechanisms, W.H. Walters (2025) wrote:
“GS and OpenAlex therefore have the potential to serve as first-choice discovery mechanisms as long as their interfaces and subject-search capabilities meet the user’s requirements.”
The transition to OpenAlex significantly changes the “rules of the game” in terms of manipulation:
Reduced risk of “black SEO optimisation” of profiles: in Google Scholar, anyone could create a fake institutional profile. In OpenAlex, profiles are linked via the ROR ID (Research Organisation Registry), which is moderated by the scientific community. This makes it difficult to create “ghost universities”;
Transparency as an antidote: unlike Google Scholar's closed algorithms, OpenAlex data is transparent. Any researcher can verify the calculations, which in itself is a deterrent to manipulation.
In addition to the Webometrics University Ranking, the Leiden Ranking (CWTS Leiden Ranking Open Edition, 2025) also switched to OpenAlex in 2025. The transition to OpenAlex for calculating global university rankings indicates a shift towards Open Science, as OpenAlex data is completely open under a CC0 licence, unlike Scopus, WoS and Google Scholar.

The Importance of Establishing and Maintaining Institutional and Personal GSCPs

The transition to a new methodology for calculating the Webometrics University Ranking does not, however, mean that university managers should ignore the creation of legitimate IGSCPs.
Although the Webometrics University Ranking switched to OpenAlex for calculating the Openness indicator from July 2025, so that IGSCPs no longer directly influence this specific indicator, it is still extremely important for universities to create and maintain these profiles, as well as to encourage their staff and students to create and maintain individual GSC profiles, for several reasons:
1. Visibility and reach: Google Scholar remains one of the most popular and accessible search engines for scientific literature worldwide. Many researchers, students, and even ordinary users start their search there. A well-maintained GSC profile for a university or individual researcher significantly increases their visibility and chances of being found, and improving the visibility of publications increases the likelihood of them being cited.
2. Popularity and habit: For many researchers, GSC is the de facto tool for tracking their own citations and searching for literature. It would be unwise to ignore such a large audience and user habit.
3. Alternative assessment: Although Webometrics is changing its methodology, GSC provides its own citation metric, which differs from OpenAlex (due to its broader coverage of sources) but indirectly influences it. This gives the university another perspective for assessing its scientific influence.
4. Attracting talent and partnerships: Researchers, students, and partners often use GSC to assess the scientific activity of the university and its staff. Maintaining up-to-date profiles helps demonstrate the strength of the university's scientific community.
5. Internal monitoring and self-assessment: Universities can use GSC data for internal monitoring of the publication activity and citation rates of their faculties and researchers, supplementing information from other databases.
6. Reputation management: An active and well-structured GSC profile helps the university manage its online reputation and present its scientific activities in the most favourable light.
Thus, the change in the Webometrics University Ranking methodology does not negate the importance of Google Scholar Citation profiles. They remain a valuable tool for literature search and for promoting, monitoring and evaluating scientific activity, although their role in the Webometrics ranking has changed. For example, L. Zhang and M. Kumaran (2023) investigated STEM librarians' presence on academic profile websites (APWs) at American and Canadian research universities and found that Google Scholar Citations was the most used APW, followed by ResearchGate, ORCID, and Academia.edu.
Despite the fact that the webometrics.info website has ceased to function, the significance of the Webometrics Ranking has not declined, as evidenced by the high number of views and downloads on the Figshare server. According to data from 13 August 2025, the number of views and downloads for the January 2025 ranking was 126,091 and 34,429, respectively, and for the July ranking, it was 20,862 and 5,461.

Conclusions

Google Scholar's refusal to provide Cybermetrics Laboratory with data for calculating the Webometrics University Ranking led to the need to find a replacement for calculating the Openness indicator. OpenAlex proved to be a good replacement, considering that the world's leading Leiden Ranking also switched to OpenAlex in 2025.
Difficulties in maintaining the webometrics.info website led to its closure and to the publication of this ranking twice a year on the Figshare server, with an additional fee for Excel files containing complete data on the three individual indicators. I hope that this website will be restored in the future.
Experiments on the webometric ranking of TOP-2,000 universities whose missing IGSCPs were later restored showed identical average changes in world ranks, both under the old methodology and across the transition from the old methodology to the new one. This circumstance, together with a comparison of the content of the Google Scholar and OpenAlex databases and the strong correlation between two adjacent TOP-1,000 Webometrics University Rankings in time, leads to the conclusion that the transition to the new methodology will have virtually no effect on university rankings.
In this regard, this ranking can be used by researchers and university managers for comparative analysis of university positioning and benchmarking exercises starting in 2016, when IGSCPs were introduced into the Webometrics University Ranking calculation.
The necessity of continuing to support Institutional and Personal GSCPs is justified, as they will, among other things, have a positive impact on the growth of total citations in OpenAlex calculations.
It has been shown that the advantage of switching to citation counting using OpenAlex is related to the synchronisation of the five-year time interval when calculating the top cited papers (Excellence indicator) with the calculation of total citations (Openness indicator).
In conclusion, it should be noted that during its twenty-year evolution, the Webometrics University Ranking has become a full-fledged scientometric ranking, similar to other prestigious rankings. Its only drawback is that, after the introduction of the Excellence indicator, it ceased to correspond to its webometric concept, since the closed statistical data on top cited papers for this indicator are taken from the Scimago laboratory rather than from the Internet.

References

  1. Aguillo, I. F. Ranking Web of Universities (webometrics.info), January 2025 edition. 2025a. [CrossRef]
  2. Aguillo, I. F. Ranking Web of Universities (webometrics.info), July 2025 edition. 2025b. [CrossRef]
  3. Aguillo, I. F.; Granadino, B.; Ortega, J. L.; Prieto, J. A. F. Scientific research activity and communication measured with cybermetrics indicators. Journal of the American Society for Information Science and Technology 2006, 57(10), 1296–1302. Available online: https://isidroaguillo.webometrics.info/sites/default/files/publicaciones/Aguillo2006Scientific_research_activity_and_communication_measured_with_cybermetric_indicators..pdf. [CrossRef]
  4. Basu, A.; Malhotra, D.; Seth, T.; Kumar Muhuri, P. Global Distribution of Google Scholar Citations: A Size-independent Institution-based Analysis. Journal of Scientometric Research 2019, 8(2), 72–78. [Google Scholar] [CrossRef]
  5. Buinytska, O.; Hrytseliak, B.; Smirnova, V. Rating as assessment tool of quality and competitiveness of university. Open educational e-environment of modern University 2018, 4, 16–32. Available online: https://elibrary.kubg.edu.ua/id/eprint/24061/. [CrossRef]
  6. Chavarro, D.; Alperin, J. P.; Willinsky, J. On the Open Road to Universal Indexing: OpenAlex and Open Journal Systems. Quantitative Science Studies 2025, 1–28. [Google Scholar] [CrossRef]
  7. Connor, J. Google Scholar citations open to all. 2011. Available online: http://googlescholar.blogspot.com/2011/11/google-scholar-citations-open-to-all.html. [Google Scholar]
  8. CWTS Leiden Ranking Open Edition. 2025. Available online: https://open.leidenranking.com/.
  9. Delgado-Quirós, L.; Ortega, J. L. Completeness degree of publication metadata in eight free-access scholarly databases. Quantitative Science Studies 2024, 5(1), 31–49. [Google Scholar] [CrossRef]
  10. Gavron, E. M.; Pinto, A. L.; Canto, F. L. D.; Talau, M. A tool for bibliometric analysis of journals indexed in Google Scholar Metrics and OpenAlex. Transinformação 2025, 37, e2515501. [Google Scholar] [CrossRef]
  11. Gusenbauer, M. Google Scholar to overshadow them all? Comparing the sizes of 12 academic search engines and bibliographic databases. Scientometrics 2019, 118, 177–214. [Google Scholar] [CrossRef]
  12. Imran, A.; Pasta, M. Q. A Qualitative Study of Bibliographic Data Sources and Retrieval Mechanisms for Computational Research. 2024 26th International Multi-Topic Conference (INMIC), Karachi, Pakistan, 2024; pp. 1–6. [Google Scholar] [CrossRef]
  13. Jacso, P. As we may search—comparison of major features of the Web of Science, Scopus, and Google Scholar citation-based and citation-enhanced databases. Current Science 2005, 89(9), 1537–1547. Available online: https://www.jstor.org/stable/24110924. [Google Scholar]
  14. Jacso, P. Google Scholar Author Citation Tracker: is it too little, too late? Online Information Review 2012, 36(1), 126–141. [Google Scholar] [CrossRef]
  15. Martín-Martín, A.; Orduna-Malea, E.; Ayllón, J. M.; Delgado López-Cózar, E. The counting house: measuring those who count. Presence of Bibliometrics, Scientometrics, Informetrics, Webometrics and Altmetrics in the Google Scholar Citations, ResearcherID, ResearchGate, Mendeley & Twitter. EC3 Working Papers, 21. 2016. Available online: https://arxiv.org/ftp/arxiv/papers/1602/1602.02412.pdf.
  16. Martín-Martín, A.; Thelwall, M.; Orduna-Malea, E.; Delgado López-Cózar, E. Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations. Scientometrics 2021, 126(1), 871–906. [Google Scholar] [CrossRef] [PubMed]
  17. Merton, R. K. The Matthew effect in science. Science 1968, 159(3810), 56–63. [Google Scholar] [CrossRef] [PubMed]
  18. Mingers, J.; O’Hanley, J. R.; Okunola, M. Using Google Scholar institutional level data to evaluate the quality of university research. Scientometrics 2017, 113(3), 1627–1643. [Google Scholar] [CrossRef]
  19. Moskovkin, V.; Yawei, L.; Sadovski, M. Identification of leading Russian universities without profiles in Google Scholar Citation (in Russian). Alma Mater 2019, 1, 10–15. Available online: https://www.researchgate.net/publication/331150996_Moskovkin_VM_Identifikacia_vedusih_rossijskih_universitetov_s_otsutstvuusimi_v_Google_Scholar_Citation_profilami_VM_Moskovkin_Lu_Avej_MV_Sadovski_Alma_mater_-_2019_-_No1-_S_10-15_Identification_of_lea#fullTextFileContent. [Google Scholar] [CrossRef]
  20. Moskovkin, V. M.; Serkina, O. V. Quantitative Frequency Analysis of Google Scholar Citation and Top 10% Most Cited Scopus Papers Profiles for Asian University Webometrics Ranking. Informology 2024, 3(2), 11–32. Available online: https://www.informology.org/2024/v3n2/a36.pdf.
  21. Moskovkin, V. M.; Zhang, H.; Sadovski, M. V.; Serkina, O. V. Comprehensive quantitative analysis of TOP-100s of ARWU, QS and THE World University Rankings for 2014–2018. Education for Information 2022, 38(2), 133–169. [Google Scholar] [CrossRef]
  22. Next Steps for Microsoft Academic – Expanding into New Horizons. Microsoft Academic Blog. Published May 4, 2021; updated June 24, 2021. Available online: https://www.microsoft.com/en-us/research/articles/microsoft-academic-to-expand-horizons-with-community-driven-approach/.
  23. Orduña-Malea, E.; Ayllón, J. M.; Martín-Martín, A.; Delgado López-Cózar, E. Methods for estimating the size of Google Scholar. Scientometrics 2015, 104(3), 931–949. [Google Scholar] [CrossRef]
  24. Ortega, J. L.; Aguillo, I. F. Microsoft Academic Search and Google Scholar Citations: Comparative analysis of author profiles. Journal of the Association for Information Science and Technology 2014, 65(6), 1149–1156. [Google Scholar] [CrossRef]
  25. Price, D. J. de Solla. A general theory of bibliometric and other cumulative advantage processes. Journal of the American Society for Information Science 1976, 27(5), 292–306. [Google Scholar] [CrossRef]
  26. Scheidsteger, T.; Haunschild, R. Which of the metadata with relevance for bibliometrics are the same and which are different when switching from Microsoft Academic Graph to OpenAlex? Profesional de la información 2023, 32(2), e320209. [Google Scholar] [CrossRef]
  27. Selten, F.; Neylon, C.; Huang, C.-K.; Groth, P. A longitudinal analysis of university rankings. Quantitative Science Studies 2020, 1(3), 1109–1135. [Google Scholar] [CrossRef]
  28. Ul Sabah, N.; Khan, M. M.; Talib, R.; Anwar, M.; Malik, M. S. A.; et al. Google Scholar University Ranking Algorithm to Evaluate the Quality of Institutional Research. Computers, Materials & Continua 2023, 75(3), 4955–4972. [Google Scholar] [CrossRef]
  29. Walters, W. H. Comparing conventional and alternative mechanisms of discovering and accessing the scientific literature. Proceedings of the National Academy of Sciences 2025, 122(27), e2503051122. [Google Scholar] [CrossRef]
  30. Zhang, L.; Kumaran, M. STEM Librarians’ Presence on Academic Profile Websites. Science & Technology Libraries 2023, 42(2), 247–263. [Google Scholar] [CrossRef]
Table 1. World Ranks (WR) of universities with missing legitimate IGSCPs included in the TOP-2,000 Webometrics University Ranking in January 2025, whose positions improved in July 2025 with the transition to the new methodology for calculating the Openness indicator.
University WR Jan 2025 WR July 2025 WR Jan / WR July
Johns Hopkins University, School of Medicine (USA) 267 45 5.93
University of Birmingham (UK) 485 116 4.18
Leiden University (Netherlands) 528 146 3.62
University of Minnesota Twin Cities (USA) 724 264 2.74
Hong Kong University of Science & Tech. (Hong Kong) 812 280 2.90
Weill Medical College, Cornell University (USA) 1,148 555 2.07
Universidade Nova de Lisboa (Portugal) 881 372 2.37
University of Hawaii at Manoa (USA) 1,099 492 2.23
Universite du Quebec (Canada) 1,036 601 1.72
Swinburne University of Technology (Australia) 963 433 2.17
Nanjing Agricultural University (China) 1,189 519 2.29
Technische Universitat Wien (Austria) 1,201 516 2.33
Universidad de Navarra (Spain) 1,254 585 2.14
Syracuse University (USA) 1,278 637 2.01
J.W. Goethe Universität Frankfurt am Main (Germany) 1,340 633 2.12
University of Florida Health (USA) 1,251 648 1.93
Universidade Federal da Bahia (Brazil) 1,427 697 2.05
Victoria University of Wellington (New Zealand) 1,264 613 2.06
Royal Holloway University of London (UK) 1,462 783 1.87
University of Maine (USA) 1,332 707 1.88
Universidad de Valladolid (Spain) 1,339 665 2.01
Université de Lyon (France) 1,481 931 1.59
Universidade Federal do Ceara (Brazil) 1,460 754 1.94
Universidad de Antioquia (Colombia) 1,656 870 1.90
Chiba University (Japan) 1,601 790 2.03
Medizinische Hochschule Hannover (Germany) 1,665 795 2.09
North West University (South Africa) 1,721 892 1.93
Pontificia Universidade Católica do Rio de Janeiro (Brazil) 1,618 918 1.76
National University of Sciences & Technology (Pakistan) 1,658 884 1.88
Illinois State University (USA) 1,941 1,213 1.52
Universidad de Costa Rica (Costa Rica) 1,869 1,091 1.71
Universitas Padjadjaran Bandung (Indonesia) 1,832 956 1.92
Maynooth University (Ireland) 1,893 1,039 1.77
University of Bucharest (Romania) 1,792 940 1.91
Universitat Oberta de Catalunya (Spain) 1,504 839 1.79
Universidad de Leon (Spain) 1,948 1,104 1.77
Universidad Politécnica de Valencia (Spain) 787 324 2.43
Sciences Po Institut d'Études Politiques de Paris (France) 1,920 1,374 1.40
Universidad Europea de Madrid (Spain) 1,908 1,148 1.66
University of Regina (Canada) 1,501 810 1.85
Table 2. World Ranks (WR) of universities with missing legitimate IGSCPs included in the TOP-2,000 Webometrics University Ranking in January 2024, which restored their profiles in July 2024.
University WR Jan 2024 WR July 2024 WR Jan / WR July
Universidad de Sevilla (Spain) 733 285 2.57
Universitat Pompeu Fabra (Spain) 760 266 2.86
University of Warsaw (Poland) 763 353 2.16
Universidade Federal do Rio Grande do Sul (Brazil) 846 340 2.49
Universidade de Coimbra (Portugal) 871 315 2.77
Universidad Nacional de Colombia (Colombia) 1,155 577 2.00
Universidade Federal do Sao Paulo (Brazil) 1,205 637 1.89
Universidade Estadual de Maringá (Brazil) 1,600 917 1.75
Tarbiat Modares University (Iran) 1,641 783 2.10
Tabriz University of Medical Sciences (Iran) 1,662 878 1.89
Universidad de Jaén (Spain) 1,664 784 2.12
Mashhad University of Medical Sciences (Iran) 1,845 892 2.07
Universidade Federal do ABC UFABC (Brazil) 1,877 1,178 1.59
Asia University Taiwan (Taiwan) 1,882 1,160 1.62
Table 3. Descriptive statistics for the changes in university rankings calculated in Table 1 and Table 2.
Statistic Value (Table 1) Value (Table 2)
n 40 14
Mean 2.187 2.134
SD 0.797 0.402
95% CI [1.932; 2.442] [1.902; 2.366]
99% CI [1.845; 2.528] [1.811; 2.458]
Note: CI - confidence interval.
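As a cross-check, the confidence intervals in Table 3 can be reproduced from the rank-change ratios in the last column of Table 2. A minimal sketch, assuming SciPy is available and that the intervals are standard two-sided Student-t intervals for the mean:

```python
from math import sqrt
from statistics import mean, stdev
from scipy.stats import t

# Rank-change ratios WR(Jan 2024) / WR(July 2024) for the 14 universities in Table 2.
ratios = [2.57, 2.86, 2.16, 2.49, 2.77, 2.00, 1.89,
          1.75, 2.10, 1.89, 2.12, 2.07, 1.59, 1.62]

def t_ci(data, level):
    """Two-sided confidence interval for the mean, based on the Student t distribution."""
    n = len(data)
    m, sd = mean(data), stdev(data)  # sample mean and sample SD (n - 1 denominator)
    margin = t.ppf((1 + level) / 2, df=n - 1) * sd / sqrt(n)
    return m, sd, m - margin, m + margin

m, sd, lo, hi = t_ci(ratios, 0.95)
print(f"n={len(ratios)}  mean={m:.3f}  SD={sd:.3f}  95% CI=[{lo:.3f}; {hi:.3f}]")
# n=14  mean=2.134  SD=0.402  95% CI=[1.902; 2.366]  (matches Table 3)
```

Running the same function with `level=0.99` reproduces the 99% interval [1.811; 2.458] as well, which confirms that the tabulated statistics are t-based intervals on the ratio column.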
Table 4. Pearson and Spearman Correlation Analysis: January 2025 (Google Scholar) vs. July 2025 (OpenAlex).
Sampling Layer (January ranks) N Pearson r p-value Spearman ρ p-value
Top-100 (1–93, 95–100) 99 0.991 < 0.001 0.991 < 0.001
Mid layer (451–550, excl. 485, 528, 535) 97 0.7013 < 0.001 0.7299 < 0.001
Bottom layer (901–1000, excl. 905, 963) 98 0.3423 < 0.001 0.3369 < 0.001
TOTAL (Global Sample) 294 0.9831 < 0.001 0.9843 < 0.001
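The layer correlations in Table 4 are standard Pearson and Spearman coefficients computed between the January and July world ranks of the same universities. A minimal sketch with SciPy, using short, purely illustrative rank lists (hypothetical values, not the paper's data):

```python
from scipy.stats import pearsonr, spearmanr

# Hypothetical world ranks for eight universities in two consecutive editions
# (illustrative values only; the paper's layers use the real Jan/July 2025 ranks).
jan_ranks = [1, 2, 3, 5, 8, 13, 21, 34]
july_ranks = [1, 3, 2, 6, 7, 15, 20, 40]

r, p_r = pearsonr(jan_ranks, july_ranks)       # linear correlation of the raw ranks
rho, p_rho = spearmanr(jan_ranks, july_ranks)  # correlation of the rank orderings
print(f"Pearson r = {r:.4f} (p = {p_r:.3g})")
print(f"Spearman rho = {rho:.4f} (p = {p_rho:.3g})")
```

Because the inputs are already ranks, Spearman's rho here is simply Pearson's r applied to re-ranked values; with no ties it equals 1 − 6Σd²/(n(n² − 1)).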
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.