General information on ranking

Name of the ranking: India Rankings
Geographical scope: India
Name of person in charge of ranking: Dr. Anil Kumar Nassa
E-mail of person in charge of ranking: akn.edu@gmail.com
Website of the ranking: https://www.nirfindia.org/
First year of publication: 2016
Most recent year of publication: 2019
Date of last update: 2023-05-04
Ranking organization: India Ranking Society
Website of the methodology: https://www.nirfindia.org/Documents
Methodology

The methodology adopted by India Rankings is as follows:

1. Source of Data: In the absence of a reliable and comprehensive database that could supply all relevant data required for computing the scores for ranking, registered institutions were invited to submit the required data through an Online Data Capturing System (DCS). Publications, citations and highly cited papers (HCP) pertaining to the research output of applicant institutions were retrieved from Scopus (Elsevier Science) and Web of Science (Clarivate Analytics). Data on patents published and granted was taken from Derwent Innovation. Moreover, the number of papers that appeared in the top 25 percentile of cited papers in the world for a given discipline was taken as a sub-parameter for evaluating the research performance of institutions.
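
For illustration only, the idea behind the top-25-percentile sub-parameter can be sketched as follows. This is a minimal Python sketch with made-up citation counts; the exact cutoff computation used for the HCP sub-parameter is not spelled out in the methodology.

    import statistics

    def top_quartile_count(world_citations, institution_citations):
        # Count an institution's papers whose citation counts reach the top 25
        # percentile of cited papers in the discipline (illustrative only; not
        # the official NIRF computation).
        q3 = statistics.quantiles(world_citations, n=4)[2]   # 75th-percentile cutoff
        return sum(1 for c in institution_citations if c >= q3)

    # Hypothetical citation counts: discipline-wide vs. one institution's papers
    world = [0, 1, 2, 3, 5, 8, 13, 21, 34, 55]
    institution = [4, 20, 60]
    print(top_quartile_count(world, institution))   # -> 1 (only the 60-citation paper clears the cutoff)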

2. Data Collection and Data Capturing: A Data Capturing System (DCS), a Feedback System and a Perception Capturing System were developed for online capturing of data from applicant institutions, feedback from the public and institutional perception (from Peers and Employers). As mentioned earlier, the data on publications, citations and highly cited papers were retrieved directly from third-party sources. A brief description of data collection and data capturing is given below.

2.1. Online Data Capturing System: The Data Capturing System sought detailed data in a format that facilitated computing the ranking metrics for each parameter as well as checking the consistency of the data. Detailed notes were provided to explain every data element, to help institutions comprehend each data element and provide correct data. Attempts were made to keep data entry to a minimum. Faculty data for the previous two years was pre-populated in the DCS, with provision for changes accompanied by suitable remarks/reasons for the changes.
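
As an illustration of the pre-population and consistency-check idea described above, the sketch below models a single DCS-style faculty record in Python; the field names and validation rule are hypothetical, not the actual DCS schema.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class FacultyRecord:
        # Hypothetical DCS-style record: the prior-year value is pre-filled and
        # any change to it must be accompanied by an explanatory remark.
        year: int
        faculty_count: int
        prefilled_count: Optional[int] = None   # value carried over from the previous submission
        remarks: str = ""                       # reason required when the pre-filled value is changed

        def is_consistent(self) -> bool:
            # Flag records that change a pre-filled value without giving a reason.
            if self.prefilled_count is not None and self.faculty_count != self.prefilled_count:
                return bool(self.remarks.strip())
            return True

    record = FacultyRecord(year=2019, faculty_count=182, prefilled_count=175)
    print(record.is_consistent())   # -> False until a remark explaining the change is added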

2.2. Publications, Citations and Highly Cited Papers (HCP): Web of Science (WoS) and Scopus: Two sets of citation databases were used as sources for retrieving data on the number of publications, citations and highly cited papers for the registered institutions. These citation databases comprise: i) Science Citation Index Expanded (SCI-Expanded), Social Sciences Citation Index (SSCI), Arts & Humanities Citation Index (A&HCI), Conference Proceedings Citation Index - Science (CPCI-S), Conference Proceedings Citation Index - Social Sciences & Humanities (CPCI-SSH), Book Citation Index - Science (BKCI-S), Book Citation Index - Social Sciences & Humanities (BKCI-SSH), Emerging Sources Citation Index (ESCI) and Current Chemical Reactions (CCR-EXPANDED) hosted on the Web of Science platform; and ii) Scopus. These sources of publications and citations cover all disciplines comprehensively.

These databases were searched to determine the quantitative productivity of all 5805 applicant institutions that registered themselves for ranking. The search covered the number of research articles published and the citations received by them over a span of three calendar years, i.e. 2016, 2017 and 2018, for the 2020 ranking. To ensure fairness, a common time window, a short span of two weeks in February 2020, was used to obtain this data for all institutions.
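
A minimal sketch of such a count query is shown below, assuming access to the Elsevier Scopus Search API with a valid API key; NIRF's actual retrieval workflow is not published as code, so this only approximates the idea.

    import requests

    SCOPUS_URL = "https://api.elsevier.com/content/search/scopus"

    def publication_count(affiliation: str, api_key: str) -> int:
        # Number of Scopus records affiliated with an institution and published
        # in 2016-2018, the three-year window used for the 2020 ranking.
        query = f'AFFIL("{affiliation}") AND PUBYEAR > 2015 AND PUBYEAR < 2019'
        resp = requests.get(
            SCOPUS_URL,
            headers={"X-ELS-APIKey": api_key, "Accept": "application/json"},
            params={"query": query, "count": 1},   # only the total count is needed here
        )
        resp.raise_for_status()
        return int(resp.json()["search-results"]["opensearch:totalResults"])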

2.3. Search Strategy for Retrieving Research Publications, Citations and Highly Cited Papers from WoS and Scopus: All permutations, combinations and changes in the names of institutions were used while searching for articles published by faculty and researchers in the databases mentioned above. Since searches were conducted using names of institutions, articles that did not have institutional affiliations of their faculty and researchers were not retrieved.
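
The sketch below illustrates how such name permutations might be combined into a single affiliation query; the name variants shown are hypothetical examples, and the actual variant lists used by NIRF are not published.

    def affiliation_query(name_variants):
        # OR together every known rendering of an institution's name so that
        # papers indexed under any variant are retrieved.
        clauses = [f'AFFIL("{name}")' for name in name_variants]
        return "(" + " OR ".join(clauses) + ")"

    # Hypothetical variants of one institution's name as it may appear in author affiliations
    variants = [
        "Indian Institute of Technology Delhi",
        "IIT Delhi",
        "Indian Inst Technol Delhi",
    ]
    print(affiliation_query(variants) + " AND PUBYEAR > 2015 AND PUBYEAR < 2019")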

2.4. Restricting Retrieval of Articles to a Given Discipline: Searches for publications and citations were done in the two databases mentioned above for applicant institutions without any subject-wise and discipline-wise restrictions for the Overall ranking of institutions. However, subject/discipline-specific searches were made for all other discipline-wise rankings in the interest of uniformity and fairness. Care was taken to design the restriction so as to get the widest possible coverage of sub-disciplines within each broad discipline.
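
One way such a restriction could be expressed, continuing the Scopus-style sketches above, is by OR-ing together the subject-area codes that make up a broad discipline; the code list below is an illustrative guess for Engineering, not the filter actually used by NIRF.

    # Hypothetical mapping of a broad discipline to Scopus subject-area codes
    ENGINEERING_AREAS = ["ENGI", "COMP", "ENER", "MATE"]

    def discipline_query(base_query: str, subject_codes) -> str:
        # Restrict an affiliation/year query to one broad discipline while keeping
        # the coverage of its sub-disciplines as wide as possible.
        subject_clause = " OR ".join(f"SUBJAREA({code})" for code in subject_codes)
        return f"{base_query} AND ({subject_clause})"

    print(discipline_query('AFFIL("IIT Delhi") AND PUBYEAR > 2015 AND PUBYEAR < 2019',
                           ENGINEERING_AREAS))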

2.5. Online Perception Capturing System: An online platform was developed to capture the perception inputs from peers and employers. A large number of peers (subject experts) were invited to submit their perception feedback on applicant institutions in a prescribed format.

3. Online Feedback System: Stakeholders (including members of the public and other individuals or entities having an interest in one or more institutions) were invited, through a public advertisement in newspapers and other media, to give their feedback on the data submitted by the institutions through the “Online Feedback System” within ten days. The comments/feedback so received were auto-transmitted by e-mail, without disclosing the identity of the stakeholder, to the concerned institution(s) for taking necessary action at their end.

4. Data Verification

4.1. Scouting for Outliers: Committees of Domain Experts: Issues and pitfalls in the process of data collection, verification, authentication and interpretation were addressed by the Implementation Core Committee (ICC) set up by the MHRD to oversee the implementation of the ranking work. This Committee also reviewed the parameters and formulae that were finally used for ranking in various disciplines. Besides, committees consisting of academic experts examined the data submitted by institutions under each of the five broad generic groups of parameters, for every category / discipline. These committees examined the data on various parameters minutely and identified outliers and anomalies for further scrutiny. Institutions whose data seemed exaggerated or had anomalies were contacted telephonically and via e-mail to confirm or correct the data. Where it was felt necessary, they were asked to support their data with documentary evidence. Several e-mails were sent and telephonic calls were made to various institutions for verification of data on different parameters and sub-parameters.
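
As a rough illustration of what scouting for outliers can look like in practice, the sketch below flags submissions lying outside 1.5 times the interquartile range; the committees' actual criteria rested on expert judgement and are not reducible to this simple rule.

    import statistics

    def flag_outliers(values, k=1.5):
        # Flag values lying more than k * IQR outside the middle 50% of all
        # submissions for a sub-parameter (a common, simple anomaly heuristic).
        q1, _, q3 = statistics.quantiles(values, n=4)
        iqr = q3 - q1
        low, high = q1 - k * iqr, q3 + k * iqr
        return [v for v in values if v < low or v > high]

    # Hypothetical per-institution values for one sub-parameter
    submitted = [12.1, 13.4, 11.8, 12.9, 95.0, 12.5]
    print(flag_outliers(submitted))   # -> [95.0], which would be sent back for confirmation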

4.2. Communication with Nodal Officers: Each institution was asked to nominate one of their senior functionaries as a nodal officer for dealing with NIRF matters. These nodal officers were contacted to clear doubts or to attend to the feedback and anomalies pointed out by the expert committees. Nodal officers were also called in person (where necessary) to interact with members of the committee and verify their data. For increased transparency, an advisory was sent to each institution to upload this data on their own website for dissemination to the public. For all the top-ranked institutions, the latest version of the corrected data based on further inputs from the institutions was made visible on the NIRF portal. While significant efforts were made to authenticate the data, the final responsibility for the accuracy of the submitted data lies with the concerned institutions.

4.3. Verification of Data on Publications, Citations and Highly Cited Papers: The data on publications, citations and highly cited papers was shared with each applicant institution. Applicants were informed that the data was captured during the same specified period for all institutions.

Additional information

  • Internet users' access to ranking: open access
  • Language of publication: English
  • Website of the ranking organization: https://www.nbaind.org/
  • Type of the ranking organization: independent public organization