Academic Analytics Defined
Academic Analytics (AA) is a vendor that provides consulting services and a dashboard of verified scholarly data for doctoral-granting research universities. For selected metrics, the data provide a source of external benchmarking and comparison with other institutions. Where appropriate, Academic Analytics helps academic leaders identify their strengths and areas for improvement.
The AA database includes information on over 270,000 faculty members associated with more than 9,000 Ph.D. programs and 10,000 departments at more than 385 universities in the United States and abroad. These data are structured so that they can be used to enable comparisons at the institutional, discipline, and faculty levels. The data include the primary areas of scholarly research accomplishment:
- Publication of scholarly work as books and journal articles
- Citations to published journal articles
- Research funding from thirteen federal agencies and two non-federal agencies
- Honorific awards bestowed upon faculty members
The overarching use of Academic Analytics is to provide data to help TTU improve research programs.
Overview of Academic Analytics Data
A significant value of AA is that the data reported are validated by a third party, i.e., the data are not self-reported. All data are consistently reported relative to the same set of conditions. However, the data are not equally strong among all programs. The funding data capture the principal investigator well but are inconsistent for co-principal investigators.
AA reports data in rolling four- to five-year cycles, with the exception of scholarly awards and honors. The following data are in our current system, with the associated periods:
| Data Type | Time Period |
|---|---|
| Faculty | Academic Year 2022 - 2023 |
| Journal Articles | 2019 - 2022 |
| Honorific Awards | No Limit |
| Books | 2013 - 2022 |
| Chapters | 2019 - 2022 |
| Citations | 2018 - 2022 |
| Conference Proceedings | 2019 - 2022 |
| Grants | 2018 - 2022 |
In terms of research grants, the 2022 comparative database includes competitively reviewed assistance grant data from thirteen federal agencies and two non-federal agencies, matched to the principal investigator at the lead institution (not necessarily co-PIs).
Organizations
- American Cancer Society (ACS)
- American Heart Association (AHA)
- Department of Agriculture (USDA)
- Department of Commerce (DOC)
- Department of Defense (DOD)
- Department of Education (ED)
- Department of Energy (DOE)
- Department of Health and Human Services (HHS)
- Department of Transportation (DOT)
- Environmental Protection Agency (EPA)
- National Aeronautics and Space Administration (NASA)
- National Endowment for the Arts (NEA)
- National Endowment for the Humanities (NEH)
- National Science Foundation (NSF)
- National Institutes of Health (NIH)
Note that the 2022 comparative database includes honorific awards from more than 500 governing societies.
Academic Analytics compares faculty on these metrics by standardizing scores using the population mean and standard deviation. In this calculation method, z-scores are calculated by subtracting the mean score of the metric for the entire comparison population from the unit’s score and then dividing this difference by the standard deviation of the comparison population. The standard deviation is simply a measure of the overall variation of the scores in the discipline, and this division is a way to standardize scores on the same scale. These z-scores summarizing each metric facilitate comparisons between unlike data types (e.g., publications versus honorific awards, or chemistry versus history). A unit’s z-score for a given metric represents the scholarly activity of this program relative to the average unit in the same discipline.
For example: If the average number of journal publications per faculty member in discipline “A” is 6.0, and the standard deviation for this discipline is 2.0, then a program with 8.0 publications per faculty member has a z-score of 1.0, i.e., (8.0 – 6.0)/2.0 = 1.0, meaning the program is one standard deviation above the discipline mean (under an approximately normal distribution, roughly 68% of programs fall within one standard deviation of the mean). A short computational sketch of this calculation appears after the observations below. Note a few observations:
- Because each discipline has its own variance, the y-axis (z-score) scale differs from one group to another. Review the increments on the y-axis carefully before judging the magnitude of a particular data point.
- Discipline classifications vary among institutions, so a single discipline at TTU may need to be compared against a collection of disciplines to draw an accurate comparison.
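For reference, the following minimal Python sketch reproduces the z-score standardization described above. The function name and the numbers used are illustrative only and are not part of AA's tooling; they simply restate the worked example from the text.

```python
# Minimal sketch of the z-score standardization described above.
# The discipline values below are hypothetical and taken from the worked example.

def z_score(unit_value: float, population_mean: float, population_sd: float) -> float:
    """Standardize a unit's metric against its discipline's comparison population."""
    return (unit_value - population_mean) / population_sd

# Worked example: discipline mean = 6.0 publications per faculty member,
# standard deviation = 2.0, program value = 8.0 publications per faculty member.
print(z_score(8.0, 6.0, 2.0))  # -> 1.0 (one standard deviation above the mean)
```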
Strategic, intentional use of Academic Analytics, alongside other data tools, is critical to the success of the resource. Academic Analytics is a complementary tool that will be used to inform academic leadership about unit performance and to guide decisions on strategic investment.
Recommended uses for Academic Analytics:
Opportunities for Research Improvement
One of the most valuable applications of this comparative data is to identify grant productivity among aspirant peers by discipline. Academic Analytics allows users to view the granting agencies and activity levels of the peers selected by the respective TTU academic leaders. The comparative data can be used to:
- Identify granting agencies not currently solicited by the TTU department
- Review aspirant peer awards
- Identify research partner universities
AA has the following reporting mechanisms that can be used to identify research opportunities:
- Evaluation of broad field and program-specific strengths and weaknesses
- Evaluation of broad field and program-specific ranking for awards, grants, articles, and citations
- Identification of departmental peers' grant market share
- Identification of possible subject experts
AA has created Custom Reports found in the Analysis on Demand module for:
- Deans and Associate Deans (who also have access to Department Chair reports): a broad calendar-year overview of college research productivity compared to a custom peer group
- Department Chairs: calendar-year departmental research productivity compared to a custom peer group
AA has created a custom data release in the Benchmarking module (AAD2022_Custom_222_4.17.24_1.0) to directly compare TTU and AAU institutions.
Complementary Tool to Other Sources of Data
Academic Analytics data and reports are only one source of information. Other institutional data sources should be consulted, such as:
- The College Metrics departmental-level dashboard
- The TTU Data Warehouse
- A variety of Cognos reports
Academic leaders will have additional sources of data to augment institutional resources. They will use a variety of quantitative and qualitative data when making critical decisions, but AA will not be used as the sole source of data for tenure and/or promotion decisions at any level in the process.
Access to the AA data
There are three levels of access for AA data:
- Level One: Executive leadership (president-level, provost-level)
- Level Two: Deans
- Level Three: Department chair/area coordinator
Because the AA database contains personally identifiable scholarly information, access to these data is sensitive and will be restricted. Annually, in September, the provost will decide the degree of access available to each of the aforementioned levels (Level 1, Level 2, and Level 3).
Each level will have the following permissions:
| Level | Academic Leadership Level | Permission Description |
|---|---|---|
| 1 | Executive and Administrative Leadership (President, Provost, IE Leaders) | Unrestricted |
| 2 | Deans | All data in respective college |
| 3 | Chairs | All data in respective department |
Note that faculty with dual appointments will be visible to the deans and chairs associated with their primary appointments.
In addition, we have deployed internal access-granting controls to add further integrity to the process. Permissions to grant access are managed by the Office of Research and Innovation. As a result, we have severely limited the number of individuals beyond deans and chairs with access to the data.
Data Management Procedures
TTU must carefully prepare the data (names and taxonomies) sent to Academic Analytics and actively monitor the accuracy of the data included. All deans will verify the faculty to be included in their college.
- Validating faculty data provided to AA
The following process occurs when Institutional Research receives the annual request from Academic Analytics to update the faculty list.
- The Office of Institutional Research generates a list of all tenured/tenure-track faculty employed in the Fall semester.
- The list of faculty for each Ph.D. program and academic department is provided to the respective deans for validation, which consists of verifying tenured/tenure-track faculty rank and associated taxonomies.
- The validated data are submitted to AA.
- AA reviews the submitted data and returns two lists: one of people AA believes should have been included but were not, and one of people who were included but whom AA believes should not have been.
- The Office of Institutional Research reviews the list of questioned people and, in consultation with the academic departments and deans, submits the final faculty list to AA.
- Correcting Data
One key campus concern is the ability to correct information. AA data are not intended to be used in annual performance reviews or as part of the tenure and promotion process, and any faculty member has the right to obtain a copy of the relevant AA data in order to review and correct it, within the constraints of the AA database. Data corrections, when discovered, will be submitted to AA via the Office of Research and Innovation.
In addition to carefully submitting and monitoring data, academic leaders should note the deficiencies in the Academic Analytics data and work collaboratively with the Office of the Vice President for Research to petition the vendor to improve the scope of data. Known deficiencies include:
- No performance (juried and non-juried) data in the performing arts.
- No visual arts and exhibitions (juried and non-juried) data.
- No inclusion of undergraduate-only programs at doctoral-granting universities.
- No inclusion of any program for which there is not a doctoral-granting program in the US.
- Incomplete data on co-principal investigators for credit in grant awards.
- Limited number of granting agencies included (noted above).
- No inclusion of private or state funding resources.
- Journal coverage in some fields is spotty or narrow.
- Programs may consider some included journals to be of questionable quality and/or predatory.
- Restricting comparative data to institutions whose faculty have similar teaching/research obligations requires separate investigation.
In addition, aspects of the TTU environment may pose data challenges for various departments:
- Department reorganizations and renaming may cause a break in relevant data during that period.
- TTU's inclusion of faculty taxonomy affiliation may differ from that of comparison peers. As noted above, the deans will validate the list of faculty to include, and AA will identify additional faculty members who may be included (administrators, adjuncts with rank, etc.). Because this is a subjective process, TTU's decisions for inclusion may vary from peer decisions.
- If a faculty member’s rank changes during the period, careful review is needed to ensure that the historical data on the faculty member transfers to the new rank.
- Faculty may be associated with zero, one, or more than one program in addition to their home department. Program association is not captured in the HR database and is dependent on verification at the college-level review. Not all programs are included in the AA database. AA data are organized in increasing granularity by broad field, department, and program.