Introduction and Related Works

Organisations are increasingly adopting analytics to enhance financial and operational efficiency (Van Barneveld et al. 2012). In the higher education sector, there is a growing expectation that institutions need to offer high-quality education and services within a changing and uncertain future. At the same time, institutions of higher education struggle to address the needs of a diverse student body and the different expectations and demands of stakeholders (Campbell and Oblinger 2007; Hazelkorn 2007; Oblinger 2012a, b). Regardless of the challenges institutions face, they are expected to rely on evidence to demonstrate the efficiency and effectiveness of their processes and operations. As institutions struggle to adjust to the speed of technological advancement and the resulting changes required in thinking and practice, evidence-based decision-making has become more critical to achieving productive outcomes with strategic value. Indeed, evidence-based decision-making has been purported to be advantageous in shaping an agile, learning organisation capable of adjusting to the demands of rapid change (Bakshi 2012; Niemi and Gitin 2012). However, the ability of institutions to respond effectively to a demanding and complex array of changes will depend on their capacity to use data to analyse the practices and processes involved in running their operational and strategic imperatives (Kaisler et al. 2013).

While the use of business intelligence is now considered essential to the expansion and survival of commercial organisations, it is less utilised within the higher education sector (Sclater et al. 2016). However, as educational institutions continue to digitise their operations, business intelligence tools and models that can facilitate real-time analytics to support evidence-based decision-making (Hilbert 2013) can no longer be ignored. In the future, the higher education sector is likely to embrace business intelligence widely. Undeniably, we are already privy to terms such as ‘academic analytics’ and ‘learning analytics’, which have emerged from analytics models associated with measuring performance (Charlton et al. 2013; Mahroeian et al. 2017; Dawson et al. 2010; Siemens 2013; Tulasi 2013; West 2012).

The use of data in the higher education sector is not a new phenomenon. Institutions of higher education have been generating data for reporting purposes since 1990 (see Fig. 1), producing reports on student enrolment, institutional revenue acquisition, spending trends and graduation rates (Daniel 2019). As institutions move from paper-based systems to digital environments, the vast array of digital data being generated provides new and exciting opportunities to gain a richer and more meaningful picture of institutional performance and associated action patterns (OECD 2013). Daniel (2015) suggested that institutions can use this digital data and analytics to examine their competitive advantage and better respond to the challenges facing them. With the advancement of new technologies, however, the emphasis has shifted to the collection of data on students’ academic activities, such as library usage and online interactivity and engagement, and most recently, to the use of analytics to track students’ learning trajectories. According to Norris et al. (2008), the term analytics refers to the processes of using various forms of data to assess, measure, improve, and compare the performance of individuals, programmes and departments. Analytics is also used for learner modelling, optimisation and personalised learning. Since higher education institutions continuously generate vast amounts of data about students, their learning environments, learning processes, and behaviours, such data can be analysed to understand educational, political, and managerial outcomes (Jones et al. 2020).

Fig. 1 The progression of data, analytics and big data in higher education (Daniel 2019)

Further, the growth in digitised networks has led to a rapid shift in the volume and types of data now available within institutions. In particular, Moussavi et al. (2020) pointed out that educators can use analytics to improve teaching and learning methods, and analytics are being used to better understand how students learn (Vytasek et al. 2020). Siemens and Long (2011) noted that institutions of higher education are required to engage with analytics to meet the demands of a changing educational world, and the OECD (2013) report highlighted the potential of analytics within the higher education sector to better support decision-making about educational outcomes and performance. In recent times, analytics has been considered essential for understanding student data through predictive models that identify at-risk students and hence provide the needed intervention (Adams Becker et al. 2017; Sclater et al. 2016; U.S. Department of Education, National Center for Education Statistics 2011). Predictive models are an extension of the automated computer-based early alert systems employed in many learning environments, and they enable human experts to build, customise, and target educational programmes for at-risk learners (Brooks and Greer 2014).

Siemens and Long (2011) argue that analytics offers the only way to utilise the sizeable array of educational data being produced as a result of digitisation. In this regard, higher education institutions must embrace analytics if they wish to meet the growing productivity demands shaping the future of the sector (Wagner and Ice 2012; Hrabowski et al. 2011a, b; Picciano 2012). Goldstein et al. (2005) describe five stages by which institutions are likely to engage with analytics:

  • Stage one: Extraction and reporting of transaction-level data

  • Stage two: Analysis and monitoring of operational performance

  • Stage three: What-if decision support (for instance, scenario building)

  • Stage four: Predictive modelling and simulation

  • Stage five: Automatic triggers of business processes (such as alerts)

Recent work reveals that higher education institutions are exploring different ways of using analytics to garner useful insights from data to guide planning and interventions and to make better decisions (Avella et al. 2016). Institutions such as Purdue University and the University of Maryland have used analytics extensively to improve the quality of student learning. They have employed early warning alert systems that track students’ progress and identify students who might be at risk of failing or dropping out of their programmes (Arnold and Pistilli 2012; Fritz 2011; Rajesh 2013). Work carried out at Purdue suggested that the utilisation of learning analytics helped identify student help-seeking behaviour: students who engaged with the Course Signals system at Purdue sought more help and resources compared to other students (Arnold and Pistilli 2012; Li and Wong 2020). Arizona State University (ASU) employed predictive analytics to improve student retention, and the insight gained helped the University increase its freshman retention rate by eight per cent and its graduation rate by ten per cent (Phillips 2013). Further, Greer et al. (2016a) reported the use of the Ribbon Tool to identify degree completion rates. The Ribbon Tool provided academics with interactive visualisation features that showed students’ progress and the navigation patterns they might have taken within their degree programmes over time, leading either to successful completion or to attrition. The Ribbon Tool was also indispensable for effective decision-making in curriculum mapping and change.

Within the Australasian context, the University of Sydney designed and developed the Student Relationship Engagement System (SRES) to help academics personalise student engagement within large classes. The SRES gives academics access to valuable and insightful data and enables them to create a personalised learning environment for students with targeted feedback and support. Academics can access analytics associated with attendance, grades, and how students engage with live feedback. This practical application of learning analytics led to increased student engagement, improved retention rates, and enhanced student learning outcomes. The SRES was also piloted at the University of Melbourne and the University of New South Wales (Liu et al. 2017).

Research suggests that the Open University of Australia (OUA) has developed a Personalised Adaptive Study Success (PASS) system to identify students experiencing disengagement (Atif et al. 2013).

Although there is growing engagement with analytics within the higher education sector, the full benefits of analytics are not clearly understood. In some cases, the deployment of analytics might not be supported by senior administration, and leveraging the full potential of analytics requires addressing technical and policy issues (Baker and Inventado 2014; Jones 2012). For some institutions, the collection and analysis of data is already part of their administrative functions. However, extracting valuable insights from analytics requires an understanding of sophisticated modelling approaches and specialised Data Science knowledge, which is either expensive to acquire or unavailable within the higher education sector. While analytics has the potential to enhance strategic decision-making, there is a substantial cost in deploying the necessary infrastructure to achieve the required results; as such, many institutions have opted not to invest heavily in analytics initiatives because of cost.

The current article reports on research that explored how decision-makers in the New Zealand higher education sector view, value, and use analytics in supporting decision-making. The study also examined the extent to which higher education institutions were engaging with analytics to support their operational and strategic decisions.

Methods

This research employed a survey design involving the administration of an online questionnaire (see Appendix 1), predominantly to senior leaders (N = 82) in seven of the eight research-intensive public universities in New Zealand. The ultimate goal of the research was to develop a broader understanding of the role of analytics in improving decision-making in higher education. We explored respondents’ views about analytics and the value analytics can add to the quality of operational and strategic decision-making in higher education. Respondents to the questionnaire included Vice-Chancellors, Deputy Vice-Chancellors, Pro-Vice-Chancellors, Administrators, Heads of Departments, Directors, Managers and others with roles associated with decision-making (n = 82; see Table 1). The sampling technique used to recruit the respondents was non-probability (convenience) sampling, as the population (senior executives of research-intensive public universities in New Zealand) was not well defined. Since this was a non-randomised observational study, population-level estimation was not intended (Etikan et al. 2016).

Table 1 Respondents’ demographics (n = 82)

The online questionnaire used in the present research was adapted from Goldstein and Katz (2005). The items in the questionnaire were aimed at broadly capturing participants’ conceptions of analytics and seeking a general understanding of current engagement with analytics and the outcomes accrued from its use in supporting decision-making in higher education. The items were measured on a 5-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, and 5 = strongly agree). Before deploying the questionnaire, we tested it for reliability. The results revealed high Cronbach’s alpha values on two dimensions and relatively low values on two others: data accessibility, consisting of 5 items (α = 0.78); data infrastructure development, consisting of 4 items (α = 0.82); data governance, with 2 items (α = 0.65); and future performance, with 2 items (α = 0.60) (see Table 4). The instrument also included some open-ended questions to provide more contextual information about responses to the closed-ended questions.
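As an illustration of this reliability check, the following is a minimal sketch of Cronbach’s alpha computed from its standard formula in Python. The study itself used SPSS; the item names and the simulated responses below are hypothetical, and only the alpha values reported above come from the study.

import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # rows = respondents, columns = the items belonging to one dimension of the scale
    k = items.shape[1]                               # number of items in the dimension
    item_variances = items.var(axis=0, ddof=1)       # variance of each individual item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: five 'data accessibility' items scored 1-5 by 82 respondents.
# Randomly generated answers are uncorrelated, so alpha will be low here; real survey
# items measuring the same construct are correlated and yield values such as 0.78.
rng = np.random.default_rng(seed=1)
demo = pd.DataFrame(rng.integers(1, 6, size=(82, 5)),
                    columns=[f"access_{i}" for i in range(1, 6)])
print(round(cronbach_alpha(demo), 2))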

Analysis and Results

We used the Statistical Package for the Social Sciences (SPSS), version 23, to analyse responses to the closed-ended questions. We used descriptive statistics, such as percentages and frequencies, to summarise the results, and an Exploratory Factor Analysis (EFA) to examine the structure of the questions and to understand respondents’ views on the value of using analytics in higher education and the factors that might contribute to its success. We also used a thematic analysis approach to analyse responses to the open-ended statements. Both the number of respondents and the proportions (n, %) are reported in the results.
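As a simple illustration of how closed-ended responses can be summarised as counts and percentages, the sketch below tabulates a single Likert item. The analysis itself was carried out in SPSS; the item name and responses shown here are hypothetical.

import pandas as pd

# Hypothetical Likert item; results in the paper are reported as counts and proportions (n, %)
responses = pd.Series(
    ["agree", "agree", "strongly agree", "neutral", "disagree", "agree"],
    name="analytics_used_for_monitoring")

summary = pd.DataFrame({
    "n": responses.value_counts(),
    "%": (responses.value_counts(normalize=True) * 100).round(1),
})
print(summary)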

The Exploratory Factor Analysis

To identify factors likely to influence respondents’ views on the value of analytics in enhancing decision-making within institutions, an Exploratory Factor Analysis with Varimax rotation was used to determine the construct validity of the 15 items. We chose Varimax rotation because this method maximises the sum of the variances of the squared loadings and associates each variable with at least one factor; it pushes the coefficients towards either large or near-zero values and minimises intermediate values. We obtained a Kaiser–Meyer–Olkin (KMO) value of 0.81, which suggested that the sample was adequate for exploratory factor analysis. In addition, Bartlett’s test of sphericity was significant (p < .001). The assumptions of independent sampling, normality, linear relationships between pairs of variables, and moderate correlations among variables were met. The rotated component matrix is presented in Table 2, and the total variance explained is presented in Table 3. Thirteen of the fifteen items in the questionnaire loaded above the accepted level (0.3 threshold) and were mapped to four distinct factors.
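The EFA itself was conducted in SPSS. The sketch below shows a comparable workflow in Python using the open-source factor_analyzer package; the input file name and item columns are assumptions for illustration, and only the reported figures (KMO = 0.81, four retained factors, the 0.3 loading threshold) come from the study.

import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical 82 x 15 matrix of Likert responses, one column per questionnaire item
responses = pd.read_csv("survey_items.csv")

# Sampling adequacy and sphericity checks
_, kmo_model = calculate_kmo(responses)                        # study reports KMO = 0.81
chi_square, p_value = calculate_bartlett_sphericity(responses)

# Varimax-rotated EFA retaining the four factors with eigenvalues greater than 1
efa = FactorAnalyzer(n_factors=4, rotation="varimax")
efa.fit(responses)

loadings = pd.DataFrame(efa.loadings_, index=responses.columns)
retained = loadings[(loadings.abs() > 0.3).any(axis=1)]        # items above the 0.3 threshold
variance, proportional, cumulative = efa.get_factor_variance()

print(f"KMO = {kmo_model:.2f}, Bartlett p = {p_value:.3f}")
print(retained.round(2))
print((proportional * 100).round(2))  # per-factor variance explained (study: 39.39, 10.55, 9.09, 7.79)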

Table 2 Rotated component matrix

After rotation, we extracted four key factors. The first factor accounted for 39.39% of the variance (see Table 3) and described respondents’ views on overall institutional awareness of the value and use of analytics to support students’ learning. This factor represents the role of analytics in identifying students at risk of failing their programme, as well as the importance of training programmes for those interested in the use of analytics and of tools for collecting data.

The second factor explained 10.55% of the variance, with six items loading onto it; these items are associated with the availability of data infrastructure policy and the development of analytics. The third factor explained 9.09% of the variance and described the status of data governance and investment in analytics within institutions. Two items loaded onto the fourth factor, which accounted for 7.79% of the variance and described the role of analytics in supporting strategic decision-making relating to future institutional performance. The four distinct factors extracted, each with an eigenvalue greater than 1, are presented in Table 3.

Table 3 Total variance explained

The State of Institutional Engagement With Analytics

Respondents described analytics as an emergent phenomenon and a developing field of inquiry with several potential benefits. The deployment of analytics in higher education was seen as key to facilitating a better understanding of institutional performance in various functional areas. Respondents stated that analytics could play a significant role in three functional areas: understanding and enhancing student services; supporting institutional research administration; and promoting institutional efficiency. Although respondents pointed out the potential benefits of deploying analytics in the higher education sector, there was notable variation in how analytics is used within institutions in New Zealand. As one respondent noted: “I suspect the use of analytics for decision-making may be quite variable across the organisation. Moreover, analytics is more likely to inform such decision-making, rather than being the significant determinant.”

Drawing on Goldstein et al.’s (2005) framework, which describes the various stages of analytics adoption and utilisation within the higher education sector, our findings suggest that the majority of institutions in New Zealand were at the stages of data extraction, analysis and reporting (see Fig. 2). More than half of the respondents (50, 70%) reported that their institutions used analytics for monitoring operational performance, followed by the use of analytics for reporting transaction-level data (33, 46%). The distribution of the various stages in the use of analytics across institutions is shown in Table 5. Respondents also pointed out differences in analytics capabilities across institutions. For example, both University A and University B (see Table 5) reported the highest proportion (70%) of current engagement with analytics in the area of student services and support. More specifically, analytics were used to understand student enrolment management, student progress, and finance and budgeting. As one respondent stated: “Our work in the analytics area has been outstanding recently, and there are obvious access points via the data warehouse that now inform all schools and non-academic departments.”

Fig. 2 Primary stages of the use of analytics

To gain further insights into current engagement with analytics across institutions, a chi-square test of independence was conducted to explore variations in the use of analytics across different areas. The results, shown in Table 5, indicate that, contrary to expectations, there was no statistically significant association between the type of institution and the areas in which analytics was used, χ2 (105) = 47.651, p = 1.00. The effect size was small (Cohen 1988), Cramér’s V = 0.11.
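A minimal sketch of this test in Python with SciPy is shown below. The contingency table is hypothetical and for illustration only; the statistics quoted in the comments are those reported above.

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = institutions, columns = areas of analytics use
table = np.array([[12, 9, 7, 4],
                  [10, 8, 6, 3],
                  [5, 6, 4, 8]])

chi2, p, dof, expected = chi2_contingency(table)

# Cramer's V as the effect-size measure for the association
n = table.sum()
min_dim = min(table.shape) - 1
cramers_v = np.sqrt(chi2 / (n * min_dim))
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.3f}, Cramer's V = {cramers_v:.2f}")
# The study reports chi2(105) = 47.651, p = 1.00, Cramer's V = 0.11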

Tables 4 and 5 show that over half of the New Zealand universities were less engaged with analytics; the exceptions were University A and University B, where analytics was mainly used at the first two stages (data extraction and analysis) of Goldstein et al.’s (2005) framework. Overall, respondents reported that institutions are in the early stages of adopting analytics as a mechanism to influence decision-making within New Zealand higher education institutions.

Table 4 Descriptive statistics – Four factors of analytics engagement
Table 5 Primary stages of the use of analytics (n = 82)

Use of Analytics Within Institutions By Area

Respondents were asked to rate several statements about the current use of analytics within their institutions on a 4-point Likert scale (from strongly agree to strongly disagree). A closer examination of the data suggests that the New Zealand higher education sector used analytics in different ways and for various purposes. For instance, over half of the respondents reported that their institutions used analytics to monitor operational performance (50, 65%), and others (56, 61%) indicated that their institutions used analytics to model the potential impact of strategic decisions on future performance in areas such as budgets, governance, research outputs, and teaching and learning excellence. In addition, institutions used analytics to identify students at risk of failing their course or programme (42, 55%) (Table 6).

Table 6 Engagement of analytics within higher education institutions (n = 82)

Others (32, 43%) reported that analytics helped them to identify potential students for admission. Further, respondents (40, 53%) indicated that their institutions utilise analytics to forecast future demand for courses, while others (31, 41%) deploy analytics to automatically alert appropriate faculty or staff when a particular performance indicator falls outside the desired range.

What Are the Potential Benefits of Using Analytics in Higher Education?

We asked respondents an open-ended question: “What are the potential benefits of utilising analytics in institutions of higher education?” The analysis of the data identified several key themes, together with the frequency of their occurrence (Table 7). The three recurring key themes were improving administrative services, efficiency and effectiveness in resource utilisation, and improving student services.

Table 7 Themes on the potential benefits of using analytics in higher education

Improving Administrative Services

Respondents indicated that analytics could help higher education institutions to streamline processes, reduce the time spent on manual processes, develop key performance indicators, monitor performance, and track accountability measures for various stakeholders.

Analytics has always been used to monitor performance, compare competitors’ progress and assist with strategic planning in areas such as those highlighted above; sophisticated analytics can be useful to understand massive data sets and give either more detailed performance information or a clearer ‘big picture’ view of performance [participant 8].

We use analytics to inform a wide range of management decisions and processes and progress with strategic priorities [participant 11].

Analytics can show what is going on, rather than just using guesswork and have real-time data [participant 12].

Efficiency and Effectiveness in Resource Utilisation

Higher education institutions are likely to increase the use of analytics in the future with the critical goal of improving overall performance through efficient ways of allocating resources and seeking strategies that enhance productivity and increase return on investment. For instance, one participant noted:

To me, resources’ use in an institution can be well optimised by using specific sort of analytics, improvement of faculty and research performance by extracting the right data and getting insightful (‘meaningful’) data [participant 14].

Similarly, another participant indicated: “I support the use of analytics to improve effectiveness and excellence, not to promote cost reduction (which almost always involves job losses)” [participant 12].

These results suggest that analytics, especially predictive analytics, has the capability of presenting current institutional strengths, challenges, and opportunities, as well as future threats. As one respondent stated: “Analytics provides a measure to calculate/estimate or forecast what is currently happening and what could happen in the future” [participant 1].

Improving Student Services and Support

Analytics can play a pivotal role in students’ academic development and can be used to understand individualised student profiles in order to provide targeted learning support. Used in this way, learning analytics is more likely to help institutions better understand students’ performance and to identify at-risk students early. Such knowledge can be used to increase student retention rates, and it is a powerful way of tracking learning trajectories.

Analytics can improve learning and teaching by understanding behavioural responses to teaching methods. It can help finances by looking at patterns and identifying areas to economise [participant 4].

We are very ignorant of what drives students and the different sorts of student we have, and how to help them succeed [participant 5].

Better tracking of student performance over time will allow the identification of both students who excel and those who struggle. The former could be targeted/nurtured for further education (post-grad), along with early identification and intervention for students who may be struggling before they fail their course [participant 7].

Other respondents reported that analytics has the potential benefit of providing institutions with better strategies to improve student engagement and retention. As one participant put it: “I see the potential power of analytics in improving student retention, engagement and learning” [participant 8]. There was also a strong sentiment against the use of analytics for profiling students or staff: “I would prefer the use of analytics to improve student learning, rather than monitor students or staff” [participant 2].

How Will Analytics Help Institutions in the Future?

We asked respondents about the role of analytics in helping the higher education sector achieve its goals in the future. Respondents shared the view that analytics can significantly transform the way institutions operate. They indicated that analytics could provide institutions with alternative evidence to help decision-makers make better decisions about business intelligence, strategic planning and student engagement, and to deal with the challenges facing the higher education sector. We identified several themes and their frequency of occurrence (Table 8).

Table 8 Themes on the role of analytics in higher education in New Zealand

An Alternative Source of Evidence for Enhanced Decision-making

Respondents indicated that analytics has the potential to transform how decisions are made, which can lead to institutional success, since many higher education institutions collect large amounts of data and associated analytics that can be harnessed to provide alternative evidence for effective decision-making. As one participant noted, “one cannot make decisions without understanding the immediate environment”, and overall success in higher education will depend on how institutions use analytics to make useful and practical decisions.

In the near future, almost all success will be highly dependent on analytic use for evidence-based decision-making [participant 1].

Having analytics provides an opportunity to make more informed business decisions [participant 6].

Business Intelligence

Business intelligence refers to technology-mediated, data-informed processes of identifying and presenting actionable evidence to decision-makers to optimise productivity outcomes. According to Negash and Gray (2008), “BI systems combine data gathering, data storage, and knowledge management with analytical tools to present complex internal and competitive information to planners and decision-makers” (p. 178). Applying business intelligence in higher education can help institutions to assess current and future business performance. BI promotes transparency and accountability, and it utilises various performance measures, reporting, dashboards and scorecards to present and visualise outcomes.

It enables institutions to maximise their resources and plan; business intelligence and analytics play a vital role. However, access has become relatively easy, and it is hard for people to make sense of it as so many tools offer so many different views [participant 1].

Analytics will form an essential source of data, but the most impactful metrics will be around business intelligence: who is enrolling, who is leaving, who is succeeding; primarily factors affecting the financial bottom line [participant 3].

Being able to spot well-performing departments and poor performing departments would enable corrective action to be taken [participant 4].

The University is partway through a business intelligence implementation. The data is being provided more widely than before and will undoubtedly aid data transparency [participant 7].

Strategic Planning

Analytics can significantly contribute to the quality of strategic planning development, implementation and evaluation. As one respondent noted: “Analytics will be critical to ongoing strategic planning and institutional improvement. If you cannot measure progress, you will not know if you are improving” [participant 1].

Addressing Challenges Institutions Face

Analytics can enable institutions to assess the current state of affairs and respond effectively to the challenges facing the sector: declining student enrolment; shrinking budgets; increasing demands from various stakeholders for accountability; and the need to optimise meagre resources to improve performance output.

Given the range of challenges facing HEI, the more accurate data that can be used, the better we know our current situation, and where we can go from there [participant 1].

We have constrained resources and operate in a fiercely competitive environment. We need to allocate resources optimally to be able to compete [participant 2].

Human-in-the-Loop Analytics

Findings suggest that the use of analytics contributes to effective decision-making. However, respondents indicated that analytics is a means to an end. They stressed that institutions need to take into account the context and the impact of the use of analytics on people, and some noted the role of leadership and support in the effective utilisation of analytics.

I believe good leadership and passion for education and research are more important than analytics. It will not ‘rely’ on it, but it will play a part [participant 1].

Still need to ensure that humans are considered the critical elements in the activity (students, the staff of all types) [participant 2].

I think institutional success relies far more on the quality of its leaders―useful analytics used poorly will serve no purpose. Conversely, poor analytics used well might produce some good results [participant 3].

Student Recruitment and Success

Analytics provides a holistic view of student enrolment patterns, monitors student success and predicts future growth. It will be instrumental in student recruitment, admission and retention, as some participants noted:

Enrolments fell at our institution this year, but through analytics, we can see how to boost numbers by enhancing entrance scholarships. Also, to improve retention, we are using analytics to monitor student progress and to intervene when progress is unsatisfactory [participant 6].

We should, as an institution, be able to use the data to support targeted student learning support, target resources where required and identify performance issues across faculty and administrative processes which should enable improved performance in these functions over time [participant 5].

I can see recruitment, admission and selection all being heavily influenced by analytics—and in the future, curriculum and pedagogy design [participant 3].

Financial Decision-making

Universities in New Zealand acquire some funding from the government. The government requires universities to report on outcomes they have achieved and demonstrate sustained success in learning outcomes, graduation rates, and other graduate attributes.

Analytics offers more useful metrics for tracking and measuring spending and graduate success [participant 9].

Increasingly restricted funding, government directives on acceptable outcomes and government interference on graduate preferences [participant 1].

Financial provision will depend on metrics around employability and not just outcomes. Staffing will depend on metrics around sensible FTE/papers taught, not FTES [participant 2].

Challenges in Accessing Information for Effective Decision-making

Effective decision-making requires access to information, and the knowledge and skills needed to interpret data well. When asked about the common challenges decision-makers face in accessing information in their institutions, some respondents (34, 40%) reported that decision-makers in their institutions have easy access to information; others (28, 33%) remained neutral; and the rest (24, 28%) said that decision-makers in their institutions lack access to useful information. Further analysis of responses to the open-ended question revealed many challenges (see Table 9), summarised as follows:

Table 9 Themes on challenges in accessing information for effective decision-making

Access and Interpretation

Respondents raised concerns about a lack of access to data and the difficulty of interpreting analytics. They said that available data might not be accessible because it is widely dispersed across several databases. As one participant said: “Information required is widely dispersed and not easily interpreted. Much of the easily accessible information is not accessed or well utilised.” Another participant said: “Much of the information is lost in layers and silos” [participant 2]. Others expressed frustration that most of the information in their institutions is still paper-based and is therefore difficult to access or to generate reports from: “The data is hard to attain as not all processes can be streamlined or monitored with online tools or computer tools currently, as it is still very paper-based systems” [participant 2]. While some decision-makers might have access to the reports they needed, access was inconsistent, and others might not know where to locate vital information.

I think the decision-makers are given access to some of the information they need to make crucial decisions. Still, I believe it would be inconsistent and not always clearly explained or interpreted [participant 5].

I think decision-makers have access to data if they know what to ask for; access, in my experience, is not easy [participant 8].

Others reported difficulties around developing key performance indicators to measure progress and set targets.

We struggle to pull together simple metrics around research performance, revenue and expenditure from across the institution. I would imagine that other departments face similar problems [participant 1].

Challenges in Meeting Different Information Needs

There are several types of decision-makers within the higher education sector, each requiring a different form of information; because of this, meeting the diverse information needs associated with a particular role becomes challenging. Even though the same data might be needed by individuals in different roles, the data are meant to serve different purposes.

We have some decision-makers who require high-level information and others who need more detail. The detailed requests can be challenging to meet [participant 1].

There are many decision-makers (e.g. teachers, managers, HoDs, etc.), and I am not sure all of them have easy access to useful data [participant 4].

Decision-makers are positioned at different levels, and their data needs change. At the macro level, senior managers (VC and DVC - even faculty/division heads) - their data needs are more around reporting/measuring activity and outcomes - they have this data. Fine-grained data though that gives meaningful insight into learning behaviour/engagement and outcomes is limited, although emerging and nascent [participant 2].

Some respondents also mentioned the lack of coordination and of a data governance framework in their institutions: “The evolved nature of the university leads to lack of coordination or information sharing across different parts of the university” [participant 1]. Further, respondents identified a lack of adequate technical infrastructure to facilitate better access to data to support effective decision-making. Some institutions use legacy systems that were not built to accommodate new forms of metrics and analytics, while others lack awareness of the transformative nature of analytics.

Several in-house bespoke services and systems have not been built with the opportunity of analytics in mind [participant 5].

I do not think our institution has discovered the power of business analytics to use them to their best advantage [participant 3].

Currently changing system so has been hard to get the usual data - will be better in the long run [participant 4].

Although some institutions had invested in technical systems to support the harvesting and use of analytics, respondents said that the analytics they were provided with were not adequately used.

I believe that to some certain extent, my University having easy access to make significant decisions but not sure whether they are well-utilising analytics tools and implementation for its extraction or not [participant 6].

Respondents also said that the ability to understand data and to gain the support of leadership is critical to the success of analytics. One respondent commented: “The influence of analytics does not reach the top levels of leadership” [participant 2]. Those who raised issues of understanding data said that institutions need to find better ways to use and interpret data.

In all my dealings within the leadership, data sorting has been quite tricky. They are always looking for more convenient methods of sorting data to inform decision-making [participant 1].

I work with the people extracting the data. There is a lot there, but a lot yet to be made digestible. Some data is readily available, but is it the data that should be driving decisions, or does it lead to an obtuse, target-based decision making process?[participant 2].

In addition to some challenges raised regarding access to useful information to support effective decision-making, respondents identified several concerns relating to the adoption of analytics in higher education (see Table 10).

Table 10 Themes about concerns in using analytics

Privacy and informed user consent were identified as bottlenecks to working with analytics, yet ignoring them might lead to the misuse and abuse of data. Respondents also raised concerns about ethical issues and the possible misuse of data, as well as security challenges and the protection of students’ data. Others mentioned issues of data ownership, a lack of data-sharing agreements between departments and units, and a lack of data governance models.

The increasing use of data and the automation of processes were seen as dehumanising, and over-reliance on analytics and data can lead to the marginalisation of individual students and departments. Further, some respondents saw the adoption of analytics in higher education as part of a neoliberal agenda of compliance and control. For example, one participant said: “Analytics triggers managerialism in higher education” [participant 3]. There was also a fear that data can be manipulated to wrongly justify individual decisions, as one respondent stated: “massaging data to achieve an end” [participant 4].

Lack of training and limited data literacy were identified as issues. Others stated that many people in leadership roles do not value the potential benefits of analytics.

Development of strategic capability―making sure that institutional leaders have basic data literacy and understand what is necessary to make their institutions data-informed [participant 1].

Implementation capability―making sure that institutions line up all the necessary capabilities so that the implementation of analytics may happen (technologies, relevant questions, professional development, easy access to analytics results/tools, etc.) [participant 2].

Summary and Future Research

The use of analytics and data-driven decision-making in the context of higher education is key to institutional productivity (Daniel 2015; Lawson et al. 2016). More specifically, the deployment of predictive analytics within higher education institutions enables decision-makers to make better decisions and enhances their probability of success. However, the need for data-driven decision-making in the higher education sector goes far beyond the collection of data to fulfil operational requirements. It requires a much broader understanding of how to work with analytics to harvest, process, analyse and distil useful insights from data in order to make better decisions on critical issues (Daniel 2019).

Greer et al. (2016a) noted that academics tend to overemphasise the need for evidence-informed decision-making; however, when it comes to making the necessary changes to teaching and learning environments based on evidence, many are hesitant to embrace change (Greer et al. 2016b). The authors suggested that the use of analytics for informed decision-making in higher education needs to scale to all levels of the institution, for instance to the departmental and course levels, to facilitate ownership of research and to define the context for student success.

Others noted that data-driven decision-making helps institutions deal effectively with the growing diversity of the student body, declining admission and graduation rates, and dropout rates (Prinsloo 2019; Vytasek et al. 2020). The use of analytics helps educators to develop more personalised, adaptive, and scalable responses to the vast array of students’ educational backgrounds and skills (Moreno-Marcos et al. 2020).

This study explored the extent to which higher education institutions in New Zealand value and utilise analytics to enhance the quality of decision-making. The study aimed to provide an overview of the value, use and potential benefits of analytics in higher education in New Zealand. We explored the use of analytics to address key challenges higher education institutions face and the broader implications of using analytics for the quality of decision-making in the sector. The key findings revealed that higher education institutions in New Zealand are in the early stages of embracing analytics. Respondents reported using analytics for monitoring operational performance within institutions and for producing reports from transaction-level data. Analytics was predominantly used in the area of student services (e.g. enrolment management, student progress).

There was a strong indication in the data that the deployment of analytics in higher education can enhance the quality of strategic and operational decisions. The data showed that despite the availability of adequate data infrastructure, especially in areas such as student information systems, human resource systems, admissions and financial systems, many institutions have not yet fully leveraged the potential benefits of analytics. However, those institutions that reported having fully embraced the power of analytics have started to see improvements in the quality of their decisions, especially in meeting strategic goals. Participants said that the institutional use of analytics could play a significant role in improving student learning outcomes, student enrolment, grant management, and finance.

Further, analytics can foster a better understanding of areas such as student progress, student engagement, student enrolment, student retention issues, the quality of research and teaching, the optimisation of limited resources in institutions’ infrastructure, and many other issues associated with higher education institutions. The value of analytics within the higher education sector reported in this research is congruent with what is reported in the literature (El Alfy et al. 2019; Herodotou et al. 2019). However, the current study identified several challenges and concerns that can limit the broad adoption and use of analytics in higher education within the New Zealand context. We identified challenges relating to access to data and the interpretation of results, a lack of technical infrastructure to support the full-scale deployment of analytics, and a lack of data governance models, which makes access to and sharing of data difficult.

Decision-makers require various forms of information, and coordinating these diverse information needs was another challenge. There was also the difficulty of gaining leadership support for deploying analytics at a broader scale. In addition to the challenges identified, there were concerns and fears about the broader adoption of analytics and its impact on people, data security, data ownership, privacy, and ethics. Overall, the limited engagement with analytics in New Zealand’s higher education sector mirrors the limited adoption of analytics in higher education more broadly. For example, Gašević et al. (2019) reported a lack of institutional examples demonstrating the systemic adoption of learning analytics to improve student learning. Similarly, recent European research found that few higher education institutions have learning analytics strategies, and implementation remains limited; it also revealed that teaching and support staff predominantly use the outcomes of learning analytics, while managers engage with learning analytics to influence institutional and teaching decisions (Tsai et al. 2020).

Although the present study has provided a general overview of analytics and its potential in the New Zealand higher education sector, the study is descriptive rather than analytical, and the findings only provide baseline data for discussion. The study used a small sample size and self-reported data, which were not triangulated with interviews or institutional policies. Participation in the study was voluntary, so the generalisability of the findings is limited. Although the use of factor analysis helped to identify critical factors, the data used in the study did not wholly meet all the conditions for employing EFA techniques. For instance, EFA typically requires variables measured on interval or ratio scales, whereas the study used Likert scales. Also, as indicated earlier, the study had a small sample size, although it was above the minimum threshold (Winter et al. 2009). In general, EFA is limiting in that the technique assumes that only components with strong correlations are to be included, and variables with low correlations may therefore be excluded even though they might be important.

As the field of analytics continues to develop within the higher education sector, institutions are encouraged to explore analytics and to develop systematic strategies, systems and policies (notably for data governance, privacy, and tools) when implementing analytics initiatives. As analytics is an emergent phenomenon in higher education, future work is required to develop a consolidated framework for the deployment of analytics and for measuring the impact of large-scale deployment on teaching, learning and research. To advance and better guide decision-making processes within the sector, future research is needed to undertake a comprehensive analysis of how analytics is implemented and of its impact on organisational performance, student learning outcomes, and the quality of learning and teaching. Further research is also needed to examine analytics policy implementation and the role of predictive modelling, statistical forecasting, complex data querying, optimisation, and causal analysis in supporting effective data-driven decision-making in higher education.

A Festschrift in Honour of Jim Greer

We dedicate this article to honour Professor Jim Greer’s foundational work in Artificial Intelligence in Education (AIED) and learning analytics. Jim was a leading scholar in AIED, a pioneer in research on user/learner modelling, and a strong advocate for the use of analytics to inform teaching and learning. He was a Senior Learning Analytics Strategist at the University of Saskatchewan, Canada. In later years, Jim’s research on learning analytics focused on leveraging the power of institutional student data to provide effective diagnostic approaches and tools for understanding learning and teaching problems, and to provide practical strategies to policy-makers, faculty and students to mitigate these problems.

Ben first met Jim and the Advanced Research in Intelligent Educational Systems (ARIES) team at a conference in Japan in 1998, where he expressed interest in studying Artificial Intelligence in Education (AIED). In 2000, Ben joined the ARIES Laboratory in the Department of Computer Science at the University of Saskatchewan, Canada as a PhD student; at that time, Jim was the co-director of the ARIES lab. Upon completing his PhD, Ben took up a position as a Health Research and Innovation Analyst at one of the largest healthcare providers in Saskatchewan but kept his AIED research active. He returned to academia almost a decade later, where he continued to be inspired by Jim’s work in data-driven decision-making and research in advanced learning technologies. The article presented in this special issue is in honour of Jim’s contribution and inspiration in the use of analytics to transform learning and teaching. For the last eight years, Ben has continued to research learning analytics, Big Data and Data Science in education and to supervise graduate students in these areas. Hamid was his doctoral student, studying the value of Big Data and analytics in improving the quality of decision-making in higher education.