Big Data and Changes in Audit Technology - Contemplating A Research Agenda
To cite this article: George Salijeni, Anna Samsonova-Taddei & Stuart Turley (2019) Big Data
and changes in audit technology: contemplating a research agenda, Accounting and Business
Research, 49:1, 95-119, DOI: 10.1080/00014788.2018.1459458
a Alliance Manchester Business School, University of Manchester, Manchester, UK; b Faculty of Business and Society, University of South Wales, Newport, UK
This study explores the most recent episode in the evolution of audit technology, namely the
incorporation of Big Data and Data Analytics (BDA) into audit firm approaches. Drawing
on 22 interviews with individuals with significant experience in developing, implementing
or assessing the impact of BDA in auditing, together with publicly available documents on
BDA published within the audit field, the paper provides a holistic overview of BDA-
related changes in audit practice. In particular, the paper focuses on three key aspects,
namely the impact of BDA on the nature of the relationship between auditors and their
clients; the consequences of the technology for the conduct of audit engagements; and the
common challenges associated with embedding BDA in the audit context. The study’s
empirical findings are then used to establish an agenda of areas suitable for further research
on the topic. The study is one of the first empirical accounts providing a perspective on the
rise of BDA in auditing.
Keywords: auditing; methodology; data analytics; Big Data
1. Introduction
The extant auditing literature contains considerable evidence of how audit firms have responded
to public questioning of the quality of audit work and the social relevance of the audit function
with methodological developments designed to restore confidence in the effectiveness of the
audit process (Carpenter and Dirsmith 1993, Power 2003, Curtis and Turley 2007, Robson
et al. 2007). It has been argued, for example, that the introduction of statistical sampling, the
audit risk model and Business Risk Auditing sought to represent auditing as a scientific
process to promote a picture of the seemingly objective, almost ‘factual’ nature of evidence gath-
ering. The use of confidence levels and the determination of materiality underpinning statistical
sampling and the audit risk model provided the metrics through which professional judgements
were easily converted into ‘facts’ (see Carpenter and Dirsmith 1993 for statistical sampling;
MacLullich 2003 for Audit Risk Model). This paper focuses on the most recent episode in the
evolution of audit technology, namely the incorporation of Big Data1 and Data Analytics
(BDA) into audit firm approaches. While the rise of BDA is clearly not the only example of meth-
odological experimentation in auditing over the years, it is a particularly intriguing case as it
brings out concerns as to whether Big Data-driven audit environments, by placing significant
technical demands on auditing and increasing its remoteness, may in fact be contributing to the
marginalisation of the audit function itself (Alles 2015). One possibility, therefore, is that BDA may
eventually be seen as a disruptive audit innovation. This study provides the first empirical
account of the developments, methodological challenges and possible consequences of the adop-
tion of BDA in audit methodologies with an ultimate aim to generate a meaningful agenda for
further research on the topic.
Big Data relates to the nature of data (Gartner 2012, Arnaboldi et al. 2017) whereas Data Ana-
lytics refers to the collection of tools developed to make sense of Big Data. We refer to ‘BDA’ to
capture the relationship between the two terms as well as suggesting that the audit technologies
being investigated in this study relate to those interrogating Big Data. The potential impact of
BDA as a major development in the delivery of audits has been recognised in the professional
auditing community internationally (IAASB 2016). The predominant view of the significance
of Big Data for auditing is reflected in the following comment from one professional body:
The quantity of data produced by and available to companies, the replacement of paper trails with IT
records, cloud storage, integrated reporting and growing stakeholder expectations for immediate
information – any one of these alone would affect the auditing process, but Big Data is bringing
them all, and more, at the same time. (ACCA 2015)
As a consequence, large international audit firms, particularly the Big Four, have all made state-
ments about the development of BDA tools to navigate and make sense of the vast data stores now
available within and outside client organisations (Accountancy Futures Academy 2013, EY 2014,
IAASB 2014b, KPMG 2014b) and invested significant amounts in projects designed to advance
the technology (Brown-Liburd et al. 2015, Cao et al. 2015). Williams (2013, p. 556) argued, in
this regard, that a technology such as BDA could ‘come to play a greater role in the regulation of
[auditors] themselves, imposing a greater visibility and transparency on their activities’. Such
statements by auditors and audit commentators point to the developments around BDA in audit-
ing as a major area of innovation in auditing worthy of investigation and provide a context for
this paper. To date, audit scholars have devoted most attention to contemplating the potential
impact of BDA on auditing conceptually (Brown-Liburd et al. 2015, Cao et al. 2015), with
little being known about how developments affect the actual practice of auditing in the setting
of an audit firm.
In contrast to much of the initial literature on this subject, which offers mainly normative accounts
of the nascent developments in BDA and auditing and lacks empirically drawn insights, this
exploratory study makes a contribution to the debate by providing empirical insights into such
developments from the perspective of the auditors themselves. In particular, drawing on 22 inter-
views with individuals who are both highly experienced in auditing and have had significant roles
in developing, implementing or assessing the impact of BDA in auditing, together with publicly
available documents on BDA published within the audit field, the paper provides a holistic over-
view of BDA-related changes in audit practice, focusing in particular on three key aspects. First,
we look for evidence of the extent to which the development and use of Big Data is reconfiguring
the relationship between auditors and their clients, leading to changed approaches to the storage,
analysis and utilisation of financial and other information. Second, we investigate how investment
in BDA within audit firms is changing audit methodology, that is, the manner in which auditors
are conducting various steps of the audit process, from planning to evidence collection, analysis
and reporting. Third, we provide an overview of the common practical challenges of embedding
BDA in audit work. We then draw on the study’s findings to establish an agenda of areas suitable
for further research concerning the actual impact of data analytics on daily audit practice from
technical, economic and professional perspectives. In this regard, this study can be distinguished
from prior accounts acknowledging the need for more research into the influence of BDA on audit
practice (see, for example, Appelbaum et al. 2017) as we draw on the findings from fieldwork to
offer an empirically grounded discussion of issues that warrant further scholarly investigation.
The paper is structured as follows. The next section sets out the background context and exist-
ing literature on BDA, in general and in relation to auditing, and places BDA in the context of
broader developments in audit methodology over time. This is followed by a section describing
the approach employed to collect evidence on this emerging topic before the main analysis section
presents an evaluation of the status of BDA developments relative to auditing. The final discus-
sion and conclusions section summarises findings and contemplates an agenda for future research
on the evolving field of financial statement auditing and BDA.
In the 1990s, the world’s largest audit firms made significant efforts to influence public perceptions of
the Business Risk Audit model as a major innovation in audit methodology with ‘a potential to
enhance audit effectiveness … [and] the best way in which an auditor will be able to recognise
management fraud and business failure risks’ while, at the same time, dismissing concerns that
the technology could be used to ‘redefine auditing as consulting and to facilitate identification
of opportunities for providing value added services to clients, with the intention of improving
the status and profitability of the auditor’ (pp. 439–440). The rise of BDA in auditing should,
therefore, be understood in the longer term historical context of the above developments in
audit technology and auditors’ attempts to promote technological solutions for the problem of
maintaining (or restoring) the legitimacy of the audit function.
In the audit context, BDA has been commonly defined as:
the science and art of discovering and analyzing patterns, identifying anomalies, and extracting other
useful information in data underlying or related to the subject matter of an audit through analysis,
modeling, and visualization for the purpose of planning or performing the audit. (AICPA 2014, p. 5)
Comprehensive discussions focused around BDA and auditing first started in North America,
with audit commentators and regulators making statements to acknowledge that the audit environ-
ment is becoming a Big Data environment. For instance, in 2012 the American Institute of
Certified Public Accountants (AICPA), while not specifically mentioning Big Data, launched a number of
initiatives designed to highlight the importance of data in the provision of audit and assurance
services, including the establishment of the Assurance Service Executive Committee Technol-
ogies Task Force to develop guidance and foster conceptual debate on the role of Big Data in
audits (Zhang et al. 2012).
While it is often acknowledged that the rise of BDA in auditing is still in its early stages, the use of
analytics in the audit process is not necessarily a recent phenomenon. Audit firms have been using
computer-based analytical tools since as early as the 1960s when the firms first developed Com-
puter Assisted Audit Techniques to interrogate data in ways that could assist procedures such as
audit sampling (Cushing and Loebbecke 1986). Some years later, in the early 1980s, software
vendors started to offer off-the-shelf analytical tools such as Audit Command Language (ACL)
and Interactive Data Extraction and Analysis (IDEA), which later came to be used by the majority of
audit firms as part of their analytical suites. The latter two technologies may be seen as the pro-
totypes of what is now regarded as BDA in terms of their ability to tackle large data sets. What
also paved the way to the current BDA architecture were changes in the manner in which audit
clients manage information through technology. Some prominent examples include the platforms
collectively classified as Enterprise Resource Planning (ERP) which enable companies to store
and process information flows between different business units in an organisation (O’Leary
and Markus 2006). It is the proliferation of technological processes and mechanisms, including
those mentioned above, aimed at capturing and processing vast amounts of information, that
has given rise to the acknowledgement within accountancy firms that contemporary business
environments are increasingly characterised by the digital phenomenon called Big Data, and
that this has relevance for the audit, too.
All the major Big Four audit firms have invested significant amounts of money in recent years
to either acquire or develop BDA tools. For example, KPMG’s 2014 Transparency Report stated
that the firm, in collaboration with technological companies (such as the McLaren Technology
Group, better known for Formula One racing), had established a $100 million (£74.8million)
investment fund aimed at developing data analytical capabilities that could significantly transform
audits and create value for clients. Similarly, EY has committed US$400 million to develop audit
innovation, including new audit support tools such as BDA (Ernst and Young Transparency
Report 2014). PwC, meanwhile, announced that it had developed Halo, an in-house analytical tool to
replace ACL and IDEA and to become ‘a next generation software application that analyses and
assures data using a suite of algorithms’ (PwC 2014b). Other, particularly smaller, audit firms often
make use of off-the-shelf analytics software such as Spotlight, Lavastorm and Alteryx (ICAEW
2016) to leverage their BDA capabilities.
Responding to the above developments in audit practice and methodology, recent audit litera-
ture has offered some initial assessment of the potential relevance of BDA to auditing, particularly
in areas of risk assessment and performance of substantive and analytical procedures (Alles 2015,
Cao et al. 2015, Krahel and Titera 2015, Wang and Cuthbertson 2015, Yoon et al. 2015, Alles and
Gray 2016, Appelbaum et al. 2017). Cao et al. (2015), for example, suggest that BDA provides
opportunities for auditors to navigate messy data at a faster rate and to create patterns and trends
which can provide a more microscopic visualisation of risks associated with the audited entity,
assisting in determining the level of materiality. Yoon et al. (2015) evaluate Big Data as a complement
to more traditional audit evidence, judged against conventional criteria of sufficiency,
reliability, relevance and cost benefit. Further, these papers also note that a significant aspect of
BDA in auditing is the challenge that large amounts of information can create for the auditor’s
professional judgement. Brown-Liburd et al. (2015), for instance, argue that, while BDA offer
auditors a plethora of ways of analysing data, issues such as the ambiguity of information, infor-
mation overload and inability to navigate data to identify relevant information could undermine
the benefits of utilising BDA. Krahel and Titera (2015) suggest that such shortcomings could be
addressed through the incorporation of BDA in formal auditing standards to eliminate uncertainties
in some areas of judgement. In this regard, auditing regulators at national and international level
have also acknowledged the potential effects of Big Data on auditing. Appelbaum et al. (2017)
identify areas of auditing most affected by the rise of BDA and provide an overview of the
content of relevant audit regulations and research to formulate a range of topics that require attention
from the audit firms, standard setters and academics. While highlighting the pressing need for more
empirical work into the developments with BDA and auditing, the paper does not explore such
developments through a form of empirical engagement directly with audit firms themselves but
draws its conclusions from an assessment of publicly available information. In 2013, IAASB estab-
lished a taskforce to investigate the effects of BDA on auditing, but, to date, no action has been
taken to signal the need to revise the auditing standards beyond the publication of discussion
papers seeking the views of relevant stakeholders (IAASB 2014a, 2016).
The existing audit literature, therefore, provides predominantly technique-oriented, normative
accounts of auditors’ engagement with BDA, making inferences as to the possible as opposed to
the actual impact of such developments on contemporary audit practice. In contrast, this paper
offers instead an empirically informed analysis of the developments with regard to the growing
utilisation of BDA in audit work drawn from the personal reflections of auditors themselves
and other documentary evidence on the relevant topics. The resulting empirical findings are
then used as a basis for contemplating a stimulating agenda for future research into the rise of
BDA in auditing.
3. Research methodology
In this study, we have adopted a qualitative approach to explore the issues pertaining to the rise of
BDA in auditing and the effects of those on the practice of auditing. The paper relies on evidence
collected through semi-structured interviews as well as a wide range of documentary sources.
Potential interviewees for the study were first approached during the Accountancy Europe
(formerly known as Fédération des Experts Comptables Européens (FEE)) audit conference in
June 2015 in Brussels themed Long Term Vision and Short Term Challenges, where two
leading ERP developers (Oracle and SAP) and two of the Big Four audit firms (KPMG and PwC)
made presentations on the implications of Big Data for financial reporting and the relevance of BDA for
audits of financial statements, respectively. Attendance at this event provided an opportunity to
discuss informally with auditors and stakeholders issues relating to BDA. Some of the discussions
resulted in formal interviews to further explore the issues addressed in the informal meeting. A
‘snowballing’ approach was followed whereby, drawing on these initial interviews, we developed
a network of contacts of individuals with knowledge of and experience in BDA tools in the
context of an audit. As a result, between July 2015 and January 2017, we conducted 22 semi-
structured interviews with 20 individuals (see Table 1), including auditors from the Big Four
and mid-tier audit firms mainly in the UK but also in EU countries such as Belgium and Italy
as well as members of the audit regulatory bodies in the UK and Belgium. All interviewees
have had a particularly active involvement with BDA in the audit field, and together they represent
three types of interaction with BDA: (i) responsibilities with respect to the development of BDA
methods within audit firms, (ii) implementation of BDA on audit engagements and (iii) evaluation
of the impact of BDA for standard setting and other regulatory perspectives. Many have spent
lengthy careers as practicing auditors and hold senior positions within their firms’ hierarchies.
Furthermore, several interviewees from the Big Four and mid-tier audit firms also hold positions
as members of the IAASB Data Analytics Working Group (DAWG), which indicates their high-
level role with regard to issues around audit methodology and BDA in their respective firms. Four
participants from the audit firms were interviewed twice and three representatives of audit regu-
lators were interviewed as a group.
Interviews ranged in length between 30 minutes and 2 hours and were tape-recorded. During
the interviews, participants were asked questions following an agenda designed to cover a range
of topics (for an overview, see Appendix), including their role and how it is related to BDA; the
key developments with regard to BDA within their respective organisation and in the audit field in
general; what aspects of BDA they consider most useful/relevant to them personally, their organ-
isations and their clients; the current state of development with regard to BDA and any related
challenges; and the impact of BDA on various aspects of the audit process and its organisation.
Participants were encouraged to discuss points beyond the structure of the interview questions but
which could be considered as relevant to BDA in the audit of financial statements.
In addition, one of the authors attended five one-to-one practical sessions where the use of
BDA tools was demonstrated in a practical setting, with a possibility for follow-up questions
and discussion. One session was conducted on the premises of a mid-tier audit firm and the
remaining sessions were in the offices of the Big Four firms. The sessions were between 20
and 45 minutes long and focused on the practical application of BDA tools in areas such as assess-
ment of internal controls, analysis of journal entries, risk assessment, compliance with particular
auditing standards (e.g. ISA 240) and visualisation, among others. Due to concerns over compe-
tition, the author was not allowed to voice record the sessions. However, notes were taken and
sketches of the tools were drawn whenever possible during and immediately after the
sessions.
We also supplemented the evidence from interviews with analysis of a wide range of docu-
mentary sources, including materials from websites of audit firms, audit regulatory agencies
and professional accountancy bodies; and the audit firms’ transparency reports and presentations
relating to the topic of BDA in auditing (see Table 2).
We adopted an interpretive approach to analysing the empirical evidence collected. Consider-
ing our primary objective is to provide a holistic view of the rise of BDA in auditing that could
enable a meaningful research agenda, we sought to capture in our analysis the complexity of the
rise of BDA not only as an instance of technological change in the delivery of an audit but also a
social phenomenon that has potential consequences for the relevance of auditing as a professional
practice. In particular, with regard to the analysis of the interviews with audit practitioners, we
sought to understand how they are relating to BDA technology and the manner in which their
engagement with these tools has an impact on their approaches to audit work and their under-
standing of their role as an auditor. In addition, interviews with members of the regulatory com-
munity provided further context on the way in which BDA developments in auditing are being
considered for their potential impact on issues such as the notion of audit quality, audit profession-
alism and auditor ethics, although regulatory matters are not explicitly the focus of this paper. We uti-
lised Nvivo, a qualitative data analysis tool, to identify recurring ‘expressions’ and ‘narratives’
(O’Dwyer et al. 2011, Humphrey et al. 2017) that were representative of the most commonly
expressed views as to the implications of BDA for auditing. While analysing the interviews,
however, we were aware of the potentially problematic nature of the audit environment, especially
at a time when a change in practice is being actively advocated. In this regard, Alvesson and Karreman
(2011), for example, observe that the language through which research participants
express themselves is not only a manifest representation of the reality they are in but also a
latent attempt to reconstruct (desired) reality. Therefore, findings from the interview analysis
were repeatedly corroborated by reference to the documentary evidence collected in order to dis-
tinguish between auditors’ latent attempts to promote BDA and manifest representations of actual
developments relating to BDA in auditing. Our analytical approach led to the identification of
three broad analytical themes that will be the subject of discussion in the following section of the
paper, namely: (i) the effect of BDA on the nature of the relationship between auditors and their
clients; (ii) the influence of the technology on the execution of the various steps of the audit
process and (iii) the common challenges of embedding BDA in the conduct of the audit.
organizational goals, and how reporting elements are processed and assembled’ (Warren et al.
2015, p. 397). Our interview analysis suggests the significance of these issues as clients’
growing reliance on BDA tools yields opportunities for auditors to process and make sense of
the large data sets relating to the clients’ business systems:
We start using functions inherent in financial systems. A good example is SAP [‘Systems, Appli-
cations, Products’ – authors] control framework built which uses functionalities like authorizing
invoices. When I first started auditing we had to look at the invoices and make sure someone has
signed them. You don’t do that anymore. A much better way is to pull a file from the financial
system and see how many invoices were not authorized because it is electronic. There are different
ways around it, and that is set to develop. [D2]
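The kind of exception test described in this quotation can be pictured with a minimal sketch in Python/pandas, assuming the auditor receives a flat file export of invoice records from the client’s system. The file name and field names (invoice_id, approved_by and so on) are hypothetical illustrations rather than features of any firm’s actual tooling.

import pandas as pd

def unapproved_invoices(invoice_file: str) -> pd.DataFrame:
    """Return invoices in the export that carry no electronic authorisation record."""
    invoices = pd.read_csv(invoice_file)
    # Treat an empty or missing 'approved_by' field as an exception for follow-up.
    approver = invoices["approved_by"].fillna("").astype(str).str.strip()
    return invoices.loc[approver == "", ["invoice_id", "supplier", "amount", "posting_date"]]

if __name__ == "__main__":
    exceptions = unapproved_invoices("invoice_export.csv")
    print(f"{len(exceptions)} invoices without electronic authorisation")
    print(exceptions.head())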
Furthermore, the clients’ growing interaction with Big Data tools, according to some intervie-
wees, leads to changing expectations as to the kind of audit (and audit evidence) that is appro-
priate for their business. The following interview excerpt illustrates this point:
If you think of a traditional audit, your data sources are ERP and Financial Systems. However, now
clients, stakeholders and others also want to hear about unstructured data, they want you to talk about
emails, external data and so on. [D1]
Another particular feature of BDA pointed out in the interviews by auditors is that it improves
their ability to communicate the outcomes of the audit in a way which their clients perceive to
be more understandable and adding value. By providing an abundant supply of what Earley
(2015, p. 3) referred to as ‘visually appealing analytics to impress clients’, BDA is used by audi-
tors to visualise the thought process behind auditor judgement – such as with regard to the identi-
fication of key areas of potential risk, hence contributing to higher perceived quality of audit.
Specifically, some interviewees pointed out that Big Data supplied by the client and processed
within an audit firm environment can facilitate an informed dialogue and comparison between
an auditor and the client around issues of risk assessment. Likewise, according to the audit
firms’ reports, analysis of large data sets can help ‘highlight behavioral outliers and … direct
audit activity toward areas of greatest risk’ (Deloitte 2014, p. 6). Interactive visualisation sessions
between an auditor and the client are now common features of audit engagements where often
highly complex analytical representations are used to represent information in a way which high-
lights particular issues that are of concern to the client:
I get access to all those transactions that are the mechanics of how the business works; but for risk
assessment, I can summarise all that information up for the client into a single graph, a picture set
of analyses. In a sense, [BDA] is taking the transactional information and generating high level infor-
mation for the auditor to supplement the previous approach to business risk assessment; and the hoped
outcome of this is that you have more confidence in your risk assessment. [P1]
We have had some feedback from one of our biggest clients when we first introduced the major piece
of analytics into their audit, this probably goes back to four to five years now since we first started.
They have huge shared service centres and the guy who runs the centres commented at the end, ‘now
we feel like we know what has really been audited this year’. [P5]
Many of the auditors interviewed also referred to the role that BDA can play in helping resolve
disagreements between clients and their auditors, specifically on areas involving a significant
amount of judgement:
If you go to that client [in this case a gas company] and say well, we have checked the last twenty
years of data and we have predicted the fine pattern for the next year, or the gas price from all our
calculations. There is no way you are going to make revenue X. It is going to be way less, and we think
you should impair. That’s what it will do. It will give weight of evidence so those professional judge-
ments conversations are based on facts. [P2]
More recently, there is an emerging trend where the significant investment in BDA undertaken by
some large audit firms generates interest among their clients seeking to develop systems that use
analytics as part of decision-making. For example, auditors’ use of analytics in Big Data environ-
ments is having an effect on changing the practices and processes clients use to evaluate aspects of
employee performance and planning, as illustrated by the following comment from an interview
in one audit firm which has developed a financial stress index to measure staff efficiency:
We are now actually finding our clients are asking to use our extraction tools to get their own data into
other parts of their business, particularly internal audit. So, that is really interesting and we as external
auditors are often better able to give our clients the data they need but, once we have got the technol-
ogy to do it, we are perfectly happy to allow the company themselves to re-use that technology. What
we do not do is give them our analytics because that would be a breach of independence. [P1]
The above interview excerpt shows the efforts of the audit firms to drive their clients towards a
greater appreciation of the usefulness of Big Data and BDA to their business. Among other things,
some audit firms administer surveys to highlight the areas where clients would require data ana-
lytics. In one such survey, one of the Big Four audit firms called for Big Data to become a boardroom
issue and for businesses to become ‘analytics driven organizations to create
value’ (EY 2015). Such, mainly self-sponsored, surveys serve as a marketing tool to draw existing
clients into the Big Data space and to attract new clients. Importantly, they may also serve to show
how the major professional firms seek to become leaders rather than followers in the ‘Big Data
curve’.
The above discussion reveals how BDA is drawn upon by the firms as a basis for broadening the
range of services offered to clients, particularly those more advisory in nature, subject to the limits
that are imposed by regulation on the joint provision of audit and non-audit services. The manner
in which BDA findings are used by firms is, therefore, relevant to the long-standing debate on
whether joint provision can affect the auditor’s independence, notwithstanding the increased
regulatory constraint of recent years (Earley 2015). This excerpt from a public report by one of
the Big Four firms is particularly revealing in this regard:
We see audit as an opportunity, not an obligation, we go broader and deeper beyond statutory require-
ments, to realize the value of the data that’s available. We help organizations take stock of their finan-
cial statement, learn about their performance, understand where they could be doing better, and
prepare for what the future may bring. (Data and Analytics – Unlocking the Value of Audit, KPMG
2014b, p. 22)
The illuminating excerpt above highlights a number of issues. The emphasis on helping
organisations to take stock and learn suggests a portrayal of the role of the auditor as someone
there to assist the client rather than to act as a watchdog over their financial reporting. From
this perspective, the BDA-driven audit process effectively becomes a learning process for auditors
as it provides ways to map out the future of the client’s business and hence test potential areas
for promoting other services. Another excerpt from the same document reinforces this
observation:
[w]e can more easily identify trends and anomalies for further investigation. It allows us to give organ-
izations greater insights into their past performance. And this in turn enables them to take stock of
their processes and activities and adjust them to improve performance. (KPMG 2014b, p. 4)
In addition, the idea that audit is not an ‘obligation’ but an ‘opportunity’ points to a possible
reconceptualisation of the purpose of auditing in the era of data analytics as a function supporting
the interest of the client (in unlocking an organisation’s potential) rather than as a more conven-
tional verification exercise. The critical issue here is the way in which the output from BDA, and
the potential insights it offers, can affect the ‘mindset’ of the auditor, possibly sub-consciously,
towards a supportive or consultancy view of the audit itself. The following interview excerpt
exemplifies this kind of concern:
A big threat with BDA is around independence or perceived independence. We have a technology
where once a week we can pull results out for the client, we can produce dashboards for them
which show them stuff. […] The value-added piece of that data gives rise to some real issues as to
where do we draw the line and how can we stop the client integrating what we are doing as part of
their control process – because they want that kind of insight to come through from us. So, there
are real issues around independence here. [D1]
They [clients] can see this is really beneficial for them and we’ve got clients who ask us, ‘Can we buy
your software?’, and the answer is, ‘No’, because this will have an impact on our audit work and inde-
pendence. [P7]
It also has to be noted, however, that, whilst many audit clients are eager to involve their auditors
in exploring the opportunities presented by BDA, there are also those that are less keen on
opening up their ERP systems to BDA, pointing to concerns over the motives behind the use
of BDA as well as issues relating to data security (Ramlakun 2015). These client concerns
may, in turn, affect auditors’ ability to access the data that are relevant to an audit:
[Some] clients are really sensitive to the idea that their information is used in bench marking because
they don’t want to give away any competitive advantage. [P5]
No matter how easy you make it when you use analytics, the client has to provide you with data. And
therefore, if a client doesn’t use analytics, perhaps had used analytics with little benefit in the past, or
simply is not particularly open to the idea then it can be difficult to use analytics in an audit because
the client is not keen to provide the data. [D2]
By combining the leading edge predictive analytics capability of MAT with our own data and analytics
expertise, we are set to transform our approach to audit to deliver greater quality, value and actionable
insights. (KPMG 2014a, p. 12)
We’re excited about data-enabled auditing, and also about new ways of reporting that reflect society’s
changing expectations of performance and value. (PwC 2014a, p. 7)
The above bold statements inevitably invite questions about the particular areas where BDA is
seen as having an especially significant impact on audit conduct. Analysis of our interviews
point to two such key areas, namely (i) organisation of delivery of the audit, and particularly
possibilities for further outsourcing audit-related tasks and processes, and (ii) the execution of
various steps of the audit process.
With regard to the first point, developments in BDA seem to endorse existing accounts of the
data-driven technological innovations in audit as generating greater opportunities for audit out-
sourcing and making remote auditing a reality (Teeter et al. 2010). Here, the interviewees from
audit firms stated that, as a result of BDA tools being used to collect relevant information from
their clients’ ERPs irrespective of their location, it is increasingly the case that their audit
teams do not need to be physically present at the client premises. Furthermore, this increased
data availability and mobility has meant that the auditing procedures that are regarded as repeti-
tive (such as re-computations) are now assigned to and performed at centralised shared services
centres. Most large firms now have such centres in countries where the level of IT expertise is high
while labour cost is low:
We are going to have more auditors in India, in the Philippines etc. etc. Because that is where our
clients have shared services. It means we [audit firms] are bringing control and delivery of the
audit closer to central partner team. [D2]
Some interviewees argued that the outsourcing of mundane audit tasks can free up much needed
time for auditors to concentrate on complex issues and concerns that require professional judge-
ment. It is easy to see, however, how these claims may face sceptical reactions from some who
point to auditors’ emphasis on minimisation of the cost of audit to maintain competitiveness.
As regards the execution of auditing steps, our empirical analysis points to a number of technical
areas where BDA may be useful to auditors, including those relating to audit planning and risk
assessment, evidence gathering, performance of substantive and analytical procedures, among
others. These potential impacts stem from a range of unique opportunities presented by BDA,
for example, an enhanced ability to navigate messy data at a faster rate, to identify material excep-
tions based on the analysis of the whole population, or to fully verify transactions and, if necess-
ary, each individual calculation in areas where significant judgement is involved. While most,
especially large, audit firms have already automated their audit process, from client acceptance
to record management, BDA tools are adding another dimension by introducing data analytics
procedures to the existing automation trail, while at the same time making an audit process trace-
able through a digital footprint. The following interview quotations demonstrate how data analytics
tools are utilised in areas of risk assessment:
There is essentially a digitalised map that’s telling us for any account balance what are the sources of
the numbers that are coming in to that account balance, and therefore, helping us really identify where
are the risks within that account balance. [P6]
We have updated our methodology which places much emphasis on how data analytics can be
used to support risk assessment with the view to the auditor being able to come up with the classi-
fication that means: you’ve got to do a lot of work, a little bit of work or pretty much no work.
And so, we have been building content [and] issue actual guidance to auditors and saying, ‘If you
do this then follow this decision tree, use analytics here, use analytics there, use other procedures
there’. [P1]
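The decision-tree style guidance described in the second quotation might be pictured, in very simplified form, as a mapping from an analytics-derived risk indicator for each account balance to a planned level of audit work. The sketch below is a toy illustration under that assumption; the indicator, thresholds and labels are invented for exposition and do not reflect any firm’s actual methodology.

import pandas as pd

def classify_work_level(risk_indicator: float) -> str:
    """Map a risk indicator for an account balance to a planned level of work."""
    if risk_indicator >= 0.7:
        return "extensive work"
    if risk_indicator >= 0.3:
        return "limited work"
    return "minimal work"

if __name__ == "__main__":
    balances = pd.DataFrame({
        "account": ["Revenue", "Inventory", "Prepayments"],
        "risk_indicator": [0.85, 0.40, 0.10],   # e.g. derived from prior analytics
    })
    balances["planned_work"] = balances["risk_indicator"].apply(classify_work_level)
    print(balances)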
The above comments suggest that, through BDA, the firms appear to be pursuing a progressively
more structured approach where the execution of audit tasks is automated and auditors are encour-
aged to draw on predesigned templates. This is somewhat different from, for example, Business
Risk Audit which was motivated in part by a belief that auditors needed to move away from
substantive approach involving detailed testing towards a more analytical approach where pro-
fessional judgement is given prominence (Curtis and Turley 2007).
Further, one implication of BDA for the delivery of audits commonly discussed in the inter-
views concerns audit evidence gathering, both in terms of the nature of evidence collected and its
volume. Indeed, Big Data can, at least in principle, refer to a variety of unstructured information
relating to company performance that can be used in conjunction with traditional audit evidence
to support judgements and decision-making. BDA tools may also lead to wider audit scope by
making possible full population testing and more detailed analysis of accounting populations.
Auditors appear to be making efforts to link BDA to both operational scope and depth, i.e. the
two elements that, according to Power (1997, p. 18), constitute sufficient audit evidence.
As regards the operational scope of the audit and the volume of transactions to be tested, audi-
tors noted that BDA is providing a wider coverage of transactions and even an ability to look at
entire populations of journals as opposed to a few items, hence potentially addressing some of the
limitations of techniques such as statistical sampling in providing adequate coverage of the
client’s business:
Analytics can move you from haphazard sampling […] to statistical sampling [ … ] the amount of
audit effort is unchanged. I think we would all agree [that it] is more statistically valid in terms of
the outcomes you would achieve at the end when the data volumes are large.[…]. It is absolutely
100% transactions as opposed to some cursory examination by eye which just is not practical
beyond a few thousand. So, that is quite dramatically different. [P1]
[BDA] is allowing the auditor to slice and dice the segments of the journals they care about in order to
underpin a particular financial balance they are trying to audit. And on the back of that, and that’s the
analytical input they are now doing, [they are able] to say, ‘What do I know about the business, what
the management said and what I think?’, ‘What might be going in the industry around this topic?’, and
collectively bring all that together and conclude on their risk assessment. So, this is how you can do
more with the journals. [P5]
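The shift from item sampling to full-population coverage described in the first of these quotations can be illustrated with a minimal sketch that re-performs a simple arithmetic check over every transaction in an export and, for comparison, draws a conventional sample. The data file, column names and tolerance are hypothetical assumptions, not a reproduction of any firm’s BDA tool.

import pandas as pd

def full_population_exceptions(transactions: pd.DataFrame, tolerance: float = 0.01) -> pd.DataFrame:
    """Re-perform a simple arithmetic check over every transaction rather than a sample."""
    expected = transactions["quantity"] * transactions["unit_price"]
    differences = (transactions["recorded_amount"] - expected).abs()
    return transactions[differences > tolerance]

def traditional_sample(transactions: pd.DataFrame, n: int = 60) -> pd.DataFrame:
    """The conventional alternative: select a limited sample of items for testing."""
    return transactions.sample(n=min(n, len(transactions)), random_state=1)

if __name__ == "__main__":
    txns = pd.read_csv("sales_transactions.csv")
    print("Exceptions found across the whole population:", len(full_population_exceptions(txns)))
    print("Items covered by a 60-item sample:", len(traditional_sample(txns)))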
Furthermore, with respect to operational depth, which refers to the number of tests that can be
performed on the transactions (Power 1997, p. 18), auditors pointed out that the prevailing approach
has been to apply BDA tools that largely reflect the existing traditional audit procedures that were
previously administered manually – such as re-computations, re-calculations or reconciliations –
rather than using such tools to develop qualitatively different procedures:
Another area is re-computation, there are areas within financial statements that are systemically gen-
erated by the client and, as a consequence, we can systemically test them. So, a classic example is
depreciation interest calculations and certain aspects of provisioning. Historically, the auditors
would do either a test to say ‘does the maths seem to be working’ or do what is known as rational-
ization. [P2]
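A re-computation of the kind the interviewee mentions might, in outline, look like the following sketch, in which straight-line depreciation is recalculated for every asset in a fixed-asset register and compared with the client’s recorded charge. The column names and tolerance are illustrative assumptions rather than any firm’s actual procedure.

import pandas as pd

def depreciation_exceptions(register: pd.DataFrame, tolerance: float = 1.0) -> pd.DataFrame:
    """Flag assets whose recorded annual charge differs from a straight-line re-computation."""
    recomputed = (register["cost"] - register["residual_value"]) / register["useful_life_years"]
    register = register.assign(recomputed_depreciation=recomputed)
    differences = (register["recorded_depreciation"] - register["recomputed_depreciation"]).abs()
    return register[differences > tolerance]

if __name__ == "__main__":
    assets = pd.read_csv("fixed_asset_register.csv")
    exceptions = depreciation_exceptions(assets)
    print(f"{len(exceptions)} of {len(assets)} assets flagged for follow-up")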
While these practices do not produce analytical procedures that can be unequivocally attributed to
the rise of BDA, it is plausible that they serve as ‘legacy practices’ (see Power 2003) to soften
possible resistance to a more comprehensive rollout of BDA tools in the future and to ensure
smooth transition. As is evident from the recent overview by the Financial Reporting Council
(FRC) of the use of analytics in auditing (2017), journal testing is the most common area
where data analytics tools are used. This can be attributed to the regulatory requirement contained
in ISA 240 that auditors demonstrate greater rigour in evaluation of fraud risk factors. From this
perspective, BDA is seen here as a means to meet the regulatory imperative in areas of public
concern. The interview excerpt below further illustrates this point:
Audit needs to satisfy the FRC needs, so we communicate with them about what we have done and
why to give comfort that we have covered for big risks. Journal testing has been called out as poor
across all the Big Four, it has been called out as too formulaic and that not enough thought has
gone into what are the highest risk journals. We use analytics now to document the process, and rather
than picking a random sample size, the tool lets you filter down to focus on small number of high risk
journals. So the sample is chosen based on risk, which is a much better way of auditing. So data ana-
lytics is helping to make sure we don’t get any deficiencies around journals. [D2]
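The risk-based selection of journals described in this quotation can be sketched, under stated assumptions, as a simple scoring exercise: each entry is scored against a handful of indicators and the highest-scoring entries are selected for detailed testing. The indicators, column names and cut-off are invented for illustration and are not drawn from ISA 240 or any firm’s actual criteria.

import pandas as pd

def score_journals(journals: pd.DataFrame) -> pd.DataFrame:
    """Attach a simple additive risk score to each journal entry."""
    journals = journals.copy()
    posted = pd.to_datetime(journals["posting_date"])
    journals["risk_score"] = (
        (posted.dt.dayofweek >= 5).astype(int)              # posted at the weekend
        + (journals["amount"] % 1000 == 0).astype(int)       # suspiciously round amount
        + (journals["entry_type"] == "manual").astype(int)   # manual rather than system-generated
        + posted.dt.is_month_end.astype(int)                 # posted at the period end
    )
    return journals

def select_high_risk(journals: pd.DataFrame, top_n: int = 25) -> pd.DataFrame:
    """Return the entries with the highest risk scores for detailed testing."""
    return score_journals(journals).nlargest(top_n, "risk_score")

if __name__ == "__main__":
    entries = pd.read_csv("journal_entries.csv")
    print(select_high_risk(entries)[["posting_date", "amount", "posted_by", "risk_score"]])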
While the interview evidence cited above points to a growing reliance on data analytics tools in
the audit process, it is less clear whether these developments can or should be taken as a sure sign
that auditors are now operating in ‘Big Data’ environments. This scepticism is shared by the audi-
tors themselves, with some of our respondents voicing an opinion that the amount of transactions
that auditors analyse remains small, and so they effectively leave unexplored much of the information
that is labelled as Big Data. They argued, for example, that the use of social platforms such as
Facebook and Twitter has, at least at present, little place in the conduct of the audit, which
appears to contradict prior suggestions by some commentators that the resources available on
such social platforms could transform the way audits are conducted (see, for example, Cao
et al. 2015). The following comments by an audit partner and a director from Big Four firms
capture this sentiment:
I don’t believe [in Big Data]. For me there is this notion of Big Data but we are nowhere near it in our
organization and our business. Big Data is people like Facebook, Google – that is not us. We are not
those people. [P1]
If you think an audit as material misstatements and looking at assertions over financial statements,
clearly social media is not directly impacting any of those. [P4]
Likewise, our empirical analysis indicates that, despite some innovations, such as incorporating
a Google Maps facility into procedures to verify assertions relating to travel expenses, or utilising
large social data stores to establish quantifiable indicators of highly subjective categories such
as national culture (an approach adopted in one of the Big Four firms), it is still premature to
make claims about BDA having a transformational effect on the nature of audit evidence in a con-
ventional audit engagement.
Our interviewees point out that audit firms’ inability to roll out BDA to wider elements of the
audit process has much to do with a lack of required competence and expertise among auditors.
The firms’ senior representatives interviewed commented that their staff are more often than not
ill-equipped to deal intelligently with large amounts of information, and commonly face chal-
lenges from information overload and the difficulties of distinguishing relevant information
(Earley 2015, Krahel and Titera 2015). The large majority of auditors, including those from
large firms, do not possess the computer programming skills needed to instruct BDA tools to perform
various audit tests. As a consequence of such knowledge shortages, the promotion and operatio-
nalisation of BDA tools in auditing is often facilitated, and sometimes controlled, by specialists
such as data scientists who have little to do with the conduct of an audit, with auditors playing the
role of consumers rather than developers of their firms’ BDA capabilities:
We need people who are a lot more literate in using data. So, if you’ve got pilot data that you want to
use, you will have to identify and think about it in terms of data fields, data groups, data bases, how they
can be appended, how they can be cleared, exactly which data fields you need to put into whatever
form you are using, and how that all can be manipulated. A lot of those principles come with reason-
ably advanced excel usage and also knowledge of tools such as IDEA, SQL, and others. [D1]
Our interview evidence suggests, however, that the auditors are acutely aware of the need to retain
control over the application of data analytics and effectively to be able to operate almost as data
specialists themselves. Recognising this point, major firms have undertaken significant
investment in the establishment of the so-called architecture for analytics, at both local and global
network-levels, running their own training programmes and establishing external collaborations
to develop highly skilled in-house analysts. KPMG, for example, in co-operation with Imperial
College London, has established a £20 m Centre for Advanced Business Analytics focused ‘on
developing fresh approaches to working with large, ambiguous and complex data sets and trans-
lating it into potential solutions by developing new tools, methods and techniques to harness the
power of big-data’.3
The principle or objective if you like of this platform is to deliver analytics in a cost-effective way into
all of our audits without consideration of the ability of the auditee to pay. So, all our clients from top to
bottom have access to quality analytics irrespective of whether they are big or little clients. [P1]
In one interview, a case was cited where the presence of false positives in one particular engagement
created tensions between the engagement partner and a data analytics partner: the engagement
partner was concerned about the strain on the audit budget because more work was expected
on the outliers, which could later prove to be false positives. As the Data Assurance Partner
recollects:
I made a couple of audit partners very grumpy as they had to do additional procedures to just confirm
that the things I had found [through the use of BDA] wouldn’t adjust their audit opinion. The good
news was that it didn’t adjust their audit opinion. [P7]
Other interviewees suggested that an auditor’s over- (or under-) reliance on BDA outputs may be
a function of their level of confidence and experience. In particular, where auditors are not com-
fortable with the audit procedures performed by BDA, they may opt for repeating those pro-
cedures using traditional methods:
The reality is we could not get them [audit team members] to stop doing the manual things that they
have been trained to do, so they ended up doing all the manual things even though we did all the data
analytics, and then they didn’t have any time to do the exciting stuff. [P7]
You do often find an auditor will use analytics but will be asked by the senior person to also do the
traditional audit techniques, which will lead to double the work. [D2]
The question [auditors have] is ‘What is it actually going to bring to my audit other than cost me a lot
of money?’. So, it’s been a battle to say, ‘Yes, you did do analytics before but if you run these tests
over payroll you get a lot more comfort and the quality of your audit will improve’. [M3]
Another set of concerns relates to the evident tensions between auditors and data scientists
regarding their respective roles in embedding BDA into daily audit processes and routines. Historically,
auditors have relied upon experts from other fields as part of the process of arriving at professional
judgements (see Power 1997, O’Dwyer et al. 2011). While the same can be said about data ana-
lysts, the level of influence these specialists have in some audit firms seems to go well beyond
what would otherwise be expected of an expert external to audit. Many auditors interviewed
for this study point to data analysts being involved in the execution of key audit steps, such as
audit planning through design of computer scripts to extract data for initial risk assessment, or
performing tests such as re-computation or re-calculation to produce a report about the clients’
systems for use by auditors. Here, auditors rely on data analysts to provide them with an under-
standing of the client’s business. Further, in some firms interviewed, the data analysts, through
their lead partner, sign off any reports generated through BDA before they can be used by the
audit teams. This suggests that the construction of audit evidence is co-produced between auditors
and data analysts, thereby reconfiguring how sufficient audit evidence is derived:
We are already seeing that audit teams include more data analytics people; thus, people can understand
the database structures and data field. We have got different levels, we have got foundational people,
they are the architects, they understand how to take client data and turn it into useable formats. [M3]
There is also evidence that, in some instances, data analytics teams are responsible for selecting
BDA tools to be used by auditors, which creates tensions with audit team members who maintain
that the selection of audit toolkits should be left to auditors themselves:
I think, historically, [data scientists] have been seen very much as a support service, sort of ‘where
I don’t want to do journals testing, you do that, send my report back and I will review the exceptions’.
I think now they can no longer be seen as a support service. They are an integral part of the actual audit
team. […] So, the data group sit with the audit team, they are part of the audit team, they work very
closely with them. [M3]
Fundamentally, we need to put analytics into the hands of the auditors as opposed to the specialists
because the specialists are an expensive resource and they are finite resource, and the model could
only deliver it to the biggest, most prestigious clients. [P5]
being in their ‘infancy’. The prevalent view is that, at least at present, there is no need to contem-
plate changes in the current auditing standards as the existing versions provide sufficient scope for
auditors to experiment in relation to BDA:
The current audit model is not broken, and has served stakeholders well in the past. It has however
been constrained by technological limitations that no longer exist. Additionally, auditors find chal-
lenges in fitting the audit evidence derived from data analytics into the current audit approach required
by the auditing standards. (Minutes of a meeting of the IAASB’s DAWG 2015)
So, we are not seeing data analytics being used as much as we were expecting it to be. I suppose, in a
way, we have been seeing some data analytical tools being used for a long time, particularly around
the testing of journals, and so we see in the audit files that this is pretty much what audit teams are
using those tools for. [R4]
The question is as regulators we need to look at how auditors have used those analytics. You said in
the audit report that’s what you have done. We will then look at the audit file and one of the things we
do in our inspections is consider how accurate these descriptions in the audit reports are. Are they
really truly reflecting what it is that the audit team has done? So, yes firms are starting to put it
[claims about the use of BDA] into their audit reports more but the thing we are concerned about
is how fair is that representation? [R2]
Technical analytical lens: Impact of BDA on audit procedures and on the achievement of audit quality
Examples of research questions
Does BDA lead to truly transformational, innovative analytical procedures, or simply to increased coverage and enhanced processing of information subject to audit by means of conventional techniques?
Is BDA associated with more structured, routinized, and technique-oriented audit approaches? If so, what is the potential impact of the rise of BDA on the exercise of professional judgement by auditors?
What impact does the rise of BDA have on auditors’ actual and perceived ability to achieve audit quality? Is it possible to demonstrate that BDA is producing better audits?
What should be the appropriate way for the audit regulators and standard setters to accommodate the current developments with BDA into auditing standards and regulations?

Economic analytical lens: Impact of BDA on the economics of auditing and the position of audit within multi-disciplinary audit firms
Examples of research questions
What role do the audit firms’ strategies for business expansion and commercial motivations play in the promotion of BDA?
What is the actual impact of BDA on the shifting boundaries between audit and non-audit services?
Is the audit function alone capable of generating resources sufficient for the development and management of BDA tools? Consequently, does the development and deployment of BDA tools reflect purely an audit mindset?
What are the consequences of BDA for potential marginalization of auditing within audit firms’ service portfolios?
To what extent is the rise of BDA contributing to a perception of audit as a supplier of knowledge spillovers for other service lines within the audit firms?
What are the effects of the claimed benefits of BDA (such as improved diagnostic insight and full population testing) on the scope for litigation against auditors?

Professional analytical lens: Impact of BDA on auditors’ expertise, skills and professional status
Examples of research questions
Do auditors possess the expertise and knowledge required for BDA environments?
Is the rise of BDA in auditing potentially altering the nature of skillsets required of a modern auditor (towards more technical competencies), and how is that reflected in professional training and qualification?
Does audit practice in the era of BDA provide a suitable environment for the development of auditors as professionals?
What are the consequences of BDA for the image of audit work as intellectual inference-based practice?
Does the rise of BDA have potentially significant deskilling and de-professionalizing effects on auditing?
Can BDA ultimately be judged as a disruptive innovation (the prospect of an ‘auditor-less audit’)?
in a diagnostic way to alert auditors to potential problem areas in the financial statements. Here, researchers could draw relevant insights from, for example, Power (1997), who provides evidence of how domains previously seen as peripheral and problematic for an audit were made auditable, and Williams (2013), who demonstrates how tools such as BDA could be used to construct objects as risky and requiring regulatory attention. Both studies attend to the socially constructed nature of audit spaces and to audit evidence as something co-produced through the interaction between auditors and technology.
There is also a need for empirically grounded work to explore whether BDA is indeed transfor-
mational, as is often presented by the audit firms themselves, in the execution of routine audit work
and analytical procedures. More specifically, are the benefits of reliance on BDA mainly associated
with the power of processing and resulting opportunities to increase the coverage of data sets subject
to investigation or with the use of new, innovative analytical procedures in the course of an audit?
Here, there is a question of both motivation and actuality: auditors' attempts to engage with Big Data environments may well be motivated by a rationale of improving the standard of audit but, given prior research findings about the difficulties of implementing methodological developments (see, for instance, Curtis and Turley 2007), questions should still be raised about the impact of such efforts on actual audit practice.
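To make the coverage-versus-technique distinction concrete, a minimal sketch is offered below in which the same conventional test (recalculating invoice totals) is applied first to a random sample and then to the full population; only the extent of coverage changes, not the analytical procedure itself. The invoice data and tolerance are hypothetical.

    # Same conventional test, applied to a sample and then to the full population.
    import pandas as pd

    invoices = pd.DataFrame({
        "quantity": [10, 4, 7, 3, 12],
        "unit_price": [9.99, 25.00, 14.50, 99.95, 5.25],
        "recorded_total": [99.90, 100.00, 110.50, 299.85, 63.00],  # third row is misstated
    })

    def recalculation_exceptions(df: pd.DataFrame) -> pd.DataFrame:
        """Conventional recalculation test: report totals that do not recompute."""
        expected = df["quantity"] * df["unit_price"]
        return df[(expected - df["recorded_total"]).abs() > 0.01]

    # Traditional sample-based application may or may not hit the misstated item.
    sample = invoices.sample(n=2, random_state=1)
    print("Exceptions found in sample:", len(recalculation_exceptions(sample)))

    # Full-population application (BDA-style coverage) is certain to surface it.
    print("Exceptions found in population:", len(recalculation_exceptions(invoices)))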
Further, in understanding the technical aspects of the application of BDA in the audit context,
there is also a need to account for the manner in which the growing significance of data processing
technologies may be shaping the very nature of what auditors do (Williams 2013). A more tech-
nologically enhanced BDA audit process, and a more technologically aware auditor, may indeed
be presented as necessary for auditors to be able to operate successfully in the modern business
environment. However, we call for research that can penetrate through this technocratic image
and question underlying assumptions, such as those about the role of judgement and structure
in audit work (Power 2003). It has been observed that technologies designed to offer guidance to auditors in making professional judgements can often produce the opposite effect.
Dowling et al. (2008), for example, find that auditors who had adopted a very prescriptive decision
support system identified fewer business risks than those who used a less prescriptive approach.
They suggest, in particular, that a ‘hard’ structured approach encoded in the new technology
could limit the cognitive ability of auditors to generate many cues for assessing audit risks associ-
ated with the business. For years, the mystique surrounding what auditors do has enabled them to cultivate a perception of a judgement-intensive practice that warrants higher fees. On the one
hand, there is a possibility that the push for BDA is in conflict with that ambition, potentially
leading to predominantly routinised and increasingly technique-based approaches to audit which
minimise the need for professional judgement. On the other hand, however, one can also make a
counter argument that the additional insights generated through the use of BDA tools need to be
made sense of and interpreted by auditors, and these sense-making endeavours ultimately require
a degree of professional judgement. As we have demonstrated earlier, decisions around false positives, or about the nature and veracity of data brought to an auditor's attention through the use of BDA, are examples of areas that bring to the fore issues around rationalisation and inference.
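The point about false positives can be illustrated with a simple worked calculation under purely hypothetical rates: even a reasonably discriminating rule, applied to a full population, can generate far more false alerts than genuine exceptions, so flagged items still require auditor interpretation.

    # Hypothetical base-rate illustration of false positives in full-population flagging.
    population = 1_000_000          # transactions subjected to testing
    anomaly_rate = 0.001            # assumed share of genuinely problematic items
    detection_rate = 0.95           # assumed chance the rule flags a true anomaly
    false_positive_rate = 0.02      # assumed chance the rule flags a normal item

    true_flags = population * anomaly_rate * detection_rate               # 950 items
    false_flags = population * (1 - anomaly_rate) * false_positive_rate   # 19,980 items
    precision = true_flags / (true_flags + false_flags)                   # about 4.5%

    print(f"Items flagged for follow-up: {true_flags + false_flags:,.0f}")
    print(f"Share of flags that are genuine exceptions: {precision:.1%}")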
As the practice potentially spreads and both the rhetoric and actuality of BDA become more
prevalent, there is also a need to examine the extent to which these developments are accommo-
dated in existing generalised regulatory standards or challenge such standards. The relationship
between innovation in firm methodology and the development of generalised standards for audit-
ing is complex (see Curtis et al. 2016 on the embedding of BRA in auditing standards), reflecting
the fact that standards need to provide some latitude for interpretation and innovation and also to
be seen to be relevant to up-to-date developments in practice, but that standard setters do not want
simply to follow an agenda set by practice firms (Bamber and McMeeking 2016). The potential
for distance between formal standards and day-to-day practice in respect of BDA in auditing is
relevant not only for standard setters, as reflected in the IAASB 2015–2016 work plan (IAASB
2014a), but also from a research perspective. Therefore, there is an important role for in-depth
empirical research that could produce understandings to inform and provide a direction for audit-
ing standard-setting agendas.
From an economic viewpoint, our findings demonstrate that the employment of BDA can be
seen as relevant to the ‘business of auditing’, potentially affecting audit costs and the efficiency of
audit work. Here, there is a need for a more critical assessment of whether the firms' representation of BDA as the next big thing in the development of audit practice is driven, at least to a certain degree, by their strategies for business expansion and by competitive rationales, with BDA used as a tool for further differentiation and segregation between the large and the smaller audit firms, for example because of the necessary investment in technology and expertise.
The latter could struggle to achieve the level of investment in training and development necessary
for the application of complex BDA tools. Economic motivations behind the rise of BDA should
constitute an important research agenda that could shed light on the extent to which the pen-
etration of BDA into auditing is not only a consequence of the audit clients’ changing business
realities but also a product of the firms’ pursuit of their own strategic agendas.
Moreover, the quest for further business opportunities to be exploited in an era of BDA is
likely to be affecting the fluidity of the boundaries between audit and consultancy (Jeppessen
1998, Robson et al. 2007). Prior audit literature provides ample evidence of audit firms’ efforts
to expand their services into new, often consultancy-oriented activity domains (Shafer and
Gendron 2005, Andon et al. 2015) by pursuing an image of auditors as ‘versatile experts’
(Guo 2016, p. 100) who can withstand the competitive nature of the audit field while also uphold-
ing the normative values of the profession. Similarly, given the scale of investment required in the
process of developing and maintaining BDA algorithms, software and tools, it is unlikely that
auditing alone can generate sufficient resources to support the necessary investment and the
auditor will, as a result, end up using analytical tools which have been created to serve the
needs of the firms’ other service lines. The BDA tools themselves may, therefore, not reflect
purely an audit mindset. The danger here is that a more client-service mindset comes to shape the underlying purpose and justification of the audit. This is something that goes to the heart of what might be called auditors' 'self-worth', that is, how they perceive the value of the activity they
undertake and its connection to something valued by management and the board. Several impor-
tant questions should be investigated here. What are the consequences of BDA for the standing of
auditing in a multi-disciplinary, multi-service audit firm context? What is the potential contri-
bution of BDA to the marginalisation of auditing within the major firms’ service portfolios?
Can the rise of BDA result in shifts in the perceived role of the audit service as primarily a supplier
of knowledge spill-overs and business opportunities for the firms’ other service lines?
Lastly, an obvious and highly significant consequence of BDA that research should consider concerns notions of accounting professionalism. Specifically, audit scholars should contemplate the
manner in which auditors’ efforts to be seen as possessing the expertise and knowledge required
for BDA environments are potentially altering the nature of skillsets and competencies attributed
to a modern-day auditor and, more fundamentally, even his/her professional identity. The issue
about how exactly encounters with BDA may be changing the notion of auditor expertise, knowl-
edge and identity is an important one. If the skills required of modern auditors become skewed
towards more technical, ‘number-crunching’ types of competencies, this would have implications
for the manner in which practice provides a suitable environment for the development of auditors
as professionals (Turley et al. 2016). The use of the term 'data scientist' is becoming more commonplace in auditing, which suggests auditors' preference to imagine themselves as sophisticated experts, at least as far as data processing and analysis are concerned. However, it is not clear
that mastering more quantitatively oriented skillsets leads to better auditors and better auditor
judgements. Power (1995) encourages us not to take auditor expertise as given but to see it as (co-)produced. Again, a number of potentially significant research questions can be suggested. What is
the influence of the spread of BDA on the knowledge base auditors draw on? How do encounters
with large data sets impact auditors’ ability (and perceived need) to develop more inference-type
skills that are commonly attributed to professionals as against other occupational groups? What
are the consequences of BDA for auditors' self-image, as well as any changes in personal and pro-
fessional attributes that make up their identity at work? And ultimately, can the rise of BDA
potentially generate significant deskilling effects on accounting labour?
These latter concerns about the potential ‘de-professionalising’ effect of highly technically
oriented work environments are not unfounded and have already been voiced in the professiona-
lisation literature. Abbott (1988), for example, emphasised how reliance on mathematical tech-
niques potentially dilutes professional systems of knowledge and diminishes the image that
professionals create for themselves, because it is primarily on the basis of their expert knowledge
that professions build their jurisdictional authority, legitimacy and public image. In the audit
context, Omoteso et al. (2010) state that automated audit technologies have the capacity to
change the organisational structure of audit firms as some tasks previously performed by auditors
are taken over by the technology, implying a deskilling effect of such changes on audit pro-
fessionals (see Dowling et al. 2008 for a similar view). Power (1997) raised similar concerns
when arguing that if the idea behind the rise of statistical audit sampling was a project to construct
and codify auditors’ knowledge base then the outcomes were somewhat mixed. It is relevant to
raise these points in respect of the current rise of BDA in auditing. Indeed, is it possible that history will judge BDA as a destructive innovation, at least in terms of the standing of auditing and auditors as a professional group? With driverless cars having only recently been regarded as the stuff of fiction, will we ever see an 'auditor-less' audit?
In conclusion, taking these possibilities together, it can be argued that, just as in the case of many
other technological developments in auditing over the past several decades (Humphrey and
Moizer 1990, Power 1992, Carpenter and Dirsmith 1993, Curtis and Turley 2007, Robson
et al. 2007), auditors’ recourse to technical development should be assessed not only from the
point of view of the technical improvements to practice but also in terms of their more fundamen-
tal relevance to understanding the significance and role of auditing in the governance of business.
There is an underlying question whether, as has been the case with other methodological devel-
opments previously, BDA developments reflect auditors’ search for a technical solution to what
essentially are existential problems and dilemmas facing the audit profession that, among other
things, have to do with issues such as the societal relevance of the audit function, the continuing
expectations gap and auditors’ ability as knowledge experts to meet the complexity inherent in
contemporary financial reporting practice. Certainly, there is a danger of disappointment if
BDA developments are seen as the golden ticket which is going to solve audit problems and
lead to a future in which audit services are universally valued. Equally, however, these develop-
ments may provide real tools to reconfigure, refocus and potentially reposition (within the firms’
business models) contemporary audit practice and, because of such potentially significant trans-
formative effects, should be subject to a much higher level of scholarly attention and debate.
Notes
1. Gartner (2012) defines Big Data as 'high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision
making’. Most scholarly, technical and professional publications embrace these three key elements of
volume, variety and velocity (e.g. Chen et al., 2012, Accountancy Futures Academy, 2013). Volume
recognises the magnitude of data, variety refers to how data sets may contain both structured and
unstructured data, and velocity reflects how data capture and processing technologies are increasing
the speed of analysis. Some have sought to extend the definition to include other attributes such as varia-
bility (in the pattern of data capture), veracity (indicating the risks associated with some forms of data)
and the value of information (see discussion in Gandomi and Haider, 2015).
2. See https://home.kpmg.com/au/en/home/insights/2015/02/audit-data-analytics-unlocking-value-of-audit.
html (accessed 30 October 2017).
3. See http://www.imperial.ac.uk/business-school/research/kpmg-centre-for-advanced-business-analytics/
(accessed 21 September 2015).
4. Specifically, the authors mention ISA 315 Identifying and assessing the risks of material misstatement
through understanding the entity and its environment, ISA 240 The auditor’s responsibility relating to
fraud in an audit of financial statements and ISA 540 Auditing accounting estimates, including fair
value accounting estimates, and related disclosures.
References
Abbott, A., 1988. The System of Professions: An Essay on the Division of Expert Labor. Chicago: University
of Chicago Press.
ACCA, 2015. Big Data Audit Dynamite. London: Association of Chartered Certified Accountants.
Accountancy Futures Academy, 2013. Big Data: Its Power and Perils. London: Association of Chartered
Certified Accountants.
AICPA, 2014. Reimagining Auditing in a Wired World. New York, NY. Available from: http://www.aicpa.
org/InterestAreas/FRC/AssuranceAdvisoryServices/DownloadableDocuments/Whitepaper_Blue_Sky_
Scenario-Pinkbook.pdf [Accessed 24 January 2017].
Alles, G.M., 2015. Drivers of the use and facilitators and obstacles of the evolution of Big Data by the audit
profession. Accounting Horizons, 29 (2), 439–449.
Alles, G.M. and Gray, G.L., 2016. Incorporating Big Data in audits: identifying inhibitors and a
research agenda to address those inhibitors. International Journal of Accounting Information Systems, 22, 44–59.
Alvesson, M. and Karreman, D., 2011. Qualitative Research and Theory Development: Mystery as Method.
London: Sage Publications.
Andon, P., Free, C., and O’Dwyer, B., 2015. Annexing new audit spaces: challenges and adaptations.
Accounting, Auditing and Accountability Journal, 28 (8), 1400–1430.
Appelbaum, D., Kogan, A., and Vasarhelyi, M.A., 2017. Big Data and analytics in the modern audit engage-
ment: research needs. Auditing: A Journal of Practice and Theory, 36 (4), 11–27.
Arnaboldi, M., Busco, C., and Cuganesan, S., 2017. Accounting, accountability, social media and big data:
revolution or hype? Accounting, Auditing and Accountability Journal, 30 (4), 762–776.
Bamber, M. and McMeeking, K., 2016. An examination of international accounting standard-setting due
process and the implications for legitimacy. British Accounting Review, 48 (1), 59–73.
Barnes, P.A., 2004. The auditor’s going concern decision and Types I and II errors: The Coase Theorem,
transaction costs, bargaining power and attempts to mislead. Journal of Accounting and Public
Policy, 23 (6), 415–440.
Bhimani, A. and Willcocks, L., 2014. Digitisation, ‘Big Data’ and the transformation of accounting infor-
mation. Accounting and Business Research, 44 (4), 469–490.
Brown-Liburd, H., Issa, H. and Lombardi, D., 2015. Behavioral implications of Big Data's impact on audit
judgment and decision making and future research directions. Accounting Horizons, 29 (2), 451–468.
Cao, M., Chychyla, R. and Stewart, T., 2015. Big data analytics in financial statement audits. Accounting
Horizons, 29 (2), 423–429.
Carpenter, B. and Dirsmith, M., 1993. Sampling and the abstraction of knowledge in the auditing profession:
an extended institutional theory perspective. Accounting, Organizations and Society, 18 (1), 41–63.
Chen, H., Chiang, R.H.L. and Storey, V.C., 2012. Business intelligence and analytics: from Big Data to big
impact. MIS Quarterly, 36 (4), 1165–1188.
Curtis, E. and Turley, S., 2007. The Business Risk Audit: a longitudinal case study of an audit engagement.
Accounting, Organizations and Society, 32 (4–5), 439–461.
Curtis, E., Humphrey, C. and Turley, S.W. 2016. Standards of innovation in auditing. Auditing: A Journal of
Practice & Theory, 35 (3), 75–98.
Cushing, B.E. and Loebbecke, J.K., 1986. Comparison of Audit Methodologies of Large Accounting Firms.
Sarasota, FL: American Accounting Association.
Deloitte, 2014. Audit Transparency Report for the Year Ended 31 May 2014. London: Deloitte LLP.
DeFond, M. and Zhang, J., 2014. A review of archival audit research. Journal of Accounting and Economics,
58 (2–3), 275–326.
Dowling, C., Leech, S.A. and Moroney, R., 2008. Audit support system design and the declarative knowl-
edge of long-term users. Journal of Emerging Technologies in Accounting, 5 (1), 99–108.
Earley, C.E., 2015. Data analytics in auditing: opportunities and challenges. Business Horizons, 58 (5), 493–
500.
Elliott, R.K. and Jacobson, P.D., 1987. Audit technology: a heritage and a promise. Journal of Accountancy,
163, 198–217.
EY, 2014. Transparency Report 2014. London: Ernst and Young Global Limited.
EY, 2015. Becoming an Analytics Driven Organization to Create Value. London: Ernst and Young LLP with
Nimbus Ninety. Available from: http://www.ey.com/UK/en/Services/Specialty-Services/Big-data—
Becoming-an-analytics-driven-organisation-to-create-value [Accessed 15 December 2016].
Fischer, M.J., 1996. “Real-izing” the benefits of new technologies as a source of audit evidence: an interpre-
tive field study. Accounting, Organizations and Society, 21, 219–242.
FRC, 2006. Discussion Paper Promoting Audit Quality. London: Financial Reporting Council.
FRC, 2017. Audit Quality Thematic Review: The Use of Data Analytics in the Audit of Financial Statements.
London: Financial Reporting Council Limited.
Gandomi, A. and Haider, M., 2015. Beyond the hype: Big Data concepts, methods, and analytics.
International Journal of Information Management, 35 (2), 137–144.
Gartner, 2012. IT Glossary – Big Data. Available from: http://www.gartner.com/it-glossary/big-data/
[Accessed 10 May 2015].
Guo, K.H., 2016. The institutionalization of commercialism in the accounting profession: an identity-exper-
imentation perspective. Auditing: A Journal of Practice & Theory, 35 (3), 99–117.
Hansen, J.V. and Messier Jr. W.F., 1986. A knowledge-based expert system for auditing advanced computer
systems. European Journal of Operational Research, 26, 371–379.
Higson, A.W., 2003. Corporate Financial Reporting: Theory and Practice. London: Sage Publication.
Holm, C. and Zaman, M., 2012. Regulating audit quality: restoring trust and legitimacy. Accounting Forum,
36 (1), 51–61.
Humphrey, C. and Moizer, P., 1990. From techniques to ideologies: An alternative perspective on the audit
function. Critical Perspectives on Accounting, 1 (3), 217–238.
Humphrey, C., O’Dwyer, B., and Unerman, J., 2017, Re-theorizing the configuration of organizational fields:
the IIRC and the pursuit of ‘enlightened’ corporate reporting. Accounting and Business Research, 47 (1),
30–63.
IAASB, 2014a. Work Plan for 2015–2016: Enhancing Audit Quality and Preparing for the Future. Available
from: https://www.ifac.org/system/files/publications/files/IAASB-Work-Plan-2015-2016.pdf [Accessed
24 January 2017].
IAASB, 2014b. A Framework for Audit Quality: Key Elements that Create an Environment for Audit
Quality. Available from: https://www.ifac.org/publications-resources/framework-audit-quality-key-
elements-create-environment-audit-quality [Accessed 4 December 2014].
IAASB, 2016. Exploring the Growing Use of Technology in the Audit, with a Focus on Data Analytics. New
York: International Auditing and Assurance Standards Board.
ICAEW, 2016. Data Analytics for External Auditors. London: Institute of Chartered Accountants in England and Wales.
Jeppessen, K.K., 1998. Reinventing auditing, redefining consulting and independence. European
Accounting Review, 7 (3), 517–539.
Khalifa, R., Sharma, N., Humphrey, C., and Robson, K., 2007. Discourse and audit change: transformations
in methodology in the professional audit field. Accounting, Auditing and Accountability Journal, 20 (6),
825–854.
KPMG, 2014a. UK Annual Report (Including the Transparency Report), KPMG LLP, United Kingdom.
KPMG, 2014b. Data & Analytics: Unlocking the Value of the Data. Available from: https://www.kpmg.com/
Global/en/services/Audit/Documents/unlocking-the-value-of-audit.pdf [Accessed 15 September 2015].
Krahel, P.J. and Titera, W.R., 2015. Consequences of Big Data and formalization on accounting and auditing
standards. Accounting Horizons, 29 (2), 409–422.
MacLullich, K.K., 2003. The emperor’s ‘new’ clothes? New audit regimes: insights from Foucault’s tech-
nologies of the self. Critical Perspectives on Accounting, 14 (8), 791–811.
Matthews, D., 2006. A History of Auditing: The Changing Process in Britain from the Nineteenth Century to
the Present day. Oxford: Routledge.
Newton, J.D. and Ashton, R.H., 1989. The association between audit technology and audit delay. Auditing: A
Journal of Practice and Theory, 8 (2 supplement), 22–37.
O’Dwyer, B., Owen, D. and Unerman, J., 2011. Seeking legitimacy for new assurance forms: the case of
assurance on sustainability reporting. Accounting, Organizations and Society, 36 (1), 31–52.
O’Leary, D.E. and Markus, M.L., 2006. Microsoft’s management reporting: SAP, data warehousing, and
reporting tools. Journal of Emerging Technologies in Accounting, 3, 129–141.
Omoteso, K., Patel, A. and Scott, P., 2010. Information and communications technology and auditing:
current implications and future directions. International Journal of Auditing, 14 (2), 147–162.
Power, M., 1992. From common sense to expertise: the pre-history of audit sampling. Accounting, Organizations and Society, 17 (1), 37–62.
Power, M., 1995. Auditing, expertise and the sociology of technique. Critical Perspectives on Accounting, 6,
317–339.
Power, M., 1997. The Audit Society: Rituals of Verification. New York, NY: Oxford University Press.
Power, M., 2003. Auditing and the production of legitimacy. Accounting, Organizations and Society, 28 (4),
379–394.
PwC, 2014a. Building Trust Through Assurance: Transparency Report. PricewaterhouseCoopers LLP.
Available from: http://www.pwc.co.uk/transparencyreport/assets/pdf/transparency-report-fy14.pdf
[Accessed 27 May 2015].
PwC, 2014b. Data Analytics Delivering Intelligence in the Moment. London: PricewaterhouseCoopers LLP.
Ramlakun, R., 2015. How Big Data and Analytics Are Transforming the Audit. London: Ernst and Young
Global Limited, Vol. 9, 9–12.
Robson, K., Humphrey, C., Khalifa, R., and Jones, J., 2007. Transforming audit technologies: Business
Risk Audit methodologies and the audit field. Accounting, Organizations and Society, 32 (4–5),
409–438.
Shafer, W.E. and Gendron, Y., 2005. Analysis of a failed jurisdictional claim: the rhetoric and politics sur-
rounding the AICPA global credential project. Accounting, Auditing & Accountability Journal, 18 (4),
453–491.
Sikka, P., 2009. Financial crisis and the silence of the auditors. Accounting, Organizations and Society, 34
(6–7), 868–873.
Sully, J.M., 1974. Statistical sampling in auditing. Journal of the Royal Statistical Society, 23 (1), 71–80.
Teeter, R., Alles, M., and Vasarhelyi, M.A., 2010. Remote audit: a research framework. Journal of Emerging
Technologies in Accounting, 7 (1), 73–88.
Turley, S. and Cooper, M., 1991. Auditing in the UK. Hertfordshire: Institute of Chartered Accountants in
England and Wales, Prentice Hall.
Turley, W., Humphrey, C., Samsonova-Taddei, A., Siddiqui, J., Woods, M., Basioudis, I., and Richard, C.,
2016. Skills, Competencies and the Sustainability of the Modern Audit. Edinburgh: Institute of Chartered
Accountants of Scotland.
Vasarhelyi, M.A., Kogan, A., and Tuttle, B., 2015. Big Data in accounting: an overview. Accounting
Horizons, 29 (2), 381–396.
Wang, T. and Cuthbertson, R., 2015. Eight issues on audit data analytics we would like researched. Journal
of Information Systems, 29 (1), 155–162.
Warren Jr, J.D., Moffitt, K.C., and Byrnes, P., 2015. How Big Data will change accounting. Accounting
Horizons, 29 (2), 397–407.
Williams, J.W., 2013. Regulatory technologies, risky subjects, and financial boundaries: governing ‘fraud’ in
the financial markets. Accounting, Organizations and Society, 38 (6–7), 544–558.
Yoon, K., Hoogduin, L., and Zhang, L., 2015. Big Data as complementary audit evidence. Accounting
Horizons, 29 (2), 431–438.
Zhang, L., Pawlicki, A.R., McQuilken, D., and Titera, W.R., 2012. The AICPA assurance services executive
assurance technology task force: the audit data standards (ADS) initiative. Journal of Information
Systems, 26 (1), 199–205.
Appendix
Interview Guide
1. What is your role and how is it related to Big Data Analytics (BDA)?
2. What have been the key developments with regards to BDA in your organisation in the past few
years?
3. Can you think of factors that have influenced the use of BDA in your organisation and in the audit
field, in general?
4. Can you describe the aspects of BDA that you consider most valuable for you personally, the organ-
isation you work for and your clients?
5. In your opinion, which actors have been most committed to and active in promoting BDA in audit-
ing? Can you think of particular examples to illustrate this commitment?
6. How would you describe the current state of development of audit technologies associated with
BDA?
7. In your view, what parallels, if any, can be drawn between developments relating to BDA and other
changes in audit technology previously, such as Statistical Sampling or Business Risk Audit?
8. What is your view on the possible impact of BDA on the delivery of an audit, i.e. audit process and
its quality?
9. Can you think of any recent occasion(s) where either you or your colleagues engaged in the pro-
motion of BDA? What is your assessment of any resulting outcomes? What are the most significant
obstacles to the development of BDA in the audit context?
10. What are the perceptions of the audit clients regarding the use of BDA by audit firms?
11. How do you encourage your clients to consider BDA? What particular approaches do you use? How
do you rationalise the value of BDA to clients?
12. How do clients respond to such approaches?
13. To what extent is BDA used in the provision of other assurance services?
14. What are the perceptions of the regulators and standard setters regarding the use of BDA in
auditing?
15. What is your assessment of the future outlook of BDA technology in the audit context?