
Measuring customer perceived online service quality : Scale development and managerial implications

2004, International Journal of Operations & Production Management

The purpose of this paper is to set forth a reliable and valid means of measuring online service quality based on a broad conceptual framework which integrates theory and conceptualization in customer service quality, information systems quality, and product portfolio management, into online service quality. An ethnographic content analysis of 848 customer reviews of online banking services was employed to identify salient online service quality dimensions. The most frequently cited online service quality attributes, along with literature review and personal interview results, were utilized to develop the survey questionnaire. Subsequent to the pre-test, a Web-based survey was undertaken to verify and test the online service quality model. A confirmatory factor analysis produced six key online service quality dimensions: reliability, responsiveness, competence, ease of use, security, and product portfolio. This paper includes a discussion of the managerial and theoretical implications of this online service quality model.

Zhilin Yang, Department of Marketing, City University of Hong Kong, Kowloon, Hong Kong
Minjoon Jun, Department of Management, College of Business Administration and Economics, New Mexico State University, Las Cruces, USA, and
Robin T. Peterson, Department of Marketing, College of Business Administration and Economics, New Mexico State University, Las Cruces, USA

Keywords: Worldwide web, Customer service quality

Acknowledgements: The authors thank the anonymous reviewers for their helpful comments. The first author gratefully acknowledges a research grant from the City University of Hong Kong (DAG Project No. 7100267).

International Journal of Operations & Production Management, Vol. 24 No. 11, 2004, pp. 1149-1174. © Emerald Group Publishing Limited, 0144-3577. DOI 10.1108/01443570410563278

Introduction
Electronic commerce (e-commerce) has witnessed extensive growth. Dozens of Internet-only companies have surfaced in many industries, and numerous conventionally operated companies have adopted the Internet. Accordingly, competition among online companies has become fierce. Most online companies publish price information and feature price in their advertising campaigns; customers can therefore readily determine the best available prices for the products and services they seek. To offset this price-transparency disadvantage, competitors have utilized three primary strategies:
(1) geographic differentiation;
(2) service quality differences; and
(3) modest levels of switching costs (Chen and Hitt, 2000).
Online growth has reduced the role of physical geography for many consumers. This geographical irrelevance can also shrink some implicit switching costs, such as those for convenience and time utility. In short, the importance of service quality differentiation in attracting and retaining customers has grown. However, numerous customers have judged e-commerce service quality to be inferior (Rubino, 2000). Since the Internet is a relatively new transactional channel, online companies may not clearly understand what specific services are desired.
Additionally, many customers have not yet formed clear expectations for online retailers (Zeithaml et al., 2001). The importance of service quality and the challenges facing Internet-based services mean that managers need insight into the attributes customers use when evaluating online service quality. However, a rigorous measurement instrument of online service quality has not been available (Cox and Dale, 2001; O'Neill et al., 2001; Yoo and Donthu, 2001). To improve that condition, this study intends to:
. identify the most salient online service quality dimensions;
. confirm the identified major service quality dimensions; and
. determine the relative importance of each identified service quality dimension in producing overall service quality.
The authors employed a two-stage approach in developing a reliable and valid measurement of online service quality. After establishing a broad conceptual framework which integrates theory and related concepts from customer service quality, information systems quality, and product portfolio management into online service quality, the authors applied an ethnographic content analysis to 848 customer reviews of online banking services to identify online service quality dimensions. A survey questionnaire was then generated, based on these identified salient attributes and on results from the literature review and personal interviews. Following this, a Web-based questionnaire survey produced data from 235 online customers. Finally, a confirmatory factor analysis was used to derive six key online service quality dimensions: reliability, competence, responsiveness, ease of use, security, and product portfolio.

Conceptual framework
Two areas of literature were selected and reviewed for this study: the traditional and online service quality literature, and the information systems and Web site design literature. Based on the literature review, the authors identified the following three broad conceptual categories related to online service quality:
(1) customer perceived service quality;
(2) information systems quality; and
(3) product portfolio.
The major literature findings are discussed under these three categories.

Customer perceived service quality
Customer perceived service quality can be defined as a global judgment or attitude relating to the superiority of a service relative to competing offerings (Parasuraman et al., 1988). Over the past three decades, numerous researchers have sought to uncover the global service attributes that contribute most significantly to such quality assessments (Sasser et al., 1978; Gronroos, 1983; Parasuraman et al., 1985; Pitt et al., 1999). Among them, the work of Parasuraman et al. (1985) has been regarded as the most prominent; it revealed ten dimensions:
(1) tangibles;
(2) reliability;
(3) responsiveness;
(4) communication;
(5) credibility;
(6) security;
(7) competence;
(8) courtesy;
(9) understanding the customer; and
(10) access.
These ten dimensions were further purified and distilled to five: tangibles, reliability, responsiveness, assurance, and empathy (Parasuraman et al., 1988). In turn, these five attributes constitute the base of a global measurement device for service quality, namely SERVQUAL. SERVQUAL has been applied by various researchers to numerous service industries as a means of gauging service quality. The primary value of SERVQUAL lies in its power as a benchmarking, diagnostic, and prescriptive tool (Kettinger and Lee, 1997).
However, it has also been subjected to critical conceptual and empirical assessments (for a comprehensive review, see Cronin and Taylor, 1994; Dabholkar et al., 1996). One major concern raised with this instrument is that service quality dimensions tend to be context-bound and service-type-dependent (Paulin and Perrien, 1996). For instance, two new dimensions unique to the traditional retailing environment, namely "willingness and ability to serve" and "physical and psychological access", were subsequently identified by Hedvall and Paltschik (1989). It is apparent that SERVQUAL may not be sufficient for measuring service quality across industries and situations, not to mention online service quality. The instrument does not consider unique facets of online service quality, since its five dimensions primarily address customer-to-employee, rather than customer-to-Web-site, interactions. Accordingly, some researchers have attempted to identify key attributes that best fit the online business environment. Zeithaml et al. (2001) uncovered 11 dimensions of online service quality in a series of focus group interviews. These were access, ease of navigation, efficiency, flexibility, reliability, personalization, security, responsiveness, assurance/trust, site aesthetics, and price knowledge. Cox and Dale (2001) propose that traditional service quality dimensions, such as competence, courtesy, cleanliness, comfort, and friendliness, are not relevant in the context of online retailing, whereas other factors, such as accessibility, communication, credibility, and appearance, are critical to the success of online businesses. Based on 54 student evaluations of three UK-based Internet bookshops, Barnes and Vidgen (2001) have extended the SERVQUAL scale and established a WebQual Index with 24 measurement items. The index addresses the following seven customer service quality aspects: reliability, competence, responsiveness, access, credibility, communication, and understanding the individual. Similarly, Madu and Madu (2002) have proposed the following 15 dimensions of online service quality based on their literature review:
(1) performance;
(2) features;
(3) structure;
(4) aesthetics;
(5) reliability;
(6) storage capacity;
(7) serviceability;
(8) security and system integrity;
(9) trust;
(10) responsiveness;
(11) product differentiation and customization;
(12) Web store policies;
(13) reputation;
(14) assurance; and
(15) empathy.
Wolfinbarger and Gilly (2003), through focus group interviews, a content analysis, and an online survey, have uncovered four factors of the online retailing experience:
(1) Web site design;
(2) reliability;
(3) security; and
(4) customer service (a factor primarily related to customer-to-employee interactions).
Further, Zeithaml et al. (2002) have identified the following seven service quality dimensions:
(1) efficiency;
(2) reliability;
(3) fulfillment;
(4) privacy;
(5) responsiveness;
(6) compensation; and
(7) contact.
The first four dimensions concern core online service and the remaining three are related to service recovery. Recently, Santos (2003) identified, through focus group interviews, two categories of online service quality dimensions that influence customer retention: incubative and active. The dimensions in the active group are primarily associated with online customer service quality: reliability, efficiency, support, communication, security, and incentive.
Further, other studies have attempted to identify key dimensions of service quality in the context of narrowly defined online businesses, such as online banks, portal services, and travel agencies (Kaynama and Black, 2000; Jun and Cai, 2001; Van Riel et al., 2001). Joseph et al. (1999) have uncovered six underlying dimensions of online banking service quality: (1) convenience/accuracy; (2) feedback/complaint management; (3) efficiency; (4) queue management; (5) accessibility; and (6) customization. Similarly, Van Riel et al. (2001) have derived three key portal service quality attributes – core service, supporting service, and user interface. Information systems quality The Internet is an innovative form of information technology, yet most commercial Web sites function as well-defined information systems. Information system quality can be divided into system and information quality. System quality refers to software development caliber, while information quality embraces accuracy, timeliness, currency and reliability of information (DeLone and McLean, 1992). Online companies employ a complicated database interface, serving as an expert system. From this perspective, online consumers are the end-users of the computer programs and networked system. The term “end user” refers to one who “interacts directly with the application software to enter information or prepare output reports” (Doll and Torkzadeh, 1988, p. 260). The principal goal of information systems service is to enable customers to function independently and to conduct numerous transactions on their own. In addition, as end users, consumers often seek desired product and service information through Web sites. Doll and Torkzadeh (1988) have purified 13 items proposed by Baroudi and Orlikowski (1988) to a 12 items scale that gauges five quality dimensions influencing end-user satisfaction with information systems: (1) content; (2) accuracy; Online service quality 1153 IJOPM 24,11 1154 (3) format; (4) ease of use; and (5) timeliness. Other studies have confirmed the reliability and validity of this scale (Doll et al., 1994; Hendrickson and Collins, 1996). Later, several inquiries identified Web site attributes that are critical to business success. D’Angelo and Little (1998) argue that factors such as navigational and visual characteristics, and practical considerations, such as images, background, color, sound, video, media, and content, are critical features of a Web site. Lohse and Spiller (1998) have noted that characteristics such as a feedback section and product lists are crucial in generating sales. Liu and Arnett (2000) propose four factors: system use, system design quality, information quality, and playfulness, as major ingredients for success. Yoo and Donthu (2001) have developed a measurement instrument for an Internet shopping site condition, SITEQUAL, which includes four dimensions: (1) ease of use; (2) aesthetic design; (3) processing speed; and (4) security. In the same vein, Cox and Dale (2001) have discovered and statistically validated four quality factors of a Web site. These are: (1) ease of use (the design of the Web site); (2) customer confidence (how the Web site generates customer trust); (3) online resources (capability of the Web site to offer products/services); and (4) relationship services (how the Web site bonds with the customer and inspires loyalty). In addition, Zeithaml et al. 
(2002) have uncovered several quality dimensions related to online systems – ease of navigation, flexibility, efficiency, site aesthetics, and security. Recently, Gefen et al. (2003) have empirically found that two technological aspects of the Web site interface, namely perceived ease of use and perceived usefulness significantly affect customer repurchase intentions. Voss (2003) has proposed three key quality factors relating to customer-centered service in a virtual environment – trust, information and status, and configuration and customization. Of these, two dimensions, information and status, and configuration and customization, are associated with the capability of Web sites. Santos (2003) has uncovered five dimensions of online systems quality – such as ease of use, appearance, linkage, structure and layout, and content, and labeled them as incubative dimensions. In the context of online bank Web sites, Waite and Harrison (2002) have found seven key factors that affect consumer satisfaction: (1) transaction technicalities; (2) decision making convenience; (3) interactive interrogation; (4) specialty information; (5) search efficiency; (6) physical back-up; and (7) technology thrill. Product portfolio Online customers are more inclined to patronize firms which offer a substantial variety of services. The primary reason for this choice is that it is more likely that their diverse needs can be fulfilled. This is especially the case for desired services which are not widely distributed or unavailable at physical outlets (Barcia, 2000). Thus, a key to gaining customer satisfaction and loyalty is to provide a mix of offerings preferred by target customers. Cho and Park (2001) have identified “variety of products” as one of the seven key dimensions that influence Internet shopper satisfaction. Page and Lepkowska-White (2002) have pointed out that a suitable selection of products/services is one of the important ingredients for developing consumer value in online companies. Another rationale for customer use of the Internet is convenience. When possible, many customers prefer to complete their transactions at one site. For instance, numerous online banking customers wish to pay their bills electronically and automatically, view and print their monthly bank statements, and purchase stocks, insurance, and other financial offerings. For this reason, companies with wide product lines may be able to attract large number of customers to their sites. Also, introducing new forms of products/ services to the marketplace appeals to customers whose needs are unfulfilled by existing offerings. Therefore, a key to gaining customer satisfaction is to provide a wide range of products/services and diverse features in the format required by customers. Research questions The online service quality attributes of the three categories set forth above were determined within a narrowly defined domain and in an independent manner. A systematical and extensive study is needed to uncover the underlying key dimensions of service quality in the context of online services. Therefore, the primary research questions include: . What is high quality online service? . What are the key dimensions of online service quality? . How can online service quality be conceptualized and measured in a parsimonious and valid way? Of course, not all service quality attributes have the same impact on consumer perceptions of online services. Some attributes may not be perceived as enhancing overall service quality. 
The key, therefore, is to uncover, among various potentially predictive service quality attributes, particular dimensions that are most crucial in enhancing the perceived level of service quality and to assess the degree to which they are associated. In this manner, management can come to identify what service areas deserve concentration, while avoiding investing resources in providing service quality attributes that may be of minor concern to consumers (Oliva et al., 1992). Thus, the secondary research questions include: Online service quality 1155 IJOPM 24,11 . . 1156 What are the most influential online service quality dimensions in achieving a high level of overall service quality as perceived by online customers? What actions can be taken to deliver high quality online service? The authors have adopted a two-stage approach to develop valid online service quality dimensions. In phase one, a content analysis was employed to explore possible dimensions of online service quality. Based on these findings, a literature review, and a series of personal interviews, the authors have developed a preliminary model of online service quality. In phase two, a survey questionnaire was generated to assess and refine the model. Phase one: an exploratory study The aim of this stage was to identify key dimensions and their respective service features through a content analysis of consumer reviews of their online service experiences. Content analysis of critical incidents has been shown to be effective in exploring customers’ perceptions of service quality with suppliers. The fact that customers contribute time and effort for voicing their Internet purchasing experiences suggests that the attributes are salient in the post-use evaluation process (Cadotte and Turgeon, 1988). Although the consumer comments, i.e. complaints and compliments, are not likely to completely reflect their total experiences with suppliers, they do highlight those service quality dimensions and detailed attributes of greatest concern. Sample The authors employed four steps to collect qualified customer reviews or anecdotes. The first step was to choose a sampled Industry. Online banking was selected as a sample industry, because it is very service-intensive; its services involve complicated processes; it is an emerging and fast growing service sector; and customers are very sensitive to banking service quality. The next step was to find appropriate Web sites that provide customers with a location to cite their evaluations of suppliers. By using multiple search engines (i.e. Google, Yahoo, altavista, MSN search, LookSmart, and Hotbot), the authors intensively reviewed the most prominent online consumer review Web sites. Nine Web sites were found to be relevant for this study. They are: (1) consumerreview.com; (2) deja.com; (3) consumerama.org; (4) epinions.com; (5) complaints.com; (6) consumeraffairs.com; (7) computingreview.com; (8) ratingwonders.com; and (9) gomez.com. The third step was to select qualified customer evaluation sites. Three selection criteria were established to permit collection of the most representative samples: (1) customers should be allowed to rate and review online companies based on their own online service experience; (2) customers should not be financially motivated to express their opinions favoring the reviewed companies (e.g. 
some consumer review Web sites award money to a consumer if his/her review leads a reader to make a purchase from the evaluated online company); and (3) customers should be encouraged to post both dissatisfied and satisfied reviews. Two sites, ratingwonders.com and gomez.com, both leading online consumer review sites, fully met the requirements. The last step was to choose sample banks – 20 of the most influential Internet banks were selected for study. They are: (1) First Internet Bank of Indiana; (2) CompuBank; (3) USABancShares.com; (4) NetBank; (5) CitiBank; (6) Security First Network Bank; (7) Wells Fargo; (8) WingspanBank.com; (9) BankDirect; (10) Bank of America; (11) E*Trade Bank; (12) Fleet; (13) American Express; (14) everbank.com; (15) American Bank Online; (16) Bank One; (17) Washington Mutual; (18) First Union; (19) USAccess Bank; and (20) Chase Manhattan Bank. While some banks are Internet-only companies, most are hybrid banks. These banks market both banking products and non-banking offerings such as stock trading, and insurance. Data collection The authors accessed two online review Web sites, Gomez and Ratingwonders, from 1-16 November 2000, to secure a sufficient volume of anecdotes. After deleting disqualified reviews, e.g. spamming messages, duplications, and other messages Online service quality 1157 IJOPM 24,11 1158 irrelevant to online banking services, a total of 848 useful consumer anecdotes were selected. Coding process All anecdotes were numbered, formatted, and imported to Ethnograph 5.0, a leading software package designed for coding qualitative data (Wazienski, 2000). The authors then classified each of the anecdotes into two categories: satisfied attributes (positive performance) and dissatisfied attributes (negative performance). The leading author, along with one research assistant, developed an initial 68 coding words based on the first 100 messages. These initial 68 coding words make up the primary themes or facets of the overall quality of online services. The two researchers then further independently coded the remaining anecdotes. Subsequent discussion identified and resolved all disagreements. Results The content analysis identified a total of 17 dimensions of online service quality and assorted these into three groups: (1) Customer service quality constituting ten dimensions: . responsiveness; . reliability; . competence; . access; . personalization; . courtesy; . continuous improvement; . communication; . convenience; and . control. (2) Online system quality consisting of six dimensions, namely: . ease of use; . accuracy; . security; . content; . timeliness; and . aesthetics. (3) One dimension of product portfolio, referring to product or service variety and diverse features (see Appendix 1). In terms of frequencies of mentions, the most often-cited quality attributes are responsiveness, reliability, and competence in the customer service quality category, ease of use, accuracy, and security in the information systems quality category, and product feature and product variety in the product portfolio category. These quality attributes were considered important in customer perception of service quality. Phase two: a confirmatory study Survey instrument development Due to the constraints of a real-life evaluation in the current study, the service quality dimensions had to be simplified and adjusted for the survey. Thus, not all of the dimensions are included in the survey questionnaire. 
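The frequency-tallying step that underpins this selection (classifying each coded anecdote as a satisfied or dissatisfied mention of a quality attribute and counting citations) can be illustrated with a minimal sketch. The anecdote records and code words below are hypothetical; the actual study used Ethnograph 5.0 and two independent coders, and this fragment only mimics the counting logic.

```python
from collections import Counter

# Hypothetical coded anecdotes: (attribute code, "positive"/"negative" performance).
# In the study, two researchers assigned such codes to 848 online-banking reviews.
coded_anecdotes = [
    ("responsiveness", "negative"),
    ("reliability", "positive"),
    ("ease_of_use", "positive"),
    ("responsiveness", "negative"),
    ("security", "negative"),
    ("reliability", "negative"),
]

# Tally how often each attribute is cited, overall and by valence.
overall = Counter(attr for attr, _ in coded_anecdotes)
by_valence = Counter(coded_anecdotes)

# Attributes cited most frequently are candidates for the survey instrument.
for attr, count in overall.most_common():
    pos = by_valence[(attr, "positive")]
    neg = by_valence[(attr, "negative")]
    print(f"{attr}: total={count}, satisfied={pos}, dissatisfied={neg}")
```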
Based on the most frequent citations and theoretical considerations, the authors selected the following six dimensions: reliability, responsiveness, competence, ease of use, security, and product portfolio. For instance, the "accuracy" dimension was merged into the "reliability" dimension. Scale items for assessing these dimensions were incorporated into a survey instrument.

Pre-test
A pre-test of the questionnaire was conducted to assess the content validity of the measurement scales. Content validity can be evaluated by a group of judges, sometimes experts, who read or look at a measuring technique and decide whether in their opinion it measures what its name suggests (Judd et al., 1991, p. 54). After a review by five academics and four local professionals who specialize in services marketing and e-commerce, some items were reworded, added or deleted based on their feedback. Next, the questionnaire was forwarded as an e-mail attachment to 50 online customers selected from two news groups: online financial investment and e-commerce. The e-mail outlined the purpose of the study and requested the participants to answer, review and critique the attached questionnaire. A total of 14 respondents replied with useful suggestions. Based on their feedback, the questionnaire was further revised and finalized. Appendix 2 lists all the scale items used in the survey questionnaire.

Measures
The final questionnaire consisted of three sets of measures:
(1) perceptions of overall online service quality and individual quality dimensions;
(2) general information, including demographic variables; and
(3) computer and Internet usage information.
The respondents were requested to indicate the extent to which they agreed or disagreed by checking the appropriate response to each questionnaire item. Except for overall service quality and satisfaction, which used seven-point Likert scales, all items employed five-point Likert scales anchored by 1 = strongly disagree and 5 = strongly agree, with 3 = neutral (neither agree nor disagree) as the midpoint.

Data collection
Online customers tend to employ the Internet for conducting commercial transactions, checking updated information about their accounts, tracking the current status of their purchase orders, or simply obtaining other necessary information. Since this study intended to identify key online service quality dimensions covering all stages of the product/service purchasing cycle, from information search to service recovery, the authors collected the necessary data from customers who had conducted commercial transactions online. Further, to enhance the generalizability of the results, an attempt was made to gather data from a variety of online customers. The sampling frame consisted of online customers with personal e-mail addresses provided by an online e-mail address broker. A solicitation letter was conveyed by e-mail to 4,000 subjects randomly selected from the e-mailing list. The e-mail message described the research purpose and invited each receiver to participate in the survey. Sample members who were willing to participate clicked through the URL provided in the invitation e-mail. A total of 1,101 e-mails were returned as undeliverable; thus, the undeliverable rate was 27.5 per cent (1,101 of 4,000), similar to Sheehan and Hoy's (2000) experience (26 per cent). The responses from 257 participants were forwarded to the leading author via e-mail.
Of these, 22 were eliminated because they were incomplete or duplicate responses (the IP address of each respondent was checked). Thus, the final sample was 235 and the effective response rate was 8.1 per cent (235 of 2,899). Unfortunately, because some potential respondents sent complaining e-mails via a third party to the e-mail postmaster of the authors' affiliation, there was no way to locate and delete their e-mail addresses. Since any further e-mailing without deleting those e-mail addresses might involve potential legal issues, and since the number of collected useable responses was sufficient for further data analysis, follow-up e-mails were not sent. No comparison was made between early and late responses for checking non-response bias, since approximately 90 per cent of the responses were gathered within five days of the initial e-mail.

Profile of respondents
Table I sets forth the demographic variables and the computer and Internet usage profiles of the 235 respondents. Approximately 80.8 per cent of the respondents were male; 76.9 per cent were between the ages of 25 and 54; 68.0 per cent had earned a bachelor's degree or higher; and 40.1 per cent earned an annual household income of US$70,000 or above. The characteristics of these respondents were similar to Internet user profiles gathered in other studies (e.g. Kehoe et al., 1999; Sheehan and Hoy, 2000). While 74.0 per cent of the respondents were living in the USA, the remaining 26.0 per cent resided in 17 other countries. As to the computer and Internet usage profile, 90.2 per cent of the sample had been using personal computers for more than five years; 94.1 per cent reported that they logged onto the Internet at least once a day on average; and 64.6 per cent spent more than five hours per week browsing Web sites.

Table I. Profile of respondents (valid percentages)
Gender: Male 80.8; Female 19.2
Education: High school/Trade/Technical school 8.9; Some college 23.1; College graduate 35.9; Graduate school 32.1
Age: 16-24 7.3; 25-34 20.5; 35-44 26.1; 45-54 30.3; 55 or over 15.8
Annual household income: Under $10,000 4.8; $10,000-29,999 15.7; $30,000-49,999 20.9; $50,000-69,999 21.4; $70,000-99,999 16.2; $100,000 or over 23.9
Living country/region: USA 74.0; Others 26.0
How long have you been using personal computers? 1-5 years 9.8; 6-10 years 22.6; 11 years or over 67.7
How long have you been using the Internet as one of your purchasing channels? Less than six months 3.4; 0.5-1 year 13.3; 1-2 years 32.2; 3-5 years 38.6; More than five years 12.4
On average, how often do you use the Internet? 1-5 times a week 5.9; 1-4 times a day 40.9; 5-8 times a day 21.3; Nine times a day 31.9
On average, how many hours per week do you browse Web sites? Less than one hour 5.1; 1-5 hours 30.3; 6-10 hours 26.5; 11-20 hours 20.9; 21-40 hours 10.7; Over 40 hours 6.4
Note: percentages are valid percentages

Confirming online service quality dimensions
The structural equation modeling (SEM) approach was used to assess whether the hypothesized six-factor online service quality model fit the data set. An iterative procedure was then used to generate the final measurement model, following the respecification guidelines suggested by Anderson and Gerbing (1988). The practical respecification process followed two steps. First, the study considered an item removable if it demonstrated one of the following characteristics:
. loaded on the wrong factor or cross-loaded; or
. exhibited large standardized residuals (Anderson and Gerbing, 1988; Bollen, 1989).
Then, if the questionable item was considered to be represented by another indicator, it was removed from the analysis. After one item was removed, the CFA was run again. The same removal procedure continued until all remaining items were considered necessary, either theoretically or empirically. As a result of this procedure, six dimensions with 20 associated scale items, reduced from the original 31, were derived. The six dimensions are:
(1) reliability;
(2) responsiveness;
(3) competence;
(4) ease of use;
(5) security; and
(6) product portfolio.
As shown in Table II, the reliability of each factor was estimated by computing its Cronbach's alpha, which was 0.86, 0.76, 0.83, 0.80, 0.75, and 0.83, respectively. These scale items had adequate reliability and were deemed appropriate for further analysis. The t-values of all indicator loadings well exceeded the critical value (2.78) at the 0.01 significance level, suggesting that each indicator was relevant and acceptable. Thus, no further model respecification was necessary. The results of the confirmatory factor analysis (CFA) for the online service quality dimensions in Table II show that the chi-square was statistically significant (χ² = 158.07, d.f. = 126, p < 0.03). Nevertheless, the ratio of the chi-square statistic to the degrees of freedom is 1.26, less than the suggested cut-off point of two. The values of the Goodness-of-Fit Index (GFI), Non-Normed Fit Index (NNFI), Comparative Fit Index (CFI), and Root Mean Square Residual (RMSR) were 0.90, 0.99, 0.99, and 0.05, respectively.

Table II. CFA results of measures (for each item: mean, SD, standardized loading, t-value; CR = composite reliability, AVE = average variance extracted)
Reliability (α = 0.86; AVE = 0.67; CR = 0.91)
1. The company performs the service correctly the first time (4.33, 0.80, 0.86, 17.40)
2. My online transactions are always accurate (4.33, 0.92, 0.93, 18.81)
3. The company keeps my records accurately (4.31, 0.82, 0.83, 15.53)
Responsiveness (α = 0.76; AVE = 0.52; CR = 0.82)
1. I receive prompt responses to my requests by e-mail or other means (3.58, 0.98, 0.63, 10.26)
2. The company quickly resolves problems I encounter (3.64, 0.95, 0.82, 12.80)
3. The company employees give me prompt service (3.68, 0.89, 0.88, 11.81)
Competence (α = 0.83; AVE = 0.62; CR = 0.88)
1. The company employees have the knowledge to answer my questions (3.47, 0.90, 0.85, 15.80)
2. The company employees properly handle any problems that arise (3.70, 0.93, 0.90, 17.34)
3. The company employees comply with my requests (3.71, 0.82, 0.78, 14.57)
Ease of use (α = 0.80; AVE = 0.57; CR = 0.90)
1. Using the company's Web site requires a lot of effort (R) (3.52, 1.03, 0.71, 11.81)
2. The organization and structure of online content is easy to follow (3.75, 0.95, 0.95, 16.63)
3. It is easy for me to complete a transaction through the company's Web site (4.06, 0.86, 0.92, 15.47)
Product portfolio (α = 0.75; AVE = 0.51; CR = 0.77)
1. All my service needs are included in the menu options (3.43, 1.05, 0.65, 10.80)
2. The company provides wide ranges of product packages (3.76, 0.79, 0.51, 8.16)
3. The company provides services with the features I want (3.76, 0.83, 0.62, 9.93)
4. The company provides most of the service functions that I need (3.82, 0.90, 0.88, 16.61)
Security (α = 0.83; AVE = 0.57; CR = 0.86)
1. The company will not misuse my personal information (3.46, 0.92, 0.61, 9.93)
2. I feel safe in my online transactions (4.05, 0.74, 0.88, 16.62)
3. I felt secure in providing sensitive information (e.g. credit card number) for online transactions (3.83, 0.93, 0.89, 17.04)
4. I felt the risk associated with online transactions is low (3.86, 0.90, 0.71, 12.34)
Model fit indices: χ² = 158.07 (p = 0.03), d.f. = 126, χ²/d.f. = 1.26; RMSR = 0.05, GFI = 0.90, CFI = 0.99, NFI = 0.96, NNFI = 0.99

The convergent validity of the measurement model was examined by calculating the composite reliability and average variance extracted (AVE) (Fornell and Larcker, 1981). All the composite reliabilities were greater than the recommended 0.7 (Nunnally and Bernstein, 1994). The AVE represents the amount of variance captured by the construct measures relative to measurement error and the correlations among the latent variables. The AVE of each measure in this study was at least 50 per cent, the cut-off value (Bagozzi and Yi, 1988). The discriminant validity of the measures was examined in two ways. First, the AVE was compared with the square of the parameter estimate between the latent variables (Fojt, 1995). This revealed that the correlation among indicators of each construct was greater than that between a construct and any other construct. Second, the discriminant validity of each construct was evidenced by each indicator loading higher on the construct of interest than on any other variable (Chen et al., 1998). Table III lists the mean and standard deviation of each construct and the correlations among the constructs. Finally, criterion-related validity analysis was undertaken to ascertain whether the online service quality measure behaves as expected in relation to other constructs, including both customer perceived overall service quality and overall satisfaction. Overall satisfaction was measured by three items with a Cronbach's alpha of 0.86, and overall service quality by two items with a Cronbach's alpha of 0.92 (see Appendix 2).

Table III. Correlation matrix (mean and SD of each construct, followed by its correlations with the constructs listed above it; all correlations are significant at the 0.01 level, two-tailed)
1. Reliability (mean 4.33, SD 0.75)
2. Responsiveness (3.63, 0.78): r(1) = 0.52
3. Competence (3.62, 0.76): r(1) = 0.44, r(2) = 0.70
4. Ease of use (3.78, 0.80): r(1) = 0.52, r(2) = 0.42, r(3) = 0.49
5. Product portfolio (3.69, 0.68): r(1) = 0.43, r(2) = 0.46, r(3) = 0.53, r(4) = 0.54
6. Security (3.80, 0.71): r(1) = 0.56, r(2) = 0.46, r(3) = 0.49, r(4) = 0.49, r(5) = 0.54
7. Overall service quality (5.58, 1.20): r(1) = 0.61, r(2) = 0.66, r(3) = 0.63, r(4) = 0.58, r(5) = 0.56, r(6) = 0.54
8. Overall satisfaction (5.65, 1.06): r(1) = 0.59, r(2) = 0.64, r(3) = 0.61, r(4) = 0.59, r(5) = 0.61, r(6) = 0.57, r(7) = 0.91
Note: overall service quality and overall satisfaction were measured on seven-point Likert scales

A regression analysis examined the associations of the six dimensions of perceived online service quality with overall service quality and satisfaction. Following the initial regression run, outliers were detected by examining the standardized residuals; three outliers were found and eliminated. Scale items were then summed to form measures of the corresponding variables, so that the sum of all scale items within a particular factor represented that factor. Missing values were handled by choosing the option "exclude cases pairwise", which means that only cases with complete data for the pair of constructs being correlated were used to compute the correlation coefficient on which the regression analysis is based. This procedure produced 218 effective cases. The enter method was used for the linear regressions.
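A minimal sketch of the regression step just described is shown below. It assumes a hypothetical dataset of summed dimension scores, standardizes the variables so the fitted coefficients are comparable to the reported standardized betas, and enters all six predictors simultaneously (the "enter" method); it is only an illustration under these assumptions, not the authors' original analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50  # hypothetical respondents; the study's regressions used 218 effective cases

# Hypothetical summed dimension scores (each dimension = sum of its Likert items).
predictors = ["reliability", "responsiveness", "competence",
              "ease_of_use", "security", "product_portfolio"]
df = pd.DataFrame(rng.integers(3, 16, size=(n, len(predictors))), columns=predictors)
df["overall_service_quality"] = (
    0.3 * df["responsiveness"] + 0.2 * df["reliability"] + rng.normal(0.0, 1.0, n)
)

# Standardize all variables so the estimates are standardized (beta) coefficients.
z = (df - df.mean()) / df.std()

X = sm.add_constant(z[predictors])   # "enter" method: all six predictors at once
model = sm.OLS(z["overall_service_quality"], X, missing="drop").fit()
print(model.summary())               # coefficients, t-values, p-values, adjusted R-squared
```

Repeating the same fit with overall satisfaction as the dependent variable yields the second equation reported in Table IV.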
When the regression analyses were repeated with 70 per cent, 80 per cent, and 90 per cent randomly selected cases from the sample, the parameter estimates were stable. This finding, along with the factor loadings of the explanatory variables, suggested that multicollinearity would not be a concern. Table IV outlines the results of the regression analyses. The adjusted coefficients of determination (R²) were 0.61 for both equations (p < 0.001). Therefore, the regression equations produced a satisfactory level of goodness of fit in predicting the variance of perceived overall online service quality and overall satisfaction in relation to the respective service quality dimensions.

Table IV. Regression analysis results between e-service quality dimensions and overall service quality and customer satisfaction (standardized coefficients, with t-values and p-values in parentheses)
Overall service quality: Constant (t = -1.79, p = 0.08); Reliability 0.22 (t = 3.97, p = 0.00); Responsiveness 0.27 (t = 4.33, p = 0.00); Competence 0.16 (t = 5.52, p = 0.01); Ease of use 0.20 (t = 3.64, p = 0.00); Product portfolio 0.12 (t = 2.25, p = 0.03); Security 0.04 (t = 0.64, p = 0.53); F-value = 59.50, p = 0.00, adjusted R² = 0.61
Overall satisfaction: Constant (t = -0.22, p = 0.08); Reliability 0.17 (t = 3.04, p = 0.00); Responsiveness 0.27 (t = 4.40, p = 0.00); Competence 0.10 (t = 1.59, p = 0.11); Ease of use 0.18 (t = 3.36, p = 0.00); Product portfolio 0.22 (t = 3.98, p = 0.00); Security 0.08 (t = 1.43, p = 0.15); F-value = 60.13, p = 0.00, adjusted R² = 0.61

The analysis revealed that all dimensions except security have a statistically significant effect on the assessment of overall service quality. The insignificant effect of "security" may be explained by the fact that customers typically have difficulty in directly evaluating a Web site's security/privacy (Wolfinbarger and Gilly, 2003); instead, they tend to use other clues, such as customer testimonials. Another reason may be that the surveyed customers feel comfortable with the security of online transactions. Based on conceptual considerations, the "security" dimension was retained in the final measure (Anderson and Gerbing, 1988). In sum, the analysis supported the convergent and discriminant validity of the measure. The CFA results demonstrated that the six-factor model was appropriate and possessed adequate reliability and criterion-related validity.

Discussion
Customer perceived online service quality is one of the crucial determinants of the success of online businesses. Accordingly, considerable research has been conducted on the construct of online service quality, yet much of the literature is conceptual in nature or based on a few case studies. Moreover, even the limited survey-based empirical literature examines the construct within narrowly defined online businesses (e.g. online banks or portal services) or online business processes (e.g. Web site design or online exchange processes), and fails to systematically investigate this important concept in a broad sense. In order to fill this research gap, this study empirically examined the construct of online service quality in the context of business-to-consumer e-commerce and from the perspective of integrated online service transformation processes, which consist of three key elements: customer service, the "front store", and product portfolio. The authors first identified key dimensions of customers' perceived online service quality through a content analysis of critical incidents and then purified these into six dimensions by subjecting the data collected through Web-based surveys to CFA.
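The scale-evaluation statistics used above (Cronbach's alpha, composite reliability, and average variance extracted, together with the Fornell-Larcker discriminant validity comparison) can be reproduced with the short sketch below. The item responses and standardized loadings in it are hypothetical placeholders, not the study's data, and the formulas are the standard textbook ones rather than the exact software routines the authors used.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix for a single construct."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

def composite_reliability(loadings: np.ndarray) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    errors = 1 - loadings ** 2          # error variance of standardized indicators
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + errors.sum())

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE = sum of squared loadings / (sum of squared loadings + sum of error variances)."""
    errors = 1 - loadings ** 2
    return (loadings ** 2).sum() / ((loadings ** 2).sum() + errors.sum())

# Hypothetical three-item construct: raw Likert responses and standardized CFA loadings.
responses = np.array([[4, 5, 4], [5, 5, 5], [3, 4, 4], [4, 4, 3], [5, 4, 5]])
loadings = np.array([0.78, 0.84, 0.70])

print(cronbach_alpha(responses))
print(composite_reliability(loadings), average_variance_extracted(loadings))

# Fornell-Larcker check: a construct's AVE should exceed its squared correlation
# with every other construct for discriminant validity to hold.
print(average_variance_extracted(loadings) > 0.52 ** 2)
```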
The results of the validation procedure indicate that this proposed six-factor online service quality scale has appropriate reliability and validity in every respect, with only 20 scale items. The six factors identified were:
(1) reliability;
(2) responsiveness;
(3) competence;
(4) ease of use;
(5) security; and
(6) product portfolio.
The "reliability" factor comprised items related to accurate online transactions, accurate records, correct performance, and fulfillment of promises. "Responsiveness" referred to prompt responses to customer requests, speed in resolving customer problems, and prompt service. "Competence" was related to employees' ability to answer customer questions, their ability to resolve problems that arise, and compliance with customer requests. "Ease of use" referred to the moderate effort required to navigate a Web site, well-organized, structured and easy-to-follow catalogs, and ease of completing an online transaction. "Security" encompassed low risk associated with online transactions, safeguarding of personal information, and safety in completing online transactions. Finally, the "product portfolio" factor covered online service functions, useful free services, a wide range of product and service packages, and diverse features.

Unique dimensions
While all of the derived dimensions contain many traditional service quality aspects, they do have some unique characteristics related to the e-commerce setting. As such, it is interesting to compare the traditional service quality dimensions identified by Parasuraman et al. (1988) with those of this study. Among the five Parasuraman et al. (1988) dimensions, four of them (reliability, assurance, responsiveness, and empathy) were also considered important by online customers. On the other hand, the traditional dimension of "tangibles" turned out to be inapplicable to the e-commerce setting. This dimension may be linked to Web site design characteristics, such as aesthetics, structure of content or store layout, menu naming, and arrangement of hyperlinks, most of which were incorporated into "ease of use", one of the three dimensions unique to this study. The other two unique dimensions uncovered by this study were security and product portfolio. A detailed discussion of these three dimensions follows.
Ease of use. Many studies in the information systems area have focused on the "ease of use" dimension. In the context of Web-based markets, the "easy to navigate" feature is essential to attract both experienced and new online customers. As Rice (1997) has pointed out, for Internet-based shopping to achieve mass-market penetration, it must be made substantially easier than it is at present for consumers to navigate and locate information or content. Customers grant priority to needed on-screen information concerning products/services. Since the Web site functions as an information system, the organization and structure of online catalogues should be easy to follow and navigate. The sequencing, placement and naming of hyperlinks and navigational menus should be based on customer intuition. A well-designed navigational structure can facilitate consumers' perceptions of online control and enjoyment. Moreover, a good Web site should always clarify the meaning of interactive messages in order to facilitate the "flow" (as one respondent commented). Most importantly, the contents of the Web site should be concise and easy to understand.
All terms and conditions concerned with products/services should have these attributes. Adequate explanations, which are often missing in online banking and online stock trading services, should be provided. The simplicity and smoothness of the whole transaction process is also of critical importance in ensuring customer satisfaction on the Internet. Consumers will often feel frustrated, and may even elect to terminate the transaction, when they encounter misbehaving and superfluous Java applets and scripts on the site. Graphics and advertising can significantly slow download speed; thus, a balance must be struck between Web page multimedia richness and download speed.
Security. Many customers are concerned with the risk associated with online transactions and the privacy of sensitive personal information. Security is closely linked with the trustworthiness of online companies. The perceived lack of security on public networks is definitely a stumbling block (Balfour et al., 1998). Personal information such as credit card numbers transmitted from consumers to vendors can be encrypted and decrypted using encryption algorithms. Additionally, many consumers desire to retain some level of privacy or anonymity. A Web server, however, can track the identity of the user's computer through "cookies", text files placed on a user's hard drive. Most online customers are concerned about Web sites that do not provide clear and prominent statements about privacy and security matters. These disadvantages of e-commerce require companies to take considerable responsibility for both customer transaction activities and personal information. Some respondents in the present survey provided useful suggestions. For instance, online companies can furnish visible evidence of independent security certification of their services. They could also provide documentation or passwords sent to prospective clients at the start of the service.
Product portfolio. This dimension refers to the range and depth of products/services offered, along with free service offerings. Many customers seek products/services unavailable in their local outlets. A limited selection of products/services or outdated information is most likely to prevent numerous customers from purchasing online. In a survey of 220 consumers from Austin, Texas, Jarvenpaa and Todd (1997) found that the main impediments to consumer acceptance of online shopping were difficult-to-find specialty items and the limited offerings of individual sites. Finally, a Web site can benefit if it provides adequate service functions in the menu options. Some value-added free services, such as links to useful informational Web sites, are also desirable.

Optimizing service quality levels
The correlations among the six dimensions set forth earlier are high (see Table III). Thus, it is impossible to improve individual critical service quality dimensions without maintaining the quality level of all six attributes at least within the relevant zone of tolerance. Practically, however, it is difficult to offer all service quality attributes at a superior level. For example, one respondent commented on how security measures affected ease of use:
"It is complicated to get logged in. Each time I log in, I have to type not only my username and password, but also each time I'm asked four different digits from a 20-digit key word. I understand that is for safety reasons, but it is not very user friendly."
Thus, the task of an online company is to optimize service quality by balancing the level of each primary service quality dimension.
Coordination across organizational partners and departments is essential in designing Web sites and service processes.

Perceived overall service quality
The regression analysis results portrayed in Table IV indicate that responsiveness, reliability, product portfolio, and ease of use are considered important for both overall service quality and satisfaction. Responsiveness is the foremost critical factor in determining satisfaction. The second most important determinant of overall service quality is reliability, and of satisfaction is ease of use. Online customers considered reliability to be the foremost factor in achieving high levels of service quality; this is consistent with the findings of other traditional service quality studies (Parasuraman et al., 1988; Bitner, 1990). Online consumers also regarded ease of use as a significant factor influencing overall service quality assessments. In contrast to the prevailing viewpoint, security turned out to be insignificant in determining the overall service quality perceptions of online customers. A large number of customers are becoming accustomed to online transactions, and many of the respondents were not overly concerned with privacy and security, as one commented: "I experienced no problems with privacy and would not hesitate to do business with them again."

Theoretical and practical implications
Theoretically, this study extends measurement scales of traditional service quality to online service quality. Parasuraman et al. (1988) developed SERVQUAL to assess service quality in traditional markets. As the online market has emerged, both researchers and practitioners have called for a set of reliable and valid service quality gauges in the setting of e-commerce. The online service quality measure developed in this study is designed to provide an effective tool for measuring Internet-based service quality. Online companies can use this quality measurement tool to detect service quality weaknesses and strengths. Based on their quality assessments and business strategies, online companies can allocate corporate resources to the important service quality attributes uncovered by this study. In particular, it should be noted that improvements in the level of responsiveness, reliability, and ease of use constitute a necessity for broadening a loyal customer base, since these factors have strong associations with overall service quality.

Limitations and future research directions
There are several limitations to the current study. First, the sample is US-focused, with 74 per cent of the respondents residing in the USA. The participants in this study may possess attributes and behaviors that differ from those in other parts of the world. In addition, the sample is skewed to a particular gender, with 80 per cent of the respondents being male, which may not exactly reflect the current composition of online customers. Next, as mentioned earlier in the data collection section, it was impossible to send follow-up surveys and thus no attempt was made to ascertain the existence of non-response bias by comparing responses from the first-wave surveys with those of a second wave. Future research could make several extensions of the current study.
First, to verify the dimensions developed in this study and to enhance the generalizability of the research findings, future inquiries could employ more diversified samples across genders, various forms of online businesses, and diverse international customer environments. Second, the measurement instrument constructed in this study can be used to further investigate how customer perceived online service quality influence customer satisfaction and in turn purchasing behaviors such as customer repurchase intentions and loyalty. Similarly, the antecedents of customer perceived online service quality may also be examined using the measure. For example, product characteristics, such as value and brand, and consumer-specific characteristics, such as time orientation, time pressure, and technology readiness, may significantly affect customer perceptions on each of the online service quality dimensions derived in this study. Identifying these important antecedents is an essential element for better online service quality management. Next, the current research focuses on service quality dimensions perceived by customers who have conducted online transactions. However, a large portion of individuals primarily utilize the Internet as information sources and have not conducted commercial transactions. These customers may have unique perceptions of service quality. For instance, compared to customers with online transaction experience, who may feel comfortable with online security, purely online information searchers may have a serious concern with the security of online transactions. Thus, further research can develop a more generalized service quality scale by incorporating the perceptions from both groups. Finally, as the e-commerce field becomes increasingly mature, customers will shape clear expectations for online service quality attributes. More and more industry-wide service standards will be set forth and be accepted. Thus, future studies may utilize the expectation-disconfirmation paradigm to measure service quality and customer satisfaction. References Anderson, J.C. and Gerbing, D.W. (1988), “Structural equation modeling in practice: a review and recommended two-step approach”, Psychological Bulletin, Vol. 103 No. 3, pp. 411-23. Online service quality Bagozzi, R.P. and Yi, Y. (1988), “On the evaluation of structural equation models”, Journal of the Academy of Marketing Science, Vol. 16 No. 1, pp. 74-94. Balfour, A., Farquhar, B. and Langmann, G. (1998), “The consumer needs in global electronic commerce”, Electronic Markets, Vol. 8 No. 2, pp. 9-12. Barcia, S.M. (2000), “Internet pharmacies: all hype with no help”, Health Management Technology, Vol. 21 No. 4, pp. 24-5. Barnes, S.J. and Vidgen, R. (2001), “An evaluation of cyber-bookshops: the WebQual method”, International Journal of Electronic Commerce, Vol. 6 No. 1, pp. 11-30. Baroudi, J.J. and Orlikowski, W.J. (1988), “A short-form measure of user information satisfaction: a psychometric evaluation and notes on use”, Journal of Management Information System, Vol. 4 No. 4, pp. 44-59. Bitner, M.J. (1990), “Evaluating service encounter: the effects of physical surroundings and employee responses”, Journal of Marketing, Vol. 54 No. 2, pp. 69-82. Bollen, K.A. (1989), Structural Equations with Latent Variables, Wiley, New York, NY. Cadotte, E.R. and Turgeon, N. (1988), “Dissatisfiers and satisfiers: suggestions for consumer complaints and compliments”, Journal of Consumer Satisfaction, Dissatisfaction and Complaining Behavior, Vol. 1, pp. 74-9. 
Chen, H., Houston, A.L., Sewell, R.R. and Schatz, B.R. (1998), "Internet browsing and searching: user evaluations of category map and concept space techniques", Journal of the American Society for Information Science, Vol. 49 No. 7, pp. 582-603.
Chen, P.Y. and Hitt, L.M. (2000), "Switching cost and brand loyalty in electronic markets: evidence from on-line retail brokers", Proceedings of the 21st Annual International Conference on Information Systems, Brisbane, Australia.
Cho, N. and Park, S. (2001), "Development of electronic commerce user-consumer satisfaction index (ECUSI) for Internet shopping", Industrial Management & Data Systems, Vol. 101 No. 8, pp. 400-5.
Cox, J. and Dale, B.G. (2001), "Service quality and e-commerce: an exploratory analysis", Managing Service Quality, Vol. 11 No. 2, pp. 121-31.
Cronin, J.J. and Taylor, S.A. (1994), "SERVPERF versus SERVQUAL: reconciling performance-based and perceptions-minus-expectations measurement of service quality", Journal of Marketing, Vol. 58 No. 1, pp. 125-31.
Dabholkar, P.A., Thorpe, D.I. and Rentz, J.O. (1996), "A measure of service quality for retail stores: scale development and validation", Journal of the Academy of Marketing Science, Vol. 24 No. 1, pp. 3-16.
D'Angelo, J. and Little, S. (1998), "Successful Web pages: what are they and do they exist?", Information Technology and Libraries, Vol. 17 No. 2, pp. 71-81.
DeLone, W.H. and McLean, E.R. (1992), "Information systems success: the quest for the dependent variable", Information Systems Research, Vol. 3 No. 1, pp. 60-95.
Doll, W.J. and Torkzadeh, G. (1988), "The measurement of end-user computing satisfaction", MIS Quarterly, Vol. 12 No. 2, pp. 259-74.
Doll, W.J., Xia, W. and Torkzadeh, G. (1994), "A confirmatory factor analysis of the end-user computing satisfaction instrument", MIS Quarterly, Vol. 18 No. 4, pp. 453-61.
Fojt, M. (1995), "Barclays invests in technology to boost customer service and market share", The Journal of Services Marketing, Vol. 9 No. 3, pp. 66-7.
Fornell, C. and Larcker, D. (1981), "Evaluating structural equation models with unobservable variables and measurement error", Journal of Marketing Research, Vol. 18 No. 1, pp. 39-50.
Gefen, D., Karahanna, E. and Straub, D.W. (2003), "Trust and TAM in online shopping: an integrated model", MIS Quarterly, Vol. 27 No. 1, pp. 51-90.
Gronroos, C. (1983), Strategic Management and Marketing in the Service Sector, Marketing Science Institute, Cambridge, MA.
Hedvall, M.B. and Paltschik, M. (1989), "An investigation in and the generation of service quality concepts", in Avlonitis, G.J. et al. (Eds), Marketing Thought and Practices in the 1990s, European Marketing Academy, Athens, pp. 473-83.
Hendrickson, A.R. and Collins, M.R. (1996), "An assessment of structure and causation of IS usage", The Database for Advances in Information Systems, Vol. 27 No. 2, pp. 61-7.
Jarvenpaa, S.L. and Todd, P.A. (1997), "Is there a future for retailing on the Internet?", in Peterson, R.A. (Ed.), Electronic Marketing and the Consumer, Sage, Thousand Oaks, CA, pp. 139-54.
Joseph, M., McClure, C. and Joseph, B. (1999), "Service quality in the banking sector: the impact of technology in service delivery", International Journal of Bank Marketing, Vol. 17 No. 4, pp. 182-91.
Judd, C.M., Smith, E.R. and Kidder, L.H. (1991), Research Methods in Social Relations, Harcourt Brace Jovanovich College Publishers, Fort Worth, TX.
Jun, M. and Cai, S. (2001), "The key determinants of Internet banking service quality: a content analysis", The International Journal of Bank Marketing, Vol. 19 No. 7, pp. 276-91.
Kaynama, S.A. and Black, C.I. (2000), "A proposal to assess the service quality of online travel agencies: an exploratory study", Journal of Professional Services Marketing, Vol. 21 No. 1, pp. 63-88.
Kehoe, C., Pitkow, J., Sutton, K., Aggarwal, G. and Rogers, J.D. (1999), Results of GVU's Tenth World Wide Web User Survey, Graphics Visualization and Usability Center, College of Computing, Georgia Institute of Technology, Atlanta, GA.
Kettinger, W.K. and Lee, C.C. (1997), "Pragmatic perspectives on the measurement of information systems service quality", MIS Quarterly, Vol. 21 No. 2, pp. 223-40.
Liu, C. and Arnett, K.P. (2000), "Exploring the factors associated with Web site success in the context of electronic commerce", Information & Management, Vol. 38 No. 1, pp. 23-34.
Lohse, G.L. and Spiller, P. (1998), "Electronic shopping", Communications of the ACM, Vol. 41 No. 7, pp. 81-7.
Madu, C.N. and Madu, A.A. (2002), "Dimensions of e-quality", International Journal of Quality & Reliability Management, Vol. 19 No. 3, pp. 246-58.
Nunnally, J.C. and Bernstein, I.H. (1994), Psychometric Theory, McGraw-Hill, New York, NY.
Oliva, T.A., Oliver, R.L. and MacMillan, I.C. (1992), "A catastrophe model for developing service satisfaction strategies", Journal of Marketing, Vol. 56 No. 3, pp. 83-95.
O'Neill, M., Wright, C. and Fitz, F. (2001), "Quality evaluation in on-line service environments: an application of the importance-performance measurement technique", Managing Service Quality, Vol. 11 No. 6, pp. 402-17.
Page, C. and Lepkowska-White, E. (2002), "Web equity: a framework for building consumer value in online companies", Journal of Consumer Marketing, Vol. 19 No. 3, pp. 231-48.
Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1985), "A conceptual model of service quality and its implications for future research", Journal of Marketing, Vol. 49 No. 4, pp. 41-50.
Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1988), "SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality", Journal of Retailing, Vol. 64 No. 1, pp. 12-40.
Paulin, M. and Perrien, J. (1996), "Measurement of service quality: the effect of contextuality", in Kunst, P. and Lemmink, J. (Eds), Managing Service Quality, Vol. III, Chapman, London, pp. 257-73.
Pitt, L., Berthon, P. and Watson, R. (1999), "Cyberservice: taming service marketing problems with the World Wide Web", Business Horizons, Vol. 42 No. 1, pp. 11-18.
Rice, M. (1997), "What makes users revisit a Web site", Marketing News, Vol. 31 No. 6, pp. 12-13.
Rubino, G. (2000), "Getting and keeping online customers: if you build it, will they come?", Bank Marketing, Vol. 32 No. 3, pp. 36-40.
Santos, J. (2003), "E-service quality: a model of virtual service quality dimensions", Managing Service Quality, Vol. 13 No. 3, pp. 233-46.
Sasser, W.E. Jr, Olsen, R.P. and Wyckoff, D.D. (1978), Management of Service Operations: Text and Cases, Allyn & Bacon, Boston, MA.
Sheehan, K.B. and Hoy, M.G. (2000), "Dimensions of privacy concerns among online consumers", Journal of Public Policy and Marketing, Vol. 19 No. 1, pp. 62-73.
Van Riel, A.C.R., Liljander, V. and Jurriens, P. (2001), "Exploring consumer evaluations of e-services: a portal site", International Journal of Service Industry Management, Vol. 12 No. 4, pp. 359-77.
Voss, C.A. (2003), "Rethinking paradigms of service – service in a virtual environment", International Journal of Operations & Production Management, Vol. 23 No. 1, pp. 88-104.
Waite, K. and Harrison, T. (2002), "Consumer expectations of online information provided by bank Web sites", Journal of Financial Services Marketing, Vol. 6 No. 4, pp. 309-22.
Wazienski, R.J. (2000), "The ethnograph", Social Science Computer Review, Vol. 18 No. 3, pp. 351-6.
Wolfinbarger, M.F. and Gilly, M.C. (2003), "eTailQ: dimensionalizing, measuring and predicting etail quality", Journal of Retailing, Vol. 79 No. 3, pp. 183-98.
Yoo, B. and Donthu, N. (2001), "Developing a scale to measure the perceived quality of Internet shopping sites (SITEQUAL)", Quarterly Journal of Electronic Commerce, Vol. 2 No. 1, pp. 31-47.
Zeithaml, V.A., Parasuraman, A. and Malhotra, A. (2001), "A conceptual framework for understanding e-service quality: implications for future research and managerial practice", MSI Working Paper Series, Report No. 00-115, Cambridge, MA.
Zeithaml, V.A., Parasuraman, A. and Malhotra, A. (2002), "Service quality delivery through Web sites: a critical review of extant knowledge", Journal of the Academy of Marketing Science, Vol. 30 No. 4, pp. 362-75.

Further reading

Zeithaml, V.A. (2000), "Service quality, profitability, and the economic worth of customers: what we know and what we need to learn", Journal of the Academy of Marketing Science, Vol. 28 No. 1, pp. 67-85.
Zhang, P. and von Dran, G. (2001), "Expectations and rankings of Web site quality features: results of two studies on user perceptions", Proceedings of the 34th Hawaii International Conference on System Sciences, Hawaii, USA.

Appendix 1. Service quality dimensions of online banking and their frequencies by satisfiers and dissatisfiers (Table AI)

Each attribute is reported as satisfied No. (Pct.) / dissatisfied No. (Pct.) / total No. (Pct.); percentages are computed within each column (486 satisfied, 1,590 dissatisfied and 2,076 total comments).

A. Product portfolio
Product features: 11 (2.3) / 42 (2.6) / 53 (2.6)
Product variety/range: 6 (1.2) / 5 (0.3) / 11 (0.5)
Sub-total: 17 (3.5) / 47 (3.0) / 64 (3.1)

B. Customer service quality
1. Responsiveness
Prompt service (acct. open, customer request, etc.): 23 (4.7) / 131 (8.2) / 154 (7.4)
Timely response from rep: 34 (7.0) / 87 (5.5) / 121 (5.8)
Quickly solve problems: 31 (6.4) / 72 (4.5) / 103 (5.0)
Sub-total: 88 (18.1) / 290 (18.2) / 378 (18.2)
2. Reliability
Correct service (corresponding, and other unspecified issues): 3 (0.6) / 141 (8.9) / 144 (6.9)
Keep service promise: 4 (0.8) / 54 (3.4) / 58 (2.8)
Keep promotion promise: 0 (0.0) / 29 (1.8) / 29 (1.4)
Accurate records (i.e. billing amount, mailing address): 0 (0.0) / 14 (0.9) / 14 (0.7)
Sub-total: 7 (1.4) / 238 (15.0) / 245 (11.8)
3. Competence
Reps. knowledge to answer questions: 33 (6.8) / 103 (6.5) / 136 (6.6)
Ability to solve problems: 5 (1.0) / 68 (4.3) / 73 (3.5)
Sub-total: 38 (7.8) / 171 (10.8) / 209 (10.1)
4. Access
E-mail access: 14 (2.9) / 59 (3.7) / 73 (3.5)
Representative access via phone: 25 (5.1) / 40 (2.5) / 65 (3.1)
ATM access: 6 (1.2) / 8 (0.5) / 14 (0.7)
Phone access: 7 (1.4) / 7 (0.4) / 14 (0.7)
Account access when abroad: 4 (0.8) / 7 (0.4) / 11 (0.5)
Sub-total: 56 (11.5) / 121 (7.6) / 177 (8.5)
5. Personalization
Assurance and care: 5 (1.0) / 82 (5.2) / 87 (4.2)
Individual attention: 17 (3.5) / 27 (1.7) / 44 (2.1)
Top management involvement: 7 (1.4) / 5 (0.3) / 12 (0.6)
Sub-total: 29 (6.0) / 114 (7.2) / 143 (6.9)
6. Courtesy
Address complaints friendly: 43 (8.8) / 49 (3.1) / 92 (4.4)
Consistently courteous: 0 (0.0) / 1 (0.1) / 1 (0.0)
Sub-total: 43 (8.8) / 50 (3.1) / 93 (4.5)
7. Continuous improvement
Continuous improvement on customer service: 9 (1.9) / 42 (2.6) / 51 (2.5)
Continuous improvement on online systems: 10 (2.1) / 13 (0.8) / 23 (1.1)
Continuous improvement on product offerings: 0 (0.0) / 9 (0.6) / 9 (0.4)
Sub-total: 19 (3.9) / 64 (4.0) / 83 (4.0)
8. Communication
Informing customer of important information: 7 (1.4) / 31 (1.9) / 38 (1.8)
Availability of status of transactions: 6 (1.2) / 13 (0.8) / 19 (0.9)
Payee information: 0 (0.0) / 5 (0.3) / 5 (0.2)
Clear answer: 0 (0.0) / 1 (0.1) / 1 (0.0)
Sub-total: 13 (2.7) / 50 (3.1) / 63 (3.0)
9. Convenience
Save time: 5 (1.0) / 21 (1.3) / 26 (1.3)
When I want: 14 (2.9) / 1 (0.1) / 15 (0.7)
24/7 customer service: 12 (2.5) / 2 (0.1) / 14 (0.7)
Where I want: 3 (0.6) / 0 (0.0) / 3 (0.1)
Avoid service personnel: 2 (0.4) / 0 (0.0) / 2 (0.1)
Sub-total: 36 (7.4) / 24 (1.5) / 60 (2.9)
10. Control
Process control: 0 (0.0) / 26 (1.6) / 26 (1.3)
Mistake prevention: 0 (0.0) / 21 (1.3) / 21 (1.0)
Account lock-up: 0 (0.0) / 9 (0.6) / 9 (0.4)
Sub-total: 0 (0.0) / 56 (3.5) / 56 (2.7)
Sub-total, customer service quality: 329 (67.7) / 1,178 (74.1) / 1,507 (72.6)

C. Online information systems quality
1. Ease of use
Functions that customers need: 39 (8.0) / 62 (3.9) / 101 (4.9)
User friendly: 23 (4.7) / 55 (3.5) / 78 (3.8)
Response speed: 8 (1.6) / 29 (1.8) / 37 (1.8)
Outdated technology: 0 (0.0) / 34 (2.1) / 34 (1.6)
Easy log-in: 5 (1.0) / 24 (1.5) / 29 (1.4)
Compatibility (e.g. Quicken, Microsoft Money): 14 (2.9) / 11 (0.7) / 25 (1.2)
Accessibility of Web site (i.e. shut down): 5 (1.0) / 12 (0.8) / 17 (0.8)
Effective navigation: 6 (1.2) / 6 (0.4) / 12 (0.6)
Sub-total: 100 (20.6) / 233 (14.7) / 333 (16.0)
2. Accuracy
Accurate online transactions: 6 (1.2) / 62 (3.9) / 68 (3.3)
Errors in interface: 6 (1.2) / 13 (0.8) / 19 (0.9)
Errors in contents: 1 (0.2) / 3 (0.2) / 4 (0.2)
Sub-total: 13 (2.7) / 78 (4.9) / 91 (4.4)
3. Security/privacy
Information transaction safety: 6 (1.2) / 11 (0.7) / 17 (0.8)
Privacy: 2 (0.4) / 11 (0.7) / 13 (0.6)
Sub-total: 8 (1.6) / 22 (1.4) / 30 (1.4)
4. Others (contents, timeliness, aesthetics)
Information on products and service: 10 (2.1) / 15 (0.9) / 25 (1.2)
Up-to-date information: 7 (1.4) / 16 (1.0) / 23 (1.1)
Attractiveness of the Web site: 2 (0.4) / 1 (0.1) / 3 (0.1)
Sub-total: 19 (3.9) / 32 (2.0) / 51 (2.5)
Sub-total, online information systems quality: 140 (28.8) / 365 (23.0) / 505 (24.3)

Total: 486 (100.0) / 1,590 (100.0) / 2,076 (100.0)
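For readers who wish to trace the percentage columns in Table AI, each figure is simply an attribute's share of the corresponding column total (486 satisfied, 1,590 dissatisfied and 2,076 overall comments). The brief Python sketch below is illustrative only and is not part of the original study; it recomputes the responsiveness sub-total row as an example.

```python
# Illustrative sketch (not from the study): recompute Table AI percentage
# columns as shares of each column's total number of comments.

COLUMN_TOTALS = {"satisfied": 486, "dissatisfied": 1590, "total": 2076}

def share(count, column):
    """Return an attribute's share of a column, as a percentage rounded to one decimal."""
    return round(100.0 * count / COLUMN_TOTALS[column], 1)

# Responsiveness sub-total row from Table AI
responsiveness = {"satisfied": 88, "dissatisfied": 290, "total": 378}

for column, count in responsiveness.items():
    print(f"{column}: {count} ({share(count, column)})")
# Prints 18.1, 18.2 and 18.2, matching the table entries.
```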
Appendix 2. Measurement instrument: perceived service quality dimensions (Table AII)

Reliability
1. The company performs the service correctly the first time
2. When the company promises to do something by a certain time, it does so
3. The company keeps my records accurately
4. My online transactions are always accurate (a)

Responsiveness
1. Employees give me prompt service
2. I receive prompt responses to my requests by e-mail or other means
3. The company quickly resolves problems I encounter
4. I can rapidly retrieve the information I request (a)
5. The company informs me of important information promptly (a)
6. The company provides me real-time information (a)

Competence
1. Employees properly handle any problems that arise
2. Employees have the knowledge to answer my questions
3. Employees comply with my requests

Ease of use
1. The organization and structure of online content are easy to follow
2. It is easy for me to complete a transaction through my bank's Web site
3. Using the bank's Web site requires a lot of effort
4. I can easily log on to my account (a)
5. I did not encounter online jams in searching for information (a)

Product portfolio
1. The company provides a wide range of service packages
2. The company provides services with the features I want
3. The company provides me many useful free services (e.g. message board) (a)
4. The company provides most of the service functions that I need
5. All my service needs are included in the menu options

Security
1. The company will not misuse my personal information
2. I feel safe in my online transactions
3. I felt secure in providing sensitive information (e.g. credit card number) for online transactions
4. I felt the risk associated with online transactions is low
Overall service quality (Cronbach's alpha = 0.92)
1. Overall, the service quality of my online company is excellent
2. Overall, my online company comes up to my expectations of what makes a good online supplier

Overall satisfaction (Cronbach's alpha = 0.86)
1. Overall, I am very satisfied with the company
2. Overall, I am very satisfied with Internet-based transactions
3. Overall, I am very satisfied with the products/services offered by the company

Note: (a) Items were deleted from later analyses
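The coefficient alpha values reported above (0.92 and 0.86) are internal-consistency estimates based on the standard Cronbach formula, alpha = k/(k - 1) x (1 - sum of item variances / variance of the summed scale). The following Python sketch is a minimal illustration of that computation; the response matrix is hypothetical and is not the study's data.

```python
# Minimal sketch of Cronbach's alpha; the Likert-style responses below are
# hypothetical and only illustrate the formula used for multi-item scales.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items array of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_var / total_var)

responses = [[7, 6, 7], [5, 5, 6], [6, 6, 6], [4, 5, 4], [7, 7, 6]]
print(round(cronbach_alpha(responses), 2))
```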