Monitoring building operation and maintenance contracts


F 25,5/6

Monitoring building operation and maintenance contracts

Joseph H.K. Lai and Francis W.H. Yik


Department of Building Services Engineering, The Hong Kong Polytechnic University, Kowloon, Hong Kong SAR, China

Received November 2006 Accepted January 2007

Abstract

Purpose – The purpose of this paper is to study the use of management tools, and their costs, for monitoring building operation and maintenance (O&M) service contracts.

Design/methodology/approach – The management tools usable for monitoring building O&M contracts were reviewed, with their characteristics highlighted and compared. A series of face-to-face interviews with practitioners looking after building O&M contracts was conducted to collect empirical information, followed by data analysis and discussion of the results.

Findings – The paper finds that using balanced scorecard or benchmarking to monitor building O&M contracts was unpopular, while the use of customer satisfaction survey was rather common. The cost of monitoring contracts through performance review meetings and O&M audits was measured; it tended to reduce in relative amount with larger contracts.

Research limitations/implications – More research is needed to study the effect of factors, including propriety of contract, complexity of work, contractual relationship, and capability and quality of the contractor and management teams, on contract monitoring effort. Further work may take a similar approach to investigate other transaction cost elements.

Practical implications – The cost amounts for implementing the management tools inform practitioners of their significance relative to the amounts for procuring O&M services. How to make effective use of management tools to monitor O&M contracts should be investigated.

Originality/value – The paper demonstrates how to measure the cost of using management tools to monitor building O&M contracts. The reviewed characteristics of the management tools and the unveiled amounts of contract monitoring cost are useful information for O&M practitioners.

Keywords: Buildings, Service operations, Maintenance, Contracts, Transaction costs, Performance monitoring

Paper type: Research paper

Facilities, Vol. 25 No. 5/6, 2007, pp. 238-251. © Emerald Group Publishing Limited, 0263-2772. DOI 10.1108/02632770710742200

Introduction

The performance of buildings relies on the proper operation and maintenance (O&M) of facilities which, in modern commercial buildings, typically comprise installations such as air-conditioning, electrical, fire protection, and plumbing and drainage installations. To strive for lower cost yet better service quality, these O&M services are commonly outsourced, in whole or in part, either to managing contractors who mediate a wide range of specialist trades or directly to contractors specialised in a particular trade (e.g. lifts) (Lai et al., 2004, 2006).

On top of contract sums, the formation and management of the contracts incurs transaction costs (Williamson, 1979, 1985) before (ex ante) and after (ex post) contract commencement. Before entering into an O&M contract, a building owner (outsourcer) needs to have a tender document drafted after gathering relevant information, including the extent of work, the condition of the built assets, the desirable response time, the allowable working hours, etc. Events following tender return include tender assessment, clarification of any unclear tender information or submission, and interviews with short-listed tenderers prior to negotiating an agreed contract sum with the prospective awardee. Accepting the contractor's offer does not mean that the outsourcer will obtain the expected service. Work performed by the contractor needs to be monitored to ensure compliance with what was specified, and the work the contractor accomplishes requires measurement to justify its payment. Any disputes during the contract period entail time and monetary resources for their resolution.

In theory, ex ante and ex post transaction costs are interdependent (Rao, 2003). A lesser input of ex ante resources would result in a less proper contract that requires more ex post management effort (Lai et al., 2006), among which the cost of monitoring contract performance is critical, not merely because its amount counts toward the total transaction cost, but also because its effectiveness affects quality outcomes (Dean and Kiu, 2001). Deploying more management resources to monitor the contract work closely would help obtain better service, but the cost so incurred may outweigh the gain in service quality, thus reducing the contract value. In practice, the way in which building O&M contracts are managed often depends on the customary practice, corporate culture, available human resources, etc. of the management team.

While the difficulties of measuring transaction costs (Masten, 1996; Buckley and Chapman, 1997) and service quality (Øvretveit, 1993; Edvardsson et al., 1994) are common hurdles to evaluating contract management effectiveness, measuring the cost of contract monitoring in particular should be feasible, because the management tools used for this purpose are typically implemented on a regular (e.g. weekly, monthly) or intermittent but definite (e.g. once or twice a year) basis, where the deployed human and time resources can be recorded.

This paper reports the findings of a research study which, through collecting information from face-to-face interviews with practitioners who looked after O&M contracts for commercial buildings in Hong Kong, examined the costs of monitoring those contracts. The next section reviews the common management tools that can be used for contract monitoring. The remaining parts summarise the data collation method, the demography of the samples and the analysis of the contract monitoring costs.

Management tools for monitoring O&M contracts

There is an extensive literature on the management of services in a wide range of industries and businesses (e.g. Jones, 1989; Fitzgerald et al., 1991; Domberger, 1998; Johnston and Clark, 2005) and of built facilities (e.g. Alexander, 1996; Atkin and Brooks, 2000; Barrett, 2000; Reuvid and Hinks, 2002), and there are studies on measuring service quality in various industries (e.g. Bouman and van der Wiele, 1992; Johns and Tyas, 1996; Fuentes, 1999; Aldlaigan and Buttle, 2002). But few studies have focused on the use of management tools, and their costs, for monitoring building O&M service contracts. Among the tools recommended for monitoring contract performance, balanced scorecard, benchmarking, customer satisfaction survey, performance review meeting and audit, which can be used or adapted for use on building O&M contracts, are reviewed as follows.

Balanced scorecard, taking four different perspectives, namely financial, customer, internal business, and innovation and learning, can be used for performance measurement (Kaplan and Norton, 1993). This method is devised primarily for gauging the performance of an organisation as a whole rather than being specifically suitable

Monitoring building O&M contracts 239



for monitoring O&M contract performance. Although some recent works (e.g. Amaratunga and Baldry, 2003) have suggested developing balanced scorecards for facilities management, which embraces building O&M as one of its core competences, their common application is yet to be seen.

Benchmarking can help organisations obtain value-for-money services and strive for competitive advantage through continuous monitoring and improvement of service delivery (Camp, 1989; Varcoe, 1993; Lema and Price, 1995). After much effort on promoting what and how to benchmark for facilities performance (Kincaid, 1994; Varcoe, 1996; Hinks and McNay, 1999; McDougall and Hinks, 2000), benchmarking applications based on ballpark data have emerged in surveys (e.g. IFMA, 2001) and benchmarking clubs (e.g. BSRIA, 2006). Rigorous benchmarking studies involving in-depth and sensitive information about building O&M contracts, however, remain rare because of the financial, knowledge, motivation and information barriers that practitioners often encounter (Lai and Yik, 2005, 2006).

Customers' (end-users') satisfaction with O&M service results from comparing the service they perceive with that they expected (Grönroos, 1984; Parasuraman et al., 1985). The customer satisfaction survey is an essential tool for assessing customers' satisfaction, which may vary among individuals and over time (Kennedy, 1996; Bandy, 2002). Regularly undertaking customer satisfaction surveys helps reveal any gaps between the performance of the O&M service and the expectations of end-users, based on which remedial actions can be taken to continually improve the service towards total quality (Grigg, 1996; Roberts, 2002).

Outsourcing of O&M work will bring quality service at lower cost only if there is adequate and appropriate review or monitoring of the contractor's performance through regular meetings (Greaver, 1999). There is evidence that good contractor service has been ruined by a lack of performance monitoring (e.g. Angelici et al., 1995). In addition to monitoring the contractor's performance, a prime purpose of the performance review meeting is to allow both contracting parties to communicate bilaterally any matters arising during the contract period, so as to identify any changes necessary to align with the strategic O&M objectives. Rather than serving bureaucratic policies and procedures, the performance review meeting is a regular mechanism for improving service performance.

BSI (1993) defines an audit as a systematic examination of, for example, documents, reports, accounts, stock holdings or quality attributes. The meaning of the term audit, however, often varies with the function being audited. Without a well-established definition, an O&M audit may be regarded as (Nanayakkara and Smith, 1997; Donaldson and Armstrong, 2000): "a systematic examination carried out to check whether O&M activities are carried out as planned and whether the results of these activities yield the anticipated benefits". Different from an audit of a quality assurance system (e.g. ISO 9000), which focuses on checking compliance of procedures on paper, an O&M audit extends to cover physical examination of work (Nanayakkara and Smith, 1997). This is essential for safeguarding the quality and workmanship of O&M work, especially for hidden work (e.g. installation of a concealed conduit) and servicing (e.g. visual checks on running equipment) which could be costly and sometimes impracticable to verify after completion.

Table I summarises the major characteristics of the reviewed management tools. Balanced scorecard and customer satisfaction survey offer the same flexibility as to

|                        | Balanced scorecard  | Benchmarking                     | Customer satisfaction survey | Performance review meeting | O&M audit             |
|------------------------|---------------------|----------------------------------|------------------------------|----------------------------|-----------------------|
| Outsourcer involvement | Optional            | Optional                         | Optional                     | Essential                  | Optional              |
| Contractor involvement | Optional            | Essential                        | Optional                     | Essential                  | Essential             |
| Consultant involvement | Optional            | Optional                         | Optional                     | No                         | Optional              |
| End-user involvement   | No                  | No                               | Essential                    | Optional                   | No                    |
| Issues in focus        | Strategic, tactical | Strategic, tactical, operational | Tactical, operational        | Strategic, tactical        | Tactical, operational |

whether the outsourcer, contractor or consultant (a third party appointed by the outsourcer) takes part in their implementation. Whereas the customer satisfaction survey relies on collecting feedback from end-users and largely focuses on issues at the tactical and operational levels, there is no user involvement in a balanced scorecard exercise, whose emphasis is usually on strategic and tactical matters. Without involving end-users, benchmarking, performance review meeting and O&M audit belong to a set of management tools that require the contractor to provide performance data (e.g. energy cost of system operation, breakdown frequency of equipment) for benchmarking purposes; to participate in meetings reviewing his performance; and to demonstrate compliance with contract requirements when his work is audited. The outsourcer may be directly involved in, or may engage a consultant to participate in, the conduct of benchmarking and O&M audits, whereas the contractor's performance is reviewed in meetings by the outsourcer and, sometimes, by representatives of the end-users, depending on the issues of concern. Unlike a benchmarking exercise, where the issues in focus can be strategic, tactical or operational (Lai and Yik, 2006), the usual agenda of performance review meetings includes only strategic and tactical items, whereas typically tactical and operational O&M matters are audited.

Data collection and demography of samples

Face-to-face interviews, which help solicit more reliable and detailed information, were conducted individually with 28 experienced practitioners who looked after O&M contracts for commercial buildings in Hong Kong. Based on a selected contract, each interviewee was asked to indicate the contract sum, the management tools, and the manpower and time resources used for monitoring the contract.

To enable quantification of the human capital deployed for contract monitoring, the questionnaire asked about the salary of the interviewees and of their colleagues who participated in the monitoring process. However, nine interviewees refused to provide this sensitive information. Among the useful responses, senior practitioners at the top management and managerial levels had no problem disclosing the salaries of their subordinates, such as supervisors and technicians, but some interviewees (e.g. managers) were able to indicate only the approximate rather than the exact salary of their superiors (e.g. directors). The salary ranges, classified into four job ranks (top management, managerial, supervisory and operational), are displayed in Table II. These findings are comparable to those obtained by Yik et al. (2002), although the latter categorised the practitioners into three ranks.

Table I. Characteristics of the management tools



One of the interviewees managed a Grade C office building where individual tenants were required to shoulder the O&M responsibility for the unitary installations (e.g. split-type air-conditioners) serving their respective premises, and all the routine O&M works for the communal installations were entirely undertaken by in-house staff of the outsourcer. Because of incomplete records, data about the efforts made by the in-house team on monitoring those statutory or large-scale O&M works procured on an irregular basis were unavailable. Only the information about the frequency of using the management tools reported by this interviewee is usable in part of the later analyses (i.e. Tables III-V).

Table II. Salary range of O&M practitioners

| Job rank       | Typical job titles           | Monthly salary (HK$) |
|----------------|------------------------------|----------------------|
| Top management | Director, department head    | 50,000-100,000       |
| Managerial     | Senior manager, manager      | 20,000-60,000        |
| Supervisory    | Engineer, assistant engineer | 12,000-30,000        |
| Operational    | Technician, artisan          | 8,500-19,500         |

Table III. Frequency of customer satisfaction survey

| Frequency   | Number of contracts |
|-------------|---------------------|
| None        | 8                   |
| Monthly     | 0                   |
| Quarterly   | 1                   |
| Half yearly | 3                   |
| Yearly      | 5                   |
| Others      | 2                   |

Table IV. Frequency of performance review meeting

| Frequency | Number of contracts |
|-----------|---------------------|
| None      | 4                   |
| Weekly    | 0                   |
| Bi-weekly | 2                   |
| Monthly   | 4                   |
| Quarterly | 1                   |
| Ad hoc    | 8                   |
| Others    | 0                   |

Table V. Frequency of O&M audit

| Frequency           | Number of contracts |
|---------------------|---------------------|
| None                | 15                  |
| Monthly             | 1                   |
| Quarterly           | 0                   |
| Half yearly         | 1                   |
| Yearly              | 1                   |
| Others (biennially) | 1                   |

Except for a sample pertinent to a hotel building, the rest are O&M contracts for office or office-retail buildings equipped with centralised building services systems, i.e. Grade A or B buildings according to the classification by RVD (2004). One sampled contract, lasting for only 1.5 months, was procured for a small-scale equipment calibration work. Another was a three-year total outsourcing package covering the O&M of all the landlord installations of a large-scale office building. The contract periods of the remaining samples, which are contracts for individual trades of specialist work, ranged from one to three years. Customer satisfaction survey, performance review meeting and O&M audit were variously used in monitoring the contracts, but the use of balanced scorecard or benchmarking was not found in any of the samples.

Customer satisfaction survey

The frequency of conducting customer satisfaction surveys on O&M service quality varied among the surveyed contracts (Table III). Despite the well-documented benefits, a customer satisfaction survey was not carried out in most of the contracts or had been done only once a year. Echoing the criticisms made by Bandy (2002) and Pratt (2003), these results uncover a general lack of attention to the importance of establishing a "culture of service". As an alternative to the common gauging of customer satisfaction at fixed intervals in office or office-retail buildings, in the hotel case the guests were invited to participate in the survey during their stay. In an office-retail building, the survey was carried out on a jobbing basis, where tenants were asked to express their satisfaction with the work completed by the O&M contractor. In one-third of the contracts, the interviewees were unaware of the customer satisfaction survey results, reflecting that the results were not shared with them or that they paid little attention to the results:

TC_cs = F_cs × Σ_{j=1}^{4} [ T_cs,j × N_j × S_j / (D_m,j × H_d,j) ]    (1)
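The arithmetic of equation (1) can be sketched in code. The following Python sketch is illustrative only: the function name and all salary and time figures are assumptions, not data from the study (the notation is explained in the text following the equation).

```python
# Sketch of equation (1): transaction cost of customer satisfaction surveys,
#   TC_cs = F_cs * sum_j( T_cs,j * N_j * S_j / (D_m,j * H_d,j) )
# where each staff rank j contributes its time valued at an hourly rate
# derived from the monthly salary. All figures below are illustrative.

def survey_cost(frequency, staff):
    """Cost of running `frequency` surveys.

    staff: list of (hours_per_survey, headcount, monthly_salary,
           working_days_per_month, working_hours_per_day), one tuple
           per job rank involved.
    """
    cost_per_survey = sum(
        hours * n * salary / (days * day_hours)
        for hours, n, salary, days, day_hours in staff
    )
    return frequency * cost_per_survey

# Hypothetical contract: four surveys over the contract period, each taking
# one supervisor (HK$20,000/month) 3 hours and one technician
# (HK$12,000/month) 5 hours, on a 23-day, 8-hour working month.
tc_cs = survey_cost(4, [(3, 1, 20000, 23, 8), (5, 1, 12000, 23, 8)])
print(round(tc_cs, 1))  # → 2608.7
```

The salary figures sit within the ranges shown in Table II but are otherwise invented for the example.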

There were no cases employing a third party (consultant) to carry out customer satisfaction surveys. The transaction cost incurred for undertaking a customer satisfaction survey (TC_cs) by the staff of the outsourcer or the contractor (if so required by the contract) can thus be determined using equation (1), where F_cs is the frequency of the surveys during the contract period; T_cs,j is the time that staff at rank j devoted to the survey task; N_j, S_j, D_m,j and H_d,j are respectively the number, monthly salary, working days per month and working hours per day of staff at rank j engaged in the task; and j is 1 (top management), 2 (managerial), 3 (supervisory) or 4 (operational). But because the process of a customer satisfaction survey typically spans a period long enough to collect the customers' responses, the interviewees reported that no proper record had been kept from which the irregular manpower and time expended during the survey period could be counted. Calculating the transaction cost in this respect is therefore infeasible.

Performance review meeting

In the majority of the samples where the contractor's work was monitored by a building management company, only ad hoc performance review meetings were held (Table IV). This, according to the interviewees, is mainly because of the following:

• The contractor performs satisfactorily without requiring regular review by the management company.
• The two parties have a good working relationship such that they can communicate effectively by telephone or email without meeting face-to-face at regular intervals.
• To save resources, the management company prefers to meet the contractor only when his performance is intolerable or when important issues need to be resolved through meetings.

Monthly meetings were regularly conducted in another four contracts. Two key factors give rise to meetings at such a moderate frequency. First, the outsourcer takes great care over the contractor's performance, and hence the quality of service delivered to the users, because the in-house team would be directly exposed to any complaint of dissatisfaction from the users. Second, the in-house team has ample time for reviewing the contractor's performance face-to-face.

Surprisingly, no performance review meeting was conducted for four other contracts, despite the common recommendation to hold such a meeting at least once a year for progress review and for forecasting future developments or changes (Angel, 2003). These contracts bear one or both of the following characteristics:

• The contract period was short. For example, one contract lasted for only 1.5 months, during which the contractor was required to calibrate and tune some critical devices (e.g. temperature and pressure sensors) of the chiller plant. The in-house team intensively witnessed the calibrations the contractor performed during the short contract period, which made performance review meetings unnecessary.
• The in-house team lacked technical knowledge. For instance, in one case none of the management team members possessed relevant O&M knowledge, forcing them to give the contractor a free hand to perform.

Similar to the frequency of performance review meetings, their duration also varied. The total outsourcing case recorded the longest duration: 48 meeting hours per year. This is sensible given the substantial contract scope; having put all its eggs in one basket, the outsourcer needs to keep a close eye on the contractor's performance to prevent it from becoming unbearable.

Performance review meetings involve both contracting parties (Table I). Aside from optimum frequency and duration, delegating the right person(s) is crucial to economising the transaction cost incurred for performance review meetings. Table VI shows the mean and range of headcounts representing the management and contractor teams at the meetings. At the managerial, supervisory and operational levels, the management and contractor sides on average assigned similar numbers of representatives. The generally smaller number of contractor representatives, as pointed out by the interviewees, is because some contractor staff who needed to physically execute the O&M work were unable to attend the meetings.

The contractor's top management never joined the meetings. This suggests that the contractor had limited human resources at top management level or considered his accountability for the service outcome relatively small, in contrast to the much-concerned counterpart of the outsourcer, whose participation in the meetings was comparatively active. Besides, the higher involvement of supervisory and operational staff implies that issues about work supervision and execution dominated the meetings. This deviates from the principle of putting strategic matters high on the agenda.

O&M practitioners in Hong Kong generally work eight hours a day from Monday to Friday and four hours in the morning on alternate Saturdays. Discounting four Sundays, two off-duty Saturdays and two non-working Saturday afternoons from a 30-day calendar month, there are 23 equivalent working days, or 184 working hours. With the frequency and duration of the performance meetings and the number of representatives in attendance known, the monthly average duration of the meetings (D̄_pm) over the 12 months preceding the interviews was calculated, and the costs incurred for the meetings (TC_pm) were determined using equation (2), where F_pm is the frequency of meetings (number of meetings during the contract period); D̄_pm is the duration of each meeting (hours); and N_j, S_j, D_m,j and H_d,j carry the same notation as in equation (1):

TC_pm = F_pm × D̄_pm × Σ_{j=1}^{4} [ N_j × S_j / (D_m,j × H_d,j) ]    (2)
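The working-hours arithmetic and equation (2) can be checked with a short sketch; the meeting particulars (attendees and salaries) below are hypothetical, not drawn from the sampled contracts.

```python
# Worked example of the working-hours arithmetic and equation (2).
# The 23-day / 184-hour month follows the description of Hong Kong O&M
# working patterns; the meeting particulars are hypothetical.

# 30-day calendar month, less 4 Sundays, 2 off-duty Saturdays and
# 2 non-working Saturday afternoons (0.5 day each):
working_days = 30 - 4 - 2 - 2 * 0.5   # 23 equivalent working days
working_hours = working_days * 8      # 184 working hours per month

def meeting_cost(n_meetings, hours_each, attendees):
    """Equation (2): attendees is a list of (headcount, monthly_salary)."""
    hourly_rate = sum(n * salary / working_hours for n, salary in attendees)
    return n_meetings * hours_each * hourly_rate

# Hypothetical monthly meeting of 2 hours, attended by one manager
# (HK$40,000/month) and one engineer (HK$20,000/month):
tc_pm = meeting_cost(1, 2, [(1, 40000), (1, 20000)])
print(round(tc_pm, 2))  # → 652.17
```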

The calculated results are summarised in Table VII. Although the largest monthly average duration (two hours) seems insubstantial, its corresponding transaction cost is HK$5,000.0, which is equivalent to 7.4 per cent of the contract sum (OC). Note that, for simplicity of representation, the significance of the transaction cost amount is expressed as a fraction (per cent) of the respective contract sum; transaction cost and contract sum are in principle two separate entities.

Table VI. Number of representatives attending performance review meeting

| Job rank       | Management team mean | Range | Contractor team mean | Range |
|----------------|----------------------|-------|----------------------|-------|
| Top management | 0.3                  | 0-1   | 0.0                  | 0-0   |
| Managerial     | 0.7                  | 0-4   | 0.9                  | 0-2   |
| Supervisory    | 1.3                  | 0-2   | 1.1                  | 0-3   |
| Operational    | 0.6                  | 0-6   | 0.6                  | 0-3   |

Table VII. Monthly average duration and costs for performance review meeting and O&M audit

|                            | Performance review meeting | O&M audit |
|----------------------------|----------------------------|-----------|
| D̄ (hours)                 | 0-2.0                      | 0-1.3     |
| TC (HK$)                   | 0-5,000.0                  | 0-1,157.0 |
| TC/OC (%)                  | 0-7.4                      | 0-5.3     |
| (TC_pm + TC_ad)/OC (%)     |                            | 0-7.4     |



O&M audit

O&M audit was not implemented in most of the contracts (Table V). In the remaining cases, audits were carried out monthly, half yearly, yearly or biennially. In only two cases was the audit conducted comprehensively over one whole working day (around eight hours); in the other cases only brief audits of one to three hours were conducted. In all the audits, generally one or two supervisory or managerial staff of the in-house organisation were involved. None of the cases employed an external (third-party) auditor. This suggests that the resources deployed for O&M audits were usually minimal and that the importance of engaging professionals for independent audits was not commonly recognised.

An O&M audit incurs a transaction cost (TC_ad) which can be determined by equation (3), where F_ad is the frequency of audits (number of audits during the contract period) and D_ad is the duration of each audit (hours). The notations N_j, S_j, D_m,j and H_d,j are identical to those in equation (1). Drawing on the interview responses and running a calculation procedure similar to that for performance review meetings, the time and cost expended on conducting O&M audits for the sampled contracts are shown in Table VII. Although in none of the samples was a third-party consultant hired to carry out an O&M audit, the consultant fee, if any, should be added to the transaction cost:

TC_ad = F_ad × D_ad × Σ_{j=1}^{4} [ N_j × S_j / (D_m,j × H_d,j) ]    (3)
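Equation (3) takes the same form as equation (2); a sketch, again with hypothetical audit particulars and a hypothetical contract sum, also shows how a cost-to-contract-sum fraction of the kind reported in Table VII is obtained.

```python
# Sketch of equation (3) and the transaction-cost-to-contract-sum fraction
# (TC/OC). The audit particulars and the contract sum below are
# hypothetical, not taken from the study's samples.

WORKING_HOURS_PER_MONTH = 23 * 8  # 184 hours, as derived for equation (2)

def audit_cost(n_audits, hours_each, auditors):
    """Equation (3): auditors is a list of (headcount, monthly_salary)."""
    hourly_rate = sum(n * salary / WORKING_HOURS_PER_MONTH
                      for n, salary in auditors)
    return n_audits * hours_each * hourly_rate

# Hypothetical: half-yearly audits (2 per year) of 3 hours each by one
# managerial staff member on HK$40,000/month, for a contract worth
# HK$30,000 per month (HK$360,000 per year):
tc_ad = audit_cost(2, 3, [(1, 40000)])   # annual audit cost, HK$
tc_over_oc = 100 * tc_ad / 360000        # per cent of annual contract sum
print(round(tc_ad, 2), round(tc_over_oc, 2))  # → 1304.35 0.36
```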

The highest monthly average duration of O&M audit is 1.3 hours, and its equivalent transaction cost is HK$1,157.0. This amount, being about 10 per cent of the median operational staff salary (Table II), is significant, even though its fraction of the contract sum (5.3 per cent) is slightly lower than in the case of performance review meetings (7.4 per cent). Comparison between the monthly durations of performance review meetings and O&M audits does not reveal a significant difference.

Among all the sampled cases where performance review meetings, O&M audits or both had taken place, the highest monitoring cost equals 7.4 per cent of the respective contract sum. Most outsourcers would have little problem affording this amount (equivalent to HK$5,000.0 per month), but whether it is excessive, minimal or worthwhile depends on the gain in service quality brought about by the monitoring.

In theory, the amount of contract monitoring effort is under two opposing tensions. On one side, a bigger contract demands more resources for monitoring its larger volume of work; on the other, outsourcers can take advantage of greater managerial economies when administering bigger contracts with increased utilisation of their management team, and of larger commercial economies when procuring larger contracts with bigger discounts. A scatter-plot of the fraction of transaction cost for contract monitoring against monthly contract sum (Figure 1(a)) indicates that the latter tension prevailed. The downward trend is clearer in Figure 1(b), where the data of the total outsourcing contract, costing HK$2.8 million per month, were discarded.

The above analysis, nonetheless, has not taken into account the transaction cost for monitoring the contracts by customer satisfaction survey. The following factors, which are too extensive to be investigated in the current study, may also have contributed to the data variations:


Figure 1. Monitoring cost against contract sum

• Propriety of the formed contracts. A clearer and more complete contract scope, if adequately priced by the contractor, would demand less monitoring, as the contractor would be less likely to veil any work unknown to the employer (asymmetric information) or to evade his responsibility with guile (moral hazard) (Williamson, 1985; Lai et al., 2006).
• Complexity of the contract works. More complicated works would need more idiosyncratic investments (Williamson, 1979) in specialist manpower, tools and materials for their execution, and thereby closer monitoring of the work processes.
• Contractual relationship. Under a more relational contract (Macneil, 1974, 1978), less monitoring effort would be needed, as the contractor should know the expected performance standard better (e.g. the contracting parties have cooperated before) or would perform better in order to secure future contracts (e.g. the contracting parties anticipate future contract opportunities).
• Competence and diligence of the contractor. Contractors who are more competent and diligent would deliver good-quality work on time, thus demanding less intensive monitoring of their performance.
• Knowledge and experience of the management team. Management staff who are more knowledgeable about and familiar with the contract work would be able to monitor the contractor's performance more efficiently.

Conclusions

Devised for monitoring different levels of performance issues, different management tools involve different stakeholders of building O&M contracts in their implementation. The study shows that balanced scorecard and benchmarking are rarely used. Customer satisfaction survey is commonly used, but quantifying its cost is difficult without proper records of the relevant human and time resources. The contracting parties generally regarded performance review meetings as a useful means of resolving operational issues on an "as-needed" basis rather than as a tool for communicating and formulating strategic matters of the contracts. The use of O&M audit was occasional, indicating that evaluation of input resources and O&M service quality was uncommon.

It has been shown how the human and time resources deployed for conducting customer satisfaction survey, performance review meeting and O&M audit can be measured. Such cost, being part of the total transaction cost, is significant, although it tends to diminish with larger contracts. While outsourcers are free to choose which management tool, or combination of tools, to use to monitor contract performance, their use should be optimised and justified to ensure that the costs incurred do not outweigh the benefits gained from better service.

The focus of the study was on contract monitoring cost, which may vary with the propriety of the contract, the complexity of the work, the contractual relationship, and the capability and quality of the contractor and management teams. The extent to which these factors influence the monitoring effort is yet to be studied. More research needs to be undertaken to measure other ex ante and ex post transaction cost elements which, when made available, would be helpful for evaluating outsourcing decisions.

References

Aldlaigan, A.H. and Buttle, F.A. (2002), "SYSTRA-SQ: a new measure of bank service quality", International Journal of Service Industry Management, Vol. 13 No. 4, pp. 362-81.



Corresponding author
Joseph H.K. Lai can be contacted at: [email protected]

