Introduction

Corporate facilities management (CFM) has evolved as a corporate function through integration of the construction client function (provision of buildings) and the building operation function (provision of maintenance and other services) (Jensen 2008). For this study, CFM is defined as the management of buildings, facilities, and services during the whole life cycle (Van der Voordt 2017). In this CFM context, maintenance managers are confronted with assets that are digitalized during design, construction, maintenance, or operation. The combined use of sensor technology, radio frequency identification (RFID), and distributed ledger technologies can connect physical assets to the internet of things (IoT), creating “smart facilities” (Taneja et al. 2011; Pishdad-Bozorgi 2017). However, for CFM organizations that have outsourced maintenance execution to contractors, the utilization of digital data is not all plain sailing, because of the networked position of maintenance, both internally and externally. Internally, maintenance management relies on capital works for the supply of accurate as-built data (Thabet and Lucas 2017). Externally, there is a similar dependency on maintenance contractors, which in turn rely on data supplied to them by subcontractors and suppliers. And while data supply can be formalized in contracts, many contracts are by their nature incomplete, requiring both parties to work around unforeseen data issues or collaborate in solving them. The internal and external dependencies in fragmented data supply chains must be addressed by the maintenance function in order to capture and utilize digital data.
This may require the development of capabilities related to directing and leading a dynamic and complex network without (or with very limited) hierarchical authority over individual actors.

Maturity models are widely used to develop and improve organizational capabilities through the assessment of maturity, which is identified as “competency, capability, level of sophistication” (De Bruin et al. 2005). In recent years, maturity models have been gaining attention in the field of maintenance management. However, there is no maturity model that measures to what extent the requirements imposed by digitalization of assets in networked maintenance organizations are fulfilled.

In this paper, such a maturity framework is designed, with particular consideration of the lateral network management aspects of smartness in smart maintenance. The purpose is to develop a smart maintenance maturity assessment framework for CFM organizations. The following section reviews existing maturity frameworks and literature on smart maintenance. We will demonstrate that the current literature on smart maintenance is dominated by the manufacturing industry, and we propose a definition of smart maintenance for the context of CFM. Section “Methodology” describes how the research has been designed in two stages, using case studies and expert consultation. Section “Results” describes the smart maintenance maturity framework, and in the section “Discussion” it is critically reviewed. Finally, the paper ends with conclusions and intentions for future research.

Literature Review

The concept of maturity is applied in the literature to the maturing of persons, objects, or social systems (Kohlegger et al. 2009). Maturing of a social system can be viewed as going through several phases of increasing capability in a defined process area (Kohlegger et al. 2009). Maintenance maturity models have been developed for a variety of industries and asset classes.
Some models are applied in a wide range of organizational contexts and industries, such as the framework proposed by Campbell and Reyes-Picknell (2016), and the maturity frameworks of the Publicly Available Specification (PAS) 55:2008 (British Standards Institution 2008) and the Institute of Asset Management (Institute of Asset Management 2016). Other frameworks have been developed for specific industries, such as offshore oil and gas exploration (Energy Institute 2007), manufacturing (Schuh et al. 2010; Macchi and Fumagalli 2013; Oliveira and Lopes 2019), air traffic control (Kundler 2012; ISO 2003, 2004a, b, c, 2006), hospitals (Ali and Mohamad 2009), electricity and gas infrastructure (Mehairjan et al. 2016), and road and water infrastructure (Volker et al. 2013). An overview of the maturity models is given in Table 1. Parallel to the development of these maturity models, extensive digitalization of physical assets has emerged recently and contributed to what has gradually become known as smart maintenance.

Table 1. Overview of maintenance maturity models

Maturity model | Theoretical foundation | Maturity levels
Energy Institute (2007) | Capability maturity model | 5
Schuh et al. (2010) | Capability maturity model | 5
Volker et al. (2013) | Capability maturity model | 5
Macchi and Fumagalli (2013) | Capability maturity model integration | 5
Kundler (2012) | ISO/IEC Standard 15504 | 6
Chemweno et al. (2013) | Analytic network process methodology | 5
British Standards Institution (2008) | Deduced from field observations | 5
Ali and Mohamad (2009) | Deduced from field observations | 5
Mehairjan et al. (2016) | Deduced from field observations | 4
Institute of Asset Management (2016) | Deduced from field observations | 6
Campbell and Reyes-Picknell (2016) | Deduced from field observations | 5
Oliveira and Lopes (2019) | Deduced from field observations | 5

The literature on smart maintenance is evolving.
A great deal of attention has been aimed at understanding the impact of Industry 4.0 (the fourth generation of industrial activity) on maintenance, with a particular interest in advanced technologies, data science, and predictive analytics (Yan et al. 2017; Bokrantz et al. 2017; Kans and Galar 2017; Macchi et al. 2017; Lee et al. 2014, 2015, 2017; García and García 2019). A definition of smart maintenance is given by Bokrantz et al. (2020), who defined smart maintenance as “an organizational design for managing maintenance of manufacturing plants in environments with pervasive digital technologies.” Smart maintenance in this definition is specifically linked to the manufacturing industry and based on four underlying subdimensions: data-driven decision-making, human capital resource, internal integration, and external integration.

Other researchers have studied asset digitalization for smart contracting. Smart contracts, in the form of computerized transaction protocols, execute the terms of a maintenance contract for which repetitive maintenance transactions are programmed, coded, and embedded into a blockchain in advance (Christidis and Devetsikiotis 2016; Moretti and Re Cecconi 2018). When applied properly, smart contracts can improve workflows, reduce administration costs of transactions, and reduce the number of disputes between maintenance organizations and service agents (Li et al. 2018).

Many maintenance experts and researchers support the idea that the transition toward smart maintenance implies the development of new organizational capabilities (e.g., Bokrantz et al. 2017). Murphy and Chang (2009) were among the first who applied a maturity model to what, in hindsight, could be viewed as some form of smart maintenance. However, their discussion is limited to capturing and managing engineering data. Schmiedbauer et al. (2020) presented a maturity model for the manufacturing industry that combines smart maintenance with lean management, and Papic and Cerovsek (2019) presented a maturity framework that describes how organizations can become more mature in working with a digital twin. The models are summarized in Table 2.

Table 2. Comparison of smart maintenance maturity models

Maturity model | Theoretical foundation | Maturity levels
Murphy and Chang (2009) | Capability maturity model | 5
Schmiedbauer et al. (2020) | Capability maturity model integration | 5
Papic and Cerovsek (2019) | Capability maturity model | 5

In the assessment of what is known in the literature on smart maintenance maturity frameworks, we make some observations. The first one is related to the conceptualization of smart maintenance. It appears that the debate on this topic is dominated by researchers from the manufacturing industry and industrial maintenance. The facilities management (FM) perspective on smart maintenance is missing in the literature. Such a perspective should take into account the networked context of the maintenance function at the interface of an internal network of stakeholders and external construction-supply and maintenance-supply networks. For this study, situated in the CFM business environment, we propose a definition of smart maintenance management as the lateral leading of maintenance networks in creating stakeholder value with networked asset data. The second observation with respect to the existing literature is that maturity models can serve different purposes. Some of the reviewed models were developed and used as part of a large-scale organizational change initiative in nationwide operating asset owners (e.g., Volker et al. 2013; Mehairjan et al. 2016). Others were built and used to evaluate and compare the performance of license holders (e.g., Kundler 2012; Ali and Mohamad 2009).
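As a brief aside to the smart-contract mechanism reviewed above, its conditional-execution logic can be illustrated with a minimal sketch in plain Python. This is a toy model under stated assumptions (the class, job identifier, fee, and sensor flag are all invented for illustration), not an implementation of any actual blockchain platform:

```python
from dataclasses import dataclass, field

@dataclass
class MaintenanceSmartContract:
    """Toy model of a pre-programmed maintenance transaction.

    Payment for a job is released only once a sensor-confirmed
    completion record has been appended, mimicking the conditional
    execution of a blockchain smart contract.
    """
    job_id: str
    agreed_fee: float
    ledger: list = field(default_factory=list)  # append-only event record
    paid: bool = False

    def record_completion(self, sensor_confirms: bool) -> None:
        # Events are only ever appended, never modified or removed.
        self.ledger.append({"job": self.job_id, "confirmed": sensor_confirms})

    def settle(self) -> float:
        # The contract terms execute automatically: the fee is released
        # once, and only if a confirmed completion exists on the ledger.
        if not self.paid and any(e["confirmed"] for e in self.ledger):
            self.paid = True
            return self.agreed_fee
        return 0.0

contract = MaintenanceSmartContract(job_id="PPM-0042", agreed_fee=450.0)
contract.record_completion(sensor_confirms=False)  # disputed visit: no payout
print(contract.settle())  # 0.0
contract.record_completion(sensor_confirms=True)   # IoT-confirmed completion
print(contract.settle())  # 450.0
print(contract.settle())  # 0.0 (no double payment)
```

The point of the sketch is only that the payment condition is fixed in code in advance, which is what allows such transactions to reduce administration and disputes.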
In this research, we are looking for a maturity framework that can be used to guide CFM organizations in developing new capabilities related to smart maintenance. A third observation is related to the role that data, information, and knowledge play in smart maintenance networks. By having certain data, information, and knowledge that others do not have, a company develops a competitive advantage that can be used to influence the course of action of a network in its favor. A Delphi study among senior maintenance managers by Bokrantz et al. (2017) points toward the competitive advantage that data, information, trade secrets, services, and knowledge bring to individual companies. According to the experts consulted in that study, this forms a barrier to collaborating in digital networks because of “secret policies” and “difficulties in achieving obvious mutual benefits” (Bokrantz et al. 2017, p. 166). In the economics literature, several mechanisms are proposed to mitigate the negative effects of information asymmetries in procurement and supply chain management: regular meetings, joint problem solving, and goal alignment. The research by Bokrantz et al. (2017) seems to suggest that maintenance managers feel uncertain about how to implement such mechanisms.

The knowledge gap that this research aims to close is the gap between the existing smart maintenance maturity models and the requirements of asset owners in the building and construction industry in general, and CFM organizations in particular. The purpose is to design a smart maintenance maturity framework that meets the requirements of CFM organizations in addressing the internal and external dependencies in the data supply chain. The research is guided by the following research question: How can a smart maintenance maturity model be developed for CFM organizations that addresses internal and external data supply chain dependencies?
In developing such a framework, this paper aims to broaden the theoretical scope of smart maintenance to include building assets and the construction industry.

Methodology

Research Design and Case Selection

Managing dependencies in data supply chains takes place in real-life situations on an ongoing basis. As discussed by Yin (2014), Cavaye (1996), and Darke et al. (1998), case studies are well suited to explore the complexities and richness of such phenomena. In designing case study research, a fundamental question is how many cases should be studied and which cases should be selected. While Yin (2014) and Eisenhardt (1989, 1991) suggest that more cases lead to better theories, Flyvbjerg (2006), Dubois and Gadde (2002, 2014, 2017), Chen (2015), and Järvensivu and Törnroos (2010) point out the value of deep insights and theories that can be obtained from a single in-depth case study. The discussion about the number of cases is related to their generalizability or external validity, i.e., the way findings can be generalized beyond the case(s) that first generated them. This is based on the idea that a theory should be able to account for phenomena in settings other than the setting that was used to develop it (e.g., Yin 2014; Gibbert et al. 2008). Analytical generalization is used in case study research to generalize findings into theory, rather than to populations (Bryman 2012; Yin 2014; Gibbert et al. 2008). In this research, the purpose is not to develop a theory that explains smart maintenance, but to design a framework that can be used to measure smart maintenance maturity. Generalization in this work comes down to specifying the context for which the maturity framework was developed, and providing detailed research procedures and transparent criteria used for selecting cases, analyzing data, and identifying maturity dimensions (Dubois and Gadde 2014, 2017).
By providing an account of the methodology applied, the researchers describe their intellectual journey so that readers can evaluate the research approach (Dubois and Araujo 2007; Ruddin 2006).

Several measures were taken to validate the smart maintenance maturity framework and to allow it to be transferred to other contexts. First, the context was clearly specified as CFM organizations of universities. This guided purposeful sampling of cases in the design stage and of experts in the validation stage. Second, an extreme case was studied as a high-quality instance of smart maintenance in CFM organizations. This case provided detailed insights into smart maintenance capabilities. That is to say, an extreme or deviant case was used to obtain information from an unusual case at the far end of a particular dimension of interest (Bryman 2012; Flyvbjerg 2006). If dimensions of smart maintenance maturity are identified in this extreme, or best practice, case, then it is likely that these same dimensions can also be used to measure the maturity of CFM organizations of other universities. A third measure was collecting feedback from experts on a preliminary version of the framework. While this does not confer universal applicability on the framework, it can provide an initial confirmation of its applicability in the given business context of CFM organizations.

This research used a two-stage research design, in which a typical and an atypical (extreme) case were used to design a smart maintenance maturity framework. This was subsequently validated via an expert consultation (Fig. 1). In the design stage, the interview guide (the key research instrument) was tested through pilot interviews with independent practitioners (a senior asset manager and a tender manager) and an academic facilities management researcher. Two CFM organizations from two universities were selected for identification of maturity dimensions of smart maintenance and measurement scales.
Two different networks within each CFM organization were studied (embedded cases), around two different maintenance contracts. Background information on the cases is summarized in Table 3.

Table 3. Characteristics of selected cases

Characteristic | Case 1: Building maintenance | Case 2: MEP maintenance | Case 3: Hard and soft services | Case 4: EPC
Services contracted | CM, PPM, investment projects | CM, PPM, investment projects | CM, PPM | CM, PPM
Number of buildings serviced | 11 buildings, 4 campuses | 11 buildings, 4 campuses | 120 buildings, 8 campuses | 70 buildings, 1 campus
Floor area serviced (m²) | 117,600 (NIA) | 117,600 (NIA) | 488,000 (GFA) | 351,700 (GFA)
Contract type | Volume and projects based | Performance, assets, and project based | Performance and assets based | Performance based
Institutional environment | Education and research facilities | Education and research facilities | Education and research facilities | Education and research facilities
Location | Netherlands | Netherlands | Australia | Australia

In the validation stage, the preliminary smart maintenance maturity model was presented to a panel of seven experts selected for their experience and expertise. First, a profile was developed of the expertise required for validating the smart maintenance maturity framework. Four expert knowledge domains were identified as critical for validation: (1) CFM in universities, (2) organization of maintenance processes, (3) predictive maintenance, and (4) data science and analytics in asset management. LinkedIn was used as a search tool to identify experts who matched one, or a combination, of the required knowledge domains. Candidate experts were contacted by mail and were sent a two-page information sheet with background information about the research and the aim of the expert meeting. The final step in selecting the experts was a brief intake interview by phone to assess each candidate’s suitability for the purpose of the meeting.
This procedure created an expert panel that could cover all dimensions of the maturity framework. The final panel balanced the roles of the asset owner and the maintenance contractors (Table 4). Four experts worked for asset owners: two of them in CFM organizations of universities, and two of them in large asset management organizations of public infrastructure operators. Three experts worked for maintenance contractors in building services and data science. Two had experience in working for CFM organizations of universities. All experts were provided with information on the maturity framework 5 days before the consultation took place. The consultation was held online. Separate online meetings were organized for two experts who were unable to join the group meeting. During these meetings, the experts commented on the following questions: (1) Which maturity dimensions do you recognize as important and why?; (2) Which maturity dimensions do you consider unimportant and why?; and (3) Which maturity dimensions are missing?

Table 4. Level of experience of the experts consulted

Expert | Area of expertise | Highest education | Relevant experience (years) | Current employment | Current job description
1 | CFM in universities and maintenance processes | B.Sc. Building Engineering | 37 | University | Senior asset manager
2 | CFM in universities and maintenance processes | B.Sc. Mechanical and Electrical Engineering | 10 | University | Manager building services maintenance
3 | Organizational development and process management | M.Sc. Business Administration | 20 | Gas utilities | Manager asset data management
4 | Maintenance processes and data science and analytics | M.Sc. Mechanical Engineering | 5 | Airport | Manager data and analytics asset management
5 | Maintenance processes and predictive maintenance | B.Sc. Economics | 30 | Building services contractor | Director national contracts
6 | Maintenance processes and predictive maintenance | Vocational education | 36 | Building services contractor | Regional manager
7 | Data science and analytics | M.Sc. Technology Management | 7 | Infrastructure and buildings contractor | Product owner data science team

Case Study and Data Collection

The typical case was used to find barriers to smart maintenance, because data quality had been identified as a concern for the CFM organization. The research question that guided this case study was as follows: What are the characteristics of maintenance networks that form a barrier to smart maintenance, and what processes are required to eliminate those barriers?

The extreme case was used to find drivers of smart maintenance maturity. Contrary to the typical case, in the extreme case asset data management was insourced. Prior to the case study, asset data management-related capabilities had been implemented, enabling the organization to reach an unusually high level of maturity in asset data governance. The university’s CFM organization had been recognized by independent researchers as a best practice case in asset management, and the asset management team was internationally certified and awarded for its achievements (Vago 2018). During the initial conversations with leaders and key informants of the CFM organization in negotiating access, this unusually high level of maturity was reflected in a detailed, accurate asset register and implemented practices and procedures for intra- and interorganizational asset data governance. For the extreme case, the following research question guided the case study: How is internal and external collaboration driving smart maintenance, and what processes are required to implement such collaboration?

Data was collected through 48 semistructured interviews and analysis of case-related documentation. The respondents were selected for their knowledge of and experience in working with asset data.
They were recruited from different teams within CFM (project management, maintenance management, and procurement), and from main contractors and subcontractors. They were identified with the assistance of the key informant through snowballing at the embedded case level, as well as at the organizational level (Tables 5 and 6). All interviews were recorded, transcribed, anonymized, and sent to the respondents for comments. The software NVivo version 12 was used to code and analyze the interview data. The case studies were also informed by project data. The documents collected included, among others, contracts, information system documentation, and minutes of contract governance meetings. In Cases 1 and 2, the first researcher was given access to contractual documentation and to the online file sharing platform used to store and share drawings and technical data on the buildings. The contract documentation included, but was not limited to, requests for expressions of interest, programs of requirements, pricing formats, and maintenance implementation plans. The information management documents included (when available) the asset register structure, site-related technical log file structure, data hub structure, and building passport design. Organizational policy documents included, but were not limited to, organizational charts and the contract governance structure. In Cases 3 and 4, the first researcher joined a project handover meeting as an observer (Case 3) and visited an equipment site (Case 4) with a service technician. Finally, secondary data was collected: two student reports on data quality and data ownership in building maintenance (Cases 1 and 2), a state government audit report on asset management in universities (Case 3), and online news articles (Cases 1, 2, and 4). During the overlapping periods of data collection and analysis, the researcher kept a diary and took notes from informal and occasional conversations.
These were used to interpret the data and to adjust the interview questions during data collection.

Table 5. Data collected per case

Data collection method | Case 1: Building maintenance | Case 2: MEP maintenance | Case 3: Hard and soft services | Case 4: EPC
Interviews | FMM: 2; FMO: 1; CS/CM: 1; and SC: 4 | FMM: 1; FMO: 1; CS: 2; CM: 4; and SC: 5 | FMPR: 1; CS: 1; and CM: 2 | FMP: 2; FMO: 1; and SC: 1
Case data | Contract, information systems, online file share, meeting minutes | Contract, information systems, online file share, meeting minutes | Contract, information systems, meeting minutes | Contract, project organization chart, energy conservation measures, measurement and verification planning
Site visit | None | None | Direct observation of handover meeting, notes (one meeting) | Direct observation of equipment site, notes, pictures (one site visit)
Other | Student report, news articles | Student report, news articles | State government report on asset management in higher education | News articles

Table 6. Interview data collected at the level of the CFM organization

Data collection method | CFM organization A (Cases 1 and 2) | CFM organization B (Cases 3 and 4)
Interviews | FMS: 1; FMM: 1; FMPR: 1; and FMO: 4 | FMS: 2; FMP: 1; FMM: 6; FMPR: 1; and FMO: 2

In the final stages of both the typical and the extreme case, findings were discussed with key informants of the cases. In the typical case, draft versions of case study reports were discussed with participants and key informants from both levels of analysis (embedded case level and organizational level). In Cases 3 and 4, findings were discussed with two key informants during final interviews.

Cross-Case Analysis and Expert Validation

Case data of the four cases was analyzed in different ways to identify the maturity dimensions for smart maintenance. Coding was used to analyze the data and identify maturity dimensions.
Coding in qualitative research is a process of breaking down qualitative data into component parts, which are given labels that seem to be of theoretical significance or that appear to be particularly salient within the social worlds of those being studied (Bryman 2012). The way the data is labeled in coding may depend on which angle or lens is used by the researchers (Saldaña 2016). Using different lenses in coding the data for the cross-case analyses of the typical and extreme cases provided understanding of what happened in both cases (Bazeley 2009). In Cross-case analysis 1, for the typical case, the lens used for coding was finding barriers to smart maintenance (Fig. 2). The data was screened for text fragments that described processes related to absent or underdeveloped asset digitalization. Coding of the data produced 15 barriers (Cases 1 and 2) to smart maintenance (Table 7). In Cross-case analysis 2, the lens for coding was finding drivers of smart maintenance (Fig. 2). The data was screened for text fragments that described processes that enabled collaboration between asset owner and maintenance contractor in exchanging digital asset data. Initial coding produced 17 drivers (Cases 3 and 4) of smart maintenance (Table 7). The second step in data analysis was to use several tactics to combine and refine these 15 barriers and 17 drivers into a coherent and manageable set of maturity dimensions. First, the barriers and drivers were compared to determine whether they referred to the same underlying process and could be integrated into a single maturity dimension. Second, axial coding was used to refine and describe the meaning of the maturity dimensions. In that process, data or text fragments were reassembled to link lower-level maturity dimensions to higher-level maturity dimensions, thus bringing structure and hierarchy into the maturity framework (Saldaña 2016).
Third, the emerging maturity dimensions and their relations were compared with those of existing maturity models to understand to what extent they confirmed or deviated from them. This iterative process of comparing empirical findings with existing maturity models or frameworks is described as “matching” by Dubois and Gadde (2002, 2014).

Table 7. Barriers and drivers of smart maintenance

Barriers (Cases 1 and 2):
1. Fragmented maintenance budgets
2. Fragmented client-contractor communication
3. Poorly understood project handover process
4. Information management based on personal ad hoc troubleshooting
5. Excessive dependence on tacit knowledge of suppliers and subcontractors
6. Engineering-dominant role perception of clients’ maintenance managers
7. Undocumented asset criticality
8. Negotiating different versions of the asset register
9. Information asymmetries between contractor and client
10. Adversarial relationship between maintenance and contractors
11. Adversarial relationship between maintenance and projects
12. External locus of control over data flow
13. Data loss or leakage during contractor transitions
14. Inability to understand each other’s information needs
15. Lack of commitment and ownership among individuals

Drivers (Cases 3 and 4):
1. Environmental goals of the university
2. Reporting obligations of the energy performance contract
3. Shared understanding within CFM of the asset life cycle
4. Culture of continuous improvement
5. Demand for predictable costs and asset availability
6. Internalized process data and system ownership
7. Internalized locus of control over data flow
8. Maintenance involvement in the early project definition stage
9. Organization-dominant role perception of clients’ maintenance managers
10. Proactive approach to project handover
11. Proactive approach to maintenance contractor transition
12. Maintenance authorization role in project handover
13. Day-to-day data quality control
14. Contractor coaching in data management
15. Project manager support in data collection
16. Procedures for transforming tacit into explicit knowledge
17. The asset register as the single source of truth

After the maturity dimensions were identified, CFM Organizations A and B were compared in Cross-case analysis 3 to identify measuring scales for the maturity dimensions. Many existing maintenance maturity models, based on the capability maturity model (CMM) or capability maturity model integration (CMMI) framework (Paulk et al. 1993; Macchi and Fumagalli 2013), use five or six levels to measure maturity as the extent to which certain processes are implemented and repeatedly practiced. Although this approach has proven its value for certification purposes, it has received criticism, as discussed by Pöppelbuß and Röglinger (2011). According to King and Kraemer (1984), maturity models should not focus on a sequence of levels toward a predefined end state, but rather on the factors driving evolution and change. The rationale behind Cross-case analysis 3 is to maximize learning by comparing the extreme with the typical case. Therefore, the analysis aimed at identifying, for each maturity dimension, the factors that drove evolution toward the extreme case. The question that guided the analysis was as follows: What variables can be used to express differences in the level of maturity for the identified maturity dimensions?

Combining the data from the typical case and the extreme case for all maturity dimensions ensured that both ends of the spectrum could be defined qualitatively. Selective coding of data from Cases 3 and 4 for all maturity dimensions meant that the data was screened for text fragments that described the organizational development of CFM Organization B for that particular dimension.
This process showed that, for almost all 23 lower-level maturity dimensions, there was a strong relationship between the lower-level and the associated higher-level maturity dimension, in the sense that the same scale variable could be used for both.

When the maturity dimensions were presented to the experts, they were asked to respond to the three questions listed in the “Methodology” section: which dimensions they recognized as important, which they considered unimportant, and which were missing.

With respect to the first question, “alignment,” “leadership,” and “culture” were mentioned as important maturity dimensions because they were considered critical for collaboration and communication between clients and contractors, as well as between different departments of the client. “Culture” was considered important as something that should be developed top-down, from senior management toward middle managers and the workforce, in order to stimulate behaviors aligned with the intentions of contractual arrangements. “Knowledge management” was considered important because retirement of maintenance staff and outsourcing often lead to a loss of tacit knowledge.

With respect to unimportant dimensions, one contractor-employed expert regarded “tracking and tracing of jobs” as unimportant because this data allegedly would be recorded elsewhere in the maintenance supply network by the contractors and subcontractors. However, one client-employed expert thought this was a relevant dimension because it assures access to relevant management information.
These differing expert opinions point to the different positions that clients and contractors take in controlling the maintenance service supply network, and the use of information for that purpose; this will be discussed later in the paper.

With regard to maturity dimensions missing from the model, one expert suggested “user communication.” In the final analysis, the researchers chose not to follow this suggestion because, while relevant for facilities management, it was deemed beyond the scope of the smart maintenance maturity model.

Results

The generic structure of the maintenance networks found in all cases is shown in Fig. 3. In the intraorganizational network, three organizational teams or units collaborated in asset operation and maintenance: the maintenance unit itself, the projects unit, and the facilities management unit. Both the projects unit and the maintenance unit engaged with contractors and suppliers, but often not the same ones. The projects unit engaged with designers, contractors, external project managers, and consultants for the construction and installation of new assets. The maintenance unit coordinated the work of main contractors, while condition assessments were outsourced to asset inspectors.

In all cases, maintenance-related data on the assets was stored in different places and in different systems operated by different units. A computerized maintenance management system (CMMS) was used by all the maintenance organizations. However, there were significant differences in how this CMMS was used, as will be discussed later. Another common finding was that, in all four cases, the maintenance organization, the maintenance contractors, and the condition inspectors all used their own CMMSs to capture and store the data sets used for their part of the overall maintenance process. In Cases 3 and 4, a large number and variety of data sets were used for managing maintenance processes.
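The consequence of such distributed data sets can be illustrated with a minimal, hypothetical sketch: a client-side register and a contractor-side register describing the same plant under different identifiers, so that even a naive reconciliation on shared asset IDs leaves gaps. All registers, identifiers, and field names below are invented for illustration:

```python
# Hypothetical extracts from two systems describing the same buildings:
# the client's CMMS and a maintenance contractor's CMMS.
client_register = {
    "AHU-01": {"description": "Air handling unit, bldg 11", "criticality": "high"},
    "CHL-02": {"description": "Chiller, bldg 11", "criticality": "high"},
    "PMP-07": {"description": "Pump, bldg 4", "criticality": "low"},
}
contractor_register = {
    "AHU-01": {"last_service": "2023-05-02"},
    "CH-2": {"last_service": "2023-04-18"},  # same chiller, coded differently
}

# Naive reconciliation: join only on identifiers shared by both systems.
matched = sorted(set(client_register) & set(contractor_register))
unmatched = sorted(set(client_register) - set(contractor_register))

print("Matched assets:", matched)        # ['AHU-01']
print("No service history:", unmatched)  # ['CHL-02', 'PMP-07']
```

In this sketch the chiller’s service history is effectively invisible to the client because the two systems never agreed on a shared identifier, which is the kind of integration and data-quality problem described next.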
The networked business environment of maintenance, combined with the existence of distributed data sets, produced severe challenges for the maintenance organization with respect to electronic integration, internally and externally, and data quality. These challenges were addressed differently by actors in the extreme case compared with the typical case.

Dimensions Found

Following the procedures described in the “Methodology” section, the smart maintenance framework was developed and validated. It consists of 23 lower-level maturity dimensions that are linked to 8 higher-level maturity dimensions (Fig. 4).

Governance structure is about allocating responsibilities for tasks, activities, or processes to persons or organizational units. The data from the cases showed various levels of maturity in this respect, but in all the cases designing governance structures emerged as a relevant dimension of maturity in smart maintenance, in particular in the areas of project governance, maintenance governance, and data governance. Governance of projects refers to decision-making responsibilities for initiating and delivering projects. These responsibilities can be allocated to one centralized capital works unit, or they can be allocated to different parts of the organization (e.g., based on location, project type, or asset class). Governance of outsourced maintenance refers to the allocation of responsibilities for outsourcing maintenance and administrating maintenance contracts. These responsibilities can be allocated to one centralized unit, or to different parts of the organization (e.g., based on location, building, or asset class). Governance of asset data refers to the allocation of responsibilities for defining, capturing, and storing asset data.
Just as with the other governance dimensions, these responsibilities can be centralized or decentralized.

Alignment of data definitions, processes, and systems is about linking and connecting data definitions, processes, and systems from different actors in a meaningful way. In none of our cases was there a central repository where all asset data was stored; each actor used its own system. Alignment therefore emerged as a relevant maturity dimension. Alignment of processes and work instructions refers to the extent to which processes and work instructions of the maintenance organization are aligned with those of internal stakeholders (mainly the capital works team), as well as external contractors. Processes prescribe workflows that can flow across organizational boundaries, and work instructions prescribe how individuals behave in certain situations. When processes and instructions are aligned, contractors, project managers, and maintenance managers use consistent ways of capturing and working with data in different stages of the project or asset life cycle. Alignment of data definitions refers to the extent to which data definitions of the maintenance organization are aligned with those of internal stakeholders (mainly the capital works team), as well as external contractors. When the data definitions are aligned, contractors, project managers, and maintenance managers use the same terminology and data definitions across different stages of the project life cycle. Alignment of systems refers to the extent to which systems of the maintenance organization are aligned with those of internal stakeholders (mainly the capital works team), as well as external contractors. Different stakeholders should be aligned to produce the same output, even when they use different commercial products and systems.

Tracking and tracing is about the ability to identify individual pieces of asset data.
Tracking and tracing was associated with three types of asset data: assets, condition, and jobs. Asset tracking and tracing refers to the way the maintenance organization tracks and traces individual assets. One manager reported that approximately 600 smaller and larger projects were executed on a yearly basis to adapt the facilities to accommodate changing requirements. Asset data can be stored in the asset register at different levels of detail to keep track of all assets. The level of detail is related to the way larger infrastructures are configured and broken down into individual components and subcomponents. Condition tracking and tracing refers to the scale and frequency of collecting condition data. Condition data, for example, can be captured for all the assets on a yearly basis (large scale + high frequency), or for a part of the assets once every 3 years (limited scale + low frequency). For some pieces of equipment that are monitored closely, specific condition data can be sampled automatically at millisecond intervals. Jobs tracking and tracing refers to the way individual maintenance jobs can be traced back to individual assets. A maintenance job on a chiller, for example, can be linked to that specific chiller (high detail) or to the building where the chiller is located (low detail).

Data-driven decision-making is the ability to support maintenance decision-making by the analysis and interpretation of various data sets. Data-driven managerial decision-making emerged around life cycle modeling, predictive maintenance, and evidence-based contract administration. Life cycle modeling refers to a structured and systematic approach to analyzing and predicting the service life of all assets, based on condition data and predictive models. The maintenance organization, for example, can produce a yearly life cycle condition report for the assets under maintenance.
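By way of illustration, the logic behind such condition-based life cycle reporting can be reduced to a deliberately simple Python sketch (the linear degradation model, the scoring direction, and the failure threshold are our assumptions for illustration, not the method used in the cases):

```python
def predict_remaining_life(condition_history, failure_threshold):
    """Fit a linear degradation trend through the first and last
    (year, condition score) observations and estimate the year in which
    the failure threshold is reached. A higher score is assumed to mean
    a worse condition; returns None when no degradation is measured."""
    (y0, c0), (y1, c1) = condition_history[0], condition_history[-1]
    rate = (c1 - c0) / (y1 - y0)  # condition points per year
    if rate <= 0:
        return None
    return y1 + (failure_threshold - c1) / rate

# Illustrative chiller assessed every 3 years, condition drifting from 2 to 4:
history = [(2015, 2.0), (2018, 3.0), (2021, 4.0)]
end_year = predict_remaining_life(history, failure_threshold=5.0)
print(end_year)  # threshold reached around 2024
```

Real life cycle models are of course richer than a straight line, but even this sketch makes clear why the dimension depends on the scale and frequency of condition data collection discussed above.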
Predictive maintenance refers to the ability to implement and use predictive maintenance policies for critical assets, in which servicing is not executed according to predetermined intervals or run hours, but according to predictive analysis of real-time data streams. In evidence-based contract administration, the maintenance organization supports the procurement organization in managing the contract with the maintenance contractor by providing evidence-based information on the number of assets and the exact number of maintenance tasks that have been carried out in a specified time window.

Sustainability monitoring is about the ability to measure and report asset performance from a sustainability perspective. In our cases, this was related to energy performance and to asset replacement and upgrade programming. Energy performance measuring and verification refers to the consistent evaluation of energy performance. In energy performance contracting, the evaluation of the realized water and energy savings relies on accurate data and certified verification methods. Asset monitoring and replacement programming refers to the systematic review of the asset portfolio from a life cycle perspective. While certain assets may have several remaining years of technical service life, upgrading them before end-of-life may improve energy performance or might even be desirable from a strategic real estate management perspective. For some asset classes, relocation of an asset from one building to another may be an option.

Knowledge management in this context describes the ability to learn from the processing of networked asset data. This emerged in some rather practical and down-to-earth ways.
Contractor instruction and coaching refers to the maintenance organization’s role in instructing and coaching the maintenance contractor in its data capturing role (e.g., barcoding individual assets, tagging maintenance jobs to individual assets, coordinating subcontractors), ensuring that the contractor supplies the data in the right formats and definitions. Project manager support refers to the maintenance organization’s role in instructing and coaching the project managers in their data capturing responsibilities during project delivery and project handover, ensuring that the project manager supplies the data in the right formats and definitions. Knowledge creation refers to the way data is used to create managerial insights about maintenance, combining explicit and tacit knowledge in various degrees. In Cases 1 and 2, for example, planning and decision-making was based heavily on tacit knowledge, while in Cases 3 and 4 tacit knowledge was used to trigger and interpret the data-analytical side of understanding. The maintenance organization can use its data about the assets and their behavior to learn about the maintenance needs and requirements of its assets. The accumulation of data creates insights, making the maintenance organization more knowledgeable.

Culture is related to human values and behavior. Many respondents recognized that digitalization of assets was changing maintenance processes, but the extent to which the cultural dimension of these changes was addressed varied among the cases. Culture is related to the handling of nonstandard data, the valuation of asset data, and the innovation approach of the maintenance organization. Handling nonstandard asset data refers to the behaviors of individuals in dealing with unknown, unfamiliar, and nonstandard facts, information, and data.
Nonstandard data was not treated as a valuable asset in Cases 1 and 2, while in Cases 3 and 4 the team members of the maintenance organization demonstrated ownership and acted as data stewards, trying to incorporate nonstandard data into the system of data definitions. Valuation of asset data is related to the behaviors of individuals and teams in recognizing the value of asset data for improving maintenance operations and in developing practices to assess this value from the perspective of different stakeholders. Innovation approach refers to the attitudes and behaviors with respect to innovation. This varied from a culture of continuous improvement to one of incidental and incremental (project-based) improvement.

Leadership emerged in the sense of creating a safe learning environment and propagating shared business goals and values. As one senior manager put it, “My team members are allowed to fail.” By practicing this rule, he was creating a safe learning environment: a trustful environment in which staff members are encouraged to “try new things” and “are allowed to fail.” At the other end of the spectrum, leadership can create a risk-averse environment in which people are blamed for their errors. Shared business values and principles refers to the extent to which the leadership of the maintenance organization propagates and applies a set of shared values (e.g., service excellence, continuous improvement, innovation) and practices among its staff members. Leadership can be based either on a clear set of shared values and principles or on individualized values and principles.

Case Comparison

Table 8 provides the case data for the four embedded cases from CFM Organizations A and B. We can make some observations by comparing the case data. First, the data suggest that maturity in smart maintenance is more related to organizational factors than to contractual factors.
It appears that once certain capabilities have reached a certain level of maturity, they are used in whatever contractual arrangement maintenance is executed. Cases 1 and 2 were situated in the same maintenance unit (MU) of the same CFM organization. While both cases dealt with different asset classes, there were many similarities in the way asset data was managed. In both cases, the same governance structure for maintenance was used by the corporate facilities management organization, in which authorization of maintenance jobs and projects was allocated according to a separation of responsibility for different groupings of assets. Capturing and maintaining asset register data was also outsourced to the maintenance main contractor in both cases, without processes, work instructions, and data definitions being clearly described and defined. Alignment of data definitions, processes, and systems was rather immature, as were the tracking and tracing capabilities. While the MU did use a CMMS, it was not used for storing and warehousing structured asset register data. The CMMS data was at a high level of abstraction compared with the CMMS data in the main contractor’s system. The condition monitoring data was also gathered at a high level of abstraction, and with a low frequency (once every 3 years). While the maintenance main contractor operated a digital asset register, this was not shared with the MU frequently. Data exchange was very much based on unstructured information in PDFs, Word documents, and spreadsheets. Since the basic data was not very well managed, the MU’s maturity in data-driven decision-making and sustainability monitoring was rather low.

Table 8. Linking smart maintenance maturity dimensions to case data
Maturity dimension | Case 1: Building maintenance | Case 2: MEP maintenance | Case 3: Hard and soft services | Case 4: EPC
Governance structure | Decentralized | Decentralized | Centralized | Centralized
Alignment | Not aligned | Not aligned | Aligned | Aligned
Tracking and tracing | Absent | Low detail | High detail | High detail
Data-driven decision-making | Absent | Absent | Advanced | Advanced
Sustainability monitoring | Basic | Basic | Advanced | Advanced
Knowledge management | Personal learning | Personal learning | Organizational learning | Organizational learning
Culture | Bureaucratic | Bureaucratic | Entrepreneurial | Entrepreneurial
Leadership | Autocratic | Autocratic | Collaborative | Collaborative

Both Cases 3 and 4 were, in many ways, the opposite of Cases 1 and 2. Sustainability ambitions of the university, incorporated in the energy performance contract (EPC), triggered changes in the MU’s data management role. New jobs were designed around data management activities. Two asset database officers were made responsible for capturing asset data, scheduling maintenance jobs, and the ongoing updating of the digital asset register for the entire building portfolio. Another job was designed around asset life cycle coordination and systematically capturing condition assessment data on a yearly basis. The development of this new role for the MU was part of a wider ambition within the corporate facilities management organization to raise the standards of professionalism and to create a culture of continuous improvement.

Comparing the MU of Cases 1 and 2 with the MU of Cases 3 and 4, it appears that a shared vision on the asset life cycle within the CFM organization can catalyze the maturing process in smart maintenance. It was accepted in Organization A that the worlds of the project managers and the maintenance managers were, to a certain degree, two different worlds, each with their own interests, culture, and practices.
In Organization B, by contrast, senior managers from the maintenance unit invested heavily on a personal level in building relations with the project managers from the capital works team. They developed a shared vision on the overall project (or asset) life cycle in which the roles of both teams were described. This so-called gateway process was the master process for many other collaborative processes and data definitions that transcended the boundaries of different units.

Discussion

The aim of this paper is to identify the dimensions of smart maintenance maturity for CFM. Eight maturity dimensions for smart maintenance were identified empirically. Using the literature, we discuss how these dimensions capture the data supply chain dependencies in maintenance networks.

The connection of physical assets to the internet of things and smart maintenance contracts based on distributed ledger technology may create configurations of linked data in complex and dynamic maintenance networks. Linked data in this context refers to the process of connecting (or linking) dispersed data sets in such a way that the meaning and quality of the data are consistent across the maintenance network. The use of linked data platforms as a means to connect dispersed data sets for asset management and maintenance is discussed by Curry et al. (2013) and Luiten et al. (2017). When the data sets are produced with technology from various providers, this raises questions about the ownership, distribution (in the sense of publication), and governance of the data that must be addressed by the maintenance organization. When the data sets from different databases are linked in a data hub, measures will have to be taken by the asset owner to ensure that the overall linked data structure is futureproof, in the sense that the integrity of the data structure is not at risk when a service provider leaves the network.
The data needs to remain available to the asset owner when current contractors, data brokers, and technology providers are substituted by others at some point in time. We now discuss how each of the eight smart maintenance maturity dimensions can be operationalized to incorporate the requirements of linked data and complex dynamic networks.

The governance structure dimension needs to address the tokenization of physical assets if smart contracts are used. A token is a digital representation of a physical asset on a blockchain (Weingärtner 2019). Tokenization of assets needs to address the allocation of responsibilities for data capture, storage, and use to ensure data quality, as discussed by Christidis and Devetsikiotis (2016) and Johannes et al. (2018). Smart maintenance partially relies on technologies such as vibration monitoring, thermography, tribology, ultrasonics, and visual inspection (Mobley 2002). A variety of instruments is used to capture data on critical parameters required to analyze and predict the degradation, reliability, and efficiency of machinery and equipment. The governance structure dimension of smart maintenance needs to address the responsibilities for predictive maintenance programs, the sourcing of technology solutions, and the data sets that will result from them (e.g., Kans and Galar 2017; Maritsch et al. 2016). The different forms of responsibility can be allocated and shared in the network through a RACI matrix, as discussed by Wende (2007).

The alignment of data definitions, processes, and systems needs to address the role of technology providers that supply and install monitoring technology and systems (sensors, RFID chips, embedded software) on existing assets. Alignment and handover of asset data between project management and facilities management is discussed by Thabet and Lucas (2017). The sourcing of monitoring and data capturing technologies may complicate this alignment even further.
Compared with conventional maintenance organizations, smart maintenance organizations need to align their data, processes, and systems with more stakeholders. When dispersed data from various sources and suppliers is incorporated into a linked data platform, as described by Luiten et al. (2017) and Curry et al. (2013), the alignment of data definitions will have to include a semantic model in the form of an integrated graph of relevant information, describing the meaning of and relationships between data elements (Luiten et al. 2017; Curry et al. 2013). The smart maintenance organization will have to develop the capability to identify technology providers for alignment and to introduce them into the existing networks of maintenance contractors and subcontractors.

Maintenance organizations distinguish data requirements for maintenance engineering from those for maintenance management (Smit 2014; Mobley et al. 2008), and should implement tracking and tracing strategies for both. While maintenance managers may have outsourced the execution of maintenance to contractors, they may be reluctant to take responsibility for maintenance engineering data that, in the short term, appears relevant only to contractors, but that could in the long term turn out to be of importance for knowledge creation and management. Outsourcing of maintenance does not eliminate the asset owner’s long-term responsibility for longitudinal data integration beyond the current maintenance contractors (Thabet and Lucas 2017; Yuan et al. 2017). In order to accumulate historical data in appropriate formats and ensure that data is futureproof and not locked into the systems of current suppliers, the quality of data produced and supplied by the internal and external stakeholders should be monitored, as discussed by Migliaccio et al.
(2014). The data-driven decision-making dimension of smart maintenance maturity refers to the level of professionalism in the processing and analysis of asset data to support decision-making. This dimension addresses the development and use of diagnostic and prognostic algorithms and data science methods that deepen understanding of the degradation characteristics and failure probabilities of assets, as discussed by Chai et al. (2014), Karim et al. (2016), and Mohseni et al. (2017). While in some situations machine learning algorithms can trigger dynamic scheduling of maintenance jobs without human interference, in other situations data has to be interpreted by experts and decisions have to be made by professionals before actions can be scheduled (Lee et al. 2014). When maintenance is contracted to one main contractor, the coordination and direction of subcontractors is the responsibility of the main contractor. This complicates engineering and managerial decision-making, because combining, analyzing, and interpreting data sets may require the collaborative expertise and judgment of specialists with different backgrounds from different providers or contractors.

Asset owners can use their data for sustainability reporting and the monitoring of energy performance. Various separately stored data sets (e.g., opening hours, temperature settings, gas consumption, electricity production, energy flows) may be used to analyze the efficiency of machines and equipment (Curry et al. 2013). But sustainability monitoring can also contribute to improved circularity in the use, reuse, and disposal of equipment and materials through sharing data with supply chain partners. Iung and Levrat (2014) discuss how diagnostic and prognostic methods can be used to redesign maintenance processes for service life extension of assets and improved energy efficiency.
Other researchers have discussed the role of asset data in enabling the reuse, recycling, and marketing of building components and materials (Ness et al. 2015).

Knowledge management is the ability to learn from the processing of linked data in complex networks. The nature of linked data (dispersed production and storage) means that tacit knowledge of the asset and network configurations is important for understanding the opportunities and limitations of the data. Collins and Hitt (2006) discuss how personal relationships contribute to the deployment of tacit knowledge in collaborations. Two capabilities stand out as being required. The first is the combination of explicit knowledge (derived from handbooks, procedures, digital models, and manuals) with tacit knowledge (derived from site familiarity and professional experience), as discussed by Aromaa et al. (2015). The second is the combination of data science expertise with maintenance engineering expertise, which may be embodied in specialists from different engineering backgrounds (e.g., mechanical, electrical, structural).

Investments in ICT become beneficial for asset owners when they change their operational business processes and adopt new skills (Love et al. 2014). Culture relates to the capability of the maintenance organization to create and maintain a desire for continuous improvement of processes and practices, not only within the maintenance organization but among all network partners, internally as well as externally. Davis et al. (1997) argued that the intrinsic motivation of individuals for growth and achievement, required for continuous innovation, is characteristic of the stewardship model of management. According to Hernandez (2012), stewardship involves psychological ownership and creates a culture in which individuals are willing to subjugate personal interests to the long-term interests of the organization.
This maturity dimension measures how maintenance organizations cultivate such a culture of stewardship.

Compared with constructed assets, ICT and related technologies have a short economic life. Data capturing devices and their embedded software may become obsolete and be substituted by newer-generation systems, creating legacy management issues. The maintenance network is therefore dynamic, with new contractors and providers entering and others leaving the network. This generates additional leadership challenges for maintenance managers, who must coordinate the collaboration between different contractors during transition periods. Changes in the maintenance network that require unfamiliar skills may introduce feelings of fear or anxiety among maintenance staff (Campbell and Reyes-Picknell 2016). Leadership can address this by demonstrating supportive behaviors that foster courage, confidence, and trust (Hernandez 2008; Schillemans and Bjurstrøm 2019).

Table 9 summarizes how the smart maintenance maturity dimensions capture the requirements of both linked data and complex networks, as discussed. In some papers, cyber-physical systems and data-driven approaches to maintenance are proposed to improve the reliability, availability, and efficiency of machinery and equipment (e.g., Lee et al. 2014, 2015; Karim et al. 2016; García and García 2019). However, the case data in this research show that for a large number of assets, permanent capture of real-time data was not economically justifiable. And yet those assets also needed to be serviced, and reliable basic data on these assets was required for planning, decision-making, and reporting purposes. For CFM, we therefore propose a smart maintenance conceptualization that not only focuses on the reliability and efficiency of equipment and machinery but incorporates leveraging asset data in a broader sense, in particular with respect to the integration of multiple construction and maintenance supply networks.
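To make concrete how the framework's two levels could be operationalized in an assessment instrument, the following Python sketch aggregates lower-level dimension scores into higher-level maturity levels for three of the eight dimensions (the five-point scale, the illustrative scores, and the weakest-link aggregation rule are our assumptions, not part of the validated framework):

```python
# Hypothetical assessment: each lower-level dimension scored 1 (initial) to 5 (optimized).
framework = {
    "Governance structure": ["project governance", "maintenance governance", "data governance"],
    "Alignment": ["processes and work instructions", "data definitions", "systems"],
    "Tracking and tracing": ["assets", "condition", "jobs"],
}
scores = {
    "project governance": 3, "maintenance governance": 4, "data governance": 2,
    "processes and work instructions": 2, "data definitions": 2, "systems": 3,
    "assets": 4, "condition": 3, "jobs": 1,
}

def higher_level_maturity(framework, scores):
    """Report each higher-level dimension as the minimum of its
    lower-level scores (a conservative, weakest-link aggregation rule)."""
    return {dim: min(scores[s] for s in subs) for dim, subs in framework.items()}

levels = higher_level_maturity(framework, scores)
print(levels)
```

A weakest-link rule is deliberately conservative: a dimension is reported as no more mature than its least mature sub-dimension, which is one way of honoring the finding that the same scale variable could be used at both levels.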
In the extreme case, a high level of smart maintenance maturity enabled the university to control its asset data and to let information flow from one maintenance network to the other. Demonstrating client leadership across networks, the maintenance manager took care of the implementation of a consistent data model across the different networks. Client leadership in construction supply chains drives performance improvement and innovation in construction supply networks (Briscoe et al. 2004). Clients make decisions about how to procure construction works that influence and affect collaboration, data exchange, and innovation in supply networks. Our case data suggest that client leadership is not limited to construction supply networks but also relates to maintenance supply networks and to the integration of both. The smart maintenance maturity dimensions proposed in this paper present a new theoretical perspective on such client leadership in digital construction, driving innovation not only in construction supply networks but also in maintenance networks.

Table 9. Smart maintenance requirements of linked data in maintenance networks
Maturity dimension | Linked data | Network complexity and dynamics
Governance structure | Identify responsibilities for: predictive maintenance programs; sourcing/contracting of technologies; managing dispersed data sets | Allocate responsibilities: centralization within CFM; shared within the network (RACI)
Alignment | Identify partners and scope for alignment: inspection technology providers; data science/services providers; DLT providers for transaction validation | Implement alignment practices: collaborative semantic model; exchanging and sharing data; synchronizing/connecting systems
Tracking and tracing | Identify data requirements: maintenance management; maintenance engineering | Monitor data quality from: internal data suppliers; external data suppliers
Data-driven decision-making | Develop understanding of: engineering versus managerial data-driven decision-making | Apply within network: joint multidisciplinary decision-making; dynamic job scheduling
Sustainability monitoring | Develop understanding of: energy consumption and equipment reliability and efficiency; reuse, relocation, and disposal of assets | Share data with: policy developers; supply chains for reverse logistics
Knowledge management | Combine and integrate: explicit and tacit (site) knowledge; data science and maintenance engineering | Apply managerial insights to: asset criticalities and failure behavior; risk management and allocation
Culture | Realize continuous improvement through: explaining strategic maintenance goals; demonstrating stewardship behavior | Apply practices aimed at: creating mutual understanding; treating data as valuable assets; sharing data and knowledge
Leadership | Create safe learning environment for: experimenting with new technologies; learning new skills | Demonstrate transitional leadership: guiding entrance and exit of contractors and providers
Note: CFM = corporate facilities management; RACI = responsible, accountable, consulted, informed; DLT = distributed ledger technology.

References

Ali, M., and W. M. N. B. W. Mohamad. 2009.
“Audit assessment of the facilities maintenance management in a public hospital in Malaysia.” J. Facil. Manage. 7 (2): 142–158. https://doi.org/10.1108/14725960910952523.
Aromaa, S., A. Väätänen, I. Aaltonen, and T. Heimonen. 2015. “A model for gathering and sharing knowledge in maintenance work.” In Proc., European Conf. on Cognitive Ergonomics 2015, edited by A. Dittmar and M. Sikorski, 1–8. New York: Association for Computing Machinery.
Bazeley, P. 2009. “Analysing qualitative data: More than ‘identifying themes’.” Malaysian J. Qual. Res. 2 (2): 6–22.
Bokrantz, J., A. Skoogh, C. Berlin, and J. Stahre. 2017. “Maintenance in digitalised manufacturing: Delphi-based scenarios for 2030.” Int. J. Prod. Econ. 191 (Sep): 154–169. https://doi.org/10.1016/j.ijpe.2017.06.010.
Bokrantz, J., A. Skoogh, C. Berlin, T. Wuest, and J. Stahre. 2020. “Smart maintenance: An empirically grounded conceptualization.” Int. J. Prod. Econ. 223 (May): 107534. https://doi.org/10.1016/j.ijpe.2019.107534.
Briscoe, G. H., A. R. J. Dainty, S. J. Millett, and R. H. Neale. 2004. “Client-led strategies for construction supply chain improvement.” Constr. Manage. Econ. 22 (2): 193–201. https://doi.org/10.1080/0144619042000201394.
British Standards Institution. 2008. Asset management part 1: Specification for the optimized management of physical assets. London: British Standards Institution.
Bryman, A. 2012. Social research methods. 4th ed. Oxford, UK: Oxford University Press.
Campbell, J. D., and J. V. Reyes-Picknell. 2016. Uptime: Strategies for excellence in maintenance management. 3rd ed. Boca Raton, FL: CRC Press.
Chai, C., J. de Brito, P. L. Gaspar, and A. Silva. 2014. “Predicting the service life of exterior wall painting: Techno-economic analysis of alternative maintenance strategies.” J. Constr. Eng. Manage. 140 (3): 04013057. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000812.
Chemweno, P. K., L. Pintelon, and A. Van Horenbeek. 2013.
“Asset maintenance maturity model as a structured guide to maintenance process maturity.” In Proc., 3rd MPMM (Maintenance Performance Measurement and Management) Conf., edited by S. Monto, M. Pirttila, and T. Karri, 58–70. Lappeenranta, Finland: Lappeenranta Univ. of Technology.
Chen, K. K. 2015. “Using extreme cases to understand organizations.” In Handbook of qualitative organizational research, edited by K. D. Elsbach and R. M. Kramer, 65–76. New York: Routledge.
Collins, J. D., and M. A. Hitt. 2006. “Leveraging tacit knowledge in alliances: The importance of using relational capabilities to build and leverage relational capital.” J. Eng. Technol. Manage. 23 (3): 147–167. https://doi.org/10.1016/j.jengtecman.2006.06.007.
Curry, E., J. O’Donnell, E. Corry, S. Hasan, M. Keane, and S. O’Riain. 2013. “Linking building data in the cloud: Integrating cross-domain building data using linked data.” Adv. Eng. Inf. 27 (2): 206–219. https://doi.org/10.1016/j.aei.2012.10.003.
De Bruin, T., M. Rosemann, R. Freeze, and U. Kulkarni. 2005. “Understanding the main phases of developing a maturity assessment model.” In Proc., Australasian Conf. on Information Systems (ACIS), 8–19. Atlanta: Association for Information Systems.
Energy Institute. 2007. Capability maturity model for maintenance management. London: Energy Institute.
Gibbert, M., W. Ruigrok, and B. Wicki. 2008. “What passes as a rigorous case study?” Strategic Manage. J. 29 (13): 1465–1474. https://doi.org/10.1002/smj.722.
Institute of Asset Management. 2016. Asset management maturity scale and guidance, version 1.1. Bristol, UK: Institute of Asset Management.
ISO. 2003. Information technology—Process assessment. Part 2: Performing an assessment. ISO/IEC 15504-2. Geneva: ISO.
ISO. 2004a. Information technology—Process assessment. Part 1: Concepts and vocabulary. ISO/IEC 15504-1. Geneva: ISO.
ISO. 2004b. Information technology—Process assessment. Part 3: Guidance on performing an assessment. ISO/IEC 15504-3. Geneva: ISO.
ISO.
2004c. Information technology—Process assessment. Part 4: Guidance on use for process improvement and process capability determination. ISO/IEC 15504-4. Geneva: ISO. ISO. 2006. Information technology—Process assessment. Part 5: An examplar process assessment model. ISO/IEC 15504-5. Geneva: ISO. Johannes, K., J. T. Voordijk, A. M. Adriaanse, and G. Aranda-Mena. 2018. “Investigating (inter)organisational data governance design in maintenance networks: Developing a research methodology and crafting data collection methods.” In Proc., CIRRE 2018, edited by B. Grum, A. T. Salaj, and J. Veuger, 245–258. Ljubljana, Slovenia: Institute of Real Estate Studies. Kans, M., and D. Galar. 2017. “The impact of maintenance 4.0 and big data analytics within strategic asset management.” In Proc., Maintenance Performance and Measurement and Management 2016 (MPMM 2016), 96–104. Luleå, Sweden: Luleå Univ. of Technology. King, J. L., and K. L. Kraemer. 1984. “Evolution and organizational information systems: An assessment of Nolan’s stage model.” Commun. ACM 27 (5): 466–475. https://doi.org/10.1145/358189.358074. Kohlegger, M., R. Maier, and S. Thalmann. 2009. “Understanding maturity models. Results of a structured content analysis.” In Proc., I-KNOW ’09 and I-SEMANTICS ’09, 2–4. New York: Association for Computing Machinery. Lee, J., H. D. Ardakani, S. Yang, and B. Bagheri. 2015. “Industrial big data analytics and cyber-physical systems for future maintenance & service innovation.” Procedia CIRP 38 (6): 3–7. https://doi.org/10.1016/j.procir.2015.08.026. Lee, J., C. Jin, Z. Liu, and H. Davari Ardakani. 2017. “Introduction to data-driven methodologies for prognostics and health management.” In Probabilistic prognostics and health management of energy systems, edited by S. Ekwaro-Osire, A. C. Gonçalves, and F. M. Alemayehu, 9–32. Berlin: Springer. Lee, J., H. Kao, and S. Yang. 2014. 
“Service innovation and smart analytics for industry 4.0 and big data environment.” In Proc., 6th CIRP Conf. on Industrial Product-Service Systems, 3–8. Paris: International Institution for Production Engineering Research. Li, J., D. Greenwood, and M. Kassem. 2018. “Blockchain in the built environment: Analysing current applications and developing an emergent framework.” In Proc., of the Creative Construction Conf. 2018, edited by M. J. Skibniewski and M. Hajdu. Budapest, Hungary: Diamond Congress Ltd. Love, P. E. D., J. Matthews, I. Simpson, A. Hill, and O. A. Olatunji. 2014. “A benefits realization management building information modeling framework for asset owners.” Autom. Constr. 37 (9): 1–10. https://doi.org/10.1016/j.autcon.2013.09.007. Luiten, G. T., H. M. Bohms, A. O’Keeffe, S. Nederveen, J. Bakker, and L. Wikstrom. 2017. “A hybrid linked data approach to support asset management.” In Life-cycle of engineering systems: Emphasis on sustainable civil infrastructure, 648–654. London: Taylor & Francis. Macchi, M., I. Roda, and L. Fumagalli. 2017. “On the advancement of maintenance management towards smart maintenance in manufacturing.” In Proc., IIFIP Int. Conf. APMS 2017. Heidelberg, Germany: Springer. Maritsch, M., C. Lesjak, and A. Aldrian. 2016. “Enabling smart maintenance services: Broker-based equipment status data acquisition and backend workflows.” In Proc., 2016 IEEE 14th Int. Conf. on Industrial Informatics, 699–705. New York: IEEE. Mehairjan, R. P. Y., M. van Hattem, D. Djairam, and J. J. Smit. 2016. “Development and implementation of a maturity model for professionalising maintenance management.” In Proc., 10th World Congress on Engineering Asset Management (WCEAM 2015), edited by K. T. Koskinen, H. Kortelainen, J. Aaltonen, T. Uusitalo, K. Komonen, J. Mathew, and J. Laitinen, 415–427. Berlin: Springer. Migliaccio, G. C., S. M. Bogus, and A. A. Cordova-Alvidrez. 2014. 
“Continuous quality improvement techniques for data collection in asset management systems.” J. Constr. Eng. Manage. 140 (4): B4013008. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000427. Mobley, R. K. 2002. An introduction to predictive maintenance. 2nd ed. Boston: Butterworth Heinemann. Mobley, R. K., L. R. Higgins, and R. Smith. 2008. Maintenance engineering handbook, 7th ed. New York: McGraw-Hill. Moretti, N., and F. Re Cecconi. 2018. “Blockchain application to maintenance smart contracts.” In Proc., Research in Building Engineering, Exco 2018, 46–56. Valencia, Spain: Universitat Politècnica de València. Murphy, G., and A. Chang. 2009. “A capability maturity model for data acquisition and utilization.” In Proc., ICOMS Asset Management Conf. Proc.: Sustain Your Business through Good Asset Management, 1–7. Melbourne, Australia: International Conference of Maintenance Societies. Ness, D., J. Swift, D. C. Ranasinghe, K. Xing, and V. Soebarto. 2015. “Smart steel: New paradigms for the reuse of steel enabled by digital tracking and modelling.” J. Cleaner Prod. 98 (8): 292–303. https://doi.org/10.1016/j.jclepro.2014.08.055. Oliveira, M. A., and I. Lopes. 2019. “Evaluation and improvement of maintenance management performance using a maturity model.” Int. J. Prod. Perform. Manage. 2019 (Mar): 5. https://doi.org/10.1108/IJPPM-07-2018-0247. Papic, D., and T. Cerovsek. 2019. “Digital built environment maturity model: Digital twins advancing smart infrastructure asset management.” In Proc., EC3 European Conf. on Computing in Construction, 387–394. Sint-Niklaas, Belgium: European Council on Computing in Construction. Paulk, M. C., B. Curtis, M. B. Chrissis, and C. V. Weber. 1993. Capability maturity model, version 1.1. Pittsburgh: Carnegie Mellon Univ. Pöppelbuß, J., and M. Röglinger. 2011. What makes a useful maturity model? A framework of general design principles for maturity models and its demonstration in business process management. 
Atlanta: Association for Information Systems. Saldaña, J. 2016. The coding manual for qualitative researchers. 3rd ed. Thousand Oaks, CA: SAGE. Schillemans, T., and K. H. Bjurstrøm. 2019. “Trust and verification: Balancing agency and stewardship theory in the governance of agencies.” Int. Public Manage. J. 23 (5): 1–35. https://doi.org/10.1080/10967494.2018.1553807. Schmiedbauer, O., H. T. Maier, and H. Biedermann. 2020. “Evolution of a lean smart maintenance maturity model towards the new age of industry 4.0.” In Proc., 1st Conf. on Production Systems and Logistics (CPSL 2020), 78–91. Hanover, Germany: Leibniz Universität Hannover. Schuh, G., B. Lorenz, C. Winter, and G. Gudergan. 2010. “The house of maintenance—Identifying the potential for improvement in internal maintenance organizations by means of a capability maturity model.” In Proc., 4th World Congress on Engineering Asset Management (WCEAM 2009), edited by D. Kiritsis, C. Emmanouilidis, A. Koronios, and J. Mathew, 15–24. London: Springer. Smit, K. 2014. Maintenance engineering and management. Delft, Netherland: Delft Academic Press. VAGO (Victorian Auditor General’s Office). 2018. Results of 2017 audits: Universities. Melbourne, Australia: VAGO. Volker, L., A. Ligtvoet, M. van den Boomen, L. P. Wessels, J. van der Velde, T. E. van der Lei, and P. M. Herder. 2013. “Asset management maturity in public infrastructure: The case of Rijkswaterstaat.” Int. J. Strategic Eng. Asset Manage. 1 (4): 439–453. https://doi.org/10.1504/IJSEAM.2013.060469. Wende, K. 2007. “A model for data governance—Organising accountabilities for data quality management.” In Proc., 18th Australasian Conf. on Information Systems. Atlanta: Association for Information Systems. Yan, J., Y. Meng, L. Lu, and L. Li. 2017. “Industrial big data in an industry 4.0 environment: Challenges, schemes, and applications for predictive maintenance.” IEEE Access 5 (4): 23484–23491. https://doi.org/10.1109/ACCESS.2017.2765544. Yin, R. K. 2014. 
Case study research: Design and methods. Thousand Oaks, CA: SAGE. Yuan, C., T. McClure, H. Cai, and P. S. Dunston. 2017. “Life-cycle approach to collecting, managing, and sharing transportation infrastructure asset data.” J. Constr. Eng. Manage. 143 (6): 04017001. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001288.