Introduction

An earned value management system (EVMS) is “an organization’s management system for project and program management that integrates a defined set of associated work scopes, schedules and budgets for effective planning, performance, and management control; it integrates these functions with other business systems such as accounting and human resources among others,” as defined by Aramali et al. (2021) based on the existing literature (e.g., McGregor 2019a; DOE 2018b; NASA 2018a; NDIA 2018b; Humphreys 2018; Anderson 2015; Stratton 2006), and informed by expert practitioners. Although sometimes used interchangeably with earned value management (EVM), EVMS is in fact a distinct term (Aramali et al. 2021). EVM is the use of performance management information, produced from the EVMS, to plan, direct, control, and forecast the execution and accomplishment of contract/project cost, schedule, and technical performance objectives versus the plan (McGregor 2019a; DOE 2018b; ISO 2018; NASA 2018a; NDIA 2018b; Humphreys 2018; PMI 2017; Chen and Zhang 2012; Garrett and Rendon 2006). Given this difference in definitions, the authors treat EVM and EVMS as two distinct terms in this study and address each appropriately. Furthermore, an EVMS provides several benefits, such as the capability to measure the progress of a project, visibility of the performance status, benchmarking against previous projects and programs, forecasting of future performance, and early warnings of potential problems (Devanshu et al. 2018; Humphreys 2018; Fleming and Koppelman 2010; Dinsmore and Cabanis-Brewin 2014; Gupta 2014; GAO 2009).

As a common practice in the industry, government agencies are required to have an EVMS compliant with particular standards and guidelines, such as ANSI/EIA-748, on various acquisitions (DOE 2018a, 2020; Frahm 2012).
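The benefits listed above (progress measurement, performance visibility, forecasting, early warning) rest on a small set of metrics that are standard across the EVM literature. The sketch below shows these conventional formulas (CV, SV, CPI, SPI); the function name and the sample figures are illustrative, not drawn from any particular source.

```python
# Illustrative sketch of the core EVM performance metrics that an EVMS
# produces. The formulas are the conventional ones from the EVM
# literature; the sample project figures are hypothetical.

def evm_metrics(pv: float, ev: float, ac: float) -> dict:
    """Compute standard earned value metrics from planned value (PV),
    earned value (EV), and actual cost (AC)."""
    return {
        "cost_variance": ev - ac,      # CV > 0 means under budget
        "schedule_variance": ev - pv,  # SV > 0 means ahead of schedule
        "cpi": ev / ac,                # cost performance index
        "spi": ev / pv,                # schedule performance index
    }

# Example: a project planned to complete $500k of work to date,
# earned $450k, and has spent $480k.
m = evm_metrics(pv=500_000, ev=450_000, ac=480_000)
print(m["cpi"])  # 0.9375 — each dollar spent earned about $0.94 of work
```

A CPI or SPI below 1.0 is the kind of early warning signal the benefits above refer to: here, the project is both over cost and behind schedule.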
Historically, government organizations and their contractors worked on maturing their projects’ cost and management systems based on guidelines to investigate EVMS effectiveness and compliance (Liggett et al. 2017). An EVMS is certified as compliant when the contractor’s EVMS conforms with the attributes and characteristics provided by the recognized guidelines and when the system produces “reliable, timely, and actionable” performance data (NDIA 2018b; Kester et al. 2015). If the guidelines’ criteria are satisfied, then EVMS reliability is achieved (Christensen 1998). Based on 10 in-depth case studies, Orgut et al. (2020) found that developing guidelines to follow enhances the reliability of project control systems. On the other hand, even though compliance oversight teams are mostly concerned with assessing reliability (Jaeger 2014; Christensen 1998), a number of industry practitioners have expressed concerns about whether certifying an EVMS as compliant is enough to ensure that it is reliable (Laqua 2018; McNamee et al. 2017; Kester et al. 2015). Nonetheless, a major principle of DOE (2020) is that an effective EVMS should be compliant, and it asserts that following the guidelines improves EVMS reliability. To improve and standardize the implementation and execution of EVMS, the International Organization for Standardization (ISO), National Defense Industrial Association (NDIA) Integrated Program Management Division (IPMD), and Project Management Institute/American National Standards Institute (PMI/ANSI) provide standard guidelines that support EVM application for successful project and program management (ISO 2018; NDIA 2018a). These entities are examples, among others, that offer guiding principles to the EVMS field of knowledge and practice; yet even such guidelines are still subject to varying interpretations.
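The certification idea described above — conformance with a set of guideline criteria — can be sketched minimally as follows. The guideline labels and the all-or-nothing rule are illustrative simplifications assumed for this sketch; an actual ANSI/EIA-748 compliance review is far more involved.

```python
# Hypothetical sketch of a guideline-compliance check: the system is
# flagged compliant only when every assessed guideline criterion is
# satisfied. Guideline labels and the pass/fail rule are illustrative,
# not the actual ANSI/EIA-748 review procedure.

def is_compliant(assessment: dict) -> bool:
    """assessment maps a guideline label to True/False (criterion met)."""
    return all(assessment.values())

def open_findings(assessment: dict) -> list:
    """Return the guidelines that still need corrective action."""
    return [gid for gid, ok in assessment.items() if not ok]

review = {
    "G01_define_WBS": True,
    "G06_schedule_work": True,
    "G28_incorporate_changes": False,  # change control not yet demonstrated
}
print(is_compliant(review))   # False
print(open_findings(review))  # ['G28_incorporate_changes']
```

The practitioner concern noted above is precisely that passing such a checklist does not by itself guarantee the system's data are reliable.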
On the other hand, effective project management and EVM application necessitate mature processes (Efe and Demirörs 2013), and the maturity level of EVM implementation is verified by compliance, while higher maturity levels are achieved through improved compliance (Zhan et al. 2019; Laqua 2018; Kratzert and Houser 2011; Stratton 2006).

However, even when the system provides reliable data by following guidelines and is mature, if its use in decision-making has shortcomings, then the project’s cost and schedule status become problematic, which increases the risks of cost and schedule overruns (Frank et al. 2017). Despite the growing interest of industry practitioners in compliance that leads to effective EVMS implementation to achieve project success (NDIA 2018b), understanding other EVMS success determinants, such as the knowledge and skills of team members and management decision-making, has gained attention in the academic literature (e.g., Kwak and Anbari 2012). According to Cho et al. (2020), a reliable EVMS is mature, compliant with the applicable established guidelines, supported by a strong system environment, and provides trustworthy information to inform decision-making. Kwak and Anbari (2012) provide prominent examples of projects that failed for reasons attributed not to compliance, but to other factors such as planning and ineffective management. Furthermore, they identified major projects using EVMS that were successful because of factors related to the “environment” of EVMS (e.g., culture, technical capability, etc.). In fact, the success of EVMS implementation and execution depends heavily on such project and program factors and is highly influenced by the project and program’s stakeholders, team members, project and program manager, and so on (Bryde et al. 2018). Thus, EVMS reliability is contingent on areas beyond just the maturity of processes (which is improved through compliance with government and industry guidelines).
It is also contingent on the environment in which the EVMS is planned, developed, and applied, as highlighted in the academic literature.

Accordingly, this paper aims to address the topic of the implementation and execution of EVMS from a novel, consistent perspective by building on a comprehensive literature review method focused on both the EVMS maturity and environment concepts introduced recently in the literature (Aramali et al. 2021; Cho et al. 2020), while considering both academic and industry references to understand the status of academic research and the needs of industry practitioners. This type of review could bring the efforts of academics and practitioners one step closer toward building a reliable EVMS by highlighting trends and any consensus or disconnects that exist on the topic of EVM/EVMS between academia and industry. The novel approach focuses on the maturity and environment concepts, which are different but complementary dimensions of a reliable EVMS. EVMS maturity is the degree to which an implemented system, associated processes, and deliverables serve as the basis for an effective and compliant EVMS (Aramali et al. 2021; Cho et al. 2020). An EVMS environment is defined as the conditions (i.e., people, culture, practices, and resources) that enable or limit the ability to manage the project and program using the EVMS, serving as a basis for timely and effective decision-making (Aramali et al. 2021; Cho et al. 2020). This approach is derived from the principle of designing efficient systems called “sociotechnical systems design,” which calls for considering both technical (i.e., EVMS maturity) and human (i.e., EVMS environment) inputs when improving the efficiency of a system (Bider and Klyukina 2018; Walker 2015; Baxter and Sommerville 2011). The authors’ previous investigation of the EVMS state of practice was based on a large survey of 294 industry expert respondents.
It revealed sociotechnical factors as the underlying causes of poor functionality of a system (Aramali et al. 2021). In part, this paper attempts to support this finding through studying the literature. As such, investigating the EVMS literature from both its maturity and environment perspectives is needed to fully understand the foundation of a reliable EVMS.

Existing EVM/EVMS literature has been classified by academic researchers in various ways. One example is the work of Chen and Zhang (2012), who divided the studies into empirical and nonempirical classifications. In another publication [a book entitled Engineering Management (Hernández et al. 2013)], EVM research was classified into six groups: EVM and fuzzy determination of EV, EVM forecast accuracy, EVM and earned schedule, EVM to integrate risk management, EVM to integrate quality, and EVM to integrate technical performance (Hernández et al. 2013). In their prior work, the authors of this paper classified the EVMS literature into three high-level categories: “(1) improving EVMS estimation predictability, (2) adapting EVMS for the agency’s management process, and (3) improving EVMS reliability” (Cho et al. 2020). Although EVMS reliability has been introduced to the body of knowledge over the last two decades (Cho et al. 2020; Martens and Vanhoucke 2017; Jaeger 2014), the EVM/EVMS body of knowledge has not been comprehensively summarized and analyzed since 2012 (Chen and Zhang 2012). Cho et al. (2020) identified a clear need to examine available literature addressing the improvement of EVMS reliability, to complement the progress and findings of relevant research efforts, while also taking into account the industry’s state of practice. A recent study by Aramali et al.
(2021) administered a survey to 294 project management industry experts to examine the industry’s state of practice, exploring the new sociotechnical dimensions of a reliable EVMS, namely environment and maturity. Finally, no previous study has examined the disconnect between academia and industry on the topic of EVM/EVMS.

Because EVM/EVMS is a topic that has been subject to a vast amount of research by both practitioners and the academic community, the authors examine the literature emerging from both. The review provides clarity on what the industry’s practical interests are, while also identifying the research areas that academic researchers have been engaged in. Identifying and highlighting the disconnect between these two bodies of work can show gaps in efforts between industry and academia, provide information to both communities related to research needs in light of the gaps that exist, and may allow industry to see opportunities that are forthcoming from academic research. EVM research can become even more valuable when it is deeply rooted in the industry’s experience and practical needs (Vanhoucke 2017).

The advancement in the state of practice of EVM/EVMS, coupled with the lack of an up-to-date comprehensive literature review covering new EVMS dimensions, inspired the authors to undertake this review of the EVM/EVMS literature. This paper also fills an existing gap by bridging the disconnect between academic efforts and industry needs in the enhancement of EVMS implementation.
Accordingly, the objectives of this paper are to: (1) critically review and analyze relevant and recent academic and industry works on EVM/EVMS to reflect up-to-date practices and issues, and (2) perform trend and comparison analyses of EVM/EVMS between academic and industry literature to identify potential gaps, aiming to inform future research directions for improvement.

This paper begins by describing the methodology that details the steps undertaken for this systematic literature review. Then, classification of the EVM/EVMS literature by themes is undertaken and described. Data characteristics and trend analyses are presented and analyzed, followed by a critical assessment of the existing publications. Conclusions, along with guidance to researchers and practitioners, are then provided, including recommendations for future research directions. This paper contributes to the project management body of knowledge by addressing the issues and identifying future needs for a high-performing EVMS through content and trend analyses of the existing academic and industry literature, and by highlighting the applicability of the sociotechnical framework to EVMS.

Methodology

This section presents the methodology followed in this paper while conducting a systematic review of the literature, as recommended by Briner and Denyer (2012) as well as Denyer and Tranfield (2009). As suggested by Booth et al. (2016), the authors’ preliminary step was establishing a guiding exploratory question: “How can EVMS be reliable?” Next came forming an expert advisory group and determining the types of studies that might answer this question.
In this case, the authors’ work was overseen by a steering committee of 27 industry and government professionals, each having more than 20 years of extensive experience in project management and controls, and who represent 16 owner organizations and 11 contractor organizations from diverse industry fields including energy, military, nuclear, security, chemical waste, aerospace, infrastructure, industrial, engineering, and manufacturing.

The steering committee members pointed out to the authors that industry sources (e.g., professional magazines, industry guides, etc.) provide valuable information on current practices and needs related to EVMS application. In fact, much of the EVM/EVMS research is conducted by project management professionals, published in trade publications/magazines, and complements the academic points of view (Vanhoucke 2017). It is important to clarify that what are referred to as industry references and publications in this paper are publications drafted, published, and used by practitioners from industry and government in the project management field of study. Types of industry publications include technical articles published in newsletters, industry standards published by nongovernmental international organizations or leading professional associations, and guidelines published by government agencies. Industry publications by practitioners from industry and government reflect the state of practice and draw attention to the current issues as well as interests of EVM experts. On the other hand, academic references and publications are those studied and written by academics reflecting findings from project management related research efforts. Types of academic publications include books and dissertations, as well as peer-reviewed conference proceedings and journal articles. As such, the following subsections describe the systematic literature review conducted by the authors using publications from both industry and academic literature. Fig.
1 provides an overview of the methodology steps, including the number of publications examined at key junctures.

Step 1: Preparatory Step

The authors conducted a preliminary broad data collection by gathering and examining 600 publications, dating back to 1962, from several academic sources (such as Google Scholar and ASCE) and from industry references shared by the steering committee (industry standards and guidelines from US government agencies, among others). This ensured that EVMS application from the industry and academic viewpoints was considered. Examination of these publications led to several observations. Out of the 600 publications, only 282 discussed EVM/EVMS as a main research contribution; the rest did not mention EVM/EVMS in either their titles or abstracts, instead discussing generic topics related to project or program control. Also, older publications addressed more foundational tools such as the program evaluation review technique (PERT) and cost/schedule control system criteria (C/SCSC), instead of EVM. The 27 industry guidelines, standards, and tools in this data set do not solely focus on EVM/EVMS. This initial literature collection informed the authors on the history and evolution of earned value management practice, guided refinements to the scoping process, sharpened the focus of the literature review on EVM/EVMS, and informed the development of inclusion and exclusion criteria. Thus, the authors tailored the typical systematic review to match the purpose of this study.

Step 2: Defining the Scope

The initial literature review helped define the scope and narrow the objectives of this study to focus on recent literature (from 2011 to the end of March 2021) relevant to EVMS implementation and application. Two prior papers had reviewed available literature on EVM-specific topics (Chen and Zhang 2012; Christensen 1994).
One of the most recent publications is a study by Chen and Zhang (2012), who analytically reviewed EVM studies and applications. The key topics reviewed are close in nature to the interests of the authors, and they include effective implementation, accuracy, performance metrics, and integration with other project management techniques. However, that study was published in 2012 and reviewed studies published through mid-2011. Because it reflected on publications only up to that point and on limited topics, the authors chose to review publications from 2011 onward to reflect the most recent findings and provide a comprehensive review of EVM/EVMS. This scope guided the authors’ choice of the keywords used in search engines to be “earned value management” or “EVM,” because “earned value management system” and “EVMS” are subsets of the chosen keywords, respectively, and lead to inclusive search results.

Step 3: Selection Process for the Publications

To select the type and sources of publications to be reviewed, the authors started with Google Scholar as the preliminary search engine because it is a widely used and freely accessible search engine that indexes publications from all over the world. Although there are some books and dissertations on the topic, most of the publications are journal and conference articles. Moreover, the sources of these articles are diverse, although most of the recent and relevant publications come from three dominant, well-established academic databases. Accordingly, peer-reviewed journal articles and conference proceedings from the ASCE library database, the ScienceDirect database, and the Institute of Electrical and Electronics Engineers (IEEE) Xplore library database were collected.

Similarly, the authors started with the industry publications and sources suggested or shared by the research steering committee.
Those bringing a relevant and valuable practical perspective to the review were considered; they include conference papers, recommended practices, journal articles, and newsletters from the Association for the Advancement of Cost Engineering (AACE) International library and Measurable News from the College of Performance Measurement (CPM)’s database. As previously mentioned, not all the shared guidelines from US government agencies solely focus on EVMS; thus, they are not analyzed but are referred to throughout the paper to shed light on the progress made by practitioners in EVMS implementation and execution.

Step 4: Screening of Publications

Considering the large number of publications available on the various areas of the EVM/EVMS topic, the authors defined a set of inclusion and exclusion criteria to satisfy the research scope. Any included publication must be available online in full-text English. The term “earned value management” or “EVM” must appear in the title or abstract of the publication for it to be examined. Publications not focusing on EVM/EVMS are excluded. Academic publications focusing on EVM and sustainability, EVM and labor productivity, and teaching EVM are excluded because they are not related to the research objective. State-of-the-art or literature review papers are also excluded from the analyses, but are used as a foundation for this study, are used to cross-check results, and are cited multiple times throughout the paper. Recommended practices, issued by AACE, are documents intended to provide descriptions, explanations, and guidelines on particular topics as a foundation for educational purposes; therefore, they are cited in the paper as needed but are not included in the analysis, considering also that the authors do not aim to critically review the rigorous work of institutions that publish guidelines or standards in this scope of study.
However, publications related to compliance with standards and guidelines that provide research contributions or discuss areas of concern are analyzed. In summary, the preliminary data collection step yielded 600 publications published between 1962 and 2021. Focusing on this paper’s objectives yielded 301 publications after conducting steps 1 and 2 shown in Fig. 1. The number was narrowed down further to 160 publications after applying the inclusion and exclusion criteria in step 3. These final 160 publications were analyzed in great detail.

Step 5: Classification of the Literature

The publications were categorized into themes based on a descriptive review (Rowe 2014). This classification started by first compiling the collected data in a spreadsheet, organized by source, type of publication, title, abstract, year of publication, and first author name. Second, the authors determined the preliminary themes of interest while focusing on bringing a new perspective and considering that the research enquiry is around EVMS reliability, where EVMS reliability is dependent on guideline compliance (DOE 2020; Christensen 1998), EVMS maturity (Cho et al. 2020; Zhan et al. 2019; Stratton 2006), and EVMS environment (Cho et al. 2020; Kwak and Anbari 2012), as previously discussed. Thus, the initial themes for categorizing the data (literature), used as the initial condition for an iterative process, are: “EVMS compliance,” “EVMS maturity,” and “EVMS environment.” Third, the authors followed the concept of thematic analysis, suggested by Alhojailan (2012), to identify, analyze, and report themes or patterns/trends in a given data set (Cruzes and Dyba 2011). Fig.
2 presents the iterative process followed to categorize all the literature sources, where the term “keyword” represents a primary focus term that the study revolves around.

Based on a reading of the publications, the authors identified the keywords that represent the main topic of each study in relation to this paper’s research question, as shown in Fig. 2. Grouping the references based on keywords led to the discovery of the emergent themes, where no single reference was duplicated in more than one theme. This process stopped once the themes were saturated based on the identified keywords. Fig. 3 presents two examples of this thematic analysis step.

Step 6: Analysis Approach

Before going into the discussion and critical review, the authors applied qualitative and quantitative analyses to the data. First, the collected data are presented by publication source, followed by a description of each theme based on its identified keywords and set of references. Then the demographics of the collected data are presented, including source count and number of citations, and statistical analysis is performed to test and examine any disconnects between industry and academia. With decades of EVM/EVMS history (since 1962), one can anticipate that the research focus within the EVM/EVMS field may change over time. Thus, to gain perspective on how researchers’ focus evolved when trying to improve the system’s reliability, the research trend is analyzed over the last decade (2011–2021).

Step 7: Discussion of the Results

Based on the analyses performed, the authors identified issues and gaps within the available literature, categorized by theme, and suggest future improvements for EVMS implementation and execution to support industry needs and inform EVMS applications.
The result is a new theory around ensuring EVMS reliability that can be applied when implementing EVMS in industry and government practice.

EVM/EVMS Literature Review by Theme

In the following subsections, an overarching critical review of publications is performed for each theme to highlight the current body of knowledge and identify gaps for future research directions. The review also elaborates on the identified disconnect between academia and industry. Moreover, the review ties each theme with either the “technical” aspect, the “social” aspect, or both aspects of a “sociotechnical” system to demonstrate its applicability to EVMS.

The sociotechnical systems approach has been used in other business processes found in the literature, such as sales process improvement (Bider and Klyukina 2018) and software engineering (Baxter and Sommerville 2011), among others. The term “socio” or social is used to describe factors related to the people who work together, whereas the term “technical” denotes the factors of the design and requirements of the system itself (Baxter and Sommerville 2011).

Therefore, this literature review, for the first time, analyzes existing studies to show that EVMS is not just a technical process for project and program management; social aspects of the system are also critical. The rationale for adopting the sociotechnical approach in system design, in this case EVMS, is that failure to do so increases the risk of poor performance and reduced functionality of the system (Bider and Klyukina 2018; Baxter and Sommerville 2011).

Theme 1: History of EVMS

The articles published on EVMS history target how and why EVMS was created as well as the evolution of EVMS from previous management techniques (Humphreys 2016; Morin 2016; Abba 2017; Driessnack 2017). The concept of program control has existed since the early 1960s, based on references published in 1962 (Driessnack 2017; Fleming and Ervin 1962).
Earlier concepts of EV components were called PERT and cost/schedule planning and control specification (C/SPCS) (Abba 2017; Driessnack 2017; Humphreys 2016; Morin 2016; Abba 2000; Christensen 1994; Fleming and Ervin 1962). In 1967, the use of EVM originated in a policy requirement by the US Department of Defense (DoD), called C/SCSC (or CS2), for certain defense contracts (Christensen 1994). As programs became more complex and carried higher levels of risk, the existing management techniques and tools became inadequate, leading to the birth of EVM through pioneers of the process such as Hans Driessnack and Ernest Fitzgerald (Abba 2017, 2000). These individuals and others worked on providing program management best practices, focusing on a list of 35 criteria. The 35 criteria morphed over time and were subsequently documented as 32 EVMS guidelines (Abba 2017), and in 1998, these 32 guidelines were officially published as an American National Standards Institute/Electronic Industries Alliance (ANSI/EIA-748) document (Bembers et al. 2017a; Humphreys 2016). Overall, EVM and correspondingly EVMS have been actively applied for almost 60 years. Such wide use makes them a subject of interest for further research and improvement. In a nutshell, EVMS was created as a new, improved management technique because earlier tools technically failed to manage complex and risky programs while integrating cost and schedule. For this reason, this first theme around the history of EVMS sheds light on EVMS’ foundational technical aspects.

Theme 2: Compliance

Since the origin of EVM, many guidelines, standards, and recommended practices have been established to successfully implement it across the industry (e.g., DOE 2020; McGregor 2019a, b; PMI 2019a; DOE 2018a, 2019; ISO 2018; NASA 2018a, b; NDIA 2018a, b; AACE 2014a; DCMA 2012).
Examples of recognized entities that have published guidelines include the US DoD, the US Department of Energy (DOE), the NDIA, the Defense Contract Management Agency (DCMA), the Project Management Institute (PMI), the ISO, and the National Aeronautics and Space Administration (NASA). Among federal agencies, the US DoD pioneered the concept and has led by example in enforcing the guidelines as policies on major acquisitions. Taken together, these guiding documents are important because complying with them is essential for effective EVM application (DOE 2018b; McNamee et al. 2017; Kester et al. 2015).

Moreover, compliance reviews, oversight, and EVMS certification are top strategies to mitigate EVMS deficiencies (McNamee et al. 2017; Kester et al. 2015; Finefield 2013a, b). However, the analysis showed that only Liggett et al. (2017), Hunter et al. (2014), and McGrath (2012) have focused on this subject in academic publications. These articles have rigorously documented the industry processes of certification and compliance reviews in real case studies by highlighting challenges and recommendations that benefit the broader community for an improved EVMS application.

Although guidelines and standards aim to standardize the application of EVMS across the industry, they have been subject to different interpretations and perspectives (e.g., Melamed and Plumery 2015; Finefield 2013a, b). As such, there is no common and systematic interpretation by the different organizations or EVM practitioners on implementing EVMS (Cho et al. 2020; NDIA 2016). This highlights the need for consistent criteria, across organizations and industries, to enhance EVMS implementation, attain compliance, and thus achieve a reliable EVMS.
Furthermore, Driessnack (2019) examined the alignment between the EVMS guidelines drafted by multiple agencies, which are similar in nature yet different in content, and found that the existence of multiple guiding resources for the same system creates difficulties and inconsistencies. In addition, practitioners have raised concerns such as whether adherence to guidelines is costly or too demanding (e.g., Frahm 2012; Crowe and Basche 2011), and such areas have not been investigated by academics. This calls for further investigation into how to ensure the availability of resources to attain a compliant EVMS, where “resources” is one of the factors identified as shaping a good environment for a reliable EVMS (Aramali et al. 2021).

Other than adhering to guidelines, EVMS surveillance was identified by a large industry survey as a top strategy to mitigate deficiencies (Aramali et al. 2021), as well as a means of implementing a healthy and reliable EVMS (Bembers et al. 2017b). However, only one recent academic paper has focused on it (Liggett et al. 2019). This is important because such processes help identify corrective actions that promote better EVMS implementation and execution, as well as project and program success. Accordingly, more focus should be directed on how to act on surveillance results and corrective actions by providing a set of criteria for contractors and owners to follow along each of the EVMS subprocesses.

Working to ensure and improve both compliance and surveillance is expected to mature the EVMS of a project and program (Laqua 2018; Kratzert and Houser 2011; Stratton 2006) and also improve its reliability (Bembers et al. 2017). Maturity is a concept that describes the effectiveness of a certain activity and usually develops over time. It originated in the late 1980s in the software industry in a model called the Capability Maturity Model (CMM) by the Software Engineering Institute (SEI) (Humphrey 1989).
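The level-based logic common to such maturity models can be illustrated with a short sketch. The 10 EVMS subprocesses are the ones cited in this review (Aramali et al. 2021); the 1-5 rating scale and the weakest-link aggregation rule are illustrative assumptions for this sketch, not any published assessment model.

```python
# Hypothetical sketch of a subprocess-level maturity assessment: each of
# the 10 EVMS subprocesses (Aramali et al. 2021) is rated 1-5, and here
# overall maturity is bounded by the weakest subprocess. The rating scale
# and aggregation rule are illustrative, not a published model.

SUBPROCESSES = [
    "organizing", "planning_and_scheduling", "budgeting_and_work_authorization",
    "accounting_considerations", "indirect_budget_and_cost_management",
    "analysis_and_management_reporting", "change_control",
    "material_management", "subcontract_management", "risk_management",
]

def overall_maturity(ratings: dict) -> int:
    """Overall maturity level, bounded by the weakest subprocess rating."""
    return min(ratings[s] for s in SUBPROCESSES)

ratings = {s: 4 for s in SUBPROCESSES}
ratings["risk_management"] = 2  # one weak subprocess drags the system down
print(overall_maturity(ratings))  # 2
```

The design point the sketch makes is the one argued in this review: a system is only as mature as its weakest subprocess, which is why subprocess-level assessment matters.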
Table 3 presents some of the studies that have investigated maturity in different applications focused on improving the planning and management of a project and program.

Table 3. Examples of maturity frameworks in different applications

- Capability maturity model (CMM): Maturity of processes in the software industry. [Humphrey (1989)]
- Maturity level of project management: Maturity is represented in a spider-web layout illustrating the level of project management competences, in individual, project team, and organization dimensions. [Gareis and Huemann (2000)]
- Project management scalable maturity model: Maturity is in five scalable levels that aim to improve the management of projects using other maturity references. [Crawford (2001)]
- Maturity of projects in organizations: Maturity is in three dimensions: (1) knowledge (ability to perform different tasks), (2) attitudes (willingness to perform them), and (3) actions (doing them in actual reality). [Andersen and Jessen (2003)]
- Maturity of EVMS: A scalable approach to assess the maturity level of EVMS in five levels: (1) little or no EVM, (2) low-cost and effective EVMS, (3) EVM compliant with ANSI/EIA-748 guidelines, (4) fully committed to EVM, and (5) improving EVMS as an ongoing process. [Stratton (2006)]
- Maturity of project management: Investigated using the CMM, Organizational Project Management Maturity Model (OPM3), and other means. [Lipke (2011a)]
- OPM3: Maturity of project management is measured against a comprehensive and broad-based set of organizational project management best practices. [PMI (2013)]
- Maturity in performance measurement systems in research and development (R&D) organizations: Adaptation of CMM to develop a framework to assess maturity levels of performance measurement systems. [Baggett (2014)]
- Project management maturity: Maturity is a proper foundation of tools, techniques, processes, and culture when striving for excellence. Maturity levels move from a low level called “Common Language” to the highest level (level 5), called “Continuous Improvement.” [Kerzner (2017)]
- Maturity of front end engineering design (FEED): Project definition elements are assessed at five levels, along with the option of not being required (0), to measure the maturity of FEED elements. They are rated numerically from 1 to 5 (completed, mostly completed, somewhat addressed, initial thoughts have been applied, and work not started). [El Asmar et al. (2018); Yussef et al. (2019)]
- Maturity of the project definition rating index (PDRI): PDRI maturity elements are assessed at five levels, along with the option of not being applicable (0) to the project, to measure the maturity of the front end planning elements. The five levels range from 1 to 5 (complete definition, minor deficiencies, some deficiencies, major deficiencies, and incomplete or poor definition). [Gibson et al. (2019)]

Table 3 illustrates that maturity assessment models have been widely developed and used in different applications. Considering this paper’s scope, only one recent publication has studied EVMS maturity by looking at its 10 associated EVMS subprocesses (organizing, planning and scheduling, budgeting and work authorization, accounting considerations, indirect budget and cost management, analysis and management reporting, change control, material management, subcontract management, and risk management) (Aramali et al. 2021). Before that, only one mechanism to assess EVMS maturity was found in the literature, developed by Stratton (2000) in the year 2000.

Even though various authors from industry and academia have noted that the maturity of EVM helps overcome project and program obstacles and promotes successful implementation (Cho et al. 2020; Netto et al. 2020; Sutrisna et al. 2020; Zhan et al.
2019; Efe and Demirörs 2013; Kersbergen 2011; Lipke 2011a), there is a gap in the recent literature in evaluating the maturity level of EVMS subprocesses and their effect on project and program performance outcomes. Industry guidelines have visually contextualized the application of EVMS subprocesses across the project and program life cycle to promote efficient project management (NDIA 2020; McGregor 2019b; DOE 2018a). In academia, however, few authors, such as Willems and Vanhoucke (2015), have done the same. Thus, broader studies are needed that investigate and improve the EVMS subprocesses tied to the different project and program phases. Considering these observations and the findings by Stratton (2006) and Aramali et al. (2021), as well as Zhan et al. (2019), a mature EVMS is compliant with the guidelines, and mature EVMS subprocesses ensure a mature EVMS. The goal of a compliant and reliable EVMS can therefore be pursued by ensuring a mature EVMS, namely by linking the compliance guidelines to each subprocess and assessing the project and program's performance with respect to each subprocess against its required criteria.

As discussed, the literature found that compliance with guidelines leads to improved EVMS application. However, it is effective oversight, in place and used by the project team, that actually helps reach such a mature and compliant state of the system. This emphasizes how EVMS compliance is achieved through adopting a sociotechnical system perspective.

Theme 3: Forecasting/Prediction

One main objective of applying EVMS is monitoring the project and program's cost and schedule performance. Much research has been performed on offering reliable estimates during project execution, most of it focused on improving the accuracy and reliability of the EAC and CPI calculations (e.g., Kim and Pinto 2019; Batselier and Vanhoucke 2017; Kim 2016; Heron et al. 2015; Narbaev and De Marco 2014; Mortaji et al. 2015).
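These studies refine the same standard quantities. As a minimal, self-contained illustration (not drawn from any one cited paper), the basic cost index and two common estimate-at-completion formulas can be computed as follows:

```python
# Minimal sketch of standard EVM cost metrics (illustrative, not a
# reconstruction of any cited study's method).

def evm_cost_metrics(bac, ev, ac):
    """Return CPI, cost variance, and two common EAC estimates.

    bac = budget at completion, ev = earned value, ac = actual cost.
    """
    cpi = ev / ac                    # cost performance index
    cv = ev - ac                     # cost variance (negative = overrun)
    eac_cpi = bac / cpi              # assumes current efficiency persists
    eac_plan = ac + (bac - ev)       # assumes remaining work goes to plan
    return {"CPI": cpi, "CV": cv, "EAC_cpi": eac_cpi, "EAC_plan": eac_plan}

# Example: $1,000k budget, 40% of value earned at an actual cost of $500k.
metrics = evm_cost_metrics(bac=1000, ev=400, ac=500)
print(metrics)  # CPI = 0.8, CV = -100, EAC_cpi = 1250.0, EAC_plan = 1100
```

The two EAC variants encode different assumptions about future performance, which is precisely the choice that much of the cited forecasting research tries to improve.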
However, to address the criticism that EVM works well in cost management but not in schedule management (Vanhoucke et al. 2015), studies examining the use of EVMS schedule indicators to provide reliable schedule performance information during project execution have been published (e.g., Martens and Vanhoucke 2020; Carrico et al. 2019; Demachkieh and Abdul-Malak 2019; Alshaheen 2018; Batselier and Vanhoucke 2015b). Earned schedule, an emergent topic since its introduction in 2003 by industry professional Walt Lipke, was the focus of many of these publications (e.g., Abdel Razik 2020; Alshaheen 2018; Ballesteros-Pérez et al. 2019; Lipke 2019; Heron et al. 2015; Crumrine and Ritschel 2013).

Owing to the vigorous study of forecasting/prediction in academia, the issues of EVMS performance (i.e., lack of accuracy in forecasts and the need for improved visibility of future project performance) are addressed by various methods, such as statistical models. However, even statistical model predictions have biases and errors caused by multiple uncertainties (Bain and Polakovic 2005): a model may produce accurate results on some projects, while the same model may perform significantly differently on others because of unconsidered project and program variables. Because EVMS is applied to different types of projects and programs, this issue cannot be avoided unless new tools are developed that help identify gaps and achieve criteria for better EAC and progress measures. This sheds light on the importance of ensuring a mature EVMS that follows guidelines.

On the other hand, if project and program risks are not integrated with the forecasting process, the chances of delivering a successful project and program become limited; risk needs to be associated with forecasting results for effective performance control (Kim and Reinschmidt 2010).
The analysis identified six recent papers that explicitly investigated risks together with forecasting (Kim and Pinto 2019; Babar et al. 2017; Kim 2015, 2016; Kim and Kim 2014; Kim and Reinschmidt 2011). As the results showed, the largest portion of the EVM literature studies the forecasting aspects of EVMS through developing theoretical methods. Aiming to improve the accuracy of cost and schedule estimates represents a technical side of EVMS.

Theme 4: Application of Risk Management in EVM/EVMS

As indicated in the previous subsection, the probability of a given program's success is increased by continuously applying risk management, a project management process (Nouban et al. 2020; PMI 2019b; Alleman et al. 2018). Although there is a lack of clear guidelines for integrating the risk management process with EVMS (ISO 2018; NDIA 2018a), risk-based decision-making with the use of EVM information would provide a strong rationale for effective project decisions and management (Martinelli and Waddell 2014).

The analysis of the collected publications on the topic of risk management integration with EVMS revealed two research approaches to account for risks and uncertainties in EVM: building new methods such as the grey EVM (Mahmoudi et al. 2019), and adding new forward-looking performance indicators to the traditional EV metrics, such as the risk performance index (Babar et al. 2017). However, such new methods and measures have yet to be validated on large and diverse project data.

Furthermore, academics have recently started to study the integration of risk management with earned duration management (EDM), an extension of EVM, to overcome EVM's poor progress measurement (Votto et al. 2020).
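The general idea of associating risk with forecasting results can be made concrete with a generic Monte Carlo sketch; the example below is an expository assumption, not the method of any cited paper. It treats future cost efficiency as uncertain and reports an EAC distribution instead of a single point estimate:

```python
# Generic Monte Carlo illustration of risk-integrated forecasting
# (an expository assumption, not the method of any cited paper):
# treat future cost efficiency as uncertain and report percentile
# EACs rather than a point estimate.
import random

def eac_percentiles(bac, ev, ac, cpi_sd=0.1, n=10_000, seed=42):
    """Sample EACs assuming future CPI varies around the observed CPI."""
    rng = random.Random(seed)
    cpi_now = ev / ac
    eacs = sorted(
        ac + (bac - ev) / max(rng.gauss(cpi_now, cpi_sd), 0.1)
        for _ in range(n)
    )
    return {"P50": eacs[n // 2], "P80": eacs[int(0.8 * n)]}

# Same project as before: observed CPI = 0.8, so the median EAC sits
# near 1250, and P80 offers a risk-adjusted contingency view.
print(eac_percentiles(bac=1000, ev=400, ac=500))
```

Reporting a P80 alongside the P50 is one simple way a decision-maker can size contingency against forecast uncertainty rather than trusting a single deterministic EAC.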
However, some topics raised by industry needs have not yet gained sufficient attention in academia: the formulation and understanding of management reserve as an industry best practice (Infanti 2011); baseline and dynamic scheduling based on risk analysis (Vanhoucke 2012); and the use of cost and schedule contingency reserves in EVM (Hammad et al. 2018; Eldosouky et al. 2014). This analysis indicates the need to develop tools that help practitioners follow criteria to efficiently incorporate risk management with EVMS, especially since risk management is not yet incorporated into one of the major EVMS industry standards, NDIA (2018a) (Solomon and Young 2007).

The literature emphasized the importance of integrating the risk management process (e.g., risk analysis, and cost and schedule reserves to account for risks) into EVM and EVMS as a system design requirement. This integration enhances the technical aspect of EVMS.

Theme 5: Application of EVMS

The literature confirms that EVMS can be utilized in a wide variety of industries, including research, energy, oil and gas, construction, manufacturing, mining, software, residential, nuclear, production, harvesting, data center management, and EPC (Erez and Sloan 2017; Dodson et al. 2015; Rybina et al. 2015; Chirinos 2014; Efe and Demirörs 2013; Giammalvo 2012; Lorenz et al. 2012; Wibiksana 2012; Jung et al. 2011; Naderpour and Mofid 2011). However, research shows that challenges still exist in applying EVMS to projects and programs in certain industries, as indicated in what follows. According to Chirinos (2014), statistics show that projects in the oil and gas industry fail at a higher rate than those in other industries; therefore, further research on the application of EVM in the oil and gas industry is needed to avoid such failures (Steege 2019; Chirinos 2014). Likewise, in software projects, EVM in its basic form is not capable of meeting project needs (Efe and Demirörs 2013).
Kranz (2017) argues that for managing projects in the aerospace industry, construction industry, and defense acquisitions, EVMS needs the support of Building Information Modeling (BIM), because BIM provides accurate project schedule information and better performance management than EVMS alone. Based on industry practice, EVMS can also be applied in an agile environment (Agile Alliance 2017; Stratton 2016); however, no recent academic papers have focused on explaining and investigating the application of EVMS within agile project management methods. Despite these identified barriers, EVM remains the most recognized project management tool (Demachkieh and Abdul-Malak 2019), and the wide use of EVMS in various industries calls for solutions that improve its reliability and overcome the identified challenges.

Beyond implementing EVMS on projects by industry type, some authors have debated whether compliance should be tailored to projects by contractual conditions such as contract value thresholds (e.g., Frahm 2012; Gustavus and Hunter 2012). In fact, EVMS is a robust system, so it is useful to tailor it to different contracts/projects as necessary, depending on the project's contractual requirements, size, and scope (Picornell et al. 2017; Bhaumik 2016; Gustavus and Hunter 2012). Only five recent academic papers have fully focused on how to design a tailored EVMS based on contractual requirements or sufficient guidelines (Liggett et al. 2019; Bergerud 2017; Picornell et al. 2017; Frahm 2012; McGrath 2012). This calls for integrating lessons learned from industry experience and best practices to develop frameworks that help systematically tailor EVMS.

Finally, case studies where EVMS guidelines and standards have been implemented are scarce in the literature [e.g., a case study using ISO standards to complement EVM by Erez and Sloan (2017)].
Consequently, there is a need for studies that demonstrate the effectiveness of EVMS when published EVMS guidelines and standards are followed.

The literature validates that EVMS can be implemented on, or tailored to, diverse types of projects and industries. This is successfully achieved by integrating lessons learned from human experience and by the project team following best practices. This highlights the team's role from a social aspect of EVMS, while designing an EVMS specific to a certain application highlights a technical aspect of EVMS.

Theme 6: EVMS Environment

The EVMS environment was previously defined in this paper as the conditions that enable or limit the ability to manage a project and program using EVMS. An EVMS environment factor is one of the circumstances, facts, or elements that contributes to the result or outcome of an EVMS (Aramali et al. 2021; Cho et al. 2020). Table 4 lists various environment factors identified in the literature that would impact the reliability of EVMS.

Table 4. Examples of environment factors that affect EVMS

- Effective team alignment during preproject planning. [Griffith and Gibson (2001)]
- EVM users, EVM methodology, project environment, and EVM implementation process. [Kim et al. (2003)]
- Feedback from previous EVM application and lessons learned. [Ramahatra (2011), McGrath (2012), and Baker (2015)]
- Effective decision-making. [Lucas (2012), Lappenbusch (2017), and Shi (2019)]
- Leadership support and organizational buy-in. [Liggett et al. (2017), Farok and Garcia (2015), King (2018), and Zhan et al. (2019)]
- Practitioners' right skills and knowledge of EVM. [AACE (2014b)]
- Effective processes for measuring progress versus plan. [Alleman (2014), and Bowman and Sabouri (2014)]
- The ability of the people responsible for EVMS to plan, monitor, and control the project with the right balance between processes and results in a collaborative environment. [Hunter et al. (2014)]
- Project team technical skills, knowledge, and analytical skills. [Wolf (2014)]
- Accurate, reliable, and verified data. [Jaeger (2014), and Lofgren (2017)]
- Culture. [Hunter et al. (2014), and King (2018)]
- Communication between the project team and the client. [Farok and Garcia (2015), and McNamee and Immonen (2019)]
- Effective planning for project controls, conditioned by people, processes, and tools. [Gonzales (2016)]
- Human psychology of key stakeholders that affects decision-making: overoptimism, curiosity, pressure to deliver, distortion of facts, and others. [Bolinger and Phillips (2018)]
- Design-related, operation-related, and agency-related factors. [Bryde et al. (2018)]
- Alignment with organizational strategy, organizational integration, and continuous development. [PMI (2018)]

In addition, Rezouki and Mortadha (2020) determined a list of 37 factors categorized across cost, time, quality, risk, safety, social, and project dimensions. The most recent publication in this area is by Aramali et al. (2021), who developed a list of 18 environment factors that may have an impact on EVMS effectiveness, categorized across people, culture, practices, and resources. As shown, various environment factors are examined in the literature, but there is no method to assess their degree of applicability within a project or program. There is also a need to study their effect on project performance outcomes and to examine each factor's correlation with EVMS to improve EVMS reliability and project success.

Considering the definition of the EVMS environment and the various people-related environment factors examined in the literature, the EVMS environment is at the core of defining the social aspects of EVMS.

Theme 7: Limitations and Extensions of EVM

The limitations of EVM have been widely discussed in the literature, and many authors have focused on overcoming them (Demachkieh and Abdul-Malak 2018).
Table 5 lists examples of studies that discuss the limitations, as well as "extensions" (solutions), for different objectives of EVM.

Table 5. Examples of studies on EVM limitations

- Progress measurement: poor progress measurement. Extensions: earned schedule, EDM. [Lipke (2011a)]
- Performance tracking: no consideration for dynamic changes caused by time extensions of projects. Extension: improved EVM framework. [Siu and Lu (2011)]
- Quality control in EVM: EVM does not cover the project's quality. Extension: earned quality value management (EQVM). [Ma and Yang (2012), and Ong et al. (2018)]
- Cost control: assumptions exist in the EAC. Extension: new methodology for the cost estimate at completion (CEAC). [Narbaev and De Marco (2014)]
- Schedule control: no direct interface with decision-making. Extension: multivariate model for EVM. [Colin et al. (2015), and Colin and Vanhoucke (2015)]
- Schedule control: metrics do not inform well about the schedule. Extension: BIM. [Kranz (2017)]
- Scope control in EVM: EVM does not explicitly address scope. Extension: earned scope management. [Valdés-Souto (2016), and Tariq et al. (2020)]
- Project control: schedule and cost control warning signals are inaccurate. Extension: earned incentive metric (EIM). [Kerkhove and Vanhoucke (2017)]
- Project control: EVM data are subjective and uncertain. Extension: Z-number-based earned value management (ZEVM). [Hendiani et al. (2020)]
- Project control: inaccurate indications by the time-performance-related indexes of EVM and earned schedule methods. Extensions: EDM, control charts. [Votto et al. (2020)]

Demachkieh and Abdul-Malak (2018) summarized the shortcomings of EVMS in the literature since 2000 and found that inaccurate estimate-at-completion (EAC) results and inaccurate percent complete of executed work are the top limitations. On the other hand, one of the major noted extensions to EVM is EDM, a technique to assess project duration performance (Khamooshi and Abdi 2017; Khamooshi and Golafshani 2014). EDM originated in industry, and its application has been heavily investigated in recent academic research efforts (Votto et al. 2020; Hamzeh and Mousavi 2019).

Although some authors have stated that the earned value concept is the best project management tool and that its application is a "best practice" (Bergerud 2017; Anderson 2015), EVM/EVMS come with drawbacks, which have been well acknowledged and investigated by academic researchers to improve EVMS reliability, particularly by aiming for reliable EVM analysis (Lipke 2011a), reliable forecasting methods (Narbaev and De Marco 2014), and reliable EVM metrics (Colin and Vanhoucke 2015). Despite the research efforts addressing the limitations, the EVM extensions were found to come with their own limitations of applicability. For example, earned quality value management (EQVM) was applied only to general building work, excluding related infrastructural works (Ong et al. 2018). Also, the new model of Miguel et al. (2019), which integrates risk, quality, and earned schedule into EVM, has only been tested on a small bicycle project. Furthermore, the extensions of EVM remain largely theoretical and need tests of practicability on large project data.

The identified limitations show that most of the drawbacks of EVM/EVMS investigated by academic researchers were technical in nature, and many of them have been resolved. In comparison, social aspects of EVMS, such as the EVMS environment previously discussed, were rarely identified as limitations. This finding contrasts with the most recent practitioner-focused research on the EVMS environment (i.e., Rezouki and Mortadha 2020; Aramali et al. 2021), which revealed social aspects of EVM/EVMS as oftentimes the key to unleashing EVMS' technical potential.

Theme 8: Other Topics on EVM/EVMS

A number of topics have been studied that are related to EVMS implementation or use but do not fall under any of the previously discussed themes.
The following paragraphs present the highlights of these topics.

The benefits of applying EVM/EVMS have been weighed against its implementation costs in publications by practitioners from industry and government (Bembers et al. 2017b; Kerby et al. 2017; Kratzert and Houser 2011). In comparison, only one recent academic publication has studied EVMS implementation costs (Hunter et al. 2014). Cost is an important factor in shaping the project team's willingness to properly implement EVMS, thus influencing its reliability and success.

Industry identifies the EVM components that impact various project deliverables (Aramvareekul and Stephens 2014). However, none of the reviewed publications focus on the EVM deliverables component for better project control. This indicates a disconnect between industry practitioners' experience and academic researchers' interests.

Although Lukas (2018) acknowledges that using EVM can be "frustrating," and although 272 industry experts ranked "planning for EVM" as the second most important of 18 environment factors impacting EVMS (Aramali et al. 2021), there is still a lack of systematic frameworks that explicitly lay out the overarching process of planning for EVM.

Recent academic research has emerged around automating EVM, with a few papers focusing on the management of earned value in construction projects using BIM (Elghaish et al. 2019; Alzraiee 2018). Such endeavors allow for more collaboration between project stakeholders and for effective and reliable decision-making, promoting successful project delivery (Marzouk and Hisham 2014). The approach can be extended to various types of projects and industries. Furthermore, automated integration of risk management into EVM is another endeavor toward reliable achievement of project goals (Eldosouky et al. 2014).
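The automation idea can be illustrated with a small sketch: earned value is rolled up from per-activity budgets and percent-complete values exported from a schedule or model. The record fields below are illustrative assumptions, not the schema of any BIM tool or cited framework:

```python
# Hypothetical sketch of automated EV roll-up from per-activity records
# (field names are illustrative assumptions, not any tool's actual schema).

def roll_up_earned_value(activities):
    """activities: list of dicts with 'budget' and 'pct_complete' (0..1).
    Returns total earned value and overall percent complete."""
    bac = sum(a["budget"] for a in activities)                 # budget at completion
    ev = sum(a["budget"] * a["pct_complete"] for a in activities)
    return ev, ev / bac

activities = [
    {"name": "foundations", "budget": 200, "pct_complete": 1.0},
    {"name": "structure",   "budget": 500, "pct_complete": 0.4},
    {"name": "finishes",    "budget": 300, "pct_complete": 0.0},
]
ev, pct = roll_up_earned_value(activities)
print(ev, pct)  # 400.0 earned of 1000 budgeted, i.e., 40% complete
```

In an automated pipeline, the activity records would be refreshed from the schedule or model export on each reporting cycle, removing the manual status-entry step that often undermines data reliability.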
Other recent academic research on EVM/EVMS includes studies of the effect of project disruption on construction, EVM with PERT, budgets for corrective actions, and buffer control.

The literature discussed a number of different topics that highlight both the social aspect of EVM/EVMS (e.g., planning for EVM by the project team) and its technical aspect (e.g., automated EVM).

Summary: Gaps and Future Research

Based on the analysis of the literature and in line with the study's objectives, potential gaps were identified that inform future research directions. The summary of gaps and future research is presented in Table 6.

Table 6. Summary of EVM/EVMS research gaps and future directions

(1) History of EVMS. Identified gaps: none. Future potential directions:
1. Comparison of EVM and EVMS versus earlier tools (C/SCSC).

(2) Compliance. Identified gaps: compliance reviews, oversight, and EVMS certification. Future potential directions:
2. Standardization of guideline application processes for contractor compliance evaluation (oversight).
3. Establishing consistency among EVM compliance criteria.
4. Tailoring compliance criteria to different types of projects (project delivery type, project size, industry).
5. Guideline implementation resource needs (costs, administrative and planning requirements).
6. Maturity assessment of EVMS with its subprocesses.
7. Clarity on EVMS subprocesses and their interdependence and integration throughout the different project phases.
8. Compliance success factors and their correlation with project performance.

(3) Forecasting/prediction. Identified gaps: accuracy of cost and schedule estimates. Future potential directions:
9. Development of better estimation models or techniques for improved visibility of future project performance (e.g., artificial intelligence) applicable to diverse projects.
10. Consideration of different project uncertainties and risks in forecasting processes.

(4) Application of risk management in EVM/EVMS. Identified gaps: risk management process integration with EVMS. Future potential directions:
11. Formation of guidelines and standards to integrate risks with EVMS.
12. Risk-based decision-making models with the use of EVM information.
13. Development and validation of risk performance metrics on large data sets of real-life projects.
14. Use of management reserves and of cost and schedule contingency reserves in EVM.

(5) Application of EVMS. Identified gaps: addressing the challenges of EVM in the oil and gas and software industries, and application of EVM in agile projects. Future potential directions:
15. Exploration of EVM application in agile project environments.
16. Comparison of EVM application across project contract types, industries, contractual requirements, size and scope, and tailoring requirements with associated challenges.
17. Case studies of EVM guidelines and standards application.

(6) EVMS environment. Identified gaps: assessment frameworks of the EVMS environment. Future potential directions:
18. Objective assessment models of EVMS environment factors.
19. Correlation of EVMS environment factors with project performance or success.

(7) Limitations and extensions of EVM. Identified gaps: testing of EVM extensions. Future potential directions:
20. Tests of EVM extensions on large and diverse real-life project data.
21. Further EVM extensions to address scope, scope changes, and project quality.

(8) Other topics on EVM/EVMS. Identified gaps: EVMS implementation costs, EVM deliverables, planning for EVM, automated EVM in various types of projects and industries, automated risk management integration into EVM, and buffer control. Future potential directions:
22. Further studies looking at the identified gaps under the theme of "Other topics on EVM/EVMS."

By addressing the identified gaps and following the suggested future directions, academic research could address some key needs of practitioners, aiming for improved and reliable management of earned value. That would help bridge the EVM disconnect between academia and industry.

Conclusions and Future Research Directions

In this study, a classification of the EVM/EVMS literature by themes was performed, highlighting the body of knowledge and identifying areas for further research on the topic of ensuring a reliable EVMS.
The paper also introduced the concept and theory of EVMS maturity and environment and their impact on project performance. The resulting themes that reflect up-to-date EVM/EVMS practices and trends are: (1) History of EVMS, (2) Compliance, (3) Forecasting/prediction, (4) Application of risk management in EVM/EVMS, (5) Application of EVMS, (6) EVMS environment, (7) Limitations and extensions of EVM, and (8) Other topics on EVM/EVMS.

The themes of forecasting/prediction and of limitations and extensions of EVM, along with their related recommendations for future research, already appeared in literature reviews and research investigations conducted earlier than 2012 (Hernández et al. 2013; Chen and Zhang 2012). Newer studies have a narrower focus: Devanshu et al. (2018) critically reviewed the literature emphasizing only the technical side (work breakdown structure, scope changes, and modern software tools); Nizam and Elshannaway (2019) focused solely on EVM limitations and extensions in their review and proposed a path forward related to the discussed limitations; and the project control research trends and gaps identified by Willems and Vanhoucke (2015) are technical in nature (i.e., testing and considering uncertainties in project control techniques to trigger corrective actions). A number of these topics were mentioned in an article by Kwak and Anbari (2012) representing the perspective of one federal agency (i.e., NASA). However, none of the recent reviews on EVM/EVMS sheds light on all of the aforementioned topics in addition to the key topics of EVMS history, compliance, application, and the EVMS maturity and environment dimensions, which represents a novelty of this study.
Furthermore, the exploration and adaptation of EVMS as a sociotechnical system does not exist in any of the previous studies on EVM/EVMS.

The three themes that contain the highest number of publications are forecasting/prediction, followed by application of EVMS, and limitations and extensions of EVM, whereas the other themes constitute roughly equal portions of the literature (approximately 10% each), except for history of EVMS (3%). The authors found a significant association between these themes and the publication type. The most significant disconnect between academia and industry is found in the themes of limitations and extensions of EVM, EVMS environment, compliance, and history of EVMS: academics are more interested in identifying and addressing limitations by developing new tools and techniques, while practitioners are more interested in working on compliance and environment factors by following guidelines and best practices. Nevertheless, forecasting/prediction is the most popular theme in both academic and industry publications.

Although the number of EVM/EVMS publications has been relatively stable in the last decade, the focus areas have changed with time; forecasting/prediction and application of EVMS have maintained the majority of the research focus over that period. However, publications on the application of EVM/EVMS in risk management and on compliance have declined in the last 5 years versus the previous 5 years. New miscellaneous topics have emerged within the last decade, including the cost of EVM implementation, deliverables, automated EVM, and buffer control.

The authors acknowledge that some limitations may exist in this paper. First, studying only publications with the terms "earned value management" or "earned value management system" spelled out in the title or abstract might have caused interesting articles around the earned value method to be overlooked.
Second, some publications in different academic search engines and industry databases were not considered in the analysis. Nevertheless, the review and classification methodology can be replicated on any other EVM/EVMS-focused literature source beyond this scope, and the results could be compared to the outcomes of this study. Furthermore, the review started with the following objective in mind: introducing the new perspective of EVMS maturity and EVMS environment as variables that contribute to a reliable EVMS. However, the findings revealed only one recent publication that identifies the challenges faced while trying to achieve a compliant EVMS and highlights how ensuring maturity across the EVMS subprocesses helps with that. While the concept of maturity was mostly missing from the recent academic EVMS literature, the review showed a number of industry publications introducing environment factors.

Accordingly, to advance the state of the art and help practitioners better manage their large and complex projects and programs, the authors recommend performing more research on EVMS maturity. Moreover, since surveillance, EVMS certification, and compliance reviews against guidelines and standards are all important for the success of a project and program, there is a need to investigate their consistency, costs, administrative efforts, and associated processes. This is necessary because these practices can impact the maturity level of the system, thus influencing its reliability.

As mentioned throughout the analysis, there is a need for a consistent method that helps project and program stakeholders assess their EVMS maturity and compliance.
The maturity of an EVMS may be gauged by evaluating the maturity of its subprocesses. This can serve as a basis for developing assessment frameworks that use EVMS guidelines to evaluate maturity at different phases of a project or program's life cycle, such that the assessment helps identify shortcomings in the system's compliance with EVMS guidelines. The industry's use of a common assessment framework could also help achieve coherent interpretations of the guidelines. Such EVMS maturity assessment results would allow project and program stakeholders to identify deficiencies and work toward improving their project.

Although several environment factors were identified in the literature, there is no single consistent framework that encompasses the criteria of the most influential factors. This highlights the need to develop a comprehensive framework to assess the EVMS environment at different phases of a project and program life cycle; the assessment could identify the people, culture, practices, and resources problems that need to be resolved.

In line with these recommendations for future EVMS research directions, the EVMS assessment frameworks can be developed as automated tools. They need to be consistent, timely, objective, and compatible with various types of contracts, projects, and programs, as well as usable by both project owners and contractors. Such a method would allow project and program decision-makers to rely on potential early warnings from the assessments either to plan corrective actions that mitigate cost overruns and schedule delays or to plan preventive procedures that avoid such issues. Such assessments are opportunities for establishing interactive and productive communication between stakeholders, as well as for developing strategies for successful and effective management of projects and programs. They further create possibilities for benchmarking the current project against previous projects and learning lessons for continuous improvement.
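As a purely hypothetical sketch of what such an automated assessment tool might compute (the 1 to 5 levels, threshold, and equal weighting are illustrative assumptions, not a published framework), ratings on the ten EVMS subprocesses could be rolled up into an overall score with simple early-warning flags:

```python
# Purely hypothetical maturity roll-up sketch (levels, threshold, and
# equal weighting are illustrative assumptions, not a published framework).

SUBPROCESSES = [
    "organizing", "planning and scheduling", "budgeting and work authorization",
    "accounting considerations", "indirect budget and cost management",
    "analysis and management reporting", "change control",
    "material management", "subcontract management", "risk management",
]

def assess_maturity(ratings, threshold=3):
    """ratings: dict mapping each subprocess to a maturity level 1..5.
    Returns the overall average and the subprocesses flagged for action."""
    missing = [s for s in SUBPROCESSES if s not in ratings]
    if missing:
        raise ValueError(f"unrated subprocesses: {missing}")
    overall = sum(ratings[s] for s in SUBPROCESSES) / len(SUBPROCESSES)
    flags = sorted(s for s in SUBPROCESSES if ratings[s] < threshold)
    return overall, flags

ratings = {s: 4 for s in SUBPROCESSES}
ratings["risk management"] = 2          # one weak subprocess
overall, flags = assess_maturity(ratings)
print(overall, flags)  # 3.8, with 'risk management' flagged
```

The flagged subprocesses are where the early-warning value lies: a decision-maker could target corrective action at them before cost or schedule impacts materialize.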
Even though EVMS has multiple proven benefits to projects and programs, the correlation between its successful implementation and the probability of project and program success needs to be investigated quantitatively, which is an opportunity for a new line of research around EVMS.

Beyond the suggested recommendations around EVMS maturity and environment, and based on the findings of this paper, the authors recommend conducting more academic studies in the following areas to improve EVMS reliability: development and use of contingency reserves, implementation of EVMS in agile contexts, cost of EVM implementation, management of EVMS deliverables, effective planning for EVM, automated EVM focused on decision-making, change control, and EVMS integration with subcontract management. There are also opportunities to use technology such as machine learning techniques to improve proactive project control management, similar to how previous research used artificial intelligence in cost and schedule predictions (Wauters and Vanhoucke 2016; Willems and Vanhoucke 2015). This can be done by developing and using large data sets from different types of projects.

The authors believe that the aforementioned avenues for future research will make a considerable impact on the EVMS state of the art and industry practice, some of which is tied to best practices for scheduling, estimating, and work breakdown structure development, among others (DoD 2020; GAO 2019; Richey 2017). Finally, the paper highlighted different components of a reliable EVMS, which should account for both technical and social elements. Through this examination of the existing literature, the authors validate that a reliable EVMS is a "sociotechnical system" in which technical and human elements are interrelated, affect one another, and can contribute to the success of the system.

EVM is viewed by many experts as a great means to control and manage complex projects.
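As a toy illustration of the data-driven forecasting direction discussed above, the sketch below fits a one-variable least-squares line mapping mid-project CPI to final cost growth (final cost divided by BAC). All data points and variable names are synthetic assumptions; a real study would train far richer models on large project data sets.

```python
# Toy illustration of data-driven forecasting: ordinary least squares in one
# variable, relating mid-project CPI to final cost growth. Data are synthetic.
def fit_line(xs, ys):
    """Closed-form simple linear regression; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return my - slope * mx, slope

cpi_at_50pct = [0.80, 0.90, 0.95, 1.00, 1.05]  # assumed mid-project CPIs
cost_growth  = [1.30, 1.15, 1.08, 1.00, 0.97]  # assumed final cost / BAC

a, b = fit_line(cpi_at_50pct, cost_growth)
# Forecast cost growth for a hypothetical new project with CPI = 0.85;
# a low CPI should map to a predicted overrun (growth > 1.0).
predicted = a + b * 0.85
```

Even this trivial model captures the intuition that poor mid-project cost efficiency predicts overruns; machine learning approaches generalize the same idea to many predictors and nonlinear relationships.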
It is a topic with a tremendous amount of research and publications. The literature has extensively studied the technical side of EVM/EVMS; this state-of-the-art review shows that there are many other aspects that need further attention to achieve a significant level of EVMS reliability.

References

AACE (American Association of Cost Engineering). 2014a. Level of effort planning and execution on earned value projects: Within the framework of ANSI/EIA-748. Morgantown, WV: AACE. AACE (American Association of Cost Engineering). 2014b. Required skills and knowledge of earned value management. Morgantown, WV: AACE. Abba, W. 2000. “How earned value got to primetime: A short look back and a glance ahead.” In Proc., Project Management Institute Seminars and Symp. Newtown Square, PA: Project Management Institute. Abba, W. F. 2017. “The evolution of earned value management.” Coll. Perform. Manage. 2: 9–12. Abdel Razik, M. 2020. “Earned schedule as a tool to forecast indirect costs.” In Proc., EVP PSP, 2020 AACE Int. Transactions, EVM-3544. Morgantown, WV: American Association of Cost Engineering. Agile Alliance. 2017. Agile practice guide. Newtown Square, PA: Project Management Institute. Alhojailan, M. I. 2012. “Thematic analysis: A critical review of its process and evaluation.” West East J. Social Sci. 1 (1): 39–47. Alleman, G. 2014. Performance-based project management: Increasing the probability of project success. New York: AMACOM. Alleman, G., T. Coonce, and R. Price. 2018. “Increasing the probability of program success with continuous risk management.” Coll. Perform. Manage. 4: 27–46. Alshaheen, A. 2018. “Forecasting project completion date using earned schedule and primavera P6TM.” Coll. Perform. Manage. 2: 11–22. Alzraiee, H. 2018. “Integrating BIM and earned value management system to measure construction progress.” In Proc., Construction Research Congress 2018, 684–693. Reston, VA: ASCE. Anderson, L. 2015.
“Managing by exception–simplifying earned value for mainstream application.” Coll. Perform. Manage. 3: 29–30. Aramali, V., G. E. Gibson Jr., M. El Asmar, and N. Cho. 2021. “Earned value management system state of practice: Identifying critical subprocesses, challenges, and environment factors of a high-performing EVMS.” J. Manage. Eng. 37 (4): 04021031. https://doi.org/10.1061/(ASCE)ME.1943-5479.0000925. Aramvareekul, P., and R. D. Stephens. 2014. “Planning for EVM within basic project controls deliverables.” In Proc., 2014 AACE Int. Transactions, EVM.1552. Morgantown, WV: American Association of Cost Engineering. Baggett, K. S., Jr. 2014. “A systems-based framework for the assessment of performance measurement system implementations in R&D organizations.” Ph.D. dissertation, College of Engineering & Technology, Engineering Management, Old Dominion Univ. Bain, R., and L. Polakovic. 2005. Traffic forecasting risk study update 2005: Through ramp-up and beyond. Standard and Poor’s Rating Direct on the Global Credit Portal. New York: Standard & Poor’s. Baker, M. C. 2015. “A survival guide for using EVMS on small EPC projects.” In Proc., CCP CCT, 2015 AACE Int. Transactions, EVM.1896. Morgantown, WV: American Association of Cost Engineering. Ballesteros-Pérez, P., E. Sanz-Ablanedo, D. Mora-Melia, M. C. González-Cruz, J. L. Fuentes-Bargues, and E. Pellicer. 2019. “Earned schedule min-max: Two new EVM metrics for monitoring and controlling projects.” Autom. Constr. 103 (Jul): 279–290. https://doi.org/10.1016/j.autcon.2019.03.016. Batselier, J., and M. Vanhoucke. 2015b. “Evaluation of deterministic state-of-the-art forecasting approaches for project duration based on earned value management.” Int. J. Project Manage. 33 (7): 1588–1596. https://doi.org/10.1016/j.ijproman.2015.04.003. Batselier, J., and M. Vanhoucke. 2017. “Improving project forecast accuracy by integrating earned value management with exponential smoothing and reference class forecasting.” Int. J. 
Project Manage. 35 (1): 28–43. https://doi.org/10.1016/j.ijproman.2016.10.003. Bembers, I., M. Jones, E. Knox, and J. Traczyk. 2017a. Better earned value management system implementation research study. Arlington, VA: Joint Space Cost Council. Bembers, I., E. Knox, M. Jones, and J. Traczyk. 2017b. “EVM system’s high cost—Fact or fiction? Defense Acquisition University.” Coll. Perform. Manage. 1: 57–61. Bergerud, C. 2017. “Adopting a flexible EVM strategy to optimize project performance.” In Proc., 2017 AACE Int. Transactions, EVM-2590. Morgantown, WV: American Association of Cost Engineering. Bhaumik, H. 2016. “EVMS recommendations for multi-contract projects.” In Proc., 2016 AACE Int. Transactions, EVM-2141. Morgantown, WV: American Association of Cost Engineering. Bider, I., and V. Klyukina. 2018. “Using a socio-technical systems approach for a sales process improvement.” In Proc., 2018 IEEE 22nd Int. Enterprise Distributed Object Computing Workshop (EDOCW), 48–58. New York: IEEE. Blanco, F. 2011. “Revenue recognition methods and cost control through earned value management (EVM).” In Proc., 2016 AACE Int. Transactions, EVM.476. Morgantown, WV: American Association of Cost Engineering. Bolinger, P., and S. Phillips. 2018. “On the psychology of human misjudgment: Charlie Munger on decision-making.” Coll. Perform. Manage. 4: 9–11. Booth, A., A. Sutton, and D. Papaioannou. 2016. Systematic approaches to a successful literature review. London: SAGE. Bowman, L., and M. Sabouri. 2014. “Effective use of earned value for controlling construction projects.” In Proc., 2014 AACE Int. Transactions, EVM.1747. Morgantown, WV: American Association of Cost Engineering. Briner, R. B., and D. Denyer. 2012. “Systematic review and evidence synthesis as a practice and scholarship tool.” In Handbook of evidence-based management: Companies, classrooms and research, 112–129. Oxford, UK: Oxford University Press. Carrico, K., E. Dembert, M. Hollowell, J. Johannsen, and J. Lund. 2019. 
“The power of projections: Innovative schedule forecasting techniques.” In Proc., CCP PSP, 2019 AACE Int. Transactions, CSC-3107. Morgantown, WV: American Association of Cost Engineering. Chen, S., and X. Zhang. 2012. “An analytic review of earned value management studies in the construction industry.” In Proc., Construction Research Congress 2012: Construction Challenges in a Flat World, 236–246. Reston, VA: ASCE. Chirinos, W. 2014. “Applying earned value to overcome challenges in oil and gas industry surface projects.” Coll. Perform. Manage. 4: 7–13. Cho, N., M. El Asmar, G. E. Gibson Jr., and V. Aramali. 2020. “Earned value management system (EVMS) reliability: A review of existing EVMS literature.” In Proc., Construction Research Congress 2020, 631–639. Reston, VA: ASCE. Christensen, D. S. 1994. “A review of cost/schedule control systems criteria literature.” Project Manage. J. 25 (3): 32–39. Colin, J., A. Martens, M. Vanhoucke, and M. Wauters. 2015. “A multivariate approach for top-down project control using earned value management.” Decis. Support Syst. 79 (Nov): 65–76. https://doi.org/10.1016/j.dss.2015.08.002. Colin, J., and M. Vanhoucke. 2015. “A comparison of the performance of various project control methods using earned value management systems.” Expert Syst. Appl. 42 (6): 3159–3175. https://doi.org/10.1016/j.eswa.2014.12.007. Crawford, J. K. 2001. Project management maturity model: Providing a proven path to project management excellence. Boca Raton, FL: CRC Press. Crowe, S., and A. Basche. 2011. “Preparing for a successful EVMS certification.” In Proc., 2011 AACE Int. Transactions, EVM-622. Morgantown, WV: American Association of Cost Engineering. Crumrine, K. T., and J. D. Ritschel. 2013. “A comparison of earned value management and earned schedule as schedule predictors on DoD ACAT I programs.” Coll. Perform. Manage. 2: 37–44. Cruzes, D. S., and T. Dyba. 2011. “Recommended steps for thematic synthesis in software engineering.” In Proc., 2011 Int. 
Symp. on Empirical Software Engineering and Measurement, 275–284. New York: IEEE. DCMA (Defense Contract Management Agency). 2012. Earned value management system (EVMS) program analysis pamphlet (PAP). DCMA-EA PAM 200.1. Fort Lee, VA: DCMA. Dekker, S. 2012. “Accepted standards and emerging trends in Over Target Baseline (OTB) contracts.” In Proc., AACE Int. Transactions, CSC.816. Morgantown, WV: American Association of Cost Engineering. Demachkieh, F., and M. A. Abdul-Malak. 2018. “Synthesis of improvements to EVMS key parameters representation.” In Proc., Construction Research Congress 2018, 399–408. Reston, VA: ASCE. Demachkieh, F., and M. A. Abdul-Malak. 2019. “Emergent EVM techniques for construction schedule performance measurement and control.” In Proc., 2019 AACE Int. Transactions, EVM-3271. Morgantown, WV: American Association of Cost Engineering. Denyer, D., and D. Tranfield. 2009. “Producing a systematic review.” In The SAGE handbook of organizational research methods. London: SAGE. Devanshu, V., P. M. Rajgor, and J. Pitroda. 2018. “A critical literature review on implementation of earn value management.” Int. J. Constructive Res. Civ. Eng. 4 (1): 39–43. https://doi.org/10.20431/2454-8693.0401004. Dinsmore, P. C., and J. Cabanis-Brewin. 2014. The project management body of knowledge: Comprehension and practice: The AMA handbook of project management. New York: AMACOM. DoD (Department of Defense). 2020. Work breakdown structures for defense material items. Military Standard 881E. Arlington, VA: DoD. DOE. 2018a. Office of project management EVMS Compliance Review Standard Operating Procedure (ECRSOP). Washington, DC: DOE. DOE. 2018b. Program and project management for the acquisition of capital assets. DOE O 413.3B. Washington, DC: DOE. DOE. 2019. DOE EVMS gold card. Washington, DC: DOE. DOE. 2020. Integrated project management—Earned value management system (EVMS). DOE G 413.3-10B. Washington, DC: DOE. Driessnack, J. D. 2017. 
“The history of earned value management, the 60’s.” Coll. Perform. Manage. 2: 29–34. Driessnack, J. D. 2019. “Earned value management standards structures, part 1—Why three.” Coll. Perform. Manage. 2: 34–38. Efe, P., and O. Demirörs. 2013. “Applying EVM in a software company: Benefits and difficulties.” In Proc., 39th Euromicro Conf. on Software Engineering and Advanced Applications, 333–340. New York: IEEE. El Asmar, M., G. E. Gibson Jr., D. Ramsey, A. Yussef, and Z. U. Din. 2018. The maturity and accuracy of front end engineering design (FEED) and its impact on project performance. Research Rep. No. RR331-11. Austin, TX: Construction Industry Institute. Eldosouky, I. A., A. H. Ibrahim, and H. E. Mohammed. 2014. “Management of construction cost contingency covering upside and downside risks.” Alexandria Eng. J. 53 (4): 863–881. https://doi.org/10.1016/j.aej.2014.09.008. Elghaish, F., S. Abrishami, M. R. Hosseini, S. Abu-Samra, and M. Gaterell. 2019. “Integrated project delivery with BIM: An automated EVM-based approach.” Autom. Constr. 106 (Oct): 102907. https://doi.org/10.1016/j.autcon.2019.102907. Erez, S., and J. Sloan. 2017. “Case study: Using ISO 20000 to supplement earned value management.” In Proc., 2017 AACE Int. Transactions, EVM-2552. Morgantown, WV: American Association of Cost Engineering. Farok, G., and J. A. Garcia. 2015. “Developing group leadership and communication skills for monitoring EVM in project management.” J. Mech. Eng. 45 (1): 53–60. https://doi.org/10.3329/jme.v45i1.24385. Finefield, T. 2013a. “Earned value management guidelines: Accounting considerations, analysis and management reports, revisions and data maintenance.” Coll. Perform. Manage. 2: 7–18. Finefield, T. 2013b. “Earned value management guidelines: Organization and planning, scheduling and budgeting.” Coll. Perform. Manage. 1: 5–13. Fleming, Q. W., and C. W. Ervin. 1962. “Management aids for program control.” Aerosp. Manage. 5 (7): 26–30. Fleming, Q. W., and J. M. Koppelman. 
2010. Earned value project management. Newton Square, PA: Project Management Institute. Frahm, V. L. 2012. “Designing a tailored earned value management system (EVMS).” In Proc., 2012 AACE Int. Transactions, EVM.1059. Morgantown, WV: American Association of Cost Engineering. Frank, M., D. Kester, and K. Urschel. 2017. “The power of data: New thinking and technology can keep EVMS relevant.” Defense Acquisition Magazine, May 1, 2017. GAO (Government Accountability Office). 2009. Cost estimating and assessment guide: Best practices for developing and managing capital program costs. Washington, DC: GAO. GAO (Government Accountability Office). 2019. Cost and schedule performance of large facilities construction projects and opportunities to improve project management. Washington, DC: GAO. Gareis, R., and M. Huemann. 2000. “Project management competences in the project-oriented organization.” In Vol. 3 of Gower handbook of project management. Aldershot, UK: Gower Publishing. Garrett, G. A., and R. G. Rendon. 2006. US military program management: Lessons learned and best practices. San Francisco: Berrett-Koehler Publishers. Giammalvo, P. D. 2012. “Real time’ performance reporting using earned value for the mining sector.” In Proc., 2012 AACE Int. TCM Conf. Transactions, EVM.1179. Morgantown, WV: American Association of Cost Engineering. Gibson, G. E. J., M. El Asmar, and N. Cho. 2019. Project definition rating index: Maturity and accuracy total rating system (PDRI MATRS). Austin, TX: Construction Industry Institute. Gonzales, M. T. 2016. “Implementing project controls: Preparing for the establishment of the integrated (cost and schedule) performance measurement baseline.” Coll. Perform. Manage. 2: 17–24. Graham, D. 2018. “Cost risk management.” Coll. Perform. Manage. 1: 9–12. Gupta, R. 2014. “Earned value management system.” Int. J. Emerging Eng. Res. Technol. 2 (4): 160–165. Gustavus, R. L., and K. Hunter. 2012. 
“How the department of defense determines if EVM should be required and what contractual requirements are necessary if the answer is ‘yes.’” Coll. Perform. Manage. 1: 22–30. Hammad, M. W., A. Abbasi, and M. J. Ryan. 2018. “Developing a novel framework to manage schedule contingency using theory of constraints and earned schedule method.” J. Constr. Eng. Manage. 144 (4): 04018011. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001178. Hamzeh, A. M., and S. M. Mousavi. 2019. “A new fuzzy approach for project time assessment under uncertain conditions.” In Proc., 2019 15th Iran Int. Industrial Engineering Conf. (IIIEC), 76–80. New York: IEEE. Hendiani, S., M. Bagherpour, A. Mahmoudi, and H. Liao. 2020. “Z-number based earned value management (ZEVM): A novel pragmatic contribution towards a possibilistic cost-duration assessment.” Comput. Ind. Eng. 143 (May): 106430. https://doi.org/10.1016/j.cie.2020.106430. Hernández, J. I. M., J. R. O. Olaso, and J. R. Gómez. 2013. “Technical performance based earned value as a management tool for engineering projects.” Chap. 7 in Engineering management, 143–166. London: Intechopen. Heron, R. A., E. T. White, J. D. Ritschel, and C. G. Keaton. 2015. “Forecasting DoD mid-acquisition space program EACs using WBS level 2 and 3 data.” Coll. Perform. Manage. 3: 9–14. Humphrey, W. S. 1989. Managing the software process. Boston: Addison-Wesley Longman. Humphreys, G. C. 2016. “EVMS—After the evolution: The long slow road.” Coll. Perform. Manage. 1: 19–22. Humphreys, G. C. 2018. Project management using earned value. Irvine, CA: Humphreys & Associates. Infanti, M. 2011. “How to estimate and use management reserve in an earned value management system (EVMS).” Coll. Perform. Manage. 4: 32–37. ISO. 2018. Earned value management in project and programme management. Geneva: ISO. Jaeger, T. W. 2014. “Why earned value metrics sometimes deceive management.” In Proc., 2014 AACE Int. Transactions, EVM.1556. 
Morgantown, WV: American Association of Cost Engineering. Jung, Y., B. S. Moon, and J. Y. Kim. 2011. “EVMS for nuclear power plant construction: Variables for theory and implementation.” In Proc., Computing in Civil Engineering, 728–735. Reston, VA: ASCE. Kerby, J., M. Forbes, and S. Terrell. 2017. “Cutting the cost of earned value management.” Defense Acquisition Magazine, April 10, 2017. Kerkhove, L.-P., and M. Vanhoucke. 2017. “Extensions of earned value management: Using the earned incentive metric to improve signal quality.” Int. J. Project Manage. 35 (2): 148–168. https://doi.org/10.1016/j.ijproman.2016.10.014. Kersbergen, J. 2011. “The challenge for earned value in commercial industry.” Coll. Perform. Manage. 1: 18–20. Kerzner, H. 2017. Project management: A systems approach to planning, scheduling, and controlling. New York: Wiley. Kester, D., D. Cottrell, and K. Carney. 2015. “Data driven EVMS compliance: An analytical approach that will transform the way we think about managing.” Coll. Perform. Manage. 2: 7–13. Khamooshi, H., and H. Golafshani. 2014. “EDM: Earned Duration Management, a new approach to schedule performance management and measurement.” Int. J. Project Manage. 32 (6): 1019–1041. https://doi.org/10.1016/j.ijproman.2013.11.002. King, J. S. 2018. “Methodologies for implementing program controls: Strategic methodologies for implementing program controls in change resistant defense contracting environments.” Master’s thesis, College of Business, Wilmington Univ. Kranz, G. M. 2017. “Intelligently linking information for better performance management across industry and government.” Coll. Perform. Manage. 4: 11–13. Kratzert, K., and J. R. Houser. 2011. “Cost of earned value management.” Coll. Perform. Manage. 3: 12–14. Kwak, Y. H., and F. T. Anbari. 2012. “History, practices, and future of earned value management in government: Perspectives from NASA.” Project Manage. J. 43 (1): 77–90. https://doi.org/10.1002/pmj.20272. Lappenbusch, C. F. J. 2017. 
“Misuse of earned value management results in erroneous conclusions.” In Proc., CCP EVP PSP, 2017 AACE Int. Transactions, EVM-2471. Morgantown, WV: American Association of Cost Engineering. Laqua, R. 2018. “Why compliance needs to change.” Coll. Perform. Manage. 1: 13–19. Liggett, W., H. Hunter, and M. Jones. 2017. “Navigating an earned value management validation led by NASA: A contractor’s perspective and helpful hints.” In Proc., 2017 IEEE Aerospace Conf., 1–28. New York: IEEE. Liggett, W., H. Hunter, and M. Jones. 2019. “So you passed an earned value management government validation-now what?” In Proc., 2019 IEEE Aerospace Conf., 1–10. New York: IEEE. Lipke, W. 2011a. “Is something missing from project management.” Coll. Perform. Manage. 1: 23–28. Lipke, W. 2011b. “Schedule adherence and rework.” Coll. Perform. Manage. 1: 9–18. Lipke, W. 2019. “Earned schedule forecasting method selection.” Coll. Perform. Manage. 1: 21–28. Lofgren, E. M. 2017. “Trust, but verify: An improved estimating technique using the integrated master schedule (IMS).” Coll. Perform. Manage. 4: 29–36. Lorenz, A., H. S. Bosch, and K. Kuttler. 2012. “Implementation of earned value management tools in the Wendelstein 7-X project.” IEEE Trans. Plasma Sci. 40 (12): 3560–3565. https://doi.org/10.1109/TPS.2012.2220784. Lucas, D. A. 2012. “Increasing project controls impact on a successful project.” In Vol. 54 of Proc., AACE Int., Cost Engineering. Morgantown, WV: American Association of Cost Engineering. Lukas, A. J. 2018. “How to successfully use earned value on projects.” In Vol. 60 of Proc., Cost Engineering. Morgantown, WV: American Association of Cost Engineering. Ma, X., and B. Yang. 2012. “Optimization study of earned value method in construction project management.” In Proc., 2012 Int. Conf. on Information Management, Innovation Management and Industrial Engineering, 201–204. New York: IEEE. Mahmoudi, A., M. Bagherpour, and S. A. Javed. 2019. 
“Grey earned value management: Theory and applications.” IEEE Trans. Eng. Manage. 68 (6): 1703–1721. https://doi.org/10.1109/TEM.2019.2920904. Martinelli, R., and J. Waddell. 2014. “A program management decision process.” Coll. Perform. Manage. 4: 7–16. McGrath, M. T. 2012. “Earned value management: Transitioning from discouragement to a glimmer of hope.” In Proc., 2012 IEEE Aerospace Conf., 1–11. New York: IEEE. McGregor, J. S. 2019a. Department of defense earned value management implementation guide (EVMIG). Arlington, VA: Department of Defense. McGregor, J. S. 2019b. Department of defense earned value management system interpretation guide (EVMSIG). Arlington, VA: Department of Defense. McNamee, E. M., C. E. Hanner, and C. W. Immonen. 2017. “Improving EVMS compliance through data integration.” In Proc., 2017 AACE Int. Transactions, EVM-2581. Morgantown, WV: American Association of Cost Engineering. McNamee, E. M., and C. W. Immonen. 2019. “Use of earned value management as a communication tool with the project team and the client.” In Proc., 2019 AACE Int. Transactions, EVM-3155. Morgantown, WV: American Association of Cost Engineering. Melamed, D., and R. C. Plumery. 2015. “A critical analysis of the ANSI/EIA standard for EVMS and the TCM framework.” In Proc., 2015 AACE Int. Transactions, EVM.2041. Morgantown, WV: American Association of Cost Engineering. Miguel, A., W. Madria, and R. Polancos. 2019. “Project management model: Integrating earned schedule, quality, and risk in earned value management.” In Proc., 2019 IEEE 6th Int. Conf. on Industrial Engineering and Applications (ICIEA), 622–628. New York: IEEE. Mishakova, A., A. Vakhrushkina, V. Murgul, and T. Sazonova. 2016. “Project control based on a mutual application of pert and earned value management methods.” Procedia Eng. 165 (Jan): 1812–1817. https://doi.org/10.1016/j.proeng.2016.11.927. Morad, M., and S. M. El-Sayegh. 2016. 
“Use of earned value management in the UAE construction industry.” In Proc., 2016 Int. Conf. on Industrial Engineering, Management Science and Application (ICIMSA), 1–5. New York: IEEE. Morin, B. J. 2016. “How it all began: The creation of earned value and the evolution of C/SPCS and C/SCSC.” Coll. Perform. Manage. 1: 15–17. NASA (National Aeronautics and Space Administration). 2018a. Earned value management (EVM) implementation handbook. Washington, DC: NASA. NASA (National Aeronautics and Space Administration). 2018b. Earned value management reference guide for project-control account managers. Washington, DC: NASA. NDIA (National Defense Industrial Association). 2016. ANSI/EIA 748 earned value management system acceptance guide. Arlington, VA: Integrated Program Management Division. NDIA (National Defense Industrial Association). 2018a. Earned value management systems EIA-748-D intent guide. Arlington, VA: Integrated Program Management Division. NDIA (National Defense Industrial Association). 2018b. Earned value management systems application guide. Arlington, VA: Integrated Program Management Division. NDIA (National Defense Industrial Association). 2020. Earned value management system guideline scalability guide. Arlington, VA: Integrated Program Management Division. Netto, J. T., N. L. F. de Oliveira, A. P. A. Freitas, and J. A. N. dos Santos. 2020. “Critical factors and benefits in the use of earned value management in construction.” Braz. J. Oper. Prod. Manage. 17 (1): 1–10. https://doi.org/10.14488/BJOPM.2020.007. Nizam, A., and A. Elshannaway. 2019. “Review of earned value management (EVM) methodology, its limitations, and applicable extensions.” J. Manage. Eng. Integr. 12 (1): 59–70. Nouban, F., N. Alijl, and M. Tawalbeh. 2020. “Integrated earned value analysis and their impact on project success.” Int. J. Adv. Eng. Sci. Appl. 1 (1): 34–39. https://doi.org/10.47346/ijaesa.v1i1.18. Oien, A. M. L. 2015. 
“EVM and agile: Complementary control loops of a project management system.” Coll. Perform. Manage. 3: 17–25. Ong, H. Y., C. Wang, and N. Zainon. 2018. “Developing a quality-embedded EVM tool to facilitate the iron triangle in architectural, construction, and engineering practices.” J. Constr. Eng. Manage. 144 (9): 04018079. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001533. Orgut, R. E., M. Batouli, J. Zhu, A. Mostafavi, and E. J. Jaselskis. 2020. “Critical factors for improving reliability of project control metrics throughout project life cycle.” J. Manage. Eng. 36 (1): 04019033. https://doi.org/10.1061/(ASCE)ME.1943-5479.0000710. Pajares, J., and A. Lopez-Paredes. 2011. “An extension of the EVM analysis for project monitoring: The cost control index and the schedule control index.” Int. J. Project Manage. 29 (5): 615–621. https://doi.org/10.1016/j.ijproman.2010.04.005. Plemmons, M. E., and T. R. Hodges. 2011. “Transforming earned value management through systems integration.” In Proc., 2011 AACE Int. Transactions, EVM-520. Morgantown, WV: American Association of Cost Engineering. PMI (Project Management Institute). 2013. Organizational project management maturity model (OPM3) knowledge foundation. Newtown Square, PA: Project Management Institute. PMI (Project Management Institute). 2017. PMI Lexicon of project management terms. Newtown Square, PA: Project Management Institute. PMI (Project Management Institute). 2018. The standard for organizational project management. Newtown Square, PA: Project Management Institute. PMI (Project Management Institute). 2019a. The standard for earned value management. Newtown Square, PA: Project Management Institute. PMI (Project Management Institute). 2019b. The standard for risk management in portfolios, programs and projects. Newtown Square, PA: Project Management Institute. Ramahatra, N. 2011. “Small projects, big savings by implementing best practices with earned value management (lessons learned).” Coll. Perform. Manage. 
3: 17–24. Richey, K. A. 2017. GAO best practice guides light the way. Fort Belvoir, VA: Defense Acquisition University Press. Rybina, E., D. Skorobogatov, S. T. Regan, and J. K. Owen. 2015. “CIS: Utilization of earned value management for monitoring production facilities.” In Proc., 2015 AACE Int. Transactions, EVM-1848. Morgantown, WV: American Association of Cost Engineering. Scheepbouwer, E., A. Ezz, B. Guo, and D. V. Der Walt. 2018. “Risk management and the effects on project success.” In Proc., Construction Research Congress 2018, 460–470. Reston, VA: ASCE. Shi, H. 2019. “Accurate quantity update: A key for project management success.” In Proc., CCP PSP, 2019 AACE Int. Transactions, TCM-A3106. Morgantown, WV: American Association of Cost Engineering. Siu, M.-F., and M. Lu. 2011. “Scheduling simulation-based techniques for earned value management on resource-constrained schedules under delayed scenarios.” In Proc., 2011 Winter Simulation Conf. (WSC), 3455–3466. New York: IEEE. Solomon, P. J., and R. R. Young. 2007. Performance-based earned value. New York: Wiley. Sowers, D., and H. Jarnagan. 2019. “Program management lessons learned: Alaskan way viaduct replacement program.” In Proc., 2019 AACE International Transactions, PM-3105. Morgantown, WV: American Association of Cost Engineering. Steege, A. W. V. D. 2019. “Unpacking earned value management for oil and gas projects.” In Proc., CCP, 2019 AACE International Transactions, EVM-3165. Morgantown, WV: American Association of Cost Engineering. Stratton, R. W. 2000. The EVM maturity model—EVM3. Newtown Square, PA: Project Management Institute. Stratton, R. W. 2006. The earned value management maturity model. San Francisco: Berrett-Koehler Publishers. Stratton, R. W. 2016. “Making EVM work in agile development projects.” Coll. Perform. Manage. 2: 7–16. Sutrisna, M., E. Pellicer, C. Torres-Machi, and M. Picornell. 2020. 
“Exploring earned value management in the Spanish construction industry as a pathway to competitive advantage.” Int. J. Construct. Manage. 20 (1): 1–12. https://doi.org/10.1080/15623599.2018.1459155. Tariq, S., N. Ahmad, M. U. Ashraf, A. M. Alghamdi, and A. S. Alfakeeh. 2020. “Measuring the impact of scope changes on project plan using EVM.” IEEE Access 8 (Aug): 154589–154613. https://doi.org/10.1109/ACCESS.2020.3018169. Valdés-Souto, F. 2016. “Earned scope management: A case of study of scope performance using COSMIC (ISO 19761) with a real project.” In Proc., 2016 Joint Conf. of the Int. Workshop on Software Measurement and the Int. Conf. on Software Process and Product Measurement (IWSM-MENSURA), 53–64. New York: IEEE. Vanhoucke, M. 2012. “Dynamic scheduling: Integrating schedule risk analysis with earned value management.” Coll. Perform. Manage. 2: 11–13. Vanhoucke, M. 2017. “About academic research on earned value management inspired by the college of performance management.” Coll. Perform. Manage. 3: 28–35. Vanhoucke, M., P. Andrade, F. Salvaterra, and B. Jordy. 2015. “Introduction to earned duration.” Coll. Perform. Manage. 2: 15–27. Votto, R., L. Lee Ho, and F. Berssaneti. 2020. “Applying and assessing performance of earned duration management control charts for EPC project duration monitoring.” J. Constr. Eng. Manage. 146 (3): 04020001. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001765. Wauters, M., and M. Vanhoucke. 2016. “A comparative study of artificial intelligence methods for project duration forecasting.” Expert Syst. Appl. 46 (Mar): 249–261. https://doi.org/10.1016/j.eswa.2015.10.008. Wibiksana, R. 2012. “Earned value management: Adapted for use in underground mining operations.” Coll. Perform. Manage. 3: 16–19. Wolf, L. D. 2014. “Project controls personnel: Finding the ‘right stuff’.” Coll. Perform. Manage. 3: 9–14. Zhan, Z., C. Wang, J. B. H. Yap, S. Samsudin, and H. Abdul-Rahman. 2019. 
“Earned value analysis, implementation barriers, and maturity level in oil & gas production.” S. Afr. J. Ind. Eng. 30 (4): 44–59. https://doi.org/10.7166/30-4-2030.


