To be effective, a project manager must command a range of skills to guide a project to completion. These include a working knowledge of information coming from multiple sources and the ability to synthesize that information so that, brought together, it provides an accurate picture of where the project has been, where it stands now, and what actions must be taken to keep it on track (or bring it back).
Looking Back to the Dawn of Digital
Starting in about the early 1990s, with the first wave of PC-based digitization of business systems, software focused on automating functions along the lines of specialization defined by the division of labor. Thus, early deployments during the initial tech bubble favored so-called “best-of-breed” approaches, in which the applications that performed each specific function best were knitted together to form the fabric of a strategic toolset.
Project management organizations would usually begin by selecting a scheduling application (Microsoft® Project, Oracle Primavera, Deltek Open Plan, Artemis ProjectView, etc.) and then select other applications that mirrored the required skillsets (some of these chosen by C-level managers): financial management, resource and personnel management, acquisition management, cost performance, risk management, configuration control, and others.
Once these systems were in place, a need arose, similar to but significantly different from that of pre-digital days, for a method of integrating and controlling them, one that relied heavily on manual systems and procedures. Labor effort shifted from ensuring the validity of information as it related to the work of project management to the validation, reconciliation, and management of data under rigorous non-automated controls vulnerable to human error.
Where It Went Wrong
Thus, in a perverse way, digitization reduced the value of the specialist's labor. But this was not the end of the transformation. The economic basis for digitization was, and is, improved productivity and reduced labor overhead. Organizations, having acquired the requisite technology and armed with their business plans, reduced their workforce and even the level (and salaries) of the knowledge workers who remained. Implicit in those business plans was the assumption that, while the deployed software often provided only an 80% or 90% solution, it would eventually be developed to fill the gaps. This has not happened.
Instead, the gaps have been filled by custom one-off solutions, often based on manual systems that supply the data required for PowerPoint reports, Microsoft® Excel spreadsheets, and customized Access applications. Widespread sub-optimization is now the rule: demands on a smaller workforce require that people, many of whom lack the required skills because they arrived after the transformation, interpret and understand the significance of the data, reconcile it, and manage it.
Fulfilling the Vision
To break the cycle of increasing complexity and lack of interoperability created by best-of-breed and one-off solutions, in an environment built on the reduction of available labor and resources, I believe organizations must take the next step and finally realize what was expected from the beginning: an integrated digital environment (IDE) strategy.
Since I first worked on this concept as a senior U.S. Navy Commander almost 18 years ago, the definition has morphed to the point of being almost undefinable, so let me say what I mean by IDE: the ability to receive, integrate, and normalize essential project management data regardless of its proprietary source, and then to aggregate and deconstruct that data so it can be delivered on a near real-time basis to the appropriate level of project management, in a form useful at that level. The following characteristics define what this means.
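To make the aggregate-and-deconstruct idea concrete, here is a minimal sketch of rolling normalized cost records up a work breakdown structure so that each management level sees data at its own depth. The WBS codes, field names, and record shape are my own illustrative assumptions, not a prescribed format.

```python
# Sketch: aggregating normalized project data upward through a WBS hierarchy
# so each management level receives a view rolled up to its depth.
# WBS element codes ("1.2.3") and field names are illustrative assumptions.
from collections import defaultdict

def rollup(records, level):
    """Aggregate cost records to the given WBS depth (1 = top level)."""
    totals = defaultdict(float)
    for rec in records:
        # Truncate the WBS code to the requested depth, e.g. "1.2.3" -> "1.2"
        key = ".".join(rec["wbs"].split(".")[:level])
        totals[key] += rec["actual_cost"]
    return dict(totals)

records = [
    {"wbs": "1.1.1", "actual_cost": 120.0},
    {"wbs": "1.1.2", "actual_cost": 80.0},
    {"wbs": "1.2.1", "actual_cost": 50.0},
]

print(rollup(records, 2))  # {'1.1': 200.0, '1.2': 50.0}
print(rollup(records, 1))  # {'1': 250.0}
```

The same records serve a control account manager at depth 3 and an executive at depth 1, which is the "appropriate level" delivery described above.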
First, normalization should be defined by the acceptance of industry-adopted data schemas that reduce the constituent parts of the data, whether schedule, cost, risk, or financial, so that the syntax supporting each discipline is objectively consistent. For example, the U.S. Department of Defense and the aerospace and defense industry have spearheaded the adoption of an international UN/CEFACT XML schema for normalizing earned value and schedule information, with risk and other data to be included over time. While ostensibly designed to support DoD-type work, the schema, particularly as it relates to project schedule data, is expansive enough for adoption by other commercial industries and across national economies.
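The essence of normalization can be sketched as mapping each tool's proprietary field names into one shared vocabulary. The two "exports" and the target field names below are hypothetical stand-ins; they are not the actual UN/CEFACT XML element names.

```python
# Illustrative only: mapping fields from two hypothetical proprietary exports
# into one normalized record shape, which is the essence of schema-based
# normalization. Field names are assumptions, not real UN/CEFACT elements.
def normalize_tool_a(row):
    return {"task_id": row["ID"], "bcws": float(row["PlannedValue"]),
            "bcwp": float(row["EarnedValue"]), "acwp": float(row["ActualCost"])}

def normalize_tool_b(row):
    return {"task_id": row["task"], "bcws": float(row["pv"]),
            "bcwp": float(row["ev"]), "acwp": float(row["ac"])}

a = normalize_tool_a({"ID": "T1", "PlannedValue": "100",
                      "EarnedValue": "90", "ActualCost": "95"})
b = normalize_tool_b({"task": "T2", "pv": "200", "ev": "210", "ac": "190"})

# Both records now share one vocabulary, regardless of source tool.
assert set(a) == set(b)
```

Once every source speaks this shared vocabulary, downstream aggregation and analysis no longer care which tool produced the data.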
Second, integration should be established either through acceptance of the standard data schema, with a corresponding, consistent manner of hosting that data in a relational database, or through direct access to the appropriate data via data communication protocols that forgo proprietary development or hardcoded connections (that is, standard protocols for accessing data). This approach precludes methods such as internal data swapping, transfer, or interfacing that require constant manual reconciliation.
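A minimal sketch of what protocol-based access looks like in practice: standard SQL issued against a relational store, rather than a proprietary file swap. Here sqlite3 stands in for any ODBC/JDBC-reachable database, and the table and column names are my own assumptions.

```python
# Sketch of standards-based access: any compliant client can issue the same
# SQL; no hardcoded, tool-specific interface is needed. sqlite3 is a stand-in
# for any relational database; the schema below is an assumption.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ev (task_id TEXT, bcws REAL, bcwp REAL, acwp REAL)")
conn.executemany("INSERT INTO ev VALUES (?, ?, ?, ?)",
                 [("T1", 100, 90, 95), ("T2", 200, 210, 190)])

# Schedule variance (BCWP - BCWS) computed in the database itself,
# with no intermediate export or manual reconciliation step.
rows = conn.execute(
    "SELECT task_id, bcwp - bcws AS sv FROM ev ORDER BY task_id").fetchall()
print(rows)
```

Because the query language is standard, swapping the underlying database or the consuming tool does not break the connection, which is exactly the property proprietary data swaps lack.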
Third, new user-interface technologies should forgo one-trick-pony solutions in favor of configurable solutions that fill the gap and integrate data in an automated fashion, fulfilling the goals of improved productivity and eliminating both the labor dedicated to data reconciliation and management and the one-off solutions built on Excel and PowerPoint.
Fourth, the technology used to achieve these characteristics must be repeatable and sustainable: once configured and initially proved out, the environment should run on digitized processes, with manual intervention only by exception.
Fifth, the integration of data from disparate sources should process that data so it is usable as information, leveraging integration to provide new parametric techniques and leading indicators based on the insight offered by data properly associated at the optimum logical level.
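The standard earned value indices are one example of the leading indicators such integration enables once cost and schedule data live together. The formulas are the conventional ones (CPI = BCWP/ACWP, SPI = BCWP/BCWS); the record shape is my own assumption.

```python
# Standard earned value leading indicators computed from an integrated record.
# CPI < 1 signals a cost overrun; SPI < 1 signals schedule slippage.
def indices(rec):
    cpi = rec["bcwp"] / rec["acwp"]   # cost performance index
    spi = rec["bcwp"] / rec["bcws"]   # schedule performance index
    return round(cpi, 3), round(spi, 3)

print(indices({"bcws": 100.0, "bcwp": 90.0, "acwp": 95.0}))  # (0.947, 0.9)
```

The point is not the arithmetic, which is trivial, but that it can only be automated when cost and schedule data are associated at the same logical level rather than trapped in separate tools.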
The first digital wave automated what it could, constrained by the swim lanes of specialization. The next wave must break those constraints to do what automation does best, quantitative data processing, freeing people to do what they do best: qualitative assessment.
For more brilliant insights, check out Nick’s blog: Life, Project Management, and Everything