## Safety and Environmental Management Systems: Quantitative Methods for Data Analysis and Accounting for Imperfect Reporting

Christopher J. Jablonowski* (cjablonowski@mail.utexas.edu)
As part of the response to the Macondo/Horizon blowout in the Gulf of Mexico, the Bureau of Ocean Energy Management, Regulation and Enforcement (BOEMRE), part of the US Department of the Interior, has defined new rules regarding workplace safety. Oil and gas operators will now be required to develop and maintain a Safety and Environmental Management System (SEMS). A SEMS is a comprehensive management program for identifying, addressing, and managing operational safety hazards and impacts. The new rules apply to all offshore oil and gas operations in Federal waters. Many oil and gas operators have had a SEMS in place for many years, but the new rules impose the requirement on all OCS operators and provide BOEMRE officials with oversight and enforcement authority. To comply with the new rules, oil and gas operators will have to demonstrate that they have a systematic approach for managing safety and environmental hazards and impacts. In a recent press release, BOEMRE summarized the new requirements in a 13-point list (USDOI, 2010). To satisfy the new requirements, it is anticipated that oil and gas operators will define and implement formal processes for safety and environmental data collection, analysis, policy design, and implementation. These steps are typical of any continuous improvement process, as depicted in Figure 1.
This article addresses the need for systematic quantitative analysis at the “Analyze” stage. Qualitative analysis is unlikely to yield useful insights because incidents are often the result of complex and sometimes confounding interactions among different risk factors and mitigation efforts. Without a systematic quantitative approach, resources are likely to be misallocated. Quantitative models of safety incidence allow managers to connect specific elements of the SEMS to outcomes. The results provide evidence that can be used to allocate resources to those incident prevention efforts with the largest benefit-cost ratios at the “Design/Update” stage. Statistical and regression analysis are obvious analytical tools. However, conventional methods do not account for the possibility of imperfect reporting (under- and overreporting). Reliance on conventional methods alone may therefore result in misdirected policies and inefficient resource allocation, defeating the fundamental purpose of the new SEMS regulation. Complementary models are needed that more accurately represent incidence and reporting phenomena. This article provides a brief discussion of the implications of imperfect reporting, and then proposes a method to explicitly account for underreporting.
Underreporting of incidents can be intentional (evasion) or unintentional (ignorance). There is also the prospect of overreporting, that is, fraudulent reporting of incidents that did not occur. Fraud is a complex phenomenon, and there is a significant literature on the subject. While it is not treated here, the approach presented below can be extended to incorporate fraud (see, for example, Jablonowski, 2010b). Imperfect reporting distorts the observations of incident data. A simple example demonstrates the impacts of underreporting and overreporting. Consider 100 observations on safety outcomes in Table 1. The columns represent whether or not an incident occurred, while the rows represent whether or not the incident was reported. In this omniscient “truth” case, both under- and overreporting are observed. In practice, these data are not observable to the analyst. Instead, the fraudulent reports are counted as actual incidents, and the underreported incidents are counted with the actual non-incidents. Thus, the analyst observes the data as depicted in Table 2.
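The distortion can be sketched numerically. The counts below are hypothetical stand-ins for Table 1 (the article's actual figures are not reproduced here); they are chosen only to illustrate the mechanics of how the analyst's observed incident rate diverges from the true one:

```python
# Illustrative 100-observation example of how imperfect reporting
# distorts observed incident counts (hypothetical figures, not Table 1).

# Unobservable "truth" table: rows = reported / not reported,
# columns = incident / no incident.
truth = {
    ("reported", "incident"): 15,        # correctly reported incidents
    ("reported", "no_incident"): 5,      # overreporting (fraudulent reports)
    ("not_reported", "incident"): 10,    # underreporting
    ("not_reported", "no_incident"): 70,
}

total = sum(truth.values())
true_incidents = (truth[("reported", "incident")]
                  + truth[("not_reported", "incident")])
true_p = true_incidents / total  # true probability of an incident

# What the analyst actually observes (Table 2): every report is counted
# as an incident, and every non-report as a non-incident.
observed_incidents = (truth[("reported", "incident")]
                      + truth[("reported", "no_incident")])
observed_p = observed_incidents / total

print(f"true P(I) = {true_p:.2f}, observed P(I) = {observed_p:.2f}")
```

With these hypothetical counts the analyst would understate the incident probability, since the ten underreported incidents outweigh the five fraudulent reports.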
Depending on the levels of imperfect reporting, the implications can be severe. The true probability of an incident can be computed only from the unobservable data in Table 1; the observed data in Table 2 yield a different estimate. It is clear that in the presence of imperfect reporting, use of the data in Table 2 will distort any qualitative or quantitative analysis. Therefore, the challenge is to develop methods that use the observed data to reveal information about the unobserved incident and reporting phenomena. If the imperfect reporting can be modeled explicitly, then more accurate assessments can be made of the true incident phenomenon. There is an emerging literature on the subject of incomplete detection based on the seminal work of Feinstein (1989, 1990). As Feinstein predicted, his model of detection controlled estimation (DCE) can be applied in various contexts. Studies have been completed in tax compliance (Erard, 1997), environmental compliance (Brehm and Hamilton, 1996; Helland, 1998), health diagnosis (Bradford et al., 2001; Kleit and Ruiz, 2003), political science (Scholz and Wang, 2006), and safety in oil and gas drilling (Jablonowski, 2007 and 2010b).
It is assumed that incidents are reported as the result of a sequential process. First, an incident either occurs or does not occur. Second, the incident either is reported or not reported. This assumption facilitates the mathematical treatment and discussion. Let I denote an incident, NI a non-incident, R a report, and NR a non-report. Equations (1) and (2) give the unconditional probability of a reported incident and of a non-report:

P(R) = P(R|I)·P(I) + P(R|NI)·P(NI)  (1)

P(NR) = P(NR|I)·P(I) + P(NR|NI)·P(NI)  (2)

The left-hand sides of Equations (1) and (2) are available from the observed data, and the observed outcome is specified as the dependent variable.
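As a minimal numerical check of the two-stage structure in Equations (1) and (2), consider hypothetical values for the component probabilities (the values below are illustrative, not estimates from the article):

```python
# Law-of-total-probability computation behind Equations (1) and (2).
# All probabilities here are hypothetical, chosen for illustration.
p_I = 0.25           # P(I): probability an incident occurs
p_R_given_I = 0.60   # P(R|I): incident is reported
p_R_given_NI = 0.05  # P(R|NI): report despite no incident (overreporting)

p_NI = 1.0 - p_I
p_R = p_R_given_I * p_I + p_R_given_NI * p_NI                 # Equation (1)
p_NR = (1 - p_R_given_I) * p_I + (1 - p_R_given_NI) * p_NI    # Equation (2)

# A report and a non-report are the only two outcomes, so the
# two unconditional probabilities must sum to one.
assert abs(p_R + p_NR - 1.0) < 1e-12
print(f"P(R) = {p_R:.4f}, P(NR) = {p_NR:.4f}")
```

Note that the observed report rate P(R) here (0.1875) understates the true incident probability P(I) (0.25), which is exactly the distortion the DCE approach is designed to untangle.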
For i = 1…n observations and h independent variables, X_i is defined as a 1×h vector of independent variables believed to influence the probability of incidents, and β is defined as an h×1 vector of coefficients (to be estimated). Under a binary probit model, the probability that observation y_i takes on a value of 1 is given in Equation (3):

P(y_i = 1) = Φ(X_i β)  (3)

In setting up the data set for analysis, the dependent variable is recorded as a 1 when the number of reported incidents is greater than or equal to 1, and 0 otherwise. Note that Φ is the cumulative standard normal distribution. The Poisson model is also an appropriate option. The probability that observation y_i takes on the count value k is given in Equation (4):

P(y_i = k) = π(k; λ_i) = exp(−λ_i)·λ_i^k / k!  (4)

where the dependent variable is the number of reported incidents, ln(λ_i) = X_i β, X and β are the same as defined previously, and π is the Poisson probability density function.
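The two single-equation models can be sketched in a few lines. The observation and coefficient values below are hypothetical, used only to show how Equations (3) and (4) are evaluated:

```python
import math

def norm_cdf(x):
    """Standard normal CDF, the Φ in Equation (3)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def probit_prob(x_row, beta):
    """Equation (3): P(y_i = 1) = Φ(X_i β) for a single observation."""
    return norm_cdf(sum(x * b for x, b in zip(x_row, beta)))

def poisson_pmf(k, lam):
    """Equation (4): π(k; λ) = exp(−λ) λ^k / k!."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Hypothetical observation: an intercept plus two risk factors,
# with hypothetical coefficient values (in practice, estimated).
x_i = [1.0, 0.5, 2.0]
beta = [-1.0, 0.4, 0.1]

p_incident = probit_prob(x_i, beta)                       # probit, Eq. (3)
lam_i = math.exp(sum(x * b for x, b in zip(x_i, beta)))   # ln(λ_i) = X_i β
p_zero_count = poisson_pmf(0, lam_i)                      # Poisson, Eq. (4)
print(f"probit P(y=1) = {p_incident:.4f}, Poisson P(y=0) = {p_zero_count:.4f}")
```

In an actual analysis β would be obtained by maximum likelihood over all n observations rather than assumed.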
The detection controlled approach relaxes the assumption of perfect reporting by allowing P(R|I) < 1, but it is still assumed that P(R|NI) = 0. Thus, Equation (1) reduces to P(R) = P(R|I)·P(I). The objective is to model P(I) and P(R|I) jointly. In doing so, the analyst can differentiate the marginal impacts of the incidence and reporting phenomena. Z_i is defined as a 1×j vector of independent variables believed to influence the conditional probability of reporting incidents after they occur, and γ is defined as a j×1 vector of coefficients. If a binary incidence model is assumed as in Equation (3), and the probability P(R|I) is also modeled as a binary function, then the probability that observation y_i takes on a value of 1 is given in Equation (5):

P(y_i = 1) = Φ(X_i β)·Φ(Z_i γ)  (5)

Using Equation (2), the probability that y_i takes on a value of 0 reduces to the expression in Equation (6), which is recognized as the complement of P(y_i = 1):

P(y_i = 0) = 1 − Φ(X_i β)·Φ(Z_i γ)  (6)

The log-likelihood function can then be derived as given in Equation (7):

ln L = Σ_i [ y_i·ln(Φ(X_i β)·Φ(Z_i γ)) + (1 − y_i)·ln(1 − Φ(X_i β)·Φ(Z_i γ)) ]  (7)

The dependent variable is recorded as a 1 when the number of reported incidents is greater than or equal to 1, and 0 otherwise. Again, the Poisson model is an appropriate option for modeling incidence: the next model specifies the incidence phenomenon using a Poisson model, with the probability P(R|I) again modeled as a binary function. If one instead allows for the possibility of partial reporting (only some of the incidents at an observation are reported), the implications are severe. The number of conditional reporting probabilities that must be estimated grows significantly, even when reasonable simplifying assumptions are made. In addition, the number of terms on the right-hand side of the regression is, in theory, infinite. For example, to compute the probability of observing one reported incident, the analyst would have to consider all potential values of incidence. The analyst could constrain this number to limit the scope of the computation, but the selection of the cutoff point would be arbitrary.
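A minimal sketch of the probit DCE log-likelihood follows. The data set, coefficient values, and function names are hypothetical; the point is only to show how the product Φ(X_i β)·Φ(Z_i γ) enters each observation's likelihood contribution:

```python
import math

def norm_cdf(x):
    """Standard normal CDF, Φ."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def dce_log_likelihood(y, X, Z, beta, gamma):
    """Log-likelihood of the probit detection-controlled model:
    P(y_i=1) = Φ(X_i β)·Φ(Z_i γ), and P(y_i=0) is its complement."""
    ll = 0.0
    for y_i, x_i, z_i in zip(y, X, Z):
        p = (norm_cdf(sum(a * b for a, b in zip(x_i, beta)))
             * norm_cdf(sum(a * g for a, g in zip(z_i, gamma))))
        ll += math.log(p) if y_i == 1 else math.log(1.0 - p)
    return ll

# Tiny hypothetical data set: X drives incidence, Z drives reporting;
# each row begins with an intercept term.
y = [1, 0, 0, 1]
X = [[1.0, 2.0], [1.0, 0.1], [1.0, 0.3], [1.0, 1.5]]
Z = [[1.0, 0.8], [1.0, 0.2], [1.0, 0.9], [1.0, 0.7]]
print(dce_log_likelihood(y, X, Z, beta=[-0.5, 0.6], gamma=[0.2, 0.5]))
```

In practice β and γ are obtained by maximizing this function numerically (for example with a quasi-Newton routine); identification of the two sets of coefficients rests on X and Z containing at least some distinct variables.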
When the Poisson model of Equation (4) is used to model incidence, with reporting again treated as all-or-nothing, the probability of a reported (nonzero) observation is given in Equation (8), and the probability of a zero observation can then be derived as shown in Equation (9):

P(y_i = 1) = [1 − π(0; λ_i)]·Φ(Z_i γ)  (8)

P(y_i = 0) = 1 − [1 − π(0; λ_i)]·Φ(Z_i γ)  (9)

where π(0; λ_i) = exp(−λ_i) is the probability that no incident occurs. The log-likelihood sums the contributions of the reported and the zero observations, as given in Equation (10):

ln L = Σ_{y_i=1} ln([1 − π(0; λ_i)]·Φ(Z_i γ)) + Σ_{y_i=0} ln(1 − [1 − π(0; λ_i)]·Φ(Z_i γ))  (10)
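The Poisson incidence model with binary, all-or-nothing reporting can be sketched in the same style. This is an illustrative implementation under that assumption, not the article's exact specification; all names and data are hypothetical:

```python
import math

def norm_cdf(x):
    """Standard normal CDF, Φ."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def poisson_dce_log_likelihood(y, X, Z, beta, gamma):
    """Sketch: Poisson incidence with all-or-nothing reporting.
    P(y_i=1) = [1 − exp(−λ_i)]·Φ(Z_i γ), with ln(λ_i) = X_i β;
    y_i is coded 1 when at least one incident was reported."""
    ll = 0.0
    for y_i, x_i, z_i in zip(y, X, Z):
        lam = math.exp(sum(a * b for a, b in zip(x_i, beta)))
        p = ((1.0 - math.exp(-lam))
             * norm_cdf(sum(a * g for a, g in zip(z_i, gamma))))
        ll += math.log(p) if y_i >= 1 else math.log(1.0 - p)
    return ll

# Hypothetical usage with a two-observation data set.
y = [1, 0]
X = [[1.0, 1.2], [1.0, 0.2]]
Z = [[1.0, 0.6], [1.0, 0.4]]
print(poisson_dce_log_likelihood(y, X, Z, beta=[-0.3, 0.5], gamma=[0.1, 0.4]))
```

As with the probit variant, the coefficients would be estimated by numerical maximization of this function.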
New rules imposed by BOEMRE require oil and gas operators to develop and maintain a SEMS to identify and manage operational safety hazards and impacts. To comply, operators will have to demonstrate that they have a systematic approach for managing safety and environmental hazards and impacts. It is anticipated that oil and gas operators will implement formal processes for safety and environmental data collection, analysis, policy design, and implementation. Quantitative models of safety incidence facilitate systematic analysis and continuous improvement. They enable safety managers to connect specific policies to safety and environmental outcomes. The results provide evidence that can be used to efficiently allocate resources. However, conventional methods of statistical and regression analysis of safety incidents do not account for the fact that some incidents are not reported. By relying on conventional methods, it is possible that resources will be misallocated, companies will miss opportunities for improvement, and the new regulation for SEMS will not deliver the desired benefits. Models are needed that more accurately represent safety and environmental incidence and reporting phenomena. This article describes one approach for such an analysis. The method provides insights that are not available from conventional approaches. However, the models of perfect and imperfect reporting should be used in a complementary manner, because results from one model can often be used to explain results in the other.
## References

Bradford, W.D., Kleit, A.N., Krousel-Wood, M.A., Re, R.N. 2001. Testing Efficacy with Detection Controlled Estimation: An Application to Telemedicine.

Brehm, J., Hamilton, J.T. 1996. Noncompliance in Environmental Reporting: Are Violators Ignorant, or Evasive, of the Law?

Chunlin, H., Chengyu, F. 1999. Evaluating Effects of Culture and Language on Safety.

Conchie, S., Donald, I. 2006. The Role of Distrust in Offshore Safety Performance.

Erard, B. 1997. Self-selection with Measurement Errors: A Microeconometric Analysis of the Decision to Seek Tax Assistance and Its Implications for Tax Compliance.

Feinstein, J. 1989. The Safety Regulation of U.S. Nuclear Power Plants: Violations, Inspections, and Abnormal Occurrences.

Feinstein, J. 1990. Detection Controlled Estimation.

Fleming, M., Flin, R., Mearns, K., Gordon, R. 1996. The Offshore Supervisor’s Role in Safety Management: Law Enforcer or Risk Manager. Paper SPE 35906 presented at the Third International Conference on Health, Safety, and Environment in Oil and Gas Exploration and Production, New Orleans, LA, USA, 9-12 June.

Helland, E. 1998. The Enforcement of Pollution Control Laws: Inspections, Violations, and Self-Reporting.

Iledare, O., Pulsipher, A., Dismukes, D., Mesyanzhinov, D. 1997. Oil Spills, Workplace Safety and Firm Size: Evidence from the U.S. Gulf of Mexico OCS.

Jablonowski, C. 2007. Employing Detection Controlled Models in Health and Environmental Risk Assessment: A Case in Offshore Oil Drilling.

Jablonowski, C. 2010a. Using Regression Analysis to Relate Safety and Environmental Outcomes to Incidence Factors. Paper SPE 133018 presented at the SPE Trinidad and Tobago Energy Resources Conference, Port of Spain, Trinidad, 27-30 June.

Jablonowski, C. 2010b. Statistical Analysis of Safety Incidents and the Implications of Imperfect Reporting. Paper SPE 134612 presented at the SPE Annual Technical Conference and Exhibition, Florence, Italy, 19-22 September.

Kleit, A.N., Ruiz, J.F. 2003. False Positive Mammograms and Detection Controlled Estimation.

Malallah, S. 2009. Leadership Influence in Safety Change Effort. Paper IPTC 13816 presented at the International Petroleum Technology Conference, Doha, Qatar, 7-9 December.

Mearns, K., Whitaker, S., Flin, R. 2001.

Scholz, J.T., Wang, C.L. 2006. Cooptation or Transformation? Local Policy Networks and Federal Regulatory Enforcement.

Shultz, J. 1999. The Risk of Accidents and Spills at Offshore Production Platforms: A Statistical Analysis of Risk Factors and the Development of Predictive Models. Doctoral Dissertation, Carnegie Mellon University.

Schultz, J., Fischbeck, P. 1999. Predicting Risks Associated with Offshore Production Facilities: Neural Network, Statistical, and Expert Opinion Models. Paper SPE 52677 presented at the SPE/EPA Exploration and Production Environmental Conference, Austin, TX, USA, 28 February-3 March.

USDOI (United States Department of the Interior). 2010. Fact Sheet, The Workplace Safety Rule On Safety and Environmental Management Systems (SEMS). Accessed online on November 15 at http://www.doi.gov/news/pressreleases/.

Winter, J., Owen, K., Read, B., Ritchie, R. 2010. How Effective Leadership Practices Deliver Safety Performance and Operational Excellence. Paper SPE 129035 presented at the SPE Oil and Gas India Conference and Exhibition, Mumbai, India, 20-22 January.
* Assistant Professor, Department of Petroleum and Geosystems Engineering, The University of Texas at Austin.