Principal Component Analysis as a Method for Error Covariance Matrix Inflation
An oil and gas discovery goes through a multi-stage process to increase understanding of the asset at hand. As understanding of the field grows, the production plan gains credibility and the uncertainty associated with it decreases. Each measurement is treated as an indication of the reservoir properties and is used to update the prior proposed reservoir models; this process of model update and calibration is called history matching in the oil and gas industry. Using several models captures the uncertainty in our understanding of the reservoir and reduces the uncertainty in future predictions, making the field development plan and facilities design more reliable. The power of ensemble-based modelling lies in its ability to represent various points in the possibility space; if the ensemble contains identical (or near-identical) models, it loses this power and adds unnecessary computational cost. Moreover, each measurement point used to assimilate the models decreases the standard deviation of the models, so using redundant data eventually leads to model collapse. During the project, the root of this redundancy was studied, and methods of eliminating or reducing it, together with their effect on the collapse of the ensemble of models (filter divergence), are discussed. A few methods have been used to prevent filter divergence. In this thesis we discuss the use of Principal Component Analysis (PCA) on the innovation to account for the dependency associated with the measurements. Moreover, the workflow for applying PCA and its effect on the ensemble spread is presented.
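The idea of applying PCA to the innovation (the mismatch between measurements and ensemble predictions) can be sketched as follows. This is a minimal illustration, not the thesis's actual workflow: the synthetic data, the 99% variance threshold, and all variable names are assumptions made for the example. The point is that when measurements are redundant, the innovation covariance is effectively low-rank, and projecting onto the leading principal components reduces the number of independent data points used in assimilation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n_obs correlated (partly redundant) measurement
# residuals across n_ens ensemble members. Only 5 independent signals
# underlie the 20 observations, so the data are highly redundant.
n_obs, n_ens = 20, 50
signals = rng.standard_normal((5, n_ens))
mixing = rng.standard_normal((n_obs, 5))
innovations = mixing @ signals          # sample covariance has rank <= 5

# PCA via eigendecomposition of the innovation sample covariance.
cov = np.cov(innovations)               # rows of `innovations` = variables
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]       # sort components by variance, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Retain the components explaining (say) 99% of the total variance.
explained = np.cumsum(eigvals) / eigvals.sum()
k = int(np.searchsorted(explained, 0.99)) + 1

# Project the innovations onto the retained principal directions:
# the effective number of independent measurements drops from n_obs to k,
# which counteracts the excessive variance reduction that drives ensemble
# collapse (filter divergence).
reduced = eigvecs[:, :k].T @ innovations
print(f"measurements: {n_obs}, retained components: {k}")
```

In an actual ensemble-based history-matching update, the reduced innovations (and correspondingly projected observation-error covariance) would replace the raw, redundant measurements in the Kalman-type update step.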
Master's thesis in Petroleum Geosciences Engineering