Quantitative analysis of experimental rock deformation data
It is very important to have tools that analyze laboratory rock experiments in a probabilistic fashion (e.g., Bayesian inference). The idea is to test candidate constitutive relationships and to obtain not only the optimal parameter values, but also their uncertainties and, perhaps even more importantly, their correlations. Indeed, if the constitutive model is correct and does not require correlation between its parameters, finding one in the results is indicative of an experimental set-up that is not appropriate (maybe the T and P ranges are too narrow, maybe the experiments are too alike, ...). For a given model, one can make such a diagnosis in advance, i.e., a data analysis tool can be used to help plan lab experiments.
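As a minimal sketch of this correlation diagnostic, consider a hypothetical Arrhenius-type power-law flow law, strain rate = A σ^n exp(−Q/RT), which is linear in (ln A, n, Q) after taking logarithms. The parameter values, ranges, and noise level below are purely illustrative, not taken from any real experiment; the point is only that a narrow temperature range makes ln A and Q nearly indistinguishable, which shows up as a correlation close to ±1 in the (Gaussian, flat-prior) posterior covariance:

```python
import numpy as np

rng = np.random.default_rng(0)
R = 8.314  # gas constant, J/(mol K)

# Hypothetical "true" flow-law parameters (illustrative values only):
#   strain_rate = A * sigma**n * exp(-Q / (R*T))
lnA_true, n_true, Q_true = -10.0, 3.0, 250e3

def design_matrix(sigma, T):
    """Linearised model: ln(rate) = lnA + n*ln(sigma) - Q/(R*T)."""
    return np.column_stack([np.ones_like(sigma), np.log(sigma), -1.0 / (R * T)])

def fit(sigma, T, noise=0.1):
    """Least-squares fit; returns estimates and parameter correlation matrix."""
    X = design_matrix(sigma, T)
    y = X @ np.array([lnA_true, n_true, Q_true]) + rng.normal(0, noise, sigma.size)
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ theta
    s2 = resid @ resid / (y.size - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)   # Gaussian posterior covariance (flat prior)
    d = np.sqrt(np.diag(cov))
    return theta, cov / np.outer(d, d)  # correlation matrix

sigma = rng.uniform(10.0, 100.0, 40)       # stress (arbitrary units)
T_narrow = rng.uniform(950.0, 1000.0, 40)  # narrow temperature range, K
T_wide = rng.uniform(700.0, 1300.0, 40)    # wide temperature range, K

_, corr_narrow = fit(sigma, T_narrow)
_, corr_wide = fit(sigma, T_wide)
print("corr(lnA, Q), narrow T range:", corr_narrow[0, 2])
print("corr(lnA, Q), wide T range:  ", corr_wide[0, 2])
```

Since the correlation matrix depends only on the design matrix, not on the observed data, exactly this kind of computation can be run before any experiment to check whether a planned set of (σ, T) conditions will actually constrain the parameters independently.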
The constitutive models that we test may be very generic for a family of mechanisms (for example, fluid-assisted slow deformation at high temperature), or may be specific to a given mechanism. Working on synthetic data generated by complex numerical models of the processes can help identify regime changes, test the inversion scheme, or highlight difficulties that will arise in lab experiments. One can also study how competing processes contribute to the deformation of the sample at different time scales (through model selection or model combination methods).
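One common way to carry out such model selection is an information criterion; the sketch below uses the Bayesian information criterion (BIC) on synthetic data, with all mechanisms, parameter values, and noise levels invented for illustration. Synthetic strain rates are generated from a power-law mechanism (n = 3), then a Newtonian model (n fixed at 1) and a power-law model (n free) are compared:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical synthetic experiment: strain rates from a power-law creep
# mechanism (n = 3) observed with log-normal noise. Values are illustrative.
sigma = rng.uniform(10.0, 100.0, 30)  # stress (arbitrary units)
log_rate = np.log(1e-6) + 3.0 * np.log(sigma) + rng.normal(0, 0.2, sigma.size)

def bic(rss, n_obs, k):
    """Gaussian BIC (up to a model-independent constant) from the residual
    sum of squares, number of observations, and number of free parameters."""
    return n_obs * np.log(rss / n_obs) + k * np.log(n_obs)

# Model 1: Newtonian creep, ln(rate) = lnA + ln(sigma); only lnA is free.
lnA1 = np.mean(log_rate - np.log(sigma))
rss1 = np.sum((log_rate - lnA1 - np.log(sigma)) ** 2)

# Model 2: power-law creep, ln(rate) = lnA + n*ln(sigma); lnA and n free.
X = np.column_stack([np.ones_like(sigma), np.log(sigma)])
theta, *_ = np.linalg.lstsq(X, log_rate, rcond=None)
rss2 = np.sum((log_rate - X @ theta) ** 2)

bic_newton = bic(rss1, log_rate.size, 1)
bic_power = bic(rss2, log_rate.size, 2)
print("BIC Newtonian:", bic_newton)
print("BIC power-law:", bic_power)  # lower BIC = preferred model
```

The same template extends to richer candidate mechanisms (extra columns in the design matrix), and BIC differences can be turned into approximate posterior model weights when one prefers model combination over hard selection.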