Fast and reliable field-portable chemical analysis of produced waters remains one of the main challenges of scale-risk mitigation, as it enables timely control over scale inhibitor type and dosage. While many analytical methods are potentially applicable to produced waters, most lack reliability under field conditions or fail to meet increasingly tight cost requirements. X-ray fluorescence (XRF) spectrometry is routinely used in the oil field for analysis of cores, drill cuttings, and muds, as it provides quick, noninvasive detection of many elements simultaneously. Inexpensive portable handheld analyzers typically offer a limited element range, sensitivity, and resolution, and are therefore expected to be inferior to benchtop instruments. The question is whether, despite these limitations, handheld devices can deliver detection limits and accuracy suitable for produced water analysis. Moreover, one of the major challenges of XRF, the so-called matrix effect, is pronounced in produced waters and further degrades analysis accuracy.
This study demonstrates how multivariate machine-learning (ML) techniques can be applied to the full XRF spectra recorded with a handheld analyzer. ML-based spectral processing is shown to successfully mitigate matrix effects and enable simultaneous quantification of all ions of interest. Interestingly, key physical (density) and chemical (total dissolved solids and hardness) properties of produced water can also be quantified using ML techniques. In the paper, the experimental protocols are described first, followed by a detailed discussion of the data workflows, covering XRF spectra preprocessing, algorithm selection and tuning, and independent validation procedures. Over 50 different ML algorithms are trained on different spectral ranges of a multicomponent calibration dataset, and the three best models are applied to several real-world produced water sample sets for validation. A rigorous error analysis is performed for all ML models. In field samples, the resulting analysis errors (RMSE) are less than 100 mg/L for barium and strontium, less than 150 mg/L for sulfate, and remarkably small for the other ions and properties, given that the measurements were made with a handheld device.