Tremendous progress has been made in recent years towards developing computational tools that allow ready solution of most common rock engineering problems, with amongst the most advanced of the tools focusing on evaluating engineered openings within Synthetic Rock Masses. These new codes have the potential to allow much more realistic design, but they must not be used without careful calibration prior to, during, and after completion of any modelling. This paper explores some of the pitfalls of inadequate data collection and suggests that more emphasis be given to calibration. Guidelines for minimum levels of parameter data are suggested for improving application of basic to sophisticated modelling codes.
Tremendous progress has been made in recent years towards developing computational tools that allow ready solution of most common rock engineering problems. The degree of sophistication now possible with some of the available codes allows one to tackle extremely complex problems that were not thought tractable even a decade ago. Indeed, generation of synthetic numerical rock masses, based on discrete fracture network representations that closely mimic actual rock mass conditions, such as are shown in Figure 1, is today relatively straightforward, and far less challenging for the user than employing state-of-the-art codes of even a few decades ago. This is largely because major improvements in computer graphics and in the availability of ready-made, user-friendly input-output interfaces have paralleled development of these new codes, in some cases now outpacing actual new analytical/procedural developments. This is something of a reversal from the first few decades of DOS-based rock engineering computational experience, when only rudimentary graphics were available and user-friendly input-output interfaces were virtually non-existent.
Nevertheless, even in the 1980s, with the proliferation of emerging early geotechnical computational tools, eminent engineers worried that too much reliance would be placed on the then-new program codes to the exclusion of practical observation and sensible site investigation. Because today it is even easier to use and abuse the latest state-of-the-art, new, and even more powerful (and in many ways deceptively believable) computer tools, it is relevant to revisit the words of Peck (1985), who commented: "The numerical calculation subculture, in its enthusiasm over its many early successes, has fostered the opinion that the observational or learn-as-you-go procedure is now outdated. Proponents of this view feel that predictions for even the most complicated problems can be made reliably, and that design can be based confidently on these predictions. This view might be tenable were it not for the uncertainties associated with the constitutive relations and with the applicability of these relations to real geological conditions including the often undiscovered "minor geologic details" at specific sites. However, although much better predictions can often be made than heretofore, the complexity of the problems to which finite-element solutions are applied enhances the likelihood of departure from reality".