
Integrating Advances in Ground-Motion and Seismic-Hazard Analysis

Gail M. Atkinson
University of Western Ontario, London, Canada N6A 5B7

SUMMARY: Emerging trends in engineering seismology have the potential to integrate advances in several areas of ground-motion analysis and seismic hazard evaluation. Improved integration of the knowledge gained from ground-motion recordings and simulations may in the near future make the use of ground-motion prediction equations (GMPEs) an unnecessary intermediate step in PSHA. Instead, we may use the Monte Carlo approach to draw ground-motion time histories from a compiled catalogue of applicable records. A major advantage of eliminating GMPEs in PSHA is that it would automatically streamline the process of obtaining time histories for engineering analysis and risk studies. The major challenge in implementing such an approach is that suitable ground-motion catalogues (or simulation algorithms) must be developed to adequately sample expected motions in the required magnitude-distance-variability space. Advances in simulation technology, and continued growth in recorded-motion databases, are tools that can be exploited in the development of such catalogues.

Keywords: engineering seismology, ground motion prediction equations, Monte Carlo seismic hazard analysis

1. INTRODUCTION

Emerging trends in engineering seismology have the potential to integrate advances in several areas of ground-motion analysis and seismic hazard evaluation. The way in which integration can be achieved, and an appreciation of its potential, can be effectively visualized in the context of Probabilistic Seismic Hazard Analysis (PSHA; see Reiter, 1990; McGuire, 2004). The Monte Carlo approach to PSHA (Musson, 1999, 2000; Hong and Goda, 2006) is particularly powerful in this regard. In this paper, I first summarize the Monte Carlo approach to PSHA to provide a framework with which to consider the implications of recent developments in engineering seismology. Then, I discuss recent improvements in ground-motion prediction equations (GMPEs) and ground-motion simulation. GMPEs, expressing expected median motions and their variability as a function of magnitude, distance and other variables, are important in earthquake engineering because they are typically the most critical input to a PSHA, in terms of controlling the results. PSHA results are currently expressed as a Uniform Hazard Spectrum (UHS), in which the expected elastic response spectral amplitude (most often the 5%-damped pseudo-acceleration, PSA) is plotted versus vibration frequency for a specified annual probability of exceedance. However, the desired input for many structural analyses is a time history of acceleration, not a UHS. The UHS must be de-aggregated to determine the magnitude-distance combinations that contribute most strongly to hazard (e.g. McGuire, 1995, 2004). The de-aggregation results are then used to select time histories that are modified or scaled to reproduce the UHS, or a more problem-specific UHS variant such as the Conditional Mean Spectrum (CMS) or Conditional Spectrum (CS) (e.g. Baker, 2011; Jayaram et al., 2011; NIST, 2012). Increasingly, simulated time histories are being used in addition to recorded time histories in such procedures.
The use of simulated time histories allows for improved representation of the kinds of motion that may be expected at a site but may not be well represented within traditional time-history databases. Examples of the types of time histories that are sparse in current datasets are: (i) strong motions in mid-continental regions, where enriched high-frequency content is expected; or (ii) near-fault motions for very large earthquakes, such as a repeat of the 1857 or 1906 ruptures of the San Andreas fault. The selection and scaling of time histories (either recorded or simulated) via the UHS, CMS or CS is a common current practice that is well developed and workable, but rather cumbersome and problematic. A recent major report by NIST (2012) provides a thorough description of best practice in time-history selection and scaling. In this paper, I describe how GMPEs may soon be replaced within the hazard-evaluation framework by more direct formulations of ground-motion information, including direct use of catalogues and/or simulations. This formulation is facilitated by a Monte Carlo approach to PSHA. The direct implementation of time-history selection in PSHA would avoid the need to rely on selection and scaling procedures that invoke the UHS or its variants, and may provide a more transparent assessment of the expected ground motions. I discuss the advantages and challenges facing this anticipated evolution of hazard assessment, as we strive to make better use of our growing knowledge of ground-motion processes.

2. OVERVIEW OF THE MONTE CARLO APPROACH TO PSHA

Modern PSHA as implemented in good engineering practice throughout the world is predominantly based on the Cornell-McGuire method (Cornell, 1968; McGuire, 1976, 1977; Reiter, 1990). In the Cornell-McGuire method, the spatial distribution of earthquakes is described by seismic source zones, which may be either areas or faults. The source zones are defined based on seismotectonic information. An active fault is defined as a line source; geologic information may be used, in addition to historical seismicity, to constrain the sizes of events and their rates of occurrence on the fault. Areas of diffuse seismicity, where earthquakes occur on a poorly understood network of buried faults, are represented as areal source zones (e.g. polygons in map view); historical seismicity is used to establish the rates of occurrence for earthquakes of different magnitudes within the areal source zones. The exponential relation of Gutenberg and Richter (Richter, 1958), asymptotic to an upper-bound magnitude (Mx), is used to describe the magnitude recurrence statistics in most cases, although for specific faults or sources a characteristic earthquake model (Schwartz and Coppersmith, 1984) may be used. The upper magnitude bound for the recurrence relations, Mx, is a limit for integration in the hazard analysis, and represents the magnitude above which the probability of occurrence is zero. Mx values may be defined from geological information in the case of well-understood active faults. For areal source zones, Mx is usually based on the largest observed magnitudes in similar tectonic regions worldwide (though in some regions statistical approaches are attempted to define Mx).
The rationale for using global analogues to define a conservative Mx is that the historical time period is too short to establish Mx empirically for any particular source zone; by using a global seismicity database for similar regions, we essentially substitute space for time in extending the seismicity database. Thus an Mx for unrifted mid-plate regions would be about M7, while Mx for rifted mid-plate regions such as the St. Lawrence Valley would be about M7.5, or even slightly larger (Johnston et al., 1994). The spatial distribution of earthquakes within each source is often assumed to be random (i.e. uniformly distributed), or may be modeled by a smoothing algorithm over events that occurred within the zone. Ground-motion prediction equations (GMPEs) provide the link between earthquake occurrence within a zone and ground shaking at a site. GMPEs are equations specifying the median amplitude of a ground motion parameter, such as peak ground acceleration or response spectra, as a function of earthquake magnitude, distance and site condition; these relations also specify the distribution of ground motion amplitudes about the median value (i.e., variability). To compute the probability of exceeding a specified ground motion amplitude at a site, hazard contributions are integrated over all magnitudes and distances, for all source zones, according to the total probability theorem (in practice, sensible limits are placed on the integration range for computational efficiency). Thus the mean annual rate of exceeding a specific shaking level, x, at a site is:
λ(x) = Σ_i ν_i ∫∫ P(X > x | m, R) f_i(m) f_i(R) dm dR

where ν_i is the mean annual rate of earthquakes in the i-th source, m is earthquake magnitude, R is the distance to the site, f_i(·) represents a probability density function, and P(·) stands for the probability of the argument (Reiter, 1990). Calculations are performed for a number of ground-motion amplitudes, and interpolation is used to find the ground motions associated with the chosen probability levels. The basic procedures are described by Reiter (1990) and McGuire (2004). Because of its ability to incorporate both seismicity and geologic information, the Cornell-McGuire method quickly became widely used in applications throughout the world. The Cornell-McGuire method of PSHA traditionally has been implemented in computer programs that numerically perform the hazard integral. The program of McGuire (1976) was the first widely used such program, and has been followed by many subsequent refined versions, such as the popular industry-based programs EZFrisk and Frisk88 (available through Lettis, Inc.), and public-domain packages such as OpenSHA (www.openSHA.org). Recently, there has been a trend towards solving the PSHA problem via Monte Carlo simulation rather than integration (Musson, 1999, 2000; Hong and Goda, 2006; Assatourians and Atkinson, 2012). The solution provided is the same as that from a numerical integration (both are equally correct), but as we will see there are some advantages to the Monte Carlo approach in terms of flexibility and transparency. In a Monte Carlo PSHA, a long simulated catalogue of earthquake occurrence is generated. We first characterize the spatio-temporal features of seismic sources, which may incorporate new advances in understanding of their clustering properties in both space and time, and long-term information on occurrence from the fields of paleoseismology and GPS geodesy.
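The hazard integral described above can be sketched numerically for a single areal source. The recurrence parameters, distance distribution and GMPE below are illustrative placeholders (not values from any published model), assuming lognormal ground-motion variability:

```python
import math

# Illustrative single areal source: rate of M >= 5 events, b-value, magnitude bounds
NU, B, M_MIN, M_MAX = 0.05, 1.0, 5.0, 7.5
BETA = B * math.log(10.0)

def f_m(m):
    """Truncated exponential (Gutenberg-Richter) magnitude density."""
    return BETA * math.exp(-BETA * (m - M_MIN)) / (1.0 - math.exp(-BETA * (M_MAX - M_MIN)))

def gmpe_median_ln(m, r):
    """Hypothetical GMPE: ln PGA (g) as a linear function of M and ln R."""
    return -4.0 + 1.0 * m - 1.3 * math.log(r)

SIGMA = 0.65  # assumed lognormal standard deviation (ln units)

def p_exceed(x, m, r):
    """P(X > x | m, R) for lognormally distributed ground motion."""
    z = (math.log(x) - gmpe_median_ln(m, r)) / SIGMA
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_rate(x, r_min=10.0, r_max=100.0, n=200):
    """lambda(x) = nu * double integral of P(X>x|m,R) f(m) f(R) dm dR,
    evaluated with midpoint sums; distance assumed uniform on [r_min, r_max]."""
    dm = (M_MAX - M_MIN) / n
    dr = (r_max - r_min) / n
    total = 0.0
    for i in range(n):
        m = M_MIN + (i + 0.5) * dm
        for j in range(n):
            r = r_min + (j + 0.5) * dr
            total += p_exceed(x, m, r) * f_m(m) * (1.0 / (r_max - r_min)) * dm * dr
    return NU * total

# Hazard-curve ordinates at three shaking levels (g); rates decrease with level
rates = [annual_rate(x) for x in (0.01, 0.05, 0.2)]
```

In practice the integral would be summed over all sources i, with sensible integration limits; this sketch shows only the inner double integral for one source.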
This information is used to specify the probability distributions of event occurrence in time and space; a simple example would be an areal source zone in which seismicity is randomly distributed according to the Gutenberg-Richter magnitude-recurrence relation (Richter, 1958), with specified values of rate, slope (b-value) and Mx. Monte Carlo simulation is used to generate the events of a very long catalogue (say, 1,000,000 years) that contains the time, magnitude and location of the events. (Other parameters could also be included in the simulation, such as event stress drop, mechanism, etc.) Figure 1 provides an example of a subcatalogue generated by the Monte Carlo approach for a time period of 2475 years in southeastern Canada. Multiple realizations of such subcatalogues would be used in a PSHA, to comprise a total catalogue of very long duration. In the next step of a Monte Carlo PSHA, we use the simulated catalogue to obtain a distribution of ground motions received at a site of interest. This can be done using GMPEs, as has been the traditional practice. The use of GMPEs in a PSHA (whether by numerical integration or Monte Carlo draw) results in the construction of a UHS that expresses the frequency-dependent expected motions at a site as a function of probability (and their uncertainty). In a Monte Carlo PSHA, the UHS is obtained by associating every earthquake in the catalogue with a ground motion received at the site. This is achieved by using a prescribed GMPE, or by drawing one from a suite of alternatives, and then applying a random variability (sigma) that models the probability distribution of ground motion about the expected median value. The ground motions received at the site may be used to define the mean hazard curve, and various fractiles (e.g. see Assatourians and Atkinson, 2012).
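The catalogue-generation step can be sketched as follows, for a single source with Poisson occurrence in time and truncated Gutenberg-Richter magnitudes (locations are omitted, and the rate, b-value and Mx are illustrative placeholders):

```python
import math
import random

def simulate_catalogue(years, rate, b, m_min, m_max, seed=1):
    """Generate a synthetic earthquake catalogue: exponential inter-event times
    (Poisson occurrence) and truncated Gutenberg-Richter magnitudes drawn by
    inverse-CDF sampling."""
    rng = random.Random(seed)
    events, t = [], 0.0
    # Normalizing constant of the truncated exponential magnitude CDF
    scale = 1.0 - 10.0 ** (-b * (m_max - m_min))
    while True:
        t += rng.expovariate(rate)   # time to next event, mean 1/rate years
        if t > years:
            break
        u = rng.random()
        # Invert F(m) = (1 - 10^(-b (m - m_min))) / scale
        m = m_min - math.log10(1.0 - u * scale) / b
        events.append((t, m))
    return events

# A 1,000,000-year catalogue with illustrative parameters (~50,000 events)
catalogue = simulate_catalogue(years=1_000_000, rate=0.05, b=1.0,
                               m_min=5.0, m_max=7.5)
```

A full implementation would also draw epicentral locations (uniformly within an areal zone, or along a fault) and any other parameters carried by the catalogue, such as stress drop or mechanism.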
In the Monte Carlo approach, de-aggregation (McGuire, 1995) occurs naturally, as we can easily track which combinations of event location, GMPE and sigma produce the most significant contributions to the hazard curve. A CMS (Baker, 2011) may also be developed from the de-aggregation information, to better characterize the motions that are expected for a given vibration period, considering the inter-frequency correlation of ground motions. In the following, I describe some recent trends in characterizing ground-motion processes, both in terms of traditional empirical GMPEs and the use of simulated earthquake time histories. I explore how this new information can be implemented in practice in PSHAs. I describe how the Monte Carlo approach to PSHA offers the potential to eliminate the intermediate step of using GMPEs to define a UHS or CMS, allowing us to move towards a more direct method of selecting time histories for analysis.
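The bookkeeping behind the hazard curve and its natural de-aggregation can be sketched as follows; the event list, GMPE coefficients and target shaking level are all hypothetical placeholders:

```python
import math
import random
from collections import Counter

rng = random.Random(42)

# Hypothetical T-year catalogue reduced to (magnitude, distance_km) pairs
T_YEARS = 100_000
events = [(5.0 + 2.5 * rng.random() ** 2, 10.0 + 90.0 * rng.random())
          for _ in range(5000)]

def ln_median(m, r):
    """Placeholder GMPE (ln PGA in g), not a published model."""
    return -4.0 + 1.0 * m - 1.3 * math.log(r)

SIGMA = 0.65     # assumed aleatory variability (ln units)
TARGET = 0.1     # ground-motion level of interest (g)

# Associate each event with one sampled motion (median x lognormal variability),
# keeping track of which magnitude-distance bins produced exceedances
exceedances = []
for m, r in events:
    pga = math.exp(ln_median(m, r) + SIGMA * rng.gauss(0.0, 1.0))
    if pga > TARGET:
        exceedances.append((round(m * 2) / 2,        # 0.5-magnitude-unit bins
                            50 * int(r // 50)))      # 50-km distance bins

annual_rate = len(exceedances) / T_YEARS   # hazard-curve ordinate at TARGET
deagg = Counter(exceedances)               # de-aggregation by (M, R) bin, for free
```

Repeating this over a grid of target levels yields the full hazard curve, and `deagg` directly identifies the magnitude-distance combinations that dominate the hazard, with no separate de-aggregation step.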

Figure 1. Source zone boundaries of historical seismicity zones (blue lines; Mx 6.5) and the overlying characteristic Iapetan Rifted Margin zone (red lines; Mx 6.5 to 7.5), along with a 2475-year subcatalogue of events of M ≥ 5 generated by Monte Carlo simulation. Events outside the source zones come from a prescribed background seismicity rate. (From Assatourians and Atkinson, 2012.)

3. RECENT DEVELOPMENTS IN GROUND-MOTION KNOWLEDGE: GMPES AND SIMULATIONS

Strong-motion seismology has been an active field of development over the last decade from two perspectives: (i) in the compilation and analysis of extensive recorded ground-motion databases to empirically model expected median motions through GMPEs; and (ii) in the use of seismological models to simulate and characterize ground motions.

3.1 Developments in GMPEs

Much of our knowledge regarding ground motions to be expected during future earthquakes comes from empirical analysis of ground-motion recordings from past events. GMPEs have played a particularly important role in this regard in PSHA, encapsulating our experience of the amplitudes and frequency content of motions as a function of earthquake magnitude, distance, site condition and other variables. It is not surprising that GMPEs have evolved rapidly over the last decade or so, as the number of available recordings has grown exponentially. A good example of this evolution, one which has had significant impact on PSHA results worldwide, has come from the NGA (Next Generation Attenuation) ground-motion project. The initial NGA-West1 project compiled a quality-controlled dataset of spectral amplitudes and metadata parameters (such as distance-to-fault metrics, site information, etc.) from shallow crustal earthquakes in active tectonic regions worldwide, which was used by five developer teams to derive alternative GMPEs (see Power et al., 2008 and references therein). A follow-on project, NGA-West2, is currently underway, using larger databases and further extending the work begun in NGA-West1. The NGA-West2 project looks at many aspects of ground motion in greater detail, including the scaling of motion over a wider magnitude range, and improved modeling of both site and directivity effects (e.g. see Campbell and Bozorgnia, 2012, this volume). Figure 2 illustrates the data available in NGA-West with which to characterize ground-motion amplitudes.

Figure 2. Distribution of NGA-West ground-motion data for empirical regressions of amplitudes of shallow earthquakes in active tectonic regions (from Campbell and Bozorgnia, 2012).

Despite the advances in GMPEs made possible by quality datasets, a number of limitations remain in both the development and application of GMPEs. A significant issue is that empirical GMPEs depend on combining data from different events, and usually different regions (e.g. as in NGA-West), so as to have sufficient data to derive robust empirical relationships. This necessarily obscures many of the ground-motion processes that we seek to characterize. For example, it is known that attenuation rates differ regionally. Even within a relatively limited area, such as southern California versus northern California, attenuation rates may differ (Atkinson and Morrison, 2009). The wider the net cast in gathering data, the more pronounced the regional differences and potential biases that are implicitly included. Furthermore, it is common practice to include a site variable in GMPEs, with Vs30 (shear-wave velocity over the top 30 m) being a popular descriptive index variable for site condition. However, regional differences in typical soil profiles render simple indices like Vs30 inadequate to characterize the site. For example, it has been shown that soil sites in Japan systematically amplify high-frequency motions relative to those in western North America, for sites of the same Vs30; the opposite trend applies at low frequencies (Atkinson and Casey, 2003). These differences can be explained in terms of the different depths of the deposits (Ghofrani et al., 2012). In areas such as Japan, or many parts of ENA, where soft soil deposits overlie a much firmer foundation, the peak amplification at the site's natural period may be very large, a factor of 5 or more (e.g. Motazedian and Hunter, 2008; Pal and Atkinson, 2012; Ghofrani et al., 2012); this type of amplification is not readily amenable to description through the use of Vs30, and raises questions concerning the appropriate way to incorporate site information into GMPEs. Due to site, attenuation and source issues, there are significant questions as to the extent to which generic GMPEs are transportable from one region to another, and from one type of event to another. A related consideration in using GMPEs in PSHA is the extent to which the use of alternative GMPEs is a valid representation of the epistemic uncertainty (knowledge uncertainty) in median ground motion. This is standard PSHA practice, but is not readily justifiable on scientific grounds (Bommer and Scherbaum, 2008; Atkinson, 2011). Due to the manner in which they are derived, a suite of median GMPEs considered in a PSHA may be expected to overestimate epistemic uncertainty in some magnitude-distance ranges, while perhaps not fully accounting for it in others. Examining the

alternative NGA equations in comparison to the data they purport to represent, for example, I conclude that the inferred epistemic uncertainty in median ground-motion amplitude levels (due to data and therefore knowledge limitations) is not well modeled by simply comparing the alternative NGA equations. The GMPE predictions tend to cluster together for most magnitudes and distances, while in some cases the data suggest a wider range of alternative equations that might be realized. Furthermore, all of the equations tend to overpredict high-frequency PSA for large magnitudes, due to the common reluctance of ground-motion modelers (myself included) to permit over-saturation in their models. This arises because the data actually suggest lower motions for the largest magnitudes (e.g. near-source motions at high frequencies may be smaller for M7.5 than for M7.0), but most seismologists are not convinced that this is a true representation of ground-motion scaling, for various reasons. (The data from the largest events in the NGA dataset come from outside California - such as the M7.6 Chi-Chi, Taiwan earthquake - and are often exceeded in their high-frequency amplitudes by data from moderate-to-large California events, such as the M6.7 Northridge earthquake.) Another approach to modelling epistemic uncertainty is to use the alternative GMPEs and applicable data to guide the choice of a representative or central GMPE, and to define bounding (upper and lower) GMPEs that express uncertainty in the central GMPE (e.g. Atkinson and Adams, 2012). A disadvantage of this approach is that it has a large subjective component.

3.2 Developments in Ground-Motion Simulations

Many of the noted limitations of GMPEs can be attributed to a paucity of data. Despite the large increase in the amount of data available in the last decade, there are still insufficient data to robustly characterize the expected range of ground motions over the entire magnitude-distance range of interest.
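The central-plus-bounds representation of epistemic uncertainty can be sketched as a weighted three-branch draw in a Monte Carlo PSHA. The offsets and weights below are illustrative assumptions, not values from Atkinson and Adams (2012):

```python
import math
import random

def central_ln_median(m, r):
    """Hypothetical central GMPE (ln units), standing in for the representative model."""
    return -4.0 + 1.0 * m - 1.3 * math.log(r)

# Epistemic branches: (ln offset from the central GMPE, branch weight)
BRANCHES = [(-0.5, 0.2), (0.0, 0.6), (+0.5, 0.2)]

def draw_ln_median(m, r, rng):
    """Sample one epistemic branch per draw; weights are assumed to sum to 1."""
    u, cum = rng.random(), 0.0
    for offset, weight in BRANCHES:
        cum += weight
        if u <= cum:
            return central_ln_median(m, r) + offset
    return central_ln_median(m, r) + BRANCHES[-1][0]

rng = random.Random(0)
samples = [draw_ln_median(6.5, 20.0, rng) for _ in range(10_000)]
```

Over many draws the branch weights control how often the upper and lower bounds are realized, so the spread of the resulting hazard fractiles directly reflects the stated epistemic uncertainty rather than the accidental scatter among a suite of similarly derived GMPEs.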
Data are particularly inadequate for large-magnitude events at close distances, especially as these might vary by region or even by fault. A powerful tool to fill in the gaps in available ground motions is simulation. In this approach, a seismological model whose parameters have been calibrated against recorded data is used to simulate ground-motion records. Such models range from relatively simple stochastic simulations (e.g. Boore, 2003; Motazedian and Atkinson, 2005) to broadband simulations that include the physics of wave propagation to a greater degree (e.g. Mai and Beroza, 2003; Graves and Pitarka, 2004; Liu et al., 2006; Graves et al., 2008). It should be noted that broadband simulations actually combine deterministic ground-motion predictions (based on the specified rupture and propagation parameters) at periods longer than 1 s with stochastic predictions (having a more random character) at short periods. Stochastic modeling is used in broadband simulations because the deterministic prediction of high-frequency ground-motion amplitudes is not yet possible, due to limitations in both detailed knowledge of earth structure and computational capability. Simulations have been used in GMPE development to augment a sparse dataset in regions such as ENA (Atkinson and Boore, 2006). Increasingly, they are being used to provide time histories for use in engineering analyses, especially to explore the effects of specific expected motions. A good example is the set of simulations developed for the ShakeOut exercise in southern California (Bielak et al., 2010), coordinated by the Southern California Earthquake Center (SCEC). A large number of simulations were generated to predict the expected ground motions across southern California from a M7.8 rupture on the San Andreas fault.
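The flavour of the stochastic method can be conveyed with a toy spectral-representation sketch: cosines with random phases are summed with amplitudes shaped to a Brune omega-squared source spectrum. This is only a caricature of the Boore (2003) approach (path, site and duration shaping are omitted, and the corner frequency and band limits are arbitrary assumptions):

```python
import math
import random

def stochastic_acceleration(duration=10.0, dt=0.01, f_c=1.0, n_freqs=200, seed=7):
    """Toy stochastic ground-motion sketch: random-phase summation of cosines
    whose amplitudes follow a Brune omega-squared acceleration spectrum,
    A(f) ~ (2*pi*f)^2 / (1 + (f/f_c)^2), normalized to unit peak amplitude."""
    rng = random.Random(seed)
    freqs = [0.1 + k * (10.0 - 0.1) / n_freqs for k in range(n_freqs)]
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in freqs]
    amps = [(2.0 * math.pi * f) ** 2 / (1.0 + (f / f_c) ** 2) for f in freqs]
    peak = max(amps)
    amps = [a / peak for a in amps]   # arbitrary normalization for illustration
    n = int(duration / dt)
    return [sum(a * math.cos(2.0 * math.pi * f * (i * dt) + p)
                for a, f, p in zip(amps, freqs, phases))
            for i in range(n)]

acc = stochastic_acceleration()   # 10 s of synthetic "acceleration" at 100 samples/s
```

A real implementation would scale the spectrum physically (seismic moment, stress drop, geometric spreading, anelastic attenuation, kappa), apply an envelope of realistic duration, and, for finite faults, sum subfault contributions as in Motazedian and Atkinson (2005).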
The simulations explored expected motions considering different rupture scenarios, and incorporating the effects of wave propagation in the Los Angeles basin, as well as the overall site conditions across the region. Figure 3 shows an example of the predicted motions for one of the scenarios, at a response spectral period of 3 s. It is anticipated that such simulations will play an increasingly large role in characterizing our knowledge of ground motions for use in engineering applications. This process is being facilitated by developments at SCEC to implement a broadband simulation platform (scec.usc.edu/research/cme/groups/broadband) that will put a range of simulation tools into the hands of the broader research community in engineering seismology.

Figure 3. Example of simulated ground motions (spectral acceleration at 3s) from broadband (0-10Hz) M7.8 ShakeOut scenario on San Andreas fault (south hypocentre), with examples of 3-component time histories simulated at selected stations within the Los Angeles area. Simulations and figure courtesy of Robert Graves, 2008 (scec.usc.edu/research/cme/groups/broadband).

4. TOWARDS THE ELIMINATION OF GMPES

The specification of GMPEs, whether developed from empirical records or simulated records, has long been an integral and critical step in PSHA. However, there is no fundamental reason why we need to use GMPEs to assess hazard. They are a convenient way of expressing our knowledge of the expected ground motions in magnitude-distance space, but they have some significant drawbacks. In particular, because we use GMPEs to express the expected motions, we end up with a hazard representation in terms of a UHS, which must then be de-aggregated in order to obtain the information needed to specify time histories for use in design. If our target ground-motion specification is a set of time histories for design, GMPEs are really just an inconvenient and imperfect intermediate step towards this specification. A more direct application of our knowledge of expected ground motions can be achieved by eliminating the intermediate step of specifying GMPEs. Instead, we can draw ground-motion time histories (and their spectra) from a compiled catalogue of records that are applicable to the problem. The Monte Carlo approach to PSHA makes this conceptually very simple. The catalogue could comprise either actual recordings, or simulated records, or potentially some combination of the two. The way it works is as follows. From a simulated earthquake catalogue for the return period of interest, we can assemble the list of events that may contribute to the hazard (i.e. those that may have ground motions above some threshold on the basis of magnitude and distance, or response spectrum). Suppose that a rich and extensive catalogue of time histories exists (this time-history catalogue is discussed next). For each event in the list of simulated earthquakes that will impact the site, we select some number of records from the time-history catalogue that fit the criteria and that will implicitly sample the aleatory variability.
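The record-selection step can be sketched as follows; the miniature record catalogue, the tolerance windows (magnitude within ±0.3 units, distance within a factor of 2), and all identifiers are hypothetical placeholders for illustration:

```python
import random

# Hypothetical time-history catalogue: each entry carries magnitude, distance
# and a record identifier (a real entry would also carry the waveform itself)
RECORDS = [
    {"id": "rec_a", "m": 6.6, "r_km": 12.0},
    {"id": "rec_b", "m": 6.4, "r_km": 25.0},
    {"id": "rec_c", "m": 7.1, "r_km": 18.0},
    {"id": "rec_d", "m": 5.8, "r_km": 40.0},
    {"id": "rec_e", "m": 6.5, "r_km": 15.0},
]

def select_records(event_m, event_r, n, dm=0.3, r_factor=2.0, seed=0):
    """For one simulated catalogue event, pick up to n records within +/- dm
    magnitude units and within a factor of r_factor in distance; drawing
    several matches implicitly samples the record-to-record variability."""
    rng = random.Random(seed)
    pool = [rec for rec in RECORDS
            if abs(rec["m"] - event_m) <= dm
            and event_r / r_factor <= rec["r_km"] <= event_r * r_factor]
    rng.shuffle(pool)
    return pool[:n]

# Records matched to a simulated M6.5 event at 15 km
picks = select_records(event_m=6.5, event_r=15.0, n=2)
```

Additional filters (mechanism, directivity, site class) would simply be further predicates in the pool-building step, and the tolerance windows are exactly the kind of selection criteria that remain open to fine-tuning.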
A subset (or the full set) of these records would be the time-history set for design. Various statistics regarding the selected time histories (peak motions, response spectra, etc.) could be readily compiled. By repeating the process for an arbitrary number of simulated earthquake catalogues, both aleatory and epistemic uncertainty could be sampled as desired. The details of sampling the uncertainty, the determination of how many records are needed, how stringent the selection criteria should be, and so on, are all amenable to investigation and fine-tuning as we gain experience.

4.1. Time History Catalogue from Real Recordings

Ideally, we would have a rich library of time histories from the region of interest, with the correct site conditions (or for a reference site condition that would allow use of the records in a site-response analysis). Then we could simply draw some number of records for each magnitude-distance combination in the simulated earthquake catalogue, applying any other desired filters for factors such as mechanism, directivity and so on along the way. In practice, of course, this ideal catalogue does not yet exist. However, there is already a fairly rich set of records for many important magnitude-distance-site combinations; the online PEER catalogue (http://peer2.berkeley.edu/peer_ground_motion_database) is a good example of a compilation of such records. Thus the needed time-history catalogue can be thought of as existing, but having at present a large number of holes.

4.2. Time History Catalogue from Simulations

We can use simulations to fill in the holes in the existing catalogue of recordings, or to populate an entire time-history catalogue. The obvious advantage of using simulations is that there is no restriction on the number or combinations of magnitude, distance, site condition, etc. in such a catalogue. In fact, there is no need for a time-history catalogue at all: we can use a simulation algorithm (such as will be available on the SCEC broadband platform) to generate any number of time histories that could be received at our site, for each event in the simulated catalogue. The simulations could be very specific, to the extent allowed by the available information, considering factors such as the particular faults, expected source parameters, regional velocity and attenuation structure, particular site conditions, and so on.
Alternatively, the simulations could be rather generic, such as those developed using a stochastic approach. Both epistemic and aleatory uncertainty in these factors could be included through alternative realizations within the time history set. Thus we can envision a process in which we go directly from a simulated occurrence catalogue to a simulated set of ground-motion time histories for analysis, including epistemic and aleatory uncertainties in each process as applicable. This would seamlessly integrate the probabilistic information on the temporal and spatial distribution of seismicity across a region with the ground-motion time histories expected at a site.

5. CONCLUDING REMARKS

In this paper, I have overviewed some recent trends in engineering seismology, with a focus on: (i) the Monte Carlo approach to PSHA; and (ii) developments in ground-motion characterization. I have shown how these developments can be integrated to improve our characterization of ground motions for use in engineering analysis. Specifically, I have outlined how we can move towards eliminating the need for GMPEs, by using a Monte Carlo approach to PSHA in combination with catalogues of real and simulated time histories. A major advantage of eliminating GMPEs in PSHA is that it would automatically streamline the process of obtaining time histories for engineering analysis and risk studies. At present, these time histories must be obtained by de-aggregating the PSHA, then selecting and matching appropriate records to the UHS, CMS or CS. In my view, it is conceptually better to draw or simulate records for the events directly. This eliminates the need for de-aggregation and for time-history selection and matching to a target spectrum; these processes become implicit. The major challenge in implementing such an approach is that suitable ground-motion record catalogues (and/or suitable selection/simulation algorithms) must be developed to adequately sample expected motions in the required magnitude-distance-variability space. Advances in simulation technology, and continued growth in recorded-motion databases, are tools that can be exploited in the development of such catalogues.

ACKNOWLEDGEMENT

Funding for Atkinson's research in ground motions and seismic hazards comes from the Natural Sciences and Engineering Research Council of Canada.

REFERENCES

Assatourians, K. and G. Atkinson (2012). EqHaz: An open-source probabilistic seismic hazard code based on the Monte Carlo simulation approach. Seism. Res. Lett., submitted (see also www.seismotoolbox.ca).
Atkinson, G. (2011). An empirical perspective on uncertainty in earthquake ground motions. Can. J. Civil Eng., 38, 1-14. DOI:10.1139/l10-120.
Atkinson, G. and J. Adams (2012). Ground-motion prediction equations for use in the 2012 National Seismic Hazard Maps of Canada. Can. J. Civil Eng., submitted.
Atkinson, G. and D. Boore (2006). Ground motion prediction equations for earthquakes in eastern North America. Bull. Seism. Soc. Am., 96, 2181-2205.
Atkinson, G. and R. Casey (2003). A comparison of ground motions from the 2001 M6.8 in-slab earthquakes in Cascadia and Japan. Bull. Seism. Soc. Am., 93, 1823-1831.
Baker, J. (2011). The conditional mean spectrum: A tool for ground-motion selection. ASCE J. Struct. Eng., 137, 322-331.
Bielak, J., R. Graves, K.B. Olsen, R. Taborda, L. Ramirez-Guzman, S.M. Day, G. Ely, D. Roten, T. Jordan, P. Maechling, J. Urbanic, Y. Cui and G. Juve (2010). The ShakeOut earthquake scenario: Verification of three simulation sets. Geophysical Journal International, 180, 375-404.
Bommer, J.J. and F. Scherbaum (2008). The use and misuse of logic trees in probabilistic seismic hazard analysis. Earthquake Spectra, 24, 997-1009.
Boore, D.M. (2003). Simulation of ground motion using the stochastic method. Pure and Applied Geophysics, 160, 635-675.
Campbell, K. and Y. Bozorgnia (2012). 2012 update of the Campbell-Bozorgnia NGA ground motion prediction equations: A progress report. Proc. 15th World Conf. Earthq. Eng., this volume.
Cornell, C. (1968). Engineering seismic risk analysis. Bull. Seism. Soc. Am., 58, 1583-1606.
Ghofrani, H., G. Atkinson and K. Goda (2012). Implications of the 2011 M9.0 Tohoku Japan earthquake for the treatment of site effects in large earthquakes. Bull. Earthq. Eng., submitted.
Graves, R. and A. Pitarka (2004). Broadband time history simulation using a hybrid approach. Proc. 13th World Conference on Earthquake Engineering, Vancouver, BC, Canada, Paper No. 1098.
Graves, R.W., B.T. Aagaard, K.W. Hudnut, L.M. Star, J.P. Stewart and T.H. Jordan (2008). Broadband simulations for Mw 7.8 southern San Andreas earthquakes: Ground motion sensitivity to rupture speed. Geophys. Res. Lett., 35, L22302.
Hong, H.P. and K. Goda (2006). A comparison of seismic-hazard and risk deaggregation. Bull. Seism. Soc. Am., 96, 2021-2039.
Jayaram, N., T. Lin and J. Baker (2011). A computationally efficient ground-motion selection algorithm for matching a target response spectrum mean and variance. Earthquake Spectra, 27, 775-796.
Johnston, A., K. Coppersmith, L. Kanter and C. Cornell (1994). The earthquakes of stable continental regions. Electric Power Research Institute Report TR-102261, Volume 1, Palo Alto, CA.
Liu, P., R.J. Archuleta and S.H. Hartzell (2006). Prediction of broadband ground-motion time histories: Hybrid low/high frequency method with correlated random source parameters. Bull. Seism. Soc. Am., 96, 2118-2130.
Mai, P.M. and G.C. Beroza (2003). A hybrid method for calculating near-source broadband seismograms: Application to strong motion prediction. Phys. Earth Planet. Int., 137, 183-199.
McGuire, R. (1976). FORTRAN computer program for seismic risk analysis. U.S. Geological Survey Open-File Report 76-67.
McGuire, R. (1977). Seismic design spectra and mapping procedures using hazard analysis based directly on oscillator response. International Journal of Earthquake Engineering and Structural Dynamics, 5, 211-234.
McGuire, R. (1995). Probabilistic seismic hazard analysis and design earthquakes: Closing the loop. Bull. Seism. Soc. Am., 85, 1275-1284.
McGuire, R.K. (2004). Seismic Hazard and Risk Analysis. Vol. MNO-10, Earthquake Engineering Research Institute, Oakland, CA.
Motazedian, D. and G. Atkinson (2005). Stochastic finite-fault modeling based on a dynamic corner frequency. Bull. Seism. Soc. Am., 95, 995-1010.
Motazedian, D. and J. Hunter (2008). Development of a NEHRP map for the Orleans suburb of Ottawa, Ontario. Canadian Geotechnical Journal, 45, 1180-1188.
Musson, R.M.W. (1999). Determination of design earthquakes in seismic hazard analysis through Monte Carlo simulation. J. Earthquake Eng., 3, 463-474.
Musson, R.M.W. (2000). The use of Monte Carlo simulations for seismic hazard assessment in the U.K. Ann. Geofis., 43, 1-9.
NIST (2012). Selecting and scaling earthquake ground motions for performing response-history analyses. NIST GCR 11-917-15, NEHRP Consultants Joint Venture (a partnership of the Applied Technology Council and the Consortium of Universities for Research in Earthquake Engineering), National Institute of Standards and Technology, U.S. Dept. of Commerce, Washington, 256 pp.
Pal, J. and G. Atkinson (2012). Scenario ShakeMaps for Ottawa, Canada. Bull. Seism. Soc. Am., 102, 650-660.
Power, M., B. Chiou, N. Abrahamson, Y. Bozorgnia, T. Shantz and C. Roblee (2008). An overview of the NGA project. Earthquake Spectra, 24, 3-21.
Reiter, L. (1990). Earthquake Hazard Analysis: Issues and Insights. Columbia University Press, New York, NY.
Richter, C. (1958). Elementary Seismology. W.H. Freeman and Co., New York.
Schwartz, D. and K. Coppersmith (1984). Fault behavior and characteristic earthquakes: Examples from the Wasatch and San Andreas fault zones. Journal of Geophysical Research, 89, 5681-5698.
