  • We report four lessons from experience gained in applying the multiple-mode spatially-averaged coherency method (MMSPAC) at 25 sites in Newcastle (NSW) for the purpose of establishing shear-wave velocity profiles as part of an earthquake hazard study. The MMSPAC technique is logistically viable for use in urban and suburban areas, both on grass sports fields and parks, and on footpaths and roads. A set of seven earthquake-type recording systems and a team of three personnel is sufficient to survey three sites per day. Local noise sources, such as adjacent road traffic or service pipes, can cause a loss of low-frequency SPAC data in ways that are difficult to predict during survey design. Coherencies between individual pairs of sensors should be studied as a quality-control measure, with a view to excluding noise-affected sensors prior to interpretation; useful data can still be obtained at a site where one sensor is excluded. The combined use of both SPAC and HVSR data in inversion and interpretation is required to make effective use of low-frequency data (typically 0.5 to 2 Hz at these sites) and thus resolve shear-wave velocities in basement rock below 20 to 50 m of soft transported sediments.
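    As an illustration of the pair-wise coherency check described above (a sketch only, not the study's processing code), the following Python snippet computes magnitude-squared coherence between two ambient-noise records and compares the 0.5-2 Hz band against the broadband average; the sampling rate, record length and synthetic traces are assumptions for demonstration.

```python
# Illustrative QC sketch (not the study's code): coherency between one pair of
# ambient-noise records, used to spot a noise-affected sensor before SPAC
# interpretation.  Sampling rate and traces are synthetic assumptions.
import numpy as np
from scipy.signal import coherence

fs = 100.0                              # assumed sampling rate (Hz)
t = np.arange(0, 600, 1 / fs)           # 10 minutes of record

rng = np.random.default_rng(0)
common = rng.standard_normal(t.size)    # wavefield component seen by both sensors
sensor_a = common + 0.3 * rng.standard_normal(t.size)
sensor_b = common + 0.3 * rng.standard_normal(t.size)
sensor_b += 2.0 * np.sin(2 * np.pi * 1.5 * t)   # hypothetical local noise on one sensor

# Magnitude-squared coherence with 20 s windows (0.05 Hz resolution)
f, coh = coherence(sensor_a, sensor_b, fs=fs, nperseg=int(20 * fs))

# Compare the low-frequency band of interest against the broadband average;
# a marked drop flags a sensor pair to consider excluding.
band = (f >= 0.5) & (f <= 2.0)
print("mean coherency 0.5-2 Hz :", round(float(coh[band].mean()), 2))
print("mean coherency 0-20 Hz  :", round(float(coh[(f > 0) & (f <= 20)].mean()), 2))
```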

  • This paper describes the methods used to define earthquake source zones and calculate their recurrence parameters (a, b, Mmax). These values, along with the ground motion relations, effectively define the final hazard map. Definition of source zones is a highly subjective process, relying on seismology and geology to provide some quantitative guidance. Similarly, the determination of Mmax is often subjective. Whilst the calculation of a and b is quantitative, the assumptions inherent in the available methods need to be considered when choosing the most appropriate one. For the new map we have maximised quantitative input into the definition of zones and their parameters. The temporal and spatial Poisson statistical properties of Australia's seismicity, along with models of intra-plate seismicity based on results from neotectonic, geodetic and computer modelling studies of stable continental crust, suggest a multi-layer source zonation model is required to account for the seismicity. Accordingly we propose a three-layer model consisting of three large background seismicity zones covering 100% of the continent, 25 regional-scale source zones covering ~50% of the continent, and 44 hotspot zones covering 2% of the continent. A new algorithm was developed to calculate a and b. This algorithm was designed to minimise the problems with both the maximum likelihood method (which is sensitive to the effects of varying magnitude completeness at small magnitudes) and the least squares regression method (which is sensitive to the presence of outlier large-magnitude earthquakes). This enabled fully automated calculation of a and b parameters for all source zones. The assignment of Mmax for the zones was based on the results of a statistical analysis of neotectonic fault scarps.
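    For context, the two standard estimators whose weaknesses the new algorithm is designed to avoid are sketched below in Python (an illustration only; this is not the paper's hybrid algorithm). The Gutenberg-Richter relation log10 N(>=M) = a - bM is fitted either by the Aki-Utsu maximum-likelihood formula or by least-squares regression on cumulative counts; the synthetic catalogue and completeness magnitude are assumptions.

```python
# Illustration only (not the paper's hybrid algorithm): the two standard
# Gutenberg-Richter estimators for log10 N(>=M) = a - b*M.
import numpy as np

def b_maximum_likelihood(mags, m_min, dm=0.1):
    """Aki-Utsu maximum-likelihood b-value for events at or above completeness m_min."""
    m = np.asarray(mags)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - (m_min - dm / 2.0))

def a_b_least_squares(mags, m_min, dm=0.1):
    """Least-squares fit of log10 cumulative counts against magnitude."""
    m = np.asarray(mags)
    bins = np.arange(m_min, m.max() + dm, dm)
    n_cum = np.array([(m >= thr).sum() for thr in bins])
    keep = n_cum > 0
    slope, intercept = np.polyfit(bins[keep], np.log10(n_cum[keep]), 1)
    return intercept, -slope            # a, b

# Synthetic catalogue with a true b-value of 1.0, completeness at M 2.5 (assumed)
rng = np.random.default_rng(1)
mags = 2.5 + rng.exponential(scale=1.0 / np.log(10), size=2000)

print("maximum-likelihood b :", round(b_maximum_likelihood(mags, m_min=2.5), 2))
a_ls, b_ls = a_b_least_squares(mags, m_min=2.5)
print("least-squares a, b   :", round(a_ls, 2), round(b_ls, 2))
```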

  • The effects of ground motion models, site response and recurrence parameters (a, b, Mmax) on the uncertainty in estimating earthquake hazard have been widely discussed. There has been less discussion on the effect of the choice of source zones and the implied seismicity model. In the current Australian national seismic hazard map we have adopted a three-layer source zone model. This attempts to capture the variability of the spatial distribution of seismicity in the stable continental crust of Australia. PSHA carries an implied assumption that the spatial distribution of earthquakes within a source zone is either uniform or random, with the random distribution approaching uniformity as it becomes sufficiently dense. At almost any scale, in no area of Australia does the seismicity conform to either a random (single Poisson model) or a uniform distribution; it is more clustered. Generally, at least three Poisson models are required to match the observed spatial statistical distribution, typically zones of low, moderate and very high seismicity. Using the full (not declustered) catalogue, at least four Poisson models are required. In all cases examined there are more bins than expected with <1 and >3 earthquakes, and a deficit of bins with 1 or 2 earthquakes. This observation is consistent with emerging models of earthquakes in stable continents being a non-stationary or episodic, rather than a steady-state, process. In order to account for this observation, we use a three-layer source zone model consisting of: a Background layer, with three zones covering 100% of the continent, based on geological and geophysical properties; a Regional layer, of 25 zones covering ~50% of the continent, based on the pattern of earthquake density; and a Hotspot layer, of 44 zones covering 2% of the continent, based on areas of sustained intense seismicity. In the final hazard model the maximum of the three hazard values is used, not a weighted average of the three layers.
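    The bin-count argument above can be illustrated with a minimal sketch (hypothetical counts, not the national catalogue): compare the observed number of bins holding k earthquakes with the expectation from a single uniform Poisson model fitted to the mean rate.

```python
# Hypothetical illustration of the bin-count comparison (not the real catalogue):
# a clustered set of per-bin earthquake counts versus a single-Poisson expectation.
import numpy as np
from scipy.stats import poisson

# Counts of earthquakes in 100 equal-area bins, deliberately clustered
observed_counts = np.array([0] * 70 + [1] * 8 + [2] * 4 + [4] * 6 + [7] * 6 + [12] * 6)

n_bins = observed_counts.size
rate = observed_counts.mean()           # rate per bin for a single uniform Poisson model

for k in range(5):
    expected = n_bins * poisson.pmf(k, rate)
    observed = int((observed_counts == k).sum())
    print(f"bins with {k} events : observed {observed:3d}, single Poisson expects {expected:5.1f}")
print(f"bins with >4 events: observed {int((observed_counts > 4).sum()):3d}, "
      f"single Poisson expects {n_bins * poisson.sf(4, rate):5.1f}")
# The clustered counts show an excess of empty and high-count bins and a deficit
# of bins with 1-2 events, as described in the abstract above.
```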

  • Source: The data was sourced from CSIRO (Victoria) in 2012 by Bob Cechet. It is not known specifically which division of CSIRO provided the data (although it is likely to have been the Marine and Atmospheric Research Division, Aspendale), nor the contact details of the person who provided it to Bob. The data was originally produced by CSIRO as input to the South-East Queensland Climate Adaptation Research Initiative (SEQCARI). Reference, from an email of 16 March 2012 sent from Bob Cechet to Chris Thomas (Appendix 1 of the README doc stored at the parent folder level with the data), is made to 'download NCEP AVN/GFS files' or to source them from the CSIRO archive.
    Content: The data is compressed into 'tar' files. The file name is separated by dots, where the first section is the climatic variable:
        rain            - 24 hr accumulated precipitation
        rh1_3PM         - Relative humidity at 3pm local time
        tmax            - Maximum temperature
        tmin            - Minimum temperature
        tscr_3PM        - Screen temperature (2 m above ground) at 3pm local time
        u10_3PM         - 10-metre above ground eastward wind speed at 3pm local time
        v10_3PM         - 10-metre above ground northward wind speed at 3pm local time
    The second part of the name is the General Circulation Model (GCM) applied:
        gfdlcm21        - GFDL CM2.1
        miroc3_2_medres - MIROC 3.2 (medres)
        mpi_echam5      - MPI ECHAM5
        ncep            - NCEP
    The third, and final, part of the tarball name is the year range that the results relate to: 1961-2000, 1971-2000, 2001-2040 and 2041-2099.
    Data format and extent: Inside each tarball is a collection of NetCDF files covering each simulation that constitutes the year range (12 simulations for each year). A similar naming protocol is used for the NetCDF files, with a two-digit extension added to the year for each of the simulations for that year (e.g. 01-12). The spatial coverage of the NetCDF files is given by the bounding box below (longitude as X, latitude as Y):
        Min X: 134.924812316895    Max X: 155.149784088135
        Min Y: -50.0749073028564   Max Y: -9.92459297180176
    The cell size is 0.15 degrees by 0.15 degrees (approximately 17 km square at the equator). The data is stored relative to the WGS 1984 Geographic Coordinate System. The GCMs were forced with the Intergovernmental Panel on Climate Change (IPCC) A2 emission scenario, as described in the IPCC Special Report on Emissions Scenarios (SRES), for the future climate. The GCM results were then downscaled by CSIRO from a 2 degree cell resolution to the 0.15 degree cell resolution using their Cubic Conformal Atmospheric Model (CCAM).
    Use: This data was used within the Rockhampton Project to identify future climate changes based on the IPCC A2 SRES emissions scenario. The relative difference between the current climate GCM results and the future climate results was applied to the results of higher resolution current climate natural hazard modelling. Refer to GeoCat # 75085 for details of the report and the 59 attached ANZLIC metadata entries for data outputs.
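    A minimal access sketch follows, assuming Python with xarray is available; the tarball and NetCDF file names shown are examples constructed from the naming convention described above and should be checked against the actual archive.

```python
# Access sketch (assumes the xarray package; file names follow the convention
# above and are examples only -- check the actual archive contents).
import tarfile
import xarray as xr

tarball = "tmax.gfdlcm21.2041-2099.tar"          # <variable>.<GCM>.<year range>.tar

# Unpack one archive and open one of the per-simulation NetCDF files
with tarfile.open(tarball) as tf:
    tf.extractall("extracted")

# Hypothetical member name: two-digit simulation suffix appended to the year range
ds = xr.open_dataset("extracted/tmax.gfdlcm21.2041-2099.01.nc")
print(ds)          # inspect variables, the ~0.15 degree lat/lon grid and the time axis
```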

  • The Attorney-General's Department (AGD) has supported Geoscience Australia (GA) to develop inundation models for one community in South Australia (SA), with a view to building the tsunami planning and preparation capacity of the SA State Government. The community chosen was Victor Harbor, which also includes the townships of Port Elliot and Middleton. These locations were selected in collaboration with the SA State Emergency Service (SES), the SA Department of Environment and Natural Resources (DENR) and the Australian Government, based on the National Near Shore Tsunami Hazard Assessment [1], which highlighted tsunami amplification near Victor Harbor. Three tsunamigenic events were selected for modelling from the scenario database calculated as part of the national offshore Probabilistic Tsunami Hazard Assessment (PTHA) [2]. The events selected are hypothetical and are based on the current understanding of the tsunami hazard. Only earthquake sources are considered, as these account for the majority of tsunami that have historically been observed in Australia. The suite comprises a 'worst-case' (1 in 10 000 year) hazard event from each of three source zones: the Puysegur Trench, the Java Trench and the South Sandwich Islands Trench. These three source zones were identified from the PTHA as contributing significantly to the offshore tsunami hazard near Victor Harbor.

  • Abstract: Severe wind is one of the major natural hazards affecting Australia. The main wind hazards contributing to economic loss in Australia are tropical cyclones, thunderstorms and mid-latitude storms. Geoscience Australia's Risk and Impact Analysis Group (RIAG) has developed mathematical models to study a number of natural hazards, including wind hazard. In this paper, we describe a model to study the 'combined' gust wind hazard produced by thunderstorms and mid-latitude (synoptic) storms. The model is aimed at applications in regions where these two wind types dominate the hazard spectrum across all return periods (most of the Australian continent apart from the coastal region stretching north from about 27 degrees south). Each of these severe wind types is generated by different physical phenomena and poses a different hazard to the built environment; for these reasons, it is necessary to model them separately. The return periods calculated for each wind type are then combined probabilistically to produce the combined gust wind return period, the indicator used to quantify severe wind hazard. The combined wind hazard model utilises climate-simulated wind speeds and hence allows wind analysts to assess the impact of climate change on future wind hazard. It aims to study severe wind hazard in the non-cyclonic regions of Australia (region 'A', as defined in the Australian/NZ Wind Loading Standard, AS/NZS 1170.2:2002), which are dominated by thunderstorm and synoptic winds.
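    One common way to combine the two wind types probabilistically, assuming independence (the paper's exact formulation may differ), is to combine annual exceedance probabilities for a given gust speed, as sketched below.

```python
# Sketch of combining two independent wind hazards at a given gust speed
# (the paper's exact formulation may differ): convert return periods to annual
# exceedance probabilities, combine, and convert back.
def combined_return_period(rp_thunderstorm: float, rp_synoptic: float) -> float:
    """Return period of exceedance by either wind type, assuming independence."""
    p_t = 1.0 / rp_thunderstorm         # annual exceedance probability, thunderstorm
    p_s = 1.0 / rp_synoptic             # annual exceedance probability, synoptic
    p_combined = 1.0 - (1.0 - p_t) * (1.0 - p_s)
    return 1.0 / p_combined

# Example: a gust speed with a 100-year thunderstorm RP and a 250-year synoptic RP
print(round(combined_return_period(100.0, 250.0), 1))   # about 71.6 years
```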

  • This user guide provides instructions for using the Tasmanian Extreme Wind Hazard Standalone Tool (TEWHST). It aims to assist the Tasmanian State Emergency Service (SES) to view the spatial nature of extreme wind hazard and how it varies with the direction of the extreme wind gusts. This information provides detailed spatial texture for the extreme hazard, which can guide understanding of where the local-scale hazard (and impact) is expected to be greatest for any particular event, depending on the intensity and directional influence of the broad-scale severe storm. The tool provides spatial information at the local scale (25 metre resolution) of the return-period extreme wind hazard (3-second gust at 10 metre height; variation with direction), where the broad-scale regional hazard is provided by the Australian and New Zealand Wind Loading Standard (AS/NZS 1170.2, 2002).

  • Along the Aceh-Andaman subduction zone, there was no historical precedent for an event the size of the 2004 Sumatra-Andaman tsunami; therefore, neither the countries affected by the tsunami nor their neighbours were adequately prepared for the disaster. By studying the geological signatures of past tsunamis, the record may be extended by thousands of years, leading to a better understanding of tsunami frequency and magnitude. Sedimentary evidence for the 2004 Sumatra-Andaman tsunami and three predecessor great Holocene tsunamis is preserved on a beach ridge plain on Phra Thong Island, Thailand. Optically stimulated luminescence (OSL) ages were obtained from tsunami-laid sediment sheets and surrounding morphostratigraphic units. Single-grain results from the 2004 sediment sheet show sizable proportions of near-zero grains, suggesting that the majority of the sediment was well bleached prior to tsunami entrainment or was bleached during transport. However, a minimum-age model needed to be applied to obtain a near-zero luminescence age for the 2004 tsunami deposit, as residual ages were found in a small population of grains. This demonstrates the importance of considering partial bleaching in water-transported sediments. The OSL results from the predecessor tsunami deposits and underlying tidal flat sands show good agreement with paired radiocarbon ages and constrain the average recurrence of large late Holocene tsunami on the western Thai coast to between 500 and 1000 years. This is the first large-scale application of luminescence dating to obtain recurrence estimates for large Indian Ocean tsunami. These results increase confidence in the use of OSL to date tsunami-laid sediments, providing an additional tool for tsunami geologists when material for radiocarbon dating is unavailable. Through an understanding of the frequency of past tsunami, OSL dating of tsunami deposits can improve our understanding of tsunami hazard and provide a means of assessing fu