-
Modelling faults in intraplate regions for seismic hazard purposes presents unique challenges. Low fault slip rates relative to landscape modification rates often make fault sources difficult to discover and favour incomplete characterisation of rupture behaviour. Nevertheless, regional and local test cases have demonstrated that fault sources assigned activity rates consistent with paleoseismic observations have the potential to significantly impact probabilistic seismic hazard assessments in Australia. To reflect this, the 2018 Australian National Seismic Hazard Assessment (NSHA) will for the first time incorporate a fault source model. The model includes over 300 onshore faults and a handful of offshore faults, which are modelled as simplified planes and assigned a general dip and dip direction. Dips are obtained from seismic-reflection profiles where available, or are inferred from surface geology and geomorphology, or from other fault geometries within similar neotectonic settings. The base of faulting is generally taken as the regional maximum depth of distributed seismicity. Slip rates are calculated from displaced strata of known age, estimated from surface expression, or extrapolated from other faults within similar neotectonic settings. We construct logic trees to capture epistemic uncertainty in fault source parameters, including the magnitude-frequency distribution and the potential for random, periodic or episodic recurrence behaviour. This presentation introduces the new fault source database and the fault source logic tree as it currently exists, and discusses uncertainty in, and sensitivity to, various elements of the proposed fault source input model.
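As a minimal illustration of the kind of calculation involved (a sketch, not the NSHA18 implementation), the Python snippet below combines hypothetical slip-rate branches with logic-tree weights and converts the weighted slip rate to a mean recurrence interval for a characteristic earthquake via moment balance; all fault dimensions, rates and weights are placeholder values.

```python
# Minimal sketch (not the NSHA18 code): combine logic-tree branches for a
# fault's slip rate and convert the weighted value to a mean recurrence
# interval for a characteristic earthquake via moment balance.
# All numbers below are hypothetical placeholders.

MU = 3.0e10  # shear modulus (Pa), a common crustal assumption

def slip_rate_from_offset(displacement_m, age_yr):
    """Slip rate (m/yr) from displaced strata of known age."""
    return displacement_m / age_yr

def characteristic_moment(mw):
    """Seismic moment (N.m) from moment magnitude (Hanks & Kanamori, 1979)."""
    return 10 ** (1.5 * mw + 9.05)

def mean_recurrence_interval(slip_rate_m_yr, fault_area_m2, mw_char):
    """Years between characteristic events if all slip is released seismically."""
    annual_moment_rate = MU * fault_area_m2 * slip_rate_m_yr
    return characteristic_moment(mw_char) / annual_moment_rate

# Hypothetical logic-tree branches: (slip rate in m/yr, branch weight)
branches = [(2e-5, 0.2), (5e-5, 0.6), (1e-4, 0.2)]
weighted_slip_rate = sum(rate * w for rate, w in branches)

area = 30e3 * 10e3  # 30 km x 10 km simplified fault plane
print(mean_recurrence_interval(weighted_slip_rate, area, mw_char=7.0))
```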
-
A bookmark developed during the year of the 30th anniversary of the Newcastle earthquake, used to raise awareness of earthquakes and to provide information on what to do in an earthquake. As Geoscience Australia jointly operates the Joint Australian Tsunami Warning Centre with the Bureau of Meteorology, the bookmark also provides information on tsunami safety. Geoscience Australia identifies and characterises potentially tsunamigenic earthquakes, and this information is used to initiate the tsunami warning chain.
-
Lu-Hf isotopic analysis of zircon is becoming a common way to characterise the source signature of granite. The data are collected by MC-LA-ICP-MS (multi-collector laser ablation inductively coupled plasma mass spectrometry) as a series of spot analyses on a number of zircons from a single sample. These data are often plotted as spot analyses, and variable significance is attributed to extreme values and to the amount of scatter.

Lu-Hf data are used to understand the origin of granites, and a distribution of εHf values is often interpreted to derive from heterogeneity in the source or from mixing processes. As with any physical measurement, however, before the data are used to describe geological processes, care ought to be taken to account for sources of analytical variability. The null hypothesis for any dataset is that there is no difference between measurements that cannot be explained by analytical uncertainty. This null hypothesis must then be disproven using common statistical methods.

There are many sources of uncertainty in any analytical method. First is the uncertainty associated with the counting statistics of each analysis, usually recorded as the SE (standard error) attributed to each spot. This uncertainty commonly underestimates the total uncertainty of the population, as it only contains information about the consistency of the measurement within a single analysis. The other source of uncertainty that needs to be characterised is reproducibility across multiple analyses. This is very difficult to assess in an unknown material, but can be assessed by measuring well-understood reference zircons.

Reference materials are characterised by homogeneity in the isotope system of interest, and multiple analyses of such a material should produce a single statistical population. Where these populations display significant excess scatter, manifested as an MSWD value that far exceeds 1, counting statistics are not the sole source of uncertainty. This can be addressed by expanding the uncertainty on the analyses until the reference zircons form a coherent statistical population. The same expansion should then be applied to the unknown zircons to accommodate this 'spot-to-spot uncertainty' or 'repeatability' factor. This approach is routinely applied to SHRIMP U-Pb data, and is applied here in the same way to Lu-Hf data from granites of the northeast Lachlan Orogen.

By applying these uncertainty factors appropriately, it is then possible to assess the homogeneity of unknown materials by calculating weighted means and MSWD values. The MSWD is a measure of scatter away from a single population (McIntyre et al., 1966; Wendt and Carl, 1991). Where the MSWD is 1, the scatter in the data can be explained solely by analytical uncertainty; the higher the MSWD, the less likely it is that the data can be described as a single population. Data that disperse over several εHf units can still be attributed to a single population if the uncertainty envelopes of the analyses largely overlap. These concepts are illustrated using the data presented in Figure 1. Four out of five of the εHf datasets on zircons from granites form statistically coherent populations (MSWD = 0.69 to 2.4).

A high MSWD does not necessarily imply that variation is due to processes occurring during granite formation. Although zircon is a robust mineral, isotopic disturbance is still possible. In the U-Pb system there is often evidence of post-crystallisation 'Pb loss', which leads to erroneously young apparent U-Pb ages. The Lu-Hf system in zircon is generally thought to be more robust than the U-Pb system, but that does not mean it is impervious to such effects. In the dataset presented in Figure 1, the sample with the most scatter in Lu-Hf (Glenariff Granite, εHf = -0.2 ± 1.5, MSWD = 7.20) is also the sample with the most rejections in the SHRIMP U-Pb data due to Pb loss. The subsequent Hf analyses targeted only those grains that fell within the magmatic population (i.e. no observed Pb loss), but the larger volume excavated by laser Hf analysis means that disturbed regions of these grains were likely incorporated into the measurement. This offers an explanation for the scatter that has nothing to do with the geological source characteristics.

This line of reasoning can similarly be applied to all types of multi-spot analyses, including O-isotope analyses. While most of the εHf datasets presented here form coherent populations, the O-isotope data are significantly more scattered (MSWD = 2.8 to 9.4). The analyses of the unknowns scatter much more than those of the co-analysed TEMORA2 reference zircon, implying a source of scatter additional to those described above. O-isotope analysis by SIMS is also extremely sensitive to topography on the surface of the epoxy mount into which the zircons are set (Ickert et al., 2008). O isotopes may also be susceptible to post-formation disturbance, so care should be taken when interpreting O data before assigning geological meaning.

While it is possible for Lu-Hf and O analyses of zircons in granites to reflect heterogeneous sources and/or complex formation processes, it is important to first exclude other sources of heterogeneity, such as analytical uncertainty and post-formation isotopic disturbance.
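The weighted-mean, MSWD and uncertainty-expansion workflow described above can be sketched in a few lines of Python; the εHf values and uncertainties below are hypothetical, and the routine is illustrative rather than the laboratory's actual data-reduction code.

```python
# Minimal sketch: weighted mean and MSWD for spot eHf analyses, with an
# uncertainty-expansion factor derived from a reference zircon.
# All values are hypothetical.
import numpy as np

def weighted_mean_mswd(values, one_sigma):
    values, one_sigma = np.asarray(values, float), np.asarray(one_sigma, float)
    w = 1.0 / one_sigma ** 2                      # inverse-variance weights
    mean = np.sum(w * values) / np.sum(w)
    mswd = np.sum(w * (values - mean) ** 2) / (len(values) - 1)
    return mean, mswd

def expansion_factor(ref_values, ref_one_sigma):
    """Factor that scales spot uncertainties until the reference MSWD ~ 1."""
    _, mswd = weighted_mean_mswd(ref_values, ref_one_sigma)
    return max(1.0, float(np.sqrt(mswd)))

# Reference zircon spots (homogeneous in Hf) and unknown spots (hypothetical)
ref_ehf, ref_se = [1.1, 0.7, 1.4, 0.9, 1.2], [0.3] * 5
unk_ehf, unk_se = [-0.5, 0.3, -1.1, 0.1], [0.4] * 4

f = expansion_factor(ref_ehf, ref_se)             # 'spot-to-spot' repeatability
mean, mswd = weighted_mean_mswd(unk_ehf, [s * f for s in unk_se])
print(f"weighted mean eHf = {mean:.2f}, MSWD = {mswd:.2f}")
```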
-
A postcard providing an overview of the marine ecology programme at Geoscience Australia
-
GA publication: Flyer AEIP, ELVIS, EM-LINK 2021
-
Australia is a unique continent. This short video introduces the physical geography of Australia using a colourful topographic map. Viewers are shown the three major physical regions of the continent and the lack of large mountains, and are invited to consider why relatively few people live in Australia given its size.
-
Probabilistic methods applied to infrequent but devastating natural events are intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should in principle be evaluated because of the different causes that generate tsunamis (earthquakes, landslides, volcanic activity, meteorological events, asteroid impacts), each with varying mean return times. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional and local scales with the aim of assessing and mitigating tsunami risk and improving early warning systems. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific tsunami intensity measures (e.g. run-up or maximum inundation height) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarised in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including: (i) sources and mechanisms of tsunami generation, emphasising the variety and complexity of tsunami sources and their generation mechanisms; (ii) developments in modelling the propagation and impact of tsunami waves; and (iii) statistical procedures for tsunami hazard estimates, including the associated epistemic and aleatory uncertainties. Key elements in understanding the potential tsunami hazard are discussed in light of the rapid development of PTHA methods during the last decade and their globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence and uncertainties in an integrated and consistent probabilistic framework.
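As a minimal illustration of the probabilistic backbone of a PTHA, the Python sketch below converts a hypothetical set of scenario occurrence rates and modelled wave heights at a target site into a hazard curve, assuming Poissonian occurrence; the scenarios, rates and thresholds are placeholders, not values from any published hazard model.

```python
# Minimal sketch of a PTHA hazard curve, assuming Poissonian event occurrence.
# Scenario rates and heights are hypothetical placeholders.
import numpy as np

def poisson_exceedance(annual_rate, exposure_time_yr):
    """P(at least one exceedance within the exposure time)."""
    return 1.0 - np.exp(-annual_rate * exposure_time_yr)

# Hypothetical scenario set: (annual occurrence rate, max wave height at site in m)
scenarios = [(1e-2, 0.3), (2e-3, 1.0), (5e-4, 2.5), (1e-4, 6.0)]

thresholds = np.array([0.5, 1.0, 2.0, 5.0])   # hazard-curve intensity levels (m)
rates = np.array([sum(r for r, h in scenarios if h >= t) for t in thresholds])
hazard_curve = poisson_exceedance(rates, exposure_time_yr=50)

for t, p in zip(thresholds, hazard_curve):
    print(f"P(height >= {t} m in 50 yr) = {p:.3f}")
```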
-
The annual Asia Pacific Regional Geodetic Project (APRGP) GPS campaign is an activity of the Geodetic Reference Frame Working Group (WG) of the Regional Committee of United Nations Global Geospatial Information Management for Asia and the Pacific (UN-GGIM-AP). This document describes the analysis of the APRGP GPS campaign data collected between 15 and 22 September 2019. Campaign GPS data from 101 sites in ten countries across the Asia Pacific region were processed using version 5.2 of the Bernese GNSS Software in a regional network together with selected IGS (International GNSS Service) sites. The GPS solution was constrained to the ITRF2014 reference frame by adopting IGS14 coordinates for selected IGS reference sites and using the final IGS earth orientation parameters and satellite ephemerides. The average root mean square (RMS) repeatability of the station coordinates for the campaign was 1.8 mm, 1.6 mm and 5.4 mm in the north, east and up components of station position, respectively.
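For illustration, the Python sketch below shows a generic coordinate-repeatability computation (RMS scatter of daily solutions about their mean, per component); the values are hypothetical and the calculation stands in for, rather than reproduces, the Bernese GNSS Software output used in the campaign analysis.

```python
# Minimal sketch of per-component coordinate repeatability (RMS scatter of
# daily solutions about their mean). Values are hypothetical placeholders.
import numpy as np

# Daily north/east/up estimates for one station, in mm relative to a reference
daily_neu_mm = np.array([
    [ 1.2, -0.8,  4.1],
    [-0.6,  1.1, -3.7],
    [ 0.9, -0.4,  5.6],
    [-1.4,  0.2, -6.0],
])

residuals = daily_neu_mm - daily_neu_mm.mean(axis=0)
rms_neu = np.sqrt((residuals ** 2).mean(axis=0))
print("RMS repeatability (N, E, U) in mm:", np.round(rms_neu, 1))
```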
-
Google has partnered with hundreds of museums, cultural institutions and archives, including Geoscience Australia, to host treasures from our National Mineral and Fossil Collection online on the Google Arts & Culture website. Our building's public areas have been scanned and are available online as a Street View virtual tour, and a large number of collection items have been uploaded and used to create many unique and fascinating exhibits.
-
The Australian Geoscience Data Cube has won the 2016 Content Platform of the Year category at the Geospatial World Leadership Awards. The awards recognise significant contributions made by champions of change within the global geospatial industry and were presented during the 2017 Geospatial World Forum held in Hyderabad, India. The Data Cube was developed by Geoscience Australia in partnership with the CSIRO and the National Computational Infrastructure at the Australian National University, and is a world-leading data analysis system for satellite and other Earth observation data. Visit www.datacube.org.au to find out more, including the technical specifications, and to learn how you can develop your own Data Cube and become part of the collective.