  • To determine the magnitude of severe wind gust hazard due to thunderstorm downbursts using regional climate model output and analysis of observed data (including radar reflectivity and proximity soundings).

  • Tropical cyclones are the most common disaster in the Pacific, and among the most destructive. In December 2012, Cyclone Evan caused over US$200 million in damage in Samoa, nearly 30 percent of Samoan GDP. Niue suffered losses of US$85 million following Cyclone Heta in 2004, over five times its GDP. As recently as January 2014, Cyclone Ian caused significant damage throughout Tonga, resulting in the first payout of the Pacific Catastrophe Risk Insurance Pilot operated by the World Bank (2014). According to the Intergovernmental Panel on Climate Change (IPCC), intense tropical cyclone activity in the Pacific basin will likely increase in the future (IPCC 2013). But such general statements about global tropical cyclone activity provide little guidance on how impacts may change locally or even regionally, and thus do little to help communities and nations prepare appropriate adaptation measures. This study assesses climate change in terms of its impact on the human population and its assets, expressed as financial loss. An impact focus is relevant to adaptation because changes in hazard do not necessarily result in a proportional change in impact: impacts are driven by exposure and vulnerability as well as by hazard. For example, a small shift in hazard in a densely populated area may have more significant consequences than a bigger change in an unpopulated area. Analogously, a dense population with low vulnerability to a particular hazard might not need to adapt significantly to a change in that hazard. Even in regions with high tropical cyclone risk and correspondingly stringent building codes, such as the state of Florida, a modest 1 percent increase in wind speeds can result in a 5 to 10 percent increase in loss to residential property. Quantifying the change in impact thus supports evidence-based decision making on adaptation to future climate risk.
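The "1 percent wind increase causes a 5 to 10 percent loss increase" rule of thumb quoted above is consistent with a power-law vulnerability curve in which loss scales as wind speed raised to an exponent between roughly 5 and 10. The sketch below illustrates that arithmetic only; the functional form and exponent values are assumptions for illustration, not fitted results from the study.

```python
# Illustrative sketch: if loss ~ wind**k, a 1% wind increase raises loss by
# about k% for small changes. The exponents below are assumed, not fitted.

def loss_change(wind_increase_pct: float, exponent: float) -> float:
    """Percent change in loss for a given percent change in wind speed,
    assuming loss scales as wind speed raised to `exponent`."""
    factor = (1.0 + wind_increase_pct / 100.0) ** exponent
    return (factor - 1.0) * 100.0

for k in (5, 10):
    print(f"k={k}: 1% wind increase -> {loss_change(1.0, k):.1f}% loss increase")
```

With exponents of 5 and 10 this reproduces the quoted 5 to 10 percent range, which is why small hazard shifts can matter so much for losses.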

  • The Earthquake Scenario Selection is an interactive tool for querying, visualising and downloading earthquake scenarios. Pre-generated scenarios are available for over 160 sites nationally. These represent plausible future events that can be used for earthquake risk management and planning (see https://www.ga.gov.au/about/projects/safety/nsha for more details).

  • Prior to the 1990s, before Australian-specific magnitude formulae were developed, the 1935 magnitude corrections of Charles Richter – originally derived for southern California – were used almost exclusively to calculate earthquake magnitudes throughout Australia. Because ground-motion attenuation differs between southern California and much of Australia, many historical earthquake magnitudes in the Australian earthquake catalogue are likely to be overestimated. A method has been developed that corrects local magnitudes using the difference between the original (inappropriate) magnitude corrections and the Australian-specific corrections, evaluated at a distance determined by the nearest recording station likely to have recorded the earthquake. These corrections reduce the number of local magnitude 4.5 earthquakes in the historical catalogue since 1900 by about 30%, and the number of magnitude 5.0 earthquakes by about 60% over the same period. The reduction in the number of moderate-to-large-magnitude earthquakes over the instrumental period yields long-term earthquake rates that are more consistent with present-day rates recorded since the development of Australian-specific magnitude formulae. The adjustment of historical earthquake magnitudes is important for seismic hazard assessments, which assume a Poisson distribution of earthquakes in space and time.
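The correction described above can be sketched as follows: a local magnitude is adjusted by the difference between the region-appropriate and the original attenuation correction terms, evaluated at the nearest likely recording station's distance. The two -log(A0) functions below are hypothetical placeholders, not the actual Richter (1935) or Australian-specific formulae.

```python
import math

# Hedged sketch of the magnitude-correction approach. Both attenuation
# functions are hypothetical placeholders, NOT the published curves.

def log_a0_california(r_km: float) -> float:
    # placeholder southern-California -log(A0) term
    return 1.11 * math.log10(r_km / 100.0) + 0.00189 * (r_km - 100.0) + 3.0

def log_a0_australia(r_km: float) -> float:
    # placeholder Australian -log(A0) term (weaker attenuation assumed)
    return 1.05 * math.log10(r_km / 100.0) + 0.00110 * (r_km - 100.0) + 3.0

def corrected_ml(ml_original: float, nearest_station_km: float) -> float:
    """Adjust an historical local magnitude by the difference between the
    Australian-specific and original (California) corrections, evaluated at
    the distance of the nearest station likely to have recorded the event."""
    delta = (log_a0_australia(nearest_station_km)
             - log_a0_california(nearest_station_km))
    return ml_original + delta
```

Under these placeholder curves the correction is zero at the 100 km reference distance and increasingly negative at larger distances, which is the mechanism by which distant historical magnitudes end up revised downward.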

  • The National Exposure Information System (NEXIS) is a unique modelling capability designed by Geoscience Australia (GA) to provide comprehensive, nationally consistent exposure information in response to the 2003 COAG commitment to cost-effective, evidence-based disaster mitigation. Since its inception, NEXIS has continually evolved to fill known information gaps by improving statistical methodologies and integrating the best publicly available data. In addition to Residential, Commercial and Industrial building exposure information, NEXIS has recently expanded to include exposure information about agricultural assets, providing a wider understanding of how communities can be affected by a potential event. GA's collaboration with the Attorney-General's Department (AGD) has involved the consolidation of location-based data to deliver consistent map and exposure information products. The complex information requirements emphasised the importance of making all relevant building, demographic, economic, agricultural and infrastructure information in NEXIS available in a clear, unified Exposure Report to aid decision-makers. The Exposure Report includes a situational map of the hazard footprint to provide geographic context, together with detailed exposure information: estimates of the number and potential cost of impacted buildings by use, agricultural commodities and their cost, the number and social vulnerability of the affected population, and the number and lengths of affected infrastructure assets and institutions. Developed within an FME workbench, the tool accepts hazard footprints and other report specifics as input and provides an HTML link to the final output in approximately five minutes. The consolidation of data and streamlining of exposure information into a simple, uniform document greatly assisted the AGD in timely evidence-based decision-making during the 2014-15 summer season.

  • Australian Rainfall and Runoff (ARR) is a series of national guidelines and datasets fundamental to flood estimation. The work is being completed by Engineers Australia and funded by the Australian Government through the National Flood Risk Information Project at Geoscience Australia. This flyer promotes the revision of ARR at the Hydrology & Water Resources Symposium (HWRS 2015) in Hobart in December 2015.

  • Tsunami hazard assessments are often derived using computational approaches that model the occurrence rates of a suite of hypothetical earthquake-tsunami scenarios. While uniform slip earthquake models are often used, recent studies have emphasized that spatially non-uniform earthquake slip substantially affects tsunamis, with wave heights and run-up varying by a factor of three or more due to slip heterogeneities alone (i.e. assuming fixed 'bulk rupture parameters' such as the earthquake magnitude, rupture plane geometry, location, and shear modulus). As a result, stochastic slip models are increasingly being used to directly simulate slip variability in hazard assessments. Irrespective of how the tsunami scenarios are generated, the statistical properties of the modelled tsunamis need to closely approximate the statistical properties of real tsunamis with the same bulk rupture parameters. For example, ideally a future real tsunami will have a 50% chance of having a peak wave height below the median corresponding synthetic peak wave height; a 90% chance of being below the 90th percentile; and so on. Testing is required to determine whether any model has performance comparable to this ideal case. The literature suggests large differences in the statistical properties of stochastic slip models, implying not all will give a good representation of real tsunami variability. However, by comparing model scenarios against a suite of historic tsunami observations, we can statistically test whether key properties of real tsunamis have the same distribution as their corresponding synthetic scenarios. We recommend that such tests become standard in the validation of tsunami hazard scenario generation methods, to reduce the chance of using an inappropriate model which could significantly bias a hazard assessment.
The current study evaluates the statistical performance of earthquake-tsunami scenarios which form part of the updated Australian Probabilistic Tsunami Hazard Assessment, currently being developed by Geoscience Australia. The model scenarios are compared with deep-ocean DART buoy wave time-series for 15 recent tsunamis, each recorded at between 1 and 28 sites. No event-specific calibration is applied to the models. We evaluate three different earthquake-tsunami scenario generation methods (fixed-size uniform-slip; variable-size uniform-slip; variable-size stochastic-slip) in terms of how well they model the statistical properties of wave heights, and discuss the capacity of each method to generate wave time-series which match historical events. We find that some events cannot be well modelled using our fixed-size uniform-slip scenarios, while it is usually possible to match observations reasonably well with a variable-size uniform-slip event or a variable-size stochastic-slip event. Both of the latter produce families of solutions which usually envelop the observed DART buoy tsunami wave heights, although quantiles of the variable-size uniform-slip events appear to have some downward bias, while quantiles of the variable-size stochastic-slip events seem more consistent with observations.
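The percentile-consistency idea above lends itself to a simple statistical check: if the synthetic scenario families capture real tsunami variability, the percentile rank of each observed peak wave height within its matching synthetic ensemble should be uniformly distributed on [0, 1], which can be tested with a Kolmogorov-Smirnov distance. The sketch below uses randomly generated placeholder data, not the study's DART observations or scenario sets.

```python
import random

# Sketch of a percentile-uniformity consistency check. All "observations"
# and "ensembles" here are synthetic placeholders for illustration.

def percentile_rank(observed: float, ensemble: list[float]) -> float:
    """Fraction of ensemble members at or below the observed value."""
    return sum(h <= observed for h in ensemble) / len(ensemble)

def ks_statistic_uniform(ranks: list[float]) -> float:
    """One-sample Kolmogorov-Smirnov distance to the uniform(0,1) CDF."""
    xs = sorted(ranks)
    n = len(xs)
    return max(max(abs((i + 1) / n - x), abs(x - i / n))
               for i, x in enumerate(xs))

random.seed(0)
# hypothetical example: 30 "observed" events, each compared against a
# 100-member synthetic ensemble drawn from the same distribution
ranks = [percentile_rank(random.gauss(1.0, 0.3),
                         [random.gauss(1.0, 0.3) for _ in range(100)])
         for _ in range(30)]
d = ks_statistic_uniform(ranks)
print(f"KS distance to uniform: {d:.3f}")  # small D => consistent with uniform
```

A model whose ensembles are systematically biased (e.g. observations clustering in the upper tail of the synthetic distribution) would produce a pile-up of ranks near 1 and a large KS distance, flagging it as an inappropriate scenario generator.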

  • Using the new release of the local wind multipliers software (V.3.1) (https://pid.geoscience.gov.au/dataset/ga/145699) and an appropriate source of classified terrain data, local wind multipliers were calculated at approximately 25-metre resolution across the entire Australian continent. This product is a necessary component for calculating local wind speeds from scenarios and for guiding impact assessments of severe wind hazard for both federal and state emergency services in Australia.

  • A new finite volume algorithm to solve the two-dimensional shallow water equations on an unstructured triangular mesh has been implemented in the open source ANUGA software, jointly developed by the Australian National University and Geoscience Australia. The algorithm allows for 'discontinuous elevation', or 'jumps' in the bed profile between neighbouring cells. This has a number of benefits compared with previously implemented 'continuous-elevation' approaches. Firstly, it can preserve stationary states at wet-dry fronts, while also permitting simulation of very shallow, frictionally dominated flow down slopes, as occurs in direct-rainfall flood models. Additionally, the use of discontinuous elevation enables the sharp resolution of rapid changes in topography associated with, for example, narrow rectangular drainage channels or buildings, without the computational expense of a very fine mesh. The approach also supports a simple and computationally efficient treatment of river walls. A number of benchmark tests are presented illustrating these features of the algorithm, along with its application to urban flood hazard simulation and comparison with field data.

  • A probabilistic seismic hazard map of Papua New Guinea, in terms of Peak Ground Acceleration (PGA), is developed for a return period of 475 years. The calculations were performed for bedrock site conditions (Vs30 = 760 m/s). A logic-tree framework is applied to include epistemic uncertainty in the seismic source and ground-motion modelling processes. To this end, two source models are developed, using area source zones and smoothed seismicity. Based on available geological and seismological data, the defined seismic sources are classified into four different tectonic environments. For each tectonic regime, three Ground Motion Prediction Equations (GMPEs) are selected and used to estimate ground motions at a grid of sites spaced at 0.1 degrees in latitude and longitude. Results show a high level of hazard in the coastal areas of the Huon Peninsula and the New Britain/Bougainville region, and a relatively low level of hazard in the southern part of the New Guinea highlands block. On the Huon Peninsula, as shown by seismic hazard disaggregation results, the high level of hazard is caused by modelled frequent moderate-to-large earthquakes occurring on the Ramu-Markham Fault zone. In the New Britain/Bougainville region, by contrast, the geometry of and distance to the subduction zone along the New Britain Trench mainly control the calculated level of hazard. It is also shown that the estimated PGA levels are very sensitive to the selection of GMPEs, and that overall the results are closer to those from studies using more recent ground-motion models.
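The logic-tree combination described above can be sketched as a weighted mean of hazard curves computed under alternative GMPE branches. The curves, branch names and weights below are hypothetical illustrations, not values from the Papua New Guinea assessment.

```python
# Sketch of logic-tree branch combination for a single site. All curves
# and weights are hypothetical placeholders.

pga_levels = [0.05, 0.1, 0.2, 0.4]  # PGA thresholds in g

# annual probabilities of exceedance from three hypothetical GMPE branches
branch_curves = {
    "gmpe_a": [2.0e-2, 8.0e-3, 2.0e-3, 4.0e-4],
    "gmpe_b": [1.5e-2, 6.0e-3, 1.5e-3, 3.0e-4],
    "gmpe_c": [2.5e-2, 1.0e-2, 3.0e-3, 6.0e-4],
}
branch_weights = {"gmpe_a": 0.4, "gmpe_b": 0.3, "gmpe_c": 0.3}

def combine(curves: dict, weights: dict) -> list[float]:
    """Weighted mean of annual exceedance probabilities across branches."""
    n = len(next(iter(curves.values())))
    return [sum(weights[k] * curves[k][i] for k in curves) for i in range(n)]

mean_curve = combine(branch_curves, branch_weights)
# a 475-year return period corresponds to ~1/475 = 2.1e-3 annual exceedance
for pga, p in zip(pga_levels, mean_curve):
    print(f"PGA {pga:.2f} g: annual P(exceed) = {p:.2e}")
```

The sensitivity to GMPE selection noted in the abstract is visible even in this toy example: the branch curves differ by up to about 60 percent at a given PGA level, so the branch weights materially shift the combined hazard.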