INFORMATION AND COMPUTING SCIENCES
-
The purpose of this document is to define an Emergency Management (EM) Metadata Profile Extension to ISO 19115-1:2014/AMD 1:2018, identifying the metadata required to accurately describe EM resources. The EM Metadata Profile is designed to support the documentation and discovery of EM datasets, services and other resources. This version of the Profile was developed to reflect extensions made to the current version of the international metadata standard, ISO 19115-1:2014/AMD 1:2018.
-
The magnetotelluric (MT) method is increasingly being applied to map tectonic architecture and mineral systems. Under the Exploring for the Future (EFTF) program, Geoscience Australia has invested significantly in the collection of new MT data. The science outputs from these data are underpinned by an open-source data analysis and visualisation software package called MTPy. MTPy started at the University of Adelaide as a means of sharing academic code among the MT community. Under EFTF, we have applied software engineering best practices to the code base, including automated documentation, unit testing, code refactoring, workshop tutorial materials and detailed installation instructions. New functionality targeted at supporting EFTF-related products has been developed, including data analysis and visualisation tools. Significant development has focused on modules for working with 3D MT inversions, including the capability to export to commonly used software such as Gocad and ArcGIS. This export capability has been particularly important in supporting integration of resistivity models with other EFTF datasets. The increased functionality, and improvements to code quality and usability, have directly supported the EFTF program and assisted with uptake of MTPy among the international MT community. Citation: Kirkby, A.L., Zhang, F., Peacock, J., Hassan, R. and Duan, J., 2020. Development of the open-source MTPy package for magnetotelluric data analysis and visualisation. In: Czarnota, K., Roach, I., Abbott, S., Haynes, M., Kositcin, N., Ray, A. and Slatter, E. (eds.) Exploring for the Future: Extended Abstracts, Geoscience Australia, Canberra, 1–4.
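To illustrate the kind of export step described above, here is a minimal, hypothetical sketch (not MTPy's actual API; the grid coordinates and resistivity values are synthetic placeholders) that writes one depth slice of a gridded 3D resistivity model to a simple x, y, value text file that GIS packages such as ArcGIS can ingest as points.

```python
# Illustrative only: synthetic model grid, not MTPy's API.
import numpy as np

east = np.linspace(400_000, 450_000, 50)        # eastings (m)
north = np.linspace(7_600_000, 7_650_000, 60)   # northings (m)
depths = np.array([100.0, 500.0, 1000.0, 5000.0])  # model depths (m)
resistivity = 10 ** np.random.uniform(0, 4, size=(depths.size, north.size, east.size))

slice_index = 2                                  # export the 1000 m depth slice
ee, nn = np.meshgrid(east, north)                # point coordinates for every cell
rows = np.column_stack([ee.ravel(), nn.ravel(),
                        np.log10(resistivity[slice_index]).ravel()])
np.savetxt("resistivity_1000m.csv", rows, delimiter=",",
           header="easting,northing,log10_resistivity", comments="")
```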
-
This video demonstrates to viewers the importance and value of fit-for-purpose metadata, metadata standards, and metadata profiles.
-
A publicly available ArcGIS Online (AGOL) dashboard that periodically updates to show the status of requests made to the Australian Exposure Information Platform (AEIP; www.aeip.ga.gov.au), categorised as Running, Queued and Completed.
-
All commercially produced hydrogen worldwide is presently stored in salt caverns. In eastern Australia, the only known thick salt accumulations are found in the Boree Salt of the Adavale Basin in central Queensland. Although the number of wells penetrating the basin is limited, salt intervals up to 555 m thick have been encountered. The Boree Salt consists predominantly of halite and is considered to be suitable for hydrogen storage. Using well data and historical 2D seismic interpretations, we have developed a 3D model of the Adavale Basin, particularly focussing on the thicker sections of the Boree Salt. Most of the salt appears to be present at depths greater than 2000 m, but shallower sections are found in the main salt body adjacent to the Warrego Fault and to the south at the Dartmouth Dome. The preliminary 3D model developed for this study has identified three main salt bodies that may be suitable for salt cavern construction and hydrogen storage. These are the only known large salt bodies in eastern Australia and therefore represent potentially strategic assets for underground hydrogen storage. There are still many unknowns, with further work and data acquisition required to fully assess the suitability of these salt bodies for hydrogen storage. Recommendations for future work are provided. Citation: Paterson R., Feitz A. J., Wang L., Rees S. & Keetley J., 2022. A preliminary 3D model of the Boree Salt in the Adavale Basin, Queensland. In: Czarnota, K. (ed.) Exploring for the Future: Extended Abstracts, Geoscience Australia, Canberra, https://dx.doi.org/10.26186/146935
-
Following the successful outcomes of the Tennant Creek-Mt Isa (TISA) mineral potential assessment (Murr et al., 2019; Skirrow et al., 2019), the methodology has been expanded to encompass the entire North Australian Craton (NAC). Like its predecessor, this assessment uses a knowledge-based, data-rich mineral systems approach to predict the potential for iron oxide-copper-gold (IOCG) mineralisation. With their high metal yield and large alteration footprint, IOCG mineral systems remain an attractive target in directing exploration efforts towards undercover regions. This mineral potential assessment uses a 2D GIS-based workflow to map four key mineral system components: (1) sources of metals, fluids and ligands; (2) energy to drive fluid flow; (3) fluid flow pathways and architecture; and (4) deposition mechanisms, such as redox or chemical gradients. For each of these key mineral system components, theoretical criteria representing important ore-forming processes were identified and translated into mappable proxies using a wide range of input datasets. Each of these criteria is weighted and combined using an established workflow to produce a model of IOCG potential. Metadata and selection rationale are documented in the accompanying NAC IOCG Assessment Criteria Table. Two scenarios were modelled for this assessment. The first is a comprehensive assessment targeting pre-Neoproterozoic mineral systems (>1500 Ma), using a combination of interpreted geological and geophysical datasets. Because geological interpretations depend on the geological knowledge of the interpreter, well-documented areas, such as shallow pre-Neoproterozoic basement, have a greater density of data. This increase in data density can create an inherent bias in the modelled result towards previously explored shallow terrains. The second assessment uses only datasets that can be mapped consistently across the assessment area. As such, these are predominantly geophysical data and are more consistent in assessing exposed and covered areas. However, far fewer criteria are included in this assessment, and observations reflect only the modern geological environment. Both assessments highlight existing mineral fields in WA, NT and QLD, and suggest that these regions extend under cover. Furthermore, regions not previously known for IOCG mineralisation display high modelled potential, offering exploration prospects in previously unknown or discounted areas.
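As a generic illustration of the weighted-criteria overlay described above (a sketch only: the layer names, weights and random grids below are hypothetical, not the assessment's actual inputs), each mappable proxy is rescaled to a common 0-1 range, multiplied by an expert weight and summed to give a relative potential surface.

```python
# Knowledge-driven weighted overlay sketch with hypothetical proxies.
import numpy as np

ny, nx = 200, 300  # hypothetical raster grid

layers = {
    "source_granites": np.random.rand(ny, nx),  # sources of metals, fluids and ligands
    "heat_flow":       np.random.rand(ny, nx),  # energy to drive fluid flow
    "fault_proximity": np.random.rand(ny, nx),  # fluid flow pathways and architecture
    "redox_gradient":  np.random.rand(ny, nx),  # deposition mechanisms
}
weights = {"source_granites": 0.3, "heat_flow": 0.2,
           "fault_proximity": 0.3, "redox_gradient": 0.2}

def rescale(layer):
    """Rescale a proxy layer to 0-1 so the weights are comparable."""
    return (layer - layer.min()) / (layer.max() - layer.min())

# Weighted sum of rescaled proxies: high values flag areas of elevated potential.
potential = sum(weights[name] * rescale(layer) for name, layer in layers.items())
```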
-
The geosciences are a data-rich domain where Earth materials and processes are analysed from local to global scales. However, often we only have discrete measurements at specific locations, and a limited understanding of how these features vary across the landscape. Earth system processes are inherently complex, and trans-disciplinary science will likely become increasingly important in finding solutions to future challenges associated with the environment, mineral/petroleum resources and food security. Machine learning is an important approach to synthesise the increasing complexity and sheer volume of Earth science data, and is now widely used in prediction across many scientific disciplines. In this context, we have built a machine learning pipeline, called Uncover-ML, for both supervised and unsupervised learning, prediction and classification. The Uncover-ML pipeline was developed from a partnership between CSIRO and Geoscience Australia, and is largely built around the Python scikit-learn machine learning library. In this paper, we briefly describe the architecture and components of Uncover-ML for feature extraction, data scaling, sample selection, predictive mapping, estimating model performance, model optimisation and estimating model uncertainties. Links to download the source code and information on how to implement the algorithms are also provided. Citation: Wilford, J., Basak, S., Hassan, R., Moushall, B., McCalman, L., Steinberg, D. and Zhang, F., 2020. Uncover-ML: a machine learning pipeline for geoscience data analysis. In: Czarnota, K., Roach, I., Abbott, S., Haynes, M., Kositcin, N., Ray, A. and Slatter, E. (eds.) Exploring for the Future: Extended Abstracts, Geoscience Australia, Canberra, 1–4.
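As an illustration of the kind of supervised workflow such a pipeline wraps (a generic scikit-learn sketch on synthetic data, not the Uncover-ML API itself), feature scaling, model fitting and cross-validated performance estimation might look like this:

```python
# Generic scikit-learn regression workflow with synthetic covariates and target.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                 # e.g. gridded geophysical/terrain covariates
y = 2.0 * X[:, 0] + rng.normal(size=500)      # e.g. a measured target property

model = Pipeline([
    ("scale", StandardScaler()),                              # data scaling
    ("forest", RandomForestRegressor(n_estimators=200, random_state=0)),  # predictor
])

# Estimate predictive skill with 5-fold cross-validation, then fit on all samples.
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("5-fold R^2:", scores.round(2))
model.fit(X, y)
predictions = model.predict(X)  # in practice, predict onto new covariate grids
```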
-
Fuelled by a virtual explosion in digital capabilities, the world is changing very fast, and the field of geoscience is no exception. Geoscience Australia (GA) has undergone a rapid digital transformation in the past five years. This has been driven in part by increasing organisational ICT costs, and in part by whole-of-government and, more broadly, global changes to the digital environment, such as growth in data; cheaper, more readily available cloud and infrastructure; new advances in AI and ML; new models for continuous development (DevOps); and public expectations that data will be available and accessible. The overarching principle of Geoscience Australia's Digital Strategy 2019-22 acknowledges that GA's enterprise ICT is inherently integrated with scientific data and ICT requirements, and that, more broadly, these are the foundation of the digital science GA is respected for. As such, this strategy brings all three of these components together in a whole-of-agency digital strategy. The Digital Strategy focuses on strong foundations, systematic experimentation, data-driven science, and digital culture and capability. Eighteen strategies under these themes help guide GA's focus for the next three years, and a number of high-level KPIs are designed to help measure success.
-
The pace with which government agencies, researchers, industry and the public need to react to national and international challenges of an economic, environmental and social nature is rapidly increasing. Responses to the global COVID-19 pandemic, the 2020 Australian bushfires and the 2021 flood crisis are recent examples of these requirements. Decisions are no longer made on information or data coming from a single source, discipline or solitary aspect of life: the issues of today are too complex. Solving complex issues requires seamless integration of data across multiple domains, together with an understanding and consideration of potential impacts on businesses, the economy and the environment. Modern technologies, easy access to information on the web and an abundance of openly available data are not, on their own, enough to overcome previous limitations of dealing with data and information. Data and software have to be Findable, Accessible, Interoperable and Reusable (FAIR); processes have to be transparent, verifiable and trusted. The approaches toward data integration, analysis, evaluation and access require rethinking to:
- Support building flexible, re-usable and re-purposable data and information solutions serving multiple domains and communities.
- Enable timely and effective delivery of complex solutions to support effective decision and policy making.
The unifying factor for these events is location: everything is happening somewhere at some time. Inconsistent representation of location (e.g. coordinates, statistical aggregations and descriptions) and the use of multiple techniques to represent the same data create difficulty in spatially integrating multiple data streams, often from independent sources and providers. To use location for integration, location information needs to be embedded within the datasets and the metadata describing those datasets, so that both become 'spatially enabled'.
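As a minimal sketch of what 'spatially enabled' metadata can look like (the record structure and field names are hypothetical), each dataset record carries an explicit geographic extent so that records from independent providers can be filtered, and therefore integrated, on location alone:

```python
# Hypothetical metadata records with embedded geographic extents (bounding boxes).
records = [
    {"title": "Flood extent mapping", "extent": {"w": 145.0, "e": 153.6, "s": -37.5, "n": -28.0}},
    {"title": "Bushfire burnt areas", "extent": {"w": 140.9, "e": 150.0, "s": -39.2, "n": -33.0}},
]

def intersects(extent, w, e, s, n):
    """True if a record's bounding box overlaps the query bounding box."""
    return not (extent["e"] < w or extent["w"] > e or extent["n"] < s or extent["s"] > n)

# Find every record touching a query region, regardless of who published it.
query = dict(w=146.0, e=150.0, s=-39.0, n=-36.0)
hits = [r["title"] for r in records if intersects(r["extent"], **query)]
print(hits)
```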
-
The gnssanalysis Python package is designed to provide the public with a source of useful Python functions and classes that help with the processing of GNSS observations. The functionality found within the package includes:
- reading of many standard file formats commonly used in the geodetic community, including SP3, SNX, RNX, CLK, PSD, etc., into pandas DataFrames (and writing of certain file formats)
- transformation of data, for example datetime conversions, Helmert inversions, rotations, and transforming geodata from XYZ to longitude-latitude-altitude (as sketched below)
- functions for the download of standard files and upload to other sources (e.g. S3)
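The XYZ to longitude-latitude-altitude transformation mentioned above can be sketched as follows (a generic iterative conversion on the WGS84 ellipsoid, shown for illustration; this is not the gnssanalysis API itself):

```python
# Generic ECEF (X, Y, Z) to geodetic latitude/longitude/height conversion, WGS84.
import math

def xyz_to_llh(x, y, z, a=6378137.0, f=1 / 298.257223563):
    """Convert ECEF coordinates (m) to geodetic latitude, longitude (deg) and height (m)."""
    e2 = f * (2 - f)                       # first eccentricity squared
    lon = math.atan2(y, x)
    p = math.hypot(x, y)                   # distance from the Earth's rotation axis
    lat = math.atan2(z, p * (1 - e2))      # initial latitude guess
    for _ in range(5):                     # fixed-point iteration converges quickly
        n = a / math.sqrt(1 - e2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1 - e2 * n / (n + h)))
    n = a / math.sqrt(1 - e2 * math.sin(lat) ** 2)
    h = p / math.cos(lat) - n
    return math.degrees(lat), math.degrees(lon), h

print(xyz_to_llh(-4052051.767, 4212836.201, -2545105.089))  # a point in central Australia
```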