Showing results 1–10 of 21
  • An introductory video explaining Linked Data and DGGS practices and philosophies.

  • Linked Data refers to a collection of interrelated datasets on the Web expressed in a standard structure. These datasets and the relationships among them can be accessed and managed with Semantic Web tools. Linked Data enables large-scale integration of, and reasoning on, data on the Web. This cookbook documents the processes and workflows required to create a Linked Data API for a dataset in the Foundation Base Project at Geoscience Australia (GA) and to publish it on AWS.
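The Linked Data model underlying such an API can be illustrated with a minimal, self-contained sketch: resources are identified by URIs and connected by typed links, queried as subject–predicate–object triples. The URIs and predicates below are hypothetical placeholders for illustration only, not real GA identifiers.

```python
# Minimal illustration of the Linked Data idea: datasets identified by URIs,
# with typed links between them, stored as subject-predicate-object triples.
# All URIs below are hypothetical examples, not real GA identifiers.
TRIPLES = [
    ("http://example.org/dataset/boreholes", "dcterms:title", "Boreholes"),
    ("http://example.org/dataset/boreholes", "dcterms:spatial",
     "http://example.org/place/adavale-basin"),
    ("http://example.org/place/adavale-basin", "rdfs:label", "Adavale Basin"),
]

def objects(subject, predicate):
    """Return all objects linked from `subject` via `predicate`."""
    return [o for s, p, o in TRIPLES if s == subject and p == predicate]

# Follow a link from one resource to another, as a Linked Data client would:
place = objects("http://example.org/dataset/boreholes", "dcterms:spatial")[0]
print(objects(place, "rdfs:label"))  # ['Adavale Basin']
```

In practice a Linked Data API would serve such triples in a standard serialisation (e.g. RDF/Turtle or JSON-LD) and answer SPARQL queries rather than Python calls; this sketch only shows the link-following pattern.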

  • HiQGA is a general-purpose software package for spatial statistical inference, geophysical forward modelling, Bayesian inference and inversion (both deterministic and probabilistic). It includes readily usable geophysical forward operators for airborne electromagnetics (AEM), controlled-source electromagnetics (CSEM) and magnetotellurics (MT). Physics-independent inversion frameworks are provided for probabilistic reversible-jump Markov chain Monte Carlo (rj-MCMC) inversions, with models parametrised by Gaussian processes (Ray and Myer, 2019), as well as deterministic inversions with an "Occam inversion" framework (Constable et al., 1987). HiQGA has been in development for the EFTF program since 2020.

  • The gnssanalysis Python package provides the public with useful Python functions and classes that help with the processing of GNSS observations. The functionality within the package includes: reading many standard file formats commonly used in the geodetic community (including SP3, SNX, RNX, CLK and PSD) into pandas dataframes, and writing certain file formats; transformation of data, for example datetime conversions, Helmert inversions, rotations, and transforming geodata from XYZ to longitude-latitude-altitude; and functions for downloading standard files and uploading to other destinations (e.g. S3).
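The XYZ to longitude-latitude-altitude transformation mentioned above is a standard geodetic conversion from Earth-centred Earth-fixed (ECEF) coordinates to WGS84 geodetic coordinates. gnssanalysis supplies its own helpers for this; the stand-alone sketch below shows the widely used iterative method for comparison, and is not the package's actual implementation.

```python
import math

# WGS84 ellipsoid constants
_A = 6378137.0                 # semi-major axis (m)
_F = 1 / 298.257223563         # flattening
_E2 = _F * (2 - _F)            # first eccentricity squared

def xyz_to_lla(x, y, z, iterations=5):
    """Convert ECEF XYZ (metres) to geodetic (lon, lat) in radians and
    ellipsoidal altitude in metres, via the standard iterative method.

    Illustrative only -- gnssanalysis provides its own transformation
    functions; this is not its implementation.
    """
    lon = math.atan2(y, x)
    p = math.hypot(x, y)                        # distance from Earth's axis
    lat = math.atan2(z, p * (1 - _E2))          # initial latitude guess
    for _ in range(iterations):
        n = _A / math.sqrt(1 - _E2 * math.sin(lat) ** 2)  # prime vertical radius
        h = p / math.cos(lat) - n               # ellipsoidal height
        lat = math.atan2(z, p * (1 - _E2 * n / (n + h)))
    return lon, lat, h

# A point 100 m above the ellipsoid on the equator at 0 degrees longitude:
print(xyz_to_lla(_A + 100.0, 0.0, 0.0))  # -> (0.0, 0.0, 100.0)
```

The iteration converges quickly (a handful of passes suffices to sub-millimetre level) except very close to the poles, where p approaches zero and a dedicated polar formula is preferable.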

  • All commercially produced hydrogen worldwide is presently stored in salt caverns. In eastern Australia, the only known thick salt accumulations are found in the Boree Salt of the Adavale Basin in central Queensland. Although the number of wells penetrating the basin is limited, salt intervals up to 555 m thick have been encountered. The Boree Salt consists predominantly of halite and is considered to be suitable for hydrogen storage. Using well data and historical 2D seismic interpretations, we have developed a 3D model of the Adavale Basin, particularly focussing on the thicker sections of the Boree Salt. Most of the salt appears to be present at depths greater than 2000 m, but shallower sections are found in the main salt body adjacent to the Warrego Fault and to the south at the Dartmouth Dome. The preliminary 3D model developed for this study has identified three main salt bodies that may be suitable for salt cavern construction and hydrogen storage. These are the only known large salt bodies in eastern Australia and therefore represent potentially strategic assets for underground hydrogen storage. There are still many unknowns, with further work and data acquisition required to fully assess the suitability of these salt bodies for hydrogen storage. Recommendations for future work are provided. <b>Citation:</b> Paterson R., Feitz A. J., Wang L., Rees S. & Keetley J., 2022. A preliminary 3D model of the Boree Salt in the Adavale Basin, Queensland. In: Czarnota, K. (ed.) Exploring for the Future: Extended Abstracts, Geoscience Australia, Canberra, https://dx.doi.org/10.26186/146935

  • The Flying Hellfish provide Geoscience Australia with web portals of an unprecedented quality and impact. They have achieved this by embracing automation, digital culture and cloud to uplift Geoscience Australia's web portal presence to scale and meet the demands of the modern user. In 2014 these concepts were only ideas and experiments. However, since the team formed in 2016 they have been on a transformational journey towards a new way of working which has delivered radically better digital products than were available at the outset. User experience is now at the forefront of our web portals, with a common look and feel providing a seamless experience across more than 15 digital products on any device (including smartphones). The security has been proven to be state-of-the-art, and the products are designed to be fast and responsive. In this presentation you will learn how the team utilises NoOps (the No Operations paradigm) to build, operate and support these products while continuing to quickly and efficiently deliver new and innovative digital products.

  • The magnetotelluric (MT) method is increasingly being applied to map tectonic architecture and mineral systems. Under the Exploring for the Future (EFTF) program, Geoscience Australia has invested significantly in the collection of new MT data. The science outputs from these data are underpinned by an open-source data analysis and visualisation software package called MTPy. MTPy started at the University of Adelaide as a means to share academic code among the MT community. Under EFTF, we have applied software engineering best practices to the code base, including adding automated documentation and unit testing, code refactoring, workshop tutorial materials and detailed installation instructions. New functionality has been developed, targeted to support EFTF-related products, and includes data analysis and visualisation. Significant development has focused on modules to work with 3D MT inversions, including capability to export to commonly used software such as Gocad and ArcGIS. This export capability has been particularly important in supporting integration of resistivity models with other EFTF datasets. The increased functionality, and improvements to code quality and usability, have directly supported the EFTF program and assisted with uptake of MTPy among the international MT community. <b>Citation:</b> Kirkby, A.L., Zhang, F., Peacock, J., Hassan, R. and Duan, J., 2020. Development of the open-source MTPy package for magnetotelluric data analysis and visualisation. In: Czarnota, K., Roach, I., Abbott, S., Haynes, M., Kositcin, N., Ray, A. and Slatter, E. (eds.) Exploring for the Future: Extended Abstracts, Geoscience Australia, Canberra, 1–4.

  • This video demonstrates to viewers the importance and value of fit-for-purpose metadata, metadata standards, and metadata profiles.

  • The purpose of this document is to define an Emergency Management (EM) Metadata Profile Extension to ISO 19115-1:2014/AMD 1:2018, identifying the metadata required to accurately describe EM resources. The EM Metadata Profile is designed to support the documentation and discovery of EM datasets, services, and other resources. This version of the Profile was developed to reflect extensions made to the current version of the international metadata standard, ISO 19115-1:2014/AMD 1:2018.

  • The pace with which government agencies, researchers, industry, and the public must react to national and international challenges of an economic, environmental, and social nature is constantly changing and rapidly increasing. Responses to the global COVID-19 pandemic, the 2020 Australian bushfires and the 2021 flood crisis are recent examples of these requirements. Decisions are no longer made on information or data coming from a single source, discipline, or solitary aspect of life: the issues of today are too complex. Solving complex issues requires seamless integration of data across multiple domains, and understanding and consideration of potential impacts on businesses, the economy, and the environment. Modern technologies, easy access to information on the web, and an abundance of openly available data are not enough to overcome previous limitations in dealing with data and information. Data and software have to be Findable, Accessible, Interoperable and Reusable (FAIR); processes have to be transparent, verifiable and trusted. The approaches toward data integration, analysis, evaluation, and access require rethinking to: - Support building flexible, re-usable and re-purposable data and information solutions serving multiple domains and communities. - Enable timely and effective delivery of complex solutions for effective decision and policy making. The unifying factor for these events is location: everything happens somewhere at some time. Inconsistent representation of location (e.g. coordinates, statistical aggregations, and descriptions) and the use of multiple techniques to represent the same data make it difficult to spatially integrate multiple data streams, often from independent sources and providers. To use location for integration, location information needs to be embedded within the datasets and within the metadata describing those datasets, so that both become ‘spatially enabled’.