  • Geochronology is the vital fourth dimension of geological knowledge: it provides the temporal framework for understanding and modelling geological processes and rates of change. Incorporating geochronological 'observations and measurements' into interoperable geological data systems is thus a critical pursuit. Although there are several resources for storing and accessing geochronological data, there is no standard format for exchanging such data among users. Current systems are a mixture of comma-delimited text files, Excel spreadsheets and PDFs that assume prior specialist knowledge and frequently force the user to laboriously - and potentially erroneously - extract the required data manually. Geoscience Australia and partners are therefore developing a standard data exchange format for geochronological data ('geochronML') within the broader framework of Observations and Measurements and GeoSciML, which are an important facet of emerging international geoscience data format standards. Geochronology analytical processes and the resulting data present some challenging issues, as a rock 'age' is typically not a direct measurement but rather the interpretation of a statistical amalgam of several measurements, chosen with the aid of prior geological knowledge and analytical metadata. The level at which these data need to be exposed to a user varies greatly, even for the same user over the course of a project. GeochronML also attempts to provide a generic pattern that will support as wide a range of radioisotopic systems as possible. This presentation will discuss developments at Geoscience Australia and the opportunities for collaboration.
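
As a hedged illustration of why an 'age' is an interpreted statistic rather than a direct measurement, the sketch below pools several spot analyses into an inverse-variance weighted mean with an MSWD scatter check. The function name and the sample values are illustrative only; they are not part of geochronML.

```python
# Minimal sketch (hypothetical names and values): a pooled "age" as an
# inverse-variance weighted mean of individual spot analyses, with MSWD
# as a measure of scatter. Not part of the geochronML format itself.

def weighted_mean_age(ages, one_sigma):
    """ages and one_sigma are parallel lists of spot ages (Ma) and
    their 1-sigma uncertainties (Ma)."""
    weights = [1.0 / s ** 2 for s in one_sigma]
    total_w = sum(weights)
    mean = sum(w * a for w, a in zip(weights, ages)) / total_w
    # Mean Square of Weighted Deviates: ~1 suggests a single coherent population
    mswd = sum(w * (a - mean) ** 2 for w, a in zip(weights, ages)) / (len(ages) - 1)
    return mean, (1.0 / total_w) ** 0.5, mswd

# Five hypothetical zircon spot ages (Ma) with 1-sigma errors
ages = [1101.0, 1098.5, 1103.2, 1099.8, 1102.1]
errs = [1.5, 1.8, 1.2, 2.0, 1.6]
mean, err, mswd = weighted_mean_age(ages, errs)
print(f"weighted mean = {mean:.1f} +/- {err:.1f} Ma (MSWD = {mswd:.2f})")
```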

  • Digital technology and the Internet have contributed to the information explosion and, in part, to the widespread increase in the use of spatial information. Community needs for geoscientific information have consequently extended beyond the traditional areas of mineral and petroleum exploration, and geoscience is now recognised by society as having a part to play in achieving social wellbeing and environmental outcomes. This paper examines whether geoscience data providers are playing their part in the data explosion. It looks at how geoscience can be applied to real-world problems and questions whether data providers are up to the mark in satisfying the immediate expectations of users as well as initiating new areas of application. The discussion incorporates issues of price, accessibility, formats and data assemblage in relation to a hierarchy of need for decision making.

  • Legacy product - no abstract available

  • Geoscience Australia (GA) produces geoscientific and geospatial data for the benefit of the Australian government and community: to inform public policy, to promote the development of Australia's economy, to assist environmental management, and to help manage and mitigate natural hazards. Users of GA's data want to know that the data are produced to known standards using open and accountable processes, and that they come from a unique and reliable source. Single Point of Truth (SPOT) is Geoscience Australia's standard for the processes that produce data. The SPOT methodology describes a consistent approach to transforming an existing data theme into a SPOT; the same methodology can be used to develop a SPOT for a new data theme.

  • Earth comprises systems of enormous complexity that sustain all life and control the distribution of our mineral, energy and water resources. Earth scientists are increasingly moving away from single-domain research on isolated parts of these intricate systems towards multidisciplinary, computationally intensive, integrated methodologies to model and simulate the real-world complexities of earth systems science. Simultaneously, developments in information technology are increasing the capacity of computational systems to credibly simulate complex systems. Real-world Solid Earth and Environmental Science data sets are extremely heterogeneous, complex and large, currently in the order of terabytes (10¹² bytes). However, the size and complexity of geoscience data sets are also increasing exponentially, as more powerful modern computing systems combine with enhanced engineering capacity to design and build automated instruments that collect more data and new data types. We are rapidly moving into an era when Earth Scientists will need the capacity to analyse petabyte (10¹⁵ bytes) databases if they are to realistically model and simulate complex earth processes. Although digital geoscientific data sets are becoming increasingly available over the Internet, current Internet technologies only allow for the downloading of data (if the connection is fast enough): integration, processing and analysis then have to take place locally. As data sets get larger and more complex, large computational resources are required to process them effectively. Such resources are increasingly only available to the major industry players, which in turn creates a strong bias against small-to-medium enterprises, as well as many university researchers. For those who do not have access to large-scale computing resources, analysis of these voluminous data sets has to be compromised by dividing the data set into smaller units, accepting sub-optimal solutions and/or introducing sub-optimal approximations. It is clear that if we are to begin grappling with accurate analysis of large-scale geoscientific data sets to enable sustainable management of our mineral, energy and water resources, then current computational infrastructures are no longer viable.
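
The "dividing the data set into smaller units" compromise can be sketched as out-of-core tiling: the grid below is memory-mapped and reduced tile by tile, so only one slice is ever paged into memory at a time. The file name, data type and grid shape are hypothetical placeholders, not a real data set.

```python
import numpy as np

# Minimal sketch of the "divide into smaller units" compromise:
# reduce a grid too large for RAM by streaming it in row tiles.
# File name, dtype and shape are hypothetical placeholders.
ROWS, COLS, TILE = 200_000, 100_000, 10_000
grid = np.memmap("huge_geophysics_grid.f32", dtype=np.float32,
                 mode="r", shape=(ROWS, COLS))

total, count = 0.0, 0
for r in range(0, ROWS, TILE):
    tile = grid[r:r + TILE]                  # only this slice is paged in
    total += float(tile.sum(dtype=np.float64))
    count += tile.size

print("grid mean:", total / count)           # one global statistic, no full load
```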

  • We propose an automated capture system that follows the fundamental scientific methodology. It starts with the instrument that captures the data, uses web services to make standardised data reduction programs more widely accessible, and finally uses internationally agreed data transfer standards to make geochemical data seamlessly accessible online from a series of internationally distributed certified repositories. The Australian National Data Service (http://www.ands.org.au/) is funding a range of data capture solutions to ensure that the data creation and data capture phases of research are fully integrated, enabling effective ingestion into research data and metadata stores at the institution or elsewhere. It is also developing a national discovery service that enables access to data in institutional stores with rich context. No data are stored in this system, only metadata with pointers back to the original data; this lets researchers keep their own data while enabling access to many repositories at once. Such a system will require standardisation at all phases of the analytical geochemistry process. The geochemistry community needs to work together to develop standards for attributes as the data are collected from the instrument, to develop more standardised processing of the raw data, and to agree on what is required for publishing. An online collaborative workspace such as this would be ideal for geochemical data, and the provision of standardised, open-source software would greatly enhance the persistence of individual geochemistry data collections and facilitate reuse and repurposing. This conforms to the guidelines from Geoinformatics for Geochemistry (http://www.geoinfogeochem.org/), which require metadata on how the samples were analysed.
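
A minimal sketch of the "metadata only, with pointers back to the original data" pattern described above follows; the record fields and URLs are illustrative assumptions, not the actual ANDS metadata schema.

```python
from dataclasses import dataclass

@dataclass
class MetadataRecord:
    """Discovery-service entry: describes a data collection but stores
    no data. Field names are illustrative, not the ANDS schema."""
    identifier: str       # persistent identifier for the collection
    title: str
    custodian: str        # institution that keeps the actual data
    analysis_method: str  # how the samples were analysed
    data_url: str         # pointer back to the original data store

record = MetadataRecord(
    identifier="example.org/collection/42",
    title="Whole-rock geochemistry, hypothetical survey",
    custodian="Example University data store",
    analysis_method="XRF, pressed pellet",
    data_url="https://repository.example.edu/datasets/42",
)
print(record.title, "->", record.data_url)
```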

  • Geoscience data standards as a field of research may come as a surprise to many geoscientists, who probably think of it as a dull, peripheral issue of little relevance to their domain. However, the subject is gaining rapidly in importance as the information revolution begins to take hold, because ultimately billions of dollars' worth of information are at stake. In this article we take a look at what has happened recently in this field, where we think it is heading, and AGSO's role in national geoscience standards.

  • This documentation manual for the national mineral deposits dataset provides a description of AGSO's mineral deposit database (OZMIN): its structure, the main data and authority tables used by OZMIN, database table definitions, details of the Microsoft Access version of the database, and a listing of the deposits in the dataset.
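
To illustrate the data-table/authority-table pattern this manual documents, here is a hedged sketch joining a hypothetical deposits table to a commodity authority table. All table and column names are invented for illustration and do not reflect the documented OZMIN schema.

```python
import sqlite3

# Hedged sketch of the data-table / authority-table pattern used in
# deposit databases such as OZMIN. All names here are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE commodity_auth (code TEXT PRIMARY KEY, name TEXT);
CREATE TABLE deposit (id INTEGER PRIMARY KEY, name TEXT,
                      commodity_code TEXT REFERENCES commodity_auth(code));
INSERT INTO commodity_auth VALUES ('AU', 'Gold'), ('CU', 'Copper');
INSERT INTO deposit VALUES (1, 'Example Hill', 'AU'),
                           (2, 'Sample Creek', 'CU');
""")
for deposit_name, commodity in con.execute(
        """SELECT d.name, a.name
           FROM deposit d JOIN commodity_auth a
             ON d.commodity_code = a.code"""):
    print(deposit_name, "-", commodity)
```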

  • GeoSciML is the international standard for the transfer of digital geological map and relational database data. GeoSciML was developed over the past decade by the IUGS Commission for the Management and Application of Geoscience Information (CGI), and was adopted as an Open Geospatial Consortium (OGC) standard in June 2016. Ratification as an official OGC standard marked a coming of age for GeoSciML: it now meets the highest standards for documentation and current best practice for interoperable data transfer. GeoSciML is the preferred standard for geoscience data sharing initiatives worldwide, such as OneGeology, the European INSPIRE directive, the Australian Geoscience Portal, and the US Geoscience Information Network (USGIN). GeoSciML is also used by OGC's GroundwaterML data standard [1] and CGI's EarthResourceML standard [2]. Development of GeoSciML version 4 drew heavily on user experiences with version 3.2, which was released in 2013 [3]. Although the GeoSciML v3 data model was conceptually sound, its XML schema implementation was considered overly complex for the general user. Version 4 development therefore focussed on designing simpler XML schemas that allow data providers and users to interact with data at various levels of complexity. As a result, GeoSciML v4 provides three levels of user experience: (1) simple map portrayal; (2) GeoSciML-Basic, for common age and lithology data for geological features; and (3) GeoSciML-Extended, which extends GeoSciML-Basic to deliver more detailed and complex relational data. As in GeoSciML v3, additional GeoSciML v4 schemas also extend the ISO Observations & Measurements standard to cover geological boreholes, sampling, and analytical measurements. The separate levels of GeoSciML also make it easier for software vendors to develop capabilities to consume relatively simple GeoSciML data without having to deal with the full range of complex GeoSciML schemas. Previously mandatory elements of GeoSciML, which were found to be overly taxing on users in version 3, are now optional in version 4. GeoSciML v4 comes with Schematron validation scripts that user communities can use to create profiles of GeoSciML suited to their particular needs. For example, the European INSPIRE community has developed Schematrons for web service validation that require its users to populate otherwise-optional GeoSciML-Basic elements and to use particular community vocabularies for geoscience terminology. Online assistance for data providers using GeoSciML is now better than ever, with user communities such as OneGeology, INSPIRE, and USGIN providing user guides explaining how to create simple and complex GeoSciML web services. CGI also provides a range of standard vocabularies that can be used to populate GeoSciML data services. Full documentation and user guides are at www.geosciml.org.
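
As a hedged sketch of consuming a GeoSciML-Basic web service: the endpoint below is a placeholder, the request parameters follow common OGC WFS 2.0 usage, and the namespace URIs are those published for GeoSciML 4.1 (verify them against a real service's capabilities document before relying on this).

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

# Hedged sketch: fetch GeoSciML-Basic GeologicUnit features over WFS 2.0.
# The endpoint is a placeholder; substitute a real OneGeology/INSPIRE service.
ENDPOINT = "https://example.org/geoserver/wfs"
params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "gsmlb:GeologicUnit",   # GeoSciML-Basic feature type
    "count": "5",
}
with urllib.request.urlopen(ENDPOINT + "?" + urllib.parse.urlencode(params)) as resp:
    tree = ET.parse(resp)

# Print the gml:name of each returned unit (namespace URIs per GeoSciML 4.1)
ns = {"gsmlb": "http://www.opengis.net/gsml/4.1/GeoSciML-Basic",
      "gml": "http://www.opengis.net/gml/3.2"}
for unit in tree.iter("{%s}GeologicUnit" % ns["gsmlb"]):
    name = unit.find("gml:name", ns)
    print(name.text if name is not None else "(unnamed unit)")
```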

  • The Australian Geological Survey Organisation (AGSO) presents its solutions to mapping and GIS on the Internet, using software based on commercial and open-source products. A distributed web mapping system is demonstrated and concepts of distributed web mapping are discussed, along with systems for online delivery of spatial data. AGSO has been providing Internet access to spatial data since 1996. AGSO is the main repository for national geoscientific data, and services a wide range of clients across industry, government and the general public. Data provided range from point data, such as site descriptions and scientific analyses of samples, to line, polygon and grid data, such as geological and geophysical surveys and associated maps. AGSO currently holds 500 MB of GIS data and a similar amount of image data on its web site; these holdings are expected to grow to several terabytes over the next few years. A primary role of AGSO is to provide its data to clients and stakeholders as efficiently as possible, hence its choice of Internet delivery. The major obstacle to supplying large volumes of data over the Internet is bandwidth: many AGSO clients are in remote locations with low-bandwidth connections. Possible solutions to this problem are presented. Examples of AGSO web tools are available at http://www.agso.gov.au/map/
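
The bandwidth constraint is one reason web mapping serves rendered images rather than raw data: only a small raster crosses a slow link. Below is a minimal sketch of such a request via the standard OGC WMS GetMap interface; the endpoint and layer name are placeholders, not a live AGSO service.

```python
import urllib.parse
import urllib.request

# Hedged sketch: request one rendered map image via OGC WMS GetMap.
# Endpoint and layer name are placeholders, not a real AGSO service.
ENDPOINT = "https://example.org/wms"
params = {
    "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
    "LAYERS": "geology_250k",
    "SRS": "EPSG:4326",
    "BBOX": "112,-44,154,-10",     # roughly the Australian continent
    "WIDTH": "800", "HEIGHT": "600",
    "FORMAT": "image/png",
}
url = ENDPOINT + "?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(url) as resp:
    with open("map.png", "wb") as out:
        out.write(resp.read())     # a small image, not the full dataset
```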