data standards
-
With the increasing emphasis on electronic rather than paper products, the need for adequate metadata is becoming more and more pressing. The new AGSO Catalog is designed to address this problem at the corporate level. Developed from the AGSO Products Database, the AGSO Catalog is designed to encompass most of AGSO's outputs, datasets and resources, with the help of various intranet and Web interfaces. Projects or authors must initiate Catalog entries: without acceptable metadata, a product cannot be sold by the Sales Centre, and permission to publish will not be granted. The Catalog is the key to future systems of information distribution and sales. It will permit us to go directly from the metadata to the electronically stored objects, thus enabling automated information distribution and electronic commerce.
-
GeoSciML version 3.0 (http://www.geosciml.org), released in late 2011, is the latest version of the CGI-IUGS* Interoperability Working Group geoscience data interchange standard. The new version is a significant upgrade and refactoring of GeoSciML v2, which was released in 2008. GeoSciML v3 has already been adopted by several major international interoperability initiatives, including OneGeology, the EU INSPIRE program, and the US Geoscience Information Network, as their standard data exchange format for geoscience data. GeoSciML v3 makes use of recently upgraded versions of several Open Geospatial Consortium (OGC) and ISO data transfer standards, including GML v3.2, SWE Common v2.0, and Observations and Measurements v2 (ISO 19156). The GeoSciML v3 data model has been refactored from a single large application schema with many packages into a number of smaller, related application schema modules with individual namespaces. This refactoring allows the use and future development of modules of GeoSciML (e.g. GeologicUnit, GeologicStructure, GeologicAge, Borehole) in smaller, more manageable units. As a result of this refactoring and the integration with new OGC and ISO standards, GeoSciML v3 is not backward compatible with previous GeoSciML versions. The scope of GeoSciML has been extended in version 3.0 to include new models for geomorphological data (a Geomorphology application schema), and for geological specimens, geochronological interpretations, and metadata for geochemical and geochronological analyses (a LaboratoryAnalysis-Specimen application schema). In addition, there is better support for borehole data, and the PhysicalProperties model now supports a wider range of petrophysical measurements. The previously used CGI_Value data type has been superseded by externally governed data types provided by OGC's SWE Common v2 and GML v3.2 standards.
The GeoSciML v3 release includes worked examples of best practice in delivering geochemical analytical data using the Observations and Measurements (ISO 19156) and SWE Common v2 models. The GeoSciML v3 data model does not include vocabularies to support the data model. However, it does provide a standard pattern for referencing controlled vocabulary concepts using HTTP URIs. The international GeoSciML community has developed distributed RDF-based geoscience vocabularies that can be accessed by GeoSciML web services using the standard pattern recommended in GeoSciML v3. GeoSciML v3 is the first version of GeoSciML to be accompanied by web service validation tools using Schematron rules. For example, these validation tools may check for compliance of a web service with a particular profile of GeoSciML, or for logical consistency of data content that cannot be enforced by the application schemas. This validation process will support accreditation of GeoSciML services and a higher degree of semantic interoperability.
* International Union of Geological Sciences Commission for Management and Application of Geoscience Information (CGI-IUGS)
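The HTTP-URI vocabulary-reference pattern described above can be sketched as follows. This is an illustrative fragment only: the element name and the vocabulary URI are hypothetical placeholders, not normative GeoSciML v3 content, and only the general xlink:href pattern is taken from the abstract.

```python
from xml.etree import ElementTree as ET

XLINK = "http://www.w3.org/1999/xlink"
ET.register_namespace("xlink", XLINK)

# Instead of embedding free text, the element points at a controlled
# vocabulary concept via an HTTP URI (URI below is a placeholder).
lithology = ET.Element("lithology")
lithology.set(f"{{{XLINK}}}href",
              "http://resource.example.org/classifier/lithology/granite")

print(ET.tostring(lithology, encoding="unicode"))
```

Because the reference is a resolvable HTTP URI rather than a literal string, different services can point at the same concept and a client can dereference it to obtain the (RDF-based) vocabulary entry.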
-
Geoscience data standards as a field of research may come as a surprise to many geoscientists, who probably think of it as a dull peripheral issue of little relevance to their domain. However, the subject is gaining rapidly in importance as the information revolution begins to take hold, as ultimately billions of dollars' worth of information are at stake. In this article we take a look at what has happened recently in this field, where we think it is heading, and AGSO's role in national geoscience standards.
-
In this age of state-of-the-art devices producing analytical results with little input from analytical specialists, how do we know that the results produced are correct? When reporting the result of a measurement of a physical quantity, it is obligatory that some quantitative indication of the quality of the result be given so that those who use it can assess its reliability. Without such an indication, measurement results cannot be compared, either among themselves or with reference values given in a specification or standard. It is therefore necessary that there be a readily implemented, easily understood, and generally accepted procedure for characterising the quality of a result of a measurement, that is, for evaluating and expressing its 'uncertainty'. The concept of 'uncertainty' as quantifiable attribute is relatively new in the history of measurement, although error and error analysis have long been part of the practice of measurement science or 'metrology'. It is now widely recognised that, when all of the known or suspected components of error have been evaluated and the appropriate corrections have been applied, there still remains an uncertainty about the correctness of the stated result, that is, a doubt about how well the result of the measurement represents the value of the quantity being measured. This presentation will discuss the latest practices for the production of 'reliable' geochemical data that are associated with small measurement uncertainties, and will provide an overview of current understanding of metrological traceability and the proper use of reference materials. Correct use of reference materials will be discussed, as well as the role of measurement uncertainty and how it is affected by such issues as sample preparation, sample heterogeneity and data acquisition.
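The standard way of expressing such an uncertainty (following the general GUM approach) is to combine independent standard uncertainty components in quadrature and report an expanded uncertainty at a stated coverage factor. A minimal sketch, with hypothetical component names and values:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares of independent standard uncertainty components."""
    return math.sqrt(sum(u ** 2 for u in components))

def expanded_uncertainty(u_c, k=2.0):
    """Expanded uncertainty U = k * u_c; k = 2 gives roughly 95 % coverage
    for a normally distributed result."""
    return k * u_c

# Hypothetical components (in ppm) from sample preparation, sample
# heterogeneity, and instrument calibration for one analyte:
u_c = combined_standard_uncertainty([0.8, 1.2, 0.5])
U = expanded_uncertainty(u_c)          # result reported as x ± U (k = 2)
```

Note that quadrature combination assumes the components are independent; correlated effects (e.g. a shared calibration) require covariance terms.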
-
Proceedings of the Second National Forum on GIS in the Geosciences, 29 - 31 March 1995, held at the National Library of Australia.
-
Part-page item on matters relating to stratigraphic nomenclature and the Australian Stratigraphic Units Database (ASUD). This column (59) discusses names that do not meet the recommendations of the current International Stratigraphic Guide, and why they are in the ASUD database. ISSN 0312 4711
-
The Australian National Gravity Database (ANGD) contains over 1.8 million gravity observations from over 2,000 surveys conducted in Australia over the last 80 years. Three processes are required to correct these observations for the effects of the surrounding topography: firstly a Bouguer correction (Bullard A), which approximates the topography as an infinite horizontal slab; secondly a correction to that horizontal slab for the curvature of the Earth (Bullard B); and thirdly a terrain correction (Bullard C), which accounts for the undulations in the surrounding topography. These three corrections together produce complete Bouguer anomalies. Since February 2008, a spherical-cap Bouguer anomaly calculation has been applied to data extracted from the ANGD. This calculation applies the Bullard A and Bullard B corrections. Terrain corrections, Bullard C, have now been calculated for all terrestrial gravity observations in the ANGD, allowing the calculation of complete Bouguer anomalies. These terrain corrections were calculated using the Shuttle Radar Topography Mission 3 arc-second digital elevation data. The complete Bouguer anomalies calculated for the ANGD provide users of the data with a more accurate representation of crustal density variations through the application of a more accurate Earth model to the gravity observations.
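The Bullard A step above is the classical infinite-slab formula, 2πGρh, and can be sketched as follows. This is an illustrative sketch only: the density value is the conventional crustal figure, and the Bullard B (spherical-cap curvature) and Bullard C (DEM-based terrain) corrections are deliberately not implemented here, since they require the more elaborate models described in the abstract.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
MGAL_PER_MS2 = 1e5   # 1 m/s^2 = 100,000 mGal

def bullard_a(height_m, density=2670.0):
    """Bullard A (simple Bouguer slab) correction in mGal.

    Approximates the topography between the station and the datum as an
    infinite horizontal slab of the given density (kg/m^3) and thickness
    equal to the station height (m): g = 2 * pi * G * rho * h.
    """
    return 2.0 * math.pi * G * density * height_m * MGAL_PER_MS2

# A station 100 m above the datum, conventional density 2670 kg/m^3:
print(round(bullard_a(100.0), 2))  # ≈ 11.2 mGal
```

The linearity in height (about 0.112 mGal per metre at 2670 kg/m^3) is why uncorrected elevation errors map directly into Bouguer anomaly errors.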
-
Marine science is expensive. Duplication of research activities is potentially money wasted. Not being aware of other marine science studies could call into question the validity of findings made in single-discipline studies. A simple means of discovery is needed. The development of Earth browsers (principally Google Earth) and KML (Keyhole Markup Language) files offers a possible solution. Google Earth is easy to use, and KML files are relatively simple ASCII, XML-tagged files that can encode locations (points, lines and polygons), relevant metadata for presentation in descriptive 'balloons', and links to digital sources (data, publications, web pages, etc.). A suite of studies will be presented showing how information relating to investigations at a point (e.g. observation platform), along a line (e.g. ship-borne survey) or over a region (e.g. satellite imagery) can be presented in a small (10 Kbyte) file. The information will cover a range of widely used data types including seismic data, underwater video, image files, documents and spreadsheets. All will be sourced directly from the web and can be downloaded from within the browser to one's desktop for analysis with appropriate applications. To be useful, this methodology requires data and metadata to be properly managed, and a degree of cooperation between major marine science organizations, which could become 'sponsors' of the principal marine science disciplines (i.e. oceanography, marine biology, geoscience). This need not be a complex task in many cases. The partitioning of the sciences is not important, so long as the information is being managed effectively and its existence is widely advertised. KML files provide a simple way of achieving this. The various discipline-based KML files could be hosted by an umbrella organization such as the AODCJF, enabling it to become a 'one-stop-shop' for marine science data.
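The kind of small, self-describing KML file described above can be generated with a few lines of code. A minimal sketch: the station name, coordinates, and data URL below are hypothetical placeholders, and only one Placemark (a point observation) is shown.

```python
# Hand-rolled KML for one survey point with a descriptive metadata
# "balloon" and a link to a digital source. All values are illustrative.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <description><![CDATA[{description}]]></description>
      <Point><coordinates>{lon},{lat},0</coordinates></Point>
    </Placemark>
  </Document>
</kml>"""

def survey_point_kml(name, lon, lat, description):
    """Render one observation platform as a single-Placemark KML document."""
    return KML_TEMPLATE.format(name=name, lon=lon, lat=lat,
                               description=description)

doc = survey_point_kml(
    "Mooring A (hypothetical)", 151.2, -33.9,
    'Underwater video survey. <a href="https://example.org/data">Data</a>')
```

Because the description is CDATA-wrapped HTML, the balloon can carry formatted metadata and live links to the underlying data, which is what makes the file a discovery tool rather than just a map pin.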
-
Codes, guidelines, and standard practices for naming and describing Australian stratigraphic units have been discussed for more than 60 years since the Australian and New Zealand Association for the Advancement of Science (ANZAAS) set up a Research Committee on Stratigraphic Nomenclature in 1946. Like today's Australian Stratigraphy Commission, its aims were 'to encourage the orderly use of names and definitions for stratigraphic units'. .......
-
This extended abstract describes the 1:1 million scale Surface Geology of Northern Territory digital dataset and advances in digital data delivery via WMS/WFS services and the GeoSciML geological data model.