Making human place knowledge digestible by computers. This project aims to develop the tools that will enable people to interact intuitively with computers about places and the relations between places. People understand their environment in a different way to computers; they think of places and their relations, while computers use coordinates and maps. People’s interaction with maps is cognitively costly and error-prone, which is becoming untenable in situations needing time-critical decision making. The project will revolutionise the design of information services where computers deal with humans and location in time-critical or stressful situations, including emergency calls, disaster response and local search queries. The uptake of this design by industry will lead to economic benefits as well as a safer society living in a smarter environment.
Stochastic Construction of Error Correcting Codes with Application to Digital Communications. Modern society would be unrecognisable without error correcting codes; mobile telephones, storage devices such as DVDs and high speed data communications simply would not exist. Yet most theoretical results on error correcting codes are asymptotic in nature and ignore computational complexity issues, that is, they are not representative of many real life situations. By building on recent breakthroughs in statistics and stochastic optimisation, this project will develop algorithms for designing optimised error correcting codes subject to realistic finite data length and computational complexity constraints. Successful outcomes will lead to enhanced data communications and storage, greatly benefiting industry and consumers alike.
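The idea of stochastic code construction can be illustrated with a toy sketch. This is not the project's actual algorithm: it simply samples random generator matrices for a short binary linear code and keeps the candidate with the largest minimum Hamming distance, a crude stand-in for optimising a finite-length code under a fixed computational budget (here, the number of trials). All names and parameters are illustrative.

```python
# Toy sketch: stochastic search for a short (k=3, n=7) binary linear code.
# Randomly sample generator matrices and keep the one whose code has the
# largest minimum Hamming distance -- a proxy for error-correcting power.
import itertools
import random

def min_distance(G, k, n):
    """Minimum Hamming weight over all non-zero codewords of generator G."""
    best = n + 1
    for msg in itertools.product([0, 1], repeat=k):
        if not any(msg):
            continue  # skip the all-zero message
        word = [sum(msg[i] * G[i][j] for i in range(k)) % 2 for j in range(n)]
        best = min(best, sum(word))
    return best

def stochastic_code_search(k=3, n=7, trials=200, seed=0):
    rng = random.Random(seed)
    best_G, best_d = None, -1
    for _ in range(trials):
        G = [[rng.randint(0, 1) for _ in range(n)] for _ in range(k)]
        d = min_distance(G, k, n)
        if d > best_d:
            best_G, best_d = G, d
    return best_G, best_d

G, d = stochastic_code_search()
print("best minimum distance found:", d)
```

Real finite-length design would replace the exhaustive distance computation with Monte Carlo estimates of block error rate over a noisy channel, which is where the statistical machinery the abstract mentions comes in.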
A Unified Grid Programming Methodology for Global e-Science. This project will contribute to the national benefit in three important ways. First, we will build a set of novel e-Science applications as demonstrator projects in areas of national priority. These will have enormous economic impact in areas ranging from environmental management to health. Second, we will build software infrastructure that will have both commercial and strategic value in its own right. Third, we shall build a critical mass of expertise that bridges the physical sciences and computer science. The support provided to this proposal will allow multi-disciplinary teams to address scientific problems of significant scale.
Preventing sensitive data exfiltration from insiders. Confidential data such as military secrets or intellectual property must never be disclosed outside the organisation; formally protecting against data exfiltration by insiders is a major challenge. This project aims to develop a systematic, pattern-matching-based methodology for detecting data exfiltration in database systems. We will devise highly accurate detection tools and secure provenance techniques that can effectively protect against insider attacks. The outcomes of the project will incorporate new security constraints and policies raised by emerging technologies to enable better protection of sensitive information.
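A minimal sketch of what pattern-matching detection over database activity might look like. The patterns, thresholds, and log format below are purely illustrative assumptions, not the project's methodology: suspicious query shapes (full-table reads, dumps to file, very large result limits) are flagged by regular expressions over a query log.

```python
# Illustrative sketch: flag possibly exfiltrating queries in a log by
# pattern matching. Patterns and thresholds are hypothetical examples.
import re

SUSPICIOUS = [
    re.compile(r"select\s+\*\s+from", re.I),   # full-table reads
    re.compile(r"into\s+outfile", re.I),       # dump results to a file
    re.compile(r"limit\s+\d{5,}", re.I),       # very large result sets
]

def flag_queries(log_lines):
    """Return the log lines matching any suspicious pattern."""
    return [line for line in log_lines
            if any(p.search(line) for p in SUSPICIOUS)]

log = [
    "SELECT name FROM staff WHERE id = 7",
    "SELECT * FROM customers INTO OUTFILE '/tmp/c.csv'",
    "SELECT email FROM users LIMIT 100000",
]
for q in flag_queries(log):
    print("flagged:", q)
```

A production system would of course combine such signatures with provenance tracking and behavioural baselines, as the abstract suggests, rather than rely on static patterns alone.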
A Generic Framework for Verifying Machine Learning Algorithms. This project aims to discover new ways to verify whether decisions made by Artificial Intelligence and Machine Learning algorithms comply with the specifications set by their designers and/or regulatory bodies. The project also provides new methods to align algorithm decisions when they are found to be non-compliant. The outcomes will include new machine learning theories and frameworks for algorithmic assurance. The significance of the project is that it will offer a crucial platform for certifying algorithms and thus help society and businesses choose the right Artificial Intelligence algorithms.
Exploiting Geometries of Learning for Fast, Adaptive and Robust AI. This project aims to uniquely exploit geometric manifolds in deep learning to advance the frontier of Artificial Intelligence (AI) research and applications in cybersecurity and general cognitive tasks. It expects to develop new theories, algorithms, tools, and technologies for machine learning systems that are fast, adaptive, lifelong and robust, even with limited supervision. Expected outcomes will enhance Australia's capability and competitiveness in AI, and deliver robust and trustworthy learning technology. The project should provide significant benefits not only in advancing scientific and translational knowledge but also in accelerating AI innovations, safeguarding cyberspace, and reducing defence expenditure in Australia.
Resource Allocation for High-Volume Streaming Data in Data Centers. Almost all chip vendors are producing new hardware accelerators by combining several units onto a single main board, which makes the execution of parallel and distributed run-time primitives inefficient and hard to scale. This project aims to develop innovative ways of building incremental and iterative computations over massive data sets in a cluster of heterogeneous systems. This will provide a significant reduction of performance bottlenecks when running heavily distributed data-driven applications. Expected outcomes will include resource management algorithms that optimise performance at large scale. The project will benefit many areas, including running stateful iterative stream-based data-analysis applications in data centres.
The dog that didn't bark: a Bayesian account of reasoning from censored data. This project aims to develop and test a new computational theory of inductive reasoning. Inductive reasoning involves extending knowledge from known to novel instances, and is a central component of intelligent behaviour. This project will address the cognitive mechanisms that allow people to draw inferences based on both observed and censored evidence. The project intends to test the model through an extensive program of experimental investigation and computational modelling. The anticipated benefits include an enhanced understanding of human inference, especially in domains such as the evaluation of forensic or financial evidence, where data censoring is common.
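The key statistical point, that censored observations still carry evidence, can be shown with a small sketch. The setup is an assumed toy example, not the project's model: waiting times follow an exponential distribution with unknown rate, some trials were cut off at a known time (we only learn t > c), and a grid posterior over the rate shows that including the censored trials shifts the estimate.

```python
# Toy sketch: Bayesian updating from right-censored data.
# Fully observed times contribute the density; censored trials contribute
# the survival probability exp(-rate * cutoff), so they are not ignorable.
import math

def log_posterior(rate, observed, n_censored, cutoff):
    if rate <= 0:
        return float("-inf")
    lp = 0.0  # flat prior over the grid
    for t in observed:                       # observed: log density
        lp += math.log(rate) - rate * t
    lp += -rate * cutoff * n_censored        # censored: log survival
    return lp

def grid_posterior_mean(observed, n_censored, cutoff, grid):
    logs = [log_posterior(r, observed, n_censored, cutoff) for r in grid]
    m = max(logs)
    weights = [math.exp(l - m) for l in logs]
    return sum(r * w for r, w in zip(grid, weights)) / sum(weights)

grid = [i / 100 for i in range(1, 501)]      # candidate rates 0.01 .. 5.00
obs = [0.8, 1.1, 0.5]
mean_ignoring = grid_posterior_mean(obs, 0, 2.0, grid)
mean_with_cens = grid_posterior_mean(obs, 2, 2.0, grid)
print(mean_with_cens < mean_ignoring)        # censoring lowers the estimate
```

The two censored trials (each surviving past t = 2.0) penalise large rates, pulling the posterior mean down, which is exactly the "dog that didn't bark" effect: the absence of an observation is itself informative.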
Adaptive Vector Filters for the Restoration of Digital Colour Images. Colour image restoration has very important applications in colour cameras, robotic navigation, video security systems, multimedia communications, and digital TV broadcast. Conventional vector filters have difficulty in simultaneously achieving three major objectives: noise suppression, detail preservation, and chromaticity retention. This project aims at formulating and evaluating novel adaptive filters to accomplish the three objectives simultaneously. A theoretical basis will be investigated for devising different adaptive filters to restore colour images and video sequences contaminated by various types of noise. The new filters will provide significant improvement over existing techniques, yielding better tools and packages for colour image processing and restoration applications.
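For context, the classic vector median filter (VMF) is one of the conventional baselines such adaptive filters would improve on. This minimal sketch treats each pixel as an RGB vector and replaces the centre pixel of a window with the neighbourhood colour minimising total distance to all neighbours, which rejects impulsive noise without mixing channel values (and so preserves chromaticity):

```python
# Minimal sketch of a vector median filter over one 3x3 window.
# The vector median is the window pixel whose summed Euclidean distance
# to all other pixels in the window is smallest.
def vector_median(window):
    """window: list of (r, g, b) tuples; returns the vector median pixel."""
    def total_dist(p):
        return sum(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
                   for q in window)
    return min(window, key=total_dist)

# A 3x3 window: eight dark-grey pixels and one impulsive red outlier.
window = [(10, 10, 10)] * 8 + [(255, 0, 0)]
print(vector_median(window))  # -> (10, 10, 10): the outlier is rejected
```

Unlike applying a scalar median per channel, the output is always an actual pixel from the window, which is why vector filters retain chromaticity; the fixed behaviour of plain VMF is also why it blurs fine detail, motivating the adaptive variants the project proposes.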
Economic Scheduling for Efficient Management of Clusters and their Cooperative Federation. Clusters of commodity computers have emerged as mainstream parallel and distributed platforms for high-performance computing. They are presented together as a single, unified resource to the end users by middleware technologies such as resource management and scheduling (RMS) systems. However, existing cluster RMS systems continue to use system-centric models rather than utility models for the management and allocation of resources. There is also little emphasis on the construction of a cooperative federation of clusters to facilitate transparent sharing of resources. To enhance the value delivered by shared clusters, we propose the use of a computational economy metaphor in resource management. This project aims to develop (A) computational-economy-based scheduling policies for allocation of resources and (B) a software infrastructure for creation of a cooperative federation of distributed clusters.
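The difference between system-centric and utility-driven allocation can be sketched in a few lines. The structures below (`Job`, `Cluster`, the budget/price fields) are illustrative assumptions, not the project's design: each job states a budget, each federated cluster advertises a price, and the scheduler places each job on the cheapest cluster that fits within its budget and capacity.

```python
# Illustrative sketch of economy-based scheduling across a federation of
# clusters. All names, prices and budgets are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Cluster:
    name: str
    price_per_cpu_hour: float
    free_cpus: int

@dataclass
class Job:
    name: str
    cpus: int
    hours: float
    budget: float

def economic_schedule(jobs, clusters):
    """Assign each job to the cheapest cluster it can afford and fit on."""
    plan = {}
    for job in sorted(jobs, key=lambda j: -j.budget):        # richest first
        for c in sorted(clusters, key=lambda c: c.price_per_cpu_hour):
            cost = c.price_per_cpu_hour * job.cpus * job.hours
            if cost <= job.budget and c.free_cpus >= job.cpus:
                c.free_cpus -= job.cpus
                plan[job.name] = (c.name, cost)
                break                                        # job is placed
    return plan

clusters = [Cluster("A", 0.10, 16), Cluster("B", 0.25, 64)]
jobs = [Job("render", 8, 2.0, 5.0), Job("sim", 16, 1.0, 2.0)]
print(economic_schedule(jobs, clusters))
```

A system-centric RMS would instead optimise throughput or utilisation with no notion of what a job is worth to its owner; the utility view lets price signals arbitrate contention across the federation, which is the contrast the abstract draws.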