Accelerated Finite-time Learning and Control in Cyber-Physical Systems. Efficient learning and control in cyber-physical systems such as smart grids and robotic systems are essential for achieving economic and social benefits. This project aims to establish a breakthrough accelerated finite-time dynamics theory and technology to assist in delivering efficient learning and control. Expected outcomes include new distributed, accelerated finite-time dynamics-based learning and control algorithms and tools for optimal operation of cyber-physical systems. This should provide significant benefits, including a practical technology for industry applications in smart grids and robotic systems, and training of the next generation of engineers in this technology for Australia.
Small Scalable Natural Language Models using Explicit Memory. Deep neural networks have had spectacular success in natural language processing, seeing widespread deployment in automatic assistant devices in homes and cars, and across many valuable industries including finance, medicine and law. Fueling this success is the use of ever larger models, with exponentially increasing training resources and accompanying hardware and energy demands. This project aims to develop more compact models, based on the incorporation of an explicit searchable memory, which will dramatically reduce model size, hardware requirements and energy usage. This will make modern natural language processing more accessible, while also providing greater flexibility, allowing for more adaptable and portable technologies.