Machine Learning (ML) and Artificial Intelligence (AI) techniques and technologies continue to develop at a rapid pace and have demonstrated remarkable success across a broad range of application areas. In Cyber Security in particular, there have been numerous applications of ML, alleviating pressure on the bottleneck caused by limited availability of expert human cyber operators and analysts.
However, despite this ongoing success, significant challenges remain in ensuring the trustworthiness of ML systems. Recent work has shown that the use of ML can introduce additional vulnerabilities into a system, arising from weaknesses in the algorithms themselves (e.g. ML classifiers confidently misclassifying adversarial inputs) or from exploitation of weaknesses in the ML system's goals. Such vulnerabilities inevitably lead to a loss of trust in automated and autonomous ML systems. Security of ML is not a focus of traditional ML algorithm development, yet in domains such as cyber security there are incentivised malicious adversaries willing to game and exploit such ML vulnerabilities. More broadly than robustness, there are also concerns about the correctness of the predictions made by ML systems. These issues arise not only in Cyber Security, but in numerous other application areas for ML.
We are interested in improving the robustness and resilience both of ML algorithms and of the entire development pipeline of ML solutions. We are further interested in techniques for quantifying levels of correctness, robustness or resilience, and/or for producing additional outputs that attest to the correctness of predictions. We are therefore seeking proposals that enable significant advances in the science of increasing robustness, and of quantifying and enhancing trustworthiness, of ML concepts, techniques and technologies, with a particular focus on applicability to Cyber Security. Any proposal must consider the impact of the suggested approaches on, and possible trade-offs regarding, the performance (speed, precision, recall, …) of the ML systems.
This opportunity is open to all registered Australian Universities and Australian Publicly Funded Research Agencies.
- Successful applicants must be able to meet the milestones and timelines outlined in their submission.
- Successful applicants must enter into a Data61 University Collaboration Agreement.
- Successful applicants must enter into the appropriate contracting arrangement within 3 weeks of announcement.
Terms and Conditions
Proposals submitted will be assessed equally on the following criteria:
- Alignment to Defence strategy and the project priorities articulated in this document
- Future science criticality
- Collaboration depth (e.g. Collaboration with DST staff, Data61 staff, other universities, an industry partner, etc.)
- Delivery of outcomes (e.g. the ability of the proposal to deliver the agreed outcomes and milestones)
- Game changing potential to Defence
Please limit submissions to no more than 2000 words. Ensure that all contact details, and details of current and potential DST or Data61 collaborators and/or research partners, are on a separate page/covering sheet. Proposals will be de-identified during the selection process to eliminate any potential conflicts of interest.
Defence and Data61 reserve the right to fund all, some or none of the proposals received under this Call for Applications.
Contracts and Intellectual Property
Successful applicants will be required to enter into a Data61 University Collaboration Agreement and a subsidiary Collaborative Research Project Agreement with Data61 in order to access project funding. Data61 will enter into contracts with the lead party in each proposal.
Any IP generated as part of the projects will vest in Data61 unless otherwise agreed, and Defence will receive a licence for Commonwealth purposes only.
Any Commonwealth funding contributed to the projects will be paid upon successful completion of milestones, as negotiated by the parties. Where circumstances necessitate, a small payment may be made upon execution of the agreement, in accordance with Defence procurement rules.
How to Apply
Please submit via the DST portal.
Proposals are to be submitted by 4.30pm Australian Eastern Daylight Time (AEDT), 15 August 2018. Only proposals submitted via email to Cyber-NGTF@dst.defence.gov.au by the above deadline will be considered in this round.
For further information or assistance, please contact: