Australia’s top national security bureaucrat used an AI chatbot to ghostwrite speeches and messages to his colleagues, internal documents show.
As Finance Minister Katy Gallagher announces a significant government-wide AI plan to promote its use in the public service, Crikey has used freedom of information (FOI) laws to reveal how a government employee is already using the nascent generative AI technology in their work.
Hamish Hansford is the Department of Home Affairs’ head of national security, the Commonwealth counter-terrorism coordinator and national counter foreign interference coordinator.
Following a public statement by Hansford about his use of Microsoft’s AI chatbot Copilot, Crikey obtained 50 documents containing his prompts and responses with the tool over a two-month period via an FOI request.
These logs show how one of Australia’s most senior public servants has leaned on the tool, currently being piloted by the department, for external and internal communications. The fact that they were obtainable by FOI also shows a new vector for insight into the government’s work.
A spokesperson for the Department of Home Affairs said that it has “no concern with the use of AI in the circumstances noted in this FOI”.
‘I’ve been trying Copilot … it’s foundationally amazing’
On August 26, Hansford gave a speech at the think tank Australian Institute of International Affairs titled “Protecting Australia’s Infrastructure in a Fragile World”.
As part of his comments, Hansford mentioned his use of AI in relation to the forthcoming end of technical support for Microsoft's Windows 10 operating system.
“Think about what happens in October this year, when Windows 10 stops being supported. Moving to Windows 11 is foundationally much better for everyone,” he said, according to footage of the event posted online last month.
“I’ve been trying Copilot on Windows 11. It’s foundationally amazing. Didn’t write this speech, but it gives me lots of ideas, and I think that people are thinking about that transition potentially with a security cost in mind, but actually it’s just one very small example about how we can foundationally change our productivity.”
Documents laying out his use of Copilot show that the senior public servant leaned heavily on the AI chatbot to prepare for the speech.
At 11.50am on August 26, Hansford asked Copilot to “Write an analysis of the critical infrastructure environment where Australia has conceptually come from since 1980 to today; outline the emerging threats and then discuss what critical infrastructure can do immediately and into the future.”
What followed was a nearly 900-word document that formed the bones of Hansford’s conversational address later that night.
In some cases, Hansford’s remarks drew near verbatim from the AI-generated document. Sections about the history of Australia’s critical infrastructure, decade by decade, were very similar between Copilot output and speech transcript.
A Copilot-written section “in the 1980s, Australia regarded critical infrastructure as mainly physical, state-owned assets” became “if you think about the 1980s where there was state owned regulation, state owned entities, rather running out critical infrastructure” when delivered by Hansford.
Other sections, absent from the Copilot-generated script, drew on Hansford’s own personal anecdotes and asides. The spoken remarks were delivered in a significantly more casual tone than the AI-generated document.
The chatbot logs also show how Hansford gave additional instructions to Microsoft Copilot to hone its output.
After the initial speech was generated, Hansford asked, “Can we add in a few examples of australian [sic] threats that are either near misses or threats that have materialised.”
A subsequent version of the Copilot speech introduced references to the 2016 South Australia blackout, the foiled 2017 Etihad flight hijacking plot, the JBS Foods ransomware attack and the COVID-19 pandemic, all of which made it into the speech.
In another case, Hansford asked Copilot to introduce “an analogy and a few theories into the beginning”. The logs don’t show Copilot’s answer to the question, but Hansford introduced analogies of a “spider’s web” and a reference to “German sociologist Ulrich Beck” in the speech that weren’t in the early AI-generated speech.
Terrorism speeches, ‘personal interactions’ and first drafts for ‘intergovernmental’ communications
The week before, Hansford generated another speech using AI.
On August 19, Hansford asked Copilot to “[Give a speech to [redacted] drawing on [a hyperlink to the A Safer Australia Australia’s Counter – Terrorism and Violent Extremism Strategy 2025]”.
The audience for the speech was redacted under an FOI exemption for documents whose release would have a “substantial adverse effect on the proper and efficient conduct of the operations of an agency”.
The logs also allude to other uses of Copilot by Hansford. Other prompts include him asking Copilot on September 18 for “the pros and cons about having an agenda for an international meeting”.
It also appears as though Hansford used AI to help write messages to other staff. Twelve documents were exempted from release because they “do not relate to operational matters of the Department associated with the subject’s role as Head of National Security but instead relate to personal interactions with members of his team”, a letter accompanying the release of documents said.
The rationale for exempting other documents claims that Hansford put information about “intergovernmental bodies” that would have a “substantial adverse effect on Department’s operations” if released. These were “first drafts of material” for official communications with those “particular bodies”, the letter said.
‘No concern’ about AI use
A spokesperson for the Department of Home Affairs told Crikey that Hansford’s use of Copilot was part of a pilot program for the AI technology.
It said that staff training “addresses safeguarding of data, privacy and the critical assessment of outputs produced by generative AI tools”.
“Staff must also uphold the APS Values and Code of Conduct while using Copilot — this means acting with integrity, transparency, and respect for privacy, while ensuring decisions and outputs reflect impartiality and accountability,” they said in an email.
The Department also said that external parties — such as Microsoft itself — cannot access information within its Copilot instance, such as logs of users’ prompts and answers.
“Contractual arrangements are established with Microsoft, together with technical safeguards, to ensure that Departmental information is not accessed by external parties or by Microsoft employees,” they said.
Earlier this month, Microsoft announced that it will give customers in Australia the “option to have Microsoft 365 Copilot interactions processed in-country” by the end of 2025.
The Australian government conducted a six-month Copilot trial starting in January 2024 before launching the GovAI platform in late July, which allows public servants to learn and test using generative AI. In September, Home Affairs hired former Microsoft executive Rishi Nicolai as its “director of AI adoption”.
On Wednesday, Katy Gallagher gave a speech to the Government Innovation Week forum, unveiling the government’s public service AI plan that includes requiring every department to appoint an AI officer and to introduce a public servant-specific chatbot, GovAI Chat.
Just as consumer chatbot services have given police a new avenue for obtaining data on users, politicians and public servants around the world are finding that their interactions with these products are accessible using document request powers.
In the UK, New Scientist obtained logs of the technology secretary’s use of ChatGPT after he said that he used the tool to understand difficult concepts. In the US, KNKX Public Radio did the same for a Washington city mayor.
But the fear of disclosure could make government staff less likely to use AI, according to the Department.
The letter accompanying the document release argued that “any reduction of the Department’s capacity to use AI functions could be reasonably expected to have a substantial adverse effect on the proper and efficient conduct of the operations of this department”.
And the identity of the authorised decision-maker who signed the letter? Hamish Hansford.
- This story first appeared on Crikey. You can read the original here.