Maincode is on a mission to be Australia’s ‘artificial intelligence factory’, producing homegrown AI models designed, trained and hosted end-to-end in Australia.
That mission meant Maincode often found itself part of the conversation whenever 'sovereign AI' was brought up.
It was a label that Maincode proudly wore for the first months of its life, but around the time the company hit its first birthday, Lemphers wrote on LinkedIn that the company was moving away from the term.
Instead, it would lean into wording that has been gracing local goods for decades: 'Australian-made'.
The switch, according to Lemphers’ LinkedIn post, was driven by his sense that the conversation around sovereign AI no longer suited the company’s goals.
It states that sovereign AI is “too loaded, too defensive … rhetoric that divides”, while Australian-made AI is “a phrase that invites participation instead of fear”.
While that may be strong rhetoric, Lemphers told techpartner.news that he was not saying that sovereign AI isn’t a valid issue, but “it's just one that doesn't make sense to us as a business anymore”.
“As we started slowly ramping up the organisation and started working with customers and building things, the first thing that hit us was that the problem space changed for us,” he said.
“Initially, everyone was talking about LLMs (large language models), everyone was talking about chatbots, and then you start speaking to customers, and they're like ‘chatbots are great for summarising documents and stuff, but when we're trying to do business process automation or compliance tasks or anything like that, you actually can't use a chatbot LLM’.
“It hallucinates [or] it doesn't explain well, so you need to move to models that have different architectures and different capabilities, and that immediately shifts the conversation away from what felt like some of the major concerns that sovereign AI had come to mean.”
Refined and ethical
He said those calling for sovereign AI models were often mainly concerned with unethical practices around the training of models, a point that had lost relevance as Maincode matured.
Maincode does have an LLM chatbot it calls Matilda, which, according to Lemphers, is trained only on open-source and freely available data, as opposed to “foreign companies hoovering up copyright data and not being respectful”.
Consumer-facing chatbots require extraordinary amounts of data from a wide array of sources, which, by the admission of the tech's own creators, necessitates using copyrighted material without permission.
This approach also exacerbates issues like hallucinations and false responses, in part because the training dataset contains so much information that the chatbots struggle to distinguish which of their predicted answers will be accurate and which will be misleading.
While avoiding these concerns is just part of what drives support for sovereign AI, Lemphers said these discussions are driving Maincode away from the term because they are largely irrelevant to the company’s approach.
“A lot of the things that plague chatbot LLMs just didn't seem to apply to us anymore, because customers bring their own data,” he said.
“It's highly symbolic tasks that they're trying to achieve.”
What about sovereign?
With sovereignty a concern for many of IT providers' customer organisations, the move away from the term may seem counterintuitive. But, Lemphers argues, what sovereignty means can vary between organisations, another reason the company is moving away from the term.
“When you start wading into this sovereign conversation, it becomes infeasible because some people without actual technical capability say ‘Well, you should be going all the way down to building the silicon,’ and it's like, ‘sure but that's not actually where we need to go, it's not going to advance productivity in this country’,” he explained.
The Tech Policy Design Institute (TPDi) recently announced it is undertaking a project to define 'AI Sovereignty' and baseline Australia’s national AI capabilities.
The TPDi said that 'AI Sovereignty' is a "provocative concept often invoked but rarely defined", with the project aiming to address "three immediate challenges" of shared languages, a full stack view and a stocktake of Australia’s national AI capabilities.
A commonly understood but narrow definition of sovereignty is that the AI is trained with Australian cultural values at the core and any data it holds remains in a fully Australian domiciled and controlled data centre.
Lemphers said IT providers and end users looking for something that fits those parameters will still find Maincode’s products fit the bill.
He said the company wouldn’t run any datasets within any US-owned data centres on Australian soil because of the US Cloud Act, but customers were welcome to run Maincode models wherever they chose.
“We end up coming back to practical, pragmatic engineering aspects, and that tends to be where it’s made and who made it,” he said.