Cloudera expands AI inference service to on-premises environments

By Jason Pollock on Feb 12, 2026 10:53AM
Leo Brunnick, Cloudera.

Cloudera has expanded its Cloudera AI Inference service and Cloudera Data Warehouse product with Trino to on-premises environments.

With Cloudera AI Inference, powered by NVIDIA technology, now available on premises, organisations can deploy and scale any AI model - from LLMs to fraud detection, computer vision and voice models, including the latest NVIDIA Nemotron open models - directly within their data centres.

Organisations gain full control over latency, compliance and data privacy while ensuring that, once AI moves into steady production, long-term costs remain lower and easier to manage, according to Cloudera.

Cloudera Data Warehouse with Trino - now available in data centre environments - aims to enable centralised security, governance and observability across the entire data estate.

The company also made enhancements to its Cloudera Data Visualization product, introducing AI annotation, which generates summaries and contextual insights for charts and visuals without manual writing; AI query logging and traceability; and simplified admin management.

Leo Brunnick, chief product officer at Cloudera, said that with Cloudera AI Inference, Cloudera Data Warehouse with Trino and Cloudera Data Visualization all accessible in the data centre, organisations can securely deploy AI and analytics exactly where their most critical data resides.

"This means enterprises can drive innovation and derive insights without compromising on data security, compliance, or operational efficiency," he said.
