Elastic AI features in Elastic Security, Observability, and Search are now enabled by default in Elastic Cloud.

Getting started with generative AI (GenAI) shouldn’t be a project in itself. Too often, teams encounter organizational friction that slows adoption of AI-based features: third-party contracts, external API keys, additional terms of service, and billing management. With the Elastic Managed LLM, you can sidestep these blockers and get powerful AI features for automatic ingest, threat detection, problem investigation, root cause analysis, and more, ready to go from day one.

Prefer your own model? We’ve got you covered there, too, with the ability to integrate any popular third-party LLM of your choosing.

[Image: attack-discovery.png]

Out-of-the-box AI for SREs: Accelerated problem resolution

All AI features in Elastic Observability are ready to use out of the box — no setup required. Teams can accelerate root cause analysis, streamline incident response, and start getting value from generative AI on day one. For organizations that need more control, connecting a preferred LLM is still fully supported.

The Elastic Managed LLM powers all generative AI capabilities in Elastic Observability.

AI for developers: Prototype and test GenAI capabilities from day one

With the default Elastic Managed LLM, AI Playground and the Search AI Assistant are ready to use out of the box, with no additional setup or API keys for an external model required. Playground offers a low-code interface for rapidly prototyping RAG workflows with your own data. Now you can test the latest GenAI capabilities and start building instantly, with no model configuration needed. If you prefer your own model, you still have the flexibility to use the open inference API to connect any provider or custom endpoint of your choice.
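As a sketch of what connecting your own model looks like, the snippet below builds the PUT request that registers a third-party provider as an inference endpoint via the inference API (`PUT _inference/{task_type}/{inference_id}`). The deployment URL, inference ID, and model name are placeholders, and the exact `service_settings` vary by provider, so treat this as an illustration rather than a canonical recipe:

```python
import json
from urllib import request


def build_inference_request(es_url: str, inference_id: str,
                            provider_api_key: str) -> request.Request:
    """Build a PUT request that registers a third-party chat model as an
    inference endpoint via the open inference API.

    The service name ("openai") and model ("gpt-4o") are illustrative;
    swap in whichever provider and model you actually use.
    """
    body = {
        "service": "openai",
        "service_settings": {
            "api_key": provider_api_key,
            "model_id": "gpt-4o",
        },
    }
    return request.Request(
        url=f"{es_url}/_inference/completion/{inference_id}",
        data=json.dumps(body).encode("utf-8"),
        method="PUT",
        headers={"Content-Type": "application/json"},
    )


# Sending it requires a reachable deployment and valid credentials:
# with request.urlopen(build_inference_request(
#         "https://my-deployment.es.io:443", "my-openai-chat", "sk-...")) as resp:
#     print(resp.status)
```

Once registered, the endpoint ID can be selected anywhere a connector or model choice is offered.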

[Image: Ai-playground.png]

Elastic’s unique approach to AI

Elastic delivers AI where it matters most, natively integrated with your data, workflows, and use cases. With a default managed LLM enabled out of the box, teams can start using AI immediately, without setup or third-party contracts. For more flexibility, developers can also connect to public LLMs using Elastic’s open inference API.
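Whichever endpoint you end up with, the default managed model or one you connected yourself, performing inference is a single POST against the same API path. A minimal sketch, again with placeholder names for the deployment URL and endpoint ID:

```python
import json
from urllib import request


def build_completion_request(es_url: str, inference_id: str,
                             prompt: str) -> request.Request:
    """Build a POST that runs a completion against an inference endpoint.

    `es_url` and `inference_id` are placeholders for your deployment URL
    and the endpoint you registered (or a managed default).
    """
    return request.Request(
        url=f"{es_url}/_inference/completion/{inference_id}",
        data=json.dumps({"input": prompt}).encode("utf-8"),
        method="POST",
        headers={"Content-Type": "application/json"},
    )


# Against a live deployment:
# with request.urlopen(build_completion_request(
#         "https://my-deployment.es.io:443", "my-openai-chat",
#         "Summarize the last hour of error logs")) as resp:
#     print(json.load(resp))
```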

What truly sets Elastic apart is how it combines Search AI capabilities for security and observability, natively integrated rather than bolted on.

With the default LLM, you get generative AI features that work the moment your deployment is created, with no extra contracts, API keys, or configuration.

Whether you need speed, control, or customization, Elastic gives you a flexible, production-ready AI stack designed for how modern teams work.