From zero to governed AI in under a day. This guide covers prerequisites, deployment options, configuration, and creating your first governed workspace.
Ensure your environment meets these requirements before deploying Areebi.
Need help sizing your deployment? See our platform architecture page or contact our team for guidance.
Areebi ships as a single golden image that runs on Docker, Kubernetes, or bare metal. Choose the deployment method that fits your infrastructure.
The fastest path to a running Areebi instance. Ideal for evaluation, development, and small deployments. Single-command deployment with all services pre-configured.
docker compose up -d
Recommended for: Teams under 100 users
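Before running `docker compose up -d`, you will typically review the Compose file that ships with the image. The sketch below is purely illustrative: the service name, image tag, environment variables, and data path are assumptions, not Areebi's published configuration.

```yaml
# Hypothetical docker-compose.yml sketch -- service, image, and
# variable names are illustrative assumptions, not Areebi's schema.
services:
  areebi:
    image: areebi/platform:latest        # assumed image name
    ports:
      - "443:8443"                       # expose the web UI over TLS
    environment:
      AREEBI_ADMIN_EMAIL: admin@example.com   # assumed bootstrap variable
    volumes:
      - areebi-data:/var/lib/areebi      # assumed persistent data path

volumes:
  areebi-data:
```

Persisting a named volume is the detail worth checking in any evaluation setup: without it, a container rebuild discards your configuration and audit history.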
Production-grade deployment with horizontal scaling, rolling updates, and health monitoring. Our Helm chart configures all services with best-practice defaults.
helm install areebi areebi/areebi-platform
Recommended for: 100+ users, production workloads
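Production installs usually override the chart's defaults with a values file. The keys below are assumptions sketched from the features named above (horizontal scaling, health monitoring); consult the chart's own values for the real schema.

```yaml
# Hypothetical values.yaml sketch -- key names are assumptions,
# not the chart's documented values.
replicaCount: 3                  # horizontal scaling across nodes
ingress:
  enabled: true
  host: areebi.example.com
resources:
  requests:
    cpu: "1"
    memory: 2Gi                  # size per replica, adjust to load
```

Applied with the standard Helm flag: `helm install areebi areebi/areebi-platform -f values.yaml`.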
For organizations with strict infrastructure requirements. Areebi ships as a self-contained binary with embedded dependencies. Contact our team for bare metal deployment guides.
Contact sales for deployment guide
Recommended for: Air-gapped or classified environments
Learn more about how Areebi's golden image architecture simplifies deployment and maintenance.
After deployment, walk through these configuration steps to set up your Areebi instance.
Connect your SAML 2.0 or OIDC identity provider for single sign-on. Areebi supports all major identity providers including Okta, Azure AD, Google Workspace, and Ping Identity. SSO configuration is done through the admin panel at /admin/sso.
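For OIDC, the admin panel will ask for the standard values your identity provider issues when you register Areebi as an application. The field names below are illustrative, not Areebi's actual form labels; the values come from your IdP.

```yaml
# Hypothetical OIDC settings as typically entered at /admin/sso --
# field names are illustrative; your identity provider supplies the values.
sso:
  protocol: oidc
  issuer: https://login.example.com/oauth2/default
  client_id: areebi-platform
  client_secret: ${OIDC_CLIENT_SECRET}   # store as a secret, never plaintext
  group_claim: groups                    # maps IdP groups to Areebi roles
```

Mapping a group claim up front pays off in step 4, where workspace membership is assigned through identity provider groups.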
Add your AI provider API keys in the admin panel. Areebi supports 30+ LLM providers including OpenAI, Anthropic, Google, Azure OpenAI, AWS Bedrock, and self-hosted models. Each provider can have independent DLP and policy rules.
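Conceptually, each provider entry pairs a credential with its own policy scope. This sketch is an assumption about shape, not Areebi's configuration format; keys are entered through the admin panel.

```yaml
# Hypothetical provider list -- names and keys are illustrative only.
providers:
  - name: openai
    api_key: ${OPENAI_API_KEY}       # reference a secret, not a literal
    policy_profile: default-dlp      # independent DLP/policy rules per provider
  - name: anthropic
    api_key: ${ANTHROPIC_API_KEY}
    policy_profile: strict-pii
```

The per-provider policy profile is the point of the design: a self-hosted model and a public API can carry different DLP rules even when both are enabled.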
Use the visual policy builder to configure data loss prevention rules. Start with Areebi's pre-built templates for PII, financial data, healthcare data, or source code, then customize based on your organization's requirements.
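A DLP policy built from a template reduces to a list of detectors paired with actions. The YAML below is a conceptual sketch only; the visual policy builder is the supported interface, and these rule and action names are assumptions.

```yaml
# Hypothetical DLP rule sketch -- detector and action names are
# illustrative, not Areebi's policy format.
policy:
  name: pii-baseline
  template: pii                  # start from the pre-built PII template
  rules:
    - detect: ssn
      action: block              # refuse the request outright
    - detect: credit_card
      action: redact             # mask matches before the prompt leaves
```

The block-versus-redact distinction is the main customization decision: blocking stops workflows cold, while redaction lets a request proceed with sensitive spans masked.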
Workspaces are isolated AI environments with their own policies, model access, and user permissions. Create a workspace for your pilot team, assign users through your identity provider groups, and configure workspace-specific policies.
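Pulling the pieces together, a pilot workspace binds an IdP group to a set of models and a policy. As above, this is a hedged sketch of the concepts, not Areebi's actual workspace schema; configuration happens in the admin panel.

```yaml
# Hypothetical workspace definition -- field names are assumptions.
workspace:
  name: pilot-team
  idp_groups:
    - eng-pilot                # membership assigned via IdP groups
  models:
    - openai/gpt-4o            # illustrative model identifier
  policy: pii-baseline         # workspace-specific policy
```

Because workspaces are isolated, the pilot team's model access and policies never leak into other teams' environments, which is what makes a staged rollout safe.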
Need help with your deployment? Our team provides hands-on onboarding support for every customer. You can also explore the API reference for integration details, review use case guides for deployment patterns, or check the changelog for the latest features.
For security and compliance details, visit the Trust Center or review our SOC 2 compliance documentation.