Secure by design. Built for enterprise.

Kapa.ai was built with security, privacy, and compliance as top priorities from the start, using industry-leading practices to safeguard your data.

Kapa.ai is reviewed and trusted by 100+ companies.

Meets the highest industry standards.

We have achieved SOC 2 Type II certification, demonstrating our commitment to maintaining high standards of security, availability, processing integrity, confidentiality, and privacy of customer data.

Encryption of All Data

All data is encrypted at rest using AES-256 and in transit using TLS 1.2+, leveraging Google Cloud's built-in encryption mechanisms.

Secure Data Connectors

Authenticated methods to safely load data from sources like Slack, Zendesk, and others without compromising security.

Role-Based Access Control (RBAC)

Granular permissions at team and project levels ensure users only access the data they need.
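
Conceptually, project-scoped RBAC like this can be modeled as a mapping from roles to permission sets, checked per project rather than globally. The sketch below is illustrative only; the role names, permission strings, and data model are hypothetical, not Kapa's actual implementation.

```python
# Illustrative sketch of project-scoped role-based access control.
# All role and permission names here are hypothetical.
from dataclasses import dataclass, field

ROLE_PERMISSIONS = {
    "viewer": {"analytics:read"},
    "editor": {"analytics:read", "sources:read", "sources:write"},
    "admin": {"analytics:read", "sources:read", "sources:write", "members:manage"},
}

@dataclass
class Member:
    user_id: str
    # Maps project id -> role, so access is granted per project, not globally.
    project_roles: dict = field(default_factory=dict)

def can(member: Member, project_id: str, permission: str) -> bool:
    """Return True if the member's role on this project grants the permission."""
    role = member.project_roles.get(project_id)
    return permission in ROLE_PERMISSIONS.get(role, set())

alice = Member("alice", {"docs-bot": "editor"})
print(can(alice, "docs-bot", "sources:write"))        # True
print(can(alice, "docs-bot", "members:manage"))       # False
print(can(alice, "other-project", "analytics:read"))  # False: no role on that project
```

Scoping the role lookup to a project id is what makes the control "granular at team and project levels": a user's access to one project says nothing about any other.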

Sensitive Data Masking

Supports anonymizing PII in knowledge sources when ingesting, ensuring you can safely upload support tickets and more.

PII Detection

Automatically detect personally identifiable information (PII) in user messages, preventing storage or processing of sensitive data.
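
To make the masking and detection ideas concrete, here is a minimal sketch of pattern-based PII masking as one might apply it to support tickets before ingestion. This is an assumption-laden illustration: Kapa's actual detection is not documented here, and these regexes are deliberately simple examples, not a production-grade detector.

```python
# Minimal sketch of pattern-based PII masking (illustrative only; a real
# detector would use more robust patterns and/or an ML-based classifier).
import re

PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def mask_pii(text: str) -> str:
    """Replace each detected PII span with a typed placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

ticket = "Customer jane.doe@example.com called from +1 (555) 123-4567."
print(mask_pii(ticket))  # → Customer [EMAIL] called from [PHONE].
```

Masking at ingestion time (before data is stored or sent to a model) is what allows raw support tickets to be used safely as a knowledge source.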

SOC 2 Type II Certified

Compliant with industry standards, demonstrating our commitment to security best practices.

Model Agnostic Approach

Not tied to a single vendor, with training opt-outs signed with all LLM providers to protect your data.

Secure Deployment Integrations

Multiple secure deployment options including website widget, Slack bot, Discord bot, Zendesk app, and API integrations.

Reporting security concerns

If you believe you've identified a security vulnerability, please report it to security@kapa.ai. For general security inquiries, you can reach us at the same address.

Frequently asked questions

What types of data does Kapa.ai collect?

We collect two main types of data:

  1. Knowledge source data: This includes documentation, support tickets, tutorials, etc. that you connect to Kapa to create the bot's knowledge base. It can be part of an "External" project (public sources) or an "Internal" project (private/internal sources).

  2. Question/answer data: This refers to the questions users ask the bot and the answers it provides. This is used to provide analytics to the team managing the kapa project (e.g., to see common questions and identify knowledge gaps).

Can I use private/internal data as a knowledge source?

Yes. In fact, using private/internal knowledge sources like Zendesk, Confluence, internal docs, etc. is a common way to use kapa.ai to help internal support teams. Specifically, there are two types of projects in kapa:

  1. External Project (default): Uses publicly available data sources.

  2. Internal Project: Uses private/internal data sources.

You choose the right mix of data to meet your operational and privacy goals. We invest in security and privacy, including by maintaining SOC 2 Type II certification.

How is knowledge data stored and processed?

Knowledge data, whether from "External" or "Internal" projects, is processed as follows:

  1. Ingested using an integration (e.g., web crawling, GitHub files loader)

  2. Stored in a US-based Google Cloud-hosted PostgreSQL database

  3. Transformed into LLM-friendly formats

  4. Uploaded to a vector database (Weaviate) hosted on Google Cloud

  5. Given to LLMs to synthesize answers

All data is encrypted in transit and at rest.
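
The five steps above can be sketched, in heavily simplified form, as an ingestion-to-answer flow. Every function below is a hypothetical stub: the real pipeline uses managed services (a PostgreSQL database, a Weaviate vector database, and external LLM APIs), not these in-memory stand-ins.

```python
# Simplified sketch of the ingestion-to-answer flow described above.
# All helpers are hypothetical stand-ins, not Kapa's actual code.

def ingest(source: str) -> list[str]:
    """Steps 1-2: load raw documents via an integration and store them."""
    return [f"doc from {source}"]  # stand-in for a crawler/loader + DB write

def to_chunks(docs: list[str]) -> list[str]:
    """Step 3: transform documents into LLM-friendly chunks."""
    return [chunk for doc in docs for chunk in doc.split("\n")]

def index(chunks: list[str]) -> dict[str, str]:
    """Step 4: embed chunks and upload to a vector store (stubbed as a dict)."""
    return {f"vec-{i}": chunk for i, chunk in enumerate(chunks)}

def answer(question: str, store: dict[str, str]) -> str:
    """Step 5: retrieve relevant chunks and have an LLM synthesize an answer."""
    context = " ".join(store.values())  # real systems retrieve top-k by similarity
    return f"Answer to {question!r} grounded in: {context}"

store = index(to_chunks(ingest("https://docs.example.com")))
print(answer("How do I install?", store))
```

The separation between steps matters for security: encryption at rest applies at the storage steps (2 and 4), while the vendor training opt-outs apply at step 5, where data is shared with LLM providers.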

What LLM providers do you use, and how do you ensure they don't train on our data?

We use a stack of multiple LLM calls from various providers to optimize the answer to each individual question. Our current external providers include Cohere Inc., OpenAI LLC, Voyage Inc., and Anthropic PBC. We have vendor agreements in place with all model providers that prevent them from training on any data we share with them.

How long do you retain data?

Both knowledge data and question/answer data are retained for the duration of the service to provide ongoing question-answering capabilities and support our analytics platform. Enterprise customers can customize retention periods and request data deletion. If you need your data deleted, please contact us at security@kapa.ai.

Can we sign a DPA?

Yes, we offer Data Processing Agreements for enterprise customers with specific requirements. If you need a DPA or have any questions about our data processing practices, please contact us at security@kapa.ai. We're committed to working with our customers to meet their compliance and data protection needs.

Turn your knowledge base into a production-ready AI assistant

Request a demo to try kapa.ai on your data sources today.

Start improving developer experience and reducing support load now.

Explore kapa.ai with the founding team and spin up a demo environment in less than 24 hours.