Tools

Tools allow you to extend copilots by connecting them to your application's APIs. An external tool is defined using an OpenAPI specification, providing a standardized way to describe the API endpoints, parameters, and responses. In the future, Continual may support additional tool types.
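As a sketch, a minimal OpenAPI specification for a hypothetical orders API might look like this (the API name, URL, and fields are illustrative, not part of Continual):

```yaml
openapi: 3.0.3
info:
  title: Orders API
  description: Look up customer orders.
  version: 1.0.0
servers:
  - url: https://api.example.com/v1
paths:
  /orders/{orderId}:
    get:
      operationId: getOrder
      summary: Retrieve a single order by ID
      parameters:
        - name: orderId
          in: path
          required: true
          schema:
            type: string
      responses:
        "200":
          description: The requested order
```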

Creating a tool

To create a new tool, navigate to the Tools page in the Continual console and click Create tool.

Give your tool a name and description to help users understand its purpose and functionality.

Authentication

Select the appropriate authentication scheme for your tool to ensure secure access. The available options are:

  1. No Authentication: Choose this for public or testing tools that don't require authentication.

  2. Continual Access Token: Use this if you have set up Continual Identity Verification and want to require a Continual access token for tool access.

  3. Service Bearer Token: Select this for tools that are shared among all users, authenticated with a single bearer token.

  4. User Bearer Token: Choose this if tool access is restricted to individual users, each with their own unique bearer token.

  5. OAuth2: Use this for tools that require OAuth2 authentication, which may involve redirecting users to an authorization server.

Ensure you have obtained and securely stored the necessary credentials or tokens based on your selected authentication method.
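For example, a service or user bearer token corresponds to a standard `http` bearer security scheme in the OpenAPI specification (the scheme name `bearerAuth` is illustrative):

```yaml
components:
  securitySchemes:
    bearerAuth:
      type: http
      scheme: bearer
      bearerFormat: JWT
security:
  - bearerAuth: []
```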

API specification

Upload your tool's OpenAPI specification file in either YAML or JSON format. Continual supports OpenAPI versions 2.0 through 3.1.0.

Your specification must include:

  • Server URL
  • API summary/description
  • Operation IDs for each endpoint
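In OpenAPI terms, these required elements map to the `servers`, `info`, and per-operation `operationId` fields. A sketch, with placeholder values:

```yaml
servers:
  - url: https://api.example.com/v1      # server URL
info:
  title: Example API
  description: Short summary of the API.  # API summary/description
  version: 1.0.0
paths:
  /items:
    get:
      operationId: listItems             # operation ID for this endpoint
      responses:
        "200":
          description: A list of items
```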

Verify that your specification is error-free before uploading, as tool creation will fail if the spec contains errors. You can use an OpenAPI validator to check for issues.

Best practices

Follow these guidelines to optimize your tools for use with Continual copilots:

Keep tool size under half the LLM's context window

The total token size of your tool should not exceed half of the underlying LLM's context window. For example, if using GPT-4 32K, limit your tool to under 16,000 tokens. Use the OpenAI Tokenizer to measure spec size.
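Exact counts depend on the model's tokenizer, but a rough pre-check can be sketched in Python using the common approximation of about four characters per token (the helper names and the heuristic are assumptions here, not part of Continual):

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English/JSON text."""
    return max(1, len(text) // 4)

def fits_in_half_context(spec_text: str, context_window_tokens: int) -> bool:
    """Check a spec against the 'half the context window' guideline."""
    return estimate_tokens(spec_text) <= context_window_tokens // 2

# Example: a spec of ~40,000 characters (~10,000 tokens) against a 32K-token model.
spec_text = "x" * 40_000
print(fits_in_half_context(spec_text, 32_768))  # True: ~10,000 <= 16,384
```

For a real measurement, run the spec through the actual tokenizer for your model rather than relying on this heuristic.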

Provide a search endpoint

Include an endpoint that lets copilots search for resources using natural user input rather than opaque internal IDs. This makes it easier for the copilot to find relevant information.
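As a sketch, such an endpoint might accept a free-text query parameter (the path and field names are illustrative):

```yaml
paths:
  /customers/search:
    get:
      operationId: searchCustomers
      summary: Search customers by name or email
      parameters:
        - name: query
          in: query
          required: true
          description: Free-text search over customer name and email
          schema:
            type: string
      responses:
        "200":
          description: Matching customers
```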

Limit request and response fields

Omit unnecessary fields from API requests and responses to reduce latency and improve performance. Continual ignores undocumented response fields.

Keep responses concise

Design your API responses to return specific, concise data for the copilot to work with. Multiple targeted requests are better than a single large, dense response.

Prefer flat response structures

Where possible, avoid deeply nested response objects. Flat, unnested responses lead to faster processing and fewer copilot errors.
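As an illustration with hypothetical payloads, compare a flat response body with a deeply nested one carrying the same information:

```yaml
# Preferred: flat response body
orderId: "123"
status: shipped
customerEmail: a@example.com

# Harder for the copilot: the same data deeply nested
order:
  id: "123"
  state:
    status: shipped
  customer:
    contact:
      email: a@example.com
```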

By following these practices and providing a clear, well-documented tool specification, you can extend your copilots with valuable API integrations that enhance their abilities to assist users.