Toronto-based Startup Cohere Releases New Chat API for AI Chatbots

Toronto, Canada-based Cohere, founded by ex-Googlers, has emerged as one of the leading startups in the generative AI marketplace. The company focuses on developing foundation models and other AI-powered technologies for enterprises. Today, Cohere has expanded its offerings by releasing a new application programming interface (API) for building chat applications based on its proprietary large language model (LLM), Command. The Chat API is designed to simplify the creation of reliable conversational AI products for various purposes, including knowledge assistants and customer support systems.

Introduction of Cohere’s Chat API

In a blog post, Cohere announced the launch of its Chat API, stating:

“Whether you’re building a knowledge assistant or customer support system, the Chat API makes creating reliable conversational AI products simpler.”

The company has already introduced APIs for content generation (Generate) and text summarization (Summarize). The addition of the Chat API expands Cohere’s suite of tools for developers.
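For developers already using the Generate and Summarize endpoints, the Chat endpoint follows the same client pattern in Cohere’s Python SDK. The snippet below is a minimal sketch based on the SDK as documented around this launch; the placeholder API key and the example prompt are invented, and the exact parameter name ("message") may differ across SDK versions.

    import cohere

    # Create a client with your API key (placeholder shown here).
    co = cohere.Client("YOUR_COHERE_API_KEY")

    # Send a single user message to the Chat endpoint, which is backed by
    # Cohere's Command model, and print the generated reply.
    response = co.chat(message="Draft a short welcome message for new users.")
    print(response.text)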

Coral Showcase and Performance Comparison

Cohere provides a free chatbot demo called the Coral Showcase on its website. Users can test out the chatbot; however, access requires signing in with Google or Cohere credentials. The Coral chatbot, initially introduced in July, can now be integrated into external-facing or internal apps using the new API.

In VentureBeat’s tests, the Coral chatbot, powered by Cohere’s Command, demonstrated slightly slower response times compared to some competing, closed-source chatbots like OpenAI’s ChatGPT or Anthropic’s Claude 2. However, the responses were accurate, up-to-date, well-written, and free of visible hallucinations. The chatbot also provided sources and included links. The only drawback was that it occasionally failed to provide the most recent information when asked about a specific company.

Cohere highlighted the standout feature of its Chat API: retrieval-augmented generation (RAG). The technique lets developers control the chatbot’s information sources, grounding responses in their own enterprise data or expanding them to search the entire web. Cohere explains:

“RAG systems improve the relevance and accuracy of generative AI responses by incorporating information from data sources that were not part of pre-trained models.”

Cohere’s RAG-enabled Chat API currently supports two additional information sources: a web search implementation and plain-text documents supplied by an enterprise or another source. This lets developers equip a chatbot with real-time news or industry trends when building, for example, a market research assistant. Cohere says the underlying model has been trained specifically for RAG tasks to deliver high performance.
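To illustrate how those two sources might be wired up, here is a hedged sketch using Cohere’s Python SDK as documented around this release. The "documents" and "connectors" parameters and the "web-search" connector ID follow that documentation, while the prompts, document snippets, and placeholder key are invented for the example.

    import cohere

    co = cohere.Client("YOUR_COHERE_API_KEY")  # placeholder key

    # Ground the answer in plain-text documents supplied by the caller.
    grounded = co.chat(
        message="Summarize the main themes in our customer feedback.",
        documents=[
            {"title": "Survey notes", "snippet": "Customers asked for clearer invoices..."},
            {"title": "Support log", "snippet": "Most tickets concerned billing errors..."},
        ],
    )
    print(grounded.text)

    # Or let the model pull fresh results through the managed web-search connector.
    live = co.chat(
        message="What are this week's most discussed retail AI trends?",
        connectors=[{"id": "web-search"}],
    )
    print(live.text)

Responses grounded this way also include citation data, which appears to be what the Coral demo surfaces as source links.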

While VentureBeat’s initial tests indicated some reliability issues with returning current news, those tests were limited to only a few queries.

Modular Components and Future Plans

In addition to the RAG-enabled Chat API, Cohere offers three modular components that third-party developers can connect:

  • “Document mode” allows developers to specify which documents their Cohere-powered chatbot should reference when answering user prompts.
  • “Query-generation mode” instructs the chatbot to return search queries based on the user’s prompt (a short sketch of this mode follows the list).
  • “Connector mode” lets developers connect their chatbot to the web or another information source.
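As a sketch of query-generation mode, under the same assumptions as the earlier snippets (the "search_queries_only" flag and the "search_queries" response field follow Cohere’s API reference from this period and may have changed since):

    import cohere

    co = cohere.Client("YOUR_COHERE_API_KEY")  # placeholder key

    # Ask the model only for the search queries it would run for this prompt,
    # so they can be passed to a custom retriever or connector instead.
    response = co.chat(
        message="How have Toronto AI startups fared in recent funding rounds?",
        search_queries_only=True,
    )

    # Each entry holds one generated search query.
    for query in response.search_queries:
        print(query)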

Cohere plans to expand its connector/modular ecosystem in the future to provide even more flexibility and customization options for developers.

This release from Cohere comes shortly after OpenAI’s reintroduction of web browsing capabilities to ChatGPT and the introduction of its ChatGPT Enterprise subscription tier. The competition in the AI chatbot market is heating up as companies aim to provide more powerful and efficient solutions for businesses.
