Anthropic launches Claude Enterprise plan to compete with OpenAI

SAN FRANCISCO, CALIFORNIA - SEPTEMBER 20: Anthropic Co-Founder & CEO Dario Amodei speaks onstage during TechCrunch Disrupt 2023 at Moscone Center on September 20, 2023 in San Francisco, California. (Photo by Kimberly White/Getty Images for TechCrunch)

Anthropic is launching a new subscription plan for its AI chatbot, Claude, geared toward enterprise customers that want more administrative controls and increased security. Claude Enterprise will compete with OpenAI's business-focused plan, ChatGPT Enterprise, which launched roughly a year ago.

Claude Enterprise allows businesses to upload proprietary company knowledge into Anthropic's AI chatbot. Then Claude can analyze the information, answer questions about it, create graphics and simple web pages, or act as a company-specific AI assistant.

Anthropic appears to be playing catch-up with OpenAI, trying to put Claude everywhere that ChatGPT already is. The startup has released a few ways to use Claude that closely match how OpenAI already offers ChatGPT.

"The reality is that Claude has been usable for companies for a year. Candidly, we've had a product in the market for a lot less long," Anthropic product lead Scott White told TechCrunch. "But we're responding to the needs of our customers at a high velocity with a smaller team."

In May, Anthropic released the Claude Team plan, which, much like the ChatGPT Team plan, lets small businesses collaborate on projects. Since the spring, Anthropic has also launched Claude mobile apps for iOS and Android. Now it's going up against ChatGPT Enterprise, which has seen widespread adoption among Fortune 500 companies.

But Anthropic's enterprise offering is different from what's on the market in a few key ways. First, the context window on Claude Enterprise is 500,000 tokens, meaning Anthropic's models can process up to 200,000 lines of code, dozens of 100-page documents or a two-hour audio transcript in a single prompt. ChatGPT Enterprise and Claude's Team plan offer context windows less than half that.
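To make that figure concrete, here is a minimal sketch of what feeding a long document to Claude in a single prompt looks like through Anthropic's public Messages API. The Enterprise plan itself is used through the Claude chat interface, so treat this purely as an illustration; the file name and prompt are placeholders, and context limits on the standard API tier differ from the Enterprise plan's 500,000 tokens.

```python
import anthropic

# Assumes the official anthropic Python SDK is installed and an
# ANTHROPIC_API_KEY environment variable is set. The file and prompt
# below are hypothetical.
client = anthropic.Anthropic()

with open("meeting_transcript.txt") as f:  # e.g., a two-hour meeting transcript
    transcript = f.read()

response = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # Claude 3.5 Sonnet snapshot current at the time
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": (
            "Here is a long meeting transcript:\n\n"
            f"{transcript}\n\n"
            "Summarize the key decisions and list any open action items."
        ),
    }],
)

# Print the model's text reply.
print(response.content[0].text)
```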

Claude Enterprise also comes with Projects and Artifacts, Anthropic's shared workspaces where several users can upload and edit content. These features could be useful in business settings, where teams often work on longer projects involving many data sources and contributors.

The Enterprise plan also includes a GitHub integration that lets engineering teams sync their GitHub repositories with Claude. Coding has become a popular use case for Claude 3.5 Sonnet, and the integration gives Anthropic's models direct access to its customers' codebases. That can be useful for bringing a new engineer up to speed, building a new feature, or tracking down a bug, for instance.
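The Enterprise integration syncs repositories natively inside Claude, but the underlying idea can be sketched with the public Messages API: gather source files from a local checkout and pass them to the model as context alongside a question. The helper function, paths, and prompt below are hypothetical stand-ins, not Anthropic's actual integration.

```python
import anthropic
from pathlib import Path

client = anthropic.Anthropic()  # assumes ANTHROPIC_API_KEY is set

def collect_source(repo_root: str, suffixes=(".py", ".md"), max_chars=200_000) -> str:
    """Concatenate a bounded slice of a local checkout into one context string.

    A hypothetical stand-in for the Enterprise plan's built-in GitHub sync.
    """
    chunks, total = [], 0
    for path in sorted(Path(repo_root).rglob("*")):
        if path.is_file() and path.suffix in suffixes:
            text = path.read_text(errors="ignore")
            if total + len(text) > max_chars:  # stay well inside the context window
                break
            chunks.append(f"### File: {path}\n{text}")
            total += len(text)
    return "\n\n".join(chunks)

codebase = collect_source("./my-service")  # hypothetical local repository

response = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=2048,
    messages=[{
        "role": "user",
        "content": (
            f"{codebase}\n\n"
            "I'm a new engineer on this service. Walk me through how a request "
            "flows from the API layer to the database, and point out anything "
            "that looks like a likely source of bugs."
        ),
    }],
)
print(response.content[0].text)
```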

Like ChatGPT Enterprise, Claude Enterprise lets businesses assign a primary owner for their company's workspace. That owner can grant different levels of access to projects and information within Claude and trace activity across the system for security and compliance monitoring.

Also like OpenAI, Anthropic says it is not training on Claude Enterprise customer data. That's important for many businesses that don't want their trade secrets ending up in Claude or ChatGPT's knowledge base six months from now.

Anthropic declined to disclose the price of Claude Enterprise, though White said it's more expensive than Anthropic's Team plan (which costs $30 per member per month). White said that's because Enterprise customers get far more mileage out of Claude, with larger context windows and higher rate limits. (For what it's worth, OpenAI won't publicly disclose the price of its enterprise product either.)

White said Anthropic has been testing Claude Enterprise in a private beta for months with early adopters such as GitLab, Midjourney, IG Group, and Menlo Ventures (an investor in Anthropic).

However, gaining broader adoption will be key. AI model developers like Anthropic have faced pressure to sell API access at lower and lower prices. Products like Claude Enterprise offer a path to revenue, but widespread adoption is needed to offset the steep inference costs that come with them. It's not clear that any AI model developer is profiting from these business-specific plans just yet.