
Companies are building customized AI bots to depend less on OpenAI

OpenAI dominates the generative AI market, and its GPT-4 is the industry’s best-performing model to date. But businesses are increasingly opting to build their own, smaller AI models that are more tailored to their specific needs.

Salesforce, for example, has piloted two AI coding assistants called Einstein for Developers and Einstein for Flow, which are trained on both Salesforce’s in-house programming data and open-source data. They’re “small” AI models for niche business applications. The assistants can also write poems and the like, but they won’t be as good at it because, unlike ChatGPT, they haven’t been trained on the broader internet, said Patrick Stokes, Salesforce’s executive vice president of product.

With OpenAI, Google, Amazon, and Meta focused on building larger and larger AI models, there’s still a good argument for companies to wait and see what kind of capabilities arise from them. But there very well could be an ocean of smaller AI models designed for specific tasks, which means people might interact with different AI bots for various activities throughout their day. Ultimately, companies might find they can adopt AI in a less costly way by focusing on specific applications, said Yoon Kim, an assistant professor at the Massachusetts Institute of Technology whose research focuses on making generative AI models more efficient.

“You can’t use ChatGPT out of the box”

Braden Hancock is the chief technology officer of Snorkel AI, a Redwood City, California-based company that refines AI models. He has been helping businesses, many of them in the financial sector, build small AI models that power bots dedicated to a single task, such as a customer service assistant or a coding assistant.

“There was maybe a moment early at the beginning of the year, right after ChatGPT came out, where people weren’t quite sure—like, oh my gosh, is this game over? Is AI just solved now?” said Hancock. Then, on closer inspection, companies realized that few, if any, business applications could be addressed by ChatGPT without modification.

What does this mean for OpenAI?

If hardware costs come down enough, there’s a scenario where GPT-4 will do everything for everyone, said Amin Ahmad, founder and CEO of Vectara, a software company focused on semantic search. AMD has just released a set of chips that could lower the costs of developing AI models.

But there’s another scenario in which more large language models (LLMs) on the market create increased competition for OpenAI. That could help explain why OpenAI has been lobbying for more regulation: to get ahead of AI competitors and make it harder for others to participate.

 
