
Open AI makes AI more usable

This post is a subclaim of the broader case for open AI.

Open AI isn't just about being able to run the models yourself. It has created a differentiated market where many different hosts serve the same model, each with its own strengths.

Open AI caters to more use cases

Since anyone can run open AI models, there are many alternative AI hosts to choose from. Let's say you're making something that uses the model Mistral 7B. Want the original host? Use Mistral's platform and get access to more models. Making something on Cloudflare? Use Workers AI and run the model close to the user. Want high speeds? Some models run on hosts like Fireworks or Groq at 3-10x the speed.
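As a sketch of what that flexibility looks like in practice: many hosts expose OpenAI-compatible chat-completion endpoints, so switching providers for the same open model can come down to changing a base URL and a model identifier. The base URLs, model identifiers, and environment variable names below are illustrative assumptions; check each host's documentation for the exact values.

```python
import os
import requests

# Illustrative host configs for the same open model (Mistral 7B).
# Base URLs and model identifiers are assumptions -- check each host's docs.
HOSTS = {
    "mistral": {
        "base_url": "https://api.mistral.ai/v1",
        "model": "open-mistral-7b",
        "key_env": "MISTRAL_API_KEY",
    },
    "workers_ai": {
        "base_url": "https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/ai/v1",
        "model": "@cf/mistral/mistral-7b-instruct-v0.1",
        "key_env": "CLOUDFLARE_API_TOKEN",
    },
    "fireworks": {
        "base_url": "https://api.fireworks.ai/inference/v1",
        "model": "mistral-7b-instruct",  # hypothetical identifier
        "key_env": "FIREWORKS_API_KEY",
    },
}

def chat(host: str, prompt: str) -> str:
    """Send one chat message to the chosen host's OpenAI-compatible endpoint."""
    cfg = HOSTS[host]
    resp = requests.post(
        f"{cfg['base_url']}/chat/completions",
        headers={"Authorization": f"Bearer {os.environ[cfg['key_env']]}"},
        json={
            "model": cfg["model"],
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

# Same app code, different host: only the config entry changes.
print(chat("mistral", "Summarize the benefits of open AI models."))
```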

With closed AI, there are only as many hosts as the model creator allows. With open AI, hosts compete and cater to specific use cases: speed, pricing, localization, privacy, sustainability, and more than can be named here.

Open AI also works better in developing countries. Someone can run a model on their own device, completely offline; someone can found a local AI host, which could grow the local economy.
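For the on-device case, here is a minimal sketch using the llama-cpp-python bindings, assuming a quantized Mistral 7B GGUF file has already been downloaded (the file name is a placeholder). Once the weights are on disk, nothing below needs a network connection.

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Path to a locally downloaded, quantized model file (placeholder name --
# any Mistral 7B GGUF build works). Runs entirely offline after download.
llm = Llama(model_path="./mistral-7b-instruct.Q4_K_M.gguf", n_ctx=2048)

output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain crop rotation in two sentences."}],
    max_tokens=128,
)
print(output["choices"][0]["message"]["content"])
```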

Plus, open AI is generally cheaper. While closed models of baseline intelligence typically cost $1.25-$1.50 per million output tokens, Mixtral (a comparable open model) typically costs $0.27-$0.70.
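To make the gap concrete, here is a rough back-of-the-envelope calculation using the midpoints of those ranges. Real prices vary by host and change over time, and the monthly volume is a made-up figure, so treat the output as illustrative only.

```python
# Rough cost comparison per million output tokens (midpoints of the ranges above).
CLOSED_PER_M = (1.25 + 1.50) / 2  # ~$1.38 per million output tokens
OPEN_PER_M = (0.27 + 0.70) / 2    # ~$0.49 per million output tokens (Mixtral)

monthly_output_tokens = 50_000_000  # hypothetical app generating 50M tokens/month

closed_cost = CLOSED_PER_M * monthly_output_tokens / 1_000_000
open_cost = OPEN_PER_M * monthly_output_tokens / 1_000_000

print(f"Closed model: ${closed_cost:.2f}/month")
print(f"Open model:   ${open_cost:.2f}/month ({closed_cost / open_cost:.1f}x cheaper)")
```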

This affordability means a pricey AI subscription could get cheaper, and a small fee to use an AI-based app could disappear entirely, since the creator can absorb the costs. It also allows for techniques like using an internal monologue to improve generation quality, since generating the extra text is cheap.
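The internal-monologue idea, as a hedged sketch: ask the model to reason inside a scratchpad tag first, then strip that reasoning so the user only sees the final answer. The extra tokens would be costly on a pricier model but are cheap on an open one. The `chat` function, prompt wording, and tag format here are assumptions for illustration.

```python
import re

MONOLOGUE_PROMPT = (
    "Think through the problem step by step inside <scratchpad>...</scratchpad> tags, "
    "then give only your final answer after the word 'ANSWER:'."
)

def answer_with_monologue(chat, question: str) -> str:
    """chat(prompt) -> str is any completion function, e.g. a cheap open-model host.

    The model spends extra (cheap) tokens reasoning privately; the user only
    sees the text after ANSWER:.
    """
    raw = chat(f"{MONOLOGUE_PROMPT}\n\nQuestion: {question}")
    # Drop the scratchpad and keep whatever follows ANSWER:
    raw = re.sub(r"<scratchpad>.*?</scratchpad>", "", raw, flags=re.DOTALL)
    match = re.search(r"ANSWER:\s*(.*)", raw, flags=re.DOTALL)
    return match.group(1).strip() if match else raw.strip()
```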

In general, open AI fits more use cases than closed AI. Because it can run in more ways, it allows for more creative apps, better user experiences, and global adoption.