OpenAI Launches a Pair of AI Reasoning Models: o3 and o4-mini
via cnBeta.COM Chinese Industry Information Station - Telegram Channel
OpenAI has launched a pair of AI reasoning models: o3 and o4-mini. o4-mini offers what OpenAI calls a balance between price, speed, and performance, three factors that developers often weigh when choosing an AI model to power their applications.

Unlike OpenAI's previous reasoning models, o3 and o4-mini can generate responses using tools in ChatGPT, such as web browsing, Python code execution, image processing, and image generation. Starting today, these models, along with a variant of o4-mini called o4-mini-high (which spends more time crafting answers to improve their reliability), are available to OpenAI's Pro, Plus…
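For developers, choosing between the two models typically comes down to the price/speed/performance trade-off mentioned above. The sketch below shows what such a choice might look like with the OpenAI Python SDK; the model identifier "o4-mini" and API access are assumptions, since this excerpt only discusses availability in ChatGPT subscription tiers.

```python
# Minimal sketch: calling a reasoning model through the OpenAI Python SDK.
# Assumes the model is exposed under the identifier "o4-mini" and that an
# API key is available in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# o4-mini is positioned as the cheaper, faster option; swapping the model
# string for "o3" would trade cost and latency for more capability.
response = client.chat.completions.create(
    model="o4-mini",
    messages=[
        {"role": "user", "content": "Summarize the trade-offs between o3 and o4-mini."}
    ],
)

print(response.choices[0].message.content)
```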