According to GSM Arena, OpenAI began piloting group chats for ChatGPT users last week in just four countries: Japan, Taiwan, New Zealand, and South Korea. The company announced it’s now expanding the feature globally to all logged-in users across Free, Go, Plus, and Pro plans over the coming days. Group chats can include up to 20 participants and run on GPT-5.1 Auto, which automatically selects the best model for each response. Rate limits only apply when ChatGPT responds, not when users are chatting with each other. Early feedback from the pilot has been positive, prompting the global rollout.
The business play here
This is a pretty smart move from OpenAI. They tested in smaller, tech-savvy markets first – basically dipping their toes in the water before going all-in. Now they’re rolling it out to everyone. Here’s the thing: group chats aren’t just a nice-to-have feature. They’re a strategic play to make ChatGPT stickier in people’s workflows.
Think about it – when you’re collaborating with colleagues or friends in a shared AI space, you’re less likely to jump ship to another AI provider. The 20-person limit is interesting too – it’s large enough for most team collaborations but small enough to manage performance. And using GPT-5.1 Auto? That’s basically their way of saying “we’ll use whatever model works best for your specific situation,” which is a clever way to showcase their model portfolio.
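To make that "best model for each response" idea concrete, here's a minimal sketch of what a per-turn model router could look like. This is purely illustrative: the model names (`gpt-5.1-instant`, `gpt-5.1-thinking`) and the length/keyword heuristic are my assumptions, not OpenAI's actual routing logic.

```python
# Hypothetical sketch of per-response model routing, in the spirit of
# GPT-5.1 Auto. Model names and the routing heuristic are assumptions
# for illustration only, not OpenAI's real implementation.

def pick_model(prompt: str) -> str:
    """Route each group-chat turn to a fast or a heavier model."""
    reasoning_cues = ("prove", "debug", "step by step", "analyze")
    needs_reasoning = any(cue in prompt.lower() for cue in reasoning_cues)
    # Long prompts or explicit reasoning requests go to the stronger model;
    # everything else stays on the cheap, fast default.
    if needs_reasoning or len(prompt) > 2000:
        return "gpt-5.1-thinking"   # assumed name for the heavier model
    return "gpt-5.1-instant"        # assumed name for the fast default

print(pick_model("What's a good dinner spot for six?"))      # fast model
print(pick_model("Analyze this contract step by step"))      # heavier model
```

The point of a router like this is economic as much as technical: most group-chat chatter doesn't need frontier-level reasoning, so defaulting to a cheap model and escalating only when needed keeps a 20-person chat affordable to serve.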
Who wins with this?
Small teams and study groups are probably the biggest beneficiaries here. Students working on projects, remote teams brainstorming, even families planning trips – they all get a shared AI assistant that remembers context across multiple users. The rate limit policy is actually pretty thoughtful too. Only counting AI responses against limits means teams can chat freely without worrying about hitting caps until the AI actually participates.
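The rate-limit policy described above is simple enough to sketch: human-to-human messages flow freely, and only turns that trigger an AI response consume quota. The class name, the `mentions_ai` flag, and the return values below are all illustrative assumptions, not OpenAI's actual API.

```python
# Minimal sketch of the described policy: chatting between humans is
# free; only messages that trigger a ChatGPT response count against the
# quota. All names and limits here are illustrative assumptions.

class GroupChatQuota:
    def __init__(self, limit: int):
        self.limit = limit   # max AI responses in the current window
        self.used = 0

    def post(self, message: str, mentions_ai: bool) -> str:
        """Handle one message; charge quota only when the AI replies."""
        if not mentions_ai:
            return "delivered"       # human-to-human chat: no charge
        if self.used >= self.limit:
            return "rate_limited"
        self.used += 1               # only AI responses consume quota
        return "ai_responded"

quota = GroupChatQuota(limit=2)
print(quota.post("Where should we eat?", mentions_ai=False))     # delivered
print(quota.post("ChatGPT, suggest a spot", mentions_ai=True))   # ai_responded
```

Under this scheme a study group could trade dozens of messages and only burn quota on the handful of turns where ChatGPT actually weighs in, which matches the "thoughtful" framing above.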
But here’s a question: will this actually drive more paid subscriptions? Free users get access too, but I suspect teams doing serious work will quickly bump into those rate limits and consider upgrading. That’s probably the real calculation here – get people hooked on collaborative AI, then monetize the heavy usage.
It’s worth checking out OpenAI’s official announcement for the technical details. The rollout is happening now, so if you don’t see it yet, you probably will within the next few days.
