GPT-4 Turbo is now available. How can I start using it?
The publicly released, free version of ChatGPT was based on GPT-3.5. Almost a year later, OpenAI launched GPT-4, a much more capable model than its predecessor, which has now been updated to GPT-4 Turbo.
When ChatGPT was upgraded to GPT-4, the new model became available exclusively through the ChatGPT Plus subscription. Users can still use ChatGPT for free, but that version remains based on GPT-3.5.
Following the agreement between OpenAI and Microsoft, ChatGPT became available through Bing completely free of charge, initially with the GPT-3.5-based version and later with GPT-4. With the release of GPT-4 Turbo, many users are wondering when Bing Chat (now Copilot) will be updated to support the new version of OpenAI's language model.
Unlike with previous updates, we will have to wait to use Bing Chat with GPT-4 Turbo. According to Mikhail Parakhin, head of the web and Windows experiences team, the delay stems from Panos Panay's sudden departure a few weeks ago, after which the company decided to consolidate several teams.
Another reason is that GPT-4 Turbo is currently available only at the API level, so only developers can make use of it. Until Microsoft completes the reorganization triggered by Panay's departure to Amazon, there is no set timeline for bringing GPT-4 Turbo to Bing Chat.
GPT-4 Turbo is available for all paying developers to try by passing gpt-4-1106-preview as the model name in the API.
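As a minimal sketch of what that looks like, assuming the official openai Python package (v1 or later) and an OPENAI_API_KEY set in the environment; the prompt text here is purely illustrative:

```python
from openai import OpenAI

# The client reads OPENAI_API_KEY from the environment by default.
client = OpenAI()

# Request a chat completion from the GPT-4 Turbo preview model.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[
        {"role": "user", "content": "Summarize the differences between GPT-4 and GPT-4 Turbo."},
    ],
)

# Print the model's reply.
print(response.choices[0].message.content)
```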
GPT-4 vs. GPT-4 Turbo
With the launch of GPT-4, OpenAI made a remarkable leap in ChatGPT's capabilities, expanding both the amount of data the model was trained on and its ability to respond to longer questions.
The first update GPT-4 received was GPT-4 32K, which offered a somewhat larger context window than the original. With the launch of GPT-4 Turbo, which carries a 128K label, the context ChatGPT can analyze is expanded even further, enabling longer and more accurate responses.
Furthermore, unlike GPT-3.5, GPT-4 Turbo was trained on data extending to April 2023. The free version of ChatGPT can only draw on data produced before September 2021.
In practice, this means the new version accepts much longer text in the prompt or question from which we want a response. We can be far more precise and include many more details, all of which ChatGPT will take into account.
ChatGPT operates on tokens. A word does not map exactly to one token, but the two are roughly comparable: in English, a token corresponds to about three-quarters of a word on average. GPT-3.5 can handle up to 4,096 tokens, meaning we can input roughly 3,000 words for it to consider when generating a response.
When GPT-4 was updated to the 32K version, it could consider up to roughly 32,000 tokens. With the launch of GPT-4 Turbo, the model can analyze and consider up to 128,000 tokens when generating a response. In practice, that means we can ask much more complex questions, with far more data and detail, and get more comprehensive and accurate responses.
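To make these token figures concrete, here is a small sketch using OpenAI's tiktoken tokenizer to count the tokens in a piece of text and compare the count against the approximate context limits discussed above (4,096 for GPT-3.5, 32,768 for GPT-4 32K, and 128,000 for GPT-4 Turbo). The sample text and the limit labels are illustrative assumptions, not an official reference:

```python
import tiktoken

# Approximate context limits discussed above, in tokens (illustrative labels).
CONTEXT_LIMITS = {
    "GPT-3.5 (4K)": 4_096,
    "GPT-4 32K": 32_768,
    "GPT-4 Turbo (128K)": 128_000,
}

def count_tokens(text: str, model: str = "gpt-4") -> int:
    """Count how many tokens the text uses with the tokenizer for the given model."""
    encoding = tiktoken.encoding_for_model(model)
    return len(encoding.encode(text))

# A deliberately repetitive sample prompt, just to produce a sizeable token count.
prompt = "Explain the difference between GPT-4 and GPT-4 Turbo. " * 100
tokens = count_tokens(prompt)
words = len(prompt.split())

print(f"{words} words -> {tokens} tokens")
for name, limit in CONTEXT_LIMITS.items():
    fits = "fits" if tokens <= limit else "does not fit"
    print(f"{name}: the prompt {fits} within {limit:,} tokens")
```

Running a check like this before sending a long prompt is a simple way to see whether the extra context of GPT-4 Turbo is actually needed for a given input.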