TheNumbersNG
News

OpenAI Unveils GPT-4o, Faster And Free Iteration Of GPT-4 Model

May 17, 2024 · 3 Mins Read

Over the past week, conflicting reports had predicted what OpenAI would announce: an AI search engine to rival Google and Perplexity, a voice assistant baked into GPT-4, or a GPT-5 launch ahead of the Google I/O event.

At its event yesterday, OpenAI put those speculations to rest, instead launching an improved iteration of its GPT-4 model called GPT-4o (“o” for omni), which is faster, free, and improves capabilities across text, vision, and audio.

GPT-4o marks a pivotal evolution for the GPT models, transforming ChatGPT into a digital personal assistant that responds in real time and observes the world around you.

Notably, the voice mode of ChatGPT receives a substantial enhancement as part of the GPT-4o rollout. Evolving beyond its previous limitations of responding to one prompt at a time and working with only what it can hear, the app now embodies characteristics akin to the intelligent voice assistant in the 2013 film “Her”, offering real-time responsiveness and environmental awareness.

Its multimodal abilities also allow it to interact seamlessly via text and vision, enabling it to interpret and hold real-time spoken conversations about screenshots, images, documents, and charts uploaded by users.

The updated version is also getting memory capabilities, allowing it to learn from past interactions and provide more contextually relevant responses. Furthermore, GPT-4o facilitates real-time translation.

Meanwhile, the full potential of GPT-4o is still unfolding. For now, only its text and image capabilities are available; the other features will roll out in the coming days.

While it is good news that this model iteration is free and requires no subscription, existing subscribers still have reason to keep their plans.

This is because ChatGPT Plus subscribers get a higher message limit on the new GPT-4o than non-subscribers: up to five times as many prompts before having to wait or switch to a less powerful model.

Interestingly, this is the first time OpenAI is unveiling a new language model without a subscription fee. GPT-4, introduced in March last year, and GPT-4 Turbo were available only to ChatGPT Plus subscribers.

Going by what we know, the new voice mode will first be made available to ChatGPT Plus subscribers, who currently number about 250,000 globally per data from Nerdynav, before rolling out to non-subscribers.

Elvis Eromosele
