Microsoft May Update Bing With This Improved Version of ChatGPT
ChatGPT, OpenAI’s natural language processing (NLP) based chatbot, has taken the internet by storm ever since it was made available for public testing in November last year. The generative AI model is expected to take a massive leap with the introduction of GPT-4, which will succeed the GPT-3.5 series of autoregressive language models that currently powers the chatbot. The upgrade is reportedly being integrated into Microsoft’s applications, with the tech giant’s search engine, Bing, among the platforms set to benefit from the development.
Microsoft recently announced a multi-billion-dollar investment in OpenAI, the firm behind ChatGPT, in a move that could help its search engine challenge Alphabet’s Google as the most preferred search engine in the world. The tech giant has already announced the integration of ChatGPT into its workplace instant messaging platform, Microsoft Teams, through a premium subscription priced at $7 (roughly Rs. 600) per user per month.
According to a Semafor report, the upcoming GPT-4 system could be integrated into Microsoft’s search engine, Bing. GPT stands for generative pre-trained transformer, a type of AI model that parses through humongous datasets and generates comprehensive answers to queries in a way that mimics human language.
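To illustrate the idea, here is a minimal sketch of what autoregressive text generation looks like in practice. It uses the openly available GPT-2 model via Hugging Face’s transformers library purely as a stand-in, since ChatGPT’s own models and GPT-4 are not publicly downloadable; the prompt and generation settings below are arbitrary choices for the example.

```python
# A rough illustration of autoregressive generation: the model predicts one
# token at a time, each step conditioned on the prompt plus everything it
# has generated so far. GPT-2 stands in here for the much larger GPT models.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Search engines powered by AI will"
inputs = tokenizer(prompt, return_tensors="pt")

output_ids = model.generate(
    **inputs,
    max_new_tokens=40,          # generate up to 40 new tokens
    do_sample=True,             # sample rather than always pick the top token
    top_p=0.9,                  # nucleus sampling for more natural-sounding text
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```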
The most important upgrade GPT-4 will bring over its predecessor is the speed at which it can respond to queries, as per the report. This is reportedly being achieved through improved input datasets, algorithms, parameterisation, and alignment, rather than simply making the language model larger. The majority of the work is said to be happening on the server side, through optimisation tweaks that make the system more efficient, more accurate, and faster.
ChatGPT is currently free for all users to test as part of an ongoing public beta. On Wednesday, OpenAI also launched a pilot subscription plan, ChatGPT Plus, priced at $20 per month (roughly Rs. 1,650). The paid tier is largely a response to the cost of running the system, with each query costing OpenAI a couple of cents, according to CEO Sam Altman.
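As a back-of-the-envelope estimate based on those figures, at two cents per query the $20 monthly fee would correspond to the compute behind roughly 1,000 queries a month, or a little over 30 a day.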
The system is currently powered by a supercomputer co-developed by Microsoft and OpenAI that has 285,000 CPU cores, 10,000 GPUs, and 400Gbps of network connectivity, as per the report.
With the transformer architecture that underpins ChatGPT openly documented and open-source implementations freely available, developers have been rushing to get their hands on Nvidia’s latest graphics processor, the H100 Tensor Core GPU, which is well suited to running systems like those behind GPT. At around $30,000 (roughly Rs. 25,00,000) apiece, the H100 makes this a rather expensive affair, to say the least.
Much of the race for dominance of the search engine market could therefore be driven by AI-powered generative models like GPT, and in particular by server-side innovation around these models.
Google has held pole position ever since it overtook Yahoo years ago to become the world’s most preferred search engine. It remains to be seen whether Microsoft’s Bing can challenge, let alone overthrow, Google as the world’s most popular and widely used search engine.