It is impossible not to talk about ChatGPT right now, given the media buzz around OpenAI’s conversational tool. But how does it work? What mechanisms are at play in the “belly of the beast”? What are its limitations? And above all, the question that dominates the small world of search engines today: can this tool compete with Google in the long run? Sylvain Peyronnet, a widely recognized specialist in the field, offers his analysis and predictions here.
I have been working in the field of algorithms for over 20 years. My lifelong subject is decision making under uncertainty and in the context of big data. Some of the algorithms I have worked on now fall under what is commonly called artificial intelligence.
Yet I could never have imagined what the last few years have brought in terms of machine learning, natural language processing, image analysis, and more. We live in a time when outstanding results are presented almost weekly. To be honest, I feel like I’m reliving the early, exciting years of the Web…
The topic of the moment is obviously the famous ChatGPT. OpenAI’s latest release is making a lot of noise and owes its popularity to its ability to interact with humans. No one can deny that the tool is truly stunning, although its limitations also become apparent from the very first uses.
For SEO professionals, the AI revolution is both an opportunity and a risk. It is a risk for web copywriters, but an opportunity for those who hire them and who may see here a way to obtain at a lower cost what they used to pay for elsewhere. For a search engine like Google, it is obviously a risk: Sundar Pichai, Google’s CEO, has even set up a new internal organization to avoid being overtaken by OpenAI in the AI field.
In this article, I will tell you about ChatGPT. What is it really? What are its technical limits? How much does it cost? I will also try to give some answers to the question that torments us all: is ChatGPT a potential danger for Google?
ChatGPT, what is it?
ChatGPT (ref [1]) is a language model designed – as the name suggests – to be conversational. This means that it is able to follow instructions given by a human through a request (a question). To follow these instructions, the model has the ability to converse and to use a form of common sense that a human being would have and which is not found in other models (such as GPT-3, for example).
Another strong point of the model, in mimicking human behavior, is its conversational memory: ChatGPT can remember what you told it in previous questions and build its answers on this chat history. This is the most anthropomorphic aspect: at times we have the illusion of talking with a real person, one who occasionally makes mistakes in their answers.
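To make this “memory” more concrete, here is a minimal sketch in Python (an illustration only, not ChatGPT’s internal code), assuming the `openai` client library and a placeholder model name. The usual way to obtain this behavior with a language-model API is simply to resend the conversation history with each request, which is what lets each answer build on the earlier turns.

```python
# Minimal sketch of conversational "memory": the model itself is stateless,
# so the application resends the chat history with every request.
# Assumes the `openai` Python package; the model name is a placeholder.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

history = [
    {"role": "user", "content": "Give me three article ideas about technical SEO."},
    {"role": "assistant", "content": "1. Crawl budget. 2. Log file analysis. 3. Core Web Vitals."},
]

# The follow-up only makes sense because the previous turns are included.
history.append({"role": "user", "content": "Expand the second idea into an outline."})

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name, adjust to what is available
    messages=history,
)
print(response.choices[0].message.content)
```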
ChatGPT is the latest model in a long line of language models. It all started with word2vec by Tomas Mikolov (then at Google), followed by fastText by the same researcher (by then at Facebook), and many others such as ERNIE in 2019 (at Baidu), BERT in 2018 (Google), and ELMo and Grover in 2018 and 2019 (Allen Institute). There are also models from France (from LightOn).
Over time, these models have become larger and more expressive. But the real breakthrough was the emergence of transformer-based models at OpenAI, today the leading player in the field, ahead of everyone else for both image and text applications. Their first model, GPT, dates back to 2018. But the general public started paying attention with GPT-2, the first very large model, even though it had “only” 1.5 billion parameters and a training dataset of 8 million web pages. In hindsight, GPT-2 was ultimately just a proof of concept; the real breakthrough is GPT-3 (see reference [3]), a model with 175 billion parameters, trained on a dataset of several billion pages of content.
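To give an idea of what these models do in practice, here is a short sketch (an illustration, not OpenAI’s code) that loads the publicly released GPT-2 weights with the Hugging Face `transformers` library and generates text; the `gpt2-xl` checkpoint corresponds to the 1.5-billion-parameter variant mentioned above. GPT-3 itself is not downloadable and is only accessible through OpenAI’s API.

```python
# Illustrative sketch: load the public GPT-2 weights and generate text.
# "gpt2-xl" is the 1.5-billion-parameter checkpoint released by OpenAI.

from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2-xl")
model = AutoModelForCausalLM.from_pretrained("gpt2-xl")

# Roughly 1.5 billion trainable parameters.
print(f"Parameters: {sum(p.numel() for p in model.parameters()):,}")

# The model only does one thing: predict plausible next tokens.
inputs = tokenizer("Search engines will", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

This next-token prediction is the core mechanism; the conversational behavior of ChatGPT is obtained by further training such a model to follow instructions and by feeding it the chat history, as described above.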
…