The Beijing Academy of Artificial Intelligence (BAAI), often called "the OpenAI of China", recently launched the latest version of Wudao, a pre-trained deep learning model billed as "the world's largest model ever". The model has an incredible 1.75... wait for it... TRILLION parameters. But how big is that, really? Well, it is 10 times bigger than OpenAI's GPT-3, which until now held the title of the best model for language generation tasks.
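To put that "10 times bigger" claim in perspective, here is a quick back-of-the-envelope check, using GPT-3's widely reported figure of 175 billion parameters:

```python
# Back-of-the-envelope size comparison of the two models.
wudao_params = 1.75e12  # 1.75 trillion parameters (Wudao 2.0)
gpt3_params = 175e9     # 175 billion parameters (GPT-3)

ratio = wudao_params / gpt3_params
print(f"Wudao is {ratio:.0f}x the size of GPT-3")  # prints "Wudao is 10x the size of GPT-3"
```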
Another notable feature of Wudao is that it was built and trained to tackle multi-modal problems, handling text and images at the same time, two very different realms for current deep learning models. BAAI demonstrated Wudao performing tasks such as natural language processing, text generation, image recognition, and image generation.
Generally, deep learning architectures are built to be highly specialized in one set of problems at a time, like Convolutional Neural Networks for images and Recurrent Neural Networks for text. Because of that, multi-modal models (which can tackle multiple kinds of problems at once) are gaining popularity in the AI field, as they are seen as an important step towards AGI.
At BAAI, they believe that AGI can be achieved with big models (a massive quantity of parameters) and big computers (literally big computers). This makes me wonder... what's the next step for the USA (and the West in general) in the face of China's emerging AI supremacy?