[Baichuan Intelligence releases open-source Chinese-English model: free for commercial use] Science and Technology Innovation Board Daily reported on the 15th that Baichuan Intelligence, the company founded by Sogou founder Wang Xiaochuan, has announced baichuan-7B, an open-source Chinese-English pre-training model with 7 billion parameters. The baichuan-7B model has already been released on the Hugging Face, GitHub, and ModelScope platforms. Baichuan Intelligence said that, to verify the model's capabilities, baichuan-7B was evaluated comprehensively on three of the most influential Chinese evaluation benchmarks, C-Eval, AGIEval, and Gaokao, where it achieved excellent results and became the best-performing natively Chinese pre-trained model at the same parameter scale. The baichuan-7B code is released under the Apache-2.0 license, and the model weights are available under a free commercial-use license; a simple registration is all that is required for commercial use at no cost.
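For readers who want to try the Hugging Face release mentioned above, the sketch below shows how such a model could be loaded with the Hugging Face transformers library. The repository id baichuan-inc/Baichuan-7B, the prompt, and the generation settings are assumptions for illustration, not details stated in the report.

```python
# Minimal sketch: loading baichuan-7B from Hugging Face (repository id assumed).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baichuan-inc/Baichuan-7B"  # assumed repo id; verify on Hugging Face

# trust_remote_code is commonly required for models that ship custom modeling code
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# A base pre-training model continues text rather than following instructions,
# so a plain completion-style prompt is used here.
inputs = tokenizer("The capital of China is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Running this downloads several gigabytes of weights on first use; a GPU with sufficient memory (or loading in a lower-precision dtype) is advisable for a 7B-parameter model.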