Faifei News Agency — SenseTime (Shangtang Technology) and the Shanghai AI Laboratory, together with the Chinese University of Hong Kong, Fudan University, and Shanghai Jiao Tong University, recently released the hundred-billion-parameter large language model "Scholar Puyu" (InternLM). InternLM has 104 billion parameters and was trained on a high-quality multilingual dataset containing 1.6 trillion tokens. Comprehensive evaluation results show that InternLM not only performs well on many test tasks, such as knowledge mastery, reading comprehension, mathematical reasoning, and multilingual translation, but also demonstrates strong overall ability: it performs well on comprehensive examinations and has surpassed ChatGPT on a number of Chinese exams, including GaoKao, a dataset drawn from the various subjects of the Chinese college entrance examination.