On August 16, the Global Open Source Technology Conference (GOTC2024) was held in Shanghai, organized by Open Source China and the Shanghai Pudong Software Park. Zhang Qunhui, Chief Expert of Data Storage Software at Huawei, shared insights on the practical applications of ModelEngine in vertical large model domains.
Zhang described ModelEngine as Huawei's comprehensive AI training and inference toolchain within its Data Center Stack (DCS). According to Zhang, the toolchain is the first in the industry built on an AI streaming programming framework, providing an end-to-end solution spanning data processing, knowledge generation, model fine-tuning, deployment, and retrieval-augmented generation (RAG) application development. It offers data engineers, model engineers, and application developers a seamless AI development experience.
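To make the "streaming programming" idea concrete, the sketch below chains data-processing stages so records flow through an end-to-end pipeline. This is an illustrative sketch of the general flow-based pattern only; the stage names (`clean`, `chunk`, `tag`) and the `pipeline` helper are assumptions for the example, not ModelEngine's actual API.

```python
# Illustrative flow-based pipeline: each stage consumes the previous
# stage's output stream, so data moves end to end without intermediate
# materialization. Not ModelEngine's API -- just the general pattern.
from typing import Callable, Iterable, Iterator

def clean(records: Iterable[str]) -> Iterator[str]:
    """Stage 1: drop empty lines and normalize whitespace."""
    for r in records:
        r = " ".join(r.split())
        if r:
            yield r

def chunk(records: Iterable[str], size: int = 20) -> Iterator[str]:
    """Stage 2: split long documents into fixed-size text chunks."""
    for r in records:
        for i in range(0, len(r), size):
            yield r[i:i + size]

def tag(records: Iterable[str]) -> Iterator[dict]:
    """Stage 3: wrap each chunk as a corpus entry with metadata."""
    for n, r in enumerate(records):
        yield {"id": n, "text": r}

def pipeline(source: Iterable[str], *stages: Callable) -> Iterator:
    """Chain stages so each one lazily consumes the previous stream."""
    stream = source
    for stage in stages:
        stream = stage(stream)
    return stream

docs = ["  Huawei  ModelEngine   toolchain  ", "", "RAG application development"]
corpus = list(pipeline(docs, clean, chunk, tag))
```

Because each stage is a generator, nothing is processed until the final consumer pulls from the stream, which is what lets such pipelines scale to large corpora.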
He emphasized that the ModelEngine-based open-source toolchain supports data processing through open data operators, model use through mainstream models, and application building through application operators. The platform has significantly reduced corpus generation time from months to just days. Additionally, it provides developers with a one-stop solution for AI application development, evaluation, optimization, and deployment, enabling the rapid construction of AI applications while remaining compatible with existing AI assets such as LangChain and LlamaIndex plugins.
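The RAG application pattern mentioned above can be sketched as a retrieve-then-prompt loop. The toy overlap scorer, the `KNOWLEDGE` snippets, and the prompt format below are all illustrative assumptions, not the interfaces of ModelEngine, LangChain, or LlamaIndex.

```python
# Minimal retrieval-augmented generation (RAG) sketch: score documents
# against the query, keep the best matches, and assemble a grounded
# prompt. A real system would pass the prompt to an LLM for generation.
from collections import Counter

# Hypothetical knowledge snippets standing in for an indexed corpus.
KNOWLEDGE = [
    "ModelEngine is Huawei's AI training and inference toolchain.",
    "RAG augments a model's answer with retrieved documents.",
    "LangChain and LlamaIndex plugins can supply external tools.",
]

def score(query: str, doc: str) -> int:
    """Toy retriever: count word overlap between query and document."""
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the top-k documents ranked by overlap score."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble the retrieved context and question into one prompt."""
    context = "\n".join(f"- {d}" for d in docs)
    return f"Context:\n{context}\nQuestion: {query}\nAnswer:"

hits = retrieve("What is ModelEngine?", KNOWLEDGE)
prompt = build_prompt("What is ModelEngine?", hits)
```

In production the word-overlap scorer would be replaced by an embedding-based vector search, but the retrieve-then-prompt structure is the same.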