
Intel's Zhang Yu: Edge computing plays an important role in the entire AI ecosystem

Tech 2023-07-12 17:03:49 Source: Network

Global Network Technology reporter Lin Mengxue: Generative AI and large models are currently booming worldwide. At the recently concluded 2023 World Artificial Intelligence Conference (WAIC 2023), vendors waged a "war of a hundred models". According to incomplete statistics from the organizing committee, more than 30 large-model platforms were released and made public, 60% of the offline booths showcased introductions to and applications of generative AI technology, and 80% of attendees' discussions centered on large models.

During WAIC 2023, Zhang Yu, Senior Chief AI Engineer at Intel Corporation and Chief Technology Officer of the Network and Edge Business Unit in China, said that the core factor driving this round of artificial intelligence development is the continuous improvement of computing, communication, and storage technologies. Whether for large models or converged AI applications, the edge plays a very important role in the entire AI ecosystem.

Zhang Yu stated: "With the digital transformation of industry, demand for agile connectivity, real-time business, and application intelligence has driven the development of edge AI. However, most edge AI applications are still at the edge inference stage. That is, a model is trained in the data center with large amounts of data and great computing power, and the trained model is then pushed to the front end to run inference. This is how the vast majority of artificial intelligence is used at the edge today."

"This mode inevitably limits how often the model can be updated, yet many intelligent industries actually need frequent updates. Autonomous driving, for example, must adapt to different road conditions and the driving behaviors of different drivers. But when a model is trained at the car factory, the training data often differs from the data generated during real, dynamic driving. That difference affects the model's generalization ability, that is, its ability to adapt to new road conditions and driving behaviors. This requires us to continuously train and optimize the model at the edge," he said.

Therefore, Zhang Yu proposed that the second stage of edge AI development should be edge training. "To achieve edge training, we need more automated means and tools to complete the full development process from data annotation to model training and then model deployment," he said, adding that the next direction for edge artificial intelligence should be self-learning.

In actual development, edge artificial intelligence also faces many challenges. In Zhang Yu's view, beyond the challenges of edge training there are the challenges posed by edge devices themselves. "Because the power budget an edge device can carry is often limited, achieving edge inference and training with limited resources places higher demands on a chip's performance-per-watt." He also pointed out that edge devices are highly fragmented, and enabling software to migrate across different platforms imposes further requirements.

In addition, the development of artificial intelligence is closely tied to computing power, and behind computing power lies a huge data foundation. Faced with massive data assets, how to protect that data has become a hot topic in the development of edge artificial intelligence. Once artificial intelligence is deployed at the edge, the models are far removed from the service provider's control. How can the models be protected at that point, and protected effectively both in storage and at runtime? These are the challenges edge artificial intelligence faces.

"Intel is a data company, and our products cover computing, communication, and storage," Zhang Yu said. In computing, Intel offers a range of products, including CPUs, GPUs, FPGAs, and various AI acceleration chips, to meet users' differing computing power requirements. In the field of large AI models, for example, Gaudi2, launched by Intel subsidiary Habana, is described as the only product in the industry to demonstrate excellent performance in large-model training. For edge inference, Intel's OpenVINO deep learning deployment toolkit lets developers quickly deploy models designed and trained in open AI frameworks onto different hardware platforms to run inference.
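To make that deployment flow concrete, below is a minimal sketch of edge inference with OpenVINO's Python runtime. It assumes a recent openvino package is installed, a hypothetical IR model file "model.xml" exported from a training framework, and an assumed 1x3x224x224 input shape; it illustrates the pattern rather than reproducing Intel's reference code.

```python
# A minimal sketch of the edge-inference flow described above, assuming a
# recent OpenVINO Python runtime (openvino >= 2023.1). "model.xml" is a
# hypothetical IR file exported from a framework-trained model, and the
# 1x3x224x224 input shape is assumed purely for illustration.
import numpy as np
import openvino as ov

core = ov.Core()

# Load a model that was trained elsewhere (e.g. in a data center) and
# exported to OpenVINO's IR format.
model = core.read_model("model.xml")

# Compile the model for the local edge target. The device string ("CPU",
# "GPU", ...) is the main thing that changes when the same model is moved
# between different hardware platforms.
compiled_model = core.compile_model(model, device_name="CPU")

# Run inference on a dummy input and read the first output.
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled_model([dummy_input])[compiled_model.output(0)]
print(result.shape)
```

Swapping the device string (for example to "GPU") is typically the only change needed when the same model moves between hardware targets, which reflects the cross-platform deployment the article describes.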

