Intel, a renowned player in the technology sector for over four decades, envisions a significant year ahead for Artificial Intelligence (AI). Gadi Singer, Intel’s Vice President and Director of Emergent AI Research, reflects on the past and offers insights into the future. In a recent interview with Fierce Electronics, Singer underscores the swift evolution of AI and the emergence of agile models, shedding light on Intel’s perspective on AI deployment in 2023.
The expanding parameters of AI
AI is making strides, with Singer pointing out that the number of parameters in AI models is growing exponentially, increasing roughly tenfold each year. By his assessment, 2023 is poised to be a turning point, marked by the introduction of enormous models holding anywhere from 100 billion to 1 trillion parameters. This rapid expansion underscores the accelerating pace of innovation in the field.
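As a rough, back-of-the-envelope illustration of that tenfold-per-year claim, the short sketch below projects parameter counts forward; the 2020 starting point of roughly one billion parameters is an assumption made for the arithmetic, not a figure from the interview.

```python
# Back-of-the-envelope illustration of the tenfold-per-year growth Singer
# describes. The 2020 starting point (~1 billion parameters) is an assumption
# for the sake of arithmetic, not a figure from the interview.
params = 1e9
for year in range(2020, 2024):
    print(f"{year}: ~{params:,.0f} parameters")
    params *= 10  # tenfold growth each year
```

Under that assumption the projection lands at roughly one trillion parameters by 2023, in line with the upper end of the range Singer cites.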
Amid the push for ever-larger models, Singer emphasizes the merits of smaller, more focused AI models. With their reduced parameter counts, these leaner models offer an alternative approach to AI deployment. Singer argues that not every application demands the extensive capability of a large model; smaller models can be tailored to specific use cases, delivering precision and efficiency.
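As a hedged illustration of what a small, task-specific model looks like in practice, the sketch below loads a compact sentiment classifier rather than a general-purpose large model. The Hugging Face transformers library and the specific checkpoint are illustrative assumptions, not tools discussed in the interview.

```python
# Minimal sketch: a compact, task-specific model instead of a general-purpose LLM.
# Assumes the Hugging Face "transformers" package; the checkpoint named below is
# an illustrative choice, not one discussed by Intel.
from transformers import pipeline

# A distilled classifier (tens of millions of parameters) fine-tuned for one narrow task.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("The new processor exceeded our latency targets."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```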
Retrieval-based traceability
A critical aspect of these agile models is retrieval-based traceability: information is fetched from traceable external sources rather than generated solely from the model’s internal memory. This approach offers numerous advantages, including transparency, verifiability, improved accuracy, adherence to copyright regulations, reduced bias, broader access to private data, and reliance on selective, secure data sources. Retrieval-based traceability represents a substantial leap in AI’s ethical and practical dimensions.
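To make the idea concrete, here is a minimal, self-contained sketch of retrieval with source attribution: an answer is assembled only from documents held in a small store, and each snippet carries a citation back to its source. The keyword-overlap scoring and the toy document store are deliberate simplifications assumed for illustration, not Intel’s implementation.

```python
# Minimal sketch of retrieval-based traceability: the answer is grounded in
# documents fetched from a store, and every snippet keeps a reference to its
# source. Scoring by keyword overlap is a deliberate simplification; a real
# system would use embeddings and a generative model.
from dataclasses import dataclass

@dataclass
class Document:
    source: str   # where the text came from (URL, file, database record)
    text: str

STORE = [
    Document("intel.com/meteor-lake", "Intel Core Ultra integrates an NPU for on-device AI."),
    Document("example.org/ai-models", "Smaller models can be tailored to specific use cases."),
]

def retrieve(query: str, store: list[Document], k: int = 2) -> list[Document]:
    """Rank documents by naive keyword overlap with the query."""
    words = set(query.lower().split())
    scored = sorted(store, key=lambda d: -len(words & set(d.text.lower().split())))
    return scored[:k]

def answer(query: str) -> str:
    hits = retrieve(query, STORE)
    # Each retrieved snippet is cited, so the answer is verifiable end to end.
    lines = [f"- {d.text} [source: {d.source}]" for d in hits]
    return f"Q: {query}\n" + "\n".join(lines)

print(answer("Which Intel processor has an NPU for AI?"))
```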
Environmental and cost considerations
Singer underscores the environmental and cost benefits of smaller AI models. Operating smaller models conserves energy and significantly reduces the carbon footprint. This aligns with global endeavors to make technology more sustainable. Additionally, from a financial perspective, smaller models can be more cost-effective to deploy and maintain.
Security in trusted environments
Security remains a paramount concern in the AI landscape. Singer emphasizes that agile models can be run within trusted environments, removing the need to transmit queries to external cloud services. This approach bolsters data privacy and security, which is particularly significant for businesses handling sensitive information.
The coexistence of cloud and edge
Taking a nuanced view, Singer highlights the coexistence of cloud and edge computing in AI. While larger models may continue to run in the cloud, smaller, specialized models find their niche at the edge. This dual approach allows flexibility in AI deployment, catering to a spectrum of use cases; the choice between cloud and edge hinges on security and specific business requirements.
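A hedged sketch of how such a split might be expressed in application code follows: requests tagged as sensitive stay on a local edge model, while the rest may go to a cloud endpoint. The function names and the routing rule are illustrative assumptions, not an Intel API.

```python
# Illustrative sketch of a cloud/edge split driven by data sensitivity.
# run_local_model and call_cloud_service are hypothetical stand-ins for
# whatever inference runtimes a deployment actually uses.
def run_local_model(prompt: str) -> str:
    return f"[edge model] handled: {prompt!r}"

def call_cloud_service(prompt: str) -> str:
    return f"[cloud model] handled: {prompt!r}"

def route(prompt: str, contains_sensitive_data: bool) -> str:
    # Security requirement: sensitive queries never leave the trusted environment.
    if contains_sensitive_data:
        return run_local_model(prompt)
    # Otherwise a larger cloud-hosted model can be used for broader capability.
    return call_cloud_service(prompt)

print(route("Summarize this internal contract.", contains_sensitive_data=True))
print(route("What is the capital of France?", contains_sensitive_data=False))
```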
Intel’s breakthrough: The AI PC
Intel is poised to play a pivotal role in this evolving AI landscape with its latest development: the AI PC. At the Intel Innovation event on September 19, CEO Pat Gelsinger unveiled the Intel Core Ultra processor, codenamed Meteor Lake. The processor combines a CPU, GPU, and neural processing unit (NPU), with the NPU dedicated to on-chip AI acceleration. Set for release on December 14, it will debut in the new Acer Swift laptop.
Intel touts the Core Ultra as the “most significant client architectural shift in four decades.” Built on the new Intel 4 process node and employing a 3D performance hybrid architecture with Foveros packaging technology, the processor is set to power the AI PC and serve as a cornerstone for client computing in edge devices. These devices, including those deployed in challenging industrial environments, stand to benefit from the Core Ultra’s capabilities.
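The article does not say how developers will target the NPU; one plausible path on Intel hardware is the OpenVINO runtime, which exposes the NPU as a device target. The sketch below is an assumption-laden illustration: the model path is a placeholder, and NPU availability depends on the driver stack and OpenVINO version.

```python
# Hedged sketch: running an already-converted model on the Core Ultra's NPU via
# the OpenVINO runtime (an assumption; not named in the article). "model.xml" is
# a placeholder path, and the code falls back to CPU if no NPU is detected.
import openvino as ov

core = ov.Core()
device = "NPU" if "NPU" in core.available_devices else "CPU"

compiled = core.compile_model("model.xml", device_name=device)  # placeholder model
infer_request = compiled.create_infer_request()
print(f"Model compiled for device: {device}")
```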
The AI landscape is on the brink of a transformative year in 2023, as Intel anticipates the introduction of colossal AI models holding up to a trillion parameters. At the same time, Intel is quick to underscore the importance of smaller, specialized models, underpinned by retrieval-based traceability and offering environmental and cost benefits.