AMD CTO Highlights Shift of AI Inference from Data Centers to Phones and Laptops
AMD's CTO, Mark Papermaster, highlights a significant industry transition as AI inference workloads shift from traditional data centers to edge devices such as smartphones and laptops. This move promises improved performance, reduced latency, and better energy efficiency, enabling smarter, more responsive personal devices. By leveraging AMD's latest chip technologies, the trend aims to bring advanced AI capabilities directly into consumers' hands, fostering innovation in wearable tech, mobile computing, and portable AI solutions.

The shift to edge computing not only benefits end users with faster access to AI features but also reduces reliance on centralized data centers, improving overall network efficiency. As AMD positions itself at the forefront of this evolution, it underscores the importance of hardware optimized for on-device AI inference, transforming how AI applications are deployed and experienced worldwide.