Razer AIKit Adds Omni-Modal AI and ARM64 Support

Razer has updated its open-source AI development toolkit, AIKit, adding native support for image, video and audio models alongside compatibility with the ARM64 processor architecture. The release significantly broadens the toolkit's scope beyond its original text-based workflows, which were introduced when AIKit debuted at CES 2026.

Expanded AI Modality and Hardware Support

The latest version of AIKit gives developers a single, unified foundation for building and deploying local-first AI across multiple modalities. Previously limited to text, the toolkit now supports image, video and audio AI model workflows. The ARM64 compatibility extension increases the range of devices and cloud infrastructure on which AIKit can run, making it accessible to a broader set of development environments.

Razer also announced that AIKit is now available on Akash Network's Akash Console, a browser-based interface for deploying and managing applications without requiring command-line tools. The integration allows developers and GPU operators to deploy AIKit workloads across Akash's decentralised compute marketplace through the same architecture used in production.
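For context, deployments on Akash's marketplace are described in its Stack Definition Language (SDL), a YAML format. The sketch below shows what an AIKit workload definition might look like; the container image name, port, and resource figures are illustrative assumptions, not values published by Razer or Akash.

```yaml
---
version: "2.0"

services:
  aikit:
    # Hypothetical image name -- the actual AIKit image is not specified in the announcement.
    image: example/razer-aikit:latest
    expose:
      - port: 8080
        as: 80
        to:
          - global: true

profiles:
  compute:
    aikit:
      resources:
        cpu:
          units: 4
        memory:
          size: 8Gi
        storage:
          size: 20Gi
        # GPU request lets providers on the marketplace bid with suitable hardware.
        gpu:
          units: 1
          attributes:
            vendor:
              nvidia:

  placement:
    dcloud:
      pricing:
        aikit:
          denom: uakt
          amount: 1000

deployment:
  aikit:
    dcloud:
      profile: aikit
      count: 1
```

Submitting a file like this through the Akash Console lets providers bid on the workload, which is how GPU operators on the marketplace would pick up AIKit deployments.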

AVA Mini as a Proof of Concept

Razer used its April 2026 campaign, Razer AVA Mini, as a public demonstration of AIKit's production capabilities. The campaign showed that the same toolkit used by individual developers locally can underpin large-scale, consumer-facing AI deployments, while also reducing operating costs. A joint whitepaper with Akash Network detailing the system architecture, unit economics and engineering decisions behind the deployment is available at rzr.to/aikit-akash.

“Razer AIKit is about removing friction for developers as their ideas grow. It’s designed to help teams move faster without needing to re-tool as they scale. AVA Mini demonstrated how a single development foundation can support everything from early experimentation to a live, global consumer deployment.” — Quyen Quach, Vice President of Software, Razer

What’s Next

Upcoming releases will add voice and video support to enable voice-driven interactions and video-generation workflows within the AIKit environment. The toolkit is available for free on GitHub. Razer positions AIKit as the core software layer for its broader AI developer ecosystem, covering building, scaling and deployment of local-first AI across devices and environments.

Discover more from techcoffeehouse.com
