Seed

November 2024, Argmax Palo Alto Office

We are thrilled to share that Argmax raised $8M led by Salesforce Ventures with participation from General Catalyst, Julien Chaumond, Amjad Masad, Michele Catasta, and other industry leaders. These funds will be allocated towards growing our world-class engineering team to accelerate our on-device inference roadmap and scaling our go-to-market efforts.

We are looking for exceptional folks to join us in pulling the on-device future forward faster!

On-Device ML Frameworks Engineer (Staff)

Inference Performance Engineer (Staff)

ML Software QA Lead Engineer

Foundation Models On Device

On-device, offline, private AI was a futuristic concept for many as late as 2023. Before founding Argmax, we witnessed the inflection point of this technology while building on-device inference platforms at Apple and Google as early as 2018. We believe this technology will mature and go mainstream in production use cases within the next two years. In fact, we will make sure it does.

Adoption of a new technology explodes when the technology becomes mature enough to disappear into the background. We are building open-source tools and frameworks to accelerate that explosion for on-device inference. Our entire product portfolio is about bringing Foundation Models On Device (FMOD). Beyond best-in-class performance and feature sets in our FMOD SDKs, we differentiate through a relentless focus on reliability in the end-user experience.

Our incentives are fully aligned with those of our developer and enterprise customers: making FMOD technology accessible to as many would-be users as possible, regardless of the age or brand of their device. WhisperKit has already made a dent in Automatic Speech Recognition, making state-of-the-art Speech Foundation Models available on five-year-old consumer devices. In collaboration with Qualcomm AI Hub, we have started the journey of bringing these capabilities to Android and Linux platforms.
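For a sense of the developer experience, here is a minimal sketch of transcribing a local audio file with the open-source WhisperKit Swift package. It follows the basic API shown in the WhisperKit README (default model selection and transcribe(audioPath:)); the file path is a placeholder, and return types may vary by package version, so consult the documentation for the release you install.

import WhisperKit

// Minimal sketch: load a default Whisper model on-device and transcribe a local file.
// The audio path is a placeholder; the model is downloaded on first run.
Task {
    let pipe = try? await WhisperKit()
    let result = try? await pipe?.transcribe(audioPath: "path/to/audio.m4a")
    print(result ?? "transcription unavailable")
}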

Market

We are working with customers across industries as diverse as social media, gaming, healthcare, and writing tools. It is extremely exciting to see all the innovative products our customers have in the pipeline for their users.

Our thesis is that the majority of the commercial value in AI will be created through delightful user experiences powered by efficient models that reliably solve the task at hand, as opposed to generalist models. On the edge, efficient models will be faster, more reliable, more economical, and better aligned with end users than in the cloud. We expect cloud inference to keep growing for certain use cases as part of a growing pie, and we believe the market is big enough for both types of businesses to succeed.
