Mirai Raises $10M to Boost On-Device AI Performance

AI Fresh Daily
Feb 19, 2026

This article was written by AI based on multiple news sources.

A new startup founded by the co-creators of the popular AI apps Reface and Prisma has emerged with a $10 million seed round to tackle a critical bottleneck in artificial intelligence. The company, Mirai, is focusing its efforts on improving the efficiency and performance of AI models that run directly on consumer devices like smartphones and laptops. This technical push aims to shift more AI processing from powerful cloud servers to the hardware in users' pockets, a move with significant implications for privacy, cost, and accessibility.

The founding team, which includes Denys Dovhan and Yaroslav Boiko, brings substantial experience from building viral AI-powered applications. Their previous ventures, Reface for face-swapping videos and Prisma for transforming photos into artistic styles, relied heavily on both cloud and on-device processing. This hands-on experience with the limitations of current mobile inference—the process where a trained AI model makes predictions or generates content—has directly informed Mirai's mission. The $10 million seed investment provides the capital to develop specialized software and tools intended to make complex AI models run faster and more efficiently on standard device hardware without a constant internet connection.

At its core, Mirai's work addresses the growing demand for local AI processing. Currently, many advanced AI features, from sophisticated language model chatbots to real-time image generators, require sending data to remote data centers for computation. This cloud-dependent model introduces latency, ongoing operational costs for providers, and potential privacy concerns as user data traverses the network. By optimizing inference to happen on-device, Mirai seeks to mitigate these issues. The technical challenge is substantial, as it involves compressing models, optimizing them for specific mobile chipsets (like those from Apple, Qualcomm, and MediaTek), and managing constrained memory and battery life, all while maintaining the quality and speed of the AI's output.
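To make the compression step concrete, here is a minimal sketch of symmetric int8 weight quantization, one of the standard techniques for shrinking models so they fit in constrained device memory. This is an illustrative example of the general approach, not Mirai's actual tooling; the function names and values are invented for demonstration.

```python
# Illustrative sketch: symmetric int8 post-training weight quantization.
# Storing each weight as one int8 byte instead of a 4-byte float32
# cuts model size roughly 4x, at the cost of small rounding error.

def quantize_int8(weights):
    """Map float weights onto the integer range [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [v * scale for v in q]

# Hypothetical weights from one layer of a model.
weights = [0.82, -1.54, 0.03, 2.17, -0.66]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Rounding error is bounded by half of one quantization step.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

In practice, on-device toolchains layer many refinements on top of this idea (per-channel scales, activation quantization, hardware-specific kernels for NPUs), but the size-versus-accuracy trade-off sketched here is the core of the problem Mirai is working on.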

The successful development of Mirai's technology could accelerate a broader industry trend toward hybrid or fully local AI. For application developers, more efficient on-device inference lowers the barrier to integrating powerful AI features, as it reduces or eliminates cloud infrastructure costs and complexities. For end-users, the benefits are tangible: faster response times for AI assistants, the ability to use features offline or in areas with poor connectivity, and enhanced data privacy as sensitive information, such as personal photos or documents, never leaves the device. This shift also aligns with the hardware roadmaps of major tech companies, which are increasingly designing system-on-chips with dedicated neural processing units (NPUs) to handle these very workloads.

While Mirai is entering a competitive space with other companies and open-source projects focused on model optimization, its founders' proven track record in creating mass-market AI applications gives it a distinct practical perspective. The seed funding will be crucial for building out its team and technology stack. If successful, Mirai's tools could empower a new wave of applications that offer cloud-level AI sophistication with the immediacy and privacy of local computation, fundamentally changing how users interact with intelligent features on their most personal devices.

Key Points

  • Mirai founded by Reface and Prisma co-founders
  • Raised $10 million in seed funding
  • Focuses on improving on-device AI model inference
  • Targets smartphones and laptops for local AI processing
Why It Matters

Efficient on-device AI reduces cloud dependence, lowering costs for developers while offering users faster, more private, and offline-capable applications.