A Brief Analysis of McKinsey's Lilli: What Development Ideas Does It Provide for the Enterprise AI Market?

The McKinsey Lilli case offers a key development insight for the enterprise AI market: the potential market opportunity of Edge Computing + small models. This AI assistant, which integrates 100,000 internal documents, has not only reached a 70% adoption rate among employees but is used an average of 17 times per week, a level of product stickiness that is rare among enterprise tools. Below, I will share my thoughts:

  1. Data security is a real enterprise pain point: the core knowledge assets McKinsey has accumulated over a century, like the proprietary data held by many small and medium-sized enterprises, are highly sensitive and unsuited to processing on public clouds. Finding a balance where "data never leaves the local environment and AI capability is not compromised" is a genuine market demand, and Edge Computing is one direction worth exploring.
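The "data does not leave the local environment" requirement above is, in practice, a routing policy: sensitive queries go to an on-prem model, everything else may use a public cloud API. A minimal sketch follows; the keyword check is a stand-in for a real sensitivity classifier, and all names (`SENSITIVE_MARKERS`, `route_query`, the tier labels) are illustrative assumptions, not part of Lilli.

```python
# Sketch of a "data stays local" routing policy. Keyword matching is a
# placeholder for a real sensitivity classifier; names are illustrative.

SENSITIVE_MARKERS = {"client", "engagement", "internal", "confidential"}

def route_query(query: str) -> str:
    """Return which inference tier should handle the query."""
    words = set(query.lower().split())
    return "local-edge-model" if words & SENSITIVE_MARKERS else "cloud-api"

print(route_query("Summarize this confidential engagement memo"))  # local-edge-model
print(route_query("What is the capital of France?"))               # cloud-api
```

In a real deployment, the local tier would be a small model served on edge hardware inside the corporate network, so sensitive text never crosses the perimeter.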

  2. Specialized small models will replace general-purpose large models: enterprise users do not need a "billion-parameter, all-purpose" general model; they need a professional assistant that can accurately answer questions in a specific domain. There is an inherent tension between a large model's generality and its professional depth, and in enterprise scenarios small models are often valued more.

  3. The cost balance between self-built AI infrastructure and API calls: although the Edge Computing + small model combination requires a large upfront investment, long-term operating costs drop significantly. If a large model used frequently by 45,000 employees were served through API calls, the resulting vendor dependency and per-call costs that scale with usage would make self-built AI infrastructure the rational choice for medium and large enterprises.
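The break-even logic above can be made concrete with a back-of-the-envelope model. Every number below (token counts, API price, hardware cost, amortization period) is a hypothetical assumption for illustration, not a vendor quote; only the usage scale (45,000 employees, 17 uses/week) comes from the article.

```python
# Hypothetical break-even sketch: metered API pricing vs. self-hosted inference.
# All prices and hardware figures are illustrative assumptions.

def annual_api_cost(employees, uses_per_week, tokens_per_use, price_per_1k_tokens):
    """Yearly spend if every query goes through a metered API."""
    weekly_tokens = employees * uses_per_week * tokens_per_use
    return weekly_tokens * 52 / 1000 * price_per_1k_tokens

def annual_self_hosted_cost(capex, amortization_years, yearly_opex):
    """Yearly spend if the model runs on owned edge/on-prem hardware."""
    return capex / amortization_years + yearly_opex

# Usage scale from the article; everything else is assumed.
api = annual_api_cost(45_000, 17, tokens_per_use=4_000, price_per_1k_tokens=0.02)
self_hosted = annual_self_hosted_cost(capex=4_000_000, amortization_years=4,
                                      yearly_opex=800_000)

print(f"API:         ${api:,.0f}/year")          # $3,182,400/year
print(f"Self-hosted: ${self_hosted:,.0f}/year")  # $1,800,000/year
```

At these assumed numbers the metered API spend exceeds the amortized self-hosted cost; the crossover point shifts with usage volume, which is exactly why high-frequency internal tools like Lilli favor owned infrastructure.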

  4. New opportunities in the edge hardware market: large model training relies on high-end GPUs, but edge inference has completely different hardware requirements. Chip manufacturers like Qualcomm and MediaTek are seizing this opening with processors optimized for edge AI. As every company sets out to build its own "Lilli", edge AI chips designed for low power consumption and high efficiency will become essential infrastructure.

  5. The decentralized web3 AI market also benefits: once enterprise demand drives small-model computing power, fine-tuning, and algorithms, resource scheduling becomes the key problem. Traditional centralized scheduling will struggle at that scale, which directly creates significant market demand for decentralized web3 AI small-model fine-tuning networks, decentralized computing power service platforms, and the like.

While the market is still debating the boundaries of AGI's general capabilities, it is gratifying to see so many enterprise users already extracting practical value from AI. Clearly, compared with past leaps in computing power and algorithms monopolized by a few players, a market shift toward Edge Computing + small models will bring far greater market vitality.
