Neuchips Partners with Vecow and GSH to Accelerate Proprietary Data Processing with Offline Gen AI

By using Neuchips' Viper series cards, the collaboration also delivers power-efficient, offline LLMs that give enterprises maximum privacy and security.

TAIPEI, March 5, 2025 /PRNewswire/ -- Ahead of Embedded World 2025, Neuchips, a leading Artificial Intelligence (AI) Application-Specific Integrated Circuits (ASIC) provider, is announcing a collaboration with Vecow and Golden Smart Home (GSH) Technology Corp.'s ShareGuru. The partnership is aimed at revolutionizing SQL data processing using a private, secure, and power-efficient AI solution, which delivers real-time insights from in-house databases via natural language requests.

Neuchips partners with GSH and Vecow to launch AI solution that converts conversations to SQL for seamless database queries.

Please join Neuchips and Vecow at Hall 3, Booth #3-449 during Embedded World 2025 (March 11-13, Nuremberg, Germany).

"Our collaboration with Vecow and GSH represents the future of industrial AI deployment," said Ken Lau, CEO of Neuchips. "At Embedded World 2025, visitors to Vecow's booth will experience how our Viper AI accelerator card's unique capabilities—including 12B parameter model support at just 45W power consumption—complement Vecow's robust industrial Edge AI Computing Systems and GSH's ShareGuru SLM solutions. This powerful combination delivers secure, efficient AI processing of proprietary data that meets the demanding requirements of modern industrial environments. We're proud to partner with  Vecow to bring this generative AI innovation into the enterprise-focused application space."

"As on-premise generative AI applications expand, the demand for multimodal large language models (LLMs) is rapidly growing," said Joseph Huang, Executive Vice President at Vecow. "As a provider of edge AI computing solution services, Vecow is partnering with Neuchips to develop cutting-edge RAG-based LLM solutions, enabling users to access the latest data without training models, thereby delivering more relevant and high-quality results. It is essential for our customers who seek a cost-effective, compact and low-power AI workstation that outperforms traditional cloud-based GPU solutions."

Combining forces for the ultimate AI-driven data processing solution

As database complexity grows and SQL expertise remains scarce, businesses face significant delays in extracting critical insights from their data. Cloud-based AI models, however, are often ruled out because they cannot guarantee protection of proprietary information.

To solve these pain points for enterprises across industries, the breakthrough solution leverages the Vecow ECX-3100 RAG Edge AI Inference Workstation, a Retrieval-Augmented Generation (RAG)-enabled LLM computing platform. It runs GSH's ShareGuru QA 2.0 solution, powered by the ShareGuru SLM Platform, on a single Neuchips LLM card: the Viper series Gen AI card.

Combined, the solution lets users generate SQL queries in plain language, making database access more efficient while reducing the cost of SQL expertise; a minimal sketch of this flow follows the list below. In addition, it offers:

  • Maximum data privacy and security: Neuchips' offline card runs the ShareGuru solution and platform locally
  • High accuracy: Through AI-powered query validation
  • High power efficiency: Neuchips' Viper series runs a full 12-billion-parameter model at just 45W
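As a rough illustration of the conversational-query flow described above, the sketch below turns a plain-language question into a validated, read-only SQL query against a local database. The text_to_sql() callable stands in for the locally hosted language model, and the validation step is deliberately simplified; this is an assumption-laden sketch, not the ShareGuru QA 2.0 API.

```python
# Sketch of a natural-language-to-SQL flow with basic query validation.
# text_to_sql() is a hypothetical placeholder for a local text-to-SQL
# model; this is not the ShareGuru QA 2.0 or Neuchips API.
import sqlite3
from typing import Callable

def ask_database(question: str,
                 conn: sqlite3.Connection,
                 text_to_sql: Callable[[str, str], str]):
    # Describe the schema so the model can ground table and column names.
    schema = "\n".join(
        row[0] for row in conn.execute(
            "SELECT sql FROM sqlite_master WHERE type = 'table'"
        ) if row[0]
    )
    sql = text_to_sql(question, schema)  # e.g. "SELECT region, SUM(sales) ..."

    # Validate before executing: allow only read-only SELECT statements and
    # let the database planner confirm the query is valid for this schema.
    if not sql.lstrip().lower().startswith("select"):
        raise ValueError(f"Refusing non-SELECT statement: {sql}")
    conn.execute("EXPLAIN " + sql)  # raises sqlite3.OperationalError if invalid

    return conn.execute(sql).fetchall()
```

Because every step of this pipeline runs on a local workstation and accelerator card, proprietary data never has to leave the premises.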

Neuchips' Viper series AI accelerator card

Launched at COMPUTEX 2024, the Viper series offloads more than 90% of the compute required for generative AI from the CPU, unleashing the full potential of LLMs. It distinguishes itself in the market by offering:

  • Extra 32GB memory capacity
  • Native BF16 Structured Language Model support
  • Neuchips' Raptor Gen AI accelerator chip, launched at CES 2024

Looking ahead to 2026, Neuchips plans to focus on low-power, multi-modality ASICs for further performance gains.

About Neuchips

Neuchips is at the forefront of AI ASIC solutions, pioneering the development of purpose-built hardware for DLRM and LLM. With our dedicated team of experts, a commitment to innovation, and a strong presence in industry organizations, we are poised to continue shaping the future of AI hardware and ushering in a new era of efficiency and performance.

To learn more, please visit: http://www.neuchips.ai/