Neuchips, Vecow, and GSH collaborate on offline GenAI for data processing

DATE POSTED: March 5, 2025

Neuchips, a provider of AI Application-Specific Integrated Circuits (ASICs), has announced a partnership with Vecow and Golden Smart Home (GSH) Technology Corp.’s ShareGuru to develop an offline generative AI solution for proprietary data processing. The collaboration aims to enable businesses to extract real-time insights from in-house databases using natural language queries.

The partnership addresses the challenges of complex SQL data processing and the need for data privacy. The solution combines Vecow’s ECX-3100 RAG Edge AI Inference Workstation, GSH’s ShareGuru QA 2.0 solution, and Neuchips’ Viper series Gen AI card.

Key aspects of the collaborative solution include:

  • Offline data processing: Neuchips’ Viper series card enables local processing of the ShareGuru solution, enhancing data privacy and security.
  • Natural language queries: The system allows users to generate SQL queries using human language, simplifying data access and reducing reliance on SQL expertise (a minimal sketch of this pattern follows this list).
  • Power efficiency: Neuchips highlights the Viper series’ ability to support 12-billion-parameter models at 45 W power consumption.
  • RAG-enabled LLM: The solution leverages Retrieval-Augmented Generation (RAG) to provide access to up-to-date data without requiring model retraining.
  • Hardware specifications: The Neuchips Viper series AI accelerator card features 32GB of memory capacity and native BF16 Structured Language Model support.

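The workflow the bullets describe follows a common retrieval-augmented text-to-SQL pattern. The Python sketch below is purely illustrative and assumes nothing about the ShareGuru or Neuchips APIs: the table names, the keyword retriever standing in for a vector store, and the generate_sql placeholder (where a locally hosted model would run on the accelerator card) are all hypothetical.

# Illustrative sketch only, not the ShareGuru/Neuchips implementation. It shows
# the general RAG-to-SQL pattern described in the article: retrieve relevant
# schema context, ask a locally hosted LLM for SQL, run the query in-house.
import sqlite3

# Hypothetical schema documentation used as the retrieval corpus.
SCHEMA_DOCS = {
    "orders": "orders(id INTEGER, customer TEXT, total REAL, placed_at TEXT)",
    "customers": "customers(id INTEGER, name TEXT, region TEXT)",
}

def retrieve_context(question: str) -> str:
    """Naive keyword retrieval standing in for a vector store."""
    hits = [doc for table, doc in SCHEMA_DOCS.items()
            if table.rstrip("s") in question.lower()]
    return "\n".join(hits or SCHEMA_DOCS.values())

def generate_sql(question: str, context: str) -> str:
    """Placeholder for the on-device LLM call (e.g. a 12-billion-parameter
    model running on the accelerator card). It returns a canned query so the
    sketch runs without any model."""
    prompt = f"Schema:\n{context}\n\nQuestion: {question}\nSQL:"
    print(prompt)  # what would be sent to the local model
    return "SELECT customer, SUM(total) FROM orders GROUP BY customer;"

def answer(question: str, conn: sqlite3.Connection):
    context = retrieve_context(question)   # RAG step: ground the model in the current schema
    sql = generate_sql(question, context)  # LLM step: natural language -> SQL
    return conn.execute(sql).fetchall()    # execute locally; data never leaves the site

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL, placed_at TEXT)")
    conn.execute("INSERT INTO orders VALUES (1, 'Acme', 120.0, '2025-03-01')")
    print(answer("What is the total order value per customer?", conn))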

Neuchips emphasizes that its Viper series card offloads a significant portion of generative AI processing from the CPU. Vecow highlights the solution’s role in meeting the growing demand for on-premise generative AI applications and multimodal large language models (LLMs). GSH’s ShareGuru Platform provides the SLM solution that interacts with the Neuchips hardware.

The companies will showcase their collaborative solution at Embedded World 2025 in Nuremberg, Germany, at Vecow’s booth. Neuchips also announced plans to focus on low-power multi-modality ASICs by 2026.

Featured image credit: Google DeepMind/Unsplash