Documentation
Complete guide to installing and configuring AI Search
Module Overview
AI Search for OpenCart is an extension that replaces the standard store search with semantic AI-powered search. The module uses vector embeddings to understand query context and find relevant products.
Key Features
- Semantic search (understands context, not just keywords)
- Automatic typo correction and fuzzy matching
- Synonym support without additional dictionaries
- Search by product attributes, filters, and options
- Query autocomplete (Business+ plans)
- Mixed results: products, categories, pages
- Fallback to standard search on failures
- Automatic re-indexing on product changes
System Requirements
SaaS Version
- OpenCart 4.0 or higher
- PHP 8.0 or higher
- MySQL 5.7+ or MariaDB 10.3+
- 50MB free space
- Internet connection (for API requests)
Self-hosted Version
All SaaS version requirements, plus:
- VPS/dedicated server (minimum 2 CPU, 4GB RAM)
- Docker and Docker Compose
- 10GB free space (for Ollama models)
- GPU recommended (CUDA or ROCm)
Architecture
Vector Index
The oc_ai_embeddings table stores vector representations of products, categories, and pages.
- Model: multilingual-e5-large-instruct (1024d)
- Search: Cosine similarity in PHP
- Cache: file-based (OpenCart Cache)
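The similarity step amounts to scoring every stored embedding against the query embedding. A minimal sketch (Python for illustration; the module itself performs this in PHP, and the function names here are hypothetical):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def rank_products(query_vec, product_vecs):
    """Return (product_id, score) pairs sorted by similarity to the query."""
    scored = [(pid, cosine_similarity(query_vec, vec))
              for pid, vec in product_vecs.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)
```

Because cosine similarity depends only on vector direction, it ranks products by semantic closeness regardless of embedding magnitude.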
Trigram Index
The oc_ai_trigrams table powers fuzzy autocomplete.
- Method: 3-character tokens
- Re-ranking: via levenshtein()
- Speed: <20ms
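The approach above can be sketched as follows: split terms into overlapping 3-character tokens, use token overlap to collect candidates cheaply, then re-rank the candidates by edit distance (a Python sketch of the idea; the module uses PHP's levenshtein(), and these function names are illustrative):

```python
def trigrams(term):
    """Split a lowercased term into overlapping 3-character tokens.

    Leading padding makes prefix trigrams distinct, so prefix matches
    score overlap even for short inputs.
    """
    padded = f"  {term.lower()} "
    return {padded[i:i + 3] for i in range(len(padded) - 2)}

def levenshtein(a, b):
    """Classic edit distance (PHP ships this as a builtin)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def suggest(query, index_terms, limit=5):
    """Trigram overlap filters candidates; edit distance ranks them."""
    q = trigrams(query)
    candidates = [t for t in index_terms if q & trigrams(t)]
    return sorted(candidates, key=lambda t: levenshtein(query.lower(), t))[:limit]
```

The trigram filter keeps the expensive levenshtein() pass off the full index, which is what makes sub-20ms autocomplete feasible.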
Installation
1. Download — go to Dashboard → Downloads and download aisearch.ocmod.zip.
2. Upload — in your OpenCart admin go to Extensions → Installer, click Upload and select the ZIP file.
3. Enable — go to Extensions → Extensions → Modules, find AI Search and click Install, then Edit.
4. Enter License Key — paste your key from the dashboard into the License Key field and save.
5. Index products — open the Indexer tab and click Start Indexing. Done.
Configuration
The General tab controls core behaviour: mode, API connection, embedding model, and search parameters.
Mode
SaaS — uses our cloud API (no server needed, requires license key). Self-hosted — runs Ollama on your own VPS.
Embedding Model
Available models include multilingual-e5-large-instruct (best quality, 100+ languages), nomic-embed-text-v1.5 (fastest, English only), and others. Changing the model requires re-indexing.
Min Query Length
Minimum characters before AI search activates. Recommended: 3.
Fallback to LIKE
If AI search is unavailable, automatically falls back to OpenCart's standard LIKE search. Keep enabled.
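The fallback behaviour amounts to something like the following (a Python sketch under assumptions: the actual module wraps OpenCart's PHP search model, and the client object, method names, and query shown here are illustrative, not the extension's real API):

```python
def search(query, ai_client, db):
    """Try semantic search first; fall back to a standard LIKE query
    if the AI backend is unreachable or errors out."""
    try:
        return ai_client.semantic_search(query, timeout=2.0)
    except Exception:
        # Fallback: plain keyword matching, as OpenCart's stock search does
        like = f"%{query}%"
        return db.execute(
            "SELECT product_id FROM oc_product_description WHERE name LIKE ?",
            (like,),
        ).fetchall()
```

With the fallback enabled, an API outage degrades search quality but never leaves customers with an empty results page.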
Indexing
The Indexer tab lets you control what content gets indexed and how. Select fields carefully — more data improves accuracy, but unnecessary technical fields add noise.
Use Field Order (drag to reorder) to prioritise what gets included first — if the total text exceeds the model's token limit, lower-priority fields are dropped. Use Re-index (Farm Queue) for large catalogs to offload embedding generation to the GPU farm.
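The priority-ordered truncation described above can be sketched as follows (Python; a naive whitespace word count stands in for the model's real tokenizer, and the function name is hypothetical):

```python
def build_index_text(fields, max_tokens, count_tokens=lambda s: len(s.split())):
    """Concatenate fields in priority order, dropping lower-priority
    fields once the model's token budget is exhausted.

    `fields` is a list of (name, text) pairs, already sorted by the
    Field Order configured in the admin panel.
    """
    parts, used = [], 0
    for name, text in fields:
        cost = count_tokens(text)
        if used + cost > max_tokens:
            break  # this field and everything below it is dropped
        parts.append(text)
        used += cost
    return " ".join(parts)
```

This is why field order matters: a long description placed above the product name could crowd the name out of the index entirely.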
After indexing, switch to the Index tab to verify all products are indexed. You can filter by truncated items to check if any descriptions were cut off.
API
API reference coming soon.
Troubleshooting
Troubleshooting guide coming soon.