v1.0.2 (March 31, 2026)

Offline Mode & AI Copilot

Offline Mode
- SQLite local database for offline persistence — create, edit, score, and delete custom companies without internet
- Background sync engine with automatic retry — all offline changes are pushed to the server once the connection is restored
- Smart API response cache for faster loading and offline fallback
- Portfolio page shows offline-created companies with 'pending sync' indicator
- Offline banner with pending sync count and 'Sync now' button
- Auto-refresh on reconnect — portfolio and data reload automatically when internet returns (10s polling)
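The sync behavior above can be sketched roughly as follows. This is a minimal, hypothetical illustration (the names `SyncQueue`, `Change`, and the `push` callback are invented for this example, not the app's actual API): offline changes accumulate in a queue, and a flush pass pushes each one, keeping failed pushes queued so they retry on the next reconnect.

```typescript
// Hypothetical sketch of the background sync engine: queued offline
// changes are replayed when connectivity returns; failures stay queued.
type Change = { id: number; op: "create" | "edit" | "delete"; payload: unknown };

class SyncQueue {
  private pending: Change[] = [];

  enqueue(change: Change): void {
    this.pending.push(change);
  }

  get pendingCount(): number {
    return this.pending.length; // drives the offline banner's count
  }

  // Push every queued change; failed pushes remain queued for retry.
  async flush(push: (c: Change) => Promise<boolean>): Promise<number> {
    const stillPending: Change[] = [];
    let synced = 0;
    for (const change of this.pending) {
      if (await push(change)) synced++;
      else stillPending.push(change);
    }
    this.pending = stillPending;
    return synced;
  }
}
```

The same `flush` would back both the automatic reconnect path and the banner's 'Sync now' button.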
AI Copilot
- Floating AI Copilot panel (bottom-right) with chat interface and streaming token display
- Supports both local LLM (Phi-3 Mini GGUF) and external providers (ChatGPT, Claude, DeepSeek)
- Portfolio Q&A — ask questions about your portfolio in natural language ('What is my industry concentration?')
- Quick actions: Risk Summary, Weak Scores, Concentration analysis
- Real-time context injection — API responses are captured as you browse and fed to the LLM automatically
- AI token counter in sidebar showing context size and session usage
- Clear conversation button to reset chat history
- Provider badge shows active LLM (Claude, ChatGPT, DeepSeek, Local, Disabled)
- Status updates instantly when provider is changed — no restart needed
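The context-injection and token-counter bullets can be pictured with a small sketch. Everything here is hypothetical (the class name, the 4-characters-per-token heuristic, and the methods are assumptions for illustration, not the real implementation): API responses seen while browsing are captured into a buffer, a rough token estimate feeds the sidebar counter, and the clear action resets it.

```typescript
// Hypothetical sketch of the copilot's context buffer: API responses are
// captured as the user browses and prepended to the LLM prompt; a rough
// chars/4 heuristic (not a real tokenizer) drives the sidebar counter.
const CHARS_PER_TOKEN = 4;

class CopilotContext {
  private snippets: string[] = [];

  capture(apiResponse: unknown): void {
    this.snippets.push(JSON.stringify(apiResponse));
  }

  // Concatenated context injected ahead of the user's question.
  build(): string {
    return this.snippets.join("\n");
  }

  estimateTokens(): number {
    return Math.ceil(this.build().length / CHARS_PER_TOKEN);
  }

  clear(): void {
    this.snippets = []; // backs the 'clear conversation' style reset
  }
}
```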
AI Auto-Fill for Custom Company
- Upload up to 30 PDF broker submissions for automatic data extraction
- Paste text mode for emails, proposals, and application forms
- Full PDF text extraction via pdfjs-dist (digital PDFs, all pages)
- 8-step schema mapping with field descriptions, enum values, and example output
- Token estimation display before sending (content + prompt + total)
- Works with all configured LLM providers (local and external)
- Auto-populates all wizard steps: company profile, infrastructure, security controls, compliance, data handling, incidents
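The pre-send token estimate can be sketched as a simple breakdown. This is an assumed shape, not the shipped code (the function name and the chars/4 approximation are illustrative): document content and schema prompt are estimated separately, and their sum is what the dialog shows as the total.

```typescript
// Hypothetical sketch of the Auto-Fill token estimate shown before
// sending: content, prompt, and total reported separately.
const CHARS_PER_TOKEN = 4; // rough heuristic, not a model tokenizer

interface TokenEstimate {
  content: number;
  prompt: number;
  total: number;
}

function estimateAutoFillTokens(documentText: string, schemaPrompt: string): TokenEstimate {
  const content = Math.ceil(documentText.length / CHARS_PER_TOKEN);
  const prompt = Math.ceil(schemaPrompt.length / CHARS_PER_TOKEN);
  return { content, prompt, total: content + prompt };
}
```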
First-Launch Setup Wizard
- 4-step guided setup: Offline Mode → AI Copilot → AI Provider → Installation
- Progress bar with step indicators and animated transitions
- Offline Mode step explains SQLite, Auto Sync, and API Cache with feature cards
- AI Copilot step explains Portfolio Q&A, Document Auto-Fill, and Underwriting Reports
- AI Provider selection: Local Model vs External LLM with SVG logos for OpenAI, Claude, DeepSeek
- API key input with 'Load Models' button that fetches available models from the provider API
- Model selector dropdown populated from the provider's model list
- Rankiteo icon in wizard header (loaded from resources)
- Maximize/restore and close buttons for window control
- Real AI model download from HuggingFace (Phi-3 Mini 4K Instruct Q4_K_M, 2.3 GB)
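The 'Load Models' step can be illustrated by how a provider's model-list response is turned into dropdown entries. OpenAI does expose a `GET /v1/models` endpoint returning a `data` array of model objects; the parsing below is a hedged sketch against that shape (the function name is invented, and the actual fetch, auth header, and error handling are omitted).

```typescript
// Hypothetical sketch of populating the model dropdown from a provider's
// model-list response (e.g. OpenAI's GET /v1/models returns { data: [...] }).
interface ModelListResponse {
  data: { id: string }[];
}

function extractModelIds(response: ModelListResponse): string[] {
  // Sorted ids make the dropdown stable across refreshes.
  return response.data.map((m) => m.id).sort();
}
```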
Settings Menu
- File → Offline Mode: enable/disable offline with status display (DB ready, pending sync count)
- File → AI Copilot: configure/change/disable AI provider with model selection
- Tabbed settings modal with Offline Mode and AI Copilot tabs
- Enable/Disable buttons with confirmation dialogs
- Changes apply immediately — copilot, banner, and sidebar update without restart
- API key stored locally in encrypted SQLite database
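Encrypting the API key before it reaches SQLite could look like the following sketch using Node's built-in crypto module with AES-256-GCM. This is an assumed scheme for illustration, not the app's confirmed one; in particular, where the 32-byte master key comes from (e.g. an OS keychain) is out of scope here.

```typescript
// Hypothetical sketch of at-rest API-key encryption (AES-256-GCM).
// The stored string packs iv + auth tag + ciphertext as hex.
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

function encryptApiKey(apiKey: string, masterKey: Buffer): string {
  const iv = randomBytes(12); // standard GCM nonce size
  const cipher = createCipheriv("aes-256-gcm", masterKey, iv);
  const ciphertext = Buffer.concat([cipher.update(apiKey, "utf8"), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), ciphertext]).toString("hex");
}

function decryptApiKey(stored: string, masterKey: Buffer): string {
  const raw = Buffer.from(stored, "hex");
  const iv = raw.subarray(0, 12);
  const tag = raw.subarray(12, 28); // GCM auth tag is 16 bytes
  const ciphertext = raw.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", masterKey, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
}
```

GCM is a reasonable choice here because the auth tag detects tampering with the stored row, not just decryption with a wrong key.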
Performance & Packaging
- Installer reduced from 381 MB to 82 MB — heavy AI dependencies (node-llama-cpp GPU binaries) filtered by platform
- node-llama-cpp bundled for Windows CPU only (78 MB vs 712 MB for all platforms)
- pdfjs-dist, tesseract.js moved to on-demand download to keep installer small
- ESM module compatibility for node-llama-cpp via dynamic import() workaround
- Fresh context per LLM request — no more 'No sequences left' errors on consecutive queries
- 8192 token context window for local model (doubled from 4096)
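The dynamic import() workaround mentioned above follows a well-known pattern: an ESM-only package cannot be require()d from CommonJS, but dynamic `import()` works from both module systems. A minimal, hypothetical lazy-loader sketch (the helper name and cache are assumptions, not the app's code):

```typescript
// Hypothetical sketch of the ESM workaround: node-llama-cpp ships as an
// ES module, so a CommonJS main process loads it lazily via dynamic
// import(), caching the in-flight promise so the module loads only once.
const moduleCache = new Map<string, Promise<unknown>>();

function lazyImport<T>(specifier: string): Promise<T> {
  if (!moduleCache.has(specifier)) {
    moduleCache.set(specifier, import(specifier)); // legal in CJS, unlike static import
  }
  return moduleCache.get(specifier) as Promise<T>;
}

// Usage in the app might look like:
//   const llama = await lazyImport<typeof import("node-llama-cpp")>("node-llama-cpp");
```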