Changelog
Tracking the evolution of the Engram Operating System.
Engram for VS Code (Latest)
The official IDE extension brings the "Ghost" sidebar to your editor, offering context-aware code explanations without data egress.
- Sidebar Chat: Discuss your codebase locally with Llama 3.
- Privacy Mode: Whitelist specific folders for vectorization.
- Codebase indexing speed improved by 200% via Rust bindings.
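A minimal sketch of how Privacy Mode's folder whitelist could gate indexing; the folder paths and the `is_indexable` helper are hypothetical, not the extension's actual settings schema.

```python
from pathlib import Path

# Hypothetical whitelist; the real extension reads this from its own settings.
WHITELISTED_FOLDERS = [
    Path("~/projects/engram").expanduser(),
    Path("~/projects/docs").expanduser(),
]

def is_indexable(file_path: str) -> bool:
    """Allow vectorization only for files inside a whitelisted folder."""
    path = Path(file_path).expanduser().resolve()
    return any(path.is_relative_to(folder.resolve()) for folder in WHITELISTED_FOLDERS)

print(is_indexable("~/projects/engram/src/main.py"))  # True
print(is_indexable("~/Documents/taxes/2023.pdf"))     # False
```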
The "Terminal Genie" Integration
Engram now lives in your shell. It intercepts exit codes and suggests fixes for failed commands automatically.
- Released Zsh and Bash plugins for terminal interception.
- Added "Auto-Fix": Press [Tab] to accept Engram`s suggested command repair.
- Natural Language Commands: Type "undo last git commit" to execute the complex git reset sequence.
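The shell plugins handle the interception; the sketch below only illustrates the repair step, assuming a local Ollama endpoint on its default port and a `llama3` model tag.

```python
import requests

def suggest_fix(command: str, exit_code: int, stderr: str) -> str:
    """Ask the local model for a corrected one-line command (illustrative only)."""
    prompt = (
        f"The shell command `{command}` failed with exit code {exit_code}.\n"
        f"stderr:\n{stderr}\n"
        "Reply with only the corrected command."
    )
    resp = requests.post(
        "http://localhost:11434/api/generate",  # default Ollama endpoint
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=60,
    )
    return resp.json()["response"].strip()

print(suggest_fix("git pusj origin main", 1, "git: 'pusj' is not a git command."))
```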
Local Git Agent
An autonomous worker that manages your version control workflow.
- Auto-Commit: Generates semantic commit messages based on `git diff` analysis.
- PR Drafter: Reads your branch changes and writes a structured Pull Request description.
- Added "Pre-Push" safety check to scan for accidental secret leaks.
The Documentation Crawler
Engram can now ingest entire documentation sites (StackOverflow, MDN, internal wikis) to become an expert on your specific tech stack.
- Added "Doc-Spider": Point Engram at a URL, and it scrapes/vectorizes the documentation locally.
- Offline Mode: Query your documentation sets even without internet.
- Optimized vector chunking for technical code blocks.
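In spirit, Doc-Spider is fetch, strip, chunk, embed. The sketch below uses a naive character-window chunker (the shipped chunker is code-block-aware); the URL is just an example and `crawl_and_chunk` is a hypothetical helper.

```python
import requests
from bs4 import BeautifulSoup

def crawl_and_chunk(url: str, chunk_size: int = 800, overlap: int = 100) -> list[str]:
    """Fetch one documentation page and split it into overlapping text chunks."""
    html = requests.get(url, timeout=30).text
    text = BeautifulSoup(html, "html.parser").get_text(separator="\n")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

chunks = crawl_and_chunk("https://developer.mozilla.org/en-US/docs/Web/API/fetch")
print(len(chunks), "chunks ready for embedding")
```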
Linear & Jira Sync
Engram now connects to your project management tools to help you prioritize your day.
- Two-way sync with Linear and Jira issues.
- Daily Briefing: "You have 3 high-priority bugs assigned to you today."
- Resolved OAuth token refresh issues for third-party integrations.
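On the Jira side, the Daily Briefing is essentially a JQL query. A minimal sketch, assuming Jira Cloud's REST search endpoint and placeholder credentials:

```python
import requests

JIRA_URL = "https://your-org.atlassian.net"  # placeholder workspace
AUTH = ("you@example.com", "api-token")       # placeholder credentials

def daily_briefing() -> str:
    """Count high-priority bugs assigned to the current user."""
    jql = "assignee = currentUser() AND priority = High AND statusCategory != Done"
    resp = requests.get(
        f"{JIRA_URL}/rest/api/2/search",
        params={"jql": jql, "maxResults": 50},
        auth=AUTH,
        timeout=30,
    )
    issues = resp.json().get("issues", [])
    return f"You have {len(issues)} high-priority bugs assigned to you today."
```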
Voice Mode & Whisper
Added ears to the OS. You can now talk to Engram, and it transcribes meetings automatically.
- Integrated "Whisper.cpp" for real-time local speech-to-text.
- Added "Meeting Mode": Automatically records and summarizes Zoom/Meet audio.
- Voice activation keyphrase detection added.
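Transcription runs entirely on-device. A rough sketch of shelling out to a locally built whisper.cpp binary; the paths assume a default checkout and model download, and the flags follow whisper.cpp's CLI as documented in its README.

```python
import subprocess

def transcribe(audio_path: str) -> str:
    """Run a local whisper.cpp build on a 16 kHz WAV file and return the transcript."""
    result = subprocess.run(
        [
            "./whisper.cpp/main",                            # assumed build location
            "-m", "./whisper.cpp/models/ggml-base.en.bin",   # assumed model path
            "-f", audio_path,
            "--no-timestamps",
        ],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

print(transcribe("standup_recording.wav"))
```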
Visual Context / Vision
Engram gains sight. Users can now reference what is on their screen in conversation.
- Added "Vision Agent" utilizing LLaVA (Large Language-and-Vision Assistant).
- Screen-capture context: "Explain this chart I am looking at."
- Optimized OCR pipeline for reading text from images.
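Screen-capture context works by handing the model an image alongside the prompt. A minimal sketch, assuming LLaVA is served through the local Ollama endpoint and the screenshot already exists as a PNG:

```python
import base64
import requests

def explain_image(png_path: str, question: str) -> str:
    """Send a local screenshot to LLaVA via Ollama and return its answer."""
    with open(png_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llava", "prompt": question, "images": [image_b64], "stream": False},
        timeout=120,
    )
    return resp.json()["response"]

print(explain_image("screen.png", "Explain this chart I am looking at."))
```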
Universal Installer Launch
The official public launch. Transitioned from manual Python scripts to a single NPX command for mass adoption.
- Released `npx engram-os` CLI tool.
- Automated Docker Desktop detection and provisioning.
- Patched cross-platform path issues for Windows users.
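Docker detection before provisioning is a simple preflight check. The real CLI is Node-based; this is just the logic, sketched in Python with hypothetical names:

```python
import shutil
import subprocess

def docker_ready() -> bool:
    """Return True if the Docker CLI exists and the daemon answers `docker info`."""
    if shutil.which("docker") is None:
        return False  # Docker Desktop not installed
    probe = subprocess.run(["docker", "info"], capture_output=True, text=True)
    return probe.returncode == 0

if not docker_ready():
    print("Docker Desktop not detected; prompting the user to install it.")
```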
The Nervous System Upgrade
Migrated from simple loops to a robust event-driven architecture.
- Implemented Celery & Redis for asynchronous task management.
- Added "Celery Beat" for precise agent scheduling.
- Decoupled API (Brain) from Workers (Muscle) for system stability.
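The decoupling is standard Celery: the API enqueues, workers execute, Beat schedules. A minimal sketch assuming a local Redis and a `tasks.py` module; the agent name and schedule are illustrative.

```python
from celery import Celery
from celery.schedules import crontab

# Redis acts as broker and result backend; everything stays on localhost.
app = Celery("engram",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/1")

@app.task
def run_agent(agent_name: str) -> str:
    """Worker-side entry point; the API (Brain) only enqueues this task."""
    return f"{agent_name} finished"

# Celery Beat: kick off the email agent every morning at 06:00.
app.conf.beat_schedule = {
    "morning-email-agent": {
        "task": "tasks.run_agent",           # assumes this module is tasks.py
        "schedule": crontab(hour=6, minute=0),
        "args": ("email_agent",),
    },
}
```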
Deep Work Email Agent
The first autonomous writer agent. Capable of drafting emails while you sleep.
- Gmail OAuth integration implemented.
- Added "Draft-Only" safety mode for human-in-the-loop review.
- Spam and Newsletter filtering logic.
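Draft-Only mode means the agent never hits send. A sketch of creating a Gmail draft with the official Python client, assuming `creds` comes out of the OAuth flow:

```python
import base64
from email.message import EmailMessage
from googleapiclient.discovery import build

def save_draft(creds, to: str, subject: str, body: str) -> None:
    """Create a Gmail draft for human review; sending stays manual."""
    service = build("gmail", "v1", credentials=creds)
    msg = EmailMessage()
    msg["To"] = to
    msg["Subject"] = subject
    msg.set_content(body)
    raw = base64.urlsafe_b64encode(msg.as_bytes()).decode()
    service.users().drafts().create(userId="me", body={"message": {"raw": raw}}).execute()
```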
The Calendar Agent
Gave the OS the ability to understand time and manage schedules.
- Natural Language Processing for "fuzzy" date detection.
- Bi-directional Google Calendar API sync.
- Added "Processed" status flags to prevent duplicate bookings.
Visual Cortex (Dashboard)
Launched the first graphical interface (Streamlit) for the OS.
- Released "Command Center" UI with Chat and Activity Feed.
- Added "Tech-Noir" aesthetic and CSS card styling.
- Implemented "Manual Override" buttons for agents.
Browser Spy Extension
Context-awareness update. Allowed the OS to read web history.
- Chrome history snapshotting (60s interval).
- Added privacy filters for banking/sensitive URLs.
- Reduced vector payload size for web content.
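Chrome keeps history in a SQLite file that stays locked while the browser runs, so the snapshotter copies it first. A sketch using the default macOS profile path and a deliberately crude keyword filter; both are assumptions for illustration.

```python
import shutil
import sqlite3
import tempfile
from pathlib import Path

HISTORY_DB = Path("~/Library/Application Support/Google/Chrome/Default/History").expanduser()
BLOCKED = ("bank", "paypal", "health")  # crude privacy filter, illustrative only

def snapshot(limit: int = 50) -> list[tuple[str, str]]:
    """Copy the locked History DB, then read the most recent visits."""
    with tempfile.NamedTemporaryFile(suffix=".db") as tmp:
        shutil.copy(HISTORY_DB, tmp.name)  # Chrome holds a lock on the original
        rows = sqlite3.connect(tmp.name).execute(
            "SELECT url, title FROM urls ORDER BY last_visit_time DESC LIMIT ?", (limit,)
        ).fetchall()
    return [(url, title) for url, title in rows if not any(b in url.lower() for b in BLOCKED)]
```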
Passive Ingestion Pipeline
Automation of file entry via background daemons.
- Built the "Hot Folder" watcher (ingestor.py).
- Added support for PDF and TXT vectorization.
- Fixed file locking issues on macOS.
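The Hot Folder watcher is a thin wrapper around the `watchdog` library. A minimal sketch; the inbox path is hypothetical and the print stands in for the hand-off to the embedding pipeline.

```python
import time
from pathlib import Path
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

WATCHED = Path("~/EngramInbox").expanduser()  # hypothetical hot folder

class IngestHandler(FileSystemEventHandler):
    def on_created(self, event):
        if event.is_directory:
            return
        if event.src_path.lower().endswith((".pdf", ".txt")):
            print(f"vectorizing {event.src_path}")  # hand off to the embedding pipeline

observer = Observer()
observer.schedule(IngestHandler(), str(WATCHED), recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
```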
The Llama Upgrade
Major intelligence boost by migrating to the new Llama 3 model.
- Migrated from Llama 2 to Llama 3 8B.
- Reasoning capabilities improved by 40%.
- Implemented Ollama for easier model management.
Vector Memory Integration
The OS gained long-term memory via Qdrant.
- Dockerized Qdrant instance setup.
- Built initial RAG (Retrieval Augmented Generation) pipeline.
- Created "memories" collection schema.
Project Inception
- Initial ideation phase and architecture planning.
- Basic Python API structure created.
- Defined "Local-First" zero-egress philosophy.
- Initial Docker Compose environment configuration.