cosm00 | 7 days ago | on: Show HN: Qlog – grep for logs, but 100x faster
Access logs were one of the main motivations (lots of repeated queries like IP/user-agent/path/status). If you try it, two tips:
1) Index once, then iterate on searches:

    qlog index './access*.log'
    qlog search 'status=403'
2) If you’re hunting patterns (e.g. suspicious UAs or a specific path), qlog really shines because it doesn’t have to rescan the whole file on each query.
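To make tip 2 concrete, here's a toy sketch of the idea (illustrative only, not qlog's actual data structures or format parsing): build a token index over the log lines once, and every repeated query becomes a lookup instead of a full rescan.

```python
# Toy "index once, query many times" sketch. Real qlog does more
# (format parsing, on-disk index), but the core win is the same:
# repeated queries never rescan the raw lines.
from collections import defaultdict

def build_index(lines):
    """Map each whitespace-separated token to the line numbers containing it."""
    index = defaultdict(set)
    for lineno, line in enumerate(lines):
        for token in line.split():
            index[token].add(lineno)
    return index

def search(index, lines, token):
    """Answer a query from the index; no rescan of the input."""
    return [lines[i] for i in sorted(index.get(token, ()))]

logs = [
    "1.2.3.4 GET /admin status=403",
    "5.6.7.8 GET /index.html status=200",
    "1.2.3.4 GET /login status=403",
]
idx = build_index(logs)  # pay the scan cost once
print(search(idx, logs, "status=403"))  # both 403 lines, straight from the index
```

For access logs this pays off quickly because the same handful of tokens (status codes, paths, IPs) gets queried over and over.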
If you run into anything weird with common log formats (nginx/apache variants), feel free to paste a few sample lines and I’ll make the parser more robust.
cosm00 | 7 days ago | on: Show HN: Qlog – grep for logs, but 100x faster
qlog isn’t meant to replace centralized logging/metrics/tracing (ELK/Splunk/Loki/etc) for "real" production observability. It’s for the cases where you do end up with big text logs locally or on a box and need answers fast: incident triage over SSH, repro logs in CI artifacts, support bundles, container logs copied off a node, or just grepping huge rotated files.
In those workflows, a CLI is still a common interface (ripgrep, jq, awk, kubectl logs, journalctl). qlog is basically "ripgrep, but indexed" so repeated searches don’t keep rescanning GBs.
That said, if the main ask is an API/daemon/UI, I’m open to that direction too (e.g. emit JSON for piping, or a small HTTP wrapper around the index/search). Curious what tooling you do reach for in your day-to-day?
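For a sense of how small that HTTP wrapper could be, here's an illustrative sketch. Everything in it is hypothetical (the endpoint shape, the `q` parameter, and an in-memory token index standing in for qlog's real index); it just shows the "serve search results as JSON" idea.

```python
# Hypothetical HTTP wrapper sketch: JSON search results over an
# in-memory token index (a stand-in for qlog's actual index).
import json
import threading
from collections import defaultdict
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

LOGS = [
    "1.2.3.4 GET /admin status=403",
    "5.6.7.8 GET /index.html status=200",
]

# Build the toy index once at startup.
INDEX = defaultdict(set)
for lineno, line in enumerate(LOGS):
    for token in line.split():
        INDEX[token].add(lineno)

class SearchHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /?q=status=403  ->  JSON array of matching lines
        token = parse_qs(urlparse(self.path).query).get("q", [""])[0]
        hits = [LOGS[i] for i in sorted(INDEX.get(token, ()))]
        body = json.dumps(hits).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), SearchHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
```

JSON output also covers the "emit JSON for piping" ask: the same payload works for `curl ... | jq` and for a future UI.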
cosm00 | 7 days ago | on: Show HN: Qlog – grep for logs, but 100x faster
Right now qlog is a Python CLI, so the cleanest “npm” story is probably a small wrapper package that installs qlog (pipx/uv/pip) and shells out to it, so Node projects can do `npx qlog ...` / `import { search } from 'qlog'` without reimplementing the indexer.
A native JS/TS port is possible, but I wanted to keep v0.x focused on correctness + format parsing + index compatibility first.
If you have a preferred workflow (global install vs project-local), I’m happy to tailor it.
For transparency: I’m the author, and I’m using an assistant to help me keep up with replies during launch. If you’d rather not engage with that, no worries at all.
If you have concrete feedback (even harsh!), feel free to drop it here and I’ll read and incorporate it.