oever | 4 months ago
    rgg() {
        readarray -d '' -t FILES < <(git ls-files -z)
        rg "${@}" "${FILES[@]}"
    }
It speeds up searches a lot in directories with many binary files and committed dot files. To search the dot files, -uu is needed, but that also tells ripgrep to search the binary files. On repositories with hundreds of files, the git ls-files overhead is a bit large.
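The `git ls-files -z` / `readarray -d ''` pairing in the function above matters because tracked filenames may contain spaces or even newlines. A minimal, repo-free illustration, using `printf` as a stand-in for `git ls-files -z`:

```shell
# Stand-in for `git ls-files -z`: two NUL-terminated names, one with a
# space and one with an embedded newline.
printf 'a b.txt\0c\nd.txt\0' > files.bin

# -d '' splits on NUL; -t strips the trailing delimiter from each entry.
readarray -d '' -t FILES < files.bin

echo "${#FILES[@]}"   # prints 2: both names survive intact
```

A whitespace-split loop over plain `git ls-files` output would instead break both names into fragments.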
burntsushi | 4 months ago
Also, `-uu` tells ripgrep to not respect gitignore and to search hidden files. But ripgrep will still skip binary files. You need `-uuu` to also ignore binary files.
I tried playing with your `rgg` function. The first problem occurred when I tried it on a checkout of the Linux kernel:
OK, so let's just use `xargs`:

And compared to just `rg APM_RESUME`:

So do you have an example where `git ls-files -z | xargs -0 rg ...` is faster than just `rg ...`?

oever | 4 months ago
The repository[0] contains CI files in .woodpecker. These are scripts that I'd normally expect to be searching in. Until a week ago I used -uu to do so, but that made rg take over 4 seconds for a search. Using -. brings the search time down to 24ms.
To reproduce this with the given repository, fill it with 20GB of binary files. The -. flag makes this point moot though.
[0] https://codeberg.org/vandenoever/rehorse
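The `xargs` pipeline burntsushi mentions above can be wrapped into a function. A sketch, assuming GNU `xargs` (the `-r` flag, a GNU extension, skips running `rg` entirely when the repository lists no files):

```shell
# Batch the tracked-file list through xargs so the kernel's
# argument-length limit (E2BIG) is never hit, even on huge checkouts
# like the Linux kernel.
rgg() {
    git ls-files -z | xargs -0 -r rg "$@"
}
```

Because `"$@"` is passed to `rg` before `xargs` appends the filenames, the pattern and any flags come first, as `rg` expects. Note that `xargs` may split very large file lists across several `rg` invocations.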
EnPissant | 4 months ago
It will only search tracked files. For that it can just use the index. I would expect reading the index to be faster than listing files from the filesystem.
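For the tracked-files-only case, `git grep` is worth noting: it walks the tracked set itself, and with `--cached` it searches the staged blobs straight from the index, skipping the list-then-spawn round trip entirely. A quick sketch in a throwaway repository (assumes git is installed; the repo and file names are made up for the demo):

```shell
# Set up a disposable repository with one tracked file.
repo=$(mktemp -d)
cd "$repo"
git init -q
printf 'needle here\n' > notes.txt
git add notes.txt

# git grep walks the tracked set itself; no external file list needed.
git grep -n 'needle'            # prints notes.txt:1:needle here

# --cached searches the staged blobs in the index, not the work tree.
git grep -n --cached 'needle'
```

This is not a full ripgrep replacement (no smart-case, slower regex engine), but it answers the "just use the index" point directly.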
oever | 4 months ago
Searching in hidden files tracked by git would be great but the overhead of querying git to list all tracked files is probably significant even in Rust.
oever | 4 months ago
All are less than 100ms, so fast enough.