Qshr is a small Turtle-inspired toolkit for writing shell-style scripts in
Rust. A single `use qshr::prelude::*;` gives you:

- `Shell<T>`: a lazy iterator with handy combinators (`map`, `chunks`, `join`, ...).
- `Command`/`Pipeline`: an ergonomic wrapper around `std::process::Command`.
- Filesystem helpers: globbing, walking, copying, watchers, temp files, etc.
Run a command and stream its output line by line:

```rust
use qshr::prelude::*;

fn main() -> qshr::Result<()> {
    for line in sh("echo hello && echo world").stream_lines()? {
        println!("stdout: {}", line?);
    }
    Ok(())
}
```

Glob for files and iterate the matches lazily:

```rust
use qshr::prelude::*;

fn main() -> qshr::Result<()> {
    let rust_sources = filter_extension(glob_entries("src/**/*.rs")?, "rs");
    for entry in rust_sources.take(3) {
        let entry = entry?;
        println!("{}", entry.path.display());
    }
    Ok(())
}
```

Watch a directory and react to changes:

```rust
use qshr::prelude::*;
use std::time::Duration;

fn main() -> qshr::Result<()> {
    let events = watch_filtered(".", Duration::from_millis(300), "**/*.rs")?;
    for event in events {
        let event = event?;
        println!("changed: {}", event.path().display());
        sh("cargo check").run()?;
    }
    Ok(())
}
```

Script a batch of commands with the `qshr!` macro:

```rust
use qshr::{pipeline, prelude::*, qshr};
fn main() -> qshr::Result<()> {
    qshr! {
        println!("Running scripted commands...");
        "echo hi from macro";
        env "RUST_BACKTRACE" = "1";
        "echo RUST_BACKTRACE=$RUST_BACKTRACE";
        let rustc = cmd("rustc").arg("--version").stdout_text()?;
        println!("rustc -> {}", rustc.trim());
        "echo listing src" | "more";
        let echo_twice = pipeline!(sh("echo builder pipeline") | "more");
        run echo_twice;
        run pipeline!(sh("echo expression stage") | "more");
        cd("src") {
            "ls";
        };
        parallel {
            "echo one";
        } {
            "echo two";
        };
        unset "RUST_BACKTRACE";
    }
}
```

Inside the macro:

- String literals run as shell commands automatically, and you can join them with `|` to build pipelines.
- To mix in builder-style commands, use the `pipeline!` helper (`pipeline!(sh("echo hi") | "more")`) and run it inline with `run <expr>;`.
- Regular Rust statements (like the `let rustc = ...` line) work alongside the command sugar, so you can still capture output or branch as needed.
- Set and unset environment variables inline with `env "KEY" = ...;` and `unset "KEY";`, run blocks inside a different directory via `cd("path") { ... }`, and fire blocks in parallel threads with `parallel { ... } { ... };`.

See `examples/macro.rs` for the basics and `examples/macro_workflow.rs` for a more involved workflow.
The `cmd!` macro is shorthand for the `cmd` builder:

```rust
use qshr::{cmd, cmd as cmd_fn};

fn main() -> qshr::Result<()> {
    let output = cmd!("git", "status", "--short").stdout_text()?;
    println!("{output}");
    // Equivalent builder-style version.
    let fallback = cmd_fn("git").arg("status").arg("--short").stdout_text()?;
    assert_eq!(output, fallback);
    Ok(())
}
```

Regular Rust control flow also works inside `qshr!`:

```rust
use qshr::{prelude::*, qshr};
fn main() -> qshr::Result<()> {
    let tracked = ["src/lib.rs", "src/shell.rs"];
    qshr! {
        println!("Sanity checks");
        "cargo fmt";
        "cargo test --lib";
        for path in &tracked {
            let summary = cmd("wc").arg("-l").arg(path).stdout_text()?;
            print!("{summary}");
        };
        {
            let status = cmd("git").args(["status", "--short"]).stdout_text()?;
            println!("git status:\n{status}");
        };
        "rg TODO -n src" | "head -n 5";
    }?;
    Ok(())
}
```

Within `qshr!`, any Rust statement is permitted, so you can loop, branch, or shadow variables while the string literals do the repetitive shell work for you.
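For instance, here is a small sketch of branching next to the command sugar. It assumes an `if`/`else` statement passes through the macro just like the `for` loop above; the `Cargo.lock` check is purely illustrative:

```rust
use qshr::{prelude::*, qshr};

fn main() -> qshr::Result<()> {
    qshr! {
        // Plain Rust branching; only the string literals are treated as commands.
        if std::path::Path::new("Cargo.lock").exists() {
            println!("lockfile present");
        } else {
            println!("no lockfile yet");
        };
        "cargo check";
    }?;
    Ok(())
}
```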
Feed data into a child process through `stdin_reader`:

```rust
use qshr::prelude::*;
use std::fs::File;

fn main() -> qshr::Result<()> {
    let file = File::open("README.md")?;
    let output = cmd("wc").arg("-l").stdin_reader(file).stdout_text()?;
    println!("README has {} lines", output.trim());
    Ok(())
}
```

When you need to stream data into a child process (large files, sockets, pipes),
`stdin_reader` avoids buffering everything into memory up front.
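Any `Read` source should work the same way. Here is a sketch (assuming `stdin_reader` accepts an arbitrary `std::io::Read`, not just `File`) that chains two files into a single `wc -l` run:

```rust
use qshr::prelude::*;
use std::fs::File;
use std::io::Read;

fn main() -> qshr::Result<()> {
    // Chain two readers into one stream and let the child consume it lazily.
    let combined = File::open("Cargo.toml")?.chain(File::open("README.md")?);
    let total = cmd("wc").arg("-l").stdin_reader(combined).stdout_text()?;
    println!("combined line count: {}", total.trim());
    Ok(())
}
```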
Every filesystem iterator (`ls`, `walk_files`, `glob_entries`, etc.) yields a `Shell<Result<_>>`, so you can lazily stream and short-circuit as needed:
```rust
use qshr::prelude::*;

fn main() -> qshr::Result<()> {
    let recent: Vec<_> = walk_files("src")?
        .filter_map(|entry| {
            let entry = entry.ok()?;
            entry
                .modified()
                .ok()
                .and_then(|time| time.elapsed().ok())
                .filter(|age| age.as_secs() < 300)
                .map(|_| entry)
        })
        .take(10)
        .collect();
    // When you want to collect fallible entries, use `collect::<qshr::Result<_>>()?`.
    let all: Vec<_> = walk_files("src")?.collect::<qshr::Result<Vec<_>>>()?;
    println!("{} recent files out of {} total.", recent.len(), all.len());
    Ok(())
}
```

Capture command output and build pipelines:

```rust
use qshr::prelude::*;

fn main() -> qshr::Result<()> {
    let rustc = cmd!("rustc", "--version").stdout_text()?;
    println!("rustc -> {}", rustc.trim());
    // Pipe shell commands and capture the output.
    let files = sh("ls src").pipe(sh("wc -l")).stdout_text()?;
    println!("src has {} entries", files.trim());
    Ok(())
}
```

Fan work out across threads with `chunk_map_parallel` (requires the `parallel` feature):

```rust
use qshr::prelude::*;
fn main() -> qshr::Result<()> {
    let doubled: Vec<_> = Shell::from_iter(0..100)
        .chunks(16)
        .chunk_map_parallel(16, |chunk| chunk.into_iter().map(|n| n * 2).collect())
        .to_vec();
    println!("doubled len {}", doubled.len());
    Ok(())
}
```

See `examples/macro_watch.rs` for a `qshr!`-driven watcher; the core pattern is:
```rust
use qshr::prelude::*;
use std::time::Duration;

fn main() -> qshr::Result<()> {
    let events = watch_filtered(".", Duration::from_millis(300), "**/*.rs")?;
    for event in events.take(3) {
        let event = event?;
        println!("changed -> {}", event.path().display());
        sh("cargo check").run()?;
    }
    Ok(())
}
```

```rust
// Prefer a channel for manual polling:
let rx = watch_channel(".")?;
if let Ok(event) = rx.try_recv() {
    println!("changed -> {}", event?.path().display());
}
```

Rename operations surface as `WatchEvent::Renamed`, giving you access to both
`event.path()` (the destination) and `event.from_path()` for the source when
files move without being rewritten.
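A sketch of telling renames apart from plain writes, assuming `from_path()` returns an `Option` that is only populated for `WatchEvent::Renamed`:

```rust
use qshr::prelude::*;
use std::time::Duration;

fn main() -> qshr::Result<()> {
    for event in watch_filtered(".", Duration::from_millis(300), "**/*.rs")?.take(5) {
        let event = event?;
        match event.from_path() {
            // Assumed: `Some` only for `WatchEvent::Renamed`.
            Some(from) => println!("renamed {} -> {}", from.display(), event.path().display()),
            None => println!("changed {}", event.path().display()),
        }
    }
    Ok(())
}
```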
When you need to reuse glob metadata multiple times (copy/move operations, filtering), resolve once via `GlobCache::new("src/**/*.rs")` and call `.entries()` to avoid repeated `fs::metadata` calls.
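A sketch of that flow; it assumes `GlobCache` is re-exported by the prelude, that `GlobCache::new` is fallible, and that `.entries()` yields the same entry type as `glob_entries` with the metadata already resolved:

```rust
use qshr::prelude::*;

fn main() -> qshr::Result<()> {
    // Resolve the glob and its metadata once...
    let cache = GlobCache::new("src/**/*.rs")?;
    // ...then make as many passes as you like without re-stat'ing the files.
    let total = cache.entries().count();
    for entry in cache.entries() {
        println!("{}", entry.path.display());
    }
    println!("{total} cached entries");
    Ok(())
}
```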
Need backwards iteration? Wrap the items in `DoubleEndedShell::from_vec(vec)` and call `next_back()` on it before converting back into a plain `Shell`.
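For example (assuming `DoubleEndedShell` is re-exported by the prelude and hands items back the way `DoubleEndedIterator` does):

```rust
use qshr::prelude::*;

fn main() {
    let mut stack = DoubleEndedShell::from_vec(vec![1, 2, 3]);
    // Pull from the back first...
    assert_eq!(stack.next_back(), Some(3));
    // ...and keep draining in reverse if that is all you need.
    while let Some(n) = stack.next_back() {
        println!("{n}");
    }
}
```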
Optional Cargo features:

- `parallel`: enables `Shell::chunk_map_parallel` via `rayon`.
- `async`: exposes async helpers (e.g. `Command::output_async`, `watch_async_stream`) built on `tokio` (sketch below).
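With the `async` feature turned on, usage might look roughly like this. This is a sketch only: it assumes `output_async` resolves to a captured output with a `stdout` field, and that you bring your own `tokio` runtime:

```rust
use qshr::prelude::*;

#[tokio::main]
async fn main() -> qshr::Result<()> {
    // Assumed shape: a future resolving to something like std::process::Output.
    let out = cmd("rustc").arg("--version").output_async().await?;
    println!("{}", String::from_utf8_lossy(&out.stdout).trim());
    Ok(())
}
```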
Browse `examples/` for small scripts: `script.rs`, `watch_glob.rs`,
`watch_debounce.rs`, the async runners, `macro_watch.rs`, etc. Run them with
`cargo run --example <name>`.
There is a repo-local pre-commit hook at `.githooks/pre-commit` that runs
`cargo fmt --all` and `cargo clippy --fix --allow-dirty --allow-staged --all-targets --all-features`
before allowing a commit. Opt in by pointing Git at the hooks directory once:

```sh
git config core.hooksPath .githooks
```

You can reset to Git's default hooks later with `git config --unset core.hooksPath`.
The crate aims to stay compact and dependency-light. Contributions are welcome!