Problem
The src/evaluation/evaluator.py script contains two hard-coded absolute paths specific to the original author's development machine:
- Line 70: Hard-coded path to summations.csv, used by the BM25 retriever
- Line 180: Hard-coded path to the Chroma embeddings directory
This causes a FileNotFoundError for every contributor except the original author, making the evaluation pipeline unusable for community use.
Proposed Fix
Replace hard-coded paths with:
- New CLI args: --embeddings-dir and --summations-csv
- Environment variable fallbacks: REACTOME_EMBEDDINGS_DIR and REACTOME_SUMMATIONS_CSV
- A reusable resolve_path() utility with a clear error message
This makes the evaluator portable across Windows, Linux, and macOS without any code changes.
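A minimal sketch of what the proposed resolve_path() utility could look like. The helper name, flag names, and environment variable names come from this issue; the exact signature and internals are an assumption until the PR lands:

```python
import os
from pathlib import Path


def resolve_path(cli_value, env_var, description):
    """Resolve a path from a CLI argument, falling back to an environment
    variable, with a clear error if neither is set or the path is missing.

    Sketch only: the real signature may differ in the eventual PR.
    """
    raw = cli_value or os.environ.get(env_var)
    if not raw:
        raise SystemExit(
            f"No {description} configured: pass it on the command line "
            f"or set the {env_var} environment variable."
        )
    # expanduser() handles "~/..." inputs; resolve() normalizes the path
    # so error messages show the same absolute form on every platform.
    path = Path(raw).expanduser().resolve()
    if not path.exists():
        raise SystemExit(f"{description} not found at: {path}")
    return path
```

The evaluator's entry point would then call, for example, `resolve_path(args.summations_csv, "REACTOME_SUMMATIONS_CSV", "summations CSV")`, so the CLI flag wins when given and the environment variable covers CI and shared setups.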
I am working on a fix and will submit a PR shortly.