first commit
.env.example (new file)
@@ -0,0 +1,22 @@
# Copy to .env and fill in your values

# URL of your Firefly Data Importer (FIDI) instance
FIDI_URL=http://localhost:8080

# FIDI auto-import secret (set in FIDI config under AUTO_IMPORT_SECRET)
# Leave blank if not configured
FIDI_SECRET=

# Firefly III Personal Access Token (optional — needed if FIDI requires auth)
FIDI_ACCESS_TOKEN=

# Directory containing .json/.csv import pairs (default: ./imports)
# JSON configs live here permanently; processed CSVs are staged here by watch-imports.sh
IMPORT_DIR=./imports

# Directory where raw CSV files are dropped (default: ./incoming)
# watch-imports.sh monitors this and moves files to IMPORT_DIR after processing
INCOMING_DIR=./incoming

# Set to "true" to automatically run import.sh after watch-imports.sh stages a file
AUTO_IMPORT=false

.gitignore (new file, vendored)
@@ -0,0 +1,13 @@
# Environment — contains credentials, never commit
.env

# Financial data — keep off version control
imports/*.csv
incoming/*.csv

# macOS
.DS_Store

# Editor
.vscode/
*.swp

README.md (new file)
@@ -0,0 +1,145 @@
# firefly-importer

Bash scripts to automate importing financial data into a self-hosted [Firefly III](https://firefly-iii.org/) instance via the [Firefly Data Importer (FIDI)](https://docs.firefly-iii.org/how-to/data-importer/about/).

## How it works

```
incoming/
    jerickdiscover.csv   ← raw CSV dropped here
    checking.csv

        ↓ watch-imports.sh
        ↓ (flips 4th column sign on matching files)

imports/
    jerickdiscover.json  ← FIDI config (permanent)
    jerickdiscover.csv   ← processed CSV (staged)
    checking.json
    checking.csv

        ↓ import.sh
        ↓ (POST each JSON+CSV pair to FIDI /autoimport)

Firefly III ✓
```

CSV files are paired with their FIDI config by base name: `checking.json` + `checking.csv`.
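The base-name pairing rule can be sketched in a few lines of shell; the directory and file names below are throwaway examples, not part of the repo:

```bash
# Illustrative: apply the base-name pairing rule to a temporary directory.
dir=$(mktemp -d)
touch "$dir/checking.json" "$dir/checking.csv" "$dir/savings.json"  # savings has no CSV
for json in "$dir"/*.json; do
  base=$(basename "$json" .json)
  if [[ -f "$dir/$base.csv" ]]; then
    echo "pair:    $base"
  else
    echo "skipped: $base (no matching .csv)"
  fi
done
# → pair:    checking
# → skipped: savings (no matching .csv)
```

`import.sh` applies this same rule to `IMPORT_DIR`, skipping any JSON config without a staged CSV.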
## Requirements

- bash
- curl
- python3 (standard on Ubuntu)
- inotify-tools — for continuous file watching: `sudo apt-get install inotify-tools`

## Setup

```bash
git clone <repo-url> firefly-importer
cd firefly-importer

cp .env.example .env
# Edit .env with your FIDI URL and credentials

mkdir -p incoming imports
chmod +x import.sh watch-imports.sh

# Place your FIDI JSON config files in imports/
# e.g.: imports/checking.json, imports/jerickdiscover.json
```

## Configuration

All config lives in `.env` (never committed to git):

| Variable            | Required | Default       | Description                                              |
|---------------------|----------|---------------|----------------------------------------------------------|
| `FIDI_URL`          | Yes      | —             | URL of your FIDI instance, e.g. `http://localhost:8080`  |
| `FIDI_SECRET`       | No       | —             | FIDI `AUTO_IMPORT_SECRET` value                          |
| `FIDI_ACCESS_TOKEN` | No       | —             | Firefly III Personal Access Token                        |
| `IMPORT_DIR`        | No       | `./imports`   | Directory with JSON configs and staged CSVs              |
| `INCOMING_DIR`      | No       | `./incoming`  | Drop zone for raw CSV files                              |
| `AUTO_IMPORT`       | No       | `false`       | Run `import.sh` automatically after a file is staged     |

## Usage

### Watch for new files (continuous)

Monitors `incoming/` and processes any CSV that arrives:

```bash
./watch-imports.sh
```

### Process existing files (one-shot)

Useful for batch runs or cron jobs:

```bash
./watch-imports.sh --once
```
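For a cron-driven setup, an entry along these lines works; the schedule and install path are placeholders to adapt:

```
# Hypothetical crontab entry: stage and import every 15 minutes
*/15 * * * * cd /path/to/firefly-importer && ./watch-imports.sh --once && ./import.sh
```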
### Run the importer

Posts all staged JSON+CSV pairs to FIDI:

```bash
./import.sh
```

Preview what would be sent without making any requests:

```bash
./import.sh --dry-run
```
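Under the hood, each pair is sent as a multipart POST. It is roughly equivalent to this manual `curl` call, where the URL and secret are placeholders for your own values:

```bash
# Roughly what import.sh sends for one pair (placeholder URL and secret);
# a Bearer Authorization header is added when FIDI_ACCESS_TOKEN is set.
curl -F "csv=@imports/checking.csv;type=text/csv" \
     -F "json=@imports/checking.json;type=application/json" \
     "http://localhost:8080/autoimport?secret=YOUR_SECRET"
```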
### Typical workflow

```bash
# Drop your CSVs into incoming/, then:
./watch-imports.sh --once && ./import.sh
```

Or set `AUTO_IMPORT=true` in `.env` and just run `./watch-imports.sh` — it will stage and import automatically each time a file lands.

## CSV sign-flip

Some bank exports report credits as negative and debits as positive (the reverse of what Firefly III expects). The following files have their 4th column sign automatically flipped during staging:

- `jerickdiscover.csv`
- `paigediscover.csv`

To add more files, edit the `FLIP_FILES` array near the top of [watch-imports.sh](watch-imports.sh).
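As a minimal sketch of the flip (the sample row is invented for illustration), python3's `csv` module is used so quoted fields containing commas survive intact:

```bash
# Sample data only; the real script rewrites the file in place.
demo=$(mktemp)
printf '%s\n' 'Date,Description,Category,Amount' \
              '2024-01-05,"Coffee, large",Dining,4.50' > "$demo"
python3 - "$demo" <<'PYEOF'
import csv, sys
rows = list(csv.reader(open(sys.argv[1], newline='')))
for row in rows[1:]:
    row[3] = str(-float(row[3]))   # 4.50 becomes -4.5
csv.writer(sys.stdout, lineterminator='\n').writerows(rows)
PYEOF
```

The quoted `"Coffee, large"` field passes through unchanged while the amount is negated — the reason the scripts shell out to python3 instead of using `awk -F,`.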
## Running as a systemd service

A unit file is included at [firefly-importer.service](firefly-importer.service). Edit the `User` and path values, then install it:

```bash
# 1. Edit the unit file
nano firefly-importer.service
# Set User= and both path references to your actual install path

# 2. Install and start
sudo cp firefly-importer.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now firefly-importer

# 3. Check status / logs
sudo systemctl status firefly-importer
sudo journalctl -u firefly-importer -f
```

## Project structure

```
firefly-importer/
├── import.sh                 # Batch importer — POSTs JSON+CSV pairs to FIDI
├── watch-imports.sh          # File watcher — processes and stages incoming CSVs
├── firefly-importer.service  # systemd unit file
├── .env.example              # Config template
├── .gitignore
├── imports/                  # JSON configs (committed) + staged CSVs (gitignored)
└── incoming/                 # Drop zone for raw CSV files (gitignored)
```

firefly-importer.service (new file)
@@ -0,0 +1,14 @@
[Unit]
Description=Firefly III CSV Import Watcher
After=network.target

[Service]
Type=simple
User=CHANGEME
WorkingDirectory=/path/to/firefly-importer
ExecStart=/path/to/firefly-importer/watch-imports.sh
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target

import.sh (new file)
@@ -0,0 +1,142 @@
#!/usr/bin/env bash
# Firefly III Data Importer - batch auto-import
# Pairs .json + .csv files by base name and POSTs each to FIDI /autoimport
#
# Usage: ./import.sh [--dry-run|-n]

set -euo pipefail

# ---------------------------------------------------------------------------
# Args
# ---------------------------------------------------------------------------
DRY_RUN=false
for arg in "$@"; do
  case "$arg" in
    --dry-run|-n) DRY_RUN=true ;;
    *) echo "Unknown argument: $arg"; exit 1 ;;
  esac
done

# ---------------------------------------------------------------------------
# Config — override via environment or .env file
# ---------------------------------------------------------------------------
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

if [[ -f "$SCRIPT_DIR/.env" ]]; then
  # shellcheck source=/dev/null
  source "$SCRIPT_DIR/.env"
fi

FIDI_URL="${FIDI_URL:?Set FIDI_URL in .env or environment (e.g. http://localhost:8080)}"
FIDI_SECRET="${FIDI_SECRET:-}"
FIDI_ACCESS_TOKEN="${FIDI_ACCESS_TOKEN:-}"
IMPORT_DIR="${IMPORT_DIR:-$SCRIPT_DIR/imports}"

# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------
GREEN='\033[0;32m'
RED='\033[0;31m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
NC='\033[0m'

pass() { echo -e "${GREEN}[OK]${NC} $*"; }
fail() { echo -e "${RED}[FAIL]${NC} $*"; }
info() { echo -e "${YELLOW}[INFO]${NC} $*"; }
dry()  { echo -e "${CYAN}[DRY]${NC} $*"; }

build_url() {
  local url="${FIDI_URL%/}/autoimport"
  [[ -n "$FIDI_SECRET" ]] && url="${url}?secret=${FIDI_SECRET}"
  echo "$url"
}

import_pair() {
  local json_file="$1"
  local csv_file="$2"
  local base
  base="$(basename "$json_file" .json)"
  local url
  url="$(build_url)"

  if $DRY_RUN; then
    dry "$base"
    dry "  POST $url"
    dry "  csv  → $csv_file"
    dry "  json → $json_file"
    [[ -n "$FIDI_ACCESS_TOKEN" ]] && dry "  auth → Bearer ***"
    return 0
  fi

  info "Importing: $base"

  # Build auth args as an array to safely handle spaces/special chars
  local curl_args=(-s -w "\n%{http_code}")
  [[ -n "$FIDI_ACCESS_TOKEN" ]] && curl_args+=(-H "Authorization: Bearer $FIDI_ACCESS_TOKEN")
  curl_args+=(
    -F "csv=@${csv_file};type=text/csv"
    -F "json=@${json_file};type=application/json"
    "$url"
  )

  local response http_code body
  response=$(curl "${curl_args[@]}")
  http_code=$(echo "$response" | tail -n1)
  body=$(echo "$response" | head -n -1)

  if [[ "$http_code" =~ ^2 ]]; then
    pass "$base (HTTP $http_code)"
    return 0
  else
    fail "$base (HTTP $http_code)"
    echo "  Response: $body"
    return 1
  fi
}

# ---------------------------------------------------------------------------
# Main
# ---------------------------------------------------------------------------
if [[ ! -d "$IMPORT_DIR" ]]; then
  echo "Import directory not found: $IMPORT_DIR"
  exit 1
fi

$DRY_RUN && info "Dry-run mode — no requests will be sent"

mapfile -t json_files < <(find "$IMPORT_DIR" -maxdepth 1 -name '*.json' | sort)

if [[ ${#json_files[@]} -eq 0 ]]; then
  info "No .json files found in $IMPORT_DIR"
  exit 0
fi

success=0
skipped=0
failed=0

for json_file in "${json_files[@]}"; do
  base="$(basename "$json_file" .json)"
  csv_file="${IMPORT_DIR}/${base}.csv"

  if [[ ! -f "$csv_file" ]]; then
    info "Skipping $base — no matching .csv found"
    # Plain assignment: ((skipped++)) returns status 1 when the value is 0,
    # which would abort the script under set -e
    skipped=$((skipped + 1))
    continue
  fi

  if import_pair "$json_file" "$csv_file"; then
    success=$((success + 1))
  else
    failed=$((failed + 1))
  fi
done

echo ""
echo "----------------------------------------"
$DRY_RUN && echo "  (dry run — nothing was imported)"
echo "  Done: ${success} succeeded, ${failed} failed, ${skipped} skipped"
echo "----------------------------------------"

[[ "$failed" -eq 0 ]]

watch-imports.sh (new file)
@@ -0,0 +1,158 @@
#!/usr/bin/env bash
# watch-imports.sh - Watch for new CSV files, flip sign on 4th column for
# specific accounts, then stage them in the imports/ directory.
#
# Usage:
#   ./watch-imports.sh          # watch continuously (requires inotify-tools)
#   ./watch-imports.sh --once   # process existing files in INCOMING_DIR and exit
#
# Requires: inotify-tools (sudo apt-get install inotify-tools)
#           python3 (standard on Ubuntu)

set -euo pipefail

# ---------------------------------------------------------------------------
# Args
# ---------------------------------------------------------------------------
ONCE_MODE=false
for arg in "$@"; do
  case "$arg" in
    --once) ONCE_MODE=true ;;
    *) echo "Unknown argument: $arg"; exit 1 ;;
  esac
done

# ---------------------------------------------------------------------------
# Config
# ---------------------------------------------------------------------------
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

if [[ -f "$SCRIPT_DIR/.env" ]]; then
  # shellcheck source=/dev/null
  source "$SCRIPT_DIR/.env"
fi

INCOMING_DIR="${INCOMING_DIR:-$SCRIPT_DIR/incoming}"
IMPORT_DIR="${IMPORT_DIR:-$SCRIPT_DIR/imports}"
AUTO_IMPORT="${AUTO_IMPORT:-false}"

# Files whose 4th column values should have their sign flipped
FLIP_FILES=(
  "jerickdiscover.csv"
  "paigediscover.csv"
)

# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
NC='\033[0m'

pass() { echo -e "${GREEN}[OK]${NC} $*"; }
info() { echo -e "${YELLOW}[INFO]${NC} $*"; }
step() { echo -e "${CYAN}[>>]${NC} $*"; }

needs_flip() {
  local filename="$1"
  for name in "${FLIP_FILES[@]}"; do
    [[ "$filename" == "$name" ]] && return 0
  done
  return 1
}

# Flip the sign of all numeric values in the 4th column using python3.
# Handles quoted CSV fields correctly.
flip_fourth_column() {
  local filepath="$1"
  python3 - "$filepath" <<'PYEOF'
import csv, sys, os, tempfile

filepath = sys.argv[1]
col_idx = 3  # 4th column (0-indexed)

rows = []
with open(filepath, 'r', newline='', encoding='utf-8-sig') as f:
    rows = list(csv.reader(f))

if len(rows) < 2:
    sys.exit(0)

output = [rows[0]]
for row in rows[1:]:
    if len(row) > col_idx:
        try:
            val = float(row[col_idx])
            flipped = -val
            # Preserve integer formatting when there's no fractional part
            row[col_idx] = f"{flipped:.2f}" if flipped != int(flipped) else str(int(flipped))
        except ValueError:
            pass
    output.append(row)

# Write atomically via a temp file in the same directory
dir_ = os.path.dirname(filepath)
fd, tmp = tempfile.mkstemp(dir=dir_, suffix='.tmp')
try:
    with os.fdopen(fd, 'w', newline='', encoding='utf-8') as f:
        csv.writer(f).writerows(output)
    os.replace(tmp, filepath)
except Exception:
    os.unlink(tmp)
    raise
PYEOF
}

process_csv() {
  local src="$1"
  local filename
  filename="$(basename "$src")"
  local dest="$IMPORT_DIR/$filename"

  if needs_flip "$filename"; then
    step "Flipping 4th column: $filename"
    flip_fourth_column "$src"
  fi

  mv -f "$src" "$dest"
  pass "Staged: $filename"

  if [[ "$AUTO_IMPORT" == "true" ]]; then
    info "Running import..."
    "$SCRIPT_DIR/import.sh"
  fi
}

# ---------------------------------------------------------------------------
# Main
# ---------------------------------------------------------------------------
mkdir -p "$INCOMING_DIR" "$IMPORT_DIR"

if $ONCE_MODE; then
  mapfile -t existing < <(find "$INCOMING_DIR" -maxdepth 1 -name '*.csv' | sort)
  if [[ ${#existing[@]} -eq 0 ]]; then
    info "No CSV files found in $INCOMING_DIR"
    exit 0
  fi
  for f in "${existing[@]}"; do
    process_csv "$f"
  done
  exit 0
fi

# Continuous watch mode
if ! command -v inotifywait &>/dev/null; then
  echo "inotify-tools not found. Install with:"
  echo "  sudo apt-get install inotify-tools"
  exit 1
fi

info "Watching $INCOMING_DIR for new CSV files... (Ctrl+C to stop)"

inotifywait -m -e close_write --format '%f' "$INCOMING_DIR" 2>/dev/null \
  | while IFS= read -r filename; do
      if [[ "$filename" == *.csv ]]; then
        process_csv "$INCOMING_DIR/$filename"
      fi
    done