Loyalty

The loyalty system enables restaurants to reward customers with points, tiers, and offers. It runs as a separate smackz-loyalty service with its own database, communicating with Yum over a REST API and Redis Streams.

Core Concepts

| Concept | Description |
| --- | --- |
| Club Membership | A customer's enrollment in a restaurant's loyalty program |
| Points | Earned on orders, redeemable for rewards; optionally expire |
| Tiers | NONE, BRONZE, SILVER, GOLD, PLATINUM -- based on lifetime points earned |
| Offers | Time-limited promotions that can be targeted by tier or segment |

Points Lifecycle

Customer places order
       |
       v
Points earned (rate from points_rules)
       |
       v
Points balance updated
       |
       | (if points_ttl_days configured)
       v
Points expire after TTL (Points Expiry Worker)

  • Points earn rate is configured per restaurant via points_rules
  • points_ttl_days is optional; NULL means points never expire
  • Expiry is tracked per-EARN entry via expires_at timestamp
  • Expired points create a REVERSAL ledger entry (immutable audit trail)
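The per-EARN expiry rule above can be sketched as a pure function over the ledger. This is a simplified illustration, not the actual smackz-loyalty types: LedgerEntry, expireStalePoints, and the id scheme are assumptions, and partial redemption against an EARN entry is ignored for brevity.

```typescript
type LedgerEntry = {
  id: string;
  type: "EARN" | "REDEEM" | "REVERSAL";
  points: number;
  expiresAt?: Date; // set on EARN rows only when points_ttl_days is configured
};

// Returns REVERSAL entries for EARN rows past their TTL. The ledger is
// append-only, so expiry adds compensating entries instead of mutating rows.
function expireStalePoints(ledger: LedgerEntry[], now: Date): LedgerEntry[] {
  return ledger
    .filter(
      e =>
        e.type === "EARN" &&
        e.expiresAt !== undefined &&
        e.expiresAt.getTime() <= now.getTime(),
    )
    .map(e => ({
      id: `rev-${e.id}`, // illustrative id scheme
      type: "REVERSAL" as const,
      points: -e.points, // negate the earn to preserve the audit trail
    }));
}
```

Entries with no expires_at (points_ttl_days is NULL) are never selected, matching "NULL means points never expire" above.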

Tier System

Tiers are computed from lifetime_points_earned (cumulative, never decreases):

| Tier | Example Threshold |
| --- | --- |
| NONE | 0 (default) |
| BRONZE | 100 |
| SILVER | 500 |
| GOLD | 1,500 |
| PLATINUM | 5,000 |

Thresholds are configured per restaurant in the membership_tier_rules table. Points expiry does not cause tier downgrades -- this matches standard loyalty program behavior (Starbucks, airlines).
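Because lifetime_points_earned never decreases, tier evaluation reduces to picking the highest configured threshold the customer has reached. A minimal sketch, assuming a per-restaurant rules list loaded from membership_tier_rules (tierFor and TierRule are illustrative names):

```typescript
type TierRule = { tier: string; threshold: number };

// Pick the highest threshold the customer has reached; NONE at 0 is the default.
function tierFor(lifetimePoints: number, rules: TierRule[]): string {
  const byThresholdDesc = [...rules].sort((a, b) => b.threshold - a.threshold);
  const match = byThresholdDesc.find(r => lifetimePoints >= r.threshold);
  return match ? match.tier : "NONE";
}
```

With the example thresholds above, a customer with 600 lifetime points sits in SILVER; since the input is cumulative, points expiry cannot move them back down.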

Background Workers

Three cron workers automate loyalty lifecycle management:

| Worker | Schedule | Purpose |
| --- | --- | --- |
| Points Expiry | Daily 02:00 UTC | Expires points past their TTL, creates REVERSAL entries |
| Membership Tier | Every 5 minutes | Evaluates tier eligibility, emits upgrade events |
| Offer Expiry | Daily 01:00 UTC | Deactivates offers past valid_until |

Workers use Redis distributed locks (SET NX EX) to prevent duplicate execution across replicas. Each worker runs in try/catch and cannot crash the main Fastify process.
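The SET NX EX pattern can be sketched as follows, assuming an ioredis-style client. withLock and the lock key are illustrative, not the actual worker code; a production release would also delete the key via a token-checked Lua script, which is omitted here.

```typescript
interface RedisLike {
  set(key: string, value: string, ...args: (string | number)[]): Promise<string | null>;
}

// Run job only if this replica wins the lock. NX: set only if absent;
// EX: auto-expire so a crashed worker cannot hold the lock forever.
async function withLock(
  redis: RedisLike,
  key: string,
  ttlSec: number,
  job: () => Promise<void>,
): Promise<boolean> {
  const acquired = await redis.set(key, "1", "EX", ttlSec, "NX");
  if (acquired !== "OK") return false; // another replica holds the lock
  await job();
  return true;
}
```

Combined with the per-worker try/catch, a losing replica simply skips the run and waits for the next cron tick.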

Events

Workers emit events to the loyalty:events Redis Stream:

| Event | Trigger |
| --- | --- |
| loyalty.points.expired | Points expired for a customer |
| loyalty.tier.upgraded | Customer advanced to a higher tier |

These events are consumed by the Notification Orchestrator in Yum for push/email delivery.
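Redis Streams store flat field/value pairs, so an event payload is flattened before XADD. A hypothetical envelope for the upgrade event (the field names here are assumptions, not the actual schema):

```typescript
// Build the flat field list a worker might pass to
// redis.xadd("loyalty:events", "*", ...fields).
function tierUpgradedEvent(customerId: string, from: string, to: string): string[] {
  return ["type", "loyalty.tier.upgraded", "customerId", customerId, "from", from, "to", to];
}
```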

Offers

Offers have:

  • Start and end dates (valid_until)
  • Active/inactive status
  • Eligibility rules (tier-based, segment-based)
  • Real-time validation via offer-calculation.service.ts (belt-and-suspenders with the Offer Expiry Worker)
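The real-time check can be sketched as a predicate over those attributes. This is a simplified stand-in for offer-calculation.service.ts: the Offer shape and minTier field are assumptions, and segment-based eligibility is omitted.

```typescript
const TIER_ORDER = ["NONE", "BRONZE", "SILVER", "GOLD", "PLATINUM"] as const;
type Tier = (typeof TIER_ORDER)[number];

type Offer = {
  active: boolean;
  validFrom: Date;
  validUntil: Date;
  minTier?: Tier; // tier-based eligibility; segment rules omitted
};

// Validate at redemption time even though the Offer Expiry Worker also
// deactivates stale offers -- the belt-and-suspenders mentioned above.
function offerIsValid(offer: Offer, now: Date, customerTier: Tier): boolean {
  if (!offer.active) return false;
  if (now < offer.validFrom || now > offer.validUntil) return false;
  if (offer.minTier && TIER_ORDER.indexOf(customerTier) < TIER_ORDER.indexOf(offer.minTier)) {
    return false;
  }
  return true;
}
```

The date check matters because the worker only runs daily: an offer can pass valid_until hours before the worker flips its active flag.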

Admin API Endpoints

| Endpoint | Method | Description |
| --- | --- | --- |
| /loyalty/restaurants/{id}/tier-rules | GET | List tier thresholds |
| /loyalty/restaurants/{id}/tier-rules | POST | Set tier thresholds |
| /loyalty/restaurants/{id}/tier-rules | DELETE | Remove tier config |
| /loyalty/restaurants/{id}/rules | PUT | Update points rules (incl. TTL) |
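A hypothetical sketch of the PUT rules call. The request body shape (earnRate, pointsTtlDays) is an assumption, not the documented schema; only the method and path come from the table above.

```typescript
// Build the request descriptor for updating a restaurant's points rules.
// A null TTL corresponds to "points never expire" (points_ttl_days = NULL).
function updateRulesRequest(restaurantId: string, earnRate: number, pointsTtlDays: number | null) {
  return {
    method: "PUT",
    url: `/loyalty/restaurants/${restaurantId}/rules`,
    body: JSON.stringify({ earnRate, pointsTtlDays }),
  };
}
```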

Hardening (RES-2 / RES-3 / RES-9)

Three hardening initiatives shipped to the loyalty service in 2026-04:

RES-2 — Transaction & Data Integrity

The loyalty service previously had decorative transactions: a DB context was created but never propagated to the repositories, so "transactional" code actually ran each statement independently. The fix adds an optional tx?: AppTransaction parameter to all 30+ repository methods across four repos and threads it through service-level transactional blocks. recordOfferUsage (insert + counter update) and createRule (disable old rule + insert new one) are now atomic, and bulkUpsertOffers batches offers in groups of 10 with a transaction per batch. Multi-step membership, ledger, rules, and offers operations no longer leave partial state.
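The optional-tx pattern can be sketched as below. Queryable, OfferUsageRepo, and the SQL are illustrative, not the real smackz-loyalty types; the point is that repo methods fall back to the pool, so callers opt in to atomicity without a second code path.

```typescript
interface Queryable {
  query(sql: string, params?: unknown[]): Promise<void>;
}
type AppTransaction = Queryable;

class OfferUsageRepo {
  constructor(private pool: Queryable) {}

  // Each method accepts an optional tx and falls back to the pool.
  async insertUsage(offerId: string, tx?: AppTransaction): Promise<void> {
    await (tx ?? this.pool).query("INSERT INTO offer_usage (offer_id) VALUES ($1)", [offerId]);
  }

  async incrementCounter(offerId: string, tx?: AppTransaction): Promise<void> {
    await (tx ?? this.pool).query("UPDATE offers SET used = used + 1 WHERE id = $1", [offerId]);
  }
}

// Service-level transactional block: both statements commit or neither does.
async function recordOfferUsage(
  repo: OfferUsageRepo,
  offerId: string,
  withTransaction: (fn: (tx: AppTransaction) => Promise<void>) => Promise<void>,
): Promise<void> {
  await withTransaction(async tx => {
    await repo.insertUsage(offerId, tx);
    await repo.incrementCounter(offerId, tx);
  });
}
```

Before the fix, the equivalent code created a transaction but called the repos without passing it, which is exactly the "decorative" behavior described above.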

RES-3 — Resilience

Resilience improvements landed across seven areas:

| Area | Change |
| --- | --- |
| DB retries | db-retry.ts utility -- 3 exponential backoff attempts for transient PG errors |
| Pool config | statement_timeout + connection timeouts + getPoolStats() getter |
| Circuit breaker | CLOSED → OPEN after 5 failures → HALF_OPEN after 30s on YUM API calls |
| Health probes | Structured /health and /health/ready with DB / memory / CB checks and degradation levels |
| Rate limiting | 120 req/min global via @fastify/rate-limit |
| Graceful shutdown | 30s drain, 5 startup DB retries |
| Backpressure | 503 on non-critical endpoints when pool.waitingCount > 5 |
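The retry behavior can be sketched as below. This is a simplified stand-in for db-retry.ts: withDbRetry is an illustrative name, and unlike the real utility it retries every error rather than filtering for transient PG error codes.

```typescript
// Retry fn up to `attempts` times with exponential backoff (base, 2x, 4x, ...).
async function withDbRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseMs = 100,
): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err; // the real utility only retries transient PG errors
      if (i < attempts - 1) {
        await new Promise(resolve => setTimeout(resolve, baseMs * 2 ** i));
      }
    }
  }
  throw lastErr;
}
```

With the defaults this yields three attempts spaced roughly 100ms and 200ms apart before the error is surfaced to the caller.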

RES-9 — Observability

/metrics endpoint returns JSON with request counts, DB pool stats, circuit breaker state, and process info. An onResponse hook logs request duration and warns at >2s. An event-loop monitor measures lag every 5s, warns >500ms, and auto-degrades >2000ms. Files: api/app.ts, api/utils/metrics.ts, api/utils/event-loop-monitor.ts.
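The lag measurement in the monitor reduces to comparing actual vs expected timer delay and mapping the result onto the thresholds above. A minimal sketch (lagMs and degradationLevel are illustrative names, not the event-loop-monitor.ts API):

```typescript
// A timer scheduled every expectedIntervalMs fires late when the event loop
// is blocked; the overshoot is the lag.
function lagMs(expectedIntervalMs: number, lastTick: number, now: number): number {
  return Math.max(0, now - lastTick - expectedIntervalMs);
}

// Thresholds from the monitor: warn above 500ms, auto-degrade above 2000ms.
function degradationLevel(lag: number): "ok" | "warn" | "degraded" {
  if (lag > 2000) return "degraded";
  if (lag > 500) return "warn";
  return "ok";
}
```

In the service these readings feed the degradation levels exposed by the health probes, so a blocked event loop shows up in /health/ready before requests start timing out.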

Key Files

  • docs/Smackz-Phase2/Loyalty-Lifecycle-Workers-FRD.md -- Full workers FRD
  • smackz-loyalty/api/workers/ -- Worker implementations
  • smackz-loyalty/api/services/offer-calculation.service.ts -- Offer validation
  • smackz-loyalty/api/services/loyalty.service.ts -- Points earn/reversal
  • smackz-loyalty/docs/RES-2-TRANSACTION-FIXES.md -- Atomicity hardening
  • smackz-loyalty/docs/RES-3-LOYALTY-HARDENING.md -- Resilience hardening
  • smackz-loyalty/docs/RES-9-OBSERVABILITY.md -- Metrics & event loop monitor