# rmtPocketWatcher Setup Guide

## What's Been Created

### ✅ Backend Scraper Service
- Playwright-based scrapers for Eldorado and PlayerAuctions
- Automatic retry logic and error handling (see the sketch after this list)
- Scheduled scraping every 5 minutes
- Tracks all seller listings with platform, price, and delivery time
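
A minimal sketch of that pattern, assuming Playwright's chromium driver. The URL, CSS selectors, and listing shape below are illustrative placeholders, not the project's actual code:

```typescript
import { chromium } from "playwright";

// Illustrative listing shape; the real schema lives in prisma/schema.prisma.
interface Listing {
  seller: string;
  platform: string;
  price: number;
  deliveryTime: string;
}

// Retry wrapper: try up to `attempts` times with a fixed delay between tries.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3, delayMs = 5000): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}

// Hypothetical Eldorado scraper; the URL and selectors are placeholders.
async function scrapeEldorado(): Promise<Listing[]> {
  const browser = await chromium.launch({ headless: true });
  try {
    const page = await browser.newPage();
    await page.goto("https://www.eldorado.gg/", { waitUntil: "domcontentloaded" });
    // Extract one Listing per seller row.
    return await page.$$eval(".listing-row", (rows) =>
      rows.map((row) => ({
        seller: row.querySelector(".seller")?.textContent?.trim() ?? "",
        platform: "eldorado",
        price: parseFloat(row.querySelector(".price")?.textContent ?? "0"),
        deliveryTime: row.querySelector(".delivery")?.textContent?.trim() ?? "",
      }))
    );
  } finally {
    await browser.close();
  }
}

// Usage: const listings = await withRetry(() => scrapeEldorado());
```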
### ✅ Database Layer
- PostgreSQL with Prisma ORM
- Three tables: VendorPrice, PriceIndex, ScrapeLog
- Stores all historical listings for trend analysis (example queries after this list)
- Indexed for fast queries
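
As an example of what this layer supports, the lowest current price on a platform and recent index history could be read with queries along these lines (model and field names are assumptions; check `prisma/schema.prisma` for the real ones):

```typescript
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

// Assumed fields on VendorPrice: platform, price.
async function lowestPrice(platform: string) {
  return prisma.vendorPrice.findFirst({
    where: { platform },
    orderBy: { price: "asc" },
  });
}

// Assumed field on PriceIndex: timestamp. Used for trend charts.
async function recentIndex(since: Date) {
  return prisma.priceIndex.findMany({
    where: { timestamp: { gte: since } },
    orderBy: { timestamp: "asc" },
  });
}
```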
### ✅ API Layer
- Fastify REST API with 6 endpoints
- WebSocket for real-time updates (sketched after this list)
- Filter by seller, platform, date range
- Historical price data
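
A sketch of how the REST and WebSocket sides might fit together with Fastify and `@fastify/websocket` (v10+ handler signature). The route path matches the curl examples below, but the handlers and event wiring here are assumptions:

```typescript
import Fastify from "fastify";
import websocket from "@fastify/websocket";
import { EventEmitter } from "node:events";

const app = Fastify({ logger: true });
await app.register(websocket);

// Stand-ins for the repository layer (assumption: real code queries Prisma).
const priceEvents = new EventEmitter();
async function getLatestPrices(platform?: string) {
  return { platform: platform ?? "all", prices: [] };
}

// REST: latest prices, optionally filtered by platform.
app.get("/api/prices/latest", async (req) => {
  const { platform } = req.query as { platform?: string };
  return getLatestPrices(platform);
});

// WebSocket: push each new price update to connected clients.
app.get("/ws", { websocket: true }, (socket) => {
  const listener = (update: unknown) => socket.send(JSON.stringify(update));
  priceEvents.on("update", listener);
  socket.on("close", () => priceEvents.off("update", listener));
});

await app.listen({ port: 3000, host: "0.0.0.0" });
```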
### ✅ Docker Setup
- Docker Compose orchestration
- PostgreSQL container with health checks
- Backend container with auto-migration
- Volume persistence for database
## Current Status
The Docker Compose stack is building. This will:
- Pull PostgreSQL 16 Alpine image
- Build the backend Node.js container
- Install Playwright and dependencies
- Generate Prisma client
- Start both services
## Once the Build Completes

### Check Status

```bash
# View logs
docker-compose logs -f backend

# Check if services are running
docker ps
```
### Test the API

```bash
# Health check
curl http://localhost:3000/health

# Get latest prices (after first scrape)
curl http://localhost:3000/api/prices/latest

# Get lowest price
curl http://localhost:3000/api/prices/lowest

# Get price history
curl "http://localhost:3000/api/index/history?range=7d"
```
### Monitor Scraping

The backend will automatically (see the sketch after this list):
- Scrape Eldorado and PlayerAuctions every 5 minutes
- Save all listings to the database
- Calculate and store the lowest price
- Log all scrape attempts
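
A simplified sketch of that cycle; the helper functions are stand-ins for the real scraper and repository modules:

```typescript
// Assumed helpers from the scraper and database modules (names are illustrative).
declare function scrapeEldorado(): Promise<{ price: number }[]>;
declare function scrapePlayerAuctions(): Promise<{ price: number }[]>;
declare function saveListings(listings: { price: number }[]): Promise<void>;
declare function savePriceIndex(row: { lowest: number; at: Date }): Promise<void>;
declare function logScrape(row: Record<string, unknown>): Promise<void>;

const intervalMs = Number(process.env.SCRAPE_INTERVAL_MINUTES ?? "5") * 60_000;

async function runCycle(): Promise<void> {
  const startedAt = new Date();
  try {
    const listings = [...(await scrapeEldorado()), ...(await scrapePlayerAuctions())];
    await saveListings(listings);                     // VendorPrice rows
    const lowest = Math.min(...listings.map((l) => l.price));
    await savePriceIndex({ lowest, at: startedAt });  // PriceIndex row
    await logScrape({ ok: true, count: listings.length, startedAt }); // ScrapeLog row
  } catch (err) {
    await logScrape({ ok: false, error: String(err), startedAt });
  }
}

setInterval(runCycle, intervalMs);
void runCycle(); // run once immediately at startup
```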
Check the logs to see scraping activity:

```bash
docker-compose logs -f backend | grep "scraping"
```
## Database Access

### Using Prisma Studio

```bash
cd backend
npm run db:studio
```

### Using psql

```bash
docker exec -it rmtpw-postgres psql -U rmtpw -d rmtpocketwatcher
```
## Stopping and Restarting

```bash
# Stop services
docker-compose down

# Start services
docker-compose up -d

# Rebuild after code changes
docker-compose up --build -d

# View logs
docker-compose logs -f

# Remove everything, including data
docker-compose down -v
```
## Environment Variables

Edit `.env` or `docker-compose.yml` to configure (a sketch of how the backend might read these follows the list):

- `SCRAPE_INTERVAL_MINUTES` - How often to scrape, in minutes (default: 5)
- `SCRAPER_HEADLESS` - Run the browser in headless mode (default: true)
- `PORT` - API server port (default: 3000)
- `DATABASE_URL` - PostgreSQL connection string
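
A minimal env-reading sketch with the defaults listed above (the real backend may structure this differently):

```typescript
// Read each variable once into a typed config object, falling back to defaults.
export const config = {
  scrapeIntervalMinutes: Number(process.env.SCRAPE_INTERVAL_MINUTES ?? "5"),
  scraperHeadless: (process.env.SCRAPER_HEADLESS ?? "true") === "true",
  port: Number(process.env.PORT ?? "3000"),
  databaseUrl: process.env.DATABASE_URL ?? "",
};
```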
## Next Steps

- Wait for the build to complete (~2-5 minutes)
- Verify services are running: `docker ps`
- Check the first scrape: `docker-compose logs -f backend`
- Test the API endpoints: see examples above
- Build the Electron frontend (coming next)
## Troubleshooting

### Backend won't start

```bash
# Check logs
docker-compose logs backend

# Restart backend
docker-compose restart backend
```

### Database connection issues

```bash
# Check that postgres is healthy
docker-compose ps

# Restart postgres
docker-compose restart postgres
```

### Scraper errors

```bash
# View detailed logs
docker-compose logs -f backend | grep -A 5 "error"

# Check the scrape log in the database
docker exec -it rmtpw-postgres psql -U rmtpw -d rmtpocketwatcher -c "SELECT * FROM scrape_log ORDER BY timestamp DESC LIMIT 10;"
```
## File Structure

```
rmtPocketWatcher/
├── docker-compose.yml        # Docker orchestration
├── .env                      # Environment variables
├── backend/
│   ├── Dockerfile            # Backend container definition
│   ├── prisma/
│   │   └── schema.prisma     # Database schema
│   ├── src/
│   │   ├── scrapers/         # Scraping logic
│   │   ├── api/              # REST + WebSocket API
│   │   ├── database/         # Prisma client & repository
│   │   └── index.ts          # Main server
│   └── package.json
└── README.md
```