rmtPocketWatcher/SETUP.md
2025-12-03 18:00:10 -05:00

# rmtPocketWatcher Setup Guide
## What's Been Created
**Backend Scraper Service**
- Playwright-based scrapers for Eldorado and PlayerAuctions
- Automatic retry logic and error handling
- Scheduled scraping every 5 minutes
- Tracks all seller listings with platform, price, and delivery time
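
The retry logic might be structured along these lines — a minimal sketch, not the project's actual implementation; `withRetry` and its parameters are illustrative names:

```typescript
// Hedged sketch: generic retry helper with exponential backoff,
// suitable for wrapping a flaky scraper call.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 1000
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts) {
        // Back off 1s, 2s, 4s, ... before the next attempt.
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)));
      }
    }
  }
  throw lastError;
}
```

A scraper run would then be invoked as `withRetry(() => scrapeEldorado())`, so transient page-load failures don't abort the whole cycle.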
**Database Layer**
- PostgreSQL with Prisma ORM
- Three tables: VendorPrice, PriceIndex, ScrapeLog
- Stores all historical listings for trend analysis
- Indexed for fast queries
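
A rough sketch of what the three models might look like in `backend/prisma/schema.prisma` (field names and indexes here are assumptions — check the actual schema):

```prisma
// Hedged sketch of the three models; the real schema may differ.
model VendorPrice {
  id           Int      @id @default(autoincrement())
  seller       String
  platform     String
  price        Float
  deliveryTime String?
  createdAt    DateTime @default(now())

  @@index([platform, createdAt])
}

model PriceIndex {
  id          Int      @id @default(autoincrement())
  lowestPrice Float
  timestamp   DateTime @default(now())
}

model ScrapeLog {
  id        Int      @id @default(autoincrement())
  platform  String
  success   Boolean
  timestamp DateTime @default(now())
}
```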
**API Layer**
- Fastify REST API with 6 endpoints
- WebSocket for real-time updates
- Filtering by seller, platform, and date range
- Historical price data
**Docker Setup**
- Docker Compose orchestration
- PostgreSQL container with health checks
- Backend container with auto-migration
- Volume persistence for database
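
The shape of `docker-compose.yml` implied by the points above is roughly the following (service names, credentials, and healthcheck details are assumptions, not a copy of the real file):

```yaml
# Hedged sketch of the compose layout; consult the repo's
# docker-compose.yml for the authoritative definitions.
services:
  postgres:
    image: postgres:16-alpine
    container_name: rmtpw-postgres
    environment:
      POSTGRES_USER: rmtpw
      POSTGRES_PASSWORD: rmtpw   # placeholder; set via .env
      POSTGRES_DB: rmtpocketwatcher
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U rmtpw -d rmtpocketwatcher"]
      interval: 5s
    volumes:
      - pgdata:/var/lib/postgresql/data
  backend:
    build: ./backend
    ports:
      - "3000:3000"
    depends_on:
      postgres:
        condition: service_healthy
volumes:
  pgdata:
```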
## Current Status
The Docker Compose stack is building. This will:
1. Pull PostgreSQL 16 Alpine image
2. Build the backend Node.js container
3. Install Playwright and dependencies
4. Generate Prisma client
5. Start both services
## Once Build Completes
### Check Status
```bash
# View logs
docker-compose logs -f backend
# Check if services are running
docker ps
```
### Test the API
```bash
# Health check
curl http://localhost:3000/health
# Get latest prices (after first scrape)
curl http://localhost:3000/api/prices/latest
# Get lowest price
curl http://localhost:3000/api/prices/lowest
# Get price history
curl "http://localhost:3000/api/index/history?range=7d"
```
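
A response from the latest-prices endpoint might look something like this — the field names are guesses based on the data the scraper tracks, not the API's confirmed response shape:

```json
[
  {
    "seller": "exampleSeller",
    "platform": "eldorado",
    "price": 0.0425,
    "deliveryTime": "15 minutes",
    "scrapedAt": "2025-12-03T23:00:10Z"
  }
]
```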
### Monitor Scraping
The backend will automatically:
- Scrape Eldorado and PlayerAuctions every 5 minutes
- Save all listings to the database
- Calculate and store the lowest price
- Log all scrape attempts
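
Selecting the lowest price from a scrape batch could be as simple as this sketch (the `Listing` shape is assumed from the columns described earlier, not taken from the codebase):

```typescript
// Hedged sketch: pick the cheapest listing from one scrape cycle.
interface Listing {
  seller: string;
  platform: string;
  pricePerUnit: number;
}

function lowestPrice(listings: Listing[]): Listing | null {
  if (listings.length === 0) return null; // nothing scraped this cycle
  return listings.reduce((min, l) =>
    l.pricePerUnit < min.pricePerUnit ? l : min
  );
}
```

The winning listing would then be written to the price-index table alongside a timestamp.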
Check logs to see scraping activity:
```bash
docker-compose logs -f backend | grep "scraping"
```
## Database Access
### Using Prisma Studio
```bash
cd backend
npm run db:studio
```
### Using psql
```bash
docker exec -it rmtpw-postgres psql -U rmtpw -d rmtpocketwatcher
```
## Stopping and Restarting
```bash
# Stop services
docker-compose down
# Start services
docker-compose up -d
# Rebuild after code changes
docker-compose up --build -d
# View logs
docker-compose logs -f
# Remove everything including data
docker-compose down -v
```
## Environment Variables
Edit `.env` or `docker-compose.yml` to configure:
- `SCRAPE_INTERVAL_MINUTES` - How often to scrape (default: 5)
- `SCRAPER_HEADLESS` - Run browser in headless mode (default: true)
- `PORT` - API server port (default: 3000)
- `DATABASE_URL` - PostgreSQL connection string
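
A minimal `.env` using the documented defaults might look like this (the `DATABASE_URL` credentials are placeholders matching the `psql` command above — substitute your own password):

```bash
# Example .env — values are the documented defaults
SCRAPE_INTERVAL_MINUTES=5
SCRAPER_HEADLESS=true
PORT=3000
DATABASE_URL=postgresql://rmtpw:rmtpw@postgres:5432/rmtpocketwatcher
```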
## Next Steps
1. **Wait for build to complete** (~2-5 minutes)
2. **Verify services are running**: `docker ps`
3. **Check first scrape**: `docker-compose logs -f backend`
4. **Test API endpoints**: See examples above
5. **Build Electron frontend** (coming next)
## Troubleshooting
### Backend won't start
```bash
# Check logs
docker-compose logs backend
# Restart backend
docker-compose restart backend
```
### Database connection issues
```bash
# Check postgres is healthy
docker-compose ps
# Restart postgres
docker-compose restart postgres
```
### Scraper errors
```bash
# View detailed logs
docker-compose logs -f backend | grep -A 5 "error"
# Check scrape log in database
docker exec -it rmtpw-postgres psql -U rmtpw -d rmtpocketwatcher -c "SELECT * FROM scrape_log ORDER BY timestamp DESC LIMIT 10;"
```
## File Structure
```
rmtPocketWatcher/
rmtPocketWatcher/
├── docker-compose.yml      # Docker orchestration
├── .env                    # Environment variables
├── backend/
│   ├── Dockerfile          # Backend container definition
│   ├── prisma/
│   │   └── schema.prisma   # Database schema
│   ├── src/
│   │   ├── scrapers/       # Scraping logic
│   │   ├── api/            # REST + WebSocket API
│   │   ├── database/       # Prisma client & repository
│   │   └── index.ts        # Main server
│   └── package.json
└── README.md
```