This commit is contained in:
Deep Koluguri 2025-11-06 21:45:24 -05:00
parent 8c2d72eb84
commit 526e60e0ec
5109 changed files with 5502 additions and 547811 deletions

# 🚀 LuckyChit Deployment Guide
Complete production deployment guide.
---
## 📍 Production Setup
**Server IP**: 192.168.8.148
**Domain**: chitfund.deepteklabs.com
**User**: luckychit
**Project**: /home/luckychit/apps/chitfund
**Branch**: prodnew
---
## 🏗️ Architecture
```
Internet
   ↓
Cloudflare (SSL, CDN)
   ↓
Nginx Proxy (LXC 1)
   ↓
Application Server (LXC 2: 192.168.8.148)
├── PM2: luckychit-api (Port 3000)
├── PM2: luckychit-frontend (Port 8080)
└── PostgreSQL (Port 5432)
```
---
## ⚡ Quick Deploy
```bash
ssh luckychit@192.168.8.148
cd /home/luckychit/apps/chitfund
./scripts/deploy.sh
```
### Deploy Options:
```bash
./scripts/deploy.sh # Deploy both backend + frontend
./scripts/deploy.sh backend # Backend only
./scripts/deploy.sh frontend # Frontend only
```
---
## 🔧 Manual Deployment
### Backend
```bash
cd /home/luckychit/apps/chitfund
git pull origin prodnew
cd backend
npm install
pm2 restart luckychit-api
pm2 logs luckychit-api --lines 20
```
### Frontend
```bash
cd /home/luckychit/apps/chitfund
git pull origin prodnew
cd luckychit
flutter pub get
flutter build web --release --pwa-strategy=none
pm2 restart luckychit-frontend
pm2 logs luckychit-frontend --lines 20
```
---
## 📊 PM2 Commands
```bash
pm2 status # Show all processes
pm2 logs # Live logs
pm2 logs luckychit-api # Backend logs
pm2 logs luckychit-frontend # Frontend logs
pm2 restart all # Restart everything
pm2 monit # Real-time monitoring
```
---
## 🗑️ Clear Caches (After Deployment)
### 1. Server Cache
```bash
pm2 restart all
```
### 2. Nginx Proxy Cache (if using separate LXC)
```bash
ssh root@<nginx-lxc-ip>
sudo rm -rf /var/cache/nginx/*
sudo systemctl reload nginx
```
### 3. Cloudflare Cache
- Login to Cloudflare dashboard
- Caching → Purge Everything
### 4. Browser Cache
```
Hard Refresh: Ctrl + Shift + R (Windows) or Cmd + Shift + R (Mac)
Or test in Incognito: Ctrl + Shift + N
```
---
## 🔥 Emergency: Complete Cache Clear
```bash
cd /home/luckychit/apps/chitfund
./scripts/deploy.sh --force
```
This will:
1. Clean all build caches
2. Rebuild from scratch
3. Restart all services
4. Clear nginx cache (if configured)
---
## 🗄️ Database Backup
### Manual Backup
```bash
./scripts/backup-db.sh
```
### Automated Daily Backup
```bash
# Add to crontab
crontab -e
# Add: 0 2 * * * /home/luckychit/apps/chitfund/scripts/backup-db.sh
```
### Restore from Backup
```bash
./scripts/restore-db.sh
```
---
## 🔍 Initial Server Setup (One-Time)
### 1. Install Dependencies
```bash
# Node.js, Flutter, PM2, PostgreSQL
# See PM2_PRODUCTION_GUIDE.md for details
```
### 2. Setup PM2
```bash
cd /home/luckychit/apps/chitfund/backend
pm2 start src/server.js --name luckychit-api
cd ../luckychit
pm2 serve build/web 8080 --name luckychit-frontend --spa
pm2 save
pm2 startup systemd -u luckychit --hp /home/luckychit
```
### 3. Configure Auto-start
```bash
pm2 startup
# Run the command it outputs
pm2 save
```
---
## 🔐 Security Checklist
- [ ] Firewall configured (ports 3000, 8080)
- [ ] Strong database password
- [ ] JWT secret set (32+ characters)
- [ ] SSL/TLS enabled (via Cloudflare)
- [ ] Database backups automated
- [ ] PM2 auto-start configured
---
## 📋 Pre-Deployment Checklist
- [ ] Changes tested locally
- [ ] Committed to git
- [ ] Pushed to `prodnew` branch
- [ ] Database migrations ready (if any)
- [ ] Backup recent (< 24 hours)
---
## 📋 Post-Deployment Checklist
- [ ] `pm2 status` shows all online
- [ ] No errors in `pm2 logs`
- [ ] Backend health check works: `curl http://localhost:3000/health`
- [ ] Frontend loads: `curl http://localhost:8080`
- [ ] Domain works: https://chitfund.deepteklabs.com
- [ ] Login works
- [ ] Clear browser cache tested
---
## 🆘 Troubleshooting
See [TROUBLESHOOTING.md](TROUBLESHOOTING.md) for common issues and fixes.
Quick diagnostics:
```bash
./scripts/diagnose.sh
```
---
## 📞 Emergency Contacts
**If everything is broken:**
```bash
./scripts/diagnose.sh # See what's wrong
./scripts/fix-502.sh # Fix 502 errors
pm2 restart all # Restart everything
```
---
## 📚 Additional Resources
- **PM2 Production Guide**: [PM2_PRODUCTION_GUIDE.md](PM2_PRODUCTION_GUIDE.md)
- **Quick Reference**: [QUICK_REFERENCE.md](QUICK_REFERENCE.md)
- **Backend API Docs**: [backend/API_DOCUMENTATION.md](backend/API_DOCUMENTATION.md)
---
**Last Updated**: November 2025

# ⚡ Quick Start - LuckyChit
Get up and running in 5 minutes!
---
## 🎯 For Users: Access the App
**Production URL**: https://chitfund.deepteklabs.com
That's it! Just open the link and login.
---
## 👨‍💻 For Developers: Local Development
### Backend
```bash
cd backend
npm install
cp env.example .env # Edit with your database credentials
npm start # Runs on http://localhost:3000
```
### Frontend
```bash
cd luckychit
flutter pub get
flutter run -d chrome # Opens in browser
```
---
## 🚀 For Ops: Deploy to Production
```bash
ssh luckychit@192.168.8.148
cd /home/luckychit/apps/chitfund
./scripts/deploy.sh
```
That's it! See [DEPLOYMENT.md](DEPLOYMENT.md) for details.
---
## 🆘 Something Broken?
```bash
./scripts/diagnose.sh # Shows what's wrong
```
See [TROUBLESHOOTING.md](TROUBLESHOOTING.md) for fixes.
---
## 📚 Need More Info?
- **Deployment**: [DEPLOYMENT.md](DEPLOYMENT.md)
- **Troubleshooting**: [TROUBLESHOOTING.md](TROUBLESHOOTING.md)
- **Full README**: [README.md](README.md)

README.md
# 🚀 LuckyChit - Digital Chit Fund Management
Complete production-ready chit fund management system with Flutter frontend and Node.js backend.
---
## 📚 Documentation
| Document | Purpose | Audience |
|----------|---------|----------|
| **[QUICK_START.md](QUICK_START.md)** | Get started in 5 minutes | Everyone |
| **[DEPLOYMENT.md](DEPLOYMENT.md)** | Complete deployment guide | DevOps/Developers |
| **[TROUBLESHOOTING.md](TROUBLESHOOTING.md)** | Fix common issues | Everyone |
| **[QUICK_REFERENCE.md](QUICK_REFERENCE.md)** | Command cheat sheet | DevOps |
---
## ⚡ Quick Commands
### 🚀 Deploy to Production
```bash
ssh luckychit@192.168.8.148
cd /home/luckychit/apps/chitfund
./scripts/deploy.sh
```
### 📊 Check Status
```bash
pm2 status
pm2 logs --lines 50
```
### 🔧 Fix Issues
```bash
./scripts/diagnose.sh # What's wrong?
./scripts/fix-502.sh # Fix 502 errors
```
---
## 🏗️ Architecture
```
Internet
   ↓
Cloudflare (SSL, CDN, DDoS Protection)
   ↓
Nginx Proxy (Reverse Proxy)
   ↓
Application Server (192.168.8.148)
├── PM2: luckychit-api (Backend - Port 3000)
├── PM2: luckychit-frontend (Frontend - Port 8080)
└── PostgreSQL Database (Port 5432)
```
**Production URL**: https://chitfund.deepteklabs.com
---
## 📁 Project Structure
```
chitfund/
├── README.md # This file
├── QUICK_START.md # Getting started
├── DEPLOYMENT.md # Deployment guide
├── TROUBLESHOOTING.md # Issue fixes
├── scripts/ # All deployment scripts
│ ├── deploy.sh # Main deployment
│ ├── diagnose.sh # System diagnostics
│ ├── backup-db.sh # Database backup
│ ├── restore-db.sh # Database restore
│ └── fix-502.sh # Fix 502 errors
├── backend/ # Node.js Express API
│ ├── src/ # Source code
│ ├── README.md # Backend docs
│ └── API_DOCUMENTATION.md
└── luckychit/ # Flutter Web App
├── lib/ # Dart source code
├── web/ # Web assets
└── README.md # Frontend docs
```
---
## 🚀 Getting Started
### For Users
Just visit: **https://chitfund.deepteklabs.com**
### For Developers (Local)
```bash
# Backend
cd backend
npm install
npm start # http://localhost:3000
# Frontend
cd luckychit
flutter pub get
flutter run -d chrome # Opens in browser
```
### For DevOps (Production)
See **[DEPLOYMENT.md](DEPLOYMENT.md)**
---
## 📊 Tech Stack
**Frontend**
- Flutter Web
- GetX (State Management)
- Dio (HTTP Client)
**Backend**
- Node.js + Express
- PostgreSQL
- JWT Authentication
- Sequelize ORM
**Infrastructure**
- PM2 (Process Manager)
- Nginx (Reverse Proxy)
- Cloudflare (CDN + SSL)
- LXC Containers (Isolation)
---
## 🆘 Need Help?
**Something not working?**
1. Check [TROUBLESHOOTING.md](TROUBLESHOOTING.md)
2. Run `./scripts/diagnose.sh`
3. Check PM2 logs: `pm2 logs`
**Want to deploy?**
- See [DEPLOYMENT.md](DEPLOYMENT.md)
- Quick: `./scripts/deploy.sh`
**Learning the system?**
- Start with [QUICK_START.md](QUICK_START.md)
- Reference [QUICK_REFERENCE.md](QUICK_REFERENCE.md)
---
## 📝 Recent Updates
- ✅ Removed demo credentials from login
- ✅ Fixed all caching issues (browser, service worker, nginx)
- ✅ Added signup navigation to dashboard
- ✅ Consolidated documentation and scripts
- ✅ Created unified deployment system
---
## 🎯 Quick Links
- **Production**: https://chitfund.deepteklabs.com
- **Backend API**: http://192.168.8.148:3000
- **Frontend**: http://192.168.8.148:8080
- **API Docs**: [backend/API_DOCUMENTATION.md](backend/API_DOCUMENTATION.md)
---
**Version**: 1.0.0
**Last Updated**: November 6, 2025
**Status**: ✅ Production Ready
---
## 📄 License & Documentation
For complete feature documentation, see individual README files in `backend/` and `luckychit/` directories.
**Questions?** Check [TROUBLESHOOTING.md](TROUBLESHOOTING.md) first!

README/ADMIN_GUIDE.md
# Admin Features Guide
Complete guide for managers to edit, delete, and manage their chit groups.
---
## 🎯 Overview
As a manager, you have complete control over:
- Monthly draws (edit/delete)
- Chit groups (edit/delete before starting)
- Members (edit details, remove from group)
- Payments and reports
---
## 📋 Table of Contents
1. [Manage Monthly Draws](#manage-monthly-draws)
2. [Manage Chit Groups](#manage-chit-groups)
3. [Manage Members](#manage-members)
4. [Member Numbers](#member-numbers)
5. [API Reference](#api-reference)
---
## 🎲 Manage Monthly Draws
### Edit Draw (Fix Mistakes)
**When to use**: Wrong winner selected or incorrect prize amount
**How to**:
1. Go to Draws tab
2. Find the incorrect draw
3. Click ✏️ (Edit icon)
4. Select correct winner and/or update prize amount
5. Add notes explaining the correction
6. Click "Update Draw"
**API**: `PUT /api/monthly-draws/{drawId}`
**What you can edit**:
- Winner (select from member list)
- Prize amount
- Notes
**What you CANNOT edit**:
- Month/year (delete and recreate instead)
- Draw date
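The edit call can be issued from any HTTP client. A sketch of building the request; the field names in `changes` (e.g. `winnerId`, `prizeAmount`, `notes`) are illustrative assumptions, not the confirmed request schema:

```javascript
// Hypothetical request builder for PUT /api/monthly-draws/{drawId}.
// Field names inside `changes` are assumptions; check the real API docs.
function buildDrawUpdate(drawId, changes) {
  return {
    url: `/api/monthly-draws/${drawId}`,
    options: {
      method: 'PUT',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(changes),
    },
  };
}

// Usage: const req = buildDrawUpdate(id, { prizeAmount: 195000 });
// fetch(baseUrl + req.url, { ...req.options, headers: { ...req.options.headers, Authorization: `Bearer ${token}` } });
```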
---
### Delete Draw (Remove Mistakes)
**When to use**: Draw was added for wrong month or by mistake
**How to**:
1. Go to Draws tab
2. Find the incorrect draw
3. Click 🗑️ (Delete icon)
4. Confirm deletion (shows draw details)
5. Draw is permanently removed
**API**: `DELETE /api/monthly-draws/{drawId}`
**Warning**: Deletion is permanent! Consider editing instead if possible.
---
## 🏢 Manage Chit Groups
### Edit Group Details
**When**: Only for groups in "Forming" status (not started yet)
**How to**:
1. Open group details page
2. Click top menu (⋮)
3. Select "Edit Group Details"
4. Update any field (name, amounts, dates, etc.)
5. Click "Update Group"
**API**: `PUT /api/chit-groups/{groupId}`
**What you can edit**:
- Group name
- Total value
- Monthly installment
- Duration (months)
- Max members
- Commission amount
- Draw date (1-31)
**Restriction**: Only works for groups in "forming" status. Once the group starts, its details are locked.
---
### Delete Group
**When**: Only for groups in "Forming" status with 0 active members
**How to**:
1. Remove all members first (status → 'removed')
2. Click top menu (⋮)
3. Select "Delete Group"
4. Confirm deletion
**API**: `DELETE /api/chit-groups/{groupId}`
**Requirements**:
- ✅ Status must be "forming"
- ✅ Must have 0 active members (removed members are OK)
**Note**: Removed members don't block deletion!
---
## 👥 Manage Members
### Member Numbers
Every member gets a readable number:
- **Member #1, #2, #3...**
- Auto-assigned when they join
- Unique within each group
- Easy to reference verbally
**Display**:
- Purple circle badge with #X
- "Member #X" chip on card
- Large display in edit dialog
**Usage**:
- "Call Member Number 5"
- "Member #3 hasn't paid"
- Print on receipts
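As a sketch, the auto-assignment described above could be computed as "highest existing number in the group, plus one"; this assumes numbers are unique per group and gaps from removed members are not reused (the actual backend logic may differ):

```javascript
// Hypothetical next-member-number computation for one group.
// `existingMembers` is assumed to be rows of { memberNumber } for that group.
function nextMemberNumber(existingMembers) {
  const max = existingMembers.reduce(
    (acc, row) => Math.max(acc, row.memberNumber || 0),
    0
  );
  return max + 1; // the first member gets #1
}
```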
---
### Edit Member Details
**When**: Anytime! Works for all group statuses
**How to**:
1. Go to Members tab
2. Click menu (⋮) on member card
3. Select "Edit Member"
4. Update any field
5. Click "Update Member"
**API**: `PUT /api/auth/member/{memberId}`
**What you can edit**:
- Full name
- Mobile number (must be unique, 10 digits)
- Email address
- Physical address
- Emergency contact
**What you CANNOT edit**:
- Member number (auto-assigned)
- UUID (permanent identifier)
**Features**:
- View member number (highlighted in purple)
- View/copy UUID for technical support
- All fields optional (update only what you need)
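The mobile-number rule ("must be unique, 10 digits") implies a format check before the uniqueness lookup. A sketch; the helper name is invented:

```javascript
// Hypothetical format check for the 10-digit mobile rule.
// Uniqueness still has to be verified against the database.
function isValidMobile(mobile) {
  return /^\d{10}$/.test(mobile);
}
```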
---
### Remove Member from Group
**Important**: This removes them from THIS group only, NOT from the system!
**How to**:
1. Go to Members tab
2. Click menu (⋮) on member card
3. Select "Remove"
4. Confirm removal
**API**: `DELETE /api/members/{groupId}/members/{userId}`
**What happens**:
- ✅ Member removed from THIS group
- ✅ Their account remains active
- ✅ They can still login
- ✅ They remain in other groups
- ✅ Can be re-added to this group
- ✅ All data preserved
**Restriction**: Only works for groups in "forming" status
---
## 🔄 Common Workflows
### Fix Wrong Draw Winner
```
1. Draws tab → Find draw → Click ✏️
2. Select correct winner
3. Add note: "Corrected winner selection"
4. Update → ✅ Fixed!
```
### Update Member Phone Number
```
1. Members tab → Member menu (⋮) → Edit Member
2. Update mobile number
3. Update → ✅ They'll now get SMS/WhatsApp!
```
### Delete Empty Test Group
```
1. Remove all members (if any)
2. Top menu (⋮) → Delete Group
3. Confirm → ✅ Group deleted!
```
### Fix Group Name Before Starting
```
1. Top menu (⋮) → Edit Group Details
2. Update name
3. Update → ✅ Name fixed!
4. Start group when ready
```
---
## 🔒 Permissions & Restrictions
### What Managers CAN Do:
- ✅ Edit/delete their own group draws
- ✅ Edit group details (if forming)
- ✅ Delete groups (if forming & 0 active members)
- ✅ Edit any member's details
- ✅ Remove members from groups
### What Managers CANNOT Do:
- ❌ Edit other managers' groups
- ❌ Edit groups after they start
- ❌ Delete groups with active members
- ❌ Delete member accounts (only remove from group)
- ❌ Change member numbers or UUIDs
---
## 🎨 Visual Reference
### Member Card
```
┌────────────────────────────────────┐
│ [#5] K Sundeep Reddy [Member #5]│
│ 9876543228 │
│ [Active] ₹1,02,500 paid │
│ [⋮] │
└────────────────────────────────────┘
```
### Draw Card
```
┌────────────────────────────────────┐
│ [🎲] March 2025 Draw │
│ Winner: K Sundeep Reddy │
│ [Completed] ₹1,95,000 │
│ │
│ 15/3/2025 │
│ [✏️] [🗑️] │
└────────────────────────────────────┘
```
### Top Menu (Forming Group)
```
[⋮] Menu
├─ ✏️ Edit Group Details
├─ 🗑️ Delete Group
├─ 👥 Select Members
├─ Add New User
├─ 🕐 Add Past Draw Result
└─ 💰 Add Past Payments
```
---
## 📊 Status-Based Features
### "Forming" Status 🟡
**You CAN**:
- ✅ Edit group details
- ✅ Delete group (if 0 active members)
- ✅ Add/remove members
- ✅ Edit member details
- ✅ Add past draws/payments
**You CANNOT**:
- ❌ Conduct live draws (not started)
### "Active" Status 🟢
**You CAN**:
- ✅ Conduct monthly draws
- ✅ Edit/delete draws
- ✅ Edit member details
- ✅ Record payments
- ✅ View reports
**You CANNOT**:
- ❌ Edit group details (locked)
- ❌ Delete group
- ❌ Remove members
---
## ⚠️ Important Notes
### Deletions Are Permanent
- No undo button
- Always double-check before deleting
- Consider editing instead when possible
### "Remove Member" ≠ "Delete User"
- Removing member only removes them from THAT group
- Their account stays active
- They can still be in other groups
- Can be re-added later
### Group Edits Before Start Only
- Once you start a group, details are locked
- Check everything carefully before starting
- Members and draws can still be edited after the group starts
---
## 🔍 Member ID System
### Two Types of IDs
**1. Member Number** (Human-friendly)
- Format: #1, #2, #3, #4...
- Display: Large purple badge
- Usage: Verbal communication, receipts, WhatsApp
- Example: "Member #5"
**2. UUID** (Technical)
- Format: c5530367-7110-4914-bfa6-f03818dedeb2
- Display: Hidden by default (show in edit dialog)
- Usage: Database operations, API calls
- Example: For technical support only
**Best Practice**:
- Show users the member number (#5)
- Keep UUID for backend operations
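When a support workflow needs to tell the two ID types apart programmatically, a format check is enough. A sketch; the helper name is invented:

```javascript
// Hypothetical check distinguishing a UUID from a member number.
function looksLikeUuid(value) {
  return /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i.test(value);
}
```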
---
## 📱 API Quick Reference
### Monthly Draws
```
POST /api/monthly-draws Create draw
GET /api/monthly-draws/group/{id} List draws
PUT /api/monthly-draws/{id} Edit draw ⭐
DELETE /api/monthly-draws/{id} Delete draw ⭐
```
### Chit Groups
```
POST /api/chit-groups Create group
GET /api/chit-groups/manager List my groups
PUT /api/chit-groups/{id} Edit group ⭐
DELETE /api/chit-groups/{id} Delete group ⭐
POST /api/chit-groups/{id}/start Start group
```
### Members
```
POST /api/auth/create-member Create member
GET /api/members/{gId}/members List members
PUT /api/auth/member/{id} Edit member ⭐
DELETE /api/members/{gId}/members/{uId} Remove from group ⭐
```
⭐ = New admin features
---
## 🧪 Testing Checklist
- [ ] Create test group
- [ ] Add members (see member numbers #1, #2, #3)
- [ ] Add past draw
- [ ] Edit draw (change winner)
- [ ] Delete draw
- [ ] Edit member (update phone)
- [ ] Edit group name
- [ ] Remove all members
- [ ] Delete group
- [ ] Verify large fonts work for elderly users
---
## 🆘 Support
### Common Issues
- **Can't delete group**: Check if all members are removed (not just inactive)
- **404 on member delete**: Make sure you're passing `member.userId`, not `member.id`
- **Can't edit group**: Is status "forming"?
- **Module not found**: Run `npm install` in backend
See `TROUBLESHOOTING.md` for more help.
---
## 🎉 Latest Updates (November 2025)
- ✅ Added member numbers (#1, #2, #3...)
- ✅ Complete admin edit/delete features
- ✅ Accessibility improvements for elderly users
- ✅ Fixed duplicate winner prevention
- ✅ Removed all hardcoded data
- ✅ Enhanced UI/UX
---
**Version**: 2.0
**Status**: Production Ready ✅

README/CHANGELOG.md
# Changelog
## Version 2.0 - November 6, 2025
### 🎉 Major Features Added
#### Member Numbers
- Added readable serial numbers (#1, #2, #3...) for each member
- Auto-assigned when member joins
- Displayed prominently in large purple badges
- Easy to reference verbally ("Call Member #5")
#### Complete Admin Features
- **Edit Monthly Draws**: Fix wrong winner or prize amount
- **Delete Monthly Draws**: Remove draws added by mistake
- **Edit Member Details**: Update name, phone, email, address
- **Edit Group Details**: Update group info before starting
- **Delete Groups**: Remove empty forming groups
- **View Member IDs**: Display both member number and UUID
#### Accessibility for Elderly Users (50+)
- 30% larger fonts throughout entire app
- High contrast colors (pure black on white)
- 40% larger buttons and touch targets
- Bolder text for better readability
- Thicker borders for visibility
- WCAG AAA compliant
### 🐛 Bug Fixes
- Fixed 500 error when creating past draws
- Fixed duplicate winner prevention
- Fixed delete group counting removed members
- Fixed member delete using wrong ID
- Removed all hardcoded test data
- Fixed add draw result screen scrolling
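The duplicate-winner prevention above can be sketched as filtering past winners out of the eligible list before a draw; the field names (`userId`, `winnerId`) are assumptions:

```javascript
// Hypothetical eligibility filter: members who already won are excluded
// from the next draw. Field names (userId, winnerId) are assumptions.
function eligibleWinners(members, pastDraws) {
  const pastWinners = new Set(pastDraws.map((draw) => draw.winnerId));
  return members.filter((member) => !pastWinners.has(member.userId));
}
```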
### 🎨 UI/UX Improvements
- Made all dialogs scrollable
- Enhanced confirmation messages
- Added "Already Won" badges for previous winners
- Improved empty states
- Added clear warnings before deletions
- Better visual hierarchy
### 🔧 Technical Improvements
- Extended User model with email, address, emergency contact
- Added member_number field to GroupMember
- Improved API error messages
- Smart change detection (only update changed fields)
- Better validation throughout
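The "smart change detection" item can be sketched as a shallow diff that keeps only fields whose values actually differ, so update payloads stay minimal; this sketch assumes flat (non-nested) objects:

```javascript
// Hypothetical shallow diff: returns only the fields that changed,
// so the update payload stays minimal. Assumes flat objects.
function changedFields(original, edited) {
  const diff = {};
  for (const key of Object.keys(edited)) {
    if (edited[key] !== original[key]) diff[key] = edited[key];
  }
  return diff;
}
```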
---
## Version 1.0 - Previous
Initial release with core features:
- Chit group management
- Member management
- Payment tracking
- Monthly draws with provably fair system
- WhatsApp integration
- Financial reports
---
**Current Version**: 2.0
**Status**: Production Ready ✅

README/DEPLOYMENT.md
# Deployment Guide
Complete guide for deploying LuckyChit to production.
---
## 📋 Prerequisites
- PostgreSQL 12+
- Node.js 14+
- PM2 (for process management)
- Flutter SDK (for mobile app)
---
## 🚀 Backend Deployment
### 1. Initial Setup
```bash
# Clone repository
cd /home/luckychit/apps/chitfund/backend
# Install dependencies
npm install
# Setup environment
cp env.example .env
nano .env # Configure your settings
```
### 2. Database Setup
```bash
# Create database
sudo -u postgres psql
postgres=# CREATE DATABASE luckychit;
postgres=# CREATE USER luckychit WITH PASSWORD 'your_password';
postgres=# GRANT ALL PRIVILEGES ON DATABASE luckychit TO luckychit;
postgres=# \q
```
### 3. Apply Member Number Migration
**Option A: Using Node.js Script** (Easiest)
```bash
cd /home/luckychit/apps/chitfund/backend
node run-member-number-migration.js
```
**Option B: Using SQL File**
```bash
# Copy to accessible location
sudo cp migrations/20251106_add_member_number.sql /tmp/
sudo chmod 644 /tmp/20251106_add_member_number.sql
# Apply migration
sudo -u postgres psql -d luckychit -f /tmp/20251106_add_member_number.sql
# Clean up
sudo rm /tmp/20251106_add_member_number.sql
```
### 4. Start Backend with PM2
```bash
# Install PM2 globally (if not installed)
npm install -g pm2
# Start backend
pm2 start ecosystem.config.js
# Save PM2 configuration
pm2 save
# Setup auto-start on reboot
pm2 startup
# Follow the command it gives you
```
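A minimal `ecosystem.config.js` sketch matching the commands above; every value here is an assumption apart from the `./src/server.js` entry point and port 3000 mentioned elsewhere in this guide:

```javascript
// Hypothetical ecosystem.config.js sketch; adjust names and limits to taste.
const config = {
  apps: [
    {
      name: 'chitfund-backend',   // must match the pm2 restart/logs commands
      script: './src/server.js',  // backend entry point
      instances: 1,
      autorestart: true,
      max_memory_restart: '300M', // assumed memory cap, not from this guide
      env: {
        NODE_ENV: 'production',
        PORT: 3000,
      },
    },
  ],
};

// export guard so this sketch also runs standalone
if (typeof module !== 'undefined') module.exports = config;
```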
### 5. Verify Backend is Running
```bash
# Check status
pm2 status
# Check logs
pm2 logs chitfund-backend --lines 50
# Test health endpoint
curl http://localhost:3000/health
```
---
## 📱 Frontend Deployment
### Android APK
```bash
cd luckychit
# Build release APK
flutter build apk --release
# APK location:
# build/app/outputs/flutter-apk/app-release.apk
```
### iOS App
```bash
cd luckychit
# Build iOS
flutter build ios --release
# Then archive in Xcode
```
### Web Deployment
```bash
cd luckychit
# Build web
flutter build web --release
# Deploy files from:
# build/web/
```
---
## 🔄 Updating Production
### Update Backend
```bash
cd /home/luckychit/apps/chitfund/backend
# Pull latest code
git pull origin main
# Install any new dependencies
npm install
# Run migrations (if any)
node run-member-number-migration.js # If not already applied
# Restart
pm2 restart chitfund-backend
# Check logs
pm2 logs chitfund-backend
```
### Update Frontend
```bash
cd luckychit
# Pull latest code
git pull origin main
# Get dependencies
flutter pub get
# Build new APK
flutter build apk --release
# Distribute to users
```
---
## 🔧 Troubleshooting
### Backend Won't Start
**Module Not Found Error**:
```bash
cd backend
rm -rf node_modules package-lock.json
npm install
pm2 restart chitfund-backend
```
**Database Connection Error**:
```bash
# Check PostgreSQL is running
sudo systemctl status postgresql
# Check credentials in .env file
cat .env | grep DB_
# Test connection
node test-db-connection.js
```
**Port Already in Use**:
```bash
# Find what's using port 3000
lsof -i :3000
# Stop old process
pm2 delete all
pm2 start ecosystem.config.js
```
---
### Migration Issues
**Permission Denied**:
- Use Node.js script: `node run-member-number-migration.js`
- Or copy SQL to /tmp first
**Peer Authentication Failed**:
- Use: `sudo -u postgres psql`
- Or configure md5 auth in pg_hba.conf
**Column Already Exists**:
- Migration was already applied, safe to ignore
- Verify: `SELECT member_number FROM group_members LIMIT 1;`
---
### Frontend Issues
**API Connection Error**:
- Check API URL in `lib/core/services/api_service.dart`
- Should be: `https://chitfund.deepteklabs.com/api`
**Member Numbers Not Showing**:
- Check backend migration was applied
- Check API response includes `member_number` field
- Restart both backend and frontend
---
## 📊 Health Checks
### Backend Health
```bash
# Check PM2 status
pm2 status
# Check if responding
curl https://chitfund.deepteklabs.com/health
# Check database
sudo -u postgres psql -d luckychit -c "SELECT COUNT(*) FROM users;"
# Check member numbers
sudo -u postgres psql -d luckychit -c "SELECT member_number FROM group_members LIMIT 5;"
```
### Frontend Health
```bash
# Build test
flutter build apk --debug
# Check for errors
flutter analyze
# Run on device
flutter run
```
---
## 🔐 Security Checklist
- [ ] Change default passwords
- [ ] Set strong JWT secret in .env
- [ ] Enable rate limiting in production
- [ ] Use HTTPS only
- [ ] Configure CORS properly
- [ ] Keep dependencies updated
- [ ] Regular database backups
- [ ] Monitor PM2 logs
---
## 📦 Environment Variables
### Required (.env file)
```bash
# Database
DB_HOST=localhost
DB_PORT=5432
DB_NAME=luckychit
DB_USER=luckychit
DB_PASSWORD=your_secure_password
# JWT
JWT_SECRET=your_very_long_secret_key_here
JWT_EXPIRY=7d
# Server
PORT=3000
NODE_ENV=production
# Optional
RATE_LIMIT_WINDOW_MS=900000
RATE_LIMIT_MAX_REQUESTS=100
```
---
## 📝 Maintenance Tasks
### Daily
- Monitor PM2 logs: `pm2 logs`
- Check disk space: `df -h`
### Weekly
- Review error logs
- Check database size
- Update dependencies if needed
### Monthly
- Database backup
- Review and archive old logs
- Check for security updates
---
## 🎯 Production Checklist
### Before Going Live
- [x] Database migration applied
- [x] Backend running on PM2
- [x] Backend health check passes
- [x] Frontend built and tested
- [x] Admin features working
- [x] Member numbers displaying
- [x] All CRUD operations tested
- [x] Authentication working
- [x] WhatsApp integration configured
- [x] Large fonts for accessibility
### After Deployment
- [ ] Monitor error logs
- [ ] Test with real users
- [ ] Gather feedback
- [ ] Document any issues
- [ ] Plan next iteration
---
## 📞 Quick Commands Reference
```bash
# Backend
pm2 restart chitfund-backend # Restart
pm2 logs chitfund-backend # View logs
pm2 status # Check status
npm install # Install deps
# Database
sudo -u postgres psql -d luckychit # Connect
node run-member-number-migration.js # Migrate
# Frontend
flutter pub get # Get deps
flutter build apk --release # Build APK
flutter run # Run app
```
---
## ✅ Deployment Complete!
After following this guide:
- ✅ Backend running on PM2
- ✅ Database migrated with member numbers
- ✅ Frontend built and deployed
- ✅ All admin features working
- ✅ Ready for production use
**Your chitfund management system is live!** 🎉

# 🚀 Ready to Deploy - All Fixes
## Summary of Changes
Multiple fixes ready for production deployment!
---
## ✅ Changes to Deploy
### **Backend Fixes (3)**
1. **Notification API** - Replaced Mongoose calls with Sequelize
- File: `backend/src/routes/notifications.js`
- Error: 500 on `/api/notifications`
2. **Available Users API** - Added missing parentheses to subquery
- File: `backend/src/controllers/memberController.js`
- Error: 500 on `/api/members/users/available/:groupId`
3. **Updated ecosystem.config.js** - Fixed paths
- File: `backend/ecosystem.config.js`
- Changed script path to `./src/server.js`
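The missing-parentheses bug above is a classic raw-subquery issue: SQL requires the subquery inside `NOT IN (...)` to be parenthesized. A stripped-down illustration (the helper is invented, not the actual controller code):

```javascript
// Hypothetical illustration of the fix: an interpolated subquery must be
// wrapped in parentheses to produce valid SQL.
function notInSubquery(column, subquery) {
  // broken form: `${column} NOT IN ${subquery}`  -> invalid SQL
  return `${column} NOT IN (${subquery})`;
}
```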
### **Frontend Fixes (4)**
4. **Removed Demo Credentials** - Login screen clean
- File: `luckychit/lib/features/auth/views/login_screen.dart`
5. **Signup Navigation** - Auto-redirect to dashboard
- File: `luckychit/lib/features/auth/views/signup_screen.dart`
6. **Member Dashboard** - Removed hardcoded data
- File: `luckychit/lib/interfaces/member/member_dashboard.dart`
- Shows real data from API
- Empty state when no groups
- Fixed property name: `totalValue` (not `totalAmount`)
7. **Cache-Busting** - Fixed service worker issues
- File: `luckychit/web/index.html`
- Added cache control headers
- Auto-clears service workers
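The service-worker cleanup in `web/index.html` can be sketched as unregistering every registration. The synchronous shape below is simplified for illustration; in the browser, the registrations come from the promise-based `navigator.serviceWorker.getRegistrations()`:

```javascript
// Hypothetical sketch of the cache-busting cleanup: unregister every
// service-worker registration. Simplified to a synchronous list for clarity.
function unregisterAll(registrations) {
  registrations.forEach((reg) => reg.unregister());
  return registrations.length;
}
```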
### **Infrastructure (2)**
8. **Consolidated Scripts** - All in `scripts/` folder
- `scripts/deploy.sh` - Unified deployment
- `scripts/diagnose.sh` - Diagnostics
- `scripts/backup-db.sh` - Database backup
- `scripts/restore-db.sh` - Database restore
- `scripts/fix-502.sh` - Fix 502 errors
9. **Consolidated Documentation** - Clean structure
- `README.md` - Main docs
- `QUICK_START.md` - Getting started
- `DEPLOYMENT.md` - Deployment guide
- `TROUBLESHOOTING.md` - All fixes
---
## 🚀 Deployment Steps
### Step 1: Commit All Changes
```bash
git add .
git commit -m "Production fixes: APIs, UI cleanup, cache-busting, consolidation"
git push origin prodnew
```
### Step 2: Deploy to Production
```bash
ssh luckychit@192.168.8.148
cd /home/luckychit/apps/chitfund
git pull origin prodnew
# Make scripts executable
chmod +x scripts/*.sh
# Deploy everything
./scripts/deploy.sh
```
### Step 3: Fix bcrypt Error (If Needed)
If you see "invalid ELF header" error:
```bash
cd /home/luckychit/apps/chitfund/backend
rm -rf node_modules package-lock.json
npm install
pm2 restart luckychit-api
```
### Step 4: Clear All Caches
```bash
# On server - already done by deploy script
pm2 restart all
# Clear nginx cache (if using separate proxy LXC)
ssh root@<nginx-proxy-ip>
sudo rm -rf /var/cache/nginx/*
sudo systemctl reload nginx
# In browser - Clear service worker
# F12 → Application → Service Workers → Unregister all
# Then: Ctrl + Shift + R (hard refresh)
```
### Step 5: Verify Everything Works
```bash
# Check PM2 status
pm2 status
# Check logs for errors
pm2 logs --lines 30
# Test backend APIs
curl http://localhost:3000/health
curl -H "Authorization: Bearer TOKEN" http://localhost:3000/api/notifications
# Test frontend
curl http://localhost:8080
# Test from domain
curl https://chitfund.deepteklabs.com/health
```
---
## 🧪 Testing Checklist
### Backend:
- [ ] API starts without bcrypt errors
- [ ] `/api/notifications` returns 200
- [ ] `/api/members/users/available/:groupId` returns 200
- [ ] No errors in `pm2 logs luckychit-api`
### Frontend:
- [ ] Login screen shows (no demo credentials)
- [ ] Signup redirects to dashboard
- [ ] Member dashboard shows empty state (if no groups)
- [ ] Member dashboard shows real data (if has groups)
- [ ] No console errors
- [ ] Service worker cleared
### Infrastructure:
- [ ] Deployment script works: `./scripts/deploy.sh`
- [ ] Diagnostics work: `./scripts/diagnose.sh`
- [ ] All scripts executable
---
## 🐛 Known Issues During Deployment
### Issue 1: bcrypt "invalid ELF header"
**Fix:**
```bash
cd /home/luckychit/apps/chitfund/backend
rm -rf node_modules package-lock.json
npm install
pm2 restart luckychit-api
```
### Issue 2: Frontend showing old version
**Fix:**
```bash
# Server side
./scripts/deploy.sh frontend --force
# Browser side
# F12 → Application → Service Workers → Unregister
# Ctrl + Shift + R (hard refresh)
```
### Issue 3: 502 Bad Gateway
**Fix:**
```bash
./scripts/fix-502.sh
```
---
## 📊 Expected Results
### Backend API:
```bash
# Should work now
curl https://chitfund.deepteklabs.com/api/notifications
# Response: { "success": true, "data": { "notifications": [], ... } }
curl https://chitfund.deepteklabs.com/api/members/users/available/GROUP_ID
# Response: { "success": true, "data": { "users": [...], ... } }
```
### Frontend:
- ✅ Login screen - No demo credentials
- ✅ Signup - Redirects to dashboard automatically
- ✅ Member dashboard - Shows empty state or real data
- ✅ No console errors
---
## 🎯 Quick Deploy Command
```bash
# One command to deploy everything
ssh luckychit@192.168.8.148 "cd /home/luckychit/apps/chitfund && git pull origin prodnew && chmod +x scripts/*.sh && ./scripts/deploy.sh"
```
Then clear browser cache and refresh!
---
## 📝 Post-Deployment
After successful deployment:
1. **Test in incognito** first (no cache)
2. **Verify APIs** work without errors
3. **Test user flows** (signup, login, dashboard)
4. **Check PM2 logs** for any warnings
5. **Monitor for a few minutes** to ensure stability
---
## 🆘 If Something Breaks
```bash
# Run diagnostics
./scripts/diagnose.sh
# Check logs
pm2 logs --lines 50
# Restart everything
pm2 restart all
# Still broken?
# See TROUBLESHOOTING.md
```
---
## 📋 Files Changed (Summary)
### Backend (3 files):
- `backend/src/routes/notifications.js`
- `backend/src/controllers/memberController.js`
- `backend/ecosystem.config.js`
### Frontend (4 files):
- `luckychit/lib/features/auth/views/login_screen.dart`
- `luckychit/lib/features/auth/views/signup_screen.dart`
- `luckychit/lib/interfaces/member/member_dashboard.dart`
- `luckychit/web/index.html`
### Scripts (5 new files):
- `scripts/deploy.sh`
- `scripts/diagnose.sh`
- `scripts/backup-db.sh`
- `scripts/restore-db.sh`
- `scripts/fix-502.sh`
### Documentation (4 core files):
- `README.md`
- `QUICK_START.md`
- `DEPLOYMENT.md`
- `TROUBLESHOOTING.md`
---
## 🎉 Benefits
After this deployment:
- ✅ All API 500 errors fixed
- ✅ Clean, professional UI (no demo data)
- ✅ Better user experience (signup navigation)
- ✅ Real data displayed (no hardcoded content)
- ✅ Cache issues resolved
- ✅ Clean project structure
- ✅ Easy deployment process
---
**Ready to deploy! Follow the steps above.** 🚀
**Estimated time**: 10-15 minutes (including bcrypt rebuild if needed)


@@ -0,0 +1,114 @@
# Documentation Index
Quick reference to find what you need.
---
## 📚 Main Documentation
### Getting Started
- **[README.md](./README.md)** - Project overview and quick start
- **[CHANGELOG.md](./CHANGELOG.md)** - Version history and updates
### For Managers/Admins
- **[ADMIN_GUIDE.md](./ADMIN_GUIDE.md)** - Complete admin features guide
- Edit/delete draws
- Edit/delete groups
- Edit members
- Member numbers explained
- Visual examples
### Technical Documentation
- **[backend/API_DOCUMENTATION.md](./backend/API_DOCUMENTATION.md)** - All API endpoints
- **[DEPLOYMENT.md](./DEPLOYMENT.md)** - Production deployment guide
- **[backend/TROUBLESHOOTING.md](./backend/TROUBLESHOOTING.md)** - Fix common issues
### Feature-Specific Guides
- **[HOW_TO_ADD_PAST_DRAWS.md](./HOW_TO_ADD_PAST_DRAWS.md)** - Import historical draws
- **[backend/WHATSAPP_USAGE_EXAMPLES.md](./backend/WHATSAPP_USAGE_EXAMPLES.md)** - WhatsApp features
---
## 🗂️ Documentation Structure
```
Project Root
├── README.md # Main project README
├── ADMIN_GUIDE.md # All admin features
├── DEPLOYMENT.md # Deployment instructions
├── CHANGELOG.md # Version history
├── DOCUMENTATION_INDEX.md # This file
├── TROUBLESHOOTING.md # General troubleshooting
├── HOW_TO_ADD_PAST_DRAWS.md # Feature guide
├── backend/
│ ├── README.md # Backend setup
│ ├── API_DOCUMENTATION.md # API reference
│ ├── TROUBLESHOOTING.md # Backend issues
│ ├── WHATSAPP_USAGE_EXAMPLES.md # WhatsApp guide
│ ├── run-member-number-migration.js # Migration script
│ └── migrations/
│ └── 20251106_add_member_number.sql
└── luckychit/
└── README.md # Flutter app info
```
---
## 🎯 Quick Links by Task
### I want to...
**Setup the project**
→ [README.md](./README.md) → Quick Start section
**Deploy to production**
→ [DEPLOYMENT.md](./DEPLOYMENT.md)
**Use admin features**
→ [ADMIN_GUIDE.md](./ADMIN_GUIDE.md)
**Fix an error**
→ [backend/TROUBLESHOOTING.md](./backend/TROUBLESHOOTING.md)
**Import old data**
→ [HOW_TO_ADD_PAST_DRAWS.md](./HOW_TO_ADD_PAST_DRAWS.md)
**Use the API**
→ [backend/API_DOCUMENTATION.md](./backend/API_DOCUMENTATION.md)
**See what's new**
→ [CHANGELOG.md](./CHANGELOG.md)
---
## 📖 Documentation Standards
### File Naming
- **UPPERCASE.md** - Important docs (README, CHANGELOG)
- **PascalCase.md** - Feature guides
- **lowercase.md** - Technical/internal docs
### Location
- **Root** - Project-wide documentation
- **backend/** - Backend-specific docs
- **luckychit/** - Flutter app docs
- **docs/** - Archive/additional docs
---
## ✅ Documentation Status
- [x] Main README created
- [x] Admin guide consolidated
- [x] Deployment guide complete
- [x] API documentation updated
- [x] Troubleshooting guides merged
- [x] Redundant files removed
- [x] Clean structure established
---
**All documentation is now organized and easy to find!** 📚✨


@@ -0,0 +1,299 @@
# 📝 How to Add Past Draws and Winners
Step-by-step guide to backfill historical draw data for imported groups.
---
## 🎯 Use Case
You imported a group that started 6 months ago. Now you need to record:
- **Month 1 Draw**: John Doe won ₹99,000
- **Month 2 Draw**: Jane Smith won ₹98,500
- **Month 3 Draw**: Bob Johnson won ₹98,000
- ... and so on
---
## 🚀 Quick Steps
### Method 1: Using the UI (Coming Soon)
**Step 1**: Go to **Group Details Page**
**Step 2**: Click **"Actions" menu** (top right)
**Step 3**: Select **"Add Past Draw Result"**
**Step 4**: Fill in the form:
- Month number (e.g., 1, 2, 3)
- Select winner from member list
- Enter prize amount
**Step 5**: Click **"Add Draw"**
**Repeat** for each past month!
---
### Method 2: Using the API Directly (For Now)
Since the UI buttons aren't wired yet, you can use the API directly:
#### 1. Get Your Auth Token
Login and copy your token from browser DevTools:
```
F12 → Application → Local Storage → auth_token
```
#### 2. Add Past Draw via API
```bash
curl -X POST https://chitfund.deepteklabs.com/api/monthly-draws \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
-d '{
"group_id": "YOUR_GROUP_ID",
"month": 6,
"year": 2024,
"winner_id": "MEMBER_USER_ID",
"prize_amount": 98000,
"is_past_draw": true,
"client_seed": "PAST_DRAW_JUNE_2024"
}'
```
#### 3. Repeat for Each Month
Change the parameters:
- `month`: 6, 7, 8, etc.
- `year`: 2024
- `winner_id`: Different member for each draw
- `prize_amount`: Amount they won
---
## 🔍 How to Get IDs
### Get Group ID:
1. Go to Group Details page
2. Check URL or browser console
3. Or list all groups via API:
```bash
curl https://chitfund.deepteklabs.com/api/chit-groups/manager \
-H "Authorization: Bearer YOUR_TOKEN"
```
### Get Member IDs:
1. View group members in UI
2. Or list members via API:
```bash
curl https://chitfund.deepteklabs.com/api/members/YOUR_GROUP_ID/members \
-H "Authorization: Bearer YOUR_TOKEN"
```
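Once you have the members response saved, the IDs can be filtered out locally. The JSON below is an illustrative shape only; the real response fields may differ, so check an actual response from your server before relying on them:

```shell
# Illustrative saved response -- the field names here are assumptions.
cat > /tmp/members.json <<'EOF'
{"success":true,"data":{"members":[{"userId":"uuid-1","full_name":"John"},{"userId":"uuid-2","full_name":"Jane"}]}}
EOF

# Pull out just the user IDs with Python's stdlib json module
python3 -c '
import json
with open("/tmp/members.json") as f:
    data = json.load(f)
for m in data["data"]["members"]:
    print(m["userId"])
'
```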
---
## 📋 Example: Complete Backfill
### Scenario:
- Group started: June 2024
- Current: December 2024 (Month 7)
- Need to add: 6 past draws
### API Calls:
```bash
# Set your token
TOKEN="your_auth_token_here"
GROUP_ID="your_group_id_here"
# Month 1 - June 2024
curl -X POST https://chitfund.deepteklabs.com/api/monthly-draws \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOKEN" \
-d '{
"group_id": "'$GROUP_ID'",
"month": 6,
"year": 2024,
"winner_id": "john_doe_user_id",
"prize_amount": 99000,
"is_past_draw": true
}'
# Month 2 - July 2024
curl -X POST https://chitfund.deepteklabs.com/api/monthly-draws \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOKEN" \
-d '{
"group_id": "'$GROUP_ID'",
"month": 7,
"year": 2024,
"winner_id": "jane_smith_user_id",
"prize_amount": 98500,
"is_past_draw": true
}'
# Month 3 - August 2024
curl -X POST https://chitfund.deepteklabs.com/api/monthly-draws \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOKEN" \
-d '{
"group_id": "'$GROUP_ID'",
"month": 8,
"year": 2024,
"winner_id": "bob_johnson_user_id",
"prize_amount": 98000,
"is_past_draw": true
}'
# Continue for months 4, 5, 6...
```
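The repeated curl calls above can be collapsed into a small loop. This is a dry-run sketch that only prints the commands (token, group ID, winner IDs, and amounts are placeholders); remove the `echo` wrapper to send the requests for real:

```shell
# Dry run: print one curl command per past month instead of sending it.
TOKEN="your_auth_token_here"
GROUP_ID="your_group_id_here"
YEAR=2024
MONTHS=(6 7 8)
WINNERS=(john_doe_user_id jane_smith_user_id bob_johnson_user_id)
AMOUNTS=(99000 98500 98000)

for i in "${!MONTHS[@]}"; do
  BODY="{\"group_id\":\"$GROUP_ID\",\"month\":${MONTHS[$i]},\"year\":$YEAR,\"winner_id\":\"${WINNERS[$i]}\",\"prize_amount\":${AMOUNTS[$i]},\"is_past_draw\":true}"
  echo "curl -X POST https://chitfund.deepteklabs.com/api/monthly-draws -H 'Content-Type: application/json' -H 'Authorization: Bearer $TOKEN' -d '$BODY'"
done
```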
---
## 🛠️ Add UI Buttons (For You to Implement)
The dialogs are already created, but the buttons that open them still need to be wired up. Here's where:
### Location: Group Details Page Actions Menu
Add these options to the PopupMenuButton in `group_details_page.dart`:
```dart
PopupMenuItem(
value: 'add_past_draw',
child: Row(
children: [
Icon(Icons.history, color: Colors.blue.shade600),
SizedBox(width: 12.w),
const Text('Add Past Draw Result'),
],
),
),
PopupMenuItem(
value: 'add_past_payments',
child: Row(
children: [
Icon(Icons.payment_outlined, color: Colors.green.shade600),
SizedBox(width: 12.w),
const Text('Add Past Payments'),
],
),
),
```
Then handle the selection:
```dart
onSelected: (value) {
if (value == 'add_past_draw') {
_showAddPastDrawDialog();
} else if (value == 'add_past_payments') {
_showAddPastPaymentsDialog();
}
}
void _showAddPastDrawDialog() {
// Need to determine which month to add
final currentMonth = _determineCurrentMonth();
showDialog(
context: context,
builder: (context) => AddPastDrawDialog(
group: widget.group,
monthNumber: currentMonth,
),
).then((result) {
if (result == true) {
_chitGroupService.loadGroupMonthlyDraws(widget.group.id);
}
});
}
void _showAddPastPaymentsDialog() {
final currentMonth = _determineCurrentMonth();
showDialog(
context: context,
builder: (context) => AddPastPaymentsDialog(
group: widget.group,
monthNumber: currentMonth,
),
).then((result) {
if (result == true) {
_paymentService.loadGroupPayments(widget.group.id);
}
});
}
```
---
## 📊 Recommended Workflow
### Option A: Use API (Current - Works Now)
1. Get group ID and member IDs
2. Use curl commands above
3. Add all past draws via API
4. Verify in UI
### Option B: Add UI Buttons (Better UX)
1. Add buttons to group details page
2. Import dialogs I created
3. Click through UI to add draws
4. More user-friendly
---
## 🎓 Quick Reference
### API Endpoint:
```
POST /api/monthly-draws
```
### Required Fields:
- `group_id` - The imported group's ID
- `month` - Month number (1-12)
- `year` - Year (e.g., 2024)
- `winner_id` - User ID of the winner
- `prize_amount` - Amount won
- `is_past_draw: true` - Flag for past draws
### Optional:
- `client_seed` - Any string (e.g., "PAST_JUNE_2024")
---
## ⚠️ Important Notes
1. **Add members FIRST** before adding draws
2. **Winner must be a member** of the group
3. **Can't duplicate** - only one draw per month/year
4. **Month/year** must match group timeline
5. **Prize amounts** should be realistic
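Notes 3 and 4 above are easy to violate when backfilling many months, so a tiny pre-flight check before each POST can catch bad values early. This helper is hypothetical, not part of the app's codebase:

```shell
# Hypothetical pre-flight check before posting a past draw.
validate_draw() {
  local month=$1 year=$2
  if [ "$month" -lt 1 ] || [ "$month" -gt 12 ]; then
    echo "invalid month: $month" >&2; return 1
  fi
  if [ "$year" -lt 2000 ] || [ "$year" -gt 2100 ]; then
    echo "invalid year: $year" >&2; return 1
  fi
  echo "ok: $month/$year"
}

validate_draw 6 2024   # prints "ok: 6/2024"
```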
---
## 🔗 Files You Need
The dialogs are already created:
- ✅ `AddPastDrawDialog` - Ready to use
- ✅ `AddPastPaymentsDialog` - Ready to use
- ⚠️ Just need to add buttons in `group_details_page.dart`
---
## 💡 Quick Solution
**For now**, use the API method (curl commands) to add past draws.
**For better UX**, I can add the buttons to the group details page - just let me know!
---
**See the API examples above to add past draws immediately!** 🚀
Or say "add the UI buttons" and I'll wire them up for you.

README/QUICK_START.md Normal file
@@ -0,0 +1,97 @@
# Quick Start Guide
Get LuckyChit running in 5 minutes.
---
## 🚀 Backend (2 minutes)
```bash
cd backend
# 1. Install dependencies
npm install
# 2. Setup environment
cp env.example .env
nano .env # Add your database credentials
# 3. Create database (if needed)
sudo -u postgres psql
postgres=# CREATE DATABASE luckychit;
postgres=# \q
# 4. Run migration
node run-member-number-migration.js
# 5. Start server
pm2 start ecosystem.config.js
# or
npm start
```
✅ Backend running at `http://localhost:3000`
---
## 📱 Frontend (2 minutes)
```bash
cd luckychit
# 1. Install dependencies
flutter pub get
# 2. Update API URL (if needed)
# Edit: lib/core/services/api_service.dart
# Change baseURL to your backend URL
# 3. Run app
flutter run
# or build APK
flutter build apk --release
```
✅ App running!
---
## ✅ Verify Everything Works
### Test Backend
```bash
# Health check
curl http://localhost:3000/health
# Should return: {"success":true, "message":"..."}
```
### Test App
1. Open app
2. Signup as manager
3. Create a test group
4. Add members (see member numbers #1, #2, #3)
5. Edit member details
6. Everything should work!
---
## 🎯 Default Credentials
**Test Manager**:
- Mobile: `9876543210`
- Password: `password123`
**Test Member**:
- Mobile: `9876543211`
- Password: `password123`
---
## 🆘 Having Issues?
See [TROUBLESHOOTING.md](./TROUBLESHOOTING.md) or [backend/TROUBLESHOOTING.md](./backend/TROUBLESHOOTING.md)
---
**Next**: Read [ADMIN_GUIDE.md](./ADMIN_GUIDE.md) to learn all the admin features!

README/README.md Normal file
@@ -0,0 +1,200 @@
# LuckyChit - Chit Fund Management System
A complete, production-ready chit fund management application with a Flutter frontend and Node.js backend.
## 🎯 Features
### For Managers
- ✅ Create and manage multiple chit groups
- ✅ Add members to groups (with auto-assigned member numbers #1, #2, #3...)
- ✅ Conduct monthly draws with animated lottery system
- ✅ Track payments and generate receipts
- ✅ **Complete admin controls**: Edit/delete draws, groups, and member details
- ✅ WhatsApp integration for notifications
- ✅ Financial reports and analytics
- ✅ Import existing groups with historical data
### For Members
- ✅ View all groups they belong to
- ✅ Track payment history
- ✅ View draw results
- ✅ Verify draws using provably fair system
- ✅ Receive payment reminders
### Accessibility
- ✅ **Optimized for elderly users (50+)** with 30% larger fonts
- ✅ High contrast colors for better visibility
- ✅ Large touch targets (WCAG AAA compliant)
- ✅ Simple member numbers instead of complex UUIDs
---
## 📁 Project Structure
```
chitfund/
├── backend/ # Node.js + Express + PostgreSQL
│ ├── src/
│ │ ├── controllers/ # Business logic
│ │ ├── models/ # Database models
│ │ ├── routes/ # API routes
│ │ └── services/ # Background services
│ └── migrations/ # Database migrations
├── luckychit/ # Flutter mobile app
│ ├── lib/
│ │ ├── core/ # Services, models, themes
│ │ ├── features/ # Feature modules
│ │ ├── interfaces/ # UI screens
│ │ └── shared/ # Shared widgets
│ └── assets/ # Images, fonts, icons
└── docs/ # Additional documentation
```
---
## 🚀 Quick Start
### Backend Setup
```bash
cd backend
npm install
cp env.example .env
# Edit .env with your database credentials
npm start
```
See `backend/README.md` for detailed setup instructions.
### Frontend Setup
```bash
cd luckychit
flutter pub get
flutter run
```
---
## 📚 Documentation
### Essential Guides
- **[Admin Features](./ADMIN_GUIDE.md)** - Complete guide for all admin features
- **[API Documentation](./backend/API_DOCUMENTATION.md)** - All API endpoints
- **[Deployment Guide](./DEPLOYMENT.md)** - Production deployment steps
### Additional Docs
- **[How to Add Past Draws](./HOW_TO_ADD_PAST_DRAWS.md)** - Import historical data
- **[WhatsApp Integration](./backend/WHATSAPP_USAGE_EXAMPLES.md)** - WhatsApp features
- **[Troubleshooting](./TROUBLESHOOTING.md)** - Common issues and fixes
---
## 🔑 Key Features
### Member Numbers
Each member gets a readable serial number (#1, #2, #3...) instead of just UUID:
- Easy to reference: "Call Member #5"
- Large display for elderly users
- Unique per group
- Auto-assigned when member joins
### Admin Controls
Managers can:
- ✅ Edit/delete monthly draws (fix mistakes)
- ✅ Edit member details (name, phone, email)
- ✅ Edit group details (before starting)
- ✅ Delete empty groups
- ✅ Remove members from groups (without deleting user account)
### Accessibility
Designed for elderly users with:
- 30% larger fonts throughout
- High contrast colors (pure black on white)
- Large buttons (67px height)
- Bold text for clarity
- Simple member numbers
### Provably Fair Draws
- Cryptographic verification
- Multiple animation options (wheel, roulette, cards)
- Transparent and auditable
- Screen recording capability
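For context, a common provably-fair draw pattern hashes a server seed together with the user-supplied client seed and maps the result onto the member list. This is an illustrative sketch only; LuckyChit's actual algorithm may differ:

```shell
# Illustrative only: one common provably-fair scheme, not necessarily
# the exact algorithm this backend uses.
SERVER_SEED="secret-server-seed"      # kept hidden until after the draw
CLIENT_SEED="PAST_DRAW_JUNE_2024"     # supplied by the user
MEMBER_COUNT=20

# Hash both seeds, take the first 8 hex chars, map onto the member list
HASH=$(printf '%s:%s' "$SERVER_SEED" "$CLIENT_SEED" | sha256sum | cut -c1-8)
INDEX=$(( 0x$HASH % MEMBER_COUNT ))
echo "winning member index: $INDEX"
```

Because both seeds are published after the draw, anyone can rerun the hash and confirm the same index.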
---
## 🛠️ Tech Stack
### Backend
- Node.js + Express
- PostgreSQL (Sequelize ORM)
- JWT authentication
- WhatsApp integration
- PM2 for process management
### Frontend
- Flutter (cross-platform)
- GetX (state management)
- ScreenUtil (responsive design)
- Dio (HTTP client)
---
## 📱 Supported Platforms
- ✅ Android
- ✅ iOS
- ✅ Web
- ⏳ Windows/Mac/Linux (supported by Flutter, but not yet tested)
---
## 🔐 Security Features
- JWT token authentication
- Password hashing (bcrypt)
- Manager-only operations
- Rate limiting
- Input validation
- SQL injection protection
---
## 📊 Database Schema
### Main Tables
- `users` - User accounts (managers and members)
- `chit_groups` - Chit fund groups
- `group_members` - Member relationships with **member_number**
- `monthly_draws` - Draw results
- `payments` - Payment records
- `notifications` - Notification log
---
## 🎨 Screenshots
*Add screenshots of your app here*
---
## 📝 License
*Add your license here*
---
## 👥 Credits
Developed to make chit fund management easy, with a design especially suited to elderly users.
---
## 📞 Support
For issues or questions, see `TROUBLESHOOTING.md` or contact support.
---
**Version**: 2.0 (November 2025)
**Status**: Production Ready ✅

README/START_HERE.md Normal file
@@ -0,0 +1,80 @@
# 👋 Start Here!
Welcome to LuckyChit - Complete Chit Fund Management System
---
## 📚 Documentation Guide
### 🚀 New to LuckyChit?
**Start with**: [README.md](./README.md) - Project overview
**Then read**: [QUICK_START.md](./QUICK_START.md) - Get it running in 5 minutes
### 👨‍💼 Manager/Admin?
**Read**: [ADMIN_GUIDE.md](./ADMIN_GUIDE.md) - Learn all admin features
- Edit draws, groups, members
- Delete operations
- Member number system
- Complete workflows
### 🚀 Deploying to Production?
**Read**: [DEPLOYMENT.md](./DEPLOYMENT.md) - Step-by-step deployment
- Backend setup
- Database migration
- Frontend build
- PM2 configuration
### 🔧 Having Problems?
**Read**: [TROUBLESHOOTING.md](./TROUBLESHOOTING.md) - Fix common issues
- Module errors
- Database problems
- Migration issues
- Quick solutions
### 💻 Developer?
**Read**: [backend/API_DOCUMENTATION.md](./backend/API_DOCUMENTATION.md) - Complete API reference
---
## 🎯 Quick Navigation
**I want to...**
| Task | Go To |
|------|-------|
| Setup project | [QUICK_START.md](./QUICK_START.md) |
| Learn admin features | [ADMIN_GUIDE.md](./ADMIN_GUIDE.md) |
| Deploy to server | [DEPLOYMENT.md](./DEPLOYMENT.md) |
| Fix an error | [TROUBLESHOOTING.md](./TROUBLESHOOTING.md) |
| Use the API | [backend/API_DOCUMENTATION.md](./backend/API_DOCUMENTATION.md) |
| See what's new | [CHANGELOG.md](./CHANGELOG.md) |
| Find any doc | [DOCUMENTATION_INDEX.md](./DOCUMENTATION_INDEX.md) |
---
## ✨ Key Features Highlight
- ✅ **Member Numbers** (#1, #2, #3...) - Easy to reference!
- ✅ **Large Fonts** - 30% bigger for elderly users
- ✅ **Edit Everything** - Fix mistakes easily
- ✅ **Complete Control** - Full admin features
- ✅ **WhatsApp Ready** - Notifications & sharing
- ✅ **Provably Fair** - Transparent draws
---
## 🎯 Most Important Files
1. **[README.md](./README.md)** - Start here
2. **[ADMIN_GUIDE.md](./ADMIN_GUIDE.md)** - Essential for managers
3. **[DEPLOYMENT.md](./DEPLOYMENT.md)** - For going live
4. **[TROUBLESHOOTING.md](./TROUBLESHOOTING.md)** - When things break
---
**Happy coding! 🎉**
---
*All documentation is now organized and easy to navigate. Start with README.md and follow the links!*

README/TROUBLESHOOTING.md Normal file
@@ -0,0 +1,187 @@
# Troubleshooting Guide
Solutions for common issues in both backend and frontend.
---
## 🔧 Backend Issues
See [backend/TROUBLESHOOTING.md](./backend/TROUBLESHOOTING.md) for detailed backend troubleshooting.
### Quick Fixes
**Module Not Found**:
```bash
cd backend
npm install
pm2 restart chitfund-backend
```
**Can't Delete Group**:
- Remove all active members first
- Removed members (status='removed') are OK
- Backend already fixed to count only active members
**Can't Remove Member (404)**:
- Make sure you're using `member.userId`, not `member.id`
- Already fixed in latest code
---
## 📱 Frontend Issues
### Member Numbers Not Showing
**Cause**: Database migration not applied
**Fix**:
```bash
cd backend
node run-member-number-migration.js
pm2 restart chitfund-backend
```
Then restart Flutter app.
---
### API Connection Error
**Cause**: Wrong API URL
**Fix**: Check `luckychit/lib/core/services/api_service.dart`
```dart
static const String baseURL = 'https://chitfund.deepteklabs.com/api';
```
---
### Build Errors
**Module errors**:
```bash
cd luckychit
flutter clean
flutter pub get
flutter run
```
**Gradle errors** (Android):
```bash
cd luckychit/android
./gradlew clean
cd ..
flutter build apk
```
---
## 🎨 UI Issues
### Text Too Small
Already fixed! Theme now uses:
- 30% larger fonts (fontSizeMultiplier = 1.3)
- High contrast colors
- Large buttons
If still small, check device font scaling in system settings.
---
### Member Delete Not Working
Make sure you're passing `member.userId`:
```dart
// ✅ Correct
_service.removeMemberFromGroup(groupId, member.userId);
// ❌ Wrong
_service.removeMemberFromGroup(groupId, member.id);
```
Already fixed in latest code!
---
## 🔄 After Code Updates
Always do these after pulling new code:
### Backend
```bash
npm install # Get new dependencies
node run-member-number-migration.js # Apply migrations (if not done)
pm2 restart chitfund-backend # Restart
pm2 logs chitfund-backend # Check logs
```
### Frontend
```bash
flutter pub get # Get new dependencies
flutter clean # Clean build cache
flutter run # Run app
```
---
## 📊 Verification
### Check Backend is Healthy
```bash
curl http://localhost:3000/health
# Should return: {"success":true}
curl http://localhost:3000/api/auth/profile -H "Authorization: Bearer YOUR_TOKEN"
# Should return user data
```
### Check Migration Applied
```bash
sudo -u postgres psql -d luckychit -c "SELECT member_number FROM group_members LIMIT 1;"
# Should show a number (1, 2, 3...)
```
### Check Frontend Connects
```bash
# In Flutter app, try to login
# Should connect to backend successfully
```
---
## 🆘 Still Having Issues?
### Backend Issues
→ See [backend/TROUBLESHOOTING.md](./backend/TROUBLESHOOTING.md)
### Deployment Issues
→ See [DEPLOYMENT.md](./DEPLOYMENT.md)
### API Issues
→ See [backend/API_DOCUMENTATION.md](./backend/API_DOCUMENTATION.md)
### Admin Features
→ See [ADMIN_GUIDE.md](./ADMIN_GUIDE.md)
---
## 💡 Common Solutions
**99% of issues are fixed by**:
```bash
# Backend
cd backend
npm install
pm2 restart chitfund-backend
# Frontend
cd luckychit
flutter clean
flutter pub get
flutter run
```
---
**If all else fails, see [DEPLOYMENT.md](./DEPLOYMENT.md) for complete reset procedure.**


@@ -1,340 +0,0 @@
# 🔧 LuckyChit Troubleshooting Guide
Quick fixes for common issues.
---
## 🚨 Quick Diagnostics
```bash
cd /home/luckychit/apps/chitfund
./scripts/diagnose.sh
```
This shows exactly what's wrong!
---
## Common Issues
### 🔴 502 Bad Gateway Error
**Symptom**: Site shows "502 Bad Gateway"
**Cause**: PM2 processes are down or nginx can't reach backend
**Fix**:
```bash
# Check PM2
pm2 status
# If processes are down
pm2 restart all
# If processes don't exist
cd /home/luckychit/apps/chitfund
./scripts/fix-502.sh
# Test
curl http://localhost:3000/health
curl http://localhost:8080
```
---
### 🔴 Changes Not Showing (Cache Issue)
**Symptom**: Deployed code but seeing old version
**Cause**: Multiple cache layers (browser, service worker, nginx, Cloudflare)
**Fix**:
```bash
# 1. Clear browser service worker
# F12 → Application → Service Workers → Unregister
# Then: Ctrl + Shift + R (hard refresh)
# 2. Or test in incognito
# Ctrl + Shift + N (no cache at all)
# 3. Force rebuild on server
cd /home/luckychit/apps/chitfund
./scripts/deploy.sh --force
# 4. Clear nginx cache (if using separate proxy)
ssh root@<nginx-ip>
sudo rm -rf /var/cache/nginx/*
sudo systemctl reload nginx
# 5. Purge Cloudflare cache
# Dashboard → Caching → Purge Everything
```
---
### 🔴 PM2 Not Running After Reboot
**Symptom**: Server restarted, apps not running
**Cause**: PM2 auto-start not configured
**Fix**:
```bash
# Restore processes
pm2 resurrect
# Or setup auto-start
pm2 startup systemd -u luckychit --hp /home/luckychit
# Run the command it outputs
pm2 save
```
---
### 🔴 "invalid ELF header" Error
**Symptom**: Backend crashes with bcrypt error
**Cause**: Native modules (e.g. bcrypt) were compiled for a different platform, such as `node_modules` copied from Windows or macOS to Linux
**Fix**:
```bash
cd /home/luckychit/apps/chitfund/backend
rm -rf node_modules package-lock.json
npm install
pm2 restart luckychit-api
```
---
### 🔴 Database Connection Error
**Symptom**: Backend logs show "connection refused"
**Cause**: PostgreSQL not running or wrong credentials
**Fix**:
```bash
# Check PostgreSQL
sudo systemctl status postgresql
# If not running
sudo systemctl start postgresql
# Test connection
psql -U luckychit -h localhost -d luckychit
# Check .env credentials
cd /home/luckychit/apps/chitfund/backend
cat .env | grep DB_
```
---
### 🔴 Port Already in Use
**Symptom**: "Port 3000 already in use"
**Cause**: Old process still running
**Fix**:
```bash
# Find process
netstat -tulpn | grep 3000
# Kill process
kill -9 <PID>
# Or restart PM2
pm2 restart luckychit-api
```
---
### 🔴 Out of Memory
**Symptom**: PM2 processes keep crashing
**Cause**: Server out of RAM
**Fix**:
```bash
# Check memory
free -h
# Check PM2 memory usage
pm2 monit
# Restart to free memory
pm2 restart all
# Or reboot server
sudo reboot
```
---
### 🔴 Frontend Won't Build
**Symptom**: `flutter build web` fails
**Cause**: Corrupted cache or missing dependencies
**Fix**:
```bash
cd /home/luckychit/apps/chitfund/luckychit
flutter clean
rm -rf .dart_tool build
flutter pub get
flutter build web --release --pwa-strategy=none
```
---
### 🔴 Service Worker Errors in Browser
**Symptom**: Console shows "Loading from existing service worker" + errors
**Cause**: Old service worker serving stale code
**Fix**:
```bash
# In browser:
# 1. Press F12
# 2. Application tab → Service Workers
# 3. Click "Unregister" for all
# 4. Application tab → Clear storage → Clear site data
# 5. Close DevTools
# 6. Hard refresh: Ctrl + Shift + R
# Already fixed in code - rebuilds now disable service worker
```
---
### 🔴 Nginx Not Forwarding Requests
**Symptom**: 502 only on domain, direct IP works
**Cause**: Nginx config issue or cache
**Fix**:
```bash
# SSH to nginx proxy LXC
ssh root@<nginx-ip>
# Test backend connectivity
curl http://192.168.8.148:3000/health
curl http://192.168.8.148:8080
# Check nginx config
sudo nginx -t
# Clear cache
sudo rm -rf /var/cache/nginx/*
# Reload nginx
sudo systemctl reload nginx
# Check logs
sudo tail -f /var/log/nginx/error.log
```
---
### 🔴 Firewall Blocking Access
**Symptom**: Can't access from outside server
**Cause**: Firewall rules
**Fix**:
```bash
# Check firewall
sudo ufw status
# Allow ports
sudo ufw allow 3000/tcp
sudo ufw allow 8080/tcp
sudo ufw reload
# Test
curl http://192.168.8.148:3000/health
```
---
## 🔍 Diagnostic Commands
```bash
# Check everything
./scripts/diagnose.sh
# PM2 status
pm2 status
pm2 logs --lines 50
# Test connectivity
curl http://localhost:3000/health # Backend
curl http://localhost:8080 # Frontend
# Check disk space
df -h
# Check memory
free -h
# Check ports
netstat -tulpn | grep -E '(3000|8080)'
# Check database
psql -U luckychit -h localhost -d luckychit
```
---
## 🚀 Emergency Recovery
### Complete Reset
```bash
# Stop everything
pm2 kill
# Restart from scratch
cd /home/luckychit/apps/chitfund
# Backend
cd backend
pm2 start src/server.js --name luckychit-api
# Frontend
cd ../luckychit
pm2 serve build/web 8080 --name luckychit-frontend --spa
# Save
pm2 save
# Verify
pm2 status
```
### Nuclear Option (Full Rebuild)
```bash
cd /home/luckychit/apps/chitfund
./scripts/deploy.sh --force
```
---
## 📞 Still Stuck?
1. **Run diagnostics**: `./scripts/diagnose.sh`
2. **Check PM2 logs**: `pm2 logs --lines 100`
3. **Try in incognito**: Rules out browser cache
4. **Restart everything**: `pm2 restart all`
5. **Check all docs**: [DEPLOYMENT.md](DEPLOYMENT.md)
---
## 🎯 Prevention Tips
1. **Always test locally** before deploying
2. **Use deployment script** (handles caching automatically)
3. **Check logs after deploy**: `pm2 logs --lines 20`
4. **Test in incognito** first (no cache)
5. **Backup database** before major changes
6. **Monitor PM2**: `pm2 monit`
---
**Most issues are cache-related. When in doubt, clear all caches and rebuild!**


@@ -5,10 +5,10 @@ PORT=3000
# Database Configuration
DB_HOST=localhost
DB_PORT=5432
DB_NAME=luckychit
DB_USER=luckychit
DB_NAME=postgres
DB_USER=postgres
DB_PASSWORD=postgres
DATABASE_URL=postgresql://luckychit:postgres@localhost:5432/luckychit
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/luckychit
# JWT Configuration
JWT_SECRET=2559a382d606e4209085401d693ae25f


@@ -189,6 +189,8 @@ GET /chit-groups/{groupId}/stats
```
**Headers:** `Authorization: Bearer <token>`
**Note:** Update and Delete operations only work for groups in 'forming' status. Once a group starts, it cannot be edited or deleted.
---
## Member Management
@@ -205,12 +207,31 @@ POST /members/{groupId}/members
}
```
### Update Member Details (Manager Only)
```
PUT /auth/member/{memberId}
```
**Headers:** `Authorization: Bearer <token>`
**Request Body:**
```json
{
"full_name": "Updated Name",
"mobile_number": "9876543299",
"email": "newemail@example.com",
"address": "New Address",
"emergency_contact": "9876543200"
}
```
**Note:** All fields are optional. Only provide fields you want to update.
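A hedged sketch of this call follows; the token and member ID are placeholders, and the request is printed locally rather than sent:

```shell
# Placeholders -- substitute a real token and member UUID.
TOKEN="your_auth_token"
MEMBER_ID="member-uuid"
BODY='{"full_name":"Updated Name","mobile_number":"9876543299"}'

# Sanity-check the JSON payload locally before sending
echo "$BODY" | python3 -m json.tool >/dev/null && echo "payload ok"

# The request itself (printed here rather than executed):
cat <<EOF
curl -X PUT https://chitfund.deepteklabs.com/api/auth/member/$MEMBER_ID \\
  -H "Content-Type: application/json" \\
  -H "Authorization: Bearer $TOKEN" \\
  -d '$BODY'
EOF
```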
### Remove Member from Group (Manager Only)
```
DELETE /members/{groupId}/members/{memberId}
```
**Headers:** `Authorization: Bearer <token>`
**Note:** Each member has a unique UUID identifier that never changes, even if name or phone number is updated.
### Get Group Members
```
GET /members/{groupId}/members?status=active&page=1&limit=20
@@ -303,6 +324,80 @@ GET /payments/group/{groupId}/pending
---
## Monthly Draws Management
### Create Monthly Draw (Manager Only)
```
POST /monthly-draws
```
**Headers:** `Authorization: Bearer <token>`
**Request Body:**
```json
{
"group_id": "group-uuid",
"month": 3,
"year": 2025,
"client_seed": "user_provided_seed", // Optional
"winner_id": "member-uuid", // Required for past draws
"prize_amount": 195000, // Optional
"is_past_draw": true // Optional, for historical data
}
```
### Get Group Monthly Draws
```
GET /monthly-draws/group/{groupId}?status=completed&page=1&limit=20
```
**Headers:** `Authorization: Bearer <token>`
### Get Monthly Draw Details
```
GET /monthly-draws/{drawId}
```
**Headers:** `Authorization: Bearer <token>`
### Update Monthly Draw (Manager Only - NEW)
```
PUT /monthly-draws/{drawId}
```
**Headers:** `Authorization: Bearer <token>`
**Request Body:**
```json
{
"winner_id": "new-winner-uuid", // Optional
"prize_amount": 195000, // Optional
"notes": "Corrected winner" // Optional
}
```
**Use Case:** Fix mistakes in draw entry (wrong winner, wrong amount)
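A sketch of the update request (placeholders throughout; the command is built and printed, not sent):

```shell
# Placeholders -- this only builds and prints the request.
TOKEN="your_auth_token"
DRAW_ID="draw-uuid"
BODY='{"winner_id":"new-winner-uuid","prize_amount":195000,"notes":"Corrected winner"}'

REQUEST="curl -X PUT https://chitfund.deepteklabs.com/api/monthly-draws/$DRAW_ID \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer $TOKEN' \
  -d '$BODY'"
echo "$REQUEST"
```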
### Delete Monthly Draw (Manager Only - NEW)
```
DELETE /monthly-draws/{drawId}
```
**Headers:** `Authorization: Bearer <token>`
**Use Case:** Remove draw that was added by mistake
### Verify Draw Result
```
POST /monthly-draws/{drawId}/verify
```
**Headers:** `Authorization: Bearer <token>`
**Request Body:**
```json
{
"client_seed": "optional-client-seed"
}
```
### Get Draw Statistics
```
GET /monthly-draws/group/{groupId}/statistics
```
**Headers:** `Authorization: Bearer <token>`
---
## WhatsApp Share Endpoints
### Share Payment Receipt


@@ -1,335 +1,175 @@
# LuckyChit Backend API
# LuckyChit Backend
Backend API for LuckyChit - Digital Chit Fund Management Platform
Node.js + Express + PostgreSQL backend for chit fund management.
## Features
---
- **Authentication & Authorization**: JWT-based authentication with role-based access control
- **User Management**: Create and manage users (managers and members)
- **Database**: PostgreSQL with Sequelize ORM
- **Security**: Helmet, CORS, Rate limiting, Input validation
- **Logging**: Morgan for HTTP logging
## Tech Stack
- **Runtime**: Node.js
- **Framework**: Express.js
- **Database**: PostgreSQL
- **ORM**: Sequelize
- **Authentication**: JWT (jsonwebtoken)
- **Security**: bcrypt, helmet, cors
- **Validation**: Built-in Sequelize validation
## Prerequisites
- Node.js (v16 or higher)
- PostgreSQL (v12 or higher)
- npm or yarn
## Installation
1. **Clone the repository**
```bash
git clone <repository-url>
cd backend
```
2. **Install dependencies**
```bash
npm install
```
3. **Environment Setup**
```bash
cp env.example .env
```
Edit `.env` file with your configuration:
```env
NODE_ENV=development
PORT=3000
# Database Configuration
DB_HOST=localhost
DB_PORT=5432
DB_NAME=luckychit
DB_USER=postgres
DB_PASSWORD=your_password
# JWT Configuration
JWT_SECRET=your-super-secret-jwt-key
JWT_EXPIRES_IN=24h
# CORS Configuration
ALLOWED_ORIGINS=http://localhost:8080
```
4. **Database Setup**
```sql
-- Create database
CREATE DATABASE luckychit;
-- Create user (if needed)
CREATE USER postgres WITH PASSWORD 'your_password';
GRANT ALL PRIVILEGES ON DATABASE luckychit TO postgres;
```
5. **Run the application**
```bash
# Development mode
npm run dev
# Production mode
npm start
```
## API Endpoints
### Authentication
#### POST `/api/auth/login`
Login with mobile number and password.
**Request Body:**
```json
{
"mobile_number": "9999999999",
"password": "password123"
}
```
**Response:**
```json
{
"success": true,
"message": "Login successful",
"data": {
"token": "jwt_token_here",
"user": {
"id": "uuid",
"mobile_number": "9999999999",
"full_name": "Test User",
"role": "manager",
"is_active": true
}
}
}
```
#### POST `/api/auth/create-member` (Manager only)
Create a new member account.
**Headers:**
```
Authorization: Bearer <jwt_token>
```
**Request Body:**
```json
{
"mobile_number": "8888888888",
"full_name": "John Doe"
}
```
**Response:**
```json
{
"success": true,
"message": "Member created successfully",
"data": {
"user": {
"id": "uuid",
"mobile_number": "8888888888",
"full_name": "John Doe",
"role": "member"
},
"tempPassword": "abc12345"
}
}
```
#### GET `/api/auth/profile`
Get current user profile.
**Headers:**
```
Authorization: Bearer <jwt_token>
```
#### PUT `/api/auth/profile`
Update user profile.
**Headers:**
```
Authorization: Bearer <jwt_token>
```
**Request Body:**
```json
{
"full_name": "Updated Name"
}
```
#### PUT `/api/auth/change-password`
Change user password.
**Headers:**
```
Authorization: Bearer <jwt_token>
```
**Request Body:**
```json
{
"current_password": "old_password",
"new_password": "new_password"
}
```
### Health Check
#### GET `/health`
Check API health status.
**Response:**
```json
{
"success": true,
"message": "LuckyChit API is running",
"timestamp": "2024-01-01T00:00:00.000Z",
"environment": "development"
}
```
## Database Models
### User
- `id` (UUID, Primary Key)
- `mobile_number` (String, Unique)
- `full_name` (String)
- `password_hash` (String)
- `role` (Enum: 'manager', 'member')
- `created_by` (UUID, Foreign Key to User)
- `is_active` (Boolean)
- `last_login` (Date)
### ChitGroup
- `id` (UUID, Primary Key)
- `name` (String)
- `total_value` (Decimal)
- `monthly_installment` (Decimal)
- `duration_months` (Integer)
- `max_members` (Integer)
- `foreman_commission_percentage` (Decimal)
- `draw_date` (Integer)
- `status` (Enum: 'forming', 'active', 'completed')
- `manager_id` (UUID, Foreign Key to User)
### GroupMember
- `id` (UUID, Primary Key)
- `group_id` (UUID, Foreign Key to ChitGroup)
- `user_id` (UUID, Foreign Key to User)
- `joined_date` (Date)
- `status` (Enum: 'active', 'inactive', 'removed')
- `total_paid` (Decimal)
- `total_won` (Decimal)
### Payment
- `id` (UUID, Primary Key)
- `group_id` (UUID, Foreign Key to ChitGroup)
- `user_id` (UUID, Foreign Key to User)
- `month` (Integer)
- `year` (Integer)
- `amount` (Decimal)
- `payment_method` (Enum: 'upi', 'bank_transfer', 'cash', 'cheque')
- `transaction_id` (String)
- `status` (Enum: 'pending', 'success', 'failed', 'cancelled')
- `paid_at` (Date)
### MonthlyDraw
- `id` (UUID, Primary Key)
- `group_id` (UUID, Foreign Key to ChitGroup)
- `month` (Integer)
- `year` (Integer)
- `draw_date` (Date)
- `eligible_members` (JSONB)
- `winner_id` (UUID, Foreign Key to User)
- `prize_amount` (Decimal)
- `server_seed` (String)
- `server_seed_hash` (String)
- `client_seed` (String)
- `nonce` (BigInt)
- `result_hash` (String)
- `status` (Enum: 'pending', 'completed', 'cancelled')
## Development
### Scripts
```bash
# Start development server with nodemon
npm run dev
# Start production server
npm start
# Run tests
npm test
```
## 🚀 Quick Setup
```bash
# Install dependencies
npm install
# Configure environment
cp env.example .env
nano .env
# Setup database
sudo -u postgres psql
CREATE DATABASE luckychit;
\q
# Run migration
node run-member-number-migration.js
# Start server
npm start
# or with PM2
pm2 start ecosystem.config.js
```
### Project Structure
```
src/
├── config/
│   └── database.js       # Database configuration
├── controllers/
│   └── authController.js # Authentication controller
├── middleware/
│   └── auth.js           # Authentication middleware
├── models/
│   ├── User.js           # User model
│   ├── ChitGroup.js      # ChitGroup model
│   ├── GroupMember.js    # GroupMember model
│   ├── Payment.js        # Payment model
│   ├── MonthlyDraw.js    # MonthlyDraw model
│   └── index.js          # Model associations
├── routes/
│   └── auth.js           # Authentication routes
└── server.js             # Main server file
```
---
## 📁 Structure
```
backend/
├── src/
│   ├── controllers/  # Request handlers
│   ├── models/       # Database models
│   ├── routes/       # API routes
│   ├── middleware/   # Auth, validation
│   ├── services/     # Background jobs
│   ├── config/       # Configuration
│   └── utils/        # Helper functions
├── migrations/       # Database migrations
├── *.js              # Utility scripts
└── *.md              # Documentation
```
## Security Features
- **JWT Authentication**: Secure token-based authentication
- **Password Hashing**: bcrypt for password security
- **CORS Protection**: Configurable CORS settings
- **Rate Limiting**: Prevent abuse with request limiting
- **Helmet**: Security headers
- **Input Validation**: Sequelize model validation
- **SQL Injection Protection**: Sequelize ORM protection
---
## 🔌 API Endpoints
See [API_DOCUMENTATION.md](./API_DOCUMENTATION.md) for complete reference.
**Base URL**: `http://localhost:3000/api`
### Quick Reference
- **Auth**: `/api/auth/*`
- **Groups**: `/api/chit-groups/*`
- **Members**: `/api/members/*`
- **Draws**: `/api/monthly-draws/*`
- **Payments**: `/api/payments/*`
## Error Handling
The API uses a centralized error handling system:
- **400**: Bad Request (validation errors)
- **401**: Unauthorized (authentication required)
- **403**: Forbidden (insufficient permissions)
- **404**: Not Found (resource not found)
- **429**: Too Many Requests (rate limit exceeded)
- **500**: Internal Server Error (server errors)
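The status codes listed under Error Handling typically flow through one centralized mapping from error type to HTTP response. A minimal sketch — the error-class names here are illustrative, not necessarily the backend's actual classes:

```javascript
// Map a thrown error onto the API's { success, message } envelope.
// In Express this would live in the final app.use((err, req, res, next) => ...) handler.
const STATUS_BY_NAME = {
  ValidationError: 400,
  UnauthorizedError: 401,
  ForbiddenError: 403,
  NotFoundError: 404,
  RateLimitError: 429,
};

function toErrorResponse(err) {
  const status = STATUS_BY_NAME[err.name] || 500;
  return {
    status,
    body: {
      success: false,
      // Hide internal details for unexpected (500) errors.
      message: status === 500 ? 'Internal Server Error' : err.message,
    },
  };
}
```

Keeping the mapping in one place means every route returns the same envelope shape that the success responses use.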
## Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request
## License
This project is licensed under the ISC License.
---
## 💾 Database
### Models
- **User** - User accounts (with email, address, emergency contact)
- **ChitGroup** - Chit fund groups
- **GroupMember** - Group membership records (with member_number)
- **MonthlyDraw** - Draw results
- **Payment** - Payment records
- **Notification** - Notification log
### Migrations
**Member Number Migration**:
```bash
node run-member-number-migration.js
```
This adds a `member_number` field (1, 2, 3...) to all members.
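A sketch of the numbering logic the migration implies — existing members are backfilled 1..N in join order, and new members get one past the current maximum (illustrative, not the migration's exact SQL):

```javascript
// Next member_number for a group: one past the highest existing number.
function nextMemberNumber(existingNumbers) {
  return existingNumbers.length === 0 ? 1 : Math.max(...existingNumbers) + 1;
}

// Backfill: number a group's current members 1..N by joined_date.
function backfillMemberNumbers(members) {
  return [...members]
    .sort((a, b) => new Date(a.joined_date) - new Date(b.joined_date))
    .map((m, i) => ({ ...m, member_number: i + 1 }));
}
```

Numbering is per group, so two groups can both have a member #1.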
---
## 🧪 Testing
```bash
# Test database connection
node test-db-connection.js
# Test API
node test-api.js
# Create test user
node create-test-user.js
```
---
## 🔧 Common Commands
```bash
# Development
npm start # Start dev server
npm run dev # Watch mode (if configured)
# Production
pm2 start ecosystem.config.js
pm2 restart chitfund-backend
pm2 logs chitfund-backend
pm2 status
# Database
node run-member-number-migration.js
sudo -u postgres psql -d luckychit
```
---
## 📊 Environment Variables
```bash
# Database
DB_HOST=localhost
DB_PORT=5432
DB_NAME=luckychit
DB_USER=luckychit
DB_PASSWORD=your_password
# JWT
JWT_SECRET=your_secret_key
JWT_EXPIRY=7d
# Server
PORT=3000
NODE_ENV=production
```
---
## 🆘 Troubleshooting
See [TROUBLESHOOTING.md](./TROUBLESHOOTING.md) for solutions to common issues:
- Module not found
- PostgreSQL authentication
- Migration errors
- Connection issues
---
## 📚 Documentation
- **[API_DOCUMENTATION.md](./API_DOCUMENTATION.md)** - Complete API reference
- **[TROUBLESHOOTING.md](./TROUBLESHOOTING.md)** - Fix common issues
- **[WHATSAPP_USAGE_EXAMPLES.md](./WHATSAPP_USAGE_EXAMPLES.md)** - WhatsApp integration
---
## 🎯 Latest Updates
**Version 2.0** (November 2025):
- ✅ Member numbers (#1, #2, #3...)
- ✅ Admin edit/delete features
- ✅ Enhanced error messages
- ✅ Bug fixes
See [../CHANGELOG.md](../CHANGELOG.md) for full history.
---
**Status**: Production Ready ✅

backend/TROUBLESHOOTING.md Normal file

@ -0,0 +1,169 @@
# Backend Troubleshooting Guide
Quick solutions for common backend issues.
---
## 🔧 Module Not Found Error
**Error**: `Cannot find module ...`
**Fix**:
```bash
cd backend
npm install
pm2 restart chitfund-backend
```
---
## 🔐 PostgreSQL Authentication Issues
**Error**: `Peer authentication failed`
**Solution 1** (Easiest):
```bash
sudo -u postgres psql
```
**Solution 2**: Change auth method
```bash
sudo nano /etc/postgresql/*/main/pg_hba.conf
# Change: local all all peer
# To: local all all md5
sudo systemctl restart postgresql
```
---
## 💾 Database Migration
### Apply Member Number Migration
**Option A** (Recommended):
```bash
node run-member-number-migration.js
```
**Option B**:
```bash
sudo cp migrations/20251106_add_member_number.sql /tmp/
sudo -u postgres psql -d luckychit -f /tmp/20251106_add_member_number.sql
```
---
## 🚫 Delete Group Issues
**Error**: "Cannot delete, has members"
**Cause**: The check was counting removed members as well as active ones
**Fix**: Already fixed in the current code; just restart the backend:
```bash
pm2 restart chitfund-backend
```
The check now counts only active members.
---
## 🚫 Remove Member 404 Error
**Error**: 404 "Member not found in this group"
**Cause**: Using wrong ID
**Fix**: Use `member.userId` not `member.id`
```dart
// Correct
removeMemberFromGroup(groupId, member.userId);
```
---
## 🔌 Connection Issues
**Can't connect to database**:
```bash
# Check PostgreSQL is running
sudo systemctl status postgresql
# Start if stopped
sudo systemctl start postgresql
# Test connection
node test-db-connection.js
```
---
## 📊 Check Backend Status
```bash
# PM2 status
pm2 status
# View logs
pm2 logs chitfund-backend --lines 50
# Restart
pm2 restart chitfund-backend
# Stop
pm2 stop chitfund-backend
# Delete and restart fresh
pm2 delete chitfund-backend
pm2 start ecosystem.config.js
```
---
## 🔍 Verify Migration Applied
```bash
sudo -u postgres psql -d luckychit -c "SELECT member_number FROM group_members LIMIT 5;"
```
Should show numbers 1, 2, 3, etc.
---
## 🆘 Complete Reset
If everything is broken:
```bash
cd backend
# Stop backend
pm2 delete all
# Clean install
rm -rf node_modules package-lock.json
npm install
# Run migrations
node run-member-number-migration.js
# Start fresh
pm2 start ecosystem.config.js
pm2 save
```
---
## 📞 Still Having Issues?
Check:
1. PostgreSQL is running: `sudo systemctl status postgresql`
2. Database exists: `sudo -u postgres psql -l`
3. .env file is configured correctly
4. Port 3000 is not in use: `lsof -i :3000`
5. Node version is 16+: `node --version`
---
**Most issues are fixed by**: `npm install && pm2 restart chitfund-backend` 🚀


@ -1,16 +0,0 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*)
if command -v cygpath > /dev/null 2>&1; then
basedir=`cygpath -w "$basedir"`
fi
;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../color-support/bin.js" "$@"
else
exec node "$basedir/../color-support/bin.js" "$@"
fi

backend/node_modules/.bin/mime generated vendored

@ -1,16 +0,0 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*)
if command -v cygpath > /dev/null 2>&1; then
basedir=`cygpath -w "$basedir"`
fi
;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../mime/cli.js" "$@"
else
exec node "$basedir/../mime/cli.js" "$@"
fi

backend/node_modules/.bin/mkdirp generated vendored

@ -1,16 +0,0 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*)
if command -v cygpath > /dev/null 2>&1; then
basedir=`cygpath -w "$basedir"`
fi
;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../mkdirp/bin/cmd.js" "$@"
else
exec node "$basedir/../mkdirp/bin/cmd.js" "$@"
fi


@ -1,16 +0,0 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*)
if command -v cygpath > /dev/null 2>&1; then
basedir=`cygpath -w "$basedir"`
fi
;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../@mapbox/node-pre-gyp/bin/node-pre-gyp" "$@"
else
exec node "$basedir/../@mapbox/node-pre-gyp/bin/node-pre-gyp" "$@"
fi

backend/node_modules/.bin/nopt generated vendored

@ -1,16 +0,0 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*)
if command -v cygpath > /dev/null 2>&1; then
basedir=`cygpath -w "$basedir"`
fi
;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../nopt/bin/nopt.js" "$@"
else
exec node "$basedir/../nopt/bin/nopt.js" "$@"
fi

backend/node_modules/.bin/rimraf generated vendored

@ -1,16 +0,0 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*)
if command -v cygpath > /dev/null 2>&1; then
basedir=`cygpath -w "$basedir"`
fi
;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../rimraf/bin.js" "$@"
else
exec node "$basedir/../rimraf/bin.js" "$@"
fi

backend/node_modules/.bin/semver generated vendored

@ -1,16 +0,0 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*)
if command -v cygpath > /dev/null 2>&1; then
basedir=`cygpath -w "$basedir"`
fi
;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../semver/bin/semver.js" "$@"
else
exec node "$basedir/../semver/bin/semver.js" "$@"
fi

backend/node_modules/.bin/uuid generated vendored

@ -1,16 +0,0 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*)
if command -v cygpath > /dev/null 2>&1; then
basedir=`cygpath -w "$basedir"`
fi
;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../uuid/dist/bin/uuid" "$@"
else
exec node "$basedir/../uuid/dist/bin/uuid" "$@"
fi

backend/node_modules/.package-lock.json generated vendored

File diff suppressed because it is too large


@ -1,26 +0,0 @@
MIT License
Original Library
- Copyright (c) Marak Squires
Additional Functionality
- Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
- Copyright (c) DABH (https://github.com/DABH)
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.


@ -1,219 +0,0 @@
# @colors/colors ("colors.js")
[![Build Status](https://github.com/DABH/colors.js/actions/workflows/ci.yml/badge.svg)](https://github.com/DABH/colors.js/actions/workflows/ci.yml)
[![version](https://img.shields.io/npm/v/@colors/colors.svg)](https://www.npmjs.org/package/@colors/colors)
Please check out the [roadmap](ROADMAP.md) for upcoming features and releases. Please open Issues to provide feedback.
## get color and style in your node.js console
![Demo](https://raw.githubusercontent.com/DABH/colors.js/master/screenshots/colors.png)
## Installation
npm install @colors/colors
## colors and styles!
### text colors
- black
- red
- green
- yellow
- blue
- magenta
- cyan
- white
- gray
- grey
### bright text colors
- brightRed
- brightGreen
- brightYellow
- brightBlue
- brightMagenta
- brightCyan
- brightWhite
### background colors
- bgBlack
- bgRed
- bgGreen
- bgYellow
- bgBlue
- bgMagenta
- bgCyan
- bgWhite
- bgGray
- bgGrey
### bright background colors
- bgBrightRed
- bgBrightGreen
- bgBrightYellow
- bgBrightBlue
- bgBrightMagenta
- bgBrightCyan
- bgBrightWhite
### styles
- reset
- bold
- dim
- italic
- underline
- inverse
- hidden
- strikethrough
### extras
- rainbow
- zebra
- america
- trap
- random
## Usage
By popular demand, `@colors/colors` now ships with two types of usages!
The super nifty way
```js
var colors = require('@colors/colors');
console.log('hello'.green); // outputs green text
console.log('i like cake and pies'.underline.red); // outputs red underlined text
console.log('inverse the color'.inverse); // inverses the color
console.log('OMG Rainbows!'.rainbow); // rainbow
console.log('Run the trap'.trap); // Drops the bass
```
or a slightly less nifty way which doesn't extend `String.prototype`
```js
var colors = require('@colors/colors/safe');
console.log(colors.green('hello')); // outputs green text
console.log(colors.red.underline('i like cake and pies')); // outputs red underlined text
console.log(colors.inverse('inverse the color')); // inverses the color
console.log(colors.rainbow('OMG Rainbows!')); // rainbow
console.log(colors.trap('Run the trap')); // Drops the bass
```
I prefer the first way. Some people seem to be afraid of extending `String.prototype` and prefer the second way.
If you are writing good code you will never have an issue with the first approach. If you really don't want to touch `String.prototype`, the second usage will not touch `String` native object.
## Enabling/Disabling Colors
The package will auto-detect whether your terminal can use colors and enable/disable accordingly. When colors are disabled, the color functions do nothing. You can override this with a command-line flag:
```bash
node myapp.js --no-color
node myapp.js --color=false
node myapp.js --color
node myapp.js --color=true
node myapp.js --color=always
FORCE_COLOR=1 node myapp.js
```
Or in code:
```javascript
var colors = require('@colors/colors');
colors.enable();
colors.disable();
```
## Console.log [string substitution](http://nodejs.org/docs/latest/api/console.html#console_console_log_data)
```js
var name = 'Beowulf';
console.log(colors.green('Hello %s'), name);
// outputs -> 'Hello Beowulf'
```
## Custom themes
### Using standard API
```js
var colors = require('@colors/colors');
colors.setTheme({
silly: 'rainbow',
input: 'grey',
verbose: 'cyan',
prompt: 'grey',
info: 'green',
data: 'grey',
help: 'cyan',
warn: 'yellow',
debug: 'blue',
error: 'red'
});
// outputs red text
console.log("this is an error".error);
// outputs yellow text
console.log("this is a warning".warn);
```
### Using string safe API
```js
var colors = require('@colors/colors/safe');
// set single property
var error = colors.red;
error('this is red');
// set theme
colors.setTheme({
silly: 'rainbow',
input: 'grey',
verbose: 'cyan',
prompt: 'grey',
info: 'green',
data: 'grey',
help: 'cyan',
warn: 'yellow',
debug: 'blue',
error: 'red'
});
// outputs red text
console.log(colors.error("this is an error"));
// outputs yellow text
console.log(colors.warn("this is a warning"));
```
### Combining Colors
```javascript
var colors = require('@colors/colors');
colors.setTheme({
custom: ['red', 'underline']
});
console.log('test'.custom);
```
*Protip: There is a secret undocumented style in `colors`. If you find the style you can summon him.*


@ -1,83 +0,0 @@
var colors = require('../lib/index');
console.log('First some yellow text'.yellow);
console.log('Underline that text'.yellow.underline);
console.log('Make it bold and red'.red.bold);
console.log(('Double Raindows All Day Long').rainbow);
console.log('Drop the bass'.trap);
console.log('DROP THE RAINBOW BASS'.trap.rainbow);
// styles not widely supported
console.log('Chains are also cool.'.bold.italic.underline.red);
// styles not widely supported
console.log('So '.green + 'are'.underline + ' ' + 'inverse'.inverse
+ ' styles! '.yellow.bold);
console.log('Zebras are so fun!'.zebra);
//
// Remark: .strikethrough may not work with Mac OS Terminal App
//
console.log('This is ' + 'not'.strikethrough + ' fun.');
console.log('Background color attack!'.black.bgWhite);
console.log('Use random styles on everything!'.random);
console.log('America, Heck Yeah!'.america);
// eslint-disable-next-line max-len
console.log('Blindingly '.brightCyan + 'bright? '.brightRed + 'Why '.brightYellow + 'not?!'.brightGreen);
console.log('Setting themes is useful');
//
// Custom themes
//
console.log('Generic logging theme as JSON'.green.bold.underline);
// Load theme with JSON literal
colors.setTheme({
silly: 'rainbow',
input: 'grey',
verbose: 'cyan',
prompt: 'grey',
info: 'green',
data: 'grey',
help: 'cyan',
warn: 'yellow',
debug: 'blue',
error: 'red',
});
// outputs red text
console.log('this is an error'.error);
// outputs yellow text
console.log('this is a warning'.warn);
// outputs grey text
console.log('this is an input'.input);
console.log('Generic logging theme as file'.green.bold.underline);
// Load a theme from file
try {
colors.setTheme(require(__dirname + '/../themes/generic-logging.js'));
} catch (err) {
console.log(err);
}
// outputs red text
console.log('this is an error'.error);
// outputs yellow text
console.log('this is a warning'.warn);
// outputs grey text
console.log('this is an input'.input);
// console.log("Don't summon".zalgo)


@ -1,80 +0,0 @@
var colors = require('../safe');
console.log(colors.yellow('First some yellow text'));
console.log(colors.yellow.underline('Underline that text'));
console.log(colors.red.bold('Make it bold and red'));
console.log(colors.rainbow('Double Raindows All Day Long'));
console.log(colors.trap('Drop the bass'));
console.log(colors.rainbow(colors.trap('DROP THE RAINBOW BASS')));
// styles not widely supported
console.log(colors.bold.italic.underline.red('Chains are also cool.'));
// styles not widely supported
console.log(colors.green('So ') + colors.underline('are') + ' '
+ colors.inverse('inverse') + colors.yellow.bold(' styles! '));
console.log(colors.zebra('Zebras are so fun!'));
console.log('This is ' + colors.strikethrough('not') + ' fun.');
console.log(colors.black.bgWhite('Background color attack!'));
console.log(colors.random('Use random styles on everything!'));
console.log(colors.america('America, Heck Yeah!'));
// eslint-disable-next-line max-len
console.log(colors.brightCyan('Blindingly ') + colors.brightRed('bright? ') + colors.brightYellow('Why ') + colors.brightGreen('not?!'));
console.log('Setting themes is useful');
//
// Custom themes
//
// console.log('Generic logging theme as JSON'.green.bold.underline);
// Load theme with JSON literal
colors.setTheme({
silly: 'rainbow',
input: 'blue',
verbose: 'cyan',
prompt: 'grey',
info: 'green',
data: 'grey',
help: 'cyan',
warn: 'yellow',
debug: 'blue',
error: 'red',
});
// outputs red text
console.log(colors.error('this is an error'));
// outputs yellow text
console.log(colors.warn('this is a warning'));
// outputs blue text
console.log(colors.input('this is an input'));
// console.log('Generic logging theme as file'.green.bold.underline);
// Load a theme from file
colors.setTheme(require(__dirname + '/../themes/generic-logging.js'));
// outputs red text
console.log(colors.error('this is an error'));
// outputs yellow text
console.log(colors.warn('this is a warning'));
// outputs grey text
console.log(colors.input('this is an input'));
// console.log(colors.zalgo("Don't summon him"))


@ -1,184 +0,0 @@
// Type definitions for @colors/colors 1.4+
// Project: https://github.com/Marak/colors.js
// Definitions by: Bart van der Schoor <https://github.com/Bartvds>, Staffan Eketorp <https://github.com/staeke>
// Definitions: https://github.com/DABH/colors.js
export interface Color {
(text: string): string;
strip: Color;
stripColors: Color;
black: Color;
red: Color;
green: Color;
yellow: Color;
blue: Color;
magenta: Color;
cyan: Color;
white: Color;
gray: Color;
grey: Color;
brightRed: Color;
brightGreen: Color;
brightYellow: Color;
brightBlue: Color;
brightMagenta: Color;
brightCyan: Color;
brightWhite: Color;
bgBlack: Color;
bgRed: Color;
bgGreen: Color;
bgYellow: Color;
bgBlue: Color;
bgMagenta: Color;
bgCyan: Color;
bgWhite: Color;
bgBrightRed: Color;
bgBrightGreen: Color;
bgBrightYellow: Color;
bgBrightBlue: Color;
bgBrightMagenta: Color;
bgBrightCyan: Color;
bgBrightWhite: Color;
reset: Color;
bold: Color;
dim: Color;
italic: Color;
underline: Color;
inverse: Color;
hidden: Color;
strikethrough: Color;
rainbow: Color;
zebra: Color;
america: Color;
trap: Color;
random: Color;
zalgo: Color;
}
export function enable(): void;
export function disable(): void;
export function setTheme(theme: any): void;
export let enabled: boolean;
export const strip: Color;
export const stripColors: Color;
export const black: Color;
export const red: Color;
export const green: Color;
export const yellow: Color;
export const blue: Color;
export const magenta: Color;
export const cyan: Color;
export const white: Color;
export const gray: Color;
export const grey: Color;
export const brightRed: Color;
export const brightGreen: Color;
export const brightYellow: Color;
export const brightBlue: Color;
export const brightMagenta: Color;
export const brightCyan: Color;
export const brightWhite: Color;
export const bgBlack: Color;
export const bgRed: Color;
export const bgGreen: Color;
export const bgYellow: Color;
export const bgBlue: Color;
export const bgMagenta: Color;
export const bgCyan: Color;
export const bgWhite: Color;
export const bgBrightRed: Color;
export const bgBrightGreen: Color;
export const bgBrightYellow: Color;
export const bgBrightBlue: Color;
export const bgBrightMagenta: Color;
export const bgBrightCyan: Color;
export const bgBrightWhite: Color;
export const reset: Color;
export const bold: Color;
export const dim: Color;
export const italic: Color;
export const underline: Color;
export const inverse: Color;
export const hidden: Color;
export const strikethrough: Color;
export const rainbow: Color;
export const zebra: Color;
export const america: Color;
export const trap: Color;
export const random: Color;
export const zalgo: Color;
declare global {
interface String {
strip: string;
stripColors: string;
black: string;
red: string;
green: string;
yellow: string;
blue: string;
magenta: string;
cyan: string;
white: string;
gray: string;
grey: string;
brightRed: string;
brightGreen: string;
brightYellow: string;
brightBlue: string;
brightMagenta: string;
brightCyan: string;
brightWhite: string;
bgBlack: string;
bgRed: string;
bgGreen: string;
bgYellow: string;
bgBlue: string;
bgMagenta: string;
bgCyan: string;
bgWhite: string;
bgBrightRed: string;
bgBrightGreen: string;
bgBrightYellow: string;
bgBrightBlue: string;
bgBrightMagenta: string;
bgBrightCyan: string;
bgBrightWhite: string;
reset: string;
// @ts-ignore
bold: string;
dim: string;
italic: string;
underline: string;
inverse: string;
hidden: string;
strikethrough: string;
rainbow: string;
zebra: string;
america: string;
trap: string;
random: string;
zalgo: string;
}
}


@ -1,211 +0,0 @@
/*
The MIT License (MIT)
Original Library
- Copyright (c) Marak Squires
Additional functionality
- Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
*/
var colors = {};
module['exports'] = colors;
colors.themes = {};
var util = require('util');
var ansiStyles = colors.styles = require('./styles');
var defineProps = Object.defineProperties;
var newLineRegex = new RegExp(/[\r\n]+/g);
colors.supportsColor = require('./system/supports-colors').supportsColor;
if (typeof colors.enabled === 'undefined') {
colors.enabled = colors.supportsColor() !== false;
}
colors.enable = function() {
colors.enabled = true;
};
colors.disable = function() {
colors.enabled = false;
};
colors.stripColors = colors.strip = function(str) {
return ('' + str).replace(/\x1B\[\d+m/g, '');
};
// eslint-disable-next-line no-unused-vars
var stylize = colors.stylize = function stylize(str, style) {
if (!colors.enabled) {
return str+'';
}
var styleMap = ansiStyles[style];
// Stylize should work for non-ANSI styles, too
if (!styleMap && style in colors) {
// Style maps like trap operate as functions on strings;
// they don't have properties like open or close.
return colors[style](str);
}
return styleMap.open + str + styleMap.close;
};
var matchOperatorsRe = /[|\\{}()[\]^$+*?.]/g;
var escapeStringRegexp = function(str) {
if (typeof str !== 'string') {
throw new TypeError('Expected a string');
}
return str.replace(matchOperatorsRe, '\\$&');
};
function build(_styles) {
var builder = function builder() {
return applyStyle.apply(builder, arguments);
};
builder._styles = _styles;
// __proto__ is used because we must return a function, but there is
// no way to create a function with a different prototype.
builder.__proto__ = proto;
return builder;
}
var styles = (function() {
var ret = {};
ansiStyles.grey = ansiStyles.gray;
Object.keys(ansiStyles).forEach(function(key) {
ansiStyles[key].closeRe =
new RegExp(escapeStringRegexp(ansiStyles[key].close), 'g');
ret[key] = {
get: function() {
return build(this._styles.concat(key));
},
};
});
return ret;
})();
var proto = defineProps(function colors() {}, styles);
function applyStyle() {
var args = Array.prototype.slice.call(arguments);
var str = args.map(function(arg) {
// Use weak equality check so we can colorize null/undefined in safe mode
if (arg != null && arg.constructor === String) {
return arg;
} else {
return util.inspect(arg);
}
}).join(' ');
if (!colors.enabled || !str) {
return str;
}
var newLinesPresent = str.indexOf('\n') != -1;
var nestedStyles = this._styles;
var i = nestedStyles.length;
while (i--) {
var code = ansiStyles[nestedStyles[i]];
str = code.open + str.replace(code.closeRe, code.open) + code.close;
if (newLinesPresent) {
str = str.replace(newLineRegex, function(match) {
return code.close + match + code.open;
});
}
}
return str;
}
colors.setTheme = function(theme) {
if (typeof theme === 'string') {
console.log('colors.setTheme now only accepts an object, not a string. ' +
'If you are trying to set a theme from a file, it is now your (the ' +
'caller\'s) responsibility to require the file. The old syntax ' +
'looked like colors.setTheme(__dirname + ' +
'\'/../themes/generic-logging.js\'); The new syntax looks like '+
'colors.setTheme(require(__dirname + ' +
'\'/../themes/generic-logging.js\'));');
return;
}
for (var style in theme) {
(function(style) {
colors[style] = function(str) {
if (typeof theme[style] === 'object') {
var out = str;
for (var i in theme[style]) {
out = colors[theme[style][i]](out);
}
return out;
}
return colors[theme[style]](str);
};
})(style);
}
};
function init() {
var ret = {};
Object.keys(styles).forEach(function(name) {
ret[name] = {
get: function() {
return build([name]);
},
};
});
return ret;
}
var sequencer = function sequencer(map, str) {
var exploded = str.split('');
exploded = exploded.map(map);
return exploded.join('');
};
// custom formatter methods
colors.trap = require('./custom/trap');
colors.zalgo = require('./custom/zalgo');
// maps
colors.maps = {};
colors.maps.america = require('./maps/america')(colors);
colors.maps.zebra = require('./maps/zebra')(colors);
colors.maps.rainbow = require('./maps/rainbow')(colors);
colors.maps.random = require('./maps/random')(colors);
for (var map in colors.maps) {
(function(map) {
colors[map] = function(str) {
return sequencer(colors.maps[map], str);
};
})(map);
}
defineProps(colors, init());
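The stylize loop above (`str = code.open + str.replace(code.closeRe, code.open) + code.close`) re-opens the outer style wherever a nested style emitted the shared close code, so nesting survives. A minimal stdlib-only sketch of that trick, with hard-coded red/green codes purely for illustration:

```javascript
// Foreground colors share close code 39, so closing an inner color would
// also end the outer one; re-opening the outer color after every inner
// close keeps the outer style active.
const open = '\u001b[31m';   // red open
const close = '\u001b[39m';  // default-foreground close, shared by all fg colors

function stylizeRed(str) {
  // Same idea as str.replace(code.closeRe, code.open) in colors.js.
  return open + str.split(close).join(open) + close;
}

const inner = '\u001b[32m' + 'green' + close;   // nested green segment
const wrapped = stylizeRed('a ' + inner + ' b');
// After 'green', the red open is re-emitted, so ' b' renders red again.
```

Printed in a terminal, `wrapped` shows `a` and `b` in red with `green` in green.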


@ -1,46 +0,0 @@
module['exports'] = function runTheTrap(text, options) {
var result = '';
text = text || 'Run the trap, drop the bass';
text = text.split('');
var trap = {
a: ['\u0040', '\u0104', '\u023a', '\u0245', '\u0394', '\u039b', '\u0414'],
b: ['\u00df', '\u0181', '\u0243', '\u026e', '\u03b2', '\u0e3f'],
c: ['\u00a9', '\u023b', '\u03fe'],
d: ['\u00d0', '\u018a', '\u0500', '\u0501', '\u0502', '\u0503'],
e: ['\u00cb', '\u0115', '\u018e', '\u0258', '\u03a3', '\u03be', '\u04bc',
'\u0a6c'],
f: ['\u04fa'],
g: ['\u0262'],
h: ['\u0126', '\u0195', '\u04a2', '\u04ba', '\u04c7', '\u050a'],
i: ['\u0f0f'],
j: ['\u0134'],
k: ['\u0138', '\u04a0', '\u04c3', '\u051e'],
l: ['\u0139'],
m: ['\u028d', '\u04cd', '\u04ce', '\u0520', '\u0521', '\u0d69'],
n: ['\u00d1', '\u014b', '\u019d', '\u0376', '\u03a0', '\u048a'],
o: ['\u00d8', '\u00f5', '\u00f8', '\u01fe', '\u0298', '\u047a', '\u05dd',
'\u06dd', '\u0e4f'],
p: ['\u01f7', '\u048e'],
q: ['\u09cd'],
r: ['\u00ae', '\u01a6', '\u0210', '\u024c', '\u0280', '\u042f'],
s: ['\u00a7', '\u03de', '\u03df', '\u03e8'],
t: ['\u0141', '\u0166', '\u0373'],
u: ['\u01b1', '\u054d'],
v: ['\u05d8'],
w: ['\u0428', '\u0460', '\u047c', '\u0d70'],
x: ['\u04b2', '\u04fe', '\u04fc', '\u04fd'],
y: ['\u00a5', '\u04b0', '\u04cb'],
z: ['\u01b5', '\u0240'],
};
text.forEach(function(c) {
c = c.toLowerCase();
var chars = trap[c];
if (chars) {
// Swap the letter for a random look-alike glyph.
result += chars[Math.floor(Math.random() * chars.length)];
} else {
result += c;
}
});
return result;
};


@ -1,110 +0,0 @@
// please no
module['exports'] = function zalgo(text, options) {
text = text || ' he is here ';
var soul = {
'up': [
'̍', '̎', '̄', '̅',
'̿', '̑', '̆', '̐',
'͒', '͗', '͑', '̇',
'̈', '̊', '͂', '̓',
'̈', '͊', '͋', '͌',
'̃', '̂', '̌', '͐',
'̀', '́', '̋', '̏',
'̒', '̓', '̔', '̽',
'̉', 'ͣ', 'ͤ', 'ͥ',
'ͦ', 'ͧ', 'ͨ', 'ͩ',
'ͪ', 'ͫ', 'ͬ', 'ͭ',
'ͮ', 'ͯ', '̾', '͛',
'͆', '̚',
],
'down': [
'̖', '̗', '̘', '̙',
'̜', '̝', '̞', '̟',
'̠', '̤', '̥', '̦',
'̩', '̪', '̫', '̬',
'̭', '̮', '̯', '̰',
'̱', '̲', '̳', '̹',
'̺', '̻', '̼', 'ͅ',
'͇', '͈', '͉', '͍',
'͎', '͓', '͔', '͕',
'͖', '͙', '͚', '̣',
],
'mid': [
'̕', '̛', '̀', '́',
'͘', '̡', '̢', '̧',
'̨', '̴', '̵', '̶',
'͜', '͝', '͞',
'͟', '͠', '͢', '̸',
'̷', '͡', ' ҉',
],
};
var all = [].concat(soul.up, soul.down, soul.mid);
function randomNumber(range) {
var r = Math.floor(Math.random() * range);
return r;
}
function isChar(character) {
// True when the character is one of the combining marks above.
return all.indexOf(character) !== -1;
}
function heComes(text, options) {
var result = '';
var counts;
var l;
options = options || {};
options['up'] =
typeof options['up'] !== 'undefined' ? options['up'] : true;
options['mid'] =
typeof options['mid'] !== 'undefined' ? options['mid'] : true;
options['down'] =
typeof options['down'] !== 'undefined' ? options['down'] : true;
options['size'] =
typeof options['size'] !== 'undefined' ? options['size'] : 'maxi';
text = text.split('');
for (l in text) {
if (isChar(text[l])) {
continue;
}
result = result + text[l];
counts = {'up': 0, 'down': 0, 'mid': 0};
switch (options.size) {
case 'mini':
counts.up = randomNumber(8);
counts.mid = randomNumber(2);
counts.down = randomNumber(8);
break;
case 'maxi':
counts.up = randomNumber(16) + 3;
counts.mid = randomNumber(4) + 1;
counts.down = randomNumber(64) + 3;
break;
default:
counts.up = randomNumber(8) + 1;
counts.mid = randomNumber(6) / 2;
counts.down = randomNumber(8) + 1;
break;
}
var arr = ['up', 'mid', 'down'];
for (var d in arr) {
var index = arr[d];
for (var i = 0; i <= counts[index]; i++) {
if (options[index]) {
result = result + soul[index][randomNumber(soul[index].length)];
}
}
}
}
return result;
}
// don't summon him
return heComes(text, options);
};


@ -1,110 +0,0 @@
var colors = require('./colors');
module['exports'] = function() {
//
// Extends prototype of native string object to allow for "foo".red syntax
//
var addProperty = function(color, func) {
String.prototype.__defineGetter__(color, func);
};
addProperty('strip', function() {
return colors.strip(this);
});
addProperty('stripColors', function() {
return colors.strip(this);
});
addProperty('trap', function() {
return colors.trap(this);
});
addProperty('zalgo', function() {
return colors.zalgo(this);
});
addProperty('zebra', function() {
return colors.zebra(this);
});
addProperty('rainbow', function() {
return colors.rainbow(this);
});
addProperty('random', function() {
return colors.random(this);
});
addProperty('america', function() {
return colors.america(this);
});
//
// Iterate through all default styles and colors
//
var x = Object.keys(colors.styles);
x.forEach(function(style) {
addProperty(style, function() {
return colors.stylize(this, style);
});
});
function applyTheme(theme) {
//
// Remark: This is a list of methods that exist
// on String that you should not overwrite.
//
var stringPrototypeBlacklist = [
'__defineGetter__', '__defineSetter__', '__lookupGetter__',
'__lookupSetter__', 'charAt', 'constructor', 'hasOwnProperty',
'isPrototypeOf', 'propertyIsEnumerable', 'toLocaleString', 'toString',
'valueOf', 'charCodeAt', 'indexOf', 'lastIndexOf', 'length',
'localeCompare', 'match', 'repeat', 'replace', 'search', 'slice',
'split', 'substring', 'toLocaleLowerCase', 'toLocaleUpperCase',
'toLowerCase', 'toUpperCase', 'trim', 'trimLeft', 'trimRight',
];
Object.keys(theme).forEach(function(prop) {
if (stringPrototypeBlacklist.indexOf(prop) !== -1) {
console.log('warn: '.red + ('String.prototype' + prop).magenta +
' is probably something you don\'t want to override. ' +
'Ignoring style name');
} else {
if (typeof(theme[prop]) === 'string') {
colors[prop] = colors[theme[prop]];
addProperty(prop, function() {
return colors[prop](this);
});
} else {
var themePropApplicator = function(str) {
var ret = str || this;
for (var t = 0; t < theme[prop].length; t++) {
ret = colors[theme[prop][t]](ret);
}
return ret;
};
addProperty(prop, themePropApplicator);
colors[prop] = function(str) {
return themePropApplicator(str);
};
}
}
});
}
colors.setTheme = function(theme) {
if (typeof theme === 'string') {
console.log('colors.setTheme now only accepts an object, not a string. ' +
'If you are trying to set a theme from a file, it is now your (the ' +
'caller\'s) responsibility to require the file. The old syntax ' +
'looked like colors.setTheme(__dirname + ' +
'\'/../themes/generic-logging.js\'); The new syntax looks like '+
'colors.setTheme(require(__dirname + ' +
'\'/../themes/generic-logging.js\'));');
return;
} else {
applyTheme(theme);
}
};
};


@ -1,13 +0,0 @@
var colors = require('./colors');
module['exports'] = colors;
// Remark: By default, colors will add style properties to String.prototype.
//
// If you don't wish to extend String.prototype, you can do this instead and
// native String will not be touched:
//
// var colors = require('@colors/colors/safe');
// colors.red("foo")
//
//
require('./extendStringPrototype')();


@ -1,10 +0,0 @@
module['exports'] = function(colors) {
return function(letter, i, exploded) {
if (letter === ' ') return letter;
switch (i%3) {
case 0: return colors.red(letter);
case 1: return colors.white(letter);
case 2: return colors.blue(letter);
}
};
};


@ -1,12 +0,0 @@
module['exports'] = function(colors) {
// RoY G BiV
var rainbowColors = ['red', 'yellow', 'green', 'blue', 'magenta'];
return function(letter, i, exploded) {
if (letter === ' ') {
return letter;
} else {
return colors[rainbowColors[i++ % rainbowColors.length]](letter);
}
};
};


@ -1,11 +0,0 @@
module['exports'] = function(colors) {
var available = ['underline', 'inverse', 'grey', 'yellow', 'red', 'green',
'blue', 'white', 'cyan', 'magenta', 'brightYellow', 'brightRed',
'brightGreen', 'brightBlue', 'brightWhite', 'brightCyan', 'brightMagenta'];
return function(letter, i, exploded) {
return letter === ' ' ? letter :
colors[
available[Math.floor(Math.random() * available.length)] // uniform over all styles
](letter);
};
};


@ -1,5 +0,0 @@
module['exports'] = function(colors) {
return function(letter, i, exploded) {
return i % 2 === 0 ? letter : colors.inverse(letter);
};
};
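Each map above is fed to `sequencer` (defined in colors.js), which splits the string, maps every character, and rejoins it. A toy map that uppercases every other character shows the same mechanism (the uppercase map is illustrative only, not part of the library):

```javascript
// Same shape as the sequencer in colors.js.
function sequencer(map, str) {
  return str.split('').map(map).join('');
}

// Toy stand-in for the zebra map: Array.prototype.map hands the
// character and its index to the map function, just like the real maps.
function toyZebra(letter, i) {
  return i % 2 === 0 ? letter : letter.toUpperCase();
}

const out = sequencer(toyZebra, 'zebra');
// out: 'zEbRa'
```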


@ -1,95 +0,0 @@
/*
The MIT License (MIT)
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
*/
var styles = {};
module['exports'] = styles;
var codes = {
reset: [0, 0],
bold: [1, 22],
dim: [2, 22],
italic: [3, 23],
underline: [4, 24],
inverse: [7, 27],
hidden: [8, 28],
strikethrough: [9, 29],
black: [30, 39],
red: [31, 39],
green: [32, 39],
yellow: [33, 39],
blue: [34, 39],
magenta: [35, 39],
cyan: [36, 39],
white: [37, 39],
gray: [90, 39],
grey: [90, 39],
brightRed: [91, 39],
brightGreen: [92, 39],
brightYellow: [93, 39],
brightBlue: [94, 39],
brightMagenta: [95, 39],
brightCyan: [96, 39],
brightWhite: [97, 39],
bgBlack: [40, 49],
bgRed: [41, 49],
bgGreen: [42, 49],
bgYellow: [43, 49],
bgBlue: [44, 49],
bgMagenta: [45, 49],
bgCyan: [46, 49],
bgWhite: [47, 49],
bgGray: [100, 49],
bgGrey: [100, 49],
bgBrightRed: [101, 49],
bgBrightGreen: [102, 49],
bgBrightYellow: [103, 49],
bgBrightBlue: [104, 49],
bgBrightMagenta: [105, 49],
bgBrightCyan: [106, 49],
bgBrightWhite: [107, 49],
// legacy styles for colors pre v1.0.0
blackBG: [40, 49],
redBG: [41, 49],
greenBG: [42, 49],
yellowBG: [43, 49],
blueBG: [44, 49],
magentaBG: [45, 49],
cyanBG: [46, 49],
whiteBG: [47, 49],
};
Object.keys(codes).forEach(function(key) {
var val = codes[key];
var style = styles[key] = [];
style.open = '\u001b[' + val[0] + 'm';
style.close = '\u001b[' + val[1] + 'm';
});
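The final loop converts each numeric `[open, close]` pair into escape strings of the form `ESC[<code>m`. Reproducing it for the `red` entry (a standalone sketch, not the module's export):

```javascript
// styles.js stores codes as [openCode, closeCode] and builds the
// escape sequences by wrapping each number in ESC[ ... m.
const val = [31, 39];           // red: open 31, close 39
const style = [];               // the module attaches props to an array
style.open = '\u001b[' + val[0] + 'm';
style.close = '\u001b[' + val[1] + 'm';

const colored = style.open + 'error' + style.close;
```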


@ -1,35 +0,0 @@
/*
MIT License
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
*/
'use strict';
module.exports = function(flag, argv) {
argv = argv || process.argv || [];
var terminatorPos = argv.indexOf('--');
var prefix = /^-{1,2}/.test(flag) ? '' : '--';
var pos = argv.indexOf(prefix + flag);
return pos !== -1 && (terminatorPos === -1 ? true : pos < terminatorPos);
};
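The `terminatorPos` check above means a flag only counts when it appears before a bare `--` argument. Running the same logic against a hand-built argv (the function is restated here so the example is self-contained):

```javascript
// Restated has-flag: a flag is "on" only if it appears before `--`.
function hasFlag(flag, argv) {
  argv = argv || [];
  var terminatorPos = argv.indexOf('--');
  var prefix = /^-{1,2}/.test(flag) ? '' : '--';
  var pos = argv.indexOf(prefix + flag);
  return pos !== -1 && (terminatorPos === -1 ? true : pos < terminatorPos);
}

var argv = ['--color', '--', '--no-color'];
// '--color' sits before the terminator, '--no-color' after it.
```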


@ -1,151 +0,0 @@
/*
The MIT License (MIT)
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
*/
'use strict';
var os = require('os');
var hasFlag = require('./has-flag.js');
var env = process.env;
var forceColor = void 0;
if (hasFlag('no-color') || hasFlag('no-colors') || hasFlag('color=false')) {
forceColor = false;
} else if (hasFlag('color') || hasFlag('colors') || hasFlag('color=true')
|| hasFlag('color=always')) {
forceColor = true;
}
if ('FORCE_COLOR' in env) {
forceColor = env.FORCE_COLOR.length === 0
|| parseInt(env.FORCE_COLOR, 10) !== 0;
}
function translateLevel(level) {
if (level === 0) {
return false;
}
return {
level: level,
hasBasic: true,
has256: level >= 2,
has16m: level >= 3,
};
}
function supportsColor(stream) {
if (forceColor === false) {
return 0;
}
if (hasFlag('color=16m') || hasFlag('color=full')
|| hasFlag('color=truecolor')) {
return 3;
}
if (hasFlag('color=256')) {
return 2;
}
if (stream && !stream.isTTY && forceColor !== true) {
return 0;
}
var min = forceColor ? 1 : 0;
if (process.platform === 'win32') {
// Node.js 7.5.0 is the first version of Node.js to include a patch to
// libuv that enables 256 color output on Windows. Anything earlier and it
// won't work. However, here we target Node.js 8 at minimum as it is an LTS
// release, and Node.js 7 is not. Windows 10 build 10586 is the first
// Windows release that supports 256 colors. Windows 10 build 14931 is the
// first release that supports 16m/TrueColor.
var osRelease = os.release().split('.');
if (Number(process.versions.node.split('.')[0]) >= 8
&& Number(osRelease[0]) >= 10 && Number(osRelease[2]) >= 10586) {
return Number(osRelease[2]) >= 14931 ? 3 : 2;
}
return 1;
}
if ('CI' in env) {
if (['TRAVIS', 'CIRCLECI', 'APPVEYOR', 'GITLAB_CI'].some(function(sign) {
return sign in env;
}) || env.CI_NAME === 'codeship') {
return 1;
}
return min;
}
if ('TEAMCITY_VERSION' in env) {
return (/^(9\.(0*[1-9]\d*)\.|\d{2,}\.)/.test(env.TEAMCITY_VERSION) ? 1 : 0
);
}
if ('TERM_PROGRAM' in env) {
var version = parseInt((env.TERM_PROGRAM_VERSION || '').split('.')[0], 10);
switch (env.TERM_PROGRAM) {
case 'iTerm.app':
return version >= 3 ? 3 : 2;
case 'Hyper':
return 3;
case 'Apple_Terminal':
return 2;
// No default
}
}
if (/-256(color)?$/i.test(env.TERM)) {
return 2;
}
if (/^screen|^xterm|^vt100|^rxvt|color|ansi|cygwin|linux/i.test(env.TERM)) {
return 1;
}
if ('COLORTERM' in env) {
return 1;
}
if (env.TERM === 'dumb') {
return min;
}
return min;
}
function getSupportLevel(stream) {
var level = supportsColor(stream);
return translateLevel(level);
}
module.exports = {
supportsColor: getSupportLevel,
stdout: getSupportLevel(process.stdout),
stderr: getSupportLevel(process.stderr),
};
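Consumers read `colors.stdout` / `colors.stderr`, which is either `false` (no color) or the capability object built by `translateLevel`. That mapping in isolation (copied shape, stdlib-only):

```javascript
// Level 0 means no color support at all; higher levels add depth:
// 1 = basic 16 colors, 2 = 256 colors, 3 = 16 million (truecolor).
function translateLevel(level) {
  if (level === 0) {
    return false;
  }
  return {
    level: level,
    hasBasic: true,
    has256: level >= 2,
    has16m: level >= 3,
  };
}

const caps = translateLevel(2);
```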


@ -1,45 +0,0 @@
{
"name": "@colors/colors",
"description": "get colors in your node.js console",
"version": "1.6.0",
"author": "DABH",
"contributors": [
{
"name": "DABH",
"url": "https://github.com/DABH"
}
],
"homepage": "https://github.com/DABH/colors.js",
"bugs": "https://github.com/DABH/colors.js/issues",
"keywords": [
"ansi",
"terminal",
"colors"
],
"repository": {
"type": "git",
"url": "http://github.com/DABH/colors.js.git"
},
"license": "MIT",
"scripts": {
"lint": "eslint . --fix",
"test": "export FORCE_COLOR=1 && node tests/basic-test.js && node tests/safe-test.js"
},
"engines": {
"node": ">=0.1.90"
},
"main": "lib/index.js",
"files": [
"examples",
"lib",
"LICENSE",
"safe.js",
"themes",
"index.d.ts",
"safe.d.ts"
],
"devDependencies": {
"eslint": "^8.9.0",
"eslint-config-google": "^0.14.0"
}
}


@ -1,64 +0,0 @@
// Type definitions for Colors.js 1.2
// Project: https://github.com/Marak/colors.js
// Definitions by: Bart van der Schoor <https://github.com/Bartvds>, Staffan Eketorp <https://github.com/staeke>
// Definitions: https://github.com/Marak/colors.js
export const enabled: boolean;
export function enable(): void;
export function disable(): void;
export function setTheme(theme: any): void;
export function strip(str: string): string;
export function stripColors(str: string): string;
export function black(str: string): string;
export function red(str: string): string;
export function green(str: string): string;
export function yellow(str: string): string;
export function blue(str: string): string;
export function magenta(str: string): string;
export function cyan(str: string): string;
export function white(str: string): string;
export function gray(str: string): string;
export function grey(str: string): string;
export function brightRed(str: string): string;
export function brightGreen(str: string): string;
export function brightYellow(str: string): string;
export function brightBlue(str: string): string;
export function brightMagenta(str: string): string;
export function brightCyan(str: string): string;
export function brightWhite(str: string): string;
export function bgBlack(str: string): string;
export function bgRed(str: string): string;
export function bgGreen(str: string): string;
export function bgYellow(str: string): string;
export function bgBlue(str: string): string;
export function bgMagenta(str: string): string;
export function bgCyan(str: string): string;
export function bgWhite(str: string): string;
export function bgBrightRed(str: string): string;
export function bgBrightGreen(str: string): string;
export function bgBrightYellow(str: string): string;
export function bgBrightBlue(str: string): string;
export function bgBrightMagenta(str: string): string;
export function bgBrightCyan(str: string): string;
export function bgBrightWhite(str: string): string;
export function reset(str: string): string;
export function bold(str: string): string;
export function dim(str: string): string;
export function italic(str: string): string;
export function underline(str: string): string;
export function inverse(str: string): string;
export function hidden(str: string): string;
export function strikethrough(str: string): string;
export function rainbow(str: string): string;
export function zebra(str: string): string;
export function america(str: string): string;
export function trap(str: string): string;
export function random(str: string): string;
export function zalgo(str: string): string;


@ -1,10 +0,0 @@
//
// Remark: Requiring this file will use the "safe" colors API,
// which will not touch String.prototype.
//
// var colors = require('colors/safe');
// colors.red("foo")
//
//
var colors = require('./lib/colors');
module['exports'] = colors;


@ -1,12 +0,0 @@
module['exports'] = {
silly: 'rainbow',
input: 'grey',
verbose: 'cyan',
prompt: 'grey',
info: 'green',
data: 'grey',
help: 'cyan',
warn: 'yellow',
debug: 'blue',
error: 'red',
};


@ -1,26 +0,0 @@
# CHANGELOG
### 2.0.2
- Bump to kuler 2.0, which removes the never-used colornames dependency, for a
smaller install size and fewer dependencies for all.
### 2.0.1
- Use `storage-engine@3.0`, which will automatically detect the correct
AsyncStorage implementation.
- The upgrade also fixes a bug that caused the `debug` and `diagnostics`
values to be JSON encoded instead of stored as regular plain text.
### 2.0.0
- Documentation improvements.
- Fixed an issue where async adapters were incorrectly detected.
- Correctly inherit colors after applying them to the browser's console.
### 2.0.0-alpha
- Complete rewrite of all internals, now with separate builds for `browser`,
`node`, and `react-native`, as well as dedicated builds for `production` and
`development` environments. Various utility methods and properties have
been added to the returned logger to make your lives even easier.


@ -1,20 +0,0 @@
The MIT License (MIT)
Copyright (c) 2015 Arnout Kazemier, Martijn Swaagman, the Contributors.
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


@ -1,473 +0,0 @@
# `diagnostics`
Diagnostics is the evolution of the debug pattern used in Node.js core. This
extremely small but powerful technique is best compared to feature flags for
loggers: the created debug logger is disabled by default, but can be enabled
without changing a line of code, using flags.
- Allows debugging in multiple JavaScript environments such as Node.js, browsers
and React-Native.
- Separated development and production builds to minimize impact on your
application when bundled.
- Allows for customization of logger, messages, and much more.
![Output Example](example.png)
## Installation
The module is released in the public npm registry and can be installed by
running:
```
npm install --save @dabh/diagnostics
```
## Usage
- [Introduction](#introduction)
- [Advanced usage](#advanced-usage)
- [Production and development builds](#production-and-development-builds)
- [WebPack](#webpack)
- [Node.js](#nodejs)
- [API](#api)
- [.enabled](#enabled)
- [.namespace](#namespace)
- [.dev/prod](#devprod)
- [set](#set)
- [modify](#modify)
- [use](#use)
- [Modifiers](#modifiers)
- [namespace](#namespace-1)
- [Adapters](#adapters)
- [process.env](#processenv)
- [hash](#hash)
- [localStorage](#localstorage)
- [AsyncStorage](#asyncstorage)
- [Loggers](#loggers)
### Introduction
To create a new logger simply `require` the `@dabh/diagnostics` module and call
the returned function. It accepts 2 arguments:
1. `namespace` **Required** This is the namespace of your logger, so we know
whether we need to enable your logger when a debug flag is used. Generally
you use the name of your library or application as the first root namespace.
For example, if you're building a parser in a library (`example`), you would
set the namespace to `example:parser`.
2. `options` An object with additional configuration for the logger. The
following keys are recognized:
- `force` Force the logger to be enabled.
- `colors` Colors are enabled by default for the logs, but you can set this
option to `false` to disable it.
```js
const debug = require('@dabh/diagnostics')('foo:bar:baz');
const debug = require('@dabh/diagnostics')('foo:bar:baz', { options });
debug('this is a log message %s', 'that will only show up when enabled');
debug('that is pretty neat', { log: 'more', data: 1337 });
```
Unlike `console.log` statements that you add and remove during your development
lifecycle, you create meaningful log statements that give you insight into
the library or application that you're developing.
The created debugger uses different "adapters" to extract the debug flag
out of the JavaScript environment. To learn more about enabling the debug flag
in your specific environment click on one of the enabled adapters below.
- **browser**: [localStorage](#localstorage), [hash](#hash)
- **node.js**: [environment variables](#processenv)
- **react-native**: [AsyncStorage](#asyncstorage)
Please note that the returned logger is fully configured out of the box; you
do not need to set any of the adapters/modifiers yourself. They are there
for when you want more advanced control over the process, as covered in the
next section.
### Advanced usage
There are 2 specific usage patterns for `diagnostics`: library developers who
implement it as part of their modules, and application developers who either
use it in their application or are searching for ways to consume the messages.
With the simple log interface as discussed in the [introduction](#introduction)
section we make it easy for developers to add it to their libraries and
applications, and with the powerful [API](#api) we allow infinite customization
through custom adapters, loggers, and modifiers, ensuring that this library
remains relevant. These methods not only allow the introduction of new loggers,
but also let you think outside the box. For example, you can maintain a history
of past log messages and output them when an uncaught exception happens in
your application, so you have additional context.
```js
const diagnostics = require('@dabh/diagnostics');
let index = 0;
const limit = 200;
const history = new Array(limit);
//
// Force all `diagnostic` loggers to be enabled.
//
diagnostics.force = process.env.NODE_ENV === 'prod';
diagnostics.set(function customLogger(meta, message) {
history[index] = { meta, message, now: Date.now() };
index = (index + 1) % limit; // wrap so the buffer holds at most `limit` entries
//
// We're running a development build, so output.
//
if (meta.dev) console.log.apply(console, message);
});
process.on('uncaughtException', async function (err) {
await saveErrorToDisk(err, history);
process.exit(1);
});
```
The small snippet above maintains a 200-entry FIFO (First In, First Out)
queue of all debug messages that can be referenced when your application crashes.
#### Production and development builds
When you `require` the `@dabh/diagnostics` module you will be given a logger that is
optimized for `development` so it can provide the best developer experience
possible.
The development logger enables all the [adapters](#adapters) for your
JavaScript environment, adds a logger that outputs the messages to `console.log`,
and registers our message modifiers, so log messages will be prefixed with the
supplied namespace and you know where the log messages originate from.
The production logger does not have any adapter, modifier, or logger enabled
by default. This ensures that your log messages never accidentally show up in
production. However, this does not mean that it's impossible to get debug
messages in production: you can `force` the debugger to be enabled and
supply a [custom logger](#loggers).
```js
const diagnostics = require('@dabh/diagnostics');
const debug = diagnostics('foo:bar', { force: true });
//
// Or enable _every_ diagnostic instance:
//
diagnostics.force = true;
```
##### WebPack
WebPack has the concept of [modes](https://webpack.js.org/concepts/mode/#usage),
which create differently optimized bundles:
```js
module.exports = {
mode: 'development' // 'production'
}
```
When you are building your app using the WebPack CLI you can use the `--mode`
flag:
```
webpack --mode=production app.js -o /dist/bundle.js
```
##### Node.js
When you are running your app using `Node.js` you should set the `NODE_ENV`
environment variable to `production` to ensure that the libraries you
import are optimized for production.
```
NODE_ENV=production node app.js
```
### API
The returned logger exposes some additional properties that can be used in
your application or library:
#### .enabled
The returned logger will have a `.enabled` property assigned to it. This boolean
can be used to check if the logger was enabled:
```js
const debug = require('@dabh/diagnostics')('foo:bar');
if (debug.enabled) {
//
// Do something special
//
}
```
This property is exposed as:
- Property on the logger.
- Property on the meta/options object.
#### .namespace
This is the namespace that you originally provided to the function.
```js
const debug = require('@dabh/diagnostics')('foo:bar');
console.log(debug.namespace); // foo:bar
```
This property is exposed as:
- Property on the logger.
- Property on the meta/options object.
#### .dev/prod
There are different builds available of `diagnostics`, when you create a
production build of your application using `NODE_ENV=production` you will be
given an optimized, smaller build of `diagnostics` to reduce your bundle size.
The `dev` and `prod` booleans on the returned logger indicate if you have a
production or development version of the logger.
```js
const debug = require('@dabh/diagnostics')('foo:bar');
if (debug.prod) {
// do stuff
}
```
This property is exposed as:
- Property on the logger.
- Property on the meta/options object.
#### set
Sets a new logger as the default for **all** `diagnostics` instances. The passed
argument should be a function that writes the log messages to wherever you
want. It receives 2 arguments:
1. `meta` An object with all the options that were provided to the original
logger that wants to write the log message, as well as properties of the
debugger such as `prod`, `dev`, `namespace`, `enabled`. See [API](#api) for
all exposed properties.
2. `args` An array of the log messages that need to be written.
```js
const debug = require('@dabh/diagnostics')('foo:more:namespaces');
debug.set(function logger(meta, args) {
console.log(meta);
console.debug(...args);
});
```
This method is exposed as:
- Method on the logger.
- Method on the meta/options object.
- Method on `diagnostics` module.
#### modify
The modify method allows you to add a new message modifier to **all**
`diagnostics` instances. The passed argument should be a function that returns
the passed message after modification. The function receives 2 arguments:
1. `message`, Array, the log message.
2. `options`, Object, the options that were passed into the logger when it was
initially created.
```js
const debug = require('@dabh/diagnostics')('example:modifiers');
debug.modify(function (message, options) {
return message;
});
```
This method is exposed as:
- Method on the logger.
- Method on the meta/options object.
- Method on `diagnostics` module.
See [modifiers](#modifiers) for more information.
#### use
Adds a new `adapter` to **all** `diagnostics` instances. The passed argument
should be a function that returns a boolean indicating whether the passed-in
`namespace` is allowed to write log messages.
```js
const diagnostics = require('@dabh/diagnostics');
const debug = diagnostics('foo:bar');
debug.use(function (namespace) {
return namespace === 'foo:bar';
});
```
This method is exposed as:
- Method on the logger.
- Method on the meta/options object.
- Method on `diagnostics` module.
See [adapters](#adapters) for more information.
### Modifiers
To be as flexible as possible when it comes to transforming messages we've
come up with the concept of `modifiers` which can enhance the debug messages.
This allows you to introduce functionality or details that you find important
for debug messages, and doesn't require us to add additional bloat to the
`diagnostic` core.
For example, say you want the messages to be prefixed with the date-time of
when the log message occurred:
```js
const diagnostics = require('@dabh/diagnostics');
diagnostics.modify(function datetime(args, options) {
args.unshift(new Date());
return args;
});
```
Now all messages will be prefixed with the date output by `new Date()`.
The following modifiers are shipped with `diagnostics` and are enabled in
**development** mode only:
- [namespace](#namespace)
#### namespace
This modifier is enabled for all debug instances and prefixes the messages
with the name of the namespace under which they are logged. The namespace is
colored using the `colorspace` module, which groups similar namespaces under
the same colorspace. You can have multiple namespaces for the debuggers, with
each namespace separated by a `:`
```
foo
foo:bar
foo:bar:baz
```
For console-based output the `namespace-ansi` modifier is used.
### Adapters
Adapters allow `diagnostics` to pull the `DEBUG` and `DIAGNOSTICS` environment
variables from different sources. Not every JavaScript environment has a
`process.env` that we can leverage, so adapters let us source these variables
differently in different environments. You can also write your own custom
adapter if needed.
The `adapter` function should be passed a function as its argument; this
function will receive the `namespace` of a logger and should return a boolean
that indicates whether that logger should be enabled.
```js
const debug = require('@dabh/diagnostics')('example:namespace');
debug.adapter(require('@dabh/diagnostics/adapters/localstorage'));
```
Adapters are only enabled in `development` mode. The following adapters are
available:
#### process.env
This adapter is enabled for `node.js`.
Uses the `DEBUG` or `DIAGNOSTICS` (both are recognized) environment variables to
pass in debug flag:
**UNIX/Linux/Mac**
```
DEBUG=foo* node index.js
```
Using environment variables on Windows is a bit different, and also depends on
the toolchain you are using:
**Windows**
```
set DEBUG=foo* & node index.js
```
**Powershell**
```
$env:DEBUG='foo*';node index.js
```
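The `foo*` flag above is a wildcard pattern; the actual matching is done by the `enabled` package, but the idea can be sketched as converting the pattern to a regular expression (an illustration, not the package's exact semantics):

```javascript
// Turn a comma/space separated DEBUG-style flag into a namespace predicate.
function matcher(flags) {
  const patterns = flags.split(/[\s,]+/).map(
    (flag) => new RegExp('^' + flag.replace(/\*/g, '.*') + '$')
  );
  return (namespace) => patterns.some((re) => re.test(namespace));
}

const isEnabled = matcher('foo*');
// isEnabled('foo:bar') -> true, isEnabled('bar') -> false
```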
#### hash
This adapter is enabled for `browsers`.
This adapter uses `window.location.hash` as the source for the environment
variables. It assumes the hash is formatted using the same syntax as query
strings:
```
http://example.com/foo/bar#debug=foo*
```
It triggers on both the `debug=` and `diagnostics=` names.
#### localStorage
This adapter is enabled for `browsers`.
This adapter uses the `localStorage` of the browser to store the debug flags.
You can set the debug flag yourself in your application code, but you can
also open the browser's Web Inspector and enable it through the console.
```js
localStorage.setItem('debug', 'foo*');
```
It triggers on both the `debug` and `diagnostics` storage items. (Please note
that these keys should be entered in lowercase.)
#### AsyncStorage
This adapter is enabled for `react-native`.
This adapter uses the `AsyncStorage` API that is exposed by the `react-native`
library to store and read the `debug` or `diagnostics` storage items.
```js
import { AsyncStorage } from 'react-native';
AsyncStorage.setItem('debug', 'foo*');
```
Unlike the other adapters, this adapter is `async`, which means we are not able
to instantly determine whether a created logger should be enabled or disabled.
So when a logger is created in `react-native` we initially assume it is
disabled, and any messages sent during this period are queued internally.
Once we've received the data from the `AsyncStorage` API we will determine
if the logger should be enabled, flush the queued messages if needed and set
all `enabled` properties accordingly on the returned logger.
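The queue-and-flush behaviour described above can be sketched like this (a simplification of the real react-native code path; names are illustrative):

```javascript
const queue = [];
const written = [];
let enabled = null; // unknown until the async storage lookup resolves

function log(...args) {
  if (enabled === null) { // still waiting: buffer the message
    queue.push(args);
    return;
  }
  if (enabled) written.push(args);
}

// Called once the (async) storage lookup has resolved.
function flush(isEnabled) {
  enabled = isEnabled;
  while (queue.length) {
    const args = queue.shift();
    if (enabled) written.push(args);
  }
}

log('early'); // buffered
flush(true);  // logger turns out to be enabled: 'early' is flushed
log('late');  // written immediately
```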
### Loggers
By default all messages are logged to `console.log` when the logger is enabled
via the debug flag that is set through one of the adapters.
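To send messages somewhere other than `console.log`, override the logger with `set`; the sketch below mimics that mechanism with a custom sink that collects lines instead of printing (names are illustrative, not the library's internals):

```javascript
// Default sink: print to the console.
let logger = (meta, args) => console.log(...args);

function set(custom) {
  logger = custom;
}

const lines = [];
set((meta, args) => lines.push(args.join(' '))); // collect instead of print

logger({ namespace: 'foo' }, ['hello', 'world']);
// lines is now ['hello world']
```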
## License
[MIT](LICENSE)
@ -1,11 +0,0 @@
var adapter = require('./');
/**
 * Extracts the values from window.location.hash.
*
* @type {Function}
* @public
*/
module.exports = adapter(function hash() {
return /(debug|diagnostics)=([^&]+)/i.exec(window.location.hash)[2];
});
@ -1,18 +0,0 @@
var enabled = require('enabled');
/**
* Creates a new Adapter.
*
* @param {Function} fn Function that returns the value.
* @returns {Function} The adapter logic.
* @public
*/
module.exports = function create(fn) {
return function adapter(namespace) {
try {
return enabled(namespace, fn());
} catch (e) { /* Any failure means that we found nothing */ }
return false;
};
}
@ -1,11 +0,0 @@
var adapter = require('./');
/**
 * Extracts the values from localStorage.
*
* @type {Function}
* @public
*/
module.exports = adapter(function storage() {
return localStorage.getItem('debug') || localStorage.getItem('diagnostics');
});
@ -1,11 +0,0 @@
var adapter = require('./');
/**
* Extracts the values from process.env.
*
* @type {Function}
* @public
*/
module.exports = adapter(function processenv() {
return process.env.DEBUG || process.env.DIAGNOSTICS;
});
@ -1,35 +0,0 @@
var create = require('../diagnostics');
/**
* Create a new diagnostics logger.
*
* @param {String} namespace The namespace it should enable.
* @param {Object} options Additional options.
* @returns {Function} The logger.
* @public
*/
var diagnostics = create(function dev(namespace, options) {
options = options || {};
options.namespace = namespace;
options.prod = false;
options.dev = true;
if (!dev.enabled(namespace) && !(options.force || dev.force)) {
return dev.nope(options);
}
return dev.yep(options);
});
//
// Configure the logger for the given environment.
//
diagnostics.modify(require('../modifiers/namespace'));
diagnostics.use(require('../adapters/localstorage'));
diagnostics.use(require('../adapters/hash'));
diagnostics.set(require('../logger/console'));
//
// Expose the diagnostics logger.
//
module.exports = diagnostics;
@ -1,8 +0,0 @@
//
// Select the correct build version depending on the environment.
//
if (process.env.NODE_ENV === 'production') {
module.exports = require('./production.js');
} else {
module.exports = require('./development.js');
}
@ -1,6 +0,0 @@
var diagnostics = require('./');
//
// No way to override `debug` with `diagnostics` in the browser.
//
module.exports = diagnostics;
@ -1,24 +0,0 @@
var create = require('../diagnostics');
/**
* Create a new diagnostics logger.
*
* @param {String} namespace The namespace it should enable.
* @param {Object} options Additional options.
* @returns {Function} The logger.
* @public
*/
var diagnostics = create(function prod(namespace, options) {
options = options || {};
options.namespace = namespace;
options.prod = true;
options.dev = false;
if (!(options.force || prod.force)) return prod.nope(options);
return prod.yep(options);
});
//
// Expose the diagnostics logger.
//
module.exports = diagnostics;
@ -1,212 +0,0 @@
/**
* Contains all configured adapters for the given environment.
*
* @type {Array}
* @public
*/
var adapters = [];
/**
* Contains all modifier functions.
*
 * @type {Array}
* @public
*/
var modifiers = [];
/**
* Our default logger.
*
* @public
*/
var logger = function devnull() {};
/**
 * Register a new adapter that will be used to find environments.
*
* @param {Function} adapter A function that will return the possible env.
* @returns {Boolean} Indication of a successful add.
* @public
*/
function use(adapter) {
if (~adapters.indexOf(adapter)) return false;
adapters.push(adapter);
return true;
}
/**
* Assign a new log method.
*
* @param {Function} custom The log method.
* @public
*/
function set(custom) {
logger = custom;
}
/**
* Check if the namespace is allowed by any of our adapters.
*
* @param {String} namespace The namespace that needs to be enabled
* @returns {Boolean|Promise} Indication if the namespace is enabled by our adapters.
* @public
*/
function enabled(namespace) {
var async = [];
for (var i = 0; i < adapters.length; i++) {
if (adapters[i].async) {
async.push(adapters[i]);
continue;
}
if (adapters[i](namespace)) return true;
}
if (!async.length) return false;
//
// Now that we know that we have async functions, we know we run in an ES6
// environment and can use all the API's that they offer, in this case
// we want to return a Promise so that we can `await` in React-Native
// for an async adapter.
//
return new Promise(function pinky(resolve) {
Promise.all(
async.map(function prebind(fn) {
return fn(namespace);
})
).then(function resolved(values) {
resolve(values.some(Boolean));
});
});
}
/**
* Add a new message modifier to the debugger.
*
* @param {Function} fn Modification function.
* @returns {Boolean} Indication of a successful add.
* @public
*/
function modify(fn) {
if (~modifiers.indexOf(fn)) return false;
modifiers.push(fn);
return true;
}
/**
* Write data to the supplied logger.
*
* @param {Object} meta Meta information about the log.
* @param {Array} args Arguments for console.log.
* @public
*/
function write() {
logger.apply(logger, arguments);
}
/**
* Process the message with the modifiers.
*
 * @param {Mixed} message The message to be transformed by modifiers.
* @returns {String} Transformed message.
* @public
*/
function process(message) {
for (var i = 0; i < modifiers.length; i++) {
message = modifiers[i].apply(modifiers[i], arguments);
}
return message;
}
/**
* Introduce options to the logger function.
*
 * @param {Function} fn Callback function.
* @param {Object} options Properties to introduce on fn.
* @returns {Function} The passed function
* @public
*/
function introduce(fn, options) {
var has = Object.prototype.hasOwnProperty;
for (var key in options) {
if (has.call(options, key)) {
fn[key] = options[key];
}
}
return fn;
}
/**
* Nope, we're not allowed to write messages.
*
* @returns {Boolean} false
* @public
*/
function nope(options) {
options.enabled = false;
options.modify = modify;
options.set = set;
options.use = use;
return introduce(function diagnopes() {
return false;
}, options);
}
/**
* Yep, we're allowed to write debug messages.
*
* @param {Object} options The options for the process.
* @returns {Function} The function that does the logging.
* @public
*/
function yep(options) {
/**
* The function that receives the actual debug information.
*
* @returns {Boolean} indication that we're logging.
* @public
*/
function diagnostics() {
var args = Array.prototype.slice.call(arguments, 0);
write.call(write, options, process(args, options));
return true;
}
options.enabled = true;
options.modify = modify;
options.set = set;
options.use = use;
return introduce(diagnostics, options);
}
/**
* Simple helper function to introduce various of helper methods to our given
* diagnostics function.
*
* @param {Function} diagnostics The diagnostics function.
* @returns {Function} diagnostics
* @public
*/
module.exports = function create(diagnostics) {
diagnostics.introduce = introduce;
diagnostics.enabled = enabled;
diagnostics.process = process;
diagnostics.modify = modify;
diagnostics.write = write;
diagnostics.nope = nope;
diagnostics.yep = yep;
diagnostics.set = set;
diagnostics.use = use;
return diagnostics;
}
@ -1,19 +0,0 @@
/**
* An idiot proof logger to be used as default. We've wrapped it in a try/catch
* statement to ensure the environments without the `console` API do not crash
* as well as an additional fix for ancient browsers like IE8 where the
* `console.log` API doesn't have an `apply`, so we need to use the Function's
* apply functionality to apply the arguments.
*
* @param {Object} meta Options of the logger.
 * @param {Array} messages The actual messages that need to be logged.
* @public
*/
module.exports = function (meta, messages) {
//
// So yea. IE8 doesn't have an apply so we need a work around to puke the
// arguments in place.
//
try { Function.prototype.apply.call(console.log, console, messages); }
catch (e) {}
}
@ -1,20 +0,0 @@
var colorspace = require('@so-ric/colorspace');
var kuler = require('kuler');
/**
* Prefix the messages with a colored namespace.
*
* @param {Array} args The messages array that is getting written.
* @param {Object} options Options for diagnostics.
* @returns {Array} Altered messages array.
* @public
*/
module.exports = function ansiModifier(args, options) {
var namespace = options.namespace;
var ansi = options.colors !== false
? kuler(namespace +':', colorspace(namespace))
: namespace +':';
args[0] = ansi +' '+ args[0];
return args;
};
@ -1,32 +0,0 @@
var colorspace = require('@so-ric/colorspace');
/**
* Prefix the messages with a colored namespace.
*
* @param {Array} messages The messages array that is getting written.
* @param {Object} options Options for diagnostics.
* @returns {Array} Altered messages array.
* @public
*/
module.exports = function colorNamespace(args, options) {
var namespace = options.namespace;
if (options.colors === false) {
args[0] = namespace +': '+ args[0];
return args;
}
var color = colorspace(namespace);
//
// The console API supports a special %c formatter in browsers. This is used
// to style console messages with any CSS styling; in our case we want to
// colorize the namespace for clarity. As these are formatters, we need to
// inject our CSS string as the second messages argument so it gets picked
// up correctly.
//
args[0] = '%c'+ namespace +':%c '+ args[0];
args.splice(1, 0, 'color:'+ color, 'color:inherit');
return args;
};
@ -1,36 +0,0 @@
var create = require('../diagnostics');
var tty = require('tty').isatty(1);
/**
* Create a new diagnostics logger.
*
* @param {String} namespace The namespace it should enable.
* @param {Object} options Additional options.
* @returns {Function} The logger.
* @public
*/
var diagnostics = create(function dev(namespace, options) {
options = options || {};
options.colors = 'colors' in options ? options.colors : tty;
options.namespace = namespace;
options.prod = false;
options.dev = true;
if (!dev.enabled(namespace) && !(options.force || dev.force)) {
return dev.nope(options);
}
return dev.yep(options);
});
//
// Configure the logger for the given environment.
//
diagnostics.modify(require('../modifiers/namespace-ansi'));
diagnostics.use(require('../adapters/process.env'));
diagnostics.set(require('../logger/console'));
//
// Expose the diagnostics logger.
//
module.exports = diagnostics;
@ -1,8 +0,0 @@
//
// Select the correct build version depending on the environment.
//
if (process.env.NODE_ENV === 'production') {
module.exports = require('./production.js');
} else {
module.exports = require('./development.js');
}
@ -1,21 +0,0 @@
const diagnostics = require('./');
//
// Override the existing `debug` call so it will use `diagnostics` instead
// of the `debug` module.
//
try {
var key = require.resolve('debug');
require.cache[key] = {
exports: diagnostics,
filename: key,
loaded: true,
id: key
};
} catch (e) { /* We don't really care if it fails */ }
//
// Export the default import as exports again.
//
module.exports = diagnostics;
@ -1,24 +0,0 @@
var create = require('../diagnostics');
/**
* Create a new diagnostics logger.
*
* @param {String} namespace The namespace it should enable.
* @param {Object} options Additional options.
* @returns {Function} The logger.
* @public
*/
var diagnostics = create(function prod(namespace, options) {
options = options || {};
options.namespace = namespace;
options.prod = true;
options.dev = false;
if (!(options.force || prod.force)) return prod.nope(options);
return prod.yep(options);
});
//
// Expose the diagnostics logger.
//
module.exports = diagnostics;
@ -1,64 +0,0 @@
{
"name": "@dabh/diagnostics",
"version": "2.0.8",
"description": "Tools for debugging your node.js modules and event loop",
"main": "./node",
"browser": "./browser",
"scripts": {
"test:basic": "mocha --require test/mock.js test/*.test.js",
"test:node": "mocha --require test/mock test/node.js",
"test:browser": "mocha --require test/mock test/browser.js",
"test:runner": "npm run test:basic && npm run test:node && npm run test:browser",
"webpack:node:prod": "webpack --mode=production node/index.js -o /dev/null --json | webpack-bundle-size-analyzer",
"webpack:node:dev": "webpack --mode=development node/index.js -o /dev/null --json | webpack-bundle-size-analyzer",
"webpack:browser:prod": "webpack --mode=production browser/index.js -o /dev/null --json | webpack-bundle-size-analyzer",
"webpack:browser:dev": "webpack --mode=development browser/index.js -o /dev/null --json | webpack-bundle-size-analyzer",
"test": "nyc --reporter=text --reporter=lcov npm run test:runner"
},
"repository": {
"type": "git",
"url": "git://github.com/DABH/diagnostics.git"
},
"keywords": [
"debug",
"debugger",
"debugging",
"diagnostic",
"diagnostics",
"event",
"loop",
"metrics",
"stats"
],
"author": "Arnout Kazemier",
"license": "MIT",
"bugs": {
"url": "https://github.com/DABH/diagnostics/issues"
},
"homepage": "https://github.com/DABH/diagnostics",
"devDependencies": {
"assume": "2.3.x",
"asyncstorageapi": "^1.0.2",
"mocha": "^11.7.2",
"nyc": "^17.1.0",
"objstorage": "^1.0.0",
"pre-commit": "github:metcalfc/pre-commit#b36c649fd5348d7604a86b7b2f3429c780d1478f",
"require-poisoning": "^2.0.0",
"webpack": "5.x",
"webpack-bundle-size-analyzer": "^3.0.0",
"webpack-cli": "6.x"
},
"dependencies": {
"@so-ric/colorspace": "^1.1.6",
"enabled": "2.0.x",
"kuler": "^2.0.0"
},
"contributors": [
"Martijn Swaagman (https://github.com/swaagie)",
"Jarrett Cruger (https://github.com/jcrugzz)",
"Sevastos (https://github.com/sevastos)"
],
"directories": {
"test": "test"
}
}
@ -1,74 +0,0 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"
on:
push:
branches: [ "master" ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ "master" ]
schedule:
- cron: '24 5 * * 4'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'javascript' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@v3
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# Details on CodeQL's query packs refer to : https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code if your project requires it; please refer to the EXAMPLE below for guidance.
# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
with:
category: "/language:${{matrix.language}}"
@ -1,510 +0,0 @@
# node-pre-gyp changelog
## 1.0.11
- Fixes dependabot alert [CVE-2021-44906](https://nvd.nist.gov/vuln/detail/CVE-2021-44906)
## 1.0.10
- Upgraded minimist to 1.2.6 to address dependabot alert [CVE-2021-44906](https://nvd.nist.gov/vuln/detail/CVE-2021-44906)
## 1.0.9
- Upgraded node-fetch to 2.6.7 to address [CVE-2022-0235](https://www.cve.org/CVERecord?id=CVE-2022-0235)
- Upgraded detect-libc to 2.0.0 to use non-blocking NodeJS(>=12) Report API
## 1.0.8
- Downgraded npmlog to maintain node v10 and v8 support (https://github.com/mapbox/node-pre-gyp/pull/624)
## 1.0.7
- Upgraded nyc and npmlog to address https://github.com/advisories/GHSA-93q8-gq69-wqmw
## 1.0.6
- Added node v17 to the internal node releases listing
- Upgraded various dependencies declared in package.json to latest major versions (node-fetch from 2.6.1 to 2.6.5, npmlog from 4.1.2 to 5.0.1, semver from 7.3.4 to 7.3.5, and tar from 6.1.0 to 6.1.11)
- Fixed bug in `staging_host` parameter (https://github.com/mapbox/node-pre-gyp/pull/590)
## 1.0.5
- Fix circular reference warning with node >= v14
## 1.0.4
- Added node v16 to the internal node releases listing
## 1.0.3
- Improved support configuring s3 uploads (solves https://github.com/mapbox/node-pre-gyp/issues/571)
- New options added in https://github.com/mapbox/node-pre-gyp/pull/576: 'bucket', 'region', and `s3ForcePathStyle`
## 1.0.2
- Fixed regression in proxy support (https://github.com/mapbox/node-pre-gyp/issues/572)
## 1.0.1
- Switched from mkdirp@1.0.4 to make-dir@3.1.0 to avoid this bug: https://github.com/isaacs/node-mkdirp/issues/31
## 1.0.0
- Module is now name-spaced at `@mapbox/node-pre-gyp` and the original `node-pre-gyp` is deprecated.
- New: support for staging and production s3 targets (see README.md)
- BREAKING: no longer supporting `node_pre_gyp_accessKeyId` & `node_pre_gyp_secretAccessKey`, use `AWS_ACCESS_KEY_ID` & `AWS_SECRET_ACCESS_KEY` instead to authenticate against s3 for `info`, `publish`, and `unpublish` commands.
- Dropped node v6 support, added node v14 support
- Switched tests to use mapbox-owned bucket for testing
- Added coverage tracking and linting with eslint
- Added back support for symlinks inside the tarball
- Upgraded all test apps to N-API/node-addon-api
- New: support for staging and production s3 targets (see README.md)
- Added `node_pre_gyp_s3_host` env var which has priority over the `--s3_host` option or default.
- Replaced needle with node-fetch
- Added proxy support for node-fetch
- Upgraded to mkdirp@1.x
## 0.17.0
- Got travis + appveyor green again
- Added support for more node versions
## 0.16.0
- Added Node 15 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/520)
## 0.15.0
- Bump dependency on `mkdirp` from `^0.5.1` to `^0.5.3` (https://github.com/mapbox/node-pre-gyp/pull/492)
- Bump dependency on `needle` from `^2.2.1` to `^2.5.0` (https://github.com/mapbox/node-pre-gyp/pull/502)
- Added Node 14 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/501)
## 0.14.0
- Defer modules requires in napi.js (https://github.com/mapbox/node-pre-gyp/pull/434)
- Bump dependency on `tar` from `^4` to `^4.4.2` (https://github.com/mapbox/node-pre-gyp/pull/454)
- Support extracting compiled binary from local offline mirror (https://github.com/mapbox/node-pre-gyp/pull/459)
- Added Node 13 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/483)
## 0.13.0
- Added Node 12 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/449)
## 0.12.0
- Fixed double-build problem with node v10 (https://github.com/mapbox/node-pre-gyp/pull/428)
- Added node 11 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/422)
## 0.11.0
- Fixed double-install problem with node v10
- Significant N-API improvements (https://github.com/mapbox/node-pre-gyp/pull/405)
## 0.10.3
- Now will use `request` over `needle` if request is installed. By default `needle` is used for `https`. This should unbreak proxy support that regressed in v0.9.0
## 0.10.2
- Fixed rc/deep-extent security vulnerability
- Fixed broken reinstall script due to incorrectly named get_best_napi_version
## 0.10.1
- Fix needle error event (@medns)
## 0.10.0
- Allow for a single-level module path when packing @allenluce (https://github.com/mapbox/node-pre-gyp/pull/371)
- Log warnings instead of errors when falling back @xzyfer (https://github.com/mapbox/node-pre-gyp/pull/366)
- Add Node.js v10 support to tests (https://github.com/mapbox/node-pre-gyp/pull/372)
- Remove retire.js from CI (https://github.com/mapbox/node-pre-gyp/pull/372)
- Remove support for Node.js v4 due to [EOL on April 30th, 2018](https://github.com/nodejs/Release/blob/7dd52354049cae99eed0e9fe01345b0722a86fde/schedule.json#L14)
- Update appveyor tests to install default NPM version instead of NPM v2.x for all Windows builds (https://github.com/mapbox/node-pre-gyp/pull/375)
## 0.9.1
- Fixed regression (in v0.9.0) with support for http redirects @allenluce (https://github.com/mapbox/node-pre-gyp/pull/361)
## 0.9.0
- Switched from using `request` to `needle` to reduce size of module deps (https://github.com/mapbox/node-pre-gyp/pull/350)
## 0.8.0
- N-API support (@inspiredware)
## 0.7.1
- Upgraded to tar v4.x
## 0.7.0
- Updated request and hawk (#347)
- Dropped node v0.10.x support
## 0.6.40
- Improved error reporting if an install fails
## 0.6.39
- Support for node v9
- Support for versioning on `{libc}` to allow binaries to work on non-glibc linux systems like alpine linux
## 0.6.38
- Maintaining compatibility (for v0.6.x series) with node v0.10.x
## 0.6.37
- Solved one part of #276: we now deduce the node ABI from the major version for node >= 2 even when not stored in the abi_crosswalk.json
- Fixed docs to avoid mentioning the deprecated and dangerous `prepublish` in package.json (#291)
- Add new node versions to crosswalk
- Ported tests to use tape instead of mocha
- Got appveyor tests passing by downgrading npm and node-gyp
## 0.6.36
- Removed the running of `testbinary` during install. Because this was regressed for so long, it is too dangerous to re-enable by default. Developers needing validation can call `node-pre-gyp testbinary` directly.
- Fixed regression in v0.6.35 for electron installs (now skipping binary validation which is not yet supported for electron)
## 0.6.35
- No longer recommending `npm ls` in `prepublish` (#291)
- Fixed testbinary command (#283) @szdavid92
## 0.6.34
- Added new node versions to crosswalk, including v8
- Upgraded deps to latest versions, started using `^` instead of `~` for all deps.
## 0.6.33
- Improved support for yarn
## 0.6.32
- Honor npm configuration for CA bundles (@heikkipora)
- Add node-pre-gyp and npm versions to user agent (@addaleax)
- Updated various deps
- Add known node version for v7.x
## 0.6.31
- Updated various deps
## 0.6.30
- Update to npmlog@4.x and semver@5.3.x
- Add known node version for v6.5.0
## 0.6.29
- Add known node versions for v0.10.45, v0.12.14, v4.4.4, v5.11.1, and v6.1.0
## 0.6.28
- Now more verbose when remote binaries are not available. This is needed since npm is increasingly more quiet by default
and users need to know why builds are falling back to source compiles that might then error out.
## 0.6.27
- Add known node version for node v6
- Stopped bundling dependencies
- Documented method for module authors to avoid bundling node-pre-gyp
- See https://github.com/mapbox/node-pre-gyp/tree/master#configuring for details
## 0.6.26
- Skip validation for nw runtime (https://github.com/mapbox/node-pre-gyp/pull/181) via @fleg
## 0.6.25
- Improved support for auto-detection of electron runtime in `node-pre-gyp.find()`
- Pull request from @enlight - https://github.com/mapbox/node-pre-gyp/pull/187
- Add known node version for 4.4.1 and 5.9.1
## 0.6.24
- Add known node version for 5.8.0, 5.9.0, and 4.4.0.
## 0.6.23
- Add known node version for 0.10.43, 0.12.11, 4.3.2, and 5.7.1.
## 0.6.22
- Add known node version for 4.3.1, and 5.7.0.
## 0.6.21
- Add known node version for 0.10.42, 0.12.10, 4.3.0, and 5.6.0.
## 0.6.20
- Add known node version for 4.2.5, 4.2.6, 5.4.0, 5.4.1,and 5.5.0.
## 0.6.19
- Add known node version for 4.2.4
## 0.6.18
- Add new known node versions for 0.10.x, 0.12.x, 4.x, and 5.x
## 0.6.17
- Re-tagged to fix packaging problem of `Error: Cannot find module 'isarray'`
## 0.6.16
- Added known version in crosswalk for 5.1.0.
## 0.6.15
- Upgraded tar-pack (https://github.com/mapbox/node-pre-gyp/issues/182)
- Support custom binary hosting mirror (https://github.com/mapbox/node-pre-gyp/pull/170)
- Added known version in crosswalk for 4.2.2.
## 0.6.14
- Added node 5.x version
## 0.6.13
- Added more known node 4.x versions
## 0.6.12
- Added support for [Electron](http://electron.atom.io/). Just pass the `--runtime=electron` flag when building/installing. Thanks @zcbenz
## 0.6.11
- Added known node and io.js versions including more 3.x and 4.x versions
## 0.6.10
- Added known node and io.js versions including 3.x and 4.x versions
- Upgraded `tar` dep
## 0.6.9
- Upgraded `rc` dep
- Updated known io.js version: v2.4.0
## 0.6.8
- Upgraded `semver` and `rimraf` deps
- Updated known node and io.js versions
## 0.6.7
- Fixed `node_abi` versions for io.js 1.1.x -> 1.8.x (should be 43, but was stored as 42) (refs https://github.com/iojs/build/issues/94)
## 0.6.6
- Updated with known io.js 2.0.0 version
## 0.6.5
- Now respecting `npm_config_node_gyp` (https://github.com/npm/npm/pull/4887)
- Updated to semver@4.3.2
- Updated known node v0.12.x versions and io.js 1.x versions.
## 0.6.4
- Improved support for `io.js` (@fengmk2)
- Test coverage improvements (@mikemorris)
- Fixed support for `--dist-url` that regressed in 0.6.3
## 0.6.3
- Added support for passing raw options to node-gyp using `--` separator. Flags passed after
the `--` to `node-pre-gyp configure` will be passed directly to gyp while flags passed
after the `--` to `node-pre-gyp build` will be passed directly to make/visual studio.
- Added `node-pre-gyp configure` command to be able to call `node-gyp configure` directly
- Fix issue with require validation not working on windows 7 (@edgarsilva)
## 0.6.2
- Support for io.js >= v1.0.2
- Deferred require of `request` and `tar` to help speed up command line usage of `node-pre-gyp`.
## 0.6.1
- Fixed bundled `tar` version
## 0.6.0
- BREAKING: node odd releases like v0.11.x now use `major.minor.patch` for `{node_abi}` instead of `NODE_MODULE_VERSION` (#124)
- Added support for `toolset` option in versioning. By default is an empty string but `--toolset` can be passed to publish or install to select alternative binaries that target a custom toolset like C++11. For example to target Visual Studio 2014 modules like node-sqlite3 use `--toolset=v140`.
- Added support for `--no-rollback` option to request that a failed binary test does not remove the binary module and leaves it in place.
- Added support for `--update-binary` option to request an existing binary be re-installed and the check for a valid local module be skipped.
- Added support for passing build options from `npm` through `node-pre-gyp` to `node-gyp`: `--nodedir`, `--disturl`, `--python`, and `--msvs_version`
## 0.5.31
- Added support for deducing node_abi for node.js runtime from previous release if the series is even
- Added support for --target=0.10.33
## 0.5.30
- Repackaged with latest bundled deps
## 0.5.29
- Added support for semver `build`.
- Fixed support for downloading from urls that include `+`.
## 0.5.28
- Now reporting unix style paths only in reveal command
## 0.5.27
- Fixed support for auto-detecting s3 bucket name when it contains `.` - @taavo
- Fixed support for installing when path contains a `'` - @halfdan
- Ported tests to mocha
## 0.5.26
- Fix node-webkit support when `--target` option is not provided
## 0.5.25
- Fix bundling of deps
## 0.5.24
- Updated ABI crosswalk to include node v0.10.30 and v0.10.31
## 0.5.23
- Added `reveal` command. Pass no options to get all versioning data as json. Pass a second arg to grab a single versioned property value
- Added support for `--silent` (shortcut for `--loglevel=silent`)
## 0.5.22
- Fixed node-webkit versioning name (NOTE: node-webkit support still experimental)
## 0.5.21
- New package to fix `shasum check failed` error with v0.5.20
## 0.5.20
- Now versioning node-webkit binaries based on major.minor.patch - assuming no compatible ABI across versions (#90)
## 0.5.19
- Updated to know about more node-webkit releases
## 0.5.18
- Updated to know about more node-webkit releases
## 0.5.17
- Updated to know about node v0.10.29 release
## 0.5.16
- Now supporting all aws-sdk configuration parameters (http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/node-configuring.html) (#86)
## 0.5.15
- Fixed installation of windows packages sub directories on unix systems (#84)
## 0.5.14
- Finished support for cross building using `--target_platform` option (#82)
- Now skipping binary validation on install if target arch/platform do not match the host.
- Removed multi-arch validation for OS X since it required a FAT node.js binary
## 0.5.13
- Fix problem in 0.5.12 whereby the wrong versions of mkdirp and semver were bundled.
## 0.5.12
- Improved support for node-webkit (@Mithgol)
## 0.5.11
- Updated target versions listing
## 0.5.10
- Fixed handling of `-debug` flag passed directly to node-pre-gyp (#72)
- Added optional second arg to `node_pre_gyp.find` to customize the default versioning options used to locate the runtime binary
- Failed install due to `testbinary` check failure no longer leaves behind binary (#70)
## 0.5.9
- Fixed regression in `testbinary` command causing installs to fail on windows with 0.5.7 (#60)
## 0.5.8
- Started bundling deps
## 0.5.7
- Fixed the `testbinary` check, which is used to determine whether to re-download or source compile, to work even in complex dependency situations (#63)
- Exposed the internal `testbinary` command in node-pre-gyp command line tool
- Fixed minor bug so that `fallback_to_build` option is always respected
## 0.5.6
- Added support for versioning on the `name` value in `package.json` (#57).
- Moved to using streams for reading tarball when publishing (#52)
## 0.5.5
- Improved binary validation that also now works with node-webkit (@Mithgol)
- Upgraded test apps to work with node v0.11.x
- Improved test coverage
## 0.5.4
- No longer depends on external install of node-gyp for compiling builds.
## 0.5.3
- Reverted fix for debian/nodejs since it broke windows (#45)
## 0.5.2
- Support for debian systems where the node binary is named `nodejs` (#45)
- Added `bin/node-pre-gyp.cmd` to be able to run command on windows locally (npm creates an .npm automatically when globally installed)
- Updated abi-crosswalk with node v0.10.26 entry.
## 0.5.1
- Various minor bug fixes, several improving windows support for publishing.
## 0.5.0
- Changed property names in `binary` object: now required are `module_name`, `module_path`, and `host`.
- Now `module_path` supports versioning, which allows developers to opt-in to using a versioned install path (#18).
- Added `remote_path` which also supports versioning.
- Changed `remote_uri` to `host`.
## 0.4.2
- Added support for `--target` flag to request cross-compile against a specific node/node-webkit version.
- Added preliminary support for node-webkit
- Fixed support for `--target_arch` option being respected in all cases.
## 0.4.1
- Fixed exception when only stderr is available in binary test (@bendi / #31)
## 0.4.0
- Enforce only `https:` based remote publishing access.
- Added `node-pre-gyp info` command to display listing of published binaries
- Added support for changing the directory node-pre-gyp should build in with the `-C/--directory` option.
- Added support for S3 prefixes.
## 0.3.1
- Added `unpublish` command.
- Fixed module path construction in tests.
- Added ability to disable falling back to build behavior via `npm install --fallback-to-build=false` which overrides setting in a depedencies package.json `install` target.
## 0.3.0
- Support for packaging all files in `module_path` directory - see `app4` for example
- Added `testpackage` command.
- Changed `clean` command to only delete `.node` not entire `build` directory since node-gyp will handle that.
- `.node` modules must be in a folder of their own since tar-pack will remove everything when it unpacks.

Copyright (c), Mapbox
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
* Neither the name of node-pre-gyp nor the names of its contributors
may be used to endorse or promote products derived from this software
without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

# @mapbox/node-pre-gyp
#### @mapbox/node-pre-gyp makes it easy to publish and install Node.js C++ addons from binaries
[![Build Status](https://travis-ci.com/mapbox/node-pre-gyp.svg?branch=master)](https://travis-ci.com/mapbox/node-pre-gyp)
[![Build status](https://ci.appveyor.com/api/projects/status/3nxewb425y83c0gv)](https://ci.appveyor.com/project/Mapbox/node-pre-gyp)
`@mapbox/node-pre-gyp` stands between [npm](https://github.com/npm/npm) and [node-gyp](https://github.com/Tootallnate/node-gyp) and offers a cross-platform method of binary deployment.
### Special note on previous package
On Feb 9th, 2021 `@mapbox/node-pre-gyp@1.0.0` was [released](./CHANGELOG.md). Older, unscoped versions that are not part of the `@mapbox` org are deprecated and only `@mapbox/node-pre-gyp` will see updates going forward. To upgrade to the new package do:
```
npm uninstall node-pre-gyp --save
npm install @mapbox/node-pre-gyp --save
```
### Features
- A command line tool called `node-pre-gyp` that can install your package's C++ module from a binary.
- A variety of developer targeted commands for packaging, testing, and publishing binaries.
- A JavaScript module that can dynamically require your installed binary: `require('@mapbox/node-pre-gyp').find`
For a hello world example of a module packaged with `node-pre-gyp` see <https://github.com/springmeyer/node-addon-example> and [the wiki ](https://github.com/mapbox/node-pre-gyp/wiki/Modules-using-node-pre-gyp) for real world examples.
## Credits
- The module is modeled after [node-gyp](https://github.com/Tootallnate/node-gyp) by [@Tootallnate](https://github.com/Tootallnate)
- Motivation for initial development came from [@ErisDS](https://github.com/ErisDS) and the [Ghost Project](https://github.com/TryGhost/Ghost).
- Development is sponsored by [Mapbox](https://www.mapbox.com/)
## FAQ
See the [Frequently Asked Questions](https://github.com/mapbox/node-pre-gyp/wiki/FAQ).
## Depends
- Node.js >= v8.x
## Install
`node-pre-gyp` is designed to be installed as a local dependency of your Node.js C++ addon and accessed like:
./node_modules/.bin/node-pre-gyp --help
But you can also install it globally:
npm install @mapbox/node-pre-gyp -g
## Usage
### Commands
View all possible commands:
node-pre-gyp --help
- clean - Remove the entire folder containing the compiled .node module
- install - Install pre-built binary for module
- reinstall - Run "clean" and "install" at once
- build - Compile the module by dispatching to node-gyp or nw-gyp
- rebuild - Run "clean" and "build" at once
- package - Pack binary into tarball
- testpackage - Test that the staged package is valid
- publish - Publish pre-built binary
- unpublish - Unpublish pre-built binary
- info - Fetch info on published binaries
You can also chain commands:
node-pre-gyp clean build unpublish publish info
### Options
Options include:
- `-C/--directory`: run the command in this directory
- `--build-from-source`: build from source instead of using pre-built binary
- `--update-binary`: reinstall by replacing previously installed local binary with remote binary
- `--runtime=node-webkit`: customize the runtime: `node`, `electron` and `node-webkit` are the valid options
- `--fallback-to-build`: fallback to building from source if pre-built binary is not available
- `--target=0.4.0`: Pass the target node or node-webkit version to compile against
- `--target_arch=ia32`: Pass the target arch and override the host `arch`. Any value that is [supported by Node.js](https://nodejs.org/api/os.html#osarch) is valid.
- `--target_platform=win32`: Pass the target platform and override the host `platform`. Valid values are `linux`, `darwin`, `win32`, `sunos`, `freebsd`, `openbsd`, and `aix`.
Both `--build-from-source` and `--fallback-to-build` can be passed alone or they can provide values. You can pass `--fallback-to-build=false` to override the option as declared in package.json. In addition to being able to pass `--build-from-source` you can also pass `--build-from-source=myapp` where `myapp` is the name of your module.
For example: `npm install --build-from-source=myapp`. This is useful if:
- `myapp` is referenced in the package.json of a larger app and therefore `myapp` is being installed as a dependency with `npm install`.
- The larger app also depends on other modules installed with `node-pre-gyp`
- You only want to trigger a source compile for `myapp` and the other modules.
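The effect of `--build-from-source=myapp` can be sketched in plain JavaScript (an illustrative simplification with a hypothetical function name, not node-pre-gyp's actual code):

```javascript
// Decide whether a module should be compiled from source.
// `optionValue` is the parsed --build-from-source value: true when the
// flag is passed bare, a module name when passed as
// --build-from-source=myapp, and undefined when the flag is absent.
function shouldBuildFromSource(optionValue, moduleName) {
  if (optionValue === true) return true;       // bare flag: always compile
  if (typeof optionValue === 'string') {
    return optionValue === moduleName;         // named flag: compile only that module
  }
  return false;                                // no flag: prefer the pre-built binary
}
```

With this logic, `npm install --build-from-source=myapp` triggers a source compile for `myapp` while sibling node-pre-gyp dependencies still install from binaries.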
### Configuring
This is a guide to configuring your module to use node-pre-gyp.
#### 1) Add new entries to your `package.json`
- Add `@mapbox/node-pre-gyp` to `dependencies`
- Add `aws-sdk` as a `devDependency`
- Add a custom `install` script
- Declare a `binary` object
This looks like:
```js
"dependencies" : {
"@mapbox/node-pre-gyp": "1.x"
},
"devDependencies": {
"aws-sdk": "2.x"
}
"scripts": {
"install": "node-pre-gyp install --fallback-to-build"
},
"binary": {
"module_name": "your_module",
"module_path": "./lib/binding/",
"host": "https://your_module.s3-us-west-1.amazonaws.com"
}
```
For a full example see [node-addon-examples's package.json](https://github.com/springmeyer/node-addon-example/blob/master/package.json).
Let's break this down:
- Dependencies need to list `node-pre-gyp`
- Your devDependencies should list `aws-sdk` so that you can run `node-pre-gyp publish` locally or on a CI system. We recommend using `devDependencies` only, since `aws-sdk` is large and not needed for `node-pre-gyp install`, which only uses http to fetch binaries
- Your `scripts` section should override the `install` target with `"install": "node-pre-gyp install --fallback-to-build"`. This allows node-pre-gyp to be used instead of the default npm behavior of always source compiling with `node-gyp` directly.
- Your package.json should contain a `binary` section describing key properties you provide to allow node-pre-gyp to package optimally. They are detailed below.
Note: in the past we recommended putting `@mapbox/node-pre-gyp` in the `bundledDependencies`, but we no longer recommend this. In the past there were npm bugs (with node versions 0.10.x) that could lead to node-pre-gyp not being available at the right time during install (unless we bundled). This should no longer be the case. Also, for a time we recommended using `"preinstall": "npm install @mapbox/node-pre-gyp"` as an alternative method to avoid needing to bundle. But this did not behave predictably across all npm versions - see https://github.com/mapbox/node-pre-gyp/issues/260 for the details. So we do not recommend using `preinstall` to install `@mapbox/node-pre-gyp`. More history on this at https://github.com/strongloop/fsevents/issues/157#issuecomment-265545908.
##### The `binary` object has three required properties
###### module_name
The name of your native node module. This value must:
- Match the name passed to [the NODE_MODULE macro](http://nodejs.org/api/addons.html#addons_hello_world)
- Must be a valid C variable name (e.g. it cannot contain `-`)
- Should not include the `.node` extension.
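The rules above can be expressed as a small check (illustrative only; this is not node-pre-gyp's own validation code):

```javascript
// A module_name is acceptable when it is a valid C identifier
// (letters, digits, underscores, not starting with a digit) and
// does not carry the .node extension.
function isValidModuleName(name) {
  if (/\.node$/.test(name)) return false;
  return /^[A-Za-z_][A-Za-z0-9_]*$/.test(name);
}
```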
###### module_path
The location your native module is placed after a build. This should be an empty directory without other JavaScript files. This entire directory will be packaged in the binary tarball. When installing from a remote package this directory will be overwritten with the contents of the tarball.
Note: This property supports variables based on [Versioning](#versioning).
###### host
A url to the remote location where you've published tarball binaries (must be `https` not `http`).
It is highly recommended that you use Amazon S3. The reasons are:
- Various node-pre-gyp commands like `publish` and `info` only work with an S3 host.
- S3 is a very solid hosting platform for distributing large files.
- We provide detailed documentation for using [S3 hosting](#s3-hosting) with node-pre-gyp.
Why then not require S3? Because while some applications using node-pre-gyp need to distribute binaries as large as 20-30 MB, others might have very small binaries and might wish to store them in a GitHub repo. This is not recommended, but if an author really wants to host in a non-S3 location then it should be possible.
It should also be mentioned that there is an optional and entirely separate npm module called [node-pre-gyp-github](https://github.com/bchr02/node-pre-gyp-github) which is intended to complement node-pre-gyp and be installed along with it. It provides the ability to store and publish your binaries within your repository's GitHub Releases if you would rather not use S3 directly. Installation and usage instructions can be found [here](https://github.com/bchr02/node-pre-gyp-github), but the basic premise is that instead of using the `node-pre-gyp publish` command you would use `node-pre-gyp-github publish`.
##### The `binary` object other optional S3 properties
If you are not using a standard s3 path like `bucket_name.s3(.-)region.amazonaws.com`, you might get an error on `publish` because node-pre-gyp extracts the region and bucket from the `host` url. For example, you may have an on-premises s3-compatible storage server, or may have configured a specific dns redirecting to an s3 endpoint. In these cases, you can explicitly set the `region` and `bucket` properties to tell node-pre-gyp to use these values instead of guessing from the `host` property. The following values can be used in the `binary` section:
###### host
The url to the remote server root location (must be `https` not `http`).
###### bucket
The bucket name where your tarball binaries should be located.
###### region
Your S3 server region.
###### s3ForcePathStyle
Set `s3ForcePathStyle` to true if the endpoint url should not be prefixed with the bucket name. If false (default), the server endpoint would be constructed as `bucket_name.your_server.com`.
##### The `binary` object has optional properties
###### remote_path
It **is recommended** that you customize this property. This is an extra path to use for publishing and finding remote tarballs. The default value for `remote_path` is `""` meaning that if you do not provide it then all packages will be published at the base of the `host`. It is recommended to provide a value like `./{name}/v{version}` to help organize remote packages in the case that you choose to publish multiple node addons to the same `host`.
Note: This property supports variables based on [Versioning](#versioning).
###### package_name
It is **not recommended** to override this property unless you are also overriding the `remote_path`. This is the versioned name of the remote tarball containing the binary `.node` module and any supporting files you've placed inside the `module_path` directory. Unless you specify `package_name` in your `package.json` then it defaults to `{module_name}-v{version}-{node_abi}-{platform}-{arch}.tar.gz` which allows your binary to work across node versions, platforms, and architectures. If you are using `remote_path` that is also versioned by `./{module_name}/v{version}` then you could remove these variables from the `package_name` and just use: `{node_abi}-{platform}-{arch}.tar.gz`. Then your remote tarball will be looked up at, for example, `https://example.com/your-module/v0.1.0/node-v11-linux-x64.tar.gz`.
Avoiding the version of your module in the `package_name` and instead only embedding in a directory name can be useful when you want to make a quick tag of your module that does not change any C++ code. In this case you can just copy binaries to the new version behind the scenes like:
```sh
aws s3 sync --acl public-read s3://mapbox-node-binary/sqlite3/v3.0.3/ s3://mapbox-node-binary/sqlite3/v3.0.4/
```
Note: This property supports variables based on [Versioning](#versioning).
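The `{variable}` substitution behind `package_name` and `remote_path` can be sketched like this (a simplified illustration; node-pre-gyp's real versioning logic also computes values such as `{node_abi}` for you):

```javascript
// Expand {variable} placeholders in a versioned binary property.
function expandTemplate(template, vars) {
  return template.replace(/\{([a-z_]+)\}/g, function (match, key) {
    return key in vars ? vars[key] : match;  // leave unknown variables as-is
  });
}

var packageName = expandTemplate(
  '{module_name}-v{version}-{node_abi}-{platform}-{arch}.tar.gz',
  { module_name: 'your_module', version: '0.1.0', node_abi: 'node-v11', platform: 'linux', arch: 'x64' }
);
// packageName === 'your_module-v0.1.0-node-v11-linux-x64.tar.gz'
```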
#### 2) Add a new target to binding.gyp
`node-pre-gyp` calls out to `node-gyp` to compile the module and passes variables along like [module_name](#module_name) and [module_path](#module_path).
A new target must be added to `binding.gyp` that moves the compiled `.node` module from `./build/Release/module_name.node` into the directory specified by `module_path`.
Add a target like this at the end of your `targets` list:
```js
{
"target_name": "action_after_build",
"type": "none",
"dependencies": [ "<(module_name)" ],
"copies": [
{
"files": [ "<(PRODUCT_DIR)/<(module_name).node" ],
"destination": "<(module_path)"
}
]
}
```
For a full example see [node-addon-example's binding.gyp](https://github.com/springmeyer/node-addon-example/blob/2ff60a8ded7f042864ad21db00c3a5a06cf47075/binding.gyp).
#### 3) Dynamically require your `.node`
Inside the main js file that requires your addon module you are likely currently doing:
```js
var binding = require('../build/Release/binding.node');
```
or:
```js
var bindings = require('./bindings')
```
Change those lines to:
```js
var binary = require('@mapbox/node-pre-gyp');
var path = require('path');
var binding_path = binary.find(path.resolve(path.join(__dirname,'./package.json')));
var binding = require(binding_path);
```
For a full example see [node-addon-example's index.js](https://github.com/springmeyer/node-addon-example/blob/2ff60a8ded7f042864ad21db00c3a5a06cf47075/index.js#L1-L4)
#### 4) Build and package your app
Now build your module from source:
npm install --build-from-source
The `--build-from-source` flag tells `node-pre-gyp` to not look for a remote package and instead dispatch to node-gyp to build.
`node-pre-gyp` should now also be installed as a local dependency so the command line tool it offers can be found at `./node_modules/.bin/node-pre-gyp`.
#### 5) Test
Now `npm test` should work just as it did before.
#### 6) Publish the tarball
Then package your app:
./node_modules/.bin/node-pre-gyp package
Once packaged, now you can publish:
./node_modules/.bin/node-pre-gyp publish
Currently the `publish` command pushes your binary to S3. This requires:
- You have installed `aws-sdk` with `npm install aws-sdk`
- You have created a bucket already.
- The `host` points to an S3 http or https endpoint.
- You have configured node-pre-gyp to read your S3 credentials (see [S3 hosting](#s3-hosting) for details).
You can also host your binaries elsewhere. To do this requires:
- You manually publish the binary created by the `package` command to an `https` endpoint
- Ensure that the `host` value points to your custom `https` endpoint.
#### 7) Automate builds
Now you need to publish builds for all the platforms and node versions you wish to support. This is best automated.
- See [Appveyor Automation](#appveyor-automation) for how to auto-publish builds on Windows.
- See [Travis Automation](#travis-automation) for how to auto-publish builds on OS X and Linux.
#### 8) You're done!
Now publish your module to the npm registry. Users will now be able to install your module from a binary.
What will happen is this:
1. `npm install <your package>` will pull from the npm registry
2. npm will run the `install` script which will call out to `node-pre-gyp`
3. `node-pre-gyp` will fetch the binary `.node` module and unpack in the right place
4. Assuming that all worked, you are done
If a binary was not available for a given platform and `--fallback-to-build` was used then `node-gyp rebuild` will be called to try to source compile the module.
#### 9) One more option
It may be that you want to work with two s3 buckets, one for staging and one for production; this
arrangement makes it less likely to accidentally overwrite a production binary. It also allows the production
environment to have more restrictive permissions than staging while still enabling publishing when
developing and testing.
The binary.host property can be set at execution time. In order to do so all of the following conditions
must be true.
- binary.host is falsy or not present
- binary.staging_host is not empty
- binary.production_host is not empty
If any of these checks fail then the operation will not perform execution time determination of the s3 target.
If the command being executed is either "publish" or "unpublish" then the default is set to `binary.staging_host`. In all other cases
the default is `binary.production_host`.
The command-line options `--s3_host=staging` or `--s3_host=production` override the default. If `s3_host`
is present and not `staging` or `production` an exception is thrown.
This allows installing from staging by specifying `--s3_host=staging`. And it requires specifying
`--s3_host=production` in order to publish to, or unpublish from, production, making accidental errors less likely.
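The rules above amount to the following selection logic (an illustrative sketch with hypothetical names, not node-pre-gyp's actual implementation):

```javascript
// Resolve the effective S3 host from the binary config, the command
// being run, and an optional --s3_host value.
function resolveS3Host(binary, command, s3HostOption) {
  var eligible = !binary.host && binary.staging_host && binary.production_host;
  if (!eligible) return binary.host;  // conditions not met: no execution-time switching
  if (s3HostOption === 'staging') return binary.staging_host;
  if (s3HostOption === 'production') return binary.production_host;
  if (s3HostOption) throw new Error('invalid --s3_host: ' + s3HostOption);
  // publish/unpublish default to staging; every other command to production
  return (command === 'publish' || command === 'unpublish')
    ? binary.staging_host
    : binary.production_host;
}
```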
## Node-API Considerations
[Node-API](https://nodejs.org/api/n-api.html#n_api_node_api), which was previously known as N-API, is an ABI-stable alternative to previous technologies such as [nan](https://github.com/nodejs/nan) which are tied to a specific Node runtime engine. Node-API is Node runtime engine agnostic and guarantees modules created today will continue to run, without changes, into the future.
Using `node-pre-gyp` with Node-API projects requires a handful of additional configuration values and imposes some additional requirements.
The most significant difference is that a Node-API module can be coded to target multiple Node-API versions. Therefore, a Node-API module must declare in its `package.json` file which Node-API versions the module is designed to run against. In addition, since multiple builds may be required for a single module, path and file names must be specified in a way that avoids naming conflicts.
### The `napi_versions` array property
A Node-API module must declare in its `package.json` file the Node-API versions the module is intended to support. This is accomplished by including a `napi_versions` array property in the `binary` object. For example:
```js
"binary": {
"module_name": "your_module",
"module_path": "your_module_path",
"host": "https://your_bucket.s3-us-west-1.amazonaws.com",
"napi_versions": [1,3]
}
```
If the `napi_versions` array property is *not* present, `node-pre-gyp` operates as it always has. Including the `napi_versions` array property instructs `node-pre-gyp` that this is a Node-API module build.
When the `napi_versions` array property is present, `node-pre-gyp` fires off multiple operations, one for each of the Node-API versions in the array. In the example above, two operations are initiated, one for Node-API version 1 and a second for Node-API version 3. How this version number is communicated is described next.
### The `napi_build_version` value
For each of the Node-API module operations `node-pre-gyp` initiates, it ensures that the `napi_build_version` is set appropriately.
This value is of importance in two areas:
1. The C/C++ code which needs to know against which Node-API version it should compile.
2. `node-pre-gyp` itself which must assign appropriate path and file names to avoid collisions.
### Defining `NAPI_VERSION` for the C/C++ code
The `napi_build_version` value is communicated to the C/C++ code by adding this code to the `binding.gyp` file:
```
"defines": [
"NAPI_VERSION=<(napi_build_version)",
]
```
This ensures that `NAPI_VERSION`, an integer value, is declared appropriately to the C/C++ code for each build.
> Note that earlier versions of this document recommended defining the symbol `NAPI_BUILD_VERSION`. `NAPI_VERSION` is preferred because it is used by the Node-API C/C++ headers to configure the specific Node-API version being requested.
### Path and file naming requirements in `package.json`
Since `node-pre-gyp` fires off multiple operations for each request, it is essential that path and file names be created in such a way as to avoid collisions. This is accomplished by imposing additional path and file naming requirements.
Specifically, when performing Node-API builds, the `{napi_build_version}` text configuration value *must* be present in the `module_path` property. In addition, the `{napi_build_version}` text configuration value *must* be present in either the `remote_path` or `package_name` property. (No problem if it's in both.)
Here's an example:
```js
"binary": {
"module_name": "your_module",
"module_path": "./lib/binding/napi-v{napi_build_version}",
"remote_path": "./{module_name}/v{version}/{configuration}/",
"package_name": "{platform}-{arch}-napi-v{napi_build_version}.tar.gz",
"host": "https://your_bucket.s3-us-west-1.amazonaws.com",
"napi_versions": [1,3]
}
```
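The naming requirement can be checked mechanically (an illustrative helper, not part of node-pre-gyp):

```javascript
// Verify that {napi_build_version} appears in module_path, and in
// remote_path or package_name (either or both is fine).
function napiNamingOk(binary) {
  var token = '{napi_build_version}';
  var inModulePath = (binary.module_path || '').indexOf(token) !== -1;
  var inRemoteOrPackage =
    (binary.remote_path || '').indexOf(token) !== -1 ||
    (binary.package_name || '').indexOf(token) !== -1;
  return inModulePath && inRemoteOrPackage;
}
```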
## Supporting both Node-API and NAN builds
You may have a legacy native add-on that you wish to continue supporting for those versions of Node that do not support Node-API, as you add Node-API support for later Node versions. This can be accomplished by specifying the `node_napi_label` configuration value in the package.json `binary.package_name` property.
Placing the configuration value `node_napi_label` in the package.json `binary.package_name` property instructs `node-pre-gyp` to build all viable Node-API binaries supported by the current Node instance. If the current Node instance does not support Node-API, `node-pre-gyp` will request a traditional, non-Node-API build.
The configuration value `node_napi_label` is set by `node-pre-gyp` to the type of build created, `napi` or `node`, and the version number. For Node-API builds, the string contains the Node-API version and has values like `napi-v3`. For traditional, non-Node-API builds, the string contains the ABI version with values like `node-v46`.
Here's how the `binary` configuration above might be changed to support both Node-API and NAN builds:
```js
"binary": {
"module_name": "your_module",
"module_path": "./lib/binding/{node_napi_label}",
"remote_path": "./{module_name}/v{version}/{configuration}/",
"package_name": "{platform}-{arch}-{node_napi_label}.tar.gz",
"host": "https://your_bucket.s3-us-west-1.amazonaws.com",
"napi_versions": [1,3]
}
```
The C/C++ symbol `NAPI_VERSION` can be used to distinguish Node-API and non-Node-API builds. The value of `NAPI_VERSION` is set to the integer Node-API version for Node-API builds and is set to `0` for non-Node-API builds.
For example:
```C
#if NAPI_VERSION
// Node-API code goes here
#else
// NAN code goes here
#endif
```
### Two additional configuration values
The following two configuration values, which were implemented in previous versions of `node-pre-gyp`, continue to exist, but have been replaced by the `node_napi_label` configuration value described above.
1. `napi_version` If Node-API is supported by the currently executing Node instance, this value is the Node-API version number supported by Node. If Node-API is not supported, this value is an empty string.
2. `node_abi_napi` If the value returned for `napi_version` is non-empty, this value is `'napi'`. If the value returned for `napi_version` is empty, this value is the value returned for `node_abi`.
These values are present for use in the `binding.gyp` file and may be used as `{napi_version}` and `{node_abi_napi}` for text substitution in the `binary` properties of the `package.json` file.
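The relationship between these two legacy values can be sketched as (illustrative only):

```javascript
// node_abi_napi is 'napi' when a Node-API version is available,
// otherwise it falls back to the regular node_abi value.
function nodeAbiNapi(napiVersion, nodeAbi) {
  return napiVersion !== '' ? 'napi' : nodeAbi;
}
```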
## S3 Hosting
You can host wherever you choose but S3 is cheap, `node-pre-gyp publish` expects it, and S3 can be integrated well with [Travis.ci](http://travis-ci.org) to automate builds for OS X and Ubuntu, and with [Appveyor](http://appveyor.com) to automate builds for Windows. Here is an approach to do this:
First, get setup locally and test the workflow:
#### 1) Create an S3 bucket
And have your **key** and **secret key** ready for writing to the bucket.
It is recommended to create an IAM user with a policy that only gives permissions to the specific bucket you plan to publish to. This can be done in the [IAM console](https://console.aws.amazon.com/iam/) by: 1) adding a new user, 2) choosing `Attach User Policy`, 3) Using the `Policy Generator`, 4) selecting `Amazon S3` for the service, 5) adding the actions: `DeleteObject`, `GetObject`, `GetObjectAcl`, `ListBucket`, `HeadBucket`, `PutObject`, `PutObjectAcl`, 6) adding an ARN of `arn:aws:s3:::bucket/*` (replacing `bucket` with your bucket name), and finally 7) clicking `Add Statement` and saving the policy. It should generate a policy like:
```js
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "objects",
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:GetObjectAcl",
"s3:GetObject",
"s3:DeleteObject",
"s3:PutObjectAcl"
],
"Resource": "arn:aws:s3:::your-bucket-name/*"
},
{
"Sid": "bucket",
"Effect": "Allow",
"Action": "s3:ListBucket",
"Resource": "arn:aws:s3:::your-bucket-name"
},
{
"Sid": "buckets",
"Effect": "Allow",
"Action": "s3:HeadBucket",
"Resource": "*"
}
]
}
```
#### 2) Install node-pre-gyp
Either install it globally:
npm install node-pre-gyp -g
Or put the local version on your PATH
export PATH=`pwd`/node_modules/.bin/:$PATH
#### 3) Configure AWS credentials
It is recommended to configure the AWS JS SDK v2 used internally by `node-pre-gyp` by setting these environment variables:
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
You can also use the `Shared Config File` mentioned [in the AWS JS SDK v2 docs](https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/configuring-the-jssdk.html).
#### 4) Package and publish your build
Install the `aws-sdk`:
npm install aws-sdk
Then publish:
node-pre-gyp package publish
Note: if you hit an error like `Hostname/IP doesn't match certificate's altnames` it may mean that you need to provide the `region` option in your config.
## Appveyor Automation
[Appveyor](http://www.appveyor.com/) can build binaries and publish the results per commit and supports:
- Windows Visual Studio 2013 and related compilers
- Both 64 bit (x64) and 32 bit (x86) build configurations
- Multiple Node.js versions
For an example of doing this see [node-sqlite3's appveyor.yml](https://github.com/mapbox/node-sqlite3/blob/master/appveyor.yml).
Below is a guide to getting set up:
#### 1) Create a free Appveyor account
Go to https://ci.appveyor.com/signup/free and sign in with your GitHub account.
#### 2) Create a new project
Go to https://ci.appveyor.com/projects/new and select the GitHub repo for your module.
#### 3) Add appveyor.yml and push it
Once you have committed an `appveyor.yml` ([appveyor.yml reference](http://www.appveyor.com/docs/appveyor-yml)) to your GitHub repo and pushed it AppVeyor should automatically start building your project.
#### 4) Create secure variables
Encrypt your S3 AWS keys by going to <https://ci.appveyor.com/tools/encrypt> and hitting the `encrypt` button.
Then paste the result into your `appveyor.yml`:
```yml
environment:
AWS_ACCESS_KEY_ID:
secure: Dn9HKdLNYvDgPdQOzRq/DqZ/MPhjknRHB1o+/lVU8MA=
AWS_SECRET_ACCESS_KEY:
secure: W1rwNoSnOku1r+28gnoufO8UA8iWADmL1LiiwH9IOkIVhDTNGdGPJqAlLjNqwLnL
```
NOTE: keys are per account, not per repo (this differs from Travis, where keys are per repo and unrelated to the account used to encrypt them).
#### 5) Hook up publishing
Just put `node-pre-gyp package publish` in your `appveyor.yml` after `npm install`.
#### 6) Publish when you want
You might wish to publish binaries only on a specific commit. To do this you could borrow from the [Travis CI idea of commit keywords](http://about.travis-ci.org/docs/user/how-to-skip-a-build/) and add special handling for commit messages with `[publish binary]`:
```bat
SET CM=%APPVEYOR_REPO_COMMIT_MESSAGE%
if not "%CM%" == "%CM:[publish binary]=%" node-pre-gyp --msvs_version=2013 publish
```
If your commit message contains special characters (e.g. `&`) this method might fail. An alternative is to use PowerShell, which gives you additional possibilities, like ignoring case by using `ToLower()`:
```yml
ps: if($env:APPVEYOR_REPO_COMMIT_MESSAGE.ToLower().Contains('[publish binary]')) { node-pre-gyp --msvs_version=2013 publish }
```
Remember this publishing is not the same as `npm publish`. We're just talking about the binary module here and not your entire npm package.
## Travis Automation
[Travis](https://travis-ci.org/) can push to S3 after a successful build and supports both:
- Ubuntu Precise and OS X (64 bit)
- Multiple Node.js versions
For an example of doing this see [node-addon-example's .travis.yml](https://github.com/springmeyer/node-addon-example/blob/2ff60a8ded7f042864ad21db00c3a5a06cf47075/.travis.yml).
Note: if you need 32 bit binaries, this can be done from a 64 bit Travis machine. See [the node-sqlite3 scripts for an example of doing this](https://github.com/mapbox/node-sqlite3/blob/bae122aa6a2b8a45f6b717fab24e207740e32b5d/scripts/build_against_node.sh#L54-L74).
Below is a guide to getting set up:
#### 1) Install the Travis gem
```sh
gem install travis
```
#### 2) Create secure variables
Make sure you run this command from within the directory of your module.
Use `travis-encrypt` like:
```sh
travis encrypt AWS_ACCESS_KEY_ID=${node_pre_gyp_accessKeyId}
travis encrypt AWS_SECRET_ACCESS_KEY=${node_pre_gyp_secretAccessKey}
```
Then put those values in your `.travis.yml` like:
```yaml
env:
global:
- secure: F+sEL/v56CzHqmCSSES4pEyC9NeQlkoR0Gs/ZuZxX1ytrj8SKtp3MKqBj7zhIclSdXBz4Ev966Da5ctmcTd410p0b240MV6BVOkLUtkjZJyErMBOkeb8n8yVfSoeMx8RiIhBmIvEn+rlQq+bSFis61/JkE9rxsjkGRZi14hHr4M=
- secure: o2nkUQIiABD139XS6L8pxq3XO5gch27hvm/gOdV+dzNKc/s2KomVPWcOyXNxtJGhtecAkABzaW8KHDDi5QL1kNEFx6BxFVMLO8rjFPsMVaBG9Ks6JiDQkkmrGNcnVdxI/6EKTLHTH5WLsz8+J7caDBzvKbEfTux5EamEhxIWgrI=
```
More details on Travis encryption at http://about.travis-ci.org/docs/user/encryption-keys/.
#### 3) Hook up publishing
Just put `node-pre-gyp package publish` in your `.travis.yml` after `npm install`.
##### OS X publishing
If you want binaries for OS X in addition to Linux, you can enable [multi-os for Travis](http://docs.travis-ci.com/user/multi-os/#Setting-.travis.yml).
Use a configuration like:
```yml
language: cpp
os:
- linux
- osx
env:
matrix:
- NODE_VERSION="4"
- NODE_VERSION="6"
before_install:
- rm -rf ~/.nvm/ && git clone --depth 1 https://github.com/creationix/nvm.git ~/.nvm
- source ~/.nvm/nvm.sh
- nvm install $NODE_VERSION
- nvm use $NODE_VERSION
```
See [Travis OS X Gotchas](#travis-os-x-gotchas) for why we replace `language: node_js` and `node_js:` sections with `language: cpp` and a custom matrix.
Also create platform-specific sections for any dependencies that need installing. For example, if you need libpng:
```yml
- if [ $(uname -s) == 'Linux' ]; then apt-get install libpng-dev; fi;
- if [ $(uname -s) == 'Darwin' ]; then brew install libpng; fi;
```
For detailed multi-OS examples see [node-mapnik](https://github.com/mapnik/node-mapnik/blob/master/.travis.yml) and [node-sqlite3](https://github.com/mapbox/node-sqlite3/blob/master/.travis.yml).
##### Travis OS X Gotchas
First, unlike the Travis Linux machines, the OS X machines do not put `node-pre-gyp` on PATH by default. To do so you will need to:
```sh
export PATH=$(pwd)/node_modules/.bin:${PATH}
```
Second, the OS X machines do not support using a matrix for installing different Node.js versions. So you need to bootstrap the installation of Node.js in a cross-platform way.
By doing:
```yml
env:
matrix:
- NODE_VERSION="4"
- NODE_VERSION="6"
before_install:
- rm -rf ~/.nvm/ && git clone --depth 1 https://github.com/creationix/nvm.git ~/.nvm
- source ~/.nvm/nvm.sh
- nvm install $NODE_VERSION
- nvm use $NODE_VERSION
```
This env matrix recreates the behavior you would otherwise get from:
```yml
node_js:
- "4"
- "6"
```
#### 4) Publish when you want
You might wish to publish binaries only on a specific commit. To do this you could borrow from the [Travis CI idea of commit keywords](http://about.travis-ci.org/docs/user/how-to-skip-a-build/) and add special handling for commit messages with `[publish binary]`:
```sh
COMMIT_MESSAGE=$(git log --format=%B --no-merges -n 1 | tr -d '\n')
if [[ ${COMMIT_MESSAGE} =~ "[publish binary]" ]]; then node-pre-gyp publish; fi;
```
Then you can trigger new binaries to be built like:
```sh
git commit -a -m "[publish binary]"
```
Or, if you don't have any changes to make, simply run:
```sh
git commit --allow-empty -m "[publish binary]"
```
WARNING: if you are working in a pull request and publishing binaries from there then you will want to avoid double publishing when Travis CI builds both the `push` and `pr`. You only want to run the publish on the `push` commit. See https://github.com/Project-OSRM/node-osrm/blob/8eb837abe2e2e30e595093d16e5354bc5c573575/scripts/is_pr_merge.sh which is called from https://github.com/Project-OSRM/node-osrm/blob/8eb837abe2e2e30e595093d16e5354bc5c573575/scripts/publish.sh for an example of how to do this.
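One way to sketch such a guard in a publish script (assuming Travis CI's documented `TRAVIS_PULL_REQUEST` variable, which is `false` on push builds and a PR number on PR builds):

```sh
# Publish only on push builds; skip pull request builds so the same
# commit is not published twice. This is a sketch, not the OSRM script.
is_push_build() {
  [ "${TRAVIS_PULL_REQUEST:-false}" = "false" ]
}

if is_push_build; then
  echo "push build: safe to publish"
else
  echo "pull request build: skipping publish"
fi
```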
Remember this publishing is not the same as `npm publish`. We're just talking about the binary module here and not your entire npm package. To automate the publishing of your entire package to npm on Travis see http://about.travis-ci.org/docs/user/deployment/npm/
# Versioning
The `binary` properties of `module_path`, `remote_path`, and `package_name` support variable substitution. The strings are evaluated by `node-pre-gyp` depending on your system and any custom build flags you passed.
- `node_abi`: The node C++ `ABI` number. This value is available in JavaScript as `process.versions.modules` as of [`>= v0.10.4 >= v0.11.7`](https://github.com/joyent/node/commit/ccabd4a6fa8a6eb79d29bc3bbe9fe2b6531c2d8e) and in C++ as the `NODE_MODULE_VERSION` define much earlier. For versions of Node before this was available we fall back to the V8 major and minor version.
- `platform` matches node's `process.platform` like `linux`, `darwin`, and `win32` unless the user passed the `--target_platform` option to override.
- `arch` matches node's `process.arch` like `x64` or `ia32` unless the user passes the `--target_arch` option to override.
- `libc` matches `require('detect-libc').family` like `glibc` or `musl` unless the user passes the `--target_libc` option to override.
- `configuration` - Either 'Release' or 'Debug' depending on whether `--debug` is passed during the build.
- `module_name` - the `binary.module_name` attribute from `package.json`.
- `version` - the semver `version` value for your module from `package.json` (NOTE: ignores the `semver.build` property).
- `major`, `minor`, `patch`, and `prerelease` match the individual semver values for your module's `version`.
- `build` - the semver `build` value. For example, it would be `this.that` if your package.json `version` was `v1.0.0+this.that`
- `prerelease` - the semver `prerelease` value. For example it would be `alpha.beta` if your package.json `version` was `v1.0.0-alpha.beta`
The options are visible in the code at <https://github.com/mapbox/node-pre-gyp/blob/612b7bca2604508d881e1187614870ba19a7f0c5/lib/util/versioning.js#L114-L127>
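As a rough illustration of the substitution (a minimal sketch, not node-pre-gyp's actual `versioning.js` logic; the variable values are made up):

```javascript
// Minimal sketch of {variable} substitution in a binary template string.
// Unknown variables are left untouched; values here are illustrative.
function substitute(template, vars) {
  return template.replace(/\{([a-z_]+)\}/g, (match, name) =>
    name in vars ? String(vars[name]) : match);
}

const vars = {
  module_name: 'my_addon',
  version: '1.2.3',
  node_abi: 'node-v57',
  platform: 'linux',
  arch: 'x64'
};

console.log(substitute(
  '{module_name}-v{version}-{node_abi}-{platform}-{arch}.tar.gz', vars));
// prints my_addon-v1.2.3-node-v57-linux-x64.tar.gz
```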
# Download binary files from a mirror
S3 is often unreachable from mainland China. The `npm` config argument `--{module_name}_binary_host_mirror` downloads binary files through a mirror instead; `-` in `module_name` is replaced with `_`.
e.g.: Install [v8-profiler](https://www.npmjs.com/package/v8-profiler) from `npm`.
```bash
$ npm install v8-profiler --profiler_binary_host_mirror=https://npm.taobao.org/mirrors/node-inspector/
```
e.g.: Install [canvas-prebuilt](https://www.npmjs.com/package/canvas-prebuilt) from `npm`.
```bash
$ npm install canvas-prebuilt --canvas_prebuilt_binary_host_mirror=https://npm.taobao.org/mirrors/canvas-prebuilt/
```
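The `-` to `_` replacement can be sketched as follows (a hypothetical helper, not part of npm or node-pre-gyp; note that `module_name` is the `binary.module_name` from package.json, which may differ from the npm package name, as the `v8-profiler` example shows):

```javascript
// Build the npm flag name for a module's binary mirror: npm config keys
// use `_` where the module name uses `-`.
function mirrorFlag(moduleName) {
  return '--' + moduleName.replace(/-/g, '_') + '_binary_host_mirror';
}

console.log(mirrorFlag('canvas-prebuilt'));
// prints --canvas_prebuilt_binary_host_mirror
```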
View File
@@ -1,4 +0,0 @@
#!/usr/bin/env node
'use strict';
require('../lib/main');
View File
@@ -1,2 +0,0 @@
@echo off
node "%~dp0\node-pre-gyp" %*
View File
@@ -1,10 +0,0 @@
# Contributing
### Releasing a new version:
- Ensure tests are passing on travis and appveyor
- Run `node scripts/abi_crosswalk.js` and commit any changes
- Update the changelog
- Tag a new release like: `git tag -a v0.6.34 -m "tagging v0.6.34" && git push --tags`
- Run `npm publish`
View File
@@ -1,51 +0,0 @@
'use strict';
module.exports = exports = build;
exports.usage = 'Attempts to compile the module by dispatching to node-gyp or nw-gyp';
const napi = require('./util/napi.js');
const compile = require('./util/compile.js');
const handle_gyp_opts = require('./util/handle_gyp_opts.js');
const configure = require('./configure.js');
function do_build(gyp, argv, callback) {
handle_gyp_opts(gyp, argv, (err, result) => {
let final_args = ['build'].concat(result.gyp).concat(result.pre);
if (result.unparsed.length > 0) {
final_args = final_args.
concat(['--']).
concat(result.unparsed);
}
if (!err && result.opts.napi_build_version) {
napi.swap_build_dir_in(result.opts.napi_build_version);
}
compile.run_gyp(final_args, result.opts, (err2) => {
if (result.opts.napi_build_version) {
napi.swap_build_dir_out(result.opts.napi_build_version);
}
return callback(err2);
});
});
}
function build(gyp, argv, callback) {
// Form up commands to pass to node-gyp:
// We map `node-pre-gyp build` to `node-gyp configure build` so that we do not
// trigger a clean and therefore do not pay the penalty of a full recompile
if (argv.length && (argv.indexOf('rebuild') > -1)) {
argv.shift(); // remove `rebuild`
// here we map `node-pre-gyp rebuild` to `node-gyp rebuild` which internally means
// "clean + configure + build" and triggers a full recompile
compile.run_gyp(['clean'], {}, (err3) => {
if (err3) return callback(err3);
configure(gyp, argv, (err4) => {
if (err4) return callback(err4);
return do_build(gyp, argv, callback);
});
});
} else {
return do_build(gyp, argv, callback);
}
}
View File
@@ -1,31 +0,0 @@
'use strict';
module.exports = exports = clean;
exports.usage = 'Removes the entire folder containing the compiled .node module';
const rm = require('rimraf');
const exists = require('fs').exists || require('path').exists;
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const path = require('path');
function clean(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
const to_delete = opts.module_path;
if (!to_delete) {
return callback(new Error('module_path is empty, refusing to delete'));
} else if (path.normalize(to_delete) === path.normalize(process.cwd())) {
return callback(new Error('module_path is not set, refusing to delete'));
} else {
exists(to_delete, (found) => {
if (found) {
if (!gyp.opts.silent_clean) console.log('[' + package_json.name + '] Removing "%s"', to_delete);
return rm(to_delete, callback);
}
return callback();
});
}
}
View File
@@ -1,52 +0,0 @@
'use strict';
module.exports = exports = configure;
exports.usage = 'Attempts to configure node-gyp or nw-gyp build';
const napi = require('./util/napi.js');
const compile = require('./util/compile.js');
const handle_gyp_opts = require('./util/handle_gyp_opts.js');
function configure(gyp, argv, callback) {
handle_gyp_opts(gyp, argv, (err, result) => {
let final_args = result.gyp.concat(result.pre);
// pull select node-gyp configure options out of the npm environ
const known_gyp_args = ['dist-url', 'python', 'nodedir', 'msvs_version'];
known_gyp_args.forEach((key) => {
const val = gyp.opts[key] || gyp.opts[key.replace('-', '_')];
if (val) {
final_args.push('--' + key + '=' + val);
}
});
// --ensure=false tells node-gyp to re-install node development headers
// but it is only respected by node-gyp install, so we have to call install
// as a separate step if the user passes it
if (gyp.opts.ensure === false) {
const install_args = final_args.concat(['install', '--ensure=false']);
compile.run_gyp(install_args, result.opts, (err2) => {
if (err2) return callback(err2);
if (result.unparsed.length > 0) {
final_args = final_args.
concat(['--']).
concat(result.unparsed);
}
compile.run_gyp(['configure'].concat(final_args), result.opts, (err3) => {
return callback(err3);
});
});
} else {
if (result.unparsed.length > 0) {
final_args = final_args.
concat(['--']).
concat(result.unparsed);
}
compile.run_gyp(['configure'].concat(final_args), result.opts, (err4) => {
if (!err4 && result.opts.napi_build_version) {
napi.swap_build_dir_out(result.opts.napi_build_version);
}
return callback(err4);
});
}
});
}
View File
@@ -1,38 +0,0 @@
'use strict';
module.exports = exports = info;
exports.usage = 'Lists all published binaries (requires aws-sdk)';
const log = require('npmlog');
const versioning = require('./util/versioning.js');
const s3_setup = require('./util/s3_setup.js');
function info(gyp, argv, callback) {
const package_json = gyp.package_json;
const opts = versioning.evaluate(package_json, gyp.opts);
const config = {};
s3_setup.detect(opts, config);
const s3 = s3_setup.get_s3(config);
const s3_opts = {
Bucket: config.bucket,
Prefix: config.prefix
};
s3.listObjects(s3_opts, (err, meta) => {
if (err && err.code === 'NotFound') {
return callback(new Error('[' + package_json.name + '] Not found: https://' + s3_opts.Bucket + '.s3.amazonaws.com/' + config.prefix));
} else if (err) {
return callback(err);
} else {
log.verbose(JSON.stringify(meta, null, 1));
if (meta && meta.Contents) {
meta.Contents.forEach((obj) => {
console.log(obj.Key);
});
} else {
console.error('[' + package_json.name + '] No objects found at https://' + s3_opts.Bucket + '.s3.amazonaws.com/' + config.prefix);
}
return callback();
}
});
}
View File
@@ -1,235 +0,0 @@
'use strict';
module.exports = exports = install;
exports.usage = 'Attempts to install pre-built binary for module';
const fs = require('fs');
const path = require('path');
const log = require('npmlog');
const existsAsync = fs.exists || path.exists;
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const makeDir = require('make-dir');
// for fetching binaries
const fetch = require('node-fetch');
const tar = require('tar');
let npgVersion = 'unknown';
try {
// Read own package.json to get the current node-pre-gyp version.
const ownPackageJSON = fs.readFileSync(path.join(__dirname, '..', 'package.json'), 'utf8');
npgVersion = JSON.parse(ownPackageJSON).version;
} catch (e) {
// do nothing
}
function place_binary(uri, targetDir, opts, callback) {
log.http('GET', uri);
// Try getting version info from the currently running npm.
const envVersionInfo = process.env.npm_config_user_agent ||
'node ' + process.version;
const sanitized = uri.replace('+', '%2B');
const requestOpts = {
uri: sanitized,
headers: {
'User-Agent': 'node-pre-gyp (v' + npgVersion + ', ' + envVersionInfo + ')'
},
follow_max: 10
};
if (opts.cafile) {
try {
requestOpts.ca = fs.readFileSync(opts.cafile);
} catch (e) {
return callback(e);
}
} else if (opts.ca) {
requestOpts.ca = opts.ca;
}
const proxyUrl = opts.proxy ||
process.env.http_proxy ||
process.env.HTTP_PROXY ||
process.env.npm_config_proxy;
let agent;
if (proxyUrl) {
const ProxyAgent = require('https-proxy-agent');
agent = new ProxyAgent(proxyUrl);
log.http('download', 'proxy agent configured using: "%s"', proxyUrl);
}
fetch(sanitized, { agent })
.then((res) => {
if (!res.ok) {
throw new Error(`response status ${res.status} ${res.statusText} on ${sanitized}`);
}
const dataStream = res.body;
return new Promise((resolve, reject) => {
let extractions = 0;
const countExtractions = (entry) => {
extractions += 1;
log.info('install', 'unpacking %s', entry.path);
};
dataStream.pipe(extract(targetDir, countExtractions))
.on('error', (e) => {
reject(e);
});
dataStream.on('end', () => {
resolve(`extracted file count: ${extractions}`);
});
dataStream.on('error', (e) => {
reject(e);
});
});
})
.then((text) => {
log.info(text);
callback();
})
.catch((e) => {
log.error(`install ${e.message}`);
callback(e);
});
}
function extract(to, onentry) {
return tar.extract({
cwd: to,
strip: 1,
onentry
});
}
function extract_from_local(from, targetDir, callback) {
if (!fs.existsSync(from)) {
return callback(new Error('Cannot find file ' + from));
}
log.info('Found local file to extract from ' + from);
// extract helpers
let extractCount = 0;
function countExtractions(entry) {
extractCount += 1;
log.info('install', 'unpacking ' + entry.path);
}
function afterExtract(err) {
if (err) return callback(err);
if (extractCount === 0) {
return callback(new Error('There was a fatal problem while extracting the tarball'));
}
log.info('tarball', 'done parsing tarball');
callback();
}
fs.createReadStream(from).pipe(extract(targetDir, countExtractions))
.on('close', afterExtract)
.on('error', afterExtract);
}
function do_build(gyp, argv, callback) {
const args = ['rebuild'].concat(argv);
gyp.todo.push({ name: 'build', args: args });
process.nextTick(callback);
}
function print_fallback_error(err, opts, package_json) {
const fallback_message = ' (falling back to source compile with node-gyp)';
let full_message = '';
if (err.statusCode !== undefined) {
// If we got a network response but failed to download
// it means remote binaries are not available, so let's try to help
// the user/developer with the info to debug why
full_message = 'Pre-built binaries not found for ' + package_json.name + '@' + package_json.version;
full_message += ' and ' + opts.runtime + '@' + (opts.target || process.versions.node) + ' (' + opts.node_abi + ' ABI, ' + opts.libc + ')';
full_message += fallback_message;
log.warn('Tried to download(' + err.statusCode + '): ' + opts.hosted_tarball);
log.warn(full_message);
log.http(err.message);
} else {
// If we do not have a statusCode that means an unexpected error
// happened and prevented an http response, so we output the exact error
full_message = 'Pre-built binaries not installable for ' + package_json.name + '@' + package_json.version;
full_message += ' and ' + opts.runtime + '@' + (opts.target || process.versions.node) + ' (' + opts.node_abi + ' ABI, ' + opts.libc + ')';
full_message += fallback_message;
log.warn(full_message);
log.warn('Hit error ' + err.message);
}
}
//
// install
//
function install(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const source_build = gyp.opts['build-from-source'] || gyp.opts.build_from_source;
const update_binary = gyp.opts['update-binary'] || gyp.opts.update_binary;
const should_do_source_build = source_build === package_json.name || (source_build === true || source_build === 'true');
if (should_do_source_build) {
log.info('build', 'requesting source compile');
return do_build(gyp, argv, callback);
} else {
const fallback_to_build = gyp.opts['fallback-to-build'] || gyp.opts.fallback_to_build;
let should_do_fallback_build = fallback_to_build === package_json.name || (fallback_to_build === true || fallback_to_build === 'true');
// but allow override from npm
if (process.env.npm_config_argv) {
const cooked = JSON.parse(process.env.npm_config_argv).cooked;
const match = cooked.indexOf('--fallback-to-build');
if (match > -1 && cooked.length > match && cooked[match + 1] === 'false') {
should_do_fallback_build = false;
log.info('install', 'Build fallback disabled via npm flag: --fallback-to-build=false');
}
}
let opts;
try {
opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
} catch (err) {
return callback(err);
}
opts.ca = gyp.opts.ca;
opts.cafile = gyp.opts.cafile;
const from = opts.hosted_tarball;
const to = opts.module_path;
const binary_module = path.join(to, opts.module_name + '.node');
existsAsync(binary_module, (found) => {
if (!update_binary) {
if (found) {
console.log('[' + package_json.name + '] Success: "' + binary_module + '" already installed');
console.log('Pass --update-binary to reinstall or --build-from-source to recompile');
return callback();
}
log.info('check', 'checked for "' + binary_module + '" (not found)');
}
makeDir(to).then(() => {
const fileName = from.startsWith('file://') && from.slice('file://'.length);
if (fileName) {
extract_from_local(fileName, to, after_place);
} else {
place_binary(from, to, opts, after_place);
}
}).catch((err) => {
after_place(err);
});
function after_place(err) {
if (err && should_do_fallback_build) {
print_fallback_error(err, opts, package_json);
return do_build(gyp, argv, callback);
} else if (err) {
return callback(err);
} else {
console.log('[' + package_json.name + '] Success: "' + binary_module + '" is installed via remote');
return callback();
}
}
});
}
}
View File
@@ -1,125 +0,0 @@
'use strict';
/**
* Set the title.
*/
process.title = 'node-pre-gyp';
const node_pre_gyp = require('../');
const log = require('npmlog');
/**
* Process and execute the selected commands.
*/
const prog = new node_pre_gyp.Run({ argv: process.argv });
let completed = false;
if (prog.todo.length === 0) {
if (~process.argv.indexOf('-v') || ~process.argv.indexOf('--version')) {
console.log('v%s', prog.version);
process.exit(0);
} else if (~process.argv.indexOf('-h') || ~process.argv.indexOf('--help')) {
console.log('%s', prog.usage());
process.exit(0);
}
console.log('%s', prog.usage());
process.exit(1);
}
// if --no-color is passed
if (prog.opts && Object.hasOwnProperty.call(prog, 'color') && !prog.opts.color) {
log.disableColor();
}
log.info('it worked if it ends with', 'ok');
log.verbose('cli', process.argv);
log.info('using', process.title + '@%s', prog.version);
log.info('using', 'node@%s | %s | %s', process.versions.node, process.platform, process.arch);
/**
* Change dir if -C/--directory was passed.
*/
const dir = prog.opts.directory;
if (dir) {
const fs = require('fs');
try {
const stat = fs.statSync(dir);
if (stat.isDirectory()) {
log.info('chdir', dir);
process.chdir(dir);
} else {
log.warn('chdir', dir + ' is not a directory');
}
} catch (e) {
if (e.code === 'ENOENT') {
log.warn('chdir', dir + ' is not a directory');
} else {
log.warn('chdir', 'error during chdir() "%s"', e.message);
}
}
}
function run() {
const command = prog.todo.shift();
if (!command) {
// done!
completed = true;
log.info('ok');
return;
}
// set binary.host when appropriate. host determines the s3 target bucket.
const target = prog.setBinaryHostProperty(command.name);
if (target && ['install', 'publish', 'unpublish', 'info'].indexOf(command.name) >= 0) {
log.info('using binary.host: ' + prog.package_json.binary.host);
}
prog.commands[command.name](command.args, function(err) {
if (err) {
log.error(command.name + ' error');
log.error('stack', err.stack);
errorMessage();
log.error('not ok');
console.log(err.message);
return process.exit(1);
}
const args_array = [].slice.call(arguments, 1);
if (args_array.length) {
console.log.apply(console, args_array);
}
// now run the next command in the queue
process.nextTick(run);
});
}
process.on('exit', (code) => {
if (!completed && !code) {
log.error('Completion callback never invoked!');
errorMessage();
process.exit(6);
}
});
process.on('uncaughtException', (err) => {
log.error('UNCAUGHT EXCEPTION');
log.error('stack', err.stack);
errorMessage();
process.exit(7);
});
function errorMessage() {
// copied from npm's lib/util/error-handler.js
const os = require('os');
log.error('System', os.type() + ' ' + os.release());
log.error('command', process.argv.map(JSON.stringify).join(' '));
log.error('cwd', process.cwd());
log.error('node -v', process.version);
log.error(process.title + ' -v', 'v' + prog.package.version);
}
// start running the given commands!
run();
View File
@@ -1,309 +0,0 @@
'use strict';
/**
* Module exports.
*/
module.exports = exports;
/**
* Module dependencies.
*/
// load mocking control function for accessing s3 via https. the function is a noop always returning
// false if not mocking.
exports.mockS3Http = require('./util/s3_setup').get_mockS3Http();
exports.mockS3Http('on');
const mocking = exports.mockS3Http('get');
const fs = require('fs');
const path = require('path');
const nopt = require('nopt');
const log = require('npmlog');
log.disableProgress();
const napi = require('./util/napi.js');
const EE = require('events').EventEmitter;
const inherits = require('util').inherits;
const cli_commands = [
'clean',
'install',
'reinstall',
'build',
'rebuild',
'package',
'testpackage',
'publish',
'unpublish',
'info',
'testbinary',
'reveal',
'configure'
];
const aliases = {};
// differentiate node-pre-gyp's logs from npm's
log.heading = 'node-pre-gyp';
if (mocking) {
log.warn(`mocking s3 to ${process.env.node_pre_gyp_mock_s3}`);
}
// this is a getter to avoid circular reference warnings with node v14.
Object.defineProperty(exports, 'find', {
get: function() {
return require('./pre-binding').find;
},
enumerable: true
});
// in the following, "my_module" is using node-pre-gyp to
// prebuild and install pre-built binaries. "main_module"
// is using "my_module".
//
// "bin/node-pre-gyp" invokes Run() without a path. the
// expectation is that the working directory is the package
// root "my_module". this is true because in all cases npm is
// executing a script in the context of "my_module".
//
// "pre-binding.find()" is executed by "my_module" but in the
// context of "main_module". this is because "main_module" is
// executing and requires "my_module" which is then executing
// "pre-binding.find()" via "node-pre-gyp.find()", so the working
// directory is that of "main_module".
//
// that's why "find()" must pass the path to package.json.
//
function Run({ package_json_path = './package.json', argv }) {
this.package_json_path = package_json_path;
this.commands = {};
const self = this;
cli_commands.forEach((command) => {
self.commands[command] = function(argvx, callback) {
log.verbose('command', command, argvx);
return require('./' + command)(self, argvx, callback);
};
});
this.parseArgv(argv);
// this is set to true after the binary.host property was set to
// either staging_host or production_host.
this.binaryHostSet = false;
}
inherits(Run, EE);
exports.Run = Run;
const proto = Run.prototype;
/**
* Export the contents of the package.json.
*/
proto.package = require('../package.json');
/**
* nopt configuration definitions
*/
proto.configDefs = {
help: Boolean, // everywhere
arch: String, // 'configure'
debug: Boolean, // 'build'
directory: String, // bin
proxy: String, // 'install'
loglevel: String // everywhere
};
/**
* nopt shorthands
*/
proto.shorthands = {
release: '--no-debug',
C: '--directory',
debug: '--debug',
j: '--jobs',
silent: '--loglevel=silent',
silly: '--loglevel=silly',
verbose: '--loglevel=verbose'
};
/**
* expose the command aliases for the bin file to use.
*/
proto.aliases = aliases;
/**
* Parses the given argv array and sets the 'opts', 'argv',
* 'command', and 'package_json' properties.
*/
proto.parseArgv = function parseOpts(argv) {
this.opts = nopt(this.configDefs, this.shorthands, argv);
this.argv = this.opts.argv.remain.slice();
const commands = this.todo = [];
// create a copy of the argv array with aliases mapped
argv = this.argv.map((arg) => {
// is this an alias?
if (arg in this.aliases) {
arg = this.aliases[arg];
}
return arg;
});
// process the mapped args into "command" objects ("name" and "args" props)
argv.slice().forEach((arg) => {
if (arg in this.commands) {
const args = argv.splice(0, argv.indexOf(arg));
argv.shift();
if (commands.length > 0) {
commands[commands.length - 1].args = args;
}
commands.push({ name: arg, args: [] });
}
});
if (commands.length > 0) {
commands[commands.length - 1].args = argv.splice(0);
}
// if a directory was specified package.json is assumed to be relative
// to it.
let package_json_path = this.package_json_path;
if (this.opts.directory) {
package_json_path = path.join(this.opts.directory, package_json_path);
}
this.package_json = JSON.parse(fs.readFileSync(package_json_path));
// expand commands entries for multiple napi builds
this.todo = napi.expand_commands(this.package_json, this.opts, commands);
// support for inheriting config env variables from npm
const npm_config_prefix = 'npm_config_';
Object.keys(process.env).forEach((name) => {
if (name.indexOf(npm_config_prefix) !== 0) return;
const val = process.env[name];
if (name === npm_config_prefix + 'loglevel') {
log.level = val;
} else {
// add the user-defined options to the config
name = name.substring(npm_config_prefix.length);
// avoid npm argv clobber already present args
// which avoids problem of 'npm test' calling
// script that runs unique npm install commands
if (name === 'argv') {
if (this.opts.argv &&
this.opts.argv.remain &&
this.opts.argv.remain.length) {
// do nothing
} else {
this.opts[name] = val;
}
} else {
this.opts[name] = val;
}
}
});
if (this.opts.loglevel) {
log.level = this.opts.loglevel;
}
log.resume();
};
/**
* allow the binary.host property to be set at execution time.
*
* for this to take effect requires all the following to be true.
* - binary is a property in package.json
* - binary.host is falsey
* - binary.staging_host is not empty
* - binary.production_host is not empty
*
* if any of the previous checks fail then the function returns an empty string
* and makes no changes to package.json's binary property.
*
*
* if command is "publish" then the default is set to "binary.staging_host"
* if command is not "publish" then the default is set to "binary.production_host"
*
* if the command-line option '--s3_host' is set to "staging" or "production" then
* "binary.host" is set to the specified "staging_host" or "production_host". if
* '--s3_host' is any other value an exception is thrown.
*
* if '--s3_host' is not present then "binary.host" is set to the default as above.
*
* this strategy was chosen so that any command other than "publish" or "unpublish" uses "production"
* as the default without requiring any command-line options but that "publish" and "unpublish" require
* '--s3_host production_host' to be specified in order to *really* publish (or unpublish). publishing
* to staging can be done freely without worrying about disturbing any production releases.
*/
proto.setBinaryHostProperty = function(command) {
if (this.binaryHostSet) {
return this.package_json.binary.host;
}
const p = this.package_json;
// don't set anything if host is present. it must be left blank to trigger this.
if (!p || !p.binary || p.binary.host) {
return '';
}
// and both staging and production must be present. errors will be reported later.
if (!p.binary.staging_host || !p.binary.production_host) {
return '';
}
let target = 'production_host';
if (command === 'publish' || command === 'unpublish') {
target = 'staging_host';
}
// the environment variable has priority over the default or the command line. if
// either the env var or the command line option are invalid throw an error.
const npg_s3_host = process.env.node_pre_gyp_s3_host;
if (npg_s3_host === 'staging' || npg_s3_host === 'production') {
target = `${npg_s3_host}_host`;
} else if (this.opts['s3_host'] === 'staging' || this.opts['s3_host'] === 'production') {
target = `${this.opts['s3_host']}_host`;
} else if (this.opts['s3_host'] || npg_s3_host) {
throw new Error(`invalid s3_host ${this.opts['s3_host'] || npg_s3_host}`);
}
p.binary.host = p.binary[target];
this.binaryHostSet = true;
return p.binary.host;
};
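The host-selection precedence above (environment variable, then the `--s3_host` option, then a command-based default) can be exercised in isolation. A minimal sketch; `pickHost` is a hypothetical helper mirroring the rules in the comment, not part of node-pre-gyp:

```javascript
// Hypothetical mirror of setBinaryHostProperty's precedence:
// env var > --s3_host option > command-based default.
function pickHost(command, envS3Host, optS3Host) {
  let target = (command === 'publish' || command === 'unpublish')
    ? 'staging_host'
    : 'production_host';
  if (envS3Host === 'staging' || envS3Host === 'production') {
    target = `${envS3Host}_host`;
  } else if (optS3Host === 'staging' || optS3Host === 'production') {
    target = `${optS3Host}_host`;
  } else if (envS3Host || optS3Host) {
    throw new Error(`invalid s3_host ${optS3Host || envS3Host}`);
  }
  return target;
}

// "publish" defaults to staging; an explicit option overrides the default.
pickHost('publish', undefined, undefined);    // 'staging_host'
pickHost('install', undefined, undefined);    // 'production_host'
pickHost('publish', undefined, 'production'); // 'production_host'
```

Note the consequence described in the comment: accidental publishes hit staging unless `--s3_host production` is passed explicitly.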
/**
* Returns the usage instructions for node-pre-gyp.
*/
proto.usage = function usage() {
const str = [
'',
' Usage: node-pre-gyp <command> [options]',
'',
' where <command> is one of:',
cli_commands.map((c) => {
return ' - ' + c + ' - ' + require('./' + c).usage;
}).join('\n'),
'',
'node-pre-gyp@' + this.version + ' ' + path.resolve(__dirname, '..'),
'node@' + process.versions.node
].join('\n');
return str;
};
/**
* Version number getter.
*/
Object.defineProperty(proto, 'version', {
get: function() {
return this.package.version;
},
enumerable: true
});


@ -1,73 +0,0 @@
'use strict';
module.exports = exports = _package;
exports.usage = 'Packs binary (and enclosing directory) into locally staged tarball';
const fs = require('fs');
const path = require('path');
const log = require('npmlog');
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const existsAsync = fs.exists || path.exists;
const makeDir = require('make-dir');
const tar = require('tar');
function readdirSync(dir) {
let list = [];
const files = fs.readdirSync(dir);
files.forEach((file) => {
const stats = fs.lstatSync(path.join(dir, file));
if (stats.isDirectory()) {
list = list.concat(readdirSync(path.join(dir, file)));
} else {
list.push(path.join(dir, file));
}
});
return list;
}
function _package(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
const from = opts.module_path;
const binary_module = path.join(from, opts.module_name + '.node');
existsAsync(binary_module, (found) => {
if (!found) {
return callback(new Error('Cannot package because ' + binary_module + ' missing: run `node-pre-gyp rebuild` first'));
}
const tarball = opts.staged_tarball;
const filter_func = function(entry) {
const basename = path.basename(entry);
if (basename.length && basename[0] !== '.') {
console.log('packing ' + entry);
return true;
} else {
console.log('skipping ' + entry);
}
return false;
};
makeDir(path.dirname(tarball)).then(() => {
let files = readdirSync(from);
const base = path.basename(from);
files = files.map((file) => {
return path.join(base, path.relative(from, file));
});
tar.create({
portable: false,
gzip: true,
filter: filter_func,
file: tarball,
cwd: path.dirname(from)
}, files, (err2) => {
if (err2) console.error('[' + package_json.name + '] ' + err2.message);
else log.info('package', 'Binary staged at "' + tarball + '"');
return callback(err2);
});
}).catch((err) => {
return callback(err);
});
});
}


@ -1,34 +0,0 @@
'use strict';
const npg = require('..');
const versioning = require('../lib/util/versioning.js');
const napi = require('../lib/util/napi.js');
const existsSync = require('fs').existsSync || require('path').existsSync;
const path = require('path');
module.exports = exports;
exports.usage = 'Finds the require path for the node-pre-gyp installed module';
exports.validate = function(package_json, opts) {
versioning.validate_config(package_json, opts);
};
exports.find = function(package_json_path, opts) {
if (!existsSync(package_json_path)) {
throw new Error(package_json_path + ' does not exist');
}
const prog = new npg.Run({ package_json_path, argv: process.argv });
prog.setBinaryHostProperty();
const package_json = prog.package_json;
versioning.validate_config(package_json, opts);
let napi_build_version;
if (napi.get_napi_build_versions(package_json, opts)) {
napi_build_version = napi.get_best_napi_build_version(package_json, opts);
}
opts = opts || {};
if (!opts.module_root) opts.module_root = path.dirname(package_json_path);
const meta = versioning.evaluate(package_json, opts, napi_build_version);
return meta.module;
};


@ -1,81 +0,0 @@
'use strict';
module.exports = exports = publish;
exports.usage = 'Publishes pre-built binary (requires aws-sdk)';
const fs = require('fs');
const path = require('path');
const log = require('npmlog');
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const s3_setup = require('./util/s3_setup.js');
const existsAsync = fs.exists || path.exists;
const url = require('url');
function publish(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
const tarball = opts.staged_tarball;
existsAsync(tarball, (found) => {
if (!found) {
return callback(new Error('Cannot publish because ' + tarball + ' missing: run `node-pre-gyp package` first'));
}
log.info('publish', 'Detecting s3 credentials');
const config = {};
s3_setup.detect(opts, config);
const s3 = s3_setup.get_s3(config);
const key_name = url.resolve(config.prefix, opts.package_name);
const s3_opts = {
Bucket: config.bucket,
Key: key_name
};
log.info('publish', 'Authenticating with s3');
log.info('publish', config);
log.info('publish', 'Checking for existing binary at ' + opts.hosted_path);
s3.headObject(s3_opts, (err, meta) => {
if (meta) log.info('publish', JSON.stringify(meta));
if (err && err.code === 'NotFound') {
// we are safe to publish because
// the object does not already exist
log.info('publish', 'Preparing to put object');
const s3_put_opts = {
ACL: 'public-read',
Body: fs.createReadStream(tarball),
Key: key_name,
Bucket: config.bucket
};
log.info('publish', 'Putting object', s3_put_opts.ACL, s3_put_opts.Bucket, s3_put_opts.Key);
try {
s3.putObject(s3_put_opts, (err2, resp) => {
log.info('publish', 'returned from putting object');
if (err2) {
log.info('publish', 's3 putObject error: "' + err2 + '"');
return callback(err2);
}
if (resp) log.info('publish', 's3 putObject response: "' + JSON.stringify(resp) + '"');
log.info('publish', 'successfully put object');
console.log('[' + package_json.name + '] published to ' + opts.hosted_path);
return callback();
});
} catch (err3) {
log.info('publish', 's3 putObject error: "' + err3 + '"');
return callback(err3);
}
} else if (err) {
log.info('publish', 's3 headObject error: "' + err + '"');
return callback(err);
} else {
log.error('publish', 'Cannot publish over existing version');
log.error('publish', "Update the 'version' field in package.json and try again");
log.error('publish', 'If the previous version was published in error see:');
log.error('publish', '\t node-pre-gyp unpublish');
return callback(new Error('Failed publishing to ' + opts.hosted_path));
}
});
});
}


@ -1,20 +0,0 @@
'use strict';
module.exports = exports = rebuild;
exports.usage = 'Runs "clean" and "build" at once';
const napi = require('./util/napi.js');
function rebuild(gyp, argv, callback) {
const package_json = gyp.package_json;
let commands = [
{ name: 'clean', args: [] },
{ name: 'build', args: ['rebuild'] }
];
commands = napi.expand_commands(package_json, gyp.opts, commands);
for (let i = commands.length; i !== 0; i--) {
gyp.todo.unshift(commands[i - 1]);
}
process.nextTick(callback);
}


@ -1,19 +0,0 @@
'use strict';
module.exports = exports = rebuild;
exports.usage = 'Runs "clean" and "install" at once';
const napi = require('./util/napi.js');
function rebuild(gyp, argv, callback) {
const package_json = gyp.package_json;
let installArgs = [];
const napi_build_version = napi.get_best_napi_build_version(package_json, gyp.opts);
if (napi_build_version != null) installArgs = [napi.get_command_arg(napi_build_version)];
gyp.todo.unshift(
{ name: 'clean', args: [] },
{ name: 'install', args: installArgs }
);
process.nextTick(callback);
}


@ -1,32 +0,0 @@
'use strict';
module.exports = exports = reveal;
exports.usage = 'Reveals data on the versioned binary';
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
function unix_paths(key, val) {
return val && val.replace ? val.replace(/\\/g, '/') : val;
}
function reveal(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
let hit = false;
// if a second arg is passed look to see
// if it is a known option
// console.log(JSON.stringify(gyp.opts,null,1))
const remain = gyp.opts.argv.remain[gyp.opts.argv.remain.length - 1];
if (remain && Object.hasOwnProperty.call(opts, remain)) {
console.log(opts[remain].replace(/\\/g, '/'));
hit = true;
}
// otherwise return all options as json
if (!hit) {
console.log(JSON.stringify(opts, unix_paths, 2));
}
return callback();
}


@ -1,79 +0,0 @@
'use strict';
module.exports = exports = testbinary;
exports.usage = 'Tests that the binary.node can be required';
const path = require('path');
const log = require('npmlog');
const cp = require('child_process');
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
function testbinary(gyp, argv, callback) {
const args = [];
const options = {};
let shell_cmd = process.execPath;
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
// skip validation for runtimes we don't explicitly support (like electron)
if (opts.runtime &&
opts.runtime !== 'node-webkit' &&
opts.runtime !== 'node') {
return callback();
}
const nw = (opts.runtime && opts.runtime === 'node-webkit');
// ensure on windows that / are used for require path
const binary_module = opts.module.replace(/\\/g, '/');
if ((process.arch !== opts.target_arch) ||
(process.platform !== opts.target_platform)) {
let msg = 'skipping validation since host platform/arch (';
msg += process.platform + '/' + process.arch + ')';
msg += ' does not match target (';
msg += opts.target_platform + '/' + opts.target_arch + ')';
log.info('validate', msg);
return callback();
}
if (nw) {
options.timeout = 5000;
if (process.platform === 'darwin') {
shell_cmd = 'node-webkit';
} else if (process.platform === 'win32') {
shell_cmd = 'nw.exe';
} else {
shell_cmd = 'nw';
}
const modulePath = path.resolve(binary_module);
const appDir = path.join(__dirname, 'util', 'nw-pre-gyp');
args.push(appDir);
args.push(modulePath);
log.info('validate', "Running test command: '" + shell_cmd + ' ' + args.join(' ') + "'");
cp.execFile(shell_cmd, args, options, (err, stdout, stderr) => {
// check for normal timeout for node-webkit
if (err) {
if (err.killed === true && err.signal && err.signal.indexOf('SIG') > -1) {
return callback();
}
const stderrLog = stderr.toString();
log.info('stderr', stderrLog);
if (/^\s*Xlib:\s*extension\s*"RANDR"\s*missing\s*on\s*display\s*":\d+\.\d+"\.\s*$/.test(stderrLog)) {
log.info('RANDR', 'stderr contains only RANDR error, ignored');
return callback();
}
return callback(err);
}
return callback();
});
return;
}
args.push('--eval');
args.push("require('" + binary_module.replace(/'/g, '\'') + "')");
log.info('validate', "Running test command: '" + shell_cmd + ' ' + args.join(' ') + "'");
cp.execFile(shell_cmd, args, options, (err, stdout, stderr) => {
if (err) {
return callback(err, { stdout: stdout, stderr: stderr });
}
return callback();
});
}


@ -1,53 +0,0 @@
'use strict';
module.exports = exports = testpackage;
exports.usage = 'Tests that the staged package is valid';
const fs = require('fs');
const path = require('path');
const log = require('npmlog');
const existsAsync = fs.exists || path.exists;
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const testbinary = require('./testbinary.js');
const tar = require('tar');
const makeDir = require('make-dir');
function testpackage(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
const tarball = opts.staged_tarball;
existsAsync(tarball, (found) => {
if (!found) {
return callback(new Error('Cannot test package because ' + tarball + ' missing: run `node-pre-gyp package` first'));
}
const to = opts.module_path;
function filter_func(entry) {
log.info('install', 'unpacking [' + entry.path + ']');
}
makeDir(to).then(() => {
tar.extract({
file: tarball,
cwd: to,
strip: 1,
onentry: filter_func
}).then(after_extract, callback);
}).catch((err) => {
return callback(err);
});
function after_extract() {
testbinary(gyp, argv, (err) => {
if (err) {
return callback(err);
} else {
console.log('[' + package_json.name + '] Package appears valid');
return callback();
}
});
}
});
}


@ -1,41 +0,0 @@
'use strict';
module.exports = exports = unpublish;
exports.usage = 'Unpublishes pre-built binary (requires aws-sdk)';
const log = require('npmlog');
const versioning = require('./util/versioning.js');
const napi = require('./util/napi.js');
const s3_setup = require('./util/s3_setup.js');
const url = require('url');
function unpublish(gyp, argv, callback) {
const package_json = gyp.package_json;
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(package_json, gyp.opts, napi_build_version);
const config = {};
s3_setup.detect(opts, config);
const s3 = s3_setup.get_s3(config);
const key_name = url.resolve(config.prefix, opts.package_name);
const s3_opts = {
Bucket: config.bucket,
Key: key_name
};
s3.headObject(s3_opts, (err, meta) => {
if (err && err.code === 'NotFound') {
console.log('[' + package_json.name + '] Not found: https://' + s3_opts.Bucket + '.s3.amazonaws.com/' + s3_opts.Key);
return callback();
} else if (err) {
return callback(err);
} else {
log.info('unpublish', JSON.stringify(meta));
s3.deleteObject(s3_opts, (err2, resp) => {
if (err2) return callback(err2);
log.info(JSON.stringify(resp));
console.log('[' + package_json.name + '] Success: removed https://' + s3_opts.Bucket + '.s3.amazonaws.com/' + s3_opts.Key);
return callback();
});
}
});
}

File diff suppressed because it is too large


@ -1,93 +0,0 @@
'use strict';
module.exports = exports;
const fs = require('fs');
const path = require('path');
const win = process.platform === 'win32';
const existsSync = fs.existsSync || path.existsSync;
const cp = require('child_process');
// try to build up the complete path to node-gyp
/* priority:
- node-gyp on ENV:npm_config_node_gyp (https://github.com/npm/npm/pull/4887)
- node-gyp on NODE_PATH
- node-gyp inside npm on NODE_PATH (ignore on iojs)
- node-gyp inside npm beside node exe
*/
function which_node_gyp() {
let node_gyp_bin;
if (process.env.npm_config_node_gyp) {
try {
node_gyp_bin = process.env.npm_config_node_gyp;
if (existsSync(node_gyp_bin)) {
return node_gyp_bin;
}
} catch (err) {
// do nothing
}
}
try {
const node_gyp_main = require.resolve('node-gyp'); // eslint-disable-line node/no-missing-require
node_gyp_bin = path.join(path.dirname(
path.dirname(node_gyp_main)),
'bin/node-gyp.js');
if (existsSync(node_gyp_bin)) {
return node_gyp_bin;
}
} catch (err) {
// do nothing
}
if (process.execPath.indexOf('iojs') === -1) {
try {
const npm_main = require.resolve('npm'); // eslint-disable-line node/no-missing-require
node_gyp_bin = path.join(path.dirname(
path.dirname(npm_main)),
'node_modules/node-gyp/bin/node-gyp.js');
if (existsSync(node_gyp_bin)) {
return node_gyp_bin;
}
} catch (err) {
// do nothing
}
}
const npm_base = path.join(path.dirname(
path.dirname(process.execPath)),
'lib/node_modules/npm/');
node_gyp_bin = path.join(npm_base, 'node_modules/node-gyp/bin/node-gyp.js');
if (existsSync(node_gyp_bin)) {
return node_gyp_bin;
}
}
module.exports.run_gyp = function(args, opts, callback) {
let shell_cmd = '';
const cmd_args = [];
if (opts.runtime && opts.runtime === 'node-webkit') {
shell_cmd = 'nw-gyp';
if (win) shell_cmd += '.cmd';
} else {
const node_gyp_path = which_node_gyp();
if (node_gyp_path) {
shell_cmd = process.execPath;
cmd_args.push(node_gyp_path);
} else {
shell_cmd = 'node-gyp';
if (win) shell_cmd += '.cmd';
}
}
const final_args = cmd_args.concat(args);
const cmd = cp.spawn(shell_cmd, final_args, { cwd: undefined, env: process.env, stdio: [0, 1, 2] });
cmd.on('error', (err) => {
if (err) {
return callback(new Error("Failed to execute '" + shell_cmd + ' ' + final_args.join(' ') + "' (" + err + ')'));
}
callback(null, opts);
});
cmd.on('close', (code) => {
if (code && code !== 0) {
return callback(new Error("Failed to execute '" + shell_cmd + ' ' + final_args.join(' ') + "' (" + code + ')'));
}
callback(null, opts);
});
};


@ -1,102 +0,0 @@
'use strict';
module.exports = exports = handle_gyp_opts;
const versioning = require('./versioning.js');
const napi = require('./napi.js');
/*
Here we gather node-pre-gyp generated options (from versioning) and pass them along to node-gyp.
We massage the args and options slightly to account for differences in what commands mean between
node-pre-gyp and node-gyp (e.g. see the difference between "build" and "rebuild" below)
Keep in mind: the values inside `argv` and `gyp.opts` below are different depending on whether
node-pre-gyp is called directly, or if it is called in a `run-script` phase of npm.
We also try to preserve any command line options that might have been passed to npm or node-pre-gyp.
But this is fairly difficult without passing way too much through. For example `gyp.opts` contains all
the process.env and npm pushes a lot of variables into process.env which node-pre-gyp inherits. So we have
to be very selective about what we pass through.
For example:
`npm install --build-from-source` will give:
argv == [ 'rebuild' ]
gyp.opts.argv == { remain: [ 'install' ],
cooked: [ 'install', '--fallback-to-build' ],
original: [ 'install', '--fallback-to-build' ] }
`./bin/node-pre-gyp build` will give:
argv == []
gyp.opts.argv == { remain: [ 'build' ],
cooked: [ 'build' ],
original: [ '-C', 'test/app1', 'build' ] }
*/
// select set of node-pre-gyp versioning info
// to share with node-gyp
const share_with_node_gyp = [
'module',
'module_name',
'module_path',
'napi_version',
'node_abi_napi',
'napi_build_version',
'node_napi_label'
];
function handle_gyp_opts(gyp, argv, callback) {
// Collect node-pre-gyp specific variables to pass to node-gyp
const node_pre_gyp_options = [];
// generate custom node-pre-gyp versioning info
const napi_build_version = napi.get_napi_build_version_from_command_args(argv);
const opts = versioning.evaluate(gyp.package_json, gyp.opts, napi_build_version);
share_with_node_gyp.forEach((key) => {
const val = opts[key];
if (val) {
node_pre_gyp_options.push('--' + key + '=' + val);
} else if (key === 'napi_build_version') {
node_pre_gyp_options.push('--' + key + '=0');
} else {
if (key !== 'napi_version' && key !== 'node_abi_napi')
return callback(new Error('Option ' + key + ' required but not found by node-pre-gyp'));
}
});
// Collect options that follow the special -- which disables nopt parsing
const unparsed_options = [];
let double_hyphen_found = false;
gyp.opts.argv.original.forEach((opt) => {
if (double_hyphen_found) {
unparsed_options.push(opt);
}
if (opt === '--') {
double_hyphen_found = true;
}
});
// We try to respect and pass through remaining command-line
// options (like --foo=bar) to node-gyp
const cooked = gyp.opts.argv.cooked;
const node_gyp_options = [];
cooked.forEach((value) => {
if (value.length > 2 && value.slice(0, 2) === '--') {
const key = value.slice(2);
const val = cooked[cooked.indexOf(value) + 1];
if (val && val.indexOf('--') === -1) { // handle '--foo=bar' or ['--foo','bar']
node_gyp_options.push('--' + key + '=' + val);
} else { // pass through --foo
node_gyp_options.push(value);
}
}
});
const result = { 'opts': opts, 'gyp': node_gyp_options, 'pre': node_pre_gyp_options, 'unparsed': unparsed_options };
return callback(null, result);
}
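The cooked-argv pass-through above collapses `--foo bar` pairs into `--foo=bar` and forwards lone `--foo` flags unchanged. A stand-alone sketch of just that loop; `collectNodeGypOptions` is a hypothetical name, not part of node-pre-gyp:

```javascript
// Hypothetical mirror of handle_gyp_opts' cooked-argv pass-through:
// '--foo bar' pairs become '--foo=bar'; bare '--foo' flags pass as-is;
// non-flag tokens (commands, option values) are dropped.
function collectNodeGypOptions(cooked) {
  const out = [];
  cooked.forEach((value) => {
    if (value.length > 2 && value.slice(0, 2) === '--') {
      const val = cooked[cooked.indexOf(value) + 1];
      if (val && val.indexOf('--') === -1) { // '--foo bar' -> '--foo=bar'
        out.push('--' + value.slice(2) + '=' + val);
      } else { // pass through bare '--foo'
        out.push(value);
      }
    }
  });
  return out;
}

collectNodeGypOptions(['install', '--fallback-to-build', '--target', '14.0.0']);
// → ['--fallback-to-build', '--target=14.0.0']
```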


@ -1,205 +0,0 @@
'use strict';
const fs = require('fs');
module.exports = exports;
const versionArray = process.version
.substr(1)
.replace(/-.*$/, '')
.split('.')
.map((item) => {
return +item;
});
const napi_multiple_commands = [
'build',
'clean',
'configure',
'package',
'publish',
'reveal',
'testbinary',
'testpackage',
'unpublish'
];
const napi_build_version_tag = 'napi_build_version=';
module.exports.get_napi_version = function() {
// returns the non-zero numeric napi version or undefined if napi is not supported.
// correctly supporting target requires an updated cross-walk
let version = process.versions.napi; // can be undefined
if (!version) { // this code should never need to be updated
if (versionArray[0] === 9 && versionArray[1] >= 3) version = 2; // 9.3.0+
else if (versionArray[0] === 8) version = 1; // 8.0.0+
}
return version;
};
module.exports.get_napi_version_as_string = function(target) {
// returns the napi version as a string or an empty string if napi is not supported.
const version = module.exports.get_napi_version(target);
return version ? '' + version : '';
};
module.exports.validate_package_json = function(package_json, opts) { // throws Error
const binary = package_json.binary;
const module_path_ok = pathOK(binary.module_path);
const remote_path_ok = pathOK(binary.remote_path);
const package_name_ok = pathOK(binary.package_name);
const napi_build_versions = module.exports.get_napi_build_versions(package_json, opts, true);
const napi_build_versions_raw = module.exports.get_napi_build_versions_raw(package_json);
if (napi_build_versions) {
napi_build_versions.forEach((napi_build_version)=> {
if (!(parseInt(napi_build_version, 10) === napi_build_version && napi_build_version > 0)) {
throw new Error('All values specified in napi_versions must be positive integers.');
}
});
}
if (napi_build_versions && (!module_path_ok || (!remote_path_ok && !package_name_ok))) {
throw new Error('When napi_versions is specified; module_path and either remote_path or ' +
"package_name must contain the substitution string '{napi_build_version}'.");
}
if ((module_path_ok || remote_path_ok || package_name_ok) && !napi_build_versions_raw) {
throw new Error("When the substitution string '{napi_build_version}' is specified in " +
'module_path, remote_path, or package_name; napi_versions must also be specified.');
}
if (napi_build_versions && !module.exports.get_best_napi_build_version(package_json, opts) &&
module.exports.build_napi_only(package_json)) {
throw new Error(
'The Node-API version of this Node instance is ' + module.exports.get_napi_version(opts ? opts.target : undefined) + '. ' +
'This module supports Node-API version(s) ' + module.exports.get_napi_build_versions_raw(package_json) + '. ' +
'This Node instance cannot run this module.');
}
if (napi_build_versions_raw && !napi_build_versions && module.exports.build_napi_only(package_json)) {
throw new Error(
'The Node-API version of this Node instance is ' + module.exports.get_napi_version(opts ? opts.target : undefined) + '. ' +
'This module supports Node-API version(s) ' + module.exports.get_napi_build_versions_raw(package_json) + '. ' +
'This Node instance cannot run this module.');
}
};
function pathOK(path) {
return path && (path.indexOf('{napi_build_version}') !== -1 || path.indexOf('{node_napi_label}') !== -1);
}
module.exports.expand_commands = function(package_json, opts, commands) {
const expanded_commands = [];
const napi_build_versions = module.exports.get_napi_build_versions(package_json, opts);
commands.forEach((command)=> {
if (napi_build_versions && command.name === 'install') {
const napi_build_version = module.exports.get_best_napi_build_version(package_json, opts);
const args = napi_build_version ? [napi_build_version_tag + napi_build_version] : [];
expanded_commands.push({ name: command.name, args: args });
} else if (napi_build_versions && napi_multiple_commands.indexOf(command.name) !== -1) {
napi_build_versions.forEach((napi_build_version)=> {
const args = command.args.slice();
args.push(napi_build_version_tag + napi_build_version);
expanded_commands.push({ name: command.name, args: args });
});
} else {
expanded_commands.push(command);
}
});
return expanded_commands;
};
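The fan-out performed by `expand_commands` duplicates a multi-version command once per buildable Node-API version, appending the `napi_build_version=` tag to each copy. A simplified stand-alone sketch (it omits the special-cased `install` branch; `expandCommands` is a hypothetical name):

```javascript
// Hypothetical, simplified mirror of expand_commands' fan-out:
// one copy of the command per Node-API build version.
function expandCommands(commands, napiVersions) {
  const expanded = [];
  commands.forEach((command) => {
    napiVersions.forEach((v) => {
      expanded.push({
        name: command.name,
        args: command.args.concat(['napi_build_version=' + v])
      });
    });
  });
  return expanded;
}

expandCommands([{ name: 'build', args: [] }], [3, 4]);
// → two 'build' commands, tagged napi_build_version=3 and =4
```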
module.exports.get_napi_build_versions = function(package_json, opts, warnings) { // opts may be undefined
const log = require('npmlog');
let napi_build_versions = [];
const supported_napi_version = module.exports.get_napi_version(opts ? opts.target : undefined);
// remove duplicates, verify each napi version can actually be built
if (package_json.binary && package_json.binary.napi_versions) {
package_json.binary.napi_versions.forEach((napi_version) => {
const duplicated = napi_build_versions.indexOf(napi_version) !== -1;
if (!duplicated && supported_napi_version && napi_version <= supported_napi_version) {
napi_build_versions.push(napi_version);
} else if (warnings && !duplicated && supported_napi_version) {
log.info('This Node instance does not support builds for Node-API version', napi_version);
}
});
}
if (opts && opts['build-latest-napi-version-only']) {
let latest_version = 0;
napi_build_versions.forEach((napi_version) => {
if (napi_version > latest_version) latest_version = napi_version;
});
napi_build_versions = latest_version ? [latest_version] : [];
}
return napi_build_versions.length ? napi_build_versions : undefined;
};
module.exports.get_napi_build_versions_raw = function(package_json) {
const napi_build_versions = [];
// remove duplicates
if (package_json.binary && package_json.binary.napi_versions) {
package_json.binary.napi_versions.forEach((napi_version) => {
if (napi_build_versions.indexOf(napi_version) === -1) {
napi_build_versions.push(napi_version);
}
});
}
return napi_build_versions.length ? napi_build_versions : undefined;
};
module.exports.get_command_arg = function(napi_build_version) {
return napi_build_version_tag + napi_build_version;
};
module.exports.get_napi_build_version_from_command_args = function(command_args) {
for (let i = 0; i < command_args.length; i++) {
const arg = command_args[i];
if (arg.indexOf(napi_build_version_tag) === 0) {
return parseInt(arg.substr(napi_build_version_tag.length), 10);
}
}
return undefined;
};
module.exports.swap_build_dir_out = function(napi_build_version) {
if (napi_build_version) {
const rm = require('rimraf');
rm.sync(module.exports.get_build_dir(napi_build_version));
fs.renameSync('build', module.exports.get_build_dir(napi_build_version));
}
};
module.exports.swap_build_dir_in = function(napi_build_version) {
if (napi_build_version) {
const rm = require('rimraf');
rm.sync('build');
fs.renameSync(module.exports.get_build_dir(napi_build_version), 'build');
}
};
module.exports.get_build_dir = function(napi_build_version) {
return 'build-tmp-napi-v' + napi_build_version;
};
module.exports.get_best_napi_build_version = function(package_json, opts) {
let best_napi_build_version = 0;
const napi_build_versions = module.exports.get_napi_build_versions(package_json, opts);
if (napi_build_versions) {
const our_napi_version = module.exports.get_napi_version(opts ? opts.target : undefined);
napi_build_versions.forEach((napi_build_version)=> {
if (napi_build_version > best_napi_build_version &&
napi_build_version <= our_napi_version) {
best_napi_build_version = napi_build_version;
}
});
}
return best_napi_build_version === 0 ? undefined : best_napi_build_version;
};
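The selection rule above picks the highest declared Node-API version that the running Node instance supports, or `undefined` when none qualifies. A minimal sketch of that rule in isolation; `bestNapiVersion` is a hypothetical helper:

```javascript
// Hypothetical mirror of get_best_napi_build_version's rule:
// highest declared version <= the runtime's Node-API version.
function bestNapiVersion(declaredVersions, runtimeNapiVersion) {
  let best = 0;
  declaredVersions.forEach((v) => {
    if (v > best && v <= runtimeNapiVersion) best = v;
  });
  return best === 0 ? undefined : best;
}

bestNapiVersion([1, 3, 6], 4); // 3: highest declared version the runtime supports
bestNapiVersion([5, 6], 4);    // undefined: nothing buildable on this runtime
```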
module.exports.build_napi_only = function(package_json) {
return package_json.binary && package_json.binary.package_name &&
package_json.binary.package_name.indexOf('{node_napi_label}') === -1;
};

Some files were not shown because too many files have changed in this diff.