tt
CELERY_SCHEDULING_SETUP.md
CHANGED
````diff
@@ -31,6 +31,38 @@ sudo systemctl start redis
 sudo systemctl enable redis
 ```
 
+### 2. Hugging Face Spaces / Production Deployment
+For Hugging Face Spaces or production deployments where you can't run Redis directly:
+
+**Option A: Use Redis Cloud (Recommended)**
+1. Create a free Redis Cloud account at https://redislabs.com/try-free/
+2. Create a Redis database (free tier available)
+3. Update your `.env` file:
+```env
+CELERY_BROKER_URL="redis://your-redis-host:port/0"
+CELERY_RESULT_BACKEND="redis://your-redis-host:port/0"
+```
+
+**Option B: Use Docker Compose**
+```yaml
+# docker-compose.yml
+version: '3.8'
+services:
+  redis:
+    image: redis:7-alpine
+    ports:
+      - "6379:6379"
+    volumes:
+      - redis_data:/data
+    command: redis-server --appendonly yes
+
+volumes:
+  redis_data:
+```
+
+**Option C: Skip Celery (Basic Mode)**
+If Redis is not available, the Flask app will start without Celery functionality. Schedules will be saved but won't execute automatically.
+
 ### 2. Python Dependencies
 Install the required packages:
 ```bash
````
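The Redis Cloud and Docker Compose options above only work if the Celery app actually reads `CELERY_BROKER_URL` and `CELERY_RESULT_BACKEND` from the environment. `backend/celery_config.py` is not part of this diff, so the following is only a hedged sketch of how such a module could be wired up; the defaults, the dotenv call, and the loader-job entry are assumptions rather than the repo's actual code.

```python
# Hypothetical sketch of backend/celery_config.py -- the real module is not shown
# in this diff, so every name and default here is an assumption.
import os

from celery import Celery
from dotenv import load_dotenv  # assumes python-dotenv is among the dependencies

load_dotenv()  # pick up CELERY_BROKER_URL / CELERY_RESULT_BACKEND from .env

celery_app = Celery(
    "backend",
    broker=os.getenv("CELERY_BROKER_URL", "redis://localhost:6379/0"),
    backend=os.getenv("CELERY_RESULT_BACKEND", "redis://localhost:6379/0"),
)

# Beat needs at least the job that periodically reloads user schedules
# (see schedule_loader.py further down); the name and interval are assumptions.
celery_app.conf.beat_schedule = {
    "load-schedules": {
        "task": "backend.celery_tasks.schedule_loader.load_schedules_task",
        "schedule": 60.0,  # seconds
    },
}
```

With this shape, switching between local Redis, Docker Compose, and Redis Cloud is purely a `.env` change, which is what Option A relies on.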
````diff
@@ -133,7 +165,30 @@ celery -A celery_config inspect active
 
 **1. Redis Connection Failed**
 ```
-
+Error: Error 111 connecting to localhost:6379. Connection refused
+
+Solutions:
+1. Start Redis server locally (development):
+   Windows: redis-server
+   Linux/Mac: sudo systemctl start redis
+
+2. Use Redis Cloud (production/Hugging Face):
+   - Create free Redis Cloud account
+   - Update .env with Redis Cloud URL
+   - Set CELERY_BROKER_URL and CELERY_RESULT_BACKEND
+
+3. Use Docker Compose:
+   version: '3.8'
+   services:
+     redis:
+       image: redis:7-alpine
+       ports: ["6379:6379"]
+       volumes: [redis_data:/data]
+   volumes: redis_data:
+
+4. Skip Celery (basic mode):
+   - App will start without scheduling functionality
+   - Schedules saved but won't execute automatically
 ```
 
 **2. Tasks Not Executing**
````
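Before digging into Celery itself, it is usually quicker to confirm that the broker URL from `.env` is reachable at all. A minimal check, assuming only that the `redis` Python package is installed (Celery pulls it in when a Redis broker is used):

```python
# Standalone broker connectivity check -- not part of the repo, just a hedged helper.
import os

import redis

broker_url = os.getenv("CELERY_BROKER_URL", "redis://localhost:6379/0")
try:
    redis.Redis.from_url(broker_url).ping()
    print(f"Redis reachable at {broker_url}")
except redis.exceptions.ConnectionError as exc:
    # Same failure mode as the "Error 111 ... Connection refused" message above.
    print(f"Redis NOT reachable at {broker_url}: {exc}")
```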
backend/celery_tasks/content_tasks.py
CHANGED
```diff
@@ -89,7 +89,7 @@ def generate_content_task(self, user_id: str, schedule_id: str, supabase_client_
             'message': f'Error in content generation: {str(e)}'
         }
 
-@
+@celery_app.task(bind=True)
 def publish_post_task(self, schedule_id: str, supabase_client_config: dict):
     """
     Celery task to publish a scheduled post.
```
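The bare `@` removed here was a syntax error; restoring `@celery_app.task(bind=True)` is what actually registers `publish_post_task` with the Celery app. For context, a hedged sketch of what `bind=True` enables; the real task body is not shown in this diff, so the retry logic below is illustrative only:

```python
# Illustrative bound task; publish_post_task's real body is not part of this diff.
from backend.celery_config import celery_app  # same import scheduler.py uses


@celery_app.task(bind=True, max_retries=3)
def example_bound_task(self, schedule_id: str):
    try:
        ...  # publish the post identified by schedule_id
    except Exception as exc:
        # bind=True is what makes `self` (and therefore self.retry) available here.
        raise self.retry(exc=exc, countdown=30)
```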
backend/celery_tasks/schedule_loader.py
CHANGED
```diff
@@ -87,7 +87,7 @@ def load_schedules_task():
     logger.info(f"Found {len(schedules)} schedules")
 
     # Get current beat schedule
-    current_schedule =
+    current_schedule = celery_app.conf.beat_schedule
 
     # Remove existing scheduled jobs (except the loader job)
     # In a production environment, you might want to be more selective about this
```
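`celery_app.conf.beat_schedule` is the in-memory dict that Celery beat reads its periodic jobs from, so completing this assignment lets the loader inspect and rebuild it. The rest of `load_schedules_task` is not shown, so the entry names, crontab, and args below are assumptions about how per-schedule jobs might be rebuilt; note as a hedged caveat that the default beat scheduler reads this dict in the beat process, so mutations made from a worker task only take effect where this code actually runs.

```python
# Hedged sketch of rebuilding the beat schedule; entry names, the crontab, and the
# args tuple are assumptions, not code from the repo.
from celery.schedules import crontab

from backend.celery_config import celery_app

current_schedule = celery_app.conf.beat_schedule

# Drop previously generated per-schedule entries, keeping the loader job itself.
for name in list(current_schedule):
    if name != "load-schedules":  # assumed name of the loader's own beat entry
        current_schedule.pop(name)

# Re-add one beat entry per schedule row fetched from the database.
current_schedule["generate-content-example"] = {
    "task": "backend.celery_tasks.content_tasks.generate_content_task",
    "schedule": crontab(hour=9, minute=0),  # e.g. run daily at 09:00
    "args": ("some-user-id", "some-schedule-id", {}),  # matches the task signature
}
```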
backend/celery_tasks/scheduler.py
CHANGED
```diff
@@ -1,7 +1,7 @@
 from datetime import datetime, timedelta
 from celery import chain
 import logging
-from backend.
+from backend.celery_config import celery_app
 from backend.celery_tasks.content_tasks import generate_content_task, publish_post_task
 
 # Configure logging
```
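With `celery_app` importable again, `scheduler.py` has everything it needs to compose the two tasks it imports. The module body is not shown here, so the helper below is only a hedged guess at how `chain` might be used to run generation and publishing back to back; the function name and the five-minute eta are invented for illustration.

```python
# Hedged sketch of a generate-then-publish workflow; only the task signatures are
# taken from the diff, the rest is assumed.
from datetime import datetime, timedelta

from celery import chain

from backend.celery_tasks.content_tasks import generate_content_task, publish_post_task


def queue_generate_then_publish(user_id: str, schedule_id: str, supabase_client_config: dict):
    workflow = chain(
        generate_content_task.s(user_id, schedule_id, supabase_client_config),
        # .si() keeps publish_post_task's own args instead of the previous task's result.
        publish_post_task.si(schedule_id, supabase_client_config),
    )
    # Queue the pair to run a few minutes from now rather than immediately.
    return workflow.apply_async(eta=datetime.utcnow() + timedelta(minutes=5))
```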