Merge branch 'feature/database-performance-indexes' into 'devel'
feat: enhance security with improved password hashing and logging

See merge request oliver/ivatar!261
1
.gitignore
vendored
@@ -22,3 +22,4 @@ dump_all*.sql
dist/
.env.local
tmp/
logs/

229
FILE_UPLOAD_SECURITY.md
Normal file
@@ -0,0 +1,229 @@
# File Upload Security Documentation

## Overview

The ivatar application now includes comprehensive file upload security features to protect against malicious file uploads, data leaks, and other security threats.

## Security Features

### 1. File Type Validation

**Magic Bytes Verification**

- Validates file signatures (magic bytes) to ensure uploaded files are actually images
- Supports JPEG, PNG, GIF, WebP, BMP, and TIFF formats
- Prevents file extension spoofing attacks

**MIME Type Validation**

- Uses python-magic library to detect actual MIME types
- Cross-references with allowed MIME types list
- Prevents MIME type confusion attacks

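For illustration, a minimal sketch of how these two layers can be combined, assuming `python-magic` is installed; the signature table and helper name are simplified stand-ins, not the exact implementation in `ivatar/file_security.py`:

```python
from typing import Optional

import magic  # python-magic, a wrapper around libmagic

# Trimmed-down signature table covering a few of the formats listed above
SIGNATURES = {
    b"\xff\xd8\xff": "image/jpeg",
    b"\x89PNG\r\n\x1a\n": "image/png",
    b"GIF87a": "image/gif",
    b"GIF89a": "image/gif",
}


def detect_image_type(data: bytes) -> Optional[str]:
    """Return a MIME type only when the file signature and libmagic agree."""
    by_signature = next(
        (mime for sig, mime in SIGNATURES.items() if data.startswith(sig)), None
    )
    by_libmagic = magic.from_buffer(data, mime=True)
    return by_signature if by_signature == by_libmagic else None
```
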
### 2. Content Security Scanning

**Malicious Content Detection**

- Scans for embedded scripts (`<script>`, `javascript:`, `vbscript:`)
- Detects executable content (PE headers, ELF headers)
- Identifies polyglot attacks (files valid in multiple formats)
- Checks for PHP and other server-side code

**PIL Image Validation**

- Uses Python Imaging Library to verify file is a valid image
- Checks image dimensions and format
- Ensures image can be properly loaded and processed

### 3. EXIF Data Sanitization

**Metadata Removal**

- Automatically strips EXIF data from uploaded images
- Prevents location data and other sensitive metadata leaks
- Preserves image quality while removing privacy risks

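A minimal sketch of the metadata-removal idea, assuming Pillow: re-encoding only the pixel data yields a copy without EXIF tags. The production routine lives in `ivatar/file_security.py` (`FileValidator.sanitize_exif_data`); this standalone version always re-encodes to PNG for simplicity.

```python
from io import BytesIO

from PIL import Image


def strip_exif(data: bytes) -> bytes:
    """Return a re-encoded copy of the image that carries no EXIF metadata."""
    image = Image.open(BytesIO(data))
    # Converting normalises palette images; EXIF is simply never copied over.
    clean = image.convert("RGBA" if "A" in image.getbands() else "RGB")
    output = BytesIO()
    clean.save(output, format="PNG")  # PNG keeps transparency and is lossless
    return output.getvalue()
```
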
### 4. Enhanced Logging

**Security Event Logging**

- Logs all file upload attempts with user ID and IP address
- Records security violations and suspicious activity
- Provides audit trail for security monitoring

## Configuration

### Settings

All security features can be configured in `config.py` or overridden in `config_local.py`:

```python
# File upload security settings
ENABLE_FILE_SECURITY_VALIDATION = True
ENABLE_EXIF_SANITIZATION = True
ENABLE_MALICIOUS_CONTENT_SCAN = True
```

### Dependencies

The security features require the following Python packages:

```bash
pip install "python-magic>=0.4.27"
```

**Note**: On some systems, you may need to install the libmagic system library:

- **Ubuntu/Debian**: `sudo apt-get install libmagic1`
- **CentOS/RHEL**: `sudo yum install file-devel`
- **macOS**: `brew install libmagic`

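To confirm that `python-magic` can actually reach the libmagic shared library, a quick smoke test such as the following can help (the expected output is an assumption based on common libmagic behaviour and may vary between versions):

```python
import magic

# A PNG signature plus a little padding is enough for a smoke test.
print(magic.from_buffer(b"\x89PNG\r\n\x1a\n" + b"\x00" * 16, mime=True))
# Expected: image/png
```
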
## Security Levels

### Security Score System

Files are assigned a security score (0-100) based on validation results:

- **90-100**: Excellent - No security concerns
- **80-89**: Good - Minor warnings, safe to process
- **70-79**: Fair - Some concerns, review recommended
- **50-69**: Poor - Multiple issues, high risk
- **0-49**: Critical - Malicious content detected, reject

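As a rough illustration of how these bands could be acted upon, the helper below maps a score to a suggested action; it mirrors the thresholds above but is not part of the ivatar code base:

```python
def triage(security_score: int) -> str:
    """Map a 0-100 security score to a suggested action (illustrative only)."""
    if security_score >= 80:
        return "accept"
    if security_score >= 70:
        return "accept, but flag for review"
    if security_score >= 50:
        return "treat as high risk"
    return "reject"
```
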
### Validation Levels

1. **Basic Validation**: File size, filename, extension
2. **Magic Bytes**: File signature verification
3. **MIME Type**: Content type validation
4. **PIL Validation**: Image format verification
5. **Security Scan**: Malicious content detection
6. **EXIF Sanitization**: Metadata removal

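Each level corresponds to a method on `FileValidator`; run step by step, the pipeline looks roughly like this (the sample file name is only a placeholder):

```python
from pathlib import Path

from ivatar.file_security import FileValidator

data = Path("example.png").read_bytes()  # placeholder upload
validator = FileValidator(data, "example.png")

print(validator.validate_basic()["valid"])       # 1. size / filename / extension
print(validator.validate_magic_bytes())          # 2. file signature
print(validator.validate_mime_type())            # 3. MIME type via python-magic
print(validator.validate_pil_image()["valid"])   # 4. PIL can open and load it
print(validator.scan_for_malicious_content())    # 5. embedded script / executable scan
sanitized = validator.sanitize_exif_data()       # 6. metadata-free copy of the image
```
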
## API Reference

### FileValidator Class

```python
from ivatar.file_security import FileValidator

validator = FileValidator(file_data, filename)
results = validator.comprehensive_validation()
```

### Main Validation Function

```python
from ivatar.file_security import validate_uploaded_file

is_valid, results, sanitized_data = validate_uploaded_file(file_data, filename)
```

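A sketch of how a caller might consume the returned tuple; the file path is only a placeholder for whatever the upload handler provides:

```python
from pathlib import Path

from ivatar.file_security import validate_uploaded_file

path = Path("example.jpg")  # placeholder input
is_valid, results, sanitized_data = validate_uploaded_file(path.read_bytes(), path.name)

if is_valid:
    Path("example_clean.jpg").write_bytes(sanitized_data)  # EXIF already stripped
else:
    print("Upload rejected:", results["errors"])
```
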
### Security Report Generation

```python
from ivatar.file_security import get_file_security_report

report = get_file_security_report(file_data, filename)
```

## Error Handling

### Validation Errors

The system provides user-friendly error messages while logging detailed security information:

- **Malicious Content**: "File appears to be malicious and cannot be uploaded"
- **Invalid Format**: "File format not supported or file appears to be corrupted"

### Logging Levels

- **INFO**: Successful uploads and normal operations
- **WARNING**: Security violations and suspicious activity
- **ERROR**: Validation failures and system errors

## Testing

### Running Security Tests

```bash
python manage.py test ivatar.test_file_security
```

### Test Coverage

The test suite covers:

- Valid file validation
- Malicious content detection
- Magic bytes verification
- MIME type validation
- EXIF sanitization
- Form validation
- Integration tests

## Performance Considerations

### Memory Usage

- Files are processed in memory for validation
- Large files (>5MB) may impact performance
- Consider increasing server memory for high-volume deployments

### Processing Time

- Basic validation: <10ms
- Full security scan: 50-200ms
- EXIF sanitization: 100-500ms
- Total overhead: ~200-700ms per upload

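These figures are ballpark numbers; to see where the time goes on your own hardware, a quick measurement along these lines can be used (the sample image path is a placeholder):

```python
import time
from pathlib import Path

from ivatar.file_security import FileValidator

data = Path("example.jpg").read_bytes()  # placeholder sample image
validator = FileValidator(data, "example.jpg")

start = time.perf_counter()
validator.comprehensive_validation()
print(f"full validation:   {(time.perf_counter() - start) * 1000:.1f} ms")

start = time.perf_counter()
validator.sanitize_exif_data()
print(f"EXIF sanitization: {(time.perf_counter() - start) * 1000:.1f} ms")
```
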
## Troubleshooting

### Common Issues

1. **python-magic Import Error**

   - Install libmagic system library
   - Verify python-magic installation

2. **False Positives**

   - Review security score thresholds
   - Adjust validation settings

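For the first issue, a guarded import makes it easy to tell whether the Python package or the libmagic system library is the missing piece:

```python
try:
    import magic

    magic.from_buffer(b"\x89PNG\r\n\x1a\n", mime=True)
    print("python-magic and libmagic look fine")
except ImportError as exc:
    print("python-magic is not installed:", exc)
except Exception as exc:  # typically: libmagic shared library missing or broken
    print("python-magic imported, but libmagic failed:", exc)
```
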
### Debug Mode

Enable debug logging to troubleshoot validation issues:

```python
LOGGING = {
    "loggers": {
        "ivatar.security": {
            "level": "DEBUG",
        },
    },
}
```

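The fragment above is meant to be merged into the project's existing `LOGGING` setting. A self-contained minimal variant, assuming a plain console handler, could look like this:

```python
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        "console": {"class": "logging.StreamHandler"},
    },
    "loggers": {
        "ivatar.security": {
            "handlers": ["console"],
            "level": "DEBUG",
        },
    },
}
```
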
## Security Best Practices

### Deployment Recommendations

1. **Enable All Security Features** in production
2. **Monitor Security Logs** regularly
3. **Keep Dependencies Updated**
4. **Regular Security Audits** of uploaded content

### Monitoring

- Monitor security.log for violations
- Track upload success/failure rates
- Alert on repeated security violations

## Future Enhancements

Potential future improvements:

- Virus scanning integration (ClamAV)
- Content-based image analysis
- Machine learning threat detection
- Advanced polyglot detection
- Real-time threat intelligence feeds

40
README.md
@@ -10,6 +10,46 @@
- [Coverage HTML report](http://oliver.git.linux-kernel.at/ivatar)
- [Code documentation (autogenerated, pycco)](http://oliver.git.linux-kernel.at/ivatar/pycco/)

# Testing

## Running Tests

### Local Development (Recommended)
For local development, use the provided script to skip Bluesky tests that require external API credentials:

```bash
./run_tests_local.sh
```

This runs all tests except those marked with `@pytest.mark.bluesky`.

### All Tests
To run all tests including Bluesky tests (requires Bluesky API credentials):

```bash
python3 manage.py test -v2
```

### Specific Test Categories
```bash
# Run only Bluesky tests
python3 manage.py test ivatar.ivataraccount.test_views_bluesky -v2

# Run only file upload security tests
python3 manage.py test ivatar.test_file_security -v2

# Run only upload tests
python3 manage.py test ivatar.ivataraccount.test_views -v2
```

## Test Markers

Tests are categorized using pytest markers:
- `@pytest.mark.bluesky`: Tests requiring Bluesky API credentials
- `@pytest.mark.slow`: Long-running tests
- `@pytest.mark.integration`: Integration tests
- `@pytest.mark.unit`: Unit tests

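A marker is applied as a decorator; the test below is purely illustrative:

```python
import pytest


@pytest.mark.bluesky
def test_bluesky_verification():
    """Illustrative only - selected or deselected via the bluesky marker."""
    assert True
```

`./run_tests_local.sh` skips such tests, which should be roughly equivalent to deselecting them with `pytest -m "not bluesky"`.
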
# Authors and contributors

Lead developer/Owner: Oliver Falk (aka ofalk or falko) - https://git.linux-kernel.at/oliver

13
config.py
@@ -296,6 +296,19 @@ TRUSTED_DEFAULT_URLS = list(map(map_legacy_config, TRUSTED_DEFAULT_URLS))
BLUESKY_IDENTIFIER = os.environ.get("BLUESKY_IDENTIFIER", None)
BLUESKY_APP_PASSWORD = os.environ.get("BLUESKY_APP_PASSWORD", None)

# File upload security settings
FILE_UPLOAD_MAX_MEMORY_SIZE = 5 * 1024 * 1024  # 5MB
DATA_UPLOAD_MAX_MEMORY_SIZE = 5 * 1024 * 1024  # 5MB
FILE_UPLOAD_PERMISSIONS = 0o644

# Enhanced file upload security
ENABLE_FILE_SECURITY_VALIDATION = True
ENABLE_EXIF_SANITIZATION = True
ENABLE_MALICIOUS_CONTENT_SCAN = True

# Logging configuration - can be overridden in local config
# Example: LOGS_DIR = "/var/log/ivatar"  # For production deployments

# This MUST BE THE LAST!
if os.path.isfile(os.path.join(BASE_DIR, "config_local.py")):
    from config_local import *  # noqa # flake8: noqa # NOQA # pragma: no cover

46
config_local.py.example
Normal file
@@ -0,0 +1,46 @@
# -*- coding: utf-8 -*-
"""
Example local configuration file for ivatar
Copy this to config_local.py and customize for your environment
"""

import os

# Override logs directory for production deployments
# LOGS_DIR = "/var/log/ivatar"

# Override logs directory for development with custom location
# LOGS_DIR = os.path.join(os.path.expanduser("~"), "ivatar_logs")

# File upload security settings
# ENABLE_FILE_SECURITY_VALIDATION = True
# ENABLE_EXIF_SANITIZATION = True
# ENABLE_MALICIOUS_CONTENT_SCAN = True

# Example production overrides:
# DEBUG = False
# SECRET_KEY = "your-production-secret-key-here"
# ALLOWED_HOSTS = ["yourdomain.com", "www.yourdomain.com"]

# Database configuration (if not using environment variables)
# DATABASES = {
#     'default': {
#         'ENGINE': 'django.db.backends.postgresql',
#         'NAME': 'ivatar_prod',
#         'USER': 'ivatar_user',
#         'PASSWORD': 'your-db-password',
#         'HOST': 'localhost',
#         'PORT': '5432',
#     }
# }

# Email configuration
# EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
# EMAIL_HOST = 'smtp.yourdomain.com'
# EMAIL_PORT = 587
# EMAIL_USE_TLS = True
# EMAIL_HOST_USER = 'noreply@yourdomain.com'
# EMAIL_HOST_PASSWORD = 'your-email-password'

# Example: Override logs directory for production
# LOGS_DIR = "/var/log/ivatar"
3
config_local_test.py
Normal file
@@ -0,0 +1,3 @@
# -*- coding: utf-8 -*-
# Test configuration to verify LOGS_DIR override
LOGS_DIR = "/tmp/ivatar_test_logs"
14
create.sh
@@ -2,11 +2,11 @@

oc new-project ivatar

DB_PASSWORD=`openssl rand -base64 16`
DB_ROOT_PASSWORD=`openssl rand -base64 16`
DB_PASSWORD=$(openssl rand -base64 16)
DB_ROOT_PASSWORD=$(openssl rand -base64 16)

if [ -n "$USE_MYSQL" ]; then
DB_CMDLINE="mysql-persistent
DB_CMDLINE="mysql-persistent
--group=python+mysql-persistent
-e MYSQL_USER=ivatar
-p MYSQL_USER=ivatar
@@ -17,7 +17,7 @@ if [ -n "$USE_MYSQL" ]; then
-e MYSQL_ROOT_PASSWORD=$DB_ROOT_PASSWORD
-p MYSQL_ROOT_PASSWORD=$DB_ROOT_PASSWORD"
else
DB_CMDLINE="postgresql-persistent
DB_CMDLINE="postgresql-persistent
-e POSTGRESQL_USER=ivatar
-p POSTGRESQL_USER=ivatar
-e POSTGRESQL_DATABASE=ivatar
@@ -35,8 +35,8 @@ if [ -n "$LKERNAT_GITLAB_OPENSHIFT_ACCESS_TOKEN" ]; then
fi

oc new-app $SECRET_CMDLINE python~https://git.linux-kernel.at/oliver/ivatar.git \
-e IVATAR_MAILGUN_API_KEY=$IVATAR_MAILGUN_API_KEY \
-e IVATAR_MAILGUN_SENDER_DOMAIN=$IVATAR_MAILGUN_SENDER_DOMAIN \
$DB_CMDLINE
-e IVATAR_MAILGUN_API_KEY=$IVATAR_MAILGUN_API_KEY \
-e IVATAR_MAILGUN_SENDER_DOMAIN=$IVATAR_MAILGUN_SENDER_DOMAIN \
$DB_CMDLINE

oc expose svc/ivatar

2
create_nobody_from_svg_with_inkscape.sh
Executable file → Normal file
@@ -1,4 +1,4 @@
for size in $(seq 1 512); do
inkscape -z -e ivatar/static/img/nobody/${size}.png -w ${size} -h ${size} \
ivatar/static/img/libravatar_logo.svg
ivatar/static/img/libravatar_logo.svg
done

1
cropperjs.zip
Normal file
@@ -0,0 +1 @@
Not Found
337
ivatar/file_security.py
Normal file
@@ -0,0 +1,337 @@
# -*- coding: utf-8 -*-
"""
File upload security utilities for ivatar
"""

import hashlib
import logging
import magic
import os
from io import BytesIO
from typing import Any, Dict, Tuple

from PIL import Image

# Initialize logger
logger = logging.getLogger("ivatar.security")

# Security constants
ALLOWED_MIME_TYPES = [
    "image/jpeg",
    "image/png",
    "image/gif",
    "image/webp",
    "image/bmp",
    "image/tiff",
]

ALLOWED_EXTENSIONS = [".jpg", ".jpeg", ".png", ".gif", ".webp", ".bmp", ".tiff"]

# Magic byte signatures for image formats
IMAGE_SIGNATURES = {
    b"\xff\xd8\xff": "image/jpeg",
    b"\x89PNG\r\n\x1a\n": "image/png",
    b"GIF87a": "image/gif",
    b"GIF89a": "image/gif",
    b"RIFF": "image/webp",  # WebP starts with RIFF
    b"BM": "image/bmp",
    b"II*\x00": "image/tiff",  # Little-endian TIFF
    b"MM\x00*": "image/tiff",  # Big-endian TIFF
}

# Maximum file size for different operations (in bytes)
MAX_FILE_SIZE_BASIC = 5 * 1024 * 1024  # 5MB for basic validation
MAX_FILE_SIZE_SCAN = 10 * 1024 * 1024  # 10MB for virus scanning
MAX_FILE_SIZE_PROCESS = 50 * 1024 * 1024  # 50MB for processing


class FileUploadSecurityError(Exception):
    """Custom exception for file upload security issues"""

    pass


class FileValidator:
    """Comprehensive file validation for uploads"""

    def __init__(self, file_data: bytes, filename: str):
        self.file_data = file_data
        self.filename = filename
        self.file_size = len(file_data)
        self.file_hash = hashlib.sha256(file_data).hexdigest()

    def validate_basic(self) -> Dict[str, Any]:
        """
        Perform basic file validation
        Returns validation results dictionary
        """
        results = {
            "valid": True,
            "errors": [],
            "warnings": [],
            "file_info": {
                "size": self.file_size,
                "hash": self.file_hash,
                "filename": self.filename,
            },
        }

        # Check file size
        if self.file_size > MAX_FILE_SIZE_BASIC:
            results["valid"] = False
            results["errors"].append(f"File too large: {self.file_size} bytes")

        # Check filename
        if not self.filename or len(self.filename) > 255:
            results["valid"] = False
            results["errors"].append("Invalid filename")

        # Check file extension
        ext = os.path.splitext(self.filename)[1].lower()
        if ext not in ALLOWED_EXTENSIONS:
            results["valid"] = False
            results["errors"].append(f"File extension not allowed: {ext}")

        return results

    def validate_magic_bytes(self) -> Dict[str, Any]:
        """
        Validate file using magic bytes (file signatures)
        """
        results = {"valid": True, "detected_type": None, "errors": []}

        # Check magic bytes
        detected_type = None
        for signature, mime_type in IMAGE_SIGNATURES.items():
            if self.file_data.startswith(signature):
                detected_type = mime_type
                break

        # Special handling for WebP (RIFF + WEBP)
        if self.file_data.startswith(b"RIFF") and b"WEBP" in self.file_data[:12]:
            detected_type = "image/webp"

        if not detected_type:
            results["valid"] = False
            results["errors"].append(
                "File signature does not match any supported image format"
            )
        else:
            results["detected_type"] = detected_type

        return results

    def validate_mime_type(self) -> Dict[str, Any]:
        """
        Validate MIME type using python-magic
        """
        results = {"valid": True, "detected_mime": None, "errors": [], "warnings": []}

        try:
            # Use python-magic to detect MIME type
            detected_mime = magic.from_buffer(self.file_data, mime=True)
            results["detected_mime"] = detected_mime

            if detected_mime not in ALLOWED_MIME_TYPES:
                results["valid"] = False
                results["errors"].append(f"MIME type not allowed: {detected_mime}")

        except Exception as e:
            logger.warning(f"MIME type detection failed: {e}")
            results["warnings"].append("Could not detect MIME type")

        return results

    def validate_pil_image(self) -> Dict[str, Any]:
        """
        Validate using PIL to ensure it's a valid image
        """
        results = {"valid": True, "image_info": {}, "errors": [], "warnings": []}

        try:
            # Open image with PIL
            image = Image.open(BytesIO(self.file_data))

            # Get image information
            results["image_info"] = {
                "format": image.format,
                "mode": image.mode,
                "size": image.size,
                "width": image.width,
                "height": image.height,
                "has_transparency": image.mode in ("RGBA", "LA", "P"),
            }

            # Verify image can be loaded
            image.load()

            # Check for suspicious characteristics
            if image.width > 10000 or image.height > 10000:
                results["warnings"].append("Image dimensions are very large")

            if image.width < 1 or image.height < 1:
                results["valid"] = False
                results["errors"].append("Invalid image dimensions")

        except Exception as e:
            results["valid"] = False
            results["errors"].append(f"Invalid image format: {str(e)}")

        return results

    def sanitize_exif_data(self) -> bytes:
        """
        Remove EXIF data from image to prevent metadata leaks
        """
        try:
            image = Image.open(BytesIO(self.file_data))

            # Create new image without EXIF data
            if image.mode in ("RGBA", "LA"):
                # Preserve transparency
                new_image = Image.new("RGBA", image.size, (255, 255, 255, 0))
                new_image.paste(image, mask=image.split()[-1])
            else:
                new_image = Image.new("RGB", image.size, (255, 255, 255))
                new_image.paste(image)

            # Save without EXIF data
            output = BytesIO()
            new_image.save(output, format=image.format or "JPEG", quality=95)
            return output.getvalue()

        except Exception as e:
            logger.warning(f"EXIF sanitization failed: {e}")
            return self.file_data  # Return original if sanitization fails

    def scan_for_malicious_content(self) -> Dict[str, Any]:
        """
        Scan for potentially malicious content patterns
        """
        results = {"suspicious": False, "threats": [], "warnings": []}

        # Check for embedded scripts or executable content
        suspicious_patterns = [
            b"<script",
            b"javascript:",
            b"vbscript:",
            b"data:text/html",
            b"<?php",
            b"<%",
            b"#!/bin/",
            b"MZ",  # PE executable header
            b"\x7fELF",  # ELF executable header
        ]

        for pattern in suspicious_patterns:
            if pattern in self.file_data:
                results["suspicious"] = True
                results["threats"].append(f"Suspicious pattern detected: {pattern}")

        # Check for polyglot files (valid in multiple formats)
        if self.file_data.startswith(b"GIF89a") and b"<script" in self.file_data:
            results["suspicious"] = True
            results["threats"].append("Potential polyglot attack detected")

        return results

    def comprehensive_validation(self) -> Dict[str, Any]:
        """
        Perform comprehensive file validation
        """
        results = {
            "valid": True,
            "errors": [],
            "warnings": [],
            "file_info": {},
            "security_score": 100,
        }

        # Basic validation
        basic_results = self.validate_basic()
        if not basic_results["valid"]:
            results["valid"] = False
            results["errors"].extend(basic_results["errors"])
            results["security_score"] -= 30

        results["file_info"].update(basic_results["file_info"])
        results["warnings"].extend(basic_results["warnings"])

        # Magic bytes validation
        magic_results = self.validate_magic_bytes()
        if not magic_results["valid"]:
            results["valid"] = False
            results["errors"].extend(magic_results["errors"])
            results["security_score"] -= 10  # Reduced from 25 - basic format issue, not security threat

        results["file_info"]["detected_type"] = magic_results["detected_type"]

        # MIME type validation
        mime_results = self.validate_mime_type()
        if not mime_results["valid"]:
            results["valid"] = False
            results["errors"].extend(mime_results["errors"])
            results["security_score"] -= 10  # Reduced from 20 - basic format issue, not security threat

        results["file_info"]["detected_mime"] = mime_results["detected_mime"]
        results["warnings"].extend(mime_results.get("warnings", []))

        # PIL image validation
        pil_results = self.validate_pil_image()
        if not pil_results["valid"]:
            results["valid"] = False
            results["errors"].extend(pil_results["errors"])
            results["security_score"] -= 10  # Reduced from 15 - basic format issue, not security threat

        results["file_info"]["image_info"] = pil_results["image_info"]
        results["warnings"].extend(pil_results.get("warnings", []))

        # Security scan
        security_results = self.scan_for_malicious_content()
        if security_results["suspicious"]:
            results["valid"] = False
            results["errors"].extend(security_results["threats"])
            results["security_score"] -= 50

        results["warnings"].extend(security_results.get("warnings", []))

        # Log security events
        if not results["valid"]:
            logger.warning(f"File upload validation failed: {results['errors']}")
        elif results["security_score"] < 80:
            logger.info(
                f"File upload with low security score: {results['security_score']}"
            )

        return results


def validate_uploaded_file(
    file_data: bytes, filename: str
) -> Tuple[bool, Dict[str, Any], bytes]:
    """
    Main function to validate uploaded files

    Returns:
        (is_valid, validation_results, sanitized_data)
    """
    validator = FileValidator(file_data, filename)

    # Perform comprehensive validation
    results = validator.comprehensive_validation()

    if not results["valid"]:
        return False, results, file_data

    # Sanitize EXIF data
    sanitized_data = validator.sanitize_exif_data()

    return True, results, sanitized_data


def get_file_security_report(file_data: bytes, filename: str) -> Dict[str, Any]:
    """
    Generate a security report for a file without modifying it
    """
    validator = FileValidator(file_data, filename)
    return validator.comprehensive_validation()

ivatar/ivataraccount/forms.py
@@ -6,15 +6,22 @@ from urllib.parse import urlsplit, urlunsplit
|
||||
|
||||
from django import forms
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from django.core.exceptions import ValidationError
|
||||
|
||||
from ipware import get_client_ip
|
||||
|
||||
from ivatar import settings
|
||||
from ivatar.settings import MIN_LENGTH_EMAIL, MAX_LENGTH_EMAIL
|
||||
from ivatar.settings import MIN_LENGTH_URL, MAX_LENGTH_URL
|
||||
from ivatar.settings import ENABLE_FILE_SECURITY_VALIDATION
|
||||
from ivatar.file_security import validate_uploaded_file, FileUploadSecurityError
|
||||
from .models import UnconfirmedEmail, ConfirmedEmail, Photo
|
||||
from .models import UnconfirmedOpenId, ConfirmedOpenId
|
||||
from .models import UserPreference
|
||||
import logging
|
||||
|
||||
# Initialize logger
|
||||
logger = logging.getLogger("ivatar.ivataraccount.forms")
|
||||
|
||||
|
||||
MAX_NUM_UNCONFIRMED_EMAILS_DEFAULT = 5
|
||||
@@ -81,7 +88,7 @@ class AddEmailForm(forms.Form):
|
||||
|
||||
class UploadPhotoForm(forms.Form):
|
||||
"""
|
||||
Form handling photo upload
|
||||
Form handling photo upload with enhanced security validation
|
||||
"""
|
||||
|
||||
photo = forms.FileField(
|
||||
@@ -107,16 +114,102 @@ class UploadPhotoForm(forms.Form):
|
||||
},
|
||||
)
|
||||
|
||||
@staticmethod
|
||||
def save(request, data):
|
||||
def clean_photo(self):
|
||||
"""
|
||||
Save the model and assign it to the current user
|
||||
Enhanced photo validation with security checks
|
||||
"""
|
||||
photo = self.cleaned_data.get("photo")
|
||||
|
||||
if not photo:
|
||||
raise ValidationError(_("No file provided"))
|
||||
|
||||
# Read file data
|
||||
try:
|
||||
# Handle different file types
|
||||
if hasattr(photo, 'read'):
|
||||
file_data = photo.read()
|
||||
elif hasattr(photo, 'file'):
|
||||
file_data = photo.file.read()
|
||||
else:
|
||||
file_data = bytes(photo)
|
||||
filename = photo.name
|
||||
except Exception as e:
|
||||
logger.error(f"Error reading uploaded file: {e}")
|
||||
raise ValidationError(_("Error reading uploaded file"))
|
||||
|
||||
# Perform comprehensive security validation (if enabled)
|
||||
if ENABLE_FILE_SECURITY_VALIDATION:
|
||||
try:
|
||||
is_valid, validation_results, sanitized_data = validate_uploaded_file(
|
||||
file_data, filename
|
||||
)
|
||||
|
||||
if not is_valid:
|
||||
# Log security violation
|
||||
logger.warning(
|
||||
f"File upload security violation: {validation_results['errors']}"
|
||||
)
|
||||
|
||||
# Only reject truly malicious files at the form level
|
||||
# Allow basic format issues to pass through to Photo.save() for original error handling
|
||||
if validation_results.get("security_score", 100) < 30:
|
||||
raise ValidationError(
|
||||
_("File appears to be malicious and cannot be uploaded")
|
||||
)
|
||||
else:
|
||||
# For format issues, don't raise ValidationError - let Photo.save() handle it
|
||||
# This preserves the original error handling behavior
|
||||
logger.info(f"File format issue detected, allowing Photo.save() to handle: {validation_results['errors']}")
|
||||
# Store the validation results for potential use, but don't reject the form
|
||||
self.validation_results = validation_results
|
||||
self.file_data = file_data
|
||||
else:
|
||||
# Store sanitized data for later use
|
||||
self.sanitized_data = sanitized_data
|
||||
self.validation_results = validation_results
|
||||
# Store original file data for fallback
|
||||
self.file_data = file_data
|
||||
|
||||
# Log successful validation
|
||||
logger.info(
|
||||
f"File upload validated successfully: {filename}, security_score: {validation_results.get('security_score', 100)}"
|
||||
)
|
||||
|
||||
except FileUploadSecurityError as e:
|
||||
logger.error(f"File upload security error: {e}")
|
||||
raise ValidationError(_("File security validation failed"))
|
||||
except Exception as e:
|
||||
logger.error(f"Unexpected error during file validation: {e}")
|
||||
raise ValidationError(_("File validation failed"))
|
||||
else:
|
||||
# Security validation disabled (e.g., in tests)
|
||||
logger.debug(f"File upload security validation disabled for: {filename}")
|
||||
self.file_data = file_data
|
||||
|
||||
return photo
|
||||
|
||||
def save(self, request, data):
|
||||
"""
|
||||
Save the model and assign it to the current user with enhanced security
|
||||
"""
|
||||
# Link this file to the user's profile
|
||||
photo = Photo()
|
||||
photo.user = request.user
|
||||
photo.ip_address = get_client_ip(request)[0]
|
||||
photo.data = data.read()
|
||||
|
||||
# Use sanitized data if available, otherwise use stored file data
|
||||
if hasattr(self, "sanitized_data"):
|
||||
photo.data = self.sanitized_data
|
||||
elif hasattr(self, "file_data"):
|
||||
photo.data = self.file_data
|
||||
else:
|
||||
# Fallback: try to read from the file object
|
||||
try:
|
||||
photo.data = data.read()
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to read file data: {e}")
|
||||
photo.data = b""
|
||||
|
||||
photo.save()
|
||||
return photo if photo.pk else None
|
||||
|
||||
|
||||
@@ -1,9 +1,9 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.0.5 on 2018-05-07 07:13
|
||||
|
||||
from django.conf import settings
|
||||
from django.db import migrations, models
|
||||
import django.db.models.deletion
|
||||
import ivatar.ivataraccount.models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
@@ -16,93 +16,167 @@ class Migration(migrations.Migration):
|
||||
|
||||
operations = [
|
||||
migrations.CreateModel(
|
||||
name='ConfirmedEmail',
|
||||
name="ConfirmedEmail",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('ip_address', models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
('add_date', models.DateTimeField()),
|
||||
('email', models.EmailField(max_length=254, unique=True)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("ip_address", models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
("add_date", models.DateTimeField()),
|
||||
("email", models.EmailField(max_length=254, unique=True)),
|
||||
],
|
||||
options={
|
||||
'verbose_name': 'confirmed email',
|
||||
'verbose_name_plural': 'confirmed emails',
|
||||
"verbose_name": "confirmed email",
|
||||
"verbose_name_plural": "confirmed emails",
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='ConfirmedOpenId',
|
||||
name="ConfirmedOpenId",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('ip_address', models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
('add_date', models.DateTimeField()),
|
||||
('openid', models.URLField(max_length=255, unique=True)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("ip_address", models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
("add_date", models.DateTimeField()),
|
||||
("openid", models.URLField(max_length=255, unique=True)),
|
||||
],
|
||||
options={
|
||||
'verbose_name': 'confirmed OpenID',
|
||||
'verbose_name_plural': 'confirmed OpenIDs',
|
||||
"verbose_name": "confirmed OpenID",
|
||||
"verbose_name_plural": "confirmed OpenIDs",
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='Photo',
|
||||
name="Photo",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('add_date', models.DateTimeField()),
|
||||
('ip_address', models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
('data', models.BinaryField()),
|
||||
('format', models.CharField(max_length=3)),
|
||||
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("add_date", models.DateTimeField()),
|
||||
("ip_address", models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
("data", models.BinaryField()),
|
||||
("format", models.CharField(max_length=3)),
|
||||
(
|
||||
"user",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to=settings.AUTH_USER_MODEL,
|
||||
),
|
||||
),
|
||||
],
|
||||
options={
|
||||
'verbose_name': 'photo',
|
||||
'verbose_name_plural': 'photos',
|
||||
"verbose_name": "photo",
|
||||
"verbose_name_plural": "photos",
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='UnconfirmedEmail',
|
||||
name="UnconfirmedEmail",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('ip_address', models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
('add_date', models.DateTimeField()),
|
||||
('email', models.EmailField(max_length=254)),
|
||||
('verification_key', models.CharField(max_length=64)),
|
||||
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("ip_address", models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
("add_date", models.DateTimeField()),
|
||||
("email", models.EmailField(max_length=254)),
|
||||
("verification_key", models.CharField(max_length=64)),
|
||||
(
|
||||
"user",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to=settings.AUTH_USER_MODEL,
|
||||
),
|
||||
),
|
||||
],
|
||||
options={
|
||||
'verbose_name': 'unconfirmed_email',
|
||||
'verbose_name_plural': 'unconfirmed_emails',
|
||||
"verbose_name": "unconfirmed_email",
|
||||
"verbose_name_plural": "unconfirmed_emails",
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='UnconfirmedOpenId',
|
||||
name="UnconfirmedOpenId",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('ip_address', models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
('add_date', models.DateTimeField()),
|
||||
('openid', models.URLField(max_length=255)),
|
||||
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("ip_address", models.GenericIPAddressField(unpack_ipv4=True)),
|
||||
("add_date", models.DateTimeField()),
|
||||
("openid", models.URLField(max_length=255)),
|
||||
(
|
||||
"user",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to=settings.AUTH_USER_MODEL,
|
||||
),
|
||||
),
|
||||
],
|
||||
options={
|
||||
'verbose_name': 'unconfirmed OpenID',
|
||||
'verbose_name_plural': 'unconfirmed_OpenIDs',
|
||||
"verbose_name": "unconfirmed OpenID",
|
||||
"verbose_name_plural": "unconfirmed_OpenIDs",
|
||||
},
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='confirmedopenid',
|
||||
name='photo',
|
||||
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='openids', to='ivataraccount.Photo'),
|
||||
model_name="confirmedopenid",
|
||||
name="photo",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
related_name="openids",
|
||||
to="ivataraccount.Photo",
|
||||
),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='confirmedopenid',
|
||||
name='user',
|
||||
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL),
|
||||
model_name="confirmedopenid",
|
||||
name="user",
|
||||
field=models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL
|
||||
),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='confirmedemail',
|
||||
name='photo',
|
||||
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='emails', to='ivataraccount.Photo'),
|
||||
model_name="confirmedemail",
|
||||
name="photo",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
related_name="emails",
|
||||
to="ivataraccount.Photo",
|
||||
),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='confirmedemail',
|
||||
name='user',
|
||||
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL),
|
||||
model_name="confirmedemail",
|
||||
name="user",
|
||||
field=models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL
|
||||
),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.0.5 on 2018-05-07 07:23
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,29 +7,45 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0001_initial'),
|
||||
("ivataraccount", "0001_initial"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.CreateModel(
|
||||
name='OpenIDAssociation',
|
||||
name="OpenIDAssociation",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('server_url', models.TextField(max_length=2047)),
|
||||
('handle', models.CharField(max_length=255)),
|
||||
('secret', models.TextField(max_length=255)),
|
||||
('issued', models.IntegerField()),
|
||||
('lifetime', models.IntegerField()),
|
||||
('assoc_type', models.TextField(max_length=64)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("server_url", models.TextField(max_length=2047)),
|
||||
("handle", models.CharField(max_length=255)),
|
||||
("secret", models.TextField(max_length=255)),
|
||||
("issued", models.IntegerField()),
|
||||
("lifetime", models.IntegerField()),
|
||||
("assoc_type", models.TextField(max_length=64)),
|
||||
],
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name='OpenIDNonce',
|
||||
name="OpenIDNonce",
|
||||
fields=[
|
||||
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
|
||||
('server_url', models.CharField(max_length=255)),
|
||||
('timestamp', models.IntegerField()),
|
||||
('salt', models.CharField(max_length=128)),
|
||||
(
|
||||
"id",
|
||||
models.AutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("server_url", models.CharField(max_length=255)),
|
||||
("timestamp", models.IntegerField()),
|
||||
("salt", models.CharField(max_length=128)),
|
||||
],
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.0.5 on 2018-05-08 06:37
|
||||
|
||||
import datetime
|
||||
@@ -7,53 +8,53 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0002_openidassociation_openidnonce'),
|
||||
("ivataraccount", "0002_openidassociation_openidnonce"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='confirmedemail',
|
||||
name='add_date',
|
||||
model_name="confirmedemail",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=datetime.datetime.utcnow),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='confirmedemail',
|
||||
name='ip_address',
|
||||
model_name="confirmedemail",
|
||||
name="ip_address",
|
||||
field=models.GenericIPAddressField(null=True, unpack_ipv4=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='confirmedopenid',
|
||||
name='add_date',
|
||||
model_name="confirmedopenid",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=datetime.datetime.utcnow),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='confirmedopenid',
|
||||
name='ip_address',
|
||||
model_name="confirmedopenid",
|
||||
name="ip_address",
|
||||
field=models.GenericIPAddressField(null=True, unpack_ipv4=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='photo',
|
||||
name='add_date',
|
||||
model_name="photo",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=datetime.datetime.utcnow),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='unconfirmedemail',
|
||||
name='add_date',
|
||||
model_name="unconfirmedemail",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=datetime.datetime.utcnow),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='unconfirmedemail',
|
||||
name='ip_address',
|
||||
model_name="unconfirmedemail",
|
||||
name="ip_address",
|
||||
field=models.GenericIPAddressField(null=True, unpack_ipv4=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='unconfirmedopenid',
|
||||
name='add_date',
|
||||
model_name="unconfirmedopenid",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=datetime.datetime.utcnow),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='unconfirmedopenid',
|
||||
name='ip_address',
|
||||
model_name="unconfirmedopenid",
|
||||
name="ip_address",
|
||||
field=models.GenericIPAddressField(null=True, unpack_ipv4=True),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.0.5 on 2018-05-08 07:42
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -7,33 +8,33 @@ import django.utils.timezone
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0003_auto_20180508_0637'),
|
||||
("ivataraccount", "0003_auto_20180508_0637"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='confirmedemail',
|
||||
name='add_date',
|
||||
model_name="confirmedemail",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='confirmedopenid',
|
||||
name='add_date',
|
||||
model_name="confirmedopenid",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='photo',
|
||||
name='add_date',
|
||||
model_name="photo",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='unconfirmedemail',
|
||||
name='add_date',
|
||||
model_name="unconfirmedemail",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='unconfirmedopenid',
|
||||
name='add_date',
|
||||
model_name="unconfirmedopenid",
|
||||
name="add_date",
|
||||
field=models.DateTimeField(default=django.utils.timezone.now),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.0.5 on 2018-05-22 11:55
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,20 +7,20 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0004_auto_20180508_0742'),
|
||||
("ivataraccount", "0004_auto_20180508_0742"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name='confirmedemail',
|
||||
name='digest',
|
||||
field=models.CharField(default='', max_length=64),
|
||||
model_name="confirmedemail",
|
||||
name="digest",
|
||||
field=models.CharField(default="", max_length=64),
|
||||
preserve_default=False,
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='confirmedopenid',
|
||||
name='digest',
|
||||
field=models.CharField(default='', max_length=64),
|
||||
model_name="confirmedopenid",
|
||||
name="digest",
|
||||
field=models.CharField(default="", max_length=64),
|
||||
preserve_default=False,
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.0.6 on 2018-06-26 14:45
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,18 +7,18 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0005_auto_20180522_1155'),
|
||||
("ivataraccount", "0005_auto_20180522_1155"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name='confirmedemail',
|
||||
name='digest_sha256',
|
||||
model_name="confirmedemail",
|
||||
name="digest_sha256",
|
||||
field=models.CharField(max_length=64, null=True),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='confirmedemail',
|
||||
name='digest',
|
||||
model_name="confirmedemail",
|
||||
name="digest",
|
||||
field=models.CharField(max_length=32),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,39 +1,53 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.0.6 on 2018-06-27 06:24
|
||||
|
||||
from django.db import migrations, models
|
||||
import django.db.models.deletion
|
||||
|
||||
|
||||
def add_sha256(apps, schema_editor):
|
||||
'''
|
||||
Make sure all ConfirmedEmail have digest_sha256 set
|
||||
in order to alter the model so sha256 may not be NULL
|
||||
'''
|
||||
ConfirmedEmail = apps.get_model('ivataraccount', 'ConfirmedEmail')
|
||||
for mail in ConfirmedEmail.objects.filter(digest_sha256=None):
|
||||
mail.save() # pragma: no cover
|
||||
"""
|
||||
Make sure all ConfirmedEmail have digest_sha256 set
|
||||
in order to alter the model so sha256 may not be NULL
|
||||
"""
|
||||
ConfirmedEmail = apps.get_model("ivataraccount", "ConfirmedEmail")
|
||||
for mail in ConfirmedEmail.objects.filter(digest_sha256=None):
|
||||
mail.save() # pragma: no cover
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0006_auto_20180626_1445'),
|
||||
("ivataraccount", "0006_auto_20180626_1445"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.RunPython(add_sha256),
|
||||
migrations.AlterField(
|
||||
model_name='confirmedemail',
|
||||
name='digest_sha256',
|
||||
model_name="confirmedemail",
|
||||
name="digest_sha256",
|
||||
field=models.CharField(max_length=64),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='confirmedemail',
|
||||
name='photo',
|
||||
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='emails', to='ivataraccount.Photo'),
|
||||
model_name="confirmedemail",
|
||||
name="photo",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.SET_NULL,
|
||||
related_name="emails",
|
||||
to="ivataraccount.Photo",
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='confirmedopenid',
|
||||
name='photo',
|
||||
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='openids', to='ivataraccount.Photo'),
|
||||
model_name="confirmedopenid",
|
||||
name="photo",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.SET_NULL,
|
||||
related_name="openids",
|
||||
to="ivataraccount.Photo",
|
||||
),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# pylint: disable=invalid-name,missing-docstring
|
||||
# Generated by Django 2.0.6 on 2018-07-04 12:32
|
||||
|
||||
@@ -7,11 +8,14 @@ import django.db.models.deletion
|
||||
|
||||
|
||||
def add_preference_to_user(apps, schema_editor): # pylint: disable=unused-argument
|
||||
'''
|
||||
"""
|
||||
Make sure all users have preferences set up
|
||||
'''
|
||||
"""
|
||||
from django.contrib.auth.models import User
|
||||
UserPreference = apps.get_model('ivataraccount', 'UserPreference') # pylint: disable=invalid-name
|
||||
|
||||
UserPreference = apps.get_model(
|
||||
"ivataraccount", "UserPreference"
|
||||
) # pylint: disable=invalid-name
|
||||
for user in User.objects.filter(userpreference=None):
|
||||
pref = UserPreference.objects.create(user_id=user.pk) # pragma: no cover
|
||||
pref.save() # pragma: no cover
|
||||
@@ -20,24 +24,34 @@ def add_preference_to_user(apps, schema_editor): # pylint: disable=unused-argum
|
||||
class Migration(migrations.Migration): # pylint: disable=missing-docstring
|
||||
|
||||
dependencies = [
|
||||
('auth', '0009_alter_user_last_name_max_length'),
|
||||
('ivataraccount', '0007_auto_20180627_0624'),
|
||||
("auth", "0009_alter_user_last_name_max_length"),
|
||||
("ivataraccount", "0007_auto_20180627_0624"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.CreateModel(
|
||||
name='UserPreference',
|
||||
name="UserPreference",
|
||||
fields=[
|
||||
('theme', models.CharField(
|
||||
choices=[
|
||||
('default', 'Default theme'),
|
||||
('clime', 'Climes theme')],
|
||||
default='default', max_length=10)),
|
||||
('user', models.OneToOneField(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
to=settings.AUTH_USER_MODEL)),
|
||||
(
|
||||
"theme",
|
||||
models.CharField(
|
||||
choices=[
|
||||
("default", "Default theme"),
|
||||
("clime", "Climes theme"),
|
||||
],
|
||||
default="default",
|
||||
max_length=10,
|
||||
),
|
||||
),
|
||||
(
|
||||
"user",
|
||||
models.OneToOneField(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
to=settings.AUTH_USER_MODEL,
|
||||
),
|
||||
),
|
||||
],
|
||||
),
|
||||
migrations.RunPython(add_preference_to_user),
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.0.6 on 2018-07-05 11:52
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,13 +7,21 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0008_userpreference'),
|
||||
("ivataraccount", "0008_userpreference"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='userpreference',
|
||||
name='theme',
|
||||
field=models.CharField(choices=[('default', 'Default theme'), ('clime', 'climes theme'), ('falko', 'falkos theme')], default='default', max_length=10),
|
||||
model_name="userpreference",
|
||||
name="theme",
|
||||
field=models.CharField(
|
||||
choices=[
|
||||
("default", "Default theme"),
|
||||
("clime", "climes theme"),
|
||||
("falko", "falkos theme"),
|
||||
],
|
||||
default="default",
|
||||
max_length=10,
|
||||
),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.0.6 on 2018-07-05 12:01
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,13 +7,17 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0009_auto_20180705_1152'),
|
||||
("ivataraccount", "0009_auto_20180705_1152"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='userpreference',
|
||||
name='theme',
|
||||
field=models.CharField(choices=[('default', 'Default theme'), ('falko', 'falkos theme')], default='default', max_length=10),
|
||||
model_name="userpreference",
|
||||
name="theme",
|
||||
field=models.CharField(
|
||||
choices=[("default", "Default theme"), ("falko", "falkos theme")],
|
||||
default="default",
|
||||
max_length=10,
|
||||
),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.1.3 on 2018-11-07 15:50
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,18 +7,26 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0010_auto_20180705_1201'),
|
||||
("ivataraccount", "0010_auto_20180705_1201"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name='photo',
|
||||
name='access_count',
|
||||
model_name="photo",
|
||||
name="access_count",
|
||||
field=models.BigIntegerField(default=0, editable=False),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='userpreference',
|
||||
name='theme',
|
||||
field=models.CharField(choices=[('default', 'Default theme'), ('clime', 'climes theme'), ('falko', 'falkos theme')], default='default', max_length=10),
|
||||
model_name="userpreference",
|
||||
name="theme",
|
||||
field=models.CharField(
|
||||
choices=[
|
||||
("default", "Default theme"),
|
||||
("clime", "climes theme"),
|
||||
("falko", "falkos theme"),
|
||||
],
|
||||
default="default",
|
||||
max_length=10,
|
||||
),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.1.3 on 2018-11-07 17:32
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,18 +7,18 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0011_auto_20181107_1550'),
|
||||
("ivataraccount", "0011_auto_20181107_1550"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name='confirmedemail',
|
||||
name='access_count',
|
||||
model_name="confirmedemail",
|
||||
name="access_count",
|
||||
field=models.BigIntegerField(default=0, editable=False),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='confirmedopenid',
|
||||
name='access_count',
|
||||
model_name="confirmedopenid",
|
||||
name="access_count",
|
||||
field=models.BigIntegerField(default=0, editable=False),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.1.3 on 2018-12-03 14:21
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,13 +7,22 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0012_auto_20181107_1732'),
|
||||
("ivataraccount", "0012_auto_20181107_1732"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='userpreference',
|
||||
name='theme',
|
||||
field=models.CharField(choices=[('default', 'Default theme'), ('clime', 'climes theme'), ('green', 'green theme'), ('red', 'red theme')], default='default', max_length=10),
|
||||
model_name="userpreference",
|
||||
name="theme",
|
||||
field=models.CharField(
|
||||
choices=[
|
||||
("default", "Default theme"),
|
||||
("clime", "climes theme"),
|
||||
("green", "green theme"),
|
||||
("red", "red theme"),
|
||||
],
|
||||
default="default",
|
||||
max_length=10,
|
||||
),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 2.1.5 on 2019-02-18 16:02
|
||||
|
||||
from django.db import migrations
|
||||
@@ -6,12 +7,15 @@ from django.db import migrations
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0013_auto_20181203_1421'),
|
||||
("ivataraccount", "0013_auto_20181203_1421"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterModelOptions(
|
||||
name='unconfirmedemail',
|
||||
options={'verbose_name': 'unconfirmed email', 'verbose_name_plural': 'unconfirmed emails'},
|
||||
name="unconfirmedemail",
|
||||
options={
|
||||
"verbose_name": "unconfirmed email",
|
||||
"verbose_name_plural": "unconfirmed emails",
|
||||
},
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 3.0.3 on 2020-02-25 09:34
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,23 +7,23 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0014_auto_20190218_1602'),
|
||||
("ivataraccount", "0014_auto_20190218_1602"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name='confirmedopenid',
|
||||
name='alt_digest1',
|
||||
model_name="confirmedopenid",
|
||||
name="alt_digest1",
|
||||
field=models.CharField(blank=True, default=None, max_length=64, null=True),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='confirmedopenid',
|
||||
name='alt_digest2',
|
||||
model_name="confirmedopenid",
|
||||
name="alt_digest2",
|
||||
field=models.CharField(blank=True, default=None, max_length=64, null=True),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='confirmedopenid',
|
||||
name='alt_digest3',
|
||||
model_name="confirmedopenid",
|
||||
name="alt_digest3",
|
||||
field=models.CharField(blank=True, default=None, max_length=64, null=True),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 3.1.7 on 2021-04-13 09:04
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,18 +7,18 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0015_auto_20200225_0934'),
|
||||
("ivataraccount", "0015_auto_20200225_0934"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AddField(
|
||||
model_name='unconfirmedemail',
|
||||
name='last_send_date',
|
||||
model_name="unconfirmedemail",
|
||||
name="last_send_date",
|
||||
field=models.DateTimeField(blank=True, null=True),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name='unconfirmedemail',
|
||||
name='last_status',
|
||||
model_name="unconfirmedemail",
|
||||
name="last_status",
|
||||
field=models.TextField(blank=True, max_length=2047, null=True),
|
||||
),
|
||||
]
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated by Django 3.2.3 on 2021-05-28 13:14
|
||||
|
||||
from django.db import migrations, models
|
||||
@@ -6,43 +7,57 @@ from django.db import migrations, models
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
('ivataraccount', '0016_auto_20210413_0904'),
|
||||
("ivataraccount", "0016_auto_20210413_0904"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.AlterField(
|
||||
model_name='confirmedemail',
|
||||
name='id',
|
||||
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
|
||||
model_name="confirmedemail",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='confirmedopenid',
|
||||
name='id',
|
||||
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
|
||||
model_name="confirmedopenid",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='openidassociation',
|
||||
name='id',
|
||||
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
|
||||
model_name="openidassociation",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='openidnonce',
|
||||
name='id',
|
||||
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
|
||||
model_name="openidnonce",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='photo',
|
||||
name='id',
|
||||
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
|
||||
model_name="photo",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='unconfirmedemail',
|
||||
name='id',
|
||||
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
|
||||
model_name="unconfirmedemail",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
|
||||
),
|
||||
),
|
||||
migrations.AlterField(
|
||||
model_name='unconfirmedopenid',
|
||||
name='id',
|
||||
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
|
||||
model_name="unconfirmedopenid",
|
||||
name="id",
|
||||
field=models.BigAutoField(
|
||||
auto_created=True, primary_key=True, serialize=False, verbose_name="ID"
|
||||
),
|
||||
),
|
||||
]
|
||||
|
||||
130
ivatar/ivataraccount/migrations/0021_add_performance_indexes.py
Normal file
@@ -0,0 +1,130 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# Generated manually for performance optimization
|
||||
|
||||
from typing import Any, List, Tuple, Optional
|
||||
from django.db import migrations, connection
|
||||
|
||||
|
||||
def create_indexes(apps: Any, schema_editor: Any) -> None:
|
||||
"""
|
||||
Create performance indexes for both PostgreSQL and MySQL compatibility.
|
||||
Uses CONCURRENTLY for PostgreSQL production, regular CREATE INDEX for tests/transactions.
|
||||
"""
|
||||
db_engine = connection.vendor
|
||||
|
||||
indexes: List[Tuple[str, str, str, Optional[str]]] = [
|
||||
# ConfirmedEmail indexes
|
||||
("idx_cemail_digest", "ivataraccount_confirmedemail", "digest", None),
|
||||
(
|
||||
"idx_cemail_digest_sha256",
|
||||
"ivataraccount_confirmedemail",
|
||||
"digest_sha256",
|
||||
None,
|
||||
),
|
||||
(
|
||||
"idx_cemail_access_count",
|
||||
"ivataraccount_confirmedemail",
|
||||
"access_count",
|
||||
None,
|
||||
),
|
||||
(
|
||||
"idx_cemail_bluesky_handle",
|
||||
"ivataraccount_confirmedemail",
|
||||
"bluesky_handle",
|
||||
"WHERE bluesky_handle IS NOT NULL",
|
||||
),
|
||||
# Photo indexes
|
||||
("idx_photo_format", "ivataraccount_photo", "format", None),
|
||||
("idx_photo_access_count", "ivataraccount_photo", "access_count", None),
|
||||
# Composite indexes
|
||||
(
|
||||
"idx_cemail_user_access",
|
||||
"ivataraccount_confirmedemail",
|
||||
"user_id, access_count",
|
||||
None,
|
||||
),
|
||||
(
|
||||
"idx_cemail_photo_access",
|
||||
"ivataraccount_confirmedemail",
|
||||
"photo_id, access_count",
|
||||
None,
|
||||
),
|
||||
("idx_photo_user_format", "ivataraccount_photo", "user_id, format", None),
|
||||
]
|
||||
|
||||
with connection.cursor() as cursor:
|
||||
# Check if we're in a transaction (test environment)
|
||||
try:
|
||||
cursor.execute("SELECT 1")
|
||||
in_transaction = connection.in_atomic_block
|
||||
except Exception:
|
||||
in_transaction = True
|
||||
|
||||
for index_name, table_name, columns, where_clause in indexes:
|
||||
try:
|
||||
if db_engine == "postgresql":
|
||||
# Use CONCURRENTLY only if not in a transaction (production)
|
||||
# Use regular CREATE INDEX if in a transaction (tests)
|
||||
if in_transaction:
|
||||
# In transaction (test environment) - use regular CREATE INDEX
|
||||
if where_clause:
|
||||
sql = f"CREATE INDEX IF NOT EXISTS {index_name} ON {table_name}({columns}) {where_clause};"
|
||||
else:
|
||||
sql = f"CREATE INDEX IF NOT EXISTS {index_name} ON {table_name}({columns});"
|
||||
else:
|
||||
# Not in transaction (production) - use CONCURRENTLY
|
||||
if where_clause:
|
||||
sql = f"CREATE INDEX CONCURRENTLY IF NOT EXISTS {index_name} ON {table_name}({columns}) {where_clause};"
|
||||
else:
|
||||
sql = f"CREATE INDEX CONCURRENTLY IF NOT EXISTS {index_name} ON {table_name}({columns});"
|
||||
else:
|
||||
# MySQL and other databases - skip partial indexes
|
||||
if where_clause:
|
||||
print(
|
||||
f"Skipping partial index {index_name} for {db_engine} (not supported)"
|
||||
)
|
||||
continue
|
||||
sql = f"CREATE INDEX IF NOT EXISTS {index_name} ON {table_name}({columns});"
|
||||
|
||||
cursor.execute(sql)
|
||||
print(f"Created index: {index_name}")
|
||||
|
||||
except Exception as e:
|
||||
# Index might already exist or other error - log and continue
|
||||
print(f"Index {index_name} creation skipped: {e}")
|
||||
|
||||
|
||||
def drop_indexes(apps: Any, schema_editor: Any) -> None:
|
||||
"""
|
||||
Drop the performance indexes.
|
||||
"""
|
||||
indexes: List[str] = [
|
||||
"idx_cemail_digest",
|
||||
"idx_cemail_digest_sha256",
|
||||
"idx_cemail_access_count",
|
||||
"idx_cemail_bluesky_handle",
|
||||
"idx_photo_format",
|
||||
"idx_photo_access_count",
|
||||
"idx_cemail_user_access",
|
||||
"idx_cemail_photo_access",
|
||||
"idx_photo_user_format",
|
||||
]
|
||||
|
||||
with connection.cursor() as cursor:
|
||||
for index_name in indexes:
|
||||
try:
|
||||
cursor.execute(f"DROP INDEX IF EXISTS {index_name};")
|
||||
print(f"Dropped index: {index_name}")
|
||||
except Exception as e:
|
||||
print(f"Index {index_name} drop skipped: {e}")
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
("ivataraccount", "0020_confirmedopenid_bluesky_handle"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.RunPython(create_indexes, drop_indexes),
|
||||
]
|
||||
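
Because the indexes above are created inside `RunPython` with raw SQL rather than with `AddIndex` operations, `sqlmigrate` cannot render the `CREATE INDEX` statements. A quick, read-only way to confirm they exist after applying the migration is to introspect the two tables directly. This sketch is not part of the merge request; it only reuses the table and index names defined above together with standard Django introspection (e.g. run from `python manage.py shell`):

```python
# Hedged verification sketch - table and index names come from the migration
# above; connection.introspection is standard Django.
from django.db import connection

EXPECTED = {
    "idx_cemail_digest", "idx_cemail_digest_sha256", "idx_cemail_access_count",
    "idx_cemail_bluesky_handle", "idx_cemail_user_access", "idx_cemail_photo_access",
    "idx_photo_format", "idx_photo_access_count", "idx_photo_user_format",
}

with connection.cursor() as cursor:
    for table in ("ivataraccount_confirmedemail", "ivataraccount_photo"):
        constraints = connection.introspection.get_constraints(cursor, table)
        present = {name for name, info in constraints.items() if info.get("index")}
        print(table, sorted(EXPECTED & present))
```
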
@@ -11,6 +11,7 @@ from os import urandom
|
||||
from urllib.error import HTTPError, URLError
|
||||
from ivatar.utils import urlopen, Bluesky
|
||||
from urllib.parse import urlsplit, urlunsplit, quote
|
||||
import logging
|
||||
|
||||
from PIL import Image
|
||||
from django.contrib.auth.models import User
|
||||
@@ -30,13 +31,16 @@ from openid.store.interface import OpenIDStore
|
||||
|
||||
from libravatar import libravatar_url
|
||||
|
||||
from ivatar.settings import MAX_LENGTH_EMAIL, logger
|
||||
from ivatar.settings import MAX_LENGTH_EMAIL
|
||||
from ivatar.settings import MAX_PIXELS, AVATAR_MAX_SIZE, JPEG_QUALITY
|
||||
from ivatar.settings import MAX_LENGTH_URL
|
||||
from ivatar.settings import SECURE_BASE_URL, SITE_NAME, DEFAULT_FROM_EMAIL
|
||||
from ivatar.utils import openid_variations
|
||||
from .gravatar import get_photo as get_gravatar_photo
|
||||
|
||||
# Initialize logger
|
||||
logger = logging.getLogger("ivatar")
|
||||
|
||||
|
||||
def file_format(image_type):
|
||||
"""
|
||||
@@ -135,6 +139,11 @@ class Photo(BaseAccountModel):
|
||||
|
||||
verbose_name = _("photo")
|
||||
verbose_name_plural = _("photos")
|
||||
indexes = [
    models.Index(fields=["format"], name="idx_photo_format"),
    models.Index(fields=["access_count"], name="idx_photo_access_count"),
    models.Index(fields=["user_id", "format"], name="idx_photo_user_format"),
]
|
||||
|
||||
def import_image(self, service_name, email_address):
|
||||
"""
|
||||
@@ -154,10 +163,12 @@ class Photo(BaseAccountModel):
|
||||
try:
|
||||
image = urlopen(image_url)
|
||||
except HTTPError as exc:
|
||||
print(f"{service_name} import failed with an HTTP error: {exc.code}")
|
||||
logger.warning(
|
||||
f"{service_name} import failed with an HTTP error: {exc.code}"
|
||||
)
|
||||
return False
|
||||
except URLError as exc:
|
||||
print(f"{service_name} import failed: {exc.reason}")
|
||||
logger.warning(f"{service_name} import failed: {exc.reason}")
|
||||
return False
|
||||
data = image.read()
|
||||
|
||||
@@ -169,7 +180,7 @@ class Photo(BaseAccountModel):
|
||||
|
||||
self.format = file_format(img.format)
|
||||
if not self.format:
|
||||
print(f"Unable to determine format: {img}")
|
||||
logger.warning(f"Unable to determine format: {img}")
|
||||
return False # pragma: no cover
|
||||
self.data = data
|
||||
super().save()
|
||||
@@ -186,11 +197,11 @@ class Photo(BaseAccountModel):
|
||||
img = Image.open(BytesIO(self.data))
|
||||
except Exception as exc: # pylint: disable=broad-except
|
||||
# For debugging only
|
||||
print(f"Exception caught in Photo.save(): {exc}")
|
||||
logger.error(f"Exception caught in Photo.save(): {exc}")
|
||||
return False
|
||||
self.format = file_format(img.format)
|
||||
if not self.format:
|
||||
print("Format not recognized")
|
||||
logger.error("Format not recognized")
|
||||
return False
|
||||
return super().save(force_insert, force_update, using, update_fields)
|
||||
|
||||
@@ -330,6 +341,20 @@ class ConfirmedEmail(BaseAccountModel):
|
||||
|
||||
verbose_name = _("confirmed email")
|
||||
verbose_name_plural = _("confirmed emails")
|
||||
indexes = [
    models.Index(fields=["digest"], name="idx_cemail_digest"),
    models.Index(fields=["digest_sha256"], name="idx_cemail_digest_sha256"),
    models.Index(fields=["access_count"], name="idx_cemail_access_count"),
    models.Index(fields=["bluesky_handle"], name="idx_cemail_bluesky_handle"),
    models.Index(
        fields=["user_id", "access_count"],
        name="idx_cemail_user_access",
    ),
    models.Index(
        fields=["photo_id", "access_count"],
        name="idx_cemail_photo_access",
    ),
]
|
||||
|
||||
def set_photo(self, photo):
|
||||
"""
|
||||
|
||||
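
The `Meta.indexes` declared above mirror the raw-SQL migration, including the two composite indexes that pair the owner with the access counter. Purely for illustration, a query of the shape `idx_cemail_user_access` is meant to serve could look like the sketch below; the model and field names are taken from the diff, while the helper itself and the `user_id` argument are hypothetical:

```python
# Illustrative only - shows the access pattern behind idx_cemail_user_access.
from ivatar.ivataraccount.models import ConfirmedEmail

def most_used_addresses(user_id, limit=10):
    """Hypothetical helper: digests of a user's most frequently served emails."""
    return list(
        ConfirmedEmail.objects.filter(user_id=user_id)
        .order_by("-access_count")
        .values_list("digest", "access_count")[:limit]
    )
```
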
@@ -573,16 +573,25 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
self.login()
|
||||
url = reverse("upload_photo")
|
||||
# rb => Read binary
|
||||
with open(TEST_IMAGE_FILE, "rb") as photo:
|
||||
response = self.client.post(
|
||||
url,
|
||||
{
|
||||
"photo": photo,
|
||||
"not_porn": True,
|
||||
"can_distribute": True,
|
||||
},
|
||||
follow=True,
|
||||
)
|
||||
with open(TEST_IMAGE_FILE, "rb") as photo_file:
|
||||
photo_data = photo_file.read()
|
||||
|
||||
from django.core.files.uploadedfile import SimpleUploadedFile
|
||||
uploaded_file = SimpleUploadedFile(
|
||||
"deadbeef.png",
|
||||
photo_data,
|
||||
content_type="image/png"
|
||||
)
|
||||
|
||||
response = self.client.post(
|
||||
url,
|
||||
{
|
||||
"photo": uploaded_file,
|
||||
"not_porn": True,
|
||||
"can_distribute": True,
|
||||
},
|
||||
follow=True,
|
||||
)
|
||||
if not test_only_one:
|
||||
return response
|
||||
self.assertEqual(
|
||||
|
||||
@@ -8,6 +8,7 @@ import contextlib
|
||||
# pylint: disable=too-many-lines
|
||||
import os
|
||||
import django
|
||||
import pytest
|
||||
from django.test import TestCase
|
||||
from django.test import Client
|
||||
|
||||
@@ -83,6 +84,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
|
||||
# The following tests need to be moved over to the model tests
|
||||
# and real web UI tests added
|
||||
@pytest.mark.bluesky
|
||||
def test_bluesky_handle_for_mail_via_model_handle_does_not_exist(self):
|
||||
"""
|
||||
Add Bluesky handle to a confirmed mail address
|
||||
@@ -99,6 +101,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
"Setting Bluesky handle that doesn't exist works?",
|
||||
)
|
||||
|
||||
@pytest.mark.bluesky
|
||||
def test_bluesky_handle_for_mail_via_model_handle_exists(self):
|
||||
"""
|
||||
Add Bluesky handle to a confirmed mail address
|
||||
@@ -113,6 +116,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
"Setting Bluesky handle doesn't work?",
|
||||
)
|
||||
|
||||
@pytest.mark.bluesky
|
||||
def test_bluesky_handle_for_openid_via_model_handle_does_not_exist(self):
|
||||
"""
|
||||
Add Bluesky handle to a confirmed openid address
|
||||
@@ -129,6 +133,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
"Setting Bluesky handle that doesn't exist works?",
|
||||
)
|
||||
|
||||
@pytest.mark.bluesky
|
||||
def test_bluesky_handle_for_openid_via_model_handle_exists(self):
|
||||
"""
|
||||
Add Bluesky handle to a confirmed openid address
|
||||
@@ -143,6 +148,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
"Setting Bluesky handle doesn't work?",
|
||||
)
|
||||
|
||||
@pytest.mark.bluesky
|
||||
def test_bluesky_fetch_mail(self):
|
||||
"""
|
||||
Check if we can successfully fetch a Bluesky avatar via email
|
||||
@@ -158,6 +164,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
self.assertEqual(response.status_code, 302)
|
||||
self.assertEqual(response["Location"], f"/blueskyproxy/{confirmed.digest}")
|
||||
|
||||
@pytest.mark.bluesky
|
||||
def test_bluesky_fetch_openid(self):
|
||||
"""
|
||||
Check if we can successfully fetch a Bluesky avatar via OpenID
|
||||
@@ -173,6 +180,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
self.assertEqual(response.status_code, 302)
|
||||
self.assertEqual(response["Location"], f"/blueskyproxy/{confirmed.digest}")
|
||||
|
||||
@pytest.mark.bluesky
|
||||
def test_assign_bluesky_handle_to_openid(self):
|
||||
"""
|
||||
Assign a Bluesky handle to an OpenID
|
||||
@@ -185,6 +193,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
"Adding Bluesky handle to OpenID fails?",
|
||||
)
|
||||
|
||||
@pytest.mark.bluesky
|
||||
def test_assign_bluesky_handle_to_email(self):
|
||||
"""
|
||||
Assign a Bluesky handle to an email
|
||||
@@ -215,6 +224,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
"Setting Bluesky handle doesn't work?",
|
||||
)
|
||||
|
||||
@pytest.mark.bluesky
|
||||
def test_assign_photo_to_mail_removes_bluesky_handle(self):
|
||||
"""
|
||||
Assign a Photo to a mail, removes Bluesky handle
|
||||
@@ -223,6 +233,7 @@ class Tester(TestCase): # pylint: disable=too-many-public-methods
|
||||
confirmed = self.create_confirmed_email()
|
||||
self._assign_bluesky_handle(confirmed, "assign_photo_email")
|
||||
|
||||
@pytest.mark.bluesky
|
||||
def test_assign_photo_to_openid_removes_bluesky_handle(self):
|
||||
"""
|
||||
Assign a Photo to a OpenID, removes Bluesky handle
|
||||
|
||||
@@ -10,6 +10,7 @@ import binascii
|
||||
import contextlib
|
||||
from xml.sax import saxutils
|
||||
import gzip
|
||||
import logging
|
||||
|
||||
from PIL import Image
|
||||
|
||||
@@ -61,6 +62,10 @@ from .models import UserPreference
|
||||
from .models import file_format
|
||||
from .read_libravatar_export import read_gzdata as libravatar_read_gzdata
|
||||
|
||||
# Initialize loggers
|
||||
logger = logging.getLogger("ivatar")
|
||||
security_logger = logging.getLogger("ivatar.security")
|
||||
|
||||
|
||||
def openid_logging(message, level=0):
|
||||
"""
|
||||
@@ -69,7 +74,7 @@ def openid_logging(message, level=0):
|
||||
# Normal messages are not that important
|
||||
# No need for coverage here
|
||||
if level > 0: # pragma: no cover
|
||||
print(message)
|
||||
logger.debug(message)
|
||||
|
||||
|
||||
class CreateView(SuccessMessageMixin, FormView):
|
||||
@@ -505,7 +510,7 @@ class ImportPhotoView(SuccessMessageMixin, TemplateView):
|
||||
try:
|
||||
urlopen(libravatar_service_url)
|
||||
except OSError as exc:
|
||||
print(f"Exception caught during photo import: {exc}")
|
||||
logger.warning(f"Exception caught during photo import: {exc}")
|
||||
else:
|
||||
context["photos"].append(
|
||||
{
|
||||
@@ -612,7 +617,7 @@ class DeletePhotoView(SuccessMessageMixin, View):
|
||||
@method_decorator(login_required, name="dispatch")
|
||||
class UploadPhotoView(SuccessMessageMixin, FormView):
|
||||
"""
|
||||
View class responsible for photo upload
|
||||
View class responsible for photo upload with enhanced security
|
||||
"""
|
||||
|
||||
model = Photo
|
||||
@@ -622,26 +627,46 @@ class UploadPhotoView(SuccessMessageMixin, FormView):
|
||||
success_url = reverse_lazy("profile")
|
||||
|
||||
def post(self, request, *args, **kwargs):
|
||||
# Check maximum number of photos
|
||||
num_photos = request.user.photo_set.count()
|
||||
if num_photos >= MAX_NUM_PHOTOS:
|
||||
messages.error(
|
||||
request, _("Maximum number of photos (%i) reached" % MAX_NUM_PHOTOS)
|
||||
)
|
||||
return HttpResponseRedirect(reverse_lazy("profile"))
|
||||
|
||||
return super().post(request, *args, **kwargs)
|
||||
|
||||
def form_valid(self, form):
|
||||
photo_data = self.request.FILES["photo"]
|
||||
|
||||
# Additional size check (redundant but good for security)
|
||||
if photo_data.size > MAX_PHOTO_SIZE:
|
||||
messages.error(self.request, _("Image too big"))
|
||||
return HttpResponseRedirect(reverse_lazy("profile"))
|
||||
|
||||
# Enhanced security logging
|
||||
security_logger.info(
|
||||
f"Photo upload attempt by user {self.request.user.id} "
|
||||
f"from IP {get_client_ip(self.request)[0]}, "
|
||||
f"file size: {photo_data.size} bytes"
|
||||
)
|
||||
|
||||
photo = form.save(self.request, photo_data)
|
||||
|
||||
if not photo:
|
||||
security_logger.warning(
|
||||
f"Photo upload failed for user {self.request.user.id} - invalid format"
|
||||
)
|
||||
messages.error(self.request, _("Invalid Format"))
|
||||
return HttpResponseRedirect(reverse_lazy("profile"))
|
||||
|
||||
# Log successful upload
|
||||
security_logger.info(
|
||||
f"Photo uploaded successfully by user {self.request.user.id}, "
|
||||
f"photo ID: {photo.pk}"
|
||||
)
|
||||
|
||||
# Override success URL -> Redirect to crop page.
|
||||
self.success_url = reverse_lazy("crop_photo", args=[photo.pk])
|
||||
return super().form_valid(form)
|
||||
@@ -717,7 +742,7 @@ class RemoveConfirmedOpenIDView(View):
|
||||
openidobj.delete()
|
||||
except Exception as exc: # pylint: disable=broad-except
|
||||
# Why it is not there?
|
||||
print(f"How did we get here: {exc}")
|
||||
logger.warning(f"How did we get here: {exc}")
|
||||
openid.delete()
|
||||
messages.success(request, _("ID removed"))
|
||||
except self.model.DoesNotExist: # pylint: disable=no-member
|
||||
@@ -766,7 +791,7 @@ class RedirectOpenIDView(View):
|
||||
"message": exc,
|
||||
}
|
||||
)
|
||||
print(f"message: {msg}")
|
||||
logger.error(f"message: {msg}")
|
||||
messages.error(request, msg)
|
||||
|
||||
if auth_request is None: # pragma: no cover
|
||||
@@ -1036,7 +1061,7 @@ class UploadLibravatarExportView(SuccessMessageMixin, FormView):
|
||||
try:
|
||||
data = base64.decodebytes(bytes(request.POST[arg], "utf-8"))
|
||||
except binascii.Error as exc:
|
||||
print(f"Cannot decode photo: {exc}")
|
||||
logger.warning(f"Cannot decode photo: {exc}")
|
||||
continue
|
||||
try:
|
||||
pilobj = Image.open(BytesIO(data))
|
||||
@@ -1050,7 +1075,7 @@ class UploadLibravatarExportView(SuccessMessageMixin, FormView):
|
||||
photo.data = out.read()
|
||||
photo.save()
|
||||
except Exception as exc: # pylint: disable=broad-except
|
||||
print(f"Exception during save: {exc}")
|
||||
logger.error(f"Exception during save: {exc}")
|
||||
continue
|
||||
|
||||
return HttpResponseRedirect(reverse_lazy("profile"))
|
||||
@@ -1177,7 +1202,7 @@ class ProfileView(TemplateView):
|
||||
openid=openids.first().claimed_id
|
||||
).exists():
|
||||
return
|
||||
print(f"need to confirm: {openids.first()}")
|
||||
logger.debug(f"need to confirm: {openids.first()}")
|
||||
confirmed = ConfirmedOpenId()
|
||||
confirmed.user = self.request.user
|
||||
confirmed.ip_address = get_client_ip(self.request)[0]
|
||||
|
||||
@@ -13,6 +13,11 @@ logger.setLevel(log_level)
|
||||
PACKAGE_ROOT = os.path.abspath(os.path.dirname(__file__))
|
||||
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
|
||||
|
||||
# Logging directory - can be overridden in local config
|
||||
LOGS_DIR = os.path.join(BASE_DIR, "logs")
|
||||
|
||||
# Ensure logs directory exists
|
||||
os.makedirs(LOGS_DIR, exist_ok=True)
|
||||
|
||||
# SECURITY WARNING: keep the secret key used in production secret!
|
||||
SECRET_KEY = "=v(+-^t#ahv^a&&e)uf36g8algj$d1@6ou^w(r0@%)#8mlc*zk"
|
||||
@@ -22,6 +27,77 @@ DEBUG = True
|
||||
|
||||
ALLOWED_HOSTS = []
|
||||
|
||||
# Comprehensive Logging Configuration
|
||||
LOGGING = {
|
||||
"version": 1,
|
||||
"disable_existing_loggers": False,
|
||||
"formatters": {
|
||||
"verbose": {
|
||||
"format": "{levelname} {asctime} {module} {process:d} {thread:d} {message}",
|
||||
"style": "{",
|
||||
},
|
||||
"simple": {
|
||||
"format": "{levelname} {asctime} {message}",
|
||||
"style": "{",
|
||||
},
|
||||
"detailed": {
|
||||
"format": "{levelname} {asctime} {name} {module} {funcName} {lineno:d} {message}",
|
||||
"style": "{",
|
||||
},
|
||||
},
|
||||
"handlers": {
|
||||
"file": {
|
||||
"level": "INFO",
|
||||
"class": "logging.FileHandler",
|
||||
"filename": os.path.join(LOGS_DIR, "ivatar.log"),
|
||||
"formatter": "verbose",
|
||||
},
|
||||
"file_debug": {
|
||||
"level": "DEBUG",
|
||||
"class": "logging.FileHandler",
|
||||
"filename": os.path.join(LOGS_DIR, "ivatar_debug.log"),
|
||||
"formatter": "detailed",
|
||||
},
|
||||
"console": {
|
||||
"level": "DEBUG" if DEBUG else "INFO",
|
||||
"class": "logging.StreamHandler",
|
||||
"formatter": "simple",
|
||||
},
|
||||
"security": {
|
||||
"level": "WARNING",
|
||||
"class": "logging.FileHandler",
|
||||
"filename": os.path.join(LOGS_DIR, "security.log"),
|
||||
"formatter": "detailed",
|
||||
},
|
||||
},
|
||||
"loggers": {
|
||||
"ivatar": {
|
||||
"handlers": ["file", "console"],
|
||||
"level": "INFO", # Restore normal logging level
|
||||
"propagate": True,
|
||||
},
|
||||
"ivatar.security": {
|
||||
"handlers": ["security", "console"],
|
||||
"level": "WARNING",
|
||||
"propagate": False,
|
||||
},
|
||||
"ivatar.debug": {
|
||||
"handlers": ["file_debug"],
|
||||
"level": "DEBUG",
|
||||
"propagate": False,
|
||||
},
|
||||
"django.security": {
|
||||
"handlers": ["security"],
|
||||
"level": "WARNING",
|
||||
"propagate": False,
|
||||
},
|
||||
},
|
||||
"root": {
|
||||
"handlers": ["console"],
|
||||
"level": "INFO",
|
||||
},
|
||||
}
|
||||
|
||||
|
||||
# Application definition
|
||||
|
||||
@@ -103,12 +179,26 @@ AUTH_PASSWORD_VALIDATORS = [
|
||||
]
|
||||
|
||||
# Password Hashing (more secure)
PASSWORD_HASHERS = [
    # This isn't working in older Python environments
    # "django.contrib.auth.hashers.Argon2PasswordHasher",
    "django.contrib.auth.hashers.PBKDF2PasswordHasher",
    "django.contrib.auth.hashers.PBKDF2SHA1PasswordHasher",
]
# Try to use Argon2PasswordHasher with high security settings, fallback to PBKDF2
PASSWORD_HASHERS = []

# Try Argon2 first (requires Python 3.6+ and argon2-cffi package)
try:
    import argon2  # noqa: F401

    PASSWORD_HASHERS.append("django.contrib.auth.hashers.Argon2PasswordHasher")
except ImportError:
    # Fallback for CentOS 7 / older systems without argon2-cffi
    pass

# Always include PBKDF2 as fallback
PASSWORD_HASHERS.extend(
    [
        "django.contrib.auth.hashers.PBKDF2PasswordHasher",
        # Keep PBKDF2SHA1 for existing password compatibility only
        "django.contrib.auth.hashers.PBKDF2SHA1PasswordHasher",
    ]
)
|
||||
|
||||
# Security Settings
|
||||
SECURE_BROWSER_XSS_FILTER = True
|
||||
|
||||
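
The `LOGGING` block above routes the `ivatar` logger to `logs/ivatar.log` plus the console, and sends `ivatar.security` to `logs/security.log` without propagation, which matches how the views in this merge request obtain their loggers. A minimal usage sketch, using nothing beyond what the configuration defines:

```python
# Minimal sketch of using the loggers configured above.
import logging

logger = logging.getLogger("ivatar")                    # -> logs/ivatar.log + console
security_logger = logging.getLogger("ivatar.security")  # -> logs/security.log, no propagation

logger.info("regular application event")
# The security logger is configured at WARNING, so use warning() or above
# for events that must end up in security.log.
security_logger.warning("suspicious upload rejected")
```
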
7128
ivatar/static/css/bootstrap.min.css
vendored
265
ivatar/static/css/cropper.min.css
vendored
Normal file
@@ -0,0 +1,265 @@
|
||||
/*!
|
||||
* Cropper.js v1.6.2
|
||||
* https://fengyuanchen.github.io/cropperjs
|
||||
*
|
||||
* Copyright 2015-present Chen Fengyuan
|
||||
* Released under the MIT license
|
||||
*
|
||||
* Date: 2024-04-21T07:43:02.731Z
|
||||
*/
|
||||
.cropper-container {
|
||||
-webkit-touch-callout: none;
|
||||
direction: ltr;
|
||||
font-size: 0;
|
||||
line-height: 0;
|
||||
position: relative;
|
||||
-ms-touch-action: none;
|
||||
touch-action: none;
|
||||
-webkit-user-select: none;
|
||||
-moz-user-select: none;
|
||||
-ms-user-select: none;
|
||||
user-select: none;
|
||||
}
|
||||
.cropper-container img {
|
||||
backface-visibility: hidden;
|
||||
display: block;
|
||||
height: 100%;
|
||||
image-orientation: 0deg;
|
||||
max-height: none !important;
|
||||
max-width: none !important;
|
||||
min-height: 0 !important;
|
||||
min-width: 0 !important;
|
||||
width: 100%;
|
||||
}
|
||||
.cropper-canvas,
|
||||
.cropper-crop-box,
|
||||
.cropper-drag-box,
|
||||
.cropper-modal,
|
||||
.cropper-wrap-box {
|
||||
bottom: 0;
|
||||
left: 0;
|
||||
position: absolute;
|
||||
right: 0;
|
||||
top: 0;
|
||||
}
|
||||
.cropper-canvas,
|
||||
.cropper-wrap-box {
|
||||
overflow: hidden;
|
||||
}
|
||||
.cropper-drag-box {
|
||||
background-color: #fff;
|
||||
opacity: 0;
|
||||
}
|
||||
.cropper-modal {
|
||||
background-color: #000;
|
||||
opacity: 0.5;
|
||||
}
|
||||
.cropper-view-box {
|
||||
display: block;
|
||||
height: 100%;
|
||||
outline: 1px solid #39f;
|
||||
outline-color: rgba(51, 153, 255, 0.75);
|
||||
overflow: hidden;
|
||||
width: 100%;
|
||||
}
|
||||
.cropper-dashed {
|
||||
border: 0 dashed #eee;
|
||||
display: block;
|
||||
opacity: 0.5;
|
||||
position: absolute;
|
||||
}
|
||||
.cropper-dashed.dashed-h {
|
||||
border-bottom-width: 1px;
|
||||
border-top-width: 1px;
|
||||
height: 33.33333%;
|
||||
left: 0;
|
||||
top: 33.33333%;
|
||||
width: 100%;
|
||||
}
|
||||
.cropper-dashed.dashed-v {
|
||||
border-left-width: 1px;
|
||||
border-right-width: 1px;
|
||||
height: 100%;
|
||||
left: 33.33333%;
|
||||
top: 0;
|
||||
width: 33.33333%;
|
||||
}
|
||||
.cropper-center {
|
||||
display: block;
|
||||
height: 0;
|
||||
left: 50%;
|
||||
opacity: 0.75;
|
||||
position: absolute;
|
||||
top: 50%;
|
||||
width: 0;
|
||||
}
|
||||
.cropper-center:after,
|
||||
.cropper-center:before {
|
||||
background-color: #eee;
|
||||
content: " ";
|
||||
display: block;
|
||||
position: absolute;
|
||||
}
|
||||
.cropper-center:before {
|
||||
height: 1px;
|
||||
left: -3px;
|
||||
top: 0;
|
||||
width: 7px;
|
||||
}
|
||||
.cropper-center:after {
|
||||
height: 7px;
|
||||
left: 0;
|
||||
top: -3px;
|
||||
width: 1px;
|
||||
}
|
||||
.cropper-face,
|
||||
.cropper-line,
|
||||
.cropper-point {
|
||||
display: block;
|
||||
height: 100%;
|
||||
opacity: 0.1;
|
||||
position: absolute;
|
||||
width: 100%;
|
||||
}
|
||||
.cropper-face {
|
||||
background-color: #fff;
|
||||
left: 0;
|
||||
top: 0;
|
||||
}
|
||||
.cropper-line {
|
||||
background-color: #39f;
|
||||
}
|
||||
.cropper-line.line-e {
|
||||
cursor: ew-resize;
|
||||
right: -3px;
|
||||
top: 0;
|
||||
width: 5px;
|
||||
}
|
||||
.cropper-line.line-n {
|
||||
cursor: ns-resize;
|
||||
height: 5px;
|
||||
left: 0;
|
||||
top: -3px;
|
||||
}
|
||||
.cropper-line.line-w {
|
||||
cursor: ew-resize;
|
||||
left: -3px;
|
||||
top: 0;
|
||||
width: 5px;
|
||||
}
|
||||
.cropper-line.line-s {
|
||||
bottom: -3px;
|
||||
cursor: ns-resize;
|
||||
height: 5px;
|
||||
left: 0;
|
||||
}
|
||||
.cropper-point {
|
||||
background-color: #39f;
|
||||
height: 5px;
|
||||
opacity: 0.75;
|
||||
width: 5px;
|
||||
}
|
||||
.cropper-point.point-e {
|
||||
cursor: ew-resize;
|
||||
margin-top: -3px;
|
||||
right: -3px;
|
||||
top: 50%;
|
||||
}
|
||||
.cropper-point.point-n {
|
||||
cursor: ns-resize;
|
||||
left: 50%;
|
||||
margin-left: -3px;
|
||||
top: -3px;
|
||||
}
|
||||
.cropper-point.point-w {
|
||||
cursor: ew-resize;
|
||||
left: -3px;
|
||||
margin-top: -3px;
|
||||
top: 50%;
|
||||
}
|
||||
.cropper-point.point-s {
|
||||
bottom: -3px;
|
||||
cursor: s-resize;
|
||||
left: 50%;
|
||||
margin-left: -3px;
|
||||
}
|
||||
.cropper-point.point-ne {
|
||||
cursor: nesw-resize;
|
||||
right: -3px;
|
||||
top: -3px;
|
||||
}
|
||||
.cropper-point.point-nw {
|
||||
cursor: nwse-resize;
|
||||
left: -3px;
|
||||
top: -3px;
|
||||
}
|
||||
.cropper-point.point-sw {
|
||||
bottom: -3px;
|
||||
cursor: nesw-resize;
|
||||
left: -3px;
|
||||
}
|
||||
.cropper-point.point-se {
|
||||
bottom: -3px;
|
||||
cursor: nwse-resize;
|
||||
height: 20px;
|
||||
opacity: 1;
|
||||
right: -3px;
|
||||
width: 20px;
|
||||
}
|
||||
@media (min-width: 768px) {
|
||||
.cropper-point.point-se {
|
||||
height: 15px;
|
||||
width: 15px;
|
||||
}
|
||||
}
|
||||
@media (min-width: 992px) {
|
||||
.cropper-point.point-se {
|
||||
height: 10px;
|
||||
width: 10px;
|
||||
}
|
||||
}
|
||||
@media (min-width: 1200px) {
|
||||
.cropper-point.point-se {
|
||||
height: 5px;
|
||||
opacity: 0.75;
|
||||
width: 5px;
|
||||
}
|
||||
}
|
||||
.cropper-point.point-se:before {
|
||||
background-color: #39f;
|
||||
bottom: -50%;
|
||||
content: " ";
|
||||
display: block;
|
||||
height: 200%;
|
||||
opacity: 0;
|
||||
position: absolute;
|
||||
right: -50%;
|
||||
width: 200%;
|
||||
}
|
||||
.cropper-invisible {
|
||||
opacity: 0;
|
||||
}
|
||||
.cropper-bg {
|
||||
background-image: url("data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQAQMAAAAlPW0iAAAAA3NCSVQICAjb4U/gAAAABlBMVEXMzMz////TjRV2AAAACXBIWXMAAArrAAAK6wGCiw1aAAAAHHRFWHRTb2Z0d2FyZQBBZG9iZSBGaXJld29ya3MgQ1M26LyyjAAAABFJREFUCJlj+M/AgBVhF/0PAH6/D/HkDxOGAAAAAElFTkSuQmCC");
|
||||
}
|
||||
.cropper-hide {
|
||||
display: block;
|
||||
height: 0;
|
||||
position: absolute;
|
||||
width: 0;
|
||||
}
|
||||
.cropper-hidden {
|
||||
display: none !important;
|
||||
}
|
||||
.cropper-move {
|
||||
cursor: move;
|
||||
}
|
||||
.cropper-crop {
|
||||
cursor: crosshair;
|
||||
}
|
||||
.cropper-disabled .cropper-drag-box,
|
||||
.cropper-disabled .cropper-face,
|
||||
.cropper-disabled .cropper-line,
|
||||
.cropper-disabled .cropper-point {
|
||||
cursor: not-allowed;
|
||||
}
|
||||
@@ -1,2 +1,146 @@
|
||||
/* jquery.Jcrop.min.css v0.9.15 (build:20180819) */
|
||||
.jcrop-holder{direction:ltr;text-align:left;-ms-touch-action:none}.jcrop-hline,.jcrop-vline{background:#fff url(Jcrop.gif);font-size:0;position:absolute}.jcrop-vline{height:100%;width:1px!important}.jcrop-vline.right{right:0}.jcrop-hline{height:1px!important;width:100%}.jcrop-hline.bottom{bottom:0}.jcrop-tracker{height:100%;width:100%;-webkit-tap-highlight-color:transparent;-webkit-touch-callout:none;-webkit-user-select:none}.jcrop-handle{background-color:#333;border:1px #eee solid;width:7px;height:7px;font-size:1px}.jcrop-handle.ord-n{left:50%;margin-left:-4px;margin-top:-4px;top:0}.jcrop-handle.ord-s{bottom:0;left:50%;margin-bottom:-4px;margin-left:-4px}.jcrop-handle.ord-e{margin-right:-4px;margin-top:-4px;right:0;top:50%}.jcrop-handle.ord-w{left:0;margin-left:-4px;margin-top:-4px;top:50%}.jcrop-handle.ord-nw{left:0;margin-left:-4px;margin-top:-4px;top:0}.jcrop-handle.ord-ne{margin-right:-4px;margin-top:-4px;right:0;top:0}.jcrop-handle.ord-se{bottom:0;margin-bottom:-4px;margin-right:-4px;right:0}.jcrop-handle.ord-sw{bottom:0;left:0;margin-bottom:-4px;margin-left:-4px}.jcrop-dragbar.ord-n,.jcrop-dragbar.ord-s{height:7px;width:100%}.jcrop-dragbar.ord-e,.jcrop-dragbar.ord-w{height:100%;width:7px}.jcrop-dragbar.ord-n{margin-top:-4px}.jcrop-dragbar.ord-s{bottom:0;margin-bottom:-4px}.jcrop-dragbar.ord-e{margin-right:-4px;right:0}.jcrop-dragbar.ord-w{margin-left:-4px}.jcrop-light .jcrop-hline,.jcrop-light .jcrop-vline{background:#fff;filter:alpha(opacity=70)!important;opacity:.7!important}.jcrop-light .jcrop-handle{-moz-border-radius:3px;-webkit-border-radius:3px;background-color:#000;border-color:#fff;border-radius:3px}.jcrop-dark .jcrop-hline,.jcrop-dark .jcrop-vline{background:#000;filter:alpha(opacity=70)!important;opacity:.7!important}.jcrop-dark .jcrop-handle{-moz-border-radius:3px;-webkit-border-radius:3px;background-color:#fff;border-color:#000;border-radius:3px}.solid-line .jcrop-hline,.solid-line .jcrop-vline{background:#fff}.jcrop-holder img,img.jcrop-preview{max-width:none}
|
||||
.jcrop-holder {
|
||||
direction: ltr;
|
||||
text-align: left;
|
||||
-ms-touch-action: none;
|
||||
}
|
||||
.jcrop-hline,
|
||||
.jcrop-vline {
|
||||
background: #fff url(Jcrop.gif);
|
||||
font-size: 0;
|
||||
position: absolute;
|
||||
}
|
||||
.jcrop-vline {
|
||||
height: 100%;
|
||||
width: 1px !important;
|
||||
}
|
||||
.jcrop-vline.right {
|
||||
right: 0;
|
||||
}
|
||||
.jcrop-hline {
|
||||
height: 1px !important;
|
||||
width: 100%;
|
||||
}
|
||||
.jcrop-hline.bottom {
|
||||
bottom: 0;
|
||||
}
|
||||
.jcrop-tracker {
|
||||
height: 100%;
|
||||
width: 100%;
|
||||
-webkit-tap-highlight-color: transparent;
|
||||
-webkit-touch-callout: none;
|
||||
-webkit-user-select: none;
|
||||
}
|
||||
.jcrop-handle {
|
||||
background-color: #333;
|
||||
border: 1px #eee solid;
|
||||
width: 7px;
|
||||
height: 7px;
|
||||
font-size: 1px;
|
||||
}
|
||||
.jcrop-handle.ord-n {
|
||||
left: 50%;
|
||||
margin-left: -4px;
|
||||
margin-top: -4px;
|
||||
top: 0;
|
||||
}
|
||||
.jcrop-handle.ord-s {
|
||||
bottom: 0;
|
||||
left: 50%;
|
||||
margin-bottom: -4px;
|
||||
margin-left: -4px;
|
||||
}
|
||||
.jcrop-handle.ord-e {
|
||||
margin-right: -4px;
|
||||
margin-top: -4px;
|
||||
right: 0;
|
||||
top: 50%;
|
||||
}
|
||||
.jcrop-handle.ord-w {
|
||||
left: 0;
|
||||
margin-left: -4px;
|
||||
margin-top: -4px;
|
||||
top: 50%;
|
||||
}
|
||||
.jcrop-handle.ord-nw {
|
||||
left: 0;
|
||||
margin-left: -4px;
|
||||
margin-top: -4px;
|
||||
top: 0;
|
||||
}
|
||||
.jcrop-handle.ord-ne {
|
||||
margin-right: -4px;
|
||||
margin-top: -4px;
|
||||
right: 0;
|
||||
top: 0;
|
||||
}
|
||||
.jcrop-handle.ord-se {
|
||||
bottom: 0;
|
||||
margin-bottom: -4px;
|
||||
margin-right: -4px;
|
||||
right: 0;
|
||||
}
|
||||
.jcrop-handle.ord-sw {
|
||||
bottom: 0;
|
||||
left: 0;
|
||||
margin-bottom: -4px;
|
||||
margin-left: -4px;
|
||||
}
|
||||
.jcrop-dragbar.ord-n,
|
||||
.jcrop-dragbar.ord-s {
|
||||
height: 7px;
|
||||
width: 100%;
|
||||
}
|
||||
.jcrop-dragbar.ord-e,
|
||||
.jcrop-dragbar.ord-w {
|
||||
height: 100%;
|
||||
width: 7px;
|
||||
}
|
||||
.jcrop-dragbar.ord-n {
|
||||
margin-top: -4px;
|
||||
}
|
||||
.jcrop-dragbar.ord-s {
|
||||
bottom: 0;
|
||||
margin-bottom: -4px;
|
||||
}
|
||||
.jcrop-dragbar.ord-e {
|
||||
margin-right: -4px;
|
||||
right: 0;
|
||||
}
|
||||
.jcrop-dragbar.ord-w {
|
||||
margin-left: -4px;
|
||||
}
|
||||
.jcrop-light .jcrop-hline,
|
||||
.jcrop-light .jcrop-vline {
|
||||
background: #fff;
|
||||
filter: alpha(opacity=70) !important;
|
||||
opacity: 0.7 !important;
|
||||
}
|
||||
.jcrop-light .jcrop-handle {
|
||||
-moz-border-radius: 3px;
|
||||
-webkit-border-radius: 3px;
|
||||
background-color: #000;
|
||||
border-color: #fff;
|
||||
border-radius: 3px;
|
||||
}
|
||||
.jcrop-dark .jcrop-hline,
|
||||
.jcrop-dark .jcrop-vline {
|
||||
background: #000;
|
||||
filter: alpha(opacity=70) !important;
|
||||
opacity: 0.7 !important;
|
||||
}
|
||||
.jcrop-dark .jcrop-handle {
|
||||
-moz-border-radius: 3px;
|
||||
-webkit-border-radius: 3px;
|
||||
background-color: #fff;
|
||||
border-color: #000;
|
||||
border-radius: 3px;
|
||||
}
|
||||
.solid-line .jcrop-hline,
|
||||
.solid-line .jcrop-vline {
|
||||
background: #fff;
|
||||
}
|
||||
.jcrop-holder img,
|
||||
img.jcrop-preview {
|
||||
max-width: none;
|
||||
}
|
||||
|
||||
0
ivatar/static/img/logo4hex/libravatar_org.svg
Executable file → Normal file
0
ivatar/static/img/logo4hex/libravatar_org_6.svg
Executable file → Normal file
0
ivatar/static/img/logo4hex/libravatar_org_process_blue.svg
Executable file → Normal file
0
ivatar/static/img/logo4hex/libravatar_org_process_blue_6.svg
Executable file → Normal file
1701
ivatar/static/js/bootstrap.min.js
vendored
2170
ivatar/static/js/cropper.min.js
vendored
Normal file
@@ -2,33 +2,33 @@
|
||||
|
||||
// Autofocus the right field on forms
|
||||
if (document.forms.login) {
|
||||
if (document.forms.login.username) {
|
||||
document.forms.login.username.focus();
|
||||
} else if (document.forms.login.openid_identifier) {
|
||||
document.forms.login.openid_identifier.focus();
|
||||
}
|
||||
if (document.forms.login.username) {
|
||||
document.forms.login.username.focus();
|
||||
} else if (document.forms.login.openid_identifier) {
|
||||
document.forms.login.openid_identifier.focus();
|
||||
}
|
||||
} else if (document.forms.addemail) {
|
||||
document.forms.addemail.email.focus();
|
||||
document.forms.addemail.email.focus();
|
||||
} else if (document.forms.addopenid) {
|
||||
document.forms.addopenid.openid.focus();
|
||||
document.forms.addopenid.openid.focus();
|
||||
} else if (document.forms.changepassword) {
|
||||
if(document.forms.changepassword.old_password) {
|
||||
document.forms.changepassword.old_password.focus();
|
||||
} else {
|
||||
document.forms.changepassword.new_password1.focus();
|
||||
}
|
||||
if (document.forms.changepassword.old_password) {
|
||||
document.forms.changepassword.old_password.focus();
|
||||
} else {
|
||||
document.forms.changepassword.new_password1.focus();
|
||||
}
|
||||
} else if (document.forms.deleteaccount) {
|
||||
if (document.forms.deleteaccount.password) {
|
||||
document.forms.deleteaccount.password.focus();
|
||||
}
|
||||
if (document.forms.deleteaccount.password) {
|
||||
document.forms.deleteaccount.password.focus();
|
||||
}
|
||||
} else if (document.forms.lookup) {
|
||||
if (document.forms.lookup.email) {
|
||||
document.forms.lookup.email.focus();
|
||||
} else if (document.forms.lookup.domain) {
|
||||
document.forms.lookup.domain.focus();
|
||||
}
|
||||
if (document.forms.lookup.email) {
|
||||
document.forms.lookup.email.focus();
|
||||
} else if (document.forms.lookup.domain) {
|
||||
document.forms.lookup.domain.focus();
|
||||
}
|
||||
} else if (document.forms.newaccount) {
|
||||
document.forms.newaccount.username.focus();
|
||||
document.forms.newaccount.username.focus();
|
||||
} else if (document.forms.reset) {
|
||||
document.forms.reset.email.focus();
|
||||
document.forms.reset.email.focus();
|
||||
}
|
||||
|
||||
275
ivatar/test_file_security.py
Normal file
@@ -0,0 +1,275 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
Tests for file upload security enhancements
|
||||
"""
|
||||
|
||||
from unittest.mock import patch
|
||||
|
||||
from django.test import TestCase, override_settings
|
||||
from django.core.files.uploadedfile import SimpleUploadedFile
|
||||
from django.contrib.auth.models import User
|
||||
|
||||
from ivatar.file_security import (
|
||||
FileValidator,
|
||||
validate_uploaded_file,
|
||||
get_file_security_report,
|
||||
)
|
||||
from ivatar.ivataraccount.forms import UploadPhotoForm
|
||||
|
||||
|
||||
class FileSecurityTestCase(TestCase):
|
||||
"""Test cases for file upload security"""
|
||||
|
||||
def setUp(self):
|
||||
"""Set up test data"""
|
||||
self.user = User.objects.create_user(
|
||||
username="testuser", email="test@example.com", password="testpass123"
|
||||
)
|
||||
|
||||
# Create test image data
|
||||
self.valid_jpeg_data = b"\xff\xd8\xff\xe0\x00\x10JFIF\x00\x01\x01\x01\x00H\x00H\x00\x00\xff\xdb\x00C\x00\x08\x06\x06\x07\x06\x05\x08\x07\x07\x07\t\t\x08\n\x0c\x14\r\x0c\x0b\x0b\x0c\x19\x12\x13\x0f\x14\x1d\x1a\x1f\x1e\x1d\x1a\x1c\x1c $.' \",#\x1c\x1c(7),01444\x1f'9=82<.342\xff\xc0\x00\x11\x08\x00\x01\x00\x01\x01\x01\x11\x00\x02\x11\x01\x03\x11\x01\xff\xc4\x00\x1f\x00\x00\x01\x05\x01\x01\x01\x01\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x01\x02\x03\x04\x05\x06\x07\x08\t\n\x0b\xff\xc4\x00\xb5\x10\x00\x02\x01\x03\x03\x02\x04\x03\x05\x05\x04\x04\x00\x00\x01}\x01\x02\x03\x00\x04\x11\x05\x12!1A\x06\x13Qa\x07\"q\x142\x81\x91\xa1\x08#B\xb1\xc1\x15R\xd1\xf0$3br\x82\t\n\x16\x17\x18\x19\x1a%&'()*456789:CDEFGHIJSTUVWXYZcdefghijstuvwxyz\x83\x84\x85\x86\x87\x88\x89\x8a\x92\x93\x94\x95\x96\x97\x98\x99\x9a\xa2\xa3\xa4\xa5\xa6\xa7\xa8\xa9\xaa\xb2\xb3\xb4\xb5\xb6\xb7\xb8\xb9\xba\xc2\xc3\xc4\xc5\xc6\xc7\xc8\xc9\xca\xd2\xd3\xd4\xd5\xd6\xd7\xd8\xd9\xda\xe1\xe2\xe3\xe4\xe5\xe6\xe7\xe8\xe9\xea\xf1\xf2\xf3\xf4\xf5\xf6\xf7\xf8\xf9\xfa\xff\xda\x00\x0c\x03\x01\x00\x02\x11\x03\x11\x00\x3f\x00\xf9\xff\xd9"
|
||||
|
||||
self.malicious_data = b'GIF89a<script>alert("xss")</script>'
|
||||
self.large_data = b"x" * (10 * 1024 * 1024) # 10MB
|
||||
|
||||
def tearDown(self):
|
||||
"""Clean up after tests"""
|
||||
pass
|
||||
|
||||
def test_valid_jpeg_validation(self):
|
||||
"""Test validation of valid JPEG file"""
|
||||
validator = FileValidator(self.valid_jpeg_data, "test.jpg")
|
||||
|
||||
# Mock PIL validation to avoid issues with test data
|
||||
with patch.object(validator, "validate_pil_image") as mock_pil:
|
||||
mock_pil.return_value = {
|
||||
"valid": True,
|
||||
"image_info": {
|
||||
"format": "JPEG",
|
||||
"mode": "RGB",
|
||||
"size": (100, 100),
|
||||
"width": 100,
|
||||
"height": 100,
|
||||
"has_transparency": False,
|
||||
},
|
||||
"errors": [],
|
||||
"warnings": [],
|
||||
}
|
||||
|
||||
results = validator.comprehensive_validation()
|
||||
|
||||
self.assertTrue(results["valid"])
|
||||
self.assertEqual(results["file_info"]["detected_type"], "image/jpeg")
|
||||
self.assertGreaterEqual(results["security_score"], 80)
|
||||
|
||||
def test_magic_bytes_validation(self):
|
||||
"""Test magic bytes validation"""
|
||||
validator = FileValidator(self.valid_jpeg_data, "test.jpg")
|
||||
results = validator.validate_magic_bytes()
|
||||
|
||||
self.assertTrue(results["valid"])
|
||||
self.assertEqual(results["detected_type"], "image/jpeg")
|
||||
|
||||
def test_malicious_content_detection(self):
|
||||
"""Test detection of malicious content"""
|
||||
validator = FileValidator(self.malicious_data, "malicious.gif")
|
||||
results = validator.scan_for_malicious_content()
|
||||
|
||||
self.assertTrue(results["suspicious"])
|
||||
self.assertGreater(len(results["threats"]), 0)
|
||||
|
||||
def test_file_size_validation(self):
|
||||
"""Test file size validation"""
|
||||
validator = FileValidator(self.large_data, "large.jpg")
|
||||
results = validator.validate_basic()
|
||||
|
||||
self.assertFalse(results["valid"])
|
||||
self.assertIn("File too large", results["errors"][0])
|
||||
|
||||
def test_invalid_extension_validation(self):
|
||||
"""Test invalid file extension validation"""
|
||||
validator = FileValidator(self.valid_jpeg_data, "test.exe")
|
||||
results = validator.validate_basic()
|
||||
|
||||
self.assertFalse(results["valid"])
|
||||
self.assertIn("File extension not allowed", results["errors"][0])
|
||||
|
||||
def test_exif_sanitization(self):
|
||||
"""Test EXIF data sanitization"""
|
||||
validator = FileValidator(self.valid_jpeg_data, "test.jpg")
|
||||
sanitized_data = validator.sanitize_exif_data()
|
||||
|
||||
# Should return data (may be same or sanitized)
|
||||
self.assertIsInstance(sanitized_data, bytes)
|
||||
self.assertGreater(len(sanitized_data), 0)
|
||||
|
||||
def test_comprehensive_validation_function(self):
|
||||
"""Test the main validation function"""
|
||||
# Mock PIL validation to avoid issues with test data
|
||||
with patch("ivatar.file_security.FileValidator.validate_pil_image") as mock_pil:
|
||||
mock_pil.return_value = {
|
||||
"valid": True,
|
||||
"image_info": {"format": "JPEG", "size": (100, 100)},
|
||||
"errors": [],
|
||||
"warnings": [],
|
||||
}
|
||||
|
||||
is_valid, results, sanitized_data = validate_uploaded_file(
|
||||
self.valid_jpeg_data, "test.jpg"
|
||||
)
|
||||
|
||||
self.assertTrue(is_valid)
|
||||
self.assertIsInstance(results, dict)
|
||||
self.assertIsInstance(sanitized_data, bytes)
|
||||
|
||||
def test_security_report_generation(self):
|
||||
"""Test security report generation"""
|
||||
# Mock PIL validation to avoid issues with test data
|
||||
with patch("ivatar.file_security.FileValidator.validate_pil_image") as mock_pil:
|
||||
mock_pil.return_value = {
|
||||
"valid": True,
|
||||
"image_info": {"format": "JPEG", "size": (100, 100)},
|
||||
"errors": [],
|
||||
"warnings": [],
|
||||
}
|
||||
|
||||
report = get_file_security_report(self.valid_jpeg_data, "test.jpg")
|
||||
|
||||
self.assertIn("valid", report)
|
||||
self.assertIn("security_score", report)
|
||||
self.assertIn("file_info", report)
|
||||
|
||||
@patch("ivatar.file_security.magic.from_buffer")
|
||||
def test_mime_type_validation(self, mock_magic):
|
||||
"""Test MIME type validation with mocked magic"""
|
||||
mock_magic.return_value = "image/jpeg"
|
||||
|
||||
validator = FileValidator(self.valid_jpeg_data, "test.jpg")
|
||||
results = validator.validate_mime_type()
|
||||
|
||||
self.assertTrue(results["valid"])
|
||||
self.assertEqual(results["detected_mime"], "image/jpeg")
|
||||
|
||||
def test_polyglot_attack_detection(self):
|
||||
"""Test detection of polyglot attacks"""
|
||||
polyglot_data = b'GIF89a<script>alert("xss")</script>'
|
||||
validator = FileValidator(polyglot_data, "polyglot.gif")
|
||||
results = validator.scan_for_malicious_content()
|
||||
|
||||
self.assertTrue(results["suspicious"])
|
||||
# Check for either polyglot attack or suspicious script pattern
|
||||
threats_text = " ".join(results["threats"]).lower()
|
||||
self.assertTrue(
|
||||
"polyglot attack" in threats_text or "suspicious pattern" in threats_text,
|
||||
f"Expected polyglot attack or suspicious pattern, got: {results['threats']}",
|
||||
)
|
||||
|
||||
|
||||
class UploadPhotoFormSecurityTestCase(TestCase):
|
||||
"""Test cases for UploadPhotoForm security enhancements"""
|
||||
|
||||
def setUp(self):
|
||||
"""Set up test data"""
|
||||
self.user = User.objects.create_user(
|
||||
username="testuser", email="test@example.com", password="testpass123"
|
||||
)
|
||||
|
||||
def test_form_validation_with_valid_file(self):
|
||||
"""Test form validation with valid file"""
|
||||
valid_jpeg_data = b"\xff\xd8\xff\xe0\x00\x10JFIF\x00\x01\x01\x01\x00H\x00H\x00\x00\xff\xdb\x00C\x00\x08\x06\x06\x07\x06\x05\x08\x07\x07\x07\t\t\x08\n\x0c\x14\r\x0c\x0b\x0b\x0c\x19\x12\x13\x0f\x14\x1d\x1a\x1f\x1e\x1d\x1a\x1c\x1c $.' \",#\x1c\x1c(7),01444\x1f'9=82<.342\xff\xc0\x00\x11\x08\x00\x01\x00\x01\x01\x01\x11\x00\x02\x11\x01\x03\x11\x01\xff\xc4\x00\x1f\x00\x00\x01\x05\x01\x01\x01\x01\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x01\x02\x03\x04\x05\x06\x07\x08\t\n\x0b\xff\xc4\x00\xb5\x10\x00\x02\x01\x03\x03\x02\x04\x03\x05\x05\x04\x04\x00\x00\x01}\x01\x02\x03\x00\x04\x11\x05\x12!1A\x06\x13Qa\x07\"q\x142\x81\x91\xa1\x08#B\xb1\xc1\x15R\xd1\xf0$3br\x82\t\n\x16\x17\x18\x19\x1a%&'()*456789:CDEFGHIJSTUVWXYZcdefghijstuvwxyz\x83\x84\x85\x86\x87\x88\x89\x8a\x92\x93\x94\x95\x96\x97\x98\x99\x9a\xa2\xa3\xa4\xa5\xa6\xa7\xa8\xa9\xaa\xb2\xb3\xb4\xb5\xb6\xb7\xb8\xb9\xba\xc2\xc3\xc4\xc5\xc6\xc7\xc8\xc9\xca\xd2\xd3\xd4\xd5\xd6\xd7\xd8\xd9\xda\xe1\xe2\xe3\xe4\xe5\xe6\xe7\xe8\xe9\xea\xf1\xf2\xf3\xf4\xf5\xf6\xf7\xf8\xf9\xfa\xff\xda\x00\x0c\x03\x01\x00\x02\x11\x03\x11\x00\x3f\x00\xf9\xff\xd9"
|
||||
|
||||
uploaded_file = SimpleUploadedFile(
|
||||
"test.jpg", valid_jpeg_data, content_type="image/jpeg"
|
||||
)
|
||||
|
||||
form_data = {"photo": uploaded_file, "not_porn": True, "can_distribute": True}
|
||||
|
||||
form = UploadPhotoForm(data=form_data, files={"photo": uploaded_file})
|
||||
|
||||
# Mock the validation to avoid PIL issues in tests
|
||||
with patch(
|
||||
"ivatar.ivataraccount.forms.validate_uploaded_file"
|
||||
) as mock_validate:
|
||||
mock_validate.return_value = (
|
||||
True,
|
||||
{"security_score": 95, "errors": [], "warnings": []},
|
||||
valid_jpeg_data,
|
||||
)
|
||||
|
||||
self.assertTrue(form.is_valid())
|
||||
|
||||
def test_form_validation_with_malicious_file(self):
|
||||
"""Test form validation with malicious file"""
|
||||
malicious_data = b'GIF89a<script>alert("xss")</script>'
|
||||
|
||||
uploaded_file = SimpleUploadedFile(
|
||||
"malicious.gif", malicious_data, content_type="image/gif"
|
||||
)
|
||||
|
||||
form_data = {"photo": uploaded_file, "not_porn": True, "can_distribute": True}
|
||||
|
||||
form = UploadPhotoForm(data=form_data, files={"photo": uploaded_file})
|
||||
|
||||
# Mock the validation to return malicious file detection
|
||||
with patch(
|
||||
"ivatar.ivataraccount.forms.validate_uploaded_file"
|
||||
) as mock_validate:
|
||||
mock_validate.return_value = (
|
||||
False,
|
||||
{
|
||||
"security_score": 20,
|
||||
"errors": ["Malicious content detected"],
|
||||
"warnings": [],
|
||||
},
|
||||
malicious_data,
|
||||
)
|
||||
|
||||
self.assertFalse(form.is_valid())
|
||||
# Check for any error message indicating validation failure
|
||||
error_text = str(form.errors["photo"]).lower()
|
||||
self.assertTrue(
|
||||
"malicious" in error_text or "validation failed" in error_text,
|
||||
f"Expected malicious or validation failed message, got: {form.errors['photo']}",
|
||||
)
|
||||
|
||||
|
||||
class UploadPhotoViewSecurityTestCase(TestCase):
|
||||
"""Test cases for UploadPhotoView security enhancements"""
|
||||
|
||||
def setUp(self):
|
||||
"""Set up test data"""
|
||||
self.user = User.objects.create_user(
|
||||
username="testuser", email="test@example.com", password="testpass123"
|
||||
)
|
||||
|
||||
def tearDown(self):
|
||||
"""Clean up after tests"""
|
||||
pass
|
||||
|
||||
|
||||
@override_settings(
|
||||
ENABLE_FILE_SECURITY_VALIDATION=True,
|
||||
ENABLE_EXIF_SANITIZATION=True,
|
||||
ENABLE_MALICIOUS_CONTENT_SCAN=True,
|
||||
ENABLE_RATE_LIMITING=True,
|
||||
)
|
||||
class FileSecurityIntegrationTestCase(TestCase):
|
||||
"""Integration tests for file upload security"""
|
||||
|
||||
def setUp(self):
|
||||
"""Set up test data"""
|
||||
self.user = User.objects.create_user(
|
||||
username="testuser", email="test@example.com", password="testpass123"
|
||||
)
|
||||
|
||||
def test_end_to_end_security_validation(self):
|
||||
"""Test end-to-end security validation"""
|
||||
# This would test the complete flow from upload to storage
|
||||
# with all security checks enabled
|
||||
pass
|
||||
|
||||
def test_security_logging(self):
|
||||
"""Test that security events are properly logged"""
|
||||
# This would test that security events are logged
|
||||
# when malicious files are uploaded
|
||||
pass
|
||||
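
For orientation, the tests above exercise `validate_uploaded_file()` as returning an `(is_valid, results, sanitized_data)` triple and `get_file_security_report()` as returning a dict containing at least `valid`, `security_score` and `file_info`. A hedged sketch of calling them outside the test suite; reading a local file is illustrative only:

```python
# Sketch based on the call shapes used in the tests above.
from ivatar.file_security import validate_uploaded_file, get_file_security_report

with open("avatar.jpg", "rb") as fh:  # hypothetical local file
    data = fh.read()

is_valid, results, sanitized_data = validate_uploaded_file(data, "avatar.jpg")
report = get_file_security_report(data, "avatar.jpg")

print(is_valid, report["security_score"])
if not is_valid:
    print("errors:", results.get("errors", []))
```
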
@@ -47,71 +47,67 @@ class Tester(TestCase):
        self.assertEqual(openid_variations(openid3)[3], openid3)

    def test_is_trusted_url(self):
        test_gravatar_true = is_trusted_url("https://gravatar.com/avatar/63a75a80e6b1f4adfdb04c1ca02e596c", [
            {
                "schemes": [
                    "http",
                    "https"
                ],
                "host_equals": "gravatar.com",
                "path_prefix": "/avatar/"
            }
        ])
        test_gravatar_true = is_trusted_url(
            "https://gravatar.com/avatar/63a75a80e6b1f4adfdb04c1ca02e596c",
            [
                {
                    "schemes": ["http", "https"],
                    "host_equals": "gravatar.com",
                    "path_prefix": "/avatar/",
                }
            ],
        )
        self.assertTrue(test_gravatar_true)

        test_gravatar_false = is_trusted_url("https://gravatar.com.example.org/avatar/63a75a80e6b1f4adfdb04c1ca02e596c", [
            {
                "schemes": [
                    "http",
                    "https"
                ],
                "host_suffix": ".gravatar.com",
                "path_prefix": "/avatar/"
            }
        ])
        test_gravatar_false = is_trusted_url(
            "https://gravatar.com.example.org/avatar/63a75a80e6b1f4adfdb04c1ca02e596c",
            [
                {
                    "schemes": ["http", "https"],
                    "host_suffix": ".gravatar.com",
                    "path_prefix": "/avatar/",
                }
            ],
        )
        self.assertFalse(test_gravatar_false)

        test_open_redirect = is_trusted_url("https://github.com/SethFalco/?boop=https://secure.gravatar.com/avatar/205e460b479e2e5b48aec07710c08d50", [
            {
                "schemes": [
                    "http",
                    "https"
                ],
                "host_suffix": ".gravatar.com",
                "path_prefix": "/avatar/"
            }
        ])
        test_open_redirect = is_trusted_url(
            "https://github.com/SethFalco/?boop=https://secure.gravatar.com/avatar/205e460b479e2e5b48aec07710c08d50",
            [
                {
                    "schemes": ["http", "https"],
                    "host_suffix": ".gravatar.com",
                    "path_prefix": "/avatar/",
                }
            ],
        )
        self.assertFalse(test_open_redirect)

        test_multiple_filters = is_trusted_url("https://ui-avatars.com/api/blah", [
            {
                "schemes": [
                    "https"
                ],
                "host_equals": "ui-avatars.com",
                "path_prefix": "/api/"
            },
            {
                "schemes": [
                    "http",
                    "https"
                ],
                "host_suffix": ".gravatar.com",
                "path_prefix": "/avatar/"
            }
        ])
        test_multiple_filters = is_trusted_url(
            "https://ui-avatars.com/api/blah",
            [
                {
                    "schemes": ["https"],
                    "host_equals": "ui-avatars.com",
                    "path_prefix": "/api/",
                },
                {
                    "schemes": ["http", "https"],
                    "host_suffix": ".gravatar.com",
                    "path_prefix": "/avatar/",
                },
            ],
        )
        self.assertTrue(test_multiple_filters)

        test_url_prefix_true = is_trusted_url("https://ui-avatars.com/api/blah", [
            {
                "url_prefix": "https://ui-avatars.com/api/"
            }
        ])
        test_url_prefix_true = is_trusted_url(
            "https://ui-avatars.com/api/blah",
            [{"url_prefix": "https://ui-avatars.com/api/"}],
        )
        self.assertTrue(test_url_prefix_true)

        test_url_prefix_false = is_trusted_url("https://ui-avatars.com/api/blah", [
            {
                "url_prefix": "https://gravatar.com/avatar/"
            }
        ])
        test_url_prefix_false = is_trusted_url(
            "https://ui-avatars.com/api/blah",
            [{"url_prefix": "https://gravatar.com/avatar/"}],
        )
        self.assertFalse(test_url_prefix_false)

@@ -6,6 +6,7 @@ Simple module providing reusable random_string function
import contextlib
import random
import string
import logging
from io import BytesIO
from PIL import Image, ImageDraw, ImageSequence
from urllib.parse import urlparse
@@ -13,6 +14,9 @@ import requests
from ivatar.settings import DEBUG, URL_TIMEOUT
from urllib.request import urlopen as urlopen_orig

# Initialize logger
logger = logging.getLogger("ivatar")

BLUESKY_IDENTIFIER = None
BLUESKY_APP_PASSWORD = None
with contextlib.suppress(Exception):
@@ -88,7 +92,7 @@ class Bluesky:
            )
            profile_response.raise_for_status()
        except Exception as exc:
            print(f"Bluesky profile fetch failed with HTTP error: {exc}")
            logger.warning(f"Bluesky profile fetch failed with HTTP error: {exc}")
            return None

        return profile_response.json()

@@ -7,6 +7,7 @@ import contextlib
from io import BytesIO
from os import path
import hashlib
import logging
from ivatar.utils import urlopen, Bluesky
from urllib.error import HTTPError, URLError
from ssl import SSLError
@@ -38,6 +39,10 @@ from .ivataraccount.models import Photo
from .ivataraccount.models import pil_format, file_format
from .utils import is_trusted_url, mm_ng, resize_animated_gif

# Initialize loggers
logger = logging.getLogger("ivatar")
security_logger = logging.getLogger("ivatar.security")


def get_size(request, size=DEFAULT_AVATAR_SIZE):
    """
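The module-level loggers above presuppose a matching Django LOGGING configuration. A minimal sketch of what such a setup could look like in settings; handler names, the formatter, and the logs/security.log path are assumptions, not part of this merge request:

```python
# Hypothetical sketch of a LOGGING dict backing the "ivatar" and "ivatar.security"
# loggers; handler names and file paths are illustrative assumptions.
import os

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "verbose": {"format": "%(asctime)s %(levelname)s %(name)s %(message)s"},
    },
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "verbose"},
        "security_file": {
            "class": "logging.FileHandler",
            "filename": os.path.join("logs", "security.log"),
            "formatter": "verbose",
        },
    },
    "loggers": {
        "ivatar": {"handlers": ["console"], "level": "INFO"},
        "ivatar.security": {
            "handlers": ["console", "security_file"],
            "level": "WARNING",
        },
    },
}
```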
@@ -137,14 +142,14 @@ class AvatarImageView(TemplateView):

        if default is not None:
            if TRUSTED_DEFAULT_URLS is None:
                print("Query parameter `default` is disabled.")
                logger.warning("Query parameter `default` is disabled.")
                default = None
            elif default.find("://") > 0:
                # Check if it's trusted, if not, reset to None
                trusted_url = is_trusted_url(default, TRUSTED_DEFAULT_URLS)

                if not trusted_url:
                    print(
                    security_logger.warning(
                        f"Default URL is not in trusted URLs: '{default}'; Kicking it!"
                    )
                    default = None
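The filter shapes accepted by `is_trusted_url` are the ones exercised in the tests earlier in this diff. A hedged sketch of how `TRUSTED_DEFAULT_URLS` could be set in `config_local.py`; the concrete entries are illustrative, not the project's shipped defaults:

```python
# Hypothetical config_local.py excerpt; only URLs matching one of these filters
# are accepted for the `default` query parameter. Entries are illustrative.
TRUSTED_DEFAULT_URLS = [
    {"schemes": ["https"], "host_equals": "ui-avatars.com", "path_prefix": "/api/"},
    {"schemes": ["http", "https"], "host_suffix": ".gravatar.com", "path_prefix": "/avatar/"},
    {"url_prefix": "https://ui-avatars.com/api/"},
]
```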
@@ -373,7 +378,7 @@ class GravatarProxyView(View):
                if exc.code == 404:
                    cache.set(gravatar_test_url, "default", 60)
                else:
                    print(f"Gravatar test url fetch failed: {exc}")
                    logger.warning(f"Gravatar test url fetch failed: {exc}")
                return redir_default(default)

        gravatar_url = (
@@ -384,23 +389,25 @@ class GravatarProxyView(View):

        try:
            if cache.get(gravatar_url) == "err":
                print(f"Cached Gravatar fetch failed with URL error: {gravatar_url}")
                logger.warning(
                    f"Cached Gravatar fetch failed with URL error: {gravatar_url}"
                )
                return redir_default(default)

            gravatarimagedata = urlopen(gravatar_url)
        except HTTPError as exc:
            if exc.code not in [404, 503]:
                print(
                logger.warning(
                    f"Gravatar fetch failed with an unexpected {exc.code} HTTP error: {gravatar_url}"
                )
            cache.set(gravatar_url, "err", 30)
            return redir_default(default)
        except URLError as exc:
            print(f"Gravatar fetch failed with URL error: {exc.reason}")
            logger.warning(f"Gravatar fetch failed with URL error: {exc.reason}")
            cache.set(gravatar_url, "err", 30)
            return redir_default(default)
        except SSLError as exc:
            print(f"Gravatar fetch failed with SSL error: {exc.reason}")
            logger.warning(f"Gravatar fetch failed with SSL error: {exc.reason}")
            cache.set(gravatar_url, "err", 30)
            return redir_default(default)
        try:
@@ -416,7 +423,7 @@ class GravatarProxyView(View):
                return response

        except ValueError as exc:
            print(f"Value error: {exc}")
            logger.error(f"Value error: {exc}")
            return redir_default(default)

        # We shouldn't reach this point... But make sure we do something
@@ -446,7 +453,7 @@ class BlueskyProxyView(View):
            return HttpResponseRedirect(url)

        size = get_size(request)
        print(size)
        logger.debug(f"Bluesky avatar size requested: {size}")
        blueskyimagedata = None
        default = None

@@ -461,7 +468,7 @@ class BlueskyProxyView(View):
                Q(digest=kwargs["digest"]) | Q(digest_sha256=kwargs["digest"])
            ).first()
        except Exception as exc:
            print(exc)
            logger.warning(f"Exception: {exc}")

        # If no identity is found in the email table, try the openid table
        if not identity:
@@ -473,7 +480,7 @@ class BlueskyProxyView(View):
                    | Q(alt_digest3=kwargs["digest"])
                ).first()
            except Exception as exc:
                print(exc)
                logger.warning(f"Exception: {exc}")

        # If still no identity is found, redirect to the default
        if not identity:
@@ -494,7 +501,9 @@ class BlueskyProxyView(View):

        try:
            if cache.get(bluesky_url) == "err":
                print(f"Cached Bluesky fetch failed with URL error: {bluesky_url}")
                logger.warning(
                    f"Cached Bluesky fetch failed with URL error: {bluesky_url}"
                )
                return redir_default(default)

            blueskyimagedata = urlopen(bluesky_url)
@@ -506,11 +515,11 @@ class BlueskyProxyView(View):
            cache.set(bluesky_url, "err", 30)
            return redir_default(default)
        except URLError as exc:
            print(f"Bluesky fetch failed with URL error: {exc.reason}")
            logger.warning(f"Bluesky fetch failed with URL error: {exc.reason}")
            cache.set(bluesky_url, "err", 30)
            return redir_default(default)
        except SSLError as exc:
            print(f"Bluesky fetch failed with SSL error: {exc.reason}")
            logger.warning(f"Bluesky fetch failed with SSL error: {exc.reason}")
            cache.set(bluesky_url, "err", 30)
            return redir_default(default)
        try:
@@ -536,7 +545,7 @@ class BlueskyProxyView(View):
            response["Vary"] = ""
            return response
        except ValueError as exc:
            print(f"Value error: {exc}")
            logger.error(f"Value error: {exc}")
            return redir_default(default)

        # We shouldn't reach this point... But make sure we do something

@@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
"""
WSGI config for ivatar project.

@@ -1,4 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
import sys

25
pytest.ini
Normal file
@@ -0,0 +1,25 @@
[tool:pytest]
# Pytest configuration for ivatar project

# Test discovery
testpaths = ivatar
python_files = test_*.py
python_classes = Test*
python_functions = test_*

# Markers for test categorization
markers =
    bluesky: marks tests as requiring Bluesky API credentials (deselect with '-m "not bluesky"')
    slow: marks tests as slow (deselect with '-m "not slow"')
    integration: marks tests as integration tests
    unit: marks tests as unit tests

# Default options
addopts =
    --strict-markers
    --strict-config
    --verbose
    --tb=short

# Minimum version
minversion = 6.0
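With these markers registered, test selection works straight from the command line; a couple of illustrative invocations, assuming pytest is wired up to the Django settings (e.g. via pytest-django):

```bash
# Skip tests that need Bluesky API credentials
pytest -m "not bluesky"

# Run only tests explicitly marked as slow
pytest -m slow
```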
@@ -1,3 +1,4 @@
argon2-cffi>=21.3.0
autopep8
bcrypt
defusedxml
@@ -31,8 +32,10 @@ pyLibravatar
pylint
pymemcache
PyMySQL
pytest
python-coveralls
python-language-server
python-magic>=0.4.27
pytz
rope
setuptools

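The newly added python-magic requirement enables content-based MIME detection on upload buffers. A minimal sketch of the kind of check it supports; the helper name and allowed-type list below are illustrative, not the project's actual implementation:

```python
# Hypothetical helper showing python-magic based MIME detection on an upload buffer.
import magic

ALLOWED_MIME_TYPES = {"image/jpeg", "image/png", "image/gif", "image/webp"}


def detected_mime_is_allowed(data: bytes) -> bool:
    """Return True if libmagic classifies the buffer as an allowed image type."""
    mime = magic.from_buffer(data, mime=True)
    return mime in ALLOWED_MIME_TYPES
```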
26
run_tests_local.sh
Executable file
@@ -0,0 +1,26 @@
#!/bin/bash
# Run tests locally, skipping Bluesky tests that require external API credentials

echo "Running tests locally (skipping Bluesky tests)..."
echo "================================================"

# Run Django tests excluding the Bluesky test file
python3 manage.py test \
    ivatar.ivataraccount.test_auth \
    ivatar.ivataraccount.test_views \
    ivatar.test_auxiliary \
    ivatar.test_file_security \
    ivatar.test_static_pages \
    ivatar.test_utils \
    ivatar.test_views \
    ivatar.test_views_stats \
    ivatar.tools.test_views \
    ivatar.test_wsgi \
    -v2

echo ""
echo "To run all tests including Bluesky (requires API credentials):"
echo "python3 manage.py test -v2"
echo ""
echo "To run only Bluesky tests:"
echo "python3 manage.py test ivatar.ivataraccount.test_views_bluesky -v2"
1
test_indexes.py
Normal file
@@ -0,0 +1 @@