Fission Python Template
A production-ready template for building Fission serverless Python functions with best practices for configuration, database connectivity, error handling, and testing.
Project Structure
project/
├── .fission/
│ ├── deployment.json # Fission function deployment configuration
│ ├── dev-deployment.json # Development overrides
│ └── local-deployment.json # Local development overrides
├── src/
│ ├── __init__.py # Package initialization
│ ├── vault.py # Vault encryption/decryption utilities
│ ├── helpers.py # Shared utilities (DB, secrets, configs)
│ ├── exceptions.py # Custom exception hierarchy
│ ├── models.py # Pydantic models (request/response schemas)
│ ├── build.sh # Package build script
│ └── your_function.py # Your function implementations
├── test/
│ ├── __init__.py
│ ├── test_*.py # Unit tests
│ └── requirements.txt # Test dependencies
├── migrates/
│ └── schema.sql # Database migration scripts
├── manifests/ # Kubernetes manifests (optional)
├── specs/ # Generated Fission specs (created by fission CLI)
├── requirements.txt # Runtime dependencies
├── dev-requirements.txt # Development dependencies
├── .env.example # Environment variable template
├── pytest.ini # Pytest configuration
└── README.md # Project documentation
Key Components
Fission Configuration in Docstrings
Fission reads function metadata from docstrings using the ```fission marker:
def my_function(event, context):
    """
    ```fission
    {
        "name": "my-function",
        "http_triggers": {
            "my-trigger": {
                "url": "/api/my-endpoint",
                "methods": ["GET", "POST"]
            }
        }
    }
    ```
    """
    # Your implementation
    return {"message": "Hello World"}
Note: Do not use fission.yaml or fission.json. The Fission Python builder reads the docstring annotations directly from your Python source files.
Environment Variables & Secrets
Configuration is managed through Kubernetes Secrets and ConfigMaps:
- Secrets: Database credentials, API keys, encryption keys (sensitive)
- ConfigMaps: Non-sensitive configuration, endpoints, feature flags
Access them via helper functions:
from helpers import get_secret, get_config
# Read secret (with optional default)
db_host = get_secret("PG_HOST", "localhost")
db_port = int(get_secret("PG_PORT", "5432"))
# Read config
api_endpoint = get_config("EXTERNAL_API_ENDPOINT")
Placeholder variables in deployment.json:
- ${PROJECT_NAME} - replaced with your actual project name during project creation
- Secret/ConfigMap names follow the pattern fission-${PROJECT_NAME}-env and fission-${PROJECT_NAME}-config
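The helpers themselves ship in helpers.py; as a rough sketch of how such a lookup could work (the /secrets/default mount path and the env-var fallback are assumptions for illustration, not the template's actual implementation):

```python
import os

# Assumed mount path for Kubernetes secret files; the real helper may differ.
SECRETS_DIR = "/secrets/default"

def get_secret(name, default=None):
    """Return a secret value from mounted secret files, env vars, or a default."""
    # Kubernetes mounts each secret key as a file named after the key
    for root, _dirs, files in os.walk(SECRETS_DIR):
        if name in files:
            with open(os.path.join(root, name)) as fh:
                return fh.read().strip()
    # Fall back to environment variables, then the supplied default
    return os.environ.get(name, default)
```

get_config would follow the same pattern against the ConfigMap mount.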
Database Connectivity
Use the provided init_db_connection() helper:
from helpers import init_db_connection, db_rows_to_array
conn = init_db_connection()
cursor = conn.cursor()
cursor.execute("SELECT * FROM items")
rows = db_rows_to_array(cursor, cursor.fetchall())
The helper automatically:
- Reads connection parameters from secrets (PG_HOST, PG_PORT, PG_DB, PG_USER, PG_PASS, PG_DBSCHEMA)
- Checks port connectivity before connecting
- Uses LoggingConnection for query logging
- Applies schema search path if PG_DBSCHEMA is set
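A minimal sketch of what such a helper could look like, mirroring the bullets above (illustrative only: the real code lives in helpers.py, and the env-var reads stand in for get_secret()):

```python
import logging
import os
import socket

logger = logging.getLogger(__name__)

def check_port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def init_db_connection():
    """Illustrative sketch: open a PostgreSQL connection from secret values."""
    import psycopg2  # deferred import so the sketch loads without the driver
    from psycopg2.extras import LoggingConnection

    # In the template these values come from get_secret(); env vars stand in here
    host = os.environ.get("PG_HOST", "localhost")
    port = int(os.environ.get("PG_PORT", "5432"))
    if not check_port_open(host, port):
        raise ConnectionError(f"Cannot reach {host}:{port}")

    conn = psycopg2.connect(
        connection_factory=LoggingConnection,
        host=host,
        port=port,
        dbname=os.environ.get("PG_DB"),
        user=os.environ.get("PG_USER"),
        password=os.environ.get("PG_PASS"),
    )
    conn.initialize(logger)  # LoggingConnection logs every executed query
    # Apply the schema search path when PG_DBSCHEMA is set
    schema = os.environ.get("PG_DBSCHEMA")
    if schema:
        with conn.cursor() as cur:
            cur.execute("SET search_path TO %s", (schema,))
    return conn
```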
Error Handling
Use the exception hierarchy from exceptions.py:
from exceptions import ValidationError, NotFoundError, ConflictError, DatabaseError
def get_item(item_id: str):
    item = db.fetch_one(item_id)
    if not item:
        raise NotFoundError(f"Item {item_id} not found", x_user=get_user_from_headers())
    return item
All exceptions return standardized error responses:
{
"error_code": "NOT_FOUND",
"http_status": 404,
"error_msg": "Item 123 not found",
"x_user": "user-456",
"details": {"item_id": "123"}
}
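The concrete classes live in exceptions.py; one plausible shape for a base class that produces this payload (everything here beyond the class names shown above is illustrative):

```python
class AppError(Exception):
    """Illustrative sketch of a base class behind exceptions.py."""
    error_code = "INTERNAL_ERROR"
    http_status = 500

    def __init__(self, message, x_user=None, details=None):
        super().__init__(message)
        self.message = message
        self.x_user = x_user
        self.details = details or {}

    def to_response(self):
        # Shape matches the standardized error payload shown above
        return {
            "error_code": self.error_code,
            "http_status": self.http_status,
            "error_msg": self.message,
            "x_user": self.x_user,
            "details": self.details,
        }

class NotFoundError(AppError):
    error_code = "NOT_FOUND"
    http_status = 404
```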
Validation with Pydantic
Validate request payloads using Pydantic models:
from models import ItemCreateRequest
from pydantic import ValidationError as PydanticValidationError
def create_item():
    try:
        data = ItemCreateRequest(**request.get_json())
    except PydanticValidationError as e:
        raise ValidationError(str(e), details=e.errors())
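The actual fields live in models.py; a plausible sketch of what ItemCreateRequest might enforce (the field names and constraints here are assumptions, not the template's real model):

```python
from enum import Enum
from typing import Optional

from pydantic import BaseModel, Field

class ItemStatus(str, Enum):
    ACTIVE = "active"
    ARCHIVED = "archived"

class ItemCreateRequest(BaseModel):
    # Constraints are illustrative; see models.py for the real definitions
    name: str = Field(..., min_length=1, max_length=255)
    description: Optional[str] = None
    status: ItemStatus = ItemStatus.ACTIVE
```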
Development Workflow
1. Install Dependencies
# Install runtime and development dependencies
pip install -r dev-requirements.txt
# Or just runtime dependencies
pip install -r requirements.txt
2. Local Testing
The fission spec command lets you verify specs locally:
# Verify your deployment configuration
fission spec verify --file=.fission/deployment.json
# Build and test locally
fission function test --name your-function
3. Unit Testing
Run tests with pytest:
# Run all tests
pytest
# Run with coverage
pytest --cov=src
# Run specific test file
pytest test/test_my_function.py
# Verbose output
pytest -v
Example test structure:
# test/test_my_function.py
import pytest
from unittest.mock import MagicMock, patch
from src.my_function import main

def test_my_function_success():
    event = {"key": "value"}
    context = {}
    result = main(event, context)
    assert result["status"] == "success"

@patch("helpers.init_db_connection")
def test_my_function_with_db(mock_db):
    # Mock the database connection
    mock_conn = MagicMock()
    mock_db.return_value = mock_conn
    # Exercise the function against the mocked connection here
4. Building the Package
The build.sh script installs dependencies and packages your code:
# From project root
./src/build.sh
# This produces a package.zip in the specs directory
# Ready for deployment with: fission deploy
The build script detects the OS (Debian/Alpine) and installs the correct build dependencies (gcc, libpq-dev, python3-dev).
5. Deployment
# Deploy to Fission
fission deploy
# Or deploy specific function
fission function update --name my-function --env your-env
Deployment Configuration
Executors
Choose between two executor types in deployment.json:
poolmgr (default) - Good for high-concurrency HTTP functions:
"executor": {
"select": "poolmgr",
"poolmgr": {
"concurrency": 1,
"requestsperpod": 1,
"onceonly": false
}
}
newdeploy - Good for dedicated scaling:
"executor": {
"select": "newdeploy",
"newdeploy": {
"minscale": 1,
"maxscale": 5,
"targetcpu": 80
}
}
Resource Limits
Set resource allocation in function_common:
- mincpu/maxcpu - CPU allocation in millicores (50 = 0.05 cores)
- minmemory/maxmemory - memory in MB
- Adjust based on your function's needs
Environment-Specific Overrides
Use dev-deployment.json for the development environment (different secrets, lower resources). Fission will use it automatically when the --dev flag is passed.
Vault Encryption
For encrypted secrets, use the vault utility functions:
from vault import encrypt_vault, decrypt_vault, is_valid_vault_format
# Encrypt a value (run locally to generate vault string)
encrypted = encrypt_vault("my-secret", "your-hex-key-here")
# Result: "vault:v1:base64-encrypted-data"
# Store the encrypted string in your K8s secret
# The helper will auto-decrypt if is_valid_vault_format() returns True
Important: Set CRYPTO_KEY in your helpers.py (or via environment override) to your actual 32-byte key in hex format.
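The real functions live in vault.py; based only on the vault:v1:... string format shown above, the format check could look something like this sketch (the prefix constant is inferred from the example, not taken from the source):

```python
import base64

VAULT_PREFIX = "vault:v1:"  # inferred from the example string above

def is_valid_vault_format(value):
    """Return True if value looks like an encrypted vault string."""
    if not isinstance(value, str) or not value.startswith(VAULT_PREFIX):
        return False
    payload = value[len(VAULT_PREFIX):]
    if not payload:
        return False
    try:
        # validate=True rejects any non-base64 characters outright
        base64.b64decode(payload, validate=True)
        return True
    except ValueError:
        return False
```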
Testing Strategies
Unit Tests
- Mock external dependencies (database, HTTP calls)
- Test business logic in isolation
- Use pytest-mock for convenient mocking
Integration Tests
- Use a test database
- Clean up test data after each run
- Consider using pytest fixtures for setup/teardown
Local Development
- Use .fission/local-deployment.json for local Fission setup
- Override secrets/configmaps for the local environment
- Run with: fission function test --local
Migrations
Place SQL migration scripts in migrates/:
-- migrates/001_create_items_table.sql
CREATE TABLE items (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    name VARCHAR(255) NOT NULL,
    description TEXT,
    status VARCHAR(50) DEFAULT 'active',
    created TIMESTAMP DEFAULT NOW(),
    modified TIMESTAMP DEFAULT NOW()
);
Apply migrations manually via psql or with a migration tool such as Alembic.
Best Practices
- Keep functions small - Single responsibility per function
- Use Pydantic - Validate all inputs with request models
- Standardize errors - Use the provided exception classes
- Log appropriately - Use logger from helpers (already configured)
- Track users - Use get_user_from_headers() for audit trails
- Write tests - Aim for high coverage of business logic
- Document functions - Add docstrings with fission config block
- Avoid global state - Functions should be stateless and idempotent
Continuous Integration
The template includes .gitea/workflows/ for CI/CD:
- install-dispatch.yaml - triggered on installation events
- uninstall-dispatch.yaml - cleanup on uninstall
- dev-deployment.yaml - development environment updates
- analystic-dispatch.yaml - analytics processing
Adapt these workflows for your deployment pipeline (GitHub Actions, GitLab CI, etc.).
Troubleshooting
Spec Generation Fails
- Ensure all function files have proper fission config in docstrings
- Run python -m py_compile src/*.py to check syntax
- Verify build.sh is executable: chmod +x src/build.sh
Cannot Connect to Database
- Check that secrets are mounted correctly: kubectl exec <pod> -- ls /secrets/default/
- Verify PG_HOST and PG_PORT are correct
- Use check_port_open() debug output
- Test the connection manually: psql -h $PG_HOST -p $PG_PORT -U $PG_USER $PG_DB
Missing Dependencies
- Ensure requirements.txt includes ALL dependencies (Flask is required!)
- Check build logs for pip errors
- Rebuild the package: ./src/build.sh
Example Implementations
CRUD Operation
from flask import request
from helpers import init_db_connection, db_row_to_dict, format_error_response
from exceptions import ValidationError, NotFoundError, DatabaseError
from models import ItemCreateRequest, ItemResponse

def create_item(event, context):
    """Create a new item."""
    try:
        # Validate input
        data = ItemCreateRequest(**request.get_json())
    except Exception as e:
        raise ValidationError(str(e))

    conn = init_db_connection()
    try:
        cursor = conn.cursor()
        cursor.execute(
            "INSERT INTO items (name, description, status) VALUES (%s, %s, %s) RETURNING id, created, modified",
            (data.name, data.description, data.status.value)
        )
        row = cursor.fetchone()
        conn.commit()
        return db_row_to_dict(cursor, row)
    except Exception as e:
        conn.rollback()
        raise DatabaseError(str(e))
    finally:
        conn.close()
Webhook Receiver
def webhook_handler(event, context):
    """Process an incoming webhook."""
    # Webhook data is in event
    payload = event.get("body", {})
    signature = request.headers.get("X-Webhook-Signature")

    # Verify signature
    if not verify_signature(payload, signature):
        raise ValidationError("Invalid signature")

    # Process webhook
    process_webhook(payload)
    return {"status": "processed"}
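verify_signature is left undefined in the handler above; a common approach is an HMAC-SHA256 check like this sketch (the shared secret, hex-digest header encoding, and canonical JSON signing are assumptions to adapt to your webhook provider):

```python
import hashlib
import hmac
import json

def verify_signature(payload, signature, secret=b"webhook-secret"):
    """Illustrative HMAC-SHA256 check for the handler above.

    Assumes the sender signs the canonical JSON body with a shared secret
    and sends the hex digest in the X-Webhook-Signature header.
    """
    if not signature:
        return False
    body = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    # Constant-time comparison to avoid timing attacks
    return hmac.compare_digest(expected, signature)
```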
Next Steps
- Replace placeholder values in .fission/deployment.json
- Update SECRET_NAME and CONFIG_NAME in helpers.py (or use create-project.sh)
- Implement your business logic in new function files
- Write tests for your functions
- Deploy to Kubernetes cluster with Fission