Commit 29667cd92f by Duc Nguyen, 2026-03-18 20:21:56 +07:00
58 changed files with 8459 additions and 0 deletions

# Plan: Update Fission Python Template Based on Example Projects
## Context
The current Fission Python template (`fission-python/template/`) is essentially a copy of the `py-eom-quota` example project, making it **quota-specific** rather than a **generic starting point** for new Fission Python projects.
Three example projects were analyzed:
- `py-eom-quota` - User quota management API
- `py-eom-storage` - Storage resource management with S3 integration
- `py-ailbl-scheduler` - Background job scheduler with Dagster integration
All examples share common infrastructure patterns but differ in business logic. This plan will make the template **generic, reusable, and production-ready** by extracting shared best practices.
---
## Key Findings from Examples
### 1. Common Infrastructure (All Projects Share)
- **vault.py** - Identical across all three projects (encryption/decryption using PyNaCl)
- **helpers.py** - Nearly identical core utilities:
- `get_secret()` / `get_config()` (K8s secrets/configmaps with vault support)
- `init_db_connection()` (PostgreSQL connection)
- `db_row_to_dict()` / `db_rows_to_array()`
- `get_user_from_headers()` (extract user for audit logging)
- `format_error_response()` (standardized error format)
- `check_port_open()` (DB readiness check)
- `str_to_bool()` utility
- **Fission Configuration** - Using docstring format in `main()` functions
- **Exception Patterns** - Custom exception hierarchies with:
- `error_code` (machine-readable)
- `http_status` (HTTP status)
- `error_msg` (human-readable)
- `x_user` (optional user tracking)
- `details` (optional additional context)
- **Pydantic Models** - Request validation, response schemas, pagination/filtering
- **Project Structure** - Consistent layout:
```
project/
├── .fission/deployment.json
├── src/
│ ├── __init__.py
│ ├── exceptions.py
│ ├── helpers.py
│ ├── models.py
│ ├── vault.py
│ ├── build.sh
│ └── <business logic>.py
├── test/
├── migrates/
├── manifests/
├── specs/
├── requirements.txt
├── dev-requirements.txt
└── README.md
```
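For orientation, a minimal sketch of a few of the shared pure-Python helpers (signatures assumed from the names above, not copied from any project's `helpers.py`):

```python
# Hypothetical sketches of shared helpers; signatures are assumed from the
# plan, not taken from the real helpers.py.

def str_to_bool(value: str) -> bool:
    """Interpret common truthy strings ('1', 'true', 'yes', 'on') as True."""
    return str(value).strip().lower() in {"1", "true", "yes", "on"}

def db_row_to_dict(row, columns):
    """Zip a DB row tuple with its column names into a dict."""
    return dict(zip(columns, row))

def db_rows_to_array(rows, columns):
    """Convert a list of row tuples into a list of dicts."""
    return [db_row_to_dict(row, columns) for row in rows]
```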
### 2. Variations Between Projects
**Database Connection:**
- `py-eom-quota`: Advanced `DBConfig` dataclass with `from_remote_config()` support
- `py-eom-storage` & `py-ailbl-scheduler`: Simplified direct connection from secrets
**Additional Dependencies:**
- Storage: `boto3` (S3/MinIO), `botocore`
- Scheduler: `gql` (GraphQL), `cron-descriptor`
- All: `pydantic`, `psycopg2-binary`, `PyNaCl`, `Flask`, `requests`
**Executors:**
- Quota: `poolmgr` (concurrency=1)
- Storage: `poolmgr` (concurrency=3, maxscale=3)
- Scheduler: `newdeploy` (minscale=1, maxscale=1)
### 3. Issues to Fix
- **README outdated** - References `pymake`, `fission.json`, `fission.yaml` (not used)
- **Missing Flask** - `src/requirements.txt` needs Flask (currently only in dev-requirements)
- **Quota-specific code** - Template should be generic (no `QuotaException`, `QuotaResponse`, etc.)
- **No .env.example** - Missing environment variable template
- **Test dependencies minimal** - Should include `pytest`, `pytest-mock`, `requests`, `flake8`, `black`
- **build.sh** - Should handle both alpine (apk) and debian (apt) properly
- **deployment.json** - Should not hardcode `fission-eom-quota-env` secret names
- **Missing Python version** - Should specify Python 3.11+ (scheduler uses 3.11-alpine)
---
## Recommended Changes
### Phase 1: Core Infrastructure (Keep Generic)
**Files to MODIFY:**
1. **`src/vault.py`** - Keep as-is (already perfect, identical in all examples)
2. **`src/helpers.py`** - Use the simplified pattern from `py-eom-storage` but add:
- Keep: `get_secret()`, `get_config()`, `init_db_connection()`, `db_row_to_dict()`, `db_rows_to_array()`, `get_current_namespace()`, `str_to_bool()`, `check_port_open()`, `get_user_from_headers()`, `format_error_response()`
- Remove: `DBConfig` class (too specific to quota, keep it simple)
- Add: `.strip()` when reading files (as in scheduler)
- Keep CORS_HEADERS and constants but make them configurable
3. **`src/exceptions.py`** - Replace quota-specific with generic patterns:
```python
class ServiceException(Exception):
    """Base exception for service errors."""
    def __init__(self, error_code, http_status, error_msg, x_user=None, details=None):
        super().__init__(error_msg)
        self.error_code = error_code
        self.http_status = http_status
        self.error_msg = error_msg
        self.x_user = x_user
        self.details = details

class ValidationError(ServiceException): ...  # 400
class NotFoundError(ServiceException): ...    # 404
class ConflictError(ServiceException): ...    # 409
class DatabaseError(ServiceException): ...    # 500
```
(Based on `py-eom-storage` pattern - cleaner and more generic)
4. **`src/models.py`** - Replace with generic example patterns:
- Remove: All quota-specific models
- Add: Generic `ItemResponse`, `PaginatedResponse`, `ErrorResponse`
- Include examples of Pydantic models with Field descriptions and json_schema_extra
- Show patterns for: Enums, nested models, dataclasses for filters
5. **`src/requirements.txt`** - Update to include actual runtime deps:
```
Flask==2.1.1
pydantic==2.11.7
psycopg2-binary==2.9.10
PyNaCl==1.6.0
requests==2.32.2
```
(Remove commented examples - these go in docs, not requirements.txt)
6. **`dev-requirements.txt`** - Expand with useful dev tools:
```
Flask==2.1.1
requests==2.32.2
pytest==8.2.0
pytest-mock==3.14.0
flake8==7.0.0
black==24.1.1
mypy==1.8.0
```
7. **`README.md`** - Complete rewrite:
- Remove references to pymake, fission.json
- Explain actual project structure
- Document Fission configuration in docstrings
- Show how to use deployment.json
- Document environment variables (secrets/configmaps)
- Explain testing approach
- Add development workflow
- Include examples from all three projects as inspiration
8. **`.fission/deployment.json`** - Make generic with placeholders:
- Use `your-service-py` as environment name
- Use `your-package` as package name
- Use generic secret/configmap names: `fission-${PROJECT_NAME}-env`, `fission-${PROJECT_NAME}-config`
- Show both `poolmgr` and `newdeploy` executor examples (commented)
- Include optional fields like `imagepullsecret`, `runtime_envs`, `configmaps`
9. **`test/requirements.txt`** - Add:
```
pytest==8.2.0
pytest-mock==3.14.0
requests==2.32.3
```
10. **`build.sh`** - Verify it uses `${SRC_PKG}` properly (the current version already does)
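To make the docstring-config convention concrete, a hypothetical handler sketch (the `fission:` keys below are illustrative placeholders; the real schema comes from these projects' tooling, not from Fission itself):

```python
import json

def main():
    """
    fission:
      name: example-items
      method: [GET]
      url: /example/items
      executor: poolmgr
    """
    # Generic handler body; a real function would read flask.request here.
    return json.dumps({"items": [], "page": 0, "size": 8})
```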
### Phase 2: Documentation & Examples
**New Files to ADD:**
1. **`src/__init__.py`** - Already exists, keep as is
2. **`examples/` directory** (new) - Sample function implementations:
- `example_crud.py` - Basic CRUD with Pydantic validation
- `example_webhook.py` - Webhook receiver pattern
- `example_scheduler.py` - Background job pattern (from ailbl-scheduler)
- Each should have proper Fission docstring config
3. **`.env.example`** - Template showing all environment variables:
```
# PostgreSQL
PG_HOST=
PG_PORT=5432
PG_DB=
PG_USER=
PG_PASS=
PG_DBSCHEMA=
# Optional: Service-specific config (via ConfigMap)
# YOUR_SERVICE_CONFIG_ENDPOINT=
# Optional: Vault encryption key (if using encrypted secrets)
# CRYPTO_KEY=
```
4. **`docs/` directory** (new) - Additional documentation:
- `STRUCTURE.md` - Detailed file structure explanation
- `TESTING.md` - How to write and run tests
- `DEPLOYMENT.md` - Deployment options and tuning
- `SECRETS.md` - Managing secrets and configmaps
- `MIGRATIONS.md` - Database migration workflow
5. **`pytest.ini`** - Default pytest configuration:
```ini
[pytest]
testpaths = test
python_files = test_*.py
python_classes = Test*
python_functions = test_*
log_cli = true
log_cli_level = INFO
```
6. **`.gitignore`** - Ensure it excludes:
- `__pycache__/`
- `*.pyc`
- `.env`
- `.venv/`
- `venv/`
- `.pytest_cache/`
- `.mypy_cache/`
- `.coverage`
- `coverage.xml`
- `specs/` (optional - generated files)
7. **`MANIFEST.md`** - Template for Kubernetes manifests (if not using auto-generated)
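As a sketch of how a generated project might consume the variables listed in `.env.example` (variable names are from this plan; the default mirrors the documented `PG_PORT`):

```python
import os

def load_pg_settings(env=os.environ):
    """Collect the PostgreSQL settings named in .env.example, applying
    the same default port the template documents."""
    return {
        "host": env.get("PG_HOST", ""),
        "port": int(env.get("PG_PORT", "5432")),
        "db": env.get("PG_DB", ""),
        "user": env.get("PG_USER", ""),
        "password": env.get("PG_PASS", ""),
        "schema": env.get("PG_DBSCHEMA", ""),
    }
```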
### Phase 3: Modernization
**Update CI/CD:**
Review `.gitea/workflows/` files:
- Ensure they install dependencies correctly
- Add linting (flake8/black) steps
- Add test execution
- Add deployment steps with proper environment detection
- Consider adding security scanning
**Python Version:**
- Ensure all files are compatible with Python 3.11+
- Update `build.sh` to use Python 3.11 image (like scheduler does) or keep generic
- Consider adding `runtime.txt` or `pyproject.toml` to specify Python version
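A minimal runtime guard for the 3.11+ requirement could look like this (illustrative only; the template may instead pin the version via the build image):

```python
import sys

MIN_PYTHON = (3, 11)

def check_python_version(version=None) -> bool:
    """Return True when the interpreter meets the template's minimum."""
    version = version or sys.version_info
    return tuple(version[:2]) >= MIN_PYTHON
```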
---
## Files to Modify Summary
**Direct modifications:**
- `src/helpers.py` - Simplify, improve
- `src/exceptions.py` - Make generic
- `src/models.py` - Replace with generic patterns
- `src/requirements.txt` - Add Flask, remove commented section
- `dev-requirements.txt` - Comprehensive dev dependencies
- `test/requirements.txt` - Test dependencies
- `README.md` - Complete rewrite
- `.fission/deployment.json` - Generic placeholders
- `build.sh` - Already good, just ensure compatibility
**New files to add:**
- `.env.example`
- `pytest.ini`
- `.gitignore` (enhance)
- `examples/` directory with sample functions
- `docs/` directory with detailed guides
- `src/example_crud.py` (or in examples/)
- `src/example_webhook.py` (or in examples/)
- `src/example_scheduler.py` (or in examples/)
**New directories:**
- `examples/`
- `docs/`
---
## Implementation Approach
1. **Backup current template** (git branch)
2. **Modify core files** in order: helpers → exceptions → models → requirements → deployment.json → README
3. **Add new files** (examples, docs, configs)
4. **Test the template**:
- Run `create-project.sh` to generate a new project
- Verify build.sh works
- Run tests
- Check Fission spec generation
5. **Commit with clear message**
6. **Update plugin documentation** if needed
---
## Verification Steps
After implementing the changes:
1. **Create a test project** from the updated template:
```bash
./create-project.sh test-project ./tmp-test/
```
2. **Inspect generated project**:
- Verify all files are present
- Check that placeholders are substituted correctly
- Ensure imports work
3. **Build the package**:
```bash
cd tmp-test
./src/build.sh
```
4. **Run tests** (if any):
```bash
pip install -r dev-requirements.txt
pytest
```
5. **Check syntax**:
```bash
python -m py_compile src/*.py
flake8 src/
black --check src/
```
6. **Validate Fission config**:
```bash
fission spec verify --file=.fission/deployment.json
```
7. **Review README** - Does it accurately describe the project?
---
## Success Criteria
- Template is **generic**, not domain-specific
- All examples' best practices are incorporated
- Documentation is accurate and complete
- Dependencies are correctly listed (Flask in requirements, not just dev)
- README reflects actual Fission workflow (docstrings, not fission.yaml)
- Multiple example implementations provided (CRUD, webhook, scheduler)
- Secrets/configuration clearly explained
- Testing setup is comprehensive
- Project passes linting and type checks
---
## Risks & Mitigations
| Risk | Mitigation |
|------|------------|
| Breaking existing template users | Keep changes minimal in helpers; preserve backward compatibility where possible |
| Over-engineering | Stick to patterns that appear in at least 2 of 3 examples |
| Missing edge cases | Include optional advanced patterns (like DBConfig) in docs, not in core |
| Documentation drift | Keep docs close to code; add examples that mirror real projects |
---
## Post-Implementation
After the template is updated:
1. Consider creating a **template validation script** to ensure quality
2. Update the **plugin SKILL.md** to reflect template changes
3. Add **templating tests** to the fission-python-skill test suite
4. Document the **update process** for future template modifications
5. Consider **versioning** the template (e.g., `template-v2/`)

# Plan: Update marketplace.json
## Context
The marketplace.json file currently has an empty plugins array. The goal is to register the existing `fission-python-skill` plugin in the marketplace by adding it to the plugins list. Owner information will remain unchanged.
## Current State
- **File**: `.claude-plugin/marketplace.json`
- **Current content**: `{ "name": "vega-claude-marketplace", "owner": {"name": "tiendd", "email": "fdm.dev17@gmail.com"}, "plugins": [] }`
- **Plugin to add**: `fission-python-skill/.claude-plugin/plugin.json` contains:
- id: "fission-python-skill"
- name: "Fission Python Skill"
- description: "Skill for creating, analyzing, and managing Fission Python projects."
- type: "skill"
- path: "fission-python-skill"
- tools: ["create-project", "analyze-config", "update-docstring"]
- version: "1.0.0"
## Implementation
1. Read the current `plugin.json` from `fission-python-skill/.claude-plugin/` to extract plugin metadata
2. Update `.claude-plugin/marketplace.json`:
- Keep existing name and owner unchanged
- Add a plugin object to the plugins array with the data from plugin.json
## Critical Files
- `.claude-plugin/marketplace.json` (to be modified)
- `fission-python-skill/.claude-plugin/plugin.json` (source of plugin data)
## Verification
After modification, verify:
1. The file contains valid JSON
2. The plugins array contains the fission-python-skill object
3. Owner information is unchanged
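The merge step can be sketched as follows (plugin metadata abbreviated; the real object carries all fields from `plugin.json`):

```python
import json

def register_plugin(marketplace: dict, plugin: dict) -> dict:
    """Append plugin metadata to the plugins array, leaving name/owner alone."""
    merged = dict(marketplace)
    # Drop any stale entry with the same id before appending.
    plugins = [p for p in marketplace.get("plugins", []) if p.get("id") != plugin.get("id")]
    plugins.append(plugin)
    merged["plugins"] = plugins
    return merged

marketplace = {
    "name": "vega-claude-marketplace",
    "owner": {"name": "tiendd", "email": "fdm.dev17@gmail.com"},
    "plugins": [],
}
plugin = {"id": "fission-python-skill", "name": "Fission Python Skill", "version": "1.0.0"}
updated = register_plugin(marketplace, plugin)
```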

# README.md Generation Plan
## Context
The user requested to "review project and generate README.md file" for the Claude Marketplace repository. This repository contains a plugin ecosystem for Claude Code with two major components:
1. **Fission Python Skill** - A plugin for creating, analyzing, and managing Fission serverless Python projects
2. **SDLC Agent System** - A complete multi-agent Software Development Life Cycle system for automated planning, architecture, coding, and code review
The repository is missing a root-level README.md file that documents these components, their usage, and how they work together.
## Problem Statement
The repository needs a comprehensive README.md at the root level that:
1. Introduces the Claude Marketplace project and its purpose
2. Documents the Fission Python Skill plugin (tools, installation, usage)
3. Documents the SDLC Agent System (agents, setup, workflow)
4. Explains the project structure and key directories
5. Provides quick start guides for both components
6. Includes reference information (technologies, environment variables, common tasks)
7. Links to existing detailed documentation (CLAUDE.md, agent docs, skill docs)
## Solution Approach
### What to Include
Based on repository analysis, the README should cover:
1. **Project Overview**
- What is Claude Marketplace?
- Key components (Fission Python Skill, SDLC Agents)
- Relationship between components
2. **Fission Python Skill**
- Purpose and use cases
- Available tools (create-project, analyze-config, update-docstring)
- Installation/setup (chmod +x on scripts)
- Usage examples for each tool
- Project structure
- Links to detailed docs (SKILL.md, reference.md)
3. **SDLC Agent System**
- Overview of the 7 agents (Initializer, Planning, Architect, Coding, Code Review, Curator, Retro)
- Agent workflow and handoffs
- Setup procedure (setup.sh script)
- agent-context directory structure
- Skills system (stack detection, patterns, frameworks)
- Quality gates and harness scripts
- Links to detailed agent docs
4. **Project Structure**
- Directory layout with descriptions
- Key configuration files
- Template locations
5. **Quick Start**
- Using Fission Python Skill to create a project
- Setting up SDLC Agents in an existing project
- Development environment (devcontainer)
6. **Development**
- Making changes to skill scripts
- Updating plugin metadata
- Testing approaches
7. **Configuration**
- Environment variables (for devcontainer)
- Claude Code settings
8. **Related Documentation**
- CLAUDE.md (comprehensive project guide)
- Agent-specific documentation
- Skill documentation
### Design Decisions
- **Structure**: Standard GitHub README with clear sections using markdown headings
- **Tone**: Professional, concise, informative
- **Format**: Single file at repository root
- **Links**: Cross-reference existing documentation rather than duplicating content
- **Code blocks**: Include practical examples for all commands
- **Tables**: Use for quick reference (tools, skills, agents)
### Reuse Existing Content
- CLAUDE.md contains excellent detailed information - will summarize and link to it
- Individual agent .md files have authoritative content - will link rather than copy
- Skill files (SKILL.md) already have user-facing docs - will summarize and link
- The template structure is documented in CLAUDE.md - will extract key info
## Implementation Steps
1. Create `/workspaces/claude-marketplace/README.md` with:
a. **Header section**
- Badges (if applicable)
- Title and subtitle
- One-sentence description
b. **Table of Contents** (auto-generated with markdown-toc or manual)
c. **Project Overview**
- Purpose of Claude Marketplace
- Components summary
- Key technologies
d. **Fission Python Skill section**
- Description
- Tools table
- Installation
- Usage examples
- Project structure
- Links to SKILL.md and reference.md
e. **SDLC Agent System section**
- What are SDLC Agents?
- The 7 agents with brief descriptions
- Setup instructions
- Agent-context structure
- Skills system overview
- Harness and quality gates
- Links to detailed docs
f. **Project Structure section**
- Directory tree visualization
- Key files table
g. **Quick Start section**
- Setting up dev environment
- Creating Fission project
- Initializing SDLC Agents
h. **Development section**
- Modifying skills
- Plugin registration
- Testing
i. **Configuration section**
- Environment variables table
- Settings files
j. **License** (check if exists)
2. Ensure the README:
- Is comprehensive but concise
- Uses consistent formatting (h2 for major sections, h3 for subsections)
- Includes practical examples with code blocks
- Links to existing detailed documentation
- Has a clear call-to-action for both components
3. Quality checks:
- Verify all linked files exist
- Ensure markdown renders properly (no broken syntax)
- Check for consistency with CLAUDE.md
## Critical Files
- `/workspaces/claude-marketplace/README.md` - The file to create
- `/workspaces/claude-marketplace/CLAUDE.md` - Source for detailed project information
- `/workspaces/claude-marketplace/fission-python-skill/SKILL.md` - Skill documentation source
- `/workspaces/claude-marketplace/.sdlc-agents/` - Agent documentation directory
- `/workspaces/claude-marketplace/.sdlc-agents/setup.sh` - Agent setup script
## Verification
After generating the README.md:
1. Check markdown syntax (headings, lists, code blocks, tables)
2. Verify all internal links point to existing files
3. Ensure all referenced tools and scripts actually exist
4. Confirm information consistency with source files
5. Review for completeness: Does it answer "What is this repo?" and "How do I use it?"
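Step 2 (verifying internal links) can be automated with a small script; a sketch, assuming relative link targets and ignoring anchors and external URLs:

```python
import os
import re

# Captures the target of a markdown link, stopping at ')' or a '#' anchor.
LINK_RE = re.compile(r"\[[^\]]*\]\(([^)#]+)")

def find_broken_links(markdown: str, root: str) -> list:
    """Return relative link targets in `markdown` that do not exist under
    `root`. External (http/https) links are ignored."""
    broken = []
    for target in LINK_RE.findall(markdown):
        if target.startswith(("http://", "https://")):
            continue
        if not os.path.exists(os.path.join(root, target)):
            broken.append(target)
    return broken
```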
## Success Criteria
- README.md exists at repository root
- Provides clear overview of both major components
- Includes practical usage examples
- Links to authoritative detailed documentation
- Follows standard GitHub README conventions
- New users can understand the project and get started

# Plan: Enhance Fission Python Projects with Exceptions, Pydantic Models, and Code Quality Improvements
## Context
Three Fission Python projects need systematic improvements to enhance error handling, data validation, and code maintainability:
- **py-eom-storage**: Storage management API (GET/POST /storages, GET/PUT/DELETE /storages/{id})
- **py-eom-quota**: Quota management API (GET/POST /quotas, POST/DELETE /users/{userId}/quotas/{quotaId})
- **py-ailbl-scheduler**: Background worker system for scheduled tasks
Currently, all projects use generic `Exception` with simple error messages returned as `{"error": str(err)}` and a 500 status. There is no structured error handling, request validation, or consistent response formatting. Some projects define Pydantic models, but they are not used comprehensively.
## Goals
1. **Custom Exceptions**: Implement domain-specific exception classes with:
- `error_code`: Machine-readable error identifier
- `http_status_code`: Appropriate HTTP status (400, 404, 409, 500, etc.)
- `error_msg`: Human-readable message
- `x_user`: User identifier from request header (X-Fission-Params-UserId or similar)
2. **Pydantic Models**: Add comprehensive request/response models for all endpoints:
- Request body validation (POST/PUT)
- Query parameter validation (GET)
- Structured response schemas
- Consistent error response format
3. **Code Quality**: Improve maintainability with:
- Detailed docstrings for all functions and classes
- Refactoring of complex, multi-responsibility functions
- Consistent error handling patterns
- Fix broken imports and type issues
## Project-Specific Plans
### 1. py-eom-storage
**Current State:**
- Has Pydantic models: `S3Resource`, `S3Credential` (unused)
- Uses dataclasses: `Page`, `Filter` (should be Pydantic)
- Endpoints: `/eom/admin/storages` (filter_or_insert.py), `/eom/admin/storages/{StorageId}` (update_or_delete.py)
**Changes Needed:**
**A. Create `src/exceptions.py`:**
```python
class StorageException(Exception):
    """Base exception for storage-related errors."""
    def __init__(self, error_code: str, http_status: int, error_msg: str, x_user: str = None):
        self.error_code = error_code
        self.http_status = http_status
        self.error_msg = error_msg
        self.x_user = x_user
        super().__init__(self.error_msg)

class ValidationError(StorageException):
    """Invalid input data."""

class NotFoundError(StorageException):
    """Resource not found."""

class ConflictError(StorageException):
    """Resource conflict (e.g., duplicate name)."""

class DatabaseError(StorageException):
    """Database operation failed."""

class S3ConnectionError(StorageException):
    """S3/MinIO connection failed."""
```
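For illustration, hypothetical handler-side usage of these classes (abridged re-declaration so the sketch stands alone; the lookup function and store are made up for the example):

```python
# Abridged re-declaration of the classes sketched above, so this usage
# example runs standalone.
class StorageException(Exception):
    def __init__(self, error_code, http_status, error_msg, x_user=None):
        super().__init__(error_msg)
        self.error_code = error_code
        self.http_status = http_status
        self.error_msg = error_msg
        self.x_user = x_user

class NotFoundError(StorageException):
    """Resource not found."""

def get_storage(storage_id, store):
    """Look up a storage record, raising NotFoundError with a
    machine-readable code and the HTTP status a handler should return."""
    if storage_id not in store:
        raise NotFoundError("STORAGE_NOT_FOUND", 404,
                            f"Storage with id '{storage_id}' does not exist")
    return store[storage_id]
```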
**B. Create/Update `src/models.py` (or extend existing):**
```python
import typing
from datetime import datetime
from typing import Literal

from pydantic import BaseModel, Field

# Request models
class StorageCreateRequest(BaseModel):
    name: str = Field(..., min_length=1, max_length=255)
    description: typing.Optional[str] = None
    resource: dict  # Should validate S3 structure

class StorageUpdateRequest(BaseModel):
    name: typing.Optional[str] = None
    description: typing.Optional[str] = None
    resource: typing.Optional[dict] = None
    active: typing.Optional[bool] = None

# Query models (convert Page/Filter to Pydantic)
class StorageFilter(BaseModel):
    ids: typing.Optional[typing.List[str]] = None
    keyword: typing.Optional[str] = None
    collection_id: typing.Optional[str] = None
    enable: typing.Optional[bool] = None
    created_from: typing.Optional[datetime] = None
    created_to: typing.Optional[datetime] = None
    # ... other filters

class StorageQuery(BaseModel):
    page: int = 0
    size: int = Field(8, ge=1, le=100)
    asc: bool = True
    sortby: typing.Optional[Literal["name", "enable", "created", "modified"]] = None
    filter: StorageFilter = Field(default_factory=StorageFilter)

# Response models
class StorageResponse(BaseModel):
    id: str
    name: str
    description: typing.Optional[str]
    resource: dict
    enable: bool
    created: datetime
    modified: datetime

class ErrorResponse(BaseModel):
    error_code: str
    http_status: int
    error_msg: str
    x_user: typing.Optional[str] = None
    details: typing.Optional[dict] = None
```
**C. Refactor `filter_or_insert.py`:**
- Replace try-except to catch custom exceptions
- Validate request body using Pydantic in `make_insert_request`
- Use Pydantic for query parsing in `make_filter_request`
- Add helper function `handle_exception` to format error responses consistently
- Extract SQL queries into separate functions for testability
- Add comprehensive docstrings explaining each endpoint's behavior
**D. Refactor `update_or_delete.py`:**
- Similar pattern: custom exceptions, Pydantic validation
- Refactor `is_depended_on_storage` - this function does too much, split into smaller helpers
- Add detailed comments for each database operation
- Ensure proper error messages with appropriate HTTP status codes
**E. Update `helpers.py`:**
- Add utility `get_user_from_header(request)` to extract x-user from various headers
---
### 2. py-eom-quota
**Current State:**
- Already has extensive Pydantic models in `models.py` (QuotaPage, UserQuotaPage, ScheduleCreate, etc.)
- But: `userquota_filter.py` imports from `quota_update_or_delete` which doesn't exist (broken import)
- Need to expand models to cover all request/response scenarios
- Endpoints: `/eom/admin/quotas` (filter), `/eom/admin/users/{UserId}/quotas` (filter/insert), `/eom/admin/users/{UserId}/quotas/{QuotaId}` (update/delete)
**Changes Needed:**
**A. Create `src/exceptions.py`:**
```python
class QuotaException(Exception):
    """Base exception for quota management."""
    def __init__(self, error_code: str, http_status: int, error_msg: str, x_user: str = None):
        self.error_code = error_code
        self.http_status = http_status
        self.error_msg = error_msg
        self.x_user = x_user
        super().__init__(self.error_msg)

class QuotaNotFoundError(QuotaException):
    """Quota does not exist."""

class UserQuotaConflictError(QuotaException):
    """User already has this type of quota."""

class ValidationError(QuotaException):
    """Invalid request data."""

class DatabaseError(QuotaException):
    """Database operation failed."""
```
**B. Extend `src/models.py`:**
The existing models mix schedule and quota models. Need to:
- Separate or clearly document which are for quotas vs schedules
- Add request models:
```python
class QuotaCreateRequest(BaseModel):
    name: str
    description: typing.Optional[str] = None
    type: QuotaType
    value: typing.Union[MaxSizeBody, MaxOrderTimesBody]
    expire: ExpireBody

class QuotaUpdateRequest(BaseModel):
    name: typing.Optional[str] = None
    description: typing.Optional[str] = None
    enable: typing.Optional[bool] = None
    type: typing.Optional[QuotaType] = None
    value: typing.Optional[typing.Union[MaxSizeBody, MaxOrderTimesBody]] = None
    expire: typing.Optional[ExpireBody] = None

class UserQuotaAssignRequest(BaseModel):
    quota_id: str
```
- Ensure response models exist (QuotaResponse, UserQuotaResponse)
**C. Fix `userquota_filter.py`:**
- Fix broken import: `from quota_update_or_delete import __get_by_id` → `from userquota_insert_or_delete import __get_by_id` (or better: move `__get_by_id` to a shared helpers module)
- Refactor `make_filter_request`:
- Use `UserQuotaPage` Pydantic model properly
- Validate user_id header is present using Pydantic
- Replace try-except with custom exceptions
- Add comprehensive docstring
- The function currently manually sets `paging.filter.user_ids = [user_id]` - this should be part of a validation layer
**D. Refactor `userquota_insert_or_delete.py`:**
- Fix the same broken import pattern (it imports nothing but uses `__get_by_id` in filter)
- Add proper request validation using Pydantic models
- Replace generic exceptions with `UserQuotaConflictError`, `QuotaNotFoundError`, etc.
- Refactor `__validate_user_quota_type` - currently SQL query is hardcoded, add comments explaining business logic
- The insert SQL has wrong columns: `INSERT INTO eom_user_quota(id, name, description, type, value, expire)`, but the table likely only has `(id, user_id, quota_id)`. Verify against the database schema; from the code alone, the columns appear mismatched.
**E. Improve `helpers.py`:**
- Add utility functions for extracting and validating user headers
- Add consistent error handling helpers
---
### 3. py-ailbl-scheduler
**Current State:**
- No HTTP endpoints (only time-triggered workers)
- No Pydantic models needed per user's choice
- Needs custom exceptions and code quality improvements
- Workers: `worker_session_picker.py`, `worker_session_poller.py`, `worker_scheduler_scan.py`, `worker_schedule_auto_disable.py`
- Common utilities in `common.py`, `helpers.py`
**Changes Needed:**
**A. Create `src/exceptions.py`:**
```python
class SchedulerException(Exception):
    """Base exception for scheduler operations."""
    def __init__(self, error_code: str, error_msg: str, details: dict = None):
        self.error_code = error_code
        self.error_msg = error_msg
        self.details = details
        super().__init__(self.error_msg)

class ScheduleNotFoundError(SchedulerException):
    """Schedule does not exist."""

class SessionLockError(SchedulerException):
    """Failed to acquire session lock."""

class DagsterError(SchedulerException):
    """Dagster pipeline execution failed."""

class CronParseError(SchedulerException):
    """Invalid cron expression."""

class ConfigurationError(SchedulerException):
    """Missing or invalid configuration."""
```
**B. Refactor `worker_scheduler_scan.py`:**
This is the most complex function (446 lines). Goals:
- Extract helper functions:
- `_normalize_cron_for_cronner` (already exists)
- `_as_date`, `_as_time` (already exist)
- `_within_active_window` (already exists)
- `_is_due_by_cron` (already exists)
- `_is_due_by_freq` (already exists)
- Extract the schedule creation logic into `_create_session_for_schedule(cur, schedule, now, slot_start)`
- Extract the candidate schedule selection into `_fetch_due_schedules(cur, now, slot_start, slot_end, limit=50)`
- Add detailed docstrings explaining the overall algorithm: "Scan for schedules that are due in the current time slot and create sessions atomically"
- Improve variable names (e.g., `s` → `schedule`, `cur` → `cursor`)
- Add comments explaining the advisory lock strategy and why it's needed
- Ensure proper exception handling with custom exceptions
- The function currently catches generic Exception at the end - wrap specific operations with appropriate custom exceptions
**C. Refactor `worker_session_picker.py`:**
- Similar breakdown: extract `_pick_and_claim_sessions(conn, limit=20)` helper
- Extract `_process_kind5_session(session, ctx)` and `_process_kind1_session(session, ctx)` into separate functions
- Add detailed docstring explaining the picking strategy (FOR UPDATE SKIP LOCKED)
- Replace bare `except Exception` with specific exception types
- Add comments explaining the kind handling logic (kind 5 vs kind 1)
- The function `_build_run_config_kind5` is specific to that kind - could be moved to a separate module if needed
**D. Refactor `worker_session_poller.py`:**
- Extract `_update_completed_session(cur, session_id, status_info, now)` helper
- Extract `_update_started_session(cur, session_id, started_dt)` helper
- Add docstring explaining polling strategy
- Replace generic exception handling with `DagsterError` when Dagster calls fail
- Add type hints for the row unpacking: `for sid, run_id, started, cron_description, created_by in rows:`
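Since for-loop targets cannot be annotated inline, the row-unpacking hint would be pre-declared; a sketch with assumed column types and made-up sample data:

```python
# Illustrative row shape for the poller's SELECT; column types are assumed.
from datetime import datetime
from typing import Optional

rows = [("sess-1", "run-42", None, "Every day at 06:00", "alice")]

# Pre-declare the target types so a checker knows the tuple's shape.
sid: str
run_id: str
started: Optional[datetime]
cron_description: str
created_by: str

for sid, run_id, started, cron_description, created_by in rows:
    # started stays None until Dagster reports the run has begun
    assert started is None or isinstance(started, datetime)
```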
**E. Refactor `worker_schedule_auto_disable.py`:**
- This one is already simple; still, add a comprehensive docstring
- Consider adding custom exception for database errors
**F. Improve `helpers.py` (in scheduler):**
- The `GraphQL` class and related functions are specific to Dagster - add docstrings
- `safe_notify` is good, add docstring
- Consider creating a `SchedulerHelper` class to group related utilities
**G. Improve `common.py`:**
- Already has good docstrings but could be expanded
- Add type hints to function signatures
- Break `launch_pipeline_execution` if too complex (handles multiple error cases)
---
## Common Patterns
### Exception Hierarchy
Each project will have:
```python
class BaseProjectException(Exception):
    """Base with error_code, http_status (if applicable), message, metadata."""
    pass

# Specific exceptions inherit from base
class NotFoundError(BaseProjectException): ...
class ValidationError(BaseProjectException): ...
class ConflictError(BaseProjectException): ...
class DatabaseError(BaseProjectException): ...

# Domain-specific: StorageNotFoundError, QuotaConflictError, ScheduleNotFoundError, etc.
```
### Error Response Format
Standardized JSON response:
```json
{
"error_code": "STORAGE_NOT_FOUND",
"http_status": 404,
"error_msg": "Storage with id 'xyz' does not exist",
"x_user": "user123",
"details": { /* optional additional context */ }
}
```
### Middleware Pattern
In each HTTP endpoint function:
```python
def main():
    try:
        # Extract user header
        x_user = request.headers.get("X-Fission-Params-UserId")
        # Route to handler
        return handler()
    except ValidationError as e:
        return error_response(e), 400
    except NotFoundError as e:
        return error_response(e), 404
    except ConflictError as e:
        return error_response(e), 409
    except StorageException as e:
        logger.error(f"Storage error: {e.error_code}: {e.error_msg}")
        return error_response(e), 500
    except Exception as e:
        logger.exception("Unexpected error")
        return {"error": "Internal server error"}, 500
```
---
## Implementation Order
1. **Phase 1**: Create exception modules for all three projects
2. **Phase 2**: Add or expand Pydantic models (create storage's first, then complete quota's)
3. **Phase 3**: Refactor endpoints to use exceptions and models
4. **Phase 4**: Refactor complex functions in scheduler
5. **Phase 5**: Documentation pass - ensure all functions have docstrings
6. **Phase 6**: Test manually by running functions (no automated tests to update)
---
## Verification Steps
1. **Manual Testing**:
- Deploy each function to local Fission or use test environment
- Test error cases: invalid input, missing resources, database failures
- Verify error response format matches specification
- Check logs for proper error logging
2. **Code Review**:
- All functions have docstrings with Args, Returns, Raises sections
- No function exceeds ~50 lines (extract helpers where needed)
- All exceptions are specific, not generic `Exception`
- Request validation happens before business logic
3. **Import Verification**:
- Fix broken imports (especially in py-eom-quota's userquota_filter.py)
- Ensure circular dependencies are avoided
4. **Type Safety**:
- Run static type checker if available (mypy/pyright)
- Ensure all functions have return type hints
---
## Critical Files to Modify
**py-eom-storage:**
- `src/exceptions.py` (new)
- `src/models.py` (create/extend)
- `src/filter_or_insert.py` (refactor)
- `src/update_or_delete.py` (refactor)
- `src/helpers.py` (add utilities)
- `src/vault.py` (minor: improve docs)
**py-eom-quota:**
- `src/exceptions.py` (new)
- `src/models.py` (extend with request models)
- `src/userquota_filter.py` (fix imports, refactor)
- `src/userquota_insert_or_delete.py` (refactor, fix SQL if needed)
- `src/helpers.py` (add utilities)
**py-ailbl-scheduler:**
- `src/exceptions.py` (new)
- `src/worker_scheduler_scan.py` (major refactor)
- `src/worker_session_picker.py` (refactor)
- `src/worker_session_poller.py` (refactor)
- `src/worker_schedule_auto_disable.py` (docs)
- `src/common.py` (docs, type hints)
- `src/helpers.py` (docs, maybe extract class)
---
## Notes
- All changes are in `/workspaces/claude-marketplace/data/examples/`
- Preserve existing API contracts (URLs, HTTP methods)
- Do not change database schema
- Maintain backward compatibility with existing clients
- Focus on internal improvements: error handling, validation, documentation
- Use consistent patterns across all three projects


@@ -0,0 +1,72 @@
# Fission Python Skill Plan
## Context
The user wanted to create a new skill called `fission-python-skill` in the `@fission-plugin/skills` directory. This skill should provide three main capabilities:
1. Create a new Fission Python project from a template (based on @data/py-eom-quota)
2. Analyze the `.fission` configuration of each fission-python project
3. Parse and update the docstring of a Fission function method
## Approach
Based on my exploration of the codebase:
- The example project @data/py-eom-quota shows a standard Fission Python project structure
- Fission configuration is stored in .fission/deployment.json and similar files
- Python functions contain fission configuration in their docstrings using a specific format (between ```fission and ``` markers)
- The fission-plugin/skills directory previously contained only empty SKILL.md and reference.md files
I implemented all three requested capabilities as shell scripts within the skill directory.
## Implementation Summary
### Phase 1: Skill Creation Completed
1. Created the skill directory: `/workspaces/claude-marketplace/fission-plugin/skills/fission-python-skill/`
2. Created SKILL.md following the skill format from the documentation
3. Created reference.md with detailed usage instructions
4. Implemented the three core tools as shell scripts:
- `create-project.sh`: Creates a new fission python project from template
- `analyze-config.sh`: Analyzes .fission configuration in a project
- `update-docstring.sh`: Parses and updates docstrings in fission function methods
### Phase 2: Tool Implementation Details (Completed)
#### create-project.sh
- Takes project name and optional destination directory
- Copies template from @data/py-eom-quota (excluding .git, etc.)
- Replaces placeholder values in configuration files
- Provides usage instructions for next steps
#### analyze-config.sh
- Takes path to a fission project
- Reads and parses .fission/deployment.json (and related files)
- Outputs structured summary of:
- Environments and their resource settings
- Packages and build commands
- Functions and their triggers
- Secrets and configmaps
- Archives and source configuration
#### update-docstring.sh
- Takes path to a python file and optionally function name
- Parses docstrings to extract embedded fission configuration (between ```fission markers)
- Allows updating the fission configuration within docstrings using --set flag
- Can retrieve current configuration using --get flag (default)
- Preserves existing function code and documentation outside fission blocks
- Uses Python script for robust JSON handling and string manipulation
### Phase 3: Testing (Completed)
- Tested create-project.sh by generating a new project and verifying structure
- Tested analyze-config.sh on the existing @data/py-eom-quota project
- Tested update-docstring.sh by retrieving and modifying fission configuration in function docstrings
- All tools have proper help text and error handling
## Files Created
- `/workspaces/claude-marketplace/fission-python-skill/SKILL.md`
- `/workspaces/claude-marketplace/fission-python-skill/reference.md`
- `/workspaces/claude-marketplace/fission-python-skill/create-project.sh`
- `/workspaces/claude-marketplace/fission-python-skill/analyze-config.sh`
- `/workspaces/claude-marketplace/fission-python-skill/update-docstring.sh`
## Verification Results
✓ create-project.sh: Successfully creates new fission python projects from template
✓ analyze-config.sh: Successfully analyzes .fission configuration showing environments, packages, functions, secrets, etc.
✓ update-docstring.sh: Successfully extracts and updates fission configuration in function docstrings
All tools are executable and include proper error handling and usage instructions.


@@ -0,0 +1,156 @@
# Plan: Update Fission Python Skill
## Context
The `fission-python-skill` plugin needs to be updated to meet new requirements for Fission Python projects:
1. **Build script**: `src/build.sh` must exist and be referenced correctly in `.fission/deployment.json`
2. **Dependencies**: `src/requirements.txt` must exist and contain necessary packages (pydantic, etc.)
3. **CI/CD**: All projects must include `.gitea/workflows/` directory with deployment workflows
4. **API Design**: HTTP trigger functions must use Pydantic models for request/response validation
5. **Documentation**: All functions must have proper docstrings and code comments
6. **Portability**: Remove hardcoded absolute paths - the plugin should work from any location
Current issues:
- `create-project.sh` uses hardcoded path `/workspaces/claude-marketplace/data/py-eom-quota`
- Template resides in `data/examples/` which is outside the plugin
- No validation of generated projects
- Documentation references incorrect paths
## Approach
**Step 1: Make Plugin Portable**
- Copy the `py-eom-quota` template into `fission-python-skill/template/`
- Update `create-project.sh` to find template relative to script location using `dirname "$0"`
- Remove all absolute path references
**Step 2: Add Project Validation**
Enhance `create-project.sh` with post-creation validation:
- Check `src/build.sh` exists and is executable
- Verify `.fission/deployment.json` references the correct build command (`./build.sh`)
- Check `src/requirements.txt` exists and contains required dependencies:
- `pydantic==2.x`
- `flask` (for HTTP handlers)
- `psycopg2-binary` or `psycopg2` (if the project uses a database)
- Verify `.gitea/workflows/` directory exists with the 4 standard workflow files
- Validate that function files contain pydantic models (basic grep check)
- Warn if docstrings appear minimal or missing
**Step 3: Update Documentation**
- Fix `SKILL.md` and `reference.md` to reference the correct template path
- Document the new validation checks
- Update examples to show portable usage
**Step 4: Potentially Add New Tool**
Consider adding a separate validation tool (`validate-project.sh`) that can be run on existing projects to check compliance with standards.
## Critical Files to Modify
1. **`fission-python-skill/create-project.sh`**
- Change `TEMPLATE_DIR` to use relative path: `$(dirname "$0")/template`
- Add validation functions after project creation
- Improve error messages
- Add warnings for missing documentation
2. **`fission-python-skill/template/`** (new directory)
- Copy entire structure from `data/examples/py-eom-quota/`
- Ensure `build.sh` has correct permissions (755)
- Verify all configuration files
3. **`fission-python-skill/SKILL.md`** and **`reference.md`**
- Update template path references
- Document validation behavior
- Update examples
4. **`.claude-plugin/marketplace.json`**
- No changes needed (plugin registration OK)
## Implementation Details
### Template Structure
```
fission-python-skill/
└── template/
├── .fission/
│ ├── deployment.json
│ ├── dev-deployment.json
│ └── local-deployment.json
├── .gitea/
│ └── workflows/
│ ├── dev-deployment.yaml
│ ├── install-dispatch.yaml
│ ├── uninstall-dispatch.yaml
│ └── analystic-dispatch.yaml
├── src/
│ ├── build.sh (executable)
│ ├── requirements.txt (with pydantic, flask, etc.)
│ ├── models.py (with pydantic models)
│ ├── exceptions.py
│ ├── helpers.py
│ ├── vault.py
│ └── <example functions>.py (with docstrings and pydantic usage)
├── test/
├── manifests/
├── migrates/
├── specs/
├── dev-requirements.txt
├── README.md
├── .gitignore
└── .devcontainer/
```
### Validation Checklist in create-project.sh
After copying template and doing substitutions:
1. `[ -f "$PROJECT_PATH/src/build.sh" ]` || warning
2. `[ -x "$PROJECT_PATH/src/build.sh" ]` || chmod +x
3. Check `deployment.json` contains `"./build.sh"` in packages.buildcmd
4. `[ -f "$PROJECT_PATH/src/requirements.txt" ]` || error
5. Check requirements.txt contains `pydantic` (grep -q "pydantic")
6. Check requirements.txt contains `flask` (grep -q "flask")
7. `[ -d "$PROJECT_PATH/.gitea/workflows" ]` || warning/copy from template
8. Count workflow files: should have at least 4 .yaml files
9. Optional: Check that Python files have docstrings (grep for triple quotes)
10. Optional: Check for pydantic BaseModel usage in models.py
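The checklist above could be sketched as a standalone checker. This is an illustrative Python sketch only; the real checks would live in `create-project.sh` as bash, and `validate-project.sh` is so far only a proposal:

```python
from pathlib import Path

def validate_project(project_path):
    """Run the post-creation checks from the checklist; return (errors, warnings)."""
    root = Path(project_path)
    errors, warnings = [], []

    # Checks 1-2: build script present (executability would be fixed via chmod)
    if not (root / "src" / "build.sh").is_file():
        warnings.append("src/build.sh is missing")

    # Checks 4-6: requirements.txt present with required dependencies
    reqs = root / "src" / "requirements.txt"
    if not reqs.is_file():
        errors.append("src/requirements.txt is missing")
    else:
        text = reqs.read_text()
        for pkg in ("pydantic", "flask"):
            if pkg not in text:
                warnings.append(f"requirements.txt does not mention {pkg}")

    # Checks 7-8: workflows directory with at least 4 .yaml files
    workflows = root / ".gitea" / "workflows"
    yaml_files = list(workflows.glob("*.yaml")) if workflows.is_dir() else []
    if len(yaml_files) < 4:
        warnings.append(".gitea/workflows should contain at least 4 .yaml files")

    return errors, warnings
```

Returning errors and warnings separately mirrors the intended strictness split: missing critical files fail the run, everything else only warns.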
### Portable Path Resolution
```bash
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
TEMPLATE_DIR="$SCRIPT_DIR/template"
```
This ensures the plugin works regardless of where it's invoked from.
## Verification Steps
1. **Test create-project**:
- Run `./fission-python-skill/create-project.sh test-project ./tmp/`
- Verify all expected directories/files exist
- Check that validation warnings/errors appear appropriately
2. **Test portability**:
- Move plugin to a different directory
- Run create-project from there
- Should still work without path adjustments
3. **Test validation**:
- Manually delete `src/requirements.txt` from template and create project → should error
- Remove pydantic from requirements.txt → should warn
- Remove .gitea/workflows → should warn
- Change build.sh buildcmd to something else → should warn
4. **Test generated project**:
- Verify functions have docstrings with fission config blocks
- Verify models.py uses pydantic BaseModel
- Verify HTTP triggers properly defined in deployment.json
## Risks and Considerations
- **Template duplication**: Moving template into plugin duplicates existing examples. That's acceptable - the examples in `data/examples/` are finished projects, while the template is a starter. Keep both.
- **Validation strictness**: Start with warnings for most checks, errors only for critical missing files (`requirements.txt`, `build.sh`). Can tighten later.
- **Template maintenance**: When updating the template, only modify `fission-python-skill/template/`. The examples in `data/examples/` are independent and can diverge if needed.
## Post-Implementation
- Update any scripts or docs that reference the old template path
- Test the skill end-to-end through Claude Code
- Consider adding a `validate-project.sh` tool for existing projects