# Python LLM & ML Workflow .cursorrules Prompt File

## Synopsis
This prompt file is designed for senior Python AI/ML engineers specializing in Large Language Model (LLM) applications and Machine Learning (ML) workflow optimization. It provides a comprehensive set of guidelines and best practices for developing high-quality, maintainable, and efficient Python code.
## Tech Stack
- Python 3.10+
- Poetry / Rye
- Ruff
- `typing` module
- `pytest`
- Google Style Docstrings
- `conda`/`venv`
- `docker`, `docker-compose`
- `async` and `await`
- `fastapi`
- `gradio`, `streamlit`
- `langchain`, `transformers` (Optional)
- `faiss`, `chroma`
- `mlflow`, `tensorboard`
- `optuna`, `hyperopt`
- `pandas`, `numpy`, `dask`, `pyspark`
- `git`
- `gunicorn`, `uvicorn`
- `nginx`, `caddy`
- `systemd`, `supervisor`
## Key Features
- Emphasizes modular design, code quality, and ML/AI-specific guidelines.
- Focuses on performance optimization, including asynchronous programming and caching.
- Provides detailed coding standards and best practices for Python and FastAPI (see the sketch after this list).
- Includes guidelines for effective documentation, testing, and error handling.
- Tailored for use with the Cursor IDE, but applicable to general Python development.
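
As a rough illustration of how these guidelines fit together, here is a minimal sketch assuming only `fastapi` and `pydantic` from the stack above: a typed, async endpoint with Google Style Docstrings, a cached settings factory via `functools.lru_cache`, and a small `pytest`-style smoke test. The endpoint path, model name, and request fields are hypothetical examples, not part of the prompt file itself.

```python
"""Minimal sketch of the conventions the rules encourage; all names are illustrative."""

from functools import lru_cache

from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()


class CompletionRequest(BaseModel):
    """Request body for a hypothetical completion endpoint.

    Attributes:
        prompt: The user prompt to complete.
        max_tokens: Upper bound on the length of the stubbed completion.
    """

    prompt: str
    max_tokens: int = 256


@lru_cache(maxsize=1)
def get_settings() -> dict[str, str]:
    """Build process-wide settings once and cache them (placeholder values)."""
    return {"model_name": "example-model"}


@app.post("/complete")
async def complete(request: CompletionRequest) -> dict[str, str]:
    """Return a stubbed completion for the given prompt.

    Args:
        request: Validated request body.

    Returns:
        A JSON-serializable dict with the model name and (stubbed) completion.
    """
    settings = get_settings()
    # A real implementation would await an LLM client call here (e.g. via langchain).
    return {
        "model": settings["model_name"],
        "completion": request.prompt[: request.max_tokens],
    }


def test_complete_returns_prompt_echo() -> None:
    """Smoke test via FastAPI's TestClient; pytest collects this by name."""
    client = TestClient(app)
    response = client.post("/complete", json={"prompt": "hello"})
    assert response.status_code == 200
    assert response.json()["completion"] == "hello"
```

A module like this could be served with `uvicorn` and tested with `pytest`; the prompt file steers the assistant toward this kind of structure rather than prescribing any particular implementation.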
## Usage
Place this `.cursorrules` file in the root of your project to guide the AI assistant in adhering to these standards and practices.
## Contribution
This prompt file is a collaborative effort, and contributions are welcome. Feel free to suggest improvements or additions to enhance its utility for Python AI/ML development.