Python Toolkit for Production Applications
Posted on Fri 24 January 2025 in Python • 4 min read
When building Python applications that go beyond scripts, you need solid foundations: validated configuration, clean command-line interfaces, proper logging, and type-safe data handling. Here's a toolkit of libraries that work well together.
Pydantic: Data Validation and Settings
Pydantic uses Python type hints to validate data at runtime. Define a model with types, and Pydantic ensures incoming data matches your schema.
```python
from pydantic import BaseModel, Field
from pydantic_settings import BaseSettings, SettingsConfigDict

class DatabaseConfig(BaseModel):
    host: str
    port: int = Field(ge=1, le=65535)
    name: str
    pool_size: int = 5

class AppSettings(BaseSettings):
    model_config = SettingsConfigDict(env_nested_delimiter="__")

    database: DatabaseConfig
    debug: bool = False
    log_level: str = "INFO"
```
Why Pydantic?
- Catches errors early: Invalid data raises clear exceptions before it causes bugs downstream
- Self-documenting: Type hints serve as documentation
- Easy serialization: Convert to and from JSON and dictionaries seamlessly
- Settings management: Load configuration from environment variables with pydantic-settings
```python
# Load from environment variables like DATABASE__HOST, DATABASE__PORT
settings = AppSettings()
print(settings.database.host)
```
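To see the early error-catching in action, here's a minimal sketch (reusing the DatabaseConfig model from above) of how Pydantic rejects out-of-range data:

```python
from pydantic import BaseModel, Field, ValidationError

class DatabaseConfig(BaseModel):
    host: str
    port: int = Field(ge=1, le=65535)
    name: str

try:
    DatabaseConfig(host="localhost", port=99999, name="myapp")
except ValidationError as e:
    # The error message names the failing field and the violated constraint
    print(e)
```

The ValidationError reports exactly which field failed and why, which makes misconfiguration obvious at startup instead of surfacing as a connection error later.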
tomllib: Configuration Files (Python 3.11+)
TOML is a clean, human-readable format for configuration files. Python 3.11 added tomllib to the standard library for parsing TOML.
```python
import tomllib

with open("config.toml", "rb") as f:
    config = tomllib.load(f)

print(config["database"]["host"])
print(config["server"]["port"])
```
Example config.toml:
```toml
[database]
host = "localhost"
port = 5432
name = "myapp"

[server]
port = 8080
workers = 4
```
Key points:
- Read-only: tomllib parses TOML but doesn't write it (use tomli-w for writing)
- Binary mode required: Open files with "rb"
- Returns dictionaries: Access values using standard dict syntax
Combining with Pydantic:
```python
import tomllib
from pydantic import BaseModel

# DatabaseConfig as defined earlier; ServerConfig mirrors the [server] table
class ServerConfig(BaseModel):
    port: int
    workers: int

class Config(BaseModel):
    database: DatabaseConfig
    server: ServerConfig

with open("config.toml", "rb") as f:
    raw_config = tomllib.load(f)

config = Config(**raw_config)  # Validates the loaded config
```
Click: Command-Line Interfaces
Click uses decorators to build CLIs without boilerplate. It handles argument parsing, help text generation, and input validation.
```python
import click
from importlib.metadata import version

@click.group()
@click.version_option(version=version("myapp"))
def cli():
    """My Application CLI"""
    pass

@cli.command()
@click.option("--config", "-c", type=click.Path(exists=True), required=True)
@click.option("--verbose", "-v", is_flag=True)
def run(config: str, verbose: bool):
    """Run the application with the specified config."""
    if verbose:
        click.echo(f"Loading config from {config}")
    # Application logic here

@cli.command()
@click.argument("name")
def greet(name: str):
    """Greet a user by name."""
    click.echo(f"Hello, {name}!")

if __name__ == "__main__":
    cli()
```
Usage:
```shell
$ myapp --version
myapp, version 1.0.0

$ myapp run --config config.toml --verbose
Loading config from config.toml

$ myapp greet World
Hello, World!
```
Why Click?
- Minimal code: Decorators eliminate boilerplate
- Automatic help: --help generated from docstrings and option descriptions
- Nested commands: Group related commands (like git commit, git push)
- Type conversion: Paths, integers, choices validated automatically
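Click also ships a test helper, click.testing.CliRunner, which invokes a command in-process so you can assert on its output without spawning a subprocess. A minimal sketch using the greet command from above:

```python
import click
from click.testing import CliRunner

@click.command()
@click.argument("name")
def greet(name: str):
    """Greet a user by name."""
    click.echo(f"Hello, {name}!")

runner = CliRunner()
result = runner.invoke(greet, ["World"])
print(result.exit_code)  # 0
print(result.output)     # Hello, World!
```

This makes CLI behavior as easy to unit-test as any other function.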
Logging: Structured Application Logs
Python's built-in logging library provides leveled, configurable logging across your entire application.
```python
import logging

# Configure once at application startup
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
)

# Get a logger for your module
logger = logging.getLogger(__name__)

# Use appropriate levels
logger.debug("Detailed debugging info")
logger.info("General information")
logger.warning("Something unexpected but not fatal")
logger.error("Error occurred")
logger.critical("Application cannot continue")
```
Logging levels (in order of severity):
- DEBUG: Detailed diagnostic information
- INFO: Confirmation that things work as expected
- WARNING: Indication of potential problems
- ERROR: Serious problem, a function couldn't complete
- CRITICAL: Program may not be able to continue
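Whether a record is emitted depends on the logger's effective level; a small sketch of the threshold in action:

```python
import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger("demo")

# Records below the configured threshold are dropped
print(logger.isEnabledFor(logging.INFO))     # False
print(logger.isEnabledFor(logging.WARNING))  # True

logger.info("not emitted")
logger.warning("emitted to stderr")
```

Loggers without an explicit level inherit the effective level from their ancestors, ending at the root logger configured by basicConfig.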
Best practices:
```python
import logging

# Use one logger per module
logger = logging.getLogger(__name__)

# Log exceptions with traceback
try:
    risky_operation()
except Exception:
    logger.exception("Operation failed")  # Includes traceback

# Use lazy formatting (faster if message isn't logged)
logger.debug("Processing item %s with value %d", item_id, value)
```
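The lazy-formatting point is easy to demonstrate: the %-style arguments are only stringified if the record passes the level filter. A sketch using an object whose __str__ would fail loudly if it were ever formatted:

```python
import logging

class Expensive:
    def __str__(self):
        raise RuntimeError("should never be formatted")

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("lazy-demo")

# DEBUG is below INFO, so the record is dropped before formatting
# and Expensive.__str__ is never called -- no exception is raised.
logger.debug("value: %s", Expensive())
```

With an f-string (`logger.debug(f"value: {Expensive()}")`) the formatting would happen eagerly and raise, even though the message would never be logged.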
Putting It Together
Here's how these libraries work together in a typical application:
```python
# app/config.py
import tomllib

from pydantic import BaseModel
from pydantic_settings import BaseSettings

class DatabaseConfig(BaseModel):
    host: str
    port: int
    name: str

class AppConfig(BaseSettings):
    database: DatabaseConfig
    log_level: str = "INFO"

def load_config(path: str) -> AppConfig:
    with open(path, "rb") as f:
        data = tomllib.load(f)
    return AppConfig(**data)
```
```python
# app/cli.py
import logging

import click

from app.config import load_config

@click.command()
@click.option("--config", "-c", type=click.Path(exists=True), required=True)
def main(config: str):
    """Run the application."""
    cfg = load_config(config)
    logging.basicConfig(level=cfg.log_level)
    logger = logging.getLogger(__name__)
    logger.info("Connecting to database at %s:%d",
                cfg.database.host, cfg.database.port)
    # Application logic

if __name__ == "__main__":
    main()
```
This combination gives you:
- Type-safe configuration with clear validation errors
- Human-readable config files in TOML format
- Clean CLI with automatic help and validation
- Structured logging for debugging and monitoring
These aren't the only options, but they're battle-tested, well-documented, and work well together.