# PIF Compiler - Comsoguard API

## Project Overview
PIF Compiler (branded as Comsoguard API) is a cosmetic safety assessment tool that compiles Product Information Files (PIF) for cosmetic products, as required by the EU Cosmetics Regulation (EC No 1223/2009). It aggregates toxicological, regulatory, and chemical data from multiple external sources to evaluate ingredient safety (DAP calculations, SED/MoS computations, NOAEL extraction).
The primary language is Italian (variable names, comments, some log messages). Code is written in Python 3.12.
## Tech Stack
- Framework: FastAPI + Uvicorn
- Package manager: uv (with `pyproject.toml` + `uv.lock`)
- Data models: Pydantic v2
- Databases: MongoDB (substance data cache via `pymongo`) + PostgreSQL (product presets, search logs, compilers via `psycopg2`)
- Web scraping: Playwright (PDF generation, browser automation), BeautifulSoup4 (HTML parsing)
- External APIs: PubChem (`pubchempy` + `pubchemprops`), COSING (EU Search API), ECHA (chem.echa.europa.eu)
- Logging: Python `logging` with rotating file handlers (`logs/debug.log`, `logs/error.log`)
## Project Structure

```
src/pif_compiler/
├── main.py                     # FastAPI app, middleware, exception handlers, routers
├── __init__.py
├── api/
│   └── routes/
│       ├── api_echa.py         # ECHA endpoints (single + batch search)
│       ├── api_cosing.py       # COSING endpoints (single + batch search)
│       ├── api_ingredients.py  # Ingredient search by CAS + list all ingested
│       ├── api_esposition.py   # Esposition preset creation + list all presets
│       └── common.py           # PDF generation, PubChem, CIR search endpoints
├── classes/
│   ├── __init__.py             # Re-exports all models from models.py and main_cls.py
│   ├── models.py               # Pydantic models: Ingredient, DapInfo, CosingInfo,
│   │                           #   ToxIndicator, Toxicity, Esposition, RetentionFactors, StatoOrdine
│   └── main_cls.py             # Orchestrator classes: Order (raw input layer),
│                               #   Project (processed layer), IngredientInput
├── functions/
│   ├── common_func.py          # PDF generation with Playwright
│   ├── common_log.py           # Centralized logging configuration
│   └── db_utils.py             # MongoDB + PostgreSQL connection helpers
└── services/
    ├── srv_echa.py             # ECHA scraping, HTML parsing, toxicology extraction,
    │                           #   orchestrator (validate -> check cache -> fetch -> store)
    ├── srv_cosing.py           # COSING API search + data cleaning
    ├── srv_pubchem.py          # PubChem property extraction (DAP data)
    └── srv_cir.py              # CIR (Cosmetic Ingredient Review) database search
```
### Other directories

- `data/` - Input data files (`input.json` with sample INCI/CAS/percentage lists), DB schema reference (`db_schema.sql`), old CSV data
- `logs/` - Rotating log files (`debug.log`, `error.log`) - auto-generated
- `pdfs/` - Generated PDF files from ECHA dossier pages
- `streamlit/` - Streamlit UI pages (`ingredients_page.py`, `exposition_page.py`)
- `marimo/` - Ignore this folder. Debug/test notebooks, not part of the main application
## Architecture & Data Flow

### Core workflow (per ingredient)

- Input: CAS number (and optionally INCI name + percentage)
- COSING (`srv_cosing.py`): Search the EU cosmetic ingredients database for regulatory restrictions, INCI match, annex references
- ECHA (`srv_echa.py`): Search substance -> get dossier -> parse HTML index -> extract toxicological data (NOAEL, LD50, LOAEL) from acute & repeated dose toxicity pages
- PubChem (`srv_pubchem.py`): Get molecular weight, XLogP, TPSA, melting point, dissociation constants
- DAP calculation (`DapInfo` model): Dermal Absorption Percentage based on molecular properties (MW > 500, LogP, TPSA > 120, etc.)
- Toxicity ranking (`Toxicity` model): Best toxicological indicator selection with priority (NOAEL > LOAEL > LD50) and safety factors
### Caching strategy

- ECHA results are cached in MongoDB (`toxinfo.substance_index` collection) keyed by `substance.rmlCas`
- Ingredients are cached in MongoDB (`toxinfo.ingredients` collection) keyed by `cas`, with the PostgreSQL `ingredienti` table as index (stores `mongo_id` + enrichment flags `dap`, `cosing`, `tox`)
- Orders are cached in MongoDB (`toxinfo.orders` collection) keyed by `uuid_ordine`
- Projects are cached in MongoDB (`toxinfo.projects` collection) keyed by `uuid_progetto`
- The orchestrator checks the local cache before making external requests
- `Ingredient.get_or_create(cas)` checks PostgreSQL -> MongoDB cache, returns the cached document if it is not older than 365 days, otherwise re-scrapes
- Search history is logged to PostgreSQL (`logs.search_history` table)
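The cache-first flow can be sketched generically. The callables below are hypothetical stand-ins (for `db_utils.get_ingrediente_by_cas`, the Mongo collection accessor, and the scraping path), and the `updated_at` field name is an assumption:

```python
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=365)  # freshness window stated above

def get_or_create(cas, pg_lookup, mongo_fetch, scrape_and_store):
    """Cache-first lookup: PostgreSQL index -> MongoDB document -> re-scrape."""
    row = pg_lookup(cas)                    # PostgreSQL ingredienti index
    if row is not None:
        doc = mongo_fetch(row["mongo_id"])  # MongoDB toxinfo.ingredients
        if doc and datetime.now(timezone.utc) - doc["updated_at"] <= MAX_AGE:
            return doc                      # fresh cache hit: no external call
    return scrape_and_store(cas)            # miss or stale: scrape and upsert

# Toy wiring to show the two paths:
fresh = {"cas": "50-00-0", "updated_at": datetime.now(timezone.utc)}
hit = get_or_create("50-00-0", lambda c: {"mongo_id": "abc"}, lambda _id: fresh,
                    lambda c: {"cas": c, "scraped": True})
miss = get_or_create("57-55-6", lambda c: None, lambda _id: None,
                     lambda c: {"cas": c, "scraped": True})
```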
### Order / Project architecture

- Order (`main_cls.py`): Raw input layer. Receives JSON with client, compiler, product type, and an ingredients list (CAS + percentage). Cleans CAS numbers (strips `\n`, splits on `;`). Saves to the MongoDB `orders` collection. Registers client/compiler in PostgreSQL.
- Project (`main_cls.py`): Processed layer. Created from an Order via `Project.from_order()`. Holds enriched `Ingredient` objects, a percentages mapping (CAS -> %), and an `Esposition` preset. `process_ingredients()` calls `Ingredient.get_or_create()` for each CAS. Saves to the MongoDB `projects` collection.
- An order can update an older project; the two layers are decoupled.
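The CAS-cleaning step in Order can be illustrated with a small helper (the function name is hypothetical; the strip-newlines, split-on-semicolons behaviour is the one described above):

```python
def clean_cas_field(raw: str) -> list[str]:
    """Normalise a raw CAS field: drop newlines, split on ';', trim blanks."""
    return [part.strip() for part in raw.replace("\n", "").split(";") if part.strip()]

cleaned = clean_cas_field("50-00-0;\n 57-55-6; ")
# cleaned -> ["50-00-0", "57-55-6"]
```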
### PostgreSQL schema (see `data/db_schema.sql`)

- `clienti` - Customers (`id_cliente`, `nome_cliente`)
- `compilatori` - PIF compilers/assessors (`id_compilatore`, `nome_compilatore`)
- `ordini` - Orders linking a client + compiler to a project (`uuid_ordine`, `uuid_progetto`, `data_ordine`, `stato_ordine`). FK to `clienti`, `compilatori`, `stati_ordini`
- `stati_ordini` - Order status lookup table (`id_stato`, `nome_stato`). Values mapped to the `StatoOrdine` IntEnum in `models.py`
- `ingredienti` - Ingredient registry keyed by CAS. Tracks enrichment status via boolean flags (`dap`, `cosing`, `tox`) and links to the MongoDB document (`mongo_id`)
- `inci` - INCI name to CAS mapping. FK to `ingredienti(cas)`
- `progetti` - Projects linked to an order (`mongo_id` -> `ordini.uuid_progetto`) and a product type preset (`preset_tipo_prodotto` -> `tipi_prodotti`)
- `ingredients_lineage` - Many-to-many join between `progetti` and `ingredienti`, tracking which ingredients belong to which project
- `tipi_prodotti` - Product type presets with exposure parameters (`preset_name`, `tipo_prodotto`, `luogo_applicazione`, exposure routes, `sup_esposta`, `freq_applicazione`, `qta_giornaliera`, `ritenzione`). Maps to the `Esposition` Pydantic model
- `logs.search_history` - Search audit log (`cas_ricercato`, `target`, `esito`)
## API Endpoints

All routes are under `/api/v1`:

| Method | Path | Description |
|---|---|---|
| POST | `/echa/search` | Single ECHA substance search by CAS |
| POST | `/echa/batch-search` | Batch ECHA search for multiple CAS numbers |
| POST | `/cosing/search` | COSING search (by name, CAS, EC, or ID) |
| POST | `/cosing/batch-search` | Batch COSING search |
| POST | `/ingredients/search` | Get full ingredient by CAS (cached or scraped) |
| GET | `/ingredients/list` | List all ingested ingredients from PostgreSQL |
| POST | `/esposition/create` | Create a new esposition preset |
| GET | `/esposition/presets` | List all esposition presets |
| POST | `/common/pubchem` | PubChem property lookup by CAS |
| POST | `/common/generate-pdf` | Generate PDF from URL via Playwright |
| GET | `/common/download-pdf/{name}` | Download a generated PDF |
| POST | `/common/cir-search` | CIR ingredient text search |
| GET | `/health`, `/ping` | Health check endpoints |

Interactive docs are available at `/docs` (Swagger UI) and `/redoc`.
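A client call against the running API might look like this (stdlib `urllib` sketch; the request body field name `cas` is an assumption about the route's Pydantic model):

```python
import json
import urllib.request

BASE = "http://localhost:8000/api/v1"  # default local dev host/port

def build_search_request(cas: str) -> urllib.request.Request:
    """Build a POST request for the /ingredients/search endpoint."""
    return urllib.request.Request(
        f"{BASE}/ingredients/search",
        data=json.dumps({"cas": cas}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_search_request("50-00-0")
# With the API running:
#   with urllib.request.urlopen(req, timeout=120) as resp:
#       ingredient = json.load(resp)
```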
## Environment Variables

Configured via a `.env` file (loaded with `python-dotenv`):

- `ADMIN_USER` - MongoDB admin username
- `ADMIN_PASSWORD` - MongoDB admin password
- `MONGO_HOST` - MongoDB host
- `MONGO_PORT` - MongoDB port
- `DATABASE_URL` - PostgreSQL connection string
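A minimal `.env` for local development might look like this (all values are placeholders, not real credentials):

```env
ADMIN_USER=admin
ADMIN_PASSWORD=change-me
MONGO_HOST=localhost
MONGO_PORT=27017
DATABASE_URL=postgresql://user:password@localhost:5432/pif
```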
## Development

### Setup

```shell
uv sync               # Install dependencies
playwright install    # Install browser binaries for PDF generation
```

### Running the API

```shell
uv run python -m pif_compiler.main
# or
uv run uvicorn pif_compiler.main:app --reload --host 0.0.0.0 --port 8000
```
### Key conventions

- Services in `services/` handle external API calls and data extraction
- Models in `classes/models.py` use Pydantic `@model_validator` and `@classmethod` builders for construction from raw API data
- Orchestrator classes in `classes/main_cls.py` handle the Order (raw input) and Project (processed) layers
- The orchestrator pattern (see `srv_echa.py`) handles: validate input -> check local cache -> fetch from external source -> store locally -> return
- `Ingredient.ingredient_builder(cas)` calls the scraping functions directly (`pubchem_dap`, `cosing_entry`, `orchestrator`)
- `Ingredient.save()` upserts to both MongoDB and PostgreSQL; `Ingredient.from_cas()` retrieves via the PostgreSQL index -> MongoDB
- `Ingredient.get_or_create(cas)` is the main entry point: checks cache freshness (365 days), scrapes if needed
- All modules use the shared logger from `common_log.get_logger()`
- API routes define Pydantic request/response models inline in each route file
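The orchestrator pattern can be sketched generically. The CAS check-digit validation below is the standard algorithm; the callable parameters are hypothetical stand-ins for the real ECHA fetch/cache functions, not the `srv_echa.py` implementation:

```python
def is_valid_cas(cas: str) -> bool:
    """Standard CAS check digit: weighted sum of the leading digits mod 10."""
    parts = cas.split("-")
    if len(parts) != 3 or not all(p.isdigit() for p in parts):
        return False
    digits = "".join(parts)
    body, check = digits[:-1], int(digits[-1])
    return sum(int(d) * w for w, d in enumerate(reversed(body), start=1)) % 10 == check

def orchestrate(cas, cache_get, fetch_external, cache_put):
    """validate -> check local cache -> fetch external -> store -> return."""
    if not is_valid_cas(cas):
        raise ValueError(f"invalid CAS number: {cas}")
    cached = cache_get(cas)
    if cached is not None:
        return cached
    data = fetch_external(cas)
    cache_put(cas, data)
    return data

cache: dict = {}
calls: list = []

def fetch(cas):
    calls.append(cas)          # track external requests
    return {"cas": cas}

first = orchestrate("50-00-0", cache.get, fetch, cache.__setitem__)
again = orchestrate("50-00-0", cache.get, fetch, cache.__setitem__)
# fetch ran once; the second call was served from the cache
```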
### db_utils.py functions

- `db_connect(db_name, collection_name)` - MongoDB collection accessor
- `postgres_connect()` - PostgreSQL connection
- `upsert_ingrediente(cas, mongo_id, dap, cosing, tox)` - Upsert an ingredient in PostgreSQL
- `get_ingrediente_by_cas(cas)` - Get an ingredient row by CAS
- `get_all_ingredienti()` - List all ingredients from PostgreSQL
- `upsert_cliente(nome_cliente)` - Upsert a client, returns `id_cliente`
- `upsert_compilatore(nome_compilatore)` - Upsert a compiler, returns `id_compilatore`
- `aggiorna_stato_ordine(id_ordine, nuovo_stato)` - Update an order's status
- `log_ricerche(cas, target, esito)` - Log search history
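A plausible shape for `upsert_ingrediente` using `psycopg2` and PostgreSQL's `ON CONFLICT` clause; the SQL is an assumption based on the `ingredienti` schema described above, not the actual implementation:

```python
UPSERT_SQL = """
    INSERT INTO ingredienti (cas, mongo_id, dap, cosing, tox)
    VALUES (%s, %s, %s, %s, %s)
    ON CONFLICT (cas) DO UPDATE SET
        mongo_id = EXCLUDED.mongo_id,
        dap      = EXCLUDED.dap,
        cosing   = EXCLUDED.cosing,
        tox      = EXCLUDED.tox
"""

def upsert_ingrediente(conn, cas, mongo_id, dap=False, cosing=False, tox=False):
    """Insert or update an ingredient row; conn is a psycopg2 connection."""
    with conn.cursor() as cur:
        cur.execute(UPSERT_SQL, (cas, mongo_id, dap, cosing, tox))
    conn.commit()
```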
## Streamlit UI

- `streamlit/ingredients_page.py` - Ingredient search by CAS, result display, and an inventory of ingested ingredients
- `streamlit/exposition_page.py` - Esposition preset creation form and a list of existing presets
- Both pages call the FastAPI endpoints via `requests` (the API must be running on `localhost:8000`)
- Run with: `streamlit run streamlit/<page>.py`
## Important domain concepts
- CAS number: Chemical Abstracts Service identifier (e.g., "50-00-0")
- INCI: International Nomenclature of Cosmetic Ingredients
- NOAEL: No Observed Adverse Effect Level (preferred toxicity indicator)
- LOAEL: Lowest Observed Adverse Effect Level
- LD50: Lethal Dose 50%
- DAP: Dermal Absorption Percentage
- SED: Systemic Exposure Dosage
- MoS: Margin of Safety
- PIF: Product Information File (EU cosmetic regulation requirement)
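The SED and MoS concepts translate directly into arithmetic. This sketch uses the standard SCCS relationships (SED = daily exposure × concentration × DAP; MoS = NOAEL / SED, with 100 as the conventional acceptance threshold); the example numbers are illustrative, not real assessment data:

```python
def sed(exposure_mg_per_kg_day: float, concentration_pct: float, dap_pct: float) -> float:
    """Systemic Exposure Dosage in mg/kg bw/day."""
    return exposure_mg_per_kg_day * (concentration_pct / 100) * (dap_pct / 100)

def margin_of_safety(noael_mg_per_kg_day: float, sed_value: float) -> float:
    """MoS = NOAEL / SED; conventionally acceptable when >= 100."""
    return noael_mg_per_kg_day / sed_value

s = sed(17.4, 2.0, 50.0)          # hypothetical product exposure, 2% ingredient, 50% DAP
mos = margin_of_safety(100.0, s)  # hypothetical NOAEL of 100 mg/kg bw/day
# s is about 0.174 mg/kg bw/day; mos is about 575, above the 100 threshold
```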