PIF Compiler - Test Suite

Overview

Comprehensive test suite for the PIF Compiler project using pytest.

Structure

tests/
├── __init__.py              # Test package marker
├── conftest.py              # Shared fixtures and configuration
├── test_cosing_service.py   # COSING service tests
├── test_models.py           # (TODO) Pydantic model tests
├── test_echa_service.py     # (TODO) ECHA service tests
└── README.md                # This file

Installation

# Install test dependencies
uv add --dev pytest pytest-cov pytest-mock

# Or manually install
uv pip install pytest pytest-cov pytest-mock

Running Tests

Run All Tests (Unit only)

uv run pytest

Run Specific Test File

uv run pytest tests/test_cosing_service.py

Run Specific Test Class

uv run pytest tests/test_cosing_service.py::TestParseCasNumbers

Run Specific Test

uv run pytest tests/test_cosing_service.py::TestParseCasNumbers::test_single_cas_number

Run with Verbose Output

uv run pytest -v

Run with Coverage Report

uv run pytest --cov=src/pif_compiler --cov-report=html
# Open htmlcov/index.html in browser

Test Categories

Unit Tests (Default)

Fast tests with no external dependencies. Run by default.

uv run pytest -m unit

Integration Tests

Tests that hit real APIs or databases. Skipped by default.

uv run pytest -m integration

Slow Tests

Tests that take longer to run. Skipped by default.

uv run pytest -m slow

Database Tests

Tests requiring MongoDB. Ensure Docker is running.

cd utils
docker-compose up -d
uv run pytest -m database
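
The "skipped by default" behaviour for the integration, slow, and database markers is typically wired up in conftest.py. Here is a hedged sketch of how that can be done — the hook bodies and marker wiring below are assumptions about how a suite like this could be configured, not a copy of the project's actual conftest.py:

```python
# Hypothetical conftest.py sketch -- an assumption about how default
# deselection could be wired, not the project's real configuration.
import pytest

SKIP_BY_DEFAULT = {"integration", "slow", "database"}


def should_skip_by_default(marker_names):
    """True if a test carrying any of these markers is skipped by default."""
    return bool(SKIP_BY_DEFAULT & set(marker_names))


def pytest_configure(config):
    # Register markers so `pytest --strict-markers` does not reject them.
    for name in sorted(SKIP_BY_DEFAULT | {"unit"}):
        config.addinivalue_line("markers", f"{name}: {name} tests")


def pytest_collection_modifyitems(config, items):
    # Leave selection alone when the user passes -m explicitly.
    if config.getoption("-m"):
        return
    skip = pytest.mark.skip(reason="skipped by default; select with -m")
    for item in items:
        if should_skip_by_default(m.name for m in item.iter_markers()):
            item.add_marker(skip)
```

With this in place, a bare `uv run pytest` deselects the three marked categories, while `uv run pytest -m integration` runs them.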

Test Organization

test_cosing_service.py

Coverage:

  • parse_cas_numbers() - CAS parsing logic

    • Single/multiple CAS
    • Different separators (/, ;, ,, --)
    • Parentheses removal
    • Whitespace handling
    • Invalid dash removal
  • cosing_search() - API search

    • Search by name
    • Search by CAS
    • Search by EC number
    • Search by ID
    • No results handling
    • Invalid mode error
  • clean_cosing() - JSON cleaning

    • Basic field cleaning
    • Empty tag removal
    • CAS parsing
    • URL creation
    • Field renaming
  • Integration tests (marked as @pytest.mark.integration)

    • Real API calls (requires internet)
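
As a concrete illustration of the parse_cas_numbers() behaviours listed above, a parametrized test works well. The parser below is a toy stand-in (a regex over CAS-shaped tokens) so the example is self-contained — it is illustrative only, not the project's implementation:

```python
import re

import pytest

# Toy CAS extractor (illustrative only, NOT the project's parse_cas_numbers):
# a CAS number is 2-7 digits, a dash, 2 digits, a dash, and a check digit.
CAS_RE = re.compile(r"\d{2,7}-\d{2}-\d\b")


def parse_cas_numbers(raw: str) -> list[str]:
    return CAS_RE.findall(raw)


@pytest.mark.unit
@pytest.mark.parametrize(
    "raw, expected",
    [
        ("7732-18-5", ["7732-18-5"]),                       # single CAS
        ("7732-18-5 / 64-17-5", ["7732-18-5", "64-17-5"]),  # slash separator
        ("7732-18-5; 64-17-5", ["7732-18-5", "64-17-5"]),   # semicolon separator
        ("(7732-18-5)", ["7732-18-5"]),                     # parentheses stripped
    ],
)
def test_parse_cas_numbers(raw, expected):
    assert parse_cas_numbers(raw) == expected
```

One parametrized function covers all the separator and cleanup cases, and each case reports pass/fail independently.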

Writing New Tests

Example Unit Test

import pytest

from mymodule import my_function  # placeholder: import the function under test


class TestMyFunction:
    """Test my_function."""

    def test_basic_case(self):
        """Test basic functionality."""
        result = my_function("input")
        assert result == "expected"

    def test_edge_case(self):
        """Test edge case handling."""
        with pytest.raises(ValueError):
            my_function("invalid")

Example Mock Test

from unittest.mock import patch

# Patch where the name is looked up (the module under test),
# not where it is defined.
@patch('module.external_api_call')
def test_with_mock(mock_api):
    """Test with mocked external call."""
    mock_api.return_value = {"data": "mocked"}
    result = my_function()
    assert result == "expected"
    mock_api.assert_called_once()

Example Fixture Usage

def test_with_fixture(sample_cosing_response):
    """Test using a fixture from conftest.py."""
    result = clean_cosing(sample_cosing_response)
    assert "cosingUrl" in result
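
Fixtures like sample_cosing_response are defined in conftest.py. A minimal sketch of how such a fixture could look — the field names and values here are assumptions based on the COSING fields mentioned in this README, not the project's real fixture data:

```python
import pytest

# Hypothetical payload -- field names are assumptions based on the COSING
# fields mentioned in this README, not the project's real fixture data.
SAMPLE_COSING_RESPONSE = {
    "cosingRefNo": 12345,
    "inciName": "AQUA",
    "casNo": "7732-18-5",
}


@pytest.fixture
def sample_cosing_response():
    """Return a fresh copy per test so mutations don't leak between tests."""
    return dict(SAMPLE_COSING_RESPONSE)
```

Returning a copy from the fixture keeps tests isolated even if one of them mutates the payload.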

Best Practices

  1. Naming: Test files and functions start with test_; test classes start with Test
  2. Arrange-Act-Assert: Structure tests clearly
  3. One assertion focus: Each test should test one thing
  4. Use fixtures: Reuse test data via conftest.py
  5. Mock external calls: Don't hit real APIs in unit tests
  6. Mark appropriately: Use @pytest.mark.integration for slow tests
  7. Descriptive names: Test names should describe what they test

Common Commands

# Run fast tests only (skip integration/slow)
uv run pytest -m "not integration and not slow"

# Run only integration tests
uv run pytest -m integration

# Run with detailed output
uv run pytest -vv

# Stop at first failure
uv run pytest -x

# Run last failed tests
uv run pytest --lf

# Run tests matching pattern
uv run pytest -k "test_parse"

# Generate coverage report
uv run pytest --cov=src/pif_compiler --cov-report=term-missing

CI/CD Integration

For GitHub Actions (example):

- name: Run tests
  run: |
    uv run pytest -m "not integration" --cov --cov-report=xml    

TODO

  • Add tests for models.py (Pydantic validation)
  • Add tests for echa_service.py
  • Add tests for echa_parser.py
  • Add tests for echa_extractor.py
  • Add tests for database_service.py
  • Add tests for pubchem_service.py
  • Add integration tests with test database
  • Set up GitHub Actions CI