# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
LCC (Logistic Cost Calculator) is a Spring Boot 3.5.9 backend API for calculating complex logistics costs across supply chain networks. It handles materials, packaging, transportation rates, route planning, and multi-component cost calculations including customs duties, handling, inventory, and risk assessment.
**Database Support:** The application supports both **MySQL 8.0** and **MSSQL Server 2022** through a database abstraction layer (`SqlDialectProvider`), allowing deployment flexibility across different database platforms.
## Build & Run Commands
```bash
# Build the project
mvn clean install
# Run the application (default: MySQL)
mvn spring-boot:run
# Run with MSSQL
mvn spring-boot:run -Dspring.profiles.active=mssql
# Run all tests on MySQL
mvn test -Dspring.profiles.active=test,mysql
# Run all tests on MSSQL
mvn test -Dspring.profiles.active=test,mssql
# Run repository integration tests on both databases
mvn test -Dtest="*RepositoryIntegrationTest" -Dspring.profiles.active=test,mysql
mvn test -Dtest="*RepositoryIntegrationTest" -Dspring.profiles.active=test,mssql
# Run a specific test class
mvn test -Dtest=NodeControllerIntegrationTest
# Run a specific test method
mvn test -Dtest=NodeControllerIntegrationTest#shouldReturnListOfNodesWithDefaultPagination
# Skip tests during build
mvn clean install -DskipTests
# Generate JAXB classes from WSDL (EU taxation service)
mvn jaxb:generate
# Generate Allure test report (requires allure-commandline)
mvn clean test
allure serve target/allure-results
```
## Development Environment (Distrobox)
**IMPORTANT:** This project runs inside a **Distrobox** container. This affects how TestContainers and Podman work.
### TestContainers with Distrobox + Podman
TestContainers needs access to the **host's Podman socket**, not the one inside the Distrobox. The configuration is handled via `~/.testcontainers.properties`:
```properties
docker.host=unix:///run/host/run/user/1000/podman/podman.sock
ryuk.disabled=true
```
### Troubleshooting TestContainers / Podman Issues
If tests fail with "Could not find a valid Docker environment":
1. **Check if Podman works on the host:**
```bash
distrobox-host-exec podman info
```
2. **If you see cgroup or UID/GID errors, run migration on the host:**
```bash
distrobox-host-exec podman system migrate
```
3. **Restart podman socket on host if needed:**
```bash
distrobox-host-exec systemctl --user restart podman.socket
```
4. **Verify the host socket is accessible from Distrobox:**
```bash
ls -la /run/host/run/user/1000/podman/podman.sock
```
5. **Test container execution via host:**
```bash
distrobox-host-exec podman run --rm hello-world
```
### Key Paths
| Path | Description |
|------|-------------|
| `/run/host/run/user/1000/podman/podman.sock` | Host's Podman socket (accessible from Distrobox) |
| `~/.testcontainers.properties` | TestContainers configuration file |
## Architecture
### Layered Architecture
```
Controllers → DTOs → Services → Transformers → Repositories → SqlDialectProvider → Database (MySQL/MSSQL)
```
### Package Structure (`de.avatic.lcc`)
- **controller/** - REST endpoints organized by domain (calculation, configuration, bulk, users, report)
- **service/access/** - Business logic for domain entities (PremisesService, MaterialService, NodeService, etc.)
- **service/calculation/** - Logistics cost calculation orchestration and step services
- **service/calculation/execution/steps/** - Individual calculation components (airfreight, handling, inventory, customs, etc.)
- **service/bulk/** - Excel-based bulk import/export operations
- **service/api/** - External API integrations (Azure Maps geocoding, EU taxation)
- **service/transformer/** - Entity-to-DTO mapping
- **repositories/** - JDBC-based data access (not JPA) with custom RowMappers
- **database/dialect/** - Database abstraction layer (SqlDialectProvider, MySQLDialectProvider, MSSQLDialectProvider)
- **model/db/** - Database entity classes
- **dto/** - Data transfer objects for API contracts
### Key Design Decisions
- **JDBC over JPA**: Uses `JdbcTemplate` and `NamedParameterJdbcTemplate` for complex queries
- **SqlDialectProvider abstraction**: Database-agnostic SQL through dialect-specific implementations (MySQL/MSSQL)
- **Transformer layer**: Explicit DTO mapping keeps entities separate from API contracts (see the sketch after this list)
- **Calculation chain**: Cost calculations broken into fine-grained services in `execution/steps/`
- **Profile-based configuration**: Spring profiles for environment-specific database selection
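To make the transformer decision concrete, here is a minimal sketch of the mapping style; `Node`, `NodeDto`, and `NodeTransformer` are illustrative names, not the actual classes in `service/transformer/`, `model/db/`, and `dto/`:
```java
// Hypothetical names; the real transformers live in service/transformer/ and map
// entities from model/db/ onto DTOs from dto/.
record Node(int id, String name, boolean active) {}   // database entity shape
record NodeDto(int id, String name) {}                 // API contract shape

class NodeTransformer {
    NodeDto toDto(Node entity) {
        // explicit field-by-field mapping keeps internal columns out of the API contract
        return new NodeDto(entity.id(), entity.name());
    }
}
```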
### Core Calculation Flow
```
CalculationExecutionService.launchJobCalculation()
  → ContainerCalculationService (container type selection: FEU/TEU/HC/TRUCK)
  → RouteSectionCostCalculationService (per-section costs)
      → AirfreightCalculationService
      → HandlingCostCalculationService
      → InventoryCostCalculationService
      → CustomCostCalculationService (tariff/duties)
```
### Authorization Model
Role-based access control via `@PreAuthorize` annotations:
- SUPER, CALCULATION, MATERIAL, FREIGHT, PACKAGING, BASIC
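A hedged sketch of how these roles are typically enforced with `@PreAuthorize`; the endpoint path, DTO, and the choice of `hasAnyRole` vs `hasAnyAuthority` are assumptions, only the role names come from the list above:
```java
import java.util.List;
import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/materials")   // hypothetical path
public class MaterialControllerSketch {

    record MaterialDto(int id, String name) {}   // placeholder DTO

    @GetMapping
    @PreAuthorize("hasAnyRole('SUPER', 'MATERIAL', 'BASIC')")   // role names from the list above
    public List<MaterialDto> listMaterials() {
        return List.of();   // the real controller delegates to a service
    }
}
```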
## Testing
### Test Architecture
**Integration Test Base Class:**
All repository integration tests extend `AbstractRepositoryIntegrationTest`, which provides:
- `JdbcTemplate` for test data setup
- `SqlDialectProvider` for database-agnostic SQL
- Helper methods: `isMysql()`, `isMssql()`, `executeRawSql()`
- Automatic TestContainers setup via `@Testcontainers`
- Transaction isolation via `@Transactional`
**TestContainers Setup:**
```java
@SpringBootTest(classes = {RepositoryTestConfig.class})
@Testcontainers
@Import(DatabaseTestConfiguration.class)
@Transactional
public abstract class AbstractRepositoryIntegrationTest {

    @Autowired
    protected JdbcTemplate jdbcTemplate;

    @Autowired
    protected SqlDialectProvider dialectProvider;

    protected boolean isMysql() {
        return getDatabaseProfile().contains("mysql");
    }

    protected boolean isMssql() {
        return getDatabaseProfile().contains("mssql");
    }

    protected void executeRawSql(String sql, Object... params) {
        jdbcTemplate.update(sql, params);
    }
}
```
**DatabaseTestConfiguration:**
- MySQL: `MySQLContainer` with `mysql:8.0` image
- MSSQL: `MSSQLServerContainer` with `mcr.microsoft.com/mssql/server:2022-latest` image
- Profile-based activation via `@Profile("mysql")` and `@Profile("mssql")`
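A minimal sketch of what the MySQL half of such a configuration could look like, assuming Spring Boot's `@ServiceConnection` support; the real `DatabaseTestConfiguration` may instead expose datasource properties via `@DynamicPropertySource`:
```java
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.boot.testcontainers.service.connection.ServiceConnection;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Profile;
import org.testcontainers.containers.MySQLContainer;
import org.testcontainers.utility.DockerImageName;

// Sketch only: class name and wiring style are assumptions.
@TestConfiguration
@Profile("mysql")
public class MySqlContainerConfigSketch {

    @Bean
    @ServiceConnection   // Spring Boot registers the container's JDBC URL and credentials automatically
    MySQLContainer<?> mysqlContainer() {
        return new MySQLContainer<>(DockerImageName.parse("mysql:8.0"));
    }
}
```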
### Database-Agnostic Test Patterns
**Pattern 1: Boolean literals in test data**
```java
String sql = String.format(
        "INSERT INTO node (name, is_active) VALUES (?, %s)",
        dialectProvider.getBooleanTrue());
```
**Pattern 2: Auto-increment ID retrieval**
```java
executeRawSql("INSERT INTO table (name) VALUES (?)", name);
String selectSql = isMysql() ? "SELECT LAST_INSERT_ID()" : "SELECT CAST(@@IDENTITY AS INT)";
return jdbcTemplate.queryForObject(selectSql, Integer.class);
```
**Pattern 3: Date functions**
```java
String dateFunc = isMysql() ? "NOW()" : "GETDATE()";
String sql = String.format("INSERT INTO table (created_at) VALUES (%s)", dateFunc);
```
### Running Tests
**Run all tests on MySQL:**
```bash
mvn test -Dspring.profiles.active=test,mysql
```
**Run all tests on MSSQL:**
```bash
mvn test -Dspring.profiles.active=test,mssql
```
**Run specific repository tests:**
```bash
mvn test -Dtest=CalculationJobRepositoryIntegrationTest -Dspring.profiles.active=test,mysql
mvn test -Dtest=CalculationJobRepositoryIntegrationTest -Dspring.profiles.active=test,mssql
```
**Run all repository integration tests on both databases:**
```bash
mvn test -Dtest="*RepositoryIntegrationTest" -Dspring.profiles.active=test,mysql
mvn test -Dtest="*RepositoryIntegrationTest" -Dspring.profiles.active=test,mssql
```
### Test Coverage
**Current Status (as of Phase 6 completion):**
- **365 tests** passing on both MySQL and MSSQL (100% success rate)
- **28 repository integration test classes** covering:
- Calculation repositories (CalculationJobRepository, CalculationJobDestinationRepository, CalculationJobRouteSectionRepository)
- Configuration repositories (NodeRepository, MaterialRepository, PackagingRepository, CountryRepository)
- Rate repositories (ContainerRateRepository, MatrixRateRepository)
- Property repositories (PropertyRepository, CountryPropertyRepository, PackagingPropertiesRepository)
- User repositories (UserRepository, GroupRepository)
- Bulk operation repositories (BulkOperationRepository)
- And 14 additional repositories
**Test Data:**
- `@Sql` annotations for controller integration tests from `src/test/resources/master_data/`
- Repository tests use inline SQL with `executeRawSql()` for database-agnostic test data setup
- Test data cleanup in `@BeforeEach` respects foreign key constraints
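A small sketch of the `@Sql` seeding pattern; the script name is a placeholder, and the class is assumed to run with the Spring TestContext framework (e.g. as a `@SpringBootTest`):
```java
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.jdbc.Sql;

// "nodes.sql" is a hypothetical file name under src/test/resources/master_data/.
@SpringBootTest
@Sql(scripts = "classpath:master_data/nodes.sql")
class NodeControllerSqlSketch {

    @Test
    void usesSeededMasterData() {
        // assertions run against the rows inserted by the @Sql script
    }
}
```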
### Allure Test Reporting
**Overview:**
All tests (46 test classes, ~624 test methods) carry Allure annotations for comprehensive test documentation and reporting.
**Annotation Hierarchy:**
```java
@Epic("Controller") // Test layer: Controller, Repository, Database Layer, End-to-End
@Feature("Calculation") // Domain/subpackage: Calculation, Configuration, Master Data, etc.
@DisplayName("Test Suite Name") // Human-readable test suite name
class ExampleTest {
@Test
@Story("Create new calculation") // Test scenario description
@DisplayName("Should create calculation with valid data")
void testCreateCalculation() { ... }
}
```
**Annotation Coverage by Layer:**
| Layer | Epic | Features | Test Classes | Test Methods |
|-------|------|----------|--------------|--------------|
| **Controller** | `@Epic("Controller")` | Configuration, Calculation, Report | 11 | ~100 |
| **Repository** | `@Epic("Repository")` | Calculation, Master Data, Premise, Rates, Properties, Country, Packaging, Users, Bulk, Error, Infrastructure | 28 | ~400 |
| **Database Layer** | `@Epic("Database Layer")` | MySQL Dialect, MSSQL Dialect | 2 | ~42 |
| **End-to-End** | `@Epic("End-to-End")` | Smoke Tests, Calculation Workflow, Deviation Analysis | 3 | ~7 |
**Local Report Generation:**
```bash
# Run tests and generate Allure results
mvn clean test -Dspring.profiles.active=test,mysql
# Generate and view Allure report (requires allure-commandline)
allure serve target/allure-results
```
**CI/CD Integration:**
- Gitea Actions workflow (`.gitea/workflows/test.yml`) automatically uploads Allure results to Allure server
- Reports available at: `http://10.80.0.6:5050` (project: `lcc-{branch}`)
- Each CI run generates a new report with execution metadata
**Allure Configuration:**
- Dependency: `io.qameta.allure:allure-junit5` (version 2.29.0)
- Results directory: `target/allure-results`
- Report includes: test duration, stack traces, categorization by Epic/Feature/Story
## Database
### Multi-Database Support
The application supports both **MySQL 8.0** and **MSSQL Server 2022** through the `SqlDialectProvider` abstraction layer.
**Database selection via Spring profiles:**
- `mysql` - MySQL 8.0 (default)
- `mssql` - Microsoft SQL Server 2022
**Environment variables:**
```bash
export SPRING_PROFILES_ACTIVE=mysql # or mssql
export DB_HOST=localhost
export DB_DATABASE=lcc
export DB_USER=your_user
export DB_PASSWORD=your_password
```
### SqlDialectProvider Pattern
Database-specific SQL syntax is abstracted through `de.avatic.lcc.database.dialect.SqlDialectProvider`:
- **MySQLDialectProvider** - MySQL-specific SQL (LIMIT/OFFSET, NOW(), ON DUPLICATE KEY UPDATE, FOR UPDATE SKIP LOCKED)
- **MSSQLDialectProvider** - MSSQL-specific SQL (OFFSET/FETCH, GETDATE(), MERGE, WITH (UPDLOCK, READPAST))
**Key dialect differences:**
| Feature | MySQL | MSSQL |
|---------|-------|-------|
| Pagination | `LIMIT ? OFFSET ?` | `OFFSET ? ROWS FETCH NEXT ? ROWS ONLY` |
| Current timestamp | `NOW()` | `GETDATE()` |
| Date subtraction | `DATE_SUB(NOW(), INTERVAL 3 DAY)` | `DATEADD(DAY, -3, GETDATE())` |
| Boolean literals | `TRUE`, `FALSE` | `1`, `0` |
| Auto-increment | `AUTO_INCREMENT` | `IDENTITY(1,1)` |
| Upsert | `ON DUPLICATE KEY UPDATE` | `MERGE` statement |
| Insert ignore | `INSERT IGNORE` | `IF NOT EXISTS ... INSERT` |
| Skip locked rows | `FOR UPDATE SKIP LOCKED` | `WITH (UPDLOCK, READPAST)` |
| Last insert ID | `LAST_INSERT_ID()` | `CAST(@@IDENTITY AS INT)` |
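The methods referenced throughout this document suggest an interface shape roughly like the sketch below; the exact signatures in `de.avatic.lcc.database.dialect.SqlDialectProvider` may differ:
```java
import java.util.List;

// Sketch assembled from the calls shown in this document; not the authoritative interface.
public interface SqlDialectProvider {

    String buildPaginationClause(int limit, int offset);      // LIMIT/OFFSET vs OFFSET/FETCH
    Object[] getPaginationParameters(int limit, int offset);  // parameter order differs per dialect

    String getBooleanTrue();                                  // "TRUE" vs "1"
    String getLastInsertIdQuery();                            // LAST_INSERT_ID() vs @@IDENTITY

    String buildUpsertStatement(String table,
                                List<String> uniqueColumns,
                                List<String> insertColumns,
                                List<String> updateColumns);  // ON DUPLICATE KEY UPDATE vs MERGE
}
```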
**Repository usage example:**
```java
@Repository
public class ExampleRepository {

    private final JdbcTemplate jdbcTemplate;
    private final SqlDialectProvider dialectProvider;

    public ExampleRepository(JdbcTemplate jdbcTemplate, SqlDialectProvider dialectProvider) {
        this.jdbcTemplate = jdbcTemplate;
        this.dialectProvider = dialectProvider;
    }

    public List<Entity> list(int limit, int offset) {
        String sql = "SELECT * FROM table ORDER BY id " +
                dialectProvider.buildPaginationClause(limit, offset);
        Object[] params = dialectProvider.getPaginationParameters(limit, offset);
        return jdbcTemplate.query(sql, params, rowMapper);
    }
}
```
### Flyway Migrations
Database-specific migrations are organized by database type:
```
src/main/resources/db/migration/
├── mysql/
│   ├── V1__Create_schema.sql
│   ├── V2__Property_Set_Period.sql
│   └── V3-V12 (additional migrations)
└── mssql/
    ├── V1__Create_schema.sql
    ├── V2__Property_Set_Period.sql
    └── V3-V12 (MSSQL-specific conversions)
```
**Migration naming:** `V{N}__{Description}.sql`
**Key schema differences:**
- MySQL uses `AUTO_INCREMENT`, MSSQL uses `IDENTITY(1,1)`
- MySQL supports `TIMESTAMP ... ON UPDATE CURRENT_TIMESTAMP`, MSSQL requires triggers
- MySQL `BOOLEAN` maps to MSSQL `BIT`
- Check constraints syntax differs (BETWEEN vs >= AND <=)
### Key Tables
Core entities:
- **premiss**, **premiss_sink**, **premiss_route** - Supply chain scenarios and routing
- **calculation_job**, **calculation_job_destination**, **calculation_job_route_section** - Calculation workflow
- **node** - Suppliers, destinations, intermediate locations
- **material**, **packaging** - Product and packaging master data
- **container_rate**, **country_matrix_rate** - Transportation rates
- **property_set**, **property** - Versioned configuration properties
## Important Database Considerations
### Concurrency Control
**Calculation Job Locking:**
The `CalculationJobRepository.fetchAndLockNextJob()` method uses database-specific row-level locking to prevent concurrent job processing:
- **MySQL**: `FOR UPDATE SKIP LOCKED` - Skips locked rows and returns next available job
- **MSSQL**: `WITH (UPDLOCK, READPAST)` - Similar semantics but different syntax
Both implementations ensure that multiple job processors can run concurrently without conflicts.
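As an illustration only (the real query, column names, and status values are not shown in this document), the method might look roughly like the fragment below, with the locking hints taken from the dialect table above:
```java
// Sketch: the 'status' column, 'PENDING' value, the dialect check, and the surrounding
// repository wiring are assumptions. The lock only holds for the duration of the
// enclosing transaction.
public Optional<CalculationJob> fetchAndLockNextJob() {
    String sql = dialectProvider instanceof MySQLDialectProvider
            ? "SELECT * FROM calculation_job WHERE status = 'PENDING' ORDER BY id LIMIT 1 FOR UPDATE SKIP LOCKED"
            : "SELECT TOP 1 * FROM calculation_job WITH (UPDLOCK, READPAST) WHERE status = 'PENDING' ORDER BY id";
    return jdbcTemplate.query(sql, rowMapper).stream().findFirst();
}
```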
### Transaction Isolation
- Default isolation level: READ_COMMITTED
- Repository tests use `@Transactional` for automatic rollback
- Critical operations (job locking, rate updates) use pessimistic locking
### Schema Conversion Gotchas
When adding new Flyway migrations, be aware of these differences:
**Auto-increment columns:**
```sql
-- MySQL
id INT AUTO_INCREMENT PRIMARY KEY
-- MSSQL
id INT IDENTITY(1,1) PRIMARY KEY
```
**Timestamp with auto-update:**
```sql
-- MySQL
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
-- MSSQL (requires trigger)
updated_at DATETIME2 DEFAULT GETDATE()
-- Plus CREATE TRIGGER for ON UPDATE behavior
```
**Boolean values:**
```sql
-- MySQL
is_active BOOLEAN DEFAULT TRUE
-- MSSQL
is_active BIT DEFAULT 1
```
**Check constraints:**
```sql
-- MySQL
CHECK (latitude BETWEEN -90 AND 90)
-- MSSQL
CHECK (latitude >= -90 AND latitude <= 90)
```
### Performance Considerations
- Both databases use similar execution plans for most queries
- Indexes are defined identically in both migration sets
- MSSQL may benefit from additional statistics maintenance for complex joins
- Performance regression < 5% observed in comparative testing
## External Integrations
- **Azure AD**: OAuth2/OIDC authentication
- **Azure Maps**: Geocoding and route distance calculations (GeoApiService, DistanceApiService)
- **EU Taxation API**: TARIC nomenclature lookup for customs duties (EUTaxationApiService)
## Configuration
### Profile-Based Database Configuration
The application uses Spring profiles for database selection:
**application-mysql.properties:**
```properties
spring.profiles.active=mysql
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://${DB_HOST:localhost}:3306/${DB_DATABASE}
spring.datasource.username=${DB_USER}
spring.datasource.password=${DB_PASSWORD}
spring.flyway.enabled=true
spring.flyway.locations=classpath:db/migration/mysql
spring.flyway.baseline-on-migrate=true
```
**application-mssql.properties:**
```properties
spring.profiles.active=mssql
spring.datasource.driver-class-name=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource.url=jdbc:sqlserver://${DB_HOST:localhost}:1433;databaseName=${DB_DATABASE};encrypt=true;trustServerCertificate=true
spring.datasource.username=${DB_USER}
spring.datasource.password=${DB_PASSWORD}
spring.flyway.enabled=true
spring.flyway.locations=classpath:db/migration/mssql
spring.flyway.baseline-on-migrate=true
```
**Environment Variables:**
```bash
# MySQL setup
export SPRING_PROFILES_ACTIVE=mysql
export DB_HOST=localhost
export DB_DATABASE=lcc
export DB_USER=root
export DB_PASSWORD=your_password
# MSSQL setup
export SPRING_PROFILES_ACTIVE=mssql
export DB_HOST=localhost
export DB_DATABASE=lcc
export DB_USER=sa
export DB_PASSWORD=YourStrong!Passw0rd
```
### Application Properties
Key properties in `application.properties`:
- `lcc.auth.identify.by` - User identification method (workday)
- `calculation.job.processor.*` - Async calculation job settings
- Flyway enabled by default; migrations run on startup
**Database-specific bean activation:**
- `@Profile("mysql")` - Activates MySQLDialectProvider
- `@Profile("mssql")` - Activates MSSQLDialectProvider
## Quick Reference
### Switching Databases
**Switch from MySQL to MSSQL:**
```bash
# Update environment
export SPRING_PROFILES_ACTIVE=mssql
export DB_HOST=localhost
export DB_DATABASE=lcc
export DB_USER=sa
export DB_PASSWORD=YourStrong!Passw0rd
# Run application
mvn spring-boot:run
```
**Switch back to MySQL:**
```bash
export SPRING_PROFILES_ACTIVE=mysql
export DB_HOST=localhost
export DB_DATABASE=lcc
export DB_USER=root
export DB_PASSWORD=your_password
mvn spring-boot:run
```
### Running Migrations
Migrations run automatically on application startup when Flyway is enabled.
**Manual migration with Flyway CLI:**
```bash
# MySQL
flyway -url=jdbc:mysql://localhost:3306/lcc -user=root -password=pass -locations=filesystem:src/main/resources/db/migration/mysql migrate
# MSSQL
flyway -url=jdbc:sqlserver://localhost:1433;databaseName=lcc -user=sa -password=pass -locations=filesystem:src/main/resources/db/migration/mssql migrate
```
### Testing Checklist
When modifying repositories or adding new database-dependent code:
1. **Run unit tests** (if applicable)
```bash
mvn test -Dtest=MySQLDialectProviderTest
mvn test -Dtest=MSSQLDialectProviderTest
```
2. **Run repository integration tests on MySQL**
```bash
mvn test -Dtest="*RepositoryIntegrationTest" -Dspring.profiles.active=test,mysql
```
3. **Run repository integration tests on MSSQL**
```bash
mvn test -Dtest="*RepositoryIntegrationTest" -Dspring.profiles.active=test,mssql
```
4. **Run full test suite on both databases**
```bash
mvn test -Dspring.profiles.active=test,mysql
mvn test -Dspring.profiles.active=test,mssql
```
### Common Repository Patterns
**Pattern 1: Constructor injection with SqlDialectProvider**
```java
@Repository
public class ExampleRepository {

    private final JdbcTemplate jdbcTemplate;
    private final SqlDialectProvider dialectProvider;

    public ExampleRepository(JdbcTemplate jdbcTemplate, SqlDialectProvider dialectProvider) {
        this.jdbcTemplate = jdbcTemplate;
        this.dialectProvider = dialectProvider;
    }
}
```
**Pattern 2: Pagination queries**
```java
public List<Entity> list(int limit, int offset) {
    String sql = "SELECT * FROM table WHERE some_col = ? ORDER BY id " +
            dialectProvider.buildPaginationClause(limit, offset);
    Object[] params = ArrayUtils.addAll(
            new Object[]{conditionValue},
            dialectProvider.getPaginationParameters(limit, offset)
    );
    return jdbcTemplate.query(sql, params, rowMapper);
}
```
**Pattern 3: Insert with ID retrieval**
```java
public Integer create(Entity entity) {
    String sql = "INSERT INTO table (name, is_active) VALUES (?, ?)";
    jdbcTemplate.update(sql, entity.getName(), entity.isActive());
    String idSql = dialectProvider.getLastInsertIdQuery();
    return jdbcTemplate.queryForObject(idSql, Integer.class);
}
```
**Pattern 4: Upsert operations**
```java
public void upsert(Entity entity) {
    String sql = dialectProvider.buildUpsertStatement(
            "table_name",
            List.of("unique_col1", "unique_col2"),           // unique columns
            List.of("unique_col1", "unique_col2", "value"),  // insert columns
            List.of("value")                                 // update columns
    );
    jdbcTemplate.update(sql, entity.getCol1(), entity.getCol2(), entity.getValue());
}
```