This document outlines the Integration and Integration Testing strategy for the following CIRCULOOS components:
- CIRCULOOS Data platform (CDP) (Orion-LD Context Broker, Mintaka time-series storage, Keycloak authentication) - ED
- Manufacturing Process Orchestration (MPMS) - ED
- Stakeholder Engagement and Collaboration (RAMP) - ED
- 3D Digital Twin (SCDT) - CUT
- AI-driven Optimization (SCOPT) - CUT
- Sustainability & Lifecycle Assessment Tool (GRETA) - SUPSI
- CV-based Composition Detection (CVTOOL) - CAN
All components communicate via REST APIs using the JSON-LD format, following the NGSI-LD standard. The CIRCULOOS Data platform will be used as the database for all data related to the pilots and the project. Even if a pilot has its own internal systems, a copy of that data WILL need to be present on the Data platform.
- Manufacturing Process Orchestration (MPMS)
- Stakeholder Engagement and Collaboration (RAMP)
- 3D Digital Twin (SCDT)
- AI-driven Optimization (SCOPT)
- Sustainability & Lifecycle Assessment Tool (GRETA)
- CV-based Composition Detection (CVTOOL)
Circular Supply chain testing
- SCOPT <-> CDP <-> GRETA
- MPMS <-> CDP <-> SCOPT
- CVTOOL <-> CDP <-> MPMS
- Complete workflow testing
- Performance optimization
- Security hardening
Prior to any interaction with the CIRCULOOS Data platform, each component needs to obtain a Bearer token from the CIRCULOOS Keycloak. Detailed examples can be found at https://github.com/european-dynamics-rnd/circuloos-data-platform/tree/master/commands_URL
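As an illustrative sketch only (the Keycloak host, realm, and client ID below are placeholder assumptions, not the actual CIRCULOOS values; use the commands in the repository above for the authoritative details), a token request with curl could look like this:

```bash
# NOTE: host, realm and client ID are placeholder assumptions, not confirmed values.
KEYCLOAK_HOST="https://keycloak.example.circuloos.eu"
REALM="circuloos"
CLIENT_ID="circuloos-client"

# Request an access token with the OAuth2 resource-owner password grant,
# then extract it with jq. Older Keycloak versions prefix the path with /auth.
TOKEN=$(curl -s -X POST \
  "${KEYCLOAK_HOST}/realms/${REALM}/protocol/openid-connect/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=password" \
  -d "client_id=${CLIENT_ID}" \
  -d "username=${PARTNER_USERNAME}" \
  -d "password=${PARTNER_PASSWORD}" | jq -r '.access_token')

echo "Bearer token obtained: ${TOKEN:0:20}..."
```

The resulting token is then sent in the Authorization header of every subsequent request to the data platform.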
| Component | ShortName |
|---|---|
| Manufacturing Process Orchestration | MPMS |
| Stakeholder Engagement and Collaboration | RAMP |
| 3D Digital Twin | SCDT |
| AI-driven Optimization | SCOPT |
| Sustainability & Lifecycle Assessment Tool | GRETA |
| CV-based Composition Detection | CVTOOL |
- NGSI-LD Tenant: circuloos_integration
- Each component should replace COMPONENT with their short name
- Replace the timestamps with the current timestamp at the time the test is performed
Each component needs to read from the CIRCULOOS data platform the following entity:
urn:ngsi-ld:COMPONENT:reading-integration-test-1
The entity contains a random alphanumeric value specific to each component; this value will be used in the next step. Save the data received from the data platform as reading.json.
{
"id": "urn:ngsi-ld:ORION:reading-integration-test-1",
"type": "integration",
"magic-number": {
"type": "Property",
"value": "P5gADFDLM"
}
}
You can utilize the getDataOrionSensorViaKong.sh command to read/GET the data from the CIRCULOOS Data platform.
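As a rough, non-authoritative sketch of what such a call looks like (the Kong gateway host below is a placeholder; the real URL is configured inside getDataOrionSensorViaKong.sh):

```bash
# Placeholder gateway URL and example component short name.
KONG_HOST="https://kong.example.circuloos.eu"
COMPONENT="MPMS"   # replace with your own short name

# Retrieve the reading-integration-test entity and store it as reading.json.
curl -s -X GET \
  "${KONG_HOST}/ngsi-ld/v1/entities/urn:ngsi-ld:${COMPONENT}:reading-integration-test-1" \
  -H "Authorization: Bearer ${TOKEN}" \
  -H "NGSILD-Tenant: circuloos_integration" \
  -H "Accept: application/ld+json" \
  -o reading.json
```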
Create a file named reading.json with the data received from the data platform and place it in your corresponding folder under verification-phase-1.
Each component needs to send/POST the following entity to the data platform to verify its ability to read from and write to the platform. The id that you need to use is:
urn:ngsi-ld:COMPONENT:writing-integration-test-1
Save the JSON-LD that you will provide to the platform as writing.json. IMPORTANT: Replace the magic-number with the alphanumeric value received from the previous task and change the observedAt to the current date-time.
{
"id": "urn:ngsi-ld:ORION:writing-integration-test-1",
"type": "integration",
"leather_type": {
"type": "Property",
"value": "animal",
"observedAt": "2025-10-02T09:26:35Z"
},
"kind_of_animal": {
"type": "Property",
"value": "pig",
"observedAt": "2025-10-02T09:26:35Z"
},
"leather_type_tanned": {
"type": "Property",
"value": "chrome",
"observedAt": "2025-10-02T09:26:35Z"
},
"magic-number":
{
"type": "Property",
"value": "P5gADFDLM",
"observedAt": "2025-10-02T09:26:35Z"
}
}
You can utilize the addDataOrionViaKong.sh command to send/POST data to the CIRCULOOS Data platform.
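Again as a hedged sketch only (placeholder gateway host; the provided script handles the real endpoint and any required @context):

```bash
# Placeholder gateway URL; addDataOrionViaKong.sh contains the actual one.
KONG_HOST="https://kong.example.circuloos.eu"

# Create the entity from the prepared writing.json file.
# Depending on the platform configuration, an NGSI-LD @context may need to be
# included in the payload or referenced via a Link header.
curl -s -X POST \
  "${KONG_HOST}/ngsi-ld/v1/entities" \
  -H "Authorization: Bearer ${TOKEN}" \
  -H "NGSILD-Tenant: circuloos_integration" \
  -H "Content-Type: application/ld+json" \
  -d @writing.json
```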
Each component needs to read the historical (Mintaka) data from the platform. The id that you need to request is: urn:ngsi-ld:COMPONENT:writing-integration-test-1. Save the response in a file called historical-data.json
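A minimal sketch of the temporal query, assuming the standard NGSI-LD temporal API is exposed through the same gateway (host and optional time-range parameters are placeholders):

```bash
COMPONENT="MPMS"   # replace with your own short name

# Query the temporal (Mintaka) representation of the entity written above.
# timerel/timeAt query parameters can be added to restrict the time window.
curl -s -X GET \
  "${KONG_HOST}/ngsi-ld/v1/temporal/entities/urn:ngsi-ld:${COMPONENT}:writing-integration-test-1" \
  -H "Authorization: Bearer ${TOKEN}" \
  -H "NGSILD-Tenant: circuloos_integration" \
  -H "Accept: application/ld+json" \
  -o historical-data.json
```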
Create a file named writing.json with the data to be sent to the data platform and place it in your corresponding folder under verification-phase-1. Next, generate a pull request covering the reading.json, writing.json and historical-data.json files, as sketched below. Please use your ShortName as the name of your branch. A tutorial on how to create a pull request can be found online.
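A possible branch-and-pull-request workflow is sketched below; the repository URL is a placeholder for the actual integration repository, and MPMS stands in for your own ShortName:

```bash
# Placeholder repository; clone the actual integration repository instead.
git clone https://github.com/ORG/REPO.git
cd REPO

# Use your component's ShortName as the branch name.
git checkout -b MPMS

# Place the three result files in your folder under verification-phase-1.
mkdir -p verification-phase-1/MPMS
cp reading.json writing.json historical-data.json verification-phase-1/MPMS/

git add verification-phase-1/MPMS
git commit -m "Add MPMS verification phase 1 results"
git push origin MPMS
# Finally, open a pull request from the MPMS branch on GitHub.
```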
See also under the folder verification-phase-1/example for an example.
There is also a Postman collection; see its variables to change PARTNER_USERNAME and PARTNER_PASSWORD.
All the Open Calls will need to be able to read/write (GET/POST) to the data platform. You will need to replace the COMPONENT part of the ID with your PARTNER_USERNAME. Check sections 3.2, 3.2.1, 3.2.2 and 3.2.4.
So if your PARTNER_USERNAME is circuloos-european_dynamics, the ID that you need to read is:
urn:ngsi-ld:circuloos-european_dynamics:reading-integration-test-1
You will not need to upload anything to GitHub (so DO NOT create a pull request), but you will need to include relevant screenshots in your specific deliverable. Please remember to remove/hide your PARTNER_PASSWORD in any screenshots.
All integration issues will be tracked using GitHub Issues in the project repository. The workflow follows these steps (a command-line sketch is given after the list):
- Identify the Issue: During integration testing, when an issue is discovered, create a new GitHub Issue immediately
- Use Labels: Apply appropriate labels:
  - `integration` - For all integration-related issues
  - `bug` - For defects or errors
  - `enhancement` - For improvement requests
  - `documentation` - For documentation issues
  - `critical`/`high`/`medium`/`low` - For priority levels
  - Component labels: `MPMS`, `RAMP`, `SCDT`, `SCOPT`, `GRETA`, `CVTOOL`, `data-platform`
- Assign Responsibility: Assign to the partner responsible for the affected component
- Set Milestone: Link to the appropriate project phase milestone
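For reference only, creating such an issue from the command line with the GitHub CLI could look like the sketch below (repository, title, and assignee are illustrative placeholders):

```bash
# Placeholder repository and values; adapt to the actual project repository.
gh issue create \
  --repo ORG/REPO \
  --title "MPMS <-> CDP: reading-integration-test-1 returns 404" \
  --body "Steps to reproduce, expected vs actual behaviour, logs attached." \
  --label "integration" --label "bug" --label "high" --label "MPMS" \
  --assignee partner-github-username
```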
- Open - Issue created and awaiting triage
- In Progress - Partner is actively working on the issue
- Testing - Fix implemented and ready for verification
- Closed - Issue resolved and verified
- All technical discussion happens in GitHub Issue comments
- Tag relevant partners using `@username` mentions
- For urgent issues, notify via project communication channels (email/Slack) with the GitHub Issue link
- Update issue status within 2 business days of assignment
- All partners must monitor issues labeled with their component name
Level 1: Component Level (Days 1-3)
- Issue assigned to component owner partner
- Technical team investigates and resolves
- Expected resolution: 3 business days
Level 2: Technical Coordination (Days 4-7)
- If unresolved after 3 days, escalate to technical coordinator (ED)
- Multi-partner coordination meeting scheduled if needed
- Expected resolution: 7 business days total
Level 3: Project Management (Days 8+)
- If unresolved after 7 days or if it impacts project deliverables
- Escalate to Project Management Team
- Impact assessment on project timeline and deliverables
- Risk mitigation plan created
For critical issues (system down, data loss, security breach):
- Immediate notification to all partners via email and project channels
- Label with the `critical` tag
- Technical coordinator involved from the start
- Daily status updates required
- Expected resolution: 24-48 hours
When issues involve multiple components:
- Create a parent issue with all affected component labels
- Create linked child issues for each component if needed
- Technical coordinator facilitates resolution meeting
- Document dependencies and integration points in issue description
## Issue Details
- **Issue ID**: INT-XXX
- **Component(s)**: [Component names]
- **Severity**: Critical/Major/Minor
- **Test Case**: [Test ID]
## Description
[Detailed description of the issue]
## Steps to Reproduce
1. [Step 1]
2. [Step 2]
## Expected vs Actual
- **Expected**: [Expected behavior]
- **Actual**: [Actual behavior]
## Environment
- **Version**: [Component versions]
- **Test Data**: [Data set used]
## Logs/Screenshots
[Attach relevant logs or screenshots]