Comprehensive guide for developing, testing, and contributing to HugeGraph Store.
- Development Environment Setup
- Module Architecture
- Build and Test
- gRPC Development
- Debugging
- Contribution Guidelines
Required:
- Java: 11 or higher (OpenJDK or Oracle JDK)
- Maven: 3.5 or higher
- Git: Latest version
- IDE: IntelliJ IDEA (recommended) or Eclipse
Optional (for testing):
- Docker: For containerized testing
- grpcurl: For gRPC API testing
- Prometheus/Grafana: For metrics testing
# Clone HugeGraph repository
git clone https://github.com/apache/hugegraph.git
cd hugegraph
# Checkout development branch
git checkout 1.7-rebase
Import Project:
- File → Open → Select hugegraph directory
- IntelliJ detects Maven project → Click "Import"
- Wait for Maven to download dependencies
Code Style:
# Configure IDE code style
# Ensure EditorConfig support is enabled
# Code style is defined in .editorconfig at repository rootRun Configuration:
- Run → Edit Configurations
- Add new "Application" configuration:
- Main class: org.apache.hugegraph.store.node.StoreNodeApplication
- VM options: -Xms4g -Xmx4g -Dconfig.file=conf/application.yml
- Working directory: hugegraph-store/hg-store-dist/target/apache-hugegraph-store-incubating-1.7.0
- Use classpath of module: hg-store-node
Build entire project:
# From hugegraph root
mvn clean install -DskipTests
Build Store module only:
# Build hugegraph-struct first (required dependency)
mvn install -pl hugegraph-struct -am -DskipTests
# Build Store
cd hugegraph-store
mvn clean install -DskipTests
Build with tests:
mvn clean install
hugegraph-struct (external dependency)
↓
hg-store-common
↓
├─→ hg-store-grpc (proto definitions)
├─→ hg-store-rocksdb
↓
hg-store-core
↓
├─→ hg-store-client
├─→ hg-store-node
↓
├─→ hg-store-cli
├─→ hg-store-dist
└─→ hg-store-test
Location: hugegraph-store/hg-store-common
Purpose: Shared utilities and query abstractions
Key Packages:
- buffer: ByteBuffer utilities
- constant: Constants and enums
- query: Query abstraction classes
  - Condition: Filter conditions
  - Aggregate: Aggregation types
  - QueryCondition: Query parameters
- term: Term matching utilities
- util: General utilities
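To make the shape of these query abstractions concrete, here is a toy, self-contained sketch. The `Condition` and `Aggregate` names mirror the package contents listed above, but the signatures and semantics are invented for illustration; they are not the real hg-store-common API.

```java
import java.util.List;
import java.util.function.Predicate;

// Toy model: a Condition filters entries, an Aggregate folds the matches
// into a single value. Illustrative stand-ins only, not the real classes.
public class QuerySketch {

    // Hypothetical stand-in for query.Condition
    interface Condition extends Predicate<String> {}

    // Hypothetical stand-in for query.Aggregate
    enum Aggregate { COUNT, MAX_LENGTH }

    static long evaluate(List<String> values, Condition cond, Aggregate agg) {
        var matched = values.stream().filter(cond);
        switch (agg) {
            case COUNT:      return matched.count();
            case MAX_LENGTH: return matched.mapToLong(String::length).max().orElse(0);
            default:         throw new IllegalArgumentException("unknown aggregate");
        }
    }

    public static void main(String[] args) {
        List<String> data = List.of("alpha", "beta", "alphabet");
        Condition startsWithA = s -> s.startsWith("a");
        System.out.println(evaluate(data, startsWithA, Aggregate.COUNT)); // 2
    }
}
```

The point of the abstraction is that conditions and aggregates travel with the query, so filtering and folding can happen close to the data rather than on the client.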
Adding New Utility:
- Create class in appropriate package (e.g., util)
- Add Javadoc comments
- Add unit tests in hg-store-test
Location: hugegraph-store/hg-store-grpc
Purpose: gRPC protocol definitions
Structure:
hg-store-grpc/
├── src/main/proto/ # Protocol definitions
│ ├── store_session.proto
│ ├── query.proto
│ ├── graphpb.proto
│ ├── store_state.proto
│ ├── store_stream_meta.proto
│ ├── healthy.proto
│ └── store_common.proto
└── target/generated-sources/ # Generated Java code (git-ignored)
Generated Code: Excluded from source control and Apache RAT checks
Location: hugegraph-store/hg-store-core
Purpose: Core storage engine logic
Key Classes:
HgStoreEngine.java (~500 lines):
- Singleton per Store node
- Manages all PartitionEngine instances
- Coordinates with PD
- Entry point for partition lifecycle
PartitionEngine.java (~300 lines):
- One instance per partition replica
- Wraps Raft node
- Delegates to BusinessHandler
HgStoreStateMachine.java (~400 lines):
- Implements JRaft's StateMachine
- Applies Raft log entries
- Handles snapshot save/load
BusinessHandler.java (interface) / BusinessHandlerImpl.java (~800 lines):
- Implements data operations (put, get, delete, scan)
- Processes queries with filters and aggregations
Key Packages:
- business/: Business logic layer
- meta/: Metadata management
- raft/: Raft integration
- pd/: PD client and integration
- cmd/: Command processing
- snapshot/: Snapshot management
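The relationship between the singleton engine and its per-partition engines can be sketched as follows. This is a minimal illustrative pattern only: the real HgStoreEngine additionally coordinates with PD and wires each partition into a Raft group.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sketch of the lifecycle pattern: one singleton engine per Store
// node, owning one per-partition engine (here just a String placeholder).
public class EngineSketch {

    private static final EngineSketch INSTANCE = new EngineSketch();

    // partitionId -> per-partition engine (placeholder type)
    private final Map<Integer, String> partitionEngines = new ConcurrentHashMap<>();

    private EngineSketch() {}                 // no external instantiation

    public static EngineSketch getInstance() {
        return INSTANCE;
    }

    // Create the partition engine lazily on first use, analogous to the
    // engine reacting to PD assigning a partition to this node.
    public String getOrCreatePartitionEngine(int partitionId) {
        return partitionEngines.computeIfAbsent(partitionId,
                id -> "partition-engine-" + id);
    }

    public int partitionCount() {
        return partitionEngines.size();
    }
}
```

The ConcurrentHashMap plus computeIfAbsent keeps partition creation thread-safe and idempotent, which matters because partition assignment callbacks can race with request handling.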
Location: hugegraph-store/hg-store-client
Purpose: Java client library
Key Classes:
- HgStoreClient: Main client interface
- HgStoreSession: Session-based operations
- HgStoreNodeManager: Connection management
- HgStoreQuery: Query builder
Usage: See Integration Guide
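As a rough illustration of the session-based shape, here is an in-memory toy. The `openSession` factory and the byte-array signatures are assumptions made for this sketch; the real HgStoreClient/HgStoreSession API is documented in the Integration Guide.

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// Toy sketch: a session scopes key/value operations to a graph, and each
// operation names a table. A HashMap stands in for a real Store cluster.
public class ClientSketch {

    // Stand-in for HgStoreSession
    static class Session {
        private final Map<String, byte[]> store = new HashMap<>();

        void put(String table, byte[] key, byte[] value) {
            store.put(table + "/" + new String(key, StandardCharsets.UTF_8), value);
        }

        byte[] get(String table, byte[] key) {
            return store.get(table + "/" + new String(key, StandardCharsets.UTF_8));
        }
    }

    // Stand-in for obtaining a session from HgStoreClient
    static Session openSession(String graphName) {
        return new Session();
    }

    public static void main(String[] args) {
        Session session = openSession("hugegraph");
        session.put("g+v", "v1".getBytes(StandardCharsets.UTF_8),
                    "vertex-bytes".getBytes(StandardCharsets.UTF_8));
        byte[] value = session.get("g+v", "v1".getBytes(StandardCharsets.UTF_8));
        System.out.println(new String(value, StandardCharsets.UTF_8));
    }
}
```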
Location: hugegraph-store/hg-store-node
Purpose: Store node server
Key Classes:
- StoreNodeApplication: Spring Boot main class
- HgStoreSessionService: gRPC service implementation
- HgStoreQueryService: Query service implementation
Start Server:
cd hugegraph-store/hg-store-dist/target/apache-hugegraph-store-incubating-1.7.0
bin/start-hugegraph-store.sh
To run the Store module in debug mode, run HgStoreNodeService directly in your IDE (make sure PD is running first).
Clean build:
mvn clean install -DskipTests
Compile only:
mvn compile
Package distribution:
mvn clean package -DskipTests
# Output: hg-store-dist/target/apache-hugegraph-store-incubating-<version>.tar.gz
Regenerate gRPC stubs (after modifying .proto files):
cd hugegraph-store/hg-store-grpc
mvn clean compile
# Generated files: target/generated-sources/protobuf/
Store tests use Maven profiles (all active by default):
<profile>
<id>store-client-test</id>
<activation><activeByDefault>true</activeByDefault></activation>
</profile>
<profile>
<id>store-core-test</id>
<activation><activeByDefault>true</activeByDefault></activation>
</profile>
<profile>
<id>store-common-test</id>
<activation><activeByDefault>true</activeByDefault></activation>
</profile>
<profile>
<id>store-rocksdb-test</id>
<activation><activeByDefault>true</activeByDefault></activation>
</profile>
<profile>
<id>store-server-test</id>
<activation><activeByDefault>true</activeByDefault></activation>
</profile>
<profile>
<id>store-raftcore-test</id>
<activation><activeByDefault>true</activeByDefault></activation>
</profile>
All tests:
cd hugegraph-store
mvn test
Specific profile:
mvn test -P store-core-test
Specific test class:
mvn test -Dtest=HgStoreEngineTest
Specific test method:
mvn test -Dtest=HgStoreEngineTest#testPartitionCreation
From IntelliJ:
- Right-click test class → Run 'TestClassName'
- Right-click test method → Run 'testMethodName'
Location: hugegraph-store/hg-store-test/src/main/java (non-standard location)
Packages:
- client/: Client library tests
- common/: Common utilities tests
- core/: Core storage tests
- raft/: Raft tests
- snapshot/: Snapshot tests
- store/: Storage engine tests
- meta/: Metadata tests
- raftcore/: Raft core tests
- rocksdb/: RocksDB tests
- service/: Service tests
Base Test Class: BaseTest.java
- Provides common test utilities
- Sets up test environment
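A common pattern for such a base class is giving each test an isolated, throwaway data directory. The sketch below shows that pattern only; it is not the project's BaseTest.java (see that file for the actual setup).

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.stream.Stream;

// Illustrative base-class pattern: fresh temp data dir per test, removed
// afterwards so RocksDB/Raft state never leaks between tests.
public class BaseTestSketch {

    protected Path dataPath;

    // Call before each test (e.g. from a JUnit @Before method)
    public void setUp() {
        try {
            dataPath = Files.createTempDirectory("hg-store-test-");
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Call after each test: delete children before parents
    public void tearDown() {
        try (Stream<Path> walk = Files.walk(dataPath)) {
            walk.sorted(Comparator.reverseOrder())
                .forEach(p -> p.toFile().delete());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```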
Example Test Class:
package org.apache.hugegraph.store.core;
import org.apache.hugegraph.store.BaseTest;
import org.junit.Test;
import static org.junit.Assert.*;
public class HgStoreEngineTest extends BaseTest {
@Test
public void testEngineCreation() {
// Arrange
HgStoreEngineConfig config = HgStoreEngineConfig.builder()
.dataPath("./test-data")
.build();
// Act
HgStoreEngine engine = HgStoreEngine.getInstance();
engine.init(config);
// Assert
assertNotNull(engine);
assertTrue(engine.isInitialized());
// Cleanup
engine.shutdown();
}
}
Integration Test Example:
@Test
public void testRaftConsensus() throws Exception {
// Setup 3-node Raft group
List<PartitionEngine> engines = new ArrayList<>();
for (int i = 0; i < 3; i++) {
PartitionEngine engine = createPartitionEngine(i);
engines.add(engine);
engine.start();
}
// Wait for leader election
Thread.sleep(2000);
// Perform write on leader
PartitionEngine leader = findLeader(engines);
leader.put("key1".getBytes(), "value1".getBytes());
// Wait for replication
Thread.sleep(1000);
// Verify on all nodes
for (PartitionEngine engine : engines) {
byte[] value = engine.get("key1".getBytes());
assertEquals("value1", new String(value));
}
// Cleanup
for (PartitionEngine engine : engines) {
engine.stop();
}
}
Generate Coverage Report:
mvn clean test jacoco:report
# Report: hg-store-test/target/site/jacoco/index.html
View in Browser:
open hg-store-test/target/site/jacoco/index.html
Create or edit .proto file in hg-store-grpc/src/main/proto/:
Example: my_service.proto
syntax = "proto3";
package org.apache.hugegraph.store.grpc;
import "store_common.proto";
service MyService {
rpc MyOperation(MyRequest) returns (MyResponse);
}
message MyRequest {
Header header = 1;
string key = 2;
}
message MyResponse {
bytes value = 1;
}
cd hg-store-grpc
mvn clean compile
# Generated classes:
# - MyServiceGrpc.java (service stub)
# - MyRequest.java
# - MyResponse.java
Create service implementation in hg-store-node/src/main/java/.../service/:
package org.apache.hugegraph.store.node.service;
import com.google.protobuf.ByteString;
import io.grpc.stub.StreamObserver;
import org.apache.hugegraph.store.grpc.MyServiceGrpc;
import org.apache.hugegraph.store.grpc.MyRequest;
import org.apache.hugegraph.store.grpc.MyResponse;
public class MyServiceImpl extends MyServiceGrpc.MyServiceImplBase {
@Override
public void myOperation(MyRequest request, StreamObserver<MyResponse> responseObserver) {
try {
// Extract request parameters
String key = request.getKey();
// Perform operation (delegate to HgStoreEngine)
byte[] value = performOperation(key);
// Build response
MyResponse response = MyResponse.newBuilder()
.setValue(ByteString.copyFrom(value))
.build();
// Send response
responseObserver.onNext(response);
responseObserver.onCompleted();
} catch (Exception e) {
// Wrap in a gRPC status so clients receive a meaningful error code
responseObserver.onError(io.grpc.Status.INTERNAL
.withDescription(e.getMessage())
.withCause(e)
.asRuntimeException());
}
}
private byte[] performOperation(String key) {
// Implementation
return new byte[0];
}
}
In StoreNodeApplication.java:
@Bean
public Server grpcServer() {
return ServerBuilder.forPort(grpcPort)
.addService(new HgStoreSessionService())
.addService(new HgStoreQueryService())
.addService(new MyServiceImpl()) // Add new service
.build();
}
Using grpcurl:
# List services
grpcurl -plaintext localhost:8500 list
# Call method
grpcurl -plaintext -d '{"key": "test"}' localhost:8500 org.apache.hugegraph.store.grpc.MyService/MyOperation
Unit Test:
@Test
public void testMyService() {
// Setup gRPC channel
ManagedChannel channel = ManagedChannelBuilder
.forAddress("localhost", 8500)
.usePlaintext()
.build();
// Create stub
MyServiceGrpc.MyServiceBlockingStub stub = MyServiceGrpc.newBlockingStub(channel);
// Build request
MyRequest request = MyRequest.newBuilder()
.setKey("test")
.build();
// Call service
MyResponse response = stub.myOperation(request);
// Verify
assertNotNull(response.getValue());
// Cleanup
channel.shutdown();
}
Debug Store Node in IntelliJ:
- Set breakpoints in source code
- Run → Debug 'StoreNodeApplication'
- Debugger pauses at breakpoints
Debug with Remote Store:
- Start Store with debug port:
# Edit start-hugegraph-store.sh
JAVA_OPTS="$JAVA_OPTS -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
bin/start-hugegraph-store.sh
- Attach debugger in IntelliJ:
- Run → Edit Configurations → Add "Remote JVM Debug"
- Host: localhost, Port: 5005
- Run → Debug 'Remote Store'
Log Configuration: hg-store-dist/src/assembly/static/conf/log4j2.xml
Enable Debug Logging:
<!-- Store core -->
<Logger name="org.apache.hugegraph.store" level="DEBUG"/>
<!-- Raft -->
<Logger name="com.alipay.sofa.jraft" level="DEBUG"/>
<!-- RocksDB -->
<Logger name="org.rocksdb" level="DEBUG"/>
<!-- gRPC -->
<Logger name="io.grpc" level="DEBUG"/>
Restart to apply:
bin/restart-hugegraph-store.sh
View Logs:
tail -f logs/hugegraph-store.log
tail -f logs/hugegraph-store.log | grep ERROR
Check Raft State:
# Raft logs location
ls -lh storage/raft/partition-*/log/
# Raft snapshots
ls -lh storage/raft/partition-*/snapshot/
Raft Metrics (in code):
// Get Raft node status
RaftNode node = partitionEngine.getRaftNode();
NodeStatus status = node.getNodeStatus();
System.out.println("Term: " + status.getTerm());
System.out.println("State: " + status.getState()); // Leader, Follower, Candidate
System.out.println("Peers: " + status.getPeers());
Enable Raft Logging:
<Logger name="com.alipay.sofa.jraft" level="DEBUG"/>
RocksDB Statistics:
// In code
RocksDB db = rocksDBSession.getDb();
String stats = db.getProperty("rocksdb.stats");
System.out.println(stats);
Dump RocksDB Data (for inspection):
# Using ldb tool (included with RocksDB)
ldb --db=storage/rocksdb scan --max_keys=100
JVM Profiling (using async-profiler):
# Download async-profiler
wget https://github.com/jvm-profiling-tools/async-profiler/releases/download/v2.9/async-profiler-2.9-linux-x64.tar.gz
tar -xzf async-profiler-2.9-linux-x64.tar.gz
# Start profiling
./profiler.sh -d 60 -f flamegraph.html $(pgrep -f hugegraph-store)
# View flamegraph
open flamegraph.html
Memory Profiling:
# Heap dump
jmap -dump:format=b,file=heap.bin $(pgrep -f hugegraph-store)
# Analyze with VisualVM or Eclipse MAT
Java:
- Follow Apache HugeGraph code style (configured via .editorconfig)
- Use 4 spaces for indentation (no tabs)
- Max line length: 120 characters
- Braces on same line (K&R style)
Example:
public class MyClass {
private static final Logger LOG = LoggerFactory.getLogger(MyClass.class);
public void myMethod(String param) {
if (param == null) {
throw new IllegalArgumentException("param cannot be null");
}
// Implementation
}
}
Format:
<type>(<scope>): <subject>
<body>
<footer>
Types:
- feat: New feature
- fix: Bug fix
- docs: Documentation changes
- refactor: Code refactoring
- test: Test additions or changes
- chore: Build or tooling changes
Example:
feat(store): add query aggregation pushdown
Implement COUNT, SUM, MIN, MAX, AVG aggregations at Store level
to reduce network traffic and improve query performance.
Closes #1234
- Fork and Clone:
# Fork on GitHub
git clone https://github.com/YOUR_USERNAME/hugegraph.git
cd hugegraph
git remote add upstream https://github.com/apache/hugegraph.git
- Create Branch:
git checkout -b feature-my-feature
- Develop and Test:
# Make changes
# Add tests
mvn clean install  # Ensure all tests pass
- Check Code Quality:
# License header check
mvn apache-rat:check
# Code style check
mvn editorconfig:check
- Commit:
git add .
git commit -m "feat(store): add new feature"
- Push and Create PR:
git push origin feature-my-feature
# Create PR on GitHub
- Code Review:
- Address review comments
- Update PR with fixes
- Request re-review
- Merge:
- Maintainers merge after approval
Adding Dependencies:
When adding third-party dependencies:
- Add to pom.xml
- Add license file to install-dist/release-docs/licenses/
- Update install-dist/release-docs/LICENSE
- If upstream has NOTICE, update install-dist/release-docs/NOTICE
- Update install-dist/scripts/dependency/known-dependencies.txt
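The first step is a standard Maven dependency entry; the coordinates below are placeholders, not a real artifact:

```xml
<!-- Hypothetical example; replace with the real groupId/artifactId/version -->
<dependency>
    <groupId>org.example</groupId>
    <artifactId>example-lib</artifactId>
    <version>1.0.0</version>
</dependency>
```

Each such entry should be matched by a license file (e.g. a file under install-dist/release-docs/licenses/ named after the artifact) and a line in known-dependencies.txt, or the dependency check below will fail.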
Run Dependency Check:
cd install-dist/scripts/dependency
./regenerate_known_dependencies.sh
When to Update Docs:
- New feature: Add usage examples
- API changes: Update API reference
- Configuration changes: Update configuration guide
- Bug fixes: Update troubleshooting section
Documentation Location:
- Main README: hugegraph-store/README.md
- Detailed docs: hugegraph-store/docs/
Official Documentation:
- HugeGraph Docs: https://hugegraph.apache.org/docs/
- Apache TinkerPop: https://tinkerpop.apache.org/docs/
Community:
- Mailing List: dev@hugegraph.apache.org
- GitHub Issues: https://github.com/apache/hugegraph/issues
- Slack: (link in project README)
Related Projects:
- SOFA-JRaft: https://github.com/sofastack/sofa-jraft
- RocksDB: https://rocksdb.org/
- gRPC: https://grpc.io/docs/languages/java/
For operational procedures, see Operations Guide.
For production best practices, see Best Practices.