Merge pull request #21 from SolaceDev/add-releasing
Add Maven Releasing
Nephery authored Dec 15, 2021
2 parents 14e72da + 929f31d commit 56e595d
Showing 7 changed files with 535 additions and 24 deletions.
2 changes: 1 addition & 1 deletion .gitignore
@@ -7,7 +7,7 @@
/bin/

# Java related
target/**
/target/
*.jar
*.war
*.ear
59 changes: 39 additions & 20 deletions README.md
@@ -337,44 +337,63 @@ Kerberos has some very specific requirements to operate correctly. Some addition

JDK 8 or higher is required for this project.

First, clone this GitHub repo:
```shell
git clone https://github.com/SolaceProducts/pubsubplus-connector-kafka-sink.git
cd pubsubplus-connector-kafka-sink
```

Then run the build script:
```shell
./gradlew clean build
```
1. First, clone this GitHub repo:
   ```shell
   git clone https://github.com/SolaceProducts/pubsubplus-connector-kafka-sink.git
   cd pubsubplus-connector-kafka-sink
   ```
2. Install the test support module:
   ```shell
   git submodule update --init --recursive
   cd solace-integration-test-support
   ./mvnw clean install -DskipTests
   cd ..
   ```
3. Then run the build script:
   ```shell
   ./gradlew clean build
   ```

This script creates artifacts in the `build` directory, including the deployable packaged PubSub+ Sink Connector archives under `build/distributions`.

### Test the Project

An integration test suite is also included, which spins up a Docker-based deployment environment that includes a PubSub+ event broker, Zookeeper, a Kafka broker, and Kafka Connect. It deploys the connector to Kafka Connect and runs end-to-end tests.

1. Install the test support module:
```shell
git submodule update --init --recursive
cd solace-integration-test-support
./mvnw clean install -DskipTests
cd ..
```
2. Run the tests:
1. Run the tests:
   ```shell
   ./gradlew clean test integrationTest
   ```

### Build a New Record Processor

The processing of a Kafka record to create a PubSub+ message is handled by an interface defined in [`SolRecordProcessorIF.java`](/src/main/java/com/solace/connector/kafka/connect/sink/SolRecordProcessorIF.java). This is a simple interface that creates the Kafka source records from the PubSub+ messages. This project includes three examples of classes that implement this interface:
The processing of a Kafka record to create a PubSub+ message is handled by [`SolRecordProcessorIF`](/src/main/java/com/solace/connector/kafka/connect/sink/SolRecordProcessorIF.java). This is a simple interface that creates the PubSub+ messages from the Kafka sink records.

To get started, import the following dependency into your project:

**Maven**
```xml
<dependency>
    <groupId>com.solace.connector.kafka.connect</groupId>
    <artifactId>pubsubplus-connector-kafka-sink</artifactId>
    <version>2.2.0</version>
</dependency>
```

**Gradle**
```groovy
compile "com.solace.connector.kafka.connect:pubsubplus-connector-kafka-sink:2.2.0"
```

Now you can implement your custom `SolRecordProcessorIF`.
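For illustration, here is a minimal sketch of a custom record processor. It assumes the interface exposes a single `processRecord(String skey, SinkRecord record)` method returning a JCSMP `BytesXMLMessage` (check the linked `SolRecordProcessorIF` source and the bundled examples below for the exact contract); the package and class names are placeholders:

```java
package com.example; // placeholder package name

import com.solace.connector.kafka.connect.sink.SolRecordProcessorIF;
import com.solacesystems.jcsmp.BytesXMLMessage;
import com.solacesystems.jcsmp.JCSMPFactory;
import org.apache.kafka.connect.sink.SinkRecord;

import java.nio.charset.StandardCharsets;

/**
 * Illustrative record processor: copies the Kafka record value into the
 * binary attachment of a new PubSub+ message.
 */
public class MyRecordProcessor implements SolRecordProcessorIF {

    // Assumed method signature; see SolRecordProcessorIF for the authoritative contract.
    @Override
    public BytesXMLMessage processRecord(String skey, SinkRecord record) {
        // Create an empty PubSub+ message through the JCSMP factory
        BytesXMLMessage message = JCSMPFactory.onlyInstance().createMessage(BytesXMLMessage.class);

        // Use the Kafka record value as the message payload
        Object value = record.value();
        byte[] payload = (value instanceof byte[])
                ? (byte[]) value
                : String.valueOf(value).getBytes(StandardCharsets.UTF_8);
        message.writeAttachment(payload);

        return message;
    }
}
```

A real processor would typically also map the record key, headers, and destination; the bundled examples below show how the project itself does this.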

For reference, this project includes three examples which you can use as starting points for implementing your own custom record processors:

* [SolSimpleRecordProcessor](/src/main/java/com/solace/connector/kafka/connect/sink/recordprocessor/SolSimpleRecordProcessor.java)
* [SolSimpleKeyedRecordProcessor](/src/main/java/com/solace/connector/kafka/connect/sink/recordprocessor/SolSimpleKeyedRecordProcessor.java)
* [SolDynamicDestinationRecordProcessor](/src/main/java/com/solace/connector/kafka/connect/sink/recordprocessor/SolDynamicDestinationRecordProcessor.java)

You can use these examples as starting points for implementing your own custom record processors.
Once you've built the jar file for your custom record processor project, place it into the same directory as this connector, and update the connector's `sol.record_processor_class` config to point to the class of your new record processor.
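For example, assuming the hypothetical `com.example.MyRecordProcessor` class sketched above, a properties-based connector configuration would reference it like this:

```properties
# Point the connector at the custom record processor class (placeholder class name)
sol.record_processor_class=com.example.MyRecordProcessor
```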

More information on Kafka sink connector development can be found here:
- [Apache Kafka Connect](https://kafka.apache.org/documentation/)
97 changes: 97 additions & 0 deletions build.gradle
@@ -1,18 +1,23 @@
import com.github.spotbugs.snom.SpotBugsTask
import io.github.gradlenexus.publishplugin.InitializeNexusStagingRepository

plugins {
    id 'java'
    id 'distribution'
    id 'jacoco'
    id 'maven-publish'
    id 'pmd'
    id 'signing'
    id 'com.github.spotbugs' version '4.7.6'
    id 'io.github.gradle-nexus.publish-plugin' version '1.1.0'
    id 'org.gradle.test-retry' version '1.3.1'
    id 'org.unbroken-dome.test-sets' version '2.2.1'
}

ext {
    kafkaVersion = '2.8.1'
    solaceJavaAPIVersion = '10.12.1'
    isSnapshot = project.version.endsWith('-SNAPSHOT')
}

repositories {
@@ -98,6 +103,7 @@ project.integrationTest {
    useJUnitPlatform()
    outputs.upToDateWhen { false }
    dependsOn prepDistForIntegrationTesting
    shouldRunAfter test
    retry {
        maxRetries = 3
    }
@@ -186,6 +192,11 @@ project.compileJava {
    dependsOn generateJava
}

java {
    withJavadocJar()
    withSourcesJar()
}

distributions {
    main {
        contents {
@@ -202,3 +213,89 @@ }
        }
    }
}

publishing {
    publications {
        maven(MavenPublication) {
            from components.java
            pom {
                name = "Solace PubSub+ Connector for Kafka: Sink"
                description = "The Solace/Kafka adapter consumes Kafka topic records and streams them to the PubSub+ Event Mesh as topic and/or queue data events."
                url = "https://github.com/SolaceProducts/pubsubplus-connector-kafka-sink"
                packaging = "jar"
                licenses {
                    license {
                        name = "Apache License, Version 2.0"
                        url = "https://github.com/SolaceProducts/pubsubplus-connector-kafka-sink/blob/master/LICENSE"
                        distribution = "repo"
                    }
                }
                organization {
                    name = "Solace"
                    url = "https://www.solace.com"
                }
                developers {
                    developer {
                        name = "Support for Solace"
                        email = "[email protected]"
                        organization = "Solace"
                        organizationUrl = "http://solace.community"
                    }
                }
                scm {
                    connection = "scm:git:git://github.com/SolaceProducts/pubsubplus-connector-kafka-sink.git"
                    developerConnection = "scm:git:[email protected]:SolaceProducts/pubsubplus-connector-kafka-sink.git"
                    url = "https://github.com/SolaceProducts/pubsubplus-connector-kafka-sink.git"
                }
            }
        }
    }
    repositories {
        maven {
            def releasesUrl = uri('http://apps-jenkins:9090/nexus/content/repositories/releases')
            def snapshotRepositoryUrl = uri('http://apps-jenkins:9090/nexus/content/repositories/snapshots')
            url = isSnapshot ? snapshotRepositoryUrl : releasesUrl
            name = 'internal'
            credentials {
                username = project.properties[name + "Username"]
                password = project.properties[name + "Password"]
            }
        }
    }
}

nexusPublishing {
    repositories {
        sonatype {
            nexusUrl = uri('https://oss.sonatype.org/service/local/')
            snapshotRepositoryUrl = uri('https://oss.sonatype.org/content/repositories/snapshots')
            // gets credentials from project.properties["sonatypeUsername"] project.properties["sonatypePassword"]
        }
    }
}

signing {
    required {
        !isSnapshot
    }
    useGpgCmd()
    sign publishing.publications.maven
}

tasks.withType(Sign) {
    onlyIf {
        gradle.taskGraph.allTasks.any { task ->
            task.name.startsWith("publish") && task.name.contains('Sonatype')
        }
    }
    shouldRunAfter test, integrationTest
}

tasks.withType(InitializeNexusStagingRepository).configureEach {
    dependsOn test, integrationTest
    shouldRunAfter tasks.withType(Sign)
}

tasks.withType(PublishToMavenRepository).configureEach {
    dependsOn test, integrationTest
}