
Enhancement 2 #11

Merged · 34 commits · Dec 30, 2021

Commits
203f643
Added .NET TimestampType
masesdevelopers Nov 22, 2021
11f7b10
Fix on ProducerRecord
masesdevelopers Nov 22, 2021
f34923b
Refix ProducerRecord
masesdevelopers Nov 22, 2021
69017dc
#4: update config classes
masesdevelopers Nov 22, 2021
bb8e160
#6: update config classes
masesdevelopers Nov 22, 2021
6aac8c2
#7: update config classes
masesdevelopers Nov 22, 2021
948d815
Updates test and templates
masesdevelopers Nov 22, 2021
575d38d
#4: added new methods
masesdevelopers Nov 23, 2021
6767abe
#6: update classes
masesdevelopers Nov 23, 2021
46a0b35
Update version
masesdevelopers Nov 23, 2021
d705574
#7: stream improvements
masesdevelopers Nov 23, 2021
b3ce0f0
#6: class updates
masesdevelopers Nov 23, 2021
673d659
#7: added basic stream template
masesdevelopers Nov 25, 2021
2a0bbb9
Updates on enum retrieve
masesdevelopers Nov 25, 2021
8909b7d
#7: add more methods on Topology
masesdevelopers Nov 25, 2021
0f8f415
#2: update to JCOBridge 2.4.3
masesdevelopers Dec 13, 2021
645fe01
Added java project, update source structure
masesdevelopers Dec 13, 2021
02db991
Added first version of java classes
masesdevelopers Dec 22, 2021
70f2995
Update to create a self-contained package
masesdevelopers Dec 22, 2021
ef18cf8
Added missing file
masesdevelopers Dec 22, 2021
f977bb2
Removed JCOBridge installation
masesdevelopers Dec 22, 2021
3e3f733
Removed wrong maven command
masesdevelopers Dec 22, 2021
d3f1e0c
Added command to install JCOBridge
masesdevelopers Dec 22, 2021
58227f6
Retest with new command
masesdevelopers Dec 22, 2021
2ae3baf
Retest install command
masesdevelopers Dec 22, 2021
20e4ced
Changing shell
masesdevelopers Dec 22, 2021
af5275d
#10: self contained NuGet package
masesdevelopers Dec 28, 2021
d06b49a
#10: added all Jars found in the official Apache Kafka delivery
masesdevelopers Dec 28, 2021
3606ed7
#8: added server start with project refactoring
masesdevelopers Dec 29, 2021
4fc32fb
#10: added configuration files (copied from Apache Kafka) into package
masesdevelopers Dec 29, 2021
1a8f813
Documentation upgrade
masesdevelopers Dec 29, 2021
b2ae1af
Documentation update 2
masesdevelopers Dec 29, 2021
f7c0e89
#4: added extra tools and admin classes
masesdevelopers Dec 30, 2021
54d3ba0
Fix pullrequest.yaml
masesdevelopers Dec 30, 2021
30 changes: 27 additions & 3 deletions .github/workflows/build.yaml
@@ -57,16 +57,40 @@ jobs:
with:
submodules: 'true'

- name: Compile
run: dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src\KafkaBridge.sln
- name: Pre compile
run: dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src\net\KafkaBridge.sln

- name: Set up Apache Maven Central
uses: actions/setup-java@v1
with: # running setup-java again overwrites the settings.xml
java-version: 11
server-id: ossrh # Value of the distributionManagement/repository/id field of the pom.xml
server-username: MAVEN_USERNAME # env variable for username in deploy
server-password: MAVEN_CENTRAL_TOKEN # env variable for token in deploy
gpg-private-key: ${{ secrets.MAVEN_GPG_PRIVATE_KEY }} # Value of the GPG private key to import
gpg-passphrase: MAVEN_GPG_PASSPHRASE # env variable for GPG private key passphrase

- name: Install local file to be used within Javadoc plugin of generated POM
run: mvn install:install-file -DgroupId=JCOBridge -DartifactId=JCOBridge -Dversion=2.4.3 -Dpackaging=jar -Dfile=./bin/net5.0/JCOBridge.jar
shell: bash

- name: Create Jars
run: mvn --file ./src/java/kafkabridge/pom.xml package
env:
MAVEN_USERNAME: ${{ secrets.MAVEN_USERNAME }}
MAVEN_CENTRAL_TOKEN: ${{ secrets.MAVEN_CENTRAL_TOKEN }}
MAVEN_GPG_PASSPHRASE: ${{ secrets.MAVEN_GPG_PASSPHRASE }}

- name: Recompile to create nuget packages
run: dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src\net\KafkaBridge.sln

- name: Clear documentation folder
run: Remove-Item .\docs\* -Recurse -Force -Exclude _config.yml

- name: Build documentation
run: |
choco install docfx
cd src\Documentation
cd src\net\Documentation
docfx

- uses: actions/upload-artifact@v2
Expand Down
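For local debugging it can help to replay the same sequence the workflow runs. The sketch below is only an outline of those steps, assuming a machine with the .NET SDK, JDK 11, and Maven installed, the repository checked out with submodules, and JCOBridge.jar present under ./bin/net5.0/ after the first build, as the workflow expects (paths are written with forward slashes for a bash-like shell):

```sh
# Pre compile: build the .NET solution first so ./bin/net5.0/JCOBridge.jar is in place
dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src/net/KafkaBridge.sln

# Register JCOBridge in the local Maven repository so the kafkabridge POM (Javadoc plugin) can resolve it
mvn install:install-file -DgroupId=JCOBridge -DartifactId=JCOBridge -Dversion=2.4.3 \
    -Dpackaging=jar -Dfile=./bin/net5.0/JCOBridge.jar

# Build the bridge jars from the Maven project
mvn --file ./src/java/kafkabridge/pom.xml package

# Recompile so the NuGet packages pick up the jars produced above
dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src/net/KafkaBridge.sln
```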
28 changes: 26 additions & 2 deletions .github/workflows/pullrequest.yaml
@@ -25,8 +25,32 @@ jobs:
with:
submodules: 'true'

- name: Compile
run: dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src\KafkaBridge.sln
- name: Pre compile
run: dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src\net\KafkaBridge.sln

- name: Set up Apache Maven Central
uses: actions/setup-java@v1
with: # running setup-java again overwrites the settings.xml
java-version: 11
server-id: ossrh # Value of the distributionManagement/repository/id field of the pom.xml
server-username: MAVEN_USERNAME # env variable for username in deploy
server-password: MAVEN_CENTRAL_TOKEN # env variable for token in deploy
gpg-private-key: ${{ secrets.MAVEN_GPG_PRIVATE_KEY }} # Value of the GPG private key to import
gpg-passphrase: MAVEN_GPG_PASSPHRASE # env variable for GPG private key passphrase

- name: Install local file to be used within Javadoc plugin of generated POM
run: mvn install:install-file -DgroupId=JCOBridge -DartifactId=JCOBridge -Dversion=2.4.3 -Dpackaging=jar -Dfile=./bin/net5.0/JCOBridge.jar
shell: bash

- name: Create Jars
run: mvn --file ./src/java/kafkabridge/pom.xml package
env:
MAVEN_USERNAME: ${{ secrets.MAVEN_USERNAME }}
MAVEN_CENTRAL_TOKEN: ${{ secrets.MAVEN_CENTRAL_TOKEN }}
MAVEN_GPG_PASSPHRASE: ${{ secrets.MAVEN_GPG_PASSPHRASE }}

- name: Recompile to create nuget packages
run: dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src\net\KafkaBridge.sln

- name: Clear documentation folder
run: Remove-Item .\docs\* -Recurse -Force -Exclude _config.yml
Expand Down
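If the Maven install step appears to succeed but the "Create Jars" step still fails to resolve JCOBridge, a quick sanity check is to inspect the local Maven repository; this sketch assumes the default ~/.m2 location.

```sh
# install:install-file lays the artifact out as <groupId>/<artifactId>/<version>/ in the local repo
ls ~/.m2/repository/JCOBridge/JCOBridge/2.4.3/
# expected: JCOBridge-2.4.3.jar plus the POM generated by install:install-file
```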
28 changes: 26 additions & 2 deletions .github/workflows/release.yaml
@@ -27,8 +27,32 @@ jobs:
with:
submodules: 'true'

- name: Build NuGet Packages
run: dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src\KafkaBridge.sln
- name: Pre compile
run: dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src\net\KafkaBridge.sln

- name: Set up Apache Maven Central
uses: actions/setup-java@v1
with: # running setup-java again overwrites the settings.xml
java-version: 11
server-id: ossrh # Value of the distributionManagement/repository/id field of the pom.xml
server-username: MAVEN_USERNAME # env variable for username in deploy
server-password: MAVEN_CENTRAL_TOKEN # env variable for token in deploy
gpg-private-key: ${{ secrets.MAVEN_GPG_PRIVATE_KEY }} # Value of the GPG private key to import
gpg-passphrase: MAVEN_GPG_PASSPHRASE # env variable for GPG private key passphrase

- name: Install local file to be used within Javadoc plugin of generated POM
run: mvn install:install-file -DgroupId=JCOBridge -DartifactId=JCOBridge -Dversion=2.4.3 -Dpackaging=jar -Dfile=./bin/net5.0/JCOBridge.jar
shell: bash

- name: Create Jars
run: mvn --file ./src/java/kafkabridge/pom.xml package
env:
MAVEN_USERNAME: ${{ secrets.MAVEN_USERNAME }}
MAVEN_CENTRAL_TOKEN: ${{ secrets.MAVEN_CENTRAL_TOKEN }}
MAVEN_GPG_PASSPHRASE: ${{ secrets.MAVEN_GPG_PASSPHRASE }}

- name: Recompile to create nuget packages
run: dotnet build --no-incremental --configuration Release /p:Platform="Any CPU" src\net\KafkaBridge.sln

- uses: nuget/setup-nuget@v1
with:
Expand Down
10 changes: 8 additions & 2 deletions CONTRIBUTING.md
@@ -14,9 +14,15 @@ The project is organized in this folder structure:

* **docs** (website)
* **src**
* **KafkaBridge** (The folder containing the source and project of the reflector)
* **config**: contains the configuration files copied from the official Apache Kafka delivery
* **java**
* **kafkabridge**: contains the JVM-side implementation of some classes managed from the .NET side; it is structured as a complete Maven project
* **net**
* **KafkaBridge**: The folder containing the source and project of the Apache Kafka files ported to .NET
* **KafkaBridgeCLI**: The folder containing the source and project of the CLI for Apache Kafka
* **templates**: The folder containing the source and project to generate the NuGet template package
* **tests**
* **KafkaBridgeTest** (The folder containing the source and project of the KafkaBridge test)
* **KafkaBridgeTest**: The folder containing the source and project of the KafkaBridge test

# How Can I Contribute?

Expand Down
4 changes: 2 additions & 2 deletions README.md
@@ -11,8 +11,8 @@ This project adheres to the Contributor [Covenant code of conduct](CODE_OF_CONDU
This project aims to create a library able to directly access the features available in the [Apache Kafka binary distribution](https://kafka.apache.org/downloads).
There are many client libraries written to manage communication with Apache Kafka. Conversely, this project directly uses the Java packages released by the Apache Foundation, giving more than one benefit:
* all implemented features are available;
* avoids protocol implementation from any third party;
* can access any feature made available.
* avoids protocol implementation from any third party;
* can access any feature made available from Apache Kafka: one of the most important is Kafka Streams, which does not have any C# implementation.
The benefits come from two main points related to JCOBridge:
* its ability to manage direct access to the JVM from any .NET application: any Java/Scala class behind Apache Kafka can be directly managed;
* using the dynamic code feature of JCOBridge it is possible to write Java/Scala/Kotlin/etc. code seamlessly inside a standard .NET application written in C#/VB.NET
Expand Down
8 changes: 0 additions & 8 deletions src/Documentation/articles/actualstate.md

This file was deleted.

3 changes: 0 additions & 3 deletions src/Documentation/articles/usage.md

This file was deleted.

10 changes: 0 additions & 10 deletions src/Documentation/articles/usageCLI.md

This file was deleted.

26 changes: 0 additions & 26 deletions src/KafkaBridge/BridgedClasses/Common/Config/TopicConfig.cs

This file was deleted.

3 changes: 3 additions & 0 deletions src/config/README.md
@@ -0,0 +1,3 @@
# Configuration files

This folder contains copies of the official configuration files from the Apache Kafka delivery
19 changes: 19 additions & 0 deletions src/config/connect-console-sink.properties
@@ -0,0 +1,19 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name=local-console-sink
connector.class=org.apache.kafka.connect.file.FileStreamSinkConnector
tasks.max=1
topics=connect-test
19 changes: 19 additions & 0 deletions src/config/connect-console-source.properties
@@ -0,0 +1,19 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name=local-console-source
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
tasks.max=1
topic=connect-test
89 changes: 89 additions & 0 deletions src/config/connect-distributed.properties
@@ -0,0 +1,89 @@
##
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
##

# This file contains some of the configurations for the Kafka Connect distributed worker. This file is intended
# to be used with the examples, and some settings may differ from those used in a production system, especially
# the `bootstrap.servers` and those specifying replication factors.

# A list of host/port pairs to use for establishing the initial connection to the Kafka cluster.
bootstrap.servers=localhost:9092

# unique name for the cluster, used in forming the Connect cluster group. Note that this must not conflict with consumer group IDs
group.id=connect-cluster

# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true

# Topic to use for storing offsets. This topic should have many partitions and be replicated and compacted.
# Kafka Connect will attempt to create the topic automatically when needed, but you can always manually create
# the topic before starting Kafka Connect if a specific topic configuration is needed.
# Most users will want to use the built-in default replication factor of 3 or in some cases even specify a larger value.
# Since this means there must be at least as many brokers as the maximum replication factor used, we'd like to be able
# to run this example on a single-broker cluster and so here we instead set the replication factor to 1.
offset.storage.topic=connect-offsets
offset.storage.replication.factor=1
#offset.storage.partitions=25

# Topic to use for storing connector and task configurations; note that this should be a single partition, highly replicated,
# and compacted topic. Kafka Connect will attempt to create the topic automatically when needed, but you can always manually create
# the topic before starting Kafka Connect if a specific topic configuration is needed.
# Most users will want to use the built-in default replication factor of 3 or in some cases even specify a larger value.
# Since this means there must be at least as many brokers as the maximum replication factor used, we'd like to be able
# to run this example on a single-broker cluster and so here we instead set the replication factor to 1.
config.storage.topic=connect-configs
config.storage.replication.factor=1

# Topic to use for storing statuses. This topic can have multiple partitions and should be replicated and compacted.
# Kafka Connect will attempt to create the topic automatically when needed, but you can always manually create
# the topic before starting Kafka Connect if a specific topic configuration is needed.
# Most users will want to use the built-in default replication factor of 3 or in some cases even specify a larger value.
# Since this means there must be at least as many brokers as the maximum replication factor used, we'd like to be able
# to run this example on a single-broker cluster and so here we instead set the replication factor to 1.
status.storage.topic=connect-status
status.storage.replication.factor=1
#status.storage.partitions=5

# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000

# List of comma-separated URIs the REST API will listen on. The supported protocols are HTTP and HTTPS.
# Specify hostname as 0.0.0.0 to bind to all interfaces.
# Leave hostname empty to bind to default interface.
# Examples of legal listener lists: HTTP://myhost:8083,HTTPS://myhost:8084"
#listeners=HTTP://:8083

# The Hostname & Port that will be given out to other workers to connect to i.e. URLs that are routable from other servers.
# If not set, it uses the value for "listeners" if configured.
#rest.advertised.host.name=
#rest.advertised.port=
#rest.advertised.listener=

# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include
# any combination of:
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Examples:
# plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
#plugin.path=
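This worker configuration is a verbatim copy of the one shipped with Apache Kafka, where it is consumed by the distributed Connect launcher. A minimal usage sketch, assuming the commands are run from an Apache Kafka binary distribution directory with a broker already listening on localhost:9092:

```sh
# Start a distributed Kafka Connect worker using this configuration file
bin/connect-distributed.sh config/connect-distributed.properties
```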
20 changes: 20 additions & 0 deletions src/config/connect-file-sink.properties
@@ -0,0 +1,20 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name=local-file-sink
connector.class=FileStreamSink
tasks.max=1
file=test.sink.txt
topics=connect-test
20 changes: 20 additions & 0 deletions src/config/connect-file-source.properties
@@ -0,0 +1,20 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=test.txt
topic=connect-test
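Unlike the worker configuration above, the file source/sink properties describe connectors; in the Apache Kafka quickstart they are passed to a standalone worker together with a worker configuration. A minimal sketch, assuming the standard Kafka binary distribution layout (connect-standalone.properties ships with Kafka):

```sh
# Run a standalone Connect worker loading the file source and file sink connectors
bin/connect-standalone.sh config/connect-standalone.properties \
    config/connect-file-source.properties config/connect-file-sink.properties
```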