Migrate Asn1DecodedDataRouter to use Spring Kafka #131
base: dev
Conversation
… payload string in UdpHexDecoder.java
… payload string in UdpHexDecoder.java
…r something to confirm testing
…removing from both, and comparing the expected and actual JSON
…/spring-kafka/udp-receivers
… cost of integration test
…outerApprovalTest
…cted.json to use expected encoded value
…Test_ValidSSM.txt and the resulting SsmReceiverTest_ValidSSM_expected.json
…erTest_ValidSPAT.txt and the resulting SpatReceiverTest_ValidSPAT_expected.json
…Test_ValidPSM.txt and the resulting PsmReceiverTest_ValidPSM_expected.json
…Test_ValidSRM.txt and the resulting SrmReceiverTest_ValidSRM_expected.json
…risk of race conditions
Since this test suite produces to and consumes from the same topics in multiple tests, we needed a better way than `getSingleRecord` to perform test assertions. By using UUIDs as keys, we are able to reuse the same topics across multiple tests and select the correct records for assertions without risk of tests conflicting with one another.
This change also helps simplify managing unique topic names for the disabledTopics set provided by OdeKafkaProperties. It is necessary to prevent data from other tests leaking onto the disabled topics and causing intermittent test failures in this suite.
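For illustration, here is a minimal sketch of the UUID-keyed assertion pattern described above. It is not the PR's actual test code; the helper name, flow, and test utilities (JUnit 5 plus spring-kafka-test's `KafkaTestUtils`) are assumptions.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.ArrayList;
import java.util.List;
import java.util.UUID;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.test.utils.KafkaTestUtils;

class UuidKeyedAssertionSketch {

  // Produce a payload with a random UUID key, then keep only the consumed records
  // carrying that key, so several tests can share the same topic without clashing.
  static ConsumerRecord<String, String> sendAndAwaitSingleRecord(
      KafkaTemplate<String, String> template,
      Consumer<String, String> consumer,
      String topic,
      String payload) {
    String key = UUID.randomUUID().toString();
    template.send(topic, key, payload);

    List<ConsumerRecord<String, String>> matching = new ArrayList<>();
    // getRecords polls until records arrive or its default timeout elapses.
    KafkaTestUtils.getRecords(consumer).forEach(record -> {
      if (key.equals(record.key())) {
        matching.add(record);
      }
    });
    assertEquals(1, matching.size(), "expected exactly one record keyed by " + key);
    return matching.get(0);
  }
}
```

The filter-by-key step is what lets several tests share one topic: each test only asserts on records carrying the key it produced.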
@@ -27,4 +27,4 @@ jobs:
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
run: |
ls -la && pwd
mvn -e -X clean org.jacoco:jacoco-maven-plugin:prepare-agent package sonar:sonar -Dsonar.projectKey=usdot.jpo.ode:jpo-ode -Dsonar.projectName=jpo-ode -Dsonar.organization=usdot-jpo-ode -Dsonar.host.url=https://sonarcloud.io -Dsonar.branch.name=$GITHUB_REF_NAME
Note: `-X` enables debug logging. It generated so much text that GH's runners were struggling to render test failures in the browser. Turning off debug logging doesn't mean we'll lose the information needed to determine the cause of test failures; it just makes finding failures in the logs easier.
Yeah I like this
on:
  pull_request:
  push:
    branches:
      - "develop"
      - "master"
      - "release/*"
Note: this change stops the CI workflow from running on every push to any branch. It will now run only on pushes to our main branches and whenever a PR is opened or updated, which should reduce unnecessary costs associated with running the workflow.
import us.dot.its.jpo.ode.util.SerializationUtils;

public class MessagingDeserializer<T> implements Deserializer<T> {

  SerializationUtils<T> deserializer = new SerializationUtils<T>();
note: default methods are provided for both `configure` and `close`, so overriding them is unnecessary
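As a small illustration of that point (not code from this PR): `org.apache.kafka.common.serialization.Deserializer` ships default no-op `configure` and `close` methods, so a minimal implementation only has to provide `deserialize`.

```java
import java.nio.charset.StandardCharsets;
import org.apache.kafka.common.serialization.Deserializer;

// Made-up example class: only deserialize is implemented because configure and
// close already have default implementations on the Deserializer interface.
public class PlainStringDeserializer implements Deserializer<String> {

  @Override
  public String deserialize(String topic, byte[] data) {
    return data == null ? null : new String(data, StandardCharsets.UTF_8);
  }

  // No configure(Map, boolean) or close() overrides needed.
}
```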
  SerializationUtils<T> serializer = new SerializationUtils<T>();

  @Override
note: default methods are provided for both `configure` and `close`, so overriding them is unnecessary
note: much of this code comes from the original implementations at Asn1DecodedDataListener and services/Asn1DecodedDataRouter
Looks great! All the tests look good and pass, and each message type's decoded sample data is accurate for the input/output. I just have one minor comment about a debug log.
However, decoded BSM messages don't seem to show up in the topic.OdeBsmJson or topic.OdeBsmPojo topics. The tests evidently pass for the BSM decode routing, but I wonder what the difference is between the test and a deployed environment.
id = "Asn1DecodedDataRouter", | ||
topics = "${ode.kafka.topics.asn1.decoder-output}" | ||
) | ||
public void listen(ConsumerRecord<String, String> consumerRecord) throws XmlUtilsException { |
I really like this now. Very clean!
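For readers less familiar with the annotation-driven style, here is a simplified, hypothetical sketch of the overall shape of such a listener. The class name, routing rule, and destination topics are illustrative assumptions, not the PR's exact implementation:

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

// Hypothetical sketch: Spring Kafka manages the consumer lifecycle for this
// @Component, invoking listen once per record from the decoder output topic
// and republishing the value to a destination chosen from the payload.
@Component
public class DecodedDataRouterSketch {

  private final KafkaTemplate<String, String> kafkaTemplate;

  public DecodedDataRouterSketch(KafkaTemplate<String, String> kafkaTemplate) {
    this.kafkaTemplate = kafkaTemplate;
  }

  @KafkaListener(id = "DecodedDataRouterSketch",
      topics = "${ode.kafka.topics.asn1.decoder-output}")
  public void listen(ConsumerRecord<String, String> consumerRecord) {
    // Illustrative routing rule only; the real router inspects the decoded message type.
    String destination = consumerRecord.value().contains("BasicSafetyMessage")
        ? "topic.OdeBsmJson"
        : "topic.OdeTimJson";
    kafkaTemplate.send(destination, consumerRecord.key(), consumerRecord.value());
  }
}
```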
jpo-ode-svcs/src/main/java/us/dot/its/jpo/ode/kafka/producer/LoggingProducerListener.java (outdated review thread, resolved)
…oggingProducerListener.java Co-authored-by: Drew Johnston <[email protected]>
/**
 * MessagingDeserializer is a generic base class implementing the Kafka Deserializer interface to
 * provide serialization of objects for use in Kafka messages.
 *
 * <p>This class uses a generic type parameter, allowing it to handle serialization
 * of various types. Internal serialization is performed using an instance of the SerializationUtils
 * class, which leverages Kryo for efficient object serialization.</p>
 *
 * <p>The class is declared as sealed, restricting which other classes can directly extend it. It
 * will soon be marked as final to prevent incorrect usage through unnecessary subtyping</p>
 *
 * @param <T> the type of data to be serialized
 */
(blocking) These comments indicate that the class is for serialization, but the class is for deserialization.
corrected in d7f2bbe
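For context, here is a sketch of the pattern that Javadoc documents (after the wording fix): a generic Kafka `Deserializer` delegating to the Kryo-backed `SerializationUtils`. The exact `SerializationUtils` method signature is assumed here, so treat this as illustrative rather than authoritative:

```java
import org.apache.kafka.common.serialization.Deserializer;
import us.dot.its.jpo.ode.util.SerializationUtils;

// Sketch only; assumes SerializationUtils exposes deserialize(byte[]) returning T.
public class MessagingDeserializerSketch<T> implements Deserializer<T> {

  private final SerializationUtils<T> deserializer = new SerializationUtils<>();

  @Override
  public T deserialize(String topic, byte[] data) {
    // Kryo performs the byte[] -> object conversion for the generic type T.
    return data == null ? null : deserializer.deserialize(data);
  }
}
```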
 * data by processing and forwarding it to different topics based on specific criteria.
 *
 * <p>This listener is specifically designed to handle decoded data produced by the asn1_codec.
 * Upon receiving a payload, it uses transforms the payload and then determines the appropriate
(blocking) Looks like a typo here: 'uses transforms' should probably just be 'transforms'
corrected in d7f2bbe
 *
 * @param kafkaTemplate the KafkaTemplate used for sending messages to Kafka topics.
 */
public Asn1DecodedDataRouter(KafkaTemplate<String, String> kafkaTemplate,
(non-blocking) Is the `@Autowired` annotation necessary here, or does the class's `@Component` annotation allow Spring to wire everything up without it?
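For what it's worth: since Spring 4.3, a bean with exactly one constructor is autowired through that constructor automatically, so `@Autowired` is optional there. A minimal illustrative sketch (names are made up):

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

// Illustrative only: with a single constructor, Spring injects its arguments
// without an explicit @Autowired annotation.
@Component
public class RouterWiringSketch {

  private final KafkaTemplate<String, String> kafkaTemplate;

  public RouterWiringSketch(KafkaTemplate<String, String> kafkaTemplate) {
    this.kafkaTemplate = kafkaTemplate;
  }
}
```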
marking as draft until I can figure out a nice way to handle the disabled topic exceptions blocking the rest of the code execution
…eptingKafkaTemplate This lets us prevent sends to disabled topics without throwing exceptions, gracefully blocking them where the previous exception-driven approach was interrupting normal code paths (a sketch of this idea follows the commit list below).
…thub.com/CDOT-CV/jpo-ode into mcook42/spring-kafka/asn1-decoded-router
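Below is a hedged sketch of the interception idea described in the "…eptingKafkaTemplate" commit above: a `KafkaTemplate` subclass that silently skips sends to disabled topics instead of throwing. The class name, constructor shape, and `CompletableFuture` return type (Spring Kafka 3.x) are assumptions and may differ from the PR's actual template:

```java
import java.util.Set;
import java.util.concurrent.CompletableFuture;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.SendResult;

// Assumed shape, not the PR's implementation: sends to disabled topics are
// dropped by returning an already-completed future, so callers continue
// without an exception interrupting the normal code path.
public class DisabledTopicAwareKafkaTemplate<K, V> extends KafkaTemplate<K, V> {

  private final Set<String> disabledTopics;

  public DisabledTopicAwareKafkaTemplate(ProducerFactory<K, V> producerFactory,
      Set<String> disabledTopics) {
    super(producerFactory);
    this.disabledTopics = disabledTopics;
  }

  @Override
  public CompletableFuture<SendResult<K, V>> send(String topic, K key, V data) {
    if (disabledTopics.contains(topic)) {
      return CompletableFuture.completedFuture(null);
    }
    return super.send(topic, key, data);
  }
}
```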
PR Details
Description
Related Issue
Motivation and Context
Implementing Spring Kafka gives us better lifecycle management of producers and consumers, more reusable producer/consumer code, easier testability, and a more robust, production-ready Kafka library. This is part of a larger effort to replace our hand-rolled Kafka implementation with Spring Kafka. The previous changesets related to this effort are #118, #116, #123, and #129.
How Has This Been Tested?
Unit and integration tests were added before making any functional changes, and they continue to pass after the code changes. Some additional unit and integration tests were added after the functionality was changed to increase coverage. I have also run all data from the `udpsender_[msgType].py` scripts found under scripts/tests through a live local system started with `make rebuild`. I confirmed there were no errors in the logs, and that all expected messages were produced to and consumed from the correct queues by using the kafka-ui container available at `localhost:8001` (on my local machine, of course).
Types of changes
Checklist:
ODE Contributing Guide