Update/kafka implementations #3763
Conversation
Updates from airqo staging
📝 Walkthrough
The changes made in this pull request focus on the Kafka configuration in src/workflows/airqo_etl_utils/message_broker_utils.py, replacing the hardcoded bootstrap servers with the value read from the configuration module.
Actionable comments posted: 1
🧹 Outside diff range and nitpick comments (1)
src/workflows/airqo_etl_utils/message_broker_utils.py (1)
38-38: Consider architectural improvements for better separation of concerns.
While reviewing the configuration change, I noticed some opportunities for architectural improvements:
- Consider splitting this class into separate KafkaProducer and KafkaConsumer classes, following the Single Responsibility Principle. This would make the code more maintainable and testable (see the sketch after these suggestions).
- Move configuration management to a dedicated class:
class KafkaConfig:
    def __init__(self, bootstrap_servers: str):
        self.validate_bootstrap_servers(bootstrap_servers)
        self.bootstrap_servers = bootstrap_servers

    @staticmethod
    def validate_bootstrap_servers(servers: str) -> None:
        if not servers:
            raise ValueError("Bootstrap servers configuration is required")

    def get_producer_config(self) -> dict:
        return {
            "bootstrap.servers": self.bootstrap_servers,
            "metadata.max.age.ms": 60000,
            # ... other producer-specific configs
        }

    def get_consumer_config(self, group_id: str) -> dict:
        return {
            "bootstrap.servers": self.bootstrap_servers,
            "group.id": group_id,
            # ... other consumer-specific configs
        }
- Consider moving constants like MAX_MESSAGE_SIZE to the configuration module for better centralization of configuration parameters.
Would you like me to help create a detailed design document for these architectural improvements?
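As a rough, non-authoritative sketch of what that split could look like, assuming the confluent-kafka client and the KafkaConfig class sketched above (the wrapper class and method names below are purely illustrative, not the module's actual API):

from confluent_kafka import Consumer, Producer


class KafkaMessageProducer:
    # Owns only publishing concerns; all settings come from KafkaConfig.
    def __init__(self, config: KafkaConfig):
        self._producer = Producer(config.get_producer_config())

    def publish(self, topic: str, value: bytes, key: str = None) -> None:
        self._producer.produce(topic, value=value, key=key)
        self._producer.flush()


class KafkaMessageConsumer:
    # Owns only consumption concerns; the group id is supplied per consumer.
    def __init__(self, config: KafkaConfig, group_id: str):
        self._consumer = Consumer(config.get_consumer_config(group_id))

    def poll_one(self, topic: str, timeout: float = 1.0):
        # Return a single message payload, or None if nothing arrived in time.
        self._consumer.subscribe([topic])
        msg = self._consumer.poll(timeout)
        if msg is None or msg.error():
            return None
        return msg.value()

Keeping each class this thin makes both sides easy to unit test with a fake config object.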
@@ -35,7 +35,7 @@ def __init__(self):
         # Note: This should be updated in case the number of partions used changes.
         self.partition_loads = {int(p): 0 for p in self.__partitions}
         self.config = {
-            "bootstrap.servers": "35.187.179.27:30200,34.79.78.204:30200,34.78.90.92:30200",
+            "bootstrap.servers": self.__bootstrap_servers,
💡 Codebase verification
🛠️ Refactor suggestion
Bootstrap servers configuration needs type validation
The change to use self.__bootstrap_servers is correct, but there's a potential issue: the configuration source (configuration.BOOTSTRAP_SERVERS) returns a list from split(","), while the Kafka client expects a string. Consider adding type conversion:
def __init__(self):
    bootstrap_servers = configuration.BOOTSTRAP_SERVERS
    self.__bootstrap_servers = ','.join(bootstrap_servers) if isinstance(bootstrap_servers, list) else bootstrap_servers
    if not self.__bootstrap_servers:
        raise ValueError("Bootstrap servers configuration is required")
- src/workflows/airqo_etl_utils/config.py: returns a list from split(",")
- src/workflows/airqo_etl_utils/message_broker_utils.py: uses the value directly in the Kafka config
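For reference, a minimal sketch of how the value could be normalized at the configuration boundary so that downstream code always receives a comma-separated string; the environment variable name and the helper function below are assumptions for illustration, not the repository's actual config.py code:

import os


def _bootstrap_servers_as_string(raw: str) -> str:
    # Turn "host1:9092, host2:9092" (or a single host) into a clean
    # comma-separated string, failing fast when nothing is configured.
    servers = [s.strip() for s in raw.split(",") if s.strip()]
    if not servers:
        raise ValueError("Bootstrap servers configuration is required")
    return ",".join(servers)


# Hypothetical usage inside config.py:
BOOTSTRAP_SERVERS = _bootstrap_servers_as_string(os.getenv("BOOTSTRAP_SERVERS", ""))

With this shape, message_broker_utils.py can pass the value straight into the Kafka client config without any isinstance checks.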
🔗 Analysis chain
Good improvement on configuration management! Consider adding validation.
The change to use self.__bootstrap_servers instead of a hardcoded value is a positive step towards better configuration management. To make this even more robust, consider these suggestions:
- Add validation for the bootstrap servers configuration:
 def __init__(self):
     self.__partitions = configuration.TOPIC_PARTITIONS
     self.__bootstrap_servers = configuration.BOOTSTRAP_SERVERS
+    if not self.__bootstrap_servers:
+        raise ValueError("Bootstrap servers configuration is required")
+    if not isinstance(self.__bootstrap_servers, str):
+        raise TypeError("Bootstrap servers must be a string")
     self.partition_loads = {int(p): 0 for p in self.__partitions}
     self.config = {
         "bootstrap.servers": self.__bootstrap_servers,
         "metadata.max.age.ms": 60000,
- Consider creating a configuration factory method to handle common configurations between Producer and Consumer. This would reduce code duplication in the publish_to_topic and consume_from_topic methods (see the sketch after this list).
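A minimal sketch of such a factory, assuming the confluent-kafka Producer and Consumer classes; the MessageBroker class name and the _client_config helper below are illustrative stand-ins for the existing utility class, not its actual API:

from confluent_kafka import Consumer, Producer


class MessageBroker:
    def __init__(self, bootstrap_servers: str):
        self._bootstrap_servers = bootstrap_servers

    def _client_config(self, extra: dict = None) -> dict:
        # Shared settings are built in one place; role-specific keys are layered on top.
        base = {
            "bootstrap.servers": self._bootstrap_servers,
            "metadata.max.age.ms": 60000,
        }
        if extra:
            base.update(extra)
        return base

    def publish_to_topic(self, topic: str, data: bytes) -> None:
        producer = Producer(self._client_config())
        producer.produce(topic, value=data)
        producer.flush()

    def consume_from_topic(self, topic: str, group_id: str):
        consumer = Consumer(
            self._client_config({"group.id": group_id, "auto.offset.reset": "earliest"})
        )
        consumer.subscribe([topic])
        return consumer.poll(1.0)

Both public methods then stay focused on produce/consume logic, and any new shared setting only has to be added once.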
Let's verify the bootstrap servers configuration usage:
Let me gather more information about the Kafka configuration usage to ensure a comprehensive verification.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Search for other potential hardcoded Kafka configurations
rg -g '*.py' '"bootstrap.servers":'
Length of output: 150
Script:
#!/bin/bash
# Check for configuration imports and usage
rg -g '*.py' "BOOTSTRAP_SERVERS" -A 2
# Look for any Kafka producer/consumer configuration patterns
ast-grep --pattern 'config = {
$$$
}'
# Search for configuration validation patterns
ast-grep --pattern 'if not $var:
raise ValueError($str)
'
Length of output: 2868
Description
Clean up
Summary by CodeRabbit
New Features
Bug Fixes