New morpheus stackstorm integration pack #176
base: master
Conversation
OK. I read through the pack's code. I have several questions and suggestions.
Overall, it looks like this pack gets logs from morpheus, saves them in a new mongo db, then pushes them to splunk. Why this architecture? What goals does this accomplish?
stackstorm-morpheus/README.md (outdated)

Actions are defined in two groups:

### Individual actions: GET, POST, PUT with under bar will precede each individual action
I don't think all of this is a heading.
Maybe call the heading "Action naming convention"?
And then describe how it's the HTTP method + underscore + action?
Thank you for the questions and suggestions. I cleaned up the Action heading.
Just a quick note on the architecture: I read logs from Morpheus on an interval timer. If the first run pulls 100 log entries and sends them directly to Splunk, and then 5 minutes later the next run does the same thing, chances are I will get duplicate entries. I don't want to send Splunk anything I have already sent, so I store the entries in a mongodb and mark them as unprocessed. When the splunk sender runs, it flags those records as processed, so the next run only pushes the 'unprocessed' entries to Splunk, avoiding duplicates. At the time it was the simplest answer.
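For context, the dedup flow described above might look roughly like the sketch below. This is illustrative only; the collection name, the `morpheus_id`/`processed` field names, and the `send_to_splunk` callable are assumptions, not the pack's actual code.

```python
from pymongo import MongoClient

# Hypothetical names -- the real pack's database, collection, and fields may differ.
client = MongoClient('mongodb://localhost:27017/')
logs = client['morpheus']['logs']

def store_morpheus_logs(entries):
    """Save pulled log entries, skipping ones we've already stored."""
    for entry in entries:
        # Upsert on the Morpheus log id so repeated pulls don't create duplicates.
        logs.update_one(
            {'morpheus_id': entry['id']},
            {'$setOnInsert': {'entry': entry, 'processed': False}},
            upsert=True,
        )

def push_unprocessed_to_splunk(send_to_splunk):
    """Send only unprocessed entries, then flag them as processed."""
    for doc in logs.find({'processed': False}):
        send_to_splunk(doc['entry'])  # assumed sender callable, not part of this pack
        logs.update_one({'_id': doc['_id']}, {'$set': {'processed': True}})
```

The upsert on the Morpheus log id is one way to make the pull step idempotent as well, so duplicates are filtered at write time rather than only at send time.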
stackstorm-morpheus/README.md (outdated)

* ``get_networks``
* ``get_alerts``

### Orquestra Workflows: will not
This section seems to be about mongo, not Orquesta.
Did you mean to have a separate note about how this pack won't have Orquesta Workflows?
Fixed as requested
stackstorm-morpheus/README.md (outdated)

This application uses the mongo db installed by StackStorm. Since the DB is secured
you will need to log into the StackStorm mongo DB as a StackStorm admin and create a separate DB

# To get this pack to work with A SINGLE HOST DEPLOYMENT StackStorm mongo DB
Hmm. This should probably be a subheading, so #### instead of #.
Same for the other headings after this.
Fixed as requested
stackstorm-morpheus/README.md (outdated)

You can ignore this section when using StackStorm in docker containers. There is
no username and password associated with the database running in the mongo container.
Use at your own discretion.
Maybe say something like: "If your mongo instance does not have auth enabled, then you don't need to provide a dbuser and dbpass."
There are so many ways to install StackStorm, I think it would be best to avoid discussing install methods here.
Agree, very confusing! Fixed as requested.
stackstorm-morpheus/lib/__init__.py (outdated)

@@ -0,0 +1 @@
#
__init__.py can be an empty file.
Removed the # symbol from file.
@@ -0,0 +1 @@
#
I think you can just drop this file. Do you plan to add a sensor?
File dropped.
parameters:
  logs:
    required: true
    type: array
Can you add an items schema?
Not sure how to add the schema; do I add it to the load-morpheus-logs.yaml file?
parameters:
  logs:
    required: true
    type: array
Same here: add an items schema.
Not sure how to add the schema; do I add it to the process_logs.yaml file?
I figured this out.
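For anyone reading along, an items schema on an array parameter in a StackStorm action metadata file generally looks like the sketch below. The object properties shown are purely illustrative assumptions; the pack's real Morpheus log fields may differ.

```yaml
parameters:
  logs:
    required: true
    type: array
    items:
      type: object
      # Illustrative properties only -- the actual Morpheus log fields may differ.
      properties:
        id:
          type: integer
        message:
          type: string
        ts:
          type: string
```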
@@ -0,0 +1 @@
#
Removed the comment.
# Uncomment dbuser & dbpass if using password protected mongo database
# dbuser = self.config['dbuser']
# dbpass = self.config['dbpass']

# If running stackstorm in a singlehost deployment use this command
# dbclient = MongoClient('mongodb://%s:%s@localhost:27017/' % (dbuser, dbpass))
People should not need to edit the pack to use its features. This should be refactored so that the db connection settings default to no user/pass, but allow configuring the db user, pass, host, and port.
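One way to implement that suggestion is sketched below. The dbuser/dbpass keys come from the discussion above, while dbhost/dbport are hypothetical names for the host and port settings the reviewer suggests adding; none of this is the pack's actual code.

```python
from pymongo import MongoClient

def get_db_client(config):
    """Build a MongoClient from pack config, using auth only when credentials are set."""
    host = config.get('dbhost') or 'localhost'
    port = config.get('dbport') or 27017
    user = config.get('dbuser')
    password = config.get('dbpass')

    if user and password:
        uri = 'mongodb://%s:%s@%s:%s/' % (user, password, host, port)
    else:
        # Default: no credentials, e.g. a mongo instance without auth enabled.
        uri = 'mongodb://%s:%s/' % (host, port)
    return MongoClient(uri)
```

With defaults like these, a stock install works without editing anything, and users with a secured mongo just fill in the optional config values.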
This one is tricky and my st2 skills are lacking when it comes to making this configurable later. I am thinking of dropping support for dbuser and dbpass. I used to run the st2 all-in-one installation, where the mongodb always had a password, so I figured out how to log into the secured mongo client and add another user. Then I bought a MacBook M1 and found I could run Docker Desktop and use st2 in containers (way cool), but that mongo did not require passwords.
A possible solution could be that I add them to the schema file as not required, which should allow the user to configure them in the GUI once the pack is installed? Maybe? Can you give me some guidance? Thank YOU!
So, I figured it out. I added dbuser and dbpass to the config.schema.yaml and not to the morpheus.yaml.example.
When I load the pack I can use the GUI to make sure dbuser and dbpass are blank. I can add and remove them, and st2 always updates the configs yaml file. WOW. I learned something new!
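The resulting config.schema.yaml entries would presumably look something like this sketch; the descriptions and the secret flag are assumptions rather than the pack's actual schema.

```yaml
# Sketch of optional credential entries in config.schema.yaml
dbuser:
  description: "MongoDB username (leave blank if auth is not enabled)"
  type: string
  required: false
dbpass:
  description: "MongoDB password (leave blank if auth is not enabled)"
  type: string
  required: false
  secret: true
```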
I'm not sure about including the morpheus=>splunk workflow in this pack, because I expect packs to focus on only one service. In any case, here's some feedback on that workflow + actions: passing around all of the logs like that could be resource intensive (e.g. the workflow context gets copied multiple times and it includes input parameters). Maybe you could use the
Also, to avoid passing large amounts of data via action input params, I would probably try to combine your actions. So, you could have one python action that:
That also has the benefit of simplifying the configuration so you don't need to know how to connect to mongo. Another thought: You can extend the
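The combined-action idea sketched above might look roughly like the following. Everything here is illustrative and not the reviewer's exact proposal or the pack's code: the Morpheus endpoint and query parameter, the Splunk HTTP Event Collector usage, the config keys, and the datastore checkpoint key are all assumptions.

```python
import requests
from st2common.runners.base_action import Action

class ForwardMorpheusLogs(Action):
    """Illustrative single action: pull new Morpheus logs and forward them to Splunk."""

    def run(self):
        # Remember the newest log id we've forwarded, so reruns skip duplicates
        # without needing a separate mongo database.
        last_id = self.action_service.get_value('morpheus_last_log_id') or '0'

        # Assumed Morpheus API shape -- adjust to the real endpoint and parameters.
        resp = requests.get(
            '%s/api/logs' % self.config['morpheus_url'],
            headers={'Authorization': 'Bearer %s' % self.config['morpheus_token']},
            params={'lastLogId': last_id},
        )
        resp.raise_for_status()
        logs = resp.json().get('logs', [])

        for entry in logs:
            # Assumes a Splunk HTTP Event Collector URL and token in the pack config.
            requests.post(
                '%s/services/collector/event' % self.config['splunk_hec_url'],
                headers={'Authorization': 'Splunk %s' % self.config['splunk_hec_token']},
                json={'event': entry},
            ).raise_for_status()

        if logs:
            self.action_service.set_value('morpheus_last_log_id', str(logs[-1]['id']))
        return len(logs)
```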