
New morpheus stackstorm integration pack #176

Open
xod442 wants to merge 7 commits into StackStorm-Exchange:master from xod442:new-morpheus

Conversation


@xod442 xod442 commented Feb 19, 2022

Pack for Morpheus


CLAassistant commented May 11, 2022

CLA assistant check
All committers have signed the CLA.

Member

@cognifloyd cognifloyd left a comment


OK. I read through the pack's code. I have several questions and suggestions.

Overall, it looks like this pack gets logs from morpheus, saves them in a new mongo db, then pushes them to splunk. Why this architecture? What goals does this accomplish?

Comment thread stackstorm-morpheus/README.md Outdated

Actions are defined in two groups:

### Individual actions: GET, POST, PUT with under bar will precede each individual action
Member


I don't think all of this is a heading.

Maybe call the heading "Action naming convention"?

And then describe how it's the http method underscore action?

Author

@xod442 xod442 May 25, 2022


Thank you for the questions and suggestions. I cleaned up the Action heading. Just a quick note on the architecture. When I read logs from Morpheus, I'm doing it on an interval timer. If the first time I pull 100 log entries and send them directly to Splunk, and then 5 minutes later I do it again, chances are I will get duplicate entries. I don't want to send anything to Splunk that I already sent, so I store them in a mongodb and mark them as unprocessed. Then when the Splunk sender runs, it sets the flag on those same records to processed. When the Splunk sender runs again, it will only push the 'unprocessed' entries to Splunk, avoiding any duplicate entries. At the time it was the simplest answer.
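The dedup flow described above could be sketched roughly like this (a sketch only, with hypothetical field names and a plain list standing in for the mongodb collection):

```python
# Sketch of the unprocessed/processed dedup flow (hypothetical field names;
# a plain list stands in for the mongodb collection in the real pack).
def store_logs(collection, entries):
    """Insert new Morpheus log entries marked unprocessed, skipping known IDs."""
    known = {doc["id"] for doc in collection}
    for entry in entries:
        if entry["id"] not in known:
            collection.append({**entry, "processed": False})

def send_to_splunk(collection, sender):
    """Send only unprocessed entries, then flag them as processed."""
    for doc in collection:
        if not doc["processed"]:
            sender(doc)
            doc["processed"] = True
```

A second pull that overlaps the first would then insert only the genuinely new entries, and the sender would skip everything already flagged.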

Comment thread stackstorm-morpheus/README.md Outdated
* ``get_networks``
* ``get_alerts``

### Orquestra Workflows: will not
Member


This section seems to be about mongo, not Orquesta.

Did you mean to have a separate note about how this pack won't have Orquesta Workflows?

Author


Fixed as requested

Comment thread stackstorm-morpheus/README.md Outdated
This application uses the mongo db installed by StackStorm. Since the DB is secured
you will need to log into the StackStorm mongo DB as a StackStorm admin and create a separate DB

# To get this pack to work with A SINGLE HOST DEPLOYMENT StackStorm mongo DB
Member


Hmm. This should probably be a subheading, so #### instead of #.
Same for the other headings after this.

Author


Fixed as requested

Comment thread stackstorm-morpheus/README.md Outdated
Comment on lines +54 to +56
You can ignore this section when using StackStorm in docker containers. There is
no username and password associated with the database running in the mongo container.
Use at your own discretion.
Member


Maybe say something like:

If your mongo instance does not have auth enabled, then you don't need to provide a dbuser and dbpass.

There are so many ways to install StackStorm, I think it would be best to avoid discussing install methods here.

Author


Agree very confusing! Fixed as requested.

Comment thread stackstorm-morpheus/lib/__init__.py Outdated
@@ -0,0 +1 @@
#
Member


Suggested change
#

__init__.py can be an empty file.

Author


Removed the # symbol from file.

Comment thread stackstorm-morpheus/sensors/readme.md Outdated
@@ -0,0 +1 @@
#
Member


I think you can just drop this file. Do you plan to add a sensor?

Author


File dropped.

parameters:
  logs:
    required: true
    type: array
Member


Can you add an items schema?

Author


Not sure how to add the schema, do I add it to the load-morpheus-logs.yaml file?
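(For reference, an items schema goes in the action metadata file, nested under the array parameter. A rough sketch for this case, with the item type assumed to be an object, might look like:)

```yaml
parameters:
  logs:
    required: true
    type: array
    items:
      type: object
```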

parameters:
  logs:
    required: true
    type: array
Member


items schema.

Author


Not sure how to add the schema, do I add it to the process_logs.yaml file?

Author


I figured this out.

@@ -0,0 +1 @@
#
Member


Suggested change
#

Author


removed comment

Comment on lines +49 to +55
# Uncomment dbuser & dbpass if using password protected mongo database
# dbuser = self.config['dbuser']
# dbpass = self.config['dbpass']

# If running stackstorm in a singlehost deployment use this command
# dbclient =
# MongoClient('mongodb://%s:%s@localhost:27017/' % (dbuser,dbpass))
Member


People should not need to edit the pack to use a pack's features. This should be refactored so that the db connection settings default to no user/pass, but allow configuring the db user, pass, host, and port.
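One possible shape for that refactor (a sketch, not the pack's actual code; config key names like `dbhost` are assumptions): build the connection URI from optional config values, falling back to an unauthenticated localhost connection.

```python
def build_mongo_uri(host="localhost", port=27017, dbuser=None, dbpass=None):
    """Build a MongoDB URI; include credentials only when both are set.
    (Real credentials with special characters would also need URL-quoting.)"""
    if dbuser and dbpass:
        return "mongodb://%s:%s@%s:%d/" % (dbuser, dbpass, host, port)
    return "mongodb://%s:%d/" % (host, port)

# The action could then do something like (hypothetical config keys):
#   dbclient = MongoClient(build_mongo_uri(
#       self.config.get("dbhost", "localhost"),
#       self.config.get("dbport", 27017),
#       self.config.get("dbuser"),
#       self.config.get("dbpass")))
```

With defaults like these, users with an unauthenticated mongo need to configure nothing, and everyone else just fills in the config values.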

Author

@xod442 xod442 May 25, 2022


This one is tricky and my st2 skills are lacking when it comes to allowing this to be configured later. I am thinking of dropping the support for dbuser and dbpass. I used to run the st2 all-in-one installation. The mongodb always had a password, so I figured out how to log in to the secure mongo client and add another user. Then I bought a MacBook M1 and found I could run Docker Desktop and use st2 in containers (way cool), but the mongo did not require passwords.

A possible solution could be I add them to the schema file as not required which should allow the user to configure them in the GUI once the pack is installed? Maybe? Can you give me some guidance? Thank YOU!

Author


So, I figured it out. I add the dbuser and dbpass to the config.schema.yaml and not to the morpheus.yaml.example.
When I load the pack I can use the GUI to make sure the dbuser and dbpass are blank. I can add and remove them and st2 always updates the configs yaml file. WOW. I learned something new!
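(For anyone following along, optional credentials in a pack's config.schema.yaml look roughly like this; descriptions are illustrative, and marking the password `secret` is the usual st2 convention:)

```yaml
dbuser:
  description: "MongoDB username (leave blank if auth is disabled)"
  type: "string"
  required: false
dbpass:
  description: "MongoDB password (leave blank if auth is disabled)"
  type: "string"
  required: false
  secret: true
```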

@cognifloyd
Member

I'm not sure about including the morpheus=>splunk workflow in this pack because I expect packs to focus on only one service. In any case, here's some feedback on that workflow + actions:

Passing around all of the logs like that could be resource intensive (e.g. the workflow context gets copied multiple times, and it includes input parameters).

Maybe you could use the action_service to store an ID in the datastore: https://docs.stackstorm.com/actions.html#action-service
So, in an action you

  • retrieve the last ID that you sent to splunk from the datastore,
  • loop through the new logs until you find the ID after the last ID sent,
  • send the remaining log entries
  • save a new last ID in the datastore

Also, to avoid passing large amounts of data via action input params, I would probably try to combine your actions. So, you could have one python action that:

  • gets logs from morpheus (yes, in addition to the get_logs action - it can reuse the morpheus API bits)
  • gets the last ID from the datastore
  • ships the new log entries to splunk
  • save the new last ID in the datastore

That also has the benefit of simplifying the configuration so you don't need to know how to connect to mongo.

Another thought: You can extend the get_logs action with something like an after_id input parameter. Then inside the action it would filter the logs and only return log entries after that ID.
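The last-ID filtering idea above could be sketched roughly like this (hypothetical names throughout; a plain dict stands in for the StackStorm key-value datastore reached via action_service, and log IDs are assumed to be comparable/increasing):

```python
# Sketch of the "last ID sent" approach (hypothetical names; a dict stands in
# for the StackStorm datastore exposed via action_service).
def ship_new_logs(logs, datastore, sender, key="morpheus_last_log_id"):
    """Send only log entries after the last ID recorded in the datastore,
    then record the newest ID that was sent."""
    last_id = datastore.get(key)
    new_entries = [e for e in logs if last_id is None or e["id"] > last_id]
    for entry in new_entries:
        sender(entry)
    if new_entries:
        datastore[key] = new_entries[-1]["id"]
    return new_entries
```

Run on an interval, overlapping pulls from Morpheus would then never produce duplicate entries in Splunk, and no mongo instance is needed at all.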
