I have Jobber configured and running in a Docker container. My jobs log hundreds of JSON lines to stderr/stdout. If I configure a job to redirect its output to a file, everything works nicely, except that I must attach a shell to the container to inspect the logs.
I would like the output of my scripts to be written to the container's stdout/stderr (similar to how Jobber's run log is handled), so that logs may be inspected with `docker logs -f <container_name>` or forwarded to an ELK stack using Docker's syslog log driver.
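For concreteness, the consumption I have in mind uses only standard Docker tooling (the syslog address and image name below are placeholders):

```sh
# Follow the job output interactively:
docker logs -f <container_name>

# Or ship every line to Logstash via Docker's built-in syslog log driver
# (the syslog address and image name are placeholders):
docker run -d \
  --log-driver syslog \
  --log-opt syslog-address=udp://logstash.example.com:514 \
  --name jobber my-jobber-image
```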
Result Sinks
Using `resultSinks:` with `type: stdout` is not an option, because the output is buffered and then wrapped in Jobber's status JSON. If hundreds of log entries occur during script execution, the resulting output is too long to send to a remote syslog listener (in my case, Logstash). I would like EACH log entry to be written individually and forwarded to a syslog listener for processing.
Sample log entry as received by ELK (this is from the shortest of all my commands):
The `stderr` field contains four syslog entries, each prefixed with `2020-10-02T02:29:03+00:00 INFO (6):`
```json
{
"fate": 0,
"job": {
"command": "/usr/local/bin/generateSitemap.sh",
"name": "eventsAll",
"status": "Good",
"time": "*/30 0 3"
},
"startTime": 1601605740,
"stderr": "2020-10-02T02:29:03+00:00 INFO (6): {\"function\":\"generateSitemap\",\"processor\":\"generateSitemap\",\"script\":\"/var/lib/blahblah/generateSitemap.php\",\"eventIds\":\"\",\"excludeEventIds\":\"19,44,74,75,222,1001,1011,1021,2000\",\"msg\":\"START\"}\n2020-10-02T02:29:03+00:00 INFO (6): {\"class\":\"Event_Dispatcher\",\"function\":\"processQueue\",\"processor\":\"processEvents-all\",\"script\":\"/var/lib/blahblah/generateSitemap.php\",\"eventIds\":\"\",\"excludeEventIds\":\"19,44,74,75,222,1001,1011,1021,2000\",\"eliminateDups\":0,\"batchSize\":1000,\"processAll\":0}\n2020-10-02T02:29:03+00:00 INFO (6): {\"class\":\"Event_Dispatcher\",\"function\":\"processQueue\",\"processor\":\"processEvents-all\",\"script\":\"/var/lib/blahblah/generateSitemap.php\",\"eventIds\":\"\",\"excludeEventIds\":\"19,44,74,75,222,1001,1011,1021,2000\",\"eliminateDups\":0,\"batchSize\":1000,\"processAll\":0,\"countEvents\":0,\"countProcessed\":0,\"countSkipped\":0,\"countError\":0,\"msg\":\"PROCESSED ALL EVENTS\"}\n2020-10-02T02:29:03+00:00 INFO (6): {\"function\":\"generateSitemap\",\"processor\":\"processEvents-all\",\"script\":\"/var/lib/blahblah/generateSitemap.php\",\"eventIds\":\"\",\"excludeEventIds\":\"19,44,74,75,222,1001,1011,1021,2000\",\"tt\":7,\"msg\":\"END\"}\n",
"succeeded": true,
"user": "jobberuser",
"version": "1.4"
}
```
Would it be possible to pass Jobber's `logPath` file handle to the command's stdout and stderr, instead of creating temporary files that buffer the command's output for return via result sinks?
This approach would still allow result sinks to use stdout and/or stderr: call `io.MultiWriter()`, passing `common.Logger.Writer()` or `common.ErrLogger.Writer()` along with the temporary file handle.
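A minimal, self-contained Go sketch of that idea (illustrative only, not Jobber's actual code; I substitute `os.Stdout`/`os.Stderr` for `common.Logger.Writer()`/`common.ErrLogger.Writer()` so the example runs standalone):

```go
package main

import (
	"fmt"
	"io"
	"os"
	"os/exec"
)

// runJob sketches the proposed wiring. os.Stdout and os.Stderr stand in
// for common.Logger.Writer() and common.ErrLogger.Writer(), and the temp
// files stand in for the buffers that result sinks consume.
func runJob(cmdPath string) error {
	cmd := exec.Command(cmdPath)

	// Temp files preserve the buffered copies for result sinks.
	outBuf, err := os.CreateTemp("", "jobber-stdout-")
	if err != nil {
		return err
	}
	defer outBuf.Close()

	errBuf, err := os.CreateTemp("", "jobber-stderr-")
	if err != nil {
		return err
	}
	defer errBuf.Close()

	// Tee each stream: one copy reaches the container's stdout/stderr
	// line by line as the job runs; the other lands in the temp file.
	cmd.Stdout = io.MultiWriter(os.Stdout, outBuf)
	cmd.Stderr = io.MultiWriter(os.Stderr, errBuf)

	return cmd.Run()
}

func main() {
	// Command path taken from the example job above.
	if err := runJob("/usr/local/bin/generateSitemap.sh"); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```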
Other options I have tried:
- Symlinking the log file to /dev/stderr (roughly as in the snippet below)
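This is roughly what the symlink attempt looked like (the log path is hypothetical):

```sh
# Attempt: point the job's log file at the container's stderr, hoping
# writes would surface in `docker logs`. (The log path is hypothetical.)
ln -sf /dev/stderr /var/log/jobber/eventsAll.log
```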
Any assistance in how to accomplish the above would be greatly appreciated.