
Audit Trail API

Learn how to use the Automox Audit Trail API with this quick tutorial and ready-to-use script.

In yet another security-related initiative from the CISA Secure by Design Pledge, we are very excited to announce the release of our Audit Trail API. This release enables security teams and IT administrators to retrieve audit logs from the Automox console and ingest them into their SIEM! Through this API, you can see who’s doing what – and where.
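If you just want to poke at the API before wiring anything up, a single curl call with your API key is enough. Fair warning: the endpoint path and query parameter below are illustrative assumptions on our part, so double-check the Automox API documentation for the exact URL and parameters:

curl -s \
  -H "Authorization: Bearer $AUTOMOX_API_KEY" \
  "https://console.automox.com/api/audit-service/v1/orgs/$AUTOMOX_ORG_UUID/events?date=$(date +%F)"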

And this is just Phase One. We plan on releasing Phase Two later this month which will include a GUI where you can audit this activity directly from the Automox console!

A Python Script to Get You Started

There are many creative ways you could utilize this API. To help you get started, our SecOps team put together a Python script you can run from a Linux host via a cron job. The script is designed to grab the latest logs from the API and store them in an AWS S3 bucket where a SIEM (e.g., Rapid7) can consume them. Feel free to use the script as-is, edit it for your use case, or simply treat it as inspiration. You’ll also find a couple of helpful bash scripts in there to help set everything up for you!
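To give you a feel for what the collector does under the hood, here’s a stripped-down sketch of the same idea. This is not the repo’s main.py, and the endpoint path, the "date" query parameter, and the S3 key naming are assumptions we made for illustration, so treat it as a sketch rather than a drop-in replacement:

# Stripped-down sketch of the collector idea (not the repo's main.py).
# The endpoint path, "date" parameter, and S3 key naming are assumptions;
# check the Automox API docs and the repo for the real details.
import datetime
import json
import os

import boto3
import requests
from dotenv import load_dotenv

API_URL = "https://console.automox.com/api/audit-service/v1/orgs/{org}/events"  # assumed path


def collect_and_upload():
    load_dotenv()  # pulls the AUTOMOX_* and AWS_* values from the .env file described below
    today = datetime.date.today().isoformat()

    # Fetch today's audit events from the Audit Trail API
    resp = requests.get(
        API_URL.format(org=os.environ["AUTOMOX_ORG_UUID"]),
        headers={"Authorization": f"Bearer {os.environ['AUTOMOX_API_KEY']}"},
        params={"date": today},  # assumed parameter name
        timeout=30,
    )
    resp.raise_for_status()

    # Drop the raw JSON into S3 so the SIEM can pick it up
    s3 = boto3.client("s3", region_name=os.environ["AWS_REGION"])
    s3.put_object(
        Bucket=os.environ["AWS_S3_BUCKET"],
        Key=f"automox-audit/{today}.json",
        Body=json.dumps(resp.json()).encode("utf-8"),
    )


if __name__ == "__main__":
    collect_and_upload()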

Initial Setup

First, clone the repository to the system that will be running the script:

git clone https://github.com/AutomoxSecurity/automox-tools.git

Then, cd into the directory where the script lives and set up a Python virtual environment:

cd automox-tools/audit-log-collector

python3 -m venv venv

source venv/bin/activate

Pro Tip: You can usually tell your virtual environment is activated because you’ll see something like (venv) in your shell prompt. If you give your virtual environment a different name, you’ll see that instead.
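For instance, once the environment is activated, you can double-check which Python binary your shell will use (the path below is just a placeholder for wherever you cloned the repo):

$ source venv/bin/activate
(venv) $ which python3
/full/path/to/automox-tools/audit-log-collector/venv/bin/python3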

And now, install the Python packages needed to run the script:

pip install -r requirements.txt

This should install requests, python-dotenv, and boto3 (for talking to AWS S3).

The .env File

To run the script, you’ll have to pass along some credentials and other pertinent variables, such as the name of the AWS S3 bucket. To do that, create a .env file in the directory where the script lives. Good news! There’s already an example included in the repo you cloned earlier: example.env

Make a copy of the file with cp example.env .env, then fill in the blank variables (there’s a filled-in example after the list):

AUTOMOX_ORG_UUID=

AUTOMOX_API_KEY=

AWS_ACCESS_KEY_ID=

AWS_SECRET_ACCESS_KEY=

AWS_REGION=

AWS_S3_BUCKET=
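
For reference, a populated .env looks something like this (every value below is a placeholder, not a real credential):

AUTOMOX_ORG_UUID=00000000-0000-0000-0000-000000000000
AUTOMOX_API_KEY=your-automox-api-key-here
AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
AWS_SECRET_ACCESS_KEY=****************************************
AWS_REGION=us-west-2
AWS_S3_BUCKET=my-automox-audit-logs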

Pro Tip: Lock down the permissions so only the owner can view the file. You can do this by running chmod 600 .env.

Testing the Script

Now it’s time to test the script. If the test goes well, you can set up a cron job to run it automatically.

Try it out by running the following:

python3 main.py

If the script runs without errors, you should be good to go. You can also verify by checking the S3 bucket and confirming the logs are showing up there.
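If you have the AWS CLI installed, one quick way to do that is to list the bucket contents (the bucket name here is a placeholder):

aws s3 ls s3://my-automox-audit-logs/ --recursive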

If for some reason the script doesn’t work, the output/errors it provides should help point you to what is wrong or what might be missing. 

The Cron Job

Now that the script works, it’s time to tell the system to run it automagically:

  1. Run crontab -e

  2. Add a new line with the following and save it:

9-59/10 * * * * /Full/Path/To/venv/bin/python3 /Full/Path/To/main.py >> /Full/Path/To/cron.log 2>&1

Here’s the breakdown of what each part means:

  • 9-59/10 * * * * = The cron schedule. The minute field 9-59/10 means “run every 10 minutes, at 9, 19, 29 (and so on) minutes past the hour,” and the four asterisks cover every hour, day, month, and day of the week. It’s set up this way for a reason (See: Important Note!). If you want to run the script on a different cadence, check out https://crontab.cronhub.io/ to help you build a cron expression.

  • /Full/Path/To/venv/bin/python3 = Where the Python 3 binary from your virtual environment lives on your system. You can find it inside the venv folder at venv/bin/python3. It’s important to specify this binary because the script relies on third-party Python packages that are only installed in the virtual environment.

  • /Full/Path/To/main.py = Where the script lives (where you cloned the repo)

  • >> = Redirects and appends the script’s output to a file of your choosing.

  • /Full/Path/To/cron.log 2>&1 = Where you want to log the script’s operations. Because we included 2>&1, both stdout and stderr will be logged.

Here is how our cron expression looks on our test VM:

9-59/10 * * * * /home/parallels/automox-audit-log/venv/bin/python3 /home/parallels/automox-audit-log/main.py >> /home/parallels/automox-audit-log/cron.log 2>&1

Once you save the crontab, the script should now run as you specified. You can see the results by referring to the cron.log file you put in the cron expression.

Important note: To ensure the script captures all of the logs, we recommend, at a minimum, running the cron at the end of each day at 11:59 PM. This matters because the script grabs logs for the current date at the time it runs. If we set our cron to run at the top of every hour, its last run on 7/16/2024 fires at 11 PM, and any logs generated between 11 PM and midnight would be missed.

A safe bet is to run the script every 10 minutes like so:

9-59/10  * * * *

In short, make sure the script runs at 11:59 PM each day so you gather all of the logs for that day.
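If the every-10-minutes cadence is more than you need, you could at least pair your preferred schedule with a dedicated end-of-day entry, for example:

59 23 * * * /Full/Path/To/venv/bin/python3 /Full/Path/To/main.py >> /Full/Path/To/cron.log 2>&1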

Audit Trail API: Wrapping it Up

And that’s it. 

If everything is configured correctly, any new audit logs from the Audit Trail API should now be going to AWS S3 where they can be consumed by the SIEM of your choosing.
