Version: 1.1.1

Azure Monitor Agent

Overview

For SFTP Gateway v3.3.3 and later, we use Ubuntu 22 (instead of Ubuntu 20). The OMS agent, which was previously used to send custom logs to Azure Monitor, is no longer compatible, because Ubuntu 22 does not allow OpenSSL v1 (a dependency of the OMS agent) for security reasons.

This article shows you how to install the newer Azure Monitor Agent (AMA) for sending custom logs to your Azure Monitor Log Analytics Workspace (LAW).

The high level steps are as follows:

  1. Create a Log Analytics Workspace
  2. Within the Log Analytics Workspace, create a Custom Log table
  3. Create a Data Collection Endpoint (DCE), a dependency you'll need later
  4. Create a Data Collection Rule (DCR). This will automatically create a couple of dependencies for you:
    • Installs the Azure Monitor Agent (AMA) on the VM
    • Enables the System Managed Identity on the VM
  5. Grant the System Managed Identity permissions to the Log Analytics Workspace

There are a lot of elements that need to wire together properly for this to work, and certain elements must be created in a specific order.

Create a Log Analytics Workspace

Azure Monitor is a service for aggregating logs and metrics. This information is organized into Log Analytics Workspaces.

  1. In the Azure Portal, go to the Log Analytics Workspace service

  2. Click Create

  3. Choose your Subscription

  4. Create a new Resource Group

  5. Give the Log Analytics Workspace a Name

  6. Choose your Region. Keeping all your resources in the same Region will make troubleshooting easier.

    Create Log Analytics Workspace

  7. Click Review + Create
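
If you prefer to script this step, a roughly equivalent Azure CLI sketch is shown below. The resource group name, workspace name, and region are placeholder values; adjust them to match your environment:

# Create a resource group and a Log Analytics Workspace (names/region are examples)
az group create --name sftpgw-logs-rg --location eastus
az monitor log-analytics workspace create \
  --resource-group sftpgw-logs-rg \
  --workspace-name sftpgw-law \
  --location eastus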

Create a Custom Log Table

The SFTP Gateway logs do not match a standard schema. So, you will need to create a new Table in the Log Analytics Workspace to store them.

SFTP Gateway has two main logs:

  • Application Log: Events for troubleshooting the Java application
  • SFTP Audit Log: SFTP actions and authentication attempts

In this section, you will first configure one log type. Then, you will repeat the process for the other log type.

  1. Open the Log Analytics Workspace you created in the previous section

  2. Under Settings, go to Tables

  3. On the Tables tab, click Create and choose New custom log (MMA-based)

    Add Custom Log

  4. This will open the Create a custom log wizard

  5. Download this example application log file

  6. On the Sample tab, click the Select a file button to upload the example application log file.

  7. Click Next

  8. On the Record delimiter tab, make sure it's set to New line.

    record-delimiter

  9. Click Next

  10. On the Collection paths tab, use the following settings:

    • Type: Linux

    • Path: /opt/sftpgw/log/application-*.log

      collection-path

  11. Click Next

  12. On the Details tab, set the Custom log name to ApplicationLog

    custom-log-name

  13. Click Next

  14. On the Review + Create tab, click Create

Repeat the process for the SFTP Audit Log:

  • Custom log name: SFTPAuditLog
  • Sample log file: sftp-audit log file
  • Collection path: /opt/sftpgw/log/sftp-audit-*.log

IMPORTANT: Once the custom logs have been created, notice that the new tables are labeled Custom table (classic). This is because the tables were created via an MMA-based custom log, which applies the classic label to the table.

Custom table (classic) label

This can be an issue: if you try to create a Data Collection Rule (DCR) that specifies a classic table as the Data Source, the creation will fail with this error message:

"Classic (MMA-based) custom log tables for stream 'Custom-SFTPAudit_CL' with desitnation 'la-633823975' are not supported in Data Collection Rules. Please migrate to a Data Collection Rule based table to receive custom logs."

To get around this error so you can specify the custom logs on the DCR, you need to do a little manual configuration. Navigate back to the Tables section of your Log Analytics Workspace. To the far right of each table, click the three dots and select Edit schema.

To remove the classic label, select the Migrate to manual schema management button.

Migrate to manual schema management

After migrating to manual schema management, go back to the Tables section and you'll notice your table no longer has the classic label.

Table without the classic label

Now, when you try to later specify your custom table as a data source for your DCR, you won't run into an error message and can proceed with the following steps.
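
If you want to confirm the table from the command line, the Azure CLI can display the table definition. This is just a sketch; the resource group and workspace names are placeholders:

# Inspect the custom table definition in the workspace (names are examples)
az monitor log-analytics workspace table show \
  --resource-group sftpgw-logs-rg \
  --workspace-name sftpgw-law \
  --name ApplicationLog_CL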

Create a Data Collection Endpoint (DCE)

A Data Collection Endpoint is a dependency that you will need to support Custom Text Logs. Without it, you are limited to collecting Linux Syslogs.

  1. In the Azure Portal, go to the Azure Monitor service

  2. Under Settings, click Data Collection Endpoints

  3. Click + Create

    Create Data Collection Endpoint

  4. For Endpoint Name, use azure-monitor-agent-endpoint

  5. For the Resource Group, use the same one as your VM

  6. Region: Use the same one as your VM

    Set Endpoint name

  7. Click Review + create

  8. Click Create
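
The equivalent Azure CLI call looks roughly like the sketch below. Note that the data-collection commands live in the monitor-control-service extension, and the resource group and region are placeholders:

# The data-collection commands require the monitor-control-service extension
az extension add --name monitor-control-service

# Create the Data Collection Endpoint (names/region are examples)
az monitor data-collection endpoint create \
  --name azure-monitor-agent-endpoint \
  --resource-group sftpgw-vm-rg \
  --location eastus \
  --public-network-access Enabled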

Stop your SFTP Gateway VM (IMPORTANT)

Due to a bug in the current version of the Azure Monitor Agent, we need to do a little bit of a workaround to get both sets of logs aggregated into the Log Analytics Workspace.

Essentially, you want to STOP your virtual machine before creating the first Data Collection Rule (DCR). Then, once the first DCR is created, start the VM and create the second DCR:

Stop the VM

The reasoning behind this is that if the VM is running when you create the first DCR, the AMA agent is installed immediately. In that case, the agent only streams the logs specified in the first DCR and ignores the second DCR entirely.

If the VM is stopped during the creation of the first DCR, the extension won't be installed, since you can't install extensions on a stopped VM. Then, when you start the VM again and create the second DCR, the AMA extension is installed at that point, and it picks up both DCRs.

So, stop your VM before creating the first DCR, start it, then create the second DCR.
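
From the CLI, stopping and starting the VM looks like this sketch (the VM and resource group names are placeholders; deallocating also releases the compute allocation, which is usually what you want for a stopped VM):

# Stop (deallocate) the VM before creating the first DCR (names are examples)
az vm deallocate --resource-group sftpgw-vm-rg --name sftpgw-vm

# ...create the first DCR while the VM is stopped...

# Start the VM again before creating the second DCR
az vm start --resource-group sftpgw-vm-rg --name sftpgw-vm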

If you're running into the above issue of only seeing one set of logs, follow the instructions below.

Troubleshooting only one set of logs appearing on the LAW

To get the second set of logs aggregated into the LAW, we need to take these steps:

  1. Uninstall the Azure Monitor Agent Extension on the VM:

    AMA Extension Uninstall

  2. Delete the DCR for the log which doesn't appear on the LAW and recreate it:

    Delete DCR

When you recreate the DCR, it will reinstall the AMA extension and should add the logs which were missing to the LAW.
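
If you'd rather remove the extension from the CLI, a sketch like the following should work (the VM and resource group names are placeholders):

# Remove the AMA extension from the VM (names are examples)
az vm extension delete \
  --resource-group sftpgw-vm-rg \
  --vm-name sftpgw-vm \
  --name AzureMonitorLinuxAgent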

Create a Data Collection Rule (DCR)

The Data Collection Rule (DCR) does most of the heavy lifting. It wires the VM to the Log Analytics Workspace, and defines what kind of logs are collected.

The DCR also creates a couple of dependencies for you automatically:

  • Installs the Azure Monitor Agent (AMA) on the VM
  • Enables the System Managed Identity on the VM

To create a DCR:

  1. In the Azure Portal, go to the Azure Monitor service

  2. Under Settings, click Data Collection Rules

  3. Click + Create

    Create Data Collection Rule

  4. On the Basics tab, configure the following:

    • Rule Name: Use ApplicationLogDCR

    • Resource Group: Use the same one as your Log Analytics Workspace

    • Region: Important: make sure this matches your VM's region

    • Platform Type: Use Linux

    • Data collection endpoint: Set it to azure-monitor-agent-endpoint, which you just created

      Data Collection Rule Basics tab

  5. Click Next: Resources

  6. On the Resources tab, do the following:

    • Click + Add resources

    • Check the box next to your VM, and click Apply

    • Click the checkbox for Enable Data Collection Endpoints

    • In the table column for Data collection endpoint, choose azure-monitor-agent-endpoint in the drop-down menu

      Data Collection Rule Resources tab

  7. Click Next: Collect and deliver

  8. On the Collect and deliver tab, do the following:

    • Click + Add data source
    • For Data source type, choose Custom Text Logs from the drop-down menu. Note: you need to set the DCE on the Basics tab for this option to be available

    This opens an Add data source modal window.

  9. On the Data source tab, set the following options:

    • Data source type: Custom Text Logs

    • File pattern: /opt/sftpgw/log/application-*.log

    • Table name: ApplicationLog_CL

    • Transform: source

      Collect and Deliver tab

  10. At the bottom of the screen, click Add data source

  11. Click Review + create

  12. Click Create

Repeat this process to create a DCR for the SFTP Audit logs.
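
If you want to script the DCR instead of using the portal, the CLI accepts a JSON rule file. The sketch below is only an outline of what such a file typically contains for a custom text log; the subscription and resource IDs are placeholders, and the exact file schema and CLI flags may differ between versions of the monitor-control-service extension:

# Outline of a DCR rule file for the application log (IDs are placeholders)
cat > application-log-dcr.json <<'EOF'
{
  "location": "eastus",
  "properties": {
    "dataCollectionEndpointId": "/subscriptions/<sub-id>/resourceGroups/sftpgw-vm-rg/providers/Microsoft.Insights/dataCollectionEndpoints/azure-monitor-agent-endpoint",
    "streamDeclarations": {
      "Custom-ApplicationLog_CL": {
        "columns": [
          { "name": "TimeGenerated", "type": "datetime" },
          { "name": "RawData", "type": "string" }
        ]
      }
    },
    "dataSources": {
      "logFiles": [
        {
          "name": "applicationLogFile",
          "streams": [ "Custom-ApplicationLog_CL" ],
          "filePatterns": [ "/opt/sftpgw/log/application-*.log" ],
          "format": "text",
          "settings": { "text": { "recordStartTimestampFormat": "ISO 8601" } }
        }
      ]
    },
    "destinations": {
      "logAnalytics": [
        {
          "name": "lawDestination",
          "workspaceResourceId": "/subscriptions/<sub-id>/resourceGroups/sftpgw-logs-rg/providers/Microsoft.OperationalInsights/workspaces/sftpgw-law"
        }
      ]
    },
    "dataFlows": [
      {
        "streams": [ "Custom-ApplicationLog_CL" ],
        "destinations": [ "lawDestination" ],
        "transformKql": "source",
        "outputStream": "Custom-ApplicationLog_CL"
      }
    ]
  }
}
EOF

# Create the DCR from the rule file (names/region are examples)
az monitor data-collection rule create \
  --resource-group sftpgw-vm-rg \
  --name ApplicationLogDCR \
  --location eastus \
  --rule-file application-log-dcr.json

Note that a DCR created this way still needs to be associated with the VM, either on the Resources tab in the portal or with az monitor data-collection rule association create.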

Verify the Azure Monitor Agent installation

When you created the DCR, Azure automatically installed the Azure Monitor Agent (AMA) on the VM. To verify this, do the following:

  1. Go to the VM detail page in the Azure Portal

  2. Under Settings, go to Extensions + applications

  3. You should see the AzureMonitorLinuxAgent with a status of Provisioning succeeded

    VM extension installed
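
You can also check the extension from the CLI (the VM and resource group names are placeholders):

# List extensions installed on the VM and their provisioning state (names are examples)
az vm extension list \
  --resource-group sftpgw-vm-rg \
  --vm-name sftpgw-vm \
  --output table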

Grant the VM permissions to send logs

In this section, you will configure the VM permissions for writing logs.

  1. Go to the VM detail page in the Azure Portal

  2. Under Settings, go to Identity

  3. Check the Status. It should be On (the default is Off).

    Verify System Identity

  4. Under Permissions, click Azure role assignments

  5. Click +Add role assignment

  6. Under Scope, select Resource group

  7. For Resource group, choose the one containing the Log Analytics Workspace

  8. For Role, select Contributor

    Add role assignment

  9. Toward the bottom, click Save
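
The same identity and role assignment steps can be scripted roughly as follows. The names are placeholders, and the role mirrors the portal steps above:

# Enable the system-assigned managed identity on the VM and capture its principal ID
PRINCIPAL_ID=$(az vm identity assign \
  --resource-group sftpgw-vm-rg \
  --name sftpgw-vm \
  --query systemAssignedIdentity \
  --output tsv)

# Grant that identity the Contributor role on the resource group containing the workspace
az role assignment create \
  --assignee "$PRINCIPAL_ID" \
  --role Contributor \
  --scope $(az group show --name sftpgw-logs-rg --query id --output tsv)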

Check for incoming logs

At this point, everything should be wired properly. In this section, you are going to check the Log Analytics Workspace for any incoming logs.

  1. In the Azure Portal, search for Log Analytics workspaces

  2. Select the Log Analytics Workspace you created earlier

  3. In the left menu, under General, click Logs

  4. If you see a Queries modal, click the X on the top right to close it

  5. Type the following query:

ApplicationLog_CL
| project RawData

You should see rows of results in the table below.

log query
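
If you prefer to query from the command line, the log-analytics CLI extension can run the same query. This is a sketch; replace the GUID with your workspace's Workspace ID (shown on the Log Analytics Workspace Overview page):

# Requires the log-analytics CLI extension
az extension add --name log-analytics

# Run the same query against the workspace (replace the GUID with your Workspace ID)
az monitor log-analytics query \
  --workspace 00000000-0000-0000-0000-000000000000 \
  --analytics-query "ApplicationLog_CL | project RawData" \
  --output table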

Troubleshooting

There are a lot of moving pieces, and there are many things that can go wrong. This section has some troubleshooting steps you can try.

No logs are showing up in the Log Analytics workspace query

  • Try waiting an hour to give the incoming logs from the VM a chance to arrive
  • Try generating log activity on the VM. For example, restart the Java service: service sftpgw-admin-api restart
  • Make sure the Time range covers the log entries on the VM (e.g., try selecting Last 7 days)
  • Run the query Heartbeat. This will show the latest heartbeat from the VM.

heartbeat
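
You can also check the agent itself on the VM. This is a hedged sketch; azuremonitoragent is the standard service name for AMA on Linux, but details can vary between agent versions:

# On the VM: check that the Azure Monitor Agent service is running
sudo systemctl status azuremonitoragent

# Restart it if it is in a failed state
sudo systemctl restart azuremonitoragent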

The Azure Monitor Agent extension is not installing

Try manually installing the AMA software:

az vm extension set --name AzureMonitorLinuxAgent --publisher Microsoft.Azure.Monitor --ids /subscriptions/abc-123/resourceGroups/rob-vm/providers/Microsoft.Compute/virtualMachines/rob-vm  --enable-auto-upgrade true

You will need to supply the VM's Resource ID for the --ids parameter.

To get the VM's Resource ID:

  1. Go to the VM's detail page

  2. Under Settings, click Properties

  3. Scroll down, and look for Resource ID
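
You can also look up the Resource ID with the CLI, which is handy for dropping straight into the --ids parameter (the VM and resource group names are placeholders):

# Look up the VM's Resource ID (names are examples)
az vm show \
  --resource-group sftpgw-vm-rg \
  --name sftpgw-vm \
  --query id \
  --output tsv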

The option for Custom Text Logs does not show up on the DCR

The Custom Text Logs option only shows up if you have a DCE set on the Basics tab. Try creating the DCE first.

Note: The Custom radio button (next to Windows and Linux) does not give you custom text logs. Rather, this setting means "both" Windows and Linux.

Custom Log troubleshooting