
Cisco Umbrella IP

Overview

Cisco Umbrella offers flexible, cloud-delivered security. It combines multiple security functions into one solution, so that protection can be extended to devices, remote users, and distributed locations anywhere. Cisco Umbrella is a leading provider of network security and recursive DNS services.

Event Categories

The following table lists the data sources offered by this integration.

Data Source               Description
Host network interface    Every packet is logged
Netflow/Enclave netflow   Umbrella IP logs are Netflow-like
Network device logs       Packets logged by Umbrella IP
Network protocol analysis Traffic analysis at layers 2/3/4

Event Samples

Find below a few samples of events and how they are normalized by Sekoia.io.

{
    "message": " \"2020-06-12 14:31:52\",\"FR123\",\"1.1.1.1\",\"54128\",\"2.2.2.2\",\"443\",\"\",\"Roaming Computers\"",
    "event": {
        "outcome": "success"
    },
    "@timestamp": "2020-06-12T14:31:52Z",
    "action": {
        "name": "block",
        "outcome": "success",
        "target": "network-traffic"
    },
    "destination": {
        "address": "2.2.2.2",
        "ip": "2.2.2.2",
        "port": 443
    },
    "host": {
        "hostname": "FR123",
        "name": "FR123"
    },
    "related": {
        "hosts": [
            "FR123"
        ],
        "ip": [
            "1.1.1.1",
            "2.2.2.2"
        ]
    },
    "source": {
        "address": "1.1.1.1",
        "ip": "1.1.1.1",
        "port": 54128
    }
}
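For illustration, the raw message above is a quoted CSV line. The following is a minimal sketch of splitting such a line into named fields; the field order is taken from the sample, and the dictionary keys are illustrative, not the parser's actual mapping:

```python
import csv
import io

def parse_umbrella_ip_line(line: str) -> dict:
    """Split one Cisco Umbrella IP log line (quoted CSV) into a flat dict.

    Field order inferred from the sample above: timestamp, identity,
    source IP, source port, destination IP, destination port,
    categories, identity type.
    """
    row = next(csv.reader(io.StringIO(line.strip())))
    timestamp, identity, src_ip, src_port, dst_ip, dst_port, categories, identity_type = row
    return {
        "@timestamp": timestamp.replace(" ", "T") + "Z",
        "host.name": identity,
        "source.ip": src_ip,
        "source.port": int(src_port),
        "destination.ip": dst_ip,
        "destination.port": int(dst_port),
        "categories": categories,
        "identity.type": identity_type,
    }

sample = '"2020-06-12 14:31:52","FR123","1.1.1.1","54128","2.2.2.2","443","","Roaming Computers"'
event = parse_umbrella_ip_line(sample)
```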

Extracted Fields

The following table lists the fields that are extracted, normalized under the ECS format, analyzed and indexed by the parser. It should be noted that inferred fields are not listed.

Name Type Description
@timestamp date Date/time when the event originated.
action.target keyword Target of the action
destination.ip ip IP address of the destination.
destination.port long Port of the destination.
host.hostname keyword Hostname of the host.
source.ip ip IP address of the source.
source.port long Port of the source.

Configure

This section will guide you through configuring the forwarding of Cisco Umbrella logs to Sekoia.io by means of AWS S3 buckets.

Prerequisites

  • Administrator access to the Cisco Umbrella console
  • Access to Sekoia.io Intakes and Playbook pages with write permissions
  • Access to AWS S3 and AWS SQS

Create an AWS S3 Bucket

To create a new AWS S3 Bucket, please refer to this guide.

  1. On the AWS S3 console, go to Buckets and select your bucket.
  2. Select the Permissions tab and go to the Bucket Policy section.
  3. Click Edit and paste the JSON bucket policy provided by Cisco Umbrella.
  4. In the policy, replace the bucketname placeholder with the name of your bucket.
  5. Click Save changes.

Important

Keep in mind to keep the /* suffix when defining the resource in the policy.
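For illustration, the Resource entries in the pasted policy typically look like the following, with MY-BUCKET standing in for your bucket name; the /* variant is the one that grants access to the objects inside the bucket (this is a simplified, hypothetical fragment, not the exact policy supplied by Cisco Umbrella):

```json
{
  "Resource": [
    "arn:aws:s3:::MY-BUCKET",
    "arn:aws:s3:::MY-BUCKET/*"
  ]
}
```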

Configure Cisco Umbrella

  1. Log in to the Cisco Umbrella console
  2. Go to Admin > Log Management
  3. In the Amazon S3 section, select Use your company-managed Amazon S3 bucket
  4. In Amazon S3 bucket, type the name of your bucket and click Verify.

  5. On your AWS console, go to your bucket.

  6. In the Objects tab, click on README_FROM_UMBRELLA.txt then click on Open
  7. Copy the token from the readme
  8. On the Cisco Umbrella console, in the field Token Number, paste the token and click Save

Note

After clicking Verify, the message "Great! We successfully verified your Amazon S3 bucket" must be displayed.

Note

After clicking Save, the message "We're sending data to your S3 storage" must be displayed.

Important

Depending on the type of the logs, the objects will be prefixed with dnslogs/ for DNS logs, proxylogs/ for proxy logs, iplogs/ for IP logs, and so on.
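The prefix routing described above can be sketched as follows (illustrative only; the actual object keys may include additional path components after the prefix):

```python
# Map the object-key prefixes mentioned above to a log type.
PREFIXES = {
    "dnslogs/": "dns",
    "proxylogs/": "proxy",
    "iplogs/": "ip",
}

def log_type(key: str) -> str:
    """Return the log type of an S3 object key based on its prefix."""
    for prefix, kind in PREFIXES.items():
        if key.startswith(prefix):
            return kind
    return "unknown"
```

For this integration, only keys under iplogs/ are relevant, which is why the S3 Event Notification below is filtered on that prefix.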

Create a SQS queue

The collection relies on S3 Event Notifications delivered to SQS to detect new S3 objects.

  1. Create a queue in the SQS service by following this guide
  2. In the Access Policy step, choose the advanced configuration and adapt this configuration sample with your own SQS Amazon Resource Name (ARN) (the main change is the Service directive allowing S3 bucket access):
    {
      "Version": "2008-10-17",
      "Id": "__default_policy_ID",
      "Statement": [
        {
          "Sid": "__owner_statement",
          "Effect": "Allow",
          "Principal": {
            "Service": "s3.amazonaws.com"
          },
          "Action": "SQS:SendMessage",
          "Resource": "arn:aws:sqs:XXX:XXX"
        }
      ]
    }
    

Important

Keep in mind that you have to create the SQS queue in the same region as the S3 bucket you want to watch.
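As a common hardening step (optional, not required by this guide), the statement can also restrict which bucket is allowed to send messages by adding a Condition on aws:SourceArn; MY-BUCKET is a placeholder for your bucket name:

```json
{
  "Effect": "Allow",
  "Principal": { "Service": "s3.amazonaws.com" },
  "Action": "SQS:SendMessage",
  "Resource": "arn:aws:sqs:XXX:XXX",
  "Condition": {
    "ArnEquals": { "aws:SourceArn": "arn:aws:s3:::MY-BUCKET" }
  }
}
```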

Create an S3 Event Notification

Use the following guide to create an S3 Event Notification. While configuring it:

  1. In the General configuration, type iplogs/ as the Prefix
  2. Select the notification for object creation in the Event type section
  3. As the destination, choose the SQS service
  4. Select the queue you created in the previous section
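Once the notification is in place, each SQS message body carries a standard S3 event payload listing the created objects. A minimal sketch of extracting the object keys from such a payload (the example body is illustrative and trimmed to the fields used; S3 URL-encodes keys, hence the decoding step):

```python
import json
from urllib.parse import unquote_plus

def keys_from_notification(body: str) -> list:
    """Extract S3 object keys from the JSON body of an S3 event
    notification message delivered to SQS."""
    payload = json.loads(body)
    return [
        unquote_plus(record["s3"]["object"]["key"])
        for record in payload.get("Records", [])
    ]

# Illustrative payload, trimmed to the fields read above.
body = json.dumps({
    "Records": [
        {"s3": {"bucket": {"name": "my-bucket"},
                "object": {"key": "iplogs/2020-06-12/ip.csv.gz"}}}
    ]
})
```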

Create the intake

Go to the intake page and create a new intake from the format Cisco Umbrella IP.

Pull events

To start pulling events, you have to:

  1. Go to the playbook page and create a new playbook with the AWS Fetch new logs on S3 connector
  2. Set up the module configuration with the AWS access key, the secret key and the region name. Set up the trigger configuration with the name of the SQS queue and the intake key from the intake previously created
  3. Start the playbook and enjoy your events

Further Readings