HTTP requests


Cloudflare is a global network designed to make everything you connect to the Internet secure, private, fast, and reliable.

In this documentation, you will learn how to collect and send Cloudflare HTTP request logs to SEKOIA.IO.

Benefit from SEKOIA.IO built-in rules and enrich Cloudflare HTTP requests with the following out-of-the-box detection capabilities.

SEKOIA.IO x Cloudflare HTTP requests on ATT&CK Navigator

Nimbo-C2 User Agent

Nimbo-C2 uses an unusual User-Agent format in its implants.

  • Effort: intermediate
Potential Azure AD Phishing Page (Adversary-in-the-Middle)

Detects an HTTP request to a URL typical of the Azure AD authentication flow, but towards a domain that is not one of the legitimate Microsoft domains used for Azure AD authentication.

  • Effort: intermediate
Potential Bazar Loader User-Agents

Detects potential Bazar loader communications through the user-agent.

  • Effort: elementary
Potential Lemon Duck User-Agent

Detects the LemonDuck user agent. The format uses two sets of alphabetical characters separated by dashes, for example "User-Agent: Lemon-Duck-[A-Z]-[A-Z]".

  • Effort: elementary
SEKOIA.IO Intelligence Feed

Detect threats based on indicators of compromise (IOCs) collected by SEKOIA's Threat and Detection Research team.

  • Effort: elementary

Event Categories

The following table lists the data sources offered by this integration.

Data Source                    Description
Web logs                       Cloudflare acts as a proxy and provides associated traffic logs
Web application firewall logs  Cloudflare protects web applications with its web application firewall and provides associated traffic logs

In detail, the following table describes the type of events produced by this integration.

Name Values
Kind event
Category web
Type access

Event Samples

Find below a few samples of events and how they are normalized by SEKOIA.IO.

{
    "message": "{\"ClientIP\":\"\",\"ClientRequestHost\":\"\",\"ClientRequestMethod\":\"GET\",\"ClientRequestURI\":\"/wp1/wp-includes/wlwmanifest.xml\",\"EdgeEndTimestamp\":1658281702371000000,\"EdgeResponseBytes\":279,\"EdgeResponseStatus\":522,\"EdgeStartTimestamp\":1658281671671000000,\"RayID\":\"72d807ffeba5753d\"}",
    "event": {
        "kind": "event",
        "category": [
            "web"
        ],
        "type": [
            "access"
        ],
        "dataset": "http_requests",
        "start": "2022-07-20T01:47:51.671000Z",
        "end": "2022-07-20T01:48:22.371000Z"
    },
    "source": {
        "ip": "",
        "address": ""
    },
    "destination": {
        "address": ""
    },
    "http": {
        "request": {
            "method": "GET"
        },
        "response": {
            "bytes": 279,
            "status_code": 522
        }
    },
    "url": {
        "path": "/wp1/wp-includes/wlwmanifest.xml"
    },
    "observer": {
        "vendor": "Cloudflare",
        "type": "proxy"
    },
    "cloudflare": {
        "ClientIP": "",
        "ClientRequestHost": "",
        "ClientRequestMethod": "GET",
        "ClientRequestURI": "/wp1/wp-includes/wlwmanifest.xml",
        "EdgeEndTimestamp": "1658281702371000000",
        "EdgeResponseBytes": 279,
        "EdgeResponseStatus": 522,
        "EdgeStartTimestamp": "1658281671671000000",
        "RayID": "72d807ffeba5753d"
    },
    "related": {
        "ip": []
    }
}
{
    "message": "{\"WAFMatchedVar\":\"\",\"WAFProfile\":\"unknown\",\"WAFRuleID\":\"\",\"WAFRuleMessage\":\"\",\"WorkerCPUTime\":0,\"WorkerStatus\":\"unknown\",\"WorkerSubrequest\":false,\"WorkerSubrequestCount\":0,\"ZoneID\":545468107,\"ZoneName\":\"\"}\n\n",
    "event": {
        "kind": "event",
        "category": [
            "web"
        ],
        "type": [
            "access"
        ],
        "dataset": "http_requests"
    },
    "observer": {
        "vendor": "Cloudflare",
        "type": "proxy"
    },
    "cloudflare": {
        "WAFMatchedVar": "",
        "WAFProfile": "unknown",
        "WAFRuleID": "",
        "WAFRuleMessage": "",
        "WorkerCPUTime": 0,
        "WorkerStatus": "unknown",
        "WorkerSubrequest": false,
        "WorkerSubrequestCount": 0,
        "ZoneID": 545468107,
        "ZoneName": ""
    }
}
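As the first sample shows, Cloudflare's nanosecond epoch timestamps (EdgeStartTimestamp, EdgeEndTimestamp) are normalized into the RFC 3339 dates stored in event.start and event.end. A minimal sketch of that conversion (the helper name is illustrative, not part of the parser):

```python
from datetime import datetime, timezone

def ns_epoch_to_rfc3339(ns: int) -> str:
    """Convert a nanosecond Unix epoch timestamp to an RFC 3339 date string.

    Integer arithmetic avoids the precision loss of a float division by 1e9.
    """
    secs, ns_rem = divmod(ns, 1_000_000_000)
    dt = datetime.fromtimestamp(secs, tz=timezone.utc).replace(microsecond=ns_rem // 1000)
    return dt.strftime("%Y-%m-%dT%H:%M:%S.%fZ")

# EdgeStartTimestamp from the sample above
print(ns_epoch_to_rfc3339(1658281671671000000))  # 2022-07-20T01:47:51.671000Z
```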

Extracted Fields

The following table lists the fields that are extracted, normalized under the ECS format, analyzed and indexed by the parser. Note that inferred fields are not listed.

Name Type Description
@timestamp date Date/time when the event originated.
destination.address keyword Destination network address.
event.action keyword The action captured by the event.
event.category keyword Event category. The second categorization field in the hierarchy.
event.dataset keyword Name of the dataset.
event.end date event.end contains the date when the event ended or when the activity was last observed.
event.kind keyword The kind of the event. The highest categorization field in the hierarchy.
event.start date event.start contains the date when the event started or when the activity was first observed.
event.type keyword Event type. The third categorization field in the hierarchy.
http.request.bytes long Total size in bytes of the request (body and headers).
http.request.method keyword HTTP request method.
http.request.referrer keyword Referrer for this HTTP request.
http.response.bytes long Total size in bytes of the response (body and headers).
http.response.status_code long HTTP response status code.
network.protocol keyword Application protocol name.
observer.type keyword The type of the observer the data is coming from.
observer.vendor keyword Vendor name of the observer. keyword Rule ID
rule.ruleset keyword Rule ruleset long Unique number allocated to the autonomous system.
source.geo.country_name keyword Country name.
source.ip ip IP address of the source.
source.port long Port of the source.
tls.cipher keyword String indicating the cipher used during the current connection.
tls.version_protocol keyword Normalized lowercase protocol name parsed from original string.
url.path wildcard Path of the request, such as "/search".
user_agent.original keyword Unparsed user_agent string.
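To illustrate how the raw Cloudflare keys in the samples relate to the ECS names above, here is a simplified sketch of the mapping. The real parser handles far more fields and type coercions; FIELD_MAP and to_ecs are illustrative names, not part of the integration:

```python
# Simplified Cloudflare -> ECS field mapping, based on the sample events above.
FIELD_MAP = {
    "ClientIP": "source.ip",
    "ClientRequestMethod": "http.request.method",
    "ClientRequestURI": "url.path",
    "EdgeResponseBytes": "http.response.bytes",
    "EdgeResponseStatus": "http.response.status_code",
}

def to_ecs(raw: dict) -> dict:
    """Rename raw Cloudflare fields to their flat ECS equivalents."""
    return {ecs: raw[cf] for cf, ecs in FIELD_MAP.items() if cf in raw}

record = {
    "ClientRequestMethod": "GET",
    "ClientRequestURI": "/wp1/wp-includes/wlwmanifest.xml",
    "EdgeResponseStatus": 522,
}
print(to_ecs(record))
```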


Create the intake on SEKOIA.IO

Go to the intake page and create a new intake from the format Cloudflare.

Configure events forwarding on Cloudflare

Retrieve necessary information

First, you will have to retrieve some configuration information. Connect to the Cloudflare Console to collect the following:

  1. Cloudflare API Token

    • Go to My Profile, then on the left panel, click on API Tokens.
    • Click on the Create Token button and select the Create Custom Token entry.
    • Give a name to your token and set the following permissions:
    Scope   Group              Level
    Account Account Analytics  Read
    Account Logs               Read
    Account Logs               Edit
    Zone    Logs               Read
    Zone    Logs               Edit
    • If you want Zero Trust logs, you should also add:
    Scope   Group       Level
    Account Zero Trust  Read

    See the Cloudflare documentation for more details.

  2. Cloudflare Zone ID:

    • This information is specific to a website.
    • On the left panel, click on Websites and select the website you want.
    • On the right panel, there is an API section where you can retrieve the Zone ID.

Create a Logpush job

Configure a Logpush job with the following destination: <YOUR_INTAKE_KEY>

To do so, you can manage Logpush with cURL:

$ curl -X POST<CLOUDFLARE_ZONE_ID>/logpush/jobs \
-H "Authorization: Bearer <CLOUDFLARE_API_TOKEN>" \
-H "Content-Type: application/json" \
--data '{
    "dataset": "http_requests",
    "enabled": true,
    "max_upload_bytes": 5000000,
    "max_upload_records": 1000,
    "destination_conf": "<YOUR_INTAKE_KEY>"
}' # (1)
  1. A successful call will return a response similar to:

      {
        "errors": [],
        "messages": [],
        "result": {
          "id": 146,
          "dataset": "http_requests",
          "enabled": false,
          "name": "<DOMAIN_NAME>",
          "logpull_options": "fields=<LIST_OF_FIELDS>&timestamps=rfc3339",
          "destination_conf": "<YOUR_INTAKE_KEY>",
          "last_complete": null,
          "last_error": null,
          "error_message": null
        },
        "success": true
      }


Replace:

  • <YOUR_INTAKE_KEY> with the intake key you generated in the Create the intake on SEKOIA.IO step.
  • <CLOUDFLARE_API_TOKEN> with the API token you generated.
  • <CLOUDFLARE_ZONE_ID> with the Zone ID you retrieved.
Useful Cloudflare API endpoints

In their documentation, Cloudflare provides a list of API endpoints you can use. Find below some useful endpoints:

  • GET<ZONE_ID>/logpush/jobs/<JOB_ID> to verify the job you previously created is correct (you need to specify the JOB_ID)
  • GET<ZONE_ID>/logpush/datasets/<DATASET>/jobs to get all the jobs for a specific dataset (dns_logs, firewall_events or http_requests in our case)
  • PUT<ZONE_ID>/logpush/jobs/<JOB_ID> to update a job if you noticed a mistake after the creation of the job (wrong fields, wrong SEKOIA.IO API key...)
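All of these endpoints share the public Cloudflare v4 API base URL. A small helper to build them can keep scripts consistent; the helper itself is illustrative, not part of the Cloudflare API or of this integration:

```python
from typing import Optional

# Public base URL of the Cloudflare v4 API.
API_BASE = "https://api.cloudflare.com/client/v4"

def logpush_jobs_url(zone_id: str, job_id: Optional[int] = None) -> str:
    """Build the Logpush jobs endpoint for a zone, optionally for a single job."""
    url = f"{API_BASE}/zones/{zone_id}/logpush/jobs"
    return f"{url}/{job_id}" if job_id is not None else url

# "your_zone_id" is a placeholder; 146 is the job id from the sample response above.
print(logpush_jobs_url("your_zone_id", 146))
# https://api.cloudflare.com/client/v4/zones/your_zone_id/logpush/jobs/146
```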