Access requests


Cloudflare is a global network designed to make everything you connect to the Internet secure, private, fast, and reliable.

In this documentation, you will learn how to collect and send Cloudflare Access Request logs to SEKOIA.IO.

The following built-in rules match events from the Cloudflare Access Requests intake. This documentation is updated automatically and is based solely on the fields used by the intake, which are checked against our rules. As a result, some of the listed rules may not be relevant to this intake.

SEKOIA.IO x Cloudflare Access Requests on ATT&CK Navigator

Bazar Loader DGA (Domain Generation Algorithm)

Detects Bazar Loader domains based on the Bazar Loader DGA

  • Effort: elementary

Detection of domain names potentially related to cryptomining activities.

  • Effort: master
Dynamic DNS Contacted

Detects communication with dynamic DNS domains. This kind of domain is often used by attackers. This rule can trigger false positives in non-controlled environments, because dynamic DNS is not always malicious.

  • Effort: master
Exfiltration Domain

Detects traffic toward a domain flagged as a possible exfiltration vector.

  • Effort: master
Remote Access Tool Domain

Detects traffic toward a domain flagged as a Remote Administration Tool (RAT).

  • Effort: master
SEKOIA.IO Intelligence Feed

Detect threats based on indicators of compromise (IOCs) collected by SEKOIA's Threat and Detection Research team.

  • Effort: elementary

EICAR Detection

Detects observables in CTI tagged as EICAR, which are fake samples meant to test detection.

  • Effort: master
TOR Usage Generic Rule

Detects TOR usage globally, whether the IP is a destination or a source. TOR is short for The Onion Router, and it gets its name from how it works. TOR intercepts the network traffic from one or more apps on the user's computer, usually the web browser, and shuffles it through a number of randomly chosen computers before passing it on to its destination. This disguises the user's location and makes it harder for servers to identify them on repeat visits, or to tie together separate visits to different sites, thus making tracking and surveillance more difficult. Before a network packet starts its journey, the user's computer chooses a random list of relays and repeatedly encrypts the data in multiple layers, like an onion. Each relay knows only enough to strip off the outermost layer of encryption before passing what's left on to the next relay in the list.

  • Effort: master

Event Categories

The following table lists the data sources offered by this integration.

Data Source Description
Authentication logs Records logins and logouts

In detail, the following table denotes the types of events produced by this integration.

Name Values
Kind ``
Category authentication, network
Type connection, denied, end, info, start

Event Samples

Find below a few samples of events and how they are normalized by SEKOIA.IO.

{
    "message": "{\"Action\":\"\",\"Allowed\":true,\"AppDomain\":\"\",\"AppUUID\":\"123e233b-253e-7890-8844-08123123123a\",\"Connection\":\"onetimepin\",\"Country\":\"fr\",\"CreatedAt\":\"2023-02-24T14:52:47Z\",\"Email\":\"\",\"IPAddress\":\"\",\"PurposeJustificationPrompt\":\"\",\"PurposeJustificationResponse\":\"\",\"RayID\":\"79e906eb5dc32123\",\"TemporaryAccessApprovers\":[],\"TemporaryAccessDuration\":0,\"UserUID\":\"123f6715-400f-5fae-a345-d28191234123\"}",
    "event": {
        "category": [],
        "dataset": "access_requests",
        "type": []
    },
    "@timestamp": "2023-02-24T14:52:47Z",
    "client": {
        "address": "",
        "ip": ""
    },
    "cloudflare": {
        "AppUUID": "123e233b-253e-7890-8844-08123123123a",
        "Connection": "onetimepin",
        "RayID": "79e906eb5dc32123",
        "TemporaryAccessDuration": 0
    },
    "observer": {
        "type": "proxy",
        "vendor": "Cloudflare"
    },
    "related": {
        "hosts": [],
        "ip": [
        ]
    },
    "source": {
        "address": "",
        "geo": {
            "country_iso_code": "fr"
        },
        "ip": ""
    },
    "url": {
        "domain": "",
        "registered_domain": "",
        "subdomain": "sekoiaio",
        "top_level_domain": "com"
    },
    "user": {
        "email": "",
        "id": "123f6715-400f-5fae-a345-d28191234123"
    }
}

Extracted Fields

The following table lists the fields that are extracted, normalized under the ECS format, analyzed and indexed by the parser. It should be noted that inferred fields are not listed.

Name Type Description
@timestamp date Date/time when the event originated.
client.ip ip IP address of the client.
cloudflare.AppUUID keyword Access Application UUID.
cloudflare.Connection keyword Identity provider used for the login.
cloudflare.PurposeJustificationPrompt keyword Message prompted to the client when accessing the application.
cloudflare.PurposeJustificationResponse keyword Justification given by the client when accessing the application.
cloudflare.RayID keyword Identifier of the request.
cloudflare.TemporaryAccessApprovers array List of approvers for this access request.
cloudflare.TemporaryAccessDuration number Approved duration for this access request.
event.category keyword Event category. The second categorization field in the hierarchy.
event.dataset keyword Name of the dataset.
event.type keyword Event type. The third categorization field in the hierarchy.
observer.type keyword The type of the observer the data is coming from.
observer.vendor keyword Vendor name of the observer.
source.geo.country_iso_code keyword Country ISO code.
source.ip ip IP address of the source.
url.domain keyword Domain of the url. keyword User email address. keyword Unique identifier of the user.


Create the intake on SEKOIA.IO

Go to the intake page and create a new intake from the format Cloudflare.

Configure events forwarding on Cloudflare

Retrieve necessary information

First, you will have to retrieve configuration information. Connect to the Cloudflare Console to collect the following:

  1. Cloudflare API Token

    • Go to My Profile, then on the left panel, click on API Tokens.
    • Click on the Create Token button and select the Create Custom Token entry.
    • Give a name to your token and set the following permissions:
    Scope Group Level
    Account Account Analytics Read
    Account Logs Read
    Account Logs Edit
    Zone Logs Read
    Zone Logs Edit
    • If you want Zero Trust logs, you should also add:
    Scope Group Level
    Account Zero Trust Read

    For more details, see the Cloudflare documentation.

  2. Cloudflare Zone ID:

    • This information is specific to a Website.
    • On the left panel, click on Websites and select the Website you want.
    • On the right panel, there is an API section where you can retrieve the Zone ID.
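If you prefer the API to the dashboard, the Zone ID can also be retrieved by listing your zones. The sketch below only prints the endpoint to call; `<YOUR_DOMAIN>` and `<CLOUDFLARE_API_TOKEN>` are placeholders, not values taken from this page.

```shell
# Sketch: retrieve the Zone ID through the Cloudflare API instead of the dashboard.
# <YOUR_DOMAIN> and <CLOUDFLARE_API_TOKEN> are placeholders to replace with your own values.
API=""

# Filtering the zone list by name returns the matching zone; its "id" field is the Zone ID.
echo "GET ${API}?name=<YOUR_DOMAIN>"

# The actual call would be:
# curl -s "${API}?name=<YOUR_DOMAIN>" -H 'Authorization: Bearer <CLOUDFLARE_API_TOKEN>'
```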

Create a Logpush job

Configure a Logpush job with the following destination:<YOUR_INTAKE_KEY>

To do so, you can manage Logpush with cURL:

$ curl -X POST '<CLOUDFLARE_ACCOUNT_ID>/logpush/jobs' \
-H 'Authorization: Bearer <CLOUDFLARE_API_TOKEN>' \
-H "Content-Type: application/json" \
-d '{
    "dataset": "access_requests",
    "enabled": true,
    "max_upload_bytes": 5000000,
    "max_upload_records": 1000,
    "destination_conf": "<YOUR_INTAKE_KEY>"
    }'

On success, the API returns a response like:

    {
      "errors": [],
      "messages": [],
      "result": {
        "id": "<ID>",
        "dataset": "access_requests",
        "max_upload_bytes": 5000000,
        "max_upload_records": 1000,
        "enabled": true,
        "name": "<DOMAIN_NAME>",
        "logpull_options": "fields=<LIST_OF_FIELDS>",
        "destination_conf": "<YOUR_INTAKE_KEY>",
        "last_complete": null,
        "last_error": null,
        "error_message": null
      },
      "success": true
    }


Replace:

  • <YOUR_INTAKE_KEY> with the Intake key you generated in the Create the intake step.
  • <CLOUDFLARE_ACCOUNT_ID> with the ACCOUNT_ID found on the overview page.
  • <CLOUDFLARE_API_TOKEN> with the API Token you generated.
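The Logpush API also exposes a destination validation endpoint, which can help catch a mistyped intake key before the job is created. The sketch below only builds the request body and shows the call as a comment; all angle-bracket values are placeholders.

```shell
# Sketch: validate the Logpush destination before creating the job.
# All <...> values are placeholders to replace with your own.
BODY='{"destination_conf":"<YOUR_INTAKE_KEY>"}'
echo "$BODY"

# The validation call itself would be:
# curl -X POST '<CLOUDFLARE_ACCOUNT_ID>/logpush/validate/destination' \
#   -H 'Authorization: Bearer <CLOUDFLARE_API_TOKEN>' \
#   -H 'Content-Type: application/json' \
#   -d "$BODY"
```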
Useful Cloudflare API endpoints

In their documentation, Cloudflare provides a list of API endpoints you can use. Find below some useful endpoints:

  •<ACCOUNT_ID>/logpush/jobs/<JOB_ID> to verify the job you previously created is correct (you need to specify the JOB_ID)
  •<ACCOUNT_ID>/logpush/datasets/<DATASET>/jobs to get all the jobs for a specific dataset (dns_log, firewalls_events or http_requests in our case)
  •<ACCOUNT_ID>/logpush/jobs/<JOB_ID> to update a job if you noticed a mistake after the creation of the job (wrong fields, wrong SEKOIA API Key...)