Incoming feed - Advanced CSV#

Note

This article describes how to configure incoming feeds for a particular feed source. To see how to configure incoming feeds in general, see Create and configure incoming feeds.

Specifications

  • Transport types: Advanced CSV

  • Content type: EclecticIQ Entities Advanced CSV

  • Ingested data: CSV files.

  • Endpoint(s): N/A

  • Processed data: See Data mapping.

Requirements#

  • A single CSV file served through HTTP/S at a URL.

Configure the incoming feed#

  1. Create or edit an incoming feed.

  2. Under Transport and content, fill out these fields:

    Note

    Required fields are marked with an asterisk (*).

    Field

    Description

    Transport type*

    Select Advanced CSV from the drop-down menu.

    Content type*

    Select EclecticIQ Entities Advanced CSV from the drop-down menu.

    CSV URL*

    URL of CSV file to ingest. Must be a direct download.

    See Serving a single CSV file through HTTP.

    CSV mapping*

    Enter how values in the CSV file should be mapped, in the form:

    <column_one_header>:<target_eiq_json_path>[,<column_two_header>:<target_eiq_json_path>[, ...]]
    

    For more information, see Data mapping.

    Delimiter*

    Character used to separate values in selected file.

    Set to , by default.

    Quote characters*

    Character(s) used to enclose CSV fields that contain characters that must be escaped, such as the delimiter character (,), double quotes ("), or spaces.

    Set to " by default.

    Some software, such as Microsoft Excel, does not use a single " character to enclose CSV fields.

    Ignore lines that start with #

    Select to ignore all lines in the target CSV file that start with a ‘#’ character.

    Ignore lines without defined delimiter

    Select to ignore all lines that do not contain the delimiter character (set in Delimiter).

    Field Names*

    Enter column headers as a comma-separated list, in the form:

    <column_one_header>[, <column_two_header>[, ...]]
    

    Leave blank to use the first line of the CSV file as headers.

    SSL verification

    Selected by default. Select this option to enable SSL certificate verification for this feed.

    Path to SSL certificate file

    Used when connecting to a feed source that uses a custom CA. Set this to the path of the SSL certificate to use when authenticating the feed source.

    For more information, see SSL certificates.

  3. Store your changes by selecting Save.

SSL certificates#

To use an SSL certificate, it must be:

  • Accessible on the EclecticIQ Intelligence Center host.

  • Placed in a location that can be accessed by the eclecticiq user.

  • Owned by eclecticiq:eclecticiq.

To make sure that EclecticIQ Intelligence Center can access the SSL certificate:

  1. Upload the SSL certificate to a location on the EclecticIQ Intelligence Center host.

  2. On the EclecticIQ Intelligence Center host, open the terminal.

  3. Change ownership of the SSL certificate by running as root in the terminal:

    chown eclecticiq:eclecticiq /path/to/cert.pem
    

    Where /path/to/cert.pem is the location of the SSL certificate EclecticIQ Intelligence Center needs to access.
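
  4. Optionally, confirm that the ownership change took effect. A minimal check in Python on the Intelligence Center host (the certificate path is a placeholder; use the actual location):

    import os
    import pwd
    import grp

    # Path is a placeholder; use the certificate's actual location.
    stat_result = os.stat("/path/to/cert.pem")

    # Both lines should print "eclecticiq".
    print(pwd.getpwuid(stat_result.st_uid).pw_name)
    print(grp.getgrgid(stat_result.st_gid).gr_name)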

Data mapping#

Caution

This implementation supports a limited set of Entity values that you can map CSV fields to. For more information, see Mapping table.

Overview#

Use the CSV mapping field to specify how columns in the selected CSV file map to entities and observables on the platform.

For example, to ingest the following CSV file:

bf20c674a0533e7c0d825de097629a96cb42ae2d4840b07dd1168993d95163e8,ATMSpitter,"lorem ipsum example",2016-08-02,example_tag_1,example_tag_2,example_reference_1.com,example_reference_2.com

  1. Serve the CSV file at a URL. See Serving a single CSV file through HTTP.

  2. Set up the Advanced CSV incoming feed.

  3. Set the CSV mapping field to:

    hash:entity.observable.hash-sha256,name:entity.name,description:entity.description,timestamp:entity.tags,tag_1:entity.tags,tag_2:entity.tags,reference_1:entity.references,reference_2:entity.references
    
  4. Set Field Names to:

    hash,name,description,timestamp,tag_1,tag_2,reference_1,reference_2
    
  5. Save and run the feed.

This produces, per row in the CSV file:

  • An Indicator entity, because no entity.type is specified. For more information, see Mapping table.

  • The Indicator takes its title from the name column and its description from the description column.

  • Three tags attached to the Indicator, taking their respective values from the timestamp, tag_1, and tag_2 columns.

  • Two reference entries in the indicator’s Information source, taking their respective values from the reference_1 and reference_2 columns.

  • A hash-sha256 type Observable attached to the Indicator, taking its value from the hash column.
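
The sketch below illustrates conceptually how one row of this file resolves against that mapping. It is not the extension's implementation, and the layout of the resulting dictionary is illustrative only:

    import csv
    import io

    # One row from the example CSV file above.
    CSV_ROW = (
        "bf20c674a0533e7c0d825de097629a96cb42ae2d4840b07dd1168993d95163e8,"
        'ATMSpitter,"lorem ipsum example",2016-08-02,'
        "example_tag_1,example_tag_2,"
        "example_reference_1.com,example_reference_2.com"
    )

    # The Field Names value from step 4.
    FIELD_NAMES = ["hash", "name", "description", "timestamp",
                   "tag_1", "tag_2", "reference_1", "reference_2"]

    # The CSV mapping value from step 3, as column -> target pairs.
    MAPPING = {
        "hash": "entity.observable.hash-sha256",
        "name": "entity.name",
        "description": "entity.description",
        "timestamp": "entity.tags",
        "tag_1": "entity.tags",
        "tag_2": "entity.tags",
        "reference_1": "entity.references",
        "reference_2": "entity.references",
    }

    row = next(csv.DictReader(io.StringIO(CSV_ROW), fieldnames=FIELD_NAMES,
                              delimiter=",", quotechar='"'))

    # No entity.type is mapped, so the record defaults to an Indicator.
    entity = {"type": "indicator", "tags": [], "references": [], "observables": []}
    for column, target in MAPPING.items():
        value = row[column]
        if target == "entity.tags":
            entity["tags"].append(value)
        elif target == "entity.references":
            entity["references"].append(value)
        elif target.startswith("entity.observable."):
            observable_type = target.rsplit(".", 1)[-1]  # e.g. hash-sha256
            entity["observables"].append({"type": observable_type, "value": value})
        else:
            # entity.name, entity.description
            entity[target.rsplit(".", 1)[-1]] = value

    print(entity)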

Mapping table#

Each row in the CSV file is treated as a single record.

This means that all fields in a given row are mapped to a single resulting Entity on the platform.

EclecticIQ Entity field

Syntax

Description

Entity Title

entity.name

Sets the title for the resulting Entity.

Entity Type

entity.type

Sets the type for the resulting Entity.

Entity Description

entity.description

Sets the description for the resulting Entity.

Entity Tags

entity.tags

Adds entries to the Tags field of the resulting Entity. The text of the resulting Tag is taken from the value of the CSV field.

Each CSV field mapped to entity.tags creates a new tag for the Entity.

Entity References

entity.references

Adds entries to the Information source > References field of the resulting Entity. The text of the resulting reference is taken from the value of the CSV field.

Each CSV field mapped to entity.references creates a new reference for the Entity.

Observables

entity.observable.<observable_type>

Adds an Observable of <observable_type> to the resulting Entity. The resulting Observable takes its value from the CSV field.

See List of Observable types.

Estimated threat start time

entity.meta.estimated_threat_start_time

Sets the Estimated threat start time value for the resulting entity.

Accepts a date and time. For example: 2021-01-17 07:44:46

See Date-time values for more information.

Estimated threat end time

entity.meta.estimated_threat_end_time

Sets the Estimated threat end time value for the resulting entity.

Accepts a date and time. For example: 2021-01-17 07:44:46

See Date-time values for more information.

Estimated observed time

entity.meta.estimated_observed_time

Sets the Estimated observed time value for the resulting entity.

Accepts a date and time. For example: 2021-01-17 07:44:46

See Date-time values for more information.

Date-time values#

The extension accepts a wide range of formats for date-time values, but ambiguous values (for example, writing 01 02 03 to represent 2 January 2003) can produce unexpected results.

For consistent ingestion of date-time values, those values should either follow ISO 8601 date-time formats (YYYY-MM-DDThh:mm:ss.sTZD), or be written such that the day, month, and year are distinct (2 Jan 2003 00:00).

Fields that accept date-time values also accept Unix epoch values.
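
For example, Python's standard library produces both an unambiguous ISO 8601 form and an epoch value for the same moment; a minimal sketch:

    from datetime import datetime, timezone

    moment = datetime(2021, 1, 17, 7, 44, 46, tzinfo=timezone.utc)

    # ISO 8601 form, safe to ingest: 2021-01-17T07:44:46+00:00
    print(moment.isoformat())

    # Unix epoch form, also accepted by date-time fields: 1610869486.0
    print(moment.timestamp())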

List of Observable types#

You can map CSV fields to the following Observable types:

  • actor-id

  • address

  • asn

  • bank-account

  • card

  • card-owner

  • cce

  • city

  • company

  • country

  • country-code

  • cve

  • cwe

  • domain

  • email

  • email-subject

  • eui-64

  • file

  • forum-name

  • forum-room

  • forum-thread

  • fox-it-portal-uri

  • geo

  • geo-lat

  • geo-long

  • handle

  • hash-authentihash

  • hash-imphash

  • hash-md5

  • hash-rich-pe-header

  • hash-sha1

  • hash-sha256

  • hash-sha512

  • hash-ssdeep

  • hash-vhash

  • host

  • industry

  • inetnum

  • ipv4

  • ipv4-cidr

  • ipv6

  • ipv6-cidr

  • ja3-full

  • ja3-hash

  • ja3s-full

  • ja3s-hash

  • mac-48

  • malware

  • mutex

  • name

  • nationality

  • netname

  • organization

  • person

  • port

  • postcode

  • process

  • product

  • registrar

  • rule

  • snort

  • street

  • telephone

  • uri

  • uri-hash-sha256

  • winregistry

  • yara

Supported entity types#

The following entity types are supported:

Entity type

Syntax

Attack pattern

attack-pattern

Campaign

campaign

Course of action

course-of-action

Exploit target

exploit-target

Identity

identity

Incident

incident

Indicator

indicator

Infrastructure

infrastructure

Intrusion set

intrusion-set

Location

location

Malware analysis

malware-analysis

Malware

malware

Report

report

Sighting

eclecticiq-sighting

Threat actor

threat-actor

Tool

tool

TTP

ttp
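
For example, to set the entity type from the CSV file itself, map a column to entity.type and fill that column with one of the syntax values above (such as malware). Assuming a hypothetical column named type, the CSV mapping entry would look like:

    type:entity.type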

Serving a single CSV file through HTTP#

This extension ingests a single CSV file through the URL you specify in the CSV URL field. This URL must allow a direct download of the CSV.

If you are trying to ingest a CSV file from a host on your network, you can start a Python-based web server to serve that CSV file over HTTP and allow this extension to access it.

This extension does not support URLs from Microsoft OneDrive, Google Drive, or similar file sharing services.

Requirements#

  • Python 3.

  • Access to the command line.

  • Administrator permissions on the file host.

  • IP address of the file host.

  • The platform can connect to the file host on port 8888.

Start the web server#

Caution

Do not use in production.

Only serve sensitive data for the extension to access through secure HTTPS connections.

Consider using a trusted CDN service, or ask your network administrator for assistance.

  1. Open the terminal or PowerShell/Command Prompt.

  2. Run:

    python3 -m http.server --directory /path/to/csv_directory 8888
    

    Where /path/to/csv_directory is the directory that contains the CSV file to serve (for example, data.csv). The --directory option takes a path to a directory, not to a file.

  3. When prompted by your operating system, allow Python 3 to accept incoming connections.

You can now access the CSV file at http://<filehost_ip_address>:8888/data.csv, where data.csv is the name of the CSV file inside the served directory. Set your CSV URL field to this value.
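
To confirm the URL is a direct download before configuring the feed, you can fetch it from another machine; a minimal check in Python, where 192.0.2.10 and data.csv are placeholders for your file host and file name:

    from urllib.request import urlopen

    # Fetch the served file and print the first lines; a direct download
    # returns the raw CSV content rather than an HTML landing page.
    with urlopen("http://192.0.2.10:8888/data.csv") as response:
        body = response.read().decode("utf-8")

    print("\n".join(body.splitlines()[:5]))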

Troubleshooting#

If you encounter the following error in the tracelog:

backend/extensions/advanced_csv/eiq/extensions/advanced_csv/transformer.py", line 83, in make_uuid
    values_list.sort()
TypeError: '<' not supported between instances of 'NoneType' and 'str'

The mapping configured for the feed may not match the format of the CSV files being ingested. Check that the format of the CSV files matches the contents of these fields in the feed configuration:

  • CSV mapping

  • Delimiter

  • Quote characters

  • Field Names

Also check that the contents of your target CSV file are valid.
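
As a further sanity check, you can parse the file locally with the same settings as the feed and flag rows that yield no value for a mapped column, which is the kind of gap that produces the TypeError above. A minimal sketch, where data.csv and the field names are placeholders for your own configuration:

    import csv

    # Mirror the feed configuration; substitute your own values.
    FIELD_NAMES = ["hash", "name", "description", "timestamp",
                   "tag_1", "tag_2", "reference_1", "reference_2"]
    DELIMITER = ","
    QUOTECHAR = '"'

    with open("data.csv", newline="") as handle:
        reader = csv.DictReader(handle, fieldnames=FIELD_NAMES,
                                delimiter=DELIMITER, quotechar=QUOTECHAR)
        for line_number, row in enumerate(reader, start=1):
            # DictReader fills missing trailing fields with None.
            missing = [column for column, value in row.items() if value is None]
            if missing:
                print(f"line {line_number}: no value for {', '.join(missing)}")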