[DPDV-5997] feat: allow query all or selected accounts #127

Merged
Changes from 8 commits
21 commits
20 changes: 18 additions & 2 deletions README.md
@@ -52,16 +52,32 @@ The add-on uses Splunk encrypted secrets storage, so admins require `admin_all_o
4. Under Log Access Keys, click Add Key > Add Write Key (required for alert action).
5. Optionally, click the pencil icon to rename the keys.

### SentinelOne Platform with Singularity Data Lake
To get the AuthN API token, follow these steps:
1. Log in to the SentinelOne console and click your User Name > My User
![My User Page](README_images/my_user.png)
2. Click Actions > API Token Operations > Generate / Regenerate API Token.
   - If you are generating the API token for the first time, you will see the Generate API Token option; otherwise you will see the Regenerate API Token option.
![Token generation](README_images/generate_token.png)
3. Copy the API Token and save it for configuration.

### Splunk
1. In Splunk, open the Add-on

![Configuring DataSet Account](README_images/setup_account.png)
![Configuring DataSet Account](README_images/acc_details_new.png)

2. On the Configuration > Account tab:
- Click Add
- Enter a user-friendly account name. For multiple accounts, the account name can be used in queries (more details below).
- Enter the full URL noted above (e.g.: `https://app.scalyr.com`, `https://xdr.us1.sentinelone.net` or `https://xdr.eu1.sentinelone.net`).
- Enter the Tenant value; it can be True, False, or blank. If set to True, queries run across the entire tenant. If set to False, provide Account IDs as comma-separated values to run searches against those specific accounts. Leave it blank if you are not using tenant-level searches.
- Provide the comma-separated Account IDs if Tenant is False, e.g.: 138687697697679,698767986986897666.
- Enter the first part of the AuthN API token (the first 220 characters).
- Enter the second part of the AuthN API token (the remaining characters).
- Use this command to split the AuthN API token into both parts:
`read -r -p "Enter Token: " input_string && echo "Part1: $(echo "$input_string" | cut -c 1-220)"; echo "Part2: $(echo "$input_string" | cut -c 221-)"`
- Why the AuthN token is split into two parts: Splunk's encrypted secrets storage can hold at most 256 characters per value, while the AuthN token can be longer than that. The first 220 characters are therefore stored encrypted and the remainder is not. Because most of the token is encrypted, this approach is safe.
- Enter the DataSet read key from above (required for searching); ignore this if an AuthN token value is provided.
- Enter the DataSet write key from above (only required for alert actions).
- Click Save
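The Tenant and Account IDs settings above end up as extra fields on the query payload. The following minimal sketch (an illustrative helper, not part of the add-on; the field names `tenant` and `accountIds` are taken from this PR's code) shows the mapping:

```python
# Illustrative helper (not shipped with the add-on): shows how the
# Tenant / Account IDs settings map onto the extra query-payload fields.
def apply_tenant_settings(payload, tenant, account_ids):
    """Mirror the add-on's ds_payload.update(...) logic.

    tenant=None  -> payload untouched (blank Tenant field)
    tenant=True  -> query the entire tenant
    tenant=False -> restrict the query to the given account IDs
    """
    if tenant is None:
        return payload
    if tenant:
        payload.update({"tenant": True})
    else:
        payload.update({"tenant": False, "accountIds": account_ids})
    return payload
```

For example, with Tenant set to False and two account IDs, the payload gains `"tenant": false` and an `"accountIds"` list.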

Binary file added README_images/acc_details_new.png
Binary file added README_images/generate_token.png
Binary file added README_images/my_user.png
4 changes: 4 additions & 0 deletions TA_dataset/README/ta_dataset_settings.conf.spec
@@ -1,5 +1,9 @@
[account]
url = <string>
tenant = <bool>
account_ids = <string>
an_fir_part = <string>
an_sec_part = <string>
dataset_log_read_access_key = <string>
dataset_log_write_access_key = <string>
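For reference, a filled-in account stanza using these settings might look like the following (all values are illustrative placeholders, not defaults shipped with the add-on):

```ini
[account]
url = https://app.scalyr.com
tenant = false
account_ids = 138687697697679,698767986986897666
an_fir_part = <first 220 characters of the AuthN API token (stored encrypted)>
an_sec_part = <remaining characters of the AuthN API token>
dataset_log_read_access_key = <DataSet read key>
dataset_log_write_access_key = <DataSet write key>
```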

15 changes: 14 additions & 1 deletion TA_dataset/bin/dataset_alerts.py
@@ -110,7 +110,20 @@ def stream_events(self, inputs, ew):
ds_headers = {
"Authorization": "Bearer " + acct_dict[ds_acct]["ds_api_key"]
}

if acct_dict.get(ds_acct).get("tenant") is not None:
tenant_value = acct_dict.get(ds_acct).get("tenant")
if tenant_value:
ds_payload.update({"tenant": True})
else:
ds_payload.update(
{
"tenant": False,
"accountIds": acct_dict[ds_acct]["account_ids"],
}
)
logger.debug(
"ds payload in power query stream events = {}".format(ds_payload)
)
# Create checkpointer
checkpoint = checkpointer.KVStoreCheckpointer(
input_name, session_key, APP_NAME
49 changes: 35 additions & 14 deletions TA_dataset/bin/dataset_api.py
@@ -46,13 +46,7 @@ def convert_proxy(proxy):

# Executes Dataset LongRunningQuery for log events
def ds_lrq_log_query(
base_url,
api_key,
start_time,
end_time,
filter_expr,
limit,
proxy,
base_url, api_key, start_time, end_time, filter_expr, limit, proxy, logger, acc_conf
):
client = AuthenticatedClient(
base_url=base_url, token=api_key, proxy=convert_proxy(proxy)
@@ -63,11 +57,19 @@ def ds_lrq_log_query(
end_time=end_time,
log=LogAttributes(filter_=filter_expr, limit=limit),
)
return ds_lrq_run_loop(client=client, body=body)
if acc_conf.get("tenant") is not None:
tenant_value = acc_conf.get("tenant")
if tenant_value:
acc_conf = {"tenant": True}
else:
acc_conf = {"tenant": False, "accountIds": acc_conf["account_ids"]}
return ds_lrq_run_loop(logger, client=client, body=body, acc_conf=acc_conf)


# Executes Dataset LongRunningQuery using PowerQuery language
def ds_lrq_power_query(base_url, api_key, start_time, end_time, query, proxy):
def ds_lrq_power_query(
base_url, api_key, start_time, end_time, query, proxy, logger, acc_conf
):
client = AuthenticatedClient(
base_url=base_url, token=api_key, proxy=convert_proxy(proxy)
)
@@ -77,7 +79,13 @@ def ds_lrq_power_query(base_url, api_key, start_time, end_time, query, proxy):
end_time=end_time,
pq=PQAttributes(query=query),
)
return ds_lrq_run_loop(client=client, body=body)
if acc_conf.get("tenant") is not None:
tenant_value = acc_conf.get("tenant")
if tenant_value:
acc_conf = {"tenant": True}
else:
acc_conf = {"tenant": False, "accountIds": acc_conf["account_ids"]}
return ds_lrq_run_loop(logger, client=client, body=body, acc_conf=acc_conf)


# Executes Dataset LongRunningQuery to fetch facet values
@@ -90,6 +98,8 @@ def ds_lrq_facet_values(
name,
max_values,
proxy,
logger,
acc_conf,
):
client = AuthenticatedClient(
base_url=base_url, token=api_key, proxy=convert_proxy(proxy)
@@ -102,17 +112,28 @@ def ds_lrq_facet_values(
filter_=filter, name=name, max_values=max_values
),
)
return ds_lrq_run_loop(client=client, body=body)
if acc_conf.get("tenant") is not None:
tenant_value = acc_conf.get("tenant")
if tenant_value:
acc_conf = {"tenant": True}
else:
acc_conf = {"tenant": False, "accountIds": acc_conf["account_ids"]}
return ds_lrq_run_loop(logger, client=client, body=body, acc_conf=acc_conf)


# Executes LRQ run loop of launch-ping-remove API requests until the query completes
# with a result
# Returns tuple - value, error message
def ds_lrq_run_loop(
client: AuthenticatedClient, body: PostQueriesLaunchQueryRequestBody
log,
client: AuthenticatedClient,
body: PostQueriesLaunchQueryRequestBody,
acc_conf=None,
):
body.query_priority = PostQueriesLaunchQueryRequestBodyQueryPriority.HIGH
response = post_queries.sync_detailed(client=client, json_body=body)
response = post_queries.sync_detailed(
client=client, json_body=body, tenant_details=acc_conf, logger=log
)
logger().debug(response)
result = response.parsed
if result:
@@ -242,7 +263,7 @@ def parse_query(ds_columns, match_list, sessions):
if ds_columns is None:
session_key = match_list["session"]

for session_entry, session_dict in sessions.items():
for session_entry, session_dict in list(sessions.items()):
if session_entry == session_key:
for key in session_dict:
ds_event_dict[key] = session_dict[key]
123 changes: 113 additions & 10 deletions TA_dataset/bin/dataset_common.py
@@ -167,9 +167,31 @@ def get_acct_info(self, logger, account=None):
for conf in confs:
acct_dict[conf.name] = {}
acct_dict[conf.name]["base_url"] = conf.url
acct_dict[conf.name]["ds_api_key"] = get_token(
self, conf.name, "read", logger
)
token = ""
if hasattr(conf, "an_fir_part"):
logger.info("The AuthN API token first part is available")
first_half = get_token(
self, conf.name, "authn", logger, "an_fir_part"
)
token += first_half
if hasattr(conf, "an_sec_part"):
logger.info("The AuthN API token second part is available")
second_part = conf.an_sec_part
token += second_part
if not hasattr(conf, "an_fir_part") and not hasattr(
conf, "an_sec_part"
):
logger.info("The AuthN API token is not available")
acct_dict[conf.name]["ds_api_key"] = get_token(
self, conf.name, "read", logger
)
if token:
acct_dict[conf.name]["ds_api_key"] = token
if hasattr(conf, "tenant"):
acct_dict[conf.name]["tenant"] = get_tenant_value(conf, logger)
acct_dict[conf.name]["account_ids"] = get_account_ids(
conf, logger
)
except Exception as e:
msg = "Error retrieving add-on settings, error = {}".format(e)
logger.error(msg + " - %s", e, exc_info=True)
@@ -182,9 +204,34 @@ def get_acct_info(self, logger, account=None):
conf = self.service.confs[conf_name][entry]
acct_dict[entry] = {}
acct_dict[entry]["base_url"] = conf.url
acct_dict[entry]["ds_api_key"] = get_token(
self, entry, "read", logger
)
token = ""
if hasattr(conf, "an_fir_part"):
logger.info("The AuthN API token first part is available")
first_half = get_token(
self, entry, "authn", logger, "an_fir_part"
)
token += first_half
if hasattr(conf, "an_sec_part"):
logger.info("The AuthN API token second part is available")
second_part = conf.an_sec_part
token += second_part
if not hasattr(conf, "an_fir_part") and not hasattr(
conf, "an_sec_part"
):
logger.info("The AuthN API token is not available")
acct_dict[entry]["ds_api_key"] = get_token(
self, entry, "read", logger
)
if token:
acct_dict[entry]["ds_api_key"] = token
if hasattr(conf, "tenant"):
acct_dict[entry]["tenant"] = get_tenant_value(conf, logger)
logger.info(
"The tenant value in entry conf: {}".format(
acct_dict[entry]["tenant"]
)
)
acct_dict[entry]["account_ids"] = get_account_ids(conf, logger)
except Exception as e:
msg = "Error retrieving account settings, error = {}".format(e)
logger.error(msg + " - %s", e, exc_info=True)
@@ -197,9 +244,29 @@ def get_acct_info(self, logger, account=None):
for conf in confs:
acct_dict[conf.name] = {}
acct_dict[conf.name]["base_url"] = conf.url
acct_dict[conf.name]["ds_api_key"] = get_token(
self, conf.name, "read", logger
)
token = ""
if hasattr(conf, "an_fir_part"):
logger.info("The AuthN API token first part is available")
first_half = get_token(
self, conf.name, "authn", logger, "an_fir_part"
)
token += first_half
if hasattr(conf, "an_sec_part"):
logger.info("The AuthN API token second part is available")
second_part = conf.an_sec_part
token += second_part
if not hasattr(conf, "an_fir_part") and not hasattr(
conf, "an_sec_part"
):
logger.info("The AuthN API token is not available")
acct_dict[conf.name]["ds_api_key"] = get_token(
self, conf.name, "read", logger
)
if token:
acct_dict[conf.name]["ds_api_key"] = token
if hasattr(conf, "tenant"):
acct_dict[conf.name]["tenant"] = get_tenant_value(conf, logger)
acct_dict[conf.name]["account_ids"] = get_account_ids(conf, logger)
break
except Exception as e:
msg = (
@@ -212,7 +279,36 @@ def get_acct_info(self, logger, account=None):
return acct_dict


def get_token(self, account, rw, logger):
def get_tenant_value(conf, logger):
tenant_value = conf.tenant
tenant_value = tenant_value.strip()
logger.info("The provided tenant value in config is {}".format(tenant_value))
if tenant_value.lower() == "false" or tenant_value.lower() == "0":
return False
return True


def get_account_ids(conf, logger):
account_ids_array = []
tenant = get_tenant_value(conf, logger)
if tenant:
logger.debug("Account configured on global level")
return account_ids_array
if hasattr(conf, "account_ids"):
account_ids_conf = conf.account_ids
account_ids_conf = account_ids_conf.strip()
if account_ids_conf:
account_ids_array = account_ids_conf.split(",")
logger.info(f"The provided account IDs in config: {account_ids_array}")
if not tenant and not account_ids_array:
raise Exception(
"Tenant is false, so please provide valid comma-separated account IDs"
" in the account configuration page."
)
return account_ids_array


def get_token(self, account, rw, logger, key_field=None):
try:
# use Python SDK secrets retrieval
for credential in self.service.storage_passwords:
@@ -224,6 +320,11 @@ def get_token(self, account, rw, logger):
and credential.username.startswith(account)
):
cred = credential.content.get("clear_password")
if rw == "authn":
if key_field in cred:
logger.info("Found AuthN token part in stored credential")
cred_json = json.loads(cred)
token = cred_json[key_field]
if rw == "read":
if "dataset_log_read_access_key" in cred:
cred_json = json.loads(cred)
@@ -233,6 +334,8 @@ def get_token(self, account, rw, logger):
cred_json = json.loads(cred)
token = cred_json["dataset_log_write_access_key"]
return token
else:
logger.debug("The credentials were not retrieved")
except Exception as e:
logger.error(
self,
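The tenant parsing and validation that `get_tenant_value` / `get_account_ids` perform can be summarized in a standalone sketch (semantics mirrored from the diff above; the helper names here are illustrative, not part of the add-on):

```python
# Standalone sketch of the tenant / account-ID parsing semantics from
# dataset_common.py: any tenant value other than "false"/"0"
# (case-insensitive) counts as tenant-level; when tenant is False,
# non-empty comma-separated account IDs are required.
def parse_tenant(raw_tenant):
    return raw_tenant.strip().lower() not in ("false", "0")


def parse_account_ids(raw_tenant, raw_account_ids=""):
    if parse_tenant(raw_tenant):
        return []  # tenant-level query: no per-account restriction
    ids = raw_account_ids.strip()
    if not ids:
        raise ValueError(
            "Tenant is false, so please provide valid comma-separated "
            "account IDs in the account configuration page."
        )
    return ids.split(",")
```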
15 changes: 14 additions & 1 deletion TA_dataset/bin/dataset_powerquery.py
@@ -117,7 +117,20 @@ def stream_events(self, inputs, ew):
ds_headers = {
"Authorization": "Bearer " + acct_dict[ds_acct]["ds_api_key"]
}

if acct_dict.get(ds_acct).get("tenant") is not None:
tenant_value = acct_dict.get(ds_acct).get("tenant")
if tenant_value:
ds_payload.update({"tenant": True})
else:
ds_payload.update(
{
"tenant": False,
"accountIds": acct_dict[ds_acct]["account_ids"],
}
)
logger.info(
"ds payload in power query stream events = {}".format(ds_payload)
)
# Create checkpointer
checkpoint = checkpointer.KVStoreCheckpointer(
input_name, session_key, APP_NAME
12 changes: 12 additions & 0 deletions TA_dataset/bin/dataset_query.py
@@ -144,7 +144,19 @@ def stream_events(self, inputs, ew):
proxy = get_proxy(session_key, logger)
acct_dict = get_acct_info(self, logger, ds_account)
for ds_acct in acct_dict.keys():
if acct_dict.get(ds_acct).get("tenant") is not None:
tenant_value = acct_dict.get(ds_acct).get("tenant")
if tenant_value:
ds_payload.update({"tenant": True})
else:
ds_payload.update(
{
"tenant": False,
"accountIds": acct_dict[ds_acct]["account_ids"],
}
)
curr_payload = copy.deepcopy(ds_payload)
logger.info("query api account curr payload {}".format(curr_payload))
curr_maxcount = copy.copy(ds_maxcount)
ds_url = get_url(acct_dict[ds_acct]["base_url"], "query")
ds_headers = {