
DRAFT: new RBA Object - Step 3 - ESCU 5.0 #263

Merged · 79 commits · Jan 17, 2025
830d201
initial lookup updates
pyth0n1c Aug 15, 2024
e96fbd4
continuing to make lookup improvements.
pyth0n1c Aug 19, 2024
32ed03f
more lookup changes
pyth0n1c Aug 19, 2024
66e743e
initial sketch
ljstella Aug 27, 2024
802bfe6
Merge branch 'main' into obs_to_rba
ljstella Aug 27, 2024
2c1275c
Merge branch 'main' into obs_to_rba
ljstella Aug 28, 2024
f81f82e
Merge branch 'main' into obs_to_rba
ljstella Sep 4, 2024
dd5b52d
save point
ljstella Sep 5, 2024
56f4273
Merge branch 'main' into obs_to_rba
ljstella Sep 26, 2024
f6f2999
Merge branch 'main' into obs_to_rba
ljstella Nov 4, 2024
b294765
Implement hashing
ljstella Nov 8, 2024
7f7724c
Updated default detection
ljstella Nov 8, 2024
0224f9e
Merge branch 'main' into obs_to_rba
ljstella Nov 8, 2024
3be2c3a
Remove tags.message and tags.observable
ljstella Nov 8, 2024
9c138f1
remove code for tags.message
ljstella Nov 8, 2024
11a1ca9
reworking validations
ljstella Nov 8, 2024
3882b9b
new rba location
ljstella Nov 8, 2024
d584822
Refactor risk()
ljstella Nov 8, 2024
3cde4a6
slight tweak
ljstella Nov 8, 2024
f4739cc
Better guard against None
ljstella Nov 12, 2024
d6b848e
Another None case
ljstella Nov 12, 2024
9cda91e
remove print
ljstella Nov 12, 2024
8e5676c
Another None guard
ljstella Nov 12, 2024
6f77c47
Just production
ljstella Nov 12, 2024
9266898
Merge branch 'main' into obs_to_rba
ljstella Nov 13, 2024
1a4ea93
Validate all, not just production
ljstella Nov 13, 2024
e2565f4
Remove comment
ljstella Nov 14, 2024
12c8881
Temporary tweak for testing companion branch
ljstella Nov 14, 2024
afa864b
tweak to required
ljstella Nov 15, 2024
e7fd466
threat object type typo
ljstella Nov 15, 2024
2fe24e6
more threat object types
ljstella Nov 15, 2024
3435f4c
one more threat object type
ljstella Nov 15, 2024
140089f
Oopsied the merge
ljstella Nov 22, 2024
9790e16
Merge branch 'main' into obs_to_rba
ljstella Nov 22, 2024
042a53a
Wrong branch for 3.13
ljstella Nov 22, 2024
e671f2b
Create new rba object via new content workflow
ljstella Nov 25, 2024
1107ae1
Reordering output
ljstella Nov 25, 2024
12acd66
Merge branch 'contentctl_5' into obs_to_rba
ljstella Dec 10, 2024
474ede5
Merge branch 'contentctl_5' into obs_to_rba
ljstella Dec 12, 2024
f88bca6
convert plain enums, or enums with
pyth0n1c Dec 4, 2024
5b9cb95
Remove all usage of use_enum_values.
pyth0n1c Dec 4, 2024
827a8f4
Remove use of .value on enums in code
pyth0n1c Dec 4, 2024
eeaeb4d
fix missing typing of mode_name
pyth0n1c Dec 4, 2024
b794d15
remove files that are no longer used anymore. Add logic to serialize …
pyth0n1c Dec 6, 2024
334062c
Remove dead code from
pyth0n1c Dec 6, 2024
4bc5e68
remove the 'forbid' from a few classes
pyth0n1c Dec 11, 2024
84715bf
Clean up two more use of .value on
pyth0n1c Dec 11, 2024
8cc3451
Add GH Actions to Dependabot
ljstella Dec 12, 2024
b4848be
Reduce matrix for simplicity
ljstella Dec 12, 2024
753b3b0
Merge branch 'contentctl_5' into obs_to_rba
ljstella Dec 12, 2024
cc51953
More cleanup with
pyth0n1c Dec 23, 2024
fda382f
Merge branch 'contentctl_5' into improve_lookup_regex
pyth0n1c Dec 23, 2024
b24c88d
initial working cleanup of lookups code
pyth0n1c Dec 23, 2024
e7eb947
include inputlookup and outputlookup
pyth0n1c Dec 23, 2024
deefd57
more cleanup on lookup object.
pyth0n1c Jan 3, 2025
97daa61
Merge branch 'obs_to_rba' into improve_lookup_regex
pyth0n1c Jan 3, 2025
24b003c
Update CI to temporarily test against #3269 on security_content
ljstella Jan 6, 2025
41fab0f
Fix regex to step matching
pyth0n1c Jan 6, 2025
f04d92c
Progress and cleanup for
pyth0n1c Jan 7, 2025
825beaf
Able to build without any errors,
pyth0n1c Jan 8, 2025
a31d484
improve api output serialization
pyth0n1c Jan 8, 2025
0f70172
Clean up bad imports. Give more
pyth0n1c Jan 9, 2025
285acf1
New threat object type
ljstella Jan 10, 2025
a6faec5
merge latest rba target updates
pyth0n1c Jan 13, 2025
78aa05e
Fix access of variable that does
pyth0n1c Jan 13, 2025
9b158ce
initial commit; migrated integration testing to RBA structures; litte…
cmcginley-splunk Jan 16, 2025
0ef9754
Merge pull request #274 from splunk/improve_lookup_regex
ljstella Jan 16, 2025
779006e
Merge branch 'obs_to_rba' into integration_testing_rba_migration
cmcginley-splunk Jan 16, 2025
901415f
Change testing branch
ljstella Jan 16, 2025
da61571
Update template detection
ljstella Jan 16, 2025
72c51a4
cleanup; log fixes
cmcginley-splunk Jan 16, 2025
f72c796
resolving some todos
cmcginley-splunk Jan 16, 2025
8293a6d
Class name renaming
ljstella Jan 16, 2025
7b8b2ff
Merge branch 'obs_to_rba' into integration_testing_rba_migration
cmcginley-splunk Jan 16, 2025
51f0780
new class name
cmcginley-splunk Jan 16, 2025
c3cc5ab
little bit more cleanup on lookups.
pyth0n1c Jan 16, 2025
da39152
Merge branch 'obs_to_rba' of https://github.com/splunk/contentctl int…
pyth0n1c Jan 16, 2025
1de20e8
Merge branch 'obs_to_rba' into integration_testing_rba_migration
pyth0n1c Jan 16, 2025
0f53c69
Merge pull request #345 from splunk/integration_testing_rba_migration
ljstella Jan 17, 2025
1 change: 1 addition & 0 deletions .github/workflows/test_against_escu.yml
@@ -35,6 +35,7 @@ jobs:
with:
path: security_content
repository: splunk/security_content
ref: rba_migration

#Install the given version of Python we will test against
- name: Install Required Python Version
64 changes: 38 additions & 26 deletions contentctl/actions/build.py
@@ -10,11 +10,11 @@
from contentctl.output.conf_writer import ConfWriter
from contentctl.output.api_json_output import ApiJsonOutput
from contentctl.output.data_source_writer import DataSourceWriter
from contentctl.objects.lookup import Lookup
from contentctl.objects.lookup import CSVLookup, Lookup_Type
import pathlib
import json
import datetime
from typing import Union
import uuid

from contentctl.objects.config import build

@@ -34,27 +34,41 @@ def execute(self, input_dto: BuildInputDto) -> DirectorOutputDto:
updated_conf_files:set[pathlib.Path] = set()
conf_output = ConfOutput(input_dto.config)


# Construct a path to a YML that does not actually exist.
# We mock this "fake" path since the YML does not exist.
# This ensures the checking for the existence of the CSV is correct
data_sources_fake_yml_path = input_dto.config.getPackageDirectoryPath() / "lookups" / "data_sources.yml"

# Construct a special lookup whose CSV is created at runtime and
# written directly into the output folder. It is created with model_construct,
# not model_validate, because the CSV does not exist yet.
# written directly into the lookups folder. We will delete this after a build,
# assuming that it is successful.
data_sources_lookup_csv_path = input_dto.config.getPackageDirectoryPath() / "lookups" / "data_sources.csv"
DataSourceWriter.writeDataSourceCsv(input_dto.director_output_dto.data_sources, data_sources_lookup_csv_path)
input_dto.director_output_dto.addContentToDictMappings(Lookup.model_construct(description= "A lookup file that will contain the data source objects for detections.",
filename=data_sources_lookup_csv_path,
name="data_sources"))



DataSourceWriter.writeDataSourceCsv(input_dto.director_output_dto.data_sources, data_sources_lookup_csv_path)
input_dto.director_output_dto.addContentToDictMappings(CSVLookup.model_construct(name="data_sources",
id=uuid.UUID("b45c1403-6e09-47b0-824f-cf6e44f15ac8"),
version=1,
author=input_dto.config.app.author_name,
date = datetime.date.today(),
description= "A lookup file that will contain the data source objects for detections.",
lookup_type=Lookup_Type.csv,
file_path=data_sources_fake_yml_path))
updated_conf_files.update(conf_output.writeHeaders())
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.detections, SecurityContentType.detections))
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.stories, SecurityContentType.stories))
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.baselines, SecurityContentType.baselines))
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.investigations, SecurityContentType.investigations))
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.lookups, SecurityContentType.lookups))
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.macros, SecurityContentType.macros))
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.dashboards, SecurityContentType.dashboards))
updated_conf_files.update(conf_output.writeLookups(input_dto.director_output_dto.lookups))
updated_conf_files.update(conf_output.writeDetections(input_dto.director_output_dto.detections))
updated_conf_files.update(conf_output.writeStories(input_dto.director_output_dto.stories))
updated_conf_files.update(conf_output.writeBaselines(input_dto.director_output_dto.baselines))
updated_conf_files.update(conf_output.writeInvestigations(input_dto.director_output_dto.investigations))
updated_conf_files.update(conf_output.writeMacros(input_dto.director_output_dto.macros))
updated_conf_files.update(conf_output.writeDashboards(input_dto.director_output_dto.dashboards))
updated_conf_files.update(conf_output.writeMiscellaneousAppFiles())




#Ensure that the conf file we just generated/update is syntactically valid
for conf_file in updated_conf_files:
ConfWriter.validateConfFile(conf_file)
@@ -67,17 +81,15 @@ def execute(self, input_dto: BuildInputDto) -> DirectorOutputDto:
if input_dto.config.build_api:
shutil.rmtree(input_dto.config.getAPIPath(), ignore_errors=True)
input_dto.config.getAPIPath().mkdir(parents=True)
api_json_output = ApiJsonOutput()
for output_objects, output_type in [(input_dto.director_output_dto.detections, SecurityContentType.detections),
(input_dto.director_output_dto.stories, SecurityContentType.stories),
(input_dto.director_output_dto.baselines, SecurityContentType.baselines),
(input_dto.director_output_dto.investigations, SecurityContentType.investigations),
(input_dto.director_output_dto.lookups, SecurityContentType.lookups),
(input_dto.director_output_dto.macros, SecurityContentType.macros),
(input_dto.director_output_dto.deployments, SecurityContentType.deployments)]:
api_json_output.writeObjects(output_objects, input_dto.config.getAPIPath(), input_dto.config.app.label, output_type )


api_json_output = ApiJsonOutput(input_dto.config.getAPIPath(), input_dto.config.app.label)
api_json_output.writeDetections(input_dto.director_output_dto.detections)
api_json_output.writeStories(input_dto.director_output_dto.stories)
api_json_output.writeBaselines(input_dto.director_output_dto.baselines)
api_json_output.writeInvestigations(input_dto.director_output_dto.investigations)
api_json_output.writeLookups(input_dto.director_output_dto.lookups)
api_json_output.writeMacros(input_dto.director_output_dto.macros)
api_json_output.writeDeployments(input_dto.director_output_dto.deployments)


#create version file for sse api
version_file = input_dto.config.getAPIPath()/"version.json"
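The build step above creates the `data_sources` lookup with `model_construct` rather than `model_validate`, since the backing CSV is generated during the build and may not exist when validation would run. A minimal sketch of that distinction — the `LookupSketch` model below is a hypothetical stand-in, not contentctl's real lookup class:

```python
from pathlib import Path
from pydantic import BaseModel, field_validator


class LookupSketch(BaseModel):
    # Hypothetical stand-in for a file-backed lookup model
    name: str
    file_path: Path

    @field_validator("file_path")
    @classmethod
    def file_must_exist(cls, p: Path) -> Path:
        # Validation requires the backing file to already exist on disk
        if not p.exists():
            raise ValueError(f"lookup file {p} does not exist")
        return p


# model_validate runs validators, so a not-yet-written CSV fails:
try:
    LookupSketch.model_validate({"name": "data_sources", "file_path": "missing.csv"})
except Exception:
    print("validation failed: file does not exist yet")

# model_construct skips validation entirely, which is why it suits a
# file that is only written later in the build:
lookup = LookupSketch.model_construct(name="data_sources", file_path=Path("missing.csv"))
print(lookup.name)  # data_sources
```

The tradeoff is that `model_construct` performs no type coercion or checking at all, so every field must already be the right type.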
@@ -1094,6 +1094,7 @@ def retry_search_until_timeout(
job = self.get_conn().search(query=search, **kwargs)
results = JSONResultsReader(job.results(output_mode="json"))

# TODO (cmcginley): @ljstella you're removing this ultimately, right?
# Consolidate a set of the distinct observable field names
observable_fields_set = set([o.name for o in detection.tags.observable]) # keeping this around for later
risk_object_fields_set = set([o.name for o in detection.tags.observable if "Victim" in o.role ]) # just the "Risk Objects"
@@ -1121,7 +1122,10 @@
missing_risk_objects = risk_object_fields_set - results_fields_set
if len(missing_risk_objects) > 0:
# Report a failure in such cases
e = Exception(f"The observable field(s) {missing_risk_objects} are missing in the detection results")
e = Exception(
f"The risk object field(s) {missing_risk_objects} are missing in the "
"detection results"
)
test.result.set_job_content(
job.content,
self.infrastructure,
@@ -1137,6 +1141,8 @@
# on a field. In this case, the field will appear but will not contain any values
current_empty_fields: set[str] = set()

# TODO (cmcginley): @ljstella is this something we're keeping for testing as
# well?
for field in observable_fields_set:
if result.get(field, 'null') == 'null':
if field in risk_object_fields_set:
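The integration-test change above distinguishes risk object fields that are missing from the detection results entirely from fields that are present but carry no value (Splunk renders those as the string `'null'` when a stats search aggregates on an empty field). A small sketch of that two-way check — `check_result_fields` is a hypothetical helper, not part of contentctl:

```python
def check_result_fields(
    risk_fields: set[str], result: dict[str, str]
) -> tuple[set[str], set[str]]:
    # Fields entirely absent from the result row -> hard failure above
    missing = risk_fields - set(result)
    # Fields present but with no value; the diff above treats the
    # literal string 'null' as "field exists but is empty"
    empty = {f for f in risk_fields & set(result) if result[f] == "null"}
    return missing, empty


missing, empty = check_result_fields(
    {"user", "dest", "src"}, {"user": "alice", "dest": "null"}
)
print(missing)  # {'src'}
print(empty)    # {'dest'}
```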
3 changes: 2 additions & 1 deletion contentctl/actions/validate.py
@@ -6,6 +6,7 @@
from contentctl.enrichments.attack_enrichment import AttackEnrichment
from contentctl.enrichments.cve_enrichment import CveEnrichment
from contentctl.objects.atomic import AtomicEnrichment
from contentctl.objects.lookup import FileBackedLookup
from contentctl.helper.utils import Utils
from contentctl.objects.data_source import DataSource
from contentctl.helper.splunk_app import SplunkApp
@@ -64,7 +65,7 @@ def ensure_no_orphaned_files_in_lookups(self, repo_path:pathlib.Path, director_o
lookupsDirectory = repo_path/"lookups"

# Get all of the files referenced by Lookups
usedLookupFiles:list[pathlib.Path] = [lookup.filename for lookup in director_output_dto.lookups if lookup.filename is not None] + [lookup.file_path for lookup in director_output_dto.lookups if lookup.file_path is not None]
usedLookupFiles:list[pathlib.Path] = [lookup.filename for lookup in director_output_dto.lookups if isinstance(lookup, FileBackedLookup)] + [lookup.file_path for lookup in director_output_dto.lookups if lookup.file_path is not None]

# Get all of the mlmodel and csv files in the lookups directory
csvAndMlmodelFiles = Utils.get_security_content_files_from_directory(lookupsDirectory, allowedFileExtensions=[".yml",".csv",".mlmodel"], fileExtensionsToReturn=[".csv",".mlmodel"])
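The `ensure_no_orphaned_files_in_lookups` check above gathers every file referenced by a lookup and compares it against the CSV and MLmodel files actually present in the `lookups` directory. A simplified sketch of that set difference — the helper below is hypothetical, assuming the same two file extensions:

```python
from pathlib import Path


def find_orphaned_lookup_files(lookups_dir: Path, used_files: set[Path]) -> set[Path]:
    # Every lookup-style file on disk that no Lookup object references
    # is an orphan and should be flagged during validation
    on_disk = {p for p in lookups_dir.iterdir() if p.suffix in {".csv", ".mlmodel"}}
    return on_disk - used_files


# Usage sketch with a throwaway directory:
import tempfile

d = Path(tempfile.mkdtemp())
(d / "used.csv").write_text("col\n")
(d / "orphan.csv").write_text("col\n")
print(find_orphaned_lookup_files(d, {d / "used.csv"}))  # set containing orphan.csv
```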
10 changes: 5 additions & 5 deletions contentctl/input/director.py
@@ -14,7 +14,7 @@
from contentctl.objects.playbook import Playbook
from contentctl.objects.deployment import Deployment
from contentctl.objects.macro import Macro
from contentctl.objects.lookup import Lookup
from contentctl.objects.lookup import LookupAdapter, Lookup
from contentctl.objects.atomic import AtomicEnrichment
from contentctl.objects.security_content_object import SecurityContentObject
from contentctl.objects.data_source import DataSource
@@ -58,13 +58,12 @@ def addContentToDictMappings(self, content: SecurityContentObject):
f" - {content.file_path}\n"
f" - {self.name_to_content_map[content_name].file_path}"
)

if content.id in self.uuid_to_content_map:
raise ValueError(
f"Duplicate id '{content.id}' with paths:\n"
f" - {content.file_path}\n"
f" - {self.uuid_to_content_map[content.id].file_path}"
)
f" - {self.uuid_to_content_map[content.id].file_path}")

if isinstance(content, Lookup):
self.lookups.append(content)
@@ -157,7 +156,8 @@ def createSecurityContent(self, contentType: SecurityContentType) -> None:
modelDict = YmlReader.load_file(file)

if contentType == SecurityContentType.lookups:
lookup = Lookup.model_validate(modelDict, context={"output_dto":self.output_dto, "config":self.input_dto})
lookup = LookupAdapter.validate_python(modelDict, context={"output_dto":self.output_dto, "config":self.input_dto})
#lookup = Lookup.model_validate(modelDict, context={"output_dto":self.output_dto, "config":self.input_dto})
self.output_dto.addContentToDictMappings(lookup)

elif contentType == SecurityContentType.macros:
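The director now validates lookup YMLs through `LookupAdapter.validate_python` instead of `Lookup.model_validate`, which suggests a pydantic `TypeAdapter` that dispatches each dict to a concrete lookup class on a discriminator field. A hypothetical sketch of that pattern — the class names below are illustrative, not contentctl's real models:

```python
from typing import Annotated, Literal, Union
from pydantic import BaseModel, Field, TypeAdapter


class CSVLookupSketch(BaseModel):
    lookup_type: Literal["csv"]
    name: str


class MlModelLookupSketch(BaseModel):
    lookup_type: Literal["mlmodel"]
    name: str


# The adapter inspects lookup_type and validates against the matching
# model, so callers get back the concrete subclass, not a base Lookup
LookupAdapterSketch = TypeAdapter(
    Annotated[
        Union[CSVLookupSketch, MlModelLookupSketch],
        Field(discriminator="lookup_type"),
    ]
)

lookup = LookupAdapterSketch.validate_python(
    {"lookup_type": "csv", "name": "data_sources"}
)
print(type(lookup).__name__)  # CSVLookupSketch
```

Unlike `model_validate` on a single class, a discriminated-union adapter also produces targeted error messages naming the variant that failed.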
2 changes: 1 addition & 1 deletion contentctl/input/new_content_questions.py
@@ -48,7 +48,7 @@ def get_questions_detection(cls) -> list[dict[str,Any]]:
{
'type': 'checkbox',
'message': 'Your data source',
'name': 'data_source',
'name': 'data_sources',
#In the future, we should dynamically populate this from the DataSource Objects we have parsed from the data_sources directory
'choices': sorted(DataSource._value2member_map_ )
