Fixing docs build errors
ianhelle committed Oct 20, 2024
1 parent 91e44e0 commit f7b81d1
Showing 4 changed files with 62 additions and 48 deletions.
4 changes: 4 additions & 0 deletions docs/requirements.txt
@@ -1,4 +1,8 @@
attrs>=18.2.0
azure.mgmt.network
azure.mgmt.resource
azure.mgmt.monitor
azure.mgmt.compute
cryptography
deprecated>=1.2.4
docutils<0.22.0
37 changes: 22 additions & 15 deletions docs/source/data_acquisition/UploadData.rst
@@ -18,8 +18,8 @@ The first step in uploading data is to instantiate an uploader for the location
For Azure Sentinel there are two parameters that need to be passed at this stage,
the workspace ID of the workspace to upload data to, and the workspace key.

**Note that these are different from the details required to query data from Log Analytics using the DataProvider.
Your workspace key can be found under the Advanced setting tab of your Log Analytics workspace.**
.. note:: These are different from the details required to query data from Log Analytics using the DataProvider.
   Your workspace key can be found under the Advanced settings tab of your Log Analytics workspace.

.. code:: ipython3
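    # Illustrative sketch only - the import path and parameter names are
    # assumed from msticpy's uploader docs, not taken from this commit;
    # the ID/key strings are placeholders for your own workspace details.
    from msticpy.data.uploaders.loganalytics_uploader import LAUploader

    la_up = LAUploader(
        workspace="<your-workspace-id>",
        workspace_secret="<your-workspace-key>",
    )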
@@ -36,7 +36,7 @@ you wish the data to be uploaded to. If that table exists the data will be appen
Note that all tables fall under the Custom Log category so any name you provide will be appended with _CL (i.e. table_name will be table_name_CL).
Log Analytics will parse each column in the DataFrame into a column in the resulting table.

*Note: table_name cannot contain any special characters except `_` all other characters will be removed.*
.. note:: ``table_name`` cannot contain any special characters except ``_``; all other characters will be removed.

.. code:: ipython3
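    # Illustrative sketch only - assumes a pandas DataFrame `df` prepared
    # earlier in the notebook; the table name is a placeholder and would be
    # created as upload_test_CL in the workspace.
    la_up.upload_df(data=df, table_name="upload_test")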
@@ -98,13 +98,14 @@ On the other hand, You can use the stored credentials in msticpyconfig.yaml to S
from msticpy.data.uploaders.splunk_uploader import SplunkUploader
spup = SplunkUploader()
*Note: Due to the way Splunk API's work the time taken to upload a file to
Splunk can be significantly longer than with Log Analytics.*
.. note:: Due to the way Splunk APIs work, the time taken to upload a file to
   Splunk can be significantly longer than with Log Analytics.

Uploading a DataFrame to Splunk
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To upload a Pandas DataFrame to Splunk you simply pass the DataFrame to ``.upload_df()`` along with index you wish the data to be uploaded to.
To upload a Pandas DataFrame to Splunk you simply pass the DataFrame to ``.upload_df()``
along with the index you wish the data to be uploaded to.
As the ``source_type`` parameter, csv, json, or others can be input; the data is then passed to
df.to_csv(), df.to_json(), or df.to_string() respectively, with **json** as the default.
The ``table_name`` parameter remains only for backward compatibility.
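A minimal sketch of a DataFrame upload, assuming the ``spup`` uploader created above (the index name is a placeholder and the keyword names are assumed, not confirmed by this commit):

.. code:: ipython3

    spup.upload_df(data=df, index_name="upload_test", source_type="json")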
@@ -126,12 +127,14 @@ To upload a file to Splunk pass the path to the file to ``.upload_file()`` along
the index you want the data uploaded to.
By default, a comma separated value file is expected but if your file has
some other separator value you can pass this with the ``delim`` parameter.
You can specify the sourcetype to upload the data to with that ``source_type`` parameter
You can specify the source type to upload the data to with the ``source_type`` parameter
but by default the uploader will upload to the sourcetype with the same name as the file.
As the ``source_type`` parameter, csv, json, or others can be input; the data is then passed to
df.to_csv(), df.to_json(), or df.to_string() respectively.

The default is **json** if the ``table_name`` parameter is not supplied; ``table_name`` remains
only for backward compatibility.

As with uploading a DataFrame,
if the index provided does not exist and you want it to be created, you can pass
the parameter ``create_index = True``.
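A minimal sketch of a file upload under the same assumptions (the file path and index name are placeholders):

.. code:: ipython3

    spup.upload_file(
        "./data/alerts.csv",
        index_name="upload_test",
        delim=",",
        create_index=True,
    )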
@@ -144,14 +147,18 @@ Uploading a Folder to Splunk
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

You can also upload a whole folder of files. To do this simply pass the folder path to
``.upload_folder()`` along with the
name of the index you want the data uploaded to. By default this will upload all csv files in that folder to Splunk,
with each file being uploaded to a sourcetype with a name corresponding to the file name. Alternatively you can also
specify single a sourcetype which all files will be uploaded with the ``source_type`` parameter.
``.upload_folder()`` along with the
name of the index you want the data uploaded to. By default,
this will upload all csv files in that folder to Splunk,
with each file being uploaded to a sourcetype with a name corresponding to the file name.

Alternatively, you can also
specify a single source type for all files with the ``source_type`` parameter.
As the ``source_type`` parameter, csv, json, or others can be input; the data is then passed to
df.to_csv(), df.to_json(), or df.to_string() respectively.
The default is **json** if the ``table_name`` parameter is not supplied; ``table_name`` remains
only for backward compatibility.

If your files have some
other separator you can pass ``delim`` with the specified delimiter value; however, currently there is
only support for a single delim type across files. By default this method attempts to upload all files in the specified
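A minimal sketch of a folder upload (the folder path, index, and source type are placeholders):

.. code:: ipython3

    spup.upload_folder(
        "./data/splunk_logs",
        index_name="upload_test",
        source_type="json",
    )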
4 changes: 3 additions & 1 deletion docs/source/getting_started/UserSessionConfig.rst
@@ -7,7 +7,9 @@ components based on configuration provided in a YAML file.
This allows you to load multiple providers and components in a single step,
avoiding having to write a lot of repetitive code in your notebooks.

The user is expected to supply the path to the YAML file to the ``load_user_config`` function. Each key in the ``QueryProviders`` and `Components` sections of the YAML file will be the name of the component variable in the local namespace.
The user is expected to supply the path to the YAML file to the ``load_user_config`` function.
Each key in the ``QueryProviders`` and ``Components`` sections of the YAML file will be the
name of the component variable in the local namespace.
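For example, a hypothetical invocation (the YAML file name is a placeholder):

.. code:: ipython3

    import msticpy as mp

    mp.init_notebook()
    # Each key under QueryProviders and Components becomes a variable
    # of that name in the notebook namespace.
    mp.mp_user_session.load_user_config("./my_session.yaml")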

Example YAML Configuration
--------------------------
65 changes: 33 additions & 32 deletions msticpy/init/mp_user_session.py
@@ -10,44 +10,45 @@
Example YAML file:
QueryProviders:
    qry_prov_sent:
        DataEnvironment: MSSentinel
        InitArgs:
            debug: True
        Connect: True
        ConnectArgs:
            workspace: CyberSecuritySoc
            auth_methods: ['cli', 'device_code']
    qry_prov_md:
        DataEnvironment: M365D
    qry_kusto_mde:
        DataEnvironment: Kusto
        Connect: True
        ConnectArgs:
            cluster: MDE-Scrubbed
    qry_kusto_msticti:
        DataEnvironment: Kusto
        Connect: True
        ConnectArgs:
            cluster: MSTICTI
Components:
    mssentinel:
        Module: msticpy.context.azure
        Class: MicrosoftSentinel
        InitArgs:
            Connect: True
            ConnectArgs:
                workspace: CyberSecuritySoc
                auth_methods: ['cli', 'device_code']
.. code-block:: yaml

    QueryProviders:
        qry_prov_sent:
            DataEnvironment: MSSentinel
            InitArgs:
                debug: True
            Connect: True
            ConnectArgs:
                workspace: MySoc
                auth_methods: ['cli', 'device_code']
        qry_prov_md:
            DataEnvironment: M365D
        qry_kusto_mde:
            DataEnvironment: Kusto
            Connect: True
            ConnectArgs:
                cluster: MDEData
        qry_kusto_mstic:
            DataEnvironment: Kusto
            Connect: True
            ConnectArgs:
                cluster: MSTIC
    Components:
        mssentinel:
            Module: msticpy.context.azure
            Class: MicrosoftSentinel
            InitArgs:
                Connect: True
                ConnectArgs:
                    workspace: CyberSecuritySoc
                    auth_methods: ['cli', 'device_code']
Example usage:
```python
.. code-block:: python
    import msticpy as mp
    mp.init_notebook()
    mp.mp_user_session.load_user_config()
```
"""

