This repository was archived by the owner on Sep 25, 2025. It is now read-only.

Commit 39fd9be (parent e8b69f2)

add info on cloudTrail, using secrets manager directly, and clarifying how to open terminal when resetting env

File tree

2 files changed: +81 -13 lines changed

docs/how-tos/airflow/use-aws-secrets-manager.md

Lines changed: 79 additions & 11 deletions
### To solve for this, there are 2 best practices to follow:

1. Always call your `Variable.get` from within an Airflow decorator such as the Datacoves Bash Task Decorator. This ensures the variable is only fetched at run time.
2. Make use of the `connections_lookup_pattern` and `variables_lookup_pattern` when setting up your secondary backend above. This means only variables and connections prefixed with `aws_` will make an API call to AWS Secrets Manager. eg) `aws_my_secret`
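For illustration, the lookup patterns mentioned above could be set on the secondary backend roughly as follows. This is a sketch only: the actual backend setup is covered earlier in this guide, and the prefixes shown here are placeholder values, but the `AIRFLOW__SECRETS__*` environment variables and the `SecretsManagerBackend` kwargs are standard Airflow Amazon provider configuration.

```shell
# Sketch: configure the Amazon provider's Secrets Manager backend so that
# only names starting with "aws_" trigger lookups against AWS.
# Prefix values below are placeholders.
export AIRFLOW__SECRETS__BACKEND="airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend"
export AIRFLOW__SECRETS__BACKEND_KWARGS='{
  "connections_prefix": "airflow/connections",
  "variables_prefix": "airflow/variables",
  "connections_lookup_pattern": "^aws_",
  "variables_lookup_pattern": "^aws_"
}'
```

Names that do not match the lookup pattern fall through to Airflow's metadata database without hitting AWS at all.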
```python
"""
## Sample DAG using variables
This DAG is a sample using the Datacoves decorators with variables from AWS.
"""

from airflow.decorators import dag, task
from pendulum import datetime
from airflow.models import Variable

@dag(
    doc_md=__doc__,
    catchup=False,
    default_args={
        "start_date": datetime(2024, 1, 1),
        "owner": "Mayra Pena",
        "email": "mayra@example.com",
        "email_on_failure": True,
    },
    tags=["version_1"],
    description="Testing task decorators",
    schedule_interval="0 0 1 */12 *",
)
def task_decorators_example():

    @task.datacoves_bash(connection_id="main")
    def calling_vars_in_decorators() -> str:
        my_var = Variable.get("aws_my_secret")  # Call variable within @task.datacoves_bash
        return f"My variable is: {my_var}"

    calling_vars_in_decorators()  # Call task function

# Invoke Dag
task_decorators_example()
```
> [!TIP]
> To auto-mask your secret, use `secret` or `password` in the secret name, since this will set `hide_sensitive_var_conn_fields` to True. eg) `aws_my_password`. Please see [this documentation](https://www.astronomer.io/docs/learn/airflow-variables#hide-sensitive-information-in-airflow-variables) for a full list of masking words.
## Using a secrets manager directly from Airflow

While not recommended, you can bypass the Datacoves secrets manager integration by configuring an Airflow connection and using the `SecretsManagerHook` in an Airflow DAG.

### Configure an Airflow Connection

Create a new Airflow Connection with the following parameters:

```
Connection Id: aws_secrets_manager
Connection Type: Amazon Web Services
AWS Access Key ID: ....
AWS Secret Access Key: ....
Extra:
    {
        "region_name": "us-west-2"
    }
```
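If you prefer to define this connection outside the Airflow UI, Airflow also reads connections from `AIRFLOW_CONN_<CONN_ID>` environment variables. A sketch of the same connection in that form (credentials are placeholders; the JSON representation is supported in Airflow 2.3+):

```shell
# Sketch: the same "aws_secrets_manager" connection as an environment
# variable. Login/password values are placeholders.
export AIRFLOW_CONN_AWS_SECRETS_MANAGER='{
  "conn_type": "aws",
  "login": "PLACEHOLDER_ACCESS_KEY_ID",
  "password": "PLACEHOLDER_SECRET_ACCESS_KEY",
  "extra": {"region_name": "us-west-2"}
}'
```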

```python
"""
## Sample DAG using variables
This DAG is a sample using the Datacoves decorators with variables from AWS.
"""

from airflow.decorators import dag, task
from pendulum import datetime
from airflow.providers.amazon.aws.hooks.secrets_manager import SecretsManagerHook

@dag(
    doc_md=__doc__,
    catchup=False,
    default_args={
        "start_date": datetime(2024, 1, 1),
        "owner": "Noel Gomez",
        "email": "[email protected]",
        "email_on_failure": True,
    },
    tags=["sample"],
    description="Testing task decorators",
    schedule_interval="0 0 1 */12 *",
)
def variable_usage():

    @task.datacoves_bash
    def aws_var():
        secrets_manager_hook = SecretsManagerHook(aws_conn_id="aws_secrets_manager")
        var = secrets_manager_hook.get_secret("airflow/variables/aws_ngtest")
        return f"export MY_VAR={var} && echo $MY_VAR"

    aws_var()

variable_usage()
```
## Check when a secret is being fetched from AWS

It is a good idea to verify that secrets are only being fetched when expected. To do this, you can use AWS CloudTrail.

1. From the AWS Console, go to `CloudTrail`
2. Click `Event History`
3. Click `Clear Filter`
4. In the `Lookup Attributes` dropdown, select `Event Name`
5. In the `Enter an Event Name` input box, enter `GetSecretValue`

Review the `Resource name` and `Event time`.

Note: it may take a few minutes for fetch events to show up in CloudTrail.
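The same check can be scripted with the AWS CLI instead of the console. A sketch, assuming the CLI is configured with credentials that allow `cloudtrail:LookupEvents`:

```shell
# Sketch: list recent GetSecretValue events from CloudTrail Event History.
# Requires configured AWS CLI credentials; output may lag by a few minutes.
aws cloudtrail lookup-events \
  --lookup-attributes AttributeKey=EventName,AttributeValue=GetSecretValue \
  --max-results 10 \
  --query 'Events[].{Time:EventTime,Resource:Resources[0].ResourceName}'
```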

docs/how-tos/vscode/reset-user-env.md

Lines changed: 2 additions & 2 deletions
# Reset the User's Env if the git Repository is Changed in the Project

If you need to reset your user environment because you changed the repo associated with your environment after one had already been cloned, or because the repo failed to clone, you will need to remove the workspace folder and reset the environment.

- Open a terminal and enter the following commands. If you get an error when opening the terminal because you don't have a `transform` folder or similar, simply right-click in the file area and select `Open in Integrated Terminal`.

```bash
cd ~/
```
