Workflows are the building blocks of automation, but sometimes you need to go beyond what the platform currently provides. Perhaps your rules of engagement forbid storing target-related information on external servers, or maybe you maintain your own database of scan results.
This section guides you through extending workflows so that you can keep leveraging automation triggers while customizing results and actions to fit your requirements.
By default, workflow results are stored within the platform's database if you
specify scans.[ID]artifacts[].
Let's say the rules of engagement forbid you from storing any target-related information on external servers. To comply, you simply add a step at the end of your workflow that uploads your files, however you prefer, to a server you control. That is it.
So, let's say you are running subfinder, and you have your own server that stores results.
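Below is a minimal sketch of the commands such a workflow might run. The subfinder flags are its standard CLI options; the collector URL and the UPLOAD_TOKEN variable are placeholders for your own server, not anything the platform provides.

```sh
# Run subfinder and write its findings locally as JSON lines.
subfinder -d example.com -oJ -o output.jsonl

# Final step: upload the results to a server you control.
# The endpoint and token below are placeholders for your own collector.
curl --fail --silent --show-error \
  -H "Authorization: Bearer $UPLOAD_TOKEN" \
  -F "file=@output.jsonl" \
  "https://results.example.com/ingest"
```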
In this example, we added a final step that uploads the output.jsonl file to
our own server using curl. This way, no target-related information is stored
on external servers, and we comply with the rules of engagement.
By default, the runner sources secrets during service installation, or when you
start it using the run command.
In most cases, you are better off using the platform's secret management or vars to specify tokens and project-related information. However, let's say you are forbidden from sharing this information with other platforms.
What you can do is store your secrets locally, source them as environment variables, and then start the runner.
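For example, you might keep a small env file on the runner host and source it before starting the runner. The file path is illustrative; BOUNTYHUB_TOKEN is the same secret referenced later in this section.

```sh
# ~/secrets.env -- lives only on the runner host, never committed or uploaded
export BOUNTYHUB_TOKEN='<your-token>'
```

Load it into your shell session:

```sh
# Make the secrets available as environment variables for the runner.
source ~/secrets.env
```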
Then, you can start the runner as usual:
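Assuming the runner binary is invoked from its installation directory (the exact path depends on your setup), that could look like:

```sh
# The variables sourced above are inherited by the runner process
# and by the workflow steps it executes.
./runner run
```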
This way, your secrets are never shared with external platforms, and you can still leverage the power of workflows.
Instead of using ${{ secrets.BOUNTYHUB_TOKEN }} in your workflow, you would
use $BOUNTYHUB_TOKEN to access the environment variable.
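For instance, a step's shell command could authenticate with the locally sourced token directly; the URL here is only a placeholder:

```sh
# The token comes from the runner's environment, not from platform-managed secrets.
curl -sS -H "Authorization: Bearer $BOUNTYHUB_TOKEN" "https://example.com/api"
```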