Knowing the Nodes
Qualys Flow categorizes nodes based on the type of function they perform.
To access the nodes:
- Log in to your Qualys Flow account.
- Go to the Editor tab and click the Explore Nodes icon in the top-left corner of the Editor window.
The Explore Nodes pop-up window is displayed.
To understand the function of each node, refer to the following descriptions.
**Trigger**

Function: By default, the Trigger is the first node present in the editor for any QFlow. The trigger can be time-based, AWS event-based, manual, or TotalCloud-based.

Details: Triggers fire automatically only when the QFlow is in the Deployed state. If the QFlow is Undeployed, you can run it only manually by clicking Run Now; no other trigger type works until the QFlow is Deployed.
**AWS Nodes (Cloud-Specific Nodes)**
**AWS Resource**

Function: Fetches the resources that belong to a specific AWS service. The node can access all AWS services and resources available to you. For example, you can select the RDS service and use DB Instances as a method in the AWS Resource node to get the metadata of RDS DB instances.

Details: The output of this node is a JSON representation of the fetched resources. Use Additional Parameters to add parameters to the fetch operation and narrow down the results. Use add-ons to enrich the resource data with related resources.
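The fetch-then-narrow behavior can be sketched as follows. This is an illustrative model only: the AWS Resource node is configured in the Qualys Flow UI, not written by hand, and the sample data and `fetchWithParams` helper below are invented for the example (no real AWS call is made).

```javascript
// Sample JSON shaped like metadata a resource node might emit (made-up data).
const allDbInstances = [
  { DBInstanceIdentifier: "orders-db", Engine: "mysql", PubliclyAccessible: true },
  { DBInstanceIdentifier: "reports-db", Engine: "postgres", PubliclyAccessible: false },
];

// Hypothetical helper: additional parameters narrow the fetched result set.
function fetchWithParams(resources, params) {
  return resources.filter((resource) =>
    Object.entries(params).every(([key, value]) => resource[key] === value)
  );
}

const mysqlOnly = fetchWithParams(allDbInstances, { Engine: "mysql" });
console.log(mysqlOnly.map((r) => r.DBInstanceIdentifier)); // [ 'orders-db' ]
```

Without any additional parameters, the node would return the full resource list; each parameter added further restricts the output.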
**AWS Action**

Function: Performs the action you define on the selected resources; the actual automation is accomplished in this node. For example, after identifying the list of publicly accessible RDS DB instances, you can perform an action such as Delete DB Instance on them.

Details: You can access attributes of the input resources by using obj.<param> within the params mapping. For example, use obj.InstanceId to obtain the instance ID of the EC2 instances fed into this node. Certain actions give you the option to wait for completion before proceeding to the next node in the QFlow.
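How an expression such as `obj.InstanceId` resolves against each input resource can be sketched like this. The `resolveParam` helper is hypothetical, invented for illustration; the real params mapping is configured in the Qualys Flow UI.

```javascript
// Hypothetical resolver: "obj.InstanceId" -> resource["InstanceId"],
// also supporting nested paths such as "obj.State.Name".
function resolveParam(expression, resource) {
  const path = expression.replace(/^obj\./, "").split(".");
  return path.reduce(
    (value, key) => (value == null ? value : value[key]),
    resource
  );
}

const inputResources = [
  { InstanceId: "i-0abc", State: { Name: "running" } },
  { InstanceId: "i-0def", State: { Name: "stopped" } },
];

// An action node evaluates the mapping once per input resource.
const ids = inputResources.map((r) => resolveParam("obj.InstanceId", r));
console.log(ids); // [ 'i-0abc', 'i-0def' ]
```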
**Azure Nodes (Cloud-Specific Nodes)**
**Azure Resource**

Function: Fetches the resources that belong to a specific Azure service; it can fetch all the resources provided by the Azure Software Development Kit (SDK) for that service. The node can access all Azure services and resources available to you.

Details: The output of this node is a JSON representation of the fetched resources. Use Additional Parameters to add parameters to the fetch operation and narrow down the results. Use add-ons to enrich the resource data with related resources.
**Azure Action**

Function: Performs the action you define on the selected resources; the actual automation is accomplished in this node. It can perform any action defined in the Azure SDK for the selected resources.

Details: The Action node performs operations on the Azure service resources that are fed into it.
**GCP Nodes (Cloud-Specific Nodes)**
**GCP Resource**

Function: Fetches the resources that belong to a specific GCP service. The node can access all GCP services and resources available to you. For example, you can select Google Compute Engine as the service, Instances as the resource, and the specific API to execute (for example, List to get all the VM instances under Google Compute Engine).

Details: The output of this node is a JSON representation of the fetched resources. Use Additional Parameters to add parameters to the fetch operation and narrow down the results. Use add-ons to enrich the resource data with related resources.
**GCP Action**

Function: Performs the action you define on the selected resources; the actual automation is accomplished in this node. The GCP Action node can perform any action available for a resource in the GCP SDK.

Details: The Action node operates on resources from GCP services, such as Compute Engine VMs, and outputs JSON data containing the resource attributes. Parameters and add-ons can customize and enrich the data.
**General Nodes**
**Filter**

Function: Filters the resources based on a set of conditions. You can combine criteria using logical AND/OR conditions. For example, you can filter publicly accessible RDS DB instances out of all RDS DB instances using the Filter node.

Details: Filters can be applied to various fields of the input data.
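The AND/OR combination of conditions can be sketched as follows. This is an illustrative model only: Filter node conditions are configured in the UI, and the condition representation below (plain predicate functions) is an assumption for the example.

```javascript
// Made-up sample data shaped like resource-node output.
const dbInstances = [
  { DBInstanceIdentifier: "orders-db", PubliclyAccessible: true, Engine: "mysql" },
  { DBInstanceIdentifier: "reports-db", PubliclyAccessible: false, Engine: "postgres" },
  { DBInstanceIdentifier: "legacy-db", PubliclyAccessible: true, Engine: "postgres" },
];

// AND: every condition must hold; OR: at least one condition must hold.
function matches(resource, conditions, mode) {
  return mode === "AND"
    ? conditions.every((condition) => condition(resource))
    : conditions.some((condition) => condition(resource));
}

// e.g. publicly accessible AND running postgres
const publicPostgres = dbInstances.filter((r) =>
  matches(r, [(x) => x.PubliclyAccessible, (x) => x.Engine === "postgres"], "AND")
);
console.log(publicPostgres.map((r) => r.DBInstanceIdentifier)); // [ 'legacy-db' ]
```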
**Report**

Function: Generates reports of the selected data that users can download in CSV or JSON format.

Details: The Report node creates a report using data from the previous nodes. Use the Passthrough option to include all input data, or the Input Transformer to select specific attributes. Download the report from the execution details page.
**Custom**

Function: Used to write scripts that perform custom operations on the input data. The code is written in JavaScript. The code runs in a sandbox, so external libraries cannot be imported.

Details: The custom code runs in a Node.js v10 sandbox. You can choose from predefined custom logic options, such as the lambda cost predictor and detector or tags-info-format, or write any logic you require in this node.
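A custom script might look like the sketch below. The exact way a Custom node receives and returns data is not described here, so the `inputs` array and returned array are assumptions; the code itself is plain JavaScript compatible with a Node.js v10 sandbox, with no external libraries, matching the documented restriction.

```javascript
// Example custom operation: flatten each resource's Tags array into a simple
// { Key: Value } map for easier downstream filtering.
function customLogic(inputs) {
  return inputs.map((resource) => {
    const tagMap = (resource.Tags || []).reduce((map, tag) => {
      map[tag.Key] = tag.Value;
      return map;
    }, {});
    // Object.assign keeps the code Node 10 compatible.
    return Object.assign({}, resource, { TagMap: tagMap });
  });
}

const out = customLogic([
  { InstanceId: "i-0abc", Tags: [{ Key: "env", Value: "prod" }] },
]);
console.log(out[0].TagMap.env); // "prod"
```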
**HTTP**

Function: Makes HTTP(S) calls from a QFlow. This allows you to make REST API calls to external applications or services. The HTTP node can be placed anywhere in the QFlow except after a TotalCloud node.

Details: Use the HTTP node to call external APIs and access remote files. Select NONE as the resource if you want this node to be independent of input data. Select the API method as GET, PUT, POST, or DELETE. Set URL Resolution to Internal for addresses within the same site, or to External for addresses that are accessible through the Internet.
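The node's main settings can be modeled as below. This is a sketch only: the HTTP node is configured in the UI, the configuration object shape is an assumption, and no real network call is made.

```javascript
// Hypothetical model of an HTTP node configuration and the request it describes.
function buildHttpNodeRequest(config) {
  const { method, url, urlResolution } = config;
  if (["GET", "PUT", "POST", "DELETE"].indexOf(method) === -1) {
    throw new Error("Unsupported method: " + method);
  }
  return {
    method,
    url,
    // Internal: same-site addresses; External: Internet-reachable addresses.
    external: urlResolution === "External",
  };
}

const req = buildHttpNodeRequest({
  method: "GET",
  url: "https://api.example.com/v1/status", // placeholder endpoint
  urlResolution: "External",
});
console.log(req.external); // true
```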
**Workflow Trigger**

Function: Triggers another workflow (QFlow) present within the QFlow application.

Details: The Workflow Trigger node triggers another workflow that is currently deployed; you select the deployed workflow from your account. If Trigger workflow only on data is enabled, the selected workflow is triggered only when input data is present. If this option is disabled, the selected workflow is triggered whether input data is present or not.
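The gating behavior of the Trigger workflow only on data option can be sketched like this. `shouldTrigger` is a hypothetical helper, not a Qualys Flow API.

```javascript
// Returns whether the downstream workflow should be triggered, given the
// node's input data and the "Trigger workflow only on data" option.
function shouldTrigger(inputData, onlyOnData) {
  const hasData = Array.isArray(inputData) && inputData.length > 0;
  return onlyOnData ? hasData : true;
}

console.log(shouldTrigger([], true));  // false: option on, no input data
console.log(shouldTrigger([], false)); // true: option off, triggers regardless
```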
**Parallel Node**

Function: Allows multiple sets of steps to be executed at the same time by placing them in parallel branches.

Details: Parallel branches are helpful when a QFlow needs to perform multiple independent operations at the same time. This reduces the total execution time of large QFlows with independent operations.
**Data Formatter**

Function: Allows you to format the input data, which is the output of a previous node, as you need.

Details: The Data Formatter node allows you to format the output of the preceding node in two ways.
**Data Joiner**

Function: Joins data from two previous nodes.
**TotalCloud**

Function: Makes the workflow UDC compatible. With a TotalCloud node added, QFlows can send UDC-related information to TotalCloud.
**RAW**

Function: Raw cloud nodes, like the resource and action nodes, call cloud APIs. Unlike the resource and action nodes, where read-only and modify concerns are separated, a RAW node can do both.

Details: For the RAW node, you select the cloud platform (AWS, GCP, or Azure), then select the service and the corresponding API for that service.
**Loop**

Function: Executes a set of nodes for a limited number of iterations. Before each repetition, it evaluates the exit expression or the number of loops to determine whether to run again or move to the next node. The Loop node runs the defined process at least once, regardless of previous activities. You can set the number of loops between 1 and 10.

Details: A Loop node iterates over a group of nodes in the workflow, starting from the Loop Start Node up to the node just before the Loop node. The end of the loop is determined either by the number of iterations or by the exit condition.
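The run-at-least-once semantics described above map naturally onto a do/while loop, as sketched below. `runLoop` is a hypothetical helper for illustration, not a Qualys Flow API; the iteration cap corresponds to the 1-10 setting in the UI.

```javascript
// Models Loop node semantics: the body always runs at least once, and the loop
// stops when the exit condition holds or the iteration cap is reached.
function runLoop(body, exitCondition, maxIterations) {
  let state;
  let iterations = 0;
  do {
    state = body(state); // the nodes between Loop Start Node and Loop node
    iterations += 1;
  } while (!exitCondition(state) && iterations < maxIterations);
  return { state, iterations };
}

// e.g. double a counter until it exceeds 5, capped at 10 iterations
const result = runLoop(
  (s = { n: 1 }) => ({ n: s.n * 2 }),
  (s) => s.n > 5,
  10
);
console.log(result); // { state: { n: 8 }, iterations: 3 }
```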
**Large HTTP**

Function: Integrates a third-party application through an HTTP endpoint. Large HTTP nodes can be used to call APIs that return a large amount of data.

Details: The Large HTTP node makes HTTP/HTTPS API calls from a workflow. It is similar to the HTTP node but can handle large amounts of data, making it useful for calling external APIs and accessing remote files. If you want the node to be independent, select NONE as the resource. Choose the API method as GET, PUT, POST, or DELETE. For URL Resolution, select Internal for addresses within the same site or External for addresses accessible through the Internet.
**CLI**

Function: Works similarly to the AWS CLI tool and can execute CLI commands from within a QFlow.

Details: The CLI node accepts variables as arguments, supports all clouds, and requires you to specify the command and its arguments.