Knowing the Nodes

Qualys Flow categorizes nodes based on the type of function they perform.
To access the nodes,

  1. Log in to your Qualys Flow account.
  2. Go to the Editor tab and click the Explore Nodes icon at the top left corner of the Editor window.

    The Explore Nodes pop-up window is displayed.

To know the function of these nodes, refer to the following table.

Node

Function

Details

Trigger

By default, the Trigger node is the first node present in the editor for any QFlow. The trigger can be time-based, AWS event-based, manual, or TotalCloud-based.

Different types of triggers:

  • Schedule: Allows triggering the workflow at specific times. It includes Recurrent and CronTrigger sub-types.
  • Manual: The default trigger that is always active and can be run by clicking 'Run Now'.
  • TotalCloud Trigger: Enables TotalCloud to initiate QFlow when User Defined Controls (UDC) are executed.
  • Events: Triggers the workflow in response to AWS events by configuring the event trigger.

Triggers work automatically when the QFlow is 'Deployed'. If the QFlow is 'Undeployed', you can only run it manually by clicking 'Run Now'; no other trigger works until the QFlow is 'Deployed'.

AWS Nodes (Cloud-Specific Nodes)


AWS Resource

It fetches the resources that belong to a specific AWS service. The node can access all AWS services and resources available to you.

For example, you can select the RDS service and use DB instances as a method in the AWS resource node to get metadata of RDS DB instances.

The output of this node is a JSON representation of the resources fetched by the resource node. Use 'Additional parameters' to add parameters to the fetch operation, narrowing down the results. Use add-ons to enrich the resource data with related resources.

AWS Action

It performs the action you define on the selected resources; the actual automation is accomplished in this node.

For example, after identifying the list of publicly available RDS DB instances, you can perform an action on them, that is, delete the DB instances.

You can access attributes of input resources by using 'obj.<param>' within the params mapping.

For example, use 'obj.InstanceId' to obtain the instance ID of EC2 instances fed into this node. Certain actions give you the option to wait for completion before proceeding to the next node in the QFlow.
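As a hedged illustration of the 'obj.&lt;param&gt;' convention described above, the sketch below shows how such an expression could resolve against a resource object fed into an Action node. The resource shape and the `resolveParam` helper are assumptions for illustration, not the node's actual implementation.

```javascript
// Illustrative sketch only: resolves an 'obj.<param>' expression
// against a resource object. The EC2-style record below is a
// simplified, hypothetical example.
function resolveParam(expression, obj) {
  // Strip the leading 'obj.' and walk the remaining property path.
  const path = expression.replace(/^obj\./, '').split('.');
  return path.reduce((value, key) => (value == null ? undefined : value[key]), obj);
}

const instance = { InstanceId: 'i-0abc123', State: { Name: 'running' } };
console.log(resolveParam('obj.InstanceId', instance)); // 'i-0abc123'
console.log(resolveParam('obj.State.Name', instance)); // 'running'
```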

Azure Nodes (Cloud-Specific Nodes)


Azure Resource

It fetches the resources that belong to a specific Azure service. The node can access all Azure services and resources that the Azure Software Development Kit (SDK) makes available to you.

The output of this node is a JSON representation of the resources fetched by the resource node. Use 'Additional parameters' to add parameters to the fetch operation, narrowing down the results. Use add-ons to enrich the resource data with related resources.

Azure Action 

It performs the action you define on the selected resources; the actual automation is accomplished in this node.

It can perform any action that the Azure SDK defines for the selected resources.

The Action node works on Azure service resources:

  • It can fetch all Azure Virtual Machine instances and their attributes for further processing.
  • It outputs a JSON with resource data and attributes.
  • It allows specific or advanced Azure parameters through Parameters.
  • Add-ons can enrich the resource data with extra details, such as VM disk information.

GCP Nodes (Cloud-Specific Nodes)

GCP Resource

It fetches the resources that belong to a specific GCP service. The node can access all GCP services and resources available to you.

For example, you can select Google Compute Engine as the service, Instances as the resource, and the specific API that you want to execute (for example, List to get all the VM instances under Google Compute Engine).

The output of this node is a JSON representation of the resources fetched by the resource node. Use 'Additional parameters' to add parameters to the fetch operation, narrowing down the results. Use add-ons to enrich the resource data with related resources.


GCP Action

It performs the action you define on the selected resources; the actual automation is accomplished in this node.

The GCP Action node can perform any action that is available for a resource, in the GCP SDK.

The Action node fetches resources from GCP services, such as Compute Engine VMs, and outputs JSON data containing resource attributes. Parameters and Add-ons can customize and enrich the data.

General Nodes


Filter

It filters the resources based on a set of conditions. You can combine criteria using logical AND/OR conditions to filter the data.

For example, you can filter publicly available RDS DB instances from all RDS DB instances using the Filter node.


Use the following filters based on various fields:

  • Param: To filter the data based on a property of the resource object.

  • Date: To filter the data based on a date, like resources created in the last 30 days. 

  • Tags: To filter the data based on tags.

  • Security Group: To filter Security Group data based on the different aspects of a security group (ports, IP ranges, and so on). Available only for security group data.

  • Network ACLs: To filter Network ACL data based on the different aspects of a network ACL (ports, IP ranges, and so on). Available only for Network ACLs.

  • Function: To filter data based on custom logic written in a function. You can write functions in JavaScript.
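As a sketch of the Function filter described above, the JavaScript below keeps only the publicly accessible RDS DB instances from a hypothetical input array. The predicate name, the input shape, and the assumption that the filter receives an array of resource objects are all illustrative; the exact callback signature the Filter node expects may differ.

```javascript
// Hypothetical custom filter logic written in JavaScript.
// Assumes the input is an array of RDS DB instance objects with a
// boolean 'PubliclyAccessible' attribute (an illustrative shape).
function isPubliclyAccessible(resource) {
  return resource.PubliclyAccessible === true;
}

const dbInstances = [
  { DBInstanceIdentifier: 'db-1', PubliclyAccessible: true },
  { DBInstanceIdentifier: 'db-2', PubliclyAccessible: false },
];
const publicInstances = dbInstances.filter(isPubliclyAccessible);
console.log(publicInstances.map(r => r.DBInstanceIdentifier)); // ['db-1']
```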

Report

It allows users to generate and download reports of the selected data in CSV or JSON format.

The Report node creates a report using data from previous nodes. Use the Passthrough option to include all input data, or the Input Transformer to select specific attributes. Download the report from the execution details page.

Custom

It is used to write scripts that perform custom operations on the input data. The code can be written in JavaScript and runs in a NodeJS v10 sandbox, so external libraries cannot be imported.

The Custom node lets you choose from predefined custom logic options, such as lambda cost predictor and detector or tags-info-format, or write any logic you require.
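As a hedged illustration, here is the kind of tag-formatting logic a Custom node could run. Since the sandbox disallows external libraries, only built-in JavaScript is used. The function name, the input shape (an array of AWS-style tagged resources), and the output format are assumptions, not the actual tags-info-format implementation.

```javascript
// Sketch of custom logic a Custom node might run (illustrative only).
// Flattens AWS-style [{Key, Value}] tag lists into 'Key=Value' strings.
function formatTagsInfo(resources) {
  return resources.map(r => ({
    id: r.InstanceId,
    tags: (r.Tags || []).map(t => `${t.Key}=${t.Value}`).join(','),
  }));
}

const input = [{ InstanceId: 'i-1', Tags: [{ Key: 'env', Value: 'prod' }] }];
console.log(formatTagsInfo(input)); // [{ id: 'i-1', tags: 'env=prod' }]
```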

HTTP

It makes HTTP(S) calls from a QFlow, allowing you to make REST API calls to external applications or services. The HTTP node can be placed anywhere in the QFlow except after the TotalCloud node.

The HTTP node can be used to make HTTP/HTTPS API calls from a workflow. This is useful if you need to call external APIs and access remote files.

You can select 'NONE' as the resource if you want this node to be independent. 

Select the API method as GET, PUT, POST, or DELETE. You can select URL Resolution to Internal for addresses within the same site, or to External for addresses that are accessible through the Internet.
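To make the method and URL options above concrete, the sketch below assembles the pieces of a request an HTTP node might issue. The URL, the `buildRequest` helper, and the returned options shape are placeholders for illustration; the node itself is configured in the UI and does not require code.

```javascript
// Illustrative only: builds the options for an HTTP(S) request in the
// shape Node's https module accepts. No request is actually sent.
function buildRequest(method, url, headers = {}) {
  const { hostname, pathname } = new URL(url); // WHATWG URL, global in Node
  return { method, hostname, path: pathname, headers };
}

// Hypothetical external endpoint for a GET call.
const req = buildRequest('GET', 'https://api.example.com/v1/items');
console.log(req.hostname); // 'api.example.com'
```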

Workflow Trigger

It is used to trigger another workflow (QFlow) present within the QFlow application.

The Workflow Trigger node triggers another workflow that is currently deployed. You can select a deployed workflow from your account to be triggered.

If 'Trigger workflow only on data' is enabled, the selected workflow is triggered only when input data is present. If this option is disabled, the selected workflow is triggered whether input data is present or not.

Parallel Node

It allows multiple sets of steps to be executed at the same time by placing them in parallel branches. 

Parallel branches are helpful if you need QFlow to perform multiple independent operations in parallel. This helps in reducing the total execution time of large QFlows where independent operations need to be performed.

Data Formatter

It allows you to format the input data as you need. The input data is the output of a previous node. 

The Data Formatter Node allows you to format the output of the preceding node in two ways. 

  • You can choose specific keys to be included in the output and ignore the rest, or create custom key-value pairs for each object in the input. 
  • Additionally, you have the option to create an entirely new output based on calculations performed on the input values or any other custom value.
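The two formatting options above can be sketched as follows. The Data Formatter node itself is configured in the UI, not with code; the helper names and the volume-shaped input are assumptions used only to mirror its behaviour.

```javascript
// Option 1 (illustrative): keep only the chosen keys from each object.
function pickKeys(objects, keys) {
  return objects.map(o =>
    Object.fromEntries(keys.filter(k => k in o).map(k => [k, o[k]]))
  );
}

// Option 2 (illustrative): build a new output from calculations
// performed on the input values.
function computeOutput(objects) {
  return objects.map(o => ({ name: o.Name, sizeGiB: o.SizeBytes / 2 ** 30 }));
}

const volumes = [{ Name: 'vol-1', SizeBytes: 2 ** 31, Encrypted: true }];
console.log(pickKeys(volumes, ['Name', 'Encrypted'])); // [{ Name: 'vol-1', Encrypted: true }]
console.log(computeOutput(volumes));                   // [{ name: 'vol-1', sizeGiB: 2 }]
```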

Data Joiner

It joins data from two previous nodes. 

The Data Joiner node combines data from two previous nodes using one of four methods:

  • Left Join: Includes all the keys from the first input and the matching keys from the second input.
  • Right Join: Includes all the keys from the second input and the matching keys from the first input.
  • Inner Join: Includes only the keys that are common to both the first and second inputs (Intersection operation).
  • Full Join: Includes all the keys from both the first and second inputs (Union operation).
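The four join methods above can be sketched on the key sets of two objects. This is only an illustration of which keys each method keeps (with left-hand values preferred when a key appears in both inputs); the actual Data Joiner behaviour is configured in the node.

```javascript
// Illustrative key-level join of two objects (not the node's real code).
function joinObjects(left, right, method) {
  const keys = {
    left:  Object.keys(left),                                        // all left keys
    right: Object.keys(right),                                       // all right keys
    inner: Object.keys(left).filter(k => k in right),                // intersection
    full:  [...new Set([...Object.keys(left), ...Object.keys(right)])], // union
  }[method];
  // For each kept key, prefer the left value, falling back to the right.
  return Object.fromEntries(keys.map(k => [k, k in left ? left[k] : right[k]]));
}

const a = { id: 1, name: 'db-1' };
const b = { id: 1, region: 'us-east-1' };
console.log(joinObjects(a, b, 'inner')); // { id: 1 }
console.log(joinObjects(a, b, 'full'));  // { id: 1, name: 'db-1', region: 'us-east-1' }
```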

TotalCloud 

The TotalCloud node is used to make the workflow UDC compatible. With a TotalCloud node added, QFlows can send UDC-related information to TotalCloud.

  • QFlows must have a TotalCloud node if they are to be adopted as a Control in Qualys TotalCloud.
  • When adding a TotalCloud node, it must be the last node in the QFlow. No additional nodes can be added after the TotalCloud node.
  • Resources meeting the evaluation criteria are marked as Failed, and those not meeting the criteria are marked as Passed.
  • Ensure QFlow is deployed with the TotalCloud trigger enabled for it to function as a Control in TotalCloud. If skipped, the control will not trigger QFlow.

RAW

The Raw cloud node, similar to the resource and action nodes, calls cloud APIs. Unlike resource and action nodes, where read-only and modify concerns are separated, Raw nodes can do both.

For the RAW node, you can select the cloud platform from AWS, GCP, or AZURE, and then select the service and the corresponding API for that service.

Loop

The Loop node executes a set of nodes for a limited number of iterations. It evaluates the exit expression or the number of loops before determining whether to run again or move to the next node. The Loop node runs the defined process at least once, regardless of previous activities. You can select the number of loops from 1 to 10.

A Loop node is used to iterate over a group of nodes in the workflow. It iterates through nodes starting from the 'Loop Start Node' to the node just before the Loop node. The end of the loop node can be determined either by the number of iterations or by the exit condition.

  • Start Node: Determines the loop's beginning.
  • Exit Type: Define loop node exit based on loop count or conditions similar to filter logic.
  • Number of Loops: Maximum number of iterations for the loop node.
  • Exit Conditions: Define criteria for ending the loop; it exits when criteria are met.
  • Delay between Loops: Indicates the minimum seconds to wait before the next loop starts, applicable after the first iteration.

Large HTTP

It allows you to integrate third-party applications with an HTTP endpoint. Large HTTP nodes can be used to call APIs that return a large amount of data.

The Large HTTP node is used to make HTTP/HTTPS API calls from a workflow. It is similar to the HTTP node but can handle large amounts of data, making it useful for calling external APIs and accessing remote files. If you want the node to be independent, you can select 'NONE' as the resource.

You can choose the API method as GET, PUT, POST, or DELETE. When it comes to URL Resolution, you can select Internal for addresses within the same site or External for addresses accessible through the Internet.

Some important notes:

  • If the endpoint is secure, you need to configure the authentication details. Depending on the endpoint configuration, you can set up Basic Auth, a Bearer Token, or API Keys.
  • The host header cannot be modified for internal URLs.

CLI

It works similarly to the AWS CLI tool and can execute CLI commands from within a QFlow.

The CLI node accepts variables as arguments, supports all clouds, and requires specifying the command and its arguments.

Related Topics

Viewing your QFlows

Using QFlows in TotalCloud

Creating QFlows from Scratch

Creating QFlows from a Template